SPCC Exp3

System programs and compiler construction experiments

Uploaded by

sanjanabhosle27

EXPERIMENT NO. 03

Aim of the Experiment: Implementation of Lexical Analyzer.

Lab Outcome: Identify and validate tokens for a given high-level language and
implement the analysis phase of a compiler.

Date of Performance: - ___________________

Date of Submission: - _________________

Implementation (05) | Understanding (05) | Punctuality & Discipline (05) | Total Marks (15)
------------------- | ------------------ | ----------------------------- | ----------------

_______________________________

Practical Incharge
EXPERIMENT NO-03

Aim: Implementation of Lexical Analyzer

Theory:

Lexical analysis is the process of converting a sequence of characters (such as in a
computer program or web page) into a sequence of tokens (strings with an
identified "meaning"). A program that performs lexical analysis may be called a
lexer, tokenizer, or scanner.

Token: A token is a structure representing a lexeme that explicitly indicates its
category for the purpose of parsing. A token category is what in linguistics
might be called a part of speech. Examples of token categories include
"identifier" and "integer literal", although the set of token categories differs
across programming languages. The process of forming tokens from an input stream
of characters is called tokenization. Consider this expression in the C
programming language: sum = 3 + 2;

It is tokenized and represented by the following table:

Lexeme | Token category
------ | -------------------
sum    | identifier
=      | assignment operator
3      | integer literal
+      | addition operator
2      | integer literal
;      | end of statement

ALGORITHM / PROCEDURE:
1. Start
2. Open the input source file and an output file
3. Read the input character by character until end of file, tracking the line number
4. Classify each lexeme as an operator, special symbol, digit, keyword, or identifier
5. Write the line number, token number, token category, and lexeme to the output file
6. Close both files
7. End
Program:

#include <stdio.h>
#include <ctype.h>
#include <string.h>

int main() {
    FILE *input, *output;
    int l = 1;          /* current line number */
    int t = 0;          /* token counter */
    int i, j, flag;
    char keyword[6][6] = {"int", "main", "if", "else", "do", "while"};
    int ch;             /* int, so EOF can be distinguished from characters */
    char str[20];

    input = fopen("input.txt", "r");
    output = fopen("output.txt", "w");
    if (input == NULL || output == NULL) {
        printf("Unable to open input.txt or output.txt\n");
        return 1;
    }

    fprintf(output, "Line no. \t Token no. \t Token \t Lexeme\n\n");

    /* Test EOF on the value actually read, instead of the feof() idiom */
    while ((ch = fgetc(input)) != EOF) {
        i = 0;
        flag = 0;

        if (ch == '+' || ch == '-' || ch == '*' || ch == '/' || ch == '=') {
            fprintf(output, "%7d\t\t %7d\t\t Operator\t %7c\n", l, t, ch);
            t++;
        } else if (ch == ';' || ch == '{' || ch == '}' || ch == '(' || ch == ')' ||
                   ch == '?' || ch == '@' || ch == '!' || ch == '%') {
            fprintf(output, "%7d\t\t %7d\t\t Special symbol\t %7c\n", l, t, ch);
            t++;
        } else if (isdigit(ch)) {
            fprintf(output, "%7d\t\t %7d\t\t Digit\t\t %7c\n", l, t, ch);
            t++;
        } else if (isalpha(ch)) {
            /* Collect the whole identifier/keyword lexeme */
            str[i++] = ch;
            ch = fgetc(input);
            while (ch != EOF && isalnum(ch) && i < 19) {
                str[i++] = ch;
                ch = fgetc(input);
            }
            str[i] = '\0';
            if (ch != EOF)
                ungetc(ch, input);  /* push back the char that ended the lexeme */

            /* Compare against the keyword table */
            for (j = 0; j < 6; j++) {
                if (strcmp(str, keyword[j]) == 0) {
                    flag = 1;
                    break;
                }
            }
            if (flag == 1)
                fprintf(output, "%7d\t\t %7d\t\t Keyword\t %7s\n", l, t, str);
            else
                fprintf(output, "%7d\t\t %7d\t\t Identifier\t %7s\n", l, t, str);
            t++;
        } else if (ch == '\n') {
            l++;
        }
    }

    fclose(input);
    fclose(output);
    return 0;
}
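As a hypothetical example (the input actually used in the lab session is not recorded here), input.txt could contain a small C fragment that exercises every token class the program recognizes:

```c
/* Hypothetical sample contents for input.txt */
int main() {
    int a;
    a = 3 + 2;
}
```

For such an input, output.txt would list keywords (int, main), identifiers (a), the operators = and +, the digits 3 and 2, and the special symbols ( ) { } ; with their line and token numbers.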

Input.txt:
Output:

Conclusion: Hence, we successfully implemented a lexical analyzer that scans a C source file and classifies its lexemes into keywords, identifiers, operators, special symbols, and digits.
