LEX Examples

The document discusses how to generate a lexical analyzer using LEX. It explains that a LEX specification file is written, run through the LEX compiler to generate a C file, and then compiled to create the lexical analyzer executable. The LEX file contains definitions, rules, and user subroutines to define the token patterns and actions.


LEX: lexical analyzer generator

Generating a lexical analyzer with Lex:
• First, a specification of a lexical analyzer is prepared by creating a program lex.l in the Lex language.
• Then, lex.l is run through the Lex compiler to produce a C program lex.yy.c.
• Finally, lex.yy.c is run through the C compiler to produce an object program a.out.
• This a.out is the lexical analyzer that transforms an input stream into a sequence of tokens.

• lex.l is an input file written in a language that describes the generation of a lexical analyzer.
• The Lex compiler transforms lex.l into a C program known as lex.yy.c.
• lex.yy.c is compiled by the C compiler into a file called a.out.
• The output of the C compiler is the working lexical analyzer, which takes a stream of input characters and produces a stream of tokens.
• yylval is a global variable shared by the lexical analyzer and the parser to return the name and an attribute value of a token.
• Another tool for lexical analyzer generation is Flex.

Lex Specification:
A Lex program consists of three parts:
• definitions
• rules
• user subroutines
Lex Specification:
• Definitions include declarations of variables, constants, and regular definitions.
• Rules are statements of the form p1 {action1} p2 {action2} … pn {actionn}, where each pi is a regular expression and actioni describes what action the lexical analyzer should take when pattern pi matches a lexeme. Actions are written in C code.
• User subroutines are auxiliary procedures needed by the actions. These can be compiled separately and loaded with the lexical analyzer.
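The three parts can be seen in a small Lex specification. This is an illustrative sketch, not taken from the source; the variable num_count and the patterns chosen are assumptions.

```lex
%{
/* definitions section: C declarations copied into lex.yy.c */
#include <stdio.h>
int num_count = 0;     /* variable declaration used by the actions */
%}
digit   [0-9]
%%
{digit}+   { num_count++; printf("NUMBER: %s\n", yytext); }
[ \t\n]    { /* skip whitespace */ }
.          { printf("OTHER: %s\n", yytext); }
%%
/* user subroutines: auxiliary C functions used alongside the actions */
int main(void) { yylex(); printf("numbers seen: %d\n", num_count); return 0; }
int yywrap(void) { return 1; }
```

The regular definition digit is declared once in the definitions section and reused in the rules as {digit}, and the action code between braces is ordinary C.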

Structure of Lex Programs:
A Lex program has the following form:
declarations
%%
translation rules
%%
auxiliary functions
Structure of Lex Programs:
• Declarations: This section includes declarations of variables, constants, and regular definitions.
• Translation rules: This section contains regular expressions and the code segments (actions) to execute when each expression matches.
• Auxiliary functions: This section holds additional functions which are used in actions. These functions are compiled separately and loaded with the lexical analyzer.
How to compile and run LEX programs on Windows:

lex count.l      (Lex file)

gcc lex.yy.c     (C file)

./a.out          (lexical analyzer)

enter the input:

compiler design
sixth sem

No.of words are:4
No.of characters are:24
No.of lines are:2
No.of spaces are:2
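A count.l along the following lines could produce the tallies in the sample run above. This is a reconstruction, not the original file; note it counts a newline toward the line total but not toward the character total, which is one way to match the reported 24 characters for the two input lines.

```lex
%{
#include <stdio.h>
int words = 0, chars = 0, lines = 0, spaces = 0;
%}
%%
[^ \t\n]+   { words++; chars += yyleng; }   /* a run of non-blanks is a word */
[ \t]       { spaces++; chars++; }
\n          { lines++; }                    /* newline not counted as a character */
%%
int main(void) {
    yylex();
    printf("No.of words are:%d\n", words);
    printf("No.of characters are:%d\n", chars);
    printf("No.of lines are:%d\n", lines);
    printf("No.of spaces are:%d\n", spaces);
    return 0;
}
int yywrap(void) { return 1; }
```

For the input "compiler design" and "sixth sem" this yields 4 words, 24 characters (22 letters plus 2 spaces), 2 lines, and 2 spaces, matching the session shown.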
