Lexeme Token Category

Uploaded by Aniil J Kumaar

Lexical analysis is the process of converting a sequence of characters (such as a computer program) into a sequence of tokens (strings with an identified "meaning").

A lexeme is a string of characters that forms a syntactic unit.

Example: sum = 3 + 2;

Lexeme   Token category
sum      "Identifier"
=        "Assignment operator"
3        "Integer literal"
+        "Addition operator"
2        "Integer literal"
;        "End of statement"
Regular expressions compactly represent the patterns that the characters of lexemes in each token category must follow.
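The idea above can be sketched as a minimal lexer: each token category is given a regular expression, the patterns are combined into one alternation, and scanning the input yields (lexeme, token category) pairs. This is an illustrative sketch (the names TOKEN_PATTERNS and tokenize are made up here), not the code of any particular lexer generator.

```python
import re

# Each token category paired with the regular expression its lexemes follow.
TOKEN_PATTERNS = [
    ("Identifier",          r"[A-Za-z_][A-Za-z0-9_]*"),
    ("Integer literal",     r"\d+"),
    ("Assignment operator", r"="),
    ("Addition operator",   r"\+"),
    ("End of statement",    r";"),
    ("Whitespace",          r"\s+"),   # matched so the scan can skip over it
]

# Combine all patterns into one alternation; each gets a named group G0, G1, ...
MASTER = re.compile("|".join(
    f"(?P<G{i}>{pattern})" for i, (_, pattern) in enumerate(TOKEN_PATTERNS)
))

def tokenize(source):
    """Break `source` into (lexeme, token category) pairs, discarding whitespace."""
    tokens = []
    for match in MASTER.finditer(source):
        index = int(match.lastgroup[1:])          # which named group matched
        category = TOKEN_PATTERNS[index][0]
        if category != "Whitespace":
            tokens.append((match.group(), category))
    return tokens

print(tokenize("sum = 3 + 2;"))
```

Running it on the example statement reproduces the lexeme/token table above: sum is classified as an identifier, = as the assignment operator, 3 and 2 as integer literals, + as the addition operator, and ; as the end of the statement.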
