
Compiler Design 1marks

The document outlines various concepts related to compiler construction, including tools like Lex and Yacc for lexical analysis and parsing. It discusses look-ahead reading, lexemes, left factoring, operator precedence, symbol table entries, and the process of canonical LR parsing. Additionally, it covers attributes in syntax-directed translation, optimization techniques like common subexpression elimination, and different parameter passing methods.

Uploaded by

anilkatta639

--------(a) Two Compiler Construction Tools:

Lex (Lexical Analyzer Generator) - Used for generating lexical analyzers
(scanners).
Yacc (Yet Another Compiler Compiler) - Used for generating parsers from a
context-free grammar.

--------(b) Look-Ahead Reading in Input Buffering Mechanism:

Look-ahead reading is the practice of examining one or more characters beyond
the current input position before committing to a token. It lets the lexical
analyzer resolve ambiguities in tokenization, such as deciding whether '=' is
an assignment operator or the start of '=='.
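A minimal sketch of one-character look-ahead (the class and function names here are illustrative, not from any particular scanner):

```python
class Lookahead:
    """A reader that can inspect the next character without consuming it."""
    def __init__(self, text):
        self.text, self.pos = text, 0

    def peek(self):
        # Look ahead without advancing the input position.
        return self.text[self.pos] if self.pos < len(self.text) else ''

    def advance(self):
        ch = self.peek()
        self.pos += 1
        return ch

def scan_assign_or_eq(reader):
    """Distinguish '=' from '==' using one character of look-ahead."""
    reader.advance()              # consume the first '='
    if reader.peek() == '=':      # the look-ahead decides the token
        reader.advance()
        return 'EQ'
    return 'ASSIGN'
```

Without the peek, the scanner would have to commit to ASSIGN before knowing whether a second '=' follows.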

---------(c) Lexemes:
A lexeme is the actual sequence of characters in the source program that
matches the pattern for a token, such as a keyword (while), an operator (==),
an identifier, or a literal. The lexical analyzer groups lexemes into tokens
according to the rules defined for it.

--------(d) Left Factoring:

Left factoring is a grammar transformation that factors out the common prefix
shared by two or more alternatives of a production, so that a predictive
parser can defer its choice until the distinguishing input appears. (It does
not eliminate left recursion, which requires a separate transformation.)
Example:

A → αβ | αγ

is rewritten as:

A → αA'
A' → β | γ
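The transformation above can be sketched mechanically. This is a simplified version that factors only on the first symbol of each alternative (a full implementation would factor the longest common prefix); the function name and tuple encoding are illustrative:

```python
from collections import defaultdict

def left_factor(nonterminal, productions):
    """productions: list of tuples of symbols, e.g. [('a','b'), ('a','c')].
    Returns a dict of new productions, factoring on the first symbol."""
    groups = defaultdict(list)
    for prod in productions:
        groups[prod[0]].append(prod[1:])   # group alternatives by prefix

    new_rules, alts = {}, []
    for prefix, suffixes in groups.items():
        if len(suffixes) == 1:
            alts.append((prefix,) + suffixes[0])       # nothing to factor
        else:
            primed = nonterminal + "'"                 # introduce A'
            alts.append((prefix, primed))
            new_rules[primed] = suffixes               # A' -> the suffixes
    new_rules[nonterminal] = alts
    return new_rules
```

Applied to A → ab | ac, it yields A → aA' and A' → b | c, mirroring the example.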

---------(e) Operator Precedence in Syntax Analysis:


In syntax analysis, operator precedence determines the order in which operators
are evaluated in an expression: operators with higher precedence are grouped
and evaluated before operators with lower precedence. Precedence can be encoded
in the grammar itself (e.g. separate nonterminals for terms and factors) or
captured in an operator-precedence table.
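One concrete way a parser can honor precedence is precedence climbing, sketched below on flat token lists. The PREC table and function name are illustrative:

```python
# Binding strength of each operator; * and / bind tighter than + and -.
PREC = {'+': 1, '-': 1, '*': 2, '/': 2}

def parse_expr(tokens, min_prec=1):
    """Evaluate a tokenized expression like ['2', '+', '3', '*', '4'].
    Consumes tokens in place; min_prec bounds which operators we may absorb."""
    left = int(tokens.pop(0))                 # an operand
    while tokens and PREC.get(tokens[0], 0) >= min_prec:
        op = tokens.pop(0)
        # Recurse with a higher threshold so tighter-binding operators
        # on the right are grouped first.
        right = parse_expr(tokens, PREC[op] + 1)
        if op == '+':   left += right
        elif op == '-': left -= right
        elif op == '*': left *= right
        else:           left //= right
    return left
```

Here 2 + 3 * 4 evaluates as 2 + (3 * 4) because '*' outranks '+' in the table.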

---------(f) Entries of the Symbol Table:


Name of the identifier (e.g., variable, function).
Type (e.g., integer, float, array).
Scope (global, local, etc.).
Memory location or address.
Attributes (e.g., size, dimensions for arrays).
Initialization values (if any).
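The entries above can be held in a scoped table; a minimal sketch, assuming a stack-of-dictionaries layout (the class and field names are illustrative):

```python
class SymbolTable:
    """Nested scopes as a stack of dicts; each entry records the
    fields listed above (type, address, miscellaneous attributes)."""
    def __init__(self):
        self.scopes = [{}]                 # global scope at the bottom

    def enter_scope(self):
        self.scopes.append({})

    def exit_scope(self):
        self.scopes.pop()

    def insert(self, name, type_, address=None, attrs=None):
        self.scopes[-1][name] = {
            'type': type_, 'address': address, 'attrs': attrs or {},
        }

    def lookup(self, name):
        # Search from the innermost scope outward.
        for scope in reversed(self.scopes):
            if name in scope:
                return scope[name]
        return None
```

A local declaration shadows a global one of the same name until its scope is exited.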

---------(g) Process in Canonical LR Parser:


Augment the grammar with a new start production and construct the LR(1) items.
Build the canonical collection of sets of LR(1) items and the transitions
between them (the parser states).
Construct the LR parsing table (ACTION and GOTO entries) from the collection.
Parse the input string using the shift-reduce mechanism; a conflict in the
table means the grammar is not LR(1).

------(h) Properties of S-Attributes:


S-attributes are purely synthesized attributes: the value at each node is
computed from the attributes of its child nodes.
Because values flow strictly bottom-up, S-attributed definitions can be
evaluated during bottom-up (LR) parsing.

-----(i) Example of Boolean Expressions Using SDT:


E → E1 OR E2 { E.value = E1.value || E2.value; }
E → E1 AND E2 { E.value = E1.value && E2.value; }
E → NOT E1 { E.value = !E1.value; }
E → TRUE { E.value = true; }
E → FALSE { E.value = false; }
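The semantic actions above compute E.value bottom-up; a minimal sketch of the same evaluation over a tuple-encoded parse tree (the node encoding is illustrative):

```python
def eval_bool(node):
    """Evaluate a boolean expression tree, mirroring the { ... } actions:
    each node's value is synthesized from its children's values."""
    op = node[0]
    if op == 'TRUE':  return True
    if op == 'FALSE': return False
    if op == 'NOT':   return not eval_bool(node[1])
    if op == 'AND':   return eval_bool(node[1]) and eval_bool(node[2])
    if op == 'OR':    return eval_bool(node[1]) or eval_bool(node[2])
    raise ValueError('unknown operator: %s' % op)
```

For FALSE OR (TRUE AND NOT FALSE), the values propagate up to True.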

----------(j) Place of SDT in Phases of Compiler:


Syntax-Directed Translation (SDT) is associated with the syntax analysis phase. It
connects syntactic structure to semantic rules, allowing for intermediate code
generation or attribute evaluation during parsing.

----------(k) Why Quadruples Are Preferred Over Triples in an Optimizing Compiler:


Quadruples are preferred because:

They allow easier rearrangement of instructions during optimization since they use
explicit names for temporary results.
Triples use index-based referencing, which complicates code optimization and
reordering.
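A small sketch of the contrast for a + b * c, with a quadruple interpreter (the temporary names t1/t2 and the tuple layout are illustrative): because quadruples name their results, the two instructions could be moved or rewritten independently, whereas a triple such as ('+', 'a', (0,)) refers to "the result of triple 0" by position and breaks if the list is reordered.

```python
# a + b * c as quadruples: (op, arg1, arg2, result)
quads = [
    ('*', 'b', 'c', 't1'),     # t1 = b * c
    ('+', 'a', 't1', 't2'),    # t2 = a + t1  (refers to t1 by NAME)
]

# The same computation as triples: results are referenced by POSITION.
triples = [
    ('*', 'b', 'c'),           # (0)
    ('+', 'a', (0,)),          # (1) depends on the index of triple 0
]

def run_quads(code, env):
    """Execute quadruples over an environment of variable values."""
    for op, a, b, dst in code:
        x = env.get(a, a)      # operand may be a variable or a temporary
        y = env.get(b, b)
        env[dst] = x * y if op == '*' else x + y
    return env
```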

------(l) Basic Block:


A basic block is a sequence of consecutive statements in a program with:

One entry point (the first statement).
One exit point (control leaves only after the last statement).
No branching except at the end.
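Basic blocks are found with the classic leader rules: the first instruction, every jump target, and every instruction after a jump start a new block. A sketch over a toy instruction list (the opcode names and index-based jumps are illustrative):

```python
def basic_blocks(code):
    """code: list of (op, arg) pairs; 'goto' and 'if' jump to an index.
    Returns the list of basic blocks."""
    leaders = {0}                          # rule 1: first instruction
    for i, (op, arg) in enumerate(code):
        if op in ('goto', 'if'):
            leaders.add(arg)               # rule 2: jump target
            if i + 1 < len(code):
                leaders.add(i + 1)         # rule 3: instruction after a jump
    bounds = sorted(leaders) + [len(code)]
    # Each block runs from one leader up to (not including) the next.
    return [code[s:e] for s, e in zip(bounds, bounds[1:])]
```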

---(m) Common Subexpression Elimination:


This is an optimization technique where multiple instances of the same expression
(with identical operands and operators) are computed only once, and the result is
reused.

Example:
Original code:

a = b + c;
d = b + c;

Optimized code:

t = b + c;
a = t;
d = t;
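Within a single basic block, the transformation above can be sketched as a one-pass table lookup. This simplified version assumes the operands are not redefined between the two occurrences (a real pass must check that); the quadruple layout is illustrative:

```python
def eliminate_cse(code):
    """code: list of (dst, op, x, y) quadruples in one basic block.
    Replaces repeated (op, x, y) computations with a copy of the
    earlier result."""
    seen = {}          # (op, x, y) -> dst that already holds its value
    out = []
    for dst, op, x, y in code:
        key = (op, x, y)
        if key in seen:
            out.append((dst, '=', seen[key], None))   # reuse earlier result
        else:
            seen[key] = dst
            out.append((dst, op, x, y))
    return out
```

On the example above, the second b + c becomes a plain copy of the first result.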

---------(n) Parameter Passing Methods:


Call by Value: Passes a copy of the argument.
Call by Reference: Passes the memory address of the argument.
Call by Value-Result: Passes a copy in, then copies the final value back to
the argument when the call returns.
Call by Name: Passes an expression or code block to be evaluated.
Call by Pointer: Passes a pointer to the variable.
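The value/reference distinction can be simulated with an explicit environment standing in for the caller's storage (the function names here are illustrative):

```python
def increment_by_value(env, name):
    """Call by value: the callee works on a copy."""
    local = env[name]      # copy of the argument's value
    local += 1             # change is invisible to the caller
    return local

def increment_by_reference(env, name):
    """Call by reference: the callee updates the caller's storage cell."""
    env[name] += 1
    return env[name]
```

After a by-value call the caller's variable is unchanged; after a by-reference call it reflects the update.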
