
INTRODUCTION TO COMPILER DESIGN

CAT

1. Phases of the Front End of a Compiler:


- Lexical Analysis: Tokenizes the source code.
- Syntax Analysis: Constructs parse trees.
- Semantic Analysis: Checks that the program is semantically valid (type checking, scope rules).

2. Lexemes to Tokens & Importance of Tokens:


- The lexical analyzer groups characters into lexemes.
- Each lexeme is mapped to a token (a token name plus an optional attribute value).
- Tokens give the parser a compact, uniform input, simplifying syntax analysis (see the sketch below).
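The following is a minimal sketch, in Python, of how a lexical analyzer might group characters into lexemes and map them to tokens. The token names and the tiny expression language are assumptions made for illustration only.

```python
import re

# Token specification: (TOKEN_NAME, regex). Whitespace is matched but discarded.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("ASSIGN", r"="),
    ("PLUS",   r"\+"),
    ("SKIP",   r"[ \t]+"),
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(source):
    """Yield (token_name, lexeme) pairs for a single line of source text."""
    for match in MASTER_RE.finditer(source):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

print(list(tokenize("count = count + 1")))
# [('IDENT', 'count'), ('ASSIGN', '='), ('IDENT', 'count'), ('PLUS', '+'), ('NUMBER', '1')]
```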

3. Role of Symbol Table:


- Stores variable names, types, and scopes.
- Facilitates type checking and optimization.
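Below is a minimal sketch of a symbol table with nested scopes, assuming a simple dictionary-per-scope design; the entry fields (type and scope level) follow the description above and are not tied to any particular compiler.

```python
class SymbolTable:
    """A stack of scopes; each scope maps a name to its attributes."""

    def __init__(self):
        self.scopes = [{}]                    # global scope

    def enter_scope(self):
        self.scopes.append({})

    def exit_scope(self):
        self.scopes.pop()

    def declare(self, name, typ):
        self.scopes[-1][name] = {"type": typ, "scope_level": len(self.scopes) - 1}

    def lookup(self, name):
        # Search from the innermost scope outwards, as type checking requires.
        for scope in reversed(self.scopes):
            if name in scope:
                return scope[name]
        return None

table = SymbolTable()
table.declare("x", "int")
table.enter_scope()
table.declare("x", "float")                   # shadows the outer x
print(table.lookup("x"))                      # {'type': 'float', 'scope_level': 1}
```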

4. Error Recovery Strategies:


- Panic Mode: Discards input tokens until a synchronizing token (such as ';' or '}') is found, then resumes parsing.
- Phrase-Level: Attempts local corrections (e.g., inserting a missing ';').
- Error Productions: Extra grammar rules for anticipated common errors, so the parser can report them precisely.
- Global Correction: Finds a minimal set of changes that turns the input into a valid program (mostly of theoretical interest).
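As an illustration of panic mode, here is a minimal sketch that assumes statements in a toy language end with ';' (the synchronizing token); on an error the parser discards tokens up to and including the next ';' and then continues.

```python
SYNC_TOKENS = {";"}                            # synchronizing tokens (an assumption for this toy language)

def parse_statements(tokens):
    """Collect simple statements, recovering from errors in panic mode."""
    statements, errors, current = [], [], []
    i = 0
    while i < len(tokens):
        token = tokens[i]
        if token == "ERROR":                   # stand-in for any token the parser rejects
            errors.append(f"syntax error near position {i}; skipping to next ';'")
            while i < len(tokens) and tokens[i] not in SYNC_TOKENS:
                i += 1                         # panic mode: discard tokens
            current = []                       # drop the partial statement
        elif token in SYNC_TOKENS:
            if current:
                statements.append(current)
            current = []
        else:
            current.append(token)
        i += 1
    return statements, errors

stmts, errs = parse_statements(["a", "=", "1", ";", "ERROR", "b", ";", "c", "=", "2", ";"])
print(stmts)                                   # [['a', '=', '1'], ['c', '=', '2']]
print(len(errs))                               # 1 error recorded for the bad statement
```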

5. Classifications of a Compiler:
(i). Single-Pass vs Multi-Pass.
- Single-Pass Compiler: Processes the source code in one pass without revisiting earlier stages. Faster, but allows less optimization. Example: early Pascal compilers.
- Multi-Pass Compiler: Processes the source code in multiple passes, refining the code at each stage. Enables better optimization and error handling. Example: GCC (GNU Compiler Collection).
(ii). Just-In-Time (JIT) Compilation: Compiles code at runtime instead of ahead of time. Improves execution speed by optimizing frequently executed code segments. Used in the Java Virtual Machine (JVM), the .NET CLR, and modern JavaScript engines (V8, SpiderMonkey).
(iii). Cross Compiler: A compiler that generates machine code for a different platform than the one it runs on.
(iv). Incremental Compiler: A compiler that recompiles only the modified portions of code instead of the entire program.

6. Passes in Compiler Design:


- Single-Pass Compiler: Reads the code once.
- Multi-Pass Compiler: Processes the code multiple times for better optimization.
7. Types of Parsing:
- Top-Down Parsing (LL Parsing): Starts from the grammar's start symbol (the highest-level rule) and expands it into smaller components until the derivation matches the input string (a minimal recursive-descent sketch follows this list).
- Bottom-Up Parsing (LR Parsing): Starts with the input string and applies reduction steps to construct the parse tree, working back towards the start symbol of the grammar.
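A minimal recursive-descent (top-down, LL) sketch for a toy grammar, assumed here purely for illustration: expr -> term ('+' term)* and term -> NUMBER. Each grammar rule becomes one function that consumes tokens from left to right.

```python
def parse_expr(tokens):
    """expr -> term ('+' term)* ; returns (parse_tree, remaining_tokens)."""
    left, tokens = parse_term(tokens)
    while tokens and tokens[0] == "+":
        right, tokens = parse_term(tokens[1:])      # consume '+' then parse the next term
        left = ("+", left, right)
    return left, tokens

def parse_term(tokens):
    """term -> NUMBER"""
    if not tokens or not tokens[0].isdigit():
        raise SyntaxError(f"expected a number, got {tokens[:1]}")
    return ("num", int(tokens[0])), tokens[1:]

tree, remaining = parse_expr(["1", "+", "2", "+", "3"])
print(tree)        # ('+', ('+', ('num', 1), ('num', 2)), ('num', 3))
```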

8. Architecture of a Compiler:

Source Code -> Lexical Analyzer -> Syntax Analyzer -> Semantic Analyzer -> Intermediate Code Generator -> Optimizer -> Code Generator -> Target Machine Code


Lexical Analysis – Tokenizing the source code.

Syntax Analysis – Creating a parse tree.

Semantic Analysis – Checking that the program is semantically valid (types, scopes).

Intermediate Code Generation – Producing an intermediate representation.

Optimization – Enhancing performance.

Code Generation – Producing machine code.

Target Machine Code – Compiled code for the target machine.
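To show how the phases above connect, here is a minimal sketch that chains placeholder phase functions into one pipeline. The placeholder functions are assumptions that merely tag their input; they stand in for the real components described in this section.

```python
# Placeholder phases (assumptions for illustration): each one just labels its input.
lexical_analysis      = lambda src:  ("tokens",      src.split())
syntax_analysis       = lambda toks: ("tree",        toks)
semantic_analysis     = lambda tree: ("checked",     tree)
generate_intermediate = lambda tree: ("ir",          tree)
optimize              = lambda ir:   ("optimized",   ir)
generate_code         = lambda ir:   ("target_code", ir)

def compile_source(source):
    """Chain the classic phases: each phase's output is the next phase's input."""
    tokens       = lexical_analysis(source)
    tree         = syntax_analysis(tokens)
    checked_tree = semantic_analysis(tree)
    ir           = generate_intermediate(checked_tree)
    ir           = optimize(ir)
    return generate_code(ir)

print(compile_source("a = b + 1"))
```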

9. Peephole Optimization:
- Examines a small window (peephole) of adjacent instructions in the generated code and replaces inefficient patterns with better ones.
- Eliminates redundant instructions, for example a load that immediately follows a store to the same location (see the sketch below).
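A minimal peephole sketch over a made-up two-field instruction list: it looks at adjacent instructions and drops a load that immediately follows a store to the same location, since the value is already in the register.

```python
def peephole(instructions):
    """Remove 'LOAD x' that immediately follows 'STORE x' in the instruction list."""
    optimized = []
    for instruction in instructions:
        if (optimized
                and instruction[0] == "LOAD"
                and optimized[-1][0] == "STORE"
                and optimized[-1][1] == instruction[1]):
            continue                       # redundant load: the value is already available
        optimized.append(instruction)
    return optimized

code = [("STORE", "x"), ("LOAD", "x"), ("ADD", "y")]
print(peephole(code))                      # [('STORE', 'x'), ('ADD', 'y')]
```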

10. Code Optimization:


- Enhances execution efficiency and performance.

11. Code Optimization Techniques:


- Constant Folding: Replaces constant expressions with their computed values at compile time (e.g., 3 * 4 becomes 12).
- Dead Code Elimination: Removes code that never executes or whose results are never used.
- Loop Unrolling: Expands loop iterations to reduce loop overhead (fewer tests and jumps).
- Strength Reduction: Replaces expensive operations with cheaper alternatives (e.g., x * 2 becomes x + x). A constant-folding sketch follows below.
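As a worked example of the first technique, here is a minimal constant-folding sketch over a small tuple-based expression tree; the tree shape is an assumption made for illustration.

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def fold(node):
    """Fold constant sub-expressions: ('+', ('num', 3), ('num', 4)) becomes ('num', 7)."""
    if node[0] in ("num", "var"):
        return node
    op, left, right = node
    left, right = fold(left), fold(right)
    if left[0] == "num" and right[0] == "num":
        return ("num", OPS[op](left[1], right[1]))    # evaluated at compile time
    return (op, left, right)

print(fold(("+", ("*", ("num", 3), ("num", 4)), ("var", "x"))))
# ('+', ('num', 12), ('var', 'x'))
```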

12. Lexical Analysis Phase:


- Converts source code to tokens.
- Identifies keywords, identifiers, and symbols.

13. Outputs of Preprocessor Phase:


- Expanded macros.
- File inclusions.
- Conditional compilation results.
- Comment removal.
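To illustrate two of these outputs (macro expansion and comment removal), here is a minimal sketch for a C-like source string. The regexes and the handling of #define are deliberately simplified assumptions, not a real C preprocessor.

```python
import re

def preprocess(source):
    """Expand simple object-like #define macros and strip // comments."""
    macros, output = {}, []
    for line in source.splitlines():
        line = re.sub(r"//.*", "", line)                      # comment removal
        directive = re.match(r"\s*#define\s+(\w+)\s+(.+)", line)
        if directive:
            macros[directive.group(1)] = directive.group(2).strip()
            continue                                          # the directive itself produces no output
        for name, value in macros.items():
            line = re.sub(rf"\b{name}\b", value, line)        # macro expansion
        output.append(line)
    return "\n".join(output)

print(preprocess("#define MAX 100\nint limit = MAX; // upper bound"))
# int limit = 100;
```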

14. Reasons for Intermediate Code:


- Enhances portability.
- Simplifies optimization.
- Bridges high-level and machine code.
- Enables retargeting for different architectures.
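A minimal sketch of generating three-address intermediate code from the same tuple-based expression tree used earlier; each operation is given a fresh temporary name, which is part of what makes the intermediate form easy to optimize and to retarget.

```python
import itertools

def gen_three_address(node, code, temps):
    """Return the name holding node's value, appending three-address instructions to code."""
    if node[0] == "num":
        return str(node[1])
    if node[0] == "var":
        return node[1]
    op, left, right = node
    left_name = gen_three_address(left, code, temps)
    right_name = gen_three_address(right, code, temps)
    temp = f"t{next(temps)}"
    code.append(f"{temp} = {left_name} {op} {right_name}")
    return temp

code = []
gen_three_address(("+", ("*", ("var", "a"), ("var", "b")), ("num", 4)), code, itertools.count(1))
print("\n".join(code))
# t1 = a * b
# t2 = t1 + 4
```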
