
Code Optimization in Compiler Design
Kunal Jangra
Shivam Tripathi
Tushar Jain
Code Optimization
 Code optimization in the synthesis phase is a program transformation technique that tries to
improve the intermediate code by making it consume fewer resources, so that the resulting
machine code runs faster.
Objectives of Compiler Optimization

 The compiler optimization process should meet the following objectives:

• The optimization must be correct; it must not, in any way, change the meaning of the
program.

• Optimization should increase the speed and performance of the program.

• The compilation time must be kept reasonable.

• The optimization process should not delay the overall compilation process.
When to Optimize?

 Optimization of the code is usually performed at the end of the development stage, since it
reduces readability and adds code whose only purpose is to increase performance.
Why Optimize?
 Optimizing the algorithm itself is beyond the scope of the code optimization phase; instead,
the program's code is optimized, which may also involve reducing its size. Optimization
helps to:
• Reduce the space consumed by the code and increase the speed of execution.
• Avoid tedious manual work. Just as manually analyzing datasets takes a lot of time and is
better done with software like Tableau, manually performing the optimization is tedious
and is better done using a code optimizer.
• Promote re-usability of the optimized code.
Types of Code Optimization

 The optimization process can be broadly classified into two types:


1. Machine Independent Optimization – This code optimization phase attempts to
improve the intermediate code to get a better target code as the output. The part of the
intermediate code which is transformed here does not involve any CPU registers or
absolute memory locations.
2. Machine Dependent Optimization – Machine-dependent optimization is done after
the target code has been generated and when the code is transformed according to the
target machine architecture. It involves CPU registers and may use absolute memory
references rather than relative references. Machine-dependent optimizers put effort into
taking maximum advantage of the memory hierarchy.
Code Optimization is done in the following
different ways :
 Compile Time Evaluation :
 (i)  A = 2*(22.0/7.0)*r
      Evaluate 2*(22.0/7.0) at compile time, since it involves only constants; only the
      multiplication by r remains for run time.
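A small, hedged C sketch of the same idea (the value of r and the folded constant are
assumptions added so the snippet runs; only the expression comes from the slide):

#include <stdio.h>

/* 2*(22.0/7.0) involves only constants, so a compiler can fold it into a
   single constant at compile time; only the multiplication by the variable
   r is left for run time. */
int main(void) {
    double r = 3.0;                           /* assumed value, for illustration */
    double A = 2 * (22.0 / 7.0) * r;          /* as written in the source        */
    double A_folded = 6.2857142857142856 * r; /* roughly what the compiler emits */
    printf("%f %f\n", A, A_folded);
    return 0;
}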
 Variable Propagation :
 //Before Optimization
 c = a * b
 x = a
 till
 d = x * b + 4

 //After Optimization
 c = a * b
 x = a
 till
 d = a * b + 4
 Dead code elimination : Variable propagation often turns an assignment statement into
dead code; here, x = a is no longer used.
 c = a * b
 x = a
 till
 d = a * b + 4

 //After elimination :
 c = a * b
 till
 d = a * b + 4
 Code Motion :
• Reduce the evaluation frequency of expressions.
• Bring loop-invariant statements out of the loop.
 a = 200;
 while (a > 0)
 {
     b = x + y;
     if (a % b == 0)
         printf("%d", a);
 }
 //This code can be further optimized as
 a = 200;
 b = x + y;
 while (a > 0)
 {
     if (a % b == 0)
         printf("%d", a);
 }
 Induction Variable and Strength Reduction :
• An induction variable is used in a loop for the following kind of assignment: i = i + constant.
• Strength reduction means replacing a high-strength operator with a lower-strength one.
 i = 1;
 while (i < 10)
 {
     y = i * 4;
     i = i + 1;
 }
 //After Reduction
 t = 4;
 while (t < 40)
 {
     y = t;
     t = t + 4;
 }
Where to apply Optimization?
• Source program
Optimizing the source program involves making changes to the algorithm or changing the
loop structures. The user is the actor here.
• Intermediate Code
Optimizing the intermediate code involves changing the address calculations and
transforming the procedure calls involved. Here the compiler is the actor.
• Target Code
Optimizing the target code is done by the compiler. Usage of registers and the selection and
movement of instructions are part of the optimization involved in the target code.
Phases of Optimization

• There are generally two phases of optimization:

• Global Optimization:
Transformations are applied to large program segments that include
functions, procedures, and loops.
• Local Optimization:
Transformations are applied to small blocks of statements. Local optimization is
done prior to global optimization.
PRINCIPAL SOURCES OF OPTIMISATION

 A transformation of a program is called local if it can be performed by looking only at
the statements in a basic block; otherwise, it is called global. Many transformations can
be performed at both the local and global levels. Local transformations are usually
performed first.

 Function preserving transformations examples:

• Common sub-expression elimination
• Copy propagation
• Dead-code elimination
• Constant folding
Common Sub-expression Elimination:

 An occurrence of an expression E is called a common sub-expression if E was
previously computed, and the values of variables in E have not changed since the
previous computation. We can avoid recomputing the expression if we can use the
previously computed value.
For example

t1 := 4*i
t2 := a[t1]
t3 := 4*j
t4 := 4*i
t5 := n
t6 := b[t4] + t5

The above code can be optimized using common sub-expression elimination as

t1 := 4*i
t2 := a[t1]
t3 := 4*j
t5 := n
t6 := b[t1] + t5

The common sub-expression t4 := 4*i is eliminated, as its value is already computed in t1 and the
value of i has not changed between that definition and this use.
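A hedged, source-level C sketch of the same transformation (the arrays, their contents, and the
function names before/after are illustrative assumptions, not part of the slide):

#include <stdio.h>

/* before() computes 4*i twice; after() reuses the first result. This mirrors
   the three-address code above: t4 := 4*i is redundant because t1 already
   holds that value and i has not changed in between. */
static int before(const int a[], const int b[], int i, int n) {
    int t1 = 4 * i;
    int t2 = a[t1];
    int t4 = 4 * i;        /* common sub-expression, recomputed */
    int t6 = b[t4] + n;
    return t2 + t6;
}

static int after(const int a[], const int b[], int i, int n) {
    int t1 = 4 * i;        /* computed once, reused below */
    int t2 = a[t1];
    int t6 = b[t1] + n;
    return t2 + t6;
}

int main(void) {
    int a[8] = {0}, b[8] = {0};
    a[4] = 7;
    b[4] = 5;
    printf("%d %d\n", before(a, b, 1, 2), after(a, b, 1, 2)); /* both print 14 */
    return 0;
}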
 
Copy Propagation:

Assignments of the form f := g are called copy statements, or copies for short. The idea behind the
copy-propagation transformation is to use g for f wherever possible after the copy statement f := g.
Copy propagation means using one variable instead of another. This may not appear to be an
improvement, but as we shall see, it gives us an opportunity to eliminate the copied variable.

• For example:
x = Pi;

A = x * r * r;

The optimization using copy propagation can be done as follows: A = Pi * r * r;

Here the variable x is eliminated.
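A hedged, compilable version of this example (the values of Pi and r are assumptions added so
the snippet runs):

#include <stdio.h>

/* After propagating the copy x = Pi, every later use of x can be replaced
   by Pi, at which point x becomes dead and can be removed. */
int main(void) {
    const double Pi = 3.14159;  /* assumed value, for illustration */
    double r = 2.0;             /* assumed value, for illustration */

    double x = Pi;              /* copy statement            */
    double A = x * r * r;       /* before: uses the copy     */
    double A_prop = Pi * r * r; /* after copy propagation    */

    printf("%f %f\n", A, A_prop);
    return 0;
}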
Dead-Code Elimination:

A variable is live at a point in a program if its value can be used subsequently; otherwise, it is dead at that point. A
related idea is dead or useless code, statements that compute values that never get used. While the programmer is
unlikely to introduce any dead code intentionally, it may appear as the result of previous transformations.
 
Example:
 
i = 0;
if (i == 1)
{
    a = b + 5;
}

Here, the 'if' statement is dead code because its condition will never be satisfied.
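A hedged, compilable version of this example (the values of a and b are assumptions added so
the snippet runs):

#include <stdio.h>

/* Since i is assigned 0 and never changed, the test i == 1 is always false,
   so the whole if statement is dead and can be removed without changing the
   program's observable behaviour. */
int main(void) {
    int a = 1, b = 2;   /* assumed values, for illustration */
    int i = 0;
    if (i == 1) {       /* never true: dead code */
        a = b + 5;
    }
    printf("%d\n", a);  /* prints 1 whether or not the dead code is removed */
    return 0;
}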
 
Loop Optimizations:

Programs tend to spend the bulk of their running time in loops, especially inner loops. The
running time of a program may therefore be improved if the number of instructions in an
inner loop is decreased, even if we increase the amount of code outside that loop.

Three techniques are important for loop optimization:
•     Code motion, which moves loop-invariant code outside a loop;
•     Induction-variable elimination, which removes redundant induction variables from
inner loops (a sketch follows below);
•     Reduction in strength, which replaces an expensive operation with a cheaper
one, such as a multiplication with an addition.
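Induction-variable elimination is the one technique not shown earlier, so here is a hedged C
sketch (the array, its size, and the pointer-based rewrite are illustrative assumptions):

#include <stdio.h>

/* The counter j exists only to index the array and to compute j * 4, so after
   strength reduction it can be eliminated in favour of variables the loop
   already maintains. */
int main(void) {
    int a[10];

    /* Before: j is an induction variable, and so (conceptually) is j * 4. */
    for (int j = 0; j < 10; j++)
        a[j] = j * 4;
    printf("%d %d\n", a[3], a[9]);   /* 12 36 */

    /* After: j is gone; the pointer p and the running value v are stepped
       directly, replacing the multiplication with additions. */
    int v = 0;
    for (int *p = a; p < a + 10; p++, v += 4)
        *p = v;
    printf("%d %d\n", a[3], a[9]);   /* 12 36 */

    return 0;
}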
