Principles of Code Optimization

The document summarizes Lecture IX on code optimization techniques in compilers. It discusses representing a program as a flow graph using basic blocks. The principal sources of optimization covered are common subexpression elimination, copy propagation, dead code elimination, constant folding, and loop optimization. These techniques aim to improve code speed and reduce space by transforming intermediate code through control and data flow analyses.


Free University of Bolzano – Principles of Compilers. Lecture IX, 2003/2004 – A. Artale (1)

Principles of Compilers
Lecture IX: Principles of Code Optimization

Alessandro Artale
Faculty of Computer Science – Free University of Bolzano
Room: 221
[email protected]
http://www.inf.unibz.it/~artale/

2003/2004 – Second Semester



Summary of Lecture IX

Code Optimization

Basic Blocks and Flow Graphs

Sources of Optimization
1. Common Subexpression Elimination
2. Copy Propagation
3. Dead-Code Elimination
4. Constant Folding
5. Loop Optimization

Code Optimization: Intro

Intermediate code undergoes various transformations, called Optimizations, to make the resulting code run faster and take up less space.

Optimization never guarantees that the resulting code is the best possible.

We will consider only Machine-Independent Optimizations, i.e., optimizations that do not take into account any properties of the target machine.

The techniques used are a combination of Control-Flow and Data-Flow analysis.
– Control-Flow Analysis. Identifies loops in the flow graph of a program, since such loops are usually good candidates for improvement.
– Data-Flow Analysis. Collects information about the way variables are used in a program.

Criteria for Code-Improving Transformations

The best transformations are those that yield the most benefit for the least effort.
1. A transformation must preserve the meaning of a program. It is better to miss an opportunity to apply a transformation than to risk changing what the program does.
2. A transformation must, on average, speed up a program by a measurable amount.
3. Avoid code optimization for programs that run only occasionally or during debugging.
4. Remember! Dramatic improvements are usually obtained by improving the source code: the programmer is always responsible for finding the best possible data structures and algorithms for solving a problem.
Quicksort: An Example Program
We will use the sorting program Quicksort to illustrate the effects of the various optimization techniques.

void quicksort(m,n)
int m,n;
{
    int i,j,v,x;
    if (n <= m) return;
    i = m-1; j = n; v = a[n];            /* fragment begins here */
    while (1) {
        do i = i+1; while (a[i] < v);
        do j = j-1; while (a[j] > v);
        if (i >= j) break;
        x = a[i]; a[i] = a[j]; a[j] = x;
    }
    x = a[i]; a[i] = a[n]; a[n] = x;     /* fragment ends here */

    quicksort(m,j); quicksort(i+1,n);
}
Quicksort: An Example Program (Cont.)
The following is the three-address code for a fragment of Quicksort.
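The code figure itself did not survive extraction. The listing below is the standard textbook rendering of the three-address code for the marked fragment; the statement numbers, the temporary names t1–t15, and the 4-byte array elements are assumptions of this reconstruction, not taken from the slide.

```
(1)  i := m - 1            (16) t7  := 4*i
(2)  j := n                (17) t8  := 4*j
(3)  t1 := 4*n             (18) t9  := a[t8]
(4)  v := a[t1]            (19) a[t7] := t9
(5)  i := i + 1            (20) t10 := 4*j
(6)  t2 := 4*i             (21) a[t10] := x
(7)  t3 := a[t2]           (22) goto (5)
(8)  if t3 < v goto (5)    (23) t11 := 4*i
(9)  j := j - 1            (24) x   := a[t11]
(10) t4 := 4*j             (25) t12 := 4*i
(11) t5 := a[t4]           (26) t13 := 4*n
(12) if t5 > v goto (9)    (27) t14 := a[t13]
(13) if i >= j goto (23)   (28) a[t12] := t14
(14) t6 := 4*i             (29) t15 := 4*n
(15) x := a[t6]            (30) a[t15] := x
```

Array references a[i] are expanded into an address computation (4*i for 4-byte elements) followed by an indexed access, which is what creates the repeated subexpressions the later slides eliminate.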

Basic Blocks and Flow Graphs

The Machine-Independent Code-Optimization phase consists of control-flow and data-flow analysis followed by the application of transformations.

During Control-Flow Analysis, a program is represented as a Flow Graph, where:
– Nodes represent Basic Blocks: sequences of consecutive statements in which flow of control enters at the beginning and leaves at the end, without halting or branching except at the end;
– Edges represent the flow of control.
Flow Graph: An Example

Flow graph for the three-address code fragment of quicksort. Each node is a basic block.

The Principal Sources of Optimization

We distinguish local transformations, involving only statements in a single basic block, from global transformations.

A basic block computes a set of expressions: a number of transformations can be applied to a basic block without changing the set of expressions computed by the block.
1. Common Subexpression Elimination;
2. Copy Propagation;
3. Dead-Code Elimination;
4. Constant Folding.
Common Subexpression Elimination

Frequently a program will include several computations of the same value.

An occurrence of an expression E is called a common subexpression if E was previously computed and the values of the variables in E have not changed since the previous computation.

Example. Consider the basic block of the quicksort fragment that swaps a[i] and a[j]: the address expressions 4*i and 4*j are each computed twice, so the second computation of each pair is a common subexpression and can be eliminated. After local common subexpression elimination, the block re-uses the temporaries that already hold those values.
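The before/after code of this example was lost in extraction; the sketch below is the standard textbook version of the step. The block name B5 and the temporaries t6–t10 are the textbook's names, assumed here.

```
B5 before local CSE         B5 after local CSE
t6  := 4*i                  t6 := 4*i
x   := a[t6]                x  := a[t6]
t7  := 4*i                  t8 := 4*j
t8  := 4*j          =>      t9 := a[t8]
t9  := a[t8]                a[t6] := t9
a[t7] := t9                 a[t8] := x
t10 := 4*j                  goto B2
a[t10] := x
goto B2
```

The redundant computations t7 := 4*i and t10 := 4*j disappear; the stores simply re-use t6 and t8.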

Common Subexpression Elimination (Cont.)

Example (Cont.) After local elimination, the block still evaluates 4*i and 4*j, which remain common subexpressions: each is already evaluated, in a temporary, in an earlier block of the flow graph. The statements that recompute 4*i and index the array with it can therefore be replaced by statements that use the earlier temporary directly.

Now the array load through that temporary is itself a common subexpression, already computed in the earlier block, so the load can be replaced by a simple copy.

Analogously, the value loaded through the temporary holding 4*j is the same as the value assigned to a temporary in the preceding block, so that load too can be eliminated and replaced by a copy.
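The concrete statements were lost in extraction; in the textbook version of this step (block and temporary names assumed, as before), t2 := 4*i and t3 := a[t2] are computed in B2, and t4 := 4*j and t5 := a[t4] in B3, so after global elimination B5 shrinks to copies:

```
B5 after local CSE          B5 after global CSE
t6 := 4*i                   x := t3
x  := a[t6]                 a[t2] := t5
t8 := 4*j           =>      a[t4] := x
t9 := a[t8]                 goto B2
a[t6] := t9
a[t8] := x
goto B2
```

Every recomputation of an address or array value has been replaced by a re-use of the temporary that first computed it.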
Common Subexpression Elimination (Cont.)

Example. The following flow graph shows the result of eliminating both local and global common subexpressions from the basic blocks of the quicksort fragment.

DAGs for Determining Common Subexpressions

To identify common subexpressions we represent a basic block as a DAG showing how expressions are re-used within the block.

A DAG for a Basic Block has the following labels and nodes:
1. Leaves contain unique identifiers, either variable names or constants.
2. Interior nodes contain an operator symbol.
3. Nodes can optionally be associated with a list of variables: those variables whose value is the one computed at the node.

DAGs for Blocks: An Example

The following shows both the three-address code of a basic block and its associated DAG.

[Figure lost in extraction. Recoverable labels: leaves 4, i, 1, 20, a, b, prod; interior nodes [], *, a + node labelled prod, and a + node labelled i; a branch target (1).]

Copy Propagation

Copy Propagation Rule: Given the copy statement x := y, use y for x wherever possible after the copy statement.

Copy Propagation applied to the block obtained after common subexpression elimination replaces the use of the copied variable in the final array store by the temporary it was copied from.

This transformation, together with Dead-Code Elimination (see next slide), will give us the opportunity to eliminate the copy assignment altogether.
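With the textbook block names (assumed, as in the earlier examples), copy propagation on B5 replaces the use of x in the last store by t3:

```
B5 before                   B5 after copy propagation
x := t3                     x := t3
a[t2] := t5         =>      a[t2] := t5
a[t4] := x                  a[t4] := t3
goto B2                     goto B2
```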

Dead-Code Elimination

A variable is live at a point in a program if its value can be used subsequently; otherwise it is dead.

A piece of code is dead if the data it computes is never used elsewhere.

Dead code may appear as the result of previous transformations.

Dead-Code Elimination works well together with Copy Propagation.

Example. Considering the block after Copy Propagation, we can see that x is never reused anywhere in the code. Thus x is a dead variable and we can eliminate the assignment to x from the block.
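Continuing the assumed textbook example: after copy propagation x is no longer used, so its assignment is dead and can be removed.

```
B5 after copy propagation      B5 after dead-code elimination
x := t3                        a[t2] := t5
a[t2] := t5            =>      a[t4] := t3
a[t4] := t3                    goto B2
goto B2
```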

Constant Folding

Constant Folding is based on deducing at compile time that the value of an expression (and in particular the value of a variable) is a constant.

Constant Folding is the transformation that substitutes such an expression with its constant value.

Constant Folding is useful for discovering dead code.

Example. Consider the conditional statement if (E) goto L. If, by Constant Folding, we discover that E is always false, we can eliminate both the if-test and the jump to L.
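A minimal sketch of the idea (the statements and label are illustrative, not from the slide):

```
x := 0                       x := 0
if x > 2 goto L      =>      (the propagated test 0 > 2 is always
...                           false: the if-test and the jump to L
L: ...                        are deleted as dead code)
```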

Loop Optimization

The running time of a program can be improved if we decrease the number of instructions in an inner loop.

Three techniques are useful:
1. Code Motion
2. Reduction in Strength
3. Induction-Variable Elimination

Code Motion

If the computation of an expression is loop-invariant, this transformation places the computation before the loop.

Example. Consider the following while statement:

while (i <= limit - 2) do

The expression limit - 2 is loop-invariant. The code motion transformation results in:

t := limit - 2;
while (i <= t) do

Reduction in Strength

Reduction in Strength is based on replacing a computation with a less expensive one.

Example. Consider the assignment of 4*j to a temporary in the block that decrements j. Since j is decremented by 1 on each iteration, the temporary decreases by 4 each time. Thus, we may replace the multiplication by a subtraction of 4 from the temporary's previous value.

Problem: we need to initialize the temporary to 4*j before entering the block.

– Result. The substitution of a multiplication by a subtraction will speed up the resulting code.
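In the textbook version of this example (block and temporary names assumed), the assignment is t4 := 4*j in block B3:

```
B3 before                    B3 after reduction in strength
j  := j - 1                  j  := j - 1
t4 := 4*j            =>      t4 := t4 - 4
t5 := a[t4]                  t5 := a[t4]
if t5 > v goto B3            if t5 > v goto B3

(t4 is initialized to 4*j once, before entering the loop of B3)
```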

Induction Variables

A variable x is an Induction Variable of a loop if every time x changes value, it is incremented or decremented by some constant.

A common situation is one in which an induction variable, say i, indexes an array, and some other induction variable, say t, is the actual offset used to access the array:
– Often we can get rid of i.
– In general, when there are two or more induction variables it is possible to get rid of all but one.

Induction-Variable Elimination: An Example

Example. Consider the loop of the block that decrements j: the variable j and the temporary holding 4*j are both Induction Variables. The same applies to i and the temporary holding 4*i in the block that increments i.

After Reduction in Strength is applied to both blocks, the only remaining use of i and j is to determine the test between them.

Since the temporaries always hold 4*i and 4*j, the test i >= j is equivalent to the same test on the temporaries.

After this replacement in the test, both i and j become dead variables and can be eliminated! (See the next slide for the new optimized code.)
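With the assumed textbook names, t2 = 4*i in block B2 and t4 = 4*j in block B3, so the test and the loops become:

```
B2 before                    B2 after eliminating i
i  := i + 1                  t2 := t2 + 4
t2 := t2 + 4         =>      t3 := a[t2]
t3 := a[t2]                  if t3 < v goto B2
if t3 < v goto B2

B4: if i >= j goto B6   =>   B4: if t2 >= t4 goto B6
```

Once the test runs on t2 and t4, the increments of i and j compute values that are never used, so those statements are dead and disappear.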
Induction-Variable Elimination: An Example (Cont.)
Flow graph after Reduction in Strength and Induction-Variable Elimination.