
CSE2002 Session38 Code Optimization3038

The document discusses various optimization techniques in programming, focusing on machine-independent code optimization and the importance of identifying critical code sections. It classifies optimization methods, highlights factors influencing optimization, and outlines themes such as redundancy elimination and code locality. Additionally, it details specific optimizing transformations like dead code elimination, loop optimization, and peephole optimization, along with their respective strategies and examples.

Organization

 Introduction
 Classifications of Optimization techniques
 Factors influencing Optimization
 Themes behind Optimization Techniques
 Optimizing Transformations

Reference
 https://www.iitg.ac.in/dgoswami/#Education
 Thanks to
 Diganta Goswami
 Professor
Dept. of Computer Science & Engg.
Indian Institute of Technology Guwahati
Guwahati - 781039, Assam, INDIA

Introduction
 Concerns machine-independent code optimization
 90-10 rule: execution spends 90% of its time in 10% of the code
 It is moderately easy to achieve 90% of the optimization benefit. The remaining 10% is very difficult
 Identifying that 10% of the code is not possible for a compiler – it is the job of a profiler
 In general, loops are the hot-spots

Introduction
 Criteria for code optimization
 Must preserve the semantic equivalence of the program
 The algorithm should not be modified
 Transformations should, on average, speed up the execution of the program
 Worth the effort: intellectual and compilation effort should not be spent on insignificant improvements. Transformations should be simple enough to have a good effect

Introduction
 Optimization can be done in almost all
phases of compilation.
Source code → Front end → Intermediate code → Code generator → Target code

 On source code: profile and optimize (user)
 On intermediate code: loop, procedure-call, and address-calculation improvements (compiler)
 On target code: register usage, instruction choice, peephole optimization (compiler)
Introduction
 Organization of an optimizing compiler

Code optimizer:

Control flow analysis → Data flow analysis → Transformations
Classifications of Optimization
techniques
 Peephole optimization
 Local optimizations
 Global Optimizations
 Inter-procedural
 Intra-procedural
 Loop optimization

Factors influencing Optimization
 The target machine: machine-dependent factors can be parameterized and passed to the compiler for fine-tuning
 Architecture of Target CPU:
 Number of CPU registers
 RISC vs CISC
 Pipeline Architecture
 Number of functional units
 Machine Architecture
 Cache Size and type
 Cache/Memory transfer rate

Themes behind Optimization
Techniques
 Avoid redundancy: something already computed need not be computed again
 Smaller code: less work for CPU, cache, and memory!
 Fewer jumps: jumps interfere with instruction pre-fetch
 Code locality: code executed close together in time should be generated close together in memory – increases locality of reference
 Extract more information about the code: more information enables better code generation

Redundancy elimination
 Redundancy elimination = determining that two computations are
equivalent and eliminating one.
 There are several types of redundancy elimination:
 Value numbering
 Associates symbolic values to computations and identifies expressions that have the
same value
 Common subexpression elimination
 Identifies expressions that have operands with the same name
 Constant/Copy propagation
 Identifies variables that have constant/copy values and uses the constants/copies in
place of the variables.
 Partial redundancy elimination
 Inserts computations in paths to convert partial redundancy to full redundancy.

Optimizing Transformations
 Compile time evaluation
 Common sub-expression elimination
 Code motion
 Strength Reduction
 Dead code elimination
 Copy propagation
 Loop optimization
 Induction variables and strength reduction

Compile-Time Evaluation
 Expressions whose values can be precomputed at compile time
 Two ways:
 Constant folding
 Constant propagation

Compile-Time Evaluation
 Constant folding: evaluation of an expression with constant operands, replacing the expression with a single value
 Example:
area := (22.0/7.0) * r ** 2

area := 3.14286 * r ** 2
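As an illustration (not part of the original slides), constant folding can be sketched with Python's ast module: the folder walks an expression bottom-up and collapses any operator whose operands are both constants, exactly as in the area example above. The operator table and function names here are my own.

```python
import ast
import operator

# map AST operator nodes to the functions that evaluate them
OPS = {ast.Add: operator.add, ast.Sub: operator.sub, ast.Mult: operator.mul,
       ast.Div: operator.truediv, ast.Pow: operator.pow}

class ConstantFolder(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)                 # fold children first (bottom-up)
        if (isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant)
                and type(node.op) in OPS):
            try:
                value = OPS[type(node.op)](node.left.value, node.right.value)
            except ZeroDivisionError:
                return node                      # would be a runtime error; leave it
            return ast.copy_location(ast.Constant(value), node)
        return node

def fold(expr: str) -> str:
    tree = ConstantFolder().visit(ast.parse(expr, mode="eval"))
    return ast.unparse(ast.fix_missing_locations(tree))

print(fold("(22.0 / 7.0) * r ** 2"))   # -> 3.142857142857143 * r ** 2
```

Note that r stays symbolic, so r ** 2 is left untouched; only the constant sub-expression collapses.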

Compile-Time Evaluation
 Constant propagation: replace a variable with a constant that has been assigned to it earlier
 Example:
pi := 3.14286
area := pi * r ** 2

area := 3.14286 * r ** 2

Common Sub-expression
Elimination
 Identify common sub-expressions present in different expressions, compute once, and use the result in all the places
 The definitions of the variables involved should not change in between

Example:
a := b * c temp := b * c
… a := temp
… …
x := b * c + 5 x := temp + 5
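To make the bookkeeping concrete, here is a minimal sketch (the tuple format and names are my own, not from the slides) of local common-subexpression elimination over a straight-line block of three-address statements:

```python
def local_cse(block):
    """Local CSE over a straight-line block of (dest, op, arg1, arg2) tuples,
    introducing t1, t2, ... temporaries for each distinct expression."""
    available = {}          # (op, arg1, arg2) -> temp currently holding its value
    out, n = [], 0
    for dest, op, a1, a2 in block:
        key = (op, a1, a2)
        if key in available:
            out.append((dest, "copy", available[key], None))   # reuse earlier result
        else:
            n += 1
            temp = f"t{n}"
            available[key] = temp
            out.append((temp, op, a1, a2))
            out.append((dest, "copy", temp, None))
        # assigning dest invalidates any expression that used it as an operand
        available = {k: v for k, v in available.items()
                     if dest not in (k[1], k[2])}
    return out

block = [("c", "+", "a", "b"), ("d", "*", "m", "n"), ("f", "+", "a", "b")]
print(local_cse(block))
```

The second occurrence of a + b becomes a copy of t1, mirroring the example above; a real pass would also handle commutativity (b + a) by normalizing operand order.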

Common Subexpression Elimination
Before:
c = a + b
d = m * n
e = b + d
f = a + b
g = -b
h = b + a
a = j + a
k = m * n
j = b + d
a = -b
if m * n goto L

After:
t1 = a + b
c = t1
t2 = m * n
d = t2
t3 = b + d
e = t3
f = t1
g = -b
h = t1        /* commutative */
a = j + a
k = t2
j = t3
a = -b
if t2 goto L

(The expression table contains quintuples: (pos, opd1, opr, opd2, tmp))
Code Motion
 Moving code from one part of the program to another without modifying the algorithm
 Reduces the size of the program
 Reduces the execution frequency of the moved code

Code Motion
1. Code Space reduction: Similar to common
sub-expression elimination but with the
objective to reduce code size.
Example: Code hoisting

Before:
if (a < b) then
    z := x ** 2
else
    y := x ** 2 + 10

After:
temp := x ** 2
if (a < b) then
    z := temp
else
    y := temp + 10

"x ** 2" is computed once in both cases, but the code size in the second case reduces.
Code Motion
2. Execution frequency reduction: reduce the execution frequency of partially available expressions (expressions available along at least one path)

Example:

Before:
if (a < b) then
    z = x * 2
else
    y = 10
g = x * 2

After:
if (a < b) then
    temp = x * 2
    z = temp
else
    y = 10
    temp = x * 2
g = temp
Code Motion
Move an expression out of a loop if its evaluation does not change inside the loop.
Example:
while ( i < (max-2) ) …
Equivalent to:
t := max - 2
while ( i < t ) …
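The effect of the transformation can be sketched directly (function names are illustrative, not from the slides); both versions count the same iterations, but the second evaluates max - 2 only once, before the loop:

```python
def count_up_naive(i, max_value):
    # the loop-invariant expression (max_value - 2) is evaluated every iteration
    steps = 0
    while i < (max_value - 2):
        i += 1
        steps += 1
    return steps

def count_up_hoisted(i, max_value):
    t = max_value - 2          # hoisted: evaluated once, in the loop pre-header
    steps = 0
    while i < t:
        i += 1
        steps += 1
    return steps

print(count_up_naive(0, 10), count_up_hoisted(0, 10))   # both print 8
```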

Strength Reduction
 Replacement of an operator with a less costly one.
Example:

Before:
for i = 1 to 10 do
    …
    x = i * 5
    …
end

After:
temp = 5
for i = 1 to 10 do
    …
    x = temp
    …
    temp = temp + 5
end

• Typical cases of strength reduction occur in the address calculation of array references.
• Applies to integer expressions involving induction variables (loop optimization)
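A runnable sketch of the transformation above (function names are illustrative): the multiplication i * 5 in the loop body is replaced by a running addition on an induction temporary:

```python
def multiples_naive(n):
    # one multiplication per iteration
    return [i * 5 for i in range(1, n + 1)]

def multiples_reduced(n):
    # strength-reduced: the multiplication becomes repeated addition
    out, temp = [], 5
    for _ in range(1, n + 1):
        out.append(temp)
        temp = temp + 5
    return out

print(multiples_reduced(10))   # [5, 10, 15, 20, 25, 30, 35, 40, 45, 50]
```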
Dead Code Elimination
 Dead code is the portion of the program that will never be executed on any path through the program
 It can be removed
 Examples:
 No control flow reaches a basic block
 A variable is dead at a point if its value is not used anywhere from that point on
 An assignment is dead if it assigns a value to a dead variable

Dead Code Elimination
• Examples:

DEBUG := 0
if (DEBUG) print      -- can be eliminated
Copy Propagation
 What does it mean?
 Given an assignment x = y, replace later uses of x
with uses of y, provided there are no intervening
assignments to x or y.
 When is it performed?
 At any level, but usually early in the optimization process
 What is the result?
 Smaller code

Copy Propagation
 Statements of the form f := g are called copy statements, or copies
 Use g in place of f wherever possible after the copy statement

Example:
x[i] = a; x[i] = a;
sum = x[i] + a; sum = a + a;
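A minimal local copy-propagation sketch (the tuple instruction format is my own, not from the slides), using scalars rather than the array reference above; a copy is killed as soon as either of its variables is reassigned:

```python
def copy_propagate(block):
    """Propagate copies through a straight-line block of
    (dest, op, arg1, arg2) tuples."""
    copies, out = {}, []            # copies: x -> y after a statement x = y
    for dest, op, a1, a2 in block:
        a1 = copies.get(a1, a1)     # rewrite operands through known copies
        a2 = copies.get(a2, a2)
        # this assignment kills any copy mentioning dest on either side
        copies = {x: y for x, y in copies.items() if dest not in (x, y)}
        if op == "copy":
            copies[dest] = a1
        out.append((dest, op, a1, a2))
    return out

block = [("b", "copy", "a", None), ("c", "+", "b", "b")]
print(copy_propagate(block))   # c's operands become a, a
```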

 May not appear to be an improvement by itself, but it opens up scope for other optimizations

Loop Optimization
 Decrease the number of instructions in the inner loop
 Even if we increase the number of instructions in the outer loop
 Techniques:
 Code motion
 Induction variable elimination
 Strength reduction

Peephole Optimization
 A pass over the generated code that examines a few instructions at a time, typically 2 to 4
 Redundant instruction elimination: use algebraic identities
 Flow-of-control optimization: removal of redundant jumps
 Use of machine idioms

Redundant instruction elimination
 Redundant load/store: see if an obvious replacement is possible
MOV R0, a
MOV a, R0
The second instruction can be eliminated without needing any global knowledge of a
 Unreachable code: identify code which will never be executed:
Before:
#define DEBUG 0
if (DEBUG) {
    print debugging info
}

After macro substitution:
if (0 != 1) goto L2
    print debugging info
L2:

Since 0 != 1 is always true, the jump is always taken, so the print is unreachable and can be eliminated.
Algebraic identities
 Worth recognizing single instructions with a constant operand:
 a := a + 0;
 a := a * 1;
 a := a/1;
 a := a - 0;
 Strength reduction:
A ^ 2 = A * A
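A minimal peephole pass over single instructions, assuming a hypothetical (dest, op, arg1, arg2) tuple format (not from the slides), applying the identities above plus the A ^ 2 = A * A reduction:

```python
def peephole(instrs):
    """Rewrite single instructions using algebraic identities."""
    out = []
    for dest, op, a1, a2 in instrs:
        if op in ("+", "-") and a2 == 0:
            if a1 != dest:                     # a := b + 0  ->  a := b
                out.append((dest, "copy", a1, None))
            # a := a + 0 disappears entirely
        elif op in ("*", "/") and a2 == 1:
            if a1 != dest:                     # a := b * 1  ->  a := b
                out.append((dest, "copy", a1, None))
        elif op == "**" and a2 == 2:           # a := b ^ 2  ->  a := b * b
            out.append((dest, "*", a1, a1))
        else:
            out.append((dest, op, a1, a2))
    return out

print(peephole([("a", "+", "a", 0), ("b", "*", "c", 1), ("x", "**", "y", 2)]))
```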

Usage of Machine idioms
 Use machine-specific hardware instructions, which may be less costly.

i := i + 1

ADD i, #1   →   INC i

Thank you all

Good Luck

Loop Optimization

Loop Optimizations
 Most important set of optimizations
 Programs are likely to spend more time in
loops
 Presumption: Loop has been identified
 Optimizations:
 Loop invariant code removal
 Induction variable strength reduction
 Induction variable reduction

Loops in Flow Graph
 Dominators:
A node d of a flow graph G dominates a node n, if every
path in G from the initial node to n goes through d.

Represented as: d dom n

Corollaries:
Every node dominates itself.
The initial node dominates all nodes in G.
The entry node of a loop dominates all nodes in the loop.

Loops in Flow Graph
 Each node n has a unique immediate dominator m, which is the last dominator of n on any path in G from the initial node to n:
(d ≠ n) && (d dom n) → d dom m
 Dominator tree (T):
A representation of the dominator information of flow graph G.
 The root node of T is the initial node of G
 A node d in T dominates all nodes in its sub-tree
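Dominator sets can be computed with the standard iterative data-flow algorithm, starting from "every node dominates n" and shrinking to a fixpoint. Below is a sketch; the successor-dict representation (every node appears as a key) is my choice, not from the slides:

```python
def dominators(succ, entry):
    """dom(entry) = {entry};  dom(n) = {n} ∪ ⋂ dom(p) over predecessors p,
    iterated until nothing changes."""
    nodes = set(succ)
    pred = {n: set() for n in nodes}
    for n, ss in succ.items():
        for s in ss:
            pred[s].add(n)
    dom = {n: set(nodes) for n in nodes}    # initial over-approximation
    dom[entry] = {entry}
    changed = True
    while changed:
        changed = False
        for n in nodes - {entry}:
            if pred[n]:
                new = {n} | set.intersection(*(dom[p] for p in pred[n]))
            else:
                new = {n}                   # unreachable node: only itself
            if new != dom[n]:
                dom[n], changed = new, True
    return dom

# diamond-shaped flow graph: 1 -> {2, 3} -> 4
print(dominators({1: [2, 3], 2: [4], 3: [4], 4: []}, 1))
```

Node 4 is dominated only by 1 and itself, since neither 2 nor 3 lies on every path from the entry to 4.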

Example: Loops in Flow Graph
[Figure: a flow graph over nodes 1–9, shown alongside its corresponding dominator tree, rooted at node 1]
Loops in Flow Graph
 Natural loops:
1. A loop has a single entry point, called the "header". The header dominates all nodes in the loop
2. There is at least one path back to the header from the loop nodes (i.e., there is at least one way to iterate the loop)

 Natural loops can be detected by back edges.


 Back edges: edges where the sink node (head) dominates
the source node (tail) in G

Loop Invariant Code Removal
 Move statements whose source operands do not change within the loop out to a pre-header
 Be careful with memory operations
 Be careful with statements that are executed in only some of the iterations

Loop Invariant Code Removal
 Rules: a statement S: x := y op z is loop invariant if:
 y and z are not modified in the loop body
 S is the only statement in the loop that modifies x
 For all uses of x, x is in the available-definitions set
 For every exit edge from the loop, S is in the available-definitions set of the edge
 If S is a load or store (memory op), there are no writes to address(x) in the loop
