Week 5 Lecture Algorithm Analysis
&
Asymptotic Notations
How is it determined?
1. Analytically
– performance analysis
2. Experimentally
– performance measurement
Criteria for Measurement
Space
– amount of memory program occupies
– usually measured in bytes, KB or MB
Time
– execution time
– usually measured by counting the number of operations executed
Complexity
Complexity is the amount of resources an algorithm
requires, most commonly time (how many steps it
needs) and space (how much memory it takes).
Running Time
Running time — the actual time spent in execution
of an algorithm.
It depends on a number of factors:
– Input data
– The hardware and software environment.
Analysis of Algorithms
There are often many different algorithms which can be
used to solve the same problem. Thus, it makes sense to
develop techniques that allow us to:
– compare different algorithms with respect to their “efficiency”
– choose the most efficient algorithm for the problem
The efficiency of any algorithmic solution to a problem is
measured by:
– Time efficiency: the time it takes to execute.
– Space efficiency: the space (primary or secondary memory) it uses.
We will focus on an algorithm’s efficiency with respect to
time.
Components of Program Space
Data space
– very much dependent on the computer architecture and
compiler
– The magnitude of the data that a program works with is
another factor
char   1    float        4
short  2    double       8
int    2    long double 10
long   4    pointer      2
Unit: bytes
Components of Program Space
Data space
– Choosing a “smaller” data type has an effect on the
overall space usage of the program.
– Choosing the correct type is especially important when
working with arrays.
Best case
t(n) = 2 + 1 + n + (2 + 2)·(n − 1) + 1 = 5n (at least 5n operations)
Worst case
t(n) = 2 + 1 + n + (2 + 2 + 2)·(n − 1) + 1 = 7n − 2 (at most 7n − 2 operations)
Best, Average and Worst
Average-Case Analysis — expresses the running time of an algorithm
as an average taken over all possible inputs.
Best-Case Analysis — the shortest running time of an algorithm.
Worst-Case Analysis — the longest running time of an algorithm.
for(int i = 1; i < n; i = i * 2)
    sum = sum + i + helper(i);
The for loop executes log₂n times.
Therefore the number of basic operations is:
1 + 1 + log₂n + log₂n·[2 + 4 + 1 + 1 + (n + 1) + n·(2 + 2) + 1] + 1
= 2 + log₂n + log₂n·(10 + 5n) + 1
= 5n·log₂n + 11·log₂n + 3
Operation Count
Justification:
We choose c = 31, n₀ = 1 and then we have
20n³ + 5n² + 6 ≤ 31n³ when n ≥ 1
Thus 20n³ + 5n² + 6 is O(n³)
Common Growth Rate Functions
1 (constant): growth is independent of the problem
size n.
log₂N (logarithmic): growth increases slowly com-
pared to the problem size (binary search)
N (linear): directly proportional to the size of the
problem.
N·log₂N (n log n): typical of some divide and
conquer approaches (merge sort)
N² (quadratic): typical in nested loops
N³ (cubic): more nested loops
2ᴺ (exponential): growth is extremely rapid and
possibly impractical.
Practical Complexities
log n   n    n log n      n²       n³           2ⁿ
  0     1        0         1        1            2
  1     2        2         4        8            4
  2     4        8        16       64           16
  3     8       24        64      512          256
  4    16       64       256     4096        65536
  5    32      160      1024    32768   4294967296
Overly complex programs may not be practical given the
computing power of the system.
READ Chapter 2 of the textbook
Proving Big-O Complexity
To prove that f(n) is O(g(n)) we find any pair of values n₀ and c that
satisfy:
f(n) ≤ c·g(n) for n ≥ n₀
Note: The pair (n0, c) is not unique. If such a pair exists then there is an
infinite number of such pairs.
n₀   4        3      2      1
c    3.3125   3.55   4.25   8