Lecture#04, DAA, Asymptotic Notations - Growth Rate
(Chart: running time T(n) plotted against input size n.)
n    :  2   4   7  11  16   29   60
T(n) :  1  10  21  48  82  178  290
Algorithm Efficiency
The fewer resources an algorithm utilizes, the more efficient it is
Time and space often pull in opposite directions
The priority given to time and/or space can dictate the choice of algorithm
In today's world, time is usually given higher priority than space
Measuring Algorithm Efficiency
Two different ways:
1. Measure & Compare Execution Time
2. Measure & Compare Running Time (Asymptotic Analysis)
Note: Running time mostly depends upon input size (n) and is denoted by T(n)
Measuring Algorithm Efficiency… Measuring Execution Time
Dependent upon hardware configuration
Dependent upon I/O operations
Dependent upon the memory available in the system
Dependent upon other applications installed on the system
May differ on the same machine at different times
Does not really help to predict the effect on execution time when the input size is significantly increased / decreased
May differ greatly on parallel infrastructure
Involves function-call overhead
Measuring Algorithm Efficiency… Measuring Execution Time
Multiple threads trying to access a common resource can increase execution time significantly
If the program is running on a server with multiple disks, some particular RAID configuration might work best for it
Choice of language can increase / decrease the execution time, e.g. C programs generally run faster than equivalent Java programs
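The variability described above can be seen directly by timing a piece of code. The following is a minimal sketch, assuming a Python environment; `sum_of_squares` and `measure` are illustrative names, not anything from the lecture:

```python
import time

def sum_of_squares(n):
    # The workload being timed: O(n) additions and multiplications.
    total = 0
    for i in range(n):
        total += i * i
    return total

def measure(func, n, repeats=5):
    # Wall-clock timing. The results vary with hardware, system load,
    # and caching, which is exactly why raw execution time is an
    # unreliable measure of algorithm efficiency.
    times = []
    for _ in range(repeats):
        start = time.perf_counter()
        func(n)
        times.append(time.perf_counter() - start)
    return min(times)  # best of several runs reduces noise a little

t_small = measure(sum_of_squares, 1_000)
t_large = measure(sum_of_squares, 100_000)
```

Running this twice on the same machine typically prints different numbers, illustrating the "may differ at different times" point.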
Measuring Algorithm Efficiency … Measuring Running Time
Independent of hardware configuration
Independent of I/O operations
Independent of the memory in the system
Independent of other applications installed on the system
Always yields the same result
Can predict the effect on execution time when the input size is significantly increased / decreased
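One way to make this concrete is to count logical units of work rather than seconds. Below is a sketch, assuming we charge one unit per comparison; `linear_search_count` is an illustrative name:

```python
def linear_search_count(a, target):
    # Returns (index or -1, number of comparisons performed).
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

# Worst case (target absent): T(n) = n comparisons, regardless of
# hardware, I/O, or what else the machine is doing.
_, c1 = linear_search_count(list(range(10)), -1)   # n = 10 -> 10 units
_, c2 = linear_search_count(list(range(20)), -1)   # n = 20 -> 20 units
```

The counts depend only on n, so doubling the input size predictably doubles T(n).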
Time & Space Complexity
Time Complexity
Time required to execute an algorithm
Space Complexity
Total memory taken by an algorithm during its execution
Time & Space Tradeoffs
Time & Space Tradeoffs … Example
Application data may be stored in arrays, linked lists, trees, graphs, etc.
For banking / financial transactions, time may be compromised (a bit), but accuracy is a must
For audio/video streaming problems, accuracy may be compromised in favour of solutions with low execution time
We want Google to respond to our queries promptly, even if a few irrelevant links slip in
Execution Friendly Development … Tips
Move statements that do not belong inside a loop out of the loop body
Reduce I/O operations as much as possible
Code in a way that produces efficient compiled code
Choose the best suited algorithm (before implementation)
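The first tip can be sketched as follows, assuming Python; `scale_slow` and `scale_fast` are made-up names for illustration:

```python
import math

def scale_slow(values, x):
    out = []
    for v in values:
        factor = math.sqrt(x) / 2   # recomputed every iteration, needlessly
        out.append(v * factor)
    return out

def scale_fast(values, x):
    factor = math.sqrt(x) / 2       # loop-invariant: hoisted out of the loop
    out = []
    for v in values:
        out.append(v * factor)
    return out
```

Both functions return the same result, but the second computes the square root once instead of n times.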
Execution Time Function
Algorithm statements are assumed to execute in equal logical units of time
Counting these logical units captures the relationship between input size (n) and running time; the resulting function is T(n)
The execution time function only shows significant differences for large n
For small input sizes, running-time differences do not matter
Running Time Functions …. Example
Say a problem has input size n and there are two algorithms a1 and a2 with running times:
For a1: T(n) = 5000n
For a2: T(n) = n^2 + 2
For small n, a2 is cheaper; but beyond roughly n = 5000 the constant no longer saves it, and a1 wins
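The crossover point between the two running-time functions from the example can be located numerically, as a quick sketch:

```python
def T1(n):          # algorithm a1
    return 5000 * n

def T2(n):          # algorithm a2
    return n ** 2 + 2

# Find the first n where the "slower-growing" a1 becomes at least as
# cheap as a2: solving 5000n <= n^2 + 2 gives n = 5000.
crossover = next(n for n in range(1, 10_000) if T1(n) <= T2(n))
```

For n = 10, a2 costs only 102 units against a1's 50,000; for n = 1,000,000 the quadratic term dominates and a1 is far cheaper.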
Asymptotic Analysis … Common Growth Rates (in increasing order)
Constant
log n
n
n log n
n^2
n^3
2^n
n^n
Asymptotic Analysis … Constant Growth O(1)
Constant Growth
The runtime does not depend on n
Examples:
1. Accessing an element of an array
2. Accessing the maximum value from a max-heap
3. Accessing the head node of a linked list
4. Accessing the root node of a tree
5. Hashing
Asymptotic Analysis … Logarithmic Growth O(log n)
Logarithmic Growth
The runtime growth is proportional to the base 2 logarithm (log) of n
Examples:
1. Binary Search
2. Max/Min value from a balanced binary search tree
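Binary search, the first example above, can be sketched with a step counter to make the O(log n) behaviour visible; the counter is my addition for illustration:

```python
def binary_search(a, target):
    # a must be sorted; returns (index or -1, halving steps used).
    lo, hi, steps = 0, len(a) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid, steps
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

# For n = 1024 the search range halves on every step, so at most
# floor(log2 n) + 1 = 11 steps are ever needed.
_, steps = binary_search(list(range(1024)), -1)
```

Doubling n adds only one more step, which is the essence of logarithmic growth.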
Asymptotic Analysis … Linear Growth O(n)
Linear Growth
Runtime grows proportional to the value of n
Examples:
1. Linear Search
2. Max/Min value from an array
3. Sum of values in an array
4. Linked list traversal
Asymptotic Analysis … O(n log n)
(n log n) Growth
No sorting algorithm that compares elements can do better than O(n log n); divide-and-conquer sorts such as merge sort achieve this bound
Examples:
1. Merge Sort
2. Quick Sort
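Merge sort, the first example, can be sketched as follows; this is a minimal teaching version, not an optimized implementation:

```python
def merge_sort(a):
    # Divide: split the array in half, sort each half recursively.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Conquer: merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

There are log n levels of splitting, and each level does O(n) merging work, giving O(n log n) overall.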
Asymptotic Analysis … O(n^2)
(n^2) Growth
Running Time grows rapidly
Slow sorting algorithms
Examples:
1. Bubble Sort
2. Insertion Sort
3. Selection Sort
4. Quick Sort (Worst Case)
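Bubble sort, the first of the slow sorts listed above, can be sketched with a comparison counter to show the quadratic growth; the counter is my addition:

```python
def bubble_sort(a):
    a = list(a)                 # work on a copy
    comparisons = 0
    for i in range(len(a) - 1):
        # Each pass bubbles the largest remaining element to the end.
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

# n elements always cost n(n-1)/2 comparisons here: quadratic growth.
sorted_a, comps = bubble_sort([4, 1, 3, 2])
```

Doubling n roughly quadruples the number of comparisons, which is why these sorts become unusable for large inputs.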
Asymptotic Analysis … Polynomial Growth O(n^k)
(n^k) Growth
Running Time grows rapidly
Suitable for small n
Examples:
1. Matrix multiplication
2. Maximum matching for bipartite graph
3. Multiplying n-digit numbers by a simple algorithm
Asymptotic Analysis … Exponential Growth O(2^n)
(2^n) Growth
Running Time grows extremely rapidly
Suitable only for very small n
Examples:
1. Exact solution for the travelling salesman problem
2. Brute force search problems
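The brute-force travelling salesman solution mentioned above can be sketched as follows. The 4-city distance matrix is made up for illustration; the point is that the number of tours checked is (n-1)!, which grows even faster than 2^n:

```python
from itertools import permutations

def tsp_brute_force(dist):
    # Try every tour starting and ending at city 0; keep the cheapest.
    n = len(dist)
    best, tours_checked = None, 0
    for perm in permutations(range(1, n)):  # fix city 0 as the start
        tours_checked += 1
        tour = (0,) + perm + (0,)
        cost = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        if best is None or cost < best:
            best = cost
    return best, tours_checked

dist = [[0, 1, 4, 6],
        [1, 0, 2, 5],
        [4, 2, 0, 3],
        [6, 5, 3, 0]]
best, checked = tsp_brute_force(dist)   # 4 cities -> 3! = 6 tours
```

At n = 4 this is 6 tours; at n = 20 it would already be 19! (over 10^17) tours, which is why brute force is suitable only for very small n.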
Asymptotic Analysis … Graph for n (1-15)
(Chart: log(n) vs n for n = 1 to 15; log n stays nearly flat while n grows steadily.)
Asymptotic Analysis … Graph for n (1-15)
(Chart: n vs n log(n) for n = 1 to 15; n log n pulls ahead, reaching about 60 at n = 15.)
Asymptotic Analysis … Graph for n (1-15)
(Chart: n log(n) vs n^2 vs n^3 for n = 1 to 15; n^3 dwarfs the others, reaching 3375 at n = 15.)
Asymptotic Analysis … Graph for n (1-15)
(Chart: n^3 vs 2^n for n = 1 to 15; 2^n overtakes n^3 near n = 10 and reaches 32768 at n = 15.)
Asymptotic Analysis … Graph for n (1-100)
(Chart: the growth curves compared over a larger range of n.)
Asymptotic Notations ………. Example
function find-min-plus-max(array a[1..n])
    let j := a[1]          // First, find the smallest element in the array
    for i := 1 to n:
        j := min(j, a[i])
    repeat
    let minim := j         // Now, find the biggest element
    j := a[1]
    for i := 1 to n:
        j := max(j, a[i])
    repeat
    let maxim := j
    return minim + maxim   // return the sum of the two
end
O(n)?
Asymptotic Notations ………. Example
function max-diff(array a[1..n])   // Max difference problem
    m := 0
    for i := 1 to n-1
        for j := i + 1 to n
            if |a[i] - a[j]| > m then
                m := |a[i] - a[j]|
    return m
end proc
O(n)?
Asymptotic Notations ………. Example
function max-diff(array a[1..n])   // Max difference problem (another algorithm)
    min := a[1]
    max := a[1]
    for i := 2 to n
        if a[i] < min then
            min := a[i]
        else if a[i] > max then
            max := a[i]
    m := max - min
    return m
end proc
O(n)?
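The two max-diff algorithms above can be translated into Python (a sketch; the function names are mine) to confirm that they compute the same answer despite their very different growth rates:

```python
def max_diff_quadratic(a):
    # All pairs: roughly n^2 / 2 comparisons, i.e. O(n^2).
    m = 0
    for i in range(len(a) - 1):
        for j in range(i + 1, len(a)):
            m = max(m, abs(a[i] - a[j]))
    return m

def max_diff_linear(a):
    # One pass tracking min and max: O(n). The largest absolute
    # pairwise difference is always max - min.
    lo = hi = a[0]
    for x in a[1:]:
        if x < lo:
            lo = x
        elif x > hi:
            hi = x
    return hi - lo
```

Both return the same value for any input, so the O(n) version is strictly preferable.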
Big Oh is not the complete story
Given two algorithms A and B with the same asymptotic performance, why select one over the other? They're both the same, right?
They may not be the same. There is the small matter of the constant of proportionality.
Suppose that A does ten operations for each data item, but algorithm B only does three.
It is reasonable to expect B to be faster than A even though both have the same asymptotic performance, because asymptotic analysis ignores constants of proportionality.
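The point can be sketched numerically; the operation counts 10n and 3n are the hypothetical values from the text above:

```python
def cost_A(n):
    return 10 * n   # algorithm A: ten operations per data item

def cost_B(n):
    return 3 * n    # algorithm B: three operations per data item

# Both are O(n), but B does less than a third of the work at every n,
# and that constant factor never shows up in the asymptotic notation.
ratio = cost_A(1_000) / cost_B(1_000)
```

For real workloads a constant factor of 3x can easily decide which algorithm ships, even though Big Oh treats them as identical.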