Lecture 18 Order of Growth
Efficiency of Algorithm
• Suppose X is an algorithm and n is the size of its input data.
• The time and space used by algorithm X are the two main factors that decide the efficiency of X.
Every-Case Time Complexity
• For a given algorithm, T(n) is the every-case time complexity if the algorithm repeats its basic operation the same number of times for every input of size n. Determining T(n) is called every-case time complexity analysis.
Every-Case Time Complexity (Examples)
• Exchange Sort
Algorithm exchange_sort(A, n)
    for i = 1 to n-1
        for j = i+1 to n
            if A[i] > A[j]
                exchange A[i] and A[j]
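Exchange sort is a natural every-case example: the comparison A[i] > A[j] runs the same number of times for every input of size n, namely n(n-1)/2. A minimal Python sketch (the comparison counter is added for illustration) makes this concrete:

```python
def exchange_sort(a):
    """Exchange sort; returns (sorted list, number of comparisons made)."""
    a = list(a)
    n = len(a)
    comparisons = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            comparisons += 1          # basic operation: one comparison
            if a[i] > a[j]:
                a[i], a[j] = a[j], a[i]
    return a, comparisons
```

Whatever the input order, the counter always comes out to n(n-1)/2, so T(n) = n(n-1)/2.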
Best Case Time Complexity
• For a given algorithm, B(n) is the best-case time complexity if the algorithm repeats its basic operation the minimum number of times over all inputs of size n. Determining B(n) is called best-case time complexity analysis.
Best Case Time Complexity (Example)
Algorithm sequential_search(A, n, key)
    i = 0
    while i < n && A[i] != key
        i = i + 1
    if i < n
        return i
    else
        return -1
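A runnable sketch of sequential search (the comparison counter is an addition for analysis, not part of the algorithm itself):

```python
def sequential_search(a, key):
    """Return (index of key, comparisons made); index is -1 if key is absent."""
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1              # basic operation: compare key with a[i]
        if x == key:
            return i, comparisons
    return -1, comparisons
```

Best case: the key sits at the first position, so only one comparison is needed and B(n) = 1.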
Worst Case Time Complexity
• For a given algorithm, W(n) is the worst-case time complexity if the algorithm repeats its basic operation the maximum number of times over all inputs of size n. Determining W(n) is called worst-case time complexity analysis.
Sequential Search
• Input size: the number of elements in the array, i.e., n
• Basic operation: comparison of the key with array elements
• Worst case: the last element is the required key, or the key is not present in the array at all
• Complexity: W(n) = n
Average Case Time Complexity
• For a given algorithm, A(n) is the average-case time complexity if the algorithm repeats its basic operation the average number of times over all inputs of size n. Determining A(n) is called average-case time complexity analysis.
Sequential Search
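For the average case of sequential search, assume the key is present and equally likely to sit at any of the n positions. Finding it at position k (1-based) costs k comparisons, so A(n) = (1/n) · (1 + 2 + … + n) = (n + 1)/2. A quick sketch under that assumption:

```python
def sequential_search_cost(a, key):
    """Number of key comparisons sequential search makes on this input."""
    for count, x in enumerate(a, start=1):
        if x == key:
            return count
    return len(a)

# Assuming the key is present and equally likely at each of the n positions,
# averaging the cost over all positions gives A(n) = (n + 1) / 2.
n = 100
a = list(range(n))
average = sum(sequential_search_cost(a, key) for key in a) / n
```

For n = 100 the average comes out to 50.5, matching (n + 1)/2.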
What is Order of Growth?
When we want to compare two algorithm with
respect to their behavior for large input size, a
useful measure is so-called Order of growth.
The order of growth can be estimated by
taking into account the dominant term of the
running time expression.
19
Dominant Term
In the running-time expression, when n becomes large one term becomes significantly larger than the others: this is the so-called dominant term.
T1(n) = an + b    Dominant term: an
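A quick numeric check (the constants a and b here are arbitrary illustrative choices): as n grows, the lower-order term b contributes a vanishing fraction of T1(n) = an + b, so the ratio T1(n) / an approaches 1.

```python
# T1(n) = a*n + b; the ratio T1(n) / (a*n) approaches 1 as n grows,
# which is why the dominant term a*n alone determines the order of growth.
a, b = 3, 100          # arbitrary illustrative constants
for n in (10, 1000, 100000):
    t = a * n + b
    print(n, t / (a * n))
```

At n = 10 the ratio is still far from 1, but by n = 100000 it is within 0.1%.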
[Figure: function value vs. data size for lg n, n lg n, n^2, n^3, and 2^n — the faster-growing functions dominate as the data size increases]
Growth Rates
n      lg n   n lg n   n^2       n^3          2^n
1      0      0        1         1            2
2      1      2        4         8            4
4      2      8        16        64           16
8      3      24       64        512          256
16     4      64       256       4096         65536
32     5      160      1024      32768        4294967296
64     6      384      4096      262144       ~1.84e19
128    7      896      16384     2097152      ~3.40e38
256    8      2048     65536     16777216     ~1.16e77
512    9      4608     262144    134217728    ~1.34e154
1024   10     10240    1048576   1073741824
2048   11     22528    4194304   8589934592
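Any row of the table above can be recomputed directly; here is a sketch for the n = 16 row:

```python
import math

# Recompute one row of the growth-rate table for n = 16:
# (n, lg n, n lg n, n^2, n^3, 2^n)
n = 16
row = (n, math.log2(n), n * math.log2(n), n**2, n**3, 2**n)
```

The result matches the tabulated values (16, 4, 64, 256, 4096, 65536).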
How can the order of growth be interpreted?
Asymptotic Notations
• Asymptotic analysis refers to computing the running time of an operation in mathematical units of computation.
• For example, the running time of one operation may be f(n) = n and that of another g(n) = n^2.
• This means the first operation's running time grows linearly with n, while the second grows quadratically as n increases.
Big Oh Notation, Ο
f(n) = O(g(n)) if there exist positive constants c and n0 such that
    f(n) <= cg(n) for all n >= n0
g(n) is an asymptotic upper bound on f(n).
[Figure: f(n) stays at or below cg(n) for all n >= n0]
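A small numeric sanity check of the definition (the constants c and n0 are hand-picked witnesses, not derived): f(n) = 3n + 10 is O(n), since 3n + 10 <= 4n whenever n >= 10.

```python
# Claim: f(n) = 3n + 10 is O(n), witnessed by c = 4 and n0 = 10,
# because 3n + 10 <= 4n holds exactly when n >= 10.
def f(n):
    return 3 * n + 10

def g(n):
    return n

c, n0 = 4, 10
holds = all(f(n) <= c * g(n) for n in range(n0, 1000))
```

Note that the bound fails just below n0 (f(9) = 37 > 36 = 4·9), which is why the definition only demands the inequality for n >= n0.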
Omega Notation, Ω
Asymptotic Lower Bound: f(n) = Ω(g(n))
if there exist positive constants c and n0 such that
    f(n) >= cg(n) for all n >= n0
[Figure: f(n) stays at or above cg(n) for all n >= n0]
Theta Notation, θ
Asymptotically Tight Bound: f(n) = Θ(g(n))
iff there exist positive constants c1, c2 and n0 such that
    c1 g(n) <= f(n) <= c2 g(n) for all n >= n0
[Figure: f(n) lies between c1 g(n) and c2 g(n) for all n >= n0]
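A concrete check of the Θ sandwich with hand-picked constants: f(n) = 2n^2 + 3n is Θ(n^2), witnessed by c1 = 2, c2 = 3, n0 = 3.

```python
# Claim: f(n) = 2n^2 + 3n is Θ(n^2), witnessed by c1 = 2, c2 = 3, n0 = 3:
#   2n^2 <= 2n^2 + 3n always, and 2n^2 + 3n <= 3n^2 exactly when n >= 3.
def f(n):
    return 2 * n**2 + 3 * n

c1, c2, n0 = 2, 3, 3
sandwich = all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 1000))
```

As with big-O, the upper bound fails below n0 (f(2) = 14 > 12 = 3·2^2), so the threshold n0 is essential.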
Big-Oh and Growth Rate
• The big-Oh notation gives an upper bound on the
growth rate of a function
• The statement “f(n) is O(g(n))” means that the growth
rate of f(n) is no more than the growth rate of g(n)
• We can use the big-Oh notation to rank functions
according to their growth rate
Big-Omega
f(n) is Ω(g(n)) if f(n) is asymptotically greater than or
equal to g(n)
Big-Theta
f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
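Ranking functions by growth rate can be sketched numerically (the function list here is illustrative): evaluating each at one large input size already reveals the familiar ordering.

```python
import math

# Rank common complexity functions by their value at n = 64;
# the familiar ordering lg n < n < n lg n < n^2 < 2^n emerges.
functions = {
    "lg n": lambda n: math.log2(n),
    "n": lambda n: n,
    "n lg n": lambda n: n * math.log2(n),
    "n^2": lambda n: n**2,
    "2^n": lambda n: 2**n,
}
n = 64
ranked = sorted(functions, key=lambda name: functions[name](n))
```

Comparing values at a single point is only suggestive, of course; the asymptotic definitions above are what make the ranking rigorous.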
Common Asymptotic Notations
constant − Ο(1)
logarithmic − Ο(log n)
linear − Ο(n)