Algorithm - Lecture 3: Asymptotic Analysis
Analytical Approach: Effect of Time and Space

Software qualities: usability, efficiency, reliability, portability.

Space Complexity: a fixed part (determined at compile time) and a variable part that depends on I/O size.

Time Complexity: the number of steps as a function of the input size (n); raw running time is machine dependent.
Analytical Approach: Effect of Data Size
Growth rates of the time complexities of algorithms with respect to increasing problem sizes, on a machine running at a speed of 1 GHz (10⁹ instructions per second):

Time (sec) = number of instructions f(n) / number of instructions per second S

Time = f(n) / S
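The Time = f(n) / S relation above can be sketched directly. A minimal sketch, assuming a 1 GHz machine (S = 10⁹ instructions per second) and that one step equals one instruction; the function name `running_time_sec` is illustrative, not from the slides:

```python
import math

def running_time_sec(f_of_n, speed=10**9):
    """Time (sec) = number of instructions f(n) / instructions per second S."""
    return f_of_n / speed

# Compare growth rates for increasing problem sizes n.
for n in (10, 100, 1000):
    print(n,
          running_time_sec(n),                 # linear:    f(n) = n
          running_time_sec(n * math.log2(n)),  # f(n) = n log n
          running_time_sec(n ** 2))            # quadratic: f(n) = n^2
```

Even at 10⁹ instructions per second, the quadratic column grows far faster than the others as n increases.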
Analytical Approach: Effect of Data Size

[Table: growth rate of the complexity time function for f(n) = n, n², 100n, log₁₀ n at sizes up to 1000]
Analytical Approach

BIG IDEA:
Focus on the number of steps.
Ignore machine-dependent constants.
Look at the growth of the number of steps as n → ∞ (n very large).

"Asymptotic Analysis"
Analytical Approach: Using Pseudo Code

Problem → Algorithm → express the number of steps as a function of the input size: F(n).

F(n) is a complexity indicator and a performance indicator; it grows with the data size (n).

Focus on large n, and consider upper and lower bounds ("Asymptotic Analysis").

Examples:
F(n) = 2n + 3
F(n) = n² + n
F(n) = n log n
F(n) = n³ + n log n + 10
Example: an algorithm whose number of steps is f(n) = n² + 8.
Upper bound g(n) = n², so f(n) is O(n²) (Big-Oh, worst case).
The bound is tight, so f(n) is Θ(n²) (Big-Theta, average case).
Asymptotic Notation (Worst, Best, Average)

Big-Oh (worst case): f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).

Big-Omega (best case): f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).

Big-Theta (average case): f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
No. of Steps and Upper Bound (Big-Oh):
Algorithm 1: f(n) = 2n + 3, g(n) = 3n → O(n)
Algorithm 2: f(n) = 4n + 8, g(n) = 5n → O(n)
Algorithm 3: f(n) = 2n³ → O(n³)
Algorithm 4: → O(log n)
Two ways to express the number of steps:

Iterative. Ex: a loop: f(n) = n.

Recurrence. Ex: Merge Sort: f(n) = n + 2f(n/2).
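The Merge Sort recurrence f(n) = n + 2f(n/2) can be unrolled numerically. A minimal sketch, assuming the base case f(1) = 0 (the slides do not state one) and n a power of two; with that base case the closed form is f(n) = n·log₂(n):

```python
import math

def f(n):
    """Unroll the merge sort recurrence f(n) = n + 2*f(n//2), with the
    assumed base case f(1) = 0, for n a power of two."""
    if n <= 1:
        return 0
    return n + 2 * f(n // 2)

# For n a power of two, the recurrence matches the closed form n * log2(n).
for n in (2, 4, 8, 1024):
    assert f(n) == n * math.log2(n)
```

This is why Merge Sort's step count is Θ(n log n) rather than Θ(n²).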
Calculating Running Time f(n)

for z = 1 to n
    for y = 1 to z
        for x = 1 to y
            constant-time op

Number of constant-time operations:
Σ(z=1..n) Σ(y=1..z) Σ(x=1..y) 1 = (n³ + 3n² + 2n) / 6

so f(n) grows like n³/6, i.e., f(n) is Θ(n³).
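The triple-loop step count above can be checked by brute force. A minimal sketch; the function name `triple_loop_steps` is illustrative:

```python
def triple_loop_steps(n):
    """Count constant-time ops in: for z = 1..n: for y = 1..z: for x = 1..y."""
    count = 0
    for z in range(1, n + 1):
        for y in range(1, z + 1):
            for x in range(1, y + 1):
                count += 1
    return count

# The count matches the closed form (n^3 + 3n^2 + 2n) / 6, which is Theta(n^3).
for n in (1, 5, 20):
    assert triple_loop_steps(n) == (n**3 + 3*n**2 + 2*n) // 6
```

Integer division is exact here because n(n+1)(n+2), a product of three consecutive integers, is always divisible by 6.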
i = N                      (example: N = 8)
while i >= 1
    { print i ; i = i / 2 }

Trace for N = 8:
i = 8: print 8, i = 8 / 2 = 4 ........ iteration 1
i = 4: print 4, i = 4 / 2 = 2 ........ iteration 2
i = 2: print 2, i = 2 / 2 = 1 ........ iteration 3
i = 1: print 1, i = 1 / 2 = 0 ........ iteration 4 (loop exits)

How many times can we divide N by 2 until we get below 1? If we approached this in reverse, we could say: how many times can we multiply 1 by 2 until we get to N? That is the value x where 2^x = N. This while loop therefore iterates about x = log₂ N times.
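The halving loop can be traced in code to confirm the log₂ N behaviour. A minimal sketch, assuming integer division as in the trace above; with that assumption the exact iteration count is ⌊log₂ N⌋ + 1:

```python
import math

def halving_iterations(N):
    """Count iterations of: i = N; while i >= 1: print-step; i = i // 2."""
    i, count = N, 0
    while i >= 1:
        count += 1
        i //= 2
    return count

# With integer division the loop runs floor(log2(N)) + 1 times, i.e., O(log N).
for N in (1, 8, 1000):
    assert halving_iterations(N) == math.floor(math.log2(N)) + 1
```

For N = 8 this gives 4 iterations, matching the trace (the i = 1 pass also executes before the loop exits).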
Program Segments

for x = 0 to n-1                       → n steps
    constant-time op
for y = 0 to n-1
    for z = 0 to n-1                   → n² steps
        constant-time op
for w = 0 to n-1                       → n steps
    constant-time op
i = N
while i >= 1                           → log n steps
    { print i ; i = i / 2 }

f(n) = n + n² + n + log n = n² + 2n + log n, so g(n) = n²
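The combined step count of the four segments can be verified by counting operations directly. A minimal sketch, assuming N = n in the halving loop and counting its iterations exactly as ⌊log₂ n⌋ + 1:

```python
import math

def segment_steps(n):
    """Count constant-time ops in the four program segments."""
    count = 0
    for x in range(n):           # n ops
        count += 1
    for y in range(n):           # n^2 ops
        for z in range(n):
            count += 1
    for w in range(n):           # n ops
        count += 1
    i = n                        # halving loop: floor(log2 n) + 1 ops
    while i >= 1:
        count += 1
        i //= 2
    return count

# f(n) = n^2 + 2n + (floor(log2 n) + 1); the n^2 term dominates, so O(n^2).
n = 100
assert segment_steps(n) == n**2 + 2*n + math.floor(math.log2(n)) + 1
```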
No. of Steps and Upper Bound (for large n):
Algorithm 1: f(n) = 2n + 3, g(n) = 3n → O(n)
Algorithm 2: f(n) = 4n + 8, g(n) = 5n → O(n)
Complexity Analysis

The "Big-Oh" Notation
• Big-Oh notation indicates an upper bound.
• It says how bad things can get; perhaps things are not nearly that bad in practice.
• It should be the lowest possible upper bound.
The "Big-Oh" Notation

E.g.: 3x³ + 5x² - 9 = O(x³)

This doesn't mean "3x³ + 5x² - 9 equals the function O(x³)".
What it actually means is "3x³ + 5x² - 9 is dominated by x³".
Read as: "3x³ + 5x² - 9 is big-Oh of x³".
"Asymptotic Analysis": from Running Time f(n) to O-notation

• Use the pseudo code.
• Ignore constants.
• Consider the coding implementation.

Special case: if f(n) is a polynomial of degree d, then f(n) is O(n^d). E.g., f(n) = 3n² + 4n is O(n²).
Determine O-notation: f(n) a polynomial

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e., drop the lower-order terms:
3x³ + 5x² - 9 = O(x³)

Use the smallest possible class of functions:
say "2n is O(n)" instead of "2n is O(n²)"; similarly, 7n - 2 is O(n).
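The claim "f(n) is O(g(n))" means there are witnesses c and n₀ with f(n) ≤ c·g(n) for all n ≥ n₀. A minimal empirical sketch of that definition; the helper `is_bounded` and the witness values are illustrative, and the check only samples a finite range rather than proving the bound:

```python
def is_bounded(f, g, c, n0, n_max=10_000):
    """Check empirically that f(n) <= c * g(n) for n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 7n - 2 is O(n): witnesses c = 7, n0 = 1.
assert is_bounded(lambda n: 7*n - 2, lambda n: n, c=7, n0=1)
# 2n is O(n) with c = 2 -- no need to retreat to the looser class O(n^2).
assert is_bounded(lambda n: 2*n, lambda n: n, c=2, n0=1)
```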
Which of the below expressions are equivalent to O(n³)?
• O(3n³)
• O(n(n² + 3))
• O(n³ - 2)
• O(n³ + n lg n)
• O(n³ - n² + n)
• O((n² + 3)(n + 1))
All of them!
Determine O-notation: f(n) a general function of n

E.g., 2n² ≤ n³ for n ≥ 2, so 2n² is O(n³) (though O(n²) is the tighter class).
Example 1: If f(n) = 3n², then f(n) is in O(n²).
Examples:

20n³ + 10n log n + 5 is O(n³): 20n³ + 10n log n + 5 ≤ 35n³, for n ≥ 1.

3 log n + log log n is O(log n): 3 log n + log log n ≤ 4 log n, for n ≥ 2.

2¹⁰⁰ is O(1): 2¹⁰⁰ ≤ 2¹⁰⁰ · 1, for n ≥ 1.

5/n is O(1/n): 5/n ≤ 5 · (1/n), for n ≥ 1.
The "Big-Oh" Notation: common running times

Constant Time: O(1)
Quadratic Time: O(N²)
Logarithmic Time: O(log N) and O(N log N)
Relations Between Oh, Omega, Theta

Big-Omega meaning: for all data sets big enough (i.e., n > n₀), the algorithm always executes in more than c·g(n) steps.
Example: f(n) = c₁n² + c₂n is both O(n²) and Ω(n²).

When big-Oh and Ω (big-Omega) meet, we indicate this by using Θ (big-Theta) notation: here, f(n) is Θ(n²).
Practical Considerations (n = 100)

[Chart: number of steps for each Big-O class, with gaps of roughly ×2, ×50, and ×100]

There is no big difference in running time between Θ(n) and Θ(n log n).
There is an enormous difference between Θ(n²) and Θ(n log n).
Remarks
• Most statements in a program do not have much effect on the running time of that program.
• There is little point in cutting in half the running time of a subroutine that accounts for only 1% of the total.
• Focus your attention on the parts of the program that have the most impact.
• The greatest time and space improvements come from a better data structure or algorithm.
How much larger an input can we solve in 1 second on a machine that is 10 times faster (old speed S₁ = 10⁴ inst/sec, new speed S₂ = 10⁵ inst/sec)? Solve Time = T(n)/S = 1 sec for n.

T(n) = 10n
Old: 10·n₁ / 10⁴ = 1 sec → n₁ = 1000
New: 10·n₂ / 10⁵ = 1 sec → n₂ = 10000
Input size increase = n₂/n₁ = 10 times

T(n) = 5n log n
Old: 5·n₁ log n₁ / 10⁴ = 1 → n₁ log n₁ = 2000 → n₁ ≈ 250
New: 5·n₂ log n₂ / 10⁵ = 1 → n₂ log n₂ = 20000 → n₂ ≈ 1842
Input size increase = n₂/n₁ = 1842/250 ≈ 7.37 times

T(n) = 2n²
Old: 2·n₁² / 10⁴ = 1 → n₁² = 5000 → n₁ ≈ 70
New: 2·n₂² / 10⁵ = 1 → n₂² = 50000 → n₂ ≈ 223
Input size increase = n₂/n₁ = 223/70 ≈ 3.19 times
Faster Computer, or Faster Algorithm?
An algorithm with time equation T(n) = 2n2
does not receive nearly as great an
improvement from the faster machine as an
algorithm with linear growth rate.
Faster Computer, or Faster Algorithm?
Exercise: