CS1011E: Program Design
Asymptotic Analysis
Algorithm
• An algorithm is a finite sequence of logically related
instructions to solve a computational problem.
Examples:
— Given an integer x, test whether x is prime or not.
— Given a program P, check whether P runs into an infinite loop.
(Note: the second problem is the halting problem, which is undecidable;
no algorithm can solve it for all programs.)
• Properties - Input, Output, Finiteness, Definiteness, Effectiveness
Factorial (Recursive/Iterative)
Example Iterative Algorithm for Factorial
Fact_Iter(n)
    fact = 1
    for i = 1 to n
        fact = fact * i
    return fact
Example Recursive Algorithm for Factorial
Fact_Recr(n)
    if n <= 1
        return 1
    else
        return n * Fact_Recr(n - 1)
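The two pseudocode algorithms above can be sketched in Python (a sketch, not part of the slides; the function names mirror the slide's Fact_Iter and Fact_Recr):

```python
def fact_iter(n):
    """Iterative factorial: one multiplication per loop iteration."""
    fact = 1
    for i in range(1, n + 1):
        fact = fact * i
    return fact


def fact_recr(n):
    """Recursive factorial: n * (n-1)! with base case n <= 1."""
    if n <= 1:
        return 1
    return n * fact_recr(n - 1)
```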
ALGORITHM ANALYSIS
Correctness
For any algorithm, a proof of correctness is important: it establishes
that the algorithm indeed outputs the desired answer for every valid
input.
Amount of work done (time complexity)
The time complexity does not refer to the actual running time of an
algorithm; instead it is the step count: the number of times each
statement in the algorithm is executed.
• The step count is expressed in terms of primitive operations
(assignments, comparisons, arithmetic) and other basic operations.
Space Complexity
• The number of primitive operations increases with the problem size.
• Space complexity is a related complexity measure that refers to the amount
of space used by an algorithm.
• Among the many algorithms for a given combinatorial problem, a natural
question is which algorithm is the best (most efficient).
INPUT SIZE AND PRIMITIVE OPERATIONS
ANALYSIS OF ALGORITHMS USING STEP COUNT METHOD
1 Each basic statement, like an assignment or a return, has a count of 1.
2 If a basic statement is inside a loop, multiply its count by the number
of times the loop runs.
3 A loop statement that iterates n times has a count of (n + 1): the
loop condition is checked n times when it is true, plus one additional
check for the loop to exit (the false condition).
Examples
1) Sum of elements in an array
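The body of this example did not survive conversion; a minimal sketch of the usual array-sum algorithm, with step counts from the rules above as comments, might look like:

```python
def array_sum(a):
    """Sum the elements of a; step counts follow the rules above."""
    n = len(a)
    total = 0                 # count: 1
    for i in range(n):        # count: n + 1 (n true checks + 1 false check)
        total = total + a[i]  # count: n (executed once per iteration)
    return total              # count: 1
# Total step count: 1 + (n + 1) + n + 1 = 2n + 3
```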
2) Fibonacci(n)

algorithm Fibonacci(n)          Step count
  if n ≤ 1                      1
    output n
  else
    f2 = 0                      1
    f1 = 1                      1
    for i = 2 to n              n
      f = f1 + f2               n - 1
      f2 = f1                   n - 1
      f1 = f                    n - 1
    output f                    1
Total number of steps: 4n + 1
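The Fibonacci pseudocode above translates directly to Python (a sketch; the slide's step counts refer to the pseudocode, not to this code):

```python
def fibonacci(n):
    """Iterative Fibonacci, mirroring the pseudocode above."""
    if n <= 1:
        return n
    f2, f1 = 0, 1              # F(0) and F(1)
    for _ in range(2, n + 1):
        f = f1 + f2
        f2 = f1
        f1 = f
    return f
```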
Order of Growth or Rate of Growth
• Order of Growth (Rate of Growth) - a simple characterization of an
algorithm's efficiency, obtained by identifying the most significant
term in the step count.
Example
• The low-order terms in a function are relatively insignificant for large n:
n^4 + 100n^2 + 10n + 50 ~ n^4
i.e., we say that n^4 + 100n^2 + 10n + 50 and n^4 have the same rate of
growth
Asymptotic analysis
• Asymptotic analysis is a technique that focuses analysis on the
“significant term”.
• For example, for an algorithm with step count 2n^2 + 3n + 1, the
order of growth is determined by the 2n^2 term for large n.
Asymptotic analysis (cont…)
• To compare two algorithms with running times f(n) and
g(n), we need a rough measure that characterizes how fast
each function grows.
• Hint: use rate of growth
• Compare functions in the limit, that is, asymptotically!
(i.e., for large values of n)
Asymptotic Notations
1. Big-oh Notation (O) - To express an upper bound on the time
complexity as a function of the input size.
2. Omega (Ω) - To express a lower bound on the time
complexity as a function of the input size.
3. Theta (Θ) - To express the tight bound on the time
complexity as a function of the input size.
Asymptotic Notations (cont..)
• O notation: asymptotic “less than”:
f(n)=O(g(n)) implies: f(n) “≤” g(n)
• Ω notation: asymptotic “greater than”:
f(n)= Ω (g(n)) implies: f(n) “≥” g(n)
• Θ notation: asymptotic “equality”:
f(n)= Θ (g(n)) implies: f(n) “=” g(n)
Asymptotic notation-Big O notation
Examples
• We say fA(n) = 30n + 8 is order n, or O(n).
It is, at most, roughly proportional to n.
• fB(n) = n^2 + 1 is order n^2, or O(n^2). It is, at most, roughly
proportional to n^2.
• In general, a function that grows like n^2 eventually grows faster
than any function that grows like n.
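A quick numeric check (a sketch, not part of the slides) finds the exact point where fB(n) = n^2 + 1 overtakes fA(n) = 30n + 8:

```python
def f_a(n):
    return 30 * n + 8   # grows like n


def f_b(n):
    return n * n + 1    # grows like n^2


# Smallest n at which the quadratic function overtakes the linear one.
crossover = next(n for n in range(1, 10_000) if f_b(n) > f_a(n))
```

Past that crossover the quadratic stays above the linear function forever, which is exactly what the asymptotic comparison predicts.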
Visualizing Orders of Growth
• On a graph, as you go to the right (increasing n), the faster-growing
function eventually becomes larger: fB(n) = n^2 + 1 eventually
overtakes fA(n) = 30n + 8.
[Figure: value of each function plotted against increasing n]
More Examples
• n^4 + 100n^2 + 10n + 50 is O(n^4)
• 10n^3 + 2n^2 is O(n^3)
• n^3 - n^2 is O(n^3)
• constants
  - 10 is O(1)
  - 1273 is O(1)
Examples
- 2n^2 = O(n^3): 2n^2 ≤ cn^3 ⇒ 2 ≤ cn ⇒ c = 1 and n0 = 2
- n^2 = O(n^2): n^2 ≤ cn^2 ⇒ c ≥ 1 ⇒ c = 1 and n0 = 1
- 1000n^2 + 1000n = O(n^2):
  1000n^2 + 1000n ≤ 1000n^2 + n^2 = 1001n^2 ⇒ c = 1001 and n0 = 1000
- n = O(n^2): n ≤ cn^2 ⇒ cn ≥ 1 ⇒ c = 1 and n0 = 1
No Uniqueness
• There is no unique pair of values n0 and c in proving the asymptotic bounds
• Prove that 100n + 5 = O(n^2):
  - 100n + 5 ≤ 100n + n = 101n ≤ 101n^2 for all n ≥ 5
    n0 = 5 and c = 101 is a solution
  - 100n + 5 ≤ 100n + 5n = 105n ≤ 105n^2 for all n ≥ 1
    n0 = 1 and c = 105 is also a solution
• We need only find SOME constants c and n0 that satisfy the asymptotic
notation relation
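Both witness pairs from this slide can be sanity-checked numerically (a finite check over a range, not a proof; the helper name is mine):

```python
def holds_big_o(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all n0 <= n <= n_max (finite sanity check)."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))


def f(n):
    return 100 * n + 5


def g(n):
    return n * n
```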
Asymptotic notations- Big Omega Notation
• Ω - notation
Ω(g(n)) is the set of functions
with larger or same order of
growth as g(n)
Asymptotic Lower Bound-Big Omega Notation
Examples
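The worked Ω examples on this slide were lost in conversion; as a hedged illustration consistent with the Remarks that follow, Ω witnesses can be checked numerically (a finite check, not a proof; the helper name is mine):

```python
def holds_big_omega(f, g, c, n0, n_max=5_000):
    """Check f(n) >= c * g(n) for all n0 <= n <= n_max (finite sanity check)."""
    return all(f(n) >= c * g(n) for n in range(n0, n_max + 1))


def f(n):
    return 3 * n * n + 2


# 3n^2 + 2 = Omega(n^2) with witnesses c = 3, n0 = 1 (3n^2 + 2 >= 3n^2),
# but 3n^2 + 2 is NOT Omega(n^3): no constant c works for all large n.
```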
Remarks
3n^2 + 2 ≠ Ω(n^3).
Reason: there is no positive constant c such that 3n^2 + 2 ≥ c·n^3 for
every n ≥ n0, because that would bound n by a constant. Indeed, since
3n^2 + 2 ≤ 5n^2 for n ≥ 1, the inequality 3n^2 + 2 ≥ c·n^3 would force
c·n^3 ≤ 5n^2, i.e., n ≤ 5/c, a contradiction for large n.
Similarly, 3·2^n ≠ Ω(n!) and 5 ≠ Ω(n).
Asymptotic notations-Theta notation
• Θ-notation
Θ(g(n)) is the set of functions
with the same order of growth
as g(n)
Examples
1) 3n + 1010 = Θ(n)
   3n ≤ 3n + 1010 ≤ 4n, ∀n ≥ 1010 ⇒ 3n + 1010 = Θ(n)
   Note that the first inequality captures 3n + 1010 = Ω(n) and the
   second captures 3n + 1010 = O(n).
2) 10n^2 + 4n + 2 = Θ(n^2)
   10n^2 ≤ 10n^2 + 4n + 2 ≤ 20n^2, ∀n ≥ 1, c1 = 10, c2 = 20
3) 6·2^n + n^2 = Θ(2^n)
   6·2^n ≤ 6·2^n + n^2 ≤ 12·2^n, ∀n ≥ 1, c1 = 6, c2 = 12
4) 2n^2 + n lg n + 1 = Θ(n^2)
   2n^2 ≤ 2n^2 + n lg n + 1 ≤ 5n^2, ∀n ≥ 2, c1 = 2, c2 = 5
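Example 2's two-sided witnesses (c1 = 10, c2 = 20, n0 = 1) can likewise be sanity-checked over a finite range (a sketch, not a proof; the helper name is mine):

```python
def holds_theta(f, g, c1, c2, n0, n_max=10_000):
    """Check c1*g(n) <= f(n) <= c2*g(n) for all n0 <= n <= n_max."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))


def f(n):
    return 10 * n * n + 4 * n + 2


def g(n):
    return n * n
```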
Remarks
3n + 2 ≠ Θ(1).
Reason: 3n + 2 ≠ O(1)
3n + 3 ≠ Θ(n^2).
Reason: 3n + 3 ≠ Ω(n^2)
n^2 ≠ Θ(2^n).
Reason: n^2 ≠ Ω(2^n)
Solving Recurrences
1) Substitution Method
2) Recursion Tree Method
3) Master Method
Solution for Recurrence relation (cont..)
▪ Substitution method
  substitute T(i) repeatedly until i reaches 0 or 1
▪ Recursion Tree
  nodes represent costs; sum up the costs level by level
▪ Master Method
  memorize the three cases in the upcoming slides
Running Time in Recurrence equation
Divide and Conquer – Recurrences
Recurrence Example 1 by Substitution
• T(n) = 2T(n/2) + cn
       = 2(2T(n/4) + cn/2) + cn
       = 4T(n/4) + 2cn
       = 4(2T(n/8) + cn/4) + 2cn
       = 8T(n/8) + 3cn
       = 2^3 T(n/2^3) + 3cn
After k substitutions:
T(n) = 2^k T(n/2^k) + kcn
Recurrence Eq. Example 1 (cont…)
Assume n = 2^k
Taking log on both sides: lg n = k
T(n) = n·T(n/n) + cn·lg n
T(n) = n·T(1) + cn·lg n
     = Θ(n lg n)
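The closed form T(n) = n·T(1) + c·n·lg n can be verified exactly against the recurrence for powers of two (a sketch; T(1) = 1 and c = 1 are my assumptions, since the slide leaves them symbolic):

```python
def t1(n, c=1):
    """Evaluate T(n) = 2T(n/2) + c*n with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return 2 * t1(n // 2, c) + c * n


# Closed form from the slide (with T(1) = 1, c = 1): T(n) = n + n*lg(n)
```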
Recurrence Example 2 by substitution
• T(n) = T(n/2) + c
       = T(n/4) + c + c = T(n/4) + 2c = T(n/2^2) + 2c
       = T(n/8) + 3c = T(n/2^3) + 3c
After k substitutions:
T(n) = T(n/2^k) + kc
Recurrence Eq. Example 2 (cont..)
Assume n = 2^k
Taking log on both sides: lg n = k
T(n) = T(n/n) + c·lg n
T(n) = T(1) + c·lg n
     = Θ(lg n)
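Similarly, the closed form T(n) = T(1) + c·lg n can be checked exactly (again assuming T(1) = 1 and c = 1, which the slide leaves symbolic):

```python
def t2(n, c=1):
    """Evaluate T(n) = T(n/2) + c with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return t2(n // 2, c) + c
```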
Recursion Tree Method
• T(n) = 2T(n/2) + cn: the root costs cn and has two children, each a
subproblem of size n/2.
• Expanding one level: T(n/2) = 2T(n/4) + cn/2 = T(n/4) + T(n/4) + cn/2,
so the two level-1 nodes cost cn/2 each, contributing cn in total.
• Every internal level of the tree contributes cn in total, and there
are lg n such levels (e.g., 3 levels when n = 8 = 2^3), so the total
cost is Θ(n lg n).
[Figure: recursion tree when n = 8 = 2^3]
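The level-by-level accounting in the recursion tree can be reproduced numerically for n = 8 (a sketch with c = 1 assumed): level i has 2^i nodes of cost n/2^i each, so every internal level sums to n.

```python
n = 8                                    # n = 2^3, so lg n = 3 internal levels
level_costs = [(2 ** i) * (n // 2 ** i) for i in range(3)]
leaf_cost = n                            # n leaves, each costing T(1) = 1
total = sum(level_costs) + leaf_cost     # matches T(n) = n*lg(n) + n
```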
Master Method
For recurrences of the form T(n) = aT(n/b) + f(n) with constants a ≥ 1
and b > 1, there are three cases:
1. If f(n) = O(n^(log_b a - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if a·f(n/b) ≤ c·f(n)
for some constant c < 1 and all sufficiently large n (the regularity
condition), then T(n) = Θ(f(n)).
Example 1
Example 2
Example 3
T(n) = 3T(n/4) + n lg n
a = 3, b = 4, f(n) = n lg n
n^(log_b a) = n^(log_4 3) = O(n^0.793)
f(n) = Ω(n^(log_b a + ε)) for ε ≈ 0.2, so case 3 applies
Regularity: a·f(n/b) = 3(n/4) lg(n/4) ≤ (3/4) n lg n = c·f(n) for c = 3/4
Solution: T(n) = Θ(f(n)) = Θ(n lg n)
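As a numeric sanity check (a sketch, not a proof), evaluating the recurrence for powers of 4 shows T(n)/(n lg n) staying bounded by constants, as the Θ(n lg n) solution predicts (T(1) = 1 is my assumption):

```python
from math import log2


def t3(n):
    """Evaluate T(n) = 3T(n/4) + n*lg(n) with T(1) = 1, for n a power of 4."""
    if n <= 1:
        return 1
    return 3 * t3(n // 4) + n * log2(n)


n = 4 ** 8                      # n = 65536
ratio = t3(n) / (n * log2(n))   # bounded above and below by constants
```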
Reference
Thomas H. Cormen, C. E. Leiserson, R. L. Rivest, C. Stein (CLRS),
"Introduction to Algorithms", 3rd ed., PHI, 2010.