
CSE-209

Algorithms-I

Lecture 7

Asymptotic Notation-I

Md. Rafsan Jani


Assistant Professor
Department of Computer Science and Engineering
Jahangirnagar University
Asymptotic notation
O-notation (upper bounds):
We write f(n) = O(g(n)) if there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
EXAMPLE: 2n² = O(n³)   (c = 1, n₀ = 2)
(A funny, “one-way” equality: the relation is between functions, not values.)
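To make the witnesses concrete, here is a minimal Python sketch (my addition, not part of the slides) that samples the inequality 0 ≤ 2n² ≤ c·n³ for the constants c = 1, n₀ = 2 from the example; the function name holds_O_bound is illustrative, and a finite sample can only sanity-check the constants, not prove the asymptotic claim.

```python
def holds_O_bound(f, g, c, n0, n_max=1000):
    """Check 0 <= f(n) <= c*g(n) for every integer n in [n0, n_max].

    A finite sample: it can refute a bad choice of constants but never
    proves an asymptotic statement."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 2n^2 = O(n^3) with the witnesses c = 1, n0 = 2 from the example
print(holds_O_bound(lambda n: 2 * n * n, lambda n: n ** 3, c=1, n0=2))  # True
```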
Set definition of O-notation
O(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
Macro substitution
Convention: A set in a formula represents an anonymous function in the set.
EXAMPLE: f(n) = n³ + O(n²) means
f(n) = n³ + h(n) for some h(n) ∈ O(n²).
EXAMPLE: n² + O(n) = O(n²) means
for any f(n) ∈ O(n): n² + f(n) = h(n) for some h(n) ∈ O(n²).
Ω-notation (lower bounds)
O-notation is an upper-bound notation. It makes no sense to say f(n) is at least O(n²).
Ω(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
EXAMPLE: √n = Ω(lg n)   (c = 1, n₀ = 16)
Θ-notation (tight bounds)
Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
EXAMPLE: ½n² − 2n = Θ(n²)
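As a numeric illustration of the tight bound (a sketch I am adding, with one arbitrary but valid choice of constants, not taken from the lecture), the constants c₁ = 1/4, c₂ = 1/2, n₀ = 8 witness both halves of ½n² − 2n = Θ(n²):

```python
def holds_theta_bound(f, g, c1, c2, n0, n_max=1000):
    """Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for integer n in [n0, n_max].

    Again only a finite sanity check on the constants, not a proof."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

# 1/2 n^2 - 2n = Theta(n^2): c1 = 1/4, c2 = 1/2, n0 = 8 is one valid choice
print(holds_theta_bound(lambda n: n * n / 2 - 2 * n, lambda n: n * n,
                        c1=0.25, c2=0.5, n0=8))  # True
```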
o-notation and ω-notation
O-notation and Ω-notation are like ≤ and ≥. o-notation and ω-notation are like < and >.
o(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }
EXAMPLE: 2n² = o(n³)   (n₀ = 2/c)
ω(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }
EXAMPLE: √n = ω(lg n)   (n₀ = 1 + 1/c)
Solving recurrences
•The analysis of merge sort from Lecture 1
required us to solve a recurrence.
•Solving recurrences is like solving integrals, differential equations, etc.
 o Learn a few tricks.
•Lecture 3: Applications of recurrences to
divide-and-conquer algorithms.
Exercise
Is 2^(n+1) = O(2^n)?
Is 2^(2n) = O(2^n)?
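One way to build intuition before answering (a small sketch I am adding, not part of the exercise) is to look at the ratio of each function to 2^n: a ratio bounded by a constant is consistent with an O(2^n) claim, while an unbounded ratio rules it out.

```python
for n in range(1, 21):
    r1 = 2 ** (n + 1) / 2 ** n   # ratio of 2^(n+1) to 2^n: constant 2 for every n
    r2 = 2 ** (2 * n) / 2 ** n   # ratio of 2^(2n) to 2^n: equals 2^n, grows without bound
    print(n, r1, r2)
```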
Substitution method
The most general method:
1. Guess the form of the solution.
2. Verify by induction.
3. Solve for constants.
EXAMPLE: T(n) = 4T(n/2) + n
• [Assume that T(1) = Θ(1).]
• Guess O(n³). (Prove O and Ω separately.)
• Assume that T(k) ≤ ck³ for k < n.
Example of substitution
T(n) = 4T(n/2) + n
     ≤ 4c(n/2)³ + n
     = (c/2)n³ + n
     = cn³ − ((c/2)n³ − n)   [ desired − residual ]
     ≤ cn³
whenever (c/2)n³ − n ≥ 0, for example, if c ≥ 2 and n ≥ 1.
Example (continued)
•We must also handle the initial conditions,
that is, ground the induction with base
cases.
•Base: T(n) = Θ(1) for all n < n₀, where n₀ is a suitable constant.
•For 1 ≤ n < n₀, we have “Θ(1)” ≤ cn³, if we pick c big enough.
This bound is not tight!
A tighter upper bound?
We shall prove that T(n) = O(n²).
Assume that T(k) ≤ ck² for k < n:
T(n) = 4T(n/2) + n
     ≤ 4c(n/2)² + n
     = cn² + n
     = O(n²)      Wrong! We must prove the I.H.
     = cn² − (−n)   [ desired − residual ]
     ≤ cn²   for no choice of c > 0.  Lose!
A tighter upper bound!
IDEA: Strengthen the inductive hypothesis.
•Subtract a low-order term.
Inductive hypothesis: T(k) ≤ c₁k² − c₂k for k < n.
T(n) = 4T(n/2) + n
     = 4(c₁(n/2)² − c₂(n/2)) + n
     = c₁n² − 2c₂n + n
     = c₁n² − c₂n − (c₂n − n)
     ≤ c₁n² − c₂n   if c₂ ≥ 1.
Pick c₁ big enough to handle the initial conditions.
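To see the strengthened hypothesis at work, the sketch below (my addition) unrolls T(n) = 4T(n/2) + n for powers of two, using the assumed base case T(1) = 1 and the illustrative constants c₁ = 2, c₂ = 1, and checks T(n) ≤ c₁n² − c₂n at each sampled point:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 4T(n/2) + n on powers of two, with an assumed base case T(1) = 1."""
    return 1 if n == 1 else 4 * T(n // 2) + n

c1, c2 = 2, 1   # one workable choice; the proof above needs c2 >= 1
for n in (2 ** i for i in range(1, 11)):
    assert T(n) <= c1 * n * n - c2 * n   # strengthened inductive hypothesis holds
print("T(n) <= c1*n^2 - c2*n on the sampled powers of two")
```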
Recursion-tree method
•A recursion tree models the costs (time) of a
recursive execution of an algorithm.
•The recursion-tree method can be unreliable,
just like any method that uses ellipses (…).
•The recursion-tree method promotes intuition,
however.
•The recursion tree method is good for
generating guesses for the substitution method.
Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n²:
The root costs n²; its children cost (n/4)² and (n/2)²; their children cost (n/16)², (n/8)², (n/8)², (n/4)²; and so on down to Θ(1) leaves.
Per-level costs:
  level 0:  n²
  level 1:  (n/4)² + (n/2)² = (5/16) n²
  level 2:  (n/16)² + (n/8)² + (n/8)² + (n/4)² = (25/256) n²
  …
Total = n² (1 + 5/16 + (5/16)² + (5/16)³ + ⋯)
      = Θ(n²)   (geometric series)
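The tree's guess can be checked directly: the sketch below (my addition, with an assumed Θ(1) base case of T(n) = 1 for n < 2 and integer division standing in for n/4 and n/2) evaluates the recurrence and prints T(n)/n², which stays bounded by a constant near 16/11, matching the geometric-series total above.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n/4) + T(n/2) + n^2, with an assumed base case T(n) = 1 for n < 2."""
    if n < 2:
        return 1
    return T(n // 4) + T(n // 2) + n * n

for n in (2 ** i for i in range(4, 21, 4)):
    print(n, T(n) / (n * n))   # the ratio stays bounded, consistent with Theta(n^2)
```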
The master method
The master method applies to recurrences of the form
  T(n) = a T(n/b) + f(n),
where a ≥ 1, b > 1, and f is asymptotically positive.
Three common cases
Compare f(n) with n^(log_b a):
1. f(n) = O(n^(log_b a − ε)) for some constant ε > 0.
   •f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor).
   Solution: T(n) = Θ(n^(log_b a)).
2. f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0.
   •f(n) and n^(log_b a) grow at similar rates.
   Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).
Three common cases (cont.)
Compare f(n) with n^(log_b a):
3. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0.
   •f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor),
   and f(n) satisfies the regularity condition that a f(n/b) ≤ c f(n) for some constant c < 1.
   Solution: T(n) = Θ(f(n)).
Examples
EX. T(n) = 4T(n/2) + n
  a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n.
  CASE 1: f(n) = O(n^(2 − ε)) for ε = 1.
  ∴ T(n) = Θ(n²).

EX. T(n) = 4T(n/2) + n²
  a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n².
  CASE 2: f(n) = Θ(n² lg⁰ n), that is, k = 0.
  ∴ T(n) = Θ(n² lg n).
Examples
EX. T(n) = 4T(n/2) + n³
  a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n³.
  CASE 3: f(n) = Ω(n^(2 + ε)) for ε = 1, and 4(n/2)³ ≤ cn³ (reg. cond.) for c = 1/2.
  ∴ T(n) = Θ(n³).

EX. T(n) = 4T(n/2) + n²/lg n
  a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n²/lg n.
  Master method does not apply. In particular, for every constant ε > 0, we have n^ε = ω(lg n).
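The three cases are mechanical enough to encode for the common shape f(n) = n^p (lg n)^k. The sketch below is my own illustration (the name master_bound and its interface are assumptions, not a standard API); it reproduces the classifications of the worked examples above.

```python
import math

def master_bound(a, b, p, k=0):
    """Apply the three master-method cases to T(n) = a*T(n/b) + f(n)
    when f(n) = n^p * (lg n)^k with k >= 0.  Returns a Theta bound as a string."""
    if a < 1 or b <= 1 or k < 0:
        return "outside the cases handled here"
    crit = math.log(a, b)                      # critical exponent log_b(a)
    if math.isclose(p, crit):                  # Case 2: f(n) = Theta(n^(log_b a) lg^k n)
        lg = "lg n" if k + 1 == 1 else f"lg^{k + 1} n"
        return f"Theta(n^{crit:g} {lg})"
    if p < crit:                               # Case 1: f grows polynomially slower
        return f"Theta(n^{crit:g})"
    # Case 3: f grows polynomially faster; regularity a*f(n/b) <= c*f(n) holds for
    # this polynomial-log form because a / b^p < 1 whenever p > log_b(a).
    return f"Theta(n^{p:g})" if k == 0 else f"Theta(n^{p:g} lg^{k} n)"

print(master_bound(4, 2, 1))   # T(n) = 4T(n/2) + n    -> Theta(n^2)
print(master_bound(4, 2, 2))   # T(n) = 4T(n/2) + n^2  -> Theta(n^2 lg n)
print(master_bound(4, 2, 3))   # T(n) = 4T(n/2) + n^3  -> Theta(n^3)
```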
Idea of master theorem
Recursion tree: the root costs f(n); its a children each cost f(n/b), for a total of a·f(n/b) at level 1; the a² nodes at level 2 each cost f(n/b²), for a total of a²·f(n/b²); and so on.
  Height: h = log_b n.
  #leaves = a^h = a^(log_b n) = n^(log_b a), each costing T(1), for a total of Θ(n^(log_b a)) at the leaf level.
CASE 1: The weight increases geometrically from the root to the leaves. The leaves hold a constant fraction of the total weight. ⇒ Θ(n^(log_b a)).
CASE 2 (k = 0): The weight is approximately the same on each of the log_b n levels. ⇒ Θ(n^(log_b a) lg n).
CASE 3: The weight decreases geometrically from the root to the leaves. The root holds a constant fraction of the total weight. ⇒ Θ(f(n)).
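To see the three weight patterns concretely, the snippet below (my addition) prints the per-level cost a^i · f(n/b^i) for the three example recurrences; the costs grow toward the leaves, stay flat, or shrink toward the leaves, matching Cases 1, 2, and 3.

```python
def level_costs(a, b, f, n, levels=6):
    """Total cost contributed by level i of the recursion tree: a^i * f(n / b^i)."""
    return [a ** i * f(n / b ** i) for i in range(levels)]

n = 1 << 10
print(level_costs(4, 2, lambda m: m, n))        # Case 1: per-level cost doubles each level
print(level_costs(4, 2, lambda m: m * m, n))    # Case 2: every level costs the same, n^2
print(level_costs(4, 2, lambda m: m ** 3, n))   # Case 3: per-level cost halves; root dominates
```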
Appendix: geometric series
1 + x + x² + ⋯ + x^n = (1 − x^(n+1)) / (1 − x)   for x ≠ 1
1 + x + x² + ⋯ = 1 / (1 − x)   for |x| < 1
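A quick numeric sanity check of both identities (my addition; the sample value x = 0.3 and the tolerances are arbitrary):

```python
x, n = 0.3, 10
finite = sum(x ** i for i in range(n + 1))            # 1 + x + ... + x^n
assert abs(finite - (1 - x ** (n + 1)) / (1 - x)) < 1e-12

truncated = sum(x ** i for i in range(200))           # long prefix of the infinite series
assert abs(truncated - 1 / (1 - x)) < 1e-12
print("geometric series identities check out for x = 0.3")
```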
Reference
https://ocw.mit.edu/courses/6-046j-introduction-to-algorithms-sma-5503-fall-2005/pages/readings/
