
Session3 - Asymptotic Notations

The document discusses the analysis and design of algorithms, focusing on asymptotic notations used to evaluate algorithm performance. It explains various notations such as Big-O, Omega, and Theta, which help in determining upper and lower bounds of algorithm efficiency. Additionally, it includes examples and practice questions to reinforce understanding of these concepts.

Analysis and Design of Algorithms

Dr. Vandna Batra

Unit-1
Introduction to Algorithms
Session-3
Asymptotic Notations

School of Engineering & Technology


K.R. Mangalam University, Gurugram (Haryana)
How do we decide which algorithm is
better?
 Given two algorithms for the same problem, how do we decide
which one is better?
 One approach: implement both algorithms on the same machine and time them.
 There are many problems with this approach:
 the results depend on the programming language, the compiler, and other machine-dependent factors.
 Second approach: Asymptotic Analysis.
Asymptotic Analysis of Algorithms

 We estimate the performance of an algorithm in terms of input size
(we don't measure the actual running time);
 it determines the growth rate of an algorithm's running time;
 asymptotic analysis covers the best-case, average-case, and worst-case
scenarios;
 it lets us compare the efficiency of different algorithms;
 asymptotic analysis is not perfect,
but it is the best method available for analyzing algorithms.
Big-O Notation
Big-O is the asymptotic notation most often used for the worst case.
 It provides an asymptotic upper bound on
the growth rate of an algorithm's runtime.
 Say f(n) is your algorithm's runtime, and g(n) is a
candidate time complexity you are trying to relate to your
algorithm.
 f(n) is O(g(n)) if there exist real constants c (c > 0) and n₀
such that f(n) ≤ c·g(n) for every input size n (n > n₀).
 f(n) grows no faster than g(n) for "large" n.
 g(n) defines an upper bound on f(n).
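The definition above can be tried out numerically. The sketch below (not from the slides; the helper name `holds_big_o` is made up for illustration) checks f(n) ≤ c·g(n) over a finite range. A finite check cannot prove an asymptotic bound, but it shows what the constants c and n₀ are doing:

```python
# Hypothetical helper: check f(n) <= c*g(n) for every n0 < n <= n_max.
# This only illustrates the definition; it is not a proof of the bound.

def holds_big_o(f, g, c, n0, n_max=10_000):
    """Return True if f(n) <= c*g(n) for every n0 < n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

f = lambda n: 2 * n + 3   # model of an algorithm's runtime
g = lambda n: n           # candidate upper bound

print(holds_big_o(f, g, c=5, n0=1))   # True: 2n+3 <= 5n for n > 1
print(holds_big_o(f, g, c=1, n0=1))   # False: c=1 is too small
```

Note that the choice of c matters: the definition only asks for *some* constant c that works.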
Big-O Notation
Examples:

If f(n) = 2n + 3 is your algorithm's runtime,
can we write f(n) = O(n)?
Yes: 2n + 3 ≤ 5n for all n ≥ 1, so take c = 5 and n₀ = 1.

If f(n) = 3n² and g(n) = n,
is f(n) = O(n)?
No: there is no constant c with 3n² ≤ c·n for all large n.

2n + 3 is O(n) and 5n is O(n);
Big-O places 2n + 3 and 5n in the same category.
 Big-O allows us to ignore constant factors
and lower-order (less dominant) terms.
Common Asymptotic Notations &
Comparison
constant Ο(1)
logarithmic Ο(log n)
linear Ο(n)
n log n Ο(n log n)
quadratic Ο(n²)
cubic Ο(n³)
polynomial n^Ο(1)
exponential 2^Ο(n)

One should remember the general order of the following functions:

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(n^k) < O(2ⁿ)   (for constant k > 3)
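The ordering above can be made concrete by evaluating each growth function at a few input sizes. A minimal Python sketch (not from the slides):

```python
# Evaluate the common growth functions at a few input sizes to see the
# ordering O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n).
import math

functions = [
    ("1",       lambda n: 1),
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("n^3",     lambda n: n ** 3),
    ("2^n",     lambda n: 2 ** n),
]

for name, f in functions:
    print(f"{name:8s}", [round(f(n)) for n in (4, 16, 64)])
```

Even at n = 64 the exponential function dwarfs everything else, which is why the ordering matters long before inputs get "large".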
Big-O Notation example
Example 1: Assume that f(n) = 5n + 50 and g(n) = n. Is f(n) = O(g(n))?

Solution: We need f(n) ≤ c·g(n) for all n > n₀. Assume c = 6:
5n + 50 ≤ 6n  ⇔  50 ≤ n,
so the inequality holds for all n ≥ 50 (take n₀ = 50), and f(n) = O(n).

Why c = 6? Any constant larger than 5 works: the extra (c − 5)·n
eventually absorbs the constant 50.

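The role of n₀ in Example 1 can be found by brute force. A small sketch (not from the slides; the helper name `smallest_n0` is made up) searches for the first point from which 5n + 50 ≤ c·n holds:

```python
# Hypothetical helper: find the smallest n0 such that f(n) <= c*g(n)
# holds for every n from n0 up to n_max (a finite stand-in for "onward").

def smallest_n0(f, g, c, n_max=1000):
    for n0 in range(1, n_max):
        if all(f(n) <= c * g(n) for n in range(n0, n_max)):
            return n0
    return None

# f(n) = 5n + 50, g(n) = n, c = 6: solving 5n + 50 <= 6n gives n >= 50.
print(smallest_n0(lambda n: 5 * n + 50, lambda n: n, c=6))  # 50
```

A larger c buys a smaller n₀ (e.g. c = 7 gives n₀ = 25), which is why the definition only requires that *some* pair (c, n₀) exists.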
Big-O Notation example
Example 2: Assume that f(n) = 5n + 50 and g(n) = log₁₀ n. Is f(n) = O(g(n))?

Solution: We need f(n) ≤ c·g(n). Even for a large constant such as c = 1000,
5n + 50 eventually exceeds 1000·log₁₀ n, because a linear function grows
faster than any logarithm. So g(n) is
not an upper bound, and f(n) ≠ O(log n).
Big-O Notation Practice Questions

Little-o asymptotic notation
 Big-Ο is often used as a tight upper bound on the growth of an algorithm's effort.

 "Little-o" (o()) notation is used to describe an upper bound that is not tight.

 Definition: Let f(n) and g(n) be functions that map positive integers to positive
real numbers.
 We say that f(n) is o(g(n)) if for every real constant c > 0, there exists an integer
constant n₀ ≥ 1 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀.
 Equivalently, f(n) = o(g(n)) means lim (n→∞) f(n)/g(n) = 0.
 For Big-O, the inequality must hold for at least one constant c;
for little-o, it must hold for all constants c > 0.
 Big-O is an inclusive upper bound, while little-o is a strict upper bound.
Omega Notation (Ω)

 Expresses a lower bound on an algorithm's running time.
 It is a measure of best-case time complexity.

For a function f(n):
f(n) = Ω(g(n)) if there exist constants c > 0 and n₀ such
that c·g(n) ≤ f(n) for all n > n₀.
Omega Notation (Ω)

Example 1: f(n) = 3n² + 2n + 1

We can take g(n) = n² and c = 3, s.t.
f(n) = 3n² + 2n + 1 ≥ 3n² = 3·g(n) for all n ≥ 1 (n₀ = 1).

Therefore f(n) = Ω(n²).


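As with Big-O, the choice of constants in an Ω claim can be sanity-checked over a finite range. A minimal sketch (not from the slides):

```python
# Check (for a finite range) that f(n) = 3n^2 + 2n + 1 >= 3*g(n) with
# g(n) = n^2, matching the choice c = 3, n0 = 1 in Example 1.

f = lambda n: 3 * n**2 + 2 * n + 1
g = lambda n: n**2

assert all(f(n) >= 3 * g(n) for n in range(1, 10_000))
print("f(n) >= 3*n^2 holds on the checked range, consistent with f(n) = Omega(n^2)")
```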
Omega Notation (Ω)

Example 2: Find the lower bound for f(n) = 10n² + 5.

• Step 1: Identify the dominant term, which is 10n². Assume g(n) = n².
• Step 2: By the Omega definition we need c·g(n) ≤ f(n): with c = 10,
10n² ≤ 10n² + 5 holds for all n ≥ 1 (n₀ = 1).
• Therefore f(n) = Ω(n²).
Little-ω (omega) asymptotic notation

Small-omega, commonly written as ω,
denotes a lower bound (that is not asymptotically tight) on the growth rate of
an algorithm's runtime.
f(n) = ω(g(n)) if for all real constants c (c > 0) there exists n₀ (n₀ > 0) such that f(n) > c·g(n) for every
input size n (n > n₀).
In f(n) = Ω(g(n)), the bound f(n) ≥ c·g(n) holds for some constant c > 0,
but in f(n) = ω(g(n)), the bound f(n) > c·g(n) holds for all constants c > 0.
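Little-ω has a limit characterization symmetric to little-o: f(n) = ω(g(n)) exactly when f(n)/g(n) grows without bound. A small illustration (not from the slides):

```python
# f(n) = omega(g(n)) iff f(n)/g(n) -> infinity as n grows.
# With f(n) = n^2 and g(n) = n, the ratio equals n and grows without bound,
# so n^2 = omega(n). But n^2 is NOT omega(n^2): that ratio stays at 1.

for n in (10, 100, 1000):
    print(n, n**2 / n, n**2 / n**2)
```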
Theta Notation (Θ)
 Theta, commonly written as Θ, is an asymptotic
notation denoting an asymptotically tight bound on
the growth rate of an algorithm's runtime.
 It expresses both a lower bound and an upper bound on an algorithm's
running time.
Θ(g(n)) = {f(n): there exist positive constants
c1, c2 and n₀ such that 0 ≤ c1·g(n) ≤ f(n) ≤
c2·g(n) for all n ≥ n₀}

f(n) is always between c1·g(n) and c2·g(n)
for large values of n (n ≥ n₀).
Theta Notation (Θ)

Example 1: f(n) = 3n + 2
1) We show that f(n) ≤ c1·g(n):
• let g(n) = n and c1 = 4;
• by the definition of Big-O,
• 3n + 2 ≤ 4n, which is true for all n > 2 (n₀ = 2).
2) Now we show c2·g(n) ≤ f(n), satisfying the Omega notation:
• here g(n) = n; let c2 = 1; then
• 1·n ≤ 3n + 2, which is also true for all n > 2.
• Therefore, by the definition of Theta notation,
• f(n) = Θ(g(n)) = Θ(n) with constants c1 = 4 and c2 = 1 for all n > 2.
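The two-sided "sandwich" in Example 1 can be checked over a finite range. A minimal sketch (not from the slides):

```python
# Verify the sandwich 1*n <= 3n + 2 <= 4n from the Theta example,
# with c1 = 1, c2 = 4, n0 = 2 (checked over a finite range of n).

f = lambda n: 3 * n + 2

assert all(1 * n <= f(n) <= 4 * n for n in range(3, 10_000))
print("n <= 3n+2 <= 4n holds on the checked range, consistent with f(n) = Theta(n)")
```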
Theta Notation (Θ)

• Example 2: Show that f(n) = n³ + 3n² = Θ(n³).
• Hint: n³ ≤ n³ + 3n² ≤ 4n³ for all n ≥ 1, so take c1 = 1, c2 = 4, n₀ = 1.


Test Your
Knowledge
Q1. Consider the following three claims:
1. (n + k)ᵐ = Θ(nᵐ), where k and m are constants
2. 2ⁿ⁺¹ = O(2ⁿ)
3. 2²ⁿ⁺¹ = O(2ⁿ)
Which of these claims are correct?
(A) 1 and 2
(B) 1 and 3
(C) 2 and 3
(D) 1, 2, and 3
Test Your
Knowledge
Answer: (A)
Explanation: (n + k)ᵐ and Θ(nᵐ) are asymptotically the same, as the Theta
notation can always be written by taking the leading-order term of a
polynomial expression.
2ⁿ⁺¹ and O(2ⁿ) are also asymptotically the same, as 2ⁿ⁺¹ can be written
as 2·2ⁿ, and constant multiplication/addition does not matter in asymptotic
notation.
2²ⁿ⁺¹ and O(2ⁿ) are not the same, as the constant is in the exponent:
2²ⁿ⁺¹ = 2·4ⁿ, which grows strictly faster than 2ⁿ.
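A quick numeric check makes the difference between claims 2 and 3 visible (an illustration, not from the slides):

```python
# Claim 2: 2^(n+1) / 2^n is the constant 2, so 2^(n+1) = O(2^n).
# Claim 3: 2^(2n+1) / 2^n = 2 * 2^n grows without bound,
# so 2^(2n+1) is NOT O(2^n).

for n in (1, 5, 10):
    print(n, 2**(n + 1) / 2**n, 2**(2 * n + 1) / 2**n)
```

The first ratio never changes; the second doubles with every increment of n, so no single constant c can cap it.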
Test Your Knowledge
Q2. Is 3 log n + 100 = O(log n)?

Q3. If f(n) = 3n² and g(n) = n, then is f(n) = O(g(n))?

Q4. Suppose you have algorithms with the five running times listed
below. How much slower does each of these algorithms get when you (a)
double the input size, or (b) increase the input size by one?
1. n²
2. n³
3. 100n²
4. n log n
5. 2ⁿ
Test Your Knowledge
Q5. If f(n) = n + 2, can we write f(n) = O(n²)?

Q6. f(n) = 3n² + 4n + 1. Show that f(n) is O(n²).

Q7. f(n) = 10n + 5 and g(n) = n. Show that f(n) is O(g(n)).


Test Your Knowledge
State True or False:

Q8. 3n² + 10n log n = O(n log n)

Q9. 3n² + 10n log n = Ω(n²)
Q10. 3n² + 10n log n = Θ(n²)
