
Department of CSE

DESIGN AND ANALYSIS OF ALGORITHMS


23CS2205R
Topic:

Asymptotic Notations

Session - 3
AIM OF THE SESSION

To familiarize students with the concept of Asymptotic Notations.

INSTRUCTIONAL OBJECTIVES

This Session is designed to:


1. Demonstrate asymptotic notations.
2. Describe the Big-Oh, Omega, and Theta notations.

LEARNING OUTCOMES

At the end of this session, you should be able to:


1. Define asymptotic notations.
2. Describe the Big-Oh, Omega, and Theta notations.
3. Summarize how asymptotic notations are used to analyze the efficiency of algorithms.
Asymptotic efficiency
• Asymptotic efficiency refers to the study of an algorithm's efficiency for large inputs.

• To compare two algorithms with running times f(n) and g(n), we need a rough measure that characterizes how fast each function
grows as n grows.

• Hint: use rate of growth


• Compare functions asymptotically!
(i.e., for large values of n)
Functions ordered by growth rate

Function   Name
1          Growth is constant
log n      Growth is logarithmic
n          Growth is linear
n log n    Growth is n-log-n
n^2        Growth is quadratic
n^3        Growth is cubic
2^n        Growth is exponential
n!         Growth is factorial

1 < log n < n < n log n < n^2 < n^3 < 2^n < n!


– To get a feel for how the various functions grow with n, it helps to tabulate them for increasing values of n, as in the sketch below:
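
A minimal Python sketch (added here purely as an illustration) that tabulates each of the functions above for a few values of n; log is taken base 2, but any base gives the same growth rate:

import math

# Tabulate the common growth-rate functions for a few input sizes.
growth_functions = [
    ("1",       lambda n: 1),
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("n^3",     lambda n: n ** 3),
    ("2^n",     lambda n: 2 ** n),
    ("n!",      lambda n: math.factorial(n)),
]

for n in (2, 4, 8, 16):
    row = ", ".join(f"{name}={f(n):,.0f}" for name, f in growth_functions)
    print(f"n={n}: {row}")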
• The low-order terms and constants in a function are relatively insignificant for large n:

  n^2 + 100n + log10(n) + 1000 ~ n^2

  i.e., we say that n^2 + 100n + log10(n) + 1000 and n^2 have the same rate of growth.
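
A minimal Python check of this claim (an added sketch): the ratio of n^2 + 100n + log10(n) + 1000 to n^2 approaches 1 as n grows, which is what "same rate of growth" means in practice.

import math

def f(n):
    # The full function, with lower-order terms and constants.
    return n ** 2 + 100 * n + math.log10(n) + 1000

for n in (10, 100, 1000, 100000):
    # The ratio f(n) / n^2 approaches 1 as n grows.
    print(n, round(f(n) / n ** 2, 4))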

Some more examples

• n^4 + 100n^2 + 10n + 50 ~ n^4
• 10n^3 + 2n^2 ~ n^3
• n^3 - n^2 ~ n^3
• Constants:
  – 10 ~ 1
  – 1273 ~ 1
Big Oh (O) notation

• Asymptotic/order notation describes the behavior of functions for large inputs.

• Big Oh (O) notation:
  – The big oh notation describes an upper bound on the asymptotic growth rate of the function f.
Definition [Big "oh"]:
  – f(n) = O(g(n)) (read as "f of n is big oh of g of n") iff there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
• The definition states that the function f(n) is at most c times the function g(n) except when n is smaller than n0.
• In other words, f(n) grows more slowly than, or at the same rate as, g(n).
• When providing an upper-bound function g for f, we normally use a single term in n.
• Examples
  – f(n) = 3n + 2
    • 3n + 2 ≤ 4n for all n ≥ 2, therefore 3n + 2 = O(n)

  – f(n) = 10n^2 + 4n + 2
    • 10n^2 + 4n + 2 ≤ 11n^2 for all n ≥ 5, therefore 10n^2 + 4n + 2 = O(n^2)

  – f(n) = 6*2^n + n^2 = O(2^n)   /* 6*2^n + n^2 ≤ 7*2^n for n ≥ 4 */

• It is also possible to write 10n^2 + 4n + 2 = O(n^3), since 10n^2 + 4n + 2 ≤ 7n^3 for n ≥ 2.

• Although n^3 is an upper bound for 10n^2 + 4n + 2, it is not a tight upper bound; we can find a smaller function (n^2) that satisfies the big-oh relation.

• But we cannot write 10n^2 + 4n + 2 = O(n), since that does not satisfy the big-oh relation for sufficiently large input.
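
To make the role of c and n0 concrete, here is a minimal Python sketch (a finite spot-check over a range of n, not a proof) that tests f(n) ≤ c·g(n) using the constants from the examples above; the helper name is_big_oh_witness is just an illustrative choice:

def is_big_oh_witness(f, g, c, n0, n_max=10000):
    # Finite spot-check of the definition: f(n) <= c * g(n) for all tested n >= n0.
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

# 3n + 2 = O(n) with c = 4, n0 = 2
print(is_big_oh_witness(lambda n: 3 * n + 2, lambda n: n, c=4, n0=2))                    # True

# 10n^2 + 4n + 2 = O(n^2) with c = 11, n0 = 5
print(is_big_oh_witness(lambda n: 10 * n * n + 4 * n + 2, lambda n: n * n, c=11, n0=5))  # True

# 10n^2 + 4n + 2 is not O(n): even a large constant such as c = 1000 eventually fails
print(is_big_oh_witness(lambda n: 10 * n * n + 4 * n + 2, lambda n: n, c=1000, n0=1))    # False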
Omega (Ω) notation

• Omega (Ω) notation:
  – The omega notation describes a lower bound on the asymptotic growth rate of the function f.

Definition [Omega]:
  – f(n) = Ω(g(n)) (read as "f of n is omega of g of n") iff there exist positive constants c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0.
• The definition states that the function f(n) is at least c times the function g(n) except when n is smaller than n0.
• In other words, f(n) grows faster than, or at the same rate as, g(n).
• Examples
  – f(n) = 3n + 2
    • 3n + 2 ≥ 3n for all n ≥ 1, therefore 3n + 2 = Ω(n)

  – f(n) = 10n^2 + 4n + 2
    • 10n^2 + 4n + 2 ≥ n^2 for all n ≥ 1, therefore 10n^2 + 4n + 2 = Ω(n^2)

• It is also possible to write 10n^2 + 4n + 2 = Ω(n), since 10n^2 + 4n + 2 ≥ n for n ≥ 0.

• Although n is a lower bound for 10n^2 + 4n + 2, it is not a tight lower bound; we can find a larger function (n^2) that satisfies the omega relation.

• But we cannot write 10n^2 + 4n + 2 = Ω(n^3), since that does not satisfy the omega relation for sufficiently large input.
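
The same kind of finite spot-check (again an added sketch, with constants taken from the examples above) works for Ω by testing f(n) ≥ c·g(n):

def is_omega_witness(f, g, c, n0, n_max=10000):
    # Finite spot-check of the definition: f(n) >= c * g(n) for all tested n >= n0.
    return all(f(n) >= c * g(n) for n in range(n0, n_max))

# 3n + 2 = Omega(n) with c = 3, n0 = 1
print(is_omega_witness(lambda n: 3 * n + 2, lambda n: n, c=3, n0=1))                    # True

# 10n^2 + 4n + 2 = Omega(n^2) with c = 1, n0 = 1
print(is_omega_witness(lambda n: 10 * n * n + 4 * n + 2, lambda n: n * n, c=1, n0=1))   # True

# 10n^2 + 4n + 2 is not Omega(n^3): the lower bound eventually fails
print(is_omega_witness(lambda n: 10 * n * n + 4 * n + 2, lambda n: n ** 3, c=1, n0=1))  # False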
Theta (Θ) notation

• Theta (Θ) notation:
  – The Theta notation describes a tight bound on the asymptotic growth rate of the function f.

Definition [Theta]:
  – f(n) = Θ(g(n)) (read as "f of n is theta of g of n") iff there exist positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
• The definition states that the function f(n) lies between c1 times g(n) and c2 times g(n) except when n is smaller than n0.
• In other words, f(n) grows at the same rate as g(n).
• Examples:
  – f(n) = 3n + 2
    • 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2, therefore 3n + 2 = Θ(n)

  – f(n) = 10n^2 + 4n + 2
    • n^2 ≤ 10n^2 + 4n + 2 ≤ 11n^2 for all n ≥ 5, therefore 10n^2 + 4n + 2 = Θ(n^2)

• But we cannot write either 10n^2 + 4n + 2 = Θ(n) or 10n^2 + 4n + 2 = Θ(n^3), since neither of these satisfies the theta relation.
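
And for Θ, a minimal sketch that checks both constants at once, using c1 and c2 from the examples above (again a finite spot-check, not a proof):

def is_theta_witness(f, g, c1, c2, n0, n_max=10000):
    # Finite spot-check of the definition: c1*g(n) <= f(n) <= c2*g(n) for all tested n >= n0.
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))

# 10n^2 + 4n + 2 = Theta(n^2) with c1 = 1, c2 = 11, n0 = 5
print(is_theta_witness(lambda n: 10 * n * n + 4 * n + 2, lambda n: n * n, c1=1, c2=11, n0=5))  # True

# 3n + 2 = Theta(n) with c1 = 3, c2 = 4, n0 = 2
print(is_theta_witness(lambda n: 3 * n + 2, lambda n: n, c1=3, c2=4, n0=2))                    # True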
Big-Oh, Theta, Omega notations

Tips :
• Think of O(g(n)) as “less than or equal to” g(n)
– Upper bound: “grows slower than or same rate as” g(n)

• Think of Ω(g(n)) as “greater than or equal to” g(n)


– Lower bound: “grows faster than or same rate as” g(n)

• Think of Θ(g(n)) as “equal to” g(n)


– “Tight” bound: same growth rate

• (These comparisons hold for large n.)


Example 1 - Iterative sum of n numbers

Statement                s/e   Frequency   Total steps
Algorithm sum(a, n)      0     --          0
{                        0     --          0
  s := 0;                1     1           O(1)
  for i := 1 to n do     1     n+1         O(n+1)
    s := s + a[i];       1     n           O(n)
  return s;              1     1           O(1)
}                        0     --          0
Total                                      O(n)
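
A direct Python transcription of this algorithm (a sketch for illustration; in practice Python's built-in sum would be used) makes the O(n) behaviour visible: the loop body runs n times.

def iterative_sum(a, n):
    s = 0                  # executes once
    for i in range(n):     # loop control evaluated n+1 times
        s = s + a[i]       # body executes n times -> O(n) overall
    return s               # executes once

print(iterative_sum([5, 1, 4, 2, 3], 5))  # 15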
Example 2 - Addition of two m×n matrices

Statement                          s/e   Frequency   Total steps
Algorithm Add(a, b, c, m, n)       0     --          0
{                                  0     --          0
  for i := 1 to m do               1     m+1         O(m)
    for j := 1 to n do             1     m(n+1)      O(mn)
      c[i,j] := a[i,j] + b[i,j];   1     mn          O(mn)
}                                  0     --          0
Total                                                O(mn)
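
The same algorithm in Python, using nested lists for the m×n matrices (a sketch for illustration); the innermost assignment executes m·n times, giving O(mn):

def add_matrices(a, b, m, n):
    # c[i][j] := a[i][j] + b[i][j]; the innermost statement runs m*n times -> O(mn).
    c = [[0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            c[i][j] = a[i][j] + b[i][j]
    return c

print(add_matrices([[1, 2], [3, 4]], [[5, 6], [7, 8]], 2, 2))  # [[6, 8], [10, 12]]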
Time complexity of Towers of Hanoi

T(n) = T(n-1) + 1 + T(n-1) = 2T(n-1) + 1
     = 2(2T(n-2) + 1) + 1
     = 2^2 T(n-2) + 2^1 + 2^0
     :
     = 2^k T(n-k) + 2^(k-1) + 2^(k-2) + ... + 2^0

Base condition: T(0) = 1.
Setting n - k = 0 gives k = n; substituting k = n:

T(n) = 2^n T(0) + 2^(n-1) + ... + 2^1 + 2^0

This is a geometric series whose sum is 2^(n+1) - 1, so
T(n) = O(2^n), which is exponential.
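
For a concrete check, here is a minimal recursive Python sketch that counts the individual disk moves. The move count comes out to 2^n - 1 (the recurrence above, with T(0) = 1, also charges one unit of work to the empty case, giving 2^(n+1) - 1); both grow as O(2^n), consistent with the analysis above.

def hanoi_moves(n, source, target, auxiliary):
    # Move n disks from source to target; return the number of single-disk moves.
    if n == 0:
        return 0
    moves = hanoi_moves(n - 1, source, auxiliary, target)   # T(n-1)
    moves += 1                                              # move the largest disk
    moves += hanoi_moves(n - 1, auxiliary, target, source)  # T(n-1)
    return moves

for n in (1, 2, 3, 10, 20):
    print(n, hanoi_moves(n, 'A', 'C', 'B'), 2 ** n - 1)  # the two counts agree: 2^n - 1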
SUMMARY

Asymptotic notation provides a simplified way to analyze and compare the efficiency
of algorithms, focusing on their growth rates without being concerned with constant
factors and lower-order terms.
Asymptotic notations are mathematical tools to express the time complexity of
algorithms for asymptotic analysis.
SELF-ASSESSMENT QUESTIONS

1. To measure the time complexity of an algorithm, which notation is used?

2. Which growth rate is described when the number of executions grows extremely quickly as the size of the input increases?
TERMINAL QUESTIONS

1. What does it mean when we say that an algorithm X is asymptotically more efficient than Y?

2. Find an upper bound on the running time of the cubic function f(n) = 2n^3 + 4n + 5.

3. Write the asymptotic notations used for best-case, average-case, and worst-case analysis of algorithms.
REFERENCES FOR FURTHER LEARNING OF THE SESSION

Text Books:
1. Ellis Horowitz, Sartaj Sahni, and Sanguthevar Rajasekaran, "Fundamentals of Computer Algorithms", 2nd Edition, University Press, 2008.
2. Cormen, Leiserson, and Rivest, "Introduction to Algorithms", 3rd Edition, Prentice-Hall, 2002.
3. Jon Kleinberg and Eva Tardos, "Algorithm Design", Pearson Education, 2006.

Reference Books:
1. Robert Sedgewick and Kevin Wayne, "Algorithms", 4th Edition, Addison-Wesley Professional, 2011.
2. Anany Levitin, "Introduction to the Design and Analysis of Algorithms", 2nd Edition, Pearson Education, 2007.
3. Michael T. Goodrich and Roberto Tamassia, "Algorithm Design: Foundations, Analysis, and Internet Examples", 2nd Edition, Wiley-India, 2006.
4. Steven S. Skiena, "The Algorithm Design Manual", 2nd Edition, Springer, 2008.

MOOCs:
1. https://www.coursera.org/specializations/algorithms
2. https://www.coursera.org/learn/dynamic-programming-greedy-algorithms#modules
THANK YOU
