Unit I: Introduction to Algorithm Design

This document introduces algorithm design with a focus on asymptotic analysis and notations. It explains the significance of asymptotic notations (Ο, Ω, and Θ) in measuring the efficiency and time complexity of algorithms. The document provides definitions and examples for each notation to illustrate their application in analyzing algorithm performance.

Uploaded by Sanjay Baskar

UNIT I

INTRODUCTION TO ALGORITHM DESIGN
Asymptotic Notations
⚫ Main idea of asymptotic analysis
◦ To have a measure of the efficiency of algorithms
◦ that does not depend on machine-specific constants,
◦ that does not require algorithms to be implemented, and
◦ that lets the running times of programs be compared.
⚫ Asymptotic notations
◦ Asymptotic notations are mathematical tools to represent the time complexity of algorithms for asymptotic analysis.
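The idea of counting abstract steps instead of wall-clock time can be sketched in code. The function below is a minimal illustration of my own (not from the slides): a linear search instrumented to report how many comparisons it performs, a quantity that depends only on the input size, never on the machine.

```python
def linear_search_steps(items, target):
    """Linear search that also reports its comparison count."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1          # one abstract "unit of computation"
        if value == target:
            return i, comparisons
    return -1, comparisons

# Worst case: the target is absent, so all n elements are compared.
_, steps = linear_search_steps(list(range(10)), -1)
print(steps)  # 10 comparisons for n = 10 -> the count grows linearly in n
```

Timing this function would give different numbers on different machines, but the comparison count is the same everywhere, which is exactly the machine independence the bullets above describe.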
Asymptotic Analysis
⚫ Asymptotic analysis refers to computing
the running time of an operation in
mathematical units of computation. The three
standard notations are:
◦ Ο (Big Oh) Notation
◦ Ω (Omega) Notation
◦ Θ (Theta) Notation
⚫ Big Oh Notation, Ο (Worst Case)
◦ Ο notation gives an asymptotic upper bound:
the worst-case time complexity, i.e. the longest
time an algorithm can possibly take to complete.
O(g(n)) = { f(n): there exist positive constants c and n0 such that
0 <= f(n) <= c*g(n) for all n >= n0 }

Example 1: f(n) = 3n+2, g(n) = n.
We need 3n+2 <= c*n. For instance, with c = 4:
3n+2 <= 4n holds for all n >= 2.
Hence f(n) = O(n).

Example 2: f(n) = 3n+2, g(n) = n^2.
We need 3n+2 <= c*n^2. For instance, with c = 1:
3n+2 <= n^2 holds for all n >= 5.
Hence f(n) = O(n^2).
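The two witnesses above can be checked numerically. This is a finite-range sanity check, not a proof:

```python
def f(n):
    return 3 * n + 2

# Witness for g(n) = n: c = 4, n0 = 2.
assert all(f(n) <= 4 * n for n in range(2, 10_000))

# Witness for g(n) = n^2: c = 1, n0 = 5.
assert all(f(n) <= n * n for n in range(5, 10_000))

print("both Big-O witnesses hold on the tested range")
```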
⚫ Ω Notation (Best Case)
◦ Ω notation provides an asymptotic lower bound.
Ω(g(n)) = { f(n): there exist positive constants c and n0 such that
0 <= c*g(n) <= f(n) for all n >= n0 }

Example 1: f(n) = 3n+2, g(n) = n.
We need f(n) >= c*g(n). For instance, with c = 1:
3n+2 >= n holds for all n >= 1.
Hence f(n) = Ω(n).

Example 2: f(n) = 3n+2, g(n) = n^2.
We would need 3n+2 >= c*n^2 for all large n, but with c = 1
the inequality 3n+2 >= n^2 fails from n = 4 onward (and fails
eventually for any c > 0). Hence f(n) is not Ω(n^2); the lower
bound remains Ω(n).
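The lower bound admits the same kind of finite-range sanity check (again, not a proof). Note that 3n+2 >= n^2 in fact fails from n = 4 onward, so n^2 is not a valid lower bound for f(n):

```python
def f(n):
    return 3 * n + 2

# Witness for Omega(n): c = 1, n0 = 1.
assert all(f(n) >= n for n in range(1, 10_000))

# n^2 is NOT a lower bound: 3n+2 < n^2 for every n >= 4.
assert all(f(n) < n * n for n in range(4, 10_000))

print("f(n) = Omega(n), but f(n) is not Omega(n^2)")
```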
⚫ Θ Notation (Tight Bound)
◦ Θ notation bounds a function from both above and
below, so it captures the exact order of growth.
Θ(g(n)) = { f(n): there exist positive constants c1, c2, and n0 such that
0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }

If f(n) = 3n+2 and g(n) = n, we need constants c1, c2 > 0 and n0 such that
c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.

Upper bound: with c2 = 4, f(n) <= c2*g(n) gives 3n+2 <= 4n, which holds for n >= 2.
Lower bound: with c1 = 1, f(n) >= c1*g(n) gives 3n+2 >= n, which holds for n >= 1.

Taking n0 = 2, both inequalities hold for all n >= n0.
Hence f(n) = Θ(n).
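The Θ witnesses combine the two previous checks, sandwiching f(n) between c1*n and c2*n (a finite-range sanity check, not a proof):

```python
def f(n):
    return 3 * n + 2

c1, c2, n0 = 1, 4, 2
assert all(c1 * n <= f(n) <= c2 * n for n in range(n0, 10_000))
print("f(n) = Theta(n) with c1=1, c2=4, n0=2")
```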
Order of Growth Functions

Common time complexities, in increasing order of growth:
1 < log(log n) < log n < n < n log n < n^2 < n^3 < 2^n < n!
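To see why the order of growth matters, the sketch below (with illustrative input sizes of my choosing) tabulates a few common growth functions side by side:

```python
import math

# Common growth functions evaluated at increasing input sizes.
growth = [
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: float(n)),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: float(n ** 2)),
    ("2^n",     lambda n: float(2 ** n)),
]

for n in (8, 16, 32):
    cells = "  ".join(f"{name}={fn(n):.0f}" for name, fn in growth)
    print(f"n={n:>2}: {cells}")
```

Even at n = 32, the exponential 2^n dwarfs every polynomial in the table, which is why an algorithm's asymptotic class matters far more than constant factors.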
