2.1 Lecture 2

The document discusses asymptotic notation and its importance in analyzing the time complexity of algorithms, including best case, average case, and worst case scenarios. It explains different notations such as Big O, Big Omega, and Big Theta, which represent upper, lower, and tight bounds of algorithm performance. Examples of linear search and common asymptotic notations are also provided to illustrate these concepts.


Lect02

Asymptotic notation
Analysis of Time complexity
• Best case: The best-case complexity of an algorithm is the function
defined by the minimum number of steps taken on any instance of size n
for a particular problem. It corresponds to the particular input for
which the program requires the least time.
• Average case: The average-case complexity of an algorithm is the
function defined by the average number of steps taken over all
instances of size n.
• Worst case: The worst-case complexity of an algorithm is the
function defined by the maximum number of steps taken on any instance
of size n. It corresponds to the input for which the program requires
the most time.
Example of Linear Search
• A[10] = {1, 2, 4, 7, 9, 11, 13, 16, 18, 19}
• X = 1: best case (found at the first position)
• X = 19 or X = 21: worst case (found at the last position, or not present at all)
• X = 11: average case (found near the middle)
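The cases above can be made concrete with a small sketch of linear search that also counts comparisons; the function name and comparison counter are illustrative additions, not part of the lecture.

```python
def linear_search(arr, x):
    """Return (index, comparisons); index is -1 if x is absent."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1  # one comparison per element inspected
        if value == x:
            return i, comparisons
    return -1, comparisons

A = [1, 2, 4, 7, 9, 11, 13, 16, 18, 19]

print(linear_search(A, 1))   # best case: 1 comparison
print(linear_search(A, 19))  # worst case (present): 10 comparisons
print(linear_search(A, 21))  # worst case (absent): 10 comparisons
print(linear_search(A, 11))  # average-like case: 6 comparisons
```

Searching for 1 stops immediately, while 19 and 21 both force a scan of the whole array, which is why they share the worst case.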
Asymptotic notation
• Asymptotic notations are expressions used to represent the
complexity of algorithms. The complexity of an algorithm is analyzed
from two perspectives: time complexity, the amount of time the
algorithm takes to complete its process, and space complexity, the
amount of memory it requires.
Why is Asymptotic Notation Important?

1. It gives a simple characterization of an algorithm's
efficiency.
2. It allows the performance of various algorithms to be
compared.
Asymptotic notation
• Big O
• Big Omega
• Big Theta
O- Big Oh: Asymptotic Notation (Upper
Bound)
• “O - Big Oh” is the most commonly used notation. Big Oh
describes the worst-case scenario. It represents an upper
bound on the algorithm's growth rate.
• A function f(n) = O(g(n)) if and only if there exist positive
constants C and n0 such that:
• 0 <= f(n) <= C*g(n) for all n >= n0
• Therefore, g(n) is an upper bound for f(n): beyond n0, f(n)
grows no faster than C*g(n).
• The value of f(n) always lies at or below C*g(n) for n >= n0.
Example
• f(n) = 3n+2
• 3n+2 = O(n), since 3n+2 ≤ 4n for all n ≥ 2
• Here C = 4, n0 = 2, f(n) = 3n+2, g(n) = n
O- Big Oh

For Example:
1. 3n+2 = O(n) as 3n+2 ≤ 4n for all n ≥ 2
2. 3n+3 = O(n) as 3n+3 ≤ 4n for all n ≥ 3
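A small numeric spot-check (not a proof) of the Big-O claims above: it verifies f(n) ≤ C·g(n) over a finite range of n ≥ n0. The helper `holds_big_o` is an illustrative name, not from the lecture.

```python
def holds_big_o(f, g, C, n0, limit=10_000):
    """Check f(n) <= C*g(n) for n0 <= n <= limit."""
    return all(f(n) <= C * g(n) for n in range(n0, limit + 1))

# 3n+2 = O(n) with C = 4, n0 = 2
print(holds_big_o(lambda n: 3 * n + 2, lambda n: n, C=4, n0=2))  # True

# 3n+3 = O(n) with C = 4, n0 = 3
print(holds_big_o(lambda n: 3 * n + 3, lambda n: n, C=4, n0=3))  # True
```

Note that the constants matter: with C = 3 the inequality 3n+2 ≤ 3n fails for every n, so that choice of C would not witness the bound.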
Ω-Big omega: Asymptotic Notation
(Lower Bound)
• The Big Omega (Ω) notation describes the best-case
scenario. It represents a lower bound on the algorithm's growth rate.
• A function f(n) = Ω(g(n)) if and only if there exist positive
constants C and n0 such that:

• 0 <= C*g(n) <= f(n) for all n >= n0
• The value of f(n) always lies at or above C*g(n)
for n >= n0.
Ω-Big omega

1. 3n+2 = Ω(n) as 3n+2 ≥ 3n for all n ≥ 1, with C = 3, n0 = 1
2. 4n+2 = Ω(n) as 4n+2 ≥ 4n for all n ≥ 1, with C = 4, n0 = 1
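The Ω examples admit the same kind of finite numeric spot-check, with the inequality reversed: C·g(n) ≤ f(n) for n ≥ n0. The helper `holds_big_omega` is an illustrative name, not from the lecture.

```python
def holds_big_omega(f, g, C, n0, limit=10_000):
    """Check C*g(n) <= f(n) for n0 <= n <= limit."""
    return all(C * g(n) <= f(n) for n in range(n0, limit + 1))

# 3n+2 = Ω(n) with C = 3, n0 = 1
print(holds_big_omega(lambda n: 3 * n + 2, lambda n: n, C=3, n0=1))  # True

# 4n+2 = Ω(n) with C = 4, n0 = 1
print(holds_big_omega(lambda n: 4 * n + 2, lambda n: n, C=4, n0=1))  # True
```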
θ-Big theta: Asymptotic Notation (Tight
Bound)
• The Big Theta (θ) notation describes both an upper bound
and a lower bound on the algorithm's growth rate, so it
defines the precise asymptotic behavior. It represents the tight
bound of the algorithm.
• A function f(n) = θ(g(n)) if and only if there exist positive
constants C1, C2, and n0 such that:

• 0 <= C1*g(n) <= f(n) <= C2*g(n) for all n >= n0


θ-Big theta

3n+2 = θ(n), since 3n ≤ 3n+2 and 3n+2 ≤ 4n for all n ≥ 2, with C1 = 3, C2 = 4, and n0 = 2
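For θ, both bounds must hold simultaneously for the same n0, which a combined spot-check (again, not a proof) makes explicit. The helper `holds_big_theta` is an illustrative name, not from the lecture.

```python
def holds_big_theta(f, g, C1, C2, n0, limit=10_000):
    """Check C1*g(n) <= f(n) <= C2*g(n) for n0 <= n <= limit."""
    return all(C1 * g(n) <= f(n) <= C2 * g(n) for n in range(n0, limit + 1))

# 3n+2 = θ(n) with C1 = 3, C2 = 4, n0 = 2
print(holds_big_theta(lambda n: 3 * n + 2, lambda n: n, C1=3, C2=4, n0=2))  # True
```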
Common Asymptotic Notations
Following is a list of some common asymptotic notations:
constant − O(1)
logarithmic − O(log n)
linear − O(n)
n log n − O(n log n)
quadratic − O(n^2)
cubic − O(n^3)
polynomial − n^O(1)
exponential − 2^O(n)
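To make the difference between these growth rates concrete, the sketch below tabulates each class at a few input sizes (values rounded to whole numbers); the choice of sample sizes is arbitrary.

```python
import math

# (class name, growth function) pairs matching the list above
classes = [
    ("constant",    lambda n: 1),
    ("logarithmic", lambda n: math.log2(n)),
    ("linear",      lambda n: n),
    ("n log n",     lambda n: n * math.log2(n)),
    ("quadratic",   lambda n: n ** 2),
    ("cubic",       lambda n: n ** 3),
    ("exponential", lambda n: 2 ** n),
]

for name, f in classes:
    row = [round(f(n)) for n in (4, 16, 64)]
    print(f"{name:<12} {row}")
```

Even at n = 64 the exponential row dwarfs every other class, which is why exponential-time algorithms are usually considered impractical.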
