Data Structures & Algorithms - Topic 7 - Asymptotic Notations & Order of Growth
ALGORITHMS
Asymptotic Notations
&
Order of Growth
Time Complexity of Algorithms
■ When we talk about the time complexity of algorithms,
what do we actually mean?
Big-Oh (O) Notation
■ The Big O notation defines an upper bound of an
algorithm.
■ Solution:
f(n) ≤ c·g(n)
3n + 2 ≤ c·n
Now, c can be any number greater
than or equal to 4.
3n + 2 ≤ 4·n
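A quick numeric check (a Python sketch, using the slide's example f(n) = 3n + 2 with g(n) = n) confirms that the constant c = 4 works once n is large enough:

```python
# Verify the Big-O claim: f(n) <= c * g(n) for all n >= n0,
# with f(n) = 3n + 2 (the slide's example) and g(n) = n.

def f(n):
    return 3 * n + 2

def g(n):
    return n

c, n0 = 4, 2  # candidate constants for the Big-O definition

# The inequality must hold for every n from n0 onward.
holds = all(f(n) <= c * g(n) for n in range(n0, 10_000))
print(holds)  # True: 3n + 2 is O(n)
```

Note that the choice of c and n0 is not unique; any larger c (with a suitable n0) also witnesses the same bound.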
Big-Omega (Ω) Notation
■ The Big Ω notation defines a lower bound of an
algorithm.
■ Solution:
f(n) ≥ c·g(n)
3n + 2 ≥ c·n
Now, c can be any number smaller
than or equal to 3.
3n + 2 ≥ 3·n
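As with Big-O, the lower-bound constant can be checked numerically (a Python sketch, again assuming the slide's example f(n) = 3n + 2):

```python
# Verify the Big-Omega claim: f(n) >= c * g(n) for all n >= n0,
# with f(n) = 3n + 2 and g(n) = n.

def f(n):
    return 3 * n + 2

c, n0 = 3, 1  # candidate constants for the Big-Omega definition

holds = all(f(n) >= c * n for n in range(n0, 10_000))
print(holds)  # True: 3n + 2 is Omega(n)
```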
Big-Theta (Θ) Notation
■ The Theta notation defines both an upper and a lower
bound of an algorithm.
■ Solution:
c₁·g(n) ≤ f(n) ≤ c₂·g(n)
From the previous two examples, it is
obvious that:
3n ≤ 3n + 2 ≤ 4n for all n ≥ n₀,
where n₀ = 2.
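The two constants from the previous slides can be combined into a single Theta check (a Python sketch of the two-sided inequality above):

```python
# Verify the Theta bounds: c1 * n <= 3n + 2 <= c2 * n for all n >= n0.

def f(n):
    return 3 * n + 2

c1, c2, n0 = 3, 4, 2  # constants taken from the Omega and Big-O examples

holds = all(c1 * n <= f(n) <= c2 * n for n in range(n0, 10_000))
print(holds)  # True: 3n + 2 is Theta(n)
```

At n = 2 the upper bound is tight (3·2 + 2 = 8 = 4·2), which is why n₀ = 2 is the smallest starting point that works for c₂ = 4.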
Small-Oh (o) Notation
■ The small o notation also defines an upper bound of
an algorithm, but one that is not asymptotically tight:
f(n) < c·g(n) must hold for every constant c > 0.
Growth of functions
Usually growth of functions
can be categorized into the
following:
– Constant time
– Logarithmic time
– Linear time
– Linearithmic (n log n) time
– Polynomial time
– Exponential time
– Factorial time
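To make the categories above concrete, a short sketch (Python, with one representative function per category; the sample sizes n = 10 and n = 20 are chosen only for illustration) tabulates how quickly each one grows:

```python
import math

# One representative function for each growth category on the slide.
growth = {
    "constant":     lambda n: 1,
    "logarithmic":  lambda n: math.log2(n),
    "linear":       lambda n: n,
    "linearithmic": lambda n: n * math.log2(n),
    "polynomial":   lambda n: n ** 2,          # n^2 as a sample polynomial
    "exponential":  lambda n: 2 ** n,
    "factorial":    lambda n: math.factorial(n),
}

for name, fn in growth.items():
    # Doubling n barely affects the slow categories but explodes the fast ones.
    print(f"{name:>12}: f(10) = {fn(10):,.0f}, f(20) = {fn(20):,.0f}")
```

Even at n = 20 the factorial row dwarfs everything else, which is why the ordering of these categories, not the constant factors, dominates asymptotic comparisons.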
Remarks
■ Most of the time, we are concerned with the
worst case complexity of algorithms.
…
– Best Case Analysis: 1 = Ω(1)
– Worst Case Analysis: n = O(n)
– Average Case Analysis: ≈ n/2 + 1 = O(n)
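The counts above appear to describe linear search (an assumption; the excerpt does not name the algorithm). A minimal sketch with a comparison counter illustrates the three cases:

```python
def linear_search(arr, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 1, 5]

print(linear_search(data, 7))   # (0, 1): best case, 1 comparison  -> Omega(1)
print(linear_search(data, 5))   # (4, 5): worst case, n comparisons -> O(n)
# Averaged over all positions, the target is found after about
# n/2 + 1 comparisons, which is still O(n).
```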
End of Lecture
THANK YOU