
Analysis of Algorithms & Asymptotic Notations
Analysis of Algorithms
• An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
– To compare algorithms, mainly in terms of running time but also in terms of other factors (e.g., memory requirements, programmer effort, etc.)
• What do we mean by running time analysis?
– Determine how running time increases as the size of the problem increases.
Types of Analysis
• Worst case
– Provides an upper bound on running time
– An absolute guarantee that the algorithm will not run longer, no matter what the input is
• Best case
– Provides a lower bound on running time
– Input is the one for which the algorithm runs the fastest

Lower Bound ≤ Running Time ≤ Upper Bound

• Average case
– Provides a prediction about the running time
– Assumes that the input is random
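As a concrete illustration (added here, not part of the original slides), linear search exhibits all three cases: the best case finds the target at the first position, the worst case scans the entire array, and the average case on random input scans about half of it. A minimal Python sketch:

def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent."""
    for i, value in enumerate(arr):  # up to len(arr) comparisons
        if value == target:
            return i                 # best case: target at index 0 -> O(1)
    return -1                        # worst case: target absent -> O(n)

# Best case: 1 comparison.  Worst case: n comparisons.
# Average case (target present, random position): about n/2 comparisons, still O(n).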
How do we compare algorithms?
• We need to define a number of objective measures.

(1) Compare execution times?
Not good: times are specific to a particular computer!

(2) Count the number of statements executed?
Not good: the number of statements varies with the programming language as well as with the style of the individual programmer.
Ideal Solution
• Express running time as a function of the input size n (i.e., f(n)).
• Compare different functions corresponding to running times.
• Such an analysis is independent of machine time, programming style, etc.
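For example (an illustration added here, not from the slides), we can count the basic operations a simple function performs and express that count as f(n):

def sum_of_squares(n):
    total = 0           # 1 assignment
    for i in range(n):  # loop body runs n times
        total += i * i  # 1 multiplication + 1 addition per iteration
    return total        # 1 return

# Roughly f(n) = 2n + 2 basic operations: a linear function of the input size n,
# regardless of which machine or language runs it.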
Asymptotic Analysis
• To compare two algorithms with running times f(n) and g(n), we need a rough measure that characterizes how fast each function grows.
• Hint: use rate of growth.
• Compare functions in the limit, that is, asymptotically! (i.e., for large values of n)
Rate of Growth
• The low-order terms in a function are relatively insignificant for large n:
n⁴ + 100n² + 10n + 50 ~ n⁴
i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth.
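To make this precise (a worked step added here), divide by the dominant term and let n grow:

lim (n → ∞) (n⁴ + 100n² + 10n + 50) / n⁴ = lim (n → ∞) (1 + 100/n² + 10/n³ + 50/n⁴) = 1

Since the ratio tends to a positive constant, the two functions have the same rate of growth.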
Asymptotic Notations
• Asymptotic notations are mathematical tools to represent the time complexity of algorithms for asymptotic analysis.
• The following three asymptotic notations are most often used to represent the time complexity of algorithms:

1. O notation [Big Oh]

2. Ω notation [Big Omega]

3. Θ notation [Theta]
O-notation
• f(n) and g(n) are non-negative functions.
[Figure: graph of c·g(n) bounding f(n) from above for all n ≥ n₀]
O-notation
• Big-O notation provides an asymptotic upper bound.
• f(n) = O(g(n))
• We say that "f(n) is big-O of g(n)."
• Some constant multiple of g(n) is an asymptotic upper bound on f(n); the notation makes no claim about how tight that bound is.
• As n increases, f(n) grows no faster than g(n).
• In other words, g(n) is an asymptotic upper bound on f(n).
O-notation
More formally, it can be defined as:
For non-negative functions f(n) and g(n), f(n) = O(g(n)) if there are positive constants c and n₀ such that f(n) ≤ c · g(n) for all n ≥ n₀.
• This notation is known as Big-Oh notation.
• If graphed, g(n) serves as an upper bound to the curve you are analysing, f(n).
• It describes the worst that can happen for a given data size.
• f(n) is the time of execution of an algorithm for different problem sizes n.
• g(n) is the mathematical representation of the algorithm as a function of problem size n.
O-notation
• While relating f(n) with g(n), the relation '≤' is used.
• This relation signifies an upper bound.
• The time of computation f(n) can be equal to or less than c · g(n); it cannot go beyond that value.
• The threshold problem size n₀ is the minimum problem size beyond which we can predict the behaviour of the algorithm with respect to performance.
Some Common Time Complexity Examples:
• O(1) -- Constant computing time
• O(n) -- Linear
• O(n²) -- Quadratic
• O(n³) -- Cubic
• O(2ⁿ) -- Exponential
• If an algorithm takes time O(log n), it is faster, for sufficiently large n, than if it had taken O(n).
• Similarly, O(n log n) is better than O(n²) but not as good as O(n).
• These seven computing times -- O(1), O(log n), O(n), O(n log n), O(n²), O(n³), and O(2ⁿ) -- are the ones we see most often in this book.
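To see how quickly these functions separate, here is a small Python sketch (added for illustration) that tabulates each growth rate at a few input sizes:

import math

# Compare the common growth rates at a few input sizes.
for n in (10, 100, 1000):
    row = (f"n={n:5d}  log2(n)={math.log2(n):6.1f}  "
           f"n*log2(n)={n * math.log2(n):10.0f}  n^2={n**2:10d}  "
           f"n^3={n**3:12d}  2^n={2**n if n <= 20 else 'astronomical'}")
    print(row)

# Already at n = 100, n^2 dwarfs n*log2(n), n^3 dwarfs n^2,
# and 2^n exceeds any practical computing budget.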
Assumption while Finding the Time Complexity
• The leading constant of the highest power of n, and all lower powers of n, are ignored in g(n).
Example: Consider the following expression for the time complexity of an algorithm:
f(n) = O(100n³ + 29n² + 19n) …(1)
Here the leading constant of the highest power of n is 100, and it can be ignored. Based on the assumption, Equation (1) can be rewritten as:
f(n) = O(n³ + 29n² + 19n) …(2)
All the lower powers of n (i.e., 29n² + 19n) can also be ignored completely, so the equation is finally rewritten as:
f(n) = O(n³)
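This shortcut is justified by the formal definition (a worked step added here): for n ≥ 1, every lower-order term is at most its coefficient times n³, so

100n³ + 29n² + 19n ≤ 100n³ + 29n³ + 19n³ = 148n³ for all n ≥ 1.

Taking c = 148 and n₀ = 1 satisfies f(n) ≤ c · g(n) with g(n) = n³, hence f(n) = O(n³).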
O-notation
Que 1: Consider the following f(n) and g(n):
f(n) = 3n + 2 and g(n) = n
Is f(n) = O(n)?
Answer:
To represent f(n) as O(g(n)), we must find constants c > 0 and n₀ ≥ 1 such that f(n) ≤ c · g(n) for all n ≥ n₀.
f(n) ≤ c · g(n)
⇒ 3n + 2 ≤ c · n
For c = 4:
3n + 2 ≤ 4n for all n ≥ 2.
The condition holds with c = 4 and n₀ = 2, so by using Big-Oh notation we can represent the time complexity as follows: 3n + 2 = O(n)
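A quick numeric sanity check of this bound (an illustration added here, not part of the original example) can be written in a few lines of Python:

def is_upper_bound(f, g, c, n0, n_max=10_000):
    """Spot-check f(n) <= c * g(n) for n in [n0, n_max].
    A finite check, not a proof -- the real claim is for every n >= n0."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

print(is_upper_bound(lambda n: 3 * n + 2, lambda n: n, c=4, n0=2))  # -> True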
Big-Ω Notation (Big-Omega Notation)
• Ω notation provides an asymptotic lower bound.
[Figure: graph of c·g(n) bounding f(n) from below for all n ≥ n₀]
Big-Ω Notation (Big-Omega Notation)
• f(n) = Ω(g(n))
• We say that "f(n) is omega of g(n)."
• Formally, f(n) = Ω(g(n)) if there are positive constants c and n₀ such that f(n) ≥ c · g(n) for all n ≥ n₀.
• As n increases, f(n) grows no slower than g(n).
• In other words, g(n) is an asymptotic lower bound on f(n).
Big-Ω Notation (Big-Omega Notation)
Que: Consider the following f(n) and g(n):
f(n) = 3n + 2
g(n) = n
Is f(n) = Ω(n)?
We need f(n) ≥ c · g(n):
3n + 2 ≥ c · n
Try c = 1:
3n + 2 ≥ n
At n = 1: 5 ≥ 1 ✓ (and the inequality clearly continues to hold as n grows)
Big-Ω Notation (Big-Omega Notation)
Que: Consider the following f(n) and g(n):
f(n) = 3n + 2
g(n) = n
Is f(n) = Ω(n)?
Answer: To represent f(n) as Ω(g(n)), we must find constants c > 0 and n₀ ≥ 1 such that f(n) ≥ c · g(n) for all n ≥ n₀.
f(n) ≥ c · g(n)
⇒ 3n + 2 ≥ c · n
The condition holds with c = 1 and all n ≥ 1.
By using Big-Omega notation we can represent the time complexity as follows:
3n + 2 = Ω(n)
Θ Notation
[Figure: graph of f(n) sandwiched between c₁·g(n) and c₂·g(n) for all n ≥ n₀]
Θ Notation
• The theta notation bounds a function from above and below, so it defines exact asymptotic behavior.
• f(n) = Θ(g(n))
• We say that "f(n) is theta of g(n)."
• Formally, f(n) = Θ(g(n)) if there are positive constants c₁, c₂, and n₀ such that c₁ · g(n) ≤ f(n) ≤ c₂ · g(n) for all n ≥ n₀.
• As n increases, f(n) grows at the same rate as g(n).
• In other words, g(n) is an asymptotically tight bound on f(n).
• The average-case running time of an algorithm is determined when the algorithm runs under average conditions for execution.
• Θ notation is used to denote both upper and lower bounds.
Θ Notation
Que: Consider the following f(n) and g(n):
f(n) = 3n + 2
g(n) = n
• To represent f(n) as Θ(g(n)), we must find constants c₁ > 0, c₂ > 0, and n₀ ≥ 1 such that c₁ · g(n) ≤ f(n) ≤ c₂ · g(n) for all n ≥ n₀.
c₁ · g(n) ≤ f(n) ≤ c₂ · g(n)
⇒ c₁ · n ≤ 3n + 2 ≤ c₂ · n
The condition holds with c₁ = 1, c₂ = 4, and all n ≥ 2.
By using Big-Theta notation we can represent the time complexity as follows:
3n + 2 = Θ(n)
Θ Notation
Que: Consider the following f(n) and g(n):
f(n) = 3n + 2
g(n) = n
Working out the two sides of c₁ · g(n) ≤ 3n + 2 ≤ c₂ · g(n):
Lower bound: c₁ · n ≤ 3n + 2. Try c₁ = 1: n ≤ 3n + 2 holds for all n ≥ 1.
Upper bound: 3n + 2 ≤ c₂ · n. Try c₂ = 4: 3n + 2 ≤ 4n holds for all n ≥ 2.
Asymptotic Notation
• O notation: asymptotic "less than":
– f(n) = O(g(n)) implies: f(n) "≤" g(n)
• Ω notation: asymptotic "greater than":
– f(n) = Ω(g(n)) implies: f(n) "≥" g(n)
• Θ notation: asymptotic "equality":
– f(n) = Θ(g(n)) implies: f(n) "=" g(n)
Properties
• Theorem:
f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) and f(n) = Ω(g(n))
Properties
• Transitivity:
– f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
– // Same for O and Ω
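A one-line justification for the O case (a worked step added here): if f(n) ≤ c₁ · g(n) for all n ≥ n₁ and g(n) ≤ c₂ · h(n) for all n ≥ n₂, then

f(n) ≤ c₁ · g(n) ≤ c₁c₂ · h(n) for all n ≥ max(n₁, n₂),

so c = c₁c₂ and n₀ = max(n₁, n₂) witness f(n) = O(h(n)).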
Properties
• Reflexivity:
– f(n) = Θ(f(n))
– // Same for O and Ω
Properties
• Symmetry:
– f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry:
– f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
