DS - Mod1 - Asymptotic - Notations
Asymptotic Notations
Analysis of Algorithms
• An algorithm is a finite set of precise instructions for
performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
– To compare algorithms mainly in terms of running time but
also in terms of other factors (e.g., memory requirements,
programmer's effort etc.)
• What do we mean by running time analysis?
– Determine how running time increases as the size of the
problem increases.
Types of Analysis
• Worst case
– Provides an upper bound on running time
– An absolute guarantee that the algorithm would not run longer, no
matter what the inputs are
• Best case
– Provides a lower bound on running time
– Input is the one for which the algorithm runs the fastest
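The two cases can be seen concretely in linear search: the target sitting at index 0 is the best case (one comparison), while a missing target is the worst case (n comparisons). A minimal Python sketch, with a comparison counter added purely for illustration:

```python
def linear_search(arr, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons  # best case: target first -> 1 comparison
    return -1, comparisons         # worst case: target absent -> n comparisons
```

For an input of size n, the counter ranges from 1 (best case, a lower bound) to n (worst case, an upper bound), matching the two analyses above.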
Asymptotic Analysis
• To compare two algorithms with running times
f(n) and g(n), we need a rough measure that
characterizes how fast each function grows.
• Hint: use rate of growth
• Compare functions in the limit, that is,
asymptotically!
(i.e., for large values of n)
Rate of Growth
• The low order terms in a function are relatively
insignificant for large n
n⁴ + 100n² + 10n + 50 ≈ n⁴
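The claim above can be checked numerically: as n grows, the ratio f(n)/n⁴ approaches 1, showing that the lower-order terms fade away. A small Python sketch (the function name is illustrative):

```python
def f(n):
    # f(n) = n^4 + 100n^2 + 10n + 50
    return n**4 + 100 * n**2 + 10 * n + 50

# The ratio f(n) / n^4 tends to 1 as n grows,
# so the low-order terms are insignificant for large n.
for n in (10, 100, 1000):
    print(n, f(n) / n**4)
```

Already at n = 1000 the ratio differs from 1 by about one part in ten thousand.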
Asymptotic Notations
• Asymptotic notations are mathematical tools to represent time
complexity of algorithms for asymptotic analysis.
• The following 3 asymptotic notations are mostly used to
represent time complexity of algorithms.
1. O notation [ Big Oh ]
2. Ω notation [ Big Omega ]
3. Θ notation [ Theta ]
O-notation
• f(n) and g(n) are non-negative functions
O-notation
• Big-O notation provides an asymptotic upper bound.
• f(n) = O(g(n))
• We say that “f(n) is big-O of g(n).”
• Some constant multiple of g(n) is an asymptotic upper bound on f(n); the notation makes no claim about how tight the bound is.
O-notation
More formally, it can be defined as:
For non-negative functions f(n) and g(n),
f(n) = O(g(n)) if there exist positive constants c and n0
such that f(n) ≤ c · g(n) for all n ≥ n0.
• This notation is known as Big-Oh notation.
• If graphed, g(n) serves as an upper bound to the curve you are analysing, f(n).
• It describes the worst that can happen for a given data size.
• f(n) is the time of execution of an algorithm for different problem sizes n.
• g(n) is the mathematical representation of the algorithm as a function of problem size n.
O-notation
• While relating f(n) to g(n), the relation ‘≤’ is used.
• This relation signifies an upper bound.
• The time of computation f(n) can be equal to or less than c · g(n); it cannot exceed c · g(n).
• The threshold problem size (n0) is the minimum problem size beyond which we can predict the algorithm's performance behaviour.
Some Common Time Complexity Examples :
• O(1) – constant (e.g., accessing an array element)
• O(log n) – logarithmic (e.g., binary search)
• O(n) – linear (e.g., linear search)
• O(n log n) – linearithmic (e.g., merge sort)
• O(n²) – quadratic (e.g., comparing all pairs of elements)
• O(2ⁿ) – exponential (e.g., brute-force subset enumeration)
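A few of these growth classes can be illustrated with small Python functions (the function names are illustrative):

```python
def get_first(arr):
    # O(1): constant work, independent of len(arr)
    return arr[0]

def total(arr):
    # O(n): a single pass over the input
    s = 0
    for x in arr:
        s += x
    return s

def has_duplicate(arr):
    # O(n^2): compares every pair of elements
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            if arr[i] == arr[j]:
                return True
    return False
```

Counting the basic operations each function performs as a function of n = len(arr) gives the stated complexity class.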
Assumptions while Finding the Time Complexity
• Each basic operation (assignment, comparison, arithmetic) is assumed to take a constant amount of time.
• Constant factors and lower-order terms are ignored, since only the rate of growth matters for large n.
O-notation
Que 1 : Consider the following f(n) and g(n).
f(n) = 3n + 2 and g(n) = n
Is f(n) = O(n) ?
Answer :
To represent f(n) as O(g(n)), there must exist constants c > 0 and n0 ≥ 1 such that f(n) ≤ c · g(n) for all n ≥ n0.
f(n) ≤ c · g(n)
⇒ 3n + 2 ≤ c · n
For c = 4, the inequality 3n + 2 ≤ 4n holds for all n ≥ 2.
Using Big-Oh notation, we can represent the time complexity as follows:
3n + 2 = O(n)
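The witness constants c = 4 and n0 = 2 can be spot-checked over a finite range of n. This is only a numerical check, not a proof (the algebra above is the proof), and the helper name is illustrative:

```python
def is_big_o_witness(f, g, c, n0, upto=10_000):
    """Spot-check f(n) <= c * g(n) for every n in [n0, upto]."""
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))

f = lambda n: 3 * n + 2
g = lambda n: n
# c = 4, n0 = 2 works; n0 = 1 fails because 3*1 + 2 = 5 > 4*1.
```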
Big-Ω notation(Big-Omega notation)
• Ω notation provides an asymptotic lower bound.
Big-Ω notation(Big-Omega notation)
• f(n) = Ω(g(n))
• We say that “f(n) is omega of g(n).”
• As n increases, f(n) grows no slower than g(n).
• In other words, g(n) is an asymptotic lower bound on
f(n).
Big-Ω notation(Big-Omega notation)
Que : Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
Is f(n) = Ω(n) ?
f(n) ≥ c · g(n)
3n + 2 ≥ c · n
Take c = 1:
3n + 2 ≥ n, which holds for all n ≥ 1 (e.g., at n = 1: 5 ≥ 1).
Big-Ω notation(Big-Omega notation)
Que : Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
Is f(n) = Ω(n) ?
Answer : To represent f(n) as Ω(g(n)), there must exist constants c > 0 and n0 ≥ 1 such that f(n) ≥ c · g(n) for all n ≥ n0.
f(n) ≥ c · g(n)
⇒ 3n + 2 ≥ c · n
The condition holds for c = 1 and all n ≥ 1.
Using Big-Omega notation, we can represent the time complexity as follows:
3n + 2 = Ω(n)
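As with Big-O, the Ω witness (c = 1, n0 = 1) can be spot-checked numerically over a finite range; the helper name is illustrative and this is not a substitute for the algebraic argument:

```python
def is_omega_witness(f, g, c, n0, upto=10_000):
    """Spot-check f(n) >= c * g(n) for every n in [n0, upto]."""
    return all(f(n) >= c * g(n) for n in range(n0, upto + 1))

f = lambda n: 3 * n + 2
g = lambda n: n
# c = 1, n0 = 1 works: 3n + 2 >= n for every n >= 1.
```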
Θ Notation
• The theta notation bounds functions from above and below, so
it defines exact asymptotic behavior.
• f(n) = Θ(g(n))
• We say that “f(n) is theta of g(n).”
• As n increases, f(n) grows at the same rate as g(n).
• In other words, g(n) is an asymptotically tight bound on f(n).
• The average-case running time of an algorithm is determined for an average (typical) input.
• Θ-notation is used to denote both upper and lower bounds.
Θ Notation
Que : consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
• To represent f(n) as Θ(g(n)), there must exist constants c1 > 0, c2 > 0 and n0 ≥ 1 such that c1 · g(n) ≤ f(n) ≤ c2 · g(n) for all n ≥ n0.
c1 · g(n) ≤ f(n) ≤ c2 · g(n)
⇒ c1 · n ≤ 3n + 2 ≤ c2 · n
The condition holds for c1 = 1, c2 = 4 and all n ≥ 2.
Using Big-Theta notation, we can represent the time complexity as follows:
3n + 2 = Θ(n)
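The two-sided Θ witness (c1 = 1, c2 = 4, n0 = 2) can be spot-checked the same way; again, a numerical check over a finite range, with an illustrative helper name:

```python
def is_theta_witness(f, g, c1, c2, n0, upto=10_000):
    """Spot-check c1*g(n) <= f(n) <= c2*g(n) for every n in [n0, upto]."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, upto + 1))

f = lambda n: 3 * n + 2
g = lambda n: n
# c1 = 1, c2 = 4, n0 = 2 works for 3n + 2 = Theta(n);
# n0 = 1 fails on the upper side: 3*1 + 2 = 5 > 4*1.
```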
Θ Notation
Que : consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
• c1 · g(n) ≤ 3n + 2 ≤ c2 · g(n)
Lower bound: c1 · g(n) ≤ 3n + 2
With c1 = 1: n ≤ 3n + 2, which holds for all n ≥ 1.
Properties
• Theorem:
f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) and f(n) = Ω(g(n))
Properties
• Transitivity:
– f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
– // Same for O and Ω
Properties
• Reflexivity:
– f(n) = Θ(f(n))
– // Same for O and Ω
Properties
• Symmetry:
– f(n) = Θ(g(n)) ⇔ g(n) = Θ(f(n))
• Transpose symmetry:
– f(n) = O(g(n)) ⇔ g(n) = Ω(f(n))