Asymptotic Analysis-1
Asymptotic Notation
The standard asymptotic notations commonly used in the analysis of algorithms are O (Big-Oh), Ω (Big-Omega), and θ (Theta).
Sometimes the additional notations o (little-oh) and ω (little-omega) are also used to describe the growth rates of algorithms.
The behavior of f(n) and g(n) is portrayed in the diagram. For n < n0, f(n) may lie above or below g(n), but for all n ≥ n0, f(n) falls consistently below g(n).
Since the worst-case running time of an algorithm is the maximum running time over all inputs, g(n) provides an upper bound on the worst-case running time.
The notation O(g(n)) does not imply that g(n) is the worst-case running time; it simply means that the worst-case running time never exceeds the upper limit determined by g(n).
Example: Let f(n) = 3n² + 10n and g(n) = n². For n ≥ 10, 10n ≤ n², so
f(n) = 3n² + 10n ≤ 3n² + n² = 4n² (simplifying)
Therefore, by definition, 3n² + 10n = O(n²), with c = 4 and n0 = 10.
The choice of the constant c is not unique. However, for each choice of c there is a corresponding value of n0 that satisfies the basic relation: here c = 13 works from n0 = 1, while c = 4 works only from n0 = 10. This behavior is illustrated by the figure.
[Figure: Growth of the functions 13n² and 4n² versus the function f(n) = 3n² + 10n. The bound cg(n) = 13n² lies above f(n) from n0 = 1; the bound cg(n) = 4n² lies above f(n) from n0 = 10.]
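Both choices of constant can be checked numerically. Below is a minimal sketch (the helper name is illustrative, not from the source) that tests the defining inequality f(n) ≤ c·g(n) over a finite range of n:

```python
def is_upper_bounded(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for all integers n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n**2 + 10 * n   # f(n) = 3n^2 + 10n
g = lambda n: n**2                # g(n) = n^2

print(is_upper_bounded(f, g, c=13, n0=1))   # True:  c = 13 works from n0 = 1
print(is_upper_bounded(f, g, c=4,  n0=10))  # True:  c = 4 works from n0 = 10
print(is_upper_bounded(f, g, c=4,  n0=1))   # False: 3n^2 + 10n > 4n^2 for n < 10
```

A finite scan like this cannot prove the bound for all n, but it is a quick sanity check on a chosen (c, n0) pair.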
Set Builder O-Notation
Definition
Consider functions f1(n), f2(n), f3(n), .., fk(n) for which g(n) is an asymptotic upper bound. By definition,
f1(n) ≤ c1 g(n) for n ≥ n1
f2(n) ≤ c2 g(n) for n ≥ n2
where c1, c2, c3, .., ck are constants and n1, n2, n3, .., nk are positive integers. The functions f1(n), f2(n), f3(n), .., fk(n) are said to belong to the class O(g(n)). In set notation, the relation is denoted by
O(g(n)) = { f1(n), f2(n), f3(n), …, fk(n) }
More generally,
O(g(n)) = { f(n): there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0 }
The behavior of f(n) and g(n) is portrayed in the graph. For n < n0, f(n) may lie above or below g(n), but for all n ≥ n0, f(n) falls consistently above g(n). It also implies that g(n) grows more slowly than f(n).
Since the best-case running time of an algorithm is the minimum running time over all inputs, g(n) provides a lower bound on the best-case running time.
As before, the notation Ω(g(n)) does not imply that g(n) is the best-case running time; it simply means that the best-case running time is never lower than the bound determined by g(n).
Example: Let f(n) = n² - 10n and g(n) = n². For n ≥ 11, 10n ≤ (19/20)n², so
f(n) = n² - 10n ≥ n² - (19/20)n² = n²/20 (simplifying)
Therefore, by definition,
n² - 10n = Ω(n²), with c = 1/20 and n0 = 11.
[Figure: Growth of the function n²/20 versus the function f(n) = n² - 10n.]
Example: Let f(n) = 3n² - 25n and g(n) = n². For n ≥ 9, 25n ≤ (141/50)n², so
f(n) = 3n² - 25n ≥ 3n² - (141/50)n² = (9/50)n² (simplifying)
Therefore, by definition,
3n² - 25n = Ω(n²), with c = 9/50 and n0 = 9.
[Figure: Asymptotic lower bound. Growth of the function cg(n) = 9n²/50 versus the function f(n) = 3n² - 25n, with crossover at n0 = 9.]
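Both lower-bound examples can be verified numerically with a small helper (a sketch; the function name is illustrative, not from the source):

```python
def is_lower_bounded(f, g, c, n0, n_max=10_000):
    """Check f(n) >= c*g(n) for all integers n0 <= n <= n_max."""
    return all(f(n) >= c * g(n) for n in range(n0, n_max + 1))

# n^2 - 10n >= (1/20)*n^2 holds from n0 = 11 onward
print(is_lower_bounded(lambda n: n**2 - 10*n, lambda n: n**2, c=1/20, n0=11))  # True

# 3n^2 - 25n >= (9/50)*n^2 holds from n0 = 9 onward
print(is_lower_bounded(lambda n: 3*n**2 - 25*n, lambda n: n**2, c=9/50, n0=9))  # True
```

As with the upper-bound check, a finite scan does not prove the bound for all n, but it quickly exposes a wrong choice of c or n0.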
Asymptotic Analysis-1/IIU 2008/Dr.A.Sattar/16
Set Builder Ω-Notation
Definition
Consider functions f1(n), f2(n), f3(n), .., fk(n) for which g(n) is an asymptotic lower bound. By definition,
f1(n) ≥ c1 g(n) for n ≥ n1
f2(n) ≥ c2 g(n) for n ≥ n2
where c1, c2, c3, .., ck are constants and n1, n2, n3, .., nk are positive integers. The functions f1(n), f2(n), f3(n), .., fk(n) are said to belong to the class Ω(g(n)). In set notation, the relation is denoted by
Ω(g(n)) = { f1(n), f2(n), f3(n), …, fk(n) }
More generally,
Ω(g(n)) = { f(n): there exist positive constants c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0 }
The behavior of f(n) and g(n) is portrayed in the graph. For n < n0, f(n) may lie above or below g(n), but for all n ≥ n0, f(n) falls consistently between c1·g(n) and c2·g(n). This implies that g(n) grows as fast as f(n); the function g(n) is said to be an asymptotic tight bound for f(n). In set notation,
Ө(g(n)) = { f(n): there exist positive constants c1, c2 and a positive integer n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
Example: Let f(n) = 5n² - 19n and g(n) = n². For the upper bound, f(n) = 5n² - 19n ≤ 5n² for all n ≥ 1, so c2 = 5. For the lower bound, 19n ≤ (165/38)n² for n ≥ 5, so
f(n) = 5n² - 19n ≥ 5n² - (165/38)n² = (25/38)n² (simplifying)
Therefore, by definition, 5n² - 19n = Ө(n²), with c1 = 25/38, c2 = 5, and n0 = 5.
[Figure: Asymptotic tight bound. Growth of the functions 5n² and 25n²/38 versus the function f(n) = 5n² - 19n; f(n) lies between the two bounds from n0 onward.]
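The two-sided bound can be checked numerically as well. In this sketch (helper name illustrative), with c1 = 25/38 and c2 = 5 the sandwich c1·n² ≤ f(n) ≤ c2·n² holds for all integers n ≥ 5 (the two curves cross near n ≈ 4.4):

```python
def is_tight_bounded(f, g, c1, c2, n0, n_max=10_000):
    """Check c1*g(n) <= f(n) <= c2*g(n) for all integers n0 <= n <= n_max."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

f = lambda n: 5 * n**2 - 19 * n   # f(n) = 5n^2 - 19n
g = lambda n: n**2                # g(n) = n^2

print(is_tight_bounded(f, g, c1=25/38, c2=5, n0=5))  # True
print(is_tight_bounded(f, g, c1=25/38, c2=5, n0=4))  # False: lower bound fails at n = 4
```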
Constant Running Time
If the running time T(n) = c is a constant, i.e., independent of the input size, then by convention the asymptotic behavior is denoted by
O(c) = O(1), θ(c) = θ(1), Ω(c) = Ω(1)
The convention implies that the running time of an algorithm that does not depend on the size of the input can be expressed in any of the above ways.
The above results imply that, in asymptotic notation, the multiplier constants in an expression for the running time can be dropped.
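As a concrete illustration of constant running time (a minimal sketch; the function name is illustrative), indexing into a list takes the same single step whatever the input size, so its running time is O(1):

```python
def get_first(a):
    """One index operation; the running time does not depend on len(a)."""
    return a[0]

print(get_first([7]))                     # input size 1        -> 7
print(get_first(list(range(1_000_000))))  # input size 1,000,000 -> 0, same single step
```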
Examples: Consider the following functions compared against g(n) = n².
O(n²) = { √n, n+5, lg n+4n, n^1.5+n, √n+5n², n²+5n, lg n+4n², n^1.5+3n² }
Ω(n²) = { √n+5n², n²+5n, lg n+4n², n^1.5+3n², 5n²+n³, n³+n²+n, lg n+4n⁴, n lg n+3n⁴ }
θ(n²) = { √n+5n², n²+5n, lg n+4n², n^1.5+3n² }
The functions in θ(n²) are exactly those that belong to both O(n²) and Ω(n²).
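A quick numerical heuristic (not the formal definition; the probe point and thresholds below are arbitrary choices) for deciding where a function falls is to look at the ratio f(n)/g(n) at a large n: a ratio near 0 suggests f(n) is in O(n²) but not Ω(n²), a moderate constant suggests θ(n²), and a very large ratio suggests Ω(n²) but not O(n²):

```python
import math

def growth_class(f, g, n=10**6):
    """Heuristic: classify f against g by the ratio f(n)/g(n) at one large n."""
    r = f(n) / g(n)
    if r < 0.001:
        return "O only"       # f grows strictly slower than g
    if r > 1000:
        return "Omega only"   # f grows strictly faster than g
    return "Theta"            # f and g grow at the same rate

g = lambda n: n**2
print(growth_class(lambda n: math.sqrt(n) + 5 * n**2, g))  # Theta
print(growth_class(lambda n: math.log2(n) + 4 * n, g))     # O only
print(growth_class(lambda n: 5 * n**2 + n**3, g))          # Omega only
```

A single sample point can be fooled by slowly growing factors such as lg n, so this is a sanity check, not a proof.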
If f1(n) = O(g1(n)), f2(n) = O(g2(n)), .., fk(n) = O(gk(n)), then
f1(n) + f2(n) + f3(n) + … + fk(n) = O( max(g1(n), g2(n), g3(n), …, gk(n)) )
It follows from the theorem that, in an expression consisting of a sum of several functions, the comparatively slower-growing functions can be discarded in favor of the fastest-growing function to obtain the Big-Oh notation for the whole expression. The same holds for the Ө and Ω notations.
Thus,
f(n) = n + √n + n1.5+ lg n + n lg n + n2
= O( max (n , √n , n1.5, lg n , n lg n , n2 ) )
= O(n2)
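The dominance of the fastest-growing term can be observed numerically (a minimal sketch): as n grows, the ratio of the full sum to its fastest-growing term n² approaches 1, so the slower terms contribute nothing asymptotically:

```python
import math

def f(n):
    # f(n) = n + sqrt(n) + n^1.5 + lg n + n*lg n + n^2
    return n + math.sqrt(n) + n**1.5 + math.log2(n) + n * math.log2(n) + n**2

for n in (10, 10**3, 10**6):
    print(n, f(n) / n**2)   # the ratio tends to 1: the n^2 term dominates
```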