Time Complexity
3 ASYMPTOTIC NOTATION
Asymptotic notation is the simplest and easiest way of describing the running time of an algorithm. It represents the efficiency and performance of an algorithm in a systematic and meaningful manner.
Asymptotic notations describe time complexity in terms of three common measures: best case (or 'fastest possible'), worst case (or 'slowest possible'), and average case (or 'average time').
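For example, in a linear search the best case occurs when the key is at the first position and the worst case when the key is at the last position or absent; the average case lies between the two. The sketch below is only an illustration (the function name linear_search and the sample list are not from the text).

```python
def linear_search(data, key):
    """Return the index of key in data, or -1 if key is absent."""
    for i, value in enumerate(data):
        if value == key:
            return i      # best case: key at index 0, one comparison
    return -1             # worst case: key absent, n comparisons

numbers = [7, 3, 9, 1, 5]
print(linear_search(numbers, 7))   # best case: 1 comparison
print(linear_search(numbers, 4))   # worst case: 5 comparisons
```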
The three most important asymptotic notations are:
1. Big-Oh notation
2. Omega notation
3. Theta notation
Definition Consider f(n) and g(n) to be two positive functions of n, where n is the size of the input data. Then, f(n) is big-oh of g(n) if and only if there exist a positive constant C and an integer n₀ such that
f(n) ≤ C·g(n) for all n ≥ n₀
Here, f(n) = O(g(n)).
Figure 3.2 shows the graphical representation of big-oh notation.
Fig. 3.2 Graphical representation of big-oh notation
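As an illustration of the definition (a minimal sketch; the functions f(n) = 3n + 2 and g(n) = n, and the constants C = 4 and n₀ = 2, are chosen only for this example and are not from the text), the big-oh condition can be checked numerically over a small range:

```python
# Sketch: checking the big-oh condition f(n) <= C * g(n) for all n >= n0
# over a finite range. The functions and constants below are assumptions
# made only for this illustration.

def f(n):
    return 3 * n + 2          # example running-time function

def g(n):
    return n                  # candidate bounding function

C, n0 = 4, 2
print(all(f(n) <= C * g(n) for n in range(n0, 1000)))   # True
# Consistent with f(n) = O(g(n)) = O(n).
```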
Some of the typical complexities (computing time) represented by big-oh notation are:
1. O(1) → Constant
2. O(n) → Linear
3. O(n²) → Quadratic
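The short Python sketches below (illustrative only; the function names are not from the text) show code whose running time corresponds to each of these three complexities.

```python
def first_element(data):
    # O(1): constant time, independent of the number of elements
    return data[0]

def total(data):
    # O(n): linear time, one pass over the n elements
    s = 0
    for x in data:
        s += x
    return s

def all_pairs(data):
    # O(n^2): quadratic time, nested loops over the n elements
    pairs = []
    for x in data:
        for y in data:
            pairs.append((x, y))
    return pairs
```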
Similarly, f(n) is omega of g(n), written f(n) = Ω(g(n)), if and only if there exist a positive constant C and an integer n₀ such that f(n) ≥ C·g(n) for all n ≥ n₀. Figure 3.3 shows the graphical representation of omega notation.
Fig. 3.3 Graphical representation of Omega notation
Example 3.6 Deduce the omega notation if f(n) = 2n² + 4 and g(n) = 6n.
Given, f(n) = 2n² + 4 and g(n) = 6n.
For n = 0,
f(n) = 2(0)² + 4 = 4
g(n) = 6(0) = 0
i.e., f(n) > g(n)
For n = 1,
f(n) = 2(1)² + 4 = 2 + 4 = 6
g(n) = 6(1) = 6
i.e., f(n) = g(n)
For n = 2,
f(n) = 2(2)² + 4 = 8 + 4 = 12
g(n) = 6(2) = 12
i.e., f(n) = g(n)
Thus, f(n) ≥ C·g(n) holds with C = 1 for all n ≥ n₀ = 1, i.e., f(n) = Ω(g(n)).
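To complement Example 3.6, the sketch below (an illustration, not part of the original example) tabulates f(n) = 2n² + 4 and g(n) = 6n for small n and confirms that f(n) ≥ C·g(n) holds with C = 1 from n₀ = 1 onward.

```python
# Sketch: verifying the omega condition of Example 3.6 for small n.
def f(n):
    return 2 * n ** 2 + 4

def g(n):
    return 6 * n

for n in range(0, 5):
    print(n, f(n), g(n), f(n) >= 1 * g(n))
# Every row prints True, so f(n) >= C * g(n) with C = 1 for all n >= 1,
# i.e., f(n) = Omega(g(n)).
```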
Fig. 3.4 Graphical representation of Theta notation