
Asymptotic Notations

The document discusses three asymptotic notations - Big-O, Omega, and Theta notation. Big-O notation represents the upper bound of an algorithm's running time (worst case). Omega notation represents the lower bound (best case). Theta notation represents both the upper and lower bounds, describing the average case. It provides examples of functions that belong to orders of O(1), O(n), and O(n^2).


Asymptotic notations are mathematical tools used to represent the time complexity of algorithms for asymptotic analysis.


There are mainly three asymptotic notations:
1. Big-O Notation (O-notation)
2. Omega Notation (Ω-notation)
3. Theta Notation (Θ-notation)

1. Theta Notation (Θ-Notation):


Theta notation encloses the function from above and below. Since it represents
both the upper and the lower bound of the running time of an algorithm, it is
used for analyzing the average-case complexity of an algorithm.
Let f and g be functions from the set of natural numbers to itself. The function
f is said to be Θ(g) if there exist constants c1, c2 > 0 and a natural number n0
such that c1 * g(n) ≤ f(n) ≤ c2 * g(n) for all n ≥ n0.

The above expression can be read as follows: if f(n) is Theta of g(n), then the value
of f(n) always lies between c1 * g(n) and c2 * g(n) for large values of n (n ≥ n0). The
definition of Theta also requires that f(n) be non-negative for all values of n
greater than n0.
A simple way to get the Theta notation of an expression is to drop the low-order
terms and ignore the leading constant. For example, consider the expression
3n^3 + 6n^2 + 6000 = Θ(n^3). Dropping the lower-order terms is always safe because
there will always be a value of n after which n^3 has higher values than n^2,
irrespective of the constants involved. For a given function g(n), Θ(g(n)) denotes
the set of functions f(n) for which such constants c1, c2, and n0 exist.
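The Theta bound above can be checked numerically. The sketch below verifies the example 3n^3 + 6n^2 + 6000 = Θ(n^3); the constants c1 = 3, c2 = 4 and the threshold n0 = 21 are one valid choice we picked for illustration, not the only one.

```python
# A numeric sanity check of the Theta definition (a sketch; the constants
# c1 = 3, c2 = 4 and threshold n0 = 21 are choices made for this example).
def f(n):
    return 3 * n**3 + 6 * n**2 + 6000

def g(n):
    return n**3

c1, c2, n0 = 3, 4, 21

# Verify c1*g(n) <= f(n) <= c2*g(n) for a range of n >= n0.
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("3n^3 + 6n^2 + 6000 is Theta(n^3) with c1=3, c2=4, n0=21")
```

Note that the check fails for n = 20 (8000 * 4 = 32000 but 4 * 20^3 = 32000 while 6 * 20^2 + 6000 = 8400 > 20^3), which is why n0 must be at least 21 for this choice of c2.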
2. Big-O Notation (O-notation):

Big-O notation represents the upper bound of the running time of an algorithm.
Therefore, it gives the worst-case complexity of an algorithm.
If f(n) describes the running time of an algorithm, f(n) is O(g(n)) if there exist a
positive constant c and a natural number n0 such that 0 ≤ f(n) ≤ c * g(n) for all n ≥ n0.

The Big-O notation is useful when we only have an upper bound on the time
complexity of an algorithm. Often we can easily find an upper bound simply by
looking at the algorithm.
Examples:
{ 100, log(2000), 10^4 } belongs to O(1)
{ n/4, 2n + 3, n/100 + log(n) } belongs to O(n)
{ n^2 + n, 2n^2, n^2 + log(n) } belongs to O(n^2)
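The three orders in the examples above correspond to familiar code shapes. The sketch below pairs each order with a small function (the function names are ours, chosen for illustration): constant-time access for O(1), a single pass for O(n), and a nested pair-comparison for O(n^2).

```python
# Illustrative sketches of the three growth orders listed above
# (function names are ours, chosen for this example).

def first_element(items):
    # O(1): one step regardless of input size.
    return items[0]

def total(items):
    # O(n): one pass over the input.
    s = 0
    for x in items:
        s += x
    return s

def has_duplicate(items):
    # O(n^2): nested loops compare every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

data = [3, 1, 4, 1, 5]
print(first_element(data))   # 3
print(total(data))           # 14
print(has_duplicate(data))   # True
```

Each function's bound follows directly by counting loop iterations, which is the "upper bound by simply looking at the algorithm" mentioned above.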

3. Omega Notation (Ω-Notation):


Omega notation represents the lower bound of the running time of an algorithm.
Thus, it provides the best-case complexity of an algorithm.
Let f and g be functions from the set of natural numbers to itself. The function
f is said to be Ω(g) if there exist a constant c > 0 and a natural number n0 such that
c * g(n) ≤ f(n) for all n ≥ n0.
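Linear search is a standard way to see the best-case idea behind Omega. In the sketch below (an illustration we added; the step counter is there only to make the cost visible), finding the target at index 0 takes a single comparison, so the running time is Ω(1), while the worst case scans all n elements.

```python
# Linear search, instrumented to count comparisons, as a sketch of the
# best-case (lower-bound) idea: the best case takes constant work.
def linear_search(items, target):
    steps = 0
    for i, x in enumerate(items):
        steps += 1
        if x == target:
            return i, steps   # found: index and comparisons used
    return -1, steps          # not found after scanning all n elements

items = [7, 2, 9, 4]
print(linear_search(items, 7))   # (0, 1) best case: one comparison
print(linear_search(items, 4))   # (3, 4) worst case: n comparisons
```

So linear search is Ω(1) and O(n); since the two bounds differ, no single Θ bound describes all inputs.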
