Asymptotic Notations
The above expression can be read as: if f(n) is Theta of g(n), then f(n) always lies between c1 * g(n) and c2 * g(n) for large values of n (n >= n0). The definition of Theta also requires that f(n) be non-negative for all n >= n0.
A simple way to get the Theta notation of an expression is to drop the low-order terms and ignore the leading constant. For example, 3n^3 + 6n^2 + 6000 = Θ(n^3). Dropping the lower-order terms is always safe because there is always a value of n beyond which the n^3 term exceeds the n^2 term, irrespective of the constants involved. For a given function g(n), we denote by Θ(g(n)) the following set of functions:

Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 <= c1 * g(n) <= f(n) <= c2 * g(n) for all n >= n0 }
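The sandwich bound above can be checked numerically for concrete constants. The sketch below (a sanity check, not a proof) uses hand-picked values c1 = 3, c2 = 4 and n0 = 21 for the example 3n^3 + 6n^2 + 6000 = Θ(n^3); these constants are illustrative assumptions, not the only valid choices.

```python
# Numerical sanity check (not a proof) that 3n^3 + 6n^2 + 6000 = Theta(n^3).
# c1, c2 and n0 below are hand-picked for illustration; many other choices work.

def f(n):
    return 3 * n**3 + 6 * n**2 + 6000

def g(n):
    return n**3

c1, c2, n0 = 3, 4, 21

# The sandwich inequality c1*g(n) <= f(n) <= c2*g(n) holds for all sampled n >= n0.
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))

# The threshold n0 matters: just below it, the upper bound fails.
assert f(20) > c2 * g(20)  # 32400 > 32000
```

Any larger c2 would shrink the required n0; the definition only asks that some such triple of constants exists.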
2. Big-O Notation (O-notation):
Big-O notation represents the upper bound of the running time of an algorithm.
Therefore, it gives the worst-case complexity of an algorithm.
If f(n) describes the running time of an algorithm, then f(n) is O(g(n)) if there exist positive constants c and n0 such that 0 <= f(n) <= c * g(n) for all n >= n0.
The Big-O notation is useful when we only have an upper bound on the time complexity of an algorithm. Often such an upper bound can be found simply by inspecting the algorithm.
Examples:
{ 100, log(2000), 10^4 } belongs to O(1)
{ n/4, 2n + 3, n/100 + log(n) } belongs to O(n)
{ n^2 + n, 2n^2, n^2 + log(n) } belongs to O(n^2)
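These memberships can likewise be sanity-checked numerically. The sketch below pairs a few of the functions listed above with hand-picked constants c and n0 (illustrative assumptions; many other choices also satisfy the definition) and verifies 0 <= f(n) <= c * g(n) over a range of n.

```python
# Numerical sanity check (not a proof) of the O-notation memberships above.
# Each tuple is (f, g, c, n0); c and n0 are hand-picked illustrative constants.
checks = [
    (lambda n: 2 * n + 3,   lambda n: n,     5, 1),  # 2n+3      in O(n)
    (lambda n: n // 4,      lambda n: n,     1, 1),  # n/4       in O(n)
    (lambda n: n * n + n,   lambda n: n * n, 2, 1),  # n^2 + n   in O(n^2)
    (lambda n: 2 * n * n,   lambda n: n * n, 2, 1),  # 2n^2      in O(n^2)
]

for fn, gn, c, n0 in checks:
    # The definition requires 0 <= f(n) <= c*g(n) for every n >= n0.
    assert all(0 <= fn(n) <= c * gn(n) for n in range(n0, 10_000))
```

Note that the constant c absorbs any fixed multiplier on f(n), which is why 2n + 3 still belongs to O(n) rather than O(2n).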