
Asymptotic Notations

Dr Ashok Kumar Sahoo


Mobile : 9810226795
E-Mail : [email protected]
Analysis of Algorithms
• An algorithm is a finite set of precise instructions
for performing a computation or for solving a
problem.
• What is the goal of analysis of algorithms?
– To compare algorithms mainly in terms of running
time but also in terms of other factors (e.g., memory
requirements, programmer's effort etc.)
• What do we mean by running time analysis?
– Determine how running time increases as the size
of the problem increases.
Input Size
• Input size (number of elements in the input)
– size of an array
– polynomial degree
– number of elements in a matrix
– number of bits in the binary representation of the input
– vertices and edges in a graph
Types of Analysis
• Worst case
– Provides an upper bound on running time
– An absolute guarantee that the algorithm would not run longer,
no matter what the inputs are
• Best case
– Provides a lower bound on running time
– Input is the one for which the algorithm runs the fastest

Lower Bound ≤ Running Time ≤ Upper Bound

• Average case
– Provides a prediction about the running time
– Assumes that the input is random
Why do we compare algorithms?
• We need to define a number of objective
measures.

(1) Compare execution times?
Not good: times are specific to a particular computer!
(2) Count the number of statements executed?
Not good: the number of statements varies with the programming language as well as the style of the individual programmer.
Ideal Solution
• Express running time as a function of the input size n (i.e., f(n)).
• Compare different functions corresponding to running times.
• Such an analysis is independent of machine time, programming style, etc.
Example
• Associate a "cost" with each statement.
• Find the "total cost" by finding the total number of times each statement is executed.

Algorithm 1                  Cost
arr[0] = 0;                  c1
arr[1] = 0;                  c1
arr[2] = 0;                  c1
...
arr[N-1] = 0;                c1
-----------------------------------
c1 + c1 + ... + c1 = c1 × N

Algorithm 2                  Cost
for(i=0; i<N; i++)           c2
    arr[i] = 0;              c1
-----------------------------------
(N+1) × c2 + N × c1 = (c2 + c1) × N + c2
Another Example

Algorithm 3                  Cost
sum = 0;                     c1
for(i=0; i<N; i++)           c2
    for(j=0; j<N; j++)       c2
        sum += arr[i][j];    c3
-----------------------------------
c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N²
Insertion Sort C Function
void insertion_sort(int a[], int n)
{
    int c, d, t;
    for (c = 1; c <= n - 1; c++) {
        d = c;
        /* move a[c] left by adjacent swaps until it is in place */
        while (d > 0 && a[d] < a[d-1]) {
            t = a[d];
            a[d] = a[d-1];
            a[d-1] = t;
            d--;
        }
    }
}
Asymptotic Analysis
• To compare two algorithms with running
times f(n) and g(n), we need a rough
measure that characterizes how fast
each function grows.
• Hint: use rate of growth
• Compare functions in the limit, that is,
asymptotically!
(i.e., for large values of n)
Rate of Growth
• Consider the example of buying elephants and goldfish:
Total Cost: cost_of_elephants + cost_of_goldfish
Cost ≈ cost_of_elephants (approximation)
• The low-order terms in a function are relatively insignificant for large n
n⁴ + 100n² + 10n + 50 ≈ n⁴
i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth
Θ-notation
For function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0, such that ∀ n ≥ n0,
we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

g(n) is an asymptotically tight bound for f(n).
f(n) and g(n) are nonnegative for large n.
Example
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0, such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

• 10n² − 3n = Θ(n²)
• What constants for n0, c1, and c2 will work?
• Make c1 a little smaller than the leading coefficient, and c2 a little bigger.
• To compare orders of growth, look at the leading term.
• Exercise: Prove that n²/2 − 3n = Θ(n²)
Example
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0, such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

• Is 3n³ = Θ(n⁴)? No: no positive c1 satisfies c1·n⁴ ≤ 3n³ for all large n.
• How about 2^(2n) = Θ(2ⁿ)? No: 2^(2n) = 4ⁿ, and 4ⁿ/2ⁿ = 2ⁿ is unbounded, so no constant c2 works.
O-notation
For function g(n), we define O(g(n)), big-O of g(n), as the set:
O(g(n)) = {f(n) : ∃ positive constants c and n0, such that ∀ n ≥ n0,
we have 0 ≤ f(n) ≤ c·g(n)}

g(n) is an asymptotic upper bound for f(n).
Examples
O(g(n)) = {f(n) : ∃ positive constants c and n0, such that ∀ n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n)}

• Any linear function an + b is in O(n²). How?
• Show that 3n³ = O(n⁴) for appropriate c and n0.
Ω-notation
For function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
Ω(g(n)) = {f(n) : ∃ positive constants c and n0, such that ∀ n ≥ n0,
we have 0 ≤ c·g(n) ≤ f(n)}

g(n) is an asymptotic lower bound for f(n).
Example
Ω(g(n)) = {f(n) : ∃ positive constants c and n0, such that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n)}
• n = Ω(lg n). Choose c and n0.
Relations Between Θ, O, Ω
Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).

• i.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
• In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.
Asymptotic Notation in Equations
• Can use asymptotic notation in equations to replace expressions containing lower-order terms.
• For example,
4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n)
= 4n³ + Θ(n²) = Θ(n³).
More Examples …
• n⁴ + 100n² + 10n + 50 is O(n⁴)
• 10n³ + 2n² is O(n³)
• n³ − n² is O(n³)
• constants
– 10 is O(1)
– 1273 is O(1)
Back to Our Example

Algorithm 1                  Cost
arr[0] = 0;                  c1
arr[1] = 0;                  c1
arr[2] = 0;                  c1
...
arr[N-1] = 0;                c1
-----------
c1 + c1 + ... + c1 = c1 × N

Algorithm 2                  Cost
for(i=0; i<N; i++)           c2
    arr[i] = 0;              c1
-------------
(N+1) × c2 + N × c1 = (c2 + c1) × N + c2

• Both algorithms are of the same order: O(N)
Example (cont’d)

Algorithm 3                  Cost
sum = 0;                     c1
for(i=0; i<N; i++)           c2
    for(j=0; j<N; j++)       c2
        sum += arr[i][j];    c3
------------
c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N² = O(N²)
Examples

• 2n² = O(n³): 2n² ≤ cn³ ⇒ 2 ≤ cn ⇒ c = 1 and n0 = 2
• n² = O(n²): n² ≤ cn² ⇒ c ≥ 1 ⇒ c = 1 and n0 = 1
• 1000n² + 1000n = O(n²):
  1000n² + 1000n ≤ 1000n² + n² = 1001n² for n ≥ 1000 ⇒ c = 1001 and n0 = 1000
• n = O(n²): n ≤ cn² ⇒ cn ≥ 1 ⇒ c = 1 and n0 = 1


Relations Between Different Sets
• Subset relations between order-of-growth sets.
[Venn diagram: within the set of all functions ℝ → ℝ, Θ(f) is the intersection of O(f) and Ω(f), and f itself lies inside Θ(f).]
Exercises
• Express the function n³/1000 − 100n² − 100n + 3 in terms of Θ-notation.
• Show that for any real constants a and b, where b > 0, (n + a)^b = Θ(n^b)
• Show that the solution of T(n) = T(n/2) + 1 is O(lg n).
• Solve the following:
– T(n) = 4T(n/2) + n.
– T(n) = 4T(n/2) + n².
– T(n) = 4T(n/2) + n³.
