
Basics of Algorithm Analysis: Part I

Satyajit Thakor
IIT Mandi

17 February, 2017
Computational tractability

- The main focus of the course is to find efficient algorithms for computational problems.
- How do we determine the efficiency of an algorithm?
- One way is to analyze its running time mathematically as a function of its input size N.
- Worst-case vs. average-case analysis.
- What is a reasonable analytical benchmark that can tell us whether a running-time bound is impressive or weak?
- A simple way is to compare with brute-force search over the search space of possible solutions.
- Example: stable matching - what is N? (N = 2n², since each of the 2n people has a preference list of length n.)
- Brute force - search over n! perfect matchings (let alone making checks for each).
- G-S algorithm - at most n² iterations (see the comparison below).
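
A minimal sketch (in Python, not from the slides) contrasting the two operation counts; it assumes brute force must examine all n! perfect matchings while the G-S algorithm performs at most n² proposal iterations:

    import math

    # Compare brute-force search (n! perfect matchings) with the
    # Gale-Shapley bound (at most n^2 proposal iterations).
    for n in (4, 8, 16, 32):
        print(f"n={n:>2}: brute force ~ {math.factorial(n):.3e} matchings, "
              f"G-S <= {n**2} iterations")

Already at n = 32 the brute-force count exceeds 10³⁵ matchings, while the G-S bound is only 1024 iterations.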
Computational tractability

- Desirable scaling property: when the input size increases by a constant factor, the algorithm should only slow down by some constant factor.
- Consider Property A: there are absolute constants c > 0 and d > 0 so that on every input instance of size N, the running time is bounded by cNᵈ primitive computational steps.
- If this bound holds, then the algorithm has a polynomial running time.
- Does Property A satisfy the desirable scaling property?
- Yes: if the input size doubles from N to 2N, the bound becomes c(2N)ᵈ = 2ᵈ·cNᵈ, i.e., the running time grows by the constant factor 2ᵈ (see the check below).
- Hence, efficiency can be formally defined as follows:
- An algorithm is efficient if it has a polynomial running time.
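
A quick numerical check of this claim (a sketch; the constants c and d are arbitrary choices, not from the slides): doubling N multiplies the bound cNᵈ by exactly 2ᵈ.

    c, d = 3.0, 2  # arbitrary Property A constants
    for N in (100, 200, 400, 800):
        bound = c * N**d
        prev = c * (N / 2)**d
        print(f"N={N:>3}: bound={bound:.0f}, ratio to previous={bound / prev:.1f}")

Each doubling scales the bound by 2ᵈ = 4 here, independent of N, which is exactly the desirable scaling property.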
Computational tractability

- The gulf between the growth rates of polynomial and exponential functions is enormous.
- This concrete definition of efficiency allows us to raise fundamental issues:
  - e.g., the existence and non-existence of efficient algorithms.
Asymptotic order of growth

Asymptotic upper bounds:

- Let the function T(n) be the worst-case running time of a certain algorithm on an input of size n.
- Given another function f(n), we say that T(n) is O(f(n)) if, for sufficiently large n, the function T(n) is bounded above by a constant multiple of f(n), written T(n) = O(f(n)).
- More precisely, T(n) is O(f(n)) if there exist constants c > 0 and n₀ ≥ 0 so that for all n ≥ n₀, we have T(n) ≤ c·f(n).
- The function T is asymptotically upper bounded by f.
- Example: let T(n) = pn² + qn + r with p, q, r > 0; then T(n) is O(n²). Why? For n ≥ 1 we have qn ≤ qn² and r ≤ rn², so T(n) ≤ (p + q + r)n², and c = p + q + r, n₀ = 1 witness the bound (see the check below).
- T(n) is also O(n³), but n² is a better (tighter) asymptotic upper bound.
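
To make the "Why?" concrete, here is a small check (a sketch; the coefficients p, q, r are arbitrary positive values, not from the slides) that c = p + q + r and n₀ = 1 do witness T(n) ≤ c·n²:

    p, q, r = 2.0, 5.0, 3.0          # arbitrary positive coefficients
    T = lambda n: p * n**2 + q * n + r
    c, n0 = p + q + r, 1             # witness constants for T(n) = O(n^2)
    assert all(T(n) <= c * n**2 for n in range(n0, 10**5))
    print("T(n) <= c*n^2 holds for all tested n >= n0")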
Asymptotic order of growth

Asymptotic lower bounds:

- T(n) is Ω(f(n)) (also written T(n) = Ω(f(n))) if there exist constants ε > 0 and n₀ ≥ 0 so that for all n ≥ n₀, we have T(n) ≥ ε·f(n).
- The function T is asymptotically lower bounded by f.
- Example: let T(n) = pn² + qn + r with p, q, r > 0; then T(n) = Ω(n²). Why? Since qn and r are nonnegative, T(n) ≥ pn² for all n ≥ 0, so ε = p and n₀ = 0 witness the bound (see the check below).
- T(n) is also Ω(n), but n² is a better (tighter) asymptotic lower bound.
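
The same style of check works for the lower bound (a sketch with the same arbitrary coefficients): dropping the nonnegative terms qn and r only decreases T(n), so ε = p works.

    p, q, r = 2.0, 5.0, 3.0          # arbitrary positive coefficients
    T = lambda n: p * n**2 + q * n + r
    eps = p                          # witness constant for T(n) = Omega(n^2)
    assert all(T(n) >= eps * n**2 for n in range(0, 10**5))
    print("T(n) >= eps*n^2 holds for all tested n >= 0")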
Asymptotic order of growth

Asymptotically tight bounds:

- If a function T(n) is both O(f(n)) and Ω(f(n)), we say that T(n) is Θ(f(n)).
- That is, T(n) grows exactly like f(n) to within a constant factor.
- Example: T(n) = pn² + qn + r (with p, q, r > 0) is Θ(n²).
- Alternative definition: if the ratio of the functions f(n) and g(n) converges to a positive constant as n goes to infinity, that is, if

      lim_{n→∞} f(n)/g(n)

  exists and is equal to some number c > 0, then f(n) = Θ(g(n)).
- Proof sketch: since the ratio converges to c > 0, there is an n₀ such that (c/2)·g(n) ≤ f(n) ≤ (3c/2)·g(n) for all n ≥ n₀, which gives both the Ω(g(n)) and O(g(n)) bounds (a numeric illustration follows).
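
A numeric illustration of the alternative definition (a sketch; the coefficients are arbitrary): the ratio T(n)/n² approaches the positive constant p, so T(n) = Θ(n²).

    p, q, r = 2.0, 5.0, 3.0
    T = lambda n: p * n**2 + q * n + r
    for n in (10, 100, 1000, 10**6):
        print(f"n={n:>7}: T(n)/n^2 = {T(n) / n**2:.6f}")  # tends to p = 2.0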
Asymptotic order of growth

Properties of asymptotic growth rates:

- Transitivity:
  - If f = O(g) and g = O(h), then f = O(h).
  - If f = Ω(g) and g = Ω(h), then f = Ω(h).
  - If f = Θ(g) and g = Θ(h), then f = Θ(h).
- Sum of functions:
  - If f = O(h) and g = O(h), then f + g = O(h).
  - Generalization: if fᵢ = O(h) for all 1 ≤ i ≤ k, then f₁ + f₂ + ... + fₖ = O(h).
- Suppose that f and g are two functions (taking nonnegative values) such that g = O(f). Then f + g = Θ(f), i.e., f is an asymptotically tight bound for the combined function f + g.
- Proofs (a sketch of the sum property follows).
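
A proof sketch for the sum-of-functions property, since the slide defers the proofs (this is the standard argument; the constant names are chosen here for clarity). Suppose f(n) ≤ c₁h(n) for n ≥ n₁ and g(n) ≤ c₂h(n) for n ≥ n₂. Then for all n ≥ max(n₁, n₂),

    (f + g)(n) = f(n) + g(n) \le c_1 h(n) + c_2 h(n) = (c_1 + c_2)\, h(n),

so f + g = O(h) with witness constants c = c₁ + c₂ and n₀ = max(n₁, n₂). The k-term generalization follows by the same argument with c = c₁ + ... + cₖ.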
Asymptotic bounds for common functions

- Let f be a polynomial of degree d in which the coefficient a_d is positive. Then f = O(nᵈ).
- Also, f = Ω(nᵈ) and hence f = Θ(nᵈ).
- A polynomial-time algorithm is one whose running time T(n) is O(nᵈ) for some constant d.

- Logarithms are very slowly growing functions:
  - For every b > 1 and every x > 0, we have log_b n = O(nˣ).
  - In an asymptotic bound, the base is not important. Why? (See the identity below.)
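
The base-change identity behind that claim (standard logarithm algebra, not specific to these slides):

    \log_b n = \frac{\log_a n}{\log_a b},

so for fixed bases a, b > 1 the two functions differ only by the constant factor 1/log_a b, and O(log_b n) = O(log_a n); one can simply write O(log n).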

- Exponentials are very fast growing functions:
  - For every r > 1 and every d > 0, we have nᵈ = O(rⁿ) (see the comparison below).
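
A small comparison (a sketch; the degree d = 3 and base r = 1.1 are arbitrary choices) showing that even a slowly growing exponential eventually dominates any fixed-degree polynomial:

    d, r = 3, 1.1  # arbitrary polynomial degree and exponential base
    for n in (10, 100, 500, 1000):
        print(f"n={n:>4}: n^d = {n**d:.3e}, r^n = {r**n:.3e}")

At n = 10 the polynomial is far ahead (10³ vs. about 2.6), but by n = 500 the exponential has overtaken it (1.25·10⁸ vs. about 5·10²⁰).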
