220handout Lecture04

This document discusses time complexity analysis of algorithms. It introduces common time complexities such as polynomial (O(n^k)) and exponential (O(2^n)) time, explains that problems with no known polynomial-time solution are considered intractable, shows that Big-Oh notation describes asymptotic worst-case time complexity while ignoring constant factors, and presents rules for analysing sums and products of time complexities.
Copyright: © Attribution Non-Commercial (BY-NC)

Introduction to Algorithm Analysis

COMPSCI 220, Lecture 4 (A/P Georgy Gimel'farb)

Time Complexity of Algorithms

If the running time T(n) is O(f(n)), then the function f measures the algorithm's time complexity.
Polynomial algorithm: T(n) is O(n^k) for some constant k. Exponential algorithm: any algorithm that is not bounded by a polynomial.

Time complexity growth

Intractable problem: a problem for which no polynomial-time algorithm is known.

Number of data items processed per unit of time (assuming 10 items per minute):

f(n)         1 minute   1 day     1 year       1 century
n            10         14,400    5.26*10^6    5.26*10^8
n log10 n    10         3,997     883,895      6.72*10^7
n^1.5        10         1,275     65,128       1.40*10^6
n^2          10         379       7,252        72,522
n^3          10         112       807          3,746
2^n          10         20        29           35
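The table's entries can be reproduced with a short script. This is a sketch (the helper name max_items is an assumption, not from the handout): the machine's speed is fixed so that each algorithm processes n = 10 items in 1 minute, i.e. it performs f(10) "steps" per minute; in T minutes it can afford T * f(10) steps, so the largest processable input solves f(n) = T * f(10).

```python
def max_items(f, minutes):
    """Largest n such that f(n) fits in the step budget for `minutes`."""
    budget = minutes * f(10)      # steps available in the given time
    n = 10
    while f(n + 1) <= budget:     # grow n while the budget still allows it
        n += 1
    return n

growth_rates = {
    "n":   lambda n: n,
    "n^2": lambda n: n ** 2,
    "2^n": lambda n: 2.0 ** n,
}

day = 24 * 60  # minutes in a day
for name, f in growth_rates.items():
    print(name, max_items(f, day))
# Matches the table's "1 day" column: the linear algorithm manages
# 14,400 items, the quadratic one 379, the exponential one only 20.
```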


Beware exponential complexity

If a linear O(n) algorithm processes 10 items per minute, then it can process 14,400 items per day, 5,260,000 items per year, and 526,000,000 items per century.
If an exponential O(2^n) algorithm processes 10 items per minute, then it can process only 20 items per day and 35 items per century.

Big-Oh vs. Actual Running Time

Example 1: let algorithms A and B have running times TA(n) = 20n ms and TB(n) = 0.1 n log2 n ms. In the Big-Oh sense, A is better than B. But on which data volumes can A outperform B? TA(n) < TB(n) if 20n < 0.1 n log2 n, i.e. log2 n > 200, that is, when n > 2^200, which is about 10^60! Thus, in all practical cases B is better than A.
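Example 1 is easy to check numerically. The snippet below (function names are my own, not from the handout) confirms that B wins for every remotely practical input size, and locates the theoretical crossover:

```python
import math

def t_a(n):
    return 20 * n                    # T_A(n) = 20 n ms

def t_b(n):
    return 0.1 * n * math.log2(n)    # T_B(n) = 0.1 n log2 n ms

# For every practical size, B is faster than A:
for n in (10 ** 3, 10 ** 6, 10 ** 9):
    assert t_b(n) < t_a(n)

# A only catches up when log2(n) > 200, i.e. n > 2**200,
# a 61-digit number (about 1.6 * 10**60):
print(2 ** 200)
```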

Big-Oh vs. Actual Running Time

Example 2: let algorithms A and B have running times TA(n) = 20n ms and TB(n) = 0.1 n^2 ms. In the Big-Oh sense, A is better than B. But on which data volumes does A outperform B? TA(n) < TB(n) if 20n < 0.1 n^2, i.e. n > 200. Thus A is better than B in most practical cases, except for n < 200, where B becomes faster.
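A quick sketch (variable names assumed) locating the crossover in Example 2: the two running times intersect where 20n = 0.1 n^2, i.e. n = 200, and a linear scan finds the first size at which A strictly wins.

```python
def t_a(n):
    return 20 * n          # T_A(n) = 20 n ms

def t_b(n):
    return 0.1 * n * n     # T_B(n) = 0.1 n^2 ms

# First n where A is strictly faster than B:
crossover = next(n for n in range(1, 10 ** 4) if t_a(n) < t_b(n))
print(crossover)   # 201, i.e. A wins for all n > 200
```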

Big-Oh: Scaling

For all c > 0, cf is O(f), where f denotes f(n).
Proof: cf(n) < (c + e)f(n) holds for all n > 0 and all e > 0.
Constant factors are ignored; only the powers and functions of n matter. It is this ignoring of constant factors that motivates the notation! In particular, f is O(f).
Examples: 50n is O(n); 0.05n is O(n); 50,000,000n is O(n); 0.0000005n is O(n).
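The scaling rule can be illustrated with explicit witness constants (chosen by hand here, not taken from the handout): g is O(f) whenever some constant c makes g(n) <= c*f(n) for all n in range.

```python
def bounded_by(g, f, c, upto=10 ** 4):
    """Check the witness c for g(n) <= c * f(n) over 1 <= n < upto."""
    return all(g(n) <= c * f(n) for n in range(1, upto))

# Each scaled linear function is O(n); its own constant certifies it:
assert bounded_by(lambda n: 50 * n, lambda n: n, 50)
assert bounded_by(lambda n: 0.05 * n, lambda n: n, 1)
assert bounded_by(lambda n: 5e7 * n, lambda n: n, 5e7)
```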


Big-Oh: Transitivity

If h is O(g) and g is O(f), then h is O(f).
Informally: if h grows at most as fast as g, which grows at most as fast as f, then h grows at most as fast as f.
Examples:
  h is O(g) and g is O(n^2) imply h is O(n^2)
  log10 n is O(n^0.01) and n^0.01 is O(n) imply log10 n is O(n)
  2^n is O(3^n) and n^50 is O(2^n) imply n^50 is O(3^n)

Big-Oh: Rule of Sums

If g1 is O(f1) and g2 is O(f2), then g1 + g2 is O(max{f1, f2}).
The sum grows as its fastest-growing term:
  if g is O(f) and h is O(f), then g + h is O(f)
  if g is O(f), then g + f is O(f)
Examples:
  if h is O(n) and g is O(n^2), then g + h is O(n^2)
  if h is O(n log n) and g is O(n), then g + h is O(n log n)
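A small illustration of the rule of sums, with witness constants picked by hand (they are not in the handout): g1(n) = 5n is O(n), g2(n) = 3n^2 is O(n^2), and a single constant bounds their sum by the faster-growing term n^2.

```python
def g1(n):
    return 5 * n          # O(n)

def g2(n):
    return 3 * n * n      # O(n^2)

# c = 8 certifies g1 + g2 is O(n^2): 5n + 3n^2 <= 8n^2 for all n >= 1,
# since 5n <= 5n^2 whenever n >= 1.
c = 8
assert all(g1(n) + g2(n) <= c * n * n for n in range(1, 10 ** 4))
```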


Big-Oh: Rule of Products

If g1 is O(f1) and g2 is O(f2), then g1 g2 is O(f1 f2).
The product of upper bounds of functions gives an upper bound for the product of the functions:
  if g is O(f) and h is O(f), then gh is O(f^2)
  if g is O(f), then gh is O(fh)
Examples:
  if h is O(n) and g is O(n^2), then gh is O(n^3)
  if h is O(log n) and g is O(n), then gh is O(n log n)
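A matching sketch for the rule of products, again with hand-picked constants (assumptions, not from the handout): g(n) = 2n is O(n) with constant 2, h(n) = 3n + 1 is O(n) with constant 4 for n >= 1, so their product is bounded by 2 * 4 * n^2, confirming gh is O(n^2).

```python
def g(n):
    return 2 * n          # O(n), witness constant 2

def h(n):
    return 3 * n + 1      # O(n), witness constant 4 for n >= 1

# Product of the witnesses bounds the product of the functions:
# 2n * (3n + 1) = 6n^2 + 2n <= 8n^2 for all n >= 1.
assert all(g(n) * h(n) <= 8 * n * n for n in range(1, 10 ** 4))
```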

Big-Oh: Limit Rule

Suppose L = lim_{n→∞} f(n)/g(n) exists (it may be ∞). Then:
  if L = 0, then f is O(g)
  if 0 < L < ∞, then f is Θ(g)
  if L = ∞, then f is Ω(g)
To compute the limit, the standard L'Hopital rule of calculus is useful: if lim_{x→∞} f(x) = ∞ = lim_{x→∞} g(x), and f and g are positive differentiable functions for x > 0, then lim_{x→∞} f(x)/g(x) = lim_{x→∞} f'(x)/g'(x), where f'(x) is the derivative.

Examples 1.23, 1.24, p. 19

Ex. 1.23: Exponential functions grow faster than powers: n^k is O(b^n) for all b > 1, n > 1, and k ≥ 0.
Proof: by induction, or by the limit / L'Hopital approach.

Ex. 1.24: Logarithmic functions grow slower than powers: log_b n is O(n^k) for all b > 1, k > 0.
  log_b n is O(log n) for all b > 1, since log_b n = log_b a * log_a n
  log n is O(n)
  n log n is O(n^2)
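A numerical illustration (a sketch, not from the handout) of the limit rule applied to the two examples: both ratios below shrink toward 0 as n grows, which by the rule gives f is O(g), matching Ex. 1.23 (powers vs. exponentials) and Ex. 1.24 (logarithms vs. powers).

```python
import math

# Ratio n^3 / 2^n (Ex. 1.23, with k = 3, b = 2) and
# ratio log n / n^0.5 (Ex. 1.24, with k = 0.5) for growing n:
for n in (10, 100, 1000):
    print(n ** 3 / 2 ** n)             # heads to 0: n^k is O(b^n)
    print(math.log(n) / n ** 0.5)      # heads to 0: log n is O(n^k)
```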
