Analysis of Algorithms: Big-Oh

This document contains lecture slides about analyzing algorithms using Big-O notation from a class on analysis of algorithms taught by Dale Roberts. It introduces Big-O, Big-Omega, and Big-Theta notations and defines Big-O as describing an asymptotic upper bound on a function. It provides examples of calculating Big-O for functions and classifying algorithms based on their time complexity. The document aims to justify the use of Big-O notation and provide rules for calculating it.


Department of Computer and Information Science,

School of Science, IUPUI

CSCI 240

Analysis of Algorithms
Big-Oh

Dale Roberts, Lecturer


Computer Science, IUPUI
E-mail: [email protected]

Dale Roberts
Asymptotic Analysis

Ignoring constants in T(n)
Analyzing T(n) as n "gets large"

Example:
T(n) = 13n³ + 42n² + 2n log n + 4n

As n grows larger, n³ is MUCH larger than n², n log n, and n, so it dominates T(n).
The running time grows "roughly on the order of n³."
Notationally, T(n) = O(n³): the big-oh (O) notation.
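To see this dominance numerically, here is a small sketch (an illustration, not from the slides) that evaluates the example T(n) and shows the ratio T(n)/n³ approaching the leading coefficient 13:

```python
import math

# The example from the slide: T(n) = 13n^3 + 42n^2 + 2n*log(n) + 4n
def T(n):
    return 13 * n**3 + 42 * n**2 + 2 * n * math.log(n) + 4 * n

# As n grows, the lower-order terms become negligible and
# T(n) / n^3 approaches the leading coefficient, 13.
for n in (10, 1_000, 100_000):
    print(f"n = {n:>6}:  T(n)/n^3 = {T(n) / n**3:.5f}")
```

At n = 10 the ratio is still above 17, but by n = 100,000 it is within 0.001 of 13.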

3 major notations

O(g(n)), Big-Oh of g of n: the asymptotic upper bound.
Ω(g(n)), Big-Omega of g of n: the asymptotic lower bound.
Θ(g(n)), Big-Theta of g of n: the asymptotic tight bound.
Big-Oh Defined

The O symbol was introduced in 1927 to indicate the relative growth of two functions based on their asymptotic behavior; it is now used to classify functions and families of functions.

T(n) = O(f(n)) if there are positive constants c and n0 such that T(n) ≤ c·f(n) whenever n ≥ n0.

[Figure: for n ≥ n0, the curve c·f(n) stays above T(n); that is, c·f(n) is an upper bound for T(n).]
Big-Oh

Describes an upper bound for the running time of an algorithm.

Upper bounds for Insertion Sort running times:
• worst case: O(n²), with T(n) = c1·n² + c2·n + c3
• best case: O(n), with T(n) = c1·n + c2

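For concreteness, a standard insertion sort is sketched below (the slides give no code, so this is illustrative); counting comparisons exhibits the O(n) best case on sorted input and the O(n²) worst case on reverse-sorted input:

```python
def insertion_sort(a):
    """Sort list a in place and return the number of comparisons made."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements greater than key one slot to the right.
        # Best case (already sorted): one comparison per i, so ~n total.
        # Worst case (reverse sorted): i comparisons per i, so ~n^2/2 total.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

best = insertion_sort(list(range(100)))          # sorted input
worst = insertion_sort(list(range(99, -1, -1)))  # reverse-sorted input
print(best, worst)
```

For n = 100 this counts 99 comparisons in the best case versus 4950 (= 99·100/2) in the worst.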

Big-O Notation

We say Insertion Sort's run time is O(n²).
Properly, we should say its run time is in O(n²).
Read O as "Big-Oh" (you'll also hear it called "order").
In general, a function f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
E.g., if f(n) = 1000n and g(n) = n², then with n0 = 1000 and c = 1 we have f(n) ≤ 1·g(n) for all n ≥ n0, and we say f(n) = O(g(n)).
The O notation indicates "bounded above by a constant multiple of."
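The witness pair from the example (c = 1, n0 = 1000) can be spot-checked directly; this quick sketch is an illustration, not part of the original slides:

```python
def f(n):
    return 1000 * n

def g(n):
    return n * n

c, n0 = 1, 1000
# The bound f(n) <= c*g(n) holds from n0 on (spot-checked over a finite range)...
assert all(f(n) <= c * g(n) for n in range(n0, 50_000))
# ...but it can fail below n0: at n = 10, f(10) = 10000 > g(10) = 100.
assert f(10) > c * g(10)
print("f(n) <= c*g(n) for all checked n >= n0")
```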

Big-Oh Properties

Fastest-growing function dominates a sum:
O(f(n) + g(n)) is O(max{f(n), g(n)})
Product of upper bounds is an upper bound for the product:
if f is O(g) and h is O(r), then fh is O(gr)
"f is O(g)" is transitive:
if f is O(g) and g is O(h), then f is O(h)
Hierarchy of functions:
O(1), O(log n), O(n^(1/2)), O(n), O(n log n), O(n²), O(2ⁿ), O(n!)
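To illustrate the hierarchy, the sketch below (illustrative, not from the slides) evaluates one representative of each class at a fixed n and confirms the ordering at that point:

```python
import math

n = 64
# One representative per class, in the hierarchy's order:
# O(1) < O(log n) < O(n^(1/2)) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!)
values = [
    1,
    math.log2(n),       # 6.0
    math.sqrt(n),       # 8.0
    n,                  # 64
    n * math.log2(n),   # 384.0
    n ** 2,             # 4096
    2 ** n,             # ~1.8e19
    math.factorial(n),  # ~1.3e89
]
assert values == sorted(values)  # strictly increasing at this n
print(values)
```

Note the asymptotic ordering is about large n; for very small n some pairs can cross.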

Some Big-Oh's are not reasonable

Polynomial-time algorithms:
An algorithm is said to be polynomial if it is O(n^c) for some constant c ≥ 1.
Polynomial algorithms are said to be reasonable: they solve problems in reasonable times!
Coefficients, constants, and low-order terms are ignored, e.g., if f(n) = 2n², then f(n) = O(n²).

Exponential-time algorithms:
An algorithm is said to be exponential if it is O(r^n) for some r > 1.
Exponential algorithms are said to be unreasonable.

Can we justify Big-O notation?

Big-O notation is a huge simplification; can we justify it?
It only makes sense for large problem sizes.
For sufficiently large problem sizes, the highest-order term swamps all the rest!

Consider R = x² + 3x + 5 as x varies:

x = 0        x² = 0          3x = 0      5 = 5    R = 5
x = 10       x² = 100        3x = 30     5 = 5    R = 135
x = 100      x² = 10,000     3x = 300    5 = 5    R = 10,305
x = 1,000    x² = 1,000,000  3x = 3,000  5 = 5    R = 1,003,005
x = 10,000                                        R = 100,030,005
x = 100,000                                       R = 10,000,300,005
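The table can be reproduced with a short sketch, which also reports how much of R comes from the x² term:

```python
# Recompute R = x^2 + 3x + 5 at the slide's sample points.
for x in (0, 10, 100, 1_000, 10_000, 100_000):
    R = x**2 + 3*x + 5
    share = x**2 / R if R else 0.0  # fraction of R contributed by x^2
    print(f"x = {x:>7}  R = {R:>14,}  x^2 share = {share:.4%}")
```

By x = 100,000 the x² term accounts for more than 99.99% of R.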

Classifying Algorithms based on Big-Oh

A function f(n) is said to be of at most logarithmic growth if f(n) = O(log n).
A function f(n) is said to be of at most quadratic growth if f(n) = O(n²).
A function f(n) is said to be of at most polynomial growth if f(n) = O(nᵏ) for some natural number k > 1.
A function f(n) is said to be of at most exponential growth if there is a constant c such that f(n) = O(cⁿ), and c > 1.
A function f(n) is said to be of at most factorial growth if f(n) = O(n!).
A function f(n) is said to have constant running time if the size of the input n has no effect on the running time of the algorithm (e.g., assignment of a value to a variable). The equation for this algorithm is f(n) = c.
Other logarithmic classifications: f(n) = O(n log n) and f(n) = O(log log n).

Rules for Calculating Big-Oh

Base of logs ignored:
log_a(n) = O(log_b(n))
Powers inside logs ignored:
log(n²) = O(log n)
Bases and powers in exponents NOT ignored:
3ⁿ is not O(2ⁿ)
a^(n²) is not O(aⁿ)
If T(x) is a polynomial of degree n, then T(x) = O(xⁿ)
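The first two rules follow from the change-of-base identity log_a(n) = log_b(n) / log_b(a) and from log(n²) = 2·log(n): both differ from log n only by a constant factor, which Big-Oh absorbs. A quick numerical sketch:

```python
import math

# log_2(n) = ratio * log_10(n), where ratio = log(10)/log(2) ~ 3.32,
# a constant independent of n -- so the base of the log never matters in Big-Oh.
ratio = math.log(10) / math.log(2)
for n in (10, 10**6, 10**12):
    assert abs(math.log(n, 2) - ratio * math.log(n, 10)) < 1e-6
    # A power inside a log is also just a constant factor: log(n^2) = 2 log(n).
    assert abs(math.log(n**2) - 2 * math.log(n)) < 1e-9
print(f"log_2(n) = {ratio:.4f} * log_10(n) for every n checked")
```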

Big-Oh Examples

1. 2n³ + 3n² + n = 2n³ + 3n² + O(n)
                 = 2n³ + O(n² + n)
                 = 2n³ + O(n²)
                 = O(n³) = O(n⁴)   (O(n⁴) is also a valid upper bound, just looser)
2. 2n³ + 3n² + n = 2n³ + 3n² + O(n)
                 = 2n³ + O(n² + n)
                 = 2n³ + O(n²) = O(n³)

Big-Oh Examples (cont.)

3. Suppose a program P is O(n³) and a program Q is O(3ⁿ), and that currently both can solve problems of size 50 in 1 hour. If the programs are run on another system that executes exactly 729 times as fast as the original system, what size problems will they be able to solve?

Big-Oh Examples (cont.)

For P: n^3 = 50^3 · 729, so n = 50 · 729^(1/3) = 50 · 9 = 450.
For Q: 3^n = 3^50 · 729, so n = log₃(729 · 3^50) = log₃(729) + log₃(3^50) = 6 + 50 = 56.

Improvement: problem size increased by 9 times for the n³ algorithm, but only slightly (+6) for the exponential algorithm.
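The arithmetic checks out exactly, since 729 = 9³ = 3⁶; a quick verification sketch:

```python
# A 729x speedup means the new machine does 729x the work in one hour.
# Cubic program P: work is n^3, so n grows by a factor of 729^(1/3) = 9.
n_cubic = 50 * 9
assert n_cubic**3 == 50**3 * 729   # 450^3 = old work * 729

# Exponential program Q: work is 3^n, so n grows by log_3(729) = 6.
n_exp = 50 + 6
assert 3**n_exp == 3**50 * 729     # 3^56 = 3^50 * 3^6

print(n_cubic, n_exp)  # 450 and 56
```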

Acknowledgements
Philadelphia University, Jordan
Nilagupta, Pradondet

