
Lecture 2:

Asymptotic Notation
Steven Skiena
Department of Computer Science
State University of New York
Stony Brook, NY 11794-4400
http://www.cs.sunysb.edu/~skiena
Problem of the Day
The knapsack problem is as follows: given a set of integers
S = {s1, s2, . . . , sn}, and a given target number T, find a
subset of S which adds up exactly to T. For example, within
S = {1, 2, 5, 9, 10} there is a subset which adds up to T = 22
but not T = 23.
Find counterexamples to each of the following algorithms for
the knapsack problem. That is, give an S and T such that
the subset selected using the algorithm does not leave the
knapsack completely full, even though such a solution exists.
Solution
Put the elements of S in the knapsack in left-to-right order
if they fit, i.e. the first-fit algorithm?
Put the elements of S in the knapsack from smallest to
largest, i.e. the best-fit algorithm?
Put the elements of S in the knapsack from largest to
smallest?
Counterexamples to all three heuristics are sketched in the code below.
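Here is a minimal sketch (the instances S and T are my own picks, not from the slide) of the single greedy loop behind all three heuristics, with an instance that defeats each ordering:

```python
# A minimal sketch: one greedy loop, three orderings, and an instance
# (chosen by me) that defeats each one.

def greedy_fill(items, target):
    """Scan items in the given order, taking each element that still fits."""
    total, chosen = 0, []
    for s in items:
        if total + s <= target:
            total += s
            chosen.append(s)
    return chosen, total

# First-fit (left-to-right): takes 5, then nothing fits; but {4, 3} sums to 7.
print(greedy_fill([5, 4, 3], 7))                        # ([5], 5)

# Best-fit (smallest to largest): takes 2, then 3, then 4 no longer fits;
# but {2, 4} sums to 6.
print(greedy_fill(sorted([2, 3, 4]), 6))                # ([2, 3], 5)

# Largest to smallest: takes 4, then nothing fits; but {3, 2} sums to 5.
print(greedy_fill(sorted([4, 3, 2], reverse=True), 5))  # ([4], 4)
```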
The RAM Model of Computation
Algorithms are an important and durable part of computer
science because they can be studied in a machine/language
independent way.
This is because we use the RAM model of computation for
all our analysis.
Each simple operation (+, -, =, if, call) takes 1 step.
Loops and subroutine calls are not simple operations.
They depend upon the size of the data and the contents
of the subroutine. Sort is not a single-step operation.
Each memory access takes exactly 1 step.
We measure the run time of an algorithm by counting the
number of steps it takes on a given problem instance.
This model is useful and accurate in the same sense as the
flat-earth model (which is useful)!
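To make the charging scheme concrete, here is a minimal sketch (my own illustration; the exact per-operation charges are a modeling choice, not from the slide) that counts RAM-model steps for a summation loop:

```python
# A minimal sketch: instrument a summation loop with a step counter,
# charging one step per assignment, comparison, and addition.

def sum_with_steps(a):
    steps = 0
    total = 0; steps += 1       # one assignment
    for x in a:                 # per iteration: one access, one loop test
        steps += 2
        total += x; steps += 1  # one addition/assignment
    return total, steps

print(sum_with_steps(list(range(10))))  # (45, 31): about 3n + 1 steps for n items
```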
Worst-Case Complexity
The worst-case complexity of an algorithm is the function
defined by the maximum number of steps taken on any
instance of size n.
[Figure: number of steps vs. problem size n, with curves for the best case, average case, and worst case.]
Best-Case and Average-Case Complexity
The best-case complexity of an algorithm is the function
defined by the minimum number of steps taken on any
instance of size n.
The average-case complexity of the algorithm is the function
defined by an average number of steps taken on any instance
of size n.
Each of these complexities defines a numerical function: time
vs. size!
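As a concrete illustration (my example, not from the slides): for sequential search in a list of n elements, the best case examines 1 element, the worst case examines n, and the average over all successful searches is (n + 1)/2:

```python
# A minimal sketch: best-, worst-, and average-case element examinations
# for sequential search, measured over every possible key position.

def search_cost(a, key):
    """Number of elements examined before finding key (or len(a) on a miss)."""
    for i, x in enumerate(a):
        if x == key:
            return i + 1
    return len(a)

n = 100
a = list(range(n))
costs = [search_cost(a, k) for k in a]         # one successful search per position
print(min(costs), max(costs), sum(costs) / n)  # 1, n, (n + 1) / 2 = 50.5
```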
Exact Analysis is Hard!
Best, worst, and average are difficult to deal with precisely
because the details are very complicated.
It is easier to talk about upper and lower bounds of the function.
Asymptotic notation (O, Ω, Θ) is the best we can
practically do in dealing with complexity functions.
Names of Bounding Functions
g(n) = O(f(n)) means C · f(n) is an upper bound on g(n).
g(n) = Ω(f(n)) means C · f(n) is a lower bound on g(n).
g(n) = Θ(f(n)) means C1 · f(n) is an upper bound on g(n)
and C2 · f(n) is a lower bound on g(n).
C, C1, and C2 are all constants independent of n.
O, Ω, and Θ
[Figure: three graphs illustrating O (f(n) lies below c·g(n)), Ω (f(n) lies above c·g(n)), and Θ (f(n) lies between c1·g(n) and c2·g(n)), each beyond n0.]
The definitions imply a constant n0 beyond which they are
satisfied. We do not care about small values of n.
Formal Denitions
f(n) = O(g(n)) if there are positive constants n0 and c
such that to the right of n0, the value of f(n) always lies
on or below c · g(n).
f(n) = Ω(g(n)) if there are positive constants n0 and c
such that to the right of n0, the value of f(n) always lies
on or above c · g(n).
f(n) = Θ(g(n)) if there exist positive constants n0, c1, and
c2 such that to the right of n0, the value of f(n) always lies
between c1 · g(n) and c2 · g(n) inclusive.
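Here is a minimal sketch (the witness constants are my own choices, not from the slides) of how one might test candidate constants against the Big Oh definition. Note that a finite scan can only falsify a candidate pair (c, n0), never prove the bound for all n:

```python
# A minimal sketch: empirically test candidate witnesses (c, n0) for
# f(n) = O(g(n)) over a finite range. Useful for building intuition only.

def holds_up_to(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for every n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n * n - 100 * n + 6

print(holds_up_to(f, lambda n: n * n, c=3, n0=1))  # True: 3n^2 dominates
print(holds_up_to(f, lambda n: n, c=1, n0=1))      # False: f grows past c*n
```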
Big Oh Examples
3n² − 100n + 6 = O(n²), because 3n² > 3n² − 100n + 6
3n² − 100n + 6 = O(n³), because 0.01n³ > 3n² − 100n + 6
3n² − 100n + 6 ≠ O(n), because c · n < 3n² when n > c
Think of the equality as meaning "in the set of functions".
Big Omega Examples
3n² − 100n + 6 = Ω(n²), because 2.99n² < 3n² − 100n + 6
3n² − 100n + 6 ≠ Ω(n³), because 3n² − 100n + 6 < n³
3n² − 100n + 6 = Ω(n), because 10^10^10 · n < 3n² − 100n + 6
Big Theta Examples
3n² − 100n + 6 = Θ(n²), because both O and Ω apply
3n² − 100n + 6 ≠ Θ(n³), because only O applies
3n² − 100n + 6 ≠ Θ(n), because only Ω applies
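The Θ(n²) claim can be backed numerically. A sketch follows with witness constants c1 = 2.99, c2 = 3, and n0 = 10000 (my choices, not from the slide; solving 0.01n² ≥ 100n − 6 shows the lower bound starts holding near n = 10000):

```python
# A minimal sketch: verify 2.99 n^2 <= 3n^2 - 100n + 6 <= 3 n^2 on a
# finite test range beyond n0 = 10_000.

f = lambda n: 3 * n * n - 100 * n + 6
n0 = 10_000

ok = all(2.99 * n * n <= f(n) <= 3 * n * n for n in range(n0, 100_000))
print("Theta(n^2) witnesses hold on the tested range:", ok)  # True
```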
Big Oh Addition/Subtraction
Suppose f(n) = O(n²) and g(n) = O(n²).
What do we know about g′(n) = f(n) + g(n)? Adding the
bounding constants shows g′(n) = O(n²).
What do we know about g″(n) = f(n) − |g(n)|? Since
the bounding constants don't necessarily cancel, g″(n) =
O(n²).
We know nothing about the lower bounds on g′ and g″
because we know nothing about lower bounds on f and g.
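A tiny numeric illustration (the specific f and g are hypothetical, chosen by me): if f(n) ≤ 3n² and g(n) ≤ 5n², then f(n) + g(n) ≤ (3 + 5)n², so the sum is still O(n²):

```python
# A minimal sketch with hypothetical f and g: the bounding constants for
# a sum simply add (3 + 5 = 8 here), keeping f + g = O(n^2).

f = lambda n: 3 * n * n - n      # bounded above by 3n^2 for n >= 0
g = lambda n: 5 * n * n - 2 * n  # bounded above by 5n^2 for n >= 0

print(all(f(n) + g(n) <= 8 * n * n for n in range(1, 10_000)))  # True
```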
Big Oh Multiplication by Constant
Multiplication by a constant does not change the asymptotics:
O(c · f(n)) → O(f(n))
Ω(c · f(n)) → Ω(f(n))
Θ(c · f(n)) → Θ(f(n))
Big Oh Multiplication by Function
But when both functions in a product are increasing, both are
important:
O(f(n)) · O(g(n)) → O(f(n) · g(n))
Ω(f(n)) · Ω(g(n)) → Ω(f(n) · g(n))
Θ(f(n)) · Θ(g(n)) → Θ(f(n) · g(n))
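One way to see the product rule (a one-line derivation I am adding; a(n), b(n), and the constants are hypothetical): if a(n) ≤ ca · f(n) and b(n) ≤ cb · g(n) to the right of some n0, then a(n) · b(n) ≤ (ca · cb) · f(n) · g(n) there, and since ca · cb is still a constant independent of n, a(n) · b(n) = O(f(n) · g(n)). The Ω and Θ cases follow the same way with lower-bound constants.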