Introduction To Algorithms: Chapter 3: Growth of Functions
Complexity
Complexity is the number of steps required to solve
a problem.
Complexity of Algorithms
The size of the problem is a measure of the quantity of the input data, denoted n
The time needed by an algorithm, expressed as a function of the size of the problem it solves, is called the (time) complexity of the algorithm, T(n)
Basic idea: counting operations
Running Time: Number of primitive steps that are
executed
most statements require roughly the same amount of time:
y=m*x+b
c = 5 / 9 * (t - 32 )
z = f(x) + g(y)
Each algorithm performs a sequence of basic
operations:
Arithmetic: (low + high)/2
Comparison: if ( x > 0 ) …
Assignment: temp = x
Branching: while ( true ) { … }
…
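To make this concrete, here is a minimal C fragment (our own illustration, not from the text) with each kind of basic operation labeled:

#include <stdio.h>

int main(void) {
    int low = 0, high = 10;        /* assignments */
    int mid = (low + high) / 2;    /* arithmetic: one addition, one division */
    int x = -3;                    /* assignment */
    if (x > 0) {                   /* comparison */
        printf("x is positive\n");
    }
    while (mid > low) {            /* branching: the loop test runs on every iteration */
        mid = mid - 1;             /* one arithmetic + one assignment per iteration */
    }
    return 0;
}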
Basic idea: counting operations
Idea: count the number of basic operations
performed on the input.
Difficulties:
Which operations are basic?
Not all operations take the same amount of time.
The same operation may take different amounts of time on different hardware or with different compilers
Measures of Algorithm Complexity
Let T(n) denote the number of operations required
by an algorithm to solve a given class of problems
Measures of Algorithm Complexity
Worst-Case Running Time: the longest running time over all inputs of size n
provides an upper bound on running time for any input
Example: Sequential Search
Step Count   Algorithm
             // Searches for x in array A of n items
             // returns index of found item, or n+1 if not found
     0       Seq_Search( A[n]: array, x: item ) {
     1           done = false
     1           i = 1
    n+1          while (( i <= n ) and ( A[i] <> x )) {
     n               i = i + 1
     0           }
     1           return i
     0       }
  2n + 4     Total
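Below is a runnable C version of this algorithm, a sketch of our own; the global counter ops is instrumentation we added to tally the statement executions counted in the table (it is not part of the algorithm itself):

#include <stdio.h>

long ops = 0;   /* tallies executions of the counted statements */

/* Searches for x in A[1..n]; returns its index, or n+1 if not found. */
int seq_search(int A[], int n, int x) {
    int done, i;
    ops++; done = 0;                          /* 1:   done = false (mirrors the pseudocode) */
    ops++; i = 1;                             /* 1:   i = 1 */
    while (ops++, i <= n && A[i] != x) {      /* n+1: loop tests in the worst case */
        ops++; i = i + 1;                     /* n:   increments in the worst case */
    }
    ops++;                                    /* 1:   return i */
    return i;
}

int main(void) {
    int A[] = {0, 5, 8, 1, 9, 2};   /* A[0] unused so indexing is 1-based */
    int n = 5;
    int idx = seq_search(A, n, 7);  /* x = 7 is absent: the worst case */
    printf("index = %d, ops = %ld (2n + 4 = %d)\n", idx, ops, 2 * n + 4);
    return 0;
}

On the absent-key input this prints ops = 14, matching the table's worst-case total of 2n + 4.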
Example: Sequential Search
The worst-case running time occurs when x is not in the array A
In this case, the while loop needs 2(n + 1) comparisons, plus c other operations
So T(n) = 2n + 2 + c: linear complexity
Order of Growth
For very large input sizes, it is the rate of growth, or order of growth, that matters asymptotically
Asymptotic Notation
Θ, O, Ω, o, ω
Used to describe the running times of algorithms
Instead of the exact running time, we say Θ(n²)
Defined for functions whose domain is the set of natural numbers, N
They define sets of functions; in practice, they are used to compare two functions
Asymptotic Notation
By now you should have an intuitive feel for
asymptotic (big-O) notation:
Big-O notation
(Upper Bound – Worst Case)
For a given function g(n), we denote by O(g(n)) the set of functions
O(g(n)) = { f(n) : there exist positive constants c > 0 and n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
We say g(n) is an asymptotic upper bound for f(n):
0 ≤ lim_{n→∞} f(n)/g(n) < ∞
O(g(n)) means that as n → ∞, the execution time f(n) is at most c·g(n) for some constant c
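A candidate pair (c, n₀) can be sanity-checked numerically. The sketch below is our own illustration (check_big_o is a hypothetical helper, not from the text); a finite scan can refute a bound but of course cannot prove it for all n:

#include <stdio.h>

double f(double n) { return 2 * n + 7; }   /* the function being bounded */
double g(double n) { return n; }           /* the candidate upper bound  */

/* Returns 1 if 0 <= f(n) <= c*g(n) holds for every tested n in [n0, n_max]. */
int check_big_o(double c, double n0, double n_max) {
    for (double n = n0; n <= n_max; n += 1)
        if (f(n) < 0 || f(n) > c * g(n))
            return 0;
    return 1;
}

int main(void) {
    /* f(n) = 2n + 7 vs. g(n) = n, with c = 3 and n0 = 7 (see Example 1 below) */
    printf("c = 3, n0 = 7: %s\n", check_big_o(3, 7, 1000000) ? "holds" : "fails");
    return 0;
}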
Big-O notation
(Upper Bound – Worst Case)
[Figure: time vs. n, showing f(n) lying below c·g(n) for all n ≥ n₀; that is, f(n) = O(g(n)).]
Examples of functions in O(n²)
n²
n² + n
n² + 1000n
1000n² + 1000n
Also,
n
n/1000
n^1.99999
n²/lg lg lg n
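A quick way to see why these belong to O(n²), using the limit test above (our worked check): each ratio f(n)/n² stays bounded as n → ∞. For instance,

\lim_{n\to\infty} \frac{n^2 + 1000n}{n^2} = 1,
\qquad
\lim_{n\to\infty} \frac{n/1000}{n^2} = 0,
\qquad
\lim_{n\to\infty} \frac{n^2/\lg\lg\lg n}{n^2} = \lim_{n\to\infty} \frac{1}{\lg\lg\lg n} = 0.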
Big-O notation
(Upper Bound – Worst Case)
Example 1: Is 2n + 7 = O(n)?
Let
T(n) = 2n + 7
T(n) = n(2 + 7/n)
Note that for n ≥ 7:
2 + 7/n ≤ 2 + 7/7 = 3
So T(n) ≤ 3n for all n ≥ 7, i.e., c = 3 and n₀ = 7
Then T(n) = O(n)
Also, lim_{n→∞} T(n)/n = 2 < ∞ ⇒ T(n) = O(n)
Big-O notation
(Upper Bound – Worst Case)
Example 2: Is 5n³ + 2n² + n + 10⁶ = O(n³)?
Let
T(n) = 5n³ + 2n² + n + 10⁶
T(n) = n³(5 + 2/n + 1/n² + 10⁶/n³)
Note that for n ≥ 100:
5 + 2/n + 1/n² + 10⁶/n³ ≤ 5 + 2/100 + 1/10000 + 1 = 6.0201 ≤ 6.05
So T(n) ≤ 6.05n³ for all n ≥ 100, i.e., c = 6.05 and n₀ = 100
Then T(n) = O(n³)
Also, lim_{n→∞} T(n)/n³ = 5 < ∞ ⇒ T(n) = O(n³)
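The limit in the last line can be expanded step by step (our worked version of the same computation):

\lim_{n\to\infty} \frac{5n^3 + 2n^2 + n + 10^6}{n^3}
  = \lim_{n\to\infty} \left( 5 + \frac{2}{n} + \frac{1}{n^2} + \frac{10^6}{n^3} \right)
  = 5 < \infty,
\qquad \text{so } T(n) = O(n^3).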
Big-O notation
(Upper Bound – Worst Case)
Express the execution time as a function of the input size n
Since only the growth rate matters, we can ignore the multiplicative constants and the lower-order terms, e.g.,
n, n + 1, n + 80, 40n, and n + log n are all O(n)
n^1.1 + 10000000000n is O(n^1.1)
n² is O(n²)
3n² + 6n + log n + 24.5 is O(n²)
O(1) < O(log n) < O((log n)³) < O(n) < O(n²) < O(n³) < O(n^(log n)) < O(2^√n) < O(2ⁿ) < O(n!) < O(nⁿ)
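The sketch below (our own illustration; compile with -lm) evaluates several of these functions at two sample sizes. Note that the asymptotic ordering only sets in for large n; for example, n^(log n) still exceeds 2^√n until roughly n = 2^16:

#include <stdio.h>
#include <math.h>

int main(void) {
    double ns[] = {10, 50};
    for (int i = 0; i < 2; i++) {
        double n = ns[i];
        printf("n = %.0f\n", n);
        printf("  log n     = %10.3e\n", log2(n));
        printf("  n         = %10.3e\n", n);
        printf("  n^2       = %10.3e\n", n * n);
        printf("  n^3       = %10.3e\n", n * n * n);
        printf("  n^(log n) = %10.3e\n", pow(n, log2(n)));    /* beats 2^sqrt(n) at small n! */
        printf("  2^sqrt(n) = %10.3e\n", pow(2, sqrt(n)));
        printf("  2^n       = %10.3e\n", pow(2, n));
        printf("  n!        = %10.3e\n", exp(lgamma(n + 1))); /* via the log-gamma function */
        printf("  n^n       = %10.3e\n", pow(n, n));
    }
    return 0;
}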
Ω-notation (Omega)
(Lower Bound – Best Case)
For a given function g(n), we denote by Ω(g(n)) the set of functions
Ω(g(n)) = { f(n) : there exist positive constants c > 0 and n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
[Figure: f(n) lies above c·g(n) for all n ≥ n₀; that is, f(n) = Ω(g(n)).]
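By symmetry with the limit test given for O, a sufficient condition is (our restatement, assuming the limit exists):

f(n) = \Omega(g(n)) \quad \text{if} \quad \lim_{n\to\infty} \frac{f(n)}{g(n)} > 0 \ \ (\text{possibly } \infty).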
Ω-notation
For example:
the worst-case running time of insertion sort is O(n²), and
the best-case running time of insertion sort is Ω(n)
The running time therefore falls anywhere between a linear function of n and a quadratic function of n
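These two bounds can be observed empirically by counting key comparisons on a sorted and a reverse-sorted input. The sketch below is our own demonstration; the comparison counter is instrumentation, not part of the algorithm:

#include <stdio.h>

long comparisons;   /* counts evaluations of a[i] > key */

void insertion_sort(int a[], int n) {
    for (int j = 1; j < n; j++) {
        int key = a[j];
        int i = j - 1;
        /* the comma operator counts each key comparison as it happens */
        while (i >= 0 && (comparisons++, a[i] > key)) {
            a[i + 1] = a[i];
            i--;
        }
        a[i + 1] = key;
    }
}

int main(void) {
    enum { N = 1000 };
    int up[N], down[N];
    for (int i = 0; i < N; i++) { up[i] = i; down[i] = N - i; }

    comparisons = 0; insertion_sort(up, N);    /* best case: already sorted  */
    printf("sorted input:   %ld comparisons (n - 1 = %d)\n", comparisons, N - 1);

    comparisons = 0; insertion_sort(down, N);  /* worst case: reverse sorted */
    printf("reversed input: %ld comparisons (n(n-1)/2 = %d)\n",
           comparisons, N * (N - 1) / 2);
    return 0;
}

The sorted input needs n − 1 comparisons (linear); the reversed input needs n(n − 1)/2 (quadratic), matching the Ω(n) and O(n²) bounds above.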
Examples of functions in Ω(n²)
n²
n² + n
n² − n
1000n² + 1000n
1000n² − 1000n
Also,
n³
n^2.00001
n² lg lg lg n
Θ-notation (Theta)
(Tight Bound)
In some cases,
f(n) = O(g(n)) and f(n) = Ω(g(n))
This means that the worst and best cases require the same amount of time, to within a constant factor
In this case we use a new notation called "theta"
For a given function g(n), we denote by Θ(g(n)) the set of functions
Θ(g(n)) = { f(n) : there exist positive constants c₁ > 0, c₂ > 0, and n₀ > 0 such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }
Θ-notation (Theta)
(Tight Bound)
We say g(n) is an asymptotically tight bound for f(n):
0 < lim_{n→∞} f(n)/g(n) < ∞
Θ(g(n)) means that as n → ∞, the execution time f(n) is at most c₂·g(n) and at least c₁·g(n) for some constants c₁ and c₂
[Figure: f(n) lies between c₁·g(n) and c₂·g(n) for all n ≥ n₀; that is, f(n) = Θ(g(n)).]
Θ-notation (Theta)
(Tight Bound)
Example:
n²/2 − 2n = Θ(n²), with c₁ = 1/4, c₂ = 1/2, and n₀ = 8
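Checking the stated constants against the definition (our worked verification):

\frac{n^2}{2} - 2n \le \frac{1}{2}n^2 \quad \text{for all } n \ge 0,
\qquad
\frac{1}{4}n^2 \le \frac{n^2}{2} - 2n \iff 2n \le \frac{n^2}{4} \iff n \ge 8,

so c₁ = 1/4 and c₂ = 1/2 work for all n ≥ n₀ = 8.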