02 CS251 Ch3 Functions Growth
Growth of Functions
Computer Science Dept.
Instructor: Ameera Jaradat
Outline
Analysis of Algorithms
Asymptotic notation
Standard notations and common functions
Analysis of Algorithms
An algorithm is a finite set of precise instructions for
solving a problem.
What is the goal of analysis of algorithms?
To compare algorithms, mainly in terms of running time, but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.).
What do we mean by running time analysis?
Determine how running time increases as the size of the
problem increases.
Input Size
Input size (number of elements in the input)
size of an array
# of elements in a matrix
# of bits in the binary representation of the input
# of vertices (nodes) and edges in a graph
Types of Analysis
Worst case
Provides an upper bound on running time
An absolute guarantee that the algorithm would not run longer, no matter
what the inputs are
Best case
Provides a lower bound on running time
Input is the one for which the algorithm runs the fastest
Asymptotic Analysis
To compare two algorithms with running times
f(n) and g(n), we need a rough measure that
characterizes how fast each function grows.
Hint: use rate of growth
Compare functions in the limit, that is,
asymptotically!
(i.e., for large values of n)
Which running time is faster?
Growth rate of functions
Asymptotic Notation
Asymptotic notations
O-notation (asymptotic upper bound)
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
Big-O Visualization
O(g(n)) is the set of functions with the same or smaller order of growth as g(n).
Examples (upper bound)
No Uniqueness
There is no unique set of values for n0 and c in proving the asymptotic
bounds
Prove that 100n + 5 = O(n²):
100n + 5 ≤ 100n + n = 101n ≤ 101n² for all n ≥ 5,
so c = 101 and n0 = 5 work.
We must only find SOME constants c and n0 that satisfy the asymptotic notation relation.
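A quick sanity check of the constants chosen above, c = 101 and n0 = 5 (an illustrative Python sketch; any sufficiently large c and n0 would do):

# Verify the witnesses c = 101, n0 = 5 for the claim 100n + 5 = O(n^2).
c, n0 = 101, 5
assert all(100 * n + 5 <= c * n * n for n in range(n0, 10**6))
print("100n + 5 <= 101*n^2 holds for every tested n >= 5")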
Asymptotic notations (cont.)
Ω-notation (asymptotic lower bound)
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
Asymptotic notations (cont.)
Θ-notation (asymptotic tight bound)
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
Relations Between Different Sets
These sets compare functions by asymptotic growth rate (also called asymptotic order, or order of the functions).
The sets: big-oh O(g), big-theta Θ(g), big-omega Ω(g)
Common orders of magnitude
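The common orders of magnitude are easiest to appreciate numerically; a minimal Python sketch (the sample values of n are arbitrary):

import math

# Tabulate a few common orders of magnitude to see how quickly they diverge.
print("n\tlg n\tn lg n\tn^2\t2^n")
for n in (10, 20, 30, 40):
    print(f"{n}\t{math.log2(n):.1f}\t{n * math.log2(n):.0f}\t{n ** 2}\t{2 ** n}")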
f(n) is called polylogarithmically bounded if f(n) = O(lg^k n) for some constant k.
Any positive polynomial grows faster than any polylogarithm: lg^k n = o(n^a) for any constant a > 0.
Factorials
• Stirling's approximation: n! = √(2πn) · (n/e)^n · (1 + Θ(1/n))
Logarithms
• Change of base: log_b x = log_a x / log_a b
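A quick numeric check of the change-of-base identity and of Stirling's approximation, using Python's math module (a sketch; the sample values are arbitrary):

import math

# Change of base: log_b x = log_a x / log_a b
x, a, b = 1000.0, 10.0, 2.0
print(math.log(x, b), math.log(x, a) / math.log(b, a))   # both ≈ 9.9658

# Stirling: n! ≈ sqrt(2*pi*n) * (n/e)^n; the ratio n!/approximation tends to 1
for n in (5, 10, 20):
    approx = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
    print(n, math.factorial(n) / approx)   # ratios ≈ 1.017, 1.008, 1.004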
Common Summations
Arithmetic series: Σ_{k=1}^{n} k = 1 + 2 + … + n = n(n+1)/2
Geometric series (x ≠ 1): Σ_{k=0}^{n} x^k = 1 + x + x² + … + x^n = (x^{n+1} − 1)/(x − 1)
Special case, |x| < 1: Σ_{k=0}^{∞} x^k = 1/(1 − x)
Harmonic series: Σ_{k=1}^{n} 1/k = 1 + 1/2 + … + 1/n ≈ ln n
Sum of powers: Σ_{k=1}^{n} k^p = 1^p + 2^p + … + n^p ≈ n^{p+1}/(p + 1)
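A minimal numeric check of these closed forms (the values of n and x below are arbitrary):

import math

n, x = 100, 0.5

# Arithmetic series
assert sum(range(1, n + 1)) == n * (n + 1) // 2

# Finite geometric series (x != 1)
assert abs(sum(x ** k for k in range(n + 1)) - (x ** (n + 1) - 1) / (x - 1)) < 1e-12

# Infinite geometric series, |x| < 1: partial sums approach 1/(1 - x)
print(sum(x ** k for k in range(60)), 1 / (1 - x))

# Harmonic series vs. ln n (equal up to an additive constant)
print(sum(1 / k for k in range(1, n + 1)), math.log(n))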
How to estimate the running time
Counting the number of iterations
Algorithm COUNT1
Input: n = 2^k, for some positive integer k.
Output: count = number of times Step 4 is executed.
1. count ← 0
2. while n ≥ 1
3.     for j ← 1 to n
4.         count ← count + 1
5.     end for
6.     n ← n/2
7. end while
8. return count
Observations:
The while loop is executed k + 1 times, where k = log n.
The for loop is executed n times, then n/2, n/4, …, 1 times.
Total count = Σ_{j=0}^{k} (n/2^j) = n · Σ_{j=0}^{k} (1/2^j) = n(2 − 1/2^k) = 2n − 1 = Θ(n)
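A direct Python translation of COUNT1 (a sketch; it assumes n is a power of 2), confirming that Step 4 executes 2n − 1 times:

def count1(n):
    """COUNT1 from the slide: n is assumed to be a power of 2."""
    count = 0
    while n >= 1:
        for _ in range(n):      # Step 4 runs n times per pass of the while loop
            count += 1
        n //= 2                 # halve n: n, n/2, n/4, ..., 1
    return count

for k in range(1, 8):
    n = 2 ** k
    print(n, count1(n), 2 * n - 1)   # count1(n) equals 2n - 1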
Algorithm COUNT2
Number of iterations: Σ_{i=1}^{n} n/i = n · Σ_{i=1}^{n} 1/i ≈ n ln n = Θ(n log n)
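A rough numeric check (illustrative) that Σ_{i=1..n} n/i indeed grows like n ln n:

import math

# Compare the iteration count of COUNT2 with n ln n for a few values of n.
for n in (10**2, 10**3, 10**4):
    total = sum(n / i for i in range(1, n + 1))
    print(n, round(total), round(n * math.log(n)))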
Analysis of BINARY-SEARCH
BINARY-SEARCH (A, lo, hi, x)
  if lo > hi                                          constant time
      return FALSE
  mid ← ⌊(lo + hi)/2⌋                                 constant time
  if x = A[mid]                                       constant time
      return TRUE
  if x < A[mid]
      return BINARY-SEARCH (A, lo, mid−1, x)          same problem of size n/2
  if x > A[mid]
      return BINARY-SEARCH (A, mid+1, hi, x)          same problem of size n/2

T(n) – running time for an array of size n
T(n) = c + T(n/2)
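The same procedure written as runnable Python (a sketch; it assumes A is sorted in increasing order):

def binary_search(A, lo, hi, x):
    """Return True if x occurs in the sorted list A[lo..hi], else False."""
    if lo > hi:                                   # empty range: not found
        return False
    mid = (lo + hi) // 2
    if x == A[mid]:
        return True
    if x < A[mid]:                                # recurse on the left half
        return binary_search(A, lo, mid - 1, x)
    return binary_search(A, mid + 1, hi, x)       # recurse on the right half

A = [2, 5, 7, 11, 13, 17]
print(binary_search(A, 0, len(A) - 1, 11))   # True
print(binary_search(A, 0, len(A) - 1, 4))    # False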
Recurrences and Running Time
Recurrences arise when an algorithm contains recursive
calls to itself
What is the actual running time of the algorithm?
Need to solve the recurrence
Find an explicit formula of the expression (the generic term of
the sequence)
Example Recurrences
T(n) = T(n-1) + n → Θ(n²)
Recursive algorithm that loops through the input to eliminate one item
T(n) = T(n/2) + c → Θ(lg n)
Recursive algorithm that halves the input in one step
T(n) = T(n/2) + n → Θ(n)
Recursive algorithm that halves the input but must examine every item in the input
T(n) = 2T(n/2) + 1 → Θ(n)
Recursive algorithm that splits the input into 2 halves and does a constant amount of other work
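A small sketch that unrolls each of these recurrences bottom-up (taking T(1) = 1) and prints the result next to the function that names its order of growth; the constants differ, but the orders match:

import math

n = 2 ** 10

# T(n) = T(n-1) + n  ->  Θ(n^2)
t = 1
for m in range(2, n + 1):
    t += m
print(t, n * n)

# T(n) = T(n/2) + c  ->  Θ(lg n), taking c = 1
t, m = 1, n
while m > 1:
    t, m = t + 1, m // 2
print(t, math.log2(n))

# T(n) = T(n/2) + n  ->  Θ(n)
t, m = 1, n
while m > 1:
    t, m = t + m, m // 2
print(t, n)

# T(n) = 2T(n/2) + 1  ->  Θ(n); for n = 2^k the exact value is 2n - 1
t = 1
for _ in range(int(math.log2(n))):
    t = 2 * t + 1
print(t, n)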
Methods for Solving Recurrences
Iteration method
Substitution method
Master method
The Iteration Method
T(n) = c + T(n/2)
T(n) = c + T(n/2)                since T(n/2) = c + T(n/4)
     = c + c + T(n/4)            since T(n/4) = c + T(n/8)
     = c + c + c + T(n/8)
Assume n = 2^k. After k expansions:
T(n) = c + c + … + c + T(1)      (k times)
     = c·lg n + T(1)
     = Θ(lg n)
Iteration Method – Example
T(n) = n + 2T(n/2), assume n = 2^k
T(n) = n + 2T(n/2)               since T(n/2) = n/2 + 2T(n/4)
     = n + 2(n/2 + 2T(n/4))
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     …
     = i·n + 2^i·T(n/2^i)        (after i expansions)
     = k·n + 2^k·T(1)            (at i = k)
     = n·lg n + n·T(1) = Θ(n lg n)
Iteration Method – Example
T(n) = n + T(n-1)
T(n) = n + T(n-1)
     = n + (n-1) + T(n-2)
     = n + (n-1) + (n-2) + T(n-3)
     …
     = n + (n-1) + (n-2) + … + 2 + T(1)
     = n(n+1)/2 − 1 + T(1)
     = Θ(n²)
The substitution method
1. Guess a solution
2. Use induction to prove that the guess is correct
The master method – version 1
The master method applies to recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1:
Case 1: if f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a})
Case 2: if f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} · log n)
Case 3: if f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, AND a·f(n/b) ≤ c·f(n) for some constant c < 1 and all large n, then T(n) = Θ(f(n))
The master method – version 2
The master method applies to recurrences of the form T(n) = aT(n/b) + f(n), where f(n) = Θ(n^c), a ≥ 1, b > 1, and c ≥ 0:
T(n) = Θ(n^{log_b a})    if a > b^c
T(n) = Θ(n^c · log n)    if a = b^c
T(n) = Θ(n^c)            if a < b^c
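A tiny helper (illustrative, not part of the slides) that applies version 2 by comparing a with b^c:

import math

def master_v2(a, b, c):
    """Classify T(n) = a*T(n/b) + Θ(n^c) by comparing a with b^c
    (master method, version 2; assumes a >= 1 and b > 1)."""
    if a > b ** c:
        return f"Θ(n^{math.log(a, b):.3g})"   # leaves of the recursion tree dominate
    if a == b ** c:
        return f"Θ(n^{c} log n)"              # every level contributes equally
    return f"Θ(n^{c})"                        # the root term f(n) dominates

# The first two examples on the next slides: T(n) = 9T(n/3) + n and T(n) = T(2n/3) + 1
print(master_v2(9, 3, 1))     # Θ(n^2)
print(master_v2(1, 1.5, 0))   # Θ(n^0 log n), i.e. Θ(log n)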
Application of Master Theorem
T(n) = 9T(n/3) + n
a = 9, b = 3, f(n) = n
n^{log_b a} = n^{log_3 9} = Θ(n²)
By case 1, T(n) = Θ(n²).
T(n) = T(2n/3) + 1
a = 1, b = 3/2, f(n) = 1
n^{log_b a} = n^{log_{3/2} 1} = n^0 = Θ(1)
By case 2, T(n) = Θ(log n).
Application of Master Theorem
T(n) = 3T(n/4) + n lg n
a = 3, b = 4, f(n) = n lg n
n^{log_b a} = n^{log_4 3} = O(n^{0.793})
By case 3 (the regularity condition holds with c = 3/4, since 3·(n/4)·lg(n/4) ≤ (3/4)·n·lg n), T(n) = Θ(f(n)) = Θ(n lg n).
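A quick numeric check (illustrative) of the case-3 regularity condition a·f(n/b) ≤ c·f(n) for this recurrence, with c = 3/4:

import math

def f(n):
    return n * math.log2(n)

# Regularity condition a*f(n/b) <= c*f(n) for T(n) = 3T(n/4) + n lg n,
# with a = 3, b = 4, and c = 3/4.
print(all(3 * f(n / 4) <= 0.75 * f(n) for n in range(16, 10**5)))   # True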