Alg-Chapter2-Part 2
Chapter 2
    for i = 1 to n                 c4    (n + 1) × (1 − t)
        print(i)                   c5    n × (1 − t)
    RecFunc(n − 1)
    for i = 1 to n
        print(i)
Example 1
Compute the factorial function F(n) = n! for an arbitrary nonnegative integer n:
n! = n · (n − 1) · (n − 2) · … · 2 · 1 = n · (n − 1)! for n ≥ 1
and 0! = 1 (base case).
We can compute Fact(n) = n · Fact(n − 1) with the following recursive algorithm.
ALGORITHM Fact(n)
{                                    cost            time
1   if (n == 0)                      c1              1
2       return 1                     c2              t
    else
3       return n * Fact(n − 1)       T(n−1) + c3     1 − t
}
Recurrence: T(n) = T(n − 1) + c
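As an illustration, the pseudocode above maps directly onto runnable Python (a sketch, not part of the slides' pseudocode conventions):

```python
def fact(n):
    """Recursive factorial; recurrence T(n) = T(n-1) + c, which solves to O(n)."""
    if n == 0:          # base case: 0! = 1
        return 1
    return n * fact(n - 1)   # recursive case: n! = n * (n-1)!
```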
Example 2
Fibonacci numbers sequence: 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, …

ALGORITHM Fib(n)
{                                         cost                     time
1   if (n == 1 || n == 2)                 c1                       1
2       return 1                          c2                       t
    else
3       return Fib(n − 1) + Fib(n − 2)    T(n−1) + T(n−2) + c3     1 − t
}
Recurrence: T(n) = T(n − 1) + T(n − 2) + c
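A runnable Python version of this naive recursive algorithm (illustrative only; note that the recurrence T(n) = T(n−1) + T(n−2) + c makes it exponential in n):

```python
def fib(n):
    """Naive recursive Fibonacci, matching the pseudocode above."""
    if n == 1 or n == 2:   # base cases: Fib(1) = Fib(2) = 1
        return 1
    # recursive case: two subproblems, hence T(n) = T(n-1) + T(n-2) + c
    return fib(n - 1) + fib(n - 2)
```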
ALGORITHM A(n)
{                             cost      time
1   A(n − 1)                  T(n−1)    1
2   for i = 1 to n            c2        n + 1
3       statement             c3        n
}
Note: for Algorithm A to function properly, it is crucial to include a defined stopping point (base case).
T(n) = T(n−1) + c2(n+1) + c3·n
T(n) = T(n−1) + cn + d   ← recurrence relation
Example 5
Recursive binary search algorithm

Algorithm BinarySearch(A, start, end, key)
{
    if (start > end)
        return -1
    else
        mid = (start + end) / 2
        if key == A[mid]
            return mid
        else if key < A[mid]
            return BinarySearch(A, start, mid − 1, key)
        else
            return BinarySearch(A, mid + 1, end, key)
}

Recurrence:
T(n) = d,            n = 1
T(n) = T(n/2) + c,   n > 1
where T(n/2) is the cost of solving one subproblem of size n/2 and c is the cost of dividing the problem.
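The same algorithm in runnable Python (a sketch using 0-based indexing, unlike the 1-based pseudocode):

```python
def binary_search(A, start, end, key):
    """Recursive binary search on sorted A; T(n) = T(n/2) + c -> O(log n).
    Returns the index of key in A[start..end], or -1 if not found."""
    if start > end:          # empty range: unsuccessful search
        return -1
    mid = (start + end) // 2
    if A[mid] == key:
        return mid
    elif key < A[mid]:
        # one subproblem of size ~n/2: the left half
        return binary_search(A, start, mid - 1, key)
    else:
        # one subproblem of size ~n/2: the right half
        return binary_search(A, mid + 1, end, key)
```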
Exercise
Consider the following recurrence relation:
Write pseudocode for an algorithm that solves a problem with the given recurrence relation.
ALGORITHM MyAlg(A, n)
{
}
Solving recurrence equations
Recurrence relation
A recurrence relation is an equation that recursively defines a sequence where the next term is a
function of the previous terms.
• Example: Fibonacci series Fib(n) = Fib(n-1) + Fib(n-2)
When an algorithm contains a recursive call to itself, we can often describe its running time
by a recurrence equation or recurrence, which describes the overall running time on a
problem of size n in terms of the running time on smaller inputs.
Solving a recurrence relation means converting the recursive definition into a closed-form formula.
There are different techniques for solving recurrences:
Substitution method: guess the solution, then use mathematical induction to find the boundary condition and show that the guess is correct.
Iteration method: expand the recurrence and express it as a summation of terms of n and the initial condition.
Recursion tree method: a pictorial representation of the iteration method, in the form of a tree whose nodes are expanded at each level.
Characteristic equation
Master method
Iteration method
Example 1: T(n) = T(n/2) + c, with T(1) = 1

Expand the recurrence and substitute repeatedly:
1   T(n) = T(n/2) + c
2        = [T(n/4) + c] + c   = T(n/4) + 2c
3        = [T(n/8) + c] + 2c  = T(n/8) + 3c
4        = [T(n/16) + c] + 3c = T(n/16) + 4c
    ……
k   T(n) = T(n/2^k) + kc

The expansion stops at the base case T(1) = 1:
n/2^k = 1  →  n = 2^k  →  lg n = lg 2^k = k·lg 2  →  k = lg n
So T(n) = T(1) + c·lg n = 1 + c·lg n → order of growth is O(log n).

(For a decrement recurrence with base case T(0) = 1, the expansion stops instead when T(n − k) = T(0), i.e. n − k = 0, so k = n.)
Example 4: T(n) = T(n/2) + n, with T(1) = 1
Expanding as above: T(n) = T(n/2^k) + n + n/2 + n/4 + … + n/2^(k−1)
n/2^k = 1  →  n = 2^k  →  k = log n
T(n) ≈ T(1) + 2(n − 1) ≈ 1 + 2(n − 1) → order of growth is O(n)

Example 4 (solution 2)
n/2^k = 1  →  n = 2^k  →  k = log n
T(n) ≈ T(1) + n · Σ_{i=0}^{k−1} (1/2)^i = 1 + n · 2(1 − 1/2^k) ≈ 1 + 2(n − 1) → order of growth is O(n)
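The result of Example 1 can be spot-checked numerically. This illustrative Python sketch evaluates T(n) = T(n/2) + c with T(1) = 1 directly (assuming n is a power of two) so it can be compared with the closed form 1 + c·lg n:

```python
def T(n, c=3):
    """T(n) = T(n/2) + c with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return T(n // 2) + c
```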
Exercise:
Master method
The master method applies to recurrences of the form T(n) = a·T(n/b) + f(n).
For f(n) = c·n^d (with a ≥ 1, b > 1, d ≥ 0):
    if a = b^d, then T(n) = O(n^d · log n)
    if a > b^d, then T(n) = O(n^(log_b a))
    if a < b^d, then T(n) = O(n^d)
Examples
T(n) = T(n/2) + c
a = 1, b = 2, d = 0
Compare a with b^d: 1 = 2^0, so we follow case 1.
T(n) = O(n^d · log n) = O(log n)

T(n) = T(n/2) + c·n
a = 1, b = 2, d = 1
Compare a with b^d: 1 < 2^1, so we follow case 3.
T(n) = O(n^d) = O(n)
Examples
T(n) = 2·T(n/2) + n
a = 2, b = 2, d = 1
Compare a with b^d: 2 = 2^1, so we follow case 1.
T(n) = O(n^d · log n) = O(n·log₂ n)

T(n) = 3·T(n/2) + n²
a = 3, b = 2, d = 2
Compare a with b^d: 3 < 2² = 4, so we follow case 3.
T(n) = O(n^d) = O(n²)

T(n) = 8·T(n/2) + n²
a = 8, b = 2, d = 2
Compare a with b^d: 8 > 2² = 4, so we follow case 2.
T(n) = O(n^(log₂ 8)) = O(n³)

Exercises:
1) (The given recurrence relation does not correspond to the general form of the Master theorem.)
2) T(n) = 8·T( ) + 1000·n²
3) T(n) = 16·T( ) + n
4) T(n) = 3·T( ) + …
5) T(n) = 7·T( ) + n²
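As a study aid, the three cases above can be encoded in a small helper; `master_case` is our own illustrative function, not a standard one, and it assumes the simplified form T(n) = a·T(n/b) + c·n^d:

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + c*n^d by the simplified master theorem."""
    if a == b ** d:
        return f"O(n^{d} log n)"        # case 1: a = b^d
    if a < b ** d:
        return f"O(n^{d})"              # case 3: a < b^d
    e = math.log2(a) / math.log2(b)     # case 2: exponent is log_b(a)
    return f"O(n^{e:g})"
```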
Three cases of analysis: best, worst, and average case running time
Worst case running time:
− constraints on the input, other than its size, that result in the slowest possible running time
− provides an upper bound on the running time
− an absolute guarantee that the algorithm will not run longer, no matter what the input is
Best case running time:
− constraints on the input, other than its size, that result in the fastest possible running time
− provides a lower bound on the running time
Average case running time:
− usually involves probabilities over different types of input
[Figure: running time versus input, comparing the best-case and average-case curves]
The worst case is usually fairly easy to analyze and is often close to the average or actual running time.
Example: Linear Search Algorithm
Search for x in an array A of n items.
If x = 33 → 7 comparisons
If x = 10 → 1 comparison
Best case: x is present at the first element. Time is constant → TB(n) = O(1)
Worst case: x is not present in the array. Time is linear → TW(n) = O(n)
Average case: if x is equally likely to be found at any of the n positions,
Tavg(n) = (1 + 2 + 3 + … + n) / n = [n(n + 1)/2] / n = (n + 1)/2 → time is linear → Tavg(n) = O(n)
Recall: 1 + 2 + … + n = n(n + 1)/2
Example: Linear Search Algorithm
Search for x in array A of n items.
Your algorithm should return the index of the found item, or n + 1 if it is not found (assuming that the first index is 1).

Seq_search(A, n, x)
{                                      cost    time
1.  i = 1                              c1      1
2.  while (i <= n && A[i] != x)        c2      ?
3.      i++                            c3      ?
4.  return i                           c4      1
}
The number of times lines 2 and 3 are executed depends on the data in the array, not just on n.
Example: Linear Search Algorithm
Worst case: x is not present in the array

Seq_search(A, n, x)
{                                      cost    time
1.  i = 1                              c1      1
2.  while (i <= n && A[i] != x)        c2      n + 1
3.      i++                            c3      n
4.  return i                           c4      1
}
T(n) = c1 + c2(n + 1) + c3·n + c4
     = (c1 + c2 + c4) + (c2 + c3)·n
T(n) = a·n + b, where a and b are constants → O(n)
Example: Linear Search Algorithm
Best case: x is found at the first element A[1]

Seq_search(A, n, x)
{                                      cost    time
1.  i = 1                              c1      1
2.  while (i <= n && A[i] != x)        c2      1
3.      i++                            c3      0
4.  return i                           c4      1
}
T(n) = c1 + c2 + c4 = C  [constant time] → O(1)
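Both analyses can be demonstrated by instrumenting the algorithm. This illustrative Python version of `Seq_search` (0-based storage, 1-based result as in the pseudocode) also counts the comparisons performed:

```python
def seq_search(A, n, x):
    """Linear search; returns (index, comparisons).
    The index is 1-based, and n+1 means 'not found', as in the pseudocode."""
    i = 1
    comparisons = 0
    while i <= n:
        comparisons += 1
        if A[i - 1] == x:      # A[i] in the 1-based pseudocode
            return i, comparisons
        i += 1
    return n + 1, comparisons
```

Searching for the first element costs 1 comparison (best case); searching for a missing element costs n comparisons (worst case).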
Exercise
Consider the following algorithm that finds the largest element of an array (assume the first index is 1).
1) What are the worst, best, and average cases?
To compare two algorithms with running times f(n) and g(n), we need a rough measure that characterizes how fast each function grows.
• The low-order terms in a function are relatively insignificant for large n:
n⁴ + 100n² + 10n + 50 ~ n⁴
i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth.
Order of growth
Suppose you have analyzed two algorithms and expressed their run times in terms of the size of the input:
Algorithm A: takes 100n + 1 steps to solve a problem of size n;
Algorithm B: takes n² + n + 1 steps.
(The leading term is the term with the highest exponent.)
The following table shows the run time of these algorithms for different problem sizes:
1          constant
log n      logarithmic
n          linear
n log n    n-log-n or linearithmic
n²         quadratic
n³         cubic
2ⁿ         exponential
n!         factorial

[Figure: execution times for algorithms with the given time complexities]
Asymptotic Notations
Notations used for representing the simple form of a function or showing the class of a function.
A way of comparing functions that ignores constant factors and small input sizes.
Asymptotic notations: O (Big-O), Ω (Big-Omega), Θ (Big-Theta)
Recall:
1 < log n < √n < n < n log n < n² < n³ < … < 2ⁿ < 3ⁿ < … < nⁿ < n!
Big-O notation
Let f and g be nonnegative functions on the positive integers.
Definition: f(n) = O(g(n)) if there exist constants c > 0 and n₀ > 0 such that
f(n) ≤ c·g(n) for all n ≥ n₀.

Big-O notation
Example: f(n) = 3n + 2
3n + 2 ≤ 4n for all n ≥ 2, so 3n + 2 = O(n) (take c = 4 and n₀ = 2).
In the ordering 1 < log n < √n < n < n log n < n² < n³ < … < 2ⁿ < 3ⁿ < … < nⁿ,
every function from n upward is an upper bound for 3n + 2.
But when writing Big-O notation we try to find the closest function, so 3n + 2 is O(n).
More examples …
Drop constants and lower-order terms, e.g. O(3n² + 10n + 10) becomes O(n²).
30n + 8 = O(n) with g(n) = n:
• 30n + 8 ≤ 31n for all n ≥ 8, so c = 31 and n₀ = 8 work.
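The witness constants can be checked mechanically. An illustrative sketch (`upper_bounded` is our own name):

```python
def upper_bounded(n, c=31):
    """Check the Big-O witness for 30n + 8 = O(n): does 30n + 8 <= c*n hold?"""
    return 30 * n + 8 <= c * n
```

The inequality holds exactly when n ≥ 8, which is why n₀ = 8 is needed.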
The following table shows some of the orders of growth (using Big-O notation) that appear most commonly in algorithmic analysis, in increasing order of badness.
O(n²)            Quadratic.
O(n³)            Cubic.
O(n^c), c > 1    Polynomial, sometimes called algebraic. Examples: O(n²), O(n³), O(n⁴).
O(cⁿ), c > 1     Exponential, sometimes called geometric. Examples: O(2ⁿ), O(3ⁿ).
O(n!)            Factorial, sometimes called combinatorial.

[Figure: plot of the most common Big-O growth rates]
Ω notation
Let f and g be nonnegative functions on the positive integers.
We write f(n) = Ω(g(n)) and say that:
f(n) is omega of g(n), or
f(n) is of order at least g(n), or
g is an asymptotic lower bound for f.
Ω(g(n)): the class of functions f(n) that grow at least as fast as g(n).
Definition: f(n) = Ω(g(n)) if there exist constants c > 0 and n₀ > 0 such that
f(n) ≥ c·g(n) for all n ≥ n₀.

Ω notation
Example: f(n) = 3n + 2
3n + 2 ≥ 3n for all n ≥ 1, so 3n + 2 = Ω(n) (take c = 3 and n₀ = 1).
In the ordering 1 < log n < √n < n < n log n < n² < n³ < …, every function from n downward is a lower bound for 3n + 2.
But when writing Ω notation we try to find the closest function, so 3n + 2 is Ω(n).
Θ notation
Let f and g be nonnegative functions on the positive integers.
We write f(n) = Θ(g(n)) and say that:
f(n) is theta of g(n), or
f(n) is of order g(n), or
g is an asymptotic tight bound for f.
A tight bound means that both the lower and the upper bound on the computational complexity of an algorithm are the same.
Θ(g(n)): the class of functions f(n) that grow at the same rate as g(n).
Definition: f(n) = Θ(g(n)) if there exist constants c₁ > 0, c₂ > 0, and n₀ > 0 such that
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀.

Θ notation
Example: f(n) = 3n + 2
3n ≤ 3n + 2 ≤ 4n for all n ≥ 2, so 3n + 2 = Θ(n) (take c₁ = 3, c₂ = 4, n₀ = 2).
Θ notation
Example: f(n) = n, g(n) = 2n
Is f(n) = Θ(g(n))? Is n Θ(2n)?
Solution:
Yes: (1/2)·2n ≤ n ≤ 1·2n for all n ≥ 1, so n = Θ(2n) with c₁ = 1/2 and c₂ = 1; n and 2n have the same rate of growth.
The result does not depend upon the values of the constants, and
it does not depend upon the characteristics of the computer and compiler actually used to execute the program!
Conventions for Writing Big Oh Expressions
A problem that has a worst-case polynomial-time algorithm is considered to have a good algorithm.
Such problems are called feasible or tractable.
A problem that does not have a worst-case polynomial-time algorithm is said to be intractable.
Useful summations
c + c + c + … + c (n terms) = c·n
1 + 2 + 3 + 4 + 5 + … + n = n(n + 1)/2
1² + 2² + 3² + 4² + 5² + … + n² = n(n + 1)(2n + 1)/6
1³ + 2³ + 3³ + 4³ + 5³ + … + n³ = [n(n + 1)/2]²
Σ_{i=0}^{n} 2^i = 1 + 2 + 4 + 8 + 16 + … + 2ⁿ = 2^(n+1) − 1
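These closed forms can all be verified by brute force for small n. An illustrative checker (our own helper, not from the slides):

```python
def summations_hold(n):
    """Brute-force check of the closed forms above for a given n."""
    ok_const = sum(1 for _ in range(n)) == n                                  # c = 1, n terms
    ok_arith = sum(range(1, n + 1)) == n * (n + 1) // 2
    ok_sq    = sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
    ok_cube  = sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2
    ok_geom  = sum(2 ** i for i in range(n + 1)) == 2 ** (n + 1) - 1
    return ok_const and ok_arith and ok_sq and ok_cube and ok_geom
```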
Logarithms and properties
• In algorithm analysis we often use the notation "log n" without specifying the base.
• Change of base: log_b x = log_a x / log_a b
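The change-of-base rule is why the base can be omitted inside O(·): logarithms to different bases differ only by a constant factor. A quick numeric sketch (`log_base` is our own illustrative helper):

```python
import math

def log_base(x, b, a=10):
    """Change of base: log_b(x) = log_a(x) / log_a(b), here via base a."""
    return math.log(x, a) / math.log(b, a)
```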