Analysis of Recursive Algorithms
A recurrence relation defines a function T(n) in terms of its values on smaller inputs. Like all recursive functions, it has one or more recursive cases and one or
more base cases.
Example:
T(1) = a               if n = 1
T(n) = bT(n / 2) + c   if n > 1
The portion of the definition that does not contain T is called the base case of
the recurrence relation; the portion that contains T is called the recurrent or
recursive case.
Recurrence relations are useful for expressing the running times (i.e., the
number of basic operations executed) of recursive algorithms.
The specific values of the constants such as a, b, and c (in the above recurrence)
are important in determining the exact solution to the recurrence. Often however
we are only concerned with finding an asymptotic upper bound on the solution.
We call such a bound an asymptotic solution to the recurrence.
For a given recursive method, the base case and the recursive case of its recurrence relation
correspond directly to the base case and the recursive case of the method.
Example 1: Write the recurrence relation for the following method:
void f(int n)
{
    if (n > 0) {
        printf("%d", n);
        f(n - 1);
    }
}
The base case is reached when n == 0. The method performs one comparison. Thus, the number of
operations when n == 0, T(0), is some constant a.
When n > 0, the method performs two basic operations and then calls itself, using ONE recursive
call, with a parameter n - 1.
Therefore the recurrence relation is:
T(0) = a             for some constant a
T(n) = b + T(n - 1)  for some constant b
In general, T(n) is a sum of various terms T(m), the costs of the recursive subproblems,
plus the cost of the work done outside the recursive calls:
T(n) = aT(f(n)) + bT(g(n)) + . . . + c(n)
where a and b are the numbers of subproblems, f(n) and g(n) are subproblem sizes, and c(n) is the cost
of the work done outside the recursive calls. [Note: c(n) may be a constant]
Example 2: Write the recurrence relation for the following method:
int g(int n)
{
    if (n == 1)
        return 2;
    else
        return 3 * g(n / 2) + g(n / 2) + 5;
}
The base case is reached when n == 1. The method performs one comparison
and one return statement. Therefore T(1) is some constant c.
When n > 1, the method performs TWO recursive calls, each with the
parameter n / 2, and some constant number of basic operations. Therefore the recurrence relation is:
T(1) = c
T(n) = 2T(n / 2) + b   if n > 1
When n > 2, the method performs TWO recursive calls, one with the
parameter n - 1, another with the parameter n - 2, and some constant number
of basic operations. The corresponding recurrence relation is:
T(n) = c                        if n = 1 or n = 2
T(n) = T(n - 1) + T(n - 2) + b  if n > 2
The base case is reached when n == 0 or n == 1. T(0) and T(1) are some constant c.
At every step the problem size reduces to half its size. When the power is an odd
number, an additional multiplication is involved. To work out the time complexity, let us
consider the worst case, that is, we assume that at every step an additional
multiplication is needed. Thus the total number of operations T(n) reduces to the number of
operations for n/2, that is T(n/2), plus some additional basic operations (the odd-power
case):
T(n) = c            if n = 0 or n = 1
T(n) = T(n / 2) + b if n >= 2
Steps:
Expand the recurrence
Express the expansion as a summation by plugging the recurrence
back into itself until you see a pattern.
Evaluate the summation
In evaluating the summation one or more of the following summation
formulae may be used:
Arithmetic series:
1 + 2 + 3 + . . . + n = n(n + 1) / 2
Special cases of geometric series:
1 + 2 + 2^2 + . . . + 2^n = 2^(n+1) - 1
Geometric series:
1 + x + x^2 + . . . + x^n = (x^(n+1) - 1) / (x - 1), for x != 1
Others:
1 + 1/2 + 1/3 + . . . + 1/n ≈ ln n
T(n) = b + T(n - 1)
     = b + [b + T(n - 2)] = 2b + T(n - 2)
     = b + [b + [b + T(n - 3)]] = 3b + T(n - 3)
     = . . .
     = kb + T(n - k)
The base case is reached when n - k = 0, i.e., k = n; we then have:
T(n) = nb + T(n - n)
     = bn + T(0)
     = bn + c
Therefore the method factorial is O(n)
The recurrence relation for the running time of the method is:
T(n) = a            if n = 1 (one-element array)
T(n) = T(n / 2) + b if n > 1
Expanding:
T(1) = a                 (1)
T(n) = T(n / 2) + b      (2)
     = [T(n / 2^2) + b] + b = T(n / 2^2) + 2b
     = [T(n / 2^3) + b] + 2b = T(n / 2^3) + 3b
     = . . .
     = T(n / 2^k) + kb
The base case is reached when n / 2^k = 1, i.e., k = log2 n; we then have:
T(n) = T(1) + b log2 n = a + b log2 n
Therefore the method is O(log n)
For a recurrence of the form T(n) = 2T(n - 1) + b, the same expansion gives:
T(n) = 2^k T(n - k) + b[2^(k-1) + 2^(k-2) + . . . + 2^1 + 2^0]
     = 2^k T(n - k) + b(2^k - 1)
int fibonacci(int n)
{
    if (n == 1 || n == 2)
        return 1;
    else
        return fibonacci(n - 1) + fibonacci(n - 2);
}
T(n) = c                        if n = 1 or n = 2    (1)
T(n) = T(n - 1) + T(n - 2) + b  if n > 2             (2)
The master method provides an estimate of the growth rate of the solution for recurrences of the
form:
T(1) = k
T(n) = aT(n / b) + h n^c    if n > 1
where a, b, c, h, and k are constants.
If T(n) is interpreted as the number of steps needed to execute an algorithm for an input of size n,
this recurrence corresponds to a divide-and-conquer algorithm, in which a problem of size n is
divided into a subproblems of size n / b, where a and b are positive constants:
Divide-and-conquer algorithm:
divide the problem into a number of subproblems
conquer the subproblems (solve them)
combine the subproblem solutions to get the solution to the original problem
Example: Merge Sort
divide the n-element sequence to be sorted into two n/2-element sequences.
conquer the subproblems recursively using merge sort.
combine the resulting two sorted n/2-element sequences by merging
T(n) is O(n^(log_b a))    if a > b^c
T(n) is O(n^c log n)      if a = b^c
T(n) is O(n^c)            if a < b^c
Note: Since k and h do not affect the result, they are sometimes not included
in the above recurrence.
Example 1 (Case 1, with a = 3 and b = 4): T(n) is O(n^(log_4 3))
Example 2: Find the big-Oh running time of the following recurrence. Use the Master
Theorem:
T(1) = 1
T(n) = 2T(n / 2) + n
Solution: a = 2, b = 2, c = 1, so a = b^c and Case 2 applies:
T(n) is O(n^c log n) = O(n log n)