Analysis and Design of Algorithms 2
SESSION TOPICS
Time Complexities for some general algorithms
Recurrence relations
Recurrences by substitution
Recursion tree method
Master Method
Sorting
SelectionSort
QuickSort
MergeSort
Selection
Weeks-4,5,6,7 1
Time Complexities for some algorithms
y ← x3 + 10x − 20: assign, raise to a power, multiply, add, subtract → 5
operations; T(n) = 5 = constant, so T(n) = O(1)
func rep7(n) {
  for (i = n; i > 1; i = i/2) {
    x = x + 10; // some statement
  }
}
Trace of i: n, n/2, n/2^2, n/2^3, ..., n/2^k
The loop stops when n/2^k = 1, i.e. n = 2^k, so k = log2 n and T(n) = O(log2 n)
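The halving loop can be checked empirically. A small Python sketch (Python is used here only for illustration; the function name mirrors the pseudocode):

```python
def rep7(n):
    """Count iterations of the halving loop: i = n, n/2, n/4, ..., 2."""
    count = 0
    i = n
    while i > 1:
        count += 1
        i //= 2
    return count

# For n a power of two the loop runs exactly log2(n) times.
```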
func rep8(n) {
  for (i = 0; i < n; i++) {
    x = x * i; // some statement: n iterations
  }
  for (j = 0; j < n; j++) {
    x = x * j; // some statement: n iterations
  }
}
T(n) = n + n = 2n, so T(n) = O(n)
func rep9(n) {
  p = 0;
  for (i = 1; i < n; i = i*2) { // runs log2 n times
    p++;                        // so p = log n
  }
  for (j = 1; j < p; j = j*2) { // j must start at 1, otherwise j = j*2 never grows
    x = x * j; // some statement: runs log p times, i.e. log log n
  }
}
T(n) = O(log log n)
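The two loops can be traced with a Python sketch (my own illustration; note the second loop starts at j = 1, as the pseudocode requires to make progress):

```python
def rep9(n):
    """Return (p, steps): iterations of the two loops in rep9."""
    # First loop: i = 1, 2, 4, ... < n, so p ends up about log2(n).
    p = 0
    i = 1
    while i < n:
        p += 1
        i *= 2
    # Second loop: j = 1, 2, 4, ... < p, about log2(p) = log2(log2(n)) steps.
    steps = 0
    j = 1
    while j < p:
        steps += 1
        j *= 2
    return p, steps
```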
Summary
for (i=1; i<n; i++)    ---- O(n)
for (i=1; i<n; i=i+2)  ---- O(n)
for (i=n; i>1; i=i-1)  ---- O(n)
for (i=1; i<n; i=i*2)  ---- O(log2 n)
for (i=1; i<n; i=i*3)  ---- O(log3 n)
for (i=n; i>1; i=i/2)  ---- O(log2 n)
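The summary table can be checked with a small Python helper (my own illustration, not part of the original slides):

```python
def count(start, cond, step):
    """Count iterations of the loop: for (i = start; cond(i); i = step(i))."""
    i, c = start, 0
    while cond(i):
        c += 1
        i = step(i)
    return c

# i=1; i<n; i++     -> about n iterations
# i=1; i<n; i=i*2   -> about log2(n) iterations
# i=n; i>1; i=i/2   -> about log2(n) iterations
```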
RECURRENCE RELATIONS
Recall
Recurrence relation: an equation that recursively defines a
sequence (or multidimensional array of values) once one or
more initial terms are given;
each further term of the sequence or array is defined as a
function of the preceding terms.
Example
xn = g(n, xn-1) for n > 0, where g : N × X → X
Factorial: defined by the recurrence relation
n! = n(n − 1)! for n > 0, with the initial condition 0! = 1
Logistic map: another example of a recurrence relation is
xn+1 = r·xn(1 − xn), with a given constant r; given the initial
term x0, each subsequent term is determined by this relation.
A little maths
Definition again:
A recurrence relation is an equation that recursively
defines a sequence where the next term is a function of
the previous terms (Expressing Fn as some combination
of Fi with i<n).
Examples:
Fibonacci series − Fn = Fn-1 + Fn-2, with F0 = 0, F1 = 1
Tower of Hanoi − Fn = 2Fn-1 + 1
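Both recurrences translate directly into code. A Python sketch (function names are mine, for illustration):

```python
def fib(n):
    """Fibonacci: F(n) = F(n-1) + F(n-2), with F(0) = 0, F(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def hanoi_moves(n):
    """Tower of Hanoi moves: F(n) = 2*F(n-1) + 1, F(1) = 1 (closed form 2^n - 1)."""
    return 1 if n == 1 else 2 * hanoi_moves(n - 1) + 1
```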
Linear Recurrence Relations
A linear recurrence equation of degree (order) k is a
recurrence equation in the format:
xn = A1·xn-1 + A2·xn-2 + A3·xn-3 + … + Ak·xn-k
where each Ai is a constant and Ak ≠ 0; each term is a
first-degree (linear) combination of the preceding k terms.
How to solve linear recurrence relation
Suppose a second-order linear recurrence relation is:
Fn = A·Fn-1 + B·Fn-2, where A and B are real numbers.
Rearrange:
Fn − A·Fn-1 − B·Fn-2 = 0
Get the characteristic equation:
The characteristic equation for the above recurrence relation is
x^n − A·x^(n-1) − B·x^(n-2) = 0; dividing everything by x^(n-2) we get
x² − Ax − B = 0, a quadratic equation we can solve.
Possibilities: repeated roots; distinct roots; complex roots
Distinct roots: (x − x1)(x − x2) = 0, so Fn = a·x1^n + b·x2^n is the
solution, with a and b fixed by the initial conditions.
Examples with initial conditions:
Fn = 0 if n = 0; Fn = 1 if n = 1; Fn = Fn-1 + Fn-2 if n >= 2
n! = 1 if n = 1; n! = n·(n-1)! if n > 1
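The distinct-roots recipe can be checked on the Fibonacci recurrence, whose characteristic equation x² − x − 1 = 0 has two distinct roots. A Python sketch (my own worked example):

```python
import math

# Roots of x^2 - x - 1 = 0: the golden ratio and its conjugate.
phi = (1 + math.sqrt(5)) / 2
psi = (1 - math.sqrt(5)) / 2

def fib_closed(n):
    # Fitting F(0) = 0 and F(1) = 1 gives a = 1/sqrt(5), b = -1/sqrt(5)
    # (Binet's formula); rounding absorbs floating-point error.
    return round((phi ** n - psi ** n) / math.sqrt(5))
```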
Solving recurrence relations
METHODS
There are four methods that can be used to solve the recurrence
equation:
1. The Substitution Method (Guess the solution & verify by
Induction)
2. Iteration Method (unrolling and summing)
3. The Recursion-tree method
4. Master method
The Substitution Method
In this method one guesses a bound and applies mathematical
induction to prove that the guess is correct.
Steps
Step1: Guess the form of the Solution.
Step2: Use Mathematical Induction to prove the correctness of
the guess.
Example 1
Solve the following recurrence by using substitution method.
T(n) = 2T(n/2) + n
Step 1: guess
The n/2 term is suggestive of n·log n, so guess T(n) = O(n log n),
i.e. T(n) <= c·n·log n for some constant c > 0.
Step 2: mathematical induction
Apply mathematical induction to prove the guess.
Base cases:
Let n = 1: the guess gives T(1) <= c·1·log 1 = 0, but T(1) = 1, which
is a contradiction; so start the base case at n = 2.
Let n = 2: the guess gives T(2) <= c·2·log 2 = 2c;
from the recurrence, T(2) = 2T(1) + 2 = 2·1 + 2 = 4, and 4 <= 2c
holds for any c >= 2.
Induction step
Assume the bound holds for n/2, so T(n/2) <= c·(n/2)·log(n/2) holds.
Prove that it holds for n, that is T(n) <= c·n·log n:
T(n) = 2T(n/2) + n <= 2·c·(n/2)·log(n/2) + n
= c·n·log(n/2) + n = c·n·log n − c·n·log 2 + n
= c·n·log n − c·n + n <= c·n·log n for every c >= 1.
So by induction T(n) = O(n log n).
Drawback of the method: coming up with the correct guess is
generally not easy.
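The guess can also be sanity-checked numerically. A Python sketch (my own illustration, evaluated at powers of two so n/2 stays exact):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2*T(n/2) + n with T(1) = 1, at powers of two."""
    return 1 if n == 1 else 2 * T(n // 2) + n

# For n = 2^k the exact solution is T(n) = n*log2(n) + n,
# which is indeed O(n log n), matching the proved bound.
```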
METHOD 2: THE ITERATION METHOD
The given recurrence is substituted back to itself several times
Steps
➔ Expand the recurrence through substitution
➔ Express the expansion as a summation by plugging the recurrence
back into itself seeking a pattern.
➔ Work out the total sum based on arithmetic or geometric series.
• Example 2.1: T(n) = b if n = 1; T(n) = c + T(n-1) if n > 1
• Solution
• T(1) = b as given and T(n) = c + T(n-1), also given
• Substituting once: T(n) = c + (c + T(n-2)) = 2c + T(n-2)
• Substituting again: T(n) = 2c + c + T(n-3) = 3c + T(n-3) ………..
• After k substitutions: T(n) = k·c + T(n-k); for k = n-1 this gives
T(n) = (n-1)·c + T(1) = nc − c + b = O(n)
Example 2.2: T(n) = a if n = 1, else T(n) = T(n/2) + n
• Solution
• T(n) = n + T(n/2)
• Substituting once: T(n) = n + n/2 + T(n/4)
• Substituting again: T(n) = n + n/2 + n/4 + T(n/8) ………..
• After k steps: T(n) = n + n/2 + n/4 + … + n/2^(k-1) + T(n/2^k)
• At the end T(n/2^k) = T(1), so n/2^k = 1 and k = log2 n
• We have a geometric series:
• n + n/2 + n/4 + … + n/2^(k-1) + T(1)
= n·(1 − (1/2)^k)/(1 − 1/2) + a = 2n·(1 − (1/2)^(log2 n)) + a
= 2n·(1 − 1/n) + a = 2n − 2 + a = O(n)
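The closed form 2n − 2 + a can be verified at powers of two. A Python sketch (my own illustration):

```python
def T(n, a=1):
    """T(1) = a; T(n) = n + T(n//2): for n = 2^k the sum is 2n - 2 + a."""
    return a if n == 1 else n + T(n // 2)
```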
METHODS-3: The Recursion-tree method
A tree is used to trace the steps iteratively and visually; it is very
convenient. The recurrence is expanded until the boundary conditions
are reached.
General: T(n) = aT(n/b) + f(n); place f(n) at the root and create a
children, one for each subproblem T(n/b).
Example 1: solve T(n) = 2T(n/2)+n
Place the n at the root; for simplicity replace T(n/2) by n/2
The costs at each level add up to n, so the total cost is n + n + …. + n,
one n per level.
The sequence of node sizes:
n, n/2, n/2^2, n/2^3, ….., n/2^k
At the last level the size is 1, so n/2^k = 1;
hence n = 2^k, k = log2 n, and T(n) = n·(log2 n + 1) = O(n log n).
Example 2: solve T(n) = T(n/3) +T(2n/3) +n
The sequence of sizes along the longest path:
n, (2/3)n, (2/3)^2·n, (2/3)^3·n, ……, 1
So (2/3)^k·n = 1 at the bottom, giving k = log3/2 n; k is the height of the tree
Total time estimate (each level costs at most n):
n + n + n + …. + n = n·(k times) = n·(log3/2 n times)
But n·log3/2 n = (n·log2 n)/(log2(3/2)) = c·n·log2 n
So T(n) = O(n·log2 n)
For the lower bound, take the shortest path:
n, n/3, n/3^2, n/3^3, ….., n/3^k
So n/3^k = 1, k = log3 n, which is the depth of the shallowest leaf
Estimate:
n + n + n + ….. + n = n·(k times) = n·(log3 n times)
log3 n = (log2 n)/(log2 3), so T(n) = Ω(n·log2 n)
So T(n) = Θ(n·log2 n), since it is both O and Ω of the same
order.
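The uneven-split recurrence can be evaluated numerically. A Python sketch (my own illustration; integer division stands in for n/3 and 2n/3, with T(n) = 1 for n <= 1):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n/3) + T(2n/3) + n with T(n) = 1 for n <= 1."""
    return 1 if n <= 1 else T(n // 3) + T(2 * n // 3) + n
```

Evaluating T at growing n shows T(n)/(n·log2 n) settling near a constant, consistent with Θ(n log n).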
Example 3: solve T(n) = 2T(n-1) +1; T(1)=1; T(2)=3; Tower of Hanoi
Last level: the size drops by 1 per level, from n down to n − (n − 1) = 1,
which also corresponds to the height of the tree.
Total cost:
T(n) = 1 + 2^1 + 2^2 + 2^3 + …... + 2^(n-1) = 1·(2^n − 1)/(2 − 1) = 2^n − 1
T(n) = O(2^n)
MASTER METHOD
It is a ready-made ("cookbook") method for analyzing recurrence equations
It is used in many cases for divide-and-conquer algorithms
Format for recurrence relations:
T(n) = aT(n/b) + f(n)
Where:
a, b are constants with a >= 1 and b > 1,
n is the size of the current problem
a is the number of subproblems in the recursion
n/b is the size of each subproblem; n/b should be an
integer, otherwise take the ceiling or the floor
f(n) is the cost of the work done outside the recursive calls,
such as dividing and combining the subproblems
MASTER THEOREM
Let T(n) = aT(n/b) + f(n), where a, b are constants with a >= 1 and
b > 1, f(n) is an asymptotically positive function, and n/b is a positive
integer (otherwise its ceiling or floor is taken).
Then T(n) can be bounded asymptotically as follows:
MASTER THEOREM - three cases:
1. If f(n) = Θ(n^c) where c < logb(a), then T(n) = Θ(n^logb(a))
2. If f(n) = Θ(n^c) where c = logb(a), then T(n) = Θ(n^c·log n)
3. If f(n) = Θ(n^c) where c > logb(a), then T(n) = Θ(f(n)) = Θ(n^c)
MASTER THEOREM: Hint on Applying Master Theorem
T(n) = aT(n/b) + f(n)
Case 3 applies when f(n) grows polynomially faster than n^logb(a),
provided the regularity condition a·f(n/b) <= k·f(n) holds for some
constant k < 1.
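The three cases can be mechanized for the common situation where f(n) is a plain polynomial n^c. A simplified Python helper (my own illustration; it does not handle logarithmic factors or the regularity condition):

```python
import math

def master_case(a, b, c):
    """Classify T(n) = a*T(n/b) + n^c by comparing c with log_b(a).
    Returns the Theta bound as a string (polynomial f(n) only)."""
    w = math.log(a, b)                  # the critical exponent log_b(a)
    if c < w - 1e-9:
        return f"Theta(n^{w:.2f})"      # case 1: the leaves dominate
    if abs(c - w) <= 1e-9:
        return f"Theta(n^{c} log n)"    # case 2: all levels contribute equally
    return f"Theta(n^{c})"              # case 3: the root dominates
```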
Example: Let T(n) = 3T(n/4) + n·log n;
Then a = 3, b = 4; f(n) = n·log n, which grows at least as fast as n^1
But w = log4(3) ≈ 0.79, so n^w = n^0.79; this shows that c = 1 > log4(3),
so case 3 applies (and the regularity condition holds:
a·f(n/b) = 3·(n/4)·log(n/4) <= (3/4)·n·log n)
So T(n) = Θ(n·log n)
Exercise: try for T(n) = 2T(n/2) + nlogn (*****)
Examples
Given: T(n) = 2T(n/2) + n
1. Extract: a = 2, b = 2 and f(n) = n = n^1, so c = 1
2. Evaluate n^(logb(a)): n^(log2(2)) = n^1 = n
3. Since c = logb(a), case 2 applies: T(n) = Θ(n·log n)
Given: T(n) = 3T(n/4) + n·log n
1. Extract: a = 3, b = 4 and f(n) = n·log n
2. Evaluate n^(logb(a)): n^(log4(3)) = n^x with x ≈ 0.79 < 1
3. f(n) is polynomially larger than n^x, so case 3 applies: T(n) = Θ(n·log n)
MASTER THEOREM Example: Let T(n) = 9T(n/3) + n;
Then a = 9, b = 3; f(n) = n = n^1, so c = 1; w = log3(9) = 2 and n^w = n^2
Since c < w, case 1 applies: T(n) = Θ(n^2)
Examples, Case 1: confirm the following
T(n) = 2T(n/2) + 1; T(n) = Θ(n1)
T(n) = 4T(n/2) + 1; T(n) = Θ(n2)
T(n) = 4T(n/2) + n1; T(n) = Θ(n2)
T(n) = 8T(n/2) + n2; T(n) = Θ(n3)
T(n) = 16T(n/2) +n2; T(n) = Θ(n4)
Examples, Case 2: confirm the following
T(n) = T(n/2) + 1;            T(n) = Θ(log n)
T(n) = 2T(n/2) + n;           T(n) = Θ(n·log n)
T(n) = 2T(n/2) + n·log n;     T(n) = Θ(n·log^2 n)
T(n) = 4T(n/2) + n^2;         T(n) = Θ(n^2·log n)
T(n) = 4T(n/2) + (n·log n)^2; T(n) = Θ((n·log n)^2·log n)
T(n) = 2T(n/2) + n/log n;     T(n) = Θ(n·log log n)
T(n) = 2T(n/2) + n/log^2 n;   T(n) = Θ(n)
Examples, Case 3: confirm the following
T(n) = T(n/2) + n;            T(n) = Θ(n)
T(n) = 2T(n/2) + n^2;         T(n) = Θ(n^2)
T(n) = 2T(n/2) + n^2·log n;   T(n) = Θ(n^2·log n)
T(n) = 4T(n/2) + n^3·log^2 n; T(n) = Θ(n^3·log^2 n)
T(n) = 2T(n/2) + n^2/log n;   T(n) = Θ(n^2/log n)
Solve the following recurrence relations:
T(n) = T(n-1) + 5, n > 1, T(1) = 0
T(n) = 3T(n-1), n > 1, T(1) = 4
T(n) = T(n-1) + n, n > 0, T(0) = 0
T(n) = T(n/2) + n, n > 1, T(1) = 1 (solve for n = 2^k)
T(n) = T(n/3) + n, n > 1, T(1) = 1 (solve for n = 3^k)
Set up a recursive algorithm based on 2^n = 2^(n-1) + 2^(n-1)
SORTING
Selection Sort
Process
➔ Scan the entire list to find its smallest element;
➔ Exchange it with the first element, putting the smallest element
in its final position;
➔ Repeat on the rest of the list: find the smallest among the last
n − 1 elements and place it second, and so on, until the list is sorted.
Algorithm
SelectionSort(A[0..n − 1])
//Sorts a given array by selection sort
//Input: An array A[0..n − 1] of orderable elements
//Output: Array A[0..n − 1] sorted in nondecreasing order
for i ← 0 to n − 2 do
    min ← i
    for j ← i + 1 to n − 1 do
        if A[j] < A[min] then min ← j
    swap A[i] and A[min]
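A direct Python translation of the pseudocode (for illustration; the function name is mine):

```python
def selection_sort(a):
    """In-place selection sort, mirroring the pseudocode above."""
    n = len(a)
    for i in range(n - 1):
        m = i                       # index of the smallest seen so far
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]     # put the smallest in position i
    return a
```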
Complexity
The basic operation is the comparison A[j] < A[min]; it executes
C(n) = Σ(i=0..n−2) Σ(j=i+1..n−1) 1 = n(n − 1)/2 times,
so selection sort is Θ(n^2) on every input (though it makes only
O(n) swaps).
QuickSort
QuickSort Process
Divide
Partition (rearrange) the array A[p..r] into two (possibly empty)
subarrays A[p..q − 1] and A[q + 1..r] such that each element of
A[p..q − 1] is less than or equal to A[q], which is, in turn, less
than or equal to each element of A[q + 1..r]. Compute the index
q as part of this partitioning procedure.
Conquer
Sort the two subarrays A[p .. q- 1] and A[q+1 .. r] by recursive
calls to quicksort
Combine
Because the subarrays are already sorted, no work is needed
to combine them: the entire array A[p .. r] is now sorted.
QuickSort Algorithm
Quicksort(A[l..r]) //Sorts a subarray by quicksort
//Input: Subarray of array A[0..n − 1], defined by its left and right indices l and r
//Output: Subarray A[l..r] sorted in nondecreasing order
if l < r
    s ← Partition(A[l..r]) //s is a split position
    Quicksort(A[l..s − 1])
    Quicksort(A[s + 1..r])

ALGORITHM Partition(A[l..r])
//Partitions by Hoare's algorithm, using the first element as a pivot
//Input: Subarray of array A[0..n − 1], defined by its left and right indices l and r (l < r)
//Output: Partition of A[l..r], split position returned as this function's value
p ← A[l]
i ← l; j ← r + 1
repeat
    repeat i ← i + 1 until A[i] ≥ p
    repeat j ← j − 1 until A[j] ≤ p
    swap(A[i], A[j])
until i ≥ j
swap(A[i], A[j]) //undo last swap, made when i ≥ j
swap(A[l], A[j])
return j
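A Python translation of the Hoare-style scheme (my own sketch; it adds an explicit bound check on the left-to-right scan, which the pseudocode leaves implicit, and avoids the undo-swap by swapping only while i < j):

```python
def hoare_partition(a, l, r):
    """Partition a[l..r] around pivot a[l]; return the split index."""
    p = a[l]
    i, j = l, r + 1
    while True:
        i += 1
        while i < r and a[i] < p:   # guard so i cannot run past r
            i += 1
        j -= 1
        while a[j] > p:             # a[l] == p stops this scan at l at worst
            j -= 1
        if i >= j:
            break
        a[i], a[j] = a[j], a[i]
    a[l], a[j] = a[j], a[l]         # put the pivot at the split position
    return j

def quicksort(a, l=0, r=None):
    if r is None:
        r = len(a) - 1
    if l < r:
        s = hoare_partition(a, l, r)
        quicksort(a, l, s - 1)
        quicksort(a, s + 1, r)
    return a
```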
An alternative formulation, using the last element as the pivot:
procedure quickSort(left, right)
    if right − left <= 0
        return
    else
        pivot = A[right]
        partition = partitionFunc(left, right, pivot)
        quickSort(left, partition − 1)
        quickSort(partition + 1, right)
    end if
end procedure
QuickSort Algorithm Complexity
Best case (balanced partitions):
T(n) = 2T(n/2) + n for n > 1, T(1) = 0;
by case 2 of the Master Theorem, T(n) = Θ(n·log n).
Worst case (one-sided partitions, e.g. an already sorted array with the
first element as pivot): T(n) = T(n − 1) + n, which gives Θ(n^2).
MergeSort
● Divide
Divide the n-element sequence to be sorted into two
subsequences of n/2 elements each.
● Conquer
Sort the two subsequences recursively using merge sort.
● Combine
Merge the two sorted subsequences to produce the sorted
answer.
MergeSort Algorithm
ALGORITHM Merge(B[0..p − 1], C[0..q − 1], A[0..p + q − 1])
//Merges two sorted arrays into one sorted array
//Input: Arrays B[0..p − 1] and C[0..q − 1] both sorted
//Output: Sorted array A[0..p + q − 1] of the elements of B and C
i ← 0; j ← 0; k ← 0
while i < p and j < q do
    if B[i] ≤ C[j]
        A[k] ← B[i]; i ← i + 1
    else A[k] ← C[j]; j ← j + 1
    k ← k + 1
if i = p
    copy C[j..q − 1] to A[k..p + q − 1]
else copy B[i..p − 1] to A[k..p + q − 1]
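A Python translation of Merge, plus a top-level mergesort driver (the driver is my own addition; the slides' top-level pseudocode did not survive extraction):

```python
def merge(b, c):
    """Merge two sorted lists into one sorted list (mirrors the pseudocode)."""
    a, i, j = [], 0, 0
    while i < len(b) and j < len(c):
        if b[i] <= c[j]:
            a.append(b[i]); i += 1
        else:
            a.append(c[j]); j += 1
    # Copy whichever input still has elements left.
    a.extend(b[i:] if i < len(b) else c[j:])
    return a

def mergesort(a):
    """Divide in half, sort each half recursively, then merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(mergesort(a[:mid]), mergesort(a[mid:]))
```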
TIME COMPLEXITY
Divide
The divide step just computes the middle of the subarray,
which takes constant time. Thus D(n) = Θ(1).
Conquer and combine: the two recursive calls cost 2T(n/2) and the
merge costs Θ(n). Rewriting, we have:
T(n) = Θ(1) if n = 1; T(n) = 2T(n/2) + Θ(n) if n > 1
The recursion tree has lg n + 1 levels, each costing cn.
Total: cn·lg n + cn, so T(n) = O(n·log n)
SELECTION
SELECTION ALGORITHM
This is an algorithm for finding the kth smallest number in a list
or array.
Let s = <e1 , . . . , en> be a sequence
and let s' = <e1' , . . . , en'> be the sorted version of it;
the k-th smallest element of s is then ek'.
SELECTION ALGORITHM- COMPLEXITY
Worst-case running time
For simplicity, assume that n is a multiple of 5 and ignore
ceiling and floor functions.
The number of items less than or equal to the median of
medians is at least 3n/10 in this context.
These are the first three items in each of the groups whose median
is less than or equal to the median of medians.
Symmetrically, the number of items greater than or equal to
the median of medians is at least 3n/10 .
The first recursion works on a set of n/5 medians, and the
second recursion works on a set of at most 7n/ 10 items.
We have:
T(n) <= n + T(n/5) + T(7n/10), that is O(n)
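The median-of-medians scheme analysed above can be sketched in Python (my own illustration; small inputs are sorted directly, and the three-way partition handles duplicates):

```python
def select(s, k):
    """Return the k-th smallest element of s (k = 1 is the minimum),
    using the median of medians as the pivot."""
    if len(s) <= 5:
        return sorted(s)[k - 1]
    # Median of each group of 5, then recurse for the median of medians.
    groups = [s[i:i + 5] for i in range(0, len(s), 5)]
    medians = [sorted(g)[len(g) // 2] for g in groups]
    pivot = select(medians, (len(medians) + 1) // 2)
    lo = [x for x in s if x < pivot]
    eq = [x for x in s if x == pivot]
    hi = [x for x in s if x > pivot]
    if k <= len(lo):
        return select(lo, k)
    if k <= len(lo) + len(eq):
        return pivot
    return select(hi, k - len(lo) - len(eq))
```

The first recursive call works on the n/5 medians and the second on at most 7n/10 items, matching the recurrence above.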
This bound, T(n) <= n + T(n/5) + T(7n/10) = O(n), can be proved
using induction:
Assume T(m) <= c·m for all m < n, with c a large enough constant;
then T(n) <= n + (c/5)·n + (7c/10)·n = (1 + 9c/10)·n.
Taking c >= 10 gives 1 + 9c/10 <= c, so T(n) <= c·n.
CSC 311 DESIGN AND ANALYSIS OF ALGORITHMS
EXERCISES
(1)Estimate the running time of a program that has 2000 lines of
sequential code of a procedural language.
(2) Estimate the running time of a program that scans the input two
times.
(3)Estimate the running time of a program that adds two nxn
matrices.
(4)Estimate the running time of a program that multiplies two nxn
matrices.
(5)Estimate the running time of a program that uses binary search
to locate an item in a sorted array of size n.
(6) Estimate the running time of a program that requests n
integers and displays their squares.
(7)Estimate the running time of a program that uses bubble sort
technique to sort an array of n unsorted numbers.
(8)Estimate the running time of a program controlled by the loop:
for (i=1; i<n; i++)
(9)Estimate the running time of a program controlled by the loop:
for (i=n; i>1; i--)
(1)Estimate the running time of a program controlled by the loop:
for (i=1; i<n; i=i*2).
(2)Estimate the running time of a program controlled by the loop:
for (i=1; i<n; i=i*4).
(3)Estimate the running time of a program controlled by the loop:
for (i=n; i>1; i=i/2).
(4) Define recurrence relation and give an example.
(5)State the recurrence relations for Fibonacci series and Tower of
Hanoi.
(6) Give a general formula of a linear recurrence equation.
(7)Work out the characteristic equation for the recurrence relation:
Rn = A·Rn-1 + B·Rn-2, for A, B being real numbers
(8)Give the steps for solving a recurrence relation
(9)Discuss the methods used to solve recurrence relations giving
examples in each case.
(10)Describe the substitution method
(11)Describe the iteration method
(12)Describe the recursion tree method
(1)Describe the master method
(2)Solve T(n) =9T(n/3) using Master Theorem
(3)Solve T(n) =T(2n/3) + 1 using Master Theorem
(4)Solve T(n) =8T(n/2) +1000n2 using Master Theorem
(5)Solve T(n) =n2T(n/2) + n2 using Master Theorem
(6)Solve T(n) =64T(n/8) - n2 logn using Master Theorem
(7)Solve T(n) =4T(n/3) + n2 using Master Theorem
(8)Set up a recursive algorithm based on 2^n = 2^(n-1) + 2^(n-1)
(9)Describe selection sort
(10)Implement selection sort
(11)Discuss the complexity of selection sort
(12)Describe quicksort
(13)Implement quicksort
(14)Discuss the complexity of quicksort
(1)Describe MergeSort
(2)Implement MergeSort
(3)Discuss the complexity of MergeSort
(4)Describe Selection algorithm
(5)Implement Selection algorithm
(6)Discuss the complexity of Selection algorithm