
CSC 311 DESIGN AND ANALYSIS OF ALGORITHMS

SESSION TOPICS
Time Complexities for some general algorithms
Recurrence relations
Recurrences by substitution
Recursion tree method
Master Method

Sorting
SelectionSort
QuickSort
MergeSort

Selection

Weeks-4,5,6,7 1
Time Complexities for some algorithms

y ← x^3 + 10x − 20    // assign, raise to power, multiply, add, subtract
                      // 5 operations; T(n) = 5 = constant = O(1)

func add(x,y){        // T(n)
    return x+y;       // 1
}                     // So T(n) = 1 = constant = O(1)

func sum(A,n){        // T(n)
    s=0;              // 1
    for i=1 to n{     // n+1
        s = s + A[i]; // n
    }
    return s;         // 1
}                     // T(n) = 2n+3 = O(n); linear
Time Complexities for some algorithms

func addMat(A,B,C){                  // T(n)
    for i=1 to n{                    // n+1
        for j = 1 to n{              // n(n+1)
            C[i,j]=A[i,j]+B[i,j];    // n*n = n^2
        }
    }
}                                    // T(n) = 2n^2 + 2n + 1 = O(n^2); quadratic

func mulMat(A,B,C){                  // T(n)
    for i=1 to n{                    // n+1
        for j = 1 to n{              // n(n+1)
            C[i,j]=0;                // n*n = n^2
            for k=1 to n{            // n^2(n+1)
                C[i,j]+=A[i,k]*B[k,j];  // n*n*n = n^3
            }
        }
    }
}                                    // T(n) = 2n^3 + 3n^2 + 2n + 1 = O(n^3); cubic
Time Complexities for some algorithms

func bsearch(A,x,n){                 // T(n)
    i=1; j=n;                        // 1+1
    while i <= j {                   // logn
        mid = (i+j)/2;               // logn
        if (x<A[mid]) j=mid-1;       // 0.5logn
        else if (x>A[mid]) i=mid+1;  // 0.5logn
        else return mid;
    }                                // T(n) = 3logn+2
}                                    // = O(logn); logarithmic
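The pseudocode above can be sketched in Python (a 0-indexed version, assuming A is sorted in nondecreasing order; it returns -1 when x is absent):

```python
def bsearch(A, x):
    """Binary search over a sorted list A; returns an index of x or -1."""
    i, j = 0, len(A) - 1
    while i <= j:              # runs O(log n) times: the range halves each pass
        mid = (i + j) // 2
        if x < A[mid]:
            j = mid - 1        # discard the right half
        elif x > A[mid]:
            i = mid + 1        # discard the left half
        else:
            return mid
    return -1
```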

func give(n){           // T(n)
    if (n > 0) {
        print(n);       // 1
        give(n-1);      // T(n-1)
        give(n-1);      // T(n-1)
    }                   // T(n) = 2T(n-1) + 1
}
This is a recurrence relation that needs to be solved; it will turn
out to be exponential.
Time Complexities for some algorithms

func rep1(n){               // T(n)
    for (i=1; i<n; i++) {   // n+1
        print(i);           // n
    }                       // T(n) = 2n+1
}                           // O(n)

func rep2(n){               // T(n)
    for (i=1; i<n; i=i+2) { // n/2
        print(i);           // n/2
    }                       // T(n) = n
}                           // O(n)
Time Complexities for some algorithms

func rep3(n){                 // T(n)
    for (i=0; i<n; i++) {     // n+1
        for (j=0; j<n; j++){  // n(n+1)
            x=x*j;            // n*n; some statement
        }
    }                         // T(n) = 2n^2 + 3n + 1
}                             // O(n^2)

func rep4(n){                 // T(n)
    for (i=0; i<n; i++) {     // trace i, j and the number of j-iterations
        for (j=0; j<i; j++){  // to see that the total time
            x=x*j;            // is 1 + 2 + 3 + … + n = n(n+1)/2
        }
    }                         // O(n^2)
}
Time Complexities for some algorithms

func rep5(n){               // T(n)
    p=0;                    // Trace:  i    p
    for (i=1; p<n; i++) {   //         1    0+1
        p = p + i;          //         2    1+2
    }                       //         k    1+2+..+k
}                           // p = k(k+1)/2 >= n, so k = O(n^(1/2)); O(n^(1/2))

func rep6(n){               // T(n)
    for (i=1; i<n; i=i*2) { // trace i to see that the total number of
        x=x*i;              // iterations is k, where n = 2^k,
    }                       // i.e. k = log2(n)
}                           // O(log2 n)
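A quick empirical check of rep5's O(n^(1/2)) bound (a sketch that returns the iteration count rather than timing anything; `rep5_iterations` is a name introduced here, not from the slides):

```python
def rep5_iterations(n):
    """Count the iterations of: p=0; for(i=1; p<n; i++) p += i."""
    p, i, count = 0, 1, 0
    while p < n:
        p += i       # p becomes 1, 1+2, 1+2+3, ... = k(k+1)/2 after k passes
        i += 1
        count += 1
    return count

# the final count k satisfies k(k+1)/2 >= n, so k grows like sqrt(2n)
```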
Time Complexities for some algorithms
func rep7(n){               // T(n)
    for (i=n; i>1; i=i/2) { // Trace i: n, n/2, n/2^2, n/2^3, …, n/2^k
        x=x+10;             // for n/2^k = 1, n = 2^k,
    }                       // k = log2(n)
}                           // O(log2 n)

func rep8(n){               // T(n)
    for (i=0; i<n; i++) {
        x=x*i;              // n; some statement
    }
    for (j=0; j<n; j++) {
        x=x*j;              // n; some statement
    }                       // T(n) = 2n
}                           // O(n)
Time Complexities for some algorithms
func rep9(n){               // T(n)
    p=0;
    for (i=1; i<n; i=i*2) { // p = logn
        p++;
    }
    for (j=1; j<p; j=j*2) { // logp, i.e. loglogn
        x=x*j;              // some statement
    }
}                           // O(loglogn)

Summary
for (i=1; i<n; i++)    ---- O(n)
for (i=1; i<n; i=i+2)  ---- O(n)
for (i=n; i>1; i--)    ---- O(n)
for (i=1; i<n; i=i*2)  ---- O(log2 n)
for (i=1; i<n; i=i*3)  ---- O(log3 n)
for (i=n; i>1; i=i/2)  ---- O(log2 n)
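The summary table can be checked empirically with a small iteration counter (a sketch; `count_iters` is a helper introduced here, not part of the lecture):

```python
def count_iters(start, cond, step):
    """Count the iterations of the generic loop: for (i=start; cond(i); i=step(i))."""
    i, c = start, 0
    while cond(i):
        i = step(i)
        c += 1
    return c

n = 1 << 20
linear   = count_iters(1, lambda i: i < n, lambda i: i + 1)   # about n iterations
doubling = count_iters(1, lambda i: i < n, lambda i: i * 2)   # log2(n) iterations
halving  = count_iters(n, lambda i: i > 1, lambda i: i // 2)  # log2(n) iterations
```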
RECURRENCE RELATIONS
Recall
Recurrence relation: an equation that recursively defines a
sequence or multidimensional array of values, once one or
more initial terms are given;
each further term of the sequence or array is defined as a
function of the preceding terms.
Example
un = g(n, un-1) for n > 0, where g : N × X → X is a function
and X is the set to which the elements of the sequence must
belong. For any u0 ∈ X this defines a unique sequence with
u0 as its first element, called the initial value.

RECURRENCE RELATIONS
Recall
Factorial: defined by the recurrence relation
n ! = n ( n − 1 ) ! for n > 0 , and the initial condition 0 ! = 1
Logistic map: xn+1 = r·xn(1 − xn), with a given constant r;
given the initial term x0, each subsequent term is determined
by this relation.

Fibonacci numbers: an example of a homogeneous linear
recurrence relation with constant coefficients (see below):
Fn = Fn-1 + Fn-2 with initial conditions (seed values) F0 = 0,
F1 = 1.

Solving a recurrence relation means obtaining a closed form:
a non-recursive function of n.

RECURRENCE RELATIONS
A little maths
Definition again:
A recurrence relation is an equation that recursively
defines a sequence where the next term is a function of
the previous terms (Expressing Fn as some combination
of Fi with i<n).
Examples:
Fibonacci series − Fn=Fn-1+Fn-2 ;with F0 = 0; F1=1
Tower of Hanoi − Fn=2Fn-1+1

RECURRENCE RELATIONS
A little maths
Linear Recurrence Relations
A linear recurrence equation of degree k (or order k) is a
recurrence equation of the form:
xn = A1·xn-1 + A2·xn-2 + A3·xn-3 + … + Ak·xn-k
(each Ai is a constant and Ak ≠ 0): a first-degree polynomial
in the preceding terms of the sequence.

RECURRENCE RELATIONS
A little maths
How to solve a linear recurrence relation
Suppose a second-order linear recurrence relation is:
Fn = A·Fn-1 + B·Fn-2, where A and B are real numbers.
Rearrange:
Fn − A·Fn-1 − B·Fn-2 = 0
Get the characteristic equation:
x^n − A·x^(n-1) − B·x^(n-2) = 0; dividing everything by x^(n-2)
we get x^2 − Ax − B = 0, a quadratic equation we can solve.
Possibilities: distinct roots; repeated (same) roots; complex roots
Distinct roots: (x − x1)(x − x2) = 0, so Fn = a·x1^n + b·x2^n is the solution

Same roots: (x − x1)^2 = 0, so Fn = a·x1^n + b·n·x1^n is the solution
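For example, applying the distinct-roots formula to the Fibonacci relation Fn = Fn-1 + Fn-2 (characteristic equation x^2 − x − 1 = 0) and fitting a, b to F0 = 0, F1 = 1 recovers Binet's closed form, which can be checked numerically:

```python
import math

# roots of the characteristic equation x^2 - x - 1 = 0
x1 = (1 + math.sqrt(5)) / 2
x2 = (1 - math.sqrt(5)) / 2
# F0 = a + b = 0 and F1 = a*x1 + b*x2 = 1 give a = 1/sqrt(5), b = -1/sqrt(5)
a, b = 1 / math.sqrt(5), -1 / math.sqrt(5)

def fib(n):
    """Closed form F(n) = a*x1^n + b*x2^n, rounded to the nearest integer."""
    return round(a * x1**n + b * x2**n)
```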


RECURRENCE RELATIONS
Back to complexity
Steps of a recurrence relation
Basic step: also called the initial or base condition; one or more
constants that terminate the recurrence
Recursive step: generates new terms from earlier terms; the next
term is obtained from the preceding k values, i.e. fn-1, fn-2, fn-3, … fn-k.
For the Fibonacci sequence F0, F1, F2, ….. we have:

Fn = { 0            if n = 0
     { 1            if n = 1
     { Fn-1 + Fn-2  if n >= 2

Similarly for the factorial:

n! = { 1            if n = 0
     { n·(n-1)!     if n > 0
RECURRENCE RELATIONS
Solving recurrence relations
METHODS
There are four methods that can be used to solve the recurrence
equation:
1. The Substitution Method (Guess the solution & verify by
Induction)
2. Iteration Method (unrolling and summing)
3. The Recursion-tree method
4. Master method

RECURRENCE RELATIONS
Solving recurrence relations
The Substitution Method
In this method one guesses a bound and applies mathematical
induction to prove that the guess is correct.
Steps
Step1: Guess the form of the Solution.
Step2: Use Mathematical Induction to prove the correctness of
the guess.

Example 1
Solve the following recurrence by using substitution method.
T(n) = 2T(n/2) + n

RECURRENCE RELATIONS
Solving recurrence relations
The Substitution Method
Example 1 continues
Solve the following recurrence by using substitution method.
T(n) = 2T(n/2) + n
Step1- guess
Due to n/2 it is suggestive of nlog, so guess T(n) = O(nlogn)
ie. T(n) <= c*nlogn
Step2- mathematical induction
Apply mathematical Induction to prove the guess.
Base cases:
Let n=1: Given that T(1) = 1, we find that T(1) <= c·1·log 1 = 0, which
is a contradiction; so n = 1 cannot serve as the base case.
RECURRENCE RELATIONS
Solving recurrence relations
The Substitution Method
Example 1 continues
Solve the following recurrence by using substitution method.
T(n) = 2T(n/2) + n
Step2- mathematical induction
Base cases:
Let n=1: T(1) <= c·1·log 1 = 0 contradicts T(1) = 1, so n = 1 cannot
serve as the base case.
Let n=2: T(2) <= c·2·log 2 = 2c;
from the equation, T(2) = 2T(2/2) + 2 = 2T(1) + 2 = 2 + 2 = 4 <= 2c,
which holds for c >= 2.
Induction step
Assume the bound holds for n/2, i.e. T(n/2) <= c·(n/2)·log(n/2)
RECURRENCE RELATIONS
Solving recurrence relations
The Substitution Method
Example 1 continues
Solve the following recurrence by using substitution method.
T(n) = 2T(n/2) + n
Step2- mathematical induction
Induction step: Assume the bound holds for n/2, i.e. T(n/2) <= c·(n/2)·log(n/2)
Prove that it holds for n: that is T(n) <= c.nlogn
But T(n) = 2T(⌊n/2⌋) + n <= 2c·⌊n/2⌋·log(⌊n/2⌋) + n
<= cn·log(n/2) + n = cn·logn − cn·log2 + n = cn·logn − cn + n
<= cn·logn for every c >= 1. So by induction T(n) = O(nlogn).
Drawback of the method: coming up with the correct guess is not
generally easy

RECURRENCE RELATIONS
Solving recurrence relations : METHODS- 2
ITERATION METHOD
The given recurrence is substituted back into itself several times.
Steps
➔ Expand the recurrence through substitution
➔ Express the expansion as a summation by plugging the recurrence
back into itself, seeking a pattern
➔ Work out the total sum based on an arithmetic or geometric series
• Example 2.1: T(1) = b; T(n) = c + T(n-1) for n > 1
• Solution
• T(1) = b as given, and T(n) = c + T(n-1), also given
• Expanding once: T(n) = c + (c + T(n-2)) = 2c + T(n-2)
• Expanding again: T(n) = 2c + c + T(n-3) = 3c + T(n-3) ………..
• After k steps: T(n) = kc + T(n-k); for k = n-1 this gives
T(n) = (n-1)c + T(1) = nc − c + b = O(n)
RECURRENCE RELATIONS
Solving recurrence relations : METHODS- 2
ITERATION METHOD
Example 2.2: T(1) = a; T(n) = T(n/2) + n for n > 1
• Solution
• T(n) = n + T(n/2)
• = n + n/2 + T(n/4)
• = n + n/2 + n/4 + T(n/8) ………..
• = n + n/2 + n/4 + n/8 + … + n/2^(k-1) + T(n/2^k)
• At the end T(n/2^k) = T(1), so n/2^k = 1 and k = log2(n)
• We have a geometric series:
n + n/2 + n/4 + n/8 + ….. + n/2^(k-1) + T(1)
• = n·(1 − (1/2)^k)/(1 − 1/2) + a = 2n·(1 − (1/2)^(log2 n)) + a
• = 2n·(1 − 1/n) + a = 2n − 2 + a = O(n)
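For n a power of 2 the result of the iteration can be checked directly against the recurrence (a sketch; the base constant is written here as `a`, taken as 1 by default):

```python
def T(n, a=1):
    """T(1) = a; T(n) = n + T(n/2), evaluated for n a power of 2."""
    return a if n == 1 else n + T(n // 2, a)

# the iteration above gives the closed form T(n) = 2n - 2 + a
```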

RECURRENCE RELATIONS
Solving recurrence relations
METHODS-3: The Recursion-tree method
A tree is used to trace the steps iteratively and visually; it is very
convenient. The recurrence is expanded until the boundary
conditions are reached.
General: T(n) = aT(n/b) + f(n); place f(n) at the root and spread
T(n/b) a times as its children

Example 1: solve T(n) = 2T(n/2)+n

Place the n at the root; for simplicity replace T(n/2) by n/2

RECURRENCE RELATIONS
Solving recurrence relations
METHODS-3: The Recursion-tree method
Example 1: solve T(n) = 2T(n/2)+n
Place the n at the root; for simplicity replace T(n/2) by n/2

RECURRENCE RELATIONS
Solving recurrence relations
METHODS-3: The Recursion-tree method
Example 1: solve T(n) = 2T(n/2)+n
The level costs each add to n; the total cost is therefore n+n+….+n

The sequence of subproblem sizes:
n, n/2, n/2^2, n/2^3, ….., n/2^k
The last level is 1, so n/2^k = 1,
hence n = 2^k and k = log2(n)

Total time requirement estimate:
n + n + n + ... (k terms) = n·k
= n·log2(n)
So T(n) = O(nlog2n)

RECURRENCE RELATIONS
Solving recurrence relations
METHODS-3: The Recursion-tree method
Example 2: solve T(n) = T(n/3) +T(2n/3) +n

RECURRENCE RELATIONS
Solving recurrence relations
METHODS-3: The Recursion-tree method
Example 2: solve T(n) = T(n/3) + T(2n/3) + n
The longest-path sequence of sizes:
n, (2/3)n, (2/3)^2·n, (2/3)^3·n, ……, 1
So (2/3)^k·n = 1, giving k = log3/2(n); k is the height of the tree
Total time estimate:
n+n+n+….+n (k times) = n·log3/2(n)
But n·log3/2(n) = (nlog2n)/(log2(3/2)) = c·nlog2n
So T(n) = O(nlog2n)
For the lower bound, take the shortest path:
n, n/3, n/3^2, n/3^3, ….., n/3^k
So n/3^k = 1, k = log3(n), which is the height along this path
Estimate:
n+n+n+ ….. +n (k times) = n·log3(n)
log3(n) = (log2n)/(log2 3), so T(n) = Ω(nlog2n)
So T(n) = Θ(nlog2n), since it is both O and Ω of the same order.
RECURRENCE RELATIONS
Solving recurrence relations
METHODS-3: The Recursion-tree method
Example 3: solve T(n) = 2T(n-1) +1; T(1)=1; T(2)=3; Tower of Hanoi

RECURRENCE RELATIONS
Solving recurrence relations
METHODS-3: The Recursion-tree method
Example 3: solve T(n) = 2T(n-1) + 1; T(1)=1; T(2)=3; Tower of Hanoi
Last level: n − (n-1) = 1; the number of levels also corresponds
to the height of the tree
Total cost:
T(n) = 1 + 2 + 2^2 + 2^3 + …... + 2^(n-1) = (2^n − 1)/(2 − 1) = 2^n − 1
T(n) = O(2^n)
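The closed form 2^n − 1 can be confirmed by generating the actual move sequence for the Tower of Hanoi (a minimal sketch):

```python
def hanoi(n, src="A", aux="B", dst="C"):
    """Return the list of moves (from_peg, to_peg) transferring n disks src -> dst."""
    if n == 0:
        return []
    # move n-1 disks out of the way, move the largest, move the rest back on top
    return (hanoi(n - 1, src, dst, aux)
            + [(src, dst)]
            + hanoi(n - 1, aux, src, dst))
```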

Exercises- solve the following recurrence relations


1)T(n)= 3T(n/2) + 1; use iteration method
2)T(n) = 4T(n/2) + n; use recursion tree method
3)T(n) = 3T(n/2) + n ; use recursion tree method
4)T(n) = 2T(n/2) + n2 ; use recursion tree method
5)T(n) = T(n/2) + T(n/4) + T(n/8) + n; use recursion
tree method
6)Write programs implementing factorial, Fibonacci
sequence and Tower of Hanoi and benchmark times for
n=5,10,15.
RECURRENCE RELATIONS
Solving recurrence relations
MASTER METHOD
Asymptotically positive function: f(n) for which there is some n0,
such that f(n)>0 for all n>n0
Types of problems solved:
T(n) = aT(n/b) + f(n), where a, b are constants and a>=1 and b > 1,
f(n) is asymptotically positive function
Note that there are a subproblems, each of size n/b
Each of the a subproblems takes T(n/b) and is solved recursively
The function f(n) gives the cost of dividing and combining the
subproblems
n/b should be an integer; otherwise take the ceiling or the floor
a and b are natural numbers.

RECURRENCE RELATIONS
Solving recurrence relations
MASTER METHOD
It is therefore a utility method for analyzing recurrence equations
It is used in many cases for divide-and-conquer algorithms
Format for recurrence relations:
T(n) = aT(n/b) + f(n)
Where:
a, b are constants with a >= 1 and b > 1,
n is the size of the current problem
a is the number of subproblems in the recursion
n/b is the size of the subproblems; n/b should be an
integer, otherwise take the ceiling or the floor
f(n) is the cost of work done outside the recursive calls,
such as dividing and combining the subproblems
RECURRENCE RELATIONS
Solving recurrence relations
MASTER METHOD
MASTER THEOREM
Let T(n) = aT(n/b) + f(n), where a, b are constants with a >= 1 and
b > 1, f(n) is an asymptotically positive function, and n/b is a
positive integer (otherwise its ceiling or floor is taken).
Then T(n) can be bounded asymptotically as follows.

There are the following three cases:

1. If f(n) = Θ(n^c) where c < logb(a) then T(n) = Θ(n^(logb(a)))
2. If f(n) = Θ(n^c) where c = logb(a) then T(n) = Θ(n^c·logn)
3. If f(n) = Θ(n^c) where c > logb(a) then T(n) = Θ(f(n))

RECURRENCE RELATIONS
Solving recurrence relations
MASTER THEOREM - three cases:
1. If f(n) = Θ(n^c) where c < logb(a) then T(n) = Θ(n^(logb(a)))

2. If f(n) = Θ(n^c) where c = logb(a) then T(n) = Θ(n^c·logn)

3. If f(n) = Θ(n^c) where c > logb(a) then T(n) = Θ(f(n))

RECURRENCE RELATIONS
Solving recurrence relations: MASTER THEOREM
T(n) = aT(n/b) + f(n), a>=1 and b > 1, f(n) is extra cost ; n/b is a
positive integer (or floor or ceiling), (Version 2):-

(The three cases of Version 2 were given here as a figure.)
RECURRENCE RELATIONS
Solving recurrence relations
MASTER METHOD
MASTER THEOREM: Hint on Applying the Master Theorem
T(n) = aT(n/b) + f(n)

1. Extract a, b and f(n) from the given recurrence equation

2. Use the values of a, b to evaluate n^(logb(a))

3. Compare f(n) with what you got in 2 above, i.e. n^(logb(a))

4. Identify the appropriate case of the Master Theorem:
i.e. f(n) < n^(logb(a)) for case 1,

f(n) = n^(logb(a)) for case 2, OR

f(n) > n^(logb(a)) for case 3, provided af(n/b) <= kf(n) for some k < 1
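The comparison in steps 2–4 can be automated for the common case f(n) = Θ(n^c) (a sketch; the extended log-factor cases and the case-3 regularity check are not modeled here):

```python
import math

def master(a, b, c):
    """Classify T(n) = a*T(n/b) + Theta(n^c) by the basic Master Theorem."""
    w = math.log(a, b)                  # critical exponent log_b(a)
    if math.isclose(c, w):
        return f"Theta(n^{w:g} log n)"  # case 2
    if c < w:
        return f"Theta(n^{w:g})"        # case 1
    return f"Theta(n^{c:g})"            # case 3 (regularity assumed to hold)
```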
RECURRENCE RELATIONS
Solving recurrence relations
MASTER METHOD
MASTER THEOREM
Example: Let T(n) = 3T(n/4) + nlogn;
Then a=3, b=4; f(n) = nlogn, which grows at least as fast as n^1
But w = log4(3) ≈ 0.79; so n^w = n^0.79, which shows that f(n)
grows polynomially faster than n^(log4(3))

So f(n) = nlogn = Ω(n^(w+e)), where e ≈ 0.21;

apply case 3 [If f(n) = Θ(n^c) where c > logb(a) then T(n) = Θ(f(n))];

So T(n) = Θ(nlogn)
Exercise: try for T(n) = 2T(n/2) + nlogn (*****)

RECURRENCE RELATIONS
Solving recurrence relations
MASTER METHOD
MASTER THEOREM
Examples
Given: T(n) = 2T(n/2) + n
1. Extract: a=2, b=2 and f(n) = n = n^1, so c = 1
2. Evaluate n^(logb(a)): we have n^(log2(2)) = n^1 = n

3. Compare f(n) with what you got in 2 above, i.e. n^(log2(2)) = n^1 = n

4. Identify the appropriate case of the Master Theorem: they are the
same, so Case 2: f(n) = n^(logb(a))

Applying case 2 [If f(n) = Θ(n^y) where y = logb(a) then T(n) =
Θ(n^y·logn)] we have T(n) = Θ(n^1·logn) = Θ(nlogn)
RECURRENCE RELATIONS
Solving recurrence relations
MASTER THEOREM: Examples
Given: T(n) = 9T(n/3) + n
1. Extract: a=9, b=3 and f(n) = n
2. Evaluate n^(logb(a)): we have n^(log3(9)) = n^2

3. Compare f(n) with what you got in 2 above, i.e. n^(log3(9)) = n^2

4. Identify the appropriate case of the Master Theorem: f(n) is
smaller, so Case 1
Applying case 1 [If f(n) = Θ(n^c) where c < logb(a) then T(n) = Θ(n^(logb(a)))]
we have T(n) = Θ(n^2)

RECURRENCE RELATIONS
Solving recurrence relations
MASTER THEOREM: Examples
Given: T(n) = 3T(n/4) + nlogn
1. Extract: a=3, b=4 and f(n) = nlogn
2. Evaluate n^(logb(a)): we have n^(log4(3)) = n^x with x < 1

3. Compare f(n) = nlogn with n^(log4(3)) = n^x, x < 1

4. Identify the appropriate case of the Master Theorem: f(n) is
larger, so Case 3
Applying case 3 [If f(n) = Θ(n^c) where c > logb(a) then T(n) = Θ(f(n))], check
af(n/b) <= kf(n) for some k < 1: 3(n/4)log(n/4) <= k·nlogn is true for k = 3/4;
so we apply case 3 and have T(n) = Θ(nlogn)

RECURRENCE RELATIONS
Solving recurrence relations
MASTER METHOD
MASTER THEOREM Example: Let T(n) = 9T(n/3) + n;
Then a=9, b=3; f(n) = n = n^1, so c = 1; w = log3(9) = 2; n^w = n^2

So f(n) = O(n^(w-e)), where e = 1;

By case 1 [If f(n) = Θ(n^c) where c < logb(a) then T(n) = Θ(n^(logb(a)))],
T(n) = Θ(n^2)
Exercise: try for T(n) = 8T(n/2) + 1000n^2
Example
Let T(n) = T(2n/3) + 1; a=1; b=3/2; f(n) = 1 = n^c with c = 0;
and w = logb(a) = log3/2(1) = 0, so c = w: apply Case 2
[If f(n) = Θ(n^c) where c = logb(a) then T(n) = Θ(n^c·logn)].

So T(n) = Θ(n^0·logn) = Θ(logn)

RECURRENCE RELATIONS
Solving recurrence relations
MASTER THEOREM
Example Case 1: Confirm the following
T(n) = 2T(n/2) + 1;       T(n) = Θ(n)
T(n) = 4T(n/2) + 1;       T(n) = Θ(n^2)
T(n) = 4T(n/2) + n;       T(n) = Θ(n^2)
T(n) = 8T(n/2) + n^2;     T(n) = Θ(n^3)
T(n) = 16T(n/2) + n^2;    T(n) = Θ(n^4)

RECURRENCE RELATIONS
Solving recurrence relations
MASTER THEOREM
Example Case 2: Confirm the following
T(n) = T(n/2) + 1;            T(n) = Θ(logn)
T(n) = 2T(n/2) + n;           T(n) = Θ(nlogn)
T(n) = 2T(n/2) + nlogn;       T(n) = Θ(nlog^2 n)
T(n) = 4T(n/2) + n^2;         T(n) = Θ(n^2·logn)
T(n) = 4T(n/2) + (nlogn)^2;   T(n) = Θ((nlogn)^2·logn)
T(n) = 2T(n/2) + n/logn;      T(n) = Θ(n·loglogn)
T(n) = 2T(n/2) + n/log^2 n;   T(n) = Θ(n)

RECURRENCE RELATIONS
Solving recurrence relations
MASTER THEOREM
Example Case 3: Confirm the following
T(n) = T(n/2) + n;             T(n) = Θ(n)
T(n) = 2T(n/2) + n^2;          T(n) = Θ(n^2)
T(n) = 2T(n/2) + n^2·logn;     T(n) = Θ(n^2·logn)
T(n) = 4T(n/2) + n^3·log^2 n;  T(n) = Θ(n^3·log^2 n)
T(n) = 2T(n/2) + n^2/logn;     T(n) = Θ(n^2/logn)

RECURRENCE RELATIONS
Solving recurrence relations
MASTER THEOREM

Case 3 regularity violation

(An example where the case-3 regularity condition af(n/b) <= kf(n),
k < 1, fails was shown here as a figure.)
RECURRENCE RELATIONS
Solving recurrence relations
MASTER THEOREM EXERCISES
Use the Master Method to solve the following:
(The list of recurrences was given here as a figure.)
RECURRENCE RELATIONS
Solving recurrence relations
Solve the following recurrence relations
T(n) = T(n-1) + 5, n > 1, T(1) = 0
T(n) = 3T(n-1), n > 1, T(1) = 4
T(n) = T(n-1) + n, n > 0, T(0) = 0
T(n) = T(n/2) + n, n > 1, T(1) = 1 (solve for n = 2^k)
T(n) = T(n/3) + n, n > 1, T(1) = 1 (solve for n = 3^k)
Set up a recursive algorithm based on 2^n = 2^(n-1) + 2^(n-1)

SORTING
Selection Sort

Process
➔ Scan the entire list to find its smallest element;
➔ Exchange it with the first element, putting the smallest
element in its final position in the sorted list.
➔ Then scan the list, starting with the second element,
to find the smallest among the last n − 1 elements and
exchange it with the second element, putting the
second smallest element in its final position.
➔ Repeat until all the elements are in their correct
places.

SORTING
Selection Sort

Algorithm

SelectionSort(A[0..n − 1])
//Sorts a given array by selection sort
//Input: An array A[0..n − 1] of orderable elements
//Output: Array A[0..n − 1] sorted in nondecreasing order
for i ← 0 to n − 2 do
min ← i
for j ← i + 1 to n − 1 do
if A[j ] < A[min] min ← j
Swap A[i] and A[min]
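A direct Python rendering of the algorithm above (in place, nondecreasing order):

```python
def selection_sort(A):
    """Selection sort: repeatedly move the smallest remaining element into place."""
    n = len(A)
    for i in range(n - 1):
        m = i                     # index of the smallest element seen so far
        for j in range(i + 1, n):
            if A[j] < A[m]:
                m = j
        A[i], A[m] = A[m], A[i]   # one swap per outer pass: Theta(n) swaps total
    return A
```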

SORTING
Selection Sort
Complexity

The time complexity of selection sort is Θ(n^2), but the number
of key swaps is only Θ(n).
Exercise: Write a program that implements and times selection
sort runs. Keep experimental data on this.
SORTING
QuickSort

➢ An important sorting algorithm that is based on the
divide-and-conquer algorithmic approach.

➢ It divides elements according to their value, creating
partitions.

➢ A partition is an arrangement of the array’s elements so
that all the elements to the left of some element A[s] are
less than or equal to A[s], and all the elements to the right
of A[s] are greater than or equal to it.

SORTING
QuickSort Process

(The quicksort process was illustrated here with figures.)
SORTING
QuickSort Process
Divide
Partition (rearrange) the array A[p .. r] into two (possibly empty)
subarrays A[p .. q-1] and A[q+1 .. r] such that each element of
A[p .. q-1] is less than or equal to A[q], which is, in turn, less
than or equal to each element of A[q+1 .. r]. Compute the index
q as part of this partitioning procedure.

Conquer
Sort the two subarrays A[p .. q- 1] and A[q+1 .. r] by recursive
calls to quicksort

Combine
Because the subarrays are already sorted, no work is needed
to combine them: the entire array A[p .. r] is now sorted.

SORTING
QuickSort Algorithm
Quicksort(A[l..r]): //Sorts a subarray by quicksort
//Input: Subarray of array A[0..n − 1], defined by its left and right indices l and r
//Output: Subarray A[l..r] sorted in nondecreasing order
if l < r
s ←Partition(A[l..r]) //s is a split position
Quicksort(A[l..s − 1])
Quicksort(A[s + 1..r])
ALGORITHM Partition(A[l..r])
//Partitions by Hoare’s algorithm, using the first element as a pivot
//Input: Subarray of array A[0..n − 1], defined by its left and right indices l and
r (l < r)
//Output: Partition of A[l..r], split position returned as this function’s value
p ← A[l]
i ← l; j ← r + 1
repeat
repeat i ← i + 1 until A[i] ≥ p
repeat j ← j − 1 until A[j ] ≤ p
swap(A[i], A[j ])
until i ≥ j
swap(A[i], A[j ]) //undo last swap when i ≥ j
swap(A[l], A[j ])
return j
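A Python sketch of the scheme above (first element as pivot; the rightward scan is bounded by r so it cannot run off the array, a guard the pseudocode leaves implicit):

```python
def partition(A, l, r):
    """Partition A[l..r] around pivot A[l]; return the pivot's final index."""
    p = A[l]
    i, j = l, r + 1
    while True:
        i += 1
        while i <= r and A[i] < p:   # scan right for an element >= p
            i += 1
        j -= 1
        while A[j] > p:              # scan left for an element <= p (A[l] = p is a sentinel)
            j -= 1
        if i >= j:
            break
        A[i], A[j] = A[j], A[i]
    A[l], A[j] = A[j], A[l]          # place the pivot into its final slot
    return j

def quicksort(A, l=0, r=None):
    if r is None:
        r = len(A) - 1
    if l < r:
        s = partition(A, l, r)       # s is the split position
        quicksort(A, l, s - 1)
        quicksort(A, s + 1, r)
    return A
```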
SORTING
QuickSort Algorithm
procedure quickSort(left, right)

if right-left <= 0
return
else
pivot = A[right]
partition = partitionFunc(left, right, pivot)
quickSort(left,partition-1)
quickSort(partition+1,right)
end if

end procedure

SORTING
QuickSort Algorithm Complexity
Best
T(n) = 2T(n/2) + n, for n > 1, T(1) = 0

Using the Master Theorem, T(n) ∈ Θ(nlog2n);

Solving it exactly for n = 2^k gives T(n) = nlog2n.
Worst
T(n) = (n+1) + n + . . . + 3 = ((n+1)(n+2))/2 − 3 ∈ Θ(n^2)
Average
T(n) = (1/n) ∑ from s=0 to n−1 of [(n+1) + C_avg(s) + C_avg(n−1−s)], for n > 1,
T(0) = 0, T(1) = 0
T(n) ≈ 2n·ln n ≈ 1.39·nlog2n = Θ(nlog2n)

Exercise: implement QuickSort and experiment on the timing
with different input sets with numbers from 10, 50 and 500.

SORTING
MergeSort

● This is a good example of a successful application of the
divide-and-conquer technique.

● It sorts a given array A[0 .. n−1] by dividing it into two halves
A[0 .. ⌊n/2⌋−1] and A[⌊n/2⌋ .. n−1], sorting each of them
recursively, and then merging the two smaller sorted arrays
into a single sorted one.

● Mergesort is the method of choice for sorting linked lists and
is therefore frequently used in functional and logical
programming languages that have lists as their primary
data structure.

● Mergesort is basically optimal as far as the number of
comparisons is concerned, so it is also a good choice if
comparisons are expensive.
SORTING
MergeSort

● Divide
Divide the n-element sequence to be sorted into two
subsequences of n/2 elements each.

● Conquer
Sort the two subsequences recursively using merge sort.

● Combine
Merge the two sorted subsequences to produce the sorted
answer.

SORTING
MergeSort Process

(The mergesort process was illustrated here with figures.)
SORTING
MergeSort Algorithm
ALGORITHM Merge(B[0..p − 1], C[0..q − 1], A[0..p + q − 1])
//Merges two sorted arrays into one sorted array
//Input: Arrays B[0..p − 1] and C[0..q − 1] both sorted
//Output: Sorted array A[0..p + q − 1] of the elements of B and C
i ← 0; j ← 0; k ← 0
while i < p and j < q do
if B[i] ≤ C[j ]
A[k] ← B[i]; i ← i + 1
else A[k] ← C[j ]; j ← j + 1
k←k+1
if i = p
copy C[j..q − 1] to A[k..p + q − 1]
else copy B[i..p − 1] to A[k..p + q − 1]
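The Merge procedure, and a mergesort built on it, can be sketched as (a version that returns a new list rather than filling A in place):

```python
def merge(B, C):
    """Merge two sorted lists into one sorted list (the Merge procedure above)."""
    A, i, j = [], 0, 0
    while i < len(B) and j < len(C):
        if B[i] <= C[j]:
            A.append(B[i]); i += 1
        else:
            A.append(C[j]); j += 1
    A.extend(B[i:])              # one of these two tails is empty
    A.extend(C[j:])
    return A

def merge_sort(A):
    """Divide into halves, recursively sort each half, then merge."""
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    return merge(merge_sort(A[:mid]), merge_sort(A[mid:]))
```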

SORTING
MergeSort Algorithm
TIME COMPLEXITY

Divide
The divide step just computes the middle of the subarray,
which takes constant time. Thus D(n) = Θ(1).

Conquer: We recursively solve two subproblems, each of


size n/2, which contributes 2T(n/2) to the running time.

Combine: We have already noted that the MERGE


procedure on an n-element subarray takes time Θ(n) and
so C(n) = Θ(n)

The resulting recurrence: T(n) = Θ(1) if n = 1; T(n) = 2T(n/2) + Θ(n) if n > 1.

SORTING
MergeSort Algorithm
TIME COMPLEXITY

The resulting recurrence: T(n) = 2T(n/2) + Θ(n) for n > 1, T(1) = Θ(1).

Rewriting with a constant c we have: T(n) = 2T(n/2) + cn.

This can be solved using the recursion tree as shown below.

SORTING
MergeSort Algorithm

(The recursion tree was shown here as a figure; each level
contributes cn and there are lg n + 1 levels.)
SORTING
MergeSort Algorithm

Total: cn lg n + cn
O(nlogn)
SELECTION
SELECTION ALGORITHM
This is an algorithm for finding the k-th smallest number in a
list or array; such a number is called the k-th order statistic.

This includes the cases of finding the minimum, maximum,
and median elements.

Selection problems are easily reduced to sorting; however,
they do not require the full power of sorting.
SELECTION
SELECTION ALGORITHM
Let s = <e1, . . . , en> be a sequence
and let s’ = <e1’, . . . , en’> be the sorted version of it.

● Selection of the smallest element requires determining e1’;
selection of the smallest and the largest requires
determining e1’ and en’.
● Selection of the k-th largest requires determining ek’.
● Selection of the median refers to selecting the ⌊n/2⌋-th
largest element.
● Selection of the median and also of quartiles is a basic
problem in statistics.
● It is easy to determine the smallest, or the smallest and the
largest, element by a single scan of the sequence in linear
time.
● The k-th largest element can also be determined in linear time.
SELECTION
SELECTION ALGORITHM
1. Divide the n elements of the input array into ⌊n/5⌋ groups of 5
elements each and at most one group made up of the remaining
n mod 5 elements.
2. Find the median of each of the ⌈n/5⌉ groups by first insertion-sorting
the elements of each group (of which there are at most 5) and then
picking the median from the sorted list of group elements.
3. Use SELECT recursively to find the median x of the ⌈n/5⌉ medians
found in step 2. (If there is an even number of medians, then by our
convention, x is the lower median.)
4. Partition the input array around the median-of-medians x using
the modified version of PARTITION. Let k be one more than the
number of elements on the low side of the partition, so that x is
the k-th smallest element and there are n − k elements on the high
side of the partition.
5. If i = k, then return x. Otherwise, use SELECT recursively to find
the i-th smallest element on the low side if i < k, or the (i − k)-th
smallest element on the high side if i > k.
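The five steps above can be sketched in Python (a simplified version assuming distinct elements; i is 1-indexed, so select(A, 1) is the minimum):

```python
def select(A, i):
    """Return the i-th smallest element of A by median of medians (distinct elements)."""
    if len(A) <= 5:
        return sorted(A)[i - 1]
    # Steps 1-2: medians of groups of 5 (lower median for the leftover group)
    groups = [A[j:j + 5] for j in range(0, len(A), 5)]
    medians = [sorted(g)[(len(g) - 1) // 2] for g in groups]
    # Step 3: the median of the medians, found recursively
    x = select(medians, (len(medians) + 1) // 2)
    # Step 4: partition around x
    low = [e for e in A if e < x]
    high = [e for e in A if e > x]
    k = len(low) + 1                  # x is the k-th smallest
    # Step 5: recurse into the side that contains the answer
    if i == k:
        return x
    if i < k:
        return select(low, i)
    return select(high, i - k)
```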
SELECTION
SELECTION ALGORITHM - RANDOMIZED

(The randomized selection algorithm was shown here as a figure.)
SELECTION
SELECTION ALGORITHM- COMPLEXITY
Worst-case running time
For simplicity, assume that n is a multiple of 5 and ignore
ceiling and floor functions.
The number of items less than or equal to the median of
medians is at least 3n/10 in this context: these are the first
three items in the groups whose medians are less than or
equal to the median of medians.
Symmetrically, the number of items greater than or equal to
the median of medians is at least 3n/10.
The first recursion works on a set of n/5 medians, and the
second recursion works on a set of at most 7n/10 items.
We have:
T(n) <= n + T(n/5) + T(7n/10), which is O(n)

SELECTION
SELECTION ALGORITHM- COMPLEXITY
Worst-case running time
We have:
T(n) <= n + T(n/5) + T(7n/10), which is O(n); this can be proved
by induction.
Assume T(m) <= c·m for m < n, with c a large enough constant;
T(n) <= n + (c/5)·n + (7c/10)·n = (1 + 9c/10)·n
Taking c >= 10, we have T(n) <= c·n.

CSC 311 DESIGN AND ANALYSIS OF ALGORITHMS
EXERCISES
(1)Estimate the running time of a program that has 2000 lines of
sequential code of a procedural language.
(2) Estimate the running of a program that scans the input two
times.
(3)Estimate the running time of a program that adds two nxn
matrices.
(4)Estimate the running time of a program that multiplies two nxn
matrices.
(5)Estimate the running time of a program that uses binary search
to locate an item in a sorted array of size n.
(6) Estimate the running time of a program that requests n
integers and displays their squares.
(7)Estimate the running time of a program that uses bubble sort
technique to sort an array of n unsorted numbers.
(8)Estimate the running time of a program controlled by the loop:
for (i=1; i<n; i++)
(9)Estimate the running time of a program controlled by the loop:
for (i=n; i>1; i--)
(1)Estimate the running time of a program controlled by the loop:
for (i=1; i<n; i=i*2).
(2)Estimate the running time of a program controlled by the loop:
for (i=1; i<n; i=i*4).
(3)Estimate the running time of a program controlled by the loop:
for (i=n; i>1; i=i/2).
(4) Define recurrence relation and give an example.
(5)State the recurrence relations for Fibonacci series and Tower of
Hanoi.
(6) Give a general formula of a linear recurrence equation.
(7)Work out the characteristic equation for the recurrence relation:
Rn = A·Rn-1 + B·Rn-2, for A, B being real numbers
(8)Give the steps of a recurrence relation
(9)Discuss the methods used to solve recurrence relations giving
examples in each case.
(10)Describe the substitution method
(11)Describe the iteration method
(12)Describe the recursion tree method
(1)Describe the master method
(2)Solve T(n) = 9T(n/3) + n using the Master Theorem
(3)Solve T(n) =T(2n/3) + 1 using Master Theorem
(4)Solve T(n) = 8T(n/2) + 1000n^2 using the Master Theorem
(5)Solve T(n) = n^2·T(n/2) + n^2 using the Master Theorem
(6)Solve T(n) = 64T(n/8) − n^2·logn using the Master Theorem
(7)Solve T(n) = 4T(n/3) + n^2 using the Master Theorem
(8)Set up a recursive algorithm based on 2^n = 2^(n-1) + 2^(n-1)
(9)Describe selection sort
(10)Implement selection sort
(11)Discuss the complexity of selection sort
(12)Describe quicksort
(13)Implement quicksort
(14)Discuss the complexity of quicksort

(1)Describe MergeSort
(2)Implement MergeSort
(3)Discuss the complexity of MergeSort
(4)Describe Selection algorithm
(5)Implement Selection algorithm
(6)Discuss the complexity of Selection algorithm

