Assignment 2
The Master theorem: for a divide-and-conquer recurrence of the form T(n) = a·T(n/b) + O(n^d),

    T(n) = O(n^d)             if d > log_b a
           O(n^d · log n)     if d = log_b a
           O(n^(log_b a))     if d < log_b a
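As a quick sanity check, here is a small helper (Python; my own sketch, not part of the assignment) that classifies a recurrence of the form T(n) = a·T(n/b) + O(n^d) into the three cases above; the example parameters a = 1, b = 2, d = 0 are the ones that come up in the query problem later in this assignment.

import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) into the three Master theorem cases."""
    crit = math.log(a, b)                  # the critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"O(n^{d} log n)"
    elif d > crit:
        return f"O(n^{d})"
    else:
        return f"O(n^{crit:g})"            # O(n^(log_b a))

print(master_theorem(1, 2, 0))             # -> O(n^0 log n), i.e. O(log n)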
For Algorithm A:
For Algorithm B:
The recurrence relation is:
T(n) = 2T(n-1) + O(1)
     = 2(2T(n-2) + O(1)) + O(1) = 2^2 · T(n-2) + 3·O(1)
     = 2^3 · T(n-3) + 7·O(1)
     = …
     = 2^k · T(n-k) + (2^k - 1)·O(1)
Putting k = n-1:
T(n) = 2^(n-1) · T(1) + (2^(n-1) - 1)·O(1),
which is O(2^n).
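As a sketch (Python; a toy recursion of my own with the same shape as Algorithm B's recurrence), counting the constant-time steps confirms the 2^n - 1 total derived above:

def count_steps(n):
    """Number of O(1) steps done by a recursion of shape T(n) = 2T(n-1) + O(1)."""
    if n <= 1:
        return 1                              # base case T(1) = O(1)
    return count_steps(n - 1) + count_steps(n - 1) + 1

for n in range(1, 8):
    print(n, count_steps(n), 2**n - 1)        # the two counts agree, so T(n) = O(2^n)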
For Algorithm C:
Thus, as Algorithm C has the smallest running time, we should pick Algorithm C.
1. Removing Duplicates:
We can modify the merge function in merge sort so that it does not add duplicates, in the following way.
This is the original merge function (taken from the Dasgupta textbook):

function merge(x[1 … k], y[1 … l])
    if k = 0: return y[1 … l]
    if l = 0: return x[1 … k]
    if x[1] ≤ y[1]:
        return x[1] ∘ merge(x[2 … k], y[1 … l])
    else:
        return y[1] ∘ merge(x[1 … k], y[2 … l])

To remove duplicates, we can modify it so that we don't copy the duplicate entries:
function merge(x[1 … k], y[1 … l])
    if k = 0: return y[1 … l]
    if l = 0: return x[1 … k]
    if x[1] < y[1]:
        return x[1] ∘ merge(x[2 … k], y[1 … l])
    else if x[1] > y[1]:
        return y[1] ∘ merge(x[1 … k], y[2 … l])
    else:   (x[1] = y[1])
        return x[1] ∘ merge(x[2 … k], y[2 … l])
        [Since both are equal, we keep only one of them and move the pointer forward by one in both x and y.]
The complexity of this is the same as merge sort, which is O(n log n).
Another solution is to merge sort the array and then remove the duplicates with a single linear scan of the sorted array; this also takes O(n log n) time overall.
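A runnable sketch of this approach (Python; the function names are my own, and the recursion mirrors the modified merge above). The list slicing makes this particular version slower than necessary in Python; an index-based merge would achieve the stated O(n log n) bound, but the structure is the same:

def merge_unique(x, y):
    """Merge two sorted, duplicate-free lists, keeping each value only once."""
    if not x:
        return list(y)
    if not y:
        return list(x)
    if x[0] < y[0]:
        return [x[0]] + merge_unique(x[1:], y)
    elif x[0] > y[0]:
        return [y[0]] + merge_unique(x, y[1:])
    else:  # x[0] == y[0]: keep one copy and advance both lists
        return [x[0]] + merge_unique(x[1:], y[1:])

def mergesort_unique(a):
    """Sort a and drop duplicates."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    return merge_unique(mergesort_unique(a[:mid]), mergesort_unique(a[mid:]))

print(mergesort_unique([5, 1, 3, 5, 2, 1, 3]))   # [1, 2, 3, 5]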
3. As the query returns the kth smallest element, we can set k to n/2 to query the median of each database:
k1 = k2 = n/2
Let m1 and m2 be the values returned by the two databases. The overall median will lie between min(m1, m2) and max(m1, m2), so we discard the elements that fall outside this range and repeat the process. Since the size of each array is halved every time, the loop runs log_2 n times.
As the size is reduced by half at every step, the recurrence relation is:
T(n) = T(n/2) + O(1)
By the Master theorem,

    T(n) = O(n^d)             if d > log_b a
           O(n^d · log n)     if d = log_b a
           O(n^(log_b a))     if d < log_b a

Here a = 1, b = 2, d = 0, so d = log_b a and T(n) = O(n^0 · log n) = O(log n) queries.
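A sketch of the whole procedure (Python; the databases are simulated as sorted lists, and kth_smallest is a hypothetical stand-in for the single allowed query). It assumes the two databases hold n values each, all 2n values distinct, and returns the n-th smallest of the union (the lower median) using O(log n) queries:

def median_two_databases(db1, db2):
    """Find the lower median of db1 ∪ db2 using only kth-smallest queries."""
    queries = 0
    def kth_smallest(db, k):              # 1-indexed query into one database
        nonlocal queries
        queries += 1
        return db[k - 1]

    lo1 = lo2 = 0                         # elements already discarded from the left of each database
    size = len(db1)                       # current window length in each database
    while size > 1:
        k = size // 2
        m1 = kth_smallest(db1, lo1 + k)   # median of db1's remaining window
        m2 = kth_smallest(db2, lo2 + k)   # median of db2's remaining window
        if m1 < m2:
            lo1 += k                      # discard the lower k of db1's window
        else:
            lo2 += k                      # discard the lower k of db2's window
        size -= k                         # the upper k of the other window is dropped by shrinking size
    median = min(kth_smallest(db1, lo1 + 1), kth_smallest(db2, lo2 + 1))
    return median, queries

# Example: combined values are 1..10, whose lower median is 5
print(median_two_databases([1, 3, 5, 7, 9], [2, 4, 6, 8, 10]))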
4. (a) Let

    A = | a  b |
        | c  d |

Then

    A^2 = | a  b | | a  b |  =  | a^2 + bc    ab + bd  |  =  | a^2 + bc    b(a + d) |
          | c  d | | c  d |     | ca + dc     cb + d^2 |     | c(a + d)    cb + d^2 |

Now let

    X = | A  B |
        | C  D |

Then

    X^2 = | A  B | | A  B |  =  | A^2 + BC    AB + BD  |
          | C  D | | C  D |     | CA + DC     CB + D^2 |
In the scalar case the off-diagonal entries factor as b(a + d) and c(a + d), which is what saves multiplications; but matrix multiplication is not commutative, so AB + BD ≠ B(A + D) in general, and the same factoring is not available for the blocks. So there are still 7 subproblems and not 5, and thus squaring X will take the same time as Strassen's algorithm, which is O(n^(log_2 7)) and not O(n^(log_2 5)).
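A quick numerical check (Python with NumPy; the matrices below are arbitrary examples of my own) that the scalar factorization ab + bd = b(a + d) does not carry over to the blocks of X^2:

import numpy as np

A = np.array([[0, 1], [0, 0]])
B = np.array([[0, 0], [1, 0]])
D = np.array([[1, 2], [3, 4]])

lhs = A @ B + B @ D               # the top-right block that actually appears in X^2
rhs = B @ (A + D)                 # what the scalar identity b(a+d) would suggest
print(np.array_equal(lhs, rhs))   # False: AB != BA here, so the trick breaks down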
(b) Let

    X = | 0  A |
        | B  0 |

Then

    X^2 = | 0  A | | 0  A |  =  | AB   0  |
          | B  0 | | B  0 |     | 0    BA |

Given that an n x n matrix can be squared in time O(n^c), the time to square the 2n x 2n matrix X is O((2n)^c) = O(2^c · n^c) = O(n^c), since c is a constant.
Hence the product AB can be computed in time O(n^c) by reading off the top-left block of X^2. (Ref: L. Trevisan & P. Raghavendra, DIS 02)
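A short sketch of this reduction (Python with NumPy; multiply_via_squaring is my own name, and the square argument stands in for the assumed O(n^c) squaring routine):

import numpy as np

def multiply_via_squaring(A, B, square=lambda M: M @ M):
    """Compute AB using only a squaring routine, via X = [[0, A], [B, 0]]."""
    n = A.shape[0]
    X = np.block([[np.zeros((n, n)), A],
                  [B, np.zeros((n, n))]])
    X2 = square(X)                 # X^2 = [[AB, 0], [0, BA]]
    return X2[:n, :n]              # the top-left block is AB

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])
print(np.allclose(multiply_via_squaring(A, B), A @ B))   # True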