Solution for assignment 3

1. Determine the asymptotic behavior of $T(n)$ in the following recurrences. For example, the solution to $T(n) = 2T(n/2) + n$ is $\Theta(n \log n)$. In all cases, assume $T(\text{constant}) = \Theta(1)$, and you may assume there are no divisibility problems (or alternately, assume $n$ to be a suitable power, such as $2^k$ for some $k$).

(a) $T(n) = T(\sqrt{n}) + 1$.

Substitution:
$$T(n) = T(n^{1/2}) + 1 = T(n^{1/4}) + 2 = T(n^{1/8}) + 3 = \cdots = T(n^{1/2^i}) + i.$$
Therefore, when $i = \log \log n$ we have $n^{1/2^i} = 2^{\log n / \log n} = 2$. This implies that $T(n) = T(2) + \log \log n = \Theta(\log \log n)$.

(b) $T(n) = 4T(n/3) + 5n \log n$.

Master theorem. Check that $\log_3 4 = 1.26\ldots > 1$. Since $5n \log n = O(n^{1.26 - \varepsilon})$ for, say, $\varepsilon = 0.1$, we are in the first case of the theorem. Therefore, $T(n) = \Theta(n^{\log_3 4})$.

(c) $T(n) = 3T(n/4) + 5n \log n$.

Master theorem. Check that $\log_4 3 = 0.79\ldots < 1$. Since $5n \log n = \Omega(n^{0.79 + \varepsilon})$ for $\varepsilon = 0.1$, we are in the third case. However, we have to check that $f(n) = 5n \log n$ is a nice function, namely that $3f(n/4) \le c f(n)$ for some $c < 1$ and sufficiently large $n$:
$$3f(n/4) = 3 \cdot 5 (n/4) \log(n/4) = (3/4) \cdot 5n (\log n - 2) \le (3/4) \cdot 5n \log n = (3/4) f(n).$$
Therefore, the requirement holds for all $n$, with $c = 3/4$. By the third case of the Master theorem, $T(n) = \Theta(f(n)) = \Theta(n \log n)$.

(d) $T(n) = 17T(n/20) + (\log n)^{17}$.

Master theorem. Check that $\log_{20} 17 = 0.94\ldots < 1$. Since $(\log n)^{17} = O(n^{0.94 - \varepsilon})$ for $\varepsilon = 0.5$, we are in the first case. Therefore, $T(n) = \Theta(n^{\log_{20} 17})$.

(e) $T(n) = T(n/2) + 2T(n/4) + 3n/4$.

We have seen in class (Master theorem/substitution/induction) that the solution to the recursion $S(n) = 2S(n/2) + n/2$ is $S(n) = \Theta(n \log n)$. Let us write the $2S(n/2)$ as $S(n/2) + S(n/2)$ and substitute one of them. Then
$$S(n) = S(n/2) + S(n/2) + n/2 = S(n/2) + 2S(n/4) + n/2 + n/4 = S(n/2) + 2S(n/4) + 3n/4.$$
Since this is exactly the same recursion as $T(n)$, we conclude that $T(n) = \Theta(n \log n)$.

(f) $T(n) = \sqrt{n}\, T(\sqrt{n}) + n$ (Hint: guess and induction).

We give two proofs. The first is by substitution (which, in retrospect, is easier):
$$T(n) = n + n^{1/2} T(n^{1/2}) = 2n + n^{3/4} T(n^{1/4}) = 3n + n^{7/8} T(n^{1/8}) = \cdots = i \cdot n + n^{1 - 1/2^i}\, T(n^{1/2^i}).$$
Therefore, assigning $i = \log \log n$, we get that $n^{1/2^i} = 2^{\log n / \log n} = 2$. Therefore $T(n) = n \log \log n + (n/2) \cdot T(2) = \Theta(n \log \log n)$.
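(As an aside, not part of the original solution: this closed form is easy to sanity-check numerically. Below is a minimal Python sketch; `math.isqrt` stands in for $\sqrt{n}$ under the "no divisibility problems" assumption, and the base case $T(n) = 1$ for $n \le 3$ is an arbitrary choice. The ratio $T(n)/(n \log \log n)$ stays near a constant, as claimed.)

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> float:
    """Recurrence (f): T(n) = sqrt(n) * T(sqrt(n)) + n, with an arbitrary base case."""
    if n <= 3:
        return 1.0
    r = math.isqrt(n)  # floor of sqrt(n), standing in for exact square roots
    return r * T(r) + n

for n in (2**10, 2**16, 2**24, 2**32):
    ratio = T(n) / (n * math.log2(math.log2(n)))
    print(f"n = 2^{n.bit_length() - 1:2d}:  T(n) / (n lglg n) = {ratio:.3f}")
```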
To prove by induction, we assume $T(m) \le m(\log \log m + b)$ is satisfied for all values $m$ less than $n$ (but at least 4), and would like to prove it for $n$:
$$T(n) = \sqrt{n}\, T(\sqrt{n}) + n \le \sqrt{n} \cdot \sqrt{n} (\log \log \sqrt{n} + b) + n = n (\log((\log n)/2) + b + 1) = n (\log \log n + b - 1 + 1) = n(\log \log n + b).$$
The same proof holds for the lower bound, by changing $\le$ to $\ge$. Note that special care should be taken with the base case. For example, to calculate $T(5) = \sqrt{5}\, T(\sqrt{5}) + 5$ (assuming proper floors/ceilings), one uses $\sqrt{n}$ outside the range of the induction. The simplest solution is to choose a sufficiently large $b$ so that the basis holds for $n = 4, \ldots, 15$, and prove by induction only for larger $n$.
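None of the proofs above depend on it, but the Master theorem answers in (b), (c) and (e) can also be sanity-checked numerically. The sketch below (ours, not part of the original solution) evaluates each recurrence with floors in place of exact divisibility and an arbitrary base case $T(n) = 1$ for $n \le 4$, and prints the ratio of $T(n)$ to the claimed bound; the ratios roughly flatten as $n$ grows.

```python
import math
from functools import lru_cache

def make(rec):
    """Build a memoized T(n) for a recurrence body rec(T, n)."""
    @lru_cache(maxsize=None)
    def T(n):
        return 1.0 if n <= 4 else rec(T, n)
    return T

Tb = make(lambda T, n: 4 * T(n // 3) + 5 * n * math.log2(n))   # (b): Theta(n^{log_3 4})
Tc = make(lambda T, n: 3 * T(n // 4) + 5 * n * math.log2(n))   # (c): Theta(n log n)
Te = make(lambda T, n: T(n // 2) + 2 * T(n // 4) + 3 * n / 4)  # (e): Theta(n log n)

for n in (10**3, 10**5, 10**7):
    print(f"n = {n:>8}: "
          f"(b) {Tb(n) / n ** math.log(4, 3):6.2f}  "
          f"(c) {Tc(n) / (n * math.log2(n)):6.2f}  "
          f"(e) {Te(n) / (n * math.log2(n)):6.2f}")
```

The absolute constants printed are meaningless; the point is only that each ratio stops growing.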
2. Consider the recurrence:
$$T(n) = \begin{cases} 0 & n = 0, \\ \sum_{i=1}^{k} T(\alpha_i n) + cn & n > 0, \end{cases}$$
where $\alpha_1, \ldots, \alpha_k$ are constants with $b = \sum_{i=1}^{k} \alpha_i < 1$. We show that $T(n) = O(n)$.
Assume by induction that $T(m) \le am$ holds for all values $m$ smaller than $n$, and prove it for $n$. The basis, $n = 0$, obviously holds.
$$T(n) = \sum_{i=1}^{k} T(\alpha_i n) + cn \le \sum_{i=1}^{k} a (\alpha_i n) + cn = n \Bigl( c + a \sum_{i=1}^{k} \alpha_i \Bigr) = n(c + ab) \le an.$$
The last inequality holds if $a(1 - b) > c$, that is, if $a > c/(1 - b)$.
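As a quick empirical check of the bound (with a hypothetical instance of our choosing: $k = 2$, $\alpha_1 = 1/2$, $\alpha_2 = 1/3$, $c = 1$, so $b = 5/6$ and the proof guarantees $T(n) \le 6n$):

```python
from functools import lru_cache

# Hypothetical instance, for illustration only: k = 2, alphas = (1/2, 1/3), c = 1.
ALPHAS = (0.5, 1.0 / 3.0)
C = 1.0
B = sum(ALPHAS)        # b = 5/6 < 1
BOUND = C / (1.0 - B)  # the induction gives T(n) <= (c / (1 - b)) * n = 6n

@lru_cache(maxsize=None)
def T(n: int) -> float:
    if n == 0:
        return 0.0
    return sum(T(int(a * n)) for a in ALPHAS) + C * n  # floors stand in for alpha_i * n

for n in (10, 10**3, 10**6):
    print(f"n = {n:>8}:  T(n)/n = {T(n) / n:.3f}   (proved bound: {BOUND:.1f})")
```

With floors, the ratio $T(n)/n$ stays strictly below $c/(1-b)$, exactly as the induction predicts.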
3. Question 4-2 from the book by Cormen et al.:

An array $A[1 \ldots n]$ contains all the integers from 0 to $n$ except one. It would be easy to determine the missing integer in $O(n)$ time by using an auxiliary array $B[0 \ldots n]$ to record which numbers appear in $A$. In this problem, however, we cannot access an entire integer in $A$ with a single operation. The elements of $A$ are represented in binary, and the only operation we can use to access them is to fetch the $j$-th bit of $A[i]$, which takes constant time. Show that if we use only this operation, we can still determine the missing integer in $O(n)$ time. (Note: the bit-by-bit access restriction does not apply to operations on variables and on arrays other than $A$.)
First note that the numbers 0 to $n$ have only $\lceil \log(n+1) \rceil$ interesting bits (all other bits are zero). This gives a simple $O(n \log n)$ algorithm to find the missing number, but we would like an $O(n)$ algorithm. Let $k = \lceil \log(n+1) \rceil$, so the interesting bits are numbered 1 through $k$. Given the array $A$, our algorithm keeps two counters and, using a for loop over $1, \ldots, n$, counts the number of entries that have zero/one in their 1st (least significant) bit. If there were no missing number, these counts would have been $\lceil (n+1)/2 \rceil$ and $\lfloor (n+1)/2 \rfloor$, respectively. Therefore we can determine in linear time the value of $b$, the least significant bit of the missing number. We scan the array again, and store in an auxiliary array all indices $i$ for which the least significant bit of $A[i]$ is $b$. This yields exactly the same problem, but with a smaller $n$: the new value of $n$ is $\lfloor n/2 \rfloor$ for $b = 0$, and $\lceil n/2 \rceil - 1$ for $b = 1$. We only have to note that when scanning the array again (now for the next bit), we go only over the indices stored in the auxiliary array. When $n = 0$, we are done. The running time of the algorithm satisfies the recursion $T(n) = T(n/2) + cn$, whose solution is $T(n) = \Theta(n)$, as needed.
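For concreteness, here is a minimal Python sketch of this algorithm (ours, not from the original text). `fetch_bit` models the only access to $A$ the problem allows; for convenience the code numbers bits from 0 rather than 1, and keeps the candidate indices in a list that plays the role of the auxiliary array.

```python
def fetch_bit(A, i, j):
    # The only operation allowed on A: fetch bit j (j = 0 is the lsb) of A[i].
    return (A[i] >> j) & 1

def find_missing(A):
    """Return the one integer of 0..len(A) absent from A, touching A only
    through fetch_bit.  The candidate list roughly halves each round, so the
    total number of bit fetches is n + n/2 + n/4 + ... = O(n)."""
    candidates = list(range(len(A)))  # indices that may still hold relevant entries
    missing, j = 0, 0
    while candidates:
        zeros, ones = [], []
        for i in candidates:
            (ones if fetch_bit(A, i, j) else zeros).append(i)
        m = len(candidates)
        # With no number missing there would be floor(m/2) + 1 zeros in bit j;
        # a shortfall means bit j of the missing number is 0.
        if len(zeros) < m // 2 + 1:
            candidates = zeros          # bit j of the missing number is 0
        else:
            missing |= 1 << j           # bit j of the missing number is 1
            candidates = ones
        j += 1
    return missing

# Example: the numbers 0..6 with 5 removed.
print(find_missing([3, 0, 6, 2, 4, 1]))  # prints 5
```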