Chapter 1: Algorithms: Efficiency, Analysis, and Order (Odd-Numbered Exercise Solutions)
Section 1.1
1) Write an algorithm that finds the largest number in a list (an array) of n numbers.
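A minimal sketch in C, assuming the list is an array S[0..n-1] of int keys (the function and variable names are illustrative only):

#include <stdio.h>

/* Return the largest value in S[0..n-1]; assumes n >= 1. */
int findLargest(const int S[], int n)
{
    int largest = S[0];
    for (int i = 1; i < n; i++)
        if (S[i] > largest)            /* remember the biggest key seen so far */
            largest = S[i];
    return largest;
}

int main(void)
{
    int S[] = {31, 4, 59, 26, 5};
    printf("%d\n", findLargest(S, 5)); /* prints 59 */
    return 0;
}

The single loop performs exactly n − 1 comparisons.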
3) Write an algorithm that prints out all the subsets of three elements of a set of n
elements. The elements of this set are stored in a list that is the input to the algorithm.
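Again a sketch in C under the assumption that the input list is an int array S[0..n-1]; three nested indices i < j < k generate each three-element subset exactly once:

#include <stdio.h>

/* Print every three-element subset {S[i], S[j], S[k]} with i < j < k. */
void printTriples(const int S[], int n)
{
    for (int i = 0; i < n - 2; i++)
        for (int j = i + 1; j < n - 1; j++)
            for (int k = j + 1; k < n; k++)
                printf("{%d, %d, %d}\n", S[i], S[j], S[k]);
}

The number of lines printed is C(n, 3) = n(n − 1)(n − 2)/6, so the algorithm is Θ(n³).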
5) Write an algorithm that finds the greatest common divisor of two integers.
int gcd(int a, int b) {
    if(b == 0) return a;        // gcd(a, 0) = a
    return gcd(b, a % b);       // Euclid's algorithm: gcd(a, b) = gcd(b, a mod b)
}
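For example (assuming nonnegative arguments), gcd(270, 192) unwinds as gcd(270, 192) → gcd(192, 78) → gcd(78, 36) → gcd(36, 6) → gcd(6, 0) = 6.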
7) Write an algorithm that determines whether or not an almost complete binary tree
is a heap.
bool isHeap(node *tree) {                              // assumes each node stores its value in a field named key
    if(tree == null)
        return true;                                   // an empty tree is a heap
    if(tree->left != null && tree->left->key > tree->key)
        return false;                                  // left child violates the (max-)heap property
    if(tree->right != null && tree->right->key > tree->key)
        return false;                                  // right child violates the (max-)heap property
    if(tree->right == null)
        return isHeap(tree->left);                     // only a left subtree remains to check
    return isHeap(tree->left) && isHeap(tree->right);  // both subtrees must themselves be heaps
}
Section 1.2
9) Give a practical example in which you would not use Exchange Sort (Algorithm 1.3) to do a sorting task.
One reasonable answer: sorting a very large file of records (say, millions of keys). Exchange Sort performs Θ(n²) comparisons, so for large n an Θ(n lg n) sort such as Merge Sort or Quicksort would be used instead.
Section 1.3
11) Determine the worst-case, average-case, and best-case time complexities for the basic Insertion Sort and for the version given in Exercise 4, which uses Binary Search.
Basic Insertion Sort: B(n) = n (array already sorted), A(n) = n²/4, W(n) = n²/2 (array reverse sorted). Proofs can be found in Section 7.2.
Insertion Sort with Binary Search: B(n) = 2 lg n (each new key is always inserted in the middle of the array, so only 2 comparisons are needed to find its location … if repeated keys are allowed, then an array of identical elements only requires 1 comparison per key!), A(n) = n lg n, W(n) = n lg n.
13) Algorithm A performs 10n² basic operations, and algorithm B performs 300 ln n basic operations. For what value of n does algorithm B start to show its better performance?
For n = 7 we have 490 operations in A and 584 in B, so A is better; but for n = 8 we have 640 operations in A and 624 in B, so B is better (and stays that way for all n ≥ 8).
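A quick numerical check of the crossover (a throwaway C program; the loop bound of 10 is arbitrary):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Compare the operation counts of algorithms A and B for small n. */
    for (int n = 1; n <= 10; n++) {
        double a = 10.0 * n * n;        /* operations performed by algorithm A */
        double b = 300.0 * log(n);      /* operations performed by algorithm B (natural log) */
        printf("n = %2d   A = %4.0f   B = %4.0f   %s\n",
               n, a, b, b < a ? "B better" : "A better");
    }
    return 0;
}

Apart from the degenerate n = 1 line (ln 1 = 0), the output shows A ahead for 2 ≤ n ≤ 7 and B ahead from n = 8 on, matching the answer above.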
Section 1.4
15) Show directly that f(n) = n² + 3n³ ∈ Θ(n³). That is, use the definitions of O and Ω to show that f(n) is in both O(n³) and Ω(n³).
We illustrate both methods on the polynomial f(n) = 5n⁵ + 4n⁴ + 6n³ + 2n² + n + 7.
Apply Property 7, with g(n) = n⁴ + 6n³/4 + 2n²/4 + n/4 + 7/4, h(n) = n⁵, c = 4, and d = 5; then f(n) = c∙g(n) + d∙h(n), g(n) ∈ O(n⁵), and h(n) ∈ Θ(n⁵), so f(n) ∈ Θ(n⁵).
A direct proof is also possible: for all n ≥ 1,
5n⁵ ≤ 5n⁵ + 4n⁴ + 6n³ + 2n² + n + 7 ≤ 5n⁵ + 4n⁵ + 6n⁵ + 2n⁵ + n⁵ + 7n⁵ = 25n⁵.
Therefore f(n) ∈ Θ(n⁵), with c₁ = 5 and c₂ = 25, where n ≥ 1.
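For the f(n) stated in the exercise itself, the same direct argument is even shorter:
For all n ≥ 1, 3n³ ≤ n² + 3n³ ≤ n³ + 3n³ = 4n³, so f(n) ∈ O(n³) with c = 4 and f(n) ∈ Ω(n³) with c = 3 (taking N = 1 in both definitions); hence f(n) ∈ Θ(n³).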
19) The function f(n) = 3n² + 10n log n + 1000n + 4 log n + 9999 belongs in which of the following complexity categories:
f(n) ∈ Θ(n²), by repeatedly applying Properties 6 and 7 (or “throwing out” all the lower-order terms).
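To make the “throwing out” explicit (the constants below are loose; any that work will do):
For all n ≥ 1, log n ≤ n ≤ n² and n log n ≤ n², so 3n² ≤ f(n) ≤ (3 + 10 + 1000 + 4 + 9999)n² = 11016n². Hence f(n) ∈ Θ(n²) with c₁ = 3, c₂ = 11016, and N = 1.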
21) The function f(n) = n + n² + 2ⁿ + n⁴ belongs in which of the following complexity categories:
f(n) ∈ Θ(2ⁿ): by Property 6 the exponential term 2ⁿ eventually dominates n⁴ and the lower-order terms, and Property 7 then lets us discard them, as in Exercise 19.
Property 6: Here we have to compare each category with the next in the sequence. As an example, we prove bⁿ ∈ o(n!).
Proof: According to the definition of “little-oh”, we have to show that, for every positive c, there exists N such that bⁿ ≤ c∙n! for all n ≥ N. Let n₀ be the first integer with the property n₀ > 2b. We rewrite the desired inequality as
b^n₀ ∙ b^(n−n₀) ≤ c∙(1∙2∙…∙n₀∙(n₀+1)∙…∙n)  ⇔  b^n₀ / (c∙n₀!) ≤ ((n₀+1)/b)∙((n₀+2)/b)∙…∙(n/b),
where the left-hand side is a constant C and each of the n − n₀ ratios on the right-hand side is > 2, because every factor n₀+1, …, n exceeds 2b. The right-hand side is therefore at least 2^(n−n₀), so it suffices to make N large enough that C ≤ 2^(N−n₀); for example, N = n₀ + ⌈lg C⌉ works (or simply N = n₀ when C ≤ 1).
Property 7: If c ≥ 0, d > 0, g(n) ∈ O(f(n)), and h(n) ∈ Θ(f(n)), then c∙g(n) + d∙h(n) ∈ Θ(f(n)).
Proof: According to the definitions of O and Θ, there are positive constants c₁, N₁, c₂, d₂, N₂ such that
g(n) ≤ c₁∙f(n) for all n ≥ N₁
c₂∙f(n) ≤ h(n) ≤ d₂∙f(n) for all n ≥ N₂
We multiply the first inequality by c and the second one by d, and denote by N the maximum of N₁ and N₂:
c∙g(n) ≤ c∙c₁∙f(n)   (*)
d∙c₂∙f(n) ≤ d∙h(n) ≤ d∙d₂∙f(n)   (**)   for all n ≥ N
Adding (*) and the right inequality in (**), we obtain
c∙g(n) + d∙h(n) ≤ (c∙c₁ + d∙d₂)∙f(n) for all n ≥ N, which is the right-hand-side
inequality in the definition of Θ.
From the left-hand-side inequality of (**), we have d∙c₂∙f(n) ≤ d∙h(n), and since c∙g(n) ≥ 0 we may add it to the larger side to obtain
d∙c₂∙f(n) ≤ c∙g(n) + d∙h(n) for all n ≥ N, which is the left-hand-side
inequality in the definition of Θ.
25) Suppose you have a computer that requires 1 minute to solve problem instances of size n = 1,000. Suppose you buy a new computer that runs 1,000 times faster than the old one. What instance sizes can be run in 1 minute, assuming the following time complexities T(n) for our algorithm?
a) T(n) = n   Answer: 10⁶
b) T(n) = n³   Answer: 10⁴
c) T(n) = 10ⁿ   Answer: 1003
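These follow from the fact that, in one minute, the new machine executes 1,000 times as many basic operations as the old one; writing n₁ for the new instance size and solving T(n₁) = 1,000∙T(1,000):
a) n₁ = 1,000 ∙ 1,000 = 10⁶
b) n₁³ = 1,000 ∙ 1,000³ = 10¹², so n₁ = 10⁴
c) 10^n₁ = 1,000 ∙ 10^1000 = 10^1003, so n₁ = 1003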
For all these results, we use Theorem 1.3 and Theorem 1.4 (L’Hôpital’s rule):