DAA Quicksort

The document discusses the Quicksort algorithm, detailing its partitioning process, its worst-case and best-case performance analysis, and the expected running time of randomized Quicksort. It explains the divide-and-conquer strategy used in Quicksort and provides recurrence equations for its time complexity. Additionally, it touches on tail recursion and its implications for optimizing recursive calls in the algorithm.


Quicksort Algorithm

The pivot is x = A[r]. From i+1 to j is a window of elements > x.
The cursor j moves right one step at a time.

If the cursor j “discovers” an element ≤ x, then this element is
swapped with the front element of the window, effectively moving
the window right one step; if it discovers an element > x, then the
window simply becomes longer by one unit.
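The description above corresponds to the Lomuto-style PARTITION used with QUICKSORT. Below is a minimal C sketch of it; the lowercase function names, the element type double, and the swap helper are assumptions made here for illustration.

void swap(double *a, double *b)
{
    double t = *a; *a = *b; *b = t;
}

int partition(double A[], int p, int r)
{
    double x = A[r];          /* pivot */
    int i = p - 1;            /* A[p..i] holds the elements <= x     */
    int j;                    /* A[i+1..j-1] is the window of > x    */

    for (j = p; j < r; j++) {
        if (A[j] <= x) {      /* the comparison the analysis counts  */
            i = i + 1;
            swap(&A[i], &A[j]);   /* move the window right one step  */
        }
        /* otherwise the window simply grows by one element */
    }
    swap(&A[i + 1], &A[r]);   /* put the pivot between the two regions */
    return i + 1;             /* the pivot's final index q */
}

void quicksort(double A[], int p, int r)
{
    if (p < r) {
        int q = partition(A, p, r);
        quicksort(A, p, q - 1);
        quicksort(A, q + 1, r);
    }
}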
Do Ex. 7.1-1.
Performance of Quicksort
• Worst-case partitioning: one subproblem of size n-1,
the other of size 0.
Time: Θ(n²). Why?
• Best-case partitioning: subproblems of size
⌊n/2⌋ and ⌈n/2⌉-1 (each at most n/2).
Time: Θ(n log n). Why?
• Balanced partitioning: even if each subproblem size is
at least a constant proportion of the original problem,
the running time is Θ(n log n).
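To make the balanced-partitioning claim concrete, consider a constant 9-to-1 split at every level (this specific example follows CLRS and is not spelled out on the slide):

\[
T(n) = T(9n/10) + T(n/10) + cn
\]

Every level of the recursion tree costs at most cn, and the tree has depth log_{10/9} n = Θ(log n), so T(n) = O(n log n).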
Analysis of Divide and Conquer:
• Divide: the problem into a number of subproblems. (D(n))
• Conquer: the subproblems by solving them recursively
(a subproblems, each of size n/b → aT(n/b))
• Combine: the solutions to the subproblems into the
solution of the original problem. (C(n))
• Recurrence equation:
T(n) = O(1)                      if n ≤ c
T(n) = aT(n/b) + D(n) + C(n)     otherwise
where n is the input size and c is a constant (the small-problem size).
Analysis of Quicksort in Worst-Case Partitioning:
• Divide: partition A[p..r] into two subarrays A[p..q-1] and A[q+1..r],
such that each element of A[p..q-1] ≤ A[q] ≤ each element of
A[q+1..r]. (O(n))
• Conquer: solve the subproblems, of sizes n-1 and 0, recursively
(T(n-1) + T(0) = T(n-1)).
• Combine: no work is needed to combine the subarrays, because
they are sorted in place.
• Recurrence equation:
T(n) = O(1)               if n = 0
T(n) = T(n-1) + O(n)      otherwise
where n is the input size.
Solve by recursion tree:
Cost incurred at level 1 = cn
Cost incurred at level 2 = c(n-1)
Cost incurred at level 3 = c(n-2)
.
Cost incurred at level (i+1) = c(n-i)
.
Cost incurred at bottom level = c

• Total no. of levels = n

• Total cost ≤ n · cn   {ignoring constants}
= cn²
= O(n²)
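Summing the level costs listed above gives the bound directly:

\[
\sum_{i=0}^{n-1} c\,(n-i) \;=\; c \sum_{k=1}^{n} k \;=\; \frac{c\,n(n+1)}{2} \;=\; \Theta(n^2)
\]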
Solve by iterative expansion (substitution method)
The recurrence for the worst-case running time T(n) of QUICKSORT:

T(n) = O(1)           if n = 0
T(n) = T(n-1) + cn    otherwise

T(n) = T(n-1) + cn
= [T(n-2) + c(n-1)] + cn = T(n-2) + 2cn - c
= [T(n-3) + c(n-2)] + 2cn - c = T(n-3) + 3cn - 3c
= [T(n-4) + c(n-3)] + 3cn - 3c = T(n-4) + 4cn - 6c
= ...
= T(n-i) + icn - c(1 + 2 + 3 + ... + (i-1))
= ...
• The expansion stops when n - i = 0, i.e. i = n:
= T(0) + n·cn - c(1 + 2 + 3 + ... + (n-1))
= O(1) + cn² - c(n² - n)/2 = cn²/2 + cn/2 + O(1)
= O(n²)
Analysis of Quicksort in Best-Case Partitioning:
• Divide: partitioning takes linear time. (O(n))
• Conquer: solve the two subproblems, each of size at most n/2,
recursively (T(n/2) + T(n/2) = 2T(n/2)).
• Combine: no work is needed to combine the subarrays, because
they are sorted in place.
• Recurrence equation:
T(n) = O(1)               if n ≤ 1
T(n) = 2T(n/2) + O(n)     otherwise
where n is the input size.
Solve by recursion tree:
Cost incurred at level 1 = cn
Cost incurred at level 2 = 2 · c(n/2) = cn
Cost incurred at level 3 = 4 · c(n/4) = cn
.
Cost incurred at level (i+1) = 2^i · c(n/2^i) = cn
.
Cost incurred at bottom level = n · c = cn

• Total no. of levels = log₂ n + 1

• Total cost = (log₂ n + 1) · cn   {ignoring constants}
= cn log₂ n + cn
= O(n log₂ n)
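Again, summing the (identical) level costs makes the bound explicit:

\[
\sum_{i=0}^{\log_2 n} 2^i \cdot c\,\frac{n}{2^i} \;=\; c\,n\,(\log_2 n + 1) \;=\; \Theta(n \log n)
\]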
Solve by iterative expansion
The recurrence for the best-case running time T(n) of QUICKSORT:

T(n) = c              if n = 1
T(n) = 2T(n/2) + cn   if n > 1

T(n) = 2T(n/2) + cn
= 2[2T(n/4) + cn/2] + cn = 4T(n/4) + cn + cn = 4T(n/4) + 2cn
= 4[2T(n/8) + cn/4] + 2cn = 8T(n/8) + cn + 2cn = 8T(n/8) + 3cn
= 8[2T(n/16) + cn/8] + 3cn = 16T(n/16) + 4cn
= 2^i T(n/2^i) + icn
= ...
• The expansion stops when n/2^i = 1, i.e. n = 2^i and i = log₂ n = log n:
= n T(1) + (log n)cn
= cn + cn log n
= O(n log n)
Expected Running Time of RANDOMIZED-QUICKSORT
Lemma 7.1: Let X be the number of comparisons
performed in line 4 of PARTITION over the
entire execution of QUICKSORT (or
RANDOMIZED-QUICKSORT) on an n-element
array. Then the running time of QUICKSORT (or
RANDOMIZED-QUICKSORT) is O(n +X).
Proof: There are at most n calls to PARTITION
(Why? Because the pivot is dropped from future
recursive calls). Each call to PARTITION does
constant work and executes the for loop some
number of iterations. Therefore, the total work
done is O(n) + the total number of iterations of
the for loop over the entire execution of
QUICKSORT.
However, each iteration of the for loop performs
the comparison of line 4 exactly once.
Conclusion follows.
Rename the array z1, z2, …, zn, where z1 ≤ z2 ≤ … ≤ zn.
Let the indicator random variable Xij = I{zi is compared to zj}.
Then X = ∑i=1..n-1 ∑j=i+1..n Xij.
Therefore,
E[X] = E[ ∑i=1..n-1 ∑j=i+1..n Xij ]
= ∑i=1..n-1 ∑j=i+1..n Pr{zi is compared to zj}

Now, zi and zj are compared iff the first element to be chosen from
Zij = {zi, zi+1, …, zj} as pivot is either zi or zj. Prior to the point when
an element of Zij is chosen as pivot, all of Zij is in the same
partition. Because pivots are chosen randomly, each of the j-i+1
elements of Zij is equally likely to be the first chosen as pivot.
We conclude:
Pr{zi is compared to zj} = 2 / (j-i+1)
Therefore, substituting k = j-i,
E[X] = ∑i=1..n-1 ∑j=i+1..n 2/(j-i+1)
= ∑i=1..n-1 ∑k=1..n-i 2/(k+1) ≤ ∑i=1..n-1 ∑k=1..n-1 2/(k+1)
< ∑i=1..n-1 ∑k=1..n 2/k = ∑i=1..n-1 O(log n)
= O(n log n)
(using the harmonic series bound ∑k=1..n 1/k ≈ loge n)
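The analysis above is for RANDOMIZED-QUICKSORT, which chooses the pivot uniformly at random before partitioning. A minimal C sketch of that variant, built on the partition() and swap() sketches given earlier (the use of rand() and the lowercase names are assumptions for illustration):

#include <stdlib.h>   /* rand() */

int partition(double A[], int p, int r);   /* from the earlier sketch */
void swap(double *a, double *b);           /* from the earlier sketch */

/* Swap a uniformly random element of A[p..r] into position r,
   then partition exactly as before (modulo bias ignored for simplicity). */
int randomized_partition(double A[], int p, int r)
{
    int k = p + rand() % (r - p + 1);   /* random pivot index in [p, r] */
    swap(&A[k], &A[r]);
    return partition(A, p, r);
}

void randomized_quicksort(double A[], int p, int r)
{
    if (p < r) {
        int q = randomized_partition(A, p, r);
        randomized_quicksort(A, p, q - 1);
        randomized_quicksort(A, q + 1, r);
    }
}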
Hoare’s Original Partitioning Strategy

Do Prob. 7-1a
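For reference when working the problem, here is a minimal C sketch following the HOARE-PARTITION pseudocode of CLRS Problem 7-1 (the double array convention and the swap() helper are carried over from the earlier sketches):

int hoare_partition(double A[], int p, int r)
{
    double x = A[p];              /* pivot is the first element */
    int i = p - 1;
    int j = r + 1;

    while (1) {
        do { j = j - 1; } while (A[j] > x);   /* scan down for an element <= x */
        do { i = i + 1; } while (A[i] < x);   /* scan up for an element >= x   */
        if (i < j)
            swap(&A[i], &A[j]);
        else
            return j;   /* every element of A[p..j] <= every element of A[j+1..r];
                           quicksort then recurses on A[p..j] and A[j+1..r] */
    }
}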
Tail Recursion

A recursive call immediately before a routine exits is tail recursive.
Tail recursion is wasteful, as it can be replaced by iteration, thereby
saving on the recursion stack. Optimizing compilers eliminate it
automatically.

Tail-recursive binary search:

int bsTR(int Low, int High, double X, double A[])
{
    int Mid;

    if (Low > High) return (-1);              // search failed

    Mid = (Low + High)/2;

    if (A[Mid] < X)
        return ( bsTR(Mid + 1, High, X, A) ); // tail-recursive call
    else if (X < A[Mid])
        return ( bsTR(Low, Mid - 1, X, A) );  // tail-recursive call
    else
        return (Mid);                         // X has been located
}

Non-tail-recursive binary search (tail calls replaced by iteration):

int bsNonTR(int Low, int High, double X, double A[])
{
    int Mid;

start:
    if (Low > High) return (-1);              // search failed

    Mid = (Low + High)/2;

    if (A[Mid] < X)
    {
        Low = Mid + 1;
        goto start;
    }
    else if (X < A[Mid])
    {
        High = Mid - 1;
        goto start;
    }
    else
        return (Mid);                         // X has been located
}
Avoiding Tail Recursion in QUICKSORT
The second recursive call in QUICKSORT is tail recursive
and can be eliminated, as sketched below.
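A minimal C sketch of the elimination, assuming the partition() sketch given earlier (the function name quicksort_tre is made up here for illustration):

/* Quicksort with the second (tail) recursive call replaced by a loop. */
void quicksort_tre(double A[], int p, int r)
{
    while (p < r) {
        int q = partition(A, p, r);
        quicksort_tre(A, p, q - 1);   /* recurse on the left subarray          */
        p = q + 1;                    /* iterate on the right subarray instead */
    }
}

Recursing on the smaller subarray and looping on the larger one would additionally bound the stack depth by O(log n); the sketch above shows only the plain elimination described on the slide.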

Tail recursion question: is the recursive Fact call tail recursive?

int Fact(int n)
{
    if (n < 1) return 1;
    else return n * Fact(n-1);
}
