Lec4 - Design and Analysis of Algorithms

The document discusses the Divide and Conquer algorithm design paradigm, which involves breaking down a problem into smaller sub-problems, solving them recursively, and combining their solutions. It provides examples of algorithms that utilize this technique, such as Merge Sort and Quick Sort, detailing their processes and time complexities. The document also includes pseudocode for Merge Sort and Quick Sort, illustrating their implementation and analysis of performance.

Uploaded by

afra awed
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PPTX, PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
2 views22 pages

Lec4 - Design and Analysis of Algorithms

The document discusses the Divide and Conquer algorithm design paradigm, which involves breaking down a problem into smaller sub-problems, solving them recursively, and combining their solutions. It provides examples of algorithms that utilize this technique, such as Merge Sort and Quick Sort, detailing their processes and time complexities. The document also includes pseudocode for Merge Sort and Quick Sort, illustrating their implementation and analysis of performance.

Uploaded by

afra awed
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PPTX, PDF, TXT or read online on Scribd
You are on page 1/ 22

Lecture four

Algorithms Design Techniques
1- Divide And Conquer
Divide And Conquer
 Divide and conquer is an important algorithm design paradigm.
 It works by recursively breaking a problem down into two or more smaller sub-problems of the same kind, until these become simple enough to be solved directly.
 The solutions to the sub-problems are then combined to give a solution to the original problem.
Three Steps of The Divide and Conquer Approach

The most well-known algorithm design strategy:

1. Divide the problem into two or more smaller sub-problems.

2. Conquer the sub-problems by solving them recursively.

3. Combine the solutions to the sub-problems into the solution for the original problem.
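As a minimal, generic sketch of this three-step template in Java (the class and helper names DivideAndConquer, isSmall, solveDirectly, divide and combine are illustrative assumptions, not part of the lecture):

import java.util.ArrayList;
import java.util.List;

// Generic divide-and-conquer template: divide, conquer recursively, combine.
abstract class DivideAndConquer<P, S> {

    protected abstract boolean isSmall(P problem);      // small enough to solve directly?
    protected abstract S solveDirectly(P problem);      // base case
    protected abstract List<P> divide(P problem);       // step 1: divide
    protected abstract S combine(List<S> subSolutions); // step 3: combine

    public S solve(P problem) {
        if (isSmall(problem)) {
            return solveDirectly(problem);
        }
        List<S> subSolutions = new ArrayList<>();
        for (P sub : divide(problem)) {
            subSolutions.add(solve(sub));                // step 2: conquer recursively
        }
        return combine(subSolutions);
    }
}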
Divide-and-Conquer Technique (cont.)

A problem (instance) of size n is divided into subproblem 1 of size n/2 and subproblem 2 of size n/2. A solution to subproblem 1 and a solution to subproblem 2 are then combined into a solution to the original problem.
Divide and Conquer Examples

Binary search
Merge sort
Quick sort
Maximum and Minimum
Merge sort
 Split array A[0..n-1] into about equal halves and make copies of each half in arrays B and C.

 Sort arrays B and C recursively.

 Merge the sorted arrays B and C into array A as follows:

 Repeat the following until no elements remain in one of the arrays:

 compare the first elements in the remaining unprocessed portions of the arrays

 copy the smaller of the two into A, while incrementing the index indicating the unprocessed portion of that array

 Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A.
Mergesort (contd.)
ALGORITHM Mergesort(A[0..n-1])
//sorts array A[0..n-1] by recursive mergesort
//Input: A[0..n-1] to be sorted
//Output: Sorted A[0..n-1]
if n > 1
    copy A[0..⌊n/2⌋-1] to B[0..⌊n/2⌋-1]
    copy A[⌊n/2⌋..n-1] to C[0..⌈n/2⌉-1]
    Mergesort(B[0..⌊n/2⌋-1])
    Mergesort(C[0..⌈n/2⌉-1])
    Merge(B, C, A)

B: 2 3 8 9        C: 1 4 5 7

A: 1 2 3 4 5 7 8 9
Algorithm 1: mergesort(left, right)
1. If (left < right)
2.   mid = (left + right)/2      //divide problem into sub-problems
3.   mergesort(left, mid)        //solve first sub-problem
4.   mergesort(mid+1, right)     //solve second sub-problem
5.   merge(left, mid, right)     //combine the solutions
6. Endif
Algorithm 2: merge(left, mid, right)

1. i = j = left, k = mid+1
2. While j ≤ mid and k ≤ right
3.   If a[j] < a[k]
4.     temp[i] = a[j]    //Increment i and j
5.   Else
6.     temp[i] = a[k]    //Increment i and k
7.   Endif
8. Endwhile
9. While j ≤ mid: temp[i] = a[j], increment i and j, Endwhile      //copy leftovers from the left run
10. While k ≤ right: temp[i] = a[k], increment i and k, Endwhile   //copy leftovers from the right run
11. For i = left to right
12.   a[i] = temp[i]
13. Endfor
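For reference, a minimal Java sketch of Algorithms 1 and 2 (the class and variable names here are illustrative assumptions; the sample array is the one traced on the slides):

import java.util.Arrays;

public class MergeSortDemo {

    // Algorithm 1: sort a[left..right] by recursive merge sort.
    static void mergesort(int[] a, int left, int right) {
        if (left < right) {
            int mid = (left + right) / 2;      // divide into sub-problems
            mergesort(a, left, mid);           // solve first sub-problem
            mergesort(a, mid + 1, right);      // solve second sub-problem
            merge(a, left, mid, right);        // combine the solutions
        }
    }

    // Algorithm 2: merge the sorted runs a[left..mid] and a[mid+1..right].
    static void merge(int[] a, int left, int mid, int right) {
        int[] temp = new int[right - left + 1];
        int i = 0, j = left, k = mid + 1;
        while (j <= mid && k <= right) {
            temp[i++] = (a[j] <= a[k]) ? a[j++] : a[k++];
        }
        while (j <= mid) temp[i++] = a[j++];     // copy leftovers from the left run
        while (k <= right) temp[i++] = a[k++];   // copy leftovers from the right run
        System.arraycopy(temp, 0, a, left, temp.length);
    }

    public static void main(String[] args) {
        int[] a = {8, 3, 2, 9, 7, 1, 5, 4};      // example array from the slides
        mergesort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));  // [1, 2, 3, 4, 5, 7, 8, 9]
    }
}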
Time complexity for merge sort
We analyze the worst-case running time of merge sort on n elements. Merge sort on just one element takes constant time.

When we have n > 1 elements we break down the running time as follows:

i. Divide: just computes the middle of the sub-array, which takes constant time, O(1).

ii. Conquer: we recursively solve two sub-problems, each of size n/2, which contributes T(n/2) + T(n/2) to the running time.

iii. Combine: the merge procedure on an n-element sub-array takes O(n).
Time complexity for merge sort
 We add a function that is O(n) and a function that is O(1). This sum is a linear function of n, that is, O(n).

 Adding it to the 2T(n/2) term from the "conquer" step gives the recurrence for the worst-case running time of merge sort:

T(1) = O(1)              for n = 1
T(n) = 2T(n/2) + O(n)    for n > 1

This can be solved by repeated substitution. Finally,
T(n) = n log n + O(n)
Therefore, the running time of merge sort is O(n log n).
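A brief sketch of the repeated substitution (assuming n is a power of 2 and writing the O(n) term as cn):

T(n) = 2T(n/2) + cn
     = 4T(n/4) + 2cn
     = 8T(n/8) + 3cn
     ...
     = 2^k T(n/2^k) + k·cn
     = n·T(1) + cn·log2 n    (when 2^k = n, i.e. k = log2 n)
     = O(n log n)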
Mergesort (contd.)
Divide:
8 3 2 9 7 1 5 4
8 3 2 9 | 7 1 5 4
8 3 | 2 9 | 7 1 | 5 4
8 | 3 | 2 | 9 | 7 | 1 | 5 | 4
Merge:
3 8 | 2 9 | 1 7 | 4 5
2 3 8 9 | 1 4 5 7
1 2 3 4 5 7 8 9
Quick sort
 Quicksort can perform quite fast, on average about O(n log n), but its worst case is a degrading O(n²).

 Quick sort is naturally recursive.

 We partition the array into two sub-arrays.

 The partition is the heart of how quicksort works.

 We choose some value, called the pivot, to partition the elements: elements less than or equal to the pivot are added to the left sub-array, and elements greater than the pivot are added to the right sub-array.
Quick sort
 A general algorithm based on a recursive method follows.

 Assume A is an integer array of size n.

 left and right are the parameters of the recursive procedure; they are initialized with 1 and n respectively, and hold the current lower and upper bounds of the sub-array.

 The indices newleft and newright are used to select certain elements during the processing of each sub-array.

 The pivot is placed in its final location in the array.
Algorithm 3: Quicksort(a, left, right)
1. newleft = left, newright = right
2. amid = a[(left + right)/2]                              //pivot
3. While newleft ≤ newright                                //partition
4.   While a[newleft] < amid and newleft < right           //loop 1
5.     Increment newleft
6.   Endwhile
7.   While amid < a[newright] and newright > left          //loop 2
8.     Decrement newright
9.   Endwhile
10.  If newleft ≤ newright, then
11.    swap(a[newleft], a[newright])
12.    Increment newleft and Decrement newright
13.  Endif
14. Endwhile                                               //partition ends
15. If left < newright, then
16.   call Quicksort(a, left, newright)                    //sort left sub-array
17. Endif
18. If newleft < right, then
19.   call Quicksort(a, newleft, right)                    //sort right sub-array
20. Endif
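A minimal Java sketch of Algorithm 3 follows (one possible reading of the pseudocode, using 0-based array indices; the class and variable names are illustrative):

public class QuickSortDemo {

    // Algorithm 3: partition a[left..right] around the middle element, then recurse.
    static void quicksort(int[] a, int left, int right) {
        int newleft = left, newright = right;
        int amid = a[(left + right) / 2];                              // pivot
        while (newleft <= newright) {                                  // partition
            while (a[newleft] < amid && newleft < right) newleft++;    // loop 1
            while (amid < a[newright] && newright > left) newright--;  // loop 2
            if (newleft <= newright) {
                int t = a[newleft]; a[newleft] = a[newright]; a[newright] = t;
                newleft++;
                newright--;
            }
        }                                                              // partition ends
        if (left < newright) quicksort(a, left, newright);             // sort left sub-array
        if (newleft < right) quicksort(a, newleft, right);             // sort right sub-array
    }

    public static void main(String[] args) {
        int[] a = {65, 35, 15, 90, 75, 45, 40, 60, 95, 25, 85, 55};    // example from the slides
        quicksort(a, 0, a.length - 1);
        System.out.println(java.util.Arrays.toString(a));
    }
}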
Example: Quick sort
We have an array with 12 elements:

65 35 15 90 75 45 40 60 95 25 85 55

left = 1, right = 12, newleft = left, newright = right
 mid = (newleft + newright)/2 = 6, a[mid] = 45
The left sub-array is positions 1 to 5 and the right sub-array is positions 7 to 12.
Example: Quick sort
Loop 1 scans the left part from left to right; as the element 65 is greater than the pivot (45), 65 is to be shifted to the right sub-array, so newleft = 1.
Loop 2 scans the right part from right to left; as the element 25 is less than the pivot (45), newright = 10. The elements at newleft and newright are then interchanged.

65 35 15 90 75 45 40 60 95 25 85 55    swap(65, 25)    newleft = 1, newright = 10
Example: Quick sort

25 35 15 90 75 45 40 60 95 65 85 55    newleft = 4, newright = 7

25 35 15 40 75 45 90 60 95 65 85 55    newleft = 5, newright = 6

25 35 15 40 45 75 90 60 95 65 85 55    newleft = 6, newright = 5

 As left < newright (1 < 5), the following recursive call is made to sort the left sub-array:
 Quicksort(a, left, newright)
Follow the trace table below for the rest of the quicksort.
Table: trace of quicksort

step  a1  a2  a3  a4  a5  a6  a7  a8  a9  a10  a11  a12  left  right  pivot
0     65  35  15  90  75  45  40  60  95  25   85   55   1     12     45
1     25  35  15  40  45  75  90  60  95  65   85   55   1     5      15
2     15  35  25  40  45  75  90  60  95  65   85   55   2     5      25
3     15  25  35  40  45  75  90  60  95  65   85   55   3     5      40
4     15  25  35  40  45  75  90  60  95  65   85   55   6     12     95
5     15  25  35  40  45  75  90  60  55  65   85   95   6     11     60
6     15  25  35  40  45  55  60  90  75  65   85   95   6     7      55
7     15  25  35  40  45  55  60  90  75  65   85   95   8     11     75
(contd.)
[Figure: step-by-step partition trace of the array 5 3 1 9 8 2 4 7, showing the indices i and j scanning toward each other and out-of-place elements being swapped until i and j cross; the resulting sub-arrays are then sorted recursively, giving 1 2 3 4 5 7 8 9.]
Time complexity for Quick sort

Best case time is T(n) = 2T(n/2) + Cn
  = O(n log n)

Worst case is T(n) = T(n-1) + n
  = O(n²)

Average case time is T(n) = O(n log n)
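A quick sketch of the worst case, expanding the recurrence directly:

T(n) = T(n-1) + n
     = T(n-2) + (n-1) + n
     ...
     = T(1) + 2 + 3 + ... + n
     = n(n+1)/2 + O(1)
     = O(n²)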
Exercise
1. Prove the average-case running time for quick sort.
2. Implement Algorithm 3 using the Java programming language.
