DAA Unit 2
Divide and Conquer is an algorithmic pattern: take a problem on a large input, break the input into smaller pieces, solve the problem on each of the small pieces, and then merge the piecewise solutions into a global solution. This mechanism of solving a problem is called the Divide & Conquer strategy.
A Divide and Conquer algorithm solves a problem using three steps: divide the problem into subproblems, conquer each subproblem recursively, and combine the sub-solutions into a solution for the original problem.
Examples: The following well-known computer algorithms are based on the Divide & Conquer approach:
Binary Search
Tower of Hanoi.
There are two fundamentals of the Divide & Conquer strategy:
1. Relational Formula: the formula we derive from the given technique. After generating the formula, we apply the D&C strategy, i.e. we break the problem recursively and solve the resulting subproblems.
2. Stopping Condition: when we break a problem using the Divide & Conquer strategy, we need to know how long to keep dividing. The condition at which we stop the recursive steps of D&C is called the stopping condition.
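The two fundamentals above can be sketched with a small divide-and-conquer routine. The find_max problem here is a hypothetical illustration chosen for simplicity, not one of the algorithms covered in this unit:

```python
def find_max(a, lo, hi):
    # Stopping condition: a single element is its own maximum.
    if lo == hi:
        return a[lo]
    mid = (lo + hi) // 2
    # Divide into two halves and conquer each recursively.
    left_max = find_max(a, lo, mid)
    right_max = find_max(a, mid + 1, hi)
    # Relational formula: combine the piecewise answers.
    return left_max if left_max >= right_max else right_max

data = [24, 9, 29, 14, 19, 27]
print(find_max(data, 0, len(data) - 1))  # -> 29
```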
Following algorithms are based on the concept of the Divide and Conquer Technique:
Binary Search: The binary search algorithm is a searching algorithm, also called half-interval search or logarithmic search. It works by comparing the target value with the middle element of a sorted array. If the values differ, the half that cannot contain the target is eliminated, and the search continues on the other half: we again take the middle element and compare it with the target value. The process repeats until the target value is found. If the remaining half becomes empty, we conclude that the target is not present in the array.
Quicksort: One of the most efficient sorting algorithms, also known as partition-exchange sort. It starts by selecting a pivot value from the array and then divides the remaining elements into two sub-arrays. The partition is made by comparing each element with the pivot: elements with values smaller than the pivot go to one side and larger values to the other, and the sub-arrays are then sorted recursively.
Merge Sort: A sorting algorithm that sorts an array by making comparisons. It starts by dividing the array into sub-arrays and then recursively sorts each of them. Once the sorting is done, the sub-arrays are merged back together.
Strassen's Algorithm: An algorithm for matrix multiplication, named after Volker Strassen. It has proven to be much faster than the traditional algorithm when working on large matrices.
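Strassen's key idea can be shown on the 2x2 base case: seven multiplications (M1..M7) instead of the eight used by the straightforward formula. This is only a sketch of the base case; the full algorithm applies the same seven-product scheme recursively to matrix sub-blocks:

```python
def strassen_2x2(A, B):
    # Unpack the 2x2 operands.
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    # Strassen's seven products.
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    # Recombine into the product matrix.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # -> [[19, 22], [43, 50]]
```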
Linear Search and Binary Search are the two popular searching techniques. Here we will discuss
the Binary Search Algorithm.
Binary search is a search technique that works efficiently on sorted lists; hence, to search for an element in a list using binary search, we must first ensure that the list is sorted. Binary search follows the divide and conquer approach: the list is divided into two halves, and the item is compared with the middle element of the list. If a match is found, the location of the middle element is returned. Otherwise, we search in one of the two halves, depending on the result of the comparison.
Working of Binary search
Now, let's see the working of the Binary Search Algorithm.
To understand the working of the binary search algorithm, let's take a sorted array as an example. There are two methods to implement binary search:
o Iterative method
o Recursive method
The recursive method of binary search follows the divide and conquer approach.
1. mid = (beg + end)/2
Initially, beg = 0 and end = 8 (for a sorted array of 9 elements).
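The recursive method can be sketched as follows. The array values below are assumed for illustration, since the example array from the original notes is not shown; beg = 0 and end = 8 match a 9-element array:

```python
def binary_search(a, target, beg, end):
    # Stopping condition: an empty range means the target is absent.
    if beg > end:
        return -1
    mid = (beg + end) // 2            # same formula as above
    if a[mid] == target:
        return mid                    # location of the middle element
    elif a[mid] < target:
        return binary_search(a, target, mid + 1, end)  # search right half
    else:
        return binary_search(a, target, beg, mid - 1)  # search left half

arr = [10, 12, 24, 29, 39, 40, 51, 56, 69]  # 9 elements: beg = 0, end = 8
print(binary_search(arr, 56, 0, len(arr) - 1))  # -> 7
```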
1. Time Complexity
o Best Case Complexity - In binary search, the best case occurs when the element is found in the first comparison, i.e., when the middle element itself is the element to be searched. The best-case time complexity of binary search is O(1).
o Average Case Complexity - The average-case time complexity of binary search is O(log n).
o Worst Case Complexity - In binary search, the worst case occurs when we have to keep reducing the search space until it has only one element. The worst-case time complexity of binary search is O(log n).
Divide: In the divide step, first pick a pivot element. After that, partition (rearrange) the array into two sub-arrays such that each element in the left sub-array is less than or equal to the pivot element and each element in the right sub-array is greater than the pivot element.
Quicksort picks an element as pivot, and then it partitions the given array around the picked
pivot element. In quick sort, a large array is divided into two arrays in which one holds values
that are smaller than the specified value (Pivot), and another array holds the values that are
greater than the pivot.
After that, the left and right sub-arrays are also partitioned using the same approach. This continues until only a single element remains in each sub-array.
o The pivot can be random, i.e. a randomly selected element of the given array.
o The pivot can be either the rightmost or the leftmost element of the given array.
o The median can be selected as the pivot element.
To understand the working of quick sort, let's take an unsorted array. It will make the concept
more clear and understandable.
In the given array, we consider the leftmost element as pivot. So, in this case, a[left] = 24,
a[right] = 27 and a[pivot] = 24.
Since the pivot is at the left, the algorithm starts from the right and moves towards the left.
Now, a[pivot] < a[right], so the algorithm moves one position towards the left, i.e. -
Because a[pivot] > a[right], the algorithm swaps a[pivot] with a[right], and the pivot moves to the right, as -
Now, a[left] = 19, a[right] = 24, and a[pivot] = 24. Since the pivot is at the right, the algorithm starts from the left and moves to the right.
Now, a[left] = 9, a[right] = 24, and a[pivot] = 24. As a[pivot] > a[left], the algorithm moves one position to the right, as -
Now, a[left] = 29, a[right] = 24, and a[pivot] = 24. As a[pivot] < a[left], swap a[pivot] and a[left]; the pivot is now at the left, i.e. -
Since the pivot is at the left, the algorithm starts from the right and moves to the left. Now, a[left] = 24, a[right] = 29, and a[pivot] = 24. As a[pivot] < a[right], the algorithm moves one position to the left, as -
Now, a[pivot] = 24, a[left] = 24, and a[right] = 14. As a[pivot] > a[right], swap a[pivot] and a[right]; the pivot is now at the right, i.e. -
Now, a[pivot] = 24, a[left] = 14, and a[right] = 24. The pivot is at the right, so the algorithm starts from the left and moves to the right.
Now, a[pivot] = 24, a[left] = 24, and a[right] = 24. The pivot, left, and right now point to the same element, which marks the termination of the procedure.
Element 24, which is the pivot element, is now placed at its exact position: elements to the right of 24 are greater than it, and elements to the left of 24 are smaller than it.
Now, in the same manner, the quicksort algorithm is applied separately to the left and right sub-arrays. After the sorting is done, the array will be -
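The whole procedure can be sketched in code. This version uses the Lomuto partition with the rightmost element as pivot (one of the pivot choices listed above), rather than the exact pointer-swapping scheme traced step by step:

```python
def partition(a, low, high):
    pivot = a[high]                   # rightmost element as pivot
    i = low - 1                       # boundary of the <= pivot region
    for j in range(low, high):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[high] = a[high], a[i + 1]  # place pivot at its exact position
    return i + 1

def quicksort(a, low, high):
    if low < high:                    # a one-element sub-array is already sorted
        p = partition(a, low, high)
        quicksort(a, low, p - 1)      # sort the left sub-array
        quicksort(a, p + 1, high)     # sort the right sub-array

arr = [24, 9, 29, 14, 19, 27]         # the array traced above
quicksort(arr, 0, len(arr) - 1)
print(arr)  # -> [9, 14, 19, 24, 27, 29]
```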
Quicksort complexity
Now, let's see the time complexity of quicksort in best case, average case, and in worst case. We
will also see the space complexity of quicksort.
1. Time Complexity
o Best Case Complexity - In quicksort, the best case occurs when the pivot element is the middle element, or close to it. The best-case time complexity of quicksort is O(n log n).
o Average Case Complexity - It occurs when the array elements are in jumbled order, neither properly ascending nor properly descending. The average-case time complexity of quicksort is O(n log n).
o Worst Case Complexity - In quicksort, the worst case occurs when the pivot element is either the greatest or the smallest element. For example, if the pivot is always the last element of the array, the worst case occurs when the given array is already sorted in ascending or descending order. The worst-case time complexity of quicksort is O(n²).
Although the worst-case complexity of quicksort is higher than that of sorting algorithms such as merge sort and heap sort, quicksort is still faster in practice. The worst case rarely occurs, because it can be avoided by changing the choice of pivot; choosing the right pivot element sidesteps the worst case.
Merge sort is similar to quicksort in that it uses the divide and conquer approach to sort the elements. It is one of the most popular and efficient sorting algorithms. It divides the given list into two equal halves, calls itself on the two halves, and then merges the two sorted halves. We have to define a merge() function to perform the merging.
The sub-lists are divided again and again into halves until the list cannot be divided further. Then we combine pairs of one-element lists into two-element lists, sorting them in the process. The sorted two-element pairs are merged into four-element lists, and so on, until we get the fully sorted list.
We know that merge sort first divides the whole array repeatedly into equal halves until atomic (single-element) values are reached. Here, an array of 8 items is divided into two arrays of size 4.
This does not change the order in which the items appear in the original array. Now we divide these two arrays into halves.
We divide further until we reach atomic values, which can no longer be divided.
Now, we combine them in exactly the same manner as they were broken down. Please note the
color codes given to these lists.
We first compare the front element of each list and then combine them into another list in sorted order. We see that 14 and 33 are already in sorted positions. We compare 27 and 10, and in the target list of 2 values we put 10 first, followed by 27. We swap the order of 19 and 35, whereas 42 and 44 are placed sequentially.
In the next iteration of the combining phase, we compare lists of two data values and merge them into lists of four data values, placing all elements in sorted order.
After the final merging, the list should look like this −
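The split-then-merge process above can be sketched as follows. The 8-item array is inferred from the pairwise comparisons described above (14/33, 27/10, 35/19, 42/44):

```python
def merge(left, right):
    merged, i, j = [], 0, 0
    # Compare the front elements of each list, taking the smaller one.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # append whatever remains
    merged.extend(right[j:])
    return merged

def merge_sort(a):
    if len(a) <= 1:                   # atomic list: cannot be divided further
        return a
    mid = len(a) // 2
    # Recursively sort each half, then merge the two sorted halves.
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

arr = [14, 33, 27, 10, 35, 19, 42, 44]
print(merge_sort(arr))  # -> [10, 14, 19, 27, 33, 35, 42, 44]
```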
1. Time Complexity
o Best Case Complexity - It occurs when no sorting is required, i.e. the array is already sorted. The best-case time complexity of merge sort is O(n log n).
o Average Case Complexity - It occurs when the array elements are in jumbled order, neither properly ascending nor properly descending. The average-case time complexity of merge sort is O(n log n).
o Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order, e.g. you have to sort the array in ascending order, but its elements are in descending order. The worst-case time complexity of merge sort is O(n log n).
Before knowing more about heap sort, let's first see a brief description of the heap data structure.
In heap sort, there are basically two phases involved in sorting the elements:
o The first step is the creation of a heap by adjusting the elements of the array.
o After the heap is created, repeatedly remove the root element of the heap by swapping it to the end of the array, and then restore the heap structure on the remaining elements.
Now let's see the working of heap sort in detail by using an example. To understand it more
clearly, let's take an unsorted array and try to sort it using heap sort. It will make the explanation
clearer and easier.
First, we have to construct a heap from the given array and convert it into max heap.
After converting the given heap into max heap, the array elements are -
Next, we have to delete the root element (89) from the max heap. To delete this node, we have to
swap it with the last node, i.e. (11). After deleting the root element, we again have to heapify it to
convert it into max heap.
After swapping the array element 89 with 11, and converting the heap into max-heap, the
elements of array are -
In the next step, again, we have to delete the root element (81) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (54). After deleting the root element, we again
have to heapify it to convert it into max heap.
After swapping the array element 81 with 54 and converting the heap into max-heap, the
elements of array are -
In the next step, we have to delete the root element (76) from the max heap again. To delete this
node, we have to swap it with the last node, i.e. (9). After deleting the root element, we again
have to heapify it to convert it into max heap.
After swapping the array element 76 with 9 and converting the heap into max-heap, the elements
of array are -
In the next step, again we have to delete the root element (54) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (14). After deleting the root element, we again
have to heapify it to convert it into max heap.
After swapping the array element 54 with 14 and converting the heap into max-heap, the
elements of array are -
In the next step, again we have to delete the root element (22) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (11). After deleting the root element, we again
have to heapify it to convert it into max heap.
After swapping the array element 22 with 11 and converting the heap into max-heap, the
elements of array are -
In the next step, again we have to delete the root element (14) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (9). After deleting the root element, we again
have to heapify it to convert it into max heap.
After swapping the array element 14 with 9 and converting the heap into max-heap, the elements
of array are -
In the next step, again we have to delete the root element (11) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (9). After deleting the root element, we again
have to heapify it to convert it into max heap.
After swapping the array element 11 with 9, the elements of array are -
Now, the heap has only one element left. After deleting it, the heap will be empty.
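The two phases above (build a max heap, then repeatedly swap the root to the end and re-heapify) can be sketched as follows. The eight values 89, 81, 76, 54, 22, 14, 11, 9 appear in the deletion steps above; their original unsorted order is not shown in these notes, so the starting order below is an assumption:

```python
def heapify(a, n, i):
    # Sift a[i] down so the subtree rooted at i is a max heap.
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and a[left] > a[largest]:
        largest = left
    if right < n and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        heapify(a, n, largest)

def heap_sort(a):
    n = len(a)
    # Phase 1: build a max heap from the array.
    for i in range(n // 2 - 1, -1, -1):
        heapify(a, n, i)
    # Phase 2: swap the root with the last node, shrink the heap, re-heapify.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        heapify(a, end, 0)

arr = [81, 89, 9, 11, 14, 76, 54, 22]
heap_sort(arr)
print(arr)  # -> [9, 11, 14, 22, 54, 76, 81, 89]
```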
1. Time Complexity
o Best Case Complexity - It occurs when no sorting is required, i.e. the array is
already sorted. The best-case time complexity of heap sort is O(n log n).
o Average Case Complexity - It occurs when the array elements are in jumbled order that
is not properly ascending and not properly descending. The average case time complexity
of heap sort is O(n log n).
o Worst Case Complexity - It occurs when the array elements are required to be sorted in
reverse order. That means suppose you have to sort the array elements in ascending order,
but its elements are in descending order. The worst-case time complexity of heap sort
is O(n log n).
The time complexity of heap sort is O(n log n) in all three cases (best, average, and worst), because the height of a complete binary tree with n elements is log n, and each of the n root deletions costs at most the height of the heap.
Problem: Given the following six jobs, each taking one unit of time, with the deadlines and profits shown, schedule them to maximize the total profit-
Jobs J1 J2 J3 J4 J5 J6
Deadlines 5 3 3 2 4 2
Profits 200 180 190 300 120 - (the profit of J6 is not given in these notes; it is the smallest)
Answer the following questions-
1. Write the optimal schedule that gives maximum profit.
2. Are all the jobs completed in the optimal schedule?
3. What is the maximum earned profit?
Solution
Step-01:
Sort all the given jobs in decreasing order of their profit-
Jobs J4 J1 J3 J2 J5 J6
Profits 300 200 190 180 120 -
Deadlines 2 5 3 3 4 2
Step-02:
Value of maximum deadline = 5.
So, draw a Gantt chart with maximum time on Gantt chart = 5 units as shown-
Now,
We take each job one by one, in the order obtained in Step-01.
We place the job on the Gantt chart as far away from 0 as possible, i.e. in the latest empty slot on or before its deadline.
Step-03:
We take job J4.
Since its deadline is 2, so we place it in the first empty cell before deadline 2 as-
Step-04:
We take job J1.
Since its deadline is 5, so we place it in the first empty cell before deadline 5 as-
Step-05:
We take job J3.
Since its deadline is 3, so we place it in the first empty cell before deadline 3 as-
Step-06:
We take job J2.
Since its deadline is 3, so we place it in the first empty cell before deadline 3.
Since the second and third cells are already filled, we place job J2 in the first cell, as-
Step-07:
Now, we take job J5.
Since its deadline is 4, so we place it in the first empty cell before deadline 4 as-
Now, the only job left is J6, whose deadline is 2. All the slots before deadline 2 are already occupied, so job J6 cannot be completed.
Now, the given questions may be answered as-
Part-01:
The optimal schedule is-
J2 , J4 , J3 , J5 , J1
This is the required order in which the jobs must be completed in order to obtain the maximum
profit.
Part-02:
Not all jobs are completed in the optimal schedule, because job J6 could not be completed within its deadline.
Part-03:
Maximum earned profit
= Sum of profit of all the jobs in optimal schedule
= Profit of job J2 + Profit of job J4 + Profit of job J3 + Profit of job J5 + Profit of job J1
= 180 + 300 + 190 + 120 + 200
= 990
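The greedy procedure above can be sketched in code. The profits below are taken from the solution steps; the profit of J6 is not given in these notes, so 100 is an assumed placeholder (any value below 120 keeps J6 last in the sorted order and leaves the answer unchanged):

```python
jobs = {  # job: (deadline, profit); J6's profit of 100 is an assumption
    "J1": (5, 200), "J2": (3, 180), "J3": (3, 190),
    "J4": (2, 300), "J5": (4, 120), "J6": (2, 100),
}

max_deadline = max(d for d, _ in jobs.values())
slots = [None] * (max_deadline + 1)   # slots[1..max_deadline] of the Gantt chart

# Consider jobs in decreasing order of profit, placing each one
# as far from 0 as possible, i.e. in the latest free slot on or
# before its deadline; if no slot is free, the job is rejected.
for job, (deadline, profit) in sorted(jobs.items(), key=lambda kv: -kv[1][1]):
    for t in range(deadline, 0, -1):
        if slots[t] is None:
            slots[t] = job
            break

schedule = [j for j in slots[1:] if j is not None]
total = sum(jobs[j][1] for j in schedule)
print(schedule, total)  # -> ['J2', 'J4', 'J3', 'J5', 'J1'] 990
```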