
UNIT III – Algorithm Design Techniques

Subtitle: Divide & Conquer | Dynamic Programming | Greedy Method

Divide and Conquer

Divide and Conquer – Introduction
• Problem-solving technique
• Break – Solve – Combine
• Examples: Merge Sort, Quick Sort, etc.

Finding Maximum and Minimum
• Description of method
• Time complexity
• Example illustration or pseudocode

Merge Sort
• Key Steps: Divide → Sort → Merge
• Time complexity: O(n log n)
• Stable sort
• Diagram of merging

Quick Sort
• Uses partitioning
• Average case: O(n log n), Worst case: O(n²)
• In-place sorting
• Pivot selection strategies
Divide and Conquer – Introduction

Divide and conquer is a problem-solving strategy that breaks a problem into smaller sub-problems, solves each sub-problem independently, and combines their solutions to solve the original problem.

• Divide: Split the problem into smaller sub-problems.
• Conquer: Solve each sub-problem recursively.
• Combine: Merge the results of the sub-problems to form the final solution.

Examples:
• Merge Sort, Quick Sort, Binary Search, and Maximum & Minimum Finding

• Time Complexity Insight: many divide-and-conquer algorithms achieve O(n log n) time. Binary search, sketched below, is a simple instance of the pattern.
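As a small illustration of the Divide / Conquer / Combine pattern, here is a minimal recursive binary search (a sketch only; the function name and sample array are invented, and Python is used for all code sketches in this unit):

def binary_search(arr, target, low, high):
    # Divide and conquer: halve the (sorted) search range each call.
    if low > high:
        return -1                    # Base case: empty range, not found
    mid = (low + high) // 2          # Divide: pick the middle index
    if arr[mid] == target:
        return mid                   # Conquer: target found
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, high)  # Right half
    return binary_search(arr, target, low, mid - 1)       # Left half

nums = [1, 3, 5, 7, 9, 11]
print(binary_search(nums, 7, 0, len(nums) - 1))  # prints 3

Each call discards half the remaining range, which is why the running time is O(log n).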


Finding Maximum and Minimum

To find the maximum and minimum numbers in a given array numbers[] of size n, the following algorithm can be used.

Algorithm Outline:
• If only one element: it is both max and min.
• If two elements: compare and decide max/min.
• If more than two elements:
  • Find the middle index.
  • Recursively find max/min for the left and right halves.
  • Compare to get the overall max and min.

Divide and Conquer Steps:
• Divide: Split the array into two halves.
• Conquer: Recursively find the max and min of each half.
• Combine: Compare the results to find the overall max and min.
Input Array: [13, 5, 7, 9, 2]

        [13, 5, 7, 9, 2]
         /           \
   [13, 5, 7]       [9, 2]
    /      \         /   \
  [13]   [5, 7]    [9]   [2]
          /  \
        [5]  [7]

Left half:  max = 13, min = 5
Right half: max = 9,  min = 2

Combine: Overall max = 13, Overall min = 2
function findMaxMin(arr, low, high):
    if low == high:
        return (arr[low], arr[low])
    else if high == low + 1:
        if arr[low] > arr[high]:
            return (arr[low], arr[high])
        else:
            return (arr[high], arr[low])
    else:
        mid = (low + high) / 2
        (max1, min1) = findMaxMin(arr, low, mid)
        (max2, min2) = findMaxMin(arr, mid + 1, high)
        return (max(max1, max2), min(min1, min2))

Example
Array: [2, 8, 1, 4, 7, 6]
• Divide into [2, 8, 1] and [4, 7, 6]
• Find max/min in both parts:
  • Left: max = 8, min = 1
  • Right: max = 7, min = 4
• Combine:
  • Overall max = 8
  • Overall min = 1
Time Complexity

Overall Time Complexity: O(n)

Comparison Count:
• Linear scan: 2(n - 1) comparisons.
• Divide and Conquer: (3n/2) - 2 comparisons.

Advantage: Fewer comparisons, making it efficient for large arrays. A runnable sketch that counts the comparisons is given below.
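A Python version of the findMaxMin pseudocode above, instrumented with a comparison counter (the counter is an addition for illustration; the exact count 3n/2 - 2 holds when n is a power of two):

def find_max_min(arr, low, high, counter):
    # Divide-and-conquer max/min; counter[0] tracks element comparisons.
    if low == high:                       # One element: both max and min
        return arr[low], arr[low]
    if high == low + 1:                   # Two elements: one comparison
        counter[0] += 1
        if arr[low] > arr[high]:
            return arr[low], arr[high]
        return arr[high], arr[low]
    mid = (low + high) // 2               # More than two: split and recurse
    max1, min1 = find_max_min(arr, low, mid, counter)
    max2, min2 = find_max_min(arr, mid + 1, high, counter)
    counter[0] += 2                       # Two comparisons in the combine step
    return max(max1, max2), min(min1, min2)

count = [0]
print(find_max_min([8, 4, 5, 7, 1, 3, 6, 2], 0, 7, count))  # (8, 1)
print(count[0])  # 10 comparisons = 3*8/2 - 2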

Homework: Array: [8, 3, 5, 1, 6, 2, 9]


Merge Sort

• Merge Sort is a divide and conquer sorting algorithm that divides an array into two halves, recursively sorts each half, and then merges the two sorted halves into a single sorted array.

It works by dividing the unsorted list into smaller sub-lists until each sub-list contains only one element (which is trivially sorted), and then merging those sub-lists to produce new sorted sub-lists until only one sorted list remains.

• Merge Sort guarantees a time complexity of O(n log n) in all cases (best, average, and worst).
Algorithm Steps:
• If the array has one element or is empty, it's already sorted.
• Divide the array into two halves.
• Recursively sort the left half and the right half.
• Merge the two sorted halves into a single sorted array.

Time Complexity:
• Best: O(n log n)
• Average: O(n log n)
• Worst: O(n log n)

Space Complexity:
• O(n) (because of the extra array used for merging)
MERGE_SORT(arr, left, right):
    if left < right:
        mid = (left + right) / 2
        MERGE_SORT(arr, left, mid)
        MERGE_SORT(arr, mid + 1, right)
        MERGE(arr, left, mid, right)

MERGE(arr, left, mid, right):
    create leftArray = arr[left..mid]
    create rightArray = arr[mid+1..right]
    i = 0, j = 0, k = left
    while i < size of leftArray and j < size of rightArray:
        if leftArray[i] <= rightArray[j]:
            arr[k] = leftArray[i]
            i++
        else:
            arr[k] = rightArray[j]
            j++
        k++
    while i < size of leftArray:
        arr[k] = leftArray[i]
        i++
        k++
    while j < size of rightArray:
        arr[k] = rightArray[j]
        j++
        k++
                     [8, 4, 5, 7, 1, 3, 6, 2]
DIVIDE (N/2)           /                \
             [8, 4, 5, 7]          [1, 3, 6, 2]
DIVIDE (N/2)    /      \              /      \
            [8, 4]   [5, 7]       [1, 3]   [6, 2]
DIVIDE (N/2) /   \    /   \        /   \    /   \
           [8]  [4] [5]  [7]     [1]  [3] [6]  [2]
MERGE        \   /    \   /        \   /    \   /
            [4, 8]   [5, 7]       [1, 3]   [2, 6]
MERGE           \      /              \      /
             [4, 5, 7, 8]          [1, 2, 3, 6]
MERGE                  \                /
                     [1, 2, 3, 4, 5, 6, 7, 8]
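For reference, a runnable Python version of the MERGE_SORT pseudocode above (a minimal sketch; the function names are invented):

def merge_sort(arr, left, right):
    # Recursively sort arr[left..right] in place.
    if left < right:
        mid = (left + right) // 2
        merge_sort(arr, left, mid)        # Sort the left half
        merge_sort(arr, mid + 1, right)   # Sort the right half
        merge(arr, left, mid, right)      # Merge the two sorted halves

def merge(arr, left, mid, right):
    # Merge the sorted runs arr[left..mid] and arr[mid+1..right].
    left_part = arr[left:mid + 1]
    right_part = arr[mid + 1:right + 1]
    i = j = 0
    k = left
    while i < len(left_part) and j < len(right_part):
        if left_part[i] <= right_part[j]:   # <= keeps the sort stable
            arr[k] = left_part[i]
            i += 1
        else:
            arr[k] = right_part[j]
            j += 1
        k += 1
    while i < len(left_part):               # Copy any leftovers
        arr[k] = left_part[i]; i += 1; k += 1
    while j < len(right_part):
        arr[k] = right_part[j]; j += 1; k += 1

data = [8, 4, 5, 7, 1, 3, 6, 2]
merge_sort(data, 0, len(data) - 1)
print(data)  # [1, 2, 3, 4, 5, 6, 7, 8]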
Quick Sort
Quick Sort is a divide and conquer sorting algorithm that picks an element as
a pivot and partitions the array around the pivot so that elements smaller than
the pivot are on the left, and elements greater than the pivot are on the right.
Then it recursively sorts the left and right sub-arrays.

Basic Steps:
• Choose a pivot element from the array.
• Partition the array:
• All elements smaller than the pivot go to the left.
• All elements greater than the pivot go to the right.
• Recursively apply Quick Sort to the left and right parts.
QUICKSORT(arr, low, high):
    if low < high:
        pivotIndex = PARTITION(arr, low, high)
        QUICKSORT(arr, low, pivotIndex - 1)
        QUICKSORT(arr, pivotIndex + 1, high)

PARTITION(arr, low, high):
    pivot = arr[high]          // choosing the last element as pivot
    i = low - 1
    for j = low to high - 1:
        if arr[j] <= pivot:
            i = i + 1
            swap arr[i] and arr[j]
    swap arr[i + 1] and arr[high]
    return (i + 1)
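A runnable Python equivalent of this last-element (Lomuto-style) partition scheme (a sketch; the names and sample array are invented):

def quicksort(arr, low, high):
    # Sort arr[low..high] in place.
    if low < high:
        pivot_index = partition(arr, low, high)
        quicksort(arr, low, pivot_index - 1)   # Left of the pivot
        quicksort(arr, pivot_index + 1, high)  # Right of the pivot

def partition(arr, low, high):
    # Place the pivot (last element) in its final sorted position.
    pivot = arr[high]
    i = low - 1                     # Boundary of the "<= pivot" region
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

data = [7, 2, 9, 4, 1, 5, 8]
quicksort(data, 0, len(data) - 1)
print(data)  # [1, 2, 4, 5, 7, 8, 9]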
Dynamic Programming

Dynamic Programming – Introduction
• Optimal substructure
• Overlapping sub-problems
• Memoization vs Tabulation

Multistage Graph
• Graph model
• DP approach for shortest path
• Diagram of layered graph

Matrix Chain Multiplication
• Problem statement
• Cost calculation formula
• DP table example (M[i][j])

Optimal Binary Search Tree (OBST)
• Goal: minimize search cost
• Use of probabilities
• Table construction
Dynamic Programming – Introduction

Dynamic Programming is a commonly used algorithmic technique for optimizing recursive solutions in which the same sub-problems are solved repeatedly.

• It stores the solutions to sub-problems so that each is solved only once.
• It ensures that each recursive value is computed only once.

Elements of dynamic programming:
1. Optimal Substructure
2. Overlapping Sub-problems
3. Memoization vs Tabulation
Optimal Substructure

• A problem has optimal substructure if the solution to the overall problem depends on the solutions to its sub-problems.

Example
Consider the problem of finding the minimum cost path in a weighted graph from a source node to a destination node. We can break this problem down into smaller sub-problems:
• Find the minimum cost path from the source node to each intermediate node.
• Find the minimum cost path from each intermediate node to the destination node.
A small sketch of this recurrence follows.
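As a hedged illustration of this substructure (the graph, edge weights, and names below are invented for the example), the best path from a node extends the best path from one of its neighbors:

# Hypothetical weighted directed acyclic graph: edges[u] lists (neighbor, cost).
edges = {
    "S": [("A", 1), ("B", 4)],
    "A": [("B", 2), ("T", 6)],
    "B": [("T", 3)],
    "T": [],
}

def min_cost(node, dest="T"):
    # Optimal substructure: min cost from node = min over neighbors of
    # (edge cost + min cost from that neighbor).
    if node == dest:
        return 0
    return min(cost + min_cost(nxt, dest) for nxt, cost in edges[node])

print(min_cost("S"))  # 6, via S -> A -> B -> T (1 + 2 + 3)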
Memoization vs Tabulation

Memoization (Top-Down):
• Start with the main problem and break it into sub-problems.
• Store answers to sub-problems (usually using recursion + a cache).
• Uses recursion.
• May have the overhead of the recursion stack.

Tabulation (Bottom-Up):
• Start solving the smaller sub-problems first.
• Fill up a DP table iteratively.
• Uses iteration/loops.
• More space-efficient in some cases.

Both styles are sketched below.
Matrix Chain Multiplication – MCM
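The slide gives only the title; for completeness, the standard DP recurrence is M[i][i] = 0 and M[i][j] = min over i <= k < j of ( M[i][k] + M[k+1][j] + p[i-1]*p[k]*p[j] ), where matrix A_i has dimensions p[i-1] x p[i]. A minimal Python sketch of this recurrence (the dimension list is an invented example):

import sys

def matrix_chain_order(p):
    # p holds matrix dimensions: matrix i has shape p[i-1] x p[i].
    # M[i][j] = minimum scalar multiplications to compute A_i ... A_j.
    n = len(p) - 1                       # Number of matrices
    M = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):       # Chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            M[i][j] = sys.maxsize
            for k in range(i, j):        # Split: (A_i..A_k)(A_k+1..A_j)
                cost = M[i][k] + M[k + 1][j] + p[i - 1] * p[k] * p[j]
                M[i][j] = min(M[i][j], cost)
    return M[1][n]

print(matrix_chain_order([10, 20, 5, 30]))  # 2500 for a 10x20, 20x5, 5x30 chain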
Greedy Technique

Greedy Strategy – Introduction
• Greedy-choice property
• Optimal substructure
• When it works well

Activity Selection Problem
• Select the max number of non-overlapping activities
• Sort by finish time
• Greedy approach steps (a sketch follows below)

Optimal Merge Pattern
• Merge files with minimal cost
• Similar to Huffman coding
• Priority queue approach

Huffman Trees
• Used in compression
• Build optimal prefix code
• Tree construction steps
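Following the outline's "sort by finish time" step, a minimal activity-selection sketch (the activity list is an invented example):

def select_activities(activities):
    # Greedy activity selection: repeatedly pick the earliest-finishing
    # activity that does not overlap the last one chosen.
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:     # Non-overlapping: greedily take it
            chosen.append((start, finish))
            last_finish = finish
    return chosen

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(acts))  # [(1, 4), (5, 7), (8, 11)]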
