Unit3 Greedy
(BTECCE21501)
Prof. Jayendra Jadhav
Vishwakarma University, Pune
UNIT 3
Greedy Technique and Dynamic
Programming
Outline
Greedy Method:
– Applications: Fractional Knapsack problem, 0/1 Knapsack problem,
Coin changing problem, Container loading problem, Job sequencing
with deadlines.
– Minimum Cost Spanning Trees: Prim’s algorithm and Kruskal’s
Algorithm,
– Single Source Shortest path problem: Dijkstra’s algorithm &
Bellman Ford Algorithm, Optimal Merge pattern, Huffman Trees.
Dynamic Programming:
– Principle of optimality, Strassen’s method for Matrix multiplication,
Floyd’s algorithm, Multi stage graph, Optimal Binary Search Trees,
Knapsack Problem.
Greedy Method
Greedy Method – Key Principles:
– Optimal Substructure: an optimal solution to the problem contains optimal solutions to its subproblems.
– No Backtracking: once a choice is made, it is never reconsidered.
– Iterative Process: the solution is built up one choice at a time.
– Feasibility: every choice must keep the partial solution valid.
Problem Statement
Given: n items, where item i has weight wi and value vi, and a knapsack of capacity W.
Objective: fill the knapsack so that the total value carried is maximized; fractions of items may be taken.
1. Compute Ratios
ratioi = vi / wi
2. Sort Items in decreasing order of this ratio.
Items 1 2 3 4 5
Weights (Kg) 3 3 2 5 1
Values 10 15 10 20 8
Fractional Knapsack Problem Example
Given, n = 5 and knapsack capacity W = 10
Wi = {3, 3, 2, 5, 1}
Items 1 2 3 4 5
Weights (Kg) 3 3 2 5 1
Values 10 15 10 20 8
vi/wi 3.3 5 5 4 8
Fractional Knapsack Problem Example
Sort the items in decreasing order of the ratio (vi/wi).
Items 5 2 3 4 1
Weights (Kg) 1 3 2 5 3
Values 8 15 10 20 10
vi/wi 8 5 5 4 3.3
Fractional Knapsack Problem Example
Initially, Knapsack = 0
Items 5 2 3 4 1
Weights (Kg) 1 3 2 5 3
Values 8 15 10 20 10
vi/wi 8 5 5 4 3.3
Knapsack 1 1 1 4/5 0
Remaining Weight 10-1=9 9-3=6 6-2=4 4-4=0 0
Fractional Knapsack Problem Example
Items 5 2 3 4 1
Weights (Kg) 1 3 2 5 3
Values 8 15 10 20 10
vi/wi 8 5 5 4 3.3
Knapsack 1 1 1 4/5 0
Maximum value = 8 + 15 + 10 + (4/5)×20 = 49
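The greedy procedure above (compute ratios, sort, then fill) can be sketched as follows, using the data from this example (weights {3, 3, 2, 5, 1}, values {10, 15, 10, 20, 8}, capacity W = 10):

```python
# Fractional knapsack: greedy by value-to-weight ratio.
def fractional_knapsack(weights, values, capacity):
    # Sort items in decreasing order of vi/wi.
    items = sorted(zip(weights, values), key=lambda wv: wv[1] / wv[0], reverse=True)
    total = 0.0
    for w, v in items:
        if capacity >= w:              # the whole item fits: take all of it
            capacity -= w
            total += v
        else:                          # take the fraction that still fits, then stop
            total += v * (capacity / w)
            break
    return total

print(fractional_knapsack([3, 3, 2, 5, 1], [10, 15, 10, 20, 8], 10))  # 49.0
```

The item with ratio 8 is taken first, the 4/5 fraction of the weight-5 item last, matching the table above.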
Example2
Items 1 2 3 4 5 6
Weights (Kg) 6 10 3 5 1 3
Values 6 2 1 8 3 5
Example2
Items 1 2 3 4 5 6
Weights (Kg) 6 10 3 5 1 3
Values 6 2 1 8 3 5
Vi/Wi 1.00 0.20 0.33 1.60 3.00 1.66
Example2
Sort the items in decreasing order of the ratio (vi/wi).
Items 5 6 4 1 3 2
Weights (Kg) 1 3 5 6 3 10
Values 3 5 8 6 1 2
Vi/Wi 3.00 1.66 1.60 1.00 0.33 0.20
Example2
Initially, Knapsack = 0
Total Capacity = 16
Items 5 6 4 1 3 2
Weights (Kg) 1 3 5 6 3 10
Values 3 5 8 6 1 2
Vi/Wi 3.00 1.66 1.60 1.00 0.33 0.20
Knapsack 1 1 1 1 1/3 0
Weight = [(1×1)+(1×3)+(1×5)+(1×6)+(1/3×3)] = 16
Value = [(1×3)+(1×5)+(1×8)+(1×6)+(1/3×1)] = 22.33
Practice Example:
Items 1 2 3 4 5
Weights (Kg) 2 1 3 2 4
Values 12 10 25 15 25
Fractional Knapsack Problem Complexity
Sorting the items by ratio takes O(n log n); the greedy pass over the sorted items is O(n), so the overall time is O(n log n).
Coin Changing Problem
Given a set of coin denominations and a target amount, pay the amount using the minimum number of coins. The greedy strategy repeatedly picks the largest denomination that does not exceed the remaining amount.
For Example
Target amount: 63
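The greedy strategy can be sketched as below. The slide gives only the target amount 63; the denomination set is not shown, so the standard set {25, 10, 5, 1} is assumed here for illustration:

```python
# Greedy coin changing: always take the largest denomination that still fits.
# The denomination set {25, 10, 5, 1} is an assumption for this sketch.
def greedy_change(amount, denominations):
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:             # take coin d as long as it fits
            amount -= d
            coins.append(d)
    return coins

print(greedy_change(63, [1, 5, 10, 25]))  # [25, 25, 10, 1, 1, 1] -> 6 coins
```

Note that greedy is optimal for canonical coin systems like this one, but can fail for arbitrary denomination sets (e.g. {1, 3, 4} with amount 6, where greedy uses 4+1+1 instead of 3+3).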
Coin Changing Problem Example1
Job Sequencing with Deadlines – Problem Statement:
Given n jobs, each with a deadline and a profit, schedule at most one job per unit time slot so that every scheduled job finishes by its deadline and the total profit is maximized.
Initialize a Schedule:
Determine the maximum deadline to decide the number of
available slots.
Here, the maximum deadline is 4 (from job J5).
Create a schedule array of size equal to the maximum
deadline, i.e. size 4, initialized to None.
Initially, all slots are free.
Slots: [None, None, None, None]
Job Sequencing with Deadlines Example1
Job Deadline Profit Action Slots
J4 3 100 Place in slot 3 (latest available before or on deadline). [None, None, J4, None]
J5 4 80 Place in slot 4 (latest available before or on deadline). [None, None, J4, J5]
J2 2 60 Place in slot 2 (latest available before or on deadline). [None, J2, J4, J5]
J3 1 40 Place in slot 1 (latest available before or on deadline). [J3, J2, J4, J5]
Final schedule: [J3, J2, J4, J5]; total profit = 100 + 80 + 60 + 40 = 280.
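The procedure above (sort by profit, place each job in the latest free slot at or before its deadline) can be sketched with the jobs from this example:

```python
# Job sequencing with deadlines, greedy by profit.
# Each job is a (name, deadline, profit) tuple.
def job_sequencing(jobs):
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)   # highest profit first
    max_deadline = max(j[1] for j in jobs)
    slots = [None] * max_deadline       # slot index s holds the job run in unit s+1
    profit = 0
    for name, deadline, p in jobs:
        # Try the latest slot at or before the deadline, moving earlier if taken.
        for s in range(min(deadline, max_deadline) - 1, -1, -1):
            if slots[s] is None:
                slots[s] = name
                profit += p
                break
    return slots, profit

schedule, total = job_sequencing(
    [("J4", 3, 100), ("J5", 4, 80), ("J2", 2, 60), ("J3", 1, 40)])
print(schedule, total)  # ['J3', 'J2', 'J4', 'J5'] 280
```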
Prim’s algorithm
Kruskal’s Algorithm
Undirected & Connected Graphs
In other words, for a connected graph with |V| = 5 vertices:
– The number of edges in any spanning
tree is E′ = |V| − 1
– E′ = 4.
– The possible spanning trees:
Spanning Trees
Minimum Cost Spanning Trees
using algorithms:
– Kruskal's Algorithm.
– Prim's Algorithm
Minimum Cost Spanning Trees
Kruskal's Algorithm: builds the MST by repeatedly adding the globally cheapest edge that does not create a cycle.
Prim's Algorithm: grows the MST from a start vertex by repeatedly adding the cheapest edge connecting the tree to a new vertex.
Step 4 - Now, select the edge CD, and add it to the MST.
Prim’s Algorithm
Step 5 - Now, choose the edge CA. Here, we cannot select the
edge CE as it would create a cycle in the graph. So, choose the
edge CA and add it to the MST.
Prim’s Algorithm
[Figure: weighted undirected graph on vertices A–F used for the Prim’s algorithm example]
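The step-by-step process above can be sketched with a min-heap. The 4-vertex adjacency list below is a small assumed example, not the graph from the figure:

```python
import heapq

# Prim's algorithm: grow the tree from a start vertex, always taking the
# cheapest edge that reaches a vertex not yet in the tree.
def prim_mst(adj, start):
    visited = {start}
    heap = [(w, start, v) for v, w in adj[start]]   # (weight, from, to)
    heapq.heapify(heap)
    mst, cost = [], 0
    while heap and len(visited) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in visited:                # stale edge: both endpoints already in tree
            continue
        visited.add(v)
        mst.append((u, v, w))
        cost += w
        for x, wx in adj[v]:
            if x not in visited:
                heapq.heappush(heap, (wx, v, x))
    return mst, cost

# Assumed example graph (undirected, as adjacency lists).
adj = {
    "A": [("B", 2), ("C", 3)],
    "B": [("A", 2), ("C", 1), ("D", 4)],
    "C": [("A", 3), ("B", 1), ("D", 5)],
    "D": [("B", 4), ("C", 5)],
}
tree, cost = prim_mst(adj, "A")
print(cost)  # 7, using edges A-B (2), B-C (1), B-D (4)
```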
Kruskal’s Algorithm
[Figure: the same weighted undirected graph on vertices A–F, used for the Kruskal’s algorithm example]
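Kruskal’s algorithm can be sketched with a union-find structure to detect cycles, on the same assumed 4-vertex example graph as in the Prim sketch (not the figure’s graph):

```python
# Kruskal's algorithm: sort all edges by weight and keep each edge that
# joins two different components (i.e., does not close a cycle).
def kruskal_mst(vertices, edges):
    parent = {v: v for v in vertices}

    def find(v):                        # find with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst, cost = [], 0
    for w, u, v in sorted(edges):       # edges as (weight, u, v)
        ru, rv = find(u), find(v)
        if ru != rv:                    # different components: keep the edge
            parent[ru] = rv
            mst.append((u, v, w))
            cost += w
    return mst, cost

edges = [(2, "A", "B"), (3, "A", "C"), (1, "B", "C"), (4, "B", "D"), (5, "C", "D")]
tree, cost = kruskal_mst("ABCD", edges)
print(cost)  # 7: edges B-C (1) and A-B (2) are kept, A-C (3) is rejected as a cycle
```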
Solve using Prim’s & Kruskal’s Algorithm
Single Source Shortest Path Problem
In SSSP Problem:
– You are given a weighted graph (which can be
directed or undirected).
– The graph has vertices connected by edges with
non-negative or negative weights.
– You are asked to determine the shortest (minimum
weight) path from a single source vertex to every
other vertex in the graph.
Single Source Shortest Path Problem
Key Terms:
– Shortest Path: The path between two vertices such
that the sum of the weights of the edges in the path
is minimized.
– Source Vertex: The starting point of the shortest
path search.
– Weighted Graph: A graph where edges have weights
representing the cost or distance between vertices.
Single Source Shortest Path Problem
Dijkstra’s algorithm
Dijkstra’s algorithm
Conceived by Dutch computer scientist Edsger W. Dijkstra in 1956.
[Figure: weighted graph on vertices A–F; reading the edge weights off the distance table below, the edges are A–B = 4, A–C = 2, B–C = 1, B–D = 5, C–D = 8, C–E = 10, D–E = 2, D–F = 6, E–F = 6]
Dijkstra’s algorithm
Relaxation step: for each edge (u, v), if d[u] + c(u, v) < d[v], then update d[v] = d[u] + c(u, v).
Dijkstra’s algorithm
Distance table (each row shows the distances after selecting the labelled vertex; the first row is the initialization):

       A   B   C   D   E   F
A      0   ∞   ∞   ∞   ∞   ∞
A      0   4   2   ∞   ∞   ∞
C      0   3   2   10  12  ∞
B      0   3   2   8   12  ∞
D      0   3   2   8   10  14
E      0   3   2   8   10  14
F      0   3   2   8   10  14

Final shortest distances from A: B = 3, C = 2, D = 8, E = 10, F = 14.
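The table above can be reproduced with a min-heap implementation of Dijkstra’s algorithm, using the edge weights read off the slides’ graph:

```python
import heapq

# Dijkstra's algorithm with a min-heap (priority queue).
def dijkstra(adj, source):
    dist = {v: float("inf") for v in adj}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:                 # stale heap entry, already improved
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:         # relaxation: d[u] + c(u, v) < d[v]
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# Undirected edges from the example graph.
edges = [("A", "B", 4), ("A", "C", 2), ("B", "C", 1), ("B", "D", 5),
         ("C", "D", 8), ("C", "E", 10), ("D", "E", 2), ("D", "F", 6),
         ("E", "F", 6)]
adj = {v: [] for v in "ABCDEF"}
for u, v, w in edges:
    adj[u].append((v, w))
    adj[v].append((u, w))

print(dijkstra(adj, "A"))  # {'A': 0, 'B': 3, 'C': 2, 'D': 8, 'E': 10, 'F': 14}
```

The result matches the final row of the distance table.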
Dijkstra’s Algorithm
The goal is to find the shortest path from vertex 0 to all other vertices.
Initialize distances
Distance from 0 to 0 = 0
Distance from 0 to 1 = ∞
Distance from 0 to 2 = ∞
Distance from 0 to 3 = ∞
Distance from 0 to 4 = ∞
Distance from 0 to 5 = ∞
Distance from 0 to 6 = ∞
Negative Weight Cycle
– Negative weight edges can create negative weight cycles i.e. a cycle that
will reduce the total path distance by coming back to the same point.
– Negative weight cycles can give an incorrect result when trying to find
the shortest path.
Bellman Ford Algorithm – Key Features:
– Works on graphs with negative edge weights, unlike Dijkstra’s algorithm.
– Detects negative weight cycles by running one extra relaxation round.
Step 1:
Initialize a distance array Dist[] to store the shortest distance for each vertex from
the source vertex.
Initially, the distance of the source is 0 and the distance of every other vertex is ∞.
Bellman Ford Algorithm
Step 2: 1st Relaxation
Start relaxing the edges, during 1st Relaxation:
Current Distance of B > (Distance of A) + (Weight of A to B)
Dist[B] : ∞ > (0 + 5) Dist[B] = 5
Bellman Ford Algorithm
Step 3: 2nd Relaxation
Current Distance of D > (Distance of B) + (Weight of B to D)
Dist[D] : ∞ > (5 + 2) Dist[D] = 7
Current Distance of C > (Distance of B) + (Weight of B to C)
Dist[C] : ∞ > 5 + 1 Dist[C] = 6
Bellman Ford Algorithm
Step 4: 3rd Relaxation
Current Distance of F > (Distance of D ) + (Weight of D to F)
Dist[F] : ∞ > 7 + 2 Dist[F] = 9
Current Distance of E > (Distance of C ) + (Weight of C to E)
Dist[E] : ∞ > 6 + 1 Dist[E] = 7
Bellman Ford Algorithm
Step 5: 4th Relaxation
Current Distance of D > (Distance of E) + (Weight of E to D)
Dist[D] : 7 > 7 + (-1) Dist[D] = 6
Current Distance of E > (Distance of F ) + (Weight of F to E)
Dist[E] : 7 > 9 + (-3) Dist[E] = 6
Bellman Ford Algorithm
Step 6: 5th Relaxation
Current Distance of F > (Distance of D) + (Weight of D to F)
Dist[F] : 9 > 6 + 2 Dist[F] = 8
Current Distance of D > (Distance of E ) + (Weight of E to D)
Dist[D] : 6 > 6 + (-1) Dist[D] = 5
Since the graph has 6 vertices, after the 5th relaxation the shortest distances for
all the vertices should have been calculated.
Bellman Ford Algorithm
Now the final relaxation, i.e. the 6th relaxation, should indicate the presence of a
negative cycle if there is any change in the distance array after the 5th relaxation.
Since we observe changes in the distance array, we can conclude that the graph
contains a negative weight cycle (D → F → E → D, with total weight 2 − 3 − 1 = −2).
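The |V| − 1 relaxation rounds plus the extra detection round can be sketched on the directed edges of this worked example:

```python
# Bellman-Ford with negative-cycle detection.
def bellman_ford(vertices, edges, source):
    dist = {v: float("inf") for v in vertices}
    dist[source] = 0
    for _ in range(len(vertices) - 1):          # |V| - 1 relaxation rounds
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One extra round: any further improvement means a reachable negative cycle.
    has_negative_cycle = any(dist[u] + w < dist[v] for u, v, w in edges)
    return dist, has_negative_cycle

# Directed edges from the example steps above (E->D = -1, F->E = -3).
edges = [("A", "B", 5), ("B", "D", 2), ("B", "C", 1), ("D", "F", 2),
         ("C", "E", 1), ("E", "D", -1), ("F", "E", -3)]
dist, cyclic = bellman_ford("ABCDEF", edges, "A")
print(cyclic)  # True: the cycle D -> F -> E -> D has total weight -2
```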
Optimal Merge Pattern
Problem Definition:
– Given n sorted files of different sizes,
merge these files into one single file with the least
possible merging cost. Merging two files of sizes a and b costs a + b.
Approach:
– At each step, merge the two smallest files; this greedy choice minimizes the total merging cost.
Normal (left-to-right) merging:
20, 30, 10, 5, 30
M1 = 20+30 = 50
M2 = 50+10 = 60
M3 = 60+5 = 65
M4 = 65+30 = 95
Merge Cost = 50+60+65+95 = 270
Optimal Merge Pattern
Greedy Approach
Sort the file sizes: 20, 30, 10, 5, 30 → (5, 10, 20, 30, 30)
Merge the two smallest files: (5, 10, 20, 30, 30) => (15, 20, 30, 30), cost 15
Merge the two smallest files: (15, 20, 30, 30) => (30, 30, 35), cost 35
Merge the two smallest files: (30, 30, 35) => (35, 60), cost 60
Merge the last two files: (35, 60) => (95), cost 95
Total Merge Cost = 15 + 35 + 60 + 95 = 205 (versus 270 for the normal order)
Optimal Merge Pattern
5, 3, 2, 7, 9, 13
Optimal Merge Pattern
5, 3, 2, 7, 9, 13
Merge Cost = 5 + 10 + 16 + 23 + 39 = 93
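The “merge the two smallest files” strategy maps directly onto a min-heap; the sketch below reproduces both worked answers:

```python
import heapq

# Optimal merge pattern: repeatedly merge the two smallest files.
# Merging files of sizes a and b costs a + b.
def optimal_merge_cost(sizes):
    heap = list(sizes)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)         # two smallest remaining files
        b = heapq.heappop(heap)
        total += a + b                  # cost of this merge
        heapq.heappush(heap, a + b)     # the merged file goes back in
    return total

print(optimal_merge_cost([20, 30, 10, 5, 30]))   # 205
print(optimal_merge_cost([5, 3, 2, 7, 9, 13]))   # 93
```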
Huffman Trees
– For each character, create a leaf node with the character and its frequency.
– Combine the two nodes with the smallest frequencies into a new internal node
whose frequency is the sum of the two. These nodes become children of the new
node.
– Repeat this process until all nodes are merged into a single tree. The final root
node will represent the combined frequency of all characters.
Assign Codes:
– Once the tree is built, assign binary codes to each character. The left branch
typically represents a 0, and the right branch represents a 1.
Huffman Trees
https://opendsa-server.cs.vt.edu/ODSA/Books/Everything/html/Huffman.html
Huffman Trees
[Figure: Huffman tree for the example; the left branch of each node is labelled 0, the right branch 1]
Huffman Codes:
E 0
U 100
D 101
L 110
C 1110
M 11111
Z 111100
K 111101
Huffman Trees
ENGINEERING
Character frequencies:
E N G I R
3 3 2 2 1
Sorted in ascending order of frequency:
R G I E N
1 2 2 3 3
Huffman Trees
Leaf nodes: R1, G2, I2, E3, N3
[Figures: step-by-step merging of the two smallest nodes into the final tree omitted]
Huffman Trees
The completed tree yields a prefix code for each character, which is then used for encoding.
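The build-and-assign procedure above can be sketched in Python for "ENGINEERING". Ties between equal frequencies may be broken either way, so individual codes can differ from a hand-drawn tree, but the total encoded length is always the optimal one:

```python
import heapq
from collections import Counter

# Build a Huffman tree and read off the codes (left branch = 0, right = 1).
def huffman_codes(text):
    freq = Counter(text)
    # Heap entries: (frequency, unique tie-breaker, tree), where a tree is
    # either a single character or a (left, right) pair.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)     # two smallest-frequency nodes
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, i, (t1, t2)))
        i += 1
    codes = {}

    def walk(tree, prefix):
        if isinstance(tree, str):
            codes[tree] = prefix or "0"     # single-character edge case
        else:
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")

    walk(heap[0][2], "")
    return codes, freq

codes, freq = huffman_codes("ENGINEERING")
encoded_bits = sum(freq[c] * len(codes[c]) for c in freq)
print(encoded_bits)  # 25 bits for the 11-character string
```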