Lec09: Dynamic Programming
Algorithms and Analysis of Algorithms
9. Dynamic Programming
Tentative Course Outline
• Divide and Conquer: useful for problems that can be divided into smaller, independent subproblems.
• Greedy: suitable for problems where making the locally optimal choice at each stage leads to a globally optimal solution.
• Dynamic Programming
4. Minimum Spanning Tree: Spans all the vertices in a connected, undirected graph with the minimum possible sum of edge weights.
Prim’s Algorithm
- Finds the minimum spanning tree of a connected, undirected graph with weighted edges.
- This algorithm is used to design efficient network layouts.
Kruskal’s Algorithm
- Another approach to finding a minimum spanning tree in a graph.
- It focuses on selecting the smallest-weight edges while avoiding cycles.
5. Shortest Path: Find the shortest path from a source node to all other nodes in a weighted graph. It's
commonly used in routing and network protocols.
Dijkstra's Algorithm
6. Interval Scheduling: Given a set of intervals, find the maximum number of non-overlapping intervals that
can be selected.
1. Sample Greedy Algorithms
5. Shortest Path
Dijkstra’s Algorithm
✓ Optimization problem: find all shortest paths from the source.
✓ Construction of the solution: shortest paths are built vertex by vertex.
✓ Greedy choice: at each step, choose the closest reachable vertex.
Graph representation: [figure of the weighted example graph omitted]
1. Sample Greedy Algorithms
5. Shortest Path
dist[s] ← 0 (distance to source vertex is zero)
for all v ∈ V − {s}
    do dist[v] ← ∞ (set all other distances to infinity)
S ← ∅ (S, the set of visited vertices, is initially empty)
Q ← V (Q, the queue, initially contains all vertices)
while Q ≠ ∅ (while the queue is not empty)
    do u ← mindistance(Q, dist) (select the element of Q with the minimum distance)
       S ← S ∪ {u} (add u to the set of visited vertices, remove it from Q)
       for all v ∈ neighbors[u]
           do if dist[v] > dist[u] + w(u, v) (if a new shorter path is found)
               then dist[v] ← dist[u] + w(u, v) (set the new value of the shortest path)
return dist
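The pseudocode translates directly into a short Python sketch. The adjacency format below (a dict mapping each vertex to (neighbor, weight) pairs) and the use of a binary heap in place of mindistance(Q, dist) are implementation choices assumed here, not taken from the slides:

import heapq

def dijkstra(graph, s):
    # graph: dict mapping each vertex to a list of (neighbor, weight) pairs
    # (this adjacency format is an assumption for illustration).
    dist = {v: float('inf') for v in graph}  # dist[v] <- infinity
    dist[s] = 0                              # dist[s] <- 0
    visited = set()                          # the set S of visited vertices
    pq = [(0, s)]                            # the queue Q, keyed by distance
    while pq:
        d, u = heapq.heappop(pq)             # u <- mindistance(Q, dist)
        if u in visited:
            continue                         # skip stale queue entries
        visited.add(u)                       # S <- S ∪ {u}
        for v, w in graph[u]:
            if dist[v] > dist[u] + w:        # a shorter path to v is found
                dist[v] = dist[u] + w
                heapq.heappush(pq, (dist[v], v))
    return dist

# Example run on a small weighted graph (edge weights chosen arbitrarily)
g = {'a': [('b', 4), ('c', 1)],
     'b': [('d', 1)],
     'c': [('b', 2), ('d', 5)],
     'd': []}
print(dijkstra(g, 'a'))                      # {'a': 0, 'b': 3, 'c': 1, 'd': 4}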
5. Shortest Path
Dijkstra’s Algorithm generates the shortest paths from node a to all other nodes in the graph.
6. Interval Scheduling
• There are n meeting requests; meeting i occupies the time interval (s_i, t_i).
• Two meetings cannot both be scheduled if their intervals overlap.
• Goal: schedule as many meetings as possible.
Example: Meetings (1, 3), (2, 4), (4, 5), (4, 6), (6, 8)
Solution: 3 meetings: (1, 3), (4, 5), (6, 8)
1. Sample Greedy Algorithms
6. Interval Scheduling
• Consider a set of tasks each represented by an interval describing the time in which it needs to be
processed by some machine.
• For instance, task A might run from 2:00 to 5:00, task B might run from 4:00 to 10:00 and task C might
run from 9:00 to 11:00.
• A subset of intervals is compatible if no two intervals overlap on the machine/resource.
• For example, the subset {A,C} is compatible, as is the subset {B}; but neither {A,B} nor {B,C} are
compatible subsets, because the corresponding intervals within each subset overlap.
• The interval scheduling maximization problem is to find a largest compatible set, i.e., a set of non-overlapping intervals of maximum size.
Interval Scheduling:
• Job j starts at s_j and finishes at f_j.
• Two jobs are compatible if they don't overlap.
• Goal: find a maximum subset of mutually compatible jobs.
1. Sample Greedy Algorithms
6. Interval Scheduling
[Figure: jobs a–h drawn as intervals on a timeline from 0 to 11]
1. Sample Greedy Algorithms
6. Interval Scheduling
• Greedy template.
• Consider jobs in some natural order.
• Take each job provided it's compatible with the ones already taken.
[Fewest conflicts] For each job j, count the number of conflicting jobs c_j. Schedule jobs in ascending order of c_j.
1. Sample Greedy Algorithms
6. Interval Scheduling
[Figure: jobs A–H shown as intervals on a timeline from 0 to 11. Stepping through the jobs in order of increasing finish time: B is taken first; C and A are rejected because they overlap B; E is taken; D is rejected; F and G are rejected because they overlap E; H is taken. The greedy algorithm ends with the compatible set {B, E, H}.]
1. Sample Greedy Algorithms
[Earliest finish time] Consider jobs in increasing order of finish time. Take each job provided it's compatible with the ones already taken.

Sort jobs by finish times so that f_1 ≤ f_2 ≤ ... ≤ f_n.
S ← ∅ (set of jobs selected)
for j = 1 to n {
    if (job j compatible with S)
        S ← S ∪ {j}
}
return S

Implementation.
• Remember the job j* that was added last to S.
• Job j is compatible with S if s_j ≥ f_{j*}.
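A minimal Python sketch of this earliest-finish-time greedy, assuming each job is given as a (start, finish) pair:

def max_compatible(jobs):
    # Earliest-finish-time greedy: sort by finish time, then take each
    # job whose start is at or after the finish of the last job taken.
    selected = []
    last_finish = float('-inf')      # f_{j*} of the job added last to S
    for s, f in sorted(jobs, key=lambda job: job[1]):
        if s >= last_finish:         # job j is compatible: s_j >= f_{j*}
            selected.append((s, f))
            last_finish = f
    return selected

# The meetings from the earlier example
print(max_compatible([(1, 3), (2, 4), (4, 5), (4, 6), (6, 8)]))
# [(1, 3), (4, 5), (6, 8)] -- 3 meetings, matching the solution above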
2. Dynamic Programming
• Divide and Conquer: useful for problems that can be divided into smaller, independent subproblems.
• Greedy: suitable for problems where making the locally optimal choice at each stage leads to a globally optimal solution.
• Dynamic Programming
Split the problem of finding the nth Fibonacci number into two sub-problems.
Example: finding the 5th Fibonacci number.
Can we do better?
Dynamic programming avoids redundant computations and improves the time complexity of the algorithm compared to a plain recursive approach.
2. Dynamic Programming
• We want to find F5.
• We know that F0 = 0 and F1 = 1.
• Keep these values in a table.
• Now compute F2 = F1 + F0 (the sum of the previous two Fibonacci numbers) and store the result in the table.
• Then compute F3 = F2 + F1 and store the result in the table.
  - Note that to compute F3 we needed the value of F2.
  - We did not re-compute the value of F2.
  - We retrieved the value of F2 from the table in which it was stored.
• Do the same thing for F4 and F5.
Table so far: F0 = 0, F1 = 1, F2 = 1, F3 = 2
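A minimal Python sketch of this table-based computation, storing each F_i so it is computed only once:

def fib(n):
    # Bottom-up Fibonacci: fill the table from F0 and F1 upward,
    # retrieving earlier values instead of re-computing them.
    table = [0, 1]                   # F0 = 0, F1 = 1
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]

print(fib(5))                        # 5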
2. Dynamic Programming
Dynamic programming is a technique for solving problems with the following properties:
• An instance is solved using the solutions for smaller instances.
• The solution for a smaller instance might be needed multiple times.
• The solutions to smaller instances are stored in a table, so that each smaller instance is solved
only once.
• Additional space is used to save time.
2. Dynamic Programming
• Both Divide and Conquer and Dynamic Programming split a problem into sub-problems.
• Divide and Conquer is suitable when the sub-problems are independent of each other (i.e. the solution of a sub-problem does not depend on the solution of another sub-problem), e.g. sorting the first half of an array is independent of sorting the second half.
• Dynamic Programming is suitable when the sub-problems are dependent on each other, e.g. finding the 3rd Fibonacci number is needed to find the 4th Fibonacci number.
• Divide and Conquer is top down.
• Dynamic Programming is bottom up.
2. Dynamic Programming
The Rod Cutting Problem
Optimization problem:
• There are many ways to cut the rod; we want the one that earns the most money.
Price table: Length i = 1, 2, 3, 4; Price p_i: [values not preserved in this copy]
[Figure: illustration of the possible ways to cut the rod]
A Recursive Solution
Let r_i be the maximum amount of money we can get with a rod of size i.
• We do not know how large a piece we should cut off, so we try all possible cases:
✓ First we try cutting a piece of length 1, and combining it with the optimal way to cut a rod of length n − 1.
✓ Then we try cutting a piece of length 2, and combining it with the optimal way to cut a rod of length n − 2.
• In this way, we try all possible lengths and then pick the best one. We end up with the recurrence given below, after the note on optimal substructure.
Optimal Substructure
• Optimal substructure means that the solution to the given optimization problem can be obtained by the
combination of optimal solutions to its subproblems.
• A problem is said to have optimal substructure if an optimal solution can be constructed efficiently
from optimal solutions of its subproblems.
r_n = max_{1 ≤ i ≤ n} (p_i + r_{n−i})
A Top-down Approach
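The top-down code from the slide is not reproduced in this copy; the following is a minimal memoized Python sketch of the recurrence, with a hypothetical price table (the values below are illustrative, not the slide's):

def cut_rod_topdown(p, n, memo=None):
    # Top-down (memoized) rod cutting: r_n = max over 1 <= i <= n of (p[i] + r_{n-i}).
    # p[i] is the price of a piece of length i (p[0] is unused and set to 0).
    if memo is None:
        memo = {0: 0}                # a rod of length 0 earns nothing
    if n in memo:
        return memo[n]               # reuse a stored subproblem solution
    memo[n] = max(p[i] + cut_rod_topdown(p, n - i, memo)
                  for i in range(1, n + 1))
    return memo[n]

# Hypothetical prices for lengths 1..4 (illustrative values only)
p = [0, 1, 5, 8, 9]
print(cut_rod_topdown(p, 4))         # 10: cut into two pieces of length 2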
2. Dynamic Programming
A Bottom-up Approach
Complexity: O(n²)
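And a matching bottom-up Python sketch, filling the table r[0..n] in increasing order of length; the two nested loops are where the O(n²) running time comes from (same hypothetical price table as above):

def cut_rod_bottomup(p, n):
    # Bottom-up rod cutting: solve lengths 1..n in increasing order,
    # so every r[j - i] needed is already in the table.
    r = [0] * (n + 1)                # r[0] = 0
    for j in range(1, n + 1):
        r[j] = max(p[i] + r[j - i] for i in range(1, j + 1))
    return r[n]

p = [0, 1, 5, 8, 9]                  # same illustrative price table
print(cut_rod_bottomup(p, 4))        # 10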