Lec09-Dynamic Programming

The document outlines algorithms and analysis of algorithms topics including dynamic programming. It provides an introduction to greedy algorithms and lists examples like coin changing, fractional knapsack, and shortest paths. Interval scheduling is described as finding the maximum number of non-overlapping intervals that can be selected.

Uploaded by

MAJD ABDALLAH

Algorithms and Analysis of Algorithms

9. Dynamic Programming
Tentative Course Outline

1. Introduction and Overview


2. Introduction to Algorithms and Analysis, Asymptotic Notations
3. Recursion
4. Sorting Algorithms
5. Searching Algorithms
6. Divide and Conquer Algorithms
7. Greedy Algorithms
8. Dynamic Programming
9. Graph Algorithms-1
10. Graph Algorithms-2
11. Tree Algorithms-1
12. Tree Algorithms-2
13. Backtracking
14. Approximation Approaches
Outline

1. Sample Greedy Algorithms


2. Dynamic Programming
1. Sample Greedy Algorithms

Basic Algorithm Design Techniques


The choice of algorithm depends on the specific characteristics of the problem at hand.

• Divide and Conquer: useful for problems that can be divided into smaller, independent subproblems.
• Greedy: suitable for problems where making the locally optimal choice at each stage leads to a globally optimal solution.
• Dynamic Programming: effective for problems with overlapping subproblems and optimal substructure, allowing the reuse of previously computed solutions.
1. Sample Greedy Algorithms

1. Coin Changing Problem: Given a set of coin denominations and an amount to make change for, find the minimum number of coins needed to make that amount.

2. Fractional Knapsack Problem: Given a set of items, each with a weight and a value, and a knapsack with a maximum weight capacity, select a combination of items to maximize the total value while staying within the weight limit.

3. Huffman Coding: Given a set of characters and their frequencies in a text, construct a binary tree that encodes the characters so as to minimize the total length of the encoded message (using shorter binary representations for characters with high frequencies).
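The coin-changing greedy choice can be sketched as follows. This is a minimal sketch, not from the slides; the function name and the denomination set are my own. Note that the greedy strategy is only guaranteed optimal for canonical coin systems such as {25, 10, 5, 1}; for arbitrary denominations it can return a non-minimal answer.

```python
def greedy_coin_change(amount, denominations):
    """Repeatedly take the largest denomination that still fits.
    Optimal for canonical coin systems (e.g. 25/10/5/1),
    but not guaranteed optimal for arbitrary denominations."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            amount -= d
            coins.append(d)
    if amount != 0:
        raise ValueError("amount cannot be made with these denominations")
    return coins

print(greedy_coin_change(63, [25, 10, 5, 1]))  # [25, 25, 10, 1, 1, 1]
```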
1. Sample Greedy Algorithms

4. Minimum Spanning Tree: Spans all the vertices in a connected, undirected graph with the minimum
possible sum of edge weights.
Prim’s Algorithm
- Find the minimum spanning tree of a connected, undirected graph with weighted edges.
- This algorithm is used to design efficient network layouts.
Kruskal’s Algorithm
- Another approach to finding a minimum spanning tree in a graph.
- It focuses on selecting the smallest weighted edges while avoiding cycles.

5. Shortest Path: Find the shortest path from a source node to all other nodes in a weighted graph. It's
commonly used in routing and network protocols.
Dijkstra's Algorithm

6. Interval Scheduling: Given a set of intervals, find the maximum number of non-overlapping intervals that
can be selected.
1. Sample Greedy Algorithms

5. Shortest Path
Dijkstra’s Algorithm
• Optimization problem: find all shortest paths from the source.
• Construction of the solution: shortest paths are built vertex by vertex.
• Greedy choice: at each step, choose the closest reachable vertex.

Graph representation
1. Sample Greedy Algorithms

5. Shortest Path
dist[s] ← 0                              (distance to source vertex is zero)
for all v ∈ V − {s}
    do dist[v] ← ∞                       (set all other distances to infinity)
S ← ∅                                    (S, the set of visited vertices, is initially empty)
Q ← V                                    (Q, the queue, initially contains all vertices)
while Q ≠ ∅                              (while the queue is not empty)
    do u ← mindistance(Q, dist)          (select the element of Q with the minimum distance)
       S ← S ∪ {u}                       (add u to the set of visited vertices, remove it from Q)
       for all v ∈ neighbors[u]
           do if dist[v] > dist[u] + w(u, v)     (if a shorter path is found)
              then dist[v] ← dist[u] + w(u, v)   (update the shortest-path estimate)
return dist
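The pseudocode above can be sketched in Python. This is a minimal sketch, not from the slides: it assumes an adjacency-list graph format of my own choosing and realizes Q as a binary min-heap, so mindistance(Q, dist) becomes a heap pop.

```python
import heapq

def dijkstra(graph, s):
    """Shortest distances from s in a weighted graph given as
    {vertex: [(neighbor, weight), ...]} (an assumed format)."""
    dist = {v: float('inf') for v in graph}
    dist[s] = 0
    visited = set()          # S, the set of visited vertices
    pq = [(0, s)]            # Q as a min-heap keyed by current distance
    while pq:
        d, u = heapq.heappop(pq)   # select the closest unvisited vertex
        if u in visited:
            continue               # stale heap entry; skip
        visited.add(u)
        for v, w in graph[u]:
            if dist[v] > d + w:    # a shorter path to v was found
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {'a': [('b', 4), ('c', 1)], 'b': [], 'c': [('b', 2)]}
print(dijkstra(g, 'a'))  # {'a': 0, 'b': 3, 'c': 1}
```

With the heap, each edge relaxation costs O(log |V|), giving O((|V| + |E|) log |V|) overall.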
1. Sample Greedy Algorithms

5. Shortest Path
Dijkstra’s Algorithm generates the shortest path from
Node a to all other nodes in the graph.

Shortest path from a to b is 20.


Interval Scheduling (also known as Activity Selection or Job Scheduling)
1. Sample Greedy Algorithms

6. Interval Scheduling
• There are n meeting requests; meeting i occupies the time interval (si, ti).
• Two meetings cannot both be scheduled if their intervals overlap.
• Goal: schedule as many meetings as possible.

Example: Meetings (1,3), (2, 4), (4, 5), (4, 6), (6, 8)
Solution: 3 meetings ((1, 3), (4, 5), (6, 8))
1. Sample Greedy Algorithms

6. Interval Scheduling
• Consider a set of tasks each represented by an interval describing the time in which it needs to be
processed by some machine.
• For instance, task A might run from 2:00 to 5:00, task B might run from 4:00 to 10:00 and task C might
run from 9:00 to 11:00.
• A subset of intervals is compatible if no two intervals overlap on the machine/resource.
• For example, the subset {A,C} is compatible, as is the subset {B}; but neither {A,B} nor {B,C} is a compatible subset, because the corresponding intervals within each subset overlap.
• The interval scheduling maximization problem is to find a largest compatible set, i.e., a set of non-
overlapping intervals of maximum size.
Interval Scheduling:
• Job j starts at sj and finishes at fj.
• Two jobs are compatible if they don't overlap.
• Goal: find a maximum subset of mutually compatible jobs.

1. Sample Greedy Algorithms

6. Interval Scheduling

[Figure: jobs shown as intervals on a timeline from time 0 to 11]
1. Sample Greedy Algorithms

6. Interval Scheduling

• Greedy template.
• Consider jobs in some natural order.
• Take each job provided it's compatible with the ones already taken.

[Earliest start time] Consider jobs in ascending order of sj.

[Earliest finish time] Consider jobs in ascending order of fj.

[Shortest interval] Consider jobs in ascending order of fj - sj.

[Fewest conflicts] For each job j, count the number of conflicting jobs cj. Schedule in ascending order of cj.
1. Sample Greedy Algorithms

6. Interval Scheduling

Counter example for earliest start time

Counter example for shortest interval

Counter example for fewest conflicts


1. Sample Greedy Algorithms

Earliest-finish-time-first algorithm demo

Sort the jobs according to earliest finish time.

[Figure: jobs B, C, A, E, D, F, G, H shown as intervals on a timeline from time 0 to 11, in order of finish time]

• Job B is compatible (add to schedule).
• Job C is incompatible (do not add to schedule).
• Job A is incompatible (do not add to schedule).
• Job E is compatible (add to schedule).
• Job D is incompatible (do not add to schedule).
• Job F is incompatible (do not add to schedule).
• Job G is incompatible (do not add to schedule).
• Job H is compatible (add to schedule).
• Done: {B, E, H} is an optimal set of jobs.
1. Sample Greedy Algorithms

Consider jobs in increasing order of finish time. Take each job provided it is compatible with the ones already taken.

Sort jobs by finish times so that f1 ≤ f2 ≤ ... ≤ fn.

S ← ∅                       (set of jobs selected)
for j = 1 to n {
    if (job j compatible with S)
        S ← S ∪ {j}
}
return S

Implementation.
• Remember the job j* that was added last to S.
• Job j is compatible with S if sj ≥ fj*.
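The earliest-finish-time-first algorithm above can be sketched in Python. This is a minimal sketch; the function name and the representation of jobs as (start, finish) pairs are my own choices, and the compatibility test is exactly the sj ≥ fj* check from the slides.

```python
def interval_schedule(jobs):
    """Earliest-finish-time-first. jobs is a list of (start, finish) pairs."""
    selected = []
    last_finish = float('-inf')        # finish time fj* of the last job taken
    for s, f in sorted(jobs, key=lambda job: job[1]):  # sort by finish time
        if s >= last_finish:           # compatible with the jobs already taken
            selected.append((s, f))
            last_finish = f
    return selected

# The meetings example from earlier in the lecture:
meetings = [(1, 3), (2, 4), (4, 5), (4, 6), (6, 8)]
print(interval_schedule(meetings))  # [(1, 3), (4, 5), (6, 8)]
```

Sorting dominates, so the running time is O(n log n).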
2. Dynamic Programming

Basic Algorithm Design Techniques


The choice of algorithm depends on the specific characteristics of the problem at hand.

• Divide and Conquer: useful for problems that can be divided into smaller, independent subproblems.
• Greedy: suitable for problems where making the locally optimal choice at each stage leads to a globally optimal solution.
• Dynamic Programming: effective for problems with overlapping subproblems and optimal substructure, allowing the reuse of previously computed solutions.
2. Dynamic Programming

Finding the nth Fibonacci Number

Let us design a divide and conquer algorithm.

• We know Fn = Fn-1 + Fn-2 (recursive part)
• F0 = 0 and F1 = 1 (base case)

Split up the problem of finding the nth Fibonacci number into two sub-problems:

1. Find the (n-1)th Fibonacci number.
2. Find the (n-2)th Fibonacci number.

Keep solving the sub-problems recursively until we get to F0 or F1.
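The divide and conquer approach can be sketched directly from the recurrence. This sketch is mine, not from the slides; the `calls` counter is added only to expose the repeated work, and it answers the question posed next: F2 is computed 3 times when finding F5.

```python
calls = {}  # how many times each sub-problem is solved (for illustration only)

def fib(n):
    """Naive divide-and-conquer Fibonacci: re-solves the same
    sub-problems many times, giving exponential running time."""
    calls[n] = calls.get(n, 0) + 1
    if n <= 1:                     # base cases: F0 = 0, F1 = 1
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(5))     # 5
print(calls[2])   # 3 -- F2 was computed three times
```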


2. Dynamic Programming

Example: Finding the 5th Fibonacci Number

[Figure: recursion tree for F5]

How many times did we compute F2?


2. Dynamic Programming

Can we do better?
Dynamic programming helps avoid redundant computations and improves
the time complexity of the algorithm compared to a recursive approach.
2. Dynamic Programming

• We want to find F5.
• We know that F0 = 0 and F1 = 1.
• Keep these values in a table.
• Now compute F2 = F1 + F0 (the sum of the previous two Fibonacci numbers) and store the result in the table.
• Then compute F3 = F2 + F1 and store the result in the table.
  - Note that to compute F3 we needed the value of F2.
  - We did not re-compute the value of F2.
  - We retrieved the value of F2 from the table in which it was stored.
• Do the same thing for F4 and F5.

Table so far:  F0 = 0, F1 = 1, F2 = 1, F3 = 2
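The table-filling procedure above can be sketched as a bottom-up loop. This is a minimal sketch with my own function name; the table plays exactly the role described in the bullets.

```python
def fib_dp(n):
    """Bottom-up dynamic programming: fill a table from F0 upward,
    so each Fibonacci number is computed exactly once (O(n) time)."""
    table = [0, 1]                                  # F0 = 0, F1 = 1
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])   # reuse stored values
    return table[n]

print(fib_dp(5))  # 5
```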
2. Dynamic Programming

Dynamic programming is a technique for solving problems with the following properties:
• An instance is solved using the solutions for smaller instances.
• The solution for a smaller instance might be needed multiple times.
• The solutions to smaller instances are stored in a table, so that each smaller instance is solved
only once.
• Additional space is used to save time.
2. Dynamic Programming

Divide & Conquer vs. Dynamic Programming

• Both Divide and Conquer and Dynamic Programming split a problem into sub-problems.
• Divide and Conquer is suitable when the sub-problems are independent of each other (i.e. the solution of one sub-problem does not depend on the solution of another), e.g. sorting the first half of an array is independent of sorting the second half.
• Dynamic Programming is suitable when the sub-problems depend on each other, e.g. the 3rd Fibonacci number is needed to find the 4th Fibonacci number.
• Divide and Conquer is top down.
• Dynamic Programming is bottom up.
2. Dynamic Programming

Example: Rod Cutting Problem


Given:
• A rod of a certain length, n.
• A list of prices for rods of sizes between 1 and n.
• What is the best way to cut the rod so that we make the most money by selling the pieces?

Optimization problem:
• There are many ways to cut the rod; we want the best one.

Length i:  1    2    3    4
Price pi:  1TL  5TL  8TL  9TL

2. Dynamic Programming

Illustration

If we have a rod of length 4, there are eight different ways to cut it.

• The best strategy is cutting it into two pieces of length 2, which gives us 10 TL.
2. Dynamic Programming

A Recursive Solution
Let ri be the maximum amount of money we can get with a rod of size i.

• We can view the problem recursively as follows:
  - First, cut a piece off the left end of the rod, and sell it.
  - Then, find the optimal way to cut the remainder of the rod.

• We do not know how large a piece we should cut off, so we try all possible cases.
  - First we try cutting a piece of length 1, and combining it with the optimal way to cut a rod of length n − 1.
  - Then we try cutting a piece of length 2, and combining it with the optimal way to cut a rod of length n − 2.

• In this way, we try all the possible lengths and then pick the best one. We end up with

      rn = max{1 ≤ i ≤ n} (pi + rn−i)

  By allowing i to be n, we handle the case where the rod is not cut at all.
2. Dynamic Programming

Optimal Substructure

• Optimal substructure means that the solution to the given optimization problem can be obtained by combining optimal solutions to its subproblems.

• A problem is said to have optimal substructure if an optimal solution can be constructed efficiently from optimal solutions of its subproblems.

• The maximum value

      rn = max{1 ≤ i ≤ n} (pi + rn−i)

  is obtained from the maximum values rn−i of the smaller instances.

• Dynamic programming problems exhibit the optimal substructure property.
2. Dynamic Programming

A Top-down Approach
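The slide's code is not reproduced in this text, so here is a sketch of what a top-down (memoized) solution to the recurrence rn = max{1 ≤ i ≤ n} (pi + rn−i) could look like. The function name and the convention that prices[i] is the price of a piece of length i (with prices[0] unused) are my own assumptions.

```python
def cut_rod_topdown(prices, n, memo=None):
    """Top-down rod cutting with memoization: each sub-problem rn
    is solved once and stored, then reused on later calls."""
    if memo is None:
        memo = {}
    if n == 0:
        return 0                      # nothing left to cut
    if n in memo:
        return memo[n]                # reuse a previously computed solution
    best = max(prices[i] + cut_rod_topdown(prices, n - i, memo)
               for i in range(1, n + 1))
    memo[n] = best
    return best

prices = [0, 1, 5, 8, 9]              # lengths 1..4 priced 1, 5, 8, 9 TL
print(cut_rod_topdown(prices, 4))     # 10 (two pieces of length 2)
```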
2. Dynamic Programming

A Bottom-up Approach

Complexity: O(n²)
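The bottom-up version can be sketched as follows (my own naming, same price convention as above). It fills the table r[1..n] in order, and the two nested loops give the O(n²) complexity stated on the slide.

```python
def cut_rod_bottomup(prices, n):
    """Bottom-up rod cutting: r[j] = max over first-cut lengths i of
    prices[i] + r[j - i]. Two nested loops: O(n^2) time, O(n) space."""
    r = [0] * (n + 1)                 # r[0] = 0: an empty rod earns nothing
    for j in range(1, n + 1):         # solve instances in increasing size
        r[j] = max(prices[i] + r[j - i] for i in range(1, j + 1))
    return r[n]

prices = [0, 1, 5, 8, 9]
print(cut_rod_bottomup(prices, 4))    # 10
```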
Algorithms and Analysis of Algorithms
