Module 3 Dynamic Programming HMP
• Principle of optimality
• 0/1 Knapsack
• Longest Common Subsequence (LCS)
• Travelling Salesperson Problem
Comparison with Greedy Approach
Greedy and Dynamic Programming are methods for solving
optimization problems.
However, often you need to use dynamic programming since
the optimal solution cannot be guaranteed by a greedy
algorithm.
Dynamic Programming provides efficient solutions for some
problems for which a brute force approach would be very
slow.
To use Dynamic Programming we need only show that the
principle of optimality applies to the problem.
Instances where the greedy approach fails
There are many problems for which greedy algorithms fail.
e.g. Coin changing: with coin denominations 1, 6, and 8, greedy takes the
largest coin first and produces 12 = 8+1+1+1+1 (5 coins), whereas the
optimal answer is 12 = 6+6 (2 coins).
e.g. Travelling salesman problem
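The coin-changing example can be verified directly. A minimal sketch in Python (function names are my own), comparing a greedy strategy with a DP table for coins {1, 6, 8}:

```python
def greedy_coins(coins, amount):
    """Greedy: repeatedly take the largest coin that still fits."""
    count = 0
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            count += 1
    return count

def dp_coins(coins, amount):
    """DP: dp[a] = fewest coins summing to a (assumes a 1-coin exists)."""
    INF = float("inf")
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        dp[a] = min((dp[a - c] + 1 for c in coins if c <= a), default=INF)
    return dp[amount]

# Greedy takes 8 first: 12 = 8+1+1+1+1 (5 coins);
# DP finds 12 = 6+6 (2 coins).
```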
Elements of Dynamic Programming …
Principle of optimality
In an optimal sequence of decisions or choices, each subsequence must
also be optimal.
Memoization (for overlapping sub-problems)
avoid calculating the same thing twice,
usually by keeping a table of known results that fills up as sub-instances
are solved.
Example: Fibonacci numbers
Fibonacci numbers: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, …
Computing the nth fibonacci number using bottom-up
iteration:
• f(0) = 0
• f(1) = 1
• f(2) = 0+1 = 1
• f(3) = 1+1 = 2
• f(4) = 1+2 = 3
• f(5) = 2+3 = 5
• …
• f(n-2) = f(n-3) + f(n-4)
• f(n-1) = f(n-2) + f(n-3)
• f(n) = f(n-1) + f(n-2)
Recursive calls for fib
[Diagram: recursion tree for fib(5); sub-trees such as fib(3) and fib(2) appear repeatedly, so the naive recursion recomputes the same values many times.]
fib Using Dynamic Programming
[Diagram: the same tree for fib(5) with memoization; each subproblem fib(0)…fib(5) is computed once and later calls reuse the stored result.]
fib Using Dynamic Programming
without recursion (iterative)
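The iterative computation sketched above needs only the last two values at any point. A minimal Python version (the function name is mine):

```python
def fib(n):
    """Bottom-up Fibonacci: f(0)=0, f(1)=1, f(n)=f(n-1)+f(n-2)."""
    a, b = 0, 1           # a = f(0), b = f(1)
    for _ in range(n):
        a, b = b, a + b   # slide the window forward one step
    return a

# Each value is computed once, so this runs in O(n) time and O(1) space,
# versus the exponential number of calls made by the naive recursion.
```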
Knapsack 0-1 Problem
The difference
between this problem
and the fractional
knapsack one is that
you CANNOT take a
fraction of an item.
The best set of items from {I0, I1, I2} is {I0, I1, I2}
BUT the best set of items from {I0, I1, I2, I3} is {I0, I2, I3}.
In this example, note that this optimal solution, {I0, I2, I3},
does NOT build upon the previous optimal solution, {I0, I1,
I2}.
(Instead it builds upon the solution {I0, I2}, which is really the optimal
subset of {I0, I1, I2} with weight 12 or less.)
Knapsack 0-1 problem
So now we must re-work the way we build upon previous sub-
problems…
Let B[k, w] represent the maximum total value of a subset of the first k
items (Sk) with total weight at most w.
Our goal is to find B[n, W], where n is the total number of items and
W is the maximal weight the knapsack can carry.
This means that the best subset of Sk that has total weight at most w is either:
1) the best subset of Sk-1 that has total weight at most w, or
2) the best subset of Sk-1 that has total weight at most w-wk, plus item k.
The 2D array/matrix B[i,w]
i/w 0 1 2 3 4 5
0 0 0 0 0 0 0
1 0
2 0
3 0
4 0
Knapsack 0-1 Problem Recursive Formula
First case: wk > w
Item k cannot fit, so B[k, w] = B[k-1, w].
Second case: wk ≤ w
Item k can be in the solution, and we choose the case with the greater
value: B[k, w] = max(B[k-1, w], vk + B[k-1, w-wk]).
Knapsack 0-1 Algorithm
for w = 0 to W {              // Initialize row 0 to 0's
    B[0,w] = 0
}
for i = 1 to n {              // Initialize column 0 to 0's
    B[i,0] = 0
}
for i = 1 to n {
    for w = 0 to W {
        if wi <= w {          // item i can fit
            if vi + B[i-1,w-wi] > B[i-1,w]
                B[i,w] = vi + B[i-1,w-wi]   // take item i
            else
                B[i,w] = B[i-1,w]           // leave item i out
        }
        else B[i,w] = B[i-1,w]              // wi > w: item i cannot fit
    }
}
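A Python transcription of the pseudocode above (names are my own; `items` holds (weight, value) pairs, 0-indexed internally):

```python
def knapsack(items, W):
    """items: list of (weight, value) pairs; W: knapsack capacity.
    Returns the DP table B, where B[i][w] is the best value achievable
    using the first i items with capacity w."""
    n = len(items)
    B = [[0] * (W + 1) for _ in range(n + 1)]  # row 0 and column 0 start at 0
    for i in range(1, n + 1):
        wi, vi = items[i - 1]
        for w in range(W + 1):
            if wi <= w and vi + B[i - 1][w - wi] > B[i - 1][w]:
                B[i][w] = vi + B[i - 1][w - wi]   # item i is in the solution
            else:
                B[i][w] = B[i - 1][w]             # item i is left out
    return B

# For the example data that follows -- items (2,3), (3,4), (4,5), (5,6)
# with W = 5 -- B[4][5] is 7.
```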
Knapsack 0-1 Problem
Let’s run our algorithm on the following data:
n = 4 (# of elements)
W = 5 (max weight)
Elements (weight, value):
(2,3), (3,4), (4,5), (5,6)
Knapsack 0-1 Example
i/w 0 1 2 3 4 5
0 0 0 0 0 0 0
1 0
2 0
3 0
4 0
for i = 1 to n
B[i,0] = 0
Knapsack 0-1 Example
Items: 1: (2,3)  2: (3,4)  3: (4,5)  4: (5,6)

i/w  0  1  2  3  4  5
0    0  0  0  0  0  0
1    0  0
2    0
3    0
4    0

i=1, vi=3, wi=2: for w=1, w-wi = -1, so item 1 cannot fit and B[1,1] = B[0,1] = 0.
After filling in the rest of the table the same way, we're DONE!
The maximum possible value that can be carried in this knapsack is 7.
Knapsack 0-1 Algorithm
This algorithm only finds the max possible value that
can be carried in the knapsack
The value in B[n,W]. To find which items achieve it, trace back from B[n,W]:
i = n, k = W
while i > 0 and k > 0
    if B[i, k] ≠ B[i-1, k] then
        mark the ith item as in the knapsack
        i = i-1, k = k-wi
    else
        i = i-1        // the ith item is not in the knapsack
Knapsack 0-1 Algorithm: Finding the Items
Items: 1: (2,3)  2: (3,4)  3: (4,5)  4: (5,6)    Knapsack: (empty)

i/w  0  1  2  3  4  5
0    0  0  0  0  0  0
1    0  0  3  3  3  3
2    0  0  3  4  4  7
3    0  0  3  4  5  7
4    0  0  3  4  5  7

i=4, k=5, vi=6, wi=5
B[i,k] = 7 = B[i-1,k], so item 4 is not in the knapsack; i = 3.
i=n,k=W
while i, k > 0
if B[i, k] ≠ B[i-1, k] then
mark the ith item as in the knapsack
i = i-1, k = k-wi
else
i = i-1
Knapsack 0-1 Algorithm: Finding the Items
Items: 1: (2,3)  2: (3,4)  3: (4,5)  4: (5,6)    Knapsack: (empty)

i/w  0  1  2  3  4  5
0    0  0  0  0  0  0
1    0  0  3  3  3  3
2    0  0  3  4  4  7
3    0  0  3  4  5  7
4    0  0  3  4  5  7

i=3, k=5, vi=5, wi=4
B[i,k] = 7 = B[i-1,k], so item 3 is not in the knapsack; i = 2.
Knapsack 0-1 Algorithm: Finding the Items
Items: 1: (2,3)  2: (3,4)  3: (4,5)  4: (5,6)    Knapsack: Item 2

i/w  0  1  2  3  4  5
0    0  0  0  0  0  0
1    0  0  3  3  3  3
2    0  0  3  4  4  7
3    0  0  3  4  5  7
4    0  0  3  4  5  7

i=2, k=5, vi=4, wi=3
B[i,k] = 7 ≠ B[i-1,k] = 3, so item 2 IS in the knapsack; i = 1, k = k-wi = 2.
Knapsack 0-1 Algorithm: Finding the Items
Items: 1: (2,3)  2: (3,4)  3: (4,5)  4: (5,6)    Knapsack: Item 2, Item 1

i/w  0  1  2  3  4  5
0    0  0  0  0  0  0
1    0  0  3  3  3  3
2    0  0  3  4  4  7
3    0  0  3  4  5  7
4    0  0  3  4  5  7

i=1, k=2, vi=3, wi=2
B[i,k] = 3 ≠ B[i-1,k] = 0, so item 1 IS in the knapsack; i = 0, k = k-wi = 0.
k = 0, so we're DONE! The knapsack contains items 1 and 2, with total
weight 5 and total value 3 + 4 = 7.
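The traceback just carried out by hand can be sketched in Python. This is an illustrative helper of my own, run here against the table from the slides:

```python
def find_items(B, items):
    """Walk back from B[n][W]: if the value changed when item i was
    considered, item i (1-indexed) must be in the optimal knapsack."""
    i, k = len(items), len(B[0]) - 1
    chosen = []
    while i > 0 and k > 0:
        if B[i][k] != B[i - 1][k]:
            chosen.append(i)       # item i is in the knapsack
            k -= items[i - 1][0]   # shed its weight
        i -= 1
    return chosen

# The filled table from the slides, for items (2,3), (3,4), (4,5), (5,6), W = 5:
B = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 3, 3, 3, 3],
    [0, 0, 3, 4, 4, 7],
    [0, 0, 3, 4, 5, 7],
    [0, 0, 3, 4, 5, 7],
]
items = [(2, 3), (3, 4), (4, 5), (5, 6)]
# find_items(B, items) -> [2, 1]: items 2 and 1, matching the hand trace.
```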
Running time:
for i = 1 to n          // repeats n times
    for w = 0 to W      // O(W) iterations, each doing O(1) work
So the algorithm runs in O(nW) time. (Note that this is pseudo-polynomial:
it depends on the magnitude of W, not only on the input size.)
Brute-force search or exhaustive search, also known as generate and test, is a very general problem solving
technique and algorithmic paradigm that consists of systematically enumerating all possible candidates for the
solution and checking whether each candidate satisfies the problem's statement.
Longest Common Subsequence (LCS)
(Ref. Parag Dave, page 285)
A subsequence is a sequence that appears in the same relative order, but is not
necessarily contiguous. For example, "abc", "abg", "bdf", "aeg", "acefg", etc. are
subsequences of "abcdefg".
Problem:
Given two sequences X = <x1,x2,…xm> and Y = <y1,y2,…yn> , Find the longest sub-
sequence Z = <z1,….zk> that is common to X and Y.
For example:
If X = < A,B,C,B,D,A,B> and Y = <B,D,C,A,B,A>
then some common sub-sequences are:
{A} {B} {C} {D} {A,A} {B,B} {B,C,A} {B,C,B,A} {B,D,A,B}
Longest Common Subsequence (LCS)
Algorithm LCS_Length(x, y)
    m = length(x)
    n = length(y)
    for i = 0 to m do
        c[i,0] = 0
    for j = 0 to n do
        c[0,j] = 0
    for i = 1 to m do
        for j = 1 to n do
            if x[i] = y[j] then
                c[i,j] = c[i-1,j-1] + 1
            else if c[i-1,j] >= c[i,j-1] then
                c[i,j] = c[i-1,j]
            else
                c[i,j] = c[i,j-1]
    return c        // c[m,n] holds the length of the LCS
Longest Common Subsequence (LCS)
Example:
Given two strings X = BACDB and Y = BDCB,
find the longest common subsequence. (The LCS has length 3; one answer is BCB.)
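A Python sketch of LCS_Length for this example; the traceback that reconstructs the subsequence is my own addition (the slides' algorithm returns only the table):

```python
def lcs(x, y):
    """Returns (length, one longest common subsequence) of x and y."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]  # c[i][j] = LCS length of x[:i], y[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    # Walk back through the table to recover one LCS.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1]); i -= 1; j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return c[m][n], "".join(reversed(out))

# lcs("BACDB", "BDCB") -> (3, "BCB")
```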
Travelling Salesman Problem
Problem: Solve the traveling salesman problem with the associated cost
adjacency matrix using dynamic programming.
      1   2   3   4   5
1     -  24  11  10   9
2     8   -   2   5  11
3    26  12   -   8   7
4    11  23  24   -   6
5     5   4   8  11   -
Solve the given TSP using the dynamic-programming (Held–Karp) algorithm,
which has time complexity O(n² · 2ⁿ).