CSC373 Week 3: Dynamic Programming Nisarg Shah
➢ Minimizing lateness
➢ Huffman encoding
➢…
E.g., 𝑝[8] = 1, 𝑝[7] = 3, 𝑝[2] = 0
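As a sketch, 𝑝[𝑗] can be computed by scanning backward from 𝑗 for the last compatible job. The intervals below are hypothetical (the slide's actual instance is not shown), chosen so that the resulting 𝑝-values match the example above:

```python
# Hypothetical jobs (start, finish), sorted by finish time; index 0 is unused padding.
jobs = [None, (0, 1), (0, 3), (2, 4), (1, 5), (3, 6), (2, 7), (4, 8), (2, 9)]

def compute_p(jobs):
    """p[j] = largest i < j with finish(i) <= start(j); 0 if no such job exists."""
    n = len(jobs) - 1
    p = [0] * (n + 1)
    for j in range(1, n + 1):
        for i in range(j - 1, 0, -1):
            if jobs[i][1] <= jobs[j][0]:  # job i ends before job j starts
                p[j] = i
                break
    return p

p = compute_p(jobs)  # p[8] = 1, p[7] = 3, p[2] = 0
```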
➢ Bellman equation:
$$OPT(j) = \begin{cases} 0 & \text{if } j = 0 \\ \max\{OPT(j-1),\; v_j + OPT(p(j))\} & \text{if } j > 0 \end{cases}$$
• Memoization trick
➢ Simply remember what you’ve already computed, and reuse the answer whenever it is needed again
$$OPT(j) = \begin{cases} 0 & \text{if } j = 0 \\ \max\{OPT(j-1),\; v_j + OPT(p(j))\} & \text{if } j > 0 \end{cases}$$
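A minimal memoized sketch of this recurrence; the instance below (job values and compatibility pointers) is made up for illustration:

```python
from functools import lru_cache

# Hypothetical instance: jobs[j] = (start, finish, value), sorted by finish time;
# index 0 is unused. p[j] = last job compatible with j, computed from the intervals.
jobs = [None, (0, 2, 3), (1, 4, 5), (3, 6, 4), (5, 8, 6)]
p = [0, 0, 0, 1, 2]

@lru_cache(maxsize=None)
def OPT(j):
    """Bellman equation: either skip job j, or take it (v_j plus OPT(p[j]))."""
    if j == 0:
        return 0
    return max(OPT(j - 1), jobs[j][2] + OPT(p[j]))

OPT(4)  # -> 11 (take jobs 2 and 4, values 5 + 6)
```

The `lru_cache` decorator is exactly the memoization trick above: each `OPT(j)` is computed once and reused.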
$$S(j) = \begin{cases} \emptyset & \text{if } j = 0 \\ S(j-1) & \text{if } j > 0 \wedge OPT(j-1) \ge v_j + OPT(p(j)) \\ \{j\} \cup S(p(j)) & \text{if } j > 0 \wedge OPT(j-1) < v_j + OPT(p(j)) \end{cases}$$
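Both recurrences can be sketched together: fill the 𝑂𝑃𝑇 table bottom-up, then trace back to recover the optimal set. The small instance below is hypothetical:

```python
# Hypothetical instance: v[j] = value of job j, p[j] = last compatible job (index 0 unused).
v = [0, 3, 5, 4, 6]
p = [0, 0, 0, 1, 2]
n = len(v) - 1

# Bottom-up table for OPT.
OPT = [0] * (n + 1)
for j in range(1, n + 1):
    OPT[j] = max(OPT[j - 1], v[j] + OPT[p[j]])

def S(j):
    """Trace back through the table to recover an optimal set of jobs."""
    if j == 0:
        return set()
    if OPT[j - 1] >= v[j] + OPT[p[j]]:
        return S(j - 1)           # job j is not in this optimal solution
    return {j} | S(p[j])          # job j is in this optimal solution

S(n)  # -> {2, 4}, total value OPT[n] = 11
```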
$$OPT(i, v) = \begin{cases} 0 & \text{if } v \le 0 \\ \infty & \text{if } v > 0,\; i = 0 \\ \min\{w_i + OPT(i-1, v - v_i),\; OPT(i-1, v)\} & \text{if } v > 0,\; i > 0 \end{cases}$$
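A memoized sketch, reading 𝑂𝑃𝑇(𝑖, 𝑣) as the minimum total weight of a subset of the first 𝑖 items collecting value at least 𝑣 (the items below are made up):

```python
import math
from functools import lru_cache

# Hypothetical items: (value v_i, weight w_i), 1-indexed.
items = [None, (3, 2), (4, 3), (5, 4)]

@lru_cache(maxsize=None)
def OPT(i, v):
    """Minimum weight of a subset of items 1..i with total value >= v."""
    if v <= 0:
        return 0
    if i == 0:
        return math.inf       # value still needed but no items left
    vi, wi = items[i]
    # Either take item i (pay w_i, still need value v - v_i) or skip it.
    return min(wi + OPT(i - 1, v - vi), OPT(i - 1, v))

OPT(3, 7)  # -> 5 (items 1 and 2: values 3 + 4 >= 7, weight 2 + 3)
```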
$$OPT(t, i) = \begin{cases} 0 & \text{if } t = s \\ \infty & \text{if } i = 0 \wedge t \ne s \\ \min\left\{ OPT(t, i-1),\; \min_u \left( OPT(u, i-1) + \ell_{ut} \right) \right\} & \text{otherwise} \end{cases}$$
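This recurrence (Bellman-Ford) can be run bottom-up over 𝑖 = 1, …, 𝑛 − 1; a sketch on a small made-up graph:

```python
import math

# Hypothetical directed graph: length[u][t] = l_ut for each edge (u, t).
length = {
    's': {'a': 4, 'b': 2},
    'a': {'c': 3},
    'b': {'a': 1, 'c': 5},
    'c': {},
}
nodes = ['s', 'a', 'b', 'c']
s = 's'
n = len(nodes)

# After iteration i, OPT[t] = shortest s->t path length using at most i edges.
OPT = {t: (0 if t == s else math.inf) for t in nodes}
for i in range(1, n):                    # n-1 iterations suffice without negative cycles
    new = dict(OPT)                      # the OPT(t, i-1) term of the min
    for u in nodes:
        for t, l_ut in length[u].items():
            new[t] = min(new[t], OPT[u] + l_ut)
    OPT = new

OPT  # -> {'s': 0, 'a': 3, 'b': 2, 'c': 6}
```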
• Simple idea:
➢ Run single-source shortest paths from each source 𝑠
➢ Running time is 𝑂(𝑛⁴)
➢ Actually, we can do this in 𝑂(𝑛³) as well
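One standard DP achieving the 𝑂(𝑛³) all-pairs bound is Floyd-Warshall; a sketch on a hypothetical graph given as an adjacency matrix:

```python
import math

INF = math.inf
# Hypothetical edge lengths; INF = no edge, 0 on the diagonal.
d = [
    [0,   4,   2,   INF],
    [INF, 0,   INF, 3],
    [INF, 1,   0,   5],
    [INF, INF, INF, 0],
]
n = len(d)

# Floyd-Warshall: after round k, d[i][j] = shortest i->j path length
# using intermediate vertices only from {0, ..., k}.
for k in range(n):
    for i in range(n):
        for j in range(n):
            d[i][j] = min(d[i][j], d[i][k] + d[k][j])

# d[0][3] -> 6 (path 0 -> 2 -> 1 -> 3), d[0][1] -> 3 (path 0 -> 2 -> 1)
```

Three nested loops over 𝑛 vertices give the 𝑂(𝑛³) running time directly.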
• Example
➢ 𝑀1 is 5 X 10, 𝑀2 is 10 X 100, and 𝑀3 is 100 X 50
➢ (𝑀1 ⋅ 𝑀2) ⋅ 𝑀3 requires 5 ⋅ 10 ⋅ 100 + 5 ⋅ 100 ⋅ 50 = 30000 ops
➢ 𝑀1 ⋅ (𝑀2 ⋅ 𝑀3) requires 10 ⋅ 100 ⋅ 50 + 5 ⋅ 10 ⋅ 50 = 52500 ops
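The two costs follow from the fact that multiplying a 𝑝 × 𝑞 matrix by a 𝑞 × 𝑟 matrix takes 𝑝𝑞𝑟 scalar multiplications, and can be checked directly:

```python
# M1 is 5x10, M2 is 10x100, M3 is 100x50.
# Multiplying a p x q matrix by a q x r matrix costs p*q*r scalar multiplications.
cost_left  = 5 * 10 * 100 + 5 * 100 * 50   # (M1 M2) M3: 5000 + 25000 = 30000
cost_right = 10 * 100 * 50 + 5 * 10 * 50   # M1 (M2 M3): 50000 + 2500 = 52500
```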
$$OPT(i, j) = \begin{cases} 0 & \text{if } i = j \\ \min\{OPT(i, k) + OPT(k+1, j) + d_{i-1} d_k d_j : i \le k < j\} & \text{if } i < j \end{cases}$$
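A memoized sketch of this recurrence on the example's dimensions, where 𝑀ᵢ is 𝑑ᵢ₋₁ × 𝑑ᵢ:

```python
from functools import lru_cache

# Dimensions d[0..n]: matrix M_i is d[i-1] x d[i]. Same instance as the example.
d = [5, 10, 100, 50]
n = len(d) - 1

@lru_cache(maxsize=None)
def OPT(i, j):
    """Minimum scalar multiplications to compute the product M_i ... M_j."""
    if i == j:
        return 0
    # Try every split point k: (M_i..M_k) times (M_{k+1}..M_j).
    return min(OPT(i, k) + OPT(k + 1, j) + d[i - 1] * d[k] * d[j]
               for k in range(i, j))

OPT(1, n)  # -> 30000, matching (M1 M2) M3
```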
• Can we do better?
➢ Surprisingly, yes. But not by a DP algorithm (that I know of)
➢ Hu & Shing (1981) developed an 𝑂(𝑛 log 𝑛)-time algorithm by reducing the chain matrix product problem to the problem of “optimally” triangulating a regular polygon
Source: Wikipedia
Example
• 𝐴 is 10 × 30, 𝐵 is 30 × 5, 𝐶 is 5 × 60
• The cost of each triangle is the product of the weights of its vertices
• Want to minimize total cost of all
triangles
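Under this correspondence, the minimum-cost triangulation can be computed with the same interval DP as the matrix chain; a sketch using the example's vertex weights:

```python
from functools import lru_cache

# Polygon vertex weights for A (10x30), B (30x5), C (5x60):
# the weights are exactly the matrix dimensions [10, 30, 5, 60].
w = [10, 30, 5, 60]

@lru_cache(maxsize=None)
def tri(i, j):
    """Minimum total triangle cost for the sub-polygon on vertices i..j."""
    if j - i < 2:
        return 0   # fewer than 3 vertices: nothing to triangulate
    # Choose the triangle (i, k, j) containing edge (i, j), recurse on both sides.
    return min(tri(i, k) + tri(k, j) + w[i] * w[k] * w[j]
               for k in range(i + 1, j))

tri(0, len(w) - 1)  # -> 4500, the same as the optimal chain product (A B) C
```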