The document discusses dynamic programming and its key concepts. It covers:
- Optimal substructure and how optimal solutions can be broken down into optimal subproblems.
- Overlapping subproblems, where the same subproblems are solved repeatedly without generating new ones.
- How dynamic programming solves problems bottom-up by storing subproblem results so they are never recomputed, and how memoization achieves the same saving top-down by recording results and looking them up on later calls.
- Examples like matrix chain multiplication are used to illustrate optimal substructure and how dynamic programming solves the problem in O(n^3) time by avoiding redundant work through memoization.
Lecture Plan
• When should we use dynamic programming?
  - Optimal substructure
  - Overlapping subproblems
• Memoization
Optimal Substructure
• Optimal substructure: an optimal solution contains within it optimal solutions to subproblems.
  - Ex: In matrix-chain multiplication, optimally multiplying A1 A2 ... An requires the subchains A1...k and Ak+1...n to be multiplied optimally.
  - Ex: In assembly-line scheduling, the fastest way through station S1,j contains the fastest way through S1,j-1 or S2,j-1 (see the sketch below).
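A minimal bottom-up sketch of that assembly-line recurrence; the parameter names a, t, e, x (station, transfer, entry, and exit times) and the 0-indexed lines are assumptions made here for illustration, not the slide's notation:

```python
def fastest_way(a, t, e, x):
    """Assembly-line scheduling, bottom-up.

    a[i][j]: processing time at station j on line i (i in {0, 1})
    t[i][j]: time to transfer away from line i after station j
    e[i], x[i]: entry and exit times for line i
    Returns the fastest total time through the factory.
    """
    n = len(a[0])
    # f[i]: fastest time to finish station j on line i (currently j = 0)
    f = [e[0] + a[0][0], e[1] + a[1][0]]
    for j in range(1, n):
        # stay on the same line, or switch from the other line after station j-1
        f0 = min(f[0] + a[0][j], f[1] + t[1][j - 1] + a[0][j])
        f1 = min(f[1] + a[1][j], f[0] + t[0][j - 1] + a[1][j])
        f = [f0, f1]
    return min(f[0] + x[0], f[1] + x[1])
```

For example, with a = [[7, 9, 3, 4, 8, 4], [8, 5, 6, 4, 5, 7]], t = [[2, 3, 1, 3, 4], [2, 1, 2, 2, 1]], e = [2, 4], x = [3, 2], this returns 38.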
Optimal Substructure (contd.)
• It is often easy to show the optimal-substructure property as follows:
  - Split the problem into subproblems.
  - The subproblem solutions must themselves be optimal; otherwise the supposedly optimal splitting would not in fact have been optimal.
Characterize the Subproblem Space
• Try to keep the space as simple as possible and expand it only as necessary.
  - In matrix-chain multiplication, the subproblem space A1 A2 … Aj (prefixes only) will not work; instead, Ai Ai+1 … Aj, which varies at both ends, works.
  - In assembly-line scheduling, the subproblem is S1,j (and symmetrically S2,j).
Optimal Substructure Variants
• Optimal substructure varies across problem domains in two ways:
  - How many subproblems does an optimal solution use?
    • Matrix-chain multiplication: two subproblems
    • Assembly-line scheduling: one subproblem
  - How many choices are there for which subproblems to use in an optimal solution?
    • Matrix-chain multiplication: j - i choices (one per split point k)
    • Assembly-line scheduling: two choices
• DP often solves the subproblems in a bottom-up manner.
Complexity Analysis of DP
• Running time = (#subproblems overall) × (#choices per subproblem)
  - Matrix-chain multiplication: O(n^2) × O(n) = O(n^3)
  - Assembly-line scheduling: O(n) × O(1) = O(n)
• Cost of evaluating one choice (given the subproblem solutions):
  - Matrix-chain multiplication: the split at k costs pi-1 · pk · pj
  - Assembly-line scheduling:
    • ai,j if we stay on the same line
    • ti',j-1 + ai,j (i' ≠ i) if we switch lines
  (a bottom-up sketch of the matrix-chain case follows)
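A minimal bottom-up sketch of this accounting for the matrix-chain case, assuming a dimension list p of length n+1 so that matrix Ai is p[i-1] × p[i]; the table has O(n^2) entries and each entry considers O(n) split points:

```python
import math

def matrix_chain_order(p):
    """Bottom-up DP for matrix-chain multiplication.

    p: dimensions, matrix A_i is p[i-1] x p[i]; n = len(p) - 1 matrices.
    Returns m, where m[i][j] is the minimum number of scalar
    multiplications needed to compute A_i ... A_j (1-indexed).
    """
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):          # chain length
        for i in range(1, n - length + 2):  # O(n^2) subproblems (i, j)
            j = i + length - 1
            m[i][j] = math.inf
            for k in range(i, j):           # O(n) choices of split point
                cost = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if cost < m[i][j]:
                    m[i][j] = cost
    return m
```

For example, with p = [10, 100, 5, 50], m[1][3] = 7500, corresponding to the parenthesization (A1 A2) A3.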
Subtleties When Determining Optimal Substructure
• Optimal substructure may not hold even though it seems to apply at first sight.
  - Unweighted shortest path: find a path from u to v containing the fewest edges. This can be proved to have optimal substructure.
  - Unweighted longest simple path: find a simple path from u to v containing the most edges. This does not satisfy optimal substructure: the two pieces of a longest simple path may share vertices, so they cannot be solved independently.
• A problem has optimal substructure only if its subproblems are independent (they share no resources).
Overlapping Subproblems
• The space of subproblems must be small: a recursive solution re-solves the same subproblems many times rather than generating new ones.
  - In contrast, a divide-and-conquer approach generates new subproblems at each step.
• Usually there are polynomially many subproblems, and we revisit the same ones over and over again ⇒ overlapping subproblems.
Recursive-Matrix-Chain
• A recursive algorithm for matrix-chain multiplication, shown in the sketch below.
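A minimal sketch of such a recursive formulation, using the same conventions for p as above; note that it re-solves the same (i, j) subproblems on many branches:

```python
import math

def recursive_matrix_chain(p, i, j):
    """Minimum scalar-multiplication cost of A_i ... A_j, computed by
    plain recursion (no table), re-solving overlapping subproblems."""
    if i == j:
        return 0
    best = math.inf
    for k in range(i, j):  # try every split point
        cost = (recursive_matrix_chain(p, i, k)
                + recursive_matrix_chain(p, k + 1, j)
                + p[i - 1] * p[k] * p[j])
        best = min(best, cost)
    return best
```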
Recursion Tree of Recursive-Matrix-Chain
• [Figure: recursion tree, in which the same (i, j) subproblems recur on many branches.]
Running Time: Recursive-Matrix-Chain
• Let T(n) be the time taken to compute a matrix chain of size n. The recursion gives
  T(1) ≥ 1,  T(n) ≥ 1 + Σ_{k=1..n-1} (T(k) + T(n-k) + 1) for n > 1.
• Each term T(i) appears twice, once as T(k) and once as T(n-k), so
  T(n) ≥ 2 Σ_{i=1..n-1} T(i) + n.
• We can show by induction that T(n) ≥ 2^(n-1), so the running time is Ω(2^n) (a worked step is shown below).
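A short worked version of that induction step, assuming the hypothesis T(i) ≥ 2^(i-1) for all i < n (with base case T(1) ≥ 1 = 2^0):

```latex
\begin{aligned}
T(n) &\ge 2\sum_{i=1}^{n-1} T(i) + n
      \ge 2\sum_{i=1}^{n-1} 2^{\,i-1} + n \\
     &= 2\left(2^{\,n-1}-1\right) + n
      = 2^{\,n} + n - 2 \;\ge\; 2^{\,n-1}.
\end{aligned}
```

Hence Recursive-Matrix-Chain takes Ω(2^n) time.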
DP vs. Divide-and-Conquer

|                          | Dynamic Programming | Divide-and-Conquer |
|--------------------------|---------------------|--------------------|
| Optimal substructure     | Yes                 | Yes                |
| Overlapping subproblems  | Yes                 | No                 |
| Top-down or bottom-up    | Bottom-up           | Top-down           |
Memoization
• Partial results are recorded (a "memo") and reused later without having to recompute them.
• What if we stored subproblem solutions and used the stored solutions in a recursive algorithm? This proceeds top-down, like divide-and-conquer, but gains the same benefit as bottom-up DP.
• The memoized version maintains an entry per subproblem in a table; one can use a fixed-size table or a hash table (see the sketch below).
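A minimal memoized, top-down sketch of the matrix-chain computation; the inner helper plays the role of the Lookup-Chain routine analyzed below, and the table layout and names are illustrative assumptions:

```python
import math

def memoized_matrix_chain(p):
    """Top-down matrix-chain multiplication with memoization.
    p: dimensions, matrix A_i is p[i-1] x p[i]."""
    n = len(p) - 1
    # m[i][j] = best known cost for A_i ... A_j; inf means "not yet computed"
    m = [[math.inf] * (n + 1) for _ in range(n + 1)]

    def lookup_chain(i, j):
        if m[i][j] < math.inf:          # already computed: return in O(1)
            return m[i][j]
        if i == j:
            m[i][j] = 0
        else:
            for k in range(i, j):       # fill the entry once, trying O(n) splits
                cost = (lookup_chain(i, k) + lookup_chain(k + 1, j)
                        + p[i - 1] * p[k] * p[j])
                m[i][j] = min(m[i][j], cost)
        return m[i][j]

    return lookup_chain(1, n)
```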
Running Time: Memoized-Matrix-Chain
• Each call to Lookup-Chain takes either
  - O(1) time, if m[i, j] has already been computed, or
  - O(n) time (plus recursive calls), otherwise.
• Each entry m[i, j] may be requested many times but is filled in only once.
  - There are O(n^2) calls of the second type, one per table entry.
• A call of the second type makes O(n) recursive calls.
• Total time: O(n^2) entries × O(n) work each = O(n^3) (see the call-count sketch below).
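A small self-contained sketch contrasting the two counts just described: the number of calls made by the plain recursion versus the number of distinct (i, j) table entries a memoized version fills; the function name and structure are illustrative:

```python
def count_calls(n):
    """Return (calls made by the plain recursion over a chain of n
    matrices, number of distinct (i, j) subproblems)."""
    def rec(i, j):
        if i == j:
            return 1
        # one call here, plus the calls made for every split point k
        return 1 + sum(rec(i, k) + rec(k + 1, j) for k in range(i, j))
    distinct = n * (n + 1) // 2   # one table entry per pair i <= j
    return rec(1, n), distinct

print(count_calls(10))  # (19683, 55): exponentially many calls vs. 55 table entries
```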