MIT 18.05 Spring 2014: Class 11 Slides
http://xkcd.com/1236/
Learning from experience
Which die is it?
• Suppose you find out that the bag contained one 4-sided die and one 10-sided die. Does this change your guess?
• Suppose you find out that the bag contained one 4-sided die and 100 10-sided dice. Does this change your guess?
Board Question: learning from data
Board Question: Dice
Five dice: 4-sided, 6-sided, 8-sided, 12-sided, 20-sided.
Suppose I picked one at random and, without showing it to you,
rolled it and reported a 13.
1. Make the full likelihood table (be smart about identical columns).
2. Make a Bayesian update table and compute the posterior
probabilities that the chosen die is each of the five dice.
3. Same question if I rolled a 5.
4. Same question if I rolled a 9.
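For reference, every table that follows is filled in with Bayes' theorem together with the law of total probability. The formula is not printed on the slide, but it is the rule the tables apply:

```latex
P(H \mid D) \;=\; \frac{P(D \mid H)\,P(H)}{P(D)},
\qquad\text{where}\qquad
P(D) \;=\; \sum_{H} P(D \mid H)\,P(H).
```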
Tabular solution
D = ‘rolled a 13’
hypothesis   prior   likelihood   Bayes numerator   posterior
H            P(H)    P(D|H)       P(D|H)P(H)        P(H|D)
H4           1/5     0            0                 0
H6           1/5     0            0                 0
H8           1/5     0            0                 0
H12          1/5     0            0                 0
H20          1/5     1/20         1/100             1
total        1                    1/100             1
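The table arithmetic is easy to script. Below is a minimal Python sketch (my addition, not from the slides; the names `dice`, `prior`, and `update` are mine) that reproduces this table and the two that follow by looping over the reported rolls 13, 5, and 9.

```python
from fractions import Fraction

# Hypotheses: the chosen die has n sides; each die has prior probability 1/5.
dice = [4, 6, 8, 12, 20]
prior = {n: Fraction(1, 5) for n in dice}

def update(prior, roll):
    """One Bayesian update: posterior(H) is proportional to P(roll | H) * prior(H)."""
    # Likelihood of `roll` on an n-sided die: 1/n if roll <= n, else 0.
    numerator = {n: (Fraction(1, n) if roll <= n else Fraction(0)) * p
                 for n, p in prior.items()}
    total = sum(numerator.values())   # P(data), by the law of total probability
    return {n: num / total for n, num in numerator.items()}

for roll in (13, 5, 9):
    post = update(prior, roll)
    print(roll, {n: round(float(p), 3) for n, p in post.items()})
# roll 13 -> H20: 1.0
# roll 5  -> H6: 0.392, H8: 0.294, H12: 0.196, H20: 0.118
# roll 9  -> H12: 0.625, H20: 0.375
```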
Tabular solution
D = ‘rolled a 5’
hypothesis   prior   likelihood   Bayes numerator   posterior
H            P(H)    P(D|H)       P(D|H)P(H)        P(H|D)
H4           1/5     0            0                 0
H6           1/5     1/6          1/30              0.392
H8           1/5     1/8          1/40              0.294
H12          1/5     1/12         1/60              0.196
H20          1/5     1/20         1/100             0.118
total        1                    0.085             1
Tabular solution
D = ‘rolled a 9’
hypothesis   prior   likelihood   Bayes numerator   posterior
H            P(H)    P(D|H)       P(D|H)P(H)        P(H|D)
H4           1/5     0            0                 0
H6           1/5     0            0                 0
H8           1/5     0            0                 0
H12          1/5     1/12         1/60              0.625
H20          1/5     1/20         1/100             0.375
total        1                    0.0267            1
Iterated Updates
Tabular solution
D1 = ‘rolled a 5’
D2 = ‘rolled a 9’
hyp.    prior   likel. 1   Bayes num. 1   likel. 2   Bayes num. 2   posterior
H       P(H)    P(D1|H)    ∗∗∗            P(D2|H)    ∗∗∗            P(H|D1, D2)
H4      1/5     0          0              0          0              0
H6      1/5     1/6        1/30           0          0              0
H8      1/5     1/8        1/40           0          0              0
H12     1/5     1/12       1/60           1/12       1/720          0.735
H20     1/5     1/20       1/100          1/20       1/2000         0.265
total   1                                            0.0019         1
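A small Python sketch (again my addition; `likelihood` and `update` are hypothetical helper names, not from the course) runs the same two updates sequentially: the posterior after the 5 becomes the prior for the 9.

```python
from fractions import Fraction

dice = [4, 6, 8, 12, 20]

def likelihood(roll, n):
    # P(roll | n-sided die): 1/n if the roll is possible on that die, else 0.
    return Fraction(1, n) if roll <= n else Fraction(0)

def update(prior, roll):
    numerator = {n: likelihood(roll, n) * p for n, p in prior.items()}
    total = sum(numerator.values())
    return {n: num / total for n, num in numerator.items()}

prior = {n: Fraction(1, 5) for n in dice}
post1 = update(prior, 5)   # posterior after D1 = 'rolled a 5'
post2 = update(post1, 9)   # post1 serves as the prior for D2 = 'rolled a 9'
print({n: round(float(p), 3) for n, p in post2.items()})
# -> H12: 0.735, H20: 0.265 (the 9 rules out H4, H6, H8)
```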
Board Question
Tabular solution: two steps
D1 = ‘rolled a 9’
D2 = ‘rolled a 5’
hyp.    prior   likel. 1   Bayes num. 1   likel. 2   Bayes num. 2   posterior
H       P(H)    P(D1|H)    ∗∗∗            P(D2|H)    ∗∗∗            P(H|D1, D2)
H4      1/5     0          0              0          0              0
H6      1/5     0          0              1/6        0              0
H8      1/5     0          0              1/8        0              0
H12     1/5     1/12       1/60           1/12       1/720          0.735
H20     1/5     1/20       1/100          1/20       1/2000         0.265
total   1                                            0.0019         1
Tabular solution: one step
D = ‘rolled a 9 then a 5’
hypothesis   prior   likelihood   Bayes numerator   posterior
H            P(H)    P(D|H)       P(D|H)P(H)        P(H|D)
H4           1/5     0            0                 0
H6           1/5     0            0                 0
H8           1/5     0            0                 0
H12          1/5     1/144        1/720             0.735
H20          1/5     1/400        1/2000            0.265
total        1                    0.0019            1
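The claim that one combined update agrees with the two sequential updates, in either order, can be checked exactly with fractions. The sketch below is my addition, with hypothetical helper names.

```python
from fractions import Fraction

dice = [4, 6, 8, 12, 20]

def likelihood(roll, n):
    return Fraction(1, n) if roll <= n else Fraction(0)

def normalize(numerator):
    total = sum(numerator.values())
    return {n: num / total for n, num in numerator.items()}

# One step: the likelihood of D = 'rolled a 9 then a 5' is the product.
one_step = normalize({n: Fraction(1, 5) * likelihood(9, n) * likelihood(5, n)
                      for n in dice})

# Two steps: update on the 9, then on the 5 (5 then 9 gives the same product).
after_9 = normalize({n: Fraction(1, 5) * likelihood(9, n) for n in dice})
two_step = normalize({n: after_9[n] * likelihood(5, n) for n in dice})

assert one_step == two_step   # exact agreement, since we used Fractions
print({n: round(float(p), 3) for n, p in one_step.items()})
# -> H12: 0.735, H20: 0.265
```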
Board Question: probabilistic prediction
Solution
D1 = ‘rolled a 5’
D2 = ‘rolled a 4’
hyp.    prior   likel. 1   Bayes num. 1   post. 1    likel. 2      post. 1 × likel. 2
H       P(H)    P(D1|H)    ∗∗∗            P(H|D1)    P(D2|H, D1)   P(D2|H, D1)P(H|D1)
H4      1/5     0          0              0          ∗             0
H6      1/5     1/6        1/30           0.392      1/6           0.392 · 1/6
H8      1/5     1/8        1/40           0.294      1/8           0.294 · 1/8
H12     1/5     1/12       1/60           0.196      1/12          0.196 · 1/12
H20     1/5     1/20       1/100          0.118      1/20          0.118 · 1/20
total   1                  0.085          1                        0.124
The law of total probability tells us that P(D1) is the sum of the Bayes numerator 1 column in the table: P(D1) = 0.085.
Likewise, P(D2|D1) is the sum of the last column in the table: P(D2|D1) = 0.124.
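A short Python sketch (my addition, with hypothetical names) reproduces both numbers: P(D1) as the sum of the Bayes numerators, and the predictive probability P(D2 | D1) by the law of total probability.

```python
from fractions import Fraction

dice = [4, 6, 8, 12, 20]

def likelihood(roll, n):
    # P(roll | n-sided die): 1/n if the roll is possible on that die, else 0.
    return Fraction(1, n) if roll <= n else Fraction(0)

prior = {n: Fraction(1, 5) for n in dice}

# Update on D1 = 'rolled a 5'; P(D1) is the total of the Bayes numerator column.
numerator1 = {n: likelihood(5, n) * prior[n] for n in dice}
p_d1 = sum(numerator1.values())
posterior1 = {n: num / p_d1 for n, num in numerator1.items()}

# Predict D2 = 'rolled a 4' by total probability; given the die, the rolls are
# independent, so P(D2 | H, D1) = P(D2 | H).
p_d2_given_d1 = sum(likelihood(4, n) * posterior1[n] for n in dice)

print(round(float(p_d1), 3), round(float(p_d2_given_d1), 3))   # 0.085 0.124
```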
MIT OpenCourseWare
https://ocw.mit.edu

For information about citing these materials or our Terms of Use, visit: https://ocw.mit.edu/terms.