
Bayesian Updating: Discrete Priors
18.05 Spring 2014

https://xkcd.com/1236/

January 1, 2017 1 / 16
Learning from experience

Which treatment would you choose?

1. Treatment 1: cured 100% of patients in a trial.
2. Treatment 2: cured 95% of patients in a trial.
3. Treatment 3: cured 90% of patients in a trial.

Which treatment would you choose?

1. Treatment 1: cured 3 out of 3 patients in a trial.
2. Treatment 2: cured 19 out of 20 patients treated in a trial.
3. Standard treatment: cured 90,000 out of 100,000 patients in clinical practice.

Which die is it?

I have a bag containing dice of two types: 4-sided and 10-sided.

Suppose I pick a die at random and roll it.

Based on what I rolled, which type would you guess I picked?

• Suppose you find out that the bag contained one 4-sided die and
one 10-sided die. Does this change your guess?

• Suppose you find out that the bag contained one 4-sided die and
100 10-sided dice. Does this change your guess?

Board Question: learning from data

• A certain disease has a prevalence of 0.005.

• A screening test has 2% false positives and 1% false negatives.

Suppose a patient is screened and has a positive test.


1. Represent this information with a tree and use Bayes’ theorem to
   compute the probabilities that the patient does and doesn’t have
   the disease.
2. Identify the data, hypotheses, likelihoods, prior probabilities and
   posterior probabilities.
3. Make a full likelihood table containing all hypotheses and
   possible test data.
4. Redo the computation using a Bayesian update table. Match the
   terms in your table to the terms in your previous calculation.
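The tree calculation in step 1 can be sketched in Python. This is a minimal illustration, not part of the original slides; the variable names are chosen here, and the numbers are the ones given above (prevalence 0.005, 2% false positives, 1% false negatives).

```python
# Sketch: P(disease | positive test) by Bayes' theorem.
p_disease = 0.005                    # prevalence (prior)
p_pos_given_disease = 1 - 0.01       # sensitivity = 1 - false-negative rate
p_pos_given_healthy = 0.02           # false-positive rate

# Law of total probability: P(positive test)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(disease | positive)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))   # 0.199
```

Even with a positive test, the posterior probability of disease is only about 20%, because the disease is rare.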

Board Question: Dice
Five dice: 4-sided, 6-sided, 8-sided, 12-sided, 20-sided.
Suppose I picked one at random and, without showing it to you,
rolled it and reported a 13.

1. Make the full likelihood table (be smart about identical columns).
2. Make a Bayesian update table and compute the posterior
probabilities that the chosen die is each of the five dice.
3. Same question if I rolled a 5.
4. Same question if I rolled a 9.

(Keep the tables for 5 and 9 handy! Do not erase!)

Tabular solution

D = ‘rolled a 13’

Bayes
hypothesis prior likelihood numerator posterior
H P(H) P(D|H) P(D|H)P(H) P(H|D)
H4 1/5 0 0 0
H6 1/5 0 0 0
H8 1/5 0 0 0
H12 1/5 0 0 0
H20 1/5 1/20 1/100 1
total 1 1/100 1
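The arithmetic in this update table is mechanical enough to express as a small helper function. This is a sketch added here for illustration (the function name and structure are not from the slides): a die with n sides gives likelihood 1/n for any roll up to n, and 0 otherwise.

```python
# Sketch: one row per hypothesis H_n ("the die has n sides").
def update(priors, sides, roll):
    """Return posterior probabilities after observing `roll`."""
    likelihoods = [1.0 / n if roll <= n else 0.0 for n in sides]
    numerators = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(numerators)               # P(data), by total probability
    return [num / total for num in numerators]

sides = [4, 6, 8, 12, 20]
priors = [1/5] * 5
print(update(priors, sides, 13))   # [0.0, 0.0, 0.0, 0.0, 1.0]
```

A roll of 13 rules out every die except the 20-sided one, so its posterior is 1, matching the table above.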

Tabular solution

D = ‘rolled a 5’

Bayes
hypothesis prior likelihood numerator posterior
H P(H) P(D|H) P(D|H)P(H) P(H|D)
H4 1/5 0 0 0
H6 1/5 1/6 1/30 0.392
H8 1/5 1/8 1/40 0.294
H12 1/5 1/12 1/60 0.196
H20 1/5 1/20 1/100 0.118
total 1 0.085 1

Tabular solution

D = ‘rolled a 9’

Bayes
hypothesis prior likelihood numerator posterior
H P(H) P(D|H) P(D|H)P(H) P(H|D)
H4 1/5 0 0 0
H6 1/5 0 0 0
H8 1/5 0 0 0
H12 1/5 1/12 1/60 0.625
H20 1/5 1/20 1/100 0.375
total 1 0.0267 1

Iterated Updates

Suppose I rolled a 5 and then a 9.

Update in two steps:

First update for the 5.
Then update the update for the 9.
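The key point is that the posterior after the 5 becomes the prior for the 9, so the final Bayes numerator is just prior × likelihood(5) × likelihood(9); normalization can wait until the end. A small sketch of this (illustrative code, not from the slides):

```python
# Sketch: iterated update for rolls 5 then 9, normalizing once at the end.
sides = [4, 6, 8, 12, 20]
priors = [1/5] * 5

def likelihood(n, roll):
    return 1.0 / n if roll <= n else 0.0

nums = [p * likelihood(n, 5) * likelihood(n, 9)
        for p, n in zip(priors, sides)]
total = sum(nums)
posterior = [x / total for x in nums]
print([round(p, 3) for p in posterior])   # [0.0, 0.0, 0.0, 0.735, 0.265]
```

Only the 12- and 20-sided dice can produce a 9, so all other hypotheses are eliminated.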

Tabular solution

D1 = ‘rolled a 5’

D2 = ‘rolled a 9’

Bayes numerator1 = likelihood1 × prior.

Bayes numerator2 = likelihood2 × Bayes numerator1

Bayes Bayes
hyp. prior likel. 1 num. 1 likel. 2 num. 2 posterior
H P(H) P(D1 |H) ∗∗∗ P(D2 |H) ∗∗∗ P(H|D1 , D2 )
H4 1/5 0 0 0 0 0
H6 1/5 1/6 1/30 0 0 0
H8 1/5 1/8 1/40 0 0 0
H12 1/5 1/12 1/60 1/12 1/720 0.735
H20 1/5 1/20 1/100 1/20 1/2000 0.265
total 1 0.0019 1

Board Question

Suppose I rolled a 9 and then a 5.

1. Do the Bayesian update in two steps:


First update for the 9.
Then update the update for the 5.

2. Do the Bayesian update in one step


The data is D = ‘9 followed by 5’

Tabular solution: two steps

D1 = ‘rolled a 9’

D2 = ‘rolled a 5’

Bayes numerator1 = likelihood1 × prior.

Bayes numerator2 = likelihood2 × Bayes numerator1

Bayes Bayes
hyp. prior likel. 1 num. 1 likel. 2 num. 2 posterior
H P(H) P(D1 |H) ∗∗∗ P(D2 |H) ∗∗∗ P(H|D1 , D2 )
H4 1/5 0 0 0 0 0
H6 1/5 0 0 1/6 0 0
H8 1/5 0 0 1/8 0 0
H12 1/5 1/12 1/60 1/12 1/720 0.735
H20 1/5 1/20 1/100 1/20 1/2000 0.265
total 1 0.0019 1

Tabular solution: one step

D = ‘rolled a 9 then a 5’

Bayes
hypothesis prior likelihood numerator posterior
H P(H) P(D|H) P(D|H)P(H) P(H|D)
H4 1/5 0 0 0
H6 1/5 0 0 0
H8 1/5 0 0 0
H12 1/5 1/144 1/720 0.735
H20 1/5 1/400 1/2000 0.265
total 1 0.0019 1
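Since the rolls are independent given the die, the one-step likelihoods are products of the per-roll likelihoods (e.g. P(D|H12) = (1/12)·(1/12) = 1/144), so the one-step table must reproduce the two-step answer. A quick check (illustrative sketch, not from the slides):

```python
# Sketch: one-step update with D = 'rolled a 9 then a 5'.
sides = [4, 6, 8, 12, 20]
nums = [(1/5) * (1/n if 9 <= n else 0) * (1/n if 5 <= n else 0)
        for n in sides]
total = sum(nums)
print([round(x / total, 3) for x in nums])   # [0.0, 0.0, 0.0, 0.735, 0.265]
```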

Board Question: probabilistic prediction

Along with finding posterior probabilities of hypotheses, we might
want to make posterior predictions about the next roll.

With the same setup as before let:


D1 = result of first roll
D2 = result of second roll

(a) Find P(D1 = 5).


(b) Find P(D2 = 4|D1 = 5).

Solution
D1 = ‘rolled a 5’

D2 = ‘rolled a 4’
Bayes
hyp. prior likel. 1 num. 1 post. 1 likel. 2 post. 1 × likel. 2
H P(H) P(D1 |H) ∗ ∗ ∗ P(H|D1 ) P(D2 |H, D1 ) P(D2 |H, D1 )P(H|D1 )
H4 1/5 0 0 0 ∗ 0
H6 1/5 1/6 1/30 0.392 1/6 0.392 · 1/6
H8 1/5 1/8 1/40 0.294 1/8 0.294 · 1/8
H12 1/5 1/12 1/60 0.196 1/12 0.196 · 1/12
H20 1/5 1/20 1/100 0.118 1/20 0.118 · 1/20
total 1 0.085 1 0.124

The law of total probability tells us P(D1 ) is the sum of the Bayes
numerator 1 column in the table: P(D1 ) = 0.085 .
The law of total probability tells us P(D2 |D1 ) is the sum of the last
column in the table: P(D2 |D1 ) = 0.124
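Both predictive probabilities can be computed by the same two sums described above. This is a sketch added for illustration (names chosen here, not from the slides):

```python
# Sketch: posterior predictive probabilities for the dice example.
sides = [4, 6, 8, 12, 20]
priors = [1/5] * 5

def likelihood(n, roll):
    return 1.0 / n if roll <= n else 0.0

# P(D1 = 5): sum of the Bayes-numerator column (total probability)
nums = [p * likelihood(n, 5) for p, n in zip(priors, sides)]
p_d1 = sum(nums)

# Posterior after D1 = 5, then P(D2 = 4 | D1 = 5)
post = [x / p_d1 for x in nums]
p_d2_given_d1 = sum(q * likelihood(n, 4) for q, n in zip(post, sides))

print(round(p_d1, 3), round(p_d2_given_d1, 3))   # 0.085 0.124
```

Note that the 4-sided die contributes nothing to the second sum: its posterior is already 0 after seeing a 5.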
MIT OpenCourseWare
https://ocw.mit.edu

18.05 Introduction to Probability and Statistics


Spring 2014

For information about citing these materials or our Terms of Use, visit: https://ocw.mit.edu/terms.
