
Methods for Dummies 2009

Bayes for Beginners


Georgina Torbet & Raphael Kaplan
Bayesian Probability

“Probability” is often used to refer to frequency...
... but Bayesian probability is a measure of a state of knowledge.

It quantifies uncertainty and allows us to reason using uncertain statements.

A Bayesian model is continually updated as more data are acquired.
How did this come about?

Billiard Table:

A white billiard ball is rolled along a line and we look at where it stops. We suppose that it has a uniform probability of falling anywhere on the line. It stops at a point p.

A red billiard ball is then rolled n times under the same uniform assumption.

How many times does the red ball roll further than the white ball?
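This can be checked with a quick simulation. The following is a minimal sketch, not from the original slides: the line is taken to be the unit interval, every roll is uniform on it, and the choices of n and the number of trials are illustrative assumptions.

import random

# Monte Carlo sketch of the billiard-table question.
# Assumption: the line is the unit interval [0, 1] and every roll is uniform.
random.seed(0)
n = 10            # number of red-ball rolls (illustrative)
trials = 100_000

total_further = 0
for _ in range(trials):
    p = random.random()                        # where the white ball stops
    reds = [random.random() for _ in range(n)]
    total_further += sum(r > p for r in reds)  # red rolls beyond the white ball

# Averaged over the unknown p, each red roll beats the white ball half the
# time, so the expected count is n / 2.
print(total_further / trials)                  # ~5.0 for n = 10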
Bayes' Theorem

Bayes' Theorem shows the relationship between a conditional probability and its inverse.

i.e. it allows us to make an inference from the probability of a hypothesis given the evidence to the probability of that evidence given the hypothesis, and vice versa.
Bayes' Theorem

P(A|B) = P(B|A) P(A) / P(B)

P(A) – the PRIOR PROBABILITY – represents your knowledge about A before you have gathered data.

e.g. if 0.01 of a population has schizophrenia, then the probability that a person drawn at random would have schizophrenia is 0.01.
Bayes' Theorem

P(A|B) = P(B|A) P(A) / P(B)

P(B|A) – the CONDITIONAL PROBABILITY – the probability of B, given A.

e.g. you are trying to roll a total of 8 on two dice. What is the probability that you achieve this, given that the first die rolled a 6?
Bayes' Theorem

P(A|B) = P(B|A) P(A) / P(B)

So the theorem says: the probability of A given B is equal to the probability of B given A, times the prior probability of A, divided by the prior probability of B.
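As a concrete illustration (not part of the original slides), the theorem can be written as a one-line helper; the dice figures below are my own working of the example above.

def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Dice illustration: A = "first die shows 6", B = "total is 8".
# P(B|A) = 1/6 (need a 2 on the second die), P(A) = 1/6, P(B) = 5/36.
print(bayes(1/6, 1/6, 5/36))   # 0.2, i.e. P(first die was 6 | total is 8)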
A Simple Example

Mode of transport    Probability he is late
Car                  50%
Bus                  20%
Train                 1%

Suppose that Bob is late one day. His boss wishes to estimate the probability that he traveled to work that day by car.

He does not know which mode of transportation Bob usually uses, so he gives a prior probability of 1 in 3 to each of the three possibilities.
A Simple Example

P(A|B) = P(B|A) P(A) / P(B)

P(car|late) = P(late|car) x P(car) / P(late)

P(late|car) = 0.5 (he will be late half the time he drives)
P(car) = 0.33 (this is the boss's assumption)
P(late) = 0.5 x 0.33 + 0.2 x 0.33 + 0.01 x 0.33 = 0.2343
(all the ways he could be late, weighted by the priors and added together)

P(car|late) = (0.5 x 0.33) / (0.5 x 0.33 + 0.2 x 0.33 + 0.01 x 0.33)
= 0.165 / 0.2343
= 0.7042
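The same calculation as a minimal Python sketch (not from the slides; the numbers are those given above):

# Bob's lateness example: P(car | late) via Bayes' theorem.
priors = {"car": 1/3, "bus": 1/3, "train": 1/3}     # the boss's assumption
p_late_given = {"car": 0.5, "bus": 0.2, "train": 0.01}

# P(late) by the law of total probability
p_late = sum(p_late_given[m] * priors[m] for m in priors)

p_car_given_late = p_late_given["car"] * priors["car"] / p_late
print(round(p_car_given_late, 4))   # ~0.7042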
More complex example
Disease present in 0.5% of the population (i.e. 0.005)
Blood test is 99% accurate (i.e. P(pos|disease) = 0.99)
False positive rate is 5% (i.e. P(pos|no disease) = 0.05)

If someone tests positive, what is the probability that they have the disease?

P(A|B) = P(B|A) P(A) / P(B)

P(disease|pos) = P(pos|disease) x P(disease) / P(pos)
= (0.99 x 0.005) / ((0.99 x 0.005) + (0.05 x 0.995))
= 0.00495 / (0.00495 + 0.04975)
= 0.00495 / 0.0547
= 0.0905
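The same arithmetic as a short Python sketch (my restatement of the numbers above, not code from the slides):

# Disease-testing example: P(disease | positive test).
p_disease = 0.005           # base rate in the population
p_pos_given_disease = 0.99  # test sensitivity
p_pos_given_healthy = 0.05  # false-positive rate

# P(pos) by the law of total probability
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))   # ~0.0905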
What does this mean?

If someone tests positive for the disease, they have a 0.0905 chance of having the disease, i.e. there is just a 9% chance that they have it.

Even though the test is very accurate, because the condition is so rare the test may not be useful.
So why is Bayesian probability useful?

It allows us to put probability values on unknowns. We can make logical inferences even regarding uncertain statements.

This can show counterintuitive results, e.g. that the disease test may not be useful.
Comparing Bayes

• Classical Inference: What is the likelihood that our data are not the result of random chance? (e.g. following a nested design: what is the likelihood of getting this data given there is no activation?)

• Bayesian Inference: Does our hypothesis fit our data? Does it work better than other models? (e.g. assess how well a model fits our data: what is the likelihood of getting this activation given the data?)
Why Use It?

After all, it is a subjective model, isn't it? Our inference is only as good as our prior, right?

- We can rule out or accept the null hypothesis, by looking at the null given the data instead of the data given the null.
- This also means we can compare any model (including the null hypothesis), even the validity of our priors!
- We can estimate the plausibility of whether one Beta might have a stronger effect than another Beta in our GLM.
Bayesian paradigm
Likelihood and priors

For a generative model m with parameters θ and data y:

Likelihood: p(y | θ, m)
Prior: p(θ | m)
Bayes rule: p(θ | y, m) = p(y | θ, m) p(θ | m) / p(y | m)
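As an illustrative sketch (not from the slides), the posterior for a one-dimensional parameter can be computed on a grid by multiplying likelihood and prior and normalising. Here the parameter is taken to be the white billiard ball's position p, the data are how many of the n red rolls stopped beyond it, and the numbers are illustrative assumptions.

import numpy as np

# Grid approximation of Bayes rule: posterior is proportional to likelihood x prior.
n, k = 10, 7                                     # k of n red rolls stopped beyond p
grid = np.linspace(0, 1, 1001)                   # candidate values of p

prior = np.ones_like(grid)                       # uniform prior p(θ | m)
likelihood = (1 - grid)**k * grid**(n - k)       # p(y | θ, m), up to a constant
posterior = likelihood * prior
posterior /= posterior.sum()                     # normalise (divide by p(y | m))

print(grid[np.argmax(posterior)])                # posterior mode, ~0.3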
Hierarchical Models

• Levels of Analysis
• Even though we cannot measure at every level, we can place priors on what we think might be going on at each level.
• Use Empirical Bayes assumptions.
• We can then compare models at each level to determine what best fits our data, from a single neurotransmitter all the way up to a cognitive network.

(Churchland and Sejnowski, 1988; Science)
Hierarchical models

[Figure: diagram of hierarchical models, relating levels of the hierarchy to causality]
Applying Bayesian Model Comparison to Neuroimaging

• What are some ways we can use Bayesian Inference in SPM8?
Example #1
• Pharmacological Neuroimaging experiment
• Clinical application (Parkinson’s, Alzheimer’s, etc.)
• Use priors to compare an activation in a particular brain region (basal
ganglia, hippocampus, etc.) that a drug targets to the rest of the brain.
• Using model comparison, we can assess the relative strengths of a
particular region to decide whether a targeted brain region was influenced
by pharmacological intervention more than the rest of the brain or other
specific regions.
Example #2
• EEG/MEG Source Reconstruction

(Mattout et al., 2006, NeuroImage)
Other Examples/Uses

• Anatomical Segmentation (e.g. into grey matter, white matter, and CSF)
• Dynamic Causal Modeling (DCM)

[Ashburner et al., Human Brain Function, 2003]
Take Home Messages
• Bayesian Inference allows you to ask different questions than you normally would with more classical approaches (e.g. it allows you to accept, instead of fail to reject, the null hypothesis as the most likely hypothesis/model).
• It is an extremely useful tool in model comparison. You can compare models that are not nested (instead of comparing to random chance).
• It allows for incorporation of prior evidence and helps constrain
inferences to see how plausible they are against the given
data.
Conclusion
• Bayesian inference is applicable to something other than a
billiards game
Acknowledgements and
Recommended Resources
• Jean Daunizeau and his SPM course slides
• Past MFD slides
• Human Brain Function (eds. Ashburner, Friston, and Penny)
www.fil.ion.ucl.ac.uk/spm/doc/books/hbf2/pdfs/Ch17.pdf
• http://faculty.vassar.edu/lowry/bayes.html
