Bayesian Quiz1 Solutions

The document provides solutions to 7 Bayesian theory problems. It includes calculating joint and marginal probabilities from a probability mass function table, conditional probabilities, expected values and variance of a uniform distribution, setting up a conditional probability table for travel times, applying Bayes' theorem to binomial and Poisson distributions, and calculating a prior predictive distribution.

Quiz 1 Solution: Bayesian Theory.

1. Write down the table of the joint probability mass function for number of young and species. Then
sum entries across the rows and across the columns to get the marginal probabilities.

# Young    S1      S2      Total
   0       0.06    0.10    0.16
   1       0.48    0.22    0.70
   2       0.06    0.08    0.14
 Total     0.60    0.40    1.00

Then the conditional probability of 0 Young given deer is Species 2:


Pr(Y = 0 | Species = S2) = Pr(Y = 0, S = S2) / Pr(S = S2) = 0.10 / 0.40 = 0.25

2. The number of young and the species are dependent. For example,

Pr(Y = 0) = 0.16 ≠ Pr(Y = 0 | Species = S2) = 0.25
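The marginal and conditional calculations above can be checked numerically; a minimal sketch using NumPy (the array and variable names are illustrative):

```python
import numpy as np

# Joint pmf of number of young (rows: 0, 1, 2) and species (cols: S1, S2),
# taken from the table in part 1.
joint = np.array([[0.06, 0.10],
                  [0.48, 0.22],
                  [0.06, 0.08]])

p_young = joint.sum(axis=1)    # marginal of Y: [0.16, 0.70, 0.14]
p_species = joint.sum(axis=0)  # marginal of S: [0.60, 0.40]

# Conditional Pr(Y = 0 | S = S2) = joint probability / marginal of S2
p_y0_given_s2 = joint[0, 1] / p_species[1]
print(p_y0_given_s2)           # 0.25, which differs from Pr(Y = 0) = 0.16
```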

3. If X ∼ Uniform(10,20),
E[X] = (10 + 20) / 2 = 15

V[X] = (20 − 10)^2 / 12 = 8.333333

CV[X] = sqrt(V[X]) / E[X] = 2.886751 / 15 = 0.1924501 ≈ 0.19
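A quick numerical check of these uniform-distribution moments (variable names are illustrative):

```python
import math

a, b = 10.0, 20.0            # X ~ Uniform(10, 20)

mean = (a + b) / 2           # E[X] = 15
var = (b - a) ** 2 / 12      # V[X] ≈ 8.3333
cv = math.sqrt(var) / mean   # coefficient of variation ≈ 0.19

print(mean, var, round(cv, 4))
```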

4. Set up a table of marginal probabilities for roadworks or not and conditional densities for travel time.

          Yes     No
Pr(R)     2/5     3/5
f(Y|R)    1/10    1/5

Then by Bayes Rule


Pr(R = Yes | Y = 12) = Pr(R = Yes) f(Y = 12 | R = Yes)
                       / [Pr(R = Yes) f(Y = 12 | R = Yes) + Pr(R = No) f(Y = 12 | R = No)]
                     = (2/5 · 1/10) / (2/5 · 1/10 + 3/5 · 1/5)
                     = 0.04 / (0.04 + 0.12) = 0.250
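The same Bayes Rule calculation as a short numerical sketch (the prior values and density heights are taken from the table above; variable names are illustrative):

```python
# Prior probabilities of roadworks (Yes/No) and the travel-time
# densities evaluated at y = 12.
prior_yes, prior_no = 2 / 5, 3 / 5
f_yes, f_no = 1 / 10, 1 / 5    # density heights at Y = 12

numer = prior_yes * f_yes               # ≈ 0.04
denom = numer + prior_no * f_no         # ≈ 0.04 + 0.12
posterior_yes = numer / denom
print(round(posterior_yes, 3))          # ≈ 0.25
```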

5. Bayes Theorem application:

Y |θ ∼ Binomial (n = 225, θ)
θ ∼ Beta (7, 3)

Given Y =181:

θ|Y = 181 ∼ Beta (7 + 181, 3 + 225 − 181) = Beta(188, 47)

Then
E[θ | Y = 181] = 188 / (188 + 47) = 0.800.
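The conjugate Beta-Binomial update above can be verified in a few lines (variable names are illustrative):

```python
# Conjugate Beta-Binomial update: a Beta(a, b) prior with y successes in
# n trials gives a Beta(a + y, b + n - y) posterior.
a, b = 7, 3          # prior parameters
n, y = 225, 181      # data

a_post, b_post = a + y, b + n - y        # Beta(188, 47)
post_mean = a_post / (a_post + b_post)   # posterior mean
print(a_post, b_post, round(post_mean, 3))   # 188 47 0.8
```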
6. Bayes Theorem application:
Yi | θ ~ iid Poisson(θ), i = 1, . . . , n
θ ~ Gamma(3, 2)

Then given n = 5 and Σ yi = 16 (summing over i = 1, . . . , 5),

θ|y ∼ Gamma (3 + 16, 2 + 5) = Gamma(19, 7)

Then
E[θ | y] = 19 / 7 = 2.71
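The Gamma-Poisson update follows the same conjugate pattern; a minimal numerical check (variable names are illustrative):

```python
# Conjugate Gamma-Poisson update: a Gamma(alpha, beta) prior with n Poisson
# observations summing to s gives a Gamma(alpha + s, beta + n) posterior.
alpha, beta = 3, 2    # prior parameters
n, s = 5, 16          # five observations with sum 16

alpha_post, beta_post = alpha + s, beta + n   # Gamma(19, 7)
post_mean = alpha_post / beta_post            # posterior mean
print(alpha_post, beta_post, round(post_mean, 2))   # 19 7 2.71
```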

7. Prior predictive distribution for Y where

Y |θ ∼ Poisson (θ)
θ ∼ Gamma(α, β)

Pr(Y = y) = ∫₀^∞ f(y|θ) π(θ) dθ
          = ∫₀^∞ [e^(−θ) θ^y / y!] [β^α / Γ(α)] θ^(α−1) e^(−θβ) dθ
          = [β^α / (y! Γ(α))] ∫₀^∞ θ^(α+y−1) e^(−θ(β+1)) dθ
          = β^α Γ(α + y) / [y! Γ(α) (β + 1)^(α+y)]

With α=3 and β=2:

Pr(Y = y) = 2^3 Γ(3 + y) / [y! Γ(3) (2 + 1)^(3+y)] = 4 Γ(3 + y) / [y! 3^(3+y)]

Then
Pr(Y = 0) = 4 Γ(3) / (0! 3^3) = 8/27

Pr(Y = 1) = 4 Γ(4) / (1! 3^4) = 8/27

Pr(Y = 2) = 4 Γ(5) / (2! 3^5) = 48/243
Then the probability that Y is at most 1 is Pr(Y = 0) + Pr(Y = 1) = 16/27 ≈ 0.593.
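The closed-form prior predictive derived above can be evaluated directly; a sketch with the Gamma function from the standard library (the function name `prior_predictive` is illustrative):

```python
from math import gamma, factorial

def prior_predictive(y, alpha=3, beta=2):
    """Gamma-Poisson prior predictive pmf:
    beta^alpha * Gamma(alpha + y) / (y! * Gamma(alpha) * (beta + 1)^(alpha + y))."""
    return (beta ** alpha * gamma(alpha + y)
            / (factorial(y) * gamma(alpha) * (beta + 1) ** (alpha + y)))

p0 = prior_predictive(0)   # 8/27
p1 = prior_predictive(1)   # 8/27
print(round(p0 + p1, 3))   # Pr(Y <= 1) = 16/27 ≈ 0.593
```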
