
Solutions for Bayesian networks and decision graphs

(second edition)

Finn V. Jensen and Thomas D. Nielsen

September 6, 2007

Solution for exercise 1.1


Define B′ = B \ (A ∩ B). We immediately have

A ∩ B′ = ∅ , (1)

A ∪ B′ = A ∪ B , (2)

and

(A ∩ B) ∪ B′ = B . (3)

From (1), we have

(A ∩ B) ∩ B′ = ∅ . (4)

From (3), Axiom 3, and (4) we have

P(B) = P((A ∩ B) ∪ B′) = P(A ∩ B) + P(B′)
⇔ P(B′) = P(B) − P(A ∩ B) . (5)

From (2), Axiom 3, and (1) we have

P(A ∪ B) = P(A ∪ B′) = P(A) + P(B′) . (6)

The result, P(A ∪ B) = P(A) + P(B) − P(A ∩ B), now follows by substituting (5) into (6).
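As a quick numerical sanity check of the identity P(A ∪ B) = P(A) + P(B) − P(A ∩ B), the following sketch evaluates both sides on a finite probability space; the fair-die sample space and the events A and B below are illustrative assumptions, not part of the exercise.

# Check P(A ∪ B) = P(A) + P(B) - P(A ∩ B) on an arbitrary finite probability space.
from fractions import Fraction

P = {s: Fraction(1, 6) for s in range(1, 7)}   # fair six-sided die (illustrative)
A = {1, 2, 3}                                  # arbitrary events
B = {3, 4}

def prob(event):
    return sum(P[s] for s in event)

assert prob(A | B) == prob(A) + prob(B) - prob(A & B)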

Solution for exercise 1.2


Sample space (r stands for “red”, and b for “blue”):

S1 = {r1b1, r1b2, . . . , r1b6, r2b1, r2b2, . . . , r6b6} .


Each element has probability 1/36.
Sample space two: S2 = {2, . . . , 12} with probabilities P(7) = 6/36, P(6) = P(8) = 5/36, P(5) = P(9) = 4/36, P(4) = P(10) = 3/36, P(3) = P(11) = 2/36, and P(2) = P(12) = 1/36.
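The probabilities for S2 can be checked by enumerating S1 (assuming, as the uniform 1/36 probabilities imply, two fair six-sided dice):

# Distribution of the sum of two fair six-sided dice, by enumeration of S1.
from collections import Counter
from fractions import Fraction

counts = Counter(r + b for r in range(1, 7) for b in range(1, 7))
P_sum = {s: Fraction(c, 36) for s, c in counts.items()}
assert P_sum[7] == Fraction(6, 36)
assert P_sum[2] == P_sum[12] == Fraction(1, 36)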

Solution for exercise 1.3

Probabilities for SA: PA(1) = · · · = PA(4) = 5/24 and PA(5) = PA(6) = 1/12.
Probabilities for SB: PB(t1) = · · · = PB(t6) = 1/12 and PB(h1) = · · · = PB(h4) = 1/8.
PA(3) + PA(5) = 7/24.
PB(t3) + PB(t5) + PB(h3) = 7/24.
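These numbers can be reproduced with a short sketch. The model assumed below (toss a fair coin; on heads roll a fair four-sided die, on tails a fair six-sided die; SA records only the number rolled, SB also records the coin) is inferred from the probabilities above rather than restated from the exercise.

# Reconstruct PB over (coin, number) and PA over the number alone.
from fractions import Fraction

half = Fraction(1, 2)
PB = {("t", f): half * Fraction(1, 6) for f in range(1, 7)}          # tails: fair six-sided die
PB.update({("h", f): half * Fraction(1, 4) for f in range(1, 5)})    # heads: fair four-sided die

# SA only records the number rolled, so sum the coin out of PB.
PA = {f: sum(p for (_, g), p in PB.items() if g == f) for f in range(1, 7)}

assert PA[1] == Fraction(5, 24) and PA[5] == Fraction(1, 12)
assert PA[3] + PA[5] == Fraction(7, 24)
assert PB[("t", 3)] + PB[("t", 5)] + PB[("h", 3)] == Fraction(7, 24)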

Solution for exercise 1.4

Solution for exercise 1.5


P(t) = 2/3.
P(4 | t) = 1/12.
P(4 | t) + P(5 | t) + P(6 | t) = 1/4.
P(4 | h) = 3/8.
The four-sided die thus has a higher probability of rolling 4 or more than the six-sided die.
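A one-line check of the final comparison, using only the values stated above:

# P(4 | h) versus P(4 | t) + P(5 | t) + P(6 | t), as computed above.
from fractions import Fraction

assert Fraction(3, 8) > Fraction(1, 4)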

Solution for exercise 1.6


Starting with the equation just before Section 1.2.2 we get

P(A | B ∩ C) = P(A ∩ B ∩ C) / P(B ∩ C)
⇔ P(A | B ∩ C) = P(A ∩ B ∩ C) / (P(B | C)P(C))
⇔ P(A | B ∩ C)P(B | C) = P(A ∩ B ∩ C) / P(C)
⇔ P(A | B ∩ C)P(B | C) = P(A ∩ B | C) .

Here the first step is an application of the fundamental rule, and the last step uses the definition of conditional probability.
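The identity P(A | B ∩ C)P(B | C) = P(A ∩ B | C) can also be checked numerically on a joint distribution over three events; the joint table below is an arbitrary strictly positive illustration, not taken from the book.

# Numeric check of P(A | B ∩ C) P(B | C) = P(A ∩ B | C) on an arbitrary joint distribution.
import itertools
from fractions import Fraction

outcomes = list(itertools.product([True, False], repeat=3))
joint = {o: Fraction(w, 36) for o, w in zip(outcomes, [1, 2, 3, 4, 5, 6, 7, 8])}  # sums to 1

def P(event):
    return sum(joint[o] for o in event)

A = {o for o in outcomes if o[0]}
B = {o for o in outcomes if o[1]}
C = {o for o in outcomes if o[2]}

lhs = (P(A & B & C) / P(B & C)) * (P(B & C) / P(C))   # P(A | B ∩ C) P(B | C)
rhs = P(A & B & C) / P(C)                             # P(A ∩ B | C)
assert lhs == rhs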

Solution for exercise 1.7


We have (p stands for “pregnant” and t for “test is positive”)
P(1) = P(p)P(t | p) = 0.05 · 0.98 = 0.049
P(2) = P(p)P(¬t | p) = 0.05 · 0.02 = 0.001
P(3) = P(¬p)P(t | ¬p) = 0.95 · 0.001 = 0.00095
P(4) = P(¬p)P(¬t | ¬p) = 0.95 · 0.999 = 0.94905
P(p | t) = P(1)/(P(1) + P(3)) = 0.981.
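The same computation as a short sketch, using the numbers above:

# P(pregnant | positive test) by Bayes' rule.
p_p = 0.05             # P(p)
p_t_given_p = 0.98     # P(t | p)
p_t_given_np = 0.001   # P(t | ¬p)

p1 = p_p * p_t_given_p            # P(1) = 0.049
p3 = (1 - p_p) * p_t_given_np     # P(3) = 0.00095
print(round(p1 / (p1 + p3), 3))   # 0.981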

Solution for exercise 1.8
Sample space (r stands for “red”, and b for “blue”):

S = {r1b1, r1b2, . . . , r1b6, r2b1, r2b2, . . . , r6b6} .

P(r1b1) = P(r1b2) = P(r1b3) = P(r2b1) = · · · = P(r6b3) = 1/6 · 1/12 = 1/72.
P(r1b4) = P(r1b5) = P(r1b6) = P(r2b4) = · · · = P(r6b6) = 1/6 · 1/4 = 1/24.
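A brute-force check of these joint probabilities; the fair red die and the blue-die loading of 1/12 on faces 1–3 and 1/4 on faces 4–6 are read off the factors above, not restated from the exercise text.

# Joint probabilities for a fair red die and a loaded blue die.
from fractions import Fraction

p_red = {r: Fraction(1, 6) for r in range(1, 7)}
p_blue = {b: Fraction(1, 12) if b <= 3 else Fraction(1, 4) for b in range(1, 7)}

joint = {(r, b): p_red[r] * p_blue[b] for r in p_red for b in p_blue}
assert sum(joint.values()) == 1
assert joint[(1, 1)] == Fraction(1, 72)
assert joint[(6, 6)] == Fraction(1, 24)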

Solution for exercise 1.9


We have the two variables C and D, with C having states h and t, and D having states 1 to 6. We have PC(h) = 1/3 and PC(t) = 2/3. We have PD(1) = 5/18 + 1/24 = 23/72, PD(2) = 1/9 + 1/24 = 11/72, PD(3) = 1/9 + 1/8 = 17/72, PD(4) = 1/18 + 1/8 = 13/72, and PD(5) = PD(6) = 1/18.
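The fraction arithmetic is easy to double-check, and PD sums to one as it should:

# Verify the sums above and that PD is a probability distribution.
from fractions import Fraction as F

PD = {1: F(5, 18) + F(1, 24), 2: F(1, 9) + F(1, 24), 3: F(1, 9) + F(1, 8),
      4: F(1, 18) + F(1, 8), 5: F(1, 18), 6: F(1, 18)}

assert [PD[i] for i in range(1, 5)] == [F(23, 72), F(11, 72), F(17, 72), F(13, 72)]
assert sum(PD.values()) == 1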

Solution for exercise 1.10


Define

Ca = {(a′, b′) ∈ sp(A) × sp(B) : a′ = a},
Cb = {(a′, b′) ∈ sp(A) × sp(B) : b′ = b},
Cab = {(a′, b′) ∈ sp(A) × sp(B) : a′ = a and b′ = b}.

Obviously Cab = Ca ∩ Cb. We thus have

P(Cab) = P(Ca ∩ Cb)
⇔ P(Cab)/P(Cb) = P(Ca ∩ Cb)/P(Cb)
⇔ P(Cab)/P(Cb) = P(Ca | Cb)
⇔ P(Cab) = P(Ca | Cb)P(Cb).

As this is valid for any choice of states a and b, the fundamental rule holds for
variables.
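The fundamental rule for variables can be illustrated on a tiny joint table; the numbers below are arbitrary and serve only as an example.

# P(a, b) = P(a | b) P(b) on an arbitrary 2x2 joint table.
joint = {("a1", "b1"): 0.1, ("a1", "b2"): 0.3,
         ("a2", "b1"): 0.2, ("a2", "b2"): 0.4}

p_b1 = joint[("a1", "b1")] + joint[("a2", "b1")]        # P(b1)
p_a1_given_b1 = joint[("a1", "b1")] / p_b1              # P(a1 | b1)
assert abs(p_a1_given_b1 * p_b1 - joint[("a1", "b1")]) < 1e-12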

Solution for exercise 1.11


P (A) = (0.2, 0.4, 0.4), P (B) = (0.3, 0.3, 0.4).
P (A|b1 ) = (0.167, 0.5, 0.333), P (A|b2 ) = (0.333, 0, 0.667), P (A|b3 ) = (0.125, 0.625, 0.25).
P (B|a1 ) = (0.25, 0.5, 0.25), P (B|a2 ) = (0.375, 0, 0.625), P (B|a3 ) = (0.25, 0.5, 0.25).
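These answers are consistent with a single joint table P(A, B). The table below is reconstructed from them via P(ai, bj) = P(ai)P(bj | ai), since the exercise's own table is not reproduced in this document; the marginals and conditionals are then recomputed from it.

# Recompute P(A), P(B), P(A | b_j) and P(B | a_i) from the (reconstructed) joint table.
joint = [
    [0.05, 0.10, 0.05],   # a1
    [0.15, 0.00, 0.25],   # a2
    [0.10, 0.20, 0.10],   # a3
]

P_A = [sum(row) for row in joint]                # (0.2, 0.4, 0.4)
P_B = [sum(col) for col in zip(*joint)]          # (0.3, 0.3, 0.4)
P_A_given_b = [[joint[i][j] / P_B[j] for i in range(3)] for j in range(3)]
P_B_given_a = [[joint[i][j] / P_A[i] for j in range(3)] for i in range(3)]

print([round(x, 3) for x in P_A_given_b[2]])     # [0.125, 0.625, 0.25]
print([round(x, 3) for x in P_B_given_a[1]])     # [0.375, 0.0, 0.625]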

Solution for exercise 1.12

(i) 0.996

(ii) 0.498

Solution for exercise 1.13

(i) P (B, c1 ) = (0.02, 0.08), P (B, c2 ) = (0.18, 0.72), P (B) = (0.2, 0.8)

(ii) P (A|b1 , c1 ) = (0.3, 0.7) = P (A|b1 , c2 ), P (A|b2 , c1 ) = (0.6, 0.4) = P (A|b2 , c2 ).
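Part (i) is plain marginalization of C out of P(B, C); a quick check with the values above:

# P(B) obtained by summing P(B, c1) and P(B, c2) elementwise.
P_B_c1 = [0.02, 0.08]
P_B_c2 = [0.18, 0.72]
P_B = [round(x + y, 2) for x, y in zip(P_B_c1, P_B_c2)]
assert P_B == [0.2, 0.8]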

Solution for exercise 1.14

Solution for exercise 1.15


The associative, commutative, and distributive laws all follow from Property 1 of potentials, the fact that multiplication of potentials is defined pointwise, and the fact that ordinary multiplication and addition obey the associative, commutative, and distributive laws. We therefore only show the most involved case, the distributive law, in detail.
For the distributive law, let C12 = dom(φ1) ∩ dom(φ2), C1 = dom(φ1) \ C12, and C2 = dom(φ2) \ (C12 ∪ {A}). Obviously the sets are disjoint and C1 ∪ C12 ∪ C2 ∪ {A} = dom(φ1 φ2). Let c12 ∈ sp(C12), c1 ∈ sp(C1), and c2 ∈ sp(C2). We calculate
calculate
(Σ_A φ1 φ2)(c1, c12, c2) = Σ_{a∈sp(A)} (φ1 φ2)(c1, c12, c2, a)
                         = Σ_{a∈sp(A)} φ1(c1, c12, c2, a) φ2(c1, c12, c2, a)
                         = Σ_{a∈sp(A)} φ1(c1, c12) φ2(c12, c2, a)
                         = φ1(c1, c12) Σ_{a∈sp(A)} φ2(c12, c2, a) .

As c1, c12, and c2 were selected arbitrarily, the result follows.
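A concrete instance of the distributive law, with φ1 defined over B and φ2 over (B, A), so that A ∉ dom(φ1) as the law requires; the table values are arbitrary illustrations.

# Sum A out of the product phi1 * phi2, versus multiplying phi1 by the A-marginal of phi2.
import itertools
from fractions import Fraction

phi1 = {b: Fraction(b + 2, 3) for b in range(2)}                        # phi1(b)
phi2 = {(b, a): Fraction(3 * b + a + 1, 7)
        for b, a in itertools.product(range(2), range(3))}              # phi2(b, a)

lhs = {b: sum(phi1[b] * phi2[(b, a)] for a in range(3)) for b in range(2)}   # sum_A (phi1 phi2)
rhs = {b: phi1[b] * sum(phi2[(b, a)] for a in range(3)) for b in range(2)}   # phi1 * sum_A phi2
assert lhs == rhs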

Solution for exercise 1.16

Solution for exercise 1.17
