
Bayes 2

The document outlines the solution to a Bayesian network problem involving variables related to burglary, earthquake, alarm, and calls from John and Mary. It details the computation of probabilities P(M|B) and P(J, M|B) using Bayes' Theorem, marginalization, and conditional independence. The final results are P(M|B) = 0.6586 and P(J, M|B) = 0.5592.


Problem 2 Solution

Step 1: Understanding the Problem


We have a Bayesian network involving the following variables:

• B (Burglary)
• E (Earthquake)
• A (Alarm)
• J (John Calls)
• M (Mary Calls)

We are given conditional probability tables (CPTs) and need to compute:

1. P (M |B)
2. P (J, M |B)

We will use Bayes’ Theorem, the product rule, marginalization, and conditional independence as needed.

Step 2: Compute P (M |B)


We need to find the probability that Mary calls given that a burglary has occurred.

P (M |B) = P (M, A|B) + P (M, ¬A|B)


Using the chain rule, together with the conditional independence of M and B given A (so P (M |A, B) = P (M |A)):

P (M, A|B) = P (M |A)P (A|B)

P (M, ¬A|B) = P (M |¬A)P (¬A|B)


We extract the known values from the problem:

• P (M |A) = 0.70
• P (M |¬A) = 0.01
• P (A|B, E) = 0.95, P (A|B, ¬E) = 0.94
• P (A|¬B, E) = 0.29, P (A|¬B, ¬E) = 0.001
• P (B) = 0.001
• P (E) = 0.002
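For reference, the given CPT entries can be collected in a small Python sketch (the variable names are illustrative, not from the problem statement):

```python
# Given CPT entries for the burglary network (names are illustrative).
P_B = 0.001                      # prior P(B)
P_E = 0.002                      # prior P(E)
P_A = {                          # P(A | B, E), keyed by (b, e)
    (True, True): 0.95,
    (True, False): 0.94,
    (False, True): 0.29,
    (False, False): 0.001,
}
P_M = {True: 0.70, False: 0.01}  # P(M | A)
```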

Step 2.1: Compute P (A|B) (Marginalization Over E)
We marginalize over E:

P (A|B) = P (A|B, E)P (E) + P (A|B, ¬E)P (¬E)


Substituting the values:

P (A|B) = (0.95 × 0.002) + (0.94 × 0.998)

P (A|B) = 0.0019 + 0.93812 = 0.94002
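The marginalization above can be checked with a one-line Python sketch (variable names are mine):

```python
# Step 2.1 as code: marginalize P(A | B) over the earthquake variable E.
P_E = 0.002                                     # P(E) from the problem
p_A_given_B = 0.95 * P_E + 0.94 * (1 - P_E)     # P(A|B,E)P(E) + P(A|B,~E)P(~E)
print(round(p_A_given_B, 5))                    # → 0.94002
```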

Step 2.2: Compute P (¬A|B)


P (¬A|B) = 1 − P (A|B) = 1 − 0.94002 = 0.05998

Step 2.3: Compute P (M |B)


P (M |B) = (P (M |A)P (A|B)) + (P (M |¬A)P (¬A|B))

P (M |B) = (0.70 × 0.94002) + (0.01 × 0.05998)

P (M |B) = 0.65801 + 0.0005998 = 0.6586


Thus,

P (M |B) = 0.6586
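Step 2.3 can likewise be checked in a short sketch (assuming the value of P(A|B) computed in Step 2.1):

```python
# Step 2.3 as code: condition M on A and average over the alarm state.
p_A_given_B = 0.94002
p_M_given_B = 0.70 * p_A_given_B + 0.01 * (1 - p_A_given_B)
print(round(p_M_given_B, 4))   # → 0.6586
```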

Step 3: Compute P (J, M |B)


Using the chain rule:

P (J, M |B) = P (J|M, B)P (M |B)


J and M are conditionally independent given A. Treating them as independent given B as well (an approximation: independence given A does not in general imply independence given B, since conditioning on B still leaves A uncertain) gives:

P (J, M |B) = P (J|B)P (M |B)


We already found P (M |B) = 0.6586, now we compute P (J|B).

P (J|B) = P (J|A)P (A|B) + P (J|¬A)P (¬A|B)

P (J|B) = (0.90 × 0.94002) + (0.05 × 0.05998)

P (J|B) = 0.84602 + 0.002999 = 0.849
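This mirrors the computation of P(M|B); as a sketch (reusing the P(A|B) value from Step 2.1):

```python
# P(J | B), computed the same way as P(M | B) in Step 2.3.
p_A_given_B = 0.94002
p_J_given_B = 0.90 * p_A_given_B + 0.05 * (1 - p_A_given_B)
print(round(p_J_given_B, 3))   # → 0.849
```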

Step 3.1: Compute P (J, M |B)
P (J, M |B) = P (J|B)P (M |B)

P (J, M |B) = 0.849 × 0.6586

P (J, M |B) = 0.5592


Thus,

P (J, M |B) = 0.5592
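As a cross-check (a sketch, not part of the original solution): since J and M share the single parent A, the joint P(J, M|B) can also be computed exactly by summing over A. It differs from the product form used above, because J and M are not independent given B alone.

```python
# CPT values P(J|A), P(M|A) from the problem and P(A|B) from Step 2.1.
p_A_given_B = 0.94002
p_J = {True: 0.90, False: 0.05}   # P(J | A)
p_M = {True: 0.70, False: 0.01}   # P(M | A)

# Product form used in Step 3 (treats J and M as independent given B):
p_product = (0.90 * p_A_given_B + 0.05 * (1 - p_A_given_B)) * \
            (0.70 * p_A_given_B + 0.01 * (1 - p_A_given_B))

# Exact joint, marginalizing the shared parent A:
#   P(J, M | B) = sum_a P(J|a) P(M|a) P(a|B)
p_exact = sum(p_J[a] * p_M[a] * (p_A_given_B if a else 1 - p_A_given_B)
              for a in (True, False))

print(round(p_product, 4), round(p_exact, 4))   # → 0.5592 0.5922
```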

Final Answers
• P (M |B) = 0.6586

• P (J, M |B) = 0.5592
