Aiml Partb Unit II QP

UNIT II: PROBABILISTIC REASONING

Acting under uncertainty – Bayesian inference – Naïve Bayes models. Probabilistic reasoning – Bayesian networks – exact inference in BN – approximate inference in BN – causal networks.
Part B

1. How to get exact inference from a Bayesian network?

Exact Inference in Bayesian Networks

Bayesian Networks (BNs) provide a probabilistic graphical model for representing and
reasoning about uncertainty in AI. To compute exact inference, we need to find the
probability distribution of a variable given observed evidence.

Methods for Exact Inference in Bayesian Networks

There are two primary methods for exact inference:

1. Variable Elimination (VE)
2. Belief Propagation (Message Passing Algorithm)

1. Variable Elimination (VE)

How it Works:

 Uses marginalization (summing out variables) to compute probabilities efficiently.
 Works by eliminating non-query variables from the joint probability distribution.

Algorithm Steps:

1. Start with the full joint probability distribution.
2. Multiply the conditional probability tables (CPTs) of the relevant variables.
3. Eliminate non-query variables by summing over them (marginalization).
4. Normalize the result to get the probability distribution.

Example:

Consider a simple Bayesian Network:

Burglary (B)   Earthquake (E)
      \             /
       v           v
         Alarm (A)

We want to compute P(B | A = true) (Probability of Burglary given the Alarm rings).
Using Variable Elimination:

P(B | A) = P(A | B) P(B) / P(A)

If multiple variables exist, we sum out the unnecessary ones.
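To make the elimination concrete, here is a minimal Python sketch that computes P(B | A = true) on this network by summing out E and then normalizing. The CPT values are the ones given in Question 2 below.

```python
# Minimal sketch of variable elimination for P(B | A=true) on the
# burglary network, using the CPT values from Question 2 below.
P_B = {True: 0.001, False: 0.999}            # P(Burglary)
P_E = {True: 0.002, False: 0.998}            # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,    # P(Alarm=true | B, E)
       (False, True): 0.29, (False, False): 0.001}

def posterior_burglary_given_alarm():
    # Step 1: eliminate E -> factor f(B) = P(A=true | B) = sum_e P(A | B, e) P(e)
    f = {b: sum(P_A[(b, e)] * P_E[e] for e in (True, False))
         for b in (True, False)}
    # Step 2: multiply by the prior -> unnormalized posterior P(A=true | B) P(B)
    unnorm = {b: f[b] * P_B[b] for b in (True, False)}
    # Step 3: normalize (the normalizing constant is P(A=true))
    z = sum(unnorm.values())
    return {b: p / z for b, p in unnorm.items()}

print(posterior_burglary_given_alarm())      # {True: ~0.3735, False: ~0.6265}
```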

Pros and Cons:

✔ Works well for small networks
✔ Avoids computing the full joint distribution
❌ Computationally expensive for large networks

2. Belief Propagation (Message Passing)

How it Works:

 Used for tree-structured Bayesian Networks.
 Computes marginal probabilities efficiently by passing messages between nodes.

Types of Message Passing:

1. Sum-Product Algorithm (for marginal probabilities)
2. Max-Product Algorithm (for MAP inference)

Algorithm Steps:

1. Initialize messages at leaf nodes.
2. Pass messages from children to parents (bottom-up).
3. Combine messages at each node using probability rules.
4. Compute the final probability at the query node.

Example:

If we have a tree-like network:

A → B → C → D

We can pass messages between nodes to compute P(D | A = true) efficiently.
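As an illustration, the sketch below performs sum-product message passing along this chain. The CPT numbers are made up for illustration (the example specifies none); the message from the observed node A is pushed edge by edge toward D.

```python
# Sketch of sum-product message passing on the chain A -> B -> C -> D.
# cpt[p][x] = P(X = x | Parent = p), with binary states {0, 1}.
# All CPT numbers below are illustrative placeholders.
P_B_given_A = {1: {1: 0.9, 0: 0.1}, 0: {1: 0.2, 0: 0.8}}
P_C_given_B = {1: {1: 0.7, 0: 0.3}, 0: {1: 0.4, 0: 0.6}}
P_D_given_C = {1: {1: 0.8, 0: 0.2}, 0: {1: 0.1, 0: 0.9}}

def chain_query(a_observed=1):
    # The message leaving the observed node A is a point mass on its value.
    msg = {s: 1.0 if s == a_observed else 0.0 for s in (0, 1)}
    # Push the message along each edge, summing out the parent at each step.
    for cpt in (P_B_given_A, P_C_given_B, P_D_given_C):
        msg = {x: sum(msg[p] * cpt[p][x] for p in (0, 1)) for x in (0, 1)}
    return msg  # this is P(D | A = a_observed)

print(chain_query(1))   # {0: 0.431, 1: 0.569} with the CPTs above
```

Each step costs only a small sum over one parent's states, which is why message passing on a chain (or tree) is linear in the number of nodes rather than exponential.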

Pros and Cons:

✔ Very fast for tree-structured networks
✔ Can handle large-scale inference
❌ Exact only on tree-structured networks; graphs with loops require extensions such as the junction tree algorithm or (approximate) loopy belief propagation

2. Explain the Variable Elimination algorithm for answering queries on Bayesian networks.

Variable Elimination (VE) is an exact inference algorithm used in Bayesian Networks to compute probabilities efficiently. Instead of computing the full joint probability distribution, VE eliminates non-query variables step by step through marginalization, reducing computational complexity.

Algorithm Overview

Given a Bayesian Network P(X₁, X₂, ..., Xₙ) and a query P(Q | E) (where Q is the query
variable and E is evidence), Variable Elimination works as follows:

Steps of the Algorithm

1. Identify relevant factors
   o Extract the Conditional Probability Tables (CPTs) for the variables involved in the query.
2. Eliminate non-query variables (sum out irrelevant variables)
   o Choose an elimination order for the non-query variables.
   o Marginalize out each variable by summing over its possible values.
   o Multiply the resulting factors to simplify the probability expression.
3. Compute the final probability distribution
   o Normalize the result so that the probabilities sum to 1.

Step-by-Step Example

Consider a simple Bayesian Network:

Burglary (B)   Earthquake (E)
      \             /
       v           v
         Alarm (A)

Given Query: P(B | A = true)

We need to compute P(B | A) using Variable Elimination.

Step 1: Extract Factors (CPTs)

From the Bayesian network, we have:

1. P(B) = {B: 0.001, ¬B: 0.999}
2. P(E) = {E: 0.002, ¬E: 0.998}
3. P(A | B, E) (alarm probability table):

B   E   P(A=1)   P(A=0)
1   1   0.95     0.05
1   0   0.94     0.06
0   1   0.29     0.71
0   0   0.001    0.999

Step 2: Compute P(B | A) using Variable Elimination

Since E (Earthquake) is not part of the query, we sum it out.

Marginalization over E:

P(A | B) = Σ_E P(A | B, E) P(E)

Using values from the CPTs:

P(A | B=1) = P(A | B=1, E=1) P(E=1) + P(A | B=1, E=0) P(E=0)
= (0.95 × 0.002) + (0.94 × 0.998) = 0.0019 + 0.93812 = 0.94002

Similarly,

P(A | B=0) = P(A | B=0, E=1) P(E=1) + P(A | B=0, E=0) P(E=0)
= (0.29 × 0.002) + (0.001 × 0.998) = 0.00058 + 0.000998 = 0.001578

Step 3: Apply Bayes’ Rule to Compute P(B | A)

P(B | A) = P(A | B) P(B) / P(A)

Where:

P(A) = P(A | B=1) P(B=1) + P(A | B=0) P(B=0)
= (0.94002 × 0.001) + (0.001578 × 0.999)
= 0.00094002 + 0.00157642
= 0.00251644

Now:

P(B | A) = (0.94002 × 0.001) / 0.00251644
= 0.00094002 / 0.00251644 ≈ 0.3735

Thus, P(B | A) ≈ 0.3735, meaning the probability of a burglary given that the alarm rang is about 37.35%.
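The arithmetic above can be double-checked with a few lines of Python using the same CPT values:

```python
# Quick numerical check of the calculation above, with the same CPT values.
p_b, p_e = 0.001, 0.002                       # P(B=1), P(E=1)
p_a = {(1, 1): 0.95, (1, 0): 0.94,            # P(A=1 | B, E)
       (0, 1): 0.29, (0, 0): 0.001}

# Sum out E: P(A=1 | B) = P(A | B, E=1) P(E=1) + P(A | B, E=0) P(E=0)
pa_b1 = p_a[(1, 1)] * p_e + p_a[(1, 0)] * (1 - p_e)   # 0.94002
pa_b0 = p_a[(0, 1)] * p_e + p_a[(0, 0)] * (1 - p_e)   # 0.001578

# Bayes' rule: P(B=1 | A=1) = P(A | B=1) P(B=1) / P(A=1)
pa = pa_b1 * p_b + pa_b0 * (1 - p_b)                  # ~0.00251644
print(pa_b1 * p_b / pa)                                # ~0.3735
```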

Advantages of Variable Elimination

✔ More efficient than computing the full joint probability.
✔ Works well for small to medium-sized Bayesian Networks.
✔ Eliminates variables one at a time, reducing complexity.

Limitations

❌ Computationally expensive for large networks.
❌ The order of elimination matters (a bad order → slow computation).
❌ Not efficient for dynamic queries (the whole process must be repeated for each different query).

3. Define uncertain knowledge, prior probability and conditional probability. State Bayes’ theorem. How is it useful for decision making under uncertainty? Explain belief networks briefly.
4. Explain the method of handling approximate inference in Bayesian networks.
5. What is Bayes’ rule? Explain how Bayes’ rule can be applied to tackle uncertain knowledge.
6. Discuss Bayesian theory and Bayesian networks.
7. Explain how Bayesian statistics provides reasoning under various kinds of uncertainty.
8. How to get approximate inference from a Bayesian network?
9. Construct a Bayesian Network and define the necessary CPTs for the given scenario. We have a bag of three biased coins a, b and c with probabilities of coming up heads of 20%, 60% and 80% respectively. One coin is drawn randomly from the bag (with equal likelihood of drawing each of the three coins) and then the coin is flipped three times to generate the outcomes X1, X2 and X3.
a. Draw a Bayesian network corresponding to this setup and define the relevant CPTs.
b. Calculate which coin is most likely to have been drawn if the flips come up HHT (see the sketch after this question list).
10. Consider the following set of propositions: patient has spots; patient has measles; patient has high fever; patient has Rocky Mountain spotted fever; patient has previously been inoculated against measles; patient was recently bitten by a tick; patient has an allergy.
a) Create a network that defines the causal connections among these nodes.
b) Make it a Bayesian network by constructing the necessary conditional probability matrix.
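For question 9(b), a minimal sketch (assuming the uniform 1/3 prior stated in the question) that computes the posterior over the three coins given the observed flips HHT:

```python
# Posterior over the coins given flips HHT, via Bayes' rule with a
# uniform 1/3 prior over the three coins (as stated in the question).
coins = {"a": 0.2, "b": 0.6, "c": 0.8}    # P(heads) for each coin
flips = "HHT"

def coin_posterior(coins, flips, prior=1/3):
    post = {c: prior for c in coins}
    for f in flips:                        # flips are i.i.d. given the coin
        for c, p_heads in coins.items():
            post[c] *= p_heads if f == "H" else (1 - p_heads)
    z = sum(post.values())                 # normalize
    return {c: p / z for c, p in post.items()}

post = coin_posterior(coins, flips)
print(post)                                # a: ~0.105, b: ~0.474, c: ~0.421
print(max(post, key=post.get))             # 'b' is the most likely coin
```

Under the uniform prior the posterior is proportional to the likelihoods 0.2² × 0.8 = 0.032, 0.6² × 0.4 = 0.144 and 0.8² × 0.2 = 0.128, so coin b is the most likely draw (posterior ≈ 0.144 / 0.304 ≈ 0.47).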
