Aiml Partb Unit II QP
Bayesian Networks (BNs) provide a probabilistic graphical model for representing and
reasoning about uncertainty in AI. Exact inference computes the exact posterior probability
distribution of a query variable given the observed evidence.
How it Works:
Exact inference answers a query P(Q | E) by summing the joint distribution defined by the network over every hidden (non-query, non-evidence) variable H: P(Q | E) = α Σ_H P(Q, E, H), where α is a normalizing constant.
Algorithm Steps:
1. Fix the evidence variables E to their observed values.
2. For each value of the query variable Q, sum the joint probability over all assignments to the hidden variables H.
3. Normalize the resulting values so they sum to 1.
Example:
Burglary (B) --> Alarm (A) <-- Earthquake (E)
We want to compute P(B | A = true) (Probability of Burglary given the Alarm rings).
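Below is a minimal Python sketch of exact inference by enumeration for this query. It assumes the standard burglary-example CPT values (the same numbers used in the worked calculation further down); the function and variable names are only illustrative.

# Illustrative sketch: exact inference by enumeration for P(B | A = true).
# Assumed CPT values (standard burglary example; same numbers used below).
P_B = {True: 0.001, False: 0.999}            # P(Burglary)
P_E = {True: 0.002, False: 0.998}            # P(Earthquake)
P_A_given_BE = {                             # P(Alarm = true | B, E)
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}

def joint(b, e, a):
    """P(B=b, E=e, A=a) via the network's chain rule."""
    p_a = P_A_given_BE[(b, e)] if a else 1.0 - P_A_given_BE[(b, e)]
    return P_B[b] * P_E[e] * p_a

# Sum out the hidden variable E for each value of the query variable B,
# keeping the evidence fixed at A = true, then normalize.
unnormalized = {b: sum(joint(b, e, True) for e in (True, False))
                for b in (True, False)}
alpha = sum(unnormalized.values())
posterior = {b: p / alpha for b, p in unnormalized.items()}
print(posterior[True])                       # ≈ 0.3735, matching the result below

Enumeration touches every assignment of the hidden variables for every value of the query variable, which is the repeated work that variable elimination (next) avoids.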
Using Variable Elimination:
How it Works:
Rather than re-summing the full joint distribution for every term of the query, variable elimination evaluates the nested summations from the inside out, caching each intermediate result as a factor and eliminating (summing out) one hidden variable at a time.
Algorithm Steps: listed in the Algorithm Overview below.
Example:
A→B→C→D
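As an illustration (assuming every variable in this chain is Boolean), computing P(D) by variable elimination means evaluating the nested sums from the innermost one outwards:
P(D) = Σ_C P(D | C) · Σ_B P(C | B) · Σ_A P(B | A) · P(A)
Each inner sum collapses into a small intermediate factor over a single variable, so the chain is processed with a few small table operations instead of enumerating all 2⁴ joint assignments.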
Algorithm Overview
Given a Bayesian Network P(X₁, X₂, ..., Xₙ) and a query P(Q | E) (where Q is the query
variable and E is evidence), Variable Elimination works as follows:
1. Express the query as a product of the network's CPT factors, with the evidence variables fixed to their observed values.
2. Sum out each hidden variable in turn, multiplying together the factors that mention it to produce a new, smaller factor.
3. Multiply the remaining factors and normalize to obtain P(Q | E).
Applied to the burglary network Burglary (B) --> Alarm (A) <-- Earthquake (E) with query P(B | A = true), the initial factors are the CPTs P(B), P(E), and P(A | B, E); the only hidden variable is E, so it is eliminated first, and the result is then normalized over B. The CPT values assumed here are the standard ones for this example: P(B=1) = 0.001, P(E=1) = 0.002, P(A=1 | B=1, E=1) = 0.95, P(A=1 | B=1, E=0) = 0.94, P(A=1 | B=0, E=1) = 0.29, P(A=1 | B=0, E=0) = 0.001.
Marginalization over E:
P(A | B) = Σ_E P(A | B, E) · P(E)
P(A | B=1) = P(A | B=1, E=1) · P(E=1) + P(A | B=1, E=0) · P(E=0)
= (0.95 × 0.002) + (0.94 × 0.998) = 0.0019 + 0.93812 = 0.94002
Similarly,
P(A | B=0) = P(A | B=0, E=1) · P(E=1) + P(A | B=0, E=0) · P(E=0)
= (0.29 × 0.002) + (0.001 × 0.998) = 0.00058 + 0.000998 = 0.001578
P(B | A) = P(A | B) · P(B) / P(A)
Where:
P(A) = P(A | B=1) · P(B=1) + P(A | B=0) · P(B=0)
= (0.94002 × 0.001) + (0.001578 × 0.999)
= 0.00094002 + 0.00157642
= 0.00251644
Now:
P(B | A) = (0.94002 × 0.001) / 0.00251644
= 0.00094002 / 0.00251644 ≈ 0.3735
Thus, P(B | A) ≈ 0.3735, meaning the probability of a burglary given that the alarm rang is
37.35%.
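The two steps above (eliminate E, then apply Bayes' rule) can be checked with a short Python sketch; it simply replays the arithmetic with the assumed CPT values, and the variable names are illustrative.

# Verification sketch: sum out E, then apply Bayes' rule (assumed values as above).
p_b, p_e = 0.001, 0.002                           # P(B=1), P(E=1)
p_a_given_be = {(1, 1): 0.95, (1, 0): 0.94,       # P(A=1 | B, E)
                (0, 1): 0.29, (0, 0): 0.001}

# Step 1: marginalize (sum out) the hidden variable E.
p_a_b1 = p_a_given_be[(1, 1)] * p_e + p_a_given_be[(1, 0)] * (1 - p_e)   # 0.94002
p_a_b0 = p_a_given_be[(0, 1)] * p_e + p_a_given_be[(0, 0)] * (1 - p_e)   # 0.001578

# Step 2: Bayes' rule, P(B | A) = P(A | B) * P(B) / P(A).
p_a = p_a_b1 * p_b + p_a_b0 * (1 - p_b)
p_b_given_a = p_a_b1 * p_b / p_a
print(p_b_given_a)                                # ≈ 0.3735, as computed above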
Limitations
Exact inference is intractable in the worst case (NP-hard): in large, densely connected networks the intermediate factors grow exponentially, and the cost of variable elimination is exponential in the network's treewidth. Its efficiency also depends strongly on the elimination ordering, and choosing an optimal ordering is itself hard, which is why approximate inference methods are used for large networks.
3. Define uncertain knowledge, prior probability, and conditional probability. State Bayes' theorem. How is it useful for decision making under uncertainty? Explain belief networks briefly.
4. Explain the method of handling approximate inference in Bayesian networks.
5. What is Bayes' rule? Explain how Bayes' rule can be applied to tackle uncertain knowledge.
6. Discuss Bayesian theory and Bayesian networks.
7. Explain how Bayesian statistics provides reasoning under various kinds of uncertainty.
8. How is approximate inference obtained from a Bayesian network?
9. Construct a Bayesian network and define the necessary CPTs for the given scenario. We have a bag of three biased coins a, b, and c with probabilities of coming up heads of 20%, 60%, and 80% respectively. One coin is drawn randomly from the bag (with equal likelihood of drawing each of the three coins) and then the coin is flipped three times to generate the outcomes X1, X2, and X3.
a. Draw a Bayesian network corresponding to this setup and define the relevant CPTs.
b. Calculate which coin is most likely to have been drawn if the flips come up HHT.
10. Consider the following set of propositions: Patient has spots; Patient has measles; Patient has high fever; Patient has Rocky Mountain spotted fever; Patient has previously been inoculated against measles; Patient was recently bitten by a tick; Patient has an allergy.
a) Create a network that defines the causal connections among these nodes.
b) Make it a Bayesian network by constructing the necessary conditional probability matrix.