Module 5
UNCERTAINTY
In which we see how an agent can tame uncertainty with degrees of belief.
Outline
1 Acting Under Uncertainty
2 Basics on Probability
3 Probabilistic Inference via Enumeration
4 Independence and Conditional Independence
5 Applying Bayes’ Rule
6 An Example: The Wumpus World Revisited
The real world: Things go wrong
Consider a plan for changing a tire after getting a flat, using the operators RemoveTire(x), PutOnTire(x), InflateTire(x)
• Incomplete information
– Unknown preconditions, e.g., is the spare actually intact?
– Disjunctive effects, e.g., inflating a tire with a pump may cause the tire to inflate, or a slow hiss, or the tire may burst, or ...
• Incorrect information
– Current state incorrect, e.g., spare NOT intact
– Missing/incorrect postconditions (effects) in operators
• Qualification problem:
– we can never finish listing all the required preconditions and possible conditional outcomes of actions
Possible Solutions
• Conditional planning
– Plan to obtain information (observation actions)
– Subplan for each contingency, e.g.,
[Check(Tire1), IF Intact(Tire1) THEN [Inflate(Tire1)] ELSE [CallAAA]]
– Expensive because it plans for many unlikely cases
• Monitoring/Replanning
– Assume normal states and outcomes
– Check progress during execution, replan if necessary
Dealing with Uncertainty Head-on: Probability
• Let action At = leave for airport t minutes before flight from Lindbergh Field
• Will At get me there on time?
Problems:
1. Partial observability (road state, other drivers' plans, etc.)
2. Noisy sensors (traffic reports)
3. Uncertainty in action outcomes (turn key, car doesn’t start, etc.)
4. Immense complexity of modeling and predicting traffic
Probability statements are NOT assertions about the world; they represent the agent’s degree of belief about whether an assertion is true.
Probability Basics: an AI-ish Introduction
• Probabilistic assertions state how likely possible worlds are
• Sample space Ω: the set of all possible worlds
• ω ∈ Ω is a possible world (aka sample point or atomic event); ex: the dice roll (1,4)
• the possible worlds are mutually exclusive and exhaustive
• ex: the 36 possible outcomes of rolling two dice: (1,1), (1,2), ...
• A probability model (aka probability space) is a sample space with an assignment P(ω) for every ω ∈ Ω s.t.
• 0 ≤ P(ω) ≤ 1, for every ω ∈ Ω
• Σω∈Ω P(ω) = 1
• Ex: 1-die roll: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
• An event A is any subset of Ω: P(A) = Σω∈A P(ω)
• Ex: P(die roll < 4) = P(1) + P(2) + P(3) = 1/6 + 1/6 + 1/6 = 1/2
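To make the definitions concrete, here is a minimal sketch (Python; the helper name prob is ours, not from the slides) that builds the two-dice probability model and computes an event’s probability as a sum over possible worlds:

```python
from fractions import Fraction
from itertools import product

# Sample space: the 36 possible worlds for rolling two fair dice.
omega = list(product(range(1, 7), repeat=2))

# Probability model: assign P(w) to every possible world w,
# with 0 <= P(w) <= 1 and the assignments summing to 1.
P = {w: Fraction(1, 36) for w in omega}
assert sum(P.values()) == 1

def prob(event):
    """P(A) for an event A, i.e., any subset of the sample space."""
    return sum(P[w] for w in event)

# Event: the two dice sum to 5 -> {(1,4), (2,3), (3,2), (4,1)}
print(prob({w for w in omega if w[0] + w[1] == 5}))  # 1/9
```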
Random Variables
A random variable is a function from possible worlds (sample points) to some range, e.g., the reals or Booleans; ex: Odd(ω) = true iff the die roll ω is odd.
Probabilistic Inference via Enumeration
Basic Ideas
Start with the joint distribution P(Toothache, Catch, Cavity)
For any proposition ϕ, sum the atomic events where ϕ is true: P(ϕ) = Σω:ω|=ϕ P(ω)
Probabilistic Inference via Enumeration: Example
Example: Generic Inference
Start with the joint distribution P(Toothache, Catch, Cavity); for any proposition ϕ, sum the atomic events where ϕ is true: P(ϕ) = Σω:ω|=ϕ P(ω). (A worked sketch follows below.)
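The joint table itself was a figure on the original slide; the sketch below assumes the standard dentist-domain numbers from Russell & Norvig and computes P(ϕ) by enumerating the worlds where ϕ holds:

```python
# Joint distribution P(Toothache, Catch, Cavity); keys are (toothache, catch, cavity).
# NOTE: these numbers are the standard AIMA dentist-domain table, assumed to
# match the table that appeared as a figure on the slide.
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def prob(phi):
    """P(phi) = sum of P(w) over all atomic events w where phi is true."""
    return sum(p for w, p in joint.items() if phi(*w))

print(prob(lambda t, c, cav: cav))       # P(cavity) = 0.2
print(prob(lambda t, c, cav: cav or t))  # P(cavity or toothache) = 0.28
```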
Independence
Variables X and Y are independent iff P(X, Y) = P(X)P(Y)
(or equivalently, iff P(X|Y) = P(X) or P(Y|X) = P(Y))
ex: P(Toothache, Catch, Cavity, Weather) = P(Toothache, Catch, Cavity)P(Weather)
e.g. P(toothache, catch, cavity, cloudy) = P(toothache, catch, cavity)P(cloudy)
Independence assertions are typically based on domain knowledge
⇒ May drastically reduce the number of entries and the computation
ex: the 32-element table for P(Toothache, Catch, Cavity, Weather) decomposes into one 8-element and one 4-element table (Weather has 4 values: 8 × 4 = 32 entries become 8 + 4 = 12)
Unfortunately, absolute independence is quite rare
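A sketch of the saving in Python (the Weather marginal values below are made-up placeholders; only the table shapes matter):

```python
# Instead of one 32-entry table P(Toothache, Catch, Cavity, Weather),
# store an 8-entry and a 4-entry table and multiply on demand.
joint_tcc = {  # P(Toothache, Catch, Cavity): AIMA dentist numbers (assumed)
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}
p_weather = {"sunny": 0.6, "rain": 0.1, "cloudy": 0.29, "snow": 0.01}  # placeholders

def joint(t, c, cav, w):
    """Independence: P(t, c, cav, w) = P(t, c, cav) * P(w); 12 stored numbers, not 32."""
    return joint_tcc[(t, c, cav)] * p_weather[w]

print(joint(True, True, True, "cloudy"))  # 0.108 * 0.29 = 0.03132
```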
Conditional Independence
Variables X and Y are conditionally independent given Z iff P(X, Y|Z) = P(X|Z)P(Y|Z)
(or equivalently, iff P(X|Y, Z) = P(X|Z) or P(Y|X, Z) = P(Y|Z))
Consider P(Toothache, Cavity, Catch)
if I have a cavity, the probability that the probe catches in it doesn’t depend on whether I have a toothache: P(catch|toothache, cavity) = P(catch|cavity)
the same independence holds if I haven’t got a cavity: P(catch|toothache, ¬cavity) = P(catch|¬cavity)
⇒ Catch is conditionally independent of Toothache given Cavity: P(Catch|Toothache, Cavity) = P(Catch|Cavity)
or, equivalently: P(Toothache|Catch, Cavity) = P(Toothache|Cavity), or P(Toothache, Catch|Cavity) = P(Toothache|Cavity)P(Catch|Cavity)
Hint: Toothache and Catch are two (conditionally independent) effects of the same cause, Cavity
Conditional Independence [cont.]
In many cases, the use of conditional independence reduces the size of the representation of the joint distribution dramatically, even from exponential to linear!
Ex:
P(Toothache, Catch, Cavity)
= P(Toothache|Catch, Cavity)P(Catch, Cavity)
= P(Toothache|Catch, Cavity)P(Catch|Cavity)P(Cavity)
= P(Toothache|Cavity)P(Catch|Cavity)P(Cavity)
⇒ Passes from 7 to 2+2+1 = 5 independent numbers:
P(Toothache, Catch, Cavity) contains 7 independent entries (the 8th can be obtained as 1 − Σ of the others)
P(Toothache|Cavity), P(Catch|Cavity) contain 2 independent entries each (a 2 × 2 table whose rows sum to 1)
P(Cavity) contains 1 independent entry
General case: if one cause has n conditionally independent effects, P(Cause, Effect1, ..., Effectn) = P(Cause) Πi P(Effecti|Cause), and the representation grows as O(n) instead of O(2^n).
Exercise
Consider the joint probability distribution described in the table in the previous section (slide 20 onwards): P(Toothache, Catch, Cavity)
Consider the example in the previous slide:
P(Toothache, Catch, Cavity)
= P(Toothache|Catch, Cavity)P(Catch, Cavity)
= P(Toothache|Catch, Cavity)P(Catch|Cavity)P(Cavity)
= P(Toothache|Cavity)P(Catch|Cavity)P(Cavity)
Compute separately the distributions P(Toothache|Catch, Cavity), P(Catch|Cavity), P(Cavity), P(Toothache|Cavity).
Recompute P(Toothache, Catch, Cavity) in two ways:
P(Toothache|Catch, Cavity)P(Catch|Cavity)P(Cavity)
P(Toothache|Cavity)P(Catch|Cavity)P(Cavity)
and compare the results with P(Toothache, Catch, Cavity); a worked sketch follows below.
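A possible worked version in Python, again assuming the standard AIMA dentist-domain table stands in for the slide’s table:

```python
# Worked version of the exercise; the joint table is the standard AIMA
# dentist-domain one, assumed to stand in for the slide's table.
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}
TF = (True, False)

def P(pred):
    """P(phi) by enumeration over the joint table; keys are (t, c, v)."""
    return sum(p for (t, c, v), p in joint.items() if pred(t, c, v))

# P(Cavity), P(Toothache|Cavity), P(Catch|Cavity), P(Toothache|Catch, Cavity)
pv = {v: P(lambda t_, c_, v_: v_ == v) for v in TF}
pt_v = {(t, v): P(lambda t_, c_, v_: t_ == t and v_ == v) / pv[v]
        for t in TF for v in TF}
pc_v = {(c, v): P(lambda t_, c_, v_: c_ == c and v_ == v) / pv[v]
        for c in TF for v in TF}
pt_cv = {(t, c, v): joint[(t, c, v)] / P(lambda t_, c_, v_: c_ == c and v_ == v)
         for t in TF for c in TF for v in TF}

for (t, c, v), exact in joint.items():
    chain = pt_cv[(t, c, v)] * pc_v[(c, v)] * pv[v]  # chain rule: always exact
    fact = pt_v[(t, v)] * pc_v[(c, v)] * pv[v]       # exact iff cond. independence
    print((t, c, v), round(exact, 3), round(chain, 3), round(fact, 3))
```

For this table the two recomputations agree with the joint, confirming that Toothache and Catch are conditionally independent given Cavity.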
Bayes’ Rule
Bayes’ Rule/Theorem/Law:
P(a|b) = P(a ∧ b)/P(b) = P(b|a)P(a)/P(b)
In distribution form:
P(Y|X) = P(X|Y)P(Y)/P(X) = αP(X|Y)P(Y)
α ≝ 1/P(X): a normalization constant that makes the P(Y|X) entries sum to 1 (different α’s for different values of X)
A version conditionalized on some background evidence e:
P(Y|X, e) = P(X|Y, e)P(Y|e)/P(X|e)
Using Bayes’ Rule: The Simple Case
Used to assess diagnostic probability from causal probability:
P(cause|effect) = P(effect|cause)P(cause)/P(effect)
P(cause|effect) goes from effect to cause (diagnostic direction)
P(effect|cause) goes from cause to effect (causal direction)
Example
An expert doctor is likely to have causal knowledge ... P(symptoms|disease) (i.e., P(effect|cause)) ...
... and needs to produce diagnostic knowledge P(disease|symptoms) (i.e., P(cause|effect))
Ex: let m be meningitis, s be stiff neck
P(m) = 1/50000, P(s) = 0.01 (prior knowledge, from statistics)
“meningitis causes the patient a stiff neck in 70% of cases”: P(s|m) = 0.7 (doctor’s experience)
⇒ P(m|s) = P(s|m)P(m)/P(s) = 0.7 · (1/50000)/0.01 = 0.0014
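The same computation in a few lines of Python, as a sanity check:

```python
# Diagnostic probability from causal probability via Bayes' rule.
p_m = 1 / 50000    # prior P(meningitis)
p_s = 0.01         # prior P(stiff neck)
p_s_given_m = 0.7  # causal knowledge P(s | m)

p_m_given_s = p_s_given_m * p_m / p_s  # Bayes' rule
print(p_m_given_s)  # 0.0014
```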
Using Bayes’ Rule: Combining Evidence
A naive Bayes model is a probability model that assumes the effects are conditionally independent, given the cause:
P(Cause, Effect1, ..., Effectn) = P(Cause) Πi P(Effecti|Cause)
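A minimal naive Bayes sketch in Python for the dentist domain; the conditional probabilities below are derived from the AIMA joint table (an assumption, since the slide’s numbers are not shown):

```python
# Naive Bayes: P(Cause | e1, e2) = alpha * P(e1|Cause) * P(e2|Cause) * P(Cause).
# Conditional probabilities follow the AIMA dentist numbers (assumed).
p_cavity = {True: 0.2, False: 0.8}
p_toothache_given = {True: 0.6, False: 0.1}  # P(toothache | Cavity)
p_catch_given     = {True: 0.9, False: 0.2}  # P(catch | Cavity)

# Unnormalized scores for Cavity = true/false given toothache and catch
score = {v: p_toothache_given[v] * p_catch_given[v] * p_cavity[v]
         for v in (True, False)}
alpha = 1 / sum(score.values())  # normalization constant
posterior = {v: alpha * s for v, s in score.items()}
print(posterior)  # {True: ~0.871, False: ~0.129}
```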
An Example: The Wumpus World
A probability model of the Wumpus World
Consider again the Wumpus World (restricted to pit detection)
Evidence: no pit in (1,1), (1,2), (2,1); breezy in (1,2), (2,1)
Q: Given the evidence, what is the probability of having a pit in (1,3), (2,2) or (3,1)?
Two groups of variables:
Pij = true iff [i, j] contains a pit (“causes”)
Bij = true iff [i, j] is breezy (“effects”; consider only B1,1, B1,2, B2,1)
Joint Distribution: P(P1,1, ..., P4,4, B1,1, B1,2, B2,1)
The observed evidence, as propositions:
b∗ ≝ ¬b1,1 ∧ b1,2 ∧ b2,1
p∗ ≝ ¬p1,1 ∧ ¬p1,2 ∧ ¬p2,1
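A sketch of the resulting inference by enumeration (Python). The 0.2 pit prior per square is the AIMA convention and is assumed here; squares outside the frontier (1,3), (2,2), (3,1) do not affect the evidence, so they sum out of the query:

```python
from itertools import product

P_PIT = 0.2  # pit prior per square (AIMA convention; assumed, model slide lost)
frontier = [(1, 3), (2, 2), (3, 1)]  # the only squares the breezes depend on

def prior(pits):
    """Product of independent pit priors over the frontier squares."""
    p = 1.0
    for sq in frontier:
        p *= P_PIT if pits[sq] else 1 - P_PIT
    return p

def consistent(pits):
    """Evidence b* given p*: breeze in (1,2) needs a pit in (1,3) or (2,2);
    breeze in (2,1) needs a pit in (2,2) or (3,1); no breeze in (1,1) holds
    automatically because (1,2) and (2,1) are pit-free."""
    return (pits[(1, 3)] or pits[(2, 2)]) and (pits[(2, 2)] or pits[(3, 1)])

def query(square):
    """P(pit in square | b*, p*) by enumerating frontier configurations."""
    num = den = 0.0
    for vals in product([True, False], repeat=len(frontier)):
        pits = dict(zip(frontier, vals))
        if consistent(pits):
            den += prior(pits)
            if pits[square]:
                num += prior(pits)
    return num / den

print(round(query((1, 3)), 2))  # ~0.31
print(round(query((2, 2)), 2))  # ~0.86
print(round(query((3, 1)), 2))  # ~0.31
```

Enumeration thus answers the query: (2,2) is far more likely to contain a pit than (1,3) or (3,1), since it alone explains both breezes.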