Pert13 - Quantifying Uncertainty


Course : Artificial Intelligence (COMP6065)

Unofficial Slides

Quantifying Uncertainty

Session 13

Revised by Williem
1
Learning Outcomes
At the end of this session, students will be able to:
• LO 4 : Demonstrate how to achieve a goal through a sequence
of actions called planning
• LO 5 : Apply various techniques to an agent when acting under
uncertainty

2
Outline
1. Acting Under Uncertainty
2. Basic Probability Notation
3. Inference Using Full Joint Distributions
4. Independence
5. Probability and Bayes’ Theorem
6. Summary

3
Acting Under Uncertainty
• An agent may need to handle uncertainty, whether due to partial
observability, nondeterminism, or a combination of the two

• The agent’s knowledge can at best provide only a degree of
belief in the relevant sentences

• The main tool for dealing with degrees of belief is probability theory

• Probability provides a way of summarizing the uncertainty that
comes from laziness and ignorance, thereby solving the
qualification problem

4
Acting Under Uncertainty
• Uncertainty in logical sentences

– Toothache ⇒ Cavity (True?)

– Toothache ⇒ Cavity ∨ GumProblem ∨ Abscess ∨ . . . (unlimited) (True?)

– Cavity ⇒ Toothache (True?)

• Probability

– From statistical data, 80% of patients with a toothache
have had cavities

5
Acting Under Uncertainty
• When should we leave home for the airport so that we arrive
on time? Let At be the plan of leaving t minutes before the flight:

– A90: 97% chance of arriving on time

– A180: 100% chance of arriving on time

– A1440: 100% chance of arriving on time

• How should we choose among these plans?

6
Acting Under Uncertainty
• Preferences, as expressed by utilities, are combined with

probabilities in the general theory of rational decisions called

decision theory

– Decision theory = probability theory + utility theory

• The fundamental idea of decision theory is that an agent is rational

if and only if it chooses the action that yields the highest expected

utility, averaged over all the possible outcomes of the action; this

principle is called maximum expected utility (MEU) and is sketched
numerically after this slide.


7
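As a rough numeric illustration of the MEU rule for the airport plans, here is a minimal Python sketch. The on-time probabilities follow the previous slide; the utility numbers (value of catching the flight, cost of missing it, cost per minute of waiting) are illustrative assumptions only, not values from the course material.

# Minimal sketch of the maximum-expected-utility (MEU) rule.
plans = {            # plan name -> (minutes before flight, P(on time))
    "A90":   (90,   0.97),
    "A180":  (180,  1.00),
    "A1440": (1440, 1.00),
}

U_ON_TIME = 1000.0   # assumed utility of catching the flight
U_MISS    = -5000.0  # assumed utility of missing it
WAIT_COST = 1.0      # assumed disutility per minute spent waiting

def expected_utility(minutes_early, p_on_time):
    # Average the utility over the two outcomes, then subtract the waiting cost.
    return (p_on_time * U_ON_TIME
            + (1.0 - p_on_time) * U_MISS
            - WAIT_COST * minutes_early)

for name, args in plans.items():
    print(name, expected_utility(*args))
best = max(plans, key=lambda name: expected_utility(*plans[name]))
print("MEU choice:", best)   # A180 under these assumed utilities

Under these assumed numbers, A90 risks missing the flight and A1440 wastes a day at the airport, so A180 maximizes expected utility; with different utilities the rational choice could change.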
Basic Probability Notation
• In probability theory, the set of all possible worlds is called the
sample space

• For example, if we are about to roll two (distinguishable) dice,


there are 36 possible worlds to consider: (1,1), (1,2), . . ., (6,6)

• Every possible world is assigned a probability

– E.g. for two fair dice, each of the 36 possible worlds has probability 1/36

8
Basic Probability Notation
• A proposition corresponds to a set of possible worlds; its probability
is the sum of the probabilities of the worlds in which it holds

• E.g. P(Total=11) = P((5,6)) + P((6,5)) = 1/36 + 1/36 = 1/18

9
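A minimal Python sketch of this computation over the two-dice sample space; nothing beyond the 1/36-per-world model from these slides is assumed.

from fractions import Fraction
from itertools import product

# Sample space: 36 equally likely worlds (d1, d2), each with probability 1/36.
worlds = {(d1, d2): Fraction(1, 36) for d1, d2 in product(range(1, 7), repeat=2)}

# P(Total = 11): sum the probabilities of the worlds where the proposition holds.
p_total_11 = sum(p for (d1, d2), p in worlds.items() if d1 + d2 == 11)
print(p_total_11)   # 1/18, matching P((5,6)) + P((6,5))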
Basic Probability Notation
• There are two kinds of probabilities:

– Unconditional or prior probabilities

• Degrees of belief in propositions in the absence of any


other information

– Conditional or posterior probabilities

• Degrees of belief given some evidence (observed information)

• If the first die shows 5, what is P(Total = 11)? (worked out below)

10
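A worked answer to the question above, using the definition of conditional probability (this step is not on the slide):

P(Total = 11 | Die1 = 5) = P(Die1 = 5 ∧ Total = 11) / P(Die1 = 5) = (1/36) / (6/36) = 1/6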
Basic Probability Notation
• Conditional probabilities

– Definition: P(a | b) = P(a ∧ b) / P(b), defined whenever P(b) > 0

– The “|” is pronounced “given”

– Example: P(cavity | toothache) = P(cavity ∧ toothache) / P(toothache)

– Product rule: P(a ∧ b) = P(a | b) P(b)

11
Basic Probability Notation
• Variables in probability theory are called random variables

– Total, Die1

• Every random variable has a domain

– Total = {2, …, 12}

– Die1 = {1, …, 6}

12
Basic Probability Notation
• Probabilities of all the possible values of a random variable:

– P(Weather =sunny) = 0.6

– P(Weather =rain) = 0.1

– P(Weather =cloudy) = 0.29

– P(Weather =snow) = 0.01

• The probability distribution for the random variable Weather

– P(Weather) = ⟨0.6, 0.1, 0.29, 0.01⟩

13
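In code, such a distribution can be held as a small value-to-probability table whose entries must sum to 1; a minimal sketch using the numbers above:

# P(Weather) as a value -> probability table; the entries must sum to 1.
weather = {"sunny": 0.6, "rain": 0.1, "cloudy": 0.29, "snow": 0.01}

assert abs(sum(weather.values()) - 1.0) < 1e-9   # a valid distribution
print(weather["cloudy"])                         # 0.29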
Basic Probability Notation
• For continuous variables, it is not possible to write out the
entire distribution as a vector, because there are infinitely many
values

• The temperature at noon is distributed uniformly between 18


and 26 degrees Celsius

– P(NoonTemp =x) = Uniform[18C,26C](x)

– This is called a probability density function

14
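For this uniform distribution the density is constant and integrates to 1 over the interval; the following worked value is not shown on the slide:

P(NoonTemp = x) = Uniform[18C, 26C](x) = 1 / (26 − 18) = 0.125 per degree Celsius, for 18 ≤ x ≤ 26 (and 0 elsewhere)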
Basic Probability Notation
• We need notation for distributions on multiple variables

• P(Weather, Cavity) denotes the probabilities of all


combinations of the values of Weather and Cavity

• This is a 4×2 table of probabilities called the joint probability


distribution of Weather and Cavity

15
Basic Probability Notation
• The probability of a disjunction is given by the inclusion–exclusion principle:

– P(a ∨ b) = P(a) + P(b) − P(a ∧ b)

• Example
– 100 children were asked what the traffic light colours mean, and
the answers were:
• 75 children know the meaning of the red light
• 35 children know the meaning of the yellow light
• 50 children know the meaning of both
– Therefore:
• P(red ∨ yellow) = P(red) + P(yellow) − P(red ∧ yellow)
• P(red ∨ yellow) = 0.75 + 0.35 − 0.5 = 0.6
16
Inference Using Full Joint Distributions
• Full Joint Distribution for the Toothache, Cavity, Catch

• Six possible worlds in which cavity ∨ toothache holds:

– P(cavity ∨ toothache)

• = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28

17
Inference Using Full Joint Distributions
• Full Joint Distribution for the Toothache, Cavity, Catch

• Four possible worlds in which cavity holds:

– P(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2

18
Inference Using Full Joint Distributions
• Full Joint Distribution for the Toothache, Cavity, Catch

• Can also compute conditional probabilities:


– P(cavity | toothache)
• = P(cavity  toothache) / P(toothache)
• = (0.016+0.064) / (0.108 + 0.012 + 0.016 + 0.064)
• = 0.4
19
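The computations on the last three slides can be reproduced in a few lines once the full joint table (shown only as a figure in the original slides) is written out. The values below are the dentist-example entries from the Russell & Norvig textbook cited in the references; they are consistent with the sums on these slides, but the table itself is supplied here rather than copied from the slide image.

# Full joint distribution over (Toothache, Catch, Cavity); values from the
# AIMA dentist example, consistent with the sums on the slides above.
joint = {
    # (toothache, catch, cavity): probability
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def prob(event):
    # Sum the probabilities of all worlds in which the event (a predicate) holds.
    return sum(p for world, p in joint.items() if event(*world))

p_cavity_or_toothache = prob(lambda t, c, cav: cav or t)            # ~0.28
p_cavity              = prob(lambda t, c, cav: cav)                 # ~0.2
p_toothache           = prob(lambda t, c, cav: t)                   # ~0.2
p_cavity_given_tooth  = prob(lambda t, c, cav: cav and t) / p_toothache   # ~0.6
print(p_cavity_or_toothache, p_cavity, p_cavity_given_tooth)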
Independence
• How are P(toothache, catch, cavity, cloudy) and P(toothache,
catch, cavity) related?

– Use the product rule :

• P(toothache, catch, cavity, cloudy) = P(cloudy |


toothache, catch, cavity) P(toothache, catch, cavity)

– However, the weather does not influence the dental


variables, therefore :

• P(cloudy | toothache, catch, cavity) = P(cloudy)

20
Independence
• A and B are independent iff

– P(A|B) = P(A) or P(B|A) = P(B) or P(A, B) = P(A) P(B)

– P(Toothache, Catch, Cavity, Weather) = P(Toothache, Catch,


Cavity) P(Weather)

21
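A minimal sketch of how this factorization is used in practice: the 32-entry table P(Toothache, Catch, Cavity, Weather) is never stored; instead an 8-entry dental table and a 4-entry weather table are multiplied on demand. Both tables here reuse numbers from earlier slides, and only one dental entry is written out for brevity.

# Independence lets a 2*2*2*4 = 32-entry joint be stored as 8 + 4 = 12 numbers.
p_weather = {"sunny": 0.6, "rain": 0.1, "cloudy": 0.29, "snow": 0.01}
p_dental  = {(True, True, True): 0.108}   # ... the remaining 7 entries omitted here

def p_full(toothache, catch, cavity, weather):
    # P(toothache, catch, cavity, weather) = P(toothache, catch, cavity) * P(weather)
    return p_dental[(toothache, catch, cavity)] * p_weather[weather]

print(p_full(True, True, True, "cloudy"))   # 0.108 * 0.29 = 0.03132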
Independence
• Conditional independence

– P(X, Y | Z) = P(X | Z)P(Y | Z)

• As an example, consider the toothache and catch

probabilities, given cavity

– P(toothache ∧ catch | Cavity) = P(toothache | Cavity)P(catch


| Cavity)

– These variables are independent, given the presence or the

absence of a cavity

22
Probability and Bayes’ Theorem
• Bayes’ rule or Bayes’ theorem

– P(b | a) = P(a | b) P(b) / P(a)

– Why use this?

• We perceive as evidence the effect of some unknown cause,
and we would like to determine that cause:
P(cause | effect) = P(effect | cause) P(cause) / P(effect)
23
Probability and Bayes’ Theorem
• For example, a doctor knows that the disease meningitis
causes the patient to have a stiff neck, say, 70% of the time

• The doctor also knows some unconditional facts: the prior


probability that a patient has meningitis is 1/50,000, and the
prior probability that any patient has a stiff neck is 1%.

24
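Plugging these numbers into Bayes' rule gives the posterior; this worked step follows the textbook's meningitis example and is not written out on the slide:

P(meningitis | stiff neck) = P(stiff neck | meningitis) P(meningitis) / P(stiff neck)
= (0.7 × 1/50000) / 0.01
= 0.0014

So even with a stiff neck, only about 1 in 700 such patients is expected to have meningitis.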
Probability and Bayes’ Theorem
Vany has developed symptoms such as spots on her face. The doctor
diagnoses the possible causes with the following probabilities:
• Probability that spots appear on the face if Vany has chicken
pox, P(spots | chicken pox) = 0.8
• Prior probability that Vany has chicken pox, before any symptoms
are observed, P(chicken pox) = 0.4
• Probability that spots appear on the face if Vany has an allergy,
P(spots | allergy) = 0.3
• Prior probability that Vany has an allergy, before any symptoms
are observed, P(allergy) = 0.7
25
Probability and Bayes’ Theorem
• Probability that spots appear on the face if Vany has
pimples, P(spots | pimples) = 0.9

• Prior probability that Vany has pimples, before any symptoms
are observed, P(pimples) = 0.5

Calculate the posterior probability of each candidate cause, given the spots:


P(chicken pox | spots) = (0.8 × 0.4) / ((0.8 × 0.4) + (0.3 × 0.7) + (0.9 × 0.5)) = 0.32 / 0.98 ≈ 0.327

P(allergy | spots) = (0.3 × 0.7) / 0.98 ≈ 0.214

P(pimples | spots) = (0.9 × 0.5) / 0.98 ≈ 0.459
26
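A small Python check of these three posteriors by normalization, taking the numbers on the slides as given:

# Posterior over the three candidate causes of the spots, by normalization.
likelihood_times_prior = {
    "chicken pox": 0.8 * 0.4,   # P(spots | chicken pox) * P(chicken pox)
    "allergy":     0.3 * 0.7,   # P(spots | allergy)     * P(allergy)
    "pimples":     0.9 * 0.5,   # P(spots | pimples)     * P(pimples)
}
evidence = sum(likelihood_times_prior.values())                    # P(spots) = 0.98
posterior = {cause: v / evidence for cause, v in likelihood_times_prior.items()}
print(posterior)   # ~0.327, ~0.214, ~0.459, matching the slide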
Probability and Bayes’ Theorem
• Problem : Marie is getting married tomorrow, at an outdoor
ceremony in the desert. In recent years, it has rained only 5
days each year. Unfortunately, the weatherman has predicted
rain for tomorrow. When it actually rains, the weatherman
correctly forecasts rain 90% of the time. When it doesn't rain,
he incorrectly forecasts rain 10% of the time. What is the
probability that it will rain on the day of Marie's wedding?

27
Probability and Bayes’ Theorem
• Solution: The sample space is defined by two mutually-
exclusive events - it rains or it does not rain. Additionally, a
third event occurs when the weatherman predicts rain.
Notation for these events appears below.

✓ Event A1. It rains on Marie's wedding.

✓ Event A2. It does not rain on Marie's wedding.

✓ Event B. The weatherman predicts rain.

28
Probability and Bayes’ Theorem
• P( A1 ) = 5/365 = 0.0136985 [It rains 5 days out of the year.]

• P( A2 ) = 360/365 = 0.9863014 [It does not rain 360 days out


of the year.]

• P( B | A1 ) = 0.9 [When it rains, the weatherman predicts rain


90% of the time.]

• P( B | A2 ) = 0.1 [When it does not rain, the weatherman


predicts rain 10% of the time.]

29
Probability and Bayes’ Theorem
• We want to know P( A1 | B ), the probability it will rain on the
day of Marie's wedding, given a forecast for rain by the
weatherman. The answer can be determined from Bayes'
theorem, as shown below.

P( A1 | B ) = P( A1 ) P( B | A1 ) / [ P( A1 ) P( B | A1 ) + P( A2 ) P( B | A2 ) ]
= (0.014)(0.9) / [ (0.014)(0.9) + (0.986)(0.1) ]
≈ 0.111

30
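A small Python check of this result using exact fractions; the inputs are exactly those listed on the previous slides.

from fractions import Fraction

# Bayes' rule for the wedding-forecast problem.
p_rain    = Fraction(5, 365)     # P(A1)
p_dry     = Fraction(360, 365)   # P(A2)
p_fc_rain = Fraction(9, 10)      # P(B | A1)
p_fc_dry  = Fraction(1, 10)      # P(B | A2)

posterior = (p_rain * p_fc_rain) / (p_rain * p_fc_rain + p_dry * p_fc_dry)
print(posterior, float(posterior))   # 1/9 ~ 0.111: rain is still unlikely despite the forecast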
Summary
• Basic probability statements include prior probabilities and
conditional probabilities over simple and complex
propositions

• The full joint probability distribution specifies the probability


of each complete assignment of values to random variables

• Bayes’ rule allows unknown probabilities to be computed


from known conditional probabilities, usually in the causal
direction

31
References
• Stuart Russell, Peter Norvig. 2010. Artificial Intelligence: A
Modern Approach. Pearson Education, New Jersey.
ISBN: 9780132071482

• http://aima.cs.berkeley.edu

32
