
ICS 2405

Knowledge-Based
Systems

Chapter 9: Reasoning
Under Uncertainty



Overview: Reasoning and Uncertainty
Motivation
Objectives
Sources of Uncertainty and Inexactness in Reasoning
◦ Incorrect and Incomplete Knowledge
◦ Ambiguities
◦ Belief and Ignorance
Probability Theory
◦ Bayesian Networks
◦ Certainty Factors
◦ Belief and Disbelief
◦ Dempster-Shafer Theory
◦ Evidential Reasoning
Important Concepts and Terms
Chapter Summary



Motivation
reasoning about real-world problems involves missing knowledge, inexact
knowledge, inconsistent facts or rules, and other sources of uncertainty
while traditional logic is in principle capable of capturing and
expressing these aspects, it is not very intuitive or practical
◦ requires the explicit introduction of additional predicates or functions
many expert systems have mechanisms to deal with uncertainty
◦ sometimes introduced as ad-hoc measures, lacking a sound foundation



Objectives
be familiar with various sources of uncertainty and imprecision
in knowledge representation and reasoning
understand the main approaches to dealing with uncertainty
◦ probability theory
◦ Bayesian networks
◦ Dempster-Shafer theory
◦ important characteristics of the approaches
◦ differences between methods, advantages, disadvantages, performance,
typical scenarios
evaluate the suitability of those approaches
◦ application of methods to scenarios or tasks
apply selected approaches to simple problems



Introduction
reasoning under uncertainty and with inexact knowledge
◦ frequently necessary for real-world problems
heuristics
◦ ways to mimic heuristic knowledge processing
◦ methods used by experts
empirical associations
◦ experiential reasoning
◦ based on limited observations
probabilities
◦ objective (frequency counting)
◦ subjective (human experience)
reproducibility
◦ will observations deliver the same results when repeated?



Dealing with
Uncertainty
expressiveness
◦ can concepts used by humans be represented adequately?
◦ can the confidence of experts in their decisions be expressed?
comprehensibility
◦ representation of uncertainty
◦ utilization in reasoning methods
correctness
◦ probabilities
◦ adherence to the formal aspects of probability theory
◦ relevance ranking
◦ probabilities don’t add up to 1, but the “most likely” result is sufficient
◦ long inference chains
◦ tend to result in extreme (0,1) or not very useful (0.5) results

computational complexity
◦ feasibility of calculations for practical purposes



Sources of Uncertainty
data
◦ data missing, unreliable, ambiguous, …
◦ representation imprecise, inconsistent, subjective, derived from defaults, …
expert knowledge
◦ inconsistency between different experts
◦ plausibility
◦ “best guess” of experts
◦ quality
◦ causal knowledge
◦ deep understanding
◦ statistical associations
◦ observations
◦ scope
◦ only current domain, or more general



Sources of Uncertainty
(cont.)
knowledge representation
◦ restricted model of the real system
◦ limited expressiveness of the representation mechanism
inference process
◦ deductive
◦ the derived result is formally correct, but inappropriate
◦ derivation of the result may take a very long time
◦ inductive
◦ new conclusions are not well-founded
◦ not enough samples
◦ samples are not representative
◦ unsound reasoning methods
◦ induction, non-monotonic, default reasoning



Uncertainty in Individual
Rules
errors
◦ domain errors
◦ representation errors
◦ inappropriate application of the rule

likelihood of evidence
◦ for each premise
◦ for the conclusion
◦ combination of evidence from multiple premises



Uncertainty and
Multiple Rules
conflict resolution
◦ if multiple rules are applicable, which one is selected
◦ explicit priorities, provided by domain experts
◦ implicit priorities derived from rule properties
◦ specificity of patterns, ordering of patterns, creation time of rules, most recent
usage, …
compatibility
◦ contradictions between rules
◦ subsumption
◦ one rule is a more general version of another one
◦ redundancy
◦ missing rules
◦ data fusion
◦ integration of data from multiple sources



Basics of Probability
Theory
mathematical approach for processing uncertain information
sample space set
X = {x1, x2, …, xn}
◦ collection of all possible events
◦ can be discrete or continuous
the probability P(xi) reflects the likelihood of an event xi occurring
◦ non-negative value in [0,1]
◦ total probability of the sample space (sum of probabilities) is 1
◦ for mutually exclusive events, the probability for at least one of them is the sum
of their individual probabilities
◦ experimental probability
◦ based on the frequency of events
◦ subjective probability
◦ based on expert assessment
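
To make the frequency-counting reading concrete, here is a minimal sketch (not from the slides; the events and counts are invented) of estimating experimental probabilities over a discrete sample space:

from collections import Counter

# hypothetical observations over a discrete sample space {sunny, rainy, cloudy}
observations = ["sunny", "rainy", "sunny", "cloudy", "sunny", "rainy"]

counts = Counter(observations)
total = len(observations)

# experimental probability: relative frequency of each observed event
P = {event: n / total for event, n in counts.items()}

# total probability of the sample space is 1
assert abs(sum(P.values()) - 1.0) < 1e-9
print(P)   # {'sunny': 0.5, 'rainy': 0.33..., 'cloudy': 0.16...}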



Probability theory:
probabilities
We can assign probabilities to the outcomes of a random variable.
P(Throw = heads) = 0.5
P(Mary_Calls = true) = 0.1
P(a) = 0.3
Some simple rules governing probabilities
1. All probabilities are between 0 and 1 inclusive: 0 ≤ P(a) ≤ 1
2. Necessarily true propositions have probability 1, necessarily false ones probability 0:
   P(true) = 1, P(false) = 0
3. The probability of a disjunction:
   P(a ∨ b) = P(a) + P(b) - P(a ∧ b)
From these three laws all of probability theory can be derived.
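
As a quick numeric check of rule 3, with invented values:

# hypothetical values: P(a) = 0.3, P(b) = 0.4, P(a ∧ b) = 0.12
p_a, p_b, p_a_and_b = 0.3, 0.4, 0.12

# P(a ∨ b) = P(a) + P(b) - P(a ∧ b)
p_a_or_b = p_a + p_b - p_a_and_b
print(p_a_or_b)   # 0.58 (up to floating-point rounding)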



Probability Theory: Conditional
Probability
A conditional probability expresses the likelihood that one event a will occur if b
occurs. We denote this as follows
P(a | b)
e.g.
P(Toothache = true) = 0.2
P(Toothache = true | Cavity = true) = 0.6

So conditional probabilities reflect the fact that some events make other events
more (or less) likely
If one event doesn’t affect the likelihood of another event they are said to be
independent and therefore
P ( a | b) P ( a )

E.g. if you roll a 6 on a die, it doesn’t make it more or less likely that you will roll a
6 on the next throw. The rolls are independent.



Combining Probabilities: the
product rule
How can we work out the likelihood of two events occurring together, given their
base and conditional probabilities?
P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)
So in our toy example:

P(toothache ∧ cavity) = P(toothache | cavity) P(cavity)
                      = P(cavity | toothache) P(toothache)

But this doesn’t help us answer our question:


“I have toothache. Do I have a cavity?”
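
A minimal numeric sketch of the product rule, using invented probabilities rather than values from the text:

# hypothetical values
p_cavity = 0.1                    # P(cavity)
p_toothache_given_cavity = 0.6    # P(toothache | cavity)

# product rule: P(toothache ∧ cavity) = P(toothache | cavity) · P(cavity)
p_joint = p_toothache_given_cavity * p_cavity
print(p_joint)   # 0.06 (up to rounding)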



Advantages and Problems:
Probabilities
advantages
◦ formal foundation
◦ reflection of reality (a posteriori)
problems
◦ may be inappropriate
◦ the future is not always similar to the past
◦ inexact or incorrect
◦ especially for subjective probabilities
◦ ignorance
◦ probabilities must be assigned even if no information is available
◦ an equal amount of probability is then assigned to all alternatives
◦ non-local reasoning
◦ requires the consideration of all available evidence, not only from the rules currently under
consideration
◦ no compositionality
◦ complex statements with conditional dependencies cannot be decomposed into independent parts



Reasoning under Uncertainty
Bayesian Approaches
derive the probability of a cause given a symptom
has gained importance recently due to advances in efficiency
◦ more computational power available
◦ better methods

especially useful in diagnostic systems
◦ medicine, computer help systems

inverse probability
◦ the probability of an earlier event given that a later one has occurred



Reasoning under Uncertainty
Bayes’ rule
We can rearrange the two parts of the product rule:

P(a ∧ b) = P(a | b) P(b) = P(b | a) P(a)

P(a | b) P(b) = P(b | a) P(a)

P(a | b) = P(b | a) P(a) / P(b)

This is known as Bayes’ rule.


It is the cornerstone of modern probabilistic AI.
But why is it useful?
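
A minimal sketch of Bayes' rule as a function; the sample numbers are invented:

def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """P(a | b) = P(b | a) * P(a) / P(b)"""
    return p_b_given_a * p_a / p_b

# hypothetical values: P(b | a) = 0.6, P(a) = 0.1, P(b) = 0.2
print(bayes(0.6, 0.1, 0.2))   # P(a | b) = 0.3 (up to rounding)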



Bayes’ rule
We can think about some events as being “hidden” causes: not necessarily
directly observed (e.g. a cavity).

If we model how likely observable effects are given hidden causes (how likely
toothache is given a cavity)

Then Bayes’ rule allows us to use that model to infer the likelihood of the hidden
cause (and thus answer our question)

P(cause | effect) = P(effect | cause) P(cause) / P(effect)

In fact, good models of P(effect | cause) are often available to us in real
domains (e.g. medical diagnosis)



Bayes’ rule can capture causal
models
Suppose a doctor knows that meningitis causes a stiff neck in 50% of cases
P(s | m) = 0.5
She also knows that the probability in the general population of someone having
a stiff neck at any time is 1/20
P(s) = 0.05
She also has to know the incidence of meningitis in the population (1/50,000)
P(m) = 0.00002
Using Bayes' rule she can calculate the probability that the patient has meningitis:

P(m | s) = P(s | m) P(m) / P(s) = (0.5 × 0.00002) / 0.05 = 0.0002 = 1/5000

P(cause | effect) = P(effect | cause) P(cause) / P(effect)
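As a quick check, plugging the slide's numbers into Bayes' rule:

p_s_given_m = 0.5     # P(stiff neck | meningitis)
p_m = 0.00002         # P(meningitis) = 1/50,000
p_s = 0.05            # P(stiff neck) = 1/20

p_m_given_s = p_s_given_m * p_m / p_s   # Bayes' rule
print(p_m_given_s)                      # ≈ 0.0002, i.e. 1/5000
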
Advantages and Problems of
Bayesian Reasoning
advantages
◦ sound theoretical foundation
◦ well-defined semantics for decision making

problems
◦ requires large amounts of probability data
◦ sufficient sample sizes
◦ subjective evidence may not be reliable
◦ assumption of independent evidence often not valid
◦ relationship between hypothesis and evidence is reduced to a number
◦ explanations for the user difficult
◦ high computational overhead



Reasoning under Uncertainty
Certainty Factors
denotes the belief in a hypothesis H given that some pieces of
evidence E are observed
no statement about the belief means that no evidence is present
◦ in contrast to probabilities and Bayes' method

works reasonably well with partial evidence
◦ separation of belief, disbelief, ignorance
shares some foundations with Dempster-Shafer (DS) theory, but is more practical
◦ introduced in an ad-hoc way in MYCIN
◦ later mapped to DS theory



Reasoning under Uncertainty
Belief and Disbelief
measure of belief
◦ degree to which hypothesis H is supported by evidence E
◦ MB(H,E) = 1 if P(H) = 1
            (P(H|E) - P(H)) / (1 - P(H)) otherwise

measure of disbelief
◦ degree to which doubt in hypothesis H is supported by evidence E
◦ MD(H,E) = 1 if P(H) = 0
            (P(H) - P(H|E)) / P(H) otherwise
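
A minimal sketch of the two measures as functions, assuming the prior P(H) and posterior P(H|E) are given; negative differences are clamped to zero (as in the standard MYCIN formulation), and the sample numbers are invented:

def measure_of_belief(p_h: float, p_h_given_e: float) -> float:
    """MB(H,E): degree to which evidence E supports H."""
    if p_h == 1.0:
        return 1.0
    # clamp at 0: disconfirming evidence is captured by MD instead
    return max(p_h_given_e - p_h, 0.0) / (1.0 - p_h)

def measure_of_disbelief(p_h: float, p_h_given_e: float) -> float:
    """MD(H,E): degree to which evidence E supports doubt in H."""
    if p_h == 0.0:
        return 1.0
    return max(p_h - p_h_given_e, 0.0) / p_h

# hypothetical values: prior P(H) = 0.3, posterior P(H|E) = 0.8
print(measure_of_belief(0.3, 0.8))     # ≈ 0.71
print(measure_of_disbelief(0.3, 0.8))  # 0.0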



Reasoning under Uncertainty
Certainty Factor
certainty factor CF
◦ ranges between -1 (denial of the hypothesis H) and +1 (confirmation of
H)
◦ allows the ranking of hypotheses

difference between belief and disbelief

CF(H,E) = MB(H,E) - MD(H,E)

combining antecedent evidence
◦ use of premises with less than absolute confidence
◦ CF(H, E1 ∧ E2) = min(CF(H, E1), CF(H, E2))
◦ CF(H, E1 ∨ E2) = max(CF(H, E1), CF(H, E2))
◦ CF(H, ¬E) = -CF(H, E)
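
A small sketch of these antecedent-combination rules (the certainty factors below are invented):

def cf_and(cf1: float, cf2: float) -> float:
    """CF(H, E1 ∧ E2): take the minimum of the individual certainty factors."""
    return min(cf1, cf2)

def cf_or(cf1: float, cf2: float) -> float:
    """CF(H, E1 ∨ E2): take the maximum of the individual certainty factors."""
    return max(cf1, cf2)

def cf_not(cf: float) -> float:
    """CF(H, ¬E) = -CF(H, E)."""
    return -cf

print(cf_and(0.7, 0.4))   # 0.4
print(cf_or(0.7, 0.4))    # 0.7
print(cf_not(0.7))        # -0.7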



Reasoning under Uncertainty
Combining Certainty
Factors
certainty factors that support the same conclusion
several rules can lead to the same conclusion
applied incrementally as new evidence becomes available

CFrev(CFold, CFnew) =
  CFold + CFnew (1 - CFold)                       if both > 0
  CFold + CFnew (1 + CFold)                       if both < 0
  (CFold + CFnew) / (1 - min(|CFold|, |CFnew|))   if one of them < 0
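
A sketch of applying this combination rule incrementally as new evidence arrives; the certainty factors are invented:

def cf_combine(cf_old: float, cf_new: float) -> float:
    """Combine two certainty factors that support the same conclusion."""
    if cf_old > 0 and cf_new > 0:
        return cf_old + cf_new * (1 - cf_old)
    if cf_old < 0 and cf_new < 0:
        return cf_old + cf_new * (1 + cf_old)
    # mixed signs (undefined if one of the factors is ±1)
    return (cf_old + cf_new) / (1 - min(abs(cf_old), abs(cf_new)))

# two rules support the same hypothesis with CF 0.6 and 0.4
cf = cf_combine(0.6, 0.4)    # 0.76
# a third, disconfirming rule arrives with CF -0.2
cf = cf_combine(cf, -0.2)    # (0.76 - 0.2) / (1 - 0.2) = 0.7
print(cf)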



Reasoning under Uncertainty
Characteristics of
Certainty Factors
Aspect             Probability        MB   MD   CF
Certainly true     P(H|E) = 1          1    0    1
Certainly false    P(¬H|E) = 1         0    1   -1
No evidence        P(H|E) = P(H)       0    0    0

Ranges
◦ measure of belief 0 ≤ MB ≤ 1
◦ measure of disbelief 0 ≤ MD ≤ 1
◦ certainty factor -1 ≤ CF ≤ +1



Reasoning under Uncertainty
Advantages and Problems of
Certainty Factors
Advantages
◦ simple implementation
◦ reasonable modeling of human experts’ belief
◦ expression of belief and disbelief
◦ successful applications for certain problem classes
◦ evidence relatively easy to gather
◦ no statistical base required

Problems
◦ partially ad hoc approach
◦ theoretical foundation through Dempster-Shafer theory was developed later
◦ combination of non-independent evidence unsatisfactory
◦ new knowledge may require changes in the certainty factors of existing knowledge
◦ certainty factors can become the opposite of conditional probabilities for certain cases
◦ not suitable for long inference chains



Reasoning under Uncertainty
Dempster-Shafer
Theory
mathematical theory of evidence
◦ uncertainty is modeled through a range of probabilities
◦ instead of a single number indicating a probability
◦ sound theoretical foundation
◦ allows distinction between belief, disbelief, ignorance (non-belief)
◦ certainty factors are a special case of DS theory



Reasoning under Uncertainty
DS Theory Notation
environment
◦ set of objects Oi that are of interest
◦ Θ = {O1, O2, ..., On}
frame of discernment FD
◦ an environment whose elements may be possible answers
◦ only one answer is the correct one
mass probability function m
◦ assigns a value from [0,1] to every subset of the frame of discernment
◦ describes the degree of belief, in analogy to the mass of a physical object
mass probability m(A)
◦ portion of the total mass probability that is assigned to a specific subset A
of the FD



Reasoning under Uncertainty
Belief and Certainty
belief Bel(A) in a set A
◦ sum of the mass probabilities of all subsets of A, including A itself
◦ all the mass that supports A
◦ likelihood that one of its members is the conclusion
◦ also called support function

plausibility Pls(A)
◦ maximum belief of A
◦ upper bound for the range of belief

certainty Cer(A)
◦ interval [Bel(A), Pls(A)]
◦ also called evidential interval
◦ expresses the range of belief
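
A minimal sketch (with an invented frame of discernment and invented masses, assigned to subsets represented as frozensets) of computing the evidential interval [Bel(A), Pls(A)]:

# hypothetical mass assignment over the frame Θ = {'flu', 'cold', 'allergy'}
m = {
    frozenset({'flu'}): 0.4,
    frozenset({'flu', 'cold'}): 0.3,
    frozenset({'flu', 'cold', 'allergy'}): 0.3,   # mass left on Θ expresses ignorance
}

def bel(a: frozenset) -> float:
    """Bel(A): total mass of all subsets of A."""
    return sum(v for s, v in m.items() if s <= a)

def pls(a: frozenset) -> float:
    """Pls(A): total mass of all sets that intersect A (upper bound of belief)."""
    return sum(v for s, v in m.items() if s & a)

a = frozenset({'flu'})
print(bel(a), pls(a))   # evidential interval [0.4, 1.0]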



Reasoning under Uncertainty
Combination of Mass
Probabilities
combining two masses in such a way that the new mass represents a
consensus of the contributing pieces of evidence
◦ set intersection puts the emphasis on common elements of evidence,
rather than conflicting evidence

m1 ⊕ m2 (C) = Σ_{X ∩ Y = C} m1(X) · m2(Y) / (1 - Σ_{X ∩ Y = ∅} m1(X) · m2(Y))

where X, Y are hypothesis subsets
C is their intersection, C = X ∩ Y
the denominator renormalizes by discarding the mass assigned to conflicting
(empty-intersection) combinations
⊕ is the orthogonal or direct sum
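
A minimal sketch of this combination (Dempster's rule) for mass functions stored as frozenset-keyed dictionaries; the two mass functions are invented:

from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    """Orthogonal sum m1 ⊕ m2: consensus of two mass functions."""
    combined = {}
    conflict = 0.0
    for (x, mx), (y, my) in product(m1.items(), m2.items()):
        c = x & y
        if c:
            combined[c] = combined.get(c, 0.0) + mx * my
        else:
            conflict += mx * my          # mass falling on the empty intersection
    # renormalize by the non-conflicting mass (1 - conflict)
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# two hypothetical, independent pieces of evidence over Θ = {'flu', 'cold'}
m1 = {frozenset({'flu'}): 0.6, frozenset({'flu', 'cold'}): 0.4}
m2 = {frozenset({'cold'}): 0.5, frozenset({'flu', 'cold'}): 0.5}
print(combine(m1, m2))   # ≈ {'flu'}: 0.43, {'cold'}: 0.29, {'flu','cold'}: 0.29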



Reasoning under Uncertainty
Differences Probabilities -
DS Theory
Aspect                Probabilities          Dempster-Shafer
Aggregate sum         Σi Pi = 1              m(Θ) ≤ 1
Subset X ⊆ Y          P(X) ≤ P(Y)            m(X) > m(Y) allowed
X, ¬X relationship    P(X) + P(¬X) = 1       m(X) + m(¬X) ≤ 1 (ignorance)


Reasoning under Uncertainty
Evidential Reasoning
extension of DS theory that deals with uncertain, imprecise, and
possibly inaccurate knowledge
also uses evidential intervals to express the confidence in a statement
◦ lower bound is called support (Spt) in evidential reasoning, and belief
(Bel) in Dempster-Shafer theory
◦ upper bound is plausibility (Pls)



Reasoning under Uncertainty
Evidential Intervals
Meaning                               Evidential interval
Completely true                       [1, 1]
Completely false                      [0, 0]
Completely ignorant                   [0, 1]
Tends to support                      [Bel, 1] where 0 < Bel < 1
Tends to refute                       [0, Pls] where 0 < Pls < 1
Tends to both support and refute      [Bel, Pls] where 0 < Bel ≤ Pls < 1

Bel: belief; lower bound of the evidential interval
Pls: plausibility; upper bound



Reasoning under Uncertainty
Advantages and Problems of
Dempster-Shafer
advantages
◦ clear, rigorous foundation
◦ ability to express confidence through intervals
◦ certainty about certainty
◦ proper treatment of ignorance

problems
◦ non-intuitive determination of mass probability
◦ very high computational overhead
◦ may produce counterintuitive results due to normalization
◦ usability somewhat unclear



Reasoning under Uncertainty
Summary: Reasoning and Uncertainty
many practical tasks require reasoning under uncertainty
◦ missing, inexact, inconsistent knowledge

variations of probability theory are often combined with rule-based approaches
◦ works reasonably well for many practical problems
Bayesian networks have gained some prominence
◦ improved methods, sufficient computational power
