•Example:
◦From "All C’s are J’s" and "All J’s are B’s", we only know that all C’s are B’s, not that all B’s are C’s.
◦Suppose C = Dogs, J = Animals, B = Living Beings.
◦All dogs are animals and all animals are living beings, but not all living beings are dogs.
◦Hence, Conclusion 1 ("All B’s are C’s") is false.
Conclusion 2: "Some J’s are C’s" ✅ (True)
◦Since all C’s (dogs) are J’s (animals), at least some animals must be dogs.
Limitations:
To overcome the limitations of purely symbolic logic, AI researchers are now combining symbolic logic with machine learning (neural networks, probabilistic AI).
Statistical reasoning adds probability and data-driven learning to symbolic logic, making it more flexible and powerful.
Approaches to Reasoning
There are three different approaches to reasoning under uncertainty.
(i) Probabilistic Logic
•Probabilistic Logic extends symbolic logic by allowing degrees of belief (probabilities) instead of just true/false.
•Example:
◦Instead of "If fever, then flu," we say:
•P(Flu | Fever) = 0.7 (70% probability)
◦AI can now make informed decisions even with uncertainty.
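A minimal sketch of how such a probabilistic rule can drive a decision (the 0.5 decision threshold and the diagnose function are illustrative assumptions, not from the slides):

    # Minimal sketch: act on a degree of belief instead of a hard true/false rule.
    def diagnose(p_flu_given_fever: float, threshold: float = 0.5) -> str:
        # Decide based on P(Flu | Fever) rather than the rule "if fever then flu".
        if p_flu_given_fever >= threshold:
            return "Flu is likely; recommend a flu test"
        return "Flu is unlikely; consider other causes"

    print(diagnose(0.7))   # P(Flu | Fever) = 0.7  ->  flu is likely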
(ii) Bayesian Networks (for Probabilistic Reasoning)
•Bayesian Networks (BNs) combine symbolic logic (causal relationships) with statistical
reasoning (probabilities).
•Example:
◦A medical diagnosis system uses a Bayesian Network to reason:
•P(Flu | Fever, Cough, Sore throat) = 0.85
◦The AI updates probabilities dynamically as new information arrives.
(iii) Machine Learning (for Rule Discovery)
Instead of manually defining rules, machine learning (ML) can discover patterns from
data.
•Example:
◦Instead of a human defining rules like:
"If a customer buys a laptop, they might buy a mouse,"
◦AI learns this rule from sales data using association rule mining or probabilistic models.
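As a minimal sketch (the transaction list is hypothetical), this is how the "laptop → mouse" rule can be learned from data by counting support and confidence, the basic quantities of association rule mining:

    # Learn "laptop -> mouse" from (hypothetical) sales transactions.
    transactions = [
        {"laptop", "mouse", "bag"},
        {"laptop", "mouse"},
        {"laptop"},
        {"phone", "charger"},
        {"laptop", "mouse", "headset"},
    ]

    laptop_count = sum(1 for t in transactions if "laptop" in t)
    both_count = sum(1 for t in transactions if {"laptop", "mouse"} <= t)

    support = both_count / len(transactions)   # how often the pair occurs overall
    confidence = both_count / laptop_count     # estimate of P(mouse | laptop)

    print(f"support = {support:.2f}, confidence = {confidence:.2f}")
    # confidence = 0.75 here: customers who buy a laptop buy a mouse ~75% of the time.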
(iv) Fuzzy Logic (for Handling Vagueness)
•Fuzzy Logic extends symbolic logic by allowing "degrees of truth" rather than just
true/false.
•Example:
◦Instead of a hard rule:
"If temperature > 38°C, then fever,"
fuzzy logic assigns a degree of fever between 0 and 1, so a temperature slightly below 38°C can still count as "somewhat feverish" (see the sketch below).
iii) Circumscription
Probability of an event:
P(A) = (Number of favorable outcomes) / (Total number of possible outcomes)
where:
P(A) is the probability of event A.
Favorable outcomes are the cases where event A occurs.
Total outcomes are all possible cases in the experiment.
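A quick illustration (the fair-die example is added here, not from the slides):
P(rolling an even number with a fair die) = 3 favorable outcomes {2, 4, 6} / 6 total outcomes = 0.5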
Bayes' Theorem.
Bayes’ theorem is a way to update our beliefs based on new evidence.
It relates conditional probabilities and is expressed as:
P(A | B) = [ P(B | A) · P(A) ] / P(B)
where:
•P(A ∣ B) is the probability of event A, given event B occurred.
•P(B ∣ A) is the probability of event B, given A.
•P(A) and P(B) are the individual probabilities of A and B.
Example: Medical Testing
Let’s say:
•1% of a population has a rare disease: P(D) = 0.01.
•A test for the disease (T) is 90% accurate if you have the disease: P(T | D) = 0.9.
•The test gives a false positive 5% of the time: P(T | ¬D) = 0.05.
•We want to find P(D | T), the probability that a person actually has the disease given that they tested positive.
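Carrying the calculation through with the numbers above (the arithmetic is filled in here for completeness):
P(D | T) = P(T | D)·P(D) / [ P(T | D)·P(D) + P(T | ¬D)·P(¬D) ]
         = (0.9 × 0.01) / (0.9 × 0.01 + 0.05 × 0.99)
         = 0.009 / 0.0585 ≈ 0.154
So even after a positive test, the probability of actually having the disease is only about 15%, because the disease is rare.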
Example Bayesian Network (from the figure):
- Cloudy (C) is the root cause.
- Sprinkler (S) and Rain (R) depend on Cloudy.
- Wet Grass (W) is influenced by both Sprinkler (S) and Rain (R).
§ From the figure: each arrow/line represents a conditional dependence between the variables.
Applications of Bayesian Networks
- Medical Diagnosis (ex: detecting diseases based on symptoms)
- Spam Filtering (ex: detecting spam emails)
- Fault Detection (ex: identifying failures in machines)
- AI and Robotics (ex: decision-making under uncertainty)
Compute a specific probability using these tables?
✔ Let's compute a specific probability using the Bayesian Network.
Problem: Given that the grass is wet (W = True), what is the probability that it rained, P(R | W)?
We will use Bayes' Theorem:
✔ P(R | W) = [ P(W | R) · P(R) ] / P(W)
where P(W | R), P(R), and P(W) are obtained from the network's conditional probability tables (CPTs).
Step 2: Compute P(R | W)
Once we have P(W), apply Bayes' theorem:
P(R | W) = [ P(W | R) · P(R) ] / P(W)
This means that if we observe the grass is wet, there is a 68.8% chance that it was due
to rain rather than just the sprinkler.
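A minimal sketch of this computation by enumerating the joint distribution (the CPT values below are illustrative placeholders, not the tables from the figure; plugging in the figure's values is what yields the 68.8% quoted above):

    # Enumerate the joint distribution of the Cloudy/Sprinkler/Rain/WetGrass network
    # to get P(R = True | W = True). CPT values are ILLUSTRATIVE ONLY; replace them
    # with the values from the slide's tables.
    from itertools import product

    P_C = 0.5                                        # P(Cloudy = True)
    P_S = {True: 0.1, False: 0.5}                    # P(Sprinkler = True | Cloudy)
    P_R = {True: 0.8, False: 0.2}                    # P(Rain = True | Cloudy)
    P_W = {(True, True): 0.99, (True, False): 0.90,  # P(WetGrass = True | Sprinkler, Rain)
           (False, True): 0.90, (False, False): 0.00}

    p_w = 0.0         # P(W = True)
    p_w_and_r = 0.0   # P(W = True, R = True)
    for c, s, r in product([True, False], repeat=3):
        joint = ((P_C if c else 1 - P_C)
                 * (P_S[c] if s else 1 - P_S[c])
                 * (P_R[c] if r else 1 - P_R[c])
                 * P_W[(s, r)])
        p_w += joint
        if r:
            p_w_and_r += joint

    print(f"P(R | W) = {p_w_and_r / p_w:.3f}")   # ~0.708 with these placeholder CPTs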
Certainty Factors (CFs)
Certainty Factors (CFs) were introduced in expert systems to deal with uncertainty in rule-based reasoning, especially in medical diagnosis systems like MYCIN.
(MYCIN: an expert system for diagnosing bacterial infections and recommending antibiotic treatments, designed to help doctors.)
Definition:
A certainty factor (CF) represents the degree of belief or disbelief in a hypothesis based
on evidence.
Certainty factor: CF(H | E) = MB(H | E) − MD(H | E)
where: MB(H | E) = Measure of Belief (how strongly the evidence E supports H).
MD(H | E) = Measure of Disbelief (how strongly the evidence E contradicts H).
Range:
- CF in [-1, 1]
- +1 → Complete certainty (absolute belief in H )
- 0 → No information (neutral belief)
- -1→ Complete disbelief (absolute rejection of H )
Example: In a medical diagnosis system, suppose:
- A cough increases the Measure of Belief in bronchitis by 0.7 (MB = 0.7).
- The absence of a cough decreases belief, i.e., the Measure of Disbelief is 0.2 (MD = 0.2).
CF(Bronchitis | Cough) = 0.7 - 0.2 = 0.5.
Thus, the certainty of having bronchitis given a cough is 50%.
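A minimal sketch of the same calculation in code (using the numbers from the bronchitis example above):

    # Certainty factor = measure of belief minus measure of disbelief, in [-1, 1].
    def certainty_factor(mb: float, md: float) -> float:
        return mb - md

    cf = certainty_factor(mb=0.7, md=0.2)
    print(cf)   # 0.5 -> moderate belief that the patient has bronchitis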
Bayesian Networks
•Directed Acyclic Graphs (DAGs) where nodes represent random variables and edges represent conditional dependencies.
•Each node has a conditional probability table (CPT).
Example: A Bayesian Network for diagnosing Flu might have nodes for:
•Symptoms (Fever, Cough, Fatigue)
•Diseases (Flu, Cold)
•Test Results
P(Flu | Fever, Cough) = [ P(Fever, Cough | Flu) · P(Flu) ] / P(Fever, Cough)
Advantages:
✔ Efficiently models causality
✔ Updates beliefs using Bayes’ Theorem
✔ Used in medical diagnosis, fraud detection, speech recognition
(Causality means the relationship between cause and effect, e.g., "The causality between smoking and lung disease is well established.")
Key Concepts in Dempster–Shafer Theory (DST)
1. Frame of Discernment (Θ)
A finite set of mutually exclusive and exhaustive hypotheses.
Example: To determine the weather, we define:
Θ = {Sunny, Rainy, Cloudy}
Plausibility (Pl)
Represents how plausible an event is, considering both direct and indirect support.
- Pl(A) represents the plausibility of A.
- It is computed as:
Pl(A) = 1 − Bel(¬A)
where Bel(¬A) represents the degree of certainty that A is false.
- Range of Uncertainty:
The difference between Belief and Plausibility,
Pl(A) − Bel(A), represents the degree of ignorance/uncertainty.
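A quick numeric illustration (the numbers are assumed for illustration, not from the slides): if Bel(Rainy) = 0.5 and Bel(¬Rainy) = Bel({Sunny, Cloudy}) = 0.3, then Pl(Rainy) = 1 − 0.3 = 0.7, and the ignorance about "Rainy" is Pl(Rainy) − Bel(Rainy) = 0.7 − 0.5 = 0.2.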
Example: Diagnosing a Disease using DST
A doctor is diagnosing a disease, and there are two possible causes:
- Flu (F)
- Cold (C)
Let’s define the Frame of Discernment:
Θ = { F, C }
A test result suggests some evidence, but it is uncertain.
The doctor assigns belief as follows:
The individual masses from the two evidence sources, m1 and m2, are then combined using Dempster's Rule of Combination:
m12(A) = m3(A) = [ Σ over B ∩ C = A of m1(B) · m2(C) ] / (1 − K)
where K = Σ over B ∩ C = ∅ of m1(B) · m2(C) is the mass of conflict between the two sources.
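A minimal sketch of Dempster's rule for the frame Θ = {F, C} (the mass assignments m1 and m2 below are assumed for illustration, since the doctor's actual assignments are not shown here):

    # Dempster's rule of combination over Θ = {F, C}.
    # Focal elements are frozensets; THETA itself represents ignorance.
    # Mass values are ILLUSTRATIVE assumptions.
    F, C = "F", "C"
    THETA = frozenset({F, C})

    m1 = {frozenset({F}): 0.6, THETA: 0.4}                        # evidence source 1
    m2 = {frozenset({F}): 0.5, frozenset({C}): 0.3, THETA: 0.2}   # evidence source 2

    combined = {}
    conflict = 0.0   # K: total mass assigned to the empty intersection
    for b, mass_b in m1.items():
        for c_, mass_c in m2.items():
            inter = b & c_
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mass_b * mass_c
            else:
                conflict += mass_b * mass_c

    m3 = {a: v / (1.0 - conflict) for a, v in combined.items()}   # normalize by 1 - K
    for a, v in sorted(m3.items(), key=lambda kv: -kv[1]):
        print(set(a), round(v, 3))

With these assumed masses the combined evidence concentrates on Flu: m3({F}) ≈ 0.76, m3({C}) ≈ 0.15, m3(Θ) ≈ 0.10.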