
UNIT II

Symbolic Logic and Statistical Reasoning

Topics: Symbolic Logic, Non-monotonic Reasoning, Logics for Non-monotonic Reasoning, Statistical Probability and Bayes' Theorem, Certainty Factors, Probabilistic Graphical Models, Bayesian Statistics, Markov Networks, Fuzzy Logic.



What is Symbolic Logic?
- Symbolic logic uses mathematical symbols to express logical statements and relationships.
- It is based on formal logic, which provides a structured way to represent and reason about facts and rules.

Importance of Symbolic Logic in AI

Symbolic logic is used in AI to enable machines to:

✅ Represent Knowledge – Store facts and rules in a structured format.
✅ Perform Logical Reasoning – Infer new facts based on existing information.
✅ Make Decisions – Solve problems using logical rules.
✅ Explain Reasoning – Unlike neural networks, symbolic AI provides transparent decision-making.



Components of Symbolic Logic
(i) Propositional Logic (Boolean Logic)

•Deals with simple statements (propositions) that can be true or false.


•Uses logical connectives like AND (∧), OR (∨), NOT (¬), IF-THEN (→), and IF-AND-ONLY-IF (↔).

•Example:

• "If it rains, the ground is wet."


• R→ W (where R = It rains, W = The ground is wet)
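A minimal Python sketch (not from the slides) of how such a rule can be checked mechanically: material implication R → W is false only in the single case where R is true and W is false.

```python
# Truth table for the rule R -> W ("if it rains, the ground is wet").
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication p -> q: false only when p is True and q is False."""
    return (not p) or q

for r, w in product([True, False], repeat=2):
    print(f"R={r}, W={w}: R -> W is {implies(r, w)}")
```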



(ii) First-Order Logic (Predicate Logic)

•Extends propositional logic by introducing quantifiers (∀, ∃) and relations.


•Allows reasoning with objects, properties, and relationships.
•Example:
◦"All humans are mortal."
◦∀x(Human(x)→Mortal(x))

(iii) Knowledge Representation


AI systems use symbolic logic to store and manipulate facts in knowledge
bases.
Example: Expert Systems use IF-THEN rules for decision-making.



Symbolic Reasoning & Certainty
[Figure: syllogism example over sets C ⊆ J ⊆ B, e.g., Dogs ⊆ Animals ⊆ Living Beings]



Evaluating Conclusions
Conclusion 1: "All B's are C's" ❌ (False)

• We only know that all C's are B's, not that all B's are C's.
• Example:
  ◦ Suppose C = Dogs, J = Animals, B = Living Beings.
  ◦ All dogs are animals and all animals are living beings, but not all living beings are dogs.
  ◦ Hence, this conclusion is false.
Conclusion 2: "Some J's are C's" ✅ (True)

• From C ⊆ J, we know that every C is inside J.
• Therefore, some J's (at least those that are C's) are C's. Hence, this conclusion is true.
Symbolic Reasoning under Uncertainty

• Symbolic reasoning under uncertainty deals with situations where we do not have complete or consistent knowledge, but still need to make logical inferences.
• It is often used for decision-making when information is incomplete, imprecise, or conflicting.
• Uncertainty is handled using Fuzzy Logic, Probabilistic Reasoning, and Belief Networks.
Example: the relative costs of pens, pencils, and erasers.



Let's define some uncertain cost relationships in symbolic logic:

(Rule 1) Pens cost more than Pencils: Pens > Pencils
(Rule 2) Pens cost less than Erasers: Pens < Erasers
(Rule 3) Erasers cost more than both Pens and Pencils: Erasers > Pencils, Erasers > Pens

Observations from Symbolic Reasoning:

✔ Pens are costlier than Pencils ✅ (Pens > Pencils)
✔ Pens are cheaper than Erasers ✅ (Pens < Erasers)
✔ Erasers are the most expensive item ✅ (Erasers > Pencils and Pens)

This shows how symbolic reasoning under uncertainty helps AI make logical inferences.
The Future: Hybrid AI (Symbolic + Statistical AI)

To address the limitations of purely symbolic systems, AI researchers are now combining symbolic logic with machine learning (neural networks, probabilistic AI).

This hybrid approach allows:

✅ Interpretable AI – Symbolic logic makes AI explainable.
✅ Learning from Data – Statistical AI (ML) enhances adaptability.
✅ Better Reasoning & Decision-Making – Using logic + probabilities together.



How Statistical Reasoning Helps Symbolic Logic

• Symbolic logic is great for structured reasoning and rule-based decision-making, but it struggles with uncertainty, ambiguity, and learning from data.
• Statistical reasoning helps by adding probabilistic methods that enable AI to handle real-world uncertainty.
• The combination of symbolic logic + statistical reasoning creates hybrid AI systems that are both interpretable and adaptive.



How Statistical Reasoning Extends Symbolic Logic

Statistical reasoning adds probability and data-driven learning to symbolic logic, making it more flexible and powerful.

Approaches to Reasoning
There are four main approaches to reasoning under uncertainty:

(i) Probabilistic Logic (for Uncertainty Handling)

•Probabilistic Logic extends symbolic logic by allowing degrees of truth instead of just
true/false.
•Example:
◦Instead of "If fever, then flu," we say:
•P(Flu | Fever) = 0.7 (70% probability)
◦AI can now make informed decisions even with uncertainty.

(ii) Bayesian Networks (for Probabilistic Reasoning)

•Bayesian Networks (BNs) combine symbolic logic (causal relationships) with statistical
reasoning (probabilities).
•Example:
◦A medical diagnosis uses a Bayesian Network to reason:
•P(Flu | Fever, Cough, Sore throat) = 0.85
◦The AI updates probabilities dynamically as new information arrives.
(iii) Machine Learning (for Rule Discovery)
Instead of manually defining rules, machine learning (ML) can discover patterns from
data.
◦ Example:
◦ Instead of a human defining rules like:
◦ "If a customer buys a laptop, they might buy a mouse."
◦ AI learns this rule from sales data using association rule mining or probabilistic
models.
(iv) Fuzzy Logic (for Handling Vagueness)

•Fuzzy Logic extends symbolic logic by allowing "degrees of truth" rather than just
true/false.
•Example:
◦Instead of a hard rule:
"If temperature > 38°C, then fever,"

◦Fuzzy logic says:


•P(Fever | Temp = 37.5°C) = 0.6
•P(Fever | Temp = 39°C) = 0.9
◦AI can make more human-like, flexible decisions.



q Non-Monotonic Reasoning
Non-monotonic reasoning is a type of reasoning where conclusions can be
withdrawn or modified based on new information.
This differs from monotonic reasoning, where once a conclusion is drawn, it
cannot be retracted even if new information is added.

Why is Non-Monotonic Reasoning Important?


•Real-world knowledge is incomplete and uncertain.
•AI systems must adapt and revise conclusions as they receive new data.
•Used in expert systems, planning, diagnosis, and natural language
understanding.



Logics for Non-Monotonic Reasoning
i) Default Logic

• Introduced by Ray Reiter (1980).
• Uses default rules (assumptions) that hold unless contradicted.
Example:
• Rule: "Birds typically fly."
• Fact: Tweety is a bird.
• Conclusion: Tweety can fly. ✅
• New Fact: Tweety is a penguin (a black and white bird, found mainly in the Antarctic, that cannot fly).
• Revised Conclusion: Tweety cannot fly. ❌
This shows how a default assumption is retracted when new information appears.
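A minimal Python sketch (a toy illustration, not from the slides) of this retraction behavior: the default "birds fly" holds until an exception is added to the fact set.

```python
# Default (non-monotonic) reasoning over a toy knowledge base.
def can_fly(facts: set) -> bool:
    """Apply the default rule 'birds typically fly' unless contradicted."""
    if "bird" not in facts:
        return False
    if "penguin" in facts:          # known exception defeats the default
        return False
    return True                     # default assumption holds

facts = {"bird"}
print(can_fly(facts))               # True  -- default conclusion
facts.add("penguin")                # new information arrives
print(can_fly(facts))               # False -- conclusion is retracted
```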
ii) Autoepistemic Logic

• Introduced by Robert Moore (1985).
• Extends modal logic to represent an agent's beliefs about its own knowledge.
Example:
• "If I do not believe Tweety cannot fly, then I assume Tweety can fly."
This logic allows AI to reflect on its knowledge and revise beliefs when new facts appear.

iii) Circumscription

• Introduced by John McCarthy (1980).
• Works by minimizing assumptions: it assumes something is false unless proven true.
Example:
"Birds can fly" unless given information about exceptions (penguins, ostriches, etc.).
When new information (Tweety = penguin) appears, AI "circumscribes" its knowledge, updating beliefs.



(iv) Argumentation Logic

• AI systems argue for and against conclusions, weighing evidence.
• Used in law, automated decision-making, and AI assistants.
Example:
• Argument 1: "Birds fly → Tweety flies."
• Argument 2: "Tweety is a penguin → Penguins don't fly → Tweety doesn't fly."
AI evaluates both and selects the stronger argument.

Real-World Applications of Non-Monotonic Reasoning

✅ Expert Systems – used in medicine.
✅ Robotics – Robots adjust plans when obstacles are detected.
✅ Self-Driving Cars – AI updates routes based on new traffic data.
✅ Chatbots & AI Assistants – Chatbots revise responses based on user input.



Statistical Probability
Statistical probability deals with measuring the likelihood of an event
occurring.
Formula:

P(A) = (Number of favorable outcomes) / (Total number of outcomes)

where:
P(A) is the probability of event A.
Favorable outcomes are the cases where event A occurs.
Total outcomes are all possible cases in the experiment.

Bayes' Theorem.
Bayes’ theorem is a way to update our beliefs based on new evidence.
It relates conditional probabilities and is expressed as:

P(A | B) = [P(B | A) · P(A)] / P(B)
where:
•P(A ∣ B) is the probability of event A, given event B occurred.
•P(B ∣ A) is the probability of event B, given A.
•P(A) and P(B) are the individual probabilities of A and B.

Example: Medical Testing

Let’s say:
• 1% of a population has a rare disease: P(D) = 0.01.
• A test for the disease (T) detects it 90% of the time when the disease is present: P(T | D) = 0.9.
• The test gives a false positive 5% of the time: P(T | ¬D) = 0.05.
• We want to find P(D | T), the probability that a person actually has the disease given that they tested positive.

Using Bayes' Theorem:

P(D | T) = [P(T | D) · P(D)] / [P(T | D) · P(D) + P(T | ¬D) · P(¬D)]
         = (0.9 × 0.01) / ((0.9 × 0.01) + (0.05 × 0.99))
         = 0.009 / (0.009 + 0.0495)
         = 0.009 / 0.0585 ≈ 0.154
So, even if a person tests positive, they only have a 15.4% chance of actually having the
disease!
This demonstrates why statistical reasoning is important in interpreting test results.
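A short Python check of the arithmetic above (an illustrative sketch):

```python
# Bayes' theorem for the medical-test example.
p_d = 0.01              # prior: P(D), 1% of the population has the disease
p_t_given_d = 0.90      # sensitivity: P(T | D)
p_t_given_not_d = 0.05  # false-positive rate: P(T | not D)

# Law of total probability for P(T)
p_t = p_t_given_d * p_d + p_t_given_not_d * (1 - p_d)

# Bayes' theorem: P(D | T)
p_d_given_t = p_t_given_d * p_d / p_t
print(round(p_d_given_t, 3))   # ~0.154, i.e. about a 15.4% chance
```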
Example:
A fair die has six faces numbered {1,2,3,4,5,6}.
The probability of rolling a 4 is:
P(4) = 1/6
If we roll two dice, the probability of getting a sum of 7 is found by
listing all favorable cases:
(1,6), (2,5), (3,4), (4,3), (5,2), (6,1) → 6 favorable outcomes.
Total outcomes = 6×6 = 36.
P(sum = 7) = 6/36 = 1/6
This demonstrates how probability quantifies uncertainty.
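A brute-force enumeration in Python (an illustrative sketch) confirms the count:

```python
# Enumerate two dice to check P(sum = 7) = 6/36 = 1/6.
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))     # 36 equally likely outcomes
favorable = [o for o in outcomes if sum(o) == 7]    # (1,6), (2,5), ..., (6,1)
print(len(favorable), len(outcomes))                # 6 36
print(Fraction(len(favorable), len(outcomes)))      # 1/6
```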

Key points: Statistical probability quantifies how likely an event is to happen.


Bayes' Theorem helps update our beliefs based on new evidence.
Example of probability: Dice rolling shows how to compute likelihoods.
Example of Bayes' theorem: Medical testing shows that even a positive result
doesn’t always mean certainty.
Bayesian Network (BN)
• A Bayesian Network is a probabilistic graphical model that represents a set of variables and their conditional dependencies using a Directed Acyclic Graph (DAG).
• It helps in reasoning under uncertainty by encoding joint probability distributions in a structured way.
Example: Weather & Lawn Watering System
We consider four variables:

1. Cloudy (C) – Whether it is cloudy.


2. Rainy (R) – Whether it is raining.
3. Sprinkler (S) – Whether the sprinkler is on.
4. Wet Grass (W) – Whether the grass is wet.
Relationships:
- If Cloudy (C) is True, it influences Rain (R) and Sprinkler (S).
- If it rains (R=True), the grass can get wet (W=True).
- If the sprinkler is on (S=True), it can also make the grass wet (W=True).
Bayesian Network Structure (DAG)
The Directed Acyclic Graph (DAG) representing the relationships is as follows:

[Figure: DAG with edges C → S, C → R, S → W, R → W]

Here:
- Cloudy (C) is the root cause.
- Sprinkler (S) and Rain (R) depend on Cloudy.
- Wet Grass (W) is influenced by both Sprinkler (S) and Rain (R).

From the figure, each arrow represents a conditional dependence between the variables.



[Figure: conditional probability tables (CPTs) for Cloudy, Sprinkler, Rain, and Wet Grass]
Inference Example
If we observe that the grass is wet (W = True), we can use Bayesian inference to determine the probability that it rained, given that the sprinkler was off.

Using Bayes' Theorem:

P(R | W) = [P(W | R) · P(R)] / P(W)

This can be computed by summing over the possible cases.

Applications of Bayesian Networks
- Medical Diagnosis (ex: detecting diseases based on symptoms)
- Spam Filtering (ex: detecting spam emails)
- Fault Detection (ex: identifying failures in machines)
- AI and Robotics (ex: decision-making under uncertainty)
Computing a specific probability using these tables
Let's compute a specific probability using the Bayesian Network.
Problem: Given that the grass is wet (W = True), what is the probability that it rained, P(R | W)?
We will use Bayes' Theorem:

P(R | W) = [P(W | R) · P(R)] / P(W)

where:

- P(W | R) is the probability of wet grass given that it rained.


- P(R) is the prior probability of rain.
- P(W) is the total probability of wet grass
(computed using the Law of Total Probability).
Step 1: Compute P(W) (the total probability of wet grass)
Using the law of total probability:
P(W) = P(W | S, R) P(S, R) + P(W | S, ¬R) P(S, ¬R) +
       P(W | ¬S, R) P(¬S, R) + P(W | ¬S, ¬R) P(¬S, ¬R)

From the tables:

- P(S | C = T) = 0.1, P(S | C = F) = 0.5
- P(R | C = T) = 0.8, P(R | C = F) = 0.2
- P(C = T) = 0.5, P(C = F) = 0.5

We compute the joint probabilities (here C = Cloudy, S = Sprinkler, R = Rain) by summing over C:

P(S, R) = Σ_C P(S | C) · P(R | C) · P(C)

Let's calculate P(W) numerically.

Step 2: Compute P(R | W)
Once we have P(W), apply Bayes' theorem:

P(R | W) = [P(W | R) · P(R)] / P(W)

Now compute these probabilities numerically.

The probability that it rained given that the grass is wet,


P(R | W) , is 0.688 or 68.8%.

This means that if we observe the grass is wet, there is a 68.8% chance that it was due
to rain rather than just the sprinkler.
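A small enumeration sketch in Python: P(C), P(S | C), and P(R | C) use the values quoted above, while the wet-grass CPT P(W | S, R) is an assumed placeholder (the actual numbers are in the slide's tables), so the printed value (about 0.70) is close to, but not exactly, the 0.688 obtained from the tables.

```python
# Enumeration of P(R | W) in the cloudy/sprinkler/rain/wet-grass network.
p_c = {True: 0.5, False: 0.5}                  # P(C)
p_s_given_c = {True: 0.1, False: 0.5}          # P(S=True | C)
p_r_given_c = {True: 0.8, False: 0.2}          # P(R=True | C)
p_w_given_sr = {                               # assumed placeholder P(W=True | S, R)
    (True, True): 0.99, (True, False): 0.90,
    (False, True): 0.90, (False, False): 0.0,
}

p_w = 0.0      # accumulates P(W=True)
p_rw = 0.0     # accumulates P(R=True, W=True)
for c in (True, False):
    for s in (True, False):
        for r in (True, False):
            p_s = p_s_given_c[c] if s else 1 - p_s_given_c[c]
            p_r = p_r_given_c[c] if r else 1 - p_r_given_c[c]
            joint = p_c[c] * p_s * p_r * p_w_given_sr[(s, r)]   # weight of W=True
            p_w += joint
            if r:
                p_rw += joint

print(round(p_rw / p_w, 3))   # P(R | W), ~0.70 with the placeholder CPT
```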

q Certainty Factors (CFs)
Certainty Factors (CFs) were introduced in expert systems to deal with uncertainty in
rule-based reasoning, especially in medical diagnosis systems like MYCIN.
(MYCIN : diagnosing bacterial infections and recommending antibiotic treatments. It was
designed to help doctors)

Definition:
A certainty factor (CF) represents the degree of belief or disbelief in a hypothesis based
on evidence.
certainty factor CF(H | E) = MB(H | E) - MD(H | E)
where: MB(H | E) = Measure of Belief (how strongly the evidence E supports H)
       MD(H | E) = Measure of Disbelief (how strongly the evidence E refutes H)
Range:
- CF in [-1, 1]
- +1 → Complete certainty (absolute belief in H )
- 0 → No information (neutral belief)
- -1→ Complete disbelief (absolute rejection of H )
Example: In a medical diagnosis system, suppose:
- A cough increases : Measure of Belief in bronchitis by 0.7 (MB = 0.7).
- Absence of a cough decreases belief by
ie Measure of DisBelief 0.2 (MD = 0.2).
CF(Bronchitis | Cough) = 0.7 - 0.2 = 0.5.
Thus, the certainty of having bronchitis given a cough is 50%.

Combining Certainty Factors


If multiple independent pieces of evidence E1, E2, ... bear on H, their CFs can be combined:

CF_combined = CF1 + CF2 · (1 - CF1)                     if both CFs are positive
CF_combined = CF1 + CF2 · (1 + CF1)                     if both CFs are negative
CF_combined = (CF1 + CF2) / (1 - min(|CF1|, |CF2|))     if the CFs have opposite signs (conflicting evidence)
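A small Python sketch of this combination rule (the both-negative case follows the standard MYCIN formulation):

```python
# MYCIN-style certainty-factor combination.
def combine_cf(cf1: float, cf2: float) -> float:
    """Combine two certainty factors for the same hypothesis."""
    if cf1 >= 0 and cf2 >= 0:                 # both supportive
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:                   # both against
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))   # conflicting evidence

print(round(combine_cf(0.5, 0.4), 3))    # 0.7    -- two supporting pieces of evidence
print(round(combine_cf(0.5, -0.3), 3))   # 0.286  -- conflicting evidence
```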
a)Probabilistic Graphical Models (PGMs)
Probabilistic Graphical Models (PGMs) are graph-based representations of probabilistic
relationships between variables.
Why Use PGMs?

•They handle uncertainty and dependencies in complex problems.


•They compress large probability tables into efficient structures.
•Used in AI, machine learning, speech recognition, and bioinformatics.
Types of PGMs

(a) Bayesian Networks (BNs)

•Directed Acyclic Graphs (DAGs) where nodes represent random variables and edges
represent conditional dependencies.
•Each node has a conditional probability table (CPT).
Example: A Bayesian Network for diagnosing Flu might have nodes for:
Symptoms (Fever, Cough, Fatigue), Diseases (Flu, Cold), and Test Results.

P(Flu | Fever, Cough) = [P(Fever, Cough | Flu) · P(Flu)] / P(Fever, Cough)
Advantages:
✔ Efficiently models causality
✔ Updates beliefs using Bayes’ Theorem
✔ Used in medical diagnosis, fraud detection, speech recognition

(b) Markov Networks (Undirected PGMs)


Uses undirected graphs, representing mutual dependencies.
Instead of conditional probabilities, it uses potential functions to define
relationships.
Example: A Markov Network for image recognition could have:
- Pixels as nodes.
- Dependencies between neighboring pixels.

P(X) = (1/Z) ∏_C φ_C(X_C)

where Z is a normalization constant, and φ_C(X_C) are clique potentials capturing relationships.

Advantages:

✔ Works well in spatial relationships (image processing, NLP)


✔ No need for causality, just dependencies.

(Causality means the relationship between cause and effect, e.g., "The causality between smoking and lung disease is well established.")
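A tiny Python sketch of the potential-function idea, for two binary variables with a single pairwise clique potential that favors agreement (the potential values are illustrative assumptions):

```python
# Pairwise Markov network over two binary variables X1, X2:
# P(x1, x2) = phi(x1, x2) / Z.
from itertools import product

def phi(x1: int, x2: int) -> float:
    return 2.0 if x1 == x2 else 1.0      # higher potential when neighbors agree

Z = sum(phi(a, b) for a, b in product([0, 1], repeat=2))   # normalization constant
for a, b in product([0, 1], repeat=2):
    print(a, b, phi(a, b) / Z)           # e.g. P(0,0) = 2/6 ≈ 0.333
```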



Both CFs and PGMs are essential for handling uncertainty in AI, machine learning, and real-world decision-making.



Dempster-Shafer Theory (DST)
• The Dempster-Shafer Theory (DST), also called the Theory of Belief Functions, is a mathematical framework for reasoning with uncertainty.
• It is a generalization of probability theory, allowing us to model both uncertainty and ignorance.

Key Differences from Probability Theory.


- Probability assigns precise values to events.
- DST allows for partial belief, meaning we can assign belief to sets of
possibilities instead of single outcomes.
- It distinguishes between uncertainty and ignorance, meaning that if we
are unsure about an event, we do not need to forcefully distribute belief
among the available choices.

Key Concepts in DST
1. Frame of Discernment (Θ)
A finite set of mutually exclusive and exhaustive hypotheses.
Example: If we want to determine the weather, we define:
Θ = {Sunny, Rainy, Cloudy}

2. Basic Probability Assignment (BPA) / Mass Function m(A)

- Assigns belief to a set of possibilities rather than a single hypothesis.
- The sum of all assignments must be 1: Σ_A m(A) = 1
- If no belief is assigned to a subset, that represents ignorance.

3. Belief Function Bel(A)

- Represents the total belief fully committed to a hypothesis.
- It sums the masses of all subsets of A: Bel(A) = Σ_{B ⊆ A} m(B)
4. Plausibility Function Pl(A)
- Plausibility refers to the quality of seeming reasonable, believable, or credible. Something is plausible if it appears to be true or possible based on the available evidence, even if it has not been proven definitively. It is often used to evaluate the likelihood of an explanation, argument, or scenario.
- Pl(A) represents how plausible A is, considering both direct and indirect support.
- It is computed as: Pl(A) = 1 - Bel(¬A), where Bel(¬A) is the degree of certainty that A is false.
- Range of Uncertainty: the difference Pl(A) - Bel(A) represents the degree of ignorance/uncertainty about A.
Example: Diagnosing a Disease using DST
A doctor is diagnosing a disease, and there are two possible causes:
- Flu (F)
- Cold (C)
Let’s define the Frame of Discernment:
Θ = { F, C }
A test result suggests some evidence, but it is uncertain.
The doctor assigns belief as follows:



- m(F) = 0.6 (60% belief it is Flu)
- m(C) = 0.2 (20% belief it is Cold)
- m(F, C) = 0.2 (20% belief remains uncertain)
Belief & Plausibility Calculation
- Belief in Flu:
Bel(F) = m(F) = 0.6
- Plausibility in Flu:
Pl(F) = m(F) + m(F, C) = 0.6 + 0.2 = 0.8
- Belief in Cold:
Bel(C) = m(C) = 0.2
- Plausibility in Cold:
Pl(C) = m(C) + m(F, C) = 0.2 + 0.2 = 0.4
Thus, the degree of uncertainty (ignorance) about Flu is:
Pl(F) - Bel(F) = 0.8 - 0.6 = 0.2
which means there is 20% uncertainty regarding Flu.
Dempster's Rule of Combination
If we have two independent sources of evidence (here, about Flu vs. Cold),
DST provides a method to combine them using Dempster's Rule of Combination:

m12(A) = m3(A) = [ Σ_{B ∩ C = A} m1(B) · m2(C) ] / (1 - K)

where m1 and m2 are the two independent pieces of evidence (mass functions), and
K is the conflict between them.
Example: Combining Two Test Results
1. First test results:
- m1(F) = 0.5 , m1(C) = 0.3 , m1(F, C) = 0.2
2. Second test results:
- m2(F) = 0.4 , m2(C) = 0.4 , m2(F, C) = 0.2
Dempster’s Rule, we compute the combined belief, resolving conflicts between the two
sources.
Solving the example using Dempster's Rule of Combination to combine two independent sources of evidence.
Problem: Diagnosing a Disease. A doctor is diagnosing a patient and has two independent test results suggesting whether the patient has Flu (F) or Cold (C).
-------------------------------------------------------------------------------------------------

Step 1: Define the Frame of Discernment


Possible diagnoses:
Θ={F,C}
where F = Flu , C = Cold
{F,C} = Uncertainty (not sure whether it's Flu or Cold)
Each test provides a Basic Probability Assignment (BPA).



Step 3: Dempster’s Rule of Combination
Dempster’s Rule is given by:

m1,2(A) = m3(A) = [ Σ_{B ∩ C = A} m1(B) · m2(C) ] / (1 - K)

where K is the conflict factor, calculated as:
K = Σ_{B ∩ C = ∅} m1(B) · m2(C)

Let's compute K = m1(F)·m2(C) + m1(C)·m2(F)
and apply Dempster's Rule to combine the two test results.



Step 4: Final Computed Values

After applying Dempster’s Rule of Combination, we get:


•Conflict Factor K=0.32 (32% of the evidence is conflicting)
•Combined belief in Flu m12(F) =m3(F)=0.559 (55.9%)
•Combined belief in Cold m12(C) = m3(C)=0.382 (38.2%)
• Uncertainty m12(Θ)= m3(F,C)= 0.059 (5.9%)
Conclusion

After combining the two test results:


•The belief in Flu increased from 50% to 55.9%.
•The belief in Cold increased slightly from 30% to 38.2%.
•Uncertainty is reduced from 20% to 5.9%, meaning we are now more confident in the
diagnosis.
This shows how Dempster-Shafer Theory helps in combining uncertain evidence while resolving conflicts.
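A short Python sketch that reproduces the numbers above (focal sets are encoded as "F", "C", and "FC" for the whole frame Θ):

```python
# Dempster's Rule of Combination for the two test results.
m1 = {"F": 0.5, "C": 0.3, "FC": 0.2}
m2 = {"F": 0.4, "C": 0.4, "FC": 0.2}

def intersect(a: str, b: str) -> str:
    """Intersection of focal sets in the two-hypothesis frame {F, C}."""
    if a == "FC":
        return b
    if b == "FC":
        return a
    return a if a == b else ""        # "" denotes the empty set (conflict)

combined = {"F": 0.0, "C": 0.0, "FC": 0.0}
conflict = 0.0
for a, ma in m1.items():
    for b, mb in m2.items():
        target = intersect(a, b)
        if target:
            combined[target] += ma * mb
        else:
            conflict += ma * mb       # mass assigned to conflicting pairs

combined = {k: v / (1 - conflict) for k, v in combined.items()}
print(round(conflict, 2))                             # K = 0.32
print({k: round(v, 3) for k, v in combined.items()})  # F: 0.559, C: 0.382, FC: 0.059
```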
What is Fuzzy Logic?
Fuzzy Logic is a mathematical approach to handling imprecise, uncertain, or vague
information.
Unlike traditional Boolean logic (where values are either True (1) or False (0)), fuzzy
logic allows for partial truth between 0 and 1.

Example of Fuzzy Logic.


•In Boolean logic:
◦"Is the temperature hot?" → Answer: Yes (1) or No (0)

•In Fuzzy Logic:


◦"How hot is the temperature?" → Answer: 0.7 (moderately hot) or 0.2 (slightly hot)
Fuzzy logic is widely used in artificial intelligence, control systems, robotics, and decision-making applications.
What are Fuzzy Sets?
A fuzzy set is a collection of elements where each element has a degree of membership
between 0 and 1.
In traditional sets (crisp sets):
•An element either belongs to a set (1) or does not belong (0).
• Example: In a crisp set of "Tall People", {x | x > 6 feet}, a person of 5.9 feet is not included.
In fuzzy sets:
•An element can partially belong to a set.
•Example: The fuzzy set "Tall People" may have:
◦5.5 feet → 0.3 (somewhat tall)
◦5.9 feet → 0.6 (moderately tall)
◦6.2 feet → 0.9 (very tall)
Mathematically, a fuzzy set A is defined as: A = { (x, μA(x)) | x ∈ X }
where x is an element in the universe of discourse, and μA(x) is the membership function, which assigns a value between 0 and 1 indicating how strongly x belongs to the set A.
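A minimal membership-function sketch in Python, assuming an illustrative linear ramp between 5.0 ft and 6.5 ft (so the outputs only roughly match the example degrees above):

```python
# Fuzzy membership function for the set "Tall People".
def tall_membership(height_ft: float) -> float:
    """Degree of membership mu_Tall(height) in [0, 1]."""
    low, high = 5.0, 6.5          # assumed ramp endpoints for illustration
    if height_ft <= low:
        return 0.0
    if height_ft >= high:
        return 1.0
    return (height_ft - low) / (high - low)

for h in (5.5, 5.9, 6.2):
    print(h, round(tall_membership(h), 2))   # 0.33, 0.6, 0.8 -- partial membership
```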