
Logical reasoning.

Reasoning with uncertainty.


Certainty factor (CF)
Logical reasoning
• Logic – The science of correct reasoning.
• Reasoning – The drawing of inferences or
conclusions from known or assumed facts.
When solving a problem, one must understand the question, gather all pertinent facts, and analyze the problem, e.g. compare it with previous problems (noting similarities and differences), perhaps using pictures or formulas to solve it.
Logical reasoning
• Logical reasoning supports attaching meaning to the information to be analyzed.
• The information to be analyzed is often incomplete, and its meaning can be affected by differences in context.
• Three distinguishable types of logical reasoning are commonly identified: abductive, deductive and inductive.
Comparing abduction, deduction and induction

Abduction:
  rule:         All balls in the box are black.      A => B
  observation:  These balls are black.               B
  explanation:  These balls are from the box.        therefore, possibly A

Deduction:
  major premise: All balls in the box are black.     A => B
  minor premise: These balls are from the box.       A
  conclusion:    These balls are black.              therefore, B

Induction:
  case:              These balls are from the box.   whenever A ...
  observation:       These balls are black.          ... then B
  hypothesized rule: All balls in the box are black. therefore, possibly A => B
Abduction reasons from effects to causes
Deduction reasons from causes to effects
Induction reasons from specific cases to general rules
Abductive reasoning
“Abduction is no more nor less than guessing” (Charles Sanders Peirce).

Users of applications often make wrong assumptions about a program’s actions.

If one does not eat healthy food, one will become ill.
One is ill.
Therefore, one has not eaten healthy food.

All beans from this bag are white.
These beans are white.
Therefore, these beans are from this bag.
Deductive reasoning
• Syllogism: An argument composed of two
statements or premises (the major and minor
premises), followed by a conclusion.
• For any given set of premises, if the conclusion is guaranteed, the argument is said to be valid.
• If the conclusion is not guaranteed (there is at least one instance in which the conclusion does not follow), the argument is said to be invalid.
Deductive reasoning
Deductive Reasoning is a type of logic in which one goes from a general statement to
a specific instance. It uses facts, rules, definitions or properties to arrive at a
conclusion.

All students eat pizza.
Jim is a university student.
Therefore, Jim eats pizza.

All athletes work out in the gym.
Boney is an athlete.
Therefore, Boney works out in the gym.

90% of humans are right-handed.
Joe is human.
Therefore, Joe is right-handed.

During winter it is cold outside.
If it is cold outside, I will not go outside.
It is winter, so I will not go outside.
Types of Syllogisms
• Three different kinds of syllogism:
  1. Categorical – (all / every)
  2. Conditional – (if / then)
  3. Disjunctive – (either / or)
Categorical (all / every)
• Something true about all members of a category must be true
about a particular example.
• Expressed using variables:
Major: A is true about B.
Minor: C is equivalent to B.
Conclusion: A is also true about C.

• Example:
  Major: All men are mortal.
  Minor: Socrates is a man.
  Conclusion: Therefore, Socrates must be mortal.
Conditional (If / then)
• If a first assumption proves to be true, then a logically related
assumption must also be true.

• Expressed using variables:
  Major: If A is true, then B is true as well.
  Minor: A is true.
  Conclusion: B must be true as well.

• Example:
  Major: If there’s smoke, then there’s fire.
  Minor: There is smoke.
  Conclusion: Therefore, there must be a fire.
Disjunctive (Either / or)
• If a first assumption proves to be true, then a related but
contradictory assumption must be false.

• Expressed using variables:
  Major: If A is true, then B must be false.
  Minor: A is true.
  Conclusion: B must be false.

• Example:
  Major: Either he pays his taxes, or he gets audited by the IRS.
  Minor: He paid his taxes.
  Conclusion: Therefore, he won’t be audited by the IRS.
Disjunctive Syllogism
A or B
Not A
So B

- Either he must pay the electricity bill or he must go bankrupt.
- He must not go bankrupt.
- So he must pay the electricity bill.
Non-monotonic reasoning
• Typically, birds fly.
• Penguins do not fly.
• X is a bird.
• By default, we conclude that X flies.
• New knowledge is added: X is a penguin.
• The previous conclusion must be replaced by the new conclusion that X does not fly.
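This non-monotonic pattern can be written as a minimal Python sketch; the predicates "bird" and "penguin" and the function name are illustrative, not part of the slides:

# Default reasoning sketch: a conclusion is withdrawn when new knowledge arrives.
def can_fly(facts):
    # The exception overrides the default rule "typically, birds fly".
    if "penguin" in facts:
        return False
    # Default conclusion: a bird flies.
    return "bird" in facts

facts = {"bird"}
print(can_fly(facts))    # True  -> by default, X flies
facts.add("penguin")     # new knowledge: X is a penguin
print(can_fly(facts))    # False -> the previous conclusion is withdrawn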
Modus ponens
The form of the argument:
If X, then Y.
X.
So Y.
This is a very common pattern of deductive reasoning, and arguments of this form are always valid.

(X => Y), X
therefore Y

– If it rains, then the streets will be wet.
– It is raining.
– Conclusion: The streets will be wet.
Modus ponens
– If Kamil paid the electricity bill today, then
he will not be able to pay the gas bill.
– Kamil paid the electricity bill today.
– So he will not be able to pay the gas bill.
Modus ponens

• If we are in Istanbul, then we are in Turkey.   (A => B)
• We are in Istanbul.   (A)
• Therefore, we are in Turkey.   (B)
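As a rough illustration, modus ponens corresponds to one forward-inference step over a set of known facts. The sketch below is in Python; the rule and fact names are invented for the example:

# Modus ponens: from (X => Y) and X, conclude Y.
def modus_ponens(rules, facts):
    # rules: list of (antecedent, consequent) pairs; facts: set of known propositions
    derived = set(facts)
    for antecedent, consequent in rules:
        if antecedent in derived:      # X is known, and X => Y ...
            derived.add(consequent)    # ... so add Y
    return derived

rules = [("in_istanbul", "in_turkey")]        # If we are in Istanbul, then we are in Turkey.
print(modus_ponens(rules, {"in_istanbul"}))   # contains both 'in_istanbul' and 'in_turkey'

Iterating this step until no new facts appear lets rules chain together, which is how the chain argument on a later slide (IF A then B, IF B then C) yields C once A is known.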
Chain Argument
IF A then B.
IF B then C.
Therefore IF A then C.

IF he is red in the face then he is lying.
IF he is lying then he can’t be my friend.
Therefore, if he is red in the face then he can’t be my friend.
Modus tollens
• If A then B.
Not B.
Therefore not A.

(X => Y), ¬Y
therefore ¬X

• If we’re in Istanbul, we’re in Turkey.
  We’re not in Turkey.
  Therefore, we’re not in Istanbul.
Modus tollens

- If Kamil paid the electricity bill today, then he would have looked miserable when you saw him.
- Kamil did not look miserable when you saw him.
- So Kamil did not pay the electricity bill today.
Modus tollens
• If he is in Sacramento, he is in California.
He is not in California.
Therefore, he is not in Sacramento.

• If he loves her, he will come with her to Tibet.
  He will not come with her to Tibet.
  Therefore, he does not love her.
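A matching sketch for modus tollens, again with invented proposition names and truth values stored explicitly:

# Modus tollens: from (X => Y) and not-Y, conclude not-X.
def modus_tollens(rule, facts):
    # rule: (antecedent, consequent); facts: dict mapping propositions to True/False
    antecedent, consequent = rule
    if facts.get(consequent) is False:   # Y is known to be false ...
        facts[antecedent] = False        # ... so X must be false as well
    return facts

facts = {"in_turkey": False}                # We're not in Turkey.
rule = ("in_istanbul", "in_turkey")         # If we're in Istanbul, we're in Turkey.
print(modus_tollens(rule, facts))           # {'in_turkey': False, 'in_istanbul': False}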
Inductive reasoning
Inductive reasoning involves going from a series of specific cases to a general statement.
It uses patterns to arrive at a conclusion (a conjecture).
The conclusion of an inductive argument is never guaranteed.

Example: What is the next number in the sequence 6, 13, 20, 27, …?
There is more than one correct answer.
Inductive reasoning
• Here’s the sequence: 6, 13, 20, 27, …
• Look at the difference between consecutive terms:
• 13 – 6 = 7, 20 – 13 = 7, 27 – 20 = 7
• Thus the next term is 34, because 34 – 27 = 7.
• However, what if the sequence represents dates (days of the month)? Then the next number could be 3 (31-day month),
• or 4 (30-day month),
• or 5 (29-day month – February in a leap year),
• or even 6 (28-day month – February).
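The two readings of the same data can be checked with a few lines of Python; the 31-day month is an assumption chosen only for illustration:

seq = [6, 13, 20, 27]

# Hypothesis 1: arithmetic sequence with common difference 7.
diff = seq[1] - seq[0]
print(seq[-1] + diff)                          # 34

# Hypothesis 2: the numbers are days of a 31-day month, one week apart.
days_in_month = 31                             # assumed month length
print((seq[-1] + 7 - 1) % days_in_month + 1)   # 3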
Inductive reasoning

1) Every quiz has been easy. Therefore, the next quiz will be easy.
2) The teacher used PowerPoint in the last few classes. Therefore, the teacher will use PowerPoint tomorrow.
3) Every fall there have been hurricanes in the tropics. Therefore, there will be hurricanes in the tropics this coming fall.
Inductive probability
• The premises and conclusion do not have to be true – the question is:
  – If the premises were true, would the conclusion follow?
• Deductive arguments are either 100% valid or 100% invalid.
• Inductive arguments can be somewhat strong, strong, or very strong, depending on the degree of support the premises provide for the conclusion.

According to the National Weather Service, there is a 60% / 70% / 90% chance of rain today.
It is likely that it will rain today.
Inductive Generalization
• A generalization attributes some characteristic to
all or most members of a given class.
• Information about some members of the class is
said to license the generalization.
All dinosaur bones discovered thus far have been
more than 65 million years old.
Therefore probably all dinosaur bones are more
than 65 million years old.
Inductive Strength
• An inductive argument is strong if the
conclusion follows probably from the premises.

All recent company presidents have been university graduates.
It is likely that the next company president will be a university graduate.
Inductive Arguments
• Every ruby discovered thus far has been red.
So, probably all rubies are red.

• Polls show that 87% of 5-year-olds believe in the tooth fairy.
  Marta is 5 years old.
  Marta probably believes in the tooth fairy.

• Chemically, potassium chloride is very similar to ordinary table salt (sodium chloride).
  Therefore, potassium chloride tastes like table salt.
Predictive Argument
• A statement about what will (likely) happen in
the future is defended with reasons.

It has rained in Vancouver every February since records have been kept.
Therefore, it will probably rain in Vancouver next February.
Weakness
An argument that is not strong is weak.

In a weak inductive argument, the conclusion does not follow probably from the premises.

He dreams about monsters. You dream about monsters.
Therefore, everybody probably dreams about monsters.
Denying the Antecedent (an invalid form)
If A then B.
Not A.
Therefore not B.

- If the neighbor comes to the party, Murat will leave.
- The neighbor did not come to the party.
- Therefore Murat did not leave.

- If he says he sent the email, then he sent the email.
- But he hasn’t said he sent the email.
- So he hasn’t sent the email.
Affirming the Consequent (an invalid form)
If A then B.
B.
Therefore A.

- If we are on Mars then we are in the solar system.
- We are in the solar system.
- Therefore we are on Mars.

- If he is lying, then he would look uncomfortable.
- He looks uncomfortable.
- So he is lying.
Strict Necessity Test
An argument should be treated as deductive if its conclusion follows from its premises with strict logical necessity.

John is a father.
So John is a male.

Murat is a brother.
So Murat has a sibling.
Exceptions to the Strict Necessity Test
1. Magellan’s ships sailed around the world. It necessarily follows, therefore, that the earth is a sphere.
   (The arguer intended to offer a logically conclusive argument, so it should be treated as deductive.)

2. If he is Bill, then he is mortal. He is not Bill. Therefore, he is not mortal.
   (The argument has a pattern of reasoning characteristic of deductive arguments, so it should be treated as deductive.)
Reasoning under uncertainty
• Probability theory
• Certainty factors
• Bayesian probability
Sources of uncertainty
• Uncertain inputs
– Missing data
– Noisy data

• Uncertain knowledge
– Multiple causes lead to multiple effects
– Incomplete enumeration of conditions or effects
– Incomplete knowledge of causality in the domain
– Probabilistic effects

• Uncertain outputs
– Abduction and induction are inherently uncertain
– Default reasoning, even in deductive fashion, is uncertain
– Incomplete deductive inference may be uncertain
Probability theory
• Probability theory assigns a degree of belief or plausibility to a statement – a numerical measure in [0, 1].
• Degree of truth (fuzzy logic) is not the same as degree of belief.
Types of Uncertainty

• Uncertainty in prior knowledge
  E.g., some causes of a disease are unknown and are not represented in the background knowledge of a medical-assistant agent.

• Uncertainty in actions
  E.g., actions are represented with relatively short lists of preconditions, while these lists are in fact arbitrarily long.

• Uncertainty in perception
  E.g., sensors do not return exact or complete information (locality of sensors) about the world; a robot never knows exactly its position.
Types of Uncertainty
• Uncertainty in actions

  E.g., to deliver this lecture:
  - The instructor must be able to come to school.
  - The heating system must be working.
  - The computer must be working.
  - The LCD projector must be working.

  Actions are represented with relatively short lists of preconditions, while these lists are in fact arbitrarily long. It is not efficient (or even possible) to list all the possibilities.
Types of Uncertainty
For example, to drive the car in the morning:
• It must not have been stolen during the night.
• It must not have flat tires.
• There must be gas in the tank.
• The battery must be working.
• The ignition must work.
• The car keys must not be lost.
• No truck should obstruct the driveway.
• Etc.
Handling Uncertainty
Approaches:
1. [Optimistic] Default reasoning
2. [Pessimistic] Worst-case reasoning
3. [Realist] Probabilistic reasoning
Default Reasoning

• Rationale: The world is fairly normal. Abnormalities are rare.
• So, an agent assumes normality until there is evidence to the contrary.
• E.g., if an agent sees a bird x, it assumes that x can fly, unless it has evidence that x is a penguin, an ostrich, a bird with broken wings, …
Worst-case reasoning
• Rationale: Just the opposite! The world is ruled by Murphy’s Law:
• Everything that can go wrong will go wrong.
• If there are two ways to solve a problem and one of them leads to a big problem, then someone will choose that way.
Probabilistic Reasoning

• The world is not divided between “normal” and “abnormal”, nor is it adversarial. Possible situations have various likelihoods (probabilities).
• The agent has probabilistic beliefs – pieces of knowledge with associated probabilities (strengths) – and chooses its actions to maximize the expected value of some utility function.
Certainty Factors (CF)

• Uncertainty is represented as a degree of belief or certainty factor.
• Certainty factors (CF) express belief in an event (or fact or hypothesis) based on evidence (or the expert's assessment).
• CFs are NOT probabilities.
• CFs need not sum to 100.
Certainty Factor (CF) Formalism

• CF(h, e) = MB(h, e) - MD(h, e), where MB is the measure of belief and MD the measure of disbelief in hypothesis h given evidence e;  -1 ≤ CF ≤ +1
• When there is total belief: CF = 1
• When there is total disbelief in the hypothesis: CF = -1
• When there is no evidence to make a judgment: CF = 0

  -1 -------------------- 0 -------------------- +1
      range of disbelief        range of belief
Certainty Factor (CF) Formalism

(Figure: interpretation of CF values.)
Certainty Factor (CF) Formalism
Certainty factors are attached to the premises of rules in production systems. We need to calculate the CF for conjunctions and disjunctions:

CF(P1 ∧ P2) = min( CF(P1), CF(P2) )
CF(P1 ∨ P2) = max( CF(P1), CF(P2) )

We also need to compute the CF of a result supported by two rules with factors CF1 and CF2:

CF1 + CF2 - CF1 * CF2                    when CF1 > 0 and CF2 > 0
CF1 + CF2 + CF1 * CF2                    when CF1 < 0 and CF2 < 0
(CF1 + CF2) / (1 - min(|CF1|, |CF2|))    when the signs differ
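These combination rules translate directly into code. The following Python helpers are a sketch; the function names are mine, not from the slides, and negation is handled as a sign flip, the convention used in the worked examples below:

# Certainty-factor combination helpers (illustrative sketch).
def cf_and(*cfs):
    # Conjunction of premises: take the minimum CF.
    return min(cfs)

def cf_or(*cfs):
    # Disjunction of premises: take the maximum CF.
    return max(cfs)

def cf_not(cf):
    # Negated premise: flip the sign, CF(not P) = -CF(P).
    return -cf

def cf_rule(cf_premise, rule_cf):
    # Conclusion of a single rule: premise CF times the rule's own CF.
    return cf_premise * rule_cf

def cf_combine(cf1, cf2):
    # Same conclusion supported by two rules.
    if cf1 > 0 and cf2 > 0:
        return cf1 + cf2 - cf1 * cf2
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 + cf1 * cf2
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

The worked examples on the following slides can be reproduced with these operations.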
Certainty Factor (CF)
IF sky is clear
AND the forecast is sunny
THEN the action is ‘wear sunglasses’ (CF = 0.8)

If the certainty factor of “sky is clear” is 0.9 and that of “the forecast is sunny” is 0.7, then

CF(H, E1 ∧ E2) = min[0.9, 0.7] × 0.8 = 0.7 × 0.8 = 0.56
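A quick check of this calculation in plain Python:

# Premises: CF(sky is clear) = 0.9, CF(forecast is sunny) = 0.7; rule CF = 0.8.
cf_premises = min(0.9, 0.7)            # AND of the two premises
print(round(cf_premises * 0.8, 2))     # 0.56 -> CF of 'wear sunglasses'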


Certainty Factor (CF)
IF sky is overcast
OR the forecast is rain
THEN the action is ‘take an umbrella’ (CF = 0.9)

If the certainty factor of “sky is overcast” is 0.6 and that of “the forecast is rain” is 0.8, then

CF(H, E1 ∨ E2) = max[0.6, 0.8] × 0.9 = 0.8 × 0.9 = 0.72


Certainty Factor (CF)
• For example, consider a rule in a knowledge base:
• (P1 and P2) or P3 → R1 (0.7)
• If the CFs for P1, P2, and P3 are 0.6, 0.4, and 0.2, respectively, then

CF(P1(0.6) and P2(0.4)) = MIN(0.6, 0.4) = 0.4
CF((0.4) or P3(0.2)) = MAX(0.4, 0.2) = 0.4
CF(R1) = 0.4 * 0.7 = 0.28
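The same computation in plain Python:

# (P1 and P2) or P3 -> R1 (0.7), with CF(P1) = 0.6, CF(P2) = 0.4, CF(P3) = 0.2.
cf_p1_p2 = min(0.6, 0.4)               # 0.4
cf_premise = max(cf_p1_p2, 0.2)        # 0.4
print(round(cf_premise * 0.7, 2))      # 0.28 -> CF(R1)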
Combining Two Rules
R1: IF the inflation rate is less than 5 percent,
    THEN stock market prices go up (CF = 0.7)
R2: IF the unemployment level is less than 7 percent,
    THEN stock market prices go up (CF = 0.6)

Assume that the inflation rate = 4 percent and the unemployment level = 6.5 percent.

Combined effect:

CF(R1, R2) = CF(R1) + CF(R2) - CF(R1) × CF(R2)
CF(R1, R2) = 0.7 + 0.6 - 0.7 × 0.6 = 1.3 - 0.42 = 0.88
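A quick check with the two-positive-CF combination formula:

# Both rules support 'stock market prices go up'.
cf1, cf2 = 0.7, 0.6
print(round(cf1 + cf2 - cf1 * cf2, 2))   # 0.88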
Combining Three Rules
The following rules are given:

R1: IF P1 THEN C (CF = 0.8)
R2: IF P2 THEN C (CF = 0.7)
R3: IF P3 THEN C (CF = 0.9)

Combine all three rules to calculate the CF for the conclusion C.

CF(R1, R2) = CF(R1) + CF(R2) - CF(R1) × CF(R2) = 0.8 + 0.7 - 0.8 × 0.7 = 1.5 - 0.56 = 0.94
CF(R1, R2, R3) = CF(R1, R2) + CF(R3) - CF(R1, R2) × CF(R3) = 0.94 + 0.9 - 0.94 × 0.9 = 1.84 - 0.846 = 0.994
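The pairwise formula is applied incrementally; a minimal sketch:

# Combine evidence from three rules that all support conclusion C (all CFs positive here).
def combine(cf1, cf2):
    return cf1 + cf2 - cf1 * cf2

cf = combine(0.8, 0.7)       # 0.94
cf = combine(cf, 0.9)        # 0.994
print(round(cf, 3))          # 0.994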
Certainty Factor (example)
Consider a rule in a knowledge base:

(P1 and P2) or P3 → C1 (0.7) and C2 (0.3)

If the CFs for P1, P2, and P3 are 0.6, 0.4, and 0.2, respectively, then

CF(P1(0.6) and P2(0.4)) = MIN(0.6, 0.4) = 0.4
CF((0.4) or P3(0.2)) = MAX(0.4, 0.2) = 0.4
CF(C1) = 0.7 * 0.4 = 0.28
CF(C2) = 0.3 * 0.4 = 0.12
Certainty Factor (example)
Suppose we have the following rule R1 with certainty factors (CF):
IF (P1 AND P2 AND P3) OR (P4 AND not P5) then C (0.7)
and the certainty factors of P1, P2, P3, P4, P5 are as follows:
CF(P1) = 0.8,
CF(P2) = 0.7,
CF(P3) = 0.6,
CF(P4) = 0.9,
CF(P5) = 0.6
What is the certainty factor of the conclusion C after using rule R1?
For P1 and P2 and P3, the CF is
min(CF(P1), CF(P2), CF(P3)) = min(0.8, 0.7, 0.6) = 0.6. Call this CFA.
For not P5, the CF is -CF(P5) = -0.6 (negation flips the sign, as in the next example).
For P4 and not P5, the CF is min(0.9, -0.6) = -0.6. Call this CFB.
For (P1 and P2 and P3) or (P4 and not P5), the CF is:
max(CFA, CFB) = max(0.6, -0.6) = 0.6.
Thus CF(C) = 0.6 * 0.7 = 0.42
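The same evaluation in Python, with negation handled as a sign flip (the convention used in these examples):

# IF (P1 AND P2 AND P3) OR (P4 AND NOT P5) THEN C (0.7)
cf = {"P1": 0.8, "P2": 0.7, "P3": 0.6, "P4": 0.9, "P5": 0.6}

cfa = min(cf["P1"], cf["P2"], cf["P3"])   # 0.6
cfb = min(cf["P4"], -cf["P5"])            # min(0.9, -0.6) = -0.6
cf_premise = max(cfa, cfb)                # 0.6
print(round(cf_premise * 0.7, 2))         # 0.42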
Certainty Factor (example)
Suppose we have the following rule R1 with certainty factors (CF):
IF (P1 AND P2 AND P3) OR (P4 AND not P5) then C (0.7)
and the certainty factors of P1, P2, P3, P4, P5 are as follows:
CF(P1) = -0.8,
CF(P2) = 0.7,
CF(P3) = 0.6,
CF(P4) = -0.4,
CF(P5) = -0.6
What is the certainty factor of the conclusion C after using rule R1?
For P1 and P2 and P3, the CF is
min(CF(P1), CF(P2), CF(P3)) = min(-0.8, 0.7, 0.6) = -0.8. Call this CFA.
For not P5, -(-0.6) = 0.6.
For P4 and not P5, the CF is min(-0.4, 0.6) = -0.4. Call this CFB.
For (P1 and P2 and P3) or (P4 and not P5), the CF is:
max(CFA, CFB) = max(-0.8, -0.4) = -0.4.
Thus CF(C) = -0.4 * 0.7 = -0.28
Certainty Factor (example)
Given a rule regarding hiring candidates for a manager position in a factory:

IF the applicant has professional skills (CF = 0.7)
   OR organization capabilities (CF = 0.8)
AND the candidate has good ability in human relations (CF = 0.9)
THEN accept the applicant (CF = 0.6).
Find the CF for the conclusion.

min(0.7, 0.9) = 0.7
min(0.8, 0.9) = 0.8
max(0.7, 0.8) = 0.8

CF for the conclusion is 0.8 × 0.6 = 0.48
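A quick check in Python (equivalently, the OR can be taken first and the AND afterwards, which gives the same result here):

# (skills OR organization) AND human_relations, rule CF = 0.6.
skills, organization, relations = 0.7, 0.8, 0.9
cf_premise = min(max(skills, organization), relations)   # min(0.8, 0.9) = 0.8
print(round(cf_premise * 0.6, 2))                        # 0.48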
