FAI - Unit 4 - Logic and Knowledge
by
Dr. Abdul Ahad
Contents
4.1 Introduction to Logic
4.2 Logic Representation
4.3 Propositional Logic
4.4 Predicate Logic
4.5 Several Other Logics
4.6 Uncertainty and Probability
4.7 Knowledge Representation
4.8 Graphical Sketches and the Human Window
4.9 Graphs and the Bridges of Königsberg Problem
4.10 Representational Choices
4.11 Production Systems
4.12 Object Orientation and Frames
4.13 Semantic Networks
Logics in AI
i. Propositional Logic
ii. Predicate Logic (First Order Logic)
iii. Uncertainty AI
a. Fuzzy Logic
b. Probability
4.3 Propositional Logic
Propositional logic (PL) is the simplest form of logic, in which all statements are made up of propositions.
A proposition is a declarative statement that is either true or false.
It is a technique for representing knowledge in logical and mathematical form.
Propositional logic is also called Boolean logic, as it works on 0 and 1.
In propositional logic, we use symbolic variables to represent the logic, and any symbol can stand for a proposition, such as A, B, C, P, Q, R, etc.
A proposition can be either true or false, but not both.
Propositional logic consists of objects, relations or functions, and logical connectives. These connectives are also called logical operators.
Propositions and connectives are the basic elements of propositional logic. A connective is a logical operator that joins two sentences.
A proposition formula that is always true is called a tautology; it is also called a valid sentence.
A proposition formula that is always false is called a contradiction.
Questions, commands, and opinions, such as "Where is Rohini?", "How are you?", and "What is your name?", are not propositions.
Example:
P1, P2, …, Pr // Premises
∴ C // Conclusion
An argument is valid iff the implication formed by taking the conjunction of the premises as the antecedent and the conclusion as the consequent is a tautology,
i.e. (P1 /\ P2 /\ … /\ Pr) ⇒ C,
where the premises are assumed to be true.
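To make the definition concrete, here is a minimal Python sketch that enumerates every truth assignment and checks whether the premises can all hold while the conclusion fails; the function and variable names are our own illustration, not part of the slides.

```python
from itertools import product

def is_valid(premises, conclusion, symbols):
    """Validity test: (P1 and P2 and ... and Pr) => C must be a tautology.
    premises and conclusion are functions from a truth assignment to bool."""
    for values in product([True, False], repeat=len(symbols)):
        env = dict(zip(symbols, values))
        # The implication fails only when every premise holds
        # but the conclusion does not.
        if all(p(env) for p in premises) and not conclusion(env):
            return False
    return True

# Modus ponens: p => q, p, therefore q
premises = [lambda e: (not e["p"]) or e["q"],   # p => q
            lambda e: e["p"]]                   # p
print(is_valid(premises, lambda e: e["q"], ["p", "q"]))   # True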
Clause form
In clause form there is no:
implication - change (p ⇒ q) to (~p \/ q)
conjunction - change p /\ q to p , q
double negation - change ~~p to p
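The three rewrites can be sketched in a few lines of Python over a tiny tuple-based formula representation; the encoding below is an assumed illustration (negation is only handled on atoms, as in the slides' examples).

def lit(f):
    """Reduce a possibly multiply-negated atom to a literal string (~~p -> p)."""
    neg = False
    while isinstance(f, tuple) and f[0] == "not":
        neg, f = not neg, f[1]
    return "~" + f if neg else f

def clauses(f):
    """Clause form for the shapes used above; negations sit on atoms only."""
    if isinstance(f, tuple):
        if f[0] == "=>":                  # implication: (p => q) becomes (~p \/ q)
            return clauses(("or", ("not", f[1]), f[2]))
        if f[0] == "and":                 # conjunction: p /\ q splits into p , q
            return clauses(f[1]) + clauses(f[2])
        if f[0] == "or":                  # a disjunction of literals is one clause
            return [clauses(f[1])[0] + clauses(f[2])[0]]
    return [[lit(f)]]                     # a literal is a unit clause

# Premise 3) of the example below, ~p => ~r, becomes 3') p \/ ~r:
print(clauses(("=>", ("not", "p"), ("not", "r"))))   # [['p', '~r']]
```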
Resolution in the Propositional Logic
There are four steps to proving an argument valid by resolution:
1. Convert the premises to clause form.
2. Negate the conclusion.
3. Convert the negation of the conclusion to clause form.
4. Search for a contradiction in the list of clauses, including combined clauses.
Prove the following argument is valid:
1) p⇒q
2) q ⇒ ~r
3) ~p ⇒ ~r
________
4) ∴ ~r
Step 1: Convert the premises to clause form
1') ~p \/ q
2') ~q \/ ~r
3') ~~p \/ ~r
Step 2: Negate the conclusion
4) ~~r
Step 3: Convert the negation of the conclusion to clause form
4') r // via involution
Step 4: The clause base (list of clauses) is:
1') ~p \/ q
2') ~q \/ ~r
3') p \/ ~r
4') r
Resolving 2') with 4') gives 5') ~q; resolving 1') with 5') gives 6') ~p; resolving 3') with 6') gives 7') ~r, which contradicts 4') r. The empty clause is derived, so the argument is valid.
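Step 4 can be automated. Below is a small, assumed Python sketch of resolution refutation: clauses are sets of literal strings, and deriving the empty clause signals the contradiction that proves the argument valid.

```python
from itertools import combinations

def negate(l):
    return l[1:] if l.startswith("~") else "~" + l

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literal strings)."""
    out = []
    for l in c1:
        if negate(l) in c2:
            out.append((c1 - {l}) | (c2 - {negate(l)}))
    return out

def refute(clause_list):
    """Saturate with resolution; an empty resolvent is the contradiction."""
    clauses = {frozenset(c) for c in clause_list}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True          # empty clause derived: argument valid
                new.add(frozenset(r))
        if new <= clauses:
            return False                 # nothing new: no contradiction found
        clauses |= new

# Clause base from the example: 1') ~p\/q  2') ~q\/~r  3') p\/~r  4') r
print(refute([["~p", "q"], ["~q", "~r"], ["p", "~r"], ["r"]]))   # True
```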
4.4 Predicate Logic (First Order Logic)
(∃x)[Natural_number(x) /\ Divisible_by_2(x)]
"Some natural numbers are even."
(∀x){[Animal(x) /\ Has_Hair(x) /\ Warm_Blooded(x)] ⇒ Mammal(x)}
"If x is a warm-blooded animal with hair, then x is a mammal."
1) (∀x)(GC(x) ⇒ I(x))
2) (∀x)(I(x) ⇒ EF(x))
3) GC(Michael) \/ GC(Louis)
4) ~GC(Michael)
Therefore: 5) EF(Louis)
From 3) and 4), GC(Louis) holds; rule 1) then gives I(Louis), and rule 2) gives EF(Louis).
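A minimal sketch of this inference in Python, grounding the two universal rules over the constants Michael and Louis and forward-chaining to the conclusion (the encoding is ours, not from the slides):

```python
# Facts are (predicate, constant) pairs.
people = ["Michael", "Louis"]

# From 3) GC(Michael) \/ GC(Louis) and 4) ~GC(Michael), disjunctive
# syllogism gives GC(Louis), which we take as the starting fact.
facts = {("GC", "Louis")}

rules = [("GC", "I"), ("I", "EF")]    # 1) GC(x) => I(x)   2) I(x) => EF(x)
changed = True
while changed:                        # repeat until no new fact is derived
    changed = False
    for pre, post in rules:
        for x in people:
            if (pre, x) in facts and (post, x) not in facts:
                facts.add((post, x))
                changed = True

print(("EF", "Louis") in facts)       # True, i.e. 5) EF(Louis)
```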
Uncertainty AI: Fuzzy Logic and Probability Theory
Knowledge: Helps with making complex decisions and understanding.
Example:
Data: 70
Fact: 70 degrees Fahrenheit
Information: The temperature
outside is 70 degrees Fahrenheit.
Knowledge: If the temperature is
over 70 degrees Fahrenheit, then
you can go swimming.
Union Operation: The union operation of a fuzzy set is defined by:
μA∪B(x) = max(μA(x), μB(x))
For X3: μA∪B(X3) = max(μA(X3), μB(X3)) = max(0.4, 0.7) = 0.7
For X4: μA∪B(X4) = max(μA(X4), μB(X4)) = max(0.2, 0.9) = 0.9
Intersection Operation: The intersection operation of a fuzzy set is defined by:
μA∩B(x) = min(μA(x), μB(x))
For X3: μA∩B(X3) = min(μA(X3), μB(X3)) = min(0.4, 0.7) = 0.4
For X4: μA∩B(X4) = min(μA(X4), μB(X4)) = min(0.2, 0.9) = 0.2
Complement Operation: The complement operation of a fuzzy set is defined by:
μĀ(x) = 1 - μA(x)
Example: Suppose A is a set which contains the following elements:
A = {(X1, 0.3), (X2, 0.8), (X3, 0.5), (X4, 0.1)}
then
Ā = {(X1, 0.7), (X2, 0.2), (X3, 0.5), (X4, 0.9)}
because, according to this operation:
For X1: μĀ(X1) = 1 - μA(X1) = 1 - 0.3 = 0.7
For X2: μĀ(X2) = 1 - μA(X2) = 1 - 0.8 = 0.2
For X3: μĀ(X3) = 1 - μA(X3) = 1 - 0.5 = 0.5
For X4: μĀ(X4) = 1 - μA(X4) = 1 - 0.1 = 0.9
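A short Python sketch of the three operations, representing a fuzzy set as a dict from element to membership degree; A is taken from the complement example above, while B's degrees are assumed purely for illustration.

```python
A = {"X1": 0.3, "X2": 0.8, "X3": 0.5, "X4": 0.1}
B = {"X1": 0.6, "X2": 0.4, "X3": 0.7, "X4": 0.9}   # assumed values

union        = {x: max(A[x], B[x]) for x in A}      # μA∪B(x) = max(μA(x), μB(x))
intersection = {x: min(A[x], B[x]) for x in A}      # μA∩B(x) = min(μA(x), μB(x))
complement_A = {x: round(1 - A[x], 2) for x in A}   # μĀ(x) = 1 - μA(x)

print(complement_A)   # {'X1': 0.7, 'X2': 0.2, 'X3': 0.5, 'X4': 0.9}
```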
Example: In a class, 70% of the students like CSE and 40% like both CSE and ECE. What percent of the students who like CSE also like ECE?
P(ECE | CSE) = P(CSE and ECE) / P(CSE) = 0.40 / 0.70 ≈ 0.57
Hence, 57% of the students who like CSE also like ECE.
Bayes' Theorem
Bayes' theorem can be used to calculate the probability that a certain event will occur or that a certain proposition is true. Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning; it determines the probability of an event given uncertain knowledge.
P(A|B) = [P(B|A) × P(A)] / P(B) … (a)
Equation (a) is called Bayes' rule or Bayes' theorem. This equation is the basis of most modern AI systems for probabilistic inference.
Here, P(A|B) is known as the posterior, which we need to calculate; it is read as the probability of hypothesis A given that evidence B has occurred.
P(B|A) is called the likelihood: assuming the hypothesis is true, we calculate the probability of the evidence.
P(A) is called the prior probability: the probability of the hypothesis before considering the evidence.
P(B) is called the marginal probability: the pure probability of the evidence.
Suppose we want to perceive the effect of some unknown cause and want to compute that cause; then Bayes' rule becomes:
P(cause | effect) = [P(effect | cause) × P(cause)] / P(effect)
Example-1: A doctor knows that the disease meningitis causes a patient to have a stiff neck 80% of the time. The prior probability that any patient has meningitis is 1/30000, and the prior probability that any patient has a stiff neck is 2%.
P(m | s) = [P(s | m) × P(m)] / P(s) = [0.8 × (1/30000)] / 0.02 ≈ 0.00133
Hence, we can assume that 1 patient out of 750 patients has meningitis with a stiff neck.
Example-2: Dangerous fires are rare (1%) but smoke is fairly common (10%) due
to barbecues, and 90% of dangerous fires make smoke.
We can then discover the probability of dangerous Fire when there is Smoke:
P (Fire | Smoke ) = [ P ( Smoke | Fire ) P ( Fire ) ] / P ( Smoke )
= [ 90% X 1% ] / 10%
= 9%
So it is still worth checking out any smoke to be sure.
Example-3: You are planning a picnic today, but the morning is cloudy. Oh no! 50%
of all rainy days start off cloudy! But cloudy mornings are common (about 40% of days start cloudy), and this is usually a dry month (only 3 of 30 days tend to be rainy, or 10%). What is the chance of rain during the day?
We will use Rain to mean rain during the day, and Cloud to mean cloudy morning. The
chance of Rain given Cloud is written P(Rain|Cloud).
P( Rain | Cloud ) = [ P ( Cloud | Rain ) X P ( Rain ) ] / P ( Cloud )
• P ( Cloud | Rain ) is Probability of Cloud, given that Rain happens = 50%
• P ( Rain ) is Probability of Rain = 10%
• P ( Cloud ) is Probability of Cloud = 40%
P ( Rain | Cloud ) = [ 0.5 X 0.1 ] / 0.4 = 0.125 (or 12.5% chance of rain).
Not too bad, let's have a picnic!
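Both computations drop straight into a one-line Python helper (the function name is ours):

```python
def bayes(likelihood, prior, marginal):
    """Posterior P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / marginal

# Example-2: P(Fire | Smoke) = 0.90 * 0.01 / 0.10
print(bayes(0.90, 0.01, 0.10))   # ≈ 0.09  (9%)

# Example-3: P(Rain | Cloud) = 0.50 * 0.10 / 0.40
print(bayes(0.50, 0.10, 0.40))   # ≈ 0.125 (12.5%)
```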
Graphs
A graph is a finite set of vertices (nodes) together with a finite set of edges.
E = (u, v): an edge is determined by its two vertices.
Edges may be directed or undirected.
Graphs are a natural way to represent states, alternatives, and measurable paths in problem solving.
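As a sketch, the adjacency-list representation below encodes the seven bridges of Königsberg over four land masses A-D (the class design is an assumed illustration); Euler's argument shows no walk can cross every bridge exactly once, since all four vertices have odd degree.

```python
from collections import defaultdict

class Graph:
    def __init__(self, directed=False):
        self.adj = defaultdict(list)   # vertex -> list of neighbours
        self.directed = directed

    def add_edge(self, u, v):
        """An edge is determined by its two vertices (u, v)."""
        self.adj[u].append(v)
        if not self.directed:
            self.adj[v].append(u)

g = Graph()
# Seven bridges between the four land masses of Königsberg.
for u, v in [("A","B"), ("A","B"), ("A","C"), ("A","C"),
             ("A","D"), ("B","D"), ("C","D")]:
    g.add_edge(u, v)

# A walk crossing every edge exactly once needs 0 or 2 odd-degree vertices;
# here all four are odd, so no such walk exists.
odd = [v for v in g.adj if len(g.adj[v]) % 2 == 1]
print(odd)   # ['A', 'B', 'C', 'D']
```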