
UNIT – 3

KNOWLEDGE REPRESENTATION
First Order Predicate Logic (FOPL)

o First-order logic is another way of knowledge representation in artificial intelligence. It is an
extension of propositional logic.
o FOL is sufficiently expressive to represent natural language statements in a concise way.
o First-order logic is also known as Predicate logic or First-order predicate logic. First-order logic
is a powerful language that represents information about objects in a more natural way and can also
express the relationships between those objects.
o First-order logic (like natural language) does not assume only that the world contains facts, as
propositional logic does, but also assumes that the world contains:
o Objects: A, B, people, numbers, colors, wars, theories, squares, ...
o Relations: these can be unary relations such as: red, round, adjacent to, or n-ary relations such
as: sister of, brother of, has color, comes between
o Functions: father of, best friend, third inning of, end of, ...
o Like natural languages, first-order logic has two main parts:
o Syntax
o Semantics

Syntax of First-Order logic:

The syntax of FOL determines which collections of symbols form legal logical expressions in first-order logic. The
basic syntactic elements of first-order logic are symbols. Statements in FOL are written in a short-hand symbolic
notation.

Basic Elements of First-order logic:

Following are the basic elements of FOL syntax:

Constants     1, 2, A, John, Mumbai, cat, ...

Variables     x, y, z, a, b, ...

Predicates    Brother, Father, >, ...

Functions     sqrt, LeftLegOf, ...

Connectives   𝖠, 𝗏, ¬, ⇒, ⇔

Equality      ==

Quantifiers   ∀, ∃

Atomic sentences:
o Atomic sentences are the most basic sentences of first-order logic. These sentences are formed from
a predicate symbol followed by a parenthesized sequence of terms.
o We can represent atomic sentences as Predicate(term1, term2, ..., term n).

Example: Ravi and Ajay are brothers: => Brothers(Ravi, Ajay).


Chinky is a cat: => cat (Chinky).

Complex Sentences:
o Complex sentences are made by combining atomic sentences using connectives.

First-order logic statements can be divided into two parts:

o Subject: Subject is the main part of the statement.


o Predicate: A predicate can be defined as a relation, which binds two atoms together in a statement.

Consider the statement "x is an integer." It consists of two parts: the first part, x, is the subject of the
statement, and the second part, "is an integer," is known as the predicate.

Integer(X)

Quantifiers in First-order logic:


o A quantifier is a language element which generates quantification, and quantification specifies the
quantity of specimens in the universe of discourse.
o These are the symbols that permit us to determine or identify the range and scope of a variable in a
logical expression. There are two types of quantifier:
1. Universal Quantifier (for all, everyone, everything)
2. Existential Quantifier (for some, at least one)
Universal Quantifier:

Universal quantifier is a symbol of logical representation, which specifies that the statement within its range
is true for everything or every instance of a particular thing.

The Universal quantifier is represented by a symbol ∀, which resembles an inverted A.

Note: In universal quantifier we use implication "→".

If x is a variable, then ∀x is read as:

o For all x
o For each x
o For every x.

Example:
All men drink coffee.

Let x be a variable that refers to a man; then the statement can be represented over the universe of discourse (UOD) as:

∀x man(x) → drink(x, coffee).

∀x ∀y man(x) ^ coffee(y) → drink(x, y).

Similarly, "all children love biscuits" can be written either with the biscuit as a constant or with a second
quantified variable:

∀x child(x) → love(x, biscuit).

∀x ∀y child(x) ^ biscuit(y) → love(x, y).

Biscuit(marigold)

Biscuit(oreo)

God helps those who help themselves.

∀x person(x) → help(God, x).

∀x ∀y person(x) ^ god(y) ^ help(x, x) → help(y, x).

Existential Quantifier:

Existential quantifiers are the type of quantifiers which express that the statement within their scope is true
for at least one instance of something.

It is denoted by the logical operator ∃, which resembles an inverted E. When it is used with a predicate
variable, it is called an existential quantifier.

Note: With the existential quantifier we always use the AND or conjunction symbol (𝖠).

If x is a variable, then the existential quantifier is written ∃x or ∃(x), and it is read as:

o There exists a 'x.'


o For some 'x.'
o For at least one 'x.'

Example:

Some boys are intelligent.


∃x: boys(x) 𝖠 intelligent(x)

Some employees are sick today.

∃x: employee(x) 𝖠 today_sick(x)

All employees are sick today.

∀x: employee(x) → sick(x)

Points to remember:
o The main connective for universal quantifier ∀ is implication →.
o The main connective for existential quantifier ∃ is and 𝖠.

Properties of Quantifiers:
o In universal quantifier, ∀x∀y is similar to ∀y∀x.
o In Existential quantifier, ∃x∃y is similar to ∃y∃x.
o ∃x∀y is not similar to ∀y∃x.

Some Examples of FOL using quantifier:

1. All birds fly.


In this question the predicate is "fly(bird)."
Since all birds fly, it will be represented as follows:
∀x bird(x) → fly(x).
2. Every man respects his parent.
In this question, the predicate is "respect(x, y)," where x = man, and y = parent.
Since the statement holds for every man, we use ∀, and it will be represented as follows:
∀x man(x) → respects(x, parent).

3. Some boys play cricket.


In this question, the predicate is "play(x, y)," where x = boys, and y = game. Since only some boys are
involved, we use ∃, and it will be represented as:
∃x boys(x) 𝖠 play(x, cricket).

4. Not all students like both Mathematics and Science.


In this question, the predicate is "like(x, y)," where x = student, and y = subject.
Since not all students are involved, we use ∀ with negation, giving the following representation:
¬∀(x) [ student(x) → like(x, Mathematics) 𝖠 like(x, Science)].

5. Only one student failed in Mathematics.


In this question, the predicate is "failed(x, y)," where x = student, and y = subject.
Since there is only one student who failed in Mathematics, we use the following representation:
∃(x) [ student(x) 𝖠 failed(x, Mathematics) 𝖠 ∀(y) [¬(x==y) 𝖠 student(y) → ¬failed(y, Mathematics)]].

Free and Bound Variables:


Quantifiers interact with the variables that appear in a formula. There are two types of variables in
first-order logic, given below:

Free Variable: A variable is said to be a free variable in a formula if it occurs outside the scope of any
quantifier.

Example: ∀x ∃(y)[P (x, y, z)], where z is a free variable.

Bound Variable: A variable is said to be a bound variable in a formula if it occurs within the scope of the
quantifier.

Example: ∀x ∀y [A(x) 𝖠 B(y)], here x and y are bound variables.

Negation Rule:

~∀x F(x) = ∃x ~F(x)

~∀x Intelligent(x) = ∃x ~Intelligent(x)

~∃x F(x) = ∀x ~F(x)
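
For example, pushing negation through a quantified implication:
~∀x (student(x) → intelligent(x)) = ∃x (student(x) 𝖠 ~intelligent(x)),
i.e., "not every student is intelligent" is the same as "some student is not intelligent".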

Well Formed Formula:


An atomic sentence is a WFF.

Represent the following sentences in symbolic form:


All employees earning 1400 or more per year pay taxes.
Some employees are sick today.
No employees earn more than the president.

We define the abbreviations for predicates and functions, e.g.

E(x) = x is an employee
P(x) = x is president
i(x) = income of x
GE(u, v) = u is greater than or equal to v
S(x) = x is sick today
T(x) = x pays taxes

∀x ((E(x) ^ GE(i(x), 1400)) → T(x))
∃y (E(y) ^ S(y))
∀x ∀y ((E(x) ^ P(y)) → ~GE(i(x), i(y)))

The above expressions are called WFFs.


If P and Q are WFFs, then so are:
~P
P ^ Q
P v Q
P → Q
P ↔ Q
∀x P(x)
∃y P(y)
Some examples:
∀x ∀y ∀z ((Father(x, y) ^ Father(y, z)) → grandfather(x, z))

Mary likes all kinds of food.

∀x food(x) → like(mary, x)

Rice and banana are food.

Food(rice) ^ food(banana)

Ram eats everything that Sue eats.

∀x eat(sue, x) → eat(ram, x)
Exercises:

1. Marcus was a man.


2. Marcus was a pompeian.
3. All pompeians were Romans.
4. Caesar was a ruler.
5. All Romans were either loyal to Caesar or hated him.
6. Everyone is loyal to someone.
7. People only try to assassinate rulers they are not loyal to.
8. Marcus tried to assassinate Caesar.

Man(Marcus).
Pompeian(Marcus).
∀x Pompeian(x) → Roman(x)
Ruler(Caesar)

∀x Roman(x) → loyalto(x, Caesar) v hate(x, Caesar)


∀x ∃y loyalto(x, y)

∀x ∀y person(x) ^ ruler(y) ^ tryassassinate(x, y) → ~loyalto(x, y)


tryassassinate(Marcus, Caesar)

Any person who is respected by every person is a king.

∀y [person(y) ^ ∀x (person(x) → respect(x, y))] → king(y)

Marcus was born in 40 AD.

Born (Marcus, 40)

All Pompeians died when the volcano erupted in 79 AD.

∀x pompeian(x) ^ erupted(volcano, 79) → died(x, 79)

No mortal lives longer than 150 years.


∀x ∀t1 ∀t2 mortal(x) ^ born(x, t1) ^ gt((t2 - t1), 150) → dead(x, t2)

It is now 1991.

Now = 1991

Alive means not dead.


∀x ∀t alive(x, t) → ~dead(x, t) and ~dead(x, t) → alive(x, t)
If someone dies, then he is dead at all later times.

∀x ∀t1 ∀t2 died(x, t1) ^ gt(t2, t1) → dead(x, t2)

Everyone has a mother.


∀x ∃y mother(x, y) (correct)
∃y ∀x mother(x, y) (wrong: this would mean one single person is everyone's mother)


A clause can be defined as a disjunction of a number of literals.


A ground clause is one in which no variable occurs in the expression.
A Horn clause is a clause with at most one positive literal.

How to Write a Sentence into Clause Form?


This is the technique for breaking a complex sentence into simple clauses so that resolution can be applied.

The technique is described below:


Algorithm for Converting a Sentence into Clauses (CNF):

Step I: Elimination of the if-then operator:


Replace the "→" operator using the ¬ and 𝗏 operators (P → Q becomes ¬P 𝗏 Q).
Replacing the 'if-then' operator by negation and OR in the running example ("Mary likes all kinds of food") gives:

∀X ¬food(X) 𝗏 likes(Mary, X)

Step II: Reduction of the scope of negation: move negation inwards.

Replace the ¬ sign using any of the following equivalences:


1. ~(P v Q) = ~P ^ ~Q
2. ~(P ^ Q) = ~P v ~Q
3. ~(~P) = P
4. ~(∀X P) = ∃X ~P
5. ~(∃X P) = ∀X ~P

Step III: Renaming the variables within the scope of quantifiers:


If the same variable name appears within the scope of two different quantifiers, rename one of the
occurrences (e.g., rename X to Y) so that each quantifier binds a distinct variable. In the present
example the variables are already distinct, so this operation is not required.

Step IV: Moving the quantifiers to the front of the expression:


Bring all quantifiers to the front of the expression (prenex form).

Applying this to the running example leaves it unchanged, since its single quantifier is already at the front.

Step V: Replacing existentially quantified variables by Skolem functions of universally quantified variables:


When an existential quantifier (Y) appears within the scope of a universal quantifier (X), replace Y by S(X),
where S is a Skolem function; if the existential quantifier is not preceded by any universal quantifier, its
variable is replaced by a new Skolem constant. The existential quantifier is then dropped from the sentence.

Skolemization: the procedure for systematic elimination of the existential quantifiers in a first-order formula in
prenex form, by introducing new constant and function symbols, called Skolem constants and Skolem
functions, into the formula.
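
For example, the earlier sentence ∀x ∃y mother(x, y) ("everyone has a mother") Skolemizes to
∀x mother(x, S(x)), where the Skolem function S(x) names the mother of x. If an existential quantifier is not
preceded by any universal quantifier, its variable is simply replaced by a fresh Skolem constant, e.g.
∃y topper(y) becomes topper(c1) for a new constant c1.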

Step VI: Putting the resulting expression in conjunctive normal form (CNF):
For example, if the expression is of the form P 𝗏 (Q 𝖠 R), replace it by (P 𝗏 Q) 𝖠 (P 𝗏 R).
In the present context, the resulting expression is already in CNF, so no operation is needed at this step.

Step VII: Writing one clause per line:


If the resulting CNF expression is a conjunction of several clauses, rewrite it with each clause on its own
line; each such line is then a separate input clause for resolution.
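
The propositional core of Steps I, II and VI can be sketched in a few lines of Python. This is only an
illustrative sketch, not part of the original notes: the nested-tuple formula representation and the operator
names ('->', 'and', 'or', 'not') are assumptions made here.

# Formulas are nested tuples, e.g. ('->', 'P', ('and', 'Q', 'R')).

def eliminate_implication(f):
    """Step I: replace (P -> Q) by (~P v Q)."""
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate_implication(a) for a in args]
    if op == '->':
        return ('or', ('not', args[0]), args[1])
    return (op, *args)

def push_negation(f):
    """Step II: move negation inwards using De Morgan's laws."""
    if isinstance(f, str):
        return f
    op, *args = f
    if op == 'not':
        g = args[0]
        if isinstance(g, str):
            return f
        gop, *gargs = g
        if gop == 'not':                      # ~(~P) = P
            return push_negation(gargs[0])
        if gop == 'and':                      # ~(P ^ Q) = ~P v ~Q
            return ('or', *[push_negation(('not', a)) for a in gargs])
        if gop == 'or':                       # ~(P v Q) = ~P ^ ~Q
            return ('and', *[push_negation(('not', a)) for a in gargs])
    return (op, *[push_negation(a) for a in args])

def distribute_or(f):
    """Step VI: distribute v over ^, e.g. P v (Q ^ R) => (P v Q) ^ (P v R)."""
    if isinstance(f, str):
        return f
    op, *args = f
    args = [distribute_or(a) for a in args]
    if op == 'or':
        a, b = args
        if isinstance(a, tuple) and a[0] == 'and':
            return distribute_or(('and', ('or', a[1], b), ('or', a[2], b)))
        if isinstance(b, tuple) and b[0] == 'and':
            return distribute_or(('and', ('or', a, b[1]), ('or', a, b[2])))
    return (op, *args)

# Example: P v (Q ^ R) becomes (P v Q) ^ (P v R)
print(distribute_or(push_negation(eliminate_implication(('or', 'P', ('and', 'Q', 'R'))))))

Running this prints ('and', ('or', 'P', 'Q'), ('or', 'P', 'R')), i.e. (P 𝗏 Q) 𝖠 (P 𝗏 R), the corrected form of the
Step VI example above.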

This algorithm can be illustrated with the help of the following example:


Example 1:
Rewrite the following sentences in FOL:
1. Coconut-crunchy is a biscuit.

2. Mary is a child who takes coconut-crunchy.

3. John loves children who take biscuits.

4. John loves Mary.

The above statements can be represented in FOL using two variables X and Y.

1. Biscuit(coconut-crunchy)

2. Child(mary) 𝖠 Takes(mary, coconut-crunchy)


3. ∀X ((Child(X) 𝖠 ∃Y (Takes(X, Y) 𝖠 Biscuit(Y))) → Loves(john, X))
4. Loves(john, mary)

Unification
o Unification is the process of making two different logical atomic expressions identical by finding a
substitution. Unification depends on the substitution process.

e.g. like(ram, x) and like(ram, a) become identical under the substitution {a/x}.

o It takes two literals as input and makes them identical using substitution.
o Any substitution that makes two or more expressions equal is called a unifier for those expressions.
o Given two unifiable expressions C1 and C2, a unifier β with C1β = C2β is the most general unifier (mgu)
if every other unifier of C1 and C2 is an instance of β.

For e.g., two unifiers for the literals P(u, b, v) and P(w, x, y) are: α = {u/w, b/x, v/y} and β = {s/u, s/w, b/x,
c/v, c/y}. Here α is the mgu and β is an instance of α.

o Example: Find the MGU for Unify{King(x), King(John)}

Let Ψ1 = King(x), Ψ2 = King(John),

The substitution {John/x} is a unifier for these atoms; applying it makes both expressions identical.

o The UNIFY algorithm is used for unification, which takes two atomic sentences and returns a unifier
for those sentences (If any exist).
o Unification is a key component of all first-order inference algorithms.
o It returns fail if the expressions do not match with each other.
o The resulting substitution is called the Most General Unifier (MGU).
E.g. Let's say there are two different expressions, P(x, y), and P(a, f(z)).

In this example, we need to make both above statements identical to each other. For this, we will perform
the substitution.

P(x, y). ........ (i)


P(a, f(z))..........(ii)

o Substitute x with a, and y with f(z) in the first expression, and it will be represented as a/x and f(z)/y.
o With both the substitutions, the first expression will be identical to the second expression and the
substitution set will be: [a/x, f(z)/y].

Conditions for Unification:

Following are some basic conditions for unification:

o The predicate symbols must be the same; atoms or expressions with different predicate symbols can never be
unified.
o The number of arguments in both expressions must be identical.
o Unification fails if a variable occurs within the term it is being unified with (the occurs check).

For example, P(x) and P(f(x, a)) would require unifying x with f(x, a); since x occurs inside f(x, a), the
substitution f(x, a)/x is not allowed and the pair is not unifiable. By contrast, x and h(w) unify with h(w)/x.

Unification Algorithm:

Returns the mgu for a given set of expressions S.

Algorithm:

Step 1: Set k = 0 and σk = ε (the empty substitution).


Step 2: If the set Sσk is a singleton, then stop; σk is the mgu of S. Otherwise find the disagreement set Dk of
Sσk.
Step 3: If there is a variable v and a term t in Dk such that v does not occur in t, put σk+1 = σk{t/v}, set
k = k+1, and return to Step 2. Otherwise stop: S is not unifiable.
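
Before the worked examples below, here is a minimal Python sketch of unification with the occurs check. The
term representation is an assumption of this sketch, not fixed by the notes: variables are strings beginning
with '?', constants are plain strings, and compound terms/atoms are tuples such as ('P', 'a', '?x').

def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def substitute(t, s):
    """Apply substitution s (a dict var -> term) to term t."""
    if is_var(t):
        return substitute(s[t], s) if t in s else t
    if isinstance(t, tuple):
        return tuple(substitute(a, s) for a in t)
    return t

def occurs(v, t, s):
    """Occurs check: does variable v occur in term t under substitution s?"""
    t = substitute(t, s)
    if v == t:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, s) for a in t)

def unify(x, y, s=None):
    """Return an mgu of x and y extending s, or None if not unifiable."""
    if s is None:
        s = {}
    x, y = substitute(x, s), substitute(y, s)
    if x == y:
        return s
    if is_var(x):
        return None if occurs(x, y, s) else {**s, x: y}
    if is_var(y):
        return unify(y, x, s)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None  # predicate/function symbols or arities differ

# Example 1 from the notes: P(a, x, f(g(y))) and P(z, h(z, w), f(w))
print(unify(('P', 'a', '?x', ('f', ('g', '?y'))),
            ('P', '?z', ('h', '?z', '?w'), ('f', '?w'))))
# -> {'?z': 'a', '?x': ('h', 'a', '?w'), '?w': ('g', '?y')}, matching σ3 = {a/z, h(a,w)/x, g(y)/w} below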
e.g.

For each pair of the following atomic sentences, find the most general unifier (if one exists).

1. Find the MGU of S = { P(a, x, f(g(y))), P(z, h(z, w), f(w)) }


K = 0

σ0 = ε

D0 = {a, z}

σ1 = {ε} o {a/z} = {a/z}


K = k+1 = 1

Sσ1 = { P(a, x, f(g(y))), P(a, h(a, w), f(w)) }

D1 = {x, h(a, w)}

σ2 = {a/z} o {h(a, w)/x}

= {a/z, h(a, w)/x}

K = k+1 = 2

Sσ2 = { P(a, h(a, w), f(g(y))), P(a, h(a, w), f(w)) }

D2 = {g(y), w}

σ3 = {a/z, h(a, w)/x, g(y)/w}

Sσ3 = { P(a, h(a, w), f(g(y))), P(a, h(a, w), f(g(y))) }

D3 = NULL

Hence σ3 is the mgu.

Another simple case: Topper(X) and Topper(john) unify with the substitution {john/X}.

Other Examples:

S={P(a, x, h(g(z))), P(z, h(y), h(y))}

σ0 = ε

D0= {a, z}

σ1= {ε }o {a/z} ={a/z}

K=k+1 = 1

Sσ1={ P(a, x, h(g(a))), P(a, h(y), h(y)) }

D1= {x, h(y)}


σ2= {a/z}o{h(y)/x}= {a/z, h(y)/x}

K=k+1 = 2

Sσ2={P(a, h(y), h(g(a))), P(a, h(y), h(y))}

D2= {g(a), y}

σ3 = {a/z, h(y)/x} o {g(a)/y} = {a/z, h(g(a))/x, g(a)/y}

K = k+1 = 3

Sσ3 = { P(a, h(g(a)), h(g(a))), P(a, h(g(a)), h(g(a))) }

σ3 = {a/z, h(g(a))/x, g(a)/y} = mgu

S = { p(x, x), p(y, f(y)) }

Sσ1 = { p(y, y), p(y, f(y)) }

This set is not unifiable: unifying y with f(y) fails the occurs check.

S = { like(marry, john), like(Nancy, X) }

This set is not unifiable, since the constants marry and Nancy cannot be unified.

S= {like(X, john), like(Nancy, Z)}

Nancy/X

S= {like(X, john), like(Nancy, Z)}

John/Z

S= {like(Nancy, john), like(Nancy, john)}

Consider Sentences:

1. The members of St. bridge club are Joe, Sally, Bill and Ellen.
2. Joe is married to Sally.
3. Bill is Ellen‟s brother.
4. The spouse of every married person in club is also in club.
5. The last meeting of club was at Joe‟s House.

1. Member(joe) ^ Member(sally) ^ Member(bill) ^ Member(ellen).


2. Married(joe, sally)
3. Brother(bill, ellen).
4. ∀x ∀y married(x, y) ^ member(x) → member(y)
5. Meeting(last) ^ house(joe).
Inference methods in FOPL:
Chain Rule:

P → Q

Q → R

P → R

∀X human(X) → intelligent(X)
∀X intelligent(X) → succeed(X)

∀X human(X) → succeed(X)

Modus Ponens Rule:


P → Q
P
---
Q

∀X human(X) → intelligent(X)
human(john)        john/X
human(john) → intelligent(john)
Intelligent(john)

∀X dog(X) → faithful(X)
∀X dog(X) → has_tail(X)
Dog(tommy)

Tommy/X

dog(tommy) → faithful(tommy)
dog(tommy) → has_tail(tommy)
Dog(tommy)

New Facts:
Faithful(tommy)
Has_tail(tommy)

~dog(X) V faithful(X) --- (i)


~dog(X) V has_tail(X) --- (ii)
Dog(tommy) ------------ (iii)
Goals: Faithful(tommy), black(tommy)

To prove faithful(tommy), negate it and resolve:

~faithful(tommy)        with (i): ~dog(X) V faithful(X)

Tommy/X
~dog(tommy)             with (iii): Dog(tommy)

[]

To prove black(tommy), negate it:

~black(tommy)

None of the clauses (i)-(iii) contains the literal black(X), so ~black(tommy) has no complementary literal to
resolve against; the empty clause can never be derived, and black(tommy) does not follow from these clauses.

Resolution in Predicate Logic :

If we have a set of clauses C1, C2, C3, … Cn and we wish to prove or deduce clause D (D is a logical
consequence of C1, C2, C3, … Cn), then first negate D and add ~D to the set of clauses C1, C2, C3, … Cn.
Then, using resolution together with factoring, we can show that the set is unsatisfiable by deducing a
contradiction. Such a proof is called a proof by refutation, which, if successful, yields the empty clause, denoted
by [ ].

Given two clauses C1 and C2 with no variables in common, if there is a literal l1 in C1 which is the
complement of a literal l2 in C2, then both l1 and l2 are deleted and a disjunction C is formed from the
remaining literals. The new clause C is called the resolvent of C1 and C2.

Resolution is a process of generating these resolvents from a set of clauses.

Consider the example,

Resolve two clauses


(~P V Q) and (~Q V R)

We write
~PVQ, ~QVR
~PVR

Several types of resolution are possible depending upon number and types of parents.

Binary Resolution:

Two clauses containing complementary literals are resolved. For example, the resolvent of

~P(x, a) V Q(x) and ~Q(b) V R(x)

is ~P(b, a) V R(b), using the substitution b/x.
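
The refutation procedure can be sketched in Python. To keep the sketch short it works only on ground
(propositional) clauses, so no unification is needed; the clause encoding is an assumption made here, and the
knowledge base re-uses the Tommy clauses (i)-(iii) from earlier in these notes.

# A clause is a frozenset of literals; a literal is a string, with '~' marking negation.

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolvents(c1, c2):
    """All binary resolvents of clauses c1 and c2."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def refute(clauses, goal):
    """Refutation proof: add the negated goal and search for the empty clause."""
    clauses = set(clauses) | {frozenset({negate(goal)})}
    while True:
        new = set()
        for a in list(clauses):
            for b in list(clauses):
                if a == b:
                    continue
                for r in resolvents(a, b):
                    if not r:            # empty clause [] derived: goal is proved
                        return True
                    new.add(r)
        if new <= clauses:               # nothing new: goal cannot be proved
            return False
        clauses |= new

kb = [frozenset({'~dog(tommy)', 'faithful(tommy)'}),
      frozenset({'~dog(tommy)', 'has_tail(tommy)'}),
      frozenset({'dog(tommy)'})]
print(refute(kb, 'faithful(tommy)'))   # True  -> faithful(tommy) follows
print(refute(kb, 'black(tommy)'))      # False -> black(tommy) does not follow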

Unit Resulting Resolution:


A number of clauses are resolved simultaneously to produce a unit clause.
{~married(x, y) V ~mother(x, z) V father(y, z),
married(sue, joe), ~father(joe, bill)}

Using the mgu {sue/x, joe/y, bill/z} results in the unit clause ~mother(sue, bill).

Linear Resolution:

When each resolved clause Ci is parent to clause Ci+1 (i= 1,2, …, n-1)
e.g., C0 with some clause B0 to get C1 and then C1 with some clause B1 to get C2 and so on until Cn has been
derived.
C0 and B0

C1and B1

C2

Linear Input Resolution:

If one of the parents in linear resolution is always from original set of clauses (Bi).

e.g., given set of clauses S = { P VQ, ~P V Q, P V ~Q, ~P V ~Q}

Let C0 = P V Q

Choosing B0 = ~P V Q from set S and resolving this with C0 we obtain resolvent Q = C1.
B1 must now be chosen from S and the resolvent of C1 and B1 becomes C2 and so on.
Let C0 = P V Q and B0 = ~P V Q

Resolving C0 with B0 gives Q V Q = Q

C1= Q B1 = P V ~Q
C2 = P B2 = ~P V ~ Q
C3 = ~Q
Consider sentences again:

1. Ram likes only easy courses.


2. Engg courses are hard.
3. All courses in arts are easy.
4. AR04 is an art course.

1. ∀x course(x) ^ easy(x) → like(ram, x)


2. ∀x course(x) ^ engg(x) → hard(x)
3. ∀x course(x) ^ art(x) → easy(x)
4. Course(AR04) ^ art(AR04)

Clausal Conversion

1. ~(course(x) ^ easy(x)) v like (ram, x)


~course(x) v ~easy(x) v like (ram, x) ------ (i)
2. ~(course(x) ^ engg (x)) v hard (x)
~course(x) v ~ engg (x) v hard (x) ----------- (ii)
3. ~(course(x) ^ art (x)) v easy(x)
~course(x) v ~art (x) v easy(x) -------------- (iii)
4. Course (AR04) (iv)
art (AR04) (v)

Using resolution, determine which course Ram would like.

Start from the answer-literal clause like(Ram, x) V ~like(Ram, x) and repeatedly resolve with the numbered clauses:

like(Ram, x) V ~like(Ram, x)              resolve with (i)

~course(x) v ~easy(x) v like(Ram, x)      resolve with (iii)

~course(x) v ~art(x) v like(Ram, x)       resolve with (iv), substituting AR04/x

~art(AR04) v like(Ram, AR04)              resolve with (v)

like(Ram, AR04)


Consider sentences again:
1. John likes all kinds of food.
∀x food(x) → likes(john, x)
2. Apples are food.
food(apple)
3. Chicken is food.
food(chicken)
4. Anything anyone eats and isn't killed by is food.
∀x ∀y eats(y, x) 𝖠 ¬killed(y) → food(x)
5. Bill eats peanuts and is still alive.
Eats(Bill, peanuts) ^ alive(Bill)
6. Sue eats everything Bill eats.
∀x eats(Bill, x) → eats(Sue, x)
7. Alive means not killed.
∀z alive(z) → ¬killed(z)
Clausal Conversion
1. ¬food(x) 𝗏 likes(John, x)
2. food(apples)
3. food(chicken)
4. ¬eats(y, x) 𝗏 killed(y) 𝗏 food(x)
5. Eats(Bill, peanuts)
6. Alive(Bill)
7. ¬eats(Bill, x) 𝗏 eats(Sue, x)
8. ¬alive(x) 𝗏 ¬killed(x)
Resolution proof that John likes peanuts: likes(john, peanuts)

¬likes(John, peanuts)              with (1): ¬food(x) 𝗏 likes(John, x)

Peanuts/x

¬food(peanuts)                     with (4): ¬eats(y, x) 𝗏 killed(y) 𝗏 food(x)

Peanuts/x

¬eats(y, peanuts) 𝗏 killed(y)      with (5): Eats(Bill, peanuts)

Bill/y
killed(Bill)                       with (8): ¬alive(x) 𝗏 ¬killed(x)
Bill/x

¬alive(Bill)                       with (6): Alive(Bill)

[]

Prove that John likes peanuts using backward chaining:

likes(john, peanuts)

food(peanuts)

eats(y, peanuts) 𝖠 ¬killed(y)

Using the facts Eats(Bill, peanuts) and Alive(Bill) (hence ¬killed(Bill)) with Bill/y, both subgoals succeed.

NIL

What food does Sue eat?

eats(Sue, x) V ~eats(Sue, x)        with (7): ¬eats(Bill, x) 𝗏 eats(Sue, x)

eats(Sue, x) V ¬eats(Bill, x)       with (5): Eats(Bill, peanuts)

peanuts/x
eats(Sue, peanuts)

Does John like peanuts? Negated goal: ~likes(john, peanuts)


What food does John like?
Answer-literal clause: like(john, x) V ~like(john, x)

Consider Sentences:
1. The members of St. bridge club are Joe, Sally, Bill and Ellen.
2. Joe is married to Sally.
3. Bill is Ellen‟s brother.
4. The spouse of every married person in club is also in club.
5. The last meeting of club was at Joe‟s House.
6. The spouse lives in same house.
Represent these facts in predicate logic. Construct a resolution proof to demonstrate the truth of each of the
statements listed below. Do so if possible; otherwise add the facts you need and then construct the proof.

The last meeting of club was at Sally‟s House.

Ellen is not married.

1. Member (joe)
2. Member(sally)
3. Member(bill)
4. Member(ellen).
5. Married(joe, sally)
6. Brother(bill, ellen).
7. ~married (x, y) V ~ member(x) V member(y)
8. Meeting(last)
9. house(joe)
10. ~married (x, y) V ~house(x) V house(y)

To Prove:
Meeting(last) ^ house(sally)

~meeting(last) V ~house(sally)         with meeting(last)

~house(sally)                          with ~married(x, y) V ~house(x) V house(y)

sally/y

~married(x, sally) V ~house(x)         with house(joe)

Joe/x

~married(joe, sally)                   with married(joe, sally)

[]

To prove "Ellen is not married", negate the goal:

married(Ellen, y)                  with clause 7: ~married(x, y) V ~member(x) V member(y)

Ellen/x

~member(ellen) V member(y)         with member(ellen)

member(y)

y could be sally, bill or joe, but joe is married to sally and bill is ellen's brother. Therefore we reach a
contradiction with the available facts, and hence Ellen is not married.

P ^ Q → ~R
≡ ~P V ~Q V ~R

P ^ R → ~Q

≡ ~P V ~R V ~Q

Forward Chaining System and Backward Chaining System (Goal-Driven Approach)

Facts

P
Q
R

Rules

P ^ R → S
P ^ S → T

New facts derived by forward chaining:
S
T
This is the data-driven approach.

In Artificial Intelligence, the purpose of search is to find a path through a problem space. There are
two ways to pursue such a search: forward and backward reasoning. The significant difference
between them is that forward reasoning starts with the initial data and works towards the goal. Conversely,
backward reasoning works in the opposite fashion, where the purpose is to determine the initial facts and
information with the help of the given results.

The same set of rules can be used for forward and backward reasoning, but it is useful to define two classes
of rules, each of which encodes a particular kind of knowledge:

1. Forward rules, which encode knowledge about how to respond to certain input configurations.
2. Backward rules, which encode knowledge about how to achieve particular goals.
Separating the rules into these two classes gives two kinds of rule systems:

In a forward chaining system, we start with the initial facts and keep using the rules to draw new conclusions. In
a backward chaining system, we start with some hypothesis (or goal) we are trying to prove, and keep looking
for rules that would allow us to conclude that hypothesis.

Forward chaining systems are data-driven, while backward chaining systems are goal-driven.

Forward Chaining System:

The facts in the system are represented in a working memory, which is continually updated. Rules in the system
represent possible actions to take; these are sometimes also called condition-action rules. The conditions are usually
patterns that must match items in working memory, while the actions usually involve adding or deleting items
from working memory.

The interpreter controls the application of the rules. It first finds all the rules whose conditions match
the current state of working memory, then selects one and performs the action in the action part of the rule. The
action results in a new working memory, and the cycle begins again. This cycle is repeated until either no rules
fire or some specified goal is satisfied.
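
A minimal Python sketch of this match-select-act cycle, using the propositional facts and rules from the
example above (facts P, Q, R and rules P ^ R → S, P ^ S → T). The (conditions, conclusion) rule
representation is an assumption of the sketch, not part of the notes.

def forward_chain(facts, rules):
    """Repeatedly fire rules whose conditions match working memory."""
    memory = set(facts)                       # working memory
    fired = True
    while fired:
        fired = False
        for conditions, conclusion in rules:
            if conditions <= memory and conclusion not in memory:
                memory.add(conclusion)        # action: add the new fact
                fired = True
    return memory

facts = {'P', 'Q', 'R'}
rules = [({'P', 'R'}, 'S'),
         ({'P', 'S'}, 'T')]
print(forward_chain(facts, rules))            # {'P', 'Q', 'R', 'S', 'T'} (set order may vary)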

Definition of Forward Reasoning

Solving a problem generally starts from the initial data and facts in order to arrive at the solution.
These known facts and pieces of information are used to deduce the result. For example, while diagnosing a patient
the doctor first checks the symptoms and medical condition of the body, such as temperature, blood pressure,
pulse, eye colour, blood, etc. After that, the patient's symptoms are analysed and compared against the
predetermined symptoms. Then the doctor is able to provide medicines according to the symptoms of
the patient. When a solution employs this manner of reasoning, it is known as forward reasoning.

Steps followed in forward reasoning

The inference engine explores the knowledge base with the provided information, looking for rules whose
antecedents match the given current state.

o In the first step, the system is given one or more constraints.
o Then the rules are searched in the knowledge base for each constraint. The rules whose condition
(i.e., IF part) is fulfilled are selected.
o Each selected rule produces new conditions from its conclusion; as a result, the THEN part is added
to the existing facts.
o The added conditions are processed again by repeating step 2. The process ends when no new
conditions exist.

Definition of Backward Reasoning

Backward reasoning is the inverse of forward reasoning: the goal is analysed in order to deduce the
rules, initial facts and data. We can understand the concept with the same example given in the above
definition, where the doctor is trying to diagnose the patient with the help of initial data such as
symptoms. However, in this case, the patient is experiencing a problem in his body, on the basis of which
the doctor has to establish the symptoms. This kind of reasoning comes under backward reasoning.

Steps followed in backward reasoning

In this type of reasoning, the system chooses a goal state and reasons in the backward direction. The steps are
as follows.

o Firstly, the goal state and the rules are selected in which the goal state resides in the THEN part as
the conclusion.
o From the IF part of the selected rule, the subgoals that must be satisfied for the goal state to be true
are formed.
o The initial conditions needed to satisfy all the subgoals are set.
o Verify whether the provided initial state matches the established states. If it fulfils the condition,
then the goal is the solution; otherwise another goal state is selected.

A backward chaining system does not need to update a working memory.

Suppose we have the following rules:

1. If lecturing X ^ marking_practical X → overworked X


2. If month february → lecturing alison
3. If month february → marking_practical alison
4. If overworked X → bad_mood X
5. If slept_badly X → bad_mood X
6. If month february → weather cold
7. If year 1993 → economy bad

Facts:

month february

year 1993

Prove:

bad_mood X?

X = alison

Backward chaining on rule 4 produces the subgoal overworked X, and rule 1 then produces the subgoals
?- lecturing X and ?- marking_practical X, both of which succeed for X = alison from the fact month february.
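
A minimal Python sketch of backward chaining over these rules. For simplicity the atoms are written as
ground strings (e.g. 'lecturing(alison)'), so no variable matching or unification is performed; that encoding
is an assumption of the sketch.

# Each rule is (conclusion, [subgoals]); a goal is proved if it is a fact or
# if some rule with that conclusion has all of its subgoals provable.
rules = [
    ('overworked(alison)',        ['lecturing(alison)', 'marking_practical(alison)']),
    ('lecturing(alison)',         ['month(february)']),
    ('marking_practical(alison)', ['month(february)']),
    ('bad_mood(alison)',          ['overworked(alison)']),
    ('bad_mood(alison)',          ['slept_badly(alison)']),
    ('weather(cold)',             ['month(february)']),
    ('economy(bad)',              ['year(1993)']),
]
facts = {'month(february)', 'year(1993)'}

def backward_chain(goal):
    """Prove goal by matching facts or recursively proving some rule's subgoals."""
    if goal in facts:
        return True
    for conclusion, subgoals in rules:
        if conclusion == goal and all(backward_chain(g) for g in subgoals):
            return True
    return False

print(backward_chain('bad_mood(alison)'))   # True: via overworked(alison)
print(backward_chain('bad_mood(joe)'))      # False: no rule or fact supports it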

Key Differences Between Forward and Backward Reasoning in AI


1. Forward reasoning is a data-driven approach, while backward reasoning is goal-driven.

2. The process starts with new data and facts in forward reasoning. Conversely, backward reasoning
begins with the results.
3. Forward reasoning aims to determine the result by following a sequence of steps. On the other hand,
backward reasoning emphasizes the facts that support the conclusion.

4. Forward reasoning is an opportunistic approach because it can produce different results. By
contrast, in backward reasoning a specific goal can only have certain predetermined initial data, which
makes it restricted.

5. The flow of forward reasoning is from the antecedent to the consequent, while backward reasoning
works in reverse order, starting from the conclusion and working back to the initial facts.

Ontological Engineering
Ontology refers to organizing everything in the world into a hierarchy of categories.
Representing abstract concepts such as Actions, Time, Physical Objects, and Beliefs is called
Ontological Engineering.
Ques: How are categories useful in knowledge representation?

CATEGORIES AND OBJECTS


The organization of objects into categories is a vital part of knowledge representation. Although interaction
with the world takes place at the level of individual objects, much reasoning takes place at the level of
categories.

Taxonomy
Subclass relations organize categories into a taxonomy, or taxonomic hierarchy. Taxonomies have been
used explicitly for centuries in technical fields. For example, systematic biology aims to provide a
taxonomy of all living and extinct species; library science has developed a taxonomy of all fields of
knowledge, encoded as the Dewey Decimal system; and tax authorities and other government departments
have developed extensive taxonomies of
occupations and commercial products. Taxonomies are also an important aspect of general commonsense
knowledge.
First-order logic makes it easy to state facts about categories, either by relating objects to categories or by
quantifying over their members, for example:
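Illustrative facts (the category and object names here follow the usual textbook treatment and are only examples, not taken from these notes):

Member(BB9, Basketballs)                        — an object is a member of a category
Subset(Basketballs, Balls)                      — one category is a subclass of another
∀x Member(x, Basketballs) → Spherical(x)        — all members of a category share a property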
Ontology of Situation Calculus
Situations are logical terms consisting of the initial situation (usually called S0) and all situations that are
generated by applying an action to a situation. The function Result(a, s) (sometimes called Do) names the
situation that results when action a is executed in situation s.
Fluents are functions and predicates that vary from one situation to the next, such as the location of the
agent or the aliveness of the wumpus. The dictionary says a fluent is something that flows, like a liquid. In
this use, it means flowing or changing across situations. By convention, the situation is always the last
argument of a fluent. For
example, ¬Holding(G1, S0) says that the agent is not holding the gold G1 in the initial situation S0.
Age(Wumpus, S0) refers to the wumpus's age in S0.
Atemporal or eternal predicates and functions are also allowed. Examples include the predicate Gold(G1)
and the function LeftLegOf(Wumpus).
Time and Event Calculus
Situation calculus works well when there is a single agent performing instantaneous, discrete actions. When
actions have duration and can overlap with each other, situation calculus becomes somewhat awkward.
Therefore, we cover those topics with an alternative formalism known as event calculus, which is based on
points in time rather than on situations.
(The terms "event" and "action" may be used interchangeably. Informally, "event" connotes a wider class
of actions, including ones with no explicit agent. These are easier to handle in event calculus than in
situation calculus.)
In event calculus, fluents hold at points in time rather than at situations, and the calculus is designed to
allow reasoning over intervals of time. The event calculus axiom says that a fluent is true at a point in time
if the fluent was initiated by an event at some time in the past and was not terminated by an intervening
event. The Initiates and Terminates relations play a role similar to the Result relation in situation calculus;
Initiates(e, f, t) means that the occurrence of event e at time t causes fluent f to become true, while
Terminates(e, f, t) means that f ceases to be true. We use Happens(e, t) to mean that event e happens at
time t.
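
One common way of writing this axiom (the exact notation varies between presentations of the event calculus) is:

T(f, t2) ⇐ ∃e, t1 [ Happens(e, t1) 𝖠 Initiates(e, f, t1) 𝖠 (t1 < t2) 𝖠 ¬Clipped(f, t1, t2) ]

Clipped(f, t1, t2) ⇔ ∃e, t [ Happens(e, t) 𝖠 Terminates(e, f, t) 𝖠 (t1 < t) 𝖠 (t < t2) ]

where T(f, t2) means that fluent f holds at time t2, and Clipped(f, t1, t2) means that f was terminated by some
event between t1 and t2.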

Semantic Networks

Semantic networks are capable of representing individual objects, categories of objects, and relations among
objects. Objects or category names are represented in ovals and are connected by labeled arcs.
(Figure: semantic network example.)
