Unit 3: AI
1. Relational knowledge:
o It is the simplest way of storing facts. It uses the relational method, and each fact
about a set of objects is set out systematically in columns.
o This approach of knowledge representation is famous in database systems where the
relationship between different entities is represented.
o This approach has little opportunity for inference.
Player    Weight   Age
Player1   65       23
Player2   58       18
Player3   75       24
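As a rough illustration (not part of the original notes), the same relational facts could be held in a simple Python structure; the column names weight and age are assumptions about what the numbers mean:

```python
# Relational knowledge: each fact about a player is a row with the same fixed columns.
players = [
    {"name": "Player1", "weight": 65, "age": 23},
    {"name": "Player2", "weight": 58, "age": 18},
    {"name": "Player3", "weight": 75, "age": 24},
]

# Lookup is just a scan over rows; as noted above, this leaves little room for inference.
def weight_of(name):
    for row in players:
        if row["name"] == name:
            return row["weight"]
    return None

print(weight_of("Player2"))  # 58
```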
2. Inheritable knowledge:
o In the inheritable knowledge approach, all data must be stored in a hierarchy of
classes.
o All classes should be arranged in a generalized form or a hierarchical manner.
o In this approach, we apply the inheritance property.
o Elements inherit values from other members of a class.
o This approach contains inheritable knowledge which shows a relation between an
instance and a class; this is called an instance relation.
o Every individual frame can represent a collection of attributes and their values.
o In this approach, objects and values are represented in boxed nodes.
o We use arrows which point from objects to their values.
o Example:
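As a rough illustration of the inheritance property described above, here is a minimal Python sketch; the class names and attribute values are hypothetical, chosen only to show how instances inherit values from classes higher in the hierarchy:

```python
# Inheritable knowledge: classes arranged in a hierarchy; instances inherit
# attribute values from the classes above them.
class Person:
    is_adult = True            # value stored at the general (class) level

class Player(Person):          # Player "isa" Person, so it inherits is_adult
    plays_sport = True

class CricketPlayer(Player):   # a more specific class in the hierarchy
    bats = True

# An individual (instance) inherits values along the whole chain.
player1 = CricketPlayer()
print(player1.is_adult, player1.plays_sport, player1.bats)  # True True True
```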
3. Inferential knowledge:
o In the inferential knowledge approach, knowledge is represented in the form of formal
logic, which can be used to derive new facts.
o Example: the statements "Marcus is a man" and "All men are mortal" can be
represented as:
man(Marcus)
∀x man(x) → mortal(x)
4. Procedural knowledge:
o The procedural knowledge approach uses small programs and codes which describe how
to do specific things and how to proceed.
o In this approach, one important rule is used: the If-Then rule.
o With this kind of knowledge, we can use various coding languages such as LISP
and Prolog.
o We can easily represent heuristic or domain-specific knowledge using this approach.
o However, it is not necessary that all cases can be represented in this approach.
There are mainly four ways of knowledge representation which are given as follows:
1. Logical Representation
2. Semantic Network Representation
3. Frame Representation
4. Production Rules
1. Logical Representation
Logical representation is a language with some concrete rules which deals with propositions
and has no ambiguity in representation. Logical representation means drawing conclusions
based on various conditions. This representation lays down some important communication
rules. It consists of precisely defined syntax and semantics which support sound
inference. Each sentence can be translated into logic using syntax and semantics.
Syntax:
o Syntax consists of the rules which decide how we can construct legal sentences in the logic.
o It determines which symbols we can use in knowledge representation and how to write
those symbols.
Semantics:
o Semantics are the rules by which we can interpret the sentence in the logic.
o Semantic also involves assigning a meaning to each sentence.
Logical representation can be categorized into mainly two logics:
a. Propositional Logics
b. Predicate logics
Disadvantages of logical representation:
1. Logical representations have some restrictions and are challenging to work with.
2. The logical representation technique may not be very natural, and inference may not be
very efficient.
2. Semantic Network Representation
Semantic networks represent knowledge in the form of graphical networks of nodes, which
represent objects, and arcs, which describe the relationships between those objects. This
representation consists of mainly two types of relations:
a. IS-A relation (Inheritance)
b. Kind-of relation
Example: Following are some statements which we need to represent in the form of nodes
and arcs.
Statements:
a. Jerry is a cat.
b. Jerry is a mammal
c. Jerry is owned by Priya.
d. Jerry is brown colored.
e. All mammals are animals.
In the above diagram, we have represented the different types of knowledge in the form of
nodes and arcs. Each object is connected with another object by some relation.
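A minimal sketch of how the statements about Jerry could be stored as a semantic network of (node, relation, node) arcs; the relation names used here are assumptions, not taken from the notes:

```python
# Semantic network as a list of (subject, relation, object) arcs.
edges = [
    ("Jerry", "isa", "Cat"),
    ("Jerry", "isa", "Mammal"),
    ("Jerry", "owned_by", "Priya"),
    ("Jerry", "has_color", "Brown"),
    ("Mammal", "isa", "Animal"),
]

def related(subject, relation):
    """Return every object connected to `subject` by `relation`."""
    return [o for s, r, o in edges if s == subject and r == relation]

print(related("Jerry", "isa"))    # ['Cat', 'Mammal']
print(related("Mammal", "isa"))   # ['Animal']
```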
Drawbacks in semantic representation:
1. Semantic networks take more computational time at runtime, as we may need to traverse
the complete network to answer a question.
2. Semantic networks try to model human-like memory (which has about 10^15 neurons and
links) to store the information, but in practice it is not possible to build such a vast
semantic network.
3. These types of representations are inadequate as they do not have any equivalent
quantifier, e.g., for all, for some, none, etc.
4. Semantic networks do not have any standard definition for the link names.
5. These networks are not intelligent and depend on the creator of the system.
3. Frame Representation
A frame is a record-like structure which consists of a collection of attributes and their values to
describe an entity in the world. Frames are an AI data structure which divides knowledge
into substructures by representing stereotyped situations. It consists of a collection of slots
and slot values. These slots may be of any type and size. Slots have names and values which
are called facets.
Facets: The various aspects of a slot are known as facets. Facets are features of frames which
enable us to put constraints on the frames. Example: IF-NEEDED facets are called when data
of a particular slot is needed. A frame may consist of any number of slots, and a slot may
include any number of facets, and facets may have any number of values. A frame is also
known as slot-filler knowledge representation in artificial intelligence.
Frames are derived from semantic networks and later evolved into our modern-day classes
and objects. A single frame is not very useful on its own. A frame system consists of a collection of
frames which are connected. In a frame, knowledge about an object or event can be stored
together in the knowledge base. The frame is a type of technology which is widely used in
various applications including natural language processing and machine vision.
Example 1:
Slots    Fillers
Genre    Computer Science
Year     1996
Page     1152
Example 2:
Let's suppose we are taking an entity, Peter. Peter is an engineer by profession, his age
is 25, he lives in the city of London, and the country is England. The following is the frame
representation for this:
Slots        Fillers
Name         Peter
Profession   Engineer
Age          25
Weight       78
City         London
Country      England
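A minimal sketch of the Peter frame above as a Python dictionary of slot-filler pairs, together with a simple default for missing values; the helper name get_slot is hypothetical:

```python
# A frame is essentially a named collection of slots and their fillers.
peter_frame = {
    "Name": "Peter",
    "Profession": "Engineer",
    "Age": 25,
    "Weight": 78,
    "City": "London",
    "Country": "England",
}

# Default / missing-value handling, one of the advantages listed below.
def get_slot(frame, slot, default="Unknown"):
    return frame.get(slot, default)

print(get_slot(peter_frame, "City"))    # London
print(get_slot(peter_frame, "Hobby"))   # Unknown (default filler)
```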
Advantages of frame representation:
1. The frame knowledge representation makes programming easier by grouping related data.
2. The frame representation is comparably flexible and is used by many applications in AI.
3. It is very easy to add slots for new attributes and relations.
4. It is easy to include default data and to search for missing values.
5. Frame representation is easy to understand and visualize.
4. Production Rules
The production rules system consists of (condition, action) pairs which mean, "If condition then
action". It has mainly three parts:
o The set of production rules
o Working memory
o The recognize-act cycle
In production rules, the agent checks for the condition, and if the condition exists, then the production
rule fires and the corresponding action is carried out. The condition part of the rule determines
which rule may be applied to a problem, and the action part carries out the associated
problem-solving steps. This complete process is called the recognize-act cycle.
The working memory contains the description of the current state of problem-solving, and
rules can write knowledge to the working memory. This knowledge may then match and fire other
rules.
If a new situation (state) is generated, then multiple production rules may be triggered
together; this is called the conflict set. In this situation, the agent needs to select a rule from this
set, and this is called conflict resolution.
Example:
o IF (at bus stop AND bus arrives) THEN action (get into the bus)
o IF (on the bus AND paid AND empty seat) THEN action (sit down).
o IF (on bus AND unpaid) THEN action (pay charges).
o IF (bus arrives at destination) THEN action (get down from the bus).
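A minimal sketch of the recognize-act cycle over the bus rules above: working memory is a set of facts, each rule is a (conditions, action) pair, and a rule fires when its conditions are contained in working memory. The exact encoding of the facts is an assumption:

```python
# Each production rule is (set of condition facts, fact added by the action).
rules = [
    ({"at bus stop", "bus arrives"}, "got into the bus"),
    ({"got into the bus", "paid", "empty seat"}, "sat down"),
    ({"got into the bus", "unpaid"}, "paid"),
    ({"bus arrives at destination"}, "got down from the bus"),
]

working_memory = {"at bus stop", "bus arrives", "unpaid", "empty seat"}

# Recognize-act cycle: fire every rule whose conditions hold, add its action,
# and repeat until no new facts can be added.
changed = True
while changed:
    changed = False
    for conditions, action in rules:
        if conditions <= working_memory and action not in working_memory:
            working_memory.add(action)   # the rule "fires"
            changed = True

print(working_memory)
# {'at bus stop', 'bus arrives', 'unpaid', 'empty seat',
#  'got into the bus', 'paid', 'sat down'}
```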
Disadvantages of Production rule:
1. The production rule system does not exhibit any learning capabilities, as it does not store
the results of problems for future use.
2. During the execution of the program, many rules may be active; hence, rule-based
production systems are inefficient.
PROPOSITIONAL LOGIC
Propositional logic (PL) is the simplest form of logic where all the statements are made by
propositions. A proposition is a declarative statement which is either true or false. It is a
technique of knowledge representation in logical and mathematical form.
Example:
a) It is Sunday.
b) The Sun rises from the West. (False proposition)
c) 3 + 3 = 7. (False proposition)
d) 5 is a prime number.
Syntax of propositional logic:
The syntax of propositional logic defines the allowable sentences for the knowledge
representation. There are two types of Propositions:
a. Atomic Propositions
b. Compound propositions
Example (atomic proposition): "2 + 2 is 4" is an atomic proposition, as it is a single true fact.
Example (compound proposition): "It is raining today, and the streets are wet" is a compound
proposition, as it combines two simpler propositions.
Logical Connectives:
Logical connectives are used to connect two simpler propositions or to represent a sentence
logically. We can create compound propositions with the help of logical connectives. There
are mainly five connectives, which are given as follows:
1. Negation: A sentence such as ¬P is called a negation of P.
2. Conjunction: A sentence which has the ∧ connective, such as P ∧ Q, is called a conjunction.
3. Disjunction: A sentence which has the ∨ connective, such as P ∨ Q, is called a disjunction.
4. Implication: A sentence such as P → Q is called an implication. Implications are
also known as if-then rules. Example: "If it is raining, then the street is wet."
Let P = It is raining and Q = The street is wet, so it is represented as P → Q.
5. Biconditional: A sentence such as P ⇔ Q is a biconditional sentence. Example: "If I
am breathing, then I am alive."
Let P = I am breathing and Q = I am alive; it can be represented as P ⇔ Q.
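A short sketch that prints the truth tables of the implication and biconditional connectives described above, so the definitions can be checked row by row:

```python
from itertools import product

def implies(p, q):        # P -> Q is false only when P is true and Q is false
    return (not p) or q

def biconditional(p, q):  # P <-> Q is true exactly when both sides agree
    return p == q

print("P      Q      P->Q   P<->Q")
for p, q in product([True, False], repeat=2):
    print(f"{p!s:6} {q!s:6} {implies(p, q)!s:6} {biconditional(p, q)!s:6}")
```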
Limitations of propositional logic:
o We cannot represent relations like ALL, some, or none with propositional logic.
Example:
a. All the girls are intelligent.
b. Some apples are sweet.
o Propositional logic has limited expressive power.
o In propositional logic, we cannot describe statements in terms of their properties or
logical relationships.
Inference:
In artificial intelligence, we need intelligent computers which can create new logic from old
logic or from evidence, so generating conclusions from evidence and facts is termed
Inference.
Inference rules:
Inference rules are the templates for generating valid arguments. Inference rules are applied
to derive proofs in artificial intelligence, and a proof is a sequence of conclusions that
leads to the desired goal.
In inference rules, the implication among all the connectives plays an important role.
Following are some terminologies related to inference rules:
o Implication: It is one of the logical connectives, which can be represented as P → Q.
o Converse: The converse of the implication P → Q swaps its two sides; it can be written
as Q → P.
o Contrapositive: The negation of converse is termed as contrapositive, and it can be
represented as ¬ Q → ¬ P.
o Inverse: The negation of implication is called inverse. It can be represented as ¬ P →
¬ Q.
From the above terms, some of the compound statements are equivalent to each other, which
we can prove using a truth table:
Hence, from the above truth table, we can prove that P → Q is equivalent to ¬Q → ¬P, and
Q → P is equivalent to ¬P → ¬Q.
1. Modus Ponens:
The Modus Ponens rule is one of the most important rules of inference, and it states that if P
and P → Q are true, then we can infer that Q will be true. It can be represented as:
P → Q, P ⊢ Q
Example:
Statement-1: "If I am sleepy then I go to bed" ==> P → Q
Statement-2: "I am sleepy" ==> P
Conclusion: "I go to bed." ==> Q
2. Modus Tollens:
The Modus Tollens rule states that if P → Q is true and ¬Q is true, then ¬P will also be true. It
can be represented as:
P → Q, ¬Q ⊢ ¬P
Example:
Statement-1: "If I am sleepy then I go to bed" ==> P→ Q
Statement-2: "I do not go to the bed."==> ~Q
Statement-3: Which infers that "I am not sleepy" => ~P
3. Hypothetical Syllogism:
The Hypothetical Syllogism rule states that if P → Q is true and Q → R is true, then P → R
will also be true. It can be represented as the following notation:
P → Q, Q → R ⊢ P → R
Example:
Statement-1: If you have my home key then you can unlock my home. P→Q
Statement-2: If you can unlock my home then you can take my money. Q→R
Conclusion: If you have my home key then you can take my money. P→R
4. Disjunctive Syllogism:
The Disjunctive Syllogism rule states that if P ∨ Q is true and ¬P is true, then Q will be true. It
can be represented as:
P ∨ Q, ¬P ⊢ Q
Example:
Statement-1: "Today is Sunday or Monday." ==> P ∨ Q
Statement-2: "Today is not Sunday." ==> ¬P
Conclusion: "Today is Monday." ==> Q
5. Addition:
The Addition rule is one of the common inference rules, and it states that if P is true, then P ∨ Q
will be true. It can be represented as:
P ⊢ P ∨ Q
Example:
Statement: "I have a vanilla ice-cream." ==> P
Conclusion: "I have vanilla ice-cream or chocolate ice-cream." ==> P ∨ Q
6. Simplification:
The Simplification rule states that if P ∧ Q is true, then P or Q individually will also be true. It can be
represented as:
P ∧ Q ⊢ P (and similarly P ∧ Q ⊢ Q)
7. Resolution:
The Resolution rule states that if P ∨ Q and ¬P ∨ R are true, then Q ∨ R will also be true. It can be
represented as:
P ∨ Q, ¬P ∨ R ⊢ Q ∨ R
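A small sketch that verifies two of the rules above (Modus Ponens and the Resolution rule) by brute force: it checks that in every truth assignment where all premises are true, the conclusion is also true. The helper names are hypothetical:

```python
from itertools import product

def implies(p, q):
    return (not p) or q

def valid(premises, conclusion, n_vars):
    """True if the conclusion holds in every assignment where all premises hold."""
    for values in product([True, False], repeat=n_vars):
        if all(f(*values) for f in premises) and not conclusion(*values):
            return False
    return True

# Modus Ponens: from P -> Q and P, infer Q.
print(valid([lambda p, q: implies(p, q), lambda p, q: p],
            lambda p, q: q, 2))                          # True

# Resolution: from P v Q and ~P v R, infer Q v R.
print(valid([lambda p, q, r: p or q, lambda p, q, r: (not p) or r],
            lambda p, q, r: q or r, 3))                  # True
```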
In the topic of propositional logic, we have seen how to represent statements using
propositional logic. But unfortunately, in propositional logic we can only represent facts
which are either true or false. PL is not sufficient to represent complex sentences or
natural language statements, and it has very limited expressive power.
Consider sentences such as "All the girls are intelligent" or "Some apples are sweet", which
we cannot represent using PL.
To represent such statements, PL is not sufficient, so we require a more
powerful logic, such as first-order logic.
First-Order logic:
o Objects: A, B, people, numbers, colors, wars, theories, squares, pits, wumpus,
......
o Relations: It can be a unary relation such as: red, round, is adjacent, or an n-ary
relation such as: the sister of, brother of, has color, comes between
o Function: Father of, best friend, third inning of, end of, ......
o As a natural language, first-order logic also has two main parts:
a. Syntax
b. Semantics
The syntax of FOL determines which collection of symbols is a logical expression in first-
order logic. The basic syntactic elements of first-order logic are symbols. We write
statements in short-hand notation in FOL.
Variables: x, y, z, a, b, ....
Connectives: ∧, ∨, ¬, ⇒, ⇔
Equality: =
Quantifiers: ∀, ∃
Atomic sentences:
o Atomic sentences are the most basic sentences of first-order logic. These sentences
are formed from a predicate symbol followed by a parenthesis with a sequence of
terms.
o We can represent atomic sentences as Predicate (term1, term2, ......, term n).
Complex Sentences:
o Complex sentences are made by combining atomic sentences using connectives.
First-order logic statements can be divided into two parts: the subject and the predicate.
Consider the statement "x is an integer." It consists of two parts: the first part, x, is the
subject of the statement, and the second part, "is an integer," is known as the predicate.
Universal Quantifier:
Universal quantifier is a symbol of logical representation, which specifies that the statement
within its range is true for everything or every instance of a particular thing.
If x is a variable, then ∀x is read as:
o For all x
o For each x
o For every x
Example: "All men drink coffee."
Let x be a variable that refers to a man; then the statement can be represented as
∀x man(x) → drink(x, coffee).
It will be read as: for all x, where x is a man, x drinks coffee.
Existential Quantifier:
Existential quantifiers are the type of quantifiers, which express that the statement within its
scope is true for at least one instance of something.
It is denoted by the logical operator ∃, which resembles an inverted E. When it is used with a
predicate variable, it is called an existential quantifier.
If x is a variable, then the existential quantifier will be ∃x or ∃(x), and it will be read as:
there exists an x, for some x, or for at least one x.
Example: "Some boys are intelligent." This can be represented as ∃x boys(x) ∧ intelligent(x).
It will be read as: there are some x where x is a boy who is intelligent.
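Over a finite universe of discourse, the two quantifiers behave like Python's all() and any(); a minimal sketch using the "men drink coffee" example from above (the sample individuals are hypothetical):

```python
# A small, finite universe of discourse.
universe = [
    {"name": "Ajay",  "is_man": True,  "drinks_coffee": True},
    {"name": "Ravi",  "is_man": True,  "drinks_coffee": True},
    {"name": "Priya", "is_man": False, "drinks_coffee": False},
]

# Universal quantifier:  for all x, man(x) -> drink(x, coffee)
all_men_drink_coffee = all((not x["is_man"]) or x["drinks_coffee"] for x in universe)

# Existential quantifier:  there exists x, man(x) and drink(x, coffee)
some_man_drinks_coffee = any(x["is_man"] and x["drinks_coffee"] for x in universe)

print(all_men_drink_coffee)    # True
print(some_man_drinks_coffee)  # True
```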
KNOWLEDGE ENGINEERING IN FIRST-ORDER LOGIC
In this topic, we will understand the knowledge-engineering process in an electronic circuit
domain, which is already familiar. This approach is mainly suitable for creating special-
purpose knowledge bases.
Following are the main steps of the knowledge-engineering process. Using these steps, we
will develop a knowledge base which will allow us to reason about a digital circuit (a one-bit
full adder), which is shown below.
1. Identify the task:
The first step of the process is to identify the task, and for the digital circuit, there are various
reasoning tasks.
At the first level or highest level, we will examine the functionality of the circuit:
At the second level, we will examine the circuit structure details such as:
2. Assemble the relevant knowledge:
In the second step, we will assemble the relevant knowledge which is required for digital
circuits. So for digital circuits, we have the following required knowledge:
3. Decide on vocabulary:
The next step of the process is to select functions, predicate, and constants to represent the
circuits, terminals, signals, and gates. Firstly we will distinguish the gates from each other
and from other objects. Each gate is represented as an object which is named by a constant,
such as, Gate(X1). The functionality of each gate is determined by its type, which is taken as
constants such as AND, OR, XOR, or NOT. Circuits will be identified by a
predicate: Circuit (C1).
For gate input, we will use the function In(1, X1) for denoting the first input terminal of the
gate, and for output terminal we will use Out (1, X1).
The function Arity(c, i, j) is used to denote that circuit c has i inputs and j outputs.
The connectivity between gates can be represented by predicate Connect(Out(1, X1), In(1,
X1)).
We use a unary predicate On (t), which is true if the signal at a terminal is on.
4. Encode general knowledge about the domain:
To encode the general knowledge about the logic circuit, we need the following rules:
o If two terminals are connected, then they have the same signal; it can be represented as:
∀ t1, t2 Terminal(t1) ∧ Terminal(t2) ∧ Connect(t1, t2) → Signal(t1) = Signal(t2).
o The signal at every terminal will have either the value 0 or 1; it can be represented as:
∀ t Terminal(t) → Signal(t) = 1 ∨ Signal(t) = 0.
o The output of an XOR gate is 1 if and only if its inputs are different:
∀ g Gate(g) ∧ Type(g) = XOR → Signal(Out(1, g)) = 1 ⇔ Signal(In(1, g)) ≠ Signal(In(2, g)).
o The output of a NOT gate is the inverse of its input:
∀ g Gate(g) ∧ Type(g) = NOT → Signal(Out(1, g)) ≠ Signal(In(1, g)).
5. Encode a description of the specific problem instance:
Now we encode the problem of circuit C1. First, we categorize the circuit and its gate
components. This step is easy if the ontology of the problem has already been thought through.
It involves writing simple atomic sentences about instances of the concepts defined in the
ontology.
For the given circuit C1, we can encode the problem instance in atomic sentences as below:
Since the circuit has two XOR gates, two AND gates, and one OR gate, the atomic sentences for
these gates will be:
6. Pose queries to the inference procedure and get answers:
In this step, we will find all the possible sets of values of all the terminals for the adder circuit.
The first query will be: what combination of inputs would make the first output of circuit C1
(the sum) equal to 0 and the second output (the carry) equal to 1?
∃ i1, i2, i3 Signal(In(1, C1)) = i1 ∧ Signal(In(2, C1)) = i2 ∧ Signal(In(3, C1)) = i3
∧ Signal(Out(1, C1)) = 0 ∧ Signal(Out(2, C1)) = 1
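A minimal sketch that answers the same query by brute force: enumerate every input combination of a one-bit full adder and keep those where the first output (the sum) is 0 and the second output (the carry) is 1. The gate-level wiring below follows the standard full-adder structure with two XOR, two AND and one OR gate:

```python
from itertools import product

def full_adder(i1, i2, i3):
    """One-bit full adder built from 2 XOR, 2 AND and 1 OR gate."""
    x1 = i1 ^ i2        # first XOR gate
    s = x1 ^ i3         # second XOR gate -> sum, Out(1, C1)
    a1 = x1 & i3        # first AND gate
    a2 = i1 & i2        # second AND gate
    c = a1 | a2         # OR gate -> carry, Out(2, C1)
    return s, c

# Query: which input combinations give Out(1, C1) = 0 and Out(2, C1) = 1?
for i1, i2, i3 in product([0, 1], repeat=3):
    if full_adder(i1, i2, i3) == (0, 1):
        print(i1, i2, i3)
# Prints 0 1 1, 1 0 1 and 1 1 0 -- exactly the combinations with two inputs set to 1.
```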
7. Debug the knowledge base:
Now we will debug the knowledge base, and this is the last step of the complete process. In
this step, we will try to debug any issues in the knowledge base.
INFERENCE IN FIRST-ORDER LOGIC
As in propositional logic, we also have inference rules in first-order logic. Following are some
basic inference rules in FOL:
o Universal Generalization
o Universal Instantiation
o Existential Instantiation
o Existential introduction
1. Universal Generalization:
o Universal generalization is a valid inference rule which states that if premise P(c) is
true for any arbitrary element c in the universe of discourse, then we can have a
conclusion as ∀ x P(x).
Example: Let's represent, P(c): "A byte contains 8 bits", so for ∀ x P(x) "All bytes contain
8 bits.", it will also be true.
2. Universal Instantiation:
o The UI rule states that we can infer any sentence P(c), obtained by substituting a ground
term c (a constant within the domain of x) for the variable in ∀x P(x), for any object in the
universe of discourse.
Example 1: From ∀x P(x), "Every person likes ice-cream", we can infer P(John), "John likes
ice-cream."
Example 2: "All kings who are greedy are evil." Let our knowledge base contain this detail
in the form of FOL:
∀x King(x) ∧ Greedy(x) → Evil(x)
From this information, we can infer any instantiation using Universal Instantiation, e.g.,
King(John) ∧ Greedy(John) → Evil(John).
3. Existential Instantiation:
o This rule states that from ∃x P(x) we can infer P(c) for a new constant symbol c, as long
as c does not appear elsewhere in the knowledge base.
Example:
From the given sentence: ∃x Crown(x) ∧ OnHead(x, John),
So we can infer: Crown(K) ∧ OnHead( K, John), as long as K does not appear in the
knowledge base.
4. Existential Introduction:
o This rule states that if there is some element c in the universe of discourse which has the
property P, then we can infer that there exists something in the universe which has the
property P, i.e., ∃x P(x).
UNIFICATION
Unification is the process of making two different logical atomic expressions identical by
finding a suitable substitution. For example, let Ψ1 = King(x) and Ψ2 = King(John).
The substitution θ = {John/x} is a unifier for these atoms, and applying this substitution makes
both expressions identical.
o The UNIFY algorithm is used for unification, which takes two atomic sentences and
returns a unifier for those sentences (If any exist).
o Unification is a key component of all first-order inference algorithms.
o It returns fail if the expressions do not match with each other.
o The simplest and most general substitution that unifies the expressions is called the Most
General Unifier (MGU).
E.g. Let's say there are two different expressions, P(x, y), and P(a, f(z)).
In this example, we need to make both above statements identical to each other. For this, we
will perform the substitution.
o Substitute x with a, and y with f(z) in the first expression, and it will be represented
as a/x and f(z)/y.
o With both the substitutions, the first expression will be identical to the second
expression and the substitution set will be: [a/x, f(z)/y].
Following are some basic conditions for unification:
o The predicate symbols must be the same; atoms or expressions with different predicate
symbols can never be unified.
o The number of arguments in both expressions must be identical.
o Unification will fail if two similar variables occur in the same expression.
Unification Algorithm:
Step 3: If Ψ1 and Ψ2 have a different number of arguments, then return FAILURE.
Step 4: Set the substitution set (SUBST) to NIL.
Step 5: For i = 1 to the number of elements in Ψ1:
a) Call the Unify function with the ith element of Ψ1 and the ith element of Ψ2, and put the
result into S.
b) If S = FAILURE, then return FAILURE.
c) If S ≠ NIL, then:
a. Apply S to the remainder of both Ψ1 and Ψ2.
b. SUBST = APPEND(S, SUBST).
Step 6: Return SUBST.
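A compact Python sketch of the same idea: compound expressions are tuples whose first element is the predicate or function symbol, and variables are strings starting with "?". It is a simplification of the algorithm above (in particular, it omits the occurs-check):

```python
def is_variable(t):
    return isinstance(t, str) and t.startswith("?")

def substitute(t, subst):
    """Apply a substitution to a term, following chains of bindings."""
    if is_variable(t):
        return substitute(subst[t], subst) if t in subst else t
    if isinstance(t, tuple):
        return tuple(substitute(arg, subst) for arg in t)
    return t

def unify(t1, t2, subst=None):
    """Return a substitution (dict) making t1 and t2 identical, or None on failure."""
    if subst is None:
        subst = {}
    t1, t2 = substitute(t1, subst), substitute(t2, subst)
    if t1 == t2:
        return subst
    if is_variable(t1):
        return {**subst, t1: t2}
    if is_variable(t2):
        return {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):          # predicate symbol, then each argument
            subst = unify(a, b, subst)
            if subst is None:
                return None               # FAILURE
        return subst
    return None                           # different symbols or arity -> FAILURE

# P(x, y) and P(a, f(z)) unify with {x: a, y: f(z)}.
print(unify(("P", "?x", "?y"), ("P", "a", ("f", "?z"))))
# {'?x': 'a', '?y': ('f', '?z')}
```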
For each pair of the following atomic sentences, find the most general unifier (if it exists).
S2 => { p(b, f(Y), f(g(b))); p(b, f(Y), f(Y))}
SUBST θ= {g(b) /Y}
5. Find the MGU of {Q(a, g(x, a), f(y)), Q(a, g(f(b), a), x)}
SUBST θ= {b/y}
S1 => {Q(a, g(f(b), a), f(b)); Q(a, g(f(b), a), f(b))}, Successfully Unified.
Forward Chaining
Forward chaining is also known as forward deduction or the forward reasoning method when
using an inference engine. Forward chaining is a form of reasoning which starts with the atomic
sentences in the knowledge base and applies inference rules (Modus Ponens) in the forward
direction to extract more data until a goal is reached.
The forward-chaining algorithm starts from the known facts, triggers all rules whose premises
are satisfied, and adds their conclusions to the known facts. This process repeats until the
problem is solved.
Properties of Forward-Chaining:
Consider the following famous example which we will use in both approaches:
Example:
"As per the law, it is a crime for an American to sell weapons to hostile nations.
Country A, an enemy of America, has some missiles, and all the missiles were sold to it
by Robert, who is an American citizen."
To solve the above problem, first, we will convert all the above facts into first-order definite
clauses, and then we will use a forward-chaining algorithm to reach the goal.
o It is a crime for an American to sell weapons to hostile nations. (Let's say p, q, and r
are variables)
American (p) ∧ weapon(q) ∧ sells (p, q, r) ∧ hostile(r) → Criminal(p) ...(1)
o Country A has some missiles: ∃p Owns(A, p) ∧ Missile(p). It can be written in two
definite clauses by using Existential Instantiation, introducing a new constant T1.
Owns(A, T1) ......(2)
Missile(T1) .......(3)
o All of the missiles were sold to country A by Robert.
∀p Missile(p) ∧ Owns(A, p) → Sells(Robert, p, A) ......(4)
o Missiles are weapons.
Missile(p) → Weapon(p) .......(5)
o Enemy of America is known as hostile.
Enemy(p, America) →Hostile(p) ........(6)
o Country A is an enemy of America.
Enemy (A, America) .........(7)
o Robert is American
American(Robert). ..........(8)
Step-1:
In the first step we will start with the known facts and will choose the sentences which do not
have implications, such as: American(Robert), Enemy(A, America), Owns(A, T1), and
Missile(T1). All these facts will be represented as below.
Step-2:
At the second step, we will see which new facts can be inferred from the available facts
with satisfied premises.
Rule (1) does not have its premises satisfied, so it will not fire in the first iteration.
Rule (4) is satisfied with the substitution {p/T1}, so Sells(Robert, T1, A) is added, which is
inferred from the conjunction of facts (2) and (3).
Rule (6) is satisfied with the substitution {p/A}, so Hostile(A) is added, which is inferred from
fact (7).
Step-3:
At step 3, as we can check, Rule (1) is satisfied with the substitution {p/Robert, q/T1, r/A},
so we can add Criminal(Robert), which is inferred from all the available facts. Hence we have
reached our goal statement.
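A rough Python sketch of the same forward-chaining run, with the definite clauses encoded as small functions over a set of ground facts; the encoding of facts as tuples is an assumption made for the sketch, not part of the original notes:

```python
# Known ground facts (step 1).
facts = {("American", "Robert"), ("Enemy", "A", "America"),
         ("Owns", "A", "T1"), ("Missile", "T1")}

def rule_weapon(f):    # Missile(p) -> Weapon(p)                        ... rule (5)
    return {("Weapon", x[1]) for x in f if x[0] == "Missile"}

def rule_sells(f):     # Missile(p) ^ Owns(A, p) -> Sells(Robert, p, A) ... rule (4)
    return {("Sells", "Robert", x[1], "A")
            for x in f if x[0] == "Missile" and ("Owns", "A", x[1]) in f}

def rule_hostile(f):   # Enemy(p, America) -> Hostile(p)                ... rule (6)
    return {("Hostile", x[1]) for x in f if x[0] == "Enemy" and x[2] == "America"}

def rule_criminal(f):  # American(p) ^ Weapon(q) ^ Sells(p, q, r) ^ Hostile(r) -> Criminal(p)
    return {("Criminal", s[1])
            for s in f if s[0] == "Sells"
            and ("American", s[1]) in f and ("Weapon", s[2]) in f
            and ("Hostile", s[3]) in f}

rules = [rule_weapon, rule_sells, rule_hostile, rule_criminal]

# Forward chaining: keep firing rules until no new facts are produced.
new = True
while new:
    derived = set().union(*(r(facts) for r in rules))
    new = not derived <= facts
    facts |= derived

print(("Criminal", "Robert") in facts)   # True -- the goal is reached
```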
BACKWARD CHAINING
o It is called a goal-driven approach, as a list of goals decides which rules are selected
and used.
o The backward-chaining algorithm is used in game theory, automated theorem-proving
tools, inference engines, proof assistants, and various AI applications.
o The backward-chaining method mostly uses a depth-first search strategy for proof.
Example:
In backward-chaining, we will use the same above example, and will rewrite all the rules.
Backward-Chaining proof:
In Backward chaining, we will start with our goal predicate, which is Criminal(Robert), and
then infer further rules.
Step-1:
At the first step, we will take the goal fact. And from the goal fact, we will infer other facts,
and at last, we will prove those facts true. So our goal fact is "Robert is Criminal," so
following is the predicate of it.
Step-2:
At the second step, we will infer other facts from the goal fact which satisfy the rules. As we
can see in Rule (1), the goal predicate Criminal(Robert) is present with the substitution
{Robert/p}. So we will add all the conjunctive facts below the first level and will replace p
with Robert.
Step-3: At step 3, we will extract the further fact Missile(q), which is inferred from Weapon(q),
as it satisfies Rule (5). Weapon(q) is also true with the substitution of the constant T1 for q.
Step-4:
At step 4, we can infer the facts Missile(T1) and Owns(A, T1) from Sells(Robert, T1, r), which
satisfies Rule (4) with the substitution of A in place of r. So these two statements are
proved here.
Step-5:
At step 5, we can infer the fact Enemy(A, America) from Hostile(A), which satisfies Rule (6).
Hence all the statements are proved true using backward chaining.
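A rough sketch of the goal-driven idea in Python: prove Criminal(Robert) by recursively proving the premises of whichever rule concludes it. For brevity the rules are written with the substitutions already applied (q = T1, r = A), which is an assumption made only for this sketch:

```python
# Ground facts and propositionalised rules (conclusion -> list of premises).
facts = {"American(Robert)", "Missile(T1)", "Owns(A,T1)", "Enemy(A,America)"}
rules = {
    "Criminal(Robert)":   ["American(Robert)", "Weapon(T1)",
                           "Sells(Robert,T1,A)", "Hostile(A)"],
    "Weapon(T1)":         ["Missile(T1)"],
    "Sells(Robert,T1,A)": ["Missile(T1)", "Owns(A,T1)"],
    "Hostile(A)":         ["Enemy(A,America)"],
}

def backward_chain(goal, depth=0):
    """Depth-first, goal-driven proof: a goal holds if it is a known fact, or if
    some rule concludes it and every premise of that rule can itself be proved."""
    print("  " * depth + "proving " + goal)
    if goal in facts:
        return True
    premises = rules.get(goal)
    return premises is not None and all(backward_chain(p, depth + 1) for p in premises)

print(backward_chain("Criminal(Robert)"))   # True
```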
REASONING
Reasoning is the mental process of deriving logical conclusions and making predictions
from available knowledge, facts, and beliefs. Or we can say, "Reasoning is a way to infer
facts from existing data." It is a general process of thinking rationally to find valid
conclusions.
Types of Reasoning
o Deductive reasoning
o Inductive reasoning
o Abductive reasoning
o Common Sense Reasoning
o Monotonic Reasoning
o Non-monotonic Reasoning
1. Deductive reasoning:
Deductive reasoning is deducing new information from logically related known information.
It is the form of valid reasoning, which means the argument's conclusion must be true when
the premises are true.
Deductive reasoning is a type of propositional logic in AI, and it requires various rules and
facts. It is sometimes referred to as top-down reasoning, in contrast to inductive
reasoning.
In deductive reasoning, the truth of the premises guarantees the truth of the conclusion.
Deductive reasoning mostly starts from the general premises to the specific conclusion,
which can be explained as below example.
Example:
Premise-1: All humans eat vegetables.
Premise-2: Suresh is a human.
Conclusion: Suresh eats vegetables.
The general process of deductive reasoning moves from theory to hypothesis to patterns to
confirmation.
2. Inductive Reasoning:
Inductive reasoning is a form of reasoning to arrive at a conclusion using limited sets of facts
by the process of generalization. It starts with the series of specific facts or data and reaches
to a general statement or conclusion.
In inductive reasoning, we use historical data or various premises to generate a generic rule,
for which premises support the conclusion.
In inductive reasoning, premises provide probable supports to the conclusion, so the truth of
premises does not guarantee the truth of the conclusion.
Example:
Premise: All of the pigeons we have seen in the zoo are white.
Conclusion: Therefore, we can expect all pigeons to be white.
3. Abductive reasoning:
Abductive reasoning is a form of logical reasoning which starts with single or multiple
observations then seeks to find the most likely explanation or conclusion for the observation.
Example:
Implication: The cricket ground is wet if it is raining.
Axiom: The cricket ground is wet.
Conclusion: It is raining.
4. Common Sense Reasoning:
Common sense reasoning is an informal form of reasoning which can be gained through
experience.
Common sense reasoning simulates the human ability to make presumptions about events
which occur every day.
It relies on good judgment rather than exact logic and operates on heuristic
knowledge and heuristic rules.
Example:
1. One person can be at only one place at a time.
2. If I put my hand in a fire, then it will burn.
The above two statements are examples of common sense reasoning, which a human mind
can easily understand and assume.
5. Monotonic Reasoning:
In monotonic reasoning, once a conclusion is drawn, it will remain the same even if we
add other information to the existing information in our knowledge base. In monotonic
reasoning, adding knowledge does not decrease the set of propositions that can be derived.
To solve monotonic problems, we can derive valid conclusions from the available facts
only, and they will not be affected by new facts.
Monotonic reasoning is not useful for the real-time systems, as in real time, facts get
changed, so we cannot use monotonic reasoning.
Example: "The Earth revolves around the Sun."
This is a true fact, and it cannot be changed even if we add another sentence to the knowledge
base, such as "The moon revolves around the earth" or "The Earth is not round."
o If we deduce some facts from the available facts, they will always remain valid.
6. Non-monotonic Reasoning
A logic is said to be non-monotonic if some conclusions can be invalidated by adding more
knowledge to our knowledge base.
"Human perception of various things in daily life" is a general example of non-monotonic
reasoning.
Example: Let's suppose the knowledge base contains the following knowledge:
o Birds can fly.
o Penguins cannot fly.
o Pitty is a bird.
From the above sentences, we can conclude that Pitty can fly.
However, if we add another sentence to the knowledge base, "Pitty is a penguin", it
concludes "Pitty cannot fly", which invalidates the above conclusion.
o In non-monotonic reasoning, the old facts may be invalidated by adding new
sentences.
o It cannot be used for theorem proving.
DIFFERENCE BETWEEN FORWARD CHAINING AND BACKWARD CHAINING
1. Forward chaining starts from known facts and applies inference rules to extract more data
until it reaches the goal; backward chaining starts from the goal and works backward through
inference rules to find the required facts that support the goal.
5. Forward chaining tests all the available rules; backward chaining tests only the few
required rules.
9. Forward chaining is aimed at any conclusion; backward chaining is aimed only at the
required data.
DIFFERENCE BETWEEN INDUCTIVE AND DEDUCTIVE
REASONING
Starts from: Deductive reasoning starts from the premises, whereas inductive reasoning starts
from the conclusion.
RESOLUTION
Resolution is a theorem-proving technique that proceeds by building refutation proofs, i.e.,
proofs by contradiction. It was invented by the mathematician John Alan Robinson in 1965.
Resolution is used when various statements are given and we need to prove a conclusion from
those statements. Unification is a key concept in proofs by resolution.
Resolution is a single inference rule which can efficiently operate on the conjunctive normal
form or clausal form.
Clause: A disjunction of literals (atomic sentences) is called a clause. A clause containing a
single literal is also known as a unit clause.
The resolution rule for first-order logic is simply a lifted version of the propositional rule.
Resolution can resolve two clauses if they contain complementary literals, which are assumed
to be standardized apart so that they share no variables.
This rule is also called the binary resolution rule because it only resolves exactly two
literals.
Example:
Here the two complementary literals are Loves(f(x), x) and ¬Loves(a, b).
These literals can be unified with the unifier θ = {f(x)/a, x/b}, and resolving them will generate
the resolvent clause.
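A minimal propositional sketch of the binary resolution rule: each clause is a set of literals, and resolving two clauses on a pair of complementary literals yields their union with that pair removed. The string encoding of literals (negation written as "~") is an assumption for this sketch:

```python
def negate(literal):
    return literal[1:] if literal.startswith("~") else "~" + literal

def resolve(c1, c2):
    """Return every resolvent of two clauses (each clause is a frozenset of literals)."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

# Resolving (P v Q) with (~P v R) gives (Q v R); an empty frozenset would mean
# a contradiction has been derived, which is how a refutation proof ends.
print(resolve(frozenset({"P", "Q"}), frozenset({"~P", "R"})))
# [frozenset({'Q', 'R'})]  (element order may vary)
```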
Steps for Resolution:
1. Conversion of facts into first-order logic.
2. Convert FOL statements into CNF.
3. Negate the statement which needs to be proved (proof by contradiction).
4. Draw the resolution graph (using unification).
To better understand all the above steps, we will take an example in which we will apply
resolution.
Example:
a. John likes all kinds of food.
b. Apples and vegetables are food.
c. Anything anyone eats and is not killed is food.
d. Anil eats peanuts and is still alive.
e. Harry eats everything that Anil eats.
Prove by resolution that:
f. John likes peanuts.
Step 1: In the first step, we will convert all the given statements into first-order logic.
Step 2: Convert the FOL statements into CNF. In first-order logic resolution, it is required to
convert the FOL into CNF because the CNF form makes resolution proofs easier.
o Eliminate all implications (→) and rewrite:
a. ∀x ¬ food(x) V likes(John, x)
b. food(Apple) Λ food(vegetables)
c. ∀x ∀y ¬ [eats(x, y) Λ ¬ killed(x)] V food(y)
d. eats (Anil, Peanuts) Λ alive(Anil)
e. ∀x ¬ eats(Anil, x) V eats(Harry, x)
f. ∀x¬ [¬ killed(x) ] V alive(x)
g. ∀x ¬ alive(x) V ¬ killed(x)
h. likes(John, Peanuts).
o Drop universal quantifiers.
In this step we drop all universal quantifiers, since all the statements are implicitly
universally quantified, so we do not need to write them.
a. ¬ food(x) V likes(John, x)
b. food(Apple)
c. food(vegetables)
d. ¬ eats(y, z) V killed(y) V food(z)
e. eats (Anil, Peanuts)
f. alive(Anil)
g. ¬ eats(Anil, w) V eats(Harry, w)
h. killed(g) V alive(g)
i. ¬ alive(k) V ¬ killed(k)
j. likes(John, Peanuts).
o Distribute conjunction ∧ over disjunction ∨.
This step will not make any change in this problem.
o Negate the statement to be proved. In this step, we apply negation to the conclusion
statement, which will be written as ¬likes(John, Peanuts).
Now in this step, we will solve the problem by resolution tree using substitution. For the
above problem, it will be given as follows:
Hence the negation of the conclusion has been proved as a complete contradiction with the
given set of statements.
o In the first step of the resolution graph, ¬likes(John, Peanuts) and likes(John, x) get
resolved (cancelled) by the substitution {Peanuts/x}, and we are left with ¬food(Peanuts).
o In the second step of the resolution graph, ¬food(Peanuts) and food(z) get resolved
(cancelled) by the substitution {Peanuts/z}, and we are left with ¬eats(y, Peanuts) V
killed(y).
o In the third step of the resolution graph, ¬eats(y, Peanuts) and eats(Anil, Peanuts) get
resolved by the substitution {Anil/y}, and we are left with killed(Anil).
o In the fourth step of the resolution graph, killed(Anil) and ¬killed(k) get resolved by the
substitution {Anil/k}, and we are left with ¬alive(Anil).
o In the last step of the resolution graph, ¬alive(Anil) and alive(Anil) get resolved,
producing the empty clause and completing the proof by contradiction.
ONTOLOGICAL ENGINEERING
Ontologies are a powerful tool for organizing and understanding information in a structured
way. They provide a clear framework for defining the relationships between different
concepts, making it easier to share and analyze data across various fields.
This section will explore what ontologies are, how they are used, and why they are
important for improving data management and communication in areas like artificial
intelligence, the semantic web, and knowledge management.
Ontologies
Ontologies are formal definitions of vocabularies that allow us to define difficult or
complex structures and new relationships between vocabulary terms and members of
classes that we define. Ontologies generally describe specific domains such as scientific
research areas.
Example:
Ontology depicting Movie:-
Components Of Ontology:
1. Individuals – Individuals are also known as instances of objects or concepts. It may
or may not be present in an ontology. It represents the atomic level of an ontology. For
example, in the above ontology of movie, individuals can be a film (Titanic), a director
(James Cameron), an actor (Leonardo DiCaprio).
2. Classes – Sets of collections of various objects are termed as classes. For example, in
the above ontology representing movie, movie genre (e.g. Thriller, Drama), types of
person (Actor or Director) are classes.
3. Attributes – Properties that objects may possess. For example, a movie is described
by the set of 'parts' it contains, like Script, Director, Actors.
4. Relations – Ways in which concepts are related to one another. For example, as
shown above in the diagram a movie has to have a script and actors in it.
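A minimal Python sketch of the ontology components listed above (individuals, classes, relations, attributes), using the movie names mentioned in the text; the data-structure layout itself is an assumption:

```python
# Classes and the class hierarchy (subclass -> superclass).
classes = {"Movie", "Person", "Actor", "Director", "Genre"}
subclass_of = {"Actor": "Person", "Director": "Person"}

# Individuals: instances of the classes.
individuals = {
    "Titanic": "Movie",
    "James Cameron": "Director",
    "Leonardo DiCaprio": "Actor",
}

# Relations between individuals, and attributes of an individual.
relations = [
    ("Titanic", "directed_by", "James Cameron"),
    ("Titanic", "has_actor", "Leonardo DiCaprio"),
]
attributes = {"Titanic": {"genre": "Drama", "year": 1997}}

def classes_of(individual):
    """Walk the hierarchy to list every class an individual belongs to."""
    c, result = individuals.get(individual), []
    while c:
        result.append(c)
        c = subclass_of.get(c)
    return result

print(classes_of("Leonardo DiCaprio"))   # ['Actor', 'Person']
```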
Different Ontology Languages:
CycL – It was developed for the Cyc project and is based on First Order Predicate
Calculus.
Rule Interchange Format (RIF) – It is the language used for combining ontologies
and rules.
Open Biomedical Ontologies (OBO) – It is used for various biological and biomedical
ontologies.
Web Ontology Language (OWL) – It is developed for using ontologies over
the World Wide Web (WWW).
Categories can be represented in first-order logic either as predicates (e.g., apple(x)) or
reified as objects. Category organization relies mainly on the inheritance relation: subclasses
inherit the properties of their parent classes, forming a taxonomy.
The agents we have constructed so far have beliefs and can deduce new beliefs. Yet none of
them has any knowledge about beliefs or about deduction. Knowledge about one's own
knowledge and reasoning processes is useful for controlling inference.
Propositional attitudes – the attitudes an agent has towards mental objects (e.g. Believes,
Knows, Wants, Intends, and Informs). These propositional attitudes do not behave like
normal predicates.
For example, to assert that Lois knows that Superman can fly:
Knows(Lois, CanFly(Superman))
One issue is that, if "Superman is Clark Kent" is true, then the inference rules conclude
that "Lois knows that Clark can fly" (this is an example of referential transparency). But in
reality Lois does not actually know that Clark can fly:
(Superman = Clark) ∧ Knows(Lois, CanFly(Superman)) ⊨ Knows(Lois, CanFly(Clark))
referential transparency
§ an expression always evaluates to the same result in any context
§ e.g. if agent knows that 2+2=4 and 4<5, then agent should know that 2+2<5
referential opacity
§ not referentially transparent
§ we want referential opacity for propositional attitudes because terms do matter and not all
agents know which terms are co-referential
§ not directly possible in most formal logic languages (except Modal Logic)
Modal Logic
Modal logic is designed to allow referential opacity into the knowledge base.
Regular logic is concerned with a single modality (the modality of truth), allowing us to
express "P is true".
Modal logic includes modal operators that take sentences (rather than terms) as arguments
(e.g. "A knows P" is represented with the notation K_A P, where K is the modal operator for
knowledge, A is the agent, and P is a sentence).
The syntax of modal logic is the same as first-order logic, with the addition that sentences can
also be formed with modal operators.
The semantics of modal logic is more complicated. In first-order logic a model contains a set of
objects and an interpretation that maps each name to the appropriate object, relation, or
function. In modal logic we want to be able to consider both the possibility that Superman's
secret identity is Clark and that it isn't. Therefore, in modal logic a model consists of a
collection of possible worlds (instead of one true world). The worlds are connected in a graph
by accessibility relations (one relation for each modal operator). We say that world w1 is
accessible from world w0 with respect to the modal operator K_A if everything in w1 is
consistent with what A knows in w0, and we write this as Acc(K_A, w0, w1).
The figure shows accessibility as an arrow between possible worlds.
A knowledge atom K_A P is true in world w if and only if P is true in every world accessible
from w. The truth of more complex sentences is derived by recursive application of this rule
and the normal rules of first-order logic. That means that modal logic can be used to reason
about nested knowledge sentences: what one agent knows about another agent's knowledge.
For example, we can say that, even though Lois doesn't know whether Superman's secret
identity is Clark Kent, she does know that Clark knows:
K_Lois [K_Clark I ∨ K_Clark ¬I], where I stands for the identity sentence "Superman is Clark".
In the TOP-LEFT of the figure, Lois does not know whether rain is predicted in the weather
report or if Clark is Superman. But she does know that Superman knows whether he is Clark,
because in every world that is accessible to Lois, either Superman knows I, or he knows ¬I.
Lois does not know which is the case, but either way she knows Superman knows.
In the TOP-RIGHT of the figure we represent the scenario where it is common knowledge that
Lois has seen the weather report. So in w4 she knows rain is predicted and in w6 she knows
rain is not predicted. Superman does not know the report, but he knows that Lois knows,
because in every world that is accessible to him, either she knows R or she knows ¬R.
In the BOTTOM of the figure we represent the scenario where it is common knowledge that
Superman knows his identity, and Lois might or might not have seen the weather report. We
represent this by combining the two top scenarios, and adding arrows to show that Superman
does not know which scenario actually holds. Lois does know, so we don't need to add any
arrows for her. In w0 Superman still knows I but not R, and now he does not know whether
Lois knows R. From what Superman knows, he might be in w0 or w2, in which case Lois
does not know whether R is true, or he could be in w4, in which case she knows R, or w6, in
which case she knows ¬R.
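A small sketch of the possible-worlds semantics described above: a knowledge atom K_A P is true in a world w exactly when P holds in every world accessible to agent A from w. The particular worlds and accessibility arrows below are hypothetical, loosely following the identity/weather-report scenario:

```python
# Each possible world assigns truth values to the propositions I (identity) and R (rain).
worlds = {
    "w0": {"I": True, "R": True},
    "w2": {"I": True, "R": False},
}

# Accessibility: the worlds an agent cannot distinguish from the current one.
accessible = {
    ("Superman", "w0"): ["w0", "w2"],   # Superman has not seen the weather report
    ("Lois",     "w0"): ["w0"],         # Lois (in this toy scenario) has seen it
}

def knows(agent, prop, world):
    """K_agent(prop) is true in `world` iff prop holds in every accessible world."""
    return all(worlds[w][prop] for w in accessible[(agent, world)])

print(knows("Superman", "I", "w0"))   # True  -- I holds in both w0 and w2
print(knows("Superman", "R", "w0"))   # False -- R fails in w2
print(knows("Lois", "R", "w0"))       # True
```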