AI PPT Unit 3
Properties of Operators:
• Commutativity:
– P ∧ Q = Q ∧ P
– P ∨ Q = Q ∨ P.
• Associativity:
– (P ∧ Q) ∧ R = P ∧ (Q ∧ R),
– (P ∨ Q) ∨ R = P ∨ (Q ∨ R)
• Identity element:
– P ∧ True = P,
– P ∨ True = True.
• Distributive:
– P ∧ (Q ∨ R) = (P ∧ Q) ∨ (P ∧ R).
– P ∨ (Q ∧ R) = (P ∨ Q) ∧ (P ∨ R).
• De Morgan's Laws:
– ¬(P ∧ Q) = (¬P) ∨ (¬Q)
– ¬(P ∨ Q) = (¬P) ∧ (¬Q).
• Double-negation elimination:
– ¬(¬P) = P.
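These equivalences can be verified mechanically by enumerating every truth assignment, as in the
following illustrative Python sketch (the checks mirror the laws listed above):

```python
# Illustrative sketch: verify the listed propositional equivalences by
# enumerating all truth values of P, Q, R with Python booleans.
from itertools import product

for P, Q, R in product([True, False], repeat=3):
    assert (P and Q) == (Q and P)                         # commutativity
    assert ((P and Q) and R) == (P and (Q and R))         # associativity
    assert (P and (Q or R)) == ((P and Q) or (P and R))   # distributivity
    assert (not (P and Q)) == ((not P) or (not Q))        # De Morgan
    assert (not (not P)) == P                             # double negation
print("All equivalences hold for every truth assignment.")
```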
Limitations of Propositional logic:
• We cannot represent quantified relations like all, some, or none with propositional logic. For example:
– All the girls are intelligent.
– Some apples are sweet.
• Propositional logic has limited expressive power.
First-Order Predicate logic(FOPL)
• It is an extension of propositional logic.
• First-order logic is also known as predicate logic or first-order predicate logic.
• First-order logic does not only assume, as propositional logic does, that the world
contains facts; it also assumes the following things about the world:
• Objects: A, B, people, numbers, colors, wars, theories, squares, pits, wumpus, ...
• Relations: These can be unary relations such as red, round, is adjacent, or n-ary
relations such as the sister of, brother of, has color, comes between.
• Functions: father of, best friend, third inning of, end of, ...
• Like a natural language, first-order logic also has two main parts:
Syntax
Semantics
Syntax of First-Order logic:
• The syntax of FOL determines which collections of symbols are legal logical expressions
in first-order logic. The basic syntactic elements of first-order logic are symbols. We
write statements in short-hand notation in FOL.
Constant: 1, 2, A, John, Mumbai, cat, ...
Variables: x, y, z, a, b, ...
Connectives: ∧, ∨, ¬, ⇒, ⇔
Equality: =
Quantifiers: ∀, ∃
Atomic sentences:
• Atomic sentences are the most basic sentences of first-order logic. These sentences
are formed from a predicate symbol followed by a parenthesis with a sequence of
terms.
• We can represent atomic sentences as Predicate(term1, term2, ..., termn).
• Example: Ravi and Ajay are brothers: => Brothers(Ravi, Ajay).
Chinky is a cat: => cat(Chinky).
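As an illustration, an atomic sentence Predicate(term1, ..., termn) can be encoded in a program as a
simple tuple whose first element is the predicate symbol; the helper names below are hypothetical,
chosen only for this sketch:

```python
# Illustrative encoding of atomic sentences as Python tuples:
# first element is the predicate symbol, the rest are the terms.
brothers_fact = ("Brothers", "Ravi", "Ajay")   # Brothers(Ravi, Ajay)
cat_fact = ("cat", "Chinky")                   # cat(Chinky)

def predicate(atom):
    """Return the predicate symbol of an atomic sentence."""
    return atom[0]

def terms(atom):
    """Return the sequence of terms of an atomic sentence."""
    return atom[1:]

print(predicate(brothers_fact), terms(brothers_fact))  # Brothers ('Ravi', 'Ajay')
```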
Complex Sentences:
• Complex sentences are made by combining atomic sentences using connectives.
First-order logic statements can be divided into two parts:
• Subject: Subject is the main part of the statement.
• Predicate: A predicate can be defined as a relation, which binds two atoms together
in a statement.
Example:
• "x is an integer." consists of two parts: the first part, x, is the subject of the
statement, and the second part, "is an integer," is known as the predicate.
Quantifiers in First-order logic:
• Quantifiers are the symbols that permit us to determine or identify the range and scope of
a variable in a logical expression. There are two types of quantifier:
– Universal Quantifier, (for all, everyone, everything)
– Existential quantifier, (for some, at least one).
Universal Quantifier:
• Universal quantifier is a symbol of logical representation, which specifies that the
statement within its range is true for everything or every instance of a particular
thing.
• The Universal quantifier is represented by a symbol ∀, which resembles an inverted
A.
• In universal quantifier we use implication "→".
• If x is a variable, then ∀x is read as:
• For all x
• For each x
• For every x.
Example:
• All men drink coffee.
• ∀x man(x) → drink(x, coffee).
• It will be read as: For all x, if x is a man, then x drinks coffee.
Existential Quantifier:
• Existential quantifiers are the type of quantifiers, which express that the statement
within its scope is true for at least one instance of something.
• It is denoted by the logical operator ∃, which resembles an inverted E. When it is
used with a predicate variable, it is called an existential quantifier.
• With the existential quantifier we always use the AND (conjunction) symbol (∧).
• If x is a variable, then existential quantifier will be ∃x or ∃(x). And it will be read
as:
• There exists a 'x.'
• For some 'x.'
• For at least one 'x.'
Example:
• Some boys are intelligent.
• ∃x: boys(x) ∧ intelligent(x)
• It will be read as: There exists an x such that x is a boy and x is intelligent.
Notes:
• The main connective for universal quantifier ∀ is implication →.
• The main connective for existential quantifier ∃ is and ∧.
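These two notes can be illustrated over a small, assumed finite domain: the universal statement is
evaluated with an implication inside all(), and the existential statement with a conjunction inside
any(). The domain and predicate sets below are invented for illustration:

```python
# Illustrative sketch with an invented finite domain and predicates.
domain = ["Ravi", "Ajay", "Chinky"]
man = {"Ravi", "Ajay"}
drinks_coffee = {"Ravi", "Ajay"}
boys = {"Ravi", "Ajay"}
intelligent = {"Ajay"}

# ∀x man(x) → drink(x, coffee): implication p → q is equivalent to (not p) or q
all_men_drink_coffee = all((x not in man) or (x in drinks_coffee) for x in domain)

# ∃x boys(x) ∧ intelligent(x): conjunction inside an existential
some_boy_is_intelligent = any((x in boys) and (x in intelligent) for x in domain)

print(all_men_drink_coffee, some_boy_is_intelligent)  # True True
```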
Knowledge Engineering
• The process of constructing a knowledge base in first-order logic is called
knowledge engineering. In knowledge engineering, someone who investigates a
particular domain, learns the important concepts of that domain, and generates a formal
representation of the objects is known as a knowledge engineer.
Inference in First-Order Logic
• Inference in First-Order Logic is used to deduce new facts or sentences from
existing sentences
An Example - Facts in FOL
(1) Marcus was a man.
man(Marcus)
(2) Marcus was a Pompeian.
Pompeian(Marcus)
(3) All Pompeians were Romans.
∀x : Pompeian(x) → Roman(x)
(4) Caesar was a ruler.
ruler(Caesar)
(5) All Romans were either loyal to Caesar or hated him.
∀x : Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar)
(6) Everyone is loyal to someone.
∀x : ∃y : loyalto(x, y)
(7) People only try to assassinate rulers they are not loyal to.
∀x : ∀y : person(x) ∧ ruler(y) ∧ tryassassinate(x, y) → ¬loyalto(x, y)
(8) Marcus tried to assassinate Caesar.
tryassassinate(Marcus, Caesar)
Unification
• Unification is a process of making two different logical atomic expressions identical
by finding a substitution. Unification depends on the substitution process.
• It takes two literals as input and makes them identical using substitution.
• Let Ψ1 and Ψ2 be two atomic sentences and σ be a unifier such that Ψ1σ = Ψ2σ; this is
expressed as UNIFY(Ψ1, Ψ2) = σ.
Example:
Find the MGU for Unify{King(x), King(John)}
• Let Ψ1 = King(x), Ψ2 = King(John),
• Substitution θ = {John/x} is a unifier for these atoms; applying this substitution makes
both expressions identical.
• The UNIFY algorithm is used for unification, which takes two atomic sentences and
returns a unifier for those sentences (If any exist).
• Unification is a key component of all first-order inference algorithms.
• It returns FAIL if the expressions do not match with each other.
• The simplest substitution that makes the expressions identical is called the Most General Unifier (MGU).
Conditions for Unification:
• Predicate symbols must be the same; atoms or expressions with different predicate symbols can never be unified.
• Number of Arguments in both expressions must be identical.
• Unification will fail if there are two similar variables present in the same
expression.
Algorithm: Unify(Ψ1, Ψ2)
Step. 1: If Ψ1 or Ψ2 is a variable or constant, then:
a) If Ψ1 and Ψ2 are identical, then return NIL.
b) Else if Ψ1 is a variable,
a. then if Ψ1 occurs in Ψ2, then return FAILURE
b. Else return {(Ψ2/Ψ1)}.
c) Else if Ψ2 is a variable,
a. If Ψ2 occurs in Ψ1, then return FAILURE,
b. Else return {(Ψ1/Ψ2)}.
d) Else return FAILURE.
Step. 2: If the initial predicate symbols in Ψ1 and Ψ2 are not the same, then return
FAILURE.
Step. 3: If Ψ1 and Ψ2 have a different number of arguments, then return FAILURE.
Step. 4: Set the substitution set (SUBST) to NIL.
Step. 5: For i = 1 to the number of elements in Ψ1:
a) Call the Unify function with the ith element of Ψ1 and the ith element of Ψ2, and put the result into S.
b) If S = FAILURE, then return FAILURE.
c) If S ≠ NIL, then:
a. Apply S to the remainder of both Ψ1 and Ψ2.
b. SUBST = APPEND(S, SUBST).
Step. 6: Return SUBST.
Implementation of the Algorithm
Step.1: Initialize the substitution set to be empty.
Step.2: Recursively unify atomic sentences:
• Check for Identical expression match.
• If one expression is a variable vi, and the other is a term ti which does not contain variable vi,
then:
– Substitute ti/vi in the existing substitutions.
– Add ti/vi to the substitution set (list).
• If both the expressions are functions, then the function names must be the same, and the
number of arguments must be the same in both expressions.
• For each of the following pairs of atomic sentences, find the most general unifier (if it exists).
1. Find the MGU of {p(f(a), g(Y)) and p(X, X)}
S0 ⇒ Ψ1 = p(f(a), g(Y)), Ψ2 = p(X, X); SUBST θ = {f(a)/X}
S1 ⇒ Ψ1 = p(f(a), g(Y)), Ψ2 = p(f(a), f(a)); now g(Y) and f(a) have different function symbols,
so unification fails and no MGU exists.
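A minimal Python sketch of such a unification procedure is shown below. It assumes a simple term
encoding (variables are lowercase strings such as "x"; constants are capitalised strings such as
"John"; predicates and functions are tuples like ("King", "x")), so the exercise's X, Y and a appear
here as x, y and A; the helper names are chosen for this sketch only:

```python
# Minimal unification sketch (assumed conventions: lowercase strings are
# variables, capitalised strings are constants, tuples are predicates/functions).

def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def substitute(t, subst):
    """Apply a substitution {var: term} to a term or atom."""
    if is_variable(t):
        return substitute(subst[t], subst) if t in subst else t
    if isinstance(t, tuple):
        return (t[0],) + tuple(substitute(a, subst) for a in t[1:])
    return t

def occurs(var, t, subst):
    """Occurs check: does var appear inside t (under subst)?"""
    t = substitute(t, subst)
    if t == var:
        return True
    return isinstance(t, tuple) and any(occurs(var, a, subst) for a in t[1:])

def unify(x, y, subst=None):
    """Return an MGU of x and y as a dict, or None on failure."""
    if subst is None:
        subst = {}
    x, y = substitute(x, subst), substitute(y, subst)
    if x == y:                                   # identical: nothing to do
        return subst
    if is_variable(x):
        return None if occurs(x, y, subst) else {**subst, x: y}
    if is_variable(y):
        return unify(y, x, subst)
    if isinstance(x, tuple) and isinstance(y, tuple):
        if x[0] != y[0] or len(x) != len(y):     # predicate/arity mismatch
            return None
        for a, b in zip(x[1:], y[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(("King", "x"), ("King", "John")))                  # {'x': 'John'}
print(unify(("p", ("f", "A"), ("g", "y")), ("p", "x", "x")))   # None (g and f clash)
```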
Once the engine has access to the relevant facts, it will use these facts to draw
conclusions. In order to do this, the engine will use a set of inference rules. These
rules are typically based on logic or probability. The engine will use these rules to
determine what conclusions can be drawn from the evidence.
For example, the evidence might include FOL facts and rules such as:
Owns(A, T1)
Missile(T1) .......(3)
∀p : Missile(p) ∧ Owns(A, p) → Sells(Robert, p, A) ......(4)
∀p : Missile(p) → Weapons(p) .......(5)
Forward Chaining vs. Backward Chaining
5. Forward chaining tests all the available rules, whereas backward chaining tests only the few
required rules.
6. Forward chaining is suitable for planning, monitoring, control, and interpretation applications,
whereas backward chaining is suitable for diagnostic, prescription, and debugging applications.
7. Forward chaining can generate an infinite number of possible conclusions, whereas backward
chaining generates a finite number of possible conclusions.
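As a rough illustration of the data-driven style of forward chaining, the sketch below uses ground
(propositionalised) versions of rules (4) and (5) above, treats each ground atom as an opaque string,
and repeatedly fires any rule whose premises are already known facts:

```python
# Illustrative forward-chaining sketch over ground rules (assumed simplification).
rules = [
    ({"Missile(T1)", "Owns(A,T1)"}, "Sells(Robert,T1,A)"),   # ground rule (4)
    ({"Missile(T1)"}, "Weapons(T1)"),                        # ground rule (5)
]
facts = {"Owns(A,T1)", "Missile(T1)"}

# Data-driven loop: fire every rule whose premises are all known,
# until no new fact can be derived.
changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))
# ['Missile(T1)', 'Owns(A,T1)', 'Sells(Robert,T1,A)', 'Weapons(T1)']
```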
2. Inductive Reasoning:
• Inductive reasoning is a form of reasoning that arrives at a conclusion using limited
sets of facts by the process of generalization. It starts with a series of specific facts
or data and reaches a general statement or conclusion.
• Inductive reasoning is also known as cause-effect reasoning or bottom-up reasoning.
• In inductive reasoning, we use historical data or various premises to generate a
generic rule, for which premises support the conclusion.
• In inductive reasoning, premises provide probable supports to the conclusion,
so the truth of premises does not guarantee the truth of the conclusion.
Example:
• Premise: All of the pigeons we have seen in the zoo are white.
• Conclusion: Therefore, we can expect all the pigeons to be white
3. Abductive reasoning:
• Abductive reasoning is a form of logical reasoning which starts with single or
multiple observations then seeks to find the most likely explanation or conclusion
for the observation.
• Abductive reasoning is an extension of deductive reasoning, but in abductive
reasoning, the premises do not guarantee the conclusion.
• Example:
Implication: Cricket ground is wet if it is raining.
Axiom: Cricket ground is wet.
Conclusion: It is raining.
5. Monotonic Reasoning:
• In monotonic reasoning, once a conclusion is drawn, it remains the same
even if we add other information to the existing information in our knowledge
base. In monotonic reasoning, adding knowledge does not decrease the set of
propositions that can be derived.
Example:
• Earth revolves around the Sun.
• It is a true fact, and it cannot be changed even if we add another sentence in
knowledge base like, "The moon revolves around the earth" Or "Earth is not round,"
etc.
Advantages of Monotonic Reasoning:
• In monotonic reasoning, each old proof will always remain valid.
• If we deduce some facts from the available facts, they will always remain valid.
Disadvantages of Monotonic Reasoning:
• We cannot represent the real world scenarios using Monotonic reasoning.
• Since we can only derive conclusions from the old proofs, new knowledge from
the real world cannot be added.
6. Non-monotonic Reasoning
• In Non-monotonic reasoning, some conclusions may be invalidated if we add some
more information to our knowledge base.
Example: Birds can fly
• Penguins cannot fly
• Pitty is a bird
• So from the above sentences, we can conclude that Pitty can fly.
• However, if we add one another sentence into knowledge base "Pitty is a penguin",
which concludes "Pitty cannot fly", so it invalidates the above conclusion.
Advantages of Non-monotonic reasoning:
• In non-monotonic reasoning, we can choose probabilistic facts or make
assumptions. It is also used in robot navigation.
Disadvantages of Non-monotonic Reasoning:
• In non-monotonic reasoning, the old facts may be invalidated by adding new
sentences.
• It cannot be used for theorem proving.
Resolution
• Resolution is a theorem-proving technique that proves statements by contradiction. It was
introduced by the mathematician John Alan Robinson in 1965.
• Resolution is used when several statements are given and we need to prove a
conclusion from those statements. Unification is a key concept in proofs by
resolution. Resolution is a single inference rule which can efficiently operate on
the conjunctive normal form or clausal form.
• Clause: A disjunction of literals (atomic sentences or their negations) is called a clause.
A clause containing a single literal is known as a unit clause.
• Conjunctive Normal Form: A sentence represented as a conjunction of clauses is
said to be conjunctive normal form or CNF.
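For the propositional (ground-clause) case, the resolution rule itself is easy to sketch in code:
two clauses that contain complementary literals are combined into a resolvent. The representation
below (literals as strings with "~" for negation, clauses as frozensets) is an assumption made only
for this sketch:

```python
# Illustrative propositional resolution step on ground clauses.
def resolve(c1, c2):
    """Return all resolvents obtainable from clauses c1 and c2."""
    resolvents = []
    for lit in c1:
        complement = lit[1:] if lit.startswith("~") else "~" + lit
        if complement in c2:
            # Combine the remaining literals of both clauses.
            resolvents.append((c1 - {lit}) | (c2 - {complement}))
    return resolvents

clause1 = frozenset({"~food(Peanuts)", "likes(John,Peanuts)"})
clause2 = frozenset({"~likes(John,Peanuts)"})
print(resolve(clause1, clause2))   # [frozenset({'~food(Peanuts)'})]
```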
Steps for Resolution:
• Conversion of facts into first-order logic.
• Convert FOL statements into CNF
• Negate the statement to be proved (proof by contradiction)
• Draw resolution graph (unification).
Example:
• John likes all kinds of food.
• Apples and vegetables are food.
• Anything anyone eats without being killed is food.
• Anil eats peanuts and is still alive.
• Harry eats everything that Anil eats.
Prove by resolution that:
• John likes peanuts.
• Step-1: Conversion of Facts into FOL
In this step, each of the statements above is written as a first-order logic sentence.
• Step-2: Conversion of FOL into CNF
In this step we also drop all universal quantifiers, since every remaining variable is implicitly
universally quantified. The resulting clauses are:
– ¬food(x) ∨ likes(John, x)
– food(Apple)
– food(vegetables)
– ¬eats(y, z) ∨ killed(y) ∨ food(z)
– eats(Anil, Peanuts)
– alive(Anil)
– ¬eats(Anil, w) ∨ eats(Harry, w)
– killed(g) ∨ alive(g)
– ¬alive(k) ∨ ¬killed(k)
– likes(John, Peanuts).
• Distribute disjunction over conjunction (to obtain CNF).
This step does not change anything in this problem.
• Step-3: Negate the statement to be proved
• In this step, we apply negation to the conclusion statement, which is written as
¬likes(John, Peanuts)
• Step-4: Draw Resolution graph:
• Now, in this step, we solve the problem by building a resolution tree using substitution.
For the above problem, the resolution proceeds as follows:
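One possible resolution sequence for the clauses obtained in Step-2 (several orderings are equally valid) is:
– ¬likes(John, Peanuts) and ¬food(x) ∨ likes(John, x) resolve with {Peanuts/x}, giving ¬food(Peanuts).
– ¬food(Peanuts) and ¬eats(y, z) ∨ killed(y) ∨ food(z) resolve with {Peanuts/z}, giving
¬eats(y, Peanuts) ∨ killed(y).
– This clause and eats(Anil, Peanuts) resolve with {Anil/y}, giving killed(Anil).
– killed(Anil) and ¬alive(k) ∨ ¬killed(k) resolve with {Anil/k}, giving ¬alive(Anil).
– ¬alive(Anil) and alive(Anil) resolve to the empty clause, a contradiction; hence likes(John, Peanuts)
is proved.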
Utility Theory: The main idea of utility theory is really simple: an agent's
preferences over possible outcomes can be captured by a function that maps these
outcomes to a real number; the higher the number the more that agent likes that
outcome. The function is called a utility function.
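For illustration, a utility function can be as simple as a table from outcome names to real numbers
(the outcomes and values below are invented):

```python
# Illustrative utility function: outcomes mapped to real numbers.
utility = {"win_big": 100.0, "win_small": 40.0, "lose": -50.0}

def preferred(outcomes):
    """Return the outcome the agent likes most under this utility function."""
    return max(outcomes, key=utility.get)

print(preferred(["win_small", "lose"]))   # win_small
```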
Probabilistic reasoning:
Probabilistic reasoning is a way of knowledge representation where we apply the concept
of probability to indicate the uncertainty in knowledge. In probabilistic reasoning, we
combine probability theory with logic to handle the uncertainty.
We can find the probability of an uncertain event A by using the formula:
P(A) = (number of outcomes favorable to A) / (total number of outcomes),
where 0 ≤ P(A) ≤ 1 and P(A) + P(¬A) = 1.
Random variables: Random variables are used to represent the events and objects in
the real world.
Conditional probability:
Conditional probability is the probability of an event occurring given that another event has
already happened.
Let's suppose we want to calculate the probability of event A when event B has already occurred,
"the probability of A under the conditions of B"; it can be written as:
P(A|B) = P(A ∧ B) / P(B), where P(B) ≠ 0.
Markov chains. These are the simplest type of Markov model and are used to represent
systems where all states are observable. Markov chains show all possible states, and
between states, they show the transition rate, which is the probability of moving from one
state to another per unit of time. Applications of this type of model include prediction of
market crashes, speech recognition and search engine algorithms.
Hidden Markov models. These are used to represent systems with some unobservable
states. In addition to showing states and transition rates, hidden Markov models also
represent observations and observation likelihoods for each state. Hidden Markov models
are used for a range of applications, including thermodynamics, finance and pattern
recognition.
How are Markov models represented?
The simplest Markov model is a Markov chain, which can be
expressed in equations, as a transition matrix or as a graph. A
transition matrix is used to indicate the probability of moving from
each state to each other state. Generally, the current states are listed
in rows, and the next states are represented as columns. Each cell
then contains the probability of moving from the current state to the
next state. For any given row, all the cell values must then add up to
one.
A graph consists of circles, each of which represents a state, and
directional arrows to indicate possible transitions between states.
The directional arrows are labeled with the transition probability.
The transition probabilities on the directional arrows coming out of
any given circle must add up to one.
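An illustrative sketch of such a transition matrix (the states and probabilities below are invented),
including the row-sum check described above and a few sampled transitions:

```python
# Illustrative two-state Markov chain: rows are current states, columns are
# next states, and each row of transition probabilities must sum to one.
import random

transition = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

# Check the row-sum property mentioned above.
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in transition.values())

def step(state):
    """Sample the next state from the current state's row."""
    next_states = list(transition[state].keys())
    probs = list(transition[state].values())
    return random.choices(next_states, weights=probs, k=1)[0]

state = "Sunny"
for _ in range(5):
    state = step(state)
    print(state)
```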