Atomic Sentences

• These are the most basic sentences of FOL


• These sentences are formed from a predicate symbol followed by a parenthesized sequence of terms
• An atomic sentence is represented as a predicate applied to its subjects
• Laxmi is an elephant
• where Laxmi is the subject and elephant is the predicate
• Elephant(Laxmi)
Complex Sentences
• These are combinations of two or more atomic sentences
• The atomic sentences are joined by logical connectives
• Laxmi, Preeti, and Namratha are friends
• Friends(Laxmi, Preeti) ∧ Friends(Preeti, Namratha) ∧ Friends(Laxmi, Namratha)
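• As an illustration only (not part of the original slides), the sketch below shows one assumed way such sentences could be encoded in Python, with predicates applied to constant terms and complex sentences built with a logical connective; the predicate and constant names are taken from the examples above

# A minimal sketch (assumed encoding) of atomic and complex FOL sentences
# as nested Python tuples.

# Atomic sentence: a predicate symbol applied to a sequence of terms.
elephant_laxmi = ("Elephant", "Laxmi")                 # Elephant(Laxmi)

# Complex sentence: atomic sentences joined by a logical connective.
friends = ("and",
           ("Friends", "Laxmi", "Preeti"),
           ("Friends", "Preeti", "Namratha"),
           ("Friends", "Laxmi", "Namratha"))

def render(sentence):
    """Pretty-print a sentence in the usual FOL notation."""
    head, *rest = sentence
    if head == "and":
        return " ∧ ".join(render(s) for s in rest)
    return f"{head}({', '.join(rest)})"

print(render(elephant_laxmi))   # Elephant(Laxmi)
print(render(friends))          # Friends(Laxmi, Preeti) ∧ Friends(Preeti, Namratha) ∧ Friends(Laxmi, Namratha)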
Free and Bound Variables
• We have discussed two types of quantifiers
• These quantifiers interact with the variables that appear within their scope
• Accordingly, variables fall into two types:
• Free variables
• Bound variables
Free variables
• A variable is free in a formula if it occurs outside the scope of any quantifier binding it
• For example:
• ∀x ∃y [p(x, y, z)]
• Here x and y are bound by the quantifiers ∀ and ∃, while z occurs free
Bound Variable
• A variable is said to be bound in a formula if it occurs within the scope of a quantifier that binds it
• For example:
• ∀x [A(x) ∧ B(x)]
• Here x is bound by the quantifier in both A and B, and the formula has no free variables
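• To make the distinction concrete, here is a small sketch (an assumed tuple encoding, not from the slides) that computes the free variables of a formula; applied to ∀x ∃y [p(x, y, z)] it reports only z as free

# Minimal sketch: find the free variables of a formula encoded as nested
# tuples: ('forall', var, body), ('exists', var, body), ('pred', name, *terms).

def free_vars(formula, bound=frozenset()):
    kind = formula[0]
    if kind in ("forall", "exists"):
        _, var, body = formula
        return free_vars(body, bound | {var})        # var is bound inside the body
    if kind == "pred":
        _, _name, *terms = formula
        return {t for t in terms if t not in bound}  # unbound terms are free
    raise ValueError(f"unknown formula kind: {kind}")

# forall x . exists y . p(x, y, z): x and y are bound, z is free.
f = ("forall", "x", ("exists", "y", ("pred", "p", "x", "y", "z")))
print(free_vars(f))  # {'z'}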
Semantic network representation
• It is an alternative to predicate logic for knowledge representation
• In this approach, knowledge is represented as a graphical network
• The network consists of nodes, which represent objects, and arcs, which represent relationships between the objects
• These networks are easy to understand and can easily be extended or enhanced
Example
• Tom is a cat
• Tom is a mammal
• Tom is owned by xyz
• Tom is brown in color
• All mammals are animals
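• As a sketch only (the node and relation names are assumptions based on the facts above), the Tom example could be stored as a set of labelled arcs and queried by following links

# Minimal sketch of a semantic network as (node, relation, node) arcs.
arcs = [
    ("Tom",    "is_a",      "cat"),
    ("Tom",    "is_a",      "mammal"),
    ("Tom",    "owned_by",  "xyz"),
    ("Tom",    "has_color", "brown"),
    ("mammal", "is_a",      "animal"),
]

def related(node, relation):
    """Follow all arcs with the given label starting from a node."""
    return {target for (source, rel, target) in arcs
            if source == node and rel == relation}

print(related("Tom", "is_a"))     # {'cat', 'mammal'}
print(related("mammal", "is_a"))  # {'animal'}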
Advantages of SNR
• It is a natural representation of knowledge
• It is simple to represent and easily understood by programmers at any level
Drawbacks of Semantic Network
Representation
• This type of representation is inadequate because it lacks the quantifiers available in predicate logic
• The network is not intelligent in itself and is completely dependent on the programmer
Frame Representation
• It is a record-like structure consisting of attributes and their corresponding values
• It consists of a collection of slots and slot values
• The slots may be of any type and size
• Each slot has a name and values, which are otherwise called facts
• A frame may consist of any number of slots
• A slot may hold any number of facts
• And each fact may have any number of values
• Frame representation is therefore also called slot-and-filler knowledge representation
Example
• Peter is an engineer by profession. His age is 25 and his weight is 78. He lives in London and his country of birth is England, UK
• The frame representation looks like the sketch shown below:
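• A minimal sketch of the Peter frame as a Python dictionary of slots and values (the slot names are assumptions based on the description above)

peter_frame = {
    "name":       "Peter",
    "profession": "engineer",
    "age":        25,
    "weight":     78,
    "city":       "London",
    "country":    "England, UK",
}

# Slots can be read or extended freely, e.g. adding a hypothetical extra slot:
peter_frame["hobby"] = "cricket"
print(peter_frame["profession"])  # engineer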
Advantages of FR
• It is flexible
• Easy to include any number of values
• Simple and easy to understand
Disadvantages of FR
• It is a much more generalized approach than other logical representations, and it restricts the user to filling in slot values, which can be difficult for some kinds of knowledge
Production Rules
• This system consists of condition-action pairs
• That is, "if condition, then action"
• A production system has three main parts:
1. Set of Production Rules
2. Working Memory
3. Recognize-act cycle
• In a production rule, the agent checks whether the condition holds and then takes the corresponding action
• There may be a set of rules to follow, and the complete process of matching rules against the current conditions and applying them is called the recognize-act cycle
• The working memory contains a description of the current state of problem solving, against which the rules are matched
• For example:
• If (bus arrives and bus stops) Then (get into the bus)
• If (in the bus, paid the ticket, seat free) Then (sit down)
• If (in the bus, unpaid) Then (pay fine)
• If (bus reaches destination) Then (get down)
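• Below is a small sketch (not from the slides) of the bus rules and a recognize-act cycle, with the working memory held as a set of facts; the fact and action strings are assumptions taken from the rules above

# Minimal sketch of a production system: rules, working memory and a
# recognize-act cycle that fires the first rule whose conditions all hold.
rules = [
    ({"bus arrives", "bus stops"},                    "get into the bus"),
    ({"in the bus", "paid the ticket", "seat free"},  "sit down"),
    ({"in the bus", "unpaid"},                        "pay fine"),
    ({"bus reaches destination"},                     "get down"),
]

def recognize_act(working_memory):
    """Return the action of the first rule whose conditions are all satisfied."""
    for conditions, action in rules:
        if conditions <= working_memory:  # every condition is present in memory
            return action
    return None

print(recognize_act({"bus arrives", "bus stops"}))  # get into the bus
print(recognize_act({"in the bus", "unpaid"}))      # pay fine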
Advantages and Disadvantages
Advantages:
• These rules are simple and are expressed in natural, plain English
• They can easily be removed or modified

Disadvantages:
• Production rules do not support learning from experience, so the system does not improve over time
Reasoning in AI
• Reasoning is the process of deriving logical conclusions and making predictions from the available knowledge, facts and beliefs
• It can also be defined as a way of deriving facts from existing or available data
• In AI, reasoning is essential so that a machine can think like a human brain and act like a human
Types of Reasoning
• There are different types of reasoning
1. Deductive Reasoning
2. Inductive Reasoning
3. Abductive Reasoning
4. Common Sense Reasoning
5. Monotonic Reasoning
6. Non-Monotonic Reasoning
Deductive Reasoning
• It derives new information from logically related known information
• It is a form of valid reasoning, which means the conclusions or inferences are true whenever the premises are true
• It is sometimes also referred to as top-down reasoning
• For example: All humans are vegetarians; xyz is a human; therefore xyz is a vegetarian
Inductive Reasoning
• It is a form of reasoning that arrives at a conclusion from a limited set of facts
• It is also called cause-and-effect or bottom-up reasoning
• It is a type of propositional logic in which the premises provide support for the conclusion
• but do not guarantee it: the premises may be true while the conclusion is not
• For example: All pigeons we have seen in the zoo were white -> true
All pigeons are white -> may or may not be true
Abductive Reasoning
• It is a form of logical reasoning that starts with one or more observations
• and then infers the most likely explanation for those observations
• It is an extension of deductive reasoning, but the premises do not guarantee the conclusion
• For example: My college gets completely wet if it rains in Bangalore
• My college is wet
• Therefore it may or may not have rained in Bangalore
Common Sense reasoning
• It is an informal form of reasoning gained through experience
• It simulates the human ability to make assumptions about events that occur in daily life
• It depends on good judgement rather than exact logic
• For example: if I use a zebra crossing while the signal is green for traffic, there is a chance of my getting into an accident
Monotonic Reasoning
• In this reasoning, once a conclusion is drawn it remains the same even if we add more information to the knowledge base
• Conclusions are derived only from the available facts
• Example: The Earth revolves around the Sun
• Even if we add the fact that the Earth is round, the conclusion that it revolves around the Sun does not change
Non-Monotonic Reasoning
• In this reasoning, conclusions may become invalid as the premises are updated with new information
• It usually deals with incomplete and uncertain models
• Human perception of many things in daily life is an example of non-monotonic reasoning
Truth Maintenance System (TMS)
• A TMS is otherwise known as a belief revision or reason maintenance system
• A TMS maintains consistency in a knowledge base, allowing the problem solver to concentrate on the problem-solving aspects
• that is, on finding solutions
• TMS can be diagrammatically represented as:
Diagrammatic Representation of
TMS
[Diagram: the Inference Engine (IE) exchanges information with the TMS, both working over the Knowledge Base (KB)]
Goals of TMS
• There are four main goals of a TMS:
1. Provide Justifications for Conclusions
2. Recognize inconsistency
3. Support default reasoning
4. Support dependency driven backtracking
Providing justification for
conclusions
• When a problem-solving system provides a conclusion for a user's query, an explanation of that conclusion should always be provided
Recognize Inconsistency
• An IE (Inference Engine) may tell the TMS that some sentences are contradictory, and the TMS then rectifies the inconsistency
Support default reasoning
• In many situations, default assumptions are supported in the absence of complete knowledge
Support dependency driven
backtracking
• The justifications sometimes indicate where a correction is needed, so we can rectify the problem by means of dependency-directed backtracking
IE's beliefs about sentences
1. False
2. True
3. Assumed True
4. Assumed False
5. Assumed Inference
6. Don’t Care
• False – the sentence is believed to be unconditionally false and is otherwise known as a contradiction
• True – the sentence is believed to be unconditionally true and is otherwise known as a tautology
• Assumed True – the sentence is assumed to be true through an enabled assumption, without being known for certain
• Assumed False – the sentence is assumed to be false through an enabled assumption, without being known for certain
• Assumed Inference – the sentence is believed because it is inferred from other sentences
• Don't care – the sentence is ignored entirely
Probability Reasoning in AI
• Knowledge representation in FOL and propositional logic is based on certainty
• For example: A → B means A implies B, i.e., if A is true then B is also true
• But in some situations we are not sure whether A is true or not
• Then the knowledge cannot be expressed using A → B
• Such situations come under uncertainty
• Therefore, to characterize uncertain knowledge, where we are not sure about the prediction,
• we need to use uncertain reasoning or probabilistic reasoning
Some of the cases for uncertainty to
occur
• Experimental error
• Equipment fault
• Temperature variation
• Climate change
• Information from unknown sources
Need for probabilistic reasoning in
AI
• Probabilistic reasoning is used in the following circumstances:
• When the outcomes are unpredictable
• When the possibilities are too large to handle
• When an unknown error occurs
• In probabilistic reasoning, there are two main methods for handling uncertainty:
1. Bayes' rule
2. Bayesian statistics

• Probability can be defined as the chance of occurrence of an uncertain event
• It is a numerical measure of the likelihood that an event will occur
• The value of a probability always lies between 0 and 1
• It can be represented as 0 <= P(x) <= 1, where P(x) is the probability of event x, 0 denotes impossibility and 1 denotes certainty
Bayes Rule
• This theorem determines the probability of an event with uncertain
knowledge
• It is a way to calculate the probability P(A|B) from knowledge of P(B|A)
• Bayes' theorem can be derived from the product rule for the joint probability of events A and B
• Conditioning on B: P(A ∧ B) = P(A|B) * P(B) ............ eq (1)
• Conditioning on A: P(A ∧ B) = P(B|A) * P(A) ............ eq (2)
• where P(A ∧ B) is the joint probability
• Equating eq (1) and eq (2) and dividing both sides by P(B), we get:
• P(A|B) = P(B|A) * P(A) / P(B)
• The equation derived above is known as Bayes' rule or Bayes' theorem
• It is one of the most basic and important tools AI systems use for reasoning under uncertainty
• P(A|B) is called the posterior, which is what we need to calculate
• It is read as the probability of hypothesis A given that evidence B has occurred
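• As a quick numerical illustration (the numbers below are invented for the example, not taken from the slides), Bayes' rule can be applied directly

# Minimal sketch: computing the posterior P(A|B) = P(B|A) * P(A) / P(B).
def bayes_posterior(p_b_given_a, p_a, p_b):
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: P(B|A) = 0.8, P(A) = 0.1, P(B) = 0.2
print(bayes_posterior(0.8, 0.1, 0.2))  # 0.4 -> P(A|B)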
Semantic Networks
• It is a graphical notation for representing knowledge as a pattern of interconnected nodes
• It represents knowledge and supports reasoning about it
• It is an alternative to predicate logic, representing objects and relationships using circles, rectangles, arcs, etc.
• Let us discuss the components of semantic networks:
Components of Semantic Networks
• Lexical component – contains the nodes, links and labels
• Structural component – specifies how the links and nodes are connected (links are directed)
• Semantic component – contains the definitions associated with links and nodes
• Procedural component – contains constructors for creating new links and destructors for removing links
• Now let us look at a few examples:
• A sparrow is a bird
• A bird has feathers
Frames
• It is a collection of attributes (slots) and associated values that describe a real-world entity
• It uses record-like structures to represent knowledge
• A frame can represent an individual object or a class of similar objects
• Frames are therefore semantic networks augmented with properties
• There are some attributes that can be attached to each slot:
• Instance – a particular slot relating an object to its class
• Definition – the slot definition, or a particular value for the slot
• Default – any default value for the slot
• Domain – the particular area in which the slot can be filled
• Range – any range or class of permitted elements
• Range constraints – a logical expression with true or false values constraining the slot
• Single value – the slot returns a single value, possibly a default single value
• Inverse – a representation of the slot's inverse, used in reasoning
Representation of frames
• Tweety is a yellow bird having wings to fly
{
Tweety
(species value (bird))
(color value (yellow))
(Activity value (fly))
}
• Write a frame for books
{
book
(title value (AI))
(author value (ABC))
(publisher value (harper))
(price value (100))
}
Scripts
• It is a structure that describes a stereotyped set of circumstances that follow from one another
• It consists of a number of frames with more specialized roles
• A script has the following components:
Components of Scripts
• Roles – the persons or actors involved in the event
• Props (properties) – the objects or real-world things involved in the event
• Entry condition – a condition that must be satisfied before the event can occur
• Result – a condition that is true after the event
• Track – a variation or specialization of the script
• Scene – a sequence of events that occur
Example for script
Script: withdrawing money
1. Roles – customer, bank employee, cashier
2. Properties – token, form, counters
3. Entry condition – customer has no money
4. Result – customer has money
5. Track – Bank
6. Scene – the sequence of actions
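A minimal sketch of the same script as a Python dictionary (the field values are copied from the example above; the dictionary layout and the listed scene steps are assumptions)

withdraw_money_script = {
    "roles":           ["customer", "bank employee", "cashier"],
    "props":           ["token", "form", "counters"],
    "entry_condition": "customer has no money",
    "result":          "customer has money",
    "track":           "bank",
    "scenes":          ["enter bank", "fill form", "submit token",
                        "collect cash", "leave bank"],  # assumed sequence of actions
}

print(withdraw_money_script["result"])  # customer has money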
