AI Note
Artificial Intelligence refers to machines that model aspects of the human mind: they can understand, analyze, and learn from data through specially designed algorithms. Artificially intelligent machines can remember human behavior patterns and adapt to user preferences.
The major concepts closely related to AI are machine learning, deep learning and natural
language processing (NLP).
Machine Learning (ML) involves teaching machines concepts through examples: large volumes of data, structured so that machines can process it, are fed to the right algorithms.
Deep Learning is a step beyond ML. It learns through layered representations, and the data does not need to be structured for it to make sense of it. This is possible because of artificial neural networks, which are inspired by the structure of the human brain.
One doesn’t have to put much thought into traveling to a new destination anymore. Instead of
having to rely on confusing address directions, you can now simply open up the handy map
application on your phone and type in your destination.
So how does the application know the exact directions, the optimal route, and even road barriers and traffic congestion? Not too long ago, only GPS (satellite-based navigation) was used as guidance for commuting. Now, artificial intelligence is being incorporated to give users a much more enhanced experience with regard to their specific surroundings.
Via machine learning, the app's algorithm remembers the outlines of buildings that staff have manually identified and fed into the system. This allows clear visuals of buildings to be added to the map. Another feature is the ability to recognize and read handwritten house numbers, which helps commuters reach the exact house they are looking for. Places that lack formal street signs can also be identified by their outlines or handwritten labels.
The application has also been taught to recognize traffic, so it recommends the best route, avoiding roadblocks and congestion. The AI-based algorithm estimates the distance to the destination and the time of arrival, which it has been taught to calculate from traffic conditions. Users can also view pictures of their destination before getting there.
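Route recommendation of this kind ultimately comes down to shortest-path search over a road graph whose edge weights reflect current travel times. Real map applications use far more sophisticated models; the sketch below uses plain Dijkstra search, and the road network and travel times are invented for illustration:

```python
import heapq

def best_route(graph, start, goal):
    """Dijkstra's shortest-path search over a road graph whose edge
    weights are travel times (minutes) already adjusted for traffic."""
    pq = [(0, start, [start])]   # (cost so far, node, path taken)
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, minutes in graph.get(node, {}).items():
            if nbr not in seen:
                heapq.heappush(pq, (cost + minutes, nbr, path + [nbr]))
    return None

# Toy road network: congestion makes the seemingly direct A-B-D route slow.
roads = {
    "A": {"B": 5, "C": 2},
    "B": {"D": 30},          # heavy traffic on B -> D
    "C": {"B": 1, "D": 12},  # detour avoids the congestion
}
print(best_route(roads, "A", "D"))  # (14, ['A', 'C', 'D'])
```

Updating the edge weights as traffic reports arrive is what lets the recommended route change in real time.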
By employing similar AI technology, various ride-hailing applications have also come into existence.
When you're typing out documents, there are built-in or downloadable auto-correcting tools that check for spelling mistakes, grammar, readability, and plagiarism, depending on their level of sophistication.
It must have taken you a while to learn your language before you became fluent in it. Similarly, artificially
intelligent algorithms also use machine learning, deep learning, and natural language processing to
identify incorrect usage of language and suggest corrections.
Linguists and computer scientists work together to teach machines grammar, just like you were taught
at school. Machines are fed with copious amounts of high-quality language data, organized in such a
manner that machines can understand it. So when you use even a single comma incorrectly, the editor
will mark it red and prompt suggestions.
The next time you have a language editor check your document, know that you are using one of the
many examples of artificial intelligence.
7. Social Media
The advent of social media gave the world a new narrative, with unprecedented freedom of speech. However, it also brought societal evils such as cybercrime, cyberbullying, and hate speech. Various social media applications are using AI to control these problems and to provide users with other entertaining features.
Social media, being a great example of artificial intelligence, also has the ability to understand
the sort of content a user resonates with and suggests similar content to them. The facial
recognition feature is also utilized in social media accounts, helping people tag their friends
through automatic suggestions. Smart filters can identify and automatically weed out spam or
unwanted messages. Smart replies are another feature users can enjoy.
Some of the industry's future plans include using artificial intelligence to identify mental health problems, such as suicidal tendencies, by analyzing the content a user posts and consumes. Such findings can then be referred to mental health professionals.
8. E-Payments
Having to run to the bank for every transaction can be a hectic errand. Good news! Banks are
now leveraging artificial intelligence to facilitate customers by simplifying payment processes.
Artificial intelligence has made it possible to deposit cheques from the comfort of your home: because AI is proficient at deciphering handwriting, online cheque processing is practical.
Detecting fraud by observing a user's credit card spending patterns is another example of artificial intelligence. The algorithms learn what kinds of products User X buys, when and where they are bought, and in what price bracket they fall. When unusual activity does not fit the user's profile, the system instantly alerts User X.
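The core idea, stripped of the machine learning that real payment systems use, is an anomaly check against the user's history. A minimal sketch (the user profile, amounts, and threshold below are invented for illustration):

```python
from statistics import mean, stdev

def is_unusual(history, new_amount, threshold=3.0):
    """Flag a transaction whose amount lies more than `threshold`
    standard deviations away from the user's spending history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_amount != mu
    return abs(new_amount - mu) / sigma > threshold

# User X normally spends small amounts...
user_x = [12.5, 8.0, 15.0, 9.5, 11.0, 13.5, 10.0]
print(is_unusual(user_x, 11.0))    # False -- fits the profile
print(is_unusual(user_x, 950.0))   # True  -- alert the user
```

A production system would profile merchants, locations, and times as well, but the principle is the same: alert on behavior that deviates from the learned profile.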
Importance of AI
Artificial intelligence can perform a wide range of impressive tasks and jobs. AI can help designers and artists make quick tweaks to visuals. AI can also help researchers identify "fake" images or connect touch and sense. AI is being used to program websites and apps by combining symbolic reasoning and deep learning. Basically, artificial intelligence goes beyond deep learning. Here are a few reasons why AI is important.
AI is also used extensively in the agriculture industry. Robots can plant seeds, fertilize crops, and administer pesticides, among many other uses. Farmers can use drones to monitor crop cultivation and to collect data for analysis.
The collected data adds value by increasing the final output: AI analyzes variables such as crop health and soil conditions to boost production, and it can also assist in harvesting, especially for crops that are difficult to gather.
Artificial Intelligence will eliminate the need for you to perform tedious tasks.
AI is changing the workplace, and there are plenty of reasons to be optimistic. It can take over tedious, lengthy tasks, especially labor-intensive, low-skilled jobs. Employees can then be retasked away from boring jobs, bringing significant, positive change to the workplace.
For instance, artificial intelligence is used in the automotive industry for repetitive tasks such as routine operations on the assembly line. Allowing a robot to take care of, well, robotic tasks has created a shift in the workforce.
Auto accidents are among the most common accidents in America, killing thousands of people annually. A whopping 95 percent of these accidents are caused by human error, meaning they are avoidable.
The number of accidents will fall as artificial intelligence is introduced into the industry through self-driving cars. Ongoing research in the auto industry is looking at ways AI can be used to improve traffic conditions.
Smart systems are already in place in many cities to analyze traffic and control the lights at intersections. Avoiding congestion leads to safer movement of vehicles, bicycles, and pedestrians.
Conclusion
Artificial intelligence is useful across industries, and ongoing research continues to advance it. These advancements will be most useful if AI is understood and trusted. Importantly, artificial intelligence and related technologies such as drones, robots, and autonomous vehicles could create tens of millions of jobs over the next decade.
What is knowledge?
Knowledge is the body of facts and principles. Knowledge can be language, concepts, procedures, rules, ideas, abstractions, places, customs, and so on. The study of knowledge is called epistemology.
Humans are best at understanding, reasoning about, and interpreting knowledge. Humans know things, and based on that knowledge they perform various actions in the real world. How machines can do all of these things is the subject of knowledge representation and reasoning.
Knowledge representation is the study of how knowledge is actually depicted and how closely such a depiction resembles the representation of knowledge in the human brain.
o Knowledge representation and reasoning (KR, KRR) is the part of Artificial Intelligence concerned with how AI agents think and how thinking contributes to their intelligent behavior.
o It is responsible for representing information about the real world so that a computer can understand it and utilize this knowledge to solve complex real-world problems, such as diagnosing a medical condition or communicating with humans in natural language.
o It also describes how we can represent knowledge in artificial intelligence.
Knowledge representation is not just storing data into some database, but it also enables an
intelligent machine to learn from that knowledge and experiences so that it can behave
intelligently like a human.
A knowledge representation system should provide a way of representing complex knowledge and should possess the following characteristics:
• The representation should have well-defined syntax and semantics.
• The knowledge representation scheme should have good expressive capacity.
• The representation must be efficient; that is, it should use only limited resources without compromising expressive power.
What to Represent:
Following are the kinds of knowledge that need to be represented in AI systems:
o Objects: All the facts about objects in our world domain. E.g., guitars have strings; trumpets are brass instruments.
o Events: Events are the actions that occur in our world.
o Performance: Knowledge about behavior, i.e., how to do things.
o Meta-knowledge: Knowledge about what we know.
o Facts: Facts are the truths about the real world and what we represent.
o Knowledge-Base: A knowledge base is an organized collection of facts about the system's
domain. An inference engine interprets and evaluates the facts in the knowledge base in
order to provide an answer.
A knowledge-based system (KBS) is a form of artificial intelligence (AI) that aims to capture
the knowledge of human experts to support decision-making. Examples of knowledge-based systems
include expert systems, which are so called because of their reliance on human expertise.
Knowledge: Knowledge is awareness or familiarity gained by experiences of facts, data, and situations.
Following are the types of knowledge in artificial intelligence:
Types of knowledge
1. Declarative Knowledge:
o Declarative knowledge is knowing about something; it is passive knowledge.
o It includes concepts, facts, and objects.
o It is also called descriptive knowledge and is expressed in declarative sentences.
o It is simpler than procedural knowledge.
o Example: the mark statement of a student.
2. Procedural Knowledge
o It is also known as imperative knowledge.
o Procedural knowledge is a type of knowledge which is responsible for knowing how to do
something.
o It can be directly applied to any task.
o It includes rules, strategies, procedures, agendas, etc.
o Procedural knowledge depends on the task on which it can be applied.
3. Meta-knowledge:
o Knowledge about the other types of knowledge is called Meta-knowledge.
4. Heuristic knowledge:
o Heuristic knowledge represents the knowledge of experts in a field or subject.
o It is based on previous experiences and awareness of approaches that tend to work well but are not guaranteed.
o Heuristic knowledge consists of rules of thumb used to make judgments and to simplify the solution of problems. It is acquired through experience: an expert uses the knowledge gathered through experience and learning.
5. Structural knowledge:
o Structural knowledge is basic to problem-solving.
o It describes the relationships that exist between concepts or objects, such as "kind of", "part of", and grouping of something.
The typical architecture of a knowledge-based system, which informs its problem-solving method,
includes a knowledge base and an inference engine. The knowledge base contains a collection of
information in a given field -- medical diagnosis, for example. The inference engine deduces insights
from the information housed in the knowledge base. Knowledge-based systems also include an
interface through which users query the system and interact with it.
A knowledge-based system may vary with respect to its problem-solving method or approach. Some
systems encode expert knowledge as rules and are therefore referred to as rule-based systems.
Another approach, case-based reasoning, substitutes cases for rules. Cases are essentially solutions to
existing problems that a case-based system will attempt to apply to a new problem.
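The knowledge-base-plus-inference-engine architecture can be sketched with a tiny forward-chaining loop. This is a minimal illustration, not a real diagnostic system; the medical facts and rules below are invented toy examples:

```python
# Knowledge base: a set of known facts plus if-then rules of the form
# ({conditions}, conclusion). The inference engine applies the rules
# repeatedly until no new fact can be derived (forward chaining).

facts = {"fever", "cough"}
rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected"}, "recommend_rest"),
]

def infer(facts, rules):
    """Forward-chain: fire every rule whose conditions hold until
    no new fact is added to the derived set."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(infer(facts, rules))  # all four facts, including the two derived ones
```

The separation matters: domain experts maintain the rules, while the inference engine stays generic.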
Techniques of knowledge representation
There are mainly four ways of knowledge representation which are given as follows:
1. Logical Representation
2. Semantic Network Representation
3. Frame Representation
4. Production Rules / Scripts
1. Logical Representation
Logical representation is a language with concrete rules that deals with propositions and has no ambiguity in its representation. It means drawing conclusions based on various conditions. This representation lays down important communication rules and consists of precisely defined syntax and semantics that support sound inference. Each sentence can be translated into logic using this syntax and semantics.
Syntax:
o Syntax consists of the rules that decide how we can construct legal sentences in the logic.
o It determines which symbols we can use in a knowledge representation and how to write them.
Semantics:
o Semantics consists of the rules by which we can interpret sentences in the logic.
o It involves assigning a meaning to each sentence.
Logical representation can be categorised into mainly two logics:
a. Propositional Logic
b. Predicate Logic
Note: We will discuss Propositional Logic and Predicate Logic in later chapters.
Advantages of logical representation:
1. Logical representation enables us to do logical reasoning.
2. Logical representation is the basis for the programming languages.
Disadvantages of logical Representation:
1. Logical representations have some restrictions and can be challenging to work with.
2. The logical representation technique may not be very natural, and inference may not be very efficient.
Note: Do not be confused with logical representation and logical reasoning as logical representation is a
representation language and reasoning is a process of thinking logically.
3. Frame Representation
A frame is a record-like structure consisting of a collection of attributes and their values that describes an entity in the world. Frames are an AI data structure that divides knowledge into substructures representing stereotyped situations. A frame consists of a collection of slots and slot values; these slots may be of any type and size. Slots have names and values, and the various aspects of a slot are called facets.
Slot	Filler
Year	1996
Pages	1152
Facets: The various aspects of a slot are known as facets. Facets are features of frames that enable us to put constraints on them. Example: IF-NEEDED facets are called when the data of a particular slot is needed. A frame may consist of any number of slots, a slot may include any number of facets, and a facet may have any number of values. A frame is also known as slot-filler knowledge representation in artificial intelligence.
Frames are derived from semantic networks and later evolved into our modern-day classes and objects. A single frame is not very useful by itself; frame systems consist of a collection of connected frames. In a frame, knowledge about an object or event can be stored together in the knowledge base. Frames are a technology widely used in various applications, including natural language processing and machine vision.
Example 1:
Consider a frame for a book, with slots such as Year (filler: 1996) and Pages (filler: 1152).
Example 2:
Let's suppose we take an entity, Peter. Peter is an engineer by profession, his age is 25, he lives in the city of London, and his country is England. The following is the frame representation for this:
Slot	Filler
Name	Peter
Profession	Engineer
Age	25
Weight	78
City	London
Country	England
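In code, a frame maps naturally to a dictionary of slot names and fillers, and an IF-NEEDED facet to a procedure that computes a filler on demand. A minimal sketch (the `summary` slot and `get_slot` helper are invented for the demo):

```python
# A frame sketched as a dictionary: slot names map to fillers, and an
# IF-NEEDED facet is a procedure invoked only when the slot is read.

peter = {
    "name": "Peter",
    "profession": "Engineer",
    "age": 25,
    "weight": 78,
    "city": "London",
    "country": "England",
    # IF-NEEDED facet: the value is computed on demand from other slots
    "summary": lambda f: f"{f['name']} is a {f['age']}-year-old "
                         f"{f['profession']} from {f['city']}, {f['country']}.",
}

def get_slot(frame, slot):
    """Return the filler, invoking it when it is an IF-NEEDED procedure."""
    value = frame[slot]
    return value(frame) if callable(value) else value

print(get_slot(peter, "profession"))  # Engineer
print(get_slot(peter, "summary"))
# Peter is a 25-year-old Engineer from London, England.
```

This mirrors how frames evolved into objects: static slots become fields, and IF-NEEDED facets become computed properties.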
4. Production Rules
A production rule system consists of (condition, action) pairs, which mean "If condition then action". It has mainly three parts:
o The set of production rules
o Working Memory
o The recognize-act-cycle
In a production rule system, the agent checks for the condition, and if the condition holds, the production rule fires and the corresponding action is carried out. The condition part of a rule determines which rule may be applied to a problem, and the action part carries out the associated problem-solving steps. This complete process is called the recognize-act cycle.
The working memory contains the description of the current state of problem-solving, and rules can write knowledge to the working memory. This knowledge may then match and fire other rules.
When a new situation (state) is generated and multiple production rules can fire together, the set of applicable rules is called the conflict set. The agent then needs to select one rule from this set, which is called conflict resolution.
Example:
o IF (at bus stop AND bus arrives) THEN action (get into the bus)
o IF (on the bus AND paid AND empty seat) THEN action (sit down).
o IF (on bus AND unpaid) THEN action (pay charges).
o IF (bus arrives at destination) THEN action (get down from the
bus).
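The recognize-act cycle over the bus rules above can be sketched directly. The rule effects (facts added and removed by each action) are my own additions to make the cycle runnable, and conflict resolution here is simply "take the first matching rule":

```python
# Production rules: (conditions, action, effects on working memory).
rules = [
    ({"at bus stop", "bus arrives"}, "get into the bus",
     {"add": {"on bus"}, "remove": {"at bus stop", "bus arrives"}}),
    ({"on bus", "unpaid"}, "pay charges",
     {"add": {"paid"}, "remove": {"unpaid"}}),
    ({"on bus", "paid", "empty seat"}, "sit down",
     {"add": {"seated"}, "remove": set()}),
]

working_memory = {"at bus stop", "bus arrives", "unpaid", "empty seat"}

fired = []
while True:
    # Recognize: collect all rules whose conditions match (conflict set).
    conflict_set = [r for r in rules
                    if r[0] <= working_memory and r[1] not in fired]
    if not conflict_set:
        break
    # Conflict resolution: pick the first match. Act: apply its effects.
    conditions, action, effects = conflict_set[0]
    fired.append(action)
    working_memory |= effects["add"]
    working_memory -= effects["remove"]

print(fired)  # ['get into the bus', 'pay charges', 'sit down']
```

Note how firing one rule rewrites working memory so that other rules become applicable, which is exactly the chaining behavior the text describes.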
Advantages of Production rule:
1. The production rules are expressed in natural language.
2. The production rules are highly modular, so we can easily remove, add or
modify an individual rule.
Disadvantages of Production rule:
1. Production rule systems do not exhibit any learning capability, as they do not store the results of problems for future use.
2. During the execution of the program, many rules may be active, making rule-based production systems inefficient.
The Knowledge Organization System (KOS) provides a framework for the different classification schemes used to organize knowledge. Some KOSs are library classifications, taxonomies, subject headings, thesauri, ontologies, etc. KOS is a cornerstone of knowledge organization tools.
Knowledge acquisition
One of the main bottlenecks is knowledge acquisition. This phase tries to identify the main concepts by looking at different information sources and seeking the advice of domain experts. The next step is conceptualization: structuring the domain. This means analyzing terminology, synonyms, and hierarchical and associative structures. It is also important to identify the constraints of each relation or attribute.
Predicate logic deals with predicates, which are propositions containing variables.
In propositional logic, we can only represent facts, which are either true or false. PL is not sufficient to represent complex sentences or natural language statements; it has very limited expressive power. Consider a sentence such as "All the girls are intelligent", which we cannot represent in PL.
First-Order logic:
Syntax:
Syntax has to do with what 'things' (symbols, notations) one is allowed to use in the language and in what way. There is/are a(n):
• Alphabet
• Language constructs
• Sentences to assert knowledge
Semantics, formally, has to do with what those sentences built from the alphabet and constructs are supposed to mean.
Atomic sentences:
o Atomic sentences are the most basic sentences of first-order logic. These
sentences are formed from a predicate symbol followed by a parenthesis with
a sequence of terms.
o We can represent atomic sentences as Predicate (term1, term2, ......,
term n).
Complex Sentences:
o Complex sentences are made by combining atomic sentences using
connectives.
Consider the statement "x is an integer". It consists of two parts: the first part, x, is the subject of the statement, and the second part, "is an integer", is known as the predicate.
Quantifiers in First-order logic:
Universal Quantifier:
The universal quantifier is a symbol of logical representation which specifies that the statement within its scope is true for everything, or every instance, of a particular thing. If x is a variable, the universal quantifier is written ∀x and read as:
o For all x
o For each x
o For every x
Example:
All men drink coffee.
We can write it as ∀x (Man(x) → Drink(x, coffee)), read as: for all x, if x is a man, then x drinks coffee.
Existential Quantifier:
Existential quantifiers express that the statement within their scope is true for at least one instance of something.
If x is a variable, then the existential quantifier is written ∃x or ∃(x) and read as "there exists an x" or "for some x".
Example:
Some boys are intelligent.
We can write it as ∃x (Boy(x) ∧ Intelligent(x)), read as: there is some x such that x is a boy and x is intelligent.
Points to remember:
o The main connective used with the universal quantifier ∀ is implication (→).
o The main connective used with the existential quantifier ∃ is conjunction (∧).
Properties of Quantifiers:
o In the universal quantifier, ∀x∀y is equivalent to ∀y∀x.
o In the existential quantifier, ∃x∃y is equivalent to ∃y∃x.
o ∃x∀y is not equivalent to ∀y∃x.
Some examples of FOL using quantifiers:
For example, to go from natural language to first-order logic, consider the following three sentences:
– "Each animal is an organism"
– "All animals are organisms"
– "If it is an animal then it is an organism"
All three can be formalized as:
∀x (Animal(x) → Organism(x))
Similarly, "Some book is heavy" can be formalized as:
∃x (Book(x) ∧ Heavy(x))
Example: in ∀x∀y [A(x) ∧ B(y)], x and y are bound variables.
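On a finite domain, these quantified formulas can be evaluated directly: ∀ becomes `all(...)` and ∃ becomes `any(...)`. A minimal sketch, with an invented toy model (the domain elements and predicate extensions are made up):

```python
# A tiny finite model: predicates are just sets of domain elements.
domain   = ["dog", "oak", "dictionary", "leaflet"]
animal   = {"dog"}
organism = {"dog", "oak"}
book     = {"dictionary", "leaflet"}
heavy    = {"dictionary"}

# ∀x (Animal(x) → Organism(x)): every animal in the domain is an organism.
all_animals_organisms = all((x not in animal) or (x in organism)
                            for x in domain)

# ∃x (Book(x) ∧ Heavy(x)): at least one thing is a heavy book.
some_book_heavy = any((x in book) and (x in heavy) for x in domain)

print(all_animals_organisms)  # True
print(some_book_heavy)        # True
```

Note how the connectives pair up with the quantifiers exactly as the "points to remember" say: ∀ pairs with implication, ∃ with conjunction.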
Properties of WFFs
The evaluation of complex formulas in FOPL can often be facilitated through the substitution of equivalent formulas.
Example:
Start. ∀x [Even(x) ⇔ [∀y Even(Times(x,y))]]
After Step 1: ∀x [[Even(x) ⇒ [∀y Even(Times(x,y))]] ∧
[[∀y Even(Times(x,y))] ⇒ Even(x)]].
After step 2: ∀x [[¬Even(x) ∨ [∀y Even(Times(x,y))]] ∧
[¬[∀y Even(Times(x,y))] ∨ Even(x)]].
After step 3: ∀x [[¬Even(x) ∨ [∀y Even(Times(x,y))]] ∧
[[∃y ¬Even(Times(x,y))] ∨ Even(x)]].
After step 4: ∀x [[¬Even(x) ∨ [∀y Even(Times(x,y))]] ∧
[¬Even(Times(x,Sk1(x))) ∨ Even(x)]].
After step 5: [¬Even(x) ∨ Even(Times(x,y))] ∧
[¬Even(Times(x,Sk1(x))) ∨ Even(x)].
Step 6 has no effect.
After step 7: ¬Even(x) ∨ Even(Times(x,y)).
¬Even(Times(x,Sk1(x))) ∨ Even(x)
Logic
One of the prime activities of human intelligence is reasoning. The activity of reasoning
involves construction, organization and manipulation of statements to arrive at new
conclusions. Thus, logic can be defined as a scientific study of the process of reasoning and
system of rules and procedures that help in the reasoning process.
Basically, the logic process takes some information (called premises) and produces
some outputs (called conclusions). Logic is basically classified into two categories,
propositional logic and predicate logic.
Propositional logic (PL) is the simplest form of logic where all the statements are made by
propositions. A proposition is a declarative statement which is either true or false. It is a
technique of knowledge representation in logical and mathematical form.
Example:
a) It is Sunday.
b) The Sun rises from the West. (False proposition)
c) 3 + 3 = 7 (False proposition)
d) 5 is a prime number. (True proposition)
There are two types of propositions: atomic propositions (simple propositions) and molecular propositions (compound propositions).
Atomic Proposition: Atomic propositions are simple propositions. Each consists of a single proposition symbol. These are sentences that must be either true or false.
Example:
i) The letters A, B, ..., Z, and these letters with subscripted numerals, are well-formed atomic propositions.
ii) If A and B are well-formed atomic propositions, then they can be connected with logical connectives.
Logical Connectives:
Logical connectives are used to connect two simpler propositions or to represent a sentence logically. We can create compound propositions with the help of logical connectives. There are mainly five connectives:
1. Negation (¬)
2. Conjunction (∧)
3. Disjunction (∨)
4. Implication (→)
5. Biconditional (⇔)
A clear meaning of logical propositions can be arrived at by constructing appropriate truth tables for the molecular propositions.
Truth Table:
In propositional logic, we need to know the truth values of propositions in all possible scenarios. We can combine all possible combinations of truth values with logical connectives, and the representation of these combinations in tabular format is called a truth table. The truth table for the connectives is:
A     B     ¬A    A∧B   A∨B   A→B   A⇔B
True  True  False True  True  True  True
True  False False False True  False False
False True  True  False True  True  False
False False True  False False True  True
Truth table with three propositions:
We can build a proposition composed of three propositions P, Q, and R. Its truth table is made up of 2³ = 8 rows, as we have taken three proposition symbols.
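Enumerating all 2³ assignments is a one-liner with `itertools.product`; the compound proposition (P ∧ Q) ∨ R below is my own sample formula for the demo:

```python
from itertools import product

# Enumerate all 2**3 = 8 rows for propositions P, Q, R and evaluate
# the sample compound proposition (P ∧ Q) ∨ R in each row.
print(" P     Q     R    (P∧Q)∨R")
for p, q, r in product([True, False], repeat=3):
    value = (p and q) or r
    print(f"{p!s:5} {q!s:5} {r!s:5} {value!s}")
```

The same loop works for any number of propositions; the table simply doubles in size with each added symbol.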
Precedence of Operators
Just like arithmetic operators, there is a precedence order for propositional connectives or logical operators. This order should be followed while evaluating a propositional problem. The precedence order for the operators, from highest to lowest, is:
1. Parenthesis
2. Negation
3. Conjunction (AND)
4. Disjunction (OR)
5. Implication
6. Biconditional
Note: For better understanding, use parentheses to ensure the correct interpretation. For example, ¬R ∨ Q is interpreted as (¬R) ∨ Q.
Properties of statements.
Valid : A sentence is valid if it is true for every interpretation. Valid sentences are also called
tautologies.
Equivalence: two sentences are equivalent if they have the same truth value under every
interpretation.
Logical consequences: A sentence is logical consequence of another if it is satisfied by all
interpretations which satisfy the first. More generally, it is a logical consequence of other
statements if and only if for any interpretation in which the statements are true, the resulting
statement is also true.
A valid statement is satisfiable, and a contradictory statement is invalid, but the
converse is not necessarily true.
P is satisfiable but not valid, since an interpretation that assigns false to P makes the sentence P false.
(P ∨¬P) is valid since every interpretation results in a value of true for (P ∨¬P)
(P ∧¬P) is a contradiction since every interpretation results in a value of false for (P ∧¬P).
P and ¬(¬P) are equivalent since each has the same truth values under every interpretation.
Logical equivalence
Logical equivalence is one of the features of propositional logic. Two propositions are said to
be logically equivalent if and only if the columns in the truth table are identical to each other.
A     B     A∧B   B∧A
True  True  True  True
True  False False False
False True  False False
False False False False

A     B     A→B   ¬A    ¬A∨B
True  True  True  False True
True  False False False False
False True  True  True  True
False False True  True  True

This gives A→B = ¬A∨B.
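The truth-table check above can also be done by enumeration: two formulas are equivalent exactly when their columns agree on every assignment. A small sketch verifying A→B = ¬A∨B:

```python
from itertools import product

def implies(a, b):
    # A -> B is false only in the one row where A is true and B is false
    return not (a and not b)

rows = list(product([True, False], repeat=2))
col_implication = [implies(a, b) for a, b in rows]
col_disjunction = [(not a) or b for a, b in rows]

print(col_implication == col_disjunction)  # True -- identical columns
```

The same comparison verifies any of the equivalence laws listed below, e.g. De Morgan's laws.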
Some commonly used logical equivalences are listed in the following table.
Equivalence laws (or) Properties of Operators:
o Idempotency:
o P ∨ P = P
o P ∧ P = P
o Commutativity:
o P ∧ Q = Q ∧ P
o P ∨ Q = Q ∨ P
o Associativity:
o (P ∧ Q) ∧ R = P ∧ (Q ∧ R)
o (P ∨ Q) ∨ R = P ∨ (Q ∨ R)
o Identity element:
o P ∧ True = P
o P ∨ True = True
o Distributivity:
o P ∧ (Q ∨ R) = (P ∧ Q) ∨ (P ∧ R)
o P ∨ (Q ∧ R) = (P ∨ Q) ∧ (P ∨ R)
o De Morgan's Laws:
o ¬(P ∧ Q) = (¬P) ∨ (¬Q)
o ¬(P ∨ Q) = (¬P) ∧ (¬Q)
o Double-negation elimination:
o ¬(¬P) = P
o Conditional elimination:
o P → Q = ¬P ∨ Q
o Biconditional elimination:
o P ⇔ Q = (P → Q) ∧ (Q → P)
o We cannot represent relations like ALL, some, or none with propositional logic.
Example:
1. All the girls are intelligent.
2. Some apples are sweet.
o Propositional logic has limited expressive power.
o In propositional logic, we cannot describe statements in terms of their properties or
logical relationships.
Tautologies
A Tautology is a formula which is always true for every value of its propositional
variables.
Example − Prove [(A→B) ∧ A] → B is a tautology.
The truth table is as follows −
A     B     A→B   (A→B)∧A   [(A→B)∧A]→B
True  True  True  True      True
True  False False False     True
False True  True  False     True
False False True  False     True
Since the final column is true in every row, the formula is a tautology.
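A tautology check is easy to mechanize: evaluate the formula under every truth assignment and confirm it is always true. A minimal sketch (the helper name `is_tautology` and the lambda encoding are my own):

```python
from itertools import product

def is_tautology(formula, n_vars):
    """True when the formula holds under every truth assignment."""
    return all(formula(*vals)
               for vals in product([True, False], repeat=n_vars))

# [(A -> B) ∧ A] -> B, writing X -> Y as (not X) or Y
f = lambda a, b: (not (((not a) or b) and a)) or b
print(is_tautology(f, 2))                     # True  -- a tautology
print(is_tautology(lambda a, b: a and b, 2))  # False -- contingent
```

This is exactly the truth-table proof above, performed by exhaustive enumeration instead of by hand.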
Wffs are formed only by applying the above rules a finite number of times. These rules state that all wffs are formed from atomic formulas and the proper application of quantifiers and logical connectives. Some examples of valid wffs are:
MAN(john)
PILOT(father-of(bill))
Some examples of invalid wffs are:
∀P P(x)
MAN(¬john)
Father-of(Q(x))
MARRIED(MAN, WOMAN)
The first expression is invalid because the universal quantifier is applied to the predicate symbol P. The second expression is invalid since the term john, a constant, is negated (terms cannot be negated). The third expression is invalid since it is a function with a predicate argument. The last expression fails since it is a predicate with two predicate arguments.
Syntax of Predicate Calculus
The predicate calculus uses the following types of symbols:
Constants: A constant symbol denotes a particular entity. E.g. John, Muriel, 1.
Functions: A function symbol denotes a mapping from a number of entities to a single entity. E.g. fatherOf is a function with one argument; plus is a function with two arguments. fatherOf(John) is some person; plus(2,7) is some number.
Predicates: A predicate denotes a relation on a number of entities. E.g. Married is a predicate with two arguments; Odd is a predicate with one argument. Married(John, Sue) is a sentence that is true if the relation of marriage holds between the people John and Sue. Odd(Plus(2,7)) is a true sentence.
Variables: These represent some undetermined entity. Examples: x, s1, etc.
Boolean operators: ¬, ∨, ∧, ⇒, ⇔.
Quantifiers: The symbols ∀ (for all) and ∃ (there exists).
Grouping symbols: The open and close parentheses and the comma.
A term is either
1. A constant symbol; or
2. A variable symbol; or
3. A function symbol applied to terms.
Examples: John, x, fatherOf(John), plus(x, plus(1,3)).
An atomic formula is a predicate symbol applied to terms.
Examples: Odd(x). Odd(plus(2,2)). Married(Sue, fatherOf(John)).
A formula is either
1. An atomic formula; or
2. The application of a Boolean operator to formulas; or
3. A quantifier followed by a variable followed by a formula.
Examples: Odd(x). Odd(x) ∨ ¬Odd(Plus(x,x)). ∃x Odd(Plus(x,y)).
∀x Odd(x) ⇒ ¬Odd(Plus(x,3)).
A sentence is a formula with no free variables. (That is, every occurrence of every variable is associated with some quantifier.)
Clausal Form
A literal is either an atomic formula or the negation of an atomic formula.
Examples: Odd(3). ¬Odd(Plus(x,3)). Married(Sue,y).
A clause is a disjunction of literals. Variables in a clause are interpreted as universally quantified with the largest possible scope.
Example: Odd(x) ∨ Odd(y) ∨ ¬Odd(Plus(x,y)) is interpreted as
∀x,y Odd(x) ∨ Odd(y) ∨ ¬Odd(Plus(x,y)).
Converting a sentence to clausal form
1. Replace every occurrence of A ⇔ B by (A ⇒ B) ∧ (B ⇒ A). When this is complete, the sentence will have no occurrence of ⇔.
2. Replace every occurrence of A ⇒ B by ¬A ∨ B. When this is complete, the only Boolean operators will be ∨, ¬, and ∧.
3. Replace every occurrence of ¬(A ∨ B) by ¬A ∧ ¬B; every occurrence of ¬(A ∧ B) by ¬A ∨ ¬B; and every occurrence of ¬¬A by A.
New step: Replace every occurrence of ¬∃x f(x) by ∀x ¬f(x) and every occurrence of ¬∀x f(x) by ∃x ¬f(x).
Repeat as long as applicable. When this is done, all negations will be next to an atomic sentence.
4. (New Step: Skolemization). For every existential quantifier ∃x in
the formula, do the following:
If the existential quantifier is not inside the scope of any universal
quantifiers, then
i. Create a new constant symbol γ.
ii. Replace every occurrence of the variable x by γ.
iii. Drop the existential quantifier.
If the existential quantifier is inside the scope of universal quantifiers
with variables u1 . . . uk, then
i. Create a new function symbol γ.
ii. Replace every occurrence of the variable x by the term γ(u1 . . . uk).
iii. Drop the existential quantifier.
Example. Change ∃x Blue(x) to Blue(Sk1).
Change ∀x∃y Odd(Plus(x,y)) to ∀x Odd(Plus(x,Sk2(x))).
Change ∀x,y∃z∀a∃b P(x,y,z,a,b) to ∀x,y∀a P(x,y,Sk3(x,y),a,Sk4(x,y,a)).
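Step 4 can also be sketched in code. The following minimal illustration assumes formulas are already in negation normal form (as guaranteed by the previous steps) and encoded as nested tuples; the encoding, the helper names, and the no-variable-shadowing assumption are all part of this sketch, not of the notes:

```python
import itertools

_counter = itertools.count(1)   # source of fresh Skolem names

def substitute(formula, var, term):
    """Replace every occurrence of variable `var` with `term`.
    (Sketch: assumes no inner quantifier re-binds the same name.)"""
    if formula == var:
        return term
    if isinstance(formula, tuple):
        return tuple(substitute(part, var, term) for part in formula)
    return formula

def skolemize(formula, universals=()):
    """Remove existential quantifiers, replacing each existentially
    bound variable by a fresh Skolem constant (no enclosing universals)
    or Skolem function term over the enclosing universal variables.
    Formulas are tuples: ('forall'|'exists', var, body),
    ('and'|'or', left, right), or atoms like ('Odd', ('Plus', 'x', 'y'))."""
    op = formula[0]
    if op == "forall":
        _, var, body = formula
        return ("forall", var, skolemize(body, universals + (var,)))
    if op == "exists":
        _, var, body = formula
        sk = "Sk%d" % next(_counter)
        # A constant if no enclosing universals, else a function term.
        term = sk if not universals else (sk,) + universals
        return skolemize(substitute(body, var, term), universals)
    if op in ("and", "or"):
        return (op, skolemize(formula[1], universals),
                skolemize(formula[2], universals))
    return formula  # atom or negated atom: no quantifiers inside (NNF)
```

For instance, skolemize(('forall', 'x', ('exists', 'y', ('Odd', ('Plus', 'x', 'y'))))) yields ('forall', 'x', ('Odd', ('Plus', 'x', ('Sk2', 'x')))), mirroring the second example above.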
5. New step: Elimination of universal quantifiers:
Part 1. Make sure that each universal quantifier in the formula uses a
variable with a different name, by changing variable names if
necessary.
Part 2. Drop all universal quantifiers.
Example. Change [∀x P(x)] ∨ [∀x Q(x)] to P(x) ∨ Q(x1).
6. (Same as step 4 of CNF conversion.) Replace every occurrence of
(A ∧ B) ∨ C by (A ∨ C) ∧ (B ∨ C), and every occurrence of A ∨ (B ∧ C)
by (A ∨ B) ∧ (A ∨ C).
Repeat as long as applicable. When this is done, all conjunctions will
be at top level.
7. (Same as step 5 of CNF conversion.) Break up the top-level
conjunctions into separate sentences. That is, replace A ∧ B by the
two sentences A and B. When this is done, the set will be in CNF.
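The propositional core of these steps (1–3 and 6) can be sketched as follows; the tuple encoding of formulas is an assumption of this sketch, not part of the notes:

```python
def eliminate(f):
    """Steps 1-2: rewrite <=> and => into not, or, and.
    Formulas are tuples ('iff'|'imp'|'and'|'or', a, b), ('not', a),
    or a string atom."""
    if isinstance(f, str):
        return f
    op = f[0]
    if op == "iff":
        a, b = eliminate(f[1]), eliminate(f[2])
        return ("and", ("or", ("not", a), b), ("or", ("not", b), a))
    if op == "imp":
        return ("or", ("not", eliminate(f[1])), eliminate(f[2]))
    if op == "not":
        return ("not", eliminate(f[1]))
    return (op, eliminate(f[1]), eliminate(f[2]))

def push_not(f):
    """Step 3: De Morgan and double negation, moving not onto atoms."""
    if isinstance(f, str):
        return f
    if f[0] == "not":
        g = f[1]
        if isinstance(g, str):
            return f
        if g[0] == "not":
            return push_not(g[1])
        if g[0] == "and":
            return ("or", push_not(("not", g[1])), push_not(("not", g[2])))
        if g[0] == "or":
            return ("and", push_not(("not", g[1])), push_not(("not", g[2])))
    return (f[0], push_not(f[1]), push_not(f[2]))

def distribute(f):
    """Step 6: distribute or over and until conjunctions are at top level."""
    if isinstance(f, str) or f[0] == "not":
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if f[0] == "or":
        if not isinstance(a, str) and a[0] == "and":   # (A & B) | C
            return ("and", distribute(("or", a[1], b)),
                    distribute(("or", a[2], b)))
        if not isinstance(b, str) and b[0] == "and":   # A | (B & C)
            return ("and", distribute(("or", a, b[1])),
                    distribute(("or", a, b[2])))
    return (f[0], a, b)

def to_cnf(f):
    return distribute(push_not(eliminate(f)))
```

For example, to_cnf(('or', ('and', 'A', 'B'), 'C')) produces ('and', ('or', 'A', 'C'), ('or', 'B', 'C')), matching the rewrite in step 6.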
Example:
Start. ∀x [Even(x) ⇔ [∀y Even(Times(x,y))]]
After Step 1: ∀x [[Even(x) ⇒ [∀y Even(Times(x,y))]] ∧
[[∀y Even(Times(x,y))] ⇒ Even(x)]].
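For completeness, the remaining steps can be carried out as follows (a sketch; Sk1 here is a fresh Skolem function for this example):
After Step 2: ∀x [[¬Even(x) ∨ [∀y Even(Times(x,y))]] ∧ [¬[∀y Even(Times(x,y))] ∨ Even(x)]].
After Step 3: ∀x [[¬Even(x) ∨ [∀y Even(Times(x,y))]] ∧ [[∃y ¬Even(Times(x,y))] ∨ Even(x)]].
After Step 4: ∀x [[¬Even(x) ∨ [∀y Even(Times(x,y))]] ∧ [¬Even(Times(x,Sk1(x))) ∨ Even(x)]].
After Step 5: [¬Even(x) ∨ Even(Times(x,y))] ∧ [¬Even(Times(x,Sk1(x))) ∨ Even(x)].
Step 6 is not needed, since the conjunction is already at top level.
After Step 7: the two clauses ¬Even(x) ∨ Even(Times(x,y)) and ¬Even(Times(x,Sk1(x))) ∨ Even(x).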
Linguistic analysis is concerned with identifying the structural units and classes of
language. Linguists also attempt to describe how smaller units can be combined to
form larger grammatical units: how words can be combined to form phrases,
phrases combined to form clauses, and so on. They also study what
constrains the possible meanings of a sentence.
Natural Language Processing (NLP)
Natural Language Processing (NLP) refers to the AI method of communicating with
intelligent systems using a natural language such as English.
Processing of natural language is required when you want an intelligent system, such
as a robot, to perform as per your instructions, when you want to hear a decision from
a dialogue-based clinical expert system, and so on.
The field of NLP involves making computers perform useful tasks with the natural
languages humans use. The input and output of an NLP system can be −
• Speech
• Written Text
Components of NLP
There are two components of NLP − Natural Language Understanding (NLU) and
Natural Language Generation (NLG).
Natural Language Generation (NLG) is the process of producing meaningful phrases
and sentences in the form of natural language from some internal representation.
It involves −
• Text planning − It includes retrieving the relevant content from the knowledge
base.
• Sentence planning − It includes choosing the required words, forming meaningful
phrases, and setting the tone of the sentence.
• Text Realization − It is mapping the sentence plan into sentence structure.
NLU is harder than NLG.
Difficulties in NLU
NL has an extremely rich form and structure.
It is very ambiguous. There can be different levels of ambiguity −
• Lexical ambiguity − It is at a very primitive level, such as the word level.
For example, should the word “board” be treated as a noun or a verb?
• Syntax-level ambiguity − A sentence can be parsed in different ways.
For example, “He lifted the beetle with red cap.” − Did he use the cap to lift the
beetle, or did he lift a beetle that had a red cap?
• Referential ambiguity − Referring to something using pronouns. For example,
Rima went to Gauri. She said, “I am tired.” − Exactly who is tired?
One input can have several meanings.
Many inputs can mean the same thing.
Steps in NLP
There are five general steps −
• Lexical Analysis − It involves identifying and analyzing the structure of words.
The lexicon of a language is the collection of words and phrases in that
language. Lexical analysis divides the whole chunk of text into paragraphs,
sentences, and words.
• Syntactic Analysis (Parsing) − It involves analyzing the words in the sentence
for grammar and arranging them in a manner that shows the relationships
among the words. A sentence such as “The school goes to boy” is rejected
by an English syntactic analyzer.
• Context-Free Grammar
• Parser
Context-Free Grammar
A context-free grammar (CFG) is a list of rules that define the set of all well-formed
sentences in a language. Each rule has a left-hand side, which identifies a syntactic
category, and a right-hand side, which defines its alternative component parts, reading from
left to right.
It is a grammar whose rewrite/production rules have a single symbol on the
left-hand side. Let us create a grammar to parse the sentence −
“The bird pecks the grains”
Articles (DET) − a | an | the
Nouns − bird | birds | grain | grains
Noun Phrase (NP) − Article + Noun | Article + Adjective + Noun
= DET N | DET ADJ N
Verbs − pecks | pecking | pecked
Verb Phrase (VP) − NP V | V NP
Adjectives (ADJ) − beautiful | small | chirping
The parse tree breaks down the sentence into structured parts so that the computer
can easily understand and process it. For the parsing algorithm to construct
this parse tree, a set of rewrite/production rules, which describe what tree structures
are legal, needs to be constructed.
These rules say that a certain symbol may be expanded in the tree into a sequence of
other symbols. For example, the first rule below says that if there are two strings, a
Noun Phrase (NP) followed by a Verb Phrase (VP), then their concatenation is a
sentence. The rewrite rules for the sentence are as follows −
S → NP VP
NP → DET N | DET ADJ N
VP → V NP
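As an illustration (not part of the original text), the three rewrite rules and the lexicon above can be turned into a small recursive-descent parser:

```python
# Lexicon from the example grammar: word -> syntactic category.
LEXICON = {
    "the": "DET", "a": "DET", "an": "DET",
    "bird": "N", "birds": "N", "grain": "N", "grains": "N",
    "pecks": "V", "pecking": "V", "pecked": "V",
    "beautiful": "ADJ", "small": "ADJ", "chirping": "ADJ",
}

def parse_np(tokens, i):
    """NP -> DET N | DET ADJ N. Returns (tree, next_index) or None."""
    if i < len(tokens) and LEXICON.get(tokens[i]) == "DET":
        if i + 1 < len(tokens) and LEXICON.get(tokens[i + 1]) == "N":
            return ("NP", tokens[i], tokens[i + 1]), i + 2
        if (i + 2 < len(tokens) and LEXICON.get(tokens[i + 1]) == "ADJ"
                and LEXICON.get(tokens[i + 2]) == "N"):
            return ("NP", tokens[i], tokens[i + 1], tokens[i + 2]), i + 3
    return None

def parse_vp(tokens, i):
    """VP -> V NP."""
    if i < len(tokens) and LEXICON.get(tokens[i]) == "V":
        np = parse_np(tokens, i + 1)
        if np:
            tree, j = np
            return ("VP", tokens[i], tree), j
    return None

def parse_sentence(text):
    """S -> NP VP; succeeds only if all tokens are consumed."""
    tokens = text.lower().split()
    np = parse_np(tokens, 0)
    if np:
        np_tree, i = np
        vp = parse_vp(tokens, i)
        if vp and vp[1] == len(tokens):
            return ("S", np_tree, vp[0])
    return None

print(parse_sentence("The bird pecks the grains"))
```

The parser returns a nested tuple standing in for the parse tree, and returns None for a string the grammar cannot derive, such as “The school goes to boy”.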
In the second part, the individual words are combined to provide meaning in
sentences.
The most important task of semantic analysis is to get the proper meaning of the
sentence. For example, consider the sentence “Ram is great.” In this sentence, the
speaker is talking either about Lord Ram or about a person whose name is Ram. That
is why the semantic analyzer’s job of getting the proper meaning of the sentence is
important.
Hyponymy
It may be defined as the relationship between a generic term and instances of that
generic term. Here the generic term is called the hypernym and its instances are
called hyponyms. For example, the word color is a hypernym, and blue, yellow, etc.
are its hyponyms.
Homonymy
It may be defined as words having the same spelling or the same form but different
and unrelated meanings. For example, the word “bat” is a homonym, because a bat
can be an implement used to hit a ball, or a nocturnal flying mammal.
Polysemy
“Polysemy” is a Greek word meaning “many signs”. A polysemous word or phrase has
different but related senses. In other words, a polysemous word has the same
spelling but different and related meanings. For example, the word “bank” is a
polysemous word with the following meanings −
• A financial institution.
• The building in which such an institution is located.
• The verb “to rely on” (as in “bank on”).
Synonymy
It is the relation between two lexical items having different forms but expressing the
same or a close meaning. Examples are ‘author/writer’ and ‘fate/destiny’.
Antonymy
It is the relation between two lexical items having symmetry between their semantic
components relative to an axis. The scope of antonymy is as follows −
• Application of a property or not − Examples are ‘life/death’ and ‘certitude/incertitude’.
• Application of a scalable property − Examples are ‘rich/poor’ and ‘hot/cold’.
• Application of a usage − Examples are ‘father/son’ and ‘moon/sun’.
Meaning Representation
Semantic analysis creates a representation of the meaning of a sentence. But before
getting into the concepts and approaches related to meaning representation, we need
to understand the building blocks of a semantic system.
Lexical Semantics
The first part of semantic analysis, studying the meaning of individual words, is called
lexical semantics. It includes words, sub-words, affixes (sub-units), compound words,
and phrases. All the words, sub-words, etc. are collectively called lexical items.
In other words, lexical semantics is the study of the relationship between lexical
items, the meaning of sentences, and the syntax of sentences.
Following are the steps involved in lexical semantics −
• Classification of lexical items like words, sub-words, affixes, etc.
• Decomposition of lexical items like words, sub-words, affixes, etc.
• Analysis of the differences as well as the similarities between various lexical
semantic structures.
LUNAR
The LUNAR system was developed by Woods in 1970. It is one of the largest and
most successful question-answering systems using AI techniques. The system had a
separate syntax analyzer and a semantic interpreter. Its parser was written in ATN
(Augmented Transition Network) form. The system was used in various tests and
responded successfully to queries like the following:
→ Which one is the oldest material between Iron, Bauxite and Aluminum?
The LUNAR system mainly deals with queries, and its performance is better than
that of other systems.
LIFER
The parser interprets the language inputs and translates them into appropriate
structures that interact with the application software.
LIFER has proven to be effective as a front end for a number of systems. The main
disadvantage is the potentially large number of patterns that may be required for
a system that needs many diverse patterns.
SHRDLU
The system can be roughly divided into four component domains: 1) a syntactic
parser governed by a large English grammar, 2) a semantic component
of programs that interpret the meanings of words and structures, 3) a cognitive
deduction component used to examine the consequences of facts, carry out
commands, and find answers, and 4) an English response generation component.
In addition, there is a knowledge base containing blocks-world knowledge, and a
model of its own reasoning process, used to explain its actions.
Integrating the parts of the understanding process with procedural knowledge has
resulted in an efficient and effective understanding system. Of course, the domain
of SHRDLU is very limited and closed, which greatly simplifies the problem.
Parsing and its relevance in NLP
The word ‘parsing’, which originates from the Latin word ‘pars’ (meaning ‘part’), is
used to draw the exact or dictionary meaning from the text. It is also called
syntactic analysis or syntax analysis. Syntax analysis checks the text for
meaningfulness by comparing it with the rules of formal grammar. A sentence like
“Give me hot ice-cream”, for example, would be rejected by the parser or syntactic
analyzer.
The process of determining the syntactic structure of a sentence is known as
parsing. It is a process of analyzing a sentence by taking it apart word by word and
determining its structure. When given an input string, the lexical parts or terms (root
words) must first be identified by type, and then the role they play in the sentence
must be determined. These parts can then be combined successively into larger units
until a complete tree structure has been built.
To determine the meaning of a word, a parser must have access to a lexicon.
When the parser selects a word from the input stream, it locates the word in the
lexicon and obtains the word’s possible functions and other features, including
semantic information. This information is then used in building a tree or other
representation structure. The general parsing process is illustrated in the following
figure.
We can understand the relevance of parsing in NLP with the help of following points
−
• The parser is used to report any syntax error.
• It helps to recover from commonly occurring errors so that the processing of the
remainder of the program can be continued.
• A parse tree is created with the help of a parser.
• The parser is used to create a symbol table, which plays an important role in NLP.
• The parser is also used to produce intermediate representations (IR).
Syntactic analysis, parsing, or syntax analysis is the second phase of NLP. The
purpose of this phase is to draw the exact meaning, or the dictionary meaning,
from the text. Syntax analysis checks the text for meaningfulness by comparing it
with the rules of formal grammar. For example, a phrase like “hot ice-cream” would
be rejected by the semantic analyzer.
In this sense, syntactic analysis or parsing may be defined as the process of
analyzing strings of symbols in natural language for conformance to the rules of
formal grammar.
Types of Parsing
Derivation divides parsing into the following two types −
• Top-down Parsing
• Bottom-up Parsing
Top-down Parsing
In this kind of parsing, the parser starts constructing the parse tree from the start
symbol and then tries to transform the start symbol into the input. The most common
form of top-down parsing uses recursive procedures to process the input. The main
disadvantage of recursive-descent parsing is backtracking.
For example, a possible top-down parse of the sentence “Kathy jumped the horse”
would be written as follows.
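As an illustration (assuming the earlier rewrite rules plus a rule NP → NAME for proper names, which is not in the grammar above), the top-down derivation might read:
S → NP VP → NAME VP → Kathy VP → Kathy V NP → Kathy jumped NP
→ Kathy jumped DET N → Kathy jumped the N → Kathy jumped the horse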
Bottom-up Parsing
In this kind of parsing, the parser starts with the input symbols and tries to construct
the parse tree up to the start symbol.
A possible bottom-up parse of the same sentence might be written as follows.
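Under the same assumption of a rule NP → NAME for proper names, the bottom-up derivation might read:
Kathy jumped the horse → NAME jumped the horse → NAME V the horse
→ NAME V DET horse → NAME V DET N → NAME V NP → NP V NP → NP VP → S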
Difference between Top down parsing and Bottom up parsing
There are two parsing techniques: top-down parsing and bottom-up parsing.
Top-down parsing is a technique that first looks at the highest level of the parse
tree and works down the parse tree by using the rules of grammar, while bottom-up
parsing first looks at the lowest level of the parse tree and works up the parse
tree by using the rules of grammar.
The differences between these two parsing techniques are given below:
Conclusion
The expert system is a part of AI, and the first ES was developed in 1970; it was
among the first successful applications of artificial intelligence. It solves the most
complex issues, as an expert would, by using the knowledge stored in its knowledge
base. These systems are designed for a specific domain, such as medicine,
science, etc.
Below is the block diagram that represents the working of an expert system:
It is important to remember that an expert system is not used to replace human
experts; instead, it is used to assist humans in making complex decisions. These
systems do not have human capabilities of thinking and work on the basis of the
knowledge base of their particular domain.
o MYCIN: It was one of the earliest backward chaining expert systems that
was designed to find the bacteria causing infections like bacteraemia and
6. High Performance: The expert system provides high performance for solving
any type of complex problem of a specific domain with high efficiency and accuracy.
8. Highly responsive: ES provides the result for any complex query within a very
short period of time.
1. User Interface
2. Inference Engine
3. Knowledge Base
4. Explanation System/Module
The following diagram shows the general sketch of an Expert System
architecture.
The following figure represents the relation between the components of a typical
Expert System.
1. User Interface
With the help of a user interface, the expert system interacts with the user, takes
queries as input in a readable format, and passes them to the inference engine. After
getting the response from the inference engine, it displays the output to the user. In
other words, it is an interface that helps a non-expert user communicate
with the expert system to find a solution.
2. Inference Engine
o The inference engine is known as the brain of the expert system, as it is the
main processing unit of the system. It applies inference rules to the
knowledge base to derive a conclusion or deduce new information. It helps in
deriving an error-free solution to queries asked by the user.
o With the help of an inference engine, the system extracts the knowledge from
the knowledge base.
o The inference process is carried out recursively in three stages: (1) match,
(2) select, and (3) execute. During the match stage, the contents of working
memory are compared to the facts and rules contained in the knowledge base.
Once all the matching rules have been found, one of the rules is selected for
execution, and the selected rule is then executed.
o Forward Chaining: It starts from the known facts and rules, and applies the
inference rules to add their conclusions to the known facts.
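The match–select–execute cycle and forward chaining above can be sketched as a minimal loop over if-then rules (the fact and rule names here are hypothetical, invented for this sketch):

```python
def forward_chain(facts, rules):
    """Repeatedly apply rules of the form (premises, conclusion):
    when every premise is in the fact set, add the conclusion.
    Stops when a full pass adds nothing new (match-select-execute)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Match: all premises hold and the conclusion is new.
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)   # Execute: assert the conclusion
                changed = True
    return facts

# Hypothetical toy rules for illustration only.
rules = [
    ({"has_fever", "has_rash"}, "suspect_measles"),
    ({"suspect_measles"}, "recommend_test"),
]
print(forward_chain({"has_fever", "has_rash"}, rules))
```

Note how the second rule fires only because the first rule's conclusion was added to the fact set, which is exactly the forward-chaining behavior described above.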
3. Knowledge Base
o One can also view the knowledge base as a collection of objects and their
attributes. For example, a lion is an object, and its attributes are that it is a
mammal, it is not a domestic animal, etc.
4. Explanation System/Module
o To respond to a how query, the explanation module traces the sequence of
rules that led to the conclusion, which is printed for the user in an
easy-to-understand, human-language style.
o The editor is used by developers to create new rules for addition to the
knowledge base, to delete outmoded rules, or to modify existing rules.
Participants in the development of an Expert System
5. High security: These systems provide high security to resolve any query.
6. Considers all the facts: To respond to any query, it checks and considers all
the available facts and provides the result accordingly, whereas a human expert
may fail to consider some facts for one reason or another.
8. Advising: It is capable of advising humans on queries within the domain of
the particular ES.
11. Interpreting the input: It is capable of interpreting the input given by the
user.
15. They can be used in risky places where human presence is not safe.
o The response of the expert system may be wrong if the knowledge base
contains wrong information.
o Unlike a human being, it cannot produce creative output for different
scenarios.
o For each domain, we require a specific ES, which is one of the big limitations.