AI Slides 2025
Course Content
Module 1. Introduction to Artificial Intelligence (Lectures: 10 hours)
Overview, Turing Test, Intelligent Agents, Problem solving: solving problems by searching, Uninformed Search - Depth First Search, Breadth First Search, DFID, Heuristic Search - Generate and Test, Best First Search, Beam Search, A*, Problem Reduction Search - AND/OR graphs, AO*, Constraint Satisfaction, Means-Ends Analysis, Stochastic search methods - Simulated Annealing, Particle Swarm Optimization, Game Playing - Minimax algorithm, alpha-beta pruning
References
• Russell, S. and Norvig, P., Artificial Intelligence: A Modern Approach, Second Edition, Pearson Education.
Introduction
• What is AI?
• The foundations of AI
• A brief history of AI
• Introductory problems
What is Artificial Intelligence (AI)?
• Intelligence: “ability to learn, understand and think” (Oxford dictionary)
• AI is the study of how to make computers do things which, at the moment, people do better.
• AI: the simulation of human intelligence processes by machines, especially computers
• Examples: speech recognition; smell, face and object recognition; intuition; inference; learning new skills; decision making; abstract thinking
• John McCarthy coined the term Artificial Intelligence in 1956; he is known as the father of AI and also created LISP
Capabilities of AI
● Problem-solving
● Learning from experience
● Understanding natural language
● Recognizing images, speech, or patterns
● Making decisions autonomously
Examples of AI in Real Life
● Virtual assistants like Siri or Alexa
● Recommendation systems in Netflix and YouTube
● Self-driving cars
● Chatbots and customer service automation
● Medical diagnosis systems
● and many more
Dimensions of AI
[Figure: the four dimensions of AI: thinking vs. acting, humanly vs. rationally]
Thinking Humanly: Cognitive Modelling
● Cognitive modeling is an area of computer science that deals with simulating
human problem-solving and mental processing in a computerized model.
Thinking Rationally: Laws of Thought
• Formalizing the inference process using logic
• Formal logic provides a precise notation and rules for representing and
reasoning with all kinds of things in the world
○ Syllogisms and patterns of argument
Thinking Humanly vs. Thinking Rationally
Acting Humanly
• Exhibits human behavior
• Creating machines that perform functions requiring intelligence when the same tasks are performed by people
• Capabilities that need to be incorporated in a machine so that it can act like a human:
○ NLP
○ Robotics
○ Automated learning
○ Knowledge Representation
○ Machine Learning
○ Computer Vision
Acting Rationally vs. Acting Humanly
● Acting Humanly: AI systems that behave like humans, often
designed for natural interactions through speech, gestures, or facial
expressions.
○ Example: Chatbots like ChatGPT or virtual assistants like Siri engage in
conversations mimicking human communication.
● Acting Rationally: AI that acts to achieve the best possible outcome,
based on its goals and available data, even if it doesn't mimic human
behavior.
○ Example: Autonomous drones use AI to navigate efficiently and complete missions
like delivering packages without imitating human behavior.
Machines With True Intelligence
In 1950, Alan M. Turing published a landmark paper in which he speculated about the possibility of creating machines with true intelligence.
Turing Test or Imitation Game
• The first serious proposal in the philosophy of AI
[Figure: the imitation game: an interrogator questions a human and a machine and must decide which is which]
• Predicted that by 2000, a machine might have a 30% chance of fooling a lay
person for 5 minutes
• Anticipated all major arguments against AI in following 50 years
• Suggested major components of AI: knowledge, reasoning, language
understanding, learning
The Foundations of AI
• Philosophy (423 BC − present):
− Logic, methods of reasoning.
− Mind as a physical system.
− Foundations of learning, language, and rationality.
The Foundations of AI
• Psychology (1879 − present):
− Adaptation.
− Phenomena of perception and motor control.
− Experimental techniques.
A Brief History of AI
• The gestation of AI (1943 − 1956):
− 1943: McCulloch & Pitts: Boolean circuit model of brain.
− 1950: Turing’s “Computing Machinery and Intelligence”.
− 1956: McCarthy’s name “Artificial Intelligence” adopted.
A Brief History of AI
• A dose of reality (1966 − 1974):
− AI discovered computational complexity.
− Neural network research almost disappeared after
Minsky & Papert’s book in 1969.
A Brief History of AI
• AI becomes an industry (1980 − 1988):
− Expert systems industry booms.
− 1981: Japan’s 10-year Fifth Generation project.
The Term Artificial Intelligence
John McCarthy is one of the founding fathers of AI, together with Marvin
Minsky, Allen Newell, and Herbert A. Simon
Weak Vs. Strong AI
● Weak AI: building machines that act intelligently, without worrying about whether they are actually intelligent
● Strong AI: building a machine that is genuinely intelligent ("to develop a person")
● The ultimate goal of AI is to build a person or, more humbly, an animal
● The goal of work in AI is to build machines that perform tasks normally
requiring human intelligence (Nilsson, Nils J.)
What is Involved (1/3)
● Interaction with the real world
○ Perceive, understand and act
● Reasoning and Planning
○ Modelling the external world
○ Planning and decision making
○ Dealing with unexpected problems and uncertainties
● Learning and adaptation
What is Involved (2/3)
● Philosophy
○ Logic, methods of reasoning, mind as physical system
● Mathematics
○ Formal representation and Proof theory
● Statistics and Probability
○ Modelling uncertainty, learning from data
● Economics
○ Utility, decision theory
What is Involved (3/3)
● Neuroscience
● Psychology / Cognitive Science
● Computer Engineering
● Control Theory
● Linguistics
Task Domains of AI
• Mundane Tasks:
– Perception
• Vision
• Speech
– Natural Languages
• Understanding
• Generation
• Translation
– Common sense reasoning
– Robot Control
• Formal Tasks
– Games: chess, checkers, etc.
– Mathematics: geometry, logic, proving properties of programs
• Expert Tasks:
– Engineering ( Design, Fault finding, Manufacturing planning)
– Scientific Analysis
– Medical Diagnosis
– Financial Analysis
AI Technique
• Intelligence requires Knowledge
• Knowledge possesses some less desirable properties, such as:
– It is voluminous
– It is hard to characterize accurately
– It is constantly changing
– It differs from data in the way it is organized and used
• An AI technique is a method that exploits knowledge, which should be represented in such a way that:
– The knowledge captures generalizations
– It can be understood by the people who must provide it
– It can be easily modified to correct errors
– It can be used in a variety of situations
Search Problems
● To build a system to solve a particular problem, you should take the following steps:
■ Define the problem accurately, including detailed specifications and what constitutes a suitable solution.
■ Scrutinize the problem carefully, for some features may have a central effect on the chosen method of solution.
■ Segregate and represent the background knowledge needed in the solution of the problem.
■ Choose the best problem-solving technique(s) and apply them to the particular problem.
[Figure: 8-puzzle example: initial and goal board configurations]
Depth First Search
● Uses a stack (frontier)
● Expand the deepest unexpanded node
● Last In, First Out
● Put successors at the front of the frontier
[Figure: example search tree with root A, children B and C, and leaves D, E, F, G]
Properties
Complete: No, fails in infinite-depth spaces
Time: O(b^m), where m = maximum depth
Space: O(bm)
Optimal: No, may find a non-optimal goal first
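To make the procedure concrete, here is a minimal Python sketch of depth-first search (not from the slides); the graph dictionary and node names A to G are hypothetical example data.
```python
# Minimal depth-first search sketch (illustrative, not from the slides).
# `graph` maps each node to its successors; names are hypothetical.
def dfs(graph, start, goal):
    stack = [(start, [start])]          # LIFO frontier: (node, path so far)
    visited = set()
    while stack:
        node, path = stack.pop()        # expand deepest unexpanded node
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        # push successors so they are expanded before older nodes
        for succ in reversed(graph.get(node, [])):
            stack.append((succ, path + [succ]))
    return None

graph = {"A": ["B", "C"], "B": ["D", "E"], "C": ["F", "G"]}
print(dfs(graph, "A", "G"))             # -> ['A', 'C', 'G']
```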
Breadth First Search
[Figure: level-by-level BFS expansion of the example search tree with nodes A to G]
Depth-Limited Search
● To avoid the infinite-depth problem of DFS, we can decide to search only until a depth limit L.
● What if the solution is deeper than L? Increase L iteratively.
Depth First Iterative Deepening (DFID)
Properties
Complete: Yes
Time: O(b^d)
Space: O(bd)
Optimal: Yes, if step cost = 1 or an increasing function of depth
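A minimal Python sketch of DFID (illustrative only): run a depth-limited DFS with limits L = 0, 1, 2, ... until the goal is found. The graph and the max_depth cap are hypothetical.
```python
# Depth-first iterative deepening (DFID) sketch; hypothetical example graph.
def depth_limited(graph, node, goal, limit, path):
    if node == goal:
        return path
    if limit == 0:
        return None
    for succ in graph.get(node, []):
        found = depth_limited(graph, succ, goal, limit - 1, path + [succ])
        if found:
            return found
    return None

def dfid(graph, start, goal, max_depth=20):
    # run depth-limited DFS with L = 0, 1, 2, ... until the goal is found
    for limit in range(max_depth + 1):
        result = depth_limited(graph, start, goal, limit, [start])
        if result:
            return result
    return None

graph = {"A": ["B", "C"], "B": ["D", "E"], "C": ["F", "G"]}
print(dfid(graph, "A", "E"))            # -> ['A', 'B', 'E']
```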
Heuristic Search
● Heuristic: from the Greek word heuriskein, meaning to find or discover
● A study of the methods and rules of discovery and invention
● Provides the ability to rank the children of a node
● A heuristic function takes a state and returns a numeric value
● It is a function which estimates how far a state is from the goal
Generate and Test Search:
● In this technique, all the solutions are generated and tested for the best
solution.
● It ensures that the best solution is checked against all possible generated
solutions.
Algorithm:
1. Generate a possible solution.
2. Test to see whether this is actually a solution (does it satisfy the goal?).
3. If a solution has been found, quit; otherwise, return to step 1.
Complete: Good generators need to be complete, i.e. they should generate all the possible solutions and cover all the possible states. In this way, we can guarantee that our algorithm converges to the correct solution at some point in time.
Non Redundant: Good Generators should not yield a duplicate solution at any
point of time as it reduces the efficiency of algorithm thereby increasing the time of
search and making the time complexity exponential. In fact, it is often said that if
solutions appear several times in the depth-first search then it is better to modify the
procedure to traverse a graph rather than a tree.
Informed: Good Generators have the knowledge about the search space which they
maintain in the form of an array of knowledge. This can be used to search how far
the agent is from the goal, calculate the path cost and even find a way to reach the
goal.
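A minimal generate-and-test sketch in Python (illustrative, with a hypothetical distance table): the generator enumerates candidate tours for a tiny travelling-salesman instance and the tester accepts the first tour within a given budget.
```python
# Generate-and-test sketch (illustrative): systematically generate candidate
# tours for a tiny travelling-salesman instance and test each one.
from itertools import permutations

# hypothetical distance table between four cities
dist = {("A", "B"): 2, ("A", "C"): 9, ("A", "D"): 10,
        ("B", "C"): 6, ("B", "D"): 4, ("C", "D"): 8}

def d(x, y):
    return dist.get((x, y)) or dist.get((y, x))

def tour_length(tour):
    legs = zip(tour, tour[1:] + tour[:1])           # return to the start city
    return sum(d(x, y) for x, y in legs)

def generate_and_test(cities, budget):
    # generator: complete and non-redundant (each tour produced exactly once)
    for tour in permutations(cities):
        if tour_length(list(tour)) <= budget:       # test
            return tour
    return None

print(generate_and_test(["A", "B", "C", "D"], budget=23))  # -> ('A', 'B', 'D', 'C')
```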
Hill Climbing
● A simple optimization algorithm used in AI to find the best possible
solution for a given problem.
● a local search algorithm which continuously moves in the direction of
increasing elevation/value to find the peak of the mountain or best
solution to the problem.
● It terminates when it reaches a peak value where no neighbor has a
higher value
● Ex: Travelling Salesman Problem
● It is also called greedy local search, as it only looks at its immediate neighbour states and not beyond them.
Features of Hill Climbing
● Flat local maximum: It is a flat space in the landscape where all the
neighbor states of current states have the same value.
Types of Hill Climbing Algorithm
1. Simple hill climbing: it checks only one successor state at a time; if that state is better than the current state, it moves there, otherwise it stays in the current state.
Step 1: Evaluate the initial state; if it is the goal state, then return success and stop.
Step 2: Loop until a solution is found or there is no new operator left to apply.
Step 3: Select and apply an operator to the current state.
Step 4: Check the new state:
If it is the goal state, then return success and quit.
Else, if it is better than the current state, then make the new state the current state.
Else, if it is not better than the current state, then return to step 2.
Step 5: Exit.
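A minimal Python sketch of this simple hill-climbing loop (illustrative): the objective function, the neighbour function, and the start state are hypothetical.
```python
# Simple hill-climbing sketch (illustrative): maximise a one-dimensional
# objective by repeatedly moving to a better neighbouring state.
def hill_climb(value, neighbours, start):
    current = start
    while True:
        improved = False
        for candidate in neighbours(current):       # apply operators one at a time
            if value(candidate) > value(current):   # better than the current state?
                current = candidate                 # move to the better neighbour
                improved = True
                break
        if not improved:                            # no operator improves: a peak
            return current

# hypothetical objective with a peak at x = 3
value = lambda x: -(x - 3) ** 2
neighbours = lambda x: [x - 1, x + 1]
print(hill_climb(value, neighbours, start=10))      # -> 3
```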
Steepest-Ascent Hill Climbing: examines all the neighbours of the current state and moves to the neighbour with the highest value.
Local Maximum: a state that is better than each of its neighbouring states, although there exists another state in the landscape that is higher still.
Solution: a backtracking technique can be a solution to the local maximum in the state-space landscape. Create a list of the promising paths so that the algorithm can backtrack in the search space and explore other paths as well.
Plateau: A plateau is a flat area of the search space in which all the neighbouring states of the current state have the same value; because of this, the algorithm cannot find a best direction in which to move. A hill-climbing search might get lost in the plateau area.
Solution: The solution to a plateau is to take big steps (or very small steps) while searching. Randomly select a state far away from the current state, so that the algorithm may find a non-plateau region.
Ridges: A ridge is a special form of the local maximum. It has an area which is
higher than its surrounding areas, but itself has a slope, and cannot be reached in
a single move.
Best First Search
a. Start from the initial node (say N) and put it in the 'ordered' OPEN list.
b. Select the first/top node (say N) in the OPEN list and move it to the CLOSED list.
c. If N is a GOAL node, then move the node to the CLOSED list and exit the loop, returning the solution path.
d. If N is not the GOAL node, expand node N to generate the 'immediate' next nodes linked to N and add them to the OPEN list.
e. Reorder the nodes in the OPEN list in ascending order according to an evaluation function f(n).
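A minimal Python sketch of greedy best-first search (illustrative): a priority queue plays the role of the ordered OPEN list; the graph and heuristic values are hypothetical.
```python
# Greedy best-first search sketch (illustrative): maintain an OPEN priority
# queue ordered by a heuristic evaluation f(n), and a CLOSED set.
import heapq

def best_first_search(graph, h, start, goal):
    open_list = [(h(start), start, [start])]        # ordered OPEN list
    closed = set()
    while open_list:
        _, node, path = heapq.heappop(open_list)    # take node with lowest f(n)
        if node == goal:
            return path
        if node in closed:
            continue
        closed.add(node)                            # move node to CLOSED
        for succ in graph.get(node, []):
            if succ not in closed:
                heapq.heappush(open_list, (h(succ), succ, path + [succ]))
    return None

# hypothetical graph and heuristic values (estimates of distance to G)
graph = {"S": ["A", "B"], "A": ["C", "G"], "B": ["C"], "C": ["G"]}
h = {"S": 7, "A": 3, "B": 5, "C": 4, "G": 0}.get
print(best_first_search(graph, h, "S", "G"))        # -> ['S', 'A', 'G']
```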
Beam Search
● A heuristic search algorithm that examines a graph by extending the most
promising node in a limited set is known as beam search.
● The heuristic cost associated with the node is used to choose the best
nodes. The width of the beam search is denoted by W.
● Beam search is a heuristic search technique that always expands the W
number of the best nodes at each level. It progresses level by level and
moves downwards only from the best W nodes at each level.
● Beam Search constructs its search tree using breadth-first search. It
generates all the successors of the current level’s state at each level of the
tree. However, at each level, it only evaluates a W number of states. Other
nodes are not taken into account.
Beam Search cases
● When W = 1, the search becomes a hill-climbing search in which the best node
is always chosen from the successor nodes.
● No states are pruned if the beam width is unlimited, and the beam search is
identified as a breadth-first search.
● Beam search keeps the space complexity under control: the beam width bounds the amount of memory needed to complete the search, but this comes at the cost of completeness and optimality (it may not find the best solution, or any solution at all).
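A minimal Python sketch of beam search with beam width W (illustrative); the graph and heuristic values are hypothetical.
```python
# Beam search sketch (illustrative): breadth-first expansion, but keep only
# the best W nodes (by heuristic h) at each level.
def beam_search(graph, h, start, goal, W=2):
    level = [[start]]                               # paths ending at the current level
    while level:
        next_level = []
        for path in level:
            if path[-1] == goal:
                return path
            for succ in graph.get(path[-1], []):
                next_level.append(path + [succ])
        # prune: keep only the W most promising paths
        next_level.sort(key=lambda p: h(p[-1]))
        level = next_level[:W]
    return None

graph = {"S": ["A", "B", "C"], "A": ["D"], "B": ["G"], "C": ["E"], "D": [], "E": []}
h = {"S": 6, "A": 4, "B": 2, "C": 5, "D": 3, "E": 4, "G": 0}.get
print(beam_search(graph, h, "S", "G", W=2))         # -> ['S', 'B', 'G']
```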
Constraint Satisfaction Problems in Artificial Intelligence
X: It is a set of variables.
D: It is a set of domains where the variables reside. There is a specific domain for
each variable.
C: It is a set of constraints which are followed by the set of variables.
● In constraint satisfaction, domains are the spaces where the variables reside,
following the problem specific constraints.
● These are the three main elements of a constraint satisfaction technique. Each constraint consists of a pair {scope, rel}.
● The scope is a tuple of the variables which participate in the constraint, and rel is a relation that lists the values the variables can take in order to satisfy the constraints of the problem.
Solving Constraint Satisfaction Problems
The requirements to solve a constraint satisfaction problem (CSP) are:
● A state-space
● The notion of the solution.
There are two main types of domains used by the variables:
● Discrete domain: the variables take values from a countable set of distinct values; the set may be finite or infinite (e.g., the integers).
● Finite domain: the variables take values from a finite set (e.g., the available colours in map colouring); this is the most common case. By contrast, a domain with infinitely many real-valued states is called a continuous domain.
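A minimal Python sketch of a CSP solved by backtracking (illustrative): a three-region map-colouring instance with variables X, finite domains D, and "different colour" constraints C; the instance itself is hypothetical.
```python
# CSP sketch (illustrative): variables X, finite domains D, and binary
# "different colour" constraints C, solved by simple backtracking.
def backtrack(assignment, variables, domains, neighbours):
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        # constraint check: adjacent regions must not share a colour
        if all(assignment.get(n) != value for n in neighbours[var]):
            result = backtrack({**assignment, var: value},
                               variables, domains, neighbours)
            if result:
                return result
    return None

# hypothetical 3-region map-colouring instance
variables = ["WA", "NT", "SA"]
domains = {v: ["red", "green", "blue"] for v in variables}
neighbours = {"WA": ["NT", "SA"], "NT": ["WA", "SA"], "SA": ["WA", "NT"]}
print(backtrack({}, variables, domains, neighbours))
# -> {'WA': 'red', 'NT': 'green', 'SA': 'blue'}
```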
Knowledge Representation
Object: all the facts about objects in our world domain, e.g., guitars contain strings; trumpets are brass instruments.
Facts: facts are the truths about the real world and what we represent.
Types of knowledge:
1. Declarative knowledge: knowledge about facts and concepts, expressed in declarative sentences.
2. Meta-knowledge: knowledge about the other types of knowledge.
3. Heuristic knowledge: rules of thumb gained from experience about approaches that are likely to work.
There are mainly four ways of knowledge representation which are given as
follows:
● Logical Representation
● Semantic Network Representation
● Frame Representation
● Production Rules
1. Logical Representation
Syntax:
● Syntax comprises the rules which decide how we can construct legal sentences in the logic.
● It determines which symbols we can use in knowledge representation, and how to write those symbols.
Semantics:
● Semantics are the rules by which we can interpret the sentence in the logic.
● Semantic also involves assigning a meaning to each sentence.
Logical reasoning forms the basis for a huge domain of computer science and mathematics; it helps in establishing whether mathematical arguments are valid or invalid.
1. Propositional Logic :
● A proposition is basically a declarative sentence that has a truth value.
● Truth value can either be true or false, but it needs to be assigned any of the
two values and not be ambiguous.
● The purpose of using propositional logic is to analyze a statement, individually
or compositely.
For example, the following statements are all propositions, because each has a specific truth value, true or false:
● (a + b)² = a² + 2ab + b²
● If x is real, then x² ≥ 0
● If x is real, then x² < 0
● The sun rises in the east.
● The sun rises in the west.
Statements that do not have a definite truth value are not propositions; such statements are ambiguous.
For example:
In P(x): x > 5, x is the subject (the variable) and '> 5' is the predicate.
P(7): 7 > 5 is a proposition in which we assign a value to the variable x, and it has a truth value, i.e. True.
The set of values that the variables of the predicate can assume is called the
Universe or Domain of Discourse or Domain of Predicate.
Propositional Logic vs. Predicate Logic
● Propositional logic is the logic that deals with a collection of declarative statements which have a truth value, true or false. Predicate logic is an expression consisting of variables with a specified domain; it consists of objects, relations and functions between the objects.
● Propositional logic is the basic and most widely used logic, also known as Boolean logic. Predicate logic is an extension of propositional logic covering predicates and quantification.
● A proposition has a specific truth value, either true or false. A predicate's truth value depends on the values of its variables.
● Scope analysis is not done in propositional logic. Predicate logic helps analyze the scope of the subject over the predicate; there are three quantifiers: the Universal Quantifier (∀), "for all"; the Existential Quantifier (∃), "there exists some"; and the Uniqueness Quantifier (∃!), "exactly one".
● Propositional logic cannot deal with sets of entities. Predicate logic can deal with sets of entities with the help of quantifiers.
Predicate Logic
Predicate logic deals with predicates, which are propositions containing variables.
Existential Quantifier:
If p(x) is a proposition over the universe U, then it is denoted as ∃x p(x) and read as "There exists at least one value in the universe of the variable x such that p(x) is true." The quantifier ∃ is called the existential quantifier.
There are several ways to write a proposition with an existential quantifier, i.e., (∃x∈A) p(x), or ∃x∈A such that p(x), or (∃x) p(x), or "p(x) is true for some x∈A".
Universal Quantifier:
If p(x) is a proposition over the universe U, then it is denoted as ∀x p(x) and read as "For every x∈U, p(x) is true." The quantifier ∀ is called the universal quantifier.
Example (negating a quantified proposition):
¬(∃x∈U)(x + 6 = 25)
≅ (∀x∈U) ¬(x + 6 = 25)
≅ (∀x∈U)(x + 6 ≠ 25)
3. ¬(∃x p(x) ∨ ∀y q(y)) ≅ (∀x) ¬p(x) ∧ (∃y) ¬q(y)
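A small Python check of the negation rule over a finite universe (illustrative; the universe and the predicate are hypothetical):
```python
# Sanity check of the negation rule over a small finite universe U
# (illustrative): not(exists x: p(x)) is equivalent to (for all x: not p(x)).
U = range(1, 31)                         # hypothetical finite universe
p = lambda x: x + 6 == 25

lhs = not any(p(x) for x in U)           # ~(∃x∈U) (x + 6 = 25)
rhs = all(not p(x) for x in U)           # (∀x∈U) (x + 6 ≠ 25)
print(lhs, rhs, lhs == rhs)              # both False here, and always equal
```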
Advantages of NLP
● NLP helps users to ask questions about any subject and get a direct response within seconds.
● NLP offers exact answers to the question; it does not offer unnecessary or unwanted information.
● NLP helps computers to communicate with humans in their languages.
● It is very time efficient.
● Most of the companies use NLP to improve the efficiency of documentation processes, accuracy of
documentation, and identify the information from large databases
Disadvantages of NLP
● A list of disadvantages of NLP is given below:
● NLP may not show context.
● NLP is unpredictable
● NLP may require more keystrokes.
● NLP systems may be unable to adapt to a new domain and have limited functionality; this is why an NLP system is usually built for a single, specific task.
Components of NLP
1. Natural Language Understanding (NLU)
● Natural Language Understanding (NLU) helps the machine to understand and analyse human language by extracting metadata from content, such as concepts, entities, keywords, emotion, relations, and semantic roles.
● NLU is mainly used in business applications to understand the customer's problem in both spoken and written language.
2. Natural Language Generation (NLG)
● Natural Language Generation (NLG) acts as a translator that converts the computerized data into
natural language representation. It mainly involves Text planning, Sentence planning, and Text
Realization.
NLU vs. NLG
● NLU is the process of reading and interpreting language; NLG is the process of writing or generating language.
● NLU produces non-linguistic outputs from natural language inputs; NLG constructs natural language outputs from non-linguistic inputs.
Applications of NLP
● Question Answering
● Spam Detection
Spam detection is used to detect unwanted e-mails getting into a user's inbox.
Sentiment Analysis
● Sentiment Analysis is also known as opinion mining. It is used on the web to analyse the
attitude, behaviour, and emotional state of the sender.
● This application is implemented through a combination of NLP (Natural Language Processing) and statistics, by assigning values to the text (positive, negative, or neutral) and identifying the mood of the context (happy, sad, angry, etc.).
Machine Translation
Machine translation is used to translate text or speech from one natural language to another natural
language.
Spelling correction
Microsoft Corporation provides word processor software like MS-word, PowerPoint for the spelling
correction.
Speech Recognition
Speech recognition is used for converting spoken words into text. It is used in applications, such as
mobile, home automation, video recovery, dictating to Microsoft Word, voice biometrics, voice user
interface, and so on.
Chatbot
Implementing a chatbot is one of the important applications of NLP. It is used by many companies to provide chat services to their customers.
Information extraction
Information extraction is one of the most important applications of NLP. It is used for extracting
structured information from unstructured or semi-structured machine-readable documents.
It converts large sets of text into more formal representations, such as first-order logic structures, that are easier for computer programs to manipulate than the raw natural language.
Phases of NLP
There are the following five phases of NLP:
1. Lexical and Morphological Analysis
The first phase of NLP is lexical analysis. This phase scans the source text as a stream of characters and converts it into meaningful lexemes. It divides the whole text into paragraphs, sentences, and words.
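A minimal Python sketch of this lexical step (illustrative): splitting raw text into sentences and words with regular expressions; a real system would use a proper tokenizer such as NLTK or spaCy.
```python
# Minimal lexical analysis sketch (illustrative): sentence segmentation and
# word tokenization with regular expressions; the sample text is hypothetical.
import re

text = "NLP is fun. It splits text into sentences, and sentences into words."
sentences = re.split(r"(?<=[.!?])\s+", text)         # sentence segmentation
tokens = [re.findall(r"\w+", s) for s in sentences]  # word tokenization
print(sentences)
print(tokens)
```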
2. Syntactic Analysis (Parsing)
Syntactic analysis is used to check grammar and word arrangements, and shows the relationships among the words.
For example, "Agra goes to the Poonam" does not make any sense in the real world, so this sentence is rejected by the syntactic analyzer.
3. Semantic Analysis
Semantic analysis is concerned with the meaning representation. It mainly focuses on the
literal meaning of words, phrases, and sentences.
4. Discourse Integration
Discourse integration depends upon the sentences that precede it and also invokes the meaning of the sentences that follow it.
5. Pragmatic Analysis
Pragmatic is the fifth and last phase of NLP. It helps you to discover the intended effect by
applying a set of rules that characterize cooperative dialogues.
❖ Syntactic analysis, or parsing, or syntax analysis, is the second phase of NLP.
❖ The purpose of this phase is to draw the exact meaning, or you can say the dictionary meaning, from the text.
❖ Syntax analysis checks the text for meaningfulness against the rules of formal grammar.
❖ For example, a phrase like "hot ice-cream" would be rejected by the semantic analyzer.
❖ In this sense, syntactic analysis or parsing may be defined as the process of analyzing strings of symbols in natural language conforming to the rules of formal grammar.
❖ The origin of the word 'parsing' is the Latin word 'pars', which means 'part'.
Concept of Parser
● It is used to implement the task of parsing. It may be defined as the software component
designed for taking input data (text) and giving structural representation of the input after
checking for correct syntax as per formal grammar.
● It also builds a data structure generally in the form of parse tree or abstract syntax tree or
other hierarchical structure.
The main roles of the parser include checking the input for correct syntax and building the parse tree or other hierarchical structure.
Types of Parsing
Derivation divides parsing into the followings two types −
● Top-down Parsing
● Bottom-up Parsing
Top-down Parsing
In this kind of parsing, the parser starts constructing the parse tree from the start symbol and
then tries to transform the start symbol to the input.
The most common form of top-down parsing uses recursive procedures to process the input (recursive descent parsing).
The main disadvantage of recursive descent parsing is backtracking.
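A minimal recursive-descent (top-down) parser sketch in Python (illustrative): the toy grammar E -> T ('+' T)*, T -> digit and the token format are assumptions, not part of the slides.
```python
# Tiny recursive-descent (top-down) parser sketch for the assumed grammar
#   E -> T ('+' T)*   ,   T -> digit
def parse_expression(tokens, pos=0):
    node, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "+":
        right, new_pos = parse_term(tokens, pos + 1)
        node = ("+", node, right)                    # build the parse tree
        pos = new_pos
    return node, pos

def parse_term(tokens, pos):
    token = tokens[pos]
    if token.isdigit():
        return ("num", int(token)), pos + 1
    raise SyntaxError(f"unexpected token {token!r}")

tree, _ = parse_expression(["1", "+", "2", "+", "3"])
print(tree)        # -> ('+', ('+', ('num', 1), ('num', 2)), ('num', 3))
```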
Bottom-up Parsing
In this kind of parsing, the parser starts with the input symbols and tries to construct the parse tree up to the start symbol.
Concept of Derivation
In order to get the input string, we need a sequence of production rules; such a sequence is called a derivation. During parsing, we need to decide which non-terminal is to be replaced and which production rule to use to replace it.
Types of Derivation
In this section, we will learn about the two types of derivations, which can be used to decide which
non-terminal to be replaced with production rule −
Left-most Derivation
In the left-most derivation, the sentential form of an input is scanned and replaced from the left to the
right. The sentential form in this case is called the left-sentential form.
Right-most Derivation
In the right-most derivation, the sentential form of an input is scanned and replaced from right to left. The sentential form in this case is called the right-sentential form.
Semantic Analysis
Entities: Any sentence is made of different entities that are related to each other. It
represents any individual category such as name, place, position, etc.
Concepts: It represents the general category of individual, such as person, city etc.
● In a sentence, there are a few entities that are co-related to each other.
Relationship extraction is the process of extracting the semantic relationship
between these entities.
The discourse integration step forms the story of the sentence. Every sentence
should have a relationship with its preceding and succeeding sentences. These
relationships are checked by Discourse Integration.
or
Discourse integration depends upon the sentences that precede it and also invokes the meaning of the sentences that follow it.
Pragmatic Analysis:
● Once all grammatical and syntactic checks are complete, the sentences are
now checked for their relevance in the real world.
Learning by Parameter Adjustment
● Start with some estimate of the correct weight settings, and modify the weights in the program on the basis of accumulated experience.
● Features that appear to be good predictors have their weights increased, and bad ones have their weights decreased.
Learning with Macro-Operators:
Macro-operators can be used to group a whole series of actions into one. For
example:
● Making dinner can be described as: lay the table, cook dinner, serve dinner.
Explanation-Based Learning
● Explanation-based learning has the ability to learn from a single training instance. Instead of taking many examples, explanation-based learning emphasizes learning from a single, specific example.
● For example, consider the game of Ludo. In Ludo there are generally four colours of pieces, and for each colour there are four different squares. Suppose the colours are red, green, blue and yellow.
● So a maximum of four players is possible in this game. Two players are considered to be on one side (say green and red) and the other two on the other side (say blue and yellow).
● Each player plays against the opponents on the other side. A small square box (a die) marked with the symbols one to six is circulated among the four players.
● The number one is the lowest and the number six is the highest value on which the moves are made.
● At any instance of play, anyone from the first side will try to attack a member of the second side, and vice versa.
● Explanation-based generalization (EBG) is an algorithm for explanation-based learning, described in Mitchell et al. (1986). It has two steps: first, the explain step, and second, the generalize step.
● During the first step, the domain theory is used to prune away all the
unimportant aspects of training examples with respect to the goal concept.
● The second step is to generalize the explanation as far as possible while still
describing the goal concept.
What is an expert system?
● An expert system is a computer program that uses artificial intelligence (AI) technologies to
simulate the judgment and behavior of a human or an organization that has expertise and
experience in a particular field.
● Expert systems are usually intended to complement, not replace, human experts.
● The concept of expert systems was developed in the 1970s by computer scientist Edward
Feigenbaum, a computer science professor at Stanford University and founder of
Stanford's Knowledge Systems Laboratory.
Difference between AI and Expert System
Artificial Intelligence: AI addresses the broader problem of automating a system. This automation can draw on any field, such as image processing, cognitive science, neural systems, machine learning, etc.
Expert System:
● An expert system is an AI software that uses knowledge stored in a knowledge base to solve
problems that would usually require a human expert thus preserving a human expert’s
knowledge in its knowledge base.
● They can advise users as well as provide explanations to them about how they reached a
particular conclusion or advice.
Artificial Intelligence vs. Expert System
● AI is the ability of a machine or a computer program to think, work, learn and react like humans. Expert systems represent the most successful demonstration of the capabilities of AI.
● AI involves the use of methods based on the intelligent behavior of humans to solve complex problems. Expert systems are computer programs designed to solve complex decision problems.
Characteristics
Artificial Intelligence:
● Facial recognition
● Automating simple and repetitive tasks
● Chatbots
● Natural language processing
● Imitation of human cognition
● Deep learning
● Cloud computing
Expert System:
● High efficiency and accuracy
● Highly responsive
● Understandable
● Reliable
Components of AI:
1. Natural Language Processing (NLP)
2. Knowledge representation
3. Reasoning
4. Problem solving
5. Machine learning
Components of Expert System:
1. Inference engine
2. Knowledge base
3. User interface
4. Knowledge acquisition module
● AI is the study of systems that act in a way that, to any observer, would appear to be intelligent. Expert systems represent the most successful demonstration of the capabilities of AI.
Applications of AI:
● E-commerce
● Education
● Lifestyle
● Navigation
● Robotics
● Human resources
● Healthcare
● Gaming, and others
Applications of Expert Systems:
● Hospitals and medical facilities
● Help desk management
● Loan analysis
● Warehouse optimization
● Stock market trading
● Airline scheduling and cargo schedules, and others
The knowledge acquired from the human expert must be encoded in such
a way that it remains a faithful representation of what the expert knows,
and it can be manipulated by a computer.
Rules of the form "if-then" are the predominant way of encoding knowledge in expert systems. These are of the form:
If a1, a2, ..., an then b1, b2, ..., bn
where the ai are the conditions (antecedents) that must hold and the bi are the conclusions or actions (consequents) that follow.
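A minimal Python sketch of forward chaining over such if-then rules (illustrative); the rule and fact names are hypothetical.
```python
# Forward-chaining sketch over "if-then" rules (illustrative): each rule has a
# set of condition facts a1..an and a set of conclusion facts b1..bn.
rules = [
    ({"has_fever", "has_rash"}, {"suspect_measles"}),      # hypothetical rules
    ({"suspect_measles"},       {"recommend_lab_test"}),
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:                       # keep firing rules until nothing new
        changed = False
        for conditions, conclusions in rules:
            if conditions <= facts and not conclusions <= facts:
                facts |= conclusions     # add b1..bn to working memory
                changed = True
    return facts

print(forward_chain({"has_fever", "has_rash"}, rules))
# -> {'has_fever', 'has_rash', 'suspect_measles', 'recommend_lab_test'}
```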
Semantic Networks
The objects are denoted as nodes of a graph. The relationship between two objects is denoted as a link between the corresponding two nodes.
The most common form of semantic network uses the links between nodes to represent IS-A and HAS relationships between objects.
Example of Semantic Network
The Fig. below shows a car IS-A vehicle; a vehicle HAS wheels.
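A minimal Python sketch of this semantic network (illustrative): the IS-A and HAS links from the example are stored in a dictionary, and a lookup follows IS-A links so that properties are inherited.
```python
# Semantic-network sketch (illustrative): nodes are objects, labelled links
# encode IS-A and HAS relationships, mirroring the car/vehicle example above.
network = {
    "car":     {"IS-A": ["vehicle"]},
    "vehicle": {"HAS": ["wheels"]},
}

def has_property(node, prop, network):
    """Follow IS-A links so that 'car' inherits 'wheels' from 'vehicle'."""
    links = network.get(node, {})
    if prop in links.get("HAS", []):
        return True
    return any(has_property(parent, prop, network)
               for parent in links.get("IS-A", []))

print(has_property("car", "wheels", network))   # -> True
```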
Frame Representation
In this technique, knowledge is decomposed into highly modular pieces called frames, which are generalized record structures. Knowledge consists of concepts, situations, attributes of concepts, relationships between concepts, and procedures to handle relationships as well as attribute values.
● The attributes, the relationships between concepts, and the procedures are allotted to slots in a frame.
● The contents of a slot may be of any data type: numbers, strings, functions or procedures, and so on.
● Frames may be linked to other frames, providing the same kind of inheritance as that provided by a semantic network.
Two frames, their slots and the slots filled with data type are shown.
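A minimal Python sketch of frames (illustrative): each frame is a record of slots, and a slot lookup follows the link to the parent frame to provide inheritance; the frame contents are hypothetical.
```python
# Frame-representation sketch (illustrative): a frame is a record of slots;
# frames can be linked to other frames for inheritance, as described above.
vehicle_frame = {"name": "vehicle", "wheels": 4, "parent": None}
car_frame     = {"name": "car", "fuel": "petrol", "parent": vehicle_frame}

def get_slot(frame, slot):
    """Look up a slot locally, then in the linked parent frame(s)."""
    while frame is not None:
        if slot in frame:
            return frame[slot]
        frame = frame.get("parent")
    return None

print(get_slot(car_frame, "fuel"))     # -> 'petrol' (local slot)
print(get_slot(car_frame, "wheels"))   # -> 4 (inherited from the vehicle frame)
```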
Expert System Shells
● Expert system shells are toolkits that can be used to develop expert systems.
They consist of some built expert system components with an empty
knowledge base. Hence, in most cases, the knowledge engineer is left with
only populating the knowledge base.
● A shell provides a structured skeleton of a knowledge base (in its empty state) with suitable knowledge representation facilities.
● Beyond the built-in components, some ES shells provide facilities for database connectivity and tools for the knowledge engineer (who performs the knowledge engineering and modelling).
● The knowledge base can be connected with an external database (like MySQL), since the knowledge base is not optimal for storing extensive data; such connections allow the knowledge engineer to easily update and check the knowledge base. The knowledge engineer collects expert knowledge in a specific domain and models it to populate the knowledge base.
● The inference engine, which is the most important part of an expert system, accesses the knowledge base and solves the problem by either backward chaining or forward chaining of facts and rules in the knowledge base. In ES shells, the inference engine is provided, together with an explanation facility which gives the user the reasons and explanations behind a particular conclusion.
● In an expert system shell, the design of the user interface and other software components is shared between the knowledge engineer and the software engineer (depending on the size of the project, these parties may vary). ES shells are used to build systems at various scales, and depending on the ES shell there are various pros and cons.
Neural Networks
The building block of a neural network is the single neuron. The diagram shows the structure of a neuron with its inputs; for a neuron with three inputs, the formula reads y = x0*w0 + x1*w1 + x2*w2 + b.
Layers
Neural networks organise neurons into layers. A layer in which every neuron is connected to every neuron in the next layer is called a dense layer.
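A minimal Python/NumPy sketch of the single neuron and of a dense layer (illustrative; the input and weight values are hypothetical).
```python
# Single-neuron and dense-layer sketch (illustrative), matching the formula
# y = x0*w0 + x1*w1 + x2*w2 + b for a neuron with three inputs.
import numpy as np

def neuron(x, w, b):
    return float(np.dot(x, w) + b)           # weighted sum plus bias

def dense_layer(x, W, b):
    # every input feeds every neuron: one output per weight column
    return np.dot(x, W) + b

x = np.array([1.0, 2.0, 3.0])                 # hypothetical inputs x0, x1, x2
w = np.array([0.5, -1.0, 2.0])                # weights w0, w1, w2
print(neuron(x, w, b=0.1))                    # -> 4.6
print(dense_layer(x, np.column_stack([w, w]), np.array([0.1, 0.1])))  # two neurons
```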
Expert Systems
● An expert system is a computer application that uses rules, approaches, and facts to
provide solutions to complex problems.
● Examples of expert systems include MYCIN and DENDRAL.
● MYCIN uses the backward chaining technique to diagnose bacterial infections.
● DENDRAL employs forward chaining to establish the structure of chemicals.
● There are three components in an expert system: user interface, inference engine, and
knowledge base.
● Backward and forward chaining stem from the inference engine component. This is a
component in which logical rules are applied to the knowledge base to get new
information or make a decision.
● The backward and forward chaining techniques are used by the inference engine as
strategies for proposing solutions or deducing information in the expert system.
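A minimal Python sketch contrasting the two strategies: backward chaining proves a goal by working back through the rules (forward chaining was sketched earlier in the rules section). The rules and facts below are hypothetical and are not actual MYCIN rules.
```python
# Backward-chaining sketch (illustrative): start from a goal and work back
# through the rules to see whether the known facts support it.
rules = {                                     # hypothetical goal -> conditions
    "bacterial_infection": [{"positive_culture", "high_white_cell_count"}],
    "high_white_cell_count": [{"blood_test_flag"}],
}
facts = {"positive_culture", "blood_test_flag"}

def backward_chain(goal, rules, facts):
    if goal in facts:
        return True
    # the goal holds if all conditions of some rule concluding it can be proven
    return any(all(backward_chain(c, rules, facts) for c in conditions)
               for conditions in rules.get(goal, []))

print(backward_chain("bacterial_infection", rules, facts))   # -> True
```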