
Fundamentals of Artificial Intelligence

Unit 4: Logic and Knowledge in Artificial Intelligence

by
Dr. Abdul Ahad
Content
4.1 Introduction of Logic
4.2 Logic Representation
4.3 Propositional Logic
4.4 Predicate Logic
4.5 Several Other Logics
4.6 Uncertainty and Probability
4.7 Knowledge Representation
4.8 Graphical Sketches and the Human Window
4.9 Graphs and the Bridges of Königsberg Problem
4.10 Representational Choices
4.11 Production Systems
4.12 Object Orientation and Frames
4.13 Semantic Networks
Dr. Abdul Ahad, Department of AI 2
Logics in AI

i. Propositional Logic
ii. Predicate Logic (First-Order Logic)
iii. Uncertainty in AI
    a. Fuzzy Logic
    b. Probability
4.1 Propositional Logic
 Propositional logic (PL) is the simplest form of logic, where all
statements are made up of propositions.
 A proposition is a declarative statement which is either true or
false.
 It is a technique of knowledge representation in logical and
mathematical form.
 Propositional logic is also called Boolean logic as it works on 0
and 1.
 In propositional logic, we use symbolic variables to represent
the logic, and we can use any symbol for representing a
proposition, such as A, B, C, P, Q, R, etc.
 A proposition can be either true or false, but not both.
 Propositional logic consists of an object, relations or function,
and logical connectives. These connectives are also called logical
operators.
 The propositions and connectives are the basic elements of
propositional logic. A connective is a logical operator that
connects two sentences.
 A proposition formula which is always true is called a tautology;
it is also called a valid sentence.
 A proposition formula which is always false is called a contradiction.
 Questions, commands, and opinions, such as "Where is Rohini?",
"How are you?", and "What is your name?", are not propositions.

 Negation: A sentence such as ¬P is called the negation of P. A literal can be either a
positive literal or a negative literal.
 Conjunction: A sentence which has the ∧ connective, such as P ∧ Q, is called a
conjunction.
Example: Rohan is intelligent and hardworking. It can be written as:
P = Rohan is intelligent,
Q = Rohan is hardworking. → P ∧ Q.
 Disjunction: A sentence which has the ∨ connective, such as P ∨ Q, is called a disjunction,
where P and Q are the propositions.
Example: "Ritika is a doctor or an engineer."
Here P = Ritika is a doctor, Q = Ritika is an engineer, so we can write it as P ∨ Q.
 Implication: A sentence such as P → Q is called an implication. Implications are also
known as if-then rules. It can be represented as:
If it is raining, then the street is wet.
Let P= It is raining, and Q= Street is wet, so it is represented as P → Q
 Biconditional: A sentence such as P ⇔ Q is called a biconditional sentence.
Example: If I am breathing, then I am alive.
P = I am breathing, Q = I am alive; it can be represented as P ⇔ Q.
• Tautology


Properties of Operators
• Commutativity:
– P∧ Q = Q ∧ P (or) P ∨ Q = Q ∨ P.
• Associativity:
– (P ∧ Q) ∧ R = P ∧ (Q ∧ R) (or) (P ∨ Q) ∨ R = P ∨ (Q ∨ R)
• Identity element:
– P ∧ True = P (or) P ∨ False = P.
• Distributive:
– P∧ (Q ∨ R) = (P ∧ Q) ∨ (P ∧ R).
– P ∨ (Q ∧ R) = (P ∨ Q) ∧ (P ∨ R).
• De Morgan's Laws:
– ¬ (P ∧ Q) = (¬P) ∨ (¬Q)
– ¬ (P ∨ Q) = (¬ P) ∧ (¬Q).
• Double-negation elimination:
– ¬ (¬P) = P.
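These properties can be verified exhaustively by enumerating truth tables. A minimal Python sketch (the helper name `equivalent` is ours, not from the slides):

```python
from itertools import product

def equivalent(f, g, n_vars=2):
    """Check that two Boolean formulas agree on every assignment."""
    return all(f(*vals) == g(*vals) for vals in product([False, True], repeat=n_vars))

# De Morgan's laws
assert equivalent(lambda p, q: not (p and q), lambda p, q: (not p) or (not q))
assert equivalent(lambda p, q: not (p or q), lambda p, q: (not p) and (not q))

# Distributivity: P ∧ (Q ∨ R) = (P ∧ Q) ∨ (P ∧ R)
assert equivalent(lambda p, q, r: p and (q or r),
                  lambda p, q, r: (p and q) or (p and r), n_vars=3)
print("all properties verified")
```

The same helper can check any of the laws above, since each equality is just a tautology over two or three variables.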
Limitations
 We cannot represent quantified relations like all, some, or none with
propositional logic.
Example:
 All the girls are intelligent.
 Some apples are sweet.
 Propositional logic has limited expressive power.
 In propositional logic, we cannot describe statements in terms of
their properties or logical relationships.
Arguments in Propositional Logic

An argument in propositional logic has the form:

A: P1
   P2
   ...
   Pr
________________________
∴ C // Conclusion
 An argument is valid iff the implication formed by taking the
conjunction of the premises as the antecedent and the conclusion
as the consequent is a tautology,
i.e. (P1 /\ P2 /\ … /\ Pr) ⇒ C
(the premises are assumed to be true).
 If this implication is a tautology, then the argument A is valid.
 A second approach to proving validity is known as resolution,
also referred to as resolution-refutation.
 Resolution assumes the premises are true and the conclusion is false.
 If we can arrive at a contradiction, then the original conclusion
must follow logically from the premises, and hence the original
argument is valid.
 Premises and conclusion must be in clause form.
 In clause form there is no:
 implication - change (p ⇒ q) to (~p \/ q)
 conjunction - change p /\ q to p , q
 double negation - change ~~p to p
Resolution in the Propositional Logic
There are four steps to proving an argument valid by resolution:
 Convert the premises to clause form
 Negate the conclusion
 Convert the negation of the conclusion to clause
form
 Finally, search for a contradiction in the list of
clauses including combined clauses
Prove the following argument is valid:
1) p⇒q
2) q ⇒ ~r

3) ~p ⇒ ~r

________
4) ∴ ~r
Step 1: Convert the premises to clause form
1') ~p \/ q
2') ~q \/ ~r
3') ~~p \/ ~r
Step 2: Negate the conclusion
4) ~~r
Prove the following argument is valid:
1) p⇒q
2) q ⇒ ~r

3) ~p ⇒ ~r

________
4) ∴ ~r
Step 3: Convert the negation of the conclusion to clause form
4') r // via involution
Step 4: The clause base (list of clauses) is:
1') ~p \/ q
2') ~q \/ ~r
3') p \/ ~r
4') r
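The search for a contradiction in Step 4 can be sketched in Python. This is a minimal propositional resolution loop; the encoding of a clause as a set of (symbol, is_positive) pairs is our own convention, not from the slides:

```python
def resolve(c1, c2):
    """All resolvents of two clauses; a literal is a (symbol, is_positive) pair."""
    out = []
    for (sym, pos) in c1:
        if (sym, not pos) in c2:
            out.append(frozenset((c1 - {(sym, pos)}) | (c2 - {(sym, not pos)})))
    return out

def valid_by_resolution(clauses):
    """Premises plus negated conclusion; valid iff the empty clause is derivable."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a != b:
                    for r in resolve(a, b):
                        if not r:        # empty clause: contradiction found
                            return True
                        new.add(r)
        if new <= clauses:               # fixpoint reached: no contradiction
            return False
        clauses |= new

# The clause base above: ~p \/ q,  ~q \/ ~r,  p \/ ~r,  r
base = [frozenset({("p", False), ("q", True)}),
        frozenset({("q", False), ("r", False)}),
        frozenset({("p", True), ("r", False)}),
        frozenset({("r", True)})]
print(valid_by_resolution(base))  # prints True: the argument is valid
```

Tracing it by hand: 4' and 2' give ~q; ~q and 1' give ~p; 4' and 3' give p; p and ~p resolve to the empty clause, so the argument is valid.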
4.2 Predicate Logic (First-Order Logic)
 Greater expressive power than propositional logic
 A predicate logic expression consists of a predicate name
followed by a list of arguments
 The number of elements in the list of arguments is the
predicate’s arity
 Two quantifiers can be applied to predicate variables
Quantifiers in Predicate Logic
 Existential quantifier (∃)
 ∃x – “there exists an x”
 One or more values of x are guaranteed to exist
 Universal quantifier (∀)
 ∀x – “for all x”
 The expression states that something is true for
all values that x can assume
Examples of Predicate Logic:

(∃x)[Natural_number(x) /\ Divisible_by_2(x)]
 Some natural numbers are even
(∀x){[Animal(x) /\ Has_Hair(x) /\ Warm_Blooded(x)]
⇒ Mammal(x)}
 If x is a warm-blooded animal with hair
then x is a mammal

Resolution in Predicate Logic
• Steps for Resolution:
1. Convert facts into first-order logic.
2. Convert FOL statements into CNF.
3. Negate the statement to be proved (proof by contradiction).
4. Draw the resolution graph (unification).

1) All great chefs are Italian.
2) All Italians enjoy good food.
3) Either Michael or Louis is a great chef.
4) Michael is not a great chef.
5) Therefore, Louis enjoys good food.
Use the following predicates:
 GC(x) : x is a great chef
 I(x) : x is Italian
 EF(x) : x enjoys good food

1) (∀x)(GC(x) ⇒ I(x))
2) (∀x)(I(x) ⇒ EF(x))
3) GC(Michael) \/ GC(Louis)
4) ~GC(Michael)
Therefore: 5) EF(Louis)

1) All great chefs are Italian.
2) All Italians enjoy good food.
3) Either Michael or Louis is a great chef.
4) Michael is not a great chef.
5) Therefore, Louis enjoys good food.
Convert the premises into clause form where no quantifiers can
be present:
1) ~GC(x) \/ I(x)
2) ~I(y) \/ EF(y)
3) GC(Michael) \/ GC(Louis)
4) ~GC(Michael)

Negate the conclusion:


5) ~EF(Louis) //already in clause form
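Full first-order resolution requires unification of variables; but since the only constants here are Michael and Louis, we can ground the universally quantified clauses over those two constants and then run ordinary propositional resolution. A sketch of that shortcut (the (symbol, is_positive) clause encoding is our own convention):

```python
def resolve(c1, c2):
    """All resolvents of two ground clauses (sets of (atom, is_positive) pairs)."""
    return [frozenset((c1 - {l}) | (c2 - {(l[0], not l[1])}))
            for l in c1 if (l[0], not l[1]) in c2]

def refutes(clauses):
    """True iff the empty clause is derivable (proof by contradiction succeeds)."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                new.update(resolve(a, b))
        if frozenset() in new:
            return True
        if new <= clauses:    # fixpoint: nothing new, no contradiction
            return False
        clauses |= new

# Ground the universally quantified clauses with the constants Michael and Louis
clauses = []
for c in ("Michael", "Louis"):
    clauses.append(frozenset({(f"GC({c})", False), (f"I({c})", True)}))  # ~GC(x) \/ I(x)
    clauses.append(frozenset({(f"I({c})", False), (f"EF({c})", True)}))  # ~I(y) \/ EF(y)
clauses.append(frozenset({("GC(Michael)", True), ("GC(Louis)", True)}))  # premise 3
clauses.append(frozenset({("GC(Michael)", False)}))                      # premise 4
clauses.append(frozenset({("EF(Louis)", False)}))                        # negated conclusion
print(refutes(clauses))  # prints True: EF(Louis) follows
```

The refutation mirrors the resolution graph: premises 3 and 4 give GC(Louis); with clause 1 that gives I(Louis); with clause 2 that gives EF(Louis), which contradicts the negated conclusion.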
4.3 Uncertainty in AI

 Fuzzy Logic
 Probability Theory

Knowledge Representation
First-order logic and propositional logic represent knowledge with
certainty, which means we are sure about the predicates. With this kind
of representation we might write A→B, meaning if A is true then B is
true. But consider a situation where we are not sure whether A is true
or not: then we cannot express this statement. This situation is called
“uncertainty”.
So, to represent uncertain knowledge, where we are not sure about the
predicates, we need uncertain reasoning or probabilistic reasoning.
Following are some leading causes of uncertainty in the real world:
• Information from unreliable sources
• Experimental errors
• Equipment faults
• Temperature variation
• Climate change
 Information = Data + Facts
 Data: numbers without meaning.
 Facts: numbers with units.
 Data → Facts → Information → Knowledge
 Information: conversion of facts into meaning.
 Knowledge: helps with making complex decisions and understanding.
Example:
Data: 70
Fact: 70 degrees Fahrenheit
Information: The temperature outside is 70 degrees Fahrenheit.
Knowledge: If the temperature is over 70 degrees Fahrenheit, then
you can go swimming.
The Knowledge Hierarchy


Fuzzy Logic
The 'Fuzzy' word means the things that are not clear or are
vague. In artificial intelligence (AI) systems, fuzzy logic is used
to imitate human reasoning and cognition. Sometimes, we
cannot decide in real life that the given problem or statement is
either true or false.
In the Boolean system, only two possibilities (0 and 1) exist,
where 1 denotes the absolute truth value and 0 denotes the
absolute false value.
But in the fuzzy system, there are multiple possibilities
present between the 0 and 1, which are partially false and
partially true.
Fuzzy sets are denoted or represented by the tilde (~) character.
This concept was introduced by Lotfi Zadeh in 1965, based on
fuzzy set theory.
Examples of Fuzzy Logic compared with Boolean Logic

Architecture of a Fuzzy Logic System
In the architecture of a Fuzzy Logic system, each component
plays an important role. The architecture consists of four
components, given below:
1. Rule Base
2. Fuzzification
3. Inference Engine
4. Defuzzification

Rule Base is the component that stores the set of rules and the If-Then
conditions, given by experts, used for controlling the decision-making
system.
Fuzzification is the module that transforms the system inputs: it
converts crisp numbers into fuzzy sets. Crisp numbers are the inputs
measured by sensors, which fuzzification passes into the control system
for further processing.
Inference Engine is the main component of any Fuzzy Logic System (FLS),
because all the information is processed in the inference engine. It
finds the degree of match between the current fuzzy input and the rules
and, based on that matching degree, determines which rules to fire for
the given input.
Defuzzification is the module that takes the fuzzy set output generated
by the inference engine and transforms it into a crisp value. It is the
last step in a fuzzy logic system.
Membership Function: The membership function is a function which
represents the graph of a fuzzy set and allows users to quantify a
linguistic term. It maps each element of X to a value between 0 and 1.
This function is also known as the indicator or characteristic
function.

 Mathematically, a fuzzy set (Ã) is a pair (U, M), where U is the
universe of discourse and M is the membership function, which takes
values in the interval [0, 1]. The universe of discourse (U) is also
denoted by Ω or X.
For example, for a fuzzy set B, the membership function is defined as
μB: X → [0,1]. This function maps each element of X to a value between
0 and 1, called its degree of membership or membership value.
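As a concrete illustration, a triangular shape is one common choice of membership function. This sketch is ours (the linguistic term "warm" and its breakpoints are hypothetical); it maps each x to a degree in [0, 1]:

```python
def triangular(a, b, c):
    """Membership function μ: X -> [0,1] that peaks at b and is zero outside (a, c)."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)   # rising edge
        return (c - x) / (c - b)       # falling edge
    return mu

# Hypothetical linguistic term "warm", fully true at 25 °C
warm = triangular(15, 25, 35)
print(warm(25))  # 1.0  (fully warm)
print(warm(20))  # 0.5  (partially warm)
print(warm(40))  # 0.0  (not warm at all)
```

Temperatures between the breakpoints get partial membership, which is exactly the "partially true, partially false" behaviour the text describes.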
Fuzzy set Operations
Union Operation: The union operation of fuzzy sets is defined by:
μA∪B(x) = max (μA(x), μB(x))
Example: Let's suppose A is a set which contains the following elements:
A = {(X1, 0.8), (X2, 0.6), (X3, 0.4), (X4, 0.2)}
and B is a set which contains the following elements:
B = {(X1, 0.3), (X2, 0.5), (X3, 0.7), (X4, 0.9)}
Then, A∪B = {(X1, 0.8), (X2, 0.6), (X3, 0.7), (X4, 0.9)}
because, according to this operation:
μA∪B(X1) = max (0.8, 0.3) = 0.8
μA∪B(X2) = max (0.6, 0.5) = 0.6
μA∪B(X3) = max (0.4, 0.7) = 0.7
μA∪B(X4) = max (0.2, 0.9) = 0.9
Intersection Operation: The intersection operation of fuzzy sets is defined by:
μA∩B(x) = min (μA(x), μB(x))
Example: Let's suppose A is a set which contains the following elements:
A = {(X1, 0.8), (X2, 0.6), (X3, 0.4), (X4, 0.2)}
and B is a set which contains the following elements:
B = {(X1, 0.3), (X2, 0.5), (X3, 0.7), (X4, 0.9)}
Then, A∩B = {(X1, 0.3), (X2, 0.5), (X3, 0.4), (X4, 0.2)}
because, according to this operation:
μA∩B(X1) = min (0.8, 0.3) = 0.3
μA∩B(X2) = min (0.6, 0.5) = 0.5
μA∩B(X3) = min (0.4, 0.7) = 0.4
μA∩B(X4) = min (0.2, 0.9) = 0.2
Complement Operation: The complement operation of a fuzzy set is defined by:
μĀ(x) = 1 − μA(x)
Example: Let's suppose A is a set which contains the following elements:
A = {(X1, 0.3), (X2, 0.8), (X3, 0.5), (X4, 0.1)}
Then, Ā = {(X1, 0.7), (X2, 0.2), (X3, 0.5), (X4, 0.9)}
because, according to this operation:
μĀ(X1) = 1 − 0.3 = 0.7
μĀ(X2) = 1 − 0.8 = 0.2
μĀ(X3) = 1 − 0.5 = 0.5
μĀ(X4) = 1 − 0.1 = 0.9
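The three operations above are just element-wise max, min, and 1 − μ. A short Python sketch reproducing the worked examples (representing a fuzzy set as a dict from element to membership degree is our own convention):

```python
def f_union(A, B):
    """Fuzzy union: element-wise max of membership degrees."""
    return {x: max(A[x], B[x]) for x in A}

def f_intersection(A, B):
    """Fuzzy intersection: element-wise min of membership degrees."""
    return {x: min(A[x], B[x]) for x in A}

def f_complement(A):
    """Fuzzy complement: 1 - μ (rounded to avoid float noise)."""
    return {x: round(1 - m, 10) for x, m in A.items()}

A = {"X1": 0.8, "X2": 0.6, "X3": 0.4, "X4": 0.2}
B = {"X1": 0.3, "X2": 0.5, "X3": 0.7, "X4": 0.9}

print(f_union(A, B))         # {'X1': 0.8, 'X2': 0.6, 'X3': 0.7, 'X4': 0.9}
print(f_intersection(A, B))  # {'X1': 0.3, 'X2': 0.5, 'X3': 0.4, 'X4': 0.2}
print(f_complement({"X1": 0.3, "X2": 0.8, "X3": 0.5, "X4": 0.1}))
```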

Probability
Probabilistic reasoning: Probabilistic reasoning is a way of knowledge
representation where we apply the concept of probability to indicate the
uncertainty in knowledge. In probabilistic reasoning, we combine
probability theory with logic to handle the uncertainty.
In the real world, there are lots of scenarios, where the certainty of
something is not confirmed, such as "It will rain today," "behavior of
someone for some situations," "A match between two teams or two
players." These are probable sentences for which we can assume that it
will happen but not sure about it, so here we use probabilistic reasoning.
Need of probabilistic reasoning in AI:
 When there are unpredictable outcomes.
 When the specifications or possibilities of predicates become too large to handle.
 When an unknown error occurs during an experiment.
In probabilistic reasoning, there are two ways to solve problems with
uncertain knowledge:
• Bayes' rule
• Bayesian statistics
Probability: Probability can be defined as the chance that an uncertain event
will occur. It is the numerical measure of the likelihood that an event will occur.
The value of a probability always lies between 0 and 1.
• 0 ≤ P(A) ≤ 1, where P(A) is the probability of an event A.
• P(A) = 0 indicates that event A is impossible (total uncertainty).
• P(A) = 1 indicates that event A is certain (total certainty).
• P(¬A) = probability of event A not happening.
• P(¬A) + P(A) = 1.
We can find the probability of an uncertain event by using the formula:
P(A) = (number of outcomes favourable to A) / (total number of possible outcomes)

Event: Each possible outcome of a variable is called an event.
Sample space: The collection of all possible events is called the sample space.
Random variables: Random variables are used to represent events and objects in the real world.
Prior probability: The prior probability of an event is the probability computed before
observing new information.
Posterior probability: The probability calculated after all evidence or information
has been taken into account. It is a combination of the prior probability and the new
information.
Conditional probability: Conditional probability is the probability of an event
occurring when another event has already happened.
Let's suppose we want to calculate the probability of event A when event B has
already occurred, "the probability of A under the conditions of B". It can be
written as:
P(A|B) = P(A⋀B) / P(B)
where P(A⋀B) = joint probability of A and B, and
P(B) = marginal probability of B.
If the probability of A is given and we need to find the probability of B, then
it is given as:
P(B|A) = P(A⋀B) / P(A)
This can be explained using a Venn diagram: once B has occurred, the sample
space is reduced to the set B, and we can calculate event A given event B by
dividing the probability P(A⋀B) by P(B).
Example: In a college, 70% of the students like CSE and 40% of the
students like both CSE and ECE. What percentage of the students who
like CSE also like ECE?
Solution:
Let A be the event that a student likes ECE,
and B be the event that a student likes CSE.
P(A ∧ B) is the probability that a student likes both CSE and ECE.
The required percentage is the conditional probability:
P(A|B) = P(A ∧ B) / P(B) = 0.40 / 0.70 ≈ 0.57
Hence, about 57% of the students who like CSE also like ECE.
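The same calculation as a tiny Python check (the helper name is ours):

```python
def conditional(p_a_and_b, p_b):
    """Conditional probability: P(A|B) = P(A ∧ B) / P(B)."""
    return p_a_and_b / p_b

p_cse = 0.70    # P(B): a student likes CSE
p_both = 0.40   # P(A ∧ B): a student likes both CSE and ECE
print(f"{conditional(p_both, p_cse):.0%}")  # prints 57%
```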
Bayes Theorem
Bayes’ theorem can be used to calculate the probability that a certain
event will occur or that a certain proposition is true. Bayes' theorem
is also known as Bayes' rule, Bayes' law, or Bayesian Reasoning,
which determines the probability of an event with uncertain
knowledge.

 Bayes' theorem was named after the British mathematician Thomas
Bayes. Bayesian inference is an application of Bayes' theorem and is
fundamental to Bayesian statistics. The theorem gives a way to
calculate P(B|A) from knowledge of P(A|B), and it allows us to update
the predicted probability of an event by observing new information
about the real world.

Bayes' theorem can be derived using the product rule and the conditional
probability of event A given event B:
P(A ⋀ B) = P(A|B) P(B)
Similarly, the probability of event B given event A:
P(A ⋀ B) = P(B|A) P(A)
Equating the right-hand sides of both equations, we get:
P(A|B) = P(B|A) P(A) / P(B)    ... (a)
The above equation (a) is called Bayes' rule or Bayes' theorem. This
equation is the basis of most modern AI systems for probabilistic inference.
Here, P(A|B) is the posterior, which we need to calculate; it is read as the
probability of hypothesis A given that evidence B has occurred.
P(B|A) is the likelihood: assuming the hypothesis is true, the probability of
the evidence.
P(A) is the prior probability: the probability of the hypothesis before
considering the evidence.
P(B) is the marginal probability: the probability of the evidence alone.
Suppose we want to perceive the effect of some unknown cause and compute that
cause; then Bayes' rule becomes:
P(Cause|Effect) = P(Effect|Cause) P(Cause) / P(Effect)
Example-1: What is the probability that a patient has meningitis, given a
stiff neck?
Given data: A doctor is aware that the disease meningitis causes a patient to
have a stiff neck 80% of the time. He is also aware of some more facts, given
as follows:
• The known probability that a patient has meningitis is 1/30,000.
• The known probability that a patient has a stiff neck is 2%.
Let a be the proposition that the patient has a stiff neck and b the
proposition that the patient has meningitis, so we can calculate the following:
P(a|b) = 0.8
P(b) = 1/30000
P(a) = 0.02
P(b|a) = P(a|b) P(b) / P(a) = (0.8 × 1/30000) / 0.02 = 1/750 ≈ 0.0013
Hence, we can assume that 1 out of 750 patients with a stiff neck has
meningitis.
Example-2: Dangerous fires are rare (1%) but smoke is fairly common (10%) due
to barbecues, and 90% of dangerous fires make smoke.
We can then discover the probability of a dangerous fire when there is smoke:
P(Fire | Smoke) = [ P(Smoke | Fire) P(Fire) ] / P(Smoke)
= [ 0.90 × 0.01 ] / 0.10
= 0.09 (9%)
So it is still worth checking out any smoke, to be sure.
Example-3: You are planning a picnic today, but the morning is cloudy. Oh no! 50%
of all rainy days start off cloudy! But cloudy mornings are common (about 40% of
days start cloudy) And this is usually a dry month (only 3 of 30 days tend to be
rainy, or 10%). What is the chance of rain during the day?
We will use Rain to mean rain during the day, and Cloud to mean cloudy morning. The
chance of Rain given Cloud is written P(Rain|Cloud).
P( Rain | Cloud ) = [ P ( Cloud | Rain ) X P ( Rain ) ] / P ( Cloud )
• P ( Cloud | Rain ) is Probability of Cloud, given that Rain happens = 50%
• P ( Rain ) is Probability of Rain = 10%
• P ( Cloud ) is Probability of Cloud = 40%
P ( Rain | Cloud ) = [ 0.5 X 0.1 ] / 0.4 = 0.125 (or 12.5% chance of rain).
Not too bad, let's have a picnic!
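All three examples apply the same formula; a small Python helper (the function name is ours) reproduces them:

```python
def bayes(likelihood, prior, evidence):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# Example-1: meningitis given a stiff neck
print(bayes(0.8, 1 / 30000, 0.02))  # ≈ 0.00133, i.e. about 1 in 750
# Example-2: dangerous fire given smoke
print(bayes(0.90, 0.01, 0.10))      # ≈ 0.09, i.e. 9%
# Example-3: rain during the day given a cloudy morning
print(bayes(0.5, 0.1, 0.4))         # ≈ 0.125, i.e. 12.5%
```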
Graphs
 Graphs: A graph is a finite set of vertices (nodes) together with a
finite set of edges
 E = (u, v): an edge is determined by its two vertices
 Edges may be directed or undirected
 A natural way to represent states, alternatives and measurable
paths in problem solving

Example: The Bridges of Königsberg Problem
The Challenge: Find a cycle connecting land masses A, B, C,
and D, crossing each of the seven bridges exactly once, and
returning to the starting point.

The bridges today: Euler solved the problem, proving that no such
cycle could exist (such a cycle requires every land mass to touch an
even number of bridges).
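Euler's observation can be checked mechanically: count the degree (number of bridge ends) at each land mass and test for even parity. A small Python sketch using the classical bridge layout:

```python
from collections import Counter

# The seven bridges of Königsberg, as edges between land masses A, B, C, D
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

degree = Counter()
for u, v in bridges:          # each bridge adds one to the degree of both ends
    degree[u] += 1
    degree[v] += 1

# In a connected graph, a cycle crossing every edge exactly once (an Euler
# cycle) exists iff every vertex has even degree.
print(dict(degree))           # degrees: A=5, B=3, C=3, D=3 -- every one odd
print(all(d % 2 == 0 for d in degree.values()))  # prints False: no such cycle
```

Since all four land masses have odd degree, no Euler cycle (or even an Euler path, which allows at most two odd-degree vertices) exists.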

Trees
 Search Trees
Used for problems that require analytical approaches such
as BFS and DFS
 Decision Trees
A special type of search tree used to find a solution to a
problem by choosing from alternatives, starting from the
root node

The Towers of Hanoi Problem (Graphical Representation)

Frames
A fundamental theme, based on a human’s ability to associate
seemingly unrelated facts into meaningful scenarios
Frames

Semantic Networks
 A system for capturing, storing and transferring information
(which represents knowledge) that may model how humans store
information and convert it to knowledge
 Robust, efficient, and flexible
 Useful to computer programmers and AI researchers as a form of
knowledge representation
 Do not represent many details about the real world that must be
accounted for

Concept Maps
A graphical form of knowledge representation in which important
information in the domain is embedded in nodes (the rectangular
boxes of the system) and arcs (the lines connecting nodes)
