AI

Knowledge-Based Representation
Expert Systems
Knowledge bases

 Knowledge base = set of sentences in a formal language
Stages of Knowledge Use
 Acquisition
– structure of facts
– integration of old & new knowledge
 Retrieval (recall)
– roles of linking and chunking
– means of improving recall efficiency
Representation
 Set of syntactic and semantic conventions
which make it possible to describe things
 Syntax
– the specific symbols allowed and the rules for arranging them
 Semantics
– how meaning is associated with symbol
arrangements allowed by syntax
Knowledge Representation
Schemas
Logic based representation – first order
predicate logic, Prolog
Procedural representation – rules, production
system
Network representation – semantic networks,
conceptual graphs
Structural representation – scripts, frames,
objects
Conceptual Graphs
 each concept has got its type and an instance
– general concept – a concept with a wildcard instance:
dog:*X → (colour) → brown
– specific concept – a concept with a concrete instance:
dog:Emma → (colour) → brown
 there exists a hierarchy of types (e.g. animal, with subtypes dog and cat)
 concept w is a specialisation of concept v if
type(v) > type(w) or instance(w)::type(v)
Types of Knowledge
 Objects
– both physical & concepts
 Events
– usually involve time
– maybe cause & effect relationships
 Performance
– how to do things
 META Knowledge
– knowledge about how to use knowledge
Propositional logic
Basic connectives and truth tables

 statements (propositions): declarative sentences that are either true or false – but not both.
E.g. Ahmed Hassan wrote Gone with the Wind.

2+3=5.

not statements:
What a beautiful morning!
Get up and do your exercises.
Fundamentals of Logic

"The number x is an integer."

is not a statement because its truth value cannot


be determined until a numerical value is
assigned for x.
Propositional logic
 Logical constants: true, false
 Propositional symbols: P, Q, S, ... (atomic
sentences)
 Sentences are combined by connectives:
∧ ...and [conjunction]
∨ ...or [disjunction]
→ ...implies [implication / conditional]
↔ ...is equivalent [biconditional]
¬ ...not [negation]
 Literal: atomic sentence or negated atomic
sentence
Truth Tables

p q p q p q p q pq pq

0 0 0 0 0 1 1
0 1 0 1 1 1 0

1 0 0 1 1 0 0

1 1 1 1 0 1 1
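The table can be checked mechanically. A minimal Python sketch (illustrative, not from the slides) that reproduces the five connective columns:

from itertools import product

print("p q  p∧q p∨q p⊕q p→q p↔q")
for p, q in product([0, 1], repeat=2):
    conj = p & q                 # conjunction
    disj = p | q                 # disjunction
    xor  = p ^ q                 # exclusive or
    impl = int((not p) or q)     # false only when p=1 and q=0
    iff  = int(p == q)           # biconditional
    print(f"{p} {q}    {conj}   {disj}   {xor}   {impl}   {iff}")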
Examples of PL sentences
 P means “It is hot.”
 Q means “It is humid.”
 R means “It is raining.”
 (P ∧ Q) → R
“If it is hot and humid, then it is raining”
 Q → P
“If it is humid, then it is hot”
 A better way:
Hot = “It is hot”
Humid = “It is humid”
Raining = “It is raining”
Example

s: Aya goes out for a walk.


t: The moon is out.
u: It is snowing.

( t ∧ ¬u ) → s : If the moon is out and it is not snowing, then Aya goes out for a walk.
( u ∧ ¬t ) → ¬s : If it is snowing and the moon is not out, then Aya will not go out for a walk.
Logical Equivalence

p q p p  q pq
0 0 1 1 1
0 1 1 1 1
1 0 0 0 0
1 1 0 1 1

s1  s2
Logical equivalence
 Two sentences are logically equivalent iff they are true in the same models: α ≡ β iff α ⊨ β and β ⊨ α


Tables of Logical Equivalences
• Identity laws
Like adding 0
• Domination laws
Like multiplying by 0
• Idempotent laws
Delete redundancies
• Double negation
“I don’t like you, not”
• Commutativity
Like “x+y = y+x”
• Associativity
Like “(x+y)+z = x+(y+z)”
• Distributivity
Like “(x+y)z = xz+yz”
• De Morgan
Tables of Logical Equivalences

• Excluded middle
• Negating creates
opposite
• Definition of
implication in terms
of Not and Or
Fundamentals of Logic
A compound statement is called a tautology(T0) if it is
true for all truth value assignments for its component
statements.
If a compound statement is false for all such
assignments, then it is called a contradiction(F0).

p  ( p  q ) : tautology
p  ( p  q ) : contradiction
Propositional Logic – two more definitions

A tautology is a proposition that’s always TRUE.

A contradiction is a proposition that’s always FALSE.

p p p  p p  p
T F T F
F T T F
Tautology example

Demonstrate that
[¬p (p q )]q
is a tautology in two ways:
1. Using a truth table – show that
[¬p (p q )]q is always true
2. Using a proof (will get to this later).

Tautology by truth table
p | q | ¬p | p∨q | ¬p∧(p∨q) | [¬p∧(p∨q)]→q
T | T | F | T | F | T
T | F | F | T | F | T
F | T | T | T | T | T
F | F | T | F | F | T
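The same check can be automated for any formula. A small Python sketch, assuming nothing beyond the standard library, that tests the formula under every truth assignment:

from itertools import product

def implies(a, b):
    return (not a) or b

def is_tautology(f, n):
    """True iff f evaluates to True under all 2**n truth assignments."""
    return all(f(*vals) for vals in product([False, True], repeat=n))

f = lambda p, q: implies((not p) and (p or q), q)
print(is_tautology(f, 2))   # True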
Derivational Proof Techniques

E.g., consider the compound proposition
(p ∨ ¬p) ∨ ((s → r) ∧ t) ∨ (q ∧ r)

Q: Why is this a tautology?


Derivational Proof Techniques

A: Part of it, (p ∨ ¬p), is a tautology, and the disjunction of True with any other compound proposition is still True:
(p ∨ ¬p) ∨ ((s → r) ∧ t) ∨ (q ∧ r)
⇔ T ∨ ((s → r) ∧ t) ∨ (q ∧ r)
⇔ T
Derivational techniques formalize the intuition of this example.
Tautology by proof
[¬p ∧ (p ∨ q)] → q
⇔ [(¬p ∧ p) ∨ (¬p ∧ q)] → q    Distributive
⇔ [F ∨ (¬p ∧ q)] → q           ULE
⇔ [¬p ∧ q] → q                 Identity
⇔ ¬[¬p ∧ q] ∨ q                ULE
⇔ [¬(¬p) ∨ ¬q] ∨ q             De Morgan
⇔ [p ∨ ¬q] ∨ q                 Double Negation
⇔ p ∨ [¬q ∨ q]                 Associative
⇔ p ∨ [q ∨ ¬q]                 Commutative
⇔ p ∨ T                        ULE
⇔ T                            Domination
Examples

1. “It is not the case that I study well and fail” is logically equivalent to “If I study well, then I don’t fail”: ¬(p ∧ q) ≡ p → ¬q
2. Write a C program that represents the
compound proposition (pq)r
Use truth table to find
 -PQPR
 P  -Q  R  P  R  - Q
 A  B  -C  D  E  F
Limitations of propositional logic
 So far we studied propositional logic
 Some English statements are hard to model in
propositional logic:
 “If your roommate is wet because of rain, your
roommate must not be carrying any umbrella”
 Pathetic attempt at modeling this:
 RoommateWetBecauseOfRain =>
(NOT(RoommateCarryingUmbrella0) AND
NOT(RoommateCarryingUmbrella1) AND
NOT(RoommateCarryingUmbrella2) AND …)
Problems with propositional logic
 No notion of objects
 No notion of relations among objects
 RoommateCarryingUmbrella0 is instructive
to us, suggesting
– there is an object we call Roommate,
– there is an object we call Umbrella0,
– there is a relationship Carrying between these two
objects
 Formally, none of this meaning is there
– Might as well have replaced
RoommateCarryingUmbrella0 by P
First-Order Logic

Syntax
Constants
 Constants refer to objects, functions and relationships.
Ahmed, Mona, loves, happy,
 Simple sentences express relationships among objects.
loves(Ahmed, Mona)
They are called atoms.
 Compound sentences capture relationships among relations.
loves(x,y) loves(y,x)
loves(x,y) loves(y,x) happy(x)

 Relations can be unary as well.


tall(Tomy)
Elements of first-order logic
 Objects: can give these names such as
Umbrella0, Person0, John, Earth, …
 Relations: Carrying(., .), IsAnUmbrella(.)
– Carrying(Person0, Umbrella0),
IsUmbrella(Umbrella0)
– Relations with one object = unary relations
= properties
 Functions: Roommate(.)
– Roommate(Person0)
 Equality: Roommate(Person0) = Person1
Example with Functions
 How about saying that Ahmed has a big nose?
Ahmed is an object and nose_of (Ahmed) is a function that constructs an object from the argument object. Then, we can write:
big(nose_of (Ahmed))
 E.g. Mona loves her dog:
loves(Mona, dog_of (Mona))
Note: We are allowed to relate sentences only. So, we can say:
loves(Mona, dog_of (Mona)) ∨ loves(Mona, cat_of (Mona))
But not:
loves(Mona, dog_of (Mona) ∨ cat_of (Mona))
First-Order Logic: ,
 The language that we have described so far, consisting of atoms and the
connectives (,,,,,) is typically called predicate logic.
 To extend it to first-order logic, we need to add quantifiers.
 The purpose of quantifiers is to allow us to say things about sets of
objects.
 To say that Heba loves everything we write:
x. loves (Heba, x)
We can think of  as a big conjunction. For example, if there are only three
objects Heba, dog, and cat, what the above asserts is:
loves (Heba, dog)  loves (Heba, cat)  loves (Heba, Heba)

 To say that Hassan loves something we write:


x. loves (Hassan, x)
We can think of  as a big disjunction. For example, if there are only three
objects as above, then what we are asserting is:
loves (Hassan, dog)  loves (Hassan, cat)  loves (Hassan, Hassan)
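This finite-domain reading can be executed directly. A hedged Python sketch (the domain and the loves relation below are illustrative):

domain = ["Heba", "dog", "cat"]
loves = {("Heba", "dog"), ("Heba", "cat"), ("Heba", "Heba"),
         ("Hassan", "cat")}

# "for all x. loves(Heba, x)" is the big conjunction over the domain:
print(all(("Heba", x) in loves for x in domain))    # True

# "there exists x. loves(Hassan, x)" is the big disjunction:
print(any(("Hassan", x) in loves for x in domain))  # True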
First Order Predicate Logic
– enriched by variables, predicates, functions
– quantifiers ∀, ∃
friends(father(david), father(andrew))
∃Y friends(Y, petr)
∀X likes(X, ice_cream)
∀X ∀Y ∀Z parent(X,Y) ∧ parent(X,Z) → siblings(Y,Z)
Reasoning about many objects at once
 Variables: x, y, z, … can refer to multiple
objects
 New operators “for all” and “there exists”
– Universal quantifier and existential quantifier
 for all x: CompletelyWhite(x) =>
NOT(PartiallyBlack(x))
– Completely white objects are never partially black
 there exists x: PartiallyWhite(x) AND
PartiallyBlack(x)
– There exists some object in the world that is partially
white and partially black
Practice converting English to first-order logic

 “John has an umbrella”


 there exists y: (Has(John, y) AND
IsUmbrella(y))
 “Anything that has an umbrella is not wet”
 for all x: ((there exists y: (Has(x, y) AND
IsUmbrella(y))) => NOT(IsWet(x)))
 “Any person who has an umbrella is not wet”
 for all x: (IsPerson(x) => ((there exists y:
(Has(x, y) AND IsUmbrella(y))) =>
NOT(IsWet(x))))
More practice converting English
to first-order logic
 “John has at least two umbrellas”
 there exists x: (there exists y: (Has(John, x)
AND IsUmbrella(x) AND Has(John, y) AND
IsUmbrella(y) AND NOT(x=y)))
 “John has at most two umbrellas”
 for all x, y, z: ((Has(John, x) AND
IsUmbrella(x) AND Has(John, y) AND
IsUmbrella(y) AND Has(John, z) AND
IsUmbrella(z)) => (x=y OR x=z OR y=z))
Even more practice converting
English to first-order logic…
 “Duke’s basketball team defeats any
other basketball team”
 for all x: ((IsBasketballTeam(x) AND
NOT(x=BasketballTeamOf(Duke))) =>
Defeats(BasketballTeamOf(Duke), x))
 “Every team defeats some other team”
 for all x: (IsTeam(x) => (there exists y:
(IsTeam(y) AND NOT(x=y) AND
Defeats(x,y))))
Reverse translation
• Translate the following into English.

 ∀x hesitates(x) → lost(x)
• He who hesitates is lost.
 ¬∃x business(x) ∧ like(x, Showbusiness)
• There is no business like show business.
 ¬∀x glitters(x) → gold(x)
• Not everything that glitters is gold.
 ∃x ∀t person(x) ∧ time(t) → canfool(x,t)
• You can fool some of the people all the time.
Translating English to FOL
Every gardener likes the sun.
∀x gardener(x) → likes(x, Sun)
You can fool some of the people all of the time.
∃x ∀t person(x) ∧ time(t) → can-fool(x,t)
You can fool all of the people some of the time.
∀x ∃t (person(x) → time(t) ∧ can-fool(x,t))
∀x (person(x) → ∃t (time(t) ∧ can-fool(x,t))) Equivalent
All purple mushrooms are poisonous.
∀x (mushroom(x) ∧ purple(x)) → poisonous(x)
No purple mushroom is poisonous.
¬∃x purple(x) ∧ mushroom(x) ∧ poisonous(x)
∀x (mushroom(x) ∧ purple(x)) → ¬poisonous(x)
Equivalent
There are exactly two purple mushrooms.
∃x ∃y mushroom(x) ∧ purple(x) ∧ mushroom(y) ∧ purple(y) ∧ ¬(x=y) ∧ ∀z (mushroom(z) ∧ purple(z)) → ((x=z) ∨ (y=z))
Clinton is not tall.
¬tall(Clinton)
X is above Y iff X is directly on top of Y or there is a pile of one or more other objects directly on top of one another starting with X and ending with Y.
∀x ∀y above(x,y) ↔ (on(x,y) ∨ ∃z (on(x,z) ∧ above(z,y)))
Resolution for first-order logic
 for all x: (NOT(Knows(John, x)) OR IsMean(x) OR Loves(John, x))
– John loves everything he knows, with the possible exception of
mean things
 for all y: (Loves(Jane, y) OR Knows(y, Jane))
– Jane loves everything that does not know her
 What can we unify? What can we conclude?
 Use the substitution: {x/Jane, y/John}
 Get: IsMean(Jane) OR Loves(John, Jane) OR Loves(Jane, John)
 Complete (i.e., if not satisfiable, will find a proof of this), if we can
remove literals that are duplicates after unification
– Also need to put everything in canonical form first
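A minimal sketch of the unification step used above, assuming atoms are tuples whose lower-case arguments are variables (an illustrative convention; the occurs check is omitted):

def is_var(t):
    return isinstance(t, str) and t[0].islower()

def unify(a, b, subst=None):
    """Return a substitution unifying atoms a and b, or None."""
    if subst is None:
        subst = {}
    if a == b:
        return subst
    if is_var(a):
        return unify_var(a, b, subst)
    if is_var(b):
        return unify_var(b, a, subst)
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

def unify_var(v, t, subst):
    if v in subst:
        return unify(subst[v], t, subst)
    return {**subst, v: t}

# Unifying Knows(John, x) with Knows(y, Jane) yields {y: 'John', x: 'Jane'}
print(unify(("Knows", "John", "x"), ("Knows", "y", "Jane")))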
Properties of quantifiers
 ∀x ∀y is the same as ∀y ∀x
 ∃x ∃y is the same as ∃y ∃x
 ∃x ∀y is not the same as ∀y ∃x
– ∃x ∀y Loves(x,y): “There is a person who loves everyone in the world”
– ∀y ∃x Loves(x,y): “Everyone in the world is loved by at least one person”
 Quantifier duality: each can be expressed using the other
 ∀x Likes(x, IceCream) ≡ ¬∃x ¬Likes(x, IceCream)
 ∃x Likes(x, Broccoli) ≡ ¬∀x ¬Likes(x, Broccoli)


Using FOL
 Brothers are siblings
∀x,y Brother(x,y) → Sibling(x,y)
 One's mother is one's female parent
∀m,c Mother(c) = m ↔ (Female(m) ∧ Parent(m,c))
 “Sibling” is symmetric
∀x,y Sibling(x,y) ↔ Sibling(y,x)
 A first cousin is a child of a parent’s sibling
∀x,y FirstCousin(x,y) ↔ ∃p,ps Parent(p,x) ∧ Sibling(ps,p) ∧ Parent(ps,y)
An example
1. Sameh is a lawyer.
2. Lawyers are rich.
3. Rich people have big houses.
4. Big houses are a lot of work.
 We would like to conclude that Sameh’s house is
a lot of work.
 Natural languages are ambiguous so we can
have different axiomatizations.
Axiomatization 1
1. lawyer(Sameh)
2. ∀x lawyer(x) → rich(x)
3. ∀x rich(x) → ∃y house(x,y)
4. ∀x,y rich(x) ∧ house(x,y) → big(y)
5. ∀x,y ( house(x,y) ∧ big(y) → work(y) )
 3 and 4 say that rich people do have at least one house and all their houses are big.
 Conclusion we want to show:
house(Sameh, S_house) ∧ work(S_house)
 Or, do we want to conclude that Sameh has at least one house that needs a lot of work? I.e.
∃y house(Sameh,y) ∧ work(y)
Amir and the cat
 Everyone who loves all animals is loved by
someone.
 Anyone who kills an animal is loved by no
one.
 Mohamed loves all animals.
 Either Mohamed or Amir killed the cat, who
is named SoSo.
 Did Amir kill the cat?
Converting sentences to CNF
1. Eliminate all ↔ connectives
(P ↔ Q) ⇒ ((P → Q) ∧ (Q → P))
2. Eliminate all → connectives
(P → Q) ⇒ (¬P ∨ Q)
3. Reduce the scope of each negation symbol to a single predicate
¬¬P ⇒ P
¬(P ∧ Q) ⇒ ¬P ∨ ¬Q
¬(P ∨ Q) ⇒ ¬P ∧ ¬Q
¬(∀x)P ⇒ (∃x)¬P
¬(∃x)P ⇒ (∀x)¬P
4. Standardize variables: rename all variables so that each quantifier has its own unique variable name
Converting sentences
5. Eliminate existential quantification by introducing Skolem constants/functions
(∃x)P(x) ⇒ P(c)
c is a Skolem constant (a brand-new constant symbol that is not used in any other sentence)
(∀x)(∃y)P(x,y) ⇒ (∀x)P(x, f(x))
since ∃ is within the scope of a universally quantified variable, use a Skolem function f to construct a new value that depends on the universally quantified variable
f must be a brand-new function name not occurring in any other sentence in the KB.
E.g., (∀x)(∃y)loves(x,y) ⇒ (∀x)loves(x, f(x))
In this case, f(x) specifies the person that x loves
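For the purely propositional steps (1–3), a CNF converter already exists in SymPy; Skolemization (step 5) is first-order and is not covered by it. A short sketch:

from sympy import symbols
from sympy.logic.boolalg import Equivalent, Implies, to_cnf

P, Q, R = symbols('P Q R')

print(to_cnf(Equivalent(P, Q)))   # (P | ~Q) & (Q | ~P)   step 1
print(to_cnf(Implies(P, Q)))      # Q | ~P                step 2
print(to_cnf(~(P & Q)))           # ~P | ~Q               step 3 (De Morgan)
print(to_cnf((P & Q) | R))        # (P | R) & (Q | R)     distribution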
Types of Inference
 Model Checking

 Forward chaining with modus ponens

 Backward chaining with modus ponens

Model Checking
 Enumerate all possible worlds

 Restrict to possible worlds in which the


KB is true

 Check whether the goal is true in those


worlds or not

Inference as Search
 State: current set of sentences
 Operator: sound inference rules to derive new
entailed sentences from a set of sentences

 Can be goal directed if there is a particular


goal sentence we have in mind
 Can also try to enumerate every entailed
sentence

Generalized Modus Ponens
Modus Ponens – a special case of Resolution

p → q    Sunday → Dr Yasser is teaching AI
p        Sunday
∴ q      ∴ Dr Yasser is teaching AI

Using the tricks:
p → q ⇔ ¬p ∨ q
p resolves with ¬p ∨ q to give q,
i.e. ∴ q
Sound rules of inference
 Each can be shown to be sound using a
truth table
RULE | PREMISE | CONCLUSION
Modus Ponens | A, A → B | B
And Introduction | A, B | A ∧ B
And Elimination | A ∧ B | A
Double Negation | ¬¬A | A
Unit Resolution | A ∨ B, ¬B | A
Resolution | A ∨ B, ¬B ∨ C | A ∨ C
An example
(∀x)(P(x) → ((∀y)(P(y) → P(f(x,y))) ∧ ¬(∀y)(Q(x,y) → P(y))))
2. Eliminate →
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ ¬(∀y)(¬Q(x,y) ∨ P(y))))
3. Reduce scope of negation
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃y)(Q(x,y) ∧ ¬P(y))))
4. Standardize variables
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃z)(Q(x,z) ∧ ¬P(z))))
5. Eliminate existential quantification
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))
6. Drop universal quantification symbols
(¬P(x) ∨ ((¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))
Two broad kinds of rule system
 forward chaining systems, and backward
chaining systems.
 In a forward chaining system you start with
the initial facts, and keep using the rules to
draw new conclusions (or take certain
actions) given those facts
 In a backward chaining system you start with
some hypothesis (or goal) you are trying to
prove, and keep looking for rules that would
allow you to conclude that hypothesis,
perhaps setting new subgoals to prove as
you go.
Forward chaining
 Proofs start with the given
axioms/premises in KB, deriving
new sentences until the goal/query
sentence is derived
 This defines a forward-chaining
inference procedure because it
moves “forward” from the KB to the
goal [eventually]
Forward chaining

 Idea: fire any rule whose premises are satisfied in the


KB,
– add its conclusion to the KB, until query is found
Backward chaining
 Proofs start with the goal query, find rules
with that conclusion, and then prove each
of the antecedents in the implication
 Keep going until you reach premises
 Avoid loops: check if new sub-goal is
already on the goal stack
 Avoid repeated work: check if new sub-
goal
– Has already been proved true
– Has already failed
Backward Chaining
1. Tortoise(x) ∧ Slug(y) → Faster(x, y)
2. Slimy(z) ∧ Creeps(z) → Slug(z)
3. Tortoise(Tom)
4. Slimy(Steve)
5. Creeps(Steve)

Is Tom faster than someone?


Forward chaining example
 KB:
– allergies(X)  sneeze(X)
– cat(Y)  allergic-to-cats(X)  allergies(X)
– cat(Felix)
– allergic-to-cats(Lise)
 Goal:
– sneeze(Lise)
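A small forward-chaining sketch for this KB, assuming facts are tuples and variables are strings starting with '?' (an illustrative convention, not the slides' notation):

facts = {("cat", "Felix"), ("allergic-to-cats", "Lise")}
rules = [
    ([("allergies", "?x")], ("sneeze", "?x")),
    ([("cat", "?y"), ("allergic-to-cats", "?x")], ("allergies", "?x")),
]

def substitute(atom, binding):
    return tuple(binding.get(t, t) for t in atom)

def match(premises, facts, binding):
    """Yield every binding that satisfies all premises against the facts."""
    if not premises:
        yield binding
        return
    first, rest = premises[0], premises[1:]
    for fact in facts:
        if len(fact) != len(first):
            continue
        b, ok = dict(binding), True
        for pat, val in zip(first, fact):
            if pat.startswith("?"):
                if b.setdefault(pat, val) != val:
                    ok = False
            elif pat != val:
                ok = False
        if ok:
            yield from match(rest, facts, b)

while True:                       # fire rules until nothing new is added
    new_facts = set()
    for premises, conclusion in rules:
        for b in match(premises, list(facts), {}):
            derived = substitute(conclusion, b)
            if derived not in facts:
                new_facts.add(derived)
    if not new_facts:
        break
    facts |= new_facts

print(("sneeze", "Lise") in facts)   # True: the goal has been derived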
Exercise
 You go to the doctor and for insurance
reasons they perform a test for a
horrible disease
 You test positive
 The doctor says the test is 99%
accurate
 Do you worry?
Reduction to propositional inference
Suppose the KB contains just the following:
∀x King(x) ∧ Greedy(x) → Evil(x)
King(Ali)
Greedy(Ali)
Brother(Saad, Ali)

Instantiating the universal sentence in all possible ways, we have:
King(Ali) ∧ Greedy(Ali) → Evil(Ali)
King(Saad) ∧ Greedy(Saad) → Evil(Saad)
King(Ali)
Greedy(Ali)
Brother(Saad, Ali)

 The new KB is propositionalized: proposition symbols are
King(Ali), Greedy(Ali), Evil(Ali), King(Saad), etc.


Hassan and the cat
 Every animal owner is an animal lover
 Everyone who loves all animals is loved by
someone.
 Anyone who kills an animal is loved by no one.
 Mustafa owns a dog.
 Either Mustafa or Hassan killed the cat, who is
named SoSo.
 Did Hassan kill the cat?
Practice example
Did Hassan kill the cat

 Mustafa owns a dog. Every dog owner is an animal lover. No


animal lover kills an animal. Either Hassan or Mustafa killed the
cat, who is named SoSo . Did Hassan kill the cat?
 These can be represented as follows:
A. (∃x) Dog(x) ∧ Owns(Mustafa, x)
B. (∀x) ((∃y) Dog(y) ∧ Owns(x, y)) → AnimalLover(x)
C. (∀x) AnimalLover(x) → ((∀y) Animal(y) → ¬Kills(x,y))
D. Kills(Mustafa, SoSo) ∨ Kills(Hassan, SoSo)
E. Cat(SoSo)
F. (∀x) Cat(x) → Animal(x)
G. Kills(Hassan, SoSo)    GOAL
 Convert to clause form
A1. (Dog(D))
A2. (Owns(Mustafa, D))
B. (¬Dog(y), ¬Owns(x, y), AnimalLover(x))
C. (¬AnimalLover(a), ¬Animal(b), ¬Kills(a,b))
D. (Kills(Mustafa, SoSo), Kills(Hassan, SoSo))
E. Cat(SoSo)
F. (¬Cat(z), Animal(z))
 Add the negation of the query:
¬G: (¬Kills(Hassan, SoSo))
The resolution refutation proof
R1: ¬G, D, {}  (Kills(Mustafa, SoSo))
R2: R1, C, {a/Mustafa, b/SoSo}  (¬AnimalLover(Mustafa), ¬Animal(SoSo))
R3: R2, B, {x/Mustafa}  (¬Dog(y), ¬Owns(Mustafa, y), ¬Animal(SoSo))
R4: R3, A1, {y/D}  (¬Owns(Mustafa, D), ¬Animal(SoSo))
R5: R4, A2, {}  (¬Animal(SoSo))
R6: R5, F, {z/SoSo}  (¬Cat(SoSo))
R7: R6, E, {}  FALSE
 The proof tree (K = Kills, AL = AnimalLover, A = Animal, D = Dog, O = Owns, C = Cat; M = Mustafa, S = SoSo)
¬G   D
{}
R1: K(M,S)   C
{a/M, b/S}
R2: ¬AL(M) ∨ ¬A(S)   B
{x/M}
R3: ¬D(y) ∨ ¬O(M,y) ∨ ¬A(S)   A1
{y/D}
R4: ¬O(M,D) ∨ ¬A(S)   A2
{}
R5: ¬A(S)   F
{z/S}
R6: ¬C(S)   E
{}
R7: FALSE
Umbrellas in first-order logic
 You know the following things:
– You have exactly one other person living in your house, who is wet
– If a person is wet, it is because of the rain, the sprinklers, or both
– If a person is wet because of the sprinklers, the sprinklers must be
on
– If a person is wet because of rain, that person must not be
carrying any umbrella
– There is an umbrella that “lives in” your house, which is not in its
house
– An umbrella that is not in its house must be carried by some
person who lives in that house
– You are not carrying any umbrella
 Can you conclude that the sprinklers are on?
Example knowledge base
 The law says that it is a crime for an
American to sell weapons to hostile
nations. The country Nono, an enemy of
America, has some missiles, and all of its
missiles were sold to it by Colonel West,
who is American.

 Prove that Col. West is a criminal


Example knowledge base
... it is a crime for an American to sell weapons to hostile nations:
American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) → Criminal(x)
Nono … has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):
Owns(Nono, M1) and Missile(M1)
… all of its missiles were sold to it by Colonel West:
Missile(x) ∧ Owns(Nono,x) → Sells(West, x, Nono)
Missiles are weapons:
Missile(x) → Weapon(x)
An enemy of America counts as "hostile":
Enemy(x, America) → Hostile(x)
West, who is American …
American(West)
The country Nono, an enemy of America …
Enemy(Nono, America)
Resolution proof: definite clauses
(figure: the resolution proof tree deriving Criminal(West) from the clauses above)
Rule-Based Systems
 Also known as “production systems” or
“expert systems”
 Rule-based systems are one of the most
successful AI paradigms
 Used for synthesis (construction) type
systems
 Also used for analysis (diagnostic or
classification) type systems
Rule-Based Systems
 Instead of representing knowledge in a relatively declarative, static way (as a set of things that are true), a rule-based system represents knowledge in terms of rules that tell you what you should do, or what you could conclude, in different situations.
 A rule-based system consists of a set of IF-THEN rules, a set of facts, and some interpreter controlling the application of the rules, given the facts.
1. IF (lecturing X)
AND (marking-practicals X)
THEN ADD (overworked X)
2. IF (month february)
THEN ADD (lecturing ali)

3. IF (month february)
THEN ADD (marking-practicals ali)
4. IF (overworked X)
OR (slept-badly X)
THEN ADD (bad-mood X)
5. IF (bad-mood X)
THEN DELETE (happy X)
6. IF (lecturing X)
THEN DELETE (researching X)
Rule Based Reasoning
 The advantages of rule-based approach:
– The ability to use
– Good performance
– Good explanation
 The disadvantages are
– Cannot handle missing information
– Knowledge tends to be very task
dependent
Other Reasoning

 There exist some other approaches as:


– Case-Based Reasoning
– Model-Based Reasoning
– Hybrid Reasoning
• Rule-based + case-based
• Rule-based + model-based
• Model-based + case-based
Uses of Knowledge
 Knowledge consists of facts, concepts, theories, heuristic
methods, procedures, and relationships
 Knowledge is also information organized and analyzed for
understanding and applicable to problem solving or
decision making
 Knowledge base - the collection of knowledge related to a
problem (or opportunity) used in an AI system
 Typically limited in some specific, usually narrow, subject
area or domain
 The narrow domain of knowledge, together with the qualitative aspects of decision making that an AI system must capture, is critical for AI application success

Decision Support Systems and Intelligent Systems, Efraim Turban and Jay E. Aronson
Copyright 1998, Prentice Hall, Upper Saddle River, NJ
Knowledge Bases
 Search the Knowledge Base for Relevant
Facts and Relationships
 Reach One or More Alternative Solutions
to a Problem
 Augments the User (Typically a Novice)

(ES) Introduction
 Expert System vs. knowledge-based system
 An Expert System is a system that employs
human knowledge captured in a computer to
solve problems that ordinarily require human
expertise
 ES imitate the expert’s reasoning processes to
solve specific problems

History of
Expert Systems
1. Early to Mid-1960s
– One attempt: the General-purpose Problem Solver
(GPS)
 General-purpose Problem Solver (GPS)
 A procedure developed by Newell and Simon
[1973] from their Logic Theory Machine -
– Attempted to create an "intelligent" computer
• general problem-solving methods applicable across domains
– Predecessor to ES
– Not successful, but a good start

2. Mid-1960s: Special-purpose ES programs
– DENDRAL
– MYCIN
 Researchers recognized that the problem-solving
mechanism is only a small part of a complete, intelligent
computer system
– General problem solvers cannot be used to build high
performance ES
– Human problem solvers are good only if they operate in a very
narrow domain
– Expert systems must be constantly updated with new information
– The complexity of problems requires a considerable amount of
knowledge about the problem area

3. Mid 1970s
– Several Real Expert Systems Emerge
– Recognition of the Central Role of Knowledge
– AI Scientists Develop
• Comprehensive knowledge representation theories
• General-purpose, decision-making procedures and inferences
 Limited Success Because
– Knowledge is Too Broad and Diverse
– Efforts to Solve Fairly General Knowledge-Based
Problems were Premature

BUT
 Several knowledge representations worked

Key Insight
 The power of an ES is derived from the specific
knowledge it possesses, not from the particular
formalisms and inference schemes it employs

4. Early 1980s
 ES Technology Starts to go Commercial
– XCON
– XSEL
– CATS-1
 Programming Tools and Shells Appear
– EMYCIN
– EXPERT
– META-DENDRAL
– EURISKO
 About 1/3 of These Systems Are Very Successful and
Are Still in Use

Latest ES Developments
 Many tools to expedite the construction of ES
at a reduced cost
 Dissemination of ES in thousands of
organizations
 Extensive integration of ES with other CBIS
 Increased use of expert systems in many tasks
 Use of ES technology to expedite IS
construction (ES Shell)

 The object-oriented programming approach in
knowledge representation
 Complex systems with multiple knowledge sources,
multiple lines of reasoning, and fuzzy information
 Use of multiple knowledge bases
 Improvements in knowledge acquisition
 Larger storage and faster processing computers
 The Internet to disseminate software and expertise.

Expert Systems
 Attempt to Imitate Expert Reasoning
Processes and Knowledge in Solving
Specific Problems
 Most Popular Applied AI Technology
– Enhance Productivity
– Augment Work Forces
 Narrow Problem-Solving Areas or Tasks
Decision Support Systems and Intelligent Systems, Efraim Turban and Jay E. Aronson
6th ed, Copyright 2001, Prentice Hall, Upper Saddle River, NJ
Expert Systems
 Provide Direct Application of Expertise

 Expert Systems Do Not Replace Experts,


But They
– Make their Knowledge and Experience More
Widely Available
– Permit Nonexperts to Work Better

Expertise
 The extensive, task-specific knowledge acquired from
training, reading and experience
– Theories about the problem area
– Hard-and-fast rules and procedures
– Rules (heuristics)
– Global strategies
– Meta-knowledge (knowledge about knowledge)
– Facts
 Enables experts to be better and faster than nonexperts

Human Expert Behaviors
 Recognize and formulate the problem
 Solve problems quickly and properly
 Explain the solution
 Learn from experience
 Restructure knowledge
 Break rules
 Determine relevance
 Degrade gracefully
Transferring Expertise
 Objective of an expert system
– To transfer expertise from an expert to a computer
system and
– Then on to other humans (nonexperts)
 Activities
– Knowledge acquisition
– Knowledge representation
– Knowledge inferencing
– Knowledge transfer to the user
 Knowledge is stored in a knowledge base
Inferencing
 Reasoning (Thinking)
 The computer is programmed so that it
can make inferences
 Performed by the Inference Engine

Rules
 IF-THEN-ELSE

 Explanation Capability
– By the justifier, or explanation
subsystem
 ES versus Conventional Systems

Knowledge as Rules
 MYCIN rule example:
IF the infection is meningitis
AND patient has evidence of serious skin or soft tissue
infection
AND organisms were not seen on the stain of the culture
AND type of infection is bacterial
THEN there is evidence that the organism (other than those seen on cultures or smears) causing the infection is Staphylococcus coag-pos (coagulase-positive).

Three Major ES Components

User Interface

Inference
Engine

Knowledge
Base

104
ES Shell
Basic ES Structure (figure): the user interface connects to an inference engine, which works over working memory and the knowledge base; an explanation facility and a knowledge acquisition subsystem (fed by databases, spreadsheets, etc.) complete the shell.
All ES Components
 Knowledge Acquisition Subsystem
 Knowledge Base
 Inference Engine
 User Interface
 Blackboard (Workplace)
 Explanation Subsystem (Justifier)
 Knowledge Refining System
 User

 Most ES do not have a Knowledge Refinement Component


(See Figure 10.3)

And now with confidences
 Facts:
– F1: Ungee gives milk: .9
– F2: Ungee eats meat: .8
– F3: Ungee has hoofs: .7
 Rules:
– R1: If X gives milk, then it is a mammal: .6
– R2: If X is a mammal and eats meat, then
carnivore: .5
– R3: If X has hoofs, then X is carnivore: .4
 R1 with F1: Ungee is a mammal. (F4)
 Confidence of F4: C(F4) = .9 × .6 = .54
 R2 using F2 and F4 yields: Ungee is a carnivore (F5).
 C(F5) from R2 = min(.54, .8) × .5 = .27
 R3 using F3 also concludes F5.
 C(F5) from R3 = .7 × .4 = .28
 C(F5) from R2 and R3 combined = .27 ⊕ .28 =
1 − (1 − .28)(1 − .27) ≈ .47
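The same bookkeeping in a few lines of Python (MYCIN-style combination of positive certainty factors; the function names are illustrative):

def rule_cf(premise_cfs, rule_strength):
    """CF of a conclusion: min over the premise CFs, scaled by the rule CF."""
    return min(premise_cfs) * rule_strength

def combine(cf1, cf2):
    """Combine two positive CFs for the same conclusion."""
    return 1 - (1 - cf1) * (1 - cf2)

f4    = rule_cf([0.9], 0.6)        # Ungee is a mammal: 0.54
f5_r2 = rule_cf([f4, 0.8], 0.5)    # carnivore via R2: min(0.54, 0.8)*0.5 = 0.27
f5_r3 = rule_cf([0.7], 0.4)        # carnivore via R3: 0.28
print(round(combine(f5_r2, f5_r3), 2))   # 0.47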
Knowledge Base
 The knowledge base contains the knowledge necessary for
understanding, formulating, and solving problems

 Two Basic Knowledge Base Elements


– Facts
– Special heuristics, or rules that direct the use of
knowledge

– Knowledge is the primary raw material of ES


– Incorporated knowledge representation
Inference Engine
 The brain of the ES
 The control structure (rule interpreter)
 Provides methodology for reasoning

(figure) The production system after Rule 1 has fired.
(figure) The system after Rule 4 has fired. Note the stack-based approach to goal reduction.
(figure) The and/or graph searched in the car diagnosis example, with the conclusion of Rule 4 matching the first premise of Rule 1.
Explanation and Transparency in
Goal-Driven Reasoning
 The following dialogue begins with the
computer asking the user about the goals
present in the working memory:
– Gas in fuel tank?
YES
– Gas in carburetor?
YES
– Engine will turn over?
WHY
Model-Based Expert System Example

 The expected output values are given in ( ) and the actual outputs in [ ]
The Human Element in
Expert Systems
 Builder and User
 Expert and Knowledge engineer.

 The Expert
– Has the special knowledge, judgment, experience
and methods to give advice and solve problems
– Provides knowledge about task performance

The Knowledge Engineer
 Helps the expert(s) structure the problem area
by interpreting and integrating human answers
to questions, drawing analogies, posing
counterexamples, and bringing to light
conceptual difficulties

 Usually also the System Builder

The User
 Possible Classes of Users
– A non-expert client seeking direct advice - the ES
acts as a Consultant or Advisor
– A student who wants to learn - an Instructor
– An ES builder improving or increasing the
knowledge base - a Partner
– An expert - a Colleague or Assistant
 The Expert and the Knowledge Engineer
Should Anticipate Users' Needs and
Limitations When Designing ES

Problem Areas Addressed by
Expert Systems
 Interpretation systems
 Prediction systems
 Diagnostic systems
 Design systems
 Planning systems
 Monitoring systems
 Debugging systems
 Repair systems
 Instruction systems
 Control systems

Expert Systems Benefits
 Improved Decision Quality
 Increased Output and Productivity
 Decreased Decision Making Time
 Increased Process(es) and Product Quality
 Capture Scarce Expertise
 Can Work with Incomplete or Uncertain Information
 Enhancement of Problem Solving and Decision Making
 Improved Decision Making Processes
 Knowledge Transfer to Remote Locations
 Enhancement of Other MIS

Problems and Limitations of
Expert Systems
 Knowledge is not always readily available
 Expertise can be hard to extract from humans
 Expert system users have natural cognitive limits
 ES work well only in a narrow domain of knowledge
 Knowledge engineers are rare and expensive
 Lack of trust by end-users
 ES may not be able to arrive at valid conclusions
 ES sometimes produce incorrect recommendations

Expert System
Success Factors
 Most Critical Factors
– Champion in Management
– User Involvement and Training
 Plus
– The level of knowledge must be sufficiently high
– There must be (at least) one cooperative expert
– The problem must be qualitative (fuzzy), not quantitative
– The problem must be sufficiently narrow in scope
– The ES shell must be high quality, and naturally store and
manipulate the knowledge
– A friendly user interface
– Important and difficult enough problem

AI

Fuzzy Systems
History, State of the Art, and
Future Development

Today, Fuzzy Logic Has Already Become the Standard Technique for Multi-Variable Control!
Types of Uncertainty and the
Modeling of Uncertainty

Most Words and Evaluations We Use in Our Daily Reasoning Are Not Clearly Defined in a Mathematical Manner. This Allows Humans to Reason on an Abstract Level!
Probability and Uncertainty

“... a person suffering from hepatitis shows in 60% of all cases a strong fever, in 45% of all cases yellowish colored skin, and in 30% of all cases suffers from nausea ...”

Stochastics and Fuzzy Logic Complement Each Other!
Fuzzy Set Theory
(figure) Conventional (Boolean) set theory: “Strong Fever” has a sharp boundary, so a temperature such as 38.7°C is either in the set or out of it.
(figure) Fuzzy set theory: temperatures such as 37.2°C, 38°C, 38.7°C, 39.3°C, 40.1°C, 41.4°C and 42°C belong to “Strong Fever” to a degree.
“More-or-Less” Rather Than “Either-Or”!
Fuzzy Sets
Lotfi A. Zadeh, the founder of fuzzy logic.
Degrees of truth
(figure) A glass drained step by step: empty, half-full (or half-empty?), almost full, nearly full, full? Does it remain empty?
Reasoning With Uncertainty
Term | Certainty Factor
Definitely not | -1.0
Almost certainly not | -0.8
Probably not | -0.6
Maybe not | -0.4
Unknown | -0.2 to +0.2
Maybe | +0.4
Probably | +0.6
Almost certainly | +0.8
Definitely | +1.0
Fuzzy Sets...
(figure) Representing crisp and fuzzy sets as subsets of a domain (universe) U.
Fuzziness versus probability
(figure) Probability density function for throwing a die, and the membership functions of the concepts "Small" number, "Medium", "Big".
Conceptualising in fuzzy terms...
(figure) One representation for the fuzzy number "about 600".
(figure) Representing truthfulness (certainty) of events as fuzzy sets over the [0,1] domain.
Sets
{z ∈ Z⁺ | z ≤ 3} = {1,2,3}
{Live dinosaurs in British Museum} = ∅
{0,1,1,2} = {0,1,2}
Strong Fever Revisited
(figure) The same “Strong Fever” example as before: a sharp threshold under conventional (Boolean) set theory versus graded membership of temperatures from 37.2°C to 42°C under fuzzy set theory.
Why fuzzy?
As Zadeh said, the term is concrete, immediate and descriptive; we all know what it means. However, many people in the West were repelled by the word fuzzy, because it is usually used in a negative sense.
Why logic?
Fuzziness rests on fuzzy set theory, and fuzzy logic is just a small part of that theory.
Range of logical values in Boolean and fuzzy logic
(figure) (a) Boolean logic: only the values 0 and 1. (b) Multi-valued logic: the whole range 0, 0.2, 0.4, 0.6, 0.8, 1.
 The classical example in fuzzy sets is tall men.
The elements of the fuzzy set “tall men” are all
men, but their degrees of membership depend on
their height.
Fuzzy Set Definitions
Discrete Definition:
µSF(35°C) = 0    µSF(38°C) = 0.1     µSF(41°C) = 0.9
µSF(36°C) = 0    µSF(39°C) = 0.35    µSF(42°C) = 1
µSF(37°C) = 0    µSF(40°C) = 0.65    µSF(43°C) = 1
Continuous Definition: No More Artificial Thresholds!
(figure) µ(x) rises smoothly from 0 to 1 over the range 36°C–42°C.
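A sketch of both definitions in Python; the piecewise-linear interpolation between the listed points is an assumption about the curve, not taken from the slide:

discrete_sf = {35: 0.0, 36: 0.0, 37: 0.0, 38: 0.1, 39: 0.35,
               40: 0.65, 41: 0.9, 42: 1.0, 43: 1.0}

def mu_strong_fever(t):
    """Membership in 'Strong Fever', piecewise-linear between the points."""
    pts = sorted(discrete_sf.items())
    if t <= pts[0][0]:
        return pts[0][1]
    if t >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= t <= x1:
            return y0 + (y1 - y0) * (t - x0) / (x1 - x0)

print(mu_strong_fever(39.5))        # 0.5, halfway between 0.35 and 0.65
print(mu_strong_fever(39.5) ** 2)   # 0.25: the hedge "very" squares mu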
Representation of hedges in fuzzy logic
Hedge | Mathematical Expression
A little | [µA(x)]^1.3
Slightly | [µA(x)]^1.7
Very | [µA(x)]^2
Extremely | [µA(x)]^3
Very very | [µA(x)]^4
More or less | √µA(x)
Somewhat | √µA(x)
Indeed | 2[µA(x)]^2 if 0 ≤ µA ≤ 0.5; 1 − 2[1 − µA(x)]^2 if 0.5 < µA ≤ 1
(graphical representations omitted)
Linguistic Variable
...Terms, Degree of Membership, Membership Function, Base Variable...
(figure) µ(x) over 36°C–42°C with the terms: low temp, normal, raised temperature (“pretty much raised”), strong fever (“but just slightly strong”).
A Linguistic Variable Defines a Concept of Our Everyday Language!
Fuzzy Sets
 Formal definition:
A fuzzy set A in X is expressed as a set of ordered pairs:
A = {(x, µA(x)) | x ∈ X}
where µA is the membership function (MF) and X is the universe (or universe of discourse).
 A fuzzy set is totally characterized by its membership function (MF).
Fuzzy Sets with Discrete Universes
 Fuzzy set C = “desirable city to live in”
X = {SF, Boston, LA}
C = {(SF, 0.9), (Boston, 0.8), (LA, 0.6)}
 Fuzzy set A = “sensible number of children”
X = {0, 1, 2, 3, 4, 5, 6}
A = {(0, .1), (1, .3), (2, .7), (3, 1), (4, .6), (5, .2), (6, .1)}
Cantor’s sets
(figure) Crisp-set operations: complement (Not A), containment (A ⊆ B), intersection (A ∩ B), union (A ∪ B).
Operations of fuzzy sets
(figure) Membership functions µ(x) for the four operations: complement (Not A), containment (B contains A), intersection (A ∩ B), union (A ∪ B).
 Complement
Crisp Sets: Who does not belong to the set?
Fuzzy Sets: How much do elements not belong to
the set?
The complement of a set is an opposite of this set.
For example, if we have the set of tall men, its
complement is the set of NOT tall men. When we
remove the tall men set from the universe of
discourse, we obtain the complement. If A is the
fuzzy set, its complement ¬A can be found as follows:
µ¬A(x) = 1 − µA(x)
 Containment
Crisp Sets: Which sets belong to which other sets?
Fuzzy Sets: How much do sets belong to other sets?
Similar to a Chinese box, a set can contain other
sets. The smaller set is called the subset. For
example, the set of tall men contains all tall men;
very tall men is a subset of tall men. However, the
tall men set is just a subset of the set of men. In
crisp sets, all elements of a subset entirely belong to
a larger set. In fuzzy sets, however, each element
can belong less to the subset than to the larger set.
Elements of the fuzzy subset have smaller
memberships in it than in the larger set.
 Intersection
Crisp Sets: Which element belongs to both sets?
Fuzzy Sets: How much of the element is in both
sets?
In classical set theory, an intersection between two
sets contains the elements shared by these sets. For
example, the intersection of the set of tall men and
the set of fat men is the area where these sets
overlap. In fuzzy sets, an element may partly
belong to both sets with different memberships. A
fuzzy intersection is the lower membership in both
sets of each element. The fuzzy intersection of two
fuzzy sets A and B on universe of discourse X:
AB(x) = min [A (x), B (x)] = A (x)  B(x),
where xX
 Union
Crisp Sets: Which element belongs to either set?
Fuzzy Sets: How much of the element is in either
set?
The union of two crisp sets consists of every element
that falls into either set. For example, the union of
tall men and fat men contains all men who are tall
OR fat. In fuzzy sets, the union is the reverse of the
intersection. That is, the union is the largest
membership value of the element in either set. The
fuzzy operation for forming the union of two fuzzy
sets A and B on universe X can be given as:
AB(x) = max [A (x), B(x)] = A (x) B(x),
where xX
Set-Theoretic Operations
 Subset:
A ⊆ B ⇔ µA(x) ≤ µB(x)
 Complement:
Ā = X − A ⇔ µĀ(x) = 1 − µA(x)
 Union:
C = A ∪ B ⇔ µC(x) = max(µA(x), µB(x)) = µA(x) ∨ µB(x)
 Intersection:
C = A ∩ B ⇔ µC(x) = min(µA(x), µB(x)) = µA(x) ∧ µB(x)
What is the difference between classical and
fuzzy rules?
A classical IF-THEN rule uses binary logic, for
example,
Rule: 1 Rule: 2
IF speed is > 100 IF speed is < 40
THEN stopping_distance is long THEN stopping_distance is short

The variable speed can have any numerical value


between 0 and 220 km/h, but the linguistic variable
stopping_distance can take either value long or short.
In other words, classical rules are expressed in the
black-and-white language of Boolean logic.
We can also represent the stopping distance rules in a
fuzzy form:
Rule: 1 Rule: 2
IF speed is fast IF speed is slow
THEN stopping_distance is long THEN stopping_distance is short

In fuzzy rules, the linguistic variable speed also has


the range (the universe of discourse) between 0 and
220 km/h, but this range includes fuzzy sets, such as
slow, medium and fast. The universe of discourse of
the linguistic variable stopping_distance can be
between 0 and 300 m and may include such fuzzy
sets as short, medium and long.
Boolean OR
p | q | p∨q
0 | 0 | 0
0 | 1 | 1
1 | 0 | 1
1 | 1 | 1
Fuzzy OR
p | q | p∨q
0 | 0 | 0
0 | 0.5 | 0.5
0 | 1 | 1
0.5 | 0 | 0.5
0.5 | 0.5 | 0.5
0.5 | 1 | 1
1 | 0 | 1
1 | 0.5 | 1
1 | 1 | 1
Basic Elements of a Fuzzy Logic System
Fuzzy Logic Defines the Control Strategy on a Linguistic Level!
Fuzzification, Fuzzy Inference, Defuzzification:
(figure) 1. Fuzzification maps measured variables (numerical values) to linguistic values; 2. Fuzzy inference operates on the linguistic level and produces command variables as linguistic values; 3. Defuzzification maps them back to numerical command variables for the plant.
Basic Elements of a Fuzzy Logic System
Closing the Loop: Control Loop of the Fuzzy Logic Controlled Container Crane – With Words!
(figure) Angle and distance (numerical values) are fuzzified, fuzzy inference produces the linguistic variable power, and defuzzification turns it into a numerical power command for the container crane.
Types of Fuzzy Controllers: – Direct Controller –
The Outputs of the Fuzzy Logic System Are the Command Variables of the Plant:
(figure) Measured variables are fuzzified, rules such as IF temp=low AND P=high THEN A=med are evaluated by the inference engine, and defuzzification outputs the command variables to the plant.
Fuzzy rules output absolute values!
Types of Fuzzy Controllers: – Supervisory Control –
Fuzzy Logic Controller Outputs Set Values for Underlying PID Controllers:
(figure) The fuzzified measured variables drive rules such as IF temp=low AND P=high THEN A=med; after defuzzification the outputs are set values for several PID controllers acting on the plant.
Human operator type control!
Types of Fuzzy Controllers: – PID Adaptation –
Fuzzy Logic Controller Adapts the P, I, and D Parameters of a Conventional PID Controller:
(figure) From the set point variable and the measured variable, the fuzzy system (fuzzification, inference, defuzzification) tunes the P, I, and D parameters of the PID controller that produces the command variable for the plant.
The Fuzzy Logic System Analyzes the Performance of the PID Controller and Optimizes It!
Rough set theory
 Rough set theory was developed by
Zdzislaw Pawlak in the early 1980’s.
 Representative Publications:
– Z. Pawlak, “Rough Sets”, International Journal
of Computer and Information Sciences, Vol.11,
341-356 (1982).
– Z. Pawlak, Rough Sets – Theoretical Aspects of Reasoning about Data, Kluwer Academic Publishers (1991).
Introduction (2)
 The main goal of the rough set analysis is induction of
approximations of concepts.
 Rough sets constitute a sound basis for KDD. The theory offers mathematical tools to discover patterns hidden in data.
 It can be used for feature selection, feature extraction, data reduction, decision rule generation, and pattern extraction (templates, association rules), etc.
 It identifies partial or total dependencies in data, eliminates redundant data, and gives an approach to null values, missing data, dynamic data and others.
Information Systems/Tables
 An information system (IS) is a pair (U, A)
 U is a non-empty finite set of objects
 A is a non-empty finite set of attributes such that a : U → Va for every a ∈ A
 Va is called the value set of a.

Object | Age | LEMS
x1 | 16-30 | 50
x2 | 16-30 | 0
x3 | 31-45 | 1-25
x4 | 31-45 | 1-25
x5 | 46-60 | 26-49
x6 | 16-30 | 26-49
x7 | 46-60 | 26-49
Decision Systems/Tables
 A decision system (DS) is T = (U, A ∪ {d})
 d ∉ A is the decision attribute (instead of one we can consider more decision attributes).
 The elements of A are called the condition attributes.

Object | Age | LEMS | Walk
x1 | 16-30 | 50 | yes
x2 | 16-30 | 0 | no
x3 | 31-45 | 1-25 | no
x4 | 31-45 | 1-25 | yes
x5 | 46-60 | 26-49 | no
x6 | 16-30 | 26-49 | yes
x7 | 46-60 | 26-49 | no
Indiscernibility
 Let IS = (U, A) be an information system; then with any B ⊆ A there is an associated equivalence relation:
IND_IS(B) = {(x, x') ∈ U² | ∀a ∈ B, a(x) = a(x')}
where IND_IS(B) is called the B-indiscernibility relation.
 If (x, x') ∈ IND_IS(B), then objects x and x' are indiscernible from each other by attributes from B.
 The equivalence classes of the B-indiscernibility relation are denoted [x]_B.
An Example of Indiscernibility
(using the Walk decision table above)
 The non-empty subsets of the condition attributes are {Age}, {LEMS}, and {Age, LEMS}.
 IND({Age}) = {{x1,x2,x6}, {x3,x4}, {x5,x7}}
 IND({LEMS}) = {{x1}, {x2}, {x3,x4}, {x5,x6,x7}}
 IND({Age,LEMS}) = {{x1}, {x2}, {x3,x4}, {x5,x7}, {x6}}
Set Approximation
 Let T = (U, A), B ⊆ A and X ⊆ U. We can approximate X using only the information contained in B by constructing the B-lower and B-upper approximations of X, denoted B̲X and B̄X respectively, where
B̲X = {x | [x]_B ⊆ X},
B̄X = {x | [x]_B ∩ X ≠ ∅}.
Set Approximation (2)
 The B-boundary region of X, BN_B(X) = B̄X − B̲X, consists of those objects that we cannot decisively classify into X on the basis of B.
 The B-outside region of X, U − B̄X, consists of those objects that can with certainty be classified as not belonging to X.
 A set is said to be rough if its boundary region is non-empty; otherwise the set is crisp.
An Example of Set Approximation
 Let W = {x | Walk(x) = yes} = {x1, x4, x6}. Then
A̲W = {x1, x6},
ĀW = {x1, x3, x4, x6},
BN_A(W) = {x3, x4},
U − ĀW = {x2, x5, x7}.
 The decision class Walk is rough since the boundary region is not empty.
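The whole example can be recomputed mechanically. A Python sketch over the Walk table (attribute values as above):

rows = {
    "x1": ("16-30", "50",    "yes"),
    "x2": ("16-30", "0",     "no"),
    "x3": ("31-45", "1-25",  "no"),
    "x4": ("31-45", "1-25",  "yes"),
    "x5": ("46-60", "26-49", "no"),
    "x6": ("16-30", "26-49", "yes"),
    "x7": ("46-60", "26-49", "no"),
}

# Indiscernibility classes for B = {Age, LEMS}: group objects that agree
# on both condition attributes.
classes = {}
for x, (age, lems, walk) in rows.items():
    classes.setdefault((age, lems), set()).add(x)

W = {x for x, r in rows.items() if r[2] == "yes"}            # Walk = yes
lower = set().union(*(c for c in classes.values() if c <= W))
upper = set().union(*(c for c in classes.values() if c & W))

print(sorted(lower))          # ['x1', 'x6']
print(sorted(upper))          # ['x1', 'x3', 'x4', 'x6']
print(sorted(upper - lower))  # ['x3', 'x4']: non-empty boundary, so Walk is rough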
An Example of Set Approximation (2)
(figure) ĀW covers {{x1},{x6}} (yes) and the boundary {{x3,x4}} (yes/no); {{x2},{x5,x7}} lie outside (no).
Lower & Upper Approximations
(figure) R̲X ⊆ X ⊆ R̄X, where U/R partitions the universe and R is a subset of attributes.
Lower & Upper Approximations (2)
Upper Approximation: R̄X = ∪{Y ∈ U/R : Y ∩ X ≠ ∅}
Lower Approximation: R̲X = ∪{Y ∈ U/R : Y ⊆ X}
Lower & Upper Approximations (3)
U | Headache | Temp. | Flu
u1 | Yes | Normal | No
u2 | Yes | High | Yes
u3 | Yes | Very-high | Yes
u4 | No | Normal | No
u5 | No | High | No
u6 | No | Very-high | Yes
u7 | No | High | Yes
u8 | No | Very-high | No
The indiscernibility classes defined by R = {Headache, Temp.} are {u1}, {u2}, {u3}, {u4}, {u5, u7}, {u6, u8}.
X1 = {u | Flu(u) = yes} = {u2, u3, u6, u7}    X2 = {u | Flu(u) = no} = {u1, u4, u5, u8}
R̲X1 = {u2, u3}    R̲X2 = {u1, u4}
R̄X1 = {u2, u3, u6, u7, u8, u5}    R̄X2 = {u1, u4, u5, u8, u7, u6}
Lower & Upper Approximations (4)
(figure) The partition U/R = {{u1}, {u2}, {u3}, {u4}, {u5,u7}, {u6,u8}} laid over X1 = {u2,u3,u6,u7} and X2 = {u1,u4,u5,u8}: the classes {u5,u7} and {u6,u8} straddle both sets, which is why they appear in the upper but not the lower approximations.
Four Basic Classes of Rough Sets
 X is roughly B-definable, iff B̲(X) ≠ ∅ and B̄(X) ≠ U,
 X is internally B-undefinable, iff B̲(X) = ∅ and B̄(X) ≠ U,
 X is externally B-undefinable, iff B̲(X) ≠ ∅ and B̄(X) = U,
 X is totally B-undefinable, iff B̲(X) = ∅ and B̄(X) = U.
An Example of Reducts & Core
U | Headache | Muscle pain | Temp. | Flu
U1 | Yes | Yes | Normal | No
U2 | Yes | Yes | High | Yes
U3 | Yes | Yes | Very-high | Yes
U4 | No | Yes | Normal | No
U5 | No | No | High | No
U6 | No | Yes | Very-high | Yes

Reduct1 = {Muscle pain, Temp.}:
U | Muscle pain | Temp. | Flu
U1,U4 | Yes | Normal | No
U2 | Yes | High | Yes
U3,U6 | Yes | Very-high | Yes
U5 | No | High | No

Reduct2 = {Headache, Temp.}:
U | Headache | Temp. | Flu
U1 | Yes | Normal | No
U2 | Yes | High | Yes
U3 | Yes | Very-high | Yes
U4 | No | Normal | No
U5 | No | High | No
U6 | No | Very-high | Yes

CORE = {Headache, Temp} ∩ {Muscle pain, Temp} = {Temp}
Given the following rules:
1. IF (lecturing X) AND (marking-practicals X) THEN ADD
(overworked X)
2. IF (month february) THEN ADD (lecturing ali)
3. IF (month february) THEN ADD (marking-practicals ali)
4. IF (overworked X) OR (slept-badly X) THEN ADD (bad-mood X)
5. IF (bad-mood X) THEN DELETE (happy X)
6. IF (lecturing X) THEN DELETE (researching X)

Let us assume that initially we have a working memory


with the following fact elements:
(month february)
(happy ali)
(researching ali)

Apply forward chaining.
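A sketch of an interpreter for these six rules, run to a fixed point. This naive strategy fires every applicable rule on each cycle and ignores conflict resolution, which a real production system would need:

wm = {("month", "february"), ("happy", "ali"), ("researching", "ali")}

def step(wm):
    new = set(wm)
    xs = {x for (_, x) in wm}                 # candidate bindings for X
    if ("month", "february") in wm:           # rules 2 and 3
        new |= {("lecturing", "ali"), ("marking-practicals", "ali")}
    for x in xs:
        if ("lecturing", x) in wm and ("marking-practicals", x) in wm:
            new.add(("overworked", x))        # rule 1
        if ("overworked", x) in wm or ("slept-badly", x) in wm:
            new.add(("bad-mood", x))          # rule 4
        if ("bad-mood", x) in wm:
            new.discard(("happy", x))         # rule 5
        if ("lecturing", x) in wm:
            new.discard(("researching", x))   # rule 6
    return new

while True:                                   # iterate to a fixed point
    nxt = step(wm)
    if nxt == wm:
        break
    wm = nxt

print(sorted(wm))
# [('bad-mood', 'ali'), ('lecturing', 'ali'), ('marking-practicals', 'ali'),
#  ('month', 'february'), ('overworked', 'ali')]
# (happy ali) and (researching ali) have been deleted by rules 5 and 6.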
