
University of Wisconsin - Madison CS 540 Lecture Notes, C. R. Dyer

Logic (Chapter 7)
Logic for Knowledge Representation and Reasoning
One of the core problems in developing an intelligent system is knowledge representation, i.e.,
solving the problems of (1) how to represent the knowledge one has about a problem domain, and (2)
how to reason using that knowledge in order to answer questions or make decisions
Knowledge representation deals with the problem of how to model the world sufficiently for intelligent
action
Logic is one of the oldest representation languages studied for AI, and is the foundation for many
existing systems that use logic either as inspiration or as the basis for their tools (e.g., rule-based
expert systems and the Prolog programming language)
For a knowledge-based intelligent agent, we need:
To represent knowledge about the world in a formal language
To reason about the world using inferences in the language
To decide what action to take by inferring that the selected action is good

Representation Languages

The fundamental problem in designing a knowledge representation language is the tradeoff between
(1) a language that is expressive enough to represent the important objects and relations in a
problem domain, and (2) a language that allows for a tractable (i.e., efficient) means of reasoning
and answering questions about implicit information in a reasonable amount of time
Logic is a well-studied, general-purpose language for describing what's true and false in the world,
along with mechanical procedures that can operate on sentences in the language to perform reasoning
(i.e., to determine what "implicitly follows" from what is explicitly represented)

Logic
Logic is a formal system in which the formulas or sentences have true or false values
A logic includes:
Syntax: Specifies the symbols in the language and how they can be combined to form sentences.
Hence facts about the world are represented as sentences in logic
Semantics: Specifies what facts in the world a sentence refers to. Hence, also specifies how you
assign a truth value to a sentence based on its meaning in the world. A fact is a claim about the
world, and may be true or false.
Inference Procedure: Mechanical method for computing (deriving) new (true) sentences from
existing sentences
Facts are claims about the world that are True or False, whereas a representation is an expression
(sentence) in some language that can be encoded in a computer program and stands for the objects and
relations in the world
We need to ensure that the representation is consistent with reality, so that the following figure holds:
                                entails
Representation:  Sentences ----------------> Sentences
                     |                           |
                     |                           |
                     | Semantics                 | Semantics
                     | refer to                  | refer to
                     |                           |
                     \/          follows         \/
World:             Facts -------------------->  Facts
Truth: A sentence is True if the state of affairs it describes is actually the case in the world. So, truth
can only be assessed with respect to the semantics. Yet the computer does not know the semantics of
the knowledge representation language, so we need some way of performing inferences to derive valid
conclusions even when the computer does not know what the semantics (the interpretation) is
To build a logic-based representation:
User defines a set of primitive symbols and the associated semantics
Logic defines the ways of putting these symbols together so that the user can define legal
sentences in the language that represent true facts in the world
Logic defines ways of inferring new sentences from existing ones

Propositional Logic (PL)


A simple language that is useful for showing key ideas and definitions
User defines a set of propositional symbols, like P and Q. User defines the semantics of each of these
symbols. For example,
P means "It is hot"
Q means "It is humid"
R means "It is raining"
A sentence (also called a formula or well-formed formula or wff) is defined as:
1. A symbol
2. If S is a sentence, then ~S is a sentence, where "~" is the "not" logical operator
3. If S and T are sentences, then (S v T), (S ^ T), (S => T), and (S <=> T) are sentences, where the
four logical connectives correspond to "or," "and," "implies," and "if and only if," respectively
4. A finite number of applications of (1)-(3)
Examples of PL sentences:
(P ^ Q) => R (here meaning "If it is hot and humid, then it is raining")
Q => P (here meaning "If it is humid, then it is hot")
Q (here meaning "It is humid.")
Given the truth values of all of the constituent symbols in a sentence, that sentence can be "evaluated"
to determine its truth value (True or False). Such an assignment of truth values to the symbols is
called an interpretation of the sentence.
A model is an interpretation (i.e., an assignment of truth values to symbols) of a set of sentences such
that each sentence is True. A model is just a formal mathematical structure that "stands in" for the
world.
A valid sentence (also called a tautology) is a sentence that is True under all interpretations. Hence,
no matter what the world is actually like or what the semantics is, the sentence is True. For example
"It's raining or it's not raining."
An inconsistent sentence (also called unsatisfiable or a contradiction) is a sentence that is False
under all interpretations. Hence the world is never like what it describes. For example, "It's raining and
it's not raining."
Sentence P entails sentence Q, written P |= Q, means that whenever P is True, so is Q. In other words,
all models of P are also models of Q
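To make these definitions concrete, here is a minimal sketch in Python (not part of the original notes)
showing one possible encoding of PL sentences as nested tuples and their evaluation under an
interpretation; the connective names and the function name evaluate are illustrative choices, not a
standard API.

# A sentence (wff) is either a proposition symbol (a Python string) or a tuple
# whose first element names a connective: ('not', S), ('and', S, T),
# ('or', S, T), ('implies', S, T), ('iff', S, T).

def evaluate(sentence, interpretation):
    """Truth value of a sentence under an interpretation, i.e., a dict
    mapping every proposition symbol to True or False."""
    if isinstance(sentence, str):                 # base case: a symbol
        return interpretation[sentence]
    op, *args = sentence
    vals = [evaluate(a, interpretation) for a in args]
    if op == 'not':     return not vals[0]
    if op == 'and':     return vals[0] and vals[1]
    if op == 'or':      return vals[0] or vals[1]
    if op == 'implies': return (not vals[0]) or vals[1]
    if op == 'iff':     return vals[0] == vals[1]
    raise ValueError('unknown connective: %s' % op)

# {P: True, Q: True, R: True} is a model of (P ^ Q) => R,
# since the sentence evaluates to True under that interpretation.
print(evaluate(('implies', ('and', 'P', 'Q'), 'R'),
               {'P': True, 'Q': True, 'R': True}))    # True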

Logical (Deductive) Inference

Let KB = { S1, S2,..., SM } be the set of all sentences in our Knowledge Base, where each Si is a sentence in
Propositional Logic. Let { X1, X2, ..., XN } be the set of all the symbols (i.e., variables) that are contained in
all of the M sentences in KB. Say we want to know if a goal (aka query, conclusion, or theorem) sentence G
follows from KB.

Since the computer doesn't know the interpretation of these sentences in the world, we don't know whether
the constituent symbols represent facts in the world that are True or False. So, instead, consider all possible
combinations of truth values for all the symbols, hence enumerating all logically distinct cases:
X1 X2 ... XN | S1 S2 ... SM | S1 ^ S2 ^...^ SM | G | (S1 ^...^ SM) => G
-------------|--------------|------------------|---|--------------------
F  F  ...  F |              |                  |   |
F  F  ...  T |              |                  |   |
    ...      |              |                  |   |
T  T  ...  T |              |                  |   |

There are 2^N rows in the table.


Each row corresponds to an equivalence class of worlds that, under a given interpretation, have the
truth values for the N symbols assigned in that row.
The models of KB are the rows where the third-to-last column is true, i.e., where all of the sentences
in KB are true.
A sentence R is valid if and only if it is true under all possible interpretations, i.e., if the entire column
associated with R contains all true values.
Since we don't know the semantics and therefore whether each symbol is True or False, to determine if
a sentence G is entailed by KB, we must determine if all models of KB are also models of G. That is,
whenever KB is true, G is true too. In other words, whenever the third-to-last column has a T, the same
row in the second-to-last column also has a T. But this is logically equivalent to saying that the
sentence (KB => G) is valid (by definition of the "implies" connective). In other words, if the last
column of the table above contains only True values, then KB entails G; the conclusion G logically
follows from the premises in KB, no matter what interpretations (i.e., semantics) are associated with
the sentences!

The truth table method of inference is complete for PL (Propositional Logic) because we can always
enumerate all 2^n rows for the n propositional symbols that occur. But this is exponential in n. In
general, it has been shown that the problem of checking if a set of sentences in PL is satisfiable is NP-
complete. (The truth table method of inference is not complete for FOL (First-Order Logic).)
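The enumeration procedure just described translates directly into code. The sketch below (Python,
reusing the evaluate helper from the earlier sketch; the name entails is illustrative, not a library
function) checks entailment by generating all 2^N truth assignments and testing whether every model of
KB is also a model of the goal G.

from itertools import product

def entails(kb_sentences, goal, symbols):
    """Truth-table entailment: True iff every assignment that makes all
    sentences in the KB True (a model of KB) also makes goal True."""
    for values in product([False, True], repeat=len(symbols)):
        interpretation = dict(zip(symbols, values))
        if all(evaluate(s, interpretation) for s in kb_sentences):
            if not evaluate(goal, interpretation):    # a model of KB that falsifies G
                return False
    return True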

Example
Using the "weather" sentences from above, let KB = (((P ^ Q) => R) ^ (Q => P) ^ Q) corresponding to the
three facts we know about the weather: (1) "If it is hot and humid, then it is raining," (2) "If it is humid, then
it is hot," and (3) "It is humid." Now let's ask the query "Is it raining?" That is, is the query sentence R
entailed by KB? Using the truth-table approach to answering this query we have:
P Q R | (P ^ Q) => R | Q => P | Q | KB | R | KB => R
------|--------------|--------|---|----|---|--------
T T T |      T       |   T    | T | T  | T |    T
T T F |      F       |   T    | T | F  | F |    T
T F T |      T       |   T    | F | F  | T |    T
T F F |      T       |   T    | F | F  | F |    T
F T T |      T       |   F    | T | F  | T |    T
F T F |      T       |   F    | T | F  | F |    T
F F T |      T       |   T    | F | F  | T |    T
F F F |      T       |   T    | F | F  | F |    T

Hence, in this problem there is only one model of KB, when P, Q, and R are all True. And in this case R is
also True, so R is entailed by KB. Also, you can see that the last column is all True values, so the sentence
KB => R is valid.
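The same query can be checked mechanically with the entails sketch above, encoding the three weather
sentences in the same illustrative tuple form:

kb = [('implies', ('and', 'P', 'Q'), 'R'),   # "If it is hot and humid, then it is raining"
      ('implies', 'Q', 'P'),                 # "If it is humid, then it is hot"
      'Q']                                   # "It is humid"
print(entails(kb, 'R', ['P', 'Q', 'R']))     # True: R is entailed by KB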

Instead of an exponential length proof by truth table construction, is there a faster way to implement the
inference process? Yes, using a proof procedure or inference procedure that uses sound rules of inference
to deduce (i.e., derive) new sentences that are true in all cases where the premises are true. For example,
consider the following:

P Q | P | P => Q | P ^ (P => Q) | Q | (P ^ (P => Q)) => Q
----|---|--------|--------------|---|---------------------
F F | F |   T    |      F       | F |          T
F T | F |   T    |      F       | T |          T
T F | T |   F    |      F       | F |          T
T T | T |   T    |      T       | T |          T

Since whenever P and P => Q are both true (last row only), Q is true too, Q is said to be derived from these
two premise sentences. We write this as KB |- Q. This local pattern referencing only two of the M sentences
in KB is called the Modus Ponens inference rule. The truth table shows that this inference rule is sound. It
specifies how to make one kind of step in deriving a conclusion sentence from a KB.
Therefore, given the sentences in KB, construct a proof that a given conclusion sentence can be derived from
KB by applying a sequence of sound inferences using either sentences in KB or sentences derived earlier in
the proof, until the conclusion sentence is derived. This method is called the Natural Deduction procedure.
(Note: This step-by-step, local proof process also relies on the monotonicity property of PL and FOL. That
is, adding a new sentence to KB does not affect what can be entailed from the original KB and does not
invalidate old sentences.)

Sound Rules of Inference


Here are some examples of sound rules of inference. Each can be shown to be sound once and for all using a
truth table. The left column contains the premise sentence(s), and the right column contains the derived
sentence. We write each of these derivations as A |- B , where A is the premise and B is the derived sentence.

Name               Premise(s)        Derived Sentence
Modus Ponens       A, A => B         B
And Introduction   A, B              A ^ B
And Elimination    A ^ B             A
Double Negation    ~~A               A
Unit Resolution    A v B, ~B         A
Resolution         A v B, ~B v C     A v C
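Each rule can be shown sound by verifying that the sentence "premises => derived sentence" is a
tautology. For example, the following small self-contained check (Python, purely illustrative)
confirms the Resolution rule by enumerating all assignments to A, B, and C:

from itertools import product

# Resolution is sound iff ((A v B) ^ (~B v C)) => (A v C) is True
# under every assignment of truth values to A, B, C.
sound = all((not ((a or b) and ((not b) or c))) or (a or c)
            for a, b, c in product([False, True], repeat=3))
print(sound)   # True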

Using Inference Rules to Prove a Query/Goal/Theorem

A proof is a sequence of sentences, where each sentence is either a premise or a sentence derived from earlier
sentences in the proof by one of the rules of inference. The last sentence is the query (also called goal or
theorem) that we want to prove.

Example for the "weather problem" given above.

1. Q                Premise
2. Q => P           Premise
3. P                Modus Ponens(1,2)
4. (P ^ Q) => R     Premise
5. P ^ Q            And Introduction(1,3)
6. R                Modus Ponens(4,5)
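The same proof can be found mechanically. Below is a minimal forward-chaining sketch in Python (not
from the original notes; it hard-codes only Modus Ponens and And Introduction, using the nested-tuple
sentence encoding from the earlier sketches) that derives R from the weather KB:

def forward_chain(kb, goal):
    """Repeatedly apply Modus Ponens (plus And Introduction when an
    implication needs a conjunctive premise) until the goal is derived
    or nothing new can be added."""
    derived = list(kb)
    changed = True
    while changed and goal not in derived:
        changed = False
        for s in list(derived):
            if isinstance(s, tuple) and s[0] == 'implies':
                premise, conclusion = s[1], s[2]
                # And Introduction: derive A ^ B once both conjuncts are known
                if (isinstance(premise, tuple) and premise[0] == 'and'
                        and premise[1] in derived and premise[2] in derived
                        and premise not in derived):
                    derived.append(premise)
                    changed = True
                # Modus Ponens: from A and A => B, derive B
                if premise in derived and conclusion not in derived:
                    derived.append(conclusion)
                    changed = True
    return goal in derived

kb = [('implies', ('and', 'P', 'Q'), 'R'),   # premise 4 above
      ('implies', 'Q', 'P'),                 # premise 2
      'Q']                                   # premise 1
print(forward_chain(kb, 'R'))                # True, mirroring steps 1-6 of the proof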

Two Important Properties for Inference


Soundness: If KB |- Q then KB |= Q
That is, if Q is derived from a set of sentences KB using a given set of rules of inference, then Q is
entailed by KB. Hence, inference produces only real entailments: any sentence derived deductively from
the premises really is entailed by them.

Completeness: If KB |= Q then KB |- Q
That is, if Q is entailed by a set of sentences KB, then Q can be derived from KB using the rules of
inference. Hence, inference produces all entailments: every sentence entailed by the premises can be
proved from them.

Propositional Logic is Too Weak a Representational Language

Propositional Logic (PL) is not a very expressive language because:


Hard to identify "individuals." E.g., Mary, 3
Can't directly talk about properties of individuals or relations between individuals. E.g., tall(Bill)
Generalizations, patterns, regularities can't easily be represented. E.g., all triangles have 3 sides

Consider the problem of representing the following information:

Every person is mortal.


Confucius is a person.
Confucius is mortal.

How can these sentences be represented so that we can infer the third sentence from the first two? In PL we
have to create propositional symbols to stand for all or part of each sentence. For example, we might do:

Person => Mortal


Person-Confucius
Mortal-Confucius

That is, we have used four symbols to represent the three given sentences. But, given this representation, the
third sentence is not entailed by the first two.

A different representation would be to use three symbols to represent the three sentences as

Person => Mortal


Confucius => Person
Confucius => Mortal

In this case the third sentence is entailed by the first two, but we needed an explicit symbol,
Confucius, to represent an individual who is a member of the classes "person" and "mortal." So, to
represent other individuals we must introduce a separate symbol for each one, along with some means of
representing the fact that all individuals who are "people" are also "mortal." First-Order Logic
(abbreviated FOL or FOPC) is expressive enough to concisely represent this kind of situation.

Copyright © 1996-2003 by Charles R. Dyer. All rights reserved.
