Cognitive Logic versus Mathematical Logic
Pei Wang
Department of Computer and Information Sciences
Temple University
[email protected]
http://www.cis.temple.edu/pwang/
Abstract
First-order predicate logic meets many problems when used to explain
or reproduce cognition and intelligence. These problems have a common nature, that is, they all exist outside mathematics, the domain for
which mathematical logic was designed. Cognitive logic and mathematical logic are fundamentally different, and the former cannot be obtained
by partially revising or extending the latter. A reasoning system using
a cognitive logic is briefly introduced, which provides solutions to many
problems in a unified manner.
(1) Uncertainty
Traditional theories of reasoning are certain in several aspects, whereas actual human reasoning is often uncertain in these aspects.
The meaning of a term in mathematical logic is determined according to
an interpretation, and therefore does not change as the system runs. In
contrast, the meaning of a term in the human mind often changes according
to experience and context. Example: What is a "language"?
In mathematical logic, the meaning of a compound term is completely
determined by its definition, which reduces its meaning into the meaning
of its components and the operator (connector) that joins the components.
In contrast, the meaning of a compound term in the human mind often
cannot be fully reduced to that of its components, though it is still related
to them. Example: Is a blackboard exactly a black board?
In mathematical logic, a statement is either true or false, but people often
take the truth values of certain statements to be between true and false. Example: Is "Tomorrow will be cloudy" true or false?
In mathematical logic, the truth value of a statement does not change
over time. However, people often revise their beliefs after getting new
information. Example: After learning that Tweety is a penguin, will you
change some of your beliefs formed when you only know that Tweety is a
bird?
In mathematical logic, a contradiction leads to the proof of any arbitrary
conclusion. However, the existence of a contradiction in a human mind does
not lead the person to do so. Example: When you experience conflicting
beliefs, do you believe that 1 + 1 = 3?
In traditional reasoning systems, inference processes follow algorithms and
are therefore predictable. Human reasoning processes, on the other hand,
are often unpredictable, and very often an inference process jumps in an
unanticipated direction. Example: Have you ever postponed your writing
plan, and waited for an inspiration?
In traditional reasoning systems, how a conclusion is obtained can be accurately explained and repeated. In contrast, the human mind often
generates conclusions whose sources and paths cannot be traced back.
Example: Have you ever said "I don't know why I believe that; it's just
my intuition"?
In traditional reasoning systems, every inference process has a prespecified
goal, and the process stops whenever its goal is achieved. However, though
human reasoning processes are also guided by various goals, they often
cannot be completely achieved. Example: Have you ever tried to find the
goal of your life? When can you stop thinking about it?
None of the problems listed in the previous section is new. Indeed, each of
them has received many proposed solutions, in the form of various non-classical
logics and reasoning systems. However, few of these solutions treat the
problems together; instead, they are seen as separate issues.
There is a common nature to these problems: they all exist outside mathematics. At the time of Aristotle, the goal of logic was to find abstract patterns
of valid inference in all domains. It remained so until the time of
Frege, Russell, and Whitehead, whose major interest was to set up a solid logical
foundation for mathematics. For this reason, they developed a new logic to
model valid inference in mathematics, typified by the binary deduction process
that derives theorems from axioms and postulates.
What is the difference between a cognitive logic as used in everyday life
and a mathematical logic as used in meta-mathematics? A key difference is
their assumptions on whether their knowledge and resources are sufficient to
solve the problems they face. On this aspect, we can distinguish three types of
reasoning systems:
Pure-axiomatic systems. These systems are designed under the assumption that both knowledge and resources are sufficient. A typical example is the
notion of formal system suggested by Hilbert (and many others), in which all
answers are deduced from a set of axioms by a deterministic algorithm. The
axioms and answers get their meaning by being mapped into a concrete domain
using model-theoretical semantics. Such a system is built on the idea of sufficient knowledge and resources, because all relevant knowledge is assumed to be
fully embedded in the axioms, and because questions have no time constraints,
as long as they are answered in finite time. If a question requires information
beyond the scope of the axioms, it is not the system's fault but the questioner's,
so no attempt is made to allow the system to improve its capacities and to adapt
to its environment.
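As an illustration (not part of any system discussed here), a pure-axiomatic system can be caricatured as a deterministic forward-chaining deducer over a fixed axiom set; all names below are hypothetical:

```python
# A caricature of a pure-axiomatic system: every answer is deduced from
# a fixed set of axioms by a deterministic procedure (here, forward
# chaining over simple production rules). All names are illustrative.

def forward_chain(axioms, rules):
    """Return the deductive closure of `axioms` under `rules`.

    `rules` is a list of (premises, conclusion) pairs, where `premises`
    is a set of statements that jointly license `conclusion`.
    """
    known = set(axioms)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

# From axiom A and the rules A |- B and {A, B} |- C, everything follows
# mechanically, with no time pressure and no revision.
theorems = forward_chain({"A"}, [({"A"}, "B"), ({"A", "B"}, "C")])
assert theorems == {"A", "B", "C"}
```

Whatever is not in the closure is simply outside the system's competence, which is exactly the attitude criticized above.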
Semi-axiomatic systems. These systems are designed under the assumption that knowledge and resources are insufficient in some, but not all, aspects.
Consequently, adaptation is necessary. Most current non-classical logics fall
into this category. For example, non-monotonic logics draw tentative conclusions (such as "Tweety can fly") from defaults (such as "Birds normally can
fly") and facts (such as "Tweety is a bird"), and revise such conclusions when
new facts (such as "Tweety is a penguin") arrive. However, in these systems,
defaults and facts are usually unchangeable, and time pressure is not taken into
account [Reiter, 1987]. Fuzzy logic treats categorical membership as a matter of degree, but does not accurately explain where the degrees come from
[Zadeh, 1965]. Many learning systems attempt to improve their behavior, but
still work solely with binary logic, where everything is black-and-white, and persist in always seeking optimal solutions to problems [Michalski, 1993]. Although
some heuristic-search systems look for less-than-optimal solutions when working
within time limits, they usually do not attempt to learn from experience, and
do not consider possible variations of time pressure [Simon and Newell, 1958].
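The Tweety example can be sketched in a few lines; the predicate names and rule format below are illustrative assumptions, not the machinery of any specific non-monotonic formalism:

```python
# Toy sketch of the non-monotonic Tweety example: a default conclusion
# holds unless an exception is known. The predicate names and the rule
# format are illustrative assumptions.

def can_fly(facts):
    """Default: birds normally fly, unless known to be a penguin."""
    return "bird" in facts and "penguin" not in facts

beliefs = {"bird"}
assert can_fly(beliefs)       # tentative conclusion: Tweety can fly
beliefs.add("penguin")        # a new fact arrives...
assert not can_fly(beliefs)   # ...and the conclusion is withdrawn
```

The point of the criticism above is that the default itself, hard-coded here, never changes with experience.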
NARS overview
us into a circular definition or an infinite regress. The way out of this seeming
circularity in NARS is bootstrapping. A simple subset of L is defined first,
with its semantics. Then, it is used to define the semantics of the whole L.
As a result, the truth value of statements in NAL uniformly represents various types of uncertainty, such as randomness, fuzziness, and ignorance. The
semantics specifies how to understand sentences in L, and provides justifications
for the inference rules.
Categorical language
As said above, NARS needs a formal language in which the meaning of a
term is represented by its relationship with other terms, and the truth value
of a sentence is determined by available evidence. For these purposes, the notion of (positive or negative) evidence should be naturally introduced into the
language. Unfortunately, the formal language used in first-order predicate logic
does not satisfy the requirement, as revealed by the Confirmation Paradox
[Hempel, 1943].
A traditional rival to predicate logic is known as term logic. Such logics, exemplified by Aristotle's Syllogistic, have the following features [Bocheński, 1970,
Englebretsen, 1981]:
1. Each sentence is categorical, in the sense that it has a subject term and a
predicate term, related by a copula intuitively interpreted as "to be".
2. Each inference rule is syllogistic, in the sense that it takes two sentences
that share a common term as premises, and from them derives a conclusion
in which the other two (unshared) terms are related by a copula.
In NARS, the basic form of knowledge is a statement "S → P", in which
two terms are related by an inheritance relation. The statement indicates that the two terms can be substituted for each other in certain situations. The
truth value of the statement measures its evidential support obtained from the
experience of the system.
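As a hedged sketch, such an evidential truth value can be computed from the amounts of positive and negative evidence, following the definitions commonly given for NAL (frequency f = w+/w, confidence c = w/(w + k)); the value of the constant k and the no-evidence convention are assumptions of this sketch:

```python
# Evidence-based truth value, following the definitions commonly given
# for NAL: frequency f = w+ / w and confidence c = w / (w + k), where
# w+ and w- are the amounts of positive and negative evidence,
# w = w+ + w-, and k is a constant (the "evidential horizon").

K = 1.0  # assumed value of the evidential horizon

def truth_value(w_plus, w_minus, k=K):
    w = w_plus + w_minus
    f = w_plus / w if w > 0 else 0.5   # no evidence: maximal ignorance
    c = w / (w + k)
    return f, c

f, c = truth_value(3, 1)        # 3 positive, 1 negative observations
assert abs(f - 0.75) < 1e-9     # frequency of positive evidence
assert abs(c - 0.80) < 1e-9     # confidence grows with total evidence
```

Under this reading, frequency summarizes the balance of the evidence, while confidence summarizes how much evidence there is at all.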
Traditional term logic has been criticized for its poor expressive power. In
NARS, this problem is solved by introducing various types of compound terms
into the language, to represent set, intersection and difference, product and
image, statement, and so on.
Syllogistic inference rules
The inference rules in term logic correspond to inheritance-based inference.
In NARS, each statement indicates how to use one item as another one, according to the experience of the system. A typical inference rule in NARS takes two
statements containing a common term as premises, and derives a conclusion
between the other two terms. The truth value of the conclusion is calculated
from the truth values of the premises, according to the semantics mentioned
above.
Different rules correspond to different combinations of premises, and use
different truth-value functions to calculate the truth value from those of the
premises, justified according to the semantics of the system. The inference rules
in NAL uniformly carry out choice, revision, deduction, abduction, induction,
exemplification, comparison, analogy, compound-term formation and transformation, and so on.
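For illustration, two of these truth-value functions can be sketched as follows; the exact formulas (deduction and revision, as commonly given for NAL) should be treated as assumptions of this sketch rather than a definitive specification:

```python
# Sketch of two NAL-style truth-value functions. The formulas are the
# ones commonly given for NAL and are assumptions of this sketch.

K = 1.0  # evidential horizon, as in the truth-value definition

def deduction(f1, c1, f2, c2):
    # {M -> P <f1, c1>, S -> M <f2, c2>} |- S -> P <f, c>
    f = f1 * f2
    c = c1 * c2 * f1 * f2
    return f, c

def revision(f1, c1, f2, c2, k=K):
    # Pool the evidence behind two estimates of the same statement.
    w1 = k * c1 / (1 - c1)          # recover evidence from confidence
    w2 = k * c2 / (1 - c2)
    w_plus = f1 * w1 + f2 * w2
    w = w1 + w2
    return w_plus / w, w / (w + k)

f, c = deduction(0.9, 0.9, 0.8, 0.9)
assert abs(f - 0.72) < 1e-9     # conclusions are weaker than premises

f, c = revision(0.9, 0.8, 0.5, 0.8)
assert abs(f - 0.7) < 1e-9      # pooled frequency lies in between
```

Note that revision raises confidence (evidence accumulates) even when the two premises disagree on frequency, which is how the system absorbs conflicting beliefs without deriving everything.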
Control mechanism
NARS cannot guarantee to process every task optimally: with insufficient
knowledge, the best way to carry out a task is unknown; with insufficient resources, the system cannot exhaustively try all possibilities. Since NARS still
needs to do its best in this situation, the solution used in NARS is to let the
items and activities in the system compete for the limited resources.
In the system, different data items (tasks, beliefs, and concepts) have different priority values attached, according to which the system's resources are
distributed. These values are determined according to the past experience of
the system, and are adjusted according to the change of situation. A special
data structure is developed to implement a probabilistic priority queue with a
limited storage. Using it, each access to an item takes roughly a constant time,
and the accessibility of an item depends on its priority value. When no space
is left, items with low priority will be removed. The memory of the system
contains a collection of concepts, each of which is identified by a term in the
formal language. Within the concept, all the tasks and beliefs that have the
term as subject or predicate are collected together.
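A simplified version of such a bag-like structure can be sketched as follows; unlike the structure described above (which achieves roughly constant-time access), this linear-time version only demonstrates the intended behavior:

```python
import random

# Simplified sketch of a bounded "bag": a probabilistic priority queue
# in which the chance of selecting an item is proportional to its
# priority, and the lowest-priority item is evicted when capacity is
# exceeded. All names are illustrative.

class Bag:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = {}                    # key -> priority in (0, 1]

    def put(self, key, priority):
        self.items[key] = priority
        if len(self.items) > self.capacity:
            lowest = min(self.items, key=self.items.get)
            del self.items[lowest]         # forget low-priority items

    def take(self):
        keys = list(self.items)
        weights = [self.items[k] for k in keys]
        return random.choices(keys, weights=weights)[0]

bag = Bag(capacity=2)
bag.put("task-a", 0.9)
bag.put("task-b", 0.5)
bag.put("task-c", 0.1)                     # over capacity: evicted
assert "task-c" not in bag.items
assert bag.take() in {"task-a", "task-b"}
```

Forgetting is thus not a failure but a resource-allocation policy: low-priority items make room for more promising ones.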
The running of NARS consists of individual inference steps. In each step, a
concept is selected probabilistically (according to its priority), then a task and a
belief are selected (also probabilistically), and some inference rules take the task
and the belief as premises to derive new tasks and beliefs, which are added into
the memory. The system runs continuously, and interacts with its environment
all the time, without stopping at the beginning and ending of each task. The
processing of a task is interwoven with the processing of other existing tasks, so
as to give the system a dynamic and context-sensitive nature.
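The control cycle described above can be sketched as follows; the memory contents and selection details are illustrative stand-ins, and only the control flow (probabilistically select a concept, then a task and a belief within it, then apply rules) reflects the text:

```python
import random

# Minimal sketch of one NARS inference step. The memory contents below
# are hypothetical; in the real system task and belief selection is also
# priority-weighted, and derived results are added back into memory.

memory = {  # concept name -> (priority, tasks, beliefs)
    "bird":  (0.8, ["bird -> flyer?"], ["robin -> bird", "bird -> flyer"]),
    "robin": (0.5, ["robin -> flyer?"], ["robin -> bird"]),
}

def step():
    concepts = list(memory)
    weights = [memory[c][0] for c in concepts]
    concept = random.choices(concepts, weights=weights)[0]
    _, tasks, beliefs = memory[concept]
    task = random.choice(tasks)
    belief = random.choice(beliefs)
    return task, belief               # premises for the inference rules

task, belief = step()   # one step; the cycle then repeats indefinitely
```

Because each step is selected probabilistically by priority, the path the system takes through a problem is context-sensitive and not repeatable, matching the discussion of unpredictability earlier in the paper.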
Discussion
In the following, I roughly summarize how NARS solves the problems listed at
the beginning of the paper.
In NARS, the truth value of a statement is determined by available evidence. Since evidence can be either positive or negative, and future evidence is unpredictable, the system cannot be certain about its beliefs.
Depending on the source of the evidence, the uncertainty may correspond
to randomness or fuzziness [Wang, 1996].
In traditional binary logic, the truth value of a statement only depends
on the existence of negative evidence, but in NARS, as in the everyday
reasoning of human beings, both positive evidence and negative evidence
contribute to truth value. Therefore, "Ravens are black" and "Non-black
things are not ravens" are no longer equivalent, because they have different
positive evidence (though the same negative evidence) [Wang, 1999].
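The evidential asymmetry can be demonstrated on a toy domain; the purely extensional reading of evidence used here (members of the subject term that are, or are not, in the predicate term) is a simplification adopted for the example:

```python
# Toy illustration of the evidential asymmetry. For a statement
# "S is P", members of S that are in P count as positive evidence and
# members of S outside P as negative evidence (a simplified,
# extensional reading).

def evidence(S, P):
    return len(S & P), len(S - P)   # (positive, negative)

domain = {"raven1", "raven2", "shoe", "leaf"}
ravens = {"raven1", "raven2"}
black  = {"raven1", "raven2", "shoe"}

# "Ravens are black": evidence is collected among ravens.
assert evidence(ravens, black) == (2, 0)

# "Non-black things are not ravens": collected among non-black things.
assert evidence(domain - black, domain - ravens) == (1, 0)
# Same negative evidence, different positive evidence.
```

A white shoe is thus positive evidence for the second statement but irrelevant to the first, which dissolves the paradoxical equivalence.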
References
[Birnbaum, 1991] Birnbaum, L. (1991). Rigor mortis: a response to Nilsson's
"Logic and artificial intelligence". Artificial Intelligence, 47:57–77.
[Bocheński, 1970] Bocheński, I. (1970). A History of Formal Logic. Chelsea
Publishing Company, New York. Translated and edited by I. Thomas.
[Dreyfus, 1992] Dreyfus, H. (1992). What Computers Still Can't Do. MIT Press,
Cambridge, Massachusetts.
[Englebretsen, 1981] Englebretsen, G. (1981). Three Logicians. Van Gorcum,
Assen, The Netherlands.
[Hempel, 1943] Hempel, C. (1943). A purely syntactical definition of confirmation. Journal of Symbolic Logic, 8:122–143.
[McDermott, 1987] McDermott, D. (1987). A critique of pure reason. Computational Intelligence, 3:151–160.
[Michalski, 1993] Michalski, R. (1993). Inference theory of learning as a conceptual basis for multistrategy learning. Machine Learning, 11:111–151.
[Penrose, 1994] Penrose, R. (1994). Shadows of the Mind. Oxford University
Press.
[Reiter, 1987] Reiter, R. (1987). Nonmonotonic reasoning. Annual Review of
Computer Science, 2:147–186.
[Searle, 1980] Searle, J. (1980). Minds, brains, and programs. The Behavioral
and Brain Sciences, 3:417–424.
[Simon and Newell, 1958] Simon, H. A. and Newell, A. (1958). Heuristic problem solving: the next advance in operations research. Operations Research,
6:1–10.
[Wang, 1995] Wang, P. (1995). Non-Axiomatic Reasoning System: Exploring
the Essence of Intelligence. PhD thesis, Indiana University.
[Wang, 1996] Wang, P. (1996). The interpretation of fuzziness. IEEE Transactions on Systems, Man, and Cybernetics, 26(4):321–326.
[Wang, 1999] Wang, P. (1999). A new approach for induction: from a non-axiomatic logical point of view. In Ju, S., Liang, Q., and Liang, B., editors,
Philosophy, Logic, and Artificial Intelligence, pages 53–85. Zhongshan University Press.
[Wang, 2001] Wang, P. (2001). Wason's cards: what is wrong? In Proceedings
of the Third International Conference on Cognitive Science, pages 371–375,
Beijing.
[Zadeh, 1965] Zadeh, L. (1965). Fuzzy sets. Information and Control, 8:338–353.