Ontological Engineering
Components of an Ontology:
Classes (Concepts): Categories or kinds of things (e.g., "Person," "Animal").
Relations: Relationships between entities (e.g., "is a part of," "has a").
Instances (Individuals): Concrete examples of a class (e.g., "John," "Tina").
Properties (Attributes): Characteristics of instances (e.g., age, color).
An ontology defines the shared vocabulary for a domain of knowledge, allowing reasoning
engines to process the domain's facts and infer new ones.
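These four components can be made concrete with a minimal sketch in Python; the particular names and attribute values (Person, Animal, John, Tina, the relation names, the ages and colors) are illustrative assumptions, not part of any standard ontology.

```python
# A minimal sketch of the four ontology components as plain Python data.
# All specific names and values here are illustrative assumptions.

classes = {"Person", "Animal"}                          # classes (concepts)

# Relations as (subject, relation, object) triples:
relations = {
    ("Person", "is_a", "Animal"),                       # relation between classes
    ("John", "has_a", "Tina"),                          # relation between instances
}

instance_of = {("John", "Person"), ("Tina", "Animal")}  # instances (individuals)

properties = {                                          # properties (attributes)
    "John": {"age": 42},
    "Tina": {"color": "brown"},
}
```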
Example: In the domain of medicine, an ontology could define concepts like "Disease,"
"Symptom," and "Treatment," and relationships such as "caused by" or "treats."
In “toy” domains, the choice of representation is not that important; many choices will work.
Complex domains such as shopping on the Internet or driving a car in traffic require more
general and flexible representations. This chapter shows how to create these representations,
concentrating on general concepts—such as Events, Time, Physical Objects, and Beliefs—
that occur in many different domains. Representing these abstract concepts is sometimes
called ontological engineering.
For any special-purpose ontology, it is possible to make changes that move it toward
greater generality. An obvious question then arises: do all these ontologies converge on a
general-purpose ontology? After centuries of philosophical and computational investigation,
the answer is “Maybe.” In this section, we present one general-purpose ontology that
synthesizes ideas from those centuries. Two major characteristics of general-purpose
ontologies distinguish them from collections of special-purpose ontologies:
1. A general-purpose ontology should be applicable in more or less any special-purpose
domain (with the addition of domain-specific axioms). This means that no
representational issue can be finessed or brushed under the carpet.
2. In any sufficiently demanding domain, different areas of knowledge must be unified,
because reasoning and problem solving could involve several areas simultaneously. A
robot circuit-repair system, for instance, needs to reason about circuits in terms of
electrical connectivity and physical layout, and about time, both for circuit timing
analysis and for estimating labor costs. The sentences describing time therefore must be
capable of being combined with those describing spatial layout and must work equally
well for nanoseconds and minutes and for angstroms and meters; one simple way to
achieve this is sketched in the code below.
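One simple way to let temporal and spatial facts combine across such scales is to normalize every quantity to a base unit when it enters the knowledge base. The sketch below assumes SI base units (seconds, meters) and hand-picked conversion factors; none of these names come from the chapter.

```python
# A sketch of scale-independent quantities: every duration is stored in
# seconds and every length in meters, so facts about nanosecond gate delays
# and minute-long repairs can be compared directly. Names are illustrative.

TIME_UNITS = {"ns": 1e-9, "ms": 1e-3, "s": 1.0, "min": 60.0}
LENGTH_UNITS = {"angstrom": 1e-10, "mm": 1e-3, "m": 1.0}

def duration(value, unit):
    """Return the duration in seconds."""
    return value * TIME_UNITS[unit]

def length(value, unit):
    """Return the length in meters."""
    return value * LENGTH_UNITS[unit]

gate_delay = duration(2, "ns")        # circuit-timing analysis
repair_time = duration(45, "min")     # labor-cost estimation
trace_width = length(9000, "angstrom")

assert repair_time > gate_delay       # the two timescales compare directly
```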
We should say up front that the enterprise of general ontological engineering has so far had
only limited success. None of the top AI applications (as listed in Chapter 1) make use of a
shared ontology—they all use special-purpose knowledge engineering. Social/political
considerations can make it difficult for competing parties to agree on an ontology. As Tom
Gruber (2004) says, “Every ontology is a treaty—a social agreement—among people with
some common motive in sharing.” When competing concerns outweigh the motivation for
sharing, there can be no common ontology. Those ontologies that do exist have been created
along four routes:
1. By a team of trained ontologists/logicians, who architect the ontology and write axioms.
The CYC system was mostly built this way (Lenat and Guha, 1990).
2. By importing categories, attributes, and values from an existing database or databases.
DBPEDIA was built by importing structured facts from Wikipedia (Bizer et al., 2007).
3. By parsing text documents and extracting information from them. TEXTRUNNER
was built by reading a large corpus of Web pages (Banko and Etzioni, 2008).
4. By enticing unskilled amateurs to enter commonsense knowledge. The OPENMIND
system was built by volunteers who proposed facts in English (Singh et al., 2002;
Chklovski and Gil, 2005).