An Introduction To Multisensor Data Fusion

Invited Paper

Multisensor data fusion is an emerging technology applied to Department of Defense (DoD) areas such as automated target recognition, battlefield surveillance, and guidance and control of autonomous vehicles, and to non-DoD applications such as monitoring of complex machinery, medical diagnosis, and smart buildings. Techniques for multisensor data fusion are drawn from a wide range of areas including artificial intelligence, pattern recognition, and statistical estimation. This paper provides a tutorial on data fusion, introducing data fusion applications, process models, and identification of applicable techniques. Comments are made on the state-of-the-art in data fusion.

I. INTRODUCTION

In recent years, multisensor data fusion has received significant attention for both military and nonmilitary applications. Data fusion techniques combine data from multiple sensors, and related information from associated databases, to achieve improved accuracies and more specific inferences than could be achieved by the use of a single sensor alone [1]–[4]. The concept of multisensor data fusion is hardly new. Humans and animals have evolved the capability to use multiple senses to improve their ability to survive. For example, it may not be possible to assess the quality of an edible substance based solely on the sense of vision or touch, but evaluation of edibility may be achieved using a combination of sight, touch, smell, and taste. Similarly, while one is unable to see around corners or through vegetation, the sense of hearing can provide advance warning of impending dangers. Thus multisensory data fusion is naturally performed by animals and humans to achieve more accurate assessment of the surrounding environment and identification of threats, thereby improving their chances of survival.

While the concept of data fusion is not new, the emergence of new sensors, advanced processing techniques, and improved processing hardware makes real-time fusion of data increasingly possible [5], [6]. Just as the advent of symbolic processing computers (viz., the SYMBOLICS computer and the Lambda machine) in the early 1970's provided an impetus to artificial intelligence [119], recent advances in computing and sensing have provided the ability to emulate, in hardware and software, the natural data fusion capabilities of humans and animals. Currently, data fusion systems are used extensively for target tracking, automated identification of targets, and limited automated reasoning applications. Spurred by significant expenditures by the Department of Defense (DoD), data fusion technology has rapidly advanced from a loose collection of related techniques to an emerging, true engineering discipline with standardized terminology (see Fig. 1), collections of robust mathematical techniques [2]–[4], and established system design principles. Software for data fusion applications is becoming available in the commercial marketplace [16].

Applications for multisensor data fusion are widespread. Military applications include automated target recognition (e.g., for smart weapons), guidance for autonomous vehicles, remote sensing, battlefield surveillance, and automated threat recognition systems, such as identification-friend-foe-neutral (IFFN) systems [14]. Nonmilitary applications include monitoring of manufacturing processes, condition-based maintenance of complex machinery, robotics [129], and medical applications. Techniques to combine or fuse data are drawn from a diverse set of more traditional disciplines including digital signal processing, statistical estimation, control theory, artificial intelligence, and classic numerical methods [12], [16], [54]. Historically, data fusion methods were developed primarily for military applications. However, in recent years these methods have been applied to civilian applications, and there has been bidirectional technology transfer [5]. Various annual conferences provide a forum for discussing data fusion applications and techniques [7]–[10].

Manuscript received April 23, 1996; revised October 14, 1996.
D. L. Hall is with the Applied Research Laboratory, The Pennsylvania State University, University Park, PA 16802 USA (e-mail: [email protected]).
J. Llinas is with the State University of New York, Buffalo, NY 14260 USA (e-mail: [email protected]).
Publisher Item Identifier S 0018-9219(97)00775-5.

In principle, fusion of multisensor data provides significant advantages over single source data. In addition to the statistical advantage gained by combining same-source data (e.g., obtaining an improved estimate of a physical phenomenon via redundant observations), the use of multiple types of sensors may increase the accuracy with which a quantity can be observed and characterized. In the accompanying
Fig. 2 [2], a simple example is provided of a moving object, such as an aircraft, observed by both a pulsed radar and an infrared imaging sensor. The radar provides the ability to accurately determine the aircraft's range, but has a limited ability to determine the angular direction of the aircraft. By contrast, the infrared imaging sensor can accurately determine the aircraft's angular direction, but is unable to measure range. If these two observations are correctly associated (as shown in the central part of the figure), then the combination of the two sensors' data provides a better determination of location than could be obtained by either of the two independent sensors. This results in a reduced error region, as shown in the fused or combined location estimate. A similar effect may be obtained in determining the identity of an object based on observations of an object's attributes. For example, there is evidence that bats identify their prey by a combination of factors that include size, texture (based on acoustic signature), and kinematic behavior.

The most fundamental characterization of data fusion involves a hierarchical transformation between observed energy or parameters (provided by multiple sources as
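The error-reduction effect of combining two independent sensor estimates can be sketched numerically. The fragment below is an illustrative sketch only (the estimate values and variances are invented, and a real system would fuse full range/angle measurements rather than a single coordinate): it combines two independent Gaussian estimates of the same quantity by inverse-variance weighting, and the fused variance is always smaller than either input variance.

```python
def fuse_estimates(x1, var1, x2, var2):
    """Fuse two independent Gaussian estimates of the same quantity
    using inverse-variance (maximum-likelihood) weighting."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused_var = 1.0 / (w1 + w2)            # always < min(var1, var2)
    fused_x = fused_var * (w1 * x1 + w2 * x2)
    return fused_x, fused_var

# Hypothetical cross-range estimates (km): the radar has large angular
# uncertainty, the infrared imager has small angular uncertainty.
x, var = fuse_estimates(10.4, 4.0, 10.1, 0.25)
print(x, var)
```

Note that the fused estimate is pulled toward the more accurate (infrared) value, mirroring the reduced error region of Fig. 2.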
include antisubmarine warfare systems to support Navy tactical fleet operations (Fig. 6), and automated systems to guide autonomous vehicles. Sensor suites may include radar, sonar, electronic intelligence (ELINT), observation of communications traffic (COMINT), infrared, and synthetic aperture radar (SAR) observations [100]. The surveillance area for ocean surveillance may encompass hundreds of square nautical miles, with a focus on air, surface, and subsurface targets. Multiple surveillance platforms may also be involved, with numerous targets tracked. Challenges to ocean surveillance involve the large surveillance volume, the combination of targets and sensors, and the complex signal propagation environment, especially for underwater sonar sensing. An example of an ocean surveillance system is shown in Fig. 6.

Air-to-air and surface-to-air defense systems have been developed by the military to detect, track, and identify aircraft and anti-aircraft weapons and sensors. These defense systems use sensors such as radar, passive electronic support measures (ESM), infrared, identification-friend-foe (IFF) sensors, electro-optic image sensors, and visual (human) sightings. These systems support counter-air, order-of-battle aggregation, assignment of aircraft to raids, target prioritization, route planning, and other activities. Challenges to these data fusion systems include enemy countermeasures, the need for rapid decision making, and potentially large combinations of target-sensor pairings. A special challenge for IFF systems is the need to confidently and noncooperatively identify enemy aircraft. The proliferation of weapon systems throughout the world, and the resulting lack of relationship between the nationality of weapon origin and the combatants who use the weaponry, causes increased IFF challenges.

Another application, Battlefield Intelligence, Surveillance, and Target Acquisition, attempts to detect and identify potential ground targets. Examples include the location of land mines and automatic target recognition of high-value targets. Sensors include airborne surveillance via Moving Target Indicator (MTI) radar, synthetic aperture radar, passive electronic support measures, photo reconnaissance, ground-based acoustic sensors, remotely piloted vehicles, electro-optic sensors, and infrared sensors. Key inferences sought are information to support battlefield situation assessment and threat assessment, and course-of-action estimation.

A detailed discussion of DoD data fusion applications can be found in the collected annual Proceedings of the Data Fusion Systems Conference [7], the Proceedings of the National Symposium on Sensor Fusion [9], and various strategic documents [15].

III. NONMILITARY APPLICATIONS OF DATA FUSION

A second broad community which addresses data fusion problems is the academic/commercial/industrial community. This diverse group addresses problems such as the implementation of robotics, automated control of industrial manufacturing systems, development of smart buildings, and medical applications (see Fig. 7), among other evolving applications. As with the military applications, each of these applications has particular challenges, sensor suites, and implementation environments.

Remote sensing systems have been developed to identify and locate entities and objects. Examples include systems to monitor agricultural resources (e.g., the productivity and health of crops), to locate natural resources, and to monitor weather and natural disasters. These systems rely primarily
on image systems using multispectral sensors. Such processing systems are dominated by automatic and multispectral image processing. The multispectral imagery employed includes the Landsat satellite system, the SPOT system, or others. A frequently used technique for multisensor image fusion involves adaptive neural networks. Multi-image data are processed on a pixel-by-pixel basis and input to a neural network to automatically classify the contents of the image. False colors are usually associated with types of crops, vegetation, or classes of objects. The resulting false-color synthetic image is readily interpreted by human analysts. A key challenge in multi-image data fusion is interimage co-registration. This problem requires the alignment of two or more photos so that the images are overlaid in such a way that corresponding picture elements (pixels) on each picture represent the same location on earth (each pixel represents the same direction from an observer's point of view). This co-registration problem is exacerbated by the fact that image sensors are nonlinear and perform a complex transformation between observed three-dimensional (3-D) space and a two-dimensional (2-D) image plane [86].

A second application area which spans both military and nonmilitary users is the monitoring of complex mechanical equipment such as turbomachinery, helicopter gear-trains, or industrial manufacturing equipment. For a drivetrain application, for example, available sensor data may include accelerometers, temperature gauges, oil debris monitors, acoustic sensors, and even infrared measurements. An on-line condition monitoring system would seek to combine these observations in order to identify precursors to failure, such as abnormal wear of gears, shaft misalignment, or bearing failure. It is anticipated that the use of such condition-based monitoring would reduce maintenance costs and improve safety and reliability [126]. Such systems are beginning to be developed for helicopters and other high-cost systems. Special difficulties for data fusion involve noncommensurate sensors and challenging signal propagation and noise environments.

A final example of a data fusion system for nonmilitary applications is the area of medical diagnosis. Currently, increasingly sophisticated sensors are being developed for medical applications. Sensors such as nuclear magnetic resonance (NMR) devices, acoustic imaging devices, and medical tests individually provide improvements in medical diagnostic capability. The ability to fuse these data together promises to improve diagnostic capability and reduce false diagnoses. A clear challenge here is the signal propagation environment, and difficulties in obtaining training data for adaptive techniques such as neural networks.

Military and nonmilitary communities are beginning to share information to create real technology transfer across application domains. For example, the first International Conference on Multi-Sensor Fusion and Integration for Intelligent Systems was sponsored by the IEEE and held in Las Vegas, NV, on 2–5 October 1994 [10]. Also, annual (on-going) SPIE conferences focus on non-DoD applications [8].

IV. A DATA FUSION PROCESS MODEL

One of the historical barriers to technology transfer in data fusion has been the lack of a unifying terminology which crosses application-specific boundaries. Even within military applications, related but different applications such as IFF systems, battlefield surveillance, and automatic target recognition have used different definitions for fundamental terms such as correlation and data fusion. In order to improve communications among military researchers and system developers, the Joint Directors of Laboratories (JDL) Data Fusion Working Group, established in 1986, began an effort to codify the terminology related to data fusion. The result of that effort was the creation of a process model for data fusion and a Data Fusion Lexicon [11], [12]. The top level of the JDL data fusion process model is shown in Fig. 8. The JDL process model is a functionally oriented model of data fusion and is intended to be very general and useful across multiple application
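The co-registration problem described above reduces, in its simplest form, to a coordinate mapping between images. The sketch below is illustrative only (the transform coefficients are invented, and real co-registration must also handle the nonlinear 3-D-to-2-D sensor geometry noted above): it maps a pixel location in one image to the corresponding location in a second image through an affine transform.

```python
def affine_map(pixel, A, t):
    """Map a (row, col) pixel location through the affine transform
    p' = A @ p + t, the simplest model for aligning two images."""
    r, c = pixel
    a11, a12, a21, a22 = A
    return (a11 * r + a12 * c + t[0],
            a21 * r + a22 * c + t[1])

# Hypothetical alignment: slight shear plus a translation.
A = (1.0, 0.02, -0.02, 1.0)
t = (5.0, -3.0)
print(affine_map((100.0, 200.0), A, t))
```

In practice, the transform coefficients are estimated from control points visible in both images; each fused pixel pair is then taken from corresponding mapped locations.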
database management. This collection of functions provides access to, and management of, data fusion databases, including data retrieval, storage, archiving, compression, relational queries, and data protection. Database management for data fusion systems is particularly difficult because of the large and varied data managed (i.e., images, signal data, vectors, textual data) and the data rates, both for ingestion of incoming sensor data and for rapid retrieval.

A summary of the JDL data fusion process components is shown in Fig. 9. Each of these components can be hierarchically broken down into subprocesses; the first-level decomposition and associated applicable problem-solving techniques are shown in Fig. 10. For example, Level 1 processing is subdivided into four types of functions: data alignment; data/object correlation; object positional, kinematic, and attribute estimation; and finally, object identity estimation. The object positional, kinematic, and attribute estimation function is further subdivided into system models, defined optimization criteria, optimization approaches, and basic processing approach. At this lowest level in the hierarchy (shown in the third column of Fig. 10), specific methods such as Kalman filters, alpha-beta filters, and multiple hypothesis trackers are identified to perform each function.

The JDL model described here is generic and is intended merely as a basis for common understanding and discussion. The separation of processes into Levels 1–4 is an artificial partition; implementation of real data fusion systems integrates and interleaves these functions into an overall processing flow. The data fusion process model is augmented by a hierarchical taxonomy which identifies categories of techniques and algorithms for performing the identified functions. In addition, an associated lexicon has been developed to provide a consistent definition of data fusion terminology [11]. The JDL model, while originally developed for military applications, is clearly applicable to nonmilitary applications. For example, in condition-based monitoring, the concept of Level 3 threat refinement can be associated with the identification of potential system mechanical faults (and their anticipated progression). Thus, the JDL model is useful for nonmilitary applications. Indeed, the JDL model terminology is beginning to experience wide utilization and acceptance throughout the data fusion technical community.

It should be noted, however, that there have been a number of extensions to the JDL model, as well as discussion about its overall utility. Waltz [86], for example, demonstrated that the JDL model does not adequately address multi-image fusion problems, and described how the model can be extended to include concepts of fusion of image data, especially those involving complex synthetic aperture imagery. Hall and Ogrodnik [127] extended the model further to account for complex meta sensors (e.g., sensors involving multiple components and utilization of wideband processing techniques). Bowman [128] has argued that the JDL model is useful, but does not help in developing an architecture for a real system; he has developed the concept of a hierarchical data fusion tree to partition fusion problems into nodes, each conceptually involving functions such as data association, correlation, and estimation.

V. ARCHITECTURES FOR MULTISENSOR DATA FUSION

One of the key issues in developing a multisensor data fusion system is the question of where in the data flow to actually combine or fuse the data. We will focus on two situations for Level 1 fusion: 1) fusion of locational information (such as observed range, azimuth, and elevation) to determine the position and velocity of a moving object, and 2) fusion of parametric data (such as radar cross section, infrared spectra, etc.) to determine the identity of an observed object. We will discuss these two cases separately, though in an actual system fusion of locational and parametric identity information could be performed in an integrated fashion.
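As a concrete instance of the tracking methods named in the JDL hierarchy (e.g., Kalman and alpha-beta filters), the fragment below sketches a fixed-gain alpha-beta filter. It is an illustrative sketch only: the gains and measurements are invented, and an operational tracker would typically use a full Kalman filter with modeled process and measurement noise covariances.

```python
def alpha_beta_track(measurements, dt, alpha, beta):
    """Fixed-gain alpha-beta tracking filter: predict position and
    velocity forward one step, then correct each with a fixed
    fraction of the measurement residual."""
    x, v = 0.0, 0.0
    history = []
    for z in measurements:
        x_pred = x + v * dt          # predict one step ahead
        r = z - x_pred               # measurement residual
        x = x_pred + alpha * r       # position correction
        v = v + (beta / dt) * r      # velocity correction
        history.append((x, v))
    return history

# Hypothetical noisy range measurements of a target moving ~1 unit/step.
est = alpha_beta_track([1.0, 2.1, 2.9, 4.05, 5.0], dt=1.0,
                       alpha=0.85, beta=0.3)
print(est[-1])
```

The gains alpha and beta trade responsiveness against noise smoothing; the Kalman filter can be viewed as computing such gains optimally and adaptively from the noise statistics.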
There are three broad alternatives to fusing locational information to determine the position and velocity of an object: 1) fusion of the raw observational data (so-called data-level or report-level fusion), 2) fusion of state vectors (in this case, a state vector is an optimum estimate, using an individual sensor's measurements, of the position and velocity of an observed object), and 3) a hybrid approach which allows fusion of either raw data or state vector data, as desired. These alternative fusion architectures are illustrated in Fig. 11.

The top part of the figure shows fusion of raw observational data. Data from each sensor (or each different sensor type) are aligned to transform the sensor data from sensor-based units and coordinates to convenient coordinates and units for central processing. The data are then associated/correlated to determine which sensor
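The alignment step just described, transforming sensor-based coordinates into a common frame for central processing, can be sketched as follows. This is an illustrative fragment only (the report values are invented): it converts a range/azimuth/elevation observation into Cartesian coordinates.

```python
import math

def polar_to_cartesian(rng, azimuth, elevation):
    """Convert a range/azimuth/elevation observation (angles in
    radians) into Cartesian (x, y, z) in a common reference frame."""
    x = rng * math.cos(elevation) * math.cos(azimuth)
    y = rng * math.cos(elevation) * math.sin(azimuth)
    z = rng * math.sin(elevation)
    return x, y, z

# Hypothetical radar report: 100 km range, 30 deg azimuth, 5 deg elevation.
obs = polar_to_cartesian(100.0, math.radians(30.0), math.radians(5.0))
print(obs)
```

A full alignment step would also translate each sensor's local frame to a shared origin and reconcile units and time tags before association.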
Fig. 12. Alternate architectures for multisensor identity fusion.
universal architecture which is applicable to all situations or applications. The architectures shown here provide a range of possible implementations.

A. Knowledge-Based Methods for Data Fusion

Interpretation of fused data for situation assessment or threat assessment requires automated reasoning techniques drawn from the field of artificial intelligence. In particular, knowledge-based systems (KBS) or expert systems have been developed to interpret the results of Level 1 processing systems, analyzing issues such as the context in which the data are observed, the relationships among observed entities, hierarchical groupings of targets or objects, and predictions of future actions of targets or entities. Such reasoning is normally performed by humans, but may be approximated by automated reasoning techniques. The reader is referred to the artificial intelligence literature for more details. A frequently applied approach for data fusion involves the use of so-called blackboard KBS [122]. These systems partition the problem into related subproblems and use interacting reasoning techniques to solve the component problems, with an evolving solution obtained by combining the results for each subproblem. This is analogous to how human experts might gather around a blackboard and solve a problem (hence the name of the KBS architecture). An example of a blackboard architecture is shown in Fig. 13.

Regardless of the specific KBS technique used, three elements are required: 1) one or more knowledge representation schemes, 2) an automated inference/evaluation process, and 3) control schemes. Knowledge representation schemes are techniques for representing facts, logical relationships, procedural knowledge, and uncertainty. Many techniques have been developed for knowledge representation, including production rules, frames, semantic networks, scripts, and others. For each of these techniques, uncertainty in the observed data and the logical relationships can be represented using probability, fuzzy set theory, Dempster–Shafer evidential intervals, or other methods. The goal in building an automated reasoning system is to capture the reasoning capability of a human expert by
specifying the rules, frames, scripts, etc. which represent the essence of the interpretive task. In contrast to the highly numerical fusion processes at Level 1, the fusion of data and information at these higher levels of inference is largely (but not exclusively) conducted at the symbolic level. Thus, in general, applications can require a mixture of numerical and symbolic processing.

Given a knowledge base, an inference or evaluation process must be developed to utilize the knowledge. Formal schemes have been developed based on formal logic, fuzzy logic, probabilistic reasoning, template methods, case-based reasoning, and many other techniques. Each of these automated reasoning schemes has an internally consistent formalism which prescribes how to utilize the knowledge base (i.e., the rules, frames, etc.) to obtain a resulting conclusion or inference.

Finally, automated reasoning requires a control scheme to implement the reasoning process. Techniques include search methods (e.g., searching a knowledge base to identify applicable rules), reason maintenance systems, assumption-based and justification-based truth maintenance, hierarchical decomposition, control theory, etc. Each of these schemes involves assumptions and an approach for controlling the evolving reasoning process. Control schemes direct the search through a knowledge base in order to exploit and fuse the multisensor, dynamic data.

The combination of selected knowledge representation scheme(s), inference/evaluation process, and control scheme is used to achieve automated reasoning. Popular techniques are rule-based KBS and, more recently, fuzzy logic based techniques. There are numerous prototype expert systems for data fusion, and readily available commercial expert system development tools to help the rapid prototyping of such an expert data fusion system. Key issues for developing such a system include the creation of the knowledge base (i.e., actually specifying the rules, frames, scripts, etc. via a knowledge engineering process), and the test and evaluation of such a system. Despite these difficulties, such systems are increasingly being developed for data fusion.

B. Assessment of the State-of-the-Art

The technology of multisensor data fusion is rapidly evolving. There is much concurrent ongoing research to develop new algorithms, improve existing algorithms, and understand how to assemble these techniques into an overall architecture to address diverse data fusion applications. A brief assessment of the state-of-the-art is provided here and shown in Fig. 14.

The most mature area of data fusion processing is Level 1 processing: using multisensor data to determine the position, velocity, attributes, and identity of individual objects or entities. In particular, determining the position and velocity of an object based on multiple sensor observations is a relatively old problem. Gauss and Legendre developed the method of least squares for the particular problem of orbit determination for asteroids [66], [68]. Numerous mathematical techniques exist to perform coordinate transformations, associate observations-to-observations or observations-to-tracks, and estimate the position and velocity of a target. Multisensor target tracking is dominated by sequential estimation techniques such as the Kalman filter. Challenges in this area involve circumstances in which there is a dense target environment, rapidly maneuvering targets, or complex signal propagation environments (e.g., involving multipath propagation, co-channel interference, or clutter). Single-target tracking in high signal-to-noise environments, for dynamically well-behaved (i.e., dynamically predictable) targets, is straightforward. Current research focuses on solving the correlation and maneuvering-target problem for the more complex multisensor, multitarget cases. Techniques such as multiple hypothesis tracking (MHT), probabilistic data association methods, random set theory [81], [82], and multiple criteria optimization theory [76] are all being used to resolve these issues. Some researchers are utilizing multiple techniques simultaneously, guided by a knowledge-based system to select the appropriate solution based on algorithm performance.

A special problem in Level 1 processing is achieving robustness in the automatic identification of targets based on observed characteristics or attributes. At this time, object recognition is dominated by feature-based methods in which a feature vector (i.e., a representation of the sensor data) is mapped into feature space with the hope of identifying the target based on the location of the feature vector relative to a priori determined decision boundaries. Popular
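The feature-based identification approach just described can be sketched as mapping a feature vector into feature space and selecting the class whose region contains it. The fragment below is an illustrative sketch only (the feature definitions and class centroids are invented): it uses a minimum-distance, nearest-centroid decision rule, one of the simplest ways to induce decision boundaries in feature space.

```python
import math

def nearest_centroid(feature, centroids):
    """Classify a feature vector by its closest class centroid
    (Euclidean distance); the implied decision boundaries are the
    hyperplanes equidistant between centroid pairs."""
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
    return min(centroids, key=lambda label: dist(feature, centroids[label]))

# Hypothetical (radar cross section, infrared intensity) class centroids.
centroids = {"fighter": (2.0, 8.0), "transport": (30.0, 3.0)}
print(nearest_centroid((4.0, 7.0), centroids))
```

In operational systems the decision boundaries are instead learned from training data, e.g., by the neural network and statistical classifiers discussed in this paper.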