Multisensor Data Fusion
David L. Hall and James Llinas
CONTENTS
1.1 Introduction
1.2 Multisensor Advantages
1.3 Military Applications
1.4 Nonmilitary Applications
1.5 Three Processing Architectures
1.6 Data Fusion Process Model
1.7 Assessment of the State-of-the-Art
1.8 Dirty Secrets in Data Fusion
1.9 Additional Information
References
1.1 Introduction
Over the past two decades, significant attention has been focused on multisensor data
fusion for both military and nonmilitary applications. Data fusion techniques combine
data from multiple sensors and related information to achieve more specific inferences
than could be achieved by using a single, independent sensor. Data fusion refers to the
combination of data from multiple sensors (either of the same or different types), whereas
information fusion refers to the combination of data and information from sensors, human
reports, databases, etc.
The concept of multisensor data fusion is hardly new. As humans and animals evolved,
they developed the ability to use multiple senses to help them survive. For example, assess-
ing the quality of an edible substance may not be possible using only the sense of vision;
the combination of sight, touch, smell, and taste is far more effective. Similarly, when
vision is limited by structures and vegetation, the sense of hearing can provide advanced
warning of impending dangers. Thus, multisensory data fusion is naturally performed by
animals and humans to assess more accurately the surrounding environment and to iden-
tify threats, thereby improving their chances of survival. Interestingly, recent applications
of data fusion1 have combined data from an artificial nose and an artificial tongue using
neural networks and fuzzy logic.
Although the concept of data fusion is not new, the emergence of new sensors, advanced
processing techniques, improved processing hardware, and wideband communications
has made real-time fusion of data increasingly viable. Just as the advent of symbolic processing computers (e.g., the Symbolics computer and the Lambda machine) in the early 1980s provided an impetus to artificial intelligence, the recent advances in computing and
sensing have provided the capability to emulate, in hardware and software, the natural
data fusion capabilities of humans and animals. Currently, data fusion systems are used
extensively for target tracking, automated identification of targets, and limited automated
reasoning applications. Data fusion technology has rapidly advanced from a loose collection of related techniques to an emerging engineering discipline with standardized terminology, a collection of robust mathematical techniques, and established system design
principles. Indeed, the remaining chapters of this handbook provide an overview of these
techniques, design principles, and example applications.
Applications for multisensor data fusion are widespread. Military applications include
automated target recognition (e.g., for smart weapons), guidance for autonomous vehi-
cles, remote sensing, battlefield surveillance, and automated threat recognition (e.g.,
identification-friend-foe-neutral [IFFN] systems). Military applications have also extended
to condition monitoring of weapons and machinery, to monitoring of the health status of
individual soldiers, and to assistance in logistics. Nonmilitary applications include moni-
toring of manufacturing processes, condition-based maintenance of complex machinery,
environmental monitoring, robotics, and medical applications.
Techniques to combine or fuse data are drawn from a diverse set of more traditional dis-
ciplines, including digital signal processing, statistical estimation, control theory, artificial
intelligence, and classic numerical methods. Historically, data fusion methods were devel-
oped primarily for military applications. However, in recent years, these methods have
been applied to civilian applications and a bidirectional transfer of technology has begun.
FIGURE 1.1
A moving object observed by both a pulsed radar and an infrared imaging sensor (FLIR). The radar target report carries slant-range and azimuth uncertainty, the FLIR target report carries azimuth and elevation uncertainty, and the intersection of the two absolute uncertainty regions defines the fused location estimate.
Combining the radar's accurate range measurement with the infrared sensor's accurate angular measurement provides a better determination of location than could be obtained by either of the two independent sensors. This results in a reduced error region, as shown in the fused or combined location estimate. A similar effect may be obtained in determining the identity of an object on the basis of the observations of an object's attributes. For example, there is evidence that bats identify their prey by a combination of factors, including size, texture (based on acoustic signature), and kinematic behavior. Interestingly, just as humans may use spoofing techniques to confuse sensor systems, some moths confuse bats by emitting sounds similar to those emitted by the bat closing in on prey (see http://www.desertmuseum.org/books/nhsd_moths.html, downloaded on October 4, 2007).
In ocean surveillance, multiple platforms may be involved and numerous targets can be tracked. Challenges to ocean surveillance involve the large surveillance volume, the combination of targets and sensors, and the complex signal propagation environment, especially for underwater sonar sensing.
lance system is shown in Figure 1.2.
Air-to-air and surface-to-air defense systems have been developed by the military to
detect, track, and identify aircraft and antiaircraft weapons and sensors. These defense
TABLE 1.1
Representative Data Fusion Applications for Defense Systems (columns: applications; specific inferences sought by the data fusion process; primary observable data; surveillance volume; sensor platforms)
FIGURE 1.2
An example of an ocean surveillance system.
systems use sensors such as radar, passive electronic support measures (ESM), infrared, identification-friend-foe (IFF) sensors, electrooptic image sensors, and visual (human)
sightings. These systems support counterair, order-of-battle aggregation, assignment of
aircraft to raids, target prioritization, route planning, and other activities. Challenges to
these data fusion systems include enemy countermeasures, the need for rapid decision
making, and potentially large combinations of target-sensor pairings. A special challenge
for IFF systems is the need to confidently and noncooperatively identify enemy aircraft.
The proliferation of weapon systems throughout the world has resulted in little correlation
between the national origin of a weapon and the combatants who use the weapon.
Finally, battlefield intelligence, surveillance, and target acquisition systems attempt to
detect and identify potential ground targets. Examples include the location of land mines
and automatic target recognition. Sensors include airborne surveillance via SAR, passive
ESM, photo-reconnaissance, ground-based acoustic sensors, remotely piloted vehicles,
electrooptic sensors, and infrared sensors. Key inferences sought are information to sup-
port battlefield situation assessment and threat assessment.
TABLE 1.2
Representative Nondefense Data Fusion Applications

Condition-based maintenance
  Specific inferences sought by data fusion process: detection and characterization of system faults; recommendations for maintenance/corrections
  Primary observable data: EM signals, acoustic signals, magnetic, temperatures, x-rays, lubricant debris, vibration
  Surveillance volume: microscopic to hundreds of feet
  Sensor platforms: ships, aircraft, ground-based (e.g., factories)

Robotics
  Specific inferences sought by data fusion process: object location/recognition; guide the locomotion of robot (e.g., hands and feet)
  Primary observable data: television, acoustic signals, EM signals, x-rays
  Surveillance volume: microscopic to tens of feet about the robot
  Sensor platforms: robot body

Medical diagnoses
  Specific inferences sought by data fusion process: location/identification of tumors, abnormalities, and disease
  Primary observable data: x-rays, nuclear magnetic resonance (NMR), temperature, infrared, visual inspection, chemical and biological data, self-reports of symptoms by humans
  Surveillance volume: human body volume
  Sensor platforms: laboratory

Environmental monitoring
  Specific inferences sought by data fusion process: identification/location of natural phenomena (e.g., earthquakes, weather)
  Primary observable data: synthetic aperture radar (SAR), seismic, EM radiation, core samples, chemical and biological data
  Surveillance volume: hundreds of miles; miles (site monitoring)
  Sensor platforms: satellites, aircraft, ground-based, underground samples
Remote sensing systems have been developed to identify and locate entities and objects.
Examples include systems to monitor agricultural resources (e.g., to monitor the productivity
and health of crops), locate natural resources, and monitor weather and natural disasters.
These systems rely primarily on image systems using multispectral sensors, and their processing is dominated by automatic image processing. Multispectral imagery, such as that from the Landsat satellite system (http://www.bsrsi.msu.edu/) and the SPOT system (see http://www.spotimage.fr/web/en/167-satellite-image-spot-formosat-2-kompsat-2-radar.php), is commonly used. A technique frequently used for multisensor image fusion involves adaptive neural networks. Multi-image data are processed on a pixel-by-pixel basis and input to a neural network to classify automatically the contents of the image. False colors are usually associated with types of crops, vegetation, or classes of objects, which human analysts can readily interpret.
FIGURE 1.3
Mechanical diagnostic test-bed used by The Pennsylvania State University to perform condition-based main-
tenance research.
FIGURE 1.4
(a) Direct fusion of sensor data. (b) Representation of sensor data via feature vectors, with subsequent fusion
of the feature vectors. (c) Processing of each sensor to achieve high-level inferences or decisions, which are
subsequently combined.
If the multisensor data are commensurate (i.e., if the sensors are measuring the same
physical phenomena such as two visual image sensors or two acoustic sensors) then
the raw sensor data can be directly combined. Techniques for raw data fusion typically
involve classic estimation methods such as Kalman filtering.2 Conversely, if the sensor
data are noncommensurate then the data must be fused at the feature/state vector level
or decision level.
Feature-level fusion involves the extraction of representative features from sensor data.
An example of feature extraction is the cartoonist's use of key facial characteristics to represent the human face. This technique, which is popular among political satirists, uses key features to evoke recognition of famous figures. Evidence confirms that humans utilize
a feature-based cognitive function to recognize objects. In the case of multisensor feature-
level fusion, features are extracted from multiple sensor observations and combined into a
single concatenated feature vector that is an input to pattern recognition techniques such
as neural networks, clustering algorithms, or template methods.
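To make this pipeline concrete, the minimal Python sketch below concatenates features extracted from two simulated sensors and classifies the joint vector with a nearest-prototype rule standing in for the neural network, clustering, or template methods just mentioned. The extract_features function, the class prototypes, and all numeric values are illustrative assumptions, not material from the text.

```python
# A minimal sketch of feature-level fusion under assumed data: per-sensor
# features are concatenated into one joint vector and classified.
import numpy as np

rng = np.random.default_rng(42)

def extract_features(raw):
    """Stand-in feature extractor: summary statistics of a raw signal."""
    return np.array([raw.mean(), raw.std()])

# Simulated raw observations of one object from two noncommensurate sensors
acoustic = rng.normal(2.0, 0.5, 256)
infrared = rng.normal(7.0, 1.0, 256)

# Concatenate per-sensor features into a single joint feature vector
joint = np.concatenate([extract_features(acoustic), extract_features(infrared)])

# Hypothetical class prototypes in the joint feature space (assumed values)
prototypes = {
    "wheeled vehicle": np.array([2.0, 0.5, 7.0, 1.0]),
    "tracked vehicle": np.array([3.5, 1.2, 4.0, 0.8]),
}

# Nearest-prototype decision rule standing in for a trained classifier
label = min(prototypes, key=lambda c: np.linalg.norm(joint - prototypes[c]))
print("declared identity:", label)
```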
Decision-level fusion combines sensor information after each sensor has made a preliminary determination of an entity's location, attributes, and identity. Examples of decision-level fusion methods include weighted decision methods (voting techniques), classical inference, Bayesian inference, and the Dempster-Shafer method.
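As a concrete illustration of the Bayesian variant, the hedged sketch below fuses two sensors' identity declarations with Bayes' rule under an assumed conditional-independence model; the class set and all probability values are invented for illustration.

```python
# A hedged sketch of decision-level fusion via Bayesian combination: each
# sensor reports a posterior over identities, and, assuming the sensors are
# conditionally independent, the fused posterior is the normalized product.
import numpy as np

classes = ["friend", "foe", "neutral"]
p_sensor_a = np.array([0.6, 0.3, 0.1])   # identity declaration from sensor A
p_sensor_b = np.array([0.5, 0.4, 0.1])   # identity declaration from sensor B
prior = np.array([1/3, 1/3, 1/3])        # uniform prior over identities

# Product rule: divide by the prior so it is not counted twice
fused = p_sensor_a * p_sensor_b / prior
fused /= fused.sum()                     # renormalize to a probability vector

print(dict(zip(classes, np.round(fused, 3))))
```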
FIGURE 1.5
Joint Directors of Laboratories process model for data fusion. (The diagram shows sources and human-computer interaction connected to the fusion process, supported by support and fusion databases, with level 4 process refinement monitoring the overall process.)
For each of these subprocesses, the hierarchical JDL model identifies specific functions and categories of techniques (in the model's second layer) and specific techniques (in the model's lowest layer). Implementation of data fusion systems integrates and interleaves
these functions into an overall processing flow.
The data fusion process model is augmented by a hierarchical taxonomy that identi-
fies categories of techniques and algorithms for performing the identified functions. An
associated lexicon has been developed to provide a consistent definition of data fusion
terminology. The JDL model is described in more detail in Chapters 2 and 3, and by Hall
and McMullen.3
1.7 Assessment of the State-of-the-Art

Level 1 fusion relies on estimation methods that date back to Gauss's use of least squares for determining the orbits of asteroids.2 Numerous mathematical techniques exist
for performing coordinate transformations, associating observations to observations or
to tracks, and estimating the position and velocity of a target. Multisensor target tracking
is dominated by sequential estimation techniques such as the Kalman filter. Challenges
in this area involve circumstances in which there is a dense target environment, rapidly
maneuvering targets, or complex signal propagation environments (e.g., involving mul-
tipath propagation, cochannel interference, or clutter). However, single-target tracking in
excellent signal-to-noise environments for dynamically well-behaved (i.e., dynamically
predictable) targets is a straightforward, easily resolved problem.
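The minimal Python sketch below shows the predict/update cycle of a linear Kalman filter for such a well-behaved target under a constant-velocity model; the noise covariances, initial state, and simulated measurements are assumptions for illustration only.

```python
# A minimal sketch of a linear Kalman filter tracking one target with a
# constant-velocity model; all numeric values are illustrative assumptions.
import numpy as np

dt = 1.0                                   # sample interval (s)
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state transition
H = np.array([[1.0, 0.0]])                 # the sensor observes position only
Q = 0.01 * np.eye(2)                       # process-noise covariance (assumed)
R = np.array([[0.5]])                      # measurement-noise covariance (assumed)

x = np.array([[0.0], [1.0]])               # initial state: position, velocity
P = np.eye(2)                              # initial state covariance

def kalman_step(x, P, z):
    """One predict/update cycle for a position measurement z."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - H @ x_pred                     # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

# Simulated noisy position reports from a single sensor
rng = np.random.default_rng(0)
for k in range(1, 11):
    z = np.array([[k * 1.0 + rng.normal(0, 0.7)]])
    x, P = kalman_step(x, P, z)
print("estimated position, velocity:", x.ravel())
```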
Current research focuses on solving the assignment and maneuvering target problem.
Techniques such as multiple-hypothesis tracking (MHT) and its extensions, probabilistic
data association methods, random set theory, and multiple criteria optimization theory
are being used to resolve these issues. Recent studies have also focused on relaxing
the assumptions of the Kalman filter using techniques such as particle filters and other
methods. Some researchers are utilizing multiple techniques simultaneously, guided by
a knowledge-based system capable of selecting the appropriate solution on the basis of
algorithm performance.
A special problem in level 1 processing involves the automatic identification of targets
on the basis of observed characteristics or attributes. To date, object recognition has been
dominated by feature-based methods in which a feature vector (i.e., a representation of the
sensor data) is mapped into feature space with the hope of identifying the target on the
basis of the location of the feature vector relative to a priori determined decision bound-
aries. Popular pattern recognition techniques include neural networks, statistical classi-
fiers, and vector machine approaches. Although numerous techniques are available, the
ultimate success of these methods relies on the selection of good features. (Good features
provide excellent class separability in feature space, whereas bad features result in greatly
overlapping feature space areas for several classes of target.) More research is needed in
this area to guide the selection of features and to incorporate explicit knowledge about
target classes. For example, syntactic methods provide additional information about the
makeup of a target. In addition, some limited research is proceeding to incorporate con-
textual informationsuch as target mobility with respect to terrainto assist in target
identification.
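One simple way to quantify "good" versus "bad" features in the sense described above is a per-feature Fisher discriminant ratio, sketched below on synthetic two-class data; the data and the choice of this particular separability score are illustrative assumptions.

```python
# A hedged sketch of scoring features by class separability using a
# per-feature Fisher discriminant ratio; the two-class data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
# Feature 0 separates the classes well; feature 1 overlaps heavily.
class_a = np.column_stack([rng.normal(0.0, 1.0, 200), rng.normal(0.0, 1.0, 200)])
class_b = np.column_stack([rng.normal(4.0, 1.0, 200), rng.normal(0.2, 1.0, 200)])

def fisher_ratio(a, b):
    """(Difference of class means)^2 / (sum of class variances), per feature."""
    return (a.mean(0) - b.mean(0)) ** 2 / (a.var(0) + b.var(0))

scores = fisher_ratio(class_a, class_b)
print("Fisher ratios per feature:", scores)   # feature 0 scores far higher
```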
Level 2 and level 3 fusion (situation refinement and threat refinement) is currently dominated by knowledge-based methods such as rule-based blackboard systems, intelligent agents, and Bayesian belief network formulations. These areas are relatively immature
and have numerous prototypes, but few robust, operational systems. The main challenge
in this area is to establish a viable knowledge base of rules, frames, scripts, or other meth-
ods to represent knowledge about situation assessment or threat assessment. Unfortu-
nately, only primitive cognitive models exist to replicate the human performance of these
functions. Much research is needed before reliable and large-scale knowledge-based sys-
tems can be developed for automated situation assessment and threat assessment. New
approaches that offer promise are the use of fuzzy logic and hybrid architectures, which extend the concept of blackboard systems to hierarchical and multiple time-scale orientations. Also, recent work by Yen and his associates4 on team-based intelligent agents
appears promising. These agents emulate the way human teams collaborate, proactively
exchanging information and anticipating information needs.
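A minimal sketch of the rule-based style of reasoning used in such knowledge-based systems is shown below: a forward-chaining loop fires if-then rules over a small fact base until no new inferences appear. The facts and rules are hypothetical stand-ins for a real situation-assessment knowledge base.

```python
# A minimal sketch of forward-chaining rule-based inference for situation
# assessment; facts and rules are hypothetical illustrations.
facts = {"emitter_detected", "emitter_near_border", "track_heading_inbound"}

# Each rule pairs a set of premises with a conclusion.
rules = [
    ({"emitter_detected", "emitter_near_border"}, "possible_surveillance_site"),
    ({"possible_surveillance_site", "track_heading_inbound"}, "elevated_threat"),
]

# Fire rules repeatedly until no new facts can be inferred.
changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))   # includes the inferred 'elevated_threat'
```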
Level 4 processing, which assesses and improves the performance and operation of an
ongoing data fusion process, has a mixed maturity. For single-sensor operations, tech-
niques from operations research and control theory have been applied to develop effec-
tive systems, even for complex single sensors such as phased array radars. By contrast,
situations that involve multiple sensors, external mission constraints, dynamic observing
environments, and multiple targets are more challenging. To date, considerable difficulty
has been encountered in attempting to model and incorporate mission objectives and
constraints to balance optimized performance with limited resources, such as computing
power and communication bandwidth (e.g., between sensors and processors), and other
effects. Methods from utility theory are being applied to develop measures of system per-
formance and effectiveness. Knowledge-based systems are being developed for context-
based approximate reasoning. Significant improvements would result from the advent of
smart, self-calibrating sensors, which can accurately and dynamically assess their own
performance. The advent of distributed network-centric environments, in which sensing
resources, communications capabilities, and information requests are very dynamic, cre-
ates serious challenges for level 4 fusion. It is difficult (or possibly impossible) to optimize
resource utilization in such an environment. In a recent study, Mullen et al.5 have applied
concepts of market-based auctions to dynamically allocate resources, treating sensors and
communication systems as suppliers of services, and users and algorithms as consumers, to
rapidly assess how to allocate system resources to satisfy the consumers of information.
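The toy Python sketch below illustrates the market-based idea: consumers (users or algorithms) bid for sensor services, and a greedy auction awards each sensor to its highest bidder. The consumer names, sensors, and bid values are invented; a real implementation would use the richer auction protocols described by Mullen et al.5

```python
# A toy sketch of market-based sensor allocation: a greedy single-item
# auction awards each sensor to the highest-bidding consumer. All names
# and bid values are illustrative assumptions.
bids = [
    # (consumer, sensor, bid reflecting expected information value)
    ("tracker",    "radar_1", 0.9),
    ("classifier", "radar_1", 0.6),
    ("tracker",    "eo_cam",  0.4),
    ("classifier", "eo_cam",  0.8),
]

allocation = {}
for consumer, sensor, value in sorted(bids, key=lambda b: -b[2]):
    if sensor not in allocation:          # each sensor serves one consumer
        allocation[sensor] = (consumer, value)

print(allocation)  # {'radar_1': ('tracker', 0.9), 'eo_cam': ('classifier', 0.8)}
```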
Data fusion has suffered from a lack of rigor with regard to the test and evaluation of
algorithms and the means of transitioning research findings from theory to application.
The data fusion community must insist on high standards for algorithm development, test,
and evaluation; creation of standard test cases; and systematic evolution of the technology
to meet realistic applications. On a positive note, the introduction of the JDL process model
and the emerging nonmilitary applications are expected to result in increased cross-
discipline communication and research. The nonmilitary research in robotics, condition-based maintenance, industrial process control, transportation, and intelligent buildings should produce innovations that cross-fertilize the entire field of data fusion technology. The challenges and opportunities related to data fusion establish it as an exciting
research field with numerous applications.
1.8 Dirty Secrets in Data Fusion

Hall and Steinberg6 previously identified a number of "dirty secrets" of data fusion; the items below revisit that list.

There is still no substitute for a good sensor (and a good human to interpret the results). This means that if something cannot actually be observed, or inferred from observable effects, then no amount of data fusion from multiple sensors will overcome
this problem. This problem becomes even more challenging as threats change. The
transition from the search for well-known physical targets (e.g., weapon systems,
emitters, etc.) to targets based on human networks causes obvious issues with
determining what can and should be observed. In particular, trying to determine
intent is tantamount to mind reading, and is an elusive problem.
Downstream processing still cannot absolve upstream sins (or lack of attention to the data). It is clear that we must do the best processing possible at every step of
the fusion/inference process. For example, it is necessary to perform appropriate
image and signal processing at the data stage, followed by appropriate transforma-
tions to extract feature vectors, etc., for feature-based identity processing. Failure
to perform the appropriate data processing or failure to select and refine effective
feature vectors cannot be overcome by choosing complex pattern recognition tech-
niques. We simply must pay attention at every stage of the information chain, from
energy detection to knowledge creation.
Not only may the fused result be worse than the best sensor, but failure to address pedigree, information overload, and uncertainty may really foul things up. The rapid introduction of new sensors and the use of humans as soft sensors (reporters) in network operations place special challenges on determining how to weight the incoming data. Failure to correctly assess the accuracy of the sensor/input data will lead to biases and errors in the fused results (a minimal numeric illustration follows this list). The advent of networked oper-
ations and service-oriented architectures (SOA) can exacerbate this problem by
rapidly disseminating data and information without understanding the sources or
pedigree (who did what to the data).
There are still no magic algorithms. This book provides an overview of numerous
algorithms and techniques for all levels of fusion. Although there are increasingly
sophisticated algorithms, it is always a challenge to match the algorithm with the
actual state of knowledge of the data, system, and inferences to be made. No single
algorithm is ideal under all circumstances.
There will never be enough training data. However, hybrid methods that combine implicit and explicit information can help. It is well known that pattern recognition methods, such as neural networks, require training data to establish the key weights. When seeking to map an n-dimensional feature vector to one of m classes or categories, we need in general on the order of n × m × (10 to 30) training examples under a variety of observing conditions; mapping a 10-element feature vector to one of 5 classes, for example, would call for roughly 500 to 1,500 labeled examples. This can be very challenging to obtain, especially with
dynamically changing threats. Hence, in general, there will never be enough train-
ing data available to satisfy the mathematical conditions for pattern recognition
techniques. However, new hybrid methods that use a combination of sample data,
model-based data, and human subject explicit information can assist in this area.
We have started at the wrong end (viz., at the sensor side rather than the human side of fusion). Finally, we note that extensive research has been conducted to develop methods for level 0 and level 1 fusion. In essence, we have started at the data
side or sensor inputs to progress toward the human side. More research needs
to be conducted in which we begin at the human side (viz., at the formation of
hypotheses or semantic interpretation of events) and proceed toward the sensing
side of fusion. Indeed, the introduction of the level 5 process was recognition of
this need.
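The numeric illustration promised above: a minimal sketch of inverse-variance weighted fusion of two sensor reports, showing how an overstated sensor accuracy drags the fused estimate toward the unreliable source. All values are assumed.

```python
# A hedged sketch of why input accuracy matters: fusing two reports of the
# same quantity by inverse-variance weighting. Misjudging a sensor's
# variance biases the fused estimate toward the unreliable source.
import numpy as np

z = np.array([10.0, 12.0])          # reports from two sensors
var_true = np.array([1.0, 9.0])     # sensor 2 is actually much noisier
var_claimed = np.array([1.0, 0.5])  # ...but its accuracy is overstated

def fuse(z, var):
    """Minimum-variance linear fusion: weights proportional to 1/variance."""
    w = (1.0 / var) / np.sum(1.0 / var)
    return np.sum(w * z)

print("fused with true variances:   ", fuse(z, var_true))     # near sensor 1
print("fused with claimed variances:", fuse(z, var_claimed))  # pulled to sensor 2
```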
The original issues identified (viz., that fusion is not a static process, and that the benefits
of fusion processing are difficult to quantify) still hold true.
Overall, this is an exciting time for the field of data fusion. The rapid advances and pro-
liferation of sensors, the global spread of wireless communications, and the rapid improve-
ments in computer processing and data storage enable new applications and methods to
be developed.
1.9 Additional Information

The following sources provide additional information on data fusion:

D.L. Hall, Mathematical Techniques in Multisensor Data Fusion, Artech House, Inc. (1992). Provides details on the mathematical and heuristic techniques for data fusion.

E. Waltz and J. Llinas, Multisensor Data Fusion, Artech House, Inc. (1990). Presents an excellent overview of data fusion, especially for military applications.

L.A. Klein, Sensor and Data Fusion Concepts and Applications, SPIE Optical Engineering Press, Vol. TT 14 (1993). Presents an abbreviated introduction to data fusion.

R. Antony, Principles of Data Fusion Automation, Artech House, Inc. (1995). Provides a discussion of data fusion processes with special focus on database issues to achieve computational efficiency.

Introduction to Data Fusion: A Multimedia Computer-Based Training Package, Artech House, Inc., Boston, MA (1995).
References
1. T. Sundic, S. Marco, J. Samitier, and P. Wide, Electronic tongue and electronic nose data fusion in classification with neural networks and fuzzy logic based models, IEEE, 3, 1474–1480, 2000.
2. H.W. Sorenson, Least-squares estimation: From Gauss to Kalman, IEEE Spectrum, 7, 63–68, July 1970.
3. D. Hall and S.A.H. McMullen, Mathematical Techniques in Multisensor Data Fusion, Artech House, Inc., Boston, MA, 2004.
4. G. Airy, P.-C. Chen, X. Fan, J. Yen, D. Hall, M. Brogan, and T. Huynh, Collaborative RPD agents assisting decision making in active decision spaces, in Proceedings of the 2006 IEEE/WIC/ACM International Conference on Intelligent Agent Technology (IAT'06), December 2006.
5. T. Mullen, V. Avasarala, and D.L. Hall, Customer-driven sensor management, IEEE Intelligent Systems, Special Issue on Self-Management through Self-Organization in Information Systems, March/April 2006, 41–49.
6. D.L. Hall and A. Steinberg, Dirty secrets in multisensor data fusion, Proceedings of the National Symposium on Sensor Data Fusion (NSSDF), San Antonio, TX, June 2000.