1

Multisensor Data Fusion

David L. Hall and James Llinas

CONTENTS
1.1 Introduction
1.2 Multisensor Advantages
1.3 Military Applications
1.4 Nonmilitary Applications
1.5 Three Processing Architectures
1.6 Data Fusion Process Model
1.7 Assessment of the State-of-the-Art
1.8 Dirty Secrets in Data Fusion
1.9 Additional Information
References

1.1 Introduction
Over the past two decades, significant attention has been focused on multisensor data
fusion for both military and nonmilitary applications. Data fusion techniques combine
data from multiple sensors and related information to achieve more specific inferences
than could be achieved by using a single, independent sensor. Data fusion refers to the
combination of data from multiple sensors (either of the same or different types), whereas
information fusion refers to the combination of data and information from sensors, human
reports, databases, etc.
The concept of multisensor data fusion is hardly new. As humans and animals evolved,
they developed the ability to use multiple senses to help them survive. For example, assess-
ing the quality of an edible substance may not be possible using only the sense of vision;
the combination of sight, touch, smell, and taste is far more effective. Similarly, when
vision is limited by structures and vegetation, the sense of hearing can provide advanced
warning of impending dangers. Thus, multisensory data fusion is naturally performed by
animals and humans to assess more accurately the surrounding environment and to iden-
tify threats, thereby improving their chances of survival. Interestingly, recent applications
of data fusion [1] have combined data from an artificial nose and an artificial tongue using
neural networks and fuzzy logic.
Although the concept of data fusion is not new, the emergence of new sensors, advanced
processing techniques, improved processing hardware, and wideband communications
has made real-time fusion of data increasingly viable. Just as the advent of symbolic pro-
cessing computers (e.g., the Symbolics computer and the Lambda machine) in the early
1970s provided an impetus to artificial intelligence, the recent advances in computing and
sensing have provided the capability to emulate, in hardware and software, the natural
data fusion capabilities of humans and animals. Currently, data fusion systems are used
extensively for target tracking, automated identification of targets, and limited automated
reasoning applications. Data fusion technology has rapidly advanced from a loose collec-
tion of related techniques to an emerging true engineering discipline with standardized
terminology, collection of robust mathematical techniques, and established system design
principles. Indeed, the remaining chapters of this handbook provide an overview of these
techniques, design principles, and example applications.
Applications for multisensor data fusion are widespread. Military applications include
automated target recognition (e.g., for smart weapons), guidance for autonomous vehi-
cles, remote sensing, battlefield surveillance, and automated threat recognition (e.g.,
identification-friend-foe-neutral [IFFN] systems). Military applications have also extended
to condition monitoring of weapons and machinery, to monitoring of the health status of
individual soldiers, and to assistance in logistics. Nonmilitary applications include moni-
toring of manufacturing processes, condition-based maintenance of complex machinery,
environmental monitoring, robotics, and medical applications.
Techniques to combine or fuse data are drawn from a diverse set of more traditional dis-
ciplines, including digital signal processing, statistical estimation, control theory, artificial
intelligence, and classic numerical methods. Historically, data fusion methods were devel-
oped primarily for military applications. However, in recent years, these methods have
been applied to civilian applications and a bidirectional transfer of technology has begun.

1.2 Multisensor Advantages


Fused data from multiple sensors provide several advantages over data from a single
sensor. First, if several identical sensors are used (e.g., identical radars tracking a moving
object), combining the observations would result in an improved estimate of the target
position and velocity. A statistical advantage is gained by adding the N independent
observations (e.g., the estimate of the target location or velocity is improved by a factor
proportional to N^(1/2)), assuming the data are combined in an optimal manner. The same
result could also be obtained by combining N observations from an individual sensor.
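To make the N^(1/2) claim concrete, the following Python sketch (illustrative only; the range and noise values are assumed, not from the chapter) simulates fusing N independent, equally accurate range measurements by simple averaging, the optimal combination when the errors are independent and identically distributed:

import numpy as np

rng = np.random.default_rng(0)
true_range_m = 1000.0     # hypothetical true target range
sigma_m = 10.0            # per-sensor measurement noise (1-sigma)

for n_sensors in (1, 4, 16):
    # Monte Carlo: many trials of fusing n_sensors independent observations
    trials = rng.normal(true_range_m, sigma_m, size=(100_000, n_sensors))
    fused = trials.mean(axis=1)  # optimal combination for equal variances
    print(f"N={n_sensors:2d}: empirical std = {fused.std():5.2f} m, "
          f"predicted sigma/sqrt(N) = {sigma_m / np.sqrt(n_sensors):5.2f} m")

The empirical spread of the fused estimate matches the predicted sigma/sqrt(N): quadrupling the number of sensors halves the error.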
The second advantage is that the relative placement or motion of multiple sensors
can be used to improve the observation process. For example, two sensors that measure
angular directions to an object can be coordinated to determine the position of the object
by triangulation. This technique is used in surveying and for commercial navigation (e.g.,
VHF omni-directional range [VOR]). Similarly, two sensors, one moving in a known way with
respect to the other, can be used to measure instantaneously an object's position and velocity
with respect to the observing sensors.
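As a minimal sketch of the triangulation idea (the sensor positions, bearings, and helper function below are hypothetical, not from the text), two angle-only sensors at known positions can locate an object by intersecting their bearing lines:

import numpy as np

def triangulate(p1, theta1, p2, theta2):
    """Intersect two bearing lines. p1, p2 are sensor positions (x, y);
    theta1, theta2 are bearings in radians measured from the +x axis."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])  # unit vector along bearing 1
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the scalar ranges t1, t2.
    A = np.column_stack([d1, -d2])
    t1, _ = np.linalg.solve(A, np.asarray(p2) - np.asarray(p1))
    return np.asarray(p1) + t1 * d1

# Two sensors 10 km apart both observe a target located at (6 km, 8 km).
target = triangulate((0.0, 0.0), np.arctan2(8, 6), (10.0, 0.0), np.arctan2(8, -4))
print(target)  # -> approximately [6. 8.]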
The third advantage gained using multiple sensors is improved observability. Broaden-
ing the baseline of physical observables can result in significant improvements. Figure 1.1
provides a simple example of a moving object, such as an aircraft, that is observed by
both a pulsed radar and a forward-looking infrared (FLIR) imaging sensor. The radar can
accurately determine the aircraft's range but has a limited ability to determine the angular
direction of the aircraft. By contrast, the infrared imaging sensor can accurately determine
the aircraft's angular direction but cannot measure the range. If these two observations are
correctly associated (as shown in Figure 1.1), the combination of the two sensors provides a

FIGURE 1.1
A moving object observed by both a pulsed radar and an infrared imaging sensor. The radar report carries a small slant-range uncertainty but large azimuth and elevation uncertainties along its line of sight; the FLIR report carries small angular uncertainties but no range information. The intersection of the two absolute uncertainty regions yields a much smaller combined uncertainty region around the target.

better determination of location than could be obtained by either of the two independent
sensors. This results in a reduced error region, as shown in the fused or combined location
estimate. A similar effect may be obtained in determining the identity of an object on the
basis of the observations of an object's attributes. For example, there is evidence that bats
identify their prey by a combination of factors, including size, texture (based on acoustic
signature), and kinematic behavior. Interestingly, just as humans may use spoofing tech-
niques to confuse sensor systems, some moths confuse bats by emitting sounds similar to
those emitted by the bat closing in on prey (see http://www.desertmuseum.org/books/
nhsd_moths.html, downloaded on October 4, 2007).
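The complementary radar/FLIR geometry of Figure 1.1 can be sketched numerically. In the minimal example below (all numbers are assumed for illustration), the two azimuth estimates are fused by inverse-variance weighting, the minimum-variance rule for independent Gaussian measurements of the same quantity, while the range is taken from the radar alone:

import numpy as np

def fuse_gaussian(x1, var1, x2, var2):
    """Minimum-variance fusion of two independent scalar estimates."""
    var_f = 1.0 / (1.0 / var1 + 1.0 / var2)
    x_f = var_f * (x1 / var1 + x2 / var2)
    return x_f, var_f

radar_az_deg, radar_az_var = 30.4, 4.0   # radar: coarse azimuth (deg, deg^2)
flir_az_deg, flir_az_var = 30.05, 0.01   # FLIR: fine azimuth
az, az_var = fuse_gaussian(radar_az_deg, radar_az_var, flir_az_deg, flir_az_var)
print(f"fused azimuth = {az:.2f} deg, sigma = {np.sqrt(az_var):.3f} deg")
# Range comes from the radar alone; the fused (range, azimuth) location has
# the small range error of the radar and the small angular error of the FLIR,
# i.e., the reduced intersection region shown in Figure 1.1.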

1.3 Military Applications


The Department of Defense (DoD) community focuses on problems involving the location,
characterization, and identification of dynamic entities such as emitters, platforms, weap-
ons, and military units. These dynamic data are often termed an order-of-battle database or
order-of-battle display (if superimposed on a map display). Beyond achieving an order-of-
battle database, DoD users seek higher-level inferences about the enemy situation (e.g., the
relationships among entities and their relationships with the environment and higher-level
enemy organizations). Examples of DoD-related applications include ocean surveillance,
air-to-air defense, battlefield intelligence, surveillance and target acquisition, and strategic
warning and defense. Each of these military applications involves a particular focus, a sen-
sor suite, a desired set of inferences, and a unique set of challenges, as shown in Table 1.1.
Ocean surveillance systems are designed to detect, track, and identify ocean-based tar-
gets and events. Examples include antisubmarine warfare systems to support navy tactical
fleet operations and automated systems to guide autonomous vehicles. Sensor suites can
include radar, sonar, electronic intelligence (ELINT), observation of communications traffic,
infrared, and synthetic aperture radar (SAR) observations. The surveillance volume for
ocean surveillance may encompass hundreds of nautical miles and focus on air, surface,
and subsurface targets. Multiple surveillance platforms can be involved and numerous
targets can be tracked. Challenges to ocean surveillance involve the large surveillance
volume, the combination of targets and sensors, and the complex signal propagation
environment, especially for underwater sonar sensing. An example of an ocean surveil-
lance system is shown in Figure 1.2.
Air-to-air and surface-to-air defense systems have been developed by the military to
detect, track, and identify aircraft and antiaircraft weapons and sensors. These defense

TABLE 1.1
Representative Data Fusion Applications for Defense Systems

Ocean surveillance
  Inferences sought by data fusion process: detection, tracking, and identification of targets and events
  Primary observable data: electromagnetic (EM) signals, acoustic signals, nuclear-related, derived observations
  Surveillance volume: hundreds of nautical miles; air, surface, and subsurface
  Sensor platforms: ships, aircraft, submarines, ground-based, ocean-based

Air-to-air and surface-to-air defense
  Inferences sought by data fusion process: detection, tracking, and identification of aircraft
  Primary observable data: EM radiation
  Surveillance volume: hundreds of miles (strategic), miles (tactical)
  Sensor platforms: ground-based, aircraft

Battlefield intelligence, surveillance, and target acquisition
  Inferences sought by data fusion process: detection and identification of potential ground targets
  Primary observable data: EM radiation
  Surveillance volume: tens to hundreds of miles about a battlefield
  Sensor platforms: ground-based, aircraft

Strategic warning and defense
  Inferences sought by data fusion process: detection of indications of impending strategic actions; detection and tracking of ballistic missiles and warheads
  Primary observable data: EM radiation, nuclear-related
  Surveillance volume: global
  Sensor platforms: satellites, aircraft

FIGURE 1.2
An example of an ocean surveillance system: aircraft radar detects the presence of the submarine; a ship receives the observation data and fuses it with a reference database to identify the submarine; and coordinated input from multiple sonars is used to track the submarine.

systems use sensors such as radar, passive electronic support measures (ESM), infrared
identification-friend-foe (IFF) sensors, electrooptic image sensors, and visual (human)
sightings. These systems support counterair, order-of-battle aggregation, assignment of
aircraft to raids, target prioritization, route planning, and other activities. Challenges to
these data fusion systems include enemy countermeasures, the need for rapid decision
making, and potentially large combinations of target-sensor pairings. A special challenge
for IFF systems is the need to confidently and noncooperatively identify enemy aircraft.
The proliferation of weapon systems throughout the world has resulted in little correlation
between the national origin of a weapon and the combatants who use the weapon.
Finally, battlefield intelligence, surveillance, and target acquisition systems attempt to
detect and identify potential ground targets. Examples include the location of land mines
and automatic target recognition. Sensors include airborne surveillance via SAR, passive
ESM, photo-reconnaissance, ground-based acoustic sensors, remotely piloted vehicles,
electrooptic sensors, and infrared sensors. Key inferences sought are information to sup-
port battlefield situation assessment and threat assessment.

1.4 Nonmilitary Applications


Other groups addressing data fusion problems are the academic, commercial, and indus-
trial communities. These communities address nonmilitary applications such as the implementation of
robotics, automated control of industrial manufacturing systems, development of smart
buildings, and medical applications. As with military applications, each of these applica-
tions has a particular set of challenges and sensor suites, and a specific implementation
environment (see Table 1.2).

TABLE 1.2
Representative Nondefense Data Fusion Applications

Condition-based maintenance
  Inferences sought by data fusion process: detection and characterization of system faults; recommendations for maintenance/corrections
  Primary observable data: EM signals, acoustic signals, magnetic, temperatures, x-rays, lubricant debris, vibration
  Surveillance volume: microscopic to hundreds of feet
  Sensor platforms: ships, aircraft, ground-based (e.g., factories)

Robotics
  Inferences sought by data fusion process: object location/recognition; guiding the locomotion of the robot (e.g., hands and feet)
  Primary observable data: television, acoustic signals, EM signals, x-rays
  Surveillance volume: microscopic to tens of feet about the robot
  Sensor platforms: robot body

Medical diagnoses
  Inferences sought by data fusion process: location/identification of tumors, abnormalities, and disease
  Primary observable data: x-rays, nuclear magnetic resonance (NMR), temperature, infrared, visual inspection, chemical and biological data, self-reports of symptoms by humans
  Surveillance volume: human body volume
  Sensor platforms: laboratory

Environmental monitoring
  Inferences sought by data fusion process: identification/location of natural phenomena (e.g., earthquakes, weather)
  Primary observable data: synthetic aperture radar (SAR), seismic, EM radiation, core samples, chemical and biological data
  Surveillance volume: hundreds of miles; miles (site monitoring)
  Sensor platforms: satellites, aircraft, ground-based, underground samples


Remote sensing systems have been developed to identify and locate entities and objects.
Examples include systems to monitor agricultural resources (e.g., to monitor the productivity
and health of crops), locate natural resources, and monitor weather and natural disasters.
These systems rely primarily on image systems using multispectral sensors. Such processing
systems are dominated by automatic image processing. Multispectral imagery, such as the
Landsat satellite system (http://www.bsrsi.msu.edu/) and the SPOT system, is used (see
http://www.spotimage.fr/web/en/167-satellite-image-spot-formosat-2-kompsat-2-radar.
php). A technique frequently used for multisensor image fusion involves adaptive neural
networks. Multiimage data are processed on a pixel-by-pixel basis and input to a neural
network to automatically classify the contents of the image. False colors are usually asso-
ciated with types of crops, vegetation, or classes of objects. Human analysts can readily
interpret the resulting false color synthetic image.


A key challenge in multiimage data fusion is coregistration. This problem requires the
alignment of two or more photos so that the images are overlaid in such a way that cor-
responding picture elements (pixels) on each picture represent the same location on earth
(i.e., each pixel represents the same direction from an observer's point of view). This coreg-
istration problem is exacerbated by the fact that image sensors are nonlinear and they
perform a complex transformation between the observed three-dimensional space and a
two-dimensional image.
A second application area, which spans both military and nonmilitary users, is the mon-
itoring of complex mechanical equipment such as turbo machinery, helicopter gear trains,
or industrial manufacturing equipment. For a drivetrain application, for example, sensor
data can be obtained from accelerometers, temperature gauges, oil debris monitors, acous-
tic sensors, and infrared measurements. An online condition-monitoring system would
seek to combine these observations to identify precursors to failure such as abnormal gear
wear, shaft misalignment, or bearing failure. The use of such condition-based monitoring
is expected to reduce maintenance costs and improve safety and reliability. Such systems
are beginning to be developed for helicopters and other platforms (see Figure 1.3).

FIGURE 1.3
Mechanical diagnostic test-bed used by The Pennsylvania State University to perform condition-based maintenance research: a 30 HP drive motor and a 75 HP load coupled through a gearbox, with torque cells on either side of the gearbox.


1.5 Three Processing Architectures


Three basic alternatives can be used to fuse multisensor data: (1) direct fusion of sensor data;
(2) representation of sensor data via feature vectors, with subsequent fusion of the feature
vectors; or (3) processing of each sensor to achieve high-level inferences or decisions, which
are subsequently combined. Each of these approaches utilizes different fusion techniques
as described and shown in Figures 1.4a through 1.4c.

FIGURE 1.4
(a) Direct fusion of sensor data: observations from sensors A through N undergo association and data-level fusion, followed by feature extraction and identity declaration to produce a joint identity declaration. (b) Representation of sensor data via feature vectors, with subsequent fusion of the feature vectors: features extracted from each sensor are associated and combined by feature-level fusion to produce a joint identity declaration. (c) Processing of each sensor to achieve high-level inferences or decisions, which are subsequently combined: each sensor performs feature extraction and its own identity declaration (I/D A through I/D N), and the declarations are combined by declaration-level fusion into a joint identity declaration.


If the multisensor data are commensurate (i.e., if the sensors are measuring the same
physical phenomena such as two visual image sensors or two acoustic sensors) then
the raw sensor data can be directly combined. Techniques for raw data fusion typically
involve classic estimation methods such as Kalman filtering [2]. Conversely, if the sensor
data are noncommensurate then the data must be fused at the feature/state vector level
or decision level.
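A minimal sketch of such raw-data fusion follows (the motion model and noise parameters are assumed for illustration, not taken from the chapter): a one-dimensional constant-velocity Kalman filter performs one predict step and then sequentially updates with position reports from two commensurate radars:

import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
Q = 0.01 * np.eye(2)                    # process noise covariance
H = np.array([[1.0, 0.0]])              # both radars measure position only

x = np.array([0.0, 10.0])               # state: [position, velocity]
P = np.eye(2)                           # state covariance

def update(x, P, z, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Predict, then fuse both radar reports of the same target position.
x, P = F @ x, F @ P @ F.T + Q
for z, R in ((np.array([10.3]), np.array([[0.5]])),   # radar A report
             (np.array([ 9.8]), np.array([[0.8]]))):  # radar B report
    x, P = update(x, P, z, R)
print(x, np.sqrt(P[0, 0]))  # fused state and 1-sigma position uncertainty

Processing the two reports sequentially is equivalent to a single batch update here, and the fused position uncertainty is smaller than either radar's alone.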
Feature-level fusion involves the extraction of representative features from sensor data.
An example of feature extraction is the cartoonist's use of key facial characteristics to rep-
resent the human face. This technique, which is popular among political satirists, uses
key features to evoke recognition of famous figures. Evidence confirms that humans utilize
a feature-based cognitive function to recognize objects. In the case of multisensor feature-
level fusion, features are extracted from multiple sensor observations and combined into a
single concatenated feature vector that is an input to pattern recognition techniques such
as neural networks, clustering algorithms, or template methods.
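The following sketch shows the mechanics of feature-level fusion (the feature definitions are placeholders assumed for illustration): features extracted from two noncommensurate sensors are concatenated into a single vector that would then feed a neural network, clustering algorithm, or template method:

import numpy as np

def acoustic_features(signal):
    # Placeholder features: RMS level and peak spectral magnitude.
    return np.array([np.sqrt(np.mean(signal**2)),
                     np.abs(np.fft.rfft(signal)).max()])

def infrared_features(image):
    # Placeholder features: mean intensity and hot-spot fraction.
    return np.array([image.mean(), (image > 0.8).mean()])

rng = np.random.default_rng(1)
signal = rng.normal(size=1024)   # simulated acoustic sensor output
image = rng.random((64, 64))     # simulated infrared image

# The concatenated vector is the input to the pattern recognition stage.
fused_vector = np.concatenate([acoustic_features(signal),
                               infrared_features(image)])
print(fused_vector.shape)        # -> (4,)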
Decision-level fusion combines sensor information after each sensor has made a prelim-
inary determination of an entity's location, attributes, and identity. Examples of decision-
level fusion methods include weighted decision methods (voting techniques), classical
inference, Bayesian inference, and the Dempster-Shafer method.
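A minimal sketch of one such decision-level method, naive Bayesian fusion (the priors and likelihoods below are assumed for illustration), combines each sensor's independent identity declaration by multiplying likelihoods and normalizing:

import numpy as np

classes = ["friend", "foe", "neutral"]
prior = np.array([0.5, 0.2, 0.3])

# Per-sensor likelihoods P(observation | class) for the same detected object.
radar_likelihood = np.array([0.6, 0.3, 0.1])
esm_likelihood = np.array([0.2, 0.7, 0.1])

posterior = prior * radar_likelihood * esm_likelihood  # independence assumption
posterior /= posterior.sum()
print(dict(zip(classes, posterior.round(3))))
# The fused declaration is the maximum a posteriori class.
print("declared:", classes[int(np.argmax(posterior))])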

1.6 Data Fusion Process Model


One of the historical barriers to technology transfer in data fusion has been the lack of a
unifying terminology that crosses application-specific boundaries. Even within military
applications, related but distinct applications, such as IFF, battlefield surveillance, and
automatic target recognition, used different definitions for fundamental terms such as
correlation and data fusion. To improve communications among military researchers and
system developers, the Joint Directors of Laboratories (JDL) Data Fusion Working Group
(established in 1986) began an effort to codify the terminology related to data fusion. The
result of that effort was the creation of a process model for data fusion and a data fusion
lexicon (shown in Figure 1.5).
The JDL process model, which is intended to be very general and useful across mul-
tiple application areas, identifies the processes, functions, categories of techniques, and
specific techniques applicable to data fusion. The model is a two-layer hierarchy. At the
top level, shown in Figure 1.5, the data fusion process is conceptualized by sensor inputs,
human-computer interaction, database management, source preprocessing, and six key
subprocesses:

Level 0 processing (subobject data association and estimation) is aimed at combining
pixel or signal level data to obtain initial information about an observed target's
characteristics.
Level 1 processing (object refinement) is aimed at combining sensor data to obtain
the most reliable and accurate estimate of an entity's position, velocity, attributes,
and identity (to support prediction estimates of future position, velocity, and
attributes).


FIGURE 1.5
Joint Directors of Laboratories process model for data fusion. Sources feed the data fusion domain, which contains level 0 (signal refinement), level 1 (object refinement), level 2 (situation refinement), and level 3 (threat refinement) processing together with level 4 (process refinement); the domain is supported by a database management system (support database and fusion database) and connected to the user through human-computer interaction.

Level 2 processing (situation refinement) dynamically attempts to develop a descrip-
tion of current relationships among entities and events in the context of their
environment. This entails object clustering and relational analysis such as force
structure and cross-force relations, communications, physical context, etc.
Level 3 processing (significance estimation) projects the current situation into the
future to draw inferences about enemy threats, friend and foe vulnerabilities, and
opportunities for operations (and also consequence prediction, susceptibility, and
vulnerability assessments).
Level 4 processing (process refinement) is a meta-process that monitors the overall
data fusion process to assess and improve real-time system performance. This is
an element of resource management.
Level 5 processing (cognitive refinement) seeks to improve the interaction between a
fusion system and one or more user/analysts. Functions performed include aids
for visualization, cognitive assistance, bias remediation, collaboration, team-based
decision making, course of action analysis, etc.

For each of these subprocesses, the hierarchical JDL model identifies specific functions
and categories of techniques (in the model's second layer) and specific techniques (in the
model's lowest layer). Implementation of data fusion systems integrates and interleaves
these functions into an overall processing flow.
The data fusion process model is augmented by a hierarchical taxonomy that identi-
fies categories of techniques and algorithms for performing the identified functions. An
associated lexicon has been developed to provide a consistent definition of data fusion
terminology. The JDL model is described in more detail in Chapters 2 and 3, and by Hall
and McMullen [3].


1.7 Assessment of the State-of-the-Art


The technology of multisensor data fusion is rapidly evolving. There is much concur-
rent research ongoing to develop new algorithms, to improve existing algorithms, and to
assemble these techniques into an overall architecture capable of addressing diverse data
fusion applications.
The most mature area of the data fusion process is level 1 processing: using multisensor data
to determine the position, velocity, attributes, and identity of individual objects or entities.
Determining the position and velocity of an object on the basis of multiple sensor obser-
vations is a relatively old problem; Gauss and Legendre developed the method of least
squares for determining the orbits of asteroids [2]. Numerous mathematical techniques exist
for performing coordinate transformations, associating observations to observations or
to tracks, and estimating the position and velocity of a target. Multisensor target tracking
is dominated by sequential estimation techniques such as the Kalman filter. Challenges
in this area involve circumstances in which there is a dense target environment, rapidly
maneuvering targets, or complex signal propagation environments (e.g., involving mul-
tipath propagation, cochannel interference, or clutter). However, single-target tracking in
excellent signal-to-noise environments for dynamically well-behaved (i.e., dynamically
predictable) targets is a straightforward, easily resolved problem.
Current research focuses on solving the assignment and maneuvering target problem.
Techniques such as multiple-hypothesis tracking (MHT) and its extensions, probabilistic
data association methods, random set theory, and multiple criteria optimization theory
are being used to resolve these issues. Recent studies have also focused on relaxing
the assumptions of the Kalman filter using techniques such as particle filters and other
methods. Some researchers are utilizing multiple techniques simultaneously, guided by
a knowledge-based system capable of selecting the appropriate solution on the basis of
algorithm performance.
A special problem in level 1 processing involves the automatic identification of targets
on the basis of observed characteristics or attributes. To date, object recognition has been
dominated by feature-based methods in which a feature vector (i.e., a representation of the
sensor data) is mapped into feature space with the hope of identifying the target on the
basis of the location of the feature vector relative to a priori determined decision bound-
aries. Popular pattern recognition techniques include neural networks, statistical classi-
fiers, and support vector machine approaches. Although numerous techniques are available, the
ultimate success of these methods relies on the selection of good features. (Good features
provide excellent class separability in feature space, whereas bad features result in greatly
overlapping feature space areas for several classes of target.) More research is needed in
this area to guide the selection of features and to incorporate explicit knowledge about
target classes. For example, syntactic methods provide additional information about the
makeup of a target. In addition, some limited research is proceeding to incorporate con-
textual information, such as target mobility with respect to terrain, to assist in target
identification.
Level 2 and level 3 fusions (situation refinement and threat refinement) are currently
dominated by knowledge-based methods such as rule-based blackboard systems, intelli-
gent agents, Bayesian belief network formulations, etc. These areas are relatively immature
and have numerous prototypes, but few robust, operational systems. The main challenge
in this area is to establish a viable knowledge base of rules, frames, scripts, or other meth-
ods to represent knowledge about situation assessment or threat assessment. Unfortu-
nately, only primitive cognitive models exist to replicate the human performance of these


functions. Much research is needed before reliable and large-scale knowledge-based sys-
tems can be developed for automated situation assessment and threat assessment. New
approaches that offer promise are the use of fuzzy logic and hybrid architectures, which
extend the concept of blackboard systems to hierarchical and multitime scale orien-
tations. Also, recent work by Yen and his associates [4] on team-based intelligent agents
appears promising. These agents emulate the way human teams collaborate, proactively
exchanging information and anticipating information needs.
Level 4 processing, which assesses and improves the performance and operation of an
ongoing data fusion process, has a mixed maturity. For single-sensor operations, tech-
niques from operations research and control theory have been applied to develop effec-
tive systems, even for complex single sensors such as phased array radars. By contrast,
situations that involve multiple sensors, external mission constraints, dynamic observing
environments, and multiple targets are more challenging. To date, considerable difficulty
has been encountered in attempting to model and incorporate mission objectives and
constraints to balance optimized performance with limited resources, such as computing
power and communication bandwidth (e.g., between sensors and processors), and other
effects. Methods from utility theory are being applied to develop measures of system per-
formance and effectiveness. Knowledge-based systems are being developed for context-
based approximate reasoning. Significant improvements would result from the advent of
smart, self-calibrating sensors, which can accurately and dynamically assess their own
performance. The advent of distributed network-centric environments, in which sensing
resources, communications capabilities, and information requests are very dynamic, cre-
ates serious challenges for level 4 fusion. It is difficult (or possibly impossible) to optimize
resource utilization in such an environment. In a recent study, Mullen et al. [5] have applied
concepts of market-based auctions to dynamically allocate resources, treating sensors and
communication systems as suppliers of services, and users and algorithms as consumers, to
rapidly assess how to allocate system resources to satisfy the consumers of information.
Data fusion has suffered from a lack of rigor with regard to the test and evaluation of
algorithms and the means of transitioning research findings from theory to application.
The data fusion community must insist on high standards for algorithm development, test,
and evaluation; creation of standard test cases; and systematic evolution of the technology
to meet realistic applications. On a positive note, the introduction of the JDL process model
and the emerging nonmilitary applications are expected to result in increased cross-
discipline communication and research. The nonmilitary research in robotics, condition-
based maintenance, industrial process control, transportation, and intelligent buildings
should produce innovations that cross-fertilize the entire field of data fusion tech-
nology. The challenges and opportunities related to data fusion establish it as an exciting
research field with numerous applications.

1.8 Dirty Secrets in Data Fusion


In the first edition of this handbook, a chapter entitled "Dirty Secrets in Data Fusion" was
included. It was based on an article written by Hall and Steinberg [6]. The original article
identified the following seven challenges or issues in data fusion:

1. There is no substitute for a good sensor.
2. Downstream processing cannot absolve the sins of upstream processing.
3. The fused answer may be worse than the best sensor.
4. There are no magic algorithms.
5. There will never be enough training data.
6. It is difficult to quantify the value of data fusion.
7. Fusion is not a static process.

Subsequently, these dirty secrets were revised as follows:

There is still no substitute for a good sensor (and a good human to interpret the
results). This means that if something cannot be actually observed or inferred
from effects, then no amount of data fusion from multiple sensors would overcome
this problem. The problem becomes even more challenging as threats change. The
transition from the search for well-known physical targets (e.g., weapon systems,
emitters, etc.) to targets based on human networks causes obvious issues with
determining what can and should be observed. In particular, trying to determine
intent is tantamount to mind reading, and is an elusive problem.
Downstream processing still cannot absolve upstream sins (or lack of attention to
the data). It is clear that we must do the best processing possible at every step of
the fusion/inference process. For example, it is necessary to perform appropriate
image and signal processing at the data stage, followed by appropriate transforma-
tions to extract feature vectors, etc., for feature-based identity processing. Failure
to perform the appropriate data processing or failure to select and refine effective
feature vectors cannot be overcome by choosing complex pattern recognition tech-
niques. We simply must pay attention at every stage of the information chain, from
energy detection to knowledge creation.
Not only may the fused result be worse than the best sensor, but failure to address
pedigree, information overload, and uncertainty may really foul things up. The
rapid introduction of new sensors and the use of humans as soft sensors (reporters)
in network operations place special challenges on determining how to weight the
incoming data. Failure to accurately assess the accuracy of the sensor/input data
would lead to biases and errors in the fused results. The advent of networked oper-
ations and service-oriented architectures (SOA) can exacerbate this problem by
rapidly disseminating data and information without understanding the sources or
pedigree (who did what to the data).
There are still no magic algorithms. This book provides an overview of numerous
algorithms and techniques for all levels of fusion. Although there are increasingly
sophisticated algorithms, it is always a challenge to match the algorithm with the
actual state of knowledge of the data, system, and inferences to be made. No single
algorithm is ideal under all circumstances.
There will never be enough training data. However, hybrid methods that com-
bine implicit and explicit information can help. It is well known that pattern recog-
nition methods, such as neural networks, require training data to establish the key
weights. When seeking to map an n-dimensional feature vector to one of m classes
or categories, we need in general n × m × (10 to 30) training examples (roughly 10
to 30 per feature per class) under a variety of observing conditions; a 25-feature,
10-class problem, for example, would require on the order of 2,500 to 7,500 labeled
examples. This can be very challenging to obtain, especially with dynamically
changing threats. Hence, in general, there will never be enough training data
available to satisfy the mathematical conditions for pattern recognition
techniques. However, new hybrid methods that use a combination of sample data,
model-based data, and explicit information from human subjects can assist in this area.
We have started at the wrong end (viz., at the sensor side vs. the human side
of fusion). Finally, we note that extensive research has been conducted to develop
methods for level 0 and level 1 fusions. In essence, we have started at the data
side, or sensor inputs, and progressed toward the human side. More research needs
to be conducted in which we begin at the human side (viz., at the formation of
hypotheses or semantic interpretation of events) and proceed toward the sensing
side of fusion. Indeed, the introduction of the level 5 process was recognition of
this need.
The original issues identified (viz., that fusion is not a static process, and that the benefits
of fusion processing are difficult to quantify) still hold true.
Overall, this is an exciting time for the field of data fusion. The rapid advances and pro-
liferation of sensors, the global spread of wireless communications, and the rapid improve-
ments in computer processing and data storage enable new applications and methods to
be developed.

1.9 Additional Information


Additional information about multisensor data fusion may be found in the following
references:

D.L. Hall, Mathematical Techniques in Multisensor Data Fusion, Artech House, Inc.
(1992). Provides details on the mathematical and heuristic techniques for data
fusion.
E. Waltz and J. Llinas, Multisensor Data Fusion, Artech House, Inc. (1990). Presents
an excellent overview of data fusion, especially for military applications.
L.A. Klein, Sensor and Data Fusion Concepts and Applications, SPIE Optical Engineer-
ing Press, Vol. TT 14 (1993). Presents an abbreviated introduction to data fusion.
R. Antony, Principles of Data Fusion Automation, Artech House, Inc. (1995). Provides
a discussion of data fusion processes with special focus on database issues to
achieve computational efficiency.
Introduction to Data Fusion, a multimedia computer-based training package, available
from Artech House, Inc., Boston, MA, 1995.

References
1. T. Sundic, S. Marco, J. Samitier, and P. Wide, Electronic tongue and electronic nose data fusion
in classification with neural networks and fuzzy logic based models, IEEE, 3, 1474-1480, 2000.
2. H.W. Sorenson, Least-squares estimation: From Gauss to Kalman, IEEE Spectrum, 7, 63-68,
July 1970.
3. D. Hall and S.A.H. McMullen, Mathematical Techniques in Multisensor Data Fusion, Artech House
Inc., Boston, MA, 2004.
4. G. Airy, P.-C. Chen, X. Fan, J. Yen, D. Hall, M. Brogan, and T. Huynh, Collaborative RPD agents
assisting decision making in active decision spaces, in Proceedings of the 2006 IEEE/WIC/ACM
International Conference on Intelligent Agent Technology (IAT'06), December 2006.
5. T. Mullen, V. Avasarala, and D.L. Hall, Customer-driven sensor management, IEEE Intelligent
Systems, Special Issue on Self-Management through Self-Organization in Information Systems,
March/April 2006, 41-49.
6. D.L. Hall and A. Steinberg, Dirty secrets in multisensor data fusion, Proceedings of the National
Symposium on Sensor Data Fusion (NSSDF), San Antonio, TX, June 2000.