Introduction To Hyperspectral Image Analysis
Reflectance is the percentage of the light hitting a material that is then reflected by that
material (as opposed to being absorbed or transmitted). A reflectance spectrum shows
the reflectance of a material measured across a range of wavelengths (Fig. 2). Some
materials will reflect certain wavelengths of light, while other materials will absorb the
same wavelengths. These patterns of reflectance and absorption across wavelengths can
uniquely identify certain materials.
Figure 4. Reflectance spectra of the three materials in Figure 2 as they would appear to
the multispectral Landsat 7 ETM+ sensor.
Figure 5. Reflectance spectra of the three materials in Figure 2 as they would appear to
the hyperspectral AVIRIS sensor. The gaps in the spectra are wavelength ranges at
which the atmosphere absorbs so much light that no reliable signal is received from the
surface.
Although most hyperspectral sensors measure hundreds of wavelengths, it is not the
number of measured wavelengths that defines a sensor as hyperspectral. Rather, it is the
narrowness and contiguous nature of the measurements. For example, a sensor that
measured only 20 bands could be considered hyperspectral if those bands were
contiguous and, say, 10 nm wide. If a sensor measured 20 wavelength bands that were,
say, 100 nm wide, or that were separated by non-measured wavelength ranges, the sensor
would no longer be considered hyperspectral.
Standard multispectral image classification techniques were generally developed to
classify multispectral images into broad categories. Hyperspectral imagery provides an
opportunity for more detailed image analysis. For example, using hyperspectral data,
spectrally similar materials can be distinguished, and sub-pixel scale information can be
extracted. To fulfill this potential, new image processing techniques have been
developed.
Most past and current hyperspectral sensors have been airborne (Table 1), with two
recent exceptions: NASA's Hyperion sensor on the EO-1 satellite, and the U.S. Air
Force Research Lab's FTHSI sensor on the MightySat II satellite. Several new space-based hyperspectral sensors have been proposed recently (Table 2). Unlike airborne
sensors, space-based sensors are able to provide near global coverage repeated at regular
intervals. Therefore, the amount of hyperspectral imagery available should increase
significantly in the near future as new satellite-based sensors are successfully launched.
Table 1. Current and Recent Hyperspectral Sensors and Data Providers

Satellite Sensors:

Sensor                 | Manufacturer / Website                                                 | Number of Bands | Spectral Range
FTHSI on MightySat II  | U.S. Air Force Research Lab, www.vs.afrl.af.mil/TechProgs/MightySatII | 256             | 0.35 to 1.05 µm
Hyperion on EO-1       | NASA                                                                   | 220             | 0.4 to 2.5 µm

Airborne Sensors:

Sensor                                                       | Manufacturer / Website                              | Number of Bands | Spectral Range
AVIRIS (Airborne Visible Infrared Imaging Spectrometer)      | NASA Jet Propulsion Laboratory, makalu.jpl.nasa.gov | 224             | 0.4 to 2.5 µm
HYDICE (Hyperspectral Digital Imagery Collection Experiment) |                                                     | 210             | 0.4 to 2.5 µm
PROBE-1                                                      | Earth Search Sciences Inc., www.earthsearch.com     | 128             | 0.4 to 2.5 µm
casi (Compact Airborne Spectrographic Imager)                | ITRES Research Limited, www.itres.com               | up to 228       | 0.4 to 1.0 µm
HyMap                                                        | Integrated Spectronics, www.intspec.com             | 100 to 200      | Visible to thermal infrared
EPS-H (Environmental Protection System)                      | GER Corporation, www.ger.com                        |                 | VIS/NIR (0.43 to 1.05 µm), SWIR1 (1.5 to 1.8 µm), SWIR2 (2.0 to 2.5 µm), and TIR (8 to 12.5 µm)
DAIS 7915 (Digital Airborne Imaging Spectrometer)            | GER Corporation, www.ger.com                        |                 | VIS/NIR (0.43 to 1.05 µm), SWIR1 (1.5 to 1.8 µm), SWIR2 (2.0 to 2.5 µm), MIR (3.0 to 5.0 µm), and TIR (8.7 to 12.3 µm)
DAIS 21115 (Digital Airborne Imaging Spectrometer)           | GER Corporation, www.ger.com                        |                 | VIS/NIR (0.40 to 1.0 µm), SWIR1 (1.0 to 1.8 µm), SWIR2 (2.0 to 2.5 µm), MIR (3.0 to 5.0 µm), and TIR (8.0 to 12.0 µm)
AISA (Airborne Imaging Spectrometer)                         | Spectral Imaging, www.specim.fi                     | up to 288       | 0.43 to 1.0 µm
Table 2. Proposed Space-based Hyperspectral Sensors

Satellite | Sensor  | Sponsoring Agencies
ARIES-I   | ARIES-I | Auspace Ltd., ACRES, Earth Resource Mapping Pty. Ltd., Geoimage Pty. Ltd., CSIRO
PROBA     | CHRIS   |
NEMO      | COIS    |
PRISM     |         |
Whole-Pixel Methods
Whole-pixel analysis methods attempt to determine whether one or more target materials
are abundant within each pixel in a multispectral or hyperspectral image on the basis of
the spectral similarity between the pixel and target spectra. Whole-pixel scale tools
include standard supervised classifiers such as Minimum Distance or Maximum
Likelihood (Richards and Jia, 1999), as well as tools developed specifically for
hyperspectral imagery such as Spectral Angle Mapper and Spectral Feature Fitting.
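As a concrete illustration, the sketch below shows how a minimum-distance classifier might be applied to pixel spectra with NumPy. The array names (pixels, class_means) and the use of simple Euclidean distance to mean training spectra are assumptions made for this example, not the implementation of any particular software package.

    import numpy as np

    def minimum_distance_classify(pixels, class_means):
        # pixels:      (n_pixels, n_bands) array of pixel spectra
        # class_means: (n_classes, n_bands) array of mean training spectra
        # Euclidean distance from every pixel to every class mean.
        d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
        # Each pixel is assigned to the class whose mean spectrum is closest.
        return np.argmin(d, axis=1)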
Spectral Angle Mapper (SAM)
Consider a scatter plot of pixel values from two bands of a spectral image. In such a plot,
pixel spectra and target spectra will plot as points (Fig. 6). If a vector is drawn from the
origin through each point, the angle between any two vectors constitutes the spectral
angle between those two points. The Spectral Angle Mapper (Yuhas et al., 1992)
computes a spectral angle between each pixel spectrum and each target spectrum. The
smaller the spectral angle, the more similar the pixel and target spectra. This spectral
angle will be relatively insensitive to changes in pixel illumination because increasing or
decreasing illumination doesn't change the direction of the vector, only its magnitude
(i.e., a darker pixel will plot along the same vector, but closer to the origin). Note that
although this discussion describes the calculated spectral angle using a two-dimensional
scatter plot, the actual spectral angle calculation is based on all of the bands in the image.
In the case of a hyperspectral image, a spectral hyper-angle is calculated between each
pixel and each target.
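The spectral angle itself is just the angle between two n-band vectors, computed from their dot product. Below is a minimal NumPy sketch; the names (cube for a rows x columns x bands reflectance array, target for the reference spectrum) and the 0.10-radian threshold are hypothetical, chosen only for illustration.

    import numpy as np

    def spectral_angle(pixels, target):
        # pixels: (n_pixels, n_bands) spectra; target: (n_bands,) reference spectrum.
        # Only the direction of each spectral vector matters, which is why SAM is
        # relatively insensitive to changes in illumination.
        cos_angle = (pixels @ target) / (np.linalg.norm(pixels, axis=1) * np.linalg.norm(target))
        return np.arccos(np.clip(cos_angle, -1.0, 1.0))  # angle in radians

    # Example use: flag pixels whose angle to the target is below a small threshold.
    # angles = spectral_angle(cube.reshape(-1, cube.shape[-1]), target)
    # matches = (angles < 0.10).reshape(cube.shape[:2])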
Spectral Feature Fitting
Another whole-pixel approach compares absorption features in each pixel spectrum with
the corresponding features in a target spectrum. A relatively simple form of this method,
called Spectral Feature Fitting, is available as part of ENVI. In Spectral Feature Fitting
the user specifies a range of wavelengths within which a unique absorption feature exists
for the chosen target. The
pixel spectra are then compared to the target spectrum using two measurements: 1) the
depth of the feature in the pixel is compared to the depth of the feature in the target, and
2) the shape of the feature in the pixel is compared to the shape of the feature in the target
(using a least-squares technique).
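One simple way to mimic these two measurements is to remove a straight-line continuum over the chosen wavelength window and then fit the target feature to the pixel feature by least squares: the scale factor compares depths, and the residual compares shapes. The sketch below is a simplified illustration under those assumptions, not ENVI's implementation; the function and argument names are hypothetical.

    import numpy as np

    def feature_fit(pixel, target, wavelengths, lo, hi):
        # Select the user-specified wavelength window containing the absorption feature.
        w = (wavelengths >= lo) & (wavelengths <= hi)

        def feature(spectrum):
            # Remove a straight-line continuum drawn between the window endpoints,
            # leaving only the absorption feature (expressed as depth below the continuum).
            s = spectrum[w]
            continuum = np.linspace(s[0], s[-1], s.size)
            return 1.0 - s / continuum

        p, t = feature(pixel), feature(target)
        scale = np.dot(t, p) / np.dot(t, t)           # depth of pixel feature relative to target
        rms = np.sqrt(np.mean((p - scale * t) ** 2))  # shape mismatch (least-squares residual)
        return scale, rms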
Sub-Pixel Methods
Sub-pixel analysis methods can be used to calculate the quantity of target materials in
each pixel of an image. Sub-pixel analysis can detect quantities of a target that are much
smaller than the pixel size itself. In cases of good spectral contrast between a target and
its background, sub-pixel analysis has detected targets covering as little as 1-3% of the
pixel. Sub-pixel analysis methods include Complete Linear Spectral Unmixing and
Matched Filtering.
Complete Linear Spectral Unmixing
The set of spectrally unique surface materials existing within a scene are often referred to
as the spectral endmembers for that scene. Linear Spectral Unmixing (Adams et al.,
1986; Boardman, 1989) exploits the theory that the reflectance spectrum of any pixel is
the result of linear combinations of the spectra of all endmembers inside that pixel. A
linear combination in this context can be thought of as a weighted average, where each
endmember weight is directly proportional to the area of the pixel covered by that
endmember. If the spectra of all endmembers in the scene are known, then their
abundances within each pixel can be calculated from each pixel's spectrum.
Unmixing simply solves a set of n linear equations for each pixel, where n is the number
of bands in the image. The unknown variables in these equations are the fractions of each
endmember in the pixel. To be able to solve the linear equations for the unknown pixel
fractions it is necessary to have more equations than unknowns, which means that we
need more bands than endmember materials. With hyperspectral data this is almost
always true.
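With the endmember spectra arranged as the columns of a mixing matrix, those equations can be solved for each pixel by ordinary least squares. The sketch below uses NumPy and hypothetical array names, and for simplicity omits the sum-to-one and non-negativity constraints that practical unmixing implementations often add; it also returns a root-mean-square error per pixel.

    import numpy as np

    def linear_unmix(pixels, endmembers):
        # pixels:     (n_pixels, n_bands) observed spectra
        # endmembers: (n_endmembers, n_bands) endmember spectra; needs n_bands > n_endmembers
        A = endmembers.T                               # (n_bands, n_endmembers) mixing matrix
        fractions, *_ = np.linalg.lstsq(A, pixels.T, rcond=None)
        residual = pixels.T - A @ fractions            # how well the model reproduces each spectrum
        rms = np.sqrt(np.mean(residual ** 2, axis=0))  # one error value per pixel
        return fractions.T, rms                        # (n_pixels, n_endmembers), (n_pixels,)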
The results of Linear Spectral Unmixing include one abundance image for each
endmember. The pixel values in these images indicate the percentage of the pixel made
up of that endmember. For example, if a pixel in an abundance image for the endmember
quartz has a value of 0.90, then 90% of the area of the pixel contains quartz. An error
image is also usually calculated to help evaluate the success of the unmixing analysis.
Matched Filtering
Matched Filtering (Boardman et al., 1995) is a type of unmixing in which only user-chosen targets are mapped. Unlike Complete Unmixing, we don't need to find the
spectra of all endmembers in the scene to get an accurate analysis (hence, this type of
analysis is often called a partial unmixing because the unmixing equations are only
partially solved). Matched Filtering was originally developed to compute abundances of
targets that are relatively rare in the scene. If the target is not rare, special care must be
taken when applying and interpreting Matched Filtering results.
Matched Filtering filters the input image for good matches to the chosen target
spectrum by maximizing the response of the target spectrum within the data and
suppressing the response of everything else (which is treated as a composite unknown
background to the target). As with Complete Unmixing, a pixel value in the output image is
proportional to the fraction of the pixel that contains the target material. Any pixel with a
value of 0 or less would be interpreted as background (i.e., none of the target is present).
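One common formulation of such a filter scores each pixel by projecting it, after subtracting the background mean, onto a vector built from the inverse background covariance, so that background pixels average roughly 0 and a pure target pixel scores roughly 1. The sketch below follows that formulation under stated assumptions (NumPy, background statistics estimated from the whole scene, hypothetical names); it is an illustration, not the exact algorithm of any specific package.

    import numpy as np

    def matched_filter(pixels, target):
        # pixels: (n_pixels, n_bands) scene spectra (also used to model the background)
        # target: (n_bands,) target spectrum
        mu = pixels.mean(axis=0)                                 # background mean spectrum
        cov_inv = np.linalg.pinv(np.cov(pixels, rowvar=False))   # inverse background covariance
        d = target - mu
        w = cov_inv @ d / (d @ cov_inv @ d)                      # filter that maximizes target response
        return (pixels - mu) @ w                                 # ~0 for background, ~1 for a pure target pixel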
One potential problem with Matched Filtering is that it is possible to end up with false
positive results. One solution to this problem that is available in ENVI is to calculate an
additional measure called infeasibility. Infeasibility is based on both noise and image
statistics and indicates the degree to which the Matched Filtering result is a feasible
mixture of the target and the background. Pixels with high infeasibilities are likely to be
false positives regardless of their matched filter value.
Summary
Hyperspectral sensors and analyses have provided more information from remotely
sensed imagery than was ever possible before. As new sensors provide more hyperspectral
imagery and new image processing algorithms continue to be developed, hyperspectral
imagery is positioned to become one of the most common research, exploration, and
monitoring technologies used in a wide variety of fields.
References
Aber, J. D., and Martin, M. E., 1995, High spectral resolution remote sensing of canopy
chemistry. In Summaries of the Fifth JPL Airborne Earth Science Workshop, JPL
Publication 95-1, v. 1, pp. 1-4.
Adams, J. B., Smith, M. O., and Johnson, P.E., 1986, Spectral mixture modeling: A new
analysis of rock and soil types at the Viking Lander 1 site. Journal of Geophysical
Research, vol. 91(B8), pp. 8090-8112.
Ben-Dor, E., Patkin, K., Banin, A. and Karnieli, A., 2001, Mapping of several soil
properties using DAIS-7915 hyperspectral scanner data. A case study over clayey
soils in Israel. International Journal of Remote Sensing (in press).
Boardman, J. W., 1989, Inversion of imaging spectrometry data using singular value
decomposition. Proceedings of the Twelfth Canadian Symposium on Remote
Sensing, v. 4., pp. 2069-2072.
Boardman, J. W., Kruse, F. A., and Green, R. O., 1995, Mapping target signatures via
partial unmixing of AVIRIS data. In Summaries of the Fifth JPL Airborne Earth
Science Workshop, JPL Publication 95-1, v. 1, pp. 23-26.
Clark, R. N., and Swayze, G. A., 1995, Mapping minerals, amorphous materials,
environmental materials, vegetation, water, ice, and snow, and other materials:
The USGS Tricorder Algorithm. In Summaries of the Fifth Annual JPL Airborne
Earth Science Workshop, JPL Publication 95-1, v. 1, pp. 39 - 40.
Clark, R. N., Swayze, G. A., Gallagher, A., Gorelick, N., and Kruse, F. A., 1991,
Mapping with imaging spectrometer data using the complete band shape least-squares algorithm simultaneously fit to multiple spectral features from multiple materials.
Author
Peg Shippert is the Earth Science Applications Specialist for Research Systems, Inc.
(www.researchsystems.com), the makers of ENVI and IDL and a wholly owned
subsidiary of Eastman Kodak. She has a Ph.D. in physical geography and more than 13
years of experience analyzing multispectral and hyperspectral imagery for a wide variety
of applications.