Compression of Hyperspectral Image Using JPEG Compression Algorithm
The JPEG encoder operates through three primary blocks:

Forward Discrete Cosine Transform (FDCT): This step transforms the spatial representation of the image into the frequency domain, separating important visual information from less significant data.

Quantization: This step maps the continuous DCT coefficients onto a smaller set of discrete values, discarding perceptually insignificant detail.

Entropy Encoding: This step compactly encodes the quantized coefficients using techniques such as Huffman coding.

Together, these components allow JPEG compression to achieve a balance between image quality and file size, making it an essential technique in digital imaging applications.
Forward Discrete Cosine Transform-
The Forward Discrete Cosine Transform (FDCT) is the first step in JPEG compression that enables the efficient representation of images by transforming pixel data into the frequency domain. This mathematical operation processes 8x8 blocks of an image, effectively separating the image into different frequency components. By analyzing these components, FDCT retains low-frequency information that corresponds to the essential visual details, while high-frequency components, which capture rapid changes and noise, can be discarded during quantization. This aspect of FDCT not only facilitates significant file size reduction but also optimizes the image quality by focusing on perceptually relevant details. The process is designed to align with human visual perception, ensuring that the most noticeable features are preserved even after compression. By utilizing cosine functions to represent pixel values, FDCT enhances the robustness of image encoding, making it suitable for various applications beyond JPEG, including video processing and other image analysis tasks. Ultimately, the FDCT lays the foundation for the subsequent stages of JPEG compression, highlighting its importance in modern digital imaging and communication systems.
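As a concrete illustration, the 8x8 FDCT can be sketched with NumPy. The `dct2` helper below is an illustrative implementation (not from this paper) that builds the orthonormal DCT-II basis matrix directly:

```python
import numpy as np

def dct2(block):
    """Orthonormal 2-D DCT-II of a square block (the FDCT used by JPEG)."""
    N = block.shape[0]
    n = np.arange(N)
    # Basis matrix: C[k, m] = alpha(k) * cos(pi * (2m + 1) * k / (2N))
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)  # DC row uses alpha(0) = sqrt(1/N)
    return C @ block @ C.T

# A uniformly bright 8x8 block: all energy collapses into the single DC
# coefficient, and the 63 AC coefficients are (numerically) zero.
block = np.full((8, 8), 50.0)
coeffs = dct2(block)
```

This is why smooth image regions compress so well: for the constant block above, only `coeffs[0, 0]` is nonzero, so quantization and entropy coding have almost nothing left to store.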
Assuming an interleaving block of 8 x 8 (single-component, grayscale images) as input for the FDCT, 64 DCT coefficients are produced. These coefficients are then used in the quantization process as shown in the block diagram.

Quantization-
Quantization is the second step in image compression, particularly in the JPEG algorithm, where it effectively transforms continuous DCT coefficients into discrete values, significantly enhancing data reduction. This process operates by mapping a range of input values to a smaller set of output values, similar to rounding numbers to their nearest integers. After applying the Forward Discrete Cosine Transform (FDCT) to an 8x8 block of pixels, each DCT coefficient is divided by a corresponding value from a predefined quantization table and then rounded to the nearest integer. These tables are designed based on human visual perception, emphasizing the retention of lower-frequency components that carry more critical image information, while allowing higher-frequency components, often linked to finer details and noise, to be approximated more coarsely or even discarded altogether. Although this introduces a loss of some image quality, the significant reduction in file size makes quantization a beneficial trade-off. Moreover, by increasing the number of zero coefficients in the resultant data, quantization enhances the efficiency of subsequent entropy encoding methods, such as Huffman coding, allowing for even greater compression. This makes quantization a fundamental aspect of modern digital image processing, balancing size reduction with quality preservation.

The figure shows the simple quantization of a signal by choosing the amplitude values closest to the analog amplitude. In the given implementation, quantization is performed in conjunction with a quantization table and the input signal is digitized. This process is fundamentally lossy since it is a many-to-one mapping; it is the source of the image loss in DCT-based lossy encoders. Quantization can be represented using equation (9): each DCT coefficient is divided by its corresponding quantum, and the quotient is rounded to the nearest integer.

Fig 4 Quantization
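The divide-and-round step can be sketched as follows. The table is the standard JPEG luminance quantization table from Annex K of the JPEG specification (ISO/IEC 10918-1); the coefficient values fed into it are illustrative:

```python
import numpy as np

# Standard JPEG luminance quantization table (ISO/IEC 10918-1, Annex K).
Q = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
])

def quantize(coeffs, table):
    """Divide each DCT coefficient by its quantum and round to nearest integer."""
    return np.rint(coeffs / table).astype(int)

def dequantize(quantized, table):
    """Inverse mapping used by the decoder; the rounding error is never recovered."""
    return quantized * table

# Illustrative coefficients: a large DC term, a mid-size low-frequency AC term,
# and a small high-frequency term that quantization crushes to zero.
coeffs = np.zeros((8, 8))
coeffs[0, 0], coeffs[0, 1], coeffs[7, 7] = 400.0, 33.0, 12.0
q = quantize(coeffs, Q)
```

Note that `dequantize(quantize(x, Q), Q)` generally does not equal `x`: the rounding is exactly the many-to-one mapping the text describes, and it is the only lossy step between the FDCT and entropy coding.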
Runs of zero coefficients produced by quantization can then be given a more efficient representation, like "10 x 0," significantly reducing the amount of data stored.

Huffman Coding
Entropy encoding further optimizes compression through a compact representation of the quantized DCT coefficients, employing techniques like Huffman coding. Specified in the JPEG standard, Huffman coding is utilized in baseline sequential codecs and all operational modes of JPEG compression, effectively encoding source symbols based on their probabilities. Introduced by David Huffman in 1952, this variable-length encoding algorithm minimizes the average number of bits required for symbol representation, given that it adheres to the prefix condition.

The Huffman coding process initiates with a frequency table detailing the occurrence of each symbol. The algorithm then constructs a Huffman tree from this table, where each node represents a symbol, its frequency, and pointers to its parent and child nodes. The tree grows through successive iterations that identify the two nodes with the lowest frequencies that have not yet become parents. A new node is created, serving as the parent of these two nodes, and its frequency is the sum of the frequencies of its children. This process continues until a single node remains, which becomes the root of the Huffman tree.

During compression, the algorithm traverses the tree starting from the leaf node corresponding to the symbol to be compressed. As it navigates up to the root, it determines whether the current node is a left or right child of its parent, assigning a corresponding binary value of (0) or (1). This efficient approach allows for optimal encoding of the quantized coefficients, ultimately contributing to the overall effectiveness of JPEG compression [2][18][10].

Decompression-
The decompression process reverses the compression steps but follows a different order to ensure the original image is accurately reconstructed. Initially, the algorithm retrieves the Huffman tables that were saved with the compressed image. Using these tables, it decodes the Huffman tokens that represent the compressed data. Next, the focus shifts to the Discrete Cosine Transform (DCT) values. The first coefficient of each DCT block is recovered, and the remaining 63 coefficients are reconstructed, with zeros filled in where quantization has led to loss of information. This step is crucial for re-establishing the block's integrity, ensuring that any missing values do not impede the reconstruction process [2][18].

For this analysis, the Pavia University (PaviaU) dataset was employed. The PaviaU dataset is a widely used benchmark for hyperspectral image processing tasks. It consists of a hyperspectral image captured over the city of Pavia, Italy, by the ROSIS sensor (Reflective Optics System Imaging Spectrometer). The dataset includes 610x340 pixels, each pixel having 103 spectral bands after noisy bands are removed, covering wavelengths in the visible to near-infrared range. The PaviaU dataset is particularly useful for applications in urban analysis, where detailed spectral information is necessary to distinguish between different materials or land cover types, making it ideal for assessing the effectiveness of hyperspectral image compression methods.
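The tree-building and code-assignment procedure described in the Huffman coding section can be sketched in Python. This is an illustrative minimal version, not the paper's implementation: it keeps a heap of `(frequency, tiebreak, subtree)` entries instead of explicit parent pointers, and walks down from the root assigning 0 to left children and 1 to right children:

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a Huffman tree from symbol frequencies and return the code table."""
    freq = Counter(data)
    # Heap entries: (frequency, tiebreak, tree); a tree is a symbol or a (left, right) pair.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: only one distinct symbol
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        # Merge the two lowest-frequency nodes under a new parent whose
        # frequency is the sum of its children, as described above.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        count += 1
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")  # left child -> 0
            walk(node[1], prefix + "1")  # right child -> 1
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

# Quantized DCT data is dominated by zeros, so the zero symbol
# receives the shortest code.
codes = huffman_codes([0] * 10 + [1] * 3 + [2] * 2 + [3])
```

Because no code is a prefix of another, the decoder can read the bitstream unambiguously without any separators between symbols.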
A similar compression process is performed across all 103 spectral bands to ensure efficient image reduction and facilitate successful transmission through various channels. Each band, containing unique spectral information, undergoes a standardized compression algorithm that minimizes data size while preserving critical details. By compressing each band individually, the resulting image retains its high-dimensional characteristics, which are crucial for subsequent analyses and applications. This systematic approach not only optimizes the data for storage but also enhances the transmission speed, making it feasible to transfer the compressed images over networks with limited bandwidth. Ultimately, this strategy enables reliable and efficient access to valuable hyperspectral data across multiple applications, ensuring that critical information is preserved while reducing the logistical challenges associated with large datasets.
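The band-by-band strategy can be sketched on a small synthetic cube (the real PaviaU cube is 610 x 340 x 103). The uniform quantization step and the zero-coefficient statistic below are illustrative simplifications of the tabled quantization described earlier, not the paper's exact pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy hyperspectral cube: 16 x 16 pixels, 5 spectral bands.
cube = rng.normal(loc=128.0, scale=10.0, size=(16, 16, 5))

def band_zero_fraction(band, step=20.0):
    """Process one band blockwise (8x8 DCT + uniform quantization) and report
    the fraction of coefficients that quantize to zero, a rough proxy for how
    well entropy coding will shrink that band."""
    N = 8
    n = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)
    zeros = total = 0
    for i in range(0, band.shape[0], N):
        for j in range(0, band.shape[1], N):
            coeffs = C @ (band[i:i + N, j:j + N] - 128.0) @ C.T  # level shift, FDCT
            q = np.rint(coeffs / step)                           # uniform quantization
            zeros += int(np.count_nonzero(q == 0))
            total += q.size
    return zeros / total

# Each band is processed independently, mirroring the per-band strategy above.
fractions = [band_zero_fraction(cube[:, :, b]) for b in range(cube.shape[2])]
```

Processing each band as an independent grayscale image is what lets an off-the-shelf JPEG coder handle a 103-band cube; a production pipeline would simply feed each band through the full encoder and concatenate the per-band bitstreams.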
Result:

Table 1 Result

Metric                   Band 50    Band 70    Band 85    Band 100   Average
Compression Ratio (CR)   15.45 Kb   11.71 Kb   10.84 Kb   10.29 Kb   12.07 Kb
In summary, the JPEG compression algorithm remains a fundamental method in digital image processing, effectively balancing the need for image quality and reduced file sizes. The various stages of JPEG compression, including the Forward Discrete Cosine Transform (DCT), quantization, and entropy encoding, collaborate to create a compact representation of visual data. By focusing on lower-frequency components that are more perceptible to the human eye, JPEG maximizes compression while minimizing visual degradation.

Huffman coding plays an essential role within the entropy encoding phase, allowing for efficient data representation by assigning shorter codes to more frequent values. The careful orchestration of these techniques ensures that, during decompression, the original image can be accurately reconstructed. As technology progresses, the JPEG algorithm continues to evolve, adapting to meet contemporary challenges in image processing, such as high-resolution imaging and real-time applications. Future innovations may further enhance compression efficiencies without compromising quality, reinforcing JPEG's relevance in diverse fields like medical imaging, satellite imagery, and digital media. This adaptability highlights JPEG's enduring significance in modern imaging technologies.

FUTURE WORK

Integration of Machine Learning:
Future enhancements can focus on leveraging machine learning algorithms to improve JPEG compression techniques. This approach could result in adaptive methods that better tailor compression based on the specific characteristics of images, thereby optimizing both file size and quality.

Exploration of Advanced Codecs:
Research should investigate newer image formats such as AVIF and HEIF to evaluate their performance compared to traditional JPEG. Understanding the advantages of these formats in terms of compression efficiency and visual quality can facilitate a smoother transition to more advanced technologies in digital imaging.

Enhancing Real-Time Processing:
As the demand for instantaneous applications continues to rise, it will be essential to optimize JPEG algorithms for faster processing and reduced latency. This may involve refining the underlying algorithms to improve both compression and decompression speeds in various contexts.

REFERENCES

[1]. Soni, H., & Gupta, S. (2020). "A Study on Lossy Image Compression Techniques." International Journal of Computer Applications.
[2]. Sayood, K. (2020). Introduction to Data Compression. Morgan Kaufmann.
[3]. Jain, A. K. (2021). "Image Compression Techniques: A Comprehensive Review." International Journal of Image and Graphics.
[4]. Jia, C., Wu, H., & Wang, L. (2021). "Efficient JPEG Image Compression Method Based on Optimized Huffman Coding." Journal of Information Science, 47(2), 238-250.
[5]. Zhang, L., Liu, Z., & Wang, X. (2022). "Adaptive Huffman Coding for Image Compression." Applied Sciences, 12(4), 1804.
[6]. Feng, L., Zhang, H., & Zhao, J. (2022). "Color Image Processing Using YCbCr Color Space." Journal of Visual Communication and Image Representation.
[7]. Khan, A., & Kumar, P. (2022). "Entropy Coding Techniques in Image Compression." Journal of Computer Science and Technology.
[8]. Zhang, J., Wang, L., & Zhou, Y. (2022). "Efficient DCT Algorithms for Image Compression." Journal of Real-Time Image Processing, 19(3), 605-617.
[9]. Wang, C., Zhou, X., & Liu, Y. (2023). "An Overview of Image Compression Algorithms." Applied Sciences.
[10]. Huang, H., & Yao, X. (2023). "A Fast Huffman Coding Algorithm for Image Compression." Journal of Visual Communication and Image Representation, 104, 103338.
[11]. Lee, S., Kim, J., & Cho, H. (2023). "Improving JPEG Compression Efficiency Using Enhanced Huffman Coding Techniques." IEEE Transactions on Image Processing, 32, 1872-1883.
[12]. Singh, R., Kumar, A., & Gupta, S. (2023). "Adaptive Discrete Cosine Transform for Image Compression." International Journal of Imaging Systems and Technology, 33(1), 234-244.
[13]. Chen, Y., Liu, X., & Wang, H. (2023). "Perceptual Quantization for Image Compression Based on Human Visual Sensitivity." IEEE Transactions on Image Processing, 32(4), 1020-1034.
[14]. Wang, R., Zhang, Y., & Liu, Z. (2023). "Adaptive Huffman Coding for Enhanced JPEG Compression." Journal of Visual Communication and Image Representation, 88, 103271.
[15]. Bhatia, S., Singh, A., & Sahu, S. (2023). "Performance Analysis of Arithmetic Coding in JPEG Compression." Journal of Electronic Imaging, 32(1), 013001.