1. Define digital image and digital image processing.

A digital image is a representation of a real image in numerical form, consisting of pixels arranged in a grid. Digital image processing is the use of computer algorithms to perform operations on digital images.

2. Differentiate between analog and digital images.

Analog images are continuous images such as photographs, while digital images are discrete and represented in pixel format. Digital images can be processed using computers.

3. Explain the elements of a digital image processing system.

The elements include image acquisition, preprocessing, segmentation, representation and description, recognition and interpretation, and a knowledge base.

4. What is image sampling and quantization? Illustrate with an example.

Sampling is the process of converting a continuous image into a digital image by taking samples at discrete spatial positions. Quantization assigns each sample a discrete intensity value. For example, an 8-bit grayscale image is quantized into 256 intensity levels (0-255).
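A minimal NumPy sketch of the two steps, using a synthetic image and illustrative names (not taken from the answer above):

    import numpy as np

    img = np.random.rand(256, 256)        # continuous-valued image in [0, 1]
    sampled = img[::4, ::4]               # sampling: keep every 4th pixel in each direction
    levels = 256                          # number of quantization levels (8 bits)
    quantized = np.floor(sampled * levels).clip(0, levels - 1).astype(np.uint8)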

5. Define spatial resolution and intensity resolution.

Spatial resolution refers to the number of pixels used to represent an image. Intensity resolution refers to the number of bits used to represent the intensity of each pixel.

6. Write short notes on image histogram.

An image histogram is a graphical representation of the tonal distribution in an image. It plots the number of pixels for each intensity value.
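For instance, with NumPy the histogram of an 8-bit image can be computed as follows (array name and data are placeholders):

    import numpy as np

    img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    # hist[k] = number of pixels whose intensity equals k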

7. Explain neighborhood operations in image processing.


Neighborhood operations involve processing a pixel based on the values of its surrounding pixels. Common operations include filtering and edge detection.

8. Define and explain the concept of image contrast.

Image contrast refers to the difference in intensity between the darkest and lightest parts of an image. High contrast makes features more distinguishable.

9. List and explain various noise models in image processing.

Common noise models include Gaussian noise, Salt-and-Pepper noise, Poisson noise, and Speckle noise. Each has different statistical characteristics.

10. Describe Gaussian and Salt-and-Pepper noise.

Gaussian noise is statistical noise with a probability density function equal to that of the normal distribution. Salt-and-Pepper noise appears as sparse white and black pixels.
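A rough sketch of how the two noise types can be simulated in NumPy (parameters chosen arbitrarily for illustration):

    import numpy as np

    img = np.full((64, 64), 0.5)                        # synthetic grayscale image in [0, 1]
    gaussian = np.clip(img + np.random.normal(0.0, 0.05, img.shape), 0, 1)

    sp = img.copy()
    mask = np.random.rand(*img.shape)
    sp[mask < 0.02] = 0.0                               # "pepper": sparse black pixels
    sp[mask > 0.98] = 1.0                               # "salt": sparse white pixels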

11. What is histogram equalization?

Histogram equalization is a technique to enhance image contrast by spreading out the most frequent intensity values.
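A compact sketch of the idea for an 8-bit image (assumed uint8 array; the mapping is the scaled cumulative histogram):

    import numpy as np

    img = np.random.randint(50, 180, (128, 128), dtype=np.uint8)   # low-contrast example
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum() / img.size                                  # cumulative distribution
    equalized = np.round(255 * cdf[img]).astype(np.uint8)           # spread intensities over 0-255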

12. Explain basic steps of image enhancement in the spatial domain.

Steps include image sharpening, contrast enhancement, smoothing, and edge enhancement using spatial filters.

13. Compare linear and non-linear spatial filters.

Linear filters compute each output pixel as a linear combination of neighboring pixel values (e.g., the averaging filter), while non-linear filters (e.g., the median filter) do not satisfy the linearity (superposition) property.


14. Define and describe the purpose of median filtering.

Median filtering is a non-linear technique used to remove noise, especially Salt-and-Pepper noise, by replacing each pixel with the median of its neighbors.
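A pure-NumPy sketch of a 3x3 median filter (function name is illustrative; borders handled by edge padding):

    import numpy as np

    def median_filter_3x3(img):
        padded = np.pad(img, 1, mode='edge')
        # gather the 9 shifted neighborhood views and take the median across them
        windows = [padded[i:i + img.shape[0], j:j + img.shape[1]]
                   for i in range(3) for j in range(3)]
        return np.median(np.stack(windows), axis=0).astype(img.dtype)

    # e.g. cleaned = median_filter_3x3(noisy)   for a 2-D grayscale array `noisy`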

15. Describe convolution operation in 2D image processing.

Convolution involves applying a kernel (matrix) over an image, multiplying overlapping values, and summing the result to produce the filtered output.
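A direct (unoptimized) sketch of 2-D convolution with zero padding and same-size output; names are illustrative:

    import numpy as np

    def convolve2d(img, kernel):
        kh, kw = kernel.shape
        flipped = kernel[::-1, ::-1]                       # convolution flips the kernel
        padded = np.pad(img, ((kh // 2,), (kw // 2,)), mode='constant')
        out = np.zeros(img.shape, dtype=float)
        for y in range(img.shape[0]):
            for x in range(img.shape[1]):
                out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * flipped)
        return out

    # e.g. blurred = convolve2d(img, np.ones((3, 3)) / 9.0)   # 3x3 averaging kernel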

16. Write a short note on Fourier Transform in image processing.

The Fourier Transform converts an image from the spatial domain to the frequency domain, making it easier to analyze image components such as edges and textures.
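With NumPy's FFT routines, the frequency-domain view of an image can be sketched as follows (placeholder image):

    import numpy as np

    img = np.random.rand(128, 128)
    F = np.fft.fft2(img)                       # 2-D discrete Fourier transform
    F_centered = np.fft.fftshift(F)            # move the zero-frequency term to the center
    magnitude = np.log1p(np.abs(F_centered))   # log scale makes the spectrum easier to inspect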

17. Explain edge detection using Sobel operator.

The Sobel operator uses two 3x3 kernels that estimate the horizontal and vertical intensity gradients; edges are detected from the gradient magnitude calculated at each pixel.
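A self-contained sketch of the idea (kernel values are the standard Sobel masks; the helper name and test image are illustrative):

    import numpy as np

    img = np.random.rand(128, 128)                                     # placeholder grayscale image
    Kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)   # x-direction gradient
    Ky = Kx.T                                                          # y-direction gradient

    def correlate3x3(f, k):
        p = np.pad(f, 1, mode='edge')
        windows = np.stack([p[i:i + f.shape[0], j:j + f.shape[1]]
                            for i in range(3) for j in range(3)])
        return np.tensordot(k.ravel(), windows, axes=1)

    edges = np.hypot(correlate3x3(img, Kx), correlate3x3(img, Ky))     # gradient magnitude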

18. Define color image and color models.

A color image is composed of multiple channels, typically red, green, and blue. Color models include RGB, CMYK, HSV, and YCbCr.
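For example, an RGB image can be split into its channels and reduced to a luminance (grayscale) image with the standard Rec. 601 weights (array shape and data assumed):

    import numpy as np

    rgb = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)   # H x W x 3 image
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    gray = (0.299 * r + 0.587 * g + 0.114 * b).astype(np.uint8)    # luma from RGB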

19. Differentiate between lossless and lossy compression.

Lossless compression retains all image data (e.g., PNG), while lossy compression (e.g., JPEG) discards some data for reduced file size.

20. What is thresholding in image segmentation?

Thresholding converts grayscale images into binary images by setting a threshold value. Pixels above the threshold are set to one value; others are set to another.
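A one-line NumPy sketch of global thresholding (threshold value chosen arbitrarily):

    import numpy as np

    img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
    T = 128
    binary = np.where(img > T, 255, 0).astype(np.uint8)   # above T -> white, otherwise black
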
21. Describe the fundamental steps involved in digital image processing.

Steps include image acquisition, preprocessing, segmentation, representation and description, recognition, and interpretation.

22. Explain the basic geometric transformations used in images.

Transformations include translation, scaling, rotation, and affine transformations to alter image geometry.
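As an illustration, a rotation about the image center can be sketched with inverse mapping and nearest-neighbor sampling (function name and border handling are simplifications):

    import numpy as np

    def rotate(img, angle_deg):
        h, w = img.shape
        theta = np.deg2rad(angle_deg)
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        ys, xs = np.indices((h, w))
        # inverse mapping: for every output pixel, find where it came from in the source
        src_x =  np.cos(theta) * (xs - cx) + np.sin(theta) * (ys - cy) + cx
        src_y = -np.sin(theta) * (xs - cx) + np.cos(theta) * (ys - cy) + cy
        src_x = np.clip(np.round(src_x).astype(int), 0, w - 1)   # clamp to the border for simplicity
        src_y = np.clip(np.round(src_y).astype(int), 0, h - 1)
        return img[src_y, src_x]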

23. Describe in detail the histogram equalization process.

Histogram equalization enhances contrast by redistributing the image histogram to span the full range of intensities.
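For completeness, the standard mapping behind this process: with L gray levels and normalized histogram p(r_j) = n_j / n, each input level r_k is mapped to

    s_k = (L - 1) * sum_{j=0..k} p(r_j)

i.e. the scaled cumulative distribution function, which is what the sketch under question 11 computes.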

24. Derive and explain the 2D Discrete Fourier Transform.

The 2D DFT of an image f(x,y) of size M x N is:

    F(u,v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x,y) · e^(-j2π(ux/M + vy/N))

It transforms the spatial domain to the frequency domain.
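A small sketch that evaluates this formula directly and checks it against NumPy's FFT (names are illustrative):

    import numpy as np

    def dft2(f):
        M, N = f.shape
        u, x = np.arange(M), np.arange(M)
        v, y = np.arange(N), np.arange(N)
        Wm = np.exp(-2j * np.pi * np.outer(u, x) / M)   # e^(-j2pi ux/M), shape M x M
        Wn = np.exp(-2j * np.pi * np.outer(y, v) / N)   # e^(-j2pi yv/N), shape N x N
        return Wm @ f @ Wn                              # separable row/column transform

    f = np.random.rand(8, 8)
    assert np.allclose(dft2(f), np.fft.fft2(f))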
