Image Processing Part 3
For example, the data for the pixel in the fifth row, second column is stored in the matrix element
(5,2). You use normal MATLAB matrix subscripting to access the values of individual pixels. For
example, the MATLAB code A(2,15) returns the value of the pixel at row 2, column 15 of the
image.
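The same row/column indexing can be sketched in Python with NumPy (my own illustration; the manual itself uses MATLAB). Note that NumPy indexing is zero-based, so MATLAB's A(2,15) corresponds to A[1, 14]:

```python
import numpy as np

# Hypothetical 20x20 grayscale image with known values for checking.
A = np.arange(400).reshape(20, 20)

# MATLAB's A(2,15) (1-based) corresponds to A[1, 14] (0-based) in NumPy:
# row 2, column 15 of the image.
pixel = A[1, 14]
print(pixel)  # → 34
```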
Aim:
To demonstrate edge detection.
Syntax:
% number of colors for display
sncols=128;
ncols=32;
% load a built-in MATLAB test image (loads X and its colormap)
load('trees');
% show original image (showgimg is a course-supplied helper)
figure(1);
showgimg(real(X),sncols);
drawnow;
% construct first-difference convolution filters g and h
[m,n] = size(X);
gs = [1 -1]; ge = [];
hs = [1 -1]; he = [];
g = [gs,zeros(1,m-length(gs)-length(ge)),ge];
h = [hs,zeros(1,n-length(hs)-length(he)),he];
% construct convolution matrices as sparse matrices
% (spcnvmat is a course-supplied helper)
Y = spcnvmat(g);
Z = spcnvmat(h);
% vertical and horizontal difference images
Wg = Y*X;
Wh = X*Z';
% show transformed images
figure(2);
showgimg(Wg,ncols);
drawnow;
figure(3);
showgimg(Wh,ncols);
drawnow;
% combined edge map
figure(4);
showgimg(abs(Wg)+abs(Wh),ncols);
drawnow;
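The script above depends on the course helpers showgimg and spcnvmat and on MATLAB's trees image. A rough NumPy sketch of the same idea (first-difference filters [1 -1] applied along rows and columns, then combined into an edge map) might look like this; the image here is synthetic, since load('trees') is MATLAB-specific:

```python
import numpy as np

# Synthetic test image: a bright square on a dark background.
X = np.zeros((64, 64))
X[16:48, 16:48] = 1.0

# First-difference filters, matching g = h = [1 -1] in the script.
# Differences between rows (analogous to Y*X):
Wg = np.diff(X, axis=0, prepend=X[:1, :])
# Differences between columns (analogous to X*Z'):
Wh = np.diff(X, axis=1, prepend=X[:, :1])

# Combined edge map, as in abs(Wg) + abs(Wh).
E = np.abs(Wg) + np.abs(Wh)
print(E.max())   # strongest response occurs on the square's border
print(E[0, 0])   # flat background gives zero response
```

The maximum response (2.0) occurs at the square's corners, where both the row and column differences jump at once; everywhere the image is constant the response is zero.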
Theory
Edges characterize object boundaries, and edge detection is therefore a problem of fundamental
importance in image processing. Edges in images are areas with strong intensity contrasts: a jump
in intensity from one pixel to the next. Detecting the edges of an image significantly reduces the
amount of data and filters out useless information, while preserving the important structural
properties of the image.
There are many ways to perform edge detection, but most methods fall into two categories:
gradient and Laplacian. The gradient method detects edges by looking for the maximum and
minimum in the first derivative of the image. The Laplacian method searches for zero crossings
in the second derivative of the image to find edges.
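The two criteria can be checked numerically on a one-dimensional signal. The sketch below (NumPy, my own illustration) shows the first difference peaking on a ramp edge while the second difference changes sign around it:

```python
import numpy as np

# 1-D signal with a ramp edge between samples 4 and 7.
s = np.array([0, 0, 0, 0, 1, 2, 3, 4, 4, 4, 4], dtype=float)

d1 = np.diff(s)        # first derivative: large on the ramp
d2 = np.diff(s, n=2)   # second derivative: sign change brackets the edge

print(np.argmax(d1))      # first index where the gradient is maximal
print(np.nonzero(d2)[0])  # second derivative is nonzero only at the ramp ends
```

The gradient is maximal over the whole ramp (first such index 3), while the second derivative is positive where the ramp begins and negative where it ends, so the edge sits at the zero crossing between them.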
An edge has the one-dimensional shape of a ramp, and calculating the derivative of the image can
highlight its location. Suppose we have a signal with an edge shown by a jump in intensity.

[Figure: one-dimensional signal with a step edge]

The intensity changes thus discovered in each of the channels are then represented by oriented
primitives called zero-crossing segments, and evidence is given that this representation is
complete. Intensity changes in images arise from surface discontinuities or from reflectance or
illumination boundaries, and these all have the property that they are spatially localized. Because
of this, the zero-crossing segments from the different channels are not independent, and rules are
deduced for combining them into a description of the image. This description is called the raw
primal sketch.
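Finding zero crossings of the kind described can be sketched as follows (Python/NumPy, my own illustration, not the original method): given samples of a second derivative, mark the positions where the sign flips between neighbouring samples:

```python
import numpy as np

def zero_crossings(x):
    """Indices i where x changes sign between samples i and i+1."""
    s = np.sign(x)
    return np.nonzero(s[:-1] * s[1:] < 0)[0]

# Toy second derivative of a smoothed step edge: positive lobe,
# then negative lobe, with the edge at the sign change.
d2 = np.array([0.0, 0.1, 0.6, 0.2, -0.2, -0.6, -0.1, 0.0])
print(zero_crossings(d2))  # → [3]
```

The single crossing between samples 3 and 4 marks the edge location, which is the basic operation behind the zero-crossing segments above.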