
Computer Vision

Feature Extraction
(Local texture representation using filters)

1
What is Texture?
• Provides information about the arrangement of pixel colors/intensities in an image
• Characterised by the distribution of intensity levels in the neighbourhood of a pixel

Image and a cropped patch with texture


2
Image texture
• A group of pixels (a patch) with similar properties is called a texel
• When a texel is repeated spatially, a particular texture is obtained
• Humans can judge the depth of objects based on texture

3
Image Texture
• The images have the same number of white and black pixels but different textures (different spatial distributions)
• Texture can be fine/regular, smooth, grained or coarse

50% white and 50% black pixels

4
Analysis of Texture
• Statistical features of texels can be used to determine the type of texture

Statistical features

Smooth Coarse fine/regular

5
Image texture

• Broad applications include:

• Texture-based segmentation
• Texture synthesis
• Texture transfer
• Texture analysis using spectral properties

8
Ex: Texture based segmentation
• Extract texture features using Gabor filters, GLCM, etc.
• GLCM provides statistical features
• Gabor filters provide edge and texture feature vectors
• Classify the different textures based on these features
• Segment objects with similar textures

9
Image texture (Texture based Synthesis)
• Construct a large image from a small sample image by repeating texture

11
Image texture (Texture Transfer)
• Take the texture from one image and paint it onto another image

13
Texture using Spectral Property (Gabor Filter)

• The Gabor filter is used to generate features that represent texture and edges
• Mimics how the human visual system perceives texture
• Is a special class of band-pass filter
• Possesses good localization in both space and frequency
• Is the product of a Gaussian and a sinusoidal term
• A sinusoid of a particular wavelength and orientation, modulated by a Gaussian envelope
• The Gaussian component provides the weighting and the sinusoidal component provides the directionality

14
2D Gabor Filter
Visual color representation of filters

Sinusoid oriented at 30° to the x-axis | 2D Gaussian | Gabor filter

A bank of Gabor filters with a number of different parameters is used to analyze the texture features of an image

15
2D Gabor Filter
• The image is passed through each filter of the filter bank
• The filtered outputs are added to generate the final output
• The edges that get detected are those oriented at the angle at which the Gabor filter is oriented

16
2D Gabor Filter
• The image is passed through each filter of the filter bank
• The filtered outputs are added to generate the final output
• The edges that get detected are those oriented at the angle at which the Gabor filter is oriented

Add all outputs

21
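A minimal Python sketch of this filter-bank idea, using OpenCV and NumPy (the file name and all parameter values are illustrative assumptions, not taken from the slides); it builds several oriented Gabor kernels, filters the image with each, and adds the outputs as described above:

import cv2
import numpy as np

# Load a grayscale image (file name is a placeholder)
img = cv2.imread("texture.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

# Build a small bank of Gabor filters at 8 different orientations
kernels = []
for theta in np.arange(0, np.pi, np.pi / 8):
    k = cv2.getGaborKernel(ksize=(31, 31), sigma=4.0, theta=theta,
                           lambd=10.0, gamma=0.5, psi=0, ktype=cv2.CV_32F)
    kernels.append(k)

# Pass the image through each filter and add the filtered outputs
output = np.zeros_like(img)
for k in kernels:
    output += cv2.filter2D(img, cv2.CV_32F, k)

# Rescale to 0-255 and save the combined response
out8 = cv2.normalize(output, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
cv2.imwrite("gabor_output.png", out8)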
2D Gabor Filter: application
• Highlight or extract patterns/stripes
• Ex: Use a bank of 16 Gabor filters, one every 11.25° of orientation (for a total of 180°)
• Each filter highlights patterns of a specific orientation

Patterns or stripes on the skin are highlighted

16 Gabor filters at different orientations
22
2D Gabor Filter
• Each filter responds to texture of a specific orientation
• The highest response occurs at edges and at points where the texture changes

23
Parameters of 2D Gabor filter
• A 2D Gabor filter can be viewed as a sinusoidal signal of a particular frequency and orientation, modulated by a Gaussian wave

λ — Wavelength of the sinusoidal component
Ө — Orientation of the normal to the parallel stripes of the Gabor function
Ψ — Phase offset of the sinusoidal function
σ — Sigma/standard deviation of the Gaussian envelope
ϒ — Spatial aspect ratio; specifies the ellipticity of the Gabor function
x and y are pixel locations

24
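The Gabor function itself appears only as an image on the slide; for reference, the commonly used real-valued form built from these parameters is (standard formulation, not copied from the slide):

g(x, y) = \exp\!\left(-\frac{x'^2 + \gamma^2 y'^2}{2\sigma^2}\right) \cos\!\left(2\pi \frac{x'}{\lambda} + \psi\right),
\quad x' = x\cos\theta + y\sin\theta, \quad y' = -x\sin\theta + y\cos\theta

The complex filter replaces the cosine with \exp\!\left(i\left(2\pi \frac{x'}{\lambda} + \psi\right)\right), which gives the real and imaginary components discussed later.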
Ex: 2D Gabor filter

• 11x11 Gabor filter: sigma = 4, theta = 0, lambda = pi/4, gamma = 0, psi = 0
• 101x101 Gabor filter: sigma = 1, theta = 0, lambda = pi/8, gamma = 0.1, psi = 0

Filter element values for the 11x11 filter (all 11 rows are identical because theta = 0, so the filter varies only along x):
-0.3  0.5  0.3  -0.8  -0.14  1.0  -0.14  -0.8  0.3  0.5  -0.3

Visual representation of filter | Filter element values | Visual representation of filter

25
Generation of 2D Gabor filters
• Parameters control the shape and size of the Gabor function
• Example: Ө = 0, ϒ = 0.25, σ = 10, Ψ = 0
• Lambda (λ), the wavelength of the sinusoidal component
• Controls the width of the stripes of the Gabor filter
• Increasing the wavelength produces thicker stripes in the filter

λ — Wavelength of the sinusoidal component
Ө — Orientation of the normal to the parallel stripes of the Gabor function
Ψ — Phase offset of the sinusoidal function
ϒ — Spatial aspect ratio; specifies the ellipticity of the support of the Gabor function
σ — Sigma/standard deviation of the Gaussian envelope
26
Generation of 2D Gabor filters

Theta (Ө)
• Orientation of the filter with respect to the normal (0°)

Gamma (ϒ)
• Specifies the aspect ratio of the Gabor function
• Controls the ellipticity of the envelope

27
Generation of 2D Gabor filters
Sigma (σ):

• The sigma of the Gaussian controls the overall size of the Gabor envelope

• For a larger sigma, the filter contains more stripes

28
Generation of 2D Gabor filters

29
Generation of 2D Gabor filters

30
Generation of 2D Gabor filters

• Several filters can be defined by changing the parameters of the Gabor filter

• Different features can be extracted from the original image by using Gabor filters with specific parameters and adding the outputs of all of them

31
Response of Gabor filter

32
Response of Gabor filter

33
Response of Gabor filter

34
Response of Gabor filter

35
Parameters of 2D Gabor filter
• A 2D Gabor filter can be viewed as a sinusoidal signal of a particular frequency and orientation, modulated by a Gaussian wave
• The filter has a real and an imaginary component

λ — Wavelength of the sinusoidal component
Ө — Orientation of the normal to the parallel stripes of the Gabor function
Ψ — Phase offset of the sinusoidal function
σ — Sigma/standard deviation of the Gaussian envelope
ϒ — Spatial aspect ratio; specifies the ellipticity
x and y are pixel locations

36
Generate features using Gabor filters

• Determine the real or the imaginary part of the filtered image

• The phase of the response matrix represents the orientation of edges
• The amplitude (magnitude) of the response matrix represents the strength of edges
• The magnitudes of responses with the same orientation can be taken as a feature vector
• Apply PCA to the responses of the filter bank to reduce the number of dimensions, and train a model on the filtered image (see the sketch below)

38
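A hedged sketch of this feature-generation step in Python (OpenCV, NumPy and scikit-learn assumed; the image path, kernel sizes, orientations and the two-component PCA are illustrative choices, not prescribed by the slides):

import cv2
import numpy as np
from sklearn.decomposition import PCA

img = cv2.imread("texture.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

magnitudes = []
for theta in np.arange(0, np.pi, np.pi / 4):
    # Real and imaginary parts of the complex Gabor filter (phase offsets 0 and -pi/2)
    k_re = cv2.getGaborKernel((21, 21), 4.0, theta, 10.0, 0.5, psi=0)
    k_im = cv2.getGaborKernel((21, 21), 4.0, theta, 10.0, 0.5, psi=-np.pi / 2)
    re = cv2.filter2D(img, cv2.CV_32F, k_re)
    im = cv2.filter2D(img, cv2.CV_32F, k_im)
    phase = np.arctan2(im, re)                   # orientation of edges (not used further here)
    magnitudes.append(np.sqrt(re ** 2 + im ** 2))  # strength of edges

# One feature vector per pixel: the magnitudes over all orientations
X = np.stack(magnitudes, axis=-1).reshape(-1, len(magnitudes))

# Reduce the dimensionality of the responses before training a model
X_reduced = PCA(n_components=2).fit_transform(X)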
Ex: Extract features with Gabor filters

-1 0 1
2 1 2
1 -2 0

Kernel of Gabor filter

Image

Filtered image
39
Extract features with Gabor filters
• Selecting the correct combination of filters is important
• A particular combination suits a particular requirement
• For example, to remove vertical stripes, apply a horizontally oriented Gabor filter

40
Texture Segmentation using Gabor filters
Segment the dog from the bathroom floor

• The texture of the floor is regular and periodic

• It is different from the smooth texture of the dog's fur

• Design an array of Gabor filters tuned to different frequencies and orientations to capture the texture of the floor
• Consider orientations between 0° and 150° in steps of 30°
• Use wavelengths increasing in powers of two, starting from 4/sqrt(2)

41
Texture Segmentation using Gabor filters
• Apply K-means to the Gabor filter outputs to segment the image (see the sketch below)

Output of Gabor filters | Segmented image

42
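A possible implementation of this pipeline in Python (OpenCV, NumPy and scikit-learn assumed; the image path, kernel size, sigma choice and Gaussian smoothing are my own assumptions, while the orientations and wavelengths follow the slide):

import cv2
import numpy as np
from sklearn.cluster import KMeans

img = cv2.imread("dog_on_floor.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
rows, cols = img.shape

# Orientations 0..150 degrees in steps of 30; wavelengths 4/sqrt(2) * 2^k
thetas = np.deg2rad(np.arange(0, 180, 30))
wavelengths = 4.0 / np.sqrt(2.0) * 2.0 ** np.arange(4)

features = []
for theta in thetas:
    for lam in wavelengths:
        k = cv2.getGaborKernel((31, 31), sigma=0.5 * lam, theta=theta,
                               lambd=lam, gamma=0.5, psi=0)
        mag = np.abs(cv2.filter2D(img, cv2.CV_32F, k))
        # Smooth the magnitude so pixels of the same texture get similar features
        features.append(cv2.GaussianBlur(mag, (0, 0), sigmaX=0.5 * lam))

# One feature vector per pixel, clustered into 2 textures (floor vs. dog)
X = np.stack(features, axis=-1).reshape(-1, len(features))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
segmentation = labels.reshape(rows, cols)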
Texture Segmentation using Gabor filters

43
Texture Segmentation using Gabor filters

44
Texture of image
• Texture is the repetition of patterns of local variation in pixel intensities
• A texture feature is a measure of the relationship among the pixels in a local area
• In general, textures are rough or smooth

• An image with a rough texture has large differences between pixel intensities

• Textures in images quantify the gray-level differences of neighboring pixels

Example patches: rough, smooth

45
Techniques for Texture Extraction
• Geometrical – convert to binary and then determine statistics of connected pixels only
• Signal processing – Gabor filter
• Statistical
• First-order statistics – standard deviation, mean, variance (see the sketch below)
• Second-order statistics – uses GLCM

46
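The first-order statistics are straightforward to compute; a tiny NumPy sketch over a patch (the random stand-in image and the patch coordinates are placeholders):

import numpy as np

img = np.random.randint(0, 256, size=(128, 128)).astype(np.float32)  # stand-in image
patch = img[48:80, 64:96]        # a 32x32 texel/patch (coordinates are illustrative)

first_order = {
    "mean": patch.mean(),        # average gray level
    "variance": patch.var(),     # spread of gray levels
    "std": patch.std(),          # low for smooth texture, high for coarse texture
}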
Gray Level Co-occurrence Matrix (GLCM)

• Provides the distribution of the gray levels in the texture image

• Is a statistical method of examining texture
• Considers the spatial relationship of pixels before determining statistics
• The statistical features provide information about the texture of an image

47
Create GLCM

• Calculate how often a pixel with intensity value i occurs in a specific spatial relation to a pixel with value j
• Element (i, j) of the matrix is the number of times a pixel with value i occurred in the specified relationship to a pixel with value j in the image
• The number of gray levels in the image determines the size of the GLCM

48
GLCM example

2-bit image matrix:
0 0 1 1
0 0 1 1
0 2 2 2
2 2 3 3

• (0,1) relation: offset of 0 rows and 1 column
• Other relationships such as (1,0), (0,-2), etc. are also possible

49
GLCM example

2-bit image matrix:
0 0 1 1
0 0 1 1
0 2 2 2
2 2 3 3

Partial GLCM for the (0,1) spatial relationship (rows: pixel value i, columns: pixel value j, both 0..3); only the count for the pair (0,0), which is 2, has been entered so far:
2 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0

50
GLCM example

2-bit image matrix:
0 0 1 1
0 0 1 1
0 2 2 2
2 2 3 3

Partial GLCM for the (0,1) spatial relationship (rows: pixel value i, columns: pixel value j, both 0..3):
2 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0

Completed GLCM for the (0,1) spatial relationship:
2 2 1 0
0 2 0 0
0 0 3 1
0 0 0 1
51
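The same matrix can be obtained with scikit-image (assuming skimage is installed; the function is spelled greycomatrix in older releases). Distance 1 at angle 0 corresponds to the (0,1) offset used above:

import numpy as np
from skimage.feature import graycomatrix

image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 2, 2, 2],
                  [2, 2, 3, 3]], dtype=np.uint8)

# distances=[1], angles=[0]  ->  offset of 0 rows and 1 column
glcm = graycomatrix(image, distances=[1], angles=[0], levels=4)

print(glcm[:, :, 0, 0])
# [[2 2 1 0]
#  [0 2 0 0]
#  [0 0 3 1]
#  [0 0 0 1]]   (matches the hand-computed GLCM above)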
Features of GLCM

GLCM (rows: pixel value i, columns: pixel value j, both 0..3):
2 2 1 0
0 2 0 0
0 0 3 1
0 0 0 1

• A line parallel to the main diagonal, one cell away from it, corresponds to pixel pairs with an intensity difference of 1

52
Statistics using GLCM

GLCM of an image (rows i, columns j):
2 2 1 0
0 2 0 0
0 0 3 1
0 0 0 1

• Contrast
• Dissimilarity
• Energy
• Entropy
• Homogeneity
• Correlation

53
GLCM Feature: Contrast

GLCM (rows i, columns j = 0..3):
2 2 1 0
0 2 0 0
0 0 3 1
0 0 0 1

Sum of elements of GLCM = 12

54
GLCM Feature: Contrast

GLCM (rows i, columns j = 0..3):
2 2 1 0
0 2 0 0
0 0 3 1
0 0 0 1

Normalized GLCM = GLCM/12:
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

Sum of elements of GLCM = 12

• Elements of the normalized GLCM are the probabilities p(i, j)

55
GLCM Feature: Contrast

Normalized GLCM (rows i, columns j = 0..3):
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

Elements of the normalized GLCM are the probabilities p(i, j)

Contrast = \sum_{i,j} (i - j)^2 \, p(i, j)

• Contrast = (2/12)(1) + (1/12)(4) + (1/12)(1) = 7/12 = 0.5833

56
GLCM Feature: Contrast

Contrast = \sum_{i,j} (i - j)^2 \, p(i, j)

• Uses weights related to the distance from the GLCM diagonal

• Returns a measure of the intensity contrast between a pixel and its neighbor over the whole image
• Contrast is larger when more entries lie far from the diagonal
• Contrast is defined for the specific spatial relationship used to build the GLCM
• For the (0,1) relationship it measures the contrast in the horizontal direction, with the next pixel
• Contrast can also be defined for other relationships (a code sketch follows below)

57
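scikit-image can also compute most of these statistics directly from the GLCM via graycoprops (a sketch; note that its "energy" property is the square root of the sum of squared probabilities, so "ASM" matches the energy definition used on the later slide, and entropy has to be computed by hand):

import numpy as np
from skimage.feature import graycomatrix, graycoprops

image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 2, 2, 2],
                  [2, 2, 3, 3]], dtype=np.uint8)
glcm = graycomatrix(image, distances=[1], angles=[0], levels=4)

for prop in ("contrast", "dissimilarity", "homogeneity", "ASM", "correlation"):
    print(prop, graycoprops(glcm, prop)[0, 0])
# contrast ~ 0.583, dissimilarity ~ 0.417, homogeneity ~ 0.808 for this example

# Entropy is not a graycoprops property; compute it from the normalised GLCM
p = glcm[:, :, 0, 0].astype(float)
p /= p.sum()
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))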
Ex: GLCM feature, contrast of image

Image (gray levels 0, 200 and 255) and its GLCM
58
Ex: GLCM feature, contrast of image

Image GLCM
59
Dissimilarity derived from Example GLCM

Normalized GLCM (rows i, columns j = 0..3):
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

Dissimilarity = \sum_{i,j} |i - j| \, p(i, j)

• Dissimilarity increases linearly with the pixel difference |i - j|

Dissimilarity = 5/12 = 0.4167

60
Energy derived from Example GLCM

Normalized GLCM (rows i, columns j = 0..3):
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

Energy = \sum_{i,j} p(i, j)^2

Energy = 24/144 = 0.1667

61
Homogeneity derived from GLCM
• Is inversely related to contrast
• Decreases when more entries lie away from the diagonal
• Measures the closeness of the intensity distribution to the GLCM diagonal

Homogeneity = \sum_{i,j} \frac{p(i, j)}{1 + (i - j)^2}

62
Homogeneity derived from Example GLCM

Normalized GLCM p(i, j) (rows i, columns j = 0..3):
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

Homogeneity = \sum_{i,j} \frac{p(i, j)}{1 + (i - j)^2}

Homogeneity = 0.8083

63
Entropy derived from GLCM

Probability p(i, j) = GLCM(i, j)/12 (rows i, columns j = 0..3):
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

• If p(i, j) is small (that pixel combination occurs rarely), its -log2 p(i, j) term is large
• Measures how evenly the pixel pairs with the specified relationship are distributed

Entropy = \sum_{i,j} -p(i, j) \log_2 p(i, j)

Entropy ≈ 2.69
64
GLCM mean, variance

Probability p(i, j) = GLCM(i, j)/12 (rows i, columns j = 0..3):
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

• The GLCM means are the probability-weighted means of the pixel values, taken over the rows i and over the columns j

\mu_i = \sum_{i,j} i \, p(i, j)        \mu_j = \sum_{i,j} j \, p(i, j)

\mu_i = 1.08        \mu_j = 1.5

65
GLCM mean, variance

Probability p(i, j) = GLCM(i, j)/12 (rows i, columns j = 0..3):
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

\sigma_i^2 = \sum_{i,j} p(i, j)(i - \mu_i)^2        \sigma_j^2 = \sum_{i,j} p(i, j)(j - \mu_j)^2

\sigma_i^2 ≈ 1.08        \sigma_j^2 ≈ 0.92

66
Mean and Variance derived from Example GLCM

Probability p(i, j) = GLCM(i, j)/12 (rows i, columns j = 0..3):
2/12 2/12 1/12 0
0    2/12 0    0
0    0    3/12 1/12
0    0    0    1/12

\mu_i = \sum_{i,j} i \, p(i, j)        \mu_j = \sum_{i,j} j \, p(i, j)

\sigma_i^2 = \sum_{i,j} p(i, j)(i - \mu_i)^2        \sigma_j^2 = \sum_{i,j} p(i, j)(j - \mu_j)^2

Correlation = \sum_{i,j} (i - \mu_i)(j - \mu_j) \, p(i, j) / (\sigma_i \sigma_j)

67
Correlation from GLCM

Correlation = \sum_{i,j} (i - \mu_i)(j - \mu_j) \, p(i, j) / (\sigma_i \sigma_j)

• Measures how correlated a pixel is to its neighbor over the whole image

• Range is -1 to 1
• 1 or -1 for a perfectly positively or negatively correlated image
• NaN (undefined) for a constant image, since the standard deviations are zero

68
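Working directly from the normalised GLCM of the running example, the means, variances, correlation and entropy can be checked with a few lines of NumPy (the approximate values in the comments are my own calculations for this example):

import numpy as np

glcm = np.array([[2, 2, 1, 0],
                 [0, 2, 0, 0],
                 [0, 0, 3, 1],
                 [0, 0, 0, 1]], dtype=float)
p = glcm / glcm.sum()                         # normalised GLCM, p(i, j)

i, j = np.indices(p.shape)                    # row and column index grids
mu_i, mu_j = (i * p).sum(), (j * p).sum()     # ~1.08 and 1.5
var_i = (p * (i - mu_i) ** 2).sum()           # ~1.08
var_j = (p * (j - mu_j) ** 2).sum()           # ~0.92
correlation = (p * (i - mu_i) * (j - mu_j)).sum() / np.sqrt(var_i * var_j)  # ~0.80
entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()                             # ~2.69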
GLCM features

69
GLCM features

70
GLCM features for (0,1) relationship

71
GLCM features

72
GLCM features

73
GLCM for texture classification
• The gray-level co-occurrence matrix (GLCM) provides texture features
• Classify images using the GLCM texture features
• Use a machine learning algorithm such as SVM, k-nearest neighbors or random forest (a sketch follows below)

74
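A hedged end-to-end sketch of this classification step (scikit-image and scikit-learn assumed; the random stand-in images and labels are placeholders for real data such as fabric images, and the offsets, kernel and property list are illustrative choices):

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(gray_img, levels=256):
    # GLCM over a few distances/angles, summarised by standard properties
    glcm = graycomatrix(gray_img, distances=[1, 2],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    props = ("contrast", "dissimilarity", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, prop).ravel() for prop in props])

# Placeholder data: in practice these are real images and their class labels
rng = np.random.default_rng(0)
train_images = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(20)]
train_labels = rng.integers(0, 2, size=20)        # e.g. 0 = good fabric, 1 = defect
test_image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

X = np.array([glcm_features(im) for im in train_images])
clf = SVC(kernel="rbf").fit(X, train_labels)
prediction = clf.predict([glcm_features(test_image)])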
GLCM for texture classification

Apply machine learning to identify defects in fabric

75
