
Texture Classification Based on Symbolic Data Analysis

Carlos W.D. de Almeida¹, Renata M.C.R. de Souza¹ and Ana Lúcia B. Candeias²

1 - Federal University of Pernambuco, Center of Informatics (CIn), Recife, Brazil. E-mail: [email protected] and [email protected]
2 - Federal University of Pernambuco, Department of Cartographic Engineering, Recife, Brazil. E-mail: [email protected]
Abstract. This article presents a hybrid approach for texture-based image classification using gray-level co-occurrence matrices (GLCM) and a new Fuzzy Kohonen Clustering Network for Symbolic Interval Data (IFKCN). The GLCM matrices extracted from an image database are processed to create the training data set for the IFKCN algorithm. The IFKCN organizes and extracts prototypes from the processed GLCM matrices. The experimental results are encouraging, with an average classification success rate of 97.39%.
1 Introduction
The perception of texture is believed to play an important role in the human visual system for recognition and interpretation. Several authors have worked on finding descriptors and features for texture identification. Existing features and techniques for modeling textures include Gabor transforms [1], affine adaptation [2] and invariant feature descriptors such as Zernike moments [4]. Haralick [5] suggested the use of gray-level co-occurrence matrices (GLCM) to extract texture features from an image.

Efficient organization and indexing of objects are steps of paramount importance in texture classification. Cluster analysis is the organization of a collection of patterns into clusters based on similarity. In cluster analysis, the items to be grouped are usually represented as a vector of quantitative or qualitative measurements where each column represents a variable. In practice, however, this model is too restrictive to represent complex data: to take into account the variability and/or uncertainty inherent to the data, variables must be allowed to assume sets of categories or intervals, possibly even with frequencies or weights. These kinds of data have mainly been studied in Symbolic Data Analysis (SDA) [6]. The aim of SDA is to provide suitable methods for managing aggregated data described by multi-valued variables, where the cells of the data table contain sets of categories, intervals, or weight distributions [6].

The focus of this paper is to show the effectiveness of a hybrid approach for handling and classifying data from a texture database. The approach is characterized by: (i) a new texture image descriptor based on the Gray Level Co-occurrence Matrix (GLCM) [5] and Symbolic Data Analysis (SDA) [6]; (ii) a new method, the Fuzzy Kohonen Clustering Network for Symbolic Interval Data (IFKCN).
2 The Proposed Approach for Texture Classification
Four main modules are the building blocks of the proposed method: (i) Texture Descriptor, (ii) Symbolic Data Analysis, (iii) Clustering, and (iv) Classification. Our approach has a learning stage where the texture descriptor module receives images from a database and applies the GLCM method.

The gray-level co-occurrence matrix (GLCM) [5] describes the relative frequencies with which two pixels separated by a distance d under a specified angle occur in the image. In this work, we use a set of offsets sweeping through 180 degrees (i.e. 0, 45, 90, and 135 degrees) and d = 1, obtaining four GLCM matrices. These angles are essential to obtain a feature vector that is invariant to rotation. Once these matrices are computed, we have an enormous amount of information on the texture, which is impractical to store. To reduce the dimensionality of the data, we extract symbolic data and obtain a reduced representation. The GLCM matrices are thus pre-processed in order to obtain input data for the clustering and classification modules. In the Clustering module, the IFKCN algorithm organizes and extracts prototypes from the processed matrices, which ends the learning stage. The Classification module receives a pre-processed query image and compares it with the prototypes (representations of clusters) obtained in the clustering module. The final result is a list of images belonging to a small number of clusters considered to be the nearest to the user's query.
2.1 Texture Descriptor Module
A gray-level co-occurrence matrix [5] $P_\theta(i, j)$ describes the relative frequencies with which two pixels separated by a distance $d$ under a specified angle $\theta$ occur in the image, one with gray tone $i$ and the other with gray tone $j$. Such matrices of gray-tone spatial-dependence frequencies depend on the angular relationship between neighboring pixels and on the distance between them. The GLCM can be defined as:

$$P_\theta(i, j) = \Pr\left(I(p_1) = i \,\wedge\, I(p_2) = j \,\mid\, \|p_1 - p_2\| = d\right) \qquad (1)$$

where $\Pr$ denotes probability and $p_1$, $p_2$ are positions in the gray-scale image $I$. In this work, we use a set of offsets sweeping through 180 degrees (i.e. $\theta = 0^\circ, 45^\circ, 90^\circ, 135^\circ$), obtaining four GLCM matrices. The algorithm proceeds as follows:
TEXTURE DESCRIPTOR ALGORITHM SCHEMA
1: Compute for each image $Img_s$:
2: for s = 1 to S do
3:   Compute the GLCMs $P_\theta^s$ from image $Img_s$ using the angles $\theta \in \{0^\circ, 45^\circ, 90^\circ, 135^\circ\}$.
4: end for
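As a concrete illustration of the descriptor step, the following is a minimal NumPy sketch that builds the four angle-dependent GLCMs for one image. It assumes 8-bit gray-scale images stored as integer arrays; the function names (glcm, texture_descriptor) and the row/column offset convention are illustrative choices, not taken from the paper.

```python
import numpy as np

def glcm(image, offset, levels=256):
    """Gray-level co-occurrence matrix for one (dy, dx) offset (illustrative sketch)."""
    dy, dx = offset
    h, w = image.shape
    P = np.zeros((levels, levels), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[image[y, x], image[y2, x2]] += 1.0
    total = P.sum()
    return P / total if total > 0 else P      # normalize counts to probabilities

# Offsets for d = 1 at angles 0, 45, 90 and 135 degrees (row/column image convention).
OFFSETS = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}

def texture_descriptor(image):
    """Return the four angle-dependent GLCMs P_theta for one image."""
    return {theta: glcm(image, off) for theta, off in OFFSETS.items()}
```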
2.2 SDA Module
This module transforms the GLCM matrices $P_\theta$ into a single GLCM matrix $PI$ composed of symbolic interval data. Then, we re-extract symbolic interval data from the matrix $PI$ in order to obtain a vector of interval data $X$ which, in turn, constitutes the input data for the clustering module. For each position in the gray-scale matrix $P_\theta(i, j)$, we extract the minimum and maximum probabilities over all angles $\theta$. In this way, we create a new variable $PI(i, j)$ in order to capture the variability of the probability over different values of $\theta$. Problems with choosing $[\min; \max]$ can arise when these extreme values are in fact outliers. The extraction of the minimum and maximum probabilities is computed as:
$$PI(i, j) = \begin{cases} [0;\, 0] & \text{if } \max_\theta P_\theta(i, j) = 0 \\[4pt] \big[\min_\theta\{P_\theta(i, j) : P_\theta(i, j) > 0\};\; \max_\theta P_\theta(i, j)\big] & \text{otherwise} \end{cases} \qquad (2)$$
The SDA module assumes that the interval matrix $PI$ is composed of $n$ items or individuals (rows) that are described by $p$ interval-type variables (columns). For a given number $n$ of interval data $x_i = [a_i, b_i]$ $(i = 1, \ldots, n)$, the extraction of a vector $X$ from a matrix $PI$ is defined by the following equation:
$$x_i = \begin{cases} [0,\, 0] & \text{if } \max\{b_{ij} \mid j = 1, \ldots, p\} = 0 \\[4pt] \big[\min\{a_{ij} \mid j = 1, \ldots, p,\; a_{ij} > 0\},\; \max\{b_{ij} \mid j = 1, \ldots, p\}\big] & \text{otherwise} \end{cases} \qquad (3)$$
SDA ALGORITHM SCHEMA
1: Given the GLCM matrices $P_\theta$, do
2: for i = 1 to I do
3:   for j = 1 to J do
4:     Compute $PI(i, j)$ by finding the minimum and maximum probabilities over all angles $P_\theta(i, j)$ ($\theta = 0^\circ, 45^\circ, 90^\circ, 135^\circ$), using Equation 2.
5:   end for
6: end for
7: for i = 1 to I do
8:   Find the minimum and maximum boundary in the $PI$ matrix for the $j$th variable ($j = 1, \ldots, p$), using Equation 3.
9: end for
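A hedged NumPy sketch of Equations 2 and 3, assuming the per-angle GLCMs come from the descriptor sketch above. The helper names are illustrative, and the reading of the lower bound as the minimum over the strictly positive entries follows the paper's notation as reconstructed here.

```python
import numpy as np

def glcms_to_interval_matrix(glcms):
    """Equation (2): build the interval matrix PI from the four angle GLCMs.

    `glcms` is a dict {angle: 2-D probability matrix}; returns two arrays
    (PI_low, PI_high) holding the interval bounds for each (i, j) cell.
    """
    stack = np.stack(list(glcms.values()), axis=0)   # shape: (4, levels, levels)
    high = stack.max(axis=0)
    # minimum over strictly positive entries; cells whose maximum is 0 stay [0, 0]
    masked = np.where(stack > 0, stack, np.inf)
    low = np.where(high > 0, masked.min(axis=0), 0.0)
    return low, high

def interval_matrix_to_vector(PI_low, PI_high):
    """Equation (3): collapse each row of PI into one interval x_i = [a_i, b_i]."""
    b = PI_high.max(axis=1)
    masked = np.where(PI_low > 0, PI_low, np.inf)
    a = np.where(b > 0, masked.min(axis=1), 0.0)
    return np.stack([a, b], axis=1)                  # shape: (n, 2)
```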
2.3 Clustering Module
The classical Fuzzy Kohonen Clustering Network (FKCN) [7] is a batch clustering
algorithm that combines the ideas of fuzzy membership values for learning rates, the
parallelism of the Fuzzy C-Means (FCM), and the structure and self-organizing update
rules of the Kohonen Clustering Network (KCN) [8].
Let $\Omega = \{1, \ldots, n\}$ be a set of $n$ objects indexed by $k$ and described by $p$ symbolic interval variables $\{x_k^1, \ldots, x_k^p\}$ indexed by $j$. A symbolic interval variable $X$ [6] is a correspondence defined from $\Omega$ in $\mathbb{R}$ such that for each $k \in \Omega$, $X(k) = [a, b] \in \Im$, where $\Im = \{[a, b] : a, b \in \mathbb{R},\, a \le b\}$ is the set of closed intervals defined from $\mathbb{R}$. Each pattern $k$ is represented as a vector of intervals $x_k = (x_k^1, \ldots, x_k^p)$, where $x_k^j = [a_k^j, b_k^j] \in \Im$. This method looks for a partition of $\Omega$ into $c$ clusters $\{P_1, \ldots, P_c\}$ indexed by $i$. A prototype $g_i$ of cluster $P_i$ is also represented as a vector of intervals $g_i = (g_i^1, \ldots, g_i^p)$, where $g_i^j = [\alpha_i^j, \beta_i^j] \in \Im$. For a partition of $\Omega$ into $c$ clusters $\{P_1, \ldots, P_c\}$ and a corresponding set of prototypes $\{g_1, \ldots, g_c\}$, an adequacy criterion $J$ measuring the fitting between the clusters and their prototypes is locally minimized. This criterion $J$ is based on a squared Euclidean distance and is defined as:
$$J = \sum_{i=1}^{c} \sum_{k=1}^{n} (u_{ik})^m \, d^2(x_k, g_i) = \sum_{i=1}^{c} \sum_{k=1}^{n} (u_{ik})^m \sum_{j=1}^{p} \left[ (a_k^j - \alpha_i^j)^2 + (b_k^j - \beta_i^j)^2 \right] \qquad (4)$$
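For concreteness, the adequacy criterion of Equation 4 can be sketched as follows, assuming interval vectors are stored as arrays of shape (p, 2) with lower and upper bounds in the two columns; the function names are illustrative.

```python
import numpy as np

def interval_sq_distance(x, g):
    """Squared Euclidean distance between interval vectors (inner sum of Equation 4)."""
    return float(np.sum((x[:, 0] - g[:, 0]) ** 2 + (x[:, 1] - g[:, 1]) ** 2))

def criterion_J(X, G, U, m):
    """Adequacy criterion J for data X (n, p, 2), prototypes G (c, p, 2), memberships U (c, n)."""
    return sum((U[i, k] ** m) * interval_sq_distance(X[k], G[i])
               for i in range(G.shape[0]) for k in range(X.shape[0]))
```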
The algorithm starts from an initial partition and converges when the criterion $J$ reaches a stationary value representing a local minimum. IFKCN controls the learning rates and the size of the neighborhoods by changing the value of the weight exponent with time as follows:

$$m_t = m_0 - t\,\frac{m_0 - 1}{t_{\max}} \qquad (5)$$
where $m_0$ is the initial value of the weight exponent (greater than one) and $t_{\max}$ is the iteration limit. The membership degree $u_{ik}$ $(k = 1, \ldots, n)$ of each pattern $k$ in each cluster $P_i$, minimizing the clustering criterion $J$ under $u_{ik} \ge 0$ and $\sum_{i=1}^{c} u_{ik} = 1$, is updated according to the following expression:
$$u_{ik,t} = \left[ \sum_{h=1}^{c} \left( \frac{\sum_{j=1}^{p} \left[ (a_k^j - \alpha_{i,t-1}^j)^2 + (b_k^j - \beta_{i,t-1}^j)^2 \right]}{\sum_{j=1}^{p} \left[ (a_k^j - \alpha_{h,t-1}^j)^2 + (b_k^j - \beta_{h,t-1}^j)^2 \right]} \right)^{\frac{2}{m_t - 1}} \right]^{-1} \quad (i = 1, \ldots, c) \qquad (6)$$
The fuzzified learning rate is defined as:

$$\lambda_{ik,t} = (u_{ik,t})^{m_t} \qquad (7)$$
The prototype $g_i = (g_i^1, \ldots, g_i^p)$ of class $P_i$ $(i = 1, \ldots, c)$, which minimizes the clustering criterion $J$, has the bounds of the interval $g_i^j = [\alpha_i^j, \beta_i^j]$ $(j = 1, \ldots, p)$ updated according to the following expression:

$$\alpha_{i,t}^j = \alpha_{i,t-1}^j + \frac{\sum_{k=1}^{n} \lambda_{ik,t}\,(a_k^j - \alpha_{i,t-1}^j)}{\sum_{k=1}^{n} \lambda_{ik,t}} \quad \text{and} \quad \beta_{i,t}^j = \beta_{i,t-1}^j + \frac{\sum_{k=1}^{n} \lambda_{ik,t}\,(b_k^j - \beta_{i,t-1}^j)}{\sum_{k=1}^{n} \lambda_{ik,t}} \qquad (8)$$
IFKCN-FD ALGORITHM SCHEMA
1: Fix the number of clusters c. Select $\varepsilon > 0$. Set $t_{\max}$ and $m_0 > 1$.
2: Initialize the prototype vectors $g_i$ and the membership degrees $u_{ik}$ $(k = 1, \ldots, n)$ $(i = 1, \ldots, c)$ of pattern $k$ belonging to cluster $P_i$ such that $u_{ik} \ge 0$ and $\sum_{i=1}^{c} u_{ik} = 1$.
3: for t = 1 to $t_{\max}$ do
4:   for k = 1 to n do
5:     Calculate the weight exponent (Equation 5), the fuzzy membership (Equation 6) and the fuzzified learning rate (Equation 7).
6:     Update all cluster centroid intervals (Equation 8).
7:     Compute $E_t = \|v_t - v_{t-1}\|^2$.
8:     if $E_t \le \varepsilon$ then
9:       Stop.
10:    else
11:      next t.
12:    end if
13:  end for
14: end for
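The following is a self-contained, batch-style sketch of the clustering module under the reconstructed Equations 5-8. The random initialization of prototypes, the stopping test on prototype movement, and the small clamp that keeps $m_t$ strictly above one at the final iterations are implementation assumptions, not prescriptions of the paper.

```python
import numpy as np

def ifkcn_fit(X, c, t_max=100, m0=2.0, eps=1e-6, seed=0):
    """Sketch of IFKCN training. X has shape (n, p, 2): n patterns, p interval
    variables with [lower, upper] bounds. Returns prototypes G (c, p, 2) and memberships U (c, n)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    G = X[rng.choice(n, size=c, replace=False)].astype(float)   # assumption: start from random patterns
    U = np.full((c, n), 1.0 / c)

    def sq_dist(x, g):
        # inner sum of Equation 4: squared Euclidean distance between interval vectors
        return np.sum((x[:, 0] - g[:, 0]) ** 2 + (x[:, 1] - g[:, 1]) ** 2)

    for t in range(1, t_max + 1):
        m_t = m0 - t * (m0 - 1.0) / t_max              # Equation 5: weight exponent schedule
        m_t = max(m_t, 1.0 + 1e-2)                     # assumption: keep the exponent finite as m_t -> 1
        D = np.array([[sq_dist(X[k], G[i]) for k in range(n)] for i in range(c)])
        D = np.maximum(D, 1e-12)                       # avoid division by zero for exact matches
        ratio = (D[:, None, :] / D[None, :, :]) ** (2.0 / (m_t - 1.0))
        U = 1.0 / ratio.sum(axis=1)                    # Equation 6: fuzzy memberships
        lam = U ** m_t                                 # Equation 7: fuzzified learning rate
        G_old = G.copy()
        for i in range(c):                             # Equation 8: update both interval bounds
            w = lam[i][:, None, None]                  # shape (n, 1, 1)
            G[i] = G_old[i] + (w * (X - G_old[i])).sum(axis=0) / lam[i].sum()
        if np.sum((G - G_old) ** 2) <= eps:            # stop when prototypes stabilize
            break
    return G, U
```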
2.4 Classification Module
This module receives a pre-processed query image and returns to the user the class considered to be the most similar to his/her query. After the training phase, it is possible to use the IFKCN to construct a classifier in which each prototype represents one class type. If labelled data are available, this information can be used to assign each prototype a label. Each IFKCN prototype is labelled by a vote among the labels of the input data vectors assigned to it, keeping only the most frequent label. Finally, the class label of each original data vector is the label of the corresponding Best Matching Unit (BMU) [8]. A BMU is the winning node of the IFKCN, that is, the prototype most similar to the query. The search procedure considers at least the first BMU; it is possible that more than one BMU has to be considered in order to classify.
CLASSIFICATION ALGORITHM SCHEMA
1: Compare $x_j = (x_j^1, \ldots, x_j^p)$ with all the node vectors $m_d$ $(d = 1, \ldots, M)$ of the IFKCN using the Euclidean distance.
2: Obtain an ordered list of all BMUs.
3: Determine the first BMU in the IFKCN.
4: Extract the label associated with the selected BMU(s).
5: if the label data are not available then
6:   Determine the next BMU in the IFKCN and repeat step 4.
7: end if
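A minimal sketch of the labelling-by-vote and BMU search described above, under the same interval representation used in the clustering sketch; the helper names (label_prototypes, classify) are illustrative.

```python
import numpy as np

def sq_dist(x, g):
    """Squared Euclidean distance between two interval vectors of shape (p, 2)."""
    return float(np.sum((x[:, 0] - g[:, 0]) ** 2 + (x[:, 1] - g[:, 1]) ** 2))

def label_prototypes(G, X_train, y_train):
    """Give each prototype the most frequent label among the training patterns it wins."""
    bmu = [min(range(G.shape[0]), key=lambda i: sq_dist(x, G[i])) for x in X_train]
    labels = []
    for i in range(G.shape[0]):
        owners = [y_train[k] for k, b in enumerate(bmu) if b == i]
        labels.append(max(set(owners), key=owners.count) if owners else None)
    return labels

def classify(x_query, G, proto_labels):
    """Walk the ordered BMU list and return the label of the first labelled prototype."""
    order = np.argsort([sq_dist(x_query, G[i]) for i in range(G.shape[0])])
    for i in order:
        if proto_labels[i] is not None:
            return proto_labels[i]
    return None
```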
3 Experimental Results

In order to assess the performance of the proposed hybrid approach, experiments with the Brodatz database [9] were carried out. In the experiments, each Brodatz texture constitutes a separate class. Each texture has 640 x 640 pixels, with 8 bits/pixel. Each texture was partitioned into 32 x 32 subimages, taking 400 non-overlapping texture samples in each class. The samples were separated into two disjoint sets, one for training and the other for testing the classifier. This corpus is the same used by Li et al. [10].

The evaluation is based on the correct classification rate (CCR) [10]. These measurements are estimated in the framework of a Monte Carlo experiment with 30 random partitions of the training and test sets. The approach is compared with several classifiers from Li et al. [10].

In this experiment, the effect of the training set size on the classifier accuracy was then assessed. A small fraction (from 1.25% to 10%) of the 400 subimages is used for training the classifiers, while the rest is used for testing. The number of training samples in each class was set to 1.25% (5 samples), 2.5% (10 samples), 3.75% (15 samples), 5% (20 samples), 6.25% (25 samples), 7.5% (30 samples), 8.75% (35 samples) and 10% (40 samples). For the proposed method, the number of clusters c was set to 30 (the same number of classes in the database). Figure 1 summarizes the results of the proposed method, along with the results [10] for the single and fused SVM classifiers, the Bayes classifiers using Bayes distance and Mahalanobis distance, and the LVQ classifier. From this figure, we can observe the superiority of the proposed classifier in comparison with the other ones.

Fig. 1: Texture classification CCR rates
4 Conclusions

In this paper, an approach for texture-based image classification using gray-level co-occurrence matrices (GLCM) and the FKCN for Symbolic Interval Data (IFKCN) method is presented. To show the usefulness of the proposed methodology, an application with a benchmark data set was considered. The proposed classifier was compared with several classifiers according to the classification error rate. The results demonstrated that our method outperformed the other ones.
References

[1] Anil K. Jain and Farshid Farrokhnia. Unsupervised texture segmentation using Gabor filters. Pattern Recognition, 24(12):1167-1186, 1991.
[2] J. Zhang, M. Marszalek, S. Lazebnik, and C. Schmid. Local features and kernels for classification of texture and object categories: A comprehensive study. International Journal of Computer Vision, 73(2):213-238, 2007.
[3] Dimitrios Charalampidis and Takis Kasparis. Wavelet-based rotational invariant roughness features for texture classification and segmentation. IEEE Trans. Image Process., 11:825-837, 2002.
[4] A. Khotanzad and Y.H. Hong. Invariant image recognition by Zernike moments. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(5):489-497, 1990.
[5] Robert M. Haralick, K. Shanmugam, and Its'Hak Dinstein. Textural features for image classification. IEEE Transactions on Systems, Man and Cybernetics, 3(6):610-621, 1973.
[6] Hans-Hermann Bock and Edwin Diday. Analysis of Symbolic Data: Exploratory Methods for Extracting Statistical Information from Complex Data. Springer, 2000.
[7] James C. Bezdek, Eric Chen-Kuo Tsao, and Nikhil R. Pal. Fuzzy Kohonen clustering networks. In Proc. of the First IEEE Conference on Fuzzy Systems, San Diego, USA, 1992.
[8] T. Kohonen. Self-Organizing Maps. Springer-Verlag, 3rd edition, 2001.
[9] P. Brodatz. Textures: A Photographic Album for Artists and Designers. Dover Publications, 1966.
[10] Shutao Li, James T. Kwok, Hailong Zhu, and Yaonan Wang. Texture classification using the support vector machines. Pattern Recognition, 36(12):2883-2893, 2003.