Nanda and Panda 2013 - A Survey On Nature Inspired Metaheuristic Algorithms For Partitional Clustering
Review
Department of Electronics and Communication Engineering, Malaviya National Institute of Technology Jaipur, Rajasthan 302017, India
School of Electrical Sciences, Indian Institute of Technology Bhubaneswar, Odisha 751013, India
Article info
Abstract
Article history:
Received 10 October 2012
Received in revised form
23 August 2013
Accepted 20 November 2013
The partitional clustering concept started with the K-means algorithm, which was published in 1957. Since then many classical partitional clustering algorithms have been reported based on the gradient descent approach. The 1990s kick-started a new era in cluster analysis with the application of nature inspired metaheuristics. Nearly two decades have passed since those initial formulations, and researchers have developed numerous new algorithms in this field. This paper embodies an up-to-date review of all major nature inspired metaheuristic algorithms employed to date for partitional clustering. Further, key issues involved in formulating various metaheuristics as a clustering problem and major application areas are discussed.
© 2014 Published by Elsevier B.V.
Keywords:
Partitional clustering
Nature inspired metaheuristics
Evolutionary algorithms
Swarm intelligence
Multi-objective clustering
Contents

1. Introduction
2. Single objective nature inspired metaheuristics in partitional clustering
   2.1. Problem formulation
   2.2. Historical developments in nature inspired metaheuristics for partitional clustering
        2.2.1. Evolutionary algorithms in partitional clustering
        2.2.2. Physical algorithms in partitional clustering
        2.2.3. Swarm intelligence algorithms in partitional clustering
        2.2.4. Bio-inspired algorithms in partitional clustering
        2.2.5. Other nature inspired metaheuristics for partitional clustering
   2.3. Fitness functions for partitional clustering
   2.4. Cluster validity indices
3. Multi-objective algorithms for flexible clustering
   3.1. Problem formulation
   3.2. Historical development in multi-objective algorithms for partitional clustering
   3.3. Evaluation methods
4. Real life application areas of nature inspired metaheuristics based partitional clustering
5. Conclusion
6. Future research issues
References
Corresponding author.
E-mail addresses: [email protected] (S.J. Nanda), [email protected] (G. Panda).
Please cite this article as: S.J. Nanda, G. Panda, A survey on nature inspired metaheuristic algorithms for partitional clustering, Swarm
and Evolutionary Computation (2014), http://dx.doi.org/10.1016/j.swevo.2013.11.003i
1. Introduction
Data clustering determines a group of patterns in a dataset which are homogeneous in nature. The objective is to develop an automatic algorithm which can accurately classify an unlabeled dataset into groups. Recent literature [1–5] broadly classifies clustering algorithms into three categories: hierarchical, partitional and overlapping. The hierarchical algorithms provide a tree structure output (dendrogram plot) which represents the nested grouping of the elements of a dataset [6,7]. They do not require a priori knowledge about the number of clusters present in the dataset [8,9]. However, the process involved in the algorithm is assumed to be static, and elements assigned to a given cluster cannot move to other clusters [10]. Therefore they exhibit poor performance when the separation of overlapping clusters is carried out.
The overlapping nature of clusters is better expressed in fuzzy clustering [11–13]. Popular algorithms include fuzzy c-means (FCM) [14] and the fuzzy c-shells algorithm (FCS) [15]. In this approach each element of a dataset belongs to all the clusters with a fuzzy membership grade. A fuzzy clustering can be converted to a crisp clustering (any element belongs to one cluster only) by assigning each element to the cluster in which it has the highest membership value.
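The fuzzy-to-crisp conversion amounts to an arg-max over each element's membership grades; a minimal sketch with a hypothetical membership matrix:

```python
import numpy as np

# Hypothetical fuzzy membership matrix U: rows are dataset elements,
# columns are clusters; each row holds membership grades summing to 1.
U = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
])

# Crisp conversion: assign each element to the cluster in which it
# has the highest membership value.
crisp_labels = U.argmax(axis=1)
print(crisp_labels.tolist())  # [0, 1, 2]
```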
Partitional clustering divides a dataset into a number of groups based upon a certain criterion known as the fitness measure. The fitness measure directly affects the nature of formation of the clusters. Once an appropriate fitness measure is selected, the partitioning task is converted into an optimization problem (for example: grouping based on minimization of distance or maximization of correlation between patterns, or optimizing their density in the N-dimensional space). These partitional techniques are popular in various research fields due to their capability to cluster large datasets (examples: in signal and image processing for image segmentation [16], in wireless sensor networks for classifying the sensors to enhance lifetime and coverage [17–20], in communication to design accurate blind equalizers [21], in robotics to efficiently classify humans based upon their activities [22], in computer science for web mining and pattern recognition [23], in economics research to identify groups of homogeneous consumers [24], in management studies to determine portfolios [27], in seismology to classify aftershocks from the regular background events [28], to perform high dimensional data analysis [29], in medical sciences to identify diseases from groups of patient reports and genomic studies [30], in library sciences for grouping books based upon their content [32], etc.). In all these applications the nature of the patterns associated with the datasets differs. Therefore a single partitional algorithm cannot universally solve all problems. Thus, given a problem in hand, a user has to carefully investigate the nature of the patterns associated with the dataset and select the appropriate clustering strategy.
The K-means algorithm is the most fundamental partitional clustering concept; it was published by Lloyd of Bell Telephone Laboratories in 1957 [373–375]. More than 50 years after its introduction, this algorithm is still popular and widely used for high dimensional datasets due to its simplicity and lower computational complexity [34,376,377]. In this case the minimization of the Euclidean distance between the elements and the cluster center is considered as the optimization criterion. Inspired by K-means, a number of gradient algorithms for partitional clustering have been developed by researchers, which include bisecting K-means [35] (recursively dividing the dataset into two clusters in each step), sort-means [36] (means are sorted in order of increasing distance from each mean to speed up the traditional process), kd-tree [37] (determines the closest cluster centers for all the data points),
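The alternation of assignment and update steps that defines K-means is short enough to sketch directly; a minimal Lloyd-style implementation (the deterministic seeding below is chosen only for illustration, real implementations typically seed randomly or with k-means++):

```python
import numpy as np

def kmeans(Z, K, iters=100):
    """Lloyd's K-means: alternate between assigning each point to its
    nearest center and moving each center to the mean of its assigned
    points, until the centers stop moving."""
    # Deterministic seeding for the sketch: K points spread over the data.
    centers = Z[np.linspace(0, len(Z) - 1, K).astype(int)].astype(float)
    for _ in range(iters):
        # Assignment step: nearest center by Euclidean distance.
        d = np.linalg.norm(Z[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: each center becomes the mean of its cluster
        # (empty clusters keep their previous center).
        new = np.array([Z[labels == k].mean(axis=0) if np.any(labels == k)
                        else centers[k] for k in range(K)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Two well-separated groups of points.
Z = np.vstack([np.zeros((10, 2)), np.full((10, 2), 10.0)])
labels, centers = kmeans(Z, 2)
```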
Table 1
Broad classification of nature inspired metaheuristic algorithms (multi-objective variants by type).

Evolutionary algorithms: NSGA II [305,306], multi-objective DE [343], multi-objective GP [317], multi-objective ES [318], SPEA [326], PESA II [325]
Physical algorithms: multi-objective SA [313], multi-objective MA [314], multi-objective HS [315], multi-objective SFL [316]
Swarm intelligence: multi-objective ACO [333], multi-objective PSO [307], multi-objective ABC [310], multi-objective FSA [321]
Bio-inspired algorithms: multi-objective CSO [311], multi-objective cuckoo search [319], multi-objective firefly [312], multi-objective IWO [283], multi-objective GSA [320]
inspired metaheuristics used in partitional clustering, (2) an up-to-date survey on flexible partitional clustering based on multi-objective metaheuristic algorithms, (3) consolidation of recently developed cluster validation measures, and (4) exploration of new application areas of partitional clustering algorithms.
The paper is organized as follows. Section 2 deals with the advances in single objective nature inspired metaheuristics for partitional clustering, which includes recent developments in algorithm design, fitness function selection and cluster validity indices used for verification. The multi-objective metaheuristics used for flexible clustering are discussed in Section 3. The real life application areas of nature inspired partitional clustering are highlighted in Section 4. The concluding remarks of the investigation made in the survey are presented in Section 5. Finally, a number of issues for innovative future research are presented in Section 6.
The K clusters of a dataset Z_{N×D} must constitute a valid partition (1):

C_k ≠ ∅,  ∀ k = 1, 2, …, K;
C_k ∩ C_l = ∅,  ∀ k, l = 1, 2, …, K and k ≠ l;
∪_{k=1}^{K} C_k = Z.

The quality of the partitions is evaluated with a fitness function f(Z_{N×D}, C_k), ∀ k = 1, 2, …, K.
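The three validity constraints (non-empty clusters, pairwise disjointness, and full coverage of Z) can be checked mechanically; a small sketch with a hypothetical partition of a six-element dataset:

```python
def is_valid_partition(Z, clusters):
    """Check the three partition constraints: every cluster is
    non-empty, clusters are pairwise disjoint, and their union
    recovers the whole dataset Z."""
    if any(len(c) == 0 for c in clusters):
        return False                       # some C_k is empty
    seen = set()
    for c in clusters:
        if seen & set(c):
            return False                   # C_k and C_l overlap
        seen |= set(c)
    return seen == set(Z)                  # union must equal Z

Z = range(6)  # six element indices standing in for data points
print(is_valid_partition(Z, [{0, 1}, {2, 3}, {4, 5}]))  # True
print(is_valid_partition(Z, [{0, 1}, {1, 2}, {3, 4, 5}]))  # False
```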
In the last two decades a number of nature inspired metaheuristics have been proposed in the literature and applied to many real life applications. In recent years these metaheuristic algorithms have been used successfully to solve various unsupervised optimization problems. At the present stage, for any unsupervised optimization problem in hand, a user can readily pick a suitable metaheuristic algorithm for the purpose. The solutions achieved approach optimality as these population based algorithms explore the entire search space as the generations progress.
The basic steps associated with the core algorithms for partitional clustering are listed in Table 2. The recent works on partitional clustering are outlined in sequence.
Table 2
Basic steps involved in the single objective standard GA, DE, ACO, PSO, ABC, AIS and BFO algorithms for solving the partitional clustering problem. Each algorithm repeats its steps over generations until convergence and then delivers the cluster output.

GA: Initialize chromosomes → Fitness → Crossover → Mutation → Selection → Next generation → Cluster output
DE: Initialize particles → Fitness → Mutation → Crossover → Selection → Next generation → Cluster output
ACO: Initialize ants → Fitness → Drop or pick → Update pheromone intensity → Short memory → Next generation → Cluster output
PSO: Initialize particles → Fitness → Velocity update → Position update → Compute GBest and PBest → Next generation → Cluster output
ABC: Initialize bees → Fitness → Compute employed bees → Onlooker bees → Greedy selection and fitness → Next generation → Cluster output
AIS: Initialize immune cells → Fitness → Clone → Mutation → Selection → Next generation → Cluster output
BFO: Initialize bacteria → Chemotaxis → Swarming → Fitness → Reproduction → Elimination and dispersal → Next generation → Cluster output
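The per-generation loop shared by these algorithms can be made concrete with one instance. Below is a minimal differential evolution (DE) clustering sketch, where each individual encodes K cluster centers and the fitness is the sum of squared distances of points to their nearest center; this is a common encoding choice for illustration, not the exact formulation of any single surveyed paper:

```python
import numpy as np

def de_cluster(Z, K, pop_size=20, gens=80, F=0.5, CR=0.9, seed=1):
    """Differential evolution clustering following the loop in Table 2:
    initialize, fitness, mutation, crossover, selection, next generation.
    Each individual encodes K cluster centers as one flat vector."""
    rng = np.random.default_rng(seed)
    N, D = Z.shape

    def fitness(ind):
        # Sum of squared distances of points to their nearest center.
        centers = ind.reshape(K, D)
        d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return d2.min(axis=1).sum()

    lo, hi = Z.min(axis=0), Z.max(axis=0)
    pop = rng.uniform(np.tile(lo, K), np.tile(hi, K), (pop_size, K * D))
    fit = np.array([fitness(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = rng.choice(others, 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])        # mutation
            mask = rng.random(K * D) < CR                   # crossover
            mask[rng.integers(K * D)] = True
            trial = np.where(mask, mutant, pop[i])
            tf = fitness(trial)
            if tf <= fit[i]:                                # selection
                pop[i], fit[i] = trial, tf
    best = pop[fit.argmin()].reshape(K, D)
    d2 = ((Z[:, None, :] - best[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1), best

# Two well separated groups; DE should place one center in each.
Z = np.vstack([np.zeros((8, 2)), np.full((8, 2), 10.0)])
labels, centers = de_cluster(Z, 2)
```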
(SMT) setup time. Feng-jie and Ye [130] applied a GA and PSO based hybrid clustering algorithm for image segmentation of transmission line pictures to determine faults. This system is helpful for remote video monitoring. Hong and Kwong [131] combined a steady-state genetic algorithm and ensemble learning for cluster analysis. Chaves and Lorena [132] developed a hybrid algorithm, Clustering Search (consisting of a GA along with a local search heuristic), to solve the capacitated centered clustering problem. Recently a two stage genetic algorithm was proposed by He et al. [134] for cluster analysis, in which two-stage selection and mutation operations are incorporated to enhance the search capability of the algorithm. The two stage genetic algorithm provides accurate results compared to agglomerative k-means [133] and standard genetic k-means algorithms. The grouping genetic algorithm (GGA) is a compact encoding proposed by Falkenauer [135] to handle grouping-based problems. The GGA is successfully used for cluster analysis of benchmark UCI datasets in [136]. Recently Tan et al. [137] applied the GGA based clustering technique to improve the spectral efficiency of OFDMA (orthogonal frequency-division multiple access) based multicast systems.
Table 3
Similarity functions f used by the single objective nature inspired metaheuristic algorithms for cluster analysis, considering a dataset Z_{N×D} = {z_1, z_2, …, z_N} to be divided into K clusters with valid partitions C_k as per (1).

Medoid distance
Explanation: Minimization of the sum of distances between objects and medoids of the dataset.
Representation: F_1 = Σ_{i=1}^{N} min_{j∈{1,…,K}} d(z_i, m_j), where the medoids {m_1, m_2, …, m_K} ⊂ Z and d is any distance.
Used in: Lucasius et al. [121], Castro and Murray [122], Sheng and Liu [107]

Centroid distance
Explanation: Minimization of the sum of squared Euclidean distances of objects from their respective cluster means.
Representation: F_2 = Σ_{j=1}^{K} Σ_{z_i∈c_j} ||z_i − μ_j||², where μ_j is the mean of c_j.

Distortion distance
Explanation: Minimization of intra-cluster diversity.
Representation: F_3 = F_2 / (N·D)
Used in: Krishna and Murty [106], Lu et al. [108,109], Franti et al. [124], Kivijarvi et al. [125]

Variance ratio criterion (VRC)
Explanation: It is the ratio of the between-cluster (B) and pooled within-cluster (W) covariance matrices. The VRC should be maximized.
Representation: F_4 = VRC = [trace(B)/(K−1)] / [trace(W)/(N−K)]
Used in: Cowgill et al. [110], Casillas et al. [115]

F_5
Used in: Maulik and Bandyopadhyay [105], Zhang and Cao [207], Murthy and Chowdhury [103]

Dunn's index
Explanation: Dunn's index is to be maximized for an optimal partition.
Representation: F_6 = DI_K = min_{i∈K} { min_{j∈K, j≠i} [ δ(c_i, c_j) / max_{k∈K} Δ(c_k) ] }, where δ(c_i, c_j) = min{d(z_i, z_j) : z_i ∈ c_i, z_j ∈ c_j}, Δ(c_k) = max{d(z_i, z_j) : z_i, z_j ∈ c_k}, and d is the distance.
Used in: Dunn [293], Zhang and Cao [207]

Davies–Bouldin (DB) index
Explanation: Ratio of the sum of within-cluster scatter to between-cluster separation. The DB index is to be minimized.
Representation: F_7 = DB_K = (1/K) Σ_{i=1}^{K} R_{i,qt}, where R_{i,qt} = max_{j∈K, j≠i} [(S_{i,q} + S_{j,q}) / d_{ij,t}]. The ith cluster scatter is S_{i,q} = [(1/N_i) Σ_{z∈c_i} ||z − μ_i||^q]^{1/q}, where N_i and μ_i are the number of elements and center of c_i respectively. The separation distance between the ith and jth clusters is d_{ij,t} = [Σ_{d=1}^{D} |μ_{i,d} − μ_{j,d}|^t]^{1/t}.
Used in: Davies and Bouldin [291], Cole [123], Das et al. [16], Bandyopadhyay and Maulik [113], Agustin-Blas et al. [136]

CS measure
Explanation: The CS measure is to be minimized for optimal partitioning.
Representation: F_8 = CS(K) = [Σ_{i=1}^{K} (1/N_i) Σ_{z_j∈c_i} max_{z_q∈c_i} d(z_j, z_q)] / [Σ_{i=1}^{K} min_{j∈K, j≠i} d(m_i, m_j)], with centroid m_i = (1/N_i) Σ_{z_j∈c_i} z_j, where N_i is the number of elements in c_i.

Silhouette
Explanation: A higher silhouette indicates a better assignment of elements.
Representation: F_9 = (1/N) Σ_{i=1}^{N} S(z_i), where S(z_i) = [b(z_i) − a(z_i)] / max{a(z_i), b(z_i)}, with silhouette range −1 ≤ S(z_i) ≤ 1. For element z_i ∈ A, with A and B clusters from {c_k}: a(z_i) is the average dissimilarity of z_i to the other elements of A, and the neighbor dissimilarity is b(z_i) = min diss(z_i, B), A ≠ B.
Used in: Kaufman and Rousseeuw [294], Hruschka et al. [3,118]
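Several of the tabulated similarity functions are short computations. As an illustration, the centroid distance F_2 and the distortion F_3 can be sketched as follows (Euclidean distance assumed, toy data hypothetical):

```python
import numpy as np

def centroid_distance(Z, labels, K):
    """F2: sum of squared Euclidean distances of each object from the
    mean of its assigned cluster (to be minimized)."""
    total = 0.0
    for k in range(K):
        members = Z[labels == k]
        mu = members.mean(axis=0)
        total += ((members - mu) ** 2).sum()
    return total

def distortion(Z, labels, K):
    """F3: F2 normalized by N * D (intra-cluster diversity)."""
    N, D = Z.shape
    return centroid_distance(Z, labels, K) / (N * D)

# Two clusters of two points each, two units apart within a cluster.
Z = np.array([[0.0, 0.0], [0.0, 2.0], [10.0, 0.0], [10.0, 2.0]])
labels = np.array([0, 0, 1, 1])
print(centroid_distance(Z, labels, 2))  # 4.0
print(distortion(Z, labels, 2))         # 0.5
```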
given by

P_accept = exp( [f(x) − f(x′)] / T )

Lu et al. [165] developed a fast simulated annealing based clustering approach by combining multiple clusterings based on different agreement measures between partitions. Recently SA based clustering has been applied to group suppliers for effective management and to fulfill the demands of customers (i.e. to build a good supply chain management system) [166].
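The acceptance rule above can be sketched directly; the `accept` helper below (a hypothetical name, for illustration) implements the Metropolis criterion with an injectable random source so its behavior is easy to test:

```python
import math
import random

def accept(f_curr, f_new, T, rng=random.random):
    """Metropolis acceptance rule used in SA based clustering:
    always accept an improving partition; accept a worse one with
    probability exp(-(f_new - f_curr) / T)."""
    if f_new <= f_curr:
        return True
    return rng() < math.exp(-(f_new - f_curr) / T)

# An improving move is always kept; a worsening move survives only
# occasionally, and ever more rarely as the temperature T cools.
print(accept(10.0, 9.0, T=1.0))                     # True
print(accept(10.0, 20.0, T=1e-9, rng=lambda: 0.5))  # False
```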
recent work the IPSO has been suitably employed for the partitional clustering task [234]. Graaff and Engelbrecht [263] initially developed a local network neighborhood clustering method based on AIS. Later they formulated an immune based algorithm for cluster analysis under uncertain environments [264].
applied the multi-objective CSO algorithm for optimal deployment of sensor nodes in wireless sensor networks.
Cuckoo search algorithm: The cuckoo search algorithm was developed by Yang and Deb [272] in 2009. The algorithm mimics the breeding behavior of cuckoos (laying their eggs in the nests of other birds). Three basic rules are associated with it: (i) every cuckoo lays one egg at a time and dumps it in a randomly selected nest in the environment; (ii) the nests with good quality eggs carry over to the next generations; (iii) the number of host bird nests is fixed, and the egg laid by a cuckoo is identified by the host bird with a probability in the range [0, 1] (in such a situation, the host bird can either destroy the egg or abandon the present nest and build a new one). Goel et al. [274] formulated a cuckoo search based clustering algorithm and applied it to the extraction of water body information from remote sensing satellite images.
Firefly algorithm: The algorithm was proposed by Yang [275–277] after observing the rhythmic flashes of fireflies. Senthilnath et al. [278] applied the algorithm to cluster analysis of UCI datasets. The algorithm follows three rules based upon the glowing nature of fireflies: (i) all fireflies are unisex, and each firefly is attracted towards other fireflies regardless of their sex; (ii) the attractiveness is proportional to brightness, so of any two flashing fireflies the less bright one moves towards the brighter one; as attractiveness is proportional to brightness, both decrease with increasing distance between fireflies, and if there is no firefly brighter than a given one in its surroundings, that firefly moves randomly; (iii) the brightness of a firefly is determined by the nature of the objective function. At the beginning of the clustering algorithm all the fireflies are randomly dispersed across the entire search space. The algorithm then determines the optimal partitions in two phases: (i) variation of light intensity: the brightness of a firefly at its current position is reflected in its fitness value; (ii) movement towards the attractive firefly: a firefly changes its position by observing the light intensity of adjacent fireflies. Hassanzadeh et al. [279] successfully applied the firefly clustering algorithm to image segmentation.
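The movement phase described above is conventionally written as x_i ← x_i + β0·exp(−γr²)·(x_j − x_i) + α·ε in Yang's formulation; a minimal one-iteration sketch (parameter values illustrative):

```python
import numpy as np

def firefly_step(X, brightness, beta0=1.0, gamma=1.0, alpha=0.0, seed=0):
    """One iteration of the movement phase: every firefly moves toward
    each brighter firefly with attractiveness beta0 * exp(-gamma * r^2),
    plus an optional small random walk scaled by alpha."""
    rng = np.random.default_rng(seed)
    X = X.astype(float).copy()
    for i in range(len(X)):
        for j in range(len(X)):
            if brightness[j] > brightness[i]:
                r2 = ((X[j] - X[i]) ** 2).sum()
                beta = beta0 * np.exp(-gamma * r2)
                step = alpha * (rng.random(X.shape[1]) - 0.5)
                X[i] = X[i] + beta * (X[j] - X[i]) + step
    return X

# With gamma = 0 the attraction is at full strength, so the dimmer
# firefly lands exactly on the brighter one in a single step.
X = np.array([[0.0, 0.0], [1.0, 0.0]])
moved = firefly_step(X, brightness=[0.0, 1.0], gamma=0.0)
print(moved[0])  # [1. 0.]
```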
Invasive Weed Optimization (IWO) algorithm: IWO was proposed by Mehrabian and Lucas [280], following the colonization behavior of weeds. The weeds spread their seeds over an area and grow into new plants in order to find the optimal position. An automatic clustering algorithm based upon IWO was formulated by Chowdhury et al. [281]. The algorithm is based upon four basic steps: (i) initialization of the weeds in the whole search space, (ii) reproduction of the weeds, (iii) distribution of the seeds, and (iv) competitive exclusion of the weeds (fitter weeds produce more seeds). Su et al. [282] applied the algorithm to image clustering. A multi-objective IWO was proposed by Kundu et al. [283] and was recently applied to cluster analysis by Liu et al. [284].
Gravitational Search Algorithm (GSA): Rashedi [285,286] proposed the GSA following Newton's law of gravity, which states that every particle in the universe attracts every other particle with a force that is directly proportional to the product of their masses and inversely proportional to the square of the distance between them. The algorithm was used for cluster analysis by Hatamlou et al. [287]. Recently Yin et al. [288] developed a hybrid algorithm based on K-harmonic means and the GSA.
F(k) = min [f_1(k), f_2(k), …, f_M(k)],  ∀ k ∈ K
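With M fitness functions evaluated per partition, candidate partitions are compared by Pareto dominance rather than by a single scalar; a minimal dominance check, assuming all objectives are to be minimized:

```python
def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b, with every
    objective f_m minimized: f_a is no worse in all objectives and
    strictly better in at least one."""
    no_worse = all(x <= y for x, y in zip(f_a, f_b))
    strictly_better = any(x < y for x, y in zip(f_a, f_b))
    return no_worse and strictly_better

# Hypothetical (compactness, separation-cost) pairs for two partitions.
print(dominates((1.0, 2.0), (1.5, 2.5)))  # True: better in both
print(dominates((1.0, 3.0), (1.5, 2.5)))  # False: a trade-off, so
                                          # both stay on the Pareto front
```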
and Steinley [347] reported the cross validation issues in multi-objective clustering.
Recently the use of parametric and nonparametric statistical tests has become popular among evolutionary researchers. Usually these tests are carried out to decide whether one evolutionary algorithm can be considered better than another [348]. Therefore these tests can effectively be applied to evaluate the performance of new multi-objective clustering algorithms. The parametric tests described by Garcia et al. [349] are popular; the authors selected 14 UCI datasets to compare the performance of five evolutionary algorithms used for classification. They used Wilcoxon signed ranks to evaluate the performance, with classification rate and Cohen's kappa as accuracy measures. However, the parametric tests are based upon the assumptions of independence, normality, and homoscedasticity, which at times are not satisfied in multi-problem analysis. Under such situations a nonparametric test is preferable. The papers by Derrac et al. [348] and Garcia et al. [350] clearly highlight the significance of nonparametric tests, which can perform two classes of analysis: pairwise comparisons and multiple comparisons. The pairwise comparisons include the Sign test and the Wilcoxon test; the multiple comparisons include the Multiple sign test, the Friedman test, Friedman Aligned ranks, the Quade test, and Contrast Estimation. The books [352,353] and the statistical toolbox in MATLAB [351] are helpful in implementing these statistical tests.
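As an illustration of such a pairwise nonparametric comparison, SciPy's `wilcoxon` can be applied to the paired scores of two algorithms over the same datasets (the accuracy scores below are hypothetical):

```python
from scipy.stats import wilcoxon

# Hypothetical accuracy scores of two clustering algorithms on the
# same ten benchmark datasets (paired samples; A is ahead on all ten).
algo_a = [0.91, 0.88, 0.95, 0.82, 0.90, 0.87, 0.93, 0.85, 0.89, 0.92]
algo_b = [0.86, 0.84, 0.92, 0.80, 0.855, 0.845, 0.895, 0.835, 0.835, 0.91]

# Nonparametric pairwise comparison: no normality assumption required.
stat, p = wilcoxon(algo_a, algo_b)
print(p < 0.05)  # True: the difference is statistically significant
```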
5. Conclusion
This paper provides an up-to-date review of nature inspired metaheuristic algorithms for partitional clustering. It is observed that the traditional gradient based partitional algorithms are computationally simpler but often provide inaccurate results, as the solution gets trapped in local minima. The nature inspired metaheuristics explore the entire search space with the population involved and help ensure that an optimal partition is achieved. Further, single objective algorithms provide one optimal solution, whereas the multi-objective algorithms provide the flexibility to select the desired solution from a set of optimal solutions. The promising solutions of automatic clustering are very helpful, as they do not need a priori information about the number of clusters present in the dataset. It is important to note that although numerous clustering algorithms have been published considering various practical aspects, no single clustering algorithm has been shown to dominate the rest in all application areas.
Table 4
Widely used UCI benchmark data sets for nature inspired metaheuristics based partitional clustering.

Datasets and creators: Iris (R.A. Fisher), Wine (Forina et al.), Glass (B. German), Breast Cancer Wisconsin (W.H. Wolberg, O. Mangasarian), Thyroid (R. Quinlan), Dermatology (N. Ilter, H.A. Guvenir), Ionosphere (V. Sigillito), Image Segmentation (Vision Group).
Used in: GA [100,114,91,128,134], DE [16,155,153], ACO [184,198,193,202,207], BFO [267], PSO [223,226,233,224,222,227], CSO [271], ABC [247,250,248], Firefly [278], Frog [180], NSGA II [345], MOAIS [334], MOCK [22], MODE [343].
Table 5
Real life application areas of nature inspired metaheuristic based partitional clustering.

Image segmentation: GA: Feng et al. [130]; PSO: Lee et al. [241], Abraham et al. [4], Zhang et al. [230]; ACO: Ghosh et al. [190]; DE: Das et al. [2]; Review: Jain et al. [10]; NSGA II: Mukhopadhyay et al. [332], Bandyopadhyay et al. [331]; MOCLONAL: Yang et al. [336]; Multi-objective review: Bong and Rajeswari [323]
Image clustering: GA: Bandyopadhyay et al. [113]; DE: Das et al. [151,152], Omran et al. [156]; PSO: Omran et al. [219,243]; NSGA II: Bandyopadhyay et al. [331]
Document clustering: GA: Casillas et al. [115], Kuo and Lin [129]; PSO: Cui et al. [244]; ACO: Yang et al. [194], Handl and Meyer [196]; DE: Abraham et al. [157]; Review: Steinbach et al. [35], Andrews et al. [33], Jain et al. [34]
Web mining: ACO: Labroche et al. [209], Abraham and Ramos [216]; PSO: Alam et al. [245]
Text mining: ACO: Handl and Meyer [196], Vizine et al. [218]; SA: Chang [163]
Wireless sensor networks: GA: Tan et al. [137]; ABC: Karaboga et al. [251], Udgata et al. [252]; PSO: Yu et al. [237]; BFO: Gaba et al. [268]; MOCSO: Pradhan and Panda [20]; Review: O. Younis et al. [17], M. Younis et al. [18], Kumarawadu et al. [19]
ACO: Merkle et al. [217]; PSO: Ji et al. [238], Ali et al. [339]; DE: Chakraborty et al. [158]
SA: W. Jin et al. [164]
Gene expression analysis: GA: Lu et al. [109], Ma et al. [117]; ACO: He and Hui [215]; DE: Das et al. [2]; PSO: Sun et al. [225], Du et al. [229], Thangavel et al. [236]; AIS: Lie et al. [260]; Review: Jiang et al. [30], Lukashin et al. [31], Xu and Wunsch [91], Hruschka et al. [3,119,120], Jain et al. [34]; MODE: Suresh et al. [343]
Intrusion detection: GA: Liu et al. [116]; ACO: Ramos and Abraham [211], Tsang and Kwong [212]; PSO: Lima et al. [246]
Computational finance: Review: MacGregor et al. [24], Brabazon et al. [25], Amendola et al. [26], Nanda et al. [27]
GA: Franti et al. [124], Lucasius et al. [121]; ACO: Chen et al. [213]; Evolutionary algorithm NOCEA: Sarafis et al. [29]
Seismology: PSO: Cho [355]; Review: Jain et al. [10,34], Zaliapin et al. [28], Nanda et al. [364]
References
[1] R. Xu, D.C. Wunsch, Clustering, Oxford, Wiley, 2009.
[2] S. Das, A. Abraham, A. Konar, Metaheuristic Clustering, Springer, 2009,
ISBN 3540921729.
[3] E.R. Hruschka, R.J.G.B. Campello, A.A. Freitas, A.C.P.L.F. De Carvalho, A survey
of evolutionary algorithms for clustering, IEEE Trans. Syst. Man Cybern. Part
C Appl. Rev. 39 (2) (2009) 133155.
[4] A. Abraham, S. Das, S. Roy, Swarm intelligence algorithms for data clustering,
in: Soft Computing for Knowledge Discovery and Data Mining, Part IV,
Springer, 2007, pp. 279313.
[5] S. Basu, I. Davidson, K. Wagstaff (Eds.), Constrained Clustering: Advances in
Algorithms, Theory and Applications, Data Mining and Knowledge Discovery,
Chapman and Hall/CRC, 2008. ISBN 1584889977.
[6] H. Frigui, R. Krishnapuram, A robust competitive clustering algorithm with
applications in computer vision, IEEE Trans. Pattern Anal. Mach. Intell. 21 (5)
(1999) 450465.
[7] Y. Leung, J. Zhang, Z. Xu, Clustering by scale-space ltering, IEEE Trans.
Pattern Anal. Mach. Intell. 22 (12) (2000) 13961410.
[8] S.C. Johnson, Hierarchical clustering schemes, Psychometrika 2 (1967)
241254.
[9] F. Murtagh, A survey of recent advances in hierarchical clustering algorithms,
Comput. J. 26 (1983) 354359.
[10] A.K. Jain, M.N. Murty, P.J. Flynn, Data clustering a review, ACM Comput. Surv.
31 (3) (1999) 264323.
[11] M. Sato-Ilic, L.C. Jain, Innovations in Fuzzy Clustering: Theory and Application, Springer-Verlag, Berlin, Germany, 2006.
[12] A. Baraldi, P. Blonda, A survey of fuzzy clustering algorithms for pattern
recognitionPart I, IEEE Trans. Syst. Man Cybern. Part B Cybern. 29 (6) (1999)
778785.
[13] A. Baraldi, P. Blonda, A survey of fuzzy clustering algorithms for pattern
recognitionPart II, IEEE Trans. Syst. Man Cybern. Part B Cybern. 29 (6)
(1999) 786801.
[14] F. Hoppner, F. Klawonn, R. Kruse, T. Runkler, Fuzzy Cluster Analysis, Wiley,
1999, ISBN 0471988642.
[15] F. Hoppner, Fuzzy shell clustering algorithms in image processing: fuzzy crectangular and 2-rectangular shells, IEEE Trans. Fuzzy Syst. 5 (1997)
599613.
[16] S. Das, A. Abraham, A. Konar, Automatic clustering using an improved
differential evolution algorithm, IEEE Trans. Syst. Man Cybern. Part A Syst.
Hum. 38 (1) (2008) 218237.
[17] O. Younis, M. Krunz, S. Ramasubramanian, Node clustering in wireless sensor
networks: recent developments and deployment challenges, IEEE Netw.
(2006) 2025.
[18] M. Younis, P. Munshi, G. Gupta, S.M. Elsharkawy, On efcient clustering of
wireless sensor networks, in: Second IEEE Workshop on Dependability and
Security in Sensor Networks and Systems, 2006, pp. 110.
[19] P. Kumarawadu, D.J. Dechene, M. Luccini, A. Sauer, Algorithms for node
clustering in wireless sensor networks: a survey, in: IEEE International
Conference on IAFS, 2008, pp. 295300.
13
[20] P.M. Pradhan, G. Panda, Connectivity constrained wireless sensor deployment using multiobjective evolutionary algorithms and fuzzy decision
making, Ad Hoc Netw. 10 (6) (2012) 11341145.
[21] S. Chen, S. McLaughlin, P.M. Grant, B. Mulgrew, Multi-stage blind clustering equaliser, IEEE Trans. Commun. 43 (1995) 701–705.
[22] S.J. Nanda, G. Panda, Automatic clustering using MOCLONAL for classifying actions of 3D human models, in: IEEE Symposium on Humanities, Science and Engineering Research, 2012, pp. 385–390.
[23] S.K. Halgamuge, L. Wang, Classification and Clustering for Knowledge Discovery, Springer-Verlag, Berlin, Germany, 2005.
[24] R.C. MacGregor, A.T. Hodgkinson, Small Business Clustering Technologies:
Applications in Marketing, Management, IT and Economics, Idea Group Pub.,
Hershey, Pennsylvania, 2007.
[25] A. Brabazon, M. O'Neill, I. Dempsey, An introduction to evolutionary computation in finance, IEEE Comput. Intell. Mag. 3 (4) (2008) 42–55.
[26] A. Amendola, D. Belsley, E.J. Kontoghiorghes, H.K. van Dijk, Y. Omori, E. Zivot, Special issue on statistical and computational methods in finance, Comput. Stat. Data Anal. 52 (2008) 2842–2845.
[27] S.R. Nanda, B. Mahanty, M.K. Tiwari, Clustering Indian stock market data for portfolio management, Expert Syst. Appl. 37 (12) (2010) 8793–8798.
[28] I. Zaliapin, A. Gabrielov, V. Keilis-Borok, H. Wong, Clustering analysis of seismicity and aftershock identification, Phys. Rev. Lett. 101 (2008).
[29] I.A. Sarafis, P.W. Trinder, A.M.S. Zalzala, NOCEA: a rule-based evolutionary algorithm for efficient and effective clustering of massive high-dimensional databases, Appl. Soft Comput. 7 (3) (2007) 668–710.
[30] D. Jiang, C. Tang, A. Zhang, Cluster analysis for gene expression data: a survey, IEEE Trans. Knowl. Data Eng. 16 (11) (2004) 1370–1386.
[31] A.V. Lukashin, M.E. Lukashev, R. Fuchs, Topology of gene expression networks as revealed by data mining and modeling, Bioinformatics 19 (15) (2003) 1909–1916.
[32] M.N. Murty, A.K. Jain, Knowledge-based clustering scheme for collection management and retrieval of library books, Pattern Recognit. 28 (7) (1995) 949–963.
[33] N.O. Andrews, E.A. Fox, Recent Developments in Document Clustering, Technical
Report TR-07-35, Department of Computer Science, Virginia Tech, 2007.
[34] A.K. Jain, Data clustering: 50 years beyond K-means, Pattern Recognit. Lett. 31 (2010) 651–666.
[35] M. Steinbach, G. Karypis, V. Kumar, A comparison of document clustering techniques, in: KDD Workshop on Text Mining, 2000.
[36] S. Phillips, Acceleration of k-means and related clustering algorithms, in: International Workshop on Algorithm Engineering and Experimentation, Lecture Notes in Computer Science, vol. 2409, Springer-Verlag, 2002, pp. 166–177.
[37] D. Pelleg, A. Moore, Accelerating exact k-means algorithms with geometric reasoning, in: Fifth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, CA-ACM Press, 1999, pp. 277–281.
[38] D. Pelleg, A. Moore, x-means: extending k-means with efficient estimation of the number of clusters, in: Seventeenth International Conference on Machine Learning, Morgan Kaufmann, San Francisco, 2000, pp. 727–734.
[39] B. Zhang, M. Hsu, U. Dayal, k-harmonic means: a spatial clustering algorithm with boosting, in: International Workshop on Temporal, Spatial and Spatio-Temporal Data Mining, Lecture Notes in Artificial Intelligence, Springer, 2001, pp. 31–45.
[40] Z. Huang, Extensions to the k-means algorithm for clustering large data sets with categorical values, Data Mining Knowl. Discov. 2 (3) (1998) 283–304.
[41] A. Chaturvedi, P. Green, J. Carroll, k-modes clustering, J. Classif. 18 (1) (2001) 35–55.
[42] B. Scholkopf, A. Smola, K.R. Muller, Nonlinear component analysis as a kernel eigenvalue problem, J. Neural Comput. 10 (5) (1998) 1299–1319.
[43] L. Kaufman, P.J. Rousseeuw, Finding Groups in Data: An Introduction to
Cluster Analysis, Wiley, 2008, ISBN 0471735787.
[44] X.S. Yang, Nature-Inspired Metaheuristic Algorithms, second edition, Luniver
Press, 2010, ISBN 1905986106.
[45] J. Brownlee, Clever Algorithms: Nature-Inspired Programming Recipes, lulu.com, 2012.
[46] J.H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, 1975, ISBN 9780262581110.
[47] D.E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine
Learning, Addison-Wesley, 1989, ISBN 0201157675.
[48] S. Kirkpatrick, C.D. Gelatt, M.P. Vecchi, Optimization by simulated annealing, Science 220 (4598) (1983) 671–680.
[49] H.P. Schwefel, Numerical Optimization of Computer Models, John Wiley and
Sons, 1981, ISBN 0471099880.
[50] I. Rechenberg, Evolution strategy: nature's way of optimization, in: Optimization: Methods and Applications, Possibilities and Limitations, Lecture Notes in Engineering, Springer, Berlin, 1989, pp. 106–126.
[51] T. Bäck, F. Hoffmeister, H.P. Schwefel, A survey of evolution strategies, in: Proceedings of the Fourth International Conference on Genetic Algorithms, 1991.
[52] J.R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection, MIT Press, Cambridge, 1992.
[53] R. Storn, K. Price, Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces, J. Global Optim. 11 (1997) 341–359.
[54] K. Price, R. Storn, J. Lampinen, Differential Evolution – A Practical Approach to Global Optimization, Springer, Berlin, 2005.
Please cite this article as: S.J. Nanda, G. Panda, A survey on nature inspired metaheuristic algorithms for partitional clustering, Swarm
and Evolutionary Computation (2014), http://dx.doi.org/10.1016/j.swevo.2013.11.003i
[55] A.K. Qin, P.N. Suganthan, Self-adaptive differential evolution algorithm for numerical optimization, in: IEEE Congress on Evolutionary Computation, CEC 2005, pp. 1785–1791.
[56] A.K. Qin, V.L. Huang, P.N. Suganthan, Differential evolution algorithm with strategy adaptation for global numerical optimization, IEEE Trans. Evol. Comput. 13 (2) (2009) 398–417.
[57] S. Rahnamayan, H.R. Tizhoosh, M.M.A. Salama, Opposition-based differential evolution for optimization of noisy problems, in: IEEE Congress on Evolutionary Computation, CEC 2006, pp. 1865–1872.
[58] S. Rahnamayan, H.R. Tizhoosh, M.M.A. Salama, Opposition-based differential evolution, IEEE Trans. Evol. Comput. 12 (1) (2008) 64–79.
[59] E. Bonabeau, M. Dorigo, G. Theraulaz, Swarm Intelligence: From Natural to Artificial Systems, Oxford University Press, US, 1999.
[60] J. Kennedy, R.C. Eberhart, Y. Shi, Swarm Intelligence, Morgan Kaufmann,
2001, ISBN 1558605959.
[61] A.P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John
Wiley and Sons, 2006, ISBN 0470091916.
[62] M. Dorigo, Optimization, learning and natural algorithms (Ph.D. thesis),
Politecnico di Milano, Italy, 1992.
[63] M. Dorigo, V. Maniezzo, A. Colorni, The ant system: optimization by a colony of cooperating agents, IEEE Trans. Syst. Man Cybern. Part B Cybern. 26 (1996) 29–41.
[64] M. Dorigo, L.M. Gambardella, Ant colony system: a cooperative learning approach to the traveling salesman problem, IEEE Trans. Evol. Comput. 1 (1) (1997) 53–66.
[65] M. Dorigo, G. Di Caro, Ant colony optimization: a new meta-heuristic, in: IEEE Congress on Evolutionary Computation, CEC 1999, vol. 2, 1999, pp. 1470–1477.
[66] M. Dorigo, T. Stutzle, Ant Colony Optimization, The MIT Press, 2004,
ISBN 0262042193.
[67] M. Dorigo, C. Blum, Ant colony optimization theory: a survey, Theor. Comput. Sci. 344 (2005) 243–278.
[68] J. Kennedy, R. Eberhart, Particle swarm optimization, in: IEEE International Conference on Neural Network, vol. 4, 1995, pp. 1942–1948.
[69] R. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Sixth International Symposium on Micro Machine and Human Science, MHS 1995, pp. 39–43.
[70] M. Clerc, J. Kennedy, The particle swarm – explosion, stability, and convergence in a multidimensional complex space, IEEE Trans. Evol. Comput. 6 (1) (2002) 58–73.
[71] X. Hu, Y. Shi, R. Eberhart, Recent advances in particle swarm, in: IEEE Congress on Evolutionary Computation, CEC 2004, vol. 1, pp. 90–97.
[72] R. Poli, J. Kennedy, T. Blackwell, Particle Swarms: The Second Decade,
Hindawi Publishing Corporation, 2008, ISBN 9774540379.
[73] B. Basturk, D. Karaboga, An artificial bee colony (ABC) algorithm for numeric function optimization, in: IEEE Swarm Intelligence Symposium, IN, USA, 2006.
[74] D. Karaboga, B. Basturk, Artificial bee colony optimization algorithm for solving constrained optimization problems, in: Foundations of Fuzzy Logic and Soft Computing, Lecture Notes in Computer Science, vol. 4529, Springer, 2007, pp. 789–798.
[75] D. Karaboga, B. Basturk, A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm, J. Global Optim. 39 (3) (2007) 459–471.
[76] D. Karaboga, B. Basturk, On the performance of artificial bee colony (ABC) algorithm, Appl. Soft Comput. 8 (1) (2008) 687–697.
[77] D. Karaboga, B. Akay, A comparative study of artificial bee colony algorithm, Appl. Math. Comput. 214 (1) (2009) 108–132.
[78] D. Dasgupta, Artificial Immune Systems and their Applications, Springer-Verlag, 1999, ISBN 3540643907.
[79] L.N. de Castro, J. Timmis, An Introduction to Artificial Immune Systems: A New Computational Intelligence Paradigm, Springer-Verlag, 2002.
[80] L.N. de Castro, F.J. Von Zuben, Learning and optimization using the clonal selection principle, IEEE Trans. Evol. Comput. 6 (3) (2002) 239–251.
[81] D. Dasgupta, Advances in artificial immune systems, IEEE Comput. Intell. Mag. 1 (4) (2006) 40–49.
[82] D. Dasgupta, S. Yu, F. Nino, Recent advances in artificial immune systems: models and applications, Appl. Soft Comput. 11 (2011) 1574–1587.
[83] S.J. Nanda, Artificial immune systems: principle, algorithms and applications (M.Tech. research thesis), National Institute of Technology, Rourkela, India, 2009.
[84] K. Passino, Biomimicry of bacterial foraging for distributed optimization and control, IEEE Control Syst. Mag. 22 (3) (2002) 52–67.
[85] W.J. Tang, Q.H. Wu, J.R. Saunders, Bacterial foraging algorithm for dynamic environments, in: IEEE Congress on Evolutionary Computation, CEC 2006, pp. 1324–1330.
[86] S. Mishra, A hybrid least square–fuzzy bacterial foraging strategy for harmonic estimation, IEEE Trans. Evol. Comput. 9 (1) (2005) 61–73.
[87] J. Greensmith, The dendritic cell algorithm (Ph.D. thesis), University of
Nottingham, 2007.
[88] J. Greensmith, U. Aickelin, J. Twycross, Articulation and clarification of the dendritic cell algorithm, in: Fifth International Conference on Artificial Immune Systems (ICARIS 2006), Lecture Notes in Computer Science, vol. 4163, Springer-Verlag, 2006, pp. 404–417.
[89] A.K. Jain, A. Topchy, M.H.C. Law, J.M. Buhmann, Landscape of clustering algorithms, in: Proceedings of International Conference on Pattern Recognition, 2004, pp. 260–263.
[90] A.K. Jain, R. Duin, J. Mao, Statistical pattern recognition: a review, IEEE Trans. Pattern Anal. Mach. Intell. 22 (1) (2000) 4–37.
[91] R. Xu, D. Wunsch, Survey of clustering algorithms, IEEE Trans. Neural Netw. 16 (3) (2005) 645–678.
[92] A.A. Freitas, A survey of evolutionary algorithms for data mining and knowledge discovery, in: Advances in Evolutionary Computing, Springer-Verlag, New York, USA, 2003.
[93] S. Paterlini, T. Minerva, Evolutionary approaches for cluster analysis, in: Soft Computing Applications, Springer, 2003, pp. 167–178.
[94] O.A. Mohamed Jafar, R. Sivakumar, Ant-based clustering algorithms: a brief survey, Int. J. Comput. Theory Eng. 2 (5) (2010) 787–796.
[95] G. Gan, C. Ma, J. Wu, Data Clustering: Theory, Algorithms, and Applications, ASA-SIAM Series on Statistics and Applied Probability, 2007.
[96] P. Berkhin, A survey of clustering data mining techniques, in: Grouping Multidimensional Data, Springer, 2006, pp. 25–73.
[97] C.M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006,
ISBN 0387310738.
[98] P.N. Tan, M. Steinbach, V. Kumar, Introduction to Data Mining, Addison-Wesley Longman Pub., USA, 2005.
[99] R. Duda, P. Hart, D. Stork, Pattern Classication, Second Ed., John Wiley and
Sons, New York, 2001.
[100] J.C. Bezdek, S. Boggavaparu, L.O. Hall, A. Bensaid, Genetic algorithm guided clustering, in: IEEE Congress on Evolutionary Computation, CEC 1994, pp. 34–40.
[101] M. Sarkar, B. Yegnarayana, D. Khemani, A clustering algorithm using an evolutionary programming-based approach, Pattern Recognit. Lett. 18 (1997) 975–986.
[102] L.I. Kuncheva, J.C. Bezdek, Selection of cluster prototypes from data by a genetic algorithm, in: Fifth European Congress on Intelligent Techniques and Soft Computing, 1997, pp. 1683–1688.
[103] C.A. Murthy, N. Chowdhury, In search of optimal clusters using genetic algorithm, Pattern Recognit. Lett. 17 (8) (1996) 825–832.
[104] Q. Xiaofeng, F. Palmieri, Theoretical analysis of evolutionary algorithms with an infinite population size in continuous space, Part I: basic properties of selection and mutation, IEEE Trans. Neural Netw. 5 (1) (1994) 102–119.
[105] U. Maulik, S. Bandyopadhyay, Genetic algorithm-based clustering technique, Pattern Recognit. 33 (2000) 1455–1465.
[106] K. Krishna, N. Murty, Genetic K-means algorithm, IEEE Trans. Syst. Man Cybern. Part B Cybern. 29 (3) (1999) 433–439.
[107] W. Sheng, X. Liu, A hybrid algorithm for K-medoid clustering of large data sets, in: IEEE Congress on Evolutionary Computation, CEC 2004, pp. 77–82.
[108] Y. Lu, S. Lu, F. Fotouhi, Y. Deng, S.J. Brown, FGKA: a fast genetic K-means clustering algorithm, in: ACM Symposium on Applied Computing, 2004, pp. 622–623.
[109] Y. Lu, S. Lu, F. Fotouhi, Y. Deng, S.J. Brown, Incremental genetic K-means algorithm and its application in gene expression data analysis, BMC Bioinform. 5 (2004) 1–10.
[110] M.C. Cowgill, R.J. Harvey, L.T. Watson, A genetic algorithm approach to cluster analysis, Comput. Math. Appl. 37 (7) (1999) 99–108.
[111] L.Y. Tseng, S.B. Yang, A genetic approach to the automatic clustering problem, Pattern Recognit. 34 (2001) 415–424.
[112] S. Bandyopadhyay, U. Maulik, Nonparametric genetic clustering: comparison of validity indices, IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 31 (1) (2001) 120–125.
[113] S. Bandyopadhyay, U. Maulik, Genetic clustering for automatic evolution of clusters and application to image classification, Pattern Recognit. 35 (2002) 1197–1208.
[114] S. Bandyopadhyay, U. Maulik, An evolutionary technique based on K-means algorithm for optimal clustering in R^N, Inf. Sci. 146 (2002) 221–237.
[115] A. Casillas, M.Y.G. de Lena, R. Martynez, Document clustering into an unknown number of clusters using a genetic algorithm, in: International Conference on Text, Speech, and Dialogue, Lecture Notes in Computer Science, vol. 2807, Springer, 2003, pp. 43–49.
[116] Y. Liu, K. Chen, X. Liao, W. Zhang, A genetic clustering method for intrusion detection, Pattern Recognit. 37 (2004) 927–942.
[117] P.C.H. Ma, K.C.C. Chan, X. Yao, D.K.Y. Chiu, An evolutionary clustering algorithm for gene expression microarray data analysis, IEEE Trans. Evol. Comput. 10 (3) (2006) 296–314.
[118] E.R. Hruschka, R.J.G.B. Campello, L.N. de Castro, Improving the efficiency of a clustering genetic algorithm, in: Ninth Ibero-American Conference on Artificial Intelligence, Lecture Notes in Computer Science, vol. 3315, Springer-Verlag, 2004, pp. 861–870.
[119] E.R. Hruschka, L.N. de Castro, R.J.G.B. Campello, Evolutionary algorithms for clustering gene-expression data, in: Fourth IEEE International Conference on Data Mining, 2004, pp. 403–406.
[120] E.R. Hruschka, R.J.G.B. Campello, L.N. de Castro, Evolving clusters in gene-expression data, Inf. Sci. 176 (2006) 1898–1927.
[121] C.B. Lucasius, A.D. Dane, G. Kateman, On k-medoid clustering of large datasets with the aid of a genetic algorithm: background, feasibility and comparison, Anal. Chim. Acta 282 (3) (1993) 647–669.
[122] V. Estivill-Castro, A.T. Murray, Spatial clustering for data mining with genetic algorithms, in: International ICSC Symposium on Engineering of Intelligent Systems, 1997, pp. 317–323.
[123] R.M. Cole, Clustering with genetic algorithms (M.S. thesis), University of
Western Australia, Perth, 1998.
[319] X.S. Yang, S. Deb, Multiobjective cuckoo search for design optimization, Comput. Oper. Res. 40 (6) (2013) 1616–1624.
[320] H.R. Hassanzadeh, A multi-objective gravitational search algorithm, in: IEEE Second International Conference on Computational Intelligence, Communication Systems and Networks, 2010, pp. 7–12.
[321] M. Jiang, K. Zhu, Multiobjective optimization by artificial fish swarm algorithm, in: IEEE International Conference on Computer Science and Automation Engineering, 2011, pp. 506–511.
[322] U. Maulik, S. Bandyopadhyay, A. Mukhopadhyay, Multiobjective Genetic
Algorithms for Clustering, Springer, 2011, ISBN 3642166148.
[323] C. Bong, M. Rajeswari, Multi-objective nature-inspired clustering and classification techniques for image segmentation, Appl. Soft Comput. 11 (2011) 3271–3282.
[324] D.W. Corne, J. Knowles, M.J. Oates, The Pareto envelope-based selection algorithm for multiobjective optimization, in: Sixth Conference on Parallel Problem Solving from Nature, 2000, pp. 839–848.
[325] D.W. Corne, N.R. Jerram, J. Knowles, M.J. Oates, PESA-II: region-based selection in evolutionary multiobjective optimization, in: Genetic and Evolutionary Computing Conference, 2001, pp. 283–290.
[326] E. Zitzler, L. Thiele, Multiobjective evolutionary algorithms: a comparative case study and the strength Pareto approach, IEEE Trans. Evol. Comput. 3 (4) (1999) 257–271.
[327] J. Handl, J. Knowles, Evolutionary multiobjective clustering, in: Eighth Conference on Parallel Problem Solving from Nature, 2004.
[328] J. Handl, J. Knowles, Multiobjective Clustering with Automatic Determination of
the Number of Clusters, Technical Report TR-COMPSYSBIO-2004-02, Institute of
Science and Technology, University of Manchester, Manchester, U.K., Available:
http://www.dbkweb.ch.umist.ac.uk/handl/publications.html, 2004.
[329] J. Handl, J. Knowles, Improvements to the scalability of multiobjective clustering, in: IEEE Congress on Evolutionary Computation, CEC 2005, pp. 2372–2379.
[330] J. Handl, J. Knowles, An evolutionary approach to multiobjective clustering, IEEE Trans. Evol. Comput. 11 (1) (2007) 56–76.
[331] S. Bandyopadhyay, U. Maulik, A. Mukhopadhyay, Multiobjective genetic clustering for pixel classification in remote sensing imagery, IEEE Trans. Geosci. Remote Sens. 45 (5) (2007) 1505–1511.
[332] A. Mukhopadhyay, U. Maulik, A multiobjective approach to MR brain image segmentation, Appl. Soft Comput. 11 (2011) 872–880.
[333] D.S. Santos, D.D. Oliveira, A.L.C. Bazzan, A multiagent multiobjective clustering algorithm, in: Data Mining and Multi-agent Integration, Springer, 2009.
[334] M. Gong, L. Zhang, L. Jiao, S. Gou, Solving multiobjective clustering using an immune-inspired algorithm, in: IEEE Congress on Evolutionary Computation, CEC 2007, pp. 15–22.
[335] W. Ma, L. Jiao, M. Gong, Immunodominance and clonal selection inspired multiobjective clustering, Progress Nat. Sci. 19 (2009) 751–758.
[336] D. Yang, L. Jiao, M. Gong, F. Liu, Artificial immune multi-objective SAR image segmentation with fused complementary features, Inf. Sci. 181 (2011) 2797–2812.
[337] S. Das, A. Abraham, A. Konar, Automatic kernel clustering with a multi-elitist particle swarm optimization algorithm, Pattern Recognit. Lett. 29 (2008) 688–699.
[338] A. Paoli, F. Melgani, E. Pasolli, Clustering of hyperspectral images based on multiobjective particle swarm optimization, IEEE Trans. Geosci. Remote Sens. 47 (12) (2009) 4175–4188.
[339] H. Ali, W. Shahzad, F.A. Khan, Energy-efficient clustering in mobile ad-hoc networks using multi-objective particle swarm optimization, Appl. Soft Comput. 12 (7) (2012) 1913–1928.
[340] S. Saha, S. Bandyopadhyay, A new multiobjective simulated annealing based clustering technique using symmetry, Pattern Recognit. Lett. 30 (2009) 1392–1403.
[341] S. Saha, S. Bandyopadhyay, A symmetry based multiobjective clustering technique for automatic evolution of clusters, Pattern Recognit. 43 (2010) 738–751.
[342] R. Caballero, M. Laguna, R. Marti, J. Molina, Scatter tabu search for multiobjective clustering problems, J. Oper. Res. Soc. 62 (11) (2011) 2034–2046.
[343] K. Suresh, D. Kundu, S. Ghosh, S. Das, A. Abraham, S.Y. Han, Multi-objective differential evolution for automatic clustering with application to microarray data analysis, Sensors 9 (2009) 3981–4004.
[344] K. Faceli, A.C.P.L.F. de Carvalho, M.C.P. Souto, Multi-objective clustering ensemble, Int. J. Hybrid Intell. Syst. 4 (3) (2007) 145–156.
[345] K.S.N. Ripon, M.N.H. Siddique, Evolutionary multi-objective clustering for overlapping clusters detection, in: IEEE Congress on Evolutionary Computation, CEC 2009, pp. 976–982.
[346] J. Handl, J. Knowles, Multi-objective clustering and cluster validation, in: Studies in Computational Intelligence, vol. 16, Springer, 2006, pp. 21–47.
[347] M.J. Brusco, D. Steinley, Cross validation issues in multiobjective clustering, Br. J. Math. Stat. Psychol. 62 (2009) 349–368.