0% found this document useful (0 votes)
46 views

Chemical Reaction Optimization: A Tutorial: Memetic Computing March 2012

ssdeww

Uploaded by

'Hady' Hadiyanto
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
46 views

Chemical Reaction Optimization: A Tutorial: Memetic Computing March 2012

ssdeww

Uploaded by

'Hady' Hadiyanto
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 16

See discussions, stats, and author profiles for this publication at: https://www.researchgate.

net/publication/257779624

Chemical Reaction Optimization: A tutorial

Article  in  Memetic Computing · March 2012


DOI: 10.1007/s12293-012-0075-1 · Source: DBLP

CITATIONS READS

122 2,950

2 authors:

Albert Lam Victor O. K. Li


The University of Hong Kong The University of Hong Kong
77 PUBLICATIONS   2,168 CITATIONS    636 PUBLICATIONS   12,330 CITATIONS   

SEE PROFILE SEE PROFILE

Some of the authors of this publication are also working on these related projects:

Resource optimization for computationally expensive problems View project

Vehicular Energy Network View project

All content following this page was uploaded by Albert Lam on 05 June 2014.

The user has requested enhancement of the downloaded file.


Memetic Comp. (2012) 4:3–17
DOI 10.1007/s12293-012-0075-1

REGULAR SURVEY PAPER

Chemical Reaction Optimization: a tutorial


(Invited paper)

Albert Y. S. Lam · Victor O. K. Li

Received: 11 July 2011 / Accepted: 31 January 2012 / Published online: 12 February 2012
© The Author(s) 2012. This article is published with open access at Springerlink.com

Abstract Chemical Reaction Optimization (CRO) is a 1 Introduction


recently established metaheuristics for optimization, inspired
by the nature of chemical reactions. A chemical reaction is Nature, by itself, is a complex system and human beings
a natural process of transforming the unstable substances to and their associated activities are parts of Nature. This com-
the stable ones. In microscopic view, a chemical reaction plex system operates forever without any problems because
starts with some unstable molecules with excessive energy. there are laws governing the operations of all contained com-
The molecules interact with each other through a sequence of ponents. The ways in which Nature functions are excellent
elementary reactions. At the end, they are converted to those principles to operate complicated systems and to solve prob-
with minimum energy to support their existence. This prop- lems. When it comes to mathematics and computer science,
erty is embedded in CRO to solve optimization problems. these operating methods are called algorithms [29]. Mimick-
CRO can be applied to tackle problems in both the discrete ing the behaviors of Nature to achieve goal-oriented activi-
and continuous domains. We have successfully exploited ties is called Natured-inspired computing [31]. Thus, we can
CRO to solve a broad range of engineering problems, includ- see that the ideas of imitating phenomena from Nature have
ing the quadratic assignment problem, neural network train- great potential in developing algorithms to tackle engineer-
ing, multimodal continuous problems, etc. The simulation ing problems, especially those for which we lack adequate
results demonstrate that CRO has superior performance when knowledge to design the corresponding efficient solving
compared with other existing optimization algorithms. This methods.
tutorial aims to assist the readers in implementing CRO to Moreover, optimization is one of the cornerstones in engi-
solve their problems. It also serves as a technical overview neering and science; most of the problems can be formu-
of the current development of CRO and provides potential lated in the form of optimization. It is prevalent, ranging
future research directions. from power generation scheduling in electrical engineer-
ing [1], stock market trend prediction in finance [42], to
Keywords Chemical Reaction Optimization · DNA sequencing in biomedical science [32]. Various opti-
Metaheuristic · Nature-inspired algorithm · mization techniques have been attempted to solve optimiza-
Approximate algorithm · Optimization tion problems. However, it is generally believed that for a
class of problems, called nondeterministic polynomial-time
A. Y. S. Lam (B) hard (NP-hard) in computation complexity theory [11], algo-
Department of Electrical Engineering and Computer Sciences, rithms obtaining the optimal solutions in polynomial time do
University of California, Berkeley, 273 Cory Hall, Berkeley,
not exist [10]. In reality, many important real-world prob-
CA 94720, USA
e-mail: [email protected] lems are classified as NP-hard. Does it mean that there is
no hope to solve these problems efficiently? Auspiciously,
V. O. K. Li we are satisfied with near-optimal (or sub-optimal) solu-
Department of Electrical and Electronic Engineering,
tions most of the time provided that the objective func-
The University of Hong Kong, Rm. 610D, Chow Yei
Ching Building, Pokfulam Rd., Hong Kong, China tion values of these solutions are not too far away from
e-mail: [email protected] the optimal. This gives birth to approximate algorithms,

123
4 Memetic Comp. (2012) 4:3–17

a very influential class of optimization algorithms nowa- most of the problems addressed by CRO in Sect. 6. We con-
days. clude this tutorial and suggest potential future work in Sect. 7.
In the past few decades, the field of Nature-inspired opti-
mization techniques has grown incredibly fast. These algo-
rithms are usually general-purpose and population-based. 2 Inspiration
They are normally referred to as evolutionary algorithms1
because many of them are motivated by biological evolution. CRO is a technique which loosely couples chemical reactions
Evolution means “the variation of allele frequencies in popu- with optimization. It does not attempt to capture every detail
lations over time” [2]. We can, without harm, broaden the idea of chemical reactions. In general, the principles of chemical
of “evolution” to non-biological processes. In a broad sense, reactions are governed by the first two laws of thermodynam-
evolutionary algorithms cover those which vary a group of ics [14]. Here we explain these laws at a high level to enable
solutions in iterations based on some Nature-inspired oper- the readers to easily grasp the working mechanisms of chem-
ations. Examples include, but are not limited to, Genetic ical reactions. The first law (conservation of energy) says that
Algorithm (GA) [13], Memetic Algorithm (MA) [6,23], Ant energy cannot be created or destroyed; energy can transform
Colony Optimization (ACO) [8], Particle Swarm Optimiza- from one form to another and transfer from one entity to
tion (PSO) [15], Differential Evolution (DE) [28], and Har- another. A chemical reacting system consists of the chemi-
mony Search (HS) [12]. Many of them are inspired by the cal substances and its surroundings. Each chemical substance
biological process, varying in scale from the genetic level, possesses potential and kinetic energies, and the energies of
e.g. GA, MA, and DE, to the creature level, e.g. ACO and the surroundings are symbolically represented by the central
PSO. Unlike the others, HS is motivated by the phenomenon energy buffer in CRO.2 A reaction is endothermic when it
of human activities in composing music. requires heat obtained from the surroundings to initialize the
The aforementioned algorithms are successful in solving reaction process. An exothermic reaction refers to one whose
many different kinds of optimization problems, as demon- chemical substances give heat to the surroundings. These two
strated by their huge number of citations in the literature. kinds of reactions can be characterized by the initial buffer
According to the No-Free-Lunch Theorem [35], all meta- size: when it is positive, the reaction is endothermic; when
heuristics which search for extrema are exactly the same it is zero, the reaction is exothermic. The second law says
in performance when averaged over all possible objective that the entropy of a system tends to increase, where entropy
functions. In other words, when one works excellent in a is the measure of the degree of disorder. Potential energy is
certain class of problems, it will be outperformed by the oth- the energy stored in a molecule with respect to its molecular
ers in other classes. Therefore, all algorithms which have configuration. When it is converted to other forms, the sys-
been shown to address some optimization problems success- tem becomes more disordered. For example, when molecules
fully are equally important, as each of them must perform with more kinetic energy (converted from potential energy)
identically well on the average. As the spectrum of optimi- move faster, the system becomes more disordered and its
zation problems is huge, the number of reported successful entropy increases. Thus all reacting systems tend to reach a
metaheuristics is much less than the number of problems. state of equilibrium, whose potential energy drops to a min-
Recently, Lam and Li [18] proposed a new metaheuristic for imum. In CRO, we capture this phenomenon by converting
optimization, inspired by the nature of chemical reactions. potential energy to kinetic energy and by gradually losing the
They coined it Chemical Reaction Optimization (CRO). In energy of the chemical molecules to the surroundings.
a short period of time, CRO has been applied to solve many A chemical system undergoes a chemical reaction when
problems successfully, outperforming many existing evolu- it is unstable, in the sense that it possesses excessive energy.
tionary algorithms in most of the test cases. In this tutorial, It manipulates itself to release the excessive energy in order
we introduce this new paradigm, provide guidelines to help to stabilize itself. This manipulation is called chemical reac-
the readers implement CRO for their optimization problems, tions. If we look at the chemical substances at the microscopic
summarize the applications of CRO reported in the literature, level, a chemical system consists of molecules, which are
and identify possible future research directions for CRO. the smallest particles of a compound that retain the chem-
The rest of this tutorial is organized as follows. Section 2 ical properties of the compound.3 Molecules are classified
gives the inspiration of CRO. We explain the characteristics into different species based on the underlying chemical prop-
of CRO in Sect. 3 and introduce the basic elements of CRO in erties. For example, carbon monoxide (CO) and nitrogen
Sect. 4. In Sect. 5, we demonstrate the CRO framework and
show how the algorithm is structured. We briefly describe 2 The central energy buffer will be introduced in Sect. 4.
3 The definition of molecule is from the General Chemistry Online:
Glossary from Frostburg State University, available at http://antoine.
1 They are also sometimes called metaheuristics. frostburg.edu/chem/senese/101/index.shtml.

123
Memetic Comp. (2012) 4:3–17 5

Table 1 Characteristics of the four elementary reactions


Extent of change Number of molecules involved
Uni-molecular Inter-molecular

More Decomposition Synthesis


Less On-wall ineffective Inter-molecular
collision ineffective collision

and synthesis correspond to on-wall ineffective collision and


inter-molecular ineffective collision, respectively, but they
have much more vigorous changes to the molecular struc-
tures. We summarize the elementary reactions in Table 1.
Fig. 1 Illustrative representation of a chemical reaction on the potential Optimization is the study of problems in which one seeks
energy surface [18] to minimize (or maximize)5 a function by systematically
choosing the values of the variables in an allowed set. Most of
the currently widely utilized optimization algorithms oper-
dioxide (NO2 ) are two different chemical species. The chem- ate iteratively, irrespective of whether they are traditional
ical system (CO + NO2 ) is unstable and the unstable chem- techniques (e.g. descent methods, Newton’s methods [3]) or
icals finally convert to more stable species, CO2 and NO. the evolutionary ones. We can see that the nature of chemi-
The overall chemical equation which governs this process cal reactions and optimization resemble each other at a high
can be described by CO + NO2 → CO2 + NO. In fact, level; they both seek to undergo a series of events step-by-
this reacting system is realized in multiple stages, described step. Due to the success of other evolutionary algorithms and
by consecutive sub-reactions4 : 2NO2 → NO3 + NO and the No-Free-Lunch Theorem, CRO was proposed based on
NO3 + CO → NO2 + CO2 . We depict the aforementioned these observations.
chemical reaction on the potential energy surface in Fig. 1.
We can see that a chemical reaction is accomplished by some
consecutive (and parallel) sub-reaction steps, from reactants 3 Characteristics
changing to products via intermediates and transition states.
A chemical reaction always results in more stable products Conservation of energy in CRO is as natural selection in GA.
with minimum energy and it is a step-wise process of search- Re-distribution of energy among the molecules and inter-
ing for the optimal point. change of energy from one form to another govern the algo-
Molecules store energy in the form of chemical bonds; rithmic philosophy of CRO. With the underlying assumption
bond formation requires energy from the outside while bond of energy conservation and transformation, we manipulate
breakage releases energy to the surroundings. A molecule the solutions through a random sequence of elementary reac-
is identified by its molecular structure, which characterizes tions. The two ineffective collisions implement local search
the contained atoms, bond length, angle, and torsion. Mol- (intensification) while decomposition and synthesis give the
ecules with subtle changes in molecular structures are still effect of diversification. An appropriate mixture of intensi-
considered to belong to the same species. fication and diversification makes an effective search of the
A chemical change of a molecule is triggered by a col- global minimum in the solution space.
lision. There are two types of collisions: uni-molecular and CRO is a variable population-based metaheuristic. The
inter-molecular collisions. The former describes the situa- total number of molecules in different iterations may not be
tion when the molecule hits on some external substances the same. When an (on-wall or inter-molecular) ineffective
(e.g. a wall of the container) while the latter represents the collision happens, the number of molecules before and after
cases where the molecule collides with other molecules. the change remains identical. On the other hand, with decom-
The corresponding reaction change is called an elementary position and synthesis, the number increases and decreases,
reaction. An ineffective elementary reaction is one which respectively. With our definitions of the decomposition and
results in a subtle change of molecular structure. We consider synthesis criteria, we can implicitly influence their frequen-
four kinds of elementary reactions: on-wall ineffective col- cies by controlling α and β, respectively (the details will be
lision, decomposition, inter-molecular ineffective collision,
and synthesis. With respect to molecularity, decomposition 5 We assume an optimization is a minimization problem. A maximi-
zation problem can be converted to the corresponding minimization by
4 Some systems may have the sub-reactions take place spontaneously. adding a minus sign to the objective function.

123
6 Memetic Comp. (2012) 4:3–17

discussed in Sect. 5). However, the frequencies have a stron- particular problem, we do not need strong synchronization
ger relationship with the “landscape” of the objective func- among the CROs. Unlike other evolutionary algorithms, CRO
tion. For example, if “down-hill” search direction is always does not define generations and each iteration involves only
possible, we will never trigger the decomposition criterion. a subset of molecules. Each CRO maintains its own popu-
If “hill-climbing” does not often happen, molecules will not lation size. Interactions between CROs can be carried out at
convert their kinetic energy for worse solutions and the syn- a certain instant without much restriction as each CRO does
thesis criterion is not invoked. When working with a small set not need to wait for another CRO to complete certain actions,
of molecules, we focus on local search more in some regions. e.g. computation of the whole population in a generation in
Otherwise, we try to spread “seeds” (i.e. molecules) to the GA. An attempt to parallelize CRO can be found in [39].
whole solution space in a greater extent. Therefore, CRO tries To summarize, the advantages of CRO are highlighted as
to adapt itself to the problem with the goal of locating the follows:
global minimum effectively.
In a broad sense, CRO is an algorithmic framework where – CRO is a design framework which allows deploying dif-
we only define the general operations of agents and the ferent operators to suit different problems.
energy management scheme. It allows certain implementa- – Its variable population size allows the system to adapt to
tion details to be adjusted to suit the characteristics of prob- the problems automatically.
lems. In other words, CRO has high flexibility for users to – Conversion of energy and transfer of energy in different
customize it to meet their own needs. entities and in different forms make CRO unique among
CRO has a strong relationship with memetic computa- meterheursitics. CRO has the potential to tackle those
tion, which is defined as “a paradigm that uses the notion problems which have not been successfully solved by
of meme(s) as units of information encoded in computa- other metaheuristics.
tional representations for the purpose of problem-solving” – Other attributes can easily be incorporated into the agent
[6]. MA hybridizes global and local heuristic search tech- (i.e. molecule). This gives flexibility to design different
niques. CRO realizes the global and local search with the operators.
elementary reactions. Moreover, we can incorporate (individ- – CRO enjoys the advantages of both SA and GA.
ual and social) learning ability or ideas from other algorithms – CRO can be easily programmed in object-oriented pro-
into CRO through appropriate designs of the decomposition gramming language, where a class defines a molecule and
and synthesis mechanisms. This may result in a more pow- methods define the elementary reaction types.
erful CRO-based algorithm for particular problems. – It is easy to modify CRO to run in parallel, as the pop-
CRO enjoys the advantages of both Simulated Anneal- ulation size does not need to be synchronized between
ing (SA) [17] and GA. The energy conservation requirement computing units.
gives similar effects of the Metropolis Algorithm used in SA
while the decomposition and synthesis operations share sim-
ilarities with the crossover and mutation operations of GA. 4 Basic components, elementary reactions, and concepts
When the number of molecules is small, CRO is more like
SA. When some crossover and mutation operators are imple- In this section, we introduce the building blocks of CRO and
mented in decomposition and synthesis, CRO performs more explain how they are integrated as a complete algorithm. We
like GA (more discussion can be found in Sect. 5.6). first define the manipulated agent, then describe the elemen-
The basic unit in CRO is a molecule. Each molecule has tary reactions, and finally elaborate on the core concept of
certain attributes, e.g. potential and kinetic energies, molec- CRO, namely, conservation of energy.
ular structures, etc., with values characterizing that mole-
cule. The elementary reactions define the implementations 4.1 The manipulated agent
of molecular interactions. Thus, we can easily program CRO
with an object-oriented programming language [30], e.g. CRO is a multi-agent algorithm and the manipulated agents
C++ and Java. We can define a class whose data fields rep- are molecules. Each molecule has several attributes, some
resent the attributes and whose methods describe the ele- of which are essential to the basic operations of CRO. The
mentary reactions. Whenever a molecule is constructed (in essential attributes include (a) the molecular structure (ω); (b)
initialization and decomposition), we create an object repre- the potential energy (PE); and (c) the kinetic energy (KE).
senting a molecule from the class. If a molecule is removed The rest depends on the algorithm operators and they are uti-
from the system by combining with another one (in synthe- lized to construct different CRO variants for particular prob-
sis), we can simply destroy the corresponding object. lems provided that their implementations satisfy the charac-
Parallelization of CRO can be done without too much teristics of the elementary reactions. The optional attributes
effort. When we implement multiple CROs for solving a adopted in most of the published CRO variants are (d) the

123
Memetic Comp. (2012) 4:3–17 7

number of hits (NumHit); (e) the minimum structure (Min- 4.2.1 On-wall ineffective collision
Struct); (f) the minimum PE (MinPE); and (g) the minimum
hit number (MinHit). Illustrations of the attributes mentioned An on-wall ineffective collision represents the situation when
above are listed in the following: a molecule collides with a wall of the container and then
bounces away remaining in one single unit. In this collision,
1. Molecular structure ω captures a solution of the prob- we only perturb the existing ω to ω , i.e.,
lem. It is not required to be in any specific format: it can ω → ω .
be a number, a vector, or even a matrix. For example, if
the problem solution space is defined as a set of vectors This can be done by picking ω in the neighborhood of ω.
composed of five real numbers, then ω can be any of Let N (·) be any neighborhood search operator, we have ω =
these vectors. N (ω) and PE ω = f (ω ). Moreover, a certain portion of
2. Potential energy PE is defined as the objective function KE of the transformed molecule is withdrawn to the cen-
value of the corresponding solution represented by ω. If tral energy buffer (buffer). Let KElossRate be a parameter of
f denotes the objective function, then we have CRO, 0 ≤ KELossRate ≤ 1, and a ∈ [KELossRate, 1] be
a random number, uniformly distributed from KELossRate
PE ω = f (ω). (1) to 1. We get

KE ω = (PE ω − PE ω + KE ω ) × a (2)
3. Kinetic energy KE is a non-negative number and it quan-
tifies the tolerance of the system accepting a worse and the remaining energy, (PE ω − PE ω + KE ω ) × (1 − a),
solution than the existing one. We will elaborate on the is transferred to buffer. If KE ω is large enough such that the
concept later in this section. transformed molecule satisfies the following energy conser-
4. Number of hits When a molecule undergoes a collision, vation condition:
one of the elementary reactions will be triggered and it
PE ω + KE ω ≥ PE ω (3)
may experience a change in its molecular structure. Num-
Hit is a record of the total number of hits (i.e. collisions) (further discussed in Sect. 4.3), we can have PE ω > PE ω .
a molecule has taken. In other words, we can obtain a worse solution in this ele-
5. Minimum structure MinStruct is the ω with the mini- mentary reaction. Of course, it is always possible to undergo
mum corresponding PE which a molecule has attained an on-wall ineffective collision when PE ω ≤ PE ω . When
so far. After a molecule experiences a certain number of a molecule experiences more of this elementary reaction, it
collisions, it has undergone many transformations of its will have more KE transferred to buffer. Hence, the chance
structure, with different corresponding PE. MinStruct is of having a worse solution is lower in a subsequent change.
the one with the lowest PE in its own reaction history.
6. Minimum potential energy When a molecule attains its 4.2.2 Decomposition
MinStruct, MinPE is the corresponding PE.
7. Minimum hit number MinHit is the number of hits when Decomposition refers to the situation when a molecule hits
a molecule realizes MinStruct. It is an abstract notation a wall and then breaks into several parts (for simplicity, we
of time when Minstruct is achieved. consider two parts in our discussion). Assume that ω pro-
duces ω1 and ω2 , i.e.,
4.2 Elementary reactions
ω → ω1 + ω2 .
There are four types of elementary reactions, each of which Any mechanism, which can produce ω1 and ω2 from ω, is
takes place in each iteration of CRO. They are employed allowed. Theoretically, even generating solutions indepen-
to manipulate solutions (i.e. explore the solution space) and dent of the existing one (random generation of new solution)
to redistribute energy among the molecules and the buffer. is feasible. The idea of decomposition is to allow the system
For demonstration purposes, we will also give examples of to explore other regions of the solution space after enough
the most frequently used operators in various applications of local search by the ineffective collisions. The effectiveness
CRO in Sect. 5. Other designs can be found in the references of the solution generation mechanism is problem-dependent.
provided in Sect. 6. Note that there is no strict requirements Since more solutions are created, the total sum of PE and KE
on the mechanisms of the operators and operators designed of the original molecule may not be sufficient. In other words,
for other algorithms may also be adopted. However, CRO we may have
ensures the conservation of energy when new solutions are
generated with the operators. PE ω + KE ω < PE ω1 + PE ω2 .

123
8 Memetic Comp. (2012) 4:3–17

As energy conservation is not satisfied, this decomposition KE ω2 = E inter × (1 − δ4 ), (10)


has to be aborted. To increase the chance of having a decom-
where δ4 is a random number generated in [0, 1].
position completed, we randomly draw a small portion of
energy from buffer to support the change. Let δ1 and δ2 be two
independent and identically distributed numbers uniformly 4.2.4 Synthesis
generated in the range of [0, 1]. We modify the energy con-
servation condition for decomposition as follows: Synthesis does the opposite of decomposition. A synthesis
happens when multiple (assume two) molecules hit against
PE ω + KE ω + δ1 × δ2 × buffer ≥ PE ω1 + PE ω2 . (4) each other and fuse together, i.e.,
This models the situation that some energy from buffer is ω1 + ω2 → ω .
transferred to the molecule when it hits the wall. If (4) holds,
the existing molecule with ω is replaced by the two newly As only one molecule is produced, it is likely to satisfy the
generated ones, whose KEs randomly share the remaining energy conservation condition:
energy E dec = (PE ω + KE ω + δ1 × δ2 × buffer) − (PE ω1 + PE ω1 + PE ω2 + KE ω1 + KE ω2 ≥ PE ω . (11)
PE ω2 ), i.e.,
If (11) holds, the resulting KE ω just takes up all the remain-
KE ω1 = E dec × δ3 and (5) ing energy, i.e.,
KE ω2 = E dec × (1 − δ3 ), (6)
KE ω = (PE ω1 + PE ω2 + KE ω1 + KE ω2 ) − (PE ω ). (12)
where δ3 is a random number generated in [0, 1]. The energy
in the buffer is also updated by We can see that we allow greater change to ω with respect to
ω1 and ω2 and KE ω is usually higher than KE ω . The result-
buffer  = (1 − δ1 δ2 )buffer. (7) ing molecule has a higher “ability” to explore a new solution
region. Any mechanism allowing the combination of solu-
tions is allowed, where the resultant molecule is in a region
4.2.3 Inter-molecular ineffective collision farther away from the existing ones in the solution space.
The idea behind synthesis is diversification of solutions. The
Inter-molecular ineffective collision takes place when multi- implementation detail is again problem-dependent.
ple molecules collide with each other and then bounce away.
The molecularity (assume two) remains unchanged before
4.3 Conservation of energy
and after the process, i.e.,
ω1 + ω2 → ω1 + ω2 . One of the fundamental assumptions of CRO is conserva-
tion of energy, which means that energy cannot be created
This elementary reaction is very similar to the uni-molecu- or destroyed. The whole system refers to all the defined
lar ineffective counterpart; we generate ω1 and ω2 by ω1 = molecules and the container, which is connected to buffer.
N (ω1 ) and ω2 = N (ω2 ). The energy management is similar The total amount of energy of the whole system is deter-
but no buffer is involved. The energy conservation condition mined by the objective function values (i.e. PE) of the initial
can be stated as population of molecules whose size is PopSize, the initial
PE ω1 + PE ω2 + KE ω1 + KE ω2 ≥ PE ω1 + PE ω2 . (8) KE (InitialKE) assigned, and the initial value of buffer. Let
P E ωi (t), K E ωi (t), PopSi ze(t), and bu f f er (t) be the PE
As more molecules are involved, the total sum of energy of molecule i, the KE of molecule i, the number of mole-
of the molecular sub-system is larger than that of the on- cules, and the energy in the central buffer at time t. When the
wall ineffective collision. The probability of the molecules algorithm evolves, the total amount of energy in the system
to explore their immediate surroundings is higher. In other always remains constant, i.e.,
words, the molecules have higher flexibility to be trans-
formed to more diverse molecular structures. We can use 
PopSize(t)
(PE ωi (t) + KE ωi (t)) + buffer(t) = C, (13)
the same operator for on-wall ineffective collision to pro-
i=1
duce new solutions. We apply the operator to each mole-
cule to get a new one. If (8) is satisfied, KEs of the trans- where C is a constant. Each elementary reaction manages
formed molecules share the remaining energy E inter = a sub-system (i.e. a subset of entities of the system); a
(PE ω1 + PE ω2 + KE ω1 + KE ω2 ) − (PE ω1 + PE ω2 ) in the uni-molecular collision involves a molecule and the container
sub-system, i.e., while an inter-molecular collision concerns multiple mole-
cules. After an elementary reaction, the total energy of the
KE ω1 = E inter × δ4 and (9) constructed sub-system remains the same. Let k and l be the

123
Memetic Comp. (2012) 4:3–17 9

number of molecules involved before and after a particular Algorithm 1 “Molecule” class
elementary reaction, and let ω and ω be the molecular struc- 1: class Molecule
tures of an existing molecule and the one to be generated 2: Attributes:
3: ω, PE, KE, NumHit, MinStruct, MinPE, MinHit
from the elementary reaction, respectively. In general, the 4: Method:
elementary reaction can only take place when it satisfies the 5: Molecule() \\constructor
following energy conservation condition: 6: {
7: Randomly generate ω in the solution space

k 
l 8: PE ← f (ω)
(PE ωi + KE ωi ) ≥ PE ωi . (14) 9: KE ← InitialKE
i=1 i=1 10: NumHit ← 0
11: MinStruct ← ω
We modify this condition for decomposition as it involves 12: MinPE ← PE
buffer on the left-hand side of (14). Note that PE is deter- 13: MinHit ← 0
mined by (1) according to the molecular structure. If the 14: }
15: Onwall IneffectiveCollision()
resultant molecules have very high potential energy, i.e. they 16: Decomposition()
give very bad solutions, the reaction will not occur. 17: IntermolecularIneffectiveCollision()
Theoretically, energy cannot attain a negative value and 18: Synthesis()
any operation resulting in negative energy should be forbid- 19: end class
den. However, some problems may attain negative objec-
tive function values (i.e. negative PE), but we can convert
the problem to an equivalent one by adding an offset to the In this stage, we define the manipulated agent, i.e. a mol-
objective function to make each PE non-negative. The law ecule, set the parameter values, and construct the initial pop-
of conservation of energy is still obeyed and the system ulation of molecules. As mentioned in Sect. 3, it is preferred
works perfectly. Interested readers may refer to [20] for more to program CRO with an object-oriented programming lan-
details. guage [30]. We create a “Molecule” class with some attri-
butes and methods. The attributes are those mentioned in
Sect. 4.1 while we define five methods in the class, including
5 Algorithm design the class constructor and the four elementary reactions.7 The
constructor defines the details of an object when it is created
In this section, we will guide the readers to develop a basic according to the class. Here the object refers to a “mole-
version of CRO. This serves to help the readers understand cule”. As we normally generate the initial set of solutions
how CRO works. We also give examples of some common randomly in the solution space, we assign a random solution
operators used in CRO. Although CRO is a general-purpose to ω in the constructor. The pseudocode of the “Molecule”
metaheuristic, as with other general-purpose metaheuristics, class is given in Algorithm 1. We create PopSize number of
this basic CRO may not give good performance to all prob- molecules from “Molecule” to form the initial population of
lems of interest. At the end of this section, we also give some molecules.
suggestions on how to proceed to more advanced versions of
CRO which can be more adaptive to the problem. 5.2 Iterations
Similar to other evolutionary algorithms or metaheuris-
tics, CRO consists of three stages: initialization, iterations, Molecules with energy move and trigger collisions. A mole-
and the final stage. We define the elements of the algorithms cule can either hit on a wall of the container or collide with
in the initialization and the algorithm explores the solution each other. This is decided by generating a random num-
space in iterations. In the final stage, the algorithm terminates ber b in [0, 1]. If b > MoleColl or the system only has one
and the best found solution is output. molecule, we have a uni-molecular collision. Otherwise, an
inter-molecular collision follows.
5.1 Initialization For a uni-molecular collision, we randomly select one
molecule from the population and decide if it results in an
Imagine there is a container with some chemical substances on-wall ineffective collision or a decomposition, by check-
inside in the forms of molecules. We initialize the set- ing the decomposition criterion on the chosen molecule. In
tings of the algorithm and assign values to the algorithmic most of the CRO applications in Sect. 6, the decomposition
parameters, including PopSize, KELossRate, MoleColl,
buffer, InitialKE, α, and β.6
7As we only carry out the elementary reactions in the iterations stage,
6 We will discuss MoleColl, α, and β in Sect. 5.2. we will explain their implementations in Sect. 5.2.

123
10 Memetic Comp. (2012) 4:3–17

criterion is defined as Similarly, for an inter-molecular collision, we randomly


select two molecules from the population and determine if
NumHit − MinHit > α. (15) there will be an inter-molecular ineffective collision or a
synthesis by checking the synthesis criterion on the chosen
This means that the molecule has undergone α times of local
molecules. We usually adopt the following definition in the
search without locating a better local minimum. Thus we
deployments of CRO in Sect. 6: all involved molecules satisfy
should explore other parts of the solution space through
decomposition. Other definitions of decomposition criteria KE ≤ β. (16)
are allowed provided that diversification takes place suitably
in between intensifications. If (15) is satisfied, it will result That means all involved molecules have kinetic energy less
in a decomposition. Otherwise, we get an on-wall ineffec- than or equal to β. Molecules with too low KE lose the flexi-
tive collision. The pseudocodes of on-wall ineffective colli- bility of escaping from local minima. We trigger a synthesis
sion and decomposition are shown in Algorithms 2 and 3, to bring those inflexible molecules to other solution regions
respectively. for exploration. If (16) for each involved molecule is sat-
isfied, it will result in a synthesis. Otherwise, we have an
inter-molecular ineffective collision. The pseudocodes of
Algorithm 2 OnwallIneffectiveCollision inter-molecular ineffective collision and synthesis are shown
1: Input: molecule Mω in Algorithms 4 and 5, respectively.
2: ω ← N (ω)
3: PE ω ← f (ω )
4: NumHit ω ← NumHit ω + 1 Algorithm 4 IntermolecularIneffectiveCollision
5: if PE ω + KE ω ≥ PE ω then
1: Input: molecules Mω1 and Mω2
6: Generate a ∈ [KELossRate, 1]
2: ω1 ← N (ω1 ) and ω2 ← N (ω2 )
7: KE ω ← (PE ω − PE ω + KE ω ) × a
3: PE ω1 ← f (ω1 ) and PE ω2 ← f (ω2 )
8: buffer ← buffer + (PE ω − PE ω + KE ω ) × (1 − a)
4: NumHit ω1 ← NumHit ω1 + 1 and NumHit ω2 ← NumHit ω2 + 1
9: ω ← ω
5: E inter ← (PE ω1 + PE ω2 + KE ω1 + KE ω2 ) − (PE ω1 + PE ω2 )
10: PE ω ← PE ω
6: if E inter ≥ 0 then
11: KE ω ← KE ω
7: Generate δ4 ∈ [0, 1]
12: if PE ω < MinPE ω then
8: KE ω1 ← E inter × δ4 and KE ω2 ← E inter × (1 − δ4 )
13: MinStruct ω ← ω
14: MinPE ω ← PE ω 9: ω1 ← N (ω1 ) and ω2 ← N (ω2 )
15: MinHit ω ← NumHit ω 10: PE ω1 ← PE ω1 and PE ω2 ← PE ω2
16: end if 11: KE ω1 ← KE ω1 and KE ω2 ← KE ω2
17: end if 12: if PE ω1 < MinPE ω1 then
13: MinStruct ω1 ← ω1
14: MinPE ω1 ← PE ω1
15: MinHit ω1 ← NumHit ω1
16: end if
17: if PE ω2 < MinPE ω2 then
Algorithm 3 Decomposition 18: MinStruct ω2 ← ω2
1: Input: molecule Mω 19: MinPE ω2 ← PE ω2
2: Create Mω1 and Mω2 20: MinHit ω2 ← NumHit ω2
21: end if
3: Obtain ω1 and ω2 from ω
22: end if
4: PE ω1 ← f (ω1 ) and PE ω2 ← f (ω2 )
5: if PE ω + KE ω ≥ PE ω1 + PE ω2 then
6: Edec ← PE ω + KE ω − (PE ω1 + PE ω2 )
7: goto Step 13
8: else
9: Generate δ1 , δ2 ∈ [0, 1] Algorithm 5 Synthesis
10: Edec ← PE ω + KE ω + δ1 δ2 × buffer − (PE ω1 + PE ω2 ) 1: Input: molecules Mω1 and Mω2
11: if Edec ≥ 0 then 2: Create Mω
12: buffer ← buffer × (1 − δ1 δ2 ) 3: Obtain ω from ω1 and ω2
13: Generate δ3 ∈ [0, 1] 4: PE ω ← f (ω )
14: KE ω1 ← Edec × δ3 and KE ω2 ← Edec × (1 − δ3 ) 5: if PE ω1 + PE ω2 + KE ω1 + KE ω2 ≥ PE ω then
15: MinStruct ω1 ← ω1 and MinStruct ω2 ← ω2 6: KE ω ← (PE ω1 + PE ω2 + KE ω1 + KE ω2 ) − PE ω
16: MinPE ω1 ← PE ω1 and MinPE ω2 ← PE ω2 7: MinStruct ω ← ω
17: Destroy Mω 8: MinPE ω1 ← PE ω
18: else 9: Destroy Mω1 and Mω2
19: NumHit ω ← NumHit ω + 1 10: else
20: Destroy Mω1 and Mω2 11: NumHit ω1 ← NumHit ω1 + 1 and NumHit ω2 ← NumHit ω2 + 1
21: end if 12: Destroy Mω
22: end if 13: end if

123
Memetic Comp. (2012) 4:3–17 11

Fig. 2 Schematic diagram of CRO [18]

Inequalities (15) and (16) control the degree of diversifi- without improvements, etc. In this stage, we simply output
cation by α and β. Proper values of α and β balance intensifi- the best solution found with its objective function value and
cation (i.e. exploitation) and diversification (i.e. exploration). terminate the algorithm.
After an elementary reaction (manipulation of solutions)
completes, we check if the energy conservation condition is 5.4 The overall algorithm
obeyed. If not, the change is abolished. Then we check if
any newly determined solution has a lower objective func- To program CRO, we just need to assemble the previously
tion value. If so, we record the best solution obtained so far. mentioned components together according to the peudocode
If no stopping criteria are met, we will start a new iteration. given in Algorithm 6. To ease understanding the flow of the
algorithm, we also give the schematic diagram of CRO in
5.3 The final stage Fig. 2.
Here are some suggested values for the parameters:
If any of the stopping criteria is met, we will go to the PopSi ze = 10, K E Loss Rate = 0.2, MoleColl =
final stage. The stopping criteria are defined according to 0.2, I nitial K E = 1000, α = 500, β = 10, and bu f f er =
the user’s requirements and preferences. Typical stopping 0. These values are deduced from our implementation in
criteria include the maximum amount of CPU time used, [18,20]. However, these parameter values are problem-
the maximum number of function evaluations performed, dependent. To maximize the performance of CRO for a par-
obtaining an objective function value less than a predefined ticular problem, the readers may perform some parameter
threshold, the maximum number of iterations performed tunings to determine a good combination of parameter values.

123
12 Memetic Comp. (2012) 4:3–17

Algorithm 6 CRO u i ; li , u i ∈ R, ∀i. First we randomly pick an element ω(i)


1: Input: Objective function f and the parameter values from ω. Let i be a random variable with a Gaussian prob-
2: \\ Initialization ability density function having zero mean and variance σ 2 .
3: Set PopSize, KELossRate, MoleColl, buffer, InitialKE, α, and β
4: Create PopSize number of molecules
Let δi be a realization of i . We have ω̃(i) = ω(i) + δi . If
5: \\ Iterations ω̃(i) is smaller than li , we get ω (i) by reflecting on li with
6: while the stopping criteria not met do the amount of violation. Otherwise, we have ω̃(i) = ω(i). If
7: Generate b ∈ [0, 1] ω̃(i) is larger than u i , we obtain ω (i) similarly by reflecting
8: if b > MoleColl then
9: Randomly select one molecule Mω
on u i . Mathematically, we get ω (i) by
10: if Decomposition criterion (15) met then ⎧

⎪ 2li − ω̃(i) if ω̃(i) < li ,
11: Trigger Decomposition ⎨
12: else 
ω (i) = 2u i − ω̃(i) if ω̃(i) > u i , (17)
13: Trigger OnwallIneffectiveCollision ⎪


14: end if ω̃(i) otherwise.
15: else
16: Randomly select two molecules Mω1 and Mω2
17: if Synthesis criterion (16) met then 5.5.3 Half-total change
18: Trigger Synthesis
19: else It is an example of a decomposition operator and it can be
20: Trigger IntermolecularIneffectiveCollision
21: end if used in Line 3 of Algorithm 3. As its name implies, we pro-
22: end if duce a new solution from an existing one by keeping one half
23: Check for any new minimum solution of the existing solution values and assigning the remaining
24: end while half with new values. Suppose we try to produce two new
25: \\ The final stage
26: Output the best solution found and its objective function value solutions ω1 = [ω1 (i), 1 ≤ i ≤ n] and ω2 = [ω2 (i), 1 ≤
i ≤ n] from ω = [ω(i), 1 ≤ i ≤ n]. For ω1 , we first copy
ω to ω1 and then randomly pick n/2
elements in the vec-
5.5 Operator examples tor of ω1 , where ·
returns the largest integer not greater
than the argument. For each of these elements, e.g., ω1 (i),
Here we give examples of operators used in some applica- we assign a new value according to the problem constraints.
tions of CRO given in Sect. 6. For example, if ω1 (i) can only take a value in a set Si , we
can just randomly select an element from Si to ω1 (i). If Si
5.5.1 Two-exchange is a continuous set, we can add a random perturbation to it
to get a new ω1 (i), similar to the Gaussian perturbation with
It is also called pair-exchange or 2-opt [5]. It is a neighbor- reflection scheme mentioned in the previous subsection for
hood search operator N (·) for combinatorial problems. It can the ith element of ω1 . After producing ω1 , we produce ω2
be used in Line 2 of Algorithm 2 and Line 2 of Algorithm 4. similarly. As the randomly chosen elements of ω1 and ω2 and
Consider a problem with solutions in the form of vec- the newly assigned values are different, ω1 is quite different
tors of n elements. Let ω = [ω(i), 1 ≤ i ≤ n] be a from ω1 , and also from ω.
particular solution. First we randomly pick two distinct ele-
ments from ω, e.g., ω(i) and ω( j), where i < j. Then 5.5.4 Probabilistic select
we form a new solution ω by exchanging their positions,
i.e., ω = [ω(1), . . . , ω(i − 1), ω( j), ω(i + 1), . . . , ω( j − It is an example of synthesis operator and it can be used in
1), ω(i), ω( j + 1), . . . , ω(n)]. If the problem is confined to Line 3 of Algorithm 5. Suppose we produce solution ω =
a permutation vector space, this operator can guarantee that [ω (i), 1 ≤ i ≤ n] by combining ω1 = [ω1 (i), 1 ≤ i ≤ n]
ω is still a permutation vector as long as ω is a permutation and ω2 = [ω2 (i), 1 ≤ i ≤ n]. This operator tries to randomly
vector. select elements from ω1 and ω2 to form ω . To do this, we
assign each ω (i) with a value equal to either ω1 (i) or ω2 (i)
5.5.2 Gaussian perturbation with reflection randomly.

Gaussian perturbation is also called Gaussian mutation [4]. It 5.6 Advanced settings
is a neighborhood search operator N (·) for continuous prob-
lems. It can be used in Line 2 of Algorithm 2 and Line 2 of We have only specified how the basic CRO works. Due
Algorithm 4. to the No-Free-Lunch Theorem [35], it cannot have good
Consider a problem with continuous solution space. Let performance for all kinds of problems. Recall that CRO is
ω = [ω(i), 1 ≤ i ≤ n] where ω(i) ∈ [li , u i ], li ≤ cast as a general-purpose algorithm and it is stated in the form

123
Memetic Comp. (2012) 4:3–17 13

of an algorithmic framework. Many details can be modified More information about constraint-handling techniques can
to suit a particular problem. Here we attempt to give the read- be found in [9].
ers some directions for developing advanced versions.
In the basic CRO, we always specify the maximum num-
ber of molecules involved in an elementary reaction to be 6 Applications
two. Each elementary reaction needs to satisfy the energy
conservation condition (14) in order to realize a new solution. Although CRO is a newly proposed algorithm, it has
However, if more than two molecules are involved (except been applied to problems in many disciplines successfully.
the on-wall ineffective collision which is always one-to-one), CRO has been compared with many existing evolutionary
more energy may be committed and the molecules may attain approaches and it achieves very competitive or even superior
new solutions to a greater extent; more molecules may com- performance. Applications of CRO with the operators used
pensate others for a great change in PE. For example, we are summarized in Table 2.
may allow an inter-molecular ineffective collision with three
molecules: ω1 + ω2 + ω3 → ω1 + ω2 + ω3 . 6.1 Quadratic assignment problem
We only give the principles of the elementary reactions in
Sect 5.2, where operators are required to specify how to gen- Quadratic Assignment Problem is a fundamental combinato-
erate new solutions from existing ones. In Sect. 5.5, we give rial problem in operations research [22]. It belongs to location
examples of some commonly used operators of CRO. Sim- analysis, about minimizing the transportation cost by vary-
ilar to other evolutionary algorithms, it is possible to design ing the locations of facilities. Consider the assignment of n
good operators to gain better performance for a particular facilities to n locations. We know the distance between each
problem. Moreover, we can also adopt the operators suc- pair of locations and the human flow between each pair of
cessfully used in other algorithms in CRO. For example, the facilities. The problem is to minimize the total cost (distance
effect of decomposition is similar to that of mutation in GA. × flow) by arranging the locations of the facilities. In [18],
We can apply a mutation operator to a molecule twice to the earliest version of CRO is compared with the variants
produce two different molecules. We can also apply a GA of some popular evolutionary algorithms and CRO achieves
crossover operator in synthesis to combine solutions into superior performance in many test instances. In [39], a paral-
one. lel version of CRO with a synchronous communication strat-
We can design the decomposition and synthesis criteria egy is proposed to tackle the problem. The computation time
other than those given in (15) and (16). Decomposition and and the solution quality of the parallel implementation are
synthesis bring diversification to the algorithm. Diversifi- improved when compared to those of the sequential version
cation cannot take place too often, or it will be become a of CRO.
completely random algorithm. The criteria specify when a
diversification happens in between intensifications. Recall 6.2 Resource-constrained project scheduling problem
that in Sect. 4.1, only the molecular structure ω, PE, and
KE are necessary to characterize a molecule. Other attributes Resource-Constrained Project Scheduling Problem is one of
are optionally used to describe the condition of a molecule the most intractable, NP-hard optimization problems in oper-
for checking the decomposition and synthesis criteria. Other ations research, related to project management, resource allo-
attributes can also be introduced for different designs of the cation, and the manufacturing process [7]. Consider that time
criteria. is divided into slots and there are some activities to be sched-
Many optimization problems impose constraints to dif- uled for a project. Each activity requires resources to process
ferentiate feasible solutions from the infeasible ones. New and it may span more than one time slot. Resources are lim-
solutions generated in the elementary reactions may be fea- ited; we need to decide which activities should be supported
sible or infeasible depending on the operators used. There in a certain time slot. There are also precedence constraints
are generally three approaches to handle constraints: among the activities. In other words, some activities can only
start when certain ones have been completed. The objective
1. One can design the operators which always map to fea- is to minimize the lifespan of the project. In [18], CRO can
sible solutions. achieve the known global minimums of most instances in the
2. We allow the operators to produce infeasible solutions standard benchmarks.
but we introduce a mechanism to convert any infeasible
solutions into feasible ones at the end of each iteration. 6.3 Channel assignment problem in wireless mesh networks
3. We allow infeasible solutions without any correction
mechanism but we impose a penalty at the objective func- A wireless mesh network is composed of some stationary
tion to any infeasible solution. wireless mesh routers, each of which is equipped with certain

123
14

Table 2 Applications of CRO

123
Problem Ref. Type Field Year Solution Structure Neighborhood Decomposition Synthesis operator
operator operator

Quadratic [18,39] Combinatorial, Operations research 2010 Permutation vector Two-exchange Circular shift Distance-preserving
Assignment NP-hard crossover
Problem
Resource-Con- [18] Combinatorial, Operations research 2010 Permutation vector Two-exchange Circular shift Distance-preserving
strained Project NP-hard crossover
Scheduling
Problem
Channel Assignment [18] Combinatorial, Communications, 2010 Integer vector One-difference Half-total-change Probabilistic select
Problem in NP-hard Networking
wireless mesh
networks
Population [19] Continuous Communications, 2010 Right stochastic Randomly Randomly assign Probabilistic select
Transition Problem Networking matrix redistribute the rows to new on rows
in peer-to-peer live sum of two random solutions with
streaming numbers into two random generation
of unassigned rows
Cognitive Radio [21] Combinatorial, Communications, 2010 Binary vector One-difference Randomly assign Probabilistic select
Spectrum NP-hard Networking bits to new
Allocation solutions with
Problem random generation
of unassigned bits
Grid Scheduling [36,37] Combinatorial, Computing 2010, 2011 Permutation vector, Insertion, Random generation, Position-based,
Problem NP-hard integer vector two-exchange, half-random one-position
One-difference exchange
Standard continuous [20] Continuous Mathematics 2011 Real vector Gaussian Half-total-change Probabilistic select,
benchmark perturbation BLX-α
functions
Stock Portfolio [38] Mixed-integer, Finance 2011 Mixed-integer vector One-difference Half-random Keep the aligned
Selection Problem multi-objective, numbers and
NP-hard randomly generate
the rest
Artificial neural [41] Continuous Computational 2011 Real matrices and Gaussian Perturb every Probabilistic select
network training intelligence vectors perturbation element with 0.5
probability
Network Coding [25] Combinatorial, Communications, 2011 Integer vector One-difference Randomly generate a Probabilistic select
Optimization NP-hard Networking solution and
Problem modify the coding
links
Memetic Comp. (2012) 4:3–17
Memetic Comp. (2012) 4:3–17 15

radio interfaces. Two routers establish a communication link 6.6 Grid scheduling problem
if they are located in the transmission range of each other with
the same channel assigned to one of their interfaces. There Grid computing is the next wave of computing where we del-
are only limited channels available and two established com- egate computational tasks to the computer cloud (e.g. Inter-
munication links on the same channel interfere each other if net), instead of on a standalone machine [27]. The tasks are
they are in close proximity. The Channel Assignment Prob- taken up by idle computing resources and computed results
lem assigns channels to the communications links so as to are then returned to the requester. The resources may be het-
minimize the induced interference subject to interface con- erogeneous in computational power and volatile. Grid Sched-
straint, which means that we cannot assign more channels to uling Problem schedules tasks to resources so as to minimize
a router than the number of interfaces equipped. This prob- computational overheads and to utilize the resources effec-
lem is NP-hard and combinatorial [33]. CRO can improve tively. Several variants of CRO are proposed with different
the existing solutions to the problem [18]. considerations of solution representation and priority of the
elementary reactions [36,37]. CRO outperforms many exist-
ing evolutionary methods in most test cases.
6.4 Population Transition Problem in peer-to-peer live
streaming
6.7 Standard continuous benchmark functions
In a peer-to-peer live streaming system, there is a stream
Many optimization problems are continuous problems. The
source providing streaming data, together with peers receiv-
original CRO [18] mainly focuses on discrete problems
ing the data. Due to heterogeneous network conditions, peers
and a successful general-purpose metaheuristic should also
experience different transmission delays for the data from the
be applicable to the continuous domain. CRO is extended
source and they can be grouped into colonies according to the
to solve continuous problems systematically in [20]. This
delays. For a particular peer, its upstream peers with shorter
continuous version is tested with the standard benchmarks
transmission delays and those in the same colony can serve
comprised of unimodal, high-dimensional and low-dimen-
as a source for the data. The system is in universal streaming
sional multimodal functions [40]. Many existing evolution-
when all peers are served with sufficient streaming data. Peers
ary approaches are compared with CRO, which show very
can join and leave the system and switch to another colony
competitive results. An adaptive scheme for CRO is also pro-
while the system can impose rules to guide peers to join the
posed in [20].
colonies (e.g. assign transition probability for peers transiting
from colony to colony). Population Transition Problem max-
6.8 Stock portfolio selection problem
imizes the probability of universal streaming by assigning
population transition probabilities among all colonies [19].
Investing in a selection of stocks, instead of a single one,
In [19], CRO is compared with some practical strategies and
is almost the golden rule in finance to reduce risk. There is
simulation shows that the evolutionary approach by CRO
always a tradeoff in investment: minimizing the risk while
performs better than the non-evolutionary ones.
maximizing the return. Stock portfolio selection studies how
to build a portfolio of stocks with the consideration of the two
6.6 Grid scheduling problem

In grid computing, tasks submitted by users are taken up by idle computing resources, and computed results are then returned to the requester. The resources may be heterogeneous in computational power and volatile. The Grid Scheduling Problem schedules tasks to resources so as to minimize computational overheads and to utilize the resources effectively. Several variants of CRO are proposed with different considerations of solution representation and priority of the elementary reactions [36,37]. CRO outperforms many existing evolutionary methods in most test cases.
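The following toy sketch (ours; the models in [36,37] are richer) shows the kind of objective such a schedule could be scored with: the makespan of a task-to-resource assignment on resources of heterogeneous speed.

# Minimal sketch (illustrative): makespan of a schedule on heterogeneous resources.
def makespan(schedule, task_load, resource_speed):
    """schedule: dict task -> resource
    task_load: dict task -> amount of work (e.g., millions of instructions)
    resource_speed: dict resource -> work processed per unit time"""
    finish = {r: 0.0 for r in resource_speed}
    for task, resource in schedule.items():
        finish[resource] += task_load[task] / resource_speed[resource]
    return max(finish.values())             # completion time of the slowest resource

print(makespan(
    schedule={"t1": "r1", "t2": "r1", "t3": "r2"},
    task_load={"t1": 4.0, "t2": 2.0, "t3": 9.0},
    resource_speed={"r1": 2.0, "r2": 3.0},
))  # 3.0: r1 finishes at (4+2)/2 = 3.0, r2 at 9/3 = 3.0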
6.7 Standard continuous benchmark functions

Many optimization problems are continuous. The original CRO [18] mainly focuses on discrete problems, but a successful general-purpose metaheuristic should also be applicable to the continuous domain. CRO is extended to solve continuous problems systematically in [20]. This continuous version is tested on the standard benchmarks comprising unimodal functions, high-dimensional multimodal functions, and low-dimensional multimodal functions [40]. CRO is compared with many existing evolutionary approaches and shows very competitive results. An adaptive scheme for CRO is also proposed in [20].
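For reference, the sketch below gives one classic benchmark of the type collected in [40], together with a Gaussian perturbation of the kind a real-coded neighborhood operator might use; the step size and bounds are assumptions, not the settings of [20].

import math
import random

# Minimal sketch (illustrative): a multimodal benchmark and a Gaussian neighborhood.
def rastrigin(x):
    """Multimodal benchmark; global minimum 0 at x = (0, ..., 0)."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def gaussian_neighbour(x, sigma=0.1, lower=-5.12, upper=5.12):
    """Perturb every coordinate and clip back into the search range."""
    return [min(upper, max(lower, xi + random.gauss(0, sigma))) for xi in x]

x = [1.0, -2.0, 0.5]
print(rastrigin(x), rastrigin(gaussian_neighbour(x)))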

6.8 Stock portfolio selection problem

Investing in a selection of stocks, instead of a single one, is almost the golden rule in finance to reduce risk. There is always a tradeoff in investment: minimizing the risk while maximizing the return. Stock portfolio selection studies how to build a portfolio of stocks taking these two conflicting objectives into consideration [34]. It is a multi-objective, mixed-integer, NP-hard problem. In [38], CRO is employed to solve the problem based on the Markowitz model and the Sharpe ratio. A super molecule-based CRO is proposed to compute the Pareto frontier, with better performance in terms of Sharpe ratio, expected return, and variance than the canonical form.
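A fitness function for this problem evaluates Markowitz-style statistics of a portfolio weight vector. The sketch below is illustrative only; the weights, returns, covariance, and risk-free rate are made-up numbers, not data or the exact objective from [38].

import numpy as np

# Minimal sketch (illustrative): expected return, variance, and Sharpe ratio of a portfolio.
def portfolio_stats(weights, mean_returns, cov, risk_free=0.01):
    w = np.asarray(weights)
    expected_return = float(w @ mean_returns)
    variance = float(w @ cov @ w)
    sharpe = (expected_return - risk_free) / np.sqrt(variance)
    return expected_return, variance, sharpe

mu = np.array([0.08, 0.12])
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])
print(portfolio_stats([0.6, 0.4], mu, cov))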
6.9 Artificial neural network training

An artificial neural network is a very successful tool to model the input–output relationships of complex systems. The network is formed by interconnecting artificial neurons arranged in layers, simulating a biological neural network [24]. It is an adaptive system with the ability to learn from provided data. It has many real-world applications, e.g. classification, function approximation, and data mining. In order to model a system, a neural network requires a set of related data for training, i.e., to evolve the network structure and to tune the weights. In [41], CRO is employed to train neural networks. The CRO-trained neural networks have the best testing error rate among many representative evolutionary schemes.
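One common way to hand such a training task to a metaheuristic is to flatten the network weights into a real vector and let the objective be the training error. The sketch below illustrates this encoding on a tiny fixed-topology network; it is an assumption-laden illustration, not the scheme of [41].

import numpy as np

# Minimal sketch (illustrative): the weights of a fixed 2-2-1 feedforward network
# are flattened into one real vector, which a real-coded CRO could treat as a
# molecule.  The fitness is the mean squared error on a small training set (XOR).
def mse_of_weights(w, X, y):
    W1 = w[:4].reshape(2, 2)   # input -> hidden weights
    b1 = w[4:6]                # hidden biases
    W2 = w[6:8]                # hidden -> output weights
    b2 = w[8]                  # output bias
    hidden = np.tanh(X @ W1 + b1)
    out = hidden @ W2 + b2
    return float(np.mean((out - y) ** 2))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
rng = np.random.default_rng(1)
print(mse_of_weights(rng.normal(size=9), X, y))  # fitness of one random molecule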
6.10 Network coding optimization problem

In traditional computer networks, sources send data to destinations via some routers, and the routers only receive and forward the data without further processing. Network coding enhances network performance when routers are endowed with processing ability (coding). Network coding can increase the system throughput without any topological changes to the network. However, enabling coding on all possible links increases the computational cost. The Network Coding Optimization Problem minimizes the number of coding links while maintaining a certain transmission rate [16]. It is an NP-hard combinatorial problem. In [25], CRO is employed to tackle this problem and is shown to outperform existing algorithms.
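A rough sketch of a possible fitness function is given below (ours, not the formulation of [16] or the encoding of [25]); the feasibility test, which in practice would check whether the chosen coding links still support the target rate, is abstracted into a caller-supplied predicate.

# Minimal sketch (illustrative): a candidate solution is a binary vector marking
# which potential coding links have coding enabled.  The fitness is the number of
# coding links, heavily penalized when the configuration is infeasible.
def coding_cost(coding_enabled, supports_target_rate):
    """coding_enabled: list of 0/1 flags, one per potential coding link
    supports_target_rate: callable taking the flag vector, returns True/False"""
    if not supports_target_rate(coding_enabled):
        return len(coding_enabled) + 1      # worse than any feasible solution
    return sum(coding_enabled)

# Toy example: pretend the target rate needs at least two coding links.
feasible = lambda flags: sum(flags) >= 2
print(coding_cost([1, 0, 1, 0], feasible))  # 2
print(coding_cost([1, 0, 0, 0], feasible))  # 5 (infeasible, penalized)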
7 Concluding remarks and future work

CRO is a recently developed general-purpose optimization technique. Its inspiration comes from the nature of chemical reactions. It mimics the interactions of molecules in the form of elementary reactions. The randomly constructed sequence of elementary reactions lets the molecules explore the solution space for the global minimum. Energy management is the fundamental characteristic of CRO. The conservation of energy governs the acceptance of new solutions and the scope of the search. Its variable population structure allows the algorithm to adapt to the problem with a reasonable mixture of intensification and diversification. These are the reasons why CRO performs very well in solving optimization problems. Although CRO was proposed only recently, it has been successfully applied to many benchmarks and practical problems. The examples described in this paper give readers some ideas on how to apply CRO to their own problems. We believe this is just the start of the CRO journey. This tutorial serves to summarize the current development of CRO and to lay down potential research directions.

Thanks to the No-Free-Lunch Theorem, each successful metaheuristic performs well on certain classes of problems. Which classes of problems are suitable for CRO? There is no easy answer at this moment. Similar to other evolutionary algorithms, when CRO is applied to more areas, the research community will give an answer. To ease implementation, a toolbox called CROToolbox is available (it can be downloaded at http://cro.eee.hku.hk). Users can quickly employ CRO to their own problems and learn the characteristics of CRO with the toolbox. Moreover, there are very few efforts on the parallelization and distributed computation of CRO. Due to its variable population structure, there is no strict requirement on the population size or on synchronization among the distributed computational platforms. Furthermore, the frequencies of decomposition and synthesis affect the performance. How best to control these frequencies to further improve the performance is still an open question.

Acknowledgments This work was supported in part by the Strategic Research Theme of Information Technology of The University of Hong Kong. A.Y.S. Lam was also supported in part by the Croucher Foundation Research Fellowship.

Open Access This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.

References

1. AlRashidi M, El-Hawary M (2009) A survey of particle swarm optimization applications in electric power systems. IEEE Trans Evol Comput 13(4):913–918
2. Ashlock D (2004) Evolutionary computation for modeling and optimization. Springer, New York
3. Boyd S, Vandenberghe L (2004) Convex optimization. Cambridge University Press, Cambridge, UK
4. Burger R (2000) The mathematical theory of selection, recombination, and mutation. Wiley, Chichester
5. Cela E (1998) The quadratic assignment problem: theory and algorithms. Kluwer Academic Publishers, Dordrecht, The Netherlands
6. Chen XS, Ong YS, Lim MH, Tan KC (2011) A multi-facet survey on memetic computation. IEEE Trans Evol Comput 15(5):591–607
7. Demeulemeester EL, Herroelen WS (2002) Project scheduling: a research handbook. Academic Publishers, Boston, MA, USA
8. Dorigo M, Stutzle T (2004) Ant colony optimization. The MIT Press, Cambridge, MA, USA
9. Eiben AE (2001) Evolutionary algorithms and constraint satisfaction: definitions, survey, methodology, and research directions. Theoretical aspects of evolutionary computing. Springer, London, pp 13–30
10. Fortnow L (2009) The status of the P versus NP problem. Commun ACM 52(9):78–86
11. Garey MR, Johnson DS (1979) Computers and intractability: A guide to the theory of NP-completeness. WH Freeman & Co Ltd, New York
12. Geem ZW, Kim JH, Loganathan GV (2001) A new heuristic optimization algorithm: harmony search. Simulation 76(2):60–68
13. Goldberg DE (1989) Genetic algorithms in search, optimization, and machine learning. Addison-Wesley, Reading, MA, USA
14. Guggenheim EA (1967) Thermodynamics: an advanced treatment for chemists and physicists, 5th edn. Wiley, North Holland
15. Kennedy J, Eberhart RC (2001) Swarm intelligence. Morgan Kaufmann, San Francisco
16. Kim M, Medard M, Aggarwal V, O'Reilly UM, Kim W, Ahn CW (2007) Evolutionary approaches to minimizing network coding resources. In: Proceedings of the 26th annual IEEE conference on computer communications, Anchorage, AK, USA
17. Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science 220(4598):671–680
18. Lam AYS, Li VOK (2010) Chemical-reaction-inspired metaheuristic for optimization. IEEE Trans Evol Comput 14(3):381–399
19. Lam AYS, Xu J, Li VOK (2010) Chemical reaction optimization for population transition in peer-to-peer live streaming. In: Proceedings of the IEEE congress on evolutionary computation, Barcelona, Spain
20. Lam AYS, Li VOK, Yu JJQ (2011, in press) Real-coded chemical reaction optimization. IEEE Trans Evol Comput (accepted for publication)
21. Lam AYS, Li VOK (2010) Chemical reaction optimization for cognitive radio spectrum allocation. In: Proceedings of the IEEE global communications conference, Miami, FL, USA
22. Loiola EM, de Abreu NMM, Boaventura-Netto PO, Hahn P, Querido T (2007) A survey for the quadratic assignment problem. Eur J Oper Res 176(2):657–690
23. Ong YS, Lim MH, Chen XS (2010) Research frontier: memetic computation past, present and future. IEEE Comput Intell Mag 5(2):24–36
24. Palmes PP, Hayasaka T, Usui S (2005) Mutation-based genetic neural network. IEEE Trans Neural Netw 16(3):587–600
25. Pan B, Lam AYS, Li VOK (2011) Network coding optimization based on chemical reaction optimization. In: Proceedings of the IEEE global communications conference, Houston, TX, USA
26. Peng C, Zheng H, Zhao BY (2006) Utilization and fairness in spectrum assignment for opportunistic spectrum access. ACM/Kluwer Mobile Netw Appl 11(4):555–576
27. Ritchie G, Levine J (2004) A hybrid ant algorithm for scheduling independent jobs in heterogeneous computing environments. In: Proceedings of the 23rd workshop of the UK planning and scheduling special interest group, Cork, Ireland
28. Price K, Storn R, Lampinen J (2005) Differential evolution: a practical approach to global optimization. Springer, Berlin
29. Rogers H (1987) Theory of recursive functions and effective computability. The MIT Press, Cambridge, MA, USA
30. Schach S (2010) Object-oriented and classical software engineering, 8th edn. McGraw-Hill, New York
31. Shadbolt N (2004) Nature-inspired computing. IEEE Intell Syst 19(1):2–3
32. Shin SY, Lee IH, Kim D, Zhang BT (2005) Multiobjective evolutionary optimization of DNA sequences for reliable DNA computing. IEEE Trans Evol Comput 9(2):143–158
33. Subramanian AP, Gupta H, Das SR, Cao J (2008) Minimum interference channel assignment in multiradio wireless mesh networks. IEEE Trans Mobile Comput 7(12):1459–1473
34. Tollo GD, Roli A (2008) Metaheuristics for the portfolio selection problem. J Financial Quant Anal 8(4):621–636
35. Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67–82
36. Xu J, Lam AYS, Li VOK (2010) Chemical reaction optimization for the grid scheduling problem. In: Proceedings of the IEEE international conference on communications, Cape Town, South Africa
37. Xu J, Lam AYS, Li VOK (2011) Chemical reaction optimization for task scheduling in grid computing. IEEE Trans Parallel Distrib Syst 22(10):1624–1631
38. Xu J, Lam AYS, Li VOK (2011) Stock portfolio selection using chemical reaction optimization. In: Proceedings of the international conference on operations research and financial engineering, Paris, France
39. Xu J, Lam AYS, Li VOK (2010) Parallel chemical reaction optimization for the quadratic assignment problem. In: Proceedings of the international conference on genetic and evolutionary methods, Las Vegas, NV, USA
40. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput 3(2):82–102
41. Yu JJQ, Lam AYS, Li VOK (2011) Evolutionary artificial neural network based on chemical reaction optimization. In: Proceedings of the IEEE congress on evolutionary computation, New Orleans, LA, USA
42. Yu L, Chen H, Wang S, Lai KK (2009) Evolving least squares support vector machines for stock market trend mining. IEEE Trans Evol Comput 13(1):87–102
