Chemical Reaction Optimization: A Tutorial
Memetic Computing, March 2012
Received: 11 July 2011 / Accepted: 31 January 2012 / Published online: 12 February 2012
© The Author(s) 2012. This article is published with open access at Springerlink.com
a very influential class of optimization algorithms nowadays.

In the past few decades, the field of Nature-inspired optimization techniques has grown incredibly fast. These algorithms are usually general-purpose and population-based. They are normally referred to as evolutionary algorithms (they are also sometimes called metaheuristics) because many of them are motivated by biological evolution. Evolution means "the variation of allele frequencies in populations over time" [2]. We can, without harm, broaden the idea of "evolution" to non-biological processes. In a broad sense, evolutionary algorithms cover those which vary a group of solutions in iterations based on some Nature-inspired operations. Examples include, but are not limited to, Genetic Algorithm (GA) [13], Memetic Algorithm (MA) [6,23], Ant Colony Optimization (ACO) [8], Particle Swarm Optimization (PSO) [15], Differential Evolution (DE) [28], and Harmony Search (HS) [12]. Many of them are inspired by biological processes, varying in scale from the genetic level, e.g. GA, MA, and DE, to the creature level, e.g. ACO and PSO. Unlike the others, HS is motivated by the phenomenon of human activities in composing music.

The aforementioned algorithms are successful in solving many different kinds of optimization problems, as demonstrated by their huge number of citations in the literature. According to the No-Free-Lunch Theorem [35], all metaheuristics which search for extrema are exactly the same in performance when averaged over all possible objective functions. In other words, when one works excellently on a certain class of problems, it will be outperformed by the others on other classes. Therefore, all algorithms which have been shown to address some optimization problems successfully are equally important, as each of them must perform identically well on average. As the spectrum of optimization problems is huge, the number of reported successful metaheuristics is much less than the number of problems. Recently, Lam and Li [18] proposed a new metaheuristic for optimization, inspired by the nature of chemical reactions. They coined it Chemical Reaction Optimization (CRO). In a short period of time, CRO has been applied to solve many problems successfully, outperforming many existing evolutionary algorithms in most of the test cases. In this tutorial, we introduce this new paradigm, provide guidelines to help the readers implement CRO for their optimization problems, summarize the applications of CRO reported in the literature, and identify possible future research directions for CRO.

The rest of this tutorial is organized as follows. Section 2 gives the inspiration of CRO. We explain the characteristics of CRO in Sect. 3 and introduce the basic elements of CRO in Sect. 4. In Sect. 5, we demonstrate the CRO framework and show how the algorithm is structured. We briefly describe most of the problems addressed by CRO in Sect. 6. We conclude this tutorial and suggest potential future work in Sect. 7.

2 Inspiration

CRO is a technique which loosely couples chemical reactions with optimization. It does not attempt to capture every detail of chemical reactions. In general, the principles of chemical reactions are governed by the first two laws of thermodynamics [14]. Here we explain these laws at a high level to enable the readers to easily grasp the working mechanisms of chemical reactions. The first law (conservation of energy) says that energy cannot be created or destroyed; energy can transform from one form to another and transfer from one entity to another. A chemical reacting system consists of the chemical substances and their surroundings. Each chemical substance possesses potential and kinetic energies, and the energies of the surroundings are symbolically represented by the central energy buffer in CRO (the central energy buffer will be introduced in Sect. 4). A reaction is endothermic when it requires heat obtained from the surroundings to initialize the reaction process. An exothermic reaction refers to one whose chemical substances give heat to the surroundings. These two kinds of reactions can be characterized by the initial buffer size: when it is positive, the reaction is endothermic; when it is zero, the reaction is exothermic. The second law says that the entropy of a system tends to increase, where entropy is a measure of the degree of disorder. Potential energy is the energy stored in a molecule with respect to its molecular configuration. When it is converted to other forms, the system becomes more disordered. For example, when molecules with more kinetic energy (converted from potential energy) move faster, the system becomes more disordered and its entropy increases. Thus all reacting systems tend to reach a state of equilibrium, in which the potential energy drops to a minimum. In CRO, we capture this phenomenon by converting potential energy to kinetic energy and by gradually losing the energy of the chemical molecules to the surroundings.

A chemical system undergoes a chemical reaction when it is unstable, in the sense that it possesses excessive energy. It manipulates itself to release the excessive energy in order to stabilize itself, and this manipulation is called a chemical reaction. If we look at the chemical substances at the microscopic level, a chemical system consists of molecules, which are the smallest particles of a compound that retain the chemical properties of the compound (this definition of molecule is from the General Chemistry Online: Glossary of Frostburg State University, available at http://antoine.frostburg.edu/chem/senese/101/index.shtml). Molecules are classified into different species based on the underlying chemical properties. For example, carbon monoxide (CO) and nitrogen
discussed in Sect. 5). However, the frequencies have a stronger relationship with the "landscape" of the objective function. For example, if a "down-hill" search direction is always possible, we will never trigger the decomposition criterion. If "hill-climbing" does not often happen, molecules will not convert their kinetic energy for worse solutions and the synthesis criterion is not invoked. When working with a small set of molecules, we focus more on local search in some regions. Otherwise, we try to spread "seeds" (i.e. molecules) over the whole solution space to a greater extent. Therefore, CRO tries to adapt itself to the problem with the goal of locating the global minimum effectively.

In a broad sense, CRO is an algorithmic framework where we only define the general operations of agents and the energy management scheme. It allows certain implementation details to be adjusted to suit the characteristics of problems. In other words, CRO has high flexibility for users to customize it to meet their own needs.

CRO has a strong relationship with memetic computation, which is defined as "a paradigm that uses the notion of meme(s) as units of information encoded in computational representations for the purpose of problem-solving" [6]. MA hybridizes global and local heuristic search techniques. CRO realizes the global and local search with the elementary reactions. Moreover, we can incorporate (individual and social) learning ability or ideas from other algorithms into CRO through appropriate designs of the decomposition and synthesis mechanisms. This may result in a more powerful CRO-based algorithm for particular problems.

CRO enjoys the advantages of both Simulated Annealing (SA) [17] and GA. The energy conservation requirement gives effects similar to those of the Metropolis algorithm used in SA, while the decomposition and synthesis operations share similarities with the crossover and mutation operations of GA. When the number of molecules is small, CRO is more like SA. When some crossover and mutation operators are implemented in decomposition and synthesis, CRO performs more like GA (more discussion can be found in Sect. 5.6).

The basic unit in CRO is a molecule. Each molecule has certain attributes, e.g. potential and kinetic energies, molecular structures, etc., with values characterizing that molecule. The elementary reactions define the implementations of molecular interactions. Thus, we can easily program CRO with an object-oriented programming language [30], e.g. C++ and Java. We can define a class whose data fields represent the attributes and whose methods describe the elementary reactions. Whenever a molecule is constructed (in initialization and decomposition), we create an object representing a molecule from the class. If a molecule is removed from the system by combining with another one (in synthesis), we can simply destroy the corresponding object.

Parallelization of CRO can be done without too much effort. When we implement multiple CROs for solving a particular problem, we do not need strong synchronization among the CROs. Unlike other evolutionary algorithms, CRO does not define generations and each iteration involves only a subset of molecules. Each CRO maintains its own population size. Interactions between CROs can be carried out at a certain instant without much restriction, as each CRO does not need to wait for another CRO to complete certain actions, e.g. computation of the whole population in a generation in GA. An attempt to parallelize CRO can be found in [39].

To summarize, the advantages of CRO are highlighted as follows:

– CRO is a design framework which allows deploying different operators to suit different problems.
– Its variable population size allows the system to adapt to the problems automatically.
– Conversion of energy and transfer of energy between different entities and in different forms make CRO unique among metaheuristics. CRO has the potential to tackle those problems which have not been successfully solved by other metaheuristics.
– Other attributes can easily be incorporated into the agent (i.e. molecule). This gives flexibility to design different operators.
– CRO enjoys the advantages of both SA and GA.
– CRO can be easily programmed in an object-oriented programming language, where a class defines a molecule and methods define the elementary reaction types.
– It is easy to modify CRO to run in parallel, as the population size does not need to be synchronized between computing units.

4 Basic components, elementary reactions, and concepts

In this section, we introduce the building blocks of CRO and explain how they are integrated as a complete algorithm. We first define the manipulated agent, then describe the elementary reactions, and finally elaborate on the core concept of CRO, namely, conservation of energy.

4.1 The manipulated agent

CRO is a multi-agent algorithm and the manipulated agents are molecules. Each molecule has several attributes, some of which are essential to the basic operations of CRO. The essential attributes include (a) the molecular structure (ω); (b) the potential energy (PE); and (c) the kinetic energy (KE). The rest depend on the algorithm operators and they are utilized to construct different CRO variants for particular problems, provided that their implementations satisfy the characteristics of the elementary reactions.
The optional attributes adopted in most of the published CRO variants are (d) the number of hits (NumHit); (e) the minimum structure (MinStruct); (f) the minimum PE (MinPE); and (g) the minimum hit number (MinHit). Illustrations of the attributes mentioned above are listed in the following:

1. Molecular structure ω captures a solution of the problem. It is not required to be in any specific format: it can be a number, a vector, or even a matrix. For example, if the problem solution space is defined as a set of vectors composed of five real numbers, then ω can be any of these vectors.
2. Potential energy PE is defined as the objective function value of the corresponding solution represented by ω. If f denotes the objective function, then we have

   PE_ω = f(ω).   (1)

3. Kinetic energy KE is a non-negative number and it quantifies the tolerance of the system for accepting a worse solution than the existing one. We will elaborate on this concept later in this section.
4. Number of hits When a molecule undergoes a collision, one of the elementary reactions will be triggered and it may experience a change in its molecular structure. NumHit is a record of the total number of hits (i.e. collisions) a molecule has taken.
5. Minimum structure MinStruct is the ω with the minimum corresponding PE which a molecule has attained so far. After a molecule experiences a certain number of collisions, it has undergone many transformations of its structure, with different corresponding PE. MinStruct is the one with the lowest PE in its own reaction history.
6. Minimum potential energy When a molecule attains its MinStruct, MinPE is the corresponding PE.
7. Minimum hit number MinHit is the number of hits when a molecule realizes MinStruct. It is an abstract notion of the time when MinStruct is achieved.

4.2 Elementary reactions

There are four types of elementary reactions, one of which takes place in each iteration of CRO. They are employed to manipulate solutions (i.e. explore the solution space) and to redistribute energy among the molecules and the buffer. For demonstration purposes, we will also give examples of the most frequently used operators in various applications of CRO in Sect. 5. Other designs can be found in the references provided in Sect. 6. Note that there are no strict requirements on the mechanisms of the operators, and operators designed for other algorithms may also be adopted. However, CRO ensures the conservation of energy when new solutions are generated with the operators.

4.2.1 On-wall ineffective collision

An on-wall ineffective collision represents the situation when a molecule collides with a wall of the container and then bounces away, remaining in one single unit. In this collision, we only perturb the existing ω to ω′, i.e.,

ω → ω′.

This can be done by picking ω′ in the neighborhood of ω. Let N(·) be any neighborhood search operator; we have ω′ = N(ω) and PE_ω′ = f(ω′). Moreover, a certain portion of the KE of the transformed molecule is withdrawn to the central energy buffer (buffer). Let KELossRate be a parameter of CRO, 0 ≤ KELossRate ≤ 1, and let a ∈ [KELossRate, 1] be a random number, uniformly distributed from KELossRate to 1. We get

KE_ω′ = (PE_ω − PE_ω′ + KE_ω) × a   (2)

and the remaining energy, (PE_ω − PE_ω′ + KE_ω) × (1 − a), is transferred to buffer. If KE_ω is large enough such that the transformed molecule satisfies the following energy conservation condition:

PE_ω + KE_ω ≥ PE_ω′   (3)

(further discussed in Sect. 4.3), we can have PE_ω′ > PE_ω. In other words, we can obtain a worse solution in this elementary reaction. Of course, it is always possible to undergo an on-wall ineffective collision when PE_ω′ ≤ PE_ω. When a molecule experiences more of this elementary reaction, it will have more KE transferred to buffer. Hence, the chance of having a worse solution is lower in a subsequent change.
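To make the energy bookkeeping concrete, here is a minimal Python sketch of an on-wall ineffective collision, with the molecule stored as a plain dictionary; the neighborhood operator neighbor and the objective function f are problem-specific placeholders, the acceptance test is condition (3), and the KE split follows (2).

import random

def on_wall_ineffective_collision(mol, f, neighbor, ke_loss_rate, buffer_energy):
    """Perturb mol in place if condition (3) holds; return the updated buffer."""
    mol["NumHit"] += 1
    new_omega = neighbor(mol["omega"])            # omega' = N(omega)
    new_pe = f(new_omega)                         # PE_omega' = f(omega')
    if mol["PE"] + mol["KE"] >= new_pe:           # energy conservation condition (3)
        surplus = mol["PE"] - new_pe + mol["KE"]  # energy left after the move
        a = random.uniform(ke_loss_rate, 1.0)
        mol["omega"], mol["PE"] = new_omega, new_pe
        mol["KE"] = surplus * a                   # Eq. (2): kept by the molecule
        buffer_energy += surplus * (1.0 - a)      # the rest is lost to the buffer
        if mol["PE"] < mol["MinPE"]:              # best-so-far bookkeeping
            mol["MinStruct"], mol["MinPE"], mol["MinHit"] = mol["omega"], mol["PE"], mol["NumHit"]
    return buffer_energy

With a real-vector ω, for instance, neighbor could be the Gaussian perturbation operator discussed in Sect. 5.5.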
4.2.2 Decomposition

Decomposition refers to the situation when a molecule hits a wall and then breaks into several parts (for simplicity, we consider two parts in our discussion). Assume that ω produces ω′1 and ω′2, i.e.,

ω → ω′1 + ω′2.

Any mechanism which can produce ω′1 and ω′2 from ω is allowed. Theoretically, even generating solutions independent of the existing one (random generation of a new solution) is feasible. The idea of decomposition is to allow the system to explore other regions of the solution space after enough local search by the ineffective collisions. The effectiveness of the solution generation mechanism is problem-dependent. Since more solutions are created, the total sum of PE and KE of the original molecule may not be sufficient. In other words, we may have

PE_ω + KE_ω < PE_ω′1 + PE_ω′2.
Let k and l denote the number of molecules involved before and after a particular elementary reaction, and let ω_i and ω′_i be the molecular structures of an existing molecule and of one to be generated from the elementary reaction, respectively. In general, the elementary reaction can only take place when it satisfies the following energy conservation condition:

∑_{i=1}^{k} (PE_ωi + KE_ωi) ≥ ∑_{i=1}^{l} PE_ω′i.   (14)

We modify this condition for decomposition, as it involves buffer on the left-hand side of (14). Note that PE is determined by (1) according to the molecular structure. If the resultant molecules have very high potential energy, i.e. they give very bad solutions, the reaction will not occur.

Theoretically, energy cannot attain a negative value and any operation resulting in negative energy should be forbidden. However, some problems may attain negative objective function values (i.e. negative PE), but we can convert the problem to an equivalent one by adding an offset to the objective function to make each PE non-negative. The law of conservation of energy is still obeyed and the system works perfectly. Interested readers may refer to [20] for more details.
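A minimal sketch of checking condition (14) in Python; the optional buffer argument stands for the modified left-hand side used for decomposition.

def energy_conservation_holds(reactants, product_pes, buffer_energy=0.0):
    """reactants: list of (PE, KE) pairs; product_pes: list of PE values. Condition (14)."""
    lhs = sum(pe + ke for pe, ke in reactants) + buffer_energy
    rhs = sum(product_pes)
    return lhs >= rhs

# Example: a synthesis of two molecules into one
print(energy_conservation_holds([(3.0, 1.5), (2.0, 0.2)], [6.0]))   # True: 6.7 >= 6.0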
5 Algorithm design

In this section, we will guide the readers to develop a basic version of CRO. This serves to help the readers understand how CRO works. We also give examples of some common operators used in CRO. Although CRO is a general-purpose metaheuristic, as with other general-purpose metaheuristics, this basic CRO may not give good performance for all problems of interest. At the end of this section, we also give some suggestions on how to proceed to more advanced versions of CRO which can be more adaptive to the problem.

Similar to other evolutionary algorithms or metaheuristics, CRO consists of three stages: initialization, iterations, and the final stage. We define the elements of the algorithm in the initialization stage, and the algorithm explores the solution space in the iterations stage. In the final stage, the algorithm terminates and the best solution found is output.

5.1 Initialization

Imagine there is a container with some chemical substances inside in the form of molecules. We initialize the settings of the algorithm and assign values to the algorithmic parameters, including PopSize, KELossRate, MoleColl, buffer, InitialKE, α, and β (we will discuss MoleColl, α, and β in Sect. 5.2).

In this stage, we define the manipulated agent, i.e. a molecule, set the parameter values, and construct the initial population of molecules. As mentioned in Sect. 3, it is preferred to program CRO with an object-oriented programming language [30]. We create a "Molecule" class with some attributes and methods. The attributes are those mentioned in Sect. 4.1, while we define five methods in the class, including the class constructor and the four elementary reactions (as we only carry out the elementary reactions in the iterations stage, we will explain their implementations in Sect. 5.2). The constructor defines the details of an object when it is created according to the class. Here the object refers to a "molecule". As we normally generate the initial set of solutions randomly in the solution space, we assign a random solution to ω in the constructor. The pseudocode of the "Molecule" class is given in Algorithm 1. We create PopSize molecules from "Molecule" to form the initial population of molecules.

Algorithm 1 "Molecule" class
1: class Molecule
2:   Attributes:
3:     ω, PE, KE, NumHit, MinStruct, MinPE, MinHit
4:   Method:
5:     Molecule()  \\ constructor
6:     {
7:       Randomly generate ω in the solution space
8:       PE ← f(ω)
9:       KE ← InitialKE
10:      NumHit ← 0
11:      MinStruct ← ω
12:      MinPE ← PE
13:      MinHit ← 0
14:    }
15:    OnwallIneffectiveCollision()
16:    Decomposition()
17:    IntermolecularIneffectiveCollision()
18:    Synthesis()
19: end class
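A minimal Python sketch of the constructor in Algorithm 1 and of the construction of the initial population, assuming a real-vector solution space with box bounds; the objective function f, the bounds, and InitialKE are supplied by the user.

import random

def new_molecule(f, lower, upper, initial_ke):
    """Mirror of the constructor in Algorithm 1 for a real-vector solution space."""
    omega = [random.uniform(lo, hi) for lo, hi in zip(lower, upper)]
    pe = f(omega)
    return {"omega": omega, "PE": pe, "KE": initial_ke, "NumHit": 0,
            "MinStruct": list(omega), "MinPE": pe, "MinHit": 0}

def initial_population(f, lower, upper, pop_size, initial_ke):
    return [new_molecule(f, lower, upper, initial_ke) for _ in range(pop_size)]

# Example: 10 molecules for the 2-D sphere function
pop = initial_population(lambda w: sum(x * x for x in w), [-5, -5], [5, 5], 10, 1000)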
5.2 Iterations

Molecules with energy move and trigger collisions. A molecule can either hit a wall of the container or collide with another molecule. This is decided by generating a random number b in [0, 1]. If b > MoleColl or the system only has one molecule, we have a uni-molecular collision. Otherwise, an inter-molecular collision follows.

For a uni-molecular collision, we randomly select one molecule from the population and decide if it results in an on-wall ineffective collision or a decomposition, by checking the decomposition criterion on the chosen molecule. In most of the CRO applications in Sect. 6, the decomposition
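A minimal sketch of the collision-type decision at the start of an iteration. The decomposition and synthesis tests shown (NumHit − MinHit > α and KE ≤ β) are the forms commonly used in published CRO variants and stand in here, as an assumption, for the criteria of inequalities (15) and (16).

import random

def choose_elementary_reaction(population, mole_coll, alpha, beta):
    """Return the name of the elementary reaction and the molecule(s) it involves."""
    if random.random() > mole_coll or len(population) == 1:
        mol = random.choice(population)                 # uni-molecular collision
        if mol["NumHit"] - mol["MinHit"] > alpha:       # assumed decomposition criterion
            return "decomposition", [mol]
        return "on-wall ineffective collision", [mol]
    m1, m2 = random.sample(population, 2)               # inter-molecular collision
    if m1["KE"] <= beta and m2["KE"] <= beta:           # assumed synthesis criterion
        return "synthesis", [m1, m2]
    return "inter-molecular ineffective collision", [m1, m2]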
Inequalities (15) and (16) control the degree of diversification via α and β. Proper values of α and β balance intensification (i.e. exploitation) and diversification (i.e. exploration).

After an elementary reaction (a manipulation of solutions) completes, we check whether the energy conservation condition is obeyed. If not, the change is abolished. Then we check whether any newly determined solution has a lower objective function value. If so, we record the best solution obtained so far. If no stopping criteria are met, we start a new iteration.

5.3 The final stage

If any of the stopping criteria is met, we go to the final stage. The stopping criteria are defined according to the user's requirements and preferences. Typical stopping criteria include the maximum amount of CPU time used, the maximum number of function evaluations performed, obtaining an objective function value less than a predefined threshold, and the maximum number of iterations performed without improvement. In this stage, we simply output the best solution found with its objective function value and terminate the algorithm.

5.4 The overall algorithm

To program CRO, we just need to assemble the previously mentioned components together according to the pseudocode given in Algorithm 6. To ease understanding of the flow of the algorithm, we also give the schematic diagram of CRO in Fig. 2.

Here are some suggested values for the parameters: PopSize = 10, KELossRate = 0.2, MoleColl = 0.2, InitialKE = 1000, α = 500, β = 10, and buffer = 0. These values are deduced from our implementation in [18,20]. However, these parameter values are problem-dependent. To maximize the performance of CRO for a particular problem, the readers may perform some parameter tuning to determine a good combination of parameter values.
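For convenience, the suggested starting values can be collected in a single configuration object; this is just a convenience sketch, and the names mirror the parameters listed in Sect. 5.1.

# Suggested starting values from the text; tune them per problem.
CRO_PARAMS = {
    "PopSize": 10,
    "KELossRate": 0.2,
    "MoleColl": 0.2,
    "InitialKE": 1000,
    "alpha": 500,     # decomposition-criterion threshold (diversification control)
    "beta": 10,       # synthesis-criterion threshold (diversification control)
    "buffer": 0,      # initial central energy buffer (zero means an exothermic start)
}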
Gaussian perturbation is also called Gaussian mutation [4]. It is a neighborhood search operator N(·) for continuous problems. It can be used in Line 2 of Algorithm 2 and Line 2 of Algorithm 4.

Consider a problem with a continuous solution space. Let ω = [ω(i), 1 ≤ i ≤ n], where ω(i) ∈ [l_i, u_i] and l_i ≤ u_i.
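A minimal sketch of a Gaussian perturbation used as the neighborhood operator N(·) on such an ω: a single randomly chosen component is shifted by a zero-mean Gaussian step and clipped back into [l_i, u_i]. The step size σ and the clipping rule are illustrative assumptions rather than the exact operator definition.

import random

def gaussian_perturbation(omega, lower, upper, sigma=0.1):
    """Neighborhood operator N(omega): perturb one random component and clip to bounds."""
    new_omega = list(omega)
    i = random.randrange(len(omega))
    new_omega[i] = min(upper[i], max(lower[i], omega[i] + random.gauss(0.0, sigma)))
    return new_omega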
5.6 Advanced settings

We have only specified how the basic CRO works. Due to the No-Free-Lunch Theorem [35], it cannot have good performance for all kinds of problems. Recall that CRO is cast as a general-purpose algorithm and it is stated in the form of an algorithmic framework.
Many details can be modified to suit a particular problem. Here we attempt to give the readers some directions for developing advanced versions.

In the basic CRO, we always specify the maximum number of molecules involved in an elementary reaction to be two. Each elementary reaction needs to satisfy the energy conservation condition (14) in order to realize a new solution. However, if more than two molecules are involved (except the on-wall ineffective collision, which is always one-to-one), more energy may be committed and the molecules may attain new solutions to a greater extent; more molecules may compensate others for a great change in PE. For example, we may allow an inter-molecular ineffective collision with three molecules: ω1 + ω2 + ω3 → ω′1 + ω′2 + ω′3.

We only give the principles of the elementary reactions in Sect. 5.2, where operators are required to specify how to generate new solutions from existing ones. In Sect. 5.5, we give examples of some commonly used operators of CRO. Similar to other evolutionary algorithms, it is possible to design good operators to gain better performance for a particular problem. Moreover, we can also adopt in CRO the operators successfully used in other algorithms. For example, the effect of decomposition is similar to that of mutation in GA. We can apply a mutation operator to a molecule twice to produce two different molecules. We can also apply a GA crossover operator in synthesis to combine solutions into one.
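As a sketch of this idea (and not the operators of any particular CRO application), the following builds a decomposition from a bit-flip mutation applied twice and a synthesis from a uniform "probabilistic select" crossover, assuming binary-vector solutions; whether the resulting structures are realized still depends on the energy conditions of Sect. 4.3.

import random

def mutate(omega, rate=0.1):
    """Bit-flip mutation on a binary vector."""
    return [1 - x if random.random() < rate else x for x in omega]

def decomposition_via_mutation(omega):
    """Decomposition built from two independent mutations of omega."""
    return mutate(omega), mutate(omega)

def synthesis_via_crossover(omega1, omega2):
    """Synthesis by picking each component from either parent (probabilistic select)."""
    return [x if random.random() < 0.5 else y for x, y in zip(omega1, omega2)]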
We can design decomposition and synthesis criteria other than those given in (15) and (16). Decomposition and synthesis bring diversification to the algorithm. Diversification cannot take place too often, or the method will become a completely random algorithm. The criteria specify when a diversification happens in between intensifications. Recall that in Sect. 4.1, only the molecular structure ω, PE, and KE are necessary to characterize a molecule. Other attributes are optionally used to describe the condition of a molecule for checking the decomposition and synthesis criteria. Other attributes can also be introduced for different designs of the criteria.

Many optimization problems impose constraints to differentiate feasible solutions from infeasible ones. New solutions generated in the elementary reactions may be feasible or infeasible depending on the operators used. There are generally three approaches to handle constraints:

1. One can design operators which always map to feasible solutions.
2. We allow the operators to produce infeasible solutions but introduce a mechanism to convert any infeasible solution into a feasible one at the end of each iteration.
3. We allow infeasible solutions without any correction mechanism but impose a penalty on the objective function for any infeasible solution (a minimal sketch of this approach is given after this list).

More information about constraint-handling techniques can be found in [9].
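A minimal sketch of the third approach: the objective function (and hence the PE) is augmented with a penalty proportional to the total constraint violation, so infeasible molecules are allowed but discouraged. The constraint encoding (g(ω) ≤ 0 when feasible) and the penalty weight are illustrative assumptions.

def penalized_objective(f, constraints, weight=1e3):
    """Wrap objective f; constraints are functions g with g(omega) <= 0 when feasible."""
    def penalized(omega):
        violation = sum(max(0.0, g(omega)) for g in constraints)
        return f(omega) + weight * violation
    return penalized

# Example: minimize sum(omega) subject to omega[0] >= 1 (i.e. 1 - omega[0] <= 0)
pe = penalized_objective(lambda w: sum(w), [lambda w: 1.0 - w[0]])
print(pe([0.0, 2.0]))   # 2.0 + 1000 * 1.0 = 1002.0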
6 Applications

Although CRO is a newly proposed algorithm, it has been applied to problems in many disciplines successfully. CRO has been compared with many existing evolutionary approaches and it achieves very competitive or even superior performance. Applications of CRO, with the operators used, are summarized in Table 2.

6.1 Quadratic assignment problem

The Quadratic Assignment Problem is a fundamental combinatorial problem in operations research [22]. It belongs to location analysis and is about minimizing the transportation cost by varying the locations of facilities. Consider the assignment of n facilities to n locations. We know the distance between each pair of locations and the human flow between each pair of facilities. The problem is to minimize the total cost (distance × flow) by arranging the locations of the facilities. In [18], the earliest version of CRO is compared with variants of some popular evolutionary algorithms and CRO achieves superior performance in many test instances. In [39], a parallel version of CRO with a synchronous communication strategy is proposed to tackle the problem. The computation time and the solution quality of the parallel implementation are improved when compared to those of the sequential version of CRO.

6.2 Resource-constrained project scheduling problem

The Resource-Constrained Project Scheduling Problem is one of the most intractable, NP-hard optimization problems in operations research, related to project management, resource allocation, and the manufacturing process [7]. Consider that time is divided into slots and there are some activities to be scheduled for a project. Each activity requires resources to process and it may span more than one time slot. Resources are limited; we need to decide which activities should be supported in a certain time slot. There are also precedence constraints among the activities. In other words, some activities can only start when certain ones have been completed. The objective is to minimize the lifespan of the project. In [18], CRO can achieve the known global minima of most instances in the standard benchmarks.
Table 2  Applications of CRO and the operators used

– Quadratic Assignment Problem [18,39]. Combinatorial, NP-hard; operations research; 2010. Solution structure: permutation vector; neighborhood operator: two-exchange; decomposition operator: circular shift; synthesis operator: distance-preserving crossover.
– Resource-Constrained Project Scheduling Problem [18]. Combinatorial, NP-hard; operations research; 2010. Solution structure: permutation vector; neighborhood operator: two-exchange; decomposition operator: circular shift; synthesis operator: distance-preserving crossover.
– Channel Assignment Problem in wireless mesh networks [18]. Combinatorial, NP-hard; communications, networking; 2010. Solution structure: integer vector; neighborhood operator: one-difference; decomposition operator: half-total-change; synthesis operator: probabilistic select.
– Population Transition Problem in peer-to-peer live streaming [19]. Continuous; communications, networking; 2010. Solution structure: right stochastic matrix; neighborhood operator: randomly redistribute the sum of two random numbers into two; decomposition operator: randomly assign rows to new solutions with random generation of unassigned rows; synthesis operator: probabilistic select on rows.
– Cognitive Radio Spectrum Allocation Problem [21]. Combinatorial, NP-hard; communications, networking; 2010. Solution structure: binary vector; neighborhood operator: one-difference; decomposition operator: randomly assign bits to new solutions with random generation of unassigned bits; synthesis operator: probabilistic select.
– Grid Scheduling Problem [36,37]. Combinatorial, NP-hard; computing; 2010, 2011. Solution structure: permutation vector, integer vector; neighborhood operator: insertion, two-exchange, one-difference; decomposition operator: random generation, half-random; synthesis operator: position-based, one-position exchange.
– Standard continuous benchmark functions [20]. Continuous; mathematics; 2011. Solution structure: real vector; neighborhood operator: Gaussian perturbation; decomposition operator: half-total-change; synthesis operator: probabilistic select, BLX-α.
– Stock Portfolio Selection Problem [38]. Mixed-integer, multi-objective, NP-hard; finance; 2011. Solution structure: mixed-integer vector; neighborhood operator: one-difference; decomposition operator: half-random; synthesis operator: keep the aligned numbers and randomly generate the rest.
– Artificial neural network training [41]. Continuous; computational intelligence; 2011. Solution structure: real matrices and vectors; neighborhood operator: Gaussian perturbation; decomposition operator: perturb every element with 0.5 probability; synthesis operator: probabilistic select.
– Network Coding Optimization Problem [25]. Combinatorial, NP-hard; communications, networking; 2011. Solution structure: integer vector; neighborhood operator: one-difference; decomposition operator: randomly generate a solution and modify the coding links; synthesis operator: probabilistic select.
6.3 Channel assignment problem in wireless mesh networks

A wireless mesh network is composed of some stationary wireless mesh routers, each of which is equipped with certain radio interfaces. Two routers establish a communication link if they are located in the transmission range of each other with the same channel assigned to one of their interfaces. There are only limited channels available, and two established communication links on the same channel interfere with each other if they are in close proximity. The Channel Assignment Problem assigns channels to the communication links so as to minimize the induced interference subject to the interface constraint, which means that we cannot assign more channels to a router than the number of interfaces equipped. This problem is NP-hard and combinatorial [33]. CRO can improve the existing solutions to the problem [18].

6.4 Population Transition Problem in peer-to-peer live streaming

In a peer-to-peer live streaming system, there is a stream source providing streaming data, together with peers receiving the data. Due to heterogeneous network conditions, peers experience different transmission delays for the data from the source and they can be grouped into colonies according to the delays. For a particular peer, its upstream peers with shorter transmission delays and those in the same colony can serve as a source for the data. The system is in universal streaming when all peers are served with sufficient streaming data. Peers can join and leave the system and switch to another colony, while the system can impose rules to guide peers to join the colonies (e.g. assign transition probabilities for peers transiting from colony to colony). The Population Transition Problem maximizes the probability of universal streaming by assigning population transition probabilities among all colonies [19]. In [19], CRO is compared with some practical strategies and simulation shows that the evolutionary approach by CRO performs better than the non-evolutionary ones.

6.5 Cognitive radio spectrum allocation problem

In many countries, wireless channel utilization is regulated and most of the channels can only be used by authorized users. Due to the widespread deployment of wireless devices, the shared channels become overcrowded and their quality of service deteriorates. With underutilization of the restricted channels, the capacity of the whole wireless system will substantially increase when unauthorized users are allowed to use the restricted channels, provided that higher priority is given to the authorized users. Adjoining users on the same channel induce interference. Restricted to an interference-free environment, the spectrum allocation problem assigns channels to users in order to maximize system utility subject to the hardware constraint (which is similar to the interface constraint in Sect. 6.3) [26]. CRO shows dramatic improvement over other existing approaches [21].

6.6 Grid scheduling problem

Grid computing is the next wave of computing where we delegate computational tasks to the computer cloud (e.g. the Internet), instead of to a standalone machine [27]. The tasks are taken up by idle computing resources and computed results are then returned to the requester. The resources may be heterogeneous in computational power and volatile. The Grid Scheduling Problem schedules tasks to resources so as to minimize computational overheads and to utilize the resources effectively. Several variants of CRO are proposed with different considerations of solution representation and priority of the elementary reactions [36,37]. CRO outperforms many existing evolutionary methods in most test cases.

6.7 Standard continuous benchmark functions

Many optimization problems are continuous problems. The original CRO [18] mainly focuses on discrete problems, and a successful general-purpose metaheuristic should also be applicable to the continuous domain. CRO is extended to solve continuous problems systematically in [20]. This continuous version is tested with the standard benchmarks comprising unimodal, high-dimensional and low-dimensional multimodal functions [40]. Many existing evolutionary approaches are compared with CRO, and the results show that CRO is very competitive. An adaptive scheme for CRO is also proposed in [20].

6.8 Stock portfolio selection problem

Investing in a selection of stocks, instead of a single one, is almost the golden rule in finance to reduce risk. There is always a tradeoff in investment: minimizing the risk while maximizing the return. Stock portfolio selection studies how to build a portfolio of stocks with the consideration of these two contradicting objectives [34]. It is a multi-objective mixed-integer NP-hard problem. In [38], CRO is employed to solve the problem based on the Markowitz model and the Sharpe ratio. A super molecule-based CRO is proposed to compute the Pareto frontier with better performance in terms of Sharpe ratio, expected return, and variance than the canonical form.

6.9 Artificial neural network training

An artificial neural network is a very successful tool to model the input–output relationships of complex systems. The network is formed by interconnecting artificial neurons arranged in layers, simulating a biological neural network [24]. It is an adaptive system with the ability to learn from provided data. It has many real-world applications, e.g. classification, function approximation, and data mining. In order to model a system, a neural network requires a set of related data for training, i.e., to evolve the network structure and to tune the weights.
In [41], CRO is employed to train neural networks. The CRO-trained neural networks have the best testing error rate among many representative evolutionary schemes.

6.10 Network coding optimization problem

In traditional computer networks, sources send data to destinations via some routers, and the routers only receive and forward the data without further processing. Network coding enhances network performance when routers are endowed with processing ability (coding). Network coding can increase the system throughput without any topological changes to the network. However, enabling coding on all possible links increases the computational cost. The Network Coding Optimization Problem minimizes the number of coding links while maintaining a certain transmission rate [16]. It is an NP-hard combinatorial problem. In [25], CRO is employed to tackle this problem and is shown to outperform existing algorithms.

7 Concluding remarks and future work

CRO is a recently developed general-purpose optimization technique. Its inspiration comes from the nature of chemical reactions. It mimics the interactions of molecules in the form of elementary reactions. The randomly constructed sequence of elementary reactions lets the molecules explore the solution space for the global minimum. Energy management is the fundamental characteristic of CRO. The conservation of energy governs the acceptance of new solutions and the scope of search. Its variable population structure allows the algorithm to adapt to the problem with a reasonable mixture of intensification and diversification. These are the reasons why CRO performs very well in solving optimization problems. Although CRO was only recently proposed, it has been successfully applied to many benchmarks and practical problems. The examples described in this paper give readers some ideas on how to apply CRO to their own problems. We believe this is just the start of the CRO journey. This tutorial serves to summarize the current development of CRO and to lay down potential research directions.

Thanks to the No-Free-Lunch Theorem, each successful metaheuristic performs well on certain classes of problems. Which classes of problems are suitable for CRO? There is no easy answer at this moment. Similar to other evolutionary algorithms, when CRO is applied to more areas, the research community will give an answer. To ease implementation, a toolbox called CROToolbox is available (it can be downloaded at http://cro.eee.hku.hk). Users can quickly apply CRO to their own problems and learn the characteristics of CRO with the toolbox. Moreover, there are very few efforts on parallelization and distributed computation of CRO. Due to its variable population structure, there is no strict requirement on the population size and synchronization among the distributed computational platforms. Furthermore, the frequencies of decomposition and synthesis affect the performance. How best to control these frequencies to further improve the performance is still an open question.

Acknowledgments This work was supported in part by the Strategic Research Theme of Information Technology of The University of Hong Kong. A.Y.S. Lam was also supported in part by the Croucher Foundation Research Fellowship.

Open Access This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.

References

1. AlRashidi M, El-Hawary M (2009) A survey of particle swarm optimization applications in electric power systems. IEEE Trans Evol Comput 13(4):913–918
2. Ashlock D (2004) Evolutionary computation for modeling and optimization. Springer, New York
3. Boyd S, Vandenberghe L (2004) Convex optimization. Cambridge University Press, Cambridge, UK
4. Burger R (2000) The mathematical theory of selection, recombination, and mutation. Wiley, Chichester
5. Cela E (1998) The quadratic assignment problem: theory and algorithms. Kluwer Academic Publishers, Dordrecht, The Netherlands
6. Chen XS, Ong YS, Lim MH, Tan KC (2011) A multi-facet survey on memetic computation. IEEE Trans Evol Comput 15(5):591–607
7. Demeulemeester EL, Herroelen WS (2002) Project scheduling: a research handbook. Academic Publishers, Boston, MA, USA
8. Dorigo M, Stutzle T (2004) Ant colony optimization. The MIT Press, Cambridge, MA, USA
9. Eiben AE (2001) Evolutionary algorithms and constraint satisfaction: definitions, survey, methodology, and research directions. In: Theoretical aspects of evolutionary computing. Springer, London, pp 13–30
10. Fortnow L (2009) The status of the P versus NP problem. Commun ACM 52(9):78–86
11. Garey MR, Johnson DS (1979) Computers and intractability: a guide to the theory of NP-completeness. WH Freeman & Co Ltd, New York
12. Geem ZW, Kim JH, Loganathan GV (2001) A new heuristic optimization algorithm: harmony search. Simulation 76(2):60–68
13. Goldberg DE (1989) Genetic algorithms in search, optimization, and machine learning. Addison-Wesley, Reading, MA, USA
14. Guggenheim EA (1967) Thermodynamics: an advanced treatment for chemists and physicists, 5th edn. Wiley, North Holland
15. Kennedy J, Eberhart RC (2001) Swarm intelligence. Morgan Kaufmann, San Francisco
16. Kim M, Medard M, Aggarwal V, OReilly UM, Kim W, Ahn CW (2007) Evolutionary approaches to minimizing network coding resources. In: Proceedings of the 26th annual IEEE conference on computer communications, Anchorage, AK, USA
17. Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science 220(4598):671–680
18. Lam AYS, Li VOK (2010) Chemical-reaction-inspired metaheuristic for optimization. IEEE Trans Evol Comput 14(3):381–399
19. Lam AYS, Xu J, Li VOK (2010) Chemical reaction optimization for population transition in peer-to-peer live streaming. In: Proceedings of the IEEE congress on evolutionary computation, Barcelona, Spain
20. Lam AYS, Li VOK, Yu JJQ (2011, in press) Real-coded chemical reaction optimization. IEEE Trans Evol Comput (accepted for publication)
21. Lam AYS, Li VOK (2010) Chemical reaction optimization for cognitive radio spectrum allocation. In: Proceedings of the IEEE global communications conference, Miami, FL, USA
22. Loiola EM, de Abreu NMM, Boaventura-Netto PO, Hahn P, Querido T (2007) A survey for the quadratic assignment problem. Eur J Oper Res 176(2):657–690
23. Ong YS, Lim MH, Chen XS (2010) Research frontier: memetic computation past, present and future. IEEE Comput Intell Mag 5(2):24–36
24. Palmes PP, Hayasaka T, Usui S (2005) Mutation-based genetic neural network. IEEE Trans Neural Netw 16(3):587–600
25. Pan B, Lam AYS, Li VOK (2011) Network coding optimization based on chemical reaction optimization. In: Proceedings of the IEEE global communications conference, Houston, TX, USA
26. Peng C, Zheng H, Zhao BY (2006) Utilization and fairness in spectrum assignment for opportunistic spectrum access. ACM/Kluwer Mobile Netw Appl 11(4):555–576
27. Ritchie G, Levine J (2004) A hybrid ant algorithm for scheduling independent jobs in heterogeneous computing environments. In: Proceedings of the 23rd workshop of the UK planning and scheduling special interest group, Cork, Ireland
28. Price K, Storn R, Lampinen J (2005) Differential evolution: a practical approach to global optimization. Springer, Berlin
29. Rogers H (1987) Theory of recursive functions and effective computability. The MIT Press, Cambridge, MA, USA
30. Schach S (2010) Object-oriented and classical software engineering, 8th edn. McGraw-Hill, New York
31. Shadbolt N (2004) Nature-inspired computing. IEEE Intell Syst 19(1):2–3
32. Shin SY, Lee IH, Kim D, Zhang BT (2005) Multiobjective evolutionary optimization of DNA sequences for reliable DNA computing. IEEE Trans Evol Comput 9(2):143–158
33. Subramanian AP, Gupta H, Das SR, Cao J (2008) Minimum interference channel assignment in multiradio wireless mesh networks. IEEE Trans Mobile Comput 7(12):1459–1473
34. Tollo GD, Roli A (2008) Metaheuristics for the portfolio selection problem. J Financial Quant Anal 8(4):621–636
35. Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67–82
36. Xu J, Lam AYS, Li VOK (2010) Chemical reaction optimization for the grid scheduling problem. In: Proceedings of the IEEE international conference on communications, Cape Town, South Africa
37. Xu J, Lam AYS, Li VOK (2011) Chemical reaction optimization for task scheduling in grid computing. IEEE Trans Parallel Distrib Syst 22(10):1624–1631
38. Xu J, Lam AYS, Li VOK (2011) Stock portfolio selection using chemical reaction optimization. In: Proceedings of the international conference on operations research and financial engineering, Paris, France
39. Xu J, Lam AYS, Li VOK (2010) Parallel chemical reaction optimization for the quadratic assignment problem. In: Proceedings of the international conference on genetic and evolutionary methods, Las Vegas, NV, USA
40. Yao X, Liu Y, Lin G (1999) Evolutionary programming made faster. IEEE Trans Evol Comput 3(2):82–102
41. Yu JJQ, Lam AYS, Li VOK (2011) Evolutionary artificial neural network based on chemical reaction optimization. In: Proceedings of the IEEE congress on evolutionary computation, New Orleans, LA, USA
42. Yu L, Chen H, Wang S, Lai KK (2009) Evolving least squares support vector machines for stock market trend mining. IEEE Trans Evol Comput 13(1):87–102