PRIMARY RESEARCH

A. H. Al-Saeedi 1, O. Altun 2

1, 2 Computer Engineering Department, Yildiz Technical University, Istanbul, Turkey
Index Terms: Mean-Variance Mapping Optimization; Binary Meta-Heuristic Optimization; Discrete evolutionary algorithms

Abstract—Mean-Variance Mapping Optimization (MVMO) is the newest class of modern meta-heuristic algorithms. The original version of this algorithm is suited to continuous search problems, so it cannot be applied directly to discrete search problems. In this paper, a binary version of the MVMO algorithm (BMVMO) is proposed. The proposed Binary Mean-Variance Mapping Optimization algorithm is compared with well-known binary meta-heuristic optimization algorithms, such as the Binary Genetic Algorithm, Binary Particle Swarm Optimization, and the Binary Bat Algorithm, over fifteen benchmark functions. The numerical experiments show that BMVMO achieves better performance.
Received: 24 April 2016 © 2016 TAF Publishing. All rights reserved.
Accepted: 23 14 May 2016
Published: 24 June 2016
Content from this work is copyrighted by TAF Publishing, which permits restricted commercial use, distribution and reproduction in any medium under a written permission. Users may
print articles for educational and research uses only, provided the original author and source are credited. Any further utilization of this work must maintain attribution to the author(s),
the title of the work and journal citation in the form of a proper scientific referencing.
43 A. H. Al-Saeedi, O. Altun - Binary mean-variance mapping optimization … 2016
We evaluated the performance of BMVMO and compared it with the well-known meta-heuristic algorithms Binary GA (BGA), BBA, and BPSO over fifteen functions of CEC 2015; the results show that BMVMO achieves better performance.

I. MEAN-VARIANCE MAPPING OPTIMIZATION ALGORITHM

MVMO is the newest class of population-based stochastic optimization techniques [13]. What MVMO shares with other stochastic optimization techniques are the basic evolutionary operations: selection, crossover, and mutation. The features that distinguish MVMO are that the search space and all of its internal optimization operations are bounded between [0, 1], and that it uses a unique mutation based on a special mapping function [14]. The mapping function depends on the mean and variance of the n best solutions, calculated as follows:

x̄i = (1/n) · Σj xi(j)   (1)

vi = (1/n) · Σj (xi(j) − x̄i)²   (2)

where j = 1, 2, 3, ..., n and n is the population size. The new value of each variable is obtained from the transformation

xi = hx + (1 − h1 + h0) · x′i − h0   (3)

The h-function is defined as follows:

h(x̄i, s1, s2, x) = x̄i · (1 − e^(−x·s1)) + (1 − x̄i) · e^(−(1−x)·s2)   (4)

hx = h(x = x′i),  h0 = h(x = 0),  h1 = h(x = 1)

where x′i is a random value in [0, 1], x̄i is the mathematical mean, vi is the variance, and s1, s2 are shape variables. The shape variables depend on the value of si, which is calculated as

si = −ln(vi) · fs   (5)

where fs is a factor that controls the shape. Figure 1 explains the basic steps of the MVMO algorithm.

A. Binary MVMO algorithm

In binary search, the particles shift inside the search space to different positions by flipping different numbers of bits, which can be represented as the particles rolling inside a hypercube during rotation (Kennedy and Eberhart 1997). In the original version of MVMO, the range of the search space is bounded between [0, 1]. Crossover generates the next generations using a multi-parent strategy as follows:

X = xk + β(xa − xb)   (6)

where X is the offspring, xk, xa, xb are randomly selected parents, and β is a real value calculated according to Eq. (7).

Therefore, MVMO cannot be directly applied to binary or discrete search problems. To solve this problem, a transfer function is used to harmonize MVMO with binary search and to satisfy the essential principle of binary search, namely that each search value is either 0 or 1. Before using the transfer function, some issues need to be taken into consideration [12]:

1- The transfer function works in the range [0, 1].
2- A high absolute value of the transfer function gives a high probability of changing the particle value, and vice versa.

The value of the mapping function is restricted to [0, 1] and can therefore be employed as input to the transfer function T that mutates the particle, as given in Eq. (8). The new population is then created by applying the H-function as follows:

xik(t+1) = Complement(xik(t)) if Rand < T(hik(t)), otherwise xik(t+1) = xik(t)   (9)

where T(·) is the transfer function, Complement(·) flips the bit ("0 → 1, 1 → 0"), xik(t) is the k-th dimension of the i-th child in iteration t, hik(t) is the value returned from the mapping function, and Rand is a continuous value limited between [0, 1]. Figure 2 explains the proposed transfer function.

The extension in BMVMO for improved performance updates the values of the shape factors s1 and s2, the control shape factor fs, and the variable increment Δd.

A. Update control shape factor fs

fs = fs* · (1 + Rand)   (10)

fs* = fs_ini + (i / ifinal)² · (fs_final − fs_ini)   (11)

where the values of fs_ini and fs_final are greater than zero.

B. Update shape factors s1, s2

To update the shape factors s1 and s2, we give an initial value to di and apply the update to si. We then check whether si is bigger than 0; if so, we check di: if di is bigger than si, then di = di · Δd, otherwise di = di / Δd. We then choose a random number: if it is bigger than 0.5, then s1 = si and s2 = di, and if it is smaller than 0.5, then vice versa. If si is smaller than or equal to 0, then s1 and s2 are both set equal to si.

C. Variable increment Δd

Δd = (1 + Δd0) + 2 · Δd0 · (Rand − 0.5)   (12)
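The mapping-function mutation of Eqs. (1)-(5) can be sketched in Python as follows. This is a minimal illustration only: the choice s1 = s2 = si and the uniform random input u are simplifying assumptions, not the paper's exact settings.

```python
import math
import random

def h(x_bar, s1, s2, u):
    """MVMO h-function (Eq. 4): maps u in [0, 1] toward the mean x_bar.
    Shape factors s1, s2 control the slope on each side of the mean."""
    return x_bar * (1.0 - math.exp(-u * s1)) + (1.0 - x_bar) * math.exp(-(1.0 - u) * s2)

def mapped_mutation(archive_col, fs, u=None):
    """Mutate one dimension using the n-best archive values (Eqs. 1-5).
    archive_col holds this dimension's values over the n best solutions."""
    n = len(archive_col)
    x_bar = sum(archive_col) / n                         # Eq. (1): mean
    v = sum((x - x_bar) ** 2 for x in archive_col) / n   # Eq. (2): variance
    s = -math.log(v) * fs if v > 0 else 0.0              # Eq. (5): shape value
    s1 = s2 = s                                          # simplifying assumption
    if u is None:
        u = random.random()                              # random input in [0, 1]
    # Eq. (3): transformation built from three h-evaluations
    hu = h(x_bar, s1, s2, u)
    h0 = h(x_bar, s1, s2, 0.0)
    h1 = h(x_bar, s1, s2, 1.0)
    return hu + (1.0 - h1 + h0) * u - h0                 # result stays in [0, 1]
```

Because h(0) and h(1) anchor the transformation at 0 and 1, the output always stays inside [0, 1], which is exactly what lets it feed the transfer function in the binary variant.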
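The binary machinery around Eqs. (6), (9), and (12) and the Section B update can be sketched as below. Since the exact forms of Eqs. (7) and (8) are not restated here, β is passed in as a parameter and a standard S-shaped sigmoid in the sense of [12] is substituted for the transfer function; both are assumptions, not the paper's definitive choices.

```python
import math
import random

def transfer(x):
    # Assumed S-shaped (sigmoid) transfer function in the sense of [12];
    # the paper's exact Eq. (8) may differ.
    return 1.0 / (1.0 + math.exp(-x))

def binary_update(bit, mapped_value):
    """Eq. (9)-style rule: flip the bit with probability T(mapped_value),
    otherwise keep it unchanged."""
    if random.random() < transfer(mapped_value):
        return 1 - bit  # Complement: 0 -> 1, 1 -> 0
    return bit

def crossover(xk, xa, xb, beta):
    """Eq. (6): multi-parent crossover X = xk + beta * (xa - xb);
    beta is assumed given, since Eq. (7) is not reproduced here."""
    return [k + beta * (a - b) for k, a, b in zip(xk, xa, xb)]

def update_delta_d(delta_d0):
    """Eq. (12): randomized variable increment."""
    return (1.0 + delta_d0) + 2.0 * delta_d0 * (random.random() - 0.5)

def update_shape_factors(s_i, d_i, delta_d):
    """Section B procedure: derive (s1, s2) from s_i and the running
    factor d_i; returns (s1, s2, updated d_i)."""
    if s_i > 0:
        d_i = d_i * delta_d if d_i > s_i else d_i / delta_d
        if random.random() > 0.5:
            return s_i, d_i, d_i   # s1 = s_i, s2 = d_i
        return d_i, s_i, d_i       # vice versa
    return s_i, s_i, d_i           # s_i <= 0: both fall back to s_i
```

Note that for Δd0 > 0 the increment of Eq. (12) stays strictly above 1, so di is always scaled by a factor greater than one in either direction.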
ISSN: 2414-3103 | DOI: 10.20474/japs-2.2.3 | TAF Publishing
2016 J. appl. phys. sci. 44
TABLE 1
COMPARISON OF BPSO, BGA, BBA AND BMVMO OVER 15 TEST FUNCTIONS OF 30 DIMENSIONS AND 1500 ITERATIONS

fun | BMVMO (Mean, Std. Dev) | BBA (Mean, Std. Dev) | BPSO (Mean, Std. Dev) | BGA (Mean, Std. Dev)

The mean and standard deviation of the results found over the 30 independent runs of each algorithm.
REFERENCES
[1] C. H. Papadimitriou and K. Steiglitz, Combinatorial Optimization: Algorithms and Complexity. New York, NY: Courier Dover Publications, 1998.
[3] J. Kennedy and R. C. Eberhart, "A discrete binary version of the particle swarm algorithm," in IEEE International Conference on Computational Cybernetics and Simulation, 1997, pp. 4104-4108. DOI: 10.1109/icsmc.1997.637339
[4] J. Holland, Adaptation in Natural and Artificial Systems. Ann Arbor, MI: The University of Michigan Press, 1975.
[5] M. Dorigo, V. Maniezzo and A. Colorni, "Ant system: Optimization by a colony of cooperating agents," IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 26, no. 1, pp. 29-41, 1996. DOI: 10.1109/3477.484436
[6] E. Rashedi, H. Nezamabadi-Pour and S. Saryazdi, "BGSA: Binary gravitational search algorithm," Natural Computing, vol. 9, no. 3, pp. 727-745, 2010. DOI: 10.1007/s11047-009-9175-3
[7] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in J. R. Gonzalez, D. A. Pelta, C. Cruz, G. Terrazas and N. Krasnogor, eds., Nature Inspired Cooperative Strategies for Optimization (NICSO 2010). Berlin, Germany: Springer, 2010, pp. 65-74. DOI: 10.1007/978-3-642-12538-6_6
[8] A. Kaveh and N. Farhoudi, "A new optimization method: Dolphin echolocation," Advances in Engineering Software, vol. 59, pp. 53-70, 2013. DOI: 10.1016/j.advengsoft.2013.03.004
[9] I. Erlich, J. L. Rueda, S. Wildenhues and F. Shewarega, "Evaluating the mean-variance mapping optimization on the IEEE-CEC 2014 test suite," in IEEE Congress on Evolutionary Computation (CEC), 2014, pp. 1625-1632. DOI: 10.1109/cec.2014.6900516
[10] L. Wang, X. Fu, M. I. Menhas and M. Fei, "A modified binary differential evolution algorithm," in K. Li, M. Fei and L. Jia, eds., Life System Modeling and Intelligent Computing, Lecture Notes in Computer Science. Berlin, Germany: Springer, 2010, pp. 49-57. DOI: 10.1007/978-3-642-15597-0_6
[11] S. Mirjalili, S. M. Mirjalili and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014. DOI: 10.1016/j.advengsoft.2013.12.007
[12] S. Mirjalili and A. Lewis, "S-shaped versus V-shaped transfer functions for binary particle swarm optimization," Swarm and Evolutionary Computation, vol. 9, pp. 1-14, 2013. DOI: 10.1016/j.swevo.2012.09.002
[13] P. Vasant and B. Singh, "Solving economic dispatch by using swarm based mean-variance mapping optimization (MVMOS)," Global Journal of Technology and Optimization, vol. 6, no. 3, pp. 1-8.
[14] I. Erlich, G. K. Venayagamoorthy and W. Nakawiro, "A mean-variance optimization algorithm," in IEEE Congress on Evolutionary Computation, 2010, pp. 1-6. DOI: 10.1109/CEC.2010.5586027