
TAF Journal of Applied and Physical Sciences

2016, 2(2): 42-47

PRIMARY RESEARCH

Binary mean-variance mapping optimization algorithm (BMVMO)

Ali Hakem Al-Saeedi 1, Oğuz Altun 2, *

1, 2 Computer Engineering Department, Yildiz Technical University, Istanbul, Turkey
Abstract— Mean-Variance Mapping Optimization (MVMO) is one of the newest classes of modern meta-heuristic algorithms. The original version of the algorithm is designed for continuous search problems, so it cannot be applied directly to discrete search problems. In this paper, a binary version of the MVMO algorithm (BMVMO) is proposed. The proposed Binary Mean-Variance Mapping Optimization algorithm is compared with well-known binary meta-heuristic optimization algorithms, namely the Binary Genetic Algorithm, Binary Particle Swarm Optimization, and the Binary Bat Algorithm, over fifteen benchmark functions. The numerical experiments show that BMVMO achieves better performance.

Index Terms: Mean-Variance Mapping Optimization; Binary Meta-Heuristic Optimization; Discrete evolutionary algorithms

© 2016 TAF Publishing. All rights reserved.

Received: 24 April 2016
Accepted: 23 14 May 2016
Published: 24 June 2016

I. INTRODUCTION statistical characteristics function for mutation operation


named mapping function [9] this function mathematically
In computer science, the Meta-heuristic optimization is depend on mean and variance of n- best solutions. And the
the set of operations and technique models, use search range of MVMO algorithm is a continuous value
randomness to optimization the candidates and find the between [0, 1].
best solution [1]. Many Meta-heuristic optimization The original version of many meta-heuristic algorithms
algorithms inspired by nature [2] some of them are, deals with continuous problems. There are different
Particle Swarm Optimization (PSO) [3], Genetic Algorithm methods to harmonize these algorithms with discrete
(GA) [4] Grey Wolf Optimizer Ant Colony Optimization problems. [10] proposed a probability estimation operator
(ACO) [5], Gravitational Search Algorithm (GSA) [6], Bat in order to solve discrete problems by DE. But the binary
search Algorithm (BA) [7] and Dolphin Echolocation [8]. version of BDE different from originated algorithm. The
The flexibility of deal with different problems and the high Binary Bat Algorithm (BBA) [11], Binary GSA , Binary PSO
performance of these algorithms make them more popular (BPSO) [12] use the transfer function for solving binary
than tradition optimization technique. problems with conserving the Original versions of these
The mean-Variance Mapping Optimization (MVMO) One algorithms. For that, use the transfer function with the
of the algorithms of modern meta-heuristic high-efficiency, MVMO for binary search to order to preserve standards
flexible to deal with different kinds of problems. The concepts of MVMO in the search process
unique features of MVMO algorithm use the special In this paper a proposal the Binary version of MVMO
algorithm named BMVMO by employing the concept of the
* Corresponding author: Oğuz Altun transfer function for adapt to binary search problems.
E-mail: [email protected]


We evaluate the performance of BMVMO and compare it with the well-known binary meta-heuristic algorithms Binary GA (BGA), BBA, and BPSO using fifteen functions of CEC 2015; the results show that BMVMO achieves better performance.

II. MEAN-VARIANCE MAPPING OPTIMIZATION ALGORITHM

MVMO is the newest class of population-based stochastic optimization techniques [13]. MVMO shares the basic evolutionary operations of other stochastic optimization techniques, such as selection, crossover, and mutation. The features that distinguish MVMO are that the search space and all internal optimization operations are bounded between [0, 1], and that it uses a unique mutation based on a special mapping function [14]. The mapping function depends on the mean and variance of the n-best solutions, calculated as follows:

$$\bar{x}_i = \frac{1}{n}\sum_{j=1}^{n} x_i(j) \qquad (1)$$

$$v_i = \frac{1}{n}\sum_{j=1}^{n} \left(x_i(j) - \bar{x}_i\right)^2 \qquad (2)$$

$$h\!\left(\bar{x}_i, s_1, s_2, x\right) = \bar{x}_i \left(1 - e^{-x s_1}\right) + \left(1 - \bar{x}_i\right) e^{-(1 - x) s_2} \qquad (3)$$

The H-function is defined as follows:

$$x_i^{new} = h_x + \left(1 - h_1 + h_0\right) x_i^{*} - h_0, \qquad h_x = h\!\left(x = x_i^{*}\right),\; h_0 = h(x = 0),\; h_1 = h(x = 1) \qquad (4)$$

where j = 1, 2, 3, ..., n, with n the population size; $x_i^{*}$ is the offspring value, $\bar{x}_i$ the mathematical mean, $v_i$ the variance, and $s_1$, $s_2$ the shape variables. The shape variables depend on the value of $s_i$, which is calculated as:

$$s_i = -\ln(v_i) \cdot f_s \qquad (5)$$

where $f_s$ is a factor that controls the shape. Figure 1 explains the basic steps of the MVMO algorithm.

Fig. 1. Basic steps of the MVMO algorithm
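As an illustration of Eqs. (1)-(5), the following Python sketch computes the per-dimension statistics of the n-best archive and applies the mapping (h-) function. The function and variable names are ours, and this is a minimal sketch rather than the authors' implementation.

```python
import numpy as np

def archive_statistics(archive_column, f_s):
    """Mean, variance and shape factor of one dimension of the n-best archive
    (Eqs. 1, 2 and 5). `archive_column` holds n values in [0, 1]."""
    x_mean = archive_column.mean()                      # Eq. (1)
    v = archive_column.var()                            # Eq. (2)
    s = -np.log(v) * f_s if v > 0 else f_s              # Eq. (5); guard against v = 0
    return x_mean, v, s

def mapping_function(x_star, x_mean, s1, s2):
    """MVMO mapping (Eqs. 3-4): transforms a random value x_star in [0, 1]
    toward the archive mean, with asymmetry controlled by s1 and s2."""
    def h(x):                                           # Eq. (3)
        return x_mean * (1 - np.exp(-x * s1)) + (1 - x_mean) * np.exp(-(1 - x) * s2)
    return h(x_star) + (1 - h(1.0) + h(0.0)) * x_star - h(0.0)   # Eq. (4)

# example: map a random value using the statistics of a small archive column
col = np.array([0.2, 0.25, 0.3, 0.22, 0.28])
m, v, s = archive_statistics(col, f_s=1.0)
print(mapping_function(0.9, m, s1=s, s2=s))
```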
A. Binary MVMO Algorithm

In a binary search style, the particles shift inside the search space to different positions by flipping a number of bits, which can be pictured as objects rolling inside a hypercube during rotation (Kennedy and Eberhart, 1997). In the original version of MVMO, the range of the search space is bounded between [0, 1], and the crossover that generates the next generation uses a multi-parent strategy as follows:

$$x = x_k + \beta \left(x_a - x_b\right) \qquad (6)$$

where $x$ is the offspring, $x_k$, $x_a$, $x_b$ are parents selected randomly, and $\beta$ is a real value calculated as in Eq. (7).

Therefore, MVMO cannot be applied directly to binary or discrete problems. To solve this problem, a transfer function is used to harmonize MVMO with binary search and to satisfy the essential principle of binary search, namely that a search value is either 0 or 1. Before using the transfer function, some issues need to be taken into consideration [12]:

1- The transfer function works in the range [0, 1].
2- A high absolute value of the transfer function gives a high probability of changing the particle value, and vice versa.

The value of the mapping function is restricted to [0, 1]; therefore it can be employed as the input to the transfer function used to mutate a particle, as in Eq. (8). The new population is then created by applying the transfer function to the output of the H-function as follows:

$$x_i^k(t+1) =
\begin{cases}
\mathrm{complement}\!\left(x_i^k(t)\right) & \text{if } \mathrm{rand} < T\!\left(h_i^k(t)\right) \\
x_i^k(t) & \text{otherwise}
\end{cases} \qquad (9)$$

where $T(\cdot)$ is the transfer function, $\mathrm{complement}(x)$ flips a bit ("0 → 1, 1 → 0"), $x_i^k(t)$ is the offspring of the i-th child in iteration t with k dimensions, $h_i^k(t)$ is the value returned by the mapping function, and rand is a continuous random value limited to [0, 1]. Figure 2 illustrates the proposed transfer function.

Fig. 2. The proposed transfer function
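The bit-flip mutation of Eq. (9) can be sketched as below. Since the analytic form of the proposed transfer function (Eq. (8), Fig. 2) is not spelled out in the text, the S-shaped sigmoid from [12] is used here only as a stand-in; the function names are ours.

```python
import numpy as np

def transfer(h_value):
    """Stand-in transfer function T(.) turning the mapping-function output
    into a flip probability; the paper's own T(.) (Fig. 2) may differ."""
    return 1.0 / (1.0 + np.exp(-h_value))

def binary_mutate(bits, h_values, rng=np.random.default_rng()):
    """Eq. (9): complement a bit whenever a uniform random number is smaller
    than the transfer-function value of the corresponding mapping output."""
    bits = np.asarray(bits).copy()
    flip = rng.random(bits.shape) < transfer(np.asarray(h_values))
    bits[flip] = 1 - bits[flip]          # complement: 0 -> 1, 1 -> 0
    return bits

# example: mutate a 6-bit particle given per-bit mapping-function outputs
print(binary_mutate([0, 1, 1, 0, 0, 1], [0.1, 0.9, 0.4, 0.7, 0.2, 0.5]))
```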
To improve performance, the BMVMO extension updates the values of the shape factors $s_1$ and $s_2$, the control shape factor $f_s$, and the variable increment $\Delta d$.

B. Update of the control shape factor fs

The control shape factor $f_s$ is updated every iteration between its initial value $f_{s\text{-}ini}$ and its final value $f_{s\text{-}fin}$ according to Eqs. (10) and (11), where the values of $f_{s\text{-}ini}$ and $f_{s\text{-}fin}$ are greater than zero.

C. Update of the shape factors s1 and s2

To update the shape factors $s_1$ and $s_2$, we give an initial value to $d_i$ and base the update on the value of $s_i$. We first check whether $s_i$ is bigger than 0; if so, we check $d_i$: if it is bigger than $s_i$, then $d_i = d_i \cdot \Delta d$, otherwise $d_i = d_i / \Delta d$. We then draw a random number: if it is bigger than 0.5, then $s_1 = s_i$ and $s_2 = d_i$; if it is smaller than 0.5, then vice versa. If $s_i$ is smaller than or equal to 0, then both $s_1$ and $s_2$ are set equal to $s_i$.

D. Variable increment Δd

$$\Delta d = \left(1 + \Delta d_0\right) + 2\,\Delta d_0 \left(\mathrm{rand} - 0.5\right) \qquad (12)$$


The variable $\Delta d_0$ is itself updated every iteration between its initial value $\Delta d_{0\text{-}ini}$ and its final value $\Delta d_{0\text{-}fin}$ according to Eq. (13), where both values are greater than zero.
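A sketch of the parameter updates of subsections B-D follows. The exact schedules of Eqs. (10), (11) and (13) are not legible in the source, so a simple linear ramp between the initial and final values is assumed here; Eq. (12) and the $s_1$/$s_2$ selection logic follow the text, and all names are illustrative.

```python
import numpy as np

def ramp(v_ini, v_fin, t, t_max):
    """Assumed schedule for f_s (Eqs. 10-11) and delta_d0 (Eq. 13): a simple
    linear ramp from the initial to the final value (exact form not given)."""
    return v_ini + (t / t_max) * (v_fin - v_ini)

def update_shape_factors(s_i, d_i, delta_d0, rng=np.random.default_rng()):
    """Subsections C and D: update d_i and Δd, then choose s1 and s2."""
    delta_d = (1.0 + delta_d0) + 2.0 * delta_d0 * (rng.random() - 0.5)   # Eq. (12)
    if s_i > 0:
        d_i = d_i * delta_d if d_i > s_i else d_i / delta_d   # as described in subsection C
        if rng.random() > 0.5:
            s1, s2 = s_i, d_i
        else:
            s1, s2 = d_i, s_i
    else:
        s1 = s2 = s_i                                         # s_i <= 0 case
    return s1, s2, d_i

# example: parameter state midway through a 1500-iteration run
f_s = ramp(1.0, 20.0, t=750, t_max=1500)
print(f_s, update_shape_factors(s_i=2.3, d_i=1.0, delta_d0=ramp(0.02, 0.05, 750, 1500)))
```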
The steps of the proposed BMVMO algorithm are:

    xi ← random population with k dimensions, i = 1, ..., population size
    set values of di, fs-ini, fs-fin, Δd0-ini, Δd0-fin
    while t < max_iteration
        evaluate the population
        save the n-best solutions
        mean     = mean(n-best solutions)          Eq. (1)
        variance = variance(n-best solutions)      Eq. (2)
        classify the population into good and bad
        if xi ∈ bad
            xi = uniform crossover (parents selected randomly)
        end if
        update the values of di, Δd (Eq. (12)) and fs (Eq. (10))
        xi = mapping function(xj), Eqs. (8)-(9), where xj ⊂ xi
    end while
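The pseudocode above can be turned into a compact, self-contained Python sketch as below. It only illustrates the flow (random 0/1 population, n-best archive, uniform crossover for the bad half, mapping plus transfer-function mutation); the per-generation archive handling, the half/half good-bad split, the sigmoid stand-in for the transfer function, the single shape factor used for both s1 and s2, and the omitted di/Δd bookkeeping are our simplifying assumptions, not the authors' implementation.

```python
import numpy as np

def bmvmo(fitness, dim=30, pop_size=100, n_best=20, max_iter=1500, f_s=1.0, seed=0):
    """Illustrative BMVMO loop; `fitness` maps a 0/1 vector to a value to minimize."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, dim))              # random binary population

    for t in range(max_iter):
        f_vals = np.array([fitness(ind) for ind in pop])        # evaluate population
        order = np.argsort(f_vals)
        archive = pop[order[:n_best]].astype(float)             # save n-best solutions
        x_mean = archive.mean(axis=0)                           # Eq. (1)
        v = np.clip(archive.var(axis=0), 1e-12, None)           # Eq. (2)
        s = -np.log(v) * f_s                                    # Eq. (5)

        good, bad = order[: pop_size // 2], order[pop_size // 2:]
        for i in bad:                                           # uniform crossover for bad half
            p1, p2 = pop[rng.choice(good, 2, replace=False)]
            pop[i] = np.where(rng.random(dim) < 0.5, p1, p2)

        for i in range(pop_size):                               # mapping + transfer mutation
            k = int(rng.integers(dim))                          # one randomly chosen dimension
            x_star = rng.random()
            h = lambda x: (x_mean[k] * (1 - np.exp(-x * s[k]))
                           + (1 - x_mean[k]) * np.exp(-(1 - x) * s[k]))   # Eq. (3)
            h_val = h(x_star) + (1 - h(1.0) + h(0.0)) * x_star - h(0.0)   # Eq. (4)
            if rng.random() < 1.0 / (1.0 + np.exp(-h_val)):     # stand-in T(.) of Eq. (8)
                pop[i, k] = 1 - pop[i, k]                       # complement, Eq. (9)

    best = pop[int(np.argmin([fitness(ind) for ind in pop]))]
    return best, fitness(best)

# toy usage: minimize the number of ones in a 16-bit string
print(bmvmo(lambda x: int(x.sum()), dim=16, pop_size=30, n_best=5, max_iter=200))
```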

III. TEST FUNCTIONS

To test the performance of the algorithms (BMVMO, BBA, BGA, and BPSO), we use the 15 single-objective optimization functions of the IEEE CEC 2015 benchmark (Qu, B. Y., 2014). They are divided into groups: (f1, f2) are unimodal functions, (f3, f4, f5) are simple multimodal functions, (f6, f7, f8) are hybrid functions, and the remaining functions are composite functions.
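For reference, the grouping described above can be written out explicitly; the structure below simply restates the text.

```python
# CEC 2015 benchmark functions used in the comparison, grouped as in the text.
FUNCTION_GROUPS = {
    "unimodal": ["f1", "f2"],
    "simple_multimodal": ["f3", "f4", "f5"],
    "hybrid": ["f6", "f7", "f8"],
    "composite": ["f9", "f10", "f11", "f12", "f13", "f14", "f15"],
}
```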

IV. NUMERICAL AND EXPERIMENTAL RESULTS

The algorithms used in the comparative study with BMVMO are the Binary Genetic Algorithm (BGA), Binary Particle Swarm Optimization [12], and the Binary Bat Algorithm [11], because these algorithms are popular in the binary meta-heuristic field and have succeeded in solving many binary optimization problems. Moreover, BBA and BPSO deploy the transfer function in an excellent style without changing the original form of the algorithms. In the comparison we prefer to use the standard versions of these algorithms.

The primary parameters are set as follows. For BGA, the crossover percentage and mutation rate are 0.3, Roulette Wheel selection is used for parent selection, and a uniform crossover is used. For BBA, the loudness rate is 0.25, the pulse rate 0.5, the maximum frequency 2, and the minimum frequency 0. For BPSO, the inertia weight is 1, the maximum inertia weight 1, the minimum inertia weight 0.05, c1 = c2 = 0.49, the maximum velocity 4, and the minimum velocity -4. For BMVMO, the size of the solution archive is 20, di is 1, Δd0-ini is 0.02, Δd0-fin is 0.05, fs-ini is 1, and fs-fin is 20. All algorithms use 30 dimensions and a population size of 100 with 1500 iterations; every function is repeated for 30 independent runs and the average of these runs is used in the comparison. The stop criterion is the maximum number of iterations.
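The parameter settings above can be summarized as plain dictionaries, which is convenient when re-running the comparison; the key names are ours, and the values are those stated in the text (the text gives the single value 0.3 for both the BGA crossover percentage and mutation rate).

```python
# Experimental settings of Section IV, collected for reference.
BGA_PARAMS = {"crossover_percentage": 0.3, "mutation_rate": 0.3,
              "parent_selection": "roulette_wheel", "crossover": "uniform"}
BBA_PARAMS = {"loudness": 0.25, "pulse_rate": 0.5, "f_max": 2.0, "f_min": 0.0}
BPSO_PARAMS = {"w_max": 1.0, "w_min": 0.05, "c1": 0.49, "c2": 0.49,
               "v_max": 4.0, "v_min": -4.0}
BMVMO_PARAMS = {"archive_size": 20, "d_i": 1.0, "delta_d0_ini": 0.02,
                "delta_d0_fin": 0.05, "fs_ini": 1.0, "fs_fin": 20.0}
COMMON = {"dimensions": 30, "population_size": 100, "iterations": 1500, "runs": 30}
```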

TABLE 1


COMPARISON OF BPSO, BGA, BBA AND BMVMO OVER 15 TEST FUNCTIONS OF 30 DIMENSIONS AND 1500 ITERATIONS

Fun    BMVMO Mean    BMVMO Std. Dev    BBA Mean    BBA Std. Dev    BPSO Mean    BPSO Std. Dev    BGA Mean    BGA Std. Dev

f1 7.72E+10 1.32E+08 7.74E+10 1.61E+08 7.73E+10 2.36E+08 7.74E+10 2.29E+08


f2 2.49E+08 6393937 2.59E+08 9210521 2.55E+08 11464837 2.62E+08 12287816
f3 351.5272 0.23812 351.8331 0.217999 351.7235 0.291511 351.9016 0.319559
f4 11066.79 76.5067 11175.01 93.35784 11133.27 127.1745 11215.55 124.9636
f5 507.9713 0.853272 508.3915 0.995869 507.9457 1.298267 508.6814 1.231251
f6 606.6553 0.007412 606.6665 0.008235 606.6607 0.011355 606.6693 0.011671
f7 844.361 0.269909 844.7394 0.347229 844.5716 0.47364 844.8921 0.465846
f8 49061590 490642.7 49657187 570070.2 49435109 759183.1 49956352 754423.1
f9 914.3755 0.055842 914.441 0.055959 914.4059 0.072981 914.4553 0.060275
f10 1.01E+09 8658304 1.02E+09 10308575 1.01E+09 13055275 1.02E+09 13360288
f11 2176.334 9.445389 2187.157 10.54527 2183 14.28329 2192.542 14.30543
f12 1619036 36316.39 1655830 39589.26 1641424 54531.86 1672572 58681.06
f13 4671.701 7.429789 4682.868 10.32592 4677.858 12.40378 4687.291 11.79262
f14 2325.193 4.067205 2331.665 4.743113 2328.729 6.928223 2334.058 7.102059
f15 6693.273 18.14667 6712.949 21.46824 6704.959 26.36587 6723.303 26.11719

The mean and standard deviation of the results are computed over 30 independent runs of each algorithm.


Fig. 3. Comparison between BMVMO, BBA, BPSO, and BGA

Table 1 shows the statistical results (mean and standard deviation) of the comparative algorithms, and Figure 3 illustrates the behavior of BMVMO, BGA, BBA, and BPSO on the 15 evaluation functions. In summary, the results show that BMVMO performs well on most benchmark functions among the binary meta-heuristic optimization algorithms (BGA, BBA, and BPSO). The performance of the algorithms is very close at f6, while at f5 BPSO performs better than BMVMO. According to the statistical study in Table 1, we can say that BMVMO has proven its worth among the binary meta-heuristic optimization algorithms.

V. CONCLUSION AND FUTURE WORK

MVMO is the newest class of meta-heuristic algorithms; it searches within the continuous range [0, 1] and therefore cannot be applied directly to binary problems. A transfer function is used to adapt MVMO to binary search without changing the original form of the algorithm. The performance of BMVMO was compared with BGA, BBA, and BPSO using 15 benchmark functions of CEC 2015, and the statistical study proved the worthiness of BMVMO among binary meta-heuristic optimization algorithms. For future work, we will study the effect of changing the dimensionality of the problem and of using different types of transfer functions on the performance of the BMVMO algorithm, and will apply BMVMO to different applications such as feature selection.

REFERENCES


[1] C. H. Papadimitriou and K. Steiglitz, Combinatorial Optimization: Algorithms and Complexity. New York, NY: Courier Dover Publications, 1998.
[3] J. Kennedy and R. C. Eberhart, "A discrete binary version of the particle swarm algorithm," in IEEE International Conference on Computational Cybernetics and Simulation, 1997, pp. 4104-4108. DOI: 10.1109/icsmc.1997.637339
[4] J. Holland, Adaptation in Natural and Artificial Systems. Ann Arbor, MI: The University of Michigan Press, 1975.
[5] M. Dorigo, V. Maniezzo and A. Colorni, "Ant system: Optimization by a colony of cooperating agents," IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 26, no. 1, pp. 29-41, 1996. DOI: 10.1109/3477.484436
[6] E. Rashedi, H. Nezamabadi-Pour and S. Saryazdi, "BGSA: Binary gravitational search algorithm," Natural Computing, vol. 9, no. 3, pp. 727-745, 2010. DOI: 10.1007/s11047-009-9175-3
[7] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in J. R. Gonzalez, D. A. Pelta, C. Cruz, G. Terrazas and N. Krasnogor, eds., Nature Inspired Cooperative Strategies for Optimization (NICSO 2010). Berlin, Germany: Springer, 2010, pp. 65-74. DOI: 10.1007/978-3-642-12538-6_6
[8] A. Kaveh and N. Farhoudi, "A new optimization method: Dolphin echolocation," Advances in Engineering Software, vol. 59, pp. 53-70, 2013. DOI: 10.1016/j.advengsoft.2013.03.004
[9] I. Erlich, J. L. Rueda, S. Wildenhues and F. Shewarega, "Evaluating the mean-variance mapping optimization on the IEEE-CEC 2014 test suite," in IEEE Congress on Evolutionary Computation (CEC), 2014, pp. 1625-1632. DOI: 10.1109/cec.2014.6900516
[10] L. Wang, X. Fu, M. I. Menhas and M. Fei, "A modified binary differential evolution algorithm," in K. Li, M. Fei and L. Jia, eds., Life System Modeling and Intelligent Computing, Lecture Notes in Computer Science. Berlin, Germany: Springer, 2010, pp. 49-57. DOI: 10.1007/978-3-642-15597-0_6
[11] S. Mirjalili, S. M. Mirjalili and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014. DOI: 10.1016/j.advengsoft.2013.12.007
[12] S. Mirjalili and A. Lewis, "S-shaped versus V-shaped transfer functions for binary particle swarm optimization," Swarm and Evolutionary Computation, vol. 9, pp. 1-14, 2013. DOI: 10.1016/j.swevo.2012.09.002
[13] P. Vasant and B. Singh, "Solving economic dispatch by using swarm based mean-variance mapping optimization (MVMOS)," Global Journal of Technology and Optimization, vol. 6, no. 3, pp. 1-8.
[14] I. Erlich, G. K. Venayagamoorthy and W. Nakawiro, "A mean-variance optimization algorithm," in IEEE Congress on Evolutionary Computation, 2010, pp. 1-6. DOI: 10.1109/CEC.2010.5586027

— This article does not have any appendix. —

