A Hybrid Genetic Algorithm Based On Information Entropy and Game Theory
March 2, 2020.
Digital Object Identifier 10.1109/ACCESS.2020.2971060
ABSTRACT To overcome the disadvantage of traditional genetic algorithms, which easily fall into local
optima, this paper proposes a hybrid genetic algorithm based on information entropy and game theory. First,
the species diversity of the initial population is calculated according to the information entropy, and parallel
genetic algorithms are combined, with the standard genetic algorithm (SGA), the partheno-genetic algorithm
(PGA) and a hybrid genetic algorithm based on both the SGA and PGA used for the evolutionary operations.
Furthermore, with parallel nodes, complete-information game operations are implemented to achieve an
optimum for the entire population based on the values of both the information entropy and the fitness of
each subgroup population. Additionally, the Rosenbrock, Rastrigin and Schaffer functions are introduced
to analyse the performance of the different algorithms. The results show that, compared with traditional
genetic algorithms, the proposed algorithm performs better, with higher optimization ability, solution
accuracy, and stability and a superior convergence rate.
INDEX TERMS Genetic algorithm, partheno-genetic algorithm, information entropy, game theory, parallel
genetic.
This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see http://creativecommons.org/licenses/by/4.0/
36602 VOLUME 8, 2020
L. Jiacheng, L. Lei: Hybrid Genetic Algorithm Based on Information Entropy and Game Theory
Tu Chengyuan et al. proposed overcoming the problem of "premature convergence" by constructing new genetic operators, such as restoration, reconstruction and self-intersection operators, in the so-called partheno-genetic algorithm.

Generally, the above algorithms are relatively classic genetic algorithms; they have efficient search capabilities, but there are still inherent drawbacks to these algorithms. To find a better solution to the NP problem, many scholars have studied hybrid genetic algorithms and improved genetic algorithms.

Li Jia proposed a new genetic algorithm, namely, a hybrid genetic algorithm that integrated the tabu search algorithm into a genetic algorithm.

Zhang Tao et al. also proposed a new hybrid genetic algorithm in which a genetic algorithm and the 3-OPT algorithm are combined. This approach utilizes the efficient global search ability of the genetic algorithm and the local search ability of the 3-OPT algorithm.

Chen Xiangzhou et al. embedded a reversal operator in the genetic algorithm and improved the convergence speed of the algorithm in the late stage of operation.

Zhang et al. introduced the idea of cloning in genetic algorithms. Dai Xiaoming et al. introduced the concept of parallel evolution in genetic algorithms. Fang Xia et al. applied an immunity algorithm with a genetic algorithm to solve the VRP problem.

The hybrid algorithm effectively balances the diversity and convergence properties of the algorithms and embodies the advantages of each algorithm. However, the diversity of most hybrid algorithms is relatively low, and the information exchange between populations is not considered. With the advent of interdisciplinary studies, scholars have introduced information entropy and game theory into genetic algorithms to further optimize these algorithms.

Xue Feng et al. used information entropy to generate the initial population, which increased the diversity of the initial population and provided the basis for subsequent processing.

Chen Xiaofeng used the concept of information entropy to improve and fuse quantum evolution algorithms and immune genetic algorithms and proposed a quantum immune genetic algorithm based on information entropy.

Wei Qinfang et al. proposed a genetic algorithm for wireless sensor network intrusion detection based on information entropy.

Yang Mei and others borrowed relevant ideas from game theory for strategy optimization and proposed a hybrid optimization mechanism for multi-subgroup and multi-strategy methods; this approach improved the search accuracy and convergence speed, but it requires the selection of subpopulations.

The optimization ability of the genetic algorithm is reflected by the diversity of the population and the convergence speed of the algorithm. The traditional genetic algorithm easily falls into local optima, and this issue is largely related to the diversity of individuals in the population. Therefore, it is necessary to maintain the diversity of the population in each generation. Additionally, the genetic algorithm involves genetic and selection operations; thus, whether the new population is retained and how it is retained will affect the final optimization result and the speed of optimization. Therefore, the concept of information entropy is introduced in this paper. Sexual optimization and game theory optimization are applied for new populations generated by genetic operations. Based on a review of relevant studies, scholars have begun to study the combination of information entropy and genetic algorithms and the combination of game theory and genetic algorithms to optimize algorithms. However, applications involving information entropy, game theory and genetic algorithms for optimization have not been reported in China.

B. INFORMATION ENTROPY
Entropy was introduced in the 1860s by the German physicist Clausius as a concept of thermodynamics. This term was originally used to describe the amount of energy conversion and the direction of transformation. In 1854, Clausius defined the state function entropy of a reversible thermodynamic system. The mathematical expression is as follows.

ds = dQ/dT  (1)

A change in entropy is equal to the ratio of heat absorption to the temperature from state X0 to state X in a reversible element-based process.

Information entropy is an important concept in the field of information theory. The famous paper "The Mathematical Theory of Communication" published by American scientist C. E. Shannon in 1948 laid the theoretical foundation for information theory. In information theory, entropy describes the degree of uncertainty of random variables, and information entropy is often used to reflect the degree of uncertainty in selection or information.

The information entropy can be defined as the average amount of information provided by each representation of source X, or the statistical average of the amount of information contained in various representations sent by information source X in the probability space of information source X. This average value is H(X):

H(X) = −Σ_{x∈X} P(x) ln P(x)  (2)

where X represents all solutions.

The distribution degree of individuals in a population can be calculated based on the information entropy. If the population diversity is high, the corresponding information entropy will be high, and vice versa, so that the population can be prevented from falling into local optima.

C. GAME THEORY
Game theory is related to the direct interaction of decision-making behaviours and the equilibrium problem associated with such decisions. Game theory originated in the early
calculate the entropy value S until the condition of S > S0 is met.
4) Step 4: Repeat the above steps until the number of chromosomes reaches the specified initial population size.

TABLE 1. Element mapping relations for game theory and the hybrid optimization algorithm.
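The entropy-driven initialization in Steps 3 and 4 above can be sketched in Python. This is an illustrative reconstruction, not the authors' code: the binary encoding, the chromosome length, the value of the critical entropy S0, the acceptance rule for resampled chromosomes, and all function names are assumptions.

```python
import math
import random

def population_entropy(population):
    """Shannon entropy of the chromosome distribution, as in Eq. (2)."""
    counts = {}
    for chrom in population:
        counts[chrom] = counts.get(chrom, 0) + 1
    n = len(population)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def init_population(pop_size, chrom_len, s0, seed=42):
    """Entropy-guided initialization: add random binary chromosomes one at a
    time, rejecting any candidate that does not raise the diversity, and
    rebuild the whole population if the final entropy fails S > S0."""
    rng = random.Random(seed)
    population = []
    while True:
        while len(population) < pop_size:
            candidate = tuple(rng.randint(0, 1) for _ in range(chrom_len))
            trial = population + [candidate]
            # Reject duplicates: they lower (or fail to raise) the entropy.
            if not population or population_entropy(trial) > population_entropy(population):
                population.append(candidate)
        if population_entropy(population) > s0:  # condition S > S0 (Step 3)
            return population
        population = []  # entropy too low: regenerate (Step 4)

pop = init_population(pop_size=20, chrom_len=16, s0=2.0)
print(len(pop), round(population_entropy(pop), 3))
```

Because a duplicate chromosome always lowers the entropy of an otherwise all-distinct population, this acceptance rule keeps the initial population free of repeats, which is the effect the entropy threshold is meant to enforce.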
updated according to formula (5):

X_i^{t+1} = X_i^t + δx_i^{t+1}(P*)  (5)

where X_i^t represents the position at node t of subgroup i and δx_i^{t+1}(P*) is the strategy for subgroup updating based on the following three modes.

a: COOPERATION MODE
When the population is optimized to the Kth (K = 1, 2, . . ., n) generation, the entropy values Q1, Q2, and Q3 of the three populations are calculated according to the following formula.

max(Q1, Q2, Q3) < S0  (6)

At this time, cooperative mode begins. Moreover, the three populations are merged into one population, and according to the fitness function value f of the individuals in the population, the individuals with function values less than the median value are eliminated, and the remaining individuals are retained and evenly distributed among the three populations. These populations are used to regenerate the population in the same manner as discussed for the initial population.

The cooperative mode is a multi-group self-coordination mechanism. By synergizing with other populations, the performance of at least one population is improved, and the diversity and convergence of the algorithm are balanced.

b: COMPETITION MODE
When the population is optimized to the Kth (K = 1, 2, . . ., n) generation, the entropy values Q1, Q2, and Q3 of the three populations are calculated according to the following formula.

min(Q1, Q2, Q3) > S0  (7)

At this time, competition mode begins. The maximum fitness value Fmax of each population is calculated, and the largest function value Fmax is obtained. The individual represented by this function value is copied into the other two populations, and the individuals with the smallest function values are excluded.

Competition mode yields excellent populations in the algorithm and improves the convergence.

c: COORDINATION MODE
Coordination mode is divided into two situations.

a. The population is optimized to the Kth (K = 1, 2, . . ., n) generation, and the entropy values Q1, Q2, and Q3 of the three populations are calculated. When the entropy values of two of the populations are greater than the critical entropy S0 and the information entropy of one population is lower than the critical entropy S0, each of the two populations with information entropy values that meet the threshold requirement accounts for 1/4 of the population used to obtain the best fitness function. In this case, the information entropy is less than the critical threshold S0 in the population, and the population excludes individuals whose fitness function value is less than the median.

b. The population is optimized to the Kth (K = 1, 2, . . ., n) generation, and the entropy values Q1, Q2, and Q3 of the three populations are calculated. When the entropy values of two of the populations are less than the critical entropy S0 and the information entropy of one population is greater than the critical entropy S0, then from the population with an information entropy that satisfies the threshold requirement, the individuals with fitness values larger than the median are retained, and the other individuals are excluded.

In general, coordination mode is used to improve the diversity of the algorithm and reduce the possibility of local populations being trapped.

III. IMPROVED GENETIC ALGORITHM BASED ON INFORMATION ENTROPY AND GAME THEORY
Based on the above analysis, the steps used to develop the hybrid genetic algorithm in this paper based on information entropy and game theory are as follows.
1) First, determine the coding method according to the relevant requirements. Then, complete the initialization of the three populations based on the population information entropy in the search space. Next, set the search boundary and initialize the maximum iteration number MaxDT and the parallel optimization threshold MaxJT. Finally, initialize the global optimal solution fbest.
2) Implement the parallel genetic algorithm for the three populations according to the standard genetic algorithm, partheno-genetic algorithm and hybrid genetic algorithm methods. Record the fitness value of each population i.
3) Calculate the global maximum return P* and the population information entropy according to the payment utility rule; then, obtain the population updating strategy δx_i^{t+1}(P*).
4) Update the population according to the population updating strategy determined in (3).
5) Determine whether the global maximum return P* is better than the global optimum fbest and whether to update fbest.
6) Evolve the MaxJT generation according to the free strategy for the updated population.
7) Determine whether the maximum number of iterations M is reached. If so, the process proceeds to step (8); otherwise, the process proceeds to step (3).
8) Output the results.
The flow chart of this process is shown in Figure 3.

IV. SIMULATION EXPERIMENT
In this section, the hybrid optimization algorithm is tested based on the information entropy and game theory process developed in the previous sections. Three simulation functions, namely, the Rosenbrock function, Rastrigin function and Schaffer function, are used in the numerical simulation experiments. The SGA, PGA and SGA-PGA are selected as the reference algorithms to verify the rationality and effectiveness of the proposed algorithm.
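The entropy thresholds of Eqs. (6) and (7) partition each generation into the three game modes described in Section II. A minimal sketch of that dispatch, assuming three subgroup entropy values Q1, Q2, Q3 and a critical entropy S0 (the function and mode names are ours, not the paper's):

```python
def select_game_mode(q1, q2, q3, s0):
    """Choose the complete-information game mode for this generation
    from the three subgroup entropy values (Eqs. (6) and (7))."""
    qs = (q1, q2, q3)
    if max(qs) < s0:
        return "cooperation"   # all populations lack diversity: merge and redistribute
    if min(qs) > s0:
        return "competition"   # all populations are diverse: propagate the best individual
    return "coordination"      # mixed case: exchange individuals across the S0 boundary

assert select_game_mode(0.5, 0.8, 0.9, s0=1.0) == "cooperation"
assert select_game_mode(1.2, 1.5, 1.1, s0=1.0) == "competition"
assert select_game_mode(0.4, 1.5, 1.2, s0=1.0) == "coordination"
```

The coordination branch covers both situations a and b above: which populations donate or shed individuals then depends on which side of S0 each subgroup falls.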
A. ROSENBROCK FUNCTION
f(X) = Σ_{i=1}^{n} [100(x_{i+1} − x_i^2)^2 + (1 − x_i)^2]  (8)
B. RASTRIGIN FUNCTION
f(X) = Σ_{i=1}^{n} [x_i^2 − 10 cos(2πx_i) + 10]  (9)
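For reference, the benchmark functions can be written directly in Python. The Rosenbrock and Rastrigin forms follow Eqs. (8) and (9); the exact Schaffer variant used in the paper is not reproduced in this excerpt, so the common two-variable Schaffer F6 form is assumed here.

```python
import math

def rosenbrock(x):
    """Eq. (8): global minimum 0 at x = (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    """Eq. (9): global minimum 0 at x = (0, ..., 0)."""
    return sum(xi ** 2 - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def schaffer_f6(x, y):
    """Assumed Schaffer F6 form: global minimum 0 at (0, 0)."""
    r2 = x * x + y * y
    return 0.5 + (math.sin(math.sqrt(r2)) ** 2 - 0.5) / (1.0 + 0.001 * r2) ** 2
```

All three are standard multimodal or ill-conditioned test functions, which is why they are used here to probe both global and local search ability.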
TABLE 3. Results for the Rosenbrock function under different optimization algorithms.
TABLE 4. Results for the Rastrigin function under different optimization algorithms.
TABLE 5. Results for the Schaffer function under different optimization algorithms.
solution in the feasible domain. The optimal value indicates that the information entropy game genetic algorithm has good global and local search abilities.

In the comparison of algorithm stability, the experiments in this paper are repeated 50 times. In addition to the strong search ability of the proposed algorithm, the stability of the algorithm is another indicator of performance. Based on the 50 repetitions, the variance of the experimental results was determined to assess the fluctuations in the optimal value obtained by the algorithm. Although the Rosenbrock and Schaffer functions yielded unstable results for most algorithms, the proposed method displayed good stability.

For the function value of the initial population, the information entropy game genetic algorithm yielded a better fitness function than can be obtained using traditional genetic score methods, indicating that the method based on information entropy can increase population diversity and optimize the initial population. Additionally, the information entropy game genetic algorithm uses parallel genetic operations; therefore, the efficiency of algorithm optimization can far exceed that of traditional genetic methods. Thus, with the proposed algorithm, the optimal value is found earlier, and the information entropy-based population game operations are associated with genetic nodes, thus avoiding the main disadvantage of the traditional genetic algorithm, which easily falls into local optima.

Based on comprehensive analysis of the above experimental results, the information entropy game genetic algorithm can be applied to complex nonlinear and high-dimensional functions with multiple extreme points and obtain high-precision global optimal values with low computational costs. Notably, the proposed algorithm not only has a fast convergence speed but also has better global and local optimization performance than traditional methods.

V. CONCLUSION
In this paper, the genetic algorithm is improved, and multi-group genetic operations are performed in parallel. Information entropy is introduced to quantitatively analyse the diversity in the evolution process and ensure diversity in population genetics. Additionally, combined with game theory, various types of evolutionary processes occur. The changes in group information entropy using the game strategy that is most conducive to the diversity and adaptability of the whole group are considered to strengthen good individuals and eliminate invalid individuals. Three test functions are introduced to assess the validity and convergence of commonly used test algorithms. Based on the Rosenbrock, Rastrigin and Schaffer functions and a coding test, the results show that the proposed improved hybrid genetic algorithm has a considerable advantage over traditional genetic score-based methods in initializing the population and obtaining high fitness values, rapid convergence, and a high optimization speed.

REFERENCES
[1] J. Holland, Adaptation in Natural and Artificial Systems. Ann Arbor, MI, USA: Univ. of Michigan Press, 1975, pp. 21–24.
[2] L. D. Davis, Handbook of Genetic Algorithms. New York, NY, USA: Van Nostrand Reinhold, 1991, pp. 248–345.
[3] D. E. Goldberg, Genetic Algorithms for Search, Optimization and Machine Learning. Boston, MA, USA: Addison-Wesley, 1989.
[4] B. Dorronsoro and E. Alba, "A simple cellular genetic algorithm for continuous optimization," in Proc. CEC, Vancouver, BC, Canada, 2006, pp. 2838–2844.
[5] N. A. Al-Madi, "De Jong's sphere model test for a human community based genetic algorithm model (HCBGA)," Int. J. Adv. Comput. Sci., vol. 5, no. 1, pp. 166–172, Jan. 2014, doi: 10.14569/IJACSA.2014.050123.
[6] M. Li and T. Tong, "Partheno-genetic algorithm and its global convergence analysis," Acta Automatica Sin., vol. 25, no. 1, pp. 68–72, Jan. 1999.
[7] J. Li, M. G. Wang, L. X. Tang, and J. H. Son, "A special vehicle routing problem," J. Northeast. Univ., vol. 22, no. 3, pp. 245–248, Jun. 2001.
[8] T. Zhang and M. Wang, "Genetic algorithm and 3-OPT combination for solving VRP with capacity constraints," J. Northeast. Univ., vol. 20, no. 3, pp. 253–256, Jun. 1999.
[9] X. Z. Chen, Z. M. Li, and Z. R. Liu, "Application of an improved integer coding genetic algorithm in vehicle routing optimization problem," J. Southern Inst. Metall., vol. 25, no. 1, pp. 36–41, Apr. 2004, doi: 10.3969/j.issn.2095-3046.2004.01.010.
[10] W. Zhang and W. L. Chen, "A multi-logistics center distribution model and its genetic algorithm," Comput. Sci. Devel., vol. 18, no. 2, pp. 46–50, Jan. 2008, doi: 10.3969/j.issn.1673-629X.2008.02.013.
[11] X. M. Dai and C. L. Chen, "Genetic algorithm for extracting mutation operator based on improved model," J. Shanghai Jiaotong Univ., vol. 4, pp. 25–29, Aug. 2003, doi: 10.3321/j.issn:1006-2467.2002.08.024.
[12] X. Fang, S. F. Chen, H. Huang, and G. Z. Zhen, "Research on vehicle routing optimization problem of logistics distribution based on immune genetic algorithm," China Civil Eng. J., vol. 36, no. 7, pp. 43–46, Oct. 2003, doi: 10.3321/j.issn:1000-131X.2003.07.009.
[13] F. Xue, C. G. Wang, and F. Yan, "Genetic-ant colony cooperative optimization algorithm based on information entropy and chaos theory," Control Decis., vol. 26, no. 1, pp. 44–48, Jan. 2011.
[14] X. F. Chen and G. M. Yang, "Quantum immune genetic algorithm based on information entropy," J. Liaoning Tech. Univ., vol. 32, no. 4, pp. 549–556, Jun. 2013.
[15] Q. F. Wei, Y. Cheng, and X. D. Hu, "Genetic algorithm-based intrusion detection genetic algorithm for wireless sensor networks," J. Chongqing Univ. Posts Telecommun., vol. 28, no. 1, pp. 107–112, May 2016, doi: 10.3979/j.issn.1673-825X.2016.01.016.
[16] M. Yang and J. Liu, "A hybrid optimization algorithm based on game theory," Comput. Appl. Res., vol. 33, no. 8, pp. 2350–2352, 2362, Oct. 2016, doi: 10.3969/j.issn.1001-3695.2016.08.025.
[17] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, no. 3, pp. 379–423, Jul. 1948.
[18] G. Q. Yao, Game Theory. Beijing, China: Higher Education Press, 2007.
[19] J. von Neumann and O. Morgenstern, Theory of Games and Economic Behavior. Princeton, NJ, USA: Princeton Univ. Press, 1947.
[20] Y. Jin and G. Kesidis, "Equilibria of a noncooperative game for heterogeneous users of an ALOHA network," IEEE Commun. Lett., vol. 6, no. 7, pp. 282–284, Jul. 2002, doi: 10.1109/lcomm.2002.801326.
[21] R. Ferrero, S. Shahidehpour, and V. Ramesh, "Transaction analysis in deregulated power systems using game theory," IEEE Trans. Power Syst., vol. 12, no. 3, pp. 1340–1347, Aug. 1997, doi: 10.1109/59.630479.
[22] C. Fisk, "Game theory and transportation systems modelling," Transp. Res. B, Methodol., vol. 18, nos. 4–5, pp. 301–313, Aug. 1984.
[23] D. Wu, J. Cao, Y. Ling, J. C. Liu, and L. Sun, "Routing algorithm based on multi-community evolutionary game for VANET," J. Netw., vol. 7, no. 7, pp. 597–601, Jul. 2012, doi: 10.4304/jnw.7.7.1106-1115.

LI LEI received the Ph.D. degree in science from Xi'an Jiaotong University, China, in 1989, and the Ph.D. degree in engineering from Tohoku University, Japan, in 1994. He was a Ph.D. and a Research Assistant with Hirosaki University, Japan, from April 1989 to March 1992. Since April 1992, he has been on the Faculty of Computer Science and Engineering. He was an Associate Professor with Aomori University, Japan, from 1992 to 1997, and Yamaguchi University, Japan, from 1997 to 2001. He was a Visiting Professor with the State University of New York at Stony Brook, from 1999 to 2000. Since 2002, he has been a Professor and the Dean of Hosei University, Japan. He has been acting as an author/coauthor or the Editor/Co-Editor of 20 books. His research interests include fast algorithms, parallel algorithms, genetic algorithms, neural networks, machine learning algorithms, and so on. He has published around 220 articles in refereed journals, conference proceedings, and book chapters in these areas. He has been involved in more than 35 conferences and workshops as a Program/General/Organizing Chair. He received the Hitachi to Software Papers International Award in Sendai, Japan, in 1990, the Outstanding Leadership Award from the International Conference on Computer Convergence Technology, Seoul, Korea, in 2011, the Outstanding Service Award from ACM Research in Applied Computation Symposium, Miami, USA, in 2011, and the Asia Education Friendship Award from Education Forum for Asia Annual Conference in Chengdu, China, in 2019.