A Novel Evolutionary Algorithm With Column and Sub-Block Local Search For Sudoku Puzzles
Fig. 1. Example of 9 × 9 Sudoku puzzle and its solution. (a) Sudoku puzzle.
(b) Solution to the puzzle.
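The puzzle/solution pair in Fig. 1 can be represented with the matrix-based encoding used throughout the paper: a candidate is a 9 × 9 grid whose rows are permutations of 1–9, and the fitness fx counts the digits that repeat within a column or a 3 × 3 sub-block, so that fx = 0 indicates a solved puzzle. The sketch below is a minimal reading of that fitness, not the authors' implementation, and the pattern-generated grid is only a stand-in for a real solution:

```python
def fitness(grid):
    """Count duplicated digits in columns and 3x3 sub-blocks.

    Rows are assumed to be permutations of 1..9 (row-permutation
    encoding), so only columns and sub-blocks can break the Sudoku
    rules; a fitness of 0 therefore indicates a valid solution.
    """
    violations = 0
    for c in range(9):  # column violations: 9 minus distinct digits
        violations += 9 - len({grid[r][c] for r in range(9)})
    for br in range(0, 9, 3):  # the nine 3x3 sub-blocks
        for bc in range(0, 9, 3):
            block = {grid[br + i][bc + j] for i in range(3) for j in range(3)}
            violations += 9 - len(block)
    return violations

# A grid that is valid by construction (a classic shifted pattern).
solved = [[(3 * r + r // 3 + c) % 9 + 1 for c in range(9)] for r in range(9)]
assert fitness(solved) == 0
# Swapping two cells inside one row keeps the row a permutation but
# introduces exactly two column conflicts here (the sub-block is unchanged).
solved[0][0], solved[0][1] = solved[0][1], solved[0][0]
assert fitness(solved) == 2
```

This is the sense in which a near-optimal individual can have "only two columns that do not conform to the rules of Sudoku" while every row remains valid.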
TABLE I
PARAMETERS IN LSGA

Algorithm 4: Pseudocode of LSGA.
Input: maximum number of generations FESmax, population.
1: Initialize population;
2: Evaluate population;
3: While (count ≤ FESmax) do:
4:   Tournament selection;
5:   Crossover;
6:   Mutation;
7:   Column local search;
8:   Sub-block local search;
9:   Evaluate population;
10:  Elite population learning;
11:  Reserve the best individual as gbest;
12:  If fx(gbest) == 0:
13:    Break;
14:  End If
15: End While
16: Obtain the best solution gbest and its fitness fx(gbest);
Output: fx(gbest) and gbest

The elite population learning strategy is executed in line 10. The algorithm iteratively repeats the above operations until the optimal solution is found or the maximum number of generations is reached.

IV. EXPERIMENTAL STUDIES

A. Comparisons With State-of-the-Art Methods

To illustrate the performance of LSGA, we compare it with state-of-the-art algorithms, including the node-based coincidence algorithm NB-COIN [13], the building-block-preserving GA named GA-I [46], and the GA with local-optima handling named GA-II [18]. To make the comparison fair, the population size is set to 150 and all algorithms run for 1 × 10^4 generations. The parameter settings of LSGA are given in Table I. In the experiments, six classic Sudoku puzzles are selected, all of which are also solved by the compared algorithms NB-COIN, GA-I, and GA-II. These puzzles cover three difficulty levels (i.e., easy, medium, and hard), as shown in Fig. 8. Each algorithm runs 100 times on each puzzle, where Succ_Count is the number of runs among the 100 that find the optimal solution within 1 × 10^4 generations, and Avg_Gen is the average number of generations required to find the optimal solution. Note that, to ensure the fairness of the comparison, the experimental results of the compared algorithms are taken directly from their original papers. Table II gives the experimental results.

From Table II, we can see that LSGA, NB-COIN, and GA-II solve all six Sudoku puzzles in all 100 runs, while GA-I finds the solution of Hard 106 in only 96 runs. Thus, LSGA, NB-COIN, and GA-II are better than GA-I. Compared with NB-COIN, LSGA performs worse on the easy-level puzzles. Furthermore, as NB-COIN relies on probability distributions to generate solutions for Sudoku, it is less influenced by local optima than the other GAs. Specifically, both LSGA and the other comparison algorithms need more generations to solve Hard 106 than Hard 77, whereas the performance of NB-COIN on the two puzzles differs little. By analyzing the solution process, we found that Hard 106 contains a very competitive local optimum: only two of its columns violate the rules of Sudoku, yet its structure differs from that of the best solution. As a result, the GAs can easily fall into this local optimum. In general, comparing the results on the medium-level and hard-level puzzles, the average number of generations of LSGA is smaller than those of GA-I, GA-II, and NB-COIN. Therefore, the performance of LSGA in solving Sudoku puzzles is very competitive.

Next, we conduct an experiment on three so-called super difficult Sudoku puzzles, named super difficult-1 (SD1), AI-escargot (SD2), and super difficult-2 (SD3), selected from [14] and shown in Fig. 9. Among these three puzzles, AI-escargot is one of the most difficult Sudoku puzzles in the world [46]. Table III compares LSGA with other algorithms that have successfully solved these super difficult puzzles: GA-III [14], GPU-GA [47], and GA-I [46]. Each algorithm runs 100 times on each puzzle, where Succ_Count is the success rate among the 100 runs within 1 × 10^4 generations, and Avg_Gen is the average number of generations required to find the optimal solution.

From Table III, we see that LSGA and GPU-GA solve all puzzles with a 100% success rate, while GA-I and GA-III cannot. Moreover, both LSGA and GPU-GA require fewer generations than the other algorithms on each puzzle, and LSGA requires the fewest. Furthermore, we evaluate the difficulty of SD1, SD2, and SD3 with the help of SE and obtain scores of 7.2, 10.5, and 2.8, respectively. This means that, for the Sudoku-solving methods recorded in SE (such as WXYZ-Wing, Swampfish, ALS-Wing, etc.), SD2 is very difficult to solve while SD3 is much simpler. However, the experimental results in Table III tell a different story: compared with SD1 and SD2, LSGA needs many more generations to solve SD3, and the same holds for the other GAs compared in Table III. Therefore, we consider that if some known Sudoku-solving methods could be introduced into the GA, the efficiency of solving difficult Sudoku puzzles could be greatly improved.

B. Statistical Performance on Open Sudoku Puzzles

To illustrate the statistical performance of LSGA on more Sudoku puzzles, we conduct experiments on a large number of puzzles selected from the open website www.websudoku.com.
IEEE TRANSACTIONS ON GAMES, VOL. 16, NO. 1, MARCH 2024
Fig. 8. Six Sudoku puzzles and their solutions. (a) Initial Sudoku puzzle. (b) Solution to puzzle. (c) Initial Sudoku puzzle. (d) Solution to puzzle. (e) Initial Sudoku
puzzle. (f) Solution to puzzle. (g) Initial Sudoku puzzle. (h) Solution to puzzle. (i) Initial Sudoku puzzle. (j) Solution to puzzle. (k) Initial Sudoku puzzle. (l) Solution
to puzzle.
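The control flow of Algorithm 4 can be sketched as a short, generic loop. This is a structural sketch only: the operator arguments stand in for the paper's tournament selection, crossover, mutation, column and sub-block local searches, and elite population learning, and the toy demonstration at the end replaces Sudoku grids with integers purely to show the loop terminating once the best fitness reaches 0.

```python
def lsga(init_pop, evaluate, select, crossover, mutate,
         column_ls, block_ls, elite_learn, fes_max):
    """Structural sketch of the LSGA main loop (Algorithm 4)."""
    population = init_pop()
    fitness = [evaluate(ind) for ind in population]
    gbest, fbest = min(zip(population, fitness), key=lambda pf: pf[1])
    for _ in range(fes_max):                                    # line 3
        population = select(population, fitness)                # line 4
        population = crossover(population)                      # line 5
        population = [mutate(ind) for ind in population]        # line 6
        population = [column_ls(ind) for ind in population]     # line 7
        population = [block_ls(ind) for ind in population]      # line 8
        fitness = [evaluate(ind) for ind in population]         # line 9
        population, fitness = elite_learn(population, fitness)  # line 10
        best, fb = min(zip(population, fitness), key=lambda pf: pf[1])
        if fb < fbest:                                          # line 11
            gbest, fbest = best, fb
        if fbest == 0:                                          # lines 12-13
            break
    return gbest, fbest                                         # line 16

# Toy demonstration: integers instead of grids, |x| as the violation
# count, and a stand-in "local search" that repairs one unit per call.
identity = lambda x: x
repair = lambda x: x - (x > 0) + (x < 0)
best, f = lsga(lambda: [5, -3, 7], abs,
               select=lambda p, f: p, crossover=lambda p: p,
               mutate=identity, column_ls=repair, block_ls=identity,
               elite_learn=lambda p, f: (p, f), fes_max=20)
assert f == 0 and best == 0
```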
TABLE II
RESULTS OF PROPOSED LSGA AND OTHER METHODS FOR SOLVING SUDOKU ON SIX DIFFERENT SUDOKU PUZZLES
TABLE III
RESULTS OF PROPOSED LSGA AND OTHER METHODS FOR SOLVING SUDOKU ON THREE SUPER DIFFICULT SUDOKU PUZZLES
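The column local search in line 7 of Algorithm 4 can be pictured as in-row repair moves: because every row is a permutation of 1–9, a conflict in a column can only be fixed by exchanging two cells within some row. The swap-acceptance rule below (keep a swap only if it strictly reduces column violations) is an illustrative hill-climbing reading, not the authors' exact operator:

```python
def column_local_search(grid):
    """One sweep of an illustrative column repair on a 9x9 grid whose
    rows are permutations of 1..9: try every in-row swap and keep it
    only if it strictly reduces the number of column violations."""
    def col_violations(g):
        return sum(9 - len({g[r][c] for r in range(9)}) for c in range(9))

    grid = [row[:] for row in grid]  # work on a copy
    for r in range(9):
        for c1 in range(9):
            for c2 in range(c1 + 1, 9):
                before = col_violations(grid)
                grid[r][c1], grid[r][c2] = grid[r][c2], grid[r][c1]
                if col_violations(grid) >= before:  # revert non-improving swaps
                    grid[r][c1], grid[r][c2] = grid[r][c2], grid[r][c1]
    return grid

# Perturb a valid grid with one in-row swap; the sweep repairs it.
solved = [[(3 * r + r // 3 + c) % 9 + 1 for c in range(9)] for r in range(9)]
broken = [row[:] for row in solved]
broken[0][0], broken[0][3] = broken[0][3], broken[0][0]
assert column_local_search(broken) == solved
```

A full LSGA would alternate such moves with an analogous sub-block repair, which is why a near-solution with only column conflicts (like the Hard 106 local optimum discussed above) is the typical failure mode left for the genetic operators to escape.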
WANG et al.: NOVEL EVOLUTIONARY ALGORITHM WITH COLUMN AND SUB-BLOCK LOCAL SEARCH FOR SUDOKU PUZZLES
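The Succ_Count and Avg_Gen statistics reported in Tables II–IV can be reproduced from per-run records. The helper below is a minimal sketch (the function and field names are ours, not the paper's); it counts runs that reach the optimum within the generation budget and averages their generation counts:

```python
def summarize_runs(generations_per_run, max_gen=10_000):
    """Succ_Count / Avg_Gen over a set of independent runs.

    Each entry is the generation at which one run found the optimal
    solution, or None if the run failed within the budget."""
    successes = [g for g in generations_per_run
                 if g is not None and g <= max_gen]
    succ_count = len(successes)
    avg_gen = sum(successes) / succ_count if successes else float("inf")
    return succ_count, avg_gen

# 100 runs with 96 successes, mirroring the success pattern reported
# for GA-I on Hard 106 (the generation counts themselves are made up).
runs = [120] * 96 + [None] * 4
assert summarize_runs(runs) == (96, 120.0)
```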
TABLE IV
RESULTS OF PROPOSED LSGA FOR SOLVING SUDOKU PUZZLES AT WWW.WEBSUDOKU.COM

Fig. 9. Three super difficult Sudoku puzzles and their solutions. (a) Initial Sudoku puzzle. (b) Solution to puzzle. (c) Initial Sudoku puzzle. (d) Solution to puzzle. (e) Initial Sudoku puzzle. (f) Solution to puzzle.

We select Sudoku puzzles from four difficulty levels: easy, medium, hard, and evil. At each difficulty level, 30 puzzles are randomly selected, so 120 puzzles in total are adopted for testing. The details of all 120 puzzles are provided in the Supplemental_Material. The configuration of LSGA is the same as in Table I. LSGA runs 10 times on each puzzle, and the average performance over the 10 runs is given as Avg_Gen in Table S.I in the Supplemental_Material. Then, the mean performance of LSGA over all 30 puzzles (i.e., the 30 Avg_Gen values) at each difficulty level is given as Mean_Gen in the last row of Table S.I. Moreover, the Mean_Gen values of the four difficulty levels and other statistics are given in Table IV. For example, in the second row of Table IV, for the 30 easy-level puzzles, the average number of generations needed by LSGA to obtain the optimal solution of each puzzle over the 10 runs is calculated; the maximal and minimal averages among the 30 puzzles are then given as Max_Gen and Min_Gen. Moreover, the mean of the 30 averages is given as Mean_Gen, and Mean_Succ_Rate is the success rate of LSGA over all 30 puzzles and all 10 runs.

From Table IV, we see that LSGA efficiently solves all puzzles. As the difficulty level increases, the number of generations needed by LSGA to obtain the optimal solution also increases exponentially, especially for the evil-level puzzles. Therefore, we further investigate the factors affecting the performance of LSGA in the following part.

C. Further Investigation and Discussion

To further study the factors affecting the performance of LSGA, we rate and generate Sudoku puzzles using SE. The SE score of a Sudoku puzzle is determined by the complexity of the skills required to solve it: the more complex the required skills, the higher the SE score, which in turn determines the difficulty level of the puzzle. This type of evaluation is very effective for players and is widely used [4]. Therefore, we use SE to generate Sudoku puzzles at seven difficulty levels, with each level containing 10 different puzzles. The seven levels are called easy, medium, hard, superior, fiendish, super, and advance, and their SE score intervals are [1.0, 1.2], [1.3, 1.5], [1.6, 2.6], [2.7, 3.9], [4.0, 5.9], [6.0, 6.9], and [7.0, 8.0], respectively. The details of all 70 puzzles are given in the Supplemental_Material. LSGA is adopted to solve these puzzles. Similar to the experiments in Section IV-B, each puzzle is solved ten times, and the average number of generations needed by LSGA to obtain the optimal solution over the ten runs is calculated. Then, we obtain ten average results at each difficulty level (i.e., there are ten puzzles and each puzzle has an average result). The details of these ten average results are given in Table V, and their distribution is plotted as box plots in Fig. 10. There are seven columns in Table V and seven box plots in Fig. 10, one for each of the seven difficulty levels.
Moreover, we also look into the number of given numbers in all the 70 puzzles. Fig. 11 shows the average number of generations needed by LSGA to solve Sudoku puzzles with different numbers of givens. For example, the first bar means that several of the 70 puzzles have 23 given numbers; the average number of generations needed by LSGA to solve each of these puzzles over the 10 runs is calculated, and the mean of these average values is 177.

Fig. 11. Mean generations needed by LSGA to solve Sudoku puzzles with different given numbers.

As shown in Fig. 10 and Table V, we can conclude that the generations required by LSGA to solve Sudoku are not significantly affected by the difficulty level. For example, the generations required to solve most puzzles at the hard and superior levels are fewer than at the medium level. That is, LSGA inherits the problem-independent characteristic of GA and generalizes well across Sudoku puzzles. Moreover, as shown in Fig. 11, we can conclude that there is a correlation between the difficulty of solving a Sudoku puzzle and its number of given numbers. More given numbers give LSGA more help in finding a solution, because they effectively reduce the search space and eliminate interfering solutions. However, the relationship between the given numbers and difficulty is not strictly linear or exponential; there are cases in which a Sudoku puzzle with more given numbers is still harder to solve.

V. CONCLUSION

In this article, we propose an improved GA with local search (named LSGA) for Sudoku. In particular, we adopt a matrix-based encoding and devise mutation and crossover operators for this coding scheme. Then, to improve the convergence speed of LSGA, a local search method incorporating column and sub-block search is proposed. Finally, in comparisons with GA-based algorithms on Sudoku puzzles of different dimensions and difficulty levels, LSGA successfully solves all of the puzzles and shows good performance.

LSGA can also be applied to other types of Sudoku, such as Mini Sudoku and Ring Sudoku. However, for Sudoku variants such as Killer Sudoku and Kakuro Sudoku [48], the initialization and local search strategies need to be redesigned, because the local search in LSGA is designed for regularly shaped sub-blocks. Furthermore, for puzzles without sub-blocks, such as Futoshiki and Takuzu [49], the column local search strategy is still applicable. Therefore, LSGA deserves further research to better solve other Sudoku puzzles.

Although our algorithm is successful in solving many Sudoku puzzles, there is still room for improvement. For example, with the help of manual Sudoku-solving methods such as direct hidden pair and fish methods [50], humans can easily find numbers in irrational positions and adjust them, whereas LSGA needs several, tens, or even hundreds of generations to achieve the same result. Therefore, in the future, we can improve the performance of LSGA by combining it with other Sudoku-solving methods.

REFERENCES

[1] J.-P. Delahaye, "The science behind SUDOKU," Sci. Amer., vol. 294, no. 6, pp. 80–87, Jun. 2006.
[2] X. Qi, G. Li, N. Wang, X. Wang, and L. Wen, "Method study on solving sudoku problem," in Proc. 3rd Int. Conf. Data Sci. Bus. Analytics, 2019, pp. 269–271.
[3] T. Yato and T. Seta, "Complexity and completeness of finding another solution and its application to puzzles," IEICE Trans. Fundamentals Electron., Commun. Comput. Sci., vol. E86-A, no. 5, pp. 1052–1060, May 2003.
[4] M. Henz and H.-M. Truong, "SudokuSat—A tool for analyzing difficult sudoku puzzles," in Tools and Applications with Artificial Intelligence, vol. 166, New York, NY, USA: Springer, 2009, pp. 25–35.
[5] S. Jana, A. K. Maji, and R. K. Pal, "A novel SPN-based video steganographic scheme using Sudoku puzzle for secured data hiding," Innov. Syst. Softw. Eng., vol. 15, no. 1, pp. 65–73, Jan. 2019.
[6] N. R. Munson, T. D. Bufler, and R. M. Narayanan, "Sudoku based phase-coded radar waveforms," in Proc. Radar Sensor Technol., Apr. 2021, vol. 11742, pp. 81–93.
[7] S. Jose and R. Abraham, "Influence of Chess and Sudoku on cognitive abilities of secondary school students," Issues Ideas Educ., vol. 7, no. 1, pp. 23–30, Mar. 2019.
[8] X. Li, Y. Luo, and W. Bian, "Retracing extended sudoku matrix for high-capacity image steganography," Multimedia Tools Appl., vol. 80, no. 12, pp. 18627–18651, Feb. 2021.
[9] M. Horoufiany and R. Ghandehari, "Optimization of the Sudoku based reconfiguration technique for PV arrays power enhancement under mutual shading conditions," Sol. Energy, vol. 159, pp. 1037–1046, Jan. 2018.
[10] M. Harrysson and H. Laestander, "Solving sudoku efficiently with dancing links," Exp. Math., vol. 23, pp. 190–217, Dec. 2014.
[11] J. Gunther and T. Moon, "Entropy minimization for solving Sudoku," IEEE Trans. Signal Process., vol. 60, no. 1, pp. 508–513, Jan. 2012.
[12] L. Clementis, "Advantage of parallel simulated annealing optimization by solving sudoku puzzle," in Proc. Emergent Trends Robot. Intell. Syst., 2015, pp. 207–213.
[13] K. Waiyapara, W. Wattanapornprom, and P. Chongstitvatana, "Solving sudoku puzzles with node based coincidence algorithm," in Proc. 10th Int. Joint Conf. Comput. Sci. Softw. Eng., 2013, pp. 11–16.
[14] M. Becker and S. Balci, "Improving an evolutionary approach to Sudoku puzzles by intermediate optimization of the population," in Proc. Int. Conf. Inf. Sci. Appl., 2019, pp. 369–375.
[15] T. Mantere and J. Koljonen, "Solving and rating sudoku puzzles with genetic algorithms," in Proc. 12th Finnish Artif. Intell. Conf. Step, 2006, pp. 86–92.
[16] H. Lloyd and M. Amos, "Solving Sudoku with ant colony optimization," IEEE Trans. Games, vol. 12, no. 3, pp. 302–311, Sep. 2020.
[17] D. E. Knuth, "Dancing links," in Millennial Perspectives in Computer Science, Boston, MA, USA: Palgrave Macmillan, 2000, pp. 187–214.
[18] F. Gerges, G. Zouein, and D. Azar, "Genetic algorithms with local optima handling to solve sudoku puzzles," in Proc. Int. Conf. Comput. Artif. Intell., Mar. 2018, pp. 19–22.
[19] A. Z. Sevkli and K. A. Hamza, "General variable neighborhood search for solving sudoku puzzles: Unfiltered and filtered models," Soft Comput., vol. 23, no. 15, pp. 6585–6601, Aug. 2019.
[20] M. A. Al-Betar, M. A. Awadallah, A. L. Bolaji, and B. O. Alijla, "β-hill climbing algorithm for Sudoku game," in Proc. Palestinian Int. Conf. Inf. Commun. Technol., 2017, pp. 84–88.
[21] J. Horn, "Solving a large sudoku by co-evolving numerals," in Proc. Genet. Evol. Comput. Conf. Companion, 2017, pp. 29–30.
[22] J.-Y. Li, Z.-H. Zhan, H. Wang, and J. Zhang, "Data-driven evolutionary algorithm with perturbation-based ensemble surrogates," IEEE Trans. Cybern., vol. 51, no. 8, pp. 3925–3937, Aug. 2021.
[23] Z. H. Zhan et al., "Matrix-based evolutionary computation," IEEE Trans. Emerg. Topics Comput. Intell., vol. 6, no. 2, pp. 315–328, Apr. 2022.
[24] J.-Y. Li, Z.-H. Zhan, C. Wang, H. Jin, and J. Zhang, "Boosting data-driven evolutionary algorithm with localized data generation," IEEE Trans. Evol. Comput., vol. 24, no. 5, pp. 923–937, Oct. 2020.
[25] X. Zhang, Z.-H. Zhan, W. Fang, P. Qian, and J. Zhang, "Multipopulation ant colony system with knowledge-based local searches for multiobjective supply chain configuration," IEEE Trans. Evol. Comput., vol. 26, no. 3, pp. 512–526, Jun. 2022.
[26] L. Shi, Z. H. Zhan, D. Liang, and J. Zhang, "Memory-based ant colony system approach for multi-source data associated dynamic electric vehicle dispatch optimization," IEEE Trans. Intell. Transp. Syst., vol. 23, no. 10, pp. 17491–17505, Oct. 2022, doi: 10.1109/TITS.2022.3150471.
[27] J. Y. Li et al., "A multipopulation multiobjective ant colony system considering travel and prevention costs for vehicle routing in COVID-19-like epidemics," IEEE Trans. Intell. Transp. Syst., vol. 23, no. 12, pp. 25062–25076, Dec. 2022, doi: 10.1109/TITS.2022.3180760.
[28] J. R. Jian, Z. G. Chen, Z. H. Zhan, and J. Zhang, "Region encoding helps evolutionary computation evolve faster: A new solution encoding scheme in particle swarm for large-scale optimization," IEEE Trans. Evol. Comput., vol. 25, no. 4, pp. 779–793, Aug. 2021.
[29] J. Y. Li, Z. H. Zhan, R. D. Liu, C. Wang, S. Kwong, and J. Zhang, "Generation level parallelism for evolutionary computation: A pipeline-based parallel particle swarm optimization," IEEE Trans. Cybern., vol. 51, no. 10, pp. 4848–4859, Oct. 2021.
[30] X. F. Liu, Z. H. Zhan, Y. Gao, J. Zhang, S. Kwong, and J. Zhang, "Coevolutionary particle swarm optimization with bottleneck objective learning strategy for many-objective optimization," IEEE Trans. Evol. Comput., vol. 23, no. 4, pp. 587–602, Aug. 2019.
[31] Z. J. Wang et al., "Dynamic group learning distributed particle swarm optimization for large-scale optimization and its application in cloud workflow scheduling," IEEE Trans. Cybern., vol. 50, no. 6, pp. 2715–2729, Jun. 2020.
[32] X. Xia et al., "Triple archives particle swarm optimization," IEEE Trans. Cybern., vol. 50, no. 12, pp. 4862–4875, Dec. 2020.
[33] J. Y. Li, Z. H. Zhan, K. C. Tan, and J. Zhang, "Dual differential grouping: A more general decomposition method for large-scale optimization," IEEE Trans. Cybern., to be published, doi: 10.1109/TCYB.2022.3158391.
[34] J. Y. Li, K. J. Du, Z. H. Zhan, H. Wang, and J. Zhang, "Distributed differential evolution with adaptive resource allocation," IEEE Trans. Cybern., to be published, doi: 10.1109/TCYB.2022.3153964.
[35] Z. H. Zhan, Z. J. Wang, H. Jin, and J. Zhang, "Adaptive distributed differential evolution," IEEE Trans. Cybern., vol. 50, no. 11, pp. 4633–4647, Nov. 2020.
[36] J. Y. Li, Z. H. Zhan, K. C. Tan, and J. Zhang, "A meta-knowledge transfer-based differential evolution for multitask optimization," IEEE Trans. Evol. Comput., vol. 26, no. 4, pp. 719–734, Aug. 2022.
[37] J. Y. Li, Z. H. Zhan, J. Xu, S. Kwong, and J. Zhang, "Surrogate-assisted hybrid-model estimation of distribution algorithm for mixed-variable hyperparameters optimization in convolutional neural networks," IEEE Trans. Neural Netw. Learn. Syst., to be published, doi: 10.1109/TNNLS.2021.3106399.
[38] Z. H. Zhan, L. Shi, K. C. Tan, and J. Zhang, "A survey on evolutionary computation for complex continuous optimization," Artif. Intell. Rev., vol. 55, no. 1, pp. 59–110, Jan. 2022.
[39] Z. H. Zhan, J. Y. Li, and J. Zhang, "Evolutionary deep learning: A survey," Neurocomputing, vol. 483, pp. 42–58, Apr. 2022.
[40] Z. G. Chen, Z. H. Zhan, S. Kwong, and J. Zhang, "Evolutionary computation for intelligent transportation in smart cities: A survey," IEEE Comput. Intell. Mag., vol. 17, no. 2, pp. 83–102, May 2022.
[41] J. Y. Li, Z. H. Zhan, and J. Zhang, "Evolutionary computation for expensive optimization: A survey," Mach. Intell. Res., vol. 19, no. 1, pp. 3–23, Jan. 2022.
[42] N. Pathak and R. Kumar, "Improved wisdom of crowds heuristic for solving sudoku puzzles," in Proc. Soft Comput. Signal Process., 2019, pp. 369–377.
[43] X. Q. Deng and Y. D. Li, "A novel hybrid genetic algorithm for solving sudoku puzzles," Optim. Lett., vol. 7, no. 2, pp. 241–257, Oct. 2013.
[44] K. Rodríguez-Vázquez, "GA and entropy objective function for solving sudoku puzzle," in Proc. Genet. Evol. Comput. Conf. Companion, 2018, pp. 67–68.
[45] N. Musliu and F. Winter, "A hybrid approach for the Sudoku problem: Using constraint programming in iterated local search," IEEE Intell. Syst., vol. 32, no. 2, pp. 52–62, Mar./Apr. 2017.
[46] Y. Sato and H. Inoue, "Solving Sudoku with genetic operations that preserve building blocks," in Proc. IEEE Conf. Comput. Intell. Games, 2010, pp. 23–29.
[47] Y. Sato, N. Hasegawa, and M. Sato, "GPU acceleration for Sudoku solution with genetic operations," in Proc. IEEE Congr. Evol. Comput., 2011, pp. 296–303.
[48] N. Pillay, "Finding solutions to Sudoku puzzles using human intuitive heuristics," South Afr. Comput. J., vol. 49, no. 1, pp. 25–34, Sep. 2012.
[49] A. Groza, "Japanese puzzles," in Modelling Puzzles in First Order Logic, Berlin, Germany: Springer, 2021, pp. 221–253.
[50] X. Peng, Y. Huang, and F. Li, "A steganography scheme in a low-bit rate speech codec based on 3D-sudoku matrix," in Proc. IEEE 8th Int. Conf. Commun. Softw. Netw., 2016, pp. 13–18.

Chuan Wang received the B.S. degree in computer science and the M.S. degree in education from Henan Normal University, Xinxiang, China, in 1999 and 2009, respectively. He is currently an Associate Professor with the College of Software, Henan Normal University. His current research interests include computational intelligence and its applications in intelligent information processing and Big Data.

Bing Sun (Student Member, IEEE) received the B.S. degree in computer science and technology from Henan University of Science and Technology, Henan, China, in 2020. He is currently working toward the M.S. degree in electronic and information engineering with Henan Normal University, Xinxiang, China. His current research interests mainly include evolutionary computation, swarm intelligence, and their applications in real-world problems.
Ke-Jing Du received the B.S. degree from Sun Yat-Sen University, Guangzhou, China, in 2012, and the M.S. degree from the City University of Hong Kong, Hong Kong, in 2014. She is currently working toward the Ph.D. degree with Victoria University, Melbourne, VIC, Australia. Her current research interests include evolutionary computation (EC) and supply chain management, especially distributed EC and the application of EC in supply chain, feature selection, and games.

Jian-Yu Li (Member, IEEE) received the Bachelor's and Ph.D. degrees in computer science and technology from the South China University of Technology, Guangzhou, China, in 2018 and 2022, respectively. His research interests mainly include computational intelligence, data-driven optimization, machine learning including deep learning, and their applications in real-world problems and in environments of distributed computing and Big Data. Dr. Li has been a reviewer for IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION and Neurocomputing, and a program committee member and reviewer for several international conferences.

Zhi-Hui Zhan (Senior Member, IEEE) received the Bachelor's and Ph.D. degrees in computer science from Sun Yat-Sen University, Guangzhou, China, in 2007 and 2013, respectively. He is currently the Changjiang Scholar Young Professor with the School of Computer Science and Engineering, South China University of Technology, Guangzhou, China. His current research interests include evolutionary computation, swarm intelligence, and their applications in real-world problems and environments of cloud computing and Big Data. Dr. Zhan was a recipient of the IEEE Computational Intelligence Society (CIS) Outstanding Early Career Award in 2021, the Outstanding Youth Science Foundation from the National Natural Science Foundations of China in 2018, and the Wu Wen-Jun Artificial Intelligence Excellent Youth Award from the Chinese Association for Artificial Intelligence in 2017. His doctoral dissertation was awarded the IEEE CIS Outstanding Ph.D. Dissertation and the China Computer Federation Outstanding Ph.D. Dissertation. He is one of the World's Top 2% Scientists for both career-long impact and year impact in artificial intelligence, and one of the Highly Cited Chinese Researchers in computer science. He is currently the Chair of the Membership Development Committee of the IEEE Guangzhou Section and the Vice-Chair of the IEEE CIS Guangzhou Chapter. He is currently an Associate Editor for IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, Neurocomputing, Memetic Computing, and Machine Intelligence Research.

Sang-Woon Jeon (Member, IEEE) received the B.S. and M.S. degrees from Yonsei University, Seoul, South Korea, in 2003 and 2006, respectively, and the Ph.D. degree from the Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea, in 2011, all in electrical engineering. He has been an Associate Professor with the Department of Military Information Engineering (undergraduate school) and the Department of Electronics and Communication Engineering (graduate school), Hanyang University, Ansan, South Korea, since 2017. From 2011 to 2013, he was a Postdoctoral Associate with the School of Computer and Communication Sciences, Ecole Polytechnique Federale de Lausanne, Lausanne, Switzerland. From 2013 to 2017, he was an Assistant Professor with the Department of Information and Communication Engineering, Andong National University, Andong, South Korea. His research interests include network information theory, wireless communications, sensor networks, and their applications to the Internet of Things and Big Data. Dr. Jeon was a recipient of the Haedong Young Scholar Award in 2017, sponsored by the Haedong Foundation and given by the Korea Institute of Communications and Information Science (KICS), the Best Paper Award of the KICS journals in 2016, the Best Paper Award of the IEEE International Conference on Communications in 2015, the Best Thesis Award from the Department of Electrical Engineering, KAIST, in 2012, the Best Paper Award of the KICS Summer Conference in 2010, and the Bronze Prize of the Samsung Humantech Paper Awards in 2009.

Hua Wang (Senior Member, IEEE) received the Ph.D. degree from the University of Southern Queensland, Toowoomba, QLD, Australia, in 2004. He is currently a full-time Professor with Victoria University, Footscray, VIC, Australia. He has expertise in electronic commerce, business process modeling, and enterprise architecture. As a Chief Investigator, he has been awarded three Australian Research Council Discovery grants since 2006 and has published 200 peer-reviewed scholarly papers.

Jun Zhang (Fellow, IEEE) received the Ph.D. degree from the City University of Hong Kong, Hong Kong, in 2002. He is currently a Korea Brain Pool Fellow Professor with Hanyang University, South Korea. He has authored or coauthored more than 150 IEEE Transactions papers in his research areas. His current research interests include computational intelligence, cloud computing, operations research, and power electronic circuits. Dr. Zhang was a recipient of the Changjiang Chair Professorship from the Ministry of Education, China, in 2013, the National Science Fund for Distinguished Young Scholars of China in 2011, and the First-Grade Award in Natural Science Research from the Ministry of Education, China, in 2009. He is currently an Associate Editor for IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION and IEEE TRANSACTIONS ON CYBERNETICS.