2022 IEEE IAS Global Conference on Emerging Technologies (GlobConET)
Arad, Romania. May 20-22, 2022

Implementation and Validation of NSGA-II Algorithm for Constrained and Unconstrained Multi-Objective Optimization Problem

Adithya B Uday1, Nivedita Naik1, Madhu G M1, C. Vyjayanthi1 and Chirag Modi2

1Department of Electrical and Electronics Engineering, 2Department of Computer Science and Engineering
National Institute of Technology Goa, Goa, India
[email protected], [email protected], [email protected], [email protected], [email protected]

Abstract—Multi-objective optimization (MOO) algorithms are gaining more attention among designers and decision makers working on practical optimization problems, compared to single-objective optimization algorithms, which are limited to a single objective. Evolutionary Computation (EC) techniques are employed for solving these MOO problems. Techniques such as the Non-dominated Sorting Genetic Algorithm (NSGA-I), its improved version NSGA-II, the Strength-Pareto Evolutionary Algorithm (SPEA) and Multi-Objective Particle Swarm Optimization (MOPSO) are some widely employed EC techniques. The implementation and validation of the NSGA-II algorithm are carried out in this paper. The mathematical analysis of the NSGA-II algorithm and its implementation in the MATLAB environment are presented here. The developed algorithm is also tested using unconstrained and constrained standard test functions for MOO problems. The simulation results reveal that the developed algorithm provides results similar to the standard test results. Hence, incorporating a suitable mathematical model of any practical MOO problem into the implemented algorithm will yield the optimal solution for the multiple objectives under consideration.

Index Terms—Constrained Optimization, Evolutionary Computation, Multi-Objective Optimization, NSGA-II, Pareto Optimal Solutions, Unconstrained Optimization.

I. INTRODUCTION

The conventional gradient-based numerical optimization techniques are unable to handle non-differentiable functions, discontinuous functions, etc., and are inefficient in handling discrete variables. The inability of classical optimization methods to deal with multi-modal and multi-objective optimization problems was resolved by the introduction of evolutionary computation (EC) techniques [1]. They help to solve complex optimization problems by generating, evaluating and modifying a population of possible solutions. They are nature-inspired algorithms based on natural evolution or genetics. The ability to handle mixed variables such as continuous, discrete and permutation variables has boosted research in the field of EC.

Any optimization problem has objectives, constraints and bounds. Objectives are functions that are to be either minimized or maximized through the optimization process. Single-objective optimization refers to problems with only one objective function, whereas multi-objective and many-objective optimization problems refer to problems with two or more objectives and with more than three objectives, respectively.

While single-objective optimization problems provide a single optimal solution, MOO problems provide multiple solutions. Practically, most optimization problems have multiple objectives that yield multiple optimal solutions [2]. There are two approaches to dealing with such MOO problems. The first one is the preference-based approach, wherein each objective is weighted and a single objective function is developed, thus obtaining a single optimal solution. Classical optimization methods like the weighted sum approach [3], the epsilon constraint approach [4], the weighted metric method [5] and Benson's method [6] come under the preference-based approach, where the multi-objective problem is formulated as a sum of different single-objective functions. The second approach, the ideal MOO approach, which considers convergence and diversity as the two prime goals, can handle conflicting objectives, as in [2] and [7]. The collection of the best solutions of a MOO problem is called the set of non-dominated solutions or Pareto optimal solutions, and the path along which such solutions lie in the objective space is termed the Pareto optimal front. Better convergence means the solutions lie close to the expected Pareto optimal front, and better diversity means the solutions are spread as far apart as possible in the objective space.
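As a short illustration of the preference-based approach just described, the MATLAB sketch below scalarizes two objectives with user-chosen weights. The objective handles f1, f2 and the weight values are illustrative assumptions, not taken from the paper.

% Preference-based (weighted sum) scalarization: a minimal sketch.
% The objectives f1, f2 and the weights w1, w2 are illustrative only.
f1 = @(z) z(1)^2 + z(2)^2;             % first objective (example)
f2 = @(z) (z(1) - 2)^2 + (z(2) - 1)^2; % second objective (example)
w1 = 0.7;  w2 = 0.3;                   % designer-chosen preference weights
fScalar = @(z) w1*f1(z) + w2*f2(z);    % aggregated single objective
% A single-objective solver now returns one compromise solution, unlike
% the ideal MOO approach, which returns a whole set of trade-off solutions.
zStar = fminsearch(fScalar, [0 0]);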
One of the popular and well-cited ideal MOO methods is NSGA-II [8], an elitist and improved version of NSGA-I [9]. Here, convergence is ensured through non-dominated sorting, while diversity is guaranteed through crowding distance sorting. The NSGA-II algorithm implementation provides Pareto optimal solutions. The best solution then has to be identified using Multi-Criteria Decision Making (MCDM) methods such as the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), the Analytic Hierarchy Process (AHP), etc.

This paper deals with the implementation of the NSGA-II algorithm, followed by its validation using a set of unconstrained standard test functions, namely the Fonseca-Fleming function, the Kursawe function, Schaffer function 1, Schaffer function 2, ZDT1 and ZDT3, and also using constrained standard test functions, namely the Binh & Korn function and the Chankong & Haimes function. The simulation results reveal that the designed algorithm produces results comparable to those of the standard test functions. As a result, incorporating a proper mathematical model of any practical MOO problem into the developed algorithm will yield the optimal solution with respect to the multiple objectives under study.
The rest of this paper is organized as follows: Sections II and III introduce MOO and the NSGA-II algorithm, respectively. Section IV presents the simulation results and discussions. Section V concludes the paper, with the acknowledgement at the end.

Fig. 1: Decision space and objective space representation
II. MULTI-OBJECTIVE OPTIMIZATION

As the term suggests, a multi-objective optimization problem (MOOP) deals with two or more objective functions. A MOOP in its general form is stated as shown below:

Minimize/Maximize $f_m(Z)$, $m = 1, 2, \ldots, M$;
subject to:
$g_j(Z) \leq 0$, $j = 1, 2, \ldots, J$;
$h_k(Z) = 0$, $k = 1, 2, \ldots, K$;
$z_i^{(L)} \leq z_i \leq z_i^{(U)}$, $i = 1, 2, \ldots, n$.

Here, $Z$ is the vector of $n$ decision variables, represented as $Z = (z_1, z_2, \ldots, z_n)^T$, and $M$ is the total number of objectives considered in the MOOP. Objectives can be minimized or maximized based on the problem statement. A common practice is to minimize all objectives by applying the duality property to the maximization objectives: multiplying an objective function by $-1$ changes a maximization problem into a minimization problem. The constraints can be of two types, inequality and equality constraints. $g_j(Z)$ represents the $j$-th inequality constraint and $h_k(Z)$ represents the $k$-th equality constraint, while $J$ and $K$ give the total numbers of inequality and equality constraints, respectively. The inequality constraints, treated here as 'less-than-or-equal-to' type, can also be written in 'greater-than-or-equal-to' form; the former is used here for ease of constraint handling. $n$ represents the total number of decision variables, and each decision variable $z_i$ is limited by a lower bound $z_i^{(L)}$ and an upper bound $z_i^{(U)}$.
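To make the general form above concrete, the following MATLAB sketch stores a MOOP as objective handles, constraint handles and variable bounds. The struct field names and the example functions are hypothetical and only illustrate how the quantities of the formulation map onto code.

% An illustrative way to hold the general MOOP form in MATLAB.
% The field names (objectives, gIneq, hEq, lb, ub) are hypothetical.
moop.objectives = { @(z) z(1)^2 + z(2)^2, ...        % f_1(Z), example
                    @(z) (z(1) - 1)^2 + z(2)^2 };    % f_2(Z), example
moop.gIneq      = { @(z) z(1) + z(2) - 2 };          % g_1(Z) <= 0
moop.hEq        = {};                                % no equality constraints here
moop.lb         = [-4 -4];                           % lower bounds z_i^(L)
moop.ub         = [ 4  4];                           % upper bounds z_i^(U)

% Evaluating all M objectives at one candidate decision vector:
z = [0.5, 0.5];
F = cellfun(@(f) f(z), moop.objectives);             % 1-by-M row of objective values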
The decision variables can be mapped onto an $n$-dimensional space called the decision variable space. The $M$ objectives of the MOOP introduce an additional space of $M$ dimensions called the objective space. The mapping takes place from the $n$-dimensional decision variable space to the $M$-dimensional objective space. Fig. 1 illustrates the two spaces and the mapping between them. It is clear from this figure that not all solutions in the decision space are feasible; they are restricted by the constraints and bound limits. As a result, the objective values mapped from those decision variables will also be constrained.

In the objective space, a solution $f_m(Z_a)$ is said to dominate another solution $f_m(Z_b)$ if
1) $f_m(Z_a)$ is no worse than $f_m(Z_b)$ in all $m$ objectives, and
2) $f_m(Z_a)$ is strictly better than $f_m(Z_b)$ in at least one of the $m$ objectives.

III. NON-DOMINATED SORTING GENETIC ALGORITHM (NSGA-II)

In NSGA-II, the feasible solutions found in the objective space are made to compete with each other using the dominance concept, and the best collection of solutions, termed non-dominated solutions or Pareto optimal solutions, is found. The curve formed by joining these solutions is called the Pareto optimal front. The Pareto optimal front of a two-objective minimization problem is shown in the objective space of Fig. 1.

The overall block diagram of the NSGA-II algorithm is shown in Fig. 2. A randomly generated initial population gives $N$ candidate settings of the decision variables. Similarly, the $N$ solutions of the old generation, i.e. the previous iteration, are combined with them to make a pool of $2N$ solutions. Out of these $2N$ solutions, the best $N$ are sorted out using non-dominated sorting as well as crowding distance sorting. The convergence of the algorithm is ensured through the non-dominated sorting procedure, which ranks the solutions based on dominance, while crowding distance sorting is done for the sake of maintaining diversity. This helps the algorithm explore the whole search space and ensures that the obtained optimal point is not a local optimum. The survival selection keeps the sorted best $N$ solutions by removing the worst solutions, and these solutions are subjected to crossover and mutation to generate the next-generation population of decision variables. After the fitness evaluation, this new population is again merged with the old population to continue the next iteration. The process continues updating the solutions at each iteration, leaving behind the non-dominated solutions at the end of the iterations.
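The dominance conditions and the crowding distance sorting described above translate directly into code. The MATLAB sketch below (e.g. placed in separate function files or as local functions) is a minimal illustration for minimization problems; the paper does not publish its source, so these function names and this particular realization are assumptions.

% Dominance check for minimization: does objective row Fa dominate Fb?
function flag = dominates(Fa, Fb)
    % Condition 1): no worse in all objectives; condition 2): strictly
    % better in at least one objective.
    flag = all(Fa <= Fb) && any(Fa < Fb);
end

% Crowding distance of N solutions whose objective values form the
% N-by-M matrix F. Boundary solutions get an infinite distance so they
% are always retained, preserving the spread of the front.
function d = crowdingDistance(F)
    [N, M] = size(F);
    d = zeros(N, 1);
    for m = 1:M
        [vals, idx] = sort(F(:, m));      % sort along objective m
        d(idx(1))   = Inf;
        d(idx(end)) = Inf;
        span = vals(end) - vals(1);
        if span == 0, span = 1; end       % guard against division by zero
        for k = 2:N-1
            d(idx(k)) = d(idx(k)) + (vals(k+1) - vals(k-1)) / span;
        end
    end
end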

Fig. 2: NSGA-II Overall Block Diagram
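A hedged MATLAB skeleton of the generational loop in Fig. 2 is given below. Since the paper does not publish code, the helpers evalObjectives, nonDominatedSort, tournamentSelect, sbxCrossover and polynomialMutation are hypothetical placeholders for the blocks in the diagram; crowdingDistance is the sketch given earlier, and N, n, maxGen, lb and ub are assumed to be set beforehand. This is one possible arrangement of the blocks, not the authors' implementation.

% One possible arrangement of the NSGA-II loop in Fig. 2 (minimization).
% evalObjectives, nonDominatedSort, tournamentSelect, sbxCrossover and
% polynomialMutation are assumed helper functions; N, n, maxGen, lb, ub
% are assumed to be defined already.
P  = lb + rand(N, n) .* (ub - lb);         % random initial population
Fp = evalObjectives(P);                    % N-by-M objective values
for gen = 1:maxGen
    % Variation: create an offspring population Q of size N.
    parents = tournamentSelect(P, Fp, N);
    Q  = polynomialMutation(sbxCrossover(parents), lb, ub);
    Fq = evalObjectives(Q);
    % Merge old and new populations into a pool of 2N solutions.
    R  = [P; Q];
    Fr = [Fp; Fq];
    % Rank by non-dominated sorting, then break ties by crowding distance.
    rank = nonDominatedSort(Fr);           % front index per solution
    dist = crowdingDistance(Fr);
    [~, order] = sortrows([rank, -dist]);  % lower rank first, larger distance first
    keep = order(1:N);                     % survival selection of the best N
    P  = R(keep, :);
    Fp = Fr(keep, :);
end
% After the final generation, the rank-1 members of P approximate the
% Pareto optimal front.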

IV. SIMULATION RESULTS AND DISCUSSIONS

Even though the NSGA-II algorithm is widely discussed among EC techniques, its implementation is not yet standardized. Even if the pseudo-code of the algorithm remains the same, authors have used different types of crossover and mutation operators, selection procedures, constraint handling approaches, etc. Figs. 3, 4 and 5 show the standard test functions [10] on the left and the corresponding Pareto optimal front results obtained on the right for the purpose of comparison. The results obtained using the implemented algorithm are observed to be similar to the standard test results, which validates the accuracy of the algorithm. Mathematical modelling of any multi-objective optimization problem, followed by replacing the problem parameters in the current algorithm, is also expected to give similar accuracy. The Pareto optimal front thus obtained can be used at the decision-making stages of the respective multi-objective problem.

A. Simulation Setup

In order to get an unbiased comparison of CPU times, all the test function evaluations are performed for 100 iterations and a population count of 50 on the same PC, with the specifications shown in Table I.

TABLE I: Specifications of the PC used for simulations

Name               Specifications
Hardware
  Processor        AMD Ryzen 3 3200U
  RAM              4.00 GB
  Frequency        2.6 GHz
Software
  Operating System Windows 10
  Language         MATLAB R2020a

B. Unconstrained Standard Test Functions

The NSGA-II algorithm is tested on some unconstrained standard test functions to check its performance. The test functions considered here are the Fonseca-Fleming function, the Kursawe function, Schaffer function 1, Schaffer function 2, ZDT1 and ZDT3. Figs. 3 and 4 show the standard Pareto optimal solutions of these unconstrained test functions together with the simulation results. Each of the test functions is defined as follows:

1. Fonseca-Fleming function [11]: This is an unconstrained n-variable two-objective test problem. The function is defined as follows:
Minimize $F(f_1(z), f_2(z))$, where
$f_1(z) = 1 - \exp\left[-\sum_{i=1}^{n} \left(z_i - 1/\sqrt{n}\right)^2\right]$
$f_2(z) = 1 - \exp\left[-\sum_{i=1}^{n} \left(z_i + 1/\sqrt{n}\right)^2\right]$
Bound constraint: $-4 \leq z_i \leq 4$, $1 \leq i \leq n$

2. Kursawe function [12]: This is an unconstrained three-variable two-objective test problem. The function is defined as follows:
Minimize $F(f_1(z), f_2(z))$, where
$f_1(z) = \sum_{i=1}^{2} \left[-10\exp\left(-0.2\sqrt{z_i^2 + z_{i+1}^2}\right)\right]$
$f_2(z) = \sum_{i=1}^{3} \left[|z_i|^{0.8} + 5\sin(z_i^3)\right]$
Bound constraint: $-5 \leq z_i \leq 5$, $1 \leq i \leq 3$

3. Schaffer function 1 [13]: This is an unconstrained single-variable two-objective test problem. The function is defined as follows:
Minimize $F(f_1(z), f_2(z))$, where
$f_1(z) = z^2$
$f_2(z) = (z - 2)^2$
Bound constraint: $-A \leq z \leq A$, where values of $A$ from 10 to $10^5$ are used successfully.

4. Schaffer function 2: This is a modified test function of Schaffer function 1. The function is defined as follows:
Minimize $F(f_1(z), f_2(z))$, where
$f_1(z) = \begin{cases} -z & \text{if } z \leq 1 \\ z - 2 & \text{if } 1 < z \leq 3 \\ 4 - z & \text{if } 3 < z \leq 4 \\ z - 4 & \text{if } z > 4 \end{cases}$
$f_2(z) = (z - 5)^2$
Bound constraint: $-5 \leq z \leq 10$

Fig. 3: Simulation results of standard test functions. (a) Expected Pareto optimal front of the Fonseca-Fleming function. (b) Obtained Pareto optimal front of the Fonseca-Fleming function. (c) Expected Pareto optimal front of the Kursawe function. (d) Obtained Pareto optimal front of the Kursawe function. (e) Expected Pareto optimal front of Schaffer function 1. (f) Obtained Pareto optimal front of Schaffer function 1. (g) Expected Pareto optimal front of Schaffer function 2. (h) Obtained Pareto optimal front of Schaffer function 2.
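Since the paper evaluates these benchmarks in MATLAB, the sketch below writes definitions 1-4 from Section IV-B, whose fronts are shown in Fig. 3, as anonymous functions. It is an illustrative transcription of the formulas above, not the authors' code.

% Fonseca-Fleming (z is a 1-by-n row vector, -4 <= z_i <= 4).
fonseca = @(z) [1 - exp(-sum((z - 1/sqrt(numel(z))).^2)), ...
                1 - exp(-sum((z + 1/sqrt(numel(z))).^2))];

% Kursawe (z is a 1-by-3 row vector, -5 <= z_i <= 5).
kursawe = @(z) [sum(-10*exp(-0.2*sqrt(z(1:2).^2 + z(2:3).^2))), ...
                sum(abs(z).^0.8 + 5*sin(z.^3))];

% Schaffer function 1 (scalar z).
schaffer1 = @(z) [z^2, (z - 2)^2];

% Schaffer function 2 (scalar z), with the piecewise first objective.
schaffer2f1 = @(z) -z*(z <= 1) + (z - 2)*(z > 1 & z <= 3) + ...
                   (4 - z)*(z > 3 & z <= 4) + (z - 4)*(z > 4);
schaffer2 = @(z) [schaffer2f1(z), (z - 5)^2];

% Example evaluation at a random point within the Kursawe bounds:
zk = -5 + 10*rand(1, 3);
Fk = kursawe(zk);                 % returns [f1, f2]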

Fig. 4: Simulation results of standard test functions. (a) Expected Pareto optimal front of the ZDT1 function. (b) Obtained Pareto optimal front of the ZDT1 function. (c) Expected Pareto optimal front of the ZDT3 function. (d) Obtained Pareto optimal front of the ZDT3 function.
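The "obtained" panels in Figs. 3-5 are scatter plots of the final non-dominated set in objective space. A minimal MATLAB plotting sketch is shown below, assuming paretoFront is an N-by-2 matrix of objective values produced by the implemented algorithm (a hypothetical variable name).

% Plot an obtained two-objective Pareto front (paretoFront is N-by-2).
scatter(paretoFront(:, 1), paretoFront(:, 2), 20, 'filled');
xlabel('f_1');  ylabel('f_2');
title('Obtained Pareto optimal front');
grid on;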

5. ZDT1 [14]: Zitzler–Deb–Thiele's function 1 is an unconstrained 30-variable two-objective test problem. The function is defined as follows:
Minimize $F(f_1(z), f_2(z))$, where
$f_1(z) = z_1$
$f_2(z) = g(z)\,h(f_1(z), g(z))$
$g(z) = 1 + (9/29)\sum_{i=2}^{30} z_i$
$h(f_1(z), g(z)) = 1 - \sqrt{f_1(z)/g(z)}$
Bound constraint: $0 \leq z_i \leq 1$, $1 \leq i \leq 30$

6. ZDT3: Zitzler–Deb–Thiele's function 3 is a modified test problem of ZDT1. The function is defined as follows:
Minimize $F(f_1(z), f_2(z))$, where
$f_1(z) = z_1$
$f_2(z) = g(z)\,h(f_1(z), g(z))$
$g(z) = 1 + (9/29)\sum_{i=2}^{30} z_i$
$h(f_1(z), g(z)) = 1 - \sqrt{f_1(z)/g(z)} - (f_1(z)/g(z))\sin(10\pi f_1(z))$
Bound constraint: $0 \leq z_i \leq 1$, $1 \leq i \leq 30$
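As with the earlier benchmarks, ZDT1 and ZDT3 can be transcribed directly from the definitions above. The MATLAB lines below are an illustrative transcription (z is a 1-by-30 row vector with components in [0, 1]), not the authors' code.

% ZDT1 and ZDT3 (z is a 1-by-30 row vector, 0 <= z_i <= 1).
gfun = @(z) 1 + (9/29)*sum(z(2:end));
zdt1 = @(z) [z(1), gfun(z)*(1 - sqrt(z(1)/gfun(z)))];
zdt3 = @(z) [z(1), gfun(z)*(1 - sqrt(z(1)/gfun(z)) ...
                              - (z(1)/gfun(z))*sin(10*pi*z(1)))];

% Example evaluation:
z  = rand(1, 30);
F1 = zdt1(z);     % [f1, f2] for ZDT1
F3 = zdt3(z);     % [f1, f2] for ZDT3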
C. Constrained Standard Test Functions

The same NSGA-II algorithm tested using the unconstrained functions is now tested on some constrained standard test functions to verify its performance. The test functions considered here are the Binh & Korn function and the Chankong & Haimes function. Fig. 5 shows these test functions along with the obtained results. The details of these test functions are as follows:

1. Binh & Korn function [15]: This is a constrained two-variable two-objective test problem. The function is defined as follows:
Minimize $F(f_1(z), f_2(z))$, where
$f_1(z) = 4z_1^2 + 4z_2^2$
$f_2(z) = (z_1 - 5)^2 + (z_2 - 5)^2$
Bound constraints: $0 \leq z_1 \leq 5$, $0 \leq z_2 \leq 3$
Inequality constraints: $(z_1 - 5)^2 + z_2^2 \leq 25$ and $(z_1 - 8)^2 + (z_2 + 3)^2 \geq 7.7$

2. Chankong and Haimes function [16]: This is a constrained two-variable two-objective test problem. The function is defined as follows:
Minimize $F(f_1(z), f_2(z))$, where
$f_1(z) = 2 + (z_1 - 2)^2 + (z_2 - 1)^2$
$f_2(z) = 9z_1 - (z_2 - 1)^2$
Bound constraints: $-20 \leq z_1 \leq 20$, $-20 \leq z_2 \leq 20$
Inequality constraints: $z_1^2 + z_2^2 \leq 225$ and $z_1 - 3z_2 + 10 \leq 0$
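The paper does not state which constraint handling approach was used, so the sketch below only illustrates one common option: evaluating the Binh & Korn objectives together with a total constraint violation that a constrained-domination rule (as in [8]) could compare. The names and this choice of handling are assumptions, not the authors' method.

% Binh & Korn objectives (z = [z1 z2], with 0 <= z1 <= 5 and 0 <= z2 <= 3).
binhKorn  = @(z) [4*z(1)^2 + 4*z(2)^2, (z(1) - 5)^2 + (z(2) - 5)^2];
% Both inequality constraints rewritten in the g(z) <= 0 form of Section II.
binhKornG = @(z) [ (z(1) - 5)^2 + z(2)^2 - 25;          % g1(z) <= 0
                   7.7 - (z(1) - 8)^2 - (z(2) + 3)^2 ]; % g2(z) <= 0

% Total constraint violation: zero for feasible points, positive otherwise.
violation = @(z) sum(max(binhKornG(z), 0));

% One common handling (not necessarily the authors'): under constrained
% domination a feasible solution beats an infeasible one, and two
% infeasible solutions are compared by their total violation.
zA = [2 1];                    % feasible: violation(zA) returns 0
zB = [0 3];                    % violates g1: violation(zB) returns 9
isFeasibleB = violation(zB) == 0;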

Fig. 5: Simulation results of constrained test functions. (a) Expected Pareto optimal front of the Binh & Korn function. (b) Obtained Pareto optimal front of the Binh & Korn function. (c) Expected Pareto optimal front of the Chankong and Haimes function. (d) Obtained Pareto optimal front of the Chankong and Haimes function.

V. CONCLUSION

In this paper, a multi-objective optimization algorithm was implemented considering the minimization of two objective functions using the Non-dominated Sorting Genetic Algorithm (NSGA-II). The developed NSGA-II algorithm was validated using various unconstrained standard test functions, namely the Fonseca-Fleming function, the Kursawe function, Schaffer function 1, Schaffer function 2, ZDT1 and ZDT3, and also using constrained standard test functions, namely the Binh & Korn function and the Chankong & Haimes function. The obtained results are found to match the standard test results, thus proving the reliability of the algorithm. Hence, the developed algorithm can be incorporated for the optimization of practical MOO problems with suitable mathematical modelling.

VI. ACKNOWLEDGMENT

The work is carried out under the project "Developing Smart Controller for Optimum Utilization of Energy and Trustworthy Management in a Micro Grid Environment-IMP/2019/000251" funded by IMPacting Research, INnovation and Technology (IMPRINT) 2C.1 from the Science and Engineering Research Board, Government of India.

REFERENCES

[1] Y. Tang, Z. Wang, H. Gao, S. Swift and J. Kurths, "A Constrained Evolutionary Computation Method for Detecting Controlling Regions of Cortical Networks," in IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 9, no. 6, pp. 1569-1581, Nov.-Dec. 2012.
[2] M. B. Shadmand and R. S. Balog, "Multi-Objective Optimization and Design of Photovoltaic-Wind Hybrid System for Community Smart DC Microgrid," in IEEE Transactions on Smart Grid, vol. 5, no. 5, pp. 2635-2643, Sept. 2014.
[3] R. T. Marler and J. S. Arora, "The weighted sum method for multi-objective optimization: new insights," in Structural and Multidisciplinary Optimization, vol. 41, no. 6, pp. 853-862, Jul. 2010.
[4] Z. Fan et al., "An improved epsilon constraint handling method embedded in MOEA/D for constrained multi-objective optimization problems," in 2016 IEEE Symposium Series on Computational Intelligence (SSCI), pp. 1-8, 2016.
[5] Nour Alsana R and Kamali Ardakani M, "A weighted metric method to optimize multi-response robust problems," in Journal of Industrial Engineering International, vol. 5, no. 8, pp. 10-19, Jan. 2009.
[6] Yu. G. Evtushenko and M. A. Posypkin, "A deterministic algorithm for global multi-objective optimization," in Optimization Methods and Software, vol. 29, no. 5, pp. 1005-1019, Nov. 2013.
[7] Haibo Yu, Chao Zhang, Zuqiang Deng, Haifeng Bian and Chen Jia, "Economic optimization for configuration and sizing of micro integrated energy systems," in Journal of Modern Power Systems and Clean Energy, vol. 6, pp. 330-341, 2018.
[8] K. Deb et al., "A fast and elitist multiobjective genetic algorithm: NSGA-II," in IEEE Transactions on Evolutionary Computation, vol. 6, no. 2, pp. 182-197, 2002.
[9] X. Zhao et al., "Optimizing security and quality of service in a real-time database system using multi-objective genetic algorithm," in Expert Systems with Applications, vol. 64, pp. 11-23, 2016.
[10] S. Mirjalili, S. Mirjalili, S. Saremi et al., "Grasshopper optimization algorithm for multi-objective optimization problems," in Applied Intelligence, vol. 48, pp. 805-820, 2018.
[11] M. T. Sebastiani, R. Lüders and K. V. O. Fonseca, "Evaluating Electric Bus Operation for a Real-World BRT Public Transportation Using Simulation Optimization," in IEEE Transactions on Intelligent Transportation Systems, vol. 17, no. 10, pp. 2777-2786, Oct. 2016.
[12] W. J. Lim, A. B. Jambek and S. C. Neoh, "Kursawe and ZDT functions optimization using hybrid micro genetic algorithm," in Soft Computing, vol. 19, no. 12, pp. 3571-3580, 2015.
[13] G. Zhu and S. Kwong, "Gbest-guided artificial bee colony algorithm for numerical function optimization," in Applied Mathematics and Computation, vol. 217, no. 7, pp. 3166-3173, 2010.
[14] J. P. A. Costa et al., "An adaptive algorithm for updating populations on SPEA2," in Simpósio Brasileiro de Automação Inteligente (SBAI), pp. 78-83, Oct. 2017.
[15] Y. Wang, Z. Cai, G. Guo and Y. Zhou, "Multiobjective Optimization and Hybrid Evolutionary Algorithm to Solve Constrained Optimization Problems," in IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 37, no. 3, pp. 560-575, June 2007.
[16] V. Chankong and Y. Y. Haimes, Multiobjective Decision Making: Theory and Methodology, Courier Dover Publications, 2008.
