Multicriterial Optimization Using Genetic Algorithm
[Figure: Best Fitness and Mean Fitness versus Generations (0 to 600); fitness axis from 130 to 180]
Then x* is the global solution(s), f is the objective
function, and the set Ω is the feasible region (Ω ⊂ S).
[Figure: objective space of F = [f1, f2], showing local optimums and the global optimum; best solutions highlighted among ”normal” solutions (+)]
Multicriterial optimization
The Multiobjective Optimization Problem, also called the
multicriteria optimization or vector optimization problem,
can be defined (in words) as the problem of finding
a vector of decision variables which satisfies the constraints
and optimizes a vector function whose elements represent
the objective functions.
These functions form a mathematical description of
performance criteria, which are usually in conflict with each
other.
Hence the term ”optimize” means finding a solution
which would give values of all the objective functions
acceptable to the decision maker.
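In other words, an MOP consists of a decision vector x, a feasibility test for Ω, and a vector objective F(x) = [f1(x), f2(x)]. A minimal sketch in Python, with two illustrative conflicting objectives (x² and (x − 2)², chosen here for illustration, not taken from the slides):

```python
# A minimal sketch of a multiobjective problem: a decision variable x,
# a constraint check for the feasible region Omega, and a vector-valued
# objective F(x) = [f1(x), f2(x)]. The two objectives conflict:
# minimizing f1 pushes x toward 0, minimizing f2 pushes x toward 2.

def feasible(x):
    """Constraint: x must lie in the feasible region Omega = [0, 2]."""
    return 0.0 <= x <= 2.0

def F(x):
    """Vector objective F(x) = [f1(x), f2(x)]."""
    f1 = x ** 2            # pulls the optimum toward x = 0
    f2 = (x - 2.0) ** 2    # pulls the optimum toward x = 2
    return [f1, f2]
```

Any feasible x in [0, 2] trades f1 against f2; no single x minimizes both, which is exactly why the decision maker must judge which compromise is acceptable.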
where p < n
and n is the size of the decision vector.
Minimization and maximization are interchangeable, since min{ f(x) } = −max{ −f(x) }.
[Figure 1.1: a single optimal solution x* in decision space, mapped to the objective values f1(x*) and f2(x*)]
[Figure: objective space of F = [f1, f2], with the Pareto Optimal Set highlighted among ”normal” solutions (+)]
Multicriterial Optimization Problem
Pareto Optimality
[Figure: objective functions F = [f1, f2] mapped into objective space, with the Pareto Front marked]
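The Pareto Optimal Set can be extracted with a plain dominance test. A minimal sketch in Python (minimization is assumed, and the function names are illustrative):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return (all(ai <= bi for ai, bi in zip(a, b))
            and any(ai < bi for ai, bi in zip(a, b)))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points)]
```

For example, among the vectors [1, 5], [2, 2], [5, 1], [3, 3], only [3, 3] is dominated (by [2, 2]); the other three form the front.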
Multicriterial Optimization Problem
Global Optimization
Defining an MOP global optimum is not a trivial task, as the
”best” compromise solution depends heavily on the
specific preferences (or biases) of the (human) decision
maker.
Solutions may also have some temporal dependencies (e.g.
acceptable resource expenditures may vary from month to
month).
Thus, there is no universally accepted definition of the MOP
global optimization problem.
(Nevertheless, more and more problem-specific
solutions are being implemented...)
General Optimization Algorithms
Overview
General search and optimization techniques are classified
into three categories: enumerative, deterministic and
stochastic (random). (Figure 1.11 on next page)
As many real-world problems are computationally intensive,
some means of limiting the search space must be
implemented to find ”acceptable solutions” in ”acceptable
time” (Michalewicz and Fogel 2000).
Deterministic algorithms attempt this by incorporating
problem-domain knowledge. Many graph/tree search
algorithms are known and applied.
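As an illustration of domain knowledge steering a deterministic search, here is a minimal greedy best-first tree search in Python; the heuristic (an assumed, problem-specific estimate of distance to the goal) decides which node to expand next and thereby limits the explored space:

```python
import heapq

def best_first_search(start, goal, neighbors, heuristic):
    """Greedy best-first search: a deterministic strategy that uses
    problem-domain knowledge (the heuristic) to pick which node to
    expand next, instead of enumerating the whole space."""
    frontier = [(heuristic(start), start)]   # min-heap ordered by heuristic
    came_from = {start: None}                # also serves as the visited set
    while frontier:
        _, node = heapq.heappop(frontier)
        if node == goal:
            # Reconstruct the path back to the start.
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for nxt in neighbors(node):
            if nxt not in came_from:
                came_from[nxt] = node
                heapq.heappush(frontier, (heuristic(nxt), nxt))
    return None  # goal unreachable
```

For instance, searching the integers 0..10 with neighbors n ± 1 and heuristic |n − goal| walks straight from 0 to 7 without detours.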
[Figure: genetic algorithm operators: parents with fitness values (0.15, 3.32, 1.83, 7.54, 2.00, 6.12) produce children via crossover and mutation; Pareto ranking in objective space (Objective 1) assigns Rank = 1 to the Pareto Front, then Rank = 2 and Rank = 3 to successive fronts]
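The crossover and mutation steps pictured above can be sketched as follows. This is one possible real-coded variant in Python; one-point crossover and Gaussian mutation are assumed operator choices for illustration, not necessarily the exact operators used in the slides:

```python
import random

def crossover(p1, p2, rng):
    """One-point crossover: splice two real-coded parents into a child."""
    cut = rng.randrange(1, len(p1))   # cut point strictly inside the vector
    return p1[:cut] + p2[cut:]

def mutate(child, rng, rate=0.1, sigma=0.5):
    """Gaussian mutation: perturb each gene with probability `rate`."""
    return [g + rng.gauss(0.0, sigma) if rng.random() < rate else g
            for g in child]

rng = random.Random(42)
parents = [[0.15, 3.32, 1.83], [7.54, 2.00, 6.12]]  # illustrative values
child = mutate(crossover(parents[0], parents[1], rng), rng)
```

Note that the child always inherits its first gene from the first parent and its last gene from the second, since the cut point lies strictly inside the vector.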
MOGA Optimization Algorithms
Genetic Algorithm
[Figure: Pareto ranking in objective space (Objective_1), from Rank = 1 at the Pareto Front up to Rank_MAX]
[Figure: population distribution along the Pareto Front in objective space (Objective_1)]
Genetic Drift Break with Fitness Correction
[Figure: fitness correction with niching: Objective_1 and Objective_2 are normalized to [0,1]×[0,1] (grid steps ∆1, ∆2); the niche count of a solution is computed from distances to neighbours within σshare; Σ(Niche Count) = 1/Wi; rank values (0 0 0 1 1 2 2 2 3 3 3) shown along the front]
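The niche-count correction can be sketched in Python as follows. The triangular sharing function sh(d) = 1 − d/σshare (the Goldberg and Richardson style) is an assumption here, since the exact correction behind Σ(Niche Count) = 1/Wi is not spelled out on the slide:

```python
import math

def niche_count(i, points, sigma_share):
    """Niche count m_i = sum over j of sh(d_ij), where the sharing
    function is sh(d) = 1 - d/sigma_share for d < sigma_share, else 0."""
    total = 0.0
    for p in points:
        d = math.dist(points[i], p)
        if d < sigma_share:
            total += 1.0 - d / sigma_share
    return total

def shared_fitness(fitness, points, sigma_share):
    """Divide each raw fitness by its niche count, penalizing solutions
    in crowded regions so the population spreads along the front."""
    return [f / niche_count(i, points, sigma_share)
            for i, f in enumerate(fitness)]
```

With two coincident points and one far-away point, the coincident pair each get niche count 2 and see their fitness halved, while the isolated point keeps its raw fitness; this is what counteracts genetic drift toward a single cluster.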
f1(x) = ( x − 2 )²
MOP-2

f1(x, y) = x

f2(x, y) = (1 + 10y) · [ 1 − ( x / (1 + 10y) )² − ( x / (1 + 10y) ) · sin(8πx) ]

where 0 ≤ x, y ≤ 1
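MOP-2 can be coded directly from the formulas above. A minimal sketch in Python (the name mop2 is illustrative):

```python
import math

def mop2(x, y):
    """MOP-2 test problem: f1(x, y) = x and a multimodal f2
    driven by the sin(8*pi*x) term, for 0 <= x, y <= 1."""
    g = 1.0 + 10.0 * y                      # shared factor (1 + 10y)
    f1 = x
    f2 = g * (1.0 - (x / g) ** 2 - (x / g) * math.sin(8.0 * math.pi * x))
    return f1, f2
```

At (x, y) = (0, 0) this gives (f1, f2) = (0, 1); the sine term then carves several local dips in f2 along x, which is what makes the problem multimodal.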
[Figure: MOP-2 solutions, normal and with Drift Break]
MOP-3

f1(x, y) = −( 1 + (A1 − B1)² + (A2 − B2)² )

f2(x, y) = −( 1 + (x + 3)² + (y + 1)² )
[Figure: MOP-3 solutions, normal and with Drift Break]
MOP-4

f1(x) = 1 − exp( − ∑_{i=1}^{n} ( xi − 1/√n )² )

f2(x) = 1 − exp( − ∑_{i=1}^{n} ( xi + 1/√n )² )

where i = 1, 2 and −4 ≤ xi ≤ 4
[Figure: MOP-4 solutions, normal and with Drift Break]
Questions?