MOGA Unit 5

Multi-objective genetic algorithms (MOGAs) are used to find multiple optimal solutions to multi-modal function optimization problems. MOGAs employ techniques like mutation, preselection, and crowding to maintain diversity in the population and discover diverse optimal solutions. The MOGA algorithm assigns fitness based on dominance rank, calculates niche counts to measure crowding, and applies sharing to assign shared fitness while preserving the average fitness per rank. MOGAs can efficiently solve multi-objective problems but require properly setting the niche size parameter and may not perfectly scale fitness between ranks.


Presentation on

Multi-Objective GA Optimization
Multi-modal Function Optimization
• The objective in a multi-modal function optimization problem is to find multiple optimal solutions having either equal or unequal objective values (in addition to the global optimum solution, the local optimum solutions are also found).
• Difficulties with classical methods:
• A classical optimization method has to be applied a number of times, each time starting from a different initial guess solution.
• Different initial guess solutions do not guarantee finding different optimal solutions.
Diversity through Mutation:
• The mutation operator is often used as a diversity-preserving operator.
• It has a constructive as well as a destructive effect.
• Just as it can create a better solution by perturbing an existing one, it can also destroy a good solution.
• Mutation is therefore always applied with a small probability (a sketch follows below).
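A minimal sketch of bit-wise mutation with a small probability, assuming a binary-string representation (the function name and the rate p_mut are illustrative, not from the slides):

```python
import random

def bitwise_mutation(chromosome, p_mut=0.01):
    """Flip each bit independently with a small probability p_mut.

    A low rate keeps the destructive effect rare while still
    injecting diversity into the population.
    """
    return [1 - gene if random.random() < p_mut else gene
            for gene in chromosome]

# Example: mutate a 10-bit individual
parent = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
child = bitwise_mutation(parent, p_mut=0.05)
```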
Preselection:
• Replacing an individual with a similar individual is the main concept of the preselection operator.
• Replacing a parent with its own offspring allows many different solutions to co-exist in the population (see the sketch below).
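A rough sketch of the preselection idea, assuming a user-supplied fitness function to be maximized: the offspring competes only with its own, genetically similar, parent for a place in the population.

```python
def preselection_replace(parent, offspring, fitness):
    """Keep the offspring only if it is at least as fit as its parent."""
    return offspring if fitness(offspring) >= fitness(parent) else parent
```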
Crowding model:
• Crowding of solutions anywhere in the search space is discouraged, thereby providing the diversity needed to maintain multiple optimal solutions.
• In this GA, only a proportion G (the generation gap) of the population is permitted to reproduce in each generation.
• When an offspring is to be introduced into the overlapping population, CF (crowding factor) solutions are chosen from the population at random.
• The offspring is compared with these CF solutions, and the solution most similar to the offspring is replaced by it (see the sketch below).
• Typical values: G = 0.1, CF = 2 or 3.
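A sketch of the crowding-based replacement step, assuming binary strings and Hamming distance as the similarity measure (all names are illustrative):

```python
import random

def hamming(a, b):
    """Number of differing bits; smaller means more similar."""
    return sum(x != y for x, y in zip(a, b))

def crowding_replace(population, offspring, cf=3):
    """Pick CF members at random and replace the one most similar to the offspring."""
    candidates = random.sample(range(len(population)), cf)
    most_similar = min(candidates,
                       key=lambda idx: hamming(population[idx], offspring))
    population[most_similar] = offspring
```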
Sharing Function Model:
• Goldberg and Richardson suggested another concept: instead of replacing a solution with a similar one, the focus is on degrading the fitness of similar solutions.
• Shared fitness: F_i' = f_i / m_i, where m_i is the niche count of solution i.
• Since, in practice, the number of solutions belonging to each optimum cannot be counted exactly, a sharing function is used to estimate it.
• The niche count provides an estimate of the extent of crowding near a solution (a code sketch follows below).
• Niche count of the i-th solution:

nc_i = \sum_{j=1}^{N} Sh(d_{ij})
MOGA Algorithm
1. A fitness is assigned to each solution of the initial population, which is created with random values of the variables.
2. A stochastic universal selection operator, single-point crossover and bit-wise mutation are applied to form a new population.
3. Choose a σ_share. Initialize μ(j) = 0 for all possible ranks j = 1, …, N. Set the solution counter i = 1.
4. Calculate the number of solutions (n_i) that dominate solution i. Compute the rank of the i-th solution as
   r_i = 1 + n_i
   Increment the count of the number of solutions in rank r_i by one, that is,
   μ(r_i) = μ(r_i) + 1
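A sketch of the dominance-rank assignment of steps 3–4 for a minimization problem (the helper names and the list-of-objective-vectors representation are assumptions):

```python
def dominates(a, b):
    """True if objective vector a dominates b (minimization in every objective)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def assign_ranks(objectives):
    """Return r_i = 1 + n_i for every solution and the count mu[r] per rank."""
    N = len(objectives)
    ranks = []
    for i in range(N):
        n_i = sum(dominates(objectives[j], objectives[i])
                  for j in range(N) if j != i)
        ranks.append(1 + n_i)
    mu = {r: ranks.count(r) for r in set(ranks)}
    return ranks, mu
```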
5. If i < N, increment i by one and go to Step 4. Otherwise, go to Step 6.
6. Identify the maximum rank r* by checking the largest r_i which has μ(r_i) > 0. Sorting according to rank and fitness-averaging yields the following assignment of the average fitness to any solution i:

   F_i = N - \sum_{k=1}^{r_i - 1} \mu(k) - 0.5\,(\mu(r_i) - 1),   i = 1, …, N.
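A sketch of the rank-based fitness averaging of Step 6, using the ranks and mu produced in the previous sketch (all names hypothetical); every solution of the same rank receives the same averaged fitness:

```python
def average_fitness(ranks, mu, N):
    """F_i = N - sum_{k < r_i} mu(k) - 0.5 * (mu(r_i) - 1)."""
    fitness = []
    for r in ranks:
        better = sum(mu.get(k, 0) for k in range(1, r))   # solutions in better ranks
        fitness.append(N - better - 0.5 * (mu[r] - 1))
    return fitness
```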
7. For each solution i in rank r, calculate the niche count nc_i with respect to the other solutions of the same rank, using the following equations.
Normalized distance between any two solutions i and j in a rank:

   d_{ij} = \sqrt{ \sum_{k=1}^{M} \left( \frac{f_k^{(i)} - f_k^{(j)}}{f_k^{\max} - f_k^{\min}} \right)^2 }

Sharing function:

   Sh(d_{ik}) = \begin{cases} 1 - d_{ik}/\sigma_{share}, & \text{if } d_{ik} \le \sigma_{share}; \\ 0, & \text{otherwise.} \end{cases}

Niche count of solution i:

   nc_i = \sum_{k=1}^{\mu(r)} Sh(d_{ik})
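A sketch of the objective-space niche counting of Step 7 within one rank, assuming the objective vectors of that rank and the per-objective minimum and maximum values are available (all names are illustrative):

```python
import math

def normalized_distance(fi, fj, f_min, f_max):
    """Distance d_ij in objective space, each objective scaled by its range."""
    return math.sqrt(sum(((a - b) / (hi - lo)) ** 2
                         for a, b, lo, hi in zip(fi, fj, f_min, f_max)))

def niche_counts(front, f_min, f_max, sigma_share):
    """nc_i = sum of Sh(d_ik) over the solutions k of the same rank."""
    counts = []
    for fi in front:
        nc = 0.0
        for fk in front:
            d = normalized_distance(fi, fk, f_min, f_max)
            if d <= sigma_share:
                nc += 1.0 - d / sigma_share
        counts.append(nc)
    return counts
```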
• Calculate the shared fitness using:

   F_j' = F_j / nc_j

• To preserve the same average fitness, scale the shared fitness as follows:

   F_j' \leftarrow \frac{F_j \, \mu(r)}{\sum_{k=1}^{\mu(r)} F_k'} \, F_j'
8. If r < r*, increment r by one and go to Step 7. Otherwise, the process is complete.
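A sketch of the shared-fitness computation and the rescaling that preserves the rank's average fitness, building on the hypothetical helpers above (avg_fitness holds the averaged fitness F_j of the solutions in one rank, counts their niche counts):

```python
def scaled_shared_fitness(avg_fitness, counts):
    """F_j' = F_j / nc_j, then rescale so the rank's average fitness is preserved."""
    shared = [F / nc for F, nc in zip(avg_fitness, counts)]
    mu_r = len(avg_fitness)            # number of solutions mu(r) in this rank
    total = sum(shared)
    # F_j' <- (F_j * mu(r) / sum_k F_k') * F_j'
    return [(F * mu_r / total) * s for F, s in zip(avg_fitness, shared)]
```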
Advantages:
• The fitness assignment scheme is simple.
• Since niching is performed in the objective space, MOGA can be easily applied to any optimization problem.
• MOGA is a suitable choice if a spread of Pareto-optimal solutions is required.
Disadvantages:
• All the solutions in a non-dominated front need not end up with the same assigned fitness.
• The shared-fitness computation does not ensure that a solution in a poorer rank always has a worse scaled fitness than every solution in a better rank.
• The σ_share parameter needs to be fixed by the user.
