
Introduction

Myriad crossover operators have been developed over time. In this article, I'll discuss 13 such crossover operators, which can further be classified into 3 categories (based on the encoding of the parents) as follows:

Single Point Crossover

Step 1- Select two parents for mating.

Step 2- Select a crossover point at random and swap the bits to the right of the crossover point.

After crossover, the newly generated offspring look as follows:
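A minimal Python sketch of this operator, assuming the parents are equal-length lists of bits (the function name is illustrative):

import random

def single_point_crossover(p1, p2):
    # Choose a crossover point and swap the tails to the right of it.
    point = random.randint(1, len(p1) - 1)
    o1 = p1[:point] + p2[point:]
    o2 = p2[:point] + p1[point:]
    return o1, o2

# Example: single_point_crossover([0,0,0,0,0,0], [1,1,1,1,1,1]) might
# return ([0,0,1,1,1,1], [1,1,0,0,0,0]) if the point falls at index 2.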

Two-Point Crossover

Step 1- Select two parents for mating.


Step 2- Select two crossover points at random and swap the bits between them (the middle segment).

After crossover, the newly generated offspring look as follows:
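A sketch in the same spirit, again assuming equal-length lists of bits:

import random

def two_point_crossover(p1, p2):
    # Choose two crossover points and swap the middle segment.
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    o1 = p1[:a] + p2[a:b] + p1[b:]
    o2 = p2[:a] + p1[a:b] + p2[b:]
    return o1, o2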

Multi-Point Crossover

Step 1- Select two parents for mating.


Step 2- Select multiple crossover points at random and swap the bits in alternate segments.

After crossover, the newly generated offspring look as follows:
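A sketch that swaps every other segment between the chosen points; making the number of points a parameter is an assumption here:

import random

def multi_point_crossover(p1, p2, n_points=3):
    # Choose the crossover points and swap alternate segments.
    points = sorted(random.sample(range(1, len(p1)), n_points)) + [len(p1)]
    o1, o2 = list(p1), list(p2)
    swap, start = False, 0
    for point in points:
        if swap:
            o1[start:point], o2[start:point] = p2[start:point], p1[start:point]
        swap = not swap
        start = point
    return o1, o2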

Uniform Crossover
Step 1- Select two parents for mating.

Step 2- At each bit position of the parents, toss a coin (let H=1 and
T=0).

Step 3- Follow the algorithm below to generate both offspring:

if Toss=1,
then swap the bits
if Toss=0,
then don't swap

After crossover, the newly generated offspring look as follows:
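A minimal sketch of the coin-toss rule above, applied independently at every bit position:

import random

def uniform_crossover(p1, p2):
    o1, o2 = list(p1), list(p2)
    for i in range(len(p1)):
        if random.randint(0, 1) == 1:   # Toss = 1 (heads): swap the bits
            o1[i], o2[i] = o2[i], o1[i]
    return o1, o2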

Half-Uniform Crossover

Step 1- Select two parents for mating.

Step 2- Toss a coin (let H=1 and T=0) only if the corresponding bits in
P1 and P2 do not match.
Step 3- Follow the algorithm below to generate both offspring:

if Toss=1,
then swap the bits
if Toss=0,
then don't swap

After crossover, the newly generated offspring look as follows:
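The same idea in code, except that the coin is tossed only where the parents disagree:

import random

def half_uniform_crossover(p1, p2):
    o1, o2 = list(p1), list(p2)
    for i in range(len(p1)):
        # Toss a coin only if the corresponding bits differ.
        if p1[i] != p2[i] and random.randint(0, 1) == 1:
            o1[i], o2[i] = o2[i], o1[i]
    return o1, o2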

Uniform Crossover with Crossover Mask (CM)


Step 1- Select two parents for mating.

Step 2- Define a Crossover Mask (CM).

Step 3- Follow the algorithm below:

To generate first offspring O1
------------------------------
if CM=0,
then select P1 bit
if CM=1,
then select P2 bit

To generate second offspring O2
-------------------------------
if CM=0,
then select P2 bit
if CM=1,
then select P1 bit

After crossover, the newly generated offspring look as follows:
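A sketch assuming the Crossover Mask (CM) is given as a list of 0s and 1s of the same length as the parents:

def mask_crossover(p1, p2, cm):
    # CM=0: O1 takes the P1 bit, O2 the P2 bit; CM=1: the roles are reversed.
    o1 = [p1[i] if cm[i] == 0 else p2[i] for i in range(len(cm))]
    o2 = [p2[i] if cm[i] == 0 else p1[i] for i in range(len(cm))]
    return o1, o2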


Shuffle Crossover

Step 1- Select two parents for mating.

Step 2- Select a crossover point at random and shuffle the genes of both parents.

Note: Shuffle the genes to the left and to the right of the crossover point separately.

Step 3- Perform Single Point Crossover.

After crossover, the newly generated offspring look as follows:
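A sketch following the steps above: shuffle the left and right parts with the same permutation in both parents, then do a single-point crossover at the chosen point. (Some descriptions also un-shuffle the offspring afterwards with the inverse permutation; that step is omitted here because it is not mentioned above.)

import random

def shuffle_crossover(p1, p2):
    point = random.randint(1, len(p1) - 1)
    # Shuffle left and right sides separately, using the same permutation
    # for both parents so that positions still correspond.
    left = list(range(point))
    right = list(range(point, len(p1)))
    random.shuffle(left)
    random.shuffle(right)
    order = left + right
    s1 = [p1[i] for i in order]
    s2 = [p2[i] for i in order]
    # Single-point crossover on the shuffled parents.
    o1 = s1[:point] + s2[point:]
    o2 = s2[:point] + s1[point:]
    return o1, o2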

Matrix Crossover

Step 1- Select two parents for mating.


Step 2- Write the parents in a 2-D representation and then divide the resulting matrices into non-overlapping zones.

Step 3- Perform crossover based on the zones.

After crossover, the newly generated offspring look as follows:
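The zoning scheme depends on the figure in the original article, which is not reproduced here, so the sketch below simply splits each matrix vertically into two zones and exchanges the right zone; the matrix shape and the zoning are assumptions:

def matrix_crossover(p1, p2, rows, cols):
    # Write each parent as a rows x cols matrix.
    m1 = [list(p1[r * cols:(r + 1) * cols]) for r in range(rows)]
    m2 = [list(p2[r * cols:(r + 1) * cols]) for r in range(rows)]
    # Exchange one zone (here, the right half of every row).
    half = cols // 2
    for r in range(rows):
        m1[r][half:], m2[r][half:] = m2[r][half:], m1[r][half:]
    # Flatten back to chromosomes.
    o1 = [g for row in m1 for g in row]
    o2 = [g for row in m2 for g in row]
    return o1, o2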

Three-Parent Crossover

Step 1- Select three parents for mating.


Step 2- Follow the algorithm below to generate one offspring:

To generate offspring O1 from the (P1, P2, P3) combination
----------------------------------------------------------
if P1 bit = P2 bit,
then select P1 bit
if P1 bit != P2 bit,
then select P3 bit

After crossover, the new offspring generated looks as follows:


Repeat the same steps to generate offspring O2 and O3 using the (P1, P3, P2) and (P2, P3, P1) combinations respectively.

Note: Sometimes, the third parent can be taken as a coin toss or a Crossover Mask (CM).
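A sketch of the rule above for one offspring; the other two offspring come from calling it with the parents permuted as described:

def three_parent_crossover(p1, p2, p3):
    # If the bits of the first two parents agree, keep that bit;
    # otherwise take the bit from the third parent.
    return [p1[i] if p1[i] == p2[i] else p3[i] for i in range(len(p1))]

# O1 = three_parent_crossover(p1, p2, p3)
# O2 = three_parent_crossover(p1, p3, p2)
# O3 = three_parent_crossover(p2, p3, p1)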

Linear Crossover

Step 1- Select two parents for mating.

Step 2- Select a single gene (k) at random.

Step 3- Define α and β parameters.


Step 4- Modify the kth gene of P1 using the formula mentioned below:

After crossover, the newly generated offspring look as follows:

Note: We can generate any number of offspring by using different α and β values.
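The exact formula appears in the original figure; a commonly used form replaces the kth gene with a linear combination of the parents' kth genes, o_k = α·p1_k + β·p2_k. A sketch under that assumption:

def linear_crossover(p1, p2, k, alpha, beta):
    # One offspring: copy P1 and replace its kth gene with
    # alpha * p1[k] + beta * p2[k].
    o = list(p1)
    o[k] = alpha * p1[k] + beta * p2[k]
    return o

# Different (alpha, beta) pairs, e.g. (0.5, 0.5) or (1.5, -0.5),
# yield different offspring from the same parents.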

Single Arithmetic Crossover

Step 1- Select two parents for mating.

Step 2- Select a single gene (k) at random.

Step 3- Define the α parameter.

Step 4- Modify the kth gene of P1 and P2 to generate the offspring:
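The formula is again in the original figure; a commonly used form blends only the kth genes of the two parents with a single α. A sketch under that assumption:

def single_arithmetic_crossover(p1, p2, k, alpha):
    # Blend only the kth genes; all other genes are copied unchanged.
    o1, o2 = list(p1), list(p2)
    o1[k] = alpha * p2[k] + (1 - alpha) * p1[k]
    o2[k] = alpha * p1[k] + (1 - alpha) * p2[k]
    return o1, o2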


Partially Mapped Crossover

Step 1- Select two parents for mating.

Step 2- Select a substring from the parents at random using two crossover points.

Step 3- Perform Two-Point Crossover.

Step 4- Determine the mapping relationship from the selected substring.

Step 5- Use the mapping to legalize the offspring in the unselected positions.

After crossover, the newly generated offspring look as follows:
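A sketch assuming the parents are permutations stored as lists:

import random

def pmx(p1, p2):
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    o1, o2 = list(p1), list(p2)
    # Step 3: two-point crossover on the selected substring.
    o1[a:b], o2[a:b] = p2[a:b], p1[a:b]

    # Steps 4-5: use the mapping defined by the substring to legalize
    # positions outside it (follow the mapping until the value is unused).
    def legalize(off, seg_from, seg_to):
        mapping = dict(zip(seg_from[a:b], seg_to[a:b]))
        for i in list(range(0, a)) + list(range(b, n)):
            while off[i] in mapping:
                off[i] = mapping[off[i]]

    legalize(o1, p2, p1)
    legalize(o2, p1, p2)
    return o1, o2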

Cycle Crossover
Step 1- Select two parents for mating.

Step 2- Find the cycle defined by parents.


Step 3- Follow the algorithm mentioned below to generate one
offspring:
To generate offspring O1
------------------------
if P1 bit in cycle,
then select P1 bit
if P1 bit not in cycle,
then select P2 bit

After crossover, the new offspring generated looks as follows:

To generate offspring O2, proceed similarly by first filling from P2 and then taking the remaining bits from P1.
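A sketch assuming the parents are permutations stored as lists, using the cycle that starts at position 0:

def cycle_crossover(p1, p2):
    # Find the positions belonging to the cycle defined by the parents.
    cycle = set()
    i = 0
    while i not in cycle:
        cycle.add(i)
        i = p1.index(p2[i])
    # O1 keeps P1's genes inside the cycle and takes the rest from P2;
    # O2 is built the other way around.
    o1 = [p1[j] if j in cycle else p2[j] for j in range(len(p1))]
    o2 = [p2[j] if j in cycle else p1[j] for j in range(len(p1))]
    return o1, o2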
Introduction to Mutation

In simple terms, mutation may be defined as a small random tweak in the chromosome to get a new solution. It is used to maintain and introduce diversity in the genetic population and is usually applied with a low probability, pm. If the probability is very high, the GA is reduced to a random search.

Mutation is the part of the GA which is related to the "exploration" of the search space. It has been observed that mutation is essential to the convergence of the GA, while crossover is not.

Bit Flip Mutation

In bit flip mutation, we select one or more random bits and flip them. This is used for binary-encoded GAs.
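A minimal sketch, flipping each bit independently with a small probability pm (the per-bit flavour is an assumption; flipping one randomly chosen bit is also common):

import random

def bit_flip_mutation(chromosome, pm=0.05):
    # Flip 0 <-> 1 with probability pm at each position.
    return [1 - bit if random.random() < pm else bit for bit in chromosome]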

Random Resetting Mutation

In random resetting mutation, we select one or more genes (array indices) and replace their values with another random value from their given ranges. For example, if a[i] (an array index / gene) ranges over [1, 6], then random resetting mutation will select one value from [1, 6] and replace a[i]'s value with it.

Swap Mutation
In swap mutation, we select two positions on the chromosome at random and interchange the values. This is common in permutation-based encodings.
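A minimal sketch:

import random

def swap_mutation(chromosome):
    c = list(chromosome)
    i, j = random.sample(range(len(c)), 2)   # two distinct positions
    c[i], c[j] = c[j], c[i]
    return c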

Scramble Mutation

Scramble mutation is also popular with permutation representations. In this, a subset of genes is chosen from the entire chromosome and their values are scrambled or shuffled randomly.
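A minimal sketch, scrambling a randomly chosen contiguous subset (contiguity is an assumption):

import random

def scramble_mutation(chromosome):
    c = list(chromosome)
    a, b = sorted(random.sample(range(len(c) + 1), 2))
    segment = c[a:b]
    random.shuffle(segment)
    c[a:b] = segment
    return c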

Inversion Mutation

In inversion mutation, we select a subset of genes as in scramble mutation, but instead of shuffling the subset, we merely invert the entire string within the subset.
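The same as the scramble sketch, but the chosen segment is reversed rather than shuffled:

import random

def inversion_mutation(chromosome):
    c = list(chromosome)
    a, b = sorted(random.sample(range(len(c) + 1), 2))
    c[a:b] = reversed(c[a:b])
    return c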

Selection:

Parent selection is the process of selecting parents which mate and recombine to create offspring for the next generation. Parent selection is very crucial to the convergence rate of the GA, as good parents drive individuals towards better and fitter solutions.

However, care should be taken to prevent one extremely fit solution from taking over the entire population in a few generations, as this leads to the solutions being close to one another in the solution space, thereby leading to a loss of diversity. Maintaining good diversity in the population is extremely crucial for the success of a GA. This taking over of the entire population by one extremely fit solution is known as premature convergence and is an undesirable condition in a GA.

Fitness Proportionate Selection

Fitness proportionate selection is one of the most popular ways of parent selection. In this, every individual can become a parent with a probability that is proportional to its fitness. Therefore, fitter individuals have a higher chance of mating and propagating their features to the next generation. Such a selection strategy applies a selection pressure towards the fitter individuals in the population, evolving better individuals over time.

Consider a circular wheel. The wheel is divided into n pies, where n is the number of individuals in the population. Each individual gets a portion of the circle which is proportional to its fitness value.

Two implementations of fitness proportionate selection are possible −

Roulette Wheel Selection

In roulette wheel selection, the circular wheel is divided as described before. A fixed point is chosen on the wheel circumference as shown, and the wheel is rotated. The region of the wheel which comes in front of the fixed point is chosen as the parent. For the second parent, the same process is repeated.

It is clear that a fitter individual has a greater pie on the wheel and therefore a greater chance of landing in front of the fixed point when the wheel is rotated. Therefore, the probability of choosing an individual depends directly on its fitness.

Implementation-wise, we use the following steps −

 Calculate S = the sum of all fitnesses.
 Generate a random number r between 0 and S.
 Starting from the top of the population, keep adding the fitnesses to the partial sum P, till P ≥ r.
 The individual for which P first reaches or exceeds r is the chosen individual.
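A sketch of the steps above, assuming non-negative fitness values:

import random

def roulette_wheel_selection(population, fitnesses):
    s = sum(fitnesses)                 # S = sum of all fitnesses
    r = random.uniform(0, s)           # random number between 0 and S
    partial = 0.0
    for individual, fitness in zip(population, fitnesses):
        partial += fitness             # keep adding to the partial sum P
        if partial >= r:               # stop once P reaches r
            return individual
    return population[-1]              # guard against floating-point round-off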

Stochastic Universal Sampling (SUS)

Stochastic Universal Sampling is quite similar to roulette wheel selection; however, instead of having just one fixed point, we have multiple fixed points as shown in the following image. Therefore, all the parents are chosen in just one spin of the wheel. Also, such a setup encourages the highly fit individuals to be chosen at least once.

It is to be noted that fitness proportionate selection methods don't work for cases where the fitness can take a negative value.
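A sketch that picks all parents in one pass with equally spaced pointers, again assuming non-negative fitness values:

import random

def stochastic_universal_sampling(population, fitnesses, n_parents):
    total = sum(fitnesses)
    spacing = total / n_parents
    start = random.uniform(0, spacing)
    pointers = [start + i * spacing for i in range(n_parents)]

    parents, partial, idx = [], fitnesses[0], 0
    for p in pointers:
        # Advance through the cumulative fitness until it covers the pointer.
        while partial < p:
            idx += 1
            partial += fitnesses[idx]
        parents.append(population[idx])
    return parents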

Tournament Selection

In K-Way tournament selection, we select K individuals from the population at random and select the best out of these to become a parent. The same process is repeated for selecting the next parent. Tournament selection is also extremely popular in literature as it can even work with negative fitness values.
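A sketch of K-way tournament selection for one parent:

import random

def tournament_selection(population, fitnesses, k=3):
    # Pick K individuals at random and return the fittest of them.
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]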
Rank Selection

Rank selection also works with negative fitness values and is mostly used when the individuals in the population have very close fitness values (this usually happens at the end of the run). Close fitness values lead to each individual having an almost equal share of the pie (as in fitness proportionate selection), as shown in the following image, and hence each individual, no matter how fit relative to the others, has approximately the same probability of getting selected as a parent. This in turn leads to a loss in the selection pressure towards fitter individuals, making the GA make poor parent selections in such situations.

In this, we remove the concept of a fitness value while selecting a parent. Instead, every individual in the population is ranked according to its fitness. The selection of the parents depends on the rank of each individual and not on the fitness. Higher-ranked individuals are preferred over lower-ranked ones.

Chromosome   Fitness Value   Rank
A            8.1             1
B            8.0             4
C            8.05            2
D            7.95            6
E            8.02            3
F            7.99            5
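A sketch of rank-based selection; the linear weighting (best rank gets weight n, worst gets 1) is an assumed scheme:

import random

def rank_selection(population, fitnesses):
    n = len(population)
    # Indices ordered from best to worst fitness (rank 1 first).
    order = sorted(range(n), key=lambda i: fitnesses[i], reverse=True)
    # Selection probability depends only on the rank, not on raw fitness.
    weights = [n - r for r in range(n)]
    chosen = random.choices(order, weights=weights, k=1)[0]
    return population[chosen]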

Random Selection

In this strategy, we randomly select parents from the existing population. There is no selection pressure towards fitter individuals, and therefore this strategy is usually avoided.
