Chapter 4 - GA (Selection - Crossover)
Chapter 4
Genetic Algorithms: Selection - Crossover
Contents
1. Genetic Algorithm Basic Structure
2. Genetic Algorithms - Parent Selection
3. Genetic Algorithms - Crossover
4. Genetic Algorithms - Mutation
5. Mutation Operators
1. Genetic Algorithm Basic Structure
• We start with an initial population (which may be generated at random or seeded by other heuristics).
• Parents are then selected from this population on the basis of fitness.
• Crossover and mutation operators are applied to the parents to generate new offspring.
• Finally, the offspring replace existing individuals in the population and the process repeats.
In this way, genetic algorithms try to mimic natural evolution to some extent.
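The loop below is a minimal Python sketch of this basic structure for a toy problem (maximising the number of 1-bits in a fixed-length binary string); the helper names and parameter values are illustrative and not part of the original material.

import random

POP_SIZE, GENES, GENERATIONS, MUTATION_RATE = 20, 16, 50, 0.05

def fitness(individual):
    # toy fitness: the number of 1-bits in the chromosome
    return sum(individual)

def random_individual():
    return [random.randint(0, 1) for _ in range(GENES)]

population = [random_individual() for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    new_population = []
    while len(new_population) < POP_SIZE:
        # parent selection (here: a simple 2-way tournament)
        p1 = max(random.sample(population, 2), key=fitness)
        p2 = max(random.sample(population, 2), key=fitness)
        # one-point crossover
        point = random.randint(1, GENES - 1)
        child = p1[:point] + p2[point:]
        # bit-flip mutation
        child = [1 - g if random.random() < MUTATION_RATE else g for g in child]
        new_population.append(child)
    # the offspring replace the existing individuals
    population = new_population
print("Best fitness:", max(fitness(ind) for ind in population))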
2. Genetic Algorithms - Parent Selection
• Parent selection is the process of selecting the parents that mate and recombine to create offspring for the next generation.
• Parent selection is crucial to the convergence rate of the GA, as good parents drive individuals towards better and fitter solutions.
2.1. Fitness Proportionate Selection
• Fitness Proportionate Selection is one of the most popular ways of selecting parents.
• In this selection, every individual can become a parent with a probability proportional to its fitness.
• Therefore, fitter individuals have a higher chance of mating and propagating their features to the next generation.
• Such a selection strategy applies selection pressure towards the fitter individuals in the population, evolving better individuals over time.
2.1.1. Roulette Wheel Selection
• Imagine a circular wheel divided into n pies, one for each individual in the population, where each individual's slice is proportional to its fitness.
• A fixed point is chosen on the wheel circumference and the wheel is rotated.
• The region of the wheel which comes in front of the fixed point is chosen as the parent.
• It is clear that a fitter individual has a greater pie on the wheel and therefore a greater chance of landing in front of the fixed point when the wheel is rotated.
• Therefore, the probability of choosing an individual depends directly on its fitness.
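A minimal Python sketch of roulette wheel selection (the function name and the parallel list of fitness values are assumptions; all fitness values are assumed to be non-negative):

import random

def roulette_wheel_select(population, fitnesses):
    # the wheel is the total fitness; the fixed point is a random value along it
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit            # each individual's slice is proportional to its fitness
        if running >= pick:       # this slice lands in front of the fixed point
            return individual
    return population[-1]         # guard against floating-point rounding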
2.1.2. Stochastic Universal Sampling (SUS)
• Stochastic Universal Sampling is quite similar to roulette wheel selection; however, instead of having just one fixed point, we have multiple equally spaced fixed points on the wheel.
• Therefore, all the parents are chosen in just one spin of the wheel. Such a setup also encourages highly fit individuals to be chosen at least once.
• It is to be noted that fitness proportionate selection methods do not work for cases where the fitness can take a negative value.
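A sketch of Stochastic Universal Sampling under the same assumptions (non-negative fitness values in a parallel list); all parents are chosen with a single random offset:

import random

def stochastic_universal_sampling(population, fitnesses, n_parents):
    total = sum(fitnesses)
    step = total / n_parents                  # distance between the fixed points
    start = random.uniform(0, step)           # one "spin" positions every pointer
    pointers = [start + i * step for i in range(n_parents)]
    parents, running, idx = [], fitnesses[0], 0
    for p in pointers:
        while running < p:                    # advance to the slice containing this pointer
            idx += 1
            running += fitnesses[idx]
        parents.append(population[idx])
    return parents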
2.2. Tournament Selection
• In K-Way tournament selection, we select K individuals from the population at random and select the
best out of these to become a parent.
• The same process is repeated for selecting the next parent.
• Tournament Selection is also extremely popular in literature as it can even work with negative fitness
values.
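A sketch of K-way tournament selection (the function name and the parallel fitness list are assumptions):

import random

def tournament_select(population, fitnesses, k=3):
    # pick k individuals at random and return the fittest of them
    contestants = random.sample(range(len(population)), k)
    best = max(contestants, key=lambda i: fitnesses[i])
    return population[best]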
2.3. Rank Selection
• Rank Selection also works with negative fitness values and is mostly used when the individuals in the population have very close fitness values (this usually happens towards the end of the run).
• Close fitness values give each individual an almost equal share of the pie (as in fitness proportionate selection), so every individual, no matter how fit relative to the others, has approximately the same probability of being selected as a parent.
• This in turn leads to a loss of selection pressure towards fitter individuals, causing the GA to make poor parent selections in such situations.
• In rank selection we remove the concept of an absolute fitness value while selecting a parent; instead, every individual in the population is ranked according to its fitness.
• Selection of the parents depends on the rank of each individual and not the fitness.
• The higher ranked individuals are preferred more than the lower ranked ones.
Chromosome   Fitness value   Rank
A            8.1             1
B            8.0             4
C            8.05            2
D            7.95            6
E            8.02            3
F            7.99            5
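A sketch of rank selection consistent with the ranking convention in the table above (rank 1 is the fittest); the linear weighting used here is one common choice, not the only one:

import random

def rank_select(population, fitnesses):
    n = len(population)
    order = sorted(range(n), key=lambda i: fitnesses[i], reverse=True)  # fittest first
    weights = [0] * n
    for rank, idx in enumerate(order, start=1):
        weights[idx] = n - rank + 1   # rank 1 gets weight n, the worst rank gets weight 1
    return random.choices(population, weights=weights, k=1)[0]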
2.4. Random Selection
• In this strategy, parents are picked from the existing population uniformly at random.
• There is no selection pressure towards fitter individuals, and therefore this strategy is usually avoided.
3. Genetic Algorithms - Crossover
• In crossover, more than one parent is selected and one or more offspring are produced using the genetic material of the parents.
• In this section we will discuss some of the most popular crossover operators.
• It is to be noted that these crossover operators are very generic, and the GA designer might choose to implement a problem-specific crossover operator as well.
1. One Point Crossover
2. Multi Point Crossover
3. Uniform Crossover
3.1. One Point Crossover
• In one-point crossover, a random crossover point is selected and the tails of the two parents are swapped to get new offspring. (A complete Python program for single-point crossover is given at the end of this chapter.)
3.2. Multi Point Crossover
• Multi-point crossover is a generalization of the one-point crossover wherein alternating segments between the crossover points are swapped to get new offspring (see the two-point sketch below).
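A sketch of the two-point case, the simplest multi-point crossover (the list-based chromosome representation is an assumption):

import random

def two_point_crossover(parent1, parent2):
    # choose two distinct cut points and swap the middle segment
    a, b = sorted(random.sample(range(1, len(parent1)), 2))
    child1 = parent1[:a] + parent2[a:b] + parent1[b:]
    child2 = parent2[:a] + parent1[a:b] + parent2[b:]
    return child1, child2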
3.3. Uniform Crossover
• In a uniform crossover, we don’t divide the chromosome into segments, rather we treat each gene
separately.
• In this, we essentially flip a coin for each gene to decide whether or not it will be included in the offspring.
• We can also bias the coin to one parent, to have more genetic material in the child from that parent.
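A sketch of uniform crossover with an optional bias towards the first parent (the function name and the bias parameter are assumptions):

import random

def uniform_crossover(parent1, parent2, bias=0.5):
    # for each gene, a (possibly biased) coin decides which parent contributes it
    return [g1 if random.random() < bias else g2
            for g1, g2 in zip(parent1, parent2)]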
4. Genetic Algorithms - Mutation
• Mutation is the part of the GA which is related to the “exploration” of the search space.
• It has been observed that mutation is essential to the convergence of the GA while crossover is not.
5. Mutation Operators
• Like the crossover operators, this is not an exhaustive list and the GA designer might find a
combination of these approaches or a problem-specific mutation operator more useful.
5.1. Bit Flip Mutation
• In bit flip mutation, we select one or more random bits and flip them. This is used for binary encoded GAs.
5.2. Random Resetting
• Random Resetting is an extension of the bit flip for the integer representation.
• In this, a random value from the set of permissible values is assigned to a randomly chosen gene.
Example: (1, 2, 3, 4) → (1, 4, 3, 4), where the randomly chosen second gene is reset to the permissible value 4.
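Sketches of the two operators on list-based chromosomes; applying the operator independently to each gene with a small probability is one common way of selecting "one or more" random positions:

import random

def bit_flip_mutation(chromosome, rate=0.05):
    # flip each bit independently with a small probability
    return [1 - g if random.random() < rate else g for g in chromosome]

def random_resetting(chromosome, permissible_values, rate=0.05):
    # integer version: reset a gene to a random value from the permissible set
    return [random.choice(permissible_values) if random.random() < rate else g
            for g in chromosome]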
5.3. Swap Mutation
• In swap mutation, we select two positions on the chromosome at random, and interchange the
values.
5.4. Scramble Mutation
• In scramble mutation, a subset of genes is chosen from the entire chromosome and their values are scrambled (shuffled) randomly.
5.5. Inversion Mutation
• In inversion mutation, we select a subset of genes like in scramble mutation, but instead of shuffling
the subset, we merely invert the entire string in the subset.
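Sketches of the three operators on a list-based chromosome (scramble and inversion are applied to a contiguous segment here, which is the usual implementation):

import random

def swap_mutation(chromosome):
    # interchange the values at two randomly chosen positions
    c = chromosome.copy()
    i, j = random.sample(range(len(c)), 2)
    c[i], c[j] = c[j], c[i]
    return c

def scramble_mutation(chromosome):
    # shuffle the values inside a randomly chosen segment
    i, j = sorted(random.sample(range(len(chromosome) + 1), 2))
    segment = chromosome[i:j]
    random.shuffle(segment)
    return chromosome[:i] + segment + chromosome[j:]

def inversion_mutation(chromosome):
    # reverse the values inside a randomly chosen segment
    i, j = sorted(random.sample(range(len(chromosome) + 1), 2))
    return chromosome[:i] + chromosome[i:j][::-1] + chromosome[j:]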
Python | Single Point Crossover in Genetic Algorithm
• Single-point crossover is a form of crossover in which two parent chromosomes are selected, a random (or given) point is chosen, and the genes after that point are interchanged between the two chromosomes.
• Example:
P1: 000011110011
P2: 101010101010
Point: 4
After Crossover:
C1: 000010101010
C2: 101011110011
Code : Python program for single-point crossover in Genetic Algorithm
# parent chromosomes:
s = '1100110110110011'
p = '1000110011011111'
print("Parents")
print("P1 :", s)
print("P2 :", p, "\n")
Output:
Generation 1 Children :
Crossover point : 2
1100110011011111
1000110110110011
Generation 2 Children :
Crossover point : 7
1100110110110011
1000110011011111
Generation 3 Children :
Crossover point : 0
1000110011011111
1100110110110011
Generation 4 Children :
Crossover point : 7
1000110110110011
1100110011011111
Generation 5 Children :
Crossover point : 2
1000110011011111
1100110110110011
Traveling-Salesman-Problem-using-Genetic-Algorithm
https://github.com/mahdihassanzade/Traveling-Salesman-Problem-using-Genetic-Algorithm
Algorithm (pseudo-code):
• Read the list of cities with their coordinates (getCity).
• Build an initial population of random tours and record the fittest one (selectPopulation).
• For a fixed number of generations: pick parents by tournament selection, recombine them with an order-preserving one-point crossover, apply swap mutation with probability MUTATION_RATE, and fill the next generation with the resulting children (geneticAlgorithm).
• Return the best tour and the generation count, then print the results and draw the tour (drawMap).
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code
import random
import math
import matplotlib.pyplot as plt

# load the list of cities; indices 1 and 2 of each city are used as coordinates below
def getCity():
    cities = []
    # ... read the city names and coordinates into `cities` ...
    return cities
# total length of a tour: the sum of consecutive city-to-city distances
def calcDistance(cities):
    total_sum = 0
    for i in range(len(cities) - 1):
        cityA = cities[i]
        cityB = cities[i + 1]
        d = math.sqrt(
            math.pow(cityB[1] - cityA[1], 2) + math.pow(cityB[2] - cityA[2], 2)
        )
        total_sum += d
    # close the tour: distance from the last city back to the first
    cityA = cities[0]
    cityB = cities[-1]
    d = math.sqrt(math.pow(cityB[1] - cityA[1], 2) + math.pow(cityB[2] - cityA[2], 2))
    total_sum += d
    return total_sum
def selectPopulation(cities, size):
    # build `size` random tours; the shortest one is the initial fittest
    population = []
    for i in range(size):
        c = cities.copy()
        random.shuffle(c)
        distance = calcDistance(c)
        population.append([distance, c])
    fitest = sorted(population)[0]
    return population, fitest

def geneticAlgorithm(population, lenCities, TOURNAMENT_SELECTION_SIZE, MUTATION_RATE, CROSSOVER_RATE, TARGET):
    gen_number = 0
    for i in range(200):
        new_population = []
        # fill the new population with children (loop structure reconstructed from the slide fragments)
        while len(new_population) < len(population):
            # with probability CROSSOVER_RATE, build children by crossover; otherwise copy tours
            if random.random() < CROSSOVER_RATE:
                # each parent is the fittest of a random tournament
                parent_chromosome1 = sorted(
                    random.choices(population, k=TOURNAMENT_SELECTION_SIZE)
                )[0]
                parent_chromosome2 = sorted(
                    random.choices(population, k=TOURNAMENT_SELECTION_SIZE)
                )[0]
                # order-preserving one-point crossover
                point = random.randint(0, lenCities - 1)
                child_chromosome1 = parent_chromosome1[1][0:point]
                for j in parent_chromosome2[1]:
                    if (j in child_chromosome1) == False:
                        child_chromosome1.append(j)
                child_chromosome2 = parent_chromosome2[1][0:point]
                for j in parent_chromosome1[1]:
                    if (j in child_chromosome2) == False:
                        child_chromosome2.append(j)
            else:
                child_chromosome1 = random.choices(population)[0][1]
                child_chromosome2 = random.choices(population)[0][1]
            # MUTATION: swap two random cities in the child
            if random.random() < MUTATION_RATE:
                point1 = random.randint(0, lenCities - 1)
                point2 = random.randint(0, lenCities - 1)
                child_chromosome1[point1], child_chromosome1[point2] = (
                    child_chromosome1[point2],
                    child_chromosome1[point1],
                )
            # child_chromosome2 is mutated in the same way (omitted on the slide)
            new_population.append([calcDistance(child_chromosome1), child_chromosome1])
            new_population.append([calcDistance(child_chromosome2), child_chromosome2])

        population = new_population
        gen_number += 1
        if gen_number % 10 == 0:
            print(gen_number, sorted(population)[0][0])
        # stop early once the best tour is shorter than the target distance (assumed use of TARGET)
        if sorted(population)[0][0] < TARGET:
            break

    answer = sorted(population)[0]
    return answer, gen_number

# draw the best tour; answer[1] is the ordered list of cities
def drawMap(cities, answer):
    for i in range(len(answer[1])):
        try:
            first = answer[1][i]
            secend = answer[1][i + 1]
            plt.plot([first[1], secend[1]], [first[2], secend[2]], "gray")
        except IndexError:
            continue
    # close the tour by joining the last city back to the first
    first = answer[1][0]
    secend = answer[1][-1]
    plt.plot([first[1], secend[1]], [first[2], secend[2]], "gray")
    plt.show()
def main():
    # initial values
    POPULATION_SIZE = 2000
    TOURNAMENT_SELECTION_SIZE = 4
    MUTATION_RATE = 0.1
    CROSSOVER_RATE = 0.9
    TARGET = 450.0

    cities = getCity()
    firstPopulation, firstFitest = selectPopulation(cities, POPULATION_SIZE)
    answer, genNumber = geneticAlgorithm(
        firstPopulation,
        len(cities),
        TOURNAMENT_SELECTION_SIZE,
        MUTATION_RATE,
        CROSSOVER_RATE,
        TARGET,
    )
print("\n----------------------------------------------------------------")
print("Generation: " + str(genNumber))
print("Fittest chromosome distance before training: " + str(firstFitest[0]))
print("Fittest chromosome distance after training: " + str(answer[0]))
print("Target distance: " + str(TARGET))
print("----------------------------------------------------------------\n")
drawMap(cities, answer)
main()
Thanks!
Any questions?