
Genetic Algorithm

Chapter 4
Genetic Algorithms: Selection -
Crossover

Dr. Fatma M. Talaat


Contents

1. Chapter 1: Introduction to Genetic Algorithm

2. Chapter 2: Genetic Algorithm in Machine Learning

3. Chapter 3: Genetic Algorithms: Population Representation - Fitness Function

4. Chapter 4: Genetic Algorithms: Selection - Crossover

5. Chapter 5: The Applications of Genetic Algorithms in Medicine

6. Chapter 6: Practical Examples of Genetic Algorithms

7. Chapter 7: The use of GA in the field of robotics


Chapter 4: Genetic Algorithms:
Selection - Crossover

Contents
1. Genetic Algorithm Basic Structure

2. Genetic Algorithms - Parent Selection


○ 2.1. Fitness Proportionate Selection
○ 2.2. Tournament Selection
○ 2.3. Rank Selection
○ 2.4. Random Selection

3. Genetic Algorithms - Crossover


○ 3.1. One Point Crossover
○ 3.2. Multi Point Crossover
○ 3.3. Uniform Crossover
4. Genetic Algorithms – Mutation

5. Mutation Operators
1. Genetic Algorithm Basic Structure

The basic structure of a GA is as follows:

• We start with an initial population (which may be generated at random or seeded by other heuristics),

• Select parents from this population for mating.

• Apply crossover and mutation operators on the parents to generate new off-springs.

• And finally these off-springs replace the existing individuals in the population and the process repeats.
In this way genetic algorithms mimic natural evolution to some extent.
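To make this loop concrete, here is a minimal, self-contained Python sketch of the basic GA structure applied to the toy OneMax problem (maximize the number of 1-bits in a binary string). It is an illustration only, not the chapter's reference implementation; all names, operators, and parameter values are chosen for the example and are discussed in detail in the following sections.

import random

# Toy example: maximize the number of 1-bits in a fixed-length binary chromosome.
GENOME_LEN = 20
POP_SIZE = 30
GENERATIONS = 50
MUTATION_RATE = 0.02

def fitness(ind):
    return sum(ind)                       # number of 1-bits

def random_individual():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def select_parent(pop):
    # simple 2-way tournament (parent selection is covered in section 2)
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # one-point crossover (section 3.1)
    point = random.randint(1, GENOME_LEN - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(ind):
    # bit flip mutation (section 5.1), applied gene by gene
    return [1 - g if random.random() < MUTATION_RATE else g for g in ind]

population = [random_individual() for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    offspring = []
    while len(offspring) < POP_SIZE:
        c1, c2 = crossover(select_parent(population), select_parent(population))
        offspring += [mutate(c1), mutate(c2)]
    population = offspring                # off-springs replace the existing individuals
print("Best fitness found:", max(fitness(i) for i in population))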
2. Genetic Algorithms - Parent Selection

• Parent Selection: is the process of selecting parents which mate and recombine to create off-springs
for the next generation.

• Parent selection is very crucial to the convergence rate of the GA, as good parents drive individuals
towards better and fitter solutions.
2. Genetic Algorithms - Parent Selection

1. Fitness Proportionate Selection


1.1. Roulette Wheel Selection
1.2. Stochastic Universal Sampling (SUS)
2. Tournament Selection
3. Rank Selection
4. Random Selection
2.1. Fitness Proportionate Selection

• Fitness Proportionate Selection is one of the most popular ways of parent selection.

• In this selection, every individual can become a parent with a probability which is proportional to its
fitness.

• Therefore, fitter individuals have a higher chance of mating and propagating their features to the next
generation.

• Therefore, such a selection strategy applies a selection pressure to the more fit individuals in the
population, evolving better individuals over time.
2.1. Fitness Proportionate Selection

• Consider a circular wheel.


• The wheel is divided into n slices, where n is the number of individuals in the population.
• Each individual gets a portion of the circle which is proportional to its fitness value.
• Two implementations of fitness proportionate selection are possible:
1) Roulette Wheel Selection
2) Stochastic Universal Sampling (SUS)
2.1.1 Roulette Wheel Selection

• In a roulette wheel selection, the circular wheel is divided as described before.

• A fixed point is chosen on the wheel circumference as shown and the wheel is rotated.

• The region of the wheel which comes in front of the fixed point is chosen as the parent.

• For the second parent, the same process is repeated.


2.1.1 Roulette Wheel Selection

• It is clear that a fitter individual has a larger slice of the wheel and therefore a greater chance of
landing in front of the fixed point when the wheel is rotated.
• Therefore, the probability of choosing an individual depends directly on its fitness.
2.1.1 Roulette Wheel Selection

Implementation wise, we use the following steps:

1. Calculate S = the sum of all fitnesses.
2. Generate a random number r between 0 and S.
3. Starting from the top of the population, keep adding the fitnesses to the partial sum P, till P ≥ r.
4. The individual for which P first reaches or exceeds r is the chosen individual.
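A minimal Python sketch of these four steps follows; the function name roulette_wheel_select and the assumption that the fitnesses are supplied as a list of non-negative values are illustrative choices, not part of the original slides.

import random

def roulette_wheel_select(fitnesses):
    S = sum(fitnesses)                   # step 1: total fitness
    r = random.uniform(0, S)             # step 2: random point on the wheel
    partial = 0.0
    for i, f in enumerate(fitnesses):    # step 3: accumulate fitnesses
        partial += f
        if partial >= r:                 # step 4: first individual whose partial sum reaches r
            return i
    return len(fitnesses) - 1            # guard against floating-point round-off

# example: index 2 (fitness 5.0) is returned most often
print(roulette_wheel_select([1.0, 2.0, 5.0, 2.0]))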
2.1.2 Stochastic Universal Sampling (SUS)

• Stochastic Universal Sampling is quite similar to Roulette wheel selection, however instead of having just one fixed
point, we have multiple fixed points as shown in the following image.

• Therefore, all the parents are chosen in just one spin of the wheel. Also, such a setup encourages the highly fit
individuals to be chosen at least once.

• It is to be noted that fitness proportionate selection methods don’t work for cases where the fitness can take
a negative value.
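A minimal sketch of SUS under the same assumptions (non-negative fitness values in a list) is given below; the equally spaced pointers play the role of the multiple fixed points on the wheel, and all parents are drawn from a single spin.

import random

def sus_select(fitnesses, n_parents):
    S = sum(fitnesses)
    step = S / n_parents                             # distance between the fixed points
    start = random.uniform(0, step)                  # one spin of the wheel
    pointers = [start + i * step for i in range(n_parents)]
    chosen = []
    partial, i = fitnesses[0], 0
    for p in pointers:
        while partial < p and i < len(fitnesses) - 1:
            i += 1
            partial += fitnesses[i]
        chosen.append(i)                             # index of the selected parent
    return chosen

print(sus_select([1.0, 2.0, 5.0, 2.0], n_parents=3))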
2.2. Tournament Selection

• In K-Way tournament selection, we select K individuals from the population at random and select the
best out of these to become a parent.
• The same process is repeated for selecting the next parent.
• Tournament Selection is also extremely popular in literature as it can even work with negative fitness
values.
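A minimal sketch of K-way tournament selection, assuming each population member is stored as a (fitness, individual) pair (an illustrative representation, not prescribed by the slides):

import random

def tournament_select(population, k=3):
    contestants = random.sample(population, k)           # pick K individuals at random
    return max(contestants, key=lambda pair: pair[0])    # the fittest contestant becomes a parent

# works even with negative fitness values
pop = [(-4.0, "A"), (1.5, "B"), (0.2, "C"), (-0.7, "D")]
print(tournament_select(pop, k=2))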
2.3. Rank Selection

• Rank Selection also works with negative fitness values and is mostly used when the individuals in the
population have very close fitness values (this usually happens at the end of the run).

• When fitness values are very close, fitness proportionate selection gives each individual an almost
equal share of the pie, so every individual, no matter how fit relative to the others, has approximately
the same probability of being selected as a parent.

• This in turn leads to a loss of selection pressure towards fitter individuals, causing the GA to make
poor parent selections in such situations; rank selection avoids this problem by using ranks instead of
raw fitness values.
2.3. Rank Selection

• In rank selection, we remove the concept of an absolute fitness value while selecting a parent. Instead,
every individual in the population is ranked according to its fitness.
2.3. Rank Selection

• Selection of the parents depends on the rank of each individual and not the fitness.
• The higher ranked individuals are preferred over the lower ranked ones.

Chromosome    Fitness Value    Rank
A             8.1              1
B             8.0              4
C             8.05             2
D             7.95             6
E             8.02             3
F             7.99             5
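As a minimal sketch, rank selection can be implemented by weighting each individual by its rank position instead of its raw fitness. The linear weighting used here (best rank gets weight n, worst gets weight 1) is one common choice among several and is an assumption of this illustration; the example data is the table above.

import random

def rank_select(population):
    # population: list of (fitness, individual) pairs; rank 1 = fittest
    ranked = sorted(population, key=lambda pair: pair[0], reverse=True)
    n = len(ranked)
    weights = [n - i for i in range(n)]          # rank 1 gets weight n, rank n gets weight 1
    return random.choices(ranked, weights=weights, k=1)[0]

pop = [(8.1, "A"), (8.0, "B"), (8.05, "C"), (7.95, "D"), (8.02, "E"), (7.99, "F")]
print(rank_select(pop))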
2.4. Random Selection

• In this strategy we randomly select parents from the existing population.

• There is no selection pressure towards fitter individuals and therefore this strategy is usually avoided.
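A minimal sketch (illustrative only): random parent selection is simply a uniform draw from the population, with no selection pressure.

import random

def random_select(population):
    # every individual is equally likely to be chosen, regardless of fitness
    return random.choice(population)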
3. Genetic Algorithms - Crossover

• The crossover operator is analogous to reproduction and biological crossover.

• In crossover, more than one parent is selected and one or more off-springs are produced using the
genetic material of the parents.

• Crossover is usually applied in a GA with a high probability.


3. Crossover Operators

• In this section we will discuss some of the most popularly used crossover operators.

• It is to be noted that these crossover operators are very generic and the GA Designer might choose
to implement a problem-specific crossover operator as well.
1. One Point Crossover
2. Multi Point Crossover
3. Uniform Crossover
3.1. One Point Crossover

• In one-point crossover, a random crossover point is selected and the tails of the two parents are
swapped to produce new off-springs.
3.2. Multi Point Crossover

• Multi point crossover is a generalization of the one-point crossover wherein alternating segments are
swapped to get new off-springs.
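A minimal sketch of the two-point special case on list-encoded chromosomes (illustrative only; the cut points are chosen at random):

import random

def two_point_crossover(p1, p2):
    a, b = sorted(random.sample(range(1, len(p1)), 2))   # two distinct crossover points
    c1 = p1[:a] + p2[a:b] + p1[b:]                        # alternate segments are swapped
    c2 = p2[:a] + p1[a:b] + p2[b:]
    return c1, c2

print(two_point_crossover(list("00000000"), list("11111111")))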
3.3. Uniform Crossover

• In a uniform crossover, we don’t divide the chromosome into segments, rather we treat each gene
separately.

• In this, we essentially flip a coin for each gene to decide whether or not it’ll be included in the off-
spring.

• We can also bias the coin to one parent, to have more genetic material in the child from that parent.
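A minimal sketch of uniform crossover with an optional biased coin; the bias parameter (probability of inheriting each gene from the first parent) is an illustrative name:

import random

def uniform_crossover(p1, p2, bias=0.5):
    # flip a (possibly biased) coin for each gene independently
    return [g1 if random.random() < bias else g2 for g1, g2 in zip(p1, p2)]

print(uniform_crossover(list("000000"), list("111111"), bias=0.7))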
4. Genetic Algorithms - Mutation

• Mutation is the part of the GA which is related to the “exploration” of the search space.

• It has been observed that mutation is essential to the convergence of the GA while crossover is not.
5. Mutation Operators

• Like the crossover operators, this is not an exhaustive list and the GA designer might find a
combination of these approaches or a problem-specific mutation operator more useful.

1. Bit Flip Mutation


2. Random Resetting
3. Swap Mutation
4. Scramble Mutation
5. Inversion Mutation
5.1. Bit Flip Mutation

• In bit flip mutation, we select one or more random bits and flip them.

• This is used for binary encoded GAs.
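A minimal sketch of bit flip mutation on a binary-encoded chromosome; the parameter k (how many random bits to flip) is an illustrative choice:

import random

def bit_flip_mutation(chromosome, k=1):
    mutant = chromosome[:]
    for pos in random.sample(range(len(chromosome)), k):   # choose k distinct positions
        mutant[pos] = 1 - mutant[pos]                       # flip the selected bits
    return mutant

print(bit_flip_mutation([0, 0, 1, 1, 0, 1], k=2))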


5.2. Random Resetting

• Random Resetting is an extension of the bit flip for the integer representation.

• In this, a random value from the set of permissible values is assigned to a randomly chosen gene.

Before mutation: 1 2 3 4
After mutation:  1 4 3 4
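A minimal sketch consistent with the example above; the set of permissible values is passed in explicitly as an assumption of this illustration:

import random

def random_resetting(chromosome, permissible_values):
    mutant = chromosome[:]
    pos = random.randrange(len(mutant))                    # randomly chosen gene
    mutant[pos] = random.choice(permissible_values)        # assign a random permissible value
    return mutant

print(random_resetting([1, 2, 3, 4], permissible_values=[1, 2, 3, 4]))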
5.3. Swap Mutation

• In swap mutation, we select two positions on the chromosome at random, and interchange the
values.

• This is common in permutation based encodings.
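A minimal sketch of swap mutation on a permutation-encoded chromosome:

import random

def swap_mutation(chromosome):
    mutant = chromosome[:]
    i, j = random.sample(range(len(mutant)), 2)            # two random positions
    mutant[i], mutant[j] = mutant[j], mutant[i]            # interchange their values
    return mutant

print(swap_mutation([1, 2, 3, 4, 5, 6]))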


5.4. Scramble Mutation

• Scramble mutation is also popular with permutation representations.

• In this, from the entire chromosome, a subset of genes is chosen and their values are scrambled or
shuffled randomly.
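A minimal sketch of scramble mutation; treating the chosen subset as a contiguous segment is one common interpretation and an assumption of this illustration:

import random

def scramble_mutation(chromosome):
    mutant = chromosome[:]
    i, j = sorted(random.sample(range(len(mutant)), 2))    # pick a segment [i, j]
    segment = mutant[i:j + 1]
    random.shuffle(segment)                                # shuffle the values in the segment
    mutant[i:j + 1] = segment
    return mutant

print(scramble_mutation([1, 2, 3, 4, 5, 6]))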
5.5. Inversion Mutation

• In inversion mutation, we select a subset of genes like in scramble mutation, but instead of shuffling
the subset, we merely invert the entire string in the subset.
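A minimal sketch of inversion mutation, reusing the same contiguous-segment assumption as in the scramble sketch; the segment is reversed instead of shuffled:

import random

def inversion_mutation(chromosome):
    mutant = chromosome[:]
    i, j = sorted(random.sample(range(len(mutant)), 2))    # pick a segment [i, j]
    mutant[i:j + 1] = mutant[i:j + 1][::-1]                # invert the segment
    return mutant

print(inversion_mutation([1, 2, 3, 4, 5, 6]))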
Python | Single Point Crossover in Genetic Algorithm

• Single point crossover is a form of crossover in which two parent chromosomes are selected, a random
(or given) crossover point is chosen, and the genes after that point are interchanged between the two
parents.

• Example:
P1: 000011110011
P2: 101010101010

Point: 4
After Crossover:
C1: 000010101010
C2: 101011110011
Code : Python program for single-point crossover in Genetic Algorithm

# library to generate a random number
import random

# function for implementing the single-point crossover
def crossover(l, q):
    # converting the strings to lists for performing the crossover
    l = list(l)
    q = list(q)

    # generating the random crossover point
    k = random.randint(0, len(l) - 1)
    print("Crossover point :", k)

    # interchanging the genes after the crossover point
    for i in range(k, len(l)):
        l[i], q[i] = q[i], l[i]
    l = ''.join(l)
    q = ''.join(q)
    print(l)
    print(q, "\n\n")
    return l, q
Code : Python program for single-point crossover in Genetic Algorithm

# parent chromosomes
s = '1100110110110011'
p = '1000110011011111'
print("Parents")
print("P1 :", s)
print("P2 :", p, "\n")

# function calling and storing the off-springs for the next generation
for i in range(5):
    print("Generation ", i + 1, "Children :")
    s, p = crossover(s, p)
Output (sample run; the crossover points are random, so the exact output varies between runs):
Parents
P1 : 1100110110110011
P2 : 1000110011011111

Generation 1 Children :
Crossover point : 2

1100110011011111
1000110110110011

Generation 2 Children :
Crossover point : 7

1100110110110011
1000110011011111

Generation 3 Children :
Crossover point : 0

1000110011011111
1100110110110011
Output:
Generation 4 Children :
Crossover point : 7

1000110110110011
1100110011011111

Generation 5 Children :
Crossover point : 2

1000110011011111
1100110110110011
Traveling-Salesman-Problem-using-Genetic-Algorithm

https://github.com/mahdihassanzade/Traveling-Salesman-Problem-using-Genetic-Algorithm
Traveling-Salesman-Problem-using-Genetic-Algorithm

Algorithm:

1. Initialize the population randomly.


2. Determine the fitness of the chromosome.
3. Until done repeat:
1. Select parents.
2. Perform crossover and mutation.
3. Calculate the fitness of the new population.
4. Append it to the gene pool.

Pseudo-code:

procedure GA {
    t = 0;
    Initialize population P(t);
    Evaluate population P(t);
    While ( Not Done ) {
        Parents(t) = Select_Parents(P(t));
        Offspring(t) = Procreate(Parents(t));
        Evaluate Offspring(t);
        P(t+1) = Select_Survivors(P(t), Offspring(t));
        t = t + 1;
    }
}
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code

import random
import math
import matplotlib.pyplot as plt

# get cities info
def getCity():
    cities = []
    f = open("TSP51.txt")
    for i in f.readlines():
        node_city_val = i.split()
        cities.append(
            [node_city_val[0], float(node_city_val[1]), float(node_city_val[2])]
        )

    return cities
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code

# calculating the distance of the cities (total tour length)
def calcDistance(cities):
    total_sum = 0
    for i in range(len(cities) - 1):
        cityA = cities[i]
        cityB = cities[i + 1]

        d = math.sqrt(
            math.pow(cityB[1] - cityA[1], 2) + math.pow(cityB[2] - cityA[2], 2)
        )

        total_sum += d

    # close the tour: add the distance from the last city back to the first
    cityA = cities[0]
    cityB = cities[-1]
    d = math.sqrt(math.pow(cityB[1] - cityA[1], 2) + math.pow(cityB[2] - cityA[2], 2))

    total_sum += d

    return total_sum
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code

# selecting the population
def selectPopulation(cities, size):
    population = []

    for i in range(size):
        c = cities.copy()
        random.shuffle(c)
        distance = calcDistance(c)
        population.append([distance, c])
    fitest = sorted(population)[0]

    return population, fitest


# the genetic algorithm
def geneticAlgorithm(
    population,
    lenCities,
    TOURNAMENT_SELECTION_SIZE,
    MUTATION_RATE,
    CROSSOVER_RATE,
    TARGET,
):
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code

    gen_number = 0
    for i in range(200):
        new_population = []

        # selecting two of the best options we have (elitism)
        new_population.append(sorted(population)[0])
        new_population.append(sorted(population)[1])

        for i in range(int((len(population) - 2) / 2)):

            # CROSSOVER
            random_number = random.random()
            if random_number < CROSSOVER_RATE:
                parent_chromosome1 = sorted(
                    random.choices(population, k=TOURNAMENT_SELECTION_SIZE)
                )[0]

                parent_chromosome2 = sorted(
                    random.choices(population, k=TOURNAMENT_SELECTION_SIZE)
                )[0]

                point = random.randint(0, lenCities - 1)
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code
                child_chromosome1 = parent_chromosome1[1][0:point]
                for j in parent_chromosome2[1]:
                    if (j in child_chromosome1) == False:
                        child_chromosome1.append(j)

                child_chromosome2 = parent_chromosome2[1][0:point]
                for j in parent_chromosome1[1]:
                    if (j in child_chromosome2) == False:
                        child_chromosome2.append(j)

            # if crossover does not happen, copy two random chromosomes unchanged
            else:
                child_chromosome1 = random.choices(population)[0][1]
                child_chromosome2 = random.choices(population)[0][1]

            # MUTATION (swap mutation on each child tour)
            if random.random() < MUTATION_RATE:
                point1 = random.randint(0, lenCities - 1)
                point2 = random.randint(0, lenCities - 1)
                child_chromosome1[point1], child_chromosome1[point2] = (
                    child_chromosome1[point2],
                    child_chromosome1[point1],
                )
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code

                point1 = random.randint(0, lenCities - 1)
                point2 = random.randint(0, lenCities - 1)
                child_chromosome2[point1], child_chromosome2[point2] = (
                    child_chromosome2[point2],
                    child_chromosome2[point1],
                )

            new_population.append([calcDistance(child_chromosome1), child_chromosome1])
            new_population.append([calcDistance(child_chromosome2), child_chromosome2])

        population = new_population

        gen_number += 1

        if gen_number % 10 == 0:
            print(gen_number, sorted(population)[0][0])

        if sorted(population)[0][0] < TARGET:
            break

    answer = sorted(population)[0]

    return answer, gen_number
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code

# draw cities and answer map
def drawMap(city, answer):
    for j in city:
        plt.plot(j[1], j[2], "ro")
        plt.annotate(j[0], (j[1], j[2]))

    for i in range(len(answer[1])):
        try:
            first = answer[1][i]
            secend = answer[1][i + 1]

            plt.plot([first[1], secend[1]], [first[2], secend[2]], "gray")
        except:
            continue

    first = answer[1][0]
    secend = answer[1][-1]
    plt.plot([first[1], secend[1]], [first[2], secend[2]], "gray")

    plt.show()
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code

def main():
    # initial values
    POPULATION_SIZE = 2000
    TOURNAMENT_SELECTION_SIZE = 4
    MUTATION_RATE = 0.1
    CROSSOVER_RATE = 0.9
    TARGET = 450.0

    cities = getCity()
    firstPopulation, firstFitest = selectPopulation(cities, POPULATION_SIZE)
    answer, genNumber = geneticAlgorithm(
        firstPopulation,
        len(cities),
        TOURNAMENT_SELECTION_SIZE,
        MUTATION_RATE,
        CROSSOVER_RATE,
        TARGET,
    )
Traveling-Salesman-Problem-using-Genetic-Algorithm: Code

print("\n----------------------------------------------------------------")
print("Generation: " + str(genNumber))
print("Fittest chromosome distance before training: " + str(firstFitest[0]))
print("Fittest chromosome distance after training: " + str(answer[0]))
print("Target distance: " + str(TARGET))
print("----------------------------------------------------------------\n")

drawMap(cities, answer)

main()
Thanks!
Any questions?
