GA Part 1 Slides

Genetic algorithms are computational search and optimization techniques based on Darwinian evolution and natural genetics. They use techniques like mutation, crossover, and selection to evolve solutions to problems. A genetic algorithm begins with a random population which is then evaluated. Fit individuals are selected to breed a new population by combining representations through crossover and applying random mutations. This process is repeated until a termination condition is reached, ideally finding an optimal solution. Key aspects that must be defined are the representation of solutions, the fitness function, selection methods, and genetic operators like crossover and mutation that create new solutions.


THE GENETIC ALGORITHM: COMPUTATIONAL EVOLUTION FOR OPTIMISATION AND SEARCH
Fixed-length representation algorithms


Introduction

"Genetic algorithms (GAs) are search algorithms based on the mechanics of natural selection and natural genetics" (Goldberg, 1989)

 One form of an Evolutionary Algorithm (a stochastic search algorithm)
 Traditionally each individual is a bit-string, although this is really just an historical accident, since bit-strings allow some mathematical analysis
 The key concept is a FIXED-length representation
Population Structure

 Variation means creating new individuals and (therefore) replacing existing individuals
 This can be done as a:
 1. Generational model: fill a new population, which then replaces the old one once it is complete
 2. Steady-state model: each new individual replaces one individual (chosen randomly, or a weak individual) from the current population
Basic steps in a generational GA
1. Select a fixed-length representation
2. Randomly initialise individuals (their representation)
3. Evaluate each individual in the population
4. Select fit individuals to breed new population
5. Create new population from parents
6. Replace old population with new population
7. Repeat steps 3 - 6 until some termination criterion is reached (#gens, no improvement, time, etc.); a minimal code sketch of this loop follows below
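As a rough sketch only (not code from the slides), the steps above might look like the following in Python; the one-max fitness function, the parameter values and the helper names (fitness, tournament, crossover, mutate) are illustrative assumptions.

```python
import random

# Minimal generational GA sketch (illustrative assumptions throughout).
POP_SIZE, GENOME_LEN, GENERATIONS = 50, 20, 100
P_CROSSOVER, P_MUTATION = 0.9, 1.0 / GENOME_LEN

def fitness(ind):
    # Example problem ("one-max"): maximise the number of 1s in the bit-string.
    return sum(ind)

def tournament(pop, k=3):
    # Pick the fittest of k randomly chosen individuals (N-tournament selection).
    return max(random.sample(pop, k), key=fitness)

def crossover(p1, p2):
    # One-point crossover: exchange the representations after a random cut.
    if random.random() < P_CROSSOVER:
        cut = random.randint(1, GENOME_LEN - 1)
        return p1[:cut] + p2[cut:]
    return p1[:]

def mutate(ind):
    # Bit-flip mutation with a low per-locus probability.
    return [1 - g if random.random() < P_MUTATION else g for g in ind]

# Steps 1-2: fixed-length representation, randomly initialised
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

# Steps 3-7: evaluate, select, breed, replace, repeat until termination
for gen in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(fitness(best), best)
```

Here tournament selection, one-point crossover and bit-flip mutation simply stand in for the operators discussed on later slides.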
REPRESENTATION
The choice of representation changes how individuals are initialised and evaluated, and which variation operators can be applied.
Evaluation

 Defined by the problem
 There is clearly a strong link between the representation of individuals and the problem being searched
 Evaluation → fitness function
How do we use a Genetic Algorithm?

 Main issues to resolve when designing an evolutionary algorithm:
 How do we represent individuals?
 What is the fitness function?
 How is selection performed?
 How are new individuals (children) created?
 recombination (often referred to as crossover): mixing parent representations to produce children
 mutation: small changes
(back to) The tank example
 Each individual is a vector of numbers (the genotype)
 The vector represents the weights of a neural network
 A neural network is created that uses the vector of weights (G → R)
 The phenotype (P) is the behaviour of the tank, driven by the neural network, and the fitness (F) is the number of mines that it runs over per unit time in environment E.
http://www.ai-junkie.com/ann/evolved/nnt1.html
Representation

 Two principles:
 Meaningful building blocks
 "user should select a coding so that short, low-order schemata are relevant to the underlying problem"
 Minimal alphabets
 "user should select the smallest alphabet that permits a natural expression of the problem"
 Although in reality an encoding is normally selected that is relatively easy to code and that represents the problem naturally (and is therefore often NOT binary!)
Fitness

 Fitness defines the problem
 Involves quantifying the performance of the phenotypes
 Usually trying to either minimise or maximise some value (the fitness measure)
 May be single- or multi-objective, but ultimately the fitness is required to select parents for the next generation
Selection

 Involves selecting the parents for the next generation
 Many methods, such as:
 Proportional selection (roulette wheel)
 Tournament
 Ranking
 All are based on the fitness of the individual
 All are biased towards a probabilistic selection of fitter individuals
Roulette wheel (proportional) selection

 Roulette selection:
 each individual is given a slice of a virtual roulette wheel
 the size of each slice is proportional to fitness
 spin the wheel once to select each individual

\text{propfitness}(i) = \frac{\text{fitness}(i)}{\sum_{j=1}^{N} \text{fitness}(j)}
Roulette wheel selection

 Individual   Fitness   Proportional fitness
 A            25        0.25
 B            20        0.20
 C            40        0.40
 D            10        0.10
 E             5        0.05
Roulette selection
[Pie chart: a roulette wheel divided into slices A-E, each sized in proportion to that individual's fitness]
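A minimal sketch of roulette-wheel selection, reusing the fitness values from the table above; the function name roulette_select and the number of spins are illustrative assumptions.

```python
import random

# Fitness values from the table above.
fitnesses = {"A": 25, "B": 20, "C": 40, "D": 10, "E": 5}

def roulette_select(fitnesses):
    total = sum(fitnesses.values())
    spin = random.uniform(0, total)      # one spin of the wheel
    running = 0.0
    for individual, fit in fitnesses.items():
        running += fit                   # walk around the wheel, slice by slice
        if spin <= running:
            return individual
    return individual                    # guard against floating-point edge cases

parents = [roulette_select(fitnesses) for _ in range(10)]
print(parents)  # fitter individuals (e.g. C) are selected more often on average
```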
N-Tournament selection

 Randomly select N individuals from the population
 Choose the fittest from this subset as one parent
 Does not require the population to have normalised fitness (as roulette does)
 Easier to extend to problems involving partial evaluation
 Easy to implement
 Easier to control the degree of selection pressure, just by changing N
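A minimal sketch of N-tournament selection, assuming the population is a list and a fitness function is available; the names are illustrative.

```python
import random

def tournament_select(population, fitness, n=2):
    # N-tournament: randomly sample N individuals, return the fittest as a parent.
    return max(random.sample(population, n), key=fitness)
```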
Rank-based selection

 Many other methods exist (here is just one)
 Rank-based: order the population from fittest to least fit
 Use the ranking as a means to select parents
 For example, just take the top half of the population and randomly sample from this to create the next population
 Or use the ranking as the relative fitness and then apply a proportional selection measure to pick parents
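One possible sketch of the second variant (ranking used as relative fitness); the exact weighting scheme is an assumption, since the slides leave it open.

```python
import random

def rank_select(population, fitness):
    # Rank-based selection sketch: order by fitness, then choose with
    # probability proportional to rank (the fittest individual gets rank N).
    ranked = sorted(population, key=fitness)      # least fit first
    weights = range(1, len(ranked) + 1)           # ranks 1 .. N
    return random.choices(ranked, weights=weights, k=1)[0]
```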
Creating new individuals

 Creates new individuals from the selected parents
 Two operators come into play:
 Crossover (mixing of 2 parents)
 Mutation
 Note that these operators are trying to "mimic" nature, where mutation and recombination of DNA alleles occur.

Myers, S. et al. "A Fine-Scale Map of Recombination Rates and Hotspots Across the Human Genome", Science, 14 Oct 2005, Vol. 310, Issue 5746, pp. 321-324. DOI: 10.1126/science.1117196
Fixed-Length Crossover

 Two chromosomes join at one or more points and exchange genes
 Types of crossover include:
 one-point
 n-point
 uniform
 For different representations (wait until we look at variable-length concepts) there will be different ways to "mix" two parent representations
One-point crossover
1. Randomly select a point to cut
2. Exchange the representations after the cut
Two-point crossover
Exchange the section between 2 randomly chosen positions
Uniform crossover
 At each locus (position), toss a coin to decide whether to exchange that gene
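A hedged sketch of the three crossover operators on fixed-length list representations; the function names and the assumption that parents are equal-length lists are illustrative.

```python
import random

def one_point(p1, p2):
    # One-point crossover: cut once and exchange everything after the cut.
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def two_point(p1, p2):
    # Two-point crossover: exchange the section between two random positions.
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return p1[:a] + p2[a:b] + p1[b:], p2[:a] + p1[a:b] + p2[b:]

def uniform(p1, p2):
    # Uniform crossover: at each locus, toss a coin to decide whether to swap.
    c1, c2 = [], []
    for g1, g2 in zip(p1, p2):
        if random.random() < 0.5:
            g1, g2 = g2, g1
        c1.append(g1)
        c2.append(g2)
    return c1, c2

print(one_point([0] * 8, [1] * 8))
```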
Mutation
 One or more loci are randomly chosen and their values changed
 If the representation is binary, then 0 → 1, 1 → 0
 If a number, it might be a small random change using a value sampled from a distribution with mean 0 and small variance
 The method of change depends upon the coding scheme used
 Generally a low probability of mutation at any locus (position)
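A minimal sketch of the two mutation styles described above; the probability and variance values are assumptions.

```python
import random

def mutate_bits(ind, p=0.01):
    # Bit-flip mutation: each locus flips (0 -> 1, 1 -> 0) with low probability p.
    return [1 - g if random.random() < p else g for g in ind]

def mutate_reals(ind, p=0.01, sigma=0.1):
    # Real-valued mutation: occasionally add a small value drawn from N(0, sigma^2).
    return [g + random.gauss(0, sigma) if random.random() < p else g for g in ind]
```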
Mutation

 The effect of mutation depends on the size of the alphabet (the number of possible values)
 the higher the cardinality of the alphabet, the (possibly) lower the benefit of mutation (hence it is very useful with bit-strings)
 However, mutation does seem valuable in that it maintains diversity within the population, and therefore slows/stops convergence
Creating the new population

 Replace some or all of the parent population with children
 Many different replacement strategies are available:
 some create a breeding pool (based on selection), and then randomly select parents from this pool
 others keep the original population, and just sample from this population repeatedly
 Both have similar final effects
 Most are concerned with preserving 'good' genes and purging 'bad' genes
 Some selection schemes always copy the best current individuals to the next population (elitism)
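A possible sketch of elitist replacement, assuming the children have already been created by selection, crossover and mutation; the function name and n_elite default are illustrative.

```python
def next_generation(parents, children, fitness, n_elite=1):
    # Elitism sketch: copy the best n_elite parents unchanged into the new
    # population, then fill the remainder with newly created children.
    elites = sorted(parents, key=fitness, reverse=True)[:n_elite]
    return elites + children[:len(parents) - n_elite]
```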
Drift and selection
 There are really two factors operating within GAs
 One is convergence towards fit individuals
 The other is random drift due to the stochastic selection of parents
 Premature convergence to non-optimal solutions is sometimes a problem
 If you only use crossover with a fixed-length representation, the population will converge to a single representation
Advantages of GAs

 An efficient means of investigating large combinatorial problems
 can solve combinatorial problems many orders of magnitude faster than exhaustive 'brute force' searches
 A robust, weak method, since it requires few assumptions regarding the problem being solved, and works pretty well under a variety of parameter settings
 Does not require gradient information regarding the fitness landscape (both a strength and a weakness)
Disadvantages

 GAs are not 'silver bullets' for solving problems
 You must be able to assess the quality of each attempt at a solution
 can't solve problems where an attempt is simply right or wrong, e.g. code cracking
 Computationally expensive
 some problems require many days or weeks to run if the fitness evaluation is costly
Other issues
 Can be sensitive to initial parameters
 parameters such as the mutation rate can significantly influence the search
 The size of the population can have a significant effect
 Stochastic process
 not guaranteed to find an optimal solution
 for problems with multiple solutions (e.g. multi-modal problems) it converges to just one solution
 Extensions to multimodal/multiobjective concepts will be examined in later lectures
References

Townsend, A. "Genetic Algorithm Tutorial" (available on Blackboard under this lecture)

Whigham, P.A., Dick, G. & Maclaurin, J. (2017) "On the mapping of genotype to phenotype in evolutionary algorithms", Genetic Programming and Evolvable Machines 18, 353-361. https://doi.org/10.1007/s10710-017-9288-x

Eiben, A. & Smith, J. (2015) "From evolutionary computation to the evolution of things", Nature 521, 476-482. https://doi.org/10.1038/nature14544

Genetic Algorithm Tutorial & Java Applets (needs Java support in the browser): http://www.obitko.com/tutorials/genetic-algorithms/

Goldberg, D.E. (1989) "Genetic Algorithms in Search, Optimization and Machine Learning", Addison-Wesley, Reading (library)
