02_GeneticAlgorithms
Evolution as a search problem
Competing animals and “Survival of the fittest”
“Fittest” animals
◼ Live longer
◼ Stronger
◼ More attractive
Hence, they get more mates and produce more, and "healthier", offspring
Nature is biased towards
“fitter” animals for
reproduction
Basis for Genetic
Algorithms
Parent chromosomes are
copied randomly to the
child
But copy errors can
happen: Mutation
Genetic Algorithm (GA)
Modelling a problem as a GA
A method for representing solutions as chromosomes (or
string of characters)
A way to calculate the fitness of a solution
One generation
A selection method to choose parents
𝐿=#items
𝑆= 1 1 0 0 0 1 0 1 0 1 0 0 0
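The ingredients above (a chromosome representation, a fitness function, and a parent-selection method) combine into one generation roughly as follows. A minimal sketch, assuming bit-string chromosomes, binary-tournament selection, and single-point crossover; these specific choices are illustrative, not prescribed by the slides:

```python
import random

def one_generation(population, fitness, p_mut):
    # Build the next generation from the current one (bit-string chromosomes).
    next_pop = []
    while len(next_pop) < len(population):
        # Selection: pick the better of two random individuals, twice.
        p1 = max(random.sample(population, 2), key=fitness)
        p2 = max(random.sample(population, 2), key=fitness)
        # Single-point crossover: child takes a prefix of p1 and suffix of p2.
        cut = random.randrange(1, len(p1))
        child = p1[:cut] + p2[cut:]
        # Mutation: flip each bit with probability p_mut.
        child = [b ^ 1 if random.random() < p_mut else b for b in child]
        next_pop.append(child)
    return next_pop
```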
1. Tournament Selection
2. Truncation Selection
3. Fitness Proportional Selection
Tournament Selection
Way of selecting one parent (individual) at a time
Choose k (the tournament size) individuals from the population at random
Choose the best individual from the tournament with probability 𝑝
Global exploration
Offspring are radically different from their parents
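The two steps above can be sketched as follows (the default values of k and p here are illustrative):

```python
import random

def tournament_select(population, fitness, k=3, p=0.9):
    # Draw k individuals at random and rank them best-first by fitness.
    contestants = sorted(random.sample(population, k), key=fitness, reverse=True)
    # Return the best with probability p; otherwise fall through to weaker ones.
    for ind in contestants:
        if random.random() < p:
            return ind
    return contestants[-1]
```

Smaller k, or p below 1, gives weaker individuals a chance and so increases exploration.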
Mutation
Flip a bit with a low probability 𝑝 = 1/𝐿
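A minimal sketch of bit-flip mutation, defaulting to the slide's 𝑝 = 1/𝐿 so that one bit flips on average:

```python
import random

def mutate(chromosome, p=None):
    # Flip each bit independently with a low probability p (default 1/L).
    p = 1.0 / len(chromosome) if p is None else p
    return [bit ^ 1 if random.random() < p else bit for bit in chromosome]
```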
Choosing Next Generation
Choosing only offspring can be risky
New generation can have lower fitness values
We can lose a really good sample
Elitism
Select a few of the fittest strings from the parents, say 𝑋
Replace strings from offspring with 𝑋
◼ Replacement at random, or,
◼ Replace the least fit
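A sketch of the "replace the least fit" variant: keep the 𝑋 fittest parents and drop the 𝑋 weakest offspring to make room for them.

```python
def next_generation(parents, offspring, fitness, X=2):
    # Elite: the X fittest parents survive unchanged.
    elite = sorted(parents, key=fitness, reverse=True)[:X]
    # Keep all but the X least-fit offspring; the elite fill the gap.
    survivors = sorted(offspring, key=fitness, reverse=True)[:len(offspring) - X]
    return elite + survivors
```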
Tournament
The two fittest parents and their offspring compete
Tournament winners proceed to the next round
Elitism and Tournament can lead to premature convergence
Both promote fitter members
After some time, same set of fittest members keep getting promoted
Exploration is downplayed
Solution:
Niching (or “using island populations”)
Fitness sharing
Niching
Separate the population into subpopulations
Each subpopulation converges independently to a different local minimum
A few members of one sub-population are randomly injected as
“immigrants” to another.
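A sketch of the migration step, assuming a ring of islands where each island sends a few random immigrants to its neighbour (the topology and migrant count are illustrative):

```python
import random

def migrate(islands, n_migrants=2):
    # Remove a few random individuals from each island...
    migrants = []
    for pop in islands:
        picked = random.sample(range(len(pop)), n_migrants)
        migrants.append([pop[i] for i in picked])
        for i in sorted(picked, reverse=True):
            del pop[i]
    # ...and inject them as "immigrants" into the next island in the ring.
    for i, group in enumerate(migrants):
        islands[(i + 1) % len(islands)].extend(group)
    return islands
```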
Fitness Sharing
Shared fitness: 𝐹′𝛼 = 𝐹𝛼 / (#times 𝛼 appears in the population)
Biased for uncommon strings
But, can lose very good common strings
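A minimal sketch of the shared-fitness formula, dividing each string's raw fitness by its number of copies in the population:

```python
from collections import Counter

def shared_fitness(population, fitness):
    # Count duplicates, then divide each string's raw fitness by its count,
    # penalising common strings and favouring rare ones.
    counts = Counter(tuple(ind) for ind in population)
    return [fitness(ind) / counts[tuple(ind)] for ind in population]
```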
Modelling a problem as a GA
A method for representing solutions as chromosomes (or string of
characters)
A way to calculate the fitness of a solution
Mutation
Crossover
Limitations of GA
Can get stuck in a local minimum for a very long time
Like a black box:
We don’t know what the error landscape looks like or how the algorithm is working
No guarantee to converge
Training Neural Networks with GA
Encode weights as strings
Fitness function: sum-of-square errors
Reasonably good results.
Problems:
Local error information at an output node is lost – the entire error is clubbed
into one number
Not using gradient information
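A toy sketch of the idea: evolve the two weights of a linear model y = wx + b by mutation plus elitism, with fitness defined as the negative sum-of-square error. The dataset, population size, and mutation scale below are all made up for illustration:

```python
import random

# Hypothetical data generated from y = 2x + 1.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]

def fitness(chrom):
    # Negative sum-of-square error: higher is better.
    w, b = chrom
    return -sum((w * x + b - y) ** 2 for x, y in data)

def evolve(generations=200, pop_size=30):
    # Chromosome = the weight vector [w, b] (value encoding).
    pop = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:5]  # elitism: the 5 fittest survive unchanged
        # Refill the population with mutated copies of the elite.
        pop = elite + [[g + random.gauss(0, 0.1) for g in random.choice(elite)]
                       for _ in range(pop_size - 5)]
    return max(pop, key=fitness)
```

Note that, as the slide says, no gradient of the error is used anywhere, only its scalar value per chromosome.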
Types of Encoding in GA
Binary Encoding: a string of 0s and 1s
E.g.: Knapsack problem
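With binary encoding, a knapsack chromosome and its fitness might look like this; the weights, values, and capacity below are made up for illustration:

```python
# S[i] = 1 means item i is packed; fitness is the total value,
# or 0 when the chosen items exceed the capacity (infeasible).
weights = [3, 5, 2, 4, 6, 1, 7, 3, 2, 5, 4, 6, 2]   # hypothetical item weights
values  = [4, 6, 3, 5, 8, 1, 9, 4, 2, 6, 5, 7, 3]   # hypothetical item values
CAPACITY = 20

def knapsack_fitness(S):
    total_w = sum(w for w, bit in zip(weights, S) if bit)
    total_v = sum(v for v, bit in zip(values, S) if bit)
    return total_v if total_w <= CAPACITY else 0

S = [1, 1, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0]  # a chromosome of length L = 13
```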
Permutation Encoding: string of numbers representing a sequence
E.g.: Sorting a sequence of numbers
4 2 5 9 0 7 8 1 3 6
How to do crossover?
4 2 5 9 0 7 8 1 3 6
+
8 1 9 4 2 5 6 3 0 7
=
4 2 5 9 0 8 1 6 3 7
How to mutate?
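One crossover scheme consistent with the example above: copy a prefix from the first parent, then fill the remaining positions with the second parent's genes in their original order, skipping duplicates. For mutation, swapping two random positions keeps the string a valid permutation. A sketch:

```python
import random

def order_crossover(p1, p2, cut):
    # Prefix from parent 1; the rest in parent 2's order, no duplicates.
    head = p1[:cut]
    return head + [g for g in p2 if g not in head]

def swap_mutation(perm):
    # Swap two random positions; the result is still a permutation.
    child = perm[:]
    i, j = random.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]
    return child
```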
Value Encoding: a string of real numbers
E.g.: Training weights in neural networks
Tree Encoding: Each chromosome is a tree
E.g.: Given input and output values, find a function which gives the best
(closest to desired) output for all inputs.
(+ x (/ 5 y))
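A sketch of evaluating such a tree chromosome, representing (+ x (/ 5 y)) as nested Python tuples (the tuple representation is an assumption, not something the slides specify):

```python
import operator

OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def evaluate(tree, env):
    if isinstance(tree, tuple):            # internal node: (op, left, right)
        op, left, right = tree
        return OPS[op](evaluate(left, env), evaluate(right, env))
    return env.get(tree, tree)             # leaf: variable name or constant

chromosome = ('+', 'x', ('/', 5, 'y'))     # the slide's (+ x (/ 5 y))
```

Fitness for such a chromosome would measure how close its outputs come to the desired outputs over all inputs.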