05 Genetic Algorithms
Introduction to Genetic Algorithms
• Mechanisms of evolutionary change:
– Mutation: the rare occurrence of errors during the process of copying chromosomes, resulting in
• changes that are nonsensical or deadly, producing organisms that can't survive
• changes that are beneficial, producing "stronger" organisms
• changes that aren't harmful or beneficial, producing organisms that aren't improved
– Natural selection: the fittest survive in a competitive environment, resulting in better organisms
• individuals with better survival traits generally survive for a longer period of time
• this provides a better chance for reproducing and passing the successful traits on to offspring
• over many generations the species improves, since better traits will outnumber weaker ones
Genetic Algorithm
(flowchart)
Create initial random population
→ Evaluate fitness of each individual
→ Termination criterion satisfied? (yes → stop)
→ no: Select parents according to fitness
→ Combine parents to generate offspring
→ Mutate offspring
→ (loop back to fitness evaluation)

Representation of Individuals
• Solutions represented as a vector of values
– For example: Satisfiability problem (SAT)
• determine if a statement in propositional logic is satisfiable, for example:
(P1 ∨ P2) ∧ (P1 ∨ ¬P3) ∧ (P1 ∨ ¬P4) ∧ (¬P3 ∨ ¬P4)
• each element corresponds to a symbol having a value of either true (i.e., 1) or false (i.e., 0)
• vector: P1 P2 P3 P4
• values: 1 0 1 1 ← representation of 1 individual
– Traveling salesperson problem (TSP)
• Tour can be represented as a sequence of cities visited
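The SAT representation above can be sketched in Python. This is an illustrative sketch, not from the slides: the clause encoding and the choice of "number of satisfied clauses" as the fitness function are assumptions (the slides define only the representation).

```python
# Clauses of (P1 v P2) ^ (P1 v ~P3) ^ (P1 v ~P4) ^ (~P3 v ~P4),
# encoded as (symbol index, negated?) pairs. Hypothetical encoding.
CLAUSES = [
    [(0, False), (1, False)],
    [(0, False), (2, True)],
    [(0, False), (3, True)],
    [(2, True), (3, True)],
]

def fitness(individual):
    """Count how many clauses the 0/1 assignment satisfies
    (an assumed fitness choice, not specified in the slides)."""
    satisfied = 0
    for clause in CLAUSES:
        if any((individual[i] == 0) if neg else (individual[i] == 1)
               for i, neg in clause):
            satisfied += 1
    return satisfied

# The individual from the slide: P1=1, P2=0, P3=1, P4=1
print(fitness([1, 0, 1, 1]))  # 3 of the 4 clauses are satisfied
```

The individual 1 0 1 1 from the slide satisfies the first three clauses but not (¬P3 ∨ ¬P4), so its fitness under this scheme is 3.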
Genetic Algorithm (1 version*)
1. Let s = {s1, …, sN} be the current population
2. Let p[i] = f(si) / Σj f(sj) be the fitness probabilities
3. for k = 1; k < N; k += 2
• Parent1 = randomly pick si with prob. p[i]
• Parent2 = randomly pick another sj with prob. p[j]
• Randomly select 1 crossover point, and swap strings of parents 1 and 2 to generate two children t[k] and t[k+1]
4. for k = 1; k ≤ N; k++
• Randomly mutate each position in t[k] with a small probability
5. New generation replaces old generation: s = t
*different than in book

Initialization: Seeding the Population
• Initialization sets the beginning population of individuals from which future generations are produced
• Issues:
– size of the initial population
• experimentally determined for problem
– diversity of the initial population (genetic diversity)
• a problem resulting from lack of diversity is premature convergence to a non-optimal solution
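The five numbered steps of "Genetic Algorithm (1 version)" above can be sketched in Python for binary vectors. This is a sketch under assumptions: the parameter defaults are illustrative, mutation is a bit flip (appropriate for SAT-style vectors), and parents are drawn with replacement, which slightly relaxes the pseudocode's "pick another sj".

```python
import random

def genetic_algorithm(population, fitness, mutate_prob=0.01, generations=100):
    """One-point-crossover GA following the five-step pseudocode.
    Assumes individuals are equal-length lists of 0/1 values."""
    n = len(population)
    for _ in range(generations):
        # Step 2: fitness-proportionate probabilities p[i] = f(si) / sum_j f(sj)
        weights = [fitness(s) for s in population]
        if sum(weights) == 0:
            weights = [1] * n  # degenerate case: select uniformly
        children = []
        # Step 3: pick parents by fitness, cross over at one random point
        while len(children) < n:
            parent1, parent2 = random.choices(population, weights=weights, k=2)
            cut = random.randrange(1, len(parent1))
            children.append(parent1[:cut] + parent2[cut:])
            children.append(parent2[:cut] + parent1[cut:])
        # Step 4: mutate each position with a small probability (bit flip)
        children = [
            [1 - g if random.random() < mutate_prob else g for g in child]
            for child in children[:n]
        ]
        # Step 5: new generation replaces old generation
        population = children
    return max(population, key=fitness)
```

With the SAT fitness sketched earlier (or any other fitness over binary vectors), this returns the best individual in the final generation.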
Selection: Finding the Fittest
• Choose which individuals survive and possibly reproduce in the next generation
• Selection depends on the evaluation/fitness function
– if too dependent, then, like greedy search, a non-optimal solution may be found
– if not dependent enough, then it may not converge to a solution at all
• Nature doesn't eliminate all "unfit" genes; they usually become recessive for a long period of time, and then may mutate to something useful

Selection Techniques
• Deterministic selection
– relies on the evaluation/fitness function
– converges fast
• Two approaches:
– next generation contains parents and children
• parents are the best of the current generation
• parents produce children, and parents survive to the next generation
– next generation contains only children
• parents are the best of the current generation
• parents are used to produce children
• parents don't survive (counters early convergence)
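The two deterministic-selection approaches above can be sketched as two small helpers. Function names and the "keep the best n" formulation are illustrative assumptions; the slides describe the schemes only in prose.

```python
def next_gen_with_parents(parents, children, fitness, n):
    """First approach: the next generation keeps the best n individuals
    drawn from parents AND children together."""
    return sorted(parents + children, key=fitness, reverse=True)[:n]

def next_gen_children_only(children, fitness, n):
    """Second approach: the next generation keeps only the best n children;
    parents don't survive, which counters early convergence."""
    return sorted(children, key=fitness, reverse=True)[:n]
```

Under the first scheme a strong parent can dominate indefinitely; the second scheme forces turnover every generation at the cost of possibly discarding the best solution seen so far.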
Selection Techniques
• Crowding: a potential problem associated with selection
– occurs when the most-fit individuals reproduce quickly, so that a large percentage of the entire population looks very similar
– reduces diversity in the population
– may hinder the long-run progress of the algorithm

Crossover: Producing New Individuals
• Crossover is used to produce new individuals (i.e., children)
• Crossover for vector representations:
– Pick pairs of individuals as parents and randomly swap their segments
– also known as "cut and splice"
• Parameters:
– number of crossover points
– positions of the crossover points
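The "cut and splice" operation for vector representations can be sketched as follows; a minimal one-point version (multi-point crossover just repeats the swap at each additional point):

```python
import random

def one_point_crossover(parent1, parent2, point=None):
    """Swap the tails of two equal-length parent vectors at one
    crossover point, producing two children."""
    if point is None:
        point = random.randrange(1, len(parent1))  # random cut position
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

print(one_point_crossover([1, 1, 1, 1], [0, 0, 0, 0], point=2))
# → ([1, 1, 0, 0], [0, 0, 1, 1])
```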
Producing New Individuals
• Mutation
– randomly change an individual
– e.g. TSP: two-swap, two-interchange
– e.g. SAT: bit flip
• Parameters:
– mutation rate
– size of the mutation

(The "Genetic Algorithm (1 version*)" pseudocode from earlier is repeated here in the original slides.)
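The two mutation operators named above, bit flip for SAT and two-swap for TSP, can be sketched like this (the `rate` default is an illustrative assumption):

```python
import random

def bit_flip(individual, rate=0.05):
    """SAT-style mutation: flip each bit independently with probability rate."""
    return [1 - g if random.random() < rate else g for g in individual]

def two_swap(tour):
    """TSP-style mutation: exchange two randomly chosen cities, so the
    result is still a valid permutation of the same cities."""
    i, j = random.sample(range(len(tour)), 2)
    mutated = tour[:]
    mutated[i], mutated[j] = mutated[j], mutated[i]
    return mutated
```

Note the design difference: bit flip may change any number of positions, while two-swap is constrained to preserve tour validity, since an arbitrary bit-level change to a TSP tour would usually not be a tour at all.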
GA Solving TSP
Genetic Algorithm Applications
Genetic Algorithms as Search
• GA is a kind of hill-climbing search
• Very similar to a randomized beam search
• One significant difference between GAs and HC is that in GAs it is generally a good idea to fill the local maxima up with individuals
• Overall, GAs have fewer problems with local maxima than HC or neural networks

Genetic Algorithms as Search: The Problem of Local Maxima
• Individuals get stuck at pretty good, but not optimal, solutions
– any small mutation gives worse fitness
– crossover can help get out of a local maximum
– mutation is a random process, so it is possible that a sudden large mutation may get these individuals out of this situation
Summary
• Easy to apply to a wide range of problems
– optimization problems such as TSP
– inductive concept learning
– scheduling
– layout
• Results can be very good on some problems and poor on others
• GA is very slow if only mutation is used; crossover makes the algorithm significantly faster