05 Genetic Algorithms


Genetic Algorithms
Chapter 4.1.4

Introduction to Genetic Algorithms

• Another Local Search method
• Inspired by natural evolution
  – living things evolved into more successful organisms
  – offspring exhibit some traits of each parent

Introduction to Genetic Algorithms

• Keep a population of individuals that are complete solutions (or partial solutions)
• Explore the solution space by having these individuals “interact” and “compete”
  – interaction produces new individuals
  – competition eliminates weak individuals
• After multiple generations a strong individual (i.e., solution) should be found
• Simulated Evolution via a form of Randomized Beam Search

Introduction to Genetic Algorithms

• Mechanisms of evolutionary change:
  – Crossover (Alteration): the (random) combination of 2 parents’ chromosomes during reproduction, resulting in offspring that have some traits of each parent
• Crossover requires genetic diversity among the parents to ensure sufficiently varied offspring

Introduction to Genetic Algorithms

• Mechanisms of evolutionary change:
  – Mutation: the rare occurrence of errors during the process of copying chromosomes, resulting in
    • changes that are nonsensical or deadly, producing organisms that can't survive
    • changes that are beneficial, producing "stronger" organisms
    • changes that aren't harmful or beneficial, producing organisms that aren't improved

Introduction to Genetic Algorithms

• Mechanisms of evolutionary change:
  – Natural selection: the fittest survive in a competitive environment, resulting in better organisms
    • individuals with better survival traits generally survive for a longer period of time
    • this provides a better chance for reproducing and passing the successful traits on to offspring
    • over many generations the species improves, since better traits will outnumber weaker ones

Genetic Algorithm

1. Create initial random population
2. Evaluate fitness of each individual
3. Termination criterion satisfied? If yes, stop
4. Select parents according to fitness
5. Combine parents to generate offspring
6. Mutate offspring
7. Replace population by new offspring; go to step 2

Representation of Individuals

• Solutions represented as a vector of values
  – For example: Satisfiability problem (SAT)
    • determine if a statement in propositional logic is satisfiable, for example:
      (P1 ∨ P2) ∧ (P1 ∨ ¬P3) ∧ (P1 ∨ ¬P4) ∧ (¬P3 ∨ ¬P4)
    • each element corresponds to a symbol having a value of either true (i.e., 1) or false (i.e., 0)
    • vector: P1 P2 P3 P4
    • values:  1  0  1  1  ← representation of 1 individual
  – Traveling salesperson problem (TSP)
    • a tour can be represented as a sequence of cities visited
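For concreteness, a minimal Python sketch of the two representations above (variable names and values are illustrative, not from the slides):

# SAT: a bit vector; element i holds the truth value of symbol P(i+1).
sat_individual = [1, 0, 1, 1]            # P1=true, P2=false, P3=true, P4=true

# TSP: a permutation giving the order in which cities are visited.
tsp_individual = ["A", "C", "B", "D"]    # tour A -> C -> B -> D -> A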

Genetic Algorithm (1 version*)

1. Let s = {s1, …, sN} be the current population
2. Let p[i] = f(si) / Σj f(sj) be the fitness probabilities
3. for k = 1; k < N; k += 2
   • Parent1 = randomly pick si with prob. p[i]
   • Parent2 = randomly pick another sj with prob. p[j]
   • Randomly select 1 crossover point, and swap strings of parents 1 and 2 to generate two children t[k] and t[k+1]
4. for k = 1; k ≤ N; k++
   • Randomly mutate each position in t[k] with a small probability
5. New generation replaces old generation: s = t

*different than in book

Initialization: Seeding the Population

• Initialization sets the beginning population of individuals from which future generations are produced
• Issues:
  – size of the initial population
    • experimentally determined for the problem
  – diversity of the initial population (genetic diversity)
    • a problem resulting from lack of diversity is premature convergence to a non-optimal solution
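A minimal Python sketch of the GA version above, assuming individuals are equal-length bit vectors (lists of 0/1), f returns a non-negative fitness with a positive population total, and the population size N is even (all names are illustrative):

import random

def next_generation(s, f, mutation_prob=0.01):
    # One generation: fitness-proportional parent selection,
    # 1-point crossover, and per-position mutation.
    N = len(s)
    fitness = [f(si) for si in s]
    total = sum(fitness)
    p = [fi / total for fi in fitness]        # step 2: p[i] = f(si) / Σj f(sj)

    t = []
    for _ in range(0, N, 2):                  # step 3: produce N children, in pairs
        parent1 = random.choices(s, weights=p)[0]
        parent2 = random.choices(s, weights=p)[0]
        cut = random.randint(1, len(parent1) - 1)   # 1 random crossover point
        t.append(parent1[:cut] + parent2[cut:])     # child t[k]
        t.append(parent2[:cut] + parent1[cut:])     # child t[k+1]

    for child in t:                           # step 4: mutate each position
        for i in range(len(child)):
            if random.random() < mutation_prob:
                child[i] = 1 - child[i]       # bit flip

    return t                                  # step 5: s = t

# Usage: maximize the number of 1s in a 10-bit vector
population = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
for _ in range(50):
    population = next_generation(population, f=sum)
print(max(population, key=sum))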

Initialization: Seeding the Population

• How is a diverse initial population generated?
  – uniformly random: generate individuals randomly from the solution space with a uniform distribution
  – grid initialization: choose individuals at regular "intervals" from the solution space
  – non-clustering: require individuals to be a predefined "distance" away from those already in the population
  – local optimization: use another technique (e.g., HC) to find an initial population of local optima; doesn't ensure diversity, but guarantees the solution to be no worse than the local optima

Evaluation: Ranking by Fitness

• Evaluation ranks the individuals using some fitness measure that corresponds with the quality of the individual solutions
• For example, given individual i:
  – classification: (#correct(i))²
  – TSP: 1 / tour-length(i)
  – SAT: #ofClausesSatisfied(i)
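As an illustration, a sketch of the SAT fitness above for the formula from the representation slide; the clause encoding (literal +i for Pi, -i for ¬Pi) is an assumption of this sketch:

# (P1 ∨ P2) ∧ (P1 ∨ ¬P3) ∧ (P1 ∨ ¬P4) ∧ (¬P3 ∨ ¬P4)
clauses = [[1, 2], [1, -3], [1, -4], [-3, -4]]

def sat_fitness(individual):
    # Number of satisfied clauses; individual[i-1] holds Pi's value (1/0).
    def holds(lit):
        value = individual[abs(lit) - 1]
        return value == 1 if lit > 0 else value == 0
    return sum(1 for clause in clauses if any(holds(lit) for lit in clause))

print(sat_fitness([1, 0, 1, 1]))   # 3: only the last clause is unsatisfied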

Selection: Finding the Fittest

• Choose which individuals survive and possibly reproduce in the next generation
• Selection depends on the evaluation/fitness function
  – if too dependent, then, like greedy search, a non-optimal solution may be found
  – if not dependent enough, then it may not converge to a solution at all
• Nature doesn't eliminate all "unfit" genes; they usually become recessive for a long period of time, and then may mutate to something useful

Selection Techniques

• Deterministic Selection
  – relies on the evaluation/fitness function
  – converges fast
• Two approaches (both sketched below):
  – next generation contains parents and children
    • parents are the best of the current generation
    • parents produce children, and parents survive to the next generation
  – next generation contains only children
    • parents are the best of the current generation
    • parents are used to produce children
    • parents don't survive (counters early convergence)
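A Python sketch of both deterministic approaches; make_children stands for any user-supplied helper that applies crossover and mutation to the chosen parents (it is an assumption of this sketch, not something from the slides):

def deterministic_selection(population, f, num_parents, make_children, keep_parents):
    # Keep the num_parents fittest individuals as parents.
    parents = sorted(population, key=f, reverse=True)[:num_parents]
    children = make_children(parents)             # assumed crossover/mutation helper
    # keep_parents=True:  next generation = parents + children
    # keep_parents=False: next generation = children only (counters early convergence)
    return parents + children if keep_parents else children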

Selection Techniques

• Proportional Fitness Selection
  – each individual is selected proportionally to its fitness score
  – even the worst individual has a chance to survive
  – helps prevent “stagnation” in the population
• Two approaches:
  – rank selection: individual selected with a probability proportional to its rank in the population sorted by fitness
  – proportional selection: individual selected with probability Fitness(individual) / ∑ Fitness for all individuals

Selection Techniques

Proportional selection example:
• Given the following fitness values for the population:

  Individual   Fitness   Prob.
  A             5        10%
  B            20        40%
  C            11        22%
  D             8        16%
  E             6        12%

• Sum all the fitnesses: 5 + 20 + 11 + 8 + 6 = 50
• Determine the probability for each individual i: Fitness(i) / 50
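A minimal Python sketch of proportional (roulette-wheel) selection over the table above (names are illustrative):

import random

def proportional_select(population, fitness):
    # Pick one individual with probability Fitness(i) / ∑ Fitness.
    total = sum(fitness)
    r = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitness):
        running += fit
        if running >= r:
            return individual
    return population[-1]   # guard against floating-point round-off

pop, fits = ["A", "B", "C", "D", "E"], [5, 20, 11, 8, 6]
# B is selected about 40% of the time, but even A (10%) can survive.
print(proportional_select(pop, fits))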

Selection Techniques

• Crowding: a potential problem associated with selection
  – occurs when the individuals that are most fit quickly reproduce, so that a large percentage of the entire population looks very similar
  – reduces diversity in the population
  – may hinder the long-run progress of the algorithm

Crossover: Producing New Individuals

• Crossover is used to produce new individuals (i.e., children)
• Crossover for vector representations:
  – pick pairs of individuals as parents and randomly swap their segments
  – also known as "cut and splice"
• Parameters:
  – number of crossover points
  – positions of the crossover points

Crossover: Producing New Individuals

• 1-point Crossover
  – pick a dividing point in the parents' vectors and swap their segments
• Example
  – given parents: 1101101101 and 0001001000
  – crossover point: after the 4th digit
  – children produced are:
    1101 + 001000 and 0001 + 101101

Crossover: Producing New Individuals

• N-point Crossover
  – generalization of 1-point crossover
  – pick n dividing points in the parents' vectors and splice together alternating segments
• Uniform Crossover
  – the value of each element of the vector is randomly chosen from the values in the corresponding elements of the two parents
• Techniques also exist for permutation representations
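A Python sketch of the vector crossovers above (names are illustrative):

import random

def one_point_crossover(p1, p2):
    # Pick one dividing point and swap the parents' tails.
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def uniform_crossover(p1, p2):
    # Each element is drawn at random from the corresponding parent elements.
    return [random.choice(pair) for pair in zip(p1, p2)]

# The slides' example, with the cut after the 4th digit:
p1, p2 = "1101101101", "0001001000"
print(p1[:4] + p2[4:], p2[:4] + p1[4:])   # 1101001000 0001101101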

Producing New Individuals

• Mutation
  – randomly change an individual
  – e.g., TSP: two-swap, two-interchange
  – e.g., SAT: bit flip
• Parameters:
  – mutation rate
  – size of the mutation
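A Python sketch of the two mutation operators above (names are illustrative):

import random

def mutate_sat(individual, rate=0.01):
    # Bit-flip mutation: flip each bit with a small probability.
    return [1 - bit if random.random() < rate else bit for bit in individual]

def mutate_tsp_two_swap(tour):
    # Two-swap mutation: exchange two randomly chosen cities in the tour.
    i, j = random.sample(range(len(tour)), 2)
    mutated = list(tour)
    mutated[i], mutated[j] = mutated[j], mutated[i]
    return mutated

print(mutate_tsp_two_swap(["A", "B", "C", "D"]))   # e.g. ['A', 'D', 'C', 'B']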

GA Solving TSP

[figure-only slide: demonstration of a GA applied to a TSP instance]

Genetic Algorithm Applications

[figure-only slide: examples of GA applications]

Genetic Algorithms as Search

• The Problem of Local Maxima
  – individuals get stuck at pretty good, but not optimal, solutions
    • any small mutation gives worse fitness
  – crossover can help get out of a local maximum
  – mutation is a random process, so it is possible that a sudden large mutation gets these individuals out of this situation

Genetic Algorithms as Search

• GA is a kind of hill-climbing search
• Very similar to a randomized beam search
• One significant difference between GAs and HC is that in GAs it is generally a good idea to fill the local maxima up with individuals
• Overall, GAs have fewer problems with local maxima than HC or neural networks

Summary

• Easy to apply to a wide range of problems
  – optimization problems such as TSP
  – inductive concept learning
  – scheduling
  – layout
• Results can be very good on some problems and poor on others
• GA is very slow if only mutation is used; crossover makes the algorithm significantly faster
