
Evolutionary Computation (演化式計算)
EE-6950, Fall 2023
Lecture # 4 - EC Components
Outline
I. Homework Solution Review

II. EC Components

III. Homework #3

I. Homework:
Solution Review
Homework Solution Review

• HW #02

• Solution has also been uploaded to iLearning

II. EC Components
Recap: Natural Evolution metaphor
• A population of individuals exists in an environment with limited resources

• Competition for those resources causes selection of those fitter individuals that are better
adapted to the environment

• These individuals act as seeds for the generation of new individuals through recombination
and mutation

• The new individuals have their fitness evaluated and compete (possibly also with parents)
for survival.

• Over time natural selection causes a rise in the fitness of the population
Connection with Evolutionary Algorithms
• Stochastic population of individuals

• Individuals have a state representation


• Can be thought of as their genetic code (genotype/phenotype)

• Individuals have a fitness (a measure of quality)


• Derived from individual’s state

• Variation operators for Individual’s state (inspired by natural genetics)


• Crossover (recombination, mating)
• Mutation
•  Push towards novelty (exploration)

• Selection mechanisms push towards higher fitness: “survival of the fittest”
Basic Evolutionary Algorithm (EA)

[Figure: the basic EA cycle — population, selection, mutation and recombination/crossover]

Basic EA Pseudo-Code
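The basic loop can be sketched in Python as a generic skeleton (a hedged sketch, not EV1 itself; the helper names `init`, `evaluate`, `select_parents`, `recombine`, `mutate`, and `select_survivors` are placeholders for the components discussed below, not fixed API names):

```python
def basic_ea(init, evaluate, select_parents, recombine, mutate,
             select_survivors, pop_size, generations):
    """Generic EA skeleton: initialize, then loop selection/variation/replacement."""
    population = [init() for _ in range(pop_size)]
    fitness = [evaluate(ind) for ind in population]
    for _ in range(generations):
        # parent selection
        parents = select_parents(population, fitness)
        # variation: recombination then mutation
        offspring = [mutate(child) for child in recombine(parents)]
        off_fit = [evaluate(ind) for ind in offspring]
        # survivor selection
        population, fitness = select_survivors(population, fitness,
                                               offspring, off_fit)
    best = max(range(len(population)), key=lambda i: fitness[i])
    return population[best], fitness[best]
```

Each component slot corresponds to one of the EA components covered in the rest of this lecture.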

Two pillars of evolution

There are two competing forces:

Increasing population diversity by genetic operators:
● Mutation
● Recombination
→ Push towards novelty

Decreasing population diversity by selection:
● of parents
● of survivors
→ Push towards quality
Evolutionary Algorithm Components

• Main EA components:
• Representation (Individual)
• Evaluation
• Population
• Selection: Parent selection/Survivor
• Variation: Recombination/Mutation
• Initialization/Termination
• Examples:
• Eight-queens problem
• EV1, the simplest EA ever!

Main EA components: Representation

• Role: Encoding for candidate solutions that can be manipulated by variation operators

• Leads to two levels of existence


• object in original problem context, external traits (phenotype)
• encoding to denote that object, the inside (“digital DNA”, genotype)

• Implies two mappings:


• Encoding: phenotype => genotype (not necessarily one-to-one – degenerate states)
• Decoding: genotype => phenotype (must be one-to-one)

EA components: Representation
Example: Represent integer values by their binary code

External representation (phenotype)    Internal state representation (genotype)

Encoding (representation):   18₁₀  →  10010₂
Decoding (inverse representation):   1001₂  →  9₁₀

Note: In order to find the global optimum, every feasible solution must be representable in internal state space
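The two mappings in this example translate directly to Python (a minimal sketch; the fixed bit `width` parameter is an assumption for fixed-length genotypes):

```python
def encode(value, width=5):
    """Phenotype -> genotype: non-negative integer to fixed-width bit string."""
    return format(value, '0{}b'.format(width))

def decode(bits):
    """Genotype -> phenotype: bit string back to the integer it denotes."""
    return int(bits, 2)
```

For example, encode(18) gives '10010' and decode('1001') gives 9, matching the slide.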
EA components: Evaluation (fitness) function
• Role:
• Represents the task to solve, requirements to adapt to (“the environment”)
• Enables selection (provides basis for comparison)
• e.g., some traits are advantageous, desirable
• Advantageous traits are rewarded by more offspring that can carry the same trait

• Also known as the quality function or objective function

• Typically assigns a real-valued fitness to each individual used as basis for selection
• So the more discrimination (different values) the better

• Typically we talk about fitness being maximized


• Some problems are best posed as minimization problems, conversion is trivial
EA components: Population

• Role: Holds the candidate solutions of the problem as individuals (states)

• Formally, a population is a multiset of individuals, i.e. repetitions possible

• Population is the basic unit of evolution, i.e., the population is evolving, not the
individuals

• Selection operators act on population level

• Variation operators act on individual level


EA components: Population

• Some EAs also impose a spatial structure on the population e.g., a grid

• Some EAs use multiple populations (population islands)

• Selection operators usually take whole population into account, i.e., reproductive probabilities are relative to current generation

• Diversity of a population refers to number of different fitnesses/states present
EA components: Selection mechanism
Role:
• Identifies individuals…
- To become parents
- To survive

• Pushes population towards higher fitness

• Usually probabilistic
• High quality solutions more likely to be selected than low quality
• But not guaranteed
• Even worst in current population usually has non-zero probability of being selected

• This stochastic nature can aid escape from local optima

EA components: Selection mechanism

Example: Roulette wheel selection


fitness(A) = 3   →   3/6 = 50%
fitness(B) = 1   →   1/6 = 17%
fitness(C) = 2   →   2/6 = 33%

In principle, any selection mechanism can be used for parent selection as well as for survivor selection (but there are tradeoffs…)
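Roulette wheel selection can be sketched with the standard library; `random.choices` with a weights argument performs exactly this fitness-proportionate spin (this sketch assumes all fitnesses are positive):

```python
import random

def roulette_select(population, fitnesses, k=1, rng=random):
    """Select k individuals with probability proportional to fitness.

    With fitness(A)=3, fitness(B)=1, fitness(C)=2 the selection
    probabilities are 3/6, 1/6, and 2/6, as on the wheel above.
    """
    return rng.choices(population, weights=fitnesses, k=k)
```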

EA components: Selection mechanism

• Survivor selection (also known as “replacement”)


• Most EAs use fixed population size so need a way of going from (parents + offspring) to next generation

• Often deterministic (while parent selection is usually stochastic)


• Fitness based: e.g., rank parents + offspring and take best
• Age based: make as many offspring as parents and delete all parents

• Sometimes a combination of stochastic and deterministic (elitism)

EA components: Variation operators

• Role: To generate new candidate solutions, explore state space


• Usually divided into two types according to their arity (number of inputs):
• Arity 1 : mutation operators
• Arity >1 : recombination operators
• Arity = 2: typically called crossover
• Arity > 2: is formally possible, seldom used in EC

• There has been much debate about relative importance of recombination and
mutation
• Nowadays most EAs use both
• Variation operators must match the given representation

EA components: Mutation
• Role: Causes random variance, potentially outside current “DNA” pool

• Acts on a single state (Individual) and delivers another (e.g., it’s a point operator)

• Element of randomness is essential, differentiates it from other unary heuristic operators

• Importance ascribed depends on representation and historical dialect:


• Binary GAs – background operator responsible for preserving and introducing diversity
• EP for FSM’s/continuous variables – the only search operator
• GP – hardly used

• May guarantee connectedness of search space and hence convergence proofs


EA components: Mutation

before 1 1 1 1 1 1 1

after 1 1 1 0 1 1 1
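The bit flip shown above can be sketched as a per-bit mutation (a common GA convention; the per-bit probability `pmut` is an assumption, often set to 1/chromosome-length):

```python
import random

def bitflip_mutation(bits, pmut, rng=random):
    """Flip each bit independently with probability pmut; returns a new list."""
    return [b ^ 1 if rng.random() < pmut else b for b in bits]
```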

EA components: Recombination
• Role: Merges information from parents into offspring

• Choice of what information to merge is stochastic

• Offspring may be worse, better, or the same as the parents

• Hope is that some are better by combining elements of genotypes that lead
to good traits

• Principle has been used for millennia by breeders of plants and livestock
EA components: Recombination

Parents (cut after position 3):
1 1 1 | 1 1 1 1
0 0 0 | 0 0 0 0

Offspring:
1 1 1 | 0 0 0 0
0 0 0 | 1 1 1 1
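The cut-and-exchange above is one-point crossover; a minimal sketch:

```python
def one_point_crossover(p1, p2, cut):
    """Exchange the tails of two equal-length parents at position cut,
    producing two offspring."""
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
```

With the all-ones and all-zeros parents above and cut = 3, the offspring are 1110000 and 0001111.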

EA components: Initialization/Termination

• Initialization usually random


• Need to ensure even spread and mixture of possible states
• Can include existing solutions, or use problem-specific heuristics, to “seed” the population

• Termination condition checked every generation


• Reaching some (known/hoped-for) fitness
• Reaching some maximum allowed number of generations
• Reaching some minimum level of diversity or entropy
• Reaching some specified number of generations without fitness improvement

Example: The 8-queens problem

Place 8 queens on an 8x8 chessboard in such a way that they cannot check each other
The 8-queens problem: Representation

External representation: a board configuration

Internal state: a permutation of the numbers 1–8

Possible mapping: 1 3 5 2 6 4 7 8
The 8-queens problem: Fitness evaluation

• Penalty of one queen: the number of queens she can check

• Penalty of a configuration: the sum of penalties of all queens

• Note: penalty is to be minimized

• Fitness of a configuration: inverse penalty, to be maximized
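Because a permutation already guarantees one queen per row and per column, only diagonal attacks contribute to the penalty. A sketch (counting each checking pair once, which halves the slide's per-queen sum without changing the ranking of configurations):

```python
def penalty(perm):
    """Number of checking (diagonal) pairs, counted once per pair; minimize.
    perm[i] is the row of the queen placed in column i."""
    n = len(perm)
    return sum(1
               for i in range(n)
               for j in range(i + 1, n)
               if abs(perm[i] - perm[j]) == abs(i - j))
```

A configuration with penalty 0 is a solution to the puzzle.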

The 8-queens problem: Mutation

Small variation in one permutation:


- swapping values of two randomly chosen positions

1 3 5 2 6 4 7 8   →   1 3 7 2 6 4 5 8
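The swap above can be sketched as follows; because it only exchanges two values, the result is always still a permutation:

```python
import random

def swap_mutation(perm, rng=random):
    """Swap the values at two randomly chosen positions; returns a new list."""
    child = list(perm)
    i, j = rng.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]
    return child
```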

The 8-queens problem: Recombination

Combining two permutations into two new permutations:


• choose random crossover point
• copy first parts into children
• create second part by inserting values from other parent:
- in the order they appear there
- beginning after crossover point
- skipping values already in child

Parents (cut after position 3):          Children:
1 3 5 | 2 6 4 7 8      →      1 3 5 | 4 2 8 7 6
8 7 6 | 5 4 3 2 1      →      8 7 6 | 2 4 1 3 5
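This recombination (often called “cut-and-crossfill”) can be sketched as:

```python
def cut_and_crossfill(p1, p2, cut):
    """Keep each parent's head; fill the tail from the other parent,
    scanning from just after the cut (wrapping around) and skipping
    values already present in the child."""
    def fill(head, donor):
        child = list(head)
        n = len(donor)
        for k in range(n):
            v = donor[(cut + k) % n]
            if v not in child:
                child.append(v)
        return child
    return fill(p1[:cut], p2), fill(p2[:cut], p1)
```

Both children are guaranteed to be valid permutations, which is why a plain one-point crossover is not used here.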

The 8-queens problem: Selection

• Parent selection:
• Pick 5 parents and take best two to undergo crossover

• Survivor selection (replacement)


• When inserting a new child into the population, choose an existing member
to replace by:
- sorting the whole population by decreasing fitness
- enumerating this list from high to low
- replacing the first with a fitness lower than the given child
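Both selection steps can be sketched as follows (assuming individuals carry a `fit` attribute, as in the EV1 code later in this lecture):

```python
import random

def parent_selection(population, rng=random):
    """Pick 5 individuals at random; return the best two for crossover."""
    candidates = rng.sample(population, 5)
    candidates.sort(key=lambda ind: ind.fit, reverse=True)
    return candidates[0], candidates[1]

def survivor_replacement(population, child):
    """Scan from high to low fitness; replace the first member whose
    fitness is lower than the child's (child is discarded if none is)."""
    population.sort(key=lambda ind: ind.fit, reverse=True)
    for i, ind in enumerate(population):
        if ind.fit < child.fit:
            population[i] = child
            return
```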

The 8-queens problem: Summary

Note: this is only one possible set of choices of operators and parameters
Example: 1-D optimization

• Simple 1-D Optimization problem from Lecture 03:


Maximize: f(x) = 50 − x²

Simple 1-D Optimization: Representation

• External representation: the real number x

• Internal state: the real number x

- Note: Genotype and phenotype are equivalent in this case (i.e., there is no encoding/decoding from internal state to natural external problem representation)
Simple 1-D Optimization: Fitness Evaluation

• Fitness (objective) function: f(x) = 50 − x²


- Fitness function maps internal state to fitness “landscape”

Simple 1-D Optimization: Mutation

• Mutation: Random choice from Gaussian distribution centered around x
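A sketch using the standard library (the same `normalvariate` call EV1 uses later):

```python
import random

def gaussian_mutation(x, stddev, rng=random):
    """Return a sample from a Gaussian centered on x (mean x, given stddev)."""
    return rng.normalvariate(x, stddev)
```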

Simple 1-D Optimization: Recombination

• Crossover/recombination:
Take the average value of x between the two parents

x′ = (x1 + x2)/2

Parent 1 Parent 2

Child

x1 x′ x2


Simple 1-D Optimization: Selection

• Parent Selection:
• Randomly choose two parents from population using uniform distribution

• Survivor selection (replacement):


- Use “replace worst” policy: child replaces worst solution in population if child has better fitness

Simple 1-D Optimization: Summary

Representation               Real number, x
Recombination                Average of parents
Recombination probability    100%
Mutation                     Random, Gaussian distribution
Mutation probability         25%
Mutation variance            1
Parent selection             Random, uniform distribution
Survivor selection           Replace worst
Population size              10
Number of offspring          1
Initialization               Random, uniform distribution
Termination condition        Max generations 20

Note: this is only one possible set of choices of operators and parameters
Putting it all together: EV1

• So does this really all work?!

• Let’s try a simple program for the 1-D optimization problem

• EV1: The simplest EA ever!!

EV1: The simplest EA ever!
• Note: EV1 is quite naïve, and has many fundamental limitations
• But even though it is quite simple, it works!
• I will post complete source code for EV1 online

from random import Random

# Note: Individual, fitnessFunc, findWorstIndex and the run parameters
# (randomSeed, populationSize, etc.) are defined elsewhere in ev1.py
def ev1():
    # start random number generator
    prng = Random()
    prng.seed(randomSeed)

    # random initialization of population
    population = []
    for i in range(populationSize):
        x = prng.uniform(minLimit, maxLimit)
        ind = Individual(x, fitnessFunc(x))
        population.append(ind)

    # evolution main loop
    for i in range(generationCount):
        # randomly select two parents
        parents = prng.sample(population, 2)

        # recombine using simple average
        childx = (parents[0].x + parents[1].x) / 2

        # random mutation using normal distribution
        if prng.random() <= mutationProb:
            childx = prng.normalvariate(childx, mutationStddev)

        # survivor selection: replace worst
        child = Individual(childx, fitnessFunc(childx))
        iworst = findWorstIndex(population)
        if child.fit > population[iworst].fit:
            population[iworst] = child
EV1

• Interactive code walkthrough/demo…

• Warning: Don't fall too in love with EV1, there are


many improvements to come!

EA’s in Action

• Representation & Historical types of EA’s

• Runtime stages/behavior

• EA’s and Domain Knowledge


• The early days: Universal problem solver
• Current: Domain-specific representation and operators

Representation & Different types of EAs

• Historically, different flavors of EAs have been associated with particular data
types to represent solutions
• Binary strings : Genetic Algorithms (GA’s)
• Real-valued vectors : Evolution Strategies
• Finite state Machines: Evolutionary Programming
• LISP trees: Genetic Programming

• These differences are largely irrelevant. Presently, the best strategy is:


• Choose representation to suit problem
• Choose variation operators to suit representation

• Selection operators only use fitness and so are independent of representation

EA behavior: Runtime stages, Convergence
Typical stages in optimizing a 1-D fitness landscape

Early stage:
quasi-random population distribution

Mid-stage:
population arranged around/on hills

Late stage:
population concentrated on high hills

EA behavior: Runtime stages, Convergence

[Figure: population snapshots at initial, intermediate, and final generations on a landscape with many local minima and one global minimum]

Note: Multiple local minima can “trap” simple hill-climbing algorithms

EA behavior: Typical progression of fitness

- Convergence near optimal solution


- Potential loss of diversity
- Potential loss of solving power

EA behavior: Evolutionary Algorithms in context

• There are many views on the use of EAs as robust problem solving tools

• For most problems a problem-specific tool may:


• perform better than a generic search algorithm on most instances
• have limited utility
• not do well on all instances

• Goal is to provide robust tools that provide:


• evenly good performance
• over a range of problems and instances

EA behavior: EAs as problem solvers: The view circa 1989
Performance of methods on problems

[Figure: performance curves over the scale of “all” problems — a special, problem-tailored method peaks sharply on its own problem; an evolutionary algorithm performs moderately well across the whole range; random search performs uniformly poorly]

(Attributed to Goldberg, 1989)

EA behavior: EAs and domain knowledge
• Trend in the 90’s:
- adding problem-specific knowledge to EAs
(special variation operators, repair, etc.)

• Result: EA performance curve “deformation”:


- better on problems of the given type
- worse on problems different from given type
- amount of added knowledge is variable
- (we will discuss this in much more detail later…)

• Recent theory suggests the search for a “universal” algorithm may be fruitless
EA behavior: EAs as problem solvers: The view circa 1996
Performance of methods on problems

[Figure: performance curves over the scale of “all” problems — several knowledge-enriched EAs (EA 1–EA 4), each peaking on a different region of the problem scale]

(Attributed to: Michalewicz, 1996)

Homework #3
- Due by Oct 5 @ noon
- Submit to iLearning
Homework #3

• Part 1: Download my version of EV1 (on iLearning, ev1.py) or create your own version of EV1
1. Examine the code: Try to better understand how this basic EC algorithm works

2. Experiment with parameters, what happens?


- Different random seeds, are the results the same?
- Different population sizes, generation count, mutation rate
- Try a few different 1-D fitness functions

Homework #3

• Part 2: Make the following modifications & improvements to ev1


1. New fitness function:
- Modify the existing EV1 python code to use the following new 1-D fitness function:
f(x) = −10 − (0.04x)² + 10 cos(0.04πx)

- Run some test parameters to see what happens, can you find the global maximum?
(suggested parameters: population=10, generations=50, random-seed=1234
mutation-prob=25%, mutation-stddev=1.0, min/max-x=-100/+100)

- For output data, please use the same format as my existing printStats function in ev1.py, otherwise you will drive the TA crazy! ☺
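The new fitness function is essentially a one-line change in Python (the name `fitnessFunc` matches the function used by the EV1 code shown earlier):

```python
import math

def fitnessFunc(x):
    """f(x) = -10 - (0.04x)^2 + 10*cos(0.04*pi*x); global maximum f(0) = 0."""
    return -10.0 - (0.04 * x) ** 2 + 10.0 * math.cos(0.04 * math.pi * x)
```

The cosine term has period 50, so there are local maxima near x = ±50 (f(±50) = −4) that can trap the search if the population loses diversity.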

2. Using matplotlib, add capability to ev1 to plot a few interesting runtime metrics
(doesn't need to be interactive or real-time, unless you prefer):
- Best fitness and state value vs. generation count
- Average and standard deviation of population fitness vs. generation count
- The fitness function, f(x)
Homework #3: Recommended reading

• Recommended reading:
- Eiben Chapter 3

Next lecture…
EC Representation
