12-Modern Optimization Techniques - Evolutionary Algorithms - W
[email protected]
05077058644
OUTLINE
• Introduction
• Genetic Algorithms
Introduction
Some optimization methods that are conceptually different from the traditional
optimization techniques are labeled as modern, evolutionary, or nature-inspired
methods of optimization.
Introduction
Most of these methods are based on certain characteristics and behaviors of biological, molecular, insect-swarm, and neurobiological systems. Some of them are listed below:
1. Genetic algorithms
2. Particle swarm optimization
3. Simulated annealing
4. Ant colony optimization
5. Fuzzy optimization
6. Neural-network-based methods
…Etc.
Introduction
What inspired these algorithms?
• The genetic algorithms are based on the principles of natural genetics and natural
selection (survival of the fittest).
• Ant colony optimization is based on the cooperative behavior of real ant colonies,
which are able to find the shortest path from their nest to a food source.
Introduction
• Most of them use only the function values in the search process to make progress toward a solution, without regard to how the functions are evaluated.
• In general, continuity or differentiability of the problem functions is neither required nor used in the calculations of the algorithms. Therefore, the algorithms are very general and can be applied to all kinds of problems: discrete, continuous, and nondifferentiable.
• In addition, the methods seek global optimum solutions, as opposed to the local solutions determined by a derivative-based optimization algorithm (population-based methods instead of a single starting point).
• The methods are easy to use and program since they do not require the gradients of the cost or constraint functions.
Genetic Algorithms
• Genetic algorithms loosely parallel biological evolution and are based on Darwin’s
theory of natural selection / survival of the fittest.
• Although GAs were first presented systematically by Holland, the basic ideas of analysis
and design based on the concepts of biological evolution can be found in the work of
Rechenberg.
Genetic Algorithms
GAs differ from the traditional methods of optimization in the
following respects:
• A population of points (trial design vectors) is used for
starting the procedure instead of a single starting point. Since
several points are used as candidate solutions, GAs are less
likely to get trapped at a local optimum. The probability of
convergence to the global optimum is relatively high.
• GAs use only the values of the objective function. The
derivatives are not used in the search procedure.
• In GAs the design variables are represented as strings of
binary variables that correspond to the chromosomes in
natural genetics. Thus the search method is naturally
applicable for solving discrete and integer programming
problems. For continuous design variables, the string length
can be varied to achieve any desired resolution.
Genetic Algorithms
• The objective function value corresponding to a design vector plays the role of fitness
in natural genetics.
In every new generation, a new set of strings (chromosomes) is produced by using randomized parent selection and crossover from the old generation (the old set of strings/chromosomes).
• Although randomized, GAs are not simple random search techniques. They efficiently
explore the new combinations with the available knowledge to find a new generation
with better fitness or objective function value.
Genetic Algorithms
Representation of the design variable – Binary encoding:
In GAs, the design variables are represented as strings of binary digits, 0 and 1. This set of strings is called a chromosome.
Genetic Algorithms
Representation of the design variable – Binary encoding:
If each design variable x_i is coded in a string of length q, a design vector X with n design variables is represented by a string of total length nq.
For example, if a string of length 5 is used to represent each variable, a total string of length 20 describes a design vector with n = 4. Such a string of 20 binary digits denotes the vector (x_1, x_2, x_3, x_4). It is an individual in the population; if it is selected, it will be a parent for the next generation.
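As an illustration of this encoding, here is a minimal decoding sketch in Python. The 20-bit string, the number of bits per variable, and the bounds x_lo and x_hi are assumed values for illustration only, not the ones used on the slides.

```python
# Sketch: decoding a binary chromosome into design variables.
# Assumptions (not from the lecture): n = 4 variables, q = 5 bits each,
# and hypothetical bounds x_lo = 0.0, x_hi = 3.1 for every variable.

def decode(chromosome, n_vars=4, bits_per_var=5, x_lo=0.0, x_hi=3.1):
    """Map a 0/1 string of length n_vars*bits_per_var to n_vars real values."""
    assert len(chromosome) == n_vars * bits_per_var
    max_int = 2 ** bits_per_var - 1                  # largest integer on q bits
    values = []
    for i in range(n_vars):
        bits = chromosome[i * bits_per_var:(i + 1) * bits_per_var]
        integer = int("".join(map(str, bits)), 2)    # binary -> integer
        values.append(x_lo + (x_hi - x_lo) * integer / max_int)  # scale to [x_lo, x_hi]
    return values

# Example: a 20-bit individual (an illustrative string, not the one on the slide)
individual = [0, 1, 0, 1, 1,  1, 0, 0, 1, 0,  1, 1, 1, 1, 1,  0, 0, 0, 0, 1]
print(decode(individual))    # four real-valued design variables
```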
Genetic Algorithms
Representation of the design variable – Binary encoding:
The number of binary digits (q) needed to represent a continuous design variable x in steps (accuracy) of Δx can be computed from the relation
2^q ≥ (x^(u) - x^(l)) / Δx + 1
where x^(l) and x^(u) denote the lower and upper bounds on the variable.
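A small sketch of this relation in Python; the bounds, accuracy, and the helper name bits_needed are hypothetical values chosen for illustration.

```python
import math

# Sketch: number of bits q needed so that the q-bit grid over [x_lo, x_hi]
# has step size no larger than dx, i.e. 2**q >= (x_hi - x_lo)/dx + 1.
def bits_needed(x_lo, x_hi, dx):
    levels = (x_hi - x_lo) / dx + 1      # number of distinct values required
    return math.ceil(math.log2(levels))

# Hypothetical bounds and accuracy (not from the lecture):
print(bits_needed(0.0, 10.0, 0.01))      # -> 10, since 2**10 = 1024 >= 1001
```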
Genetic Algorithms
Representation of the objective function:
Because genetic algorithms are based on the survival-of-the-fittest principle of nature, they
try to maximize a function called the fitness function. Thus GAs are naturally suitable for
solving unconstrained maximization problems.
f(X): objective function; F(X): fitness function. For a maximization problem the fitness can be taken directly as F(X) = f(X), and a minimization problem can be converted into a maximization problem, for example by taking F(X) = -f(X). Many GA implementations, however, expect the fitness value to be positive, with higher being better. If you multiply by -1, you may end up with negative fitness values, which can cause errors or unexpected behaviour. You can multiply by -1 in principle, but you must ensure that the GA's fitness expectations are still met (a transformation such as F(X) = 1/(1 + f(X)) is often used for minimization instead).
A constrained problem is first converted into an unconstrained optimization problem via penalty functions, which try to keep the design variables within the feasible region:
φ(X) = f(X) + Σ_{j=1}^{m} R_j ⟨g_j(X)⟩² + Σ_{k=1}^{p} r_k [h_k(X)]²
where R_j and r_k are constant penalty parameters and ⟨g_j(X)⟩ = max(0, g_j(X)) is the bracket function. The penalty is proportional to the square of the amount of violation of the inequality and equality constraints at the design vector X, while no penalty is added to f(X) if all the constraints are satisfied at X.
Genetic Algorithms
Representation of the objective function with constraints:
The fitness function, F(X), to be maximized in the GAs can then be obtained from the pseudo-objective φ(X) in the same way as in the unconstrained case, for example as F(X) = -φ(X) or F(X) = 1/(1 + φ(X)).
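The sketch below illustrates one way of combining the penalty formulation and the fitness transformation in Python. The names pseudo_objective and fitness, the penalty parameters R and r, and the toy constraint are assumptions for illustration, and F = 1/(1 + φ) is just one common positive-fitness choice.

```python
# Sketch of a penalty-based fitness for a generic constrained minimization
#   minimize f(X)  subject to g_j(X) <= 0 and h_k(X) = 0.

def pseudo_objective(x, f, gs=(), hs=(), R=1000.0, r=1000.0):
    """phi(X) = f(X) + R*sum(<g_j(X)>^2) + r*sum(h_k(X)^2), with <g> = max(0, g)."""
    penalty = R * sum(max(0.0, g(x)) ** 2 for g in gs)   # inequality violations
    penalty += r * sum(h(x) ** 2 for h in hs)            # equality violations
    return f(x) + penalty

def fitness(x, f, gs=(), hs=(), R=1000.0, r=1000.0):
    """Map minimization of phi to a positive 'higher is better' fitness (assumes phi >= 0)."""
    return 1.0 / (1.0 + pseudo_objective(x, f, gs, hs, R, r))

# Usage with a toy problem: minimize x0^2 + x1^2 subject to x0 + x1 >= 1.
f = lambda x: x[0] ** 2 + x[1] ** 2
g = lambda x: 1.0 - x[0] - x[1]            # rewritten in g(x) <= 0 form
print(fitness([0.5, 0.5], f, gs=[g]))      # feasible point: 1/(1 + 0.5)
print(fitness([0.0, 0.0], f, gs=[g]))      # infeasible point: heavily penalized
```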
Genetic Algorithms
Genetic Operators
GAs start with a population of random strings denoting several (population of) design
vectors. The population size is usually fixed. Each string (or design vector) is evaluated to
find its fitness value. After the evaluation of the fitnesses, three operators are used to
generate the new population of points (designs).
Genetic operators
1.Reproduction (Selection)
2.Crossover
3.Mutation
One cycle of reproduction, crossover, and mutation, together with the evaluation of the fitness values, constitutes a generation.
Reproduction (selection) is the first operation applied to the population: it selects good strings/individuals (designs) of the population to form a mating pool. The selected individuals are called parents. In a commonly used reproduction operator, a string is selected for the mating pool with a probability proportional to its fitness. The most commonly used method of this kind is roulette-wheel selection.
Genetic Algorithms
Genetic Operators - Reproduction (Selection)
If F_i denotes the fitness of the i-th string (individual) in a population of size n, the probability of selecting the i-th individual for the mating pool (P_i) is given by
P_i = F_i / Σ_{j=1}^{n} F_j ,   i = 1, 2, ..., n
These probabilities are used to determine the cumulative probability Q_i of string i being copied to the mating pool, by adding the individual probabilities of strings 1 through i, as
Q_i = Σ_{j=1}^{i} P_j
Genetic Algorithms
Genetic Operators - Reproduction (Selection)
Thus the roulette-wheel selection process can be implemented by associating the cumulative probability range (Q_{i-1}, Q_i] with the i-th string.
Genetic Algorithms
Genetic Operators - Reproduction (Selection)
By spinning the roulette wheel n times (n being the population size) and selecting, each time, the string chosen by the roulette-wheel pointer, we obtain a mating pool of size n.
By this process, the string/individual with a higher (lower) fitness value will be selected
more (less) frequently to the mating pool because it has a larger (smaller) range of
cumulative probability. Thus strings with high fitness values in the population,
probabilistically, get more copies in the mating pool.
Note that no new strings/individuals are formed in the reproduction stage; only the
existing strings/individuals in the population get copied to the mating pool. The
reproduction stage ensures that highly fit individuals (strings) live and reproduce, and less
fit individuals (strings) die. Thus the GAs simulate the principle of “survival-of-the-fittest”
of nature.
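A minimal roulette-wheel selection sketch, assuming positive fitness values; the population size and the fitness values in the usage line are hypothetical, not taken from the lecture example.

```python
import random

# Sketch of roulette-wheel selection for a mating pool of given size.
def roulette_wheel(fitnesses, pool_size, rng=random):
    total = sum(fitnesses)
    probs = [F / total for F in fitnesses]      # P_i = F_i / sum_j F_j
    cumulative = []                             # Q_i = P_1 + ... + P_i
    running = 0.0
    for p in probs:
        running += p
        cumulative.append(running)
    pool = []
    for _ in range(pool_size):                  # spin the wheel pool_size times
        spin = rng.random()
        for i, q in enumerate(cumulative):
            if spin <= q:                       # string i owns the range (Q_{i-1}, Q_i]
                pool.append(i)
                break
        else:                                   # guard against floating-point round-off
            pool.append(len(cumulative) - 1)
    return pool                                 # indices of the selected parents

# Hypothetical fitness values for a population of 4 strings:
print(roulette_wheel([10.0, 40.0, 30.0, 20.0], pool_size=4))
```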
Genetic Algorithms
Genetic Operators - Reproduction (Selection)
Example for mating-pool: Find the levels of contribution of the various strings to the mating
pool using the roulette-wheel selection process with the following 12 random numbers: 0.41,
0.65, 0.42, 0.80, 0.67, 0.39, 0.63, 0.53, 0.86, 0.88, 0.75, 0.55.
Genetic Algorithms
Genetic Operators - Crossover
The purpose of crossover is to create new strings
by exchanging information among strings of the
mating pool.
In most crossover operators, two individual
strings (designs) are picked (or selected) at
random from the mating pool generated by the
reproduction operator and some portions of the
strings are exchanged between the strings.
In the single-point crossover operator, a crossover site is selected at random along the string length, and the binary digits lying on the right side of the crossover site are swapped (exchanged) between the two strings.
Genetic Algorithms
Genetic Operators - Crossover
The number of crossover sites can be increased to two or higher (Two point crossover /
three point crossover). The crossover site is selected randomly in each crossover. The
number of crossover sites can even be selected randomly, which is called a uniform
crossover.
Genetic Algorithms
Genetic Operators - Crossover
For example, two design vectors (parents), each with a string length of 10, produce two children by exchanging the portions of their strings lying to the right of a randomly chosen crossover site.
The children may or may not be better than their parents in terms of their fitness values. If they are better, they will contribute to a faster improvement of the average fitness value of the new population. If they are worse than their parents, they will not survive very long, as they are less likely to be selected in the next reproduction stage (because of the survival-of-the-fittest strategy used).
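A minimal single-point crossover sketch; the two 10-bit parent strings shown are illustrative only, not the parents given on the slide.

```python
import random

# Sketch of single-point crossover on two parent bit strings of equal length.
def single_point_crossover(parent1, parent2, rng=random):
    assert len(parent1) == len(parent2)
    site = rng.randint(1, len(parent1) - 1)    # crossover site, away from the ends
    child1 = parent1[:site] + parent2[site:]   # swap the bits to the right of the site
    child2 = parent2[:site] + parent1[site:]
    return child1, child2

# Illustrative 10-bit parents (not the strings from the slide):
p1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
p2 = [0, 1, 0, 0, 1, 1, 0, 1, 0, 0]
print(single_point_crossover(p1, p2))
```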
Genetic Algorithms
Genetic Operators - Mutation
The mutation operator is applied to the new strings with a specified small mutation probability, p_m. The mutation operator changes a binary digit (allele value) from 0 to 1 and vice versa. There are several mutation methods.
Single-point mutation: a mutation site is selected at random along the string length, and the binary digit at that site is changed from 1 to 0 or 0 to 1 with probability p_m.
Bit-wise mutation: each bit (binary digit) in the string is considered one at a time in sequence, and the digit is changed from 1 to 0 or 0 to 1 with probability p_m.
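A sketch of bit-wise mutation, with an assumed mutation probability p_m (the value 0.1 in the usage line is purely illustrative):

```python
import random

# Sketch of bit-wise mutation: each bit is flipped independently with probability p_m.
def bitwise_mutation(chromosome, p_m=0.01, rng=random):
    return [1 - bit if rng.random() < p_m else bit for bit in chromosome]

print(bitwise_mutation([0, 1, 1, 0, 1, 0, 0, 1], p_m=0.1))
```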
Genetic Algorithms
Genetic Operators - Mutation
Purposes of the mutation
1. To generate a string (design point) in the neighborhood of the current string, thereby
accomplishing a local search around the current solution.
2. To safeguard against a premature loss of important genetic material at a particular position.
3. To maintain diversity in the population.
Genetic Algorithms
Genetic Operators - Mutation
As an example, suppose that the true optimum solution of the problem requires a 0 as the first bit of the string, but every string in the current population has a 1 in that position. The required 0 cannot then be created by either the reproduction or the crossover operators; only mutation can introduce it.
Genetic Algorithms
Algorithm (a code sketch of the full loop is given after the steps):
1. Choose a suitable string length L = nq to represent the design variables of the design vector X (q bits per variable). Assume suitable values for the following parameters: population size n, mutation probability p_m, the permissible value (s_f)_max of the standard deviation of the fitness values of the population to use as a convergence criterion, and the maximum number of generations i_max to be used as a second convergence criterion.
2. Generate a random population of size n, each individual consisting of a string of length L. Evaluate the fitness values F_i, i = 1, 2, ..., n, of the n strings.
3. Carry out the reproduction (selection) operation to form the mating pool.
4. Carry out the crossover operation on the strings of the mating pool.
5. Carry out the mutation operation on the resulting strings with probability p_m.
6. Evaluate the fitness values F_i, i = 1, 2, ..., n, of the strings of the new population. Find the standard deviation s_f of the fitness values.
7. Test for the convergence of the process. If s_f ≤ (s_f)_max, the convergence criterion is satisfied and the process may be stopped. Otherwise, go to step 8.
8. Test for the generation number. If i ≥ i_max, the computations have been performed for the maximum permissible number of generations and the process may be stopped. Otherwise, set the generation number to i = i + 1 and go to step 3.
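A compact sketch of the whole loop, reusing the roulette_wheel, single_point_crossover, and bitwise_mutation sketches given earlier. The function name run_ga, the parameter names, and their default values are assumptions; the population size n is assumed to be even so that parents can be paired.

```python
import random
import statistics

# Sketch of the overall GA loop (steps 2-8 above). eval_fitness must return a
# positive value for roulette-wheel selection to work as intended.
def run_ga(eval_fitness, L, n=20, p_m=0.01, s_max=1e-3, i_max=100, rng=random):
    # Step 2: random initial population of n strings of length L.
    population = [[rng.randint(0, 1) for _ in range(L)] for _ in range(n)]
    for _generation in range(1, i_max + 1):                 # step 8: generation limit
        fits = [eval_fitness(ind) for ind in population]    # steps 2/6: evaluate fitness
        if statistics.pstdev(fits) <= s_max:                # step 7: convergence test
            break
        pool = roulette_wheel(fits, n, rng)                 # step 3: reproduction
        next_population = []
        for j in range(0, n, 2):                            # steps 4-5: crossover + mutation
            c1, c2 = single_point_crossover(population[pool[j]],
                                            population[pool[j + 1]], rng)
            next_population.append(bitwise_mutation(c1, p_m, rng))
            next_population.append(bitwise_mutation(c2, p_m, rng))
        population = next_population
    best = max(population, key=eval_fitness)
    return best, eval_fitness(best)

# Usage with a toy fitness: number of 1-bits in a 20-bit string, shifted to stay positive.
best, f_best = run_ga(lambda ind: sum(ind) + 1.0, L=20, n=10)
print(best, f_best)
```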
Genetic Algorithms
Example: Maximize the given function.
[Table: strings 1-4 of the initial population with their fitness values, together with the sum, average, and maximum of the fitness values.]
Genetic Algorithms
Crossover operation
[Table: strings 1-4 after crossover, together with the sum, average, and maximum of the fitness values.]
Genetic Algorithms
Mutation operation
[Table: string no. (i), offspring, random numbers for each bit, offspring after mutation, x value, and fitness for strings 1-4, together with the sum, average, and maximum of the fitness values.]
Genetic Algorithms
Example: Minimize the function where