Genetic Algorithms by KVK

A genetic algorithm (GA) is an optimization method inspired by natural selection that evolves a population of solutions over generations to find optimal solutions. Key components of GAs include population generation, encoding techniques, fitness functions, selection methods, crossover, and mutation. GAs are widely used in various fields such as machine learning, robotics, and optimization problems.

Genetic Algorithm

Dr. K.V. Kadambari


1. Description

A genetic algorithm (GA) is a method for solving both constrained and unconstrained optimization problems based on a natural selection process that mimics biological evolution. The algorithm repeatedly modifies a population of individual solutions. At each step, the genetic algorithm randomly selects individuals from the current population and uses them as parents to produce the children for the next generation. Over successive generations, the population "evolves" toward an optimal solution.

Description

• Directed search algorithms based on the mechanics of biological evolution
• Developed by John Holland, University of Michigan (1970s)
• To understand the adaptive processes of natural systems
• To design artificial systems software that retains the robustness of natural systems
• Provide efficient, effective techniques for optimization and machine learning applications
• Widely used today in business, scientific and engineering circles

Description
• GAs work with a coding of the parameter set, not the parameters themselves.

• GAs search from a population of points, not a single point, and explore many candidates in parallel.

• GAs use objective function information only, not derivatives or other auxiliary knowledge.

• GAs use probabilistic transition rules, not deterministic ones, exploiting their stochastic nature.

2. Components of a GA

• Population generation
• Encoding technique (gene, chromosome)
• Initialization procedure (creation)
• Evaluation/fitness function (environment)
• Selection of parents (reproduction)
• Genetic operators (mutation, recombination)
• Parameter settings
Components of a GA

[Flow diagram: Initial Generation → Encoding → Fitness → Selection → Crossover → Mutation → Next Generation]
Population generation

• Two population models are widely in use:

Steady State
• In a steady-state GA, we generate one or two offspring in each iteration, and they replace one or two individuals from the population. A steady-state GA is also known as an Incremental GA.
Generational
• In a generational model, we generate n offspring, where n is the population size, and the entire population is replaced by the new one at the end of the iteration.
Let us consider an example in which we need to find the optimum solution of F(X1, X2).

F(X1, X2) = X1² − X2² − 5.13

The unknown parameters are X1 and X2.

Solution space: X1, X2 ∈ [0, 15]
Encoding

Chromosomes could be encoded as:

• Bit strings (0101 ... 1100)
• Real numbers (43.2 -33.1 ... 0.0 89.2)
• Permutations of elements (E11 E3 E7 ... E1 E15)
• Lists of rules (R1 R2 R3 ... R22 R23)
• Program elements (genetic programming)
• ... any data structure ...

Generally, bit strings are the preferred encoding.
Initial population

Example individual: X1 = 9, X2 = 13

X1 = 9 → 1001    X2 = 13 → 1101

Concatenated, these give one complete chromosome: 10011101
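The concatenation step above can be sketched in Python (a minimal illustration; the 4-bits-per-parameter split follows from the solution space [0, 15], and the function names are ours):

```python
# Each chromosome is an 8-bit string: the first 4 bits encode X1,
# the last 4 bits encode X2 (both integers in [0, 15]).

def encode(x1, x2):
    """Concatenate the 4-bit binary forms of X1 and X2 into one chromosome."""
    return format(x1, "04b") + format(x2, "04b")

def decode(chromosome):
    """Split an 8-bit chromosome back into the two integer parameters."""
    return int(chromosome[:4], 2), int(chromosome[4:], 2)

print(encode(9, 13))        # 10011101, matching the slide
print(decode("10011101"))   # (9, 13)
```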
Initial Generation
Ten 8-bit chromosomes generated at random:
1 1 0 1 1 0 1 0
0 1 1 1 1 0 1 1
1 1 1 0 1 0 1 0
0 0 0 1 1 0 1 0
0 0 1 1 0 0 0 1
1 1 0 1 1 0 0 0
1 0 1 0 1 0 1 1
1 1 1 0 0 0 1 0
1 0 1 0 0 0 1 0
0 1 1 1 1 0 1 1
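A random population like the table above can be generated with a short sketch (the function name and the fixed seed are illustrative choices, not from the slides):

```python
import random

def initial_population(pop_size=10, chrom_length=8, seed=1):
    """Generate pop_size random bit-string chromosomes."""
    rng = random.Random(seed)  # seeded only so the run is reproducible
    return ["".join(rng.choice("01") for _ in range(chrom_length))
            for _ in range(pop_size)]

for chromosome in initial_population():
    print(chromosome)
```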

3. Fitness Function and Selection
Fitness function

The fitness function is an objective function which we need to maximize.

In our example we have considered:

Fitness Function = 1 / |X1² − X2² − 5.13|

When choosing an appropriate fitness function, define it so that its value increases as solutions improve, whether the underlying problem is a minimization or a maximization.
Fitness function

10011101

In the above string, X1 is 9 and X2 is 13.

Therefore, the value of the fitness function is 1 / |81 − 169 − 5.13| ≈ 0.011.

In this way, the fitness values of all individuals in the population are calculated.
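The evaluation above can be written directly from the slides' definition (a sketch; `fitness` is an illustrative name):

```python
def fitness(chromosome):
    """Fitness = 1 / |X1^2 - X2^2 - 5.13|, decoded from an 8-bit chromosome."""
    x1 = int(chromosome[:4], 2)   # first 4 bits -> X1
    x2 = int(chromosome[4:], 2)   # last 4 bits  -> X2
    return 1.0 / abs(x1 ** 2 - x2 ** 2 - 5.13)

print(round(fitness("10011101"), 3))  # 0.011 for X1 = 9, X2 = 13
```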
Selection

The selection function chooses parents for the next generation based on their scaled values from the fitness scaling function. There are several methods for selection.

Fitness Proportionate Selection

• Fitness proportionate selection is one of the most popular ways of parent selection. Here, every individual can become a parent with a probability proportional to its fitness.
Selection

Roulette Wheel Selection

• In roulette wheel selection, a circular wheel is divided into slices proportional to the fitness values. A fixed point is chosen on the wheel circumference and the wheel is rotated; the region of the wheel that stops in front of the fixed point is chosen as the parent. The same process is repeated for the second parent.
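One spin of the wheel can be sketched as follows (the helper name and the cumulative-sum formulation are implementation choices, not from the slides):

```python
import random

def roulette_select(population, fitnesses, rng=None):
    """Pick one parent with probability proportional to its fitness."""
    rng = rng or random.Random()
    spin = rng.uniform(0, sum(fitnesses))   # where the fixed point lands
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit                      # walk around the wheel's slices
        if running >= spin:
            return individual
    return population[-1]                   # guard against float round-off
```

Selecting two parents is simply two independent spins.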
Selection

Tournament Selection
In K-way tournament selection, we select K individuals from the population at random and select the best of these to become a parent. The same process is repeated for selecting the next parent. Tournament selection is also extremely popular in the literature, as it works even with negative fitness values.
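K-way tournament selection, as described, is only a few lines (a sketch; names are illustrative):

```python
import random

def tournament_select(population, fitnesses, k=3, rng=None):
    """Sample k individuals at random and return the fittest of them.
    Only comparisons are used, so negative fitness values are fine."""
    rng = rng or random.Random()
    contestants = rng.sample(range(len(population)), k)
    best = max(contestants, key=lambda i: fitnesses[i])
    return population[best]
```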
Selection
Before Selection After Selection

1 1 0 1 1 0 1 0 1 1 1 0 1 0 1 0
0 1 1 1 1 0 1 1 0 0 1 1 0 0 0 1
1 1 1 0 1 0 1 0 0 0 1 1 0 0 0 1
0 0 0 1 1 0 1 0 0 0 1 1 0 0 0 1
0 0 1 1 0 0 0 1 0 0 1 1 0 0 0 1
1 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0
1 0 1 0 1 0 1 1 0 0 1 1 0 0 0 1
1 1 1 0 0 0 1 0 0 0 1 1 0 0 0 1
1 0 1 0 0 0 1 0 1 1 0 1 1 0 0 0
0 1 1 1 1 0 1 1 1 1 1 0 1 0 1 0
We can see that some of the strings with greater fitness values are repeated.
4. Crossover & Mutation
Crossover

• The crossover operator is analogous to reproduction and biological crossover. Here, more than one parent is selected and one or more offspring are produced using the genetic material of the parents. Crossover is usually applied in a GA with a high probability, pc.

Crossover Operators
• In this section we will discuss some of the most popularly used crossover operators. Note that these crossover operators are very generic, and the GA designer might choose to implement a problem-specific crossover operator as well.
Crossover

One Point Crossover

• In one-point crossover, a random crossover point is selected and the tails of the two parents are swapped to get new offspring.

Multi Point Crossover

• Multi-point crossover is a generalization of one-point crossover, wherein alternating segments are swapped to get new offspring.
Crossover

Uniform Crossover
• In uniform crossover, we don't divide the chromosome into segments; rather, we treat each gene separately. We essentially flip a coin for each gene to decide whether it will come from the first or the second parent. We can also bias the coin toward one parent, to have more genetic material in the child from that parent.
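The per-gene coin flip, with an optional bias toward the first parent, can be sketched as (names and the `bias` parameter are illustrative):

```python
import random

def uniform_crossover(parent1, parent2, bias=0.5, rng=None):
    """For each gene, take parent1's gene with probability `bias`,
    otherwise parent2's. bias > 0.5 favours parent1."""
    rng = rng or random.Random()
    return "".join(g1 if rng.random() < bias else g2
                   for g1, g2 in zip(parent1, parent2))
```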
Crossover
Before Crossover

1 1 1 0 1 0 1 0
0 0 1 1 0 0 0 1

After Crossover

1 1 1 1 0 0 0 1
0 0 1 0 1 0 1 0

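The before/after pair above is a one-point crossover with the cut after the third bit; a minimal sketch:

```python
def one_point_crossover(parent1, parent2, point):
    """Swap the tails of the two parents after the crossover point."""
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

# Reproducing the slide's example, cut after bit 3:
print(one_point_crossover("11101010", "00110001", 3))
# ('11110001', '00101010')
```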
Mutation

• In simple terms, mutation may be defined as a small random tweak in the chromosome to get a new solution. It is used to maintain and introduce diversity in the population and is usually applied with a low probability, pm. If the probability is very high, the GA degenerates into a random search.
• Mutation is the part of the GA related to the "exploration" of the search space. It has been observed that mutation is essential to the convergence of the GA, while crossover is not.
Mutation

Bit Flip Mutation

• In bit flip mutation, we select one or more random bits and flip them. This is used for binary-encoded GAs.

Swap Mutation
• In swap mutation, we select two positions on the chromosome at random and interchange the values. This is common in permutation-based encodings.
Mutation

0 0 1 1 0 0 0 1

0 0 0 1 0 0 0 1

Mutation makes small random changes in the individuals in the population, which provides genetic diversity and enables the GA to search a broader space.
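The bit flip above (third bit of 00110001) and the swap variant can be sketched as (function names are ours):

```python
def bit_flip(chromosome, position):
    """Flip a single bit (binary-encoded GAs)."""
    flipped = "1" if chromosome[position] == "0" else "0"
    return chromosome[:position] + flipped + chromosome[position + 1:]

def swap_mutation(chromosome, i, j):
    """Interchange the genes at two positions (permutation encodings)."""
    genes = list(chromosome)
    genes[i], genes[j] = genes[j], genes[i]
    return "".join(genes)

print(bit_flip("00110001", 2))  # 00010001, matching the slide
```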
5. Survivors & Termination
Survivor Selection
• The survivor selection policy determines which individuals are to be kicked out and which are to be kept in the next generation. It is crucial, as it should ensure that the fitter individuals are not kicked out of the population while, at the same time, diversity is maintained.
• Age-Based Selection: in age-based selection, we don't have a notion of fitness. It is based on the premise that each individual is allowed in the population for a finite number of generations, during which it is allowed to reproduce; after that, it is kicked out of the population no matter how good its fitness is.
• Fitness-Based Selection: in fitness-based selection, the children tend to replace the least fit individuals in the population.
Termination Condition

• The termination condition of a genetic algorithm is important in determining when a GA run will end. It has been observed that initially the GA progresses very fast, with better solutions coming in every few iterations, but this tends to saturate in the later stages, where the improvements are very small. We usually want a termination condition such that our solution is close to the optimal at the end of the run. Usually, we keep one of the following termination conditions:
• When there has been no improvement in the population for X iterations.
• When we reach an absolute number of generations.
• When the objective function value has reached a certain pre-defined value.
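Putting the pieces together, a generational loop with two of the stopping rules above (a generation cap and a no-improvement window) might look like the sketch below. The operator choices, probabilities, and names are illustrative, not prescribed by the slides:

```python
import random

def run_ga(evaluate, pop_size=10, chrom_len=8,
           max_gens=100, stall_limit=20, pm=0.1, seed=0):
    """Minimal generational GA: tournament selection, one-point
    crossover, bit-flip mutation, and two termination conditions."""
    rng = random.Random(seed)
    pop = ["".join(rng.choice("01") for _ in range(chrom_len))
           for _ in range(pop_size)]
    best, best_fit, stall = None, float("-inf"), 0
    for _ in range(max_gens):                      # absolute generation cap
        fits = [evaluate(c) for c in pop]
        top = max(fits)
        if top > best_fit:
            best, best_fit, stall = pop[fits.index(top)], top, 0
        else:
            stall += 1
        if stall >= stall_limit:                   # no improvement for stall_limit gens
            break
        nxt = []
        while len(nxt) < pop_size:
            # 2-way tournaments pick the parents
            p1 = max(rng.sample(range(pop_size), 2), key=lambda i: fits[i])
            p2 = max(rng.sample(range(pop_size), 2), key=lambda i: fits[i])
            cut = rng.randrange(1, chrom_len)      # one-point crossover
            child = pop[p1][:cut] + pop[p2][cut:]
            if rng.random() < pm:                  # low mutation probability pm
                b = rng.randrange(chrom_len)
                child = child[:b] + ("1" if child[b] == "0" else "0") + child[b + 1:]
            nxt.append(child)
        pop = nxt
    return best, best_fit

def example_fitness(c):
    """The slides' fitness: 1 / |X1^2 - X2^2 - 5.13|."""
    x1, x2 = int(c[:4], 2), int(c[4:], 2)
    return 1.0 / abs(x1 ** 2 - x2 ** 2 - 5.13)

best, score = run_ga(example_fitness)
print(best, round(score, 3))
```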
Application

Domain: Application Types

Control: gas pipeline, pole balancing, missile evasion, pursuit
Design: semiconductor layout, aircraft design, keyboard configuration, communication networks
Scheduling: manufacturing, facility scheduling, resource allocation
Robotics: trajectory planning
Machine Learning: designing neural networks, improving classification algorithms, classifier systems
Signal Processing: filter design
Game Playing: poker, checkers, prisoner's dilemma
Combinatorial Optimization: set covering, travelling salesman, routing, bin packing, graph colouring and partitioning
Genetic Programming

• Genetic programming is indeed a special type of genetic algorithm. The difference lies in their representations: genetic programming adopts trees as genotypes to represent programs or expressions.

GP contd…
• The distinct features of genetic programming are its crossover and mutation operators: for instance, swapping sub-trees between two trees and random generation of sub-trees.
• A list of common crossover and mutation operators for genetic programming is tabulated in Table 1.

Differential Evolution
• Differential Evolution was first proposed by Price and Storn in the 1990s (Storn & Price, 1997). It demonstrated great potential for real function optimization in the subsequent contests (Price, 1997).
• Without loss of generality, a typical strategy of differential evolution (DE/rand/1) (Feoktistov, 2006) is shown in Figure 3.

DE contd…
• For each individual in a generation, the algorithm randomly selects three individuals to form a trial vector.
• One individual forms a base vector, whereas the value difference between the other two individuals forms a difference vector.
• The sum of those two vectors forms a trial vector, which recombines with the individual to form an offspring.
• By replacing the typical crossover and mutation operations with this trial vector generation, manual parameter tuning of crossover and mutation is no longer needed.
• This gives differential evolution a self-organizing ability and high adaptability in choosing suitable step sizes, which demonstrated its potential for continuous optimization in past contests.
• A self-organizing ability is granted for moving toward the optima, and high adaptability is achieved for optimizing different landscapes (Feoktistov, 2006). With such self-adaptability, differential evolution is considered one of the most powerful evolutionary algorithms for real function optimization.
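The trial-vector construction described above (DE/rand/1) can be sketched for real-valued vectors; the scale factor F, the bounds, and the names are illustrative assumptions:

```python
import random

def de_rand_1_trial(population, i, F=0.8, bounds=(0.0, 15.0), rng=None):
    """DE/rand/1: trial = base + F * (difference of two other individuals),
    using three random individuals distinct from individual i."""
    rng = rng or random.Random()
    # three distinct donors, none equal to the current individual i
    r1, r2, r3 = rng.sample([j for j in range(len(population)) if j != i], 3)
    base, a, b = population[r1], population[r2], population[r3]
    lo, hi = bounds
    # base vector plus scaled difference vector, clipped to the bounds
    return [min(hi, max(lo, base[d] + F * (a[d] - b[d])))
            for d in range(len(base))]
```

In a full DE step the trial vector is then recombined with individual i, and the resulting offspring replaces i if it is fitter.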


Thank You!
