Genetic Algorithm and Its Application: M.Sivagami
Agenda
History of GA
Developed by John Holland and his students in the 1960s and 1970s at the University of Michigan
Book: Adaptation in Natural and Artificial Systems by Holland (1975)
Applications of GA
Engineering design
Automotive design
Computer gaming
Robotics
Evolvable hardware
Optimized telecommunication routing
Encryption and code breaking
Biometric inventions
Financial investment
Marketing
Implementation
Java: JGAP, JGAL
Python
Lisp
MATLAB
C/C++
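The C/C++ snippets on the following slides rely on a handful of supporting declarations. A minimal sketch is given below; the concrete constants (population size, rates, target string) are illustrative assumptions, not taken from the slides.

// Sketch of the supporting declarations the snippets below appear to assume;
// the concrete values are illustrative, not taken from the slides.
#include <string>
#include <vector>
#include <algorithm>
#include <cstdlib>
#include <cmath>
using namespace std;

#define GA_POPSIZE      2048                           // population size (assumed)
#define GA_ELITRATE     0.10f                          // elitism rate (assumed)
#define GA_MUTATIONRATE 0.25f                          // mutation rate (assumed)
#define GA_MUTATION     (RAND_MAX * GA_MUTATIONRATE)
#define GA_TARGET       std::string("Hello world!")    // target string (assumed)

struct ga_struct {
    string str;             // candidate solution (character-array encoding)
    unsigned int fitness;   // distance from GA_TARGET; lower is better
};

typedef vector<ga_struct> ga_vector;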
Fitness calculation
void calc_fitness(ga_vector &population)
{
    string target = GA_TARGET;
    int tsize = target.size();
    unsigned int fitness;

    for (int i = 0; i < GA_POPSIZE; i++) {
        fitness = 0;
        // Sum of absolute character differences from the target string;
        // 0 means an exact match, so lower fitness is better.
        for (int j = 0; j < tsize; j++) {
            fitness += abs(int(population[i].str[j] - target[j]));
        }
        population[i].fitness = fitness;
    }
}
Selection
void elitism(ga_vector &population, ga_vector &buffer, int esize)
{
    // Copy the best esize individuals unchanged into the next generation
    // (the population is assumed to be already sorted by fitness).
    for (int i = 0; i < esize; i++) {
        buffer[i].str = population[i].str;
        buffer[i].fitness = population[i].fitness;
    }
}
Sorting
// Comparator: ascending fitness, so the best (lowest-fitness) individuals come first.
bool fitness_sort(ga_struct x, ga_struct y)
{ return (x.fitness < y.fitness); }

inline void sort_by_fitness(ga_vector &population)
{ sort(population.begin(), population.end(), fitness_sort); }
Mutation
void mutate(ga_struct &member)
{
    int tsize = GA_TARGET.size();
    int ipos = rand() % tsize;                             // gene (character position) to mutate
    int delta = (rand() % 90) + 32;                        // random offset
    member.str[ipos] = (member.str[ipos] + delta) % 122;   // wrap the new character code into 0-121
}
Crossover
void mate(ga_vector &population, ga_vector &buffer)
{
    int esize = GA_POPSIZE * GA_ELITRATE;
    int tsize = GA_TARGET.size(), spos, i1, i2;

    elitism(population, buffer, esize);

    // Mate the rest: single-point crossover between two parents drawn
    // from the fitter half of the (sorted) population.
    for (int i = esize; i < GA_POPSIZE; i++) {
        i1 = rand() % (GA_POPSIZE / 2);
        i2 = rand() % (GA_POPSIZE / 2);
        spos = rand() % tsize;
        buffer[i].str = population[i1].str.substr(0, spos) +
                        population[i2].str.substr(spos, tsize - spos);
        if (rand() < GA_MUTATION) mutate(buffer[i]);
    }
}
Solution representation
Binary digits
Integer
Real number
Character array
Any data structure, e.g. trees in genetic programming (see the encoding sketch below)
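For illustration only, two of the other encodings listed above might be declared roughly as follows; BinaryChromosome and RealChromosome are hypothetical names, not from the slides.

#include <vector>

// Sketch: two other common chromosome encodings (the hello-world snippets
// above use a character-array encoding, ga_struct::str).
struct BinaryChromosome {
    std::vector<bool> genes;    // binary-digit representation
    double fitness;
};

struct RealChromosome {
    std::vector<double> genes;  // real-number representation for continuous parameters
    double fitness;
};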
Terminating criteria
A pre-determined number of generations or amount of time has elapsed
A satisfactory solution has been achieved
No improvement in solution quality for a pre-determined number of generations (a minimal driver loop checking these criteria is sketched below)
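A minimal driver loop tying the earlier snippets together and applying the first two criteria might look like the sketch below; MAX_GENERATIONS, init_population and their values are hypothetical, not from the slides.

#include <cstdlib>
#include <ctime>

// Hypothetical helper: random initial population (name and details assumed).
void init_population(ga_vector &population)
{
    int tsize = GA_TARGET.size();
    for (int i = 0; i < GA_POPSIZE; i++) {
        population[i].fitness = 0;
        population[i].str = string(tsize, ' ');
        for (int j = 0; j < tsize; j++)
            population[i].str[j] = (rand() % 90) + 32;   // random printable character
    }
}

const int MAX_GENERATIONS = 16384;                       // generation budget (assumed)

int main()
{
    srand(unsigned(time(NULL)));

    ga_vector population(GA_POPSIZE), buffer(GA_POPSIZE);
    init_population(population);

    for (int gen = 0; gen < MAX_GENERATIONS; gen++) {    // criterion: generation budget
        calc_fitness(population);
        sort_by_fitness(population);
        if (population[0].fitness == 0) break;           // criterion: satisfactory (exact) solution
        mate(population, buffer);                        // selection + crossover + mutation
        population.swap(buffer);
    }
    return 0;
}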
Genetic Operators
Selection
Crossover
Mutation
Selection Operator
Random-based selection
Fitness-based selection (a tournament-selection sketch follows this list)
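As one possible illustration of fitness-based selection, a binary tournament routine compatible with the earlier snippets (where lower fitness is better) could look like this; tournament_select is a hypothetical helper, not part of the slides' code.

// Sketch: binary tournament selection (fitness-based).  Picks two random
// individuals and returns the index of the fitter one; uses ga_vector and
// GA_POPSIZE from the declarations sketched earlier.  Purely random
// selection would simply return rand() % GA_POPSIZE.
int tournament_select(const ga_vector &population)
{
    int a = rand() % GA_POPSIZE;
    int b = rand() % GA_POPSIZE;
    return (population[a].fitness < population[b].fitness) ? a : b;
}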
Crossover operator
Mutation
Flip bit
Boundary
Non-Uniform
Uniform
Gaussian (sketched below for real-valued genes)
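For real-valued genes, Gaussian mutation might be sketched as follows; gaussian_mutate and sigma are illustrative assumptions, not from the slides.

#include <random>
#include <vector>
#include <cstddef>

// Sketch: Gaussian mutation on a real-valued gene vector.
void gaussian_mutate(std::vector<double> &genes, double sigma)
{
    static std::mt19937 rng(std::random_device{}());
    std::uniform_int_distribution<std::size_t> pick(0, genes.size() - 1);
    std::normal_distribution<double> noise(0.0, sigma);
    genes[pick(rng)] += noise(rng);   // perturb one randomly chosen gene
}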
Role of operators
Mutation rate: too high turns the search into a random walk, too low risks premature convergence
Selection is important
Selection and crossover together tend to converge on a good but sub-optimal solution
Selection and mutation together create a parallel hill-climbing algorithm
Criticisms
Expensive fitness evaluation for complex problems (e.g., structural optimisation): a single evaluation can take several hours to several days
The termination criterion is not clear-cut; often the only test is comparison with other solutions
On dynamic data sets, GAs tend to converge early towards solutions that may no longer be valid for later data
Solutions
Increase the mutation probability
Use a comma strategy: new parents are selected only from the offspring
Strength of GA
Comparatively fast against exhaustive search
GA searches from a population of points, not from a single point (unlike hill climbing)
Caveat (no free lunch theorem): averaged over all problems, no search algorithm outperforms any other