
ARTIFICIAL INTELLIGENCE & EXPERT SYSTEMS

Evolutionary Algorithms

By
Engr. Dr. Jawwad Ahmad
&
Dr. Nasir Uddin
Today’s Goal
 Introduction

 Evolutionary Algorithm (EA)

 Genetic Algorithm (GA)

 Particle Swarm Optimization (PSO)

 Ant Colony Optimization (ACO)



Introduction
 How do we find the maxima or minima of a single-variable function?

 In the classical method there are two simple sets of steps.

 The first set comprises three steps to find a point that might be a maximum or a minimum:

1. Take the derivative of the function.

2. Equate the derivative to zero and solve for the value of the variable.

3. Put that value back into the function to find the point.


Introduction
 The second set determines whether the point is a maximum, a minimum or a saddle point.

 This set has two further steps:

1. Take the second derivative.

2. The sign of the second derivative decides whether the point is a maximum, a minimum or a saddle point.

 If f''(x) < 0, then the point is a maximum.

 If f''(x) > 0, then the point is a minimum.

 If f''(x) = 0, then the point may be a saddle (inflection) point.



Introduction
 Find maxima or minima for the given functions.

[Worked example: take the derivative of the given function; equate the derivative to zero and find the value of the variable (here x = 0); put it back into the function to find the point; then take the second derivative to decide whether the point is a maximum, a minimum or a saddle point.]

Practice

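As a worked illustration of these steps (the functions on the original slide are not recoverable from this copy, so f(x) = x² is assumed here purely as an example; it gives the same stationary point x = 0 noted above):

\[
\begin{aligned}
f(x) &= x^{2} \\
f'(x) &= 2x = 0 \;\Rightarrow\; x = 0 \\
f(0) &= 0 \quad \text{(candidate point } (0, 0)\text{)} \\
f''(x) &= 2 > 0 \;\Rightarrow\; (0, 0) \text{ is a minimum.}
\end{aligned}
\]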


Introduction
[Plots of benchmark functions: Ackley Function, Holder Table Function, Easom Function, Deb's Function, Griewank Function, Deb's Decreasing Function]
 What if the function is difficult?



Single vs. Multi-objective Optimization
Single-objective Optimization:

 When an optimization problem involves only one objective function, the task of finding the optimal solution is called single-objective optimization.

 Example: Find a car for me with minimum cost.

Multi-objective Optimization:

 When an optimization problem involves more than one objective function, the task of finding one or more optimal solutions is known as multi-objective optimization.

 Example: Find a car for me with minimum cost and maximum comfort.
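A tiny Python illustration of the difference; the car attributes, weights and the weighted-sum trade-off are made-up assumptions, and a weighted sum is only one of several ways to handle multiple objectives:

# Hypothetical car records; the attribute names and values are assumptions for illustration.
cars = [
    {"name": "A", "cost": 15000, "comfort": 6},
    {"name": "B", "cost": 22000, "comfort": 9},
    {"name": "C", "cost": 18000, "comfort": 7},
]

# Single-objective: one criterion, one clear optimum.
cheapest = min(cars, key=lambda c: c["cost"])

# Multi-objective: minimize cost AND maximize comfort; no single car dominates,
# so a simple weighted sum is one (of many) ways to trade the objectives off.
w_cost, w_comfort = 1.0, 3000.0
best_tradeoff = min(cars, key=lambda c: w_cost * c["cost"] - w_comfort * c["comfort"])

print(cheapest["name"], best_tradeoff["name"])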



Optimization Techniques

Soft Computing
   Data Classification/Modeling
     Fuzzy Logic
     Artificial Neural Networks
   Optimization
     Classical Approaches
     Evolutionary Approaches



Optimization Techniques
Basic Types
 Optimization techniques can be divided into two broad classifications:

 Classical Optimization Techniques


 Monte Carlo Simulations (MCS)
 Simulated Annealing (SA)
 Iterative Least Squares (ILS)
 Projection Methods (PM)

 Modern Optimization Techniques


 Fuzzy Logic (FL)
 Artificial Neural Networks (ANN)
 Evolutionary Algorithms (EA)



Optimization Techniques
Types of Evolutionary Algorithms
 Modern Optimization Techniques
 Fuzzy Logic (FL)
 Artificial Neural Networks (ANN)
 Evolutionary Algorithms (EA)

 Evolutionary Programming (EP)


 Differential Evolution (DE)
 Genetic Programming (GP)
 Evolutionary Strategies (ES)
 Genetic Algorithm (GA)
 Population Based Incremental Learning (PBIL)
 Particle Swarm Optimization (PSO)
 Ant Colony Optimization (ACO)



Evolutionary Algorithm (EA)
 In computational intelligence (CI), an Evolutionary Algorithm (EA) is
a subset of evolutionary computation, a generic population-
based metaheuristic optimization algorithm.

 An EA uses mechanisms inspired by biological evolution, such


as reproduction, mutation, recombination, and selection.

 Candidate solutions to the optimization problem play the role of


individuals in a population, and the fitness/cost function determines the
quality of the solutions.

 Evolution of the population then takes place after the repeated application
of the above operators.

 Evolutionary Algorithms often perform well at approximating solutions to all types of problems because they ideally make no assumptions about the underlying fitness/cost landscape.
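A minimal, generic sketch of such an algorithm in Python, illustrating the reproduction, recombination, mutation and selection operators named above; the specific choices here (real-valued genes, tournament selection, uniform crossover, Gaussian mutation) are assumptions for illustration, not the particular algorithm of any later slide:

import random

def evolve(cost, dim, bounds, pop_size=30, generations=100,
           mutation_rate=0.1, mutation_step=0.1):
    lo, hi = bounds
    # Random initial population of candidate solutions (the "individuals").
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: binary tournament on the cost function (lower is better).
        def pick():
            a, b = random.sample(pop, 2)
            return a if cost(a) < cost(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = pick(), pick()
            # Recombination: uniform crossover of the two parents.
            child = [random.choice(genes) for genes in zip(p1, p2)]
            # Mutation: small Gaussian perturbation, clipped to the bounds.
            child = [min(hi, max(lo, g + random.gauss(0, mutation_step)))
                     if random.random() < mutation_rate else g for g in child]
            children.append(child)
        pop = children
    return min(pop, key=cost)

# Example: minimize the sphere function in 5 dimensions.
best = evolve(lambda x: sum(g * g for g in x), dim=5, bounds=(-5, 5))
print(best)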
Evolutionary Algorithm (EA)
 Techniques from Evolutionary algorithms applied to the modelling of
biological evolution are generally limited to explorations
of microevolutionary processes and planning models based upon cellular
processes.

 In most real applications of EAs, computational complexity is a prohibitive factor. In fact, this computational complexity is due to fitness/cost function evaluation.

 Fitness/cost approximation is one of the solutions to overcome this


difficulty.

 However, seemingly simple EAs can often solve complex problems; therefore, there may be no direct link between algorithm complexity and problem complexity.



Optimization Techniques
Types of Evolutionary Algorithms

Reference: D. Ansell, “Antenna performance optimization using evolutionary algorithms”, 2010.
Introduction and History



Genetic Algorithm (GA)
 Genetic Algorithm (GA) was proposed by John Henry Holland in the
1970s.

 In GA, a population of candidate solutions (called individuals,


creatures, or phenotypes) to an optimization problem is evolved
toward better solutions.

 Each individual has a set of properties (its chromosomes or


genotype).

 Each individual is evaluated based on its properties (its fitness or cost value); this evaluation determines which individuals survive.

 Properties are shared between individuals of the population (crossover) or altered (mutation).



Framework of the Genetic Algorithm (GA)



Features of GA
 Literature shows numerous features of genetic algorithms:
 GA optimizes both continuous and discrete variables.
 It does not need derivative information.
 It can search from a wide sampling of the fitness/cost surface.
 It deals with a large number of variables.
 GA is a good choice for parallel computing.
 It can optimize variables with highly complex cost surfaces.
 It may provide a list of optimum variables, not just a single solution.
 GA can work with numerical or experimental data, or with analytical functions.
Implementation
 Step 1: Generate a random population within the bounds provided.

 Step 2: Evaluate the cost / fitness function.

 Step 3: Apply natural selection, roulette-wheel or tournament methods to find Pheno_Parents in accordance with the cost / fitness.

 Step 4: Convert Pheno_Parents to Geno_Parents.

 Step 5: Reproduce offspring from the Geno_Parents using crossover techniques.

 Step 6: Integrate Geno_Parents and offspring to set the population size again.

 Step 7: Mutation.

 Step 8: Convert Geno_Parents back to Pheno_Parents.
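A compact Python sketch of these eight steps, assuming an 8-bit binary (geno) encoding of a single real-valued (pheno) variable, roulette-wheel selection and single-point crossover; the parameter values and the encoding are illustrative assumptions, not reference code from the authors:

import random

BITS = 8  # bits per variable (an assumed encoding length)

def encode(x, lo, hi):
    # Pheno -> geno: map a real value onto an integer bit pattern (Step 4).
    level = round((x - lo) / (hi - lo) * (2 ** BITS - 1))
    return [int(b) for b in format(level, f"0{BITS}b")]

def decode(bits, lo, hi):
    # Geno -> pheno: map the bit string back to a real value (Step 8).
    level = int("".join(map(str, bits)), 2)
    return lo + level / (2 ** BITS - 1) * (hi - lo)

def ga(cost, lo, hi, pop_size=20, generations=50, p_mut=0.02):
    # Step 1: random population within the bounds.
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        # Step 2: evaluate cost; Step 3: roulette-wheel selection (fitness = 1 / (1 + cost)).
        fitness = [1.0 / (1.0 + cost(x)) for x in pop]
        parents = random.choices(pop, weights=fitness, k=pop_size)
        # Step 4: pheno -> geno.
        geno = [encode(x, lo, hi) for x in parents]
        # Step 5: reproduce offspring by single-point crossover.
        offspring = []
        for a, b in zip(geno[::2], geno[1::2]):
            cut = random.randrange(1, BITS)
            offspring += [a[:cut] + b[cut:], b[:cut] + a[cut:]]
        # Step 6: integrate parents and offspring, keeping the best pop_size chromosomes.
        merged = sorted(geno + offspring, key=lambda c: cost(decode(c, lo, hi)))[:pop_size]
        # Step 7: bit-flip mutation.
        merged = [[1 - g if random.random() < p_mut else g for g in chrom] for chrom in merged]
        # Step 8: geno -> pheno for the next generation.
        pop = [decode(c, lo, hi) for c in merged]
    return min(pop, key=cost)

print(ga(lambda x: (x - 3) ** 2, lo=-10, hi=10))  # should approach x = 3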



Implementation



Swarm Intelligence (SI) Optimization
 First introduced by Beni and Wang in 1989 with their study of cellular robotic systems.

 Extended by Theraulaz, Bonabeau, Dorigo, Kennedy, …

 SI is also treated as an artificial intelligence (AI) technique based on the collective behavior in decentralized, self-organized systems.

 Generally made up of agents who interact with each other and the environment.

 No centralized control structures.

 Based on group behavior found in nature.

Examples: Particle Swarm Optimization, Ant Colony Optimization, Bee Colony Optimization, Wasp Colony Optimization, Intelligent Water Drops.



Particle Swarm Optimization (PSO)
 A population based stochastic optimization technique.

 Searches for an optimal solution in the computable search space.

 Developed in 1995 by Eberhart and Kennedy.

 Inspiration: Flocks of Birds, Schools of Fish.

 In PSO individuals strive to improve themselves and often achieve

this by observing and imitating their neighbors.

 Each PSO individual has the ability to remember its own best position visited so far.



Particle Swarm Optimization (PSO)
 PSO has a simple algorithm and low overhead,

 making it more popular in some circumstances than Genetic/Evolutionary Algorithms.

 It has only one operation to calculate:

 Velocity: a vector of numbers that is added to the position coordinates to move an individual.



Features of PSO
 Particle swarm optimization, on the other hand, has the following characteristics:

 It is based on natural intelligence.

 It can be applied to engineering problems and scientific research.

 Its search speed is very fast and its calculation is very simple.



Working
 Individuals in a population learn from previous experiences and the
experiences of those around them.
 The direction of movement is a function of:
 Current position
 Velocity
 Location of the individual's “best” success
 Location of the neighbors' “best” successes
 Therefore, each individual in a population will gradually move towards the
“better” areas of the problem space.
 Hence, the overall population moves towards “better” areas of the problem
space.



Implementation
 A swarm consists of N particles in a D-dimensional search space. Each particle holds a position (which is a candidate solution to the problem) and a velocity (which gives the flying direction and step of the particle).
 Each particle successively adjusts its position toward the global optimum based on two factors:
 the best position visited by itself (pbest), denoted p_i,
 and the best position visited by the whole swarm (gbest), denoted p_g.


Implementation

\[
v_i(t+1) = w \cdot v_i(t) + c_1 \cdot \mathrm{rand}() \cdot \big(p_i - x_i(t)\big) + c_2 \cdot \mathrm{rand}() \cdot \big(p_g - x_i(t)\big)
\]

\[
x_i(t+1) = x_i(t) + v_i(t+1)
\]

where p_i (pbest) is the best position visited by the particle itself (its local best performance) and p_g (gbest) is the best position visited by the whole swarm (the global best performance of the team).

[Diagram: the new position x_i(t+1) results from the current position x_i(t), the previous velocity v_i(t), and the pulls toward pbest and gbest.]
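A minimal Python sketch of these update equations; the inertia and acceleration values (w, c1, c2) are typical defaults assumed here rather than values prescribed by the slide:

import random

def pso(cost, dim, bounds, swarm_size=30, iterations=100, w=0.7, c1=1.5, c2=1.5):
    lo, hi = bounds
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    v = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in x]              # best position each particle has visited
    gbest = min(pbest, key=cost)[:]        # best position visited by the whole swarm
    for _ in range(iterations):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive (pbest) + social (gbest) terms.
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                # Position update with the new velocity, clipped to the bounds.
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            if cost(x[i]) < cost(pbest[i]):
                pbest[i] = x[i][:]
                if cost(x[i]) < cost(gbest):
                    gbest = x[i][:]
    return gbest

# Example: minimize the sphere function in 3 dimensions.
print(pso(lambda p: sum(g * g for g in p), dim=3, bounds=(-5, 5)))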
Implementation



GA Vs PSO
 Similarity
 Both algorithms start with a randomly generated population.
 Both have fitness values to evaluate the population.
 Both update the population and search for the optimum with random
techniques.
 Both systems do not guarantee success.

 Dissimilarity
 However, unlike GA, PSO has no evolution operators such as crossover
and mutation.
 In PSO, the potential solutions, called particles, fly through the problem
space by following the current optimum particles.
 Particles update themselves with the internal velocity.
 They also have memory, which is important to the algorithm.

 Advantages
 PSO is easy to implement and there are few parameters to adjust.
 Compared with GA, all the particles tend to converge to the best solution quickly in most cases, even in the local version.
Ant Colony Optimization (ACO)
 Ant Colony Optimization (ACO) was inspired by the behaviors of
ants and has many successful applications in discrete optimization
problems. ACO was published in 1991.
 Certain species of ants are able to find the shortest path to a food
source merely by laying and following chemical trails known as
pheromone – which then attracts other ants.
 The colony’s efficient behavior emerges from the collective activity of
individuals following two very simple rules:
 Lay pheromone
 Follow the trails of others



Features of ACO
 A general-purpose heuristic algorithm which can be used to solve

different combinatorial optimization problems.

 Not interested in simulation of ant colonies, but in the use of artificial

ant colonies as an optimization tool.

 Major differences with a real (natural) ant:


 Artificial ants will have some memory
 They will not be completely blind
 They will live in an environment where time is discrete
 Artificial pheromone will have a faster evaporation rate
Implementation
 Let b_i(t) be the number of ants in town i at time t, and let m = Σ_i b_i(t) be the total number of ants.
 Each ant is a simple agent with the following characteristics:
 It chooses the town to go to with a probability that is a function of the town distance and of the amount of trail present on the connecting edge;
 To force the ant to make legal tours, transitions to already visited towns are disallowed until a tour is completed (this is controlled by a tabu list);
 When it completes a tour, it lays a substance called trail on each edge visited.
 Let τ_ij(t) be the intensity of trail on edge (i, j) at time t.
 Each ant at time t chooses the next town, where it will be at time t + 1.
 In n iterations, each ant completes a tour.

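The town-choice probability mentioned in the first characteristic is not written out on the slide; in Dorigo's standard Ant System it takes the following form (quoted here as an assumption of what the slide refers to):

\[
p_{ij}^{k}(t) =
\begin{cases}
\dfrac{[\tau_{ij}(t)]^{\alpha} \, [\eta_{ij}]^{\beta}}{\sum_{l \in \mathrm{allowed}_k} [\tau_{il}(t)]^{\alpha} \, [\eta_{il}]^{\beta}}, & \text{if } j \in \mathrm{allowed}_k \\
0, & \text{otherwise}
\end{cases}
\]

where η_ij = 1/d_ij is the visibility (the inverse of the town distance), α and β weight trail versus visibility, and allowed_k is the set of towns not yet on ant k's tabu list.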


Implementation
 At this point the trail intensity is updated according to the following formula:

\[
\tau_{ij}(t+n) = \rho \, \tau_{ij}(t) + \Delta\tau_{ij}
\]

 where ρ is a coefficient such that (1 − ρ) represents the evaporation of trail between t and t + n, and

\[
\Delta\tau_{ij} = \sum_{k=1}^{m} \Delta\tau_{ij}^{k}
\]

 where Δτ_ij^k is the quantity per unit of length of trail substance (pheromone) laid on edge (i, j) by the k-th ant between t and t + n, given by

\[
\Delta\tau_{ij}^{k} =
\begin{cases}
\dfrac{Q}{L_k}, & \text{if the } k\text{-th ant uses edge } (i, j) \text{ in its tour between } t \text{ and } t+n \\
0, & \text{otherwise}
\end{cases}
\]

 where Q is a constant and L_k is the tour length of the k-th ant.

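A small Python sketch of this trail update for a TSP-style problem; the symmetric-edge handling and the values of Q and ρ are illustrative assumptions:

def update_trails(tau, tours, lengths, rho=0.5, Q=100.0):
    """One Ant System trail update: tau_ij(t+n) = rho * tau_ij(t) + sum_k dtau_ij^k."""
    n = len(tau)
    # Evaporation: only a fraction rho of the old trail persists.
    for i in range(n):
        for j in range(n):
            tau[i][j] *= rho
    # Deposit: each ant k lays Q / L_k on every edge (i, j) of its tour.
    for tour, L in zip(tours, lengths):
        deposit = Q / L
        for i, j in zip(tour, tour[1:] + tour[:1]):   # closed tour
            tau[i][j] += deposit
            tau[j][i] += deposit                      # symmetric problem assumed
    return tau

# Toy usage: 4 towns, 2 ants with made-up tours and tour lengths.
tau = [[1.0] * 4 for _ in range(4)]
tours = [[0, 1, 2, 3], [0, 2, 1, 3]]
lengths = [10.0, 12.0]
update_trails(tau, tours, lengths)
print(tau[0][1])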


Implementation
[Flowchart:
Start
→ Initialization of ACO parameters
→ Construct a solution using the pheromone trail (heuristic probability)
→ Evaluation
→ Condition to stop?  Yes → Stop
                      No → Pheromone evaporation and updates, then construct new solutions again]



Extra
 Different Evolutionary & Optimization Algorithms
 Tabu Search
 Artificial Bee Colony
 Fish Optimization Algorithms
 Simulated Annealing
 Etc.

 Various Selection Methods
 Natural Selection
 Roulette-Wheel Selection
 Tournament Selection
 Stochastic Universal Sampling
 Truncation Selection
 Etc.

TASKS
 Solve the N-Queen Problem by:
1. GA
2. PSO
3. ACO
 Compare the EAs through time complexity, accuracy and other parameters.
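As one example from the selection methods listed above, a minimal roulette-wheel (fitness-proportional) selection sketch in Python, assuming non-negative fitness values:

import random

def roulette_wheel_select(population, fitness, k=1):
    """Pick k individuals with probability proportional to their fitness."""
    total = sum(fitness)
    chosen = []
    for _ in range(k):
        pick = random.uniform(0, total)
        running = 0.0
        for individual, fit in zip(population, fitness):
            running += fit
            if running >= pick:
                chosen.append(individual)
                break
    return chosen

# Toy usage: the fitter individuals "c" and "d" are selected most often.
print(roulette_wheel_select(["a", "b", "c", "d"], [1.0, 2.0, 3.0, 4.0], k=2))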
Thank you
