
In the name of Allah, the Most Gracious, the Most Merciful
Lec 2

Introduction to Population-based Optimization Algorithms

Dr. M. Arafa
Definition of Optimization

➢ In the engineering, physical, and mathematical sciences, computational optimization, or simply optimization, means either minimization or maximization of a certain objective function.

➢ Optimization aims to find the best (optimum) solution to an optimization problem.

➢ The minimum value of 𝑓(𝑋) is the negative of the maximum value of −𝑓(𝑋):

Min(𝑓(𝑋)) = − Max(−𝑓(𝑋))
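As a quick numerical check of this identity, the sketch below evaluates a simple quadratic on a sampled grid; the function and grid are illustrative choices, not taken from the lecture:

```python
import numpy as np

# Illustrative check of Min(f(X)) = -Max(-f(X)) on a sampled grid.
def f(x):
    return (x - 3.0) ** 2 + 1.0  # minimum value 1.0, attained at x = 3

x = np.linspace(-10.0, 10.0, 2001)   # grid that contains x = 3
min_f = np.min(f(x))                 # minimize f directly
neg_max_neg_f = -np.max(-f(x))       # negate the maximum of -f
print(min_f, neg_max_neg_f)          # the two values are identical
```

Minimizing 𝑓 and maximizing −𝑓 are therefore the same problem, which is why optimization literature usually states results for minimization only.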


Examples of Optimization Problems / Applications

➢ Optimizing the architecture and parameters (weights and biases) of ANN models.

➢ Optimizing the architecture and parameters (premise and consequent parameters) of ANFIS models.

➢ Optimizing the architecture of convolutional neural network (CNN) models.

➢ Solving path planning and tracking problems for autonomous vehicles.

➢ Tuning the parameters of a PID controller (finding the best values of Kp, Ki, and Kd).

➢ Finding the best placement of Wi-Fi access points for an indoor positioning system (IPS).

➢ Approximating experimental or mathematical functions with the least possible error.
Global and local optimum

➢ An optimum can be global or local. To illustrate, the single-variable function f(x) (the objective function to be minimized) in Fig. 1 has one global minimum and two local minima; the global minimum is the least among all local minima.

➢ Note that every global optimum is a local optimum, but the reverse is not necessarily true.

➢ A unimodal function has a single local optimum, which is itself the global optimum (as in Fig. 2).
➢ A multimodal function has more than one local optimum and one global optimum, which is the local optimum with the least objective function value among all local optima.

➢ In an optimization problem, the ideal target is of course the global optimum, and a 'good' optimization algorithm (optimizer) does not get trapped in any local optimum.
[Figure: plot of f(x) showing two local minima and one global minimum.]

Fig. 1: 𝒇 (𝑿) is a multimodal function
[Figure: plot of f(x) with a single minimum.]

Fig. 2: 𝒇 (𝑿) is a unimodal function
Basic Elements of Optimization Problem

(1) An objective function 𝑓, which is the function to be optimized (minimized or maximized).

(2) The set of variables of the objective function, which specifies the dimensionality of the optimization problem. If the set 𝑥𝑑 , 𝑑 = 1, 2, …, 𝐷, represents these variables, then the objective function 𝑓 is expressed in the form

𝑓 (𝑥1 , 𝑥2 , … , 𝑥𝐷 )

Here 𝑥1 , 𝑥2 , … , 𝑥𝐷 are the independent variables, and D is the number of variables, which specifies the dimensionality of the problem. The objective function can be written compactly as 𝑓 (𝑋), where 𝑋 = [ 𝑥1 , 𝑥2 , … , 𝑥𝐷 ] is a 1×D vector.
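As a concrete sketch of a D-dimensional objective in this compact 𝑓(𝑋) form, the widely used sphere benchmark function can be written as follows (the choice of function is illustrative):

```python
import numpy as np

# The sphere function f(X) = sum of x_d^2 over d = 1..D, a standard
# minimization benchmark whose global minimum is f(X*) = 0 at X* = [0,...,0].
def sphere(X):
    X = np.asarray(X, dtype=float)  # X is a 1xD vector of variables x_d
    return float(np.sum(X ** 2))

print(sphere([1.0, 2.0, 3.0]))  # D = 3: 1 + 4 + 9 = 14.0
print(sphere([0.0] * 5))        # D = 5 at the global minimum: 0.0
```

The same function signature works for any D, which is how the dimensionality of a problem is varied in benchmark studies.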
(3) A set of constraints imposed on the required solution. Most problems constrain at least the search domains of the variable vector 𝑋 = [ 𝑥1 , 𝑥2 , … , 𝑥𝐷 ].

➢ Note that the aim of an optimization method is to find the global optimum 𝑋∗ ∈ ℝᴰ within the allowable search domains, such that 𝑓(𝑋∗) has the minimum value in the search domain (in the case of a minimization problem).
Classification of Optimization Problems

➢ Number of variables 𝑥𝑑 , 𝑑 = 1, 2, …, 𝐷 (dimensionality):
Univariate (D = 1) vs. Multivariate (D > 1)

➢ Linearity / nonlinearity of the objective function:
Linear (objective function is linear in 𝑥𝑑) vs. Nonlinear (objective function is nonlinear in 𝑥𝑑)

➢ Constraints:
Unconstrained (only the search ranges of 𝑥𝑑 are constrained) vs. Constrained (additional constraints are imposed on 𝑥𝑑)
➢ Number of optimum values:
Unimodal (objective function has one optimum only) vs. Multimodal (objective function has more than one optimum)

➢ Number of objectives:
Single-objective (a single objective function is to be optimized) vs. Multi-objective (more than one objective function is to be optimized)

➢ Separability of the variables 𝑥𝑑:
Separable if the function 𝑓 (𝑥1 , 𝑥2 , … , 𝑥𝐷 ) can be divided into D single-variable functions of the form 𝑓1(𝑥1) + 𝑓2(𝑥2) + …… + 𝑓𝐷(𝑥𝐷); Non-separable if it cannot be divided in this way.
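To illustrate the separability classification, the sphere function below splits into D one-variable terms, while the classic Rosenbrock function couples consecutive variables and does not; both are standard benchmarks, used here purely as examples:

```python
import numpy as np

# Separable: sphere(X) = f1(x1) + ... + fD(xD), with fd(xd) = xd^2.
def sphere(X):
    return float(sum(x ** 2 for x in X))

# Non-separable: each Rosenbrock term mixes x_d and x_{d+1}, so the
# function cannot be written as a sum of single-variable functions.
def rosenbrock(X):
    X = np.asarray(X, dtype=float)
    return float(np.sum(100.0 * (X[1:] - X[:-1] ** 2) ** 2
                        + (1.0 - X[:-1]) ** 2))

print(sphere([1.0, 2.0]))      # 5.0
print(rosenbrock([1.0, 1.0]))  # 0.0, its global minimum
```

Separable problems can in principle be optimized one variable at a time, which is why separability matters when classifying benchmark difficulty.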
Population-based Optimization Algorithms

➢ Evolutionary optimization algorithms

➢ Swarm intelligence algorithms

➢ Evolutionary optimization algorithms are stochastic (random) search algorithms based on Darwin's theory of evolution, "survival of the fittest".

➢ Swarm intelligence algorithms are stochastic (random) search algorithms that emulate the swarm behavior of animal herds, flocks of birds, schools of fish, and colonies of ants or honey bees, where these swarms search for food in a collaborative manner.

➢ They are considered heuristic optimization algorithms.

➢ They are referred to as population-based optimization algorithms because they use a group of candidate solutions, not just one solution as in traditional methods.

➢ They are used to solve complex global optimization problems (such as multimodal problems, large-scale difficult problems, multi-objective problems, problems with no available mathematical knowledge, etc.) that classical optimization methods cannot solve, or solve only with unsatisfactory results.

➢ For the swarm intelligence algorithms and some of the evolutionary algorithms, the inspiration often comes from nature, especially biological systems.

➢ The basic characteristic of a population-based optimization method is that the iteration policy depends on a population.

➢ During the iterations, a population of constant size is maintained, and the group of solutions is improved progressively.
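The idea of maintaining and progressively improving a constant-size population can be sketched as below. This is a minimal generic population-based random search, not any specific published algorithm, and all parameter values (population size, iteration count, step scale) are illustrative:

```python
import numpy as np

# Minimal population-based stochastic search: keep a constant-size
# population, perturb every member each iteration, and greedily keep
# the better of parent and child.
def population_search(f, bounds, pop_size=20, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0]), np.asarray(bounds[1])
    D = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, D))   # initial population
    fit = np.array([f(x) for x in pop])
    step = 0.1 * (hi - lo)                          # perturbation scale
    for _ in range(iters):
        children = np.clip(pop + rng.normal(0.0, step, pop.shape), lo, hi)
        cfit = np.array([f(x) for x in children])
        better = cfit < fit                          # greedy selection
        pop[better], fit[better] = children[better], cfit[better]
    best = np.argmin(fit)
    return pop[best], fit[best]

best_x, best_f = population_search(lambda x: float(np.sum(x ** 2)),
                                   ([-5.0, -5.0], [5.0, 5.0]))
print(best_x, best_f)  # near [0, 0] with a small objective value
```

Real evolutionary and swarm algorithms differ mainly in how the new candidates are generated (crossover, velocity updates, pheromones, etc.), but they share this population-maintenance loop.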


Examples of Evolutionary Optimization Algorithms

➢ Genetic algorithm (GA) Holland, J. H. (1970)

➢ Genetic programming (GP) John R. Koza (1990)

➢ Evolution strategy (ES)

(It was created in the early 1960s and developed further in the early 1970s

by Ingo Rechenberg and Hans-Paul Schwefel)

➢ Evolutionary programming (EP) Lawrence J. Fogel (1960)

➢ Differential evolution (DE) Storn, R., and Price, K. (1997)


Examples of Swarm intelligence algorithms

➢ Ant colony optimization (ACO) Dorigo, M. (1992)

➢ Particle swarm optimization (PSO)

Kennedy, J., and Eberhart, R. (1995)

➢ Artificial bee colony (ABC) Karaboga (2005)

➢ Bat algorithm (BA) Yang (2010)


➢ Shuffled frog-leaping algorithm (SFLA)

Eusuff and Lansey (2006)


➢ Grey wolf optimization algorithm (GWO)

Mirjalili et al. (2014)



➢ Dolphin Echolocation Algorithm (DEA)


(It was first proposed in 2013 by A. Kaveh and N. Farhoudi)

➢ Fish Swarm Optimization Algorithm (FSOA)

(It was first proposed in 2002 by Li LX, Shao ZJ and Qian JX)

➢ Glowworm swarm optimization (GSO)

(It was first proposed in 2006 by Krishnanand and Ghose)

➢ Cat swarm optimization (CSO)

(It was first proposed in 2006 by S. C. Chu, P. W. Tsai, and J. S. Pan)


Population-based Optimization in Continuous Space

After Convergence

Exploration Versus Exploitation

➢ In the context of optimization, we also have two distinctive terms: exploration and exploitation.

➢ Exploration means finding new solutions in the search domains that have not been evaluated before. In exploration, the variation of the population members from one iteration to another is large.

➢ Exploitation means trying to improve the currently found solutions by performing relatively small changes that lead to new solutions in the immediate neighborhood. In exploitation, the variation of the population members from one iteration to another is very small.
The Basic Elements that Affect Exploration and Exploitation

1) The population size (the number of members in the population) affects the exploration rate: a large population increases the rate of exploration.

2) The control parameters of the optimization algorithm affect both exploration and exploitation.

Note that:

Any optimization algorithm starts with a high exploration rate, which allows it to cover large regions of the search domains quickly. As the iterations proceed, the exploration rate decreases, allowing the algorithm to exploit the promising regions that were previously explored.
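One common way to realize this shrinking exploration rate is a perturbation scale that decays over the iterations. The linear schedule below is only an illustrative sketch; the endpoint values 0.9 and 0.01 are arbitrary choices, not values from the lecture:

```python
# Linearly decaying perturbation scale: large early steps (exploration),
# small late steps (exploitation). Endpoints 0.9 and 0.01 are arbitrary.
def perturbation_scale(t, max_iters, start=0.9, end=0.01):
    return start + (end - start) * t / max_iters

scales = [perturbation_scale(t, 100) for t in range(101)]
print(scales[0], scales[50], scales[100])  # decays from 0.9 toward 0.01
```

Many algorithms use exactly this pattern, e.g. a time-varying inertia weight in PSO or a decreasing mutation step size in evolution strategies.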
Evaluation of Optimization Algorithms

➢ To evaluate the efficiency and robustness of an optimization algorithm, standard complex mathematical functions with different characteristics, called benchmark functions, are used to test the algorithm.

➢ After selecting a suitable set of benchmark functions, the algorithm is run on these functions for N independent runs. Each run consists of a predefined number of iterations.

➢ The results are reported for each function in terms of the mean and standard deviation of the error, the percentage success rate (%SR), and the average number of function evaluations over the successful runs (convergence speed).
Testing An Optimization Algorithm

Where:

• The function error value in a run = f(x_best) − f(x*), where x_best is the best solution found by the algorithm in that run and x* is the global minimum of the function.

• %SR = (No. of successful runs / Total no. of runs) × 100

➢ A run is considered successful if the algorithm reached the required global optimum (sometimes an approximate value of the global optimum is allowed).
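These definitions can be sketched in code. The per-run results, tolerance, and known optimum below are made-up example values, not data from the lecture:

```python
import numpy as np

# Error and %SR over N independent runs of an optimizer on one function.
f_star = 0.0                                            # known f(x*)
best_values = np.array([1e-9, 3e-7, 2e-9, 0.5, 4e-8])   # f(x_best) per run
tol = 1e-6                                              # success threshold

errors = best_values - f_star                           # per-run error values
success_rate = 100.0 * np.count_nonzero(errors <= tol) / len(errors)
print(errors.mean(), errors.std(), success_rate)        # 4 of 5 runs succeed
```

The mean and standard deviation of `errors` measure accuracy and robustness, while %SR summarizes reliability across the independent runs.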


Testing An Optimization Algorithm

➢ Every year, a competition between evolutionary and swarm optimization algorithms is held at the IEEE Congress on Evolutionary Computation (CEC) to evaluate these algorithms and determine the fittest one. It uses very complex benchmark functions with different characteristics.

(Ex: CEC 2005, CEC 2006, CEC 2007, …, CEC 2017.)

(https://github.com/P-N-Suganthan)
