Lec 2
Introduction to Population-based Optimization Algorithms
Dr. M. Arafa
Definition of Optimization
Examples of problems that can be formulated as an optimization problem:
➢ Optimizing the architecture and parameters of ANFIS models (the premise and consequent parameters).
➢ Tuning the parameters of a PID controller (finding the best values of Kp, Ki, and Kd); a minimal sketch of this view is given after this list.
➢ Finding the best placement of Wi-Fi access points for an indoor positioning system (IPS).
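To make the PID example concrete, the following is a minimal sketch, not taken from the lecture, of how gain tuning can be cast as an optimization problem: the decision variables are (Kp, Ki, Kd) and the objective is an assumed error criterion, here the integral of squared error (ISE) of the unit-step response of a simple first-order plant. The plant model, the criterion, and the crude random search are all illustrative assumptions.

```python
# Hypothetical objective for PID tuning: simulate the unit-step response of an
# assumed first-order plant (dy/dt = -y + u) with Euler steps and return the
# integral of squared error (ISE); smaller is better.
import random

def pid_objective(gains, dt=0.01, t_end=2.0):
    kp, ki, kd = gains
    y, integral, prev_err = 0.0, 0.0, 1.0   # plant output, error integral, previous error
    ise = 0.0
    for _ in range(int(t_end / dt)):
        err = 1.0 - y                        # unit-step reference minus plant output
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        y += dt * (-y + u)                   # Euler step of the assumed plant
        ise += err * err * dt
        prev_err = err
    return ise

# Crude random search over the gains, only to show the "tuning = optimization" view.
best = min(((random.uniform(0, 10), random.uniform(0, 5), random.uniform(0, 1))
            for _ in range(500)), key=pid_objective)
print("best (Kp, Ki, Kd):", [round(g, 3) for g in best], "ISE:", round(pid_objective(best), 4))
```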
Global and local optimum
➢ The function in Fig. 1 has one global minimum and two local minima; the global minimum is the least among all the local minima.
➢ Note that every global optimum is a local optimum, but the reverse is not necessarily true.
➢ A multimodal function has more than one local optimum and one global optimum; the global optimum is the local optimum with the least objective function value among all the local optima (for a minimization problem).
Fig. 1: f(X) is a multimodal function (plot of f(x) versus x, showing two local minima and one global minimum).
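As a concrete illustration of the definitions above, the short sketch below evaluates a simple one-dimensional multimodal function on a grid and reports its local minima and its global minimum. The function is chosen only for illustration; it is not the one plotted in Fig. 1.

```python
# Illustrative only (this is not the function plotted in Fig. 1): a 1-D
# multimodal function evaluated on a grid; it has a local minimum near x = 2.1
# and its global minimum near x = -2.35.
def f(x):
    return 0.1 * x**4 - x**2 + 0.5 * x

xs = [i * 0.001 for i in range(-5000, 5001)]          # grid on [-5, 5]
values = [f(x) for x in xs]

# A grid point is treated as a local minimum if it lies below both neighbours.
local_minima = [xs[i] for i in range(1, len(xs) - 1)
                if values[i] < values[i - 1] and values[i] < values[i + 1]]
global_minimum = min(xs, key=f)

print("local minima near:", [round(x, 2) for x in local_minima])
print("global minimum near:", round(global_minimum, 2), "with f =", round(f(global_minimum), 3))
```

Note that the global minimum also shows up among the local minima, in line with the remark that every global optimum is a local optimum.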
Fig. 2: f(X) is a unimodal function (plot of f(x) versus x).
Basic Elements of Optimization Problem
➢ f(X*) has the minimum value in the search domain (in the case of a minimization problem).
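A compact way to write the problem this statement refers to, assuming (as in the rest of the lecture) a minimization problem over D variables restricted to a box-shaped search domain S, is:

```latex
\min_{X=(x_1,\dots,x_D)\in S} f(X),
\qquad S=\{X : x_d^{\min}\le x_d\le x_d^{\max},\ d=1,\dots,D\},
\qquad X^{*}=\arg\min_{X\in S} f(X),\ \text{so that } f(X^{*})\le f(X)\ \text{for all } X\in S.
```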
Types of optimization problems (by classification basis):
➢ Number of variables x_d, d = 1, 2, …, D (dimensionality): Univariate (D = 1) or Multivariate (D > 1).
➢ Linearity / nonlinearity of the objective function: Linear (the objective function is linear in x_d) or Nonlinear (the objective function is nonlinear in x_d).
➢ Constraints: Unconstrained (only the search ranges of x_d are constrained) or Constrained (additional constraints are imposed on x_d).
➢ Number of optimum values: Unimodal (the objective function has only one optimum) or Multimodal (the objective function has more than one optimum).
➢ Number of objectives: Single-objective (a single objective function is to be optimized) or Multi-objective (more than one objective function is to be optimized).
➢ Separability of the variables x_d: Separable (the function f(x_1, x_2, …, x_D) can be divided into D one-variable functions of the form f_1(x_1) + f_2(x_2) + … + f_D(x_D)) or Non-separable (it cannot be divided in this form).
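Two standard benchmark functions, chosen here only as illustrations (they are not named in the table), make the separability row concrete: the sphere function is separable, while the Rosenbrock function is non-separable because each of its terms couples two variables.

```latex
% Sphere function: separable, since it splits into D one-variable functions.
f(x_1,\dots,x_D)=\sum_{d=1}^{D} x_d^{2}=f_1(x_1)+f_2(x_2)+\dots+f_D(x_D),\qquad f_d(x_d)=x_d^{2}.
% Rosenbrock function: non-separable, because every term couples x_d and x_{d+1}.
f(x_1,\dots,x_D)=\sum_{d=1}^{D-1}\left[\,100\,(x_{d+1}-x_d^{2})^{2}+(1-x_d)^{2}\,\right].
```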
Population-based Optimization Algorithms
➢ Swarm intelligence emulates the swarm behavior of herding animals, flocks of birds, schools of fish, and colonies of ants or honey bees, where these swarms search for food in a collaborative manner.
➢ For the swarm intelligence algorithms and some of the evolutionary algorithms, a population of constant size is maintained during the iterations, and the group of candidate solutions is updated from one iteration to the next; a generic sketch of such a loop is given below.
➢ The evolutionary algorithms date back to the early 1960s and were developed further in the early 1970s, while many swarm intelligence algorithms are much more recent; the artificial fish swarm algorithm, for example, was first proposed in 2002 by Li LX, Shao ZJ and Qian JX.
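The loop below is a minimal sketch of this idea rather than any particular algorithm from the lecture: a population of fixed size N is kept for a fixed number of iterations, and each member is nudged toward the best solution found so far plus a random perturbation. The update rule, the parameter names, and the sphere test function are assumptions made only for illustration.

```python
# Generic population-based minimizer (illustrative update rule, not a specific
# named algorithm): keep N candidate solutions, improve them iteratively, and
# always remember the best solution found so far.
import random

def optimize(f, D, lo, hi, N=30, iters=200, step=0.5):
    pop = [[random.uniform(lo, hi) for _ in range(D)] for _ in range(N)]   # initial population
    best = min(pop, key=f)
    for _ in range(iters):
        new_pop = []
        for x in pop:
            # Pull each coordinate toward the best-so-far, add Gaussian noise,
            # and clamp back into the box [lo, hi].
            candidate = [min(hi, max(lo, xi + random.random() * (bi - xi)
                                         + step * random.gauss(0, 1)))
                         for xi, bi in zip(x, best)]
            # Greedy replacement keeps the population size constant.
            new_pop.append(candidate if f(candidate) < f(x) else x)
        pop = new_pop
        best = min(pop + [best], key=f)     # the best solution never gets worse
    return best, f(best)

# Example: minimize the 5-dimensional sphere function over [-5, 5]^5.
sphere = lambda x: sum(xi * xi for xi in x)
best_x, best_f = optimize(sphere, D=5, lo=-5.0, hi=5.0)
print("best objective value found:", round(best_f, 6))
```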
(Figure: the population after convergence.)
Exploration Versus Exploitation
Testing An Optimization Algorithm
Note that:
➢ The results are obtained for each function in terms of the mean and standard deviation of the error, the percentage success rate (%SR), and the average number of function evaluations over successful runs (convergence speed).
Where:
• The function error value in a run = f(x_best) − f(x*), where x_best is the best solution found by the algorithm in that run and x* is the global minimum of the function.
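As an illustration of how these statistics are assembled, the sketch below uses made-up per-run records (final error and number of function evaluations) and an assumed success tolerance; none of the numbers come from the lecture.

```python
# Compute the reported statistics from a set of independent runs: mean and
# standard deviation of the final error, percentage success rate (%SR), and
# the average number of function evaluations (FEs) over successful runs only.
import statistics

TOL = 1e-8  # assumed success threshold on the error; not specified here

# Each tuple is one run: (final error f(x_best) - f(x*), function evaluations used).
runs = [(3.2e-9, 45_120), (7.9e-10, 38_450), (4.1e-3, 100_000), (9.5e-9, 51_230)]

errors = [err for err, _ in runs]
successful = [(err, fes) for err, fes in runs if err <= TOL]

mean_error = statistics.mean(errors)
std_error = statistics.stdev(errors)
success_rate = 100.0 * len(successful) / len(runs)                    # %SR
avg_fes_successful = (statistics.mean(fes for _, fes in successful)   # convergence speed
                      if successful else float("nan"))

print(f"mean error = {mean_error:.3e}, std = {std_error:.3e}")
print(f"%SR = {success_rate:.1f}%, average FEs over successful runs = {avg_fes_successful:.0f}")
```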
(https://github.com/P-N-Suganthan)