Variable Neighborhood Search
Introduction
VNS systematically changes the neighborhood in two phases: first, a descent phase to find a local optimum, and
second, a perturbation phase to escape the corresponding valley.
Applications are rapidly increasing in number and pertain to many fields: location theory, cluster analysis,
scheduling, vehicle routing, network design, lot-sizing, artificial intelligence, engineering, pooling
problems, biology, phylogeny, reliability, geometry, telecommunication design, etc.
There are several books important for understanding VNS, such as Handbook of Metaheuristics, 2010,[3]
Handbook of Metaheuristics, 2003,[4] and Search Methodologies, 2005.[5] Earlier work that motivated this
approach can be found in
1. Davidon, W.C.[6]
2. Fletcher, R., Powell, M.J.D.[7]
3. Mladenović, N.[8] and
4. Brimberg, J., Mladenović, N.[9]
Recent surveys on VNS methodology, as well as numerous applications, can be found in 4OR, 2008[10] and
Annals of OR, 2010.[1]
Definition of the problem
Define a deterministic optimization problem as

min { f(x) | x ∈ X, X ⊆ S },   (1)

where S, X, x, and f are the solution space, the feasible set, a feasible solution, and a real-valued objective
function, respectively. If S is a finite but large set, a combinatorial optimization problem is defined. If
S = ℝⁿ, there is a continuous optimization model.
A solution x* ∈ X is optimal if

f(x*) ≤ f(x), ∀ x ∈ X.
An exact algorithm for problem (1) finds an optimal solution x*, together with the proof of its
optimality, or shows that there is no feasible solution, i.e., X = ∅, or that the solution is unbounded.
Moreover, the CPU time should be finite and short. For continuous optimization, it is reasonable to allow
for some degree of tolerance, i.e., to stop when a feasible solution x* has been found such that

f(x*) ≤ f(x) + ε, ∀ x ∈ X   or   (f(x*) − f(x)) / f(x*) ≤ ε, ∀ x ∈ X.
Some heuristics quickly deliver an approximate solution, or even an optimal one, but without a proof of
its optimality. Some of them have an incorrect certificate, i.e., the solution x_h obtained satisfies

f(x_h) ≤ f(x) + ε, ∀ x ∈ X

for some ε, though ε is rarely known to be small. Heuristics face the problem of local optima as a
consequence of avoiding unbounded computing time. A local optimum x_L of problem (1) is such that

f(x_L) ≤ f(x), ∀ x ∈ N(x_L) ∩ X,

where N(x_L) denotes a neighborhood of x_L.
Description
According to (Mladenović, 1995), VNS is a metaheuristic which systematically performs the procedure of
neighborhood change, both in descent to local minima and in escape from the valleys which contain them.
It is built upon the following perceptions:
1. A local minimum with respect to one neighborhood structure is not necessarily a local
minimum for another neighborhood structure.
2. A global minimum is a local minimum with respect to all possible neighborhood structures.
3. For many problems, local minima with respect to one or several neighborhoods are
relatively close to each other.
Unlike many other metaheuristics, the basic schemes of VNS and its extensions are simple and require few,
and sometimes no parameters. Therefore, in addition to providing very good solutions, often in simpler
ways than other methods, VNS gives insight into the reasons for such a performance, which, in turn, can
lead to more efficient and sophisticated implementations.
The method is described in several of the papers mentioned above, such as (Hansen and
Mladenović 1999, 2001a, 2003, 2005; Moreno-Pérez et al.[11]).
Local search
A local search heuristic is performed by choosing an initial solution x, finding a direction of
descent from x within a neighborhood N(x), and moving to the minimum of f(x) within N(x) along that
direction. If there is no direction of descent, the heuristic stops; otherwise, it is iterated. Usually the steepest
direction of descent, also referred to as best improvement, is used. This set of rules is summarized in
Algorithm 1, where we assume that an initial solution x is given. The output consists of a local minimum,
denoted by x', and its value. Observe that a neighborhood structure N(x) is defined for all x ∈ X. At each
step, the neighborhood N(x) of x is explored completely. As this may be time-consuming, an alternative is to
use the first descent heuristic. Vectors x^i ∈ N(x) are then enumerated systematically and a move is made
as soon as a direction of descent is found. This is summarized in Algorithm 2.
Function BestImprovement(x)
1: repeat
2: x' ← x
3: x ← argmin{ f(y) : y ∈ N(x) }
4: until ( f(x) ≥ f(x') )
5: return x'
Function FirstImprovement(x)
1: repeat
2: x' ← x; i ← 0
3: repeat
4: i ← i + 1
5: x ← argmin{ f(x), f(x^i)}, x^i ∈ N(x)
6: until ( f(x) < f(x^i) or i = |N(x)| )
7: until ( f(x) ≥ f(x') )
8: return x'
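As a concrete illustration, Algorithms 1 and 2 can be sketched in Python on a toy problem; the objective f(x) = (x − 3)², the integer domain 0..10, and the neighborhood N(x) = {x − 1, x + 1} are illustrative assumptions, not part of the source.

```python
# Sketch of best-improvement and first-improvement local search on a toy
# problem: minimize f(x) = (x - 3)^2 over the integers 0..10, with
# N(x) = {x - 1, x + 1} restricted to the domain (illustrative assumptions).

def f(x):
    return (x - 3) ** 2

def neighbors(x, lo=0, hi=10):
    return [y for y in (x - 1, x + 1) if lo <= y <= hi]

def best_improvement(x):
    """Algorithm 1: move to the best neighbor until no neighbor improves f."""
    while True:
        x_prime = x
        x = min(neighbors(x_prime), key=f)   # explore N(x') completely
        if f(x) >= f(x_prime):               # no improvement: local minimum
            return x_prime

def first_improvement(x):
    """Algorithm 2: enumerate neighbors and move at the first improving one."""
    while True:
        x_prime = x
        for y in neighbors(x_prime):
            if f(y) < f(x_prime):            # first descent direction found
                x = y
                break
        if f(x) >= f(x_prime):
            return x_prime

print(best_improvement(10))   # both heuristics reach the minimum at x = 3
print(first_improvement(0))
```

On a convex toy objective both rules find the same local (here global) minimum; on rugged landscapes they can stop at different local minima, which is why the comparison is listed later as a tuning step.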
Let one denote with N_k (k = 1, …, k_max) a finite set of pre-selected neighborhood structures, and with
N_k(x) the set of solutions in the kth neighborhood of x.
One will also use the notation N'_l(x), l = 1, …, l_max, when describing local descent. Neighborhoods
N_k(x) or N'_l(x) may be induced from one or more metric (or quasi-metric) functions introduced into a
solution space S. An optimal solution (or global minimum) is a feasible solution where a minimum of
problem (1) is reached. We call x' ∈ X a local minimum of problem (1) with respect to N_k, if there is no
solution x ∈ N_k(x') ∩ X such that f(x) < f(x').
In order to solve problem (1) by using several neighborhoods, facts 1–3 can be used in three different ways: (i)
deterministic; (ii) stochastic; (iii) both deterministic and stochastic. We first give in Algorithm 3 the steps of
the neighborhood change function, which will be used later. The function NeighborhoodChange() compares the
new value f(x') with the incumbent value f(x) obtained in the neighborhood k (line 1). If an improvement is
obtained, k is returned to its initial value and the new incumbent is updated (line 2). Otherwise, the next
neighborhood is considered (line 3).
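A minimal sketch of the NeighborhoodChange() function just described, assuming it returns the updated incumbent and neighborhood index:

```python
# Sketch of the NeighborhoodChange() step (Algorithm 3); the return-value
# convention (updated incumbent and index k) is an assumption.

def neighborhood_change(x, x_prime, k, f):
    """Compare f(x') with f(x): accept and reset k on improvement (lines 1-2),
    otherwise move to the next neighborhood (line 3)."""
    if f(x_prime) < f(x):          # line 1: improvement test
        return x_prime, 1          # line 2: new incumbent, k back to 1
    return x, k + 1                # line 3: keep incumbent, next neighborhood

f = lambda x: (x - 3) ** 2         # illustrative objective
print(neighborhood_change(5, 4, 2, f))   # improvement: returns (4, 1)
print(neighborhood_change(5, 7, 2, f))   # no improvement: returns (5, 3)
```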
When VNS does not render a good solution, there are several steps that can help, such
as comparing first- and best-improvement strategies in local search, reducing the neighborhood, intensifying
shaking, adopting VND, adopting FSS, and experimenting with parameter settings.
The Basic VNS (BVNS) method (Handbook of Metaheuristics, 2010)[3] combines deterministic and
stochastic changes of neighborhood. Its steps are given in Algorithm 4. Often successive neighborhoods
N_k will be nested. Observe that point x' is generated at random in Step 4 in order to avoid cycling, which
might occur if a deterministic rule were applied. In Step 5, the best improvement local search (Algorithm 1)
is usually adopted. However, it can be replaced with first improvement (Algorithm 2).
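The BVNS scheme (shaking, local search, neighborhood change) can be sketched on a toy problem as follows; the objective, the neighborhoods (integer points at distance k), and the iteration-count stopping rule are illustrative assumptions, not part of the source:

```python
# Self-contained sketch of Basic VNS: shake in the k-th neighborhood,
# run best-improvement local search, then change the neighborhood.
import random

def bvns(f, x, k_max, t_max, lo=0, hi=100, seed=0):
    rng = random.Random(seed)
    for _ in range(t_max):                     # stopping condition: t_max rounds
        k = 1
        while k <= k_max:
            # Shaking: draw x' at random from the k-th neighborhood of x
            x_shake = min(hi, max(lo, x + rng.choice([-k, k])))
            # Local search (best improvement over N(x) = {x - 1, x + 1})
            x_local = x_shake
            while True:
                y = min((z for z in (x_local - 1, x_local + 1)
                         if lo <= z <= hi), key=f)
                if f(y) >= f(x_local):
                    break
                x_local = y
            # Neighborhood change: accept and reset k, or widen the shake
            if f(x_local) < f(x):
                x, k = x_local, 1
            else:
                k += 1
    return x

f = lambda x: (x - 42) ** 2
print(bvns(f, 0, k_max=5, t_max=20))   # reaches the global minimum 42
```

The convex toy objective makes every local search succeed; the point of the sketch is the control flow, in which k only grows while the incumbent resists improvement.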
VNS variants
The basic VNS is a best improvement descent method with randomization.[3] Without much additional
effort, it can be transformed into a descent-ascent method: in NeighbourhoodChange() function, replace
also x by x" with some probability, even if the solution is worse than the incumbent. It can also be changed
into a first improvement method. Another variant of the basic VNS is to find a solution x' in the
'Shaking' step as the best among b (a parameter) randomly generated solutions from the kth neighborhood.
There are two possible variants of this extension: (1) perform only one local search, from the best among
the b points; (2) perform all b local searches and then choose the best. Such an algorithm can be found in
(Fleszar and Hindi[12]).
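The best-of-b shaking step can be sketched as follows; the function name, the integer neighborhood (points within distance k), and the bounds are assumptions, and a full variant would continue with one or b local searches from the result:

```python
# Sketch of the 'best among b random points' shaking variant described above.
import random

def shake_best_of_b(f, x, k, b, lo=0, hi=100, rng=random):
    """Draw b random points from the k-th neighborhood of x, keep the best."""
    candidates = [min(hi, max(lo, x + rng.randint(-k, k))) for _ in range(b)]
    return min(candidates, key=f)      # best of the b shaken points

f = lambda x: (x - 42) ** 2
random.seed(1)
print(shake_best_of_b(f, 30, k=10, b=8))  # some point in [20, 40], biased toward 40
```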
Extensions
VND[13]
The variable neighborhood descent (VND) method is obtained if the change of
neighborhoods is performed in a deterministic way.
RVNS[14]
The reduced VNS (RVNS) method is obtained if random points are selected from N_k(x)
and no descent is made. Rather, the values of these new points are compared with that of
the incumbent and an update takes place in case of improvement. It is assumed that a
stopping condition has been chosen, such as the maximum CPU time allowed t_max or the
maximum number of iterations between two improvements.
Therefore, RVNS uses two parameters: t_max and k_max. RVNS is useful in very large
instances, for which local search is costly. It has been observed that the best value for
the parameter k_max is often 2. In addition, the maximum number of iterations between
two improvements is usually used as a stopping condition. RVNS is akin to a
Monte-Carlo method, but is more systematic.
Skewed VNS
The skewed VNS (SVNS) method (Hansen et al.)[15] addresses the problem of exploring
valleys far from the incumbent solution. Indeed, once the best solution in a large region
has been found, it is necessary to go some way to obtain an improved one. Solutions
drawn at random in distant neighborhoods may differ substantially from the incumbent and
VNS can then degenerate, to some extent, into the Multistart heuristic (in which descents
are made iteratively from solutions generated at random, a heuristic which is known not to
be very efficient). Consequently, some compensation for distance from the incumbent must
be made.
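In the literature this compensation is usually expressed as a relaxed acceptance test: x″ is accepted when f(x″) − α·ρ(x, x″) < f(x), where ρ is a distance between solutions and α a parameter. A minimal sketch, with an assumed distance function:

```python
# Sketch of the skewed acceptance criterion: accept a worse solution if it is
# far enough from the incumbent. alpha and the distance rho are assumptions.

def skewed_accept(f, x, x_pp, alpha, rho):
    """Accept x'' when f(x'') - alpha * rho(x, x'') < f(x)."""
    return f(x_pp) - alpha * rho(x, x_pp) < f(x)

f = lambda x: (x - 3) ** 2
rho = lambda a, b: abs(a - b)              # assumed distance between solutions
print(skewed_accept(f, 3, 5, 0.0, rho))    # plain VNS test: 4 < 0 is False
print(skewed_accept(f, 3, 5, 2.5, rho))    # skewed: 4 - 2.5*2 = -1 < 0, True
```

With α = 0 the test reduces to the ordinary VNS acceptance; larger α tolerates worse but more distant solutions, which is exactly the compensation for distance the text calls for.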
Parallel VNS
Several ways of parallelizing VNS have recently been proposed for solving the p-Median
problem. In García-López et al.[17] three of them are tested: (i) parallelize local search; (ii)
augment the number of solutions drawn from the current neighborhood and make a local
search in parallel from each of them and (iii) do the same as (ii) but update the information
about the best solution found. Three Parallel VNS strategies are also suggested for
solving the Travelling purchaser problem in Ochi et al.[18]
Primal-dual VNS
For most modern heuristics, the difference in value between the optimal solution and the
obtained one is completely unknown. Guaranteed performance of the primal heuristic may
be determined if a lower bound on the objective function value is known. To this end, the
standard approach is to relax the integrality condition on the primal variables, based on a
mathematical programming formulation of the problem.
However, when the dimension of the problem is large, even the relaxed problem may be
impossible to solve exactly by standard commercial solvers. Therefore, it seems a good
idea to solve dual relaxed problems heuristically as well. This yields guaranteed
bounds on the primal heuristic's performance. In primal-dual VNS (PD-VNS) (Hansen et
al.)[19] one possible general way to attain both the guaranteed bounds and the exact
solution is proposed.
Variable neighborhood formulation space search
FSS is a useful method because one problem can often be defined by several alternative
formulations, and moving between formulations is legitimate. Local search is shown to
work within formulations, implying a final solution when started from some initial solution
in the first formulation. Local search that systematically alternates between different
formulations was investigated for the circle packing problem (CPP), where a stationary
point of a nonlinear programming formulation of CPP in Cartesian coordinates is not
necessarily a stationary point in polar coordinates.
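The two coordinate formulations mentioned for CPP can be illustrated by the standard change of variables between Cartesian and polar coordinates; FSS would alternate local search between such formulation spaces, and the conversion itself is the only part shown here:

```python
# Illustration of two formulations of the same solution: a circle center in
# Cartesian vs. polar coordinates. The conversion is the standard change of
# variables; using the two spaces as alternating formulations is the FSS idea.
import math

def to_polar(x, y):
    return math.hypot(x, y), math.atan2(y, x)    # (r, theta)

def to_cartesian(r, theta):
    return r * math.cos(theta), r * math.sin(theta)

x, y = 3.0, 4.0
r, theta = to_polar(x, y)
x2, y2 = to_cartesian(r, theta)
print(round(r, 6))                 # 5.0
print(round(x2, 6), round(y2, 6))  # recovers (3.0, 4.0)
```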
Applications
Applications of VNS, and of its variants, are abundant and numerous. Fields in which
collections of scientific papers can be found include:
Industrial applications
Design problems in communication
Location problems
Data mining
Graph problems
Knapsack and packing problems
Mixed integer problems
Time tabling
Scheduling
Vehicle routing problems
Arc routing and waste collection
Fleet sheet problems
Extended vehicle routing problems
Problems in biosciences and chemistry
Continuous optimization
Other optimization problems
Discovery science
Conclusion
VNS has a number of desirable features, which are presented by Hansen and Mladenović.[22]
References
1. Hansen, P.; Mladenović, N.; Perez, J.A.M. (2010). "Variable neighbourhood search: methods
and applications". Annals of Operations Research. 175: 367–407. doi:10.1007/s10479-009-
0657-6 (https://doi.org/10.1007%2Fs10479-009-0657-6). S2CID 26469746 (https://api.sema
nticscholar.org/CorpusID:26469746).
2. Nenad Mladenović; Pierre Hansen (1997). "Variable neighborhood search". Computers and
Operations Research. 24 (11): 1097–1100. CiteSeerX 10.1.1.800.1797 (https://citeseerx.ist.p
su.edu/viewdoc/summary?doi=10.1.1.800.1797). doi:10.1016/s0305-0548(97)00031-2 (http
s://doi.org/10.1016%2Fs0305-0548%2897%2900031-2).
3. Gendreau, M.; Potvin, J-Y. (2010). "Handbook of Metaheuristics". Springer.
4. Glover, F.; Kochenberger, G.A. (2003). "Handbook of Metaheuristics". Kluwer Academic
Publishers.
5. Burke, EK.; Kendall, G. (2005). Burke, Edmund K; Kendall, Graham (eds.). Search
methodologies. Introductory tutorials in optimization and decision support techniques.
Springer. doi:10.1007/978-1-4614-6940-7 (https://doi.org/10.1007%2F978-1-4614-6940-7).
ISBN 978-1-4614-6939-1.
6. Davidon, W.C. (1959). "Variable metric algorithm for minimization". Argonne National
Laboratory Report ANL-5990.
7. Fletcher, R.; Powell, M.J.D. (1963). "Rapidly convergent descent method for minimization" (h
ttps://doi.org/10.1093%2Fcomjnl%2F6.2.163). Comput. J. 6 (2): 163–168.
doi:10.1093/comjnl/6.2.163 (https://doi.org/10.1093%2Fcomjnl%2F6.2.163).
8. Mladenović, N. (1995). "A variable neighborhood algorithm—a new metaheuristic for
combinatorial optimization". Abstracts of Papers Presented at Optimization Days, Montréal:
112.
9. Brimberg, J.; Mladenović, N. (1996). "A variable neighborhood algorithm for solving the
continuous location-allocation problem". Stud. Locat. Anal. 10: 1–12.
10. Hansen, P.; Mladenović, N.; Perez, J.A.M (2008). "Variable neighbourhood search: methods
and applications". 4OR. 6 (4): 319–360. doi:10.1007/s10288-008-0089-1 (https://doi.org/10.1
007%2Fs10288-008-0089-1). S2CID 538959 (https://api.semanticscholar.org/CorpusID:538
959).
11. Moreno-Pérez, JA.; Hansen, P.; Mladenović, N. (2005). "Parallel variable neighborhood
search". In Alba, E (ed.). Parallel Metaheuristics: A New Class of Algorithms. pp. 247–266.
CiteSeerX 10.1.1.615.2796 (https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.615.
2796). doi:10.1002/0471739383.ch11 (https://doi.org/10.1002%2F0471739383.ch11).
ISBN 9780471739388.
12. Fleszar, K; Hindi, KS (2004). "Solving the resource-constrained project scheduling problem
by a variable neighbourhood search". Eur J Oper Res. 155 (2): 402–413.
doi:10.1016/s0377-2217(02)00884-6 (https://doi.org/10.1016%2Fs0377-2217%2802%2900
884-6).
13. Brimberg, J.; Hansen, P.; Mladenović, N.; Taillard, E. (2000). "Improvements and comparison
of heuristics for solving the multisource Weber problem" (http://bura.brunel.ac.uk/handle/243
8/6681). Oper. Res. 48 (3): 444–460. doi:10.1287/opre.48.3.444.12431 (https://doi.org/10.12
87%2Fopre.48.3.444.12431).
14. Mladenović, N.; Petrovic, J.; Kovacevic-Vujcic, V.; Cangalovic, M. (2003b). "Solving spread
spectrum radar polyphase code design problem by tabu search and variable neighborhood
search". Eur. J. Oper. Res. 151 (2): 389–399. doi:10.1016/s0377-2217(02)00833-0 (https://do
i.org/10.1016%2Fs0377-2217%2802%2900833-0).
15. Hansen, P.; Jaumard, B; Mladenović, N; Parreira, A (2000). "Variable neighborhood search
for weighted maximum satisfiability problem". Les Cahiers du GERAD G–2000–62, HEC
Montréal, Canada.
16. Hansen, P; Mladenović, N; Pérez-Brito, D (2001). "Variable neighborhood decomposition
search". J Heuristics. 7 (4): 335–350. doi:10.1023/A:1011336210885 (https://doi.org/10.102
3%2FA%3A1011336210885). S2CID 31111583 (https://api.semanticscholar.org/CorpusID:3
1111583).
17. García-López, F; Melián-Batista, B; Moreno-Pérez, JA (2002). "The parallel variable
neighborhood search for the p-median problem". J Heuristics. 8 (3): 375–388.
doi:10.1023/A:1015013919497 (https://doi.org/10.1023%2FA%3A1015013919497).
S2CID 16096161 (https://api.semanticscholar.org/CorpusID:16096161).
18. Ochi, LS; Silva, MB; Drummond, L (2001). "Metaheuristics based on GRASP and VNS for
solving traveling purchaser problem" (https://www.researchgate.net/publication/228986047).
MIC'2001, Porto: 489–494.
19. Hansen, P; Brimberg, J; Urošević, D; Mladenović, N (2007a). "Primal-dual variable
neighborhood search for the simple plant location problem" (http://bura.brunel.ac.uk/handle/
2438/6678). INFORMS J Comput. 19 (4): 552–564. doi:10.1287/ijoc.1060.0196 (https://doi.or
g/10.1287%2Fijoc.1060.0196).
20. Hansen, P.; Mladenović, N.; Urosevic, D. (2006). "Variable neighborhood search and local
branching". Computers and Operations Research. 33 (10): 3034–3045.
CiteSeerX 10.1.1.108.987 (https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.108.9
87). doi:10.1016/j.cor.2005.02.033 (https://doi.org/10.1016%2Fj.cor.2005.02.033).
21. Mladenović, N.; Plastria, F.; Urosevic, D. (2006). "Reformulation descent applied to circle
packing problems". Computers and Operations Research. 32 (9): 2419–2434.
doi:10.1016/j.cor.2004.03.010 (https://doi.org/10.1016%2Fj.cor.2004.03.010).
22. Hansen, P; Mladenović, N (2003). "Variable neighborhood search". In Glover F;
Kochenberger G (eds.). Handbook of Metaheuristics. International Series in Operations
Research & Management Science. Vol. 57. Dordrecht: Kluwer. pp. 145–184.
CiteSeerX 10.1.1.635.7056 (https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.635.
7056). doi:10.1007/0-306-48056-5_6 (https://doi.org/10.1007%2F0-306-48056-5_6).
ISBN 978-1-4020-7263-5.
External links
EURO Mini Conference XXVIII on Variable Neighbourhood Search (http://toledo.mi.sanu.ac.
rs/~grujicic/vnsconference)
The 5th International Conference on Variable Neighborhood Search (http://vnsconference.uf
op.br/)
The 8th International Conference on Variable Neighborhood Search (http://www.icvns2020.i
nfo/)