Unit VI
“Optimization”
UNIT NO: VI
MODERN METHODS OF OPTIMIZATION
Modern Optimization Techniques (7 Hours)
Initialization of an algorithm
Termination of an algorithm
Fitness:
Fitness is the value assigned to an individual, based on how close the individual is to the solution. The greater the fitness value, the better the solution it represents.
Fitness function: The fitness function assigns a fitness value to each individual. It is problem specific.
Selection: Selecting individuals for creating the next generation.
Recombination (or crossover): Genes from two parents are combined to form a new chromosome.
Mutation: Changing a random gene in an individual.
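A short worked example (bit-string encoding, chosen here for illustration): one-point crossover of parents 1011001 and 0100110 after the fifth gene yields the offspring 1011010; a mutation might then flip its third gene, giving 1001010.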
General Algorithm
GA START
  Generate initial population
  Assign fitness values to all individuals
  DO UNTIL best solution is found
    Select individuals from current generation
    Create new offspring with crossover and mutation
    Compute new fitness for all individuals
    Kill all unfit individuals to make room for new offspring
    Check if best solution is found
  LOOP
GA END
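A minimal runnable Python sketch of this loop, assuming a toy OneMax fitness (count of 1-bits), tournament selection, one-point crossover, and bit-flip mutation; all parameter values are illustrative, not prescribed by the slides:

import random

def fitness(individual):            # problem-specific; here: count of 1s (OneMax, for illustration)
    return sum(individual)

def select(population, k=3):        # tournament selection: fittest of k random individuals
    return max(random.sample(population, k), key=fitness)

def crossover(p1, p2):              # one-point crossover
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:]

def mutate(individual, rate=0.01):  # flip each gene with small probability
    return [g ^ 1 if random.random() < rate else g for g in individual]

def genetic_algorithm(n_genes=20, pop_size=50, generations=100):
    population = [[random.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(crossover(select(population), select(population)))
                     for _ in range(pop_size)]
        # keep only the fittest individuals ("kill all unfit individuals")
        population = sorted(population + offspring, key=fitness, reverse=True)[:pop_size]
        if fitness(population[0]) == n_genes:   # best possible solution found
            break
    return population[0]

print(genetic_algorithm())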
Genetic Algorithm
One of the modern optimization techniques
Based on biological molecular phenomena
Principles of natural genetics and natural selection, including reproduction, crossover, and mutation, drive the genetic search
GA repeatedly modifies a population of individual solutions
At each step, individuals are selected as parents, with a bias toward fitter individuals, and used to produce the next generation
In successive generations, the population evolves towards an optimal solution
Simulated Annealing
Simulated annealing is an optimization technique based on the analogy between the way a metal cools and freezes into a minimum-energy crystalline structure (the annealing process) and the search for a minimum in a more general system.
Natural motivation: the properties of the structure depend on the cooling rate after the substance has been heated to its melting point. Slow cooling forms large crystals, which is useful for the substance's structure. Abrupt cooling forms a weak structure.
“Agitation” at high temperature is accompanied by high molecular activity in the physical system.
Introduction: Simulated Annealing
SA Algorithm
The initial solution: For most problems, the initial solution is random.
Solution estimation: Evaluating a solution consists of decoding the current solution and performing the computation needed to judge its suitability for the given problem.
Random search of the solution: Solution search begins by copying the current solution into a working solution, which is then modified in a random way.
The overall flow: create the initial solution → evaluate the solution → change the solution in a random way → evaluate the new solution → acceptance criterion → reduce temperature. Throughout, the algorithm keeps track of the current solution, the working solution, and the best solution.
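The acceptance criterion is not spelled out in the original slide; the standard Metropolis rule used in most SA implementations is: an improving working solution is always accepted, while a worse one with energy increase ΔE = E(working) − E(current) is accepted with probability P = exp(−ΔE / T). At high temperature T almost any move is accepted; as T falls, worse moves are accepted less and less often.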
Temperature
The initial temperature should be high enough to make it possible to sample other areas of the range of solutions.
If the maximum distance between neighbouring solutions is known, it is easy to compute the initial temperature. The initial temperature can also be adjusted dynamically: given statistics on the acceptance rate of worse solutions and on the discovery of new best solutions, the temperature can be raised until the required acceptance rate (discovery of new solutions) is attained.
This process is analogous to heating a substance until it becomes liquid, after which there is no point in raising the temperature further.
Final temperature: Although zero is a convenient final temperature, with the geometric cooling schedule used in the example the algorithm would run much longer than is really necessary. Therefore the final temperature is usually taken slightly above zero (for example, 0.5).
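As a concrete illustration of the loop and the temperature schedule, here is a minimal runnable Python sketch; the one-dimensional energy function, starting range, and cooling parameters are illustrative assumptions, using geometric cooling down to a final temperature of 0.5 as suggested above:

import math, random

def energy(x):                      # toy objective to minimize; an assumption for illustration
    return x * x + 10 * math.sin(x)

def simulated_annealing(t_initial=100.0, t_final=0.5, alpha=0.98):
    current = random.uniform(-10, 10)            # random initial solution
    best = current
    t = t_initial
    while t > t_final:                           # stop slightly above zero temperature
        working = current + random.gauss(0, 1)   # change the solution in a random way
        delta = energy(working) - energy(current)
        # Metropolis acceptance: always take improvements, sometimes take worse solutions
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = working
        if energy(current) < energy(best):
            best = current
        t *= alpha                               # geometric cooling schedule
    return best

print(simulated_annealing())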
Advantages of annealing
no restrictions on the form of the function being minimized
search for a global minimum
effectiveness in solving various classes of problems that require optimization
Annealing deficiencies
the requirement of infinitely slow cooling, which in practice means slow operation of the algorithm
complexity of tuning
Ranges of application (SA)
path creation
image reconstruction
task assignment and planning
network placement
global routing
detection and recognition of visual targets
design of special digital filters
Tabu Search
Meta-heuristic techniques
Motivation: plain local search descends from a starting point and can be trapped by a barrier in a local minimum instead of reaching the global minimum (original figure: starting point, descent direction, local minima, global minima).
Tabu Search
Background:
The tabu search (TS) algorithm was proposed by Glover (1986).
In the 1990s, the tabu search algorithm became very popular in solving optimization problems.
Nowadays, it is one of the most widespread single-solution (S-) metaheuristics.
The use of memory is the particular feature of tabu search.
TS behaves like a steepest-descent algorithm, but it accepts non-improving solutions to escape from local optima.
Tabu Search
Main concepts:
The key feature of the TS method is the use of memory, which records information related to the search process.
TS generates neighborhood solutions from the current solution and accepts the best one even if it does not improve the current solution.
This strategy may lead to cycles, i.e. previously visited solutions could be selected again.
In order to avoid cycles, TS discards solutions that have been previously visited, using a memory called the tabu list.
The length of the memory (tabu list) controls the search process.
If the tabu list is long, the search explores larger regions and is forbidden from revisiting a high number of solutions.
A short tabu list concentrates the search on a small area of the search space.
Tabu Search
Main concepts:
At each iteration the tabu list is updated (first-in, first-out queue).
The tabu list holds a constant number of tabu moves; the tabu tenure is the length of time for which a move remains forbidden.
If a move is good and can improve the search but is in the tabu list, it need not be prohibited and the solution is accepted; this override is called the aspiration criterion.
Tabu Search Algorithm
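The original algorithm slide is a figure; as a stand-in, here is a minimal runnable Python sketch of tabu search over bit strings, using single-bit flips as the neighborhood, a FIFO tabu list of recently flipped positions, and the aspiration criterion described above. The cost function, string length, tenure, and iteration count are illustrative assumptions:

from collections import deque
import random

TARGET = [random.randint(0, 1) for _ in range(16)]   # hidden target bit string (toy problem)

def cost(solution):
    # toy cost to minimize: Hamming distance to TARGET (an assumption for illustration)
    return sum(b != t for b, t in zip(solution, TARGET))

def tabu_search(n_iterations=200, tenure=7):
    current = [random.randint(0, 1) for _ in range(len(TARGET))]
    best = list(current)
    tabu = deque(maxlen=tenure)              # FIFO tabu list of recently flipped positions
    for _ in range(n_iterations):
        best_move, best_neighbor = None, None
        for i in range(len(current)):        # neighborhood: flip one bit
            neighbor = current[:]
            neighbor[i] ^= 1
            # skip tabu moves unless they beat the best-so-far (aspiration criterion)
            if i in tabu and cost(neighbor) >= cost(best):
                continue
            if best_neighbor is None or cost(neighbor) < cost(best_neighbor):
                best_move, best_neighbor = i, neighbor
        if best_neighbor is None:            # every move was tabu and none aspirated
            continue
        current = best_neighbor              # accept the best neighbor, even if non-improving
        tabu.append(best_move)               # forbid undoing this move for `tenure` steps
        if cost(current) < cost(best):
            best = list(current)
    return best

print(cost(tabu_search()))                   # 0 means the target was recovered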
Tabu Search Applications:
Neural Network based optimization
Development of neural networks dates back to the early 1940s. The field experienced an upsurge in popularity in the late 1980s, as a result of the discovery of new techniques and general advances in computer hardware technology.
Some NNs are models of biological neural networks and some
are not, but historically, much of the inspiration for the field of NNs
came from the desire to produce artificial systems capable of
sophisticated, perhaps intelligent, computations similar to those that
the human brain routinely performs, and thereby possibly to enhance
our understanding of the human brain.
Most NNs have some sort of training rule. In other words, NNs
learn from examples (as children learn to recognize dogs from
examples of dogs) and exhibit some capability for generalization
beyond the training data.
The Biological Neuron
The brain is a collection of about 10 billion interconnected
neurons. Each neuron is a cell that uses biochemical reactions to
receive, process and transmit information.
Each terminal button is connected to other neurons across a small
gap called a synapse.
A neuron's dendritic tree is connected to a thousand
neighboring neurons.
When one of those neurons fires, a positive or negative charge is received by one of the dendrites.
The strengths of all the received charges are added together
through the processes of spatial and temporal summation.
Neural Networks:
Neural computing requires a number of neurons to be connected together into a neural network. Neurons are arranged in layers. Each neuron within the network is usually a simple processing unit which takes one or more inputs and produces an output. At each neuron, every input has an associated weight which modifies the strength of that input. The neuron simply adds together all the weighted inputs and calculates an output to be passed on.
NN Example:
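The original example slide is a figure; in its place, here is a minimal Python sketch of the single-neuron computation just described: a weighted sum of inputs followed by a simple threshold activation. The weights, bias, and the AND task are illustrative assumptions:

def neuron(inputs, weights, bias):
    # weighted sum of the inputs, as described above
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if activation > 0 else 0        # threshold (step) activation

# hypothetical weights that make this neuron compute logical AND
print(neuron([1, 1], [0.5, 0.5], -0.7))      # -> 1
print(neuron([1, 0], [0.5, 0.5], -0.7))      # -> 0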
Comparison
Computers have to be explicitly programmed
Analyze the problem to be solved.
Write the code in a programming language.
Neural networks learn from examples
No requirement of an explicit description of the problem.
No need for a programmer.
The neural computer adapts itself during a training period,
based on examples of similar problems even without a desired
solution to each problem.
After sufficient training the neural computer is able to relate the
problem data to the solutions, inputs to outputs, and it is then able
to offer a viable solution to a brand new problem.
Able to generalize or to handle incomplete data.
Applications of NNs: classification, recognition and identification
In marketing: consumer spending pattern classification
In defence: radar and sonar image classification
In general computing and telecommunications: speech, vision and handwriting recognition
In finance: signature verification and bank note verification
In engineering: product inspection, monitoring and control
In defence: target tracking
In security: motion detection, surveillance image analysis and fingerprint matching
Applications of NNs: forecasting
In finance: foreign exchange rate and stock market forecasting
In agriculture: crop yield forecasting
In marketing: sales forecasting
In meteorology: weather prediction
Ant Colony Optimization
Ant Behavior (the original slides show a figure of ants routing around an obstacle between the nest and a food source)
Ant Colony:
Ant behavior is stochastic
The behavior is induced by indirect communication through pheromone trails (stigmergy)
Ants explore the search space
Limited ability to sense local environment
Act concurrently and independently
High quality solutions emerge via global cooperation
Pheromones
Ants lay pheromone trails while traveling
Pheromones accumulate when multiple ants use the same path
This behavior leads to the emergence of shortest paths
Pheromones evaporate over time
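A minimal Python sketch of these mechanics, modeled on the classic double-bridge experiment (two paths of different lengths between nest and food); the path lengths, ant count, and evaporation rate are illustrative assumptions:

import random

lengths = [1.0, 2.0]            # path 0 is shorter than path 1 (assumed toy setup)
pheromone = [1.0, 1.0]          # equal initial pheromone on both paths
evaporation = 0.1               # fraction of pheromone lost each iteration

def choose_path():
    # ants choose a path with probability proportional to its pheromone level
    total = sum(pheromone)
    return 0 if random.random() < pheromone[0] / total else 1

for iteration in range(100):
    counts = [0, 0]
    for _ in range(20):                      # ants act concurrently and independently
        counts[choose_path()] += 1
    for p in range(2):
        pheromone[p] *= (1 - evaporation)    # pheromones evaporate
        # each ant deposits pheromone inversely proportional to path length,
        # so the shorter path accumulates more per traversal
        pheromone[p] += counts[p] / lengths[p]

print(pheromone)    # pheromone on the short path dominates: the shortest path emerges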