
JSPM

Rajarshi Shahu College of Engineering Pune


Department of Mechanical Engineering

“Optimization”
UNIT NO: VI

MODERN METHODS OF
OPTIMIZATION

Modern Optimization
Techniques
7 Hours

Modern methods of Optimization: Genetic Algorithms, Simulated Annealing, Ant Colony Optimization, Tabu Search, Neural-Network based Optimization, Fuzzy optimization techniques – Applications


Modern Optimization Techniques
 Genetic Algorithm
 Simulated Annealing
Genetic Algorithm
 One of the modern optimization techniques
 Based on biological genetic phenomena
 Uses the principles of natural genetics and natural selection: reproduction, crossover and mutation drive the genetic search
 GA repeatedly modifies a population of individual solutions
 At each step, individuals are selected (usually with a fitness-biased random scheme) to act as parents and are used to produce the next generation.
 Over successive generations, the population evolves towards an optimal solution.
Introduction
Genetic algorithms are adaptive procedures derived from
Darwin's principle of survival of the fittest in natural genetics.
A GA maintains a population of potential solutions to the
candidate problem, termed individuals.
By manipulating these individuals through genetic
operators such as selection, crossover and mutation, the GA
evolves towards better solutions over a number of generations.
Where can GAs be used? OPTIMIZATION: wherever a problem has a
very large number of candidate solutions and we have to find the
best one, for example the best moves in chess, mathematical
problems and financial problems.
Key Points
 Chromosome: A set of genes. A chromosome contains the solution in the form of genes.
 Gene: A part of a chromosome. A gene contains a part of the solution and helps determine it. E.g., 16743 is a chromosome and 1, 6, 7, 4 and 3 are its genes.
 Individual: Same as chromosome.
 Population: The number of individuals present, all with chromosomes of the same length.
Genetic Algorithm
Basic steps in GA

 Initialization of the algorithm
 Generation of a new population
 Termination of the algorithm
Fitness:
 Fitness is the value assigned to an individual, based on how far from or how close to the solution that individual is. The greater the fitness value, the better the solution it contains.
 Fitness function: A function which assigns a fitness value to an individual. It is problem specific.
 Selection: Selecting individuals for creating the next generation.
 Recombination (or crossover): Genes from the parents combine in some way to form a whole new chromosome.
 Mutation: Changing a random gene in an individual.
General Algorithm

 GA START
 Generate the initial population.
 Assign a fitness value to all individuals.
 DO UNTIL the best solution is found:
 Select individuals from the current generation
 Create new offspring with crossover and mutation
 Compute the new fitness for all individuals
 Kill all unfit individuals to make space for the new offspring
 Check whether the best solution has been found
 LOOP END
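A minimal Python sketch of this loop is given below. It assumes a toy problem, maximizing f(x) = -(x - 3)^2 over 5-bit chromosomes; the fitness function, population size, mutation rate and number of generations are illustrative assumptions, not taken from the slides.

    import random

    POP_SIZE, CHROM_LEN, MUT_RATE, GENERATIONS = 20, 5, 0.05, 50

    def fitness(chrom):
        x = int("".join(map(str, chrom)), 2)   # decode genes -> integer
        return -(x - 3) ** 2                   # higher is better

    def select(pop):
        # tournament selection: keep the fitter of two random individuals
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    def crossover(p1, p2):
        point = random.randint(1, CHROM_LEN - 1)   # single-point crossover
        return p1[:point] + p2[point:]

    def mutate(chrom):
        return [1 - g if random.random() < MUT_RATE else g for g in chrom]

    # generate the initial population
    population = [[random.randint(0, 1) for _ in range(CHROM_LEN)]
                  for _ in range(POP_SIZE)]

    for gen in range(GENERATIONS):
        # create the next generation from selected parents
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(POP_SIZE)]

    best = max(population, key=fitness)
    print("best chromosome:", best, "fitness:", fitness(best))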
Simulated Annealing
 Simulated annealing is an optimization technique based on the analogy between the way a metal cools and freezes into a minimum-energy crystalline structure (the annealing process) and the search for a minimum in a more general system.
 Natural motivation: the properties of the structure depend on the cooling rate after the substance has been heated to its melting point. Slow cooling forms large crystals, which give a strong structure; rapid (quenched) cooling forms a weak structure.
 "Agitation" at high temperature corresponds to high molecular activity in the physical system.
Introduction to Simulated Annealing
 SA algorithm
 The initial solution: for the majority of problems the initial solution is chosen at random.
 Solution evaluation: evaluating a solution consists of decoding the current solution and computing its cost, so that its suitability for the given problem can be judged.
 Random search for a solution: the search begins by copying the current solution into a working solution, which is then perturbed at random.
 The basic cycle (see the sketch below): create the initial solution, evaluate the solution, change the solution in a random way, evaluate the new solution, apply the acceptance criterion, reduce the temperature. Three solutions are kept during the search: the current solution, the working solution and the best solution.
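A minimal Python sketch of this cycle is given below, assuming we minimize f(x) = x^2 + 10*sin(x) on the real line; the cost function, the perturbation step and the geometric cooling parameters are illustrative assumptions, not from the slides.

    import math
    import random

    def cost(x):
        return x * x + 10 * math.sin(x)

    T, T_final, alpha = 100.0, 0.5, 0.95       # initial temperature, final temperature, cooling factor
    current = random.uniform(-10, 10)          # random ("casual") initial solution
    best = current

    while T > T_final:
        working = current + random.uniform(-1, 1)        # perturb the working solution
        delta = cost(working) - cost(current)
        # acceptance criterion: always accept improvements, sometimes accept worse solutions
        if delta < 0 or random.random() < math.exp(-delta / T):
            current = working
            if cost(current) < cost(best):
                best = current
        T *= alpha                                       # reduce temperature geometrically

    print("best solution:", best, "cost:", cost(best))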
Temperature
 The initial temperature should be high enough to allow sampling of other areas of the solution space.
 If the maximum distance between neighboring solutions is known, the initial temperature is easy to estimate. The initial temperature can also be set dynamically: by keeping statistics on how often worse solutions are accepted and new best solutions are found, the temperature can be raised until the required acceptance rate (discovery of new solutions) is reached.
 This process is analogous to heating the substance until it becomes liquid, after which there is no point in raising the temperature further.
 Final temperature: although zero is a convenient final temperature, the geometric cooling schedule used in the example shows that the algorithm would then run much longer than is really necessary. The final temperature is therefore usually taken slightly above zero (for example, 0.5).
 Advantages of annealing
 no restrictions on the form of the function being minimized
 searches for a global minimum
 efficient for solving various classes of problems that demand optimization
 Disadvantages of annealing
 the theoretical requirement of infinitely slow cooling, which in practice means slow operation of the algorithm
 the complexity of tuning its parameters
Ranges of application (SA)
 path planning
 image reconstruction
 assignment and scheduling
 network placement
 global routing
 detection and recognition of visual targets
 design of special digital filters
Tabu Search
 A meta-heuristic technique
 Motivation (figure): a local search that descends from a starting point along a descent direction can be blocked by a barrier and stop at a local minimum instead of reaching the global minimum.
Tabu Search
Background:
 The tabu search (TS) algorithm was proposed by Glover (1986).
 In the 1990s, the tabu search algorithm became very popular in solving optimization problems.
 Nowadays, it is one of the most widespread single-solution (S-) metaheuristics.
 The use of memory is the distinguishing feature of tabu search.
 TS behaves like a steepest-descent algorithm, but it accepts non-improving solutions to escape from local optima.
Tabu Search
Main concepts:
 The key feature of the TS method is the use of memory, which records information related to the search process.
 TS generates a neighborhood solution from the current solution and accepts the best neighbor even if it does not improve on the current solution.
 This strategy may lead to cycles, i.e. previously visited solutions could be selected again.
 To avoid cycles, TS discards solutions that have been visited previously by using a memory called the tabu list.
 The length of the memory (tabu list) controls the search process.
 A long tabu list forces the search to explore larger regions, because revisiting a large number of solutions is forbidden.
 A short tabu list concentrates the search on a small area of the search space.
Tabu Search
 Main concepts
 At each iteration the tabu list is updated (as a first-in, first-out queue).
 The tabu list contains a constant number of tabu moves; this number, called the tabu tenure, is the length of time for which a move remains forbidden.
 If a move is in the tabu list but is good and can improve the search process, there is no need to prohibit it; the solution is accepted under what is called the aspiration criterion.
Tabu search Algorithm
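The slides do not reproduce the algorithm itself, so a minimal Python sketch is given below. It assumes a toy problem, minimizing f(x) = (x - 7)^2 over the integers 0 to 31 with neighbors x - 1 and x + 1; the problem, the tabu tenure and the iteration count are illustrative assumptions.

    import random
    from collections import deque

    def cost(x):
        return (x - 7) ** 2

    tenure, iterations = 5, 50
    current = random.randint(0, 31)
    best = current
    tabu_list = deque(maxlen=tenure)            # first-in, first-out tabu memory

    for _ in range(iterations):
        neighbors = [n for n in (current - 1, current + 1) if 0 <= n <= 31]
        # aspiration criterion: a tabu move is allowed if it beats the best found so far
        allowed = [n for n in neighbors
                   if n not in tabu_list or cost(n) < cost(best)]
        if not allowed:
            continue
        current = min(allowed, key=cost)        # accept the best neighbor, even if it is worse
        tabu_list.append(current)               # record the move to forbid cycling
        if cost(current) < cost(best):
            best = current

    print("best solution:", best, "cost:", cost(best))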
Tabu Search Applications:
Neural Network based optimization
 The development of neural networks dates back to the early 1940s. The field experienced an upsurge in popularity in the late 1980s.
 This was a result of the discovery of new techniques and developments, together with general advances in computer hardware technology.
 Some NNs are models of biological neural networks and some
are not, but historically, much of the inspiration for the field of NNs
came from the desire to produce artificial systems capable of
sophisticated, perhaps intelligent, computations similar to those that
the human brain routinely performs, and thereby possibly to enhance
our understanding of the human brain.
 Most NNs have some sort of training rule. In other words, NNs
learn from examples (as children learn to recognize dogs from
examples of dogs) and exhibit some capability for generalization
beyond the training data.
The Biological Neuron
 The brain is a collection of about 10 billion interconnected
neurons. Each neuron is a cell that uses biochemical reactions to
receive, process and transmit information.
 Each terminal button is connected to other neurons across a small
gap called a synapse.
 A neuron's dendritic tree is connected to a thousand
neighboring neurons.
 When one of those neurons fires, a positive or negative charge is received by one of the dendrites.
 The strengths of all the received charges are added together
through the processes of spatial and temporal summation.
Neural Networks:
Neural computing requires a number of neurons to be connected together into a
neural network. Neurons are arranged in layers. Each neuron within the network is
usually a simple processing unit which takes one or more inputs and produces an
output. At each neuron, every input has an associated weight which modifies the
strength of that input. The neuron simply adds together all the weighted inputs and
calculates an output to be passed on.
NN Example:
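The original example slide is a figure; as a stand-in, here is a minimal Python sketch of the neuron just described: each input is scaled by its weight, the weighted inputs are summed, and an activation function produces the output. The weights, biases and inputs are illustrative assumptions.

    import math

    def neuron(inputs, weights, bias):
        total = sum(x * w for x, w in zip(inputs, weights)) + bias   # weighted sum of inputs
        return 1.0 / (1.0 + math.exp(-total))                        # sigmoid activation

    # a tiny two-layer network: two hidden neurons feeding one output neuron
    inputs = [0.5, 0.8]
    hidden = [neuron(inputs, [0.4, -0.6], 0.1),
              neuron(inputs, [0.7, 0.2], -0.3)]
    output = neuron(hidden, [1.2, -0.9], 0.05)
    print("network output:", output)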
Comparison
 Computers have to be explicitly programmed
 Analyze the problem to be solved.
 Write the code in a programming language.
 Neural networks learn from examples
 No requirement of an explicit description of the problem.
 No need for a programmer.
 The neural computer adapts itself during a training period,
based on examples of similar problems even without a desired
solution to each problem.
 After sufficient training the neural computer is able to relate the
problem data to the solutions, inputs to outputs, and it is then able
to offer a viable solution to a brand new problem.
 Able to generalize or to handle incomplete data.
Applications of NNs- classification
recognition and identification
 In marketing: consumer spending pattern classification
 In defence: radar and sonar image classification
 In general computing and telecommunications: speech, vision and
handwriting recognition
 In finance: signature verification and bank note verification
 In engineering: product inspection monitoring and control
 In defence: target tracking
 In security: motion detection, surveillance image analysis and fingerprint
matching
 In finance: foreign exchange rate and stock market forecasting
 In agriculture: crop yield forecasting
 In marketing: sales forecasting
 In meteorology: weather prediction
Ant Colony optimization
 Ant behavior (figure: ants travelling between the nest and a food source around an obstacle)
Ant Colony :
 Ant behavior is stochastic
 The behavior is induced by indirect communication (pheromone paths) - stigmergy
 Ants explore the search space
 Limited ability to sense local environment
 Act concurrently and independently
 High quality solutions emerge via global cooperation
Pheromones
 Ants lay pheromone trails while traveling
 Pheromones accumulate with multiple ants using a path
 This behavior leads to the appearance of shortest paths
 Pheromones evaporate
 Evaporation avoids being trapped in local optima
 ρ small ⇒ low evaporation ⇒ slow adaptation
 ρ large ⇒ high evaporation ⇒ fast adaptation
Ant Colony Optimization Algorithm:
Steps:
 Start
 Construct solutions
 Explore the search space
 Choose the next step probabilistically according to the pheromone model
 Apply local search to the constructed solutions (optional)
 Update pheromones (add new + evaporate)
 End
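A minimal Python sketch of these steps on the two-path (food/obstacle) picture is given below: ants choose between a short and a long path with probability proportional to pheromone, deposit pheromone inversely proportional to path length, and the pheromone then evaporates at rate ρ. The path lengths, colony size and value of ρ are illustrative assumptions.

    import random

    lengths = {"short": 1.0, "long": 2.0}
    pheromone = {"short": 1.0, "long": 1.0}
    rho, ants, iterations = 0.3, 10, 30         # evaporation rate, colony size, iterations

    for _ in range(iterations):
        deposits = {"short": 0.0, "long": 0.0}
        for _ in range(ants):
            # choose the next step probabilistically according to the pheromone model
            total = pheromone["short"] + pheromone["long"]
            path = "short" if random.random() < pheromone["short"] / total else "long"
            deposits[path] += 1.0 / lengths[path]          # shorter paths receive more pheromone
        for p in pheromone:
            pheromone[p] = (1 - rho) * pheromone[p] + deposits[p]   # evaporate + deposit

    print("pheromone levels:", pheromone)       # the short path ends up dominating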
Fuzzy optimization techniques –
Applications
 Fuzzy control does not exist as an isolated topic devoid of
relationships to other fields, and it is important to
understand how it relates to these other fields in order to
strengthen your understanding of it.
 We have emphasized that fuzzy control has its foundations in
conventional control and that there are many relationships to
techniques, ideas, and methodologies there.
 Fuzzy control is also an "intelligent control" technique,
and hence there are certain relationships between it and
other intelligent control methods.
Relationships Between Fuzzy Systems
and Neural Networks
 There are two ways in which there are relationships between
fuzzy systems and neural networks. First, techniques from
one area can be used in the other. Second, in some cases the
functionality (i.e., the nonlinear function that they
implement) is identical.
 Some label the intersection between fuzzy systems and
neural networks with the term "fuzzy-neural" or "neuro-
fuzzy" to highlight that techniques from both fields are being
used. Here, we avoid this terminology and simply highlight
the basic relationships between the two fields.
Multilayer Perceptrons
 The multilayer perceptron should be viewed as a nonlinear network
whose nonlinearity can be tuned by changing the weights, biases, and
parameters of the activation functions. The fuzzy system is also a
tunable nonlinearity whose shape can be changed by tuning, for
example, the membership functions. Since both are tunable
nonlinearities, the following approaches are possible:
 Gradient methods can be used for training neural networks to perform system identification or to act as estimators or predictors in the same way as fuzzy systems. Indeed, the gradient training of neural networks, called "back-propagation training," was introduced well before the gradient training of fuzzy systems, and the idea of training fuzzy systems this way came from the field of neural networks.
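As a concrete illustration of gradient training of a tunable nonlinearity, here is a minimal Python sketch that tunes the weights and bias of a single sigmoid unit by gradient descent on a squared error. The data set and learning rate are illustrative assumptions, and a full multilayer perceptron would back-propagate this same gradient through its hidden layers.

    import math

    data = [([0.0, 1.0], 1.0), ([1.0, 0.0], 0.0), ([1.0, 1.0], 1.0)]   # (inputs, target) pairs
    w, b, lr = [0.0, 0.0], 0.0, 0.5

    def forward(x):
        # weighted sum followed by a sigmoid activation
        return 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))

    for _ in range(1000):
        for x, target in data:
            y = forward(x)
            grad = (y - target) * y * (1.0 - y)     # gradient of squared error through the sigmoid
            w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
            b -= lr * grad

    print([round(forward(x), 2) for x, _ in data])  # outputs approach the targets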
Hybrid methods for training can also be
used for neural networks
 Hybrid methods for training can also be used for neural
networks. For instance, gradient methods may be used in
conjunction with clustering methods applied to neural
networks.
 Indirect adaptive control can also be achieved with a multilayer perceptron. To do this we use two multilayer perceptrons as the tunable nonlinearities in the certainty equivalence control law and the gradient method for tuning.
 Gain scheduled control may be achieved by training a multilayer perceptron to map the associations between operating conditions and controller parameters.
Computer-Aided Design of Fuzzy
Systems
 The genetic algorithm can be used in the (off-line) computer-aided
design of control systems since it can artificially evolve an
appropriate controller that meets the performance specifications to
the greatest extent possible.
 To do this, the genetic algorithm maintains a population of strings
that each represent a different controller (digits on the strings
characterize parameters of the controller), and it uses a fitness
measure that characterizes the closed-loop specifications.
 Suppose, for instance, that the closed-loop specifications indicate that you want, for a step input, a (stable) response with a rise-time of …, a percent overshoot of …, and a settling time of …. We need to define the fitness function so that it measures how close each individual in the population at time k (i.e., each candidate controller) is to meeting these specifications.
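A minimal Python sketch of such a fitness measure is given below. The target values (rise time 1 s, overshoot 5 %, settling time 4 s) are hypothetical, since the slides leave the actual numbers unspecified, and the measured response values would come from simulating each candidate controller in closed loop.

    def fitness(rise_time, overshoot, settling_time,
                target_rise=1.0, target_overshoot=5.0, target_settle=4.0):
        # penalize the amount by which each specification is violated;
        # a higher fitness means the candidate controller is closer to the specs
        penalty = (max(0.0, rise_time - target_rise)
                   + max(0.0, overshoot - target_overshoot)
                   + max(0.0, settling_time - target_settle))
        return 1.0 / (1.0 + penalty)

    print(fitness(rise_time=0.8, overshoot=7.0, settling_time=3.5))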
Expert Control Systems
Fuzzy Expert system:
 It is possible to use an expert system in adaptive or
supervisory control systems. Expert systems can be used in a
supervisory role for conventional controllers or for the
supervision of fuzzy controllers (e.g., for supervision of the
learning mechanism and reference model in an adaptive fuzzy
controller). Expert systems themselves can also be used as
the basis for general learning controllers.
Thank you
