
Chapter 6: Optimization Algorithms
Outline

• Cost / fitness / objective function
• Search space
• Hill-climbing
• Simulated annealing
• Genetic algorithm
Learning problem

• Symbolic learning
  – Steering rules from examples
• Numerical learning → optimization
  – Minimizing a cost function
  – Maximizing a fitness function
  – Dynamically optimizing an objective function
Search space

• Search space = parameter space
  – Combinatorial problems
  – Permutation problems
• Global optimum vs. local optimum
Global optimum-oriented search

• Hill-climbing
• Simulated annealing
• Genetic algorithm
Hill-climbing algorithms

• Hill-climbing
• Steepest gradient
• Gradient-proportional descent
Simulated annealing

• Simulates the process of annealing metal atoms
• High temperature → high probability for atoms to overcome (break through) local energy barriers
Acceptance Probability

• P = exp(-Ea / kT)
• P = exp(-ΔE / T)
• P = 1 / (1 + exp(ΔE / T))
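A sketch of how the acceptance rule P = exp(-ΔE/T) is typically used inside an annealing loop; the cooling schedule, neighbour move, and example function are assumptions for illustration, not taken from the slides.

    import math
    import random

    def simulated_annealing(cost, neighbour, start, t0=10.0, alpha=0.95, steps=5000):
        """Simulated annealing with acceptance probability P = exp(-dE / T)."""
        current, current_cost = start, cost(start)
        best, best_cost = current, current_cost
        t = t0
        for _ in range(steps):
            candidate = neighbour(current)
            d_e = cost(candidate) - current_cost  # positive means the move is worse
            # Always accept improvements; accept worse moves with probability exp(-dE/T).
            if d_e <= 0 or random.random() < math.exp(-d_e / t):
                current, current_cost = candidate, current_cost + d_e
            if current_cost < best_cost:
                best, best_cost = current, current_cost
            t *= alpha  # geometric cooling schedule (one of many possible choices)
        return best, best_cost

    # Hypothetical usage: minimise a bumpy one-dimensional function.
    f = lambda x: x * x + 10 * math.sin(x)
    print(simulated_annealing(f, lambda x: x + random.uniform(-1, 1), start=5.0))

At high temperature T almost every move is accepted; as T decreases, worse moves become increasingly unlikely and the search settles into an optimum.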
Genetic Algorithm

• Inspired by natural evolution
• Claimed to reach the global optimum
Genetic Algorithm

• Using chromosomes to represent candidate solutions
• Generating a population of candidates
• Fitness evaluation
• Selection for reproduction
• Reproduction
  – Mating
  – Crossover
  – Mutation
  – Selection
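Putting these steps together, a minimal generational GA over bitstring chromosomes might look like the following sketch; the function names, parameter values, and the specific choice of roulette wheel selection with single-point crossover are illustrative assumptions, not prescribed by the slides.

    import random

    def genetic_algorithm(fitness, chrom_len, pop_size=20, generations=100,
                          p_cross=0.8, p_mut=0.01):
        """Minimal generational GA over fixed-length bitstring chromosomes.
        Assumes fitness values are positive (needed for proportionate selection)."""
        # Initial population of random bitstrings (lists of 0/1).
        pop = [[random.randint(0, 1) for _ in range(chrom_len)] for _ in range(pop_size)]
        for _ in range(generations):
            scores = [fitness(c) for c in pop]
            new_pop = []
            while len(new_pop) < pop_size:
                # Selection (fitness-proportionate), then crossover and mutation.
                p1, p2 = random.choices(pop, weights=scores, k=2)
                c1, c2 = list(p1), list(p2)
                if random.random() < p_cross:
                    point = random.randint(1, chrom_len - 1)  # single-point crossover
                    c1, c2 = p1[:point] + p2[point:], p2[:point] + p1[point:]
                for child in (c1, c2):
                    for i in range(chrom_len):
                        if random.random() < p_mut:
                            child[i] ^= 1  # bit-flip mutation
                new_pop.extend([c1, c2])
            pop = new_pop[:pop_size]
        return max(pop, key=fitness)

    # Hypothetical usage: maximise the number of 1-bits in an 8-bit chromosome.
    print(genetic_algorithm(lambda c: sum(c) + 1, chrom_len=8))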
Basic GA: making the problem "genetic"

• Example: maximize f(x, y) = log(x^2 + x^y) - cos(y^x)
• A solution candidate is a pair of values, e.g. x = 2, y = 6
  – Allele: the value held at a position of the chromosome
  – Locus: the position itself
• Encode each variable as a fixed-length bitstring: x = 2 → 0010, y = 6 → 0110
• Concatenate the bitstrings into one chromosome: 00100110
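A small sketch of this encoding step (assumed helper names, 4 bits per variable as on the slide): each variable is written as a fixed-width binary string and the strings are concatenated into one chromosome.

    def encode(x, y, bits=4):
        """Encode two non-negative integers as one concatenated bitstring."""
        return format(x, f"0{bits}b") + format(y, f"0{bits}b")

    def decode(chromosome, bits=4):
        """Split the chromosome back into its two integer variables."""
        return int(chromosome[:bits], 2), int(chromosome[bits:], 2)

    print(encode(2, 6))        # '00100110', as on the slide
    print(decode("00100110"))  # (2, 6)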
Basic GA: population

• Generate a population of such chromosomes, e.g.:

  00100110
  10000110   11100110   10100110
  00110010   00111110   00101111
Crossover

• Recombining two parent chromosomes to produce offspring, e.g.:

  10000110
  00100010

  00000010
  10100110
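A sketch of single-point crossover on bitstrings, one common variant; the slides do not specify which crossover operator is shown, so this choice is an illustrative assumption.

    import random

    def single_point_crossover(parent1, parent2):
        """Cut both parents at a random point and swap the tails."""
        assert len(parent1) == len(parent2)
        point = random.randint(1, len(parent1) - 1)  # never cut at the very ends
        child1 = parent1[:point] + parent2[point:]
        child2 = parent2[:point] + parent1[point:]
        return child1, child2

    # Hypothetical usage with the chromosomes from the slide.
    print(single_point_crossover("10000110", "00100010"))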
Mutation

• Altering the value of one or more loci
• Based on a mutation probability Pm
Validity check

• Example encoding: A = 01, B = 10, C = 11; the sequence BACA is encoded as 10011101
• After crossover or mutation, a chromosome may no longer decode to a valid sequence, so offspring must be checked (and repaired or discarded)
Selection

• Selection pressure matters:
  – Too strong: premature convergence
  – Too weak: slow evolution
Selection methods

• Roulette wheel selection
• Linear fitness scaling
• Boltzmann fitness scaling
• Rank selection
