Local Search Algorithms: Chapter 4, Sections 3-4
Outline
Hill-climbing
Simulated annealing
Genetic algorithms
Local search in continuous spaces (briefly)
Example: n-queens
Problem: put n queens on an n × n board with no two queens on the same row, column, or diagonal.
Move: move a queen to reduce the number of conflicts (the number of pairs of attacking queens).
[Figure: successive board states with h = 5, h = 2, and h = 0 conflicts]
Almost always solves n-queens problems almost instantaneously for very large n, e.g., n = 1 million
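A minimal sketch of this scheme in Python, assuming a steepest-descent move choice over single-queen moves within a column (the slide does not fix these details, and it returns failure when stuck):

```python
import random

def conflicts(state):
    # state[c] = row of the single queen in column c;
    # count pairs of queens sharing a row or a diagonal
    n = len(state)
    return sum(1 for a in range(n) for b in range(a + 1, n)
               if state[a] == state[b] or abs(state[a] - state[b]) == b - a)

def hill_climb_queens(n, seed=None):
    # steepest descent on the conflict count; returns None when stuck
    rng = random.Random(seed)
    state = [rng.randrange(n) for _ in range(n)]
    while conflicts(state) > 0:
        best_h, best_move = conflicts(state), None
        for col in range(n):
            original = state[col]
            for row in range(n):
                if row != original:
                    state[col] = row
                    h = conflicts(state)
                    if h < best_h:
                        best_h, best_move = h, (col, row)
            state[col] = original
        if best_move is None:
            return None          # local minimum: no improving move
        state[best_move[0]] = best_move[1]
    return state
```

A single run can get stuck, so retrying from fresh random states (as on a later slide) is the usual remedy.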
Variants of this approach get within 1% of the optimal tour very quickly, even with thousands of cities.
Note that hill climbing is formulated here as maximization, but it can be used equally well for minimizing a cost: simply maximize the negative of the cost as the value.
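The point above can be made concrete with a generic steepest-ascent loop (a sketch; the function names are illustrative, not from the slides):

```python
def hill_climb(state, value, neighbors):
    # steepest ascent: move to the best neighbor while it strictly improves
    while True:
        best = max(neighbors(state), key=value)
        if value(best) <= value(state):
            return state
        state = best

def minimize(state, cost, neighbors):
    # minimizing a cost == maximizing its negative, as the note says
    return hill_climb(state, lambda s: -cost(s), neighbors)
```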
Hill-climbing contd.
Problem: depending on initial state, can get stuck on local maxima
[Figure: objective value over the state space, showing a local maximum below the global maximum]
Hill-climbing variations
Stochastic hill-climbing: choose at random from among the uphill moves; the probability of selecting a move can vary with its steepness. Convergence is usually slower than steepest-ascent hill climbing, but the solutions found are sometimes better.
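A sketch of this variant, assuming selection probability proportional to the improvement (one of several reasonable weightings):

```python
import random

def stochastic_hill_climb(state, value, neighbors, rng=random):
    # choose among uphill moves, weighted by how much each improves
    while True:
        v = value(state)
        uphill = [s for s in neighbors(state) if value(s) > v]
        if not uphill:
            return state
        weights = [value(s) - v for s in uphill]
        state = rng.choices(uphill, weights=weights)[0]
```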
Hill-climbing variations
First-choice hill-climbing: a variation of stochastic hill-climbing. Instead of generating all the possible moves (successor states) and picking one at random, it generates random successors until it finds one that is better than the current state. Useful when a state has many successors (e.g., thousands).
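A sketch of first-choice hill-climbing; the max_tries cutoff for deciding that no better successor exists is my assumption, not part of the slide:

```python
import random

def first_choice_hill_climb(state, value, random_successor,
                            max_tries=1000, rng=random):
    # sample successors until one beats the current state
    while True:
        for _ in range(max_tries):
            nxt = random_successor(state, rng)
            if value(nxt) > value(state):
                state = nxt
                break
        else:
            return state   # no improvement found in max_tries samples
```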
Hill-climbing variations
All hill-climbing algorithms so far are incomplete.
Random-restart hill-climbing: search for a goal state, restarting from randomly generated initial states. This variation is complete with probability approaching 1 as the number of restarts increases, since a goal state will eventually be generated as an initial state. In practice it can solve a 3-million-queens problem in under a minute. (For 8-queens, the probability that a single run succeeds is roughly p ≈ 0.14, so about n = 1/p ≈ 7 random restarts are expected before a solution is found.)
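The restart wrapper itself is tiny; a sketch with illustrative names (the inner local search, random-state generator, and goal test are all parameters):

```python
import random

def random_restart(local_search, random_state, is_goal,
                   max_restarts=100, seed=None):
    # rerun an incomplete local search from fresh random initial states
    rng = random.Random(seed)
    for _ in range(max_restarts):
        result = local_search(random_state(rng))
        if result is not None and is_goal(result):
            return result
    return None

# toy demo: a search that only reaches the goal from the right basin
toy = lambda x: 10 if x >= 5 else x
```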
Simulated annealing
Hill-climbing algorithms never make downhill moves, so they can get stuck in local maxima and are therefore incomplete.
Solution: simulated annealing (again, complete only probabilistically).
Idea: escape local maxima by allowing some bad moves, but gradually decrease their size and frequency, e.g., shaking the surface on which a ping-pong ball is rolling to get it into the global minimum.
Devised by Metropolis et al. (1953) for modelling physical processes; widely used in VLSI layout, airline scheduling, etc.
Simulated annealing
function Simulated-Annealing(problem, schedule) returns a solution state
  inputs: problem, a maximization problem
          schedule, a mapping from time to "temperature"
  local variables: current, a node
                   next, a node
                   T, a "temperature" controlling the probability of downward steps

  current ← Make-Node(Initial-State[problem])
  for t ← 1 to ∞ do
      T ← schedule[t]
      if T = 0 then return current
      next ← a randomly selected successor of current
      ΔE ← Value[next] − Value[current]
      if ΔE > 0 then current ← next
      else current ← next only with probability e^(ΔE/T)
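A direct Python rendering of the pseudocode, with a linear cooling schedule as an illustrative choice (the algorithm only requires that the schedule reach 0):

```python
import math
import random

def simulated_annealing(initial, value, random_successor, schedule,
                        rng=random):
    # maximization: always accept uphill moves; accept a downhill
    # move of size delta < 0 with probability e^(delta/T)
    current, t = initial, 1
    while True:
        T = schedule(t)
        if T <= 0:
            return current
        nxt = random_successor(current, rng)
        delta = value(nxt) - value(current)
        if delta > 0 or rng.random() < math.exp(delta / T):
            current = nxt
        t += 1
```

As T falls, math.exp(delta / T) underflows toward 0 for any bad move, so the loop degenerates into ordinary hill climbing near the end of the schedule.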
Genetic algorithms
= stochastic local beam search + generate successors from pairs of states
[Figure: each 8-queens state encoded as a digit string giving the row of the queen in each column]

(a) Initial population    (b) Fitness function
24748552                  24 (31%)
32752411                  23 (29%)
24415124                  20 (26%)
32543213                  11 (14%)
Genetic algorithms
Population
Fitness function
Crossover
Mutation
Selection
Genetic algorithms
Start with a population of k randomly generated states; then repeatedly apply selection w.r.t. the fitness function, crossover (mating), and mutation.
Idea: try to combine solutions to different subproblems taken from different near-solutions.
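One simple way to wire these pieces together, sketched in Python with fitness-proportional selection (a common default, though the slides do not fix the selection scheme); the demo problem, maximizing the number of 1-bits in a string, is illustrative:

```python
import random

def genetic_algorithm(population, fitness, crossover, mutate,
                      generations=60, rng=random):
    # selection w.r.t. fitness, then crossover, then mutation
    for _ in range(generations):
        weights = [fitness(ind) for ind in population]
        new_population = []
        for _ in range(len(population)):
            parent_a, parent_b = rng.choices(population, weights=weights, k=2)
            new_population.append(mutate(crossover(parent_a, parent_b, rng), rng))
        population = new_population
    return max(population, key=fitness)

def one_point(a, b, r):
    # single-point crossover: prefix of one parent, suffix of the other
    cut = r.randrange(1, len(a))
    return a[:cut] + b[cut:]

def flip(ind, r):
    # mutate each bit independently with small probability
    return [bit ^ 1 if r.random() < 0.05 else bit for bit in ind]
```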