Local Search Algorithms and Optimization Problems
• The search algorithms we have seen so far mostly concentrate on the path through
which the goal is reached. But if the problem does not demand the path to the
solution and expects only the final configuration of the solution, then we have a
different type of problem to solve.
• Following are problems where only the solution state configuration is important and
the path to it is irrelevant:
1) Factory-floor layout.
2) Job-shop scheduling.
3) Automatic programming.
4) Vehicle routing.
5) Portfolio management.
For such problems we can use another class of algorithms: algorithms that do not
worry about paths at all. These are local search algorithms.
Local search algorithms are useful for solving pure optimization problems, in which
the main aim is to find the best state according to a required objective function.
Local search algorithms make use of a concept called the state-space landscape. This
landscape can be viewed in two ways -
• If elevation corresponds to cost, then the aim is to find the lowest valley (a
global minimum).
• If elevation corresponds to the objective function, then the aim is to find the highest
peak (a global maximum).
Ridges: A ridge is a special form of local maximum that troubles hill-climbing search. It is an
area higher than its surrounding areas, but it has a slope of its own, so the peak cannot be reached
in a single move.
Solution: By using bidirectional search, or by moving in several directions at once, we can mitigate
this problem.
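For concreteness, a steepest-ascent hill-climbing loop might look like the minimal sketch below, assuming illustrative helper functions objective() (which scores a state) and neighbors() (which yields its successors):

```python
# Minimal steepest-ascent hill-climbing sketch; objective() and
# neighbors() are assumed helpers for illustration.
def hill_climb(state, objective, neighbors):
    while True:
        best = max(neighbors(state), key=objective, default=None)
        # Stop at a peak: no neighbor scores higher than the current
        # state (the peak may be only a local maximum or a ridge).
        if best is None or objective(best) <= objective(state):
            return state
        state = best
```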
Example for Local Search
Consider the 8-queens problem:
1) The successor function: a function which returns all possible states generated by
moving a single queen to another square in the same column. Each state therefore
has 8 x 7 = 56 successors.
2) The heuristic cost function: a function 'h' which counts the number of pairs of
queens attacking each other, either directly or indirectly. The global minimum of
this function is zero, which occurs only at perfect solutions.
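A minimal Python sketch of these two functions, assuming a state is a tuple where state[c] gives the row of the queen in column c (the names h and successors are illustrative):

```python
from itertools import combinations

def h(state):
    """Heuristic cost: number of pairs of queens attacking each other."""
    return sum(1 for (c1, r1), (c2, r2) in combinations(enumerate(state), 2)
               if r1 == r2 or abs(r1 - r2) == abs(c1 - c2))

def successors(state):
    """All states reached by moving one queen within its own column."""
    for col in range(8):
        for row in range(8):
            if row != state[col]:
                yield state[:col] + (row,) + state[col + 1:]

# Each of the 8 queens can move to 7 other squares: 8 x 7 = 56 successors.
assert len(list(successors((0,) * 8))) == 56
```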
Simulated Annealing:
A hill-climbing algorithm that never makes a move towards a lower value is guaranteed to be
incomplete, because it can get stuck on a local maximum. If instead the algorithm performs a pure
random walk, moving to a successor chosen at random, it may be complete but is not efficient.
Simulated annealing is an algorithm which yields both efficiency and completeness.
In metallurgy, annealing is the process of hardening or tempering a metal or glass by heating it to
a high temperature and then cooling it gradually, which allows the material to reach a low-energy
crystalline state. The same idea is used in simulated annealing: the algorithm picks a random move
instead of the best move. If the random move improves the state, it is always accepted. Otherwise,
the algorithm accepts the move with some probability less than 1, so it can occasionally move
downhill and escape a local maximum.
Simulated annealing is a probabilistic local search algorithm inspired by the annealing process in
metallurgy. It allows the algorithm to accept worse solutions with a certain probability, which
decreases over time. This randomness introduces exploration into the search process, helping the
algorithm escape local optima and potentially find global optima.
Initialization: Start with an initial solution.
Evaluation: Evaluate the quality of the initial solution.
Neighbor Generation: Generate neighboring solutions.
Selection: Choose a neighboring solution based on the improvement in the objective
function and the probability of acceptance.
Termination: Continue iterating until a termination condition is met.
The key to simulated annealing's success is the "temperature" parameter, which controls the
likelihood of accepting worse solutions. Initially, the temperature is high, allowing for more
exploration. As the algorithm progresses, the temperature decreases, reducing the acceptance
probability and allowing the search to converge towards a better solution.
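A minimal sketch of this loop for a maximization problem, assuming illustrative helpers value() and random_neighbor() and a simple geometric cooling schedule:

```python
import math
import random

def simulated_annealing(state, value, random_neighbor,
                        t0=100.0, cooling=0.99, t_min=1e-3):
    temperature = t0  # high initial temperature: lots of exploration
    while temperature > t_min:
        nxt = random_neighbor(state)
        delta = value(nxt) - value(state)
        # Always accept an improvement; accept a worse move with
        # probability exp(delta / T), which shrinks as T decreases.
        if delta > 0 or random.random() < math.exp(delta / temperature):
            state = nxt
        temperature *= cooling
    return state
```

The schedule parameters (t0, cooling, t_min) are tunable assumptions: slower cooling gives more exploration at the cost of more iterations.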
Genetic Algorithms
1) A genetic algorithm is a learning algorithm inspired by natural evolution; it is a
newer paradigm of AI in which increasingly fit solutions emerge over successive
generations.
2) In a genetic algorithm, two parent states are combined to generate a good
successor state.
3) The analogy to natural selection is the same as in stochastic beam search, except
that now we are dealing with sexual rather than asexual reproduction.
In the 8-queens problem, the fitness function is the number of non-attacking pairs of
queens, so a perfect solution has the maximum value of 28.
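With the row-per-column representation described under 'States' below, this fitness function can be sketched as:

```python
from itertools import combinations

def fitness(state):
    """Number of non-attacking pairs; C(8, 2) = 28 pairs in total,
    so a perfect solution scores the maximum of 28."""
    attacking = sum(1 for (c1, r1), (c2, r2) in combinations(enumerate(state), 2)
                    if r1 == r2 or abs(r1 - r2) == abs(c1 - c2))
    return 28 - attacking
```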
Steps
1) Create an individual 'X' (a parent) by random selection weighted by fitness, where
'A' is the fitness value of 'X'.
2) Create an individual 'Y' (a parent) by random selection weighted by fitness, where
'B' is the fitness value of 'Y'.
3) Combine the parents to produce a child, and repeat the above process until a child
(an individual) is produced that is fit, as specified by the fitness function.
States:
Assume each queen stays in its own column, and represent a state by listing the row of
the queen in each column (digits 1 to 8).
For example, a state in which only the queens in columns 4 and 7 attack each other has
fitness 27 (one attacking pair out of 28).
Example: 8-queens problem crossover.
Choose pairs for reproduction (so that individuals with higher fitness are more likely to
be chosen, perhaps multiple times).
Produce offspring by taking the substring of columns 1-3 from the first parent and columns
4-8 from the second (and vice versa). Apply mutation (with small probability) to the
offspring, as sketched below.
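A sketch of this crossover-and-mutation step, assuming the digit representation above; the fixed crossover point after column 3 matches the example in the text, and the mutation rate is an illustrative choice:

```python
import random

def crossover(x, y, point=3):
    """Swap the tails of two parent tuples after the given column."""
    return x[:point] + y[point:], y[:point] + x[point:]

def mutate(state, rate=0.05):  # mutation rate is an illustrative choice
    state = list(state)
    for col in range(len(state)):
        if random.random() < rate:
            state[col] = random.randint(1, 8)  # move this queen to a random row
    return tuple(state)

# Two illustrative parents; the offspring exchange columns 4-8.
child1, child2 = crossover((2, 4, 7, 4, 8, 5, 5, 2), (3, 2, 7, 5, 2, 4, 1, 1))
```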
Importance of representation
Consider what would happen with a binary representation (where each position requires
3 bits).
The chosen representation also reduces the search space considerably (compared to
representing each square of the board, for example).
Poor candidate solutions tend to vanish over the generations, as the genetic algorithm
keeps generating fitter children.