
3. Problem Solving and Search Algorithms


Artificial Intelligence (AI)
Part III
Dr. Udaya Raj Dhungana
Assist. Professor
Pokhara University, Nepal
Guest Faculty
Hochschule Darmstadt University of Applied Sciences, Germany
E-mail: [email protected] and [email protected]

Overview
3.3. Local Search and Optimization Problems
3.3.1. Hill-Climbing Search and its problems
3.3.2. Simulated Annealing
3.3.3. Genetic Algorithms
3.3.4. Gradient Descent



Local Search Algorithms
• For search algorithms like A* Search, a path to the goal state is the solution
• we wanted to find a path through the search space (e.g., from Arad to
Bucharest)

• In many problems, however, the path to the goal is irrelevant
• we care only about the final state, not the path taken to get there
• Example: the 8-queens problem
• we care only about finding a valid final configuration of 8 queens
• what matters is the final configuration of queens, not the order in which
they are added
• for such problems, we can use local search algorithms



Local Search Algorithms
• Local search algorithms
• operate by searching from a start state to neighboring states, with the aim of finding a
better state according to an objective function
• They don’t keep track of the paths, nor the set of states that have been reached. (So,
they are not systematic)
• they might never explore a portion of the search space where a solution actually
resides.
• Advantages:
• use very little memory
• can often find reasonable solutions in large or infinite state spaces for which systematic
algorithms are unsuitable.
• Disadvantage:
• May get stuck in a local optimum: a solution that is better than all its neighbors but not
necessarily the best overall (the global optimum).



Local Search Algorithms
• Each point (state) in the landscape has an “elevation,” defined by
the value of the objective function.



Local Search Algorithms
• Hill climbing
• If elevation corresponds to an objective function, the aim is to find
the highest peak—a global maximum
• we call this process hill climbing.



Local Search Algorithms
• Gradient descent
• If elevation corresponds to cost, then the aim is to find the lowest
valley—a global minimum
• we call the process gradient descent.
• Gradient descent is an optimization algorithm used to find the minimum of an
objective function (see the sketch below).
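A minimal Python sketch of gradient descent on a one-dimensional objective. The objective f(x) = (x - 3)^2, its derivative, the learning rate, and the starting point are illustrative assumptions, not part of the slides:

def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step opposite the gradient to approach a minimum."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)   # move downhill along the negative gradient
    return x

# Example: minimize f(x) = (x - 3)**2, whose derivative is 2 * (x - 3)
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)   # converges toward 3.0, the global minimum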



Hill-climbing Search
• Hill climbing starts with a current state
• keeps track of the current state only
• On each iteration, it moves to the neighboring state with the highest
value, i.e., it heads in the direction of increasing value
(uphill, or steepest ascent); this is a greedy approach
• terminates when it reaches a “peak” where no neighbor has a
higher value.
• Hill climbing is a greedy local search
• It chooses a good neighbor in every iteration
• It doesn’t look ahead beyond the immediate neighbors of the
current state (a minimal code sketch is given below).
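A minimal Python sketch of greedy (steepest-ascent) hill climbing. The neighbors and value functions are hypothetical placeholders for a problem-specific neighborhood and objective function; they are assumptions, not part of the slides:

def hill_climbing(start, neighbors, value):
    """Greedily move to the best neighbor until no neighbor is better."""
    current = start
    while True:
        candidates = neighbors(current)
        if not candidates:
            return current
        best = max(candidates, key=value)
        if value(best) <= value(current):
            return current              # a peak: no neighbor has a higher value
        current = best                  # greedy uphill move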




Hill-climbing Search
• Hill climbing can make rapid progress toward a solution. In the 8-queens example, from the
state in figure (b) it takes just five steps to reach the state in figure (a), which has h = 1 and is
very nearly a solution.



Hill-climbing Search
• Unfortunately, Hill climbing can get stuck for any of the following
reasons:

• Local Maxima

• Ridges

• Plateaus

In each case, the algorithm reaches a point at which no progress is being made.



Hill-climbing Search
• Local Maxima
• A local maximum is a peak that is higher than each of its neighboring
states but lower than the global maximum.

• Hill-climbing algorithms that reach the neighborhood of a local maximum will be
drawn upward toward the peak but will then be stuck with nowhere else to go.

[Figure: state-space landscape showing a local maximum below the global maximum]



Hill-climbing Search
• Ridges
• Ridges result in a sequence of local maxima that is very difficult for
greedy algorithms to navigate.

[Figure: a ridge shown as a sequence of local maxima below the global maximum]



Hill-climbing Search
• Plateaus
• A plateau is a flat area of the state-space landscape.
• A plateau can be a flat local maximum, from which no uphill exit exists,
or a shoulder, from which progress is possible but which gives no indication
of the correct direction.
• A hill-climbing search can get lost wandering on the plateau.



Variants of Hill-climbing Search
• Stochastic hill climbing

• Steepest ascent

• First-choice hill climbing

• Random-restart hill climbing

(Assignment: study these variants of hill-climbing search.)



Simulated Annealing
• Optimization tasks are usually focused on finding a global maximum (or minimum).
Sometimes deterministic search methods get stuck in a local optimum and never find the
globally optimal solution. Stochastic methods such as Monte Carlo methods can help
search algorithms escape these local optima and move closer to the global optimum.
• Simulated annealing is an alternative to the generic hill-climbing algorithm.
• A hill-climbing algorithm is very good at finding locally optimal solutions but does not
perform well at finding the global solution, especially when the problem has a very large
search space.
• Simulated Annealing is a probabilistic approach to finding a global maximum. The main
difference (in strategy) between greedy search and simulated annealing is that greedy
search always chooses the best proposal, whereas simulated annealing has a probability
(given by a Boltzmann distribution) of choosing a worse proposal rather than strictly
accepting only improvements. This helps the algorithm find a global optimum by jumping
out of local maxima.
• The greater the value of T (temperature), the greater the likelihood of moving around the
search space. As T gets closer to zero, the algorithm behaves like greedy hill climbing.



Simulated Annealing
• A hill-climbing algorithm
• never makes “downhill” moves toward states with lower value (or
higher cost)
• So, it may get stuck in a local maximum.
• In contrast, a purely random walk that
• moves to a successor state without concern for the value will eventually
lead to the global maximum,
• but it will be extremely inefficient.
• Therefore, we combine hill climbing and random walk
• with the aim of getting both efficiency (like hill climbing) and completeness
(like a random walk)
• This combination is what simulated annealing achieves.



Simulated Annealing
• Simulated annealing is a probabilistic optimization technique inspired by
annealing in metallurgy
• In metallurgy, annealing is the process used to temper or harden metals and
glass by heating them to a high temperature and then gradually cooling
them, thus allowing the material to reach a low-energy crystalline state.
• Heating: High temperatures allow atoms to move freely, exploring many
configurations.
• Cooling: As the temperature decreases, atomic movement slows, and the
material settles into a stable configuration.
• In optimization, it is analogous to
• Heating: initially allowing large and random movements in the search
space (Exploration)
• Cooling: gradually reducing randomness to focus on fine-tuning the
solution (Exploitation)



Simulated Annealing
• Simulated Annealing uses a probability function P to accept worse solutions early in the
search to escape local optima:

    P = e^(−ΔE/T)

• where ΔE is the difference in the objective function (new cost - current cost)
• T is the current temperature.
• T is a control parameter that governs the algorithm’s randomness. At high
temperatures, the algorithm is more likely to accept worse solutions to escape
local optima. As the temperature decreases, the algorithm focuses on refining
solutions.

• Cooling Schedule: controls how the temperature T decreases over time. A geometric
cooling schedule can be T ← αT, where α is a cooling factor (typically between 0.8
and 0.99).

• Termination Criteria: the algorithm stops when T reaches a predefined minimum, a
maximum number of iterations is reached, or no significant improvement occurs.
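A small numeric illustration (an assumption for demonstration, not from the slides) of how the acceptance probability P = e^(−ΔE/T) behaves for a move that worsens the objective by ΔE = 2:

import math

for T in (10.0, 1.0, 0.1):
    P = math.exp(-2.0 / T)             # probability of accepting the worse move
    print(f"T = {T:5.1f}  ->  P = {P:.6f}")
# High T accepts bad moves often; as T approaches 0, behavior approaches greedy hill climbing.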



Simulated Annealing
• The overall structure of the simulated-annealing algorithm is similar to hill
climbing. Instead of picking the best move, however, it picks a random move.
• If the move improves the situation, it is always accepted.
• Otherwise, the algorithm accepts the move with some probability less than 1.
• The probability decreases exponentially with the “badness” of the move—the
amount by which the evaluation is worsened.
• The probability also decreases as the “temperature” T goes down: “bad” moves
are more likely to be allowed at the start, when T is high, and they become
more unlikely as T decreases.
• If the schedule lowers T to 0 slowly enough, then a property of the Boltzmann
distribution, e^(−ΔE/T), is that all the probability is concentrated on the global
maxima, which the algorithm will find with probability approaching 1.



Simulated Annealing
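A minimal Python sketch of the simulated-annealing loop described above, assuming a minimization problem. The neighbor (random successor generator) and cost (objective) callables, as well as the parameter values, are hypothetical assumptions, not part of the slides:

import math
import random

def simulated_annealing(start, neighbor, cost,
                        T=1.0, alpha=0.95, T_min=1e-3, steps_per_T=100):
    """Always accept improving moves; accept worsening moves with P = e^(-dE/T)."""
    current = start
    while T > T_min:
        for _ in range(steps_per_T):
            candidate = neighbor(current)
            dE = cost(candidate) - cost(current)
            if dE <= 0 or random.random() < math.exp(-dE / T):
                current = candidate    # improving move, or an accepted "bad" move
        T *= alpha                     # geometric cooling schedule: T <- alpha * T
    return current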



Simulated Annealing
• Simulated annealing was used to solve VLSI layout problems
beginning in the 1980s.
• It has been applied widely to factory scheduling and other large-
scale optimization tasks.
• Simulated Annealing is a powerful and versatile optimization
technique.
• By balancing exploration (accepting worse solutions) and
exploitation (refining good solutions), it can find near-optimal
solutions for complex problems like the TSP.
• Proper parameter tuning and cooling schedules are essential to
maximize its effectiveness.



Genetic Algorithm
• Genetic Algorithms (GAs) are adaptive heuristic search algorithms.
• They are inspired by Charles Darwin’s theory of natural evolution.
• This algorithm reflects the process of natural selection where the
fittest individuals are selected for reproduction in order to produce
offspring of the next generation.
• They are commonly used to generate high-quality solutions for
optimization problems and search problems.
• Genetic algorithms are important because they can solve difficult problems that
would otherwise take a very long time to solve.

See an Example at https://towardsdatascience.com/genetic-algorithm-explained-step-by-step-65358abe2bf



Genetic Algorithm
• The process of natural selection starts with the selection of fittest
individuals from a population.
• Each individual is represented as a string of characters/integers/floats/
bits. This string is analogous to a chromosome.
• They produce offspring which inherit the characteristics of the
parents and will be added to the next generation.
• If parents have better fitness, their offspring will be better than
parents and have a better chance at surviving.
• This process keeps on iterating and at the end, a generation with the
fittest individuals will be found.
• This notion can be applied to a search problem: we consider a set of candidate
solutions to the problem and select the best ones from it.



Phases in Genetic Algorithm
• Genetic algorithms go through the following phases to solve complex
optimization problems:
• Initialization
• Fitness assignment
• Selection
• Reproduction
• Crossover
• Mutation
• Termination






Phases in Genetic Algorithm
• Initial Population:
• The process begins with a set of individuals called a Population. Each
individual is a candidate solution to the problem you want to solve.
• An individual is characterized by a set of parameters (variables) known as
Genes. Genes are joined into a string to form a Chromosome (solution).



Phases in Genetic Algorithm
• Initial Population:
• In a genetic algorithm, the set of genes of an individual is represented using
a string, in terms of an alphabet.
• Usually, binary values are used (string of 1s and 0s). We say that we encode
the genes in a chromosome.



Phases in Genetic Algorithm
• Initial Population:
• The initial population consists of a set of candidate (probable) solutions to the
given problem.
• The most popular technique for initialization is the use of random binary
strings
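A minimal Python sketch of random binary-string initialization; the population size and chromosome length below are illustrative assumptions:

import random

def init_population(pop_size=10, chrom_length=8):
    """Create pop_size random chromosomes, each a string of 0s and 1s."""
    return ["".join(random.choice("01") for _ in range(chrom_length))
            for _ in range(pop_size)]

print(init_population(4, 6))   # e.g. ['010110', '111001', '000101', '101100']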



Phases in Genetic Algorithm
• Fitness Function:
• We need to define evaluation criteria for the chromosomes. This is
called the fitness function.
• The fitness function determines how fit an individual is (the ability of an
individual to compete with other individuals).
• It gives a fitness score to each individual.
• The probability that an individual will be selected for reproduction is based
on its fitness score.
• The better the fitness score, the higher the chance of being chosen for
reproduction.
• The individuals with better fitness scores are selected to mate and
produce better offspring by combining the chromosomes of the parents.



Phases in Genetic Algorithm
• Selection:
• The idea of the selection phase is to select the fittest individuals and let them
pass their genes on to the next generation
• Pairs of individuals (parents) are selected based on their fitness scores.
• These individuals pass on their genes to the next generation
• The main objective of this phase is to establish the region with high chances
of generating the best solution to the problem (better than the previous
generation).
• The genetic algorithm uses the fitness-proportionate selection technique to
ensure that useful solutions are used for recombination (a sketch is given below)
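A minimal Python sketch of fitness-proportionate (roulette-wheel) selection, assuming non-negative fitness scores; the function and variable names are illustrative assumptions:

import random

def roulette_wheel_select(population, fitnesses):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fitness in zip(population, fitnesses):
        running += fitness
        if running >= pick:
            return individual
    return population[-1]              # fallback for floating-point rounding

The same effect can be obtained more compactly with random.choices(population, weights=fitnesses).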



Phases in Genetic Algorithm
• Reproduction:
• This phase involves the creation of a child population. The algorithm
employs the following operators, which are applied to the parent population:
• Crossover
• Mutation



Phases in Genetic Algorithm
• Crossover:
• Crossover is the most significant phase in a genetic algorithm.
• This operator swaps genetic information between two parents to produce
offspring.
• It is performed on randomly selected parent pairs to generate a child
population of the same size as the parent population.
• For each pair of parents to be mated, a crossover point is chosen at
random from within the genes.



Phases in Genetic Algorithm
• Crossover:
• For example, consider the crossover point to be 3 as shown below.
• Offspring are created by exchanging the genes of parents among
themselves until the crossover point is reached.
• The new offspring are added to the population.

[Figure: exchanging genes between parents produces new offspring]
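A minimal Python sketch of single-point crossover on bit-string chromosomes. The parent strings and the fixed crossover point of 3 below are illustrative assumptions mirroring the slide's example; in practice the point is chosen at random within the genes:

import random

def single_point_crossover(parent1, parent2, point=None):
    """Swap the tails of two parents after the crossover point."""
    if point is None:
        point = random.randint(1, len(parent1) - 1)   # random point within the genes
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

c1, c2 = single_point_crossover("101011", "110110", point=3)
print(c1, c2)   # "101110" and "110011"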



Phases in Genetic Algorithm
• Mutation:
• This operator adds new genetic information to the new child population.
• This is achieved by flipping some bits in the chromosome.
• Mutation helps the search avoid getting trapped in local optima and enhances
diversity in the population.
• In some of the new offspring, genes are subjected to mutation with a low
random probability.
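A minimal Python sketch of bit-flip mutation; the mutation rate is an illustrative assumption (typical values are small):

import random

def mutate(chromosome, rate=0.05):
    """Flip each bit independently with a small probability."""
    return "".join(
        ("1" if bit == "0" else "0") if random.random() < rate else bit
        for bit in chromosome
    )

print(mutate("101110"))   # occasionally differs from the input in a few positions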



Phases in Genetic Algorithm
• Termination:
• The algorithm terminates if the population has converged (does not
produce offspring which are significantly different from the previous
generation).
• Then it is said that the genetic algorithm has provided a set of
solutions to our problem.
• The algorithm may also terminate once a solution reaching a threshold fitness
has been attained.
• It will identify this solution as the best solution in the population.



Phases in Genetic Algorithm
Genetic Algorithm:

1. Randomly initialize population P
2. Determine fitness of the population
3. Until convergence, repeat the following steps:
   A. Select parents from the population
   B. Crossover and generate a new population
   C. Perform mutation on the new population
   D. Calculate fitness for the new population
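A self-contained Python sketch of the loop above, applied to the illustrative "OneMax" problem (maximize the number of 1s in a bit string); the problem, parameter values, and helper names are assumptions used only for demonstration:

import random

def fitness(chrom):                         # step 2 / D: fitness of an individual
    return chrom.count("1")

def select_parents(population):             # step A: fitness-proportionate selection
    weights = [fitness(c) + 1 for c in population]     # +1 avoids all-zero weights
    return random.choices(population, weights=weights, k=2)

def crossover(p1, p2):                      # step B: single-point crossover
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:]

def mutate(chrom, rate=0.05):               # step C: bit-flip mutation
    return "".join(("1" if b == "0" else "0") if random.random() < rate else b
                   for b in chrom)

def genetic_algorithm(pop_size=20, length=16, generations=50):
    population = ["".join(random.choice("01") for _ in range(length))
                  for _ in range(pop_size)]             # step 1: random initialization
    for _ in range(generations):                        # step 3: iterate over generations
        population = [mutate(crossover(*select_parents(population)))
                      for _ in range(pop_size)]
    return max(population, key=fitness)

print(genetic_algorithm())   # usually close to a string of all 1s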



Advantages of Genetic Algorithm
• Parallelism
• Global optimization
• Can search a large solution space
• Requires less information
• Provides multiple optimal solutions
• Probabilistic in nature
• Genetic representations using chromosomes



Disadvantages of Genetic Algorithm
• They are not effective in solving simple problems.
• Lack of proper implementation may make the algorithm converge to
a solution that is not optimal.
• The quality of the final solution is not guaranteed.
• Repeated calculation of fitness values can make some problems computationally
expensive.
• Identifying a suitable fitness function can be difficult.
• Genetic algorithms can be computationally costly when evaluating each
individual is expensive (for example, when it requires training a model).
• These algorithms can take a long time to converge since they have a
stochastic nature.



When to apply Genetic Algorithm
• There are multiple local optima
• The objective function is not smooth (so derivative methods cannot
be applied)
• The number of parameters is very large
• Objective function is noisy or stochastic (may not be predicted
precisely)



Applications of Genetic Algorithm
• Transport: Genetic algorithms are used in the traveling salesman
problem to develop transport plans that reduce the cost of travel
and the time taken. They are also used to develop an efficient way
of delivering products.
• DNA Analysis: They are used in DNA analysis to establish the DNA
structure using spectrometric information.
• Aircraft Design: They are used to develop parametric aircraft
designs. The parameters of the aircraft are modified and upgraded
to provide better designs.
• Economics: They are used in economics to describe various models
such as game theory, the cobweb model, asset pricing, and schedule
optimization.



Applications of Genetic Algorithm
• Optimization Problems: One of the best-known examples is the traveling
salesman problem, which is often solved with GAs. GAs are also widely used
for other optimization problems such as job scheduling and sound-quality
optimization.
• Immune system models: GAs are used to model various aspects of the
immune system, for individual genes and multi-gene families, over
evolutionary time.
• Machine Learning: GAs have been used to solve problems related to
classification and prediction, and to create rules for learning and
classification.



THANK YOU

End of Chapter
