
UNIT-5

Genetic Algorithms
Genetic Algorithms (GAs) are optimization techniques inspired by the
process of natural selection. They are used to solve complex problems by
evolving a population of candidate solutions over multiple generations. A
key aspect of GAs is how hypotheses (candidate solutions) are
represented.
Representing Hypotheses
a) Binary Representation
 Each hypothesis is represented as a binary string (0s and 1s).
 Each bit in the string is called a gene.
 Example: A hypothesis for a 4-bit binary string could be 1010.
Real-Time Example:
 Knapsack Problem:
o You have a knapsack with a limited capacity and a set of items with specific weights and values.
o Each hypothesis represents a selection of items (1 = included, 0 = not included).
o Example: 1010 means items 1 and 3 are included, while items 2 and 4 are not.
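The knapsack example above can be sketched as a small decoding-and-scoring routine; the item weights, values, and capacity used here are illustrative assumptions, not values from the text:

```python
# Sketch: decode a binary hypothesis for the knapsack problem and score it.
# Weights, values, and capacity below are illustrative assumptions.
weights = [3, 5, 2, 4]   # weight of items 1-4
values = [4, 6, 3, 5]    # value of items 1-4
capacity = 7

def knapsack_fitness(bits):
    """Total value of the selected items, or 0 if capacity is exceeded."""
    w = sum(wi for wi, b in zip(weights, bits) if b)
    v = sum(vi for vi, b in zip(values, bits) if b)
    return v if w <= capacity else 0

# Hypothesis 1010 selects items 1 and 3 (weight 3 + 2 = 5, value 4 + 3 = 7).
print(knapsack_fitness([1, 0, 1, 0]))  # 7
```

A hypothesis that overloads the knapsack simply scores 0 here; real implementations often use a gentler penalty so the search can still move through infeasible regions.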
b) Integer Representation
 Each hypothesis is represented as a string of integers.
 Useful for problems where the solution involves discrete values.
 Example: A hypothesis for a 3-integer string could be [5, 2, 7].
Steps in Genetic Algorithms
1. Initialization:
o Create an initial population of random hypotheses (chromosomes).
2. Evaluation:
o Evaluate the fitness of each hypothesis using a fitness function.
3. Selection:
o Select the best-performing hypotheses for reproduction.
4. Crossover:
o Combine pairs of hypotheses to create offspring.
5. Mutation:
o Randomly modify some genes in the offspring.
6. Replacement:
o Replace the old population with the new population.
7. Termination:
o Repeat the process until a stopping condition is met (e.g., maximum generations or satisfactory fitness).
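The seven steps above can be sketched end to end on the toy OneMax problem (maximize the number of 1-bits in a string); the problem size, population size, mutation rate, and tournament size here are illustrative choices, not prescribed by the text:

```python
import random

# Minimal GA sketch for OneMax: maximize the count of 1-bits.
# All parameters below are illustrative assumptions.
rng = random.Random(0)
N_BITS, POP, GENS, MUT_RATE = 20, 30, 50, 0.02

def fitness(ind):
    return sum(ind)  # number of 1-bits

def tournament(pop, k=3):
    # Selection: pick the fittest of k randomly sampled individuals.
    return max(rng.sample(pop, k), key=fitness)

def crossover(p1, p2):
    # Single-point crossover: head of parent 1, tail of parent 2.
    cut = rng.randrange(1, N_BITS)
    return p1[:cut] + p2[cut:]

def mutate(ind):
    # Flip each gene independently with probability MUT_RATE.
    return [1 - g if rng.random() < MUT_RATE else g for g in ind]

# Initialization: random population of bit strings.
pop = [[rng.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]

# Evaluation, selection, crossover, mutation, replacement, termination.
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]

best = max(pop, key=fitness)  # best hypothesis found
```

After a few dozen generations the best individual is typically all (or nearly all) 1s, illustrating how selection pressure plus variation converges on the optimum.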

Advantages of Genetic Algorithms

 Can solve complex, non-linear, and multi-modal problems.
 Do not require gradient information.
 Can explore a large search space efficiently.

Genetic Operators

Genetic operators (selection, crossover, and mutation) drive each stage of the GA cycle:
1. Initialization: A population of candidate solutions is randomly generated.
2. Fitness Evaluation: The fitness of each individual is calculated.
3. Selection: Individuals are selected based on their fitness to act as parents.
4. Crossover: Parents are recombined to produce offspring.
5. Mutation: Offspring are mutated to introduce diversity.
6. Replacement: The new population is formed by replacing some or all of the current population with the offspring.
7. Termination: The process repeats until a stopping criterion is met (e.g., a maximum number of generations or a satisfactory solution is found).
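The two variation operators named above can be sketched in isolation; the parent strings in the demo are illustrative:

```python
import random

def single_point_crossover(p1, p2, rng):
    """Swap the tails of two parents after a random cut point."""
    cut = rng.randrange(1, len(p1))  # cut is never at either end
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def bit_flip_mutation(ind, rate, rng):
    """Flip each gene independently with probability `rate`."""
    return [1 - g if rng.random() < rate else g for g in ind]

rng = random.Random(42)
# Complementary parents make the recombination easy to see:
c1, c2 = single_point_crossover([1, 1, 1, 1], [0, 0, 0, 0], rng)
# Each child takes a head from one parent and a tail from the other,
# so at every position exactly one of c1, c2 carries a 1 here.
```

Crossover exploits existing building blocks, while mutation injects the small random changes that keep the search from stagnating.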

Importance of Genetic Operators

 Exploration vs. Exploitation: Selection and crossover focus on exploiting high-fitness solutions, while mutation ensures exploration of new regions in the search space.
 Diversity Maintenance: Mutation and certain selection strategies help maintain diversity, preventing the population from converging prematurely to suboptimal solutions.
 Adaptability: Genetic operators allow the algorithm to adapt to the problem landscape, making GAs suitable for a wide range of optimization problems.

Fitness Function and Selection in Genetic Algorithms

In genetic algorithms (GAs), the fitness function and selection are two critical components that guide the evolution of the population toward better solutions.

1. Fitness Function

The fitness function is a mathematical function that evaluates how well a candidate solution (individual) solves the problem at hand. It assigns a fitness score to each individual in the population, which quantifies its quality or performance. The goal of the genetic algorithm is to maximize (or minimize, depending on the problem) this fitness score.

Key Characteristics of a Fitness Function:

 Problem-Specific: The fitness function is tailored to the specific problem being solved.
 Objective: It reflects the objective of the optimization problem (e.g., minimizing cost, maximizing accuracy).
 Scalability: It should be computationally efficient, as it is evaluated for every individual in every generation.
 Differentiability: While not always required, a smooth and differentiable fitness function can help in some optimization scenarios.
Example
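As a minimal sketch (the encoding and objective here are illustrative assumptions, not the document's original example), consider the classic task of maximizing f(x) = x² where x is decoded from a 5-bit binary string:

```python
def decode(bits):
    """Interpret a bit list as an unsigned integer (most significant bit first)."""
    x = 0
    for b in bits:
        x = x * 2 + b
    return x

def fitness(bits):
    """Fitness score f(x) = x**2, to be maximized by the GA."""
    return decode(bits) ** 2

# The string 11001 decodes to 25, so its fitness score is 625.
print(fitness([1, 1, 0, 0, 1]))  # 625
```

The GA never sees f directly; it only compares fitness scores, which is why no gradient information is needed.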

Selection

Selection is the process of choosing individuals from the current population to act as parents for the next generation. The selection process is biased toward individuals with higher fitness scores, as they are more likely to produce better offspring. However, some randomness is introduced to maintain diversity in the population.

Common Selection Methods:

 Roulette-Wheel (Fitness-Proportionate) Selection: each individual is chosen with probability proportional to its fitness.
 Tournament Selection: a few individuals are sampled at random, and the fittest among them is chosen.
 Rank-Based Selection: individuals are ranked by fitness, and selection probabilities are assigned by rank rather than raw fitness.
 Elitism: the best individuals are copied unchanged into the next generation.
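As a sketch of one common method, fitness-proportionate (roulette-wheel) selection picks each parent with probability proportional to its fitness; the population and fitness values in the demo are illustrative:

```python
import random

def roulette_select(population, fitnesses, rng):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    pick = rng.uniform(0, total)   # a random point on the "wheel"
    running = 0.0
    for ind, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return ind
    return population[-1]          # guard against floating-point round-off

# Fitter individuals are drawn more often, but every individual has a chance.
rng = random.Random(0)
pop, fits = ["a", "b", "c"], [1.0, 2.0, 7.0]
draws = [roulette_select(pop, fits, rng) for _ in range(1000)]
# "c" (fitness 7) dominates the draws, yet "a" and "b" still appear.
```

This bias-with-randomness is exactly the balance the text describes: high-fitness parents are favored, while weaker ones still occasionally reproduce to preserve diversity.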


Illustrative Examples of Genetic Algorithms (GAs)
Example 1: Traveling Salesman Problem (TSP)
Problem:
A salesman needs to visit a set of cities exactly once and return to the
starting city, minimizing the total travel distance.
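For the TSP, a hypothesis is usually a permutation of city indices, and the quantity to minimize is the closed-tour length. A minimal sketch of that objective, with illustrative city coordinates:

```python
import math

# Illustrative city coordinates (a 4 x 3 rectangle, so the shortest
# tour is simply its perimeter).
cities = [(0, 0), (0, 3), (4, 3), (4, 0)]

def route_length(order):
    """Total distance visiting cities in `order` and returning to the start."""
    total = 0.0
    for i in range(len(order)):
        x1, y1 = cities[order[i]]
        x2, y2 = cities[order[(i + 1) % len(order)]]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

print(route_length([0, 1, 2, 3]))  # perimeter of the rectangle: 14.0
```

A GA for the TSP would minimize this length (e.g., by using its negative or reciprocal as the fitness score) and would need permutation-preserving crossover and mutation operators rather than the plain bit-string operators shown earlier.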
Example 2: Optimizing a Delivery Route
Problem:
A delivery company wants to optimize the route for its delivery trucks to
minimize fuel consumption and delivery time.
Example 3: Feature Selection in Machine Learning
Problem:
A data scientist wants to select the most relevant features for a machine
learning model to improve accuracy and reduce overfitting.
Steps:
1. Representation:
o Each individual represents a subset of features (e.g., a binary string where 1 indicates the feature is selected, and 0 indicates it is not).
o Example: For 5 features, an individual might be represented as [1, 0, 1, 1, 0], meaning features 1, 3, and 4 are selected.
2. Fitness Function:
o The fitness function evaluates the performance of the machine learning model using the selected features.
o For example, use accuracy or F1-score as the fitness score.
o If the model achieves 85% accuracy with the selected features, the fitness score is 0.85.
3. Selection:
o Use rank-based selection to choose parents.
o Rank individuals based on their fitness scores and assign probabilities accordingly.
4. Crossover:
o Perform uniform crossover to combine two parent feature subsets into a new subset.
o For example:
1. Parent 1: [1, 0, 1, 1, 0]
2. Parent 2: [0, 1, 1, 0, 1]
3. Offspring: [1, 1, 1, 0, 1] (after crossover).
5. Mutation:
o Apply bit flip mutation to introduce diversity.
o For example, flip a random bit in the binary string: [1, 1, 1, 0, 1] becomes [1, 1, 0, 0, 1].
6. Replacement:
o Replace the least fit individuals in the population with the new offspring.
7. Termination:
o Repeat the process until a satisfactory feature subset is found (e.g., a subset that maximizes model accuracy).
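The crossover and mutation numbers in the worked example can be reproduced with a short sketch. In real uniform crossover the mask is drawn at random per gene; here a fixed mask is used purely so the result matches the example above:

```python
def uniform_crossover(p1, p2, mask):
    """Take the gene from parent 1 where mask == 1, else from parent 2."""
    return [a if m else b for a, b, m in zip(p1, p2, mask)]

def flip_bit(ind, i):
    """Bit-flip mutation at position i (0-based)."""
    child = list(ind)
    child[i] = 1 - child[i]
    return child

p1, p2 = [1, 0, 1, 1, 0], [0, 1, 1, 0, 1]
# The mask is an illustrative choice; normally each entry is random.
child = uniform_crossover(p1, p2, [1, 0, 1, 0, 0])
print(child)                # [1, 1, 1, 0, 1] -- the offspring in the example
print(flip_bit(child, 2))   # [1, 1, 0, 0, 1] -- the mutated string in the example
```

Each offspring gene always comes from one of the two parents, so uniform crossover can only recombine features the parents already select; mutation is what lets entirely new features enter or leave the subset.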
