Unit 4 & 5 SC
A Genetic Algorithm (GA) is a search and optimization technique inspired by the process of
natural selection in biological evolution. It is used to find optimal or near-optimal solutions to
complex problems where traditional methods would be too slow or ineffective, problems that
might otherwise take a lifetime to solve. GAs appear widely in optimization, research, and
machine learning.
Genetic Algorithms (GAs) are used in real-world, live applications where traditional methods
struggle due to complexity or a huge search space. Here are several impactful live and practical
applications of GAs:

**Autonomous driving**
**How GAs help**: GAs evolve optimal driving paths, avoiding obstacles and minimizing travel
time or energy.
**Example**: Tesla and other autonomous systems have explored GAs to evolve decision-making
policies for edge cases.

**Recommendation systems**
**Example**: Amazon or Netflix might evolve recommendation strategies using GAs in A/B
testing environments.

**Scheduling**
**How GAs help**: Solve NP-hard scheduling problems where tasks, machines, and resources
must be allocated efficiently.

**Engineering design**
**How GAs help**: Automatically evolve antenna designs that would be hard to design
manually.
**Example**: NASA evolved a small, unusual antenna for satellites that outperformed
traditional designs.

**Finance**
**How GAs help**: Select the best combination of assets under complex constraints.
**Example**: Hedge funds and algorithmic trading firms use GAs to evolve trading strategies.

**Game development**
**How GAs help**: Create non-repetitive, adaptive content and smarter AI agents.
**Example**: In games like *No Man’s Sky*, procedural worlds can be evolved using similar
concepts.

**Space mission planning**
**Example**: The European Space Agency (ESA) has used GAs in mission planning.
1. Start / Initialization
2. Evaluate Fitness
🔹 Example: For maximizing a function f(x), higher f(x) means better fitness.
3. Selection
🔹 Common methods: tournament selection, rank selection, stochastic universal sampling (SUS)
4. Crossover (Recombination)
🔹 Example:
Parent 1: 1010
Parent 2: 1100
Child: 1000 (from 1st half of P1 + 2nd half of P2)
5. Mutation
🔹 Example:
Before: 1010
After: 1000 (bit flipped)
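The crossover and mutation steps above can be sketched in Python; the split point and mutation rate used here are illustrative.

```python
import random

def one_point_crossover(p1, p2, point=None):
    """Take the head of one parent and the tail of the other."""
    if point is None:
        point = random.randrange(1, len(p1))  # never split at the ends
    child1 = p1[:point] + p2[point:]
    child2 = p2[:point] + p1[point:]
    return child1, child2

def bit_flip_mutation(chrom, rate=0.1):
    """Flip each bit independently with probability `rate`."""
    return "".join(
        ("1" if b == "0" else "0") if random.random() < rate else b
        for b in chrom
    )

# Reproducing the worked example: split 1010 and 1100 at the midpoint.
c1, c2 = one_point_crossover("1010", "1100", point=2)
print(c1, c2)  # 1000 1110
```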
Flowchart of GA
Biological Background:
Chromosome: All living organisms consist of cells. In each cell, there is the same set of
chromosomes. Chromosomes are strings of DNA and consist of genes, which are blocks of DNA.
Each gene encodes a trait, for example, eye color.
Reproduction: During reproduction, combination (or crossover) occurs first. Genes from
parents combine to form a whole new chromosome. The newly created offspring can then be
mutated. The changes are mainly caused by errors in copying genes from parents. The fitness
of an organism is measured by the success of the organism in its life.
Basic principles:
Algorithmic Phases:
Encoding Methods:
Binary Encoding: The most common encoding method. Chromosomes are strings of 1s and
0s, and each position in the chromosome represents a particular characteristic of the
solution.
Permutation Encoding: Useful in ordering such as the Travelling Salesman Problem (TSP).
In TSP, every chromosome is a string of numbers, each of which represents a city to be
visited.
Value Encoding: Used in problems where complicated values, such as real numbers, are
needed and binary encoding would not suffice. Good for some problems, but it is often
necessary to develop problem-specific crossover and mutation techniques for these
chromosomes.
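The three encoding schemes can be illustrated with short Python snippets; the lengths and value ranges below are arbitrary:

```python
import random

# Binary encoding: a fixed-length string of bits.
binary = [random.randint(0, 1) for _ in range(8)]

# Permutation encoding (e.g. TSP): each gene is a city, each visited exactly once.
cities = list(range(6))
tour = random.sample(cities, len(cities))

# Value encoding: real-valued genes, each within an allowed range.
weights = [random.uniform(-1.0, 1.0) for _ in range(4)]
```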
SELECTION OPERATOR IN GA
In Genetic Algorithms (GAs), the selection operator chooses individuals from the current population to
act as parents for creating the next generation. The idea is to favor fitter individuals, while still
maintaining some genetic diversity to avoid premature convergence.
Summary Table
| Selection Method | Based on | Suitable For | Pros | Cons |
|---|---|---|---|---|
| Tournament | Fitness | Most GAs | Easy, scalable | Needs tuning of tournament size |
| Rank Selection | Rank | Avoiding dominance | Maintains diversity | Slower convergence |
| SUS | Fitness | Fair distribution | Less stochastic | More complex to implement |
| Elitism | Top fitness | Any GA | Guarantees best preserved | Can reduce diversity |
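As a sketch, tournament selection can be implemented in a few lines; the tournament size `k`, the toy population, and its fitness values are all illustrative:

```python
import random

def tournament_select(population, fitness, k=3):
    """Pick k random individuals; the fittest of them wins."""
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitness[i])
    return population[best]

pop = ["0001", "0110", "1011", "1111"]
fit = [1, 2, 3, 4]  # here: number of 1-bits
winner = tournament_select(pop, fit, k=4)  # with k == len(pop), the best always wins
print(winner)  # 1111
```

Larger `k` increases selection pressure; `k = 1` degenerates to random selection.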
2. Two-Point Crossover:
This is a specific case of an N-point crossover technique. Two random points are chosen on the
individual chromosomes (strings) and the genetic material is exchanged at these points.
3. Uniform Crossover:
Each gene (bit) is selected randomly from one of the corresponding genes of the parent
chromosomes.
A coin toss for each gene is a common way to picture this technique.
A crossover between two good solutions may not always yield a better or equally good
solution; however, since the parents are good, the probability of the child being good is high. If
an offspring is poor, it will be removed during "Selection" in the next iteration.
4. Arithmetic Crossover
Example: For real-valued chromosomes, each child gene is a weighted average of the parent
genes: child = α·parent1 + (1 − α)·parent2, with α ∈ [0, 1].
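A minimal sketch of arithmetic and uniform crossover; the blending weight `alpha` is an assumed parameter:

```python
import random

def arithmetic_crossover(p1, p2, alpha=0.5):
    """Each child gene is a weighted average of the parent genes (real-valued)."""
    c1 = [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]
    c2 = [(1 - alpha) * a + alpha * b for a, b in zip(p1, p2)]
    return c1, c2

def uniform_crossover(p1, p2):
    """Coin toss per gene: heads take from parent 1, tails from parent 2."""
    return [a if random.random() < 0.5 else b for a, b in zip(p1, p2)]

c1, c2 = arithmetic_crossover([1.0, 4.0], [3.0, 2.0], alpha=0.5)
print(c1)  # [2.0, 3.0]
```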
Choosing a Technique:
Mutation in GA
In Genetic Algorithms (GA), the mutation operator is a critical mechanism used to maintain
genetic diversity in the population of candidate solutions and to help the algorithm explore the
solution space more thoroughly. Here's a breakdown of what the mutation operator does and why it matters:
**Exploration:** Explores new solutions that may not be possible through just crossover.
1. **Binary Representation**
**Bit Flip Mutation**: Randomly selects one or more bits and flips them (0 → 1 or 1 → 0).
**Example**: 1 0 1 1 0 → 1 0 0 1 0 (one randomly chosen bit flipped)
2. **Real-Valued Representation**
**Uniform Mutation**: Replace a gene with a random value within the gene’s allowed range.
3. **Permutation Representation**
**Mutation Rate**
* The mutation rate is a parameter that defines the probability of mutation occurring in an
individual.
* Typical values: 0.001 to 0.01 (for binary GAs), but this can vary depending on the problem and
representation.
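Uniform mutation with a configurable mutation rate might look like this; the gene range and the rates shown are illustrative:

```python
import random

def uniform_mutation(genes, low, high, rate=0.01):
    """With probability `rate`, replace a gene by a fresh random value in [low, high]."""
    return [random.uniform(low, high) if random.random() < rate else g
            for g in genes]

genes = [0.2, 0.7, 0.5]
mutated = uniform_mutation(genes, low=0.0, high=1.0, rate=0.01)
```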
GENERATIONAL CYCLE
The generational cycle in a Genetic Algorithm (GA) refers to the iterative process that mimics
natural evolution to evolve a population of candidate solutions toward better solutions over time.
1. Initialization
Generate an initial population of individuals (randomly or heuristically).
Each individual represents a potential solution, often encoded as a binary string, real-valued
vector, or permutation.
2. Fitness Evaluation
3. Selection
4. Crossover (Recombination)
5. Mutation
6. Replacement
Form the new generation by selecting individuals from the current population and
offspring.
Strategies:
o Generational replacement: Replace the entire population.
o Steady-state replacement: Only a few individuals are replaced.
o Elitism: Keep the best individuals from the previous generation.
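Putting the phases together, a toy generational cycle for the OneMax problem (maximize the number of 1-bits) might look like the sketch below; the population size, mutation rate, and tournament size are arbitrary choices:

```python
import random

def onemax(chrom):
    """Toy fitness: the number of 1-bits in the chromosome."""
    return sum(chrom)

def evolve(pop_size=20, length=16, generations=50, p_mut=0.02, elite=1):
    # 1. Initialization: random binary chromosomes.
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # 2. Fitness evaluation + elitism: carry the best individuals over.
        pop.sort(key=onemax, reverse=True)
        next_gen = [list(ind) for ind in pop[:elite]]
        while len(next_gen) < pop_size:
            # 3. Selection: tournament of size 3.
            p1 = max(random.sample(pop, 3), key=onemax)
            p2 = max(random.sample(pop, 3), key=onemax)
            # 4. Crossover: one-point.
            cut = random.randrange(1, length)
            child = p1[:cut] + p2[cut:]
            # 5. Mutation: independent bit flips.
            child = [1 - g if random.random() < p_mut else g for g in child]
            next_gen.append(child)
        # 6. Replacement: generational, with elitism.
        pop = next_gen
    return max(pop, key=onemax)

best = evolve()
```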
CONVERGENCE IN GA
In the context of Genetic Algorithms (GAs), convergence refers to the process by which the
population of solutions becomes increasingly similar over generations, often resulting in little
to no variation between individuals.
Signs of Convergence
The algorithm has found a solution (or a set of similar solutions) and is no longer
exploring much of the search space.
Most individuals in the population have very similar or identical genetic material
(genes).
Fitness values across the population become similar, indicating reduced diversity.
Causes of Premature Convergence
1. Selection Pressure: Favoring the best individuals too strongly can cause their genes to
dominate the population.
2. Low Mutation Rate: Without enough mutation, diversity isn't maintained.
3. Small Population Size: Fewer individuals can limit the exploration of the search space.
4. Noisy or Simple Fitness Landscape: The problem may not have many viable diverse
solutions.
Types of Convergence

| Type | Description |
|---|---|
| Desirable Convergence | Population converges to a global optimum or near-optimal solution. |
| Premature Convergence | Population converges too early to a local optimum, missing better solutions. |
Detecting Convergence
Genetic Diversity Metrics: Measure how different individuals are from each other.
Fitness Stagnation: The best fitness hasn't improved over many generations.
Entropy of Population: Low entropy suggests similarity in genetic makeup.
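Two of these detection ideas can be sketched in Python: a per-position entropy measure of genetic diversity, and a simple stagnation check on the best-fitness history. The window size is an assumed parameter:

```python
from collections import Counter
import math

def population_entropy(pop):
    """Average Shannon entropy per gene position; low values mean similar individuals."""
    length = len(pop[0])
    total = 0.0
    for i in range(length):
        counts = Counter(ind[i] for ind in pop)
        n = len(pop)
        total += -sum((c / n) * math.log2(c / n) for c in counts.values())
    return total / length

def stagnated(best_history, window=20):
    """Fitness stagnation: no improvement of the best fitness over `window` generations."""
    return (len(best_history) > window
            and max(best_history[-window:]) <= best_history[-window - 1])

uniform_pop = ["1111", "1111", "1111"]
print(population_entropy(uniform_pop))  # 0.0: fully converged
```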
Hybrid Systems
Hybrid systems: A Hybrid system is an intelligent system that is formed by combining at least
two intelligent technologies like Fuzzy Logic, Neural networks, Genetic algorithms,
reinforcement learning, etc. The combination of different techniques in one computational
model makes these systems possess an extended range of capabilities. These systems are
capable of reasoning and learning in an uncertain and imprecise environment. These systems
can provide human-like expertise like domain knowledge, adaptation in noisy environments,
etc.
The Neuro-fuzzy system is based on a fuzzy system that is trained using learning methods
derived from neural network theory. The learning process operates only on local information and
causes only local changes in the underlying fuzzy system. A neuro-fuzzy system can be seen as
a 3-layer feedforward neural network. The first layer represents input variables, the middle
(hidden) layer represents fuzzy rules and the third layer represents output variables. Fuzzy
sets are encoded as connection weights within the layers of the network, which provides
functionality in processing and training the model.
Working flow:
In the input layer, each neuron transmits external crisp signals directly to the next layer.
Each fuzzification neuron receives a crisp input and determines the degree to which the
input belongs to the input fuzzy set.
Each neuron in the fuzzy rule layer represents a fuzzy rule and receives its inputs from the fuzzification neurons that represent fuzzy sets.
An output neuron combines all inputs using fuzzy operation UNION.
Each defuzzification neuron represents the single output of the neuro-fuzzy system.
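The layered flow above can be sketched as a tiny fuzzy inference pass in Python. The temperature sets, rules, and output centers are invented for illustration, and a weighted average stands in for the defuzzification neuron:

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(temp):
    """Fuzzification layer: membership degrees for assumed input sets."""
    return {
        "cold": triangular(temp, -10, 0, 15),
        "warm": triangular(temp, 5, 20, 30),
        "hot":  triangular(temp, 25, 35, 50),
    }

def infer(temp):
    """Rule layer fires at antecedent strength; output layer combines and defuzzifies."""
    mu = fuzzify(temp)
    # Illustrative rules: cold -> low fan, warm -> medium fan, hot -> high fan.
    rule_strength = {"low": mu["cold"], "medium": mu["warm"], "high": mu["hot"]}
    centers = {"low": 0.2, "medium": 0.5, "high": 0.9}  # assumed output set centers
    num = sum(rule_strength[k] * centers[k] for k in centers)
    den = sum(rule_strength.values())
    return num / den if den else 0.0

print(round(infer(20), 2))  # 0.5: fully "warm" -> medium fan speed
```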
Advantages:
It can handle numeric, linguistic, logic, etc kind of information.
It can manage imprecise, partial, vague, or imperfect information.
It can resolve conflicts by collaboration and aggregation.
It has self-learning, self-organizing and self-tuning capabilities.
It can mimic the human decision-making process.
Disadvantages:
Hard to develop a model from a fuzzy system
Problems of finding suitable membership values for fuzzy systems
Neural networks cannot be used if training data is not available.
Applications:
Student Modelling
Medical systems
Traffic control systems
Forecasting and predictions
Classification of Neuro-Fuzzy Hybrid Systems
Neuro-fuzzy systems can be classified based on how the integration occurs between neural networks
and fuzzy systems.
Main Classifications:
1. Cooperative Neuro-Fuzzy Systems
Neural networks are used to help design or optimize parts of a fuzzy system.
Learning Stage: Neural network tunes parameters (e.g., membership functions).
After Learning: The neural component is removed; only the fuzzy system is used.
✅ Static hybrid: no learning during runtime.
Example:
Use a neural network to tune the membership functions of a fuzzy controller offline.
2. Concurrent Neuro-Fuzzy Systems
Neural networks and fuzzy systems work in parallel or independently but share information during
execution.
NN and fuzzy logic run simultaneously
Each may contribute to decision-making
Useful for multi-modal systems (e.g., if one fails, the other supports).
Example:
A fuzzy rule-based system handles known conditions; an NN takes over in novel situations.
3. Fully Integrated (or Fused) Neuro-Fuzzy Systems
Neural networks and fuzzy systems are tightly integrated — the system acts like a neural network
structure that embeds fuzzy logic.
Fuzzy inference is represented in neural network form
Parameters (like membership functions and rules) are learned automatically
Also called adaptive neuro-fuzzy systems
Most common and powerful type
Online learning is supported
Neuro-Genetic Hybrid Systems
A neuro-genetic hybrid system combines genetic algorithms with artificial neural networks (ANNs), typically using the GA to optimize the network's structure, weights, or training parameters.
Advantages:
GA is used for topology optimization i.e to select the number of hidden layers, number of
hidden nodes, and interconnection pattern for ANN.
In GAs, the learning of ANN is formulated as a weight optimization problem, usually using
the inverse mean squared error as a fitness measure.
Control parameters such as learning rate, momentum rate, tolerance level, etc are also
optimized using GA.
It can mimic the human decision-making process.
Disadvantages:
Highly complex system.
The accuracy of the system is dependent on the initial population.
Maintenance costs are very high.
Applications:
Face recognition
DNA matching
Animal and human research
Behavioral system
Fuzzy-Genetic Hybrid Systems
A fuzzy-genetic hybrid system uses a genetic algorithm to tune or learn the rules and membership functions of a fuzzy system.
Working Flow:
Start with an initial population of solutions that represent the first generation.
Feed each chromosome from the population into the Fuzzy logic controller and compute
performance index.
Create a new generation using evolution operators till some condition is met.
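This working flow can be sketched on a toy problem: a GA evolves the three output centers of a hypothetical fuzzy fan controller, with the performance index defined as the negated squared error against a few desired set-points. All rules, targets, and GA parameters here are invented for illustration:

```python
import random

# Hypothetical setup: a chromosome encodes the (low, medium, high) output
# centers of a fuzzy fan controller.
TARGETS = [(0, 0.1), (20, 0.5), (40, 0.9)]  # (temperature, desired fan speed)

def controller_output(temp, centers):
    low, med, high = centers
    # Crude rule strengths derived from temperature (illustrative, not a full FIS):
    mu_cold = max(0.0, 1 - temp / 20)
    mu_warm = max(0.0, 1 - abs(temp - 20) / 20)
    mu_hot = max(0.0, (temp - 20) / 20)
    num = mu_cold * low + mu_warm * med + mu_hot * high
    den = mu_cold + mu_warm + mu_hot
    return num / den if den else 0.0

def performance_index(centers):
    """Fitness: negated squared error, so higher is better."""
    return -sum((controller_output(t, centers) - y) ** 2 for t, y in TARGETS)

# Generational cycle: truncation selection, arithmetic crossover, Gaussian mutation.
pop = [[random.random() for _ in range(3)] for _ in range(30)]
for _ in range(60):
    pop.sort(key=performance_index, reverse=True)
    survivors = pop[:10]
    children = []
    while len(survivors) + len(children) < 30:
        p1, p2 = random.sample(survivors, 2)
        child = [(a + b) / 2 for a, b in zip(p1, p2)]
        child = [min(1.0, max(0.0, g + random.gauss(0, 0.05))) for g in child]
        children.append(child)
    pop = survivors + children

best = max(pop, key=performance_index)
```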
Advantages:
GAs are used to develop the best set of rules to be used by a fuzzy inference engine
GAs are used to optimize the choice of membership functions.
A Fuzzy GA is a directed random search over all discrete fuzzy subsets.
It can mimic the human decision-making process.
Disadvantages:
Interpretation of results is difficult.
Difficult to build membership values and rules.
Takes lots of time to converge.
Applications:
Mechanical Engineering
Electrical Engineering
Artificial Intelligence
Economics
Genetic Fuzzy Rule-Based Systems (GFRBS) are intelligent hybrid systems that combine:
Fuzzy logic: to handle uncertainty, imprecision, and human-like reasoning.
Genetic algorithms (GA): to automatically learn, optimize, or evolve the fuzzy rules
and/or membership functions.
Core Concept
A GFRBS uses a fuzzy rule-based system as its reasoning engine, and genetic algorithms to
evolve the rules or tune parameters — making the system self-adaptive and data-driven.
Key Components
1. Fuzzy Rule-Based System (FRBS): the reasoning engine, consisting of a rule base and membership functions.
2. Genetic Algorithm (GA)
Chromosomes represent:
o Rule sets
o Membership function parameters
o Rule weights or priorities
GA operations:
o Selection (based on fitness)
o Crossover (combine rules or parameters)
o Mutation (change part of rules or MFs)
Fitness function measures:
o Accuracy (e.g., classification)
o Error (e.g., in regression or control tasks)
o Complexity (e.g., number of rules)
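As a toy illustration of rule-set encoding, a chromosome can be a bitmask over a pool of candidate rules, with a fitness that trades accuracy against rule count. The per-rule scores and penalty below are invented, and the tiny search space is scanned exhaustively rather than by a GA, just to show the encoding and the fitness trade-off:

```python
# Assumed toy setup: 6 candidate fuzzy rules; a chromosome is a bitmask
# selecting which rules enter the rule base.
RULE_SCORES = [0.30, 0.03, 0.25, 0.02, 0.20, 0.01]  # assumed contribution per rule
COMPLEXITY_PENALTY = 0.04                            # cost per selected rule

def fitness(mask):
    """Accuracy proxy minus a complexity penalty on the number of rules."""
    accuracy = sum(s for s, bit in zip(RULE_SCORES, mask) if bit)
    return accuracy - COMPLEXITY_PENALTY * sum(mask)

# Exhaustive scan of all 64 masks; a real GFRBS would search this with a GA.
best = max(
    ([int(b) for b in f"{m:06b}"] for m in range(64)),
    key=fitness,
)
print(best)  # [1, 0, 1, 0, 1, 0]: only rules scoring above the penalty survive
```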
Challenges
Computational cost: Evolving rule sets can be expensive
Rule bloat: Too many rules can reduce interpretability
Overfitting risk: Needs proper validation and pruning
“Which fuzzy rules best describe the “Where should the membership
Example Task
system?” functions peak or spread?”
GA
Encodes fuzzy rules Encodes fuzzy parameters
Chromosome
Create or select effective rule Fine-tune existing rules to improve
Goal
combinations performance
Optimized membership functions, rule
Typical Output A new or evolved rule base
weights, or thresholds
Key Idea
Neural Network is used to support the fuzzy system — typically to help design,
initialize, or tune it.
Once this support is complete, the neural part is removed or frozen.
The resulting system is a pure fuzzy system, but it’s been optimized using neural
learning.
How It Works
Step-by-Step Process:
1. Collect Data
o Input–output pairs for training.
2. Use Neural Network to Learn
o Train a neural network to approximate the input–output mapping.
o Extract fuzzy rules, or tune membership functions, based on NN behavior.
3. Build Fuzzy Inference System
o Use the knowledge from the NN to define fuzzy sets, rules, or parameters.
4. Use Only the Fuzzy System
o After training, the NN is no longer used.
o The system runs as a standalone fuzzy system.
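The step-by-step process can be sketched with gradient descent standing in for the neural learning stage: it tunes the center of a Gaussian membership function to fit sample data, after which the tuned value is frozen into a standalone fuzzy set. The data, initial center, width, and learning rate are all invented:

```python
import math

# Invented toy data: (temperature, desired membership in "warm").
data = [(18.0, 0.9), (20.0, 1.0), (22.0, 0.9)]
center, sigma, lr = 10.0, 5.0, 0.5  # deliberately poor initial center

def mu(x, c):
    """Gaussian membership function with fixed width `sigma`."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

# Learning stage (the "neural" part): gradient descent on squared error
# with respect to the center parameter.
for _ in range(500):
    grad = sum(2 * (mu(x, center) - y) * mu(x, center) * (x - center) / sigma ** 2
               for x, y in data)
    center -= lr * grad

# After learning: freeze the tuned parameter; the learning code is no longer used.
WARM_CENTER = center

def warm(x):
    """The standalone fuzzy set that the final system actually runs."""
    return mu(x, WARM_CENTER)
```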
Purpose
Neural networks are good at learning from data.
Fuzzy systems are good at interpretability and handling uncertainty.
A cooperative system learns from data (via NN) but runs using fuzzy rules
(interpretable).
Example: Use a neural network to learn how to adjust fan speed based on temperature and
humidity.
Benefits
Combines data-driven learning and expert knowledge.
Produces interpretable models (unlike black-box neural networks).
Easier to deploy and explain, especially in expert systems.
Limitations
Not adaptive during execution (since NN is removed/frozen).
Performance depends on quality of initial neural training.