B.M.S College of Engineering
Department of Machine Learning
Nature Inspired Computing - UNIT 03: Genetic Algorithm
Prof. Varsha R, Asst. Prof., Dept. of Machine Learning, BMSCE

Introduction
How does a Genetic Algorithm work?
Step 1: Initial Population
Step 2: Fitness Function
Step 3: Selection
Step 4: Crossover
Step 5: Offspring
Step 6: Mutation
Step 7: Termination
Working of Genetic Algorithm
Pseudo Code for Genetic Algorithm
1) Randomly initialize population p
2) Determine fitness of population
3) Until convergence, repeat:
   a) Select parents from the population
   b) Crossover and generate new population
   c) Perform mutation on new population
   d) Calculate fitness for new population
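A minimal Python sketch of this loop, assuming binary chromosomes and a user-supplied fitness function; all names, rates, and the tournament/single-point/bit-flip operator choices are illustrative, not taken from the slides:

import random

def genetic_algorithm(fitness, chrom_len=16, pop_size=50,
                      crossover_rate=0.9, mutation_rate=0.01, generations=100):
    """Minimal GA loop following the pseudocode above."""
    # 1) Randomly initialize population p
    population = [[random.randint(0, 1) for _ in range(chrom_len)]
                  for _ in range(pop_size)]
    best = max(population, key=fitness)

    # 3) Until convergence (here: a fixed generation budget), repeat
    for _ in range(generations):
        # a) Select parents (binary tournament selection, one simple choice)
        def select():
            a, b = random.sample(population, 2)
            return a if fitness(a) >= fitness(b) else b

        new_population = []
        while len(new_population) < pop_size:
            p1, p2 = select(), select()
            # b) Crossover (single point) to generate offspring
            if random.random() < crossover_rate:
                cut = random.randint(1, chrom_len - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            # c) Mutation (bit flip)
            for child in (c1, c2):
                for i in range(chrom_len):
                    if random.random() < mutation_rate:
                        child[i] = 1 - child[i]
            new_population.extend([c1, c2])

        population = new_population[:pop_size]
        # d) Calculate fitness for new population and keep track of the best
        best = max(population + [best], key=fitness)
    return best

# Example: maximize the number of 1-bits (OneMax)
print(genetic_algorithm(fitness=sum))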
Encoding Techniques in Genetic Algorithm
Binary Encoding
Binary Encoding Example 1
Binary Encoding Example 2
Value Encoding
Value Encoding Example 1
Permutation (or order) Encoding
Permutation (or order) Encoding Example 1
Tree Encoding
Parent Selection Operators
Fitness Proportionate Selection
Roulette Wheel Selection
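A sketch of fitness-proportionate (roulette wheel) selection, assuming non-negative fitness values; the toy population and fitness function below are placeholders:

import random

def roulette_wheel_select(population, fitness):
    """Pick one individual with probability proportional to its fitness."""
    fits = [fitness(ind) for ind in population]
    total = sum(fits)
    if total == 0:                      # degenerate case: fall back to a uniform pick
        return random.choice(population)
    pick = random.uniform(0, total)     # spin the wheel
    running = 0.0
    for ind, f in zip(population, fits):
        running += f
        if running >= pick:
            return ind
    return population[-1]               # guard against floating-point round-off

# Usage: select two parents from a toy population of bit strings
pop = [[1, 0, 1, 1], [0, 0, 0, 1], [1, 1, 1, 1]]
parents = [roulette_wheel_select(pop, fitness=sum) for _ in range(2)]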
Stochastic Universal Sampling (SUS)
Ranking Selection
Tournament Selection
Truncation Selection
Crossover Operators
Binary Coded Crossover Operators
Single Point Crossover
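A sketch of single point crossover on two equal-length binary parents; the cut point is chosen at random and the names are illustrative:

import random

def single_point_crossover(parent1, parent2):
    """Swap the tails of two parents after a randomly chosen cut point."""
    assert len(parent1) == len(parent2)
    cut = random.randint(1, len(parent1) - 1)   # cut strictly inside the string
    child1 = parent1[:cut] + parent2[cut:]
    child2 = parent2[:cut] + parent1[cut:]
    return child1, child2

# e.g. parents 101100 and 010011 yield 101011 and 010100 when the cut falls after bit 3
c1, c2 = single_point_crossover([1, 0, 1, 1, 0, 0], [0, 1, 0, 0, 1, 1])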
Two Point Crossover
Multi Point Crossover
Uniform Crossover
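A sketch of uniform crossover: each gene position is swapped between the parents with equal probability (the 0.5 mixing ratio is an assumed default):

import random

def uniform_crossover(parent1, parent2, swap_prob=0.5):
    """For each position, swap the parents' genes with probability swap_prob."""
    child1, child2 = list(parent1), list(parent2)
    for i in range(len(child1)):
        if random.random() < swap_prob:
            child1[i], child2[i] = child2[i], child1[i]
    return child1, child2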
Half-Uniform Crossover
Uniform Crossover with Crossover Mask (CM)
Shuffle Crossover
Three Parent Crossover
Real Coded Crossover Operators
Single Arithmetic Crossover
Linear Crossover
Order Coded Crossover Operator
Partially Mapped Crossover
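A sketch of Partially Mapped Crossover (PMX) for permutation-encoded chromosomes such as city tours; the two cut points are chosen at random and one child is returned (call again with the parents swapped for the second child):

import random

def pmx(parent1, parent2):
    """Partially Mapped Crossover for permutation-encoded chromosomes."""
    size = len(parent1)
    a, b = sorted(random.sample(range(size), 2))
    child = [None] * size
    # 1. Copy the mapping section from parent1
    child[a:b + 1] = parent1[a:b + 1]
    # 2. Place elements of parent2's mapping section that are not yet in the child
    for i in range(a, b + 1):
        gene = parent2[i]
        if gene in child[a:b + 1]:
            continue
        pos = i
        # Follow the mapping until a free slot outside the copied section is found
        while a <= pos <= b:
            pos = parent2.index(parent1[pos])
        child[pos] = gene
    # 3. Fill the remaining positions directly from parent2
    for i in range(size):
        if child[i] is None:
            child[i] = parent2[i]
    return child

# Usage with two example tours over cities 1..8
child = pmx([1, 2, 3, 4, 5, 6, 7, 8], [3, 7, 5, 1, 6, 8, 2, 4])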
Cycle Crossover
Mutation Operators
Bit Flip Mutation
Random Resetting
Swap Mutation
Scramble Mutation
Inversion Mutation
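Short sketches of three of the mutation operators listed above: bit flip for binary chromosomes, and swap and inversion for permutation chromosomes (the mutation rate is illustrative):

import random

def bit_flip_mutation(chromosome, rate=0.01):
    """Flip each bit independently with a small probability."""
    return [1 - g if random.random() < rate else g for g in chromosome]

def swap_mutation(perm):
    """Exchange two randomly chosen positions of a permutation."""
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def inversion_mutation(perm):
    """Reverse the order of the genes between two random positions."""
    p = perm[:]
    i, j = sorted(random.sample(range(len(p)), 2))
    p[i:j + 1] = reversed(p[i:j + 1])
    return p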
Stopping Condition for Genetic Algorithm Flow
Problem Solving Using GA - Example 1
Problem Solving Using GA - Example 2
• Repeat the steps until we get the maximum value of the function f(x).
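The slide's actual f(x) is not reproduced in this text, so the sketch below is an illustration only: it assumes the common textbook objective f(x) = x^2 with x encoded as a 5-bit binary string, and repeats selection, crossover and mutation until the best value stops improving:

import random

def f(x):
    return x * x                                  # assumed objective; the slide's f(x) may differ

def decode(bits):
    return int("".join(map(str, bits)), 2)        # 5-bit string -> integer in 0..31

pop = [[random.randint(0, 1) for _ in range(5)] for _ in range(4)]
best, stall = max(pop, key=lambda b: f(decode(b))), 0
while stall < 10:                                 # stop after 10 generations with no improvement
    # fitness-proportionate selection (+1 keeps the weights positive), single point crossover
    parents = random.choices(pop, weights=[f(decode(b)) + 1 for b in pop], k=2)
    cut = random.randint(1, 4)
    children = [parents[0][:cut] + parents[1][cut:], parents[1][:cut] + parents[0][cut:]]
    for c in children:                            # occasional bit-flip mutation
        if random.random() < 0.1:
            i = random.randrange(5)
            c[i] = 1 - c[i]
    pop = sorted(pop + children, key=lambda b: f(decode(b)), reverse=True)[:4]
    stall = 0 if f(decode(pop[0])) > f(decode(best)) else stall + 1
    best = max(best, pop[0], key=lambda b: f(decode(b)))

print(decode(best), f(decode(best)))              # converges toward x = 31, f(x) = 961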
Classification of GA
1. Messy Genetic Algorithm
2. Adaptive Genetic Algorithm
3. Hybrid Genetic Algorithm
4. Parallel Genetic Algorithm
Messy Genetic Algorithms (mGA)
• Concept: Messy Genetic Algorithms are designed to work efficiently with highly complex optimization problems by allowing individuals to represent partial solutions. They address issues related to premature convergence and inefficiency in representing solutions in standard GAs.
• Key Characteristics:
  1. Variable-length chromosomes: Unlike standard GAs, where chromosome lengths are fixed, mGAs allow variable lengths to represent solutions.
  2. Explicit gene representation: Genes in mGAs include both their position (locus) and value, enabling a more flexible representation.
  3. Building-blocks emphasis: mGAs focus on evolving and combining "building blocks", or smaller problem subcomponents, to form complete solutions.
• Phases:
  1. Initialization Phase: Starts with a diverse population of chromosomes, including short (partial) solutions.
  2. Juxtapositional Phase: Combines short solutions into longer, more comprehensive ones.
  3. Clean-up Phase: Refines the solutions by removing redundant or unfit components.
• Applications:
  1. Problems with highly interdependent variables.
  2. Complex combinatorial optimization tasks, such as scheduling and network design.
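A minimal sketch of the position-explicit, variable-length representation described above: each gene is a (locus, value) pair, loci that are never specified are filled in from a template, and over-specified loci are resolved by taking the first occurrence. The template mechanism and all names here are illustrative assumptions:

def express(messy_chromosome, template):
    """Turn a variable-length list of (locus, value) genes into a full solution.

    Over-specification: if a locus appears more than once, the first gene wins.
    Under-specification: loci that never appear are filled in from the template.
    """
    solution = list(template)
    seen = set()
    for locus, value in messy_chromosome:
        if locus not in seen:               # first occurrence wins
            solution[locus] = value
            seen.add(locus)
    return solution

# Two partial solutions (building blocks) joined in a juxtapositional step
block_a = [(0, 1), (3, 0)]
block_b = [(1, 1), (3, 1), (4, 0)]
joined = block_a + block_b                  # simple concatenation of building blocks
print(express(joined, template=[0, 0, 0, 0, 0]))   # -> [1, 1, 0, 0, 0]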
Adaptive Genetic Algorithms (AGA)
• Concept: Adaptive Genetic Algorithms dynamically adjust their parameters (e.g., mutation and crossover rates) during the evolution process to enhance convergence speed and maintain diversity.
• Key Characteristics:
  • Dynamic parameter tuning: Adapts mutation and crossover rates based on the performance of the current population.
  • Focus on diversity: Prevents premature convergence by adapting to the search space's characteristics.
  • Feedback-based adaptation: Uses feedback from fitness values to guide the adjustments.
• Advantages:
  • Reduces the need for manual parameter tuning.
  • Balances exploration (diversity) and exploitation (convergence) effectively.
• Applications:
  • Problems where the search space characteristics change during optimization.
  • Dynamic optimization problems, such as real-time system control.
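A sketch of feedback-based parameter adaptation: the mutation rate is raised when the population looks converged (small fitness spread) and lowered when it is still diverse. The thresholds and scaling factors are illustrative assumptions, not values from the slides:

def adapt_mutation_rate(rate, fitnesses, low=0.001, high=0.3):
    """Adjust the mutation rate using fitness feedback from the current population."""
    f_max = max(fitnesses)
    f_avg = sum(fitnesses) / len(fitnesses)
    spread = (f_max - f_avg) / (abs(f_max) + 1e-9)   # relative diversity measure
    if spread < 0.05:          # population nearly converged -> explore more
        rate *= 1.5
    elif spread > 0.25:        # population still diverse -> exploit more
        rate *= 0.75
    return min(max(rate, low), high)

# Example: a nearly converged population triggers a higher mutation rate
print(adapt_mutation_rate(0.01, [9.8, 9.9, 10.0, 9.9]))   # -> about 0.015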
Hybrid Genetic Algorithms (hGA)
• Concept: Hybrid Genetic Algorithms combine GAs with other optimization techniques (e.g., local search, simulated annealing) to leverage the strengths of each method.
• Key Characteristics:
  • Enhanced exploration and exploitation: GAs handle global exploration, while other methods refine local solutions.
  • Domain-specific customization: Integrates problem-specific heuristics or algorithms for better performance.
  • Modular design: Combines multiple optimization paradigms flexibly.
• Types of Hybrids:
  1. GA + Local Search (Memetic Algorithms): Incorporates a local search phase after the GA operators to fine-tune solutions.
  2. GA + Simulated Annealing: Combines the probabilistic nature of simulated annealing with the population-based approach of GAs.
  3. GA + Gradient Descent: Uses gradient-based methods for refining solutions generated by GAs.
• Applications:
  • Large-scale optimization problems, such as vehicle routing and resource allocation.
  • Scenarios requiring both high-quality solutions and computational efficiency.
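A sketch of the memetic (GA + local search) pattern listed above: after the usual crossover and mutation, each offspring is refined by a local search step before it joins the population. The one-bit hill climber used here is only one possible choice of local search:

import random

def local_search(chromosome, fitness, tries=10):
    """Hill climbing: accept single-bit flips that improve fitness."""
    best = chromosome[:]
    for _ in range(tries):
        candidate = best[:]
        i = random.randrange(len(candidate))
        candidate[i] = 1 - candidate[i]
        if fitness(candidate) > fitness(best):
            best = candidate
    return best

def memetic_step(offspring, fitness):
    """GA produces offspring; local search then refines each one (the 'memetic' part)."""
    return [local_search(child, fitness) for child in offspring]

# Usage with a toy OneMax fitness
refined = memetic_step([[0, 1, 0, 1], [1, 0, 0, 0]], fitness=sum)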
Parallel Genetic Algorithms (pGA)
• Concept: Parallel Genetic Algorithms distribute the computational workload across multiple processors to accelerate the optimization process and explore the search space more effectively.
• Key Characteristics:
  • Distributed population: Divides the population into smaller subpopulations (islands) that evolve independently.
  • Migration mechanism: Periodically exchanges individuals between subpopulations to maintain diversity.
• Parallelism Levels:
  • Coarse-grained parallelism: Subpopulations evolve in parallel on different processors.
  • Fine-grained parallelism: Each individual or group of genes is processed in parallel.
  • Master-slave parallelism: A central processor (master) manages fitness evaluation distributed to worker nodes (slaves).
• Advantages:
  • Significantly faster convergence, especially for computationally intensive problems.
  • Increased diversity due to isolated evolution in subpopulations.
  • Scalability with the number of processors.
• Applications:
  • High-dimensional optimization problems, such as protein folding and aerodynamic design.
  • Scenarios requiring rapid solution times, like real-time simulations.
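A sketch of the coarse-grained (island) model with ring migration: each island sends copies of its best individuals to the next island, which replaces its worst individuals with the newcomers. For brevity the islands are held in one process here; in a real pGA each island would evolve on its own processor. The migration interval, fitness (bit count), and sizes are illustrative assumptions:

import random

def migrate(islands, migrants=1):
    """Ring migration between islands: best of island i replaces worst of island i+1."""
    n = len(islands)
    for island in islands:
        island.sort(key=sum, reverse=True)            # toy fitness: number of 1-bits
    for i in range(n):
        incoming = islands[i][:migrants]              # copies of the best of island i
        target = islands[(i + 1) % n]
        target[-migrants:] = [ind[:] for ind in incoming]   # overwrite the worst of the next island
    return islands

# Four islands of random 8-bit individuals; in practice, migrate every few generations
islands = [[[random.randint(0, 1) for _ in range(8)] for _ in range(5)] for _ in range(4)]
islands = migrate(islands)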