Scoa U3

Uploaded by

jivan.karande21
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
11 views7 pages

Scoa U3

Uploaded by

jivan.karande21
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 7

1. What are the various optimization techniques to solve Soft Computing problems?

1. Simulated Annealing (SA): Inspired by the annealing process in metallurgy, where a material is heated and cooled to alter its structure for a stronger configuration.

● Working:
○ Starts with a random solution and a high "temperature" parameter.
○ A neighbor solution is generated by small modifications to the current solution.
○ If the neighbor solution is better, it is accepted. Otherwise, it is accepted probabilistically based on a function involving the temperature and the difference in solutions.
○ The temperature is gradually decreased, reducing the probability of accepting worse solutions over time.

2. Evolutionary Algorithms (EAs): Inspired by natural evolution processes like selection, mutation, and recombination.

● Working:
○ A population of candidate solutions (individuals) is initialized.
○ Each individual is evaluated using a fitness function.
○ The fittest individuals are selected to reproduce, generating new solutions through crossover and mutation.
○ The process repeats for several generations, gradually evolving better solutions.

3. Particle Swarm Optimization (PSO): Mimics the behavior of a flock of birds or fish searching for food.

● Working:
○ A swarm of particles represents candidate solutions in the search space.
○ Each particle has a position and velocity, which are updated based on:
■ Personal best position.
■ The global best position found by the swarm.
○ Over iterations, particles converge towards the best solution.

2. Differentiate between local and global optimization techniques.

Aspect | Local Optimization | Global Optimization
Objective | Finds the optimal solution within a small, localized region of the search space. | Searches the entire solution space to find the best possible solution.
Scope | Focuses on refining solutions near an initial guess. | Explores the search space to avoid getting trapped in local optima.
Efficiency | Faster and computationally less expensive. | Slower and computationally intensive due to a broader search.
Convergence | May converge to a local optimum, which is not necessarily the global best solution. | Converges to the global optimum, or a near-global solution, with higher certainty.
Techniques Used | Gradient Descent, Newton's Method. | GA, SA, PSO.
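
To make the contrast concrete, here is a small Python sketch (the test function, the starting point, and all parameter values are illustrative assumptions, not from the notes): plain gradient descent is a local technique and settles in whichever basin it starts in, while a crude global strategy such as random restarts samples the whole interval and is far more likely to reach the global minimum.

import random

def f(x):
    # Multimodal test function: a local minimum near x = +1
    # and the global minimum near x = -1.
    return (x**2 - 1)**2 + 0.3 * x

def grad(x):
    # Analytic derivative of f.
    return 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x0, lr=0.01, steps=2000):
    # Local optimization: follows the slope from a single start point.
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def random_restart_search(n_restarts=30):
    # A crude global strategy: run local descent from many random starts
    # and keep the best result found.
    best = None
    for _ in range(n_restarts):
        x = gradient_descent(random.uniform(-2.0, 2.0))
        if best is None or f(x) < f(best):
            best = x
    return best

if __name__ == "__main__":
    random.seed(0)
    local = gradient_descent(x0=2.0)        # starts in the wrong basin
    global_est = random_restart_search()
    print(f"local optimizer:  x = {local:.3f}, f(x) = {f(local):.3f}")
    print(f"global strategy:  x = {global_est:.3f}, f(x) = {f(global_est):.3f}")

Run as written, the single-start descent should settle near x ≈ +1 while the restart strategy reports the lower minimum near x ≈ -1, which is exactly the local-versus-global distinction in the table above.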
3. Write a Simulated Annealing Algorithm and discuss with examples.

Simulated Annealing (SA) is a probabilistic optimization technique inspired by the annealing process in metallurgy. The algorithm aims to find a good approximation of the global optimum of a function in a large search space.

Algorithm Steps:

Example: Traveling Salesman Problem (TSP): Find the shortest path that visits all cities exactly once and returns to the starting city.

1. Initialization:
○ Start with a random tour of cities as the initial solution.
○ Set a high initial temperature T.
2. Iterative Process:
○ Generate Neighbor Solution: Swap two cities in the current tour to create a new tour.
○ Evaluate Cost: Compute the total distance of the new tour and compare it with the current tour.
○ Acceptance Rule: Accept the new tour if it has a shorter distance, or with a probability based on T if it is longer.
○ Cool Down: Gradually reduce T using T = αT.
3. Termination: Stop when T is very low or after a fixed number of iterations.

Example Execution:

1. Initial tour: A → B → C → D → A, Cost = 100.
2. Neighbor tour: A → C → B → D → A, Cost = 95.
○ Accept because the cost is lower.
3. At a later stage: Neighbor tour cost = 105.
○ Accept with probability P = e^(−(105−95)/T) if a random number is less than P.
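
The same procedure in a short, self-contained Python sketch; the distance matrix, cooling schedule, and parameter values below are illustrative assumptions, not taken from the notes.

import math
import random

# Illustrative symmetric distance matrix for cities A, B, C, D.
CITIES = ["A", "B", "C", "D"]
DIST = {
    ("A", "B"): 20, ("A", "C"): 35, ("A", "D"): 30,
    ("B", "C"): 25, ("B", "D"): 34, ("C", "D"): 15,
}

def dist(a, b):
    return DIST.get((a, b)) or DIST[(b, a)]

def tour_cost(tour):
    # Total length of the closed tour (returns to the starting city).
    return sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

def neighbor(tour):
    # Generate a neighbor by swapping two randomly chosen cities.
    i, j = random.sample(range(len(tour)), 2)
    new = tour[:]
    new[i], new[j] = new[j], new[i]
    return new

def simulated_annealing(t_start=100.0, t_min=1e-3, alpha=0.95, moves_per_temp=50):
    current = CITIES[:]
    random.shuffle(current)               # random initial tour
    best = current[:]
    t = t_start
    while t > t_min:
        for _ in range(moves_per_temp):
            cand = neighbor(current)
            delta = tour_cost(cand) - tour_cost(current)
            # Accept better tours always; worse tours with probability e^(-delta/T).
            if delta < 0 or random.random() < math.exp(-delta / t):
                current = cand
                if tour_cost(current) < tour_cost(best):
                    best = current[:]
        t *= alpha                        # cool down: T = alpha * T
    return best, tour_cost(best)

if __name__ == "__main__":
    random.seed(1)
    tour, cost = simulated_annealing()
    print(" -> ".join(tour + [tour[0]]), "cost =", cost)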

4. “Simulated annealing is an algorithm that always accepts better moves but also accepts worse moves with some probability.” Justify with an example.

Simulated Annealing (SA) is an optimization algorithm that mimics the process of cooling in metals. It can accept both better solutions (to improve the result) and sometimes worse solutions (to explore more possibilities). This helps it avoid getting stuck in local minima and gives it a better chance of finding the global best solution.

How It Works:

1. If a better solution is found, it is always accepted.
2. If a worse solution is found, it is accepted with a certain probability. This probability depends on:
○ How much worse the solution is.
○ The current "temperature" of the system (higher temperature = more chance of accepting worse solutions).

Justification with Example: Suppose you are solving the Traveling Salesman Problem (TSP):

● Initial Path: A → B → C → D → A, Cost = 100.
● A better path (lower cost) is found: A → C → B → D → A, Cost = 95.
○ SA always accepts this path because it improves the solution.
● Later, a worse path is found: A → C → D → B → A, Cost = 105.
○ SA might accept this worse path with some probability, especially if the "temperature" is high, to explore other areas of the solution space.

By sometimes accepting worse solutions, SA avoids being trapped in a local minimum and increases its chances of finding the global minimum.

5. Draw and discuss the framework of the Simple Evolutionary System.

A Simple Evolutionary System typically involves the following key components: initial population, fitness evaluation, selection, reproduction, and mutation. These components are iteratively applied to evolve better solutions over generations (a minimal code sketch follows the component list below).

● Initial Population: A randomly initialized set of potential solutions.
● Fitness Evaluation: Each individual is evaluated using a fitness function to measure how good the solution is.
● Selection: The fittest individuals are selected to reproduce, ensuring that better solutions have a higher chance of surviving.
● Reproduction (Crossover & Mutation):
○ Crossover: Combines parts of two solutions to create offspring.
○ Mutation: Introduces small random changes to solutions to maintain diversity.
● New Population (Next Generation): The next generation is formed from the selected individuals and undergoes the process again, continuing over multiple generations until a stopping condition is met (e.g., reaching a satisfactory solution).
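
A minimal Python sketch of this framework, assuming a toy OneMax fitness function (maximize the number of 1-bits in a fixed-length bit string); the population size, mutation rate, and other values are illustrative.

import random

GENOME_LEN = 20
POP_SIZE = 30

def fitness(ind):
    # Toy objective: count of 1-bits (OneMax).
    return sum(ind)

def random_individual():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def select(pop):
    # Simple selection: pick the better of two random individuals.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # One-point crossover combines parts of two parents.
    cut = random.randint(1, GENOME_LEN - 1)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.05):
    # Flip each bit with a small probability to maintain diversity.
    return [1 - g if random.random() < rate else g for g in ind]

def evolve(generations=50):
    population = [random_individual() for _ in range(POP_SIZE)]   # initial population
    for _ in range(generations):
        offspring = []
        for _ in range(POP_SIZE):
            child = crossover(select(population), select(population))
            offspring.append(mutate(child))
        population = offspring                                     # next generation
    return max(population, key=fitness)

if __name__ == "__main__":
    random.seed(0)
    best = evolve()
    print("best fitness:", fitness(best), "of", GENOME_LEN)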
6. What is an Evolution Strategy? How do biologically inspired algorithms use evolution strategies?

Evolution Strategy (ES) is an optimization technique inspired by natural evolution, focusing on mutation and selection to improve solutions over generations. Unlike genetic algorithms, which emphasize crossover, ES primarily uses mutation to explore the solution space and improve candidate solutions.

Key Features:

● Population: A set of potential solutions.
● Fitness Evaluation: Measures how good each solution is.
● Mutation: Random changes to individuals' solutions.
● Selection: The best solutions are kept for the next generation.
● Step Size Control: Adjusts the mutation size for better exploration.

Biologically Inspired Algorithms Using Evolution Strategies:

Biologically inspired algorithms like Genetic Algorithms (GA) and Evolutionary Programming (EP), alongside Evolution Strategies, mimic processes found in nature such as natural selection and genetic evolution:

● Genetic Algorithms (GA): Use selection, crossover (recombination), and mutation to evolve solutions.
● Evolutionary Programming (EP): Primarily uses mutation and selection, similar to ES but with more focus on behavioral evolution.
● Differential Evolution (DE): A variation that uses difference vectors for mutation.
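
As a concrete illustration, here is a minimal (1+1) Evolution Strategy in Python that minimizes a simple sphere function; the step-size update is a simplified multiplicative scheme (grow on success, shrink on failure, with factors chosen so the long-run success rate sits near 1/5), and every value in it is an illustrative assumption.

import random

def sphere(x):
    # Simple continuous objective to minimize: f(x) = sum(x_i^2).
    return sum(v * v for v in x)

def one_plus_one_es(dim=5, sigma=1.0, iterations=500):
    parent = [random.uniform(-5, 5) for _ in range(dim)]
    for _ in range(iterations):
        # Mutation: add Gaussian noise with step size sigma to every coordinate.
        child = [v + random.gauss(0, sigma) for v in parent]
        if sphere(child) <= sphere(parent):
            parent = child          # selection: keep the better of parent and child
            sigma *= 1.5            # successful step -> enlarge the step size
        else:
            sigma *= 0.9            # failed step -> shrink the step size
    return parent, sphere(parent)

if __name__ == "__main__":
    random.seed(0)
    best, value = one_plus_one_es()
    print("best value:", round(value, 6))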

7. Discuss Evolutionary Systems as a Problem Solver.

Evolutionary Systems (or Evolutionary Algorithms) are a family of optimization techniques inspired by the process of natural evolution. They are used to find solutions to complex problems, especially those that are difficult to solve using traditional methods due to the problem's size, nonlinearity, or complexity. Evolutionary systems iteratively improve candidate solutions by mimicking the natural selection process, involving mechanisms like selection, mutation, crossover, and reproduction.

Key Features:

1. Population-based: Multiple solutions evolve together, allowing for a more robust search.
2. Exploration & Exploitation: Balances exploring new solutions and refining existing ones.
3. No Gradient Needed: Can solve problems where traditional methods like gradient descent don't work (e.g., non-differentiable functions).

Advantages:

● Robustness to noisy, dynamic data.
● Parallelism for faster computation.
● Flexibility across diverse problem types.

Challenges:

● Premature Convergence: Risk of getting stuck in local optima.
● Computationally Expensive: Requires many generations and evaluations.
● Parameter Sensitivity: Requires careful tuning of parameters.

8. What is Evolutionary Programming? Write an example.

Evolutionary Programming (EP) is a type of optimization algorithm inspired by biological evolution. It focuses on evolving solutions over time to solve complex problems. It uses techniques like mutation and selection to improve a population of candidate solutions until a good or optimal solution is found. Unlike Genetic Algorithms (GAs), EP mainly focuses on mutating solutions and does not use crossover.

How It Works:

1. Initialization: Start with a population of random solutions.
2. Fitness Evaluation: Measure how good each solution is.
3. Mutation: Modify each solution slightly to create a new solution.
4. Selection: Choose the best solutions.
5. Repeat: Continue mutating and selecting solutions for multiple generations until the problem is solved.

9. Write a pseudocode for evolutionary computation.

1. Initialize population P with random individuals
2. Evaluate the fitness of each individual in P
3. Repeat until the termination criteria are met:
   a. Selection: Select parents from population P using a method like tournament or roulette-wheel selection
   b. Crossover: Perform crossover on the selected parents to create new offspring
   c. Mutation: Introduce small random changes to the offspring
   d. Evaluate the fitness of the offspring
   e. Replacement: Replace the old population P, keeping the best individuals from both parents and offspring for the next generation
4. Return the best individual in the population as the solution.
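
A runnable Python rendering of this pseudocode on a toy problem (minimizing the sphere function over real-valued vectors); the concrete operator choices, tournament selection, arithmetic crossover, Gaussian mutation, and elitist replacement, are illustrative assumptions rather than the only valid ones.

import random

DIM, POP_SIZE, GENERATIONS = 5, 40, 100

def fitness(ind):
    # Lower is better here: sphere function sum(x_i^2).
    return sum(v * v for v in ind)

def random_individual():
    return [random.uniform(-5, 5) for _ in range(DIM)]

def tournament(pop, k=3):
    # Tournament selection: best of k random individuals.
    return min(random.sample(pop, k), key=fitness)

def crossover(p1, p2):
    # Arithmetic crossover: random blend of the two parents.
    w = random.random()
    return [w * a + (1 - w) * b for a, b in zip(p1, p2)]

def mutate(ind, sigma=0.3):
    # Gaussian mutation: small random change to each coordinate.
    return [v + random.gauss(0, sigma) for v in ind]

def evolutionary_computation():
    population = [random_individual() for _ in range(POP_SIZE)]       # step 1
    for _ in range(GENERATIONS):                                      # step 3
        offspring = []
        for _ in range(POP_SIZE):
            p1, p2 = tournament(population), tournament(population)   # 3a
            child = mutate(crossover(p1, p2))                         # 3b, 3c
            offspring.append(child)
        # 3d/3e: evaluate offspring and keep the best of parents + offspring.
        population = sorted(population + offspring, key=fitness)[:POP_SIZE]
    return min(population, key=fitness)                               # step 4

if __name__ == "__main__":
    random.seed(0)
    best = evolutionary_computation()
    print("best fitness:", round(fitness(best), 6))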

10. “Selection pressure and diversity of the population are very important for the success of EA.” Justify giving examples.

1. Selection Pressure: The intensity with which better-performing individuals are chosen to pass their traits to the next generation.
○ Why Important: Ensures the algorithm focuses on high-quality solutions, speeding up convergence toward optimal solutions.
○ Example:
■ High Selection Pressure: Quickly eliminates poor solutions, but risks premature convergence to local optima.
■ Low Selection Pressure: Maintains diversity but slows convergence.
○ Example in Action: In a TSP, high selection pressure rapidly identifies promising routes, but too much pressure might miss better global solutions.
2. Population Diversity: The variety of solutions within the population.
○ Why Important: Prevents the algorithm from getting stuck in local optima and ensures exploration of the entire search space.
○ Example:
■ High Diversity: Promotes exploration of new areas but may delay convergence.
■ Low Diversity: Speeds up exploitation but risks stagnation.
○ Example in Action: In Neural Network Optimization, maintaining diverse populations ensures different network architectures are explored, avoiding suboptimal solutions.
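
One way to see selection pressure as a tunable knob is tournament size. The sketch below (all values illustrative) repeatedly selects from a random population with different tournament sizes and reports the mean fitness of the selected parents; larger tournaments pick stronger parents more often, which speeds convergence but erodes diversity.

import random
import statistics

def tournament_pick(fitnesses, k):
    # Higher k = higher selection pressure: the winner of a larger
    # tournament is more likely to be one of the very best individuals.
    contestants = random.sample(fitnesses, k)
    return max(contestants)

def mean_selected_fitness(fitnesses, k, trials=10000):
    return statistics.mean(tournament_pick(fitnesses, k) for _ in range(trials))

if __name__ == "__main__":
    random.seed(0)
    population_fitness = [random.random() for _ in range(100)]
    for k in (2, 5, 10):
        print(f"tournament size {k:2d}: mean selected fitness "
              f"{mean_selected_fitness(population_fitness, k):.3f}")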

11. Discuss an operation represented in the following diagram for Genetic Programming.

[Diagram: an expression tree with a marked mutation point, where the subtree below that point is replaced by a new randomly generated subtree.]

The diagram represents an operation in Genetic Programming (GP), specifically the mutation operation.

In Genetic Programming, solutions are represented as expression trees, with nodes as functions/operators and leaves as constants/variables. Mutation introduces variability by modifying parts of these trees, helping explore new solutions.

1. Mutation Point: The diagram shows a mutation point on the tree. This is where the mutation will occur. A mutation point is typically selected randomly in the existing tree structure.
2. Sub-tree Replacement: In GP, mutation replaces the subtree at the mutation point with a randomly generated subtree. For example, if the node "3" is the mutation point, its subtree (e.g., x and y) is replaced with a new subtree (e.g., x, y, and the constant 2). This introduces diversity into the population.
3. Purpose: In Genetic Programming, mutation introduces new genetic material to maintain diversity, preventing premature convergence. It helps explore new solution areas that crossover and selection might miss.

Mutation in Genetic Programming:

● Exploration of Solution Space: Alters program trees to discover new possibilities.
● Maintaining Genetic Diversity: Prevents the population from becoming homogeneous, avoiding local optima.
● Random Sub-tree Generation: Introduces new subtrees to explore unexplored solution regions.
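
A minimal Python sketch of subtree mutation on expression trees encoded as nested tuples; the function set (+, *), the terminal set (x, y, small constants), and the depth limit are illustrative assumptions.

import random

FUNCTIONS = ["+", "*"]           # internal nodes
TERMINALS = ["x", "y", 1, 2, 3]  # leaves: variables and constants

def random_tree(depth=2):
    # Grow a random expression tree: (op, left, right) tuples or terminal leaves.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(FUNCTIONS)
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def count_nodes(tree):
    if not isinstance(tree, tuple):
        return 1
    return 1 + count_nodes(tree[1]) + count_nodes(tree[2])

def mutate_subtree(tree, target, counter=None):
    # Replace the node whose preorder index equals `target` with a fresh random subtree.
    counter = counter if counter is not None else [0]
    idx = counter[0]
    counter[0] += 1
    if idx == target:
        return random_tree(depth=2)          # new randomly generated subtree
    if not isinstance(tree, tuple):
        return tree
    op, left, right = tree
    return (op, mutate_subtree(left, target, counter),
            mutate_subtree(right, target, counter))

def mutate(tree):
    # Pick a random mutation point (preorder index) and replace its subtree.
    point = random.randrange(count_nodes(tree))
    return mutate_subtree(tree, point)

if __name__ == "__main__":
    random.seed(4)
    parent = ("+", ("*", "x", "y"), 3)       # represents x*y + 3
    print("before:", parent)
    print("after :", mutate(parent))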

12. “In the Evolutionary system, finite state machines are used to solve a prediction problem.” Justify giving examples.

Finite State Machines (FSMs) are computational models that represent systems as a finite number of states and transitions. In evolutionary systems, FSMs can evolve through techniques like Genetic Algorithms (GAs) to address prediction problems. They are particularly useful because they can model sequential or temporal processes effectively.
FSMs are used in prediction problems because:

● Efficient Representation: They model systems with finite states and transitions simply.
● Evolvability: Can be optimized with evolutionary techniques like mutation and crossover.
● Sequential Dependencies: Ideal for sequence-based data like time-series or patterns.

Examples of FSMs in Prediction Problems:

1. Weather Prediction: States = weather conditions (e.g., sunny, rainy), transitions = weather changes.
2. Stock Market Trends: States = stock movements (up, down, stable), transitions = market shifts.
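
A compact Python sketch of the idea, using mutation plus selection only (in the spirit of Evolutionary Programming): an FSM is encoded as a transition/prediction table and evolved to predict the next symbol of a repeating binary sequence. The encoding, the toy sequence, and all parameters are illustrative assumptions.

import random

SEQUENCE = [0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # toy periodic signal to predict
N_STATES = 3

def random_fsm():
    # For each (state, input symbol): (next state, predicted next symbol).
    return {(s, x): (random.randrange(N_STATES), random.randrange(2))
            for s in range(N_STATES) for x in (0, 1)}

def fitness(fsm):
    # Run the FSM over the sequence and count correct one-step predictions.
    state, correct = 0, 0
    for i in range(len(SEQUENCE) - 1):
        state, prediction = fsm[(state, SEQUENCE[i])]
        if prediction == SEQUENCE[i + 1]:
            correct += 1
    return correct

def mutate(fsm):
    # Change the next-state or the prediction of one randomly chosen table entry.
    child = dict(fsm)
    key = random.choice(list(child))
    nxt, out = child[key]
    if random.random() < 0.5:
        nxt = random.randrange(N_STATES)
    else:
        out = 1 - out
    child[key] = (nxt, out)
    return child

def evolve_fsm(generations=300):
    best = random_fsm()
    for _ in range(generations):
        child = mutate(best)
        if fitness(child) >= fitness(best):   # mutation + selection, no crossover
            best = child
    return best

if __name__ == "__main__":
    random.seed(2)
    fsm = evolve_fsm()
    print("correct predictions:", fitness(fsm), "out of", len(SEQUENCE) - 1)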
13. “Evolutionary Computations are used for complex optimization problems.” Justify with examples.

Evolutionary Computation methods are particularly effective for complex optimization problems due to their ability to explore large and non-linear search spaces. These problems often have no straightforward mathematical solution, making traditional methods inefficient.

1. Global Search: EC explores the entire search space, avoiding local optima.
2. Adaptability: It adapts to dynamic and multi-objective problems.
3. No Assumptions: Does not require gradient information or problem linearity.

Examples:

1. Traveling Salesman Problem (TSP): Find the shortest route connecting multiple cities.
○ Why EC? Genetic Algorithms (GAs) efficiently evolve routes using crossover and mutation.
2. Neural Network Optimization: Optimizing weights and hyperparameters in machine learning models.
○ Why EC? Particle Swarm Optimization (PSO) and Differential Evolution (DE) adjust weights iteratively for better accuracy.

14. What is Evolutionary Computing? Perform an analysis of different strategies for evolutionary computing, giving examples.

Evolutionary Computing is a problem-solving method inspired by natural evolution. It uses techniques like selection, mutation, and crossover to evolve solutions for optimization problems.

Strategies in Evolutionary Computing:

1. Genetic Algorithms (GAs): Solutions are like "chromosomes" that evolve using crossover and mutation.
○ Example: Finding the shortest path in a Traveling Salesman Problem (TSP).
○ Pro: Handles large search spaces well.
○ Con: May get stuck in local optima.
2. Genetic Programming (GP): Evolves computer programs or expressions.
○ Example: Creating a formula to fit data.
○ Pro: Good for symbolic problems.
○ Con: Needs high computation.
3. Evolution Strategies (ES): Optimizes real-valued problems by adapting parameters.
○ Example: Improving mechanical designs.
○ Pro: Great for continuous optimization.
○ Con: Not ideal for discrete problems.
4. Particle Swarm Optimization (PSO): Simulates the social behavior of particles to find solutions (a small code sketch follows this list).
○ Example: Optimizing neural network weights.
○ Pro: Fast and easy to use.
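
To illustrate strategy 4, here is a minimal global-best PSO in Python minimizing a sphere function; the inertia and acceleration coefficients are common textbook-style defaults, used here only as illustrative assumptions.

import random

DIM, SWARM_SIZE, ITERATIONS = 5, 20, 200
W, C1, C2 = 0.7, 1.5, 1.5     # inertia, cognitive and social coefficients

def cost(x):
    # Objective to minimize: sphere function.
    return sum(v * v for v in x)

def pso():
    # Initialize particle positions and velocities.
    pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM_SIZE)]
    vel = [[0.0] * DIM for _ in range(SWARM_SIZE)]
    pbest = [p[:] for p in pos]                        # personal best positions
    gbest = min(pbest, key=cost)[:]                    # global best position

    for _ in range(ITERATIONS):
        for i in range(SWARM_SIZE):
            for d in range(DIM):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + pull toward personal best + pull toward global best.
                vel[i][d] = (W * vel[i][d]
                             + C1 * r1 * (pbest[i][d] - pos[i][d])
                             + C2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

if __name__ == "__main__":
    random.seed(0)
    best = pso()
    print("best cost:", round(cost(best), 6))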
