BID Module4
Uploaded by Nitin N Raikar

BIO INSPIRED DESIGN AND INNOVATION

MODULE 4

BIO COMPUTING AND OPTIMISATION:

Bio-inspired computing sits at the intersection of computer science, mathematics, and biology. Bio-inspired optimization algorithms form an emerging approach that draws on the principles of biological evolution to develop new, robust, and competitive techniques. In recent years these algorithms have become prominent in machine learning for finding optimal solutions to complex problems in science and engineering. Such problems are usually nonlinear and subject to multiple nonlinear constraints, which introduce difficulties such as high dimensionality and long run times. To overcome the shortcomings of traditional optimization algorithms, the recent trend is to apply bio-inspired methods, which offer a promising approach to solving complex optimization problems. The topics covered in this module are: the No Free Lunch Theorem, the Bat Algorithm, the Flower Pollination Algorithm, the Genetic Algorithm (crossover and mutation operations), bio-inspired optimization, Ant Colony Optimization (ACO), and swarm intelligence with Particle Swarm Optimization (PSO).

No Free Lunch Theorem (NFL):

In mathematical folklore, the "no free lunch" (NFL) theorem (sometimes pluralized) of David
Wolpert and William Macready, alludes to the saying "no such thing as a free lunch", that is,
there are no easy shortcuts to success. It appeared in the 1997 paper "No Free Lunch Theorems for Optimization"; Wolpert had previously derived no-free-lunch theorems for machine learning (statistical inference). The NFL theorem as commonly stated is an easily understood consequence of the theorems Wolpert and Macready actually prove; it is weaker than those theorems and thus does not encapsulate them. Various investigators have since extended their work substantively. As a research area, "no free lunch in search and optimization" is dedicated to the mathematical analysis of how algorithm performance averages out across classes of problems, particularly in search and optimization.

• NFLT states that no single optimization algorithm is universally superior: performance is always context-dependent, so no algorithm can best solve all types of optimization problems.
• NFLT originated in computer science but is directly relevant to biology-inspired design: understanding the limits of optimization informs how we draw inspiration from nature.
• In the realm of biology, this means no single design solution works for all environments; the diversity of biological systems challenges any one-size-fits-all approach.
• NFLT highlights the importance of adapting design principles to the unique challenges presented by biological systems, and underscores the need for a diverse toolbox of design strategies to address the complexity of natural systems.

Applying a single optimization strategy to all design problems is unrealistic. Biological systems showcase a variety of adaptive strategies, emphasizing the need for context-specific design in biomimicry. NFLT encourages a nuanced understanding of the complexity of natural systems, and embracing its principles prompts designers to explore unconventional solutions that traditional approaches might overlook.

The NFL theorems are a set of mathematical proofs and a general framework exploring the connection between general-purpose, "black-box" algorithms and the problems they solve.
The No Free Lunch Theorems state that no single algorithm searching for an optimal cost or fitness solution is universally superior to any other.
This is due to the huge potential problem space over which a general-purpose algorithm may be applied: if an algorithm is particularly adept at solving one class of problems, and the fitness surfaces that come with it, then it must perform worse on average over the remaining problems. As Wolpert and Macready wrote in "No Free Lunch Theorems for Optimization":

"If an algorithm performs better than random search on some class of problems then it must perform worse than random search on the remaining problems."

In short, because of the NFL proof we can state: a universally superior black-box optimization strategy is impossible.

Bat Algorithm:
The Bat algorithm is a population-based metaheuristics algorithm for solving continuous
optimization problems. It’s been used to optimize solutions in cloud computing, feature
selection, image processing, and control engineering problems.
Metaheuristics are similar to heuristics, and they seek promising solutions to problems.
However, metaheuristics are generic and can deal with a variety of problems. Generally,
metaheuristic algorithms are designed for global optimization.
Yang proposed the Bat algorithm in 2010. The basic Bat algorithm is bio-inspired: it is based on the characteristics of bat bio-sonar, or echolocation. Bats in the wild emit ultrasonic waves into the environment to aid hunting and navigation; after emitting these waves, a bat receives their echoes and uses them to locate itself and identify obstacles.
It’s illustrated in the following figure:

The flowchart of the Bat algorithm is given as follows:


We first initialize the bat population and define the pulse frequencies; we then set the maximum number of iterations and initialize the pulse rates and loudness. At each iteration, new candidate solutions are generated by adjusting frequencies and updating velocities and positions, and a local search around the current best solution may also be performed. If a candidate is better, it is accepted and the bat's pulse rate and loudness are updated. The best solution found so far is recorded, and the loop repeats until the maximum number of iterations is reached.
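The loop described above can be sketched in Python. This is a minimal illustration of the basic Bat algorithm, not a reference implementation: the sphere objective, the bounds, and the parameter values (frequency range, loudness decay alpha, pulse-rate constant gamma) are illustrative assumptions.

```python
import math
import random

def bat_algorithm(objective, dim=2, n_bats=20, n_iter=100,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9,
                  lower=-5.0, upper=5.0):
    """Minimise `objective` with a basic Bat algorithm (after Yang, 2010)."""
    # Initialise the bat population, velocities, loudness and pulse rates.
    pos = [[random.uniform(lower, upper) for _ in range(dim)]
           for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    loud = [1.0] * n_bats    # loudness A_i
    rate = [0.5] * n_bats    # pulse emission rate r_i
    best = min(pos, key=objective)[:]
    for t in range(n_iter):
        for i in range(n_bats):
            # Adjust frequency, then update velocity and position.
            freq = f_min + (f_max - f_min) * random.random()
            vel[i] = [v + (x - b) * freq
                      for v, x, b in zip(vel[i], pos[i], best)]
            cand = [min(max(x + v, lower), upper)
                    for x, v in zip(pos[i], vel[i])]
            # Local random walk around the best solution.
            if random.random() > rate[i]:
                avg_loud = sum(loud) / n_bats
                cand = [min(max(b + 0.01 * avg_loud * random.gauss(0, 1),
                                lower), upper) for b in best]
            # Accept improvements while the bat is still "loud" enough,
            # then decrease loudness and increase the pulse rate.
            if objective(cand) < objective(pos[i]) and random.random() < loud[i]:
                pos[i] = cand
                loud[i] *= alpha
                rate[i] = 0.5 * (1 - math.exp(-gamma * t))
            if objective(pos[i]) < objective(best):
                best = pos[i][:]
    return best

sphere = lambda x: sum(v * v for v in x)  # toy objective, minimum at origin
best = bat_algorithm(sphere)
```

Because the search is stochastic, each run returns a slightly different point, but the best solution steadily approaches the origin for this objective.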
Applications:
• Feature selection is a data pre-processing method used in classification. The main goal
is to minimize the number of characteristics while improving classification
performance. The binary vector solves the problem of which features to select or not
select in a specific problem. In this case, 0 indicates no selection, and 1 indicates that
the feature for the provided data should be chosen.
• In digital image processing, thresholding is the easiest way to segment images. The
Bat algorithm is used to perform multilevel image thresholding in this context.
• The Bat algorithm has been applied to the brushless DC wheel motor problem in order to optimize its parameters.
• Load frequency control using the Bat algorithm is reported in the literature, where it is used to tune PI controllers for interconnected power systems.
• The Bat algorithm has been applied to optimize fuel loading in reactor cores.
• In medical imaging for diagnosis and treatment, and for ultrasound techniques.
• Bio-inspired technology in robotics – biomimetic drones and autonomous navigation.
• Bat wings and Aerodynamics – Flight efficiency, Wind turbines and fans, and noise
reduction.
Limitations:
• The time complexity of metaheuristic algorithms that search for global optima cannot be determined in general: since these algorithms do not guarantee finding the global optimum within a given time limit, a meaningful time-complexity bound is unavailable.
• Instead, the best way to compare metaheuristics is to measure their time and space requirements using computational complexity theory; the computational complexity of an algorithm refers to the amount of resources required to run it.
Flower Pollination Algorithm:
The Flower Pollination Algorithm is a recently developed nature-inspired algorithm based on the pollination process of plants, principally used to solve constrained and unconstrained optimization problems.

There are over a quarter of a million species of flowering plants in nature, and flowering species make up about 80% of all plant species. The main purpose of a flower is ultimately reproduction via pollination. The pollination process is associated with the transfer of pollen by pollinators such as insects, birds, and bats.

There are two major processes for transferring the pollen

• Biotic and cross-pollination process


• Abiotic and self-pollination process

The Flower Pollination Algorithm (FPA) is a nature-inspired, population-based algorithm proposed by Xin-She Yang (2012). The main objective of flower pollination is the optimal reproduction of plants through survival of the fittest flowers; in effect, it is an optimization process within plant species. FPA is a swarm-based optimization technique that has attracted the attention of many researchers in several optimization fields due to its impressive characteristics: it has very few parameters and has shown robust performance across a variety of optimization problems. In addition, FPA is a flexible, adaptable, scalable, and simple optimization method. Compared with other metaheuristic algorithms, FPA therefore shows good results on various real-life optimization problems from different domains, such as electrical and power systems, signal and image processing, wireless sensor networking, clustering and classification, global function optimization, computer gaming, and structural and mechanical engineering optimization. FPA is initiated with a set of provisional or random solutions. At each iteration, one of two operators is applied to each population member: the local pollination operator or the global pollination operator. In local pollination, the decision variables of the current solution are attracted towards two randomly selected solutions from the population. In global pollination, the decision variables of the current solution are attracted towards the globally best solution found so far. A switch operator decides whether the improvement step is local or global. This process repeats until a predefined stopping criterion is met. The original FPA framework has since undergone modification and hybridization to improve its performance on different types of problem landscapes. For example, FPA has been investigated for determining the optimal loading of generators in power systems; simulation results for small and large-scale power systems, considering the valve loading effect, indicate the robustness of FPA.

Basic Steps for Flower Pollination Algorithm (FPA):

i) Input population: The main purpose of a flower is ultimately reproduction via pollination. Flower pollination is typically associated with the transfer of pollen, which is often carried out by pollinators such as birds and insects. Indeed, some flowers and insects have a highly specialized flower-pollinator partnership, where a flower can only attract a specific species of insect or bird for effective pollination.
ii) Find the current best solution: Pollination appears in two major forms: abiotic and biotic. About 90% of flowering plants depend on biotic pollination, in which the pollen is transferred by pollinators. The remaining 10% follow the abiotic form, which does not require any pollinators; wind and diffusion carry the pollen instead.
iii) Global pollination using Lévy flights: Cross-pollination, or allogamy, means pollination with pollen from a flower of a different plant, while self-pollination is the fertilization of one flower (such as peach flowers) with pollen from the same flower or other flowers of the same plant, which often occurs when no reliable pollinator is available. Biotic cross-pollination may occur over long distances, since pollinators such as bees, bats, birds, and flies can fly far; it can therefore be considered global pollination. In addition, bees and birds may exhibit Lévy flight behaviour, with jump or flight distances obeying a Lévy distribution.
iv) Evaluate the new solution: To generate updating formulas, the above rules have to be converted into proper updating equations. For example, at the global pollination step the pollinators carry the flower pollen gametes, so the pollen can travel over long distances because of the insects' ability to fly and move in much longer ranges.
v) Output the best solution: In the power-system case study, the fuel cost obtained by the proposed FPA is better than that of other algorithms; statistical comparisons between FPA and other algorithms are reported in terms of best, mean, and worst cost and computational (CPU) time over 50 trials. The best output solutions are then reported.
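The steps above can be sketched as a short Python loop. This is a minimal illustration under stated assumptions: the sphere objective, the ±5 bounds, the switch probability of 0.8, and the use of Mantegna's algorithm for Lévy steps are all illustrative choices.

```python
import math
import random

def levy_step(beta=1.5):
    """One step from a Lévy distribution via Mantegna's algorithm."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u, v = random.gauss(0, sigma), random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def flower_pollination(objective, dim=2, n_flowers=25, n_iter=200,
                       p_switch=0.8, lower=-5.0, upper=5.0):
    """Minimise `objective` with the basic FPA (after Yang, 2012)."""
    pop = [[random.uniform(lower, upper) for _ in range(dim)]
           for _ in range(n_flowers)]
    best = min(pop, key=objective)[:]
    for _ in range(n_iter):
        for i in range(n_flowers):
            if random.random() < p_switch:
                # Global pollination: a Lévy flight towards the best flower.
                cand = [x + levy_step() * (b - x)
                        for x, b in zip(pop[i], best)]
            else:
                # Local pollination: mix with two random flowers.
                j, k = random.sample(range(n_flowers), 2)
                eps = random.random()
                cand = [x + eps * (pj - pk)
                        for x, pj, pk in zip(pop[i], pop[j], pop[k])]
            cand = [min(max(c, lower), upper) for c in cand]
            # Greedy acceptance: keep the new flower only if it is fitter.
            if objective(cand) < objective(pop[i]):
                pop[i] = cand
                if objective(cand) < objective(best):
                    best = cand[:]
    return best

sphere = lambda x: sum(v * v for v in x)  # toy objective, minimum at origin
best = flower_pollination(sphere)
```

The heavy-tailed Lévy steps occasionally make long jumps (exploration) while mostly taking small steps towards the best flower (exploitation), which is the balance the switch probability controls.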
Applications of Flower Pollination Algorithm

• Power and energy


• Signal and image processing
• Structural design
• Clustering and feature selection
• Computer gaming
• Wireless sensor networking

Advantages of Flower Pollination Algorithm

• FPA variants have been developed by modification, hybridization, and parameter tuning to cope with the complex nature of optimization problems.
• Flower algorithm is used to solve a nonlinear design benchmark, which shows the
convergence rate is almost exponential.
• The flower algorithm is more efficient than both GA and PSO.
• The Flower Pollination Algorithm is recommended as the fastest and the most accurate
optimization technique for the optimal parameters extraction process.
• The metaheuristic algorithms can be used as an alternative tool to find near-optimal
solutions. Thus, inventing new metaheuristic techniques and enhancing the current
algorithms is necessary.
• Flower Pollination Algorithm (FPA) is used to solve ELD and CEED problems in
power systems.

Genetic Algorithm - Crossover and Mutation Operations.

Genetic algorithms are iterative optimization techniques that simulate natural selection to find
optimal solutions.

Genetic algorithms are defined as a type of computational optimization technique inspired by the principles of natural selection and genetics. They are used to solve complex problems by mimicking the process of evolution to iteratively improve a population of potential solutions.

Crossover, or recombination, is a genetic operator in which two selected individuals exchange genetic information to create offspring. This operation is analogous to sexual reproduction, where genetic material from both parents is combined to produce genetically diverse offspring.

i) Single-point crossover: A crossover point on the parent chromosome strings is selected, and all data beyond that point is swapped between the two parents. Single-point crossover is characterized by positional bias.
ii) Two-point crossover: A special case of N-point crossover. Two random points are chosen on the individual chromosomes (strings), and the genetic material lying between these points is exchanged.

Mutation introduces small random changes in the genetic information of selected individuals.
This operation helps maintain genetic diversity within the population, allowing exploration of
different regions of the solution space.
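These operators can be illustrated with short, self-contained helpers for binary chromosomes; the chromosome length and mutation rate below are arbitrary illustrative choices.

```python
import random

def single_point_crossover(p1, p2):
    """Swap all genes beyond one randomly chosen crossover point."""
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def two_point_crossover(p1, p2):
    """Exchange the genetic material lying between two random points."""
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return (p1[:a] + p2[a:b] + p1[b:],
            p2[:a] + p1[a:b] + p2[b:])

def bit_flip_mutation(chrom, rate=0.1):
    """Flip each bit independently with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in chrom]

# Crossing an all-zeros parent with an all-ones parent makes the
# exchanged segments easy to see in the two offspring.
c1, c2 = single_point_crossover([0] * 8, [1] * 8)
```

Note that crossover only rearranges the parents' genes (the total number of ones across both offspring is unchanged), while mutation is what introduces genuinely new gene values.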

How the genetic algorithm works:

Let’s consider an example that involves optimizing a common task: finding the best route to
commute from home to work.
Imagine you want to optimize your daily commute and find the shortest route to go from your
home to your workplace. You have multiple possible routes to choose from, each with different
distances, traffic conditions, and travel times. You can use a GA to help you find the optimal
route.

1. Encoding the solutions

In this case, potential solutions can be encoded as permutations of the cities or locations along
the commute route. For example, you can represent each possible route as a string of city
identifiers, such as “A-B-C-D-E-F,” where each letter represents a location (e.g., a street,
intersection, or landmark).
2. Initialization

Start by creating an initial population of potential routes. You can randomly generate a set of
routes or use existing routes as a starting point.

3. Evaluation

Evaluate each route in the population by considering factors such as distance, traffic conditions,
travel time, and other relevant criteria. The evaluation function should quantify the quality of
each route, where lower values indicate better solutions (e.g., shorter distance, less time spent
in traffic).

4. Selection

Perform a selection process to choose which routes will be part of the next generation. Selection
methods aim to favour fitter individuals, in this case, routes with lower evaluation values.
Common selection techniques include tournament selection, roulette wheel selection, or rank-
based selection.

5. Crossover

Apply crossover to create new routes by combining genetic material from two parent routes.
For instance, you can select two parent routes and exchange segments of the routes to create
two new offspring routes.

6. Mutation

Introduce random changes in the routes through mutation. This helps explore new possibilities
and avoid getting stuck in local optima. A mutation operation could involve randomly
swapping two cities in a route, inserting a new city, or randomly changing the order of a few
cities.

7. New generation

The offspring generated through crossover and mutation and a few fittest individuals from the
previous generation form the new population for the next iteration. This ensures that good
solutions are preserved and carried forward.

8. Termination

The GA continues the selection, crossover, and mutation process for a fixed number of
generations or until a termination criterion is met. Termination criteria can be a maximum
number of iterations or reaching a satisfactory solution (e.g., a route with a predefined low
evaluation value).

9. Final solution

Once the GA terminates, the best solution, typically the route with the lowest evaluation value,
represents the optimal or near-optimal route for your daily commute.
By iteratively applying selection, crossover, and mutation, GAs help explore and evolve the
population of routes, gradually converging toward the shortest and most efficient route for your
daily commute.
It’s important to note that GAs require appropriate parameter settings, such as population size,
selection strategy, crossover and mutation rates, and termination criteria, to balance exploration
and exploitation.

Applications of Genetic Algorithms:

1. Optimization problems: GAs excel at solving optimization problems, aiming to find the
best solution among a large set of possibilities. These problems include mathematical function
optimization, parameter tuning, portfolio optimization, resource allocation, and more. GAs
explore the solution space by enabling the evolution of a population of candidate solutions
using genetic operators such as selection, crossover, and mutation, gradually converging
towards an optimal or close-to-optimal solution.

2. Combinatorial optimization: GAs effectively solve combinatorial optimization problems, which involve finding the best arrangement or combination of elements from a finite set.
Examples include the traveling salesman problem (TSP), vehicle routing problem (VRP), job
scheduling, bin packing, and DNA sequence alignment. GAs represent potential solutions as
chromosomes, and through the process of evolution, they search for the optimal combination
of elements.

3. Machine learning: GAs have applications in machine learning, particularly to optimize the
configuration and parameters of machine learning models. GAs can be used to optimize
hyperparameters, such as learning rate, regularization parameters, and network architectures
in neural networks. They can also be employed for feature selection, where the algorithm
evolves a population of feature subsets to identify the most relevant subset for a given task.

4. Evolutionary robotics: GAs find use in evolutionary robotics, which involves evolving
robot behavior and control strategies. By representing the robot’s control parameters or policies
as chromosomes, GAs can evolve solutions that maximize performance metrics such as speed,
stability, energy efficiency, or adaptability. GAs are particularly useful when the optimal
control strategies are difficult to determine analytically.

5. Image and signal processing: GAs are applied in image and signal processing tasks,
including image reconstruction, denoising, feature extraction, and pattern recognition. They
can optimize the parameters of reconstruction algorithms to enhance image quality. In signal
processing, they can optimize filtering parameters for denoising signals while preserving
important information. GAs can also be used for automatic feature extraction, evolving feature
extraction algorithms to identify relevant features/objects in images or signals.

6. Design and creativity: GAs have been used for design and creativity tasks, such as
generating artistic designs, music composition, and game design. By representing design
elements or musical notes as genes, GAs can evolve populations of designs or compositions
and evaluate their quality using fitness functions tailored to the specific domain. GAs have
demonstrated the ability to generate novel and innovative solutions in creative domains.

7. Financial modelling: GAs are applied in financial modeling for portfolio optimization,
algorithmic trading, and risk management tasks. GAs can optimize the allocation of assets in
an investment portfolio to maximize returns and minimize risk. They can also evolve trading
strategies by adjusting trading parameters to adapt to market conditions and maximize profits.
GAs provide a flexible and adaptive approach to modeling complex financial systems.

These applications demonstrate the versatility and effectiveness of genetic algorithms in solving optimization and search problems across various domains. The ability of GAs to explore the solution space, handle constraints, and adaptively evolve solutions makes them a valuable tool for tackling complex real-world problems.

Examples of Genetic Algorithms:

Companies across various industries have used genetic algorithms to tackle a range of
challenges. Here are a few recent noteworthy examples of GA:

1. Google’s DeepMind
2. Tesla’s self-driving tasks
3. Amazon’s logistics operations
4. Autodesk’s design optimization
5. Uber’s evolutionary optimizer
6. Boeing’s wing design optimization
7. Ford’s vehicle routing optimization
8. Siemens’ manufacturing process optimization
9. NVIDIA’s GPU architecture optimization
10. Toyota’s supply chain optimization

Bio-Inspired Optimisation:

Bio-inspired optimization techniques (BOTs) are part of intelligent computing. Many BOTs exist, and new ones continue to emerge in this era of Industry 4.0. Genetic algorithms, particle swarm optimization, artificial bee colony, and grey wolf optimization are the techniques most explored by researchers in the field of food processing technology, although other methods may also efficiently solve optimization problems in the food industry.

Ant Colony Optimisation (ACO):

The Ant Colony Optimization technique is inspired by the foraging behaviour of ant colonies and was first introduced by Marco Dorigo in the 1990s. Ants are eusocial insects that favour the survival and sustenance of the community over that of the individual.
Ant Colony Optimization (ACO) is a metaheuristic algorithm inspired by the foraging
behaviour of ants.
The inspiring source of ACO is the foraging behaviour of real ants. An ant deposits a chemical pheromone trail on the ground as it forages; the quantity of pheromone deposited depends on the quantity and quality of the food, and it guides other ants to the food source. Through this indirect communication via pheromone trails, the ants find the shortest paths between their nest and food sources. The figure below shows the operators, control parameters, and application areas of the ant colony optimization algorithm. Pheromone trail update and pheromone evaporation are the operators of ACO; its control parameters include the number of ants and the number of iterations used in the algorithm. Some application areas where ACO can be used are shown in the figure.

Ant Colony Optimization (ACO) Algorithm

How ACO works:

i) Foraging behaviour: ACO algorithms simulate the foraging behaviour of ants to solve optimization problems. Ants deposit pheromones as they search for food, and these pheromones influence other ants' decisions.
ii) Exploration and exploitation: ACO strikes a balance between exploration and exploitation to find optimal solutions.
iii) Global optimization: By utilizing the collective knowledge of the ant colony, the algorithm focuses on finding a globally best solution rather than getting stuck in local optima.
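The pheromone-guided construction and update steps can be sketched on a toy four-city travelling-salesman instance. The distance matrix and the parameter values (alpha, beta, evaporation rate rho, deposit constant Q) are illustrative assumptions, not recommended settings.

```python
import random

# Hypothetical symmetric distances between four cities.
DIST = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
N = len(DIST)

def tour_length(tour):
    """Length of the closed tour visiting every city once."""
    return sum(DIST[tour[i]][tour[(i + 1) % N]] for i in range(N))

def ant_colony(n_ants=10, n_iter=50, alpha=1.0, beta=2.0, rho=0.5, Q=10.0):
    """Basic Dorigo-style ACO for the small TSP instance above."""
    tau = [[1.0] * N for _ in range(N)]     # pheromone on each edge
    best = list(range(N))
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour = [random.randrange(N)]
            while len(tour) < N:
                i = tour[-1]
                choices = [j for j in range(N) if j not in tour]
                # Edge probability ∝ pheromone^alpha * (1/distance)^beta.
                weights = [tau[i][j] ** alpha * (1.0 / DIST[i][j]) ** beta
                           for j in choices]
                tour.append(random.choices(choices, weights=weights)[0])
            tours.append(tour)
            if tour_length(tour) < tour_length(best):
                best = tour
        # Evaporate, then let each ant deposit pheromone on its tour,
        # shorter tours depositing more.
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for tour in tours:
            deposit = Q / tour_length(tour)
            for i in range(N):
                a, b = tour[i], tour[(i + 1) % N]
                tau[a][b] += deposit
                tau[b][a] += deposit
    return best

best_tour = ant_colony()
```

Evaporation prevents early trails from dominating forever (exploration), while reinforcement of short tours concentrates the colony on good edges (exploitation), which is the balance described above.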

Advantages:

i) Global optimality: ACO tends to find near-optimal solutions by simultaneously exploring multiple paths and avoiding getting stuck in local optima.
ii) Adaptability: The algorithm can adapt to dynamic environments, making it suitable for real-world problems with changing constraints.
iii) Parallel computation: ACO inherently supports parallelism, making it well suited for execution on parallel and distributed systems.

Limitations:

i) Time intensiveness: ACO algorithms can be computationally intensive, especially for large-scale problems, affecting response times.
ii) Parameter sensitivity: ACO often requires careful tuning of parameters, such as the pheromone evaporation rate and the exploration-exploitation balance.
iii) Complexity: Implementing ACO for certain problems may require handling complex data structures and communication protocols.

Applications:

i) Routing and networking: ACO has been used to optimize network routing, for example in packet-switched networks and telecommunication systems.
ii) Vehicle routing: It is applied to vehicle routing problems such as the famous Travelling Salesman Problem.
iii) Manufacturing: ACO enables efficient scheduling and resource allocation in manufacturing processes, minimizing production time and cost.
iv) Robotics and automation: It is used to optimize robot motion planning and control, improving autonomy and efficiency in robotic systems.

Swarm Intelligence – Particle Swarm Optimization(PSO):

Particle Swarm Optimization is a population-based optimization algorithm inspired by the social behaviour of birds and fish. Developed by James Kennedy and Russell Eberhart in 1995, PSO is a heuristic algorithm used to find optimal solutions to optimization problems.

• Particle swarm optimization (PSO) is an artificial intelligence (AI) / computational method for solving problems by maximizing or minimizing a numeric objective.
• It emanated from the social behaviour of bird flocking and fish schooling.
• Particles are the population of candidate solutions that move around in a search space.
• Particles move within the search space and therefore have velocity.
• Particles move to find their best known positions, following the current optimum particles within the problem space.
• Swarm particles work together and exchange information about the best known solutions; each particle relies on information within its neighbourhood and understands the conditions of its neighbours.
• The position of the particle with the best known fitness guides the other particles and helps in updating each particle's velocity.
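These rules translate directly into the standard velocity and position updates. The sketch below assumes the common inertia-weight variant of PSO with illustrative parameter values (w = 0.7, c1 = c2 = 1.5) and a sphere objective.

```python
import random

def pso(objective, dim=2, n_particles=30, n_iter=100,
        w=0.7, c1=1.5, c2=1.5, lower=-5.0, upper=5.0):
    """Minimise `objective` with standard PSO (Kennedy & Eberhart, 1995)."""
    pos = [[random.uniform(lower, upper) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best known position
    gbest = min(pbest, key=objective)[:]     # the swarm's best known position
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + cognitive pull (pbest) + social pull (gbest).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lower), upper)
            # Update personal and global bests.
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pos[i]) < objective(gbest):
                    gbest = pos[i][:]
    return gbest

sphere = lambda x: sum(v * v for v in x)  # toy objective, minimum at origin
best = pso(sphere)
```

The cognitive term pulls each particle towards its own best known position, while the social term pulls it towards the swarm's best; the inertia weight w controls how much of the previous velocity is retained, trading exploration against convergence speed.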

Key Reasons for Incorporating Swarm Intelligence in Bio Inspired Engineering:

i) Collective Problem Solving:

● Swarm intelligence leverages the power of many simple agents working collaboratively.
● In nature, collective behavior often leads to efficient problem-solving strategies.

ii) Adaptability to Dynamic Environments:

● Natural swarms exhibit remarkable adaptability to changing conditions.
● Bio-inspired algorithms, like those based on swarm intelligence, excel in handling dynamic and uncertain scenarios.

iii) Robustness in Optimization:

● Swarm intelligence allows for robust optimization in the face of uncertainties and variations.
● Collective decision-making and distributed control contribute to the resilience of swarm-inspired algorithms.

iv) Efficiency in Exploration and Exploitation:

● Swarm algorithms balance exploration (searching for new solutions) and exploitation (exploiting known solutions).
● This equilibrium is crucial for navigating complex optimization landscapes effectively.

Advantages and Disadvantages:

Advantages of Swarm Intelligence / Particle Swarm Optimization

● Robustness in handling complex optimization landscapes.
● Adaptability to changing conditions.
● Potential for faster convergence.

Disadvantages and Challenges

● Sensitivity to parameter tuning.
● Limited understanding of the optimization process.
● Computational complexity.

Applications:

● Pharmaceutical industry: tailoring drug formulations for enhanced efficacy.
● Materials science: optimizing material properties for various applications.
● Environmental engineering: particle size optimization for pollution control.
● Function Optimization: PSO is commonly used to find optimal solutions for
mathematical functions.
● Neural Network Training: PSO can be applied to optimize neural network
parameters.
● Robotics: PSO can help in path planning and optimization for robotic systems.
