Lec 9
Soft Computing
2h Lecture Dr. Elaf Adil
Learning Objectives
The general objective of the lecture:
The primary objective of this lecture is to provide an in-depth understanding of Soft
Computing and Genetic Algorithms (GA), focusing on their definitions,
components, working principles, and real-world applications. The lecture aims to
highlight how soft computing techniques, particularly Genetic Algorithms, solve
complex optimization problems in various domains.
Detailed objectives:
Understand Soft Computing – Learn what soft computing is and how it differs
from traditional computing.
• Tolerant to approximations
• These traits are passed to offspring, gradually improving the species over
generations.
• Over time, natural selection leads to the evolution of the fittest organisms.
This idea of selection, inheritance, and gradual improvement forms the basis of
Genetic Algorithms in artificial intelligence and optimization.
4. What is GA
➢ A genetic algorithm (or GA) is a search technique used in computing to find
true or approximate solutions to optimization and search problems.
➢ GAs are categorized as global search heuristics.
➢ GAs are a particular class of evolutionary algorithms that use techniques
inspired by evolutionary biology such as inheritance, mutation, selection, and
crossover (also called recombination).
Kerbala University Machine Learning
College of Computer Science and Information Technology
Department of Computer Science Dr. Elaf Adil – 8 Lecture
Vocabulary
3. Fitness – Target function that we are optimizing (each individual has a fitness)
Key Steps:
• Quit when you have a satisfactory solution (or you run out of time).
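The key steps of the loop (evaluate fitness, select parents, recombine, mutate, repeat until satisfied) can be sketched as a minimal GA skeleton. The specific helper choices below (roulette selection, two-point crossover, bit-flip mutation, an even population size) are illustrative assumptions, not the lecture's exact recipe:

```python
import random

def genetic_algorithm(pop_size, n_bits, fitness, n_generations,
                      p_crossover=0.6, p_mutation=0.1):
    """Minimal GA loop sketch: evaluate, select, reproduce, repeat.
    Assumes pop_size is even and fitness values are non-negative."""
    # Random initial population of bit strings
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(n_generations):
        scores = [fitness(ind) for ind in pop]
        # Fitness proportionate selection of the parent pool
        parents = random.choices(pop, weights=scores, k=pop_size)
        next_pop = []
        for i in range(0, pop_size, 2):
            a, b = parents[i][:], parents[i + 1][:]
            if random.random() < p_crossover:            # decide whether to mate
                p1, p2 = sorted(random.sample(range(1, n_bits), 2))
                a[p1:p2], b[p1:p2] = b[p1:p2], a[p1:p2]  # two-point crossover
            for child in (a, b):                          # bit-flip mutation
                for j in range(n_bits):
                    if random.random() < p_mutation:
                        child[j] = 1 - child[j]
                next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Example: maximize the number of 1s in a 10-bit string (OneMax)
best = genetic_algorithm(20, 10, sum, n_generations=50)
```

The same termination rule as above applies: stop after a fixed number of generations, or earlier once a satisfactory individual appears.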
Designing an effective fitness function is crucial for the success of your genetic
algorithm. Here are a few guidelines to follow when creating one:
Avoid Local Optima: A good fitness function should help your genetic algorithm
escape local optima – a suboptimal solution that appears optimal within a limited
search space. Design your fitness function to encourage exploration of the search
space, avoiding premature convergence.
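One well-known way to build this exploration pressure into the fitness function itself is fitness sharing: an individual's raw fitness is divided by how crowded its neighbourhood is, so clusters of near-identical solutions stop dominating selection. A minimal sketch, where the Hamming-distance measure and the sharing radius are assumptions:

```python
def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def shared_fitness(individual, population, raw_fitness, radius=2):
    """Fitness sharing: divide raw fitness by a crowding count so that
    crowded regions of the search space are penalized, encouraging the
    GA to keep exploring instead of converging prematurely."""
    # Count how many population members sit within `radius` bit-flips
    niche_count = sum(1 for other in population
                      if hamming(individual, other) <= radius)
    return raw_fitness(individual) / niche_count  # niche_count >= 1 (self)

pop = ["1111", "1110", "0000"]
onemax = lambda s: s.count("1")
# "1111" and "1110" crowd each other, so each keeps only half its raw score
```

Here "1111" has raw fitness 4 but shares its niche with "1110", so its shared fitness drops to 2.0, leaving more selection room for distant individuals.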
7. Methods of Selection
Selection methods are crucial for guiding the evolution of solutions toward
optimality. They determine which individuals from the current population are
chosen to reproduce and pass their genetic material to the next generation. Effective
selection increases the likelihood of propagating high-quality solutions, thereby
enhancing the algorithm's performance.
In roulette wheel (fitness proportionate) selection, a fitter individual gets a larger
slice of the wheel and therefore a greater chance of landing in front of the fixed
point when the wheel is spun. The probability of choosing an individual thus
depends directly on its fitness.
Note that fitness proportionate selection methods do not work when the fitness can
take a negative value.
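The wheel spin described above can be sketched as follows (a minimal sketch; the cumulative-sum scan is one common way to implement it):

```python
import random

def roulette_select(population, fitnesses):
    """Fitness proportionate (roulette wheel) selection: the chance of
    picking an individual is its fitness divided by the total fitness.
    Requires non-negative fitness values, as noted above."""
    total = sum(fitnesses)
    spin = random.uniform(0, total)       # where the fixed point lands
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit                    # walk around the wheel
        if running >= spin:
            return individual
    return population[-1]                 # guard against float rounding

pop = ["A", "B", "C"]
fits = [6.0, 3.0, 1.0]    # A owns 60% of the wheel, C only 10%
picks = [roulette_select(pop, fits) for _ in range(10000)]
```

Over many spins the selection frequencies approach the fitness proportions, which is exactly the selection pressure this method provides.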
Tournament Selection
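The lecture names tournament selection without further detail; in a common form (the tournament size k here is an assumption), k individuals are drawn at random and the fittest of them wins:

```python
import random

def tournament_select(population, fitness, k=3):
    """Tournament selection: draw k random contestants and return the
    fittest. Unlike roulette wheel selection, this also works when
    fitness values are negative, since only comparisons are used."""
    contestants = random.sample(population, k)
    return max(contestants, key=fitness)

pop = ["1010", "1111", "0000", "0111"]
# With k equal to the population size, the overall best always wins
winner = tournament_select(pop, fitness=lambda s: s.count("1"), k=4)
```

Smaller k weakens the selection pressure (more randomness); larger k strengthens it.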
Rank Selection
In rank selection, we do not use the raw fitness value when selecting a parent.
Instead, every individual in the population is ranked according to its fitness, and
the selection of parents depends on the rank of each individual rather than on the
fitness itself. Higher-ranked individuals are preferred over lower-ranked ones.
Rank selection also works with negative fitness values and is mostly used when the
individuals in the population have very close fitness values (this usually happens
at the end of the run). Under fitness proportionate selection, close fitness values
give each individual an almost equal share of the pie, so every individual, no
matter how fit relative to the others, has approximately the same probability of
being selected as a parent. This causes a loss of selection pressure towards fitter
individuals and makes the GA choose parents poorly in such situations; ranking
restores that pressure.
Chromosome   Fitness Value   Rank
A            8.1             1
C            8.05            2
E            8.02            3
B            8.0             4
F            7.99            5
D            7.95            6
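Rank selection can be sketched as follows. Note one labelling difference: the table above numbers the best individual as rank 1, whereas the code below gives the best individual the largest selection weight, which is the usual implementation trick:

```python
import random

def rank_select(population, fitness):
    """Rank selection: sort individuals by fitness and use the rank
    (1 = worst ... N = best) as the selection weight instead of the raw
    fitness. This keeps selection pressure even when fitness values are
    very close together or negative."""
    ordered = sorted(population, key=fitness)        # worst first
    ranks = list(range(1, len(ordered) + 1))         # weight = rank
    return random.choices(ordered, weights=ranks, k=1)[0]

# The six chromosomes and fitness values from the table above
pop = {"A": 8.1, "B": 8.0, "C": 8.05, "D": 7.95, "E": 8.02, "F": 7.99}
picks = [rank_select(list(pop), pop.get) for _ in range(6000)]
```

Even though the raw fitness values differ by less than 2%, A (rank weight 6) is selected about six times as often as D (rank weight 1).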
Random Selection
In this strategy we randomly select parents from the existing population. There is no
selection pressure towards fitter individuals and therefore this strategy is usually
avoided.
8. Methods of Reproduction
1. Crossover (Recombination):
o Common Techniques:
Uniform Crossover: Each gene is chosen randomly from one of the parents,
allowing for more diverse offspring.
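Uniform crossover can be sketched with a coin flip per gene position (a minimal sketch; the 0.5 per-gene probability is the standard choice):

```python
import random

def uniform_crossover(parent1, parent2):
    """Uniform crossover: each gene of a child is copied from one of the
    two parents, chosen by an independent coin flip per position. The
    second child receives the gene the first child did not take."""
    child1, child2 = [], []
    for g1, g2 in zip(parent1, parent2):
        if random.random() < 0.5:
            child1.append(g1); child2.append(g2)
        else:
            child1.append(g2); child2.append(g1)
    return "".join(child1), "".join(child2)

c1, c2 = uniform_crossover("1111010101", "1110110101")
```

Because every position mixes independently, the offspring can differ from both parents far more than with single- or two-point crossover, which is what makes them more diverse.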
2. Mutation:
Example:
It may seem so because we know the answer in advance. However, we can think of
it as maximizing the number of correct answers, each encoded by a 1, to l difficult
yes/no questions.
Step 2: crossover
➢ Next we mate strings for crossover. For each couple we first decide
(using some pre-defined probability, for instance 0.6) whether to
actually perform the crossover or not.
➢ If we decide to actually perform crossover, we randomly extract the
crossover points, for instance 2 and 5.
Crossover result
Before crossover:
s1 = 1111010101 s2 = 1110110101
After crossover:
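With the strings s1 and s2 above and the crossover points 2 and 5, a two-point crossover swaps the segment between the cut points. A sketch, assuming the points mean the slice s[2:5] (note that these particular parents differ only inside that segment, so the children happen to coincide with the opposite parents):

```python
def two_point_crossover(s1, s2, p1, p2):
    """Two-point crossover on bit strings: the segment between cut
    points p1 and p2 is exchanged between the two parents."""
    c1 = s1[:p1] + s2[p1:p2] + s1[p2:]
    c2 = s2[:p1] + s1[p1:p2] + s2[p2:]
    return c1, c2

s1, s2 = "1111010101", "1110110101"
c1, c2 = two_point_crossover(s1, s2, 2, 5)
# The parents differ only at positions 3 and 4, both inside the swapped
# segment, so here c1 equals s2 and c2 equals s1.
```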
Step 3: mutations
The final step is to apply random mutations: for each bit that we are to copy
to the new population we allow a small probability of error (for instance 0.1).
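This per-bit copy error can be sketched as bit-flip mutation:

```python
import random

def mutate(bitstring, p_error=0.1):
    """Bit-flip mutation: while copying each bit to the new population,
    flip it with a small probability of error (0.1 in this example)."""
    return "".join(
        bit if random.random() >= p_error else ("1" if bit == "0" else "0")
        for bit in bitstring
    )

mutated = mutate("1111010101")   # on average one bit in ten is flipped
```

In practice the mutation rate is usually much lower than 0.1 (often around 1/l for strings of length l), since high rates turn the search into a random walk.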
(Figure: the initial strings and their fitness values)
In one generation, the total population's fitness changed from 34 to 37, thus
improving by ~9%.
We go through the same process all over again, until a stopping criterion is
met.