Support Vector Machine (SVM)
1. Hyperplane
SVM classifies data by finding the optimal separating hyperplane between classes:
w · x + b = 0
where w is the weight vector, x is the feature vector, and b is the bias.
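The decision rule follows directly from this hyperplane: a point is classified by which side of it it falls on. A minimal sketch, assuming hypothetical values for w and b (they are illustrative, not from the text):

```python
import numpy as np

# Hypothetical weight vector and bias for a 2-D linear SVM (illustrative values only).
w = np.array([2.0, -1.0])
b = -0.5

def predict(x):
    """Classify a point by the sign of w . x + b."""
    return 1 if np.dot(w, x) + b >= 0 else -1

print(predict(np.array([1.0, 0.0])))   # w . x + b = 2.0 - 0.5 = 1.5  -> class 1
print(predict(np.array([0.0, 1.0])))   # w . x + b = -1.0 - 0.5 = -1.5 -> class -1
```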
2. Margin Maximization
The margin is the distance between the hyperplane and the closest data points from each class,
known as support vectors. SVM maximizes this margin to enhance generalization, leading to
the optimization problem:
min_{w,b} ½ ∥w∥²   subject to   y_i (w · x_i + b) ≥ 1 for all i,
where y_i ∈ {−1, +1} are the class labels.
3. Kernel Trick
For non-linearly separable data, SVM employs the kernel trick to map data into a higher-dimensional space where a linear hyperplane can separate the classes. Common kernels include the linear, polynomial, radial basis function (RBF), and sigmoid kernels.
Figure 1: Schematic diagram of SVM architecture
For overlapping classes, the hard-margin constraints are relaxed, giving the soft-margin problem
min_{w,b,ξ} ½ ∥w∥² + C Σ_i ξ_i   subject to   y_i (w · x_i + b) ≥ 1 − ξ_i,  ξ_i ≥ 0,
where ξ_i are slack variables and C is a regularization parameter balancing margin width and classification error.
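The trade-off controlled by the slack variables ξ_i and the parameter C can be illustrated with a minimal subgradient-descent sketch for a linear soft-margin SVM. The synthetic data, learning rate, and epoch count below are illustrative assumptions, not part of the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated Gaussian blobs labelled -1 and +1 (illustrative only).
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

def train_soft_margin_svm(X, y, C=1.0, lr=0.01, epochs=200):
    """Minimise 1/2 ||w||^2 + C * sum of hinge losses by batch subgradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1                      # points violating the margin
        grad_w = w - C * (y[mask][:, None] * X[mask]).sum(axis=0)
        grad_b = -C * y[mask].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

w, b = train_soft_margin_svm(X, y)
preds = np.sign(X @ w + b)
print("training accuracy:", (preds == y).mean())
```

Increasing C penalises margin violations more heavily (narrower margin, fewer misclassified training points); decreasing it widens the margin at the cost of more slack.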
Applications of SVM
SVM has been effectively applied in:
1. Text Classification: Spam detection and sentiment analysis.
2. Image Recognition: Handwritten digit recognition and face detection.
3. Bioinformatics: Gene classification and protein structure prediction.
Limitations
• Computationally expensive for large datasets.
• Requires careful tuning of kernel parameters and regularization.
Conclusion
SVM is a powerful tool in machine learning, offering robust performance in diverse applications. While it requires careful parameter selection and preprocessing, its ability to handle high-dimensional data makes it a popular choice for many classification tasks.
1 Particle Swarm Optimization (PSO)
Introduction
Particle Swarm Optimization (PSO) is a population-based optimization algorithm inspired by
the social behavior of bird flocking or fish schooling. Developed by James Kennedy and Russell
Eberhart, PSO is widely used in solving complex optimization problems.
2. Particle Movement
The movement of a particle is influenced by:
• Personal Best (p_i): The best position a particle has achieved so far.
• Global Best (g): The best position achieved by any particle in the swarm.
v_i(t + 1) = ω v_i(t) + c1 r1 (p_i − x_i(t)) + c2 r2 (g − x_i(t))
x_i(t + 1) = x_i(t) + v_i(t + 1)
where ω is the inertia weight, c1 and c2 are the cognitive and social acceleration coefficients, and r1, r2 are random numbers drawn uniformly from [0, 1].
Algorithm Steps
1. Initialize a swarm of particles with random positions and velocities.
2. Evaluate the fitness of each particle based on the objective function.
3. Update each particle’s personal best (p_i) and the swarm’s global best (g).
4. Update the velocity and position of each particle using the equations above.
5. Repeat steps 2–4 until a termination criterion (e.g., maximum iterations or desired accuracy) is met.
Figure 2: Flowchart for basic PSO algorithm
Applications of PSO
PSO is versatile and has been applied to a wide range of optimization problems.
Limitations
• Prone to premature convergence.
• Sensitive to parameter settings (ω, c1, c2).
• May struggle with highly complex or multimodal problems.
Conclusion
Particle Swarm Optimization is a powerful heuristic algorithm inspired by nature. Its simplicity
and flexibility make it suitable for a wide range of optimization tasks. However, careful tuning
of parameters is essential to ensure optimal performance.
2 Genetic Algorithm (GA)
Introduction
The Genetic Algorithm (GA) is a heuristic optimization technique inspired by natural selection.
Developed by John Holland, GA is widely used for solving complex optimization problems.
2. Fitness Function
The fitness function evaluates the quality of each solution (chromosome). The goal of the GA
is to optimize this fitness function by iteratively evolving the population.
3. Genetic Operators
GA uses three primary genetic operators to evolve the population:
• Selection: Chooses parent chromosomes based on their fitness to propagate their genes
to the next generation. Common methods include:
– Roulette Wheel Selection
– Tournament Selection
– Rank-Based Selection
• Crossover (Recombination): Combines the genetic information of two parents to produce offspring. Common crossover techniques are:
– Single-Point Crossover
– Two-Point Crossover
– Uniform Crossover
• Mutation: Introduces random changes in the chromosomes to maintain genetic diversity
and avoid premature convergence. Mutation rate controls the frequency of these changes.
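The three operators can be sketched for bit-string chromosomes. These are minimal illustrative implementations (tournament selection, single-point crossover, bit-flip mutation), with all parameter values chosen as assumptions:

```python
import random

random.seed(0)

def tournament_selection(population, fitness, k=3):
    """Pick the fittest of k randomly chosen chromosomes (higher fitness wins)."""
    contenders = random.sample(range(len(population)), k)
    return population[max(contenders, key=lambda i: fitness[i])]

def single_point_crossover(parent1, parent2):
    """Swap the tails of two parents at a random cut point."""
    point = random.randint(1, len(parent1) - 1)
    return parent1[:point] + parent2[point:], parent2[:point] + parent1[point:]

def mutate(chromosome, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [1 - gene if random.random() < rate else gene for gene in chromosome]

p1, p2 = [0, 0, 0, 0, 0, 0], [1, 1, 1, 1, 1, 1]
c1, c2 = single_point_crossover(p1, p2)
print(c1, c2)  # each child mixes a prefix of one parent with the suffix of the other
```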
Algorithm Steps
The Genetic Algorithm follows these steps:
1. Initialize a random population of chromosomes.
2. Evaluate the fitness of each chromosome using the fitness function.
3. Select parent chromosomes based on fitness.
4. Apply crossover to produce offspring.
5. Apply mutation to the offspring.
6. Replace the old population with the new generation and repeat steps 2–5 until a termination criterion is met.
Figure 3: Genetic Algorithm optimization flowchart.
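A complete GA loop following these steps can be sketched on a toy bit-counting objective (OneMax). The population size, chromosome length, and rates below are illustrative assumptions:

```python
import random

random.seed(42)

def fitness(chrom):
    """OneMax: count of 1-bits; the optimum is the all-ones chromosome."""
    return sum(chrom)

def evolve(pop_size=30, length=20, generations=60, mutation_rate=0.02):
    # Step 1: random initial population of bit-string chromosomes.
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Steps 2-3: evaluate fitness and select parents (tournament of size 3).
        def select():
            return max(random.sample(pop, 3), key=fitness)
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = select(), select()
            # Step 4: single-point crossover.
            cut = random.randint(1, length - 1)
            child = p1[:cut] + p2[cut:]
            # Step 5: bit-flip mutation.
            child = [1 - g if random.random() < mutation_rate else g for g in child]
            next_pop.append(child)
        # Step 6: replace the population and repeat.
        pop = next_pop
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```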
Applications of GA
GA has been applied in many domains, including:
• Bioinformatics: DNA sequence alignment and drug design.
Limitations
• Computationally expensive for large populations or complex problems.
• Prone to premature convergence without sufficient diversity.
• Requires careful tuning of parameters (population size, mutation rate, etc.).
Conclusion
The Genetic Algorithm is a versatile and powerful optimization technique inspired by natural
evolution. Its ability to explore large and complex solution spaces makes it suitable for a variety
of applications. However, parameter tuning and computational cost must be carefully managed
for optimal performance.