
Module 5: Optimization Algorithm - Hyperparameter Optimization

In machine learning, hyperparameter optimization is crucial for tuning models to
achieve better performance. Hyperparameters are parameters not learned during
training, such as the learning rate, number of trees, or kernel type.
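
To make the distinction concrete, here is a minimal sketch in Python, assuming scikit-learn and its built-in iris dataset (both illustrative choices, not part of this module): hyperparameters are set before training, while the model's parameters are learned during fitting.

    # A minimal sketch, assuming scikit-learn and the iris dataset
    # (illustrative assumptions, not prescribed by this module).
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # Hyperparameters: chosen by us before training begins.
    model = DecisionTreeClassifier(max_depth=3, min_samples_split=4)

    # Parameters (the tree's split features and thresholds): learned from the data.
    model.fit(X, y)
    print(model.get_depth())  # learned depth, bounded above by the max_depth hyperparameter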

1. Hyperparameter Optimization

Hyperparameter optimization is the process of finding the best set of hyperparameters to maximize the performance of a model.

Why is it important?

• Improves model accuracy.
• Helps prevent underfitting and overfitting.
• Helps utilize the model's capabilities fully.

Common techniques include:

• Grid Search
• Random Search

2. Grid Search vs. Random Search

A. Grid Search

Grid Search is an exhaustive search method where you define a grid of hyperparameter values, and the algorithm evaluates every possible combination.

How it Works:

1. Define hyperparameters and their possible values.
2. Evaluate all combinations using cross-validation.
3. Select the combination with the best performance metric.

Pros:

• Guarantees finding the best combination (within the defined grid).
• Systematic and easy to implement.

Cons:

• Computationally expensive, especially with large grids or many hyperparameters.
• Inefficient if the optimal values are not within the grid.

Example:

Suppose you are tuning a Decision Tree (a runnable sketch follows the list below).

• Parameters:
o Max Depth: [3, 5, 10]
o Min Samples Split: [2, 4, 6]
• Total combinations: 3 × 3 = 9
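
Here is a minimal sketch of this example using scikit-learn's GridSearchCV; the iris dataset and random_state are illustrative assumptions, not part of the original example.

    # A minimal sketch, assuming scikit-learn and a toy classification dataset.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # The grid from the example above: 3 x 3 = 9 combinations.
    param_grid = {
        "max_depth": [3, 5, 10],
        "min_samples_split": [2, 4, 6],
    }

    # 5-fold cross-validation on every combination: 9 x 5 = 45 model fits.
    search = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
    search.fit(X, y)

    print(search.best_params_)  # best combination found within the grid
    print(search.best_score_)   # its mean cross-validated accuracy

Note the cost scales multiplicatively: adding a third hyperparameter with 3 values would triple the fits to 135, which is why Grid Search becomes expensive quickly.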

B. Random Search

Random Search selects random combinations of hyperparameter values from the defined space, rather than testing all combinations.

How it Works:

1. Define hyperparameters and their possible value ranges.
2. Randomly select a subset of combinations to evaluate.
3. Select the combination with the best performance.

Pros:

• More efficient than Grid Search, especially with large hyperparameter spaces.
• Can find good configurations in fewer iterations.

Cons:

• Does not guarantee finding the best combination.
• Performance depends on the number of random trials.

Example:

Using the same parameters (see the sketch after this list):

• Randomly select 5 out of 9 combinations.
• Evaluate only those 5.
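
The same example with scikit-learn's RandomizedSearchCV, where n_iter=5 mirrors evaluating 5 of the 9 combinations; the dataset and random_state are again illustrative assumptions.

    # A minimal sketch, assuming scikit-learn; n_iter=5 mirrors "5 out of 9" above.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import RandomizedSearchCV
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # Same space as the Grid Search example: 9 possible combinations.
    param_distributions = {
        "max_depth": [3, 5, 10],
        "min_samples_split": [2, 4, 6],
    }

    # Evaluate only 5 randomly chosen combinations instead of all 9.
    search = RandomizedSearchCV(
        DecisionTreeClassifier(random_state=0),
        param_distributions,
        n_iter=5,
        cv=5,
        random_state=0,
    )
    search.fit(X, y)

    print(search.best_params_)  # best of the 5 sampled combinations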

Key Differences:

Aspect           | Grid Search                         | Random Search
-----------------|-------------------------------------|-----------------------------------
Search Strategy  | Exhaustive; tests all combinations. | Random sampling of combinations.
Efficiency       | Inefficient for large spaces.       | Efficient; evaluates fewer points.
Best Combination | Guaranteed (if in grid).            | Not guaranteed.
Implementation   | Straightforward.                    | Slightly more complex.

When to Use Which?

1. Grid Search:
o When hyperparameter space is small.
o When computational resources are not a constraint.
2. Random Search:
o When hyperparameter space is large.
o When time/resources are limited.
