Module_6
Hyperparameter Optimization
In machine learning, hyperparameter optimization is crucial for tuning models to
achieve better performance. Hyperparameters are settings that are not learned during
training but chosen beforehand, such as the learning rate, the number of trees, or the kernel type.
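For instance (a minimal sketch; scikit-learn, the decision tree, and the iris dataset are assumptions chosen for illustration), hyperparameters are set before training, while model parameters are learned from the data:

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# max_depth and min_samples_split are hyperparameters: chosen by hand, not learned.
model = DecisionTreeClassifier(max_depth=5, min_samples_split=2)

# Training learns the model's parameters (the split thresholds), not the hyperparameters.
model.fit(X, y)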
1. Hyperparameter Optimization
Why is it important?
The right hyperparameter values can make the difference between a mediocre model and a well-performing one, so they must be searched for systematically. Two common search strategies are:
• Grid Search
• Random Search
A. Grid Search
How it Works:
Grid search exhaustively evaluates every combination of the hyperparameter values listed in a predefined grid and keeps the combination that scores best.
Pros:
• Guarantees to find the best combination (if within the defined grid).
• Systematic and easy to implement.
Cons:
• Computationally expensive: the number of combinations grows multiplicatively with every hyperparameter added to the grid.
Example:
• Parameters:
o Max Depth: [3, 5, 10]
o Min Samples Split: [2, 4, 6]
• Total combinations: 3 × 3 = 9
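A minimal sketch of this example using scikit-learn's GridSearchCV; the decision-tree model and the iris dataset are assumptions chosen for illustration:

from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# The 3 x 3 grid from the example above: 9 combinations in total.
param_grid = {
    "max_depth": [3, 5, 10],
    "min_samples_split": [2, 4, 6],
}

# GridSearchCV tries every combination, scoring each with 5-fold cross-validation.
search = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # the best of the 9 combinations
print(search.best_score_)   # its mean cross-validated accuracy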
B. Random Search
How it Works:
Instead of trying every combination, random search samples hyperparameter combinations at random, from a grid or from specified distributions, for a fixed number of iterations.
Pros:
• More efficient when the hyperparameter space is large.
• The fixed iteration budget keeps the computational cost predictable.
Example:
• Using the same parameters as above, sample only 5 random combinations instead of evaluating all 9 (see the sketch below).
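A minimal sketch using scikit-learn's RandomizedSearchCV; the randint distributions and the iris dataset are assumptions chosen for illustration:

from scipy.stats import randint
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Distributions to sample from; randint(a, b) draws integers a..b-1.
param_distributions = {
    "max_depth": randint(3, 11),
    "min_samples_split": randint(2, 7),
}

# n_iter=5 evaluates only 5 randomly sampled combinations, not the full grid.
search = RandomizedSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_distributions,
    n_iter=5,
    cv=5,
    random_state=0,
)
search.fit(X, y)

print(search.best_params_)  # best of the 5 sampled combinations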
Key Differences:
                   Grid Search                  Random Search
Best combination   Guaranteed (if in grid).     Not guaranteed.
When to Use:
1. Grid Search:
o When hyperparameter space is small.
o When computational resources are not a constraint.
2. Random Search:
o When hyperparameter space is large.
o When time/resources are limited.