Hyperparameters

Hyperparameters are external settings that influence the training of machine learning models, distinct from parameters learned during training. Examples include learning rate, number of epochs, and batch size, which affect model performance, training speed, and generalization. Tuning methods such as grid search, random search, and Bayesian optimization are used to select optimal hyperparameters.


Hyperparameters are settings or configurations external to the model that are fixed before training begins. They control the training process and influence the behavior of the machine learning algorithm.

They differ from parameters, such as the weights in a neural network, which are learned from the data during training. Hyperparameters are chosen manually or with techniques such as grid search, random search, or Bayesian optimization.
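The parameter/hyperparameter distinction can be made concrete with a minimal gradient-descent sketch (a toy example, not from the original text): the weight is a parameter learned from data, while the learning rate and epoch count are hyperparameters we set beforehand.

```python
# Fitting y = w * x with gradient descent on squared error.
# The weight w is a PARAMETER: learned during training.
# learning_rate and epochs are HYPERPARAMETERS: fixed before training.

def fit(xs, ys, learning_rate, epochs):
    w = 0.0  # learned parameter, updated during training
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= learning_rate * grad   # hyperparameter sets the step size
    return w

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # true relationship: y = 2x
w = fit(xs, ys, learning_rate=0.05, epochs=100)
print(w)  # converges near 2.0
```

With a learning rate that is too large (for example 0.5 here), the same loop diverges instead of converging, which previews why tuning matters.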

Examples of Hyperparameters

●​ Learning Rate: how fast the model updates its weights.
●​ Number of Epochs: how many times the model sees the entire dataset.
●​ Batch Size: how many samples are processed before the model is updated.
●​ Number of Hidden Layers/Neurons: in neural networks.
●​ Maximum Depth: in decision trees.
●​ Number of Trees: in random forests.
●​ Kernel Type: in Support Vector Machines (linear, RBF, etc.).
●​ Dropout Rate: in deep learning, for regularization.
●​ Regularization Parameters: such as L1 (lasso) or L2 (ridge) penalties.
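Several of these hyperparameters interact. As a small illustration (an assumption-free arithmetic sketch, not from the original text), epochs and batch size together determine how many weight updates training performs:

```python
import math

# Number of weight updates in mini-batch training:
# each epoch yields ceil(n_samples / batch_size) updates.

def num_updates(n_samples, batch_size, epochs):
    return epochs * math.ceil(n_samples / batch_size)

print(num_updates(10000, 32, 5))   # 5 * 313 = 1565 updates
print(num_updates(10000, 256, 5))  # larger batches -> only 200 updates
```

This is one reason batch size affects both training speed and convergence: for a fixed number of epochs, a larger batch size means fewer, less noisy updates.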

Why Are Hyperparameters Important?

●​ Model Performance: good hyperparameters improve accuracy, precision, recall, etc.
●​ Training Speed: proper settings can make models train faster without sacrificing performance.
●​ Overfitting and Underfitting: hyperparameters control whether the model generalizes well or memorizes the training data.
●​ Stability and Convergence: especially important in models like neural networks; a bad learning rate can make training unstable.
●​ Resource Management: good choices save computation, memory, and time, especially for large datasets or deep models.

Tuning Hyperparameters

You typically tune hyperparameters using:

●​ Grid Search: try every combination from a fixed set of values.
●​ Random Search: randomly sample combinations from the search space.
●​ Bayesian Optimization: build a probabilistic model of performance as a function of the hyperparameters and use it to pick promising candidates.
●​ Automated Tools: libraries such as Optuna, Hyperopt, and Ray Tune.
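Grid search and random search can be sketched in a few lines of plain Python. Here score() is a hypothetical stand-in for "train a model and measure validation accuracy"; in practice it would fit a real model, and the hyperparameter grids are made up for illustration:

```python
import itertools
import random

def score(learning_rate, max_depth):
    # Toy objective that peaks at learning_rate=0.1, max_depth=5;
    # a real version would train a model and return validation accuracy.
    return -abs(learning_rate - 0.1) - 0.01 * abs(max_depth - 5)

learning_rates = [0.001, 0.01, 0.1, 1.0]
max_depths = [3, 5, 7, 9]

# Grid search: evaluate every combination (here 4 * 4 = 16 evaluations).
grid_best = max(itertools.product(learning_rates, max_depths),
                key=lambda c: score(*c))
print(grid_best)  # (0.1, 5)

# Random search: evaluate only a fixed budget of sampled combinations.
random.seed(0)
candidates = [(random.choice(learning_rates), random.choice(max_depths))
              for _ in range(8)]
rand_best = max(candidates, key=lambda c: score(*c))
print(rand_best)
```

Grid search is exhaustive but grows combinatorially with the number of hyperparameters; random search covers the same space with a fixed budget, which is why it often scales better in practice.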
