AdaBoost

What is AdaBoost?

• AdaBoost (Adaptive Boosting) is an ensemble learning method that combines multiple weak learners to form a strong learner. It is used primarily for classification but can also be adapted for regression tasks.

• AdaBoost improves the performance of weak learners by focusing on the errors made by previous models: it re-weights the training samples so that each new learner concentrates on the cases its predecessors misclassified.

• Weak Learner: a classifier that performs only slightly better than random guessing. A decision tree with a single split (a decision stump) is the weak learner most commonly used in AdaBoost.
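
To make this concrete, here is a minimal sketch using scikit-learn's AdaBoostClassifier with a decision stump as the weak learner. The dataset and hyperparameter values are illustrative assumptions; note that scikit-learn releases before 1.2 name the estimator parameter base_estimator instead of estimator.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # A toy binary classification dataset (illustrative values).
    X, y = make_classification(n_samples=500, n_features=10, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    # Each weak learner is a decision stump: a tree with a single split.
    stump = DecisionTreeClassifier(max_depth=1)

    # 50 boosting rounds; each round re-weights the samples that the
    # previous stumps misclassified.
    model = AdaBoostClassifier(estimator=stump, n_estimators=50)
    model.fit(X_train, y_train)
    print("Test accuracy:", model.score(X_test, y_test))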

Advantages of AdaBoost:
• Improves Weak Learners: AdaBoost combines weak models into a substantially stronger predictive model.
• Resistant to Overfitting (Generally): in practice AdaBoost often continues to generalise well even with a relatively large number of boosting rounds, making it less prone to overfitting than, for example, a single deep decision tree.
• Versatile: it can be applied to many types of base learners, although decision trees are the most common.
• Easy to Implement: the algorithm is simple, and implementations are readily available in most machine learning libraries.

Disadvantages of AdaBoost:
• Sensitive to Noisy Data: because AdaBoost increases the weight of misclassified samples, it can overemphasise noisy points and outliers.
• Computationally Expensive: training can be slow with a large number of boosting rounds or a large dataset, since the weak learners must be fitted sequentially.
• Weak Learner Dependent: the performance of AdaBoost depends heavily on the base learner; if the base learner performs poorly, the overall model suffers.
Steps of the AdaBoost Algorithm:
1. Assign equal weights to all the data points: for N samples, each weight starts at 1/N.
2. Find the stump that does the best job of classifying the current, weighted collection of samples, by computing the Gini index of each candidate split and selecting the split with the lowest Gini index.
3. Calculate the stump's "Total Error" (the summed weight of the samples it misclassifies) and its "Amount of Say" = (1/2) · ln((1 − Total Error) / Total Error), then update the sample weights: multiply each misclassified sample's weight by e^(Amount of Say) and each correctly classified sample's weight by e^(−Amount of Say).
4. Normalise the new sample weights so they sum to 1, then repeat from step 2 for the next stump (a worked sketch of one such round follows this list).
5. To classify a new point, take the weighted vote of all stumps, where each stump's vote counts in proportion to its Amount of Say.
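
Below is a minimal from-scratch sketch of one boosting round (steps 3 and 4). The function name boosting_round and the toy weights are illustrative assumptions, not part of any library API.

    import numpy as np

    def boosting_round(weights, misclassified, eps=1e-10):
        """One AdaBoost weight update. `misclassified` is a boolean
        array marking which samples the current stump got wrong."""
        # Total Error: the summed weight of the misclassified samples.
        total_error = np.clip(np.sum(weights[misclassified]), eps, 1 - eps)

        # Amount of Say = 0.5 * ln((1 - Total Error) / Total Error).
        amount_of_say = 0.5 * np.log((1 - total_error) / total_error)

        # Increase misclassified samples' weights, decrease the rest.
        new_weights = weights * np.exp(
            np.where(misclassified, amount_of_say, -amount_of_say))

        # Step 4: normalise so the weights sum to 1 again.
        new_weights /= new_weights.sum()
        return amount_of_say, new_weights

    # Example: 5 samples with equal starting weights; the stump
    # misclassifies only sample 3, so Total Error = 0.2.
    w = np.full(5, 1 / 5)
    mis = np.array([False, False, False, True, False])
    say, w = boosting_round(w, mis)
    print(say)  # ~0.693, i.e. 0.5 * ln(0.8 / 0.2)
    print(w)    # the misclassified sample now carries more weight

Running this gives new weights of 0.5 for the misclassified sample and 0.125 for each of the others, showing how the next stump in step 2 is pushed to focus on the hard case.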
