Boosting: 1. What Is The Difference Between Adaboost and Gradient Boosting?
Both AdaBoost and Gradient Boosting build an ensemble of weak learners sequentially, but they differ in how each new learner is fitted. AdaBoost raises the weights of misclassified training examples so that the next weak learner concentrates on them, and combines the learners in a vote weighted by each learner's accuracy. Gradient Boosting instead fits each new learner to the negative gradient of a differentiable loss function (the residual errors, in the case of squared loss) and adds it to the ensemble scaled by a learning rate.
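As a hedged illustration (not part of the original notes), the sketch below trains both kinds of ensemble with scikit-learn on a synthetic dataset. The dataset, hyperparameters, and the estimator keyword (scikit-learn 1.2 or newer; older releases call it base_estimator) are assumptions made for this example.

```python
# Illustrative comparison only; dataset and hyperparameters are arbitrary.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: re-weights misclassified examples each round and weights each
# weak learner's vote by its accuracy.
ada = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # decision stump
    n_estimators=100,
    learning_rate=1.0,
    random_state=0,
)

# Gradient Boosting: fits each new tree to the negative gradient of the loss
# and adds it to the ensemble scaled by a learning rate.
gbm = GradientBoostingClassifier(
    n_estimators=100,
    learning_rate=0.1,
    max_depth=3,
    random_state=0,
)

for name, model in [("AdaBoost", ada), ("Gradient Boosting", gbm)]:
    model.fit(X_train, y_train)
    print(name, "test accuracy:", model.score(X_test, y_test))
```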
2. When to apply AdaBoost (Adaptive Boosting Algorithm)?
Here are the key points for understanding AdaBoost (a short usage sketch follows the list):
AdaBoost can be applied to any classification algorithm, so it’s really a
technique that builds on top of other classifiers as opposed to being a classifier
itself.
You could just train a bunch of weak classifiers on your own and combine the
results, so what does AdaBoost actually do for you?
There are really two things it figures out for you:
o It helps you choose the training set for each new classifier that you train
based on the results of the previous classifier.
o It determines how much weight should be given to each classifier’s
proposed answer when combining the results.
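To make the point that AdaBoost builds on top of other classifiers concrete, here is a minimal, hedged usage sketch with scikit-learn's AdaBoostClassifier wrapping two different base estimators. The dataset, base estimators, and hyperparameters are illustrative assumptions; the estimator keyword assumes scikit-learn 1.2 or newer.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# AdaBoost is a meta-technique: it boosts whatever base classifier you hand it,
# as long as that classifier can be trained with per-sample weights.
for name, base in [
    ("decision stump", DecisionTreeClassifier(max_depth=1)),
    ("logistic regression", LogisticRegression(max_iter=1000)),
]:
    clf = AdaBoostClassifier(estimator=base, n_estimators=50, random_state=42)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"AdaBoost on a {name}: mean CV accuracy = {score:.3f}")
```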
3. How does AdaBoost (Adaptive Boosting Algorithm) work?
Important points about how AdaBoost works (a from-scratch sketch follows the list):
Each weak classifier should be trained on a random subset of the total training
set.
AdaBoost assigns a “weight” to each training example, which determines the
probability that each example should appear in the training set.
o E.g., the initial weights all add up to 1; if there are 8 training
examples, each is initially assigned a weight of 1/8, so all training
examples start with equal weights.
Now, if a training example is misclassified, it is assigned a higher weight, so
that it is more likely to appear in the training set used to train the next
classifier.
So, after this re-weighting, the next trained classifier will hopefully perform
better on the previously misclassified examples.
In short, the weights control how likely each example is to be chosen for
the next training subset in the sequence.
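The points above can be condensed into a short from-scratch sketch. This is a hedged illustration, not the original author's code: it uses numpy with scikit-learn decision stumps as the weak learners, draws each round's training subset using the weights as sampling probabilities (the resampling view described above), and applies the standard update alpha = 0.5 * ln((1 - err) / err) and w_i <- w_i * exp(-alpha * y_i * h(x_i)). All names and hyperparameters are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
y = np.where(y == 1, 1, -1)            # use labels in {-1, +1}

n = len(y)
weights = np.full(n, 1.0 / n)          # initial weights: equal and summing to 1
learners, alphas = [], []

for _ in range(20):                    # 20 boosting rounds (arbitrary choice)
    # Draw a training subset in which high-weight (previously misclassified)
    # examples are more likely to be chosen.
    idx = rng.choice(n, size=n, replace=True, p=weights)
    stump = DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx])

    pred = stump.predict(X)
    err = weights[pred != y].sum()     # weighted error on the full training set
    err = np.clip(err, 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)   # this learner's voting weight

    # Increase weights of misclassified examples, decrease the rest,
    # then renormalize so the weights again sum to 1.
    weights *= np.exp(-alpha * y * pred)
    weights /= weights.sum()

    learners.append(stump)
    alphas.append(alpha)

# Final prediction: sign of the weighted vote of all weak learners.
scores = sum(a * clf.predict(X) for a, clf in zip(alphas, learners))
print("training accuracy:", (np.sign(scores) == y).mean())
```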