FAQ - Boosting - Ensemble Techniques - Great Learning
Here are the key points for understanding AdaBoost:
AdaBoost can be applied to any classification algorithm, so it is really a technique that builds on top of
other classifiers rather than being a classifier itself.
You could train a set of weak classifiers on your own and combine the results, but AdaBoost
figures out two things for you:
It chooses the training set for each new classifier you train based on the results of the
previous classifier.
It determines how much weight should be given to each classifier's proposed answer when combining
the results.
Each weak classifier should be trained on a random subset of the total training set.
AdaBoost assigns a "weight" to each training example, which determines the probability that the
example appears in that subset.
(The initial weights add up to 1. For example, if there are 8 training examples, each is assigned a weight
of 1/8 initially, so all training examples start with equal weights.)
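The uniform initialisation described above can be shown in a few lines; NumPy is assumed here purely for illustration:

```python
import numpy as np

n_examples = 8
# AdaBoost starts every example at the same weight, 1/n
weights = np.full(n_examples, 1.0 / n_examples)

print(weights)        # [0.125 0.125 0.125 0.125 0.125 0.125 0.125 0.125]
print(weights.sum())  # 1.0 -- the weights form a probability distribution
```

Because the weights sum to 1, they can be read directly as sampling probabilities for building the next training subset.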
Now, if a training example is misclassified, that example is assigned a higher weight, so the
probability that this misclassified example appears in the next training subset used to train the
classifier is higher.
So, after this step, the next trained classifier will hopefully perform better on the previously
misclassified examples.
In short, the weights control each example's probability of being chosen for the next sequential
training subset.
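The whole procedure above — reweighting the examples after each round and weighting each classifier's vote — can be sketched from scratch using one-feature decision stumps as the weak classifiers. This is an illustrative toy implementation, not a production one; the names `train_adaboost` and `predict_adaboost` are made up for this sketch:

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost with axis-aligned threshold stumps.

    y must contain labels in {-1, +1}. Returns a list of
    ((feature, threshold, sign), alpha) pairs.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)  # uniform initial weights (1/n each)
    ensemble = []
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # pick the stump with the lowest *weighted* error on the current weights
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, t, s)
        j, t, s = best
        pred = s * np.where(X[:, j] <= t, 1, -1)
        eps = max(best_err, 1e-10)             # guard against division by zero
        alpha = 0.5 * np.log((1 - eps) / eps)  # this classifier's vote weight
        # raise the weights of misclassified examples, lower the rest
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                           # renormalise so weights sum to 1
        ensemble.append((best, alpha))
    return ensemble

def predict_adaboost(ensemble, X):
    """Combine the stumps' votes, each scaled by its alpha."""
    score = np.zeros(len(X))
    for (j, t, s), alpha in ensemble:
        score += alpha * s * np.where(X[:, j] <= t, 1, -1)
    return np.sign(score)
```

Note how the two responsibilities from the list above appear directly in the code: the weight update `w *= np.exp(-alpha * y * pred)` boosts misclassified examples for the next round, and `alpha` decides how much each stump's answer counts in the final vote.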
Note: Open a new terminal before using the above command on a Mac. To open the terminal, please
open Launchpad and then click on the Terminal icon.
To remove the warning, try setting the eval_metric hyperparameter to 'logloss', as shown below:
from xgboost import XGBClassifier
xgb = XGBClassifier(eval_metric='logloss')
Proprietary content.©Great Learning. All Rights Reserved. Unauthorized use or distribution prohibited.