Improving Classification With AdaBoost
meta-algorithm
Hawking Bear
February 5, 2022
National Institute of Science Education and Research
Problem Statement
Meta-algorithms
Bagging and Boosting
Bagging
Boosting
AdaBoost
AdaBoost Pseudocode
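The standard AdaBoost loop can be sketched in Python as follows. This is an illustrative sketch, not the slide's own pseudocode: the one-dimensional threshold stumps used as weak learners here are an assumption made for brevity.

```python
import math

def train_adaboost(X, y, num_rounds):
    """AdaBoost with one-dimensional threshold stumps as weak learners.

    X : list of feature values (1-D for simplicity)
    y : labels in {-1, +1}
    Returns a list of (alpha, threshold, polarity) weighted hypotheses.
    """
    n = len(X)
    w = [1.0 / n] * n                          # start with uniform weights
    ensemble = []
    for _ in range(num_rounds):
        # weak learner: pick the threshold/polarity with lowest weighted error
        best = None
        for thr in X:
            for pol in (+1, -1):
                preds = [pol if x >= thr else -pol for x in X]
                err = sum(wi for wi, p, t in zip(w, preds, y) if p != t)
                if best is None or err < best[0]:
                    best = (err, thr, pol, preds)
        err, thr, pol, preds = best
        # hypothesis weight: larger when the weighted error is smaller
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
        ensemble.append((alpha, thr, pol))
        # re-weight: misclassified examples up, correct ones down, then normalize
        w = [wi * math.exp(-alpha * p * t) for wi, p, t in zip(w, preds, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted majority vote of the weak hypotheses."""
    score = sum(a * (pol if x >= thr else -pol) for a, thr, pol in ensemble)
    return 1 if score >= 0 else -1
```

On a toy 1-D dataset such as `X = [1, 2, 3, 4, 5, 6]` with labels `[-1, -1, -1, 1, 1, 1]`, a few rounds suffice for the ensemble to classify every training point correctly.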
Class Imbalance
What is it?
How do we detect it?
• Precision = TP / (TP + FP): the fraction of the records the classifier predicted to be positive that are actually positive.
• Recall = TP / (TP + FN): the fraction of the positive examples the classifier got right.
• Precision and recall are most informative when examined together, since either one alone can be made high at the expense of the other.
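The two definitions above can be computed directly from label pairs; a minimal sketch (the function name `precision_recall` is our own):

```python
def precision_recall(y_true, y_pred, positive=1):
    """Compute precision and recall for the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of predicted positives, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of actual positives, how many were found
    return precision, recall
```

For example, with `y_true = [1, 1, 1, 0, 0]` and `y_pred = [1, 0, 1, 1, 0]` there are 2 true positives, 1 false positive, and 1 false negative, so both precision and recall are 2/3.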
How do we address it?
Why the name?
• Let the training error ε_t of h_t be given by ε_t = 1/2 − γ_t.
• Previous learning algorithms required that γ_t be known a priori, before boosting begins.
• AdaBoost adapts to the error rates of the individual weak hypotheses, hence the name 'adaptive'.
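The adaptive step can be sketched as a single reweighting round: AdaBoost measures the weighted error ε_t of the weak hypothesis after the fact and sets the hypothesis weight α_t = ½ ln((1 − ε_t)/ε_t) from it, so no bound on γ_t is needed in advance. A minimal sketch (the function name `adaboost_round` is our own):

```python
import math

def adaboost_round(weights, errors_mask):
    """One AdaBoost reweighting step.

    weights     : current example weights (sum to 1)
    errors_mask : True where the weak hypothesis h_t misclassified
    Returns (alpha_t, new_weights).
    """
    eps = sum(w for w, wrong in zip(weights, errors_mask) if wrong)
    alpha = 0.5 * math.log((1 - eps) / eps)   # computed from the observed error
    # misclassified examples up-weighted by e^alpha, correct ones down-weighted
    new_w = [w * math.exp(alpha if wrong else -alpha)
             for w, wrong in zip(weights, errors_mask)]
    z = sum(new_w)                            # normalizer: weights stay a distribution
    return alpha, [w / z for w in new_w]
```

A useful property of this update: after normalization the misclassified examples always carry exactly half of the total weight, which forces the next weak hypothesis to pay attention to them.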