Time To Explore (5) ML
Submitted To: Dr. Nilamadhab Mishra, Faculty, SCSE
Submitted By: Aditya Vikram Singh (Reg: 22BCE10522)
Ques 1. What do you mean by Ensemble Learning? Investigate common ensemble learning methods.
Ans. Ensemble learning is a technique in machine learning where multiple models, often
called "weak learners," are trained to solve the same problem and combined to get better
performance than any of the individual models could achieve on their own. The idea is
that by combining the predictions from multiple models, the ensemble can reduce the risk
of overfitting and improve generalization to new data.
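The combining step can be illustrated with a minimal hard-voting sketch in pure Python; the three model predictions here are hypothetical stand-ins, not outputs of real trained models:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine hard class predictions from several models by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical weak learners vote on the label of one instance
print(majority_vote(["cat", "dog", "cat"]))  # cat
```

Even though one of the three models disagrees, the ensemble output follows the majority, which is the basic mechanism that lets an ensemble outperform its individual members.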
Bagging (Bootstrap Aggregating)
Architecture:
• Train each base model independently on a bootstrap sample (a random sample drawn with replacement) of the training data.
• Aggregate the models' outputs by majority vote (classification) or averaging (regression).
Example: Random Forest
Advantages:
• Reduces variance
• Handles overfitting better than a single model
Disadvantages:
• Computationally more expensive than training a single model
• The combined model is harder to interpret
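A short bagging sketch with scikit-learn, assuming a synthetic dataset; the dataset and hyperparameters are illustrative only:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data (illustrative only)
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 50 decision trees, each fit on a bootstrap sample of the training
# data; test-time predictions are combined by majority vote
bag = BaggingClassifier(DecisionTreeClassifier(),
                        n_estimators=50, random_state=0)
bag.fit(X_tr, y_tr)
print(bag.score(X_te, y_te))
```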
Boosting
Architecture:
• Train base models sequentially, where each new model focuses on the training instances the previous models misclassified.
• Combine the models with a weighted vote, giving more weight to the more accurate learners.
Example: AdaBoost
Advantages:
• Reduces bias and often achieves high accuracy
Disadvantages:
• Sensitive to noisy data and outliers
• Training is sequential, so it is harder to parallelize
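The AdaBoost example above can be sketched with scikit-learn; the synthetic data and the number of estimators are assumptions for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (illustrative only)
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Weak learners are trained one after another; each boosting round
# reweights the training instances the previous rounds got wrong
ada = AdaBoostClassifier(n_estimators=100, random_state=0)
ada.fit(X_tr, y_tr)
print(ada.score(X_te, y_te))
```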
Stacking
Architecture:
• Step 1: Train base models (e.g., decision tree, SVM, neural network)
on the training data.
• Step 2: Use the predictions of the base models as input features for
the meta-learner.
• Step 3: Train the meta-learner on this new feature set.
• Step 4: For final prediction, use the meta-learner to combine the
predictions from the base models.
Advantages:
• Can combine heterogeneous models (e.g., trees, SVMs, neural networks)
• The meta-learner often outperforms any single base model
Disadvantages:
• More complex to train and tune than bagging or boosting
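The four steps above can be sketched with scikit-learn's stacking API; the dataset and the choice of base models and meta-learner are illustrative assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification data (illustrative only)
X, y = make_classification(n_samples=500, random_state=0)

# Steps 1-2: cross-validated predictions of the base models become the
# meta-learner's input features; steps 3-4: LogisticRegression is the
# meta-learner that combines them into the final prediction
stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                ("svm", SVC(random_state=0))],
    final_estimator=LogisticRegression(),
)
stack.fit(X, y)
print(stack.score(X, y))
```

Note that `StackingClassifier` uses cross-validated base-model predictions to train the meta-learner, which reduces the risk of the meta-learner overfitting to the base models' training-set outputs.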
Error-Correcting Output Codes (ECOC)
Architecture:
1. Coding Matrix:
o Create a binary coding matrix where rows correspond to the
original classes and columns correspond to the binary classifiers.
o Each class is represented by a binary string (code word), and
each binary classifier distinguishes between a subset of classes.
2. Training Binary Classifiers:
o Train a separate binary classifier for each column of the coding
matrix.
o Each classifier learns to distinguish between two groups of
classes based on the coding matrix.
3. Decoding Predictions:
o For a new instance, get predictions from all binary classifiers.
o Compare the resulting binary string to each row in the coding
matrix to find the closest match (using a distance measure like
Hamming distance).
4. Final Prediction:
o The class whose code word is closest to the predicted binary
string is chosen as the final class prediction.
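The whole pipeline is available in scikit-learn as `OutputCodeClassifier`; this sketch uses the built-in digits dataset and an assumed `code_size`, both illustrative:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OutputCodeClassifier

# Handwritten digit images (8x8 pixels), 10 classes
X, y = load_digits(return_X_y=True)

# code_size=2 allocates roughly 2 * n_classes binary classifiers, i.e.
# columns of the coding matrix; each learns one binary split of the classes
ecoc = OutputCodeClassifier(LogisticRegression(max_iter=1000),
                            code_size=2, random_state=0)
ecoc.fit(X, y)
print(ecoc.score(X, y))
```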
Example (handwritten digit recognition):
Getting Predictions:
• For a new handwritten digit image, get predictions from each binary classifier.
• Example predictions: [1, 0, 1, 0].
Decoding Predictions:
• Compare the predicted binary string [1, 0, 1, 0] with each row in the
coding matrix.
• Calculate the Hamming distance between the predicted code and each
class code
Final Prediction:
• The digit whose code has the smallest Hamming distance to the
predicted binary string is selected (in this case, digit 1).
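The decoding step can be shown in a few lines of Python. The coding matrix below is hypothetical (the document's actual matrix is not shown); it is chosen only so that digit 1's code word matches the predicted string [1, 0, 1, 0], consistent with the worked example:

```python
def hamming(a, b):
    """Count positions where two equal-length code words differ."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical coding matrix: rows are digit classes, columns are the
# four binary classifiers (invented for illustration)
codes = {0: [1, 1, 0, 0],
         1: [1, 0, 1, 0],
         2: [0, 1, 0, 1]}

predicted = [1, 0, 1, 0]  # outputs of the four binary classifiers
best = min(codes, key=lambda d: hamming(codes[d], predicted))
print(best)  # 1 — its code word matches exactly (Hamming distance 0)
```

Because classification goes by the *closest* code word rather than an exact match, a few wrong binary classifiers can still yield the correct class, which is what makes the codes "error-correcting".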