Introduction to Machine Learning - - Unit 10 - Week 7

The document outlines the content for Week 7 of the 'Introduction to Machine Learning' course, focusing on assignments related to model evaluation, cross-validation, and ensemble methods. It includes questions and accepted answers regarding training and validation datasets, confusion matrices, boosting, and bagging techniques. The assignment submission deadline was March 12, 2025, and the document confirms correct answers for various evaluation scenarios.

4/22/25, 4:52 AM Introduction to Machine Learning - - Unit 10 - Week 7

Week 7 : Assignment 7
The due date for submitting this assignment has passed.
Due on 2025-03-12, 23:59 IST.
Assignment submitted on 2025-03-11, 14:28 IST.
1) Which of the following statement(s) regarding the evaluation of Machine Learning models is/are true? (1 point)

A model with a lower training loss will perform better on a validation dataset.
A model with a higher training accuracy will perform better on a validation dataset.
The train and validation datasets can be drawn from different distributions.
The train and validation datasets must accurately represent the real distribution of data.

Yes, the answer is correct.
Score: 1
Accepted Answers:
The train and validation datasets must accurately represent the real distribution of data.
online
course 2) Suppose we have a classification dataset comprising of 2 classes A and B with 200 1 point
work? () and 40 samples respectively. Suppose we use stratified sampling to split the data into train and
test sets. Which of the following train-test splits would be appropriate?
Week 0 ()

Train-{A : 50 samples, B : 10 samples}, Test-{A : 150 samples, B : 30 samples}


Week 1 ()

Train-{A : 50 samples, B : 30 samples}, Test- {A : 150 samples, B : 10 samples}


Week 2 ()

Train- {A : 150 samples, B : 30 samples}, Test- {A : 50 samples, B : 10 samples}


Week 3 ()
Train- {A : 150 samples, B : 10 samples}, Test- {A : 50 samples, B : 30 samples}
Week 4 ()
Yes, the answer is correct.

Week 5 ()

https://onlinecourses.nptel.ac.in/noc25_cs46/unit?unit=84&assessment=314 1/5
Score: 1
Accepted Answers:
Train- {A : 150 samples, B : 30 samples}, Test- {A : 50 samples, B : 10 samples}
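The accepted split can be reproduced with a small stratified-split sketch in plain Python (the 25% test fraction and integer "samples" are illustrative stand-ins for the 200/40 dataset in the question; each class is split separately so both sides keep the 5:1 class ratio):

```python
import random

def stratified_split(samples_by_class, test_fraction=0.25, seed=0):
    """Split each class separately so train and test preserve the class ratio."""
    rng = random.Random(seed)
    train, test = [], []
    for label, samples in samples_by_class.items():
        shuffled = samples[:]
        rng.shuffle(shuffled)
        n_test = round(len(shuffled) * test_fraction)
        test += [(label, s) for s in shuffled[:n_test]]
        train += [(label, s) for s in shuffled[n_test:]]
    return train, test

# 200 samples of class A and 40 of class B, as in the question
data = {"A": list(range(200)), "B": list(range(40))}
train, test = stratified_split(data, test_fraction=0.25)
```

With a 25% test fraction this yields exactly the accepted answer: Train {A: 150, B: 30}, Test {A: 50, B: 10}.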
3) Suppose we are performing cross-validation on a multiclass classification dataset with N data points. Which of the following statement(s) is/are correct? (1 point)

In k-fold cross-validation, we train k − 1 different models and evaluate them on the same test set
In k-fold cross-validation, we train k different models and evaluate them on different test sets
In k-fold cross-validation, each fold should have a class-wise proportion similar to the given dataset.
In LOOCV (Leave-One-Out Cross Validation), we train N different models, using N − 1 data points for training each model

Yes, the answer is correct.
Score: 1
Accepted Answers:
In k-fold cross-validation, we train k different models and evaluate them on different test sets
In k-fold cross-validation, each fold should have a class-wise proportion similar to the given dataset.
In LOOCV (Leave-One-Out Cross Validation), we train N different models, using N − 1 data points for training each model
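The first and last accepted statements can be checked with a minimal fold generator (plain Python, no library assumed; class-wise stratification of each fold would additionally require splitting per class, which is omitted here). k-fold produces k disjoint test folds, and LOOCV is simply the k = N special case, so each of the N models trains on N − 1 points:

```python
def kfold_indices(n, k):
    """Split indices 0..n-1 into k contiguous, disjoint test folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = [i for i in range(n) if i < start or i >= start + size]
        folds.append((train, test))
        start += size
    return folds

N = 10
folds = kfold_indices(N, 5)      # 5 models, each with its own test fold
loocv = kfold_indices(N, N)      # LOOCV: N models, each trained on N - 1 points
```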
4) (Qns 4 to 7) For a binary classification problem, we train classifiers and evaluate them to obtain confusion matrices in the following format (rows are actual classes, columns are predicted classes):

[ TP  FN ]
[ FP  TN ]

Which of the following classifiers should be chosen to maximize the recall? (1 point)

[  4   6 ]
[ 13  77 ]

[  8   2 ]
[ 40  60 ]

[  5   5 ]
[  9  81 ]

[  7   3 ]
[  0  90 ]

Yes, the answer is correct.
Score: 1
Accepted Answers:
[  8   2 ]
[ 40  60 ]
5) For the confusion matrices described in Q4, which of the following classifiers should be chosen to minimize the False Positive Rate? (1 point)

[  4   6 ]
[  6  84 ]

[  8   2 ]
[ 13  77 ]

[  1   9 ]
[  2  88 ]

[ 10   0 ]
[  4  86 ]

Yes, the answer is correct.
Score: 1
Accepted Answers:
[  1   9 ]
[  2  88 ]
6) For the confusion matrices described in Q4, which of the following classifiers should be chosen to maximize the precision? (1 point)

[  4   6 ]
[  6  84 ]

[  8   2 ]
[ 13  77 ]

[  1   9 ]
[  2  88 ]

[ 10   0 ]
[  4  86 ]

Yes, the answer is correct.
Score: 1
Accepted Answers:
[ 10   0 ]
[  4  86 ]
7) For the confusion matrices described in Q4, which of the following classifiers should be chosen to maximize the F1-score? (1 point)

[  4   6 ]
[  6  84 ]

[  8   2 ]
[  3  87 ]

[  1   9 ]
[  2  88 ]

[ 10   0 ]
[  4  86 ]

Yes, the answer is correct.
Score: 1
Accepted Answers:
[ 10   0 ]
[  4  86 ]
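With matrices in the [[TP, FN], [FP, TN]] layout used in Q4–Q7, all four metrics reduce to a few ratios. The sketch below recomputes them for Q7's options (matrix values taken from the question; the variable names are mine):

```python
def metrics(cm):
    """Return (recall, FPR, precision, F1) for a [[TP, FN], [FP, TN]] matrix."""
    (tp, fn), (fp, tn) = cm
    recall = tp / (tp + fn)
    fpr = fp / (fp + tn)                 # False Positive Rate
    precision = tp / (tp + fp)
    f1 = 2 * precision * recall / (precision + recall)
    return recall, fpr, precision, f1

q7_options = [[[4, 6], [6, 84]],
              [[8, 2], [3, 87]],
              [[1, 9], [2, 88]],
              [[10, 0], [4, 86]]]
best_f1 = max(q7_options, key=lambda cm: metrics(cm)[3])
```

For Q7 this selects [[10, 0], [4, 86]] (recall 1.0, precision 10/14, F1 ≈ 0.833), matching the accepted answer; the same function verifies Q4 (recall 8/10 = 0.8) and Q5 (FPR 2/90).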

8) Which of the following statement(s) regarding boosting is/are correct? (1 point)

Boosting is an example of an ensemble method
Boosting assigns equal weights to the predictions of all the weak classifiers
Boosting may assign unequal weights to the predictions of all the weak classifiers
The individual classifiers in boosting can be trained in parallel
The individual classifiers in boosting cannot be trained in parallel

Yes, the answer is correct.
Score: 1
Accepted Answers:
Boosting is an example of an ensemble method
Boosting may assign unequal weights to the predictions of all the weak classifiers
The individual classifiers in boosting cannot be trained in parallel
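Both accepted properties are visible in a toy AdaBoost-style sketch (decision stumps on a made-up 1-D dataset; not the course's reference implementation). Each classifier gets its own weight alpha, and each round's sample weights depend on the previous classifier's mistakes, which is why the rounds cannot run in parallel:

```python
import math

def stump(threshold, sign, x):
    """Weak learner: predicts sign if x >= threshold, else -sign."""
    return sign if x >= threshold else -sign

def adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n                       # start with uniform sample weights
    ensemble = []
    for _ in range(rounds):
        # pick the stump with the lowest weighted error under current weights
        err, t, s = min(
            (sum(wi for xi, yi, wi in zip(X, y, w) if stump(t, s, xi) != yi), t, s)
            for t in sorted(set(X)) for s in (1, -1))
        err = max(err, 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)   # unequal classifier weight
        # reweight samples: mistakes of THIS round shape the NEXT round
        w = [wi * math.exp(-alpha * yi * stump(t, s, xi))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
        ensemble.append((alpha, t, s))
    return ensemble

def predict(ensemble, x):
    return 1 if sum(a * stump(t, s, x) for a, t, s in ensemble) >= 0 else -1

X, y = [1, 2, 3, 4, 5, 6], [1, 1, 1, -1, -1, -1]
model = adaboost(X, y)
```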

9) Which of the following statement(s) about bagging is/are correct? (1 point)

Bagging is an example of an ensemble method
The individual classifiers in bagging can be trained in parallel
Training sets are constructed from the original dataset by sampling with replacement
Training sets are constructed from the original dataset by sampling without replacement
Bagging increases the variance of an unstable classifier.

Yes, the answer is correct.
Score: 1
Accepted Answers:
Bagging is an example of an ensemble method
The individual classifiers in bagging can be trained in parallel
Training sets are constructed from the original dataset by sampling with replacement
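Sampling with replacement (the bootstrap) is the heart of bagging, and because each training set is built independently of the others, the member classifiers could be fit in parallel. A minimal sketch on toy data (no actual classifier is assumed):

```python
import random

def bootstrap_training_sets(data, n_models, seed=0):
    """One bootstrap sample per ensemble member: drawn with replacement,
    same size as the original dataset, each built independently."""
    rng = random.Random(seed)
    return [[rng.choice(data) for _ in range(len(data))]
            for _ in range(n_models)]

data = list(range(20))
training_sets = bootstrap_training_sets(data, n_models=5)
```

Each member model would then be fit on its own set and their predictions combined by voting or averaging, which is what reduces the variance of an unstable base classifier.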

10) Which of the following statement(s) about ensemble methods is/are correct? (1 point)

Ensemble aggregation methods like bagging aim to reduce overfitting and variance
Committee machines may consist of different types of classifiers
Weak learners are models that perform slightly worse than random guessing
Stacking involves training multiple models and stacking their predictions into new training data

Yes, the answer is correct.
Score: 1
Accepted Answers:
Ensemble aggregation methods like bagging aim to reduce overfitting and variance
Committee machines may consist of different types of classifiers
Stacking involves training multiple models and stacking their predictions into new training data
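Stacking as described in the accepted answer: base-model predictions become the features of a new training set for a meta-learner. A minimal sketch, with made-up rule-based functions standing in for trained base classifiers:

```python
# toy base learners (stand-ins for trained classifiers)
base_models = [
    lambda x: 1 if x > 0 else 0,        # "is positive?"
    lambda x: 1 if x % 2 == 0 else 0,   # "is even?"
]

def stack_features(X):
    """Level-1 training data: one row per sample, one column per base model."""
    return [[model(x) for model in base_models] for x in X]

meta_training_data = stack_features([3, -2, 4])
# a meta-learner (e.g. logistic regression) would then be fit on these rows
```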
