Introduction to Machine Learning - Unit 9 - Week 6

The document outlines the Week 6 assignment for the 'Introduction to Machine Learning' course, focusing on decision trees and their properties. It includes questions on decision tree algorithms, overfitting indicators, and entropy calculations, along with the correct answers and scores for each question. The assignment submission deadline was March 5, 2025, and some answers were marked correct or incorrect based on the provided statements.


4/22/25, 4:51 AM Introduction to Machine Learning - - Unit 9 - Week 6



Week 6 : Assignment 6

The due date for submitting this assignment has passed.
Due on 2025-03-05, 23:59 IST.
Assignment submitted on 2025-03-04, 07:05 IST.
1) Statement: Decision Tree is an unsupervised learning algorithm. (1 point)
Reason: The splitting criterion uses only the features of the data to calculate their respective measures.

Statement is True. Reason is True.
Statement is True. Reason is False.
Statement is False. Reason is True.
Statement is False. Reason is False.

No, the answer is incorrect. Score: 0
Accepted Answers: Statement is False. Reason is False.
2) Increasing the pruning strength in a decision tree by reducing the maximum depth: (1 point)

Will always result in improved validation accuracy.
Will lead to more overfitting.
Might lead to underfitting if set too aggressively.
Will have no impact on the tree's performance.
Will eliminate the need for validation data.

Yes, the answer is correct. Score: 1
Accepted Answers: Might lead to underfitting if set too aggressively.

https://onlinecourses.nptel.ac.in/noc25_cs46/unit?unit=71&assessment=312 1/4

3) What is a common indicator of overfitting in a decision tree? (1 point)

The training accuracy is high while the validation accuracy is low.
The tree is shallow.
The tree has only a few leaf nodes.
The tree's depth matches the number of attributes in the dataset.
The tree's predictions are consistently biased.

Yes, the answer is correct. Score: 1
Accepted Answers: The training accuracy is high while the validation accuracy is low.
Decision Trees
- Stopping 4) Consider the following statements: 1 point
Criteria and
Statement 1: Decision Trees are linear non-parametric models.
Pruning (unit?
Statement 2: A decision tree may be used to explain the complex function learned by a neural
unit=71&lesso
network.
n=74)

Decision Trees Both the statements are True.


for Statement 1 is True, but Statement 2 is False.
Classification -
Statement 1 is False, but Statement 2 is True.
Loss
Functions Both the statements are False.
(unit?
Yes, the answer is correct.
unit=71&lesso Score: 1
n=75)
Accepted Answers:
Decision Trees Statement 1 is False, but Statement 2 is True.
5) Entropy for a 50-50 split between two classes is: (1 point)

0
0.5
1
None of the above

Yes, the answer is correct. Score: 1
Accepted Answers: 1
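The accepted answer follows from the binary entropy formula H(p) = -p log2(p) - (1-p) log2(1-p): a 50-50 split (p = 0.5) gives the maximum of 1 bit. A quick illustrative check in Python (not part of the assignment):

```python
import math

def binary_entropy(p):
    """Entropy in bits of a two-class distribution with P(class 1) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a pure node carries no uncertainty
    q = 1.0 - p
    return -(p * math.log2(p) + q * math.log2(q))

print(binary_entropy(0.5))  # 1.0 -- the maximum possible for two classes
```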
6) Consider a dataset with only one categorical attribute. Suppose there are 10 unordered values in this attribute; how many possible combinations must be evaluated to find the best split-point for building the decision tree classifier? (1 point)

1024
511
1023
512

Yes, the answer is correct. Score: 1
Accepted Answers: 511
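The accepted answer 511 comes from counting the distinct ways to partition k unordered categories into two non-empty subsets: 2^(k-1) - 1. A small sketch of the count (helper name is my own, not from the course):

```python
def num_binary_splits(k):
    """Distinct two-way splits of k unordered categorical values.

    Each of the 2**k subsets defines a candidate split, but a subset and
    its complement describe the same split (divide by 2), and the
    empty-vs-full split is useless (subtract 1): 2**(k-1) - 1.
    """
    return 2 ** (k - 1) - 1

print(num_binary_splits(10))  # 511
```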


7) Consider the following dataset: (2 points)

[Dataset table not reproduced in this capture.]

What is the initial entropy of Malignant?

0.543
0.9798
0.8732
1

No, the answer is incorrect. Score: 0
Accepted Answers: 0.9798
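The dataset table itself is missing from this capture, but the accepted value 0.9798 is consistent with a class split of about 5 malignant out of 12 samples. That is an inference from the answer alone, not from the (missing) table; the calculation can be sketched as:

```python
import math

def entropy_from_counts(counts):
    """Entropy in bits of a class distribution given raw counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

# Assumed counts: 5 malignant vs. 7 non-malignant (table not shown);
# this reproduces the accepted 0.9798 up to rounding.
print(entropy_from_counts([5, 7]))
```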

8) For the same dataset, what is the info gain of Vaccination? (2 points)

0.4763
0.2102
0.1134
0.9355

Yes, the answer is correct. Score: 2
Accepted Answers: 0.4763
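Information gain is the parent entropy minus the size-weighted average entropy of the children after the split. Since the dataset table is not reproduced here, the child counts below are hypothetical, chosen only to illustrate the computation (they happen to match the accepted 0.4763 up to rounding, but are not taken from the original table):

```python
import math

def entropy(counts):
    """Entropy in bits of a class distribution given raw counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

def info_gain(parent, children):
    """Parent entropy minus the size-weighted entropy of the children."""
    n = sum(parent)
    weighted = sum(sum(child) / n * entropy(child) for child in children)
    return entropy(parent) - weighted

# Hypothetical split on Vaccination: one branch with 5 malignant / 2 benign,
# the other with 0 malignant / 5 benign (assumed counts, table not shown).
print(info_gain([5, 7], [[5, 2], [0, 5]]))  # close to 0.476
```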
