
Total No. of Questions : 4]                                        SEAT No. :
P-5056                                            [Total No. of Pages : 2

                              [6187]-459
                          T.E. (IT) (Insem.)
                         MACHINE LEARNING
               (2019 Pattern) (Semester - I) (314443)

Time : 1 Hour]                                        [Max. Marks : 30
Instructions to the candidates:
    1) Answer Q.1 or Q.2, Q.3 or Q.4.
    2) Neat diagrams must be drawn wherever necessary.
    3) Figures to the right side indicate full marks.
    4) Assume suitable data if necessary.

Q1) a) Compare supervised, unsupervised, and semi-supervised learning with
       examples (CO1). [6]

    b) Explain the k-fold cross-validation technique with an example (CO1). [5]

    c) Why is dataset splitting required? State the importance of each split in a
       machine learning model (CO1). [4]
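
A minimal illustrative sketch of the k-fold cross-validation technique asked about in Q1 b); the synthetic dataset, the choice of k = 5, and the logistic-regression model are assumptions made only for the example.

    # k-fold cross-validation sketch (data, model and k = 5 are illustrative assumptions)
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import KFold, cross_val_score

    # Synthetic binary-classification data standing in for a real dataset
    X, y = make_classification(n_samples=200, n_features=10, random_state=0)

    # Split the data into k = 5 folds; each fold serves once as the validation set
    # while the remaining k-1 folds are used for training
    kfold = KFold(n_splits=5, shuffle=True, random_state=0)
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=kfold)

    print("Per-fold accuracy:", scores)
    print("Mean accuracy:", scores.mean())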



OR

Q2) a) Why is the training dataset larger than the testing dataset? What should
       be the ratio of training to testing data? Explain any one dataset
       validation technique (CO1). [6]

    b) What is the need for dimensionality reduction? Explain the concept of
       the Curse of Dimensionality (CO1). [5]

    c) State and justify real-life applications of supervised and unsupervised
       learning (CO1). [4]
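
A hedged sketch of a hold-out train/test split of the kind Q2 a) refers to; the 80:20 ratio and the synthetic dataset are illustrative assumptions (70:30 is an equally common choice).

    # Hold-out split sketch: 80% training, 20% testing (ratio chosen for illustration)
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)

    # The larger training share lets the model learn the underlying patterns;
    # the smaller test share is held back solely for an unbiased performance estimate
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0, stratify=y)

    print(X_train.shape, X_test.shape)   # (400, 10) (100, 10)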



Q3) a) Consider the following feature tree. (Positive class: Decline offer)

       Find : [6]
       i)   Contingency table        ii)  Recall
       iii) Precision                iv)  Accuracy
       v)   False positive rate (CO2)

    b) What is multiclass classification? Explain the one-vs-one construction
       method of a multiclass classifier with a suitable example (CO2). [5]

    c) What are the advantages and limitations of Logistic Regression
       (CO2)? [4]
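
A minimal sketch of the one-vs-one construction asked about in Q3 b), using scikit-learn's OneVsOneClassifier; the iris dataset and the linear SVM base classifier are assumptions made for illustration.

    # One-vs-one multiclass construction sketch (dataset and base classifier assumed)
    from sklearn.datasets import load_iris
    from sklearn.multiclass import OneVsOneClassifier
    from sklearn.svm import LinearSVC

    X, y = load_iris(return_X_y=True)   # 3 classes -> 3*(3-1)/2 = 3 pairwise classifiers

    # One binary LinearSVC is trained for every pair of classes; prediction is by
    # majority voting among the pairwise classifiers
    ovo = OneVsOneClassifier(LinearSVC(max_iter=10000)).fit(X, y)

    print("Number of pairwise classifiers:", len(ovo.estimators_))   # 3
    print("Prediction for the first sample:", ovo.predict(X[:1]))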

OR

Q4) a) Consider the following three-class confusion matrix:

                            Predicted
                          A     B     C
                    A    14     3     3  |  20
          Actual    B     5    15    10  |  30
                    C     2     5    43  |  50
                         21    23    56  | 100

       Calculate the per-class precision, per-class recall, weighted average
       precision, weighted average recall and accuracy (CO2). [6]

    b) What is a Support Vector Machine (SVM)? How does the SVM work
       (CO2)? [5]

    c) Why do we use Logistic Regression? Explain with a suitable example
       (CO2). [4]
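
A hedged numpy sketch of the calculation Q4 a) asks for, using only the matrix values given above; the rounding shown in the comments is illustrative.

    # Metric calculations for the confusion matrix in Q4 a)
    # (rows = actual classes A, B, C; columns = predicted classes A, B, C)
    import numpy as np

    cm = np.array([[14,  3,  3],
                   [ 5, 15, 10],
                   [ 2,  5, 43]])

    support   = cm.sum(axis=1)                  # actual counts per class: 20, 30, 50
    precision = np.diag(cm) / cm.sum(axis=0)    # TP / predicted-column total
    recall    = np.diag(cm) / support           # TP / actual-row total
    accuracy  = np.trace(cm) / cm.sum()         # correct predictions / all predictions

    # Weighted averages use the actual class counts (20, 30, 50) as weights
    weighted_precision = np.average(precision, weights=support)
    weighted_recall    = np.average(recall, weights=support)

    print("Per-class precision:", precision.round(3))        # [0.667 0.652 0.768]
    print("Per-class recall:   ", recall.round(3))           # [0.7   0.5   0.86 ]
    print("Weighted precision: ", round(weighted_precision, 3))
    print("Weighted recall:    ", round(weighted_recall, 3))
    print("Accuracy:           ", accuracy)                  # 0.72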
