
CS60050: Machine Learning

Mid-semester Examination, Autumn 2017


Time: 2 hrs. Marks: 45. Answer all THREE questions.

1.(a) Show a decision tree that realizes the parity function of four Boolean variables, A, B, C, and D. A parity function evaluates to 1 if there is an odd number of ones in the input, and to 0 otherwise. [5]
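For concreteness, a minimal Python sketch of the target function (the parity helper below is ours, not part of the question paper) that tabulates the 16 input rows the tree must realize:

    from itertools import product

    def parity(bits):
        """Return 1 if the 0/1 tuple contains an odd number of ones, else 0."""
        return sum(bits) % 2

    # Enumerate the full truth table for four Boolean variables A, B, C, D.
    for row in product([0, 1], repeat=4):
        print(row, "->", parity(row))

Note that flipping any single input flips the parity, so every root-to-leaf path of a correct tree must test all four variables.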

(b) Build a decision tree to classify the following patterns. Use the information gain criterion. Show all the calculations systematically. What Boolean function does the tree realize? [10]

    Pattern (x1, x2, x3)    Class
    (0, 0, 0)               0
    (0, 0, 1)               0
    (0, 1, 0)               0
    (0, 1, 1)               0
    (1, 0, 0)               0
    (1, 0, 1)               1
    (1, 1, 0)               0
    (1, 1, 1)               1
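As a cross-check on the hand calculations, a minimal Python sketch (the entropy and info_gain helpers are ours) that computes the information gain of each attribute at the root; the recursion on the resulting subsets follows the same pattern:

    from math import log2

    # Dataset transcribed from the table above: ((x1, x2, x3), class)
    data = [((0, 0, 0), 0), ((0, 0, 1), 0), ((0, 1, 0), 0), ((0, 1, 1), 0),
            ((1, 0, 0), 0), ((1, 0, 1), 1), ((1, 1, 0), 0), ((1, 1, 1), 1)]

    def entropy(labels):
        """Shannon entropy of a list of class labels."""
        n = len(labels)
        return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                    for c in set(labels))

    def info_gain(rows, i):
        """Information gain of splitting the rows on attribute index i."""
        labels = [c for _, c in rows]
        gain = entropy(labels)
        for v in (0, 1):
            subset = [c for x, c in rows if x[i] == v]
            if subset:
                gain -= len(subset) / len(rows) * entropy(subset)
        return gain

    for i in range(3):
        print(f"Gain(x{i + 1}) = {info_gain(data, i):.4f}")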

2. Consider two classes ω1 and ω2. We want to classify a variable x into one of these two classes. Suppose p(x|ω1) and p(x|ω2) are defined as follows: [10 + 5]

    p(x|ω1) = (1/√(2π)) e^(−x²/2),  for all x
    p(x|ω2) = 1/4,  for −2 < x < 2

(a) Find the minimum error classification rule g(x) for this two-class problem, assuming P(ω1) = P(ω2) = 0.5.
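One way to sanity-check a hand derivation here, assuming SciPy is available: with equal priors the minimum error rule compares the class-conditional densities directly, so inside (−2, 2) the boundary sits where the standard normal density crosses 1/4, while outside (−2, 2) the uniform density is zero and ω1 wins by default.

    from scipy.optimize import brentq
    from scipy.stats import norm

    # Solve norm.pdf(x) = 0.25 on (0, 2); by symmetry the other crossing is at -x.
    x_star = brentq(lambda x: norm.pdf(x) - 0.25, 0.0, 2.0)
    print(f"densities cross at |x| = {x_star:.4f}")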

(b) There is a prior probability of class 1, designated π1*, such that if P(ω1) > π1*, the minimum error classification rule is to always decide ω1 regardless of x. Find π1*. There is no π2* such that if P(ω2) > π2* we would always decide ω2. Why not?
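A numeric sketch of the worst-case reasoning behind (b), again assuming SciPy: always deciding ω1 is optimal exactly when π1 p(x|ω1) ≥ (1 − π1) p(x|ω2) for every x in (−2, 2), and the Gaussian density is smallest on that interval as |x| approaches 2, so the binding constraint sits at the edge.

    from scipy.stats import norm

    phi2 = norm.pdf(2.0)             # standard normal density at the edge x = 2
    pi1_star = 0.25 / (0.25 + phi2)  # solves pi1 * phi(2) = (1 - pi1) / 4
    print(f"pi1* = {pi1_star:.4f}")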

3. (a) Consider a support vector machine whose input space is ℝ2, and in which the kernel function is computed as k(𝒙, 𝒚) = (𝒙 ⋅ 𝒚 + 1)² − 1 (bold letters represent vectors in ℝ2). Find the mapping ɸ(𝒙) to the feature space corresponding to this kernel. Show your derivation. [5]
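A quick numeric check of a candidate map obtained by expanding (𝒙 ⋅ 𝒚 + 1)² − 1 term by term (the phi below is our candidate; the exam still expects the algebraic derivation):

    import numpy as np

    def k(x, y):
        return (np.dot(x, y) + 1.0) ** 2 - 1.0

    def phi(x):
        x1, x2 = x
        r2 = np.sqrt(2.0)
        return np.array([x1 * x1, r2 * x1 * x2, x2 * x2, r2 * x1, r2 * x2])

    # phi(x) . phi(y) should reproduce k(x, y) on random inputs.
    rng = np.random.default_rng(0)
    for _ in range(3):
        x, y = rng.normal(size=2), rng.normal(size=2)
        assert np.isclose(k(x, y), phi(x) @ phi(y))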

(b) Let X1 = (1, −1, −1), y1 = −1; X2 = (−3, 1, 1), y2 = 1; X3 = (−3, 1, −1), y3 = −1; X4 = (1, 2, 1), y4 = −1; and X5 = (−1, −1, 2), y5 = 1 be five binary-labeled training examples. These points are linearly separable. Derive the optimum margin classifier (support vectors, weights, and threshold value) and the margin. [10]
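A numeric cross-check, assuming scikit-learn is available: a linear SVC with a very large C approximates the hard-margin solution, from which the support vectors, weight vector w, threshold b, and margin 2/||w|| can be read off (the exam expects the derivation by hand):

    import numpy as np
    from sklearn.svm import SVC

    X = np.array([[1, -1, -1], [-3, 1, 1], [-3, 1, -1], [1, 2, 1], [-1, -1, 2]])
    y = np.array([-1, 1, -1, -1, 1])

    clf = SVC(kernel="linear", C=1e6).fit(X, y)
    w, b = clf.coef_[0], clf.intercept_[0]
    print("support vectors:\n", clf.support_vectors_)
    print("w =", w, " b =", b)
    print("margin =", 2.0 / np.linalg.norm(w))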

---------- BEST WISHES ---------
