ASSIGN8
Ans: A
A. 21
B. -21
C. 3
D. -3
Ans: D
Explanation: The slope-intercept form of a line is y = mx + c.
A. real variable
B. integer variable
C. character variable
D. string variable
Ans: A
Ans: C
5. The linear regression model y = a0 + a1x is applied to the data in the table
shown below. What is the value of the sum squared error function S(a0, a1),
when a0 = 1, a1 = 2?
x y
1 1
2 1
4 6
3 2
A. 0.0
B. 27
C. 13.5
D. 54
Ans: D
Explanation: y' is the predicted output, where y' = a0 + a1x = 1 + 2x.
x y y'
1 1 3
2 1 5
4 6 9
3 2 7
S(a0, a1) = (1-3)^2 + (1-5)^2 + (6-9)^2 + (2-7)^2 = 4 + 16 + 9 + 25 = 54
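As a quick check, the sketch below (plain Python, purely illustrative) computes S(a0, a1) as the sum of squared residuals over the table and prints 54.

```python
# Sum-squared-error check for the model y' = a0 + a1*x with a0 = 1, a1 = 2.
data = [(1, 1), (2, 1), (4, 6), (3, 2)]  # (x, y) pairs from the table

def sse(a0, a1, pairs):
    """Return S(a0, a1) = sum over the data of (y - (a0 + a1*x))^2."""
    return sum((y - (a0 + a1 * x)) ** 2 for x, y in pairs)

print(sse(1, 2, data))  # -> 54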
A. y = a0 + a1/x1 + a2/x2
B. y = a0 + a1x1 + a2x2
C. y = a0 + a1x1 + a2x2^2
D. y = a0 + a1x1^2 + a2x2
Ans: B
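Option B is the standard multiple linear regression model. A minimal sketch of fitting it by least squares is given below; NumPy is assumed and the x1, x2, y values are made-up illustrative data, not from the question.

```python
import numpy as np

# Made-up data for a model of the form y = a0 + a1*x1 + a2*x2 (option B).
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = np.array([5.1, 5.9, 11.2, 10.8, 16.1])

# Design matrix with a leading column of ones for the intercept a0.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares estimates of (a0, a1, a2).
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)
```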
A. 1,3
B. 2,3
C. 1,2,3
D. Eigenvalues cannot be found.
Ans: C
Explanation: If A is an n × n triangular matrix (upper triangular, lower
triangular, or diagonal), then the eigenvalues of A are entries of the main
diagonal of A. Therefore, eigenvalues are 1,2,3.
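Because the question's matrix is not reproduced above, the sketch below uses an arbitrary upper triangular matrix with diagonal 1, 2, 3 (the off-diagonal entries are assumptions) to illustrate the stated property; NumPy is assumed.

```python
import numpy as np

# Arbitrary upper triangular matrix whose main diagonal is 1, 2, 3.
A = np.array([[1.0, 4.0, 5.0],
              [0.0, 2.0, 6.0],
              [0.0, 0.0, 3.0]])

# For a triangular matrix the eigenvalues are the main-diagonal entries.
print(np.sort(np.linalg.eigvals(A)))  # -> [1. 2. 3.]
```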
8. In the figures below, the training instances for classification problems are
shown as dots. The blue dotted lines indicate the actual functions and the
red lines indicate the regression model. Which of the following statements is
correct?
A. Figure 1 represents overfitting and Figure 2 represents underfitting
Ans: B
Ans: B
Explanation: We must first subtract the mean of each variable from the dataset to center
the data around the origin. Then we compute the covariance matrix of the data and
calculate the eigenvalues and corresponding eigenvectors of this covariance matrix.
Next we must normalize each of the orthogonal eigenvectors to become unit vectors.
Once this is done, each of the mutually orthogonal unit eigenvectors can be interpreted as
an axis of the ellipsoid fitted to the data. This choice of basis transforms the covariance
matrix into a diagonalised form, with the diagonal elements representing the variance of
each axis.
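A minimal NumPy sketch of these PCA steps, using made-up two-variable data (the dataset and variable names are illustrative assumptions):

```python
import numpy as np

# Made-up data: rows are samples, columns are variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

X_centered = X - X.mean(axis=0)           # 1. subtract each variable's mean
cov = np.cov(X_centered, rowvar=False)    # 2. covariance matrix of the centered data
eigvals, eigvecs = np.linalg.eigh(cov)    # 3. eigenvalues and orthonormal eigenvectors

# 4. eigh returns unit-length, mutually orthogonal eigenvectors as columns;
#    projecting onto them diagonalises the covariance matrix.
projected = X_centered @ eigvecs
print(np.cov(projected, rowvar=False).round(6))  # ~diagonal; entries are the variances per axis
```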
10. A time series prediction problem is often best solved using?
A. Multivariate regression
B. Autoregression
C. Logistic regression
D. Sinusoidal regression
Ans: B
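As a rough illustration, the sketch below (NumPy assumed, synthetic data) fits an AR(1) model y_t = c + phi*y_{t-1} + e_t by least squares and makes a one-step-ahead forecast.

```python
import numpy as np

# Synthetic AR(1) series: y_t = 0.5 + 0.8*y_{t-1} + noise (parameters are made up).
rng = np.random.default_rng(1)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.5 + 0.8 * y[t - 1] + rng.normal(scale=0.1)

# Autoregression: regress y_t on a constant and its own previous value y_{t-1}.
X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
(c_hat, phi_hat), *_ = np.linalg.lstsq(X, y[1:], rcond=None)

# One-step-ahead prediction from the last observed value.
print(c_hat, phi_hat, c_hat + phi_hat * y[-1])
```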