
Quiz M2

Principal component analysis (PCA) can be used to summarize data and detect linear relationships. PCA finds the direction of maximal variance. PCA should be used to compress high-dimensional data into a smaller dimension on which models can be built. Singular value decomposition (SVD) can also be used to help find a solution for PCA and is a numerically stable algorithm. Linear discriminant analysis (LDA) is better suited than PCA for separating classes in a lower-dimensional space.
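As a sketch of the compression workflow described above (NumPy only; the data values here are made up for illustration), PCA can be computed through the SVD of the centered data matrix:

```python
import numpy as np

# Toy data: 5 samples, 3 features (values are illustrative only)
X = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [3.0, 1.0, 2.0],
              [1.0, 3.0, 0.0]])

# Center each feature, then take the SVD of the centered matrix;
# the rows of Vt are the principal directions, sorted by variance.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Compress to k dimensions by projecting onto the first k directions
k = 2
X_reduced = Xc @ Vt[:k].T          # shape (5, 2); models can be built on this

# Variance explained by each component
explained = s**2 / (len(X) - 1)
```

This is the SVD route to PCA that the summary refers to; it avoids forming the covariance matrix explicitly.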

My courses ▶ (20/11) MScFE 650 Machine Learning in Finance (C20-S1) ▶ Module 2: Dimensionality Reduction ▶ Quiz M2

Started on Tuesday, 12 January 2021, 3:21 PM


State Finished
Completed on Tuesday, 12 January 2021, 3:43 PM
Time taken 22 mins 2 secs
Marks 15.00/15.00
Grade 20.00 out of 20.00 (100%)

Question 1
Principal component analysis can be used to summarize data and detect ______ relationships.
Correct

Mark 1.00 out of 1.00
Select one:
linear ✓
non-linear

deterministic
non-deterministic
None of the above
Question 2
PCA finds the direction of maximal ______.
Correct

Mark 1.00 out of 1.00
Select one:
mean
variance ✓
skew
kurtosis
None of the above
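The claim can be checked numerically; in this hedged sketch (NumPy, synthetic data), no random unit direction yields more variance than the first principal direction:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic, anisotropic data: feature scales 3, 1, 0.5
X = rng.normal(size=(200, 3)) * np.array([3.0, 1.0, 0.5])
Xc = X - X.mean(axis=0)

# First principal direction = top eigenvector of the covariance matrix
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc.T))
v1 = eigvecs[:, -1]                 # eigh sorts eigenvalues ascending
best = np.var(Xc @ v1)              # variance along the first direction

# Compare against 100 random unit directions: none should do better
for _ in range(100):
    u = rng.normal(size=3)
    u /= np.linalg.norm(u)
    assert np.var(Xc @ u) <= best + 1e-9
```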

Question 3
When should you use PCA?
Correct

Mark 1.00 out of 1.00
Select one:
For clustering
For file compression
For calculating eigenvectors
To compress high-dimensional data into a smaller dimension, on which models can be built. ✓

Question 4
The angle between the two vectors [-1, 1, 1] and [-2, 2, 2], is (in degrees),
Correct
Mark 1.00 out of 1.00
Select one:
0 ✓
90
180
45
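Reading the vectors as [-1, 1, 1] and [-2, 2, 2], the second is twice the first, so they are parallel and the angle is 0 degrees; a quick NumPy check:

```python
import numpy as np

a = np.array([-1.0, 1.0, 1.0])
b = np.array([-2.0, 2.0, 2.0])   # b = 2 * a, so the vectors are parallel

cos_theta = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
angle_deg = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
# angle_deg is 0.0
```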
Question 5
Correct
Mark 1.00 out of 1.00
The first principal direction of data with covariance matrix \begin{bmatrix} 2 & 1\\ 1 & 2 \end{bmatrix} , is given by,
Select one:
\frac{1}{\sqrt{2}}\begin{bmatrix} 1\\ -1 \end{bmatrix}
\frac{1}{\sqrt{2}}\begin{bmatrix} 1\\ 1 \end{bmatrix} ✓
\begin{bmatrix} 1\\ 0 \end{bmatrix}
\begin{bmatrix} 0\\ 1 \end{bmatrix}
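The marked answer can be verified by an eigendecomposition of the covariance matrix (a NumPy sketch):

```python
import numpy as np

cov = np.array([[2.0, 1.0],
                [1.0, 2.0]])

# eigh returns eigenvalues in ascending order for a symmetric matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues are [1, 3]

# The first principal direction is the eigenvector of the largest
# eigenvalue, which is [1, 1] / sqrt(2) (up to sign)
first_direction = eigvecs[:, -1]
```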

Question 6
Correct
Mark 1.00 out of 1.00
The length of the vector \begin{bmatrix} 1\\ -1\\ 2 \end{bmatrix} is,
Select one:
3
\sqrt{6} ✓
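The length is the Euclidean norm, sqrt(1 + 1 + 4) = sqrt(6); checked with NumPy:

```python
import numpy as np

v = np.array([1.0, -1.0, 2.0])
length = np.linalg.norm(v)   # sqrt(1**2 + (-1)**2 + 2**2) = sqrt(6) ≈ 2.449
```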


Question 7
What technique can also be used to help find a solution for PCA?
Correct
Mark 1.00 out of 1.00
Select one:
Middle-out compression

Vector quantization
Directed acyclic graphs
Singular Value Decomposition ✓
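The SVD and eigendecomposition routes to PCA agree, which is easy to confirm numerically (a sketch with synthetic data; the SVD route skips forming the covariance matrix, which is where its numerical stability comes from):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))            # synthetic data
Xc = X - X.mean(axis=0)

# Route 1: eigenvectors of the covariance matrix
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc.T))

# Route 2: right singular vectors of the centered data matrix
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# The top directions from both routes agree up to sign,
# and the squared singular values recover the eigenvalues.
top_eig = eigvecs[:, -1]
top_svd = Vt[0]
```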

Question 8
Which of the following is true?
Correct

Mark 1.00 out of 1.00
Select one:
The principal components are sorted by ascending order of their means, which are equal to the associated eigenvalues.

The first principal component is the linear combination of the Y variables that accounts for the greatest possible skew

The scores on the first j principal components have the highest possible generalized skew of any set of unit-length linear combinations of the original variables

Each subsequent principal component is the linear combination of the Y variables that has the greatest possible variance and is uncorrelated with the previously defined components. ✓
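The uncorrelatedness in the marked statement can be demonstrated directly: the covariance matrix of the principal-component scores is diagonal (a NumPy sketch with synthetic correlated data):

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 3))  # correlated features
Xc = X - X.mean(axis=0)

# Principal-component scores via SVD of the centered data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T

# Off-diagonal covariances of the scores vanish: components are uncorrelated,
# and the diagonal holds the eigenvalues in descending order.
score_cov = np.cov(scores.T)
```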


Question 9
An advantage of using SVD to fit a PCA is?
Correct
Mark 1.00 out of 1.00
Select one:
It is a numerically stable algorithm ✓
It is a non-deterministic method
The speed of convergence is not dependent on k
It will not get stuck in local optimum

Question 10
The angle between the two vectors [-1, 1, 1] and [1, 1, 0], is (in degrees),
Correct
Mark 1.00 out of 1.00
Select one:
45
0
90 ✓
180
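Since (-1)(1) + (1)(1) + (1)(0) = 0, the vectors are orthogonal; a NumPy check:

```python
import numpy as np

a = np.array([-1.0, 1.0, 1.0])
b = np.array([1.0, 1.0, 0.0])

cos_theta = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))  # dot product is 0
angle_deg = np.degrees(np.arccos(cos_theta))
# angle_deg is 90.0
```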

Question 11
PCA is very sensitive with respect to ___.
Correct

Mark 1.00 out of 1.00
Select one:
the scaling of the variables ✓
noise
ordinal variables

categorical variables
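The scaling sensitivity is easy to provoke: multiplying one feature by a large constant drags the first principal direction onto that feature (a hedged NumPy sketch with synthetic data):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 2))
X[:, 1] *= 100.0                 # change the units of the second feature

def first_direction(data):
    """Top eigenvector of the sample covariance matrix."""
    c = data - data.mean(axis=0)
    _, eigvecs = np.linalg.eigh(np.cov(c.T))
    return eigvecs[:, -1]

raw = first_direction(X)                      # dominated by the rescaled feature
scaled = first_direction(X / X.std(axis=0))   # standardizing removes the effect
```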
Question 12
Given data values [-1, 1, 1] and [-1, -1, 1], their mean is given by,
Correct
Mark 1.00 out of 1.00
Select one:
[-1, 0, 1] ✓
[-1, 1, -1]
\frac{1}{2}[-1, 0, 1]
[0, 0, 0]
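The mean is computed component-wise, ([-1, 1, 1] + [-1, -1, 1]) / 2 = [-1, 0, 1]:

```python
import numpy as np

a = np.array([-1.0, 1.0, 1.0])
b = np.array([-1.0, -1.0, 1.0])
mean = (a + b) / 2
# mean is array([-1., 0., 1.])
```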

Question 13
The product \begin{bmatrix} 0\\ 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \end{bmatrix} is,
Correct
Mark 1.00 out of 1.00
Select one:
\begin{bmatrix} 1 & 1\\ 1 & 1 \end{bmatrix}
\begin{bmatrix} 0 & 1\\ 1 & 1 \end{bmatrix}
\begin{bmatrix} 0 & 0\\ 1 & 1 \end{bmatrix} ✓
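A column vector times a row vector is an outer product: each entry of the column scales the whole row. In NumPy:

```python
import numpy as np

col = np.array([[0],
                [1]])           # 2x1 column vector
row = np.array([[1, 1]])        # 1x2 row vector

product = col @ row             # 2x2 outer product
# product == [[0, 0],
#             [1, 1]]
```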

Question 14
The second principal direction of data with covariance matrix \begin{bmatrix} 2 & 1\\ 1 & 2 \end{bmatrix} , is given by,
Correct

Mark 1.00 out of 1.00
Select one:
\frac{1}{\sqrt{2}}\begin{bmatrix} 1\\ -1 \end{bmatrix}

\frac{1}{\sqrt{2}}\begin{bmatrix} 1\\ 1 \end{bmatrix} ✓

\begin{bmatrix} 1\\ 0 \end{bmatrix}

\begin{bmatrix} 0\\ 1 \end{bmatrix}
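The second principal direction is the eigenvector of the smaller eigenvalue, orthogonal to the first; verified with NumPy:

```python
import numpy as np

cov = np.array([[2.0, 1.0],
                [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues: [1, 3]

# Second principal direction: eigenvector of the smaller eigenvalue,
# which is [1, -1] / sqrt(2) up to sign, orthogonal to [1, 1] / sqrt(2)
second_direction = eigvecs[:, 0]
```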


Question 15
Which of the following is true?
Correct

Mark 1.00 out of 1.00
Select one:
LDA is better suited than PCA for separating classes in a lower-dimensional space. ✓
The last principal component accounts for as much of the skew in the data as possible.
PCA is better suited than t-SNE for separating classes in a lower-dimensional space.
The goal of dimensionality reduction is to have highly interpretable lower-dimensions. 
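The contrast in the marked answer can be illustrated with a small synthetic example (a sketch, not library LDA: Fisher's direction is computed by hand as Sw⁻¹(m1 − m0)). PCA latches onto the high-variance axis, while LDA points along the axis that actually separates the classes:

```python
import numpy as np

rng = np.random.default_rng(3)
# Two classes separated along x, with a large shared variance along y
class0 = rng.normal([0.0, 0.0], [0.3, 3.0], size=(200, 2))
class1 = rng.normal([2.0, 0.0], [0.3, 3.0], size=(200, 2))
X = np.vstack([class0, class1])

# PCA picks the high-variance y axis and ignores the class structure
Xc = X - X.mean(axis=0)
_, eigvecs = np.linalg.eigh(np.cov(Xc.T))
pca_dir = eigvecs[:, -1]                 # roughly [0, ±1]

# Fisher LDA: w ∝ Sw^{-1} (m1 - m0) points along the class gap
Sw = np.cov(class0.T) + np.cov(class1.T)
w = np.linalg.solve(Sw, class1.mean(axis=0) - class0.mean(axis=0))
lda_dir = w / np.linalg.norm(w)          # roughly [±1, 0]
```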
