
Experiment 3

AIM: Implementation of PCA using TensorFlow in Python.


THEORY:
Principal Component Analysis, or PCA, is a dimensionality-reduction method that is
often used on large data sets: it transforms a large set of variables into a smaller
one that still contains most of the information in the original set.
Steps Involved in Principal Component Analysis
1. STANDARDIZATION
The aim of this step is to standardize the range of the continuous initial variables so
that each one of them contributes equally to the analysis. Mathematically, this can be done by
subtracting the mean and dividing by the standard deviation for each value of each variable.
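
As a quick illustration, a minimal TensorFlow sketch of this step (the tensor X and
its 5×3 shape are hypothetical stand-ins):

import tensorflow as tf

X = tf.random.normal((5, 3))  # hypothetical data: 5 samples, 3 features
# z = (x - mean) / std, computed per feature (per column).
X_std = (X - tf.reduce_mean(X, axis=0)) / tf.math.reduce_std(X, axis=0)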

2. COVARIANCE MATRIX COMPUTATION


The aim of this step is to understand how the variables of the input data set are
varying from the mean with respect to each other, or in other words, to see if there is
any relationship between them.
For example, for a 3-dimensional data set with three variables x, y, and z, the
covariance matrix is a 3×3 matrix of this form:

\[
C = \begin{pmatrix}
\mathrm{Cov}(x,x) & \mathrm{Cov}(x,y) & \mathrm{Cov}(x,z) \\
\mathrm{Cov}(y,x) & \mathrm{Cov}(y,y) & \mathrm{Cov}(y,z) \\
\mathrm{Cov}(z,x) & \mathrm{Cov}(z,y) & \mathrm{Cov}(z,z)
\end{pmatrix}
\]

Since Cov(a, a) = Var(a), the diagonal entries are the variances of the variables,
and the matrix is symmetric because Cov(a, b) = Cov(b, a).
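
A minimal sketch of this step in TensorFlow, assuming the data has already been
standardized (the X_std tensor here is a hypothetical stand-in):

import tensorflow as tf

X_std = tf.random.normal((5, 3))  # stand-in for standardized data
n = tf.cast(tf.shape(X_std)[0], tf.float32)
# Sample covariance of zero-mean data: (X^T X) / (n - 1), here a 3x3 matrix.
cov = tf.matmul(X_std, X_std, transpose_a=True) / (n - 1.0)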
3. COMPUTE THE EIGENVECTORS AND EIGENVALUES OF THE
COVARIANCE MATRIX TO IDENTIFY THE PRINCIPAL COMPONENTS
Eigenvectors and eigenvalues are the linear-algebra concepts we need to compute
from the covariance matrix in order to determine the principal components of the
data. Eigenvectors and eigenvalues always come in pairs: every eigenvector has a
corresponding eigenvalue, and their number equals the number of dimensions of
the data.
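
Because the covariance matrix is symmetric, TensorFlow's tf.linalg.eigh can
compute its eigendecomposition; the matrix below is a made-up example:

import tensorflow as tf

cov = tf.constant([[2.0, 0.5, 0.1],
                   [0.5, 1.0, 0.3],
                   [0.1, 0.3, 0.7]])  # made-up symmetric covariance matrix
# eigh returns eigenvalues in ascending order; eigenvectors are the columns.
eigenvalues, eigenvectors = tf.linalg.eigh(cov)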

4. SORT THE EIGENVALUES IN DECREASING ORDER


The eigenvector corresponding to the highest eigenvalue is our first principal
component. Similarly, the second principal component is the eigenvector
corresponding to the second-highest eigenvalue. So, to arrange the principal
components from most to least significant, we sort the eigenvalues in decreasing
order and reorder the eigenvectors to match.
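
A minimal sketch of the sorting step in TensorFlow (the eigenvalues and the
identity-matrix eigenvectors below are made-up placeholders):

import tensorflow as tf

eigenvalues = tf.constant([0.4, 2.3, 0.3])  # made-up eigenvalues
eigenvectors = tf.eye(3)                    # stand-in; columns are eigenvectors
order = tf.argsort(eigenvalues, direction='DESCENDING')
eigenvalues = tf.gather(eigenvalues, order)
eigenvectors = tf.gather(eigenvectors, order, axis=1)  # reorder the columns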
5. FORMING THE NEW FEATURE SET ALONG THE PRINCIPAL
COMPONENT AXES
In this step, which is the last one, the aim is to use the feature vector formed from
the eigenvectors of the covariance matrix to reorient the data from the original axes
to the ones represented by the principal components (hence the name Principal
Component Analysis). This can be done by multiplying the transpose of the feature
vector by the transpose of the standardized data set.
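
A minimal sketch of the projection, assuming W holds the chosen top eigenvectors
as columns (both tensors below are hypothetical stand-ins):

import tensorflow as tf

X_std = tf.random.normal((5, 3))  # stand-in for standardized data
W = tf.eye(3)[:, :2]              # stand-in feature vector: top-2 eigenvectors
# X_std @ W equals the transpose of (W^T @ X_std^T) from the theory above.
X_pca = tf.matmul(X_std, W)       # reduced data, shape (5, 2)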

PYTHON CODE:
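
Below is a minimal, self-contained sketch that follows the five steps above,
assuming TensorFlow 2.x; the synthetic data set, the choice of two components,
and the variable names are illustrative assumptions rather than a prescribed
implementation.

# PCA from scratch with TensorFlow 2.x.
import tensorflow as tf

tf.random.set_seed(0)

# Synthetic data: 100 samples, 5 features built as mixtures of 2 latent
# signals plus a little noise, so the data has low-dimensional structure.
base = tf.random.normal((100, 2))
mix = tf.constant([[1.0, 0.5, -0.3, 0.8, 0.1],
                   [0.2, -1.0, 0.7, 0.4, -0.6]])
X = tf.matmul(base, mix) + tf.random.normal((100, 5), stddev=0.05)

# Step 1: standardize each feature to zero mean and unit variance.
X_std = (X - tf.reduce_mean(X, axis=0)) / tf.math.reduce_std(X, axis=0)

# Step 2: covariance matrix of the standardized data, (X^T X) / (n - 1).
n = tf.cast(tf.shape(X_std)[0], tf.float32)
cov = tf.matmul(X_std, X_std, transpose_a=True) / (n - 1.0)

# Step 3: eigendecomposition of the symmetric covariance matrix.
eigenvalues, eigenvectors = tf.linalg.eigh(cov)

# Step 4: eigh returns eigenvalues in ascending order; sort them (and the
# matching eigenvector columns) in decreasing order instead.
order = tf.argsort(eigenvalues, direction='DESCENDING')
eigenvalues = tf.gather(eigenvalues, order)
eigenvectors = tf.gather(eigenvectors, order, axis=1)

# Step 5: keep the top-k eigenvectors as the feature vector and project.
n_components = 2
W = eigenvectors[:, :n_components]   # feature vector, shape (5, 2)
X_pca = tf.matmul(X_std, W)          # reduced data, shape (100, 2)

explained = eigenvalues / tf.reduce_sum(eigenvalues)
print("Explained variance ratio:", explained.numpy())
print("Reduced data shape:", X_pca.shape)

Because the five features are, by construction, mixtures of two latent signals, the
first two entries of the explained-variance ratio should account for nearly all of the
variance.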

Output:
