Module-II
Chapter 2
MATHEMATICAL FOUNDATION FOR ML
Linear Algebra is an essential field of mathematics covering the study of
vectors, matrices, planes, mappings, and lines required for linear transformations.
Linear Algebra for Machine learning
A system of linear equations consists of two or more linear equations that share the
same variables.
Each equation can represent a line, plane, or higher-dimensional surface,
depending on the number of variables involved.
The solution to the system is the set of variable values that satisfy all equations
simultaneously, usually corresponding to points of intersection among the lines or
planes represented by these equations.
Solution of a System of Linear Equation
A system of linear equations can have three kinds of solutions:
• No Solution
• Unique Solution
• Infinitely Many Solutions
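Which of the three cases applies can be decided by comparing the rank of the coefficient matrix with the rank of the augmented matrix. The sketch below illustrates this with NumPy; the three example systems are made up for illustration.

```python
import numpy as np

# Classify a system A x = b by comparing rank(A) with rank([A | b]).
# Illustrative sketch; the example matrices below are made up.
def classify_system(A, b):
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    rank_A = np.linalg.matrix_rank(A)
    rank_Ab = np.linalg.matrix_rank(np.hstack([A, b]))
    n_unknowns = A.shape[1]
    if rank_A < rank_Ab:
        return "No Solution"           # inconsistent system
    if rank_A == n_unknowns:
        return "Unique Solution"
    return "Infinite Solutions"        # consistent but underdetermined

print(classify_system([[1, 1], [2, 2]], [1, 3]))   # parallel lines
print(classify_system([[1, 1], [1, -1]], [2, 0]))  # intersecting lines
print(classify_system([[1, 1], [2, 2]], [1, 2]))   # coincident lines
```

Geometrically, the three calls correspond to parallel lines (no intersection), two intersecting lines (one point), and coincident lines (infinitely many points).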
How to Solve System of Linear Equations?
The following techniques for solving the system of linear equations AX = B apply
only when the coefficient matrix A is non-singular, that is, |A| ≠ 0. The system
can then be solved using any of the following methods:
•Cramer’s Rule
•Inverse Method
•Gauss-Jordan Method
•Gauss Elimination Method
•LU Decomposition Method of Factorization (also known as Method of
Triangularization)
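As a quick sketch of two of the methods above, the snippet below solves a made-up 2×2 system AX = B with NumPy: `np.linalg.solve` uses an LU-factorization-based elimination (in the spirit of Gauss elimination / LU decomposition), and the Inverse Method computes X = A⁻¹B directly.

```python
import numpy as np

# Minimal sketch: solving A X = B when A is non-singular (|A| != 0).
# The 2x2 system below is an illustrative example.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([5.0, 10.0])

assert abs(np.linalg.det(A)) > 1e-12   # the methods above require |A| != 0

X = np.linalg.solve(A, B)      # LU-based elimination solver
X_inv = np.linalg.inv(A) @ B   # Inverse Method: X = A^{-1} B

print(X)  # both approaches agree
```

In practice `np.linalg.solve` is preferred over forming A⁻¹ explicitly, since it is faster and numerically more stable.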
Definition: A vector is a mathematical object that has both magnitude (or length) and direction.
• Two vectors are the same if they have the same magnitude and direction. This means that if you
take a vector and translate it to a new position (without rotating it), then the vector you obtain at
the end of this process is the same vector you had initially.
• Two examples of vectors are those that represent force and velocity. Both force and velocity are
in a particular direction. The magnitude of the vector would indicate the strength of the force, or
the speed associated with the velocity.
• You denote vectors using boldface as in a or b. Especially when writing by hand, where
you cannot easily write in boldface, you denote vectors using arrows drawn above the
letters, as in a⃗ or b⃗, or use other markings.
• The magnitude of the vector a is denoted by ||a||.
• When you want to refer to a number and stress that it is not a vector, you can call the
number a scalar.
• You denote scalars with italics, as in a or b.
• Definition: A vector space is a set of vectors that can be added together and
multiplied (scaled) by numbers, known as scalars.
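The two defining operations of a vector space, addition and scalar multiplication, can be sketched with NumPy arrays standing in for vectors; the vectors a, b and the scalar here are made-up examples.

```python
import numpy as np

# Illustrative sketch of vector-space operations: adding two vectors and
# scaling a vector by a scalar both produce another vector in the same space.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
scalar = 2.5

print(a + b)       # vector addition        -> [5. 7. 9.]
print(scalar * a)  # scalar multiplication  -> [2.5 5.  7.5]
```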
Solve the problems from book page 2-8
Length of a Vector
The length of a vector (commonly known as its magnitude) allows us to quantify a
given vector. To find the length of a vector, sum the squares of its components and
then take the square root of the result. In this article, we'll extend our
understanding of magnitude to vectors in three dimensions.
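The rule above (square the components, sum, take the square root) can be sketched directly in NumPy; the 3-D vector here is a made-up example chosen so the answer comes out to a whole number.

```python
import numpy as np

# Length (magnitude) of a 3-D vector: square root of the sum of squared
# components. The vector a is an illustrative example.
a = np.array([3.0, 4.0, 12.0])

length_manual = np.sqrt(np.sum(a ** 2))  # step-by-step, as described above
length_norm = np.linalg.norm(a)          # NumPy's built-in equivalent

print(length_manual)  # 13.0, since 3^2 + 4^2 + 12^2 = 169
```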
How to Find Eigenvalues and Eigenvectors of a Matrix?
Step 1: Solve the characteristic equation |A – λI| = 0 to obtain the eigenvalues
of the matrix A.
Step 2: The values obtained in Step 1 are named λ1, λ2, λ3….
Step 3: Find the eigenvector (X) associated with the eigenvalue λ1 using the
equation, (A – λ1I) X = 0
Step 4: Repeat step 3 to find the eigenvector associated with other remaining
eigenvalues λ2, λ3….
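The steps above can be sketched with NumPy, where `np.linalg.eig` returns the eigenvalues λ1, λ2, … and their associated eigenvectors in one call; the 2×2 matrix A is a made-up example.

```python
import numpy as np

# Sketch of the eigenvalue/eigenvector steps above using NumPy.
# The matrix A is an illustrative example with eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]                # eigenvector X for eigenvalue λ_i
    residual = (A - lam * np.eye(2)) @ x  # (A - λI) X should equal 0
    print(lam, np.allclose(residual, 0))
```

Each printed line confirms that the pair (λ, X) satisfies (A – λI) X = 0, exactly as in Steps 3 and 4 above.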
Types of Eigenvector
The eigenvectors calculated for a square matrix are of two types:
Right Eigenvector
The eigenvector that multiplies the given square matrix from the
right-hand side is called the right eigenvector. It is calculated using the
following equation,
AVR = λVR
where
A is the given square matrix of order n×n,
λ is one of its eigenvalues, and
VR is the right eigenvector (an n×1 column vector).
Left Eigenvector
The eigenvector that multiplies the given square matrix from the left-hand
side is called the left eigenvector. It is calculated using the following
equation,
VLA = VLλ
where
A is the given square matrix of order n×n,
λ is one of its eigenvalues, and
VL is the left eigenvector (a 1×n row vector), VL = [v1, v2, v3, …, vn].
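Both equations can be checked numerically. A convenient fact (an assumption worth stating: it follows from transposing VLA = VLλ) is that the left eigenvectors of A are the right eigenvectors of Aᵀ; the symmetric matrix A below is a made-up example.

```python
import numpy as np

# Sketch: right eigenvectors satisfy A V_R = λ V_R; left eigenvectors satisfy
# V_L A = V_L λ. Left eigenvectors of A are right eigenvectors of A^T.
# The matrix A is an illustrative example.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam_r, VR = np.linalg.eig(A)    # right eigenvectors (as columns)
lam_l, VL = np.linalg.eig(A.T)  # right eigenvectors of A^T = left eigenvectors of A

v_r = VR[:, 0]
print(np.allclose(A @ v_r, lam_r[0] * v_r))  # A V_R = λ V_R -> True

v_l = VL[:, 0]                               # treated as a row vector below
print(np.allclose(v_l @ A, v_l * lam_l[0]))  # V_L A = V_L λ -> True
```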
Singular Value Decomposition (SVD)
● The Singular Value Decomposition (SVD) of a matrix is a factorization of
that matrix into three matrices.
● It has some interesting algebraic properties and conveys important
geometrical and theoretical insights about linear transformations. It also has
some important applications in data science.
● In this article, I will try to explain the mathematical intuition behind SVD
and its geometrical meaning.
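As a concrete sketch, `np.linalg.svd` returns the three factors directly; the 2×3 matrix A below is a made-up example, and multiplying the factors back together recovers A.

```python
import numpy as np

# Sketch of SVD: a matrix A factors into U @ diag(s) @ Vt, where U and Vt
# have orthonormal columns/rows and s holds the singular values.
# The matrix A is an illustrative example.
A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A from the three factors to confirm the decomposition.
A_rebuilt = U @ np.diag(s) @ Vt
print(np.allclose(A, A_rebuilt))  # True
print(s)                          # singular values, largest first
```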
Calculation of Pseudo-inverse:
● The pseudo-inverse, or Moore-Penrose inverse, is a generalization of
the matrix inverse to matrices that may not be invertible (such as
low-rank matrices).
● If a matrix is invertible, its pseudo-inverse equals its ordinary inverse;
but the pseudo-inverse also exists for matrices that are not
invertible.
● It is denoted by A+.
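The two bullet points above can be sketched with `np.linalg.pinv`; the rank-1 matrix A below is a made-up example of a non-invertible matrix for which A+ still exists.

```python
import numpy as np

# Sketch of the Moore-Penrose pseudo-inverse A+ via np.linalg.pinv.
# A is a made-up rank-1 (hence non-invertible) matrix.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])        # |A| = 0, so np.linalg.inv(A) would fail

A_plus = np.linalg.pinv(A)        # A+ exists even though A^{-1} does not

# A+ satisfies the defining identity A @ A+ @ A = A.
print(np.allclose(A @ A_plus @ A, A))  # True

# For an invertible matrix, the pseudo-inverse equals the ordinary inverse.
B = np.array([[2.0, 0.0],
              [0.0, 4.0]])
print(np.allclose(np.linalg.pinv(B), np.linalg.inv(B)))  # True
```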