Linear Transformation
Presentation Roadmap
1. Transformations: What is a Transformation?
2. Linearity: Defining Linearity (The Core Properties)
3. Matrices: Representing Transformations (The Power of Matrices)
What is a Transformation?
In mathematics, a transformation is a function that maps elements
from one set (the domain) to another set (the codomain).
In the context of geometry and vector spaces, it often means moving,
resizing, or changing the orientation of objects (represented by
vectors or points).
Think of it as a rule: input a vector or point, and get a transformed
vector or point out.
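As a quick illustration (a minimal NumPy sketch, not from the slides), a transformation in code is just a function that takes a vector in and returns a vector out:

import numpy as np

def transform(v):
    # A rule: take a 2D point and return its mirror image across the y-axis.
    x, y = v
    return np.array([-x, y])

print(transform(np.array([3.0, 2.0])))  # [-3.  2.]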
Defining Linearity
A transformation T from a vector space V to a vector space W (T: V →
W) is linear if it satisfies two conditions for all vectors u, v in V and any
scalar c:
Additivity: T(u + v) = T(u) + T(v)
(The transformation of a sum is the sum of transformations)
Homogeneity: T(cu) = cT(u)
(The transformation of a scaled vector is the scaled
transformation of the vector)
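These two conditions can be spot-checked numerically. Below is a minimal sketch, assuming NumPy; the function name is illustrative, and random sampling can only disprove linearity, never prove it:

import numpy as np

def passes_linearity_checks(T, trials=100, dim=2):
    # Spot-check additivity T(u+v) == T(u)+T(v) and homogeneity T(cu) == cT(u)
    # on random inputs; passing is evidence, a single failure is a disproof.
    rng = np.random.default_rng(0)
    for _ in range(trials):
        u = rng.standard_normal(dim)
        v = rng.standard_normal(dim)
        c = rng.standard_normal()
        if not np.allclose(T(u + v), T(u) + T(v)):
            return False
        if not np.allclose(T(c * u), c * T(u)):
            return False
    return True

print(passes_linearity_checks(lambda v: 3 * v))   # True: scaling is linear
print(passes_linearity_checks(lambda v: v + 1))   # False: translation breaks both rules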
Consequences of Linearity
Grid lines remain parallel and evenly spaced, and the origin stays fixed.
The Matrix Connection
Matrix A: every linear transformation T can be represented by a matrix A.
Transformed vector: y = Ax.
Columns of A: the transformations of the standard basis vectors.
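A minimal NumPy sketch (illustrative, not from the slides) of y = Ax, and of reading the basis-vector images off the columns of A:

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])          # some linear transformation
x = np.array([1.0, 1.0])
print(A @ x)                        # y = Ax -> [3. 3.]

# The columns of A are exactly where the standard basis vectors land:
print(A @ np.array([1.0, 0.0]))     # [2. 0.] = first column of A
print(A @ np.array([0.0, 1.0]))     # [1. 3.] = second column of A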
Matrix Representation Example
1. First basis vector: T([1, 0]ᵀ) = [0, 1]ᵀ
2. Second basis vector: T([0, 1]ᵀ) = [-1, 0]ᵀ
3. Matrix A (the images stacked as columns):
   A = | 0 -1 |
       | 1  0 |
This A rotates every vector 90° counterclockwise.
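The same example in code, assuming NumPy: stack the images of the basis vectors as columns to build A, then apply it.

import numpy as np

# Images of the standard basis vectors, taken from the example above:
T_e1 = np.array([0.0, 1.0])         # T([1, 0]^T) = [0, 1]^T
T_e2 = np.array([-1.0, 0.0])        # T([0, 1]^T) = [-1, 0]^T

A = np.column_stack([T_e1, T_e2])   # stack the images as columns
print(A)                            # [[ 0. -1.]
                                    #  [ 1.  0.]]
print(A @ np.array([1.0, 1.0]))     # [-1.  1.]: rotated 90° counterclockwise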
Geometric View: Scaling
Uniform scaling: the matrix is diagonal with equal entries.
Non-uniform scaling: the matrix is diagonal with unequal entries.
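A small NumPy sketch (illustrative) contrasting the two diagonal matrices:

import numpy as np

uniform = np.diag([2.0, 2.0])        # equal diagonal entries: everything doubles
non_uniform = np.diag([2.0, 0.5])    # unequal entries: stretch x, squash y

p = np.array([1.0, 1.0])
print(uniform @ p)       # [2. 2.]
print(non_uniform @ p)   # [2.  0.5]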
Geometric View: Rotation
Rotates vectors/objects around the origin by angle θ.
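A minimal NumPy sketch of the standard 2D rotation matrix; the helper name rotation is illustrative:

import numpy as np

def rotation(theta):
    # Standard 2D rotation matrix: counterclockwise by theta radians.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation(np.pi / 2)                         # 90 degrees
print(np.round(R @ np.array([1.0, 0.0]), 10))   # [0. 1.], matching the matrix A above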
Importance of Linear Transformations
• Fundamental Building Blocks: They underpin many areas of mathematics, physics, engineering,
and computer science.
• Simplification: They allow complex geometric operations to be described concisely using algebra
(matrices).
• Predictability: Their properties (preserving lines, origin) make systems easier to analyze.
• Machine Learning: Dimensionality reduction (e.g., Principal Component Analysis - PCA), data
transformation, neural network layers.
• Cryptography: Some encryption techniques involve matrix operations (though often non-linear
elements are added).
Shear Transformations
Definition: A shear transformation distorts a shape by shifting points parallel to one axis while keeping the other axis fixed.
Example matrices (with shear factor k):
Along the x-axis:
  | 1 k |
  | 0 1 |
Along the y-axis:
  | 1 0 |
  | k 1 |
Benefit: shears, like all linear transformations, compose by matrix multiplication, so a sequence of steps collapses into one final matrix: efficient and clean.
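A NumPy sketch (illustrative, not from the slides) of both shear matrices, and of composing a shear with a rotation into one matrix:

import numpy as np

k = 0.5
shear_x = np.array([[1.0, k],
                    [0.0, 1.0]])    # shifts points parallel to the x-axis
shear_y = np.array([[1.0, 0.0],
                    [k,   1.0]])    # shifts points parallel to the y-axis

p = np.array([1.0, 2.0])
print(shear_x @ p)                  # [2. 2.]: x is shifted by k*y, y is unchanged

# Composition: one matrix for all steps. Here, shear first, then rotate 90°.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
M = R @ shear_x                     # rightmost factor acts first
print(M @ p, R @ (shear_x @ p))     # both print [-2.  2.]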
Inverse Transformations
Definition: Reverses the effect of a linear transformation.
If T(x) = Ax, then the inverse transformation is A⁻¹, such that:
  A⁻¹(Ax) = x for every x.
Looking ahead to eigen theory: in the equation Ax = λx, x is an eigenvector and λ is the eigenvalue.
Matrix size: 3×3 for a linear transformation in 3D; 4×4 for an affine transformation (which includes translation, via homogeneous coordinates).
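A short NumPy sketch (illustrative, not from the slides) verifying that A⁻¹ undoes A, plus a peek at the eigenvalue equation:

import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])         # the 90° rotation from earlier

A_inv = np.linalg.inv(A)            # the inverse: rotate back by 90°
x = np.array([3.0, 4.0])
print(A_inv @ (A @ x))              # [3. 4.]: A⁻¹ undoes A

# Eigenvalue equation A v = lambda v, for a matrix with real eigenvectors:
S = np.diag([2.0, 3.0])
vals, vecs = np.linalg.eig(S)
print(vals)                         # [2. 3.]
print(vecs)                         # columns are the eigenvectors (here e1 and e2)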
• Mastery of these concepts opens doors to deeper topics like eigen theory and the singular value decomposition (SVD).
Thank You