Things You Should Be Able To Prove:
1. Properties of Matrix Arithmetic
2. Systems of equations
• If A is an m × n matrix and x is an n × 1 column vector, then the product Ax can be expressed
as a linear combination of column vectors of A.
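A quick numerical sanity check of this fact, as a minimal NumPy sketch (the matrix and vector are arbitrary illustrative values):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # a 2 x 3 matrix
x = np.array([-1.0, 0.5, 2.0])    # an n x 1 column vector (as a 1-D array)

# Ax computed directly
direct = A @ x

# Ax as a linear combination of the columns of A, weighted by the entries of x
combo = sum(x[i] * A[:, i] for i in range(A.shape[1]))

assert np.allclose(direct, combo)
```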
• Two matrices have the same reduced row echelon form if and only if they are row equivalent.
• (Theorem 1.6.1) A system of linear equations has zero, one, or infinitely many solutions. There
are no other possibilities.
• (Problem 1.6.22) Let Ax = 0 be a homogeneous system of n linear equations in n unknowns,
and let Q be an invertible n × n matrix. Show that Ax = 0 has just the trivial solution if and only
if (QA)x = 0 has just the trivial solution.
• (Problem 1.6.23) Let Ax = b be any consistent system of linear equations, and let x1 be a fixed
solution. Show that every solution to the system can be written in the form x = x1 + x0, where x0
is a solution to Ax = 0 (the homogeneous system). Show also that every vector of this form is a
solution.
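This does not prove the statement, but the structure is easy to see numerically. A minimal NumPy sketch, where the system, the particular solution x1, and the homogeneous solution x0 are made-up illustrative values:

```python
import numpy as np

A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 0.0]])
b = np.array([4.0, 6.0])

x1 = np.array([3.0, 0.0, 1.0])    # a fixed particular solution: A @ x1 == b
x0 = np.array([-2.0, 1.0, 0.0])   # a solution of the homogeneous system: A @ x0 == 0

assert np.allclose(A @ x1, b)
assert np.allclose(A @ x0, 0)

# every vector of the form x1 + t * x0 also solves Ax = b
for t in (-3.0, 0.0, 2.5):
    assert np.allclose(A @ (x1 + t * x0), b)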
• For a constant k and a matrix A, if kA = 0 then either k = 0 or A = 0. (0 is a scalar or a zero
matrix of appropriate dimensions.)
3. Transpose operation
• If A and B are matrices of appropriate dimensions and k is a scalar,
– (A^T)^T = A.
– (A + B)^T = A^T + B^T.
– (A − B)^T = A^T − B^T.
– (AB)^T = B^T A^T.
– (kA)^T = kA^T.
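A quick NumPy check of all five identities (random matrices; C has the same shape as A so the sum and difference are defined, and B is shaped so that AB is defined):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))
C = rng.standard_normal((3, 4))
k = 2.5

assert np.allclose((A.T).T, A)
assert np.allclose((A + C).T, A.T + C.T)
assert np.allclose((A - C).T, A.T - C.T)
assert np.allclose((A @ B).T, B.T @ A.T)
assert np.allclose((k * A).T, k * A.T)
```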
5. Symmetry
• If A and B are symmetric matrices of the same size and k is a scalar, then:
– A^T is symmetric.
– A + B is symmetric.
– A − B is symmetric.
– kA is symmetric.
– A^{-1} is symmetric (provided A is invertible).
– AB is symmetric if and only if A and B commute.
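The last item is the one students most often misremember; here is a minimal NumPy illustration of both directions (the matrices are arbitrary illustrative choices; B = M^2 commutes with M because it is a polynomial in M):

```python
import numpy as np

# two symmetric matrices that commute
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A, B = M, M @ M
assert np.allclose(A @ B, B @ A)
assert np.allclose(A @ B, (A @ B).T)       # the product is symmetric

# two symmetric matrices that do NOT commute
C = np.array([[1.0, 0.0],
              [0.0, 2.0]])
D = np.array([[0.0, 1.0],
              [1.0, 0.0]])
assert not np.allclose(C @ D, D @ C)
assert not np.allclose(C @ D, (C @ D).T)   # the product fails to be symmetric
```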
6. Determinant
• If A is a square matrix and A′ is obtained from A by switching two rows of A, then we have
det(A′) = − det(A).
– Note: there are probably easier proofs of this theorem, but I take this opportunity to
use a proof by induction.
Proof: We proceed by induction on the dimension of A. The property clearly holds for
every 2 × 2 matrix (check). Suppose that for every k × k matrix B, k ≥ 2, we have det(B′) = − det(B)
whenever B′ is obtained from B by switching two rows. Now let A be a (k + 1) × (k + 1) matrix, and
let A′ be obtained by switching rows i and j of A. The determinant of A′ can be computed by cofactor
expansion of A′ along the p-th row, where p ≠ i, j. Since p ≠ i, j, the p-th row of A′ equals the p-th
row of A, so the entries a_{pq} are the same in both expansions. We have:
det(A′) = (−1)^{p+1} a_{p1} M′_{p1} + (−1)^{p+2} a_{p2} M′_{p2} + · · · + (−1)^{p+k+1} a_{p,k+1} M′_{p,k+1}.
Expanding A along the same row, we get:
det(A) = (−1)^{p+1} a_{p1} M_{p1} + (−1)^{p+2} a_{p2} M_{p2} + · · · + (−1)^{p+k+1} a_{p,k+1} M_{p,k+1}.
The minors M′_{pq} and M_{pq} correspond to the same k × k submatrices, where one is obtained by
switching two of the rows of the other. Hence, by the induction hypothesis, M′_{pq} = −M_{pq}, and
det(A′) = (−1)^{p+1} a_{p1} M′_{p1} + (−1)^{p+2} a_{p2} M′_{p2} + · · · + (−1)^{p+k+1} a_{p,k+1} M′_{p,k+1}
= (−1)^{p+1} a_{p1} (−M_{p1}) + (−1)^{p+2} a_{p2} (−M_{p2}) + · · · + (−1)^{p+k+1} a_{p,k+1} (−M_{p,k+1})
= − det(A). □
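A quick numerical spot-check of the sign change, as a NumPy sketch with a random matrix (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))

A_swapped = A.copy()
A_swapped[[1, 3]] = A_swapped[[3, 1]]   # switch rows 1 and 3

assert np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A))
```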
• det(a_1, . . . , ka_i, . . . , a_n) = k · det(a_1, . . . , a_i, . . . , a_n) for all i, where a_1, . . . , a_n denote the rows
of the matrix (multiplying one row by k multiplies the determinant by k).
• If A ∈ M_{n×n} then det(kA) = k^n det(A).
• If one row of a square matrix A is a nonzero multiple of another row of A then det(A) = 0.
• det(a_1, . . . , a_i + b_i, . . . , a_n) = det(a_1, . . . , a_i, . . . , a_n) + det(a_1, . . . , b_i, . . . , a_n), where a_i, b_i
are n-dimensional rows (the determinant is additive in each single row).
• det(a_1, . . . , a_j + ka_i, . . . , a_n) = det(a_1, . . . , a_j, . . . , a_n) for i ≠ j (basically, the third type of
elementary row operation does not change the determinant). A numerical check of all three row
operations is sketched below.
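A minimal NumPy sketch verifying the effect of each of the three elementary row operations on the determinant (random matrix, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
k = 3.0
d = np.linalg.det(A)

# type I: switching two rows negates the determinant
B = A.copy(); B[[0, 2]] = B[[2, 0]]
assert np.isclose(np.linalg.det(B), -d)

# type II: scaling one row by k scales the determinant by k
C = A.copy(); C[1] *= k
assert np.isclose(np.linalg.det(C), k * d)

# type III: adding k times one row to another leaves the determinant unchanged
D = A.copy(); D[3] += k * D[0]
assert np.isclose(np.linalg.det(D), d)
```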
• For every square matrix A and elementary matrix E of the same size, det(EA) = det(E) det(A).
• For all square matrices A and B of the same dimensions, det(AB) = det(A) det(B).
• If A is invertible, det(A^{-1}) = 1/det(A).
• The sum of the products of the entries of one row of a square matrix with the corresponding
cofactors of a different row is zero.
• For an invertible matrix A, A^{-1} = adj(A)/det(A).
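NumPy has no built-in adjugate, so the helper below (the name adjugate is my own) builds it entry by entry from cofactors and checks the formula above on a small invertible matrix:

```python
import numpy as np

def adjugate(A):
    """Adjugate of a square matrix: the transpose of its cofactor matrix."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

assert np.allclose(np.linalg.inv(A), adjugate(A) / np.linalg.det(A))
```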
7. Dot product and orthogonality
• If A is an n × n matrix and u, v are vectors in R^n, then
Au · v = u · A^T v and u · Av = A^T u · v.
• Orthogonal decomposition of a vector (find the orthogonal and projection components of a vector
with respect to a given vector)
• If A is an m × n matrix, then the solution set of the homogeneous linear system Ax = 0 consists
of all vectors in R^n that are orthogonal to every row vector of A.
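A minimal NumPy sketch of this fact: a null-space basis is read off from the SVD (this assumes, as holds for the illustrative matrix below, that A has full row rank, so the rows of Vt beyond len(s) span the null space), and each basis vector is checked against every row of A:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [0.0, 1.0, 1.0, 1.0]])

# null-space basis: right singular vectors beyond the rank
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[len(s):]

for v in null_basis:
    assert np.allclose(A @ v, 0)          # v solves Ax = 0 ...
    for row in A:
        assert np.isclose(row @ v, 0.0)   # ... and is orthogonal to every row of A
```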
• The general solution of a consistent linear system Ax = b can be obtained by adding any specific
solution of Ax = b to the general solution of Ax = 0.
8. Vector Spaces
• Real-valued functions defined on R form a vector space under the usual addition of functions and
multiplication of functions by real numbers.
• The intersection of finitely many subspaces of a vector space is a subspace of that vector space.
The union of subspaces is not necessarily a subspace.
• The span of a set of vectors in a vector space is the smallest subspace of that space containing
those vectors.
• The solution set of a homogeneous linear system Ax = 0 in n unknowns is a subspace of R^n.
• If S = {v1, v2, . . . , vr} and S′ = {w1, w2, . . . , wk} are nonempty sets of vectors in a vector space V,
then span(S) = span(S′)
if and only if each vector in S is a linear combination of those in S′, and each vector in S′ is a linear
combination of those in S.
• A set S with two or more vectors is linearly dependent if and only if at least one of the vectors in
S is expressible as a linear combination of the other vectors in S.
• If the Wronskian of n functions is not identically zero, then those functions form a linearly independent
set of vectors in C^{(n−1)}(−∞, ∞). (No conclusion can be drawn if the Wronskian IS identically zero.)
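A minimal symbolic sketch of a Wronskian computation using SymPy (the three functions are an arbitrary illustrative choice):

```python
import sympy as sp

x = sp.symbols('x')
funcs = [sp.exp(x), sp.sin(x), sp.cos(x)]
n = len(funcs)

# Wronskian matrix: row k holds the k-th derivatives of the functions
W = sp.Matrix([[sp.diff(f, x, k) for f in funcs] for k in range(n)])
w = sp.simplify(W.det())
print(w)   # -2*exp(x): not identically zero, so the set is linearly independent
```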
• Any set of vectors that has a linearly dependent subset is linearly dependent.
• Any subset of a linearly independent set of vectors is linearly independent.
• (Uniqueness of Basis Representation) If S = {v1 , v2 , . . . , vn } is a basis for a vector space V , then
every vector v in V can be represented in the form v = c1 v1 + c2 v2 + · · · + cn vn in exactly one way.
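Finding that unique representation is a linear solve. A minimal NumPy sketch, with an arbitrary invertible matrix B whose columns are the basis vectors of R^3:

```python
import numpy as np

# basis vectors of R^3 as the columns of B
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
v = np.array([2.0, 3.0, 4.0])

c = np.linalg.solve(B, v)        # the unique coordinate vector of v relative to the basis
assert np.allclose(B @ c, v)     # v = c1*b1 + c2*b2 + c3*b3, in exactly one way
```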
• (no proof) If a vector space has a basis with n vectors, then any set with more than n vectors is
linearly dependent and any set with fewer than n vectors does not span that vector space.
• (no proof) All bases for a finite-dimensional vector space have the same number of vectors.
• (Plus/Minus Theorem, no proof) If S is a linearly dependent set of vectors in a vector space V ,
then we can remove a vector in S that can be expressed as a linear combination of other vectors
in S and the span of the resulting set is the same as the span of S. On the other hand, if S is a
linearly independent set of vectors in V, then adjoining to S a vector from V that is not in the
span of S produces a set that is still linearly independent.
• If V is an n-dimensional vector space and S is a subset of V with exactly n vectors, then S is a
basis for V if and only if either S is linearly independent or S spans V .
• (no proof) A linearly dependent finite set of vectors in a vector space V that spans V can be made
into a basis for V by removing appropriate vectors. A linearly independent finite set of vectors in
a vector space V can be made into a basis for V by inserting appropriate vectors from V.
• (no proof) If W is a subspace of a finite-dimensional vector space V, then dim(W) ≤ dim(V), with
equality if and only if W = V.
• (no proof) If P is the transition matrix from a basis B′ to a basis B for a finite-dimensional vector
space V, then P is invertible and P^{-1} is the transition matrix from B to B′.
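A minimal NumPy sketch with two arbitrary bases of R^2 (each matrix holds its basis vectors as columns, written in standard coordinates; the transition matrix from B′ to B solves Bmat · P = Bp):

```python
import numpy as np

Bmat = np.array([[1.0, 1.0],
                 [0.0, 1.0]])   # basis B
Bp   = np.array([[2.0, 0.0],
                 [1.0, 1.0]])   # basis B'

P = np.linalg.solve(Bmat, Bp)   # transition matrix from B' to B
Q = np.linalg.solve(Bp, Bmat)   # transition matrix from B to B'

assert np.allclose(np.linalg.inv(P), Q)   # P is invertible, with inverse Q
```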
• (no proof) A system of linear equations Ax = b is consistent if and only if b is in the column space
of A.
• (no proof) Elementary row operations do not change the null space or the row space of a matrix, but
they can change the column space.
• (no proof) If A and B are row equivalent matrices, then a given set of column vectors of A is
linearly independent if and only if the corresponding column vectors of B are linearly independent.
Also, a given set of column vectors of A forms a basis for the column space of A if and only if the
corresponding column vectors of B form a basis for the column space of B.
• (no proof) If a matrix R is in row echelon form, then the row vectors with the leading 1's and the
column vectors containing the leading 1's form bases for the row and column spaces of R, respectively.
• (Dimension Theorem for Matrices) If A is a matrix with n columns, then rank(A)+nullity(A) =
n.
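A minimal NumPy sketch of the Dimension Theorem: the rank is the number of nonzero singular values, a null-space basis is the remaining right singular vectors, and the two dimensions add up to n (the matrix is an illustrative rank-2 example):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],    # twice the first row
              [1.0, 0.0, 1.0, 0.0]])
n = A.shape[1]

_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]               # orthonormal basis of the null space

for v in null_basis:                 # each basis vector really solves Ax = 0
    assert np.allclose(A @ v, 0)

assert rank + len(null_basis) == n   # rank(A) + nullity(A) = n
```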
• Let A be an m × n matrix.
(a) (Overdetermined Case) If m > n, then the linear system Ax = b is inconsistent for at least
one vector b in R^m. (Correction: your book is wrong; it says R^n, which is not correct.)
(b) (Underdetermined Case) If m < n, then for each vector b in R^m the linear system Ax = b is
either inconsistent or has infinitely many solutions.
• If A is any matrix, then rank(A) = rank(A^T).
• If A is an m × n matrix, then
(a) The null space of A and the row space of A are orthogonal complements in R^n.
(b) The null space of A^T and the column space of A are orthogonal complements in R^m.
9. Transformations
• If T_A : R^n → R^m and T_B : R^n → R^m are matrix transformations, and if T_A(x) = T_B(x) for every
vector x in R^n, then A = B.
• Be able to describe and use the reflection, projection, rotation, contraction, dilation, and shear
operators from R^2 to R^2 only.
• Every linear transformation from R^n to R^m is a matrix transformation, and conversely, every matrix
transformation from R^n to R^m is a linear transformation.
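The standard matrix of a linear transformation has the images of the standard basis vectors as its columns. A minimal NumPy sketch (the map T below is an arbitrary illustrative linear map R^3 → R^2):

```python
import numpy as np

def T(x):
    """An illustrative linear map R^3 -> R^2."""
    return np.array([x[0] + 2 * x[1], 3 * x[2] - x[0]])

# standard matrix: columns are the images of the standard basis vectors
A = np.column_stack([T(e) for e in np.eye(3)])

x = np.array([1.0, -2.0, 0.5])
assert np.allclose(A @ x, T(x))   # T is multiplication by its standard matrix
```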
10. Eigenvalues and Eigenvectors
• If W is a subspace of a finite-dimensional inner product space, then (W^⊥)^⊥ = W.
• For every inner product space V, if w is orthogonal to each of the vectors u_1, u_2, . . . , u_r, then it is
orthogonal to every vector in span{u_1, u_2, . . . , u_r}.
• If {v1, v2, . . . , vr} is a basis for an inner product space V, then the zero vector is the only vector in V
that is orthogonal to all of the basis vectors.
• If S = {v1, v2, . . . , vn} is an orthonormal basis for an inner product space V, and if u is any vector
in V, then u = ⟨u, v1⟩v1 + ⟨u, v2⟩v2 + · · · + ⟨u, vn⟩vn.
• (QR-decomposition) If A is an m × n matrix with linearly independent column vectors, then A can
be factored as A = QR, where Q is an m × n matrix with orthonormal column vectors and R is an
n × n invertible upper triangular matrix.
• (No proof) For every linear system Ax = b, the associated normal system
A^T Ax = A^T b
is consistent, and all solutions of it are least squares solutions of Ax = b. Moreover, if W is the
column space of A and x is any least squares solution of Ax = b, then the orthogonal projection
of b on W is proj_W b = Ax.
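A minimal NumPy sketch of the normal equations on a random tall system (full column rank almost surely, so A^T A is invertible), compared against NumPy's built-in least squares solver:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 3))   # tall: more equations than unknowns
b = rng.standard_normal(6)

# solve the normal system  A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)

# agrees with the built-in least squares solver
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x, x_ls)

# the projection of b onto col(A) is A x: the residual is orthogonal to col(A)
proj = A @ x
assert np.allclose(A.T @ (b - proj), 0)
```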
• If A is an m × n matrix, then A has linearly independent column vectors if and only if A^T A is
invertible.
• Find the components of a vector relative to one basis given its components relative to a different
basis.
• Row space, column space, null space
• Use row operations to find a basis for the row space, column space, null space
• Pick row vectors of a matrix to form a basis for the row space of that matrix. (Note that this is
different from the last one.)
• Find a vector form of the general solution of Ax = 0 and Ax = b.
• Find a basis for the null space of a matrix.
• Similar matrices, similarity transformation, and similarity invariant properties of matrices (Table
1, pg 306).
• Geometric and algebraic multiplicity of an eigenvalue.
• Properties of an inner product and a norm defined on a vector space.
• Given a basis for a vector space, use the Gram–Schmidt process to find an orthonormal basis for that
vector space.
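A minimal NumPy implementation of classical Gram–Schmidt (the helper name and the input vectors are illustrative; it assumes the inputs are linearly independent so no normalization divides by zero):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = v - sum((v @ q) * q for q in basis)   # subtract projections onto earlier vectors
        basis.append(w / np.linalg.norm(w))
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
qs = gram_schmidt(vs)

# the result is orthonormal
for i, qi in enumerate(qs):
    for j, qj in enumerate(qs):
        assert np.isclose(qi @ qj, 1.0 if i == j else 0.0)
```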
• Find the projection of a vector on a subspace given an orthogonal basis for that subspace.
• Know what is meant by best approximation.
• Given a set of points, be able to find the best fitting linear, quadratic, and cubic function.
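A minimal NumPy sketch of a least squares quadratic fit (the data points are made-up illustrative values; the same pattern with a different column count gives the linear and cubic fits):

```python
import numpy as np

# illustrative data points
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.1, 1.9, 4.2, 9.1, 16.8])

# design matrix for a quadratic y = c0 + c1*x + c2*x^2: columns 1, x, x^2
A = np.vander(xs, 3, increasing=True)

# least squares coefficients via the normal equations
c = np.linalg.solve(A.T @ A, A.T @ ys)
print(c)   # coefficients c0, c1, c2 of the best-fitting quadratic
```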