MATH 415 Review by Nitesh Nath: Chapter 1!

1. The document summarizes key concepts from chapters 1-3 of a MATH 415 review, including matrix multiplication, LU decomposition, inverse matrices, vector spaces, subspaces, row reduction, and linear transformations.
2. Matrix multiplication is associative but not commutative, and LU decomposition can be used to solve systems of equations. The column space and nullspace are fundamental subspaces.
3. Orthogonal vectors and subspaces have the property that the dot product of any vector from one with any vector from the other is zero. The row space is orthogonal to the nullspace, and the column space is orthogonal to the left nullspace.


MATH 415 Review by Nitesh Nath

Chapter 1

1.4 Matrix Notation and Matrix Multiplication

Matrix multiplication is associative: (AB)C = A(BC).

Matrix operations are distributive: A(B + C) = AB + AC and (B + C)D = BD + CD.

Matrix multiplication is usually not commutative: in general, AB ≠ BA.
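These identities can be checked numerically; a quick sketch using NumPy with arbitrary example matrices:

```python
import numpy as np

# Arbitrary example matrices
A = np.array([[1., 2.], [3., 4.]])
B = np.array([[0., 1.], [1., 0.]])
C = np.array([[2., 0.], [0., 2.]])

assoc = np.allclose((A @ B) @ C, A @ (B @ C))    # (AB)C = A(BC)
dist = np.allclose(A @ (B + C), A @ B + A @ C)   # A(B + C) = AB + AC
comm = np.allclose(A @ B, B @ A)                 # AB = BA? usually False
```

Here `assoc` and `dist` come out True for any matrices, while `comm` is False for this pair.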
1.5 Triangular Factors and Row Exchanges

A = LU when no row exchanges are needed. L is lower triangular with 1's on the diagonal; U is the upper triangular matrix produced by forward elimination. L records the elimination multipliers, so it can be recovered by going back from U to A.

A = LU is a record of the elimination steps, but it is more useful than that: L and U can be used to solve Ax = b in two triangular steps. First solve Lc = b by forward substitution, then solve Ux = c by back substitution.

But because the LU decomposed form is "unsymmetric" about the diagonal (L has 1's on its diagonal, while U has the pivots), dividing the pivots out of U gives A = LDU, where D is the diagonal matrix of pivots and the new U also has 1's on its diagonal.
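As a sketch of the factor-then-solve procedure (assuming no row exchanges are needed, and using an arbitrary example system):

```python
import numpy as np

def lu_no_exchanges(A):
    """Doolittle LU: A = LU with 1's on L's diagonal (assumes no row exchanges needed)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # the elimination multiplier goes into L
            U[i, :] -= L[i, k] * U[k, :]  # forward elimination on row i
    return L, U

# Arbitrary example system that needs no row exchanges
A = np.array([[2., 1.],
              [6., 8.]])
b = np.array([3., 26.])

L, U = lu_no_exchanges(A)
c = np.linalg.solve(L, b)   # step 1: Lc = b (forward substitution)
x = np.linalg.solve(U, c)   # step 2: Ux = c (back substitution)
```

The two `solve` calls stand in for hand-written forward and back substitution, which is what the triangular shapes of L and U make cheap.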
!
1.6 Inverses and Transposes

The inverse of A is the matrix A^-1 that brings A to I: A^-1 A = I = A A^-1.

A couple of notes on inverse matrices:
1. The inverse exists if and only if elimination produces n pivots (row exchanges allowed).
2. A can have one and only one inverse.
3. If there is a nonzero vector x such that Ax = 0, then A is not invertible.

The inverse of a diagonal matrix (whose only nonzero entries lie on the diagonal) is the diagonal matrix of the reciprocals of those diagonal entries.

The inverse of a product reverses the order: (AB)^-1 = B^-1 A^-1.

The transpose of the transpose of R is just R: (R^T)^T = R.

If A is symmetric, then A^T = A, and if A is invertible, its inverse A^-1 is symmetric as well.
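A quick numerical check of these identities, using arbitrary invertible matrices:

```python
import numpy as np

A = np.array([[4., 7.], [2., 6.]])  # arbitrary invertible matrix
B = np.array([[1., 2.], [3., 5.]])  # arbitrary invertible matrix

Ainv = np.linalg.inv(A)
prod_inv = np.linalg.inv(A @ B)

ok_identity = np.allclose(Ainv @ A, np.eye(2))               # A^-1 A = I
ok_reverse = np.allclose(prod_inv, np.linalg.inv(B) @ Ainv)  # (AB)^-1 = B^-1 A^-1
ok_transpose = np.allclose(A.T.T, A)                         # (A^T)^T = A
```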
Chapter 2

2.1 Vector Spaces and Subspaces
We can think of the vector space R^n as the space consisting of all column vectors with n components. Within a vector space, a subspace is a set of vectors satisfying two closure conditions:

1. Adding any two vectors in the subspace yields another vector in the subspace.
2. Multiplying any vector in the subspace by any scalar yields a vector in the subspace.

If these two conditions are satisfied, the set is a subspace; in particular, the space spanned by any set of vectors is a subspace.
The column space of A contains all linear combinations of the columns of A. This means that the system Ax = b is solvable exactly when b is in the column space of A, i.e. when b can be reached by some combination of the columns of A.
The nullspace is a counterpart to the column space. While the column space consists of all vectors b for which Ax = b is solvable, the nullspace consists of all solutions x of Ax = 0. The nullspace always contains the zero vector, but it may contain infinitely many other solutions. In particular, Ax = 0 always has solutions beyond the zero vector when there are more unknowns than equations: n > m.

For an invertible matrix, where the number of pivots is n, the nullspace contains only the zero vector, and the column space is the whole space.

The column space is a subspace of R^m.
The nullspace is a subspace of R^n.
2.2 Solving Ax = 0 and Ax = b

The complete solution to a linear system has the form x = xp + xn: a particular solution plus a nullspace vector.
The following highlights the difference between the echelon form and the reduced row echelon form.

Echelon Form:
1. Pivots are the first nonzero entries in their rows.
2. Below each pivot is a column of zeros, obtained by elimination.
3. Each pivot lies to the right of the pivot above it. This produces a staircase pattern.

Reduced Row Echelon Form:
1. Pivots are the first nonzero entries in their rows, and all pivots are 1.
2. Below each pivot is a column of zeros, and above each pivot is a column of zeros.
3. Each pivot lies to the right of the pivot above it.
Pivot variables are the variables corresponding to the columns with pivots in them, whereas free variables are the variables corresponding to the columns without pivots.

The nullspace contains all combinations of the special solutions. The following example highlights the differences between the nullspace and the column space, and includes how to calculate each.

From U and R we know that the pivot columns are the first and third. So, looking back at A, we see that the column space is spanned by the first and third columns of A. Notice that the column space lives in R^3, since m = 3 is the number of rows. Thus, the nullspace lives in R^4, since n = 4 is the number of columns.
To find the general solution to Rx = 0, which will give us the nullspace's basis, assign arbitrary values to the free variables. Calling these arbitrary values v and y, we can determine the pivot variables completely in terms of v and y. To be clear, in the original example, u and w represent the pivot variables.
In a way, there is a "double infinity" of solutions, because v and y are both free. Rewriting this in vector form gives one special solution for each free variable, and the nullspace is formed by all combinations of these special solution vectors.
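Since the original example's matrices are not reproduced here, the following sketch uses a hypothetical R with the same pivot pattern (pivots in columns 1 and 3, so u and w are pivot variables and v and y are free) to show how the special solutions are read off:

```python
import numpy as np

# Hypothetical reduced row echelon form R; pivots in columns 1 and 3
R = np.array([[1., 3., 0., 2.],
              [0., 0., 1., 4.],
              [0., 0., 0., 0.]])

# Special solution 1: set free variables v = 1, y = 0, then u = -3, w = 0
s1 = np.array([-3., 1., 0., 0.])

# Special solution 2: set free variables v = 0, y = 1, then u = -2, w = -4
s2 = np.array([-2., 0., -4., 1.])
```

Every combination of s1 and s2 also solves Rx = 0, which is the "double infinity" of solutions.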
2.3 Linear Independence, Basis, and Dimension

A set of vectors is linearly independent when the only combination of them that yields 0 is the one with every coefficient equal to zero. In other words, the vectors forming the columns of some matrix A are independent exactly when N(A) = {zero vector}.

A basis for a vector space is defined by the following rules:
1. The vectors must be linearly independent. Four 3-component vectors cannot form a basis for R^3, because they must be linearly dependent.
2. The vectors must span the space V, so there cannot be too few of them. Two 3-component vectors will span a plane in R^3, but not the entire space R^3.
2.4 Four Fundamental Subspaces

1. The column space is C(A); its dimension is the rank r.
2. The nullspace is N(A); its dimension is n - r.
3. The row space is C(A^T); its dimension is r.
4. The left nullspace is N(A^T); its dimension is m - r.
Column Space
The column space has been explored above. Its relationship with the row space is that the number of independent columns is exactly the number of independent rows.
Nullspace
The nullspace has been explored above. However, note that the nullspace of A is the same as the nullspace of U and of R.
Row Space
The nonzero rows of U or R form a basis for the row space of A (the row space of A equals that of U and R).
Left Nullspace
The left nullspace has dimension m - r. The left nullspace is actually pretty boring in terms of the 415 final.
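The four dimensions can be checked numerically from the rank; a sketch with an arbitrary example matrix:

```python
import numpy as np

A = np.array([[1., 2., 0., 3.],
              [2., 4., 1., 7.]])   # arbitrary 2x4 example
m, n = A.shape
r = np.linalg.matrix_rank(A)

dim_col = r          # dim C(A)
dim_null = n - r     # dim N(A)
dim_row = r          # dim C(A^T)
dim_left = m - r     # dim N(A^T)
```

Here the two rows are independent, so r = 2, the nullspace has dimension 4 - 2 = 2, and the left nullspace has dimension 2 - 2 = 0.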

2.6 Linear Transformations

For all numbers c and d and all vectors x and y, matrix multiplication satisfies the rule of linearity:

A(cx + dy) = c(Ax) + d(Ay)

Every transformation T(x) that satisfies this rule is a linear transformation.
The following example illustrates the above point. Any input vector will be a linear combination of the basis vectors. So find which combination of the basis vectors yields the input, substitute in the transformed values from the transformation, and add the results. In the original example, the answer is a 3 by 1 matrix.
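The linearity rule can be verified numerically; a sketch using a hypothetical 3-by-2 matrix A, so that T maps R^2 into R^3:

```python
import numpy as np

# Hypothetical 3x2 matrix: T(x) = Ax maps R^2 into R^3
A = np.array([[1., 0.],
              [2., 1.],
              [0., 3.]])

def T(x):
    """The linear transformation given by matrix multiplication."""
    return A @ x

x = np.array([1., 2.])
y = np.array([3., -1.])
c, d = 2.0, -4.0

lhs = T(c * x + d * y)      # transform the combination
rhs = c * T(x) + d * T(y)   # combine the transforms
```

The two sides agree, and the output is a vector with 3 components, matching the "3 by 1" shape noted above.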
Chapter 3

3.1 Orthogonal Vectors and Subspaces

Note: the fundamental subspaces meet at right angles, i.e. they are orthogonal.

Specifically, the row space is orthogonal to the nullspace, and the column space is orthogonal to the left nullspace.

Orthogonal vectors satisfy x^T y = 0.

Orthogonal subspaces must have the property that each vector in the first subspace is orthogonal to each vector in the second subspace.

The orthogonal complement V^⊥ is the space of all vectors orthogonal to V.

Ax = b being solvable means that b is orthogonal to the left nullspace, because b is in the column space of A, and the column space is orthogonal to the left nullspace.

Every matrix transforms its row space onto its column space. This is because, by definition, every b in Ax = b is in the column space of A, and the solution x can be taken in the row space. So the matrix A takes x (in the row space) into the column space.
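The row space ⊥ nullspace relationship can be checked numerically; a sketch with an arbitrary example matrix:

```python
import numpy as np

# Arbitrary example matrix; its rows span the row space (row 3 = row 1 + row 2)
A = np.array([[1., 3., 0., 2.],
              [0., 0., 1., 4.],
              [1., 3., 1., 6.]])

# A vector in the nullspace (free variables x2 = 1, x4 = 0 give x1 = -3, x3 = 0)
xn = np.array([-3., 1., 0., 0.])

# Each entry of A @ xn is the dot product of a row of A with xn;
# all entries are zero, so every row-space vector is orthogonal to xn.
dots = A @ xn
```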
3.2 Cosines and Projections Onto Lines

The cosine of the angle between any nonzero vectors a and b is:

cos θ = (a^T b) / (||a|| ||b||)

The projection of the vector b onto the line in the direction of a is:

p = ((a^T b) / (a^T a)) a

The projection matrix P is the matrix that multiplies b and produces p:

P = (a a^T) / (a^T a)

Notes:
1. P is a symmetric matrix.
2. Its square is itself: P^2 = P.
3. P is the same even if a is doubled.
4. To project b onto a, multiply by P: p = Pb.
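These projection formulas and the notes about P can be verified numerically; a sketch with arbitrary example vectors:

```python
import numpy as np

# Arbitrary example vectors
a = np.array([1., 2., 2.])
b = np.array([3., 0., 3.])

cos_theta = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))  # cosine formula
p = ((a @ b) / (a @ a)) * a    # projection of b onto the line through a
P = np.outer(a, a) / (a @ a)   # projection matrix

# Doubling a leaves P unchanged (note 3)
same_if_doubled = np.allclose(P, np.outer(2 * a, 2 * a) / ((2 * a) @ (2 * a)))
```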
The transpose can be defined from inner products: (Ax)^T y = x^T (A^T y) for all vectors x and y.