Linear Algebra Lecture 4
January 2024
Let n ∈ N. We shall work either entirely with row vectors in R1×n (of length n) or entirely with column vectors in Rn×1 (of length n); both will be referred to as ‘vectors’.
We have already considered a linear combination
α1 a1 + · · · + αm am
of vectors a1, . . . , am. A set S := {a1, . . . , am} of vectors is called linearly dependent if there are α1, . . . , αm ∈ R, not all zero, such that
α1 a1 + · · · + αm am = 0.
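For instance (the specific vectors in this first example are an illustrative choice, supplied here for concreteness):
(i) The set S := { [1 0]T, [0 1]T, [2 3]T } of 3 vectors in R2×1 is linearly dependent, since
2 [1 0]T + 3 [0 1]T − [2 3]T = [0 0]T.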
(ii) The set S := { [1 2 3], [2 3 1], [3 1 2], [0 −3 3] } of vectors in R1×3 is linearly dependent since
[0 −3 3] = [1 2 3] − 2 [2 3 1] + [3 1 2]. Clearly,
[1 2 3] − 2 [2 3 1] + [3 1 2] − [0 −3 3] = [0 0 0].
In (i) above, S is a set of 3 vectors in R2×1 , and in (ii) above, S is a set of 4 vectors
in R1×3 . These examples illustrate an important phenomenon to which we now turn.
First we prove the following crucial result.
Proposition
Let y1, . . . , yr and x1, . . . , xs be vectors of the same length, and suppose each xj is a linear combination of y1, . . . , yr, say
xj = aj1 y1 + · · · + ajr yr = ∑_{k=1}^{r} ajk yk for j = 1, . . . , s.
If r < s, then x1, . . . , xs are linearly dependent.
Proof. Let A := [ajk] ∈ Rs×r. Then AT ∈ Rr×s. Since r < s, the linear system AT x = 0 has a nonzero solution, that is, there are α1, . . . , αs, not all zero, such that ∑_{j=1}^{s} ajk αj = 0 for each k = 1, . . . , r. Hence
∑_{j=1}^{s} αj xj = ∑_{j=1}^{s} αj ( ∑_{k=1}^{r} ajk yk ) = ∑_{k=1}^{r} ( ∑_{j=1}^{s} ajk αj ) yk = 0.
Thus x1, . . . , xs are linearly dependent.
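To make the argument concrete, here is a small illustration (the vectors are our own choice): let y1 := [1 0], y2 := [0 1], and
x1 := y1 + y2 = [1 1], x2 := y1 − y2 = [1 −1], x3 := 2 y1 = [2 0],
so that s = 3 > 2 = r. Here A ∈ R3×2 has rows [1 1], [1 −1], [2 0], and AT x = 0 has the nonzero solution (α1, α2, α3) = (1, 1, −1). Indeed,
x1 + x2 − x3 = [1 1] + [1 −1] − [2 0] = [0 0].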
Corollary
Let n ∈ N and S be a set of vectors of length n. If S has more than n elements, then S is linearly dependent.
(Indeed, every vector of length n is a linear combination of the n basic vectors e1, . . . , en, so the above result applies with r = n.)
A set S of vectors is called linearly independent if it is not linearly dependent, that is, if
α1 a1 + · · · + αm am = 0 =⇒ α1 = · · · = αm = 0
for all distinct a1, . . . , am ∈ S and all α1, . . . , αm ∈ R.
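For illustration, the basic vectors e1 := [1 0 0], e2 := [0 1 0], e3 := [0 0 1] in R1×3 are linearly independent: if
α1 e1 + α2 e2 + α3 e3 = [α1 α2 α3] = [0 0 0],
then α1 = α2 = α3 = 0.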
Suppose c1 , . . . , cn are column vectors each of length m. Let A ∈ Rm×n be the matrix
having c1 , . . . , cn as its n columns. Then for x1 , . . . , xn ∈ R,
x1 c1 + · · · + xn cn = [ c1 · · · cn ] [x1 · · · xn]T = Ax, where x := [x1 · · · xn]T ∈ Rn×1.
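As a quick numerical check (with entries chosen here for illustration): let c1 := [1 3]T and c2 := [2 4]T, so that A is the 2 × 2 matrix with rows [1 2] and [3 4], and let x := [5 6]T. Then
5 c1 + 6 c2 = [5 + 12, 15 + 24]T = [17 39]T,
which is exactly Ax = [1·5 + 2·6, 3·5 + 4·6]T.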
Definition
Let A ∈ Rm×n . The row rank of A is the maximum number of linearly independent
row vectors of A.
Thus the row rank of A is equal to r if and only if there is a linearly independent set of r rows of A and every set of r + 1 rows of A is linearly dependent.
Let r be the row rank of A.
1. Then r = 0 if and only if A = O.
2. Since the total number of rows of A is m, we see that r ≤ m. Also, since the row vectors of A form a subset of R1×n, no more than n of them can be linearly independent. Thus r ≤ n.
Let a1, . . . , am be any m vectors in R1×n. Clearly, they are linearly independent if and only if the matrix A formed with these vectors as row vectors has row rank m, and they are linearly dependent if and only if the row rank of A is less than m.
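Example. (The specific matrix here is a representative choice, consistent with the computation that follows.) Let A be the 3 × 3 matrix with rows a1 := [3 2 0], a2 := [−3 12 0], a3 := [0 0 1]. Then
α1 a1 + α2 a2 + α3 a3 = [3α1 − 3α2, 2α1 + 12α2, α3] = [0 0 0]
yields the following.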
Clearly, α3 = 0, and the two equations 3α1 − 3α2 = 0, 2α1 + 12α2 = 0 show that
α1 = α2 = 0 as well. Thus the 3 rows of A are linearly independent. Hence the row
rank of A is 3.
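Next, consider a matrix A with rows a1, a2, a3 satisfying 6 a1 − a2 + a3 = 0 (for concreteness, one illustrative choice is a1 := [1 0], a2 := [0 1], a3 := [−6 1]). This relation shows that {a1, a2, a3} is linearly dependent, so the row rank of A is less than 3.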
But the set {a1, a2} is linearly independent. Hence the row rank of A is 2.
We used the relation 6 a1 − a2 + a3 = 0 above to determine the row rank of A. It is difficult to come up with such a relation out of nowhere. We shall now develop a systematic approach to finding the row rank of a matrix.
Proposition
If a matrix A is transformed to a matrix A′ by elementary row operations, then the
row ranks of A and A′ are equal, that is, EROs do not alter the row rank of a matrix.
Proof. Ri ←→ Rj with i ̸= j: A and A′ have the same set of row vectors. So there is
nothing to prove.
Rj −→ α Rj with α ̸= 0: {aj, aj1, . . . , ajs} is linearly independent
⇐⇒ {α aj, aj1, . . . , ajs} is linearly independent.
Thus the maximum number of linearly independent rows of A is the same as the
maximum number of linearly independent rows of A′ . That is, the row ranks of A and
A′ are equal.
Ri −→ Ri + α Rj with i ̸= j: Suppose the set {a1, . . . , ai, . . . , aj, . . . , am} of all row vectors of A contains a linearly independent subset S := {aj1, . . . , ajs} having s elements. We claim that the set {a1, . . . , ai + α aj, . . . , aj, . . . , am} of all row vectors of A′ also contains a linearly independent set S′ having s elements. If ai ̸∈ S, then we let S′ := S. Next, suppose ai ∈ S, say ai = aj1. If {ai + α aj, aj2, . . . , ajs} is linearly independent, we let S′ be this set. Otherwise, since {aj2, . . . , ajs} is linearly independent, ai + α aj must be a linear combination of aj2, . . . , ajs; but then aj cannot be a linear combination of aj2, . . . , ajs (or else ai would be one as well, contradicting the linear independence of S), and we let S′ := {aj, aj2, . . . , ajs}, which consists of row vectors of A′. In either case, S′ is a linearly independent set of s row vectors of A′. Thus the row rank of A′ is at least the row rank of A. Since the ERO Ri −→ Ri − α Rj transforms A′ back to A, the reverse inequality holds as well. Hence the row ranks of A and A′ are equal.
Now suppose EROs transform A to a matrix A′ which is in row reduced echelon form. We show that the row rank of A′ is the number of its nonzero rows.
Proof. Let the number of nonzero rows of A′ be r, and denote these rows by a1, . . . , ar. Let the pivots in these rows be in columns k1, . . . , kr, where 1 ≤ k1 < · · · < kr ≤ n. Suppose
α1 a1 + · · · + αr ar = 0,
where α1, . . . , αr ∈ R.
The kj th component of α1 a1 + · · · + αr ar is equal to αj since all entries in the kj th
column other than the pivot are equal to 0.
Hence αj = 0. Thus α1 = · · · = αr = 0. This shows that the first r rows of A′ are
linearly independent.
Also, since the last m − r rows of A′ are zero rows, any r + 1 row vectors of A′ will
contain the vector 0, and so they will be linearly dependent. Hence the
row rank of A′ = r .
This completes the proof.
Proposition
The row rank of a matrix is equal to the number of nonzero rows in any row reduced
echelon form of the matrix.
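For example (an illustrative computation with a matrix of our choosing): let A be the 3 × 3 matrix with rows [1 2 3], [2 4 6], [1 1 1]. The EROs R2 −→ R2 − 2 R1 and R3 −→ R3 − R1 give rows [1 2 3], [0 0 0], [0 −1 −2]; after R2 ←→ R3, R2 −→ (−1) R2 and R1 −→ R1 − 2 R2, we obtain the row reduced echelon form with rows
[1 0 −1], [0 1 2], [0 0 0].
It has 2 nonzero rows, and hence the row rank of A is 2.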
Definition
Let A ∈ Rm×n . The column rank of A is the maximum number of linearly independent
column vectors of A.
Proposition
The column rank of a matrix is equal to its row rank.
Proof. Let r be the row rank of A, and let a1, . . . , am denote the row vectors of A. Among these m row vectors of A, there are r row vectors aj1, . . . , ajr which are linearly independent, and each row vector of A is a linear combination of these r row vectors, say
ai = ci1 aj1 + · · · + cir ajr for i = 1, . . . , m,
where the cil ∈ R. Comparing the kth components on both sides, we obtain
aik = ci1 aj1k + · · · + cir ajrk for i = 1, . . . , m,
for k = 1, . . . , n. These m equations show that the kth column of A can be written as follows:
ck = aj1k d1 + · · · + ajrk dr, where dl := [c1l · · · cml]T for l = 1, . . . , r.
Thus each of the n columns of A is a linear combination of the r column vectors d1, . . . , dr, and so, by the crucial result proved earlier, at most r of the columns can be linearly independent. Hence the column rank of A is at most the row rank of A. Applying this conclusion to AT, we see that the row rank of A is at most the column rank of A. The two are therefore equal.
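As a cross-check on the illustrative matrix used above (rows [1 2 3], [2 4 6], [1 1 1]): its columns are c1 = [1 2 1]T, c2 = [2 4 1]T, c3 = [3 6 1]T. Here c3 = −c1 + 2 c2, while c1 and c2 are linearly independent; hence the column rank is 2, equal to the row rank found earlier.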
In view of the above result, we define the rank of a matrix A to be the common value of the row rank of A and the column rank of A, and we denote it by rank A.
Consequently, rank AT = rank A.
(Note: In a row echelon form, each nonzero row has exactly one pivot, and a zero row has no pivot; hence rank A is the number of pivots in any row echelon form of A.)
Let m = n. Then
A is invertible ⇐⇒ rank A = n.
(Recall: A square matrix is invertible if and only if it can be transformed to the
identity matrix by EROs.)
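As an illustration (a 2 × 2 example of our choosing): let A have rows [1 2] and [3 4]. The EROs R2 −→ R2 − 3 R1, R2 −→ (−1/2) R2 and R1 −→ R1 − 2 R2 transform A to the identity matrix, so A is invertible and rank A = 2. By contrast, the matrix B with rows [1 2] and [2 4] is transformed by R2 −→ R2 − 2 R1 to a matrix with rows [1 2] and [0 0]; thus rank B = 1 < 2, and B is not invertible.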