
STAT 714 MATRIX ALGEBRA REVIEW 3

TERMINOLOGY : Let V ⊆ Rn be a set of n × 1 vectors. We call V a vector space if

(i) x1 ∈ V, x2 ∈ V ⇒ x1 + x2 ∈ V, and

(ii) x ∈ V ⇒ cx ∈ V for all c ∈ R.

That is, V is closed under addition and scalar multiplication.

TERMINOLOGY : A set of n × 1 vectors S ⊆ Rn is a subspace of V if S is a vector space and S ⊆ V; i.e., if x ∈ S ⇒ x ∈ V.

TERMINOLOGY : We say that subspaces S1 and S2 are orthogonal, and write S1 ⊥ S2, if x1′x2 = 0, for all x1 ∈ S1 and for all x2 ∈ S2.

Example. Suppose that V = R3. Then, V is a vector space.

Proof. Suppose x1 ∈ V and x2 ∈ V. Then, x1 + x2 ∈ V and cx1 ∈ V for all c ∈ R. □

Example. Suppose that V = R3. The subspace consisting of the z-axis is

    S1 = {(0, 0, z)′ : z ∈ R}.

The subspace consisting of the x-y plane is

    S2 = {(x, y, 0)′ : x, y ∈ R}.

It is easy to see that S1 and S2 are orthogonal. That S1 is a subspace is argued as follows.
Clearly, S1 ⊆ V. Now, suppose that x1 ∈ S1 and x2 ∈ S1; i.e.,

    x1 = (0, 0, z1)′ and x2 = (0, 0, z2)′,

for z1, z2 ∈ R. Then,

    x1 + x2 = (0, 0, z1 + z2)′ ∈ S1

and

    cx1 = (0, 0, cz1)′ ∈ S1,

for all c ∈ R. Thus, S1 is a subspace. That S2 is a subspace follows similarly. □
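
As a quick numerical illustration (not part of the original notes), the following Python/NumPy sketch verifies that a representative vector from S1 and one from S2 have zero inner product; the particular values of z, x, and y are arbitrary choices.

    import numpy as np

    # Representative vectors: (0, 0, z)' from S1 and (x, y, 0)' from S2;
    # z = 4, x = 2, y = -7 are arbitrary illustrative values.
    x1 = np.array([0.0, 0.0, 4.0])
    x2 = np.array([2.0, -7.0, 0.0])

    # The inner product has the form 0*x + 0*y + z*0 = 0 for any such pair.
    print(np.dot(x1, x2))  # 0.0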


TERMINOLOGY : Suppose that V is a vector space and that x1, x2, ..., xn ∈ V. The set of all linear combinations of x1, x2, ..., xn; i.e.,

    S = {x ∈ V : x = c1x1 + c2x2 + · · · + cnxn ; ci ∈ R},

is a subspace of V. We say that S is generated by x1, x2, ..., xn. In other words, S is the space spanned by x1, x2, ..., xn, written S = span{x1, x2, ..., xn}.

Example. Suppose that V = R3 and let

    x1 = (1, 1, 1)′ and x2 = (1, 0, 0)′.

For c1, c2 ∈ R, the linear combination c1x1 + c2x2 = (c1 + c2, c1, c1)′. Thus, the space spanned by x1 and x2 is the subspace S which consists of all the vectors in R3 of the form (a, b, b)′, for a, b ∈ R. □
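
To make the span concrete, one can solve numerically for the coefficients c1, c2 that reproduce a target vector of the form (a, b, b)′. A minimal NumPy sketch; the target (5, 2, 2)′ is an arbitrary choice:

    import numpy as np

    x1 = np.array([1.0, 1.0, 1.0])
    x2 = np.array([1.0, 0.0, 0.0])
    X = np.column_stack([x1, x2])       # 3 x 2 matrix with columns x1, x2

    target = np.array([5.0, 2.0, 2.0])  # (a, b, b)' with a = 5, b = 2

    # Least squares recovers c1 = b = 2 and c2 = a - b = 3 exactly,
    # since the target lies in span{x1, x2}.
    c, *_ = np.linalg.lstsq(X, target, rcond=None)
    print(c)                            # [2. 3.]
    print(np.allclose(X @ c, target))   # True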

TERMINOLOGY : Suppose that S is a subspace of V. If {x1, x2, ..., xn} is a linearly independent spanning set for S, we call {x1, x2, ..., xn} a basis for S. In general, a basis is not unique. However, the number of vectors in the basis, called the dimension of S, written dim(S), is unique.
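
Numerically, the dimension of a spanned subspace can be computed as the rank of the matrix whose columns are the spanning vectors. A short sketch, reusing x1 and x2 from the example above:

    import numpy as np

    # Columns are the spanning vectors x1 = (1, 1, 1)' and x2 = (1, 0, 0)'.
    X = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [1.0, 0.0]])

    # x1 and x2 are linearly independent, so {x1, x2} is a basis of S
    # and dim(S) = 2.
    print(np.linalg.matrix_rank(X))  # 2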

Result MAR3.1. Suppose that S and T are vector spaces. If S ⊆ T , and dim(S) =
dim(T ), then S = T .
Proof. See pp. 244-245 in Monahan. □

TERMINOLOGY : The subspaces S1 and S2 are orthogonal complements in Rm if and only if S1 ⊆ Rm, S2 ⊆ Rm, S1 and S2 are orthogonal, S1 ∩ S2 = {0}, and dim(S1) = r and dim(S2) = m − r, for some 0 ≤ r ≤ m.

Result MAR3.2. Let S1 and S2 be orthogonal complements in Rm. Then, any vector y ∈ Rm can be uniquely decomposed as y = y1 + y2, where y1 ∈ S1 and y2 ∈ S2.
Proof. A basis for S1 together with a basis for S2 contains r + (m − r) = m linearly independent vectors. Suppose that the decomposition is not possible; then y is linearly independent of all of these basis vectors. However, this would give m + 1 linearly independent vectors in Rm, which is not possible. Thus, the decomposition must be possible. To establish uniqueness, suppose that y = y1 + y2 and y = y1∗ + y2∗, where y1, y1∗ ∈ S1 and y2, y2∗ ∈ S2. Then, y1 − y1∗ = y2∗ − y2. But, y1 − y1∗ ∈ S1 and y2∗ − y2 ∈ S2, so this common vector lies in S1 ∩ S2 = {0}. Thus, both y1 − y1∗ and y2∗ − y2 must be the 0 vector. □

NOTE : In the last result, note that we can write

    ||y||² = y′y = (y1 + y2)′(y1 + y2) = y1′y1 + 2y1′y2 + y2′y2 = ||y1||² + ||y2||².

This is simply the Pythagorean Theorem. The cross-product term is zero since y1 and y2 are orthogonal.
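
The identity is easy to check numerically with the z-axis / x-y plane decomposition from the earlier example; y = (2, −7, 4)′ is an arbitrary choice, split as y1 = (0, 0, 4)′ ∈ S1 and y2 = (2, −7, 0)′ ∈ S2:

    import numpy as np

    y  = np.array([2.0, -7.0, 4.0])
    y1 = np.array([0.0,  0.0, 4.0])   # component in S1 (the z-axis)
    y2 = np.array([2.0, -7.0, 0.0])   # component in S2 (the x-y plane)

    # ||y||^2 = ||y1||^2 + ||y2||^2 because y1'y2 = 0.
    lhs = np.dot(y, y)
    rhs = np.dot(y1, y1) + np.dot(y2, y2)
    print(lhs, rhs, np.isclose(lhs, rhs))  # 69.0 69.0 True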


TERMINOLOGY : For the matrix

    Am×n = [a1 a2 · · · an],

where aj is m × 1, the column space of A,

    C(A) = {x ∈ Rm : x = c1a1 + c2a2 + · · · + cnan ; cj ∈ R}
         = {x ∈ Rm : x = Ac ; c ∈ Rn},

is the set of all m × 1 vectors spanned by the columns of A; that is, C(A) is the set of all vectors that can be written as a linear combination of the columns of A. The dimension of C(A) is the column rank of A.
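
A standard numerical membership test follows directly from the definition: x ∈ C(A) exactly when appending x as an extra column does not increase the rank. A minimal sketch (the helper name in_column_space is ours, not from the notes):

    import numpy as np

    def in_column_space(A, x):
        # x = Ac is solvable iff rank([A | x]) == rank(A).
        return (np.linalg.matrix_rank(np.column_stack([A, x]))
                == np.linalg.matrix_rank(A))

    A = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [1.0, 0.0]])                            # C(A) = {(a, b, b)'}
    print(in_column_space(A, np.array([5.0, 2.0, 2.0])))  # True
    print(in_column_space(A, np.array([0.0, 1.0, 2.0])))  # False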

TERMINOLOGY : Let Am×n be the matrix with rows b1′, b2′, ..., bm′, where each bi is n × 1. Denote

    R(A) = {x ∈ Rn : x = d1b1 + d2b2 + · · · + dmbm ; di ∈ R}
         = {x ∈ Rn : x′ = d′A ; d ∈ Rm}.

We call R(A) the row space of A. It is the set of all n × 1 vectors spanned by the rows of A; that is, the set of all vectors that can be written as a linear combination of the rows of A. The dimension of R(A) is the row rank of A.

TERMINOLOGY : The set N(A) = {x ∈ Rn : Ax = 0} is called the null space of A. The dimension of N(A) is called the nullity of A.
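
An orthonormal basis for N(A) can be computed from the singular value decomposition: the right singular vectors associated with (numerically) zero singular values span the null space. A minimal sketch; the tolerance 1e-10 is an arbitrary choice:

    import numpy as np

    def null_space_basis(A, tol=1e-10):
        # Full SVD: the rows of Vt are the right singular vectors of A.
        U, s, Vt = np.linalg.svd(A)
        # Rows of Vt beyond the numerical rank span N(A); return them
        # as columns, giving an n x (n - rank) basis matrix.
        rank = np.sum(s > tol)
        return Vt[rank:].T

    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])       # rank 1, so the nullity is 1
    N = null_space_basis(A)
    print(N.shape[1])                # 1
    print(np.allclose(A @ N, 0.0))   # True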

Result MAR3.3.

(a) C(B) ⊆ C(A) iff B = AC for some matrix C.

(b) R(B) ⊆ R(A) iff B = DA for some matrix D.

(c) C(A), R(A), and N(A) are all vector spaces.

(d) R(A′) = C(A) and C(A′) = R(A).

(e) C(A′A) = C(A′) and R(A′A) = R(A); see the numerical check below.

(f) For any A and B, C(AB) ⊆ C(A). If B is nonsingular, then C(AB) = C(A).
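
Part (e) in particular can be spot-checked numerically: C(A′A) = C(A′) implies rank(A′A) = rank(A′), and placing the columns of A′ next to those of A′A should not increase the rank. A sketch with a hypothetical rank-2 matrix:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 3))
    A[:, 2] = A[:, 0] + A[:, 1]      # force rank 2 so the check is non-trivial

    AtA = A.T @ A

    # Equal ranks ...
    print(np.linalg.matrix_rank(AtA) == np.linalg.matrix_rank(A.T))   # True
    # ... and each column space contains the other.
    both = np.hstack([AtA, A.T])
    print(np.linalg.matrix_rank(both) == np.linalg.matrix_rank(AtA))  # True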


Result MAR3.4. If A has full column rank, then N(A) = {0}.
Proof. Suppose that A has full column rank. Then, the columns of A are linearly independent and the only solution to Ax = 0 is x = 0. □

Example. Define

        1 1 2             3
    A = 1 0 3   and   c = −1 .
        1 0 3             −1

The column space of A is the set of all linear combinations of the columns a1, a2, a3 of A; i.e., the set of vectors of the form

    c1a1 + c2a2 + c3a3 = (c1 + c2 + 2c3, c1 + 3c3, c1 + 3c3)′,

where c1, c2, c3 ∈ R. Thus, the column space C(A) is the set of all 3 × 1 vectors of the form (a, b, b)′, where a, b ∈ R. Any two vectors of {a1, a2, a3} span this space. In addition, any two of {a1, a2, a3} are linearly independent, and hence form a basis for C(A). The set {a1, a2, a3} is not linearly independent since Ac = 0. The dimension of C(A); i.e., the rank of A, is r = 2. The dimension of N(A) is 1, and c forms a basis for this space. □
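
The claims in this example are easy to confirm numerically; a minimal NumPy check:

    import numpy as np

    A = np.array([[1.0, 1.0, 2.0],
                  [1.0, 0.0, 3.0],
                  [1.0, 0.0, 3.0]])
    c = np.array([3.0, -1.0, -1.0])

    print(np.linalg.matrix_rank(A))         # 2: dim{C(A)} = r = 2
    print(A @ c)                            # [0. 0. 0.]: c lies in N(A)
    print(np.linalg.matrix_rank(A[:, :2]))  # 2: a1, a2 are linearly independent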

Result MAR3.5. For an m × n matrix A with rank r ≤ n, the dimension of N(A) is n − r. That is, dim{C(A)} + dim{N(A)} = n.
Proof. See pp. 241-242 in Monahan. □
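
The relationship can be checked for any matrix by comparing the rank with the number of (numerically) zero singular values; a quick sketch on a hypothetical random 4 × 6 matrix:

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((4, 6))  # m = 4, n = 6, so r <= 4 and nullity >= 2

    r = np.linalg.matrix_rank(A)
    s = np.linalg.svd(A, compute_uv=False)
    nullity = A.shape[1] - np.sum(s > 1e-10)

    print(r + nullity == A.shape[1])  # True: dim{C(A)} + dim{N(A)} = n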

Result MAR3.6. For an m × n matrix A, N(A′) and C(A) are orthogonal complements in Rm.
Proof. Both N(A′) and C(A) are vector spaces with vectors in Rm. From the last result, we know that dim{C(A)} = rank(A) = r, say, and dim{N(A′)} = m − r, since r = rank(A) = rank(A′). Now we need to show N(A′) ∩ C(A) = {0}. Suppose x is in both spaces. If x ∈ C(A), then x = Ac for some c. If x ∈ N(A′), then A′x = 0. Thus,

    A′x = A′Ac = 0 =⇒ c′A′Ac = 0 =⇒ (Ac)′Ac = 0 =⇒ Ac = x = 0.

To finish the proof, we need to show that N(A′) and C(A) are orthogonal spaces. Suppose that x1 ∈ C(A) and x2 ∈ N(A′). It suffices to show that x1′x2 = 0. But, note that x1 ∈ C(A) =⇒ x1 = Ac, for some c. Also, x2 ∈ N(A′) =⇒ A′x2 = 0. Since x1′x2 = (Ac)′x2 = c′A′x2 = c′0 = 0, the result follows. □
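
Combining this with Result MAR3.2, any y ∈ Rm splits uniquely into a piece in C(A) plus a piece in N(A′); numerically, the C(A) piece is the least squares projection of y onto the columns of A. A sketch, reusing the earlier (a, b, b)′ column space and an arbitrary y:

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [1.0, 0.0]])       # C(A) = {(a, b, b)'}, a plane in R^3
    y = np.array([4.0, 1.0, 3.0])

    # y1 = A c_hat is the least squares projection of y onto C(A).
    c_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
    y1 = A @ c_hat
    y2 = y - y1                       # the remaining piece

    print(np.allclose(A.T @ y2, 0.0))       # True: y2 is in N(A')
    print(np.isclose(np.dot(y1, y2), 0.0))  # True: y1'y2 = 0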

Result MAR3.7. Suppose that S1 and T1 are orthogonal complements, and that S2 and T2 are orthogonal complements. If S1 ⊆ S2, then T2 ⊆ T1.
Proof. See p. 244 in Monahan. □
