33A Linear Algebra: Homework 5 Solutions
Puck Rombach
Warm-up exercises
Problem 1
.........
Answer. We know that $V$ is 1-dimensional. There are only two vectors of length 1 in any 1-dimensional space, because the only scalars that we can multiply a basis vector $\vec{v}$ by to get a unit vector are $\frac{1}{\|\vec{v}\|}$ and $-\frac{1}{\|\vec{v}\|}$. There are of course systematic ways of finding the missing vector, but in this case it is not hard to guess
$$\vec{u}_4 = \frac{1}{2}\begin{pmatrix}1\\1\\1\\1\end{pmatrix} \quad\text{or}\quad \vec{u}_4 = -\frac{1}{2}\begin{pmatrix}1\\1\\1\\1\end{pmatrix}.$$
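If you would rather not guess, one systematic route is to compute the orthogonal complement numerically. Here is a small numpy sketch (not part of the original solution; the three given orthonormal vectors are hypothetical stand-ins, since the problem statement is not reproduced above):

```python
import numpy as np

# Hypothetical stand-ins for the three given orthonormal vectors u1, u2, u3;
# any orthonormal triple in R^4 behaves the same way.
U, _ = np.linalg.qr(np.random.randn(4, 3))   # columns are orthonormal

# The missing direction spans the null space of U^T. The last right-singular
# vector from the SVD of U^T is a unit vector orthogonal to every column of U.
_, _, Vt = np.linalg.svd(U.T)
u4 = Vt[-1]

print(np.allclose(U.T @ u4, 0))             # orthogonal to u1, u2, u3: True
print(np.isclose(np.linalg.norm(u4), 1.0))  # unit length: True
# -u4 is the only other unit vector in this 1-dimensional complement.
```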
Problem 2
Problem 3
In $\mathbb{R}^4$, consider the subspace $V$ spanned by the vectors
$$\begin{pmatrix}1\\1\\-1\\0\end{pmatrix} \quad\text{and}\quad \begin{pmatrix}0\\1\\1\\-1\end{pmatrix}.$$
Find the matrix of the orthogonal projection onto $V$.
.........
Answer. This basis is easy to turn into an orthonormal basis, because the two vectors are already orthogonal. However, let's practice the direct way, without doing that. We can directly apply the formula
$$P = A(A^T A)^{-1} A^T,$$
where
$$A = \begin{pmatrix} 1 & 0\\ 1 & 1\\ -1 & 1\\ 0 & -1 \end{pmatrix}.$$
We obtain
$$P = \frac{1}{9}\begin{pmatrix} 3 & 3 & -3 & 0\\ 3 & 6 & 0 & -3\\ -3 & 0 & 6 & -3\\ 0 & -3 & -3 & 3 \end{pmatrix}.$$
Compare this to your own $QQ^T$.
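As a sanity check, here is a short numpy sketch (assuming the reconstructed signs of the spanning vectors above) confirming that the direct formula agrees with $QQ^T$:

```python
import numpy as np

# The two spanning vectors of V as the columns of A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [-1.0, 1.0],
              [0.0, -1.0]])

# Direct projection formula P = A (A^T A)^{-1} A^T.
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.round(9 * P).astype(int))  # matches the matrix above

# Same projection from an orthonormal basis of V: P = Q Q^T.
Q, _ = np.linalg.qr(A)
print(np.allclose(P, Q @ Q.T))  # True
print(np.allclose(P @ P, P))    # projections are idempotent
```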
Problem 4
Show that, if a set of vectors $\vec{v}_1, \vec{v}_2, \dots, \vec{v}_m$ is linearly independent, and $\vec{v}_m^{\perp} = \vec{v}_m - \mathrm{proj}_W \vec{v}_m$, with $W = \mathrm{span}(\vec{v}_1, \vec{v}_2, \dots, \vec{v}_{m-1})$, then we have that $\mathrm{span}(\vec{v}_1, \vec{v}_2, \dots, \vec{v}_{m-1}, \vec{v}_m) = \mathrm{span}(\vec{v}_1, \vec{v}_2, \dots, \vec{v}_{m-1}, \vec{v}_m^{\perp})$.
.........
Answer. This is an important fact to think about if you want to justify Gram-Schmidt (G-S) as an algorithm for changing a basis of a space to an orthonormal basis, while still spanning the same space.
To show that two sets are the same, we can show that one is contained in the other and vice versa. We have that
$$\vec{v}_m^{\perp} = \vec{v}_m - (k_1\vec{v}_1 + \dots + k_{m-1}\vec{v}_{m-1})$$
for some scalars $k_1, \dots, k_{m-1}$, since $\mathrm{proj}_W \vec{v}_m \in W$. So $\vec{v}_m^{\perp}$ is a linear combination of $\vec{v}_1, \dots, \vec{v}_m$, and, rearranging, $\vec{v}_m = \vec{v}_m^{\perp} + k_1\vec{v}_1 + \dots + k_{m-1}\vec{v}_{m-1}$ is a linear combination of $\vec{v}_1, \dots, \vec{v}_{m-1}, \vec{v}_m^{\perp}$. Each spanning set is therefore contained in the span of the other, which gives both inclusions.
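Numerically, the claim says that one Gram-Schmidt step does not change the span. A quick numpy illustration on random vectors (a check of the statement, not part of the proof):

```python
import numpy as np

# Random linearly independent vectors v1, v2, v3 in R^5 (illustrative only;
# the claim is general).
rng = np.random.default_rng(0)
V = rng.standard_normal((5, 3))          # columns v1, v2, v3
W = V[:, :2]                             # W = span(v1, v2)

# Replace v3 by its component orthogonal to W (one Gram-Schmidt step).
P_W = W @ np.linalg.inv(W.T @ W) @ W.T   # projection onto W
v3_perp = V[:, 2] - P_W @ V[:, 2]
V_new = np.column_stack([W, v3_perp])

# Two column spans are equal iff their projection matrices agree.
proj = lambda M: M @ np.linalg.inv(M.T @ M) @ M.T
print(np.allclose(proj(V), proj(V_new)))  # True
```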
Quiz Exercises
Problem 1
Answer. First,
$$\vec{u}_1 = \frac{\vec{v}_1}{\|\vec{v}_1\|} = \frac{1}{\sqrt{2}}\begin{pmatrix}1\\1\\0\end{pmatrix}.$$
Next,
$$\vec{v}_2^{\perp} = \vec{v}_2 - \vec{v}_2^{\parallel} = \vec{v}_2 - (\vec{v}_2 \cdot \vec{u}_1)\vec{u}_1 = \begin{pmatrix}1\\0\\1\end{pmatrix} - \begin{pmatrix}1/2\\1/2\\0\end{pmatrix} = \begin{pmatrix}1/2\\-1/2\\1\end{pmatrix}.$$
Finally:
$$\vec{u}_2 = \frac{\vec{v}_2^{\perp}}{\|\vec{v}_2^{\perp}\|} = \frac{1}{\sqrt{3/2}}\begin{pmatrix}1/2\\-1/2\\1\end{pmatrix} = \begin{pmatrix}1/(2\sqrt{3/2})\\-1/(2\sqrt{3/2})\\1/\sqrt{3/2}\end{pmatrix} = \frac{1}{\sqrt{6}}\begin{pmatrix}1\\-1\\2\end{pmatrix}.$$
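The same computation in numpy, assuming the reconstructed starting vectors $\vec{v}_1 = (1,1,0)^T$ and $\vec{v}_2 = (1,0,1)^T$ (the original problem statement is not reproduced above):

```python
import numpy as np

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])

u1 = v1 / np.linalg.norm(v1)             # first Gram-Schmidt step
v2_perp = v2 - (v2 @ u1) * u1            # remove the component along u1
u2 = v2_perp / np.linalg.norm(v2_perp)   # normalize

print(u1)                      # approx [0.7071  0.7071  0.    ]
print(u2)                      # approx [0.4082 -0.4082  0.8165] = (1,-1,2)/sqrt(6)
print(np.isclose(u1 @ u2, 0))  # orthonormal: True
```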
Problem 2
Find the orthogonal projection of the vector $\vec{x} = \begin{pmatrix}1\\2\\1\end{pmatrix}$ onto $V$ from the previous question.
.........
Answer. We use the dot product definition:
$$\mathrm{proj}_V \vec{x} = (\vec{x} \cdot \vec{u}_1)\vec{u}_1 + (\vec{x} \cdot \vec{u}_2)\vec{u}_2 = \frac{3}{2}\begin{pmatrix}1\\1\\0\end{pmatrix} + \frac{1}{3}\begin{pmatrix}1/2\\-1/2\\1\end{pmatrix} = \frac{1}{6}\begin{pmatrix}10\\8\\2\end{pmatrix} = \frac{1}{3}\begin{pmatrix}5\\4\\1\end{pmatrix}.$$
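A quick numpy check of this projection, using $\vec{u}_1$ and $\vec{u}_2$ from the previous problem:

```python
import numpy as np

# u1, u2 from the previous problem; x as stated above.
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0, 2.0]) / np.sqrt(6)
x = np.array([1.0, 2.0, 1.0])

proj = (x @ u1) * u1 + (x @ u2) * u2
print(proj)            # approx [1.6667 1.3333 0.3333] = (1/3)(5, 4, 1)

# Cross-check: the residual is orthogonal to V.
r = x - proj
print(np.isclose(r @ u1, 0), np.isclose(r @ u2, 0))  # True True
```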
Problem 3
Express $\vec{x} = \begin{pmatrix}1\\2\\1\end{pmatrix}$ as the sum of a vector in the kernel of $A = \begin{pmatrix}1 & 0 & -1\\ -1 & 1 & 0\end{pmatrix}$ and one in its row-span.
.........
Answer. We know that $\ker(A) = (\text{row span})^{\perp}$, so the decomposition $\vec{x} = \vec{x}_k + \vec{x}_{\perp}$ is unique. This means we only have to project $\vec{x}$ once, and may as well do that to a 1-dimensional space. We find with the usual methods that $\ker(A) = \mathrm{span}\begin{pmatrix}1\\1\\1\end{pmatrix}$. Then, with $\vec{u}_1 = \frac{1}{\sqrt{3}}\begin{pmatrix}1\\1\\1\end{pmatrix}$,
$$\vec{x}_k = (\vec{x} \cdot \vec{u}_1)\vec{u}_1 = \frac{4}{3}\begin{pmatrix}1\\1\\1\end{pmatrix},$$
and
$$\vec{x}_{\perp} = \vec{x} - \vec{x}_k = \begin{pmatrix}1\\2\\1\end{pmatrix} - \frac{4}{3}\begin{pmatrix}1\\1\\1\end{pmatrix} = \frac{1}{3}\begin{pmatrix}-1\\2\\-1\end{pmatrix}.$$
Now we have that
$$\vec{x} = \begin{pmatrix}1\\2\\1\end{pmatrix} = \frac{4}{3}\begin{pmatrix}1\\1\\1\end{pmatrix} + \frac{1}{3}\begin{pmatrix}-1\\2\\-1\end{pmatrix} = \vec{x}_k + \vec{x}_{\perp},$$
where indeed $\vec{x}_k$ is in the kernel of $A$ and $\vec{x}_{\perp}$ is in the row-space, since $\vec{x}_{\perp} = \frac{1}{3}(\vec{r}_1 + 2\vec{r}_2)$.
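A short numpy verification of the decomposition (with the signs of $A$ as reconstructed above):

```python
import numpy as np

A = np.array([[1.0, 0.0, -1.0],
              [-1.0, 1.0, 0.0]])
x = np.array([1.0, 2.0, 1.0])

u1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)  # unit vector spanning ker(A)
x_k = (x @ u1) * u1                          # kernel component
x_perp = x - x_k                             # row-space component

print(np.allclose(A @ x_k, 0))                     # x_k in ker(A): True
print(np.allclose(x_perp, (A[0] + 2 * A[1]) / 3))  # x_perp = (r1 + 2 r2)/3: True
```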
Problem 4
Let $\vec{u}_1, \dots, \vec{u}_p$ be an orthonormal set in $\mathbb{R}^n$. For any $\vec{x} \in \mathbb{R}^n$, what is the relationship between $\|\vec{x}\|^2$ and $(\vec{u}_1 \cdot \vec{x})^2 + \dots + (\vec{u}_p \cdot \vec{x})^2$? When are they equal?
.........
Answer. Because of the orthonormality of the set $\vec{u}_1, \dots, \vec{u}_p$, we have that
$$\|k_1\vec{u}_1 + \dots + k_p\vec{u}_p\| = \sqrt{k_1^2 + \dots + k_p^2}.$$
In particular, the projection of $\vec{x}$ onto $W = \mathrm{span}(\vec{u}_1, \dots, \vec{u}_p)$ is $(\vec{u}_1 \cdot \vec{x})\vec{u}_1 + \dots + (\vec{u}_p \cdot \vec{x})\vec{u}_p$, so
$$(\vec{u}_1 \cdot \vec{x})^2 + \dots + (\vec{u}_p \cdot \vec{x})^2 = \|\mathrm{proj}_W \vec{x}\|^2 \le \|\vec{x}\|^2,$$
since $\|\vec{x}\|^2 = \|\mathrm{proj}_W \vec{x}\|^2 + \|\vec{x} - \mathrm{proj}_W \vec{x}\|^2$ by the Pythagorean theorem. They are equal exactly when $\vec{x} \in W$.
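This is Bessel's inequality. A small numpy illustration with an arbitrary orthonormal pair (the values here are chosen only for the example):

```python
import numpy as np

# An orthonormal pair in R^3 and a test vector.
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0, 2.0]) / np.sqrt(6)
x = np.array([1.0, 2.0, 1.0])

bessel = (u1 @ x) ** 2 + (u2 @ x) ** 2
print(bessel, x @ x)            # approx 4.667 <= 6.0

# Equality holds when x lies in span(u1, u2):
y = 2 * u1 - 3 * u2
print(np.isclose((u1 @ y) ** 2 + (u2 @ y) ** 2, y @ y))  # True
```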
Problem 5
Problem 6
Optional Exercises
Problem 1
Describe all orthogonal $2 \times 2$ matrices. (An orthogonal matrix is a matrix such that its columns are an orthonormal set.)
.........
Answer. Suppose that we pick any unit vector for the first column; then there are 2 options for the second column that complete the orthonormal basis for $\mathbb{R}^2$. So, we could describe these matrices as matrices of the form
$$\begin{pmatrix} a & -b\\ b & a \end{pmatrix} \quad\text{or}\quad \begin{pmatrix} a & b\\ b & -a \end{pmatrix}, \quad\text{with } a^2 + b^2 = 1.$$
This is a precise enough answer, but let's interpret it geometrically. We recognize the first option as all the pure rotations. For the second one, it might be a little harder to notice what type of transformation it is. As discussed in the lecture, we can guess intuitively that the only transformations that have the right magnitude- and angle-preserving properties are rotations and reflections. We can describe reflection matrices (reflection across the line perpendicular to a unit vector $(u_1, u_2)^T$ is $I - 2\vec{u}\vec{u}^T$) as
$$\begin{pmatrix} 1 - 2u_1^2 & -2u_1u_2\\ -2u_1u_2 & 1 - 2u_2^2 \end{pmatrix} = \begin{pmatrix} u_2^2 - u_1^2 & -2u_1u_2\\ -2u_1u_2 & u_1^2 - u_2^2 \end{pmatrix}, \quad\text{with } u_1^2 + u_2^2 = 1.$$
We can check easily that these are indeed exactly the matrices
$$\begin{pmatrix} a & b\\ b & -a \end{pmatrix}, \quad\text{with } a^2 + b^2 = 1,$$
under the substitution $a = 1 - 2u_1^2$ and $b = -2u_1u_2$.
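A quick numpy check that both families really are orthogonal, with determinant $+1$ for rotations and $-1$ for reflections:

```python
import numpy as np

# Sample a point (a, b) on the unit circle and build one matrix of each type.
theta = np.random.uniform(0, 2 * np.pi)
a, b = np.cos(theta), np.sin(theta)       # a^2 + b^2 = 1

R = np.array([[a, -b], [b, a]])           # rotation form
F = np.array([[a, b], [b, -a]])           # reflection form

for M, det in [(R, 1.0), (F, -1.0)]:
    print(np.allclose(M.T @ M, np.eye(2)))    # columns orthonormal: True
    print(np.isclose(np.linalg.det(M), det))  # det +1 / -1: True

print(np.allclose(F @ F, np.eye(2)))      # reflecting twice is the identity
```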
Problem 2
Prove that, given two bases for $\mathbb{R}^n$, any element that is removed from the first basis can be replaced by one from the second basis (such that the set is still a basis).
.........
Answer. This is the Steinitz exchange lemma, which doesn't seem to be mentioned in the book, but you can find many resources on it online.
Problem 3
Suppose that U is a subspace of V. Is it true that every basis for V contains a basis for U? Is it true
that every basis for U is contained in a basis for V?
.........
Answer. No; it is easy to find a counterexample. For instance, if $U = \mathrm{span}\begin{pmatrix}1\\1\end{pmatrix} \subset \mathbb{R}^2$, then the standard basis of $\mathbb{R}^2$ contains no basis for $U$. Yes; this also follows from the Steinitz exchange lemma.
Problem 4
In order to find the correlation coefficient of variables, we needed to subtract the mean of each vector from each element in the vector (in order to get a vector of deviations). For a vector $\vec{x} \in \mathbb{R}^n$, let $\bar{x} = \frac{x_1 + \dots + x_n}{n}$. Find a matrix $A$, such that for all $\vec{x} \in \mathbb{R}^n$,
$$A\vec{x} = \begin{pmatrix} x_1 - \bar{x}\\ x_2 - \bar{x}\\ \vdots\\ x_n - \bar{x} \end{pmatrix}.$$
.........
Answer. This is called a centering matrix, in case you'd like to Google it: $A = I_n - \frac{1}{n}J_n$, where $J_n$ is the $n \times n$ all-ones matrix. It is indeed the matrix of the projection onto the hyperplane $x_1 + \dots + x_n = 0$.
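A short numpy sketch of the centering matrix (the example vector is arbitrary):

```python
import numpy as np

n = 5
A = np.eye(n) - np.ones((n, n)) / n   # centering matrix I - J/n

x = np.array([1.0, 2.0, 4.0, 8.0, 10.0])
print(A @ x)                             # [-4. -3. -1.  3.  5.], i.e. x - mean(x)
print(np.allclose(A @ x, x - x.mean()))  # True

# A is an orthogonal projection: symmetric and idempotent, and its image
# lies in the hyperplane x1 + ... + xn = 0.
print(np.allclose(A, A.T), np.allclose(A @ A, A))
print(np.isclose((A @ x).sum(), 0))
```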