Math 215 HW #11 Solutions
Answer: First,
$$\|\vec x\|^2 = \vec x^H\vec x = \begin{bmatrix} 2+4i & -4i \end{bmatrix}\begin{bmatrix} 2-4i \\ 4i \end{bmatrix} = (4+16)+16 = 36,$$
so $\|\vec x\| = 6$. Likewise,
$$\|\vec y\|^2 = \vec y^H\vec y = \begin{bmatrix} 2-4i & -4i \end{bmatrix}\begin{bmatrix} 2+4i \\ 4i \end{bmatrix} = (4+16)+16 = 36,$$
so $\|\vec y\| = 6$. Finally,
$$\langle\vec x,\vec y\rangle = \vec x^H\vec y = \begin{bmatrix} 2+4i & -4i \end{bmatrix}\begin{bmatrix} 2+4i \\ 4i \end{bmatrix} = (2+4i)^2 - (4i)^2 = (4-16+16i)+16 = 4+16i.$$
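As a quick numerical sanity check of these values (a NumPy sketch, using the vectors read off above):

```python
import numpy as np

# The two vectors from the problem: x = (2 - 4i, 4i), y = (2 + 4i, 4i)
x = np.array([2 - 4j, 4j])
y = np.array([2 + 4j, 4j])

# np.vdot conjugates its first argument, so vdot(x, x) computes x^H x
norm_x = np.sqrt(np.vdot(x, x).real)   # 6.0
norm_y = np.sqrt(np.vdot(y, y).real)   # 6.0
inner = np.vdot(x, y)                  # (4+16j)

print(norm_x, norm_y, inner)
```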
2. Problem 5.5.16. Write one significant fact about the eigenvalues of each of the following:
If $A$ is orthogonal and $A\vec x = \lambda\vec x$, then
$$\langle A\vec x, A\vec x\rangle = (A\vec x)^T A\vec x = \vec x^T A^T A\vec x = \vec x^T\vec x = \langle\vec x,\vec x\rangle = \|\vec x\|^2.$$
Therefore,
$$\lambda^2\|\vec x\|^2 = \|\vec x\|^2,$$
meaning that $\lambda^2 = 1$, so $|\lambda| = 1$.
(d) A Markov matrix.
Answer: We saw in class that λ1 = 1 is an eigenvalue of every Markov matrix, and that
all eigenvalues λi of a Markov matrix satisfy |λi | ≤ 1.
(e) A defective matrix (nondiagonalizable).
Answer: If $A$ is $n \times n$ and is not diagonalizable, then $A$ must have fewer than $n$ distinct eigenvalues; that is, some eigenvalue is repeated (if $A$ had $n$ distinct eigenvalues, then, since eigenvectors corresponding to different eigenvalues are linearly independent, $A$ would have $n$ linearly independent eigenvectors, which would imply that $A$ is diagonalizable).
(f) A singular matrix.
Answer: If A is singular, then A has a non-trivial nullspace, which means that 0 must
be an eigenvalue of A.
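These facts are easy to confirm on small examples; the matrices below are illustrative choices (not from the text):

```python
import numpy as np

# An orthogonal matrix (a rotation): every eigenvalue satisfies |lambda| = 1
Q = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.abs(np.linalg.eigvals(Q)))   # moduli are all 1

# (d) A Markov matrix (nonnegative entries, columns summing to 1):
# lambda = 1 is an eigenvalue and all |lambda_i| <= 1
M = np.array([[0.8, 0.3],
              [0.2, 0.7]])
lam = np.linalg.eigvals(M)
print(lam)                            # eigenvalues 1 and 0.5

# (f) A singular matrix: 0 is an eigenvalue
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.eigvals(S))           # one eigenvalue is 0
```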
3. Problem 5.5.22. Every matrix $Z$ can be split into a Hermitian and a skew-Hermitian part, $Z = A + K$, just as a complex number $z$ is split into $a + ib$. The real part of $z$ is half of $z + \bar z$, and the "real part" (i.e. Hermitian part) of $Z$ is half of $Z + Z^H$. Find a similar formula for the "imaginary part" (i.e. skew-Hermitian part) $K$, and split these matrices into $A + K$:
$$Z = \begin{bmatrix} 3+4i & 4+2i \\ 0 & 5 \end{bmatrix} \quad\text{and}\quad Z = \begin{bmatrix} i & i \\ -i & i \end{bmatrix}.$$
Answer: The skew-Hermitian part is $K = \frac{1}{2}(Z - Z^H)$, which is indeed skew-Hermitian since
$$(Z - Z^H)^H = Z^H - (Z^H)^H = Z^H - Z = -(Z - Z^H).$$
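The splitting is easy to verify numerically for the first matrix above (a NumPy sketch):

```python
import numpy as np

# First matrix from the problem
Z = np.array([[3 + 4j, 4 + 2j],
              [0 + 0j, 5 + 0j]])

A = (Z + Z.conj().T) / 2   # Hermitian part, A^H = A
K = (Z - Z.conj().T) / 2   # skew-Hermitian part, K^H = -K

print(A)
print(K)
print(np.allclose(A, A.conj().T), np.allclose(K, -K.conj().T), np.allclose(A + K, Z))
```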
4. Problem 5.5.28. If $A\vec z = \vec 0$, then $A^H A\vec z = \vec 0$. If $A^H A\vec z = \vec 0$, multiply by $\vec z^H$ to prove that $A\vec z = \vec 0$. The nullspaces of $A$ and $A^H A$ are ______. $A^H A$ is an invertible Hermitian matrix when the nullspace of $A$ contains only $\vec z =$ ______.
Answer: Suppose $A^H A\vec z = \vec 0$. Then, multiplying both sides by $\vec z^H$ yields
$$\vec z^H A^H A\vec z = 0, \quad\text{i.e.}\quad \|A\vec z\|^2 = 0,$$
so $A\vec z = \vec 0$. Hence the nullspaces of $A$ and $A^H A$ are the same, and $A^H A$ is an invertible Hermitian matrix when the nullspace of $A$ contains only $\vec z = \vec 0$.
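A small numerical illustration of this (the rank-one matrix $A$ below is an example chosen here, not from the text): the same vector kills both $A$ and $A^H A$, and the two matrices have equal rank.

```python
import numpy as np

# An example rank-1 matrix (real entries, so A^H = A^T)
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

# z = (2, -1) lies in the nullspace of A, and also of A^T A
z = np.array([2.0, -1.0])
print(A @ z)          # zero vector
print(A.T @ A @ z)    # zero vector as well

# N(A) = N(A^H A) is reflected in equal ranks
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T @ A))
```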
since $U^{-1} = U^H$. Note that $\Lambda^{-1}$ is just the diagonal matrix with entries $1/\lambda_i$ (where the $\lambda_i$ are the entries in $\Lambda$). Hence,
6. Problem 5.6.8. What matrix $M$ changes the basis $\vec V_1 = (1,1)$, $\vec V_2 = (1,4)$ to the basis $\vec v_1 = (2,5)$, $\vec v_2 = (1,4)$? The columns of $M$ come from expressing $\vec V_1$ and $\vec V_2$ as combinations $\sum m_{ij}\vec v_i$ of the $\vec v$'s.
Answer: Since
$$\vec V_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 5 \end{bmatrix} - \begin{bmatrix} 1 \\ 4 \end{bmatrix} = \vec v_1 - \vec v_2$$
and
$$\vec V_2 = \begin{bmatrix} 1 \\ 4 \end{bmatrix} = \vec v_2,$$
we see that
$$M = \begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix}.$$
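Since each $\vec V_j$ is expressed as $\sum_i m_{ij}\vec v_i$, the matrix $M$ satisfies $[\vec v_1\ \vec v_2]\,M = [\vec V_1\ \vec V_2]$, which gives a quick numerical check:

```python
import numpy as np

# Bases as columns: V holds V1=(1,1), V2=(1,4); v holds v1=(2,5), v2=(1,4)
V = np.array([[1.0, 1.0],
              [1.0, 4.0]])
v = np.array([[2.0, 1.0],
              [5.0, 4.0]])

# Solve v @ M = V for M
M = np.linalg.solve(v, V)
print(M)   # [[ 1.  0.]
           #  [-1.  1.]]
```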
7. Problem 5.6.12. The identity transformation takes every vector to itself: $T\vec x = \vec x$. Find the corresponding matrix, if the first basis is $\vec v_1 = (1,2)$, $\vec v_2 = (3,4)$ and the second basis is $\vec w_1 = (1,0)$, $\vec w_2 = (0,1)$. (It is not the identity matrix!)
Answer: Despite the slightly confusing way this question is worded, it is just asking for the matrix $M$ which converts the $\vec v$ basis into the $\vec w$ basis. Clearly,
$$\vec v_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} + 2\begin{bmatrix} 0 \\ 1 \end{bmatrix} = \vec w_1 + 2\vec w_2$$
and
$$\vec v_2 = \begin{bmatrix} 3 \\ 4 \end{bmatrix} = 3\begin{bmatrix} 1 \\ 0 \end{bmatrix} + 4\begin{bmatrix} 0 \\ 1 \end{bmatrix} = 3\vec w_1 + 4\vec w_2,$$
so the desired matrix is
$$M = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}.$$
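Here the check is even simpler: since the second basis is the standard basis, the columns of $M$ are just $\vec v_1$ and $\vec v_2$ themselves.

```python
import numpy as np

v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 4.0])

# Column j of M holds the w-coordinates (= standard coordinates) of v_j
M = np.column_stack([v1, v2])
print(M)   # [[1. 3.]
           #  [2. 4.]]

# Sanity check: v-coordinates (2, -1) should map to the vector 2*v1 - v2
c = np.array([2.0, -1.0])
print(M @ c, 2 * v1 - v2)   # the same vector both ways
```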
8. Problem 5.6.38. These Jordan matrices have eigenvalues 0, 0, 0, 0. They have two eigenvectors
(find them). But the block sizes don’t match and J is not similar to K.
$$J = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{bmatrix} \quad\text{and}\quad K = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}.$$
For any matrix M , compare JM and M K. If they are equal, show that M is not invertible.
Then M −1 JM = K is impossible.
Answer: First, we find the eigenvectors of $J$ and $K$. Since all eigenvalues of both are 0, we're just looking for vectors in the nullspaces of $J$ and $K$. First, for $J$, we note that $J$ is already in reduced echelon form and that $J\vec v = \vec 0$ implies that $\vec v$ is a linear combination of
$$\begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix}, \quad \begin{bmatrix} 0 \\ 0 \\ 1 \\ 0 \end{bmatrix}.$$
Likewise, $K$ is in reduced echelon form, and $K\vec v = \vec 0$ implies that $\vec v$ is a linear combination of
$$\begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix}, \quad \begin{bmatrix} 0 \\ 0 \\ 0 \\ 1 \end{bmatrix}.$$
So each of $J$ and $K$ has two independent eigenvectors.
Now suppose $JM = MK$ for some $4 \times 4$ matrix $M = (m_{ij})$. Computing both products,
$$JM = \begin{bmatrix} m_{21} & m_{22} & m_{23} & m_{24} \\ 0 & 0 & 0 & 0 \\ m_{41} & m_{42} & m_{43} & m_{44} \\ 0 & 0 & 0 & 0 \end{bmatrix} \quad\text{and}\quad MK = \begin{bmatrix} 0 & m_{11} & m_{12} & 0 \\ 0 & m_{21} & m_{22} & 0 \\ 0 & m_{31} & m_{32} & 0 \\ 0 & m_{41} & m_{42} & 0 \end{bmatrix}.$$
Equating the second and fourth rows gives $m_{21} = m_{22} = 0$ and $m_{41} = m_{42} = 0$; equating the first and third rows then gives $m_{11} = m_{22} = 0$ and $m_{31} = m_{42} = 0$. Hence the entire first column of $M$ is zero, so $M$ cannot possibly have rank 4, meaning that $M$ cannot be invertible. Therefore, $M^{-1}JM = K$ is impossible, so it cannot be the case that $J$ and $K$ are similar.
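Another way to see the conclusion computationally: if $K = M^{-1}JM$ then $K^2 = M^{-1}J^2M$, so $J^2$ and $K^2$ would have equal rank; but $J^2 = 0$ while $K^2 \neq 0$, reflecting the block sizes $2+2$ versus $3+1$. A NumPy sketch:

```python
import numpy as np

J = np.array([[0, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]])
K = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])

# Both have rank 2, hence two independent eigenvectors each (nullity 2)
print(np.linalg.matrix_rank(J), np.linalg.matrix_rank(K))   # 2 2

# But J^2 = 0 while K^2 has a nonzero entry, so J and K are not similar
print(J @ J)
print(K @ K)
```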
9. Problem 5.6.40. Which pairs are similar? Choose $a$, $b$, $c$, $d$ to prove that the other pairs aren't:
$$\begin{bmatrix} a & b \\ c & d \end{bmatrix}, \quad \begin{bmatrix} b & a \\ d & c \end{bmatrix}, \quad \begin{bmatrix} c & d \\ a & b \end{bmatrix}, \quad \begin{bmatrix} d & c \\ b & a \end{bmatrix}.$$
Answer: The second and third are similar, since
$$\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} b & a \\ d & c \end{bmatrix}\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} a & b \\ c & d \end{bmatrix} = \begin{bmatrix} c & d \\ a & b \end{bmatrix}.$$
Likewise, the first and fourth are similar, since
$$\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} d & c \\ b & a \end{bmatrix}\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} c & d \\ a & b \end{bmatrix} = \begin{bmatrix} a & b \\ c & d \end{bmatrix}.$$
To see that no other pairs are similar, choose $a = 1$, $b = c = d = 0$.
Then the matrices are, in order,
$$\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}.$$
Each of these is triangular, so its eigenvalues are its diagonal entries: clearly the first and fourth have 1 as an eigenvalue, whereas the second and third have only 0 as an eigenvalue. Since similar matrices have the same eigenvalues, we see that neither the first nor the fourth can be similar to either the second or the third.
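With concrete sample values $a = 1$, $b = 2$, $c = 3$, $d = 4$ (chosen here for illustration), both similarities can be verified directly:

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # the permutation used above; note P = P^{-1}

A1 = np.array([[a, b], [c, d]])
A2 = np.array([[b, a], [d, c]])
A3 = np.array([[c, d], [a, b]])
A4 = np.array([[d, c], [b, a]])

# Second ~ third and first ~ fourth
print(np.allclose(P @ A2 @ P, A3))   # True
print(np.allclose(P @ A4 @ P, A1))   # True
```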
10. (Bonus Problem) Problem 5.6.14. Show that every number is an eigenvalue for $Tf(x) = df/dx$, but the transformation $Tf(x) = \int_0^x f(t)\,dt$ has no eigenvalues (here $-\infty < x < \infty$).
Proof. For the first $T$, note that, if $f(x) = e^{ax}$ for any real number $a$, then
$$Tf(x) = \frac{df}{dx} = ae^{ax} = af(x).$$
Hence, any real number a is an eigenvalue of T .
Turning to the second $T$, suppose we had that $Tf(x) = af(x)$ for some number $a$ and some function $f$. Then, by the definition of $T$,
$$\int_0^x f(t)\,dt = af(x).$$
Now, use the fundamental theorem of calculus to differentiate both sides:
$$f(x) = af'(x).$$
Solving for $f$, we see that
$$\int \frac{f'(x)}{f(x)}\,dx = \int \frac{1}{a}\,dx,$$
so
$$\ln|f(x)| = \frac{x}{a} + C.$$
Therefore, exponentiating both sides,
$$|f(x)| = e^{x/a + C} = e^C e^{x/a}.$$
We can get rid of the absolute value signs by substituting $A$ for $e^C$ (allowing $A$ to possibly be negative):
$$f(x) = Ae^{x/a}.$$
Therefore, we know that
$$Tf(x) = \int_0^x f(t)\,dt = \int_0^x Ae^{t/a}\,dt = \left. aAe^{t/a}\right|_0^x = aAe^{x/a} - aA = a(Ae^{x/a} - A) = a(f(x) - A).$$
On the other hand, our initial assumption was that $Tf(x) = af(x)$, so it must be the case that
$$af(x) = a(f(x) - A) = af(x) - aA.$$
Hence, either $a = 0$ or $A = 0$. If $a = 0$, then $\int_0^x f(t)\,dt = 0$ for all $x$, and differentiating gives $f \equiv 0$; if $A = 0$, then $f(x) = Ae^{x/a} \equiv 0$. Either way $f$ is the zero function, which is not a legitimate eigenfunction, so $T$ has no eigenvalues.
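A numerical illustration of the last computation (with sample values $a = 2$, $A = 3$ chosen here): integrating $f(x) = Ae^{x/a}$ from $0$ to $x$ gives $a(f(x) - A)$, which falls short of $af(x)$ by the constant $aA$.

```python
import numpy as np

a_val, A_val = 2.0, 3.0
xs = np.linspace(0.0, 1.0, 100001)
f = A_val * np.exp(xs / a_val)

# Trapezoid-rule approximation of T f(1) = integral of f from 0 to 1
Tf = np.sum((f[1:] + f[:-1]) / 2 * np.diff(xs))

print(Tf, a_val * (f[-1] - A_val))   # these agree: T f = a (f - A)
print(Tf - a_val * f[-1])            # off from a*f by -a*A = -6, not 0
```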