Differential Geometry
Homework

Homework set 4 - Solutions

Math 407 Renato Feres

1. (Exercise 4.1, page 49 of notes.) Let $W := T_0^m(V)$ and denote by $GL(W)$ the general linear group of $W$, defined as the group of all linear isomorphisms of $W$ onto itself. For each $\sigma$ in the symmetric group $S_m$ (consisting of all the permutations of $\{1, 2, \dots, m\}$) we have defined the multilinear map $\rho(\sigma)$ on $V$ given by

$$\rho(\sigma)(v_1, \dots, v_m) = v_{\sigma^{-1}(1)} \otimes \cdots \otimes v_{\sigma^{-1}(m)}$$

for $\sigma \in S_m$. In this problem we regard $\rho(\sigma)$ as a linear map on $W$. So we write

$$\rho(\sigma)(v_1 \otimes \cdots \otimes v_m) = v_{\sigma^{-1}(1)} \otimes \cdots \otimes v_{\sigma^{-1}(m)}.$$

Show that $\rho(\sigma) \in GL(W)$ for each $\sigma \in S_m$ and that

$$\rho(\sigma \sigma') = \rho(\sigma)\rho(\sigma') \quad \text{and} \quad \rho(\sigma^{-1}) = \rho(\sigma)^{-1}.$$

This means that $\rho : S_m \to GL(W)$ is a group homomorphism, and that $\rho$ defines a linear representation of $S_m$ with representation space $W$.

Solution. For $\sigma, \sigma' \in S_m$,

$$\rho(\sigma \sigma')(v_1 \otimes \cdots \otimes v_m) = v_{(\sigma \sigma')^{-1}(1)} \otimes \cdots \otimes v_{(\sigma \sigma')^{-1}(m)} = v_{\sigma'^{-1} \sigma^{-1}(1)} \otimes \cdots \otimes v_{\sigma'^{-1} \sigma^{-1}(m)}$$
$$= \rho(\sigma)\left( v_{\sigma'^{-1}(1)} \otimes \cdots \otimes v_{\sigma'^{-1}(m)} \right) = \rho(\sigma)\rho(\sigma')(v_1 \otimes \cdots \otimes v_m).$$

It is immediate from the definition that if $e$ is the identity element in the group, then

$$\rho(e)(v_1 \otimes \cdots \otimes v_m) = v_1 \otimes \cdots \otimes v_m.$$

Therefore, $\rho(\sigma)\rho(\sigma^{-1}) = \rho(\sigma \sigma^{-1}) = \rho(e) = I$, from which we obtain that $\rho(\sigma^{-1}) = \rho(\sigma)^{-1}$. In particular each $\rho(\sigma)$ has a linear inverse, so $\rho(\sigma) \in GL(W)$.
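As a sanity check (not part of the original solution), the homomorphism and invertibility properties can be verified exhaustively for small $m$ by letting $\rho(\sigma)$ act on basis tensors of $V^{\otimes m}$, encoded as index tuples. The sketch below assumes plain Python with the standard library; all names are illustrative.

```python
# Model: a basis tensor e_{i_1} (x) ... (x) e_{i_m} is the index tuple
# (i_1, ..., i_m); rho(sigma) puts into slot j the index from slot
# sigma^{-1}(j), matching rho(sigma)(v_1 (x) ... ) = v_{sigma^{-1}(1)} (x) ...
from itertools import permutations, product

m = 3  # small enough to check all of S_3 exhaustively

def compose(s, t):
    # (s t)(j) = s(t(j)); a permutation s is a tuple with s(j) = s[j]
    return tuple(s[t[j]] for j in range(m))

def inverse(s):
    inv = [0] * m
    for j, sj in enumerate(s):
        inv[sj] = j
    return tuple(inv)

def rho(s):
    s_inv = inverse(s)
    return lambda idx: tuple(idx[s_inv[j]] for j in range(m))

perms = list(permutations(range(m)))
basis = list(product(range(2), repeat=m))  # index tuples over a 2-dim V

# rho(s s') = rho(s) rho(s') on every basis tensor
assert all(rho(compose(s, t))(b) == rho(s)(rho(t)(b))
           for s in perms for t in perms for b in basis)
# rho(s^{-1}) inverts rho(s), so rho(s) lies in GL(W)
assert all(rho(inverse(s))(rho(s)(b)) == b for s in perms for b in basis)
print("homomorphism and invertibility verified for S_3")
```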

2. (Exercise 4.2, page 50 of notes.) Show that the antisymmetrizing operator $A$, defined by

$$A = \frac{1}{m!} \sum_{\sigma \in S_m} \mathrm{sign}(\sigma)\,\rho(\sigma),$$

satisfies $A^2 = A$.
Solution. Keep in mind the following observation: $A = \frac{1}{m!} \sum_{\sigma \in S_m} \mathrm{sign}(\sigma' \sigma)\,\rho(\sigma' \sigma)$ for any $\sigma' \in S_m$. Then

$$A^2 = \left( \frac{1}{m!} \sum_{\sigma' \in S_m} \mathrm{sign}(\sigma')\rho(\sigma') \right) \left( \frac{1}{m!} \sum_{\sigma \in S_m} \mathrm{sign}(\sigma)\rho(\sigma) \right) = \frac{1}{m!} \sum_{\sigma' \in S_m} \frac{1}{m!} \sum_{\sigma \in S_m} \mathrm{sign}(\sigma')\,\mathrm{sign}(\sigma)\,\rho(\sigma')\rho(\sigma)$$
$$= \frac{1}{m!} \sum_{\sigma' \in S_m} \left( \frac{1}{m!} \sum_{\sigma \in S_m} \mathrm{sign}(\sigma' \sigma)\,\rho(\sigma' \sigma) \right) = \frac{1}{m!} \sum_{\sigma' \in S_m} A = A.$$
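A quick sanity check (not from the notes): realize $A$ as an exact rational matrix on $V^{\otimes m}$ for $\dim V = 2$, $m = 3$, and confirm $A^2 = A$. The sketch assumes plain Python; names are illustrative.

```python
# Build the matrix of A = (1/m!) sum_sigma sign(sigma) rho(sigma) on the
# basis of V^{(x) m} (index tuples), using Fractions for exact arithmetic.
from fractions import Fraction
from itertools import permutations, product
from math import factorial

m, dim = 3, 2
basis = list(product(range(dim), repeat=m))
col = {b: j for j, b in enumerate(basis)}
n = len(basis)

def sign(s):
    return (-1) ** sum(s[i] > s[j] for i in range(m) for j in range(i + 1, m))

def inverse(s):
    inv = [0] * m
    for j, sj in enumerate(s):
        inv[sj] = j
    return tuple(inv)

A = [[Fraction(0)] * n for _ in range(n)]
for s in permutations(range(m)):
    s_inv = inverse(s)
    for b in basis:
        image = tuple(b[s_inv[j]] for j in range(m))   # rho(s) on basis tensor b
        A[col[image]][col[b]] += Fraction(sign(s), factorial(m))

A2 = [[sum(A[i][k] * A[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
assert A2 == A
print("A^2 == A verified exactly")
```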

3. (Exercise 4.9, page 55 of notes.) Show that if $\omega$ is an alternating $n$-form on the $n$-dimensional real vector space $V$ and $T : V \to V$ is any linear map, then

$$\omega(T v_1, \dots, T v_n) = \det(T)\,\omega(v_1, \dots, v_n),$$

for arbitrary vectors $v_1, \dots, v_n \in V$.


Solution. We may suppose that $\omega \neq 0$ since the claim is trivial otherwise. Thinking of $\omega$ as a non-zero linear map on $V \otimes \cdots \otimes V$ ($n$ factors) makes it clear that the set of tuples $(v_1, \dots, v_n)$ for which $\omega(v_1, \dots, v_n) \neq 0$ is dense. This is because the complement of the kernel of a nonzero linear map is a dense set. So, by continuity, it is sufficient to prove the identity under the assumption $\omega(v_1, \dots, v_n) \neq 0$. In particular, $\beta = \{v_1, \dots, v_n\}$ must be a basis of $V$. Let $\{v^1, \dots, v^n\}$ be the dual basis. Recall that $v^i(T v_j)$ is the $(i, j)$-entry of the matrix $[T]$ representing $T$ in the basis $\beta$. So the isomorphism between $\mathbb{R}^n$ and $V$ obtained from the basis $\beta$ sends the column vector $([T]_{1j}, \dots, [T]_{nj})^t$ to $T v_j$. This makes it clear that the function

$$F([T]) := \frac{\omega(T v_1, \dots, T v_n)}{\omega(v_1, \dots, v_n)}$$

is multilinear when regarded as a function of the columns of $[T]$. It is also clear that this function is antisymmetric and that $F(I) = 1$, where $I$ is the identity matrix. But the determinant is uniquely characterized by these properties. Since the determinant of the matrix of a linear transformation is the same number regardless of the choice of basis, we must have

$$\det(T) = \frac{\omega(T v_1, \dots, T v_n)}{\omega(v_1, \dots, v_n)}.$$

Consequently, the claimed identity holds.
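A quick check (not from the notes): take $\omega = \det$, the model alternating $n$-form on $\mathbb{R}^3$ via the Leibniz sum, and confirm the identity on integer examples. The matrices and vectors below are arbitrary illustrative choices.

```python
# omega(c_1, ..., c_n) = det of the matrix whose j-th column is c_j.
from itertools import permutations

n = 3

def sign(s):
    return (-1) ** sum(s[i] > s[j] for i in range(n) for j in range(i + 1, n))

def omega(*cols):
    total = 0
    for s in permutations(range(n)):
        p = sign(s)
        for j in range(n):
            p *= cols[j][s[j]]   # row s(j) of column j
        total += p
    return total

def matvec(T, v):
    return [sum(T[i][k] * v[k] for k in range(n)) for i in range(n)]

T = [[2, 1, 0], [0, 1, 3], [1, 0, 1]]
vs = [[1, 2, 0], [0, 1, 1], [3, 0, 2]]
det_T = omega(*([T[i][j] for i in range(n)] for j in range(n)))  # columns of T

assert omega(*(matvec(T, v) for v in vs)) == det_T * omega(*vs)
print("omega(Tv_1, ..., Tv_n) = det(T) omega(v_1, ..., v_n) verified")
```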

4. (Exercise 4.10, page 56 of notes.) If $\omega$ is a $k$-form and $\eta$ is an $l$-form, show that

$$(\omega \wedge \eta)(v_1, \dots, v_{k+l}) = \frac{1}{k!\,l!} \sum_{\sigma \in S_{k+l}} \mathrm{sign}(\sigma)\,\omega(v_{\sigma(1)}, \dots, v_{\sigma(k)})\,\eta(v_{\sigma(k+1)}, \dots, v_{\sigma(k+l)}).$$

Solution. The wedge product of forms is defined in the notes by

$$\omega \wedge \eta = \binom{k+l}{k} (\omega \otimes \eta) \circ A.$$

The definition of $A$ yields

$$(\omega \wedge \eta)(v_1, \dots, v_{k+l}) = \binom{k+l}{k} \frac{1}{(k+l)!} \sum_{\sigma \in S_{k+l}} \mathrm{sign}(\sigma)\,(\omega \otimes \eta)(v_{\sigma^{-1}(1)}, \dots, v_{\sigma^{-1}(k+l)})$$
$$= \frac{1}{k!\,l!} \sum_{\sigma \in S_{k+l}} \mathrm{sign}(\sigma)\,\omega(v_{\sigma(1)}, \dots, v_{\sigma(k)})\,\eta(v_{\sigma(k+1)}, \dots, v_{\sigma(k+l)}).$$

Note that $\binom{k+l}{k} \frac{1}{(k+l)!} = \frac{1}{k!\,l!}$, that $\mathrm{sign}(\sigma^{-1}) = \mathrm{sign}(\sigma)$, and that summing over all $\sigma^{-1}$ gives the same result as summing over all $\sigma$. Thus the stated identity holds.
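An illustrative check (not from the notes): implement the right-hand side of the identity for forms given as plain functions, then evaluate it with $k = 1$, $l = 2$ on $\mathbb{R}^3$, where $e^1 \wedge (e^2 \wedge e^3)$ should agree with the determinant on the standard basis. All names below are illustrative.

```python
# wedge(omega, k, eta, l) evaluates the 1/(k! l!) sum formula exactly.
from fractions import Fraction
from itertools import permutations
from math import factorial

def sign(s):
    n = len(s)
    return (-1) ** sum(s[i] > s[j] for i in range(n) for j in range(i + 1, n))

def wedge(omega, k, eta, l):
    def form(*v):
        total = sum(
            sign(s)
            * omega(*(v[s[i]] for i in range(k)))
            * eta(*(v[s[i]] for i in range(k, k + l)))
            for s in permutations(range(k + l)))
        return Fraction(total, factorial(k) * factorial(l))
    return form

omega = lambda u: u[0]                        # the 1-form e^1
eta = lambda u, w: u[1]*w[2] - u[2]*w[1]      # the 2-form e^2 ^ e^3

three_form = wedge(omega, 1, eta, 2)
e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)
assert three_form(e1, e2, e3) == 1     # e^1 ^ e^2 ^ e^3 = det on the basis
assert three_form(e2, e1, e3) == -1    # swapping two arguments flips the sign
print("wedge formula check passed")
```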

5. (Exercise 5.10, page 65 of notes.) If $D$ is the determinant function, show that it is everywhere differentiable and its directional derivative at $A$ in direction $W$ is

$$dD_A W = D(A)\,\mathrm{tr}(W A^{-1})$$

for all $A \in GL(n, \mathbb{R})$ and all $W \in M(n, \mathbb{R})$.

Solution. It is clear that $D(A)$ is differentiable to every order at all points since it is a polynomial function of the entries of $A$. Note that $D(A + tW) = D(I + tW A^{-1})D(A)$. So

$$dD_A W = \left. \frac{d}{dt} \right|_{t=0} D(A + tW) = D(A) \left. \frac{d}{dt} \right|_{t=0} D(I + tW A^{-1}).$$

Thus it suffices to prove the identity for $A = I$. That is, it suffices to show that

$$\left. \frac{d}{dt} \right|_{t=0} D(I + tW) = \mathrm{tr}(W).$$

Expressing the determinant explicitly as a function of the columns of the matrices, we have

$$\left. \frac{d}{dt} \right|_{t=0} D(I + tW) = \left. \frac{d}{dt} \right|_{t=0} D(e_1 + t w_1, \dots, e_n + t w_n) = \sum_{i=1}^n D(e_1, \dots, e_{i-1}, w_i, e_{i+1}, \dots, e_n) = \mathrm{tr}(W).$$

Thus the stated identity holds.
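A numerical sketch (not from the notes): for $2 \times 2$ matrices, $D(A + tW)$ is a quadratic in $t$, so the symmetric difference quotient $\bigl(D(A+tW) - D(A-tW)\bigr)/(2t)$ recovers the $t$-derivative at $0$ exactly, with no limit needed. The example matrices are arbitrary illustrative choices.

```python
from fractions import Fraction as F

def det2(M):
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def addmul(A, t, W):   # A + t W
    return [[A[i][j] + t*W[i][j] for j in range(2)] for i in range(2)]

def inv2(M):
    d = det2(M)
    return [[ M[1][1]/d, -M[0][1]/d],
            [-M[1][0]/d,  M[0][0]/d]]

A = [[F(2), F(1)], [F(1), F(1)]]   # invertible, det(A) = 1
W = [[F(0), F(3)], [F(5), F(7)]]

t = F(1, 7)   # any nonzero t works: odd difference kills the t^2 term
lhs = (det2(addmul(A, t, W)) - det2(addmul(A, -t, W))) / (2*t)

Ainv = inv2(A)
WAinv = [[sum(W[i][k]*Ainv[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
rhs = det2(A) * (WAinv[0][0] + WAinv[1][1])   # D(A) tr(W A^{-1})
assert lhs == rhs == 6
print("dD_A W = D(A) tr(W A^-1) verified:", lhs)
```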

6. Let $M_{\mathrm{symm}}(n, \mathbb{R}) \subset M(n, \mathbb{R})$ be the space of symmetric matrices. Define $F : M(n, \mathbb{R}) \to M_{\mathrm{symm}}(n, \mathbb{R})$ by $F(A) = A^t A$.

(a) Show that for all $A, X \in M(n, \mathbb{R})$,

$$dF_A X = A^t X + X^t A.$$

(b) Show that if $A$ is invertible, then $dF_A : M(n, \mathbb{R}) \to M_{\mathrm{symm}}(n, \mathbb{R})$ is surjective.

Solution. (a) We have

$$dF_A X = \left. \frac{d}{dt} \right|_{t=0} F(A + tX) = \left. \frac{d}{dt} \right|_{t=0} (A + tX)^t (A + tX) = \left. \frac{d}{dt} \right|_{t=0} \left( A^t A + t(A^t X + X^t A) + t^2 X^t X \right) = A^t X + X^t A.$$

(b) Let $A$ be invertible. We need to show that for an arbitrary symmetric matrix $S$, there exists $X$ such that $A^t X + X^t A = S$. But this holds for $X = \frac{1}{2}(A^t)^{-1} S$ since, using $S^t = S$,

$$A^t X + X^t A = \frac{1}{2} A^t (A^t)^{-1} S + \frac{1}{2} S A^{-1} A = S.$$
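An illustrative check (not from the notes) of both parts for $2 \times 2$ matrices: part (a) via an exact symmetric difference quotient (since $F(A+tX)$ is quadratic in $t$), and part (b) by plugging the explicit solution $X = \frac{1}{2}(A^t)^{-1}S$ into $A^t X + X^t A$. All matrices and helper names are illustrative.

```python
from fractions import Fraction as Fr

n = 2
def mul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]
def tp(P):   # transpose
    return [[P[j][i] for j in range(n)] for i in range(n)]
def add(P, Q):
    return [[P[i][j] + Q[i][j] for j in range(n)] for i in range(n)]
def scale(c, P):
    return [[c * P[i][j] for j in range(n)] for i in range(n)]
def inv2(P):
    d = P[0][0]*P[1][1] - P[0][1]*P[1][0]
    return [[P[1][1]/d, -P[0][1]/d], [-P[1][0]/d, P[0][0]/d]]

F = lambda M: mul(tp(M), M)                      # F(A) = A^t A

A = [[Fr(2), Fr(1)], [Fr(0), Fr(1)]]             # invertible, det = 2
X = [[Fr(1), Fr(4)], [Fr(2), Fr(3)]]

# (a) exact difference quotient: quadratic-in-t terms cancel
t = Fr(1, 5)
dF = scale(1 / (2*t), add(F(add(A, scale(t, X))),
                          scale(-1, F(add(A, scale(-t, X))))))
assert dF == add(mul(tp(A), X), mul(tp(X), A))

# (b) X = (1/2)(A^t)^{-1} S hits any symmetric S
S = [[Fr(4), Fr(2)], [Fr(2), Fr(6)]]
Xs = scale(Fr(1, 2), mul(inv2(tp(A)), S))
assert add(mul(tp(A), Xs), mul(tp(Xs), A)) == S
print("checks for (a) and (b) passed")
```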

7. (Exercise 5.13, page 67 of notes.) Spectral theorem for symmetric matrices. Let $f : U \to \mathbb{R}$ be a differentiable function defined on an open subset of $\mathbb{R}^n$ equipped with the standard inner product. The sphere of radius 1 centered at the origin is denoted by $S^{n-1} = \{x \in \mathbb{R}^n : \|x\| = 1\}$.

(a) Show that $\mathrm{grad}_x f$ is orthogonal to the kernel of $df_x$.

(b) Show that $\|\mathrm{grad}_x f\|$ is the maximum rate of change of $f$ along any direction:

$$\max_{\|v\|=1} df_x v = \|\mathrm{grad}_x f\|,$$

and that the maximum is achieved when $v = \mathrm{grad}_x f / \|\mathrm{grad}_x f\|$.

(c) Let $A \in M(n, \mathbb{R})$ be a symmetric matrix and define $f : \mathbb{R}^n \to \mathbb{R}$ by

$$f(x) = \frac{1}{2} \langle Ax, x \rangle.$$

Show that if $x \in S^{n-1}$ is a point where $f$ achieves its maximum or its minimum value, then $x$ is an eigenvector of $A$.

(d) Show that there exists an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$. For this, use a finite induction starting from the existence of one eigenvector, then restricting the function to the intersection of the sphere with the subspace orthogonal to the previously obtained eigenvectors.
Solution. (a) Let $u$ be any vector in the kernel of $df_x$, so that $df_x u = 0$. By definition,

$$\langle \mathrm{grad}_x f, u \rangle = df_x u = 0,$$

proving the claim.

(b) By the Schwarz inequality,

$$df_x v = \langle \mathrm{grad}_x f, v \rangle \leq \|\mathrm{grad}_x f\| \, \|v\|.$$

On the other hand,

$$df_x \left( \frac{\mathrm{grad}_x f}{\|\mathrm{grad}_x f\|} \right) = \left\langle \mathrm{grad}_x f, \frac{\mathrm{grad}_x f}{\|\mathrm{grad}_x f\|} \right\rangle = \|\mathrm{grad}_x f\|.$$

This shows that the maximum of $df_x v$ over vectors $v$ of length 1 is the norm of the gradient, and the maximum is attained when $v$ is the direction of the gradient.
(c) A point $x$ of maximum or minimum is a critical point for $f$ restricted to the sphere, meaning that $df_x v = 0$ for all $v$ tangent to $S^{n-1}$ at $x$. Note that $x$ is perpendicular to all these tangent vectors. This means that the gradient vector of $f$ at $x$ is parallel to $x$: $\mathrm{grad}_x f = \lambda x$ for some $\lambda \in \mathbb{R}$. On the other hand, the gradient of $f$ satisfies

$$\langle \mathrm{grad}_x f, w \rangle = df_x w = \left. \frac{d}{dt} \right|_{t=0} \frac{1}{2} \langle A(x + tw), x + tw \rangle = \frac{1}{2} \left( \langle Ax, w \rangle + \langle Aw, x \rangle \right) = \langle Ax, w \rangle,$$

where the last step uses the symmetry of $A$. Therefore, the gradient of $f$ at (any point) $x$ is

$$\mathrm{grad}_x f = Ax.$$

Thus if $x$ is a critical point we have $Ax = \lambda x$ for some $\lambda$. But this means that $x$ is an eigenvector of $A$.

(d) Let $V$ be the $(n-1)$-dimensional subspace of $\mathbb{R}^n$ perpendicular to the eigenvector $x$ obtained in (c). For $v \in V$, since $A$ is symmetric,

$$0 = \lambda \langle x, v \rangle = \langle Ax, v \rangle = \langle x, Av \rangle.$$

This means that $Av$ is also perpendicular to $x$. So $V$ is invariant under $A$. We can now repeat the argument of (c) for the sphere in $V$. A simple finite induction now gives an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.
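A numerical sketch (not from the notes) of part (c): for the symmetric matrix $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, whose eigenpairs are $(3, (1,1)/\sqrt{2})$ and $(1, (1,-1)/\sqrt{2})$, maximize $f(x) = \frac{1}{2}\langle Ax, x\rangle$ over a fine sample of the unit circle and check that the maximizer is (approximately) an eigenvector, i.e. $Ax - \langle Ax, x\rangle x$ is tiny there. The sampling scheme and names are illustrative.

```python
import math

A = [[2.0, 1.0], [1.0, 2.0]]

def f_and_Ax(x):
    ax = (A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1])
    return 0.5 * (ax[0]*x[0] + ax[1]*x[1]), ax

best_val, best_x, best_ax = -math.inf, None, None
N = 200000
for k in range(N):
    th = 2.0 * math.pi * k / N
    x = (math.cos(th), math.sin(th))
    val, ax = f_and_Ax(x)
    if val > best_val:
        best_val, best_x, best_ax = val, x, ax

lam = best_ax[0]*best_x[0] + best_ax[1]*best_x[1]    # <Ax, x> = 2 f(x)
resid = math.hypot(best_ax[0] - lam*best_x[0],
                   best_ax[1] - lam*best_x[1])       # || Ax - lam x ||
assert abs(lam - 3.0) < 1e-3 and resid < 1e-3
print("maximizer is an eigenvector with eigenvalue ~", round(lam, 6))
```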
