Differential Geometry
1. (Exercise 4.1, page 49 of notes.) Let $W := T^0_m(V)$ and denote by $GL(W)$ the general linear group of $W$, defined as the group of all linear isomorphisms of $W$ onto itself. For each $\sigma$ in the symmetric group $S_m$ (consisting of all the permutations of $\{1, 2, \ldots, m\}$) we have defined the multilinear map on $V^m$, and hence the induced linear map $\pi(\sigma)$ of $W$, given by permuting the factors of a decomposable tensor $v_1 \otimes \cdots \otimes v_m$ according to $\sigma$. Show that
\[ \pi(\sigma\sigma') = \pi(\sigma)\pi(\sigma') \qquad\text{and}\qquad \pi(\sigma^{-1}) = \pi(\sigma)^{-1}. \]
This means that $\pi : S_m \to GL(W)$ is a group homomorphism, and that $\pi$ defines a linear representation of $S_m$ with representation space $W$.
Solution. For $\sigma, \sigma' \in S_m$, applying $\pi(\sigma')$ and then $\pi(\sigma)$ to a decomposable tensor permutes its factors first according to $\sigma'$ and then according to $\sigma$, which is the same as permuting them according to $\sigma\sigma'$; since decomposable tensors span $W$, this gives $\pi(\sigma)\pi(\sigma') = \pi(\sigma\sigma')$ on all of $W$. It is immediate from the definition that if $e$ is the identity element in the group, then
\[ \pi(e)(v_1 \otimes \cdots \otimes v_m) = v_1 \otimes \cdots \otimes v_m, \]
i.e. $\pi(e) = \mathrm{id}_W$. Hence $\pi(\sigma)\pi(\sigma^{-1}) = \pi(\sigma\sigma^{-1}) = \pi(e) = \mathrm{id}_W$ and likewise $\pi(\sigma^{-1})\pi(\sigma) = \mathrm{id}_W$, so each $\pi(\sigma)$ lies in $GL(W)$ and $\pi(\sigma^{-1}) = \pi(\sigma)^{-1}$.
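As a quick illustration (not part of the original solution), take $m = 2$, so that $S_2 = \{e, \tau\}$ with $\tau$ the transposition. Then $\pi(\tau)(v_1 \otimes v_2) = v_2 \otimes v_1$, and
\[ \pi(\tau)\pi(\tau)(v_1 \otimes v_2) = \pi(\tau)(v_2 \otimes v_1) = v_1 \otimes v_2 = \pi(e)(v_1 \otimes v_2), \]
so $\pi(\tau)^2 = \pi(\tau^2) = \pi(e) = \mathrm{id}_W$ and $\pi(\tau)^{-1} = \pi(\tau) = \pi(\tau^{-1})$, as the exercise asserts.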
2. (Exercise 4.2, page 50 of notes.) Show that the antisymmetrizing operator $A$, defined by
\[ A = \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma)\,\pi(\sigma), \]
satisfies $A^2 = A$.
Solution. Keep in mind the following observation: $A = \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma'\sigma)\,\pi(\sigma'\sigma)$ for any $\sigma' \in S_m$, since $\sigma \mapsto \sigma'\sigma$ is a bijection of $S_m$ onto itself. Then
\begin{align*}
A^2 &= \left( \frac{1}{m!} \sum_{\sigma' \in S_m} \operatorname{sign}(\sigma')\,\pi(\sigma') \right) \left( \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma)\,\pi(\sigma) \right)
     = \frac{1}{m!} \sum_{\sigma' \in S_m} \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma')\operatorname{sign}(\sigma)\,\pi(\sigma')\pi(\sigma) \\
    &= \frac{1}{m!} \sum_{\sigma' \in S_m} \left( \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma'\sigma)\,\pi(\sigma'\sigma) \right) \\
    &= \frac{1}{m!} \sum_{\sigma' \in S_m} A = A.
\end{align*}
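For concreteness (this small case is not in the original solution), take $m = 2$, so that $A = \frac{1}{2}\left(\pi(e) - \pi(\tau)\right)$ with $\tau$ the transposition. Using $\pi(\tau)^2 = \pi(e)$ from Exercise 1,
\[ A^2 = \frac{1}{4}\left( \pi(e)^2 - \pi(e)\pi(\tau) - \pi(\tau)\pi(e) + \pi(\tau)^2 \right) = \frac{1}{4}\left( 2\pi(e) - 2\pi(\tau) \right) = A. \]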
3. (Exercise 4.9, page 55 of notes.) Show that if $\omega$ is an alternating $n$-form on the $n$-dimensional real vector space $V$ and $T : V \to V$ is any linear map, then
\[ \omega(Tv_1, \ldots, Tv_n) = \det(T)\,\omega(v_1, \ldots, v_n). \]
Solution. If $\omega = 0$ the identity is trivial, so assume $\omega \neq 0$ and fix a basis $v_1, \ldots, v_n$ of $V$ with $\omega(v_1, \ldots, v_n) \neq 0$. Since $Tv_j$ depends linearly on the $j$-th column of the matrix $[T]$ of $T$ in this basis, the function
\[ F([T]) := \frac{\omega(Tv_1, \ldots, Tv_n)}{\omega(v_1, \ldots, v_n)} \]
is multilinear when regarded as a function of the columns of $[T]$. It is also clear that this is antisymmetric and that $F(I) = 1$, where $I$ is the identity matrix. But the determinant is uniquely characterized by these properties. Since the determinant of the matrix of a linear transformation is the same number regardless of the choice of basis, we must have
\[ \det(T) = \frac{\omega(Tv_1, \ldots, Tv_n)}{\omega(v_1, \ldots, v_n)}. \]
This proves the identity when $v_1, \ldots, v_n$ is a basis; if the vectors are linearly dependent, both sides vanish because $\omega$ is alternating. Consequently, the claimed identity holds.
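As a small worked example (not part of the original solution), let $n = 2$ and let $T$ have matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ in the basis $e_1, e_2$, so $Te_1 = ae_1 + ce_2$ and $Te_2 = be_1 + de_2$. Expanding by multilinearity and using that $\omega$ is alternating,
\[ \omega(Te_1, Te_2) = ad\,\omega(e_1, e_2) + bc\,\omega(e_2, e_1) = (ad - bc)\,\omega(e_1, e_2) = \det(T)\,\omega(e_1, e_2). \]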
4. Show that if $\alpha$ is an alternating $k$-form and $\beta$ is an alternating $l$-form on $V$, then their wedge product satisfies
\[ (\alpha \wedge \beta)(v_1, \ldots, v_{k+l}) = \frac{1}{k!\,l!} \sum_{\sigma \in S_{k+l}} \operatorname{sign}(\sigma)\, \alpha(v_{\sigma(1)}, \ldots, v_{\sigma(k)})\, \beta(v_{\sigma(k+1)}, \ldots, v_{\sigma(k+l)}). \]
Solution. The definition of $A$ (applied to $\alpha \otimes \beta$) yields
\begin{align*}
(\alpha \wedge \beta)(v_1, \ldots, v_{k+l}) &= \binom{k+l}{k} \frac{1}{(k+l)!} \sum_{\sigma \in S_{k+l}} \operatorname{sign}(\sigma)\,(\alpha \otimes \beta)(v_{\sigma^{-1}(1)}, \ldots, v_{\sigma^{-1}(k+l)}) \\
&= \frac{1}{k!\,l!} \sum_{\sigma \in S_{k+l}} \operatorname{sign}(\sigma)\, \alpha(v_{\sigma(1)}, \ldots, v_{\sigma(k)})\, \beta(v_{\sigma(k+1)}, \ldots, v_{\sigma(k+l)}).
\end{align*}
Note that summing over all $\sigma^{-1}$ gives the same result as summing over all the $\sigma$, and $\operatorname{sign}(\sigma^{-1}) = \operatorname{sign}(\sigma)$. Thus the stated identity holds.
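As an illustration (not part of the original solution), for two $1$-forms $\alpha$ and $\beta$ (i.e. $k = l = 1$) the formula reduces to the familiar
\[ (\alpha \wedge \beta)(v_1, v_2) = \alpha(v_1)\beta(v_2) - \alpha(v_2)\beta(v_1). \]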
5. (Exercise 5.10, page 65 of notes.) If $D$ is the determinant function, show that it is everywhere differentiable and that its directional derivative at $A$ in direction $W$ is
\[ dD_A W = D(A)\,\operatorname{tr}(W A^{-1}) \]
for invertible $A$.
Solution. $D$ is a polynomial in the entries of the matrix, hence everywhere differentiable. For invertible $A$ we have $D(A + tW) = D(A)\,D(I + tA^{-1}W)$, and $\operatorname{tr}(A^{-1}W) = \operatorname{tr}(WA^{-1})$. Thus it suffices to prove the identity for $A = I$. That is, it suffices to show that
\[ \left.\frac{d}{dt}\right|_{t=0} D(I + tW) = \operatorname{tr}(W). \]
Expressing the determinant explicitly as a function of the columns of the matrices, we have
\[ \left.\frac{d}{dt}\right|_{t=0} D(I + tW) = \left.\frac{d}{dt}\right|_{t=0} D(e_1 + t w_1, \ldots, e_n + t w_n) = \sum_{i=1}^{n} D(e_1, \ldots, e_{i-1}, w_i, e_{i+1}, \ldots, e_n) = \operatorname{tr}(W). \]
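For instance (a small check, not part of the original solution), for $n = 2$,
\[ D(I + tW) = \det\begin{pmatrix} 1 + t w_{11} & t w_{12} \\ t w_{21} & 1 + t w_{22} \end{pmatrix} = 1 + t(w_{11} + w_{22}) + t^2(w_{11}w_{22} - w_{12}w_{21}), \]
whose derivative at $t = 0$ is $w_{11} + w_{22} = \operatorname{tr}(W)$.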
6. Let $M_{\mathrm{symm}}(n, \mathbb{R}) \subset M(n, \mathbb{R})$ be the space of symmetric matrices. Define $F : M(n, \mathbb{R}) \to M_{\mathrm{symm}}(n, \mathbb{R})$ by $F(A) = A^t A$.
(b) Let $A$ be invertible. We need to show that for an arbitrary symmetric matrix $S$, there exists $X$ such that $A^t X + X^t A = S$. But this holds for $X = \frac{1}{2}(A^t)^{-1} S$, since then $X^t = \frac{1}{2} S A^{-1}$ (using $S^t = S$) and
\[ A^t X + X^t A = \frac{1}{2} A^t (A^t)^{-1} S + \frac{1}{2} S A^{-1} A = \frac{1}{2} S + \frac{1}{2} S = S. \]
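The $1 \times 1$ case (added here only as an illustration) shows where this choice of $X$ comes from: the equation reads $2ax = s$, so $x = \frac{s}{2a} = \frac{1}{2}(a^t)^{-1} s$.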
7. (Exercise 5.13, page 67 of notes.) Spectral theorem for symmetric matrices. Let $f : U \to \mathbb{R}$ be a differentiable function defined on an open subset of $\mathbb{R}^n$ equipped with the standard inner product. The sphere of radius 1 centered at the origin is denoted by $S^{n-1} = \{x \in \mathbb{R}^n : \|x\| = 1\}$.
(b) Show that $\|\operatorname{grad}_x f\|$ is the maximum rate of change of $f$ along any direction:
\[ \max_{\|v\| = 1} df_x v = \|\operatorname{grad}_x f\|. \]
(c) Let $A$ be a symmetric $n \times n$ matrix and consider the function
\[ f(x) = \frac{1}{2} \langle Ax, x \rangle. \]
Show that if $x \in S^{n-1}$ is a point where the restriction of $f$ to the sphere achieves its maximum or its minimum value, then $x$ is an eigenvector of $A$.
(d) Show that there exists an orthonormal basis of Rn consisting of eigenvectors of A. For this, use a finite
induction starting from the existence of one eigenvector, then restricting the function to the intersection
of the sphere with the subspace orthogonal to the previously obtained eigenvectors.
Solution. (a) Let $u$ be any vector in the kernel of $df_x$, so that $df_x u = 0$. By definition of the gradient,
\[ \langle \operatorname{grad}_x f, u \rangle = df_x u = 0, \]
so $\operatorname{grad}_x f$ is perpendicular to the kernel of $df_x$.
(c) For $f(x) = \frac{1}{2}\langle Ax, x \rangle$ we have $df_x v = \frac{1}{2}\left( \langle Av, x \rangle + \langle Ax, v \rangle \right) = \langle Ax, v \rangle$, since $A$ is symmetric; hence
\[ \operatorname{grad}_x f = Ax. \]
Thus if $x$ is a critical point of the restriction of $f$ to the sphere (in particular, a point where $f$ attains its maximum or minimum there), then $\operatorname{grad}_x f = Ax$ is perpendicular to the tangent space $T_x S^{n-1} = x^{\perp}$, so $Ax = \lambda x$ for some $\lambda \in \mathbb{R}$. But this means that $x$ is an eigenvector of $A$.
(d) Let $V$ be the $(n-1)$-dimensional subspace of $\mathbb{R}^n$ perpendicular to the eigenvector $x$ obtained in (c). For $v \in V$, since $A$ is symmetric,
\[ 0 = \lambda \langle x, v \rangle = \langle Ax, v \rangle = \langle x, Av \rangle. \]
This means that $Av$ is also perpendicular to $x$. So $V$ is invariant under $A$. We can now repeat the argument of (c) for the sphere in $V$. A simple finite induction now gives an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.
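As a concrete illustration (the matrix below is chosen only for this example and is not from the notes), take $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$. On $S^1$ the function $f(x) = \frac{1}{2}\langle Ax, x \rangle$ attains its maximum $\frac{3}{2}$ at $x = \frac{1}{\sqrt{2}}(1, 1)$, and indeed $Ax = 3x$. Restricting to the line orthogonal to $x$ and maximizing there gives $y = \frac{1}{\sqrt{2}}(1, -1)$ with $Ay = y$; together $x, y$ form an orthonormal basis of eigenvectors, as in (d).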