Tensors
It is now convenient to briefly review the mathematics of tensors. Tensors are of primary
importance in connection with coordinate transforms. They serve to isolate intrinsic
geometric and physical properties from those that merely depend on coordinates.
A tensor of rank zero is simply a scalar, and a tensor of rank one is a vector. A tensor of rank two has $n^2$ components, which can be exhibited in matrix format. Unfortunately, there is no convenient way of exhibiting a higher rank tensor. Consequently, tensors are usually represented by a typical component: e.g., the tensor $a_{ijk}$ (rank 3), or the tensor $a_{ijkl}$ (rank 4), etc. The suffixes are always understood to range from 1 to $n$, the dimensionality of the space.
For reasons which will become apparent later on, we shall represent tensor components using both superscripts and subscripts. Thus, a typical tensor might look like $a^{ij}$ (rank 2), or $b_{ij}$ (rank 2), etc. It is convenient to adopt the Einstein summation convention. Namely, if any suffix appears twice in a given term, once as a subscript and once as a superscript, a summation over that suffix (from 1 to $n$) is implied.
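As an aside (not part of the original notes), the summation convention is exactly what numpy.einsum implements. The short sketch below, with an assumed dimension n = 4 and arbitrary components, evaluates a term such as $a^i_j b^j$ by summing over the repeated suffix.

import numpy as np

n = 4                              # dimensionality of the space (assumed for illustration)
rng = np.random.default_rng(0)

a = rng.normal(size=(n, n))        # components a^i_j (first axis up, second axis down)
b = rng.normal(size=n)             # components b^j

# Einstein summation convention: the repeated suffix j is summed from 1 to n,
# so c^i = a^i_j b^j.
c = np.einsum('ij,j->i', a, b)

# Equivalent explicit sum, written out for comparison.
c_explicit = np.array([sum(a[i, j] * b[j] for j in range(n)) for i in range(n)])
assert np.allclose(c, c_explicit)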
To distinguish between various different coordinate systems, we shall use primed and multiply primed suffixes. A first system of coordinates $(x^1, x^2, \ldots, x^n)$ can then be denoted by $x^i$, a second system $(x^{1'}, x^{2'}, \ldots, x^{n'})$ by $x^{i'}$, etc. Similarly, the components of a given tensor in various coordinate systems are distinguished by their suffixes. Thus, the components of some third rank tensor are denoted $a_{ijk}$ in the $x^i$ system, by $a_{i'j'k'}$ in the $x^{i'}$ system, etc.

When transforming from one set of coordinates, $x^i$, to another, $x^{i'}$, the transformation is assumed to be non-singular and differentiable, so that the $x^{i'}$ can be expressed in terms of the $x^i$, and vice versa. It is convenient to write

$$ p^{i'}_{i} = \frac{\partial x^{i'}}{\partial x^i}, \qquad (1369) $$
$$ p^{i}_{i'} = \frac{\partial x^{i}}{\partial x^{i'}}. \qquad (1370) $$
Note that, by the chain rule,

$$ p^{i'}_{i}\, p^{i}_{j'} = \delta^{i'}_{j'}, \qquad (1371) $$

where the Kronecker delta $\delta^{i}_{j}$ equals 1 or 0 when $i = j$ or $i \neq j$, respectively.

The formal definition of a tensor is as follows. An entity with components $a_{ij\cdots k}$ in the $x^i$ system and $a_{i'j'\cdots k'}$ in the $x^{i'}$ system behaves as a covariant tensor under the transformation $x^i \rightarrow x^{i'}$ if

$$ a_{i'j'\cdots k'} = a_{ij\cdots k}\, p^{i}_{i'}\, p^{j}_{j'} \cdots p^{k}_{k'}. \qquad (1372) $$

Similarly, $a^{ij\cdots k}$ behaves as a contravariant tensor if

$$ a^{i'j'\cdots k'} = a^{ij\cdots k}\, p^{i'}_{i}\, p^{j'}_{j} \cdots p^{k'}_{k}. \qquad (1373) $$

Finally, $a^{i\cdots j}_{k\cdots l}$ behaves as a mixed tensor (contravariant in $i\cdots j$ and covariant in $k\cdots l$) under $x^i \rightarrow x^{i'}$ if

$$ a^{i'\cdots j'}_{k'\cdots l'} = a^{i\cdots j}_{k\cdots l}\, p^{i'}_{i} \cdots p^{j'}_{j}\, p^{k}_{k'} \cdots p^{l}_{l'}. \qquad (1374) $$
When applied to a tensor of rank zero (a scalar), the above definitions imply that $a' = a$. Thus, a scalar is a function of position only, and is independent of the coordinate system. A scalar is often termed an invariant.
If two tensors of the same type are equal in one coordinate system, then they are
equal in all coordinate systems.
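As an illustrative sketch (not part of the original notes), the covariant transformation law above is easy to check numerically for a linear change of coordinates. The code below assumes an arbitrary invertible matrix for $p^{i'}_{i}$, builds a covariant rank-two tensor as the outer product of two covariant vectors, transforms the vectors individually, and checks that the result matches transforming the tensor directly with the rank-two law.

import numpy as np

n = 4
rng = np.random.default_rng(1)

# p_fwd[a, i] plays the role of p^{i'}_i, and p_bwd[i, a] the role of p^i_{i'}.
p_fwd = rng.normal(size=(n, n))
p_bwd = np.linalg.inv(p_fwd)

# Chain-rule identity (1371): p^{i'}_i p^i_{j'} = delta^{i'}_{j'}.
assert np.allclose(np.einsum('ai,ib->ab', p_fwd, p_bwd), np.eye(n))

# A covariant rank-two tensor built as the outer product of two covariant vectors.
u = rng.normal(size=n)
v = rng.normal(size=n)
a = np.einsum('i,j->ij', u, v)            # a_{ij} = u_i v_j

# Transform the vectors first, then form the outer product ...
u_p = np.einsum('i,ia->a', u, p_bwd)      # u_{i'} = u_i p^i_{i'}
v_p = np.einsum('j,jb->b', v, p_bwd)
a_from_vectors = np.einsum('a,b->ab', u_p, v_p)

# ... or transform the tensor directly with the covariant law (1372).
a_direct = np.einsum('ij,ia,jb->ab', a, p_bwd, p_bwd)
assert np.allclose(a_from_vectors, a_direct)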
The simplest example of a contravariant vector (tensor of rank one) is provided by the differentials of the coordinates, $dx^i$, since

$$ dx^{i'} = \frac{\partial x^{i'}}{\partial x^i}\, dx^i = dx^i\, p^{i'}_{i}. \qquad (1375) $$
The coordinates themselves do not behave as tensors under all coordinate transformations.
However, since they transform like their differentials under linear homogeneous coordinate
transformations, they do behave as tensors under such transformations.
The simplest example of a covariant vector is provided by the gradient of a function of position, $\phi = \phi(x^1, \ldots, x^n)$, since if we write

$$ \phi_i = \frac{\partial \phi}{\partial x^i}, \qquad (1376) $$

then we have

$$ \phi_{i'} = \frac{\partial \phi}{\partial x^{i'}} = \frac{\partial \phi}{\partial x^i}\, \frac{\partial x^i}{\partial x^{i'}} = \phi_i\, p^{i}_{i'}. \qquad (1377) $$

An important example of a mixed second-rank tensor is provided by the Kronecker delta introduced previously, since

$$ \delta^{i}_{j}\, p^{i'}_{i}\, p^{j}_{j'} = p^{i'}_{j}\, p^{j}_{j'} = \delta^{i'}_{j'}. \qquad (1378) $$
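A short numerical sketch of the covariant law (1377), again not from the original notes: for an assumed quadratic scalar field and a randomly chosen linear transformation, the gradient computed in the primed coordinates agrees with the gradient in the unprimed coordinates contracted with $p^{i}_{i'}$. The grad helper is a simple central-difference routine written for this illustration.

import numpy as np

n = 4
rng = np.random.default_rng(2)

# Linear, homogeneous change of coordinates x^{i'} = P[i', i] x^i,
# so p^{i'}_i = P[i', i] and p^i_{i'} = P^{-1}[i, i'] are constant.
P = rng.normal(size=(n, n))
P_inv = np.linalg.inv(P)

# An arbitrary scalar field phi(x), chosen here purely for illustration.
A = rng.normal(size=(n, n))
def phi(x):
    return x @ A @ x

def grad(f, x, h=1e-6):
    """Central-difference gradient of f at the point x."""
    g = np.zeros_like(x)
    for k in range(len(x)):
        e = np.zeros_like(x)
        e[k] = h
        g[k] = (f(x + e) - f(x - e)) / (2 * h)
    return g

x = rng.normal(size=n)
x_p = P @ x                                      # the same point in primed coordinates

phi_i = grad(phi, x)                             # phi_i = d(phi)/dx^i
phi_ip = grad(lambda xp: phi(P_inv @ xp), x_p)   # phi_{i'} = d(phi)/dx^{i'}

# Covariant transformation law (1377): phi_{i'} = phi_i p^i_{i'}.
assert np.allclose(phi_ip, np.einsum('i,ia->a', phi_i, P_inv), atol=1e-5)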
Tensors of the same type can be added or subtracted to form new tensors. Thus, if $a_{ij}$ and $b_{ij}$ are tensors, then $c_{ij} = a_{ij} \pm b_{ij}$ is a tensor of the same type. Note that the sum of tensors at different points in space is not a tensor if the $p$'s are position dependent. However, under linear coordinate transformations the $p$'s are constant, so the sum of tensors at different points behaves as a tensor under this particular type of coordinate transformation.

If $a^{ij}_{k}$ and $b_{lm}$ are tensors, then $c^{ij}_{klm} = a^{ij}_{k}\, b_{lm}$ is a tensor of the type indicated by the suffixes. The process illustrated by this example is called outer multiplication of tensors.

Tensors can also be combined by inner multiplication, which implies at least one dummy suffix link. Thus, $a^{ij}_{k}\, b_{il}$ and $a^{ij}_{k}\, b_{ij}$ are tensors of the type indicated by the suffixes.
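As an illustration only (with arbitrary components, in the same numpy.einsum notation as the earlier sketches), outer and inner multiplication map directly onto index strings: repeating no suffix gives the outer product, while repeating a suffix sums over it and produces the dummy-suffix link described above.

import numpy as np

n = 4
rng = np.random.default_rng(3)

a = rng.normal(size=(n, n, n))   # components a^{ij}_k
b = rng.normal(size=(n, n))      # components b_{lm}

# Outer multiplication: no repeated suffix; the ranks add (here a rank-5 tensor c^{ij}_{klm}).
c_outer = np.einsum('ijk,lm->ijklm', a, b)
assert c_outer.shape == (n, n, n, n, n)

# Inner multiplication: at least one dummy suffix (here i), giving a^{ij}_k b_{il}.
c_inner = np.einsum('ijk,il->jkl', a, b)
assert c_inner.shape == (n, n, n)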
Finally, tensors can be formed by contraction from tensors of higher rank. Thus, if $a^{ij}_{klm}$ is a tensor, then $a^{ij}_{kli}$ and $a^{ij}_{kij}$ are tensors of the type indicated by the remaining free suffixes. The most important type of contraction occurs when no free suffixes remain: the result is a scalar. Thus, $a^{i}_{i}$ is a scalar provided that $a^{i}_{j}$ is a tensor.
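In the same illustrative einsum notation, contraction is a repeated suffix within a single tensor; contracting the only pair of suffixes of a mixed rank-two tensor leaves no free suffix and gives the scalar (the trace) mentioned here. The components below are arbitrary.

import numpy as np

n = 4
rng = np.random.default_rng(4)

a5 = rng.normal(size=(n, n, n, n, n))   # components a^{ij}_{klm}

# Contract the contravariant suffix i with the covariant suffix m: a^{ij}_{kli}.
c = np.einsum('ijkli->jkl', a5)
assert c.shape == (n, n, n)

# Full contraction of a mixed rank-two tensor a^i_j leaves no free suffix: a^i_i is a scalar.
a2 = rng.normal(size=(n, n))
scalar = np.einsum('ii->', a2)          # the trace
assert np.isclose(scalar, np.trace(a2))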
Although we cannot usefully divide tensors, one by another, an entity like $c^{ij}$ in the equation $a^i = c^{ij} b_j$, where $a^i$ and $b_j$ are tensors, can be formally regarded as the quotient of $a^i$ and $b_j$. This gives the name to a particularly useful rule for recognizing tensors, the quotient rule. This rule states that if a set of components, when combined by a given type of multiplication with all tensors of a given type, yields a tensor, then the set is itself a tensor. In other words, if the product $c^{ij} b_j$ transforms like a tensor for all tensors $b_j$, then it
follows that $c^{ij}$ is itself a tensor.

Let

$$ a_{ij,k} = \frac{\partial a_{ij}}{\partial x^k}. \qquad (1379) $$

Then, if $a_{ij}$ is a tensor, differentiation of the general tensor transformation (1372) yields

$$ a_{i'j',k'} = a_{ij,k}\, p^{i}_{i'}\, p^{j}_{j'}\, p^{k}_{k'} + P_1 + P_2, \qquad (1380) $$

where $P_1$, $P_2$, etc., are terms involving derivatives of the $p$'s. Clearly, $a_{ij,k}$ is not a tensor under a general coordinate transformation. However, under a linear coordinate transformation the $p$'s are constant, so $P_1 = P_2 = 0$, and $a_{ij,k}$ does behave as a tensor. Similarly, all higher partial derivatives,

$$ a_{ij,kl} = \frac{\partial a_{ij,k}}{\partial x^l}, \qquad (1381) $$

etc., also behave as tensors under linear transformations. Each partial differentiation has the effect of adding a new covariant suffix.
So far, the space to which the coordinates refer has been without structure. We can impose
a structure on it by defining the distance between all pairs of neighbouring points by means of
a metric,
$$ ds^2 = g_{ij}\, dx^i\, dx^j, \qquad (1382) $$

where the $g_{ij}$ are functions of position. We can assume that $g_{ij} = g_{ji}$ without loss of generality. The above metric is analogous to, but more general than, the metric of Euclidian $n$-space, $ds^2 = (dx^1)^2 + (dx^2)^2 + \cdots + (dx^n)^2$. A space whose structure is determined by a metric of the type (1382) is called Riemannian. Since $ds^2$ is invariant, it follows from a simple extension of the quotient rule that $g_{ij}$ must be a tensor. It is called the metric tensor.
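As a concrete illustration (an assumed example, not taken from the notes), the Euclidean plane described in polar coordinates $(r, \theta)$ has metric $g_{ij} = \mathrm{diag}(1, r^2)$. The sketch below checks that the invariant $ds^2$ computed from this metric agrees, to first order in the displacement, with the squared Cartesian distance between the same two neighbouring points.

import numpy as np

# Euclidean plane in polar coordinates (r, theta): ds^2 = dr^2 + r^2 dtheta^2.
def g_polar(r, theta):
    return np.diag([1.0, r**2])

# A point and a small coordinate displacement dx^i = (dr, dtheta).
r, theta = 2.0, 0.3
dx = np.array([1e-4, -2e-4])

# ds^2 = g_{ij} dx^i dx^j, an invariant.
ds2_polar = np.einsum('ij,i,j->', g_polar(r, theta), dx, dx)

# The same displacement expressed in Cartesian coordinates, where g_{ij} = delta_{ij}.
x0 = np.array([r * np.cos(theta), r * np.sin(theta)])
x1 = np.array([(r + dx[0]) * np.cos(theta + dx[1]),
               (r + dx[0]) * np.sin(theta + dx[1])])
ds2_cart = np.sum((x1 - x0)**2)

assert np.isclose(ds2_polar, ds2_cart, rtol=1e-3)   # agreement to first order in the displacement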
The elements of the inverse of the matrix $g_{ij}$ are denoted by $g^{ij}$. These elements are uniquely defined by the equations

$$ g^{ij}\, g_{jk} = \delta^{i}_{k}. \qquad (1383) $$

It is easily seen that the $g^{ij}$ constitute the elements of a contravariant tensor. This tensor is said to be conjugate to $g_{ij}$. The conjugate metric tensor is symmetric (i.e., $g^{ij} = g^{ji}$), just like the metric tensor itself.
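Continuing the polar-coordinate illustration from above, the conjugate metric is simply the matrix inverse, and (1383) is the statement that the two matrices multiply to the identity.

import numpy as np

r = 2.0
g = np.diag([1.0, r**2])        # g_{ij} for plane polar coordinates (illustration)
g_inv = np.linalg.inv(g)        # the conjugate (contravariant) metric g^{ij}

# Equation (1383): g^{ij} g_{jk} = delta^i_k.
assert np.allclose(np.einsum('ij,jk->ik', g_inv, g), np.eye(2))
# The conjugate metric is symmetric, like g_{ij} itself.
assert np.allclose(g_inv, g_inv.T)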
The tensors $g_{ij}$ and $g^{ij}$ allow us to introduce the important operations of raising and lowering suffixes. These operations consist of forming inner products of a given tensor with $g_{ij}$ or $g^{ij}$. For example, given a contravariant vector $a^i$, we define its covariant components $a_i$ by the equation

$$ a_i = g_{ij}\, a^j. \qquad (1384) $$

Similarly, given a covariant vector $b_i$, we define its contravariant components $b^i$ by the equation

$$ b^i = g^{ij}\, b_j. \qquad (1385) $$

More generally, we can raise or lower any or all of the free suffixes of any given tensor. Thus, if $a_{ij}$ is a tensor, we define $a^{i}_{j}$ by the equation

$$ a^{i}_{j} = g^{ik}\, a_{kj}. \qquad (1386) $$
Note that, once the operations of raising and lowering suffixes have been defined, the order of raised suffixes relative to lowered suffixes becomes significant.

By analogy with Euclidian space, we define the squared magnitude $(a)^2$ of a vector $a^i$ with respect to the metric $g_{ij}\, dx^i\, dx^j$ by the equation

$$ (a)^2 = g_{ij}\, a^i\, a^j = a_i\, a^i. \qquad (1387) $$
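A short sketch in the same assumed polar-coordinate setting lowers the suffix of a contravariant vector with $g_{ij}$, raises it back with $g^{ij}$, and computes the squared magnitude both ways, as in (1384), (1385), and (1387).

import numpy as np

r = 2.0
g = np.diag([1.0, r**2])          # g_{ij} (plane polar coordinates, illustration)
g_inv = np.linalg.inv(g)          # g^{ij}

a_up = np.array([0.5, -0.25])     # contravariant components a^i = (a^r, a^theta)

a_down = np.einsum('ij,j->i', g, a_up)             # lowering: a_i = g_{ij} a^j      (1384)
a_up_again = np.einsum('ij,j->i', g_inv, a_down)   # raising:  a^i = g^{ij} a_j      (1385)
assert np.allclose(a_up, a_up_again)

# Squared magnitude (1387): (a)^2 = g_{ij} a^i a^j = a_i a^i.
mag2_metric = np.einsum('ij,i,j->', g, a_up, a_up)
mag2_mixed = np.einsum('i,i->', a_down, a_up)
assert np.isclose(mag2_metric, mag2_mixed)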
Finally, let us consider differentiation with respect to an invariant distance, $s$. The vector $dx^i/ds$ is a contravariant tensor, since

$$ \frac{dx^{i'}}{ds} = \frac{\partial x^{i'}}{\partial x^i}\, \frac{dx^i}{ds} = \frac{dx^i}{ds}\, p^{i'}_{i}. \qquad (1389) $$

On the other hand, the derivative of some tensor, such as $a_{ij}$, with respect to $s$ is not, in general, a tensor, since

$$ \frac{d a_{ij}}{ds} = a_{ij,k}\, \frac{dx^k}{ds}, \qquad (1390) $$
and, as we have seen, the first factor on the right-hand side is not generally a tensor. However,
under linear transformations it behaves as a tensor, so under linear transformations the
derivative of a tensor with respect to an invariant distance behaves as a tensor of the same
type.
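One final illustrative sketch (with an assumed curve and an assumed constant transformation matrix, and with $s$ treated simply as an invariant parameter along the curve): under a linear transformation, the tangent vector $dx^i/ds$ transforms with $p^{i'}_{i}$, exactly as (1389) requires.

import numpy as np

n = 4
rng = np.random.default_rng(5)
P = rng.normal(size=(n, n))       # constant p^{i'}_i for a linear transformation

# A curve x^i(s); its image in the primed coordinates is x^{i'}(s) = P x(s).
def x(s):
    return np.array([np.cos(s), np.sin(s), s, s**2])

def deriv(f, s, h=1e-6):
    """Central-difference derivative of the curve f at parameter value s."""
    return (f(s + h) - f(s - h)) / (2 * h)

s0 = 0.7
dx_ds = deriv(x, s0)
dxp_ds = deriv(lambda s: P @ x(s), s0)

# Contravariant transformation (1389): dx^{i'}/ds = p^{i'}_i dx^i/ds.
assert np.allclose(dxp_ds, P @ dx_ds, atol=1e-5)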
Richard Fitzpatrick 2006-02-02