

MATH 332: Vector Analysis


Tensors
Ivan Avramidi
New Mexico Tech

Cartesian Coordinate System. First of all, let us introduce a Cartesian
coordinate system in three-dimensional Euclidean space. We will denote the
coordinates by

x^1 = x,   x^2 = y,   x^3 = z   (1)

and the unit vectors in the direction of the positive axes (called the basis
vectors) by

e_1 = i,   e_2 = j,   e_3 = k   (2)

Index Notation. This can be denoted simply by x^i and e_j, where i, j =
1, 2, 3. For the indices one usually uses the lowercase Latin letters i, j, k, l, m, n,
etc. (not to be confused with the unit vectors i, j, k). If you run out of letters,
you can use any other letters. The convention, though, is that indices are
denoted by lowercase (versus capital) Latin (versus Greek) letters and take the
values 1, 2, 3. Greek indices are used in four-dimensional space-time in special
relativity, where they take the values 0, 1, 2, 3, with x^0 = t denoting time.

Kronecker Delta Symbol. The scalar products of the basis vectors are

e_i · e_j = { 1, if i = j
            { 0, if i ≠ j   (3)

One says that they form an orthonormal system. This can be written in a
compact form by defining the so-called Kronecker symbol δ_{ij},

δ_{ij} = { 1, if i = j
          { 0, if i ≠ j   (4)

This can also be represented by the unit 3 × 3 matrix

             ( 1  0  0 )
(δ_{ij}) =   ( 0  1  0 )   (5)
             ( 0  0  1 )

Then

e_i · e_j = δ_{ij}   (6)
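As a quick illustrative check (not part of the original notes), the orthonormality relation (6) can be verified numerically. The following sketch uses Python with NumPy; all names in it are our own.

import numpy as np

# Standard Cartesian basis vectors e_1 = i, e_2 = j, e_3 = k.
e = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# Kronecker delta as the 3 x 3 identity matrix, eq. (5).
delta = np.eye(3)

# Check e_i . e_j = delta_ij for all i, j, eq. (6).
dots = np.array([[e[i] @ e[j] for j in range(3)] for i in range(3)])
assert np.allclose(dots, delta)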

Scalars. Physical quantities, like mass, energy, volume, temperature, density,
etc., that can be described by one number are called scalars. This number
does not depend on the coordinate system; it is an invariant.

Vectors. Vectors are physical quantities, like velocity, position, displacement,
force, acceleration, electric field, magnetic field, etc., that are described
by three numbers.

Tensors. A tensor is a geometric object that requires for its full description
more than just one number, as a scalar does, and even more than three numbers,
as a vector does. Examples of tensors include: the stress tensor, strain tensor,
inertia tensor, energy-momentum tensor, tensor of the electromagnetic field,
metric tensor, curvature tensor, etc.

Tensor Components. These numbers are called the components of the
tensor. The components of a tensor are labeled by indices, for example,

δ_{ij},   ε_{ijk},   T^{ij},   B_{ij},   σ_{ij},   R^i_{ijk}   (7)

A tensor all of whose components are zero is called a zero tensor.

Types of Tensors. The tensors with upper indices are called contravariant,
and the ones with lower indices are called covariant. If a tensor has both
types of indices, then it is of mixed type. The total number of indices is
called the rank of the tensor. A tensor that has p upper indices and q lower
indices,

T^{i_1 ... i_p}_{j_1 ... j_q}   (8)

is called a tensor of type (p, q). So, a scalar is a tensor of rank 0. A vector
is a tensor of rank 1.

Transformation Law. The actual numerical values of the components of
a tensor do depend on the coordinate system. If one changes the coordinate
system, for example, rotates it, then the components of a tensor will change.
If one goes from the Cartesian coordinate system to a curvilinear coordinate
system, for example, a system of spherical or cylindrical coordinates, then
the components of a tensor will also change. It is this transformation law of
the components of the tensor that makes a collection of numbers a tensor.
We will not give the formal definition of a tensor; rather, we give here a very
short review of tensor analysis in Cartesian coordinates along with some very
useful formulas and rules that enable one to deal with tensors.
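As an illustration of the last two paragraphs (ours, with made-up numbers), one can rotate the coordinate system and observe that the vector components change while a scalar does not. A minimal Python/NumPy sketch, assuming a rotation about the z-axis:

import numpy as np

theta = 0.3   # an arbitrary rotation angle about the z-axis
R = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])

A = np.array([1.0, 2.0, 3.0])   # components of a vector in the old system
B = np.array([4.0, 5.0, 6.0])

A_new, B_new = R @ A, R @ B     # components in the rotated system change ...
assert not np.allclose(A_new, A)

# ... but the scalar A . B is an invariant, as stated for scalars above.
assert np.isclose(A_new @ B_new, A @ B)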

Metric Tensor. In Cartesian coordinates the square of the distance between
two infinitesimally close points in space, one with coordinates x^i and another
with coordinates x^i + dx^i, is

(ds)^2 = (dx)^2 + (dy)^2 + (dz)^2 = Σ_{i=1}^{3} (dx^i)^2   (9)

This can be written in the following form:

(ds)^2 = Σ_{i=1}^{3} Σ_{j=1}^{3} δ_{ij} dx^i dx^j   (10)

The distance between infinitesimally close points determines a tensor of rank
2, the so-called metric tensor g_{ij}. In a general coordinate system one has

(ds)^2 = Σ_{i=1}^{3} Σ_{j=1}^{3} g_{ij} dx^i dx^j   (11)

Thus, the covariant components of the metric tensor in Cartesian coordinates
are given by the Kronecker delta symbol,

g_{ij} = δ_{ij}   (12)

The contravariant components of the metric tensor are defined by the inverse
matrix,

g^{ij} = (g_{ij})^{-1}   (13)

In Cartesian coordinates

g^{ij} = δ_{ij}   (14)
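As a small numerical aside (ours, not from the notes), the quadratic form (10)-(11) can be evaluated directly; here the metric is the Cartesian one, g_{ij} = δ_{ij}, and the "infinitesimal" displacement is represented by a small finite vector.

import numpy as np

g = np.eye(3)                          # Cartesian metric g_ij = delta_ij, eq. (12)
dx = np.array([1e-3, 2e-3, -1e-3])     # a small displacement dx^i

# (ds)^2 = g_ij dx^i dx^j, eq. (11); with g = delta this is just |dx|^2, eq. (9).
ds2 = np.einsum('ij,i,j->', g, dx, dx)
assert np.isclose(ds2, np.dot(dx, dx))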

Tensor Equations.
• In any tensor equation an index can appear only once (a single index) or
twice (a repeated index). For example, A^i_{ii} is impossible.
• A single index can be either covariant in the whole equation or contravariant
in the whole equation. It cannot be contravariant in one term and covariant
in another term. For example, A^i_j + B_{ij} is wrong.
• The repeated indices always appear in pairs, one covariant and the other
contravariant. For example, A^i_{ij}.
• A pair of repeated indices cannot appear more than once. For example,
A^{ii}_{ii} is wrong.

Einstein Summation Convention. In tensor analysis one always encounters
sums over the indices that appear twice in an equation. For example, in the
formula for the distance above the indices i and j appear twice, and there is
a summation over i and j running from 1 to 3. According to the standard
convention, called the Einstein summation convention, one has agreed to sum
over repeated indices and omit the summation signs. For example,

Σ_{i=1}^{3} Σ_{j=1}^{3} δ_{ij} dx^i dx^j = δ_{ij} dx^i dx^j   (15)

Σ_{i=1}^{3} A_i B^i = A_i B^i   (16)

Σ_{i=1}^{3} T^i_i = T^i_i   (17)

Σ_{i=1}^{3} R^i_{jik} = R^i_{jik}   (18)

Σ_{i=1}^{3} Σ_{j=1}^{3} δ_{ij} R^{ij} = δ_{ij} R^{ij}   (19)

Σ_{i=1}^{3} Σ_{j=1}^{3} Σ_{k=1}^{3} ε_{ijk} A^{jk} B^i = ε_{ijk} A^{jk} B^i   (20)
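Purely as an aside (not part of the notes), NumPy's einsum function uses exactly this convention: indices repeated in the subscript string are summed over automatically. A short sketch with arbitrary data:

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal(3)
B = rng.standard_normal(3)
T = rng.standard_normal((3, 3))

# A_i B^i: the repeated index i is summed automatically, cf. eq. (16).
s = np.einsum('i,i->', A, B)
assert np.isclose(s, sum(A[i] * B[i] for i in range(3)))

# T^i_i: contraction over a repeated pair of indices gives the trace, cf. eq. (17).
assert np.isclose(np.einsum('ii->', T), np.trace(T))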

Raising and Lowering Indices. The metric tensor can be used to raise
and lower indices of tensors. For example, if A^i are the contravariant
components of a vector, then its covariant components are

A_i = δ_{ij} A^j   (21)

Conversely,

A^i = δ^{ij} A_j   (22)

These operations, called raising and lowering of indices, can be applied to any
tensor. If one applies them to the metric tensor, one gets

δ^i_j = δ_{jk} δ^{ik}   (23)

By raising and lowering indices any tensor can be put in a covariant or
contravariant form. One has to be careful, though, with the order of indices.
For example,

A^i_j = δ_{jk} A^{ik} ≠ A_j^i = δ_{jk} A^{ki}   (24)
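A tiny numerical illustration of eq. (24) (ours, with made-up data): lowering the second index of a non-symmetric rank-2 tensor is not the same as lowering the first one.

import numpy as np

delta = np.eye(3)
A = np.array([[0.0, 1.0, 2.0],
              [3.0, 4.0, 5.0],
              [6.0, 7.0, 8.0]])   # a non-symmetric rank-2 tensor A^{ik}

B = np.einsum('jk,ik->ij', delta, A)   # B[i, j] = A^i_j = delta_{jk} A^{ik}
C = np.einsum('jk,ki->ij', delta, A)   # C[i, j] = A_j^i = delta_{jk} A^{ki}

# In Cartesian coordinates this reduces to B = A and C = A transposed,
# so the two orderings differ whenever A is not symmetric, cf. eq. (24).
assert not np.allclose(B, C)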

Remark. In Cartesian coordinates the covariant and contravariant components
are equal, since the metric tensor is given by the Kronecker symbol. Therefore,
in this case the position of the indices does not make any difference, and all
indices can, for example, be placed down.

Properties and Identities of Kronecker Delta Symbol.

δ_{ij} = δ_{ji}   (25)

δ^i_j = δ_{ij} = δ^{ij} = δ_j^i   (26)

δ^i_i = 3   (27)

δ_{ij} A^j = A_i   (28)

δ^{ij} A_i B_j = A^i B_i = A · B   (29)

Addition. One can add tensors of the same type. The result is a tensor of
the same type.

Multiplication By Scalars. One can multiply tensors by scalars. The
result is a tensor of the same type.

Tensor Multiplication. If one multiplies a tensor of rank r with a tensor
of rank k, one gets a new tensor of rank r + k. More precisely, if one multiplies
a tensor of type (p, q) with a tensor of type (r, s), then one gets a new tensor
of type (p + r, q + s). For example,

A_i B_j = C_{ij},   T^{mn} σ_{ij} = R^{mn}_{ij}   (30)

Note: one just multiplies the components of the tensors without any summation.
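A quick sketch (ours): the tensor product of two vectors is simply the outer product of their component arrays, with no summation involved.

import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# C_{ij} = A_i B_j, eq. (30): a rank-2 tensor built from two rank-1 tensors.
C = np.einsum('i,j->ij', A, B)
assert np.allclose(C, np.outer(A, B))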

Contraction. Given a tensor of type (p, q) (that is, of rank r = p + q),
one may select a pair of indices, of which one should be an upper index and
the other a lower index, and replace them by two identical (repeated) indices,
summation over the latter being implied by the summation convention. This
process is called contraction. As a result one gets a new tensor of type
(p − 1, q − 1), of rank r − 2 = p + q − 2. For example,

A^i_i,   R^{ij}_{ki},   C^{ik}_k   (31)

Clearly,

δ^i_i = δ^1_1 + δ^2_2 + δ^3_3 = 3   (32)
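A short numerical illustration of contraction (ours, with random components):

import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))        # a rank-2 tensor T^i_j
R = rng.standard_normal((3, 3, 3, 3))  # a rank-4 tensor R^{ij}_{kl}

# Contracting the two indices of T^i_j gives a scalar (rank 0), cf. eq. (31).
assert np.isclose(np.einsum('ii->', T), np.trace(T))

# Contracting one upper with one lower index of a rank-4 tensor leaves a rank-2 tensor.
S = np.einsum('ijki->jk', R)   # R^{ij}_{ki}, summed over i
assert S.shape == (3, 3)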

Symmetrization and Anti-symmetrization. A tensor A of rank 2 is
said to be symmetric if

A_{ij} = A_{ji}   (33)

and anti-symmetric (or skew-symmetric) if

A_{ij} = −A_{ji}   (34)

Any tensor A_{ij} of second rank can be decomposed,

A_{ij} = A_{(ij)} + A_{[ij]}   (35)

into its symmetric part

A_{(ij)} = (1/2)(A_{ij} + A_{ji})   (36)

and anti-symmetric part

A_{[ij]} = (1/2)(A_{ij} − A_{ji})   (37)

One can also symmetrize a tensor over three indices:

B_{(ijk)} = (1/6)(B_{ijk} + B_{jki} + B_{kij} + B_{ikj} + B_{jik} + B_{kji})   (38)

Correspondingly, the anti-symmetrization over three indices is defined by

B_{[ijk]} = (1/6)(B_{ijk} + B_{jki} + B_{kij} − B_{ikj} − B_{jik} − B_{kji})   (39)

What one does here is sum over all possible permutations of the indices and
change the sign if the permutation is odd.

Contraction of Symmetric and Anti-symmetric Tensors. Let A_{ij} be
a symmetric tensor and B_{ij} an anti-symmetric one. Then

A_{ij} B^{ij} = A_{ji} B^{ij} = −A_{ji} B^{ji} = −A_{ij} B^{ij},   (40)

and, therefore,

A_{ij} B^{ij} = 0.   (41)
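A numerical sanity check (ours): decompose a random rank-2 tensor according to eqs. (35)-(37) and verify that the contraction of its symmetric part with its anti-symmetric part vanishes, eq. (41).

import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))

M_sym  = 0.5 * (M + M.T)   # M_(ij), eq. (36)
M_anti = 0.5 * (M - M.T)   # M_[ij], eq. (37)

assert np.allclose(M_sym + M_anti, M)                          # eq. (35)
assert np.isclose(np.einsum('ij,ij->', M_sym, M_anti), 0.0)    # eq. (41)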

Levi-Civita Symbol. The Levi-Civita symbol ε_{ijk} is defined by

          { +1, if (i, j, k) is an even permutation of (1, 2, 3)
ε_{ijk} = { −1, if (i, j, k) is an odd permutation of (1, 2, 3)   (42)
          {  0, otherwise

If one raises the indices, then one sees that in Cartesian coordinates one
obtains the same symbol, so that

ε^{ijk} = ε_{ijk}   (43)

The Levi-Civita symbol defines a tensor of rank 3, called the Levi-Civita
tensor. This is another very important tensor that is purely geometric in
nature. It describes not the distances but the volume in three-dimensional
space. The volume of a parallelepiped based on three displacement vectors
A^i, B^j, C^k is

V = ε_{ijk} A^i B^j C^k   (44)
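As a sketch (ours, not from the notes), one can build ε_{ijk} numerically straight from the definition (42) and use it to compute the volume (44); indices are 0-based in the code.

import numpy as np
from itertools import permutations

# Build the Levi-Civita symbol eps[i, j, k] according to eq. (42).
eps = np.zeros((3, 3, 3))
for p in permutations(range(3)):
    # The parity is the number of inversions of the permutation p of (0, 1, 2).
    inversions = sum(p[a] > p[b] for a in range(3) for b in range(a + 1, 3))
    eps[p] = -1.0 if inversions % 2 else 1.0

# Volume of the parallelepiped spanned by A, B, C, eq. (44).
A = np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 2.0, 0.0])
C = np.array([0.0, 0.0, 3.0])
V = np.einsum('ijk,i,j,k->', eps, A, B, C)
assert np.isclose(V, 6.0)   # equals A . (B x C)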

Properties and Identities of Levi-Civita Symbol. The Levi-Civita
symbol defines a completely anti-symmetric tensor. The following properties
immediately follow from its definition.

ε_{ijk} = −ε_{jik} = −ε_{ikj} = −ε_{kji}   (45)

ε_{ijk} = ε_{jki} = ε_{kij}   (46)

ε_{ij}^j = ε^j_{ij} = ε^j_{ji} = 0   (47)

ε_{ijk} δ^{ij} = ε_{ijk} δ^{ik} = ε_{ijk} δ^{jk} = 0   (48)

ε_{ijk} A^j A^k = ε_{ijk} A^i A^k = ε_{ijk} A^i A^j = 0   (49)

ε^{ijk} = ε_{ijk}   (50)

ε_{ijk} ε^{mnl} = 6 δ_{[i}^m δ_j^n δ_{k]}^l   (51)
               = δ_i^m δ_j^n δ_k^l + δ_j^m δ_k^n δ_i^l + δ_k^m δ_i^n δ_j^l
                 − δ_i^m δ_k^n δ_j^l − δ_j^m δ_i^n δ_k^l − δ_k^m δ_j^n δ_i^l   (52)

ε_{ijk} ε^{mnk} = 2 δ_{[i}^m δ_{j]}^n = δ_i^m δ_j^n − δ_j^m δ_i^n   (53)

ε_{ijk} ε^{mjk} = 2 δ_i^m   (54)

ε_{ijk} ε^{ijk} = 6   (55)
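The contraction identities (53)-(55) are easy to check numerically; a self-contained sketch (ours):

import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0    # even permutations
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0   # odd permutations
delta = np.eye(3)

# eq. (53): eps_{ijk} eps^{mnk} = delta_i^m delta_j^n - delta_j^m delta_i^n
lhs = np.einsum('ijk,mnk->ijmn', eps, eps)
rhs = np.einsum('im,jn->ijmn', delta, delta) - np.einsum('jm,in->ijmn', delta, delta)
assert np.allclose(lhs, rhs)

# eq. (54) and eq. (55)
assert np.allclose(np.einsum('ijk,mjk->im', eps, eps), 2.0 * delta)
assert np.isclose(np.einsum('ijk,ijk->', eps, eps), 6.0)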

Vector Operations in Tensor Notation. Tensor notation is very useful
in vector analysis, in particular when manipulating the multiple vector
products and vector identities.
First of all, the scalar and vector products of two vectors A and B are
given by

A · B = A_i B^i   (56)

and

(A × B)^i = ε^{ijk} A_j B_k   (57)

The triple product of three vectors A, B and C is then given by

[A, B, C] = A · (B × C) = ε^{ijk} A_i B_j C_k   (58)
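As a sketch (ours), eqs. (56)-(58) translate directly into einsum calls, which can be checked against NumPy's built-in dot and cross products.

import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
C = np.array([7.0, 8.0, 10.0])

assert np.isclose(np.einsum('i,i->', A, B), np.dot(A, B))                 # eq. (56)
assert np.allclose(np.einsum('ijk,j,k->i', eps, A, B), np.cross(A, B))    # eq. (57)
assert np.isclose(np.einsum('ijk,i,j,k->', eps, A, B, C),
                  np.dot(A, np.cross(B, C)))                              # eq. (58)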

Note that the position of the indices (up versus down) in Cartesian coordinates
is not important. However, it is still clearer this way: when you see one index
up and the same index down, you should immediately notice that this is a
contraction and that there is a summation over this index from 1 to 3. We
repeat once again that the name of such repeated indices is not important;
they are dummy indices, and one can rename them to any other letter if needed
(make sure that there are no other indices with that name in the given tensor
equation!).
By using the properties of the Levi-Civita symbol and the Kronecker symbol
one can now derive all vector identities. For example,

[(A × B) × (C × D)]_i = ε_{ijk} (A × B)^j (C × D)^k
                      = ε_{ijk} ε^{jmn} A_m B_n ε^{kpq} C_p D_q
                      = (δ_i^p δ_j^q − δ_j^p δ_i^q) ε^{jmn} A_m B_n C_p D_q
                      = (δ_i^p ε^{qmn} − δ_i^q ε^{pmn}) A_m B_n C_p D_q
                      = ε^{qmn} A_m B_n C_i D_q − ε^{pmn} A_m B_n C_p D_i   (59)
                      = [D, A, B] C_i − [C, A, B] D_i   (60)

In other words, we have just proven the following vector identity:

(A × B) × (C × D) = [D, A, B] C − [C, A, B] D   (61)

(Do not forget that the triple product is a scalar!)
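Finally, a quick numerical spot-check of identity (61) with arbitrary vectors (our sketch, not part of the notes):

import numpy as np

def triple(X, Y, Z):
    # Triple product [X, Y, Z] = X . (Y x Z), eq. (58).
    return np.dot(X, np.cross(Y, Z))

rng = np.random.default_rng(3)
A, B, C, D = rng.standard_normal((4, 3))

lhs = np.cross(np.cross(A, B), np.cross(C, D))
rhs = triple(D, A, B) * C - triple(C, A, B) * D   # eq. (61)
assert np.allclose(lhs, rhs)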
