UNIT – 7
Tensor Algebra
7.1. INTRODUCTION
7.2. n-DIMENSIONAL SPACE
7.3. CO-ORDINATE TRANSFORMATIONS
7.4. INDICIAL AND SUMMATION CONVENTIONS
7.5. DUMMY AND REAL INDICES
7.6. KRONECKER DELTA SYMBOL
7.7. SCALARS, CONTRAVARIANT VECTORS AND COVARIANT VECTORS
7.8. TENSORS OF HIGHER RANKS
7.9. SYMMETRIC AND ANTISYMMETRIC TENSORS
7.10. ALGEBRAIC OPERATIONS ON TENSORS
7.1. INTRODUCTION
Consider two sets of variables $(x^1, x^2, x^3, \ldots, x^n)$ and $(\bar{x}^1, \bar{x}^2, \bar{x}^3, \ldots, \bar{x}^n)$ which determine the co-ordinates of a point in an n-dimensional space in two different frames of reference. Let the two sets of variables be related to each other by the transformation equations
$$\bar{x}^1 = \phi^1(x^1, x^2, x^3, \ldots, x^n)$$
$$\bar{x}^2 = \phi^2(x^1, x^2, x^3, \ldots, x^n)$$
$$\ldots$$
$$\bar{x}^n = \phi^n(x^1, x^2, x^3, \ldots, x^n)$$
or briefly
$$\bar{x}^\mu = \phi^\mu(x^1, x^2, x^3, \ldots, x^i, \ldots, x^n), \qquad (\mu, i = 1, 2, 3, \ldots, n) \qquad \ldots(7.1)$$
where the functions $\phi^\mu$ are single-valued, continuously differentiable functions of the co-ordinates. It is essential that the n functions $\phi^\mu$ be independent.
Equations (7.1) can be solved for the co-ordinates $x^i$ as functions of the $\bar{x}^\mu$ to yield
$$x^i = \psi^i(\bar{x}^1, \bar{x}^2, \bar{x}^3, \ldots, \bar{x}^\mu, \ldots, \bar{x}^n) \qquad \ldots(7.2)$$
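A concrete two-dimensional illustration of (7.1) and (7.2) may help; the following sketch (an addition, not part of the original text) takes the familiar Cartesian–polar pair as the two co-ordinate systems and uses sympy only to check that the forward and inverse functions are consistent. The variable names are arbitrary.

```python
import sympy as sp

# Unbarred system: Cartesian (x^1, x^2); barred system: plane polar (r, theta).
x1, x2 = sp.symbols('x1 x2', real=True)
xb1 = sp.sqrt(x1**2 + x2**2)      # xbar^1 = phi^1(x^1, x^2): the radius
xb2 = sp.atan2(x2, x1)            # xbar^2 = phi^2(x^1, x^2): the polar angle

# Inverse functions psi^i of eqn (7.2): x^1 = r cos(theta), x^2 = r sin(theta)
r, th = sp.symbols('r theta', positive=True)
check = xb1.subs({x1: r*sp.cos(th), x2: r*sp.sin(th)})
print(sp.simplify(check))         # r  -> phi and psi are mutually inverse
```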
Thus the summation convention means dropping the sigma sign for an index that appears twice in a given term. In other words, the summation convention implies that any term in which an index appears twice is summed over the defined range of that index.
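As a small numerical illustration (added here, not from the text), the expression $a_i x^i$ stands for the full sum over i; `numpy.einsum` uses exactly this convention, so it can be used to check such expressions. The arrays below are arbitrary sample values.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
x = np.array([4.0, 5.0, 6.0])

explicit = sum(a[i] * x[i] for i in range(3))   # a_1 x^1 + a_2 x^2 + a_3 x^3
via_einsum = np.einsum('i,i->', a, x)           # the repeated index i is summed automatically
print(explicit, via_einsum)                     # 32.0 32.0
```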
7.5. DUMMY AND REAL INDICES
Any index which is repeated in a given term, so that the summation convention applies, is called a dummy index, and it may be replaced freely by any other index not already used in that term. For example, i is a dummy index in $a_i x^i$. Also i is a dummy index in eqn. (7.5a), so that equation (7.5a) may equally be written as
$$d\bar{x}^\mu = \frac{\partial \bar{x}^\mu}{\partial x^k}\,dx^k = \frac{\partial \bar{x}^\mu}{\partial x^\lambda}\,dx^\lambda. \qquad \ldots(7.5b)$$
Also, two or more dummy indices can be interchanged. In order to avoid confusion the same index must not be used more than twice in any single term. For example, the square of $a_i x^i$ will not be written as $a_i x^i\, a_i x^i$ but rather as $a_i a_j x^i x^j$.
Any index which is not repeated in a given term is called a real index. For example, $\mu$ is a real index in $a^\mu_i x^i$. A real index cannot be replaced by another real index, e.g.
$$a^\mu_i x^i \ne a^\nu_i x^i.$$
7.6. KRONECKER DELTA SYMBOL
The Kronecker delta symbol is defined as
$$\delta^j_k = \begin{cases} 1 & \text{if } j = k \\ 0 & \text{if } j \ne k \end{cases} \qquad \ldots(7.6)$$
Some properties of the Kronecker delta:
(i) If $x^1, x^2, x^3, \ldots, x^n$ are independent variables, then
$$\frac{\partial x^j}{\partial x^k} = \delta^j_k \qquad \ldots(7.7)$$
(ii) An obvious property of the Kronecker delta symbol is
$$\delta^j_k A_j = A_k. \qquad \ldots(7.8)$$
By the summation convention the left-hand side of this equation is summed with respect to j, and by the definition of the Kronecker delta the only surviving term is that for which j = k.
(iii) If we are dealing with n dimensions, then
$$\delta^j_j = \delta^k_k = n \qquad \ldots(7.9)$$
By the summation convention
$$\delta^j_j = \delta^1_1 + \delta^2_2 + \delta^3_3 + \cdots + \delta^n_n = 1 + 1 + 1 + \cdots + 1 = n$$
(iv) $\delta^i_j\,\delta^j_k = \delta^i_k.$ …(7.10)
By the summation convention
$$\delta^i_j\,\delta^j_k = \delta^i_1\delta^1_k + \delta^i_2\delta^2_k + \delta^i_3\delta^3_k + \cdots + \delta^i_i\delta^i_k + \cdots + \delta^i_n\delta^n_k = 0 + 0 + 0 + \cdots + 1\cdot\delta^i_k + \cdots + 0 = \delta^i_k$$
(v) $\dfrac{\partial x^j}{\partial \bar{x}^i}\,\dfrac{\partial \bar{x}^i}{\partial x^k} = \dfrac{\partial x^j}{\partial x^k} = \delta^j_k.$ …(7.11)
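The properties above are easy to check numerically; here is a brief numpy sketch (an addition, not from the text) in which the Kronecker delta is represented by the identity array and the sample values of n and $A_j$ are arbitrary.

```python
import numpy as np

n = 4
delta = np.eye(n)                                   # delta^j_k as an n x n array
A = np.array([2.0, -1.0, 0.5, 3.0])                 # arbitrary components A_j

print(np.allclose(np.einsum('jk,j->k', delta, A), A))            # (ii)  delta^j_k A_j = A_k
print(np.einsum('jj->', delta))                                  # (iii) delta^j_j = n -> 4.0
print(np.allclose(np.einsum('ij,jk->ik', delta, delta), delta))  # (iv)  delta^i_j delta^j_k = delta^i_k
```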
Generalised Kronecker Delta. The generalised Kronecker delta is symbolized as
$$\delta^{j_1 j_2 \ldots j_m}_{k_1 k_2 \ldots k_m}$$
and defined as follows:
The subscripts and superscripts can have any value from 1 to n. If at least two superscripts or at least two subscripts have the same value, or the subscripts are not the same set as the superscripts, then the generalised Kronecker delta is zero. For example
$$\delta^{ikk}_{jkl} = \delta^{ijk}_{lmm} = \delta^{ijk}_{klm} = 0.$$
If all the subscripts are separately different and the subscripts are the same set of numbers as the superscripts, then the generalised Kronecker delta has the value +1 or −1 according to whether it requires an even or odd number of permutations to arrange the superscripts in the same order as the subscripts.
For example
$$\delta^{123}_{123} = \delta^{123}_{231} = \delta^{1452}_{4125} = +1$$
and
$$\delta^{123}_{213} = \delta^{123}_{132} = \delta^{1452}_{4152} = -1.$$
It should be noted that
$$\delta^{i_1 i_2 i_3 \ldots i_n}_{1\,2\,3\,\ldots\,n}\;\delta^{i_1 i_2 i_3 \ldots i_n}_{j_1 j_2 j_3 \ldots j_n} = \delta^{1\,2\,3\,\ldots\,n}_{j_1 j_2 j_3 \ldots j_n}.$$
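The sign rule above is simply permutation parity, so it is straightforward to evaluate in code. The helper below is a sketch added for illustration (the function name and the cycle-counting approach are not from the text): it returns 0 for repeated indices or mismatched index sets, and otherwise the parity of the rearrangement.

```python
def gen_kronecker_delta(upper, lower):
    """Generalised Kronecker delta of the index tuples `upper` and `lower`."""
    if len(set(upper)) != len(upper) or set(upper) != set(lower):
        return 0                              # repeated index, or index sets differ
    perm = [upper.index(v) for v in lower]    # permutation taking `upper` into `lower`
    sign, seen = 1, [False] * len(perm)
    for start in range(len(perm)):            # sign from the cycle decomposition
        if seen[start]:
            continue
        length, j = 0, start
        while not seen[j]:
            seen[j] = True
            j = perm[j]
            length += 1
        if length % 2 == 0:
            sign = -sign
    return sign

print(gen_kronecker_delta((1, 2, 3), (2, 3, 1)))   # +1 (even permutation)
print(gen_kronecker_delta((1, 2, 3), (2, 1, 3)))   # -1 (odd permutation)
print(gen_kronecker_delta((1, 2, 3), (1, 1, 3)))   #  0 (repeated subscript)
```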
7.7. SCALARS, CONTRAVARIANT VECTORS AND COVARIANT VECTORS
(a) Scalars. Consider a function $\phi$ in a co-ordinate system of variables $x^i$ and let this function have the value $\bar{\phi}$ in another system of variables $\bar{x}^\mu$. Then if
$$\bar{\phi} = \phi$$
the function $\phi$ is said to be a scalar or an invariant or a tensor of order zero.
The quantity
$$\delta^i_i = \delta^1_1 + \delta^2_2 + \delta^3_3 + \cdots + \delta^n_n = n$$
is a scalar or an invariant.
(b) Contravariant Vectors. Consider a set of n quantities $A^1, A^2, A^3, \ldots, A^n$ in a system of variables $x^i$ and let these quantities have the values $\bar{A}^1, \bar{A}^2, \bar{A}^3, \ldots, \bar{A}^n$ in another co-ordinate system of variables $\bar{x}^\mu$. If these quantities obey the transformation relation
$$\bar{A}^\mu = \frac{\partial \bar{x}^\mu}{\partial x^i}A^i \qquad \ldots(7.12)$$
then the quantities $A^i$ are said to be the components of a contravariant vector or a contravariant tensor of first rank.
Any n functions can be chosen as the components of a contravariant vector in a system of variables $x^i$; equations (7.12) then determine the n components in any other system of variables $\bar{x}^\mu$.
Multiplying equation (7.12) by $\partial x^j/\partial \bar{x}^\mu$ and taking the sum over the index $\mu$ from 1 to n, we get
$$\frac{\partial x^j}{\partial \bar{x}^\mu}\bar{A}^\mu = \frac{\partial x^j}{\partial \bar{x}^\mu}\frac{\partial \bar{x}^\mu}{\partial x^i}A^i = \frac{\partial x^j}{\partial x^i}A^i = A^j$$
or
$$A^j = \frac{\partial x^j}{\partial \bar{x}^\mu}\bar{A}^\mu. \qquad \ldots(7.13)$$
As equations (7.12) and (7.14) are similar transformation equations, we can say that the differentials $dx^i$ form the components of a contravariant vector, whose components in any other system are the differentials $d\bar{x}^\mu$ of that system. Also we conclude that the components of a contravariant vector are actually the components of a contravariant tensor of rank one.
𝜇
Let us now consider a further change of variables from 𝑥 to x’p, then the new
components A’p must be given by
𝜇
𝜕𝑥 ′𝑃 𝜇 𝜕𝑥 ′𝑃 𝜕𝑥
𝐴 ′𝑃
= 𝜇 𝐴 = 𝜇 . 𝐴𝑖 (using 7.12)
𝜕𝑥 𝜕𝑥 𝜕𝑥 𝑖
𝜕𝑥 ,𝑝
= 𝐴𝑖 . …(7.15)
𝜕𝑥 𝜌
This equation has the same from as eqn. (4.12). This indicates that the transformations of
contravariant vectors form a group.
Note. A single superscript is always used to indicate a contravariant vector unless the contrary is explicitly stated.
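As a worked illustration (added here, not part of the text), the differentials of a curve written in plane polar co-ordinates transform into the Cartesian differentials exactly by the rule (7.12). The choice of co-ordinate systems and the use of sympy for the symbolic Jacobian are assumptions made only for this sketch.

```python
import sympy as sp

r, th, dr, dth = sp.symbols('r theta dr dtheta')
# Unbarred system: polar (r, theta); barred system: Cartesian (X, Y)
X, Y = r*sp.cos(th), r*sp.sin(th)

# Jacobian entries dxbar^mu / dx^i
J = sp.Matrix([[sp.diff(X, r), sp.diff(X, th)],
               [sp.diff(Y, r), sp.diff(Y, th)]])

A = sp.Matrix([dr, dth])      # contravariant components (dr, dtheta)
A_bar = J * A                 # eqn (7.12): components in the Cartesian system
print(A_bar)                  # [dr*cos(theta) - r*sin(theta)*dtheta, dr*sin(theta) + r*cos(theta)*dtheta]
```

These are precisely dX and dY obtained by differentiating X = r cos θ and Y = r sin θ, as expected for a contravariant vector.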
(c) Covariant vectors. Consider a set of n quantities $A_1, A_2, A_3, \ldots, A_n$ in a system of variables $x^i$ and let these quantities have the values $\bar{A}_1, \bar{A}_2, \bar{A}_3, \ldots, \bar{A}_n$ in another system of variables $\bar{x}^\mu$. If these quantities obey the transformation equations
$$\bar{A}_\mu = \frac{\partial x^i}{\partial \bar{x}^\mu}A_i \qquad \ldots(7.16)$$
then the quantities $A_i$ are said to be the components of a covariant vector or a covariant tensor of rank one.
Any n functions can be chosen as the components of a covariant vector in a system of variables $x^i$, and equations (7.16) determine the n components in the new system of variables $\bar{x}^\mu$. Multiplying equation (7.16) by $\partial \bar{x}^\mu/\partial x^j$ and taking the sum over the index $\mu$ from 1 to n, we get
$$\frac{\partial \bar{x}^\mu}{\partial x^j}\bar{A}_\mu = \frac{\partial \bar{x}^\mu}{\partial x^j}\frac{\partial x^i}{\partial \bar{x}^\mu}A_i = \frac{\partial x^i}{\partial x^j}A_i = A_j$$
thus
$$A_j = \frac{\partial \bar{x}^\mu}{\partial x^j}\bar{A}_\mu. \qquad \ldots(7.17)$$
Equations (7.17) represent the solution of equations (7.16).
Let us now consider a further change of variables from $\bar{x}^\mu$ to $x'^p$. Then the new components $A'_p$ must be given by
$$A'_p = \frac{\partial \bar{x}^\mu}{\partial x'^p}\bar{A}_\mu = \frac{\partial \bar{x}^\mu}{\partial x'^p}\frac{\partial x^i}{\partial \bar{x}^\mu}A_i = \frac{\partial x^i}{\partial x'^p}A_i. \qquad \ldots(7.18)$$
This equation has the same form as eqn. (7.16). This indicates that the transformations of covariant vectors form a group.
As
$$\frac{\partial \psi}{\partial \bar{x}^\mu} = \frac{\partial \psi}{\partial x^i}\frac{\partial x^i}{\partial \bar{x}^\mu} = \frac{\partial x^i}{\partial \bar{x}^\mu}\frac{\partial \psi}{\partial x^i},$$
it follows from (7.16) that the derivatives $\partial \psi/\partial x^i$ form the components of a covariant vector, whose components in any other system are the corresponding partial derivatives $\partial \psi/\partial \bar{x}^\mu$. This covariant vector is called grad $\psi$.
Note. A single subscript is always used to indicate a covariant vector unless the contrary is explicitly stated; an exception occurs in the notation of the co-ordinates.
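A quick symbolic check (an illustration added here; the scalar field and the co-ordinate choices are arbitrary) shows that the components of grad ψ computed in Cartesian co-ordinates and then transformed by (7.16) agree with the components computed directly in polar co-ordinates.

```python
import sympy as sp

x, y, r, th = sp.symbols('x y r theta', positive=True)
psi = x**2 + y**2                                   # a scalar field in the unbarred (Cartesian) system
A = sp.Matrix([sp.diff(psi, x), sp.diff(psi, y)])   # covariant components A_i = dpsi/dx^i

subs = {x: r*sp.cos(th), y: r*sp.sin(th)}           # barred system: (r, theta)
T = sp.Matrix([[sp.diff(subs[x], r), sp.diff(subs[x], th)],   # dx^i/dxbar^mu
               [sp.diff(subs[y], r), sp.diff(subs[y], th)]])

A_bar  = sp.simplify(T.T * A.subs(subs))            # eqn (7.16): Abar_mu = (dx^i/dxbar^mu) A_i
direct = sp.Matrix([sp.diff(psi.subs(subs), r), sp.diff(psi.subs(subs), th)])
print(sp.simplify(A_bar - direct))                  # zero matrix: grad(psi) is covariant
```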
7.8. TENSORS OF HIGHER RANKS
The laws of transformation of vectors are:
Contravariant: $\bar{A}^\mu = \dfrac{\partial \bar{x}^\mu}{\partial x^i}A^i$ …(7.12)
Covariant: $\bar{A}_\mu = \dfrac{\partial x^i}{\partial \bar{x}^\mu}A_i$ …(7.16)
(a) Contravariant tensor of second rank. Let us consider $n^2$ quantities $A^{ij}$ (here i and j take values from 1 to n independently) in a system of variables $x^i$ and let these quantities have the values $\bar{A}^{\mu\nu}$ in another system of variables $\bar{x}^\mu$. If these quantities obey the transformation equations
$$\bar{A}^{\mu\nu} = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}A^{ij} \qquad \ldots(7.19)$$
then the quantities $A^{ij}$ are said to be the components of a contravariant tensor of second rank.
The transformation law (7.19) is the generalisation of the transformation law (7.12). Any set of $n^2$ quantities can be chosen as the components of a contravariant tensor of second rank in a system of variables $x^i$, and then equations (7.19) determine the $n^2$ components in any other system of variables $\bar{x}^\mu$.
(b) Covariant tensor of second rank. If $n^2$ quantities $A_{ij}$ in a system of variables $x^i$ are related to another $n^2$ quantities $\bar{A}_{\mu\nu}$ in another system of variables $\bar{x}^\mu$ by the transformation equations
$$\bar{A}_{\mu\nu} = \frac{\partial x^i}{\partial \bar{x}^\mu}\frac{\partial x^j}{\partial \bar{x}^\nu}A_{ij} \qquad \ldots(7.20)$$
then the quantities $A_{ij}$ are said to be the components of a covariant tensor of second rank.
The transformation law (7.20) is a generalisation of (7.16). Any set of $n^2$ quantities can be chosen as the components of a covariant tensor of second rank in a system of variables $x^i$, and then equations (7.20) determine the $n^2$ components in any other system of variables $\bar{x}^\mu$.
(c) Mixed tensor of second rank. If $n^2$ quantities $A^i_j$ in a system of variables $x^i$ are related to another $n^2$ quantities $\bar{A}^\mu_\nu$ in another system of variables $\bar{x}^\mu$ by the transformation equations
$$\bar{A}^\mu_\nu = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial x^j}{\partial \bar{x}^\nu}A^i_j \qquad \ldots(7.21)$$
then the quantities $A^i_j$ are said to be the components of a mixed tensor of second rank.
An important example of a mixed tensor of second rank is the Kronecker delta $\delta^i_j$.
(d) Tensors of higher rank; rank of a tensor. Tensors of higher ranks are defined by similar laws. The rank of a tensor indicates only the number of indices attached to each of its components. For example, $A^{ijk}_l$ are the components of a mixed tensor of rank 4, contravariant of rank 3 and covariant of rank 1, if they transform according to the equation
$$\bar{A}^{\mu\nu\sigma}_\rho = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}A^{ijk}_l \qquad \ldots(7.22)$$
The number of dimensions raised to the power of the rank gives the number of components of the tensor: a tensor of rank r in n-dimensional space has $n^r$ components. Thus the rank of a tensor gives the number of modes in which a physical quantity changes when passing from one system to another which is in rotation relative to the first. Obviously a quantity that remains unchanged when the axes are rotated is a tensor of rank zero. Tensors of rank zero are scalars or invariants, and similarly tensors of rank one are vectors.
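For a linear change of co-ordinates the Jacobian factors in (7.19)–(7.21) are constant matrices, so the transformation laws reduce to index contractions that can be checked directly with numpy. The following sketch is an added illustration; the matrix L and the component arrays are arbitrary choices.

```python
import numpy as np

# Linear change of coordinates xbar^mu = L[mu, i] x^i, so dxbar^mu/dx^i = L[mu, i]
L = np.array([[2.0, 1.0],
              [0.0, 3.0]])
Linv = np.linalg.inv(L)                       # dx^i/dxbar^mu = Linv[i, mu]

A_up = np.array([[1.0, 2.0], [3.0, 4.0]])     # A^{ij}
A_dn = np.array([[1.0, 0.5], [0.5, 2.0]])     # A_{ij}
A_mx = np.array([[1.0, 2.0], [0.0, 5.0]])     # A^i_j

A_up_bar = np.einsum('mi,nj,ij->mn', L, L, A_up)          # eqn (7.19)
A_dn_bar = np.einsum('im,jn,ij->mn', Linv, Linv, A_dn)    # eqn (7.20)
A_mx_bar = np.einsum('mi,jn,ij->mn', L, Linv, A_mx)       # eqn (7.21)

# The sum A^i_i of a mixed tensor is the same in both systems (it is a scalar)
print(np.isclose(np.trace(A_mx), np.trace(A_mx_bar)))     # True
```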
7.9. SYMMETRIC AND ANTISYMMETRIC TENSORS
(a) Symmetric tensors. If two contravariant or covariant indices can be interchanged without altering the tensor, then the tensor is said to be symmetric with respect to these two indices.
For example, if
$$A^{ij} = A^{ji} \qquad \text{or} \qquad A_{ij} = A_{ji} \qquad \ldots(7.23)$$
then the contravariant tensor $A^{ij}$ or the covariant tensor $A_{ij}$ of second rank is said to be symmetric.
For a tensor of higher rank $A^{ijk}_l$, if
$$A^{ijk}_l = A^{jik}_l$$
then the tensor $A^{ijk}_l$ is said to be symmetric with respect to the indices i and j.
The symmetry property of a tensor is independent of the co-ordinate system used. So if a tensor is symmetric with respect to two indices in any co-ordinate system, it remains symmetric with respect to these two indices in any other co-ordinate system.
This can be seen as follows:
If the tensor $A^{ijk}_l$ is symmetric with respect to the first two indices i and j, we have
$$A^{ijk}_l = A^{jik}_l \qquad \ldots(7.24)$$
We have
$$\bar{A}^{\mu\nu\sigma}_\rho = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}A^{ijk}_l = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}A^{jik}_l \qquad \text{[using (7.24)]}$$
Now interchanging the dummy indices i and j, we get
$$\bar{A}^{\mu\nu\sigma}_\rho = \frac{\partial \bar{x}^\mu}{\partial x^j}\frac{\partial \bar{x}^\nu}{\partial x^i}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}A^{ijk}_l = \bar{A}^{\nu\mu\sigma}_\rho$$
i.e., the given tensor is again symmetric with respect to the first two indices in the new co-ordinate system. This result can also be proved for covariant indices. Thus the symmetry property of a tensor is independent of the co-ordinate system.
Let $A^{ijk}_l$ be symmetric with respect to two indices of which one, i, is contravariant and the other, l, is covariant; then we have
$$A^{ijk}_l = A^{ljk}_i \qquad \ldots(7.25)$$
We have
$$\bar{A}^{\mu\nu\sigma}_\rho = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}A^{ijk}_l = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}A^{ljk}_i \qquad \text{[using (7.25)]}$$
Now interchanging the dummy indices i and l, we have
$$\bar{A}^{\mu\nu\sigma}_\rho = \frac{\partial \bar{x}^\mu}{\partial x^l}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^i}{\partial \bar{x}^\rho}A^{ijk}_l = \frac{\partial x^i}{\partial \bar{x}^\rho}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial \bar{x}^\mu}{\partial x^l}A^{ijk}_l \qquad \ldots(7.26)$$
According to the tensor transformation law,
$$\bar{A}^{\rho\nu\sigma}_\mu = \frac{\partial \bar{x}^\rho}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\mu}A^{ijk}_l \qquad \ldots(7.27)$$
Comparing (7.26) and (7.27), we see that
$$\bar{A}^{\mu\nu\sigma}_\rho \ne \bar{A}^{\rho\nu\sigma}_\mu$$
i.e., such symmetry is not preserved after a change of co-ordinate system. But the Kronecker delta, which is a mixed tensor, is symmetric with respect to its indices.
(b) Antisymmetric tensors or skew-symmetric tensors. A tensor, each component of which alters in sign but not in magnitude when two contravariant or covariant indices are interchanged, is said to be skew-symmetric or antisymmetric with respect to these two indices.
For example, if
$$A^{ij} = -A^{ji} \qquad \text{or} \qquad A_{ij} = -A_{ji} \qquad \ldots(7.28)$$
then the contravariant tensor $A^{ij}$ or the covariant tensor $A_{ij}$ of second rank is antisymmetric; or, for a tensor of higher rank $A^{ijk}_l$, if
$$A^{ijk}_l = -A^{ikj}_l$$
then the tensor $A^{ijk}_l$ is antisymmetric with respect to the indices j and k.
The skew-symmetry property of a tensor is also independent of the choice of co-ordinate system. So if a tensor is skew-symmetric with respect to two indices in any co-ordinate system, it remains skew-symmetric with respect to these two indices in any other co-ordinate system.
If the tensor $A^{ijk}_l$ is antisymmetric with respect to the first two indices i and j, we have
$$A^{ijk}_l = -A^{jik}_l \qquad \ldots(7.29)$$
We have
$$\bar{A}^{\mu\nu\sigma}_\rho = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}A^{ijk}_l = -\frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}A^{jik}_l \qquad \text{[using (7.29)]}$$
Interchanging the dummy indices i and j then gives $\bar{A}^{\mu\nu\sigma}_\rho = -\bar{A}^{\nu\mu\sigma}_\rho$, i.e., the given tensor is again antisymmetric with respect to the first two indices in the new co-ordinate system. Thus the antisymmetry property is retained under co-ordinate transformation.
The antisymmetry property, like the symmetry property, cannot be defined with respect to two indices of which one is contravariant and the other covariant.
If all the indices of a contravariant or covariant tensor can be interchanged so that
its components change sign at each interchange of a pair of indices, the tensor is
said to be antisymmetric, i.e.,
$$A^{ijk} = -A^{jik} = +A^{jki}.$$
Thus we may state that a contravariant or covariant tensor is antisymmetric if its
components change sign under an odd permutation of its indices and do not
change sign under an even permutation of its indices.
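The invariance of (anti)symmetry under a change of co-ordinates is easy to confirm numerically; the following numpy sketch (an added illustration with an arbitrary Jacobian and random components) applies the rank-2 law (7.19) to a symmetric and to an antisymmetric tensor.

```python
import numpy as np

rng = np.random.default_rng(0)
L = rng.normal(size=(3, 3))                 # Jacobian of an arbitrary linear change of coordinates

S = rng.normal(size=(3, 3)); S = S + S.T    # symmetric components A^{ij}
T = rng.normal(size=(3, 3)); T = T - T.T    # antisymmetric components A^{ij}

S_bar = np.einsum('mi,nj,ij->mn', L, L, S)  # eqn (7.19)
T_bar = np.einsum('mi,nj,ij->mn', L, L, T)

print(np.allclose(S_bar, S_bar.T))          # True: symmetry survives the transformation
print(np.allclose(T_bar, -T_bar.T))         # True: antisymmetry survives the transformation
```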
7.10. ALGEBRAIC OPERATIONS ON TENSORS
(iii) Outer product of tensors: The outer product of the tensors $A^{ij}_k$ and $B^l_m$,
$$A^{ij}_k B^l_m = C^{ijl}_{km} \quad \text{(say)}, \qquad \ldots(7.41)$$
is a tensor of rank 5 (= 3 + 2).
For proof of this statement we write the transformation equations of the given tensors as
$$\bar{A}^{\mu\nu}_\sigma = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial x^k}{\partial \bar{x}^\sigma}A^{ij}_k \qquad \ldots(7.42)$$
$$\bar{B}^\rho_\lambda = \frac{\partial \bar{x}^\rho}{\partial x^l}\frac{\partial x^m}{\partial \bar{x}^\lambda}B^l_m. \qquad \ldots(7.43)$$
Multiplying (7.42) and (7.43), we get
$$\bar{A}^{\mu\nu}_\sigma \bar{B}^\rho_\lambda = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial x^k}{\partial \bar{x}^\sigma}\frac{\partial \bar{x}^\rho}{\partial x^l}\frac{\partial x^m}{\partial \bar{x}^\lambda}A^{ij}_k B^l_m$$
or
$$\bar{C}^{\mu\nu\rho}_{\sigma\lambda} = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\rho}{\partial x^l}\frac{\partial x^k}{\partial \bar{x}^\sigma}\frac{\partial x^m}{\partial \bar{x}^\lambda}C^{ijl}_{km} \qquad \ldots(7.44)$$
which is the transformation law for a tensor of rank 5. Hence the outer product of the two tensors $A^{ij}_k$ and $B^l_m$ is a tensor $C^{ijl}_{km}$ of rank (3 + 2 =) 5.
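In numpy the outer product of (7.41) is a single einsum with no repeated indices; the component counts behave exactly as described in section 7.8. The arrays below are arbitrary sample data added only for illustration.

```python
import numpy as np

n = 2
A = np.arange(n**3, dtype=float).reshape(n, n, n)   # components A^{ij}_k (rank 3)
B = np.arange(n**2, dtype=float).reshape(n, n)      # components B^l_m   (rank 2)

# eqn (7.41): C^{ijl}_{km} = A^{ij}_k B^l_m -- no index is repeated, so nothing is summed
C = np.einsum('ijk,lm->ijlkm', A, B)
print(C.shape)                                      # (2, 2, 2, 2, 2): n**5 = 32 components
```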
(iv) Contraction of tensors: The algebraic operation by which the rank of a mixed tensor is lowered by 2 is known as contraction. In the process of contraction one contravariant index and one covariant index of a mixed tensor are set equal and the repeated index is summed over; the result is a tensor of rank lower by two than the original tensor.
For example, consider a mixed tensor $A^{ijk}_{lm}$ of rank 5 with contravariant indices i, j, k and covariant indices l, m. The transformation law of the given tensor is
$$\bar{A}^{\mu\nu\sigma}_{\rho\lambda} = \frac{\partial \bar{x}^\mu}{\partial x^i}\frac{\partial \bar{x}^\nu}{\partial x^j}\frac{\partial \bar{x}^\sigma}{\partial x^k}\frac{\partial x^l}{\partial \bar{x}^\rho}\frac{\partial x^m}{\partial \bar{x}^\lambda}A^{ijk}_{lm} \qquad \ldots(7.45)$$
(b) As another example, consider two tensors of rank 1, $A^i$ and $B_j$. The outer product of $A^i$ and $B_j$ is
$$A^i B_j = C^i_j.$$
Applying the contraction process by setting i = j, we get
$$A^j B_j = C^j_j \qquad \text{(a scalar or a tensor of rank zero)}.$$
Thus the inner product of two tensors of rank one is a tensor of rank zero (i.e., an invariant).
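A minimal numerical check of this contraction (added here; the component values are arbitrary): form the mixed rank-2 outer product, then set the two indices equal and sum.

```python
import numpy as np

A = np.array([1.0, 2.0, 3.0])        # contravariant components A^i
B = np.array([4.0, 0.5, 2.0])        # covariant components B_j

C = np.einsum('i,j->ij', A, B)       # outer product: mixed tensor C^i_j of rank 2
scalar = np.einsum('ii->', C)        # contraction i = j: rank lowered by 2
print(scalar, A @ B)                 # 11.0 11.0 -- the inner product A^i B_i is a scalar
```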
(vi) Quotient law: In tensor analysis it is often necessary to ascertain whether a given entity is a tensor or not. The direct method requires us to find out whether the given entity obeys the tensor transformation law or not. In practice this is troublesome, and a simpler test is provided by a law known as the quotient law, which states:
An entity whose inner product with an arbitrary tensor (contravariant or covariant) is
a tensor, is itself a tensor.
(vii) Extension of rank: The rank of a tensor can be extended by differentiating each of its components with respect to the variables $x^i$.
As an example, consider a simple case in which the original tensor is of rank zero, i.e., a scalar $S(x^i)$ whose derivatives with respect to the variables $x^i$ are $\partial S/\partial x^i$. In another system of variables $\bar{x}^\mu$ the scalar is $\bar{S}(\bar{x}^\mu)$, such that
$$\frac{\partial \bar{S}}{\partial \bar{x}^\mu} = \frac{\partial S}{\partial x^i}\frac{\partial x^i}{\partial \bar{x}^\mu} = \frac{\partial x^i}{\partial \bar{x}^\mu}\frac{\partial S}{\partial x^i} \qquad \ldots(7.47)$$
This shows that $\partial S/\partial x^i$ transforms like the components of a tensor of rank one. Thus the differentiation of a tensor of rank zero gives a tensor of rank one. In general we may say that the differentiation of a tensor with respect to the variables $x^i$ yields a new tensor of rank one greater than the original tensor.
The rank of a tensor can also be extended when a tensor depends upon another tensor and differentiation with respect to that tensor is performed. As an example, consider a tensor S of rank zero (i.e., a scalar) depending upon another tensor $A_{ij}$; then
$$\frac{\partial S}{\partial A_{ij}} = B^{ij} = \text{a tensor of rank 2}. \qquad \ldots(7.48)$$
Thus the rank of the tensor of rank zero has been extended by 2.