
TENSOR ANALYSIS

Dr. Ranjit Baishya


Associate Professor
Department of Physics
J. N. College, Boko

 Introduction of Tensor:-

Scalars are specified by magnitude only; vectors have magnitude as well
as direction. Tensors, however, are associated with a magnitude and two
or more directions.

Tensor analysis is well suited to the mathematical formulation of
natural laws in forms that are invariant with respect to different
frames of reference. That is why Einstein used tensors in formulating
his Theory of Relativity.

 Introduction of Tensor:-

A scalar is a zero-order tensor. A vector is a first-order tensor. A
matrix is a second-order tensor. For example, consider the stress
tensor:

\sigma = \begin{pmatrix} \sigma_{xx} & \sigma_{xy} & \sigma_{xz} \\
                         \sigma_{yx} & \sigma_{yy} & \sigma_{yz} \\
                         \sigma_{zx} & \sigma_{zy} & \sigma_{zz} \end{pmatrix}
 Introduction of Tensor:-

Another way to write a vector is in Cartesian form:

\vec{x} = x\hat{i} + y\hat{j} + z\hat{k} = (x, y, z)      ---- (1)

The coordinates x, y and z can also be written as x_1, x_2, x_3. Thus the
vector can be written as

\vec{x} = (x_1, x_2, x_3)      ---- (2)

or as

\vec{x} = (x_i),  i = 1..3      ---- (3)

or in index notation, simply as

\vec{x} = x_i      ---- (4)

where i is understood to be a dummy variable running from 1 to 3.

Thus x_i, x_j and x_p all refer to the same vector (x_1, x_2, x_3), as the
index (subscript) always runs from 1 to 3.
 Introduction of Tensor:-
Magnitude of a vector:

|\vec{A}|^2 = \vec{A} \cdot \vec{A} = A_i A_i      ---- (6)

A tensor can be constructed by multiplying two vectors (not the scalar
product):

A_i B_j = (A_i B_j),  i = 1..3,  j = 1..3
        = \begin{pmatrix} A_1 B_1 & A_1 B_2 & A_1 B_3 \\
                          A_2 B_1 & A_2 B_2 & A_2 B_3 \\
                          A_3 B_1 & A_3 B_2 & A_3 B_3 \end{pmatrix}      ---- (7)

Two free indices (i, j) mean the result is a second-order tensor.

Now consider the expression

A_i A_j B_j

This is a first-order tensor, or vector, because there is only one free
index, i (the j's are paired, implying summation):

A_i A_j B_j = (A_1 B_1 + A_2 B_2 + A_3 B_3)(A_1, A_2, A_3)      ---- (8)

That is, scalar times vector = vector.
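The index-notation rules above can be checked numerically. The following
is a small sketch (not part of the original slides) using NumPy's einsum,
which implements the summation convention directly; the component values
of A and B are arbitrary assumed numbers:

# Minimal NumPy sketch of the index-notation operations above.
# A and B are arbitrary assumed example vectors, for illustration only.
import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# Eq. (6): |A|^2 = A_i A_i  (repeated index i implies summation)
mag_sq = np.einsum('i,i->', A, A)

# Eq. (7): outer product A_i B_j -- two free indices, a second-order tensor
outer = np.einsum('i,j->ij', A, B)

# Eq. (8): A_i A_j B_j -- j is summed, i stays free, so the result is a vector
vec = np.einsum('i,j,j->i', A, A, B)

print(mag_sq)                               # 14.0
print(outer.shape)                          # (3, 3)
print(np.allclose(vec, A * np.dot(A, B)))   # True: scalar (A_j B_j) times vector A_i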
CO-ORDINATE TRANSFORMATIONS

Tensor analysis is intimately connected with the subject of co-ordinate
transformations.

Consider two sets of variables (x^1, x^2, x^3, ..., x^n) and
(x'^1, x'^2, x'^3, ..., x'^n), which determine the co-ordinates of a
point in an n-dimensional space in two different frames of reference.
Let the two sets of variables be related to each other by the
transformation equations

x'^1 = x'^1(x^1, x^2, x^3, ..., x^n)
x'^2 = x'^2(x^1, x^2, x^3, ..., x^n)
....................................
x'^n = x'^n(x^1, x^2, x^3, ..., x^n)

or briefly

x'^{\mu} = x'^{\mu}(x^1, x^2, x^3, ..., x^n),   \mu = 1, 2, 3, ..., n      ---- (9)
where the functions are single-valued, continuously differentiable
functions of the co-ordinates. It is essential that the n functions be
independent.

Equations (9) can be solved for the co-ordinates x^i as functions of the
x'^{\mu} to yield

x^i = x^i(x'^1, x'^2, x'^3, ..., x'^{\mu}, ..., x'^n)      ---- (10)

Equations (9) and (10) are said to define co-ordinate transformations.
From equations (9) the differentials dx'^{\mu} transform as

dx'^{\mu} = \sum_{i=1}^{n} (\partial x'^{\mu}/\partial x^{i}) \, dx^{i}      ---- (11)
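As an illustration of eq. (11) (not part of the original slides), the
sketch below computes the matrix of partial derivatives
\partial x'^{\mu}/\partial x^{i} symbolically for an assumed example
transformation: plane polar co-ordinates (r, theta) as functions of
Cartesian (x, y).

# Illustrative sketch (assumed example): x'^1 = r, x'^2 = theta as functions
# of x^1 = x, x^2 = y, and the Jacobian dx'^mu/dx^i appearing in eq. (11).
import sympy as sp

x, y = sp.symbols('x y', positive=True)
r = sp.sqrt(x**2 + y**2)
theta = sp.atan2(y, x)

primed = [r, theta]        # the new co-ordinates x'^mu (assumed example)
unprimed = [x, y]          # the old co-ordinates x^i

# Jacobian J[mu][i] = partial x'^mu / partial x^i
J = sp.simplify(sp.Matrix([[sp.diff(xp, xi) for xi in unprimed] for xp in primed]))
print(J)

# The differentials then transform as  dx'^mu = J[mu][i] dx^i
dx, dy = sp.symbols('dx dy')
print(sp.simplify(J * sp.Matrix([dx, dy])))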
 SUMMATION CONVENTION:

Whenever an index appears twice in the same term, a summation over that
index from 1 to n is implied and the summation sign is omitted. With this
convention the transformation of the differentials, eq. (11), is written
simply as

dx'^{\mu} = (\partial x'^{\mu}/\partial x^{i}) \, dx^{i}

the repeated (dummy) index i being summed over, while the free index
\mu takes each of the values 1, 2, ..., n.
Kronecker delta \delta_{ij}:

\delta_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}
            = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}      ---- (12)

Since there are two free indices, the result is a second-order tensor, or
matrix. The Kronecker delta corresponds to the identity matrix.

Third-order Levi-Civita tensor:

\epsilon_{ijk} = \begin{cases}
  +1 & \text{if } i, j, k \text{ is a cyclic permutation: } 1,2,3;\ 2,3,1 \text{ or } 3,1,2 \\
  -1 & \text{if } i, j, k \text{ is an anticyclic permutation: } 1,3,2;\ 3,2,1 \text{ or } 2,1,3 \\
   0 & \text{otherwise (any index repeated)}
\end{cases}

Vectorial cross product:

(\vec{A} \times \vec{B})_i = \epsilon_{ijk} A_j B_k

One free index, so the result must be a vector.
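The cross-product formula can be verified numerically. This sketch (not
from the slides) builds the Levi-Civita symbol explicitly and contracts
it with two arbitrary assumed vectors:

# Sketch: build the Levi-Civita symbol and verify (A x B)_i = eps_ijk A_j B_k.
# A and B are arbitrary assumed example vectors.
import numpy as np

eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:   # cyclic permutations -> +1
    eps[i, j, k] = +1.0
for i, j, k in [(0, 2, 1), (2, 1, 0), (1, 0, 2)]:   # anticyclic permutations -> -1
    eps[i, j, k] = -1.0

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

cross_index = np.einsum('ijk,j,k->i', eps, A, B)    # one free index i: a vector
print(np.allclose(cross_index, np.cross(A, B)))     # True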
Some properties of the Kronecker delta:

(i) If x_1, x_2, x_3, ..., x_n are independent variables, then

\partial x_i / \partial x_j = \delta_{ij}      ---- (13)

(ii) An obvious property of the Kronecker delta symbol is

\delta_{ij} \delta_{jk} = \delta_{ik}      ---- (14)

since by the summation convention the summation on the left-hand side is
with respect to j, and by the definition of the Kronecker delta the only
surviving term is that for which j = k.

(iii) If we are dealing with n dimensions, then, by the summation
convention,

\delta_{ii} = \delta_{11} + \delta_{22} + \delta_{33} + ... + \delta_{nn}
            = 1 + 1 + 1 + ... + 1
            = n

(iv) \delta_{ij} A_j = \delta_{i1} A_1 + \delta_{i2} A_2 + \delta_{i3} A_3 + ... + \delta_{ii} A_i + ... + \delta_{in} A_n
                     = 0 + 0 + 0 + ... + 1 \cdot A_i + ... + 0
                     = A_i

(v) (\partial x_i / \partial x'_{\mu})(\partial x'_{\mu} / \partial x_j) = \partial x_i / \partial x_j = \delta_{ij}
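Properties (ii)-(iv) are easy to check numerically; a minimal sketch (not
part of the slides), with np.eye standing in for \delta_{ij} and an
assumed dimension n = 3:

# Sketch: checking the Kronecker-delta identities numerically.
import numpy as np

n = 3
delta = np.eye(n)
A = np.array([7.0, -2.0, 5.0])             # arbitrary assumed vector

# (ii) delta_ij delta_jk = delta_ik
print(np.allclose(np.einsum('ij,jk->ik', delta, delta), delta))    # True

# (iii) delta_ii = n  (trace of the identity)
print(np.einsum('ii->', delta))                                    # 3.0

# (iv) delta_ij A_j = A_i
print(np.allclose(np.einsum('ij,j->i', delta, A), A))              # True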
CONTRAVARIANT VECTORS AND COVARIANT VECTORS

(a) Contravariant Vectors.

Consider a set of n quantities A^1, A^2, A^3, A^4, ..., A^n in a system
of variables x^i, and let these quantities have values A'^1, A'^2, A'^3,
A'^4, ..., A'^n in another co-ordinate system of variables x'^{\mu}. If
these quantities obey the transformation relation

A'^{\mu} = (\partial x'^{\mu}/\partial x^{i}) A^{i}      ---- (15)

then the quantities A^i are said to be the components of a contravariant
vector, or a contravariant tensor of first rank.

CONTRAVARIANT VECTORS AND COVARIANT VECTORS

(b) Covariant Vectors.

Consider a set of n quantities A_1, A_2, A_3, A_4, ..., A_n in a system
of variables x^i, and let these quantities have values A'_1, A'_2, A'_3,
..., A'_n in another co-ordinate system of variables x'^{\mu}. If these
quantities obey the transformation relation

A'_{\mu} = (\partial x^{i}/\partial x'^{\mu}) A_{i}      ---- (16)

then the quantities A_i are said to be the components of a covariant
vector, or a covariant tensor of first rank.
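A numerical sketch of the two transformation laws (not part of the
original slides). The example transformation, Cartesian (x, y) to plane
polar (r, theta), the evaluation point and the component values are all
illustrative assumptions; the contraction computed at the end anticipates
the invariance result proved later.

# Sketch: a velocity-like vector transforms contravariantly (eq. 15), a
# gradient-like vector transforms covariantly (eq. 16); their contraction
# is unchanged by the change of co-ordinates.
import numpy as np

x, y = 1.0, 2.0                                   # assumed evaluation point
r = np.hypot(x, y)

# J[mu, i] = d x'^mu / d x^i  for (x'^1, x'^2) = (r, theta)
J = np.array([[x / r,       y / r],
              [-y / r**2,   x / r**2]])
J_inv = np.linalg.inv(J)                          # (d x^i / d x'^mu) indexed [i, mu]

V = np.array([0.3, -1.1])      # assumed contravariant components V^i
W = np.array([2.0, 0.5])       # assumed covariant components W_i

V_prime = np.einsum('mi,i->m', J, V)              # eq. (15)
W_prime = np.einsum('im,i->m', J_inv, W)          # eq. (16)

# The scalar W_i V^i is invariant under the change of co-ordinates.
print(np.allclose(np.dot(W, V), np.dot(W_prime, V_prime)))   # True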

TENSORS OF HIGHER RANKS

The laws of transformation of vectors are:

Contravariant ...  A'^{\mu} = (\partial x'^{\mu}/\partial x^{i}) A^{i}      ---- (15)

Covariant ...      A'_{\mu} = (\partial x^{i}/\partial x'^{\mu}) A_{i}      ---- (16)

(a) Contravariant tensors of second rank:

Let us consider n^2 quantities A^{ij} (here i and j take the values from
1 to n independently) in a system of variables x^i, and let these
quantities have values A'^{\mu\nu} in another system of variables x'^{\mu}.

If these quantities obey the transformation equations

A'^{\mu\nu} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j}) A^{ij}      ---- (17)

then the quantities A^{ij} are said to be the components of a
contravariant tensor of second rank.

TENSORS OF HIGHER RANKS

The laws of transformation of vectors are:

Contravariant ...  A'^{\mu} = (\partial x'^{\mu}/\partial x^{i}) A^{i}      ---- (15)

Covariant ...      A'_{\mu} = (\partial x^{i}/\partial x'^{\mu}) A_{i}      ---- (16)

(b) Covariant tensors of second rank:

Let us consider n^2 quantities A_{ij} (here i and j take the values from
1 to n independently) in a system of variables x^i, and let these
quantities have values A'_{\mu\nu} in another system of variables x'^{\mu}.

If these quantities obey the transformation equations

A'_{\mu\nu} = (\partial x^{i}/\partial x'^{\mu})(\partial x^{j}/\partial x'^{\nu}) A_{ij}      ---- (18)

then the quantities A_{ij} are said to be the components of a covariant
tensor of second rank.

(c) Mixed tensor of second rank:

If n^2 quantities A^{i}_{j} in a system of variables x^i are related to
another n^2 quantities A'^{\mu}_{\nu} in another system of variables
x'^{\mu} by the transformation equations

A'^{\mu}_{\nu} = (\partial x'^{\mu}/\partial x^{i})(\partial x^{j}/\partial x'^{\nu}) A^{i}_{j}      ---- (19)

then the quantities A^{i}_{j} are said to be the components of a mixed
tensor of second rank.

(d) Tensors of higher ranks:

Tensors of higher ranks are defined by similar laws. The rank of a
tensor simply indicates the number of indices attached to each of its
components. For example, the A^{ijk}_{p} are the components of a mixed
tensor of rank 4, contravariant of rank 3 and covariant of rank 1, if
they transform according to the equation

A'^{\mu\nu\sigma}_{q} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{p}/\partial x'^{q}) A^{ijk}_{p}      ---- (20)
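A second-rank case of eq. (17) can be checked numerically. This sketch
(not from the slides) assumes a linear change of co-ordinates, a
rotation, so that the Jacobian \partial x'^{\mu}/\partial x^{i} is
simply the constant rotation matrix; the components of A are arbitrary
assumed values.

# Sketch: eq. (17) for a second-rank contravariant tensor under an assumed
# rotation x'^mu = R[mu, i] x^i, whose Jacobian dx'^mu/dx^i is just R.
import numpy as np

angle = 0.4                                        # assumed rotation angle
R = np.array([[np.cos(angle), -np.sin(angle), 0.0],
              [np.sin(angle),  np.cos(angle), 0.0],
              [0.0,            0.0,           1.0]])

A = np.arange(9.0).reshape(3, 3)                   # assumed components A^{ij}

# Eq. (17): A'^{mu nu} = (dx'^mu/dx^i)(dx'^nu/dx^j) A^{ij}
A_prime = np.einsum('mi,nj,ij->mn', R, R, A)

# For a linear transformation this is the familiar matrix form R A R^T.
print(np.allclose(A_prime, R @ A @ R.T))           # True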
RANK OF A TENSOR

The number of dimensions raised to the power of the rank gives the
number of components of the tensor: a tensor of rank r in n-dimensional
space has n^r components. Thus the rank of a tensor tells us in how many
ways a physical quantity changes when we pass from one system to another
that is rotated relative to the first.

Obviously, a quantity that remains unchanged when the axes are rotated
is a tensor of rank zero. Tensors of rank zero are scalars, or
invariants, and similarly tensors of rank one are vectors.

SYMMETRIC AND ANTISYMMETRIC TENSORS

(a) Symmetric Tensors:

If two contravariant or covariant indices can be interchanged without
altering the tensor, then the tensor is said to be symmetric with
respect to these two indices.

For example, if

A^{ij} = A^{ji}    or    A_{ij} = A_{ji}      ---- (21)

then the contravariant tensor of second rank A^{ij}, or the covariant
tensor A_{ij}, is said to be symmetric.

For a tensor A^{ijk}_{l} of higher rank, if

A^{ijk}_{l} = A^{jik}_{l}

then the tensor A^{ijk}_{l} is said to be symmetric with respect to the
indices i and j.

Theorem 1: The symmetry property of a tensor is independent of the
co-ordinate system used.

If the tensor A^{ijk}_{l} is symmetric with respect to the first two
indices i and j, we have

A^{ijk}_{l} = A^{jik}_{l}      ---- (22)

Now

A'^{\mu\nu\sigma}_{p} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{p}) A^{ijk}_{l}

                      = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{p}) A^{jik}_{l}      [using eq. (22)]

Again, interchanging the dummy indices i and j, we get

A'^{\mu\nu\sigma}_{p} = (\partial x'^{\mu}/\partial x^{j})(\partial x'^{\nu}/\partial x^{i})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{p}) A^{ijk}_{l}

                      = (\partial x'^{\nu}/\partial x^{i})(\partial x'^{\mu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{p}) A^{ijk}_{l}

                      = A'^{\nu\mu\sigma}_{p}

i.e. the given tensor is again symmetric with respect to the first two
indices in the new co-ordinate system. Thus the symmetry property of a
tensor is independent of the co-ordinate system.
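A quick numerical check of Theorem 1 (not part of the slides), done for
a symmetric second-rank contravariant tensor rather than the rank-4
tensor used in the proof; the Jacobian and the tensor components are
randomly generated assumed values.

# Sketch: a symmetric tensor stays symmetric under an arbitrary invertible
# linear change of co-ordinates with Jacobian J[mu, i] = dx'^mu/dx^i.
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((3, 3))                    # assumed Jacobian
S = rng.standard_normal((3, 3))
S = S + S.T                                        # symmetric: S^{ij} = S^{ji}

S_prime = np.einsum('mi,nj,ij->mn', J, J, S)       # eq. (17)

# The transformed components are again symmetric in mu and nu.
print(np.allclose(S_prime, S_prime.T))             # True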

Theorem 2: Symmetry is not preserved with respect to two indices, one
contravariant and the other covariant.

Let A^{ijk}_{l} be symmetric with respect to two indices, one
contravariant, i, and the other covariant, l; then we have

A^{ijk}_{l} = A^{ljk}_{i}      ---- (23)

Now

A'^{\mu\nu\sigma}_{p} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{p}) A^{ijk}_{l}

                      = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{p}) A^{ljk}_{i}      [using eq. (23)]

Again, interchanging the dummy indices i and l, we get

A'^{\mu\nu\sigma}_{p} = (\partial x'^{\mu}/\partial x^{l})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{i}/\partial x'^{p}) A^{ijk}_{l}

whereas

A'^{p\nu\sigma}_{\mu} = (\partial x'^{p}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{\mu}) A^{ijk}_{l}

and the two right-hand sides are not, in general, equal. Thus

A'^{\mu\nu\sigma}_{p} \neq A'^{p\nu\sigma}_{\mu}

(b) Antisymmetric tensors or skew-symmetric tensors:

A tensor, each of whose components alters in sign but not in magnitude
when two contravariant or covariant indices are interchanged, is said to
be skew-symmetric or antisymmetric with respect to these two indices.

For example, if

A^{ij} = -A^{ji}    or    A_{ij} = -A_{ji}      ---- (24)

then the contravariant tensor A^{ij}, or the covariant tensor A_{ij}, of
second rank is antisymmetric. For a tensor A^{ijk}_{l} of higher rank, if

A^{ijk}_{l} = -A^{ikj}_{l}

then the tensor A^{ijk}_{l} is antisymmetric with respect to the indices
j and k.

The skew-symmetry property of a tensor is also independent of the choice
of co-ordinate system: if a tensor is skew-symmetric with respect to two
indices in one co-ordinate system, it remains skew-symmetric with
respect to these two indices in any other co-ordinate system.
If all the indices of a contravariant or covariant tensor can be
interchanged so that its components change sign at each interchange of a
pair of indices, the tensor is said to be antisymmetric (in all its
indices), i.e.,

A^{ijk} = -A^{jik} = +A^{jki}

Thus we may state that a contravariant or covariant tensor is
antisymmetric if its components change sign under an odd permutation of
its indices and do not change sign under an even permutation of its
indices.
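A familiar example of a totally antisymmetric object is the Levi-Civita
symbol introduced earlier; the sketch below (not from the slides) checks
the sign rule under odd and even permutations of its indices:

# Sketch: every swap of a pair of indices flips the sign of each component.
import numpy as np

eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = +1.0
for i, j, k in [(0, 2, 1), (2, 1, 0), (1, 0, 2)]:
    eps[i, j, k] = -1.0

print(np.allclose(eps, -eps.transpose(1, 0, 2)))   # swap i,j (odd): sign change
print(np.allclose(eps, -eps.transpose(0, 2, 1)))   # swap j,k (odd): sign change
print(np.allclose(eps,  eps.transpose(1, 2, 0)))   # cyclic (even) permutation: unchanged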

ALGEBRAIC OPERATIONS ON TENSORS

(i) Addition and subtraction: The addition and subtraction of tensors is
defined only for tensors of the same rank and the same type. Same type
means the same number of contravariant and covariant indices. The
addition or subtraction of two tensors, like that of vectors, involves
the individual elements: to add or subtract two tensors, the
corresponding elements are added or subtracted.

The sum or difference of two tensors of the same rank and same type is
also a tensor of the same rank and same type. For example, if A^{ij}_{k}
and B^{ij}_{k} are two tensors of the same rank and same type, then the
laws of addition and subtraction are

A^{ij}_{k} + B^{ij}_{k} = C^{ij}_{k}   (Addition)      ---- (25)

A^{ij}_{k} - B^{ij}_{k} = D^{ij}_{k}   (Subtraction)      ---- (26)

where C^{ij}_{k} and D^{ij}_{k} are tensors of the same rank and same
type as the given tensors.
The transformation laws for the given tensors are

A'^{\mu\nu}_{\sigma} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{k}/\partial x'^{\sigma}) A^{ij}_{k}      ---- (27)

and

B'^{\mu\nu}_{\sigma} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{k}/\partial x'^{\sigma}) B^{ij}_{k}      ---- (28)

Adding (27) and (28), we get

A'^{\mu\nu}_{\sigma} + B'^{\mu\nu}_{\sigma} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{k}/\partial x'^{\sigma}) (A^{ij}_{k} + B^{ij}_{k})

i.e.  C'^{\mu\nu}_{\sigma} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{k}/\partial x'^{\sigma}) C^{ij}_{k}

which is the transformation law for the sum and is similar to the
transformation laws for A^{ij}_{k} and B^{ij}_{k} given by (27) and
(28). Hence the sum C^{ij}_{k} = A^{ij}_{k} + B^{ij}_{k} is itself a
tensor of the same rank and same type as the given tensors.

(ii) Outer product:

The outer product of two tensors is a tensor whose rank is the sum of
the ranks of the given tensors. Thus if r and r' are the ranks of two
tensors, their outer product will be a tensor of rank (r + r').

For example, if A^{ij}_{k} and B^{l}_{m} are two tensors of ranks 3 and
2 respectively, then

A^{ij}_{k} B^{l}_{m} = C^{ijl}_{km}  (say)      ---- (29)

where C^{ijl}_{km} is a tensor of rank 5 (= 3 + 2).

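Numerically, the outer product of eq. (29) is just the componentwise
product over all index combinations. A minimal sketch (not from the
slides), with arbitrary assumed components and dimension n = 2 to keep
the arrays small:

# Sketch: the outer product of a rank-3 and a rank-2 array gives a rank-5 array.
import numpy as np

A = np.arange(8.0).reshape(2, 2, 2)     # stand-in for A^{ij}_k  (rank 3)
B = np.arange(4.0).reshape(2, 2)        # stand-in for B^{l}_m   (rank 2)

C = np.einsum('ijk,lm->ijklm', A, B)    # every component is A[i,j,k] * B[l,m]
print(C.ndim)                           # 5  (= 3 + 2)
print(C.shape)                          # (2, 2, 2, 2, 2)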
Prove:- If A^{ij}_{k} and B^{l}_{m} are two tensors of ranks 3 and 2
respectively, then A^{ij}_{k} B^{l}_{m} = C^{ijl}_{km} is a tensor of
rank 5.

Proof:- The transformation equations of the given tensors are

A'^{\mu\nu}_{\sigma} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{k}/\partial x'^{\sigma}) A^{ij}_{k}      ---- (30)

B'^{\rho}_{\lambda} = (\partial x'^{\rho}/\partial x^{l})(\partial x^{m}/\partial x'^{\lambda}) B^{l}_{m}      ---- (31)

Multiplying (30) and (31), we get

A'^{\mu\nu}_{\sigma} B'^{\rho}_{\lambda} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{k}/\partial x'^{\sigma})(\partial x'^{\rho}/\partial x^{l})(\partial x^{m}/\partial x'^{\lambda}) A^{ij}_{k} B^{l}_{m}

i.e.  C'^{\mu\nu\rho}_{\sigma\lambda} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{k}/\partial x'^{\sigma})(\partial x'^{\rho}/\partial x^{l})(\partial x^{m}/\partial x'^{\lambda}) C^{ijl}_{km}      ---- (32)

which is the transformation law for a tensor of rank 5.

Hence the outer product of the two tensors A^{ij}_{k} and B^{l}_{m} is a
tensor C^{ijl}_{km} of rank 5.

(iii) Contraction of tensors:

The algebraic operation by which the rank of a mixed tensor is lowered
by 2 is known as contraction.

In the process of contraction, one contravariant index and one covariant
index of a mixed tensor are set equal and the repeated index is summed
over; the result is a tensor of rank lower by two than the original
tensor.

For example, consider a mixed tensor A^{ijk}_{lm} of rank 5. If we set
k = m, the tensor becomes A^{ijm}_{lm}, a tensor of rank 3.
Let us consider a mixed tensor A^{ijk}_{lm} of rank 5, with contravariant
indices i, j, k and covariant indices l, m.

The transformation law of the given tensor is

A'^{\mu\nu\sigma}_{\rho\lambda} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{\rho})(\partial x^{m}/\partial x'^{\lambda}) A^{ijk}_{lm}      ---- (33)

To apply the process of contraction, we put \lambda = \sigma and obtain

A'^{\mu\nu\sigma}_{\rho\sigma} = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x'^{\sigma}/\partial x^{k})(\partial x^{l}/\partial x'^{\rho})(\partial x^{m}/\partial x'^{\sigma}) A^{ijk}_{lm}

                               = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{l}/\partial x'^{\rho}) \, \delta^{m}_{k} A^{ijk}_{lm}

                               = (\partial x'^{\mu}/\partial x^{i})(\partial x'^{\nu}/\partial x^{j})(\partial x^{l}/\partial x'^{\rho}) A^{ijk}_{lk}

[since (\partial x'^{\sigma}/\partial x^{k})(\partial x^{m}/\partial x'^{\sigma}) = \partial x^{m}/\partial x^{k} = \delta^{m}_{k}]

which is the transformation law for a mixed tensor of rank 3. Hence
A^{ijk}_{lk} is a mixed tensor of rank 3 and may be denoted by B^{ij}_{l}.

Thus the process of contraction enables us to obtain a tensor of rank
(r - 2) from a mixed tensor of rank r.
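In component arrays the contraction amounts to setting two indices equal
and summing, which is exactly what a repeated index in np.einsum does. A
small sketch (not part of the slides) with assumed random components:

# Sketch: contracting a rank-5 array over k and m leaves a rank-3 array.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3, 3, 3, 3))   # stand-in for A^{ijk}_{lm}

B = np.einsum('ijklk->ijl', A)             # set m = k and sum: B^{ij}_l = A^{ijk}_{lk}
print(A.ndim, '->', B.ndim)                # 5 -> 3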
(iv) Inner product:

The outer product of two tensors followed by a contraction results in a
new tensor, called an inner product of the two tensors, and the process
is called the inner multiplication of two tensors.

For example, consider two tensors A^{ij}_{k} and B^{l}_{m}.

The outer product of these two tensors is A^{ij}_{k} B^{l}_{m} =
C^{ijl}_{km} (say). Applying the contraction process by setting m = i,
we obtain

A^{ij}_{k} B^{l}_{i} = C^{ijl}_{ki} = D^{jl}_{k}  (a new tensor)      ---- (34)

The new tensor D^{jl}_{k} is the inner product of the two tensors
A^{ij}_{k} and B^{l}_{m}.

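A sketch of eq. (34) in NumPy (not part of the slides): the inner
product can be formed either directly, or by taking the outer product
first and then contracting over m = i; the component values are assumed
random numbers.

# Sketch: inner product = outer product followed by a contraction.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3, 3))        # stand-in for A^{ij}_k, axes (i, j, k)
B = rng.standard_normal((3, 3))           # stand-in for B^{l}_m, axes (l, m)

outer = np.einsum('ijk,lm->ijklm', A, B)  # C^{ijl}_{km}, rank 5
inner = np.einsum('ijk,li->jlk', A, B)    # D^{jl}_k = A^{ij}_k B^{l}_i, rank 3

# The same result follows from contracting the outer product over m = i.
print(np.allclose(inner, np.einsum('ijkli->jlk', outer)))   # True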
Prove: The inner product of two tensors of rank one is an invariant.

Proof: Let us consider two tensors of rank 1, A^{i} and B_{j}.

The outer product of A^{i} and B_{j} is A^{i} B_{j} = C^{i}_{j}.

Applying the contraction process by setting i = j, we get

A^{i} B_{i} = C^{i}_{i}  (a scalar, or a tensor of rank zero).

Thus the inner product of two tensors of rank one is a tensor of rank
zero, i.e., an invariant.  Proved.

(v) Extension of rank:

The rank of a tensor can be extended by differentiating each of its
components with respect to the variables x^i.

Let us consider a simple case in which the original tensor is of rank
zero, i.e., a scalar S(x^i) whose derivatives with respect to the
variables x^i are \partial S / \partial x^i. In another system of
variables x'^{\mu} the scalar is S'(x'^{\mu}), such that

\partial S' / \partial x'^{\mu} = (\partial S/\partial x^{i})(\partial x^{i}/\partial x'^{\mu}) = (\partial x^{i}/\partial x'^{\mu}) \, \partial S/\partial x^{i}      ---- (35)

This shows that \partial S / \partial x^{i} transforms like the
components of a covariant tensor of rank one (compare eq. (16)). Thus
the differentiation of a tensor of rank zero gives a tensor of rank one.

In general we may say that differentiation of a tensor with respect to
the variables x^i gives a new tensor of rank one greater than the
original tensor.
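Eq. (35) can be checked symbolically. The sketch below (not part of the
slides) assumes the scalar S = x^2 + y^2 and plane polar co-ordinates as
the primed system, and verifies that differentiating S with respect to r
and theta agrees with the right-hand side of (35):

# Sketch: the gradient of a scalar transforms covariantly, eq. (35).
# S = x**2 + y**2 and the polar co-ordinates (r, theta) are assumed examples.
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)
x = r * sp.cos(theta)
y = r * sp.sin(theta)

S = x**2 + y**2                                   # scalar field, expressed via (r, theta)

# Left-hand side of (35): dS'/dx'^mu for x'^mu = (r, theta)
lhs = [sp.diff(S, r), sp.diff(S, theta)]

# Right-hand side: (dS/dx^i)(dx^i/dx'^mu), with dS/dx = 2x and dS/dy = 2y
dS_dx = [2 * x, 2 * y]
rhs = [sum(dS_dx[i] * sp.diff([x, y][i], mu) for i in range(2)) for mu in (r, theta)]

print([sp.simplify(l - rh) for l, rh in zip(lhs, rhs)])   # [0, 0]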
