Chapter - Two - CT - 1
A symmetric matrix is said to be positive definite if all its eigenvalues are strictly positive (> 0).
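As a minimal check of this definition, the eigenvalues of a 2 × 2 symmetric matrix can be computed directly from its characteristic polynomial. The sketch below is illustrative Python; the function names are not from any particular library.

```python
import math

def eigenvalues_2x2_symmetric(a11, a12, a22):
    """Eigenvalues of [[a11, a12], [a12, a22]] from the characteristic polynomial."""
    tr = a11 + a22                      # trace
    det = a11 * a22 - a12 * a12         # determinant
    disc = math.sqrt(tr * tr - 4.0 * det)  # always real for a symmetric matrix
    return (tr - disc) / 2.0, (tr + disc) / 2.0

def is_positive_definite_2x2(a11, a12, a22):
    """Positive definite iff every eigenvalue is strictly positive."""
    return all(lam > 0.0 for lam in eigenvalues_2x2_symmetric(a11, a12, a22))

print(is_positive_definite_2x2(2.0, -1.0, 2.0))  # eigenvalues 1 and 3 -> True
print(is_positive_definite_2x2(1.0, 2.0, 1.0))   # eigenvalues -1 and 3 -> False
```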
(Figure: two-node frame element with degrees of freedom 1, 2, 3 at the first node and 4, 5, 6 at the second.)
A well-known example is that of a frame element having 2 nodes and 3 degrees of freedom per node.
The matrix is diagonally dominant and positive definite, so there is no need to re-arrange the equations
to obtain diagonal dominance.
The matrix is symmetric (obvious from Maxwell's reciprocal theorem). Only the upper or lower triangular
elements need be formed; the rest can be obtained using symmetry.
The matrices are banded in nature, i.e., the non-zero elements of the stiffness matrix are concentrated
near the diagonal of the matrix; elements away from the diagonal are zero.
2.1 System of Linear Equations
Matrix displacement equations are linear simultaneous equations, normally of the form [A] {x} = {b}:

a11 x1 + a12 x2 + ... + a1n xn = b1
a21 x1 + a22 x2 + ... + a2n xn = b2
  :
an1 x1 + an2 x2 + ... + ann xn = bn
Algorithm to Solve:
To solve [A] {x} = {b}, we reduce it to an equivalent system [U] {x} = {g}, in which U is upper
triangular. This system can be easily solved by a process of backward substitution.
We carry out the pivotal operation on the second row onwards; for these rows, a11 is the pivot.
The first line/equation is maintained as it is.
For the equations below:

[ a11  a12  ...  a1n  ] { x1 }   { b1  }
[ 0    a'22 ...  a'2n ] { x2 }   { b'2 }
[ :    :         :    ] { :  } = { :   }
[ 0    a'i2 ...  a'in ] { xi }   { b'i }
[ :    :         :    ] { :  }   { :   }
[ 0    a'n2 ...  a'nn ] { xn }   { b'n }
For the pivotal operation on akk, no changes are made in the kth row; for the rows below the kth row,

a'ij = aij - (aik / akk) akj   and   b'i = bi - (aik / akk) bk,   for i, j = k+1, ..., n
xi = ( gi - Σj=i+1..n uij xj ) / uii,   for i = n-1, n-2, ..., 1,   with xn = gn / unn
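The elimination and backward-substitution formulas above can be sketched as follows (illustrative Python; no pivoting or singularity checks, matching the algorithm as stated):

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination followed by backward substitution.
    A is a list of row lists; b is a list. Works on copies of the inputs."""
    n = len(b)
    U = [row[:] for row in A]
    g = b[:]
    # Forward elimination: zero out column k below the pivot a_kk.
    for k in range(n - 1):
        for i in range(k + 1, n):
            factor = U[i][k] / U[k][k]        # a_ik / a_kk
            for j in range(k, n):
                U[i][j] -= factor * U[k][j]   # a'_ij = a_ij - (a_ik/a_kk) a_kj
            g[i] -= factor * g[k]             # b'_i  = b_i  - (a_ik/a_kk) b_k
    # Backward substitution: x_i = (g_i - sum_j u_ij x_j) / u_ii.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (g[i] - s) / U[i][i]
    return x

print(gauss_solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # -> [1.0, 3.0]
```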
Gauss-Jordan elimination is an algorithm that can be used to solve systems of linear equations and
to find the inverse of any invertible matrix. It relies upon three elementary row operations one can
use on a matrix:
1. Interchange two rows.
2. Multiply a row by a non-zero constant.
3. Add a multiple of one row to another row.
For any invertible matrix, [A]⁻¹ [A] = I, where I is the identity matrix, e.g.

[ 1 0 ]
[ 0 1 ]
For a set of equations, [A] {x} = {b},
{x} = [A]⁻¹ {b}
so if [A] is reduced to the identity matrix, {x} = {b}.
For this, we operate on [A] with elementary row operations until it becomes the identity matrix;
applying the same operations to {b} turns it into the solution {x}.
[ a11  a12  ...  a1n ] { x1 }   { b1 }
[ a21  a22  ...  a2n ] { x2 }   { b2 }
[  :    :         :  ] { :  } = { :  }
[ an1  an2  ...  ann ] { xn }   { bn }

is converted to

[ 1  0  ...  0 ] { x1 }   { b'1 }
[ 0  1  ...  0 ] { x2 }   { b'2 }
[ :  :       : ] { :  } = { :   }
[ 0  0  ...  1 ] { xn }   { b'n }

and hence

{ x1 }   { b'1 }
{ :  } = { :   }
{ xn }   { b'n }
a'ij = aij - (ai1 / a11) a1j,   for i ≠ 1   ... (1)

a''ij = a'ij - (a'i2 / a'22) a'2j,   for i ≠ 2   ... (2)

and similarly, for the kth pivot,

a'ij = aij - (aik / akk) akj,   for i ≠ k

with the same operations applied to the elements of {b}.
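The Gauss-Jordan procedure described above can be sketched as follows (illustrative Python; each pivot is assumed non-zero, and each pivot row is normalized so that [A] ends up as the identity matrix):

```python
def gauss_jordan_solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination: reduce A to the identity,
    applying the same row operations to b, so that b becomes x."""
    n = len(b)
    M = [row[:] for row in A]
    rhs = b[:]
    for k in range(n):
        # Normalize the pivot row so that m_kk = 1.
        piv = M[k][k]
        M[k] = [v / piv for v in M[k]]
        rhs[k] /= piv
        # Eliminate column k in every other row (i != k), as in the formulas above.
        for i in range(n):
            if i != k:
                factor = M[i][k]
                M[i] = [M[i][j] - factor * M[k][j] for j in range(n)]
                rhs[i] -= factor * rhs[k]
    return rhs  # A has become I, so the transformed b is x

print(gauss_jordan_solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # -> [1.0, 3.0]
```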
Steps:
For a symmetric banded matrix, only the entries within the band need to be stored. With n = 8 and
nbw = 4 (half-bandwidth), the upper band is stored compactly as:

[ a11 a12 a13 a14 ]
[ a22 a23 a24 a25 ]
[ a33 a34 a35 a36 ]
[ a44 a45 a46 a47 ]
[ a55 a56 a57 a58 ]
[ a66 a67 a68  0  ]
[ a77 a78  0   0  ]
[ a88  0   0   0  ]   (n × nbw)
In general, the pth diagonal of the main matrix is stored as pth column i.e., the principal diagonal or
1st diagonal is stored as 1st column.
The correspondence between the original matrix and the new matrix is given by

a(i, j) → anew(i, j - i + 1)

For example, the entry a24 in the above matrix has its new location given by:

anew(2, 4 - 2 + 1) = anew(2, 3)
The algorithm for the Gaussian elimination method can be re-written for a symmetric banded matrix
considering:
1. For the original matrix, aij = aji (symmetry).
2. For the new matrix, the number of elements in the kth row is min(n - k + 1, nbw).
For example, the number of elements in the 6th row is min(8 - 6 + 1, 4) = 3.
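The banded-storage mapping a(i, j) → column j - i + 1 can be sketched as follows (illustrative Python using 0-based indices internally, shown on a small n = 4, nbw = 2 case rather than the 8 × 4 example above):

```python
def to_banded(A, nbw):
    """Store the upper band of a symmetric n x n matrix A in an n x nbw array.
    Entry a(i, j) of the upper triangle moves to column j - i + 1 (1-based),
    i.e. the p-th diagonal becomes the p-th column; unused slots stay zero."""
    n = len(A)
    B = [[0.0] * nbw for _ in range(n)]
    for i in range(n):
        for j in range(i, min(i + nbw, n)):
            B[i][j - i] = A[i][j]   # 0-based form of (i, j - i + 1)
    return B

# 4 x 4 symmetric banded matrix with half-bandwidth nbw = 2.
A = [[4.0, 1.0, 0.0, 0.0],
     [1.0, 4.0, 1.0, 0.0],
     [0.0, 1.0, 4.0, 1.0],
     [0.0, 0.0, 1.0, 4.0]]
B = to_banded(A, 2)
print(B)  # row k holds min(n - k + 1, nbw) useful entries; the rest are zero
```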
The solution converges when {g}ᵀ{g} becomes small enough to be neglected. This method is
robust and normally converges in n iterations for an n × n matrix.
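The convergence test on {g}ᵀ{g} and the at-most-n-iterations bound are characteristic of the conjugate gradient method; assuming that is the method intended here, a minimal sketch (illustrative Python, for a symmetric positive definite matrix):

```python
def conjugate_gradient(A, b, tol=1e-12):
    """Conjugate gradient for a symmetric positive definite A (list of rows).
    Iterates until g^T g, the squared residual norm, is negligible; in exact
    arithmetic this takes at most n steps for an n x n matrix."""
    n = len(b)
    x = [0.0] * n
    g = b[:]                       # residual g = b - A x (x = 0 initially)
    d = g[:]                       # first search direction
    gg = sum(v * v for v in g)
    for _ in range(n):
        if gg <= tol:
            break
        Ad = [sum(A[i][j] * d[j] for j in range(n)) for i in range(n)]
        alpha = gg / sum(d[i] * Ad[i] for i in range(n))   # step length
        x = [x[i] + alpha * d[i] for i in range(n)]
        g = [g[i] - alpha * Ad[i] for i in range(n)]       # update residual
        gg_new = sum(v * v for v in g)                     # g^T g test value
        beta = gg_new / gg
        d = [g[i] + beta * d[i] for i in range(n)]         # next direction
        gg = gg_new
    return x

print(conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0]))
```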