Numerical Analysis - Matrix

This document discusses solving systems of linear algebraic equations. It introduces Gauss elimination and LU decomposition as two methods for solving such systems, provides pseudocode for Gauss elimination, and describes the steps and advantages of LU decomposition.


7. Linear Algebraic Equations
7.1 Introduction
A system of n linear algebraic equations in n unknowns:

$$a_{11}x_1 + a_{12}x_2 + \dots + a_{1n}x_n = b_1$$
$$a_{21}x_1 + a_{22}x_2 + \dots + a_{2n}x_n = b_2$$
$$\vdots$$
$$a_{n1}x_1 + a_{n2}x_2 + \dots + a_{nn}x_n = b_n$$

In matrix form, $[A]\{x\} = \{b\}$, which can also be written as an augmented matrix:

$$\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix}\begin{Bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{Bmatrix} = \begin{Bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{Bmatrix} \qquad\Longleftrightarrow\qquad \left[\begin{array}{cccc|c} a_{11} & a_{12} & \cdots & a_{1n} & b_1 \\ a_{21} & a_{22} & \cdots & a_{2n} & b_2 \\ \vdots & \vdots & & \vdots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} & b_n \end{array}\right]$$
*Solve the following system of equations by manual Gauss elimination, making each calculation step explicit using the indices (i, j, k).

*Then, extend the procedure to a 4 × 4 matrix.
7.2 Gauss Elimination
- There are two steps: forward elimination and back substitution
- In the first step, a pivot equation k is selected, with $a_{kk}$ as the pivot coefficient, and the coefficients below the pivot are updated by

$$a_{ij} \leftarrow a_{ij} - \frac{a_{ik}}{a_{kk}}\,a_{kj}$$

(see the pseudocode for details)
- Once elimination is completed, the result is an upper-triangular system of simple, explicit relations that can be solved one at a time
- Formulas for back substitution:

$$x_n = \frac{b_n}{a_{nn}}, \qquad x_i = \frac{b_i - \sum_{j=i+1}^{n} a_{ij}x_j}{a_{ii}} \quad \text{for } i = n-1, n-2, \dots, 1$$
7.2.1 Pseudocode:

DO k = 1, n-1                       ! Forward elimination
  DO i = k+1, n
    factor = a(i,k) / a(k,k)
    DO j = k+1, n
      a(i,j) = a(i,j) - factor * a(k,j)
    ENDDO
    b(i) = b(i) - factor * b(k)
  ENDDO
ENDDO

x(n) = b(n) / a(n,n)                ! Back substitution
DO i = n-1, 1, -1
  sum = 0
  DO j = i+1, n
    sum = sum + a(i,j) * x(j)
  ENDDO
  x(i) = (b(i) - sum) / a(i,i)
ENDDO

Forward elimination transforms the augmented matrix into upper-triangular form:

$$\left[\begin{array}{ccc|c} a_{11} & a_{12} & a_{13} & b_1 \\ a_{21} & a_{22} & a_{23} & b_2 \\ a_{31} & a_{32} & a_{33} & b_3 \end{array}\right] \;\Rightarrow\; \left[\begin{array}{ccc|c} a_{11} & a_{12} & a_{13} & b_1 \\ 0 & a'_{22} & a'_{23} & b'_2 \\ 0 & 0 & a''_{33} & b''_3 \end{array}\right]$$

Back substitution then yields:

$$x_3 = \frac{b''_3}{a''_{33}}, \qquad x_2 = \frac{b'_2 - a'_{23}x_3}{a'_{22}}, \qquad x_1 = \frac{b_1 - a_{12}x_2 - a_{13}x_3}{a_{11}}$$
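A minimal Python sketch of the pseudocode above (naive elimination, no pivoting; the function name is illustrative, and the test system is the one used later in Examples 10.1–10.2):

import numpy as np

def gauss_eliminate(A, b):
    """Solve A x = b by naive Gauss elimination (no pivoting)."""
    a = np.array(A, dtype=float)   # work on copies so the inputs are untouched
    b = np.array(b, dtype=float)
    n = len(b)
    for k in range(n - 1):         # forward elimination
        for i in range(k + 1, n):
            factor = a[i, k] / a[k, k]
            a[i, k:] -= factor * a[k, k:]
            b[i] -= factor * b[k]
    x = np.zeros(n)                # back substitution
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - a[i, i+1:] @ x[i+1:]) / a[i, i]
    return x

print(gauss_eliminate([[3, -0.1, -0.2], [0.1, 7, -0.3], [0.3, -0.2, 10]],
                      [7.85, -19.3, 71.4]))   # approx. [3, -2.5, 7]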
*Study § 9.2, Fig. 9.4 and Example 9.5 for the development and application of the algorithm

7.2.3 Operation Counting:
- The number of flops increases by nearly three orders of magnitude for every order-of-magnitude increase in the dimension: the total count grows as $O(n^3)$ (approximately $2n^3/3$ for large n), so going from n = 10 to n = 100 multiplies the work by roughly 1000
- Most of the effort is incurred in the elimination step. Thus, efforts to make the method more efficient should probably focus on this step

*Study § 9.2.1 and Tab. 9.1
7.2.2 Pitfalls of Gauss Elimination:
- Division by zero (a zero pivot coefficient)
- Round-off error
- Ill-conditioning (Fig. 9.2), indicated by:
  - Matrix determinant ≈ 0 → ill-conditioned (the solution is highly sensitive to round-off)
  - Matrix determinant = 0 → singular: no unique solution (none, or infinitely many)
7.2.4 Improvements
- Use more significant figures, to reduce the round-off error problem
- Pivoting strategy, to avoid division by zero (Example 9.9):
  - The equation with the largest (absolute) coefficient in the pivot column becomes the pivot equation; a sketch is given below
- Scaling, to reduce the round-off error problem (Example 9.10):
  - Modify the order of magnitude of one or more equations so that all equations have roughly similar magnitudes
- Ill-conditioned problems are much more difficult and require advanced treatment. However, most real engineering problems are normally well-conditioned, so if you find an ill-conditioned problem, check and recheck the physical formulation
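The pivoting sketch mentioned above: a variation of gauss_eliminate in which, before each elimination pass, the row with the largest absolute coefficient in the pivot column is swapped into the pivot position (a minimal illustration, not the textbook's full scaled-pivoting algorithm):

import numpy as np

def gauss_eliminate_pivot(A, b):
    """Gauss elimination with partial (row) pivoting."""
    a = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    n = len(b)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(a[k:, k]))   # row with the largest |a(i,k)|, i >= k
        if p != k:                            # swap it into the pivot position
            a[[k, p]] = a[[p, k]]
            b[[k, p]] = b[[p, k]]
        for i in range(k + 1, n):
            factor = a[i, k] / a[k, k]
            a[i, k:] -= factor * a[k, k:]
            b[i] -= factor * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):            # back substitution
        x[i] = (b[i] - a[i, i+1:] @ x[i+1:]) / a[i, i]
    return x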
7.3 Gauss-Jordan
- A modification of Gauss elimination
- The elimination step is modified so that each unknown is eliminated from all equations, not just the subsequent ones. This reduces the coefficient matrix to an identity matrix
- Hence, there is no need for back substitution
- But the elimination step requires even more work, resulting in more total computation than the Gauss method
- Algorithm for Gauss-Jordan (a Python sketch follows below):
  - Normalise the first row by the coefficient a11
  - Using the normalised first row, eliminate the first column of each of the remaining rows (ai1)
  - Normalise the second row by a'22
  - Eliminate the second column of each of the other rows (a'12, a'32)
  - And so on until the last row and column

$$\left[\begin{array}{ccc|c} a_{11} & a_{12} & a_{13} & b_1 \\ a_{21} & a_{22} & a_{23} & b_2 \\ a_{31} & a_{32} & a_{33} & b_3 \end{array}\right] \;\Rightarrow\; \left[\begin{array}{ccc|c} 1 & 0 & 0 & b''''_1 \\ 0 & 1 & 0 & b''''_2 \\ 0 & 0 & 1 & b''''_3 \end{array}\right]$$

so the solution can be read off directly from the last column.

*Learn Ex. 9.12
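A minimal Python sketch of this algorithm (no pivoting; the function name is illustrative), operating directly on the augmented matrix:

import numpy as np

def gauss_jordan(A, b):
    """Reduce [A | b] to [I | x] and return x (no pivoting)."""
    n = len(b)
    aug = np.hstack([np.array(A, float), np.array(b, float).reshape(n, 1)])
    for k in range(n):
        aug[k] /= aug[k, k]          # normalise the pivot row
        for i in range(n):
            if i != k:               # eliminate column k from every other row
                aug[i] -= aug[i, k] * aug[k]
    return aug[:, -1]                # the last column now holds the solution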
Ex. 9.12:
- Normalise the first row by the coefficient a11, then eliminate the first column of each of the remaining rows (ai1)
- Normalise the second row by the coefficient a'22, then eliminate the second column of each of the remaining rows (a'i2)
- Normalise the third row by the coefficient a''33
- Result: x1 = 3.0000; x2 = -2.5000; x3 = 7.0000
- Comparison with Gauss elimination:
  - Conceptually simpler (no separate "back substitution" step)
  - All the pitfalls of Gauss elimination still apply
  - But relatively more computation (approx. 50% more)
  - Still used in some numerical methods
Text Book Further Study
- Understand and work through Examples 9.5, 9.7, 9.8, 9.12
- Build a computer program based on the algorithm in Fig. 9.6
- Try textbook problems: 9.9, 9.12, 9.13, 9.18, 9.19
7.4 LU Decomposition
$$[U] = \begin{bmatrix} u_{11} & u_{12} & u_{13} \\ 0 & u_{22} & u_{23} \\ 0 & 0 & u_{33} \end{bmatrix} \;\text{(upper matrix)} \qquad [L] = \begin{bmatrix} 1 & 0 & 0 \\ l_{21} & 1 & 0 \\ l_{31} & l_{32} & 1 \end{bmatrix} \;\text{(lower matrix)}$$

- The forward-elimination stage, both in Gauss and Gauss-Jordan, is a time-consuming process
- LU decomposition simplifies this stage by factoring [A] on its own, without carrying the right-hand side {b} through the elimination
- This is particularly beneficial when we have the same system with the same [A] but different sets of {b}. Example: in a finite-element model, we have the same shape of structure but different force systems
Description of Process:

$$[A] = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = [L][U]$$

$$[U] = \begin{bmatrix} u_{11} & u_{12} & u_{13} \\ 0 & u_{22} & u_{23} \\ 0 & 0 & u_{33} \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ 0 & a'_{22} & a'_{23} \\ 0 & 0 & a''_{33} \end{bmatrix} \qquad [L] = \begin{bmatrix} 1 & 0 & 0 \\ l_{21} & 1 & 0 \\ l_{31} & l_{32} & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ f_{21} & 1 & 0 \\ f_{31} & f_{32} & 1 \end{bmatrix}$$

Check: $[L][U] = [A]$
Programming Strategy:
- The Decompose routine forms a single matrix [A'] of the form below, so that the computer stores only one matrix but uses it as both [U] and [L] according to the procedure:

$$[A'] = \begin{bmatrix} u_{11} & u_{12} & \cdots & u_{1n} \\ l_{21} & u_{22} & \cdots & u_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ l_{n1} & l_{n2} & \cdots & u_{nn} \end{bmatrix}$$

SUB Decompose (a, n)
  DO k = 1, n-1
    DO i = k+1, n
      factor = a(i,k) / a(k,k)
      a(i,k) = factor                 ! store l(i,k) below the diagonal
      DO j = k+1, n
        a(i,j) = a(i,j) - factor * a(k,j)
      ENDDO
    ENDDO
  ENDDO
END Decompose
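A matching Python sketch of this storage scheme (Doolittle-style, no pivoting; lu_decompose is an illustrative name):

import numpy as np

def lu_decompose(A):
    """In-place LU factorization: returns one matrix holding U on and above
    the diagonal and the elimination factors (L minus its unit diagonal) below it."""
    a = np.array(A, dtype=float)
    n = a.shape[0]
    for k in range(n - 1):
        for i in range(k + 1, n):
            factor = a[i, k] / a[k, k]
            a[i, k] = factor                    # store l(i,k) below the diagonal
            a[i, k+1:] -= factor * a[k, k+1:]   # update the row of U
    return a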
Basic Algorithm:
- During the elimination phase, the factors that are calculated are stored as the lij
- Partial pivoting is implemented by tracking the pivot rows, instead of swapping whole rows
- While the equations themselves are not scaled, scaled values are used to determine whether pivoting is to be implemented
- The diagonal terms are monitored for near-zero occurrences in order to raise a singularity warning
- The Substitution Process obtains {D} (forward substitution) and finally {X} (back substitution), using the following relationships, where the $l_{ij}$ are the stored factors:

$$d_1 = b_1, \qquad d_i = b_i - \sum_{j=1}^{i-1} l_{ij}\,d_j \quad \text{for } i = 2, 3, \dots, n$$

$$x_n = \frac{d_n}{u_{nn}}, \qquad x_i = \frac{d_i - \sum_{j=i+1}^{n} u_{ij}\,x_j}{u_{ii}} \quad \text{for } i = n-1, n-2, \dots, 1$$
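A Python sketch of the substitution phase, assuming the combined matrix returned by lu_decompose above:

import numpy as np

def lu_solve(lu, b):
    """Solve A x = b given the combined LU matrix from lu_decompose."""
    n = len(b)
    d = np.array(b, dtype=float)
    for i in range(1, n):              # forward substitution: [L]{d} = {b}
        d[i] -= lu[i, :i] @ d[:i]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):     # back substitution: [U]{x} = {d}
        x[i] = (d[i] - lu[i, i+1:] @ x[i+1:]) / lu[i, i]
    return x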
Example 10.1 – 10.2

$$[A] = \begin{bmatrix} 3 & -0.1 & -0.2 \\ 0.1 & 7 & -0.3 \\ 0.3 & -0.2 & 10 \end{bmatrix}$$
Example 10.1 – 10.2

$$[A] = \begin{bmatrix} 3 & -0.1 & -0.2 \\ 0.1 & 7 & -0.3 \\ 0.3 & -0.2 & 10 \end{bmatrix} \Rightarrow \begin{bmatrix} 3 & -0.1 & -0.2 \\ 0 & 7.00333 & -0.293333 \\ 0 & -0.19 & 10.02 \end{bmatrix} \Rightarrow \begin{bmatrix} 3 & -0.1 & -0.2 \\ 0 & 7.00333 & -0.293333 \\ 0 & 0 & 10.0120 \end{bmatrix} = [U]$$

$$f_{21} = \frac{0.1}{3} = 0.03333333 \qquad f_{31} = \frac{0.3}{3} = 0.10000000 \qquad f_{32} = \frac{-0.19}{7.00333} = -0.0271300$$

$$[L] = \begin{bmatrix} 1 & 0 & 0 \\ 0.03333333 & 1 & 0 \\ 0.10000000 & -0.0271300 & 1 \end{bmatrix}$$
 1 0 0  d1   7.85 
   
LD  B  0.03333333 1 0 d 2     19.3

0.10000000  0.0271300 1  d 3   71.4 
 D  7.85  19.5617 70.0843
T

3  0 .1  0.2   x1   7.85 
U X   D 0 7.00333  0.293333  x2    19.5617 
0 0 10.0120   x3   70.0843 

 X   3  2.5 7.00003
T
Case I

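Case I can be reproduced with the lu_decompose and lu_solve sketches above; note that [A] is factored only once and just the substitutions change with {B}:

lu = lu_decompose([[3, -0.1, -0.2], [0.1, 7, -0.3], [0.3, -0.2, 10]])
print(lu_solve(lu, [7.85, -19.3, 71.4]))   # approx. [3, -2.5, 7.00003]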
Cases II and III: for each new right-hand side {B}, the same [L] and [U] are reused; only the two substitution passes are repeated (fill in the ?? as an exercise):

$$\begin{bmatrix} 1 & 0 & 0 \\ 0.03333333 & 1 & 0 \\ 0.10000000 & -0.0271300 & 1 \end{bmatrix}\{D\} = \{B\} \;\Rightarrow\; \{D\}^T = \begin{Bmatrix} ?? & ?? & ?? \end{Bmatrix}$$

$$\begin{bmatrix} 3 & -0.1 & -0.2 \\ 0 & 7.00333 & -0.293333 \\ 0 & 0 & 10.0120 \end{bmatrix}\{X\} = \{D\} \;\Rightarrow\; \{X\}^T = \begin{Bmatrix} ?? & ?? & ?? \end{Bmatrix}$$
7.6 Matrix Inverse

AA1  A1A  I 

 a '11 a '12 a '13   a '11   a '12   a '13 


A1  a '21 a '22 a '23   Ai11  a '21 ; Ai 2 1  a '22 ; Ai 31  a '23 
 a '31 a ' 23 a '33  a '  a '  a ' 
 31   32   33 
1 0 0  1  0  0 
       
I   0 1 0  I i1  0 ; I i 2   1 ; I i 3  0 
0 0 1 0  0  1 
     
AAi11  I i1; AAi 2 1  I i 2 ; AAi 31  I i 3 
Learn Ex. 10.3
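A short sketch of this column-by-column inversion, reusing the lu_decompose and lu_solve sketches from above:

import numpy as np

def lu_inverse(A):
    """Invert A by solving A x = e_i for each unit column vector e_i."""
    n = len(A)
    lu = lu_decompose(A)               # factor [A] only once
    inv = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0                     # i-th column of the identity matrix
        inv[:, i] = lu_solve(lu, e)    # i-th column of [A]^-1
    return inv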
Cholesky Algorithm
- If matrix [A] is symmetric (and positive definite), the LU decomposition algorithm may be simplified to Cholesky's:

[A] = [L][L]^T

- Only one triangular matrix needs to be calculated and stored

Learn Example 11.2
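A minimal Cholesky sketch using the standard recurrences (this assumes, without checking, that [A] is symmetric positive definite, which is what guarantees the square roots exist):

import numpy as np

def cholesky(A):
    """Factor a symmetric positive-definite A as L @ L.T (L lower-triangular)."""
    a = np.array(A, dtype=float)
    n = a.shape[0]
    L = np.zeros((n, n))
    for k in range(n):
        # diagonal entry: remove the contributions already factored out
        L[k, k] = np.sqrt(a[k, k] - L[k, :k] @ L[k, :k])
        for i in range(k + 1, n):
            L[i, k] = (a[i, k] - L[i, :k] @ L[k, :k]) / L[k, k]
    return L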

7.7 Error Analysis & System Condition
- Using the inverse matrix to detect an ill-conditioned matrix:
  - Scale [A], invert it, and compare the order of magnitude of the elements of [A]^{-1}: elements several orders of magnitude larger than one indicate ill-conditioning
  - $[A]^{-1}[A] \approx [I]$? If not, then ill-conditioned
  - $([A]^{-1})^{-1} \approx [A]$? If not, then ill-conditioned
- Matrix norms:
  - Euclidean norm, for vectors
  - Frobenius norm, for general matrices
  - Uniform vector norm
  - Row-sum norm
- Matrix condition number:

$$\text{Cond}[A] = \|A\| \cdot \|A^{-1}\|$$

- Application of Cond[A], an error estimate for {X}:

$$\frac{\|\Delta X\|}{\|X\|} \le \text{Cond}[A]\,\frac{\|\Delta A\|}{\|A\|}$$
Matrix Norm
- Euclidean norms:

$$\|X\|_e = \sqrt{\sum_{i=1}^{n} x_i^2} \qquad\qquad \|A\|_e = \sqrt{\sum_{i=1}^{n}\sum_{j=1}^{n} a_{i,j}^2} \;\text{(Frobenius)}$$

- Alternatives:
  - P-norm: $\|X\|_p = \left(\sum_{i=1}^{n} |x_i|^p\right)^{1/p}$
  - Sum of absolutes: $\|X\|_1 = \sum_{i=1}^{n} |x_i|$
  - Uniform vector norm: $\|X\|_\infty = \max_{1 \le i \le n} |x_i|$
  - Row-sum norm: $\|A\|_\infty = \max_{1 \le i \le n} \sum_{j=1}^{n} |a_{ij}|$
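These norms and Cond[A] are easy to check numerically: numpy.linalg.norm and numpy.linalg.cond implement exactly these definitions (the matrix below is the one from Example 10.1):

import numpy as np

A = np.array([[3, -0.1, -0.2], [0.1, 7, -0.3], [0.3, -0.2, 10.0]])

print(np.linalg.norm(A, 'fro'))     # Frobenius (Euclidean) norm
print(np.linalg.norm(A, np.inf))    # row-sum norm
print(np.linalg.cond(A, np.inf))    # Cond[A] = ||A|| * ||A^-1||; small here, so well-conditioned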
Text Book Further Study
- Understand and, if necessary, memorize the procedure in Fig. 10.1
- Work through Examples 10.1, 10.2, 10.3
- Build a computer program based on the algorithm in Fig. 10.3
- Try textbook problems: 10.3, 10.6, 10.10
