
Chapter 11: Special Matrices and Gauss-Seidel

1. Special Matrices

2. Gauss-Seidel

Rabih
1 / 14
Introduction
Certain matrices have a particular structure that can be exploited
to develop efficient solution schemes. The first part of this chapter
is devoted to two such systems: banded and symmetric matrices.
Efficient elimination methods are described for both.
The second part of the chapter turns to an alternative to
elimination methods: approximate, iterative methods. The
focus is on the Gauss-Seidel method, which employs initial guesses
and then iterates to obtain refined estimates of the solution. The
Gauss-Seidel method is particularly well suited for large numbers of
equations. In these cases, elimination methods can be subject to
round-off errors. Because the error of the Gauss-Seidel method is
controlled by the number of iterations, round-off error is not an
issue of concern with this method. However, there are certain
instances where the Gauss-Seidel technique will not converge on
the correct answer.

2 / 14
11.1 Special Matrices
Special Matrices
A banded matrix is a square matrix that has all elements equal to
zero, with the exception of a band centered on the main diagonal.
Banded systems are frequently encountered in engineering. They
typically occur in the solution of differential equations.
The dimensions of a banded system can be quantified by two
parameters: the bandwidth BW and the half-bandwidth HBW.
These two values are related by BW = 2 HBW + 1. In general, a
banded system is one for which $a_{ij} = 0$ if $|i - j| > \mathrm{HBW}$.

3 / 14
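As a small illustration of this definition (an addition, not part of the original slides), the Python snippet below computes the half-bandwidth of a matrix by scanning its nonzero entries; the function name and the use of NumPy are assumptions made for the example.

```python
import numpy as np

def half_bandwidth(A, tol=1e-12):
    """Return HBW = max |i - j| over entries with |a_ij| > tol."""
    rows, cols = np.nonzero(np.abs(A) > tol)
    return int(np.max(np.abs(rows - cols))) if rows.size else 0

# A tridiagonal matrix: HBW = 1, so BW = 2*HBW + 1 = 3
A = np.array([[ 2.0, -1.0,  0.0,  0.0],
              [-1.0,  2.0, -1.0,  0.0],
              [ 0.0, -1.0,  2.0, -1.0],
              [ 0.0,  0.0, -1.0,  2.0]])
print(half_bandwidth(A))          # 1
print(2 * half_bandwidth(A) + 1)  # bandwidth BW = 3
```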
11.1 Special Matrices

Tridiagonal Systems
A tridiagonal system is one with a bandwidth of 3 and can be
expressed generally as

$$\begin{bmatrix} a_{11} & a_{12} & & & \\ a_{21} & a_{22} & a_{23} & & \\ & a_{32} & a_{33} & a_{34} & \\ & & \ddots & \ddots & \ddots \\ & & & a_{n,n-1} & a_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \\ \vdots \\ b_n \end{bmatrix}$$

so that the only nonzero elements lie on the main diagonal and on the diagonals immediately above and below it.

4 / 14
11.1 Special Matrices
Cholesky Decomposition
Recall that a symmetric matrix is one where $a_{ij} = a_{ji}$ for all $i$ and $j$. In other words, $A = A^T$. Such systems occur commonly in both
mathematical and engineering problems. They offer computational
advantages because only half the storage is needed and, in most
cases, only half the computation time is required for their solution.
A symmetric matrix can be decomposed according to Cholesky
decomposition as

$$A = LL^T$$

The result is expressed by recurrence relations. For the $k$th row,

$$l_{ki} = \frac{a_{ki} - \sum_{j=1}^{i-1} l_{ij}\, l_{kj}}{l_{ii}} \qquad \text{for } i = 1, 2, \ldots, k-1$$

and

$$l_{kk} = \sqrt{a_{kk} - \sum_{j=1}^{k-1} l_{kj}^2}$$
5 / 14
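To make the recurrences concrete, here is a minimal Python sketch of Cholesky decomposition that follows the row-by-row formulas above; the function name and the NumPy-based implementation are illustrative assumptions rather than code from the chapter.

```python
import numpy as np

def cholesky(A):
    """Cholesky decomposition A = L L^T of a symmetric, positive definite A."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.zeros((n, n))
    for k in range(n):                      # kth row of L
        for i in range(k):                  # off-diagonal terms l_ki, i < k
            L[k, i] = (A[k, i] - np.dot(L[i, :i], L[k, :i])) / L[i, i]
        # diagonal term l_kk
        L[k, k] = np.sqrt(A[k, k] - np.dot(L[k, :k], L[k, :k]))
    return L
```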
11.1 Special Matrices
Example 1
Apply Cholesky decomposition to the symmetric matrix

$$A = \begin{bmatrix} 6 & 15 & 55 \\ 15 & 55 & 225 \\ 55 & 225 & 979 \end{bmatrix}$$

Solution
For the first row ($k = 1$):

$l_{11} = \sqrt{a_{11}} = \sqrt{6} = 2.4495$

For the second row ($k = 2$):

$l_{21} = \dfrac{a_{21}}{l_{11}} = \dfrac{15}{2.4495} = 6.1237$

$l_{22} = \sqrt{a_{22} - l_{21}^2} = \sqrt{55 - (6.1237)^2} = 4.1833$
6 / 14
11.1 Special Matrices

For the third row ($k = 3$):

$l_{31} = \dfrac{a_{31}}{l_{11}} = \dfrac{55}{2.4495} = 22.454$

$l_{32} = \dfrac{a_{32} - l_{21} l_{31}}{l_{22}} = \dfrac{225 - 6.1237 \cdot 22.454}{4.1833} = 20.917$

$l_{33} = \sqrt{a_{33} - l_{31}^2 - l_{32}^2} = \sqrt{979 - (22.454)^2 - (20.917)^2} = 6.1101$

Thus, the Cholesky decomposition yields

$$L = \begin{bmatrix} 2.4495 & 0 & 0 \\ 6.1237 & 4.1833 & 0 \\ 22.454 & 20.917 & 6.1101 \end{bmatrix} \qquad L^T = \begin{bmatrix} 2.4495 & 6.1237 & 22.454 \\ 0 & 4.1833 & 20.917 \\ 0 & 0 & 6.1101 \end{bmatrix}$$

7 / 14
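As an independent check of Example 1 (again, an illustrative addition rather than part of the slides), NumPy's built-in routine reproduces the same lower-triangular factor:

```python
import numpy as np

A = np.array([[ 6.0,  15.0,  55.0],
              [15.0,  55.0, 225.0],
              [55.0, 225.0, 979.0]])

L = np.linalg.cholesky(A)       # lower-triangular factor, A = L @ L.T
print(L)                        # agrees with the hand-computed L above, to rounding
print(np.allclose(A, L @ L.T))  # True
```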
11.1 Special Matrices

Example 2
Use Cholesky decomposition to solve the following system

$$\begin{cases} 6x_1 + 15x_2 + 55x_3 = 152.6 \\ 15x_1 + 55x_2 + 225x_3 = 585.6 \\ 55x_1 + 225x_2 + 979x_3 = 2488.8 \end{cases}$$
Solution
We first decompose the matrix $A$ into the lower ($L$) and upper ($L^T$)
triangular matrices (see Example 1):

$$L = \begin{bmatrix} 2.4495 & 0 & 0 \\ 6.1237 & 4.1833 & 0 \\ 22.454 & 20.917 & 6.1101 \end{bmatrix} \qquad L^T = \begin{bmatrix} 2.4495 & 6.1237 & 22.454 \\ 0 & 4.1833 & 20.917 \\ 0 & 0 & 6.1101 \end{bmatrix}$$

8 / 14
11.1 Special Matrices

Then, we solve $Ld = b$ by forward substitution:

$$\begin{bmatrix} 2.4495 & 0 & 0 \\ 6.1237 & 4.1833 & 0 \\ 22.454 & 20.917 & 6.1101 \end{bmatrix} \begin{bmatrix} d_1 \\ d_2 \\ d_3 \end{bmatrix} = \begin{bmatrix} 152.6 \\ 585.6 \\ 2488.8 \end{bmatrix}$$

That gives

$2.4495\,d_1 = 152.6 \implies d_1 = 62.2984$

$6.1237\,d_1 + 4.1833\,d_2 = 585.6 \implies d_2 = 48.79$

$22.454\,d_1 + 20.917\,d_2 + 6.1101\,d_3 = 2488.8 \implies d_3 = 11.3601$

9 / 14
11.1 Special Matrices

Solution
Finally, we solve $L^T x = d$ by back substitution:

$$\begin{bmatrix} 2.4495 & 6.1237 & 22.454 \\ 0 & 4.1833 & 20.917 \\ 0 & 0 & 6.1101 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 62.2984 \\ 48.7900 \\ 11.3601 \end{bmatrix}$$

That gives:

$6.1101\,x_3 = 11.3601 \implies x_3 = 1.8592$

$4.1833\,x_2 + 20.917\,x_3 = 48.79 \implies x_2 = 2.3667$

$2.4495\,x_1 + 6.1237\,x_2 + 22.454\,x_3 = 62.2984 \implies x_1 = 2.4734$

10 / 14
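The two substitution passes of Example 2 can be sketched in Python as follows; the function name `solve_cholesky` is an assumption for this illustration, and the factor is obtained from NumPy so the snippet is self-contained.

```python
import numpy as np

def solve_cholesky(L, b):
    """Solve A x = b given the Cholesky factor L (A = L L^T)."""
    n = len(b)
    d = np.zeros(n)
    for i in range(n):                      # forward substitution: L d = b
        d[i] = (b[i] - np.dot(L[i, :i], d[:i])) / L[i, i]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):          # back substitution: L^T x = d
        x[i] = (d[i] - np.dot(L[i+1:, i], x[i+1:])) / L[i, i]
    return x

A = np.array([[ 6.0,  15.0,  55.0],
              [15.0,  55.0, 225.0],
              [55.0, 225.0, 979.0]])
b = np.array([152.6, 585.6, 2488.8])
x = solve_cholesky(np.linalg.cholesky(A), b)
print(x)   # approximately [2.4734, 2.3667, 1.8592]
```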
11.2 Gauss-Seidel
Gauss-Seidel
Iterative or approximate methods provide an alternative to the
elimination methods described to this point. Such approaches are
similar to the techniques we developed to obtain the roots of a
single equation in Chapter 6. Those approaches consisted of
guessing a value and then using a systematic method to obtain a
refined estimate of the root.
The Gauss-Seidel method is the most commonly used iterative
method. We limit ourselves to a 3 × 3 set of equations

$a_{11}x_1 + a_{12}x_2 + a_{13}x_3 = b_1$

$a_{21}x_1 + a_{22}x_2 + a_{23}x_3 = b_2$

$a_{31}x_1 + a_{32}x_2 + a_{33}x_3 = b_3$

If the diagonal elements are all nonzero, the first equation can be
solved for $x_1$, the second for $x_2$, and the third for $x_3$ to yield:
11 / 14
11.2 Gauss-Seidel
Gauss-Seidel

$x_1 = \dfrac{b_1 - a_{12}x_2 - a_{13}x_3}{a_{11}}$

$x_2 = \dfrac{b_2 - a_{21}x_1 - a_{23}x_3}{a_{22}}$

$x_3 = \dfrac{b_3 - a_{31}x_1 - a_{32}x_2}{a_{33}}$
Now, we can start the solution process by choosing guesses for the
$x$'s. A simple way is to assume that they are all zero. These zeros
can be substituted into the first equation to calculate a new value,
$x_1 = b_1/a_{11}$. Then, we substitute this new value of $x_1$, along
with the previous guess of zero for $x_3$, into the second equation to
compute a new value for $x_2$. The process is repeated for the third
equation to calculate a new estimate for $x_3$. Then we return to the
first equation and repeat the entire procedure until our solution
converges closely enough to the true values.
12 / 14
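A minimal Python sketch of this procedure for a general n-by-n system is given below; the function name, the all-zero initial guess, and the stopping tolerance are assumptions made for the illustration, not prescriptions from the chapter.

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-6, max_iter=100):
    """Iteratively solve A x = b, updating each x_i with the newest values."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros(n)                      # initial guesses: all zeros
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # use already-updated x[:i] and not-yet-updated x[i+1:]
            s = np.dot(A[i, :i], x[:i]) + np.dot(A[i, i+1:], x[i+1:])
            x[i] = (b[i] - s) / A[i, i]
        if np.max(np.abs(x - x_old)) < tol:
            break
    return x
```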
11.2 Gauss-Seidel
Example 3
Use the Gauss-Seidel method to obtain the solution of the system

$$\begin{cases} 3x_1 - 0.1x_2 - 0.2x_3 = 7.85 \\ 0.1x_1 + 7x_2 - 0.3x_3 = -19.3 \\ 0.3x_1 - 0.2x_2 + 10x_3 = 71.4 \end{cases}$$

Recall that the true solution is $x_1 = 3$, $x_2 = -2.5$, and $x_3 = 7$.

Solution
First, solve each of the equations for its unknown on the diagonal.

$x_1 = \dfrac{7.85 + 0.1x_2 + 0.2x_3}{3} = \dfrac{7.85 + 0.1(0) + 0.2(0)}{3} = 2.616667 \qquad (\varepsilon_t = 12.78\%)$

$x_2 = \dfrac{-19.3 - 0.1x_1 + 0.3x_3}{7} = \dfrac{-19.3 - 0.1(2.616667) + 0.3(0)}{7} = -2.794524$

$x_3 = \dfrac{71.4 - 0.3x_1 + 0.2x_2}{10} = \dfrac{71.4 - 0.3(2.616667) + 0.2(-2.794524)}{10} = 7.005610$

13 / 14
11.2 Gauss-Seidel

For the second iteration, the same process is repeated to compute

$x_1 = \dfrac{7.85 + 0.1x_2 + 0.2x_3}{3} = \dfrac{7.85 + 0.1(-2.794524) + 0.2(7.005610)}{3} = 2.990557$

$x_2 = \dfrac{-19.3 - 0.1x_1 + 0.3x_3}{7} = \dfrac{-19.3 - 0.1(2.990557) + 0.3(7.005610)}{7} = -2.499625$

$x_3 = \dfrac{71.4 - 0.3x_1 + 0.2x_2}{10} = \dfrac{71.4 - 0.3(2.990557) + 0.2(-2.499625)}{10} = 7.000291$
and the corresponding errors for the three variables are

$\varepsilon_{t,1} = 100\% \times \dfrac{|3 - 2.990557|}{|3|} = 0.31\%$

$\varepsilon_{t,2} = 100\% \times \dfrac{|-2.5 + 2.499625|}{|-2.5|} = 0.015\%$

$\varepsilon_{t,3} = 100\% \times \dfrac{|7 - 7.000291|}{|7|} = 0.0042\%$

14 / 14
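The two iterations worked above can be reproduced with a short, self-contained loop (equivalent to two passes of the `gauss_seidel` sketch shown earlier); the printed values in the comments are what the hand computation gives, to rounding.

```python
import numpy as np

A = np.array([[3.0, -0.1, -0.2],
              [0.1,  7.0, -0.3],
              [0.3, -0.2, 10.0]])
b = np.array([7.85, -19.3, 71.4])

x = np.zeros(3)                          # initial guesses: x1 = x2 = x3 = 0
for it in range(2):                      # the two iterations worked above
    for i in range(3):
        s = np.dot(A[i, :i], x[:i]) + np.dot(A[i, i+1:], x[i+1:])
        x[i] = (b[i] - s) / A[i, i]
    print(it + 1, x)
# 1 [ 2.61666667 -2.79452381  7.00560952]
# 2 [ 2.99055651 -2.49962468  7.00029081]
```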
