Topic 6 - Summary
Definitions and some properties of matrices were introduced last year. This
topic covers some further applications of matrices. If A is invertible, the
system Ax = b has the solution x = A⁻¹b.
Gaussian elimination
For larger matrices, inversion is difficult to do without a computer. Gaussian
elimination solves the system Ax = b by reducing it to a far simpler system
by manipulating the equations, i.e. the rows of the matrix system. Consider
\[
\begin{pmatrix} a & b & c & d \\ e & f & g & h \\ i & j & k & l \\ m & n & o & p \end{pmatrix}
\begin{pmatrix} w \\ x \\ y \\ z \end{pmatrix}
=
\begin{pmatrix} q \\ r \\ s \\ t \end{pmatrix}
\]
If we can eliminate, i.e. make equal to zero, the lower triangular elements, then
this system is trivial to solve. We systematically eliminate the elements e, i, j,
m, n and o. To eliminate e, subtract the first row multiplied by e/a from the
second row to get
\[
\begin{pmatrix} a & b & c & d \\ 0 & f - be/a & g - ce/a & h - de/a \\ i & j & k & l \\ m & n & o & p \end{pmatrix}
\begin{pmatrix} w \\ x \\ y \\ z \end{pmatrix}
=
\begin{pmatrix} q \\ r - qe/a \\ s \\ t \end{pmatrix}
\]
Do the same process, sequentially, to all other lower triangular elements to leave
the system as
\[
\begin{pmatrix} A & B & C & D \\ 0 & F & G & H \\ 0 & 0 & K & L \\ 0 & 0 & 0 & P \end{pmatrix}
\begin{pmatrix} W \\ X \\ Y \\ Z \end{pmatrix}
=
\begin{pmatrix} Q \\ R \\ S \\ T \end{pmatrix}
\]
Working from row 4 up to row 1, i.e. by back substitution, the values of Z, Y ,
X and W can be easily calculated.
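The elimination and back-substitution steps above can be sketched in Python. This is a minimal illustration (the function name is illustrative, not from the notes); it assumes no pivot is zero, so no row swaps are needed.

```python
def solve_gaussian(A, b):
    """Solve Ax = b by forward elimination then back substitution.

    A is a list of n row-lists and b a list of n values; both are
    copied so the caller's data is untouched. Assumes no pivot is
    zero (no row interchanges, for simplicity).
    """
    n = len(A)
    A = [row[:] for row in A]
    b = b[:]
    # Forward elimination: zero out the lower-triangular elements,
    # e.g. subtract (e/a) times row 1 from row 2 to eliminate e.
    for col in range(n - 1):
        for row in range(col + 1, n):
            factor = A[row][col] / A[col][col]
            for k in range(col, n):
                A[row][k] -= factor * A[col][k]
            b[row] -= factor * b[col]
    # Back substitution: work from the last row up to the first.
    x = [0.0] * n
    for row in range(n - 1, -1, -1):
        s = b[row] - sum(A[row][k] * x[k] for k in range(row + 1, n))
        x[row] = s / A[row][row]
    return x
```

For example, `solve_gaussian([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])` solves the pair of equations 2w + x = 3 and w + 3x = 5.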
LU factorisation
Square matrices can often be factorised into the product of a lower and an upper
triangular matrix, which is useful for solving systems of linear equations. If we
factorise A = LU then the system of equations Ax = b becomes LUx = b
which may be solved in two stages. First solve Ly = b for y, then solve Ux = y
for x. The LU factorisation can be found by setting the diagonal elements of
either L or U to one and solving algebraically.
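A sketch of this in Python, taking the diagonal of L to be one (the Doolittle convention) and solving the two triangular stages in turn; the function names are illustrative, and no pivoting is attempted:

```python
def lu_factorise(A):
    """Factorise A = LU with ones on the diagonal of L (Doolittle).

    Assumes A is square and that no zero pivot arises.
    """
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        L[i][i] = 1.0
        for j in range(i, n):        # row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):    # column i of L
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

def lu_solve(L, U, b):
    """Solve LUx = b in two stages: Ly = b, then Ux = y."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):               # forward substitution for Ly = b
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):   # back substitution for Ux = y
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x
```

Because L and U are triangular, both stages need only substitution, so one factorisation can be reused to solve Ax = b for many different b.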
Least-squares estimation
Least-squares estimation is a useful technique for estimating solutions to
overdetermined systems of equations, i.e. systems in which there are more
equations than unknowns. Suppose we have a system
Ax = b̃
which has no exact solution. The estimate x̃ that minimises the squared error
|Ax − b̃|² is given by
x̃ = (AᵀA)⁻¹Aᵀb̃
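As a concrete sketch, fitting a straight line y = c₀ + c₁x through several points is an overdetermined system whose rows are [1, xᵢ]; here the 2×2 normal equations (AᵀA)x̃ = Aᵀb̃ are solved directly by Cramer's rule (the function name is illustrative):

```python
def least_squares_line(xs, ys):
    """Fit y = c0 + c1*x by the normal equations (A^T A) x = A^T b.

    Each row of A is [1, x_i] and b holds the y_i, so A^T A and
    A^T b reduce to simple sums over the data points.
    """
    n = len(xs)
    s1, sx, sxx = float(n), sum(xs), sum(x * x for x in xs)
    sy, sxy = sum(ys), sum(x * y for x, y in zip(xs, ys))
    det = s1 * sxx - sx * sx          # determinant of A^T A
    c0 = (sy * sxx - sx * sxy) / det  # Cramer's rule for the 2x2 system
    c1 = (s1 * sxy - sx * sy) / det
    return c0, c1
```

With three or more points, A has more rows than columns, but AᵀA is square (2×2 here), which is exactly why the normal equations make the system solvable.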
Eigenvalues and eigenvectors
An eigenvector v and its eigenvalue λ of a square matrix A satisfy
Av = λv
This equation has no unique solution for v, except v = 0, so the values of λ for
which non-trivial solutions exist are found by solving the characteristic equation
|A − λI| = 0
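For a 2×2 matrix the determinant |A − λI| expands to the quadratic λ² − tr(A)λ + det(A) = 0, which can be solved directly. A minimal sketch (assuming real eigenvalues; the function name is illustrative):

```python
import math

def eigvals_2x2(A):
    """Eigenvalues of a 2x2 matrix A from |A - lambda*I| = 0.

    The characteristic equation expands to
        lambda^2 - tr(A)*lambda + det(A) = 0,
    solved here by the quadratic formula (real roots assumed).
    """
    (a, b), (c, d) = A
    tr = a + d
    det = a * d - b * c
    disc = math.sqrt(tr * tr - 4.0 * det)
    return (tr + disc) / 2.0, (tr - disc) / 2.0
```

For larger matrices the characteristic polynomial quickly becomes impractical, and numerical methods are used instead.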
Eigenvalues and eigenvectors of symmetric matrices
The eigenvalues and eigenvectors of symmetric matrices have some additional
properties: the eigenvalues are real and the eigenvectors are mutually
orthogonal. A symmetric matrix can therefore be diagonalised as
A = PDP⁻¹
where D is the diagonal matrix of eigenvalues and the columns of P are the
eigenvectors, so P is orthogonal and P⁻¹ = Pᵀ. More generally, any matrix has
a singular value decomposition
A = USVᵀ
where U and V have orthonormal columns and S is diagonal with non-negative
entries, the singular values.
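These properties can be demonstrated for a symmetric 2×2 matrix: the eigenvalues come from the characteristic quadratic, and because the eigenvectors are orthogonal, P is orthogonal and A = PDPᵀ. A minimal sketch (the function name is illustrative; it assumes the off-diagonal entry is non-zero):

```python
import math

def symmetric_eig_2x2(A):
    """Eigendecomposition A = P D P^T of a symmetric 2x2 matrix.

    For symmetric A the eigenvalues are real and the eigenvectors
    orthogonal, so P is orthogonal and P^{-1} = P^T.
    Assumes A[0][1] != 0 (otherwise A is already diagonal).
    """
    (a, b), (_, d) = A
    tr = a + d
    det = a * d - b * b
    disc = math.sqrt(tr * tr - 4.0 * det)
    l1, l2 = (tr + disc) / 2.0, (tr - disc) / 2.0
    # Eigenvector for l1: (A - l1*I)v = 0 gives v = (b, l1 - a).
    norm = math.hypot(b, l1 - a)
    v1 = (b / norm, (l1 - a) / norm)
    v2 = (-v1[1], v1[0])          # orthogonal unit vector, for l2
    P = [[v1[0], v2[0]], [v1[1], v2[1]]]
    D = [[l1, 0.0], [0.0, l2]]
    return P, D
```

Multiplying P, D and Pᵀ back together recovers A, confirming the diagonalisation; for symmetric matrices the singular values in A = USVᵀ are simply the absolute values of the eigenvalues.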