
MATH 3322 Matrix Computation (Lecture 1)

Topics covered:

• Computational problems:

A) Solve a system of linear equations:

Ax = b,

where A is an n × n matrix, x is an n × 1 unknown vector, and b is an n × 1 vector.

B) Solve least squares problems:

min_{x ∈ R^n} ||Ax − b||,

where A ∈ R^{m×n} (usually m ≥ n) and b ∈ R^m.
C) Eigenvalue/Eigenvector problem

Ax = λx,

where A is an n × n matrix, x is an n × 1 nonzero vector, and λ is a number.

D) Singular Value Decomposition (SVD)

Av = σu and A^T u = σv,

where A ∈ R^{m×n}, v ∈ R^n, u ∈ R^m, σ ∈ R, and σ ≥ 0.

• Computational Methods:

1. Matrix factorization: Given A, we factorize it as A = P_1 P_2 ⋯ P_k.

a) A = LU for problem A)

b) A = QR for problem B)

c) A = UTU^T (Schur decomposition) for problem C)

d) A = UΣV^T (SVD) for problem D)

2. Iterative methods: Gauss-Seidel, Kaczmarz, gradient descent, conjugate gradient (a minimal sketch follows this list)

3. Randomized numerical algorithms
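As a taste of the iterative approach, here is a minimal Gauss-Seidel sketch for solving Ax = b. This is an illustrative sketch, not the course's reference implementation: the test matrix, tolerance, and iteration cap below are assumptions, and convergence requires conditions such as strict diagonal dominance.

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=1000):
    """Solve Ax = b by Gauss-Seidel sweeps (assumes convergence, e.g.
    A strictly diagonally dominant or symmetric positive definite)."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        for i in range(n):
            # Update x[i] using the most recent values of the other entries.
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(A @ x - b) < tol:
            break
    return x

# Example: a small, strictly diagonally dominant system.
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([1.0, 2.0])
print(gauss_seidel(A, b))  # agrees with np.linalg.solve(A, b)
```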


• Applications:

1) Image processing/computer vision

2) Machine learning/ Data analysis

Why are these problems important?

Example 1: Google PageRank / adjacency matrix

[Figure: a network/graph of linked web pages.]

Adjacency matrix: G = (g_ij), where

g_ij = 1 if there is a link from page j to page i, and g_ij = 0 otherwise.
Answering questions about the network (e.g., which page is the most important) amounts to solving a system of linear equations or an eigenvalue problem involving a normalized version of G.
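For illustration, here is a hedged sketch of the classical power-iteration formulation of PageRank. The damping factor d = 0.85 and the column-stochastic normalization are standard conventions assumed here, not details given in this lecture.

```python
import numpy as np

def pagerank(G, d=0.85, tol=1e-10):
    """Power iteration for the dominant eigenvector of the damped,
    column-normalized adjacency matrix. G[i, j] = 1 if page j links to page i."""
    n = G.shape[0]
    col_sums = G.sum(axis=0)
    col_sums[col_sums == 0] = 1.0   # avoid dividing by zero for dangling pages
    M = G / col_sums                # column-stochastic normalization of G
    r = np.full(n, 1.0 / n)         # start from the uniform distribution
    while True:
        r_new = d * (M @ r) + (1 - d) / n
        if np.linalg.norm(r_new - r, 1) < tol:
            return r_new
        r = r_new

# Example: page 0 links to pages 1 and 2, page 1 links to 2, page 2 links to 0.
G = np.array([[0, 0, 1],
              [1, 0, 0],
              [1, 1, 0]], dtype=float)
print(pagerank(G))
```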

Example 2: Regression:

Regression is one of the most important topics in statistics; it is about fitting a function to sampled data in order to explore the relationship between two quantities. For example, we want to use a cubic polynomial to fit a few pairs

(x_i, y_i), i = 1, 2, …, 8.

That is, we want to find a polynomial

y = c_0 + c_1 x + c_2 x^2 + c_3 x^3


such that

y_i = c_0 + c_1 x_i + c_2 x_i^2 + c_3 x_i^3, i = 1, 2, …, 8.

In matrix notation, this problem is rewritten as

Xc = y, where X is the 8 × 4 matrix whose i-th row is (1, x_i, x_i^2, x_i^3), c = (c_0, c_1, c_2, c_3)^T, and y = (y_1, …, y_8)^T,

which is indeed a least squares problem: min_c ||Xc − y||.
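A minimal NumPy sketch of this cubic fit (the sample data below are made up for illustration; np.linalg.lstsq solves min_c ||Xc − y|| directly):

```python
import numpy as np

# Made-up sample pairs (x_i, y_i), i = 1, ..., 8.
x = np.linspace(0.0, 2.0, 8)
y = 1.0 + 2.0 * x - x**2 + 0.5 * x**3 + 0.05 * np.random.randn(8)

# Build the 8 x 4 matrix X with rows (1, x_i, x_i^2, x_i^3).
X = np.vander(x, 4, increasing=True)

# Solve the least squares problem min_c ||Xc - y||.
c, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print("coefficients c_0..c_3:", c)
```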

Example 3: Principal Component Analysis

Suppose we are given n data points, modeled by vectors {x_i}, where each x_i is an m × 1 vector for i = 1, …, n.

Which direction is the most important?

To answer this question, we project all data points onto different lines, and then we compute the sum of the projection errors (in terms of squared Euclidean distance) of all data points.

[Figure: data points projected onto several candidate lines; the red line minimizes the total projection error.]

Clearly, the direction of the line that minimizes the projection error is the most important direction, for example, the red line in the figure.

What is the next most important direction?


These questions are answered by the singular value decomposition (SVD) of the matrix X = [x_1 x_2 ⋯ x_n].
This is known as principal component analysis (PCA) in data analysis/machine learning.
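A minimal NumPy sketch (the random data are hypothetical; centering the points is a standard PCA preprocessing step assumed here):

```python
import numpy as np

# Hypothetical data: n = 100 points in m = 3 dimensions, stacked as columns of X.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 100))
X = X - X.mean(axis=1, keepdims=True)   # center the data points

# The left singular vectors of X are the principal directions,
# ordered by the singular values (their importance).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
print("most important direction:", U[:, 0])
print("next most important direction:", U[:, 1])
```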

Example 4: Basic subroutines of advanced numerical algorithms

Consider the smooth unconstrained optimization problem

min_{x ∈ R^n} f(x).
To solve this problem, let x_k be the current estimate. We take the Taylor expansion of f(x) around x_k to get

f(x) ≈ f(x_k) + ∇f(x_k)^T (x − x_k) + ½ (x − x_k)^T ∇²f(x_k) (x − x_k),

where ∇f(x_k) is the gradient and ∇²f(x_k) is the Hessian. Then, in Newton's method, instead of minimizing f(x) directly, we minimize the right-hand side of the Taylor expansion.

That is, we solve

∇f(x_k) + ∇²f(x_k)(x − x_k) = 0,

which is a linear equation.

Actually, solving linear equations plays a fundamental role in second-order optimization algorithms.
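For illustration, here is a minimal sketch of Newton's method in which each step solves the linear system ∇²f(x_k) d = −∇f(x_k); the objective f below is a hypothetical example:

```python
import numpy as np

def newton_step(grad, hess, x_k):
    """One Newton iteration: solve Hessian(x_k) * d = -gradient(x_k),
    then move to x_k + d."""
    d = np.linalg.solve(hess(x_k), -grad(x_k))
    return x_k + d

# Hypothetical objective f(x) = x_1^2 + x_2^4 with closed-form derivatives.
grad = lambda x: np.array([2.0 * x[0], 4.0 * x[1] ** 3])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 12.0 * x[1] ** 2]])

x = np.array([1.0, 1.0])
for _ in range(5):
    x = newton_step(grad, hess, x)
print(x)  # approaches the minimizer (0, 0)
```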

What is emphasized?


• Mathematical foundations for linear equations, least squares systems, and various decompositions.

• How to solve matrix computation problems numerically? Algorithms, evaluated on:

a) Stability and accuracy

Example 1: Which algorithm is more accurate?


ˋ
Relative
108
i.i iii ii Alg1

Error ㄨㄨㄨ ㄨㄨ ㄨㄨ ㄨㄨ x Alg2

problems
b) Runtime

Example 2: Which algorithm uses less runtime for the same accuracy?
[Figure: relative error vs. runtime for Alg1 and Alg2.]
Example 3: Runtime vs Problem size
[Figure: runtime vs. matrix size N for several algorithms.]
c) Memory

• Algorithms are the most important.

• We won't emphasize exact analysis of stability and accuracy too much, but will give rules of thumb and use numerical results to demonstrate the core ideas.

• Solve the case-study problems using the algorithms taught in the course.
