Mathematical Methods Lecture Slides, Week 6: Linear Algebra II
6. Linear Algebra II
Outline
Variance-Covariance Matrices
Orthogonal Matrices
Transposes
Let A be an m × n matrix; the transpose of A is denoted by A^T
The columns of A^T are the rows of A
If the dimension of A is m × n, then the dimension of A^T is n × m
"
A=
1 2 3
4 5 6
1 4
AT = 2 5
3 6
Properties
(A^T)^T = A
The transpose of A + B is (A + B)^T = A^T + B^T
The transpose of AB is (AB)^T = B^T A^T
The transpose of A^{-1} is (A^{-1})^T = (A^T)^{-1}
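These identities are easy to sanity-check numerically. The following R sketch (with arbitrary small matrices, not from the original slides) verifies the product and inverse rules:

A <- matrix(1:6, nrow = 2, byrow = TRUE)       # the 2 x 3 matrix from the example
B <- matrix(7:12, nrow = 3, byrow = TRUE)      # an arbitrary 3 x 2 matrix
all.equal(t(t(A)), A)                          # (A^T)^T = A
all.equal(t(A %*% B), t(B) %*% t(A))           # (AB)^T = B^T A^T
M <- matrix(c(2, 1, 1, 3), nrow = 2)           # an arbitrary invertible matrix
all.equal(t(solve(M)), solve(t(M)))            # (A^-1)^T = (A^T)^-1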
Symmetric Matrices
A symmetric matrix satisfies A = A^T
This implies that a_ij = a_ji
$$A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 5 & 4 \\ 3 & 4 & 9 \end{bmatrix} = A^T$$
A diagonal matrix is automatically symmetric
$$D = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 9 \end{bmatrix} = D^T$$
Permutation Matrices
An n × n permutation matrix P has the rows of I in any order
There are 6 possible 3 × 3 permutation matrices:
$$I = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad
P_{21} = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad
P_{31} = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}$$

$$P_{32} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix} \quad
P_{32}P_{21} = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{bmatrix} \quad
P_{21}P_{32} = \begin{bmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}$$
P^{-1} is the same as P^T
Example
$$P_{32}\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}
= \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}
= \begin{bmatrix} 1 \\ 3 \\ 2 \end{bmatrix}, \qquad
P_{32}\begin{bmatrix} 1 \\ 3 \\ 2 \end{bmatrix}
= \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$$
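As an illustration (not part of the original slide), the R snippet below builds P32 explicitly and checks that its inverse equals its transpose:

P32 <- matrix(c(1, 0, 0,
                0, 0, 1,
                0, 1, 0), nrow = 3, byrow = TRUE)   # swaps rows 2 and 3
x <- c(1, 2, 3)
P32 %*% x                        # gives (1, 3, 2)
P32 %*% (P32 %*% x)              # applying P32 twice restores (1, 2, 3)
all.equal(solve(P32), t(P32))    # P^-1 equals P^T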
Spaces of Vectors
The space R^n consists of all column vectors with n components
For example, R^3 consists of all column vectors with 3 components and is called 3-dimensional space
The space R^2 is the xy plane: the two components of the vector are the x and y coordinates, and the tail starts at the origin (0, 0)
Two essential vector operations go on inside the vector space:
Add two vectors in R^n
Multiply a vector in R^n by a scalar
The result lies in the same vector space R^n
Subspaces
A subspace of a vector space is a set of vectors (including the zero vector) that satisfies two requirements:
If u and w are vectors in the subspace and c is any scalar, then
(i) u + w is in the subspace
(ii) cw is in the subspace
Some subspaces of R^3:
L: any line through (0, 0, 0), e.g., the x axis
P: any plane through (0, 0, 0)
R^3 itself
The Column Space of A
Let A be an m × n matrix:
The columns of A have m components
The columns of A live in R^m
The column space of A is a subspace of R^m
The Nullspace of A
The nullspace of A consists of all solutions to Ax = 0
The nullspace of A is denoted by N(A)
Example: elimination
$$\begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 8 \\ 3 & 6 & 11 \end{bmatrix}
\;\rightarrow\;
\begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 2 \\ 0 & 0 & 2 \end{bmatrix}
\;\rightarrow\;
\begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 2 \\ 0 & 0 & 0 \end{bmatrix}$$
The pivot variables are x_1 and x_3
The free variable is x_2 (column 2 has no pivot)
The number of pivots is called the rank of A
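A hedged R sketch of this elimination example: qr() reports the rank, and the special solution (-2, 1, 0), obtained by setting the free variable x2 = 1, lies in the nullspace.

A <- matrix(c(1, 2, 3,
              2, 4, 8,
              3, 6, 11), nrow = 3, byrow = TRUE)
qr(A)$rank          # 2, the number of pivots
x <- c(-2, 1, 0)    # free variable x2 = 1; back-substitute for x3 and x1
A %*% x             # the zero vector, so x is in N(A)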
Linear Independence
A sequence of vectors v1 , v2 , . . . , vn is linearly independent if the only
linear combination that gives the zero vector is 0v1 + 0v2 + . . . + 0vn
The columns of A are independent if the only solution to Ax = 0 is
the zero vector
Elimination produces no free variables
The rank of A is equal to n
The nullspace N(A) contains only the zero vector
A set of vectors spans a space if their linear combinations fill the space
A basis for a vector space is a sequence of vectors that
(i) are linearly independent
(ii) span the space
Recall the sample mean, variance, and covariance of observations x_1, …, x_m and y_1, …, y_m:

Sample mean:
$$\bar{x} = \frac{1}{m}\sum_{i=1}^{m} x_i$$

Sample variance:
$$\mathrm{Var}(x) = \frac{1}{m-1}\sum_{i=1}^{m}\left(x_i - \bar{x}\right)^2$$

Sample covariance:
$$\mathrm{Cov}(x, y) = \frac{1}{m-1}\sum_{i=1}^{m}\left(x_i - \bar{x}\right)\left(y_i - \bar{y}\right)$$
Sample Variance
Let e be a column vector of m ones
$$\frac{e^T x}{m} = \frac{1}{m}\sum_{i=1}^{m} 1\cdot x_i = \frac{1}{m}\sum_{i=1}^{m} x_i = \bar{x}$$

x̄ is a scalar
x̄ e is a column vector repeating x̄ m times:
$$\frac{e^T x}{m}\, e = \bar{x}\, e$$
Let x̃ = x − x̄ e; the i-th element of x̃ is x̃_i = x_i − x̄
Take another look at the sample variance:
$$\mathrm{Var}(x) = \frac{1}{m-1}\sum_{i=1}^{m}\left(x_i - \bar{x}\right)^2
= \frac{1}{m-1}\sum_{i=1}^{m}\tilde{x}_i^2
= \frac{\tilde{x}^T\tilde{x}}{m-1}
= \frac{\lVert\tilde{x}\rVert^2}{m-1}$$
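A quick R check of this identity, using made-up data (not from the slides):

x <- c(2, 4, 6, 8, 10)
m <- length(x)
e <- rep(1, m)
xbar <- sum(t(e) %*% x) / m        # e^T x / m
xtil <- x - xbar * e               # centered vector x~
crossprod(xtil) / (m - 1)          # x~^T x~ / (m - 1)
var(x)                             # the built-in sample variance agrees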
Sample Covariance
A similar result holds for the sample covariance
Let ỹ = y − ȳ e, where
$$\frac{e^T y}{m} = \bar{y} \qquad\text{and}\qquad \frac{e^T y}{m}\, e = \bar{y}\, e$$

$$\mathrm{Cov}(x, y) = \frac{1}{m-1}\sum_{i=1}^{m}\left(x_i - \bar{x}\right)\left(y_i - \bar{y}\right)
= \frac{1}{m-1}\sum_{i=1}^{m}\tilde{x}_i\tilde{y}_i
= \frac{\tilde{x}^T\tilde{y}}{m-1}$$
Variance-Covariance Matrix
Suppose x and y are the columns of a matrix R
$$R = \begin{bmatrix} | & | \\ x & y \\ | & | \end{bmatrix}, \qquad
\tilde{R} = \begin{bmatrix} | & | \\ \tilde{x} & \tilde{y} \\ | & | \end{bmatrix}$$

$$\mathrm{Cov}(R) = \begin{bmatrix} \mathrm{Cov}(x, x) & \mathrm{Cov}(x, y) \\ \mathrm{Cov}(y, x) & \mathrm{Cov}(y, y) \end{bmatrix}
= \frac{1}{m-1}\begin{bmatrix} \tilde{x}^T\tilde{x} & \tilde{x}^T\tilde{y} \\ \tilde{y}^T\tilde{x} & \tilde{y}^T\tilde{y} \end{bmatrix}
= \frac{\tilde{R}^T\tilde{R}}{m-1}$$
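The matrix identity can be checked the same way in R, again with arbitrary illustrative data:

x <- c(1, 3, 2, 5, 4)
y <- c(2, 1, 4, 3, 5)
R <- cbind(x, y)
m <- nrow(R)
Rtil <- scale(R, center = TRUE, scale = FALSE)   # subtract each column mean
crossprod(Rtil) / (m - 1)                        # R~^T R~ / (m - 1)
cov(R)                                           # the built-in covariance matrix agrees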
$$\tilde{x} = x - \bar{x}\, e = x - \frac{ee^T}{m}x = \underbrace{\left(I - \frac{ee^T}{m}\right)}_{A}\, x$$

What is A?
The outer product ee^T is an m × m matrix
I is the m × m identity matrix
A is an m × m matrix
Premultiplication by A turns x into x̃
Can think of matrix multiplication as one matrix acting on another
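A minimal R sketch of the centering matrix (values arbitrary); it also previews the symmetry and idempotence properties derived on the next slides:

m <- 4
e <- rep(1, m)
A <- diag(m) - tcrossprod(e) / m   # A = I - ee^T / m
x <- c(3, 1, 4, 2)
A %*% x                            # same as x - mean(x)
all.equal(t(A), A)                 # A^T = A (symmetric)
all.equal(A %*% A, A)              # A^2 = A (idempotent)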
$$AR = A\begin{bmatrix} | & | \\ x & y \\ | & | \end{bmatrix}
= \begin{bmatrix} | & | \\ Ax & Ay \\ | & | \end{bmatrix}
= \begin{bmatrix} | & | \\ \tilde{x} & \tilde{y} \\ | & | \end{bmatrix} = \tilde{R}$$

$$\mathrm{Cov}(R) = \frac{1}{m-1}\tilde{R}^T\tilde{R} = \frac{1}{m-1}(AR)^T(AR)$$
$$\frac{1}{m-1}(AR)^T(AR) = \frac{1}{m-1}\, R^T A^T A\, R$$

$$A^T = \left(I - \frac{ee^T}{m}\right)^T = I^T - \frac{[e^T]^T e^T}{m} = I - \frac{ee^T}{m} = A$$

Since A^T = A, A is symmetric
$$A^2 = \left(I - \frac{ee^T}{m}\right)\left(I - \frac{ee^T}{m}\right)
= I^2 - \frac{ee^T}{m} - \frac{ee^T}{m} + \frac{ee^T ee^T}{m^2}$$

$$= I - 2\,\frac{ee^T}{m} + \frac{e(e^T e)e^T}{m^2}
= I - 2\,\frac{ee^T}{m} + \frac{e\, m\, e^T}{m^2}
= I - \frac{ee^T}{m} = A$$

So A^2 = A
$$\frac{1}{m-1}\, R^T A^T A\, R = \frac{1}{m-1}\, R^T A\, R
= \frac{1}{m-1}\, R^T\left(I - \frac{ee^T}{m}\right) R
= \frac{1}{m-1}\left[R^T R - \frac{(R^T e)(e^T R)}{m}\right] = \mathrm{Cov}(R)$$

(Dimensions: 1/(m−1) is a scalar, R^T is n × m, A is m × m, R is m × n, R^T e is n × 1, e^T R is 1 × n, and the result is n × n.)
Orthogonal Matrices
Two vectors q, w ∈ R^m are orthogonal if their inner product is zero: ⟨q, w⟩ = 0
Consider a set of m vectors {q_1, …, q_m} where q_j ∈ R^m \ {0}
Assume that the vectors {q_1, …, q_m} are pairwise orthogonal
Let $\hat{q}_j = \dfrac{q_j}{\lVert q_j\rVert}$ be a unit vector in the same direction as q_j
The vectors {q̂_1, …, q̂_m} are orthonormal
Let Q be a matrix with columns {q̂_j} and consider the product Q^T Q:
$$Q^T Q = \begin{bmatrix} \hat{q}_1^T \\ \vdots \\ \hat{q}_m^T \end{bmatrix}
\begin{bmatrix} \hat{q}_1 & \cdots & \hat{q}_m \end{bmatrix}
= \begin{bmatrix} \hat{q}_1^T\hat{q}_1 & & \hat{q}_i^T\hat{q}_j \\ & \ddots & \\ \hat{q}_i^T\hat{q}_j & & \hat{q}_m^T\hat{q}_m \end{bmatrix} = I$$
since $\hat{q}_i^T\hat{q}_j = 1$ when i = j and 0 otherwise
Orthogonal Matrices
A square matrix Q is orthogonal if Q^T Q = I and QQ^T = I
Orthogonal matrices represent rotations and reflections
Example:
$$Q = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$$

$$Q^T Q = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}
\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}
= \begin{bmatrix} \cos^2\theta + \sin^2\theta & 0 \\ 0 & \sin^2\theta + \cos^2\theta \end{bmatrix} = I$$
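For a concrete check (not in the original slides), here is the rotation matrix in R for an arbitrary angle:

theta <- pi / 6
Q <- matrix(c(cos(theta), -sin(theta),
              sin(theta),  cos(theta)), nrow = 2, byrow = TRUE)
round(t(Q) %*% Q, 12)   # the identity matrix
round(Q %*% t(Q), 12)   # the identity matrix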
$$Q^T = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}, \qquad
Q\,Q^T = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}
\begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} = I$$
If D is a diagonal matrix with every d_i ≠ 0, its inverse is the diagonal matrix of reciprocals:
$$D = \begin{bmatrix} d_1 & & 0 \\ & \ddots & \\ 0 & & d_m \end{bmatrix}, \qquad
D^{-1} = \begin{bmatrix} 1/d_1 & & 0 \\ & \ddots & \\ 0 & & 1/d_m \end{bmatrix}$$
Multiplication
[Figure (textbook page on linear transformations): the SVD A = UΣV^T viewed as a sequence of maps. U and V are rotations and reflections; Σ is a stretching matrix. For example, Av_2 = UΣV^T v_2 stretches v_2 along the second axis.]
Multiplication
Visualize: Ax = UΣV^T x
V^T: rotate right 45°
Σ: stretch the x coordinate by 1.5 and the y coordinate by 2
U: rotate right 22.5°
Applied in sequence: x := V^T x, then x := Σx, then x := Ux
Example
> load("R.RData")
> library(MASS)
> eqscplot(R)
> svdR <- svd(R)
> U <- svdR$u
> S <- diag(svdR$d)
> V <- svdR$v
[Scatter plot: C Returns (%) on the horizontal axis vs BAC Returns (%) on the vertical axis]
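Assuming R.RData supplies the returns matrix R used above, a quick (illustrative) check is that the three factors recombine to give R back:

max(abs(R - U %*% S %*% t(V)))   # essentially zero: R = U S V^T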
Example (continued)
[Scatter plot: C Returns (%) vs BAC Returns (%)]
Let R be an m × n matrix and let $\tilde{R} = \left(I - \dfrac{ee^T}{m}\right) R$
Let $\tilde{R} = U\Sigma V^T$ be the singular value decomposition of R̃
Recall that
$$\mathrm{Cov}(R) = \frac{1}{m-1}\tilde{R}^T\tilde{R}
= \frac{1}{m-1}\left(U\Sigma V^T\right)^T\left(U\Sigma V^T\right)
= \frac{1}{m-1}\, V\Sigma^T U^T U\,\Sigma V^T
= \frac{1}{m-1}\, V\Sigma^T\Sigma V^T$$
$$\Sigma^T\Sigma = \begin{bmatrix} \sigma_1^2 & & \\ & \ddots & \\ & & \sigma_n^2 \end{bmatrix}
\qquad (\Sigma^T \text{ is } n \times m,\ \Sigma \text{ is } m \times n)$$

$$\frac{1}{m-1}\,\Sigma^T\Sigma = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix} = \Lambda,
\qquad \lambda_1 = \frac{\sigma_1^2}{m-1},\ \ldots,\ \lambda_n = \frac{\sigma_n^2}{m-1}$$
$$\mathrm{Cov}(R) = \frac{1}{m-1}\, V\Sigma^T\Sigma V^T = V\Lambda V^T$$

$$\mathrm{Cov}(R)\, v_j = V\Lambda V^T v_j = V\Lambda e_j = V\left(\lambda_j e_j\right) = \lambda_j v_j$$

(Here V^T v_j = e_j because the columns of V are orthonormal.)
In summary
$$\mathrm{Cov}(R) = V\Lambda V^T, \qquad \mathrm{Cov}(R)\, v_j = \lambda_j v_j$$
The columns v_j of V are eigenvectors of Cov(R), with eigenvalues λ_j = σ_j²/(m−1)
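A hedged R sketch of this summary, again assuming R is the returns matrix loaded earlier: the right singular vectors of the centered data match the eigenvectors of Cov(R) (possibly up to sign and column order), and the eigenvalues are sigma_j^2 / (m - 1).

m <- nrow(R)
Rtil <- scale(R, center = TRUE, scale = FALSE)   # centered returns
sv <- svd(Rtil)
sv$d^2 / (m - 1)          # compare with ...
eigen(cov(R))$values      # ... the eigenvalues of Cov(R)
sv$v                      # compare with ...
eigen(cov(R))$vectors     # ... the eigenvectors (up to sign)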
Diagonalizing a Matrix
Suppose an n × n matrix A has n linearly independent eigenvectors
Let S be a matrix whose columns are the n eigenvectors of A
$$S^{-1}AS = \Lambda = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix}$$

$$S^{-1}AS = \Lambda \quad\Longleftrightarrow\quad A = S\Lambda S^{-1}$$
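An illustrative R check of A = S Lambda S^-1 for a small matrix chosen arbitrarily:

A <- matrix(c(2, 1,
              1, 2), nrow = 2, byrow = TRUE)
es <- eigen(A)
S <- es$vectors                           # columns are eigenvectors of A
Lambda <- diag(es$values)                 # diagonal matrix of eigenvalues
all.equal(A, S %*% Lambda %*% solve(S))   # A = S Lambda S^-1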
If A is symmetric, the eigenvector matrix can be chosen to be orthogonal: A = QΛQ^T with Q^{-1} = Q^T
Positive Definite Matrices
$$2 \times 2 \text{ case:}\quad x^T A x = \begin{bmatrix} x_1 & x_2 \end{bmatrix}
\begin{bmatrix} a & b \\ b & c \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \end{bmatrix}
= a x_1^2 + 2b x_1 x_2 + c x_2^2 > 0$$
R Example
[Plot of the returns data: C Returns (%) on the horizontal axis]
Least Squares
Model: ŷ = α + βx
Criterion: $\sum_{i=1}^{m}\left(y_i - \hat{y}_i\right)^2$ should be minimum
Choose α̂ and β̂ so that $\sum_{i=1}^{m}\left[y_i - (\alpha + \beta x_i)\right]^2$ is minimized when α = α̂ and β = β̂
[Scatter plot: Citigroup Returns vs S&P 500 Returns, with the fitted line ŷ = α̂ + β̂x and a candidate line ỹ = α̃ + β̃x]
Least Squares
What does the column picture look like?
Let y = (y1 , y2 , . . . , ym )
Let x = (x1 , x2 , . . . , xm )
Let e be a column vector of m ones
Can write ŷ as a linear combination:
$$\hat{y} = \begin{bmatrix} \hat{y}_1 \\ \vdots \\ \hat{y}_m \end{bmatrix}
= \alpha e + \beta x
= \alpha\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} + \beta\begin{bmatrix} x_1 \\ \vdots \\ x_m \end{bmatrix}
= \begin{bmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_m \end{bmatrix}\begin{bmatrix} \alpha \\ \beta \end{bmatrix}
= X\begin{bmatrix} \alpha \\ \beta \end{bmatrix}$$

Want to minimize
$$\sum_{i=1}^{m}\left(y_i - \hat{y}_i\right)^2 = \lVert y - \hat{y}\rVert^2
= \left\lVert y - X\begin{bmatrix} \alpha \\ \beta \end{bmatrix}\right\rVert^2$$
QR Factorization
Let A be an m × n matrix with linearly independent columns
Full QR Factorization: A can be written as the product of
an m × m orthogonal matrix Q
an m × n upper triangular matrix R
(upper triangular means r_ij = 0 when i > j)
A = QR
Want to minimize
$$\left\lVert y - X\begin{bmatrix} \alpha \\ \beta \end{bmatrix}\right\rVert^2
= \left\lVert y - QR\begin{bmatrix} \alpha \\ \beta \end{bmatrix}\right\rVert^2$$
Recall: an orthogonal transformation leaves vector lengths unchanged
$$\left\lVert y - QR\begin{bmatrix} \alpha \\ \beta \end{bmatrix}\right\rVert^2
= \left\lVert Q^T\!\left(y - QR\begin{bmatrix} \alpha \\ \beta \end{bmatrix}\right)\right\rVert^2
= \left\lVert Q^T y - R\begin{bmatrix} \alpha \\ \beta \end{bmatrix}\right\rVert^2$$
Least Squares
Let u = Q^T y
$$u - R\begin{bmatrix} \alpha \\ \beta \end{bmatrix}
= \begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ \vdots \\ u_m \end{bmatrix}
- \begin{bmatrix} r_{11} & r_{12} \\ 0 & r_{22} \\ 0 & 0 \\ \vdots & \vdots \\ 0 & 0 \end{bmatrix}
\begin{bmatrix} \alpha \\ \beta \end{bmatrix}
= \begin{bmatrix} u_1 - (r_{11}\alpha + r_{12}\beta) \\ u_2 - r_{22}\beta \\ u_3 \\ \vdots \\ u_m \end{bmatrix}$$

$$\left\lVert u - R\begin{bmatrix} \alpha \\ \beta \end{bmatrix}\right\rVert^2
= \left(u_1 - r_{11}\alpha - r_{12}\beta\right)^2 + \left(u_2 - r_{22}\beta\right)^2 + \sum_{i=n+1}^{m} u_i^2$$

The last sum does not involve α or β, so the minimum is attained by making the first two terms zero
Least Squares
Can find α̂ and β̂ by solving the linear system
$$\hat{R}\begin{bmatrix} \hat{\alpha} \\ \hat{\beta} \end{bmatrix} = \hat{u}:
\qquad \begin{bmatrix} r_{11} & r_{12} \\ 0 & r_{22} \end{bmatrix}
\begin{bmatrix} \hat{\alpha} \\ \hat{\beta} \end{bmatrix}
= \begin{bmatrix} u_1 \\ u_2 \end{bmatrix}$$
where R̂ is the first n rows of R and û is the first n elements of u
The system is already upper triangular; solve using back substitution
R Example
First, get the data
> library(quantmod)
> getSymbols(c("C", "GSPC"))
> citi <- c(coredata(monthlyReturn(C["2010"])))
> sp500 <- c(coredata(monthlyReturn(GSPC["2010"])))
R Example
Compute u = Q^T y
Solve for α̂ and β̂
[Scatter plot: Citigroup Returns vs S&P 500 Returns, with the fitted line ŷ = α̂ + β̂x and a candidate line ỹ = α̃ + β̃x]
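A minimal sketch of the computation described on this slide, assuming citi and sp500 from the previous slide and using R's qr() routines; the last line compares against lm() as a sanity check.

y <- citi
X <- cbind(1, sp500)        # columns e and x
QR <- qr(X)
Q <- qr.Q(QR)               # m x 2 matrix with orthonormal columns (thin QR)
Rhat <- qr.R(QR)            # 2 x 2 upper triangular matrix
u <- t(Q) %*% y             # u = Q^T y
backsolve(Rhat, u)          # alpha-hat and beta-hat via back substitution
coef(lm(citi ~ sp500))      # same coefficients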
http://computational-finance.uw.edu