
Quantitative Methods in Finance

Lecture 3: Matrices and Vectors (Final)


Simple Linear Regression (part 1)

October 19, 2023

Douglas Turatti
[email protected]
Aalborg University Business School
Denmark
Matrix Multiplication

▶ The rules for adding or subtracting matrices are quite natural and simple.

▶ The rule for matrix multiplication, however, is more subtle. Matrix multiplication is not element-wise multiplication.

▶ To understand the idea behind matrix multiplication, consider the following system of equations,

$$a_{11} y_1 + a_{12} y_2 = z_1 \qquad (1)$$
$$a_{21} y_1 + a_{22} y_2 = z_2 \qquad (2)$$

▶ Matrix multiplication allows us to write this system as AY = Z, where AY is the matrix multiplication of the matrices A and Y.

Matrix Multiplication

▶ How can this system be converted into matrix form?

▶ Let's suppose the answer is

$$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = \begin{pmatrix} z_1 \\ z_2 \end{pmatrix} \qquad (3)$$

▶ This answer must recover the original system. This means that the matrix multiplication is

$$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = \begin{pmatrix} a_{11} y_1 + a_{12} y_2 \\ a_{21} y_1 + a_{22} y_2 \end{pmatrix} \qquad (4)$$

which is a 2 × 1 vector.

▶ First conclusion: a 2 × 2 matrix multiplied by a 2 × 1 vector must yield a 2 × 1 vector.

Matrix Multiplication

▶ How do we perform the matrix multiplication? In the example, the result is a 2 × 1 vector.

▶ We recover the first element of the original system by multiplying the first row of the matrix A by the vector Y and adding the products.

▶ The first element of the first row of matrix A multiplies the first element of the vector. The second element of the first row of matrix A multiplies the second element of the vector. We then sum these products,

$$a_{11} y_1 + a_{12} y_2 \qquad (5)$$

which is a scalar.

▶ We recover the second element of the original system in the same way.

Matrix Multiplication
Definition

Definition
Suppose that A = (a_{ij})_{m×n} and that B = (b_{ij})_{n×p}. Then the product C = AB is the m × p matrix C = (c_{ij})_{m×p}, whose element in the i-th row and the j-th column is the inner product

$$c_{ij} = \sum_{r=1}^{n} a_{ir} b_{rj} = a_{i1} b_{1j} + a_{i2} b_{2j} + \cdots + a_{in} b_{nj} \qquad (6)$$

This means that the element c_{ij} is the dot product of the i-th row of A and the j-th column of B.

The matrix product exists if the number of columns in matrix A is equal to the number of rows in matrix B, and the resulting matrix has the number of rows of A and the number of columns of B.

Matrix Multiplication
Example

▶ Consider the matrices A and B. Is the product AB defined? If so, compute the matrix product AB. What about the product BA?

$$A = \begin{pmatrix} 0 & 1 & 2 \\ 2 & 3 & 1 \\ 4 & -1 & 6 \end{pmatrix} \qquad B = \begin{pmatrix} 3 & 2 \\ 1 & 0 \\ -1 & 1 \end{pmatrix} \qquad (7)$$

▶ Let's start with the product AB. For this product to be defined, the number of columns in matrix A must equal the number of rows in matrix B, which is satisfied in this case. The matrix product exists since A is 3 × 3 and B is 3 × 2, and the product AB is then 3 × 2.

Matrix Multiplication
Example

▶ The matrix multiplication is then calculated as

$$\begin{pmatrix} 0 & 1 & 2 \\ 2 & 3 & 1 \\ 4 & -1 & 6 \end{pmatrix} \begin{pmatrix} 3 & 2 \\ 1 & 0 \\ -1 & 1 \end{pmatrix} = \begin{pmatrix} 0 \times 3 + 1 \times 1 + 2 \times (-1) & 0 \times 2 + 1 \times 0 + 2 \times 1 \\ 2 \times 3 + 3 \times 1 + 1 \times (-1) & 2 \times 2 + 3 \times 0 + 1 \times 1 \\ 4 \times 3 + (-1) \times 1 + 6 \times (-1) & 4 \times 2 + (-1) \times 0 + 6 \times 1 \end{pmatrix} = \begin{pmatrix} -1 & 2 \\ 8 & 5 \\ 5 & 14 \end{pmatrix}$$

▶ What about the product BA? The product BA is not defined because the number of columns in B (= 2) is not equal to the number of rows in A (= 3). So the matrix BA does not exist.

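▶ The product can be checked in R. A minimal sketch, using the matrices from this example:

  A <- rbind(c(0, 1, 2), c(2, 3, 1), c(4, -1, 6))   # stack the rows of A
  B <- rbind(c(3, 2), c(1, 0), c(-1, 1))            # stack the rows of B
  A %*% B    # the 3 x 2 product AB
  # B %*% A fails with "non-conformable arguments": BA is not defined
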
Matrix Multiplication

Remark
Unlike multiplication of scalars, matrix multiplication is not commutative. In the previous example, AB was defined but BA was not. Even in cases in which AB and BA are both defined, they are usually not equal. When we write AB, we say that we premultiply B by A, whereas in BA we postmultiply B by A.

Matrix Multiplication
The Identity Matrix

▶ The identity matrix is the matrix equivalent of the scalar 1.

▶ The identity matrix of order n, denoted by I_n, is the n × n matrix having ones along the main diagonal and zeros elsewhere

$$I_n = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix} \qquad (8)$$

▶ For every n × n matrix A the following holds

$$A I_n = I_n A = A \qquad (9)$$

▶ The identity matrix is always a square matrix.

▶ Square matrix: a matrix with the same number of rows and columns.

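▶ In R, the identity matrix of order n is created with diag(n). A quick check of (9); a sketch reusing the 3 × 3 matrix A from the R example above:

  I3 <- diag(3)             # 3 x 3 identity matrix
  all.equal(A %*% I3, A)    # TRUE: post-multiplying by I_n leaves A unchanged
  all.equal(I3 %*% A, A)    # TRUE: pre-multiplying as well
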
Matrix Transpose

▶ Consider any m × n matrix A.

▶ The transpose of A, denoted by A′, is defined as the n × m matrix whose first column is the first row of A, whose second column is the second row of A, and so on.

▶ For example

$$A = \begin{pmatrix} 1 & 0 & 2 \\ 3 & 4 & 5 \end{pmatrix} \qquad A' = \begin{pmatrix} 1 & 3 \\ 0 & 4 \\ 2 & 5 \end{pmatrix} \qquad (10)$$

Symmetric Matrices

▶ Square matrices with the property that they are symmetric about the main diagonal are called symmetric.

▶ If the square matrix A is symmetric, then it holds that A = A′.

▶ For example

$$A = \begin{pmatrix} -3 & 2 \\ 2 & 0 \end{pmatrix} \qquad (11)$$

is symmetric as A = A′.

Symmetric Matrices

▶ Important result:

▶ Let X be an arbitrary m × n matrix. Then XX′ and X′X are both symmetric. Note that they are not the same: XX′ is m × m while X′X is n × n.

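▶ A quick numeric check in R; a sketch with an arbitrary 2 × 3 matrix X:

  X <- rbind(c(1, 0, 2), c(3, 4, 5))   # an arbitrary 2 x 3 matrix
  isSymmetric(X %*% t(X))              # TRUE: XX' is a symmetric 2 x 2 matrix
  isSymmetric(t(X) %*% X)              # TRUE: X'X is a symmetric 3 x 3 matrix
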
Determinants
Introduction

▶ Determinants are numbers (scalars) which summarize some key properties of the n² elements of a square n × n matrix.

▶ Determinants are one of the most important concepts in linear algebra. They are also relevant in empirical work.

▶ Here, we will not focus on how to calculate determinants, but on what their values mean.

▶ Determinants are only defined for square matrices, so we only talk about square matrices.

Determinants
Determinants of Order 2

▶ Consider the pair of linear equations

$$a_{11} x_1 + a_{12} x_2 = b_1 \qquad (12)$$
$$a_{21} x_1 + a_{22} x_2 = b_2 \qquad (13)$$

The solution to this system is

$$x_1 = \frac{b_1 a_{22} - b_2 a_{12}}{a_{11} a_{22} - a_{21} a_{12}} \qquad x_2 = \frac{b_2 a_{11} - b_1 a_{21}}{a_{11} a_{22} - a_{21} a_{12}} \qquad (14)$$

▶ This solution exists as long as a_{11} a_{22} − a_{21} a_{12} is different from 0.

▶ It turns out that a_{11} a_{22} − a_{21} a_{12} is the determinant of the matrix

$$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \qquad (15)$$

Determinants
Determinants of Order 2

▶ In this sense, the value of the determinant tells us whether the system has a unique solution.

▶ Consider the matrix

$$A = \begin{pmatrix} 4 & 1 \\ 3 & 2 \end{pmatrix} \qquad (16)$$

▶ The determinant is denoted |A| and is calculated as

$$|A| = 4 \times 2 - 3 \times 1 = 5 \qquad (17)$$

▶ This means that a system with A as its matrix of coefficients has a unique solution.

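▶ In R, a sketch of the same calculation:

  A <- rbind(c(4, 1), c(3, 2))
  det(A)    # returns 5, so A is nonsingular
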
Determinants
Determinants of Order n

▶ This method of calculating the determinant only applies to square matrices of order 2.

▶ Square matrices of higher order need other techniques.

▶ Here, we will not study those other methods.

Matrix Inverse

▶ Suppose that α is a real number different from 0. Then there is a unique number α⁻¹ with the property that αα⁻¹ = 1. We call α⁻¹ the inverse of α.

▶ We have also seen that the identity matrix plays for matrices the role that the number 1 plays for scalars.

▶ We will now find the matrix equivalent of the inverse of a number.

Matrix Inverse

▶ Given a square n × n matrix A, we say that X is an inverse of A if there exists a matrix X such that

$$AX = XA = I_n \qquad (18)$$

▶ Then A is said to be invertible.

▶ Because XA = AX = I_n, the matrix A is also an inverse of X; that is, A and X are inverses of each other.

▶ Only square matrices can have inverses.

▶ However, some square matrices are not invertible.

Matrix Inverse

Theorem
A square matrix A of dimension n is invertible if and only if |A| ≠ 0.

▶ This condition is necessary and sufficient.

▶ A square matrix A is said to be singular if |A| = 0 and nonsingular if |A| ≠ 0.

▶ A matrix has an inverse if and only if it is nonsingular.

▶ Singular matrices are those without an inverse.

Matrix Inverse
Matrix Inverse: Comments

▶ If the inverse of the matrix A exists, it must be unique.

▶ If the inverse of A exists, it is usually written as A⁻¹.

▶ Note that I/A does not exist. There are no rules for dividing matrices.

▶ If the product A⁻¹B is defined, it is usually quite different from BA⁻¹ because matrix multiplication is not commutative.

Matrix Inverse
Matrix Inverse: Properties

▶ Some properties of the inverse matrix:

1. (A⁻¹)⁻¹ = A

2. If A and B are invertible, then AB is invertible and (AB)⁻¹ = B⁻¹A⁻¹.

3. If A is invertible, then the transpose A′ is also invertible and (A′)⁻¹ = (A⁻¹)′.

4. For a scalar c: (cA)⁻¹ = c⁻¹A⁻¹ whenever c is a number and c ≠ 0.

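▶ A numeric sanity check of property 2 in R; a sketch with two arbitrary invertible 2 × 2 matrices, here using base R's solve() for the inverse:

  A <- rbind(c(4, 1), c(3, 2))
  B <- rbind(c(2, 1), c(2, 2))
  all.equal(solve(A %*% B), solve(B) %*% solve(A))   # TRUE: (AB)^(-1) = B^(-1) A^(-1)
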
Solving Linear Systems

▶ Consider the system of equations

$$2 x_1 + x_2 = 3 \qquad (19)$$
$$2 x_1 + 2 x_2 = 4 \qquad (20)$$

▶ The system can be written in matrix form as

$$\begin{pmatrix} 2 & 1 \\ 2 & 2 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 3 \\ 4 \end{pmatrix} \qquad (21)$$

▶ That is, the system of equations can be written in the form Ax = b with

$$A = \begin{pmatrix} 2 & 1 \\ 2 & 2 \end{pmatrix}, \qquad x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \qquad b = \begin{pmatrix} 3 \\ 4 \end{pmatrix} \qquad (22)$$

Solving Linear Systems

▶ A solution to the system is a pair x_1, x_2 that satisfies both equations.

▶ This system is very easy to solve with the use of matrices.

▶ Let's find the general solution for

$$Ax = b \qquad (23)$$

▶ To find x, we pre-multiply each side of (23) by A⁻¹

$$A^{-1} A x = A^{-1} b \qquad (24)$$
$$x = A^{-1} b \qquad (25)$$

▶ Using matrix inversion, the system is easy to solve.

Solving Linear Systems

▶ In the previous example, let's find the inverse of A

$$A^{-1} = \begin{pmatrix} 1 & -1/2 \\ -1 & 1 \end{pmatrix} \qquad (26)$$

▶ You can check that AA⁻¹ = I. The vector x is then

$$\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = A^{-1} \begin{pmatrix} 3 \\ 4 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix} \qquad (27)$$

▶ This simple system could be solved by systematic elimination. However, when the dimension of x is greater than 2, using matrices is the easier approach.

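▶ The same solution in R; a sketch where solve(A) returns the inverse and solve(A, b) solves the system directly:

  A <- rbind(c(2, 1), c(2, 2))
  b <- c(3, 4)
  solve(A)         # the inverse A^(-1) from (26)
  solve(A) %*% b   # x = A^(-1) b, equal to (1, 1)
  solve(A, b)      # equivalent, and numerically preferable
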
Covariance Matrix

▶ The covariance matrix is a square matrix giving the covariance between each pair of elements of a given random vector.

▶ Intuitively, the covariance matrix generalizes the notion of variance to multiple dimensions.

▶ Let X = (x_1, ..., x_n)′ be a column vector of random variables with given means and variances. The covariance matrix is then defined as

$$\Sigma = \operatorname{var}[X] = E[(X - E[X])(X - E[X])'] \qquad (28)$$

▶ Note that this is the generalization of the variance definition to the matrix context.

▶ Note that the covariance matrix is an n × n square matrix. Why? What is the dimension of (X − E[X])?

Covariance Matrix

▶ The elements of the covariance matrix Σ are

$$\Sigma = \begin{pmatrix} \sigma_1^2 & \sigma_{12} & \cdots & \sigma_{1n} \\ \sigma_{21} & \sigma_2^2 & \cdots & \sigma_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{n1} & \sigma_{n2} & \cdots & \sigma_n^2 \end{pmatrix} \qquad (29)$$

▶ The main diagonal contains the variances. Elements outside the diagonal are covariances.

▶ The covariance matrix is symmetric: Σ = Σ′.

▶ The covariance matrix must be invertible. What if it is not?

▶ The inverse of the covariance matrix is called the precision matrix.

Covariance Matrix

▶ The correlation matrix is defined as

$$R = \begin{pmatrix} 1 & \rho_{12} & \cdots & \rho_{1n} \\ \rho_{21} & 1 & \cdots & \rho_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \rho_{n1} & \rho_{n2} & \cdots & 1 \end{pmatrix} \qquad (30)$$

▶ The main diagonal contains the correlation of each variable with itself. Elements outside the diagonal are correlations.

▶ The correlation matrix is also symmetric.

Covariance Matrix
Main Properties

▶ Covariance matrices have several properties. We will discuss only the most important.

▶ Can any matrix be a covariance matrix? No.

▶ Covariance matrices are positive-semidefinite: this extends the idea that variances are positive. In an informal way, covariance matrices must also be "positive".

▶ Covariance matrices are positive definite when all the variances are positive and all possible lower-dimensional submatrices have positive determinants.

▶ For example, the covariance matrix between the first and the second asset must have a positive determinant

$$\Sigma_{12} = \begin{pmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{21} & \sigma_2^2 \end{pmatrix} \qquad (31)$$

and likewise for the first, second and third assets, and so on.

Covariance Matrix
Main Properties: Dot Product

▶ Let b be an n × 1 vector of constants. Then the variance of b′X is

$$\operatorname{var}[b'X] = b' \Sigma b \qquad (32)$$

▶ Note that this is a scalar, as b′X is a scalar.

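▶ A small R illustration of (32); a sketch with a hypothetical 2 × 2 covariance matrix Sigma and weight vector b:

  Sigma <- rbind(c(2.0, 0.5), c(0.5, 1.0))   # hypothetical covariance matrix
  b <- c(0.6, 0.4)                           # e.g. portfolio weights
  t(b) %*% Sigma %*% b                       # var[b'X], a 1 x 1 matrix (a scalar)
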
Covariance Matrix
Estimation of Covariance Matrices

▶ Given a sample consisting of t independent observations x_1, ..., x_t of an n-dimensional random vector x, the estimator is

$$\hat{\Sigma} = \frac{1}{t} \sum_{i=1}^{t} (x_i - \bar{x})' (x_i - \bar{x}) \qquad (33)$$

▶ Note that each observation is specified as a row vector, and the observations are stacked together as a t × n matrix.

▶ Note that n is the number of series and t the time length.

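▶ A sketch in R of estimator (33) for a small simulated data set. Note that base R's cov() divides by t − 1 rather than t, so the two differ by a factor (t − 1)/t:

  set.seed(1)
  t_len <- 100
  X <- cbind(rnorm(t_len), rnorm(t_len))        # t x n data matrix (n = 2 series)
  Xc <- scale(X, center = TRUE, scale = FALSE)  # subtract the column means
  t(Xc) %*% Xc / t_len                          # estimator (33)
  cov(X) * (t_len - 1) / t_len                  # the same matrix via cov()
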
Codes in R

▶ In practice, we do matrix algebra on a computer.

▶ I will briefly explain how to do simple matrix algebra in R.

▶ We first need to know how to create a matrix. There are several ways to do this.

▶ A simple way is to create vectors and stack them into a matrix.

▶ For example, the matrix

$$\begin{pmatrix} 1 & 4 \\ 0 & 2 \end{pmatrix} \qquad (34)$$

can be created by stacking the row vectors (1 4) and (0 2) or by stacking the column vectors (1 0)′ and (4 2)′.

Codes in R

▶ The matrix is then created as
  matrix_a <- rbind(c(1,4), c(0,2))
  This stacks the vectors (1 4) and (0 2) by rows into the matrix matrix_a.

▶ To build the same matrix by stacking the columns
  matrix_a <- cbind(c(1,0), c(4,2))

▶ Basic calculations like the sum of matrices and multiplication by a scalar are trivial, for example
  matrix_b <- matrix_a + matrix_a

Codes in R

▶ For matrix multiplication in R you need to be aware of the syntax. For example
  x %*% y
  This command performs the matrix multiplication of the matrices x and y.

▶ For example, let

$$x = \begin{pmatrix} 1 & 4 \\ 0 & 2 \end{pmatrix} \qquad y = \begin{pmatrix} 1 & 2 \\ 2 & 2 \end{pmatrix} \qquad (35)$$

▶ To create the matrices and compute the product xy
  x <- cbind(c(1,0), c(4,2))
  y <- cbind(c(1,2), c(2,2))
  xy <- x %*% y
  which yields

$$xy = \begin{pmatrix} 9 & 10 \\ 4 & 4 \end{pmatrix} \qquad (36)$$

Codes in R

▶ Be aware that the following command is NOT matrix multiplication
  x * y
  If x and y are matrices, this is element-wise multiplication. We are usually not interested in element-wise multiplication.

▶ Determinants are easy to obtain. For example
  detxy <- det(xy)

▶ The transpose is found, for example, as
  tx <- t(x)

▶ The inverse is found, for example, as
  invx <- inv(x)
  You first need to install and load the package matlib. (Base R's solve(x) also returns the inverse.)

Codes in R

▶ Suppose you have two column vectors a and b. The following command gets their estimated covariance (a scalar)
  covab <- cov(a,b)

▶ However, if you want the full covariance matrix, you first need to stack the vectors together, where rows are observations and columns are variables.

▶ For example, the following code stacks two column vectors a and b together in the matrix c and gets their estimated covariance matrix
  c <- cbind(a,b)
  covc <- cov(c)
  This results in a 2 × 2 covariance matrix. The correlation matrix is obtained as
  corc <- cor(c)

Introduction to Econometrics
What is econometrics?

▶ Today, we start the econometrics block of the course. The main book is Wooldridge's Introductory Econometrics.

▶ Econometrics uses statistical methods for estimating economic relationships and testing economic theories.

▶ Economic relationships and theories are understood in a broad sense. Recall that finance is a subset of economics.

Introduction to Econometrics
What is econometrics?

▶ Econometrics vs Statistics: Econometrics is different from statistics. Econometrics is a discipline that uses statistics to develop methods intended for economic data. It is a discipline of economics, not statistics.

▶ An important characteristic of econometric models is that they take into account that economic data is observational data, not experimental data.

▶ Econometrics vs Machine Learning: Econometrics is interested in causal relations and the interpretation of parameters. Machine learning is mostly concerned with prediction.

Introduction to Econometrics
Types of economic data: Cross-Section

Cross-sectional data: A cross-sectional data set consists of a sample of individuals, households, firms, assets, countries, or a variety of other units, taken at a given point in time.

Introduction to Econometrics
Types of economic data: Time Series

A time series data set consists of observations on a variable or several variables over time. Time series are the most important data type for finance.

In time series, the observations are dependent over time. Thus, they are not a simple extension of the cross-section case.

Introduction to Econometrics
Types of economic data: Panel Data

Panel data: We observe cross sections of the same individuals at different points in time.

Simple Regression Model
Introduction

▶ We start analyzing econometric methods for cross-section data.

▶ Models for time-series data build upon cross-section analysis and introduce several new techniques.

▶ We start with simple regression analysis in the cross-section context.

▶ A regression model studies the relationship between two or more variables.

▶ Simple regression analysis can be used to study the relationship between two variables.

Simple Regression Model
Definitions

▶ Simple regression model: studies the relationship between only two variables, e.g., x and y.

▶ Let's write the most basic simple linear regression model,

$$y = \beta_0 + \beta_1 x + u \qquad (37)$$

▶ This equation is assumed to be the Data Generating Process (DGP) for y. We assume that the variable y is created by this mechanism in the real world.

▶ y is called the dependent variable, explained variable or regressand.

▶ x is the independent variable, explanatory variable, or regressor.

▶ u is called the error term or disturbance in the relationship, and captures the effects of all unobserved variables on y.

▶ The expression β0 + β1 x is called the regression function.

Simple Regression Model
Definitions

▶ u captures the difference between the regression function and the observed y.

Simple Regression Model
Definitions

▶ It is important to understand that, so far, the regression model is not a statistical model. We need to make assumptions!

▶ Note that equation (37) is an identity, so any value of (β0, β1) will work, as the error term captures the difference,

$$u = y - \beta_0 - \beta_1 x \qquad (38)$$

▶ This is a problem, as we assume that β0 and β1 are unknown, unique parameters of the real world.

▶ Identification: the parameters β0 and β1 are unique. This means that only one value of (β0, β1) must satisfy the regression model.

▶ This means that we have to restrict the error term somehow. Firstly, we assume that the error term is a random variable with some (unspecified) probability density function.

Simple Regression Model
Definitions

▶ When we set up our models with u as a random variable, what we are really doing is using the mathematical concept of randomness to model our ignorance of the details of economic mechanisms.

▶ However, only assuming that u is a random variable is not enough.

▶ A good assumption is that all other variables (other than x) have on average zero impact on y. Why is this a good assumption? We can write this condition as

$$E(u) = 0 \qquad (39)$$

▶ Thus, the expected value of the shocks is 0. On average, u does not explain y.

▶ It is possible to show that this assumption on the mechanism of the real world is able to identify (i.e. make unique) β0. What about β1?

Simple Regression Model
Definitions

▶ As u is a random variable, y is also a random variable.

▶ In finance, x is also a random variable, as it is observed data.

▶ We now turn to the crucial assumption regarding how u and x are related.

▶ Additionally, for the model to make sense, u must be uncorrelated with x. Then we have another assumption:

$$E(u|x) = E(u) = 0 \qquad (40)$$

▶ This conditional expectation says that the mean of the error term u is not affected by the value of x.

▶ It is possible to show that this assumption implies that x and u are uncorrelated random variables, i.e. E(ux) = cov(u, x) = 0, and we are able to identify β1.

Simple Regression Model
Definitions

▶ Let's take the conditional expectation of the regression model equation (37),

$$E[y|x] = \beta_0 + \beta_1 x \qquad (41)$$

as E[β1 x | x] = β1 x.

▶ Hence, the regression function models the conditional expectation of y given x.

▶ Now we can say that β1 is the expected change in y when x changes by one unit.

▶ β0 + β1 x can be understood as the systematic (or predictable) part of y, and u is the random or unsystematic part.

▶ The observed value of y has some distribution with β0 + β1 x as its mean.

Simple Regression Model
Definitions

The values of y may be dispersed, but the regression function gives the conditional expectation of y for every value of x.

Simple Regression Analysis
Matrix Form

▶ Why matrix form? Because it is easier to show results.

▶ Let y be an N × 1 vector,

$$y = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{pmatrix} \qquad (42)$$

Then X can be written as an N × 2 matrix,

$$X = \begin{pmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_N \end{pmatrix} \qquad (43)$$

Why a column of ones?

▶ Let β be the 2 × 1 vector of parameters, i.e.

$$\beta = \begin{pmatrix} \beta_0 \\ \beta_1 \end{pmatrix} \qquad (44)$$

Simple Regression Analysis
Matrix Form

▶ Let u be an N × 1 vector,

$$u = \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_N \end{pmatrix} \qquad (45)$$

▶ Then the regression model is represented as

$$\underset{(N \times 1)}{y} = \underset{(N \times 2)}{X} \, \underset{(2 \times 1)}{\beta} + \underset{(N \times 1)}{u} \qquad (46)$$

▶ For example, the first row of Xβ is

$$\beta_0 + \beta_1 x_1 \qquad (47)$$

which is the regression function for the first observation.

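▶ A sketch of how these objects can be built in R for a small simulated sample (the names N, x, u, beta and the chosen parameter values are illustrative):

  set.seed(42)
  N <- 50
  x <- rnorm(N)           # regressor
  u <- rnorm(N)           # error term
  beta <- c(1, 0.5)       # (beta_0, beta_1), chosen only for the simulation
  X <- cbind(1, x)        # N x 2 matrix with a column of ones for the intercept
  y <- X %*% beta + u     # the model in matrix form, y = X beta + u
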
Estimating a Simple Regression Model

▶ There are four steps when working with econometric models.

1. Theoretical DGP / Identification: formulation of the theoretical model and the respective statistical model. Preliminary analysis (e.g. sample covariances) or testing.

2. Estimation: estimation of the statistical model using a sample.

3. Model validation / Diagnostics: inference on the DGP. Tests to validate the statistical model (t-test, F-test, White's test for heteroscedasticity, LM test for autocorrelation, specification tests, etc.).

4. Model selection: if relevant, competition between models, for example via information criteria, a forecasting study, etc.

Estimating a Simple Regression Model

▶ The model we have discussed is the Data Generating Process (DGP), i.e. the mechanism in the real world which generates y. In other words, the real model.

▶ The parameters of the DGP are fixed and unique, but unknown. Hence, if we want to study the relationship between x and y, we need to get a sample.

▶ When we have a sample, we try to recover the β0 and β1 implied by this sample.

▶ This process is called parameter estimation. We use a sample to make inferences on the DGP.

▶ Note that an estimate is never the real parameters β0 and β1, which are always unknown.

Estimating a Simple Regression Model

▶ Parameter estimation methods: sets of principles on how to estimate the model parameters using observations. Three relevant principles:

▶ Ordinary Least Squares (OLS): minimization of the sum of the squared error terms implied by every single observation equation.

▶ The Method of Moments (MM): the parameters are estimated so that the sample moments equal the model moments.

▶ Maximum Likelihood (MLE): the parameters of an econometric model are found by maximizing a likelihood function, so that under the assumed statistical model the observed data is most probable.

▶ As a general rule, these estimators will give different estimates. However, in the case of the linear regression model they all lead to the same estimator.

▶ Here, I will show the formulas using the OLS criterion. Wooldridge derives the same estimator by the method of moments.

Estimating a Simple Regression Model
OLS Derivation

▶ Let's assume a sample (x_i, y_i) for i = 1, ..., N. Every observation is generated by

$$y_i = \beta_0 + \beta_1 x_i + u_i \qquad (48)$$

▶ OLS aims to minimize the distance between y_i and β0 + β1 x_i, i.e. to minimize u_i over the whole sample.

▶ A better way is to minimize the squares of u_i. Why? Thus,

$$\arg\min_{\beta_0, \beta_1} \sum_{i=1}^{N} (y_i - \beta_0 - \beta_1 x_i)^2 \qquad (49)$$

▶ The OLS estimator of β0 and β1 minimizes the OLS objective function (49).

▶ How to find the OLS estimators: apply the rules of optimization. Solve the first-order conditions: set the partial derivatives equal to 0.

Estimating a Simple Regression Model
OLS Derivation

▶ Let's write the OLS objective function in matrix form. Why? Because it is much easier to solve and find β̂.

▶ Recall that OLS aims to minimize

$$\arg\min_{\beta_0, \beta_1} \sum_{i=1}^{N} (y_i - \beta_0 - \beta_1 x_i)^2 \qquad (50)$$

▶ Let's write (50) in matrix form. Note that (50) is just the dot product of u with itself,

$$\arg\min_{\beta_0, \beta_1} u'u \qquad (51)$$

where u = (Y − Xβ). Hence

$$\min_{\beta_0, \beta_1} (Y - X\beta)'(Y - X\beta) \qquad (52)$$

A row vector times a column vector is a scalar, and each term of the sum is a square of u_i.

▶ All we have to do is take the derivative of (52) and set it equal to 0.

Estimating a Simple Regression Model
OLS Derivation

▶ I will briefly show this math.

▶ First expand the matrix product

$$\arg\min_{\beta_0, \beta_1} \; y'y - \beta'X'y - y'X\beta + \beta'X'X\beta \qquad (53)$$

as β′X′ = (Xβ)′.

▶ Now note that β′X′y = y′Xβ, as one is the transpose of the other and y′Xβ is a scalar. Hence

$$\arg\min_{\beta_0, \beta_1} \; y'y - 2\beta'X'y + \beta'X'X\beta \qquad (54)$$

▶ Proceed to take the derivative and set it equal to 0, using d(β′X′Xβ)/dβ = 2X′Xβ:

$$-2X'y + 2X'X\hat{\beta} = 0 \qquad (55)$$
$$-X'y + X'X\hat{\beta} = 0 \qquad (56)$$
$$X'X\hat{\beta} = X'y \qquad (57)$$

▶ Now we have β̂ implicitly. We just have to solve (57), which is a linear system of equations.

Estimating a Simple Regression Model
OLS Derivation

▶ To solve the system, pre-multiply each side of (57) by (X′X)⁻¹

$$(X'X)^{-1} X'X \hat{\beta} = (X'X)^{-1} X'y \qquad (58)$$

which yields

$$\hat{\beta} = (X'X)^{-1} X'y \qquad (59)$$

where β̂ is 2 × 1, X is N × 2 and y is N × 1.

▶ This is the famous OLS formula. It is important to note that it works for any linear regression model, not only for the simple linear case.

▶ For example, if we have k regressors, X is N × k and

$$\hat{\beta} = (X'X)^{-1} X'y \qquad (60)$$

is k × 1.

▶ Note that the OLS estimator β̂ is not the real β. The real β is never fully known.

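▶ Formula (59) maps directly into R. A sketch continuing the simulated sample above, compared with R's built-in lm():

  beta_hat <- solve(t(X) %*% X) %*% t(X) %*% y   # (X'X)^(-1) X'y
  beta_hat
  coef(lm(y ~ x))    # the same intercept and slope estimates
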
Estimating a Simple Regression Model
OLS Derivation

▶ In the case of the simple linear regression, it is possible to show that (59) can be rewritten as

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{N} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{N} (x_i - \bar{x})^2} \qquad (61)$$

$$\hat{\beta}_1 = \frac{\widehat{\operatorname{cov}}(x, y)}{\widehat{\operatorname{var}}(x)} \qquad (62)$$

This is the OLS estimator for β1 in the context of the simple linear regression. In words, the sample covariance between x and y divided by the sample variance of x.

▶ The estimator β̂0 can be written as

$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x} \qquad (63)$$

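▶ Expressions (62) and (63) can be checked with the sample functions cov() and var(); a sketch using the same simulated x and y:

  y <- drop(y)                             # treat y as a plain vector
  b1_hat <- cov(x, y) / var(x)             # slope: sample covariance over sample variance
  b0_hat <- mean(y) - b1_hat * mean(x)     # intercept
  c(b0_hat, b1_hat)                        # matches beta_hat from the matrix formula
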
Estimating a Simple Regression Model
OLS Estimators

▶ These estimators are called the Ordinary Least Squares (OLS) estimators.

▶ We define the fitted values as

$$\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i \qquad (64)$$
$$\hat{Y} = X\hat{\beta} \quad \text{(matrix form)} \qquad (65)$$

This is the value we predict for y when x = x_i.

▶ Residual: the difference between y_i and the fitted ŷ_i:

$$\hat{u}_i = y_i - \hat{y}_i = y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i \qquad (66)$$
$$\hat{u} = Y - \hat{Y} = Y - X\hat{\beta} \quad \text{(matrix form)} \qquad (67)$$

▶ Residuals are not the error terms.

▶ Regression line:

$$\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x \qquad (68)$$

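▶ In R, the fitted values and residuals follow directly from β̂; a sketch continuing the example, where fitted() and resid() on the lm object give the same quantities:

  y_hat <- X %*% beta_hat    # fitted values, equation (65)
  u_hat <- y - y_hat         # residuals, equation (67)
  fit <- lm(y ~ x)
  all.equal(as.vector(y_hat), unname(fitted(fit)))   # TRUE
  all.equal(as.vector(u_hat), unname(resid(fit)))    # TRUE
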
Estimating a Simple Regression Model
OLS Estimators

▶ The residual is the difference between the observed y_i and the fitted ŷ_i.

