Introduction to Probability Part II

Financial Engineering & Risk Management

Review of Multivariate Distributions

M. Haugh G. Iyengar
Department of Industrial Engineering and Operations Research
Columbia University
Multivariate Distributions I
Let X = (X_1, \ldots, X_n)^\top be an n-dimensional vector of random variables.

Definition. For all x = (x_1, \ldots, x_n) \in \mathbb{R}^n, the joint cumulative distribution function (CDF) of X satisfies

F_X(x) = F_X(x_1, \ldots, x_n) = P(X_1 \le x_1, \ldots, X_n \le x_n).

Definition. For a fixed i, the marginal CDF of X_i satisfies

F_{X_i}(x_i) = F_X(\infty, \ldots, \infty, x_i, \infty, \ldots, \infty).

It is straightforward to generalize the previous definition to joint marginal distributions. For example, the joint marginal distribution of X_i and X_j satisfies

F_{ij}(x_i, x_j) = F_X(\infty, \ldots, \infty, x_i, \infty, \ldots, \infty, x_j, \infty, \ldots, \infty).

We also say that X has joint PDF f_X(\cdot, \ldots, \cdot) if

F_X(x_1, \ldots, x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_X(u_1, \ldots, u_n) \, du_1 \ldots du_n.
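As a quick numerical illustration (a minimal sketch of mine using SciPy, not part of the original slides; the correlation value is an arbitrary assumption, and 50.0 stands in for \infty), we can evaluate a bivariate normal joint CDF and recover a marginal CDF by pushing the other argument toward infinity:

```python
# Sketch: joint vs. marginal CDFs for an assumed bivariate normal.
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.5  # assumed correlation for this illustration
cov = np.array([[1.0, rho], [rho, 1.0]])
joint = multivariate_normal(mean=[0.0, 0.0], cov=cov)

# Joint CDF: F_X(x1, x2) = P(X1 <= x1, X2 <= x2).
print(joint.cdf([0.3, -0.2]))

# Marginal CDF of X1: send the X2 argument to (numerical) infinity.
x1 = 0.3
print(joint.cdf([x1, 50.0]))   # F_X(x1, "infinity")
print(norm.cdf(x1))            # agrees with the N(0,1) marginal CDF
```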

Multivariate Distributions II
Definition. If X_1 = (X_1, \ldots, X_k)^\top and X_2 = (X_{k+1}, \ldots, X_n)^\top is a partition of X then the conditional CDF of X_2 given X_1 satisfies

F_{X_2|X_1}(x_2 | x_1) = P(X_2 \le x_2 | X_1 = x_1).

If X has a PDF, f_X(\cdot), then the conditional PDF of X_2 given X_1 satisfies

f_{X_2|X_1}(x_2 | x_1) = \frac{f_X(x)}{f_{X_1}(x_1)} = \frac{f_{X_1|X_2}(x_1 | x_2) \, f_{X_2}(x_2)}{f_{X_1}(x_1)}    (1)

and the conditional CDF is then given by

F_{X_2|X_1}(x_2 | x_1) = \int_{-\infty}^{x_{k+1}} \cdots \int_{-\infty}^{x_n} \frac{f_X(x_1, \ldots, x_k, u_{k+1}, \ldots, u_n)}{f_{X_1}(x_1)} \, du_{k+1} \ldots du_n

where f_{X_1}(\cdot) is the joint marginal PDF of X_1, which is given by

f_{X_1}(x_1, \ldots, x_k) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_X(x_1, \ldots, x_k, u_{k+1}, \ldots, u_n) \, du_{k+1} \ldots du_n.
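To make (1) concrete, here is a small sketch (my own, assuming a bivariate normal joint density) that evaluates the conditional PDF on a grid and checks that it integrates to one:

```python
# Sketch: conditional PDF f_{X2|X1}(x2 | x1) = f_X(x1, x2) / f_{X1}(x1),
# evaluated numerically for an assumed bivariate normal density.
import numpy as np
from scipy.stats import multivariate_normal

rho = 0.7
joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

x1 = 0.5                               # conditioning value X1 = x1
x2_grid = np.linspace(-6.0, 6.0, 2001)
dx = x2_grid[1] - x2_grid[0]

# Joint density f_X(x1, x2) along the slice X1 = x1.
joint_vals = joint.pdf(np.column_stack([np.full_like(x2_grid, x1), x2_grid]))

# Marginal f_{X1}(x1) by numerically integrating out x2.
f_x1 = joint_vals.sum() * dx

cond_pdf = joint_vals / f_x1           # equation (1)
print(cond_pdf.sum() * dx)             # ~= 1.0: a genuine density in x2
```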

Independence

Definition. We say the collection X is independent if the joint CDF can be factored into the product of the marginal CDFs, so that

F_X(x_1, \ldots, x_n) = F_{X_1}(x_1) \cdots F_{X_n}(x_n).

If X has a PDF, f_X(\cdot), then independence implies that the PDF also factorizes into the product of the marginal PDFs, so that

f_X(x) = f_{X_1}(x_1) \cdots f_{X_n}(x_n).

We can also see from (1) that if X_1 and X_2 are independent then

f_{X_2|X_1}(x_2 | x_1) = \frac{f_X(x)}{f_{X_1}(x_1)} = \frac{f_{X_1}(x_1) \, f_{X_2}(x_2)}{f_{X_1}(x_1)} = f_{X_2}(x_2)

– so having information about X_1 tells you nothing about X_2.

Implications of Independence
Let X and Y be independent random variables. Then for any events, A and B,

P(X \in A, Y \in B) = P(X \in A) \, P(Y \in B).    (2)

More generally, for any functions f(\cdot) and g(\cdot), independence of X and Y implies

E[f(X) g(Y)] = E[f(X)] \, E[g(Y)].    (3)

In fact, (2) follows from (3) since

P(X \in A, Y \in B) = E[1_{\{X \in A\}} 1_{\{Y \in B\}}]
                    = E[1_{\{X \in A\}}] \, E[1_{\{Y \in B\}}]    by (3)
                    = P(X \in A) \, P(Y \in B).
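A Monte Carlo sanity check of (3) (again a sketch of mine; the distributions and the functions f and g below are arbitrary choices):

```python
# Sketch: check E[f(X)g(Y)] ~= E[f(X)] E[g(Y)] for independent X, Y.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)     # X ~ N(0,1)
y = rng.exponential(1.0, n)    # Y ~ Exp(1), drawn independently of X

f = lambda u: np.cos(u)        # arbitrary illustrative choices of f and g
g = lambda u: u ** 2

print(np.mean(f(x) * g(y)))            # E[f(X) g(Y)]
print(np.mean(f(x)) * np.mean(g(y)))   # E[f(X)] E[g(Y)], close to the above
```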

Implications of Independence

More generally, if X_1, \ldots, X_n are independent random variables then

E[f_1(X_1) f_2(X_2) \cdots f_n(X_n)] = E[f_1(X_1)] \, E[f_2(X_2)] \cdots E[f_n(X_n)].

Random variables can also be conditionally independent. For example, we say X and Y are conditionally independent given Z if

E[f(X) g(Y) | Z] = E[f(X) | Z] \, E[g(Y) | Z]

– used in the (in)famous Gaussian copula model for pricing CDOs!

In particular, let D_i be the event that the i-th bond in a portfolio defaults.

It is not reasonable to assume that the D_i's are independent. Why?
But perhaps they are conditionally independent given Z, so that

P(D_1, \ldots, D_n | Z) = P(D_1 | Z) \cdots P(D_n | Z)

– and this is often easy to compute, as in the sketch below.
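For instance, in the standard one-factor Gaussian copula model (sketched here as an illustration; the parameter values are my assumptions, and the slides do not give this code), bond i defaults when \sqrt{\rho} Z + \sqrt{1-\rho} \epsilon_i falls below a threshold, and P(D_i | Z) is available in closed form:

```python
# Sketch: one-factor Gaussian copula. Conditional on a common factor Z,
# defaults are independent and P(D_i | Z) has a closed form.
# All parameter values here are assumptions for illustration.
import numpy as np
from scipy.stats import norm

p = 0.02      # assumed unconditional default probability of each bond
rho = 0.3     # assumed loading on the common factor Z

def cond_default_prob(z, p=p, rho=rho):
    """P(D_i | Z = z) when bond i defaults iff
    sqrt(rho)*Z + sqrt(1 - rho)*eps_i <= Phi^{-1}(p), eps_i ~ N(0,1) IID."""
    return norm.cdf((norm.ppf(p) - np.sqrt(rho) * z) / np.sqrt(1.0 - rho))

z = -1.0      # a "bad" state of the economy
n = 5
# Conditional independence: the joint conditional default probability factors.
print(cond_default_prob(z) ** n)   # P(D_1, ..., D_n | Z = z)
```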


The Mean Vector and Covariance Matrix

The mean vector of X is given by

E[X] := (E[X_1], \ldots, E[X_n])


and the covariance matrix of X satisfies

\Sigma := Cov(X) := E[(X - E[X])(X - E[X])^\top]

so that the (i, j)-th element of \Sigma is simply the covariance of X_i and X_j.

The covariance matrix is symmetric and its diagonal elements satisfy \Sigma_{i,i} \ge 0.

It is also positive semi-definite, so that x^\top \Sigma x \ge 0 for all x \in \mathbb{R}^n.

The correlation matrix, \rho(X), has (i, j)-th element \rho_{ij} := Corr(X_i, X_j)
– it is also symmetric, positive semi-definite, and has 1's along the diagonal.
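A short sketch (not from the slides; the mixing matrix A is an arbitrary assumption) estimating these quantities from simulated data with NumPy:

```python
# Sketch: sample mean vector, covariance matrix and correlation matrix.
import numpy as np

rng = np.random.default_rng(1)
# Simulate 100,000 draws of a 3-dimensional vector with assumed dependence.
A = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.3, 1.0]])
X = rng.standard_normal((100_000, 3)) @ A.T

print(X.mean(axis=0))                # estimate of the mean vector E[X]
print(np.cov(X, rowvar=False))       # estimate of Cov(X) = A A^T
print(np.corrcoef(X, rowvar=False))  # symmetric, with 1's on the diagonal
```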

Variances and Covariances

For any matrix A \in \mathbb{R}^{k \times n} and vector a \in \mathbb{R}^k we have

E[AX + a] = A \, E[X] + a    (4)
Cov(AX + a) = A \, Cov(X) \, A^\top.    (5)

Note that (5) implies

Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab \, Cov(X, Y).

If X and Y are independent, then Cov(X, Y) = 0

– but the converse is not true in general.
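A quick numerical check of (5), with an arbitrary A and a of my own choosing:

```python
# Sketch: verify Cov(AX + a) = A Cov(X) A^T on simulated data.
import numpy as np

rng = np.random.default_rng(2)
Sigma = np.array([[2.0, 0.6], [0.6, 1.0]])   # assumed covariance of X
X = rng.multivariate_normal([0.0, 0.0], Sigma, size=500_000)

A = np.array([[1.0, -2.0], [0.5, 3.0]])
a = np.array([10.0, -4.0])
Y = X @ A.T + a                    # Y = AX + a, applied row by row

print(np.cov(Y, rowvar=False))     # sample Cov(AX + a)
print(A @ Sigma @ A.T)             # theoretical A Cov(X) A^T, nearly equal
```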

Financial Engineering & Risk Management
The Multivariate Normal Distribution

M. Haugh G. Iyengar
Department of Industrial Engineering and Operations Research
Columbia University
The Multivariate Normal Distribution I

If the n-dimensional vector X is multivariate normal with mean vector \mu and covariance matrix \Sigma then we write

X \sim MN_n(\mu, \Sigma).

The PDF of X is given by

f_X(x) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \, e^{-\frac{1}{2}(x - \mu)^\top \Sigma^{-1} (x - \mu)}

where | \cdot | denotes the determinant.

The standard multivariate normal has \mu = 0 and \Sigma = I_n, the n \times n identity matrix
– in this case the X_i's are independent.
The moment generating function (MGF) of X satisfies

\phi_X(s) = E[e^{s^\top X}] = e^{s^\top \mu + \frac{1}{2} s^\top \Sigma s}.
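As a sketch (not from the slides; \mu, \Sigma and x are assumed values), we can check the density formula against SciPy's implementation:

```python
# Sketch: the MVN density formula vs. SciPy's implementation.
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
x = np.array([0.3, 0.7])

n = len(mu)
d = x - mu
pdf_manual = np.exp(-0.5 * d @ np.linalg.solve(Sigma, d)) / (
    (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma)))

print(pdf_manual)
print(multivariate_normal(mean=mu, cov=Sigma).pdf(x))  # same value
```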

The Multivariate Normal Distribution II

Recall our partition of X into X_1 = (X_1, \ldots, X_k)^\top and X_2 = (X_{k+1}, \ldots, X_n)^\top.

We can extend this notation naturally so that

\mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix} \quad and \quad \Sigma = \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix}

are the mean vector and covariance matrix of (X_1, X_2).

We then have the following results on the marginal and conditional distributions of X:

Marginal Distribution
The marginal distribution of a multivariate normal random vector is itself normal. In particular, X_i \sim MN(\mu_i, \Sigma_{ii}), for i = 1, 2.

The Bivariate Normal PDF

[Figure: plot of the bivariate normal PDF; this slide consists of a figure only.]
The Multivariate Normal Distribution III

Conditional Distribution
Assuming \Sigma is positive definite, the conditional distribution of a multivariate normal distribution is also a multivariate normal distribution. In particular,

X_2 | X_1 = x_1 \sim MN(\mu_{2.1}, \Sigma_{2.1})

where \mu_{2.1} = \mu_2 + \Sigma_{21} \Sigma_{11}^{-1} (x_1 - \mu_1) and \Sigma_{2.1} = \Sigma_{22} - \Sigma_{21} \Sigma_{11}^{-1} \Sigma_{12}.

Linear Combinations
A linear combination, AX + a, of a multivariate normal random vector, X, is normally distributed with mean vector A \, E[X] + a and covariance matrix A \, Cov(X) \, A^\top.
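A sketch (mine, with assumed one-dimensional blocks) computing the conditional moments \mu_{2.1} and \Sigma_{2.1}:

```python
# Sketch: conditional moments of X2 given X1 = x1 for a multivariate normal.
import numpy as np

mu1, mu2 = np.array([0.0]), np.array([1.0])
S11 = np.array([[1.0]])       # Sigma_11 (assumed values throughout)
S12 = np.array([[0.8]])       # Sigma_12
S21 = S12.T                   # Sigma_21
S22 = np.array([[2.0]])       # Sigma_22
x1 = np.array([1.5])

# mu_{2.1} = mu2 + S21 S11^{-1} (x1 - mu1)
mu_cond = mu2 + S21 @ np.linalg.solve(S11, x1 - mu1)
# Sigma_{2.1} = S22 - S21 S11^{-1} S12
S_cond = S22 - S21 @ np.linalg.solve(S11, S12)

print(mu_cond, S_cond)        # [2.2] and [[1.36]] for these assumed numbers
```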

Financial Engineering & Risk Management
Introduction to Martingales

M. Haugh G. Iyengar
Department of Industrial Engineering and Operations Research
Columbia University
Martingales

Definition. A random process, \{X_n : 0 \le n \le \infty\}, is a martingale with respect to the information filtration, \mathcal{F}_n, and probability distribution, P, if
1. E^P[|X_n|] < \infty for all n \ge 0
2. E^P[X_{n+m} | \mathcal{F}_n] = X_n for all n, m \ge 0.

Martingales are used to model fair games and have a rich history in the modeling
of gambling problems.

We define a submartingale by replacing condition #2 with

E^P[X_{n+m} | \mathcal{F}_n] \ge X_n for all n, m \ge 0.

And we define a supermartingale by replacing condition #2 with

E^P[X_{n+m} | \mathcal{F}_n] \le X_n for all n, m \ge 0.

A martingale is both a submartingale and a supermartingale.


Constructing a Martingale from a Random Walk
Let S_n := \sum_{i=1}^n X_i be a random walk where the X_i's are IID with mean \mu.

Let M_n := S_n - n\mu. Then M_n is a martingale because:


E_n[M_{n+m}] = E_n\left[\sum_{i=1}^{n+m} X_i - (n+m)\mu\right]
             = \sum_{i=1}^{n} X_i + E_n\left[\sum_{i=n+1}^{n+m} X_i\right] - (n+m)\mu
             = \sum_{i=1}^{n} X_i + m\mu - (n+m)\mu = M_n.
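A simulation sketch (my own, with assumed drift and horizons) illustrating this property: averaging M_{n+m} over many continuations of a fixed path recovers M_n.

```python
# Sketch: M_n = S_n - n*mu is a martingale. Fix one path up to time n and
# average M_{n+m} over many independent continuations; the average is ~ M_n.
import numpy as np

rng = np.random.default_rng(3)
mu, n, m = 0.3, 50, 20          # assumed drift and horizons

x = rng.normal(mu, 1.0, n)      # one realized path X_1, ..., X_n
M_n = x.sum() - n * mu

future = rng.normal(mu, 1.0, (100_000, m)).sum(axis=1)  # continuations
M_nm = (x.sum() + future) - (n + m) * mu
print(M_n, M_nm.mean())         # the two numbers are close
```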

A Martingale Betting Strategy

Let X_1, X_2, \ldots be IID random variables with

P(X_i = 1) = P(X_i = -1) = 1/2.
We can imagine X_i representing the result of a coin-flipping game:
Win $1 if the coin comes up heads
Lose $1 if the coin comes up tails
Consider now a doubling strategy where we keep doubling the bet until we eventually win. Once we win, we stop. Our initial bet is $1.

First note that the size of the bet on the n-th play is 2^{n-1}

– assuming we're still playing at time n.

Let W_n denote our total winnings after n coin tosses, with W_0 = 0.

Then W_n is a martingale!

A Martingale Betting Strategy

To see this, first note that W_n \in \{1, -2^n + 1\} for all n. Why?

1. Suppose we win for the first time on the n-th bet. Then

W_n = -(1 + 2 + \cdots + 2^{n-2}) + 2^{n-1}
    = -(2^{n-1} - 1) + 2^{n-1}
    = 1.

2. If we have not yet won after n bets then

W_n = -(1 + 2 + \cdots + 2^{n-1})
    = -2^n + 1.

To show W_n is a martingale we only need to show E[W_{n+1} | W_n] = W_n

– it then follows by iterated expectations that E[W_{n+m} | W_n] = W_n.

A Martingale Betting Strategy
There are two cases to consider:

1: W_n = 1: then P(W_{n+1} = 1 | W_n = 1) = 1, so

E[W_{n+1} | W_n = 1] = 1 = W_n.    (6)

2: W_n = -2^n + 1: we bet 2^n on the (n+1)-th toss, so W_{n+1} \in \{1, -2^{n+1} + 1\}.

It is clear that

P(W_{n+1} = 1 | W_n = -2^n + 1) = 1/2
P(W_{n+1} = -2^{n+1} + 1 | W_n = -2^n + 1) = 1/2

so that

E[W_{n+1} | W_n = -2^n + 1] = (1/2)(1) + (1/2)(-2^{n+1} + 1)
                            = -2^n + 1 = W_n.    (7)

From (6) and (7) we see that E[W_{n+1} | W_n] = W_n.
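A simulation sketch (not in the original slides) of the doubling strategy, confirming that W_n takes values in \{1, -2^n + 1\} and that E[W_n] = W_0 = 0:

```python
# Sketch: simulate the doubling strategy and inspect W_n.
import numpy as np

rng = np.random.default_rng(4)

def winnings(n_tosses, rng):
    """Total winnings after n_tosses: bet $1, double after each loss,
    stop betting after the first win."""
    w, bet, won = 0, 1, False
    for _ in range(n_tosses):
        if won:
            break
        if rng.random() < 0.5:   # heads: win the current bet
            w += bet
            won = True
        else:                    # tails: lose it and double the stake
            w -= bet
            bet *= 2
    return w

n = 5
samples = np.array([winnings(n, rng) for _ in range(200_000)])
print(sorted(set(samples.tolist())))   # [-31, 1], i.e. {-2^n + 1, 1}
print(samples.mean())                  # ~0 = W_0: the fair-game property
```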

Polya’s Urn
Consider an urn which contains red balls and green balls.
Initially there is just one green ball and one red ball in the urn.

At each time step a ball is chosen randomly from the urn:

1. If the ball is red, then it's returned to the urn with an additional red ball.
2. If the ball is green, then it's returned to the urn with an additional green ball.
Let X_n denote the number of red balls in the urn after n draws. Then

P(X_{n+1} = k + 1 | X_n = k) = \frac{k}{n + 2}
P(X_{n+1} = k | X_n = k) = \frac{n + 2 - k}{n + 2}.

Show that M_n := X_n/(n + 2) is a martingale.
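A simulation sketch (my own) consistent with this claim: the mean of M_n stays at M_0 = 1/2 as n grows.

```python
# Sketch: Polya's urn. M_n = X_n / (n + 2) should have constant mean 1/2.
import numpy as np

rng = np.random.default_rng(5)

def red_count_after(n_draws, rng):
    """Number of red balls X_n, starting from 1 red and 1 green ball."""
    red, total = 1, 2
    for _ in range(n_draws):
        if rng.random() < red / total:   # a red ball is drawn
            red += 1
        total += 1                       # one ball is added either way
    return red

for n in (1, 10, 100):
    m = np.array([red_count_after(n, rng) / (n + 2) for _ in range(100_000)])
    print(n, m.mean())                   # ~0.5 for every n
```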

(These martingale examples are taken from "Introduction to Stochastic Processes" (Chapman & Hall) by Gregory F. Lawler.)
