
FRM Part 1

Book 2 – Quantitative Analysis

MULTIVARIATE RANDOM VARIABLES


Learning Objectives
After completing this reading you should be able to:
• Explain how a probability matrix can be used to express a probability mass function (PMF).
• Compute the marginal and conditional distributions of a discrete bivariate random variable.
• Explain how the expectation of a function is computed for a bivariate discrete random variable.
• Define covariance and explain what it measures.
• Explain the relationship between the covariance and correlation of two random variables and how these are related to the independence of the two variables.
• Explain the effects of applying linear transformations on the covariance and correlation between two random variables.
• Compute the variance of a weighted sum of two random variables.
• Compute the conditional expectation of a component of a bivariate random variable.
• Describe the features of an iid sequence of random variables.
• Explain how the iid property is helpful in computing the mean and variance of a sum of iid random variables.
Multivariate Random Variables
• Multivariate random variables accommodate the dependence between two or more random variables.
• Multivariate analysis is motivated by the fact that, in most real-life scenarios, we are interested in several random variables simultaneously.
• Examples
  o In a medical diagnosis context, the results of several tests may be significant.
  o In an investment context, we may wish to establish the relationship between return and some other variable(s) such as the state of the economy or a security's rating.
• The concepts of multivariate random variable analysis (such as expectations and moments) are analogous to those of univariate random variable analysis.
Multivariate PMF and CDF
• The PMF gives the probability of each joint realization as a function of x_1 and x_2. The PMF has the following properties:
  I. All probabilities must be greater than or equal to zero:
     f_{X_1,X_2}(x_1, x_2) \ge 0
  II. The probabilities must sum to 1:
     \sum_{x_1} \sum_{x_2} f_{X_1,X_2}(x_1, x_2) = 1
• The CDF of a bivariate discrete random variable returns the total probability that each component is less than or equal to a given value:
  F_{X_1,X_2}(x_1, x_2) = P(X_1 \le x_1, X_2 \le x_2)
Illustration >>
Multivariate PMF and CDF
[Figure: a 3D representation of the distribution of a bivariate discrete random variable, plotting f_{X_1,X_2}(x_1, x_2) = P(X_1 = x_1, X_2 = x_2) over the (X_1, X_2) plane.]

Probability Matrix (1/2)
• A probability matrix is a tabular representation of the PMF.
Example
• In financial markets, market sentiment plays a role in determining the return earned on a security. Suppose the return earned on a bond is in part determined by the rating given to the bond by analysts.
• For simplicity, we are going to assume the following:
  o There are only three possible returns: −10%, 0%, or 10%
  o Analyst ratings (sentiments) can be positive, neutral, or negative
• We can represent this in a probability matrix as follows:

                           Bond Return (X1)
                           -10%    0%    10%
  Analyst   Positive  +1     5%    5%    30%
  (X2)      Neutral    0    10%   10%    15%
            Negative  -1    20%    5%     0%
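• The same matrix can be encoded directly; here is a minimal sketch in Python (the variable names are illustrative, not from the reading):

    import numpy as np

    # Joint PMF: rows = analyst rating X2 (+1, 0, -1), columns = bond return X1 (-10%, 0%, 10%)
    returns = np.array([-0.10, 0.00, 0.10])   # support of X1
    ratings = np.array([1, 0, -1])            # support of X2
    joint_pmf = np.array([
        [0.05, 0.05, 0.30],   # positive rating
        [0.10, 0.10, 0.15],   # neutral rating
        [0.20, 0.05, 0.00],   # negative rating
    ])

    assert np.all(joint_pmf >= 0)             # property I: all probabilities >= 0
    assert np.isclose(joint_pmf.sum(), 1.0)   # property II: probabilities sum to 1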
Probability Matrix (2/2)
                           Bond Return (X1)
                           -10%    0%    10%
  Analyst   Positive  +1     5%    5%    30%
  (X2)      Neutral    0    10%   10%    15%
            Negative  -1    20%    5%     0%

• Each cell represents the probability of a joint outcome.
  o For example, there is a 5% joint probability that the bond returns −10% and analysts hold positive views about the bond and its issuer.
Marginal Distribution of a Discrete Bivariate RV
• The marginal distribution gives the distribution of a single variable in a joint distribution.
• In the case of a bivariate distribution, the marginal PMF of X1 is computed by summing the joint probabilities across all values in the support of X2:
  f_{X_1}(x_1) = \sum_{x_2} f_{X_1,X_2}(x_1, x_2)

                           Bond Return (X1)
                           -10%    0%    10%
  Analyst   Positive  +1     5%    5%    30%
  (X2)      Neutral    0    10%   10%    15%
            Negative  -1    20%    5%     0%

  Return (X1)     -10%    0%    10%
  P(X1 = x1)       35%   20%    45%
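• Continuing the sketch above, the marginals fall out of the joint matrix by summing along one axis:

    # Marginal PMFs: sum the joint PMF over the other variable's support
    pmf_x1 = joint_pmf.sum(axis=0)   # over ratings -> [0.35, 0.20, 0.45]
    pmf_x2 = joint_pmf.sum(axis=1)   # over returns -> [0.40, 0.35, 0.25]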
Independence of Bivariate Random Variables
• If two (bivariate) random variables are independent, then the bivariate distribution must be the product of their marginal distributions.
• For example, if we assume that the two variables in our example – return and ratings – are independent, we can calculate the joint distribution by multiplying their marginal distributions.
  o But are they really independent?

                           Bond Return (X1)
                           -10%    0%    10%   f_{X2}(x2)
  Analyst   Positive  +1     5%    5%    30%      40%
  (X2)      Neutral    0    10%   10%    15%      35%
            Negative  -1    20%    5%     0%      25%
  f_{X1}(x1)                35%   20%    45%

• It is clear that the two variables are not independent, because multiplying their marginal PMFs does not lead us back to the joint PMF (e.g., 5% ≠ 35% × 40% = 14%).
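• The same check in the running sketch: under independence the joint PMF would be the outer product of the marginals, and it visibly is not:

    # Under independence the joint PMF equals the outer product of the marginals
    implied_joint = np.outer(pmf_x2, pmf_x1)
    print(np.allclose(joint_pmf, implied_joint))   # False: e.g. 0.05 != 0.40 * 0.35 = 0.14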
Conditional Distributions
• A conditional distribution describes the probability of an outcome of one random variable conditioned on a particular value (or set of values) of the other random variable.
• The conditional distribution of X1 given X2 is defined as:
  f_{X_1|X_2}(x_1 | X_2 = x_2) = \frac{f_{X_1,X_2}(x_1, x_2)}{f_{X_2}(x_2)}
• That is, the conditional distribution is the joint distribution divided by the marginal distribution of the conditioning variable.
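• In the sketch, conditioning on a positive rating amounts to taking the corresponding row of the joint PMF and rescaling it by that row's marginal probability:

    # Conditional PMF of X1 given a positive rating (X2 = +1): joint row / marginal
    cond_x1_given_pos = joint_pmf[0] / pmf_x2[0]   # [0.125, 0.125, 0.75], sums to 1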
Expectations
• The expected value of a function g(X1, X2) of the random variables (X1, X2) is found by summing, over all joint outcomes, the product:
  Value × Probability of assuming that value
  E[g(X_1, X_2)] = \sum_{x_1} \sum_{x_2} g(x_1, x_2) \, f_{X_1,X_2}(x_1, x_2)

                           Bond Return (X1)
                           -10%    0%    10%
  Analyst   Positive  +1     5%    5%    30%
  (X2)      Neutral    0    10%   10%    15%
            Negative  -1    20%    5%     0%

  Return (X1)     -10%    0%    10%
  P(X1 = x1)       35%   20%    45%

• Expected bond return = (35% × −10%) + (20% × 0%) + (45% × 10%) = 1%
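• The same number from the sketch, taking g(X1, X2) = X1 so that only the marginal of X1 matters:

    # Expected bond return E[X1] using the marginal PMF of X1
    e_x1 = (pmf_x1 * returns).sum()   # 0.35*(-0.10) + 0.20*0.00 + 0.45*0.10 = 0.01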


Conditional Expectation
• A conditional expectation is simply a mean calculated from a conditional distribution, i.e., given that some condition on the other variable holds.
Example
• What is the expected bond return conditional on a positive analyst rating?

                           Bond Return (X1)
                           -10%    0%    10%   f_{X2}(x2)
  Analyst   Positive  +1     5%    5%    30%      40%
  (X2)      Neutral    0    10%   10%    15%      35%
            Negative  -1    20%    5%     0%      25%

• The conditional expectation of the return is determined as follows:
  o E[X1 | X2 = +1] = (−10% × 5%/40%) + (0% × 5%/40%) + (10% × 30%/40%)
  o = 6.25%
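• Or, reusing the conditional PMF computed in the sketch:

    # E[X1 | X2 = +1]: expectation under the conditional PMF
    cond_exp = (cond_x1_given_pos * returns).sum()   # -0.10*0.125 + 0.10*0.75 = 0.0625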
Covariance
• Covariance is a measure of the degree of co-movement between two random variables.
• For instance, we could be interested in the degree of co-movement between the rate of interest and the rate of inflation:
  o X1 = interest rate
  o X2 = inflation
  Cov[X_1, X_2] = E[(X_1 - E[X_1])(X_2 - E[X_2])]
• The covariance implied by a joint probability model is the probability-weighted average of the cross-products of the random variables' deviations from their expected values across all possible outcomes.
• Covariance can be:
  o Positive: indicating a positive relationship,
  o Negative: indicating an inverse relationship, or
  o Zero: indicating the absence of a linear relationship between the variables
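• Applied to the bond example in the running sketch (the 0.0435 value follows from the example matrix, not from the reading):

    # Covariance of bond return (X1) and rating (X2) from the joint PMF
    e_x2 = (pmf_x2 * ratings).sum()                   # 0.40 - 0.25 = 0.15
    cross = np.outer(ratings - e_x2, returns - e_x1)  # cross-products of deviations
    cov_x1_x2 = (joint_pmf * cross).sum()             # 0.0435 > 0: returns co-move with ratings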
Correlation
• Correlation is the ratio of the covariance between two random variables to the product of their standard deviations:

  Corr(X_1, X_2) = \frac{Cov(X_1, X_2)}{\sigma_{X_1} \, \sigma_{X_2}}

Interpretation
• Increasingly positive correlation
  o Stronger positive linear relationship (up to +1, which indicates a perfect positive linear relationship).
• Increasingly negative correlation
  o Stronger negative (inverse) linear relationship (down to −1, which indicates a perfect inverse linear relationship).
• Zero correlation
  o No linear relationship.
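• Completing the running sketch (the ≈0.62 value follows from the example matrix, not from the reading):

    # Correlation: covariance rescaled by the two standard deviations
    var_x1 = (pmf_x1 * (returns - e_x1) ** 2).sum()   # 0.0079
    var_x2 = (pmf_x2 * (ratings - e_x2) ** 2).sum()   # 0.6275
    corr = cov_x1_x2 / np.sqrt(var_x1 * var_x2)       # ~0.62: positive, but far from perfect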
Linear Transformation
• A linear transformation creates a new variable from an old variable using the equation of a straight line:
  X1 → a + bX1
  X2 → c + dX2
  (old variable → new variable)
• When two variables are linearly transformed, the covariance between them is affected only by the scale constants (b and d), and in a multiplicative manner; the shift constants (a and c) have NO EFFECT. Thus,
  Cov(a + bX_1, c + dX_2) = bd \, Cov(X_1, X_2)
• A linear transformation does not change the magnitude of the correlation between two random variables X1 and X2 (the sign flips if b and d have opposite signs).
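• A quick empirical check of the covariance identity (the constants and sample below are arbitrary):

    # Cov(a + b*X1, c + d*X2) = b*d*Cov(X1, X2): the identity also holds for sample covariance
    rng = np.random.default_rng(42)
    x1 = rng.normal(size=100_000)
    x2 = 0.5 * x1 + rng.normal(size=100_000)   # correlated with x1 by construction
    a, b, c, d = 2.0, 3.0, -1.0, 0.5
    lhs = np.cov(a + b * x1, c + d * x2)[0, 1]
    rhs = b * d * np.cov(x1, x2)[0, 1]
    print(np.isclose(lhs, rhs))                # True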
Variance
Sum of Random Variables
• The variance of the sum of two random variables is given by:
  Var(X_1 + X_2) = Var(X_1) + Var(X_2) + 2 \, Cov(X_1, X_2)
• If the random variables are independent, then Cov(X_1, X_2) = 0 and thus:
  Var(X_1 + X_2) = Var(X_1) + Var(X_2)
Weighted Sum of Random Variables
• If a weight of a is attached to X1 and a weight of b to X2 (where a + b = 1), then:
  Var(aX_1 + bX_2) = a^2 \, Var(X_1) + b^2 \, Var(X_2) + 2ab \, Cov(X_1, X_2)
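• A minimal portfolio-style sketch of the weighted-sum formula (all moments below are assumed values for illustration):

    # Variance of a 60/40 weighted sum of two assets
    var_a, var_b, cov_ab = 0.04, 0.09, 0.012   # hypothetical variances and covariance
    w_a, w_b = 0.6, 0.4
    var_sum = w_a**2 * var_a + w_b**2 * var_b + 2 * w_a * w_b * cov_ab
    print(var_sum)   # 0.0144 + 0.0144 + 0.00576 = 0.03456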
Continuous Random Variables
• Continuous random variables make use of the same concepts and methodologies as discrete random variables.
• The main distinguishing factor is that instead of PMFs, continuous random variables use PDFs:
  P((x_1, x_2) \in A) = \iint_A f_{X_1,X_2}(x_1, x_2) \, dx_1 \, dx_2

[Figure: a 3D surface plot of a joint PDF f_{X_1,X_2}(x_1, x_2) over the (X_1, X_2) plane.]
Marginal PDF for Continuous Random Variables
• For a continuous bivariate random variable, the marginal PDF integrates one component out of the joint PDF:
  f_{X_2}(x_2) = \int_{-\infty}^{\infty} f_{X_1,X_2}(x_1, x_2) \, dx_1
  (integrating X1 out)

[Figure: the joint PDF surface f_{X_1,X_2}(x_1, x_2), with X1 integrated out to leave the marginal density of X2.]
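• A numeric sketch of integrating one component out, assuming a standard bivariate normal with correlation 0.5 (an arbitrary choice; the marginal should then be standard normal):

    import numpy as np
    from scipy import integrate, stats

    # Assumed joint PDF: bivariate normal, zero means, unit variances, rho = 0.5
    joint = stats.multivariate_normal(mean=[0, 0], cov=[[1, 0.5], [0.5, 1]])

    # Marginal of X2 at x2 = 0.3: integrate the joint PDF over x1
    x2 = 0.3
    marginal, _ = integrate.quad(lambda x1: joint.pdf([x1, x2]), -np.inf, np.inf)
    print(np.isclose(marginal, stats.norm.pdf(x2)))   # True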
Conditional PDF for Continuous Random Variables
• What is the density of X2 given a particular value of X1?
  f_{X_2|X_1}(x_2 | X_1 = x_1) = \frac{f_{X_1,X_2}(x_1, x_2)}{f_{X_1}(x_1)}
  (e.g., X2 could be interest rates; X1 could be a huge loss)

[Figure: a slice of the joint PDF surface f_{X_1,X_2}(x_1, x_2) at X1 = x1, rescaled by f_{X_1}(x_1) to give the conditional density of X2.]
What are iid variables?
• A collection of random variables is independent and identically distributed (iid) if each random variable has the same probability distribution as the others and all are mutually independent.
Example
• Consider successive throws of a fair coin:
  o The coin has no memory, so all the throws are "independent."
  o The probability of heads vs. tails on every throw is 50:50; the coin is and stays fair, so the distribution from which every throw is drawn is and stays the same, and thus each outcome is "identically distributed."
Mean and Variance of iid variables
• iid variables share the same moments, which makes them easy to work with.
• For example, variables drawn independently from the same normal distribution are iid:
  X_i \overset{iid}{\sim} N(\mu, \sigma^2)
• The expected value of a sum of n iid random variables is just n times the common mean of the random variables:
  E\left[\sum_{i=1}^{n} X_i\right] = \sum_{i=1}^{n} E[X_i] = \sum_{i=1}^{n} \mu = n\mu
  o where E[X_i] = \mu
• Because the covariance between independent variables is zero, the variance of a sum of n iid random variables is just n times the variance of one variable:
  Var\left[\sum_{i=1}^{n} X_i\right] = n\sigma^2
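• A simulation check of both results (mu, sigma, and n are arbitrary illustrative choices):

    import numpy as np

    # Sum n iid normals many times; compare sample moments to n*mu and n*sigma^2
    rng = np.random.default_rng(0)
    mu, sigma, n = 0.02, 0.1, 12
    sums = rng.normal(mu, sigma, size=(200_000, n)).sum(axis=1)
    print(sums.mean())   # close to n * mu      = 0.24
    print(sums.var())    # close to n * sigma^2 = 0.12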
Book 2 – Quantitative Analysis

MULTIVARIATE RANDOM VARIABLES

Learning Objectives Recap

• Explain how a probability matrix can be used to express a probability mass function (PMF).
• Compute the marginal and conditional distributions of a discrete bivariate random variable.
• Explain how the expectation of a function is computed for a bivariate discrete random variable.
• Define covariance and explain what it measures.
• Explain the relationship between the covariance and correlation of two random variables and how these are related to the independence of the two variables.
• Explain the effects of applying linear transformations on the covariance and correlation between two random variables.
• Compute the variance of a weighted sum of two random variables.
• Compute the conditional expectation of a component of a bivariate random variable.
• Describe the features of an iid sequence of random variables.
• Explain how the iid property is helpful in computing the mean and variance of a sum of iid random variables.
