2.4 Multivariate Random Variables
The CDF of a bivariate discrete random variable returns the total probability
that each component is less than or equal to a given value:
$$F_{X_1,X_2}(x_1, x_2) = P(X_1 \le x_1,\; X_2 \le x_2)$$
Multivariate PMF and CDF
[Illustration: three-dimensional plot of the joint PMF $f_{X_1,X_2}(x_1, x_2) = P(X_1 = x_1, X_2 = x_2)$ over the $X_1$–$X_2$ plane.]
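As a minimal sketch of these definitions (the 3×3 table below is hypothetical, not from the source), the joint CDF at a point is just the sum of the joint PMF over all outcomes at or below that point:

```python
import numpy as np

# Hypothetical joint PMF of two discrete variables X1, X2 taking values {0, 1, 2}.
# Rows index x1, columns index x2; the entries sum to 1.
pmf = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])

def joint_cdf(pmf, i, j):
    """F(x1, x2) = P(X1 <= x1, X2 <= x2): sum of PMF cells at or below (x1, x2)."""
    return pmf[: i + 1, : j + 1].sum()

print(joint_cdf(pmf, 1, 1))   # P(X1 <= 1, X2 <= 1) = 0.10 + 0.05 + 0.10 + 0.20 = 0.45
```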
$$\text{Correlation}(X_1, X_2) = \frac{\text{Covariance}(X_1, X_2)}{\text{Standard deviation}(X_1) \times \text{Standard deviation}(X_2)}$$
Interpretation
Increasingly positive correlation
o Strong positive linear relationship (up to 1, which indicates a perfect
linear relationship).
Increasingly negative correlation
o Strong negative (inverse) linear relationship (down to −1, which indicates
a perfect inverse linear relationship).
Zero correlation
o No linear relationship.
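As an illustrative sketch (simulated data, not from the source), the sample covariance and correlation can be computed with NumPy; the correlation is just the covariance rescaled by the two standard deviations:

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=10_000)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=10_000)    # built to be positively related to x1

cov = np.cov(x1, x2)[0, 1]                        # sample covariance
corr = cov / (np.std(x1, ddof=1) * np.std(x2, ddof=1))
print(corr, np.corrcoef(x1, x2)[0, 1])            # both equal the correlation, roughly 0.6
```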
Linear Transformation
A linear transformation creates a new variable from an old variable using the
equation of a straight line:
$$X_1 \;\rightarrow\; a + bX_1, \qquad X_2 \;\rightarrow\; c + dX_2$$
(old variable → new variable)
When two variables are linearly transformed, the covariance between them
is affected only by the scale constants, and in a multiplicative manner; the
shift constants have NO EFFECT. Thus,
$$\text{Cov}(a + bX_1,\; c + dX_2) = bd\,\text{Cov}(X_1, X_2)$$
A linear transformation does not change the correlation between two
random variables $X_1$ and $X_2$ (in magnitude; if the scale constants $b$ and $d$ have opposite signs, the sign of the correlation is reversed).
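A quick numerical check of both properties is sketched below; the shift and scale constants a, b, c, d are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=100_000)
x2 = 0.5 * x1 + rng.normal(size=100_000)

a, b, c, d = 3.0, 2.0, -1.0, 4.0                  # arbitrary shift and scale constants
y1, y2 = a + b * x1, c + d * x2                   # linearly transformed variables

# Cov(a + b*X1, c + d*X2) = b*d*Cov(X1, X2): shifts drop out, scales multiply.
print(np.cov(y1, y2)[0, 1], b * d * np.cov(x1, x2)[0, 1])
# Correlation is unchanged (b and d are both positive here).
print(np.corrcoef(x1, x2)[0, 1], np.corrcoef(y1, y2)[0, 1])
```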
Variance
Sum of Random Variables
The variance of the sum of two random variables is given by:
$$\text{Var}(X_1 + X_2) = \text{Var}(X_1) + \text{Var}(X_2) + 2\,\text{Cov}(X_1, X_2)$$
If the random variables are independent, then $\text{Cov}(X_1, X_2) = 0$ and thus:
$$\text{Var}(X_1 + X_2) = \text{Var}(X_1) + \text{Var}(X_2)$$
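A Monte Carlo sketch of this identity, using simulated correlated series (not from the source):

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.normal(0.0, 2.0, size=200_000)
x2 = 0.3 * x1 + rng.normal(0.0, 1.0, size=200_000)   # correlated with x1

lhs = np.var(x1 + x2, ddof=1)
rhs = np.var(x1, ddof=1) + np.var(x2, ddof=1) + 2 * np.cov(x1, x2)[0, 1]
print(lhs, rhs)   # the two sides agree up to sampling noise
```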
Weighted Sum of Random Variables
If a weight of a is attached to variable $X_1$ and a weight of b is attached to $X_2$
(where $a + b = 1$), then:
$$\text{Var}(aX_1 + bX_2) = a^2\,\text{Var}(X_1) + b^2\,\text{Var}(X_2) + 2ab\,\text{Cov}(X_1, X_2)$$
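For example, with portfolio-style weights (all numbers below are hypothetical), the variance of the weighted sum follows directly from this formula:

```python
# Hypothetical variances of X1 and X2 and their covariance.
var_x1, var_x2, cov_12 = 0.04, 0.09, 0.012
a, b = 0.6, 0.4                                   # weights summing to 1

var_weighted = a**2 * var_x1 + b**2 * var_x2 + 2 * a * b * cov_12
print(var_weighted)                               # 0.0144 + 0.0144 + 0.00576 = 0.03456
```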
Continuous Random Variables
Continuous random variables make use of the same concepts and
methodologies as discrete random variables.
The main distinguishing factor is that instead of PMFs, continuous random
variables use PDFs.
$$P\big((X_1, X_2) \in A\big) = \iint_A f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2$$
[Illustration: surface plot of the joint PDF $f_{X_1,X_2}(x_1, x_2)$ over the $X_1$–$X_2$ plane.]
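As a sketch of this idea, the probability that a pair of variables with an assumed bivariate normal joint PDF (illustrative parameters, not from the source) falls in the square [0, 1] × [0, 1] can be found by numerically integrating the joint PDF over that region:

```python
from scipy.stats import multivariate_normal
from scipy.integrate import dblquad

# Bivariate normal joint PDF with correlation 0.5 (illustrative parameters).
joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])

# P(0 <= X1 <= 1, 0 <= X2 <= 1) = double integral of the joint PDF over the square.
prob, _ = dblquad(lambda x2, x1: joint.pdf([x1, x2]), 0.0, 1.0, 0.0, 1.0)
print(prob)
```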
Marginal PDF for Continuous Random Variables
For a continuous bivariate random variable, the marginal PDF integrates
one component out of the joint PDF.
$$f_{X_2}(x_2) = \int_{-\infty}^{\infty} f_{X_1,X_2}(x_1, x_2)\, dx_1 \qquad \text{(integrating } X_1 \text{ out)}$$
[Illustration: joint PDF $f_{X_1,X_2}(x_1, x_2)$, with $X_1$ integrated out to give the marginal PDF of $X_2$.]
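Below is a sketch of integrating one component out numerically, again assuming a bivariate normal joint PDF whose exact marginal is a standard normal, so the result can be checked:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm
from scipy.integrate import quad

joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])

def marginal_x2(x2):
    """f_X2(x2): integrate the joint PDF over x1 from -inf to +inf."""
    value, _ = quad(lambda x1: joint.pdf([x1, x2]), -np.inf, np.inf)
    return value

print(marginal_x2(0.7), norm.pdf(0.7))   # numerical marginal matches the known N(0, 1) density
```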
Conditional PDF for Continuous Random Variables
What is the density of $X_2$ given a particular value of $X_1$?
$$f_{X_2 \mid X_1}(x_2 \mid X_1 = x_1) = \frac{f_{X_1,X_2}(x_1, x_2)}{f_{X_1}(x_1)}$$
(e.g., $X_2$ could be interest rates; $X_1$ could be a huge loss)
[Illustration: joint PDF $f_{X_1,X_2}(x_1, x_2)$, sliced at a particular value of $X_1$ to form the conditional PDF of $X_2$.]
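A hedged illustration of this ratio definition, again assuming a bivariate normal with correlation 0.5, for which the conditional distribution of $X_2$ given $X_1 = x_1$ is known in closed form and can be used as a check:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.5
joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

def conditional_x2_given_x1(x2, x1):
    """f_{X2|X1}(x2 | X1 = x1) = f_{X1,X2}(x1, x2) / f_{X1}(x1)."""
    return joint.pdf([x1, x2]) / norm.pdf(x1)     # marginal of X1 is N(0, 1) here

# For this bivariate normal, X2 | X1 = x1 is N(rho*x1, 1 - rho^2); both values agree.
x1, x2 = 1.0, 0.2
print(conditional_x2_given_x1(x2, x1),
      norm.pdf(x2, loc=rho * x1, scale=np.sqrt(1 - rho**2)))
```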
What are i.i.d. variables?
A collection of random variables is independent and identically distributed
(iid) if each random variable has the same probability distribution as the
others and all are mutually independent.
Example
Consider successive throws of a fair coin:
o The coin has no memory, so all the throws are “independent.”
o The probability of heads vs. tails on every throw is 50:50, so the coin is,
and stays, fair; the distribution from which every throw is drawn stays the
same, and each outcome is therefore “identically distributed.”
Mean and Variance of i.i.d. variables
Because iid variables share the same moments, sums of them are easy to
manipulate.
For example, variables drawn independently from the same normal distribution are iid:
$$X_i \overset{iid}{\sim} N(\mu, \sigma^2)$$
The expected value of a sum of n iid random variables is just n times the
common mean of the random variables:
$$E\left[\sum_{i=1}^{n} X_i\right] = \sum_{i=1}^{n} E[X_i] = \sum_{i=1}^{n} \mu = n\mu$$
o where $E[X_i] = \mu$
The variance of a sum of n iid random variables is just n times the common
variance, because independence makes all the covariance terms vanish:
$$\text{Var}\left(\sum_{i=1}^{n} X_i\right) = n\sigma^2$$
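A simple simulation sketch of both results for sums of iid normal draws (the values of n, mu, and sigma are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n, mu, sigma = 12, 0.5, 2.0                       # illustrative parameters

# Simulate many sums of n iid N(mu, sigma^2) draws.
sums = rng.normal(mu, sigma, size=(100_000, n)).sum(axis=1)

print(sums.mean(), n * mu)                        # E[sum]   ~ n * mu      = 6.0
print(sums.var(ddof=1), n * sigma**2)             # Var[sum] ~ n * sigma^2 = 48.0
```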