Joint Distributions
Himadri Mukherjee
Department of Mathematics
BITS PILANI K K Birla Goa Campus, Goa
Definition
Let X, Y be two random variables defined on the same sample space σ of a random experiment. The joint probability mass function of these two random variables is
f(x, y) = P(X = x, Y = y).
Observation
Theorem
Let f(x, y) be the probability mass function of the joint distribution of discrete random variables X, Y on a sample space σ. Then:
• f(x, y) ≥ 0
• ∑_x ∑_y f(x, y) = 1, where the sum runs over all the possible values of X, Y.
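These two properties can be checked mechanically for any finite joint pmf. A minimal sketch in Python, using a small hypothetical table of values (not taken from the text):

```python
# Hypothetical joint pmf for two discrete random variables X and Y,
# stored as {(x, y): P(X = x, Y = y)}. The numbers are illustrative only.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Property 1: f(x, y) >= 0 for every pair (x, y).
assert all(p >= 0 for p in joint.values())

# Property 2: the double sum of f(x, y) over all values of X and Y is 1
# (compared with a tolerance because of floating-point rounding).
total = sum(joint.values())
assert abs(total - 1.0) < 1e-9
```

Any dictionary of this shape that passes both checks is a valid joint pmf.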
Notations
• fX(a) = ∑_y f(a, y)
• fY(b) = ∑_x f(x, b)
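The marginal pmfs are just these row and column sums over the joint table. A sketch, using a small hypothetical joint pmf (illustrative values only):

```python
# Hypothetical joint pmf {(x, y): P(X = x, Y = y)}; values are illustrative.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def f_X(a):
    """Marginal of X: fX(a) = sum over y of f(a, y)."""
    return sum(p for (x, y), p in joint.items() if x == a)

def f_Y(b):
    """Marginal of Y: fY(b) = sum over x of f(x, b)."""
    return sum(p for (x, y), p in joint.items() if y == b)

# Each marginal is itself a pmf: nonnegative and summing to 1.
assert abs(f_X(0) + f_X(1) - 1.0) < 1e-9
assert abs(f_Y(0) + f_Y(1) - 1.0) < 1e-9
```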
Example 1
1. E(X) = ∑_x x fX(x) = ∑_{(x,y)} x f(x, y)
2. E(Y) = ∑_y y fY(y) = ∑_{(x,y)} y f(x, y)
3. For a function H : ℝ² → ℝ,
E(H(X, Y)) = ∑_{(x,y)} H(x, y) f(x, y)
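These formulas say an expectation can be computed either through a marginal or directly as a sum over the joint support, and both routes must agree. A sketch with a hypothetical joint pmf (illustrative values; H(x, y) = (x + y)² is an arbitrary choice of function):

```python
# Hypothetical joint pmf {(x, y): P(X = x, Y = y)}; values are illustrative.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# E(Y) as a sum over the joint support: sum over (x, y) of y * f(x, y).
E_Y = sum(y * p for (x, y), p in joint.items())

# E(Y) via the marginal: sum over y of y * fY(y).
fY = {}
for (x, y), p in joint.items():
    fY[y] = fY.get(y, 0.0) + p
E_Y_marginal = sum(y * p for y, p in fY.items())
assert abs(E_Y - E_Y_marginal) < 1e-9  # both routes agree

# E(H(X, Y)) for H(x, y) = (x + y)**2, an arbitrary illustrative function.
E_H = sum((x + y) ** 2 * p for (x, y), p in joint.items())
```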
Example 3
Find the probability that X > 2 and Y < 4; also find the probability that X > Y.
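The example's original table is not reproduced here, but the computation pattern is a filtered sum over the joint support. A sketch with a hypothetical uniform joint pmf on {1, …, 4} × {1, …, 4}:

```python
# Hypothetical uniform joint pmf on {1,...,4} x {1,...,4}; the actual
# table for this example is not shown in the text.
joint = {(x, y): 1 / 16 for x in range(1, 5) for y in range(1, 5)}

# P(X > 2 and Y < 4): sum f(x, y) over pairs satisfying both conditions.
p_event = sum(p for (x, y), p in joint.items() if x > 2 and y < 4)

# P(X > Y): sum f(x, y) over pairs with x > y.
p_greater = sum(p for (x, y), p in joint.items() if x > y)

print(p_event, p_greater)  # 0.375 0.375 for this particular table
```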
The marginals
X and Y are independent if and only if the joint pmf is the product of the marginals,
f(x, y) = g(x)h(y),
where g(x) = fX(x) and h(y) = fY(y).
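This factorization is easy to test numerically: compute both marginals and check f(x, y) = fX(x)·fY(y) at every point of the support. A sketch with a hypothetical joint pmf deliberately built as a product:

```python
# Hypothetical joint pmf constructed as the product of marginals
# (0.3, 0.7) for X and (0.4, 0.6) for Y; values are illustrative only.
joint = {
    (0, 0): 0.12, (0, 1): 0.18,
    (1, 0): 0.28, (1, 1): 0.42,
}

fX, fY = {}, {}
for (x, y), p in joint.items():
    fX[x] = fX.get(x, 0.0) + p
    fY[y] = fY.get(y, 0.0) + p

# X and Y are independent exactly when the joint pmf factors this way
# at every point of the support (tolerance for floating-point rounding).
independent = all(
    abs(p - fX[x] * fY[y]) < 1e-9 for (x, y), p in joint.items()
)
print(independent)  # True for this table
```

Perturbing any single entry (and renormalizing) breaks the factorization, and the same check then returns False.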
Definition
The following curves are called the curves of regression:
• µ_{X|y} = E(X | Y = y) is called the curve of regression of X on Y.
• µ_{Y|x} = E(Y | X = x) is called the curve of regression of Y on X.
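Each point of the curve of regression of Y on X is a conditional expectation, computed from the joint pmf by restricting to one value of X and renormalizing by the marginal. A sketch with a hypothetical joint pmf (illustrative values):

```python
# Hypothetical joint pmf {(x, y): P(X = x, Y = y)}; values are illustrative.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def mu_Y_given_x(x0):
    """Curve of regression of Y on X: E(Y | X = x0)."""
    fx = sum(p for (x, y), p in joint.items() if x == x0)  # fX(x0)
    return sum(y * p for (x, y), p in joint.items() if x == x0) / fx

# One conditional mean per value of X traces out the curve.
curve = {x0: mu_Y_given_x(x0) for x0 in (0, 1)}
```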
Example 8.
Definition
The covariance of two random variables defined on the same
sample space σ is,
• Cov(X, Y) = E((X − µX)(Y − µY)).
• Equivalently, Cov(X, Y) = E(XY) − E(X)E(Y).
• The correlation coefficient of two random variables is defined as
ρ = Cov(X, Y) / √(Var(X) Var(Y)).
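Both forms of the covariance, and the correlation coefficient, reduce to sums over the joint support, so the equivalence can be verified directly. A sketch with a hypothetical joint pmf (illustrative values):

```python
from math import sqrt

# Hypothetical joint pmf {(x, y): P(X = x, Y = y)}; values are illustrative.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

# Shortcut form: Cov(X, Y) = E(XY) - E(X)E(Y).
cov = E_XY - E_X * E_Y

# Defining form E((X - muX)(Y - muY)) must agree with the shortcut.
cov_def = sum((x - E_X) * (y - E_Y) * p for (x, y), p in joint.items())
assert abs(cov - cov_def) < 1e-9

# Correlation coefficient rho = Cov(X, Y) / sqrt(Var(X) Var(Y)).
var_X = sum(x ** 2 * p for (x, y), p in joint.items()) - E_X ** 2
var_Y = sum(y ** 2 * p for (x, y), p in joint.items()) - E_Y ** 2
rho = cov / sqrt(var_X * var_Y)
```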
A few facts regarding the mean and variance
Theorem
• If X, Y are two independent random variables, then E(XY) = E(X)E(Y).
• For any two random variables X, Y (independent or not) and real numbers a, b, we have E(aX + bY) = aE(X) + bE(Y).
• For any two independent random variables X, Y and any two real numbers a, b, we have Var(aX + bY) = a²Var(X) + b²Var(Y).
• For any two random variables X, Y, not necessarily independent,
Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y).
• For any two random variables and any two real numbers a, b, we have
Var(aX + bY) = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y).
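The last identity can be verified numerically for a concrete table: compute Var(aX + bY) directly from the joint pmf and compare it with a²Var(X) + b²Var(Y) + 2ab Cov(X, Y). A sketch with a hypothetical joint pmf and arbitrarily chosen a = 2, b = −1:

```python
# Hypothetical joint pmf {(x, y): P(X = x, Y = y)}; values are illustrative.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}
a, b = 2.0, -1.0  # arbitrary illustrative coefficients

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

# Left-hand side: Var(aX + bY) computed directly.
mean_Z = E(lambda x, y: a * x + b * y)
var_Z = E(lambda x, y: (a * x + b * y) ** 2) - mean_Z ** 2

# Right-hand side: a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).
var_X = E(lambda x, y: x ** 2) - E(lambda x, y: x) ** 2
var_Y = E(lambda x, y: y ** 2) - E(lambda x, y: y) ** 2
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
rhs = a ** 2 * var_X + b ** 2 * var_Y + 2 * a * b * cov

assert abs(var_Z - rhs) < 1e-9  # the identity holds for this table
```

Setting Cov(X, Y) = 0 in the right-hand side recovers the independent case, Var(aX + bY) = a²Var(X) + b²Var(Y).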
Example 10.