
Probability and Statistics (MATH F113)

Joint Distributions

Himadri Mukherjee

Department of Mathematics
BITS PILANI K K Birla Goa Campus, Goa

October 14, 2024


Introduction

In many experiments we would like to observe more than one
attribute and to understand the relations between them.
• In a population, the rate of infection of a disease and the age
of each person.
• The size and the nutritional value of a particular crop.
• The height and weight of children attending school.
• The income of a household and its investment in mutual funds.
• The TV-watching duration and the marks of students.
Definition

A number of random variables Xi, i ≤ n, defined on the same
σ-algebra of events give rise to the joint distribution of these
random variables.
Discrete case

Definition
Let X, Y be two random variables defined on the same sample
space σ of a random experiment. The joint probability mass
function of these two random variables is

f(x, y) = P(X = x, Y = y).
Observation

Theorem
Let f(x, y) be the probability mass function of the joint
distribution of discrete random variables X, Y on a sample space
σ. We have the following:
• f(x, y) ≥ 0
• Σ_x Σ_y f(x, y) = 1, where the sum runs over all the possible
values of X, Y.
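
A minimal sketch in Python: if the joint pmf is stored as a
dictionary mapping (x, y) to its probability (a hypothetical table
here), both conditions can be checked mechanically.

    # Hypothetical joint pmf of (X, Y), stored as {(x, y): probability}.
    f = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

    assert all(p >= 0 for p in f.values())        # f(x, y) >= 0 everywhere
    assert abs(sum(f.values()) - 1.0) < 1e-12     # Σ_x Σ_y f(x, y) = 1
    print("valid joint pmf")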
Notations

• For a subset A ⊂ σ, we have

P({(X, Y) ∈ A}) = Σ_{(x,y)∈A} f(x, y)

• The marginal mass functions are defined as

fX(x) = P({X = x}) and fY(y) = P({Y = y}).
Marginal mass functions

We can also compute the marginal mass functions as follows:

• fX(a) = Σ_y f(a, y)
• fY(b) = Σ_x f(x, b)
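
Continuing the dictionary sketch from above, the marginals are
obtained by summing out the other variable (the table values here
are hypothetical):

    from collections import defaultdict
    from fractions import Fraction

    f = {(0, 0): Fraction(1, 10), (0, 1): Fraction(3, 10),
         (1, 0): Fraction(2, 10), (1, 1): Fraction(4, 10)}

    fX, fY = defaultdict(Fraction), defaultdict(Fraction)
    for (x, y), p in f.items():
        fX[x] += p    # fX(a) = Σ_y f(a, y)
        fY[y] += p    # fY(b) = Σ_x f(x, b)

    print(dict(fX))   # marginal of X: fX(0) = 2/5, fX(1) = 3/5
    print(dict(fY))   # marginal of Y: fY(0) = 3/10, fY(1) = 7/10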
Example 1

Example 1. A coin is tossed three times; X is the number of
heads and Y is the difference between the number of heads and
the number of tails. Find the joint mass function of X, Y.
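
Since each of the 8 equally likely outcomes gives
Y = X − (3 − X) = 2X − 3, the joint mass concentrates on the
pairs (x, 2x − 3). A brute-force enumeration sketch:

    from itertools import product
    from collections import Counter
    from fractions import Fraction

    joint = Counter()
    for outcome in product("HT", repeat=3):   # 8 equally likely outcomes
        x = outcome.count("H")                # X = number of heads
        y = x - outcome.count("T")            # Y = heads minus tails
        joint[(x, y)] += Fraction(1, 8)

    for (x, y), p in sorted(joint.items()):
        print(f"f({x}, {y}) = {p}")
    # f(0, -3) = 1/8, f(1, -1) = 3/8, f(2, 1) = 3/8, f(3, 3) = 1/8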
Example 2

Example 2. Two scanners are needed for an experiment. Out of
the 5 available, it is known that two have a memory defect, one
has a wiring defect, and two are in good condition. Two scanners
are chosen at random; let X be the number of chosen scanners
with a memory defect and Y the number with a wiring defect.
• Find the joint distribution of X, Y (see the sketch after this list).
• Find the probability of at most one defect among the chosen
scanners.
• Find the marginal distributions of X and Y.
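
A sketch that answers the first two parts by enumerating the
C(5, 2) = 10 equally likely choices of two scanners (reading the
setup as sampling without replacement):

    from itertools import combinations
    from collections import Counter
    from fractions import Fraction

    scanners = ["M", "M", "W", "G", "G"]       # memory, wiring, good

    joint = Counter()
    for pair in combinations(range(5), 2):     # all 10 equally likely choices
        chosen = [scanners[i] for i in pair]
        x, y = chosen.count("M"), chosen.count("W")
        joint[(x, y)] += Fraction(1, 10)

    print(dict(joint))                         # the joint distribution of (X, Y)
    # P(at most one defect) = P(X + Y <= 1) = 7/10
    print(sum(p for (x, y), p in joint.items() if x + y <= 1))

The marginals then follow by summing over the other variable, as
in the discrete marginal sketch above.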
Means

Let X, Y be two discrete random variables on the same sample
space. We have the following means (provided the summations
exist):
1 Univariate mean of X,

E(X) = Σ_x x fX(x) = Σ_{(x,y)} x f(x, y)

2 Univariate mean of Y,

E(Y) = Σ_y y fY(y) = Σ_{(x,y)} y f(x, y)

3 For a function H : R² → R,

E(H(X, Y)) = Σ_{(x,y)} H(x, y) f(x, y)
Example 3

Example 3. Find the univariate means of the above two examples.
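
A sketch for Example 1, reusing the enumerated pmf: the
univariate means are just the defining sums (E(XY) is included to
illustrate case 3 with H(x, y) = xy).

    from fractions import Fraction

    # Joint pmf of Example 1 (three coin tosses, Y = 2X - 3).
    f = {(0, -3): Fraction(1, 8), (1, -1): Fraction(3, 8),
         (2, 1): Fraction(3, 8), (3, 3): Fraction(1, 8)}

    EX = sum(x * p for (x, y), p in f.items())       # E(X) = 3/2
    EY = sum(y * p for (x, y), p in f.items())       # E(Y) = 0
    EXY = sum(x * y * p for (x, y), p in f.items())  # E(XY) = 3/2
    print(EX, EY, EXY)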


The joint cumulative distribution function

The cumulative distribution function of the joint distribution,
with probability mass function f(x, y), in the discrete case is as
follows:

F(a, b) = Σ_{x≤a, y≤b} f(x, y)
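
In the dictionary representation this is a filtered sum; a
one-function sketch:

    def joint_cdf(f, a, b):
        """F(a, b) = Σ f(x, y) over all x <= a and y <= b."""
        return sum(p for (x, y), p in f.items() if x <= a and y <= b)

    # With the Example 1 pmf: joint_cdf(f, 1, 0) = 1/8 + 3/8 = 1/2.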
Example 4.

Example 4. Let X, Y be two independent Poisson random
variables with parameters µ, λ respectively. Find the probability
mass function of U = X + Y.
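
The standard answer is that U is again Poisson with parameter
µ + λ: f_U(u) = Σ_k f_X(k) f_Y(u − k), and the binomial theorem
collapses the sum. A numerical sanity check of this claim
(hypothetical parameter values; scipy assumed available):

    from scipy.stats import poisson

    mu, lam = 2.0, 3.0                 # hypothetical parameters
    for u in range(10):
        # Convolution: P(U = u) = Σ_k P(X = k) P(Y = u - k)
        conv = sum(poisson.pmf(k, mu) * poisson.pmf(u - k, lam)
                   for k in range(u + 1))
        assert abs(conv - poisson.pmf(u, mu + lam)) < 1e-12
    print("X + Y matches Poisson(mu + lam) on the checked range")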
Example 5.

If X, Y are two discrete random variables with joint mass function
given by

f(x, y) = xy/36, for x, y = 1, 2, 3,

find the joint mass function of U = X + Y and V = X − Y.
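
Since the map (x, y) ↦ (x + y, x − y) is one-to-one here, each
(u, v) simply inherits the mass of its preimage; a tabulation sketch:

    from collections import defaultdict
    from fractions import Fraction

    g = defaultdict(Fraction)              # joint pmf of (U, V)
    for x in (1, 2, 3):
        for y in (1, 2, 3):
            g[(x + y, x - y)] += Fraction(x * y, 36)

    for (u, v), p in sorted(g.items()):
        print(f"g({u}, {v}) = {p}")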
The continuous case

Let X, Y be two continuous random variables on the same sample
space σ. A function f(x, y) is called the joint probability density
function if the following are true:

• f(x, y) ≥ 0
• For any subset A ⊂ R²,

P({(X, Y) ∈ A}) = ∫∫_A f(x, y) dx dy
Rectangular region

For a rectangular region A = {(x, y) | a ≤ x ≤ b, c ≤ y ≤ d}, we
have:

P({(X, Y) ∈ A}) = ∫_a^b ∫_c^d f(x, y) dy dx
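
Numerically such rectangle probabilities are plain double integrals;
a sketch with a hypothetical density f(x, y) = x + y on the unit
square (note scipy's dblquad integrates its first argument, y,
innermost):

    from scipy.integrate import dblquad

    f = lambda y, x: x + y                 # hypothetical joint pdf on [0,1]^2

    a, b, c, d = 0.0, 0.5, 0.0, 0.5
    prob, _ = dblquad(f, a, b, c, d)       # ∫_a^b ∫_c^d f(x, y) dy dx
    print(prob)                            # 0.125 for this density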
Example 6.

Example 6. Let X, Y be two continuous random variables with
joint density

f(x, y) = e^{−(x+y)} if x, y > 0, and 0 otherwise.

Find the probability that X > 2 and Y < 4; also find the
probability that X > Y.
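
For this density X and Y are independent exponentials, so the first
probability is e^{−2}(1 − e^{−4}) and the second is 1/2 by
symmetry. A numerical check sketch:

    import numpy as np
    from scipy.integrate import dblquad

    f = lambda y, x: np.exp(-(x + y))

    p1, _ = dblquad(f, 2, np.inf, 0, 4)            # P(X > 2, Y < 4)
    p2, _ = dblquad(f, 0, np.inf, 0, lambda x: x)  # P(X > Y): y from 0 to x
    print(p1, np.exp(-2) * (1 - np.exp(-4)))       # these agree
    print(p2)                                      # ~ 0.5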
The marginals

The marginal probability density functions of X, Y are denoted
by fX, fY respectively and are given by

fX(x) = ∫_{−∞}^{∞} f(x, y) dy,   fY(y) = ∫_{−∞}^{∞} f(x, y) dx
Independence of random variables

Two random variables X, Y (whether continuous or discrete) are
called independent if for every (x, y) we have

f(x, y) = g(x) h(y)

for some functions g and h; equivalently, f(x, y) = fX(x) fY(y)
for every (x, y).


Example 7.

Example 7. Two people decide to meet at a railway station
between 10:00 AM and 11:00 AM. If their arrival times are
independent random variables X, Y, each uniformly distributed on
this interval, what is the probability that one will wait more than
10 minutes for the other?
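
The event is |X − Y| > 10 with X, Y independent and uniform on
(0, 60) minutes; geometrically its complement is a band around the
diagonal of the square, giving 1 − (50/60)² = 11/36. A Monte
Carlo sketch to corroborate:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.uniform(0, 60, n)              # arrival times, in minutes
    y = rng.uniform(0, 60, n)
    print(np.mean(np.abs(x - y) > 10))     # ~ 11/36 ≈ 0.3056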
The conditional distribution

Let X, Y be continuous random variables with joint pdf f(x, y),
and let fX(x) be the marginal pdf of X. Then for any value x of X
for which fX(x) ≠ 0, the conditional probability density function
of Y given X = x is

fY|x(y | x) = f(x, y) / fX(x)
The conditional means

The following are the conditional means.

Definition
• µX|y = E(X | y) = ∫_{−∞}^{∞} x fX|y(x | y) dx
• µY|x = E(Y | x) = ∫_{−∞}^{∞} y fY|x(y | x) dy

Note that the above two are functions of y and x respectively.


The curves of regression

Definition
The following curves are called the curves of regression:
• µX|y = E(X | y) is called the curve of regression of X on Y.
• µY|x = E(Y | x) is called the curve of regression of Y on X.
Example 8.

Example 8. Assuming the following joint distribution of X, Y,
find the regression curves:

f(x, y) = (1/3)(x + y) if 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
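
A symbolic sketch with sympy; the normalizing constant cancels
inside the conditional density, so the regression curve is
unaffected by it:

    import sympy as sp

    x, y = sp.symbols("x y", nonnegative=True)
    f = (x + y) / 3                        # the Example 8 density

    fX = sp.integrate(f, (y, 0, 1))        # marginal of X
    mu_Y_given_x = sp.integrate(y * f / fX, (y, 0, 1))
    print(sp.simplify(mu_Y_given_x))       # equivalent to (3*x + 2)/(6*x + 3)

By the symmetry of f in x and y, the curve of regression of X on
Y has the same form with y in place of x.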
Mean

Let X, Y be two continuous random variables with joint pdf
f(x, y). Let h : R² → R be any function. The mean of h(X, Y),
µ_{h(X,Y)}, is given by

E(h(X, Y)) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(x, y) f(x, y) dx dy
Example 9.

Example 9. If the joint density of two random variables X, Y is
given by

f(x, y) = 2 if 0 ≤ x ≤ y ≤ 1, and 0 otherwise,

then find E(X | Y = y) and E(Y | X = x); also find the covariance
of X and Y.
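
For this triangular density the conditional distributions are
uniform, giving E(X | y) = y/2 and E(Y | x) = (1 + x)/2, and the
covariance works out to 1/36. A symbolic check sketch:

    import sympy as sp

    x, y = sp.symbols("x y")
    f = 2                                  # density on 0 <= x <= y <= 1

    EX = sp.integrate(x * f, (x, 0, y), (y, 0, 1))        # 1/3
    EY = sp.integrate(y * f, (x, 0, y), (y, 0, 1))        # 2/3
    EXY = sp.integrate(x * y * f, (x, 0, y), (y, 0, 1))   # 1/4
    print(EXY - EX * EY)                   # Cov(X, Y) = 1/36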
Covariance

Definition
The covariance of two random variables defined on the same
sample space σ is
• Cov(X, Y) = E((X − µX)(Y − µY)),
• or equivalently Cov(X, Y) = E(XY) − E(X)E(Y).
• The correlation coefficient of two random variables is
defined as ρ = Cov(X, Y) / √(Var(X) Var(Y)).
A few facts regarding the mean

Theorem
• If X, Y are two independent random variables then
E(XY) = E(X)E(Y).
• For any two random variables (independent or not) X, Y and
real numbers a, b, we have E(aX + bY) = aE(X) + bE(Y).
• For any two independent random variables X, Y and any two
real numbers a, b, we have
Var(aX + bY) = a²Var(X) + b²Var(Y).
• For any two random variables X, Y, not necessarily
independent,
Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y).
• For any two random variables and any two real numbers a, b,
we have
Var(aX + bY) = a²Var(X) + b²Var(Y) + 2abCov(X, Y).
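
A quick simulation sketch of the last identity, with hypothetical
dependent variables (Y is built from X so that Cov(X, Y) ≠ 0):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    X = rng.normal(size=n)
    Y = 0.5 * X + rng.normal(size=n)       # correlated with X by construction
    a, b = 2.0, -3.0

    lhs = np.var(a * X + b * Y)
    rhs = a**2 * np.var(X) + b**2 * np.var(Y) + 2 * a * b * np.cov(X, Y)[0, 1]
    print(lhs, rhs)                        # agree up to sampling noise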
Example 10.

Example 10. Two numbers x, y are selected at random, each
following a uniform distribution on the interval [0, 1]. Find the
probability that the distance of the point (x, y) from the origin is
less than 1.
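
The answer is the area of the quarter disc x² + y² < 1 inside the
unit square, i.e. π/4. A Monte Carlo sketch:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 1_000_000
    pts = rng.uniform(0, 1, size=(n, 2))
    inside = (pts ** 2).sum(axis=1) < 1    # squared distance from origin < 1
    print(inside.mean(), np.pi / 4)        # ~ 0.785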
