
BIVARIATE DISTRIBUTION

INTRODUCTION
A bivariate distribution is a probability distribution that involves two random variables, often
denoted as X and Y.
Examples of bivariate distributions
• Height and Weight
• Temperature and Ice Cream Sales
Bivariate (joint) probability distributions are of two types:
• Discrete joint distribution
• Continuous joint distribution

DISCRETE JOINT DISTRIBUTION


A discrete joint distribution is a probability distribution that describes the simultaneous
behavior of two or more discrete random variables.

 Y →
 X ↓      y1    y2   ...   ym   | Total
 x1       P11   P12  ...   P1m  | P1.
 x2       P21   P22  ...   P2m  | P2.
 .        .     .    ...   .    | .
 xn       Pn1   Pn2  ...   Pnm  | Pn.
 Total    P.1   P.2  ...   P.m  | 1

A discrete joint distribution satisfies two conditions:

• 0 ≤ P_XY(x, y) ≤ 1 for every pair (x, y)

• Σ_x Σ_y P_XY(x, y) = 1
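These two conditions are easy to verify mechanically. A minimal Python sketch, using a hypothetical 2×2 joint table (the values are illustrative, not from this document):

```python
# Hypothetical joint pmf of X and Y, keyed by (x, y); illustrative values only.
joint = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.20, (1, 1): 0.40,
}

# Condition 1: every probability lies in [0, 1].
assert all(0.0 <= p <= 1.0 for p in joint.values())

# Condition 2: the probabilities sum to 1 (with a float tolerance).
assert abs(sum(joint.values()) - 1.0) < 1e-9
print("valid joint pmf")
```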

JOINT PROBABILITY MASS FUNCTION:
The joint probability mass function of two discrete random variables X and Y gives the probability that X and Y simultaneously take the values x and y:

P_XY(x, y) = P(X = x, Y = y)

MARGINAL PROBABILITY MASS FUNCTION:


Marginal probability mass functions are obtained by summing the joint probabilities over the support of the other variable.

• Marginal probability mass function of X:

P_X(x) = P(X = x) = Σ_y P_XY(x, y),   x ∈ S_1

• Marginal probability mass function of Y:

P_Y(y) = P(Y = y) = Σ_x P_XY(x, y),   y ∈ S_2
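The summations above can be sketched in Python over the same kind of hypothetical joint table (the names `P_X` and `P_Y` are my own):

```python
from collections import defaultdict

# Hypothetical joint pmf P_XY(x, y); illustrative values only.
joint = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.20, (1, 1): 0.40,
}

# Marginal of X: sum over the support of Y; marginal of Y: sum over X.
P_X, P_Y = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    P_X[x] += p
    P_Y[y] += p

# Each marginal is itself a pmf: non-negative and summing to 1.
assert abs(sum(P_X.values()) - 1.0) < 1e-9
assert abs(sum(P_Y.values()) - 1.0) < 1e-9
```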

• Expected value of X: It is defined by:

E(X) = Σ_x x · P_X(x)

• Expected value of Y: It is defined by:

E(Y) = Σ_y y · P_Y(y)

• Variance of X: It is defined by:

Var(X) = Σ_x x² · P_X(x) − [E(X)]²

• Variance of Y: It is defined by:

Var(Y) = Σ_y y² · P_Y(y) − [E(Y)]²

• Covariance of (X, Y): It is defined by:

Cov(X, Y) = Σ_x Σ_y x·y · P_XY(x, y) − E(X) · E(Y)
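The five formulas above can be computed directly from a joint table. A Python sketch (hypothetical illustrative values; the variable names are mine):

```python
# Hypothetical joint pmf; illustrative values only.
joint = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

# Marginals, by summing over the other variable.
P_X, P_Y = {}, {}
for (x, y), p in joint.items():
    P_X[x] = P_X.get(x, 0.0) + p
    P_Y[y] = P_Y.get(y, 0.0) + p

E_X = sum(x * p for x, p in P_X.items())                       # E(X)
E_Y = sum(y * p for y, p in P_Y.items())                       # E(Y)
Var_X = sum(x**2 * p for x, p in P_X.items()) - E_X**2         # Var(X)
Var_Y = sum(y**2 * p for y, p in P_Y.items()) - E_Y**2         # Var(Y)
Cov_XY = sum(x * y * p for (x, y), p in joint.items()) - E_X * E_Y  # Cov(X, Y)
print(E_X, E_Y, Var_X, Var_Y, Cov_XY)
```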

• Conditional probability function of X, given Y = y (defined when P(Y = y) > 0):

P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) = P_XY(x, y) / P_Y(y)

• Conditional probability function of Y, given X = x (defined when P(X = x) > 0):

P(Y = y | X = x) = P(X = x, Y = y) / P(X = x) = P_XY(x, y) / P_X(x)
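A small Python sketch of the first formula (the helper name `cond_X_given_Y` and the table values are my own, for illustration):

```python
# Hypothetical joint pmf; illustrative values only.
joint = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

def cond_X_given_Y(joint, y):
    """Return {x: P(X = x | Y = y)} = P_XY(x, y) / P_Y(y), for P_Y(y) > 0."""
    p_y = sum(p for (_, b), p in joint.items() if b == y)
    if p_y == 0:
        raise ValueError("cannot condition on a zero-probability event")
    return {x: p / p_y for (x, b), p in joint.items() if b == y}

cond = cond_X_given_Y(joint, 1)
# A conditional pmf sums to 1 over the support of X.
assert abs(sum(cond.values()) - 1.0) < 1e-9
```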

Example 1. The bivariate probability distribution of X and Y is given below:


X=0 X=1 X=2 X=3 Totals
Y=0 0.01 0.01 0.06 0.03 0.11
Y=1 0.02 0.04 0.07 0.04 0.17
Y=2 0.03 0.08 0.09 0.04 0.24
Y=3 0.05 0.11 0.08 0.02 0.26
Y=4 0.13 0.05 0.03 0.01 0.22
Totals 0.24 0.29 0.33 0.14 1.00

Find (i) P_XY(X=1, Y=3), (ii) P_Y(Y=3), (iii) P_X(X=1), (iv) the expected value of X, (v) the expected value of Y, (vi) the variance of X, (vii) the variance of Y, and (viii) Cov(X, Y).
Solution:
(i) PXY (X=1, Y=3) =0.11
(ii) PY(Y=3) =0.26
(iii) PX(X=1) =0.29
(iv) Expected value of X:

E(X) = Σ_x x · P_X(x) = 0(0.24) + 1(0.29) + 2(0.33) + 3(0.14) = 1.37

(v) Expected value of Y:

E(Y) = Σ_y y · P_Y(y) = 0(0.11) + 1(0.17) + 2(0.24) + 3(0.26) + 4(0.22) = 2.31

(vi) Variance of X:

Var(X) = Σ_x x² · P_X(x) − [E(X)]² = 0²(0.24) + 1²(0.29) + 2²(0.33) + 3²(0.14) − 1.37² = 0.9931

(vii) Variance of Y:

Var(Y) = Σ_y y² · P_Y(y) − [E(Y)]² = 0²(0.11) + 1²(0.17) + 2²(0.24) + 3²(0.26) + 4²(0.22) − 2.31² = 1.6539
(viii) Cov(X, Y) = Σ_x Σ_y x·y · P_XY(x, y) − E(X) · E(Y)

= 0(0)(0.01) + 0(1)(0.02) + 0(2)(0.03) + 0(3)(0.05) + 0(4)(0.13) + 1(0)(0.01) + 1(1)(0.04) + 1(2)(0.08) + 1(3)(0.11) + 1(4)(0.05) + 2(0)(0.06) + 2(1)(0.07) + 2(2)(0.09) + 2(3)(0.08) + 2(4)(0.03) + 3(0)(0.03) + 3(1)(0.04) + 3(2)(0.04) + 3(3)(0.02) + 3(4)(0.01) − (1.37)(2.31)

= 2.61 − 3.1647 = −0.5547
Since the covariance is non-zero, X and Y are not independent; since it is negative, they have an inverse (negative) linear association.
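The hand computations in this example can be cross-checked with a short Python sketch (the row-wise `table` layout is my own; the probabilities are the ones given above):

```python
# Joint table from the example: table[y][x] = P(X = x, Y = y).
table = {
    0: [0.01, 0.01, 0.06, 0.03],
    1: [0.02, 0.04, 0.07, 0.04],
    2: [0.03, 0.08, 0.09, 0.04],
    3: [0.05, 0.11, 0.08, 0.02],
    4: [0.13, 0.05, 0.03, 0.01],
}
joint = {(x, y): table[y][x] for y in table for x in range(4)}

# Marginals, then moments, exactly as in the formulas above.
P_X = {x: sum(joint[(x, y)] for y in range(5)) for x in range(4)}
P_Y = {y: sum(joint[(x, y)] for x in range(4)) for y in range(5)}
E_X = sum(x * p for x, p in P_X.items())
E_Y = sum(y * p for y, p in P_Y.items())
Var_X = sum(x**2 * p for x, p in P_X.items()) - E_X**2
Var_Y = sum(y**2 * p for y, p in P_Y.items()) - E_Y**2
Cov = sum(x * y * p for (x, y), p in joint.items()) - E_X * E_Y

# Matches the hand computation: 1.37, 2.31, 0.9931, 1.6539, -0.5547
print(round(E_X, 4), round(E_Y, 4), round(Var_X, 4),
      round(Var_Y, 4), round(Cov, 4))
```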

Example 2. Given the following bivariate probability distribution, obtain (i) the marginal distributions of X and Y, and (ii) the conditional distribution of X given Y = 2.

 Y \ X    -1      0      1
 0       1/15   2/15   1/15
 1       3/15   2/15   1/15
 2       2/15   1/15   2/15

Solution.

 Y \ X        -1      0      1    | Σ_x P(x, y)
 0           1/15   2/15   1/15   |   4/15
 1           3/15   2/15   1/15   |   6/15
 2           2/15   1/15   2/15   |   5/15
 Σ_y P(x, y) 6/15   5/15   4/15   |   1

(i) Marginal distribution of X. From the table above, we get

P(X = −1) = 6/15 = 2/5;   P(X = 0) = 5/15 = 1/3;   P(X = 1) = 4/15

Marginal distribution of Y:

P(Y = 0) = 4/15;   P(Y = 1) = 6/15 = 2/5;   P(Y = 2) = 5/15 = 1/3
(ii) Conditional distribution of X given Y = 2. By the multiplication rule,

P(X = x ∩ Y = 2) = P(Y = 2) · P(X = x | Y = 2)

⇒ P(X = x | Y = 2) = P(X = x ∩ Y = 2) / P(Y = 2)

∴ P(X = −1 | Y = 2) = (2/15) / (1/3) = 2/5

Similarly, P(X = 0 | Y = 2) = (1/15) / (1/3) = 1/5 and P(X = 1 | Y = 2) = (2/15) / (1/3) = 2/5.
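These fractions can be verified exactly in Python with the standard `fractions` module (the dictionary layout is mine; the values come from the table above):

```python
from fractions import Fraction as F

# Joint table: keys are (x, y) with x in {-1, 0, 1} and y in {0, 1, 2}.
joint = {
    (-1, 0): F(1, 15), (0, 0): F(2, 15), (1, 0): F(1, 15),
    (-1, 1): F(3, 15), (0, 1): F(2, 15), (1, 1): F(1, 15),
    (-1, 2): F(2, 15), (0, 2): F(1, 15), (1, 2): F(2, 15),
}

# Marginals: P(X = x) sums over y, P(Y = y) sums over x.
P_X = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (-1, 0, 1)}
P_Y = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1, 2)}

# Conditional distribution of X given Y = 2: P_XY(x, 2) / P_Y(2).
cond = {x: joint[(x, 2)] / P_Y[2] for x in (-1, 0, 1)}

assert P_X == {-1: F(2, 5), 0: F(1, 3), 1: F(4, 15)}
assert P_Y == {0: F(4, 15), 1: F(2, 5), 2: F(1, 3)}
assert cond == {-1: F(2, 5), 0: F(1, 5), 1: F(2, 5)}
```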

Practice Questions:
Question-1: Two discrete random variables X and Y have:
P(X = 0, Y = 0) = 2/9;   P(X = 0, Y = 1) = 1/9;   P(X = 1, Y = 0) = 1/9;   P(X = 1, Y = 1) = 5/9.
Examine whether X and Y are independent.

Question-2: The joint probability distribution of a pair of random variables is given by the following table:

 Y \ X     1     2     3
 1        0.1   0.1   0.2
 2        0.2   0.3   0.1
Find
(a) The marginal distributions
(b) The conditional distribution of X given Y =1
(c) 𝑃{(𝑋 + 𝑌) < 4}
