
Joint Distribution of Two Random Variables


When an experiment is conducted, two or more random variables are often observed simultaneously, not only to study their individual behaviours but also to determine the degree of relationship between the variables.

e.g. Roll two dice, one red and one white. Let X = score on the red die and Y = score on the white die, so that X = 1, 2, 3, 4, 5, 6 and Y = 1, 2, 3, 4, 5, 6. We are interested in how likely the pair (X, Y) is to take some particular value (x, y). We want

  Pr{(X, Y) = (x, y)} = Pr(X = x, Y = y) = p(x, y).

We regard p(x, y) as being the joint probability function for X and Y.

e.g. For the two-dice problem, p(x, y) = 1/36, x, y = 1, 2, ..., 6.
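As an illustration (not part of the original notes), a short Python sketch can tabulate the joint p.f. for the two-dice example and confirm that the probabilities sum to 1; the variable names here are my own.

```python
from fractions import Fraction

# Joint p.f. for the two-dice example: each of the 36 ordered
# pairs (x, y) is equally likely, so p(x, y) = 1/36.
p = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

assert sum(p.values()) == 1   # a valid p.f. sums to 1
print(p[(3, 5)])              # Pr(X = 3, Y = 5) = 1/36
```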

e.g. Toss two coins. Let

  X = 1 if head on first coin, 0 if tail on first coin,
  Y = number of heads on the two coins.

  Sample point    (x, y)
  HH              (1, 2)
  HT              (1, 1)
  TH              (0, 1)
  TT              (0, 0)

Here X = 0 or 1 and Y = 0, 1 or 2. The joint p.f. may be specified by a tabulation:

              Y
  X         0      1      2
  0        1/4    1/4     0
  1         0     1/4    1/4
If we only consider X we can see directly that Pr(X = 0) = 1/2. From the table,

  Pr(X = 0) = Pr{(X = 0, Y = 0) ∪ (X = 0, Y = 1) ∪ (X = 0, Y = 2)}
            = Pr(X = 0, Y = 0) + Pr(X = 0, Y = 1) + Pr(X = 0, Y = 2),

since the events are mutually exclusive, i.e. the sum of the first row (the X = 0 row) gives Pr(X = 0). Similarly for Pr(X = 1).

Appending the row and column totals to the table gives:

                Y
  X           0      1      2    Pr(X = x)
  0          1/4    1/4     0      1/2
  1           0     1/4    1/4     1/2
  Pr(Y = y)  1/4    1/2    1/4      1
The same also applies to Y. We call Pr(X = x) = p_X(x) the marginal probability function of X and Pr(Y = y) = p_Y(y) the marginal probability function of Y.
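A minimal sketch of the same computation in Python (the numpy array layout and names are my choice, not the notes'): the marginals fall out as row and column sums of the joint table.

```python
import numpy as np

# Joint p.f. for the two-coins example:
# rows are x = 0, 1; columns are y = 0, 1, 2.
p = np.array([[1/4, 1/4, 0.0],
              [0.0, 1/4, 1/4]])

p_X = p.sum(axis=1)   # row sums:    [0.5, 0.5]
p_Y = p.sum(axis=0)   # column sums: [0.25, 0.5, 0.25]
print(p_X, p_Y)
```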

Independence
The random variables X and Y are independent iff Pr(X = x, Y = y) = Pr(X = x) Pr(Y = y), i.e. p(x, y) = p_X(x) p_Y(y), for all possible x and y. Otherwise X and Y are dependent.

e.g. Toss two coins: X and Y are dependent, since p(0, 0) = 1/4 but p_X(0) p_Y(0) = (1/2)(1/4) = 1/8.
Roll two dice: X and Y are independent, since p(x, y) = 1/36 = (1/6)(1/6) = p_X(x) p_Y(y) for all x and y.
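To see the independence criterion in action, one can compare the joint table with the outer product of its marginals. A sketch, reusing the array layout assumed above:

```python
import numpy as np

def is_independent(p):
    """Check p(x, y) == p_X(x) * p_Y(y) for every cell."""
    p_X, p_Y = p.sum(axis=1), p.sum(axis=0)
    return np.allclose(p, np.outer(p_X, p_Y))

coins = np.array([[1/4, 1/4, 0.0],
                  [0.0, 1/4, 1/4]])
dice = np.full((6, 6), 1/36)

print(is_independent(coins))  # False -> dependent
print(is_independent(dice))   # True  -> independent
```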

Conditional p.f.
The conditional probability function of X given Y is

  p_X(x | y) = Pr(X = x | Y = y) = p(x, y) / p_Y(y).

e.g. Toss two coins. Pr(X = x | Y = y):

              Y
  X         0      1      2
  0         1     1/2     0
  1         0     1/2     1

Similarly, Pr(Y = y | X = x) = p(x, y) / p_X(x):

              Y
  X         0      1      2
  0        1/2    1/2     0
  1         0     1/2    1/2
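The conditional tables above can be reproduced by dividing each column (or row) of the joint table by the corresponding marginal; a minimal numpy sketch, under the same assumed layout:

```python
import numpy as np

p = np.array([[1/4, 1/4, 0.0],    # joint p.f., rows x = 0, 1
              [0.0, 1/4, 1/4]])   # columns y = 0, 1, 2

p_X, p_Y = p.sum(axis=1), p.sum(axis=0)

p_X_given_y = p / p_Y             # divide each column by p_Y(y)
p_Y_given_x = p / p_X[:, None]    # divide each row by p_X(x)

print(p_X_given_y)  # [[1. , 0.5, 0. ], [0. , 0.5, 1. ]]
print(p_Y_given_x)  # [[0.5, 0.5, 0. ], [0. , 0.5, 0.5]]
```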

Expectations

The random variable g(X, Y) has an expected value given by

  E{g(X, Y)} = Σ_x Σ_y g(x, y) p(x, y),

provided the series converges absolutely.

e.g. Toss two coins. Let g(X, Y) = XY. Then

  E{g(X, Y)} = E(XY) = Σ_x Σ_y x y p(x, y)
             = (0·0)(1/4) + (0·1)(1/4) + (0·2)(0)
               + (1·0)(0) + (1·1)(1/4) + (1·2)(1/4)
             = 3/4.

Let g(X, Y) = XY². Then

  E(XY²) = (0·0²)(1/4) + (0·1²)(1/4) + (0·2²)(0)
           + (1·0²)(0) + (1·1²)(1/4) + (1·2²)(1/4)
         = 5/4.

Note that E(X) and E(Y) are obtained in the usual way using the marginal p.f.:

  E(X) = 0·(1/2) + 1·(1/2) = 1/2 = μ_X
  E(Y) = 0·(1/4) + 1·(1/2) + 2·(1/4) = 1 = μ_Y.

These are the means of X and Y respectively.
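A sketch of these expectation computations in Python (the meshgrid construction is my own way of laying out the x and y values over the table):

```python
import numpy as np

p = np.array([[1/4, 1/4, 0.0],
              [0.0, 1/4, 1/4]])
x = np.array([0, 1])
y = np.array([0, 1, 2])
X, Y = np.meshgrid(x, y, indexing="ij")  # X[i,j] = x_i, Y[i,j] = y_j

E_XY  = np.sum(X * Y * p)          # 0.75 = 3/4
E_XY2 = np.sum(X * Y**2 * p)       # 1.25 = 5/4
E_X   = np.sum(x * p.sum(axis=1))  # 0.5, from the marginal of X
E_Y   = np.sum(y * p.sum(axis=0))  # 1.0, from the marginal of Y
print(E_XY, E_XY2, E_X, E_Y)
```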


Covariance and Correlation

When g(X, Y) = (X − μ_X)(Y − μ_Y), then

  E{g(X, Y)} = E[(X − μ_X)(Y − μ_Y)] = Σ_x Σ_y (x − μ_X)(y − μ_Y) p(x, y)

is referred to as the covariance of X and Y, written cov(X, Y). Expanding,

  E[(X − μ_X)(Y − μ_Y)] = E(XY − μ_X Y − μ_Y X + μ_X μ_Y)
                        = E(XY) − μ_X E(Y) − μ_Y E(X) + μ_X μ_Y
                        = E(XY) − μ_X μ_Y.

e.g. cov(X, Y) = 3/4 − (1/2)(1) = 1/4.

This is a measure of the association or degree of relationship between X and Y. A standardized measure is given by

  ρ = cov(X, Y) / (σ_X σ_Y),

called the correlation coefficient of X and Y. It can be shown that −1 ≤ ρ ≤ 1.

e.g. Toss two coins:

  E(X²) = 0²·(1/2) + 1²·(1/2) = 1/2,  so σ_X² = 1/2 − (1/2)² = 1/4
  E(Y²) = 0²·(1/4) + 1²·(1/2) + 2²·(1/4) = 3/2,  so σ_Y² = 3/2 − 1² = 1/2

  ρ = (1/4) / (√(1/4) √(1/2)) = 1/√2 ≈ 0.7071

ρ close to 1 or −1 indicates strong association; ρ close to 0 indicates weak association.
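Continuing the earlier sketch, the covariance and correlation coefficient for the two-coins example (same assumed array layout):

```python
import numpy as np

p = np.array([[1/4, 1/4, 0.0],
              [0.0, 1/4, 1/4]])
x, y = np.array([0, 1]), np.array([0, 1, 2])
X, Y = np.meshgrid(x, y, indexing="ij")

E_X, E_Y = np.sum(X * p), np.sum(Y * p)
cov   = np.sum(X * Y * p) - E_X * E_Y   # E(XY) - mu_X mu_Y = 0.25
var_X = np.sum(X**2 * p) - E_X**2       # 0.25
var_Y = np.sum(Y**2 * p) - E_Y**2       # 0.5
rho   = cov / np.sqrt(var_X * var_Y)    # 0.7071...
print(cov, rho)
```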


ρ < 0: X and Y tend to move inversely, i.e. large X is more likely to occur with small Y, and small X is more likely to occur with large Y.
ρ > 0: X and Y tend to move proportionally, i.e. large X is more likely to occur with large Y, and small X is more likely to occur with small Y.
The closer ρ is to +1 or −1, the stronger is the tendency.
