week 8 notes
3 Conditional Distributions
We already know that if P(A) > 0, then

    P(B | A) = P(A ∩ B) / P(A) ……………………………..(1)

If X and Y are discrete random variables and we take A to be the event X = x and B the event Y = y, then (1) gives

    P(Y = y | X = x) = P(X = x, Y = y) / P(X = x) = f(x, y) / f1(x)

where f(x, y) is the joint probability function and f1(x) is the marginal probability function for X. We define

    f(y | x) = f(x, y) / f1(x)

and call it the conditional probability function of Y given X. Similarly, the conditional probability function of X given Y is f(x | y) = f(x, y) / f2(y). We sometimes denote f(x | y) and f(y | x) by f1(x | y) and f2(y | x) respectively.

These ideas extend directly to the case where X and Y are continuous random variables: the conditional density function of Y given X is f(y | x) = f(x, y) / f1(x), where f(x, y) is the joint density function of X and Y and f1(x) is the marginal density function of X.
Thus we can find, for example, the probability that Y lies between c and d given that x < X < x + dx:

    P(c < Y < d | x < X < x + dx) = ∫[c, d] f(y | x) dy.
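The integral above can be checked numerically. The sketch below is illustrative only: the joint density f(x, y) = x + y on the unit square is an assumed example (it does not appear in these notes), and the helper names f1 and cond_prob are my own.

```python
# Numerical sketch of P(c < Y < d | X = x) = ∫[c, d] f(y | x) dy,
# using an ASSUMED joint density f(x, y) = x + y on the unit square.

def f(x, y):
    # joint density (assumed example, integrates to 1 on the unit square)
    return x + y if 0 < x < 1 and 0 < y < 1 else 0.0

def f1(x, n=10_000):
    # marginal density of X: integrate the joint density over y (midpoint rule)
    h = 1.0 / n
    return sum(f(x, (i + 0.5) * h) for i in range(n)) * h

def cond_prob(c, d, x, n=10_000):
    # P(c < Y < d | X = x) = ∫[c, d] f(x, y) / f1(x) dy
    h = (d - c) / n
    m = f1(x)
    return sum(f(x, c + (i + 0.5) * h) / m for i in range(n)) * h

print(cond_prob(0.0, 0.5, 0.5))  # analytically 3/8 = 0.375 for this density
```

For this density f1(x) = x + 1/2, so f(y | 1/2) = 1/2 + y and the conditional probability above integrates to 3/8, which the numerical value reproduces.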
Example 4
The joint probability function of two discrete random variables X and Y is given by

    f(x, y) = (2x + y)/42,  x = 0, 1, 2; y = 0, 1, 2, 3
            = 0,            otherwise

Find
a) f(y | 2)
b) P(Y = 1 | X = 2)
Solution.
a) f(y | 2) = f(2, y) / f1(2) = P(X = 2, Y = y) / P(X = 2).

But

    f1(2) = Σ[y = 0 to 3] f(2, y) = Σ[y = 0 to 3] (4 + y)/42 = (1/42)(4 + 5 + 6 + 7) = 22/42 = 11/21

and

    f(2, y) = P(X = 2, Y = y) = (4 + y)/42.

Hence

    f(y | 2) = [(4 + y)/42] / (11/21) = (4 + y)/22.

b) P(Y = 1 | X = 2) = f(1 | 2) = (4 + 1)/22 = 5/22.
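Example 4 can be checked with exact rational arithmetic. This is a minimal sketch; the names f, f1_2, and cond are my own.

```python
from fractions import Fraction

# Exact check of Example 4: f(x, y) = (2x + y)/42
# for x = 0, 1, 2 and y = 0, 1, 2, 3.

def f(x, y):
    return Fraction(2 * x + y, 42)

# marginal of X at x = 2: sum the joint pf over y
f1_2 = sum(f(2, y) for y in range(4))
print(f1_2)    # 11/21

# conditional probability function f(y | 2) = f(2, y) / f1(2)
cond = {y: f(2, y) / f1_2 for y in range(4)}
print(cond[1])  # 5/22, i.e. P(Y = 1 | X = 2)
```

The conditional probabilities sum to 1 over y = 0, 1, 2, 3, as any probability function must.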
Example 5
The joint density function of two continuous random variables X and Y is

    f(x, y) = xy + 3/4,  0 < x < 1; 0 < y < 1
            = 0,         otherwise

Find
a) f(y | x)
b) P(Y ≥ 1/2 | 1/2 < X < 1/2 + dx)

Solution.
a) For 0 < x < 1,

    f1(x) = ∫[0, 1] f(x, y) dy = ∫[0, 1] (xy + 3/4) dy = x/2 + 3/4

and so

    f(y | x) = f(x, y) / f1(x) = (xy + 3/4) / (x/2 + 3/4) = (4xy + 3) / (2x + 3),  0 < y < 1
             = 0,  other y

b) Putting x = 1/2 gives f(y | 1/2) = (3 + 2y)/4 for 0 < y < 1, so

    P(Y ≥ 1/2 | 1/2 < X < 1/2 + dx) = ∫[1/2, 1] f(y | 1/2) dy = ∫[1/2, 1] (3 + 2y)/4 dy = 9/16.
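Both parts of Example 5 can be confirmed numerically with a simple midpoint rule; the helper names f1 and tail_prob are my own.

```python
# Numerical check of Example 5: f(x, y) = xy + 3/4 on the unit square.

def f(x, y):
    return x * y + 0.75 if 0 < x < 1 and 0 < y < 1 else 0.0

def f1(x, n=10_000):
    # marginal density of X; analytically x/2 + 3/4
    h = 1.0 / n
    return sum(f(x, (i + 0.5) * h) for i in range(n)) * h

def tail_prob(x=0.5, c=0.5, n=10_000):
    # P(Y >= c | X = x) = ∫[c, 1] f(x, y) / f1(x) dy
    h = (1.0 - c) / n
    m = f1(x)
    return sum(f(x, c + (i + 0.5) * h) / m for i in range(n)) * h

print(f1(0.5))      # ≈ 1.0, matching 1/4 + 3/4
print(tail_prob())  # ≈ 0.5625, matching 9/16
```

Since f(y | 1/2) = (3 + 2y)/4 is linear in y, the midpoint rule reproduces the analytic answer 9/16 essentially exactly.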
4.1.4 Variance for Joint Distributions (Covariance)
Let X and Y be two continuous random variables having joint density function
f(x, y). Then the means or expectations of X and Y are given by:
    μX = E(X) = ∫∫ x f(x, y) dx dy,   μY = E(Y) = ∫∫ y f(x, y) dx dy,

and the variances by

    σX² = E[(X − μX)²] = ∫∫ (x − μX)² f(x, y) dx dy,
    σY² = E[(Y − μY)²] = ∫∫ (y − μY)² f(x, y) dx dy.
Another parameter that arises in the case of two variables X and Y is the covariance, defined by

    σXY = Cov(X, Y) = E[(X − μX)(Y − μY)] = ∫∫ (x − μX)(y − μY) f(x, y) dx dy.

If X and Y are discrete, the integrals are replaced by sums:

    μX = Σx Σy x f(x, y),   μY = Σx Σy y f(x, y),

    σXY = Σx Σy (x − μX)(y − μY) f(x, y),

where the sums are taken over the discrete values of X and Y.
The covariance satisfies, in particular:
1. σXY = E(XY) − E(X)E(Y) = E(XY) − μX μY
4. |σXY| ≤ σX σY
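These two properties can be checked exactly on the discrete distribution of Example 4, f(x, y) = (2x + y)/42. A minimal sketch with my own variable names:

```python
from fractions import Fraction

# Check σXY = E(XY) - E(X)E(Y) and |σXY| <= σX σY
# for f(x, y) = (2x + y)/42, x = 0, 1, 2; y = 0, 1, 2, 3.

def f(x, y):
    return Fraction(2 * x + y, 42)

xs, ys = range(3), range(4)
mu_x = sum(x * f(x, y) for x in xs for y in ys)
mu_y = sum(y * f(x, y) for x in xs for y in ys)
e_xy = sum(x * y * f(x, y) for x in xs for y in ys)

# covariance from the definition E[(X - μX)(Y - μY)]
cov_def = sum((x - mu_x) * (y - mu_y) * f(x, y) for x in xs for y in ys)
print(cov_def, e_xy - mu_x * mu_y)   # the two forms agree

# property 4, squared to stay in exact rational arithmetic
sigma_x2 = sum((x - mu_x) ** 2 * f(x, y) for x in xs for y in ys)
sigma_y2 = sum((y - mu_y) ** 2 * f(x, y) for x in xs for y in ys)
print(cov_def ** 2 <= sigma_x2 * sigma_y2)   # True
```

For this distribution μX = 29/21, μY = 13/7, E(XY) = 17/7, so σXY = 17/7 − (29/21)(13/7) = −20/147.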
4.2 INDEPENDENCE OF RANDOM VARIABLES
Suppose that X and Y are discrete random variables. If the events X = x and Y = y are independent events for all x and y, then we say that X and Y are independent random variables. In that case

P(X = x, Y = y) = P(X = x) P(Y = y), or equivalently f(x, y) = f1(x) f2(y). Conversely, if for all x and y the joint probability function f(x, y) can be expressed as the product of a function of x alone and a function of y alone (which are then the marginal probability functions of X and Y), then X and Y are independent.

If X and Y are continuous random variables, we say that they are independent random variables if the events X ≤ x and Y ≤ y are independent for all x and y, i.e.

P(X ≤ x, Y ≤ y) = P(X ≤ x) P(Y ≤ y), or equivalently, F(x, y) = F1(x) F2(y), where F1(x) and F2(y) are the marginal distribution functions of X and Y.

Conversely, X and Y are independent random variables if for all x and y their joint density function f(x, y) can be expressed as the product of a function of x alone, f1(x), and a function of y alone, f2(y), and these are the marginal density functions of X and Y respectively.
Example 1
Show that the random variables X and Y whose joint probability distribution
function is
    f(x, y) = (2x + y)/42,  x = 0, 1, 2; y = 0, 1, 2, 3
            = 0,            otherwise

are not independent (i.e., are dependent).
Solution
The marginal probability function of X is

    f1(x) = Σ[y = 0 to 3] f(x, y), so that

    f1(0) = 6/42 = 1/7,   f1(1) = 14/42 = 1/3,   f1(2) = 22/42 = 11/21.

Similarly, the marginal probability function of Y is

    f2(y) = Σ[x = 0 to 2] f(x, y), so that

    f2(0) = 6/42 = 1/7,   f2(1) = 9/42 = 3/14,   f2(2) = 12/42 = 2/7,   f2(3) = 15/42 = 5/14.

But P(X = 0, Y = 0) = f(0, 0) = (1/42)(0 + 0) = 0, whereas f1(0) f2(0) = (1/7)(1/7) = 1/49 ≠ 0. Since f(x, y) ≠ f1(x) f2(y) at this point, X and Y are not independent.

Or consider P(X = 1, Y = 1) = 3/42 = 1/14 and f1(1) f2(1) = 1/3 × 3/14 = 1/14, and so on: equality may hold at individual points, but the single failure at (0, 0) is enough to establish dependence.
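The solution above amounts to tabulating both marginals and comparing f(x, y) with f1(x) f2(y) at every point, which is quick to do exactly; the dictionary names below are my own.

```python
from fractions import Fraction

# Check of Example 1: compare f(x, y) = (2x + y)/42 with the
# product of its marginals at every point of the support.

def f(x, y):
    return Fraction(2 * x + y, 42)

xs, ys = range(3), range(4)
f1 = {x: sum(f(x, y) for y in ys) for x in xs}   # marginal of X
f2 = {y: sum(f(x, y) for x in xs) for y in ys}   # marginal of Y

mismatches = [(x, y) for x in xs for y in ys if f(x, y) != f1[x] * f2[y]]
print(f(0, 0), f1[0] * f2[0])   # 0 vs 1/49: the product rule fails here
print(len(mismatches) > 0)      # True: X and Y are dependent
```

Note that the product rule does hold at some points, e.g. (1, 1), which is why a full tabulation (or a single explicit counterexample such as (0, 0)) is needed.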