Actual Probability
k = 1: P[|X − 1.5| < 1(0.866)] > 0 vs. P(0.63 ≤ X ≤ 2.4) ⇒ P(1 ≤ X ≤ 2) = 3/4.
k = 2: P[|X − 1.5| < 2(0.866)] > 3/4 vs. P(−0.23 ≤ X ≤ 3.2) ⇒ P(0 ≤ X ≤ 3) = 1.
k = 3: P[|X − 1.5| < 3(0.866)] > 8/9 vs. P(−1.1 ≤ X ≤ 4.1) ⇒ P(0 ≤ X ≤ 3) = 1.
b) P(4 < X < 20) = P(−8 < X − 12 < 8) = P(|X − 12| < 8) > 1 − 1/2² = 3/4.
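The k = 1, 2, 3 comparison above is easy to confirm numerically. The sketch below assumes, from μ = 1.5 and σ ≈ 0.866, that X counts heads in 3 tosses of a fair coin (binomial with n = 3, p = 1/2); that modeling assumption is inferred here, not stated above.

from fractions import Fraction
from math import comb, sqrt

# Assumed model: X ~ Binomial(3, 1/2), so mu = 1.5 and sigma = sqrt(3/4) ~ 0.866.
pmf = {x: Fraction(comb(3, x), 8) for x in range(4)}
mu, sigma = 1.5, sqrt(0.75)

for k in (1, 2, 3):
    bound = max(0.0, 1 - 1 / k**2)                 # Chebyshev lower bound on P(|X - mu| < k*sigma)
    lo, hi = mu - k * sigma, mu + k * sigma
    actual = float(sum(p for x, p in pmf.items() if lo < x < hi))
    print(f"k={k}: Chebyshev bound {bound:.3f}, actual {actual:.3f}")
# k=1: bound 0.000, actual 0.750; k=2: bound 0.750, actual 1.000; k=3: bound 0.889, actual 1.000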
2.5 Jointly Distributed Random Variables
We are often interested in how two or more random variables are related.
For example, we may be interested in height versus weight or success in
college with high school grade point average and scholastic achievement
tests. Jointly distributed RVs pervade correlation analysis and multiple
regression. We define the continuous joint cumulative distribution function
of two RVs X and Y (bivariate distribution) as
F_XY(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(x, y) dy dx
The double integral is simply single integrals evaluated in turn, and f(x, y) is the joint density of the RVs. The individual density functions are called marginal densities. We can find f_X(x) by integrating the joint density over all the y values:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy.
Solution (for the joint density f(x, y) = 4xy, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1)
a) ∫_0^1 ∫_0^1 4xy dy dx = ∫_0^1 [2xy²]_0^1 dx = ∫_0^1 2x dx = 1.
b) f_X(x) = ∫_0^1 4xy dy = 2x for 0 ≤ x ≤ 1, and by symmetry f_Y(y) = 2y for 0 ≤ y ≤ 1.
Note that f_XY(x, y) = f_X(x) ∗ f_Y(y), implying that X and Y are independent RVs.
c) E(X) = ∫_0^1 ∫_0^1 x ∗ 4xy dy dx = ∫_0^1 [2x²y²]_0^1 dx = ∫_0^1 2x² dx = 2/3.
Observe that after integrating with respect to y, the resulting integral is the expression for the expected value of the marginal density of X,
E(X) = ∫_0^1 x(2x) dx = ∫_0^1 2x² dx = [2x³/3]_0^1 = 2/3.
d) E(XY) = ∫_0^1 ∫_0^1 xy ∗ 4xy dx dy = ∫_0^1 [4x³y²/3]_0^1 dy = ∫_0^1 (4y²/3) dy = 4/9.
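These four calculations can be checked symbolically. A minimal sketch with sympy, using the joint density 4xy on the unit square from the solution above:

import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = 4 * x * y                                        # joint density on 0 <= x, y <= 1

total = sp.integrate(f, (y, 0, 1), (x, 0, 1))        # a) integrates to 1
fX = sp.integrate(f, (y, 0, 1))                      # b) marginal of X: 2*x
EX = sp.integrate(x * f, (y, 0, 1), (x, 0, 1))       # c) E(X) = 2/3
EXY = sp.integrate(x * y * f, (y, 0, 1), (x, 0, 1))  # d) E(XY) = 4/9
print(total, fX, EX, EXY)                            # 1 2*x 2/3 4/9
# Since X and Y are independent, E(XY) equals E(X)*E(Y) = (2/3)*(2/3) = 4/9.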
EXAMPLE 2.29 Given joint density f(x, y) = 1/y, 0 < x < y < 1, compute
a) f_X(x) and f_Y(y),
b) P(X ≤ 1/2),
c) P(Y ≤ 1/4),
d) P(X ≤ 1/4, Y ≤ 1/2),
e) P(X > 1/4, Y > 1/2),
f) P(X ≥ 1/4 or Y ≥ 1/2).
Solution
a) f_X(x) = ∫_x^1 (1/y) dy = −ln x, 0 < x < 1; f_Y(y) = ∫_0^y (1/y) dx = 1, 0 < y < 1.
b) P(X ≤ 1/2) = ∫_0^{1/2} −ln x dx = [−(x ln x − x)]_0^{1/2} = −(0.5 ln 0.5 − 0.5)
= 0.84657.
c) P(Y ≤ 1/4) = ∫_0^{1/4} 1 dy = 1/4.
d) Method I: one integration using dy then dx:
P(X ≤ 1/4, Y ≤ 1/2) = ∫_0^{1/4} ∫_x^{1/2} (1/y) dy dx = ∫_0^{1/4} (ln 0.5 − ln x) dx = [x ln 0.5 − (x ln x − x)]_0^{1/4}
= 1/4 + (1/4) ln 2 ≈ 0.4233.
Figure 2.17 One Integration Using dy dx
Method II: two integrations using dx then dy:
P(X ≤ 1/4, Y ≤ 1/2) = ∫_0^{1/4} ∫_0^y (1/y) dx dy + ∫_{1/4}^{1/2} ∫_0^{1/4} (1/y) dx dy = 0.25 + ∫_{1/4}^{1/2} 1/(4y) dy
= 0.25 + (1/4) ln 2 ≈ 0.4233.
e) Method I: one integration using dx then dy:
P(X > 1/4, Y > 1/2) = ∫_{1/2}^1 ∫_{1/4}^y (1/y) dx dy = ∫_{1/2}^1 (1 − 1/(4y)) dy = [y − (1/4) ln y]_{1/2}^1
= 1/2 − (1/4) ln 2 ≈ 0.3267.
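The marginals and probabilities in Example 2.29 can also be confirmed with a short symbolic sketch (sympy, integrating over the triangle 0 < x < y < 1):

import sympy as sp

x, y = sp.symbols("x y", positive=True)
half, quarter = sp.Rational(1, 2), sp.Rational(1, 4)
f = 1 / y                                            # joint density on 0 < x < y < 1

fX = sp.integrate(f, (y, x, 1))                      # a) -log(x)
fY = sp.integrate(f, (x, 0, y))                      # a) 1
pb = sp.integrate(-sp.log(x), (x, 0, half))          # b) 1/2 + log(2)/2 ~ 0.84657
pd = sp.integrate(f, (y, x, half), (x, 0, quarter))  # d) 1/4 + log(2)/4 ~ 0.4233
pe = sp.integrate(f, (x, quarter, y), (y, half, 1))  # e) 1/2 - log(2)/4 ~ 0.3267
print(fX, fY, pb, pd, pe)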
For discrete RVs X and Y, the joint density is f_XY(x, y) = P(X = x, Y = y).
Discrete joint density functions are often shown in tables. The graph of the
probability mass or density function f (x , y )=P( X=x ,Y = y ) for Table 2.9 is
shown in Figure 2.19.
Let RV X be the number of heads and RV Y the number of runs in 3 tosses of a fair coin. HHH is a single run of H, whereas HTH has 3 runs, H, T, H, and THH has 2 runs.
HHH →1, HHT → 2, HTH → 3, HTT → 2,
THH → 2, THT → 3, TTH → 2, TTT → 1.
            X
Y        0      1      2      3     f(y)
1       1/8     0      0     1/8    2/8
2        0     2/8    2/8     0     4/8
3        0     1/8    1/8     0     2/8
Solution
a) P(X ≥ 2, Y ≥ 2) = f(2, 2) + f(2, 3) + f(3, 2) + f(3, 3) = 2/8 + 1/8 + 0 + 0 = 3/8.
b) Notice that the marginal density functions are in the margins of Table
2.10.
x       0      1      2      3
f(x)   1/8    3/8    3/8    1/8

y       1      2      3
f(y)   2/8    4/8    2/8
c) E( X )=0 ∗1/8+ 1∗ 3/8+2 ∗3 /8+3 ∗1/8=12/8=3 /2.
E(Y )=1∗ 2/8+2 ∗ 4/8+3 ∗ 2/8=2.
d) E ( X 2) =(1 /8)(0+3+12+9)=24 /8=3;
E ( Y 2 ) =(1/8)(2+16+ 18)=9/2 ;
V ( X)=E ( X 2 ) − E2 ( X )⇒ V ( X )=3 −9 /4=3/4 ; V (Y )=9 /2 − 4=1/2.
e) E( X Y )=(0 ∗ 1∗ 1/8)+(1 ∗1 ∗0)+(2∗ 1∗ 0)+(3 ∗ 1∗ 1/8)
+(0 ∗2 ∗ 0)+(1∗ 2 ∗2/8)+(2∗ 2∗ 2/8)+(3 ∗ 2∗ 0)
+(0 ∗3 ∗ 0)+(1 ∗3 ∗1 /8)+(2∗ 3 ∗1/8)+(3∗ 3 ∗0)=3 .
f) Observe that E(XY) = 3 = E(X) ∗ E(Y) = 3/2 ∗ 2. The fact that E(XY) = E(X) ∗ E(Y) does not necessarily imply that the RVs X and Y are independent. However, if X and Y are independent, then P(X = x, Y = y) = P(X = x) ∗ P(Y = y) and E(XY) = E(X)E(Y).
Similarly, continuous RVs X and Y are independent if f(x, y) = f_X(x) ∗ f_Y(y).
From the table, however,
P(X = 1, Y = 1) = 0 ≠ P(X = 1) ∗ P(Y = 1) = 3/8 ∗ 2/8 = 3/32.
RVs X and Y are dependent. If you were told that there was only one run, you
would know that the outcome is either HHH or TTT; the outcome value of
RV Y affects the probability outcome of RV X .
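All of the table entries and moments above can be reproduced by enumerating the 8 equally likely outcomes; a small Python sketch:

from fractions import Fraction
from itertools import groupby, product

pmf = {}                                              # joint density of (X, Y)
for seq in product("HT", repeat=3):                   # 8 equally likely outcomes
    x = seq.count("H")                                # X = number of heads
    y = sum(1 for _ in groupby(seq))                  # Y = number of runs
    pmf[(x, y)] = pmf.get((x, y), Fraction(0)) + Fraction(1, 8)

EX = sum(x * p for (x, y), p in pmf.items())          # 3/2
EY = sum(y * p for (x, y), p in pmf.items())          # 2
EXY = sum(x * y * p for (x, y), p in pmf.items())     # 3 = E(X)*E(Y)
print(EX, EY, EXY)
print(pmf.get((1, 1), 0), Fraction(3, 8) * Fraction(2, 8))  # 0 vs 3/32, so X and Y are dependent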
Table 2.11 Ways of selecting x red and y white marbles (each entry is to be divided by 220)

            X
Y        0      1      2      3     f(y)
0        1     15     30     10      56
1       12     60     40      0     112
2       18     30      0      0      48
3        4      0      0      0       4
f(x)    35    105     70     10     220
EXAMPLE 2.31
In an urn are 5 red, 4 white, and 3 blue marbles. Three marbles are
randomly selected. Let RV X be the number of red marbles and RV Y the
number of white marbles.
a) Create the joint density function f (x , y ).
b) Find the marginal densities and compute E( X ), E(Y ) and E( X Y ).
c) Find P( X+ Y ≤ 1).
Solution
a) There are 12C3 = 220 ways to select 3 of the 12 marbles. Each entry in Table 2.11 is then to be divided by 220. For example, P(X = 2, Y = 1) = (5C2)(4C1)/220 = 40/220.
b) The marginal densities are shown in the margins.
E(X) = [0 ∗ 35 + 1 ∗ 105 + 2 ∗ 70 + 3 ∗ 10]/220 = 275/220 = 5/4;
E(Y) = [0 ∗ 56 + 1 ∗ 112 + 2 ∗ 48 + 3 ∗ 4]/220 = 1;
E(XY) = Σ x ∗ y ∗ P(X = x, Y = y) = [1 ∗ 1 ∗ 60 + 2 ∗ 1 ∗ 40 + 1 ∗ 2 ∗ 30]/220 = 200/220 = 10/11.
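Table 2.11 and these expectations follow from direct counting; a sketch using math.comb (it also evaluates part c), P(X + Y ≤ 1)):

from fractions import Fraction
from math import comb

N = comb(12, 3)                                       # 220 equally likely samples
pmf = {}
for x in range(4):                                    # red marbles drawn (of 5)
    for y in range(4 - x):                            # white marbles drawn (of 4); the rest are blue (of 3)
        ways = comb(5, x) * comb(4, y) * comb(3, 3 - x - y)
        if ways:
            pmf[(x, y)] = Fraction(ways, N)

EX = sum(x * p for (x, y), p in pmf.items())          # 275/220 = 5/4
EY = sum(y * p for (x, y), p in pmf.items())          # 1
EXY = sum(x * y * p for (x, y), p in pmf.items())     # 200/220 = 10/11
P_c = sum(p for (x, y), p in pmf.items() if x + y <= 1)   # part c): 28/220 = 7/55
print(EX, EY, EXY, P_c)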
In an urn are 2 red, 1 white, and 1 blue marble. A sample of 2 is selected.
Create the joint density function f (x , y , z) for RVs X , Y , and Z being the
number of corresponding colored marbles selected. Find the marginal
densities of X , Y , and Z .
Compute a) P( X ≤ 1), b) P( X=1), c) P( X=1 ∣Y =1),
d) P(X = 1 or Z = 1),
e) P( X=1 ∣Y =1 , Z =0),
f) E( X Y Z),
g) E( X ∣Y =1)
Solution
The 4C2 = 6 equally likely samples give the joint density:
(X, Y, Z)     (2, 0, 0)   (1, 1, 0)   (1, 0, 1)   (0, 1, 1)
f(x, y, z)       1/6         2/6         2/6         1/6
The marginal densities are f_X(0) = 1/6, f_X(1) = 4/6, f_X(2) = 1/6, and f_Y(0) = f_Y(1) = f_Z(0) = f_Z(1) = 3/6.
a) P(X ≤ 1) = 1 − P(X = 2) = 5/6.
b) P(X = 1) = 4/6 = 2/3.
c) P(X = 1 ∣ Y = 1) = P(X = 1, Y = 1)/P(Y = 1) = (2/6)/(3/6) = 2/3.
d) P(X = 1 or Z = 1) = P(X = 1) + P(Z = 1) − P(X = 1, Z = 1)
= 4/6 + 3/6 − 2/6 = 5/6.
e) P(X = 1 ∣ Y = 1, Z = 0) = P(X = 1, Y = 1, Z = 0)/P(Y = 1, Z = 0) = (2/6)/(2/6) = 1.
f) E(XYZ) = 2 ∗ 0 ∗ 0 ∗ 1/6 + 0 ∗ 1 ∗ 1 ∗ 1/6 + 1 ∗ 1 ∗ 0 ∗ 2/6 + 1 ∗ 0 ∗ 1 ∗ 2/6 = 0.
g) E( X ∣Y =1)=2/3 .
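A quick enumeration of the 6 equally likely samples reproduces the joint density and these answers:

from fractions import Fraction
from itertools import combinations

marbles = ["R", "R", "W", "B"]
pmf = {}
for pair in combinations(range(4), 2):                # the 6 equally likely samples of size 2
    colors = [marbles[i] for i in pair]
    xyz = (colors.count("R"), colors.count("W"), colors.count("B"))   # (X, Y, Z)
    pmf[xyz] = pmf.get(xyz, Fraction(0)) + Fraction(1, 6)

def prob(event):
    """Probability of an event given as a predicate on (x, y, z)."""
    return sum(p for (x, y, z), p in pmf.items() if event(x, y, z))

print(pmf)                                            # joint density f(x, y, z)
print(prob(lambda x, y, z: x <= 1))                   # a) 5/6
print(prob(lambda x, y, z: x == 1 or z == 1))         # d) 5/6
print(prob(lambda x, y, z: (x, y, z) == (1, 1, 0))
      / prob(lambda x, y, z: y == 1 and z == 0))      # e) 1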
For the joint density f(x, y) = (2/3)(x + 2y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1:
f_Y(y) = ∫_0^1 (2/3)(x + 2y) dx = (2/3)(2y + 1/2) on [0, 1].
As f(x, y) ≠ f_X(x) ∗ f_Y(y), X and Y are dependent.
Whenever we sample randomly from a distribution, each member of our sample is itself an RV with that distribution.
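A symbolic check of this marginal and the dependence claim (a sympy sketch, assuming the unit-square support indicated above):

import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = sp.Rational(2, 3) * (x + 2 * y)          # joint density on 0 <= x, y <= 1

fX = sp.integrate(f, (y, 0, 1))              # marginal of X: 2*x/3 + 2/3
fY = sp.integrate(f, (x, 0, 1))              # marginal of Y: 4*y/3 + 1/3, i.e. (2/3)(2y + 1/2)
print(fX, fY)
print(sp.simplify(f - fX * fY) == 0)         # False, so X and Y are dependent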
Let discrete joint density f (x , y ) for RVs X and Y be given as shown below,
where RV Y = X 2. Show that C (X , Y )=0 even though X and Y are not
independent.
Solution
            X
Y        −1      0      1     f(y)
0         0     1/3     0     1/3
1        1/3     0     1/3    2/3
f(x)     1/3    1/3    1/3     1
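With C(X, Y) = E(XY) − E(X)E(Y), a short sketch confirms that the covariance vanishes even though the table fails the independence check:

from fractions import Fraction

# Joint pmf from the table above: X is uniform on {-1, 0, 1} and Y = X**2.
pmf = {(-1, 1): Fraction(1, 3), (0, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}

EX = sum(x * p for (x, y), p in pmf.items())            # 0
EY = sum(y * p for (x, y), p in pmf.items())            # 2/3
EXY = sum(x * y * p for (x, y), p in pmf.items())       # 0
print(EXY - EX * EY)                                    # C(X, Y) = 0
pX1 = sum(p for (x, y), p in pmf.items() if x == 1)     # 1/3
pY1 = sum(p for (x, y), p in pmf.items() if y == 1)     # 2/3
print(pmf[(1, 1)] == pX1 * pY1)                         # False: 1/3 != 2/9, so X and Y are dependent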
⇒ ρ(X , Y ) ≤ 1
Hence,
−1 ≤ ρ(X , Y )≤ 1
The correlation coefficient measures the linear relationship between RVs X
and Y. Notice that the denominator cannot be negative, implying that ρ and
C(X, Y) must have the same sign.
If ρ=0, there is neither a positive nor a negative linear relationship,
indicating that the two RVs are not linearly correlated, C (X , Y )=0. If ρ is
positive, then Y tends to increase as X increases; if ρ is negative, then Y
tends to decrease as X increases.
Solution
X -1 0 1
Y -2 0 2
EXAMPLE 2.36 Calculate ρ(X, Y) from the joint density table for RVs X and Y.
            Y
X        3      5      7     f(x)
1       1/4     0      0     1/4
2        0     1/2     0     1/2
3        0      0     1/4    1/4
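A sketch of the computation, using ρ(X, Y) = C(X, Y)/(σ_X σ_Y) with the entries of the table above:

from fractions import Fraction
from math import sqrt

# Joint pmf read off the Example 2.36 table (all other entries are 0).
pmf = {(1, 3): Fraction(1, 4), (2, 5): Fraction(1, 2), (3, 7): Fraction(1, 4)}

def E(g):
    """Expected value of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)          # 2 and 5
cov = E(lambda x, y: x * y) - EX * EY                  # 11 - 10 = 1
var_x = E(lambda x, y: x * x) - EX**2                  # 1/2
var_y = E(lambda x, y: y * y) - EY**2                  # 2
print(cov / sqrt(var_x * var_y))                       # rho = 1.0, since Y = 2X + 1 exactly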