
Solution

a) E(X) = 1.5; V(X) = 3/4 with σ = 0.866. Chebyshev's bound vs. the actual probability:

k = 1: P[|X − 1.5| < 1(0.866)] ≥ 0 vs. P(0.63 ≤ X ≤ 2.4) ⇒ P(1 ≤ X ≤ 2) = 3/4.
k = 2: P[|X − 1.5| < 2(0.866)] ≥ 3/4 vs. P(−0.23 ≤ X ≤ 3.2) ⇒ P(0 ≤ X ≤ 3) = 1.
k = 3: P[|X − 1.5| < 3(0.866)] ≥ 8/9 vs. P(−1.1 ≤ X ≤ 4.1) ⇒ P(0 ≤ X ≤ 3) = 1.

b) P(4 < X < 20) = P(−8 < X − 12 < 8) = P(|X − 12| < 8) ≥ 1 − 1/2² = 3/4.
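The bound-versus-actual comparison in part (a) is easy to verify numerically. The moments E(X) = 1.5 and V(X) = 3/4 match a Binomial(3, 1/2) RV (the number of heads in three fair flips); that distribution is an assumption of this Python sketch, which is an added check rather than part of the original solution.

from math import comb, sqrt

# Assumption: X ~ Binomial(3, 1/2), consistent with E(X) = 1.5, V(X) = 3/4.
pmf = {x: comb(3, x) / 8 for x in range(4)}
mu, sigma = 1.5, sqrt(0.75)

for k in (1, 2, 3):
    bound = max(0.0, 1 - 1 / k**2)   # Chebyshev: P(|X - mu| < k*sigma) >= 1 - 1/k^2
    actual = sum(p for x, p in pmf.items() if abs(x - mu) < k * sigma)
    print(f"k={k}: bound {bound:.4f} <= actual {actual:.4f}")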
2.5 Jointly Distributed Random Variables
We are often interested in how two or more random variables are related.
For example, we may be interested in height versus weight, or in success in
college versus high school grade point average and scholastic achievement
test scores. Jointly distributed RVs pervade correlation analysis and multiple
regression. We define the continuous joint cumulative distribution function
of two RVs X and Y (bivariate distribution) as
F_XY(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(x, y) dy dx
The double integral is evaluated as iterated single integrals, and f(x, y) is the
joint density of the RVs. The individual density functions are called marginal
densities. We can find f_X(x) by integrating the joint density over all the y
values since

F_X(x) = P(X ≤ x) = P(X ≤ x, Y ≤ ∞) = F_XY(x, ∞)

Similarly,

F_Y(y) = P(Y ≤ y) = P(X ≤ ∞, Y ≤ y) = F_XY(∞, y)

Thus we have

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy  and  f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx   (2−6)
Whereas for a single RV the probability of an event is the area under the
curve of the density function, for two RVs the volume under the surface of
the joint density function represents the joint probability.

For a joint density f_XY:
i) f(x, y) ≥ 0 for all (x, y);
ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1;
iii) P(X < x, Y < y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(x, y) dx dy.
Given joint density f(x, y) = 4xy for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1:
a) Verify that the integral of the joint density is 1.
b) Find the marginal densities f_X(x) and f_Y(y).
c) Find E(X) from the joint density and also from the marginal density.
d) Find E(XY).

Solution
a) ∫_0^1 ∫_0^1 4xy dy dx = ∫_0^1 (2xy²)|_0^1 dx = ∫_0^1 2x dx = 1.
b) f_X(x) = ∫_0^1 4xy dy = 2x for 0 ≤ x ≤ 1, and by symmetry f_Y(y) = 2y for 0 ≤ y ≤ 1.
Note that f_XY(x, y) = f_X(x) ∗ f_Y(y), implying that X and Y are independent RVs.
c) E(X) = ∫_0^1 ∫_0^1 4x²y dy dx = ∫_0^1 (2x²y²)|_0^1 dx = ∫_0^1 2x² dx = 2/3.

Observe that after integrating with respect to y, the resulting integral is the
expression for the expected value computed from the marginal density of X,

E(X) = ∫_0^1 2x² dx = (2x³/3)|_0^1 = 2/3.

d) E(XY) = ∫_0^1 ∫_0^1 xy ∗ 4xy dx dy = ∫_0^1 (4x³y²/3)|_0^1 dy = ∫_0^1 (4y²/3) dy = 4/9.
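These integrals are quick to double-check symbolically. A minimal Python sketch using SymPy (an added illustration; SymPy is assumed available, it is not part of the original text):

import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = 4 * x * y                                        # joint density on the unit square

print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))         # a) total probability -> 1
fx = sp.integrate(f, (y, 0, 1))                      # b) marginal of X -> 2*x
fy = sp.integrate(f, (x, 0, 1))                      #    marginal of Y -> 2*y
print(sp.integrate(x * fx, (x, 0, 1)))               # c) E(X) -> 2/3
print(sp.integrate(x * y * f, (x, 0, 1), (y, 0, 1))) # d) E(XY) -> 4/9
assert sp.simplify(f - fx * fy) == 0                 # f = fX * fY: X, Y independent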
0 3
EXAMPLE 2.29 Given joint density f(x, y) = 1/y, 0 < x < y < 1, compute
a) f_X(x) and f_Y(y),
b) P(X ≤ 1/2),
c) P(Y ≤ 1/4),
d) P(X ≤ 1/4, Y ≤ 1/2),
e) P(X > 1/4, Y > 1/2),
f) P(X ≥ 1/2 or Y ≥ 1/2).

Solution
a) f_X(x) = ∫_x^1 dy/y = −ln x, 0 < x < 1; f_Y(y) = ∫_0^y dx/y = 1, 0 < y < 1.
b) P(X ≤ 1/2) = ∫_0^{1/2} −ln x dx = −(x ln x − x)|_0^{1/2} = −(0.5 ln 0.5 − 0.5) = 0.84657.
c) P(Y ≤ 1/4) = ∫_0^{1/4} 1 dy = 1/4.
d) Method I: one integration, using dy then dx:

P(X ≤ 1/4, Y ≤ 1/2) = ∫_0^{1/4} ∫_x^{1/2} (1/y) dy dx = ∫_0^{1/4} (ln 0.5 − ln x) dx
= [x ln 0.5 − (x ln x − x)]_0^{1/4} = 0.25 ln 2 + 0.25 = 0.4233.

Figure 2.17 One Integration Using dy dx

Method II: two integrations, using dx then dy:

P(X ≤ 1/4, Y ≤ 1/2) = ∫_0^{1/4} ∫_0^y (1/y) dx dy + ∫_{1/4}^{1/2} ∫_0^{1/4} (1/y) dx dy
= 0.25 + ∫_{1/4}^{1/2} 1/(4y) dy = 0.25 + 0.25 ln 2 = 0.4233.

e) Method I: one integration, using dx then dy:

P(X > 1/4, Y > 1/2) = ∫_{1/2}^1 ∫_{1/4}^y (1/y) dx dy = ∫_{1/2}^1 (1 − 1/(4y)) dy
= 0.5 + 0.25 ln 0.5 = 0.3267.

Figure 2.18 One Integration Using dx dy

Method II: two integrations, using dy then dx:

P(X > 1/4, Y > 1/2) = ∫_{1/4}^{1/2} ∫_{1/2}^1 (1/y) dy dx + ∫_{1/2}^1 ∫_x^1 (1/y) dy dx
= 0.1733 + (1 + 0.5 ln 0.5 − 0.5) = 0.3267.

f) P(X ≥ 1/2 or Y ≥ 1/2) = P(X ≥ 1/2) + P(Y ≥ 1/2) − P(X ≥ 1/2, Y ≥ 1/2)
= (1 − 0.84657) + 0.5 − ∫_{1/2}^1 ∫_{1/2}^y (1/y) dx dy
= 0.153 + 0.5 − ∫_{1/2}^1 (1 − 1/(2y)) dy
= 0.153 + 0.5 − (0.5 − 0.347) = 0.5.


Table 2.9 Discrete Joint Density Function

Figure 2.19 Graph of Discrete Joint Density Function


In computing probabilities for joint densities, economy of effort can be
achieved at times by wisely choosing the order of integration, which
depends on the region.
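For instance, part (d) of Example 2.29 can be computed under both integration orders and compared. A Python/SymPy sketch (an added illustration; SymPy is assumed available):

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 1 / y                                    # joint density on 0 < x < y < 1
half, quarter = sp.Rational(1, 2), sp.Rational(1, 4)

# Method I: integrate dy then dx (one piece)
p1 = sp.integrate(sp.integrate(f, (y, x, half)), (x, 0, quarter))

# Method II: integrate dx then dy (region splits into two pieces)
p2 = (sp.integrate(sp.integrate(f, (x, 0, y)), (y, 0, quarter))
      + sp.integrate(sp.integrate(f, (x, 0, quarter)), (y, quarter, half)))

print(p1, float(p1))                         # log(2)/4 + 1/4 ~ 0.4233
print(sp.simplify(p1 - p2))                  # 0: both orders agree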

Discrete Joint Density Functions


We define the discrete joint density (probability mass) function of two RVs X and Y as

f_XY(x, y) = P(X = x, Y = y).
Discrete joint density functions are often shown in tables. The graph of the
probability mass or density function f (x , y )=P( X=x ,Y = y ) for Table 2.9 is
shown in Figure 2.19.

Notice that the sum of the probabilities in the table is 1.


Let RV X be the number of heads in 3 fair coin flips and RV Y the number of
runs in the flip outcomes. Verify the joint density f (x , y ) for RVs X and Y
shown in Table 2.10, and compute
a) P( X ≥ 2 ,Y ≥ 2);
b) the marginal densities of each;
c) E( X ) and E(Y );
d) V ( X) and V (Y );
e) E( X Y ) .
f) Are X and Y independent?

HHH is a single run of H's, whereas HTH has 3 runs (H, T, H) and THH has 2 runs (T, HH).
HHH →1, HHT → 2, HTH → 3, HTT → 2,
THH → 2, THT → 3, TTH → 2, TTT → 1.

Number of Runs Density

Y      1    2    3
P(Y)  2/8  4/8  2/8

Number of Heads Density

X      0    1    2    3
P(X)  1/8  3/8  3/8  1/8

Table 2.10 Discrete Joint Density f(x, y)

            X
        0    1    2    3   f(y)
Y   1  1/8   0    0   1/8  2/8
    2   0   2/8  2/8   0   4/8
    3   0   1/8  1/8   0   2/8
 f(x)  1/8  3/8  3/8  1/8   1

Solution
a) P(X ≥ 2, Y ≥ 2) = f(2,2) + f(2,3) + f(3,2) + f(3,3) = 2/8 + 1/8 + 0 + 0 = 3/8.
b) Notice that the marginal density functions are in the margins of Table
2.10.

x 0 1 2 3

f (x) 1/8 3/8 3/8 1/8

y f ( y)

1 2/8

2 4 /8

3 2/8
c) E( X )=0 ∗1/8+ 1∗ 3/8+2 ∗3 /8+3 ∗1/8=12/8=3 /2.
E(Y )=1∗ 2/8+2 ∗ 4/8+3 ∗ 2/8=2.
d) E ( X 2) =(1 /8)(0+3+12+9)=24 /8=3;
E ( Y 2 ) =(1/8)(2+16+ 18)=9/2 ;
V ( X)=E ( X 2 ) − E2 ( X )⇒ V ( X )=3 −9 /4=3/4 ; V (Y )=9 /2 − 4=1/2.
e) E( X Y )=(0 ∗ 1∗ 1/8)+(1 ∗1 ∗0)+(2∗ 1∗ 0)+(3 ∗ 1∗ 1/8)
+(0 ∗2 ∗ 0)+(1∗ 2 ∗2/8)+(2∗ 2∗ 2/8)+(3 ∗ 2∗ 0)
+(0 ∗3 ∗ 0)+(1 ∗3 ∗1 /8)+(2∗ 3 ∗1/8)+(3∗ 3 ∗0)=3 .
f) Observe that E(XY) = 3 = E(X) ∗ E(Y) = 3/2 ∗ 2. The fact that E(XY) =
E(X) ∗ E(Y) does not necessarily imply that the RVs X and Y are
independent. However, if X and Y are independent, then P(X = x, Y = y) =
P(X = x) ∗ P(Y = y) and E(XY) = E(X)E(Y).
Similarly, continuous RVs X and Y are independent if f(x, y) = f(x) ∗ f(y).
From the table, however,

P(X = 1, Y = 1) = 0 ≠ P(X = 1) ∗ P(Y = 1) = 3/8 ∗ 2/8 = 3/32.
RVs X and Y are dependent. If you were told that there was only one run, you
would know that the outcome is either HHH or TTT; the outcome value of
RV Y affects the probability outcome of RV X .
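The whole joint table can be rebuilt by enumerating the 8 equally likely flip sequences. A brute-force Python sketch (an added check, not part of the original solution):

from itertools import groupby, product
from collections import Counter

joint = Counter()
for seq in product('HT', repeat=3):
    x = seq.count('H')                       # number of heads
    y = len([k for k, _ in groupby(seq)])    # number of runs
    joint[(x, y)] += 1 / 8

EX = sum(x * p for (x, y), p in joint.items())       # 1.5
EY = sum(y * p for (x, y), p in joint.items())       # 2.0
EXY = sum(x * y * p for (x, y), p in joint.items())  # 3.0
print(EX, EY, EXY)

# Dependence: P(X=1, Y=1) = 0, yet P(X=1) * P(Y=1) = 3/8 * 2/8 = 3/32
px1 = sum(p for (x, y), p in joint.items() if x == 1)
py1 = sum(p for (x, y), p in joint.items() if y == 1)
print(joint.get((1, 1), 0.0), px1 * py1)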

Table 2.11 Discrete Joint Density f(x, y) (entries are counts out of 220)

            X
        0    1    2    3   f(y)
Y   0   1   15   30   10    56
    1  12   60   40    0   112
    2  18   30    0    0    48
    3   4    0    0    0     4
 f(x)  35  105   70   10   220

EXAMPLE 2.31
In an urn are 5 red, 4 white, and 3 blue marbles. Three marbles are
randomly selected. Let RV X be the number of red marbles and RV Y the
number of white marbles.
a) Create the joint density function f (x , y ).
b) Find the marginal densities and compute E( X ), E(Y ) and E( X Y ).
c) Find P( X+ Y ≤ 1).

Solution
a) There are 12C3 = 220 ways to select 3 of the 12 marbles, so each count in
Table 2.11 is divided by 220. For example,

P(X = 2, Y = 1) = C(5,2) ∗ C(4,1)/C(12,3) = (10 ∗ 4)/220 = 40/220.

b) The marginal densities are shown in the margins of the table.
E(X) = [0 ∗ 35 + 1 ∗ 105 + 2 ∗ 70 + 3 ∗ 10]/220 = 275/220 = 5/4;
E(Y) = [0 ∗ 56 + 1 ∗ 112 + 2 ∗ 48 + 3 ∗ 4]/220 = 1;
E(XY) = Σ x ∗ y ∗ P(X = x, Y = y) = [1 ∗ 1 ∗ 60 + 2 ∗ 1 ∗ 40 + 1 ∗ 2 ∗ 30]/220 = 200/220 = 10/11.
c) P(X + Y ≤ 1) = [f(0,0) + f(1,0) + f(0,1)]/220 = (1 + 15 + 12)/220 = 28/220 = 7/55.
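Table 2.11 and these expectations follow directly from counting. A Python sketch of the computation (an added illustration):

from math import comb

N = comb(12, 3)                              # 220 equally likely samples
joint = {}
for x in range(4):                           # red marbles selected
    for y in range(4 - x):                   # white marbles selected
        b = 3 - x - y                        # remaining picks are blue
        joint[(x, y)] = comb(5, x) * comb(4, y) * comb(3, b) / N

EX = sum(x * p for (x, y), p in joint.items())       # 275/220 = 5/4
EY = sum(y * p for (x, y), p in joint.items())       # 1
EXY = sum(x * y * p for (x, y), p in joint.items())  # 200/220 = 10/11
P = sum(p for (x, y), p in joint.items() if x + y <= 1)  # 28/220
print(EX, EY, EXY, P)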
In an urn are 2 red, 1 white, and 1 blue marble. A sample of 2 is selected.
Create the joint density function f(x, y, z) for RVs X, Y, and Z being the
number of correspondingly colored marbles selected. Find the marginal
densities of X, Y, and Z.
Compute a) P(X ≤ 1),
b) P(X = 1),
c) P(X = 1 ∣ Y = 1),
d) P(X = 1 or Z = 1),
e) P(X = 1 ∣ Y = 1, Z = 0),
f) E(XYZ),
g) E(X ∣ Y = 1).

Solution

X            2    0    1    1
Y            0    1    1    0
Z            0    1    0    1
P(X, Y, Z)  1/6  1/6  2/6  2/6

The marginal densities of X, Y, and Z are

X     0    1    2      Y     0    1      Z     0    1
f(x) 1/6  4/6  1/6     f(y) 1/2  1/2     f(z) 1/2  1/2
a) P( X ≤ 1)=P(X =0)+ P(X =1)=1/6+4 /6=5/6 from the marginal density
function of X .
b) P( X=1)=4 /6 from the marginal density of X .
c) P( X=1 ∣Y =1)=P( X=1 , Y =1)/ P(Y =1)=(2/6)/(3 /6)=2/3.
d) P(X = 1 or Z = 1) = P(X = 1) + P(Z = 1) − P(X = 1 and Z = 1)
= 4/6 + 3/6 − 2/6 = 5/6.
e) P(X = 1 ∣ Y = 1, Z = 0) = P(X = 1, Y = 1, Z = 0)/P(Y = 1, Z = 0) = (2/6)/(2/6) = 1.
f) E(XYZ) = 2∗0∗0∗1/6 + 0∗1∗1∗1/6 + 1∗1∗0∗2/6 + 1∗0∗1∗2/6 = 0.
g) E( X ∣Y =1)=2/3 .

X            0    1    2

f(x ∣ Y=1)  1/3  2/3   0
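The joint table and the conditional density can be confirmed by enumerating all C(4, 2) = 6 equally likely samples. A Python sketch (an added check; the marble labels are arbitrary):

from itertools import combinations
from collections import Counter

marbles = ['R1', 'R2', 'W', 'B']             # 2 red, 1 white, 1 blue
joint = Counter()
for pick in combinations(marbles, 2):
    x = sum(m.startswith('R') for m in pick) # red count
    y = sum(m == 'W' for m in pick)          # white count
    z = sum(m == 'B' for m in pick)          # blue count
    joint[(x, y, z)] += 1 / 6

py1 = sum(p for (x, y, z), p in joint.items() if y == 1)
cond = {x: p / py1 for (x, y, z), p in joint.items() if y == 1}
print(cond)                                  # {1: 2/3, 0: 1/3}
print(sum(x * p for x, p in cond.items()))   # E(X | Y=1) = 2/3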


2.6 Independence of Jointly Distributed
Random Variables
Two RVs X and Y are said to be independent if f (x , y )=f X (x )∗ f Y ( y ) for all
(x , y ) in the domain. If there are more than 2 RVs , the test for independence
is similar to the test for independence among 3 or more events. They must
be independent pair-wise, 3 -wise, etc.
If the RVs are independent, the joint density function becomes the product
of the marginal density functions. For example, f_XY(x, y) = 4xy = 2x ∗ 2y =
f_X ∗ f_Y. Then

E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dy dx
= ∫_{−∞}^{∞} x f(x) dx ∗ ∫_{−∞}^{∞} y f(y) dy
= E(X) ∗ E(Y)   (2−7)

For independent RVs X and Y, the mean of their product is the product of
their means.
Two discrete RVs X and Y are said to be independent if

P( X=x ,Y = y)=P(X =x)∗ P(Y = y ) for all x - and y -values.


EXAMPLE 2.33 Determine if RVs X and Y are independent given joint
density function

f(x, y) = (2/3)(x + 2y); 0 ≤ x ≤ 1, 0 ≤ y ≤ 1.
Solution

f_X(x) = ∫_0^1 (2/3)(x + 2y) dy = (2/3)(x + 1) on [0, 1];
f_Y(y) = ∫_0^1 (2/3)(x + 2y) dx = (2/3)(2y + 1/2) on [0, 1].

As f(x, y) ≠ f_X(x) ∗ f_Y(y), X and Y are dependent.

Whenever we sample randomly from a distribution, each member of our
sample is a random variable. Because the sample is randomly selected, the
joint density function of the sample is just the product of each sample
member's density function. This fact underlies the importance of a random
sample.
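The marginals in Example 2.33, and the failure of the product test there, can be verified symbolically. A Python/SymPy sketch (an added check; SymPy assumed available):

import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = sp.Rational(2, 3) * (x + 2 * y)          # joint density on the unit square

fx = sp.integrate(f, (y, 0, 1))              # (2/3)*(x + 1)
fy = sp.integrate(f, (x, 0, 1))              # (2/3)*(2*y + 1/2)
print(fx, fy)
print(sp.simplify(f - fx * fy))              # not identically 0 => dependent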

Covariance and Correlation


If two random variables are independent, information about the value of one
does not provide any help in determining the value of the other. But if the
RVs are dependent, then information about the value of one helps determine the
value of the other. We measure this association with correlation. To
define correlation, we first need the covariance between RVs. The
covariance of two random variables is the expected value of the product of
each variable's deviation from its mean; it measures the linear
correspondence between the random variables. The covariance of RVs X and
Y is denoted by C(X, Y), Cov(X, Y), or σ_XY and is defined as
C(X, Y) = E[(X − μ_X)(Y − μ_Y)]   (2−8)

Since E[(X − μ_X)(Y − μ_Y)] = E(XY − Yμ_X − Xμ_Y + μ_Xμ_Y) = E(XY) − μ_Xμ_Y − μ_Xμ_Y + μ_Xμ_Y,

C(X, Y) = E(XY) − E(X)E(Y)   (2−9)
Note that C (X , Y )=C(Y , X ), C(a X , bY )=a b ∗C (X , Y ), and C (X , X )=V ( X).
Recall that if X and Y are independent, E( X Y )=E(X ) E(Y ), and consequently
C (X , Y )=0.
However, it is not necessarily true that if C (X , Y )=0, then X and Y are
independent.

Let discrete joint density f (x , y ) for RVs X and Y be given as shown below,
where RV Y = X 2. Show that C (X , Y )=0 even though X and Y are not
independent.

Solution

            X
       −1    0    1   f(y)
Y   0   0   1/3   0   1/3
    1  1/3   0   1/3  2/3
 f(x)  1/3  1/3  1/3   1


E(X) = −1 ∗ 1/3 + 0 ∗ 1/3 + 1 ∗ 1/3 = 0; E(Y) = 0 ∗ 1/3 + 1 ∗ 2/3 = 2/3 = E(X²).
E(XY) = E(X³) = −1 ∗ 1/3 + 0 ∗ 1/3 + 1 ∗ 1/3 = 0.
C(X, Y) = E(XY) − E(X)E(Y) = 0 − 0 ∗ 2/3 = 0.

So although the covariance of X and Y is zero, RV Y depends on X, i.e.,
Y = X²: P(Y = 1 ∣ X = 0) = 0 ≠ P(Y = 1) = 2/3, implying X and Y are dependent.
Whereas V(X) ≥ 0, C(X, Y) can be positive, negative, or zero. With a positive
covariance, if X is large, Y tends to be large; with a negative covariance, if X
is large, Y tends to be small.
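The zero-covariance-yet-dependent example above is quick to reproduce numerically. A Python sketch using the values from the table (an added check):

# Joint pmf from the table: Y = X^2 with X uniform on {-1, 0, 1}.
pmf = {(-1, 1): 1/3, (0, 0): 1/3, (1, 1): 1/3}

EX = sum(x * p for (x, y), p in pmf.items())         # 0
EY = sum(y * p for (x, y), p in pmf.items())         # 2/3
EXY = sum(x * y * p for (x, y), p in pmf.items())    # 0
print(EXY - EX * EY)                                 # covariance = 0

# Yet the RVs are dependent: P(X=-1, Y=1) != P(X=-1) * P(Y=1)
print(pmf[(-1, 1)], (1/3) * (2/3))                   # 1/3 vs 2/9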

The correlation coefficient ρ(X, Y) is defined as the ratio of the covariance to
the square root of the product of the variances:

ρ(X, Y) = C(X, Y)/√(V(X)V(Y))   (2−10)

−1 ≤ ρ(X, Y) ≤ 1   (2−11)

We show that −1 ≤ ρ ≤ 1. Since a variance is non-negative,

V(X/σ_X + Y/σ_Y) = V(X)/σ_X² + V(Y)/σ_Y² + 2C(X, Y)/(σ_X σ_Y) = 1 + 1 + 2ρ(X, Y) ≥ 0
⇒ ρ(X, Y) ≥ −1.

Similarly,

V(X/σ_X − Y/σ_Y) = V(X)/σ_X² + V(Y)/σ_Y² − 2C(X, Y)/(σ_X σ_Y) = 1 + 1 − 2ρ(X, Y) ≥ 0
⇒ ρ(X, Y) ≤ 1.

Hence,

−1 ≤ ρ(X, Y) ≤ 1.
The correlation coefficient measures the linear relationship between RVs X
and Y. Notice that the denominator cannot be negative, implying that ρ and
C(X, Y) must have the same sign.
If ρ = 0, there is neither a positive nor a negative linear relationship,
indicating that the two RVs are not linearly correlated, C(X, Y) = 0. If ρ is
positive, then Y tends to increase as X increases; if ρ is negative, then Y
tends to decrease as X increases.

Specifically, if X > μ_X and Y > μ_Y, or X < μ_X and Y < μ_Y, tend to occur
together, then ρ > 0; if X < μ_X and Y > μ_Y, or X > μ_X and Y < μ_Y, tend to
occur together, then ρ < 0.

Let density function f (x) for RV X be given as shown below. Let RV Y =2 X .


Show that X and Y are perfectly correlated, i.e., ρ=1.

Solution

X     −1    0    1
f(x)  1/3  1/3  1/3

Y     −2    0    2
f(y)  1/3  1/3  1/3

E(X) = (−1 + 0 + 1)/3 = 0; E(Y) = E(2X) = 2E(X) = 0.
E(XY) = E(2X²) = 2[(1/3)(1 + 0 + 1)] = 4/3, so C(X, Y) = E(XY) − E(X)E(Y) = 4/3.
V(X) = E(X²) = (1/3)(1 + 0 + 1) = 2/3; V(Y) = E(Y²) = (1/3)(4 + 0 + 4) = 8/3.
ρ(X, Y) = C(X, Y)/√(V(X)V(Y)) = (4/3)/√((2/3)(8/3)) = (4/3)/(4/3) = 1.
Note that X and Y are linearly related. This result holds for any exact linear
relation: since the slope 2 of Y = 2X is positive, there is a perfect positive
correlation. If the slope were negative, then ρ = −1, a perfect negative correlation.
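A small helper makes the sign-of-slope behavior concrete. This Python sketch (an added illustration; rho below is simply definition (2−10) applied to a finite pmf):

from math import sqrt

def rho(triples):
    # triples: list of (x, y, probability); returns the correlation coefficient
    ex = sum(x * p for x, y, p in triples)
    ey = sum(y * p for x, y, p in triples)
    exy = sum(x * y * p for x, y, p in triples)
    vx = sum(x * x * p for x, y, p in triples) - ex**2
    vy = sum(y * y * p for x, y, p in triples) - ey**2
    return (exy - ex * ey) / sqrt(vx * vy)

print(rho([(x, 2 * x, 1/3) for x in (-1, 0, 1)]))    #  1.0 (positive slope)
print(rho([(x, -2 * x, 1/3) for x in (-1, 0, 1)]))   # -1.0 (negative slope)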

EXAMPLE 2.36 Calculate ρ(X, Y) from the joint density table for RVs X and Y.

            Y
        3    5    7   f(x)
X   1  1/4   0    0   1/4
    2   0   1/2   0   1/2
    3   0    0   1/4  1/4
 f(y)  1/4  1/2  1/4    1
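One way to carry out the calculation is directly from the table; note that the nonzero cells all lie on the line y = 2x + 1, so by the preceding discussion a perfect positive correlation should result. A Python sketch (an added illustration, not the book's worked solution):

from math import sqrt

cells = [(1, 3, 1/4), (2, 5, 1/2), (3, 7, 1/4)]      # (x, y, p) from the table

ex = sum(x * p for x, y, p in cells)                 # 2
ey = sum(y * p for x, y, p in cells)                 # 5
exy = sum(x * y * p for x, y, p in cells)            # 11
vx = sum(x * x * p for x, y, p in cells) - ex**2     # 1/2
vy = sum(y * y * p for x, y, p in cells) - ey**2     # 2
print((exy - ex * ey) / sqrt(vx * vy))               # 1.0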
