Chapter 3
Joint Probability Distributions
Year 2024
Example 3.1 A one-baht coin, a five-baht coin, and a die are tossed simultaneously. Let
the random variable X denote the number of heads obtained from the coins and Y denote
the number shown on the die. Write the joint probability mass function of the random
variables (X, Y ).
Answer 3.1 Tossing the two coins and the die gives 2 × 2 × 6 = 24 equally likely
outcomes:

S = {HH1, HH2, ..., HH6, HT1, ..., HT6, TH1, ..., TH6, TT1, ..., TT6},   n(S) = 24.

Here X takes the values 0, 1, 2 and Y takes the values 1, 2, ..., 6. For each fixed y, one
outcome gives X = 0 (TTy), two outcomes give X = 1 (HTy and THy), and one outcome
gives X = 2 (HHy). The joint pmf is therefore

x \ y     1      2      3      4      5      6    g(x)
 0      1/24   1/24   1/24   1/24   1/24   1/24   6/24
 1      2/24   2/24   2/24   2/24   2/24   2/24  12/24
 2      1/24   1/24   1/24   1/24   1/24   1/24   6/24
h(y)    4/24   4/24   4/24   4/24   4/24   4/24    1

To show independence, we check that f (x, y) = g(x)h(y) for all x and y. For example,
f (0, 1) = 1/24 = (6/24)(4/24) = g(0)h(1), and the same factorisation holds in every
other cell, so X and Y are independent.
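As a cross-check (not part of the original notes), the joint table above can be verified by brute-force enumeration; the names `outcomes`, `f`, `g`, `h` below are illustrative:

```python
from fractions import Fraction
from itertools import product

# Enumerate the 2 x 2 x 6 = 24 equally likely outcomes (two coins, one die).
outcomes = list(product("HT", "HT", range(1, 7)))
assert len(outcomes) == 24

def f(x, y):
    """Joint pmf: P(X = x heads on the coins, Y = y on the die)."""
    hits = [o for o in outcomes if (o[0], o[1]).count("H") == x and o[2] == y]
    return Fraction(len(hits), len(outcomes))

# Marginals, then the factorisation test f(x, y) = g(x) h(y).
g = {x: sum(f(x, y) for y in range(1, 7)) for x in (0, 1, 2)}
h = {y: sum(f(x, y) for x in (0, 1, 2)) for y in range(1, 7)}
assert all(f(x, y) == g[x] * h[y] for x in (0, 1, 2) for y in range(1, 7))
```

Exact `Fraction` arithmetic avoids any floating-point doubt when comparing cells of the table.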
Example 3.2 A box contains 5 white balls and 5 black balls. Peter draws 3 balls from
this box, taking one ball at a time and placing it back before the next draw. Define random
variables X and Y as follows:

X = 0 if the first draw is white, and X = 1 if the first draw is black;

Y = number of white balls drawn (Y = 0, 1, 2, 3).
Answer 3.2 Since the draws are made with replacement, each of the 2^3 = 8 colour
sequences is equally likely. Counting sequences for each pair (x, y) gives

             0,     if (x, y) = (0, 0), (1, 3),
f (x, y) =   1/8,   if (x, y) = (0, 1), (0, 3), (1, 0), (1, 2),
             1/4,   if (x, y) = (0, 2), (1, 1),
             0,     otherwise.

X and Y are not independent: there is a pair (x, y) such that f (x, y) ≠ g(x)h(y). For
example, f (0, 0) = 0 while g(0)h(0) = (1/2)(1/8) ≠ 0.
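The eight equally likely colour sequences can be enumerated directly; this sketch (illustrative names, not from the notes) reproduces the pmf above and the failed factorisation at (0, 0):

```python
from fractions import Fraction
from itertools import product

# With replacement, the 3 draws give 2**3 = 8 equally likely colour sequences.
draws = list(product("WB", repeat=3))

def f(x, y):
    """Joint pmf of X (0 if first draw white, 1 if black) and Y (# white)."""
    hits = [d for d in draws if (0 if d[0] == "W" else 1) == x and d.count("W") == y]
    return Fraction(len(hits), len(draws))

assert f(0, 0) == 0 and f(1, 3) == 0          # impossible combinations
assert f(0, 2) == f(1, 1) == Fraction(1, 4)   # the 1/4 cells
# X and Y are NOT independent: f(0, 0) = 0 but g(0) h(0) > 0.
g0 = sum(f(0, y) for y in range(4))
h0 = sum(f(x, 0) for x in (0, 1))
assert f(0, 0) != g0 * h0
```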
Example 3.3 Suppose that 2 batteries are randomly chosen without replacement from the
following group of 12 batteries: 3 are new, 4 are in use (working) and 5 are defective. Let
X denote the number of new batteries chosen. Let Y denote the number of used batteries
chosen. Answer the following questions.
2. Find the joint probability distribution of (X, Y ). (Use a contingency table to present
the joint distribution.)
Answer 3.3
1. Though X can take on values 0, 1, and 2, and Y can take on values 0, 1, and 2,
when we consider them jointly, X + Y ≤ 2. So, not all combinations of (X, Y ) are
possible.
Y \X     0       1       2
0      10/66   15/66   3/66
1      20/66   12/66     0
2       6/66     0       0
!
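The table entries follow the hypergeometric pattern f(x, y) = C(3, x) C(4, y) C(5, 2−x−y) / C(12, 2); a short sketch (assuming that form, which matches the sampling-without-replacement setup) confirms the cells sum to 1:

```python
from fractions import Fraction
from math import comb

# x new batteries (of 3), y used (of 4), the rest defective (of 5), 2 drawn.
def f(x, y):
    if x + y > 2:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(4, y) * comb(5, 2 - x - y), comb(12, 2))

assert f(0, 0) == Fraction(10, 66)
assert f(1, 1) == Fraction(12, 66)
assert f(2, 2) == 0                 # X + Y <= 2, so (2, 2) is impossible
assert sum(f(x, y) for x in range(3) for y in range(3)) == 1
```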
Example 3.4 Let X and Y be random variables with the joint pmf

             (1/25)(x^2 + y^2),   if x = 1, 2, y = 0, 1, 2,
f (x, y) =
             0,                   otherwise.

Find
1. P (X > Y ),
2. P (X + Y ≤ 2),

3. P (X + Y = 2).
Answer 3.4

1. P (X > Y ) = P (X = 1, Y = 0) + P (X = 2, Y = 0) + P (X = 2, Y = 1)
             = f (1, 0) + f (2, 0) + f (2, 1)
             = 1/25 + 4/25 + 5/25
             = 10/25 = 2/5.

2. P (X + Y ≤ 2) = f (1, 0) + f (1, 1) + f (2, 0) = 1/25 + 2/25 + 4/25 = 7/25.

3. P (X + Y = 2) = f (1, 1) + f (2, 0) = 2/25 + 4/25 = 6/25.
!
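A short enumeration over the six support points (a sanity check, not part of the original notes) reproduces P(X > Y) = 2/5:

```python
from fractions import Fraction

# Joint pmf f(x, y) = (x^2 + y^2)/25 on x in {1, 2}, y in {0, 1, 2}.
def f(x, y):
    return Fraction(x ** 2 + y ** 2, 25)

assert sum(f(x, y) for x in (1, 2) for y in (0, 1, 2)) == 1
p_gt = sum(f(x, y) for x in (1, 2) for y in (0, 1, 2) if x > y)
assert p_gt == Fraction(2, 5)
```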
We call the function f (x, y) the joint probability density function (joint pdf) of the random
variables (X, Y ) if f (x, y) has the following properties:

(i) f (x, y) ≥ 0 for all (x, y);

(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f (x, y) dx dy = 1.

From the definition, the function f (x, y) represents a continuous surface over the plane. The
value of f (x, y) at a specific point (x, y) can be interpreted as the height of the surface
above that point. The probability of X and Y falling within a particular region is
represented by the volume under the surface f (x, y) over that region. Therefore, to
calculate the probability P (a < X ≤ b, c < Y ≤ d), where a and b are possible values for
X, and c and d are possible values for Y , we use the double integral

P (a < X ≤ b, c < Y ≤ d) = ∫_c^d ∫_a^b f (x, y) dx dy.                    (3.3)

Note: Both the joint probability mass function (joint pmf) and the joint probability
density function (joint pdf) can be abbreviated as the joint probability function.
Example 3.5 X and Y are random variables with joint probability density function
f (x, y) defined by

             kx(x + y),   0 < x < 1, 0 < y < 2,
f (x, y) =
             0,           otherwise.

Calculate k and P (0 < X < 1/5, 1/5 < Y < 1).
Answer 3.5 The joint pdf must integrate to 1:

∫_0^2 ∫_0^1 kx(x + y) dx dy = 1

k ∫_0^2 [x^3/3 + x^2 y/2]_0^1 dy = 1

(k/6) ∫_0^2 (2 + 3y) dy = 1

(k/6) [2y + 3y^2/2]_0^2 = 1

k(4 + 6) = 6

k = 3/5.

Then

P (0 < X < 1/5, 1/5 < Y < 1) = (3/5) ∫_{1/5}^1 ∫_0^{1/5} (x^2 + xy) dx dy

                             = (1/10) ∫_{1/5}^1 (2/125 + 3y/25) dy

                             = (1/1250) [2(1 − 1/5) + (15/2)(1 − 1/25)]

                             = 22/3125.
!
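Since the pdf is a polynomial, a midpoint-rule double integral gives an independent numeric check of both k = 3/5 and the probability 22/3125; `dbl_integral` is an illustrative helper, not notation from the notes:

```python
# Midpoint-rule check of k = 3/5 and P(0 < X < 1/5, 1/5 < Y < 1) = 22/3125.
def dbl_integral(f, ax, bx, ay, by, n=400):
    hx, hy = (bx - ax) / n, (by - ay) / n
    return sum(
        f(ax + (i + 0.5) * hx, ay + (j + 0.5) * hy)
        for i in range(n) for j in range(n)
    ) * hx * hy

f = lambda x, y: 0.6 * x * (x + y)                       # k = 3/5
assert abs(dbl_integral(f, 0, 1, 0, 2) - 1) < 1e-4       # integrates to 1
assert abs(dbl_integral(f, 0, 0.2, 0.2, 1) - 22 / 3125) < 1e-6
```

The midpoint rule is second-order accurate, so with n = 400 subdivisions the error on a smooth polynomial is far below the tolerances used here.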
Example 3.6 X and Y are random variables with joint probability density function
f (x, y) defined by

             x^2 + xy/3,   0 ≤ x ≤ 1, 0 ≤ y ≤ 2,
f (x, y) =
             0,            otherwise.

1. Show that f (x, y) is a joint probability density function.
2. Find P (X + Y ≥ 1)
Answer 3.6

1. To show that f (x, y) is the joint probability function of X and Y , we need to show

(i) f (x, y) = x^2 + xy/3 ≥ 0 for all x, y with 0 ≤ x ≤ 1, 0 ≤ y ≤ 2;

(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f (x, y) dx dy = ∫_0^2 ∫_0^1 (x^2 + xy/3) dx dy
     = ∫_0^2 [x^3/3 + x^2 y/6]_0^1 dy
     = ∫_0^2 (1/3 + y/6) dy
     = [y/3 + y^2/12]_0^2
     = 2/3 + 4/12 = 1.

From (i) and (ii), we can now conclude that f (x, y) is the joint probability function
of the random variables X and Y .

2. Find P (X + Y ≥ 1). Let A denote the event {X + Y ≥ 1}. Using the complement,

P (A) = 1 − ∫_0^1 ∫_0^{1−x} (x^2 + xy/3) dy dx
      = 1 − ∫_0^1 [x^2 (1 − x) + x(1 − x)^2 / 6] dx
      = 1 − 7/72 = 65/72,

or, integrating over x first,

P (A) = 1 − ∫_0^1 ∫_0^{1−y} (x^2 + xy/3) dx dy
      = 1 − ∫_0^1 [(1 − y)^3/3 + (1 − y)^2 y/6] dy
      = 1 − (1/6) ∫_0^1 [(2 − 6y + 6y^2 − 2y^3) + (y − 2y^2 + y^3)] dy
      = 1 − (1/6) ∫_0^1 (2 − 5y + 4y^2 − y^3) dy
      = 1 − (1/6) [2y − 5y^2/2 + 4y^3/3 − y^4/4]_0^1
      = 1 − (1/6) · (24 − 30 + 16 − 3)/12
      = 1 − 7/72 = 65/72.

Alternatively, integrate directly over the region {x + y ≥ 1}: for 0 ≤ y ≤ 1 the region
runs from x = 1 − y to x = 1, while for 1 ≤ y ≤ 2 it covers all of 0 ≤ x ≤ 1:

∫_0^1 ∫_{1−y}^1 (x^2 + xy/3) dx dy = ∫_0^1 [x^3/3 + x^2 y/6]_{1−y}^1 dy
                                   = ∫_0^1 (y − 2y^2/3 + y^3/6) dy
                                   = [y^2/2 − 2y^3/9 + y^4/24]_0^1 = 23/72,

∫_1^2 ∫_0^1 (x^2 + xy/3) dx dy = ∫_1^2 [x^3/3 + x^2 y/6]_0^1 dy
                               = ∫_1^2 (1/3 + y/6) dy
                               = [y/3 + y^2/12]_1^2 = 7/12.

Therefore, P (X + Y ≥ 1) = 23/72 + 7/12 = 23/72 + 42/72 = 65/72.
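Integrating f exactly in y and numerically in x gives an independent check of P(X + Y ≥ 1) = 65/72; `inner` below is an illustrative helper that carries out the inner y-integral in closed form:

```python
# P(X + Y >= 1): for each x, integrate f(x, y) = x^2 + x*y/3 over
# y in (max(0, 1 - x), 2) analytically, then midpoint rule in x.
def inner(x):
    lo = max(0.0, 1.0 - x)
    return x * x * (2 - lo) + (x / 6) * (4 - lo * lo)

n = 2000
prob = sum(inner((i + 0.5) / n) for i in range(n)) / n
assert abs(prob - 65 / 72) < 1e-6
```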
Definition 3.3 The joint cumulative distribution function of random variables (X, Y ) at
the point (x, y) is the probability that X has a value less than or equal to x and Y has a
value less than or equal to y. That is,

F (x, y) = P (X ≤ x, Y ≤ y).

For example, let there be two random variables: X representing the age and Y representing
the income of a person, and let F (x, y) be the joint cdf of X and Y . Then F (30, 50000)
would mean the probability that a person is no older than 30 years old and has an income
of no more than 50,000 Baht.
Example 3.7 X and Y are random variables with cumulative distribution function defined
by

             (1/2)(xy − x − y + c),   1 < x < 3, 1 < y < k,
F (x, y) =
             0,                       otherwise.

1. Find c and k.

2. Find P (X ≤ 2, Y ≤ 2).
!
Note: In definitions 3.4 and 3.5, the marginal functions g and h may not be specified as
mass functions or density functions. They are collectively called the marginal probability
distribution function (marginal function) of the random variables X and Y , respectively.
Example 3.8 Let X denote the number of coffee orders and Y the number of cake orders.
The owner has collected data and created a joint probability distribution table:

X \Y     0      1      2
0      0.10   0.10   0.05
1. Find the probability of exactly 0 coffee orders, regardless of the number of cakes.

3. Find the probability of exactly 1 cake order, regardless of the number of coffees.
Answer 3.8

1. P (X = 0) = P (X = 0, Y = 0) + P (X = 0, Y = 1) + P (X = 0, Y = 2)
            = 0.10 + 0.10 + 0.05 = 0.25.

3. P (Y = 1) = f (0, 1) + f (1, 1) + f (2, 1) = 0.10 + 0.10 + 0.05 = 0.25.
Example 3.9 Let random variables X and Y having joint probability distribution function
as follows.
f (x, y) = c(x + y),   x = 1, 2, 3, y = 1, 2.
1. Find c.

2. Find P (X = 3).

3. Find P (Y = 2).
Answer 3.9

1. Since the probabilities must sum to 1,

c[(1 + 1) + (1 + 2) + (2 + 1) + (2 + 2) + (3 + 1) + (3 + 2)] = 1

c(2 + 3 + 3 + 4 + 4 + 5) = 1

21c = 1,  so  c = 1/21.

2. P (X = 3) = f (3, 1) + f (3, 2) = (1/21)(3 + 1) + (1/21)(3 + 2) = 9/21 = 3/7.

3. P (Y = 2) = f (1, 2) + f (2, 2) + f (3, 2)
            = (1/21)[(1 + 2) + (2 + 2) + (3 + 2)] = 12/21 = 4/7.
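A six-term enumeration (an independent check, not in the original) confirms c = 1/21 and both probabilities:

```python
from fractions import Fraction

c = Fraction(1, 21)
f = lambda x, y: c * (x + y)
assert sum(f(x, y) for x in (1, 2, 3) for y in (1, 2)) == 1   # fixes c = 1/21
assert f(3, 1) + f(3, 2) == Fraction(3, 7)                    # P(X = 3)
assert sum(f(x, 2) for x in (1, 2, 3)) == Fraction(4, 7)      # P(Y = 2)
```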
Example 3.10 X and Y are random variables with joint probability function f (x, y)
defined by

f (x, y) = (xy + 1)/32,   0 < x < 2, 1 < y < 5.

Find the marginal function g(x) and P (2 < Y ≤ 4).
Answer 3.10

g(x) = ∫_1^5 (xy + 1)/32 dy
     = (1/32) [xy^2/2 + y]_{y=1}^{y=5}

⇒ g(x) = (3x + 1)/8,   0 < x < 2.

P (2 < Y ≤ 4) = ∫_2^4 h(y) dy,   where   h(y) = ∫_0^2 (xy + 1)/32 dx.

Consider

h(y) = ∫_0^2 (xy + 1)/32 dx
     = (1/32) [x^2 y/2 + x]_{x=0}^{x=2}
     = (y + 1)/16,   1 < y < 5.

Therefore,

P (2 < Y ≤ 4) = ∫_2^4 (y + 1)/16 dy
              = (1/16) [y^2/2 + y]_{y=2}^{y=4}
              = 1/2.
!
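Because the integrand is linear in each variable, the midpoint rule reproduces the marginals essentially exactly; `g` and `h` below mirror the notation above but are a numeric sketch, not the closed forms:

```python
# Numeric check of g(x) = (3x + 1)/8, h(y) = (y + 1)/16, P(2 < Y <= 4) = 1/2.
def g(x, n=1000):          # integrate (xy + 1)/32 over y in (1, 5)
    return sum((x * (1 + (j + 0.5) * 4 / n) + 1) / 32 for j in range(n)) * 4 / n

def h(y, n=1000):          # integrate (xy + 1)/32 over x in (0, 2)
    return sum((((i + 0.5) * 2 / n) * y + 1) / 32 for i in range(n)) * 2 / n

assert abs(g(1.0) - (3 * 1.0 + 1) / 8) < 1e-6
assert abs(h(3.0) - (3.0 + 1) / 16) < 1e-6
p = sum(h(2 + (j + 0.5) * 2 / 1000) for j in range(1000)) * 2 / 1000
assert abs(p - 0.5) < 1e-6
```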
We may think of the events A and B in terms of random variables, so (3.9) can be
expressed with probability functions as follows:

P (Y = y|X = x) = P (X = x, Y = y) / P (X = x) = f (x, y) / f (x),        (3.10)

i.e. the joint probability divided by the marginal probability.
Definition 3.6 Let f (x, y) be the joint probability distribution function of X and Y . If
f (y) is the marginal probability function of Y , and

f (x|y) = f (x, y) / f (y),   f (y) > 0,                                  (3.11)

then f (x|y) is called the conditional probability function of X for given Y = y. Similarly,
the conditional probability function of Y for given X = x is given by

f (y|x) = f (x, y) / f (x),   f (x) > 0.                                  (3.12)
Example 3.11 Let X and Y be random variables with joint probability function

f (x, y) = (x + y)/21,   x = 1, 2, 3, y = 1, 2.

1. Find the conditional probability function of X when Y = y.

2. Find the conditional probability function of Y when X = x.

3. Find P (X = 2|Y = 2).

4. Find P (Y = 2|X = 2).
Answer 3.11

1. f (x|y) = f (x, y)/h(y), so first find the marginal h(y):

h(y) = Σ_x f (x, y) = [(1 + y) + (2 + y) + (3 + y)]/21 = (6 + 3y)/21,   y = 1, 2.

Then f (x|y) = [(x + y)/21] / [(6 + 3y)/21] = (x + y)/(6 + 3y),   x = 1, 2, 3.

2. f (y|x) = f (x, y)/g(x), so find the marginal g(x):

g(x) = Σ_y f (x, y) = [(x + 1) + (x + 2)]/21 = (2x + 3)/21,   x = 1, 2, 3.

Then f (y|x) = (x + y)/(2x + 3),   y = 1, 2.

3. P (X = 2|Y = 2) = (2 + 2)/(6 + 6) = 4/12 = 1/3.

4. P (Y = 2|X = 2) = (2 + 2)/(4 + 3) = 4/7.
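The conditional pmfs above can be rebuilt from the joint pmf alone; this sketch (illustrative names) verifies both conditional values and that each conditional pmf sums to 1:

```python
from fractions import Fraction

f = lambda x, y: Fraction(x + y, 21)            # x = 1, 2, 3; y = 1, 2
h = lambda y: sum(f(x, y) for x in (1, 2, 3))   # marginal of Y: (6 + 3y)/21
g = lambda x: sum(f(x, y) for y in (1, 2))      # marginal of X: (2x + 3)/21

f_x_given_y = lambda x, y: f(x, y) / h(y)
f_y_given_x = lambda x, y: f(x, y) / g(x)

assert f_x_given_y(2, 2) == Fraction(1, 3)      # P(X = 2 | Y = 2)
assert f_y_given_x(2, 2) == Fraction(4, 7)      # P(Y = 2 | X = 2)
assert sum(f_x_given_y(x, 1) for x in (1, 2, 3)) == 1
```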
Example 3.12 Random variables X and Y have joint probability density function defined
by

             (3/2)(x^2 + y^2),   0 < x < 1, 0 < y < 1,
f (x, y) =
             0,                  otherwise.

1. Find f (x|y).
1.

f (x|y) = f (x, y)/h(y)

h(y) = ∫_{−∞}^{∞} f (x, y) dx = ∫_0^1 (3/2)(x^2 + y^2) dx
     = (3/2) [x^3/3 + y^2 x]_0^1
     = (3/2) (1/3 + y^2)   for 0 < y < 1.

Therefore, f (x|y) = (3/2)(x^2 + y^2) / [(3/2)(1/3 + y^2)] = 3(x^2 + y^2)/(3y^2 + 1),
0 < x < 1, 0 < y < 1.

2.

f (y|x) = f (x, y)/g(x)

g(x) = ∫_{−∞}^{∞} f (x, y) dy = ∫_0^1 (3/2)(x^2 + y^2) dy
     = (3/2) [x^2 y + y^3/3]_0^1
     = (3/2) (x^2 + 1/3)   for 0 < x < 1.

Therefore, f (y|x) = (3/2)(x^2 + y^2) / [(3/2)(x^2 + 1/3)] = 3(x^2 + y^2)/(3x^2 + 1),
0 < x < 1, 0 < y < 1.
!
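A conditional density must integrate to 1 over its support for every conditioning value; a midpoint-rule check of f(x|y) = 3(x² + y²)/(3y² + 1) (an independent sketch, not part of the notes):

```python
# Check that the conditional density integrates to 1 in x for several y.
def f_cond(x, y):
    return 3 * (x * x + y * y) / (3 * y * y + 1)

n = 2000
for y in (0.1, 0.5, 0.9):
    total = sum(f_cond((i + 0.5) / n, y) for i in range(n)) / n
    assert abs(total - 1) < 1e-6
```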
3.6 Independence
Recall that events A and B are independent if
P (A ∩ B) = P (A)P (B).
This means that the probability of two concurrent events can be expressed as the product
of the probabilities of the two. Another way to express the independence of two random
variables is as in Definition 3.7.
Definition 3.7 Let X and Y be random variables with joint probability function f (x, y)
and marginal probability functions of X and Y as g(x) and h(y), respectively. Random
variables X and Y are independent if and only if

f (x, y) = g(x)h(y)   for all (x, y).                                     (3.17)

Remark: In Definition 3.7, equation (3.17) holds for both discrete and continuous random
variables.

Theorem 3.1 Let X and Y be independent discrete random variables with marginal
probability functions g(x) and h(y), respectively. Then

P (a < X < b, c < Y < d) = [ Σ_{a<x<b} g(x) ] [ Σ_{c<y<d} h(y) ].
Theorem 3.2 Let X and Y be independent continuous random variables with marginal
probability density functions g(x) and h(y), respectively. Then

P (a < X < b, c < Y < d) = [ ∫_a^b g(x) dx ] [ ∫_c^d h(y) dy ].
Example 3.13 Are X and Y with the following joint probability distribution function
independent?

             xy^2/30,   x = 1, 2, 3, y = 1, 2,
f (x, y) =
             0,         otherwise.
Answer 3.13 The marginal functions are

g(x) = Σ_{y=1}^{2} xy^2/30 = 5x/30 = x/6,   x = 1, 2, 3,

h(y) = Σ_{x=1}^{3} xy^2/30 = 6y^2/30 = y^2/5,   y = 1, 2.

Since g(x)h(y) = (x/6)(y^2/5) = xy^2/30 = f (x, y) for every pair (x, y), X and Y are
independent.
!
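The factorisation test of Definition 3.7 can be run mechanically over the whole support (a sketch with illustrative names):

```python
from fractions import Fraction

f = lambda x, y: Fraction(x * y * y, 30)
g = lambda x: sum(f(x, y) for y in (1, 2))     # marginal of X: x/6
h = lambda y: sum(f(x, y) for x in (1, 2, 3))  # marginal of Y: y^2/5

# Independence: f(x, y) = g(x) h(y) must hold in every cell.
assert all(f(x, y) == g(x) * h(y) for x in (1, 2, 3) for y in (1, 2))
```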
Example 3.14 Suppose the random variables X and Y are independent with marginal
probability functions of X and Y given by
⎧ + ,x
1 2
⎪
⎨ x = 1, 2, 3, . . .
g(x) = 2 3
⎪
⎩0 otherwise
⎧ + ,y
1 2
⎪
⎨ y = 1, 2, 3, . . .
h(y) = 2 3
⎪
⎩0 otherwise
Find P (X = 1, Y = 3) and P (X + Y = 3).
Answer 3.14
P (X = 1, Y = 3) = P (X = 1)P (Y = 3)
                 = [(1/2)(2/3)] [(1/2)(2/3)^3]
                 = 4/81.

Also,

P (X + Y = 3) = f (1, 2) + f (2, 1)
             = (1/4)(2/3)^{1+2} + (1/4)(2/3)^{1+2}
             = 4/27.
!
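Exact rational arithmetic reproduces both probabilities from the geometric-type marginal (a verification sketch, not part of the notes):

```python
from fractions import Fraction

# Marginal pmf g(x) = (1/2)(2/3)^x for x = 1, 2, ...; h has the same form.
g = lambda x: Fraction(1, 2) * Fraction(2, 3) ** x

p13 = g(1) * g(3)                       # independence: P(X=1, Y=3)
assert p13 == Fraction(4, 81)
p_sum3 = g(1) * g(2) + g(2) * g(1)      # P(X + Y = 3) = f(1,2) + f(2,1)
assert p_sum3 == Fraction(4, 27)
```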
Example 3.15 Suppose random variables X and Y are independent with marginal
distribution functions of X and Y given by

         1,   0 < x < 1,
g(x) =
         0,   otherwise,

and

         2y,   0 < y < 1,
h(y) =
         0,    otherwise.
Find P (Y < X)
Answer 3.15 Since X and Y are independent, f (x, y) = g(x)h(y) = 2y for 0 < x < 1,
0 < y < 1. Then

P (Y < X) = ∫_0^1 ∫_0^x f (x, y) dy dx
          = ∫_0^1 ∫_0^x 2y dy dx
          = ∫_0^1 [y^2]_0^x dx
          = ∫_0^1 x^2 dx
          = [x^3/3]_0^1
          = 1/3.
!
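After the inner integral, P(Y < X) reduces to the one-dimensional integral of x²; a midpoint-rule check (independent of the notes):

```python
# P(Y < X) = integral over x in (0, 1) of the inner result x^2, which is 1/3.
n = 2000
p = sum(((i + 0.5) / n) ** 2 for i in range(n)) / n
assert abs(p - 1 / 3) < 1e-6
```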
Definition 3.8 Random variables X and Y with joint cdf F (x, y) are independent if and
only if

F (x, y) = G(x)H(y)   for all (x, y),

where G and H are the marginal cdfs of X and Y . Therefore,

G(x) = ∫_{−∞}^{x} g(t) dt = ∫_{−∞}^{x} ∫_{−∞}^{∞} f (t, y) dy dt,         (3.21)

H(y) = ∫_{−∞}^{y} h(t) dt = ∫_{−∞}^{y} ∫_{−∞}^{∞} f (x, t) dx dt.         (3.22)

We could use F (x) as an alternative notation for G(x), and F (y) as an alternative
notation for H(y).
Example 3.16 Suppose random variables X, Y and Z are independent with the uniform
distribution in (0, 1). Find P (Z ≥ XY )
Answer 3.16 By independence, the joint pdf is

f (x, y, z) = f (x)f (y)f (z) = 1,   0 < x < 1, 0 < y < 1, 0 < z < 1.
Therefore,
P (Z ≥ XY ) = ∫∫∫_{z ≥ xy} f (x, y, z) dx dy dz
            = ∫_0^1 ∫_0^1 ∫_{xy}^1 dz dy dx
            = ∫_0^1 ∫_0^1 [z]_{xy}^1 dy dx
            = ∫_0^1 ∫_0^1 (1 − xy) dy dx
            = ∫_0^1 [y − xy^2/2]_0^1 dx
            = ∫_0^1 (1 − x/2) dx
            = [x − x^2/4]_0^1
            = 3/4.
!
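After integrating out z and y, the probability reduces to the integral of 1 − x/2 over (0, 1); a midpoint-rule check (a sketch, not part of the notes):

```python
# P(Z >= XY) reduces to the integral of (1 - x/2) over x in (0, 1) = 3/4.
n = 2000
p = sum(1 - ((i + 0.5) / n) / 2 for i in range(n)) / n
assert abs(p - 0.75) < 1e-9
```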
3.7 Exercises
1. Flip a fair coin three times and record the sequence of heads H and tails T .
Let random variable X denote the number of heads obtained. Let random variable
Y denote the winnings earned in a single play of a game with the following rules:
2. Roll two dice. Let X be the number on the first die and let Y be the total on both
dice.

2.1. Show the joint probability distribution.
3. A fair coin is tossed three times independently. Let X denote the number of heads
on the first toss, and Y denote the total number of heads.

3.1. Find the joint probability mass function of X and Y .
4. Consider two random variables X and Y with joint probability distribution given in
the following table. Answer the following questions.

         Y = 0   Y = 1   Y = 2
X = 0     1/8      c      1/4
X = 1     1/8     1/6     1/6

4.1. Find c.

4.2. Find the marginal probability distributions of X and Y .

4.3. Find P (Y = 1|X = 0).
5. Two textbooks are randomly selected from a shelf containing 5 English literature,
3 history and 2 psychology textbooks. Let X denote the number of English literature
books selected and Y the number of history books selected.

6. Random variables X and Y have the joint probability distribution given in the
following table.

X \Y     0      1      2      3
 -1    0.15    0     0.05   0.1
  1    0.2    0.05    0     0.15
  3    0.1    0.1     0     0.1
7. Suppose the random variables X and Y have the joint probability distribution
function defined by
             c(2x + y),   x = 0, 1, y = 0, 1, 2,
f (x, y) =
             0,           otherwise.
7.1. Find c.

7.2. Find the marginal probability function of X.

7.3. Find f (y|x) and f (y = 1|x = 2).
8. An electronics store sells two main products: laptops (L) and smartphones (S). The
number of each sold on a given day is tracked, and the store manager has created the
following joint pmf table based on historical data. Answer the following questions.
8.1. Find the probability mass functions for the number of laptops sold P (L) and
the number of smartphones sold P (S)
8.2. What is the probability of selling two smartphones on a day when one laptop
is sold?
8.3. Are the random variables L and S independent? Explain your reasoning.