
Probability and Statistics

Year 2024

Asst. Prof. Dr. Kuntalee Chaisee


Department of Statistics
Faculty of Science
Chiang Mai University
Contents

3 Joint Probability Distributions
  3.1 Joint probability mass function
  3.2 Joint probability density function
  3.3 Joint cumulative distribution function
  3.4 Marginal probability distributions
  3.5 Conditional probability distributions
  3.6 Independence
  3.7 Exercises
Chapter 3
Joint Probability Distributions

“It always seems impossible until it’s done.”


– Nelson Mandela

In the previous chapter, we focused on a single random variable, a function assigning one
real number to each outcome. However, real-world phenomena often involve multiple
interdependent random variables. The concept of joint probability distributions allows us
to model and analyse these situations, where events rarely occur in isolation. In this
course, 208150, we study joint probability distributions, focusing on the case of two
random variables.

3.1 Joint probability mass function


Definition 3.1 Let X and Y be discrete random variables where X : S → R_X and
Y : S → R_Y (R_X and R_Y are discrete sets). The joint probability function, or joint
probability mass function (joint pmf), of (X, Y) is

f(x, y) = P(X = x, Y = y),                                            (3.1)

where f(x, y) has the following properties:

(i) 0 ≤ f(x, y) ≤ 1 for all possible x, y,

(ii) Σ_{∀y} Σ_{∀x} f(x, y) = 1.

Example 3.1 A one-baht coin, a five-baht coin, and a die are tossed simultaneously. Let
the random variable X denote the number of heads obtained from the two coins and Y
denote the number shown on the die. Write the joint probability mass function of the
random variables (X, Y).

Answer 3.1 The sample space

S = {HH1, HH2, ..., HH6, HT1, ..., HT6, TH1, ..., TH6, TT1, ..., TT6}

contains 2 × 2 × 6 = 24 equally likely outcomes. The possible values are X = 0, 1, 2 and
Y = 1, 2, ..., 6. For each face y of the die, one coin pattern (TT) gives X = 0, two
patterns (HT, TH) give X = 1, and one pattern (HH) gives X = 2, so

f(x, y) = 1/24  if x = 0, y = 1, 2, ..., 6,
f(x, y) = 2/24  if x = 1, y = 1, 2, ..., 6,
f(x, y) = 1/24  if x = 2, y = 1, 2, ..., 6,
f(x, y) = 0     otherwise.

The row and column totals give P(X = 0) = 6/24, P(X = 1) = 12/24, P(X = 2) = 6/24, and
P(Y = y) = 4/24 for each y. Since f(x, y) = P(X = x)P(Y = y) for all x, y (for example,
f(0, 1) = 1/24 = (6/24)(4/24)), the random variables X and Y are independent (see
Section 3.6).
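This table can also be produced by brute-force enumeration. The following Python sketch
(an illustration using only the standard library; the variable names are chosen here for
clarity) builds the 24-outcome sample space and tabulates the joint pmf exactly:

from fractions import Fraction
from itertools import product

pmf = {}
for coin1, coin2, die in product("HT", "HT", range(1, 7)):
    x = (coin1 == "H") + (coin2 == "H")   # number of heads on the two coins
    pmf[(x, die)] = pmf.get((x, die), 0) + Fraction(1, 24)

print(pmf[(1, 5)])          # 1/12, i.e. 2/24
print(sum(pmf.values()))    # 1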
Example 3.2 A box contains 5 white balls and 5 black balls. Peter draws 3 balls from
this box, one ball at a time, placing each ball back before the next draw. Define random
variables X and Y as follows:

X = 0 if the first draw is white, and X = 1 if the first draw is black,

Y = number of white balls drawn.

Find the joint probability function of the random variables X and Y.

Answer 3.2

Let w denote a white ball and b a black ball. Because the draws are made with
replacement from equal numbers of white and black balls, the 8 outcomes in

S = {www, wwb, wbw, bww, wbb, bwb, bbw, bbb}

are equally likely. The possible values of the random variables are X = 0, 1 and
Y = 0, 1, 2, 3.

P(X = 0, Y = 0) = P(∅) = 0   (impossible: a white first draw means Y ≥ 1)
P(X = 0, Y = 1) = P({wbb}) = 1/8
P(X = 0, Y = 2) = P({wwb, wbw}) = 2/8
P(X = 0, Y = 3) = P({www}) = 1/8
P(X = 1, Y = 0) = P({bbb}) = 1/8
P(X = 1, Y = 1) = P({bwb, bbw}) = 2/8
P(X = 1, Y = 2) = P({bww}) = 1/8
P(X = 1, Y = 3) = P(∅) = 0   (impossible)

X \Y    0      1      2      3
0       0      1/8    1/4    1/8
1       1/8    1/4    1/8    0

Equivalently,

f(x, y) = 1/8  if (x, y) = (0, 1), (0, 3), (1, 0), (1, 2),
f(x, y) = 1/4  if (x, y) = (0, 2), (1, 1),
f(x, y) = 0    otherwise.

To show that X and Y are not independent, it is enough to exhibit one pair (x, y) with
f(x, y) ≠ g(x)h(y): for example, f(0, 0) = 0 while g(0)h(0) = (1/2)(1/8) = 1/16.
Example 3.3 Suppose that 2 batteries are randomly chosen without replacement from the
following group of 12 batteries: 3 are new, 4 are in use (working) and 5 are defective. Let
X denote the number of new batteries chosen. Let Y denote the number of used batteries
chosen. Answer the following questions.

1. What are all possible values of X, Y and (X, Y )?

2. Find the joint probability distribution of (X, Y ). (Use contingency table to present
the joint distribution)

Answer 3.3

1. Though X can take on values 0, 1, and 2, and Y can take on values 0, 1, and 2,
when we consider them jointly, X + Y ≤ 2. So, not all combinations of (X, Y ) are
possible.

2. There are 6 possible cases. Writing C(n, k) for the binomial coefficient, for
   example,

   • no new and no used: f(0, 0) = C(5, 2)/C(12, 2) = 10/66,

   • no new, 1 used: f(0, 1) = C(4, 1)C(5, 1)/C(12, 2) = 20/66.

Table 3.1: Joint probability table of (X, Y ) of Example 3.3.

X \Y 0 1 2
0 10/66 15/66 3/66
1 20/66 12/66 0
2 6/66 0 0
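The whole of Table 3.1 can be generated the same way. A short Python sketch
(illustrative only; math.comb requires Python 3.8+) computes every entry with exact
fractions:

from fractions import Fraction
from math import comb

table = {}
for x in range(3):             # number of new batteries chosen
    for y in range(3 - x):     # number of used batteries chosen; X + Y <= 2
        table[(x, y)] = Fraction(comb(3, x) * comb(4, y) * comb(5, 2 - x - y),
                                 comb(12, 2))

print(table[(0, 1)])        # 10/33, i.e. 20/66
print(sum(table.values()))  # 1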


Example 3.4 Let X and Y be random variables with the joint pmf

f(x, y) = (1/25)(x² + y²)  if x = 1, 2, y = 0, 1, 2,
f(x, y) = 0                otherwise.

Find

1. P(X > Y),

2. P(X + Y ≤ 2),

3. P(X + Y = 2).

Answer 3.4

P(X > Y) = P(X = 1, Y = 0) + P(X = 2, Y = 0) + P(X = 2, Y = 1)
         = f(1, 0) + f(2, 0) + f(2, 1)
         = 1/25 + 4/25 + 5/25
         = 10/25
         = 2/5

Similarly, we can find P(X + Y ≤ 2) and P(X + Y = 2).
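All three probabilities can be read off the same pmf, as in this illustrative Python
sketch (standard library only):

from fractions import Fraction

f = {(x, y): Fraction(x**2 + y**2, 25) for x in (1, 2) for y in (0, 1, 2)}

print(sum(p for (x, y), p in f.items() if x > y))       # 2/5
print(sum(p for (x, y), p in f.items() if x + y <= 2))  # 7/25
print(sum(p for (x, y), p in f.items() if x + y == 2))  # 6/25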


3.2 Joint probability density function

Definition 3.2 Let X and Y be continuous random variables with a function f(x, y) such
that

P(a < X ≤ b, c < Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy  for all a < b, c < d.   (3.2)

We call this function f(x, y) the joint probability density function (joint pdf) of the
random variables (X, Y) if f(x, y) has the following properties:

(i) f(x, y) ≥ 0 for all possible x, y,

(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.

From the definition, the function f(x, y) represents a continuous surface over the
plane. The value of f(x, y) at a specific point (x, y) can be interpreted as the height
of the surface above that point. The joint probability of X and Y falling within a
particular region is the volume under the surface f(x, y) over that region. Therefore,
to calculate the probability P(a < X ≤ b, c < Y ≤ d), where a and b are possible values
for X, and c and d are possible values for Y, we use the double integral

P(a < X ≤ b, c < Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy.                          (3.3)

Note: Both the joint probability mass function (joint pmf) and the joint probability
density function (joint pdf) can be abbreviated as the joint probability function.
Example 3.5 X and Y are random variables with joint probability density function
f(x, y) defined by

f(x, y) = kx(x + y)  if 0 < x < 1, 0 < y < 2,
f(x, y) = 0          otherwise.

Calculate k and P(0 < X < 1/5, 1/5 < Y < 1).

Answer 3.5

Since f(x, y) is the joint pdf of X and Y,

∫₀² ∫₀¹ kx(x + y) dx dy = 1
k ∫₀² [x³/3 + x²y/2]₀¹ dy = 1
(k/6) ∫₀² (2 + 3y) dy = 1
k [2y + 3y²/2]₀² = 6
k(4 + 6) = 6
k = 3/5

Then

P(0 < X < 1/5, 1/5 < Y < 1) = (3/5) ∫_{1/5}^{1} ∫₀^{1/5} (x² + xy) dx dy
                            = (1/10) ∫_{1/5}^{1} (2/125 + 3y/25) dy
                            = (1/1250) [2(1 − 1/5) + (15/2)(1 − 1/25)]
                            = 22/3125
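Both results can be confirmed by symbolic integration. A sketch assuming the sympy
library is available:

from sympy import Rational, integrate, solve, symbols

x, y, k = symbols("x y k", positive=True)
f = k * x * (x + y)

total = integrate(f, (x, 0, 1), (y, 0, 2))       # 5k/3
k_val = solve(total - 1, k)[0]                   # 3/5
prob = integrate(f.subs(k, k_val),
                 (x, 0, Rational(1, 5)), (y, Rational(1, 5), 1))
print(k_val, prob)                               # 3/5  22/3125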


Example 3.6 X and Y are random variables with joint probability density function
f(x, y) defined by

f(x, y) = x² + xy/3  if 0 ≤ x ≤ 1, 0 ≤ y ≤ 2,
f(x, y) = 0          otherwise.

1. Show that f(x, y) is the joint probability function of X and Y.

2. Find P(X + Y ≥ 1).

Answer 3.6

1. To show that f(x, y) is the joint probability function of X and Y, we need to show

(i) f(x, y) = x² + xy/3 ≥ 0 for all x, y with 0 ≤ x ≤ 1, 0 ≤ y ≤ 2, and

(ii)

∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫₀² ∫₀¹ (x² + xy/3) dx dy
                                    = ∫₀² [x³/3 + x²y/6]₀¹ dy
                                    = ∫₀² (1/3 + y/6) dy
                                    = [y/3 + y²/12]₀²
                                    = 2/3 + 4/12
                                    = 1

From (i) and (ii), we can conclude that f(x, y) is the joint probability function of the
random variables X and Y.

2. Find P(X + Y ≥ 1).

Approach 1 Suppose A = {X + Y ≥ 1}, so that Aᶜ = {X + Y < 1}. Then we find P(A) from
P(A) = 1 − P(Aᶜ):

P(A) = 1 − ∫₀¹ ∫₀^{1−x} (x² + xy/3) dy dx
     = 1 − ∫₀¹ [x²(1 − x) + x(1 − x)²/6] dx
     = 1 − 7/72 = 65/72

or, integrating over x first,

P(A) = 1 − ∫₀¹ ∫₀^{1−y} (x² + xy/3) dx dy
     = 1 − ∫₀¹ [(1 − y)³/3 + (1 − y)²y/6] dy
     = 1 − (1/6) ∫₀¹ (2 − 5y + 4y² − y³) dy
     = 1 − (1/6)[2y − 5y²/2 + 4y³/3 − y⁴/4]₀¹
     = 1 − (1/6) · (24 − 30 + 16 − 3)/12
     = 1 − 7/72 = 65/72

Approach 2 We divide the integration into two parts, according to whether y < 1 or
y ≥ 1:

∫₀¹ ∫_{1−y}^{1} (x² + xy/3) dx dy = ∫₀¹ [x³/3 + x²y/6]_{1−y}^{1} dy
                                  = ∫₀¹ (y − 2y²/3 + y³/6) dy
                                  = [y²/2 − 2y³/9 + y⁴/24]₀¹ = 23/72

∫₁² ∫₀¹ (x² + xy/3) dx dy = ∫₁² [x³/3 + x²y/6]₀¹ dy
                          = ∫₁² (1/3 + y/6) dy
                          = [y/3 + y²/12]₁² = 7/12

Therefore, P(X + Y ≥ 1) = 23/72 + 42/72 = 65/72.
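The complement calculation is easy to verify symbolically (sketch assuming sympy is
available):

from sympy import integrate, symbols

x, y = symbols("x y")
f = x**2 + x*y/3

complement = integrate(f, (y, 0, 1 - x), (x, 0, 1))   # P(X + Y < 1) = 7/72
print(1 - complement)                                 # 65/72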

3.3 Joint cumulative distribution function


The joint cumulative distribution function (joint cdf) describes two or more random
variables through the probability that each variable takes a value less than or equal to
a specified value. It is denoted by F(x, y).

Definition 3.3 The joint cumulative distribution function of random variables (X, Y ) at
point (x, y) is the probability that X has a value less than or equal to x and Y has a value
less than or equal to y. That is,

F (x, y) = P (X ≤ x, Y ≤ y), −∞ < x, y < ∞ (3.4)

For example, let X represent the age and Y the income of a person, and let F(x, y) be
the joint cdf of X and Y. Then F(30, 50000) is the probability that a person is no older
than 30 years and earns no more than 50,000 Baht.
Example 3.7 X and Y are random variables with cumulative distribution function defined
by

F(x, y) = (1/2)(xy − x − y + c)  if 1 < x < 3, 1 < y < k,
F(x, y) = 0                      otherwise.

1. Find c and k.

2. Find P(X ≤ 2, Y ≤ 2).

3. Find P(1.2 < X ≤ 2.2, Y ≤ 2).

Answer 3.7

1. F(1, 1) = 0 ⇒ (1/2)(1 − 1 − 1 + c) = 0 ⇒ c = 1, and
   F(3, k) = 1 ⇒ (1/2)(3k − 3 − k + 1) = 1 ⇒ k = 2.

2. P(X ≤ 2, Y ≤ 2) = F(2, 2) = (1/2)(4 − 2 − 2 + 1) = 0.5

3.

P(1.2 < X ≤ 2.2, Y ≤ 2) = P(1.2 < X ≤ 2.2, 1 < Y ≤ 2)
                        = F(2.2, 2) − F(2.2, 1) − F(1.2, 2) + F(1.2, 1)
                        = 0.6 − 0 − 0.1 + 0 = 0.5
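Part 3 uses the standard rectangle rule for cdfs. A small numerical check (illustrative
Python; clamping the arguments to the support is an assumption made here so that F can
be evaluated outside 1 < x < 3, 1 < y < 2):

def F(x, y, c=1.0, k=2.0):
    # clamp to the support so that F stays constant outside (1, 3) x (1, k)
    x = min(max(x, 1.0), 3.0)
    y = min(max(y, 1.0), k)
    return 0.5 * (x * y - x - y + c)

print(F(2, 2))                                        # 0.5
print(F(2.2, 2) - F(2.2, 1) - F(1.2, 2) + F(1.2, 1))  # approximately 0.5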


3.4 Marginal probability distributions


When there are two random variables X and Y with a joint probability function f (x, y),
and we want to consider the distribution of only one of the random variables X or Y ,
without regard to the outcome of the other variable, we call this distribution the marginal
distribution or univariate distribution.
Marginal distributions are often used to reduce complexity. When several related random
variables are involved, the joint distribution can be complicated; a marginal
distribution lets us focus on the single variable of interest, making analysis and
interpretation easier. In addition, the marginal distribution of each variable gives a
clearer overall picture of that variable, which is useful when studying the relationships
between variables.

3.4.1 Marginal probability mass function


Definition 3.4 Suppose (X, Y) has joint probability mass function f(x, y). The marginal
probability mass function (marginal distribution, or marginal pmf) of X is given by

f(x) = P(X = x) = Σ_{∀y} P(X = x, Y = y) = Σ_{∀y} f(x, y).            (3.5)

The marginal probability mass function (marginal pmf) of Y is given by

f(y) = P(Y = y) = Σ_{∀x} P(X = x, Y = y) = Σ_{∀x} f(x, y).            (3.6)

3.4.2 Marginal probability density function


Definition 3.5 Suppose (X, Y) has joint probability density function f(x, y). The
marginal probability density function (marginal distribution, or marginal pdf) of X is
given by

g(x) = ∫_{−∞}^{∞} f(x, y) dy                                          (3.7)

The marginal probability density function (marginal pdf) of Y is given by

h(y) = ∫_{−∞}^{∞} f(x, y) dx                                          (3.8)

Note: In definitions 3.4 and 3.5, the marginal functions g and h may not be specified as
mass functions or density functions. They are collectively called the marginal probability
distribution function (marginal function) of the random variables X and Y , respectively.

Example 3.8 A coffee shop owner tracks two variables:

X : The number of customers who order a coffee on a given day.


Y : The number of customers who order a cake on a given day.

The owner has collected data and created a joint probability distribution table:

X \Y    0       1       2
0       0.10    0.10    0.05
1       0.15    0.20    0.10
2       0.05    0.10    0.15

Answer the following questions.

1. Find the probability of exactly 0 coffee orders, regardless of the number of cakes.

2. Find the marginal probability distribution of X.

3. Find the probability of exactly 1 cake order, regardless of the number of coffees.

4. Find the marginal probability distribution of Y.

Answer 3.8

1. P(X = 0) = P(X = 0, Y = 0) + P(X = 0, Y = 1) + P(X = 0, Y = 2)
            = 0.10 + 0.10 + 0.05 = 0.25.

2. Summing across each row gives the marginal pmf of X:
   g(0) = 0.25, g(1) = 0.45, g(2) = 0.30.

3. P(Y = 1) = P(X = 0, Y = 1) + P(X = 1, Y = 1) + P(X = 2, Y = 1)
            = 0.10 + 0.20 + 0.10 = 0.40.

4. Summing down each column gives the marginal pmf of Y:
   h(0) = 0.30, h(1) = 0.40, h(2) = 0.30.
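Row and column sums like these are one-liners in code. An illustrative Python sketch
with exact fractions:

from fractions import Fraction

joint = {(x, y): Fraction(v, 100) for (x, y), v in {
    (0, 0): 10, (0, 1): 10, (0, 2): 5,
    (1, 0): 15, (1, 1): 20, (1, 2): 10,
    (2, 0): 5,  (2, 1): 10, (2, 2): 15}.items()}

g = {x: sum(p for (i, j), p in joint.items() if i == x) for x in range(3)}
h = {y: sum(p for (i, j), p in joint.items() if j == y) for y in range(3)}
print(g)   # marginal of X: 1/4, 9/20, 3/10 (= 0.25, 0.45, 0.30)
print(h)   # marginal of Y: 3/10, 2/5, 3/10 (= 0.30, 0.40, 0.30)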

Example 3.9 Let random variables X and Y have the joint probability distribution
function

f(x, y) = c(x + y),  x = 1, 2, 3, y = 1, 2.

1. Find c.

2. Find P(X = 3).

3. Find P(Y = 2).

4. Find the marginal distribution functions of X and Y.

Answer 3.9

1. The six probabilities must sum to 1:

   f(1, 1) + f(1, 2) + f(2, 1) + f(2, 2) + f(3, 1) + f(3, 2) = 1
   c(1 + 1) + c(1 + 2) + c(2 + 1) + c(2 + 2) + c(3 + 1) + c(3 + 2) = 1
   2c + 3c + 3c + 4c + 4c + 5c = 1
   21c = 1,

   so c = 1/21.

2. P(X = 3) = f(3, 1) + f(3, 2) = (1/21)(3 + 1) + (1/21)(3 + 2) = 9/21 = 3/7.

3. P(Y = 2) = f(1, 2) + f(2, 2) + f(3, 2)
            = (1/21)[(1 + 2) + (2 + 2) + (3 + 2)] = 12/21 = 4/7.

4. The marginal pmf of X is g(x) = Σ_{y=1}^{2} (x + y)/21 = (2x + 3)/21, x = 1, 2, 3,
   and the marginal pmf of Y is h(y) = Σ_{x=1}^{3} (x + y)/21 = (3y + 6)/21, y = 1, 2.
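The same computation, as an illustrative Python sketch:

from fractions import Fraction

support = [(x, y) for x in (1, 2, 3) for y in (1, 2)]
c = Fraction(1, sum(x + y for x, y in support))       # 1/21
f = {(x, y): c * (x + y) for x, y in support}

print(sum(p for (x, y), p in f.items() if x == 3))    # P(X = 3) = 3/7
print(sum(p for (x, y), p in f.items() if y == 2))    # P(Y = 2) = 4/7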

Example 3.10 X and Y are random variables with joint probability function f(x, y)
defined by

f(x, y) = (xy + 1)/32,  0 < x < 2, 1 < y < 5.

Find the marginal pdf of X and calculate P(2 < Y ≤ 4).

Answer 3.10

g(x) = ∫₁⁵ (xy + 1)/32 dy = (1/32)[xy²/2 + y]_{y=1}^{y=5}

⇒ g(x) = (3x + 1)/8,  0 < x < 2.

Next,

P(2 < Y ≤ 4) = ∫₂⁴ h(y) dy,  where h(y) = ∫₀² (xy + 1)/32 dx.

Consider

h(y) = ∫₀² (xy + 1)/32 dx = (1/32)[x²y/2 + x]_{x=0}^{x=2} = (y + 1)/16,  1 < y < 5.

Therefore,

P(2 < Y ≤ 4) = ∫₂⁴ (y + 1)/16 dy = (1/16)[y²/2 + y]_{y=2}^{y=4} = 1/2
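Both marginals and the interval probability can be checked symbolically (sketch assuming
sympy is available):

from sympy import integrate, simplify, symbols

x, y = symbols("x y")
f = (x*y + 1) / 32

g = integrate(f, (y, 1, 5))       # marginal pdf of X on 0 < x < 2
h = integrate(f, (x, 0, 2))       # marginal pdf of Y on 1 < y < 5
print(simplify(g), simplify(h))   # 3*x/8 + 1/8,  y/16 + 1/16
print(integrate(h, (y, 2, 4)))    # 1/2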


3.5 Conditional probability distributions


According to the conditional probability,

P(B|A) = P(A ∩ B)/P(A),  P(A) > 0.                                    (3.9)

We may think of the events A and B as events determined by random variables, so (3.9)
can be expressed in terms of probability functions as

P(Y = y|X = x) = P(X = x, Y = y)/P(X = x) = f(x, y)/f(x).             (3.10)

Definition 3.6 Let f(x, y) be the joint probability distribution function of X and Y,
and let f(y) be the marginal probability function of Y. If

f(x|y) = f(x, y)/f(y),  f(y) > 0,                                     (3.11)

then f(x|y) is called the conditional probability function of X given Y = y. Similarly,
the conditional probability function of Y given X = x is given by

f(y|x) = f(x, y)/f(x),  f(x) > 0.                                     (3.12)



Remark: Properties of conditional probability distribution functions

(i) f(x|y) ≥ 0 and f(y|x) ≥ 0.

(ii) If X and Y are discrete, then

     Σ_{∀x} f(x|y) = 1  and  Σ_{∀y} f(y|x) = 1.                       (3.13)

(iii) If X and Y are continuous, then

     ∫_{−∞}^{∞} f(x|y) dx = 1  and  ∫_{−∞}^{∞} f(y|x) dy = 1.         (3.14)

(iv) P(X ∈ A|Y = y) or P(Y ∈ B|X = x) can be obtained from

     P(X ∈ A|Y = y) = ∫_A f(x|y) dx                                   (3.15)

     P(Y ∈ B|X = x) = ∫_B f(y|x) dy                                   (3.16)

Example 3.11 Suppose random variables X and Y have joint pmf

f(x, y) = (x + y)/21  if x = 1, 2, 3, y = 1, 2,
f(x, y) = 0           otherwise.

1. Find the conditional probability function of X when Y = y.

2. Find the conditional probability function of Y when X = x.

3. Find P(X = 2|Y = 2).

4. Find P(Y = 2|X = 2).

Answer 3.11

1. f(x|y) = f(x, y)/h(y), where the marginal pmf of Y is

   h(y) = Σ_{∀x} f(x, y) = [(1 + y) + (2 + y) + (3 + y)]/21 = (6 + 3y)/21,  y = 1, 2.

   Then

   f(x|y) = [(x + y)/21] / [(6 + 3y)/21] = (x + y)/(6 + 3y),  x = 1, 2, 3.

2. f(y|x) = f(x, y)/g(x), where the marginal pmf of X is

   g(x) = Σ_{∀y} f(x, y) = [(x + 1) + (x + 2)]/21 = (2x + 3)/21,  x = 1, 2, 3.

   Then

   f(y|x) = (x + y)/(2x + 3),  y = 1, 2.

3. P(X = 2|Y = 2) = (2 + 2)/(6 + 3·2) = 4/12 = 1/3.

4. P(Y = 2|X = 2) = (2 + 2)/(2·2 + 3) = 4/7.
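The conditional pmfs follow mechanically from the joint table, as in this illustrative
Python sketch:

from fractions import Fraction

f = {(x, y): Fraction(x + y, 21) for x in (1, 2, 3) for y in (1, 2)}
h = {y: sum(f[(x, y)] for x in (1, 2, 3)) for y in (1, 2)}   # marginal of Y
g = {x: sum(f[(x, y)] for y in (1, 2)) for x in (1, 2, 3)}   # marginal of X

print(f[(2, 2)] / h[2])   # P(X = 2 | Y = 2) = 1/3
print(f[(2, 2)] / g[2])   # P(Y = 2 | X = 2) = 4/7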

Example 3.12 Random variables X and Y have joint probability density function defined
by

f(x, y) = (3/2)(x² + y²)  if 0 < x < 1, 0 < y < 1,
f(x, y) = 0               otherwise.

1. Find the conditional probability density function of X given Y = y.

2. Find the conditional probability density function of Y given X = x.

Answer 3.12

1. f(x|y) = f(x, y)/h(y), where

   h(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀¹ (3/2)(x² + y²) dx
        = (3/2)[x³/3 + y²x]₀¹
        = (3/2)(1/3 + y²)   for 0 < y < 1.

   Therefore,

   f(x|y) = (3/2)(x² + y²) / [(3/2)(1/3 + y²)] = 3(x² + y²)/(3y² + 1),
   0 < x < 1, 0 < y < 1.

2. f(y|x) = f(x, y)/g(x), where

   g(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫₀¹ (3/2)(x² + y²) dy
        = (3/2)[x²y + y³/3]₀¹
        = (3/2)(x² + 1/3)   for 0 < x < 1.

   Therefore,

   f(y|x) = 3(x² + y²)/(3x² + 1),  0 < x < 1, 0 < y < 1.
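A symbolic check of both conditional densities (sketch assuming sympy is available):

from sympy import Rational, integrate, simplify, symbols

x, y = symbols("x y")
f = Rational(3, 2) * (x**2 + y**2)

h = integrate(f, (x, 0, 1))   # marginal pdf of Y
g = integrate(f, (y, 0, 1))   # marginal pdf of X
print(simplify(f / h))        # 3*(x**2 + y**2)/(3*y**2 + 1)
print(simplify(f / g))        # 3*(x**2 + y**2)/(3*x**2 + 1)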


3.6 Independence
Recall that events A and B are independent if

P (A ∩ B) = P (A)P (B).

This means that the probability that both events occur equals the product of their
individual probabilities. Independence of two random variables is expressed analogously,
as in Definition 3.7.

Definition 3.7 Let X and Y be random variables with joint probability function f (x, y)
and marginal probability functions of X and Y as g(x) and h(y), respectively. Random
variables X and Y are independent if and only if

f (x, y) = g(x)h(y) for all (x, y) (3.17)



Remark: In Definition 3.7, equation (3.17) holds for both discrete and continuous random
variables.

Theorem 3.1 Let X and Y be independent discrete random variables with marginal
probability functions g(x) and h(y), respectively. Then

P(a < X < b, c < Y < d) = P(a < X < b)P(c < Y < d)
                        = ( Σ_{a<x<b} g(x) ) ( Σ_{c<y<d} h(y) )       (3.18)

where a, b, c, d ∈ R with a < b and c < d.

Theorem 3.2 Let X and Y be independent continuous random variables with marginal
probability density functions g(x) and h(y), respectively. Then

P(a < X < b, c < Y < d) = P(a < X < b)P(c < Y < d)
                        = ( ∫_a^b g(x) dx ) ( ∫_c^d h(y) dy )         (3.19)

where a, b, c, d ∈ R with a < b and c < d.



Example 3.13 Are X and Y with the following joint probability distribution function
independent?

f(x, y) = xy²/30  if x = 1, 2, 3, y = 1, 2,
f(x, y) = 0       otherwise.

Answer 3.13 The marginal pmfs are

g(x) = Σ_{y=1}^{2} xy²/30 = x(1 + 4)/30 = x/6,   x = 1, 2, 3,

h(y) = Σ_{x=1}^{3} xy²/30 = (1 + 2 + 3)y²/30 = y²/5,   y = 1, 2.

Since g(x)h(y) = (x/6)(y²/5) = xy²/30 = f(x, y) for all (x, y), the random variables X
and Y are independent.
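Checking (3.17) over a finite support is a short loop, as in this illustrative Python
sketch:

from fractions import Fraction

f = {(x, y): Fraction(x * y**2, 30) for x in (1, 2, 3) for y in (1, 2)}
g = {x: sum(f[(x, y)] for y in (1, 2)) for x in (1, 2, 3)}
h = {y: sum(f[(x, y)] for x in (1, 2, 3)) for y in (1, 2)}

print(all(f[(x, y)] == g[x] * h[y] for (x, y) in f))   # True: independent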

Example 3.14 Suppose the random variables X and Y are independent with marginal
probability functions of X and Y given by

g(x) = (1/2)(2/3)^x  if x = 1, 2, 3, ...,  and 0 otherwise,

h(y) = (1/2)(2/3)^y  if y = 1, 2, 3, ...,  and 0 otherwise.

Find P(X = 1, Y = 3) and P(X + Y = 3).

Answer 3.14

Since X and Y are independent,

P(X = 1, Y = 3) = P(X = 1)P(Y = 3)
                = [(1/2)(2/3)] [(1/2)(2/3)³]
                = 4/81

and f(x, y) = g(x)h(y), so

f(x, y) = [(1/2)(2/3)^x][(1/2)(2/3)^y] = (1/4)(2/3)^{x+y},
x = 1, 2, ... and y = 1, 2, ....

Therefore,

P(X + Y = 3) = f(1, 2) + f(2, 1)
             = (1/4)(2/3)³ + (1/4)(2/3)³
             = 4/27
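A quick exact check (illustrative Python; g here is the shared marginal pmf from the
statement):

from fractions import Fraction

def g(n):
    # marginal pmf (1/2)(2/3)^n for n = 1, 2, ...
    return Fraction(1, 2) * Fraction(2, 3) ** n

print(g(1) * g(3))                # P(X = 1, Y = 3) = 4/81
print(g(1) * g(2) + g(2) * g(1))  # P(X + Y = 3) = 4/27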


Example 3.15 Suppose random variables X and Y are independent with marginal
distribution functions of X and Y given by

g(x) = 1   if 0 < x < 1,  and 0 otherwise,

and

h(y) = 2y  if 0 < y < 1,  and 0 otherwise.

Find P(Y < X).

Answer 3.15

The joint pdf of X and Y can be expressed as follows:

f(x, y) = g(x)h(y) = 2y,  0 < x < 1, 0 < y < 1.

Then

P(Y < X) = ∫₀¹ ∫₀ˣ f(x, y) dy dx
         = ∫₀¹ ∫₀ˣ 2y dy dx
         = ∫₀¹ [y²]₀ˣ dx
         = ∫₀¹ x² dx
         = [x³/3]₀¹
         = 1/3
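The double integral can be confirmed symbolically (sketch assuming sympy is available):

from sympy import integrate, symbols

x, y = symbols("x y")
f = 2*y   # joint pdf on the unit square

print(integrate(f, (y, 0, x), (x, 0, 1)))   # P(Y < X) = 1/3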


Definition 3.8 Random variables X and Y with joint cdf F(x, y) are independent if and
only if

F(x, y) = G(x)H(y)  for all (x, y)                                    (3.20)

where

G(x) = F(x, ∞) is the cumulative distribution function of X,

H(y) = F(∞, y) is the cumulative distribution function of Y.

Therefore,

G(x) = ∫_{−∞}^{x} g(t) dt = ∫_{−∞}^{x} ∫_{−∞}^{∞} f(t, y) dy dt      (3.21)

H(y) = ∫_{−∞}^{y} h(t) dt = ∫_{−∞}^{y} ∫_{−∞}^{∞} f(x, t) dx dt      (3.22)

We could use F(x) as an alternative notation for G(x), and F(y) as an alternative
notation for H(y).

Independence for multivariate random variables


Definition 3.9 Let X1 , X2 , . . . , Xn be random variables with a joint probability func-
tion f (x1 , . . . , xn ) and marginal probability functions f (x1 ), . . . , f (xn ), respectively. The
random variables X1 , X2 , . . . , Xn are independent if and only if

f (x1 , . . . , xn ) = f (x1 ) × f (x2 ) × . . . × f (xn ) (3.23)



Example 3.16 Suppose random variables X, Y and Z are independent, each with the uniform
distribution on (0, 1). Find P(Z ≥ XY).

Answer 3.16

Since X, Y and Z ~ U(0, 1) are independent, their joint pdf can be written as

f(x, y, z) = f(x)f(y)f(z) = 1,  0 < x < 1, 0 < y < 1, 0 < z < 1.

Therefore,

P(Z ≥ XY) = ∫∫∫_{z ≥ xy} f(x, y, z) dx dy dz
          = ∫₀¹ ∫₀¹ ∫_{xy}^{1} dz dy dx
          = ∫₀¹ ∫₀¹ [z]_{xy}^{1} dy dx
          = ∫₀¹ ∫₀¹ (1 − xy) dy dx
          = ∫₀¹ [y − xy²/2]₀¹ dx
          = ∫₀¹ (1 − x/2) dx
          = [x − x²/4]₀¹
          = 3/4
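Probabilities of this kind are also easy to approximate by simulation. An illustrative
Monte Carlo sketch (standard library only; the seed is arbitrary):

import random

random.seed(0)
n = 100_000
hits = sum(random.random() >= random.random() * random.random()
           for _ in range(n))
print(hits / n)   # close to 3/4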


3.7 Exercises
1. Flip a fair coin three times and record the sequence of heads (H) and tails (T).
   Let random variable X denote the number of heads obtained. Let random variable Y
   denote the winnings earned in a single play of a game with the following rules:

   A player wins $1 if the first H occurs on the first flip.
   A player wins $2 if the first H occurs on the second flip.
   A player wins $3 if the first H occurs on the third flip.
   A player loses $1 if no H occurs.

   Answer the following questions.

   1.1. What are all possible values of X?

   1.2. What are all possible values of Y?

   1.3. Present the joint probability distribution of (X, Y).

2. Roll two dice. Let X be the number on the first die and let Y be the total on both
   dice.

   2.1. Show the joint probability distribution.

   2.2. Are X and Y independent? Give a reason to support your answer.
3. A fair coin is tossed three times independently. Let X denote the number of heads
   on the first toss, and Y denote the total number of heads.

   3.1. Find the joint probability mass function of X and Y.

   3.2. Find the marginal probability mass functions of X and Y.

   3.3. What is the conditional probability mass function of X given Y?

   3.4. Are X and Y independent? Give a reason to support your answer.

4. Consider two random variables X and Y with joint probability distribution given in
   the following table. Answer the following questions.

           Y = 0   Y = 1   Y = 2
   X = 0   1/8     c       1/4
   X = 1   1/8     1/6     1/6

   4.1. Find c.

   4.2. Find the marginal probability distributions of X and Y.

   4.3. Find P(Y = 1|X = 0).

4.4. Are X and Y independent?

5. Two textbooks are randomly selected from a shelf containing 5 English literature,
   3 history, and 2 psychology textbooks. Let X denote the number of English literature
   textbooks selected and Y the number of history textbooks selected.

   5.1. Find the joint probability distribution of X and Y.

   5.2. Express the event that no psychology textbooks are selected in terms of X and
        Y, and then determine its probability.

6. The joint probability distribution of random variables X and Y is given by

   Table 3.2: Joint probability distribution of X and Y

   X \Y    0      1      2      3
   -1      0.15   0      0.05   0.1
   1       0.2    0.05   0      0.15
   3       0.1    0.1    0      0.1

   Find the marginal probability function of Y.

7. Suppose the random variables X and Y have the joint probability distribution
   function defined by

   f(x, y) = c(2x + y)  if x = 0, 1, y = 0, 1, 2,
   f(x, y) = 0          otherwise.

   7.1. Find c.

   7.2. Find the marginal probability function of X.

   7.3. Find f(y|x) and f(y = 1|x = 2).

8. An electronics store sells two main products: laptops (L) and smartphones (S). The
   number of each sold on a given day is tracked, and the store manager has created the
   following joint pmf table based on historical data. Answer the following questions.

S=0 S=1 S=2 S=3


L=0 0.05 0.10 0.08 0.02
L=1 0.12 0.20 0.15 0.03
L=2 0.08 0.10 0.05 0.02

   8.1. Find the probability mass functions for the number of laptops sold, P(L), and
        the number of smartphones sold, P(S).

   8.2. What is the probability of selling two smartphones on a day when one laptop is
        sold?

   8.3. Are the random variables L and S independent? Explain your reasoning.
