
Chapter 6 part 1

Jointly Distributed Random Variables

EEE 214



Outline

1 Joint distribution

2 Independent random variables

Joint distribution

Joint cdf
Definition
We have a pair of random variables (either discrete or continuous) X and
Y . The joint cumulative probability distribution function of X and Y is
defined by
FX,Y (x, y) = P [X ≤ x, Y ≤ y]
= P [(X, Y ) lies south-west of the point (x, y)]

[Figure: a point (x, y) in the plane; the event {X ≤ x, Y ≤ y} is the region to its south-west.]
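
As an aside (not from the slides), the short Python sketch below estimates FX,Y (x, y) by counting the fraction of simulated (X, Y) pairs that fall south-west of (x, y); the sampling distribution and variable names are only for illustration.

```python
# Minimal sketch: estimate the joint cdf F_{X,Y}(x, y) = P[X <= x, Y <= y]
# from simulated pairs (distribution chosen arbitrarily for illustration).
import numpy as np

rng = np.random.default_rng(0)
x_samples = rng.normal(size=100_000)   # X ~ N(0, 1)
y_samples = rng.normal(size=100_000)   # Y ~ N(0, 1), independent of X here

def joint_cdf_estimate(x, y):
    """Fraction of sample points lying south-west of (x, y)."""
    return np.mean((x_samples <= x) & (y_samples <= y))

print(joint_cdf_estimate(0.0, 0.0))    # approximately 0.25 for this pair
```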
Properties of joint cdf


For one random variable: marginal cdf

FX (x) = FX,Y (x, ∞)

FX (x) = P (X ≤ x) = P (X ≤ x, Y ≤ ∞) = FX,Y (x, ∞)

FY (y) = FX,Y (∞, y)


FY (y) = P (Y ≤ y) = P (X ≤ ∞, Y ≤ y) = FX,Y (∞, y)

Joint probabilities

P (X > x, Y > y) = 1 − FX (x) − FY (y) + FX,Y (x, y)

Example: two discrete random variables


Draw two socks at random, without replacement, from a drawer full of
twelve colored socks:
6 black, 4 white, 2 purple

Let B be the number of Black socks, W the number of White socks drawn.
Then the distributions of B and W are given by:

             k = 0                     k = 1                       k = 2
P(B = k)     (6/12)(5/11) = 15/66      2 · (6/12)(6/11) = 36/66    (6/12)(5/11) = 15/66
P(W = k)     (8/12)(7/11) = 28/66      2 · (4/12)(8/11) = 32/66    (4/12)(3/11) = 6/66

Note - P (B = k) = (6 choose k)(6 choose 2−k) / (12 choose 2) and
P (W = k) = (4 choose k)(8 choose 2−k) / (12 choose 2)

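These marginal pmfs can be checked numerically; the sketch below (not part of the slides, and assuming scipy is available) compares the table entries with the hypergeometric pmf.

```python
# Check of the marginals P(B = k) and P(W = k) against the hypergeometric pmf.
from fractions import Fraction
from math import comb

from scipy.stats import hypergeom

for k in range(3):
    exact = Fraction(comb(6, k) * comb(6, 2 - k), comb(12, 2))   # P(B = k)
    print(f"P(B={k}) = {exact} = {hypergeom.pmf(k, 12, 6, 2):.4f}")

for k in range(3):
    exact = Fraction(comb(4, k) * comb(8, 2 - k), comb(12, 2))   # P(W = k)
    print(f"P(W={k}) = {exact} = {hypergeom.pmf(k, 12, 4, 2):.4f}")
```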


Draw two socks at random, without replacement, from a drawer full of
twelve colored socks: 6 black, 4 white, 2 purple. Let B be the number of
Black socks, W the number of White socks drawn.
The joint distribution is given by: pB,W (b, w) = P (B = b, W = w)

               W = 0     W = 1     W = 2     row sum
    B = 0       1/66      8/66      6/66      15/66
    B = 1      12/66     24/66      0         36/66
    B = 2      15/66      0         0         15/66
    col sum    28/66     32/66      6/66

P (B = b, W = w) = (6 choose b)(4 choose w)(2 choose 2−b−w) / (12 choose 2),
for 0 ≤ b, w ≤ 2 and b + w ≤ 2

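A short sketch (not from the slides) that builds this joint pmf by counting and confirms that the row and column sums reproduce the marginal distributions of B and W:

```python
# Build the joint pmf P(B = b, W = w) and check the marginals by summing.
from fractions import Fraction
from math import comb

total = comb(12, 2)   # 66 equally likely pairs of socks
joint = {}
for b in range(3):
    for w in range(3):
        if b + w <= 2:
            joint[(b, w)] = Fraction(comb(6, b) * comb(4, w) * comb(2, 2 - b - w), total)
        else:
            joint[(b, w)] = Fraction(0)

marg_B = {b: sum(joint[(b, w)] for w in range(3)) for b in range(3)}
marg_W = {w: sum(joint[(b, w)] for b in range(3)) for w in range(3)}
print(marg_B)   # 15/66, 36/66, 15/66 (printed in lowest terms)
print(marg_W)   # 28/66, 32/66, 6/66 (printed in lowest terms)
```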


Marginal Distributions
Note that the row and column sums are the distributions of B and W,
respectively.

P (B = b) = P (B = b, W = 0) + P (B = b, W = 1) + P (B = b, W = 2)
P (W = w) = P (B = 0, W = w) + P (B = 1, W = w) + P (B = 2, W = w)
These are the marginal distributions of B and W. In general,

P (X = x) = Σ_y P (X = x, Y = y) = Σ_y P (X = x | Y = y) P (Y = y)



Conditional Distribution
Conditional distributions are defined as we have seen previously:

P (X = x | Y = y) = P (X = x, Y = y) / P (Y = y) = joint pmf / marginal pmf

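For instance, the conditional pmf of B given W = 0 in the sock example can be computed as a ratio of joint to marginal probabilities; a minimal sketch (not from the slides):

```python
# Conditional pmf P(B = b | W = 0) = P(B = b, W = 0) / P(W = 0) for the sock example.
from fractions import Fraction
from math import comb

def joint(b, w):
    """P(B = b, W = w) for two socks drawn from 6 black, 4 white, 2 purple."""
    if b + w > 2:
        return Fraction(0)
    return Fraction(comb(6, b) * comb(4, w) * comb(2, 2 - b - w), comb(12, 2))

p_W0 = sum(joint(b, 0) for b in range(3))          # marginal P(W = 0) = 28/66
cond = {b: joint(b, 0) / p_W0 for b in range(3)}
print(cond)                                        # {0: 1/28, 1: 3/7, 2: 15/28}
```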


Joint distribution of two continuous random variables


Definition
Random variables X and Y are jointly continuous if there exists a function
f (x, y) such that
1 Non-negative: f (x, y) ≥ 0, for any x, y ∈ R, and
2 ∫_{−∞}^{∞} ∫_{−∞}^{∞} f (x, y) dx dy = 1.

fX,Y (x, y) is called the joint probability density function of X and Y.

For any set C ⊂ R²,

P [(X, Y ) ∈ C] = ∫∫_{(x,y)∈C} f (x, y) dx dy

Connection between joint pdf and joint cdf

F (a, b) = P (X ≤ a, Y ≤ b) = ∫_{−∞}^{b} ∫_{−∞}^{a} f (x, y) dx dy

f (x, y) = ∂²F (x, y) / ∂x∂y
Marginal pdfs
Marginal probability density functions are defined in terms of “integrating
out” one of the random variables.
fX (x) = ∫_{−∞}^{∞} f (x, y) dy

fY (y) = ∫_{−∞}^{∞} f (x, y) dx

Let X and Y be drawn uniformly from the triangle {(x, y) : x ≥ 0, y ≥ 0, x + y ≤ 3}.
Find the joint pdf fX,Y (x, y).

Since the joint density is constant,

f (x, y) = c   for x ≥ 0, y ≥ 0 and x + y ≤ 3
f (x, y) = 0   otherwise

Because

1 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y (x, y) dx dy
  = ∫∫_{x≥0, y≥0, x+y≤3} c dx dy
  = c × area of the triangle = c × (3 × 3)/2

Therefore, c = 2/9.
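A quick numerical sanity check (not part of the slides, assuming scipy is available) that the density with c = 2/9 integrates to 1 over the triangle:

```python
# With c = 2/9, the joint density integrates to 1 over x >= 0, y >= 0, x + y <= 3.
from scipy.integrate import dblquad

c = 2 / 9
# dblquad integrates func(y, x): y runs from 0 to 3 - x, and x from 0 to 3.
total, _ = dblquad(lambda y, x: c, 0, 3, lambda x: 0, lambda x: 3 - x)
print(total)   # approximately 1.0
```
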
Recap
Joint cdf of two random variables X and Y :

FX,Y (x, y) = P [X ≤ x, Y ≤ y], −∞ < x, y < ∞

Probability of (X, Y ) in a rectangle

P (x1 < X ≤ x2 , y1 < Y ≤ y2 )

= F (x2 , y2 ) + F (x1 , y1 ) − F (x1 , y2 ) − F (x2 , y1 )


Marginal cdfs

FX (x) = FX,Y (x, ∞), FY (y) = FX,Y (∞, y)

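The rectangle formula can be checked against an empirical joint cdf; the sketch below (not from the slides; the simulated distribution is arbitrary) compares the direct frequency of the rectangle event with the cdf combination.

```python
# Check P(x1 < X <= x2, y1 < Y <= y2) = F(x2,y2) + F(x1,y1) - F(x1,y2) - F(x2,y1)
# using an empirical cdf from simulated pairs.
import numpy as np

rng = np.random.default_rng(1)
X = rng.exponential(size=200_000)
Y = X + rng.normal(size=200_000)        # a dependent pair, purely for illustration

def F(x, y):
    """Empirical joint cdf."""
    return np.mean((X <= x) & (Y <= y))

x1, x2, y1, y2 = 0.5, 2.0, 0.0, 1.5
direct = np.mean((X > x1) & (X <= x2) & (Y > y1) & (Y <= y2))
via_cdf = F(x2, y2) + F(x1, y1) - F(x1, y2) - F(x2, y1)
print(direct, via_cdf)                  # the two values agree
```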


Joint distribution of two discrete random variables

Joint pmf
pX,Y (x, y) = P (X = x, Y = y)

Marginal pmfs
pX (x) = Σ_{y: p(x,y)>0} pX,Y (x, y),    pY (y) = Σ_{x: p(x,y)>0} pX,Y (x, y)

Joint distribution of two continuous random variables

Joint pdf
▶ Non-negative: fX,Y (x, y) ≥ 0, for any x, y ∈ R
▶ ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y (x, y) dx dy = 1
▶ For any set C ⊂ R²,
  P [(X, Y ) ∈ C] = ∫∫_{(x,y)∈C} fX,Y (x, y) dx dy

Marginal pdfs
fX (x) = ∫_{−∞}^{∞} fX,Y (x, y) dy,    fY (y) = ∫_{−∞}^{∞} fX,Y (x, y) dx
Let X and Y have the following joint pdf:

f (x, y) = 2/9   for x ≥ 0, y ≥ 0 and x + y ≤ 3
f (x, y) = 0     otherwise

Find the marginal pdf fX (x).

For x ∈ [0, 3],

fX (x) = ∫_{−∞}^{∞} f (x, y) dy = ∫_{0}^{3−x} (2/9) dy = (2/9)(3 − x)

Be careful about the range of Y given X = x. Therefore,

fX (x) = (2/9)(3 − x)   for x ∈ [0, 3]
fX (x) = 0              otherwise

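A quick check (not from the slides, assuming scipy) that this marginal integrates to 1 and matches integrating the joint density over y at a fixed x:

```python
# Verify the marginal f_X(x) = (2/9)(3 - x) on [0, 3].
from scipy.integrate import quad

def f_X(x):
    return (2 / 9) * (3 - x)

total, _ = quad(f_X, 0, 3)
print(total)                                            # approximately 1.0

x0 = 1.0
marginal_at_x0, _ = quad(lambda y: 2 / 9, 0, 3 - x0)    # integrate the joint pdf over y
print(marginal_at_x0, f_X(x0))                          # both approximately 4/9
```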


In the previous example, find P (X < Y ).

f (x, y) = 2/9   for x ≥ 0, y ≥ 0 and x + y ≤ 3
f (x, y) = 0     otherwise

Identify the region C = {(x, y) : x ≥ 0, y ≥ 0, x + y ≤ 3, x < y}.

P (X < Y ) = ∫∫_{(x,y)∈C} f (x, y) dx dy
           = ∫_{0}^{3/2} ∫_{x}^{3−x} (2/9) dy dx
           = ∫_{0}^{3/2} (2/9)(3 − 2x) dx
           = (2/9) [3x − x²]_{0}^{3/2}
           = (2/9)(9/2 − 9/4) = 1/2

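A Monte Carlo check of this answer (not part of the slides), drawing uniform points from the bounding square and keeping those inside the triangle:

```python
# Estimate P(X < Y) for (X, Y) uniform on the triangle x >= 0, y >= 0, x + y <= 3.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 3, size=1_000_000)
y = rng.uniform(0, 3, size=1_000_000)
inside = x + y <= 3                       # rejection step: keep points in the triangle
print(np.mean(x[inside] < y[inside]))     # approximately 0.5
```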


Independent random variables


Definition
Random variables X and Y are independent if, for any sets A, B ⊂ R,

P (X ∈ A, Y ∈ B) = P (X ∈ A)P (Y ∈ B)

Random variables X and Y are independent if and only if


Cdf: for any x, y ∈ R

FX,Y (x, y) = FX (x)FY (y)

If both are discrete, pmf: for any x, y ∈ R

pX,Y (x, y) = pX (x)pY (y)

If both are continuous, pdf: for any x, y ∈ R

fX,Y (x, y) = fX (x)fY (y)


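As a concrete (non-)example, not from the slides: the sock counts B and W from the earlier example are not independent, since the pmf factorization fails.

```python
# B and W are not independent: p_{B,W}(b, w) != p_B(b) p_W(w) for some (b, w).
from fractions import Fraction
from math import comb

total = comb(12, 2)

def p_joint(b, w):
    if b + w > 2:
        return Fraction(0)
    return Fraction(comb(6, b) * comb(4, w) * comb(2, 2 - b - w), total)

def p_B(b):
    return Fraction(comb(6, b) * comb(6, 2 - b), total)

def p_W(w):
    return Fraction(comb(4, w) * comb(8, 2 - w), total)

print(p_joint(2, 1), p_B(2) * p_W(1))   # 0 versus (15/66)(32/66): the product rule fails
```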
A man and a woman decide to meet at a certain location. If each of them
independently arrives at a time uniformly distributed between 12 noon and
1 P.M., find the probability that the first to arrive has to wait longer than
10 minutes.

Let random variables X, Y be the times they arrive (uniform between 0 and
60 minutes).

P (|X − Y | > 10)
  = P (Y > X + 10) + P (Y < X − 10)
  = 2 P (Y > X + 10)
  = 2 ∫∫_{y>x+10} fX,Y (x, y) dx dy
  = 2 ∫∫_{y>x+10} fX (x) fY (y) dx dy
  = 2 ∫_{10}^{60} ∫_{0}^{y−10} (1/60)² dx dy = 25/36
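A Monte Carlo check of the answer (not from the slides):

```python
# Estimate P(|X - Y| > 10) for independent X, Y uniform on [0, 60]; exact value 25/36.
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0, 60, size=1_000_000)
y = rng.uniform(0, 60, size=1_000_000)
print(np.mean(np.abs(x - y) > 10), 25 / 36)   # the estimate is close to 25/36 ~ 0.694
```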
Independent random variables


The continuous (discrete) random variables X and Y are independent if
and only if their joint probability density (mass) function can be expressed
as
fX,Y (x, y) = g(x)h(y), −∞ < x, y < ∞



More than two random variables


☞ Random variables X1 , X2 , . . . , Xn are independent if, for any sets
A1 , A2 , . . . , An ⊂ R,

P (X1 ∈ A1 , . . . , Xn ∈ An ) = P (X1 ∈ A1 ) · · · P (Xn ∈ An )

Random variables X1 , X2 , . . . , Xn are independent if and only if


Cdf: for any x1 , x2 , . . . , xn ∈ R

F (x1 , . . . , xn ) = FX1 (x1 ) · · · FXn (xn )

If all are discrete, pmf: for any x1 , x2 , . . . , xn ∈ R

p(x1 , . . . , xn ) = pX1 (x1 ) · · · pXn (xn )

If all are continuous, pdf: for any x1 , x2 , . . . , xn ∈ R

f (x1 , . . . , xn ) = fX1 (x1 ) · · · fXn (xn )
