Chapter04 MultiRV 2023su

This document discusses bivariate random variables, including their joint, marginal, and conditional probability distributions. It provides examples of discrete and continuous bivariate random variables. It defines key terms like joint probability mass/density functions, marginal probability mass/density functions, and conditional probability mass/density functions.

Basic Quantitative Method

Chapter 4 Multivariate Random Variables

2023.7.14.

Basic Quantitative Method 2023.7.14. 1 / 46


Multivariate Random Variables

In this lecture, we will focus on two or more random variables.


We will learn the following:
Joint Probability Distribution
Marginal Probability Distribution
Conditional Distribution
I.I.D. Random Variables
Functions of Random Variables



Multivariate Random Variables

Definition
An n-dimensional random variable (random vector) X = (X1, X2, ..., Xn)′ is a function from the sample space Ω to Rⁿ:

X ∶ Ω ↦ Rⁿ

Let’s first examine the simplest case, n = 2, i.e., bivariate random variables.


Section 1

Bivariate Discrete Random Variables


Bivariate Discrete Random Variables

Example: Toss an unfair coin twice: Ω = {HH, HT, TH, TT}

Let

X(ω) = 0 if ω ∈ {HH, TT},    X(ω) = 1 if ω ∈ {HT, TH}
Y(ω) = 0 if ω ∈ {HH, HT, TH},    Y(ω) = 1 if ω ∈ {TT}

The mapping Ω ↦ R² is

ω     (X(ω), Y(ω))
TT    (0, 1)
TH    (1, 0)
HT    (1, 0)
HH    (0, 0)



Bivariate Discrete Random Variables

Joint Probability Mass Function

Definition (Joint Probability Mass Function)

Let X, Y be two discrete random variables, and let S denote the two-dimensional support of X and Y. The joint probability mass function is

f_XY(x, y) = P(X = x, Y = y)

satisfying the following three conditions:
f_XY(x, y) ≥ 0 for all (x, y) ∈ R².
∑_{(x,y)∈S} f_XY(x, y) = ∑_x ∑_y f_XY(x, y) = 1.
P((X, Y) ∈ A) = ∑_{(x,y)∈A} f_XY(x, y), where A ⊂ S.



Bivariate Discrete Random Variables

Back to Example

Suppose that P(H) = 2/3, and

X = 0 if ω ∈ {HH, TT},    X = 1 if ω ∈ {HT, TH}
Y = 0 if ω ∈ {HH, HT, TH},    Y = 1 if ω ∈ {TT}

The original probability structure and joint pmf are

ω     P({ω})    (X(ω), Y(ω))        (x, y)    f(x, y) = P(X = x, Y = y)
TT    1/9       (0, 1)              (0, 0)    4/9
TH    2/9       (1, 0)              (0, 1)    1/9
HT    2/9       (1, 0)              (1, 0)    4/9
HH    4/9       (0, 0)              (1, 1)    0

For example, let A = {(x, y) ∣ x + y = 1}. Find P((X, Y) ∈ A) = ?
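The slides carry no code, but the construction above is easy to check mechanically. A short Python sketch (variable names such as omega and f_xy are my own) builds the joint pmf by pushing each sample-space probability through (X, Y):

```python
from fractions import Fraction

# P(H) = 2/3 as on the slide; exact arithmetic via Fraction.
p_h = Fraction(2, 3)
omega = {
    "HH": p_h * p_h,         # 4/9
    "HT": p_h * (1 - p_h),   # 2/9
    "TH": (1 - p_h) * p_h,   # 2/9
    "TT": (1 - p_h) ** 2,    # 1/9
}

# X = 1 on {HT, TH}; Y = 1 on {TT}, as defined earlier.
X = {"HH": 0, "HT": 1, "TH": 1, "TT": 0}
Y = {"HH": 0, "HT": 0, "TH": 0, "TT": 1}

# Build the joint pmf f_XY(x, y) by accumulating probability.
f_xy = {}
for w, p in omega.items():
    key = (X[w], Y[w])
    f_xy[key] = f_xy.get(key, Fraction(0)) + p

# P((X, Y) in A) for A = {(x, y) : x + y = 1}
p_A = sum(p for (x, y), p in f_xy.items() if x + y == 1)
print(p_A)  # 5/9
```

The event A = {x + y = 1} collects the cells (0, 1) and (1, 0), so P((X, Y) ∈ A) = 1/9 + 4/9 = 5/9.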



Bivariate Discrete Random Variables

Marginal Probability Mass Function

Definition (Marginal Probability Mass Function)


Let X and Y have the joint probability mass function f_XY(x, y). The marginal probability mass function of X is

f_X(x) = ∑_y f_XY(x, y)

Similarly, the marginal probability mass function of Y is

f_Y(y) = ∑_x f_XY(x, y)



Bivariate Discrete Random Variables

Back to Example

For example,

f_X(0) = ∑_{y∈{0,1}} f_XY(0, y) = f_XY(0, 0) + f_XY(0, 1) = 4/9 + 1/9 = 5/9

f_X(1) = ∑_{y∈{0,1}} f_XY(1, y) = f_XY(1, 0) + f_XY(1, 1) = 4/9 + 0 = 4/9

Hence,

f_X(x) = 5/9 if x = 0,    4/9 if x = 1
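The marginalization step is a one-liner to verify in Python (a sketch with my own names, using the joint pmf of the coin example):

```python
from fractions import Fraction

# Joint pmf from the example; entries not listed are zero.
f_xy = {(0, 0): Fraction(4, 9), (0, 1): Fraction(1, 9),
        (1, 0): Fraction(4, 9), (1, 1): Fraction(0)}

# f_X(x) = sum over y of f_XY(x, y)
f_x = {x: sum(p for (xx, _), p in f_xy.items() if xx == x) for x in (0, 1)}
print(f_x)  # {0: Fraction(5, 9), 1: Fraction(4, 9)}
```

As a sanity check, the marginal probabilities sum to one.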

Bivariate Discrete Random Variables

Conditional Distribution

Definition (Conditional Probability Mass Function)


Given discrete random variables X and Y, the conditional pmf of Y given X = x is the conditional probability that Y = y given that X = x:

f_{Y∣X=x}(y) = P(Y = y ∣ X = x) = P(X = x, Y = y) / P(X = x) = f_XY(x, y) / f_X(x)

A concise notation: f(y∣x)



Bivariate Discrete Random Variables

Conditional Distribution

Given the joint distribution,


          X = 0    X = 1    f_Y(y)
Y = 0     4/9      4/9      8/9
Y = 1     1/9      0        1/9
f_X(x)    5/9      4/9
For example,
P(Y = 0∣X = 0) =?
P(Y = 1∣X = 0) =?
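The two conditional probabilities asked for can be computed directly from the definition f_{Y|X=x}(y) = f_XY(x, y)/f_X(x); a quick Python sketch (names mine):

```python
from fractions import Fraction

f_xy = {(0, 0): Fraction(4, 9), (0, 1): Fraction(1, 9),
        (1, 0): Fraction(4, 9), (1, 1): Fraction(0)}
f_x0 = f_xy[(0, 0)] + f_xy[(0, 1)]   # f_X(0) = 5/9

# f_{Y|X=0}(y) = f_XY(0, y) / f_X(0)
cond = {y: f_xy[(0, y)] / f_x0 for y in (0, 1)}
print(cond[0], cond[1])  # 4/5 1/5
```

So P(Y = 0 ∣ X = 0) = 4/5 and P(Y = 1 ∣ X = 0) = 1/5, matching the table on the next slide.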



Bivariate Discrete Random Variables

Marginal vs. Conditional Distribution

     Marginal    Conditional
y    f_Y(y)      f_{Y∣X=0}(y)
0    8/9         4/5
1    1/9         1/5


Section 3

Bivariate Continuous Random Variables


Bivariate Continuous Random Variables

Definition (Joint Probability Density Function)


Let X and Y be two continuous random variables and let S denote
the two-dimensional support of X and Y. The function f XY (x, y) is a
joint probability density function if it satisfies the following three
conditions:
f_XY(x, y) ≥ 0 for all (x, y) ∈ R²
∬_S f_XY(x, y) dx dy = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
P((X, Y) ∈ A) = ∬_A f_XY(x, y) dx dy, where A ⊂ S.



Bivariate Continuous Random Variables

Example

Let X and Y have the following joint probability density function:

f XY (x, y) = x + y

with support 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1.


Is f XY (x, y) a valid p.d.f.?
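Nonnegativity is immediate on the unit square; the remaining condition is that the density integrates to one. A midpoint-rule double sum in Python (a numerical sketch, not from the slides) confirms it:

```python
# Riemann-sum check that f(x, y) = x + y integrates to 1 over [0,1] x [0,1].
n = 400
h = 1.0 / n
total = 0.0
for i in range(n):
    for j in range(n):
        x = (i + 0.5) * h   # midpoint of cell (i, j)
        y = (j + 0.5) * h
        total += (x + y) * h * h
print(total)  # ≈ 1.0
```

The midpoint rule is exact for linear integrands, so the sum equals 1 up to floating-point error; analytically, ∫₀¹∫₀¹ (x + y) dx dy = 1/2 + 1/2 = 1.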



Bivariate Continuous Random Variables

Example: f_XY(x, y) = x + y

[Surface plot of f_XY(x, y) = x + y over the unit square omitted.]


Bivariate Continuous Random Variables

Bivariate Continuous Random Variables

Definition (Marginal Probability Density Function)


The marginal probability density functions of the continuous random variables X and Y are given, respectively, by

f_X(x) = ∫_y f_XY(x, y) dy

and

f_Y(y) = ∫_x f_XY(x, y) dx

Back to example: f_XY(x, y) = x + y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,

f_X(x) = ?
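Integrating out y gives f_X(x) = ∫₀¹ (x + y) dy = x + 1/2 (worked here, not stated on the slide). A midpoint-rule sketch in Python (function name mine) checks this at a few points:

```python
# Numerically marginalize: f_X(x) = ∫_0^1 (x + y) dy, which should be x + 1/2.
def marginal_fX(x, n=1000):
    h = 1.0 / n
    return sum((x + (j + 0.5) * h) * h for j in range(n))  # midpoint rule

for x in (0.0, 0.25, 0.5, 1.0):
    print(x, marginal_fX(x))  # ≈ x + 0.5 in each case
```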




Bivariate Continuous Random Variables

Definition (Conditional Probability Density Function)


Given two continuous random variables X and Y with joint pdf f_XY(x, y) and respective marginals f_X(x) and f_Y(y), the conditional pdf for Y given X = x is defined by

f_{Y∣X=x}(y) = f_XY(x, y) / f_X(x)

Recall that f_X(x) ≠ P(X = x): the conditional pdf is defined through this density ratio, not as the result of conditioning on the probability-zero event {X = x}.
The conditional probability is computed by

P(a < Y < b ∣ X = x) = ∫_a^b f_{Y∣X=x}(y) dy



Bivariate Continuous Random Variables

Example

Back to example:

f_XY(x, y) = x + y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1

Find
the marginal pdf of X: f X (x)
the conditional pdf of Y: fY∣X=x (y)
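With f_X(x) = x + 1/2 (the marginal worked out above under the same assumptions), the conditional pdf is f_{Y|X=x}(y) = (x + y)/(x + 1/2) on 0 ≤ y ≤ 1. A Python sketch (names mine) checks that it integrates to one for a fixed x:

```python
# Conditional pdf of Y given X = x for the joint density f(x, y) = x + y.
def f_cond(y, x):
    return (x + y) / (x + 0.5)   # f_XY(x, y) / f_X(x)

x = 0.3
n = 1000
h = 1.0 / n
mass = sum(f_cond((j + 0.5) * h, x) * h for j in range(n))  # midpoint rule
print(mass)  # ≈ 1.0, as any pdf must integrate to
```

Analytically, ∫₀¹ (x + y) dy / (x + 1/2) = (x + 1/2)/(x + 1/2) = 1 for any fixed x.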




Section 4

Joint Distribution Function




Joint Distribution Function

Definition (Joint Distribution Function)


The joint distribution function is given by

F_XY(x, y) = P(X ≤ x, Y ≤ y)

For discrete random variables,

F_XY(x, y) = ∑_{w≤x} ∑_{m≤y} P(X = w, Y = m)

For continuous random variables,

F_XY(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(s, t) ds dt



Joint Distribution Function

Example: Discrete Random Variables

Given the joint pmf,


          X = 0    X = 1    f_Y(y)
Y = 0     4/9      4/9      8/9
Y = 1     1/9      0        1/9
f_X(x)    5/9      4/9
For example, FXY (0, 1) =?
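F_XY(0, 1) sums all pmf cells with w ≤ 0 and m ≤ 1, i.e. f(0, 0) + f(0, 1) = 5/9. A short Python sketch (names mine) implements the discrete CDF directly:

```python
from fractions import Fraction

f_xy = {(0, 0): Fraction(4, 9), (0, 1): Fraction(1, 9),
        (1, 0): Fraction(4, 9), (1, 1): Fraction(0)}

def F(x, y):
    # Sum the pmf over the "lower-left" region {w <= x, m <= y}.
    return sum(p for (w, m), p in f_xy.items() if w <= x and m <= y)

print(F(0, 1), F(1, 1))  # 5/9 1
```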



Joint Distribution Function

Example: Continuous Random Variables

Given the joint pdf:


f_XY(x, y) = x + y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
what is the joint CDF
FXY (x, y) =?
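Carrying out the double integral ∫₀^y ∫₀^x (s + t) ds dt gives F_XY(x, y) = (x²y + xy²)/2 on the unit square (worked here; the slide leaves it as an exercise). A Python sketch compares this closed form against a midpoint-rule double sum:

```python
# Numerical joint CDF for f(x, y) = x + y on the unit square.
def F_numeric(x, y, n=200):
    hx, hy = x / n, y / n
    return sum(((i + 0.5) * hx + (j + 0.5) * hy) * hx * hy
               for i in range(n) for j in range(n))

x, y = 0.6, 0.8
analytic = (x**2 * y + x * y**2) / 2   # candidate closed form
print(analytic, F_numeric(x, y))       # both ≈ 0.336
```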




Independent Random Variables

Definition
Two random variables X and Y are said to be independent if

f_XY(x, y) = f_X(x) f_Y(y)

for all realizations x and y, where f_XY is the joint pmf (pdf), and f_X and f_Y are the marginal pmfs (pdfs) in the discrete (continuous) case.

Note that for the discrete case, the condition is

P(X = x, Y = y) = P(X = x)P(Y = y)
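This factorization condition is easy to test on the coin example from earlier (a Python sketch, names mine):

```python
from fractions import Fraction

f_xy = {(0, 0): Fraction(4, 9), (0, 1): Fraction(1, 9),
        (1, 0): Fraction(4, 9), (1, 1): Fraction(0)}
f_x = {0: Fraction(5, 9), 1: Fraction(4, 9)}
f_y = {0: Fraction(8, 9), 1: Fraction(1, 9)}

# Independence requires f_XY(x, y) == f_X(x) * f_Y(y) for EVERY cell.
independent = all(f_xy[(x, y)] == f_x[x] * f_y[y]
                  for x in (0, 1) for y in (0, 1))
print(independent)  # False
```

X and Y here are not independent: f_XY(1, 1) = 0 while f_X(1) f_Y(1) = 4/81 > 0, so a single failing cell is enough.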




Independent Random Variables

Theorem
If X and Y are independent, then h(X) and g(Y) are also
independent.

For example, if X and Y are independent, then X² and Y are also independent.




Section 6

Multivariate Random Variables




Multivariate Random Variables

We now generalize two random variables to n random variables:

X1 , X2 , . . . , X n

To simplify the notation, we use

X = (X1, X2, ..., Xn)′

(an n × 1 column vector) to denote a random vector.



Multivariate Random Variables

Joint Probability Mass Function

Definition (Joint Probability Mass Function)


Let X1 , . . . , X n be discrete random variables. The joint probability
mass function is

f X (x1 , x2 , . . . , x n ) = P(X1 = x1 , X2 = x2 , . . . , X n = x n )

where
f_X(x1, x2, ..., xn) ≥ 0 for all (x1, x2, ..., xn) ∈ Rⁿ.
∑_{x1} ∑_{x2} ⋯ ∑_{xn} f_X(x1, x2, ..., xn) = 1.
P((X1, X2, ..., Xn) ∈ A) = ∑_{(x1,x2,...,xn)∈A} f_X(x1, x2, ..., xn)



Multivariate Random Variables

Marginal Probability Mass Function

Definition (Marginal Probability Mass Function)


The marginal probability mass function of X1 is

f_X1(x1) = ∑_{x2} ⋯ ∑_{xn} f_X(x1, x2, ..., xn)



Multivariate Random Variables

Joint Distribution Function

Definition (Joint Distribution Function)


The joint distribution function is given by

F_X(x1, x2, ..., xn) = P(X1 ≤ x1, X2 ≤ x2, ..., Xn ≤ xn)
= ∑_{w1≤x1} ∑_{w2≤x2} ⋯ ∑_{wn≤xn} P(X1 = w1, X2 = w2, ..., Xn = wn)



Multivariate Random Variables

Multivariate Continuous Random Variables

Definition (Joint Probability Density Function)


Given n continuous random variables, X1 , . . . , X n . The function
f X (x1 , x2 , . . . , x n ) is a joint probability density function if it satisfies
the following three conditions:
f_X(x1, x2, ..., xn) ≥ 0 for all (x1, x2, ..., xn) ∈ Rⁿ
∫_{−∞}^{∞} ⋯ ∫_{−∞}^{∞} f_X(x1, ..., xn) dx1 ⋯ dxn = 1
P((X1, ..., Xn) ∈ A) = ∫⋯∫_A f_X(x1, ..., xn) dx1 ⋯ dxn, where A ⊂ S.



Multivariate Random Variables

Multivariate Continuous Random Variables

Definition (Joint Distribution Function)


Given n continuous random variables X1, ..., Xn with joint pdf f_X(x1, x2, ..., xn), the joint distribution function is

F_X(x1, ..., xn) = ∫_{−∞}^{xn} ⋯ ∫_{−∞}^{x1} f_X(u1, ..., un) du1 ⋯ dun

Note that

f_X(x1, x2, ..., xn) = ∂ⁿ F_X(x1, ..., xn) / (∂x1 ⋯ ∂xn)



Multivariate Random Variables

Marginal Distribution

Definition (Marginal Probability Density Function)


Given n continuous random variables X1, X2, ..., Xn with joint pdf f_X(x1, x2, ..., xn), the marginal probability density function of X1 is

f_X1(x1) = ∫_{x2} ⋯ ∫_{xn} f_X(x1, x2, ..., xn) dxn ⋯ dx2



Multivariate Random Variables

Independent Random Variables

Definition (General Case)


X1, X2, ..., Xn are independent random variables if

f_X(x1, x2, ..., xn) = f_X1(x1) f_X2(x2) ⋯ f_Xn(xn),

where f_X(x1, x2, ..., xn) is the joint pmf (pdf), and f_Xi(xi) is the marginal pmf (pdf) of Xi.

For the discrete case, the condition is

P(⋂_{i=1}^{n} {Xi = xi}) = ∏_{i=1}^{n} P(Xi = xi)




Section 7

IID Random Variables



IID Random Variables

I.I.D. Random Variables

Definition (I.I.D. Random Variables)


A sequence of random variables {Xi}_{i=1}^{n} = {X1, X2, ..., Xn} is independent and identically distributed (i.i.d.) if all the Xi have the same distribution and are mutually independent.

That is,
{X1, X2, ..., Xn} come from an identical f_X(x)
{X1, X2, ..., Xn} are independent
The joint pmf (pdf) for i.i.d. random variables is thus

f_X(x1, x2, ..., xn) = f_X(x1) f_X(x2) ⋯ f_X(xn)



IID Random Variables

Example

Suppose that

{Xi}_{i=1}^{n} ∼ i.i.d. f_X(x),

where

f_X(x) = p(1 − p)^x,    x = 0, 1, 2, ...

The joint pmf is

f_X(x1, ..., xn) = f_X(x1) f_X(x2) ⋯ f_X(xn)
= [p(1 − p)^{x1}] [p(1 − p)^{x2}] ⋯ [p(1 − p)^{xn}]
= p^n (1 − p)^{x1 + x2 + ⋯ + xn}
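The product-of-marginals and the closed form agree, which a Python sketch can confirm for concrete values (p = 1/3 and the sample below are arbitrary choices of mine):

```python
from fractions import Fraction

p = Fraction(1, 3)

def f(x):
    # Geometric-type pmf from the slide: p * (1 - p)**x, x = 0, 1, 2, ...
    return p * (1 - p) ** x

xs = [0, 2, 1, 3]   # an arbitrary observed sample

# Product of the marginal pmfs ...
joint_product = Fraction(1)
for x in xs:
    joint_product *= f(x)

# ... versus the closed form p**n * (1 - p)**(sum of x_i)
n = len(xs)
closed_form = p ** n * (1 - p) ** sum(xs)
print(joint_product == closed_form)  # True
```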



IID Random Variables

Example: Identically Distributed but Not Identical

Note that two random variables with the same distribution (they
are identically distributed) need not be identical

identically distributed ≠ identical

Let’s see an example.


Tossing a biased coin twice (P(H) = 0.3).
Sample Space: Ω = {HH, HT, TH, T T} with probabilities,

P({HH}) = 0.09
P({HT}) = 0.21
P({TH}) = 0.21
P({T T}) = 0.49
IID Random Variables

Remark: Identically Distributed but Not Identical

Let X1 = 1 indicate that the first toss shows Heads, and X2 = 1 indicate that the second toss shows Heads:

X1(ω) = 1 if ω ∈ {HH, HT},    X1(ω) = 0 if ω ∈ {TH, TT}
X2(ω) = 1 if ω ∈ {HH, TH},    X2(ω) = 0 if ω ∈ {HT, TT}

X1 and X2 are identically distributed: each equals 1 with probability 0.3.

X1 and X2 are NOT identical: e.g., X1(HT) = 1 while X2(HT) = 0.




Section 8

Functions of Random Variables



Functions of Random Variables

Functions of Discrete Random Variables

Given that

X = −1 with probability 1/4,
X = 0 with probability 1/2,
X = 1 with probability 1/4.

Let Y = X². Find the distribution of Y.
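For a function of a discrete random variable, push the pmf through the map and add probabilities that land on the same value. A Python sketch (names mine):

```python
from fractions import Fraction

f_x = {-1: Fraction(1, 4), 0: Fraction(1, 2), 1: Fraction(1, 4)}

# Push the pmf through y = x**2, summing probabilities that collide.
f_y = {}
for x, p in f_x.items():
    y = x ** 2
    f_y[y] = f_y.get(y, Fraction(0)) + p

print(sorted(f_y.items()))  # Y is 0 or 1, each with probability 1/2
```

Here x = −1 and x = 1 both map to y = 1, so their probabilities add: P(Y = 1) = 1/4 + 1/4 = 1/2, and P(Y = 0) = 1/2.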



Functions of Random Variables

Functions of Continuous Random Variables

Consider a one-to-one transformation.

Refer to Hogg and Tanis (2010) for more details.
Given a continuous random variable X, let Y be a one-to-one function of X:

Y = u(X)

Since the function is one-to-one,

X = u⁻¹(Y) = w(Y)

For instance, if Y = 7X then X = Y/7; if Y = ln X then X = e^Y.
We want to find the distribution of Y.



Functions of Random Variables

Functions of Continuous Random Variables

Finding the probability distribution of a function of random variables


The Method of Distribution Functions
The Method of Transformations



Functions of Random Variables

The Method of Distribution Functions

Given that the distribution function of X is F_X(x).

If Y = u(X) is strictly increasing, then X = u⁻¹(Y) = w(Y) and

{Y ≤ y} if and only if {X ≤ w(y)}

The distribution function of Y is

F_Y(y) = F_X(w(y))

Thus, the pdf of Y is

f_Y(y) = dF_Y(y)/dy.

What if Y = u(X) is strictly decreasing?
Functions of Random Variables

The Method of Transformations

Suppose that the distribution function of X is unknown but the density function f_X(x) is known.
Recall that Y = u(X) and X = u⁻¹(Y) = w(Y).
The density function of Y is obtained by the following formula:

f_Y(y) = f_X(w(y)) ∣dw(y)/dy∣,

where ∣dw(y)/dy∣ is the Jacobian.
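As an illustration not taken from the slides: let X ∼ U(0, 1) and Y = −ln X, so w(y) = e^{−y} and ∣dw/dy∣ = e^{−y}. The transformation formula then yields f_Y(y) = 1 · e^{−y}, the Exp(1) density. A Python sketch (names mine):

```python
import math

def f_X(x):
    # Density of U(0, 1)
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def w(y):
    # Inverse transform x = w(y) for y = -ln(x)
    return math.exp(-y)

def f_Y(y):
    dw = -math.exp(-y)                # d/dy of w(y)
    return f_X(w(y)) * abs(dw)        # transformation formula

for y in (0.5, 1.0, 2.0):
    print(y, f_Y(y), math.exp(-y))    # formula matches the Exp(1) pdf
```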



Functions of Random Variables

Example 1: Uniform Random Variable

Theorem (Invariance Under Linear Transformation)


Given X ∼ U[0, 1], and Y = aX + b, with a > 0. Then

Y ∼ U[b, a + b]

Proof. We can show this by the CDF method. For b ≤ y ≤ a + b,

P(Y ≤ y) = P(X ≤ (y − b)/a) = (y − b)/((a + b) − b) = (y − b)/a,

which is the CDF of U[b, a + b].
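A quick simulation check of the theorem (a = 2, b = 3 are arbitrary choices of mine): the empirical CDF of Y = aX + b at a point inside [b, a + b] should match (y − b)/a.

```python
import random

random.seed(0)
a, b = 2.0, 3.0
# Draw Y = a*X + b with X ~ U(0, 1)
ys = [a * random.random() + b for _ in range(200_000)]

y0 = 4.0                                  # a point inside [b, a + b] = [3, 5]
empirical = sum(y <= y0 for y in ys) / len(ys)
print(empirical, (y0 - b) / a)            # both ≈ 0.5
```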
