This lecture introduces the concept of pairs of discrete random variables, focusing on their joint probability mass function (pmf) and how to derive marginal distributions from it. It illustrates these concepts using examples, including the outcomes of tossing a fair coin and computing probabilities for sums and products of random variables. The lecture concludes with methods for calculating the expected value of combinations of these random variables.


Lecture 15 : Pairs of Discrete Random Variables

Today we start Chapter 5. The transition we are making is like going from one-variable calculus to vector calculus: we should now think in terms of vectors (X, Y) of random variables.
So suppose X and Y are discrete random variables defined on the same sample
space S.

Definition
The joint probability mass function (joint pmf) PX,Y(x, y) is defined by

PX,Y(x, y) = P(X = x, Y = y) = P((X = x) ∩ (Y = y))



Example
A fair coin is tossed three times.
Let

X = # of heads on the first toss
Y = total # of heads

As usual

S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}
We want to compute

PX,Y(x, y) = P(X = x, Y = y)
= P((X = x) ∩ (Y = y))
= P(X = x) P(Y = y | X = x)



We will record the results in a matrix, which we now compute entry by entry.

First column (y = 0)
Let's compute the upper left entry (x = 0):

PX,Y(0, 0) = P(X = 0, Y = 0) = P(TTT) = 1/8

(because X = 0 and Y = 0 together force the outcome TTT).

Now the lower left entry (x = 1):

PX,Y(1, 0) = P(X = 1, Y = 0) = 0

(a head on the first toss forces Y ≥ 1).

Move to the second column (y = 1):

PX,Y(0, 1) = P(X = 0, Y = 1)   (top entry, x = 0)



This is harder.

P(X = 0, Y = 1) = P(T on first toss and exactly 1 head total)
= P(THT) + P(TTH) = 2/8

The bottom entry of the second column is

P(X = 1, Y = 1) = P(HTT) = 1/8

Third column (y = 2)

P(X = 0, Y = 2) = P(THH) = 1/8
P(X = 1, Y = 2) = P(HTH) + P(HHT) = 2/8

Fourth column (y = 3)

P(X = 0, Y = 3) = 0
P(X = 1, Y = 3) = P(HHH) = 1/8


The table for the joint pmf PX,Y(x, y):

    x \ y |  0    1    2    3
    ------+--------------------
      0   | 1/8  2/8  1/8   0      (*)
      1   |  0   1/8  2/8  1/8

Check that the total probability is 1.
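The computation above can be sketched in code. This is a minimal sketch (not from the lecture; the variable names are my own): it builds the joint pmf of X and Y by enumerating the 8 equally likely outcomes of three fair-coin tosses, using exact fractions.

```python
# Build the joint pmf of X = # of heads on the first toss and
# Y = total # of heads, by enumerating all 8 equally likely outcomes.
from fractions import Fraction
from itertools import product

outcomes = ["".join(t) for t in product("HT", repeat=3)]  # the 8 outcomes in S

joint = {}  # maps (x, y) -> P(X = x, Y = y)
for s in outcomes:
    x = 1 if s[0] == "H" else 0   # head on the first toss?
    y = s.count("H")              # total number of heads
    joint[(x, y)] = joint.get((x, y), Fraction(0)) + Fraction(1, 8)

# Reproduces table (*): e.g. P(X = 0, Y = 1) = 2/8 and P(X = 1, Y = 0) = 0.
```

Pairs that never occur, such as (1, 0), simply never appear in the dictionary, which matches the zero entries of table (*).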


The joint pmf contains a huge amount of information. In particular it contains the pmf PX(x) of X and the pmf PY(y) of Y.
So how do we recover P(Y = 1) from the table above? The event (Y = 1) is the union of the two events (X = 0, Y = 1) and (X = 1, Y = 1), and these two are mutually exclusive.



So

P(Y = 1) = P(X = 0, Y = 1) + P(X = 1, Y = 1) = 2/8 + 1/8 = 3/8

= the sum of the entries in the second column (i.e. the column corresponding to y = 1).

How about P (X = 1)?


We have an equality of events

(X = 1) = (X = 1, Y = 0) ∪ (X = 1, Y = 1) ∪ (X = 1, Y = 2) ∪ (X = 1, Y = 3)

so

P(X = 1) = 0 + 1/8 + 2/8 + 1/8 = 1/2

= the sum of the entries in the second row (i.e. the row corresponding to x = 1).



So we see we recover PY (y ) by taking column sums and PX (x ) by taking row
sums.

Marginal Distributions
We can express the above nicely by expanding the table (*), i.e. by “adding margins”.

Table (*) with margins added


    x \ y |  0    1    2    3
    ------+--------------------
      0   | 1/8  2/8  1/8   0      (**)
      1   |  0   1/8  2/8  1/8



The $64,000 question
How do you fill in the margins?
There is only one reasonable way to do this — put the row sums in the right
margin and the column sums in the bottom margin.

Table (**) with the margins filled in


    x \ y |  0    1    2    3  | PX(x)
    ------+--------------------+------
      0   | 1/8  2/8  1/8   0  | 1/2       (***)
      1   |  0   1/8  2/8  1/8 | 1/2
    ------+--------------------+
    PY(y) | 1/8  3/8  3/8  1/8



The right margin tells us the pmf of X and the bottom margin tells us the pmf of
Y.

So we have

    x        |  0    1                 y        |  0    1    2    3
    P(X = x) | 1/2  1/2      and       P(Y = y) | 1/8  3/8  3/8  1/8

that is,

X ∼ Bin(1, 1/2)   and   Y ∼ Bin(3, 1/2)



For this reason, given the pair (X, Y), the pmfs PX(x) and PY(y) are called the marginal distributions.
To state all this correctly we have

Proposition
(i) PX(x) = Σ over all y of PX,Y(x, y)   (a row sum)
(ii) PY(y) = Σ over all x of PX,Y(x, y)   (a column sum)

So you “sum away” one variable leaving a function of the remaining variable.
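The proposition can be sketched as follows (a minimal sketch; the helper name `marginal` and the dictionary encoding are my own, not the lecture's): summing the joint pmf over one coordinate leaves the pmf of the other.

```python
# Recover the marginal pmfs from the joint pmf by "summing away" one variable.
from fractions import Fraction

# Joint pmf from table (*); absent pairs have probability 0.
joint = {(0, 0): Fraction(1, 8), (0, 1): Fraction(2, 8), (0, 2): Fraction(1, 8),
         (1, 1): Fraction(1, 8), (1, 2): Fraction(2, 8), (1, 3): Fraction(1, 8)}

def marginal(joint, axis):
    """Sum the joint pmf over the other variable (axis 0 -> pmf of X, axis 1 -> pmf of Y)."""
    pmf = {}
    for pair, p in joint.items():
        k = pair[axis]
        pmf[k] = pmf.get(k, Fraction(0)) + p
    return pmf

p_X = marginal(joint, 0)   # row sums: {0: 1/2, 1: 1/2}
p_Y = marginal(joint, 1)   # column sums: {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
```

These are exactly the margins filled in on table (***).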



Combining Discrete Random Variables
Suppose X and Y are discrete random variables defined on the same sample
space. Let h (x , y ) be a real-valued function of two variables. We want to define
a new random variable W = h (X , Y ).

Examples
We will start with the pair (X , Y ) from our basic example.
The key point is that a function of a pair of random variables
is again a random variable.



We will need only the joint pmf:

    x \ y |  0    1    2    3
    ------+--------------------
      0   | 1/8  2/8  1/8   0      (*)
      1   |  0   1/8  2/8  1/8

(i) h(x, y) = x + y, so W = X + Y
We see that the possible values of the sum are 0, 1, 2, 3, 4 (since they are the
sums of the possible values of X and Y ).
We need to compute their probabilities. How do you compute

P (W = 0) = P (X + Y = 0)?

Answer
Find all the pairs x and y that add up to zero, take the probability of each such
pair and add the resulting probabilities.



Answer (Cont.)
But X + Y = 0 ⇔ X = 0 and Y = 0, so there is only one such pair, (0, 0), and (from the joint pmf (*))

P(X = 0, Y = 0) = 1/8

Hence

P(W = 0) = P(X = 0, Y = 0) = 1/8

P(W = 1) = P(X + Y = 1)
= P(X = 0, Y = 1) + P(X = 1, Y = 0)
= 2/8 + 0 = 2/8

P(W = 2) = P(X = 0, Y = 2) + P(X = 1, Y = 1)
= 1/8 + 1/8 = 2/8



Answer (Cont.)
Similarly

P(W = 3) = 2/8   and   P(W = 4) = 1/8

So we get for W = X + Y:

    w        |  0    1    2    3    4
    P(W = w) | 1/8  2/8  2/8  2/8  1/8       (b)

(check that the total probability is 1)

Remark
Technically the rule given in the “Answer” above is the definition of W = X + Y
as a random variable but as usual the definition is forced on us.
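The rule in the “Answer” above can be sketched directly (a minimal sketch, not from the lecture; the dictionary encoding is my own): group the joint probabilities by the value of x + y.

```python
# pmf of W = X + Y: collect joint probabilities over all pairs with x + y = w.
from fractions import Fraction

# Joint pmf from table (*); absent pairs have probability 0.
joint = {(0, 0): Fraction(1, 8), (0, 1): Fraction(2, 8), (0, 2): Fraction(1, 8),
         (1, 1): Fraction(1, 8), (1, 2): Fraction(2, 8), (1, 3): Fraction(1, 8)}

pmf_W = {}
for (x, y), p in joint.items():
    w = x + y
    pmf_W[w] = pmf_W.get(w, Fraction(0)) + p

# Reproduces table (b): P(W = w) = 1/8, 2/8, 2/8, 2/8, 1/8 for w = 0, 1, 2, 3, 4.
```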

(ii) h(x, y) = xy, so W = XY
The possible values of W (the products of values of X with those of Y) are 0, 1, 2, 3.



We now compute their probabilities.

P(W = 0)
We can get 0 as a product xy if either x = 0 or y = 0, so we have

P(W = 0) = P(XY = 0)
= P(X = 0, Y = 0) + P(X = 0, Y = 1) + P(X = 0, Y = 2) + P(X = 0, Y = 3) + P(X = 1, Y = 0)
= 1/8 + 2/8 + 1/8 + 0 + 0 = 1/2

P(W = 1)

P(W = 1) = P(X = 1, Y = 1) = 1/8



P(W = 2)

P(W = 2) = P(X = 1, Y = 2) = 2/8

P(W = 3)

P(W = 3) = P(X = 1, Y = 3) = 1/8

So for W = XY:

    w        |  0    1    2    3
    P(W = w) | 1/2  1/8  2/8  1/8

(iii) h(x, y) = max(x, y) = the bigger of x and y, so W = max(X, Y)

Remark
The max function doesn't turn up in vector calculus, but it turns up a lot in statistics, in advanced mathematics, and in real life.



The possible values of max(X, Y) are 0, 1, 2, 3.

P(W = 0)

P(W = 0) = P(max(X, Y) = 0) = P(X = 0, Y = 0) = 1/8

P(W = 1)

P(W = 1) = P(max(X, Y) = 1)
= P(X = 0, Y = 1) + P(X = 1, Y = 0) + P(X = 1, Y = 1)
= 2/8 + 0 + 1/8 = 3/8

P(W = 2)

P(W = 2) = P(X = 0, Y = 2) + P(X = 1, Y = 2) = 1/8 + 2/8 = 3/8



P(W = 3)

P(W = 3) = P(X = 0, Y = 3) + P(X = 1, Y = 3) = 0 + 1/8 = 1/8

So for W = max(X, Y):

    w        |  0    1    2    3
    P(W = w) | 1/8  3/8  3/8  1/8

(check that the total probability is 1)
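All three examples follow the same recipe, which can be sketched once and for all (a minimal sketch; the function name `pmf_of` and the encoding are my own): group the joint probabilities by the value of h(x, y).

```python
# pmf of W = h(X, Y) for an arbitrary function h, applied to the product and the max.
from fractions import Fraction

# Joint pmf from table (*); absent pairs have probability 0.
joint = {(0, 0): Fraction(1, 8), (0, 1): Fraction(2, 8), (0, 2): Fraction(1, 8),
         (1, 1): Fraction(1, 8), (1, 2): Fraction(2, 8), (1, 3): Fraction(1, 8)}

def pmf_of(h, joint):
    """pmf of W = h(X, Y): group the joint probabilities by the value of h."""
    pmf = {}
    for (x, y), p in joint.items():
        w = h(x, y)
        pmf[w] = pmf.get(w, Fraction(0)) + p
    return pmf

pmf_prod = pmf_of(lambda x, y: x * y, joint)      # {0: 1/2, 1: 1/8, 2: 2/8, 3: 1/8}
pmf_max = pmf_of(lambda x, y: max(x, y), joint)   # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
```

Passing `lambda x, y: x + y` to the same function reproduces table (b) for W = X + Y.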



The Expected Value of a Combination of Two Discrete Random Variables
If W = h(X, Y), there are two ways to compute E(W).

Proposition

E(W) = Σ over all possible values (x, y) of (X, Y) of h(x, y) PX,Y(x, y)       (#)

We will illustrate the proposition by computing E(W), in two ways, for the W = X + Y computed above.



First way (without using the proposition)
W is a random variable with pmf given by table (b) above, so we use (b):

E(W) = (0)(1/8) + (1)(2/8) + (2)(2/8) + (3)(2/8) + (4)(1/8)
= (2 + 4 + 6 + 4)/8 = 16/8 = 2

Second way (using the proposition)
Now we use the joint pmf (*), summing over its 8 entries:

E(W) = E(X + Y) = Σ over all x, y of (x + y) PX,Y(x, y)



= (0 + 0)(1/8) + (0 + 1)(2/8) + (0 + 2)(1/8) + (0 + 3)(0)
+ (1 + 0)(0) + (1 + 1)(1/8) + (1 + 2)(2/8) + (1 + 3)(1/8)
= (2 + 2)/8 + (2 + 6 + 4)/8
= (4 + 12)/8 = 16/8 = 2

The first way is easier, but it requires first computing the pmf of W = X + Y, and that was hard work.
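Both computations can be sketched side by side (a minimal sketch; the variable names are my own): the first builds the pmf of W and then averages, the second applies the proposition (#) directly to the joint pmf.

```python
# E(X + Y) computed two ways: via the pmf of W, and via the proposition (#).
from fractions import Fraction

# Joint pmf from table (*); absent pairs have probability 0.
joint = {(0, 0): Fraction(1, 8), (0, 1): Fraction(2, 8), (0, 2): Fraction(1, 8),
         (1, 1): Fraction(1, 8), (1, 2): Fraction(2, 8), (1, 3): Fraction(1, 8)}

# First way: build the pmf of W = X + Y, then sum w * P(W = w).
pmf_W = {}
for (x, y), p in joint.items():
    pmf_W[x + y] = pmf_W.get(x + y, Fraction(0)) + p
E_first = sum(w * p for w, p in pmf_W.items())

# Second way: sum h(x, y) * PX,Y(x, y) directly over the entries of (*).
E_second = sum((x + y) * p for (x, y), p in joint.items())

# Both ways give E(W) = 2, as computed above.
```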
