
Lecture 20: Conditional Distribution and Expectation: Discrete Case

STOR 435, Spring 2025
4/8/2025



Basics
Pitman: Sections 6.1 and 6.2
Denote the sets of atoms of the discrete RV's X and Y by D_X and D_Y respectively. Given x ∈ D_X, the collection {P(Y = y | X = x), y ∈ D_Y} is called the conditional distribution of Y given X = x.
Definition: Assume (X, Y) follows a discrete joint distribution. Then for x ∈ D_X,

E(Y \mid X = x) = \sum_{y} y \, P(Y = y \mid X = x).

For a known function g, we have the plug-in formula

E[g(X, Y) \mid X = x] = \sum_{y} g(x, y) \, P(Y = y \mid X = x).

Averaging:

E(Y) = E[E(Y \mid X)] = \sum_{x} E(Y \mid X = x) \, P(X = x).
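To make these formulas concrete, here is a minimal Python sketch (not part of the lecture) that computes E(Y | X = x) from a small, made-up joint pmf and checks the averaging identity numerically; all pmf values are arbitrary assumptions chosen for illustration.

```python
# Hypothetical joint pmf P(X = x, Y = y) for a discrete pair (X, Y).
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.25, (1, 1): 0.15,
    (2, 0): 0.05, (2, 1): 0.25,
}

def p_x(x):
    """Marginal P(X = x)."""
    return sum(p for (xx, _), p in joint.items() if xx == x)

def cond_exp_Y_given(x):
    """E(Y | X = x) = sum_y y * P(Y = y | X = x)."""
    px = p_x(x)
    return sum(y * p / px for (xx, y), p in joint.items() if xx == x)

# Averaging: E(Y) = sum_x E(Y | X = x) * P(X = x) should match the direct sum.
xs = {x for x, _ in joint}
ey_via_conditioning = sum(cond_exp_Y_given(x) * p_x(x) for x in xs)
ey_direct = sum(y * p for (_, y), p in joint.items())
print(ey_via_conditioning, ey_direct)  # both ≈ 0.60
```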



Example 1
Fact: Let X ∼ Poisson(λ) and Y ∼ Poisson(µ) be independent. Then X + Y ∼ Poisson(λ + µ).
Proof:

\begin{align*}
P(X + Y = n) &= \sum_{k=0}^{n} P(X = k, Y = n - k) \\
&= \sum_{k=0}^{n} e^{-\lambda} \frac{\lambda^k}{k!} \cdot e^{-\mu} \frac{\mu^{n-k}}{(n-k)!} \\
&= e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^n}{n!} \underbrace{\sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!} \, p^k q^{n-k}}_{=1} \\
&= e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^n}{n!},
\end{align*}

where p = λ/(λ + µ) and q = µ/(λ + µ).
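The fact is also easy to check by simulation. The sketch below (illustrative only; λ = 2.0 and µ = 3.0 are arbitrary choices) samples independent Poisson variates by inverse-transform sampling and compares the empirical pmf of X + Y with the Poisson(λ + µ) pmf.

```python
import math
import random

lam, mu, trials = 2.0, 3.0, 200_000
random.seed(435)

def poisson_sample(rate):
    """Inverse-transform sampling of a Poisson(rate) variate."""
    u = random.random()
    k, p = 0, math.exp(-rate)
    cdf = p
    while u > cdf:
        k += 1
        p *= rate / k
        cdf += p
    return k

# Empirical distribution of X + Y.
counts = {}
for _ in range(trials):
    s = poisson_sample(lam) + poisson_sample(mu)
    counts[s] = counts.get(s, 0) + 1

# Compare with the Poisson(λ + µ) pmf.
for n in range(8):
    empirical = counts.get(n, 0) / trials
    exact = math.exp(-(lam + mu)) * (lam + mu) ** n / math.factorial(n)
    print(f"n={n}: empirical {empirical:.4f}  Poisson(λ+µ) {exact:.4f}")
```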



Example 1 continued

Extension: Let X_1, ..., X_m be independent Poisson RV's with means λ_1, ..., λ_m respectively. Then

\sum_{i=1}^{m} X_i \sim \mathrm{Poisson}\Big( \sum_{i=1}^{m} \lambda_i \Big).

Conditioning: Given X + Y = n, the conditional distribution of X is Bin(n, p) with p = λ/(λ + µ).
Proof: For k = 0, 1, ..., n,

P(X = k \mid X + Y = n) = \frac{P(X = k, Y = n - k)}{P(X + Y = n)} = \frac{e^{-\lambda} \frac{\lambda^k}{k!} \cdot e^{-\mu} \frac{\mu^{n-k}}{(n-k)!}}{e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^n}{n!}} = \binom{n}{k} p^k q^{n-k}.

Moreover, E(X | X + Y = n) = np and E(Y | X + Y = n) = nq.
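The conditional binomial can be verified by exact computation. In the sketch below (λ = 2.0, µ = 3.0, and n = 5 are arbitrary assumptions), P(X = k | X + Y = n) is computed directly from the joint Poisson pmf and compared with the Bin(n, p) pmf.

```python
import math

lam, mu, n = 2.0, 3.0, 5
p = lam / (lam + mu)

def pois(k, rate):
    """Poisson(rate) pmf at k."""
    return math.exp(-rate) * rate ** k / math.factorial(k)

# P(X + Y = n), computed by summing the joint pmf over the diagonal.
p_sum_n = sum(pois(k, lam) * pois(n - k, mu) for k in range(n + 1))

for k in range(n + 1):
    conditional = pois(k, lam) * pois(n - k, mu) / p_sum_n
    binomial = math.comb(n, k) * p ** k * (1 - p) ** (n - k)
    print(f"k={k}: conditional {conditional:.6f}  Bin(n, p) {binomial:.6f}")
```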



Changes due to conditioning

In Example 1, X and Y are assumed to be independent (unconditionally). However, independence no longer holds after conditioning. For example, P(X = 0 | X + Y = 3) = q^3 > 0 and P(Y = 0 | X + Y = 3) = p^3 > 0, but P(X = 0, Y = 0 | X + Y = 3) = 0, which means

P(X = 0 | X + Y = 3) · P(Y = 0 | X + Y = 3) ≠ P(X = 0, Y = 0 | X + Y = 3).
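A quick numerical illustration of the failed factorization, under assumed values λ = 2.0 and µ = 3.0: the product of the two conditional marginal probabilities is positive, while the conditional joint probability is zero.

```python
lam, mu = 2.0, 3.0
p = lam / (lam + mu)  # 0.4
q = mu / (lam + mu)   # 0.6

p_x0 = q ** 3   # P(X = 0 | X + Y = 3), since X | X + Y = 3 ~ Bin(3, p)
p_y0 = p ** 3   # P(Y = 0 | X + Y = 3), since Y | X + Y = 3 ~ Bin(3, q)
p_joint = 0.0   # X = Y = 0 is impossible on the event {X + Y = 3}
print(p_x0 * p_y0, "!=", p_joint)  # 0.013824 != 0.0
```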

Example 2: Let N ∼ Uniform{1, 2}. Given N = n, assume X and Y are conditionally iid, each uniformly distributed over {1, ..., n}. Fact: X and Y are not independent.



Example 2 continued

The following identities of conditional probabilities

P(X = 2 | N = 1) = P(Y = 2 | N = 1) = P(X = 2, Y = 2 | N = 1) = 0,
P(X = 2 | N = 2) = P(Y = 2 | N = 2) = 1/2,
P(X = 2, Y = 2 | N = 2) = P(X = 2 | N = 2) · P(Y = 2 | N = 2) = 1/2 · 1/2 = 1/4

imply

P(X = 2) = P(N = 2) P(X = 2 | N = 2) = 1/2 · 1/2 = 1/4,
P(Y = 2) = P(N = 2) P(Y = 2 | N = 2) = 1/2 · 1/2 = 1/4,
P(X = 2, Y = 2) = P(N = 2) P(X = 2, Y = 2 | N = 2) = 1/2 · 1/4 = 1/8;

however, P(X = 2) P(Y = 2) = 1/16 ≠ 1/8 = P(X = 2, Y = 2). Therefore, X and Y are not independent (unconditionally).
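The same conclusion can be reached by exhaustive enumeration. The sketch below (not from the lecture) builds the joint pmf of (X, Y) from the two-stage description of Example 2 using exact rational arithmetic, then compares P(X = 2, Y = 2) with P(X = 2) P(Y = 2).

```python
from fractions import Fraction

joint = {}  # P(X = x, Y = y), accumulated over the two values of N
for n in (1, 2):                   # N ~ Uniform{1, 2}, so P(N = n) = 1/2
    for x in range(1, n + 1):      # X | N = n ~ Uniform{1, ..., n}
        for y in range(1, n + 1):  # Y | N = n likewise, conditionally independent of X
            joint[(x, y)] = joint.get((x, y), Fraction(0)) + Fraction(1, 2) * Fraction(1, n * n)

p_x2 = sum(p for (x, _), p in joint.items() if x == 2)
p_y2 = sum(p for (_, y), p in joint.items() if y == 2)
print(joint[(2, 2)], "vs", p_x2 * p_y2)  # 1/8 vs 1/16
```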



Example 3

Let N be the number of jobs sent (per day) to the central server in a computer network. Given N = n, let X_i denote the number of hours required to complete job i, i = 1, ..., n. Assume X_1, ..., X_n are iid, each with mean µ, and that N is independent of {X_1, X_2, ...}. Define the total workload per day by S_N = X_1 + · · · + X_N. Then

\begin{align*}
E(S_N) &= E[E(S_N \mid N)] = \sum_{n} P(N = n) \, E(S_N \mid N = n) \\
&= \sum_{n} P(N = n) \, E(S_n \mid N = n) = \sum_{n} P(N = n) \, E(S_n) \\
&= \sum_{n} P(N = n) \cdot n\mu = \mu \, E(N).
\end{align*}
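Since the lecture leaves the distributions of N and the X_i unspecified, the sketch below checks the identity E(S_N) = µ E(N) under assumed choices: N ∼ Uniform{1, ..., 10} jobs and exponential job lengths with mean µ = 2.0 hours, so the identity predicts E(S_N) = 2.0 · 5.5 = 11.0.

```python
import random

random.seed(435)
mu, trials = 2.0, 100_000

total = 0.0
for _ in range(trials):
    n = random.randint(1, 10)  # N ~ Uniform{1, ..., 10}, drawn independently of the X_i
    total += sum(random.expovariate(1 / mu) for _ in range(n))  # S_N = X_1 + ... + X_N

print(total / trials)  # ≈ 11.0 = µ E(N), since E(N) = 5.5
```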
