Lecture 3 20240318
Conditional Probability and Conditional Expectation
Introduction
One of the most useful concepts in probability theory is that
of conditional probability and conditional expectation.
First, in practice we are often interested in calculating probabilities and
expectations when some partial information is available; hence, the
desired probabilities and expectations are conditional ones.
Second, in calculating a desired probability or expectation it is
often extremely useful to first "condition" on some appropriate
random variable.
$p_Y(1) =$
$p_{X|Y}(1 \mid 1) =$
$p_{X|Y}(2 \mid 1) =$
Example 2
Example 3.2 If $X_1$ and $X_2$ are independent binomial random
variables with respective parameters $(n_1, p)$ and $(n_2, p)$, calculate
the conditional probability mass function of $X_1$ given that
$X_1 + X_2 = m$.
Hypergeometric distribution
The number of blue balls chosen when a sample of $m$ balls is randomly
drawn from an urn containing $n_1$ blue and $n_2$ red balls
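The resulting conditional pmf is hypergeometric, and the success probability $p$ cancels out of the calculation. A small numerical check of this, with illustrative parameters not taken from the lecture:

```python
from math import comb

# Check that conditioning X1 on X1 + X2 = m yields the hypergeometric
# pmf, with p cancelling. Parameters are illustrative, not from the slides.
n1, n2, p, m = 5, 7, 0.3, 4

def binom_pmf(k, n, q):
    # Binomial(n, q) probability mass at k
    return comb(n, k) * q**k * (1 - q)**(n - k)

for k in range(max(0, m - n2), min(n1, m) + 1):
    # Direct conditioning:
    #   P{X1 = k | X1 + X2 = m} = P{X1 = k} P{X2 = m - k} / P{X1 + X2 = m},
    # where X1 + X2 ~ Binomial(n1 + n2, p)
    cond = (binom_pmf(k, n1, p) * binom_pmf(m - k, n2, p)
            / binom_pmf(m, n1 + n2, p))
    # Hypergeometric pmf: k blue among m balls drawn from n1 blue + n2 red
    hyper = comb(n1, k) * comb(n2, m - k) / comb(n1 + n2, m)
    assert abs(cond - hyper) < 1e-12
    print(k, round(hyper, 4))
```

The assertion passing for every feasible $k$ illustrates that the conditional distribution does not depend on $p$.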
Example 3
Example 3.3 If $X$ and $Y$ are independent Poisson random
variables with respective means $\lambda_1$ and $\lambda_2$, calculate the
conditional expected value of $X$ given that $X + Y = n$.
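Given $X + Y = n$, the conditional distribution of $X$ is Binomial$(n, \lambda_1/(\lambda_1 + \lambda_2))$, so the conditional mean is $n\lambda_1/(\lambda_1 + \lambda_2)$. A quick numerical check, with illustrative parameters not taken from the lecture:

```python
from math import exp, factorial

# Verify E[X | X + Y = n] = n * lam1 / (lam1 + lam2) for independent
# Poissons. Parameters are illustrative, not from the slides.
lam1, lam2, n = 2.0, 3.0, 6

def pois_pmf(k, lam):
    # Poisson probability mass at k
    return exp(-lam) * lam**k / factorial(k)

# The sum of independent Poissons is Poisson(lam1 + lam2)
total = pois_pmf(n, lam1 + lam2)

# E[X | X + Y = n] = sum_k k * P{X = k} P{Y = n - k} / P{X + Y = n}
cond_E = sum(k * pois_pmf(k, lam1) * pois_pmf(n - k, lam2) / total
             for k in range(n + 1))

print(round(cond_E, 6))  # n * lam1 / (lam1 + lam2) = 6 * 2/5 = 2.4
```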
Example 2 of Probability on Lect. 1
Let A1 (A0) be the event that 1 (0) is sent, and B1 (B0) the event
that 1 (0) is received.
Assumption
P(A0) = 0.8, P(A1) = 1- P(A0) = 0.2,
The probability of error, i.e., p=P(B1|A0)= P(B0|A1), is 0.1
[Channel diagram: 0 → 0 and 1 → 1 each with probability 0.9; crossovers 0 → 1 and 1 → 0 each with probability 0.1]
Find
(a) the error probability at the receiver;
(b) the probability that 1 was sent given that the receiver decides 1.
Answer: Example 2 on Lect. 1
Parameters
P(A0) = 0.8, P(A1) = 1- P(A0) = 0.2, P(B1|A0)= P(B0|A1) = 0.1
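The two answers follow from the law of total probability and Bayes' rule. A sketch of the arithmetic with the slide's parameters:

```python
# Bayes' rule for the binary channel, using the slide's parameters.
P_A0, P_A1 = 0.8, 0.2   # priors: 0 sent, 1 sent
p = 0.1                 # crossover probability P(B1|A0) = P(B0|A1)

# (a) Total error probability at the receiver (law of total probability)
P_err = P_A0 * p + P_A1 * p
print(round(P_err, 4))  # 0.1

# (b) P(A1 | B1): probability that 1 was sent given that 1 was received
P_B1 = P_A1 * (1 - p) + P_A0 * p   # = 0.18 + 0.08 = 0.26
P_A1_given_B1 = P_A1 * (1 - p) / P_B1
print(round(P_A1_given_B1, 4))     # 0.18 / 0.26 ≈ 0.6923
```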
What is $E[e^X \mid Y = 1]$?
Computing Expectations by
Conditioning
Let us denote by $E[X \mid Y]$ that function of the random variable
$Y$ whose value at $Y = y$ is $E[X \mid Y = y]$.
An extremely important property of conditional expectation
is that for all random variables $X$ and $Y$,
$$E[X] = E\big[E[X \mid Y]\big]$$
Discrete RV: $E[X] = E\big[E[X \mid Y]\big] = \sum_y E[X \mid Y = y]\, P\{Y = y\}$
Continuous RV: $E[X] = E\big[E[X \mid Y]\big] = \int_{-\infty}^{\infty} E[X \mid Y = y]\, f_Y(y)\, dy$
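The discrete version of this identity can be checked directly on a small joint pmf. The joint distribution below is made up purely for illustration:

```python
# Numerical check of the tower property E[X] = E[E[X|Y]] on a small
# made-up discrete joint pmf (illustrative, not from the lecture).
joint = {  # P{X = x, Y = y}
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Direct expectation of X
EX = sum(x * p for (x, y), p in joint.items())

# Marginal pmf of Y
pY = {}
for (x, y), p in joint.items():
    pY[y] = pY.get(y, 0.0) + p

# Tower property: E[X] = sum_y E[X | Y = y] P{Y = y}
EX_cond = sum(
    sum(x * p / pY[y0] for (x, y), p in joint.items() if y == y0) * pY[y0]
    for y0 in pY
)

assert abs(EX - EX_cond) < 1e-9
print(round(EX, 6))  # 0.7
```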
Proof?
To see this, let E denote an arbitrary event and define the
indicator random variable X by
$$X = \begin{cases} 1, & \text{if } E \text{ occurs} \\ 0, & \text{if } E \text{ does not occur} \end{cases}$$
Since $E[X] = P(E)$ and $E[X \mid Y = y] = P(E \mid Y = y)$, we obtain
$P(E) = \sum_y P(E \mid Y = y)\, P\{Y = y\}$, if Y is discrete
$P(E) = \int_{-\infty}^{\infty} P(E \mid Y = y)\, f_Y(y)\, dy$, if Y is continuous
Examples
Example 3.21 Suppose that X and Y are independent
continuous random variables having densities fX and fY,
respectively. Compute P{X < Y}.
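Conditioning on $Y$ (and using independence) gives $P\{X < Y\} = \int_{-\infty}^{\infty} P\{X < y\}\, f_Y(y)\, dy = \int_{-\infty}^{\infty} F_X(y)\, f_Y(y)\, dy$. A numerical sketch with exponential densities, whose rates are chosen purely for illustration (for $X \sim \text{Exp}(a)$, $Y \sim \text{Exp}(b)$ the closed form is $a/(a+b)$):

```python
import math

# P{X < Y} = integral of F_X(y) f_Y(y) dy, evaluated numerically for
# X ~ Exp(a), Y ~ Exp(b). Rates are illustrative, not from the lecture.
a, b = 1.0, 2.0

dy = 1e-4  # Riemann-sum step; the integrand is negligible beyond y = 20
prob = sum((1 - math.exp(-a * (i * dy))) * b * math.exp(-b * (i * dy)) * dy
           for i in range(1, 200001))

print(round(prob, 3))  # close to a / (a + b) = 1/3
```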