
Conditional Probability

Conditional probability: for events E and F,

    P(E | F) = P(EF) / P(F).

Conditional probability mass function (pmf):

    p_{X|Y}(x | y) = P{X = x | Y = y}
                   = P{X = x, Y = y} / P{Y = y}
                   = p(x, y) / p_Y(y),

defined for y : p_Y(y) > 0.

Conditional expectation of X given Y = y:

    E[X | Y = y] = \sum_x x p_{X|Y}(x | y).

If X and Y are independent, then E[X | Y = y] = E[X].
Examples

1. Suppose the joint pmf of X and Y is given by p(1, 1) = 0.5, p(1, 2) = 0.1, p(2, 1) = 0.1, p(2, 2) = 0.3. Find the pmf of X given Y = 1.

Solution: p_Y(1) = p(1, 1) + p(2, 1) = 0.6, so

    p_{X|Y=1}(1) = p(1, 1)/p_Y(1) = 0.5/0.6 = 5/6
    p_{X|Y=1}(2) = p(2, 1)/p_Y(1) = 0.1/0.6 = 1/6
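The computation in Example 1 can be reproduced mechanically. A minimal Python sketch (the dictionary layout of the joint pmf is just one convenient choice):

```python
from fractions import Fraction

# Joint pmf from Example 1, stored as {(x, y): p(x, y)}.
joint = {(1, 1): Fraction(5, 10), (1, 2): Fraction(1, 10),
         (2, 1): Fraction(1, 10), (2, 2): Fraction(3, 10)}

def conditional_pmf_x_given_y(joint, y):
    """p_{X|Y}(x | y) = p(x, y) / p_Y(y), defined when p_Y(y) > 0."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)  # marginal p_Y(y)
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

print(conditional_pmf_x_given_y(joint, 1))  # {1: Fraction(5, 6), 2: Fraction(1, 6)}
```

Using `Fraction` keeps the arithmetic exact, so the 5/6 and 1/6 above come out with no rounding.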
2. If X and Y are independent Poisson RVs with respective means λ_1 and λ_2, find the conditional pmf of X given X + Y = n and the conditional expected value of X given X + Y = n.

Solution: Let Z = X + Y. We want to find p_{X|Z=n}(k). For k = 0, 1, 2, ..., n,

    p_{X|Z=n}(k) = P(X = k, Z = n) / P(Z = n)
                 = P(X = k, X + Y = n) / P(Z = n)
                 = P(X = k, Y = n - k) / P(Z = n)
                 = P(X = k) P(Y = n - k) / P(Z = n).

We know that Z is Poisson with mean λ_1 + λ_2. Hence

    p_{X|Z=n}(k) = [e^{-λ_1} λ_1^k / k!] [e^{-λ_2} λ_2^{n-k} / (n-k)!] / [e^{-(λ_1+λ_2)} (λ_1+λ_2)^n / n!]
                 = \binom{n}{k} (λ_1/(λ_1+λ_2))^k (λ_2/(λ_1+λ_2))^{n-k}.

Hence the conditional distribution of X given X + Y = n is a binomial distribution with parameters n and λ_1/(λ_1+λ_2), and

    E(X | X + Y = n) = n λ_1/(λ_1 + λ_2).
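The binomial result can be checked numerically. The sketch below, with illustrative values λ_1 = 2, λ_2 = 3, n = 6, compares the conditional pmf built directly from the Poisson pmfs against the binomial pmf:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def conditional_pmf(k, n, lam1, lam2):
    # p_{X|Z=n}(k) = P(X = k) P(Y = n - k) / P(Z = n), with Z ~ Poisson(lam1 + lam2)
    return poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2) / poisson_pmf(n, lam1 + lam2)

def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

lam1, lam2, n = 2.0, 3.0, 6        # illustrative values, not from the text
p = lam1 / (lam1 + lam2)
for k in range(n + 1):
    assert abs(conditional_pmf(k, n, lam1, lam2) - binomial_pmf(k, n, p)) < 1e-12

# Conditional mean should match n * lam1 / (lam1 + lam2) = 2.4
mean = sum(k * conditional_pmf(k, n, lam1, lam2) for k in range(n + 1))
print(round(mean, 6))  # 2.4
```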
3. Consider n + m independent trials, each of which results in a success with probability p. Compute the expected number of successes in the first n trials given that there are k successes in all.

Solution: Let Y be the number of successes in all n + m trials and X the number of successes in the first n trials. Define

    X_i = 1 if the i-th trial is a success, and X_i = 0 otherwise.

Then X = \sum_{i=1}^n X_i, so

    E(X | Y = k) = E(\sum_{i=1}^n X_i | Y = k) = \sum_{i=1}^n E(X_i | Y = k).

Since the trials are independent and identically distributed, the X_i | Y = k all have the same distribution. Hence E(X_i | Y = k) = P(X_i = 1 | Y = k) = P(X_1 = 1 | Y = k), and

    P(X_1 = 1 | Y = k) = P(X_1 = 1, Y = k) / P(Y = k)
                       = [\binom{n+m-1}{k-1} p^k (1 - p)^{n+m-k}] / [\binom{n+m}{k} p^k (1 - p)^{n+m-k}]
                       = k/(n + m).

Hence

    E(X | Y = k) = nk/(n + m).
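Given Y = k, every placement of the k successes among the n + m trials is equally likely (each has probability p^k (1-p)^{n+m-k}), so the result can also be verified by brute-force enumeration. Small illustrative values n = 3, m = 4, k = 2 are assumed below:

```python
from itertools import combinations
from fractions import Fraction

n, m, k = 3, 4, 2                      # illustrative values only
placements = list(combinations(range(n + m), k))   # equally likely given Y = k

# Average, over all placements, of the number of successes among the first n trials
expected = Fraction(sum(sum(1 for t in s if t < n) for s in placements),
                    len(placements))
print(expected)  # 6/7, i.e. n*k/(n + m)
assert expected == Fraction(n * k, n + m)
```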
Conditional Density

Conditional probability density function:

    f_{X|Y}(x | y) = f(x, y) / f_Y(y),

defined for y : f_Y(y) > 0.

    P(X ∈ R | Y = y) = \int_R f_{X|Y}(x | y) dx

Conditional expectation of X given Y = y:

    E[X | Y = y] = \int x f_{X|Y}(x | y) dx

For a function g, the conditional expectation of g(X):

    E[g(X) | Y = y] = \int g(x) f_{X|Y}(x | y) dx
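As an illustration of these formulas, the sketch below uses an assumed joint density f(x, y) = x + y on the unit square (not an example from the text) and checks E[X | Y = y] against a midpoint-rule integral:

```python
# Assumed joint density f(x, y) = x + y on [0, 1] x [0, 1].
def f(x, y):
    return x + y

def f_Y(y, steps=10_000):
    # Marginal f_Y(y) = \int_0^1 f(x, y) dx via a midpoint Riemann sum
    h = 1.0 / steps
    return sum(f((i + 0.5) * h, y) for i in range(steps)) * h

def cond_expectation(y, steps=10_000):
    # E[X | Y = y] = \int_0^1 x f(x, y) / f_Y(y) dx
    h = 1.0 / steps
    fy = f_Y(y, steps)
    return sum((i + 0.5) * h * f((i + 0.5) * h, y) / fy for i in range(steps)) * h

y = 0.25
# For this density, the closed form is E[X | Y = y] = (1/3 + y/2) / (y + 1/2)
print(abs(cond_expectation(y) - (1/3 + y/2) / (y + 1/2)) < 1e-6)  # True
```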
Computing Expectation by Conditioning

Discrete:

    E[X] = \sum_y E[X | Y = y] p_Y(y)
         = \sum_y \sum_x x p_{X|Y}(x | y) p_Y(y)

Continuous:

    E[X] = \int E[X | Y = y] f_Y(y) dy
         = \int \int x f_{X|Y}(x | y) f_Y(y) dx dy

Chain expansion: E[X] = E_Y[E_{X|Y}(X | Y)].
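The discrete identity can be verified on the joint pmf of Example 1; both routes give E[X] = 7/5:

```python
from fractions import Fraction

# Joint pmf of Example 1
joint = {(1, 1): Fraction(5, 10), (1, 2): Fraction(1, 10),
         (2, 1): Fraction(1, 10), (2, 2): Fraction(3, 10)}

xs = {x for x, _ in joint}
ys = {y for _, y in joint}
p_Y = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Direct expectation of X from the joint pmf
direct = sum(x * p for (x, y), p in joint.items())

# By conditioning: E[X] = sum_y E[X | Y = y] p_Y(y)
cond = sum(sum(x * joint[(x, y)] / p_Y[y] for x in xs) * p_Y[y] for y in ys)

print(direct, cond)  # 7/5 7/5
assert direct == cond
```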
Expectation of the sum of a random number of random variables:

If X = \sum_{i=1}^N X_i, where N is a random variable independent of the X_i's and the X_i's have common mean μ, then

    E[X] = E[N] μ.

Example: Suppose that the expected number of accidents per week at an industrial plant is four. Suppose also that the numbers of workers injured in each accident are independent random variables with a common mean of 2. Assume also that the number of workers injured in each accident is independent of the number of accidents that occur. What is the expected number of injuries during a week?

By the formula above, E[injuries] = E[N] μ = 4 · 2 = 8.
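A quick simulation of this example is sketched below. The Poisson distributions are assumptions made only to have something concrete to simulate; the identity E[X] = E[N] μ depends only on the means:

```python
import math
import random

random.seed(0)
E_N, mu = 4, 2               # mean accidents per week, mean injuries per accident
print(E_N * mu)              # 8, the exact answer E[N] * mu

def poisson(lam):
    # Knuth's multiplication method; fine for small lam (a simulation convenience)
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Total injuries in a week = sum of injuries over a random number of accidents
samples = [sum(poisson(mu) for _ in range(poisson(E_N))) for _ in range(50_000)]
print(sum(samples) / len(samples))  # close to 8
```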
The variance of the sum of a random number of random variables: if

    Z = \sum_{i=1}^N X_i,  E(X_i) = μ,  Var(X_i) = σ²,

then

    Var(Z) = σ² E[N] + μ² Var(N).
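The variance formula can be checked exactly on a tiny made-up model (N uniform on {1, 2} and Bernoulli(1/3) summands, chosen only for this sketch) by enumerating the distribution of Z:

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 3)                            # X_i ~ Bernoulli(p): mu = p, sigma^2 = p(1-p)
pN = {1: Fraction(1, 2), 2: Fraction(1, 2)}   # pmf of N, chosen for the sketch

# Exact distribution of Z = sum_{i=1}^N X_i by enumeration
pZ = {}
for n, pn in pN.items():
    for xs in product([0, 1], repeat=n):
        prob = pn
        for x in xs:
            prob *= p if x == 1 else 1 - p
        z = sum(xs)
        pZ[z] = pZ.get(z, 0) + prob

EZ = sum(z * q for z, q in pZ.items())
VarZ = sum(z * z * q for z, q in pZ.items()) - EZ**2

mu, sigma2 = p, p * (1 - p)
EN = sum(n * q for n, q in pN.items())
VarN = sum(n * n * q for n, q in pN.items()) - EN**2
assert VarZ == sigma2 * EN + mu**2 * VarN     # sigma^2 E[N] + mu^2 Var(N)
print(VarZ)  # 13/36
```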
Computing Probability by Conditioning

Total probability formula: Suppose F_1, F_2, ..., F_n are mutually exclusive and \cup_{i=1}^n F_i = S. Then

    P(E) = \sum_{i=1}^n P(F_i) P(E | F_i).

    p_X(x) = \sum_{y_i} p_{X|Y}(x | y_i) p_Y(y_i).

    f_X(x) = \int f_{X|Y}(x | y) f_Y(y) dy.
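A small sketch of the total probability formula, using a hypothetical two-bag experiment (the bags and their contents are invented for illustration):

```python
from fractions import Fraction

# Hypothetical setup: bag F1 is chosen with prob 1/3 and holds 2 red balls out of 5;
# bag F2 is chosen with prob 2/3 and holds 1 red ball out of 4. E = "draw a red ball".
P_F = [Fraction(1, 3), Fraction(2, 3)]            # P(F_i), mutually exclusive, sum to 1
P_E_given_F = [Fraction(2, 5), Fraction(1, 4)]    # P(E | F_i)

# P(E) = sum_i P(F_i) P(E | F_i)
P_E = sum(pf * pe for pf, pe in zip(P_F, P_E_given_F))
print(P_E)  # 3/10
```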
