
BU204: Stochastic Processes with Applications
Lecture Two: Conditional Probability and Conditional Expectation

Instructor: Jun LUO (罗俊)
Department of Management Science
Antai College of Economics and Management
Shanghai Jiao Tong University

Textbook: Introduction to Probability Models (by Sheldon M. Ross; Elsevier, 2010)

Shanghai Jiao Tong University () Stochastic Processes with Applications 1 / 15


Chapter 3: Conditional Probability and Conditional Expectation

Conditional Probability
    In practice we often calculate probabilities and expectations when some
    partial information is available; hence, the desired probabilities and
    expectations are conditional ones.
    In calculating a desired probability or expectation, it is often extremely
    useful to first "condition" on some appropriate random variable.
    Let (X1, X2) be a bivariate r.v.

        F(x1, x2) = P{X1 ≤ x1, X2 ≤ x2}
        P{X1 ≤ x1 | X2 ≤ x2} = P{X1 ≤ x1, X2 ≤ x2}/P{X2 ≤ x2} = F(x1, x2)/F_X2(x2)

    If X1 and X2 are independent, then

        F(x1, x2) = F_X1(x1) F_X2(x2)
        P{X1 ≤ x1 | X2 ≤ x2} = F_X1(x1) and P{X2 ≤ x2 | X1 ≤ x1} = F_X2(x2)

    Conditional probability mass function (discrete r.v.):

        p_{X1|X2}(x1|x2) = p(x1, x2)/p_X2(x2)   (for p_X2(x2) > 0)

    Conditional probability density function (continuous r.v.):

        f_{X1|X2}(x1|x2) = f(x1, x2)/f_X2(x2)   (for f_X2(x2) > 0)

Conditional Expectation

1 E[X|Y]: the function of the random variable Y whose value at Y = y is
  E[X|Y = y]; it is itself a random variable.
2 An extremely important property: for any r.v. X and Y,

      E[X] = E[E[X|Y]]

  Discrete case:

      E[E[X|Y]] = Σ_y Σ_x x P{X = x | Y = y} P{Y = y}
                = Σ_x x Σ_y P{X = x, Y = y}
                = Σ_x x P{X = x}
                = E[X]
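The tower property can be checked numerically. The sketch below uses a small hypothetical joint pmf (the numbers are illustrative, not from the slides) and computes E[X] both directly and via E[E[X|Y]]:

```python
# Verify E[X] = E[E[X|Y]] on a small hypothetical joint pmf p(x, y).
joint = {  # p(x, y) for x in {0, 1}, y in {0, 1}
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Direct computation of E[X].
ex_direct = sum(x * p for (x, _), p in joint.items())

# E[E[X|Y]]: for each y, compute E[X|Y = y] and weight by P{Y = y}.
p_y = {}
for (_, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

ex_tower = 0.0
for y, py in p_y.items():
    e_x_given_y = sum(x * p for (x, yy), p in joint.items() if yy == y) / py
    ex_tower += e_x_given_y * py

print(ex_direct, ex_tower)  # both 0.7
```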


Conditional Expectation (cont’d)

Compound r.v.: a sum of a random number of i.i.d. r.v.s,

    X = Σ_{i=1}^{N} X_i

where the X_i are i.i.d. and also independent of the random number N.
Example 3.11: The expected number of accidents per week is 4, and the expected
number of workers injured in each accident is 2. What is the expected number
of workers injured each week?
    X_i denotes the number of workers injured in the ith accident ⇒ E[X_i] = 2.
    N denotes the number of accidents per week ⇒ E[N] = 4.
    X = Σ_{i=1}^{N} X_i; what is E[X]? Conditioning on N,

        E[Σ_{i=1}^{N} X_i] = E[N] E[X_i] = 4 × 2 = 8

Example 3.12 - The Mean of a Geometric Distribution

A coin, having probability p of coming up heads, is to be successively flipped
until the first head appears. What is the expected number of flips required?

Condition on the outcome of the first flip: if it lands heads (probability p),
then N = 1; if it lands tails (probability 1 − p), the process starts over and
N = 1 + N′, where N′ has the same distribution as N. Hence

    E[N] = p + (1 − p)(1 + E[N]) ⇒ E[N] = 1/p
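A quick simulation check of E[N] = 1/p, with p = 0.25 chosen arbitrarily for the sketch:

```python
import random

# Monte Carlo check of E[N] = 1/p for Example 3.12:
# N = number of flips until the first head, with p = 0.25.
random.seed(1)
p, trials = 0.25, 200_000

total = 0
for _ in range(trials):
    flips = 1
    while random.random() >= p:  # tails: flip again
        flips += 1
    total += flips

avg = total / trials
print(avg)  # close to 1/p = 4
```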


Example 3.14 - The Matching Rounds Problem

Suppose that those choosing their own hats depart, while the others (those without a match)
put their selected hats in the center of the room, mix them up, and then reselect. Also, suppose
that this process continues until each individual has his own hat. Find E [Rn ] where Rn is the
number of rounds that are necessary when n individuals are initially present.
1 Since no matter how many people remain there will, on average, be one match
  per round, we guess E[R_n] = n. Let's use induction.
2 Let X_n be the number of matches that occur in the first round; E[X_n] = 1.

      E[R_n] = Σ_{i=0}^{n} E[R_n | X_n = i] P{X_n = i}
             = Σ_{i=0}^{n} (1 + E[R_{n−i}]) P{X_n = i}
             = 1 + E[R_n] P{X_n = 0} + Σ_{i=1}^{n} (n − i) P{X_n = i}
             = 1 + E[R_n] P{X_n = 0} + n(1 − P{X_n = 0}) − E[X_n]
             = E[R_n] P{X_n = 0} + n(1 − P{X_n = 0})

  where the third equality uses the induction hypothesis E[R_{n−i}] = n − i for
  i ≥ 1, and the last uses E[X_n] = 1.
3 E[R_n] = n satisfies this equation no matter what P{X_n = 0} is.
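The claim E[R_n] = n is easy to check by simulating the rounds directly; a minimal sketch with n = 8 (an arbitrary choice):

```python
import random

# Monte Carlo check of E[R_n] = n for the matching rounds problem:
# unmatched people reshuffle their hats until everyone has their own.
random.seed(2)

def rounds(n):
    remaining, r = n, 0
    while remaining > 0:
        perm = list(range(remaining))
        random.shuffle(perm)
        # Person i is matched when perm[i] == i; matched people depart.
        remaining -= sum(1 for i, h in enumerate(perm) if i == h)
        r += 1
    return r

n, trials = 8, 50_000
avg = sum(rounds(n) for _ in range(trials)) / trials
print(avg)  # close to n = 8
```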


Example 3.16 - Quick-Sort Algorithm


Suppose we are given a set of n distinct values —x1 , . . . , xn —and we desire to sort them. The
quick-sort algorithm is defined recursively as follows:
When n = 2 the algorithm compares the two values and puts them in the appropriate
order.
When n > 2 it starts by choosing at random one of the n values —say, xi —and then
compares each of the other n − 1 values with xi , noting which are smaller and which are
larger than xi . Letting Si denote the set of elements smaller than xi , and S̄i the set of
elements greater than xi , the algorithm now sorts the set Si and the set S̄i . The final
ordering, therefore, consists of the ordered set of the elements in Si , then xi , and then
the ordered set of the elements in S̄i .
What is the expected number of comparisons M_n? Conditioning on the rank of the
randomly selected value,

    M_n = Σ_{j=1}^{n} (1/n) E[number of comparisons | value selected is jth smallest]
        = Σ_{j=1}^{n} (1/n) (n − 1 + M_{j−1} + M_{n−j})
        = n − 1 + (2/n) Σ_{k=1}^{n−1} M_k
        = · · · ∼ 2(n + 1) log(n + 1)
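The recursion above can be evaluated directly and compared with the asymptotic bound from the slide:

```python
import math

# Evaluate M_n = n - 1 + (2/n) * sum_{k=1}^{n-1} M_k and compare with
# the slide's asymptotic 2(n + 1) log(n + 1).
M = [0.0, 0.0]  # M_0 = M_1 = 0: no comparisons needed
for n in range(2, 1001):
    M.append(n - 1 + 2.0 / n * sum(M[1:n]))

for n in (10, 100, 1000):
    print(n, M[n], 2 * (n + 1) * math.log(n + 1))
```

For n = 2 this gives M_2 = 1 and for n = 3 it gives M_3 = 8/3, matching the known average-case comparison counts of quick-sort.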


Computing Variance by Conditioning

Conditional variance:

    Var(X | Y = y) = E[X² | Y = y] − (E[X | Y = y])²

The Conditional Variance Formula:

    Var(X) = E[Var(X|Y)] + Var(E[X|Y])

Proof:

    E[Var(X|Y)] = E[E[X²|Y] − (E[X|Y])²]
                = E[E[X²|Y]] − E[(E[X|Y])²]
                = E[X²] − E[(E[X|Y])²]

    Var(E[X|Y]) = E[(E[X|Y])²] − (E[E[X|Y]])²
                = E[(E[X|Y])²] − (E[X])²

Therefore E[Var(X|Y)] + Var(E[X|Y]) = E[X²] − (E[X])² = Var(X).
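The conditional variance formula can be verified term by term on a small hypothetical joint pmf (illustrative numbers, not from the slides):

```python
# Numeric check of Var(X) = E[Var(X|Y)] + Var(E[X|Y]) on a hypothetical pmf.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (2, 1): 0.4}  # p(x, y)

# Unconditional variance of X.
ex = sum(x * p for (x, _), p in joint.items())
ex2 = sum(x * x * p for (x, _), p in joint.items())
var_x = ex2 - ex ** 2

# Marginal of Y.
p_y = {}
for (_, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

# E[Var(X|Y)] and Var(E[X|Y]).
cond_mean = {}
e_var_cond = 0.0
for y, py in p_y.items():
    m1 = sum(x * p for (x, yy), p in joint.items() if yy == y) / py
    m2 = sum(x * x * p for (x, yy), p in joint.items() if yy == y) / py
    cond_mean[y] = m1
    e_var_cond += (m2 - m1 ** 2) * py           # E[Var(X|Y)]

e_mean = sum(cond_mean[y] * py for y, py in p_y.items())
var_cond_mean = sum((cond_mean[y] - e_mean) ** 2 * py
                    for y, py in p_y.items())   # Var(E[X|Y])

print(var_x, e_var_cond + var_cond_mean)  # equal (0.69 each)
```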

Example 3.20 - The Matching Rounds Problem

Show that V_n = Var(R_n) = n. By induction,

    E[R_n | X_n] = 1 + E[R_{n−X_n}] = 1 + n − X_n

    Var(R_n | X_n) = Var(R_{n−X_n}) = V_{n−X_n}

Therefore

    V_n = E[Var(R_n|X_n)] + Var(E[R_n|X_n])
        = E[V_{n−X_n}] + Var(X_n)
        = V_n P{X_n = 0} + Σ_{j=1}^{n} V_{n−j} P{X_n = j} + Var(X_n)
        = V_n P{X_n = 0} + n(1 − P{X_n = 0})

(since E[X_n] = Var(X_n) = 1, and by the induction hypothesis V_{n−j} = n − j
for j ≥ 1)
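The variance claim can be checked with the same reshuffling simulation used for the mean; a sketch with n = 8 (an arbitrary choice):

```python
import random

# Monte Carlo check of Var(R_n) = n for the matching rounds problem (Example 3.20).
random.seed(3)

def rounds(n):
    remaining, r = n, 0
    while remaining > 0:
        perm = list(range(remaining))
        random.shuffle(perm)
        remaining -= sum(1 for i, h in enumerate(perm) if i == h)
        r += 1
    return r

n, trials = 8, 50_000
samples = [rounds(n) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / (trials - 1)
print(mean, var)  # both close to n = 8
```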


Computing Probability by Conditioning


For any event A, define the indicator 1_A = 1 if A occurs and 0 otherwise. Then

    E[1_A] = P(A)
    E[1_A | Y = y] = P(A | Y = y)

    P(A) = Σ_y P(A | Y = y) P{Y = y},          if Y is discrete
         = ∫_{−∞}^{∞} P(A | Y = y) f_Y(y) dy,  if Y is continuous

Example 3.21: Suppose X and Y are independent continuous r.v.s with densities
f_X and f_Y. Conditioning on Y,

    P{X < Y} = ∫_{−∞}^{∞} P{X < Y | Y = y} f_Y(y) dy
             = ∫_{−∞}^{∞} F_X(y) f_Y(y) dy
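A concrete check of the identity, using assumed exponential distributions X ~ Exp(1) and Y ~ Exp(2) (for independent exponentials the known answer is λ_X/(λ_X + λ_Y)): a Monte Carlo estimate of P{X < Y} and a Riemann sum of ∫ F_X(y) f_Y(y) dy should agree.

```python
import math
import random

# Check P{X < Y} = ∫ F_X(y) f_Y(y) dy for assumed X ~ Exp(lx), Y ~ Exp(ly);
# the reference value is lx/(lx + ly) = 1/3.
random.seed(4)
lx, ly = 1.0, 2.0

# Monte Carlo estimate of P{X < Y}.
trials = 200_000
mc = sum(random.expovariate(lx) < random.expovariate(ly)
         for _ in range(trials)) / trials

# Midpoint Riemann sum of ∫_0^∞ F_X(y) f_Y(y) dy with F_X(y) = 1 - e^{-lx y}.
dy = 1e-4
quad = sum((1 - math.exp(-lx * (i + 0.5) * dy)) * ly * math.exp(-ly * (i + 0.5) * dy)
           for i in range(200_000)) * dy

print(mc, quad)  # both close to 1/3
```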

Example 3.22 - Insurance

An insurance company supposes that the number of accidents that each of its
policyholders will have in a year is Poisson distributed, with the mean of the
Poisson depending on the policyholder. If the Poisson mean of a randomly chosen
policyholder has a gamma distribution with density function g(y) = y e^{−y}
(y ≥ 0), what is the probability that a randomly chosen policyholder has
exactly n accidents next year?
    Let X be the number of accidents that a randomly chosen policyholder has
    next year and Y be the Poisson mean number of accidents for this
    policyholder. Then

    P{X = n} = ∫_0^∞ P{X = n | Y = y} g(y) dy
             = ∫_0^∞ (e^{−y} y^n / n!) · y e^{−y} dy
             = (1/n!) ∫_0^∞ y^{n+1} e^{−2y} dy
             = · · · = (n + 1)/2^{n+2}
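The mixture in Example 3.22 is easy to simulate: draw Y from the Gamma(2, 1) density y e^{−y}, then draw X | Y = y from Poisson(y), and compare the empirical pmf with (n + 1)/2^{n+2}.

```python
import math
import random

# Monte Carlo check of P{X = n} = (n + 1)/2^(n+2) for Example 3.22:
# Y ~ Gamma(2, 1) (density y e^{-y}), then X | Y = y ~ Poisson(y).
random.seed(5)

def poisson(lam):
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

trials = 200_000
counts = {}
for _ in range(trials):
    y = random.gammavariate(2, 1)  # density y e^{-y} for y >= 0
    x = poisson(y)
    counts[x] = counts.get(x, 0) + 1

for n in range(4):
    print(n, counts.get(n, 0) / trials, (n + 1) / 2 ** (n + 2))
```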

Example 3.23 - Yoga Studio

Suppose the number of people who visit a yoga studio each day follows a
Poisson(λ) distribution. Suppose each visitor is, independently, female with
probability p or male with probability 1 − p. Let N and M denote the numbers
of female and male visitors each day, respectively.

    P{N = n, M = m} = ?

1 Exercise (hint: condition on the total number of visitors N + M).
2 N and M are independent Poisson random variables with respective means λp
  and λ(1 − p).
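A simulation sketch of this "Poisson thinning" result, with assumed values λ = 3 and p = 0.4; the empirical means should be λp and λ(1 − p), and the sample covariance of N and M should be near zero, consistent with independence.

```python
import math
import random

# Monte Carlo check for Example 3.23: split a Poisson(lam) visitor count
# into female/male with probability p. Assumed values: lam = 3, p = 0.4.
random.seed(6)
lam, p, trials = 3.0, 0.4, 200_000

def poisson(l):
    limit, k, prod = math.exp(-l), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

ns, ms = [], []
for _ in range(trials):
    total = poisson(lam)
    n = sum(random.random() < p for _ in range(total))  # female visitors
    ns.append(n)
    ms.append(total - n)                                # male visitors

mean_n = sum(ns) / trials
mean_m = sum(ms) / trials
cov = sum(a * b for a, b in zip(ns, ms)) / trials - mean_n * mean_m
print(mean_n, mean_m, cov)  # close to lam*p = 1.2, lam*(1-p) = 1.8, and 0
```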


Example 3.27 - The Ballot Problem

In an election, candidate A receives n votes and candidate B receives m votes,
where n > m. Assuming that all orderings are equally likely, show that the
probability that A is always ahead in the count of votes is
P_{n,m} = (n − m)/(n + m).

Let E be the event that A receives the last vote, so P(E) = n/(n + m). Then

    P_{n,m} = (n/(n + m)) P{A always ahead | E} + (m/(n + m)) P{A always ahead | E^c}
            = (n/(n + m)) P_{n−1,m} + (m/(n + m)) P_{n,m−1}

Induction ...
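The induction step can be checked mechanically: evaluate the recursion (with the boundary cases P_{n,0} = 1 and P_{n,m} = 0 for n ≤ m) and compare with the closed form.

```python
from functools import lru_cache

# Check the ballot-problem recursion against P(n, m) = (n - m)/(n + m) for n > m.
@lru_cache(maxsize=None)
def P(n, m):
    if n <= m:
        return 0.0  # A cannot finish strictly ahead
    if m == 0:
        return 1.0  # B receives no votes at all
    return (n * P(n - 1, m) + m * P(n, m - 1)) / (n + m)

for n, m in [(2, 1), (5, 3), (10, 4)]:
    print(n, m, P(n, m), (n - m) / (n + m))
```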


Compound Random Variables

Let X_i be a sequence of i.i.d. r.v.s with mean μ and variance σ².

Compound r.v.: S_N = Σ_{i=1}^{N} X_i, where the r.v. N (called the compounding
distribution) is independent of the X_i.

    Var(S_N) = σ² E[N] + μ² Var(N)

Proof:

    Var(S_N | N = n) = Var(Σ_{i=1}^{N} X_i | N = n) = Var(Σ_{i=1}^{n} X_i) = nσ²
    E[S_N | N = n] = nμ

Therefore, by the conditional variance formula,

    Var(S_N) = E[Nσ²] + Var(Nμ) = σ² E[N] + μ² Var(N)
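A simulation sketch of the compound-variance formula under assumed distributions: N ~ Poisson(3) and X_i ~ Uniform(0, 2), so μ = 1, σ² = 1/3, and the formula predicts Var(S_N) = (1/3)·3 + 1·3 = 4.

```python
import math
import random

# Monte Carlo check of Var(S_N) = sigma^2 E[N] + mu^2 Var(N).
# Assumed: N ~ Poisson(3), X_i ~ Uniform(0, 2) => mu = 1, sigma^2 = 1/3.
random.seed(7)

def poisson(l):
    limit, k, prod = math.exp(-l), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

trials = 200_000
samples = []
for _ in range(trials):
    n = poisson(3)
    samples.append(sum(random.uniform(0, 2) for _ in range(n)))

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / (trials - 1)
print(mean, var)  # mean close to E[N]*mu = 3, var close to 4
```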



The Compound Random Variable Identity

For any function h,

    E[S_N h(S_N)] = E[N] E[X_1 h(S_M)]

where M is a r.v. independent of the X_i with P{M = n} = n P{N = n}/E[N].

Corollary 3.6: Suppose the X_i are positive integer valued r.v.s and
α_j = P{X_1 = j}. Then

    P{S_N = 0} = P{N = 0}
    P{S_N = k} = (1/k) E[N] Σ_{j=1}^{k} j α_j P{S_{M−1} = k − j}

Poisson compounding distribution: if N is Poisson with mean λ,

    P{M − 1 = n} = (n + 1) P{N = n + 1}/E[N] = e^{−λ} λ^n / n!

So M − 1 is also a Poisson random variable with mean λ.
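Since M − 1 has the same Poisson(λ) distribution as N, S_{M−1} is distributed as S_N, and Corollary 3.6 becomes a self-contained recursion P{S_N = k} = (λ/k) Σ_{j=1}^{k} j α_j P{S_N = k − j}. A sketch with hypothetical inputs (λ = 2 and X_i uniform on {1, 2, 3}):

```python
import math

# Recursion for the compound Poisson pmf from Corollary 3.6:
#   P{S_N = k} = (lam/k) * sum_{j<=k} j * alpha_j * P{S_N = k - j}.
# Hypothetical inputs for this sketch: lam = 2, X_i uniform on {1, 2, 3}.
lam = 2.0
alpha = {1: 1 / 3, 2: 1 / 3, 3: 1 / 3}

kmax = 40
pmf = [math.exp(-lam)] + [0.0] * kmax  # P{S_N = 0} = P{N = 0} = e^{-lam}
for k in range(1, kmax + 1):
    pmf[k] = (lam / k) * sum(j * a * pmf[k - j]
                             for j, a in alpha.items() if j <= k)

mass = sum(pmf)
mean_s = sum(k * p for k, p in enumerate(pmf))
print(mass, mean_s)  # mass close to 1, mean close to lam * E[X_1] = 4
```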