Lec 03
– Convolution method
– Composition method
– Acceptance-rejection method
Inverse transform method 3–2
Pseudo-random numbers
Many different methods of generating a (uniform[0,1]) random
number ...
• All we want is that the central limit theorem should kick in ...
• Will not focus too much on generating random numbers ... all
simulators and languages have good generators
• Composition approach
• Convolution method
• Acceptance-Rejection technique
P(X ≤ x) = P(F^{-1}(U) ≤ x)
         = P(U ≤ F(x))    (1)
         = F(x)           (2)
where
– (1) follows from the fact that F is an increasing function
– (2) follows from the fact that 0 ≤ F(x) ≤ 1 and the CDF of a
uniform is F_U(y) = y for all y ∈ [0, 1]
• Algorithm:
– Given: the CDF F (x) or the density f (x). If density given
then first integrate to get CDF. (Most frequent error:
incorrect limits on the integration)
– Set F (X) = U and solve for X in terms of U . Have to make
sure that the solution X lies in the correct range.
Example : exp(λ)

1 − e^{−λX} = U
e^{−λX} = 1 − U
X = −(1/λ) log(1 − U)
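The two algorithm steps above (set F(X) = U, solve for X) can be sketched in Python for the exp(λ) case; the function name `sample_exponential` and the use of the standard `random` module are illustrative choices, not from the lecture:

```python
import math
import random

def sample_exponential(lam, u=None):
    """Inverse transform for exp(lam): solve F(X) = 1 - exp(-lam*X) = U,
    giving X = -log(1 - U) / lam. `lam` is the rate parameter."""
    if u is None:
        u = random.random()  # U ~ uniform[0, 1)
    return -math.log(1.0 - u) / lam
```

Since 1 − U is itself uniform on [0, 1], many implementations use −log(U)/λ instead; the form above follows the derivation on the slide.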
• Algorithm : generate U ~ uniform[0, 1] and set X = x_j whenever

Σ_{i=1}^{j−1} p_i ≤ U < Σ_{i=1}^{j} p_i

• Correctness:

P(X = x_j) = P( Σ_{i=1}^{j−1} p_i ≤ U < Σ_{i=1}^{j} p_i )
           = Σ_{i=1}^{j} p_i − Σ_{i=1}^{j−1} p_i
           = p_j

QED
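A minimal sketch of this discrete inverse transform in Python; the function name and argument layout are assumptions for illustration:

```python
import random

def sample_discrete(values, probs, u=None):
    """Return x_j for the smallest j such that p_1 + ... + p_j > U,
    i.e. the j with sum_{i<j} p_i <= U < sum_{i<=j} p_i."""
    if u is None:
        u = random.random()  # U ~ uniform[0, 1)
    cum = 0.0
    for x, p in zip(values, probs):
        cum += p
        if u < cum:
            return x
    return values[-1]  # guard against floating-point round-off in sum(probs)
```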
[Figure: staircase CDF of a discrete distribution, with jump points x1, . . . , x6 on the horizontal axis and levels F(x1), . . . , F(x5) on the vertical axis; the drawn U lands in [F(x3), F(x4)), so X = F_X^{-1}(U) = x4]
Example : geometric(p), with q = 1 − p and F(j) = 1 − q^j

• Algorithm :
• Transformation : X = j if

F(j − 1) ≤ U < F(j) ⇔ 1 − q^{j−1} ≤ U < 1 − q^j
                    ⇔ q^j < 1 − U ≤ q^{j−1}

Therefore

X = min{ j | q^j < 1 − U }
  = min{ j | j log(q) < log(1 − U) }      (take logs)
  = min{ j | j > log(1 − U)/log(q) }      (log(q) < 0 flips the inequality)

i.e.

X = ⌈ log(1 − U)/log(1 − p) ⌉
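The closed-form ceiling expression above makes the geometric sampler a one-liner; this Python sketch (function name assumed for illustration) samples on {1, 2, . . .}:

```python
import math
import random

def sample_geometric(p, u=None):
    """Inverse transform for geometric(p) on {1, 2, ...}:
    X = ceil(log(1 - U) / log(1 - p)), the smallest j with q^j < 1 - U."""
    if u is None:
        u = random.random()  # U ~ uniform[0, 1)
    return math.ceil(math.log(1.0 - u) / math.log(1.0 - p))
```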
• the pmf of binomial(n, p) is

p_i = P(X = i) = (n choose i) p^i (1 − p)^{n−i},  i = 0, . . . , n

Therefore ...

p_{i+1} = ((n − i)/(i + 1)) · (p/(1 − p)) · p_i
• Algorithm
Disadvantages:
Advantages:
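One way to sketch the binomial inverse transform in Python, building the CDF on the fly with the recursion above rather than computing each binomial coefficient (the function name is an assumption, and 0 < p < 1 is assumed):

```python
import random

def sample_binomial(n, p, u=None):
    """Inverse transform for binomial(n, p), 0 < p < 1, accumulating the
    CDF with the recursion p_{i+1} = ((n-i)/(i+1)) * (p/(1-p)) * p_i."""
    if u is None:
        u = random.random()       # U ~ uniform[0, 1)
    ratio = p / (1.0 - p)
    pmf = (1.0 - p) ** n          # p_0 = (1 - p)^n
    cdf = pmf
    i = 0
    while u >= cdf and i < n:
        pmf *= (n - i) / (i + 1) * ratio   # step p_i -> p_{i+1}
        cdf += pmf
        i += 1
    return i
```

The recursion avoids factorials entirely, at the cost of an O(n) search per sample in the worst case.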
Convolution method
X = Z1 + Z2 + . . . + Zm, where Z1, . . . , Zm are independent random variables that are easy to generate
• Algorithm :
Example : Erlang(λ, m)

f_{λ,m}(x) = λ(λx)^{m−1} e^{−λx} / (m − 1)!

X = Z1 + Z2 + . . . + Zm is Erlang(λ, m) when Z1, . . . , Zm are iid exp(λ)
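The convolution method for Erlang(λ, m) can be sketched in a few lines of Python, generating each exp(λ) summand by the inverse transform derived earlier (the function name is illustrative):

```python
import math
import random

def sample_erlang(lam, m):
    """Convolution method: the sum of m iid exp(lam) variates, each drawn
    by inverse transform, is Erlang(lam, m)."""
    return sum(-math.log(1.0 - random.random()) / lam for _ in range(m))
```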