
FYTN03

HT09

Exercises: Stochastic processes


1. Calculate the first three cumulants $\langle X^n \rangle_c$ ($n = 1, 2, 3$) for a normally distributed random number $X$ with mean $\mu$ and variance $\sigma^2$. Use the fact that the characteristic function is the Fourier transform of the probability distribution and that the cumulants are defined in terms of the Taylor expansion of the logarithm of the characteristic function.
2. Consider the probability distribution for the $\chi^2$-distribution with $n$ degrees of freedom:
\[
p(x) = \begin{cases} x^{n/2-1} e^{-x/2} / \bigl[ 2^{n/2} \Gamma(n/2) \bigr] & \text{if } x \ge 0 \\ 0 & \text{otherwise} \end{cases}
\]
where $\Gamma(x) = \int_0^\infty e^{-t} t^{x-1} \, dt$ is the gamma function. What is the characteristic function? What are the first two cumulants? What are the first two moments of $p(x)$?
3. Consider the (discrete) Poisson distribution:
\[
p_n = e^{-\lambda} \frac{\lambda^n}{n!}, \qquad n = 0, 1, 2, 3, \ldots \quad (\lambda > 0)
\]
Show that this distribution is normalized: $\sum_{n=0}^{\infty} p_n = 1$. What is the first moment $\mu$? What is the variance $\sigma^2$?
4. First-passage time problems can be solved by introducing an absorbing boundary into the equations of motion. Alternatively, renewal theory relates the first-passage time density $f(t)$ (from $x = x_0$ to the absorbing point $x = c$) to the probability distribution, $P(x, t|x_0)$, in the absence of the boundary, according to:
\[
\hat{f}(s) = \frac{\hat{P}(c, s|x_0)}{\hat{P}(c, s|c)}
\]
where a "hat" denotes the Laplace transform: $\hat{A}(s) = \int_0^\infty e^{-st} A(t) \, dt$. Show by explicit calculation that the result above agrees with the result in the lecture notes for a one-dimensional random walker satisfying the diffusion equation: $\partial P(x, t|x_0)/\partial t = D \, \partial^2 P(x, t|x_0)/\partial x^2$.
5. Show how a random number with frequency function
\[
p(x) = \begin{cases} (1 + a) x^a & \text{if } 0 < x < 1 \\ 0 & \text{otherwise} \end{cases} \qquad (a > -1)
\]
can be obtained from a rectangularly distributed random number.

6. Show how a random number with frequency function
\[
p(x) = \begin{cases} 4/[\pi(1 + x^2)] & \text{if } 0 < x < 1 \\ 0 & \text{otherwise} \end{cases}
\]
can be obtained from rectangularly distributed random numbers by (a) the transformation method, and (b) the accept/reject method.
Solutions

1. Calculate the first three cumulants $\langle X^n \rangle_c$ ($n = 1, 2, 3$) for a normally distributed random number $X$ with mean $\mu$ and variance $\sigma^2$.

Solution: The characteristic function is the Fourier transform of the probability distribution:
\[
\Phi(k) = \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} dx \, e^{ikx} e^{-(x-\mu)^2/2\sigma^2}.
\]
We make the variable transformation $y = (x - \mu)/\sigma$, which gives
\[
\Phi(k) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} dy \, e^{ik(\sigma y + \mu)} e^{-y^2/2}
= e^{ik\mu - k^2\sigma^2/2} \, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} dy \, e^{-(y - ik\sigma)^2/2},
\]
where the remaining integral is simply $\sqrt{2\pi}$. Now we see that the logarithm of the characteristic function is simply given by
\[
\ln \Phi(k) = ik\mu - k^2\sigma^2/2.
\]
Identifying the powers of $k$ gives $\langle X \rangle_c = \mu$, $\langle X^2 \rangle_c = \sigma^2$, and $\langle X^n \rangle_c = 0$ for $n > 2$.
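As an optional numerical check (not part of the original solution), the sketch below estimates the first three cumulants from a large sample of normal random numbers; it assumes NumPy is available, and the values of $\mu$, $\sigma$ and the seed are arbitrary choices. It uses the fact that the second and third cumulants equal the second and third central moments.

```python
import numpy as np

# Illustrative check of <X>_c = mu, <X^2>_c = sigma^2, <X^3>_c = 0
# for a normal random variable, using sample (central) moments.
rng = np.random.default_rng(seed=1)
mu, sigma = 1.5, 2.0
x = rng.normal(mu, sigma, size=1_000_000)

c1 = x.mean()                      # first cumulant:  expect ~mu
c2 = ((x - c1) ** 2).mean()        # second cumulant: expect ~sigma^2
c3 = ((x - c1) ** 3).mean()        # third cumulant:  expect ~0

print(c1, c2, c3)                  # roughly 1.5, 4.0, 0.0
```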

2. Consider the probability distribution for the $\chi^2$-distribution with $n$ degrees of freedom:
\[
p(x) = \begin{cases} x^{n/2-1} e^{-x/2} / \bigl[ 2^{n/2} \Gamma(n/2) \bigr] & \text{if } x \ge 0 \\ 0 & \text{otherwise} \end{cases}
\]
where $\Gamma(x) = \int_0^\infty e^{-t} t^{x-1} \, dt$ is the gamma function. What is the characteristic function? What are the first two cumulants? What are the first two moments of $p(x)$?

Solution: Taking the Fourier transform of $p(x)$ gives:
\[
\phi(k) = \int_{-\infty}^{\infty} e^{ikx} p(x) \, dx = \frac{1}{2^{n/2} \Gamma(n/2)} \int_0^\infty e^{x(ik - 1/2)} x^{n/2-1} \, dx.
\]
Making the variable substitution $s = (1/2 - ik)x$ we obtain:
\[
\phi(k) = \frac{1}{2^{n/2} (1/2 - ik)^{n/2} \Gamma(n/2)} \int_0^\infty e^{-s} s^{n/2-1} \, ds = (1 - 2ik)^{-n/2},
\]
where in the last step we used the definition of the gamma function. In order to determine the cumulants we expand:
\[
\ln \phi(k) = \ln\bigl[(1 - 2ik)^{-n/2}\bigr] = -\frac{n}{2} \ln(1 - 2ik) = n(ik) + n(ik)^2 + O(k^3),
\]
from which we identify
\[
\langle X \rangle_c = n, \qquad \langle X^2 \rangle_c = 2n.
\]
Using the relation between cumulants and moments, $\langle X \rangle_c = \langle X \rangle$ and $\langle X^2 \rangle_c = \langle X^2 \rangle - \langle X \rangle^2$, we obtain the moments:
\[
\langle X \rangle = n, \qquad \langle X^2 \rangle = n(n + 2).
\]
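These results can also be checked numerically. The sketch below (again assuming NumPy; the choices of $n$, $k$ and the seed are arbitrary) compares a sampled estimate of $\langle e^{ikX} \rangle$ with $(1 - 2ik)^{-n/2}$ and the sample moments with $n$ and $n(n + 2)$.

```python
import numpy as np

# Illustrative check of the chi^2(n) characteristic function and moments.
rng = np.random.default_rng(seed=2)
n = 5
x = rng.chisquare(df=n, size=1_000_000)

k = 0.3
emp_cf = np.exp(1j * k * x).mean()           # empirical E[e^{ikX}]
exact_cf = (1 - 2j * k) ** (-n / 2)          # (1 - 2ik)^(-n/2)

print(emp_cf, exact_cf)                      # agree to a few decimals
print(x.mean(), n)                           # first moment:  n
print((x ** 2).mean(), n * (n + 2))          # second moment: n(n + 2)
```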
3. Consider the (discrete) Poisson distribution:
\[
p_n = e^{-\lambda} \frac{\lambda^n}{n!}, \qquad n = 0, 1, 2, 3, \ldots \quad (\lambda > 0)
\]
Show that this distribution is normalized: $\sum_{n=0}^{\infty} p_n = 1$. What is the first moment $\mu$? What is the variance $\sigma^2$?

Solution: From the fact that the Taylor series of $e^\lambda$ is $\sum_{n=0}^{\infty} \lambda^n/n!$ it follows immediately that $\sum_{n=0}^{\infty} p_n = 1$. The first moment $\mu$ is:
\[
\mu = \sum_{n=0}^{\infty} n p_n = \sum_{n=1}^{\infty} n p_n = e^{-\lambda} \sum_{n=1}^{\infty} \frac{\lambda^n}{(n-1)!} = e^{-\lambda} \lambda \underbrace{\sum_{n=1}^{\infty} \frac{\lambda^{n-1}}{(n-1)!}}_{e^{\lambda}} = \lambda.
\]
The variance is:
\[
\sigma^2 = \sum_{n=0}^{\infty} (n - \mu)^2 p_n = \sum_{n=0}^{\infty} \bigl( \underbrace{n^2}_{n(n-1)+n} - 2\mu n + \mu^2 \bigr) p_n
= \sum_{n=0}^{\infty} n(n-1) p_n + (1 - 2\mu) \underbrace{\sum_{n=0}^{\infty} n p_n}_{\mu} + \mu^2 \underbrace{\sum_{n=0}^{\infty} p_n}_{1}
= e^{-\lambda} \lambda^2 \underbrace{\sum_{n=2}^{\infty} \frac{\lambda^{n-2}}{(n-2)!}}_{e^{\lambda}} + \lambda - \lambda^2 = \lambda,
\]
i.e., for the Poisson distribution $\mu = \sigma^2 = \lambda$.
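A quick numerical confirmation (not part of the original solution; $\lambda$, the truncation point $N$ and the use of NumPy are my own choices) of the normalization, mean and variance:

```python
import numpy as np
from math import exp, factorial

# Truncate the Poisson sums at a large N; terms beyond are negligible.
lam = 3.7
N = 100
p = np.array([exp(-lam) * lam**n / factorial(n) for n in range(N)])
n = np.arange(N)

print(p.sum())                      # ~1      (normalization)
print((n * p).sum())                # ~lam    (first moment mu)
print(((n - lam) ** 2 * p).sum())   # ~lam    (variance sigma^2)
```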

4. First-passage time problems can be solved by introducing an absorbing boundary into the equations of motion. Alternatively, renewal theory relates the first-passage time density $f(t)$ (from $x = x_0$ to the absorbing point $x = c$) to the probability distribution, $P(x, t|x_0)$, in the absence of the boundary, according to:
\[
\hat{f}(s) = \frac{\hat{P}(c, s|x_0)}{\hat{P}(c, s|c)}
\]
where a "hat" denotes the Laplace transform: $\hat{A}(s) = \int_0^\infty e^{-st} A(t) \, dt$. Show by explicit calculation that the result above agrees with the result in the lecture notes for a one-dimensional random walker satisfying the diffusion equation: $\partial P(x, t|x_0)/\partial t = D \, \partial^2 P(x, t|x_0)/\partial x^2$.

Solution: The solution to the 1d diffusion equation with initial condition $P(x, t = 0|x_0) = \delta(x - x_0)$ is:
\[
P(x, t|x_0) = \frac{1}{(4\pi D t)^{1/2}} \exp\!\left( -\frac{(x - x_0)^2}{4Dt} \right).
\]
Taking the Laplace transform gives:
\[
\hat{P}(x, s|x_0) = \frac{1}{2\sqrt{sD}} \exp\!\left( -|x - x_0| \sqrt{s/D} \right).
\]
We hence get:
\[
\hat{f}(s) = \frac{\hat{P}(c, s|x_0)}{\hat{P}(c, s|c)} = \exp\!\left( -|c - x_0| \sqrt{s/D} \right),
\]
so that by inverse Laplace transform we have:
\[
f(t) = \frac{|c - x_0|}{\sqrt{4\pi D t^3}} \exp\!\left( -\frac{(c - x_0)^2}{4Dt} \right),
\]
in agreement with the lecture notes.
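The result can also be seen in a simulation. The sketch below is a minimal check, not the derivation itself: it assumes NumPy, a simple Euler discretization with step size $dt$, and arbitrary choices of $D$, $x_0$, $c$ and the time horizon $T$. It compares the simulated fraction of walkers that have reached $c$ by time $T$ with the analytic hitting probability $\int_0^T f(t)\,dt = \mathrm{erfc}\bigl(|c - x_0|/\sqrt{4DT}\bigr)$.

```python
import numpy as np
from math import erfc, sqrt

# Euler-discretized Brownian motion; checking the boundary only once per step
# gives a slight underestimate of the hitting probability.
rng = np.random.default_rng(seed=4)
D, x0, c = 1.0, 0.0, 1.0
dt, n_steps, n_walkers = 1e-3, 2_000, 20_000
T = n_steps * dt

x = np.full(n_walkers, x0)
absorbed = np.zeros(n_walkers, dtype=bool)
for _ in range(n_steps):
    x += np.sqrt(2 * D * dt) * rng.standard_normal(n_walkers)
    absorbed |= (x >= c)                       # walker has crossed x = c

print(absorbed.mean())                         # simulated hitting probability
print(erfc(abs(c - x0) / sqrt(4 * D * T)))     # analytic value, ~0.62 here
```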


5. Show how a random number with frequency function
\[
p(x) = \begin{cases} (1 + a) x^a & \text{if } 0 < x < 1 \\ 0 & \text{otherwise} \end{cases} \qquad (a > -1)
\]
can be obtained from a rectangularly distributed random number.

Solution: The cumulative distribution is
\[
C(x) = \int_0^x (1 + a)(x')^a \, dx' = x^{a+1}.
\]
If $R$ denotes a uniform random number $\in [0, 1]$, then we set
\[
C(X) = R \;\Rightarrow\; X^{a+1} = R \;\Rightarrow\; X = R^{1/(a+1)}.
\]
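In code the recipe $X = R^{1/(a+1)}$ is a single line. The sketch below (assuming NumPy; $a$, the sample size and the seed are arbitrary) checks the sample mean against the exact mean $(1 + a)/(2 + a)$ of $p(x)$.

```python
import numpy as np

# Inverse-transform sampling of p(x) = (1 + a) x^a on (0, 1).
rng = np.random.default_rng(seed=5)
a = -0.5                                   # any a > -1
R = rng.random(1_000_000)                  # rectangular (uniform) random numbers
X = R ** (1.0 / (a + 1.0))                 # inverse of C(x) = x^(a+1)

print(X.mean(), (1 + a) / (2 + a))         # both ~1/3 for a = -0.5
```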

6. Show how a random number with frequency function
\[
p(x) = \begin{cases} 4/[\pi(1 + x^2)] & \text{if } 0 < x < 1 \\ 0 & \text{otherwise} \end{cases}
\]
can be obtained from rectangularly distributed random numbers by (a) the transformation method, and (b) the accept/reject method.

Solution: (a) The cumulative distribution is
\[
C(x) = \frac{4}{\pi} \int_0^x \frac{dx'}{1 + (x')^2} = \frac{4}{\pi} \tan^{-1}(x),
\]
so that by introducing a uniform random number $R \in [0, 1]$ and setting $C(X) = R$ we get $X = \tan(\pi R/4)$.

(b) The simplest overestimating function is $f_0(x) = 4/\pi$. Thus,
\[
p_0(x) = \frac{f_0(x)}{\int_0^1 f_0(x) \, dx} = 1 \quad \text{if } 0 < x < 1,
\]
and $0$ otherwise. So, we first generate a random number $X_{\rm trial}$ from the $p_0(x)$ (uniform) distribution. We then draw a new uniform random number $R$ and accept $X_{\rm trial}$ with probability $p(X_{\rm trial})/f_0(X_{\rm trial}) = \frac{1}{1 + X_{\rm trial}^2}$, i.e., if
\[
R < \frac{1}{1 + X_{\rm trial}^2}.
\]
The efficiency of this method is larger than 50% ($\pi/4$ to be precise).
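Both recipes are easy to test side by side. The sketch below (assuming NumPy; the sample size and seed are arbitrary) implements the transformation method and the accept/reject method as described above, checks the acceptance rate against $\pi/4$, and compares both sample means with the exact mean $(2/\pi)\ln 2 \approx 0.441$ of $p(x)$.

```python
import numpy as np

# Sampling p(x) = 4 / [pi (1 + x^2)] on (0, 1) by both methods.
rng = np.random.default_rng(seed=6)
N = 1_000_000

# (a) transformation method: C(X) = R  =>  X = tan(pi R / 4)
R = rng.random(N)
x_transform = np.tan(np.pi * R / 4.0)

# (b) accept/reject with the flat envelope f0(x) = 4/pi:
x_trial = rng.random(N)                           # proposal from p0(x) = 1 on (0, 1)
accept = rng.random(N) < 1.0 / (1.0 + x_trial**2) # accept with prob 1/(1 + x^2)
x_reject = x_trial[accept]

print(accept.mean())                              # acceptance rate ~ pi/4 = 0.785
print(x_transform.mean(), x_reject.mean())        # both ~ (2/pi) ln 2 = 0.441
```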
