MScFE 622 CTSP - Compiled - Notes - M6
Compiled Notes
Module 6
MScFE 622
Continuous-time Stochastic
Processes
Revised: 07/07/2020
Contents
Unit 1: The Poisson Process
Problem Set
Summary
Module 6 introduces Levy processes and demonstrates their application to modeling stock price returns.
The module begins with an introduction to the Poisson process and then continues by discussing Levy
processes and developing proofs of their properties. At the end of the module, applications of Levy
processes to financial modeling are discussed, with an emphasis on exponential Levy models.
• A stochastic process N = {Nt : t ≥ 0} is called a counting process if N is cadlag and the sample
paths of N are piecewise constant with jumps of size 1. We will also assume that N0 = 0.
• So we can think of a counting process N as a stochastic process such that Nt counts the
number of events that have occurred up to (and including) time t, and the increment Nt − Ns
(for 0 ≤ s < t) counts the number of events that have occurred in the interval (s, t]. Every
sample path of a counting process N moves through the states 0, 1, 2, 3, . . . in that order.
• For the remainder of this section, let N be a homogeneous Poisson process with rate λ > 0.
Define the random variables (stopping times) S0 , S1 , . . . as follows:
S0 := 0, Sn := inf {t ≥ 0 : Nt = n} n ≥ 1.
The Sn ’s are called the arrival times of N ; that is, Sn is the time of arrival of the nth event.
Note that S0 ≤ S1 ≤ S2 ≤ . . ..
Define also the interarrival times
Tn := Sn − Sn−1 , n ≥ 1.
These random variables represent the time between successive events of N . For n ≥ 1, the
arrival times can be recovered as
Sn = Σ_{i=1}^{n} Ti .
• Let us now find the distribution of Tn , and consequently, that of Sn for n ≥ 1. First note the
equivalence of the following events (for t ≥ 0):
{T1 > t} = {S1 > t} = {Nt = 0}.
The intuitive explanation of this relationship is that if no events have occurred by time t (i.e.,
Nt = 0), then the arrival of the first event is after time t (i.e., S1 > t, or T1 > t since S1 = T1 ),
and vice versa. Since T1 is a non-negative random variable and Nt has a Poisson distribution
with parameter λt, we have (for t ≥ 0)
1 − FT1 (t) = P (T1 > t) = P (Nt = 0) = e^{−λt} .
Hence T1 (and S1 ) has an exponential distribution with parameter λ > 0.
• Let us now find the joint distribution of S1 and S2 . For 0 ≤ w1 ≤ w2 we have
FS1 S2 (w1 , w2 ) = P (S1 ≤ w1 , S2 ≤ w2 ) = P (Nw1 ≥ 1, Nw2 ≥ 2)
= P (Nw1 = 1, Nw2 − Nw1 ≥ 1) + P (Nw1 ≥ 2)
= λw1 e^{−λw1} (1 − e^{−λ(w2−w1)}) + 1 − λw1 e^{−λw1} − e^{−λw1}
= 1 − λw1 e^{−λw2} − e^{−λw1} .
Hence the joint density of S1 and S2 is
fS1 S2 (w1 , w2 ) = λ² e^{−λw2} for 0 ≤ w1 ≤ w2 , and 0 otherwise.
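As a quick sanity check of this joint distribution, the following sketch (assuming Python with numpy; the values λ = 1.5, w1 = 0.5, w2 = 1.2 are arbitrary illustrative choices) estimates P (S1 ≤ w1 , S2 ≤ w2 ) by simulating exponential interarrival times and compares it with the closed-form expression derived above.
```python
import numpy as np

rng = np.random.default_rng(0)
lam, w1, w2 = 1.5, 0.5, 1.2        # arbitrary illustrative values (w1 <= w2)
n_paths = 200_000

# First two arrival times: S1 = T1 and S2 = T1 + T2 with T1, T2 i.i.d. Exp(lam).
T = rng.exponential(scale=1 / lam, size=(n_paths, 2))
S1, S2 = T[:, 0], T[:, 0] + T[:, 1]

mc_estimate = np.mean((S1 <= w1) & (S2 <= w2))
closed_form = 1 - lam * w1 * np.exp(-lam * w2) - np.exp(-lam * w1)

print(f"Monte Carlo estimate: {mc_estimate:.4f}")
print(f"Closed-form value:    {closed_form:.4f}")
```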
• We can also define the Poisson process by starting with an i.i.d. sequence of exponential random
variables T1 , T2 , . . . with parameter λ > 0. Then we define the arrival times as
S0 = 0, Sn = Sn−1 + Tn , n ≥ 1.
Then the counting process N defined by
Nt = Σ_{n=1}^{∞} I{Sn ≤ t} = # {n : Sn ≤ t}
is a Poisson process with rate λ.
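This construction translates directly into a simulation recipe. The sketch below (a minimal illustration assuming Python with numpy; the rate λ = 2 and the horizon of 10 time units are arbitrary choices) builds the arrival times as cumulative sums of i.i.d. exponential interarrival times and evaluates Nt = #{n : Sn ≤ t}.
```python
import numpy as np

rng = np.random.default_rng(1)
lam, horizon = 2.0, 10.0                 # illustrative rate and time horizon

# Draw many more interarrival times than are expected on [0, horizon],
# then keep only the arrival times S_n that fall inside the horizon.
T = rng.exponential(scale=1 / lam, size=int(10 * lam * horizon))
S = np.cumsum(T)                         # arrival times S_n = T_1 + ... + T_n
S = S[S <= horizon]

def N(t, arrivals=S):
    """Counting process: number of arrivals up to and including time t."""
    return int(np.searchsorted(arrivals, t, side="right"))

print("N(5.0) =", N(5.0), " (expected value lam*t =", lam * 5.0, ")")
```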
• Let X = {Xt : t ≥ 0} be a stochastic process (adapted to F). We say that X is a Levy process
if
1. X0 = 0
2. X has independent increments
3. X has stationary increments
4. X is stochastically continuous; i.e., for every t ≥ 0 and every ε > 0,
lim_{s→t} P (|Xs − Xt | > ε) = 0.
• It can be shown that if X is a Levy process, then X has a cadlag modification, and because of
that we will simply assume that X is cadlag.
• The (homogeneous) Poisson process discussed above is a Levy process. Again, the first three
properties are clearly satisfied. For the 4th property, we use Markov’s inequality (or Chebyshev’s
inequality) to obtain (for ε > 0)
P (|Ns − Nt | > ε) ≤ E (|Ns − Nt |) / ε = λ |s − t| / ε → 0 as s → t.
The paths of N are of course discontinuous, with jumps of size 1 (i.e., ∆N ∈ {0, 1}).
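To make the stochastic-continuity argument concrete, the short sketch below (an illustration assuming Python with numpy; the values λ = 2 and ε = 0.5 are arbitrary) compares the exact probability P (|Ns − Nt | > ε) = 1 − e^{−λ|s−t|} (valid for any ε ∈ (0, 1) because N is integer-valued) with the Markov bound λ|s − t|/ε as s → t.
```python
import numpy as np

lam, eps = 2.0, 0.5                    # illustrative rate and threshold
for h in [0.5, 0.1, 0.01, 0.001]:      # h = |s - t|
    exact = 1 - np.exp(-lam * h)       # P(|N_s - N_t| > eps) for eps in (0, 1)
    bound = lam * h / eps              # Markov bound E|N_s - N_t| / eps
    print(f"|s-t| = {h:>6}:  exact = {exact:.5f},  Markov bound = {bound:.5f}")
```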
• One drawback of using the Poisson process to model stock price returns is that the sizes of the
jumps are always equal to 1, which is not realistic. We now look at a generalization of this by
defining what is called a compound Poisson process.
• Let N be a homogeneous Poisson process with rate λ > 0 and Y1 , Y2 , . . . be a sequence of i.i.d.
random variables that are also independent of N . A compound Poisson process is a stochastic
process X defined by
Xt := Σ_{n=1}^{∞} Yn I{Nt ≥ n} ,
which equals 0 when Nt = 0 and Σ_{k=1}^{Nt} Yk when Nt ≥ 1.
• We can think of X as a generalization of the Poisson process, where the sizes of the jumps are
random variables, instead of just being all equal to 1.
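The definition suggests a direct way to sample a compound Poisson process. The sketch below (assuming Python with numpy; the rate λ = 2, the time t = 3 and standard normal jump sizes are arbitrary illustrative choices) draws Nt and then sums Nt i.i.d. jump sizes, and checks the sample mean and variance against λtE[Y1 ] and λtE[Y1²].
```python
import numpy as np

rng = np.random.default_rng(2)
lam, t = 2.0, 3.0                         # illustrative rate and time

def compound_poisson(t, lam, rng, n_samples):
    """Sample X_t = sum_{k=1}^{N_t} Y_k with N_t ~ Poisson(lam*t) and Y_k ~ N(0,1)."""
    Nt = rng.poisson(lam * t, size=n_samples)
    return np.array([rng.normal(size=n).sum() for n in Nt])   # empty sum is 0

samples = compound_poisson(t, lam, rng, n_samples=100_000)
print("sample mean:", samples.mean(), " (theory: lam*t*E[Y] = 0)")
print("sample var :", samples.var(),  " (theory: lam*t*E[Y^2] =", lam * t, ")")
```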
• The compound Poisson process is a Levy process. The full proof of this fact is left as an
exercise, but we will show the stationarity of increments. Consider the times 0 ≤ s < t. Then
the characteristic function of the increment is
ϕXt −Xs (u) = E( e^{iu(Xt −Xs )} ) = E( e^{iu Σ_{k=Ns +1}^{Nt} Yk} ) = E( E( e^{iu Σ_{k=Ns +1}^{Nt} Yk} | σ (Ns , Nt ) ) ) = E( ϕY (u)^{Nt −Ns} ) = e^{λ(t−s)(ϕY (u)−1)} ,
since the Yk are i.i.d. and independent of N , and Nt − Ns has a Poisson distribution with
parameter λ(t − s). So, clearly, the distribution of the increment only depends on t − s.
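The stationarity of the increments can also be checked numerically. The sketch below (assuming Python with numpy; λ = 2, standard normal jumps, u = 1.3 and the two time pairs are arbitrary illustrative choices) estimates E( e^{iu(Xt −Xs )} ) by Monte Carlo for two pairs (s, t) with the same t − s and compares both with exp(λ(t − s)(ϕY (u) − 1)).
```python
import numpy as np

rng = np.random.default_rng(3)
lam, u = 2.0, 1.3                               # illustrative rate and argument

def increment_samples(s, t, n=200_000):
    """Samples of X_t - X_s: a sum of (N_t - N_s) i.i.d. N(0,1) jump sizes."""
    counts = rng.poisson(lam * (t - s), size=n)
    return np.array([rng.normal(size=k).sum() for k in counts])

phi_Y = np.exp(-u**2 / 2)                       # characteristic function of N(0,1) at u
for s, t in [(0.0, 1.0), (4.0, 5.0)]:           # same increment length t - s = 1
    mc = np.mean(np.exp(1j * u * increment_samples(s, t)))
    theory = np.exp(lam * (t - s) * (phi_Y - 1))   # real because phi_Y(u) is real here
    print(f"(s, t) = ({s}, {t}):  MC = {mc.real:.4f} (imag ~ {abs(mc.imag):.1e}),  theory = {theory:.4f}")
```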
• An important example of a Levy process is the jump-diffusion process
Xt = bt + σWt + Σ_{k=1}^{Nt} Yk ,
which is the sum of:
– A deterministic term bt
– A diffusion term σWt
– A pure jump term Σ_{k=1}^{Nt} Yk .
Its characteristic function is
E( e^{iuXt} ) = exp( t ( ibu − (1/2)σ²u² + λ(ϕY (u) − 1) ) ).
• It turns out that the characteristic function of any Levy process is similar to the one above.
Let X be a Levy process and for a fixed u ∈ R, define gu : [0, ∞) → C to be the characteristic
function of Xt :
gu (t) := E( e^{iuXt} ), t ≥ 0.
Then it can be shown (the Levy-Khintchine formula) that
gu (t) = e^{tψ(u)}
where
ψ(u) = ibu − (1/2)σ²u² + ∫_R ( e^{iuy} − 1 − iuy I[−1,1] (y) ) dν(y)
for some b ∈ R, σ² ≥ 0 and a measure ν on B(R), called the Levy measure of X, that satisfies
ν({0}) = 0, and ∫_R ( 1 ∧ y² ) dν(y) < ∞.
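To see the Levy-Khintchine formula in action, the sketch below (assuming Python with numpy; a compound Poisson process with rate λ = 2 and standard normal jumps is an arbitrary illustrative choice, for which ν(dy) = λ fY (y) dy, σ = 0 and b = λ ∫_{[−1,1]} y fY (y) dy) evaluates ψ(u) by numerical integration and compares exp(tψ(u)) with the known characteristic function exp(λt(ϕY (u) − 1)).
```python
import numpy as np

lam, u, t = 2.0, 1.3, 1.5                        # illustrative values
y = np.linspace(-10.0, 10.0, 200_001)            # integration grid for the Levy measure
dy = y[1] - y[0]
f_Y = np.exp(-y**2 / 2) / np.sqrt(2 * np.pi)     # N(0,1) jump density
nu = lam * f_Y                                   # Levy measure density: nu(dy) = lam * f_Y(y) dy

inside = (np.abs(y) <= 1).astype(float)          # indicator of [-1, 1]
b = np.sum(lam * y * f_Y * inside) * dy          # drift term matching the truncation convention
integrand = (np.exp(1j * u * y) - 1 - 1j * u * y * inside) * nu
psi = 1j * b * u + np.sum(integrand) * dy        # sigma = 0 for a compound Poisson process

levy_khintchine = np.exp(t * psi)                    # g_u(t) = exp(t * psi(u))
direct = np.exp(lam * t * (np.exp(-u**2 / 2) - 1))   # exp(lam * t * (phi_Y(u) - 1))
print("Levy-Khintchine:", levy_khintchine)
print("Direct formula :", direct)
```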
• For a Levy process X, a time t ≥ 0 and a Borel set B ∈ B(R) with 0 ∉ B̄, define
Nt (B)(ω) := # {s ∈ (0, t] : ∆Xs (ω) ∈ B} .
That is, Nt (B)(ω) is the number of jumps of X up to time t whose size lies in B. This random
variable is well-defined since X is cadlag, and therefore has only finitely many jumps of size in
B (a set bounded away from 0) in a finite interval. Now note that
– For fixed ω ∈ Ω, B ↦ Nt (B)(ω) is a positive measure (a counting measure)
– For fixed B ∈ B(R) with 0 ∉ B̄, ω ↦ Nt (B)(ω) is a random variable
– For fixed B ∈ B(R) with 0 ∉ B̄, (t, ω) ↦ Nt (B)(ω) is a counting process.
In fact, N· (B) is a Poisson process. Indeed, define the stopping times τ0 < τ1 < τ2 < . . . by
τ0 = 0, τn+1 := inf {t > τn : ∆Xt ∈ B} .
Then N (B) can be written as
Nt (B) = Σ_{n=1}^{∞} I{τn ≤ t} .
Hence all we need to show is that the interarrival times τn+1 − τn are i.i.d. exponentially dis-
tributed. This is achieved by showing that the distribution of τn+1 − τn is memoryless, i.e.,
P (τn+1 − τn > s + t) = P (τn+1 − τn > t) P (τn+1 − τn > s) , s, t > 0.
This is left as an exercise.
• So we get that N (B) is a Poisson process with rate E(N1 (B)) =: ν(B). This is how the Levy
measure is obtained.
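For a compound Poisson process this can be checked directly: its Levy measure is ν(dy) = λ P (Y1 ∈ dy), so N1 (B) should be Poisson with mean ν(B) = λ P (Y1 ∈ B). The sketch below (assuming Python with numpy; λ = 3, standard normal jumps and B = [1, ∞) are arbitrary illustrative choices) compares the empirical mean and variance of N1 (B) with ν(B).
```python
import numpy as np
from math import erf

rng = np.random.default_rng(4)
lam, t, n_paths = 3.0, 1.0, 50_000

# N_t(B): number of jumps up to time t whose size falls in B = [1, inf).
counts_in_B = np.empty(n_paths)
for i in range(n_paths):
    n_jumps = rng.poisson(lam * t)             # total number of jumps on [0, t]
    jump_sizes = rng.normal(size=n_jumps)      # i.i.d. N(0,1) jump sizes
    counts_in_B[i] = np.sum(jump_sizes >= 1.0)

# Levy measure of B: nu(B) = lam * P(Y >= 1) for a standard normal jump size.
p_B = 0.5 * (1 - erf(1 / np.sqrt(2)))
print("empirical mean:", counts_in_B.mean(), " vs  t * nu(B) =", t * lam * p_B)
print("empirical var :", counts_in_B.var(),  " (Poisson: variance equals the mean)")
```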
• We now move on to the infinite divisibility property of Levy processes. Let X be a Levy process
and t > 0 be fixed. Then for each n ≥ 1 we can write Xt as
Xt = ( Xt − X_{t(n−1)/n} ) + · · · + ( X_{t/n} − X0 ) = Σ_{i=1}^{n} Zi ,
where the Zi := X_{ti/n} − X_{t(i−1)/n} are i.i.d. by the stationarity and independence of the
increments. Hence the distribution of Xt is infinitely divisible.
• Let Y be a random variable whose distribution is infinitely divisible. Then the characteristic
function of Y is given by
ϕY (u) = e^{ψ(u)}
where
ψ(u) = ibu − (1/2)σ²u² + ∫_R ( e^{iuy} − 1 − iuy I[−1,1] (y) ) dν(y)
for some b ∈ R, σ² ≥ 0 and a measure ν on B(R), called the Levy measure of Y , that satisfies
ν({0}) = 0, and ∫_R ( 1 ∧ y² ) dν(y) < ∞.
• For example, the Levy process X such that X1 has a Gamma distribution is called a Gamma
process.
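Because the Gamma distribution is infinitely divisible, a Gamma process path can be simulated by summing independent Gamma increments. The sketch below (assuming Python with numpy; the shape rate a = 2 per unit time, scale θ = 1 and step ∆t = 0.01 are arbitrary illustrative choices) does exactly that, so that Xt ∼ Gamma(at, θ).
```python
import numpy as np

rng = np.random.default_rng(5)
a, theta = 2.0, 1.0            # shape per unit time and scale (illustrative)
dt, n_steps = 0.01, 1_000      # horizon T = n_steps * dt = 10

# Independent, stationary Gamma(a*dt, theta) increments => X_t ~ Gamma(a*t, theta).
increments = rng.gamma(shape=a * dt, scale=theta, size=n_steps)
X = np.concatenate(([0.0], np.cumsum(increments)))    # X_0 = 0
T = n_steps * dt

print("X_T =", X[-1], " (E[X_T] = a * theta * T =", a * theta * T, ")")
```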
• We have seen such a model in the previous module (the Black-Scholes model), where S satisfies
the following SDE:
dSt = St (µ dt + σ dWt ) = St dXt ,
where Xt := µt + σWt is a Levy process. We can write S as
St = S0 e^{Xt − (1/2)σ²t} = S0 E(X)t .
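A minimal simulation sketch of this model (assuming Python with numpy; µ = 0.1, σ = 0.2, S0 = 100 and a one-year horizon on a daily grid are arbitrary illustrative choices) generates a path of S directly from the closed form St = S0 exp(Xt − (1/2)σ²t) with Xt = µt + σWt .
```python
import numpy as np

rng = np.random.default_rng(6)
mu, sigma, S0 = 0.1, 0.2, 100.0          # illustrative parameters
T, n_steps = 1.0, 252
dt = T / n_steps

t = np.linspace(0.0, T, n_steps + 1)
dW = rng.normal(scale=np.sqrt(dt), size=n_steps)
W = np.concatenate(([0.0], np.cumsum(dW)))        # Brownian motion path
X = mu * t + sigma * W                            # the Levy process X_t = mu*t + sigma*W_t
S = S0 * np.exp(X - 0.5 * sigma**2 * t)           # S_t = S0 * E(X)_t

print("S_T =", S[-1])
```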
• To deal with general discontinuous Levy processes, we need to first introduce the stochastic
calculus for general semimartingales. We state only the one-dimensional version of Ito’s formula;
the multidimensional version can be found in any standard reference for stochastic calculus.
For f ∈ C²(R) and a semimartingale X,
f (Xt ) = f (X0 ) + ∫_0^t f ′(Xs− ) dXs + (1/2) ∫_0^t f ″(Xs− ) d[X]s + Σ_{0<s≤t} ( f (Xs ) − f (Xs− ) − f ′(Xs− )∆Xs − (1/2) f ″(Xs− )(∆Xs )² ).
Here [X] is the (optional) quadratic variation of X, rather than the predictable quadratic
variation of X denoted by ⟨X⟩. The latter is not even defined for all semimartingales, but the
two are equal for continuous semimartingales.
• The stochastic exponential (Doleans-Dade exponential) E(X) of a semimartingale X is the
unique solution Y of the SDE
dY = Y− dX, Y0 = 1.
• In general, exponential Levy models give rise to incomplete markets. Consider the following
popular jump-diffusion type model for S:
dSt = St− (µ dt + σ dWt + dJt ) = St− dXt , with Xt := µt + σWt + Jt ,
where
Jt = Σ_{k=1}^{Nt} Yk
is a compound Poisson process with jump sizes Yk > −1. The solution to this SDE is
St = S0 E(X)t
= S0 exp( Xt − (1/2)σ²t ) Π_{k=1}^{Nt} (1 + Yk ) exp(−Yk )
= S0 exp( Xt − Jt − (1/2)σ²t ) Π_{k=1}^{Nt} (1 + Yk )
= S0 exp( (µ − (1/2)σ²)t + σWt + Σ_{k=1}^{Nt} ln(1 + Yk ) ),
since
[J]t = Σ_{s≤t} (∆Js )² = Σ_{s≤t} (∆Xs )² = Σ_{k=1}^{Nt} Yk² .
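The closed-form solution above can be simulated directly. The sketch below (assuming Python with numpy; µ = 0.1, σ = 0.2, jump intensity λ = 1 and jump sizes Yk = e^{Zk} − 1 with Zk ∼ N(−0.1, 0.2²), which guarantees Yk > −1, are arbitrary illustrative choices) builds one path of S on a daily grid.
```python
import numpy as np

rng = np.random.default_rng(7)
mu, sigma, S0 = 0.1, 0.2, 100.0
lam = 1.0                                   # jump intensity
T, n_steps = 1.0, 252
dt = T / n_steps
t = np.linspace(0.0, T, n_steps + 1)

# Diffusion part: Brownian motion on the grid.
dW = rng.normal(scale=np.sqrt(dt), size=n_steps)
W = np.concatenate(([0.0], np.cumsum(dW)))

# Jump part: arrival times from exponential interarrivals, sizes Y_k = exp(Z_k) - 1 > -1.
arrivals = np.cumsum(rng.exponential(scale=1 / lam, size=100))
arrivals = arrivals[arrivals <= T]
log_jumps = rng.normal(loc=-0.1, scale=0.2, size=arrivals.size)     # ln(1 + Y_k)
cum_log_jump = np.array([log_jumps[arrivals <= s].sum() for s in t])

# S_t = S0 * exp((mu - sigma^2/2) t + sigma W_t + sum_{k <= N_t} ln(1 + Y_k))
S = S0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * W + cum_log_jump)
print("S_T =", S[-1], " number of jumps:", arrivals.size)
```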
Problem Set
Problem 1. Let the number of accidents N occur according to a Poisson process with rate
λ = 5 per day. What is the expected number of accidents between the fifth day and the seventh
day?
Solution: Since the increment N7 − N5 has a Poisson distribution with parameter λ(7 − 5), we
have E(N7 − N5 ) = λ(7 − 5). That is, we expect λt accidents in t time units. Thus, in our
example, the expected number of accidents between the fifth day and the seventh day is equal
to 5 ∗ 2 = 10.
Problem 2. Let the number of accidents N occur according to a Poisson process with rate
λ = 2 per day. What is the probability the number of accidents between the third day and the
fourth day is 3?
Solution: By the stationarity of the increments,
P (N4 − N3 = 3) = P (N1 = 3).
We also know from the lecture notes that
P (Nt = k) = e^{−λt} (λt)^k / k! .
So, finally, we get
P (N4 − N3 = 3) = P (N1 = 3) = e^{−2} 2³ / 3! ≈ 0.1804,
which is the solution to the problem.
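As a quick numerical check (a minimal sketch assuming Python with numpy), the exact Poisson probability can be compared with a Monte Carlo estimate of P (N4 − N3 = 3):
```python
import math
import numpy as np

lam, k = 2.0, 3
exact = math.exp(-lam) * lam**k / math.factorial(k)

rng = np.random.default_rng(8)
mc = np.mean(rng.poisson(lam, size=1_000_000) == k)   # N_4 - N_3 ~ Poisson(lam * 1)

print(f"exact = {exact:.4f},  Monte Carlo = {mc:.4f}")
```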
Problem 3. Let the number of accidents N occur according to a Poisson process with rate
λ = 2 per day. Given that no accidents have occurred in the last 3 days, what is the probability
that the next accident occurs within the next day?
Solution: By the independence of the increments, the information that no accidents occurred
in the last three days is irrelevant, so the required probability is the probability that at least
one accident occurs between day three and day four:
P (N4 − N3 > 0) = P (N1 > 0) = 1 − P (N1 = 0) = 1 − e^{−λt} (λt)^k / k!
with k = 0, λ = 2 and t = 1. The solution is therefore equal to
P (N4 − N3 > 0) = 1 − e^{−2} ≈ 0.8647.
Problem 4. Let N be a Poisson process with rate λ = 2 and Y1 , Y2 , . . . be i.i.d. normal random
variables with mean µ = 2 and variance σ 2 = 1. Define the compound Poisson process X by
Xt := Σ_{k=1}^{Nt} Yk .
What is E(X2 )?
Solution: Since Y1 , Y2 , . . . are i.i.d. normal random variables independent of N , the expected
value can be computed by conditioning on Nt :
E[Xt ] = E[ Σ_{k=1}^{Nt} Yk ] = E[Nt ] ∗ E[Y1 ] = λt ∗ µ.
With λ = 2, t = 2 and µ = 2, this gives E(X2 ) = 2 ∗ 2 ∗ 2 = 8.
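A short Monte Carlo sanity check of this answer (a sketch assuming Python with numpy) simulates X2 directly from its definition:
```python
import numpy as np

rng = np.random.default_rng(9)
lam, t, mu, sigma = 2.0, 2.0, 2.0, 1.0     # parameters from Problem 4
n_paths = 200_000

Nt = rng.poisson(lam * t, size=n_paths)
X2 = np.array([rng.normal(loc=mu, scale=sigma, size=n).sum() for n in Nt])

print("Monte Carlo E[X_2]:", X2.mean(), " (theory: lam*t*mu =", lam * t * mu, ")")
```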
Var(X(t)) = (σ² + m²)λt = (1/3 + 1) ∗ 6 ∗ 1 = 8
Solution: We need to compute b such that the expected value of Xt + bt, for t = 2, is equal to
zero. Thus,
E[X2 + 2b] = E[X2 ] + 2b = 20/2 + 2b = 0, hence b = −5.