MScFE 622 CTSP - Compiled - Video - Transcripts - M6
Video Transcripts
Module 6
MScFE 622
Continuous-time Stochastic
Processes
A stochastic process, N, where N = {N_t : t ≥ 0}, is called a Poisson process if it satisfies the
following conditions:
1 N is a counting process. This means that N_0 = 0 and N_t counts the number of events up to,
and including, time t. So, this implies, of course, that the state space of N – in other words,
the values that N_t takes – will all be non-negative integers.
2 N has independent increments, which means that if we pick two time points, time s and
time t, where s is less than t, then the increment N_t − N_s – in other words, the number
of events that occur over the interval between s and t – is independent of the number of
events that occur in any other interval that does not overlap with (s, t]. So, increments
over non-overlapping intervals are independent.
3 N has stationary increments, which means that, again, if we pick two time points, s and t,
then the distribution – or the law – of N_t − N_s, which, in this case, is simply the number of
events between time s and time t, does not depend on s. In other words, if we shift this
interval a certain length – let's say, we shift it by h units to s + h and t + h – then the
distribution of N_(t+h) − N_(s+h) will be the same as the distribution of N_t − N_s. Another
way of putting this is that the distribution of the increment N_t − N_s depends only on t − s,
which is the length of the time interval.
So, saying that the increments have a Poisson distribution means that the law of N_t − N_s has the
following PMF:

ℙ(N_t − N_s = n) = (λ(t − s))^n e^(−λ(t−s)) / n!

This is true for n = 0, 1, 2, … and, of course, the probability is 0 otherwise, or elsewhere.
This is a Poisson process and we can call a stochastic process Poisson if it satisfies the conditions
listed above.
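As a quick numerical sanity check of this PMF – a sketch not part of the lecture, assuming numpy is available, with the rate λ, the interval (s, t], and the sample size chosen arbitrarily – we can sample the increment N_t − N_s directly from a Poisson(λ(t − s)) distribution and compare the empirical frequency of {N_t − N_s = n} with the formula:

```python
import math

import numpy as np

rng = np.random.default_rng(0)
lam, s, t = 2.0, 1.0, 3.0          # arbitrary rate and interval (s, t]
n_paths = 200_000

# N_t - N_s for a rate-lam Poisson process is Poisson(lam * (t - s)),
# so we can sample the increment directly.
increments = rng.poisson(lam * (t - s), size=n_paths)

# Empirical frequency of {N_t - N_s = n} versus the PMF from the definition.
n = 4
empirical = float(np.mean(increments == n))
theoretical = (lam * (t - s)) ** n * math.exp(-lam * (t - s)) / math.factorial(n)
```

With λ(t − s) = 4 and n = 4, the two numbers should agree up to Monte Carlo error.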
We have dealt with the Poisson process briefly before and we showed an important property: if
N is a Poisson process, then the stochastic process N_t − λt, which we call the compensated
Poisson process, is a martingale. In fact, N itself is a submartingale. So:
M = {N_t − λt : t ≥ 0} is a martingale.
Quite simply, if we look at a diagram, the path starts at 0 and remains there for a certain period
of time, T_1, until an event occurs, and it jumps up to 1, where it stays for a random amount of time,
T_2, until a second event occurs and it jumps up to 2, where it stays for a period of time, T_3, and so
on. In the next video, we are going to talk more about these times, which are called interarrival
times.
Therefore, the sample paths of a Poisson process are piecewise constant and have jumps of size 1. So, in
particular, Poisson processes are an example of stochastic processes that do not have continuous
sample paths, unlike Brownian motion.
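To visualize these piecewise-constant paths, here is a minimal simulation sketch, assuming numpy; it uses the exponential interarrival times that the next video derives, and the rate and time horizon are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, horizon = 1.5, 10.0

# Interarrival times of a rate-lam Poisson process are Exponential(lam)
# (shown in the next video); the arrival times are their running sums.
interarrivals = rng.exponential(1.0 / lam, size=100)
arrivals = np.cumsum(interarrivals)
arrivals = arrivals[arrivals <= horizon]

def N(t):
    """Counting process: number of events up to and including time t."""
    return int(np.searchsorted(arrivals, t, side="right"))

# The path starts at 0, is non-decreasing, and jumps by exactly 1
# at each arrival time.
grid = np.linspace(0.0, horizon, 2001)
path = np.array([N(u) for u in grid])
```

Plotting `path` against `grid` with a step plot reproduces the staircase picture described above.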
Now that we have introduced the Poisson process, in the next video we are going to talk about
interarrival times.
Let N be a homogeneous Poisson process, where N = {N_t : t ≥ 0}, with rate λ > 0.
Remember that the sample paths of N always start at 0 and stay at 0 for a random amount of time,
T_1, before jumping up to 1, where the path stays for a random amount of time, T_2, before jumping up to 2,
where it does the same, staying for a random amount of time, T_3, and so on. So, that's why the state
space is the non-negative integers.
We will call these random amounts of time, which are random variables, the interarrival times, and
we will call the time up until the nth event the arrival time of the nth event. So, for example, we
will call the time between 0 and the end of T_1, denoted W_1, the arrival time of the first event. The second
time interval, between 0 and the end of T_2, will be called W_2, which is the arrival time of the second
event, and so on. W_3 will be the arrival time of the third event. So, the obvious relationship
between the interarrival times and the arrival times is as follows:
W_n = Σ_(i=1)^n T_i
We can define all of this formally as W_n = inf{t ≥ 0 : N_t = n}. This is the first time that the Poisson
process reaches n, and we then, of course, define T_n := W_n − W_(n−1). That is the definition of
arrival and interarrival times.
Now, the distributional properties of the interarrival times can easily be derived from the
properties of a Poisson process. For instance, to find the distribution of T_1, we calculate the
probability that T_1 is greater than t. Now, since the W_n's are increasing – in other words, W_3 is
greater than or equal to W_2, which is greater than or equal to W_1 – the T_n's are non-negative random variables.
We can calculate the survivor function – in other words, the probability that T_1 is greater than t – by
looking at a number line that starts at 0. No events occur between 0 and the
point t, because the first interarrival time, T_1, which is also equal to the arrival time of the first
event, is greater than t. In other words, there are no events between 0 and t on the number line.
Therefore:
Pr(T_1 > t) = Pr(N_t − N_0 = 0)

Now, as we saw in the previous video, the distribution of the increment is Poisson, so N_t − N_0
has a Poisson distribution with parameter λt. Hence, this probability is equal to e^(−λt).
From there, we can calculate, for t > 0, the CDF of T_1, whereby we get F_(T_1)(t) = 1 − e^(−λt). And, of
course, it is 0 for t ≤ 0 because T_1 is a non-negative random variable. Hence, as we can see, T_1 has
an exponential distribution with parameter λ.
In fact, we can prove a much more general result that says that all of the interarrival times, {T_n}_(n=1)^∞,
are i.i.d. exponentials. In other words, they are independent and identically distributed, and all of them
have an exponential distribution with parameter λ > 0. Following on from that, because W_n is the
sum of the T_i's, W_n has a Gamma distribution – because it is the sum of n i.i.d. exponentials – with
parameters (α = n, λ), with λ being the same λ that we used in the exponential case.
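We can check the Gamma claim numerically – a sketch, assuming numpy, with the rate λ, the index n, and the number of simulations chosen arbitrarily: summing n i.i.d. Exponential(λ) interarrival times should give a W_n with mean n/λ and variance n/λ²:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, n, n_sims = 2.0, 5, 200_000

# W_n = T_1 + ... + T_n with T_i i.i.d. Exponential(lam), so W_n should be
# Gamma(alpha = n, rate = lam): mean n / lam and variance n / lam**2.
T = rng.exponential(1.0 / lam, size=(n_sims, n))
W_n = T.sum(axis=1)

mean_est, var_est = float(W_n.mean()), float(W_n.var())
mean_theory, var_theory = n / lam, n / lam**2
```

A histogram of `W_n` against the Gamma(5, 2) density makes the same point visually.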
Now that we have looked at arrival times and interarrival times, in the next video, we are going to
introduce Lévy processes.
A stochastic process X = {X_t : t ≥ 0} is called a Lévy process if it satisfies the following
conditions:
1 We will assume that X_0 = 0. It is not necessary to make this assumption but it will simplify
the calculations that we will be making later on.
2 X has stationary and independent increments, just like Brownian motion and the
Poisson process.
3 X is stochastically continuous: for every ε > 0 and every t, lim_(s→t) ℙ(|X_t − X_s| > ε) = 0.
So, those are the three properties that a stochastic process must satisfy in order for it to be a Lévy
process.
Now, I just want to mention the third property: this property also implies that X has a càdlàg
modification. This means that there exists a càdlàg stochastic process Y such that Y is a
modification of X. For this reason, some authors replace the third condition
that we listed with the condition that X itself is càdlàg. We will make that assumption here as well.
In other words, we are going to assume from the outset that whenever we are dealing with a Lévy
process, that Lévy process itself is càdlàg.
Let's look at some examples of Lévy processes:
1 The first one is a Brownian motion, W. A Brownian motion easily satisfies the three
properties: if we look at the first one, Brownian motion already starts at 0 and, as we have
mentioned, it has stationary and independent increments. The third property is
automatically satisfied because Brownian motion has continuous sample paths.
2 In the second example, we take N to be a Poisson process, which also satisfies the three
conditions. Let's see why: the first condition requires N_0 to equal 0, which is true, and
something that we have stated in previous videos. The second condition holds because the
homogeneous Poisson process has stationary and independent increments. That leaves
the third condition, which we need to prove.
To show that N is stochastically continuous, we have to show that this condition holds for
every t. Therefore, let ε be positive, and then we calculate the limit, as s tends to t, of the
probability that the absolute value of N_t − N_s is greater than ε. Written in full:

lim_(s→t) ℙ(|N_t − N_s| > ε)
Now, for this, we are going to use an inequality known as Markov's inequality, which we
studied in Probability Theory. This inequality says that if X is a non-negative random
variable, then, for any positive ε, the probability that X is greater than ε is less than or
equal to the expected value of X divided by ε. Written in full:
If X ≥ 0 then ℙ(X > ε) ≤ E(X)/ε
Using Markov's inequality, this probability – which is non-negative, because it is a
probability – is less than or equal to the limit, as s tends to t, of the expected value of the
absolute value |N_t − N_s|, divided by ε, because ε is positive. Written in full:
≤ lim_(s→t) E(|N_t − N_s|)/ε
Now, the expected value of this absolute value is equal to either E(N_t − N_s) (when s is less
than t) or E(N_s − N_t) (when s is greater than t). In either case, since the increment has a
Poisson distribution, this will just be equal to the limit, as s tends to t, of λ times the absolute
value of t − s (which sorts out the two cases), divided by ε. And this limit is equal to 0 as s
tends to t, because the top part goes to 0 and the bottom part is just a constant. Written in
full:
≤ lim_(s→t) E(|N_t − N_s|)/ε = lim_(s→t) λ|t − s|/ε = 0
Therefore, we have shown that 𝑁 is stochastically continuous and that implies, then, that
𝑁, the Poisson process, is a Lévy process.
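The same chain of inequalities can be checked by simulation – a sketch assuming numpy, with λ, ε, and the grid of values of s chosen arbitrarily. As s approaches t, the estimated probability stays below the Markov bound λ(t − s)/ε, and both go to 0:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, eps = 2.0, 1.0, 0.5
n_sims = 200_000

# For s < t, N_t - N_s ~ Poisson(lam * (t - s)); estimate P(|N_t - N_s| > eps)
# and compare it with the Markov bound lam * (t - s) / eps.
probs, bounds = [], []
for s in (0.9, 0.99, 0.999):
    incr = rng.poisson(lam * (t - s), size=n_sims)
    probs.append(float(np.mean(incr > eps)))
    bounds.append(lam * (t - s) / eps)
```

Both `probs` and `bounds` shrink toward 0 as s tends to t, which is exactly stochastic continuity.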
Now that we have introduced Lévy processes, in the next video we are going to look at more
properties of Lévy processes.
1 The first property concerns martingales. The Lévy process itself is not always a martingale;
however, if X_t is integrable, then the stochastic process M_t = X_t − E(X_t) is a martingale
with respect to the natural filtration of X. So, let's show that martingale property.
The first condition – that it must be adapted – is clear because we are using the natural
filtration of X, and M_t is a function of X_t. The second property – that it is integrable – has
already been stated when we said that X_t itself must be integrable. Therefore, looking at
the equation, X_t is integrable and E(X_t) is just a finite constant, which means that M_t will be
integrable as well.
All we need to show, then, is the third martingale property: for s < t, we have to calculate
the conditional expectation of M_t given F_s^X. When we calculate that, we get
E(X_t − E(X_t) | F_s^X); E(X_t) is a constant and can therefore be taken out, leaving us with
E(X_t | F_s^X) − E(X_t).
Next, E(X_t | F_s^X) is equal to E(X_s + (X_t − X_s) | F_s^X). Here, we are doing what is called
creating an increment. Looking at the first part, because X_s is F_s^X-measurable, it equals
X_s itself; and the second part equals the plain expected value E(X_t − X_s), because X_t − X_s
is independent of F_s^X – X is a Lévy process and therefore has independent increments.
We then subtract the expected value of X_t.
This is then equal to X_s + E(X_t − X_s) − E(X_t), and, by linearity, we can split the
expectation E(X_t − X_s) into the expected value of X_t minus the expected value of X_s.
We then cancel out E(X_t) in both places where it occurs, and we get X_s − E(X_s) = M_s,
which is the martingale property.
Written in full, for s < t:

E(M_t | F_s^X) = E(X_t | F_s^X) − E(X_t)
= E(X_s + (X_t − X_s) | F_s^X) − E(X_t)
= X_s + E(X_t − X_s) − E(X_t)
= X_s + E(X_t) − E(X_s) − E(X_t)
= X_s − E(X_s) = M_s
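As a concrete instance of this proposition – a numerical sketch, assuming numpy, with all parameters chosen arbitrarily – take X = N, a Poisson process, so that E(N_t) = λt and M_t = N_t − λt is the compensated Poisson process from earlier. Conditioning on the value of N_s, the simulated average of M_t should reproduce M_s:

```python
import numpy as np

rng = np.random.default_rng(4)
lam, s, t, n_sims = 3.0, 1.0, 2.0, 400_000

# Build N_t from N_s plus an independent Poisson(lam * (t - s)) increment.
N_s = rng.poisson(lam * s, size=n_sims)
N_t = N_s + rng.poisson(lam * (t - s), size=n_sims)
M_s = N_s - lam * s
M_t = N_t - lam * t

# Martingale property E(M_t | F_s) = M_s: conditioning on N_s = k,
# the average of M_t should reproduce M_s = k - lam * s.
k = 3
cond_mean = float(M_t[N_s == k].mean())
```

Here conditioning on the σ-algebra is approximated by conditioning on the value of N_s, which is all the relevant information for a Markov process like N.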
2 The second property that we will look at is a distributional property, for which we will need
to calculate the moment-generating function (MGF) of a Lévy process. We will write
M_t(α) = E(e^(αX_t)) for the MGF of X_t. Let's try to find the form of this MGF.
Consider M_(t+s)(α) = E(e^(αX_(t+s))). Writing X_(t+s) = X_t + (X_(t+s) − X_t) – again, we
have created an increment, because the X_t terms cancel out – we can write it out as
E(e^(αX_t) · e^(α(X_(t+s) − X_t))).
Now, we can use independence of increments to split this expectation into the product
E(e^(αX_t)) · E(e^(α(X_(t+s) − X_t))).
The first factor is M_t(α). For the second factor, because X has stationary increments,
X_(t+s) − X_t has the same distribution as X_s − X_0, which, since X_0 = 0, is the same
distribution as X_s. Hence, the second factor will be the same as the expected value of
e^(αX_s), which is M_s(α).
Written in full:

M_(t+s)(α) = E(e^(αX_(t+s)))
= E(e^(αX_t) · e^(α(X_(t+s) − X_t)))
= E(e^(αX_t)) · E(e^(α(X_(t+s) − X_t)))
= M_t(α) · M_s(α)
So, we have found that M_(t+s)(α) is equal to M_t(α) times M_s(α). Therefore, we can use this to
derive a differential equation satisfied by M_t(α), as follows:
If t and h are positive, and we think of h as being very small, we use this relationship to
get M_(t+h)(α) = M_t(α)M_h(α). If we subtract M_t(α) from both sides, we get
M_(t+h)(α) − M_t(α) = M_t(α)(M_h(α) − 1).
We can then divide both sides of the equation by h, and we take the limit as h tends to 0 –
assuming differentiability, of course. This gives us the derivative, d/dt M_t(α), which is
equal to M_t(α) times something that does not depend on t but does depend on α, which we will
call ψ(α) (some function ψ of α, namely ψ(α) = lim_(h→0) (M_h(α) − 1)/h).
We can then solve this differential equation – a separable differential equation, which
means we can divide by M_t(α) on both sides and integrate – and, of course, get the answer
M_t(α) = M_0(α)e^(ψ(α)t). M_0(α) is equal to 1, because the Lévy process starts at 0, which
means that the final part of the equation will be equal to e^(ψ(α)t). Written in full:
For t, h > 0: M_(t+h)(α) = M_t(α)M_h(α)

d/dt M_t(α) = M_t(α)ψ(α)

M_t(α) = M_0(α)e^(ψ(α)t) = e^(ψ(α)t)
That is therefore the form of the MGF of the Lévy process X: it is just e^(ψ(α)t), depending on
what t is.
This is a very useful property of Lévy processes, which is linked to something that we call
infinite divisibility of the Lévy process.
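For the Poisson process we can identify ψ explicitly: since N_t ~ Poisson(λt), E(e^(αN_t)) = e^(λt(e^α − 1)), so ψ(α) = λ(e^α − 1). Here is a quick Monte Carlo sketch of that, assuming numpy, with λ, α, t, and the sample size chosen arbitrarily:

```python
import math

import numpy as np

rng = np.random.default_rng(5)
lam, alpha, t = 1.0, 0.3, 2.0
n_sims = 400_000

# Levy exponent of the Poisson process: psi(alpha) = lam * (e^alpha - 1),
# so the MGF should be M_t(alpha) = exp(psi(alpha) * t).
psi = lam * (math.exp(alpha) - 1.0)
mgf_theory = math.exp(psi * t)

# Monte Carlo estimate of E(exp(alpha * N_t)).
N_t = rng.poisson(lam * t, size=n_sims)
mgf_mc = float(np.mean(np.exp(alpha * N_t)))
```

The closed form also makes the semigroup relation obvious: e^(ψ(α)(t+s)) = e^(ψ(α)t) · e^(ψ(α)s).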
Now that we have looked at a few properties of Lévy processes, in the next video, we are going to
look at the compound Poisson process and exponential Lévy models.
Let N be a homogeneous Poisson process, where N = {N_t : t ≥ 0}, with constant rate λ > 0. In
addition, let {Y_i}_(i=1)^∞ be a sequence of i.i.d. random variables – so, they are independent and have
the same distribution.
A continuous-time process X, where X = {X_t : t ≥ 0}, is called a compound Poisson process if:

X_t = 0 if N_t = 0, and X_t = Σ_(i=1)^(N_t) Y_i if N_t ≥ 1
This is what a compound Poisson process is. So, we are just summing a random number of random
variables up to 𝑁! , depending on what 𝑁! is.
Now, it turns out that a compound Poisson process is also a Lévy process. You can check the other
conditions as an exercise. What we will do now, however, is check the stationarity of increments.
Let's consider the increment X_t − X_s, where s is less than t, and we are going to calculate its MGF.
We are going to calculate E(e^(α(X_t − X_s))), which is equal to E(e^(α Σ_(i=N_s+1)^(N_t) Y_i)). This is what the
increment is, if we look at the definition above (assuming, of course, that N_t is greater than or
equal to 1). By the Law of Total Expectation, this is equal to E(E(e^(α Σ_(i=N_s+1)^(N_t) Y_i) | σ(N_t, N_s))),
conditioning on the σ-algebra generated by N_t and N_s. If we calculate this conditional expectation, since the Y_i's
are independent – and we will always assume that the Y_i's are also independent of the Poisson process
itself – it is equal to the MGF of Y (each one of the Y_i's has the same
distribution), all to the power N_t − N_s, because this is the sum of N_t − N_s of the variables. N_t −
N_s has a Poisson distribution with parameter λ(t − s), so the outer expectation is equal to e^(λ(t−s)(M_Y(α)−1)). As
we can see, this only depends on t − s, the length of the increment. It doesn't depend on the
starting point explicitly.
Written in full:

E(e^(α(X_t − X_s))) = E(E(e^(α Σ_(i=N_s+1)^(N_t) Y_i) | σ(N_t, N_s)))
= E((M_Y(α))^(N_t − N_s))
= e^(λ(t−s)(M_Y(α)−1))
This shows, therefore, that a compound Poisson process has stationary increments. In addition,
you can show that the other properties of Lévy processes are satisfied for 𝑋 as well.
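The stationarity computation above can be sketched numerically, assuming numpy; taking the jumps Y_i ~ N(0, σ²) is an illustrative choice (not part of the lecture), for which M_Y(α) = e^(α²σ²/2). The Monte Carlo MGF of the increment should then match e^(λ(t−s)(M_Y(α) − 1)):

```python
import math

import numpy as np

rng = np.random.default_rng(6)
lam, alpha, sigma = 1.5, 0.4, 0.5
t_minus_s = 1.0
n_sims = 100_000

# Increment of a compound Poisson process over an interval of length t - s:
# a Poisson(lam * (t - s)) number of i.i.d. jumps (here N(0, sigma^2)).
counts = rng.poisson(lam * t_minus_s, size=n_sims)
increments = np.array([rng.normal(0.0, sigma, k).sum() for k in counts])

mgf_mc = float(np.mean(np.exp(alpha * increments)))
M_Y = math.exp(alpha**2 * sigma**2 / 2)       # MGF of a N(0, sigma^2) jump
mgf_theory = math.exp(lam * t_minus_s * (M_Y - 1))
```

Only the length t − s enters the formula, which is exactly the stationarity of increments just proved.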
In finance, we are going to be dealing a lot with what are known as exponential Lévy models.
When looking at stocks, we say that a stock price, S_t, follows an exponential Lévy model if S can be written
as S_t = S_0 e^(X_t), where the stochastic process X is a Lévy process.
An example of this that we have dealt with is the case when X_t = (μ − σ²/2)t + σW_t, where W_t is a
Brownian motion. As a reminder, we saw this when we looked at the Black-Scholes model, because
the solution of that SDE, the geometric Brownian motion, was S_t = S_0 e^((μ − σ²/2)t + σW_t). This X is a Lévy
process: Brownian motion is a Lévy process, the deterministic drift term is a Lévy process, and the
sum of the two is also a Lévy process.
We are going to see other examples of exponential Lévy models. For instance, we can take X_t to be
equal to bt + σW_t + Σ_(i=1)^(N_t) Y_i – a constant drift, plus a Brownian term, plus a compound
Poisson process. This is an example of a Lévy process that we can use to construct an exponential
Lévy model. The advantage of this is that it not only has a continuous diffusion component, like
geometric Brownian motion, but it also accounts for jumps. In other words, this is a combination
of a diffusion, which starts at 0, and then, as soon as an event occurs, at a Poisson rate of λ, the
process jumps by Y_i. It can jump up or down depending on the sign of Y_i; Y_i can be positive or
negative. It then continues as a Brownian diffusion and then jumps again by the next Y_i, and so
on. So, this is an example of a model that has both jump and diffusion components, and it is very
useful in modeling as well.
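Here is a minimal simulation sketch of this jump-diffusion exponential Lévy model, assuming numpy; the drift b, the volatilities, the jump rate, and the normal jump sizes are illustrative choices, not part of the lecture:

```python
import numpy as np

rng = np.random.default_rng(7)
S0, b, sigma, lam, jump_sigma = 100.0, 0.02, 0.2, 0.8, 0.1
T, n_steps = 1.0, 252
dt = T / n_steps

# X_t = b*t + sigma*W_t + sum_{i <= N_t} Y_i, simulated on a grid:
# Brownian increments, a Poisson number of jumps per step, and
# illustrative N(0, jump_sigma^2) jump sizes Y_i.
dW = rng.normal(0.0, np.sqrt(dt), n_steps)
n_jumps = rng.poisson(lam * dt, n_steps)
jump_part = np.array([rng.normal(0.0, jump_sigma, k).sum() for k in n_jumps])
dX = b * dt + sigma * dW + jump_part
X = np.concatenate([[0.0], np.cumsum(dX)])

S = S0 * np.exp(X)   # the exponential Levy model S_t = S_0 * exp(X_t)
```

A step-plus-line plot of `S` shows the Brownian wiggle between arrivals and the occasional discrete jump, which is exactly the picture described above.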
Now that we have covered the compound Poisson process, we have reached the end of the
module.