Edited by
Carsten Carstensen, Berlin, Germany
Nicola Fusco, Napoli, Italy
Fritz Gesztesy, Columbia, Missouri, USA
Niels Jacob, Swansea, United Kingdom
Karl-Hermann Neeb, Erlangen, Germany
Volume 54
Yasushi Ishikawa
Stochastic Calculus of Variations for Jump Processes
Mathematics Subject Classification 2010
60J25, 60J35, 60G51, 60H07
Author
Prof. Dr. Yasushi Ishikawa
Ehime University
Graduate School of Science and Engineering
Mathematics, Physics, and Earth Sciences
Bunkyo-cho 2-chome
790-8577 Matsuyama
Japan
[email protected]
ISBN 978-3-11-028180-4
e-ISBN 978-3-11-028200-9
Set-ISBN 978-3-11-028201-6
www.degruyter.com
Preface
This book is a concise introduction to the stochastic calculus of variations (also
known as Malliavin calculus) for processes with jumps. It is written for researchers
and graduate students who are interested in Malliavin calculus for jump processes. In this book, “processes with jumps” include both pure jump processes and
jump-diffusions. The author has tried to provide many results on this topic in a self-
contained way; this also applies to stochastic differential equations (SDEs) “with
jumps”. This book also contains some applications of the stochastic calculus for
processes with jumps to control theory and mathematical finance.
The field of jump processes is quite wide-ranging nowadays, from the Lévy measure (jump measure) to SDEs with jumps. Recent developments in stochastic analysis, especially Malliavin calculus with jumps in the 1990s and 2000s, have enabled
us to express various results in a compact form. Until now, these topics have been
rarely discussed in a monograph. Among the few books on this topic, we would like
to mention Bichteler–Gravereaux–Jacod (1987) and Bichteler (2002).
One objective of Malliavin calculus (of jump type) is to prove probabilistically the existence of the density function pt(x, y) of the transition probability of a jump Markov process Xt, especially in the very important case where Xt is given by an (Itô, Marcus, Stratonovich, . . . ) SDE, cf. Léandre (1988). Furthermore, granting the existence of the density, one may apply various methods to obtain the asymptotic behaviour of pt(x, y) as t → 0, where x and y are fixed. The results are known to differ according to whether x ≠ y or x = y. We also describe this topic.
The starting point for this book was July 2009, when Prof. R. Schilling invited me to the Technische Universität Dresden, Germany, to teach a short course on Malliavin's calculus for jump processes. He suggested that I expand the manuscript, thus creating a book. Prof. H. Kunita kindly read and commented on earlier drafts of the manuscript. The author is deeply indebted to Professors R. Schilling, M. Kanda, H. Kunita, J. Picard, R. Léandre, C. Geiss, F. Baumgartner, N. Privault and K. Taira.
This book is dedicated to the memory of the late Professor Paul Malliavin.
0 Introduction 1
4 Applications 181
4.1 Asymptotic expansion of the SDE 182
4.1.1 Analysis on the stochastic model 184
4.1.2 Asymptotic expansion of the density 205
4.1.3 Examples of asymptotic expansions 210
4.2 Optimal consumption problem 216
4.2.1 Setting of the optimal consumption 216
4.2.2 Viscosity solutions 220
4.2.3 Regularity of solutions 239
4.2.4 Optimal consumption 243
4.2.5 Historical sketch 246
Appendix 249
Bibliography 253
Index 264
0 Introduction
A theme of this book is to describe a close interplay between analysis and probability
via Malliavin calculus. Compared to other books on this subject, our focus is mainly
on jump processes, especially Lévy processes.
The concept of (abstract) Wiener space has been well known since the 1970s. Since then, despite a genuine difficulty with the definition of the (abstract) Wiener space, many textbooks have been published on stochastic calculus on the Wiener space. It should be noted that it is directly connected to Itô's theory of stochastic differential equations, which Itô invented inspired by the work of A. N. Kolmogorov. Already at this stage, a close relation between stochastic calculus and PDE theory was recognised through the transition probability pt(x, dy), whose density pt(x, y) is the fundamental solution to Kolmogorov's backward equation.
Malliavin calculus started with the paper [150] by P. Malliavin (cf. [151]). One of
the motivations of his paper is the problem of hypoellipticity for operators associated
with stochastic differential equations of diffusion type. At the beginning, Malliavin’s
calculus was not very popular (except for his students, and a few researchers such as
Bismut, Jacod, Shigekawa, Watanabe and Stroock) due to its technical difficulties.
Malliavin's paper was presented at the international symposium in Kyoto organised by Prof. K. Itô. At that time, a close relation began between P. Malliavin and the Kyoto school of probability in Japan. The outcome was a series of works by Watanabe [204], Ikeda–Watanabe [79], Shigekawa [187], and others.
The relation between Malliavin calculus for diffusion processes and PDEs has
been deeply developed by Kusuoka and Stroock [128–130] and others.
On the other hand, Paul Lévy began his study on additive stochastic processes
(cf. [143]). The trajectories of his processes are continuous or discontinuous. The
additive processes he studied are now called Lévy processes. The discontinuous
Lévy processes have an infinitesimal generator of integro-differential type in the
semigroup theory in the sense of Hille–Yosida. Such integro-differential operators
have been studied in potential theory, by e.g. Ph. Courrège [42] and Bony–Courrège–
Priouret [32]. The latter paper is related to the boundary value problem associated
with integro-differential operators.
The theory developed following that of Fourier integral operators and pseudodifferential operators (cf., e.g. [33]).
My first encounter with Malliavin calculus for jump processes was the paper by Léandre [131], where he proves a short-time asymptotic formula for the transition density pt(x, y), valid if the jump process Xt can reach y by one single jump (y ≠ x). Here, n(x, dy) denotes the Lévy kernel. This result has been generalised to the case of n jumps, n = 1, 2, . . . , in [82], independently of the work by Picard.
When I started my research in this field, I was inspired by the close relation between the theories of integro-differential operators and jump type Markov processes. Consequently, my own research on the short time asymptotics of the transition density plays an important role in this monograph.
Later, Malliavin calculus found new applications in the theory of finance. The presentation of the contents follows the historical development of the theory. The technical requirements of this book are the usual undergraduate calculus, probability, and abstract measure theory.
Historically, the theory was started by Bismut. The approach by Bismut is based on the Girsanov transform of the underlying probability measure. From an analytic point of view, the main idea is to replace the Radon–Nikodým density function in the Girsanov transform of measures induced by the perturbation of the continuous trajectories by that induced by the perturbation of the discontinuous trajectories. Subsequently, the theory was extended to cover singular Lévy measures using perturbation methods (Picard).
Most textbooks on Malliavin calculus on the Wiener space (e.g. [151, 187]) adopt a functional analytic approach, in which the abstract Wiener space and the Malliavin operator appear. I do not use such a setting in this book. This is partly because such a setting is not very intuitive, and partly because it cannot be transferred directly from the Wiener space to the Poisson space. This is also discussed in Section 3.4.
In the spirit of potential theory and (nonlocal) integro-differential operators,
I have adopted the method of perturbations of trajectories on the Wiener–Poisson
space. This perspective fits well with the Markov chain approximation method used in Sections 2.2 and 2.3, and with the technique of path decomposition used in Section 3.6.
Hence, it constitutes one of the main themes of this book.
In our approach, both in the Wiener space and in the Poisson space, the main
characters are the derivative operator Dt or the finite difference operator D̃u , and
their adjoints δ or δ̃. The derivative operator is defined to act on the random variable
F (ω) defined on a given probability space (Ω, F , P ).
In the Wiener space, Ω = C0 (T) is the space of continuous functions defined
on the interval T equipped with the topology given by the sup-norm. The Malliavin
derivative Dt F (ω) of F (ω) is then given in two ways, either as a functional derivative
or in terms of a chaos expansion, see Section 3.1.1 for details. The definition via chaos
expansion is quite appealing since it gives an elementary proof of the Clark–Ocone
formula, and since the definition can be carried over to the Poisson space in a natural
way; details are stated in Section 3.2.
Here is a short outline of all chapters.
Chapter 1. In Chapter 1, I briefly prepare basic materials which are needed for the theory. Namely, I introduce Lévy processes, Poisson random measures, stochastic integrals, stochastic differential equations (SDEs) with jumps driven by Lévy processes, Itô processes, canonical processes, and so on. Some technical issues in stochastic analysis, such as Girsanov transforms of measures, quadratic variation, and the Doléans stochastic exponential, are also discussed. The SDEs introduced in Section 1.3 are time independent, i.e. of autonomous (or “Markovian”) type.
In this chapter, technical details concerning SDEs are often deferred to the references, as our focus is to present the basic elements of stochastic analysis briefly. In particular, for material on diffusion processes, Wiener processes, and stochastic integrals with respect to the Wiener process, readers can refer to [110].
Chapter 3 develops the calculus on the Wiener–Poisson space in detail. This chapter is the main part of the theoretical aspect of stochastic analysis for processes with jumps.
In Sections 3.5 (General theory) and 3.7 (Itô processes), I define the composition Φ ◦ F of a random variable F on the Wiener–Poisson space with a generalised function Φ in the space S of tempered distributions, such as Φ(x) = (x − K)+ or Φ(x) = δ(x). These results are mostly new. In Section 3.6, I investigate the smoothness of the density of processes defined on the Wiener–Poisson space as functionals of Itô processes.
I hope this book will be useful as a textbook and as a resource for researchers in
probability and analysis.
1 Lévy processes and Itô calculus
Happy families are all alike; every unhappy family is unhappy in its own way.
Lev Tolstoy, Anna Karenina
In this chapter, we briefly prepare the basic concepts and mathematical tools which are necessary for stochastic calculus with jumps throughout this book. We consider Poisson processes, Lévy processes, and the Itô calculus associated with these processes. In particular, we consider SDEs of Itô and canonical type.
We first introduce Lévy processes in Section 1.1. We provide basic materials on SDEs with jumps in Section 1.2. Then, we introduce SDEs for Itô processes in Section 1.3. Since the main objective of this book is to inquire into analytic properties of functionals on the Wiener–Poisson space, not all of the basic results stated in this chapter are provided with full proofs.
Throughout this book, we shall denote the Itô process on the Poisson space by
xt or xt (x), and the canonical process by Yt . The expressions Xt , X(t) are used for
both cases of the above, or just in the sense of a general Itô process on the Wiener–
Poisson space. In the text, if we cite formula (l, m, n), we mean the formula (m, n)
in Chapter l.
1 Here and in what follows, a.s. is the abbreviation for ‘almost surely’. Similarly, a.e. stands for ‘almost every’ or ‘almost everywhere’.
We denote
Δz(t) = z(t) − z(t−) .
The same notation for Δ will be applied for processes Xt , Mt , xt , . . . which will appear
later.
We can associate the counting measure N to z(t) in the following way: for A ∈ B(R^m \ {0}), we put
    N(t, A) = Σ_{0≤s≤t} 1_A(Δz(s)),   t > 0.
Note that this is a counting measure of jumps of z in A up to the time t . As the path
is càdlàg, for A ∈ B(Rm \ {0}) such that Ā ⊂ Rm \ {0}, we have N(t, A) < +∞ a.s.
A random measure on T × (R^m \ {0}) defined by
    N((a, b] × A) = N(b, A) − N(a, A),
where a ≤ b and T = [0, T], is called a Poisson random measure if it follows the Poisson distribution with mean measure E[N((a, b] × A)], and if for disjoint (a_1, b_1] × A_1, . . . , (a_r, b_r] × A_r ∈ B(T × (R^m \ {0})), the random variables N((a_1, b_1] × A_1), . . . , N((a_r, b_r] × A_r) are independent.
Proposition 1.1 (Lévy–Itô decomposition theorem, [180] Theorem I.42). Let z(t) be
a Lévy process. Then, z(t) admits the following representation:
    z(t) = tc + σW(t) + ∫_0^t ∫_{|z|<1} z Ñ(ds dz) + ∫_0^t ∫_{|z|≥1} z N(ds dz),
where c ∈ R^m, σ is a constant matrix, W(t) is a Wiener process, N is the Poisson random measure of the jumps of z, and Ñ = N − N̂ denotes its compensated measure.
By this proposition, N(·, ·) derived from z(t) defines a Poisson random measure on T × (R^m \ {0}). Here, we use the notation of stochastic integrals ∫_0^t ∫ z N(ds dz) and ∫_0^t ∫ z Ñ(ds dz). The precise meaning of these integrals is postponed to Section 1.2. However, it should be noted here that the Wiener process W(t) and the Poisson random measure N(dt dz) are adapted to the original filtration (F_t) generated by the Lévy process z(t).
We take the mean measure N̂(ds dz) = E[N(ds dz)] = ds μ(dz), where μ(dz) is the Lévy measure of z.
Proposition 1.2.
(1) Let z be a Lévy process on R^m. Then,
    E[e^{i(ξ, z(t))}] = e^{tΨ(ξ)},
where
    Ψ(ξ) = i(c, ξ) − (1/2)(ξ, σσ^T ξ) + ∫ (e^{i(ξ,z)} − 1 − i(ξ, z) 1_{{|z|<1}}) μ(dz).   (1.4)
and
    E[ e^{i(ξ, ∫_0^t ∫_{|z|≥1} z N(ds dz))} ] = exp( t ∫_{|z|≥1} (e^{i(ξ,z)} − 1) μ(dz) ).
Please refer to Theorem 8.1 in [184], and Section 0 in [100]. In the above statement,
(a, b) denotes the inner product of a and b.
Let D_p = {t ∈ T; Δz(t) ≠ 0}. Then, it is a countable subset of T a.s. Let A ⊂ R^m \ {0}. In case μ(A) < +∞, the process D_p ∋ t ↦ Σ_{s≤t, Δz(s)∈A} δ_{(s, Δz(s))} is called a Poisson counting measure associated to the Lévy process z(t) (or to the Lévy measure μ(dz)), taking values in A. The function D_p ∋ t ↦ p(t) = Δz(t) is called a Poisson point process.
1. Poisson process
A Poisson process N_t with intensity λ > 0 is a nonnegative integer-valued process defined on [0, +∞) which satisfies the following conditions:
(i) N_0 = 0, and ΔN_t = N_t − N_{t−} is 0 or 1;
(ii) for s < t, N_t − N_s is independent of F_s;
(iii) for all t_1, t_2 and all s > 0, N_{t_1+s} − N_{t_1} has the same distribution as N_{t_2+s} − N_{t_2};
(iv) P(N_t = k) = (1/k!)(λt)^k e^{−λt},   k = 0, 1, 2, . . . .
In fact, the property (iv) follows from (i) to (iii), cf. [180] Theorem I.23. We put (iv) for simplicity. The Lévy measure μ of the Poisson process is the point mass λδ_{1}, and b = 0, σ = 0.
2. Compound Poisson process
A compound Poisson process is given by
    Y_t = ∫_0^t ∫_{R^m \ {0}} z N(ds dz),
where N(ds dz) denotes a Poisson random measure on T × (R^m \ {0}) with the mean measure λ ds μ(dz).
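To make this concrete, here is a minimal simulation sketch (not part of the original text; the jump-size distribution and the parameters are arbitrary illustrative choices): conditionally on the number of jumps up to T, the jump times of a Poisson random measure with mean measure λ ds μ(dz) are uniformly distributed on [0, T], and Y is the running sum of the jump sizes.

import numpy as np

def compound_poisson_path(T=1.0, lam=5.0, seed=None):
    # One path of Y_t = int_0^t int z N(ds dz): jump times from a Poisson
    # process of intensity lam, jump sizes drawn from mu (here N(0,1),
    # an illustrative choice only).
    rng = np.random.default_rng(seed)
    n_jumps = rng.poisson(lam * T)                 # N(T): number of jumps on [0, T]
    jump_times = np.sort(rng.uniform(0.0, T, n_jumps))
    jump_sizes = rng.standard_normal(n_jumps)      # samples from mu
    return jump_times, np.cumsum(jump_sizes)       # Y evaluated at the jump times

times, Y = compound_poisson_path(seed=0)
print(times[:3], Y[:3])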
3. Stable process
A Lévy process whose Lévy measure μ is given by
    μ(dz) = c_α dz/|z|^{m+α},
with 0 < α < 2, is called a (symmetric) stable process of index α. If μ takes the form
    μ(dz) = c_α a(z/|z|) dz/|z|^{m+α},
where a(·) is defined on S^{m−1} and a(·) ≥ 0, the process is called an asymmetric stable process. In case m = 1, μ takes the form
    μ(dz) = (c_− 1_{{z<0}} + c_+ 1_{{z>0}}) dz/|z|^{1+α},
where c_− ≥ 0, c_+ ≥ 0.
4. Wiener process
A Wiener process (or Brownian motion) W(t) (on another probability space) such that W(0) = 0 satisfies the conditions (1–5) of Definition 1.1. Hence, it is a (continuous) Lévy process.
A Wiener process has the scaling property that, for c > 0, the process c^{−1/2} W(ct) has the same law as W(t).
Let X(t) be a one-dimensional process given by
    X(t) = x + tc + σW(t) + ∫_0^t ∫_{R\{0}} γ(z) Ñ(ds dz),   t ≥ 0,
where γ(z) is such that ∫_{R\{0}} γ(z)^2 μ(dz) < ∞. Let f : R → R be a function in C^2(R), and let
    Y(t) = f(X(t)).
Then, Itô's formula (Proposition 1.3) states that
    dY(t) = (df/dx)(X(t)) c dt + (df/dx)(X(t)) σ dW(t) + (1/2)(d^2 f/dx^2)(X(t)) σ^2 dt
            + ∫_{R\{0}} [ f(X(t) + γ(z)) − f(X(t)) − (df/dx)(X(t)) γ(z) ] μ(dz) dt
            + ∫_{R\{0}} [ f(X(t) + γ(z)) − f(X(t)) ] Ñ(dt dz).
In the multidimensional case, let X(t) = (X^1(t), . . . , X^d(t)) be given by
    X^i(t) = x_i + t c_i + Σ_{j=1}^m σ_{ij} W_j(t) + Σ_{j=1}^m ∫_0^t ∫_{R\{0}} γ_{ij}(z_j) Ñ_j(ds dz_j),   i = 1, . . . , d,
and let Y(t) = f(X(t)) for f ∈ C^2(R^d). Then,
    dY(t) = Σ_{i=1}^d (∂f/∂x_i)(X(t)) c_i dt + Σ_{i=1}^d Σ_{j=1}^m (∂f/∂x_i)(X(t)) σ_{ij} dW_j(t)
            + (1/2) Σ_{i,j=1}^d (∂^2 f/∂x_i ∂x_j)(X(t)) (σσ^T)_{ij} dt
            + Σ_{j=1}^m ∫_{R\{0}} [ f(X(t) + γ^j(z_j)) − f(X(t)) − Σ_{i=1}^d (∂f/∂x_i)(X(t)) γ_{ij}(z_j) ] μ_j(dz_j) dt
            + Σ_{j=1}^m ∫_{R\{0}} [ f(X(t−) + γ^j(z_j)) − f(X(t−)) ] Ñ_j(dt dz_j).
For the precise meaning of the stochastic integrals with respect to dW (t) and
Ñ(dtdz), see Section 1.2.
Example 1.1. Let b = 0, γ(z) = 0, σ = 1 and f(x) = x^2. Then, Itô's formula leads to
    ∫_0^T W(t) dW_t = (1/2)(W(T)^2 − T).
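The identity in Example 1.1 is easy to check numerically. The following sketch (an illustration added here, not part of the original text) simulates a Brownian path on a fine grid, forms the left-point Riemann sum approximating the Itô integral, and compares it with (1/2)(W(T)^2 − T).

import numpy as np

rng = np.random.default_rng(0)
T, n = 1.0, 200_000
dt = T / n
dW = rng.standard_normal(n) * np.sqrt(dt)     # Brownian increments
W = np.concatenate(([0.0], np.cumsum(dW)))    # W on the grid, W[0] = 0

ito_sum = np.sum(W[:-1] * dW)                 # left-point (Ito) Riemann sum
closed_form = 0.5 * (W[-1] ** 2 - T)          # (1/2)(W(T)^2 - T)
print(ito_sum, closed_form)                   # the two values nearly agree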
A Lévy process z(t) is said to have finite variation if the total variation
    |z|_t = sup_{n≥1} Σ_{k=1}^{2^n} | z(tk/2^n) − z(t(k−1)/2^n) |   (1.5)
is finite a.s. on every compact interval of [0, +∞). If it is not so, the process is said to have infinite variation.
We introduce the Blumenthal–Getoor index of the Lévy process z(t) by
    β = inf{ δ ≥ 0; ∫_{|z|≤1} |z|^δ μ(dz) < +∞ }.
The index takes values in [0, 2]. It is known (cf. [37, 78]) that if β < 1, then z has a finite variation path a.s., and if β > 1, then z has an infinite variation path a.s.
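For example, for the one-dimensional symmetric α-stable Lévy measure μ(dz) = c_α dz/|z|^{1+α} with 0 < α < 2, we have
    ∫_{|z|≤1} |z|^δ μ(dz) = 2 c_α ∫_0^1 r^{δ−α−1} dr,
which is finite precisely when δ > α; hence β = α in this case.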
We shall define the stochastic integral ∫_s^t f(u, ω) dz(u, ω) first for the finite variation process and then for the infinite variation process, where f is a bounded, jointly measurable function.
For the integral using the infinite variation process, we need the predictability of u ↦ f(u, ·) and the semimartingale property of the integrator. For these, see Section 1.2.2.
In the following sections of this chapter, we use the notion of a stochastic differential equation (SDE) with respect to the Lévy process z(t). The precise definition and the properties of the solution are postponed to the next section.
Due to a recent development by T. Lyons [147, 148], it is possible to define “stochastic integrals” ω-wise by using the (iterated) Young integrals of processes of finite or infinite variation, without using integration with respect to semimartingales. The theory is called rough path theory, and it uses the notion of p-variation norms and spaces. See also [50].
Indeed, similarly to (1.5), we can define the p-variation
    |z|_t^{(p)} = sup_{n≥1} Σ_{k=1}^{2^n} | z(tk/2^n) − z(t(k−1)/2^n) |^p,   (1.6)
and integration with respect to an integrator dz_s of finite p-variation can be embedded into the theory of integration using rough paths.
The space D
Let T = [0, T ], T < +∞. D = D(T) denotes the space of all functions defined on T
with values in Rm or Rd that are right continuous on [0, T ) and have left limits on
(0, T] (càdlàg paths). We introduce a topology on D(T) by means of the Skorohod metric d_T defined by
    d_T(f, g) = inf_τ sup_{t∈T} { |f(t) − g(τ(t))| + |τ(t) − t| },
where τ ranges over all strictly increasing, continuous mappings of T onto T such that τ(0) = 0, τ(T) = T. The topological space (D(T), d_T) is called a Skorohod space. The space (D(T), d_T) is separable, and by choosing an equivalent metric d°_T it becomes complete ([26] Section 12).
D([0, +∞)) denotes the space of all càdlàg paths on [0, +∞). It is a Fréchet space, metrisable with the metric
    d_∞(f, g) = Σ_{n=1}^∞ (1/2^n) ( 1 ∧ d°_{[0,n]}(f, g) );
the space (D([0, +∞)), d_∞) is complete, but it is not separable. We will encounter this space again in the next section and in Section 2.5.3.
Let (Ω, F , P ) be a probability space. A family (Ft )t∈T of sub σ -fields of F is called
a filtration if Fs ⊂ Ft for all s < t . A filtration (Ft )t∈T is said to satisfy the usual
conditions if F0 contains all null sets of F and if it is right continuous. Below, we
consider probability spaces equipped with filtrations which satisfy the usual conditions.
An adapted process M_t is called a martingale if
(1) M_t ∈ L^1(P), t ∈ T,   (2.1)
(2) if s ≤ t, then E[M_t | F_s] = M_s a.s., s, t ∈ T.   (2.2)
In case that E[X_t | F_s] ≤ X_s a.s. for s ≤ t, the adapted process X_t is called a supermartingale.
A random variable T : Ω → [0, +∞] is said to be a stopping time if the event
{T ≤ t} ∈ Ft for every t ∈ T. The set of all stopping times is denoted by T . Let
0 = T_0 ≤ T_1 ≤ . . . ≤ T_n ≤ . . . be a sequence of stopping times such that T_n → +∞ a.s. An adapted process M_t such that, for some sequence of stopping times as above, M_{t∧T_n} is a martingale for every n is called a local martingale. A martingale is a local martingale.
A process X_t is called a semimartingale if it can be written as
    X_t = X_0 + M_t + A_t,
where M_t is a local martingale and A_t is an adapted process whose paths are of finite variation on compacts. For such a process, the integral of h with respect to X decomposes as
    ∫_s^t h(u) dX_u = ∫_s^t h(u) dM_u + ∫_s^t h(u) dA_u.
An elementary predictable process h is a process of the form
    h(t) = h_0 1_{{0}}(t) + Σ_{i=0}^{n−1} h_i 1_{(t_i, t_{i+1}]}(t),
where h_i is F_{t_i}-measurable and |h_i| < +∞ a.s. We denote by S the set of elementary processes, endowed with the topology given by the uniform convergence in (t, ω). We define the integral I(h) of an elementary process h ∈ S with respect to a martingale M having càdlàg paths by
    I(h) = h(0) M_0 + Σ_{i=0}^{n−1} h_i (M_{t_{i+1}} − M_{t_i}).
Then, the following properties hold:
(1) E[I(h)(t) | F_s] = I(h)(s), s ≤ t.
(2) I^2(h)(t) − ∫_0^t h^2(s) d[M]_s is a martingale.
(3) E[I^2(h)(t)] = E[∫_0^t h^2(s) d[M]_s].
Here, [M] denotes the quadratic variation of M (see just below for the definition). For the proof of (1), see [179] Proposition 2.5.7. It follows from (3) that h ↦ I(h) extends to an isometry on the space of elementary processes equipped with the norm ‖h‖^2 = E[∫_0^∞ h^2(s) d[M]_s].
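As a concrete illustration (a sketch added here, not part of the original text), the following evaluates I(h) for an elementary integrand against a simulated Brownian path M = W and checks the isometry (3) by Monte Carlo; the particular integrand h_i = sin(W_{t_i}) and the grid are arbitrary choices.

import numpy as np

rng = np.random.default_rng(1)
t = np.array([0.0, 0.3, 0.7, 1.0])          # 0 = t_0 < t_1 < t_2 < t_3, arbitrary grid
n_paths = 200_000

# Brownian values at the grid points, one row per path
dW = rng.standard_normal((n_paths, 3)) * np.sqrt(np.diff(t))
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)

# Elementary integrand: h_i is F_{t_i}-measurable (here a function of W_{t_i})
h = np.sin(W[:, :3])
I_h = np.sum(h * np.diff(W, axis=1), axis=1)    # I(h) = sum_i h_i (W_{t_{i+1}} - W_{t_i})

print(np.mean(I_h))                                   # ~ 0: I(h) is centred
print(np.mean(I_h ** 2))                              # E[I(h)^2]
print(np.mean(np.sum(h ** 2 * np.diff(t), axis=1)))   # E[ int h^2 d[W] ], nearly equal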
We denote by D the space of adapted processes with càdlàg paths, equipped with the Skorohod topology. It can be observed that the process I(h) for h ∈ S takes values in D.
(i) It is known (cf. [44]) that Λ contains all predictable processes h such that E[∫_0^∞ h^2(s) d[M]_s] < +∞. Hence, for h ∈ Λ, the previous properties (1–3) for I(h) hold true.
(ii) More precisely, we first extend I(·) : S → D to I(·) : L → D, where L denotes the space of adapted processes with càglàd paths (left continuous paths with right limits), endowed with the topology given by uniform convergence in probability on compact sets (ucp-topology, for short). Here, we say a sequence (h_n) converges to h in the ucp-topology if for each t > 0,
    sup_{0≤s≤t} |h_n(s) − h(s)| → 0
in probability.
For the proof of this extension, we use the fact that the elements of S are dense in L in the ucp-topology, that bounded elements in L are dense in L, and that the bounded elements in L can be approximated by bounded elements in S in the ucp-topology ([180] Theorems II.10, II.11).
We then extend I(.) : L → D to I(.) : Λ → D. The map I is well-defined for each
h ∈ Λ.
(iii) In case M_t = W(t) or M_t = Ñ_t, we extend I(·) thus obtained to I(·) : L^2(Ω × [0, +∞), P × d[M]_s) → D by using the L^2-isometry (3) above. Here, we use the fact that Λ is dense in L^2(Ω × [0, +∞), P × d[M]_s) and the bounded convergence theorem.
To prove this statement, we first approximate an element in L^2(Ω × [0, +∞), P × d[M]_s) by a sequence of bounded adapted processes in the L^2-norm, and then approximate each bounded adapted process by elements of Λ.
2 The two P's in the two L^2 spaces are distinct. Here, we use the same symbol, supposing that no confusion occurs.
The extension I : L^2(Ω × [0, +∞), P × d[M]_s) → D thus obtained is called the stochastic integral.
A similar decomposition
    X_t = M_t − A_t   (resp. X_t = M_t + A_t)
holds without the above-mentioned uniform integrability condition, but with M_t being a local martingale ([180] Theorem III.16).
Hence, Mt2 is a submartingale which satisfies the above property. By Theorem 1.1,
there exists a unique, increasing, predictable process A with A0 = 0 such that
Xt = Mt + At .
Proposition 1.4 (cf. [182] Theorem IV.26). Let M be a square integrable martingale
such that M0 = 0. Then, there exists a unique increasing process [M], [M]0 = 0,
such that
(1) M^2 − [M] is a uniformly integrable martingale,
(2) Δ[M] = (ΔM)2 .
The process [M] is called the quadratic variation process of M. An intuitive meaning of [M] is given by
    [M]_t = lim_{n→+∞} Σ_i ( M_{t_i^n} − M_{t_{i−1}^n} )^2,
where t_i^n = t ∧ (i/2^n).
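This dyadic approximation can be observed numerically. The sketch below (illustrative only, with arbitrary parameters) computes the dyadic sums for a simulated Brownian path; they stabilise near t, consistent with [W]_t = t.

import numpy as np

rng = np.random.default_rng(2)
t, n_fine = 1.0, 2 ** 16
dW = rng.standard_normal(n_fine) * np.sqrt(t / n_fine)
W = np.concatenate(([0.0], np.cumsum(dW)))          # W on a fine grid over [0, t]

for n in (4, 8, 12, 16):                            # dyadic partitions with 2^n intervals
    idx = np.linspace(0, n_fine, 2 ** n + 1, dtype=int)
    incr = np.diff(W[idx])
    print(n, np.sum(incr ** 2))                     # approaches [W]_t = t = 1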
By the two decompositions above, we see that A_t in Theorem 1.1(2) coincides with the compensator of [M]_t. Namely, the compensator is a predictable FV process, null at 0, such that [M]_t − A_t is a local martingale. We write it as ⟨M⟩_t, and call it the angle bracket of M_t.
The processes [M]_t and ⟨M⟩_t coincide if t ↦ M_t is continuous a.s.
We can decompose
    M = M^c + M^d,
where M^c is the continuous part and M^d is the purely discontinuous part. The quadratic variation process [M]_t can be decomposed into continuous and discontinuous parts by
    [M]_t = [M^c]_t + Σ_{0≤s≤t} (ΔM_s)^2.
Hence, we can decompose
    [M] = [M]^c + [M]^d.
Here, [M]^c = [M^c] and [M]^d = [M^d], where [M^d]_t = Σ_{0≤s≤t} (ΔM_s)^2.
The property (2) of I(h) above implies that [I(h)]_t = ∫_0^t h^2(s) d[M]_s.
For square integrable martingales M, N such that M_0 = 0 and N_0 = 0, the quadratic covariance process [M, N] is given by
    [M, N] = (1/4)([M + N] − [M − N]).   (2.3)
Using this notation, we have:
(2) For 0 ≤ t_1 ≤ t_2 ≤ t_3,
    E[I(h)(t)] = 0,   t > 0.
(6) [I(h)]_t = ∫_0^t |h(s)|^2 d[M]_s,   [I(h), I(g)]_t = ∫_0^t h(s) g(s) d[M]_s.
Below, up to the end of this subsection, (Ft ) denotes the filtration generated by the
Lévy process z(t) satisfying the usual conditions.
We can introduce the integral with respect to Ñ in terms of the z variable (Poisson random measure) by first defining it for elementary Poisson integrands. See [119] Section 2.1. Then, for
    I(ϕ) = ∫ ϕ(z) Ñ((s, t] × dz),   I(ψ) = ∫ ψ(z) Ñ((s, t] × dz),
we have
    ⟨I(ϕ), I(ψ)⟩_t = (t − s) ∫ ϕ(z) ψ(z) μ(dz).
    ∫_0^t ∫ h(s, z) Ñ(ds dz)
if ψ(z) is F_s-measurable.
For h(t, z) = Σ_i ψ_i(z) 1_{[t_i, t_{i+1})}(t), where the ψ_i are F_{t_i}-measurable, we write
    ∫_0^t ∫ h(s, z) Ñ(ds dz) = Σ_i ψ_i(z)(Ñ(t_{i+1} ∧ t, dz) − Ñ(t_i ∧ t, dz)).   (2.6)
Then, by (2.5),
    ⟨ ∫_0^· ∫ h(s, z) Ñ(ds dz) ⟩_t = ∫_0^t ∫ h^2(s, z) N̂(ds dz).
We denote by L2 (N̂) the set of all predictable functionals h(s, z) satisfying the
condition (2.4). The next assertion follows by the standard argument.
Proposition 1.5. Simple predictable processes h with the property (2.4) are dense in
L2 (N̂).
Theorem 1.2 (Kunita–Watanabe representation theorem, cf. [113, 180] Theorem IV.43). Let M_t be a locally square integrable martingale defined on (Ω, F, P). Then, there exist predictable, square integrable processes φ(t), ψ(t, z) such that
    M_t = M_0 + ∫_0^t φ(s) dW(s) + ∫_0^t ∫ ψ(s, z) Ñ(ds dz).
In the above assertion, we take F = (F_t)_{t∈T}, where F_t is the minimal sub σ-field with respect to which W(s) and Ñ((0, s] × E), s ≤ t, are measurable for each E ⊂ R^m \ {0}.
Itô's formula for the Lévy process (Proposition 1.3) can be extended to the following form.
Let X(t) be a real-valued semimartingale and let f ∈ C^2(R). Then,
    f(X(t)) = f(X(0)) + ∫_0^t f′(X(s−)) dX(s) + (1/2) ∫_0^t f″(X(s−)) d[X, X]^c_s
              + Σ_{0<s≤t} [ f(X(s)) − f(X(s−)) − f′(X(s−)) ΔX(s) ].
Similarly, for a d-dimensional semimartingale X(t) and f ∈ C^2(R^d), setting Y(t) = f(X(t)), we have
    Y(t) − Y(0) = Σ_{i=1}^d ∫_0^t (∂f/∂x_i)(X(s−)) dX^i(s)
                  + (1/2) Σ_{i,j=1}^d ∫_0^t (∂^2 f/∂x_i ∂x_j)(X(s−)) d[X^i, X^j]^c_s
                  + Σ_{0<s≤t} [ f(X(s)) − f(X(s−)) − Σ_{i=1}^d (∂f/∂x_i)(X(s−)) ΔX^i(s) ].
The above formula can be regarded as an equation with respect to the process X(t). Such an equation is called a stochastic differential equation (SDE).
A typical example is the Doléans–Dade (local martingale) exponential, that is, the solution to the Doléans equation
    X_t = 1 + ∫_0^t X_{s−} dM_s,
where M is a semimartingale; the solution is denoted by E(M)_t.
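For a compensated Poisson process M_t = N_t − λt, the Doléans equation can be solved explicitly along each path: between jumps dX = −λX dt, and at each jump X is multiplied by 1 + ΔM = 2, which gives E(M)_t = 2^{N_t} e^{−λt}. The sketch below (an illustration with arbitrary parameters, not taken from the text) integrates the equation pathwise and compares the result with this closed form.

import numpy as np

rng = np.random.default_rng(3)
T, lam = 1.0, 4.0

# One path of the driving Poisson process N on [0, T]
n_jumps = rng.poisson(lam * T)
jump_times = np.sort(rng.uniform(0.0, T, n_jumps))

# Solve X_t = 1 + int_0^t X_{s-} dM_s pathwise for M_t = N_t - lam*t:
# between jumps dX = -lam * X dt; at a jump, X -> X * (1 + Delta M) = 2 X.
X, last = 1.0, 0.0
for s in jump_times:
    X *= np.exp(-lam * (s - last))      # deterministic decay up to the jump time
    X *= 2.0                            # jump of M of size +1
    last = s
X *= np.exp(-lam * (T - last))

closed_form = 2.0 ** n_jumps * np.exp(-lam * T)   # E(M)_T = 2^{N_T} e^{-lam T}
print(X, closed_form)                             # agree up to rounding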
If M is a locally square integrable martingale such that ΔM_t > −1 and if it holds that
    E[ exp( (1/2)⟨M^c⟩_T + Σ_{t≤T} f(ΔM_t) ) ] < +∞,   (2.7)
then E(M)_t, t ∈ T, is a uniformly integrable martingale (the Lépingle–Mémin criterion).
Theorem 1.4.
(1) Let θ(t, z), t ∈ T, z ∈ R \ {0}, be a predictable process such that θ(t, z) < 1 and
    ∫_T ∫ ( |log(1 − θ(t, z))|^2 + θ(t, z)^2 ) dt μ(dz) < +∞,
and let u(t), t ∈ T, be a predictable process such that ∫_T u(t)^2 dt < +∞.
We put
    Z_t = exp( ∫_0^t u(s) dW(s) − (1/2) ∫_0^t u^2(s) ds
               + ∫_0^t ∫ log(1 − θ(s, z)) Ñ(ds dz)
               + ∫_0^t ∫ ( log(1 − θ(s, z)) + θ(s, z) ) ds μ(dz) ).
Then Z_t is a martingale and
    E[Z_T] = 1.
Hence Q(A) = E[1_A Z_T] defines a probability measure on (Ω, F) such that Q(A) = E[1_A Z_t] for A ∈ F_t. That is,
    (dQ/dP)|_{F_t} = Z_t,   t > 0.
(2) Let
    Ñ_1(dt dz) = θ(t, z) dt μ(dz) + Ñ(dt dz)
and
    dW_1(t) = u(t) dt + dW(t).
Remark 1.2.
(1) To see that Z_t is a martingale, we use the Lépingle–Mémin result above. The condition (2.7) is implied by our assumption (2.8). We put
    U(t) = ∫_0^t u(s) dW(s) − ∫_0^t ∫ θ(s, z) Ñ(ds dz).
Then, ΔU(t) = −θ(t, z) > −1. We can show by Itô's formula with f(x) = e^x that
    dZ_t = Z_{t−} dU(t)
with Z_0 = 1. Hence,
    Z_t = E(U)_t
by the uniqueness of the Doléans exponential.
If we assume a uniformity condition in t , that is,
(2) In Section 2.1.1 below, we introduce a perturbation method using the Girsanov transform of measures. Bismut [28] used the expression for Z_t in terms of the SDE above, whereas Bass et al. [13] used the expression E(M)_t.
(3) Under the assumption that (F_t) satisfies the usual conditions, Ñ_1 is not necessarily a compensated Poisson random measure with respect to Q, and W_1 is not necessarily a Brownian motion with respect to Q (cf. [24] Warning 3.9.20).
Let z(t) be a one-dimensional Lévy process with Lévy measure μ(dz), z(t) = ∫_0^t ∫ z Ñ(ds dz) being a martingale. Here, we assume that supp μ is compact. We put θ(t, z) = 1 − e^{α(t)·z}, where α(t) is fixed below. Then,
    log(1 − θ(t, z)) + θ(t, z) = α(t)·z + 1 − e^{α(t)·z} = −( e^{α(t)·z} − 1 − α(t)·z )
on F_t. We see that u(t) and θ(t, z) satisfy the above-mentioned uniformity condition for a bounded α(t). Hence, putting a new measure dQ by dQ = Z_t dP on F_t, the process z_1(t) = ∫_0^t ∫ z Ñ_1(ds dz) is a Lévy process which is a martingale under Q.
Here, we can choose α(t) to be some deterministic function. In particular,
    α(t) = (dL/dq)(ḣ(t)),
where L(q) denotes the Legendre transform of the Hamiltonian associated with the process z(t):
    H(p) = log E[e^{p·z(1)}],
and h(t) is an element of the Sobolev space W^{1,p}(T), p > 2. This setting is used in the Bismut perturbation in Section 2.1.1.
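To illustrate part (1) of Theorem 1.4 in the simplest pure-jump situation (a sketch added here, with arbitrary parameters and not taken from the text), take u ≡ 0, a standard Poisson process N with Lévy measure λδ_{1}, and a constant θ ∈ (0, 1). Then Z_T = (1 − θ)^{N_T} e^{θλT}, E[Z_T] = 1, and under Q the process N becomes a Poisson process of intensity (1 − θ)λ (the classical change of intensity). A Monte Carlo check:

import numpy as np

rng = np.random.default_rng(4)
T, lam, theta, n_paths = 1.0, 3.0, 0.4, 500_000

N_T = rng.poisson(lam * T, n_paths)                    # N_T sampled under P
Z_T = (1.0 - theta) ** N_T * np.exp(theta * lam * T)   # Girsanov density in this case

print(np.mean(Z_T))               # ~ 1, i.e. E[Z_T] = 1
print(np.mean(Z_T * N_T))         # ~ E_Q[N_T]
print((1 - theta) * lam * T)      # intensity of N under Q, times T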
From now until Section 2.5, we are mainly interested in the processes which are
obtained as a solution to the SDE driven by pure jump Lévy processes, and we shall
not consider those driven by Lévy processes having the diffusion part.
We may write
    z(t) = (z_1(t), . . . , z_m(t)) = ∫_0^t ∫_{R^m \ {0}} z ( N(ds dz) − 1_{{|z|≤1}} μ(dz) ds ),
where N(ds dz) is a Poisson random measure on T × (R^m \ {0}) with mean ds × μ(dz). We denote the index of the Lévy measure μ by β, that is,
    β = inf{ α > 0; ∫_{|z|≤1} |z|^α μ(dz) < +∞ }.
We assume
    ∫_{R^m \ {0}} |z|^2 μ(dz) < +∞.   (3.1)
Here, we assume
    |b(x)| ≤ K(1 + |x|),   |f(x)| ≤ K(1 + |x|),   |g(x, z)| ≤ K(z)(1 + |x|),
and
    |b(x) − b(y)| ≤ L|x − y|,   |f(x) − f(y)| ≤ L|x − y|,   |g(x, z) − g(y, z)| ≤ L(z)|x − y|.
Here, K, L are positive constants, and K(z), L(z) are positive functions satisfying
    ∫_{R^m \ {0}} { K^p(z) + L^p(z) } μ(dz) < +∞,
where p ≥ 2. In Section 4.1, we will use an SDE of this form as a financial model.
Theorem 1.5. Assume that x is p-th integrable. Under the above assumptions on b(x), f(x) and g(x, z), the SDE (∗) has a unique solution in L^p.
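The existence statement can be complemented by a simple numerical scheme. The sketch below (illustrative only; the coefficients b, f, g and the driving noise are hypothetical choices, not the model of Section 4.1) is an Euler-type discretisation of an SDE of the above form, driven by a Wiener process and a compound Poisson jump part.

import numpy as np

def euler_jump_sde(x0, b, f, g, T=1.0, n=1000, lam=2.0, seed=None):
    # Euler scheme for dX = b(X) dt + f(X) dW + jump g(X_-, z), where jumps
    # arrive with intensity lam and marks z ~ N(0,1) (illustrative choices).
    rng = np.random.default_rng(seed)
    dt = T / n
    x = x0
    for _ in range(n):
        dw = rng.standard_normal() * np.sqrt(dt)
        x = x + b(x) * dt + f(x) * dw
        if rng.random() < lam * dt:        # at most one jump per small time step
            z = rng.standard_normal()      # jump mark drawn from mu
            x = x + g(x, z)
    return x

# Coefficients chosen to satisfy the linear growth / Lipschitz bounds above
x_T = euler_jump_sde(x0=1.0,
                     b=lambda x: -0.5 * x,
                     f=lambda x: 0.2 * x,
                     g=lambda x, z: 0.1 * x * z,
                     seed=0)
print(x_T)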
If we can find a process X̃t which is Ft -measurable for some (Ft ) as above, such
that X̃0 and x have the same distribution, and that X̃t satisfies (∗) for some Lévy
process z̃(t), then it is called a weak solution. A solution Xt to (∗) is called a strong
solution if it is an adapted process (to (Ft )) and if it is represented as a functional of
the integrator z(t) and x : X. = F (x, z(.)).
A strong solution is a weak solution. Few results are known for the existence of
strong solutions in case that z(t) is a general semimartingale. For the uniqueness
of the solution (in the strong and weak senses), see [180] Theorems V.6, V.7 (see also [197]).
Definition 1.2. We say that pathwise uniqueness of the solution to (∗) holds if for any two solutions X^1, X^2 to (∗) defined on the same probability space and driven by (the same Brownian motion and) the same Poisson random measure N(dt dz), it holds that
    P( sup_{t∈T} |X^1_t − X^2_t| = 0 ) = 1.
Several conditions are known under which pathwise uniqueness holds for the solution to the SDE (∗); see [12, 77].
For the proof of this result, see [190] Theorem 137. The proof in [190] is rather
complex. In case that the SDE is driven only by the Wiener process, the result is called
“Barlow’s theorem”. See [180] Exercise IV.40 (p. 246 in Second edition).
In what follows, we consider, in particular, the following SDE of Itô type on the Poisson space, with values in R^d:
    x_t(x) = x + ∫_0^t b(x_s(x)) ds + Σ^c_{s≤t} γ(x_{s−}(x), Δz(s)),   x_0(x) = x.   (3.2)
Here, Σ^c denotes the compensated sum, that is,
    Σ^c_{s≤t} γ(x, Δz(s)) = lim_{ϵ→0} { Σ_{s≤t, |Δz(s)|≥ϵ} γ(x, Δz(s)) − ∫_0^t ds ∫_{|z|≥ϵ} γ(x, z) μ(dz) }
                            + ∫_0^t ∫_{|z|>1} γ(x_{s−}(x), z) N(ds dz).   (3.3)
(A.1)
(a) For any p ≥ 2 and any k ∈ N^d \ {0},
    ∫ |γ(x, z)|^p μ(dz) ≤ C(1 + |x|)^p,   sup_x ∫ | (∂^k γ/∂x^k)(x, z) |^p μ(dz) < +∞.
Due to the previous result (Theorem 1.5), the SDE (3.3) has a unique solution x_t(x). Furthermore, the condition (A.2) guarantees the existence of the flow φ_{st}(·)(ω) : R^d → R^d, x_s(x) ↦ x_t(x), of diffeomorphisms for all 0 < s ≤ t.
Here, we say that φ_{st}(·)(ω) : R^d → R^d, x_s(x) ↦ x_t(x), is a flow of diffeomorphisms if it is a bijection a.s. such that φ_{st} and its inverse are smooth a.s. for each s < t. We write ϕ_t(x) for x_t(x) if s = 0.
We remark that at a jump time t = τ of x_t(x),
    x_τ(x) = x_{τ−}(x) + Δx_τ(x) = x_{τ−}(x) + γ(x_{τ−}(x), Δz(τ)).
Hence,
    (∂/∂x) ϕ_τ(x) = ( I + (∂γ/∂x)(x_{τ−}(x), Δz(τ)) ) (∂/∂x) ϕ_{τ−}(x),
and this implies
    ( (∂/∂x) ϕ_τ(x) )^{−1} = ( (∂/∂x) ϕ_{τ−}(x) )^{−1} ( I + (∂γ/∂x)(x_{τ−}(x), Δz(τ)) )^{−1}.
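The jump-time relation above indicates how the Jacobian can be propagated along a path: between jumps it evolves with ∇b, and at each jump it is multiplied by I + ∇γ. The one-dimensional sketch below (with illustrative coefficients, not taken from the text) propagates ∂x_t/∂x in this way and compares it with a finite-difference quotient of the flow computed with the same jump times and marks.

import numpy as np

rng = np.random.default_rng(5)
T, lam, n = 1.0, 3.0, 20_000
dt = T / n

b  = lambda x: -0.4 * x                  # drift (illustrative)
db = lambda x: -0.4                      # derivative of the drift
g  = lambda x, z: 0.2 * np.sin(x) * z    # jump coefficient gamma(x, z)
dg = lambda x, z: 0.2 * np.cos(x) * z    # d gamma / dx

# One fixed realisation of jump times and marks, shared by all flows below
jumps = {i: rng.standard_normal() for i in range(n) if rng.random() < lam * dt}

def flow(x, with_jacobian=False):
    J = 1.0
    for i in range(n):
        if with_jacobian:
            J += db(x) * J * dt          # Euler step of the linear SDE for the Jacobian
        x += b(x) * dt                   # Euler step of the flow between jumps
        if i in jumps:
            z = jumps[i]
            if with_jacobian:
                J *= 1.0 + dg(x, z)      # multiply by I + (d gamma / dx) at a jump
            x += g(x, z)                 # jump of the flow itself
    return (x, J) if with_jacobian else x

x0, h = 1.0, 1e-5
x_T, J_T = flow(x0, with_jacobian=True)
fd = (flow(x0 + h) - flow(x0)) / h       # finite-difference approximation of the Jacobian
print(J_T, fd)                           # approximately equal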
L^p-estimates
The L^p-estimate for the solution of (3.2) is not easy in general. Here, we provide a simple one. Suppose x_t(x) is given by
    x_t(x) = x + bt + ∫_0^t ∫ γ(z) Ñ(dr dz).
Under the integrability of γ(z) with respect to 1_{{|z|≥1}} μ(dz), we have the following L^p estimate for x_t(x).
Indeed, the crucial condition (3.22) in [119] follows from (3.4) and (A.1).
Jacobian
In order to inquire into the flow property of x_t(x), we need the derivative ∇x_t(x) = (∂x_t/∂x)(x) of x_t(x). It is known that under the conditions (A.0)–(A.2), the derivative satisfies the following linear equation.
Proposition 1.8. Under the assumptions (A.0)–(A.2), the derivative ∇x_t(x) satisfies the following SDE:
    ∇x_t(x) = I + ∫_0^t ∇b(x_{s−}(x)) ∇x_{s−}(x) ds
              + ∫_0^t ∫_{|z|≤1} ∇γ(x_{s−}(x), z) ∇x_{s−}(x) Ñ(ds dz)
              + ∫_0^t ∫_{|z|>1} ∇γ(x_{s−}(x), z) ∇x_{s−}(x) N(ds dz).   (3.5)
0 |z|>1
Proof. We skip the proof for the differentiability since it is considerably long ([119]
Theorem 3.3). The SDE for which ∇xt (x) should satisfy is given as below.
Let e be a unit vector in Rd , and let
1
X (t) = (xt (x + λe) − xt (x)), 0 < |λ| < 1 .
λ
t t t
X (t) = e + bλ (s)ds + γλ (s, z)Ñ(dsdz) + γλ (s, z)N(dsdz)
0 0 |z|≤1 0 |z|>1
where
1
bλ (s) =(b (xs− (x + λe)) − b (xs− (x))) ,
λ
1
γλ (s, z) = (γ(xs− (x + λe), z) − γ(xs− (x), z)) .
λ
and
lim γλ (s, z) = ∇γ(xs− (x), z) · ∇xs− (x)a.s.
λ→0
Letting λ → 0, we obtain
    ∇x_t(x) = I + ∫_0^t ∇b(x_{s−}(x)) ∇x_{s−}(x) ds
              + ∫_0^t ∫_{|z|≤1} ∇γ(x_{s−}(x), z) ∇x_{s−}(x) Ñ(ds dz)
              + ∫_0^t ∫_{|z|>1} ∇γ(x_{s−}(x), z) ∇x_{s−}(x) N(ds dz).
The equation (3.5) is fundamental in considering the stochastic quadratic form (an analogue of the Malliavin matrix for the jump process). See Chapter 2.
The derivative ∇x_t(x) depends on the choice of e ∈ R^d, and is indeed a directional derivative. We denote the Jacobian matrix by the same symbol, namely, ∇x_t(x) = ∇_x x_t(x).
Here is an expression of det ∇x_t(x) which will be useful in the analytic (volume) estimates.
Let
    A_t = ∫_0^t Σ_{i=1}^d (∂b^i/∂x_i)(x_s(x)) ds
          + ∫_0^t ∫_{|z|≤1} [ det(I + ∇γ(x_{s−}(x), z)) − 1 − Σ_{i=1}^d (∂γ^i/∂x_i)(x_{s−}(x), z) ] ds μ(dz)
          + ∫_0^t ∫_{|z|>1} [ det(I + ∇γ(x_{s−}(x), z)) − 1 − Σ_{i=1}^d (∂γ^i/∂x_i)(x_{s−}(x), z) ] N(ds, dz),
and
    M_t = ∫_0^t ∫_{|z|≤1} [ det(I + ∇γ(x_{s−}(x), z)) − 1 ] Ñ(ds dz).
We then have
Lemma 1.1.
det(∇xt (x)) = E (A)t · E (M)t . (3.6)
Proof. By Itô's formula,
    det(∇x_t(x)) = 1 + ∫_0^t Σ_{i=1}^d (∂b^i/∂x_i)(x_s(x)) det(∇x_s(x)) ds
                   + ∫_0^t ∫_{|z|≤1} [ det( (I + ∇γ(x_{s−}(x), z)) · ∇x_{s−}(x) ) − det(∇x_{s−}(x)) ] Ñ(ds dz)
                   + ∫_0^t ∫_{|z|>1} [ det( (I + ∇γ(x_{s−}(x), z)) · ∇x_{s−}(x) ) − det(∇x_{s−}(x)) ] N(ds dz)
                   + ∫_0^t ∫_{|z|≤1} [ det( (I + ∇γ(x_{s−}(x), z)) · ∇x_{s−}(x) ) − det(∇x_{s−}(x))
                                       − Σ_{i=1}^d (∂γ^i/∂x_i)(x_{s−}(x), z) · det(∇x_{s−}(x)) ] ds μ(dz)
                 = 1 + ∫_0^t det(∇x_{s−}(x)) d(A_s + M_s).
Hence, we apply the Doléans exponential formula and Girsanov's theorem (Theorem 1.4) from Section 1.2.3. In view of [A, M]_s = 0, we have [A + M] = [A] + [M], and the assertion follows.
Using (3.6), it is possible to show the boundedness of E[supt∈T det(∇xt (x))−p ]
by a constant which depends on p > 1 and the coefficients of the SDE (3.2).
Inverse flow
Using the Jacobian ∇x_t(x), we can show the existence of the inverse flow x_t^{−1}(x) as a representation of the flow property. By the inverse mapping theorem, this is due to the (local) invertibility of the Jacobian ∇x_t(x).
Associated to the above-mentioned process ∇x_t(x) = U(t):
    U(t) = ∫_0^t ∇b(x_{s−}(x)) U(s) ds + ∫_0^t ∫_{|z|≤1} ∇γ(x_{s−}(x), z) U(s) Ñ(ds dz)
           + ∫_0^t ∫_{|z|>1} ∇γ(x_{s−}(x), z) U(s) N(ds dz),