Heavy-Tailed Innovations in the R Package stochvol
Gregor Kastner
WU Vienna University of Economics and Business
Abstract
We document how sampling from a conditional Student’s t distribution is implemented
in stochvol. Moreover, a simple example using EUR/CHF exchange rates illustrates how
to use the augmented sampler. We conclude with results and implications.
Preface
This note serves as a preliminary add-on to the more elaborate article “Dealing with Stochastic
Volatility in Time Series using the R package stochvol” (Kastner 2016a). It discusses and
relaxes the restriction to conditionally normal errors in the vanilla stochastic volatility (SV)
model.
ν ∼ U(a, b), i.e., ν follows a uniform distribution with support on the real interval (a, b). All
other prior components are chosen as in Kastner (2016a).
2. Usage
Estimating a stochastic volatility model with conditional t errors via stochvol is very similar to
estimating a model with standard Gaussian errors, differing only through specifying a non-NA
argument priornu. This triggers the sampler specified in Section 3. To provide an example,
we investigate the historical daily EUR/CHF exchange rates and display these in Figure 1.
R> library(stochvol)
R> data(exrates)
R> par(mfrow = c(2, 1), mar = c(1.7, 1.7, 1.7, 0.1), mgp = c(1.6, 0.6, 0))
R> plot(exrates$date, exrates$CHF, type = 'l', main = 'Price of 1 EUR in CHF')
R> dat <- logret(exrates$CHF, demean = TRUE)
R> plot(exrates$date[-1], dat, type = 'l', main = 'Demeaned log returns')
By specifying the argument priornu (a two-element vector containing the lower and upper
bounds of the uniform prior for ν), we can trigger the sampler to allow for heavy-tailed
conditional innovations.
R> rest <- svsample(dat, priormu = c(-12, 1), priorphi = c(20, 1.1),
+ priorsigma = 0.1, priornu = c(2, 100), burnin = 2000)
R> plot(rest, showobs = FALSE)
Results are displayed in Figure 2, containing the output from the SV-t model. Row 1 depicts
exp(ht/2) for t ∈ {1, . . . , n}; row 2 shows the time-varying standard deviations given through
√(ν/(ν − 2)) · exp(ht/2) for t ∈ {1, . . . , n}; row 3 portrays trace plots and row 4 outlines the
corresponding smoothed kernel density estimates for the four parameters µ, φ, σ, and ν. It
is worth noting that ν is estimated to lie between 6 and 18 with high posterior probability,
indicating evidence for the presence of heavy tails even after catering for stochastic volatility.
The extra flexibility of the SV-t sampler seems to allow for increased persistence φ and smaller
variance of log-volatility σ², resulting in smoother time-varying volatility estimates.
Figure 2: Standard output of the plot method when applied to an svdraws object containing
posterior draws from an SV model with Student’s t errors.
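For illustration, the quantities shown in rows 1 and 2 of Figure 2 can also be computed
directly from the posterior draws. The following sketch assumes that the MCMC draws of the
latent log-volatilities and of ν have already been extracted into a matrix hdraws (draws in
rows, time points in columns) and a vector nudraws; these names are placeholders rather than
part of stochvol's interface, and toy values are generated so that the snippet is self-contained.
R> ## toy stand-ins for posterior draws of h_t (1000 draws, 5 time points) and nu
R> hdraws <- matrix(rnorm(1000 * 5, mean = -10, sd = 0.5), nrow = 1000)
R> nudraws <- runif(1000, 5, 20)
R> ## row 1 of Figure 2: posterior mean of exp(h_t/2)
R> vol <- colMeans(exp(hdraws / 2))
R> ## row 2 of Figure 2: posterior mean of sqrt(nu/(nu - 2)) * exp(h_t/2)
R> sdt <- colMeans(sweep(exp(hdraws / 2), 1, sqrt(nudraws / (nudraws - 2)), "*"))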
Figure 3: Log predictive one-day-ahead Bayes factors in favor of SV, SV-t, and GARCH errors
over the homoskedastic model. The final log predictive Bayes factors aggregate to 1107.44
(SV), 1115.36 (SV-t), and 1033.53 (GARCH), respectively, thus providing strong evidence for
the SV-t model.
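Log predictive Bayes factors of this kind are cumulative sums of differences in one-day-ahead
log predictive likelihoods relative to the benchmark model. The following sketch shows how
such a cumulative comparison can be assembled; it is not the code used to produce Figure 3,
and the predictive likelihood vectors are placeholders.
R> ## placeholder one-day-ahead log predictive likelihoods for two models
R> lplhom <- rnorm(800, mean = 3, sd = 0.2)             # homoskedastic benchmark
R> lplsvt <- lplhom + rnorm(800, mean = 1.4, sd = 0.5)  # SV-t model
R> ## cumulative log predictive Bayes factor of SV-t over the benchmark
R> cumlbf <- cumsum(lplsvt - lplhom)
R> plot(cumlbf, type = "l", xlab = "Day", ylab = "Cumulative log predictive BF")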
The conditional Student's t distribution of the observations can equivalently be written as a
scale mixture of Gaussians, yt | ht, τt ∼ N(0, τt exp(ht)) with τt | ν ∼ IG(ν/2, ν/2), where IG(a, b)
denotes the inverse gamma distribution with shape and scale parameters a and b, respectively.
Treating τ = (τ1, . . . , τn)⊤ as latent data and letting ỹt = yt/√τt for t ∈ {1, . . . , n}, we have
ỹt | ht ∼ N(0, exp ht),
and the AWOL sampler described in Kastner and Frühwirth-Schnatter (2014) can directly be
applied to the transformed data. To obtain draws from the newly introduced variables τ and
ν, two additional steps are required.
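As an illustration of one of these steps, the following stand-alone sketch draws each τt from
its inverse gamma full conditional, which under the scale mixture representation above is
IG((ν + 1)/2, (ν + yt² exp(−ht))/2), and then forms the rescaled observations ỹt. This is plain
R rather than the compiled code used inside stochvol, and the values of h and nu are placeholders.
R> drawtau <- function(y, h, nu) {
+    ## tau_t | y_t, h_t, nu ~ IG((nu + 1)/2, (nu + y_t^2 * exp(-h_t))/2)
+    1 / rgamma(length(y), shape = (nu + 1) / 2, rate = (nu + y^2 * exp(-h)) / 2)
+  }
R> h <- rnorm(length(dat), mean = -10, sd = 0.5)  # placeholder log-volatilities
R> nu <- 10                                       # placeholder degrees of freedom
R> tau <- drawtau(dat, h, nu)
R> ytilde <- dat / sqrt(tau)  # transformed data passed on to the AWOL update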
where ψ^(m) denotes the polygamma function of order m. Using the above, it is easy to
numerically find the mode ν̂ of the conditional posterior of ν together with Bν̂, the negative
inverse curvature at that mode, and a proposal candidate νprop may be drawn from a normal
distribution with mean ν̂ and variance Bν̂ (the Laplace approximation). Letting φ(x|ν̂, Bν̂)
denote the corresponding density function, the acceptance probability is equal to min{1, R},
where R = p(νprop|τ) φ(νold|ν̂, Bν̂) / {p(νold|τ) φ(νprop|ν̂, Bν̂)} is the usual independence
Metropolis–Hastings ratio and νold denotes the current value of ν.
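To make this step concrete, here is a stand-alone R sketch of the resulting independence
Metropolis–Hastings update for ν. The log conditional posterior lpnu follows from the
IG(ν/2, ν/2) mixing distribution together with the uniform prior; the bounds a = 2 and b = 100
mirror the priornu setting used above, and the finite-difference curvature is an illustrative
stand-in for however Bν̂ is obtained internally.
R> lpnu <- function(nu, tau, a = 2, b = 100) {
+    ## log p(nu | tau) up to a constant, under nu ~ U(a, b)
+    if (nu <= a || nu >= b) return(-Inf)
+    n <- length(tau)
+    n * (nu / 2) * log(nu / 2) - n * lgamma(nu / 2) -
+      (nu / 2 + 1) * sum(log(tau)) - (nu / 2) * sum(1 / tau)
+  }
R> drawnu <- function(nuold, tau, a = 2, b = 100) {
+    lp <- function(nu) lpnu(nu, tau, a, b)
+    nuhat <- optimize(lp, c(a, b), maximum = TRUE)$maximum
+    eps <- min(1e-3, (nuhat - a) / 2, (b - nuhat) / 2)
+    B <- -eps^2 / (lp(nuhat + eps) - 2 * lp(nuhat) + lp(nuhat - eps))  # Laplace variance
+    nuprop <- rnorm(1, nuhat, sqrt(B))
+    logR <- lp(nuprop) - lp(nuold) +
+      dnorm(nuold, nuhat, sqrt(B), log = TRUE) - dnorm(nuprop, nuhat, sqrt(B), log = TRUE)
+    if (log(runif(1)) < logR) nuprop else nuold
+  }
R> nu <- drawnu(nu, tau)  # one MH update, using tau and nu from the sketch above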
4. Conclusion
We have shown how a simple data augmentation trick can be utilized to generalize the core
sampler in stochvol in order to cater for potentially heavier-tailed innovation distributions.
However, several caveats are in order:
• Even though the uniform prior for ν has been used widely, more robust alternatives are
probably preferable, cf. Frühwirth-Schnatter and Pyne (2010) and the references therein.
• Leaving aside the additional computational burden, it is trivial to incorporate this ex-
tension into samplers employing stochvol as part of a larger MCMC scheme (e.g. Huber
2014; Kastner, Frühwirth-Schnatter, and Lopes 2014; Dovern, Feldkircher, and Huber
2015). Nevertheless, at the current stage of development, this should be conducted with
caution by carefully investigating the convergence of the posterior draws.
Acknowledgments
The author would like to thank the attendants of the Spring 2015 Brown Bag Seminar of the
Institute for Statistics and Mathematics, WU Vienna University of Economics and Business,
in particular Sylvia Frühwirth-Schnatter, Mark Jensen, and Kurt Hornik, for stimulating
comments and suggestions.
References
Chib S, Greenberg E (1994). “Bayes Inference in Regression Models with ARMA(p, q) Errors.”
Journal of Econometrics, 64, 183–206. doi:10.1016/0304-4076(94)90063-9.
Chib S, Nardari F, Shephard N (2002). “Markov Chain Monte Carlo Methods for Stochastic
Volatility Models.” Journal of Econometrics, 108, 281–316. doi:10.1016/S0304-4076(01)00137-3.
Delatola EI, Griffin JE (2011). “Bayesian Nonparametric Modelling of the Return Distribution
with Stochastic Volatility.” Bayesian Analysis, 6, 901–926. doi:10.1214/11-BA632.
Dovern J, Feldkircher M, Huber F (2015). “Does Joint Modeling of the World Economy
Pay Off? Evaluating GVAR Forecasts from a Multivariate Perspective.” Discussion Paper
Series 590, University of Heidelberg, Department of Economics. URL http://www.ub.uni-heidelberg.de/archiv/18586.
Harvey AC, Ruiz E, Shephard N (1994). “Multivariate Stochastic Variance Models.” The
Review of Economic Studies, 61(2), 247–264. doi:10.2307/2297980.
Huber F (2014). “Density Forecasting using Bayesian Global Vector Autoregressions with
Common Stochastic Volatility.” Department of Economics Working Paper Series 179, WU
Vienna University of Economics and Business. URL http://epub.wu.ac.at/id/eprint/4280.
Kastner G (2016a). “Dealing with Stochastic Volatility in Time Series Using the R Package
stochvol.” Journal of Statistical Software, 69(5), 1–30. doi:10.18637/jss.v069.i05.
Kastner G (2016b). stochvol: Efficient Bayesian Inference for Stochastic Volatility (SV)
Models. R package version 1.2.3, URL http://CRAN.R-project.org/package=stochvol.
Nakajima J, Omori Y (2012). “Stochastic Volatility Model with Leverage and Asymmetrically
Heavy-Tailed Error Using GH Skew Student’s t-Distribution.” Computational Statistics &
Data Analysis, 56(11), 3690–3704. doi:10.1016/j.csda.2010.07.012.
R Core Team (2016). R: A Language and Environment for Statistical Computing. R Founda-
tion for Statistical Computing, Vienna, Austria. URL http://www.R-project.org/.
Silva RS, Lopes HF, Migon HS (2006). “The Extended Generalized Inverse Gaussian Distri-
bution for Log-Linear and Stochastic Volatility Models.” Brazilian Journal of Probability
and Statistics, 20(1), 67–91.
Affiliation:
Gregor Kastner
Institute for Statistics and Mathematics
Department of Finance, Accounting and Statistics
WU Vienna University of Economics and Business
Welthandelsplatz 1, Building D4, Level 4
1020 Vienna, Austria
Telephone: +43/1/31336-5593
Fax: +43/1/31336-90-5593
E-mail: [email protected]
URL: http://statmath.wu.ac.at/~kastner/