
Computer Tutorial 2: Bayesian Analysis of the Regression Model

Data and Matlab code for all questions are available on the course website.

This computer tutorial takes you through a series of exercises which demonstrate Bayesian inference in the
regression model. The first three exercises go through the basics, while the fourth puts it all together in an
interesting time series application. If you are confident in your knowledge of the basics, then you might want
to go directly to Exercise 4.

Exercise 1: The Normal Linear Regression Model with Natural Conjugate prior (Analytical results)
Bayesian inference in the Normal linear regression model with natural conjugate prior can be done analytically.
That is, analytical formulae for the posterior exist. Construct a program which uses these analytical
formulae to calculate the posterior mean and standard deviation of the regression coefficients for an
informative and a noninformative prior. This is done in Session2_Ex1.m. Optional exercise: Extend this code to
calculate marginal likelihoods and calculate posterior odds ratios for beta_j = 0 for each regression coefficient.
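
As a rough guide to what a program like Session2_Ex1.m computes, here is a minimal sketch of the standard natural conjugate updating formulae. It is not the course code: the data names y (N x 1) and X (N x k) and the prior hyperparameter values b0, V0, nu0 and s02 are illustrative assumptions only.

% Sketch: analytical posterior for the Normal linear regression model
% with a natural conjugate Normal-Gamma prior.
% Assumes y (N x 1) and X (N x k) are already in memory.
[N, k] = size(X);

% Illustrative informative prior (for a noninformative prior, set
% inv(V0) = 0 and nu0 = 0 and drop the prior terms below).
b0  = zeros(k,1);      % prior mean of beta
V0  = 100*eye(k);      % prior covariance scale of beta (given h)
nu0 = 3;               % prior degrees of freedom for the error precision h
s02 = 1;               % prior guess at the error variance

% OLS quantities
bhat = (X'*X)\(X'*y);
SSE  = (y - X*bhat)'*(y - X*bhat);

% Natural conjugate posterior hyperparameters
V1     = inv(inv(V0) + X'*X);
b1     = V1*(inv(V0)*b0 + X'*X*bhat);
nu1    = nu0 + N;
nu1s12 = nu0*s02 + SSE + (bhat - b0)'*inv(V0 + inv(X'*X))*(bhat - b0);
s12    = nu1s12/nu1;

% Marginal posterior of beta is multivariate t with nu1 degrees of freedom
postmean = b1;
poststd  = sqrt(diag((nu1s12/(nu1 - 2))*V1));
disp([postmean poststd])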

Exercise 2: The Normal Linear Regression Model with Natural Conjugate prior (Monte Carlo integration)
Repeat Exercise 1 using Monte Carlo integration. How many draws do you need to take to replicate your
answers to Exercise 1? How sensitive are the results to your choice of the number of draws? Code is available in
Session2_Ex2.m.
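
One way to set up the Monte Carlo integration is to draw directly from the Normal-Gamma posterior: draw the error precision h from its Gamma posterior and then beta given h from a Normal. The sketch below does this, reusing b1, V1, nu1 and s12 from the sketch above; it is an illustration, not the website code, and it assumes gamrnd from the Statistics Toolbox is available.

% Sketch: Monte Carlo integration from the Normal-Gamma posterior.
% Assumes b1, V1, nu1, s12 from the analytical sketch are in memory.
R = 10000;                        % number of Monte Carlo draws
k = length(b1);
betadraws = zeros(R, k);

for r = 1:R
    % Draw h from its Gamma posterior: shape nu1/2, scale 2/(nu1*s12),
    % so that E(h) = 1/s12.
    h = gamrnd(nu1/2, 2/(nu1*s12));
    % Draw beta | h from its multivariate Normal posterior.
    betadraws(r, :) = (b1 + chol((1/h)*V1)'*randn(k,1))';
end

mcmean = mean(betadraws)';        % posterior means
mcstd  = std(betadraws)';         % posterior standard deviations
nse    = mcstd/sqrt(R);           % numerical standard errors (i.i.d. draws)
disp([mcmean mcstd nse])

Comparing mcmean and mcstd to the analytical answers for different values of R, with the numerical standard errors as a guide, is one way to answer the questions above.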

Exercise 3: The Normal Linear Regression Model with Independent Normal-Gamma prior (Gibbs sampling)
Bayesian inference in the Normal linear regression model with independent Normal-Gamma prior requires
Gibbs sampling. Repeat Exercise 2, but using an independent Normal-Gamma prior and, thus, using Gibbs
sampling. How do your results compare to Exercises 1 and 2? Code is available in Session2_Ex3.m.
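
With the independent Normal-Gamma prior the joint posterior is no longer of a known form, but the two conditional posteriors are, which is what the Gibbs sampler exploits. Below is a minimal sketch of such a sampler, again with illustrative data and hyperparameter names (y, X, b0, V0, nu0, s02) and assuming gamrnd from the Statistics Toolbox; it is not the course file Session2_Ex3.m.

% Sketch: Gibbs sampler for the Normal linear regression model with an
% independent Normal-Gamma prior. Assumes y (N x 1), X (N x k) in memory.
[N, k] = size(X);
b0 = zeros(k,1);  V0 = 100*eye(k);   % prior for beta: N(b0, V0)
nu0 = 3;          s02 = 1;           % prior for h: dof nu0, mean 1/s02

nburn = 1000;  nkeep = 10000;
betadraws = zeros(nkeep, k);
hdraws    = zeros(nkeep, 1);
h = 1/s02;                           % initial value for the error precision

invV0 = inv(V0);
for r = 1:(nburn + nkeep)
    % beta | h, y  ~  Normal
    V1 = inv(invV0 + h*(X'*X));
    b1 = V1*(invV0*b0 + h*(X'*y));
    beta = b1 + chol(V1)'*randn(k,1);

    % h | beta, y  ~  Gamma
    nu1 = nu0 + N;
    s12 = (nu0*s02 + (y - X*beta)'*(y - X*beta))/nu1;
    h = gamrnd(nu1/2, 2/(nu1*s12));

    if r > nburn
        betadraws(r - nburn, :) = beta';
        hdraws(r - nburn) = h;
    end
end

disp([mean(betadraws)' std(betadraws)'])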

Exercise 4: The AR(p) model as a Regression Model


This exercise is based on Geweke (1988, Journal of Business and Economic Statistics): “The Secular and
Cyclical Behavior of Real GDP in 19 OECD Countries” which uses an AR(3) model for GDP growth:

y_t = beta_0 + beta_1 y_{t-1} + beta_2 y_{t-2} + beta_3 y_{t-3} + epsilon_t

where epsilon_t is i.i.d. N(0, h^{-1}). This can be treated as a regression model (i.e., y_{t-1}, y_{t-2} and y_{t-3} play the
role of explanatory variables).[1] Many important properties of y_t depend on the roots of the polynomial
1 - sum_{i=1}^{p} beta_i z^i, which we will denote by r_i for i = 1, ..., p. Geweke (1988) lets y_t be the log of real GDP and
sets p = 3 and, for this choice, focusses on the features of interest: C = {beta : two of the r_i are complex} and
D = {beta : min |r_i| < 1}, where beta = (beta_0, beta_1, beta_2, beta_3)'. If the AR coefficients lie in the region defined by C
then real GDP exhibits an oscillatory response to a shock, and if they lie in D then y_t exhibits an explosive
response to a shock. Note that C and D are regions whose bounds are complicated nonlinear functions of
beta_1, beta_2, beta_3, and hence analytical results are not available (even if a natural conjugate prior is used) and Monte
Carlo integration or Gibbs sampling is required.
(a) Using an appropriate data set (e.g., the US real GDP data set provided on the website associated with
this book), write a program which calculates the posterior means and standard deviations of beta and of min |r_i|.
(b) Extend the program of part (a) to calculate the probability that y_t is oscillatory (i.e., Pr(beta in C | y)) and the
probability that y_t is explosive (i.e., Pr(beta in D | Data)), and calculate these probabilities using your data set.

[1] To simplify things, it is common to ignore the (minor) complications relating to the treatment of initial conditions. Thus,
assume the dependent variable is y = (y_4, ..., y_T)' and treat y_1, ..., y_3 as fixed initial conditions.
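
A minimal sketch of parts (a) and (b) follows. It assumes the series y_t is stored in a column vector named gdp (an illustrative name, not necessarily that of the website data), builds y and X following the footnote's convention, and reuses the betadraws matrix produced by the Gibbs (or Monte Carlo) sketches above.

% Sketch: posterior features of the AR(3) model (Exercise 4).
% Assumes a column vector gdp containing the series y_t is already loaded.
T = length(gdp);
y = gdp(4:T);
X = [ones(T-3,1) gdp(3:T-1) gdp(2:T-2) gdp(1:T-3)];

% ... run the Gibbs sampler (or Monte Carlo integration) from the earlier
% sketches on this y and X to obtain betadraws (nkeep x 4) ...

nkeep   = size(betadraws, 1);
minroot = zeros(nkeep, 1);
oscill  = zeros(nkeep, 1);
explos  = zeros(nkeep, 1);

for r = 1:nkeep
    b = betadraws(r, :)';                       % (beta_0, beta_1, beta_2, beta_3)'
    % Roots of 1 - beta_1 z - beta_2 z^2 - beta_3 z^3 = 0,
    % coefficients in descending powers of z for roots().
    rts = roots([-b(4) -b(3) -b(2) 1]);
    minroot(r) = min(abs(rts));
    oscill(r)  = any(abs(imag(rts)) > 1e-10);   % two complex roots => oscillatory
    explos(r)  = minroot(r) < 1;                % a root inside the unit circle => explosive
end

fprintf('Posterior mean of min|r_i|:        %.3f\n', mean(minroot));
fprintf('Pr(beta in C | y) (oscillatory):   %.3f\n', mean(oscill));
fprintf('Pr(beta in D | y) (explosive):     %.3f\n', mean(explos));

The two probabilities are simply the proportions of posterior draws falling in C and D, which is why Monte Carlo integration or Gibbs sampling handles these nonlinear features so easily.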
