Regression: Lancaster Page 10

This document discusses the components of Bayes' theorem and provides examples to illustrate them. It begins by defining a regression function as the expected conditional distribution of one random variable given another. It then explains that Bayes' theorem relates the posterior distribution, likelihood, and prior distribution. The document uses simple examples with scalar parameters and observations to illustrate how Bayes' theorem works, despite the simplicity not representing real applications fully. It identifies the posterior as the object on the left of Bayes' theorem, the likelihood and prior as the numerator on the right, and the marginal distribution or predictive distribution of the data as the denominator, which is often ignored for inference purposes.


REGRESSION A regression function is a property of the joint distribution of a pair of random variables. Specifically, it is the expected value in the conditional distribution of one variable given the other. If the variates are X and Y, it is E(X|Y = y) as a function of y, or E(Y|X = x) as a function of x.
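The regression function E(Y|X = x) can be illustrated by simulation. The sketch below is not from the text: it assumes an illustrative joint distribution in which Y = 2X plus noise, so that the true regression function is E(Y|X = x) = 2x, and approximates the conditional expectation by averaging Y over draws whose X lands near a chosen point.

```python
import numpy as np

rng = np.random.default_rng(0)

# An illustrative joint distribution: X standard normal, Y = 2X + noise,
# so the regression function is E(Y | X = x) = 2x.
n = 100_000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)

def regression_at(x0, width=0.05):
    """Approximate E(Y | X = x0) by averaging Y over draws whose X
    falls in a narrow band around x0."""
    band = np.abs(x - x0) < width
    return y[band].mean()

print(regression_at(1.0))   # close to 2.0, the true value of E(Y | X = 1)
```

The symmetric point holds for E(X|Y = y): it is computed from the same joint distribution, conditioning the other way.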
Let us examine the components of Bayes' theorem, as expressed in (1.4), using several simple examples. We shall initially restrict ourselves to cases in which the parameter θ is scalar and not vector valued, and we shall consider only situations where each observation is also scalar. This will be rather artificial since in almost all econometric applications the parameter has several, possibly many, dimensions – even in our consumption income example the parameter had two dimensions – and, as we remarked before, most economic models involve relations between several variables. Moreover, the examples use rather simple functional forms and these do not do justice to the full flexibility of modern Bayesian methods. But these restrictions have the great expositional advantage that they avoid computational complexity and enable us to show the workings of Bayes' theorem graphically.
The components of Bayes' theorem are the objects appearing in (1.4). The object on the left, p(θ|y), is the posterior distribution; the numerator on the right contains the likelihood, p(y|θ), and the prior, p(θ). The denominator on the right, p(y), is called the marginal distribution of the data or, depending on the context, the predictive distribution of the data. It can be seen that it does not involve θ, and so for purposes of inference about θ it can be neglected, and Bayes' theorem is often written as

p(θ|y) ∝ p(y|θ)p(θ) (1.5)

where the symbol ∝ means "is proportional to." This last relation can be translated into words as "the posterior distribution is proportional to the likelihood times the prior." We shall focus here on the elements of (1.5).
