Spring 1997 John Rust
Economics 551b 37 Hillhouse, Rm. 27
PROBLEM SET 2
Bayesian Basics (II)
(Due: February 10, 1997)
QUESTION 1
Derive the convergence of the posterior distribution when the support of the prior does not include the true parameter $\theta^*$. Show that
\[
\pi_N(\theta \mid x_1, \ldots, x_N) \Rightarrow \delta_{\theta_0},
\]
where $\delta_{\theta_0}$ is a point mass at $\theta_0$ and $\theta_0$ is defined by:
\[
\theta_0 = \mathop{\rm argmin}_{\theta \in \Theta} \int \log\left[\frac{f(x \mid \theta^*)}{f(x \mid \theta)}\right] f(x \mid \theta^*)\, dx,
\]
where $\Theta$ denotes the support of the prior, i.e. $\theta_0$ is the point in the support of the prior that is closest to $\theta^*$ in Kullback-Leibler distance.
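A quick numerical illustration of the claimed convergence (not a substitute for the requested derivation): the sketch below assumes the limit point $\theta_0$ is the Kullback-Leibler minimizer, and uses an illustrative setup of my own choosing, with data drawn from $N(\theta^*, 1)$, $\theta^* = 0$, and a uniform prior on a grid over $[1, 2]$ that excludes $\theta^*$.

```python
# Illustrative example (names and values are my own choices, not from the
# problem set): data come from N(theta*, 1) with theta* = 0, but the prior
# is uniform on a grid over [1, 2], which excludes theta*. The posterior
# should concentrate on theta_0 = 1, the support point closest to theta*
# in Kullback-Leibler distance.
import numpy as np

rng = np.random.default_rng(0)
theta_star = 0.0
grid = np.linspace(1.0, 2.0, 101)   # support of the (misspecified) prior

for N in (10, 100, 1000):
    x = rng.normal(theta_star, 1.0, size=N)
    # log-likelihood of the sample at each grid point (prior is uniform,
    # so the posterior is the normalized likelihood over the grid)
    loglik = -0.5 * ((x[:, None] - grid[None, :]) ** 2).sum(axis=0)
    post = np.exp(loglik - loglik.max())
    post /= post.sum()
    print(f"N={N}: posterior mode {grid[post.argmax()]:.2f}, "
          f"mass at mode {post.max():.3f}")
```

As $N$ grows, the posterior mode sits at the boundary point $1.0$ and the mass piles up there, consistent with convergence to $\delta_{\theta_0}$.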
QUESTION 2
Derive the convergence of the posterior distribution when there exists a parameter $\theta'$ observationally equivalent to the true parameter $\theta^*$, i.e.
\[
f(x \mid \theta') = f(x \mid \theta^*)
\]
for almost all $x$ (i.e. except for $x$ in a set of Lebesgue measure zero).
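To make the situation concrete, here is an example of my own construction (not from the problem set): if $x \sim N(\theta^2, 1)$, then $\theta$ and $-\theta$ are observationally equivalent, and with a prior supported on both signs the posterior remains bimodal no matter how large the sample gets.

```python
# My illustrative example of observational equivalence: x ~ N(theta^2, 1),
# so theta and -theta generate identical likelihoods. With a symmetric
# uniform prior on a grid, posterior mass never concentrates on one point.
import numpy as np

rng = np.random.default_rng(1)
theta_star = 1.5
grid = np.linspace(-3.0, 3.0, 601)   # symmetric prior support

x = rng.normal(theta_star ** 2, 1.0, size=2000)
loglik = -0.5 * ((x[:, None] - grid[None, :] ** 2) ** 2).sum(axis=0)
post = np.exp(loglik - loglik.max())
post /= post.sum()

# posterior mass on each side stays bounded away from zero (about 1/2 each)
mass_pos = post[grid > 0].sum()
mass_neg = post[grid < 0].sum()
print(f"mass near +theta*: {mass_pos:.3f}, mass near -theta*: {mass_neg:.3f}")
```

Even with 2000 observations, roughly half the posterior mass sits near each of $\pm\theta^*$, so the posterior converges to a mixture of point masses rather than a single point mass.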
QUESTION 3 Do the extra credit problem on an example of computing an invariant density for a Markov process. This is optional but recommended.
QUESTION 4 This question asks you to employ the Gibbs-sampling algorithm in a simple example taken from the book Bayesian Data Analysis by Gelman et al. Here the data consist of a single observation $y = (y_1, y_2)$ from a bivariate normal distribution with unknown mean $\theta = (\theta_1, \theta_2)$ and known covariance matrix $\Sigma$ given by
\[
\Sigma = \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}.
\]
With a uniform prior distribution over $\theta$, the posterior distribution of $\theta$ given $y$ is bivariate normal with mean $y$ and covariance matrix $\Sigma$.
Although it is trivial to sample directly from the bivariate normal posterior distribution of $\theta$, the purpose of this question is to have you use the Gibbs sampler to draw from the posterior and compare how well the random samples from the Gibbs sampler approximate draws from the bivariate normal posterior. Given the realized value of $\theta_2^{(t)}$, draw a value of $\theta_1^{(t+1)}$ from the conditional density of $\theta_1$ given $\theta_2$, given by:
\[
\theta_1 \mid \theta_2, y \sim N\bigl(y_1 + \rho(\theta_2 - y_2),\; 1 - \rho^2\bigr),
\]
and then, symmetrically, draw $\theta_2^{(t+1)}$ from the conditional density of $\theta_2$ given $\theta_1^{(t+1)}$.
Starting from 500 different randomly drawn initial values of $\theta^{(0)} = (\theta_1^{(0)}, \theta_2^{(0)})$, perform $T = 50$ loops of the above Gibbs sampling algorithm and save the 500 final draws of $\theta^{(T)}$, one for each initial condition. Use these random draws to compute the sample mean and covariance matrix of the posterior. How well do they compare to the true mean and covariance matrix of the posterior? Is $T = 50$ a sufficient number of iterations for the Gibbs sampler to converge to the invariant density?
(Hint: if the chain has converged to its invariant density, then by the ergodic theorem
\[
\frac{1}{T} \sum_{t=1}^{T} h(\theta^{(t)}) \to \int h(\theta)\, p(\theta)\, d\theta,
\]
where $h$ is some function that has finite expectation with respect to the invariant density $p$ of the Markov process generated by the Gibbs sampling algorithm.)
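The experiment described above can be sketched as follows. The handout's numerical values of $y$ and $\rho$ were lost in this copy, so $y = (0, 0)$ and $\rho = 0.8$ are assumed purely for illustration; substitute the values from the assignment.

```python
# Sketch of the Q4 Gibbs-sampling experiment. ASSUMPTIONS: y = (0, 0) and
# rho = 0.8 stand in for the handout's (unavailable) values; the uniform
# (-5, 5) initial conditions are also an illustrative choice.
import numpy as np

rng = np.random.default_rng(2)
rho = 0.8
y1, y2 = 0.0, 0.0
sd = np.sqrt(1.0 - rho ** 2)   # conditional std. dev. in both directions

n_chains, T = 500, 50
theta1 = rng.uniform(-5.0, 5.0, size=n_chains)   # 500 random initial values
theta2 = rng.uniform(-5.0, 5.0, size=n_chains)

for _ in range(T):
    # alternate draws from the two conditional normal densities
    theta1 = rng.normal(y1 + rho * (theta2 - y2), sd)
    theta2 = rng.normal(y2 + rho * (theta1 - y1), sd)

draws = np.column_stack([theta1, theta2])   # the 500 final draws of theta
print("sample mean:      ", draws.mean(axis=0))
print("sample covariance:\n", np.cov(draws.T))
# Compare against the true posterior mean y = (y1, y2) and the true
# posterior covariance [[1, rho], [rho, 1]].
```

Because the 500 chains are independent, the final draws are an i.i.d. sample from (approximately) the posterior, so the sample mean and covariance can be compared directly to $y$ and $\Sigma$; rerunning with smaller $T$ shows how quickly the dependence on the initial conditions dies out.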