"SURGIBBS.GPR: program to use Gibbs sampling of the the seemingly unrelated regression model to compute draws from the posterior density for (beta,omega) By John Rust, Yale University, April, 1997"; ""; /* Consider Bayesian estimation of the Seemingly Unrelated Regression Model (SUR) with kx1 vector of regression coefficients beta and a p x p covariance matrix for the unobservables in the SUR model. We observe (y,x) where Y is a np x 1 stacked dependent variable (i.e. p equations for each observation times the n observations) and x is an (np x k) matrix in the format given in the answer to question 1 on the midterm. Given a prior over (beta,omega^{-1}) equal to a product of a Normal and Wishart with hyperparameters (b0,a0,rho0,r0) (see answers to Problem Set 3 for details) this program uses Gibbs sampling to construct draws of beta and Omega. Program is setup to report a moving average of simulated beta and omega draws every 100 iterations of the Gibbs sampling procedure. For comparison purposes the program also computes the GLS estimates of beta and Omega. This is a "generic" program which can be used to answer Questions 1 and 4 of the Midterm exam. The user needs to modify the procedures: set_hp.g: procedure to set the prior hyperparameters of the model get_dat.g: procedure to return (y,X) After modifying these procedures, user only needs to set the following two variables: nloops: determines the number of Gibbs sampling iterations done prior to beginning sampling from the posterior nsamples: number of samples to obtain from the approximate posterior density */ nloops=500; nsamples=500; /* You don't have to change anything below these lines. But make sure you have modified get_dat and set_hp procedures to the problem at hand. */ /* set prior hyperparameters */ {b0,a0,rho0,r0}=set_hp; /* retrieve observed X and y variables for SUR model p = number of equations per observation k = total number of regression coefficients to be estimated y = np x 1 dependent variable vector x = np x k matrix of independent variables */ {x,y,n,p,k}=get_dat; omegasim=zeros(p*(p+1)/2,nsamples); /* stores vectorized lower triangle of each simulated omega matrix */ betasim=zeros(k,nsamples); /* stores each simulated beta vector */ omegapost=zeros(p,p); /* stores moving average of simulated omegas: equals mean of omegasim at termination of program */ betapost=zeros(k,1); /* stores moving average of simulated betas: equals mean of betasim */ "Computing 2-stage feasible GLS point estimates of SUR model"; ""; omega=eye(p); xm=zeros(k,k); ym=zeros(k,1); i=1; do until i > n; xm=xm+x[p*(i-1)+1:p*(i-1)+p,.]'(omega*x[p*(i-1)+1:p*(i-1)+p,.]); ym=ym+x[p*(i-1)+1:p*(i-1)+p,.]'(omega*y[p*(i-1)+1:p*(i-1)+p,.]); i=i+1; endo; "First stage OLS regression estimates of SUR model"; tb0=(invpd(xm)*ym); tb0'; /* calculate OLS residuals and construct estimate of covariance matrix omega */ ehat=y-x*tb0; omega=zeros(p,p); i=1; do until i > n; omega=omega+ehat[p*(i-1)+1:p*(i-1)+p]*ehat[p*(i-1)+1:p*(i-1)+p]'; i=i+1; endo; omega=omega/n; if (p < 10); "Estimated Omega matrix from first stage estimates of beta"; omega; endif; omega=invpd(omega); xm=zeros(k,k); ym=zeros(k,1); i=1; do until i > n; xm=xm+x[p*(i-1)+1:p*(i-1)+p,.]'(omega*x[p*(i-1)+1:p*(i-1)+p,.]); ym=ym+x[p*(i-1)+1:p*(i-1)+p,.]'(omega*y[p*(i-1)+1:p*(i-1)+p,.]); i=i+1; endo; "Second stage feasible GLS regression estimates of SUR model"; tb0=(invpd(xm)*ym); tb0'; /* calculate OLS residuals and construct estimate of covariance matrix omega */ ehat=y-x*tb0; omega=zeros(p,p); i=1; do until i > n; omega=omega+ehat[p*(i-1)+1:p*(i-1)+p]*ehat[p*(i-1)+1:p*(i-1)+p]'; i=i+1; endo; omega=omega/n; if (p < 10); "Estimated Omega matrix from second stage estimates of beta"; omega; endif; ""; "Starting Gibbs sampling algorithm using "$+ftos(nloops,"*.*lf",4,0)$+" `burn-in' iterations"; /* compute initial posterior hyperparameters start Gibbs sampling algorithm */ rhon=rho0+n; bn=b0; t=hsec; g=1; do until g > nloops+nsamples; /* compute parameter matrix of conditional inverse Wishart component of the posterior density */ ehat=y-x*bn; rn=invpd(r0); i=1; do until i > n; rn=rn+ehat[p*(i-1)+1:p*(i-1)+p]*ehat[p*(i-1)+1:p*(i-1)+p]'; i=i+1; endo; rn=invpd(rn); /* Draw a random inverse omega matrix from the Wishart with parameters (rhon,rn) */ /* Fast method for simulating a Wishart described on p. 213 of Rossi and McCulloch, Journal of Econometrics, 1994 */ rn=chol(rn); omega=zeros(p,p); i=1; do until i > p; omega[i,i]=sqrt(sumc(rndn(rhon,1)^2)); j=i+1; do until j > p; omega[j,i]=rndn(1,1); j=j+1; endo; i=i+1; endo; omega=rn*omega; omega=omega*omega'; /* Slow method of simulating from a Wishart based on the representation of a Wishart random vector as a sum of outer products of Normal random vectors. (Keep block of code commented out if you are using the fast simulation method for the Wishart given above: i=1; do until i > rhon; z=rn*rndn(p,1); omega=omega+z*z'; i=i+1; endo; */ /* Update parameters of conditional Normal component of the posterior density */ an=a0; xm=zeros(k,k); ym=zeros(k,1); i=1; do until i > n; xm=xm+x[p*(i-1)+1:p*(i-1)+p,.]'(omega*x[p*(i-1)+1:p*(i-1)+p,.]); ym=ym+x[p*(i-1)+1:p*(i-1)+p,.]'(omega*y[p*(i-1)+1:p*(i-1)+p,.]); i=i+1; endo; an=an+xm; bn=invpd(an)*(a0*b0+ym); bn=bn+chol(invpd(an))*rndn(k,1); /* we have now drawn (bn,omega) from iteration g of the Gibbs sampling algorithm. We now start a new loop and draw a new pair (bn,omega) for iteration g+1. But before we do this, the following code prints out some summary statistics on the progress of the Gibbs sampling iterations so far */ "\r\r\r\r\r";; g;; /* print current iteration of Gibbs sampler */ if (g < nloops); nf=g; elseif (g == nloops); nf=1; betapost=zeros(k,1); omegapost=zeros(p,p); "Burn-in simulations completed: now drawing "$+ftos(nsamples,"*.*lf",4,0)$+" from approximate posterior, storing averages in betapost and omegapost"; else; nf=g-nloops; endif; omega=invpd(omega); betapost=betapost+bn; omegapost=omegapost+omega; /* Store current simulations of beta and omega in matrices betasim and omegasim */ if (g > nloops); omegasim[.,nf]=vech(omega); betasim[.,nf]=bn; endif; if (g%100 == 0); ""; ""; "CPU time for 100 iterations of Gibbs sampler (seconds): ";; (hsec-t)/100; t=hsec; "Current estimate of posterior mean of beta"; betapost'/nf; if (p < 10); "Current estimate of posterior mean of omega"; omegapost/nf; endif; ""; endif; g=g+1; endo; "Vectorized records of simulations are stored in omegasim and betasim "; "Do xpnd(omegasim[.,j]) to recreate the j'th simulated draw of omega ";