Prof. John Rust
Due: February 17, 1999
Question 1
Let ..., where Y is a random vector and ... is a positive definite matrix. Prove that ... (Hint: Apply the Jordan decomposition to ... .)
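For reference, the Jordan (spectral) decomposition named in the hint can be checked numerically. A minimal sketch in Python, using a hypothetical 2x2 symmetric positive definite matrix A chosen only for illustration:

import numpy as np

# Hypothetical symmetric positive definite matrix (for illustration only)
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# Jordan (spectral) decomposition: A = C diag(lam) C', with C orthonormal
lam, C = np.linalg.eigh(A)

print(lam)                                     # eigenvalues, all strictly positive
print(np.allclose(A, C @ np.diag(lam) @ C.T))  # reconstruction check: True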
Question 2
The Legendre Polynomials, denoted by $P_0(x)$, $P_1(x)$, $P_2(x)$, ..., can be obtained by finding the set of orthogonal vectors that span the same subspace in $L^2[-1,1]$ as $1, x, x^2, x^3, \ldots$ Using
the Gram-Schmidt Orthogonalization Process, find the first four Legendre Polynomials.
Then normalize the Legendre Polynomials so that they are orthonormal.
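As a cross-check on the calculation, here is a minimal sketch of the Gram-Schmidt step in Python (sympy), assuming the inner product of two polynomials is the integral of their product over [-1, 1]:

import sympy as sp

x = sp.symbols('x')

def inner(f, g):
    # Assumed inner product on L^2[-1, 1]
    return sp.integrate(f * g, (x, -1, 1))

monomials = [sp.Integer(1), x, x**2, x**3]
ortho = []
for p in monomials:
    # Subtract the projection of p onto each polynomial already found
    for q in ortho:
        p = p - inner(p, q) / inner(q, q) * q
    ortho.append(sp.expand(p))

print(ortho)          # 1, x, x**2 - 1/3, x**3 - 3*x/5 (Legendre up to scale)

# Rescale so each polynomial has unit L^2 norm (the orthonormal version)
orthonormal = [sp.simplify(q / sp.sqrt(inner(q, q))) for q in ortho]
print(orthonormal)

Dividing by the norm gives the orthonormal version the question asks for; the classical Legendre polynomials are instead scaled so that $P_n(1) = 1$.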
Question 3
Let $\hat\beta_2$ be the OLS estimate of $\beta_2$ in the regression:
$y = X_1\beta_1 + X_2\beta_2 + \varepsilon$
And let $\tilde\beta_2$ be the estimate from stepwise regression.
That is, first obtain $\tilde\beta_1$ as the OLS estimate from the regression:
$y = X_1\beta_1 + u$
Then obtain $\tilde\beta_2$ as the OLS estimate from the following regression:
$y - \hat y = (X_2 - \hat X_2)\beta_2 + v$
where $\hat y$ and $\hat X_2$ are the projections of y and $X_2$ on $X_1$, respectively, i.e.,
they are the predicted values in the regressions of y and $X_2$ on $X_1$, respectively. Show that
$\hat\beta_2$ and $\tilde\beta_2$ are identical.
Suggest when stepwise regression can be useful.
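A minimal numerical sketch of the claim, assuming simulated data and partitioned regressors X1 and X2 named here only for illustration:

import numpy as np

rng = np.random.default_rng(0)
n = 200
X1 = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # includes a constant
X2 = rng.normal(size=(n, 2))
y = X1 @ np.array([1.0, 2.0, -1.0]) + X2 @ np.array([0.5, 3.0]) + rng.normal(size=n)

# Full OLS of y on [X1, X2]; the trailing coefficients are beta_2_hat
X = np.column_stack([X1, X2])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
beta2_hat = beta_hat[X1.shape[1]:]

# Stepwise: project y and X2 on X1, then regress the residuals
P1 = X1 @ np.linalg.solve(X1.T @ X1, X1.T)    # projection matrix onto col(X1)
y_resid = y - P1 @ y
X2_resid = X2 - P1 @ X2
beta2_tilde = np.linalg.lstsq(X2_resid, y_resid, rcond=None)[0]

print(np.allclose(beta2_hat, beta2_tilde))    # True up to rounding error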
Question 4
Suppose ..., show that ..., where ... .