Paper 3, Section II, H

Statistics | Part IB, 2011

Consider the general linear model

$$Y = X\beta + \epsilon$$

where $X$ is a known $n \times p$ matrix, $\beta$ is an unknown $p \times 1$ vector of parameters, and $\epsilon$ is an $n \times 1$ vector of independent $N(0, \sigma^{2})$ random variables with unknown variance $\sigma^{2}$. Assume the $p \times p$ matrix $X^{T}X$ is invertible.

(i) Derive the least squares estimator $\widehat{\beta}$ of $\beta$.
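[A sketch of the standard argument, not part of the question: expand the residual sum of squares and set its gradient to zero, which yields the normal equations.]

$$S(\beta) = \|Y - X\beta\|^{2} = Y^{T}Y - 2\beta^{T}X^{T}Y + \beta^{T}X^{T}X\beta, \qquad \nabla_{\beta} S(\beta) = -2X^{T}Y + 2X^{T}X\beta = 0 \;\Longrightarrow\; \widehat{\beta} = (X^{T}X)^{-1}X^{T}Y.$$

Invertibility (indeed positive definiteness) of $X^{T}X$ guarantees this stationary point is the unique minimiser.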

(ii) Derive the distribution of $\widehat{\beta}$. Is $\widehat{\beta}$ an unbiased estimator of $\beta$?
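[One route, sketched: $\widehat{\beta}$ is a linear transformation of the Gaussian vector $Y \sim N(X\beta, \sigma^{2}I)$, so it is itself Gaussian with mean and covariance read off directly.]

$$\widehat{\beta} = (X^{T}X)^{-1}X^{T}Y \sim N\big((X^{T}X)^{-1}X^{T}X\beta,\; \sigma^{2}(X^{T}X)^{-1}X^{T}X(X^{T}X)^{-1}\big) = N\big(\beta,\; \sigma^{2}(X^{T}X)^{-1}\big).$$

In particular $\mathbb{E}[\widehat{\beta}] = \beta$, so $\widehat{\beta}$ is unbiased.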

(iii) Show that $\frac{1}{\sigma^{2}}\|Y - X\widehat{\beta}\|^{2}$ has the $\chi^{2}$ distribution with $k$ degrees of freedom, where $k$ is to be determined.
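[A sketch using the permitted fact about $P = I - X(X^{T}X)^{-1}X^{T}$: since $PX = 0$, the residual satisfies $Y - X\widehat{\beta} = PY = P\epsilon$, and $P$ is symmetric and idempotent of rank $n - p$.]

$$\frac{1}{\sigma^{2}}\|Y - X\widehat{\beta}\|^{2} = \frac{1}{\sigma^{2}}\,\epsilon^{T}P^{T}P\,\epsilon = \frac{1}{\sigma^{2}}\,\epsilon^{T}P\,\epsilon \sim \chi^{2}_{\operatorname{rank}(P)} = \chi^{2}_{n-p}.$$

Diagonalising $P$ writes $\epsilon^{T}P\epsilon/\sigma^{2}$ as a sum of $n - p$ squared independent standard normals, giving $k = n - p$.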

(iv) Let $\tilde{\beta}$ be an unbiased estimator of $\beta$ of the form $\tilde{\beta} = CY$ for some $p \times n$ matrix $C$. By considering the matrix $\mathbb{E}\big[(\widehat{\beta} - \tilde{\beta})(\widehat{\beta} - \beta)^{T}\big]$ or otherwise, show that $\widehat{\beta}$ and $\widehat{\beta} - \tilde{\beta}$ are independent.
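[A sketch along the suggested lines: unbiasedness of $\tilde{\beta} = CY$ for every $\beta$ forces $CX = I$. Writing $A = (X^{T}X)^{-1}X^{T}$, so that $\widehat{\beta} = AY$ and $AA^{T} = (X^{T}X)^{-1}$, a covariance computation gives]

$$\operatorname{Cov}\big(\widehat{\beta} - \tilde{\beta},\, \widehat{\beta}\big) = \sigma^{2}(A - C)A^{T} = \sigma^{2}\big((X^{T}X)^{-1} - CX(X^{T}X)^{-1}\big) = 0,$$

using $CX = I$. Since $(\widehat{\beta} - \tilde{\beta}, \widehat{\beta})$ is jointly Gaussian (both are linear in $Y$), zero covariance implies independence.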

[You may use standard facts about the multivariate normal distribution as well as results from linear algebra, including the fact that $I - X(X^{T}X)^{-1}X^{T}$ is a projection matrix of rank $n - p$, as long as they are carefully stated.]
