Paper 3, Section II, 20H

Statistics | Part IB, 2017

Consider the general linear model

$$\boldsymbol{Y} = X\boldsymbol{\beta} + \boldsymbol{\varepsilon}$$

where $X$ is a known $n \times p$ matrix of full rank $p < n$, $\boldsymbol{\varepsilon} \sim \mathcal{N}_n(0, \sigma^2 I)$ with $\sigma^2$ known, and $\boldsymbol{\beta} \in \mathbb{R}^p$ is an unknown vector.

(a) State without proof the Gauss-Markov theorem.

Find the maximum likelihood estimator $\widehat{\boldsymbol{\beta}}$ of $\boldsymbol{\beta}$. Is it unbiased?

Let $\boldsymbol{\beta}^{*}$ be any unbiased estimator of $\boldsymbol{\beta}$ which is linear in $(Y_i)$. Show that

$$\operatorname{var}\left(\boldsymbol{t}^T \widehat{\boldsymbol{\beta}}\right) \leqslant \operatorname{var}\left(\boldsymbol{t}^T \boldsymbol{\beta}^{*}\right)$$

for all $\boldsymbol{t} \in \mathbb{R}^p$.
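(Not part of the question: the estimator in play here is $\widehat{\boldsymbol{\beta}} = (X^T X)^{-1} X^T \boldsymbol{Y}$, and its unbiasedness can be checked numerically. The following is a minimal simulation sketch; the sample sizes, design matrix, and coefficient values are arbitrary choices for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

n, p, sigma2 = 50, 3, 2.0          # illustrative sizes, not from the question
X = rng.normal(size=(n, p))        # a generic full-rank design matrix
beta = np.array([1.0, -2.0, 0.5])  # "true" coefficients for the simulation

def beta_hat(Y, X):
    # Least-squares / maximum likelihood estimator: (X^T X)^{-1} X^T Y
    return np.linalg.solve(X.T @ X, X.T @ Y)

# Empirical check of unbiasedness: average beta_hat over many replicates
reps = 2000
estimates = np.empty((reps, p))
for i in range(reps):
    eps = rng.normal(scale=np.sqrt(sigma2), size=n)
    estimates[i] = beta_hat(X @ beta + eps, X)

print(estimates.mean(axis=0))  # should be close to beta
```

Averaging over replicates, the estimates concentrate around the true $\boldsymbol{\beta}$, consistent with $\mathbb{E}\,\widehat{\boldsymbol{\beta}} = \boldsymbol{\beta}$.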

(b) Suppose now that $p = 1$ and that $\boldsymbol{\beta}$ and $\sigma^2$ are both unknown. Find the maximum likelihood estimator of $\sigma^2$. What is the joint distribution of $\widehat{\boldsymbol{\beta}}$ and $\widehat{\sigma}^2$ in this case? Justify your answer.
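(Again not part of the question: the standard facts being asked for are that the MLE is $\widehat{\sigma}^2 = \|\boldsymbol{Y} - X\widehat{\boldsymbol{\beta}}\|^2 / n$, that it is biased with $\mathbb{E}\,\widehat{\sigma}^2 = (n-p)\sigma^2/n$, and that $\widehat{\boldsymbol{\beta}}$ and $\widehat{\sigma}^2$ are independent. A simulation sketch with arbitrary illustrative values for $p = 1$:)

```python
import numpy as np

rng = np.random.default_rng(1)

n, sigma2, beta = 40, 1.5, 2.0  # illustrative values, not from the question
x = rng.normal(size=n)          # p = 1 design (a single column)

reps = 5000
b_hats = np.empty(reps)
s2_hats = np.empty(reps)
for i in range(reps):
    y = x * beta + rng.normal(scale=np.sqrt(sigma2), size=n)
    b = (x @ y) / (x @ x)              # beta_hat for p = 1
    b_hats[i] = b
    s2_hats[i] = np.mean((y - x * b) ** 2)  # MLE of sigma^2: RSS / n

# The MLE of sigma^2 is biased: E[sigma2_hat] = (n - 1) sigma^2 / n when p = 1
print(s2_hats.mean(), (n - 1) * sigma2 / n)
# beta_hat and sigma2_hat are independent, so their sample correlation is ~0
print(np.corrcoef(b_hats, s2_hats)[0, 1])
```

The simulated mean of $\widehat{\sigma}^2$ matches $(n-1)\sigma^2/n$ rather than $\sigma^2$, and the near-zero correlation reflects the independence of $\widehat{\boldsymbol{\beta}}$ and $\widehat{\sigma}^2$ (with $n\widehat{\sigma}^2/\sigma^2 \sim \chi^2_{n-1}$).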
