Paper 1, Section II, E

Statistics | Part IB, 2010

Consider the linear regression model

$$Y_{i}=\beta x_{i}+\epsilon_{i},$$

where the numbers $x_{1}, \ldots, x_{n}$ are known, the independent random variables $\epsilon_{1}, \ldots, \epsilon_{n}$ have the $N\left(0, \sigma^{2}\right)$ distribution, and the parameters $\beta$ and $\sigma^{2}$ are unknown. Find the maximum likelihood estimator for $\beta$.
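For orientation, here is a hedged sketch of one standard route to the estimator (not necessarily the intended write-up): write down the log-likelihood under the normality assumption and set its $\beta$-derivative to zero.

```latex
\ell(\beta,\sigma^{2})
  = -\frac{n}{2}\log\left(2\pi\sigma^{2}\right)
    - \frac{1}{2\sigma^{2}}\sum_{i=1}^{n}\left(Y_{i}-\beta x_{i}\right)^{2},
\qquad
\frac{\partial \ell}{\partial \beta} = 0
\;\Longrightarrow\;
\hat{\beta} = \frac{\sum_{i=1}^{n} x_{i} Y_{i}}{\sum_{i=1}^{n} x_{i}^{2}}.
```

Note that maximizing over $\beta$ does not involve $\sigma^{2}$, so the MLE for $\beta$ coincides with the least-squares estimator.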

State and prove the Gauss-Markov theorem in the context of this model.
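A sketch of the argument in this model (one of several possible proofs): any linear estimator has the form $\tilde{\beta}=\sum_{i} c_{i} Y_{i}$, and unbiasedness pins down a constraint on the coefficients $c_{i}$, after which the variance is minimized by Cauchy–Schwarz.

```latex
\mathbb{E}\left[\tilde{\beta}\right] = \beta \sum_{i=1}^{n} c_{i} x_{i}
\;\Longrightarrow\;
\text{unbiased} \iff \sum_{i=1}^{n} c_{i} x_{i} = 1,
\qquad
\operatorname{Var}\left(\tilde{\beta}\right) = \sigma^{2}\sum_{i=1}^{n} c_{i}^{2}.
```

By Cauchy–Schwarz, $1 = \left(\sum_{i} c_{i} x_{i}\right)^{2} \leqslant \left(\sum_{i} c_{i}^{2}\right)\left(\sum_{i} x_{i}^{2}\right)$, with equality when $c_{i} = x_{i}/\sum_{j} x_{j}^{2}$; this choice recovers the MLE $\hat{\beta}$, whose variance $\sigma^{2}/\sum_{i} x_{i}^{2}$ is therefore minimal among linear unbiased estimators.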

Write down the distribution of an arbitrary linear estimator for $\beta$. Hence show that there exists a linear, unbiased estimator $\widehat{\beta}$ for $\beta$ such that

Eβ,σ2[(β^β)4]Eβ,σ2[(β~β)4]\mathbb{E}_{\beta, \sigma^{2}}\left[(\widehat{\beta}-\beta)^{4}\right] \leqslant \mathbb{E}_{\beta, \sigma^{2}}\left[(\widetilde{\beta}-\beta)^{4}\right]

for all linear, unbiased estimators $\widetilde{\beta}$.

[Hint: If $Z \sim N\left(a, b^{2}\right)$ then $\mathbb{E}\left[(Z-a)^{4}\right]=3 b^{4}$.]
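A sketch of how the hint connects the last part to the Gauss–Markov theorem: a linear unbiased estimator $\tilde{\beta}=\sum_{i} c_{i} Y_{i}$ is normal with mean $\beta$, so its centred fourth moment is an increasing function of its variance, and minimizing variance also minimizes the fourth moment.

```latex
\tilde{\beta}-\beta \sim N\!\left(0,\; \sigma^{2}\sum_{i=1}^{n} c_{i}^{2}\right)
\;\Longrightarrow\;
\mathbb{E}_{\beta,\sigma^{2}}\left[(\tilde{\beta}-\beta)^{4}\right]
  = 3\,\sigma^{4}\left(\sum_{i=1}^{n} c_{i}^{2}\right)^{2}
  = 3\left(\operatorname{Var}\tilde{\beta}\right)^{2}.
```

Hence the minimum-variance linear unbiased estimator from the Gauss–Markov part also minimizes $\mathbb{E}_{\beta,\sigma^{2}}\left[(\tilde{\beta}-\beta)^{4}\right]$, which gives the required $\widehat{\beta}$.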
