4.II.19C

Statistics | Part IB, 2007

Consider the linear regression model

$$Y_{i}=\alpha+\beta x_{i}+\epsilon_{i}, \quad 1 \leqslant i \leqslant n,$$

where $\epsilon_{1}, \ldots, \epsilon_{n}$ are independent, identically distributed $N(0, \sigma^{2})$, $x_{1}, \ldots, x_{n}$ are known real numbers with $\sum_{i=1}^{n} x_{i}=0$, and $\alpha$, $\beta$ and $\sigma^{2}$ are unknown.
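As a minimal numerical sketch (not part of the question; the parameter values and sample size below are illustrative assumptions), one can simulate this model and check that, because $\sum_{i=1}^{n} x_{i}=0$, the least-squares estimates take the simple closed forms $\widehat{\alpha}=\bar{Y}$ and $\widehat{\beta}=\sum_i x_i Y_i / \sum_i x_i^2$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) parameter values for a simulation of the model.
n, alpha, beta, sigma = 200, 2.0, 1.5, 0.5
x = np.arange(n, dtype=float)
x -= x.mean()                      # centre so that sum(x_i) = 0, as assumed
eps = rng.normal(0.0, sigma, n)    # iid N(0, sigma^2) errors
y = alpha + beta * x + eps

# With sum(x_i) = 0 the least-squares (= maximum-likelihood) estimates
# reduce to alpha_hat = ybar and beta_hat = sum(x_i Y_i) / sum(x_i^2).
alpha_hat = y.mean()
beta_hat = (x @ y) / (x @ x)
print(alpha_hat, beta_hat)
```

The estimates should land close to the true values used in the simulation, with $\widehat{\beta}$ especially precise here because $\sum_i x_i^2$ is large.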

(i) Find the least-squares estimates $\widehat{\alpha}$ and $\widehat{\beta}$ of $\alpha$ and $\beta$, respectively, and explain why in this case they are the same as the maximum-likelihood estimates.

(ii) Determine the maximum-likelihood estimate $\widehat{\sigma}^{2}$ of $\sigma^{2}$ and find a multiple of it which is an unbiased estimate of $\sigma^{2}$.

(iii) Determine the joint distribution of $\widehat{\alpha}$, $\widehat{\beta}$ and $\widehat{\sigma}^{2}$.

(iv) Explain carefully how you would test the hypothesis $H_{0}: \alpha=\alpha_{0}$ against the alternative $H_{1}: \alpha \neq \alpha_{0}$.
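The test in part (iv) is the standard $t$-test: under $H_0$, $T=(\widehat{\alpha}-\alpha_0)/\sqrt{\tilde{\sigma}^2/n}$ has a $t_{n-2}$ distribution, where $\tilde{\sigma}^2 = \mathrm{RSS}/(n-2)$ is the unbiased variance estimate from part (ii). A hedged numerical sketch (the data-generating values are assumptions, and SciPy is assumed available for the $t$ tail probability):

```python
import numpy as np
from scipy import stats  # used only for the t_{n-2} tail probability

rng = np.random.default_rng(1)

# Illustrative data generated from the model (parameter values are assumptions).
n, alpha, beta, sigma = 50, 1.0, 2.0, 1.0
x = rng.normal(size=n)
x -= x.mean()                          # enforce sum(x_i) = 0
y = alpha + beta * x + rng.normal(0.0, sigma, n)

# Closed-form estimates, valid because sum(x_i) = 0.
alpha_hat = y.mean()
beta_hat = (x @ y) / (x @ x)

# Unbiased variance estimate: RSS / (n - 2).
rss = np.sum((y - alpha_hat - beta_hat * x) ** 2)
s2 = rss / (n - 2)

# Test H0: alpha = alpha0 via T = (alpha_hat - alpha0)/sqrt(s2/n) ~ t_{n-2}.
alpha0 = 1.0
T = (alpha_hat - alpha0) / np.sqrt(s2 / n)
p_value = 2 * stats.t.sf(abs(T), df=n - 2)
print(T, p_value)
```

Since $\widehat{\alpha}$ is independent of $\tilde{\sigma}^2$ (part (iii)), the ratio is exactly $t$-distributed, and $H_0$ is rejected at level $\gamma$ when $|T|$ exceeds the upper $\gamma/2$ point of $t_{n-2}$.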
