Paper 4, Section II, 19H

Statistics | Part IB, 2019

Consider the linear model

$$Y_{i}=\beta x_{i}+\epsilon_{i} \quad \text{for} \quad i=1, \ldots, n$$

where $x_{1}, \ldots, x_{n}$ are known and $\epsilon_{1}, \ldots, \epsilon_{n}$ are i.i.d. $N\left(0, \sigma^{2}\right)$. We assume that the parameters $\beta$ and $\sigma^{2}$ are unknown.

(a) Find the MLE $\widehat{\beta}$ of $\beta$. Explain why $\widehat{\beta}$ is the same as the least squares estimator of $\beta$.
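As a quick numerical illustration (not part of the question), the closed form of the least squares / maximum likelihood estimator in this through-the-origin model is $\widehat{\beta} = \sum_i x_i Y_i / \sum_i x_i^2$. The sketch below simulates the model with assumed values $\beta = 2$, $\sigma = 0.5$ (both chosen purely for illustration) and evaluates that formula:

```python
import numpy as np

# Illustrative sketch only: beta_true and sigma are assumed values
# chosen for the simulation, not given in the question.
rng = np.random.default_rng(0)

beta_true, sigma = 2.0, 0.5
x = np.linspace(1.0, 10.0, 50)                    # known covariates x_1, ..., x_n
y = beta_true * x + rng.normal(0.0, sigma, x.size)  # Y_i = beta * x_i + eps_i

# MLE / least squares estimator for regression through the origin:
# beta_hat = sum(x_i * Y_i) / sum(x_i^2)
beta_hat = np.sum(x * y) / np.sum(x**2)
print(beta_hat)
```

With these values the estimator's standard deviation is $\sigma/\sqrt{\sum_i x_i^2} \approx 0.012$, so the printed estimate lands very close to the assumed $\beta = 2$.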

(b) State and prove the Gauss-Markov theorem for this model.
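For orientation only (the question asks for the statement and proof, so this is a sketch of the shape of the claim in this model, not the full answer): among linear unbiased estimators, least squares has minimal variance.

```latex
% Gauss--Markov in this model: if \tilde{\beta} = \sum_i a_i Y_i is
% unbiased for every \beta, then it is no more precise than \widehat{\beta}.
\operatorname{Var}_{\beta,\sigma^{2}}(\tilde{\beta})
  \;\geqslant\;
\operatorname{Var}_{\beta,\sigma^{2}}(\widehat{\beta})
  \;=\; \frac{\sigma^{2}}{\sum_{i=1}^{n} x_{i}^{2}} .
```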

(c) For each value of $\theta \in \mathbb{R}$ with $\theta \neq 0$, determine the unbiased linear estimator $\tilde{\beta}$ of $\beta$ which minimizes

$$\mathbb{E}_{\beta, \sigma^{2}}[\exp (\theta(\tilde{\beta}-\beta))].$$
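A hedged sketch of why (c) reduces to a variance comparison: for a linear unbiased estimator $\tilde{\beta}=\sum_i a_i Y_i$ (so $\sum_i a_i x_i = 1$), normality of the errors gives $\tilde{\beta}-\beta \sim N\big(0, \sigma^{2}\sum_i a_i^{2}\big)$, and the quantity to minimize is its moment generating function:

```latex
% mgf of a centred normal: E[exp(t Z)] = exp(t^2 Var(Z) / 2)
\mathbb{E}_{\beta,\sigma^{2}}\!\left[\exp\big(\theta(\tilde{\beta}-\beta)\big)\right]
  = \exp\!\left(\tfrac{1}{2}\,\theta^{2}\sigma^{2}\sum_{i=1}^{n} a_{i}^{2}\right),
```

which, for any fixed $\theta \neq 0$, is strictly increasing in $\sum_i a_i^2$; minimizing it is therefore the same problem, for every such $\theta$, as minimizing the variance of $\tilde{\beta}$.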
