Paper 1, Section II, J

Principles of Statistics | Part II, 2014

State without proof the inequality known as the Cramér-Rao lower bound in a parametric model $\{f(\cdot, \theta): \theta \in \Theta\}$, $\Theta \subseteq \mathbb{R}$. Give an example of a maximum likelihood estimator that attains this lower bound, and justify your answer.
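For reference, the bound in question (stated here as a reminder under standard regularity assumptions, not as part of the question as set) takes the following form:

```latex
% Cramér-Rao lower bound: for an unbiased estimator \tilde{\theta} of \theta
% based on i.i.d. observations X_1, \ldots, X_n, under regularity conditions,
\operatorname{Var}_{\theta}(\tilde{\theta}) \;\ge\; \frac{1}{n\, I(\theta)},
\qquad
I(\theta) = \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial\theta}
  \log f(X, \theta)\right)^{2}\right].
% A standard example attaining the bound: in a N(\theta, 1) model the MLE
% \bar{X}_n is unbiased with variance 1/n, and I(\theta) = 1, so equality holds.
```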

Give an example of a parametric model where the maximum likelihood estimator based on observations $X_1, \ldots, X_n$ is biased. State without proof an analogue of the Cramér-Rao inequality for biased estimators.
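As a reminder (again not part of the question as set), one standard form of the biased analogue is:

```latex
% If \mathbb{E}_{\theta}[\tilde{\theta}] = \psi(\theta) with \psi differentiable,
% then under the same regularity conditions
\operatorname{Var}_{\theta}(\tilde{\theta}) \;\ge\;
  \frac{\psi'(\theta)^{2}}{n\, I(\theta)}.
% A classical example of a biased MLE: in a N(\mu, \sigma^2) model the MLE of
% \sigma^2 is \hat{\sigma}^2 = n^{-1} \sum_{i=1}^{n} (X_i - \bar{X}_n)^2, with
% \mathbb{E}[\hat{\sigma}^2] = (n-1)\sigma^2 / n \neq \sigma^2.
```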

Define the concept of a minimax decision rule, and show that the maximum likelihood estimator $\hat{\theta}_{MLE}$ based on $X_1, \ldots, X_n$ in a $N(\theta, 1)$ model is minimax for estimating $\theta \in \Theta = \mathbb{R}$ in quadratic risk.
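For reference, the minimax property and the shape of a standard argument (a sketch, not the full solution) are:

```latex
% A decision rule \delta^{*} is minimax for risk R if
\sup_{\theta \in \Theta} R(\theta, \delta^{*})
  \;=\; \inf_{\delta} \sup_{\theta \in \Theta} R(\theta, \delta).
% In the N(\theta, 1) model, \hat{\theta}_{MLE} = \bar{X}_n has constant
% quadratic risk R(\theta, \bar{X}_n) = \mathbb{E}_{\theta}[(\bar{X}_n - \theta)^2] = 1/n.
% One standard route to minimaxity: compare this constant risk with the Bayes
% risks under N(0, \tau^2) priors, which tend to 1/n as \tau \to \infty.
```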
