Paper 3, Section II, I

Principles of Statistics | Part II, 2009

What is meant by an equaliser decision rule? What is meant by an extended Bayes rule? Show that a decision rule that is both an equaliser rule and extended Bayes is minimax.
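For orientation, one standard route to the last part, writing $R(\theta, \delta)$ for the risk of a rule $\delta$ and $r(\pi, \delta)$ for its Bayes risk under a prior $\pi$ (notation assumed here, not fixed by the question): suppose $\delta$ is an equaliser rule, so $R(\theta, \delta) = C$ for all $\theta$, and extended Bayes, so for every $\varepsilon > 0$ there is a prior $\pi_{\varepsilon}$ with $r(\pi_{\varepsilon}, \delta) \le \inf_{\delta''} r(\pi_{\varepsilon}, \delta'') + \varepsilon$. Then for any competing rule $\delta'$,

$$
\sup_{\theta} R(\theta, \delta') \;\ge\; r(\pi_{\varepsilon}, \delta') \;\ge\; r(\pi_{\varepsilon}, \delta) - \varepsilon \;=\; C - \varepsilon,
$$

and letting $\varepsilon \downarrow 0$ gives $\sup_{\theta} R(\theta, \delta') \ge C = \sup_{\theta} R(\theta, \delta)$, so $\delta$ is minimax.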

Let $X_{1}, \ldots, X_{n}$ be independent and identically distributed random variables with the normal distribution $\mathcal{N}(\theta, h^{-1})$, and let $k>0$. It is desired to estimate $\theta$ with loss function $L(\theta, a) = 1 - \exp\left\{-\frac{1}{2} k (a-\theta)^{2}\right\}$.
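This loss is tractable because of a Gaussian integral worth recording (a standard identity, stated here as an aid rather than as part of the question): if $Z \sim \mathcal{N}(\mu, \tau^{-1})$, then

$$
\mathbb{E}\left[\exp\left\{-\tfrac{1}{2} k Z^{2}\right\}\right] = \left(\frac{\tau}{\tau + k}\right)^{1/2} \exp\left\{-\frac{1}{2} \frac{k\tau}{k + \tau} \mu^{2}\right\},
$$

obtained by completing the square in the exponent. Taking $Z = a - \theta$ under a normal posterior for $\theta$ reduces the expected loss to closed form.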

Suppose the prior distribution is $\theta \sim \mathcal{N}(m_{0}, h_{0}^{-1})$. Find the Bayes act and the Bayes loss posterior to observing $X_{1}=x_{1}, \ldots, X_{n}=x_{n}$. What is the Bayes risk of the Bayes rule with respect to this prior distribution?
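A sketch of the conjugate computation, for reference (the notation $h_{n}$, $m_{n}$ is introduced here and is not from the question): the posterior is $\theta \mid x \sim \mathcal{N}(m_{n}, h_{n}^{-1})$ with precision $h_{n} = h_{0} + nh$ and mean $m_{n} = (h_{0} m_{0} + n h \bar{x})/h_{n}$. By the identity above, the posterior expected loss of an act $a$ is

$$
1 - \left(\frac{h_{n}}{h_{n} + k}\right)^{1/2} \exp\left\{-\frac{1}{2} \frac{k h_{n}}{k + h_{n}} (a - m_{n})^{2}\right\},
$$

which is minimised at the Bayes act $a = m_{n}$, giving Bayes loss $1 - \{h_{n}/(h_{n}+k)\}^{1/2}$. Since this is free of the data, the Bayes risk of the Bayes rule equals the same constant.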

Show that the rule that estimates $\theta$ by $\bar{X} = n^{-1} \sum_{i=1}^{n} X_{i}$ is minimax.
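A sketch combining the pieces above: since $\bar{X} - \theta \sim \mathcal{N}(0, (nh)^{-1})$, the identity with $\mu = 0$ and $\tau = nh$ gives the constant risk

$$
R(\theta, \bar{X}) = 1 - \left(\frac{nh}{nh + k}\right)^{1/2},
$$

so $\bar{X}$ is an equaliser rule. Under the prior $\mathcal{N}(m_{0}, h_{0}^{-1})$ its Bayes risk is this same constant, while the Bayes rule attains $1 - \{(h_{0}+nh)/(h_{0}+nh+k)\}^{1/2}$; the gap tends to $0$ as $h_{0} \downarrow 0$, so $\bar{X}$ is extended Bayes, and by the first part it is minimax.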
