Paper 1, Section II, H

Statistics | Part IB, 2011

Let $X_1, \ldots, X_n$ be independent random variables with probability mass function $f(x; \theta)$, where $\theta$ is an unknown parameter.

(i) What does it mean to say that $T$ is a sufficient statistic for $\theta$? State, but do not prove, the factorisation criterion for sufficiency.

(ii) State and prove the Rao-Blackwell theorem.

Now consider the case where $f(x; \theta) = \frac{1}{x!}(-\log \theta)^{x}\,\theta$ for non-negative integer $x$ and $0 < \theta < 1$.
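A quick way to see the structure of this family (an editorial aside, not part of the original question): writing $\mu = -\log\theta$, so that $\theta = e^{-\mu}$, the mass function rearranges to

$$f(x; \theta) \;=\; \frac{1}{x!}(-\log\theta)^{x}\,\theta \;=\; \frac{\mu^{x} e^{-\mu}}{x!}, \qquad x = 0, 1, 2, \ldots,$$

that is, each $X_i$ has the Poisson distribution with mean $-\log\theta$, which is consistent with the fact quoted in part (v).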

(iii) Find a one-dimensional sufficient statistic $T$ for $\theta$.

(iv) Show that $\tilde{\theta} = \mathbb{1}_{\{X_1 = 0\}}$ is an unbiased estimator of $\theta$.

(v) Find another unbiased estimator $\widehat{\theta}$ which is a function of the sufficient statistic $T$ and has smaller variance than $\tilde{\theta}$. You may use the following fact without proof: $X_1 + \cdots + X_n$ has the Poisson distribution with parameter $-n \log \theta$.
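For readers wanting a numerical sanity check, here is a minimal Monte Carlo sketch (not part of the Tripos question). It simulates the model above, confirms empirically that $\tilde{\theta}$ is unbiased, and compares its variance with that of the conditional-expectation estimator $\widehat{\theta} = (1 - 1/n)^{T}$ obtained by Rao–Blackwellising $\tilde{\theta}$; that closed form, and the parameter values `theta = 0.4` and `n = 10`, are illustrative assumptions supplied here rather than quoted from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 0.4          # true parameter, 0 < theta < 1 (arbitrary illustrative value)
mu = -np.log(theta)  # under f(x; theta), each X_i is Poisson with mean -log(theta)
n = 10               # sample size (arbitrary illustrative value)
reps = 200_000       # Monte Carlo replications

# reps independent samples X_1, ..., X_n, one sample per row.
X = rng.poisson(mu, size=(reps, n))
T = X.sum(axis=1)    # sufficient statistic: T = X_1 + ... + X_n

# Part (iv): theta_tilde is the indicator that X_1 = 0.
theta_tilde = (X[:, 0] == 0).astype(float)

# Illustrative candidate for part (v): Rao-Blackwellising theta_tilde gives
# E[theta_tilde | T] = (1 - 1/n)^T, since X_1 | T = t is Binomial(t, 1/n).
theta_hat = (1.0 - 1.0 / n) ** T

print("true theta          ", theta)
print("mean of theta_tilde ", theta_tilde.mean())  # ~ theta: unbiased
print("mean of theta_hat   ", theta_hat.mean())    # ~ theta: unbiased
print("var of theta_tilde  ", theta_tilde.var())
print("var of theta_hat    ", theta_hat.var())     # noticeably smaller
# Hint in (v): T is Poisson with parameter -n log(theta) = n * mu.
print("mean of T vs n*mu   ", T.mean(), n * mu)
```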
