Paper 4, Section II, J

Principles of Statistics | Part II, 2021

Suppose that $X \mid \theta \sim \operatorname{Poisson}(\theta)$, $\theta>0$, and suppose the prior $\pi$ on $\theta$ is a gamma distribution with parameters $\alpha>0$ and $\beta>0$. [Recall that $\pi$ has probability density function

$$f(z)=\frac{\beta^{\alpha}}{\Gamma(\alpha)} z^{\alpha-1} e^{-\beta z}, \quad z>0,$$

and that its mean and variance are $\alpha/\beta$ and $\alpha/\beta^{2}$, respectively.]

(a) Find the $\pi$-Bayes estimator for $\theta$ for the quadratic loss, and derive its quadratic risk function.
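As a starting point, under quadratic loss the Bayes estimator is the posterior mean, and with a $\operatorname{Gamma}(\alpha,\beta)$ prior and a single Poisson observation $x$ the posterior is $\operatorname{Gamma}(\alpha+x,\beta+1)$, so the estimator is $(\alpha+x)/(\beta+1)$. The sketch below checks this conjugacy fact numerically; the values of `alpha`, `beta`, and `x` are arbitrary illustrations, not part of the question.

```python
import math

# Arbitrary illustrative hyperparameters and observed count.
alpha, beta, x = 2.5, 1.5, 4

# Unnormalised posterior: Gamma(alpha, beta) prior times the Poisson(theta)
# likelihood at x, proportional to theta^(alpha + x - 1) * exp(-(beta + 1) * theta).
def post(theta):
    return theta ** (alpha + x - 1) * math.exp(-(beta + 1) * theta)

# Crude Riemann-sum integration on a fine grid; adequate for a sanity check.
h = 0.001
grid = [i * h for i in range(1, 40000)]
norm = sum(post(t) for t in grid) * h
mean = sum(t * post(t) for t in grid) * h / norm

closed_form = (alpha + x) / (beta + 1)
print(mean, closed_form)  # the two values should agree to several decimal places
```

The risk function asked for in the question is then obtained by computing $\mathbb{E}_\theta\big[((\alpha+X)/(\beta+1)-\theta)^2\big]$ directly from the Poisson mean and variance.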

(b) Suppose we wish to estimate $\mu=e^{-\theta}=\mathbb{P}_{\theta}(X=0)$. Find the $\pi$-Bayes estimator for $\mu$ for the quadratic loss, and derive its quadratic risk function. [Hint: The moment generating function of a $\operatorname{Poisson}(\theta)$ distribution is $M(t)=\exp\left(\theta\left(e^{t}-1\right)\right)$ for $t \in \mathbb{R}$, and that of a $\operatorname{Gamma}(\alpha, \beta)$ distribution is $M(t)=(1-t/\beta)^{-\alpha}$ for $t<\beta$.]
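Following the hint, the posterior mean of $\mu=e^{-\theta}$ is the gamma MGF of the $\operatorname{Gamma}(\alpha+x,\beta+1)$ posterior evaluated at $t=-1$, i.e. $\left((\beta+1)/(\beta+2)\right)^{\alpha+x}$. A numerical check of that identity (again with arbitrary illustrative values of `alpha`, `beta`, `x`):

```python
import math

alpha, beta, x = 2.5, 1.5, 4  # arbitrary illustration, as in part (a)

# Posterior is Gamma(alpha + x, beta + 1); by the gamma MGF at t = -1, the
# posterior mean of exp(-theta) is ((beta + 1) / (beta + 2))^(alpha + x).
a_post, b_post = alpha + x, beta + 1
closed_form = (b_post / (b_post + 1)) ** a_post

def post(theta):  # unnormalised posterior density
    return theta ** (a_post - 1) * math.exp(-b_post * theta)

h = 0.001
grid = [i * h for i in range(1, 40000)]
norm = sum(post(t) for t in grid) * h
mean_mu = sum(math.exp(-t) * post(t) for t in grid) * h / norm

print(mean_mu, closed_form)  # the two values should agree closely
```

The Poisson MGF in the hint plays the analogous role when computing the frequentist risk of this estimator.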

(c) State a sufficient condition for an admissible estimator to be minimax, and give a proof of this fact.
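One classical condition of this type, sketched here in outline (and not necessarily the exact formulation the examiners intend), is that an admissible estimator with constant risk is minimax:

```latex
Let $\hat\theta$ be admissible with constant risk $R(\theta,\hat\theta)=c$ for all
$\theta$. If $\hat\theta$ were not minimax, some estimator $\tilde\theta$ would satisfy
\[
  \sup_\theta R(\theta,\tilde\theta) < \sup_\theta R(\theta,\hat\theta) = c,
\]
so $R(\theta,\tilde\theta) < c = R(\theta,\hat\theta)$ for every $\theta$; that is,
$\tilde\theta$ strictly dominates $\hat\theta$, contradicting admissibility.
```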

(d) For each of the estimators in parts (a) and (b), is it possible to deduce using the condition in (c) that the estimator is minimax for some value of $\alpha$ and $\beta$? Justify your answer.
