Paper 3, Section II, 27K

Principles of Statistics | Part II, 2011

Random variables $X_1, X_2, \ldots$ are independent and identically distributed from the exponential distribution $\mathcal{E}(\theta)$, with density function

$$p_X(x \mid \theta) = \theta e^{-\theta x} \quad (x > 0),$$

when the parameter $\Theta$ takes value $\theta > 0$. The following experiment is performed. First $X_1$ is observed. Thereafter, if $X_1 = x_1, \ldots, X_i = x_i$ have been observed ($i \geqslant 1$), a coin having probability $\alpha(x_i)$ of landing heads is tossed, where $\alpha: \mathbb{R} \rightarrow (0,1)$ is a known function and the coin toss is independent of the $X$'s and previous tosses. If it lands heads, no further observations are made; if tails, $X_{i+1}$ is observed.

Let $N$ be the total number of $X$'s observed, and $\mathbf{X} := (X_1, \ldots, X_N)$. Write down the likelihood function for $\Theta$ based on data $\mathbf{X} = (x_1, \ldots, x_n)$, and identify a minimal sufficient statistic. What does the likelihood principle have to say about inference from this experiment?
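A sketch of one possible answer (not part of the original question paper): since the coin tosses do not involve $\theta$, their contribution to the likelihood is a constant factor.

```latex
% Likelihood for observed data (x_1,\dots,x_n) with N = n: the coin-toss
% factors \alpha(x_n)\prod_{i<n}(1-\alpha(x_i)) are free of \theta, so
L(\theta; \mathbf{x})
  = \Bigl[\prod_{i=1}^{n} \theta e^{-\theta x_i}\Bigr]
    \alpha(x_n)\prod_{i=1}^{n-1}\bigl(1 - \alpha(x_i)\bigr)
  \;\propto\; \theta^{n} \exp\Bigl(-\theta \sum_{i=1}^{n} x_i\Bigr).
% Hence (N, \sum_{i=1}^{N} X_i) is minimal sufficient, and the likelihood
% principle implies inference should be the same as if the sample size n
% had been fixed in advance, the stopping rule being irrelevant.
```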

Now consider the experiment that only records $Y := X_N$. Show that the density function of $Y$ has the form

$$p_Y(y \mid \theta) = \exp\{a(y) - k(\theta) - \theta y\}.$$
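One route to this form, sketched here as a hedged outline rather than a model answer: condition on the stopping time $N$ and sum the resulting geometric series.

```latex
% Sketch: write \beta(\theta) := \mathbb{E}_\theta[1 - \alpha(X_1)]. Since the
% toss after X_i depends only on x_i, integrating out X_1,\dots,X_{n-1} gives
% P(N = n, X_n \in dy) = \beta(\theta)^{n-1}\,\theta e^{-\theta y}\alpha(y)\,dy,
% and summing over n \geq 1,
p_Y(y \mid \theta)
  = \frac{\theta e^{-\theta y}\,\alpha(y)}{\mathbb{E}_\theta[\alpha(X_1)]}
  = \exp\{a(y) - k(\theta) - \theta y\},
% which is of the stated form with
a(y) = \log \alpha(y), \qquad
k(\theta) = \log \mathbb{E}_\theta[\alpha(X_1)] - \log \theta.
```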

Assuming the function $a(\cdot)$ is twice differentiable and that both $p_Y(y \mid \theta)$ and $\partial p_Y(y \mid \theta)/\partial y$ vanish at $0$ and $\infty$, show that $a'(Y)$ is an unbiased estimator of $\Theta$, and find its variance.
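A sketch of the intended argument (an outline under the stated boundary assumptions, not the official solution): differentiate the density in $y$ and integrate over $(0, \infty)$.

```latex
% Differentiating the density in y:
\frac{\partial p_Y}{\partial y} = \bigl(a'(y) - \theta\bigr)\,p_Y(y \mid \theta).
% Integrating over (0,\infty), the left side vanishes because p_Y vanishes
% at 0 and \infty, so \mathbb{E}_\theta[a'(Y)] = \theta: unbiased.
% Differentiating once more:
\frac{\partial^2 p_Y}{\partial y^2}
  = \Bigl(a''(y) + \bigl(a'(y) - \theta\bigr)^2\Bigr)\,p_Y(y \mid \theta),
% and integrating, using that \partial p_Y/\partial y vanishes at 0 and \infty,
% 0 = \mathbb{E}_\theta[a''(Y)] + \operatorname{Var}_\theta\bigl(a'(Y)\bigr),
% since \mathbb{E}_\theta[a'(Y)] = \theta. Hence
\operatorname{Var}_\theta\bigl(a'(Y)\bigr) = -\mathbb{E}_\theta\bigl[a''(Y)\bigr].
```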

Stating clearly any general results you use, deduce that

$$-k''(\theta)\,\mathbb{E}_\theta\{a''(Y)\} \geqslant 1.$$
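The general result needed here is the Cramér–Rao lower bound; a sketch of how it yields the inequality (assuming the regularity conditions hold for this family):

```latex
% Score and Fisher information for the family p_Y(y \mid \theta):
\frac{\partial}{\partial\theta} \log p_Y(y \mid \theta) = -k'(\theta) - y,
\qquad
I(\theta) = -\mathbb{E}_\theta\Bigl[\tfrac{\partial^2}{\partial\theta^2}
  \log p_Y(Y \mid \theta)\Bigr] = k''(\theta).
% Cramer-Rao for the unbiased estimator a'(Y) of \theta:
\operatorname{Var}_\theta\bigl(a'(Y)\bigr)
  = -\mathbb{E}_\theta[a''(Y)] \;\geqslant\; \frac{1}{k''(\theta)},
% and since k''(\theta) = \operatorname{Var}_\theta(Y) > 0, multiplying
% through by k''(\theta) gives -k''(\theta)\,\mathbb{E}_\theta\{a''(Y)\} \geq 1.
```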
