Paper 2, Section II, J

Principles of Statistics | Part II, 2019

(a) We consider the model $\{\operatorname{Poisson}(\theta) : \theta \in (0, \infty)\}$ and an i.i.d. sample $X_1, \ldots, X_n$ from it. Compute the expectation and variance of $X_1$ and check they are equal. Find the maximum likelihood estimator $\hat{\theta}_{\mathrm{MLE}}$ for $\theta$ and, using its form, derive the limit in distribution of $\sqrt{n}\left(\hat{\theta}_{\mathrm{MLE}} - \theta\right)$.
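A minimal simulation sketch (in Python with NumPy, which the question itself does not involve; $\theta$, $n$ and the number of replications are illustrative) for checking the limit in distribution numerically, using the standard fact that the Poisson MLE is the sample mean:

```python
# Sanity-check sketch: does sqrt(n) * (theta_hat - theta) look like N(0, theta)?
# Parameter values are illustrative, not part of the original question.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 3.0, 5_000, 1_000

samples = rng.poisson(theta, size=(reps, n))
theta_hat = samples.mean(axis=1)          # Poisson MLE: the sample mean
z = np.sqrt(n) * (theta_hat - theta)      # rescaled estimation error

print(f"empirical mean of z:     {z.mean():+.3f}  (limit N(0, theta) has mean 0)")
print(f"empirical variance of z: {z.var():.3f}  (limit has variance theta = {theta})")
```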

(b) In practice, Poisson-looking data often show overdispersion, i.e., the sample variance is larger than the sample mean. For $\pi \in [0,1]$ and $\lambda \in (0, \infty)$, let $p_{\pi, \lambda} : \mathbb{N}_0 \rightarrow [0,1]$,

$$k \mapsto p_{\pi, \lambda}(k) = \begin{cases} \pi e^{-\lambda} \dfrac{\lambda^{k}}{k!} & \text{for } k \geqslant 1, \\[4pt] (1-\pi) + \pi e^{-\lambda} & \text{for } k = 0. \end{cases}$$

Show that this defines a distribution. Does it model overdispersion? Justify your answer.
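A short numerical sketch (assuming NumPy and SciPy; the values of $\pi$ and $\lambda$ are illustrative) that checks the total mass is 1 and compares the mean with the variance of $p_{\pi,\lambda}$, one way to probe the overdispersion claim:

```python
# Check that p_{pi,lam} sums to 1 and compare its mean with its variance.
import numpy as np
from scipy.stats import poisson

pi, lam = 0.6, 3.0
k = np.arange(0, 200)                    # large enough to capture all the mass

p = pi * poisson.pmf(k, lam)             # pi * e^{-lam} * lam^k / k!  for k >= 1
p[0] = (1 - pi) + pi * np.exp(-lam)      # extra point mass at k = 0

mean = np.sum(k * p)
var = np.sum(k**2 * p) - mean**2

print(f"total mass: {p.sum():.6f}  (expect 1)")
print(f"mean {mean:.4f}, variance {var:.4f}  (variance strictly larger for 0 < pi < 1)")
```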

(c) Let $Y_1, \ldots, Y_n$ be an i.i.d. sample from $p_{\pi, \lambda}$. Assume $\lambda$ is known. Find the maximum likelihood estimator $\hat{\pi}_{\mathrm{MLE}}$ for $\pi$.
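A minimal sketch (assuming NumPy and SciPy; $\lambda$, $\pi$ and the sample size are illustrative) that maximises the log-likelihood in $\pi$ numerically, which can serve as a check on the closed-form estimator the question asks for. The log-likelihood splits over the zero and strictly positive observations, by the definition of $p_{\pi,\lambda}$ above:

```python
# Numerical MLE for pi with lam known, as a check on the closed-form derivation.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

rng = np.random.default_rng(1)
lam, pi_true, n = 2.0, 0.7, 50_000

# Draw from p_{pi,lam}: a Poisson(lam) value kept with probability pi, else 0
# (this mixture representation matches the pmf in part (b)).
y = rng.poisson(lam, n) * (rng.random(n) < pi_true)

n0 = np.sum(y == 0)                                 # number of zeros
npos = n - n0                                       # number of positive values
logpmf_pos = poisson.logpmf(y[y > 0], lam).sum()    # constant in pi

def neg_loglik(pi):
    # l(pi) = n0 * log((1-pi) + pi e^{-lam}) + npos * log(pi) + const
    return -(n0 * np.log((1 - pi) + pi * np.exp(-lam))
             + npos * np.log(pi) + logpmf_pos)

res = minimize_scalar(neg_loglik, bounds=(1e-9, 1.0), method="bounded")
print(f"numerical MLE of pi: {res.x:.4f}  (true value {pi_true})")
```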

(d) Furthermore, assume that, for any $\pi \in [0,1]$, $\sqrt{n}\left(\hat{\pi}_{\mathrm{MLE}} - \pi\right)$ converges in distribution to a random variable $Z$ as $n \rightarrow \infty$. Suppose we wanted to test the null hypothesis that our data arises from the model in part (a). Before making any further computations, can we necessarily expect $Z$ to follow a normal distribution under the null hypothesis? Explain. Check your answer by computing the appropriate distribution.
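A simulation sketch (assuming NumPy and SciPy; all values illustrative) probing the behaviour of $\hat{\pi}_{\mathrm{MLE}}$ under the null of part (a), where the data are $\operatorname{Poisson}(\lambda)$ and hence $\pi = 1$ lies on the boundary of $[0,1]$. A point mass in the empirical distribution of $\hat{\pi}_{\mathrm{MLE}}$ at the boundary would be incompatible with a normal limit for $Z$:

```python
# Under the null the data are Poisson(lam), i.e. pi = 1, a boundary point of
# [0, 1]. The sketch counts how often the MLE is pinned at that boundary.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

rng = np.random.default_rng(2)
lam, n, reps = 2.0, 2_000, 500

def pi_mle(y):
    # numerical MLE of pi over [0, 1] for known lam, as in part (c)
    n0 = np.sum(y == 0)
    npos = len(y) - n0
    logpmf_pos = poisson.logpmf(y[y > 0], lam).sum()   # constant in pi
    def neg_loglik(pi):
        return -(n0 * np.log((1 - pi) + pi * np.exp(-lam))
                 + npos * np.log(pi) + logpmf_pos)
    return minimize_scalar(neg_loglik, bounds=(1e-9, 1.0), method="bounded").x

pi_hat = np.array([pi_mle(rng.poisson(lam, n)) for _ in range(reps)])

# A normal limit for Z would put no mass at a single point; a boundary effect
# shows up as roughly half the replications with pi_hat pinned at 1.
print(f"fraction with pi_hat at the boundary: {np.mean(pi_hat > 1 - 1e-4):.2f}")
```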

[You may use results from the course, provided you state them clearly.]
