Paper 4, Section II, J

Principles of Statistics | Part II, 2019

We consider a statistical model $\{f(\cdot, \theta): \theta \in \Theta\}$.

(a) Define the maximum likelihood estimator (MLE) and the Fisher information $I(\theta)$.
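For reference, the standard definitions from the course are recalled below for an i.i.d. sample $X_{1}, \ldots, X_{n}$; the equality of the two expressions for $I(\theta)$ assumes the usual regularity conditions:

$$
\hat{\theta}_{\mathrm{MLE}} \in \arg\max_{\theta \in \Theta} \prod_{i=1}^{n} f(X_{i}, \theta),
\qquad
I(\theta)=\mathbb{E}_{\theta}\left[\left(\frac{\partial}{\partial \theta} \log f(X, \theta)\right)^{2}\right]
=-\mathbb{E}_{\theta}\left[\frac{\partial^{2}}{\partial \theta^{2}} \log f(X, \theta)\right].
$$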

(b) Let $\Theta=\mathbb{R}$ and assume there exist a continuous one-to-one function $\mu: \mathbb{R} \rightarrow \mathbb{R}$ and a real-valued function $h$ such that

$$
\mathbb{E}_{\theta}[h(X)]=\mu(\theta) \quad \forall \theta \in \mathbb{R}.
$$

(i) For $X_{1}, \ldots, X_{n}$ i.i.d. from the model for some $\theta_{0} \in \mathbb{R}$, give the limit in the almost sure sense of

$$
\hat{\mu}_{n}=\frac{1}{n} \sum_{i=1}^{n} h\left(X_{i}\right).
$$

Give a consistent estimator $\hat{\theta}_{n}$ of $\theta_{0}$ in terms of $\hat{\mu}_{n}$.
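A minimal numerical sketch of this plug-in construction, using a hypothetical Poisson model with rate $e^{\theta}$ and $h(x)=x$; the model, parameter value and sample size are illustrative assumptions, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model for illustration: X ~ Poisson(e^theta), theta in R,
# with h(x) = x, so mu(theta) = E_theta[h(X)] = exp(theta), which is
# continuous, one-to-one and strictly increasing on R.
theta0, n = 0.7, 10_000
x = rng.poisson(lam=np.exp(theta0), size=n)

mu_hat = x.mean()            # SLLN: mu_hat -> mu(theta0) almost surely
theta_hat = np.log(mu_hat)   # plug-in: theta_hat = mu^{-1}(mu_hat)

print(theta0, theta_hat)     # close for large n (consistency)
```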

(ii) Assume further that $\mathbb{E}_{\theta_{0}}\left[h(X)^{2}\right]<\infty$ and that $\mu$ is continuously differentiable and strictly monotone. What is the limit in distribution of $\sqrt{n}\left(\hat{\theta}_{n}-\theta_{0}\right)$? Assume too that the statistical model satisfies the usual regularity assumptions. Do you necessarily expect $\operatorname{Var}\left(\hat{\theta}_{n}\right) \geqslant\left(n I\left(\theta_{0}\right)\right)^{-1}$ for all $n$? Why?
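Continuing the same hypothetical Poisson example, the sketch below compares the Monte Carlo variance of $\sqrt{n}(\hat{\theta}_{n}-\theta_{0})$ with the asymptotic variance the delta method would predict, namely $\operatorname{Var}_{\theta_{0}}(h(X))/\mu'(\theta_{0})^{2}$; again the model and the sample sizes are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same hypothetical Poisson(e^theta) model as above.
theta0, n, reps = 0.7, 2_000, 5_000
x = rng.poisson(lam=np.exp(theta0), size=(reps, n))
theta_hat = np.log(x.mean(axis=1))      # plug-in estimator on each replicate

z = np.sqrt(n) * (theta_hat - theta0)   # rescaled estimation error

# For this model Var_theta0(h(X)) = e^theta0 and mu'(theta0) = e^theta0,
# so the delta-method variance is exp(-theta0).
print(z.var(), np.exp(-theta0))         # the two should be close
```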

(iii) Propose an alternative estimator for $\theta_{0}$ with smaller bias than $\hat{\theta}_{n}$ if $B_{n}\left(\theta_{0}\right)=\mathbb{E}_{\theta_{0}}\left[\hat{\theta}_{n}\right]-\theta_{0}=\frac{a}{n}+\frac{b}{n^{2}}+O\left(\frac{1}{n^{3}}\right)$ for some $a, b \in \mathbb{R}$ with $a \neq 0$.
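One standard construction (a sketch only; other bias-reduction devices from the course would also work) combines the full-sample estimator with the same estimator $\hat{\theta}_{n/2}$ computed from the first $\lfloor n/2\rfloor$ observations, via $\tilde{\theta}_{n}=2\hat{\theta}_{n}-\hat{\theta}_{n/2}$. Taking $n$ even for simplicity, the assumed bias expansion gives

$$
\mathbb{E}_{\theta_{0}}\left[\tilde{\theta}_{n}\right]-\theta_{0}
=2\left(\frac{a}{n}+\frac{b}{n^{2}}\right)-\left(\frac{2a}{n}+\frac{4b}{n^{2}}\right)+O\left(\frac{1}{n^{3}}\right)
=-\frac{2b}{n^{2}}+O\left(\frac{1}{n^{3}}\right),
$$

so the leading $a/n$ term cancels.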

(iv) Further to all the assumptions in (iii), assume that the MLE for $\theta_{0}$ is of the form

$$
\hat{\theta}_{\mathrm{MLE}}=\frac{1}{n} \sum_{i=1}^{n} h\left(X_{i}\right).
$$

What is the link between the Fisher information at $\theta_{0}$ and the variance of $h(X)$? What does this mean in terms of the precision of the estimator, and why?
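One way to see the link (sketched here under the stated regularity assumptions) is to compare the two limiting distributions available for the same quantity: asymptotic normality of the MLE from the course, and the central limit theorem for the sample mean of $h(X_{i})$, whose centring at $\theta_{0}$ follows because consistency of the MLE forces $\mu(\theta_{0})=\theta_{0}$ here:

$$
\sqrt{n}\left(\hat{\theta}_{\mathrm{MLE}}-\theta_{0}\right) \xrightarrow{d} N\left(0, I(\theta_{0})^{-1}\right),
\qquad
\sqrt{n}\left(\frac{1}{n}\sum_{i=1}^{n} h(X_{i})-\theta_{0}\right) \xrightarrow{d} N\left(0, \operatorname{Var}_{\theta_{0}}(h(X))\right).
$$

Since the two estimators coincide, the limiting variances must agree.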

[You may use results from the course, provided you state them clearly.]
