Paper 2, Section II, I

Principles of Statistics | Part II, 2009

Suppose that the random vector $\mathbf{X}=\left(X_{1}, \ldots, X_{n}\right)$ has a distribution over $\mathbb{R}^{n}$ depending on a real parameter $\theta$, with everywhere positive density function $p(\mathbf{x} \mid \theta)$. Define the maximum likelihood estimator $\hat{\theta}$, the score variable $U$, the observed information $\hat{j}$ and the expected (Fisher) information $I$ for the problem of estimating $\theta$ from $\mathbf{X}$.
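For orientation, one common set of conventions for these quantities (a sketch only; the course notation may differ slightly, and the last identity assumes the usual regularity conditions) is

$$\hat{\theta}(\mathbf{x}) = \arg\max_{\theta}\, p(\mathbf{x} \mid \theta), \qquad U(\theta) = \frac{\partial}{\partial \theta} \log p(\mathbf{X} \mid \theta),$$

$$\hat{j} = -\left.\frac{\partial^{2}}{\partial \theta^{2}} \log p(\mathbf{x} \mid \theta)\right|_{\theta=\hat{\theta}}, \qquad I(\theta) = \mathbb{E}_{\theta}\!\left[U(\theta)^{2}\right] = -\mathbb{E}_{\theta}\!\left[\frac{\partial^{2}}{\partial \theta^{2}} \log p(\mathbf{X} \mid \theta)\right].$$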

For the case where the $\left(X_{i}\right)$ are independent and identically distributed, show that, as $n \rightarrow \infty$, $I^{-1 / 2} U \stackrel{d}{\rightarrow} \mathcal{N}(0,1)$. [You may assume sufficient conditions to allow interchange of integration over the sample space and differentiation with respect to the parameter.] State the asymptotic distribution of $\hat{\theta}$.
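A brief sketch of the usual argument (under the stated regularity conditions, and writing $p_{1}$ for the density of a single observation): in the i.i.d. case $U(\theta)=\sum_{i=1}^{n} \frac{\partial}{\partial \theta} \log p_{1}(X_{i} \mid \theta)$ is a sum of i.i.d. terms, each with mean $0$ and variance $i_{1}(\theta)$, the Fisher information per observation, so $I = n\, i_{1}(\theta)$ and the central limit theorem gives $I^{-1/2} U \stackrel{d}{\rightarrow} \mathcal{N}(0,1)$. The corresponding asymptotic statement for the maximum likelihood estimator is

$$I^{1/2}(\hat{\theta}-\theta) \stackrel{d}{\rightarrow} \mathcal{N}(0,1),$$

i.e. $\hat{\theta}$ is approximately $\mathcal{N}(\theta, I^{-1})$ for large $n$.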

The random vector $\mathbf{X}=\left(X_{1}, \ldots, X_{n}\right)$ is generated according to the rule

$$X_{i+1}=\theta X_{i}+E_{i},$$

where $X_{0}=1$ and the $\left(E_{i}\right)$ are independent and identically distributed from the standard normal distribution $\mathcal{N}(0,1)$. Write down the likelihood function for $\theta$ based on data $\mathbf{x}=\left(x_{1}, \ldots, x_{n}\right)$, find $\hat{\theta}$ and $\hat{j}$ and show that the pair $(\hat{\theta}, \hat{j})$ forms a minimal sufficient statistic.
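As a rough check on the algebra (one possible route, not necessarily the intended solution): conditionally on $X_{i}=x_{i}$ the next observation is $\mathcal{N}(\theta x_{i}, 1)$, so, with $x_{0}=1$,

$$L(\theta ; \mathbf{x}) = (2\pi)^{-n/2} \exp\!\left(-\tfrac{1}{2} \sum_{i=1}^{n} (x_{i}-\theta x_{i-1})^{2}\right),$$

and differentiating the log-likelihood gives

$$\hat{\theta} = \frac{\sum_{i=1}^{n} x_{i-1} x_{i}}{\sum_{i=1}^{n} x_{i-1}^{2}}, \qquad \hat{j} = \sum_{i=1}^{n} x_{i-1}^{2}.$$

Minimal sufficiency can then be checked by noting that the ratio $p(\mathbf{x} \mid \theta)/p(\mathbf{y} \mid \theta)$ is constant in $\theta$ exactly when the two data sets share the same values of $\sum x_{i-1} x_{i}$ and $\sum x_{i-1}^{2}$, i.e. the same $(\hat{\theta}, \hat{j})$.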

A Bayesian uses the improper prior density $\pi(\theta) \propto 1$. Show that, in the posterior, $S(\theta-\hat{\theta})$ (where $S$ is a statistic that you should identify) has the same distribution as $E_{1}$.
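A sketch of the posterior calculation under this prior (again, only one possible route): completing the square in the exponent of the likelihood gives

$$\pi(\theta \mid \mathbf{x}) \propto \exp\!\left(-\tfrac{1}{2} \sum_{i=1}^{n}(x_{i}-\theta x_{i-1})^{2}\right) \propto \exp\!\left(-\tfrac{1}{2}\, \hat{j}\,(\theta-\hat{\theta})^{2}\right),$$

so that, a posteriori, $\theta \sim \mathcal{N}(\hat{\theta}, \hat{j}^{-1})$ and hence $\hat{j}^{1/2}(\theta-\hat{\theta}) \sim \mathcal{N}(0,1)$, the distribution of $E_{1}$; this suggests $S=\hat{j}^{1/2}=\big(\sum_{i=1}^{n} x_{i-1}^{2}\big)^{1/2}$.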
