Paper 3, Section II, 27K

Principles of Statistics | Part II, 2012

The parameter vector is $\boldsymbol{\Theta} \equiv\left(\Theta_{1}, \Theta_{2}, \Theta_{3}\right)$, with $\Theta_{i}>0$, $\Theta_{1}+\Theta_{2}+\Theta_{3}=1$. Given $\boldsymbol{\Theta}=\boldsymbol{\theta} \equiv\left(\theta_{1}, \theta_{2}, \theta_{3}\right)$, the integer random vector $\boldsymbol{X}=\left(X_{1}, X_{2}, X_{3}\right)$ has a trinomial distribution, with probability mass function

$$p(\boldsymbol{x} \mid \boldsymbol{\theta})=\frac{n!}{x_{1}!\,x_{2}!\,x_{3}!}\,\theta_{1}^{x_{1}}\theta_{2}^{x_{2}}\theta_{3}^{x_{3}}, \qquad \left(x_{i} \geqslant 0,\ \sum_{i=1}^{3} x_{i}=n\right) \tag{1}$$

Compute the score vector for the parameter $\Theta^{*}:=\left(\Theta_{1}, \Theta_{2}\right)$, and, quoting any relevant general result, use this to determine $\mathbb{E}\left(X_{i}\right)$ $(i=1,2,3)$.
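One possible route, for orientation (a sketch, not part of the question text): substitute $\theta_{3}=1-\theta_{1}-\theta_{2}$ into the log-likelihood and use the standard fact that the score has zero expectation.

$$l(\theta_{1},\theta_{2}) = x_{1}\log\theta_{1} + x_{2}\log\theta_{2} + x_{3}\log(1-\theta_{1}-\theta_{2}) + \text{const}, \qquad \frac{\partial l}{\partial \theta_{i}} = \frac{x_{i}}{\theta_{i}} - \frac{x_{3}}{\theta_{3}} \quad (i=1,2).$$

Taking expectations, $\mathbb{E}(X_{i})/\theta_{i} = \mathbb{E}(X_{3})/\theta_{3}$ for $i=1,2$; combined with $\mathbb{E}(X_{1})+\mathbb{E}(X_{2})+\mathbb{E}(X_{3})=n$ this gives $\mathbb{E}(X_{i})=n\theta_{i}$ for $i=1,2,3$.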

Considering (1) as an exponential family with mean-value parameter $\Theta^{*}$, what is the corresponding natural parameter $\boldsymbol{\Phi} \equiv\left(\Phi_{1}, \Phi_{2}\right)$?
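As a sketch of the standard exponential-family algebra (not spelled out in the question), writing the mass function (1) in exponential-family form identifies the natural parameter:

$$p(\boldsymbol{x}\mid\boldsymbol{\theta}) \propto \exp\!\left\{ x_{1}\log\frac{\theta_{1}}{\theta_{3}} + x_{2}\log\frac{\theta_{2}}{\theta_{3}} + n\log\theta_{3} \right\}, \qquad\text{so}\qquad \Phi_{i} = \log\frac{\Theta_{i}}{\Theta_{3}} \quad (i=1,2).$$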

Compute the information matrix $I$ for $\Theta^{*}$, which has $(i, j)$-entry

$$I_{i j}=-\mathbb{E}\left(\frac{\partial^{2} l}{\partial \theta_{i} \partial \theta_{j}}\right) \qquad(i, j=1,2)$$

where $l$ denotes the log-likelihood function, based on $\boldsymbol{X}$, expressed in terms of $\left(\theta_{1}, \theta_{2}\right)$.
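For reference, differentiating $l$ twice with respect to $(\theta_{1},\theta_{2})$ (with $\theta_{3}=1-\theta_{1}-\theta_{2}$) and using $\mathbb{E}(X_{i})=n\theta_{i}$ yields, as a sketch,

$$I = n\begin{pmatrix} \theta_{1}^{-1}+\theta_{3}^{-1} & \theta_{3}^{-1} \\ \theta_{3}^{-1} & \theta_{2}^{-1}+\theta_{3}^{-1} \end{pmatrix}.$$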

Show that the variance of $\log \left(X_{1} / X_{3}\right)$ is asymptotic to $n^{-1}\left(\theta_{1}^{-1}+\theta_{3}^{-1}\right)$ as $n \rightarrow \infty$. [Hint. The information matrix $I_{\Phi}$ for $\boldsymbol{\Phi}$ is $I^{-1}$ and the dispersion matrix of the maximum likelihood estimator $\widehat{\boldsymbol{\Phi}}$ behaves, asymptotically (for $n \rightarrow \infty$) as $I_{\Phi}^{-1}$.]
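The asymptotic claim can be checked numerically. Below is a minimal Monte Carlo sketch (not part of the question; the parameter values and sample sizes are arbitrary choices for illustration) comparing the empirical variance of $\log(X_{1}/X_{3})$ with $n^{-1}(\theta_{1}^{-1}+\theta_{3}^{-1})$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (arbitrary) trinomial parameters.
theta = np.array([0.2, 0.3, 0.5])
reps = 200_000

for n in (100, 1_000, 10_000):
    # Draw `reps` trinomial vectors X = (X1, X2, X3) with index n.
    x = rng.multinomial(n, theta, size=reps)

    # Keep draws with X1 > 0 and X3 > 0 so the log is defined
    # (for these parameter values, essentially all of them).
    ok = (x[:, 0] > 0) & (x[:, 2] > 0)
    log_ratio = np.log(x[ok, 0] / x[ok, 2])

    empirical = log_ratio.var()
    asymptotic = (1 / theta[0] + 1 / theta[2]) / n
    print(f"n={n:6d}  empirical={empirical:.5f}  asymptotic={asymptotic:.5f}")
```

The two columns should agree increasingly well as $n$ grows.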
