Paper 1, Section I, J

Statistical Modelling | Part II, 2011

Let $Y_{1}, \ldots, Y_{n}$ be independent identically distributed random variables with model function $f(y, \theta)$, $y \in \mathcal{Y}$, $\theta \in \Theta \subseteq \mathbb{R}$, and denote by $E_{\theta}$ and $\operatorname{Var}_{\theta}$ expectation and variance under $f(y, \theta)$, respectively. Define $U_{n}(\theta)=\sum_{i=1}^{n} \frac{\partial}{\partial \theta} \log f\left(Y_{i}, \theta\right)$. Prove that $E_{\theta} U_{n}(\theta)=0$. Show moreover that if $T=T\left(Y_{1}, \ldots, Y_{n}\right)$ is any unbiased estimator of $\theta$, then its variance satisfies $\operatorname{Var}_{\theta}(T) \geqslant\left(n \operatorname{Var}_{\theta}\left(U_{1}(\theta)\right)\right)^{-1}$. [You may use the Cauchy–Schwarz inequality without proof, and you may interchange differentiation and integration without justification if necessary.]
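For orientation, a minimal sketch of one standard route (assuming the regularity conditions stated above, and writing integrals over $\mathcal{Y}$, which become sums in the discrete case). Since $\int_{\mathcal{Y}} f(y, \theta)\, dy = 1$ for every $\theta$, differentiating under the integral sign gives

$$
E_{\theta}\!\left[\frac{\partial}{\partial \theta} \log f\left(Y_{1}, \theta\right)\right]
= \int_{\mathcal{Y}} \frac{\frac{\partial}{\partial \theta} f(y, \theta)}{f(y, \theta)}\, f(y, \theta)\, dy
= \frac{\partial}{\partial \theta} \int_{\mathcal{Y}} f(y, \theta)\, dy = 0,
$$

and summing the $n$ identically distributed terms yields $E_{\theta} U_{n}(\theta)=0$. For the bound, differentiating the unbiasedness identity $E_{\theta} T = \theta$ under the integral sign gives $E_{\theta}\left[T\, U_{n}(\theta)\right] = 1$, hence $\operatorname{Cov}_{\theta}\left(T, U_{n}(\theta)\right) = 1$ because $E_{\theta} U_{n}(\theta)=0$. The Cauchy–Schwarz inequality and independence then give

$$
1 = \operatorname{Cov}_{\theta}\left(T, U_{n}(\theta)\right)^{2}
\leqslant \operatorname{Var}_{\theta}(T)\, \operatorname{Var}_{\theta}\left(U_{n}(\theta)\right)
= \operatorname{Var}_{\theta}(T)\, n \operatorname{Var}_{\theta}\left(U_{1}(\theta)\right),
$$

which rearranges to the stated inequality $\operatorname{Var}_{\theta}(T) \geqslant\left(n \operatorname{Var}_{\theta}\left(U_{1}(\theta)\right)\right)^{-1}$.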
