Paper 1, Section I, J

Statistical Modelling | Part II, 2013

Variables $Y_1, \ldots, Y_n$ are independent, with $Y_i$ having a density $p(y \mid \mu_i)$ governed by an unknown parameter $\mu_i$. Define the deviance for a model $M$ that imposes relationships between the $(\mu_i)$.
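For reference, the deviance asked for here is conventionally defined relative to the saturated model, in which each $\mu_i$ is fitted freely; a sketch of that standard definition, writing $\ell(\mu) = \sum_{i=1}^{n} \log p(y_i \mid \mu_i)$ and letting $\widetilde{\mu}$ denote the unrestricted (saturated) maximum likelihood estimate and $\widehat{\mu}_M$ the maximum likelihood estimate under $M$:

$$D(M) = 2\left\{\ell(\widetilde{\mu}) - \ell(\widehat{\mu}_M)\right\},$$

that is, twice the gap between the log-likelihood maximised without constraints and the log-likelihood maximised subject to the relationships imposed by $M$.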

From this point on, suppose $Y_i \sim \operatorname{Poisson}(\mu_i)$. Write down the log-likelihood of data $y_1, \ldots, y_n$ as a function of $\mu_1, \ldots, \mu_n$.
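A sketch of the expression requested, using the Poisson mass function $p(y \mid \mu) = e^{-\mu}\mu^{y}/y!$:

$$\ell(\mu_1, \ldots, \mu_n; y) = \sum_{i=1}^{n}\left\{y_i \log \mu_i - \mu_i - \log y_i!\right\}.$$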

Let $\widehat{\mu}_i$ be the maximum likelihood estimate of $\mu_i$ under model $M$. Show that the deviance for this model is given by

$$2 \sum_{i=1}^{n}\left\{y_{i} \log \frac{y_{i}}{\widehat{\mu}_{i}}-\left(y_{i}-\widehat{\mu}_{i}\right)\right\}.$$
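A sketch of the step connecting this display to the definition above: under the saturated model the Poisson log-likelihood is maximised at $\widetilde{\mu}_i = y_i$, so

$$D(M) = 2\left\{\ell(y; y) - \ell(\widehat{\mu}; y)\right\} = 2 \sum_{i=1}^{n}\left\{y_i \log \frac{y_i}{\widehat{\mu}_i} - (y_i - \widehat{\mu}_i)\right\},$$

the $\log y_i!$ terms cancelling between the two fits.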

Now suppose that, under $M$, $\log \mu_i = \beta^{\mathrm{T}} x_i$, $i = 1, \ldots, n$, where $x_1, \ldots, x_n$ are known $p$-dimensional explanatory variables and $\beta$ is an unknown $p$-dimensional parameter. Show that $\widehat{\mu} := (\widehat{\mu}_1, \ldots, \widehat{\mu}_n)^{\mathrm{T}}$ satisfies $X^{\mathrm{T}} y = X^{\mathrm{T}} \widehat{\mu}$, where $y = (y_1, \ldots, y_n)^{\mathrm{T}}$ and $X$ is the $(n \times p)$ matrix with rows $x_1^{\mathrm{T}}, \ldots, x_n^{\mathrm{T}}$, and express this as an equation for the maximum likelihood estimate $\widehat{\beta}$ of $\beta$. [You are not required to solve this equation.]
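A sketch of the calculation behind the displayed relation: substituting $\mu_i = e^{\beta^{\mathrm{T}} x_i}$ into the log-likelihood gives $\ell(\beta) = \sum_{i=1}^{n}\{y_i \beta^{\mathrm{T}} x_i - e^{\beta^{\mathrm{T}} x_i} - \log y_i!\}$, whose gradient is

$$\frac{\partial \ell}{\partial \beta} = \sum_{i=1}^{n}(y_i - \mu_i)\, x_i = X^{\mathrm{T}}(y - \mu),$$

so setting it to zero yields $X^{\mathrm{T}} y = X^{\mathrm{T}} \widehat{\mu}$; in terms of $\widehat{\beta}$ this reads $X^{\mathrm{T}} y = X^{\mathrm{T}} \exp(X\widehat{\beta})$, with the exponential applied componentwise.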
