3.I.5I

Statistical Modelling | Part II, 2007

Consider two possible experiments giving rise to observed data $y_{ij}$, where $i=1,\ldots,I$ and $j=1,\ldots,J$.

  1. The data are realizations of independent Poisson random variables, i.e.,

$$Y_{ij} \sim f_{1}\left(y_{ij} ; \mu_{ij}\right)=\frac{\mu_{ij}^{y_{ij}}}{y_{ij}!} \exp \left\{-\mu_{ij}\right\},$$

where $\mu_{ij}=\mu_{ij}(\beta)$, with $\beta$ an unknown (possibly vector) parameter. Write $\hat{\beta}$ for the maximum likelihood estimator (m.l.e.) of $\beta$ and $\hat{y}_{ij}=\mu_{ij}(\hat{\beta})$ for the $(i,j)$th fitted value under this model. (An illustrative concrete choice of $\mu_{ij}(\beta)$ is sketched after the second experiment below.)

  2. The data are components of a realization of a multinomial random 'vector'

$$Y \sim f_{2}\left(\left(y_{ij}\right) ; n,\left(p_{ij}\right)\right)=n! \prod_{i=1}^{I} \prod_{j=1}^{J} \frac{p_{ij}^{y_{ij}}}{y_{ij}!},$$

where the $y_{ij}$ are non-negative integers with

$$\sum_{i=1}^{I} \sum_{j=1}^{J} y_{ij}=n \quad \text{and} \quad p_{ij}(\beta)=\frac{\mu_{ij}(\beta)}{n}.$$

Write $\beta^{*}$ for the m.l.e. of $\beta$ and $y_{ij}^{*}=n\, p_{ij}\left(\beta^{*}\right)$ for the $(i,j)$th fitted value under this model.
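The question leaves $\mu_{ij}(\beta)$ abstract. As a concrete illustration (an assumption for exposition, not part of the question), one standard choice is the log-linear independence model

$$\log \mu_{ij}(\beta)=\beta_{0}+\alpha_{i}+\gamma_{j}, \qquad \alpha_{1}=\gamma_{1}=0,$$

so that $\beta=(\beta_{0},\alpha_{2},\ldots,\alpha_{I},\gamma_{2},\ldots,\gamma_{J})$. Note that any log-linear model whose linear predictor contains an intercept satisfies the side condition in the next display at the m.l.e., since the Poisson score equation for the intercept is $\sum_{i,j}(y_{ij}-\hat{y}_{ij})=0$.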

Show that, if

$$\sum_{i=1}^{I} \sum_{j=1}^{J} \hat{y}_{ij}=n,$$

then $\hat{\beta}=\beta^{*}$ and $\hat{y}_{ij}=y_{ij}^{*}$ for all $i, j$. Explain the relevance of this result in the context of fitting multinomial models within a generalized linear model framework.
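As a numerical illustration of this "Poisson trick", the following sketch fits the hypothetical independence model above as a Poisson GLM with log link on a made-up $2 \times 3$ table, then checks that the fitted values sum to $n$ and reproduce the classical multinomial independence estimates. The data, model, and use of statsmodels are assumptions for illustration, not part of the question.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Made-up 2x3 contingency table of counts y_ij (illustrative only).
df = pd.DataFrame({
    "y":   [12, 7, 9, 15, 22, 5],
    "row": pd.Categorical([1, 1, 1, 2, 2, 2]),
    "col": pd.Categorical([1, 2, 3, 1, 2, 3]),
})
n = df["y"].sum()

# Poisson GLM, log link: log mu_ij = beta_0 + alpha_i + gamma_j.
# The intercept's score equation forces sum_ij muhat_ij = sum_ij y_ij = n,
# so the side condition in the question holds automatically at the m.l.e.
fit = smf.glm("y ~ row + col", data=df,
              family=sm.families.Poisson()).fit()
print(fit.fittedvalues.sum(), n)  # equal up to numerical error

# The implied multinomial fitted values n * p_ij(beta*) coincide with the
# Poisson fitted values; for the independence model these are the familiar
# (row total) x (column total) / n estimates.
row_tot = df.groupby("row", observed=True)["y"].transform("sum")
col_tot = df.groupby("col", observed=True)["y"].transform("sum")
print((row_tot * col_tot / n).values)
print(fit.fittedvalues.values)
```

This is the practical point behind the result: multinomial models can be fitted with standard Poisson GLM software, because once the fitted total equals $n$ the two likelihoods share the same $\beta$-dependent factor.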
