Paper 2, Section II, 28K

Principles of Statistics | Part II, 2012

Carefully defining all italicised terms, show that, if a sufficiently general method of inference respects both the Weak Sufficiency Principle and the Conditionality Principle, then it respects the Likelihood Principle.

The position $X_t$ of a particle at time $t>0$ has the Normal distribution $\mathcal{N}(0, \phi t)$, where $\phi$ is the value of an unknown parameter $\Phi$; and the time, $T_x$, at which the particle first reaches position $x \neq 0$ has probability density function

$$p_x(t) = \frac{|x|}{\sqrt{2\pi\phi t^3}} \exp\left(-\frac{x^2}{2\phi t}\right) \quad (t>0).$$

Experimenter $E_1$ observes $X_\tau$, and experimenter $E_2$ observes $T_\xi$, where $\tau>0$, $\xi \neq 0$ are fixed in advance. It turns out that $T_\xi = \tau$. What does the Likelihood Principle say about the inferences about $\Phi$ to be made by the two experimenters?
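A worked sketch of the likelihood comparison (added here as a check; it is not part of the original question paper). From the density above, $E_2$'s likelihood on observing $T_\xi = \tau$ is, as a function of $\phi$,

$$L_2(\phi) \propto \phi^{-1/2} \exp\left(-\frac{\xi^2}{2\phi\tau}\right),$$

while $E_1$, had he observed $X_\tau = \xi$, would obtain from the $\mathcal{N}(0, \phi\tau)$ density

$$L_1(\phi) \propto \phi^{-1/2} \exp\left(-\frac{\xi^2}{2\phi\tau}\right).$$

The two likelihood functions are proportional in $\phi$, so the Likelihood Principle requires the two experimenters to draw identical inferences about $\Phi$ in this case.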

$E_1$ bases his inference about $\Phi$ on the distribution and observed value of $X_\tau^2/\tau$, while $E_2$ bases her inference on the distribution and observed value of $\xi^2/T_\xi$. Show that these choices respect the Likelihood Principle.
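A sketch of the distributional computation behind this claim (an added check, not part of the original paper). Since $X_\tau \sim \mathcal{N}(0, \phi\tau)$,

$$\frac{X_\tau^2}{\tau} \sim \phi\chi_1^2.$$

For $U = \xi^2/T_\xi$, substituting $t = \xi^2/u$ (so $|dt/du| = \xi^2/u^2$) into the density $p_\xi$ gives

$$f_U(u) = p_\xi\!\left(\frac{\xi^2}{u}\right)\frac{\xi^2}{u^2} = \frac{1}{\sqrt{2\pi\phi u}} \exp\left(-\frac{u}{2\phi}\right) \quad (u>0),$$

which is again the $\phi\chi_1^2$ density. Both statistics thus have the same distribution, and whenever the two likelihoods are proportional (i.e. $X_\tau^2 = \xi^2$ with $T_\xi = \tau$) their observed values coincide, so the two inferences agree, as the Likelihood Principle demands.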
