1.II.27I

Principles of Statistics | Part II, 2007

Suppose that $X$ has density $f(\cdot \mid \theta)$ where $\theta \in \Theta$. What does it mean to say that the statistic $T \equiv T(X)$ is sufficient for $\theta$?
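For orientation (background, not part of the question): by the Neyman factorisation criterion, $T$ is sufficient for $\theta$ exactly when the density splits as

$$f(x \mid \theta) = g(T(x), \theta)\, h(x)$$

for some functions $g$ and $h$, with $h$ free of $\theta$; equivalently, the conditional distribution of $X$ given $T(X)$ does not depend on $\theta$.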

Suppose that $\theta = (\psi, \lambda)$, where $\psi$ is the parameter of interest and $\lambda$ is a nuisance parameter, and that the sufficient statistic $T$ has the form $T = (C, S)$. What does it mean to say that the statistic $S$ is ancillary? If it is, how (according to the conditionality principle) do we test hypotheses on $\psi$? Assuming that the set of possible values for $X$ is discrete, show that $S$ is ancillary if and only if the density (probability mass function) $f(x \mid \psi, \lambda)$ factorises as

$$f(x \mid \psi, \lambda) = \varphi_{0}(x)\, \varphi_{C}(C(x), S(x), \psi)\, \varphi_{S}(S(x), \lambda) \tag{$*$}$$

for some functions $\varphi_{0}$, $\varphi_{C}$, and $\varphi_{S}$ with the properties

$$\sum_{x \in C^{-1}(c) \cap S^{-1}(s)} \varphi_{0}(x) = 1 = \sum_{s} \varphi_{S}(s, \lambda) = \sum_{s} \sum_{c} \varphi_{C}(c, s, \psi)$$

for all $c$, $s$, $\psi$, and $\lambda$.
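As a sketch of one direction (assuming the conditions above): summing $(*)$ over $x \in C^{-1}(c) \cap S^{-1}(s)$ and applying the first normalisation collapses $\sum_{x} \varphi_{0}(x) = 1$, leaving the joint mass function of $(C, S)$ in the factorised form

$$\mathbb{P}(C = c, S = s \mid \psi, \lambda) = \varphi_{C}(c, s, \psi)\, \varphi_{S}(s, \lambda),$$

which separates the roles of $\psi$ and $\lambda$ and is the starting point for reading off the distribution of $S$.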

Suppose now that $X_{1}, \ldots, X_{n}$ are independent observations from a $\Gamma(a, b)$ distribution, with density

$$f(x \mid a, b) = (bx)^{a-1} e^{-bx}\, b\, I_{\{x>0\}} / \Gamma(a).$$

Assuming that the criterion $(*)$ holds also for observations which are not discrete, show that it is not possible to find $(C(X), S(X))$ sufficient for $(a, b)$ such that $S$ is ancillary when $b$ is regarded as a nuisance parameter and $a$ is the parameter of interest.
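Outside the question itself, a small numerical sketch (Python; variable names are illustrative) of the sufficiency structure in this Gamma model: the log-likelihood $na \log b + (a-1)\sum_i \log x_i - b \sum_i x_i - n \log \Gamma(a)$ depends on the sample only through the pair $\left(\sum_i \log x_i, \sum_i x_i\right)$, which is therefore sufficient for $(a, b)$.

```python
import numpy as np
from scipy.special import gammaln

def loglik(x, a, b):
    """Gamma(a, b) log-likelihood (b = rate) of the full sample x."""
    n = len(x)
    return n * a * np.log(b) + (a - 1) * np.log(x).sum() - b * x.sum() - n * gammaln(a)

def loglik_from_t(sum_log_x, sum_x, n, a, b):
    """The same log-likelihood, computed from t = (sum log x_i, sum x_i) alone."""
    return n * a * np.log(b) + (a - 1) * sum_log_x - b * sum_x - n * gammaln(a)

rng = np.random.default_rng(2007)
a0, b0, n = 2.5, 1.7, 40
x = rng.gamma(shape=a0, scale=1.0 / b0, size=n)  # NumPy parametrises by scale = 1/b
t = (np.log(x).sum(), x.sum())

# The likelihood depends on the data only through t, for every (a, b):
for a, b in [(2.5, 1.7), (1.0, 0.5), (4.0, 3.0)]:
    assert np.isclose(loglik(x, a, b), loglik_from_t(*t, n, a, b))
print("sufficient statistic t =", t)
```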
