A4.13 B4.15

Principles of Statistics | Part II, 2002

(a) Let $X_{1}, \ldots, X_{n}$ be independent, identically distributed random variables from a one-parameter distribution with density function

$$f(x ; \theta)=h(x)\, g(\theta) \exp \{\theta t(x)\}, \quad x \in \mathbb{R}.$$
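As an illustration (anticipating part (b) below), the Poisson distribution with mean $\mu$ can be written in this form under the natural parametrisation $\theta = \log \mu$:

$$f(x;\theta) = \frac{e^{-\mu}\mu^{x}}{x!} = \frac{1}{x!}\, \exp\{-e^{\theta}\}\, \exp\{\theta x\}, \quad x = 0, 1, 2, \ldots,$$

so that $h(x) = 1/x!$, $g(\theta) = \exp\{-e^{\theta}\}$ and $t(x) = x$ (here the density is taken with respect to counting measure).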

Explain in detail how you would test

$$H_{0}: \theta=\theta_{0} \quad \text{against} \quad H_{1}: \theta \neq \theta_{0}.$$

What is the general form of a conjugate prior density for $\theta$ in a Bayesian analysis of this distribution?
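For reference, a standard conjugate form for a natural exponential family of this kind, with hyperparameters $a > 0$ and $b$, is

$$\pi(\theta) \propto g(\theta)^{a} \exp\{b\,\theta\},$$

so that the posterior after observing $x_{1}, \ldots, x_{n}$ stays in the same family, with $a \mapsto a + n$ and $b \mapsto b + \sum_{i} t(x_{i})$.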

(b) Let $Y_{1}, Y_{2}$ be independent Poisson random variables, with means $(1-\psi) \lambda$ and $\psi \lambda$ respectively, with $\lambda$ known.

Explain why the Conditionality Principle leads to inference about $\psi$ being drawn from the conditional distribution of $Y_{2}$, given $Y_{1}+Y_{2}$. What is this conditional distribution?
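The conditional distribution in part (b) can be checked numerically. The sketch below (assuming NumPy and SciPy are available; the values of $\lambda$, $\psi$ and the conditioning total are arbitrary) simulates the pair $(Y_{1}, Y_{2})$ and compares the empirical conditional distribution of $Y_{2}$ given $Y_{1}+Y_{2}=n$ with a $\mathrm{Binomial}(n, \psi)$ pmf.

```python
import numpy as np
from scipy import stats

# Illustrative simulation: if Y1 ~ Poisson((1 - psi) * lam) and
# Y2 ~ Poisson(psi * lam) are independent, then conditionally on
# Y1 + Y2 = n, Y2 should follow Binomial(n, psi).
rng = np.random.default_rng(0)
lam, psi, n_sims = 10.0, 0.3, 200_000   # arbitrary illustrative values

y1 = rng.poisson((1 - psi) * lam, size=n_sims)
y2 = rng.poisson(psi * lam, size=n_sims)

n = 12                                   # condition on one particular total
y2_given_total = y2[y1 + y2 == n]

# Empirical conditional pmf versus the Binomial(n, psi) pmf
for k in range(n + 1):
    empirical = np.mean(y2_given_total == k)
    exact = stats.binom.pmf(k, n, psi)
    print(f"k={k:2d}  empirical={empirical:.4f}  binomial pmf={exact:.4f}")
```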

(c) Suppose $Y_{1}, Y_{2}$ have distributions as in (b), but that $\lambda$ is now unknown.

Explain in detail how you would test $H_{0}: \psi=\psi_{0}$ against $H_{1}: \psi \neq \psi_{0}$, and describe the optimality properties of your test.

[Any general results you use should be stated clearly, but need not be proved.]
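One concrete realisation of a conditional test for part (c) is an exact binomial test of $\psi = \psi_{0}$ within the conditional $\mathrm{Binomial}(y_{1}+y_{2}, \psi)$ model, since $\lambda$ drops out after conditioning on the total. A minimal sketch, assuming SciPy $\geq$ 1.7 (the observed counts and $\psi_{0}$ below are illustrative only):

```python
from scipy.stats import binomtest

# Hypothetical observed counts and null value, for illustration only
y1, y2 = 14, 6
psi0 = 0.25

# Conditionally on y1 + y2 = n, Y2 ~ Binomial(n, psi) and lambda drops out,
# so a two-sided exact binomial test of psi = psi0 can be based on y2 alone.
result = binomtest(y2, n=y1 + y2, p=psi0, alternative="two-sided")
print(f"conditional p-value: {result.pvalue:.4f}")
```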
