A3.12 B3.15

Principles of Statistics | Part II, 2004

(i) What is a sufficient statistic? What is a minimal sufficient statistic? Explain the terms nuisance parameter and ancillary statistic.

(ii) Let $U_{1}, \ldots, U_{n}$ be independent random variables with common uniform distribution on $[0,1]$, and suppose you observe $X_{i} \equiv a U_{i}^{-\beta}$, $i=1, \ldots, n$, where the positive parameters $a, \beta$ are unknown. Write down the joint density of $X_{1}, \ldots, X_{n}$ and prove that the statistic
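As a sketch of the first step (a standard change-of-variables computation, not part of the question as set): since $U_i = (a/X_i)^{1/\beta}$ and $X_i \geq a$, each $X_i$ has a Pareto-type density, and the joint density depends on the data only through the minimum and the product:

```latex
f_{X_i}(x) = \frac{1}{\beta}\, a^{1/\beta}\, x^{-1/\beta - 1},
\quad x \geq a,
\qquad\Longrightarrow\qquad
f(x_1,\ldots,x_n) = \beta^{-n}\, a^{n/\beta}
\Bigl(\prod_{j=1}^{n} x_j\Bigr)^{-1/\beta - 1}
\mathbf{1}\Bigl\{\min_{1 \leqslant j \leqslant n} x_j \geq a\Bigr\}.
```

The factorisation criterion then gives sufficiency of $(m, p)$; minimality follows by examining likelihood ratios between data points.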

$$(m, p) \equiv \left(\min _{1 \leqslant j \leqslant n}\left\{X_{j}\right\},\ \prod_{j=1}^{n} X_{j}\right)$$

is minimal sufficient for $(a, \beta)$. Find the maximum-likelihood estimator $(\hat{a}, \hat{\beta})$ of $(a, \beta)$.
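A quick numerical sanity check, assuming the closed forms $\hat{a} = m$ and $\hat{\beta} = n^{-1}\sum_{j}\log(X_j/m)$ that one obtains by maximising the likelihood (the parameter values, sample size, and variable names below are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, beta_true, n = 2.0, 0.5, 100_000  # illustrative values

u = rng.uniform(size=n)
x = a_true * u ** (-beta_true)          # X_i = a * U_i^{-beta}, supported on [a, inf)

a_hat = x.min()                          # candidate MLE of a: the sample minimum m
beta_hat = np.mean(np.log(x / a_hat))    # candidate MLE of beta: (1/n) * sum log(X_j / m)

print(a_hat, beta_hat)                   # both should be close to (a_true, beta_true)
```

For large $n$ both estimators land close to the truth; note $\hat{a} = m \geq a$ always, reflecting that the support $[a, \infty)$ depends on the parameter.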

Regarding $\beta$ as the parameter of interest and $a$ as the nuisance parameter, is $m$ ancillary? Find the mean and variance of $\hat{\beta}$. Hence find an unbiased estimator of $\beta$.
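One standard route for the distributional part (a sketch using exponential order statistics, not the official solution): writing $\log(X_j/a) = \beta E_j$ with $E_1, \ldots, E_n$ i.i.d. $\mathrm{Exp}(1)$ and $E_{(1)} = \min_j E_j$,

```latex
\hat{\beta} \;=\; \frac{1}{n}\sum_{j=1}^{n}\log\frac{X_j}{m}
\;=\; \frac{\beta}{n}\sum_{j=1}^{n}\bigl(E_j - E_{(1)}\bigr)
\;\sim\; \frac{\beta}{n}\,\Gamma(n-1,\,1),
```

so $\mathbb{E}\,\hat{\beta} = \frac{n-1}{n}\beta$ and $\operatorname{Var}\hat{\beta} = \frac{n-1}{n^2}\beta^2$, suggesting $\frac{n}{n-1}\hat{\beta}$ as the unbiased estimator. Note also that $m = a\,(\max_j U_j)^{-\beta}$, whose distribution depends on $\beta$, which bears on the ancillarity question.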
