Paper 1, Section I, $\mathbf{7H}$

Statistics | Part IB, 2011

Consider the experiment of tossing a coin $n$ times. Assume that the tosses are independent and the coin is biased, with unknown probability $p$ of heads and $1-p$ of tails. A total of $X$ heads is observed.

(i) What is the maximum likelihood estimator $\widehat{p}$ of $p$?
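For reference, a minimal sketch of the standard argument, not part of the original exam text: with $X$ heads in $n$ independent tosses the likelihood is binomial,

$$L(p) = \binom{n}{X} p^{X}(1-p)^{n-X},$$

and setting the derivative of $\log L(p) = \text{const} + X \log p + (n-X)\log(1-p)$ to zero gives $\widehat{p} = X/n$.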

Now suppose that a Bayesian statistician has the $\operatorname{Beta}(M, N)$ prior distribution for $p$.

(ii) What is the posterior distribution for $p$?
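Again as an unofficial sketch: by Bayes' theorem the posterior density is proportional to prior times likelihood,

$$\pi(p \mid X) \propto p^{M-1}(1-p)^{N-1} \cdot p^{X}(1-p)^{n-X} = p^{M+X-1}(1-p)^{N+n-X-1},$$

which is recognisable as the $\operatorname{Beta}(M+X,\, N+n-X)$ density.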

(iii) Assuming the loss function is $L(p, a) = (p-a)^{2}$, show that the statistician's point estimate for $p$ is given by

$$\frac{M+X}{M+N+n}.$$

[The $\operatorname{Beta}(M, N)$ distribution has density $\frac{\Gamma(M+N)}{\Gamma(M)\Gamma(N)} x^{M-1}(1-x)^{N-1}$ for $0 < x < 1$ and mean $\frac{M}{M+N}$.]
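A brief note on part (iii), likewise not from the original text: under squared-error loss the Bayes estimator is the posterior mean, since

$$a \mapsto \mathbb{E}\!\left[(p-a)^{2} \mid X\right]$$

is minimised at $a = \mathbb{E}[p \mid X]$. Applying the stated mean formula to the $\operatorname{Beta}(M+X,\, N+n-X)$ posterior then gives $\frac{M+X}{M+N+n}$.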
