2.II.27I

Principles of Statistics | Part II, 2005

(i) Suppose that $X$ is a multivariate normal vector with mean $\mu \in \mathbb{R}^{d}$ and covariance matrix $\sigma^{2} I$, where $\mu$ and $\sigma^{2}$ are both unknown, and $I$ denotes the $d \times d$ identity matrix. Suppose that $\Theta_{0} \subset \Theta_{1}$ are linear subspaces of $\mathbb{R}^{d}$ of dimensions $d_{0}$ and $d_{1}$, where $d_{0} < d_{1} < d$. Let $P_{i}$ denote orthogonal projection onto $\Theta_{i}$ $(i=0,1)$. Carefully derive the joint distribution of $\left(\left|X-P_{1} X\right|^{2},\left|P_{1} X-P_{0} X\right|^{2}\right)$ under the hypothesis $H_{0}: \mu \in \Theta_{0}$. How could you use this to make a test of $H_{0}$ against $H_{1}: \mu \in \Theta_{1}$?
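
(A sketch of the standard answer, added for orientation; it is not part of the original question. Choosing an orthonormal basis of $\mathbb{R}^{d}$ adapted to $\Theta_{0} \subset \Theta_{1}$ shows that, under $H_{0}$, the two quantities are independent with

$$\left|X-P_{1} X\right|^{2} \sim \sigma^{2} \chi^{2}_{d-d_{1}}, \qquad \left|P_{1} X-P_{0} X\right|^{2} \sim \sigma^{2} \chi^{2}_{d_{1}-d_{0}},$$

so that under $H_{0}$

$$F = \frac{\left|P_{1} X-P_{0} X\right|^{2} /\left(d_{1}-d_{0}\right)}{\left|X-P_{1} X\right|^{2} /\left(d-d_{1}\right)} \sim F_{d_{1}-d_{0},\, d-d_{1}},$$

and one rejects $H_{0}$ when $F$ exceeds the appropriate upper quantile of this $F$ distribution.)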

(ii) Suppose that $I$ students take $J$ exams, and that the mark $X_{ij}$ of student $i$ in exam $j$ is modelled as

$$X_{ij} = m + \alpha_{i} + \beta_{j} + \varepsilon_{ij}$$

where $\sum_{i} \alpha_{i} = 0 = \sum_{j} \beta_{j}$, the $\varepsilon_{ij}$ are independent $N\left(0, \sigma^{2}\right)$, and the parameters $m$, $\alpha$, $\beta$ and $\sigma$ are unknown. Construct a test of $H_{0}: \beta_{j}=0$ for all $j$ against $H_{1}: \sum_{j} \beta_{j}=0$.
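
(Below is a minimal numerical sketch of the resulting test, added for orientation; it is not part of the original question. Part (i) applies with $d = IJ$, $d_{0} = I$ (fitting $m$ and the $\alpha_{i}$ subject to $\sum_{i} \alpha_{i} = 0$) and $d_{1} = I + J - 1$, giving an $F_{J-1,\,(I-1)(J-1)}$ test. The simulated marks and the NumPy/SciPy calls are illustrative assumptions.)

```python
import numpy as np
from scipy import stats

# Hypothetical data: I students, J exams (values are illustrative only).
rng = np.random.default_rng(0)
I, J = 8, 5
X = rng.normal(50.0, 10.0, size=(I, J))    # marks X_ij

grand = X.mean()                           # estimate of m
row = X.mean(axis=1, keepdims=True)        # estimates of m + alpha_i
col = X.mean(axis=0, keepdims=True)        # estimates of m + beta_j

# Orthogonal projections of X, viewed as a vector in R^{IJ}:
# P_0 X fits m + alpha_i only; P_1 X additionally fits the exam effects beta_j.
P0X = np.broadcast_to(row, X.shape)
P1X = row + col - grand

rss1 = ((X - P1X) ** 2).sum()              # |X - P_1 X|^2, with (I-1)(J-1) df
between = ((P1X - P0X) ** 2).sum()         # |P_1 X - P_0 X|^2, with J-1 df

F = (between / (J - 1)) / (rss1 / ((I - 1) * (J - 1)))
p = stats.f.sf(F, J - 1, (I - 1) * (J - 1))
print(f"F = {F:.3f}, p-value = {p:.3f}")   # reject H_0 for large F
```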
