• # Paper 2, Section I, D

A coin has probability $p$ of landing heads. Let $q_{n}$ be the probability that the number of heads after $n$ tosses is even. Give an expression for $q_{n+1}$ in terms of $q_{n}$. Hence, or otherwise, find $q_{n}$.
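As a quick numerical sanity check (an editorial sketch, not part of the question; it assumes the standard answer $q_{n}=\frac{1}{2}\left(1+(1-2 p)^{n}\right)$, with $q_{0}=1$ since zero heads is even), the recursion can be compared against the closed form:

```python
# Check the recursion q_{n+1} = (1 - p) q_n + p (1 - q_n)
# against the conjectured closed form q_n = (1 + (1 - 2p)^n) / 2.

def q_recursive(n, p):
    q = 1.0  # zero tosses: zero heads, which is even
    for _ in range(n):
        q = (1 - p) * q + p * (1 - q)
    return q

def q_closed(n, p):
    return (1 + (1 - 2 * p) ** n) / 2

for p in (0.1, 0.5, 0.9):
    for n in (0, 1, 5, 20):
        assert abs(q_recursive(n, p) - q_closed(n, p)) < 1e-12
```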

• # Paper 2, Section I, F

Let $X$ be a continuous random variable taking values in $[0, \sqrt{3}]$. Let the probability density function of $X$ be

$f_{X}(x)=\frac{c}{1+x^{2}}, \quad \text { for } x \in[0, \sqrt{3}]$

where $c$ is a constant.

Find the value of $c$ and calculate the mean, variance and median of $X$.

[Recall that the median of $X$ is the number $m$ such that $\mathbb{P}(X \leqslant m)=\frac{1}{2}$.]
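A numerical sketch (editorial, not part of the question): integrating $1 /(1+x^{2})$ gives $\arctan \sqrt{3}=\pi / 3$, suggesting $c=3 / \pi$, a mean of $(3 / \pi) \log 2$ and a median of $\tan (\pi / 6)=1 / \sqrt{3}$; these conjectured values can be checked by quadrature.

```python
import math

def integrate(g, a, b, n=200_000):
    """Midpoint-rule approximation to the integral of g over [a, b]."""
    h = (b - a) / n
    return h * sum(g(a + (i + 0.5) * h) for i in range(n))

c = 3 / math.pi                        # conjectured from arctan(sqrt(3)) = pi/3
f = lambda x: c / (1 + x * x)

total = integrate(f, 0, math.sqrt(3))  # total mass, should be 1
mean = integrate(lambda x: x * f(x), 0, math.sqrt(3))
median = math.tan(math.pi / 6)         # solves arctan(m) = pi/6
half = integrate(f, 0, median)         # should be 1/2

assert abs(total - 1) < 1e-6
assert abs(mean - (3 / math.pi) * math.log(2)) < 1e-6
assert abs(half - 0.5) < 1e-6
```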

• # Paper 2, Section II, 10E

(a) Alanya repeatedly rolls a fair six-sided die. What is the probability that the first number she rolls is a 1, given that she rolls a 1 before she rolls a 6?

(b) Let $\left(X_{n}\right)_{n \geqslant 0}$ be a simple symmetric random walk on the integers starting at $x \in \mathbb{Z}$, that is,

$X_{n}=\left\{\begin{array}{cl} x & \text { if } n=0 \\ x+\sum_{i=1}^{n} Y_{i} & \text { if } n \geqslant 1 \end{array}\right.$

where $\left(Y_{n}\right)_{n \geqslant 1}$ is a sequence of IID random variables with $\mathbb{P}\left(Y_{n}=1\right)=\mathbb{P}\left(Y_{n}=-1\right)=\frac{1}{2}$. Let $T=\min \left\{n \geqslant 0: X_{n}=0\right\}$ be the time that the walk first hits 0 .

(i) Let $n$ be a positive integer. For $0<x<n$, calculate the probability that the walk hits 0 before it hits $n$.

(ii) Let $x=1$ and let $A$ be the event that the walk hits 0 before it hits 3 . Find $\mathbb{P}\left(X_{1}=0 \mid A\right)$. Hence find $\mathbb{E}(T \mid A)$.

(iii) Let $x=1$ and let $B$ be the event that the walk hits 0 before it hits 4 . Find $\mathbb{E}(T \mid B)$.
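A numerical check for part (i) (editorial; it assumes the gambler's-ruin answer $(n-x) / n$): the hitting probability $h(x)$ satisfies $h(0)=1$, $h(n)=0$ and $h(x)=\frac{1}{2}(h(x-1)+h(x+1))$, which can be solved by sweeping until convergence.

```python
def hit_zero_first(n, sweeps=10_000):
    """P(walk from x hits 0 before n), by relaxing the harmonic equations."""
    h = [0.0] * (n + 1)
    h[0] = 1.0
    for _ in range(sweeps):
        for x in range(1, n):
            h[x] = 0.5 * (h[x - 1] + h[x + 1])
    return h

n = 7
h = hit_zero_first(n)
for x in range(n + 1):
    # conjectured closed form: (n - x) / n
    assert abs(h[x] - (n - x) / n) < 1e-9
```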

• # Paper 2, Section II, 12F

State and prove Chebyshev's inequality.

Let $\left(X_{i}\right)_{i \geqslant 1}$ be a sequence of independent, identically distributed random variables such that

$\mathbb{P}\left(X_{i}=0\right)=p \text { and } \mathbb{P}\left(X_{i}=1\right)=1-p$

for some $p \in[0,1]$, and let $f:[0,1] \rightarrow \mathbb{R}$ be a continuous function.

(i) Prove that

$B_{n}(p):=\mathbb{E}\left(f\left(\frac{X_{1}+\cdots+X_{n}}{n}\right)\right)$

is a polynomial function of $p$, for any natural number $n$.

(ii) Let $\delta>0$. Prove that

$\sum_{k \in K_{\delta}}\left(\begin{array}{l} n \\ k \end{array}\right) p^{k}(1-p)^{n-k} \leqslant \frac{1}{4 n \delta^{2}}$

where $K_{\delta}$ is the set of natural numbers $0 \leqslant k \leqslant n$ such that $|k / n-p|>\delta$.

(iii) Show that

$\sup _{p \in[0,1]}\left|f(p)-B_{n}(p)\right| \rightarrow 0$

as $n \rightarrow \infty$. [You may use without proof that, for any $\epsilon>0$, there is a $\delta>0$ such that $|f(x)-f(y)| \leqslant \epsilon$ for all $x, y \in[0,1]$ with $|x-y| \leqslant \delta$.]
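An editorial illustration of part (iii): $B_{n}$ is the Bernstein polynomial of $f$, and the sup-error should shrink as $n$ grows, even for a non-smooth $f$ such as $f(x)=\left|x-\frac{1}{2}\right|$.

```python
import math

def bernstein(f, n, p):
    """B_n(p) = E f(S_n / n) with S_n ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * f(k / n)
               for k in range(n + 1))

f = lambda x: abs(x - 0.5)
grid = [i / 200 for i in range(201)]

def sup_error(n):
    return max(abs(f(p) - bernstein(f, n, p)) for p in grid)

assert sup_error(160) < sup_error(10)
```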

• # Paper 2, Section II, 9E

(a) (i) Define the conditional probability $\mathbb{P}(A \mid B)$ of the event $A$ given the event $B$. Let $\left\{B_{j}: 1 \leqslant j \leqslant n\right\}$ be a partition of the sample space such that $\mathbb{P}\left(B_{j}\right)>0$ for all $j$. Show that, if $\mathbb{P}(A)>0$,

$\mathbb{P}\left(B_{j} \mid A\right)=\frac{\mathbb{P}\left(A \mid B_{j}\right) \mathbb{P}\left(B_{j}\right)}{\sum_{k=1}^{n} \mathbb{P}\left(A \mid B_{k}\right) \mathbb{P}\left(B_{k}\right)}$

(ii) There are $n$ urns, the $r$ th of which contains $r-1$ red balls and $n-r$ blue balls. Alice picks an urn (uniformly) at random and removes two balls without replacement. Find the probability that the first ball is blue, and the conditional probability that the second ball is blue, given that the first is blue. [You may assume, if you wish, that $\sum_{i=1}^{n-1} i(i-1)=\frac{1}{3} n(n-1)(n-2)$.]

(b) (i) What is meant by saying that two events $A$ and $B$ are independent? Two fair (6-sided) dice are rolled. Let $A_{t}$ be the event that the sum of the numbers shown is $t$, and let $B_{i}$ be the event that the first die shows $i$. For what values of $t$ and $i$ are the two events $A_{t}$ and $B_{i}$ independent?

(ii) The casino at Monte Corona features the following game: three coins each show heads with probability $3 / 5$ and tails otherwise. The first counts 10 points for a head and 2 for a tail; the second counts 4 points for both a head and a tail; and the third counts 3 points for a head and 20 for a tail. You and your opponent each choose a coin. You cannot both choose the same coin. Each of you tosses your coin once and the person with the larger score wins the jackpot. Would you prefer to be the first or the second to choose a coin?
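The point of the game can be checked exhaustively (editorial sketch; the coin labels below are just bookkeeping): each coin is beaten by another with probability at least $3 / 5$, so the coins are non-transitive and the second chooser can always respond with a winning coin.

```python
from fractions import Fraction

# score distributions: (probability, score) pairs for each coin
coins = {
    "first":  [(Fraction(3, 5), 10), (Fraction(2, 5), 2)],
    "second": [(Fraction(3, 5), 4),  (Fraction(2, 5), 4)],
    "third":  [(Fraction(3, 5), 3),  (Fraction(2, 5), 20)],
}

def p_beats(a, b):
    """Exact probability that coin a scores strictly more than coin b."""
    return sum(pa * pb
               for pa, sa in coins[a] for pb, sb in coins[b] if sa > sb)

assert p_beats("first", "second") == Fraction(3, 5)
assert p_beats("second", "third") == Fraction(3, 5)
assert p_beats("third", "first") == Fraction(16, 25)
```

No ties are possible with these scores, so the three probabilities above fully describe each pairing.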

• # Paper 2, Section II, D

Let $\Delta$ be the disc of radius 1 with centre at the origin $O$. Let $P$ be a random point uniformly distributed in $\Delta$. Let $(R, \Theta)$ be the polar coordinates of $P$. Show that $R$ and $\Theta$ are independent and find their probability density functions $f_{R}$ and $f_{\Theta}$.

Let $A, B$ and $C$ be three random points selected independently and uniformly in $\Delta$. Find the expected area of triangle $O A B$ and hence find the probability that $C$ lies in the interior of triangle $O A B$.

Find the probability that $O, A, B$ and $C$ are the vertices of a convex quadrilateral.


• # Paper 1, Section I, F

A robot factory begins with a single generation-0 robot. Each generation- $n$ robot independently builds some number of generation- $(n+1)$ robots before breaking down. The number of generation- $(n+1)$ robots built by a generation- $n$ robot is $0,1,2$ or 3 with probabilities $\frac{1}{12}, \frac{1}{2}, \frac{1}{3}$ and $\frac{1}{12}$ respectively. Find the expectation of the total number of generation- $n$ robots produced by the factory. What is the probability that the factory continues producing robots forever?

[Standard results about branching processes may be used without proof as long as they are carefully stated.]
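An editorial numerical check: the offspring mean is $m=\frac{1}{2}+\frac{2}{3}+\frac{3}{12}=\frac{17}{12}$, so the expected number of generation-$n$ robots should be $(17 / 12)^{n}$, and the extinction probability should be the smallest non-negative root of $q=F(q)$ with $F(t)=\frac{1}{12}+\frac{t}{2}+\frac{t^{2}}{3}+\frac{t^{3}}{12}$.

```python
import math

F = lambda t: 1/12 + t/2 + t**2/3 + t**3/12

# offspring mean
m = 0 * (1/12) + 1 * (1/2) + 2 * (1/3) + 3 * (1/12)
assert abs(m - 17/12) < 1e-12

# fixed-point iteration from 0 converges to the smallest root of q = F(q)
q = 0.0
for _ in range(10_000):
    q = F(q)

# q = F(q) rearranges to (q - 1)(q^2 + 5q - 1) = 0, so the root in [0, 1)
# should be (sqrt(29) - 5) / 2
assert abs(q - (math.sqrt(29) - 5) / 2) < 1e-9
```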

• # Paper 1, Section II, F

(a) Let $Z$ be a $N(0,1)$ random variable. Write down the probability density function (pdf) of $Z$, and verify that it is indeed a pdf. Find the moment generating function (mgf) $m_{Z}(\theta)=\mathbb{E}\left(e^{\theta Z}\right)$ of $Z$ and hence, or otherwise, verify that $Z$ has mean 0 and variance 1.

(b) Let $\left(X_{n}\right)_{n \geqslant 1}$ be a sequence of IID $N(0,1)$ random variables. Let $S_{n}=\sum_{i=1}^{n} X_{i}$ and let $U_{n}=S_{n} / \sqrt{n}$. Find the distribution of $U_{n}$.

(c) Let $Y_{n}=X_{n}^{2}$. Find the mean $\mu$ and variance $\sigma^{2}$ of $Y_{1}$. Let $T_{n}=\sum_{i=1}^{n} Y_{i}$ and let $V_{n}=\left(T_{n}-n \mu\right) / \sigma \sqrt{n}$.

If $\left(W_{n}\right)_{n \geqslant 1}$ is a sequence of random variables and $W$ is a random variable, what does it mean to say that $W_{n} \rightarrow W$ in distribution? State carefully the continuity theorem and use it to show that $V_{n} \rightarrow Z$ in distribution.

[You may not assume the central limit theorem.]

• # Paper 1, Section II, F

Let $A_{1}, A_{2}, \ldots, A_{n}$ be events in some probability space. State and prove the inclusion-exclusion formula for the probability $\mathbb{P}\left(\bigcup_{i=1}^{n} A_{i}\right)$. Show also that

$\mathbb{P}\left(\bigcup_{i=1}^{n} A_{i}\right) \geqslant \sum_{i} \mathbb{P}\left(A_{i}\right)-\sum_{i<j} \mathbb{P}\left(A_{i} \cap A_{j}\right)$

Suppose now that $n \geqslant 2$ and that whenever $i \neq j$ we have $\mathbb{P}\left(A_{i} \cap A_{j}\right) \leqslant 1 / n$. Show that there is a constant $c$ independent of $n$ such that $\sum_{i=1}^{n} \mathbb{P}\left(A_{i}\right) \leqslant c \sqrt{n}$.


• # Paper 2, Section I, 3F

(a) Prove that $\log (n !) \sim n \log n$ as $n \rightarrow \infty$.

(b) State Stirling's approximation for $n$ !.

(c) A school party of $n$ boys and $n$ girls travel on a red bus and a green bus. Each bus can hold $n$ children. The children are distributed at random between the buses.

Let $A_{n}$ be the event that the boys all travel on the red bus and the girls all travel on the green bus. Show that

$\mathbb{P}\left(A_{n}\right) \sim \frac{\sqrt{\pi n}}{4^{n}} \text { as } n \rightarrow \infty$
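Editorial numerical check: the red bus's passenger list is a uniform $n$-subset of the $2 n$ children, so $\mathbb{P}\left(A_{n}\right)=1 / \binom{2 n}{n}$, and the claimed asymptotic says the log of the ratio $\mathbb{P}\left(A_{n}\right) 4^{n} / \sqrt{\pi n}$ should tend to 0.

```python
import math

def log_ratio(n):
    """log of P(A_n) * 4^n / sqrt(pi * n), via lgamma so that neither
    4^n nor C(2n, n) is ever formed as an overflowing float."""
    log_binom = math.lgamma(2 * n + 1) - 2 * math.lgamma(n + 1)
    return n * math.log(4) - log_binom - 0.5 * math.log(math.pi * n)

assert abs(log_ratio(500)) < 0.01
assert abs(log_ratio(5000)) < abs(log_ratio(500))
```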

• # Paper 2, Section I, F

Let $X$ and $Y$ be independent exponential random variables each with parameter 1 . Write down the joint density function of $X$ and $Y$.

Let $U=6 X+8 Y$ and $V=2 X+3 Y$. Find the joint density function of $U$ and $V$.

Are $U$ and $V$ independent? Briefly justify your answer.

• # Paper 2, Section II, F

Let $A_{1}, A_{2}, \ldots, A_{n}$ be events in some probability space. Let $X$ be the number of $A_{i}$ that occur (so $X$ is a random variable). Show that

$\mathbb{E}(X)=\sum_{i=1}^{n} \mathbb{P}\left(A_{i}\right)$

and

$\operatorname{Var}(X)=\sum_{i=1}^{n} \sum_{j=1}^{n}\left(\mathbb{P}\left(A_{i} \cap A_{j}\right)-\mathbb{P}\left(A_{i}\right) \mathbb{P}\left(A_{j}\right)\right)$

[Hint: Write $X=\sum_{i=1}^{n} X_{i}$ where $X_{i}=\left\{\begin{array}{ll}1 & \text { if } A_{i} \text { occurs } \\ 0 & \text { if not }\end{array}\right.$.]

A collection of $n$ lightbulbs are arranged in a circle. Each bulb is on independently with probability $p$. Let $X$ be the number of bulbs such that both that bulb and the next bulb clockwise are on. Find $\mathbb{E}(X)$ and $\operatorname{Var}(X)$.

Let $B$ be the event that there is at least one pair of adjacent bulbs that are both on.

Use Markov's inequality to show that if $p=n^{-0.6}$ then $\mathbb{P}(B) \rightarrow 0$ as $n \rightarrow \infty$.

Use Chebyshev's inequality to show that if $p=n^{-0.4}$ then $\mathbb{P}(B) \rightarrow 1$ as $n \rightarrow \infty$.

• # Paper 2, Section II, F

Recall that a random variable $X$ in $\mathbb{R}^{2}$ is bivariate normal or Gaussian if $u^{T} X$ is normal for all $u \in \mathbb{R}^{2}$. Let $X=\left(\begin{array}{c}X_{1} \\ X_{2}\end{array}\right)$ be bivariate normal.

(a) (i) Show that if $A$ is a $2 \times 2$ real matrix then $A X$ is bivariate normal.

(ii) Let $\mu=\mathbb{E}(X)$ and $V=\operatorname{Var}(X)=\mathbb{E}\left[(X-\mu)(X-\mu)^{T}\right]$. Find the moment generating function $M_{X}(\lambda)=\mathbb{E}\left(e^{\lambda^{T} X}\right)$ of $X$ and deduce that the distribution of a bivariate normal random variable $X$ is uniquely determined by $\mu$ and $V$.

(iii) Let $\mu_{i}=\mathbb{E}\left(X_{i}\right)$ and $\sigma_{i}^{2}=\operatorname{Var}\left(X_{i}\right)$ for $i=1,2$. Let $\rho=\frac{\operatorname{Cov}\left(X_{1}, X_{2}\right)}{\sigma_{1} \sigma_{2}}$ be the correlation of $X_{1}$ and $X_{2}$. Write down $V$ in terms of some or all of $\mu_{1}, \mu_{2}, \sigma_{1}, \sigma_{2}$ and $\rho$. If $\operatorname{Cov}\left(X_{1}, X_{2}\right)=0$, why must $X_{1}$ and $X_{2}$ be independent?

For each $a \in \mathbb{R}$, find $\operatorname{Cov}\left(X_{1}, X_{2}-a X_{1}\right)$. Hence show that $X_{2}=a X_{1}+Y$ for some normal random variable $Y$ in $\mathbb{R}$ that is independent of $X_{1}$ and some $a \in \mathbb{R}$ that should be specified.

(b) A certain species of East Anglian goblin has left arm of mean length $100 \mathrm{~cm}$ with standard deviation $1 \mathrm{~cm}$, and right arm of mean length $102 \mathrm{~cm}$ with standard deviation $2 \mathrm{~cm}$. The correlation of left- and right-arm-length of a goblin is $\frac{1}{2}$. You may assume that the distribution of left- and right-arm-lengths can be modelled by a bivariate normal distribution. What is the probability that a randomly selected goblin has longer right arm than left arm?

[You may give your answer in terms of the distribution function $\Phi$ of a $N(0,1)$ random variable $Z$. That is, $\Phi(t)=\mathbb{P}(Z \leqslant t)$.]
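A Monte Carlo check of the set-up (editorial; it assumes the difference $D=R-L$ of right- and left-arm lengths is normal with mean $2$ and variance $1+4-2 \cdot \frac{1}{2} \cdot 1 \cdot 2=3$, giving $\mathbb{P}(R>L)=\Phi(2 / \sqrt{3})$):

```python
import math
import random

random.seed(0)
rho = 0.5
trials = 200_000
wins = 0
for _ in range(trials):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    left = 100 + z1                                            # N(100, 1)
    right = 102 + 2 * (rho * z1 + math.sqrt(1 - rho**2) * z2)  # N(102, 4), corr 1/2
    wins += right > left

Phi = lambda t: 0.5 * (1 + math.erf(t / math.sqrt(2)))
assert abs(wins / trials - Phi(2 / math.sqrt(3))) < 0.01
```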

• # Paper 2, Section II, F

Let $m$ and $n$ be positive integers with $n>m>0$ and let $p \in(0,1)$ be a real number. A random walk on the integers starts at $m$. At each step, the walk moves up 1 with probability $p$ and down 1 with probability $q=1-p$. Find, with proof, the probability that the walk hits $n$ before it hits 0 .

Patricia owes a very large sum $£2(N!)$ of money to a member of a violent criminal gang. She must return the money this evening to avoid terrible consequences but she only has $£N!$. She goes to a casino and plays a game with the probability of her winning being $\frac{18}{37}$. If she bets $£ a$ on the game and wins then her $£ a$ is returned along with a further $£ a$; if she loses then her $£ a$ is lost.

The rules of the casino allow Patricia to play the game repeatedly until she runs out of money. She may choose the amount $£ a$ that she bets to be any integer $a$ with $1 \leqslant a \leqslant N$, but it must be the same amount each time. What choice of $a$ would be best and why?

What choice of $a$ would be best, and why, if instead the probability of her winning the game is $\frac{19}{37}$ ?

• # Paper 2, Section II, F

(a) State the axioms that must be satisfied by a probability measure $\mathbb{P}$ on a probability space $\Omega$.

Let $A$ and $B$ be events with $\mathbb{P}(B)>0$. Define the conditional probability $\mathbb{P}(A \mid B)$.

Let $B_{1}, B_{2}, \ldots$ be pairwise disjoint events with $\mathbb{P}\left(B_{i}\right)>0$ for all $i$ and $\Omega=\cup_{i=1}^{\infty} B_{i}$. Starting from the axioms, show that

$\mathbb{P}(A)=\sum_{i=1}^{\infty} \mathbb{P}\left(A \mid B_{i}\right) \mathbb{P}\left(B_{i}\right)$

and deduce Bayes' theorem.

(b) Two identical urns contain white balls and black balls. Urn I contains 45 white balls and 30 black balls. Urn II contains 12 white balls and 36 black balls. You do not know which urn is which.

(i) Suppose you select an urn and draw one ball at random from it. The ball is white. What is the probability that you selected Urn I?

(ii) Suppose instead you draw one ball at random from each urn. One of the balls is white and one is black. What is the probability that the white ball came from Urn I?

(c) Now suppose there are $n$ identical urns containing white balls and black balls, and again you do not know which urn is which. Each urn contains 1 white ball. The $i$ th urn contains $2^{i}-1$ black balls $(1 \leqslant i \leqslant n)$. You select an urn and draw one ball at random from it. The ball is white. Let $p(n)$ be the probability that if you replace this ball and again draw a ball at random from the same urn then the ball drawn on the second occasion is also white. Show that $p(n) \rightarrow \frac{1}{3}$ as $n \rightarrow \infty$.
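Part (c) can be checked exactly (editorial sketch): urn $i$ holds $2^{i}$ balls in total, so the chance of drawing white from it is $2^{-i}$, and $p(n)=\sum_{i} 4^{-i} / \sum_{i} 2^{-i}$, which tends to $\frac{1 / 3}{1}=\frac{1}{3}$.

```python
from fractions import Fraction

def p(n):
    """Exact p(n): P(second draw white | first draw white)."""
    num = sum(Fraction(1, 4**i) for i in range(1, n + 1))
    den = sum(Fraction(1, 2**i) for i in range(1, n + 1))
    return num / den

assert p(1) == Fraction(1, 2)
assert abs(float(p(60)) - 1 / 3) < 1e-15
assert float(p(200)) < float(p(3))   # decreasing towards 1/3 from above
```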


• # Paper 2, Section I, F

(a) State the Cauchy-Schwarz inequality and Markov's inequality. State and prove Jensen's inequality.

(b) For a discrete random variable $X$, show that $\operatorname{Var}(X)=0$ implies that $X$ is constant, i.e. there is $x \in \mathbb{R}$ such that $\mathbb{P}(X=x)=1$.

• # Paper 2, Section I, F

Let $X$ and $Y$ be independent Poisson random variables with parameters $\lambda$ and $\mu$ respectively.

(i) Show that $X+Y$ is Poisson with parameter $\lambda+\mu$.

(ii) Show that the conditional distribution of $X$ given $X+Y=n$ is binomial, and find its parameters.

• # Paper 2, Section II, 10F

(a) Let $X$ and $Y$ be independent random variables taking values $\pm 1$, each with probability $\frac{1}{2}$, and let $Z=X Y$. Show that $X, Y$ and $Z$ are pairwise independent. Are they independent?

(b) Let $X$ and $Y$ be discrete random variables with mean 0 , variance 1 , covariance $\rho$. Show that $\mathbb{E} \max \left\{X^{2}, Y^{2}\right\} \leqslant 1+\sqrt{1-\rho^{2}}$.

(c) Let $X_{1}, X_{2}, X_{3}$ be discrete random variables. Writing $a_{i j}=\mathbb{P}\left(X_{i}>X_{j}\right)$, show that $\min \left\{a_{12}, a_{23}, a_{31}\right\} \leqslant \frac{2}{3}$.

• # Paper 2, Section II, F

For a symmetric simple random walk $\left(X_{n}\right)$ on $\mathbb{Z}$ starting at 0 , let $M_{n}=\max _{i \leqslant n} X_{i}$.

(i) For $m \geqslant 0$ and $x \in \mathbb{Z}$, show that

$\mathbb{P}\left(M_{n} \geqslant m, X_{n}=x\right)= \begin{cases}\mathbb{P}\left(X_{n}=x\right) & \text { if } x \geqslant m \\ \mathbb{P}\left(X_{n}=2 m-x\right) & \text { if } x<m\end{cases}$

(ii) For $m \geqslant 0$, show that $\mathbb{P}\left(M_{n} \geqslant m\right)=\mathbb{P}\left(X_{n}=m\right)+2 \sum_{x>m} \mathbb{P}\left(X_{n}=x\right)$ and that

$\mathbb{P}\left(M_{n}=m\right)=\mathbb{P}\left(X_{n}=m\right)+\mathbb{P}\left(X_{n}=m+1\right)$

(iii) Prove that $\mathbb{E}\left(M_{n}^{2}\right)<\mathbb{E}\left(X_{n}^{2}\right)$.

• # Paper 2, Section II, F

(a) Consider a Galton-Watson process $\left(X_{n}\right)$. Prove that the extinction probability $q$ is the smallest non-negative solution of the equation $q=F(q)$ where $F(t)=\mathbb{E}\left(t^{X_{1}}\right)$. [You should prove any properties of Galton-Watson processes that you use.]

In the case of a Galton-Watson process with

$\mathbb{P}\left(X_{1}=1\right)=1 / 4, \quad \mathbb{P}\left(X_{1}=3\right)=3 / 4$

find the mean population size and compute the extinction probability.

(b) For each $n \in \mathbb{N}$, let $Y_{n}$ be a random variable with distribution $\operatorname{Poisson}(n)$. Show that

$\frac{Y_{n}-n}{\sqrt{n}} \rightarrow Z$

in distribution, where $Z$ is a standard normal random variable.

Deduce that

$\lim _{n \rightarrow \infty} e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k !}=\frac{1}{2}$
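The final limit can be checked numerically (editorial; the terms are handled in log space so that $e^{-n}$ and $n^{k} / k!$ do not underflow or overflow separately):

```python
import math

def partial(n):
    """e^{-n} * sum_{k=0}^{n} n^k / k!, computed stably in log space."""
    logs = [0.0]              # log of n^k / k!, starting at k = 0
    lt = 0.0
    for k in range(1, n + 1):
        lt += math.log(n) - math.log(k)
        logs.append(lt)
    m = max(logs)
    return math.exp(m - n) * sum(math.exp(l - m) for l in logs)

# convergence to 1/2 is O(1/sqrt(n)), so expect slow but visible progress
assert abs(partial(4000) - 0.5) < 0.01
assert abs(partial(4000) - 0.5) < abs(partial(10) - 0.5)
```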

• # Paper 2, Section II, F

(a) Let $Y$ and $Z$ be independent discrete random variables taking values in sets $S_{1}$ and $S_{2}$ respectively, and let $F: S_{1} \times S_{2} \rightarrow \mathbb{R}$ be a function.

Let $E(z)=\mathbb{E} F(Y, z)$. Show that

$\mathbb{E} E(Z)=\mathbb{E} F(Y, Z) .$

Let $V(z)=\mathbb{E}\left(F(Y, z)^{2}\right)-(\mathbb{E} F(Y, z))^{2}$. Show that

$\operatorname{Var} F(Y, Z)=\mathbb{E} V(Z)+\operatorname{Var} E(Z)$

(b) Let $X_{1}, \ldots, X_{n}$ be independent Bernoulli $(p)$ random variables. For any function $F:\{0,1\} \rightarrow \mathbb{R}$, show that

$\operatorname{Var} F\left(X_{1}\right)=p(1-p)(F(1)-F(0))^{2}$

Let $\{0,1\}^{n}$ denote the set of all $0-1$ sequences of length $n$. By induction, or otherwise, show that for any function $F:\{0,1\}^{n} \rightarrow \mathbb{R}$,

$\operatorname{Var} F(X) \leqslant p(1-p) \sum_{i=1}^{n} \mathbb{E}\left(\left(F(X)-F\left(X^{i}\right)\right)^{2}\right)$

where $X=\left(X_{1}, \ldots, X_{n}\right)$ and $X^{i}=\left(X_{1}, \ldots, X_{i-1}, 1-X_{i}, X_{i+1}, \ldots, X_{n}\right)$.


• # Paper 2, Section I, F

Let $X$ and $Y$ be real-valued random variables with joint density function

$f(x, y)= \begin{cases}x e^{-x(y+1)} & \text { if } x \geqslant 0 \text { and } y \geqslant 0 \\ 0 & \text { otherwise. }\end{cases}$

(i) Find the conditional probability density function of $Y$ given $X$.

(ii) Find the expectation of $Y$ given $X$.

• # Paper 2, Section I, F

Let $X$ be a non-negative integer-valued random variable such that $0<\mathbb{E}\left(X^{2}\right)<\infty$.

Prove that

$\frac{\mathbb{E}(X)^{2}}{\mathbb{E}\left(X^{2}\right)} \leqslant \mathbb{P}(X>0) \leqslant \mathbb{E}(X)$

[You may use any standard inequality.]

• # Paper 2, Section II, 10F

(a) For any random variable $X$ and $\lambda>0$ and $t>0$, show that

$\mathbb{P}(X>t) \leqslant \mathbb{E}\left(e^{\lambda X}\right) e^{-\lambda t}$

For a standard normal random variable $X$, compute $\mathbb{E}\left(e^{\lambda X}\right)$ and deduce that

$\mathbb{P}(X>t) \leqslant e^{-\frac{1}{2} t^{2}}$

(b) Let $\mu, \lambda>0, \mu \neq \lambda$. For independent random variables $X$ and $Y$ with distributions $\operatorname{Exp}(\lambda)$ and $\operatorname{Exp}(\mu)$, respectively, compute the probability density functions of $X+Y$ and $\min \{X, Y\}$.

• # Paper 2, Section II, 12F

(a) Let $k \in\{1,2, \ldots\}$. For $j \in\{0, \ldots, k+1\}$, let $D_{j}$ be the first time at which a simple symmetric random walk on $\mathbb{Z}$ with initial position $j$ at time 0 hits 0 or $k+1$. Show $\mathbb{E}\left(D_{j}\right)=j(k+1-j)$. [If you use a recursion relation, you do not need to prove that its solution is unique.]

(b) Let $\left(S_{n}\right)$ be a simple symmetric random walk on $\mathbb{Z}$ starting at 0 at time $n=0$. For $k \in\{1,2, \ldots\}$, let $T_{k}$ be the first time at which $\left(S_{n}\right)$ has visited $k$ distinct vertices. In particular, $T_{1}=0$. Show $\mathbb{E}\left(T_{k+1}-T_{k}\right)=k$ for $k \geqslant 1$. [You may use without proof that, conditional on $S_{T_{k}}=i$, the random variables $\left(S_{T_{k}+n}\right)_{n \geqslant 0}$ have the distribution of a simple symmetric random walk starting at $i$.]

(c) For $n \geqslant 3$, let $\mathbb{Z}_{n}$ be the circle graph consisting of vertices $0, \ldots, n-1$ and edges between $k$ and $k+1$ where $n$ is identified with 0 . Let $\left(Y_{i}\right)$ be a simple random walk on $\mathbb{Z}_{n}$ starting at time 0 from 0 . Thus $Y_{0}=0$ and conditional on $Y_{i}$ the random variable $Y_{i+1}$ is $Y_{i} \pm 1$ with equal probability (identifying $k+n$ with $k$ ).

The cover time $T$ of the simple random walk on $\mathbb{Z}_{n}$ is the first time at which the random walk has visited all vertices. Show that $\mathbb{E}(T)=n(n-1) / 2$.
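A Monte Carlo check of the final formula $\mathbb{E}(T)=n(n-1) / 2$ (editorial sketch):

```python
import random

random.seed(1)

def cover_time(n):
    """Steps for a simple random walk on the n-cycle to visit every vertex."""
    pos, seen, t = 0, {0}, 0
    while len(seen) < n:
        pos = (pos + random.choice((-1, 1))) % n
        seen.add(pos)
        t += 1
    return t

n, trials = 10, 20_000
avg = sum(cover_time(n) for _ in range(trials)) / trials
assert abs(avg - n * (n - 1) / 2) < 2   # target E(T) = 45
```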

• # Paper 2, Section II, F

Let $\beta>0$. The Curie-Weiss Model of ferromagnetism is the probability distribution defined as follows. For $n \in \mathbb{N}$, define random variables $S_{1}, \ldots, S_{n}$ with values in $\{\pm 1\}$ such that the probabilities are given by

$\mathbb{P}\left(S_{1}=s_{1}, \ldots, S_{n}=s_{n}\right)=\frac{1}{Z_{n, \beta}} \exp \left(\frac{\beta}{2 n} \sum_{i=1}^{n} \sum_{j=1}^{n} s_{i} s_{j}\right)$

where $Z_{n, \beta}$ is the normalisation constant

$Z_{n, \beta}=\sum_{s_{1} \in\{\pm 1\}} \cdots \sum_{s_{n} \in\{\pm 1\}} \exp \left(\frac{\beta}{2 n} \sum_{i=1}^{n} \sum_{j=1}^{n} s_{i} s_{j}\right)$

(a) Show that $\mathbb{E}\left(S_{i}\right)=0$ for any $i$.

(b) Show that $\mathbb{P}\left(S_{2}=+1 \mid S_{1}=+1\right) \geqslant \mathbb{P}\left(S_{2}=+1\right)$. [You may use $\mathbb{E}\left(S_{i} S_{j}\right) \geqslant 0$ for all $i, j$ without proof. ]

(c) Let $M=\frac{1}{n} \sum_{i=1}^{n} S_{i}$. Show that $M$ takes values in $E_{n}=\left\{-1+\frac{2 k}{n}: k=0, \ldots, n\right\}$, and that for each $m \in E_{n}$ the number of possible values of $\left(S_{1}, \ldots, S_{n}\right)$ such that $M=m$ is

$\frac{n !}{\left(\frac{1+m}{2} n\right) !\left(\frac{1-m}{2} n\right) !}$

Find $\mathbb{P}(M=m)$ for any $m \in E_{n}$.

• # Paper 2, Section II, F

For a positive integer $N, p \in[0,1]$, and $k \in\{0,1, \ldots, N\}$, let

$p_{k}(N, p)=\left(\begin{array}{c} N \\ k \end{array}\right) p^{k}(1-p)^{N-k}$

(a) For fixed $N$ and $p$, show that $p_{k}(N, p)$ is a probability mass function on $\{0,1, \ldots, N\}$ and that the corresponding probability distribution has mean $N p$ and variance $N p(1-p)$.

(b) Let $\lambda>0$. Show that, for any $k \in\{0,1,2, \ldots\}$,

$\lim _{N \rightarrow \infty} p_{k}(N, \lambda / N)=\frac{e^{-\lambda} \lambda^{k}}{k !}$

Show that the right-hand side of $(*)$ is a probability mass function on $\{0,1,2, \ldots\}$.

(c) Let $p \in(0,1)$ and let $a, b \in \mathbb{R}$ with $a<b$. For all $N$, find integers $k_{a}(N)$ and $k_{b}(N)$ such that

$\sum_{k=k_{a}(N)}^{k_{b}(N)} p_{k}(N, p) \rightarrow \frac{1}{\sqrt{2 \pi}} \int_{a}^{b} e^{-\frac{1}{2} x^{2}} d x \quad \text { as } N \rightarrow \infty$

[You may use the Central Limit Theorem.]


• # Paper 2, Section I, 4F

Define the moment-generating function $m_{Z}$ of a random variable $Z$. Let $X_{1}, \ldots, X_{n}$ be independent and identically distributed random variables with distribution $\mathcal{N}(0,1)$, and let $Z=X_{1}^{2}+\cdots+X_{n}^{2}$. For $\theta<1 / 2$, show that

$m_{Z}(\theta)=(1-2 \theta)^{-n / 2} .$

• # Paper 2, Section I, F

Let $X_{1}, \ldots, X_{n}$ be independent random variables, all with uniform distribution on $[0,1]$. What is the probability of the event $\left\{X_{1}>X_{2}>\cdots>X_{n-1}>X_{n}\right\}$ ?
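By exchangeability all $n!$ orderings of $\left(X_{1}, \ldots, X_{n}\right)$ are equally likely (ties occur with probability 0), so the answer should be $1 / n!$; a Monte Carlo check (editorial):

```python
import random

random.seed(2)
n, trials = 4, 240_000
hits = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    hits += all(xs[i] > xs[i + 1] for i in range(n - 1))

# conjectured answer 1/4! = 1/24
assert abs(hits / trials - 1 / 24) < 0.005
```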

• # Paper 2, Section II, F

A random graph with $n$ nodes $v_{1}, \ldots, v_{n}$ is drawn by placing an edge with probability $p$ between $v_{i}$ and $v_{j}$ for all distinct $i$ and $j$, independently. A triangle is a set of three distinct nodes $v_{i}, v_{j}, v_{k}$ that are all connected: there are edges between $v_{i}$ and $v_{j}$, between $v_{j}$ and $v_{k}$ and between $v_{i}$ and $v_{k}$.

(a) Let $T$ be the number of triangles in this random graph. Compute the maximum value and the expectation of $T$.

(b) State the Markov inequality. Show that if $p=1 / n^{\alpha}$, for some $\alpha>1$, then $\mathbb{P}(T=0) \rightarrow 1$ when $n \rightarrow \infty$.

(c) State the Chebyshev inequality. Show that if $p$ is such that $\operatorname{Var}[T] / \mathbb{E}[T]^{2} \rightarrow 0$ when $n \rightarrow \infty$, then $\mathbb{P}(T=0) \rightarrow 0$ when $n \rightarrow \infty$.

• # Paper 2, Section II, F

Let $X$ be a non-negative random variable such that $\mathbb{E}\left[X^{2}\right]>0$ is finite, and let $\theta \in[0,1]$.

(a) Show that

$\mathbb{E}\left[X \mathbb{I}\{X>\theta \mathbb{E}[X]\}\right] \geqslant(1-\theta) \mathbb{E}[X]$

(b) Let $Y_{1}$ and $Y_{2}$ be random variables such that $\mathbb{E}\left[Y_{1}^{2}\right]$ and $\mathbb{E}\left[Y_{2}^{2}\right]$ are finite. State and prove the Cauchy-Schwarz inequality for these two variables.

(c) Show that

$\mathbb{P}(X>\theta \mathbb{E}[X]) \geqslant(1-\theta)^{2} \frac{\mathbb{E}[X]^{2}}{\mathbb{E}\left[X^{2}\right]}$

• # Paper 2, Section II, F

We randomly place $n$ balls in $m$ bins independently and uniformly. For each $i$ with $1 \leqslant i \leqslant m$, let $B_{i}$ be the number of balls in bin $i$.

(a) What is the distribution of $B_{i}$ ? For $i \neq j$, are $B_{i}$ and $B_{j}$ independent?

(b) Let $E$ be the number of empty bins, $C$ the number of bins with two or more balls, and $S$ the number of bins with exactly one ball. What are the expectations of $E, C$ and $S$ ?

(c) Let $m=a n$, for an integer $a \geqslant 2$. What is $\mathbb{P}(E=0)$ ? What is the limit of $\mathbb{E}[E] / m$ when $n \rightarrow \infty$ ?

(d) Instead, let $n=d m$, for an integer $d \geqslant 2$. What is $\mathbb{P}(C=0)$ ? What is the limit of $\mathbb{E}[C] / m$ when $n \rightarrow \infty$ ?

• # Paper 2, Section II, F

For any positive integer $n$ and positive real number $\theta$, the Gamma distribution $\Gamma(n, \theta)$ has density $f_{\Gamma}$ defined on $(0, \infty)$ by

$f_{\Gamma}(x)=\frac{\theta^{n}}{(n-1) !} x^{n-1} e^{-\theta x} .$

For any positive integers $a$ and $b$, the Beta distribution $B(a, b)$ has density $f_{B}$ defined on $(0,1)$ by

$f_{B}(x)=\frac{(a+b-1) !}{(a-1) !(b-1) !} x^{a-1}(1-x)^{b-1}$

Let $X$ and $Y$ be independent random variables with respective distributions $\Gamma(n, \theta)$ and $\Gamma(m, \theta)$. Show that the random variables $X /(X+Y)$ and $X+Y$ are independent and give their distributions.


• # Paper 2, Section I, F

Let $A, B$ be events in the sample space $\Omega$ such that $0<P(A)<1$ and $0<P(B)<1$. The event $B$ is said to attract $A$ if the conditional probability $P(A \mid B)$ is greater than $P(A)$; otherwise it is said that $B$ repels $A$. Show that if $B$ attracts $A$, then $A$ attracts $B$. Does $B^{c}=\Omega \backslash B$ repel $A$?

• # Paper 2, Section I, F

Let $U$ be a uniform random variable on $(0,1)$, and let $\lambda>0$.

(a) Find the distribution of the random variable $-(\log U) / \lambda$.

(b) Define a new random variable $X$ as follows: suppose a fair coin is tossed, and if it lands heads we set $X=U^{2}$ whereas if it lands tails we set $X=1-U^{2}$. Find the probability density function of $X$.

• # Paper 2, Section II, F

When coin $A$ is tossed it comes up heads with probability $\frac{1}{4}$, whereas coin $B$ comes up heads with probability $\frac{3}{4}$. Suppose one of these coins is randomly chosen and is tossed twice. If both tosses come up heads, what is the probability that coin $B$ was tossed? Justify your answer.

In each draw of a lottery, an integer is picked independently at random from the first $n$ integers $1,2, \ldots, n$, with replacement. What is the probability that in a sample of $r$ successive draws the numbers are drawn in a non-decreasing sequence? Justify your answer.
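A brute-force check of the lottery count (editorial; it assumes the standard answer: non-decreasing samples of length $r$ from $\{1, \ldots, n\}$ correspond to multisets, of which there are $\binom{n+r-1}{r}$, giving probability $\binom{n+r-1}{r} / n^{r}$):

```python
from itertools import product
from math import comb

n, r = 5, 4
count = sum(1 for seq in product(range(1, n + 1), repeat=r)
            if all(a <= b for a, b in zip(seq, seq[1:])))

assert count == comb(n + r - 1, r)   # 70 of the 5^4 = 625 sequences
```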

• # Paper 2, Section II, F

State and prove Markov's inequality and Chebyshev's inequality, and deduce the weak law of large numbers.

If $X$ is a random variable with mean zero and finite variance $\sigma^{2}$, prove that for any $a>0$,

$P(X \geqslant a) \leqslant \frac{\sigma^{2}}{\sigma^{2}+a^{2}}$

[Hint: Show first that $P(X \geqslant a) \leqslant P\left((X+b)^{2} \geqslant(a+b)^{2}\right)$ for every $b>0$.]

• # Paper 2, Section II, F

Consider the function

$\phi(x)=\frac{1}{\sqrt{2 \pi}} e^{-x^{2} / 2}, \quad x \in \mathbb{R}$

Show that $\phi$ defines a probability density function. If a random variable $X$ has probability density function $\phi$, find the moment generating function of $X$, and find all moments $E\left[X^{k}\right]$, $k \in \mathbb{N}$.

Now define

$r(x)=\frac{P(X>x)}{\phi(x)}$

Show that for every $x>0$,

$\frac{1}{x}-\frac{1}{x^{3}}<r(x)<\frac{1}{x}$

• # Paper 2, Section II, F

Lionel and Cristiana have $a$ and $b$ million pounds, respectively, where $a, b \in \mathbb{N}$. They play a series of independent football games in each of which the winner receives one million pounds from the loser (a draw cannot occur). They stop when one player has lost his or her entire fortune. Lionel wins each game with probability $0<p<1$ and Cristiana wins with probability $q=1-p$, where $p \neq q$. Find the expected number of games before they stop playing.


• # Paper 2, Section I, F

Consider independent discrete random variables $X_{1}, \ldots, X_{n}$ and assume $E\left[X_{i}\right]$ exists for all $i=1, \ldots, n$.

Show that

$E\left[\prod_{i=1}^{n} X_{i}\right]=\prod_{i=1}^{n} E\left[X_{i}\right]$

If the $X_{1}, \ldots, X_{n}$ are also positive, show that

$\prod_{i=1}^{n} \sum_{m=0}^{\infty} P\left(X_{i}>m\right)=\sum_{m=0}^{\infty} P\left(\prod_{i=1}^{n} X_{i}>m\right)$

• # Paper 2, Section I, F

Consider a particle situated at the origin $(0,0)$ of $\mathbb{R}^{2}$. At successive times a direction is chosen independently by picking an angle uniformly at random in the interval $[0,2 \pi]$, and the particle then moves an Euclidean unit length in this direction. Find the expected squared Euclidean distance of the particle from the origin after $n$ such movements.

• # Paper 2, Section II, 9F

State the axioms of probability.

State and prove Boole's inequality.

Suppose you toss a sequence of coins, the $i$-th of which comes up heads with probability $p_{i}$, where $\sum_{i=1}^{\infty} p_{i}<\infty$. Calculate the probability of the event that infinitely many heads occur.

Suppose you repeatedly and independently roll a pair of fair dice and each time record the sum of the dice. What is the probability that an outcome of 5 appears before an outcome of 7 ? Justify your answer.

• # Paper 2, Section II, F

Give the definition of an exponential random variable $X$ with parameter $\lambda$. Show that $X$ is memoryless.

Now let $X, Y$ be independent exponential random variables, each with parameter $\lambda$. Find the probability density function of the random variable $Z=\min (X, Y)$ and the probability $P(X>Y)$.

Suppose the random variables $G_{1}, G_{2}$ are independent and each has probability density function given by

$f(y)=C^{-1} e^{-y} y^{-1 / 2}, \quad y>0, \quad \text { where } C=\int_{0}^{\infty} e^{-y} y^{-1 / 2} d y$

Find the probability density function of $G_{1}+G_{2}$. [You may use standard results without proof provided they are clearly stated.]

• # Paper 2, Section II, F

For any function $g: \mathbb{R} \rightarrow \mathbb{R}$ and random variables $X, Y$, the "tower property" of conditional expectations is

$E[g(X)]=E[E[g(X) \mid Y]] .$

Provide a proof of this property when both $X, Y$ are discrete.

Let $U_{1}, U_{2}, \ldots$ be a sequence of independent uniform $U(0,1)$-random variables. For $x \in[0,1]$ find the expected number of $U_{i}$ 's needed such that their sum exceeds $x$, that is, find $E[N(x)]$ where

$N(x)=\min \left\{n: \sum_{i=1}^{n} U_{i}>x\right\}$

[Hint: Write $E[N(x)]=E\left[E\left[N(x) \mid U_{1}\right]\right]$.]
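Conditioning on $U_1$ as in the hint leads to the integral equation $m(x)=1+\int_0^x m(u)\,du$, which suggests $E[N(x)]=e^x$. A seeded simulation sketch at the assumed point $x=1$, where the answer is $e$:

```python
import math
import random

random.seed(2)

def sample_N(x):
    """Number of independent U(0,1) draws needed for the running sum to exceed x."""
    total, n = 0.0, 0
    while total <= x:
        total += random.random()
        n += 1
    return n

trials = 200_000
estimate = sum(sample_N(1.0) for _ in range(trials)) / trials
# E[N(1)] = e ~ 2.71828
```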

comment
• # Paper 2, Section II, F

Define what it means for a random variable $X$ to have a Poisson distribution, and find its moment generating function.

Suppose $X, Y$ are independent Poisson random variables with parameters $\lambda, \mu$. Find the distribution of $X+Y$.

If $X_{1}, \ldots, X_{n}$ are independent Poisson random variables with parameter $\lambda=1$, find the distribution of $\sum_{i=1}^{n} X_{i}$. Hence or otherwise, find the limit of the real sequence

$a_{n}=e^{-n} \sum_{j=0}^{n} \frac{n^{j}}{j !}, \quad n \in \mathbb{N}$
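Since $\sum_{i=1}^{n} X_{i}$ is Poisson with parameter $n$, $a_n = P(\mathrm{Poisson}(n) \leqslant n)$, and the central limit theorem gives $a_n \to \tfrac12$. A numeric sketch, computed in log space to avoid overflow:

```python
import math

def a(n):
    """Compute a_n = e^{-n} * sum_{j=0}^{n} n^j / j!  in log space."""
    return sum(math.exp(j * math.log(n) - n - math.lgamma(j + 1))
               for j in range(n + 1))

values = [a(n) for n in (10, 100, 1000)]
# The sequence approaches 1/2 from above, with an O(n^{-1/2}) correction.
```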

[Standard results may be used without proof provided they are clearly stated.]

comment

• # Paper 2, Section I, F

(i) Let $X$ be a random variable. Use Markov's inequality to show that

$\mathbb{P}(X \geqslant k) \leqslant \mathbb{E}\left(e^{t X}\right) e^{-k t}$

for all $t \geqslant 0$ and real $k$.

(ii) Calculate $\mathbb{E}\left(e^{t X}\right)$ in the case where $X$ is a Poisson random variable with parameter $\lambda=1$. Using the inequality from part (i) with a suitable choice of $t$, prove that

$\frac{1}{k !}+\frac{1}{(k+1) !}+\frac{1}{(k+2) !}+\ldots \leqslant\left(\frac{e}{k}\right)^{k}$

for all $k \geqslant 1$.
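With $t=\log k$ the bound from part (i) gives $\mathbb{P}(X \geqslant k) \leqslant e^{k-1} k^{-k}$, and since $\mathbb{P}(X \geqslant k) = e^{-1} \sum_{j \geqslant k} 1/j!$, multiplying by $e$ yields the stated inequality. A numeric sketch checking it for small $k$ (the truncation at 60 terms is an assumption, harmless because the series converges so fast):

```python
import math

def tail(k, terms=60):
    """Truncated series for sum_{j >= k} 1/j! (converges extremely fast)."""
    return sum(1.0 / math.factorial(j) for j in range(k, k + terms))

bounds = [(tail(k), (math.e / k) ** k) for k in range(1, 16)]
all_hold = all(lhs <= rhs for lhs, rhs in bounds)
```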

comment
• # Paper 2, Section I, F

Let $X$ be a random variable with mean $\mu$ and variance $\sigma^{2}$. Let

$G(a)=\mathbb{E}\left[(X-a)^{2}\right]$

Show that $G(a) \geqslant \sigma^{2}$ for all $a$. For what value of $a$ is there equality?

Let

$H(a)=\mathbb{E}[|X-a|]$

Supposing that $X$ has probability density function $f$, express $H(a)$ in terms of $f$. Show that $H$ is minimised when $a$ is such that $\int_{-\infty}^{a} f(x) d x=1 / 2$.
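Expanding the square gives $G(a)=\sigma^2+(a-\mu)^2$, which is minimised at $a=\mu$. A sketch verifying the identity on a small assumed discrete distribution:

```python
# A small assumed discrete distribution (values and probabilities).
values = [1.0, 2.0, 6.0]
probs = [0.5, 0.25, 0.25]

mu = sum(v * p for v, p in zip(values, probs))
var = sum((v - mu) ** 2 * p for v, p in zip(values, probs))

def G(a):
    """G(a) = E[(X - a)^2]."""
    return sum((v - a) ** 2 * p for v, p in zip(values, probs))

# Check G(a) = var + (a - mu)^2 at a few points; the minimum is G(mu) = var.
deviations = [abs(G(a) - (var + (a - mu) ** 2)) for a in (-1.0, 0.0, 2.5, 4.0)]
```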

comment
• # Paper 2, Section II, F

Let $\Omega$ be the sample space of a probabilistic experiment, and suppose that the sets $B_{1}, B_{2}, \ldots, B_{k}$ are a partition of $\Omega$ into events of positive probability. Show that

$\mathbb{P}\left(B_{i} \mid A\right)=\frac{\mathbb{P}\left(A \mid B_{i}\right) \mathbb{P}\left(B_{i}\right)}{\sum_{j=1}^{k} \mathbb{P}\left(A \mid B_{j}\right) \mathbb{P}\left(B_{j}\right)}$

for any event $A$ of positive probability.

A drawer contains two coins. One is an unbiased coin, which when tossed, is equally likely to turn up heads or tails. The other is a biased coin, which will turn up heads with probability $p$ and tails with probability $1-p$. One coin is selected (uniformly) at random from the drawer. Two experiments are performed:

(a) The selected coin is tossed $n$ times. Given that the coin turns up heads $k$ times and tails $n-k$ times, what is the probability that the coin is biased?

(b) The selected coin is tossed repeatedly until it turns up heads $k$ times. Given that the coin is tossed $n$ times in total, what is the probability that the coin is biased?
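In both experiments the combinatorial factors (binomial in (a), negative binomial in (b)) cancel from Bayes' formula, so the two posteriors coincide: each depends only on $k$ heads and $n-k$ tails having occurred. A sketch checking this exactly (the value $p=2/3$ is an assumed example):

```python
from fractions import Fraction
from math import comb

def posterior_a(n, k, p):
    """Experiment (a): coin tossed n times, k heads observed."""
    like_biased = comb(n, k) * p**k * (1 - p)**(n - k)
    like_fair = comb(n, k) * Fraction(1, 2)**n
    return like_biased / (like_biased + like_fair)

def posterior_b(n, k, p):
    """Experiment (b): tossed until the k-th head, which lands on toss n."""
    like_biased = comb(n - 1, k - 1) * p**k * (1 - p)**(n - k)
    like_fair = comb(n - 1, k - 1) * Fraction(1, 2)**n
    return like_biased / (like_biased + like_fair)

p = Fraction(2, 3)
same = all(posterior_a(n, k, p) == posterior_b(n, k, p)
           for n in range(1, 8) for k in range(1, n + 1))
```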

comment
• # Paper 2, Section II, F

Let $X$ be a geometric random variable with $\mathbb{P}(X=1)=p$. Derive formulae for $\mathbb{E}(X)$ and $\operatorname{Var}(X)$ in terms of $p$.

A jar contains $n$ balls. Initially, all of the balls are red. Every minute, a ball is drawn at random from the jar, and then replaced with a green ball. Let $T$ be the number of minutes until the jar contains only green balls. Show that the expected value of $T$ is $n \sum_{i=1}^{n} 1 / i$. What is the variance of $T ?$
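With $i$ green balls in the jar, the time to the next red draw is geometric with parameter $(n-i)/n$, so $\mathbb{E}(T)=\sum_{i=0}^{n-1} n/(n-i) = n \sum_{i=1}^{n} 1/i$. A seeded simulation sketch at the assumed size $n=5$:

```python
import random

random.seed(3)

n = 5
exact_mean = n * sum(1.0 / i for i in range(1, n + 1))  # n * H_n

def draw_until_all_green(n):
    """Simulate the jar: number of draws until every ball is green."""
    green = 0
    t = 0
    while green < n:
        t += 1
        if random.randrange(n) >= green:  # positions green..n-1 are still red
            green += 1
    return t

trials = 100_000
estimate = sum(draw_until_all_green(n) for _ in range(trials)) / trials
```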

comment
• # Paper 2, Section II, F

Let $X$ be a random variable taking values in the non-negative integers, and let $G$ be the probability generating function of $X$. Assuming $G$ is everywhere finite, show that

$G^{\prime}(1)=\mu \text { and } G^{\prime \prime}(1)=\sigma^{2}+\mu^{2}-\mu$

where $\mu$ is the mean of $X$ and $\sigma^{2}$ is its variance. [You may interchange differentiation and expectation without justification.]

Consider a branching process where individuals produce independent random numbers of offspring with the same distribution as $X$. Let $X_{n}$ be the number of individuals in the $n$-th generation, and let $G_{n}$ be the probability generating function of $X_{n}$. Explain carefully why

$G_{n+1}(t)=G_{n}(G(t))$

Assuming $X_{0}=1$, compute the mean of $X_{n}$. Show that

$\operatorname{Var}\left(X_{n}\right)=\sigma^{2} \frac{\mu^{n-1}\left(\mu^{n}-1\right)}{\mu-1}$

Suppose $\mathbb{P}(X=0)=3 / 7$ and $\mathbb{P}(X=3)=4 / 7$. Compute the probability that the population will eventually become extinct. You may use standard results on branching processes as long as they are clearly stated.
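Here $G(t)=\frac{3}{7}+\frac{4}{7}t^{3}$, and the extinction probability is the smallest non-negative root of $G(q)=q$; factoring $4q^3-7q+3=(q-1)(4q^2+4q-3)$ gives $q=\frac12$. A sketch confirming the fixed point exactly and recovering it as $\lim_n G_n(0)$:

```python
from fractions import Fraction

def G(t):
    """Offspring pgf for P(X=0) = 3/7, P(X=3) = 4/7."""
    return Fraction(3, 7) + Fraction(4, 7) * t**3

# q = 1/2 solves G(q) = q exactly.
is_fixed_point = (G(Fraction(1, 2)) == Fraction(1, 2))

# Extinction probability = lim_n G_n(0): iterate in floating point.
x = 0.0
for _ in range(200):
    x = 3.0 / 7.0 + 4.0 / 7.0 * x**3
```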

comment
• # Paper 2, Section II, F

Let $Z$ be an exponential random variable with parameter $\lambda=1$. Show that

$\mathbb{P}(Z>s+t \mid Z>s)=\mathbb{P}(Z>t)$

for any $s, t \geqslant 0$.

Let $Z_{\mathrm{int}}=\lfloor Z\rfloor$ be the greatest integer less than or equal to $Z$. What is the probability mass function of $Z_{\mathrm{int}}$? Show that $\mathbb{E}\left(Z_{\mathrm{int}}\right)=\frac{1}{e-1}$.

Let $Z_{\mathrm{frac}}=Z-Z_{\mathrm{int}}$ be the fractional part of $Z$. What is the density of $Z_{\mathrm{frac}}$ ?

Show that $Z_{\mathrm{int}}$ and $Z_{\mathrm{frac}}$ are independent.
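$Z_{\mathrm{int}}$ is geometric on $\{0,1,2,\dots\}$ with $\mathbb{P}(Z_{\mathrm{int}}=k)=e^{-k}(1-e^{-1})$, and summing the series for its mean gives $1/(e-1)$. A numeric sketch of the truncated series:

```python
import math

q = math.exp(-1.0)  # P(Z_int = k) = (1 - q) * q**k for k = 0, 1, 2, ...

# Mean of this geometric distribution via a (rapidly converging) series.
series_mean = sum(k * (1.0 - q) * q**k for k in range(200))
closed_form = 1.0 / (math.e - 1.0)
```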

comment

• # Paper 2, Section I, F

Define the probability generating function $G(s)$ of a random variable $X$ taking values in the non-negative integers.

A coin shows heads with probability $p \in(0,1)$ on each toss. Let $N$ be the number of tosses up to and including the first appearance of heads, and let $k \geqslant 1$. Find the probability generating function of $X=\min \{N, k\}$.

Show that $E(X)=p^{-1}\left(1-q^{k}\right)$ where $q=1-p$.
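A convenient route to the mean is the tail-sum formula: $E(X)=\sum_{m=0}^{k-1} P(N>m)=\sum_{m=0}^{k-1} q^{m}=(1-q^{k})/p$. A sketch checking this exactly for the assumed values $p=1/3$, $k=4$:

```python
from fractions import Fraction

p = Fraction(1, 3)  # assumed example value
q = 1 - p
k = 4

# Direct enumeration: X = N on {N < k}, and X = k otherwise (P(N >= k) = q^{k-1}).
direct = sum(n * p * q**(n - 1) for n in range(1, k)) + k * q**(k - 1)

# Tail-sum formula.
closed_form = (1 - q**k) / p
```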

comment
• # Paper 2, Section I, F

Given two events $A$ and $B$ with $P(A)>0$ and $P(B)>0$, define the conditional probability $P(A \mid B)$.

Show that

$P(B \mid A)=P(A \mid B) \frac{P(B)}{P(A)}$

A random number $N$ of fair coins are tossed, and the total number of heads is denoted by $H$. If $P(N=n)=2^{-n}$ for $n=1,2, \ldots$, find $P(N=n \mid H=1)$.
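Here $P(N=n, H=1)=2^{-n} \cdot n 2^{-n} = n 4^{-n}$, and the series $\sum_n n x^n = x/(1-x)^2$ gives $P(H=1)=4/9$, so the posterior is $P(N=n \mid H=1) = \frac{9}{4} n 4^{-n}$. A sketch with exact arithmetic (truncating the convergent series):

```python
from fractions import Fraction

def joint(n):
    """P(N = n, H = 1) = 2^{-n} * (n * 2^{-n}) = n / 4^n."""
    return Fraction(n, 4**n)

# P(H = 1) = sum_n n/4^n = (1/4) / (1 - 1/4)^2 = 4/9 (truncation suffices).
total = sum(joint(n) for n in range(1, 200))

# Posterior for N = 1: joint / P(H = 1).
posterior_1 = joint(1) / Fraction(4, 9)
```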

comment
• # Paper 2, Section II, F

Let $X, Y$ be independent random variables with distribution functions $F_{X}, F_{Y}$. Show that $U=\min \{X, Y\}, V=\max \{X, Y\}$ have distribution functions

$F_{U}(u)=1-\left(1-F_{X}(u)\right)\left(1-F_{Y}(u)\right), \quad F_{V}(v)=F_{X}(v) F_{Y}(v)$

Now let $X, Y$ be independent random variables, each having the exponential distribution with parameter 1. Show that $U$ has the exponential distribution with parameter 2 , and that $V-U$ is independent of $U$.

Hence or otherwise show that $V$ has the same distribution as $X+\frac{1}{2} Y$, and deduce the mean and variance of $V$.

[You may use without proof that $X$ has mean 1 and variance 1.]
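Both $V$ and $X+\frac12 Y$ have mean $\frac12 + 1 = \frac32$ (and variance $\frac14 + 1 = \frac54$), by the decomposition $V = U + (V-U)$ with the two parts independent. A seeded simulation sketch comparing the two means:

```python
import random

random.seed(4)

trials = 200_000
# max of two independent Exp(1) variables
v = [max(random.expovariate(1.0), random.expovariate(1.0))
     for _ in range(trials)]
# X + Y/2 with X, Y independent Exp(1)
w = [random.expovariate(1.0) + 0.5 * random.expovariate(1.0)
     for _ in range(trials)]

mean_v = sum(v) / trials
mean_w = sum(w) / trials  # both should be close to 3/2
```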

comment
• # Paper 2, Section II, F

(i) Let $X_{n}$ be the size of the $n^{\text{th}}$ generation of a branching process with family-size probability generating function $G(s)$, and let $X_{0}=1$. Show that the probability generating function $G_{n}(s)$ of $X_{n}$ satisfies $G_{n+1}(s)=G\left(G_{n}(s)\right)$ for $n \geqslant 0$.

(ii) Suppose the family-size mass function is $P\left(X_{1}=k\right)=2^{-k-1}, k=0,1,2, \ldots$ Find $G(s)$, and show that

$G_{n}(s)=\frac{n-(n-1) s}{n+1-n s} \quad \text { for }|s|<1+\frac{1}{n} .$

Deduce the value of $P\left(X_{n}=0\right)$.
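The family-size pgf is $G(s)=\sum_{k \geqslant 0} 2^{-k-1} s^{k} = 1/(2-s)$, and iterating it reproduces the closed form, with $P(X_n = 0) = G_n(0) = n/(n+1)$. A numeric sketch (the test point $s=0.3$ is an assumed example):

```python
def G(s):
    """Family-size pgf for P(X_1 = k) = 2^{-k-1}: G(s) = 1/(2 - s)."""
    return 1.0 / (2.0 - s)

def G_n_closed(n, s):
    """Claimed closed form for the n-fold iterate G_n(s)."""
    return (n - (n - 1) * s) / (n + 1 - n * s)

s = 0.3
iterate = s
errors = []
for n in range(1, 11):
    iterate = G(iterate)  # after n applications this is G_n(s)
    errors.append(abs(iterate - G_n_closed(n, s)))

extinction = [G_n_closed(n, 0.0) for n in range(1, 6)]  # P(X_n = 0) = n/(n+1)
```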