• # Paper 2, Section I, A

Let $y_{1}$ and $y_{2}$ be two linearly independent solutions to the differential equation

$\frac{\mathrm{d}^{2} y}{\mathrm{~d} x^{2}}+p(x) \frac{\mathrm{d} y}{\mathrm{~d} x}+q(x) y=0 .$

Show that the Wronskian $W=y_{1} y_{2}^{\prime}-y_{2} y_{1}^{\prime}$ satisfies

$\frac{\mathrm{d} W}{\mathrm{~d} x}+p W=0 .$

Deduce that if $y_{2}\left(x_{0}\right)=0$ then

$y_{2}(x)=y_{1}(x) \int_{x_{0}}^{x} \frac{W(t)}{y_{1}(t)^{2}} \mathrm{~d} t .$

Given that $y_{1}(x)=x^{3}$ satisfies the equation

$x^{2} \frac{\mathrm{d}^{2} y}{\mathrm{~d} x^{2}}-x \frac{\mathrm{d} y}{\mathrm{~d} x}-3 y=0$

find the solution which satisfies $y(1)=0$ and $y^{\prime}(1)=1$.
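A quick numerical sanity check (not part of the exam answer): trying $y=x^m$ in the Euler equation gives $m^2-2m-3=0$, so $m=3$ or $m=-1$, and the combination below satisfies both initial conditions.

```python
# Candidate solution y = (x^3 - 1/x)/4 for x^2 y'' - x y' - 3 y = 0
# with y(1) = 0, y'(1) = 1 (derived from the roots m = 3, m = -1).
def y(x):
    return 0.25 * (x**3 - 1.0 / x)

def dy(x):
    return 0.25 * (3 * x**2 + 1.0 / x**2)

def d2y(x):
    return 0.25 * (6 * x - 2.0 / x**3)

# Initial conditions and the ODE residual at sample points.
assert abs(y(1.0)) < 1e-12 and abs(dy(1.0) - 1.0) < 1e-12
for x in (0.5, 1.3, 2.7):
    residual = x**2 * d2y(x) - x * dy(x) - 3 * y(x)
    assert abs(residual) < 1e-9
```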

• # Paper 2, Section I, A

Solve the difference equation

$y_{n+2}-4 y_{n+1}+4 y_{n}=n$

subject to the initial conditions $y_{0}=1$ and $y_{1}=0$.
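A sketch check of a candidate closed form: the repeated root $r=2$ gives a homogeneous part $(A+Bn)2^n$, trying $y_n = an+b$ gives the particular solution $n+2$, and the initial data then force $y_n = (n+2)(1-2^{n-1})$.

```python
# Compare the candidate closed form against direct iteration of
# y_{n+2} = 4 y_{n+1} - 4 y_n + n, with y_0 = 1, y_1 = 0.
def closed_form(n):
    return (n + 2) * (2 - 2**n) // 2   # == (n + 2) * (1 - 2**(n - 1))

ys = [1, 0]                            # y_0 = 1, y_1 = 0
for n in range(18):
    ys.append(4 * ys[n + 1] - 4 * ys[n] + n)

assert all(ys[n] == closed_form(n) for n in range(20))
```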

• # Paper 2, Section II, A

By means of the change of variables $\eta=x-t$ and $\xi=x+t$, show that the wave equation for $u=u(x, t)$


$\frac{\partial^{2} u}{\partial x^{2}}-\frac{\partial^{2} u}{\partial t^{2}}=0 \qquad(*)$

is equivalent to the equation

$\frac{\partial^{2} U}{\partial \eta \partial \xi}=0$

where $U(\eta, \xi)=u(x, t)$. Hence show that the solution to $(*)$ on $x \in \mathbf{R}$ and $t>0$, subject to the initial conditions

$u(x, 0)=f(x), \quad \frac{\partial u}{\partial t}(x, 0)=g(x)$

is given by

$u(x, t)=\frac{1}{2}[f(x-t)+f(x+t)]+\frac{1}{2} \int_{x-t}^{x+t} g(y) \mathrm{d} y .$

Deduce that if $f(x)=0$ and $g(x)=0$ for $\left|x-x_{0}\right|>r$ then $u(x, t)=0$ for $\left|x-x_{0}\right|>r+t$.

Suppose now that $y=y(x, t)$ is a solution to the wave equation $(*)$ on the finite interval $0 \leqslant x \leqslant L$ and obeys the boundary conditions

$y(0, t)=y(L, t)=0$

for all $t$. The energy is defined by

$E(t)=\frac{1}{2} \int_{0}^{L}\left[\left(\frac{\partial y}{\partial x}\right)^{2}+\left(\frac{\partial y}{\partial t}\right)^{2}\right] \mathrm{d} x$

By considering $\mathrm{d} E / \mathrm{d} t$, or otherwise, show that the energy remains constant in time.
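A numerical illustration under an assumed concrete solution: the standing wave $y(x,t)=\sin(\pi x)\cos(\pi t)$ on $0 \leqslant x \leqslant 1$ satisfies the wave equation and the boundary conditions, so its energy should be constant in $t$.

```python
import math

# Midpoint-rule evaluation of E(t) = (1/2) int_0^1 (y_x^2 + y_t^2) dx
# for y(x, t) = sin(pi x) cos(pi t); E(t) should equal pi^2 / 4 for all t.
def energy(t, m=2000):
    total = 0.0
    for j in range(m):
        x = (j + 0.5) / m
        yx = math.pi * math.cos(math.pi * x) * math.cos(math.pi * t)
        yt = -math.pi * math.sin(math.pi * x) * math.sin(math.pi * t)
        total += 0.5 * (yx**2 + yt**2) / m
    return total

values = [energy(t) for t in (0.0, 0.3, 0.7, 1.2)]
assert max(values) - min(values) < 1e-6
```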

• # Paper 2, Section II, A

For a linear, second order differential equation define the terms ordinary point, singular point and regular singular point.

For $a, b \in \mathbb{R}$ and $b \notin \mathbb{Z}$ consider the following differential equation

$x \frac{\mathrm{d}^{2} y}{\mathrm{~d} x^{2}}+(b-x) \frac{\mathrm{d} y}{\mathrm{~d} x}-a y=0 . \qquad(*)$

Find coefficients $c_{m}(a, b)$ such that the function $y_{1}=M(x, a, b)$, where

$M(x, a, b)=\sum_{m=0}^{\infty} c_{m}(a, b) x^{m}$

satisfies $(*)$. By making the substitution $y=x^{1-b} u(x)$, or otherwise, find a second linearly independent solution of the form $y_{2}=x^{1-b} M(x, \alpha, \beta)$ for suitable $\alpha, \beta$.

Suppose now that $b=1$. By considering a limit of the form

$\lim _{b \rightarrow 1} \frac{y_{2}-y_{1}}{b-1}$

or otherwise, obtain two linearly independent solutions to $(*)$ in terms of $M$ and derivatives thereof.
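A sketch check of the assumed coefficients: substituting the power series into $(*)$ gives the recurrence $c_{m+1} = c_m (a+m)/((b+m)(m+1))$ with $c_0=1$ (Kummer's function $M$). The code below truncates the series and evaluates the residual of the differential operator.

```python
# Truncated Kummer series for x y'' + (b - x) y' - a y = 0, built from
# c_{m+1} = c_m (a + m) / ((b + m)(m + 1)), c_0 = 1; returns the partial
# sum and the ODE residual (small when the truncation error is small).
def M_and_residual(x, a, b, terms=40):
    c = 1.0
    y = yp = ypp = 0.0
    for m in range(terms):
        y += c * x**m
        if m >= 1:
            yp += m * c * x ** (m - 1)
        if m >= 2:
            ypp += m * (m - 1) * c * x ** (m - 2)
        c *= (a + m) / ((b + m) * (m + 1))
    return y, x * ypp + (b - x) * yp - a * y

for x in (0.1, 0.5, 1.0):
    _, res = M_and_residual(x, a=0.7, b=1.5)
    assert abs(res) < 1e-8
```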

• # Paper 2, Section II, A

For an $n \times n$ matrix $A$, define the matrix exponential by

$\exp (A)=\sum_{m=0}^{\infty} \frac{A^{m}}{m !}$

where $A^{0} \equiv I$, with $I$ being the $n \times n$ identity matrix. [You may assume that $\exp ((s+t) A)=\exp (s A) \exp (t A)$ for real numbers $s, t$ and you do not need to consider issues of convergence.] Show that

$\frac{\mathrm{d}}{\mathrm{d} t} \exp (t A)=A \exp (t A)$

Deduce that the unique solution to the initial value problem

$\frac{\mathrm{d} \mathbf{y}}{\mathrm{d} t}=A \mathbf{y}, \quad \mathbf{y}(0)=\mathbf{y}_{0}, \quad \text { where } \mathbf{y}(t)=\left(\begin{array}{c} y_{1}(t) \\ \vdots \\ y_{n}(t) \end{array}\right)$

is $\mathbf{y}(t)=\exp (t A) \mathbf{y}_{0}$.

Let $\mathbf{x}=\mathbf{x}(t)$ and $\mathbf{f}=\mathbf{f}(t)$ be vectors of length $n$ and $A$ a real $n \times n$ matrix. By considering a suitable integrating factor, show that the unique solution to

$\frac{\mathrm{d} \mathbf{x}}{\mathrm{d} t}-A \mathbf{x}=\mathbf{f}, \quad \mathbf{x}(0)=\mathbf{x}_{0} \qquad(*)$

is given by

$\mathbf{x}(t)=\exp (t A) \mathbf{x}_{0}+\int_{0}^{t} \exp [(t-s) A] \mathbf{f}(s) \mathrm{d} s$

Hence, or otherwise, solve the system of differential equations $(*)$ when

$A=\left(\begin{array}{ccc} 2 & 2 & -2 \\ 5 & 1 & -3 \\ 1 & 5 & -3 \end{array}\right), \quad \mathbf{f}(t)=\left(\begin{array}{c} \sin t \\ 3 \sin t \\ 0 \end{array}\right), \quad \mathbf{x}_{0}=\left(\begin{array}{l} 1 \\ 1 \\ 2 \end{array}\right)$

[Hint: Compute $A^{2}$ and show that $A^{3}=0$.]
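The hint can be checked directly: since $A^3=0$, the series for $\exp(tA)$ terminates at the quadratic term, so $\exp(tA) = I + tA + \tfrac{t^2}{2}A^2$ exactly. A stdlib-only sketch:

```python
# Verify A^3 = 0 for the given matrix, then spot-check the identity
# d/dt exp(tA) = A exp(tA) for the truncated exponential.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A = [[2, 2, -2], [5, 1, -3], [1, 5, -3]]
A2 = matmul(A, A)
A3 = matmul(A2, A)
assert all(A3[i][j] == 0 for i in range(3) for j in range(3))

def exp_tA(t):
    # exp(tA) = I + t A + (t^2 / 2) A^2, exact because A is nilpotent.
    return [[(1 if i == j else 0) + t * A[i][j] + 0.5 * t**2 * A2[i][j]
             for j in range(3)] for i in range(3)]

t, h = 0.7, 1e-6
num = [[(exp_tA(t + h)[i][j] - exp_tA(t - h)[i][j]) / (2 * h)
        for j in range(3)] for i in range(3)]
exact = matmul([[float(a) for a in row] for row in A], exp_tA(t))
assert all(abs(num[i][j] - exact[i][j]) < 1e-5
           for i in range(3) for j in range(3))
```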

• # Paper 2, Section II, A

The function $\theta=\theta(t)$ takes values in the interval $(-\pi, \pi]$ and satisfies the differential equation

$\frac{\mathrm{d}^{2} \theta}{\mathrm{d} t^{2}}+(\lambda-2 \mu) \sin \theta+\frac{2 \mu \sin \theta}{\sqrt{5+4 \cos \theta}}=0 \qquad(*)$

where $\lambda$ and $\mu$ are positive constants.

Let $\omega=\dot{\theta}$. Express $(*)$ in terms of a pair of first order differential equations in $(\theta, \omega)$. Show that if $3 \lambda<4 \mu$ then there are three fixed points in the region $0 \leqslant \theta \leqslant \pi .$

Classify all the fixed points of the system in the case $3 \lambda<4 \mu$. Sketch the phase portrait in the case $\lambda=1$ and $\mu=3 / 2$.

Comment briefly on the case when $3 \lambda>4 \mu$.


• # Paper 2, Section I, D

A coin has probability $p$ of landing heads. Let $q_{n}$ be the probability that the number of heads after $n$ tosses is even. Give an expression for $q_{n+1}$ in terms of $q_{n}$. Hence, or otherwise, find $q_{n}$.
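A sketch check of the natural answer: conditioning on the $(n+1)$th toss gives $q_{n+1} = (1-p)q_n + p(1-q_n)$ with $q_0 = 1$ (zero heads is even), whose solution is $q_n = \tfrac{1}{2}(1 + (1-2p)^n)$.

```python
# Iterate the recurrence and compare with the closed form at each step.
p = 0.3
q = 1.0                                  # q_0 = 1
for n in range(25):
    assert abs(q - 0.5 * (1 + (1 - 2 * p) ** n)) < 1e-12
    q = (1 - p) * q + p * (1 - q)        # q_{n+1}
```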

• # Paper 2, Section I, F

Let $X$ be a continuous random variable taking values in $[0, \sqrt{3}]$. Let the probability density function of $X$ be

$f_{X}(x)=\frac{c}{1+x^{2}}, \quad \text { for } x \in[0, \sqrt{3}]$

where $c$ is a constant.

Find the value of $c$ and calculate the mean, variance and median of $X$.

[Recall that the median of $X$ is the number $m$ such that $\mathbb{P}(X \leqslant m)=\frac{1}{2}$.]
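A numerical cross-check (not the exam derivation): since $\int_0^{\sqrt 3} \frac{dx}{1+x^2} = \arctan\sqrt 3 = \pi/3$, one expects $c = 3/\pi$, mean $3\ln 2/\pi$, and median $\tan(\pi/6) = 1/\sqrt 3$.

```python
import math

# Midpoint-rule integration of the density c / (1 + x^2) on [0, sqrt(3)].
c = 3 / math.pi

def integrate(g, lo, hi, m=200000):
    h = (hi - lo) / m
    return sum(g(lo + (j + 0.5) * h) for j in range(m)) * h

total = integrate(lambda x: c / (1 + x**2), 0, math.sqrt(3))
mean = integrate(lambda x: x * c / (1 + x**2), 0, math.sqrt(3))
half = integrate(lambda x: c / (1 + x**2), 0, 1 / math.sqrt(3))

assert abs(total - 1) < 1e-6                       # density integrates to 1
assert abs(mean - 3 * math.log(2) / math.pi) < 1e-6
assert abs(half - 0.5) < 1e-6                      # median at tan(pi/6)
```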

• # Paper 2, Section II, 10E

(a) Alanya repeatedly rolls a fair six-sided die. What is the probability that the first number she rolls is a 1, given that she rolls a 1 before she rolls a 6?

(b) Let $\left(X_{n}\right)_{n \geqslant 0}$ be a simple symmetric random walk on the integers starting at $x \in \mathbb{Z}$, that is,

$X_{n}=\left\{\begin{array}{cl} x & \text { if } n=0 \\ x+\sum_{i=1}^{n} Y_{i} & \text { if } n \geqslant 1 \end{array}\right.$

where $\left(Y_{n}\right)_{n \geqslant 1}$ is a sequence of IID random variables with $\mathbb{P}\left(Y_{n}=1\right)=\mathbb{P}\left(Y_{n}=-1\right)=\frac{1}{2}$. Let $T=\min \left\{n \geqslant 0: X_{n}=0\right\}$ be the time that the walk first hits 0 .

(i) Let $n$ be a positive integer. For $0<x<n$, calculate the probability that the walk hits 0 before it hits $n$.

(ii) Let $x=1$ and let $A$ be the event that the walk hits 0 before it hits 3 . Find $\mathbb{P}\left(X_{1}=0 \mid A\right)$. Hence find $\mathbb{E}(T \mid A)$.

(iii) Let $x=1$ and let $B$ be the event that the walk hits 0 before it hits 4 . Find $\mathbb{E}(T \mid B)$.
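A simulation sketch for part (i), with a fixed seed: the standard gambler's-ruin argument gives hitting probability $(n-x)/n$ for hitting 0 before $n$, starting from $0<x<n$.

```python
import random

# Monte Carlo estimate of P(hit 0 before n) for the simple symmetric walk.
random.seed(1)

def hits_zero_first(x, n):
    while 0 < x < n:
        x += random.choice((-1, 1))
    return x == 0

n, x, trials = 5, 2, 100000
est = sum(hits_zero_first(x, n) for _ in range(trials)) / trials
assert abs(est - (n - x) / n) < 0.01   # expected value 3/5
```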

• # Paper 2, Section II, 12F

State and prove Chebyshev's inequality.

Let $\left(X_{i}\right)_{i \geqslant 1}$ be a sequence of independent, identically distributed random variables such that

$\mathbb{P}\left(X_{i}=0\right)=p \text { and } \mathbb{P}\left(X_{i}=1\right)=1-p$

for some $p \in[0,1]$, and let $f:[0,1] \rightarrow \mathbb{R}$ be a continuous function.

(i) Prove that

$B_{n}(p):=\mathbb{E}\left(f\left(\frac{X_{1}+\cdots+X_{n}}{n}\right)\right)$

is a polynomial function of $p$, for any natural number $n$.

(ii) Let $\delta>0$. Prove that

$\sum_{k \in K_{\delta}}\left(\begin{array}{l} n \\ k \end{array}\right) p^{k}(1-p)^{n-k} \leqslant \frac{1}{4 n \delta^{2}}$

where $K_{\delta}$ is the set of natural numbers $0 \leqslant k \leqslant n$ such that $|k / n-p|>\delta$.

(iii) Show that

$\sup _{p \in[0,1]}\left|f(p)-B_{n}(p)\right| \rightarrow 0$

as $n \rightarrow \infty$. [You may use without proof that, for any $\epsilon>0$, there is a $\delta>0$ such that $|f(x)-f(y)| \leqslant \epsilon$ for all $x, y \in[0,1]$ with $|x-y| \leqslant \delta$.]
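The bound in (ii) can be checked exactly: Chebyshev with $p(1-p) \leqslant 1/4$ says the binomial tail mass on $\{k : |k/n - p| > \delta\}$ never exceeds $1/(4n\delta^2)$.

```python
from math import comb

# Exact binomial tail mass on {k : |k/n - p| > delta}, compared with
# the Chebyshev bound 1 / (4 n delta^2).
def tail(n, p, delta):
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n + 1) if abs(k / n - p) > delta)

for n in (10, 50, 200):
    for p in (0.1, 0.5, 0.8):
        for delta in (0.05, 0.1, 0.2):
            assert tail(n, p, delta) <= 1 / (4 * n * delta**2)
```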

• # Paper 2, Section II, 9E

(a) (i) Define the conditional probability $\mathbb{P}(A \mid B)$ of the event $A$ given the event $B$. Let $\left\{B_{j}: 1 \leqslant j \leqslant n\right\}$ be a partition of the sample space such that $\mathbb{P}\left(B_{j}\right)>0$ for all $j$. Show that, if $\mathbb{P}(A)>0$,

$\mathbb{P}\left(B_{j} \mid A\right)=\frac{\mathbb{P}\left(A \mid B_{j}\right) \mathbb{P}\left(B_{j}\right)}{\sum_{k=1}^{n} \mathbb{P}\left(A \mid B_{k}\right) \mathbb{P}\left(B_{k}\right)}$

(ii) There are $n$ urns, the $r$ th of which contains $r-1$ red balls and $n-r$ blue balls. Alice picks an urn (uniformly) at random and removes two balls without replacement. Find the probability that the first ball is blue, and the conditional probability that the second ball is blue, given that the first is blue. [You may assume, if you wish, that $\sum_{i=1}^{n-1} i(i-1)=\frac{1}{3} n(n-1)(n-2)$.]
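An exact arithmetic check of the natural answers to (a)(ii), for several values of $n$ (the labelling below follows the question: urn $r$ holds $r-1$ red and $n-r$ blue balls). The values to verify are $\mathbb{P}(\text{first blue}) = 1/2$ and $\mathbb{P}(\text{second blue} \mid \text{first blue}) = 2/3$, independently of $n$.

```python
from fractions import Fraction

# Sum over the uniformly chosen urn; exact rational arithmetic avoids
# any rounding questions.
def blue_probabilities(n):
    p_first = Fraction(0)
    p_both = Fraction(0)
    for r in range(1, n + 1):
        blue = n - r
        p_first += Fraction(1, n) * Fraction(blue, n - 1)
        p_both += Fraction(1, n) * Fraction(blue * (blue - 1),
                                            (n - 1) * (n - 2))
    return p_first, p_both / p_first

for n in (4, 7, 12):
    first, second_given_first = blue_probabilities(n)
    assert first == Fraction(1, 2)
    assert second_given_first == Fraction(2, 3)
```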

(b) (i) What is meant by saying that two events $A$ and $B$ are independent? Two fair (6-sided) dice are rolled. Let $A_{t}$ be the event that the sum of the numbers shown is $t$, and let $B_{i}$ be the event that the first die shows $i$. For what values of $t$ and $i$ are the two events $A_{t}$ and $B_{i}$ independent?

(ii) The casino at Monte Corona features the following game: three coins each show heads with probability $3 / 5$ and tails otherwise. The first counts 10 points for a head and 2 for a tail; the second counts 4 points for both a head and a tail; and the third counts 3 points for a head and 20 for a tail. You and your opponent each choose a coin. You cannot both choose the same coin. Each of you tosses your coin once and the person with the larger score wins the jackpot. Would you prefer to be the first or the second to choose a coin?

• # Paper 2, Section II, D

Let $\Delta$ be the disc of radius 1 with centre at the origin $O$. Let $P$ be a random point uniformly distributed in $\Delta$. Let $(R, \Theta)$ be the polar coordinates of $P$. Show that $R$ and $\Theta$ are independent and find their probability density functions $f_{R}$ and $f_{\Theta}$.

Let $A, B$ and $C$ be three random points selected independently and uniformly in $\Delta$. Find the expected area of triangle $O A B$ and hence find the probability that $C$ lies in the interior of triangle $O A B$.

Find the probability that $O, A, B$ and $C$ are the vertices of a convex quadrilateral.
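A Monte Carlo cross-check of the first expectation (not the exam derivation): the area of triangle $OAB$ is $\tfrac{1}{2} R_A R_B |\sin(\Theta_A - \Theta_B)|$, and with $R$ and $\Theta$ independent ($f_R(r)=2r$, $\Theta$ uniform) its mean should be $\tfrac{1}{2}\cdot\tfrac{2}{3}\cdot\tfrac{2}{3}\cdot\tfrac{2}{\pi} = \tfrac{4}{9\pi}$.

```python
import math, random

# Sample uniform points in the unit disc via inversion: R = sqrt(U)
# gives f_R(r) = 2r, and Theta is uniform on (-pi, pi].
random.seed(2)

def random_point():
    return math.sqrt(random.random()), random.uniform(-math.pi, math.pi)

trials = 200000
total = 0.0
for _ in range(trials):
    ra, ta = random_point()
    rb, tb = random_point()
    total += 0.5 * ra * rb * abs(math.sin(ta - tb))
est = total / trials

assert abs(est - 4 / (9 * math.pi)) < 0.005
```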
