• # Paper 2, Section I, B

Consider the ordinary differential equation

$P(x, y)+Q(x, y) \frac{d y}{d x}=0 \tag{*}$

State an equation to be satisfied by $P$ and $Q$ that ensures that equation $(*)$ is exact. In this case, express the general solution of equation $(*)$ in terms of a function $F(x, y)$ which should be defined in terms of $P$ and $Q$.

Consider the equation

$\frac{d y}{d x}=-\frac{4 x+3 y}{3 x+3 y^{2}}$

satisfying the boundary condition $y(1)=2$. Find an explicit relation between $y$ and $x$.
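Working this example through, one finds $\partial P/\partial y=\partial Q/\partial x=3$, so the equation is exact with a potential $F(x,y)=2x^{2}+3xy+y^{3}$, and the boundary condition gives $F(x,y)=16$. These are our worked values, not part of the statement; a minimal numerical sanity check:

```python
import math

# Exactness check for P = 4x + 3y, Q = 3x + 3y^2, with our candidate
# potential F(x, y) = 2x^2 + 3xy + y^3 (a worked guess, not given above).

def P(x, y):
    return 4 * x + 3 * y

def Q(x, y):
    return 3 * x + 3 * y ** 2

def F(x, y):
    return 2 * x ** 2 + 3 * x * y + y ** 3

h = 1e-6
x0, y0 = 0.7, 1.3

# F_x should equal P, and F_y should equal Q (central differences).
Fx = (F(x0 + h, y0) - F(x0 - h, y0)) / (2 * h)
Fy = (F(x0, y0 + h) - F(x0, y0 - h)) / (2 * h)
assert abs(Fx - P(x0, y0)) < 1e-6
assert abs(Fy - Q(x0, y0)) < 1e-6

# With y(1) = 2, the solution curve is F(x, y) = F(1, 2) = 16.
assert F(1, 2) == 16
```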

• # Paper 2, Section I, B

The following equation arises in the theory of elastic beams:

$t^{4} \frac{d^{2} u}{d t^{2}}+\lambda^{2} u=0, \quad \lambda>0, t>0$

where $u(t)$ is a real valued function.

By using the change of variables

$t=\frac{1}{\tau}, \quad u(t)=\frac{v(\tau)}{\tau},$

find the general solution of the above equation.
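The substitution turns the beam equation into $v''(\tau)+\lambda^{2}v=0$, so $v=A\cos\lambda\tau+B\sin\lambda\tau$ and hence $u(t)=t\left(A\cos\frac{\lambda}{t}+B\sin\frac{\lambda}{t}\right)$. That general solution is our derived answer rather than part of the statement; a finite-difference spot-check of one member of the family:

```python
import math

# Spot-check that u(t) = t*(A cos(lam/t) + B sin(lam/t)) satisfies
# t^4 u'' + lam^2 u = 0.  The values of lam, A, B are arbitrary choices.

lam = 2.0

def u(t):
    return t * (0.5 * math.cos(lam / t) + 1.5 * math.sin(lam / t))

def residual(t, h=1e-4):
    u2 = (u(t + h) - 2 * u(t) + u(t - h)) / h ** 2   # numerical u''
    return t ** 4 * u2 + lam ** 2 * u(t)

for t in (0.8, 1.0, 1.7):
    assert abs(residual(t)) < 1e-4
```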

• # Paper 2, Section II, B

The so-called "shallow water theory" is characterised by the equations

\begin{aligned} &\frac{\partial \zeta}{\partial t}+\frac{\partial}{\partial x}[(h+\zeta) u]=0 \\ &\frac{\partial u}{\partial t}+u \frac{\partial u}{\partial x}+g \frac{\partial \zeta}{\partial x}=0 \end{aligned}

where $g$ denotes the acceleration due to gravity, the constant $h$ denotes the undisturbed depth of the water, $u(x, t)$ denotes the speed in the $x$-direction, and $\zeta(x, t)$ denotes the elevation of the water.

(i) Assuming that $|u|$ and $|\zeta|$ and their gradients are small (in appropriately non-dimensionalised variables), show that $\zeta$ satisfies the wave equation

$\frac{\partial^{2} \zeta}{\partial t^{2}}=c^{2} \frac{\partial^{2} \zeta}{\partial x^{2}}, \tag{*}$

where the constant $c$ should be determined in terms of $h$ and $g$.

(ii) Using the change of variables

$\xi=x+c t, \quad \eta=x-c t$

show that the general solution of $(*)$ satisfying the initial conditions

$\zeta(x, 0)=u_{0}(x), \quad \frac{\partial \zeta}{\partial t}(x, 0)=v_{0}(x)$

is given by

$\zeta(x, t)=f(x+c t)+g(x-c t)$

where

\begin{aligned} &\frac{d f(x)}{d x}=\frac{1}{2}\left[\frac{d u_{0}(x)}{d x}+\frac{1}{c} v_{0}(x)\right] \\ &\frac{d g(x)}{d x}=\frac{1}{2}\left[\frac{d u_{0}(x)}{d x}-\frac{1}{c} v_{0}(x)\right] \end{aligned}

Simplify the above to find $\zeta$ in terms of $u_{0}$ and $v_{0}$.

(iii) Find $\zeta(x, t)$ in the particular case that

$u_{0}(x)=H(x+1)-H(x-1), \quad v_{0}(x)=0, \quad-\infty<x<\infty$

where $H(\cdot)$ denotes the Heaviside step function.

Describe in words this solution.
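With $v_{0}=0$, d'Alembert's formula reduces to $\zeta(x,t)=\frac{1}{2}\left[u_{0}(x+ct)+u_{0}(x-ct)\right]$: the initial box splits into two half-height boxes travelling in opposite directions at speed $c$. A minimal numerical sketch of that claim (the choice $c=1$ is ours):

```python
# d'Alembert solution for the box initial condition with v0 = 0.

def H(x):
    return 1.0 if x >= 0 else 0.0   # Heaviside step (H(0) = 1 by convention)

def u0(x):
    return H(x + 1) - H(x - 1)

def zeta(x, t, c=1.0):
    return 0.5 * (u0(x + c * t) + u0(x - c * t))

assert zeta(0.0, 0.0) == 1.0     # initial box of height 1
assert zeta(3.0, 3.0) == 0.5     # right-moving half-height box
assert zeta(-3.0, 3.0) == 0.5    # left-moving half-height box
assert zeta(0.0, 3.0) == 0.0     # the middle has emptied out
```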

• # Paper 2, Section II, B

Use the transformation

$y(t)=\frac{1}{c x(t)} \frac{d x(t)}{d t}$

where $c$ is a constant, to map the Riccati equation

$\frac{d y}{d t}+c y^{2}+a(t) y+b(t)=0, \quad t>0$

to a linear equation.

Using the above result, as well as the change of variables $\tau=\ln t$, solve the boundary value problem

$\begin{gathered} \frac{d y}{d t}+y^{2}+\frac{y}{t}-\frac{\lambda^{2}}{t^{2}}=0, \quad t>0 \\ y(1)=2 \lambda \end{gathered}$

where $\lambda$ is a positive constant. What is the value of $t>0$ for which the solution is singular?

• # Paper 2, Section II, B

Consider the damped pendulum equation

$\frac{d^{2} \theta}{d t^{2}}+c \frac{d \theta}{d t}+\sin \theta=0 \tag{*}$

where $c$ is a positive constant. The energy $E$, which is the sum of the kinetic energy and the potential energy, is defined by

$E(t)=\frac{1}{2}\left(\frac{d \theta}{d t}\right)^{2}+1-\cos \theta$

(i) Verify that $E(t)$ is a decreasing function.

(ii) Assuming that $\theta$ is sufficiently small, so that terms of order $\theta^{3}$ can be neglected, find an approximation for the general solution of $(*)$ in terms of two arbitrary constants. Discuss the dependence of this approximate solution on $c$.

(iii) By rewriting $(*)$ as a system of equations for $x(t)=\theta$ and $y(t)=\dot{\theta}$, find all stationary points of $(*)$ and discuss their nature for all $c$, except $c=2$.

(iv) Draw the phase plane curves for the particular case $c=1$.
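For part (i), differentiating gives $\dot E=-c\,\dot\theta^{2}\leqslant 0$, so $E$ never increases. A minimal numerical illustration, integrating $(*)$ with a standard RK4 scheme (the initial data, step size, and $c=1$ are arbitrary choices of ours):

```python
import math

c = 1.0

def deriv(x, y):          # x = theta, y = theta'
    return y, -c * y - math.sin(x)

def rk4_step(x, y, dt):
    k1x, k1y = deriv(x, y)
    k2x, k2y = deriv(x + 0.5 * dt * k1x, y + 0.5 * dt * k1y)
    k3x, k3y = deriv(x + 0.5 * dt * k2x, y + 0.5 * dt * k2y)
    k4x, k4y = deriv(x + dt * k3x, y + dt * k3y)
    return (x + dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6,
            y + dt * (k1y + 2 * k2y + 2 * k3y + k4y) / 6)

def energy(x, y):
    return 0.5 * y * y + 1 - math.cos(x)

x, y, dt = 2.0, 0.0, 0.001
E_prev = energy(x, y)
for _ in range(20000):                # integrate up to t = 20
    x, y = rk4_step(x, y, dt)
    E = energy(x, y)
    assert E <= E_prev + 1e-12        # E never increases (up to roundoff)
    E_prev = E
assert E < 0.01 * energy(2.0, 0.0)    # energy has decayed towards rest
```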

• # Paper 2, Section II, B

(a) Let $y_{1}(x)$ be a solution of the equation

$\frac{d^{2} y}{d x^{2}}+p(x) \frac{d y}{d x}+q(x) y=0$

Assuming that the second linearly independent solution takes the form $y_{2}(x)=v(x) y_{1}(x)$, derive an ordinary differential equation for $v(x)$.

(b) Consider the equation

$\left(1-x^{2}\right) \frac{d^{2} y}{d x^{2}}-2 x \frac{d y}{d x}+2 y=0, \quad-1<x<1$

By inspection or otherwise, find an explicit solution of this equation. Use the result in (a) to find the solution $y(x)$ satisfying the conditions

$y(0)=\frac{d y}{d x}(0)=1$
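By inspection $y_{1}(x)=x$ is a solution, and reduction of order produces a second solution proportional to $\frac{x}{2}\ln\frac{1+x}{1-x}-1$; the stated conditions then lead to $y(x)=x+1-\frac{x}{2}\ln\frac{1+x}{1-x}$. That closed form is our worked answer, not part of the statement; a finite-difference check:

```python
import math

def y(x):
    return x + 1 - 0.5 * x * math.log((1 + x) / (1 - x))

h = 1e-4

def residual(x):
    y2 = (y(x + h) - 2 * y(x) + y(x - h)) / h ** 2
    y1 = (y(x + h) - y(x - h)) / (2 * h)
    return (1 - x ** 2) * y2 - 2 * x * y1 + 2 * y(x)

assert abs(y(0.0) - 1.0) < 1e-12                  # y(0) = 1
assert abs((y(h) - y(-h)) / (2 * h) - 1.0) < 1e-6  # y'(0) = 1
for x in (-0.5, 0.2, 0.7):
    assert abs(residual(x)) < 1e-4                 # satisfies the ODE
```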


• # Paper 2, Section I, F

Consider independent discrete random variables $X_{1}, \ldots, X_{n}$ and assume $E\left[X_{i}\right]$ exists for all $i=1, \ldots, n$.

Show that

$E\left[\prod_{i=1}^{n} X_{i}\right]=\prod_{i=1}^{n} E\left[X_{i}\right]$

If the $X_{1}, \ldots, X_{n}$ are also positive, show that

$\prod_{i=1}^{n} \sum_{m=0}^{\infty} P\left(X_{i}>m\right)=\sum_{m=0}^{\infty} P\left(\prod_{i=1}^{n} X_{i}>m\right)$
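The second identity rests on the tail-sum formula $E[Z]=\sum_{m=0}^{\infty}P(Z>m)$ for a positive integer-valued random variable $Z$, applied to each $X_{i}$ and to $\prod X_{i}$. An exact check with small hand-picked distributions (the distributions are arbitrary choices):

```python
from itertools import product

# Two independent positive integer-valued variables with small supports.
X1 = {1: 0.5, 2: 0.3, 4: 0.2}
X2 = {1: 0.25, 3: 0.75}

def mean(d):
    return sum(k * p for k, p in d.items())

def tail_sum(d, M=100):
    return sum(sum(p for k, p in d.items() if k > m) for m in range(M))

# E[X] = sum_m P(X > m) for each factor ...
assert abs(mean(X1) - tail_sum(X1)) < 1e-12
assert abs(mean(X2) - tail_sum(X2)) < 1e-12

# ... and both sides of the displayed identity equal E[X1] * E[X2].
prod_dist = {}
for (k1, p1), (k2, p2) in product(X1.items(), X2.items()):
    prod_dist[k1 * k2] = prod_dist.get(k1 * k2, 0.0) + p1 * p2
lhs = tail_sum(X1) * tail_sum(X2)
rhs = tail_sum(prod_dist)
assert abs(lhs - rhs) < 1e-12
assert abs(lhs - mean(X1) * mean(X2)) < 1e-12
```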

• # Paper 2, Section I, F

Consider a particle situated at the origin $(0,0)$ of $\mathbb{R}^{2}$. At successive times a direction is chosen independently by picking an angle uniformly at random in the interval $[0,2 \pi]$, and the particle then moves one unit of Euclidean length in this direction. Find the expected squared Euclidean distance of the particle from the origin after $n$ such movements.
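Writing $R_{n}^{2}=(\sum_{i}\cos\theta_{i})^{2}+(\sum_{i}\sin\theta_{i})^{2}$, the cross terms $E[\cos(\theta_{i}-\theta_{j})]$ vanish for $i\neq j$, leaving $E[R_{n}^{2}]=n$. A Monte Carlo spot-check of that answer (sample size and $n$ are arbitrary choices):

```python
import math
import random

random.seed(0)
n, trials = 4, 100_000
total = 0.0
for _ in range(trials):
    x = y = 0.0
    for _ in range(n):
        a = random.uniform(0.0, 2.0 * math.pi)
        x += math.cos(a)
        y += math.sin(a)
    total += x * x + y * y            # squared distance from the origin
estimate = total / trials
assert abs(estimate - n) < 0.1        # standard error is about 0.01 here
```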

• # Paper 2, Section II, 9F

State the axioms of probability.

State and prove Boole's inequality.

Suppose you toss a sequence of coins, the $i$-th of which comes up heads with probability $p_{i}$, where $\sum_{i=1}^{\infty} p_{i}<\infty$. Calculate the probability of the event that infinitely many heads occur.

Suppose you repeatedly and independently roll a pair of fair dice and each time record the sum of the dice. What is the probability that an outcome of 5 appears before an outcome of 7? Justify your answer.
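For the dice question, conditioning on the first roll whose sum is 5 or 7 gives $P(5 \text{ before } 7)=\frac{P(\text{sum}=5)}{P(\text{sum}=5)+P(\text{sum}=7)}$. An exact computation of that value:

```python
from fractions import Fraction

# Count the outcomes of a pair of fair dice by their sum.
counts = {}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        s = d1 + d2
        counts[s] = counts.get(s, 0) + 1

p5 = Fraction(counts[5], 36)    # 4/36
p7 = Fraction(counts[7], 36)    # 6/36
assert p5 / (p5 + p7) == Fraction(2, 5)
```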

• # Paper 2, Section II, F

Give the definition of an exponential random variable $X$ with parameter $\lambda$. Show that $X$ is memoryless.

Now let $X, Y$ be independent exponential random variables, each with parameter $\lambda$. Find the probability density function of the random variable $Z=\min (X, Y)$ and the probability $P(X>Y)$.

Suppose the random variables $G_{1}, G_{2}$ are independent and each has probability density function given by

$f(y)=C^{-1} e^{-y} y^{-1 / 2}, \quad y>0, \quad \text { where } C=\int_{0}^{\infty} e^{-y} y^{-1 / 2} d y$

Find the probability density function of $G_{1}+G_{2}$. [You may use standard results without proof provided they are clearly stated.]
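Working these through, $Z=\min(X,Y)$ is exponential with parameter $2\lambda$, $P(X>Y)=\frac{1}{2}$ by symmetry, and $G_{1},G_{2}$ are $\Gamma(\frac{1}{2},1)$ so $G_{1}+G_{2}$ has density $e^{-y}$, i.e. is $\mathrm{Exp}(1)$. These are our worked answers, not given in the statement; a Monte Carlo spot-check ($\lambda$ and sample size are arbitrary choices):

```python
import random

random.seed(1)
lam, trials = 2.0, 200_000
mins = 0.0
x_wins = 0
for _ in range(trials):
    x = random.expovariate(lam)
    y = random.expovariate(lam)
    mins += min(x, y)
    if x > y:
        x_wins += 1
assert abs(mins / trials - 1.0 / (2.0 * lam)) < 0.01   # mean of Exp(2*lam)
assert abs(x_wins / trials - 0.5) < 0.01               # P(X > Y) = 1/2

# The sum of two Gamma(1/2, 1) variables should be Exp(1), with mean 1.
g = [random.gammavariate(0.5, 1.0) + random.gammavariate(0.5, 1.0)
     for _ in range(trials)]
assert abs(sum(g) / trials - 1.0) < 0.01
```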

• # Paper 2, Section II, F

For any function $g: \mathbb{R} \rightarrow \mathbb{R}$ and random variables $X, Y$, the "tower property" of conditional expectations is

$E[g(X)]=E[E[g(X) \mid Y]] .$

Provide a proof of this property when both $X, Y$ are discrete.

Let $U_{1}, U_{2}, \ldots$ be a sequence of independent uniform $U(0,1)$-random variables. For $x \in[0,1]$ find the expected number of $U_{i}$ 's needed such that their sum exceeds $x$, that is, find $E[N(x)]$ where

$N(x)=\min \left\{n: \sum_{i=1}^{n} U_{i}>x\right\}$

[Hint: Write $E[N(x)]=E\left[E\left[N(x) \mid U_{1}\right]\right]$.]
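Following the hint, conditioning on $U_{1}$ gives $E[N(x)]=1+\int_{0}^{x}E[N(u)]\,du$, whose solution is $E[N(x)]=e^{x}$. That answer is our derivation, not part of the statement; a Monte Carlo spot-check at $x=1$ (sample size is an arbitrary choice):

```python
import math
import random

random.seed(2)
trials = 100_000
x = 1.0
total = 0
for _ in range(trials):
    s, n = 0.0, 0
    while s <= x:          # draw uniforms until the running sum exceeds x
        s += random.random()
        n += 1
    total += n
assert abs(total / trials - math.e) < 0.05   # E[N(1)] = e
```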

• # Paper 2, Section II, F

Define what it means for a random variable $X$ to have a Poisson distribution, and find its moment generating function.

Suppose $X, Y$ are independent Poisson random variables with parameters $\lambda, \mu$. Find the distribution of $X+Y$.

If $X_{1}, \ldots, X_{n}$ are independent Poisson random variables with parameter $\lambda=1$, find the distribution of $\sum_{i=1}^{n} X_{i}$. Hence or otherwise, find the limit of the real sequence

$a_{n}=e^{-n} \sum_{j=0}^{n} \frac{n^{j}}{j !}, \quad n \in \mathbb{N}$

[Standard results may be used without proof provided they are clearly stated.]
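Since $\sum_{i=1}^{n}X_{i}$ is Poisson with parameter $n$, $a_{n}=P\big(\mathrm{Poisson}(n)\leqslant n\big)$, and the central limit theorem gives $a_{n}\to\frac{1}{2}$. A numerical check of that limit, working in log-space to avoid underflow (the choice of $n$ is ours):

```python
import math

def a(n):
    # a_n = e^{-n} * sum_{j=0}^{n} n^j / j!, computed term-by-term as
    # exp(j*log n - n - log j!) so that e^{-2000} never underflows alone.
    return sum(math.exp(j * math.log(n) - n - math.lgamma(j + 1))
               for j in range(n + 1))

assert abs(a(2000) - 0.5) < 0.01   # close to the limit 1/2
```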
