• Paper 1, Section I, E

(a) Let $f$ be continuous in $[a, b]$, and let $g$ be strictly monotonic in $[\alpha, \beta]$, with a continuous derivative there, and suppose that $a=g(\alpha)$ and $b=g(\beta)$. Prove that

$\int_{a}^{b} f(x) d x=\int_{\alpha}^{\beta} f(g(u)) g^{\prime}(u) d u$

[Any version of the fundamental theorem of calculus may be used provided it is quoted correctly.]

(b) Justifying carefully the steps in your argument, show that the improper Riemann integral

$\int_{0}^{e^{-1}} \frac{d x}{x\left(\log \frac{1}{x}\right)^{\theta}}$

converges for $\theta>1$, and evaluate it.
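A sketch of the evaluation in (b), applying part (a) with the substitution $u=\log \frac{1}{x}$ (worked here, not part of the question):

```latex
u=\log\tfrac{1}{x},\quad du=-\frac{dx}{x}:\qquad
\int_{0}^{e^{-1}} \frac{dx}{x\left(\log\frac{1}{x}\right)^{\theta}}
=\int_{1}^{\infty} \frac{du}{u^{\theta}}
=\left[\frac{u^{1-\theta}}{1-\theta}\right]_{1}^{\infty}
=\frac{1}{\theta-1}\qquad(\theta>1).
```

The integral over $(0, e^{-1}]$ is improper at $0$, which maps to $u \rightarrow \infty$; convergence for $\theta>1$ is then the usual $p$-integral criterion.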

• Paper 1, Section II, D

(a) State Rolle's theorem. Show that if $f: \mathbb{R} \rightarrow \mathbb{R}$ is $N+1$ times differentiable and $x \in \mathbb{R}$ then

$f(x)=f(0)+f^{\prime}(0) x+\frac{f^{\prime \prime}(0)}{2 !} x^{2}+\ldots+\frac{f^{(N)}(0)}{N !} x^{N}+\frac{f^{(N+1)}(\theta x)}{(N+1) !} x^{N+1}$

for some $0<\theta<1$. Hence, or otherwise, show that if $f^{\prime}(x)=0$ for all $x \in \mathbb{R}$ then $f$ is constant.

(b) Let $s: \mathbb{R} \rightarrow \mathbb{R}$ and $c: \mathbb{R} \rightarrow \mathbb{R}$ be differentiable functions such that

$s^{\prime}(x)=c(x), \quad c^{\prime}(x)=-s(x), \quad s(0)=0 \quad \text { and } \quad c(0)=1$

Prove that (i) $s(x) c(a-x)+c(x) s(a-x)$ is independent of $x$, (ii) $s(x+y)=s(x) c(y)+c(x) s(y)$, (iii) $s(x)^{2}+c(x)^{2}=1$.

Show that $c(1)>0$ and $c(2)<0$. Deduce that there exists $1<k<2$ such that $s(2 k)=c(k)=0$ and $s(x+4 k)=s(x)$.

• Paper 1, Section II, F

(a) Let $\left(x_{n}\right)$ be a bounded sequence of real numbers. Show that $\left(x_{n}\right)$ has a convergent subsequence.

(b) Let $\left(z_{n}\right)$ be a bounded sequence of complex numbers. For each $n \geqslant 1$, write $z_{n}=x_{n}+i y_{n}$. Show that $\left(z_{n}\right)$ has a subsequence $\left(z_{n_{j}}\right)$ such that $\left(x_{n_{j}}\right)$ converges. Hence, or otherwise, show that $\left(z_{n}\right)$ has a convergent subsequence.

(c) Write $\mathbb{N}=\{1,2,3, \ldots\}$ for the set of positive integers. Let $M$ be a positive real number, and for each $i \in \mathbb{N}$, let $X^{(i)}=\left(x_{1}^{(i)}, x_{2}^{(i)}, x_{3}^{(i)}, \ldots\right)$ be a sequence of real numbers with $\left|x_{j}^{(i)}\right| \leqslant M$ for all $i, j \in \mathbb{N}$. By induction on $i$ or otherwise, show that there exist sequences $N^{(i)}=\left(n_{1}^{(i)}, n_{2}^{(i)}, n_{3}^{(i)}, \ldots\right)$ of positive integers with the following properties:

• for all $i \in \mathbb{N}$, the sequence $N^{(i)}$ is strictly increasing;

• for all $i \in \mathbb{N}$, $N^{(i+1)}$ is a subsequence of $N^{(i)}$; and

• for all $k \in \mathbb{N}$ and all $i \in \mathbb{N}$ with $1 \leqslant i \leqslant k$, the sequence

$\left(x_{n_{1}^{(k)}}^{(i)}, x_{n_{2}^{(k)}}^{(i)}, x_{n_{3}^{(k)}}^{(i)}, \ldots\right)$

converges.

Hence, or otherwise, show that there exists a strictly increasing sequence $\left(m_{j}\right)$ of positive integers such that for all $i \in \mathbb{N}$ the sequence $\left(x_{m_{1}}^{(i)}, x_{m_{2}}^{(i)}, x_{m_{3}}^{(i)}, \ldots\right)$ converges.
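The final step is the usual diagonal argument; a sketch (assuming the three bullet-pointed properties above):

```latex
\text{Set } m_{j}=n_{j}^{(j)}. \text{ Since } N^{(j+1)} \text{ is a subsequence of } N^{(j)},\quad
m_{j+1}=n_{j+1}^{(j+1)} \geqslant n_{j+1}^{(j)} > n_{j}^{(j)}=m_{j},
```

so $(m_{j})$ is strictly increasing; and for each fixed $i$, the tail $(m_{j})_{j \geqslant i}$ is a subsequence of $N^{(i)}$, so $\bigl(x_{m_{j}}^{(i)}\bigr)_{j \geqslant 1}$ agrees, apart from finitely many terms, with a subsequence of a convergent sequence, hence converges.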


• Paper 1, Section I, A

Solve the differential equation

$\frac{d y}{d x}=\frac{1}{x+e^{2 y}}$

subject to the initial condition $y(1)=0$.
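A candidate closed form can be checked numerically. The solution below is worked out here by treating $x$ as a function of $y$, so that $dx/dy = x + e^{2y}$ is linear; it is a sketch, not quoted from the source.

```python
from math import exp, log

# dx/dy = x + e^{2y} has integrating factor e^{-y}: (x e^{-y})' = e^{y},
# so x = e^{2y} + C e^{y}; the condition y(1) = 0 forces C = 0,
# i.e. x = e^{2y}, equivalently y = (1/2) log x.
def y(x):
    return 0.5 * log(x)

assert y(1.0) == 0.0  # initial condition

# Spot-check dy/dx = 1/(x + e^{2y}) by central differences
h = 1e-6
for x in (0.5, 1.0, 2.0, 5.0):
    dydx = (y(x + h) - y(x - h)) / (2 * h)
    assert abs(dydx - 1.0 / (x + exp(2 * y(x)))) < 1e-8
```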

• Paper 1, Section II, A

Solve the system of differential equations for $x(t), y(t), z(t)$,

\begin{aligned} &\dot{x}=3 z-x \\ &\dot{y}=3 x+2 y-3 z+\cos t-2 \sin t \\ &\dot{z}=3 x-z \end{aligned}

subject to the initial conditions $x(0)=y(0)=0, z(0)=1$.
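The closed forms below are worked out here (the combinations $x \pm z$ decouple with rates $2$ and $-4$, after which the equation for $y$ is linear); they are offered as a numerical check, not quoted from the source:

```python
from math import exp, sin, cos

# x + z and x - z satisfy (x+z)' = 2(x+z), (x-z)' = -4(x-z); the ICs give:
def x(t): return (exp(2 * t) - exp(-4 * t)) / 2
def z(t): return (exp(2 * t) + exp(-4 * t)) / 2
# y then satisfies y' - 2y = -3 e^{-4t} + cos t - 2 sin t with y(0) = 0:
def y(t): return sin(t) + (exp(-4 * t) - exp(2 * t)) / 2

def d(f, t, h=1e-6):
    """Central-difference derivative."""
    return (f(t + h) - f(t - h)) / (2 * h)

assert x(0) == 0 and y(0) == 0 and z(0) == 1  # initial conditions

for t in (0.1, 0.5, 1.0):  # spot-check all three equations
    assert abs(d(x, t) - (3 * z(t) - x(t))) < 1e-5
    assert abs(d(y, t) - (3 * x(t) + 2 * y(t) - 3 * z(t) + cos(t) - 2 * sin(t))) < 1e-5
    assert abs(d(z, t) - (3 * x(t) - z(t))) < 1e-5
```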

• Paper 1, Section II, A

Show that for each $t>0$ and $x \in \mathbb{R}$ the function

$K(x, t)=\frac{1}{\sqrt{4 \pi t}} \exp \left(-\frac{x^{2}}{4 t}\right)$

satisfies the heat equation

$\frac{\partial u}{\partial t}=\frac{\partial^{2} u}{\partial x^{2}}$

For $t>0$ and $x \in \mathbb{R}$ define the function $u=u(x, t)$ by the integral

$u(x, t)=\int_{-\infty}^{\infty} K(x-y, t) f(y) d y$

Show that $u$ satisfies the heat equation and $\lim _{t \rightarrow 0^{+}} u(x, t)=f(x)$. [Hint: You may find it helpful to consider the substitution $Y=(x-y) / \sqrt{4 t}$.]

Burgers' equation is

$\frac{\partial w}{\partial t}+w \frac{\partial w}{\partial x}=\frac{\partial^{2} w}{\partial x^{2}}$

By considering the transformation

$w(x, t)=-2 \frac{1}{u} \frac{\partial u}{\partial x}$

solve Burgers' equation with the initial condition $\lim _{t \rightarrow 0^{+}} w(x, t)=g(x)$.
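The pieces above assemble as follows (a sketch; the formula for $u(x,0)$ is forced by the initial condition, not stated in the source). With $w=-\frac{2}{u} \frac{\partial u}{\partial x}=-2 \frac{\partial}{\partial x} \log u$, Burgers' equation for $w$ reduces to the heat equation for $u$, and $\lim_{t \rightarrow 0^{+}} w = g$ requires $u(x, 0)=\exp\left(-\frac{1}{2} \int_{0}^{x} g(\eta)\, d\eta\right)$, so by the integral representation above:

```latex
w(x, t)=-2 \frac{\partial}{\partial x} \log \left[\int_{-\infty}^{\infty} K(x-y, t) \exp \left(-\frac{1}{2} \int_{0}^{y} g(\eta)\, d\eta\right) d y\right].
```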


• Paper 1, Section I, F

A robot factory begins with a single generation-0 robot. Each generation-$n$ robot independently builds some number of generation-$(n+1)$ robots before breaking down. The number of generation-$(n+1)$ robots built by a generation-$n$ robot is $0,1,2$ or 3 with probabilities $\frac{1}{12}, \frac{1}{2}, \frac{1}{3}$ and $\frac{1}{12}$ respectively. Find the expectation of the total number of generation-$n$ robots produced by the factory. What is the probability that the factory continues producing robots forever?
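The answers can be checked numerically; the closed forms below are worked out here from the stated pmf via the standard branching-process results, not quoted from the source:

```python
from fractions import Fraction
from math import sqrt

# Offspring pmf for one robot, as stated in the question: P(0), P(1), P(2), P(3)
p = [Fraction(1, 12), Fraction(1, 2), Fraction(1, 3), Fraction(1, 12)]
assert sum(p) == 1

# Mean number of offspring per robot
m = sum(k * pk for k, pk in enumerate(p))
assert m == Fraction(17, 12)  # so E[# generation-n robots] = (17/12)**n

# Probability generating function G(s) = E[s^offspring]
def G(s):
    return 1 / 12 + s / 2 + s ** 2 / 3 + s ** 3 / 12

# Extinction probability: smallest non-negative root of G(s) = s.
# G(s) = s reduces to (s - 1)(s^2 + 5s - 1) = 0, giving q = (sqrt(29) - 5) / 2.
q = (sqrt(29) - 5) / 2
assert abs(G(q) - q) < 1e-12
# Survival probability = 1 - q = (7 - sqrt(29)) / 2, approximately 0.807
```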

[Standard results about branching processes may be used without proof as long as they are carefully stated.]

• Paper 1, Section II, F

(a) Let $Z$ be an $N(0,1)$ random variable. Write down the probability density function (pdf) of $Z$, and verify that it is indeed a pdf. Find the moment generating function (mgf) $m_{Z}(\theta)=\mathbb{E}\left(e^{\theta Z}\right)$ of $Z$ and hence, or otherwise, verify that $Z$ has mean 0 and variance 1.

(b) Let $\left(X_{n}\right)_{n \geqslant 1}$ be a sequence of IID $N(0,1)$ random variables. Let $S_{n}=\sum_{i=1}^{n} X_{i}$ and let $U_{n}=S_{n} / \sqrt{n}$. Find the distribution of $U_{n}$.

(c) Let $Y_{n}=X_{n}^{2}$. Find the mean $\mu$ and variance $\sigma^{2}$ of $Y_{1}$. Let $T_{n}=\sum_{i=1}^{n} Y_{i}$ and let $V_{n}=\left(T_{n}-n \mu\right) / \sigma \sqrt{n}$.

If $\left(W_{n}\right)_{n \geqslant 1}$ is a sequence of random variables and $W$ is a random variable, what does it mean to say that $W_{n} \rightarrow W$ in distribution? State carefully the continuity theorem and use it to show that $V_{n} \rightarrow Z$ in distribution.

[You may not assume the central limit theorem.]

• Paper 1, Section II, F

Let $A_{1}, A_{2}, \ldots, A_{n}$ be events in some probability space. State and prove the inclusion-exclusion formula for the probability $\mathbb{P}\left(\bigcup_{i=1}^{n} A_{i}\right)$. Show also that

$\mathbb{P}\left(\bigcup_{i=1}^{n} A_{i}\right) \geqslant \sum_{i} \mathbb{P}\left(A_{i}\right)-\sum_{i<j} \mathbb{P}\left(A_{i} \cap A_{j}\right)$

Suppose now that $n \geqslant 2$ and that whenever $i \neq j$ we have $\mathbb{P}\left(A_{i} \cap A_{j}\right) \leqslant 1 / n$. Show that there is a constant $c$ independent of $n$ such that $\sum_{i=1}^{n} \mathbb{P}\left(A_{i}\right) \leqslant c \sqrt{n}$.
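A sketch of one standard route to the last part (worked here; the constant obtained this way is not claimed optimal): apply the lower bound above to the set $S$ of the $m$ indices with the largest $\mathbb{P}(A_{i})$, so that

```latex
1 \geqslant \mathbb{P}\Big(\bigcup_{i \in S} A_{i}\Big)
\geqslant \sum_{i \in S} \mathbb{P}\left(A_{i}\right)-\binom{m}{2} \frac{1}{n}
\quad\Longrightarrow\quad
\sum_{i \in S} \mathbb{P}\left(A_{i}\right) \leqslant 1+\frac{m^{2}}{2 n}.
```

Since every $i \notin S$ has $\mathbb{P}(A_{i}) \leqslant \frac{1}{m} \sum_{i \in S} \mathbb{P}(A_{i})$, it follows that $\sum_{i=1}^{n} \mathbb{P}(A_{i}) \leqslant \frac{n}{m} \sum_{i \in S} \mathbb{P}(A_{i}) \leqslant \frac{n}{m}+\frac{m}{2}$; taking $m=\lceil\sqrt{n}\rceil$ gives the bound with $c=2$.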


• Paper 1, Section I, C

Given a non-zero complex number $z=x+i y$, where $x$ and $y$ are real, find expressions for the real and imaginary parts of the following functions of $z$ in terms of $x$ and $y$ :

(i) $e^{z}$,

(ii) $\sin z$,

(iii) $\frac{1}{z}-\frac{1}{\bar{z}}$,

(iv) $z^{3}-z^{2} \bar{z}-z \bar{z}^{2}+\bar{z}^{3}$,

where $\bar{z}$ is the complex conjugate of $z$.

Now assume $x>0$ and find expressions for the real and imaginary parts of all solutions to

(v) $w=\log z$.

• Paper 1, Section II, 6A

What does it mean to say an $n \times n$ matrix is Hermitian?

What does it mean to say an $n \times n$ matrix is unitary?

Show that the eigenvalues of a Hermitian matrix are real and that eigenvectors corresponding to distinct eigenvalues are orthogonal.

Suppose that $A$ is an $n \times n$ Hermitian matrix with $n$ distinct eigenvalues $\lambda_{1}, \ldots, \lambda_{n}$ and corresponding normalised eigenvectors $\mathbf{u}_{1}, \ldots, \mathbf{u}_{n}$. Let $U$ denote the matrix whose columns are $\mathbf{u}_{1}, \ldots, \mathbf{u}_{n}$. Show directly that $U$ is unitary and $U D U^{\dagger}=A$, where $D$ is a diagonal matrix you should specify.

If $U$ is unitary and $D$ diagonal, must it be the case that $U D U^{\dagger}$ is Hermitian? Give a proof or counterexample.

Find a unitary matrix $U$ and a diagonal matrix $D$ such that

$U D U^{\dagger}=\left(\begin{array}{ccc} 2 & 0 & 3 i \\ 0 & 2 & 0 \\ -3 i & 0 & 2 \end{array}\right)$
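One consistent choice is $D=\operatorname{diag}(5,2,-1)$ with eigenvector columns $\mathbf{u}_{1}=\frac{1}{\sqrt{2}}(i, 0,1)^{T}$, $\mathbf{u}_{2}=(0,1,0)^{T}$, $\mathbf{u}_{3}=\frac{1}{\sqrt{2}}(-i, 0,1)^{T}$ (worked out here; neither $U$ nor the eigenvalue ordering is unique). A plain-Python check:

```python
from math import sqrt

# The matrix in question
A = [[2, 0, 3j], [0, 2, 0], [-3j, 0, 2]]

r = 1 / sqrt(2)
# Columns of U are normalised eigenvectors for eigenvalues 5, 2, -1
U = [[1j * r, 0, -1j * r],
     [0, 1, 0],
     [r, 0, r]]
D = [[5, 0, 0], [0, 2, 0], [0, 0, -1]]

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def dagger(X):
    """Conjugate transpose (ints, floats and complex all support .conjugate())."""
    return [[X[j][i].conjugate() for j in range(3)] for i in range(3)]

# U is unitary, and U D U^dagger reproduces A
I3 = mat_mul(dagger(U), U)
assert all(abs(I3[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(3) for j in range(3))
B = mat_mul(mat_mul(U, D), dagger(U))
assert all(abs(B[i][j] - A[i][j]) < 1e-12 for i in range(3) for j in range(3))
```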

• Paper 1, Section II, C

(a) Let $A, B$, and $C$ be three distinct points in the plane $\mathbb{R}^{2}$ which are not collinear, and let $\mathbf{a}, \mathbf{b}$, and $\mathbf{c}$ be their position vectors.

Show that the set $L_{A B}$ of points in $\mathbb{R}^{2}$ equidistant from $A$ and $B$ is given by an equation of the form

$\mathbf{n}_{A B} \cdot \mathbf{x}=p_{A B},$

where $\mathbf{n}_{A B}$ is a unit vector and $p_{A B}$ is a scalar, to be determined. Show that $L_{A B}$ is perpendicular to $\overrightarrow{A B}$.

Show that if $\mathbf{x}$ satisfies

$\mathbf{n}_{A B} \cdot \mathbf{x}=p_{A B} \quad \text { and } \quad \mathbf{n}_{B C} \cdot \mathbf{x}=p_{B C}$

then

$\mathbf{n}_{C A} \cdot \mathbf{x}=p_{C A} .$

How do you interpret this result geometrically?

(b) Let $\mathbf{a}$ and $\mathbf{u}$ be constant vectors in $\mathbb{R}^{3}$. Explain why the vectors $\mathbf{x}$ satisfying

$\mathbf{x} \times \mathbf{u}=\mathbf{a} \times \mathbf{u}$

describe a line in $\mathbb{R}^{3}$. Find an expression for the shortest distance between two lines $\mathbf{x} \times \mathbf{u}_{k}=\mathbf{a}_{k} \times \mathbf{u}_{k}$, where $k=1,2$.
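For the final part, $\mathbf{x} \times \mathbf{u}_{k}=\mathbf{a}_{k} \times \mathbf{u}_{k}$ is the line through $\mathbf{a}_{k}$ with direction $\mathbf{u}_{k}$; when $\mathbf{u}_{1} \times \mathbf{u}_{2} \neq \mathbf{0}$ the expected answer is the standard formula (recalled here as a check, not derived):

```latex
d=\frac{\left|\left(\mathbf{a}_{1}-\mathbf{a}_{2}\right) \cdot\left(\mathbf{u}_{1} \times \mathbf{u}_{2}\right)\right|}{\left|\mathbf{u}_{1} \times \mathbf{u}_{2}\right|}.
```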
