• # Paper 1, Section I, 4F

Find the radius of convergence of the following power series: (i) $\sum_{n \geqslant 1} \frac{n !}{n^{n}} z^{n}$; (ii) $\sum_{n \geqslant 1} n^{n} z^{n !}$.
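A quick numerical sanity check (not a proof): for (i) the ratio test gives $a_{n+1}/a_n = (1+1/n)^{-n} \to 1/e$, suggesting $R = e$; for (ii) the $m$-th coefficient is $n^n$ when $m = n!$ and $0$ otherwise, and $(n^n)^{1/n!} \to 1$, suggesting $R = 1$. A sketch:

```python
import math

# Ratio test for (i): a_n = n!/n^n, so a_{n+1}/a_n = (1 + 1/n)^(-n) -> 1/e,
# suggesting radius of convergence R = e.
def ratio(n):
    return (1 + 1 / n) ** (-n)

# Root test for (ii): the m-th coefficient is n^n when m = n! (else 0), and
# (n^n)^(1/n!) = exp(n log n / n!) -> 1, suggesting R = 1.
def root(n):
    return math.exp(n * math.log(n) / math.factorial(n))

print(ratio(10**6))  # close to 1/e ~ 0.367879
print(root(12))      # close to 1
```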

• # Paper 1, Section I, D

Show that every sequence of real numbers contains a monotone subsequence.
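The standard proof uses "peaks": call $n$ a peak if $a_n \geqslant a_m$ for all $m > n$; infinitely many peaks yield a non-increasing subsequence, finitely many yield an increasing one. A finite illustration of the first case (the function name is my own):

```python
# Illustration of the "peak" argument on a finite list: index i is a peak if
# a[i] >= a[j] for every j > i.  The values at the peaks always form a
# non-increasing subsequence; in the infinite case, when only finitely many
# peaks exist one instead builds an increasing subsequence past the last peak.
def peak_subsequence(a):
    peaks = [i for i in range(len(a))
             if all(a[i] >= a[j] for j in range(i + 1, len(a)))]
    return [a[i] for i in peaks]

seq = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
sub = peak_subsequence(seq)
print(sub)  # [9, 6, 5, 5] -- non-increasing
```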

• # Paper 1, Section II, D

(a) Show that for all $x \in \mathbb{R}$,

$\lim _{k \rightarrow \infty} 3^{k} \sin \left(x / 3^{k}\right)=x,$

stating carefully what properties of sin you are using.

Show that the series $\sum_{n \geqslant 1} 2^{n} \sin \left(x / 3^{n}\right)$ converges absolutely for all $x \in \mathbb{R}$.
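A numerical sanity check of both claims (sample value $x = 2$, an arbitrary choice): the limit follows from $\sin t \sim t$ near $0$, and absolute convergence from the bound $|2^{n}\sin(x/3^{n})| \leqslant |x|(2/3)^{n}$, which dominates the series by a geometric series of sum $2|x|$:

```python
import math

x = 2.0  # sample point (arbitrary)

# 3^k sin(x/3^k) -> x as k -> infinity, since sin t ~ t near 0
approx = (3 ** 30) * math.sin(x / 3 ** 30)

# |2^n sin(x/3^n)| <= |x| (2/3)^n, so the partial sums of absolute values
# stay below the geometric bound 2|x|
partial = sum((2 ** n) * abs(math.sin(x / 3 ** n)) for n in range(1, 60))
print(approx, partial)
```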

(b) Let $\left(a_{n}\right)_{n \in \mathbb{N}}$ be a decreasing sequence of positive real numbers tending to zero. Show that for $\theta \in \mathbb{R}, \theta$ not a multiple of $2 \pi$, the series

$\sum_{n \geqslant 1} a_{n} e^{i n \theta}$

converges.

Hence, or otherwise, show that $\sum_{n \geqslant 1} \frac{\sin (n \theta)}{n}$ converges for all $\theta \in \mathbb{R}$.
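As a check on the convergence (using the known Fourier-series fact, not required by the question, that the sum equals $(\pi-\theta)/2$ for $0 < \theta < 2\pi$), the partial sums approach that value, slowly, since convergence is only conditional:

```python
import math

theta = 1.0  # sample value in (0, 2*pi)
N = 100_000
s = sum(math.sin(n * theta) / n for n in range(1, N + 1))

# Known closed form (standard Fourier series of the sawtooth, stated here
# as an assumption, not something the question asks for):
limit = (math.pi - theta) / 2
print(s, limit)
```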

• # Paper 1, Section II, E

(i) Prove Taylor's Theorem for a function $f: \mathbb{R} \rightarrow \mathbb{R}$ differentiable $n$ times, in the following form: for every $x \in \mathbb{R}$ there exists $\theta$ with $0<\theta<1$ such that

$f(x)=\sum_{k=0}^{n-1} \frac{f^{(k)}(0)}{k !} x^{k}+\frac{f^{(n)}(\theta x)}{n !} x^{n}$

[You may assume Rolle's Theorem and the Mean Value Theorem; other results should be proved.]

(ii) The function $f: \mathbb{R} \rightarrow \mathbb{R}$ is twice differentiable, and satisfies the differential equation $f^{\prime \prime}-f=0$ with $f(0)=A, f^{\prime}(0)=B$. Show that $f$ is infinitely differentiable. Write down its Taylor series at the origin, and prove that it converges to $f$ at every point. Hence or otherwise show that for any $a, h \in \mathbb{R}$, the series

$\sum_{k=0}^{\infty} \frac{f^{(k)}(a)}{k !} h^{k}$

converges to $f(a+h)$.
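A sketch of the Taylor series in part (ii): since $f''=f$, the derivatives at $0$ alternate $A, B, A, B, \ldots$, so the series is $A\cosh x + B\sinh x$ (the standard solution of the ODE, used here as a cross-check with sample initial data):

```python
import math

A, B = 2.0, -1.0  # sample initial data (arbitrary choice)
x = 1.5

# f'' = f forces f^{(k)}(0) to alternate A, B, A, B, ..., so the Taylor
# series sum_k f^{(k)}(0)/k! x^k collapses to A cosh x + B sinh x.
taylor = sum((A if k % 2 == 0 else B) / math.factorial(k) * x ** k
             for k in range(40))
exact = A * math.cosh(x) + B * math.sinh(x)
print(taylor, exact)
```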

• # Paper 1, Section II, E

(i) State the Mean Value Theorem. Use it to show that if $f:(a, b) \rightarrow \mathbb{R}$ is a differentiable function whose derivative is identically zero, then $f$ is constant.

(ii) Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a function and $\alpha>0$ a real number such that for all $x, y \in \mathbb{R}$,

$|f(x)-f(y)| \leqslant|x-y|^{\alpha} .$

Show that $f$ is continuous. Show moreover that if $\alpha>1$ then $f$ is constant.

(iii) Let $f:[a, b] \rightarrow \mathbb{R}$ be continuous, and differentiable on $(a, b)$. Assume also that the right derivative of $f$ at $a$ exists; that is, the limit

$\lim _{x \rightarrow a+} \frac{f(x)-f(a)}{x-a}$

exists. Show that for any $\epsilon>0$ there exists $x \in(a, b)$ satisfying

$\left|\frac{f(x)-f(a)}{x-a}-f^{\prime}(x)\right|<\epsilon .$

[You should not assume that $f^{\prime}$ is continuous.]

• # Paper 1, Section II, F

Define what it means for a function $f:[0,1] \rightarrow \mathbb{R}$ to be (Riemann) integrable. Prove that $f$ is integrable whenever it is

(a) continuous,

(b) monotonic.

Let $\left\{q_{k}: k \in \mathbb{N}\right\}$ be an enumeration of all rational numbers in $[0,1)$. Define a function $f:[0,1] \rightarrow \mathbb{R}$ by $f(0)=0$,

$f(x)=\sum_{k \in Q(x)} 2^{-k}, \quad x \in(0,1]$

where

$Q(x)=\left\{k \in \mathbb{N}: q_{k} \in[0, x)\right\}$

Show that $f$ has a point of discontinuity in every interval $I \subset[0,1]$.

Is $f$ integrable? [Justify your answer.]
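A truncated numerical model (keeping only the first 50 terms of one particular enumeration, both choices mine) suggests the key observation: $Q(x)$ only grows as $x$ increases, so $f$ is non-decreasing, and part (b) would then apply despite the dense set of discontinuities:

```python
from fractions import Fraction

# Enumerate some rationals in [0,1) (this particular enumeration is an
# arbitrary choice) and keep the first 50 as a truncated model of f.
rationals = []
d = 1
while len(rationals) < 50:
    for p in range(d):
        q = Fraction(p, d)
        if q not in rationals:
            rationals.append(q)
    d += 1
rationals = rationals[:50]

def f(x):
    # truncated version of sum over {k : q_k in [0, x)} of 2^-k
    return sum(Fraction(1, 2 ** (k + 1))
               for k, q in enumerate(rationals) if q < x)

xs = [Fraction(i, 200) for i in range(201)]
vals = [f(x) for x in xs]
print(vals[-1])  # f(1) picks up every weight: 1 - 2^-50
```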

• # Paper 1, Section I, 1B

(a) Let

$z=2+2 i$

(i) Compute $z^{4}$.

(ii) Find all complex numbers $w$ such that $w^{4}=z$.
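A quick check of part (a) in polar form: $2+2i = 2\sqrt{2}\,e^{i\pi/4}$, so $z^{4} = 64\,e^{i\pi} = -64$, and the fourth roots of $z$ have modulus $(2\sqrt{2})^{1/4}$ and arguments $\pi/16 + k\pi/2$:

```python
import cmath

z = 2 + 2j
print(z ** 4)  # 2+2i = 2*sqrt(2) e^{i pi/4}, so z^4 = 64 e^{i pi} = -64

# The four fourth roots of z: modulus (2*sqrt(2))^(1/4), args pi/16 + k*pi/2
r = (2 * 2 ** 0.5) ** 0.25
roots = [r * cmath.exp(1j * (cmath.pi / 16 + k * cmath.pi / 2))
         for k in range(4)]
for w in roots:
    print(w ** 4)  # each recovers z up to rounding
```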

(b) Find all the solutions of the equation

$e^{2 z^{2}}-1=0$

(c) Let $z=x+i y, \bar{z}=x-i y, x, y \in \mathbb{R}$. Show that the equation of any line, and of any circle, may be written respectively as

$B z+\bar{B} \bar{z}+C=0 \quad \text { and } \quad z \bar{z}+\bar{B} z+B \bar{z}+C=0$

for some complex $B$ and real $C$.

• # Paper 1, Section I, 2A

(a) What is meant by an eigenvector and the corresponding eigenvalue of a matrix $A$ ?

(b) Let $A$ be the matrix

$A=\left(\begin{array}{ccc} 3 & -2 & -2 \\ 1 & 0 & -2 \\ 3 & -3 & -1 \end{array}\right)$

Find the eigenvalues and the corresponding eigenspaces of $A$ and determine whether or not $A$ is diagonalisable.
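A sanity check on the eigenvalues (not a substitute for the working): the characteristic polynomial factors as $(\lambda-2)(1-\lambda)(1+\lambda)$, so the eigenvalues are $-1, 1, 2$; being distinct, they force $A$ to be diagonalisable. Exact arithmetic confirms the roots:

```python
from fractions import Fraction

A = [[3, -2, -2],
     [1,  0, -2],
     [3, -3, -1]]

def char_poly(lam):
    # det(A - lam*I) for a 3x3 matrix, expanded along the first row
    M = [[Fraction(A[i][j]) - (lam if i == j else 0) for j in range(3)]
         for i in range(3)]
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

eigs = [lam for lam in range(-5, 6) if char_poly(Fraction(lam)) == 0]
print(eigs)  # [-1, 1, 2]: three distinct eigenvalues, so A is diagonalisable
```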

• # Paper 1, Section II, 7C

Let $\mathcal{A}: \mathbb{C}^{2} \rightarrow \mathbb{C}^{2}$ be the linear map

$\mathcal{A}\left(\begin{array}{c} z \\ w \end{array}\right)=\left(\begin{array}{c} z \mathrm{e}^{i \theta}+w \\ w \mathrm{e}^{-i \phi}+z \end{array}\right)$

where $\theta$ and $\phi$ are real constants. Write down the matrix $A$ of $\mathcal{A}$ with respect to the standard basis of $\mathbb{C}^{2}$ and show that $\operatorname{det} A=2 i \sin \frac{1}{2}(\theta-\phi) \exp \left(\frac{1}{2} i(\theta-\phi)\right)$.
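Here $A = \begin{pmatrix} e^{i\theta} & 1 \\ 1 & e^{-i\phi} \end{pmatrix}$, so $\det A = e^{i(\theta-\phi)} - 1$, and the stated form follows from $e^{i\alpha}-1 = e^{i\alpha/2}\left(e^{i\alpha/2}-e^{-i\alpha/2}\right) = 2i\sin\frac{\alpha}{2}\,e^{i\alpha/2}$ with $\alpha = \theta-\phi$. A numerical check at sample angles (arbitrary values):

```python
import cmath

theta, phi = 0.7, -1.3  # sample values (arbitrary)
detA = cmath.exp(1j * theta) * cmath.exp(-1j * phi) - 1

# claimed form: 2i sin((theta-phi)/2) exp(i(theta-phi)/2)
rhs = 2j * cmath.sin((theta - phi) / 2) * cmath.exp(0.5j * (theta - phi))
print(abs(detA - rhs))  # ~ 0
```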

Let $\mathcal{R}: \mathbb{C}^{2} \rightarrow \mathbb{R}^{4}$ be the invertible map

$\mathcal{R}\left(\begin{array}{c} z \\ w \end{array}\right)=\left(\begin{array}{l} \operatorname{Re} z \\ \operatorname{Im} z \\ \operatorname{Re} w \\ \operatorname{Im} w \end{array}\right)$

and define a linear map $\mathcal{B}: \mathbb{R}^{4} \rightarrow \mathbb{R}^{4}$ by $\mathcal{B}=\mathcal{R} \mathcal{A} \mathcal{R}^{-1}$. Find the image of each of the standard basis vectors of $\mathbb{R}^{4}$ under both $\mathcal{R}^{-1}$ and $\mathcal{B}$. Hence, or otherwise, find the matrix $B$ of $\mathcal{B}$ with respect to the standard basis of $\mathbb{R}^{4}$ and verify that $\operatorname{det} B=|\operatorname{det} A|^{2}$.

• # Paper 1, Section II, 5B

(i) For vectors $\mathbf{a}, \mathbf{b}, \mathbf{c} \in \mathbb{R}^{3}$, show that

$\mathbf{a} \times(\mathbf{b} \times \mathbf{c})=(\mathbf{a} \cdot \mathbf{c}) \mathbf{b}-(\mathbf{a} \cdot \mathbf{b}) \mathbf{c} .$

Show that the plane $(\mathbf{r}-\mathbf{a}) \cdot \mathbf{n}=0$ and the line $(\mathbf{r}-\mathbf{b}) \times \mathbf{m}=\mathbf{0}$, where $\mathbf{m} \cdot \mathbf{n} \neq 0$, intersect at the point

$\mathbf{r}=\frac{(\mathbf{a} \cdot \mathbf{n}) \mathbf{m}+\mathbf{n} \times(\mathbf{b} \times \mathbf{m})}{\mathbf{m} \cdot \mathbf{n}}$

and only at that point. What happens if $\mathbf{m} \cdot \mathbf{n}=0$ ?

(ii) Explain why the distance between the planes $\left(\mathbf{r}-\mathbf{a}_{1}\right) \cdot \hat{\mathbf{n}}=0$ and $\left(\mathbf{r}-\mathbf{a}_{2}\right) \cdot \hat{\mathbf{n}}=0$ is $\left|\left(\mathbf{a}_{1}-\mathbf{a}_{2}\right) \cdot \hat{\mathbf{n}}\right|$, where $\hat{\mathbf{n}}$ is a unit vector.

(iii) Find the shortest distance between the lines $(3+s, 3 s, 4-s)$ and $(-2,3+t, 3-t)$ where $s, t \in \mathbb{R}$. [You may wish to consider two appropriately chosen planes and use the result of part (ii).]
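A numeric check of part (iii) via the common-normal formula for skew lines (equivalent to the two-plane approach the hint suggests: both planes contain one line each and share the normal $\mathbf{d}_1 \times \mathbf{d}_2$):

```python
def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Lines (3+s, 3s, 4-s) and (-2, 3+t, 3-t): a point on each and the directions
a1, d1 = (3, 0, 4), (1, 3, -1)
a2, d2 = (-2, 3, 3), (0, 1, -1)

# Common normal n = d1 x d2; distance = |(a1 - a2) . n| / |n|
n = cross(d1, d2)
diff = tuple(x - y for x, y in zip(a1, a2))
dist = abs(dot(diff, n)) / dot(n, n) ** 0.5
print(dist)  # 12/sqrt(6) = 2*sqrt(6)
```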

• # Paper 1, Section II, A

Let $A$ be a real $n \times n$ symmetric matrix.

(i) Show that all eigenvalues of $A$ are real, and that the eigenvectors of $A$ with respect to different eigenvalues are orthogonal. Assuming that any real symmetric matrix can be diagonalised, show that there exists an orthonormal basis $\left\{\mathbf{y}_{i}\right\}$ of eigenvectors of $A$.
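A concrete instance of part (i) (illustration only, with a hand-picked $2 \times 2$ example): $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ has eigenvalues $1$ and $3$ with eigenvectors $(1,-1)$ and $(1,1)$, which are orthogonal:

```python
A = [[2, 1], [1, 2]]  # sample real symmetric matrix

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

v1, lam1 = [1, -1], 1
v2, lam2 = [1, 1], 3
print(mat_vec(A, v1), mat_vec(A, v2))        # eigenvector equations hold
print(sum(x * y for x, y in zip(v1, v2)))    # 0: eigenvectors are orthogonal
```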

(ii) Consider the linear system

$A \mathbf{x}=\mathbf{b} .$

Show that this system has a solution if and only if $\mathbf{b} \cdot \mathbf{h}=0$ for every vector $\mathbf{h}$ in the kernel of $A$. Let $\mathbf{x}$ be such a solution. Given an eigenvector of $A$ with non-zero eigenvalue, determine the component of $\mathbf{x}$ in the direction of this eigenvector. Use this result to find the general solution of the linear system, in the form

$\mathbf{x}=\sum_{i=1}^{n} \alpha_{i} \mathbf{y}_{i}$

• # Paper 1, Section II, C

Let $A$ and $B$ be complex $n \times n$ matrices.

(i) The commutator of $A$ and $B$ is defined to be

$[A, B] \equiv A B-B A$

Show that $[A, A]=0 ;[A, B]=-[B, A] ;$ and $[A, \lambda B]=\lambda[A, B]$ for $\lambda \in \mathbb{C}$. Show further that the trace of $[A, B]$ vanishes.

(ii) A skew-Hermitian matrix $S$ is one which satisfies $S^{\dagger}=-S$, where $\dagger$ denotes the Hermitian conjugate. Show that if $A$ and $B$ are skew-Hermitian then so is $[A, B]$.

(iii) Let $\mathcal{M}$ be the linear map from $\mathbb{R}^{3}$ to the set of $2 \times 2$ complex matrices given by

$\mathcal{M}\left(\begin{array}{l} x \\ y \\ z \end{array}\right)=x M_{1}+y M_{2}+z M_{3}$

where

$M_{1}=\frac{1}{2}\left(\begin{array}{cc} i & 0 \\ 0 & -i \end{array}\right), \quad M_{2}=\frac{1}{2}\left(\begin{array}{cc} 0 & 1 \\ -1 & 0 \end{array}\right), \quad M_{3}=\frac{1}{2}\left(\begin{array}{cc} 0 & i \\ i & 0 \end{array}\right) \text {. }$

Prove that for any $\mathbf{a} \in \mathbb{R}^{3}, \mathcal{M}(\mathbf{a})$ is traceless and skew-Hermitian. By considering pairs such as $\left[M_{1}, M_{2}\right]$, or otherwise, show that for $\mathbf{a}, \mathbf{b} \in \mathbb{R}^{3}$,

$\mathcal{M}(\mathbf{a} \times \mathbf{b})=[\mathcal{M}(\mathbf{a}), \mathcal{M}(\mathbf{b})]$
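By linearity it suffices to check the identity on basis vectors: since $\mathbf{e}_1 \times \mathbf{e}_2 = \mathbf{e}_3$ and cyclically, the claim reduces to $[M_1, M_2] = M_3$, $[M_2, M_3] = M_1$, $[M_3, M_1] = M_2$. A direct check with hand-rolled $2 \times 2$ matrix arithmetic:

```python
# 2x2 complex matrices as nested lists, with explicit multiply and commutator
def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def commutator(X, Y):
    XY, YX = mat_mul(X, Y), mat_mul(Y, X)
    return [[XY[i][j] - YX[i][j] for j in range(2)] for i in range(2)]

M1 = [[0.5j, 0], [0, -0.5j]]
M2 = [[0, 0.5], [-0.5, 0]]
M3 = [[0, 0.5j], [0.5j, 0]]

def close(X, Y, tol=1e-12):
    return all(abs(X[i][j] - Y[i][j]) < tol for i in range(2) for j in range(2))

# e1 x e2 = e3 (and cyclic permutations), so M(a x b) = [M(a), M(b)]
# reduces on the standard basis to these three commutator relations:
print(close(commutator(M1, M2), M3),
      close(commutator(M2, M3), M1),
      close(commutator(M3, M1), M2))
```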

(iv) Using the result of part (iii), or otherwise, prove that if $C$ is a traceless skew-Hermitian $2 \times 2$ matrix then there exist matrices $A, B$ such that $C=[A, B]$. [You may use geometrical properties of vectors in $\mathbb{R}^{3}$ without proof.]
