• 1.I.3F

State the ratio test for the convergence of a series.

Find all real numbers $x$ such that the series

$\sum_{n=1}^{\infty} \frac{x^{n}-1}{n}$

converges.

• 1.I.4E

Let $f:[0,1] \rightarrow \mathbb{R}$ be Riemann integrable, and for $0 \leqslant x \leqslant 1$ set $F(x)=\int_{0}^{x} f(t) d t$.

Assuming that $f$ is continuous, prove that for every $0 < x < 1$ the function $F$ is differentiable at $x$, with $F^{\prime}(x)=f(x)$.

If we do not assume that $f$ is continuous, must it still be true that $F$ is differentiable at every $0 < x < 1$? Justify your answer.

• 1.II.9F

Investigate the convergence of the series (i) $\sum_{n=2}^{\infty} \frac{1}{n^{p}(\log n)^{q}}$ (ii) $\sum_{n=3}^{\infty} \frac{1}{n(\log \log n)^{r}}$

for positive real values of $p, q$ and $r$.

[You may assume that for any positive real value of $\alpha$, $\log n < n^{\alpha}$ for $n$ sufficiently large. You may assume standard tests for convergence, provided that they are clearly stated.]

• 1.II.10D

(a) State and prove the intermediate value theorem.

(b) An interval is a subset $I$ of $\mathbb{R}$ with the property that if $x$ and $y$ belong to $I$ and $x < z < y$, then $z$ also belongs to $I$. Prove that if $I$ is an interval and $f$ is a continuous function from $I$ to $\mathbb{R}$ then $f(I)$ is an interval.

(c) For each of the following three pairs $(I, J)$ of intervals, either exhibit a continuous function $f$ from $I$ to $\mathbb{R}$ such that $f(I)=J$ or explain briefly why no such continuous function exists: (i) $I=[0,1], \quad J=[0, \infty)$; (ii) $I=(0,1], \quad J=[0, \infty)$; (iii) $I=(0,1], \quad J=(-\infty, \infty)$.

• 1.II.11D

(a) Let $f$ and $g$ be functions from $\mathbb{R}$ to $\mathbb{R}$ and suppose that both $f$ and $g$ are differentiable at the real number $x$. Prove that the product $f g$ is also differentiable at $x$.

(b) Let $f$ be a continuous function from $\mathbb{R}$ to $\mathbb{R}$ and let $g(x)=x^{2} f(x)$ for every $x$. Prove that $g$ is differentiable at $x$ if and only if either $x=0$ or $f$ is differentiable at $x$.

(c) Now let $f$ be any continuous function from $\mathbb{R}$ to $\mathbb{R}$ and let $g(x)=f(x)^{2}$ for every $x$. Prove that $g$ is differentiable at $x$ if and only if at least one of the following two possibilities occurs:

(i) $f$ is differentiable at $x$;

(ii) $f(x)=0$ and

$\frac{f(x+h)}{|h|^{1 / 2}} \longrightarrow 0 \quad \text { as } \quad h \rightarrow 0$
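As an illustration of case (ii), not part of the original question: with the hypothetical choice $f(x)=|x|^{3/4}$, $f$ is not differentiable at $0$, but $f(0)=0$ and $f(h)/|h|^{1/2}=|h|^{1/4}\to 0$, so $g=f^{2}$ should be differentiable at $0$. A minimal numerical sketch:

```python
# Hypothetical example (our choice, not from the question): f(x) = |x|^(3/4).
# f is not differentiable at 0, but f(0) = 0 and f(h)/|h|^(1/2) = |h|^(1/4) -> 0,
# so g = f^2 = |x|^(3/2) should be differentiable at 0 with g'(0) = 0.
def f(x):
    return abs(x) ** 0.75

def g(x):
    return f(x) ** 2

for h in [1e-2, 1e-4, 1e-6, 1e-8]:
    dq_f = f(h) / h   # difference quotient of f at 0: |h|^(-1/4), blows up
    dq_g = g(h) / h   # difference quotient of g at 0: |h|^(1/2), shrinks to 0
    print(h, dq_f, dq_g)
```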

• 1.II.12E

Let $\sum_{n=0}^{\infty} a_{n} z^{n}$ be a complex power series. Prove that there exists an $R \in[0, \infty]$ such that the series converges for every $z$ with $|z| < R$ and diverges for every $z$ with $|z|>R$.

Find the value of $R$ for each of the following power series: (i) $\sum_{n=1}^{\infty} \frac{1}{n^{2}} z^{n}$; (ii) $\sum_{n=0}^{\infty} z^{n !}$.

In each case, determine at which points on the circle $|z|=R$ the series converges.
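A numerical illustration (not a proof): in both cases one finds $R=1$, with series (i) converging everywhere on the circle and series (ii) nowhere on it, since its terms have modulus $1$ there. A quick stdlib sketch:

```python
import math
import cmath

# (i) a_n = 1/n^2: the root test gives |a_n|^(1/n) -> 1, so R = 1, and since
#     sum 1/n^2 converges absolutely, the series converges at every |z| = 1.
n = 10_000
root_test = (1.0 / n**2) ** (1.0 / n)
print(root_test)  # close to 1

# Partial sums of sum 1/n^2 at z = 1 settle near pi^2/6.
s = sum(1.0 / k**2 for k in range(1, n + 1))
print(s, math.pi**2 / 6)

# (ii) sum z^(n!): at any point of |z| = 1 every term has modulus exactly 1,
#     so the terms cannot tend to 0 and the series diverges on the whole circle.
z = cmath.exp(0.3j)                 # an arbitrary point on the unit circle
print(abs(z ** math.factorial(5)))  # modulus 1 up to rounding
```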


• 1.I.1B

State de Moivre's Theorem. By evaluating

$\sum_{r=1}^{n} e^{i r \theta}$

or otherwise, show that

$\sum_{r=1}^{n} \cos (r \theta)=\frac{\cos (n \theta)-\cos ((n+1) \theta)}{2(1-\cos \theta)}-\frac{1}{2}$

Hence show that

$\sum_{r=1}^{n} \cos \left(\frac{2 p \pi r}{n+1}\right)=-1$

where $p$ is an integer in the range $1 \leqslant p \leqslant n$.
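Both identities can be sanity-checked numerically; a minimal sketch with arbitrarily chosen $n$, $\theta$ and $p$:

```python
import math

# Check the closed form for sum_{r=1}^n cos(r*theta); n and theta are arbitrary.
n, theta = 7, 0.9
lhs = sum(math.cos(r * theta) for r in range(1, n + 1))
rhs = (math.cos(n * theta) - math.cos((n + 1) * theta)) / (2 * (1 - math.cos(theta))) - 0.5
print(lhs, rhs)  # agree to floating-point accuracy

# Second identity: the (n+1)-th roots of unity sum to zero, so for any integer
# 1 <= p <= n the cosines over r = 1..n sum to -1.
p = 3
s = sum(math.cos(2 * p * math.pi * r / (n + 1)) for r in range(1, n + 1))
print(s)  # close to -1
```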

• 1.I.2A

Let $U$ be an $n \times n$ unitary matrix $\left(U^{\dagger} U=U U^{\dagger}=I\right)$. Suppose that $A$ and $B$ are $n \times n$ Hermitian matrices such that $U=A+i B$.

Show that

(i) $A$ and $B$ commute,

(ii) $A^{2}+B^{2}=I$.

Find $A$ and $B$ in terms of $U$ and $U^{\dagger}$, and hence show that $A$ and $B$ are uniquely determined for a given $U$.
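A quick numerical sanity check (not a proof) with a sample $2\times 2$ unitary of our choosing, using the natural candidates $A=\frac{1}{2}(U+U^{\dagger})$ and $B=\frac{1}{2i}(U-U^{\dagger})$:

```python
# Sample unitary (our choice): U = (1/sqrt 2) [[1, i], [i, 1]].
# Candidates: A = (U + U†)/2 and B = (U - U†)/(2i), Hermitian with U = A + iB.
s = 2 ** -0.5
U = [[s, s * 1j], [s * 1j, s]]

def dagger(M):
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

def mul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

Ud = dagger(U)
A = [[(U[i][j] + Ud[i][j]) / 2 for j in range(2)] for i in range(2)]
B = [[(U[i][j] - Ud[i][j]) / 2j for j in range(2)] for i in range(2)]

# (i) A and B commute; (ii) A^2 + B^2 = I.
AB, BA = mul(A, B), mul(B, A)
A2, B2 = mul(A, A), mul(B, B)
Id = [[1, 0], [0, 1]]
assert all(abs(AB[i][j] - BA[i][j]) < 1e-12 for i in range(2) for j in range(2))
assert all(abs(A2[i][j] + B2[i][j] - Id[i][j]) < 1e-12 for i in range(2) for j in range(2))
print("checks pass")
```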

• 1.II.5B

(a) Use suffix notation to prove that

$\mathbf{a} \times(\mathbf{b} \times \mathbf{c})=(\mathbf{a} \cdot \mathbf{c}) \mathbf{b}-(\mathbf{a} \cdot \mathbf{b}) \mathbf{c}$

Hence, or otherwise, expand (i) $(\mathbf{a} \times \mathbf{b}) \cdot(\mathbf{c} \times \mathbf{d})$, (ii) $(\mathbf{a} \times \mathbf{b}) \cdot[(\mathbf{b} \times \mathbf{c}) \times(\mathbf{c} \times \mathbf{a})]$.
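The triple-product identity, together with the standard expansion $(\mathbf{a}\times\mathbf{b})\cdot(\mathbf{c}\times\mathbf{d})=(\mathbf{a}\cdot\mathbf{c})(\mathbf{b}\cdot\mathbf{d})-(\mathbf{a}\cdot\mathbf{d})(\mathbf{b}\cdot\mathbf{c})$ it yields in part (i), can be checked numerically on arbitrarily chosen vectors:

```python
# Numerical check of a x (b x c) = (a.c) b - (a.b) c and of
# (a x b).(c x d) = (a.c)(b.d) - (a.d)(b.c), with arbitrary vectors.
def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a, b, c, d = [1.0, 2.0, -1.0], [0.5, -1.0, 3.0], [2.0, 0.0, 1.0], [-1.0, 1.0, 2.0]

lhs1 = cross(a, cross(b, c))
rhs1 = [dot(a, c) * b[i] - dot(a, b) * c[i] for i in range(3)]
assert all(abs(x - y) < 1e-12 for x, y in zip(lhs1, rhs1))

lhs2 = dot(cross(a, b), cross(c, d))
rhs2 = dot(a, c) * dot(b, d) - dot(a, d) * dot(b, c)
assert abs(lhs2 - rhs2) < 1e-12
print("identities verified numerically")
```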

(b) Write down the equation of the line that passes through the point $\mathbf{a}$ and is parallel to the unit vector $\hat{\mathbf{t}}$.

The lines $L_{1}$ and $L_{2}$ in three dimensions pass through $\mathbf{a}_{1}$ and $\mathbf{a}_{2}$ respectively and are parallel to the unit vectors $\hat{\mathbf{t}}_{1}$ and $\hat{\mathbf{t}}_{2}$ respectively. Show that a necessary condition for $L_{1}$ and $L_{2}$ to intersect is

$\left(\mathbf{a}_{1}-\mathbf{a}_{2}\right) \cdot\left(\hat{\mathbf{t}}_{1} \times \hat{\mathbf{t}}_{2}\right)=0$

Why is this condition not sufficient?

In the case in which $L_{1}$ and $L_{2}$ are non-parallel and non-intersecting, find an expression for the shortest distance between them.
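One standard expression is $d=\left|(\mathbf{a}_{1}-\mathbf{a}_{2})\cdot(\hat{\mathbf{t}}_{1}\times\hat{\mathbf{t}}_{2})\right|/\left|\hat{\mathbf{t}}_{1}\times\hat{\mathbf{t}}_{2}\right|$; it can be sanity-checked against a brute-force search on a sample pair of skew lines (our choice):

```python
import math

# Sample skew lines: L1 through the origin along x, L2 through (0,0,1) along y.
# Their true shortest distance is 1.
def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a1, t1 = [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]
a2, t2 = [0.0, 0.0, 1.0], [0.0, 1.0, 0.0]

n = cross(t1, t2)
d_formula = abs(dot([a1[i] - a2[i] for i in range(3)], n)) / math.sqrt(dot(n, n))

# Brute force: sample points on each line and take the smallest separation.
grid = [k / 10 - 5 for k in range(101)]
d_search = min(
    math.sqrt(sum((a1[i] + s * t1[i] - a2[i] - u * t2[i]) ** 2 for i in range(3)))
    for s in grid for u in grid
)
print(d_formula, d_search)
```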

• 1.II.7C

Prove that any $n$ orthonormal vectors in $\mathbb{R}^{n}$ form a basis for $\mathbb{R}^{n}$.

Let $A$ be a real symmetric $n \times n$ matrix with $n$ orthonormal eigenvectors $\mathbf{e}_{i}$ and corresponding eigenvalues $\lambda_{i}$. Obtain coefficients $a_{i}$ such that

$\mathbf{x}=\sum_{i} a_{i} \mathbf{e}_{i}$

is a solution to the equation

$A \mathbf{x}-\mu \mathbf{x}=\mathbf{f},$

where $\mathbf{f}$ is a given vector and $\mu$ is a given scalar that is not an eigenvalue of $A$.

How would your answer differ if $\mu=\lambda_{1}$ ?

Find $a_{i}$ and hence $\mathbf{x}$ when

$A=\left(\begin{array}{ccc} 2 & 1 & 0 \\ 1 & 2 & 0 \\ 0 & 0 & 3 \end{array}\right) \quad \text { and } \quad \mathbf{f}=\left(\begin{array}{l} 1 \\ 2 \\ 3 \end{array}\right)$

in the cases (i) $\mu=2$ and (ii) $\mu=1$.
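For case (i), $\mu=2$ is not an eigenvalue (the eigenvalues of this $A$ are $3$, $1$, $3$), and the eigenvector expansion can be checked numerically; the eigenvectors below were computed by hand for this particular $A$:

```python
import math

# Check case (i): mu = 2 is not an eigenvalue of A, so x = sum a_i e_i with
# a_i = (f . e_i) / (lambda_i - mu) should solve A x - mu x = f.
A = [[2, 1, 0], [1, 2, 0], [0, 0, 3]]
f = [1, 2, 3]
mu = 2

# Orthonormal eigenvectors of A (hand-computed for this matrix).
r = 1 / math.sqrt(2)
eigs = [(3, [r, r, 0.0]), (1, [r, -r, 0.0]), (3, [0.0, 0.0, 1.0])]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

x = [0.0, 0.0, 0.0]
for lam, e in eigs:
    a = dot(f, e) / (lam - mu)
    x = [x[i] + a * e[i] for i in range(3)]

# Verify A x - mu x = f.
res = [dot(A[i], x) - mu * x[i] for i in range(3)]
print(x, res)
```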

• 1.II.6A

A real $3 \times 3$ matrix $A$ with elements $A_{i j}$ is said to be upper triangular if $A_{i j}=0$ whenever $i>j$. Prove that if $A$ and $B$ are upper triangular $3 \times 3$ real matrices then so is the matrix product $A B$.

Consider the matrix

$A=\left(\begin{array}{rrr} 1 & 2 & 0 \\ 0 & -1 & 1 \\ 0 & 0 & -1 \end{array}\right)$

Show that $A^{3}+A^{2}-A=I$. Write $A^{-1}$ as a linear combination of $A^{2}, A$ and $I$ and hence compute $A^{-1}$ explicitly.
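The identity can be checked numerically; multiplying $A^{3}+A^{2}-A=I$ on either side by $A^{-1}$ suggests $A^{-1}=A^{2}+A-I$, which the same check confirms:

```python
# Verify A^3 + A^2 - A = I for the given A, and that the inverse implied by
# that relation, A^{-1} = A^2 + A - I, really works (exact integer arithmetic).
A = [[1, 2, 0], [0, -1, 1], [0, 0, -1]]
Id = [[1 if i == j else 0 for j in range(3)] for i in range(3)]

def mul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

A2 = mul(A, A)
A3 = mul(A2, A)

lhs = [[A3[i][j] + A2[i][j] - A[i][j] for j in range(3)] for i in range(3)]
assert lhs == Id

Ainv = [[A2[i][j] + A[i][j] - Id[i][j] for j in range(3)] for i in range(3)]
assert mul(A, Ainv) == Id and mul(Ainv, A) == Id
print(Ainv)
```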

For all integers $n$ (including negative integers), prove that there exist coefficients $\alpha_{n}, \beta_{n}$ and $\gamma_{n}$ such that

$A^{n}=\alpha_{n} A^{2}+\beta_{n} A+\gamma_{n} I$

For all integers $n$ (including negative integers), show that

$\left(A^{n}\right)_{11}=1, \quad\left(A^{n}\right)_{22}=(-1)^{n}, \quad \text { and } \quad\left(A^{n}\right)_{23}=n(-1)^{n-1}$

Hence derive a set of 3 simultaneous equations for $\left\{\alpha_{n}, \beta_{n}, \gamma_{n}\right\}$ and find their solution.

• 1.II.8C

Prove that the eigenvalues of a Hermitian matrix are real and that eigenvectors corresponding to distinct eigenvalues are orthogonal (i.e. $\mathbf{e}_{i}^{*} \cdot \mathbf{e}_{j}=0$ ).

Let $A$ be a real $3 \times 3$ non-zero antisymmetric matrix. Show that $i A$ is Hermitian. Hence show that there exists a (complex) eigenvector $\mathbf{e}_{1}$ such that $A \mathbf{e}_{1}=\lambda \mathbf{e}_{1}$, where $\lambda$ is imaginary.

Show further that there exist real vectors $\mathbf{u}$ and $\mathbf{v}$ and a real number $\theta$ such that

$A \mathbf{u}=\theta \mathbf{v} \quad \text { and } \quad A \mathbf{v}=-\theta \mathbf{u}$

Show also that $A$ has a real eigenvector $\mathbf{e}_{3}$ such that $A \mathbf{e}_{3}=0$.

Let $R=I+\sum_{n=1}^{\infty} \frac{A^{n}}{n !}$. By considering the action of $R$ on $\mathbf{u}, \mathbf{v}$ and $\mathbf{e}_{3}$, show that $R$ is a rotation matrix.
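That $R=\exp(A)$ is a rotation can be illustrated numerically for a sample antisymmetric $A$ (our choice) by checking $R^{\mathrm{T}}R=I$ and $\det R=1$:

```python
# Truncated exponential series R = I + sum_{n>=1} A^n / n! for a sample real
# antisymmetric A; 30 terms is ample for entries of this size.
A = [[0.0, -0.7, 0.2], [0.7, 0.0, -0.5], [-0.2, 0.5, 0.0]]

def mul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

R = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
P = [row[:] for row in R]  # running term A^n / n!, starting from I
for n in range(1, 30):
    P = [[sum(P[i][k] * A[k][j] for k in range(3)) / n for j in range(3)] for i in range(3)]
    R = [[R[i][j] + P[i][j] for j in range(3)] for i in range(3)]

# Rotation checks: R^T R = I and det R = 1.
Rt = [[R[j][i] for j in range(3)] for i in range(3)]
RtR = mul(Rt, R)
det = (R[0][0] * (R[1][1] * R[2][2] - R[1][2] * R[2][1])
       - R[0][1] * (R[1][0] * R[2][2] - R[1][2] * R[2][0])
       + R[0][2] * (R[1][0] * R[2][1] - R[1][1] * R[2][0]))
print(RtR, det)
```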
