• # Paper 1, Section I, D

Let $\sum_{n \geqslant 0} a_{n} z^{n}$ be a complex power series. State carefully what it means for the power series to have radius of convergence $R$, with $R \in[0, \infty]$.

Suppose the power series has radius of convergence $R$, with $0 \leqslant R<\infty$. Show that the sequence $\left|a_{n} z^{n}\right|$ is unbounded if $|z|>R$.

Find the radius of convergence of $\sum_{n \geqslant 1} z^{n} / n^{3}$.
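[One possible solution sketch for the last part: with $a_{n}=1 / n^{3}$,

$\left|\frac{a_{n+1}}{a_{n}}\right|=\left(\frac{n}{n+1}\right)^{3} \rightarrow 1$

so by the ratio test the series converges for $|z|<1$ and diverges for $|z|>1$, giving $R=1$.]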

• # Paper 1, Section I, E

Find the limit of each of the following sequences; justify your answers.

(i)

$\frac{1+2+\ldots+n}{n^{2}}$

(ii)

$\sqrt[n]{n}$

(iii)

$\left(a^{n}+b^{n}\right)^{1 / n} \quad \text { with } \quad 0<a<b$
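[A possible sketch, assuming the bound in (iii) is $0<a<b$: (i) since $1+2+\ldots+n=n(n+1) / 2$, the sequence equals $(n+1) /(2 n) \rightarrow 1 / 2$; (ii) writing $\sqrt[n]{n}=e^{(\log n) / n}$ and noting $(\log n) / n \rightarrow 0$ gives limit 1; (iii) from $b \leqslant\left(a^{n}+b^{n}\right)^{1 / n} \leqslant 2^{1 / n} b$ and $2^{1 / n} \rightarrow 1$, the limit is $b$.]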

• # Paper 1, Section II, D

Define what it means for a bounded function $f:[a, b] \rightarrow \mathbb{R}$ to be Riemann integrable.

Show that a monotonic function $f:[a, b] \rightarrow \mathbb{R}$ is Riemann integrable, where $-\infty<a<b<\infty$.

Prove that if $f:[1, \infty) \rightarrow \mathbb{R}$ is a decreasing function with $f(x) \rightarrow 0$ as $x \rightarrow \infty$, then $\sum_{n \geqslant 1} f(n)$ and $\int_{1}^{\infty} f(x) d x$ either both diverge or both converge.

Hence determine, for $\alpha \in \mathbb{R}$, when $\sum_{n \geqslant 1} n^{\alpha}$ converges.
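[One possible route for the last part: for $\alpha<0$ the function $f(x)=x^{\alpha}$ is decreasing with $f(x) \rightarrow 0$, and $\int_{1}^{\infty} x^{\alpha} d x$ converges if and only if $\alpha<-1$; for $\alpha \geqslant 0$ the terms do not tend to 0. Hence $\sum_{n \geqslant 1} n^{\alpha}$ converges if and only if $\alpha<-1$.]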

• # Paper 1, Section II, E

Determine whether the following series converge or diverge. Any tests that you use should be carefully stated.

(a)

$\sum_{n \geqslant 1} \frac{n !}{n^{n}}$

(b)

$\sum_{n \geqslant 1} \frac{1}{n+(\log n)^{2}}$

(c)

$\sum_{n \geqslant 1} \frac{(-1)^{n}}{1+\sqrt{n}}$

(d)

$\sum_{n \geqslant 1} \frac{(-1)^{n}}{n\left(2+(-1)^{n}\right)}$
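[A possible line of attack: (a) converges, since the ratio of successive terms is $(n /(n+1))^{n} \rightarrow 1 / e<1$; (b) diverges, since $(\log n)^{2} \leqslant n$ for large $n$, so the terms are eventually at least $1 /(2 n)$; (c) converges by the alternating series test, as $1 /(1+\sqrt{n})$ decreases to 0; (d) diverges: the terms for $n=2 k-1$ and $n=2 k$ sum to $-1 /(2 k-1)+1 /(6 k)$, which is comparable to $-1 /(3 k)$, so the partial sums tend to $-\infty$.]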

• # Paper 1, Section II, F

(a) Let $n \geqslant 1$ and $f$ be a function $\mathbb{R} \rightarrow \mathbb{R}$. Define carefully what it means for $f$ to be $n$ times differentiable at a point $x_{0} \in \mathbb{R}$.

$\text { Set } \operatorname{sign}(x)= \begin{cases}x /|x|, & x \neq 0 \\ 0, & x=0 .\end{cases}$

Consider the function $f(x)$ on the real line, with $f(0)=0$ and

$f(x)=x^{2} \operatorname{sign}(x)\left|\cos \frac{\pi}{x}\right|, \quad x \neq 0 .$

(b) Is $f(x)$ differentiable at $x=0$ ?

(c) Show that $f(x)$ has points of non-differentiability in any neighbourhood of $x=0$.

(d) Prove that, in any finite interval $I$, the derivative $f^{\prime}(x)$, at the points $x \in I$ where it exists, is bounded: $\left|f^{\prime}(x)\right| \leqslant C$ where $C$ depends on $I$.
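[A possible sketch: for (b), $|f(x) / x| \leqslant|x| \rightarrow 0$, so $f$ is differentiable at 0 with $f^{\prime}(0)=0$. For (c), at the points $x_{k}=2 /(2 k+1)$ the factor $\left|\cos \frac{\pi}{x}\right|$ vanishes and has a corner (as $\cos \frac{\pi}{x}$ changes sign there), while $x^{2} \operatorname{sign}(x) \neq 0$, so $f$ is not differentiable at $x_{k}$, and $x_{k} \rightarrow 0$. For (d), wherever $f^{\prime}$ exists and $x \neq 0$,

$\left|f^{\prime}(x)\right| \leqslant 2|x|\left|\cos \frac{\pi}{x}\right|+x^{2} \cdot \frac{\pi}{x^{2}}\left|\sin \frac{\pi}{x}\right| \leqslant 2|x|+\pi$

which is bounded on any finite interval.]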

• # Paper 1, Section II, F

(a) State and prove Taylor's theorem with the remainder in Lagrange's form.

(b) Suppose that $e: \mathbb{R} \rightarrow \mathbb{R}$ is a differentiable function such that $e(0)=1$ and $e^{\prime}(x)=e(x)$ for all $x \in \mathbb{R}$. Use the result of (a) to prove that

$e(x)=\sum_{n \geqslant 0} \frac{x^{n}}{n !} \quad \text { for all } \quad x \in \mathbb{R}$

[No property of the exponential function may be assumed.]
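[One possible sketch of (b): Taylor's theorem about 0 with Lagrange remainder gives, for fixed $x$,

$e(x)=\sum_{k=0}^{n} \frac{x^{k}}{k !}+\frac{e(\xi)}{(n+1) !} x^{n+1}$

for some $\xi$ between 0 and $x$, using $e^{(k)}=e$ for every $k$. Since $e$ is continuous it is bounded on $[-|x|,|x|]$, say by $M$, and $M|x|^{n+1} /(n+1) ! \rightarrow 0$ as $n \rightarrow \infty$, so the partial sums converge to $e(x)$.]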


• # Paper 1, Section I, 1A

Let $A$ be the matrix representing a linear map $\Phi: \mathbb{R}^{n} \rightarrow \mathbb{R}^{m}$ with respect to the bases $\left\{\mathbf{b}_{1}, \ldots, \mathbf{b}_{n}\right\}$ of $\mathbb{R}^{n}$ and $\left\{\mathbf{c}_{1}, \ldots, \mathbf{c}_{m}\right\}$ of $\mathbb{R}^{m}$, so that $\Phi\left(\mathbf{b}_{i}\right)=A_{j i} \mathbf{c}_{j}$. Let $\left\{\mathbf{b}_{1}^{\prime}, \ldots, \mathbf{b}_{n}^{\prime}\right\}$ be another basis of $\mathbb{R}^{n}$ and let $\left\{\mathbf{c}_{1}^{\prime}, \ldots, \mathbf{c}_{m}^{\prime}\right\}$ be another basis of $\mathbb{R}^{m}$. Show that the matrix $A^{\prime}$ representing $\Phi$ with respect to these new bases satisfies $A^{\prime}=C^{-1} A B$ with matrices $B$ and $C$ which should be defined.
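[One possible sketch: defining the invertible change-of-basis matrices $B$ by $\mathbf{b}_{i}^{\prime}=B_{j i} \mathbf{b}_{j}$ and $C$ by $\mathbf{c}_{i}^{\prime}=C_{j i} \mathbf{c}_{j}$, one computes $\Phi\left(\mathbf{b}_{i}^{\prime}\right)=B_{j i} \Phi\left(\mathbf{b}_{j}\right)=B_{j i} A_{k j} \mathbf{c}_{k}=(A B)_{k i}\left(C^{-1}\right)_{l k} \mathbf{c}_{l}^{\prime}$, so that $A^{\prime}=C^{-1} A B$.]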

• # Paper 1, Section I, C

(a) The complex numbers $z_{1}$ and $z_{2}$ satisfy the equations

$z_{1}^{3}=1, \quad z_{2}^{9}=512 .$

What are the possible values of $\left|z_{1}-z_{2}\right|$ ? Justify your answer.

(b) Show that $\left|z_{1}+z_{2}\right| \leqslant\left|z_{1}\right|+\left|z_{2}\right|$ for all complex numbers $z_{1}$ and $z_{2}$. Does the inequality $\left|z_{1}+z_{2}\right|+\left|z_{1}-z_{2}\right| \leqslant 2 \max \left(\left|z_{1}\right|,\left|z_{2}\right|\right)$ hold for all complex numbers $z_{1}$ and $z_{2}$ ? Justify your answer with a proof or a counterexample.
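[A possible sketch for (a): $\left|z_{1}\right|=1$ and $\left|z_{2}\right|=512^{1 / 9}=2$, with arguments that are multiples of $2 \pi / 3$ and $2 \pi / 9$ respectively, so the angle between $z_{1}$ and $z_{2}$ is $2 \pi m / 9$ for some integer $m$. By the cosine rule,

$\left|z_{1}-z_{2}\right|^{2}=1+4-4 \cos (2 \pi m / 9)$

giving the five values $\sqrt{5-4 \cos (2 \pi m / 9)}$ for $m=0,1,2,3,4$. For (b), the second inequality fails: $z_{1}=1, z_{2}=i$ gives $\left|z_{1}+z_{2}\right|+\left|z_{1}-z_{2}\right|=2 \sqrt{2}>2$.]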

• # Paper 1, Section II, 6C

Let $\mathbf{a}_{1}, \mathbf{a}_{2}$ and $\mathbf{a}_{3}$ be vectors in $\mathbb{R}^{3}$. Give a definition of the dot product, $\mathbf{a}_{1} \cdot \mathbf{a}_{2}$, the cross product, $\mathbf{a}_{1} \times \mathbf{a}_{2}$, and the triple product, $\mathbf{a}_{1} \cdot \mathbf{a}_{2} \times \mathbf{a}_{3}$. Explain what it means to say that the three vectors are linearly independent.

Let $\mathbf{b}_{1}, \mathbf{b}_{2}$ and $\mathbf{b}_{3}$ be vectors in $\mathbb{R}^{3}$. Let $S$ be a $3 \times 3$ matrix with entries $S_{i j}=\mathbf{a}_{i} \cdot \mathbf{b}_{j}$. Show that

$\left(\mathbf{a}_{1} \cdot \mathbf{a}_{2} \times \mathbf{a}_{3}\right)\left(\mathbf{b}_{1} \cdot \mathbf{b}_{2} \times \mathbf{b}_{3}\right)=\operatorname{det}(S)$

Hence show that $S$ is of maximal rank if and only if the sets of vectors $\left\{\mathbf{a}_{1}, \mathbf{a}_{2}\right.$, $\left.\mathbf{a}_{3}\right\}$ and $\left\{\mathbf{b}_{1}, \mathbf{b}_{2}, \mathbf{b}_{3}\right\}$ are both linearly independent.

Now let $\left\{\mathbf{c}_{1}, \mathbf{c}_{2}, \ldots, \mathbf{c}_{n}\right\}$ and $\left\{\mathbf{d}_{1}, \mathbf{d}_{2}, \ldots, \mathbf{d}_{n}\right\}$ be sets of vectors in $\mathbb{R}^{n}$, and let $T$ be an $n \times n$ matrix with entries $T_{i j}=\mathbf{c}_{i} \cdot \mathbf{d}_{j}$. Is it the case that $T$ is of maximal rank if and only if the sets of vectors $\left\{\mathbf{c}_{1}, \mathbf{c}_{2}, \ldots, \mathbf{c}_{n}\right\}$ and $\left\{\mathbf{d}_{1}, \mathbf{d}_{2}, \ldots, \mathbf{d}_{n}\right\}$ are both linearly independent? Justify your answer with a proof or a counterexample.

Given an integer $n>2$, is it always possible to find a set of vectors $\left\{\mathbf{c}_{1}, \mathbf{c}_{2}, \ldots, \mathbf{c}_{n}\right\}$ in $\mathbb{R}^{n}$ with the property that every pair is linearly independent and that every triple is linearly dependent? Justify your answer.
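[One possible route: if $A$ and $B$ are the matrices with rows $\mathbf{a}_{i}$ and $\mathbf{b}_{j}$ respectively, then $S=A B^{T}$, so $\operatorname{det}(S)=\operatorname{det}(A) \operatorname{det}(B)=\left(\mathbf{a}_{1} \cdot \mathbf{a}_{2} \times \mathbf{a}_{3}\right)\left(\mathbf{b}_{1} \cdot \mathbf{b}_{2} \times \mathbf{b}_{3}\right)$, and $S$ has maximal rank exactly when this product is non-zero. The same factorisation $T=C D^{T}$ works in $\mathbb{R}^{n}$, so the answer to the penultimate part is yes. For the last part the answer is also yes: for example, $\mathbf{c}_{i}=\mathbf{e}_{1}+i \mathbf{e}_{2}$ gives $n$ non-zero vectors in a plane, no two parallel, so every pair is independent while every triple is dependent.]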

• # Paper 1, Section II, A

Let $A$ and $B$ be real $n \times n$ matrices.

(i) Define the trace of $A, \operatorname{tr}(A)$, and show that $\operatorname{tr}\left(A^{T} B\right)=\operatorname{tr}\left(B^{T} A\right)$.

(ii) Show that $\operatorname{tr}\left(A^{T} A\right) \geqslant 0$, with $\operatorname{tr}\left(A^{T} A\right)=0$ if and only if $A$ is the zero matrix. Hence show that

$\left(\operatorname{tr}\left(A^{T} B\right)\right)^{2} \leqslant \operatorname{tr}\left(A^{T} A\right) \operatorname{tr}\left(B^{T} B\right)$

Under what condition on $A$ and $B$ is equality achieved?

(iii) Find a basis for the subspace of $2 \times 2$ matrices $X$ such that

$\begin{gathered} \operatorname{tr}\left(A^{T} X\right)=\operatorname{tr}\left(B^{T} X\right)=\operatorname{tr}\left(C^{T} X\right)=0 \\ \text { where } \quad A=\left(\begin{array}{ll} 1 & 1 \\ 2 & 0 \end{array}\right), \quad B=\left(\begin{array}{rr} 1 & 1 \\ 0 & -2 \end{array}\right), \quad C=\left(\begin{array}{ll} 0 & 0 \\ 1 & 1 \end{array}\right) \end{gathered}$
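[A possible computation for (iii): using $\operatorname{tr}\left(A^{T} X\right)=\sum_{i, j} A_{i j} X_{i j}$, the three conditions on $X=\left(\begin{array}{ll}x_{11} & x_{12} \\ x_{21} & x_{22}\end{array}\right)$ read $x_{11}+x_{12}+2 x_{21}=0$, $x_{11}+x_{12}-2 x_{22}=0$ and $x_{21}+x_{22}=0$; the third is half the difference of the first two, so only two conditions are independent and the subspace is 2-dimensional, with basis for example

$\left(\begin{array}{rr}1 & -1 \\ 0 & 0\end{array}\right), \quad\left(\begin{array}{rr}-1 & -1 \\ 1 & -1\end{array}\right)$]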

• # Paper 1, Section II, B

Let $R$ be a real orthogonal $3 \times 3$ matrix with a real eigenvalue $\lambda$ corresponding to some real eigenvector. Show algebraically that $\lambda=\pm 1$ and interpret this result geometrically.

Each of the matrices

$M=\left(\begin{array}{lll} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{array}\right), \quad N=\left(\begin{array}{rrr} 1 & -2 & -2 \\ 0 & 1 & -2 \\ 0 & 0 & 1 \end{array}\right), \quad P=\frac{1}{3}\left(\begin{array}{rrr} 1 & -2 & -2 \\ -2 & 1 & -2 \\ -2 & -2 & 1 \end{array}\right)$

has an eigenvalue $\lambda=1$. Confirm this by finding as many independent eigenvectors as possible with this eigenvalue, for each matrix in turn.

Show that one of the matrices above represents a rotation, and find the axis and angle of rotation. Which of the other matrices represents a reflection, and why?

State, with brief explanations, whether the matrices $M, N, P$ are diagonalisable (i) over the real numbers; (ii) over the complex numbers.
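[A possible sketch: $(M-I) \mathbf{x}=\mathbf{0}$ forces $x_{1}=x_{2}=x_{3}$, giving the single eigenvector $(1,1,1)^{T}$; $(N-I) \mathbf{x}=\mathbf{0}$ forces $x_{2}=x_{3}=0$, giving only $(1,0,0)^{T}$; $(P-I) \mathbf{x}=\mathbf{0}$ gives the whole plane $x_{1}+x_{2}+x_{3}=0$, so two independent eigenvectors. $M$ is orthogonal with $\operatorname{tr}(M)=0=1+2 \cos \theta$, a rotation by $2 \pi / 3$ about $(1,1,1)^{T}$; $P$ is symmetric and orthogonal with $P(1,1,1)^{T}=-(1,1,1)^{T}$, a reflection in the plane $x_{1}+x_{2}+x_{3}=0$.]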

• # Paper 1, Section II, B

Let $A$ be a complex $n \times n$ matrix with an eigenvalue $\lambda$. Show directly from the definitions that:

(i) $A^{r}$ has an eigenvalue $\lambda^{r}$ for any integer $r \geqslant 1$; and

(ii) if $A$ is invertible then $\lambda \neq 0$ and $A^{-1}$ has an eigenvalue $\lambda^{-1}$.

For any complex $n \times n$ matrix $A$, let $\chi_{A}(t)=\operatorname{det}(A-t I)$. Using standard properties of determinants, show that:

(iii) $\chi_{A^{2}}\left(t^{2}\right)=\chi_{A}(t) \chi_{A}(-t)$; and

(iv) if $A$ is invertible,

$\chi_{A^{-1}}(t)=(\operatorname{det} A)^{-1}(-1)^{n} t^{n} \chi_{A}\left(t^{-1}\right)$

Explain, including justifications, the relationship between the eigenvalues of $A$ and the polynomial $\chi_{A}(t)$.

If $A^{4}$ has an eigenvalue $\mu$, does it follow that $A$ has an eigenvalue $\lambda$ with $\lambda^{4}=\mu$ ? Give a proof or counterexample.
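[For (iv), one possible manipulation:

$\chi_{A^{-1}}(t)=\operatorname{det}\left(A^{-1}-t I\right)=\operatorname{det}\left(A^{-1}\right) \operatorname{det}(I-t A)=(\operatorname{det} A)^{-1}(-t)^{n} \operatorname{det}\left(A-t^{-1} I\right)$

For the last part, applying (iii) twice gives $\chi_{A^{4}}\left(t^{4}\right)=\chi_{A}(t) \chi_{A}(-t) \chi_{A}(i t) \chi_{A}(-i t)$; choosing any complex $t$ with $t^{4}=\mu$, one of $\pm t, \pm i t$ must be a root of $\chi_{A}$, hence an eigenvalue of $A$ whose fourth power is $\mu$, so the answer is yes.]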
