• # Paper 1, Section I, 4E

Show that if the power series $\sum_{n=0}^{\infty} a_{n} z^{n}(z \in \mathbb{C})$ converges for some fixed $z=z_{0}$, then it converges absolutely for every $z$ satisfying $|z|<\left|z_{0}\right|$.

Define the radius of convergence of a power series.

Give an example of $v \in \mathbb{C}$ and an example of $w \in \mathbb{C}$ such that $|v|=|w|=1, \sum_{n=1}^{\infty} \frac{v^{n}}{n}$ converges and $\sum_{n=1}^{\infty} \frac{w^{n}}{n}$ diverges. [You may assume results about standard series without proof.] Use this to find the radius of convergence of the power series $\sum_{n=1}^{\infty} \frac{z^{n}}{n}$.
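A quick numerical illustration (not a proof, and the choice of points is one of several that work): $v=-1$ gives the alternating harmonic series, which converges to $-\ln 2$, while $w=1$ gives the divergent harmonic series, whose partial sums grow like $\ln N$. Since both points lie on $|z|=1$, the radius of convergence is exactly 1.

```python
import math

# Partial sums of sum z^n / n at z = -1 (converges) and z = 1 (diverges).
N = 100_000
alternating = sum((-1) ** n / n for n in range(1, N + 1))   # v = -1: tends to -ln 2
harmonic = sum(1.0 / n for n in range(1, N + 1))            # w = 1: grows like ln N

print(alternating, harmonic)
```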

• # Paper 1, Section I, F

Given an increasing sequence of non-negative real numbers $\left(a_{n}\right)_{n=1}^{\infty}$, let

$s_{n}=\frac{1}{n} \sum_{k=1}^{n} a_{k}$

Prove that if $s_{n} \rightarrow x$ as $n \rightarrow \infty$ for some $x \in \mathbb{R}$, then also $a_{n} \rightarrow x$ as $n \rightarrow \infty$.
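The hypothesis that $(a_n)$ is increasing is essential. A hedged numeric sketch with a non-monotone (but still non-negative) sequence:

```python
# For a_n = (1 + (-1)^n)/2 = 0, 1, 0, 1, ..., the averages s_n converge to 1/2,
# yet a_n itself oscillates and does not converge. The sequence is non-negative
# but not increasing, so the result above does not apply.
N = 100_000
a = [(1 + (-1) ** n) / 2 for n in range(1, N + 1)]
s_N = sum(a) / N
print(s_N)            # 0.5
print(a[-2], a[-1])   # still oscillating: 0.0 1.0
```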

• # Paper 1, Section II, D

Let $a, b \in \mathbb{R}$ with $a<b$, and let $f:(a, b) \rightarrow \mathbb{R}$.

(a) Define what it means for $f$ to be continuous at $y_{0} \in(a, b)$.

$f$ is said to have a local minimum at $c \in(a, b)$ if there is some $\varepsilon>0$ such that $f(c) \leqslant f(x)$ whenever $x \in(a, b)$ and $|x-c|<\varepsilon$.

If $f$ has a local minimum at $c \in(a, b)$ and $f$ is differentiable at $c$, show that $f^{\prime}(c)=0$.

(b) $f$ is said to be convex if

$f(\lambda x+(1-\lambda) y) \leqslant \lambda f(x)+(1-\lambda) f(y)$

for every $x, y \in(a, b)$ and $\lambda \in[0,1]$. If $f$ is convex, $r \in \mathbb{R}$ and $\left[y_{0}-|r|, y_{0}+|r|\right] \subset(a, b)$, prove that

$(1+\lambda) f\left(y_{0}\right)-\lambda f\left(y_{0}-r\right) \leqslant f\left(y_{0}+\lambda r\right) \leqslant(1-\lambda) f\left(y_{0}\right)+\lambda f\left(y_{0}+r\right)$

for every $\lambda \in[0,1]$.

Deduce that if $f$ is convex then $f$ is continuous.

If $f$ is convex and has a local minimum at $c \in(a, b)$, prove that $f$ has a global minimum at $c$, i.e., that $f(x) \geqslant f(c)$ for every $x \in(a, b)$. [Hint: argue by contradiction.] Must $f$ be differentiable at $c$ ? Justify your answer.
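For the final part, a standard example worth checking numerically (a sketch, not the requested justification): $f(x)=|x|$ on $(-1,1)$ is convex with a global minimum at $c=0$, but its one-sided slopes at $0$ are $-1$ and $+1$, so a convex function need not be differentiable at its minimum.

```python
import random

# Spot-check the convexity inequality for f(x) = |x| at random points,
# then exhibit the mismatched one-sided slopes at 0.
random.seed(0)
f = abs
for _ in range(1000):
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    lam = random.random()
    # convexity inequality, with a little floating-point slack
    assert f(lam * x + (1 - lam) * y) <= lam * f(x) + (1 - lam) * f(y) + 1e-12

left_slope = (f(0.0) - f(-1e-6)) / 1e-6    # ≈ -1
right_slope = (f(1e-6) - f(0.0)) / 1e-6    # ≈ +1
print(left_slope, right_slope)
```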

• # Paper 1, Section II, D

(a) State the Intermediate Value Theorem.

(b) Define what it means for a function $f: \mathbb{R} \rightarrow \mathbb{R}$ to be differentiable at a point $a \in \mathbb{R}$. If $f$ is differentiable everywhere on $\mathbb{R}$, must $f^{\prime}$ be continuous everywhere? Justify your answer.
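For the question in (b), the standard counterexample (offered here as a hedged numeric sketch, not the requested justification) is $f(x)=x^{2} \sin (1 / x)$ for $x \neq 0$, $f(0)=0$: it is differentiable everywhere with $f^{\prime}(0)=0$, but $f^{\prime}(x)=2 x \sin (1 / x)-\cos (1 / x)$ has no limit as $x \rightarrow 0$.

```python
import math

# f'(x) for f(x) = x^2 sin(1/x), x != 0; at x = 0 the derivative is 0.
def fprime(x):
    return 2 * x * math.sin(1 / x) - math.cos(1 / x)

# Along x_n = 1/(2 pi n) the derivative tends to -1, far from f'(0) = 0,
# so f' is not continuous at 0.
vals = [fprime(1 / (2 * math.pi * n)) for n in (10, 100, 1000)]
print(vals)   # each value close to -1
```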

State the Mean Value Theorem.

(c) Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be differentiable everywhere. Let $a, b \in \mathbb{R}$ with $a<b$.

If $f^{\prime}(a) \leqslant y \leqslant f^{\prime}(b)$, prove that there exists $c \in[a, b]$ such that $f^{\prime}(c)=y$. [Hint: consider the function $g$ defined by

$g(x)=\frac{f(x)-f(a)}{x-a}$

if $x \neq a$ and $\left.g(a)=f^{\prime}(a) .\right]$

If additionally $f(a) \leqslant 0 \leqslant f(b)$, deduce that there exists $d \in[a, b]$ such that $f^{\prime}(d)+f(d)=y$.

• # Paper 1, Section II, E

Let $f:[a, b] \rightarrow \mathbb{R}$ be a bounded function defined on the closed, bounded interval $[a, b]$ of $\mathbb{R}$. Suppose that for every $\varepsilon>0$ there is a dissection $\mathcal{D}$ of $[a, b]$ such that $S_{\mathcal{D}}(f)-s_{\mathcal{D}}(f)<\varepsilon$, where $s_{\mathcal{D}}(f)$ and $S_{\mathcal{D}}(f)$ denote the lower and upper Riemann sums of $f$ for the dissection $\mathcal{D}$. Deduce that $f$ is Riemann integrable. [You may assume without proof that $s_{\mathcal{D}}(f) \leqslant S_{\mathcal{D}^{\prime}}(f)$ for all dissections $\mathcal{D}$ and $\mathcal{D}^{\prime}$ of $\left.[a, b] .\right]$

Prove that if $f:[a, b] \rightarrow \mathbb{R}$ is continuous, then $f$ is Riemann integrable.

Let $g:(0,1] \rightarrow \mathbb{R}$ be a bounded continuous function. Show that for any $\lambda \in \mathbb{R}$, the function $f:[0,1] \rightarrow \mathbb{R}$ defined by

$f(x)= \begin{cases}g(x) & \text { if } 0<x \leqslant 1 \\ \lambda & \text { if } x=0\end{cases}$

is Riemann integrable.

Let $f:[a, b] \rightarrow \mathbb{R}$ be a differentiable function with one-sided derivatives at the endpoints. Suppose that the derivative $f^{\prime}$ is (bounded and) Riemann integrable. Show that

$\int_{a}^{b} f^{\prime}(x) d x=f(b)-f(a)$

[You may use the Mean Value Theorem without proof.]
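The identity in the last part can be illustrated numerically (a sketch under an assumed sample function, not a proof): a Riemann sum for $f^{\prime}$ over $[a, b]$ approaches $f(b)-f(a)$. Here $f(x)=x^{3}$, so $f^{\prime}(x)=3 x^{2}$ and the integral over $[0,1]$ should be $f(1)-f(0)=1$.

```python
# Midpoint Riemann sum for f'(x) = 3x^2 over [0, 1].
n = 10_000
a, b = 0.0, 1.0
h = (b - a) / n
riemann = h * sum(3 * (a + (i + 0.5) * h) ** 2 for i in range(n))
print(riemann)   # close to f(1) - f(0) = 1
```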

• # Paper 1, Section II, F

(a) Let $\left(x_{n}\right)_{n=1}^{\infty}$ be a non-negative and decreasing sequence of real numbers. Prove that $\sum_{n=1}^{\infty} x_{n}$ converges if and only if $\sum_{k=0}^{\infty} 2^{k} x_{2^{k}}$ converges.

(b) For $s \in \mathbb{R}$, prove that $\sum_{n=1}^{\infty} n^{-s}$ converges if and only if $s>1$.

(c) For any $k \in \mathbb{N}$, prove that

$\lim _{n \rightarrow \infty} 2^{-n} n^{k}=0$

(d) The sequence $\left(a_{n}\right)_{n=0}^{\infty}$ is defined by $a_{0}=1$ and $a_{n+1}=2^{a_{n}}$ for $n \geqslant 0$. For any $k \in \mathbb{N}$, prove that

$\lim _{n \rightarrow \infty} \frac{2^{n^{k}}}{a_{n}}=0$
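An exact integer illustration of part (d) (not a proof; the choice $k=3$ is arbitrary): the tower $a_{0}=1$, $a_{n+1}=2^{a_{n}}$ reaches $a_{5}=2^{65536}$, which already dwarfs $2^{5^{3}}=2^{125}$, so the ratio $2^{n^{k}} / a_{n}$ is minuscule well before $n$ grows large.

```python
# Build the tower with Python's exact integers and compare exponents at n = 5.
a = [1]
for _ in range(5):
    a.append(2 ** a[-1])      # a = [1, 2, 4, 16, 65536, 2**65536]
k = 3
dominates = a[5] > 2 ** (5 ** k)   # 2**65536 versus 2**125
print(a[:5], dominates)
```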


• # Paper 1, Section I, A

Consider $z \in \mathbb{C}$ with $|z|=1$ and $\arg z=\theta$, where $\theta \in[0, \pi)$.

(a) Prove algebraically that the modulus of $1+z$ is $2 \cos \frac{1}{2} \theta$ and that the argument is $\frac{1}{2} \theta$. Obtain these results geometrically using the Argand diagram.

(b) Obtain corresponding results algebraically and geometrically for $1-z$.
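A numeric spot-check of the stated formulas, together with what part (b) should yield (a sketch at one sample angle, not the requested algebraic or geometric argument): for $z=e^{i \theta}$ with $\theta \in[0, \pi)$, $1-z$ has modulus $2 \sin \frac{1}{2} \theta$ and argument $\frac{1}{2}(\theta-\pi)$.

```python
import cmath
import math

# Check |1 + z|, arg(1 + z), |1 - z| and arg(1 - z) at theta = 1.2.
theta = 1.2
z = cmath.exp(1j * theta)
w_plus, w_minus = 1 + z, 1 - z
print(abs(w_plus), cmath.phase(w_plus))     # 2 cos(theta/2), theta/2
print(abs(w_minus), cmath.phase(w_minus))   # 2 sin(theta/2), (theta - pi)/2
```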

• # Paper 1, Section I, C

Let $A$ and $B$ be real $n \times n$ matrices.

Show that $(A B)^{T}=B^{T} A^{T}$.

For any square matrix, the matrix exponential is defined by the series

$e^{A}=I+\sum_{k=1}^{\infty} \frac{A^{k}}{k !}$

Show that $\left(e^{A}\right)^{T}=e^{A^{T}}$. [You are not required to consider issues of convergence.]

Calculate, in terms of $A$ and $A^{T}$, the matrices $Q_{0}, Q_{1}$ and $Q_{2}$ in the series for the matrix product

$e^{t A} e^{t A^{T}}=\sum_{k=0}^{\infty} Q_{k} t^{k}, \quad \text { where } t \in \mathbb{R}$

Hence obtain a relation between $A$ and $A^{T}$ which necessarily holds if $e^{t A}$ is an orthogonal matrix.
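For the last part one expects the relation $A^{T}=-A$, since then the first-order coefficient $Q_{1}=A+A^{T}$ vanishes. A hedged sketch with an assumed $2 \times 2$ antisymmetric generator: for $A=\begin{pmatrix}0 & 1 \\ -1 & 0\end{pmatrix}$ one has $A^{2}=-I$, so $e^{t A}=\cos t \, I+\sin t \, A$, a rotation matrix, and we can check $R R^{T}=I$ numerically.

```python
import math

# e^{tA} for the antisymmetric generator above is a plane rotation by t.
t = 0.7
R = [[math.cos(t), math.sin(t)], [-math.sin(t), math.cos(t)]]
# Form R R^T entry by entry and confirm it is (numerically) the identity.
RRt = [[sum(R[i][k] * R[j][k] for k in range(2)) for j in range(2)] for i in range(2)]
print(RRt)   # approximately [[1, 0], [0, 1]]
```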

• # Paper 1, Section II, 8C

(a) Given $\mathbf{y} \in \mathbb{R}^{3}$ consider the linear transformation $T$ which maps

$\mathbf{x} \mapsto T \mathbf{x}=\left(\mathbf{x} \cdot \mathbf{e}_{1}\right) \mathbf{e}_{1}+\mathbf{x} \times \mathbf{y}$

Express $T$ as a matrix with respect to the standard basis $\mathbf{e}_{1}, \mathbf{e}_{2}, \mathbf{e}_{3}$, and determine the rank and the dimension of the kernel of $T$ for the cases (i) $\mathbf{y}=c_{1} \mathbf{e}_{1}$, where $c_{1}$ is a fixed number, and (ii) $\mathbf{y} \cdot \mathbf{e}_{1}=0$.

(b) Given that the equation

$A B \mathbf{x}=\mathbf{d}$

where

$A=\left(\begin{array}{ccc} 1 & 1 & 0 \\ 0 & 2 & 3 \\ 0 & 1 & 2 \end{array}\right), \quad B=\left(\begin{array}{ccc} 1 & 4 & 1 \\ -3 & -2 & 1 \\ 1 & -1 & -1 \end{array}\right) \quad \text { and } \quad \mathbf{d}=\left(\begin{array}{l} 1 \\ 1 \\ k \end{array}\right)$

has a solution, show that $4 k=1$.
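A consistency check of the answer (a sketch, not the requested proof; the vector $u=(1,-2,4)$ was found by hand and is asserted, not derived): $u$ is a left null vector of $AB$, so $A B \mathbf{x}=\mathbf{d}$ can only have a solution when $u \cdot \mathbf{d}=0$, i.e. $1-2+4 k=0$, giving $4 k=1$.

```python
# Multiply out AB exactly with integers and verify the left null vector.
A = [[1, 1, 0], [0, 2, 3], [0, 1, 2]]
B = [[1, 4, 1], [-3, -2, 1], [1, -1, -1]]
AB = [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
u = [1, -2, 4]
uAB = [sum(u[i] * AB[i][j] for i in range(3)) for j in range(3)]
print(uAB)                          # [0, 0, 0]
k = 0.25
print(u[0] * 1 + u[1] * 1 + u[2] * k)   # 0.0, so 4k = 1 is consistent
```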

• # Paper 1, Section II, A

(a) Define the vector product $\mathbf{x} \times \mathbf{y}$ of the vectors $\mathbf{x}$ and $\mathbf{y}$ in $\mathbb{R}^{3}$. Use suffix notation to prove that

$\mathbf{x} \times(\mathbf{x} \times \mathbf{y})=\mathbf{x}(\mathbf{x} \cdot \mathbf{y})-\mathbf{y}(\mathbf{x} \cdot \mathbf{x})$

(b) The vectors $\mathbf{x}_{n+1}(n=0,1,2, \ldots)$ are defined by $\mathbf{x}_{n+1}=\lambda \mathbf{a} \times \mathbf{x}_{n}$, where $\mathbf{a}$ and $\mathbf{x}_{0}$ are fixed vectors with $|\mathbf{a}|=1$ and $\mathbf{a} \times \mathbf{x}_{0} \neq \mathbf{0}$, and $\lambda$ is a positive constant.

(i) Write $\mathbf{x}_{2}$ as a linear combination of $\mathbf{a}$ and $\mathbf{x}_{0}$. Further, for $n \geqslant 1$, express $\mathbf{x}_{n+2}$ in terms of $\lambda$ and $\mathbf{x}_{n}$. Show, for $n \geqslant 1$, that $\left|\mathbf{x}_{n}\right|=\lambda^{n}\left|\mathbf{a} \times \mathbf{x}_{0}\right|$.

(ii) Let $X_{n}$ be the point with position vector $\mathbf{x}_{n}(n=0,1,2, \ldots)$. Show that $X_{1}, X_{2}, \ldots$ lie on a pair of straight lines.

(iii) Show that the line segment $X_{n} X_{n+1}(n \geqslant 1)$ is perpendicular to $X_{n+1} X_{n+2}$. Deduce that $X_{n} X_{n+1}$ is parallel to $X_{n+2} X_{n+3}$.

Show that $\mathbf{x}_{n} \rightarrow \mathbf{0}$ as $n \rightarrow \infty$ if $\lambda<1$, and give a sketch to illustrate the case $\lambda=1$.

(iv) The straight line through the points $X_{n+1}$ and $X_{n+2}$ makes an angle $\theta$ with the straight line through the points $X_{n}$ and $X_{n+3}$. Find $\cos \theta$ in terms of $\lambda$.
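The claims in (i)-(iii) can be spot-checked numerically (a sketch under sample choices of $\mathbf{a}$, $\mathbf{x}_{0}$ and $\lambda$, which are illustrative assumptions): $\mathbf{x}_{n+2}=-\lambda^{2} \mathbf{x}_{n}$ for $n \geqslant 1$, $\left|\mathbf{x}_{2}\right|=\lambda^{2}\left|\mathbf{a} \times \mathbf{x}_{0}\right|$, and consecutive segments are perpendicular.

```python
import math

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(p * q for p, q in zip(u, v))

lam = 0.8
a = [0.0, 0.0, 1.0]          # unit vector
x = [[1.0, 2.0, 0.5]]        # x_0, chosen so that a x x_0 != 0
for _ in range(5):
    x.append([lam * c for c in cross(a, x[-1])])

# x_3 = -lam^2 x_1
two_step = all(abs(x[3][i] + lam**2 * x[1][i]) < 1e-12 for i in range(3))
# |x_2| = lam^2 |a x x_0|
ax0 = cross(a, x[0])
mod_ok = abs(math.sqrt(dot(x[2], x[2])) - lam**2 * math.sqrt(dot(ax0, ax0))) < 1e-12
# segment X_1 X_2 perpendicular to X_2 X_3
seg1 = [x[2][i] - x[1][i] for i in range(3)]
seg2 = [x[3][i] - x[2][i] for i in range(3)]
perp = abs(dot(seg1, seg2)) < 1e-12
print(two_step, mod_ok, perp)   # True True True
```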

• # Paper 1, Section II, B

(a) Show that a square matrix $A$ is anti-symmetric if and only if $\mathbf{x}^{T} A \mathbf{x}=0$ for every vector $\mathbf{x}$.

(b) Let $A$ be a real anti-symmetric $n \times n$ matrix. Show that the eigenvalues of $A$ are imaginary or zero, and that the eigenvectors corresponding to distinct eigenvalues are orthogonal (in the sense that $\mathbf{x}^{\dagger} \mathbf{y}=0$, where the dagger denotes the hermitian conjugate).

(c) Let $A$ be a non-zero real $3 \times 3$ anti-symmetric matrix. Show that there is a real non-zero vector $\mathbf{a}$ such that $A \mathbf{a}=\mathbf{0}$.

Now let $\mathbf{b}$ be a real vector orthogonal to $\mathbf{a}$. Show that $A^{2} \mathbf{b}=-\theta^{2} \mathbf{b}$ for some real number $\theta$.

The matrix $e^{A}$ is defined by the exponential series $I+A+\frac{1}{2 !} A^{2}+\cdots$. Express $e^{A} \mathbf{a}$ and $e^{A} \mathbf{b}$ in terms of $\mathbf{a}, \mathbf{b}, A \mathbf{b}$ and $\theta$.
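One expects $e^{A} \mathbf{a}=\mathbf{a}$ and $e^{A} \mathbf{b}=\cos \theta \, \mathbf{b}+\frac{\sin \theta}{\theta} A \mathbf{b}$. A hedged numeric sketch with a concrete antisymmetric $A$ (this particular $A$, $\mathbf{a}=\mathbf{e}_{3}$ and $\mathbf{b}=\mathbf{e}_{1}$ are illustrative assumptions):

```python
import math

theta = 0.9
A = [[0.0, -theta, 0.0], [theta, 0.0, 0.0], [0.0, 0.0, 0.0]]  # antisymmetric
a_vec = [0.0, 0.0, 1.0]   # satisfies A a = 0
b = [1.0, 0.0, 0.0]       # orthogonal to a_vec; here A^2 b = -theta^2 b

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def exp_apply(v, terms=30):
    # truncated exponential series: e^A v = v + A v + A^2 v / 2! + ...
    out, term = v[:], v[:]
    for kk in range(1, terms):
        term = [t / kk for t in matvec(A, term)]
        out = [o + t for o, t in zip(out, term)]
    return out

ea = exp_apply(a_vec)   # expect a_vec itself
eb = exp_apply(b)       # expect cos(theta) b + (sin(theta)/theta) A b
Ab = matvec(A, b)
expected_eb = [math.cos(theta) * b[i] + (math.sin(theta) / theta) * Ab[i] for i in range(3)]
```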

[You are not required to consider issues of convergence.]

• # Paper 1, Section II, B

(a) Show that the eigenvalues of any real $n \times n$ square matrix $A$ are the same as the eigenvalues of $A^{T}$.

The eigenvalues of $A$ are $\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}$ and the eigenvalues of $A^{T} A$ are $\mu_{1}, \mu_{2}, \ldots$, $\mu_{n}$. Determine, by means of a proof or a counterexample, whether the following are necessarily valid: (i) $\sum_{i=1}^{n} \mu_{i}=\sum_{i=1}^{n} \lambda_{i}^{2}$; (ii) $\prod_{i=1}^{n} \mu_{i}=\prod_{i=1}^{n} \lambda_{i}^{2}$.

(b) The $3 \times 3$ matrix $B$ is given by

$B=I+\mathbf{m n}^{T}$

where $\mathbf{m}$ and $\mathbf{n}$ are orthogonal real unit vectors and $I$ is the $3 \times 3$ identity matrix.

(i) Show that $\mathbf{m} \times \mathbf{n}$ is an eigenvector of $B$, and write down a linearly independent eigenvector. Find the eigenvalues of $B$ and determine whether $B$ is diagonalisable.

(ii) Find the eigenvectors and eigenvalues of $B^{T} B$.
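A concrete instance of part (b) worth checking (the choice $\mathbf{m}=\mathbf{e}_{1}$, $\mathbf{n}=\mathbf{e}_{2}$ is an illustrative assumption): since $\mathbf{n} \cdot \mathbf{m}=0$, $(B-I)^{2}=\left(\mathbf{m n}^{T}\right)^{2}=0$, so the only eigenvalue of $B$ is 1; as $B \neq I$, $B$ cannot be diagonalisable.

```python
# Build B = I + m n^T with exact integers and verify (B - I)^2 = 0, B != I.
m = [1, 0, 0]
n = [0, 1, 0]
I3 = [[1 if i == j else 0 for j in range(3)] for i in range(3)]
B = [[I3[i][j] + m[i] * n[j] for j in range(3)] for i in range(3)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

N = [[B[i][j] - I3[i][j] for j in range(3)] for i in range(3)]
N2 = matmul(N, N)
print(N2)        # [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
print(B != I3)   # True
```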
