• # Paper 1, Section I, B

The matrix

$A=\left(\begin{array}{rr} 2 & -1 \\ 2 & 0 \\ -1 & 1 \end{array}\right)$

represents a linear map $\Phi: \mathbb{R}^{2} \rightarrow \mathbb{R}^{3}$ with respect to the bases

$B=\left\{\left(\begin{array}{l} 0 \\ 2 \end{array}\right),\left(\begin{array}{r} -2 \\ 0 \end{array}\right)\right\}, \quad C=\left\{\left(\begin{array}{l} 1 \\ 1 \\ 0 \end{array}\right),\left(\begin{array}{l} 0 \\ 1 \\ 0 \end{array}\right),\left(\begin{array}{l} 0 \\ 1 \\ 1 \end{array}\right)\right\}$

Find the matrix $A^{\prime}$ that represents $\Phi$ with respect to the bases

$B^{\prime}=\left\{\left(\begin{array}{l} 1 \\ 1 \end{array}\right),\left(\begin{array}{r} 1 \\ -1 \end{array}\right)\right\}, \quad C^{\prime}=\left\{\left(\begin{array}{l} 1 \\ 0 \\ 0 \end{array}\right),\left(\begin{array}{l} 0 \\ 1 \\ 0 \end{array}\right),\left(\begin{array}{l} 0 \\ 0 \\ 1 \end{array}\right)\right\}$
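The change of basis can be checked numerically. The sketch below (with numpy; the matrix names are mine, not the paper's) builds $A'$ by composing coordinate maps: $B'$-coordinates to standard coordinates, then to $B$-coordinates, then through $A$, then from $C$-coordinates back to standard coordinates, which are exactly the $C'$-coordinates.

```python
import numpy as np

# Columns are the basis vectors given in the question.
B  = np.array([[0, -2], [2, 0]])                   # old basis B of R^2
Bp = np.array([[1, 1], [1, -1]])                   # new basis B'
C  = np.array([[1, 0, 0], [1, 1, 1], [0, 0, 1]])   # old basis C of R^3
# C' is the standard basis, so its matrix is the identity.

A = np.array([[2, -1], [2, 0], [-1, 1]])

# B'-coords -> standard -> B-coords -> (apply A) -> C-coords -> standard = C'-coords
A_prime = C @ A @ np.linalg.inv(B) @ Bp
print(A_prime)
```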

• # Paper 1, Section I, C

(a) Find all complex solutions to the equation $z^{i}=1$.

(b) Write down an equation for the numbers $z$ which describe, in the complex plane, a circle with radius 5 centred at $c=5 i$. Find the points on the circle at which it intersects the line passing through $c$ and $z_{0}=\frac{15}{4}$.
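For part (b), a quick numerical check (a sketch, not part of the paper; variable names are mine): the intersection points lie 5 units either side of the centre along the unit direction from $c$ towards $z_0$.

```python
# circle |z - c| = 5 with c = 5i; line through c and z0 = 15/4
c, z0 = 5j, 15/4
d = (z0 - c) / abs(z0 - c)      # unit direction of the line
p1, p2 = c + 5*d, c - 5*d       # the line meets the circle 5 units from c
print(p1, p2)                   # 3 + i and -3 + 9i
```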

• # Paper 1, Section II, 8B

(a) Consider the matrix

$A=\left(\begin{array}{rrr} \mu & 1 & 1 \\ 2 & -\mu & 0 \\ -\mu & 2 & 1 \end{array}\right)$

Find the kernel of $A$ for each real value of the constant $\mu$. Hence find how many solutions $\mathbf{x} \in \mathbb{R}^{3}$ there are to

$A \mathbf{x}=\left(\begin{array}{l} 1 \\ 1 \\ 2 \end{array}\right)$

depending on the value of $\mu$. [There is no need to find expressions for the solution(s).]
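A numerical check of part (a) (a numpy sketch under my own naming; it compares the rank of $A$ with the rank of the augmented matrix): $\det A = 2 - 2\mu^{2}$, so $A$ is singular exactly at $\mu = \pm 1$.

```python
import numpy as np

def A(mu):
    return np.array([[mu, 1, 1], [2, -mu, 0], [-mu, 2, 1]], float)

b = np.array([1.0, 1.0, 2.0])

# rank == augmented rank == 3 -> unique solution
# rank == augmented rank  < 3 -> infinitely many solutions
# rank  < augmented rank      -> no solution
for mu in (0.0, 1.0, -1.0, 3.0):
    r  = np.linalg.matrix_rank(A(mu))
    ra = np.linalg.matrix_rank(np.column_stack([A(mu), b]))
    print(mu, r, ra)
```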

(b) Consider the reflection map $\Phi: \mathbb{R}^{3} \rightarrow \mathbb{R}^{3}$ defined as

$\Phi: \mathbf{x} \mapsto \mathbf{x}-2(\mathbf{x} \cdot \mathbf{n}) \mathbf{n}$

where $\mathbf{n}$ is a unit vector normal to the plane of reflection.

(i) Find the matrix $H$ which corresponds to the map $\Phi$ in terms of the components of $\mathbf{n}$.

(ii) Prove that a reflection in a plane with unit normal $\mathbf{n}$ followed by a reflection in a plane with unit normal vector $\mathbf{m}$ (both containing the origin) is equivalent to a rotation along the line of intersection of the planes with an angle twice that between the planes.

[Hint: Choose your coordinate axes carefully.]

(iii) Briefly explain why a rotation followed by a reflection or vice-versa can never be equivalent to another rotation.
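Parts (i) and (ii) can be sanity-checked numerically. In this sketch (the helper name `H` and the random normals are my own choices) the composition of the two reflections has determinant $+1$, fixes the intersection line $\mathbf{n}\times\mathbf{m}$, and rotates by twice the angle between the planes.

```python
import numpy as np

def H(n):
    # part (i): the reflection matrix, H_ij = delta_ij - 2 n_i n_j
    return np.eye(3) - 2 * np.outer(n, n)

rng = np.random.default_rng(0)
n, m = rng.standard_normal((2, 3))
n, m = n / np.linalg.norm(n), m / np.linalg.norm(m)

R = H(m) @ H(n)                  # reflect in the n-plane, then in the m-plane
axis = np.cross(n, m)            # the planes (through 0) meet along n x m

print(np.isclose(np.linalg.det(R), 1.0))   # det +1: a rotation, not a reflection
print(np.allclose(R @ axis, axis))         # the intersection line is fixed
# rotation angle phi = 2*theta, so trace R = 1 + 2 cos(2 theta) with cos(theta) = n.m
print(np.isclose(np.trace(R), 1 + 2 * (2 * (n @ m)**2 - 1)))
```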

Part IA, 2021 List of Questions

• # Paper 1, Section II, A

Let $A$ be a real, symmetric $n \times n$ matrix.

We say that $A$ is positive semi-definite if $\mathbf{x}^{T} A \mathbf{x} \geqslant 0$ for all $\mathbf{x} \in \mathbb{R}^{n}$. Prove that $A$ is positive semi-definite if and only if all the eigenvalues of $A$ are non-negative. [You may quote results from the course, provided that they are clearly stated.]

We say that $A$ has a principal square root $B$ if $A=B^{2}$ for some symmetric, positive semi-definite $n \times n$ matrix $B$. If such a $B$ exists we write $B=\sqrt{A}$. Show that if $A$ is positive semi-definite then $\sqrt{A}$ exists.

Let $M$ be a real, non-singular $n \times n$ matrix. Show that $M^{T} M$ is symmetric and positive semi-definite. Deduce that $\sqrt{M^{T} M}$ exists and is non-singular. By considering the matrix

$M\left(\sqrt{M^{T} M}\right)^{-1}$

or otherwise, show $M=R P$ for some orthogonal $n \times n$ matrix $R$ and a symmetric, positive semi-definite $n \times n$ matrix $P$.

Describe the transformation $R P$ geometrically in the case $n=3$.
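The polar decomposition $M = RP$ constructed in this question can be checked numerically. A sketch (variable names mine): build $P=\sqrt{M^{T}M}$ from the eigendecomposition of $M^{T}M$, then verify that $R = MP^{-1}$ is orthogonal.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))            # generic, hence non-singular

# principal square root of M^T M via its eigendecomposition
w, V = np.linalg.eigh(M.T @ M)             # eigenvalues are non-negative
P = V @ np.diag(np.sqrt(w)) @ V.T          # symmetric, positive semi-definite
R = M @ np.linalg.inv(P)

print(np.allclose(R @ R.T, np.eye(3)))     # R is orthogonal
print(np.allclose(R @ P, M))               # M = R P
```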

• # Paper 1, Section II, A

(a) For an $n \times n$ matrix $A$ define the characteristic polynomial $\chi_{A}$ and the characteristic equation.

The Cayley-Hamilton theorem states that every $n \times n$ matrix satisfies its own characteristic equation. Verify this in the case $n=2$.

(b) Define the adjugate matrix $\operatorname{adj}(A)$ of an $n \times n$ matrix $A$ in terms of the minors of $A$. You may assume that

$A \operatorname{adj}(A)=\operatorname{adj}(A) A=\operatorname{det}(A) I$

where $I$ is the $n \times n$ identity matrix. Show that if $A$ and $B$ are non-singular $n \times n$ matrices then

$\operatorname{adj}(A B)=\operatorname{adj}(B) \operatorname{adj}(A) \tag{*}$

(c) Let $M$ be an arbitrary $n \times n$ matrix. Explain why

(i) there is an $\alpha>0$ such that $M-t I$ is non-singular for $0<t<\alpha$;

(ii) the entries of $\operatorname{adj}(M-t I)$ are polynomials in $t$.

Using parts (i) and (ii), or otherwise, show that $(*)$ holds for all matrices $A, B$.

(d) The characteristic polynomial of the arbitrary $n \times n$ matrix $A$ is

$\chi_{A}(z)=(-1)^{n} z^{n}+c_{n-1} z^{n-1}+\cdots+c_{1} z+c_{0}$

By considering adj $(A-t I)$, or otherwise, show that

$\operatorname{adj}(A)=(-1)^{n-1} A^{n-1}-c_{n-1} A^{n-2}-\cdots-c_{2} A-c_{1} I .$

[You may assume the Cayley-Hamilton theorem.]
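Both identities can be spot-checked numerically for $n=3$ (a sketch with numpy; `adj` and the random matrices are my own setup). For the part (d) formula, writing $\det(zI-A)=z^{3}+p_{1}z^{2}+p_{2}z+p_{3}$, the stated expression reduces to $\operatorname{adj}(A)=A^{2}+p_{1}A+p_{2}I$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

def adj(M):
    # for non-singular M, adj(M) = det(M) * M^{-1}
    return np.linalg.det(M) * np.linalg.inv(M)

print(np.allclose(adj(A @ B), adj(B) @ adj(A)))   # the identity (*)

# part (d) for n = 3: coefficients of det(zI - A) from np.poly
p = np.poly(A)                                    # [1, p1, p2, p3]
print(np.allclose(adj(A), A @ A + p[1] * A + p[2] * np.eye(3)))
```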

• # Paper 1, Section II, C

Using the standard formula relating products of the Levi-Civita symbol $\epsilon_{i j k}$ to products of the Kronecker $\delta_{i j}$, prove

$\mathbf{a} \times(\mathbf{b} \times \mathbf{c})=(\mathbf{a} \cdot \mathbf{c}) \mathbf{b}-(\mathbf{a} \cdot \mathbf{b}) \mathbf{c}$

Define the scalar triple product $[\mathbf{a}, \mathbf{b}, \mathbf{c}]$ of three vectors $\mathbf{a}, \mathbf{b}$, and $\mathbf{c}$ in $\mathbb{R}^{3}$ in terms of the dot and cross product. Show that

$[\mathbf{a} \times \mathbf{b}, \mathbf{b} \times \mathbf{c}, \mathbf{c} \times \mathbf{a}]=[\mathbf{a}, \mathbf{b}, \mathbf{c}]^{2}$

Given a basis $\mathbf{e}_{1}, \mathbf{e}_{2}, \mathbf{e}_{3}$ for $\mathbb{R}^{3}$ which is not necessarily orthonormal, let

$\mathbf{e}_{1}^{\prime}=\frac{\mathbf{e}_{2} \times \mathbf{e}_{3}}{\left[\mathbf{e}_{1}, \mathbf{e}_{2}, \mathbf{e}_{3}\right]}, \quad \mathbf{e}_{2}^{\prime}=\frac{\mathbf{e}_{3} \times \mathbf{e}_{1}}{\left[\mathbf{e}_{1}, \mathbf{e}_{2}, \mathbf{e}_{3}\right]}, \quad \mathbf{e}_{3}^{\prime}=\frac{\mathbf{e}_{1} \times \mathbf{e}_{2}}{\left[\mathbf{e}_{1}, \mathbf{e}_{2}, \mathbf{e}_{3}\right]}$

Show that $\mathbf{e}_{1}^{\prime}, \mathbf{e}_{2}^{\prime}, \mathbf{e}_{3}^{\prime}$ is also a basis for $\mathbb{R}^{3}$. [You may assume that three linearly independent vectors in $\mathbb{R}^{3}$ form a basis.]

The vectors $\mathbf{e}_{1}^{\prime \prime}, \mathbf{e}_{2}^{\prime \prime}, \mathbf{e}_{3}^{\prime \prime}$ are constructed from $\mathbf{e}_{1}^{\prime}, \mathbf{e}_{2}^{\prime}, \mathbf{e}_{3}^{\prime}$ in the same way that $\mathbf{e}_{1}^{\prime}, \mathbf{e}_{2}^{\prime}$, $\mathbf{e}_{3}^{\prime}$ are constructed from $\mathbf{e}_{1}, \mathbf{e}_{2}, \mathbf{e}_{3}$. Show that

$\mathbf{e}_{1}^{\prime \prime}=\mathbf{e}_{1}, \quad \mathbf{e}_{2}^{\prime \prime}=\mathbf{e}_{2}, \quad \mathbf{e}_{3}^{\prime \prime}=\mathbf{e}_{3}$

An infinite lattice consists of all points with position vectors given by

$\mathbf{R}=n_{1} \mathbf{e}_{1}+n_{2} \mathbf{e}_{2}+n_{3} \mathbf{e}_{3} \text { with } n_{1}, n_{2}, n_{3} \in \mathbb{Z}$

Find all points with position vectors $\mathbf{K}$ such that $\mathbf{K} \cdot \mathbf{R}$ is an integer for all integers $n_{1}$, $n_{2}, n_{3}$.
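The reciprocal-basis construction above is easy to verify numerically for a random (non-orthonormal) basis. A sketch (function and variable names mine): it checks the double-prime identity $\mathbf{e}_i''=\mathbf{e}_i$ and the duality relation $\mathbf{e}_i' \cdot \mathbf{e}_j = \delta_{ij}$ that underlies the lattice part.

```python
import numpy as np

rng = np.random.default_rng(3)
e1, e2, e3 = rng.standard_normal((3, 3))   # a generic, non-orthonormal basis

def reciprocal(a, b, c):
    V = np.dot(a, np.cross(b, c))          # the scalar triple product [a, b, c]
    return np.cross(b, c) / V, np.cross(c, a) / V, np.cross(a, b) / V

f1, f2, f3 = reciprocal(e1, e2, e3)        # e_1', e_2', e_3'
g1, g2, g3 = reciprocal(f1, f2, f3)        # e_1'', e_2'', e_3''

print(np.allclose(g1, e1), np.allclose(g2, e2), np.allclose(g3, e3))
print(np.isclose(f1 @ e1, 1.0), np.isclose(f1 @ e2, 0.0))  # e_i' . e_j = delta_ij
```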


• # Paper 1, Section I, C

Given a non-zero complex number $z=x+i y$, where $x$ and $y$ are real, find expressions for the real and imaginary parts of the following functions of $z$ in terms of $x$ and $y$ :

(i) $e^{z}$,

(ii) $\sin z$,

(iii) $\frac{1}{z}-\frac{1}{\bar{z}}$,

(iv) $z^{3}-z^{2} \bar{z}-z \bar{z}^{2}+\bar{z}^{3}$,

where $\bar{z}$ is the complex conjugate of $z$.

Now assume $x>0$ and find expressions for the real and imaginary parts of all solutions to

(v) $w=\log z$.

• # Paper 1, Section II, 6A

What does it mean to say an $n \times n$ matrix is Hermitian?

What does it mean to say an $n \times n$ matrix is unitary?

Show that the eigenvalues of a Hermitian matrix are real and that eigenvectors corresponding to distinct eigenvalues are orthogonal.

Suppose that $A$ is an $n \times n$ Hermitian matrix with $n$ distinct eigenvalues $\lambda_{1}, \ldots, \lambda_{n}$ and corresponding normalised eigenvectors $\mathbf{u}_{1}, \ldots, \mathbf{u}_{n}$. Let $U$ denote the matrix whose columns are $\mathbf{u}_{1}, \ldots, \mathbf{u}_{n}$. Show directly that $U$ is unitary and $U D U^{\dagger}=A$, where $D$ is a diagonal matrix you should specify.

If $U$ is unitary and $D$ diagonal, must it be the case that $U D U^{\dagger}$ is Hermitian? Give a proof or counterexample.

Find a unitary matrix $U$ and a diagonal matrix $D$ such that

$U D U^{\dagger}=\left(\begin{array}{ccc} 2 & 0 & 3 i \\ 0 & 2 & 0 \\ -3 i & 0 & 2 \end{array}\right)$
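The final part can be checked numerically (a sketch using numpy's Hermitian eigensolver; not the hand construction the question asks for):

```python
import numpy as np

A = np.array([[2, 0, 3j], [0, 2, 0], [-3j, 0, 2]])
w, U = np.linalg.eigh(A)          # for Hermitian input, U is unitary
D = np.diag(w)

print(w)                                         # eigenvalues -1, 2, 5
print(np.allclose(U @ D @ U.conj().T, A))        # U D U^dagger = A
print(np.allclose(U.conj().T @ U, np.eye(3)))    # U is unitary
```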

• # Paper 1, Section II, C

(a) Let $A, B$, and $C$ be three distinct points in the plane $\mathbb{R}^{2}$ which are not collinear, and let $\mathbf{a}, \mathbf{b}$, and $\mathbf{c}$ be their position vectors.

Show that the set $L_{A B}$ of points in $\mathbb{R}^{2}$ equidistant from $A$ and $B$ is given by an equation of the form

$\mathbf{n}_{A B} \cdot \mathbf{x}=p_{A B},$

where $\mathbf{n}_{A B}$ is a unit vector and $p_{A B}$ is a scalar, to be determined. Show that $L_{A B}$ is perpendicular to $\overrightarrow{A B}$.

Show that if $\mathbf{x}$ satisfies

$\mathbf{n}_{A B} \cdot \mathbf{x}=p_{A B} \quad \text { and } \quad \mathbf{n}_{B C} \cdot \mathbf{x}=p_{B C}$

then

$\mathbf{n}_{C A} \cdot \mathbf{x}=p_{C A} .$

How do you interpret this result geometrically?

(b) Let $\mathbf{a}$ and $\mathbf{u}$ be constant vectors in $\mathbb{R}^{3}$. Explain why the vectors $\mathbf{x}$ satisfying

$\mathbf{x} \times \mathbf{u}=\mathbf{a} \times \mathbf{u}$

describe a line in $\mathbb{R}^{3}$. Find an expression for the shortest distance between two lines $\mathbf{x} \times \mathbf{u}_{k}=\mathbf{a}_{k} \times \mathbf{u}_{k}$, where $k=1,2$.


• # Paper 1, Section I, 1C

(a) If

$x+i y=\sum_{a=0}^{200} i^{a}+\prod_{b=1}^{50} i^{b}$

where $x, y \in \mathbb{R}$, what is the value of $x y$ ?

(b) Evaluate

$\frac{(1+i)^{2019}}{(1-i)^{2017}}$

(c) Find a complex number $z$ such that

$i^{i^{z}}=2$

(d) Interpret geometrically the curve defined by the set of points satisfying

$\log z=i \log \bar{z}$

in the complex $z$-plane.
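Parts (a) and (b) can be verified by direct computation (a sketch; the variable names are mine, and part (b) is rearranged as $\bigl(\tfrac{1+i}{1-i}\bigr)^{2017}(1+i)^{2}$ to avoid huge intermediate powers):

```python
import numpy as np

# part (a): powers of i cycle with period 4
s = sum(1j**a for a in range(201)) + np.prod([1j**b for b in range(1, 51)])
x, y = s.real, s.imag             # s = 1 - i, so xy = -1

# part (b): (1+i)/(1-i) = i, so the quotient is i^2017 * (1+i)^2 = 2 i^2018 = -2
q = ((1 + 1j) / (1 - 1j))**2017 * (1 + 1j)**2
print(x * y, q)
```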

• # Paper 1, Section I, A

If $A$ is an $n$ by $n$ matrix, define its determinant $\operatorname{det} A$.

Find the following in terms of $\operatorname{det} A$ and a scalar $\lambda$, clearly showing your argument:

(i) $\operatorname{det} B$, where $B$ is obtained from $A$ by multiplying one row by $\lambda$.

(ii) $\operatorname{det}(\lambda A)$.

(iii) $\operatorname{det} C$, where $C$ is obtained from $A$ by switching row $k$ and row $l(k \neq l)$.

(iv) $\operatorname{det} D$, where $D$ is obtained from $A$ by adding $\lambda$ times column $l$ to column $k$ $(k \neq l)$.

• # Paper 1, Section II, 6B

Let $\mathbf{e}_{1}, \mathbf{e}_{2}, \mathbf{e}_{3}$ be the standard basis vectors of $\mathbb{R}^{3}$. A second set of vectors $\mathbf{f}_{1}, \mathbf{f}_{2}, \mathbf{f}_{3}$ are defined with respect to the standard basis by

$\mathbf{f}_{j}=\sum_{i=1}^{3} P_{i j} \mathbf{e}_{i}, \quad j=1,2,3$

The $P_{i j}$ are the elements of the $3 \times 3$ matrix $P$. State the condition on $P$ under which the set $\left\{\mathbf{f}_{1}, \mathbf{f}_{2}, \mathbf{f}_{3}\right\}$ forms a basis of $\mathbb{R}^{3}$.

Define the matrix $A$ that, for a given linear transformation $\alpha$, gives the relation between the components of any vector $\mathbf{v}$ and those of the corresponding $\alpha(\mathbf{v})$, with the components specified with respect to the standard basis.

Show that the relation between the matrix $A$ and the matrix $\tilde{A}$ of the same transformation with respect to the second basis $\left\{\mathbf{f}_{1}, \mathbf{f}_{2}, \mathbf{f}_{3}\right\}$ is

$\tilde{A}=P^{-1} A P$

Consider the matrix

$A=\left(\begin{array}{ccc} 2 & 6 & 2 \\ 0 & -1 & -1 \\ 0 & 6 & 4 \end{array}\right)$

Find a matrix $P$ such that $B=P^{-1} A P$ is diagonal. Give the elements of $B$ and demonstrate explicitly that the relation between $A$ and $B$ holds.

Give the elements of $A^{n} P$ for any positive integer $n$.
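The diagonalisation can be checked numerically (a sketch; numpy's `eig` chooses its own eigenvector scaling and ordering, so this verifies the relations rather than reproducing any particular hand-chosen $P$):

```python
import numpy as np

A = np.array([[2, 6, 2], [0, -1, -1], [0, 6, 4]], float)
w, P = np.linalg.eig(A)             # columns of P are eigenvectors
B = np.linalg.inv(P) @ A @ P

print(np.round(w, 10))              # eigenvalues 2 (twice) and 1
print(np.allclose(B, np.diag(w)))   # P^{-1} A P is diagonal

# A^n P = P diag(w)^n: each column of P is scaled by its eigenvalue^n
n = 5
print(np.allclose(np.linalg.matrix_power(A, n) @ P, P @ np.diag(w**n)))
```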

• # Paper 1, Section II, 7B

(a) Let $A$ be an $n \times n$ matrix. Define the characteristic polynomial $\chi_{A}(z)$ of $A$. [Choose a sign convention such that the coefficient of $z^{n}$ in the polynomial is equal to $(-1)^{n}$.] State and justify the relation between the characteristic polynomial and the eigenvalues of $A$. Why does $A$ have at least one eigenvalue?

(b) Assume that $A$ has $n$ distinct eigenvalues. Show that $\chi_{A}(A)=0$. [Each term $c_{r} z^{r}$ in $\chi_{A}(z)$ corresponds to a term $c_{r} A^{r}$ in $\chi_{A}(A)$.]

(c) For a general $n \times n$ matrix $B$ and integer $m \geqslant 1$, show that $\chi_{B^{m}}\left(z^{m}\right)=\prod_{l=1}^{m} \chi_{B}\left(\omega_{l} z\right)$, where $\omega_{l}=e^{2 \pi i l / m}$ $(l=1, \ldots, m)$. [Hint: You may find it helpful to note the factorization of $z^{m}-1$.]

Prove that if $B^{m}$ has an eigenvalue $\lambda$ then $B$ has an eigenvalue $\mu$ where $\mu^{m}=\lambda$.

• # Paper 1, Section II, A

The exponential of a square matrix $M$ is defined as

$\exp M=I+\sum_{n=1}^{\infty} \frac{M^{n}}{n !}$

where $I$ is the identity matrix. [You do not have to consider issues of convergence.]

(a) Calculate the elements of $R$ and $S$, where

$R=\exp \left(\begin{array}{cc} 0 & -\theta \\ \theta & 0 \end{array}\right), \quad S=\exp \left(\begin{array}{ll} 0 & \theta \\ \theta & 0 \end{array}\right)$

and $\theta$ is a real number.

(b) Show that $R R^{T}=I$ and that

$S J S=J, \quad \text { where } \quad J=\left(\begin{array}{cc} 1 & 0 \\ 0 & -1 \end{array}\right)$
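Parts (a) and (b) can be checked numerically by summing the defining series (a sketch; the helper `expm` and the value of $\theta$ are mine):

```python
import numpy as np

def expm(M, terms=60):
    # matrix exponential via its defining series (fine for these small matrices)
    out, term = np.eye(len(M)), np.eye(len(M))
    for n in range(1, terms):
        term = term @ M / n
        out = out + term
    return out

t = 0.7                             # a sample value of theta
R = expm(np.array([[0.0, -t], [t, 0.0]]))
S = expm(np.array([[0.0, t], [t, 0.0]]))
J = np.diag([1.0, -1.0])

print(np.allclose(R, [[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]]))
print(np.allclose(S, [[np.cosh(t), np.sinh(t)], [np.sinh(t), np.cosh(t)]]))
print(np.allclose(R @ R.T, np.eye(2)), np.allclose(S @ J @ S, J))
```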

(c) Consider the matrices

$A=\left(\begin{array}{ccc} 0 & 0 & 0 \\ 0 & 0 & -1 / 2 \\ 0 & 1 / 2 & 0 \end{array}\right), \quad B=\left(\begin{array}{lll} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 0 \end{array}\right)$

Calculate:

(i) $\exp (x A)$,

(ii) $\exp (x B)$.

(d) Defining

$C=\left(\begin{array}{ccc} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \end{array}\right)$

find the elements of the following matrices, where $N$ is a natural number:

(i)

$\sum_{n=1}^{N}\left(\exp (x A) C[\exp (x A)]^{T}\right)^{n}$

(ii)

$\sum_{n=1}^{N}(\exp (x B) C \exp (x B))^{n}$

[Your answers to parts $(a),(c)$ and $(d)$ should be in closed form, i.e. not given as series.]

• # Paper 1, Section II, C

(a) Use index notation to prove $\mathbf{a} \times(\mathbf{b} \times \mathbf{c})=(\mathbf{a} \cdot \mathbf{c}) \mathbf{b}-(\mathbf{a} \cdot \mathbf{b}) \mathbf{c}$.

Hence simplify

(i) $(\mathbf{a} \times \mathbf{b}) \cdot(\mathbf{c} \times \mathbf{d})$,

(ii) $(\mathbf{a} \times \mathbf{b}) \cdot[(\mathbf{b} \times \mathbf{c}) \times(\mathbf{c} \times \mathbf{a})]$.

(b) Give the general solution for $\mathbf{x}$ and $\mathbf{y}$ of the simultaneous equations

$\mathbf{x}+\mathbf{y}=2 \mathbf{a}, \quad \mathbf{x} \cdot \mathbf{y}=c \quad(c<\mathbf{a} \cdot \mathbf{a})$

Show in particular that $\mathbf{x}$ and $\mathbf{y}$ must lie at opposite ends of a diameter of a sphere whose centre and radius should be found.

(c) If two pairs of opposite edges of a tetrahedron are perpendicular, show that the third pair are also perpendicular to each other. Show also that the sum of the lengths squared of two opposite edges is the same for each pair.


• # Paper 1, Section I, A

The map $\boldsymbol{\Phi}(\mathbf{x})=\alpha(\mathbf{n} \cdot \mathbf{x}) \mathbf{n}-\mathbf{n} \times(\mathbf{n} \times \mathbf{x})$ is defined for $\mathbf{x} \in \mathbb{R}^{3}$, where $\mathbf{n}$ is a unit vector in $\mathbb{R}^{3}$ and $\alpha$ is a real constant.

(i) Find the values of $\alpha$ for which the inverse map $\Phi^{-1}$ exists, as well as the inverse map itself in these cases.

(ii) When $\boldsymbol{\Phi}$ is not invertible, find its image and kernel. What is the value of the rank and the value of the nullity of $\Phi$ ?

(iii) Let $\mathbf{y}=\mathbf{\Phi}(\mathbf{x})$. Find the components $A_{i j}$ of the matrix $A$ such that $y_{i}=A_{i j} x_{j}$. When $\Phi$ is invertible, find the components of the matrix $B$ such that $x_{i}=B_{i j} y_{j}$.

• # Paper 1, Section I, C

For $z, w \in \mathbb{C}$ define the principal value of $z^{w}$. State de Moivre's theorem.

Hence solve the equations (i) $z^{6}=\sqrt{3}+i$, (ii) $z^{1 / 6}=\sqrt{3}+i$, (iii) $i^{z}=\sqrt{3}+i$, (iv) $\left(e^{5 i \pi / 2}\right)^{z}=\sqrt{3}+i$.

[In each expression, the principal value is to be taken.]

• # Paper 1, Section II, 5V

Let $\mathbf{x}, \mathbf{y} \in \mathbb{R}^{n}$ be non-zero real vectors. Define the inner product $\mathbf{x} \cdot \mathbf{y}$ in terms of the components $x_{i}$ and $y_{i}$, and define the norm $|\mathbf{x}|$. Prove that $\mathbf{x} \cdot \mathbf{y} \leqslant|\mathbf{x}||\mathbf{y}|$. When does equality hold? Express the angle between $\mathbf{x}$ and $\mathbf{y}$ in terms of their inner product.

Use suffix notation to expand $(\mathbf{a} \times \mathbf{b}) \cdot(\mathbf{b} \times \mathbf{c})$.

Let $\mathbf{a}, \mathbf{b}, \mathbf{c}$ be given unit vectors in $\mathbb{R}^{3}$, and let $\mathbf{m}=(\mathbf{a} \times \mathbf{b})+(\mathbf{b} \times \mathbf{c})+(\mathbf{c} \times \mathbf{a})$. Obtain expressions for the angle between $\mathbf{m}$ and each of $\mathbf{a}, \mathbf{b}$ and $\mathbf{c}$, in terms of $\mathbf{a}, \mathbf{b}, \mathbf{c}$ and $|\mathbf{m}|$. Calculate $|\mathbf{m}|$ for the particular case when the angles between $\mathbf{a}, \mathbf{b}$ and $\mathbf{c}$ are all equal to $\theta$, and check your result for an example with $\theta=0$ and an example with $\theta=\pi / 2$.

Consider three planes in $\mathbb{R}^{3}$ passing through the points $\mathbf{p}, \mathbf{q}$ and $\mathbf{r}$, respectively, with unit normals $\mathbf{a}, \mathbf{b}$ and $\mathbf{c}$, respectively. State a condition that must be satisfied for the three planes to intersect at a single point, and find the intersection point.

• # Paper 1, Section II, A

What is the definition of an orthogonal matrix $M$ ?

Write down a $2 \times 2$ matrix $R$ representing the rotation of a 2-dimensional vector $(x, y)$ by an angle $\theta$ around the origin. Show that $R$ is indeed orthogonal.

Take a matrix

$A=\left(\begin{array}{ll} a & b \\ b & c \end{array}\right)$

where $a, b, c$ are real. Suppose that the $2 \times 2$ matrix $B=R A R^{T}$ is diagonal. Determine all possible values of $\theta$.

Show that the diagonal entries of $B$ are the eigenvalues of $A$ and express them in terms of the determinant and trace of $A$.

Using the above results, or otherwise, find the elements of the matrix

$\left(\begin{array}{ll} 1 & 2 \\ 2 & 1 \end{array}\right)^{2 N}$

as a function of $N$, where $N$ is a natural number.
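The closed form for the final part can be tested against exact integer matrix powers. A sketch (the eigenvalues of the matrix are $3$ and $-1$, so $A^{2N}$ has eigenvalues $9^{N}$ and $1$, giving $A^{2N}=\tfrac{1}{2}\begin{pmatrix}9^{N}+1 & 9^{N}-1\\ 9^{N}-1 & 9^{N}+1\end{pmatrix}$; the helper name is mine):

```python
import numpy as np

A = np.array([[1, 2], [2, 1]])

def closed_form(N):
    # eigenvectors (1,1) and (1,-1); eigenvalues of A^{2N} are 9^N and 1
    p, q = 9**N + 1, 9**N - 1
    return np.array([[p, q], [q, p]]) // 2

for N in (1, 2, 5, 9):
    print(np.array_equal(np.linalg.matrix_power(A, 2 * N), closed_form(N)))
```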

• # Paper 1, Section II, B

Let $A$ be a real symmetric $n \times n$ matrix.

(a) Prove the following:

(i) Each eigenvalue of $A$ is real and there is a corresponding real eigenvector.

(ii) Eigenvectors corresponding to different eigenvalues are orthogonal.

(iii) If there are $n$ distinct eigenvalues then the matrix is diagonalisable.

Assuming that $A$ has $n$ distinct eigenvalues, explain briefly how to choose (up to an arbitrary scalar factor) the vector $v$ such that $\frac{v^{T} A v}{v^{T} v}$ is maximised.

(b) A scalar $\lambda$ and a non-zero vector $v$ such that

$A v=\lambda B v$

are called, for a specified $n \times n$ matrix $B$, respectively a generalised eigenvalue and a generalised eigenvector of $A$.

Assume the matrix $B$ is real, symmetric and positive definite (i.e. $\left(u^{*}\right)^{T} B u>0$ for all non-zero complex vectors $u$ ).

Prove the following:

(i) If $\lambda$ is a generalised eigenvalue of $A$ then it is a root of $\operatorname{det}(A-\lambda B)=0$.

(ii) Each generalised eigenvalue of $A$ is real and there is a corresponding real generalised eigenvector.

(iii) Two generalised eigenvectors $u, v$, corresponding to different generalised eigenvalues, are orthogonal in the sense that $u^{T} B v=0$.

(c) Find, up to an arbitrary scalar factor, the vector $v$ such that the value of $F(v)=\frac{v^{T} A v}{v^{T} B v}$ is maximised, and the corresponding value of $F(v)$, where

$A=\left(\begin{array}{ccc} 4 & 2 & 0 \\ 2 & 3 & 0 \\ 0 & 0 & 10 \end{array}\right) \quad \text { and } \quad B=\left(\begin{array}{ccc} 2 & 1 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 3 \end{array}\right)$

• # Paper 1, Section II, B

(a) Consider the matrix

$R=\left(\begin{array}{ccc} \cos \theta & -\sin \theta & 0 \\ \sin \theta & \cos \theta & 0 \\ 0 & 0 & 1 \end{array}\right)$

representing a rotation about the $z$-axis through an angle $\theta$.

Show that $R$ has three eigenvalues in $\mathbb{C}$, each with modulus 1, of which one is real and two are complex (in general), and give the relation of the real eigenvector and the two complex eigenvalues to the properties of the rotation.

Now consider the rotation composed of a rotation by angle $\pi / 2$ about the $z$-axis followed by a rotation by angle $\pi / 2$ about the $x$-axis. Determine the rotation axis $\mathbf{n}$ and the magnitude of the angle of rotation $\phi$.

(b) A surface in $\mathbb{R}^{3}$ is given by

$7 x^{2}+4 x y+3 y^{2}+2 x z+3 z^{2}=1 .$

By considering a suitable eigenvalue problem, show that the surface is an ellipsoid, find the lengths of its semi-axes and find the position of the two points on the surface that are closest to the origin.
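Part (b) reduces to the eigenvalue problem for the symmetric matrix of the quadratic form, which can be checked numerically (a sketch; `Q` is my name for that matrix):

```python
import numpy as np

# the quadratic form 7x^2 + 4xy + 3y^2 + 2xz + 3z^2 as x^T Q x
Q = np.array([[7, 2, 1], [2, 3, 0], [1, 0, 3]], float)

w, V = np.linalg.eigh(Q)
print(w)                       # eigenvalues 2, 3, 8: all positive -> ellipsoid
print(1 / np.sqrt(w))          # semi-axes 1/sqrt(2), 1/sqrt(3), 1/sqrt(8)

# closest points to the origin lie along the eigenvector of the largest eigenvalue
u = V[:, np.argmax(w)]
p = u / np.sqrt(w.max())       # +p and -p are the two closest points
print(p, p @ Q @ p)            # p lies on the surface: p^T Q p = 1
```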


• # Paper 1, Section I, A

Consider $z \in \mathbb{C}$ with $|z|=1$ and $\arg z=\theta$, where $\theta \in[0, \pi)$.

(a) Prove algebraically that the modulus of $1+z$ is $2 \cos \frac{1}{2} \theta$ and that the argument is $\frac{1}{2} \theta$. Obtain these results geometrically using the Argand diagram.

(b) Obtain corresponding results algebraically and geometrically for $1-z$.

• # Paper 1, Section I, C

Let $A$ and $B$ be real $n \times n$ matrices.

Show that $(A B)^{T}=B^{T} A^{T}$.

For any square matrix, the matrix exponential is defined by the series

$e^{A}=I+\sum_{k=1}^{\infty} \frac{A^{k}}{k !}$

Show that $\left(e^{A}\right)^{T}=e^{A^{T}}$. [You are not required to consider issues of convergence.]

Calculate, in terms of $A$ and $A^{T}$, the matrices $Q_{0}, Q_{1}$ and $Q_{2}$ in the series for the matrix product

$e^{t A} e^{t A^{T}}=\sum_{k=0}^{\infty} Q_{k} t^{k}, \quad \text { where } t \in \mathbb{R}$

Hence obtain a relation between $A$ and $A^{T}$ which necessarily holds if $e^{t A}$ is an orthogonal matrix.

• # Paper 1, Section II, 8C

(a) Given $\mathbf{y} \in \mathbb{R}^{3}$ consider the linear transformation $T$ which maps

$\mathbf{x} \mapsto T \mathbf{x}=\left(\mathbf{x} \cdot \mathbf{e}_{1}\right) \mathbf{e}_{1}+\mathbf{x} \times \mathbf{y}$

Express $T$ as a matrix with respect to the standard basis $\mathbf{e}_{1}, \mathbf{e}_{2}, \mathbf{e}_{3}$, and determine the rank and the dimension of the kernel of $T$ for the cases (i) $\mathbf{y}=c_{1} \mathbf{e}_{1}$, where $c_{1}$ is a fixed number, and (ii) $\mathbf{y} \cdot \mathbf{e}_{1}=0$.

(b) Given that the equation

$A B \mathbf{x}=\mathbf{d}$

where

$A=\left(\begin{array}{ccc} 1 & 1 & 0 \\ 0 & 2 & 3 \\ 0 & 1 & 2 \end{array}\right), \quad B=\left(\begin{array}{ccc} 1 & 4 & 1 \\ -3 & -2 & 1 \\ 1 & -1 & -1 \end{array}\right) \quad \text { and } \quad \mathbf{d}=\left(\begin{array}{l} 1 \\ 1 \\ k \end{array}\right)$

has a solution, show that $4 k=1$.

• # Paper 1, Section II, A

(a) Define the vector product $\mathbf{x} \times \mathbf{y}$ of the vectors $\mathbf{x}$ and $\mathbf{y}$ in $\mathbb{R}^{3}$. Use suffix notation to prove that

$\mathbf{x} \times(\mathbf{x} \times \mathbf{y})=\mathbf{x}(\mathbf{x} \cdot \mathbf{y})-\mathbf{y}(\mathbf{x} \cdot \mathbf{x})$

(b) The vectors $\mathbf{x}_{n+1}(n=0,1,2, \ldots)$ are defined by $\mathbf{x}_{n+1}=\lambda \mathbf{a} \times \mathbf{x}_{n}$, where $\mathbf{a}$ and $\mathbf{x}_{0}$ are fixed vectors with $|\mathbf{a}|=1$ and $\mathbf{a} \times \mathbf{x}_{0} \neq \mathbf{0}$, and $\lambda$ is a positive constant.

(i) Write $\mathbf{x}_{2}$ as a linear combination of $\mathbf{a}$ and $\mathbf{x}_{0}$. Further, for $n \geqslant 1$, express $\mathbf{x}_{n+2}$ in terms of $\lambda$ and $\mathbf{x}_{n}$. Show, for $n \geqslant 1$, that $\left|\mathbf{x}_{n}\right|=\lambda^{n}\left|\mathbf{a} \times \mathbf{x}_{0}\right|$.

(ii) Let $X_{n}$ be the point with position vector $\mathbf{x}_{n}(n=0,1,2, \ldots)$. Show that $X_{1}, X_{2}, \ldots$ lie on a pair of straight lines.

(iii) Show that the line segment $X_{n} X_{n+1}(n \geqslant 1)$ is perpendicular to $X_{n+1} X_{n+2}$. Deduce that $X_{n} X_{n+1}$ is parallel to $X_{n+2} X_{n+3}$.

Show that $\mathbf{x}_{n} \rightarrow \mathbf{0}$ as $n \rightarrow \infty$ if $\lambda<1$, and give a sketch to illustrate the case $\lambda=1$.

(iv) The straight line through the points $X_{n+1}$ and $X_{n+2}$ makes an angle $\theta$ with the straight line through the points $X_{n}$ and $X_{n+3}$. Find $\cos \theta$ in terms of $\lambda$.

• # Paper 1, Section II, B

(a) Show that a square matrix $A$ is anti-symmetric if and only if $\mathbf{x}^{T} A \mathbf{x}=0$ for every vector $\mathbf{x}$.

(b) Let $A$ be a real anti-symmetric $n \times n$ matrix. Show that the eigenvalues of $A$ are imaginary or zero, and that the eigenvectors corresponding to distinct eigenvalues are orthogonal (in the sense that $\mathbf{x}^{\dagger} \mathbf{y}=0$, where the dagger denotes the hermitian conjugate).

(c) Let $A$ be a non-zero real $3 \times 3$ anti-symmetric matrix. Show that there is a real non-zero vector $\mathbf{a}$ such that $A \mathbf{a}=\mathbf{0}$.

Now let $\mathbf{b}$ be a real vector orthogonal to $\mathbf{a}$. Show that $A^{2} \mathbf{b}=-\theta^{2} \mathbf{b}$ for some real number $\theta$.

The matrix $e^{A}$ is defined by the exponential series $I+A+\frac{1}{2!} A^{2}+\cdots$. Express $e^{A} \mathbf{a}$ and $e^{A} \mathbf{b}$ in terms of $\mathbf{a}, \mathbf{b}, A \mathbf{b}$ and $\theta$.

[You are not required to consider issues of convergence.]

• # Paper 1, Section II, B

(a) Show that the eigenvalues of any real $n \times n$ square matrix $A$ are the same as the eigenvalues of $A^{T}$.

The eigenvalues of $A$ are $\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}$ and the eigenvalues of $A^{T} A$ are $\mu_{1}, \mu_{2}, \ldots, \mu_{n}$. Determine, by means of a proof or a counterexample, whether the following are necessarily valid: (i) $\sum_{i=1}^{n} \mu_{i}=\sum_{i=1}^{n} \lambda_{i}^{2}$; (ii) $\prod_{i=1}^{n} \mu_{i}=\prod_{i=1}^{n} \lambda_{i}^{2}$.

(b) The $3 \times 3$ matrix $B$ is given by

$B=I+\mathbf{m n}^{T}$

where $\mathbf{m}$ and $\mathbf{n}$ are orthogonal real unit vectors and $I$ is the $3 \times 3$ identity matrix.

(i) Show that $\mathbf{m} \times \mathbf{n}$ is an eigenvector of $B$, and write down a linearly independent eigenvector. Find the eigenvalues of $B$ and determine whether $B$ is diagonalisable.

(ii) Find the eigenvectors and eigenvalues of $B^{T} B$.


• # Paper 1, Section I, A

Let $z \in \mathbb{C}$ be a solution of

$z^{2}+b z+1=0$

where $b \in \mathbb{R}$ and $|b| \leqslant 2$. For which values of $b$ do the following hold?

(i) $\left|e^{z}\right|<1$.

(ii) $\left|e^{i z}\right|=1$.

(iii) $\operatorname{Im}(\cosh z)=0$.

• # Paper 1, Section I, C

Write down the general form of a $2 \times 2$ rotation matrix. Let $R$ be a real $2 \times 2$ matrix with positive determinant such that $|R \mathbf{x}|=|\mathbf{x}|$ for all $\mathbf{x} \in \mathbb{R}^{2}$. Show that $R$ is a rotation matrix.

Let

$J=\left(\begin{array}{cc} 0 & -1 \\ 1 & 0 \end{array}\right)$

Show that any real $2 \times 2$ matrix $A$ which satisfies $A J=J A$ can be written as $A=\lambda R$, where $\lambda$ is a real number and $R$ is a rotation matrix.

• # Paper 1, Section II, 8C

(a) Show that the equations

$\begin{array}{r} 1+s+t=a \\ 1-s+t=b \\ 1-2 t=c \end{array}$

determine $s$ and $t$ uniquely if and only if $a+b+c=3$.

Write the following system of equations

\begin{aligned} &5 x+2 y-z=1+s+t \\ &2 x+5 y-z=1-s+t \\ &-x-y+8 z=1-2 t \end{aligned}

in matrix form $A \mathbf{x}=\mathbf{b}$. Use Gaussian elimination to solve the system for $x, y$, and $z$. State a relationship between the rank and the kernel of a matrix. What is the rank and what is the kernel of $A$ ?

For which values of $x, y$, and $z$ is it possible to solve the above system for $s$ and $t$ ?

(b) Define a unitary $n \times n$ matrix. Let $A$ be a real symmetric $n \times n$ matrix, and let $I$ be the $n \times n$ identity matrix. Show that $|(A+i I) \mathbf{x}|^{2}=|A \mathbf{x}|^{2}+|\mathbf{x}|^{2}$ for arbitrary $\mathbf{x} \in \mathbb{C}^{n}$, where $|\mathbf{x}|^{2}=\sum_{j=1}^{n}\left|x_{j}\right|^{2}$. Find a similar expression for $|(A-i I) \mathbf{x}|^{2}$. Prove that $(A-i I)(A+i I)^{-1}$ is well-defined and is a unitary matrix.
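The identities in part (b) can be checked numerically for a random real symmetric matrix (a sketch; the names and the size $n=4$ are mine):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))
A = A + A.T                       # real symmetric
I = np.eye(4)

x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
lhs = np.linalg.norm((A + 1j * I) @ x)**2
print(np.isclose(lhs, np.linalg.norm(A @ x)**2 + np.linalg.norm(x)**2))

U = (A - 1j * I) @ np.linalg.inv(A + 1j * I)     # the matrix in question
print(np.allclose(U.conj().T @ U, np.eye(4)))    # unitary
```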

• # Paper 1, Section II, 6B

The $n \times n$ real symmetric matrix $M$ has eigenvectors of unit length $\mathbf{e}_{1}, \mathbf{e}_{2}, \ldots, \mathbf{e}_{n}$, with corresponding eigenvalues $\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}$, where $\lambda_{1}>\lambda_{2}>\cdots>\lambda_{n}$. Prove that the eigenvalues are real and that $\mathbf{e}_{a} \cdot \mathbf{e}_{b}=\delta_{a b}$.

Let $\mathbf{x}$ be any (real) unit vector. Show that

$\mathbf{x}^{\mathrm{T}} M \mathbf{x} \leqslant \lambda_{1}$

What can be said about $\mathbf{x}$ if $\mathbf{x}^{\mathrm{T}} M \mathbf{x}=\lambda_{1}$?

Let $S$ be the set of all (real) unit vectors of the form

$\mathbf{x}=\left(0, x_{2}, \ldots, x_{n}\right)$

Show that $\alpha_{1} \mathbf{e}_{1}+\alpha_{2} \mathbf{e}_{2} \in S$ for some $\alpha_{1}, \alpha_{2} \in \mathbb{R}$. Deduce that

$\underset{\mathbf{x} \in S}{\operatorname{Max}} \mathbf{x}^{\mathrm{T}} M \mathbf{x} \geqslant \lambda_{2}$

The $(n-1) \times(n-1)$ matrix $A$ is obtained by removing the first row and the first column of $M$. Let $\mu$ be the greatest eigenvalue of $A$. Show that

$\lambda_{1} \geqslant \mu \geqslant \lambda_{2}$

• # Paper 1, Section II, A

(a) Use suffix notation to prove that

$\mathbf{a} \cdot(\mathbf{b} \times \mathbf{c})=\mathbf{c} \cdot(\mathbf{a} \times \mathbf{b})$

(b) Show that the equation of the plane through three non-collinear points with position vectors $\mathbf{a}, \mathbf{b}$ and $\mathbf{c}$ is

$\mathbf{r} \cdot(\mathbf{a} \times \mathbf{b}+\mathbf{b} \times \mathbf{c}+\mathbf{c} \times \mathbf{a})=\mathbf{a} \cdot(\mathbf{b} \times \mathbf{c})$

where $\mathbf{r}$ is the position vector of a point in this plane.

Find a unit vector normal to the plane in the case $\mathbf{a}=(2,0,1), \mathbf{b}=(0,4,0)$ and $\mathbf{c}=(1,-1,2)$.

(c) Let $\mathbf{r}$ be the position vector of a point in a given plane. The plane is a distance $d$ from the origin and has unit normal vector $\mathbf{n}$, where $\mathbf{n} \cdot \mathbf{r} \geqslant 0$. Write down the equation of this plane.

This plane intersects the sphere with centre at $\mathbf{p}$ and radius $q$ in a circle with centre at $\mathbf{m}$ and radius $\rho$. Show that

$\mathbf{m}-\mathbf{p}=\gamma \mathbf{n}$

Find $\gamma$ in terms of $q$ and $\rho$. Hence find $\rho$ in terms of $\mathbf{n}, d, \mathbf{p}$ and $q$.

• # Paper 1, Section II, B

What does it mean to say that a matrix can be diagonalised? Given that the $n \times n$ real matrix $M$ has $n$ eigenvectors satisfying $\mathbf{e}_{a} \cdot \mathbf{e}_{b}=\delta_{a b}$, explain how to obtain the diagonal form $\Lambda$ of $M$. Prove that $\Lambda$ is indeed diagonal. Obtain, with proof, an expression for the trace of $M$ in terms of its eigenvalues.

The elements of $M$ are given by

$M_{i j}= \begin{cases}0 & \text { for } i=j \\ 1 & \text { for } i \neq j\end{cases}$

Determine the elements of $M^{2}$ and hence show that, if $\lambda$ is an eigenvalue of $M$, then

$\lambda^{2}=(n-1)+(n-2) \lambda$

Assuming that $M$ can be diagonalised, give its diagonal form.
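A numerical check (illustrative, for one value of $n$): since $M=J-I$ with $J$ the all-ones matrix, $M^{2}=(n-2) M+(n-1) I$, which gives the quadratic above; its roots $n-1$ and $-1$ should appear as the eigenvalues, with the multiplicities fixed by the trace.

```python
import numpy as np

n = 6
M = np.ones((n, n)) - np.eye(n)        # M_ij = 0 on the diagonal, 1 off it

lam = np.sort(np.linalg.eigvalsh(M))
# lambda**2 = (n-1) + (n-2)*lambda has roots n-1 and -1
assert np.allclose(lam[:-1], -1.0)     # eigenvalue -1, multiplicity n-1
assert np.isclose(lam[-1], n - 1)      # eigenvalue n-1, multiplicity 1
assert np.isclose(np.trace(M), lam.sum())
```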

comment

• # Paper 1, Section I, $2 \mathrm{C}$

Precisely one of the four matrices specified below is not orthogonal. Which is it? Give a brief justification.

$\frac{1}{\sqrt{6}}\left(\begin{array}{rcc} 1 & -\sqrt{3} & \sqrt{2} \\ 1 & \sqrt{3} & \sqrt{2} \\ -2 & 0 & \sqrt{2} \end{array}\right) \quad \frac{1}{3}\left(\begin{array}{ccc} 1 & 2 & -2 \\ 2 & -2 & -1 \\ 2 & 1 & 2 \end{array}\right) \quad \frac{1}{\sqrt{6}}\left(\begin{array}{rrr} 1 & -2 & 1 \\ -\sqrt{6} & 0 & \sqrt{6} \\ 1 & 1 & 1 \end{array}\right) \quad \frac{1}{9}\left(\begin{array}{rrr} 7 & -4 & -4 \\ -4 & 1 & -8 \\ -4 & -8 & 1 \end{array}\right)$

Given that the four matrices represent transformations of $\mathbb{R}^{3}$ corresponding (in no particular order) to a rotation, a reflection, a combination of a rotation and a reflection, and none of these, identify each matrix. Explain your reasoning.

[Hint: For two of the matrices, $A$ and $B$ say, you may find it helpful to calculate $\operatorname{det}(A-I)$ and $\operatorname{det}(B-I)$, where $I$ is the identity matrix.]
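The orthogonality test (and the hint) can be carried out numerically with NumPy; this sketch only checks $Q^{\mathrm{T}} Q=I$ and computes determinants, leaving the geometric identification to the reader:

```python
import numpy as np

s2, s3, s6 = np.sqrt(2), np.sqrt(3), np.sqrt(6)
mats = [
    np.array([[1, -s3, s2], [1, s3, s2], [-2, 0, s2]]) / s6,
    np.array([[1, 2, -2], [2, -2, -1], [2, 1, 2]]) / 3,
    np.array([[1, -2, 1], [-s6, 0, s6], [1, 1, 1]]) / s6,
    np.array([[7, -4, -4], [-4, 1, -8], [-4, -8, 1]]) / 9,
]

orthogonal = [np.allclose(Q.T @ Q, np.eye(3)) for Q in mats]
assert orthogonal.count(False) == 1     # precisely one matrix fails Q^T Q = I

# for the orthogonal ones: det Q = +1 suggests a rotation, det Q = -1 a
# reflection or a rotation-reflection; det(Q - I) = 0 detects a fixed axis
dets = [round(np.linalg.det(Q)) for Q, ok in zip(mats, orthogonal) if ok]
```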

comment
• # Paper 1, Section I, B

(a) Describe geometrically the curve

$|\alpha z+\beta \bar{z}|=\sqrt{\alpha \beta}(z+\bar{z})+(\alpha-\beta)^{2},$

where $z \in \mathbb{C}$ and $\alpha, \beta$ are positive, distinct, real constants.

(b) Let $\theta$ be a real number not equal to an integer multiple of $2 \pi$. Show that

$\sum_{m=1}^{N} \sin (m \theta)=\frac{\sin \theta+\sin (N \theta)-\sin (N \theta+\theta)}{2(1-\cos \theta)}$

and derive a similar expression for $\sum_{m=1}^{N} \cos (m \theta)$.
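Both closed forms can be verified numerically before being derived; the cosine expression below is the analogous result one obtains from the real part of the same geometric sum (stated here as a claim to check, not a derivation):

```python
import numpy as np

theta, N = 0.7, 25                     # any theta not a multiple of 2*pi
denom = 2 * (1 - np.cos(theta))

lhs_s = sum(np.sin(m * theta) for m in range(1, N + 1))
rhs_s = (np.sin(theta) + np.sin(N * theta) - np.sin((N + 1) * theta)) / denom
assert np.isclose(lhs_s, rhs_s)

# analogous cosine sum, from the real part of sum of exp(i m theta)
lhs_c = sum(np.cos(m * theta) for m in range(1, N + 1))
rhs_c = (np.cos(theta) - 1 + np.cos(N * theta) - np.cos((N + 1) * theta)) / denom
assert np.isclose(lhs_c, rhs_c)
```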

comment
• # Paper 1, Section II, $6 \mathrm{C}$

(i) Consider the map from $\mathbb{R}^{4}$ to $\mathbb{R}^{3}$ represented by the matrix

$\left(\begin{array}{rrrr} \alpha & 1 & 1 & -1 \\ 2 & -\alpha & 0 & -2 \\ -\alpha & 2 & 1 & 1 \end{array}\right)$

where $\alpha \in \mathbb{R}$. Find the image and kernel of the map for each value of $\alpha$.

(ii) Show that any linear map $f: \mathbb{R}^{n} \rightarrow \mathbb{R}$ may be written in the form $f(\mathbf{x})=\mathbf{a} \cdot \mathbf{x}$ for some fixed vector $\mathbf{a} \in \mathbb{R}^{n}$. Show further that $\mathbf{a}$ is uniquely determined by $f$.

It is given that $n=4$ and that the vectors

$\left(\begin{array}{r} 1 \\ 1 \\ 1 \\ -1 \end{array}\right),\left(\begin{array}{r} 2 \\ -1 \\ 0 \\ -2 \end{array}\right),\left(\begin{array}{r} -1 \\ 2 \\ 1 \\ 1 \end{array}\right)$

lie in the kernel of $f$. Determine the set of possible values of $\mathbf{a}$.
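A numerical sketch for both parts (sampling a few values of $\alpha$ rather than doing the required general analysis): the image and kernel dimensions follow from the rank via rank-nullity, and the three given vectors are the rows of the $\alpha=1$ matrix, so any admissible $\mathbf{a}$ must be orthogonal to each of them.

```python
import numpy as np

def mat(alpha):
    return np.array([[alpha, 1, 1, -1],
                     [2, -alpha, 0, -2],
                     [-alpha, 2, 1, 1]], dtype=float)

# by rank-nullity, dim ker = 4 - rank and dim im = rank
ranks = {alpha: np.linalg.matrix_rank(mat(alpha)) for alpha in (-2, -1, 0, 1, 2)}

# possible a: vectors orthogonal to all three given vectors (rows of mat(1));
# a basis of that space from the SVD of mat(1)
_, s, Vt = np.linalg.svd(mat(1))
null_basis = Vt[np.linalg.matrix_rank(mat(1)):]
for a in null_basis:
    assert np.allclose(mat(1) @ a, 0, atol=1e-12)
```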

comment
• # Paper 1, Section II, 5B

(i) State and prove the Cauchy-Schwarz inequality for vectors in $\mathbb{R}^{n}$. Deduce the inequalities

$|\mathbf{a}+\mathbf{b}| \leqslant|\mathbf{a}|+|\mathbf{b}| \text { and }|\mathbf{a}+\mathbf{b}+\mathbf{c}| \leqslant|\mathbf{a}|+|\mathbf{b}|+|\mathbf{c}|$

for $\mathbf{a}, \mathbf{b}, \mathbf{c} \in \mathbb{R}^{n}$.

(ii) Show that every point on the intersection of the planes

$\mathbf{x} \cdot \mathbf{a}=A, \quad \mathbf{x} \cdot \mathbf{b}=B$

where $\mathbf{a} \neq \mathbf{b}$, satisfies

$|\mathbf{x}|^{2} \geqslant \frac{(A-B)^{2}}{|\mathbf{a}-\mathbf{b}|^{2}}$

What happens if $\mathbf{a}=\mathbf{b} ?$

(iii) Using your results from part (i), or otherwise, show that for any $\mathbf{x}_{1}, \mathbf{x}_{2}, \mathbf{y}_{1}, \mathbf{y}_{2} \in \mathbb{R}^{n}$,

$\left|\mathbf{x}_{1}-\mathbf{y}_{1}\right|-\left|\mathbf{x}_{1}-\mathbf{y}_{2}\right| \leqslant\left|\mathbf{x}_{2}-\mathbf{y}_{1}\right|+\left|\mathbf{x}_{2}-\mathbf{y}_{2}\right|$
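The inequalities in (i) and (iii) can be spot-checked on random vectors (both sides of (iii) bound $\left|\mathbf{y}_{1}-\mathbf{y}_{2}\right|$ via the triangle inequality, which is what the numerical check below exercises):

```python
import numpy as np

rng = np.random.default_rng(1)
x1, x2, y1, y2 = rng.standard_normal((4, 5))
norm = np.linalg.norm

# Cauchy-Schwarz and the triangle inequality on random vectors
assert abs(x1 @ x2) <= norm(x1) * norm(x2) + 1e-12
assert norm(x1 + x2) <= norm(x1) + norm(x2) + 1e-12

# part (iii): |x1-y1| - |x1-y2| <= |y1-y2| <= |x2-y1| + |x2-y2|
lhs = norm(x1 - y1) - norm(x1 - y2)
rhs = norm(x2 - y1) + norm(x2 - y2)
assert lhs <= norm(y1 - y2) + 1e-12 <= rhs + 2e-12
```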

comment
• # Paper 1, Section II, A

(a) A matrix is called normal if $A^{\dagger} A=A A^{\dagger}$. Let $A$ be a normal $n \times n$ complex matrix.

(i) Show that for any vector $\mathbf{x} \in \mathbb{C}^{n}$,

$|A \mathbf{x}|=\left|A^{\dagger} \mathbf{x}\right|$

(ii) Show that $A-\lambda I$ is also normal for any $\lambda \in \mathbb{C}$, where $I$ denotes the identity matrix.

(iii) Show that if $\mathbf{x}$ is an eigenvector of $A$ with respect to the eigenvalue $\lambda \in \mathbb{C}$, then $\mathbf{x}$ is also an eigenvector of $A^{\dagger}$, and determine the corresponding eigenvalue.

(iv) Show that if $\mathbf{x}_{\lambda}$ and $\mathbf{x}_{\mu}$ are eigenvectors of $A$ with respect to distinct eigenvalues $\lambda$ and $\mu$ respectively, then $\mathbf{x}_{\lambda}$ and $\mathbf{x}_{\mu}$ are orthogonal.

(v) Show that if $A$ has a basis of eigenvectors, then $A$ can be diagonalised using an orthonormal basis. Justify your answer.

[You may use standard results provided that they are clearly stated.]

(b) Show that any matrix $A$ satisfying $A^{\dagger}=A$ is normal, and deduce using results from (a) that its eigenvalues are real.

(c) Show that any matrix $A$ satisfying $A^{\dagger}=-A$ is normal, and deduce using results from (a) that its eigenvalues are purely imaginary.

(d) Show that any matrix $A$ satisfying $A^{\dagger}=A^{-1}$ is normal, and deduce using results from (a) that its eigenvalues have unit modulus.
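Parts (b)-(d) have a direct numerical illustration: build a Hermitian, a skew-Hermitian and a unitary matrix from a random complex matrix, confirm each is normal, and inspect the eigenvalues (real, purely imaginary, and of unit modulus respectively). The construction below is illustrative, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

H = X + X.conj().T                      # Hermitian:      H^dag =  H
S = X - X.conj().T                      # skew-Hermitian: S^dag = -S
U = np.linalg.qr(X)[0]                  # unitary:        U^dag = U^{-1}

for A in (H, S, U):                     # all three are normal
    assert np.allclose(A.conj().T @ A, A @ A.conj().T)

assert np.allclose(np.linalg.eigvals(H).imag, 0)      # real eigenvalues
assert np.allclose(np.linalg.eigvals(S).real, 0)      # purely imaginary
assert np.allclose(np.abs(np.linalg.eigvals(U)), 1)   # unit modulus
```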

comment
• # Paper 1, Section II, A

(i) Find the eigenvalues and eigenvectors of the following matrices and show that both are diagonalisable:

$A=\left(\begin{array}{rrr} 1 & 1 & -1 \\ -1 & 3 & -1 \\ -1 & 1 & 1 \end{array}\right), \quad B=\left(\begin{array}{rcr} 1 & 4 & -3 \\ -4 & 10 & -4 \\ -3 & 4 & 1 \end{array}\right)$

(ii) Show that, if two real $n \times n$ matrices can both be diagonalised using the same basis transformation, then they commute.

(iii) Suppose now that two real $n \times n$ matrices $C$ and $D$ commute and that $D$ has $n$ distinct eigenvalues. Show that for any eigenvector $\mathbf{x}$ of $D$ the vector $C \mathbf{x}$ is a scalar multiple of $\mathbf{x}$. Deduce that there exists a common basis transformation that diagonalises both matrices.

(iv) Show that $A$ and $B$ satisfy the conditions in (iii) and find a matrix $S$ such that both of the matrices $S^{-1} A S$ and $S^{-1} B S$ are diagonal.
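The conditions in (iii) and the simultaneous diagonalisation in (iv) can be verified numerically: $A$ and $B$ commute, $B$ has three distinct eigenvalues, and the eigenvector matrix of $B$ should diagonalise both (this check uses NumPy's eigendecomposition rather than the hand computation the question expects).

```python
import numpy as np

A = np.array([[1, 1, -1], [-1, 3, -1], [-1, 1, 1]], dtype=float)
B = np.array([[1, 4, -3], [-4, 10, -4], [-3, 4, 1]], dtype=float)

assert np.allclose(A @ B, B @ A)        # the matrices commute

w, S = np.linalg.eig(B)                 # columns of S: eigenvectors of B
assert len(set(np.round(w, 6))) == 3    # B has distinct eigenvalues

# the same S diagonalises A, as part (iii) predicts
D_A = np.linalg.inv(S) @ A @ S
D_B = np.linalg.inv(S) @ B @ S
assert np.allclose(D_A, np.diag(np.diag(D_A)), atol=1e-9)
assert np.allclose(D_B, np.diag(np.diag(D_B)), atol=1e-9)
```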

comment

• # Paper 1, Section I, 1B

(a) Let

$z=2+2 i$

(i) Compute $z^{4}$.

(ii) Find all complex numbers $w$ such that $w^{4}=z$.
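Part (a) is easy to sanity-check numerically: since $(2+2 i)^{2}=8 i$, we get $z^{4}=(8 i)^{2}=-64$, and the four fourth roots of $z$ are $|z|^{1 / 4} e^{i(\arg z+2 k \pi) / 4}$ for $k=0,1,2,3$.

```python
import numpy as np

z = 2 + 2j
assert np.isclose(z**4, -64)            # (2+2i)^2 = 8i, (8i)^2 = -64

# the four fourth roots of z
r, phi = abs(z), np.angle(z)
roots = [r**0.25 * np.exp(1j * (phi + 2 * k * np.pi) / 4) for k in range(4)]
assert all(np.isclose(w**4, z) for w in roots)
```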

(b) Find all the solutions of the equation

$e^{2 z^{2}}-1=0$

(c) Let $z=x+i y, \bar{z}=x-i y, x, y \in \mathbb{R}$