# Linear Algebra

Paper 1, Section I, 1E

Let $V$ be a vector space over $\mathbb{R}$ with $\operatorname{dim} V=n$, and let $\langle\,,\,\rangle$ be a non-degenerate anti-symmetric bilinear form on $V$.

Let $v \in V$, $v \neq 0$. Show that $v^{\perp}$ is of dimension $n-1$ and $v \in v^{\perp}$. Show that if $W \subseteq v^{\perp}$ is a subspace with $W \oplus \mathbb{R} v=v^{\perp}$, then the restriction of $\langle\,,\,\rangle$ to $W$ is non-degenerate.

Conclude that the dimension of $V$ is even.

Paper 1, Section II, E

Let $d \geqslant 1$, and let $J_{d}=\left(\begin{array}{ccccc}0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ & & \cdots & \cdots & \\ 0 & 0 & \cdots & 0 & 1 \\ 0 & 0 & \cdots & 0 & 0\end{array}\right) \in \operatorname{Mat}_{d}(\mathbb{C})$.

(a) (i) Compute $J_{d}^{n}$, for all $n \geqslant 0$.

(ii) Hence, or otherwise, compute $\left(\lambda I+J_{d}\right)^{n}$, for all $n \geqslant 0$.

(b) Let $V$ be a finite-dimensional vector space over $\mathbb{C}$, and let $\varphi \in \operatorname{End}(V)$. Suppose $\varphi^{n}=0$ for some $n>1$.

(i) Determine the possible eigenvalues of $\varphi$.

(ii) What are the possible Jordan blocks of $\varphi$?

(iii) Show that if $\varphi^{2}=0$, there exists a decomposition

$V=U \oplus W_{1} \oplus W_{2}$

where $\varphi(U)=\varphi\left(W_{1}\right)=0, \varphi\left(W_{2}\right)=W_{1}$, and $\operatorname{dim} W_{2}=\operatorname{dim} W_{1}$.

Paper 2, Section II, E

(a) Compute the characteristic polynomial and minimal polynomial of

$A=\left(\begin{array}{ccc} -2 & -6 & -9 \\ 3 & 7 & 9 \\ -1 & -2 & -2 \end{array}\right)$

Write down the Jordan normal form for $A$.

(b) Let $V$ be a finite-dimensional vector space over $\mathbb{C}, f: V \rightarrow V$ be a linear map, and for $\alpha \in \mathbb{C}, n \geqslant 1$, write

$W_{\alpha, n}:=\left\{v \in V \mid(f-\alpha I)^{n} v=0\right\}$

(i) Given $v \in W_{\alpha, n}, v \neq 0$, construct a non-zero eigenvector for $f$ in terms of $v$.

(ii) Show that if $w_{1}, \ldots, w_{d}$ are non-zero eigenvectors for $f$ with eigenvalues $\alpha_{1}, \ldots, \alpha_{d}$, and $\alpha_{i} \neq \alpha_{j}$ for all $i \neq j$, then $w_{1}, \ldots, w_{d}$ are linearly independent.

(iii) Show that if $v_{1} \in W_{\alpha_{1}, n}, \ldots, v_{d} \in W_{\alpha_{d}, n}$ are all non-zero, and $\alpha_{i} \neq \alpha_{j}$ for all $i \neq j$, then $v_{1}, \ldots, v_{d}$ are linearly independent.
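
The concrete computation in part (a) can be sanity-checked numerically. The snippet below is an added verification sketch, not part of the exam question; it assumes the answers $\chi_A(x)=(x-1)^3$, minimal polynomial $(x-1)^2$, and Jordan normal form $J_2(1) \oplus J_1(1)$.

```python
import numpy as np

# The matrix from part (a).
A = np.array([[-2., -6., -9.],
              [ 3.,  7.,  9.],
              [-1., -2., -2.]])

# Characteristic polynomial: x^3 - 3x^2 + 3x - 1 = (x - 1)^3.
char_coeffs = np.poly(A)

# N = A - I has rank 1 and squares to zero, so the minimal polynomial is
# (x - 1)^2 and the Jordan normal form is J_2(1) + J_1(1).
N = A - np.eye(3)
rank_N = np.linalg.matrix_rank(N)
N_squared_is_zero = np.allclose(N @ N, 0)
```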

Paper 3, Section II, 9E

(a) (i) State the rank-nullity theorem.

Let $U$ and $W$ be vector spaces. Write down the definition of their direct sum $U \oplus W$ and the inclusions $i: U \rightarrow U \oplus W, j: W \rightarrow U \oplus W$.

Now let $U$ and $W$ be subspaces of a vector space $V$. Define $l: U \cap W \rightarrow U \oplus W$ by $l(x)=ix-jx$.

Describe the quotient space $(U \oplus W) / \operatorname{Im}(l)$ as a subspace of $V$.

(ii) Let $V=\mathbb{R}^{5}$, and let $U$ be the subspace of $V$ spanned by the vectors

$\left(\begin{array}{c} 1 \\ 2 \\ -1 \\ 1 \\ 1 \end{array}\right),\left(\begin{array}{l} 1 \\ 0 \\ 0 \\ 1 \\ 0 \end{array}\right),\left(\begin{array}{c} -2 \\ 2 \\ 2 \\ 1 \\ -2 \end{array}\right)$

and $W$ the subspace of $V$ spanned by the vectors

$\left(\begin{array}{c} 3 \\ 2 \\ -3 \\ 1 \\ 3 \end{array}\right),\left(\begin{array}{l} 1 \\ 1 \\ 0 \\ 0 \\ 0 \end{array}\right),\left(\begin{array}{c} 1 \\ -4 \\ -1 \\ -2 \\ 1 \end{array}\right)$

Determine the dimension of $U \cap W$.

(b) Let $A, B$ be complex $n$ by $n$ matrices with $\operatorname{rank}(B)=k$.

Show that $\operatorname{det}(A+t B)$ is a polynomial in $t$ of degree at most $k$.

Show that if $k=n$ the polynomial is of degree precisely $n$.

Give an example where $k \geqslant 1$ but this polynomial is zero.
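
The dimension asked for in part (a)(ii) can be cross-checked numerically via $\operatorname{dim}(U \cap W)=\operatorname{dim} U+\operatorname{dim} W-\operatorname{dim}(U+W)$. This is an added sketch (it assumes the answer is 2), not a substitute for the row reduction the question expects.

```python
import numpy as np

# Spanning vectors of U and W from part (a)(ii), as rows.
U = np.array([[ 1,  2, -1,  1,  1],
              [ 1,  0,  0,  1,  0],
              [-2,  2,  2,  1, -2]], dtype=float)
W = np.array([[ 3,  2, -3,  1,  3],
              [ 1,  1,  0,  0,  0],
              [ 1, -4, -1, -2,  1]], dtype=float)

dim_U = np.linalg.matrix_rank(U)
dim_W = np.linalg.matrix_rank(W)
dim_sum = np.linalg.matrix_rank(np.vstack([U, W]))  # dim(U + W)
dim_intersection = dim_U + dim_W - dim_sum          # dim(U ∩ W)
```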

Paper 4, Section I, 1E

Let $\operatorname{Mat}_{n}(\mathbb{C})$ be the vector space of $n$ by $n$ complex matrices.

Given $A \in \operatorname{Mat}_{n}(\mathbb{C})$, define the linear map $\varphi_{A}: \operatorname{Mat}_{n}(\mathbb{C}) \rightarrow \operatorname{Mat}_{n}(\mathbb{C})$,

$X \mapsto A X-X A$

(i) Compute a basis of eigenvectors, and their associated eigenvalues, when $A$ is the diagonal matrix

$A=\left(\begin{array}{llll} 1 & & & \\ & 2 & & \\ & & \ddots & \\ & & & n \end{array}\right)$

What is the rank of $\varphi_{A}$?

(ii) Now let $A=\left(\begin{array}{ll}0 & 1 \\ 0 & 0\end{array}\right)$. Write down the matrix of the linear transformation $\varphi_{A}$ with respect to the standard basis of $\operatorname{Mat}_{2}(\mathbb{C})$.

What is its Jordan normal form?
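
Both parts of this question can be checked numerically: with row-major vectorisation, the map $X \mapsto AX-XA$ has matrix $A \otimes I - I \otimes A^{T}$, and the ranks of its powers determine the Jordan structure. An added verification sketch, assuming the answers (rank $n^2-n$ in part (i); Jordan form $J_3(0)\oplus J_1(0)$ in part (ii)):

```python
import numpy as np

def phi_matrix(A):
    """Matrix of X -> AX - XA acting on row-major vectorised matrices."""
    n = A.shape[0]
    I = np.eye(n)
    return np.kron(A, I) - np.kron(I, A.T)

# Part (i) with n = 4: A = diag(1, 2, 3, 4). The eigenvalues of phi_A are
# i - j, so the kernel is the diagonal matrices and the rank is n^2 - n = 12.
n = 4
D = np.diag(np.arange(1., n + 1))
rank_diag_case = np.linalg.matrix_rank(phi_matrix(D))

# Part (ii): A = [[0, 1], [0, 0]]. Here phi_A is nilpotent; the ranks of its
# powers (2, 1, 0) force Jordan blocks of sizes 3 and 1 for eigenvalue 0.
A = np.array([[0., 1.], [0., 0.]])
M = phi_matrix(A)
ranks = [int(np.linalg.matrix_rank(np.linalg.matrix_power(M, k))) for k in (1, 2, 3)]
```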

Paper 4, Section II, E

(a) Let $V$ be a complex vector space of dimension $n$.

What is a Hermitian form on $V$?

Given a Hermitian form, define the matrix $A$ of the form with respect to the basis $v_{1}, \ldots, v_{n}$ of $V$, and describe in terms of $A$ the value of the Hermitian form on two elements of $V$.

Now let $w_{1}, \ldots, w_{n}$ be another basis of $V$. Suppose $w_{i}=\sum_{j} p_{i j} v_{j}$, and let $P=\left(p_{i j}\right)$. Write down the matrix of the form with respect to this new basis in terms of $A$ and $P$.

Let $N=V^{\perp}$. Describe the dimension of $N$ in terms of the matrix $A$.

(b) Write down the matrix of the real quadratic form

$x^{2}+y^{2}+2 z^{2}+2 x y+2 x z-2 y z .$

Using the Gram-Schmidt algorithm, find a basis which diagonalises the form. What are its rank and signature?

(c) Let $V$ be a real vector space, and let $\langle\,,\,\rangle$ be a symmetric bilinear form on it. Let $A$ be the matrix of this form in some basis.

Prove that the signature of $\langle\,,\,\rangle$ is the number of positive eigenvalues of $A$ minus the number of negative eigenvalues.

Explain, using an example, why the eigenvalues themselves depend on the choice of a basis.

Paper 1, Section I, F

Define what it means for two $n \times n$ matrices $A$ and $B$ to be similar. Define the Jordan normal form of a matrix.

Determine whether the matrices

$A=\left(\begin{array}{ccc} 4 & 6 & -15 \\ 1 & 3 & -5 \\ 1 & 2 & -4 \end{array}\right), \quad B=\left(\begin{array}{ccc} 1 & -3 & 3 \\ -2 & -6 & 13 \\ -1 & -4 & 8 \end{array}\right)$

are similar, carefully stating any theorem you use.
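
A numerical check of the intended conclusion (added here as a sketch; the theorem to quote is the uniqueness of Jordan normal form): both matrices have characteristic polynomial $(x-1)^3$, but $\operatorname{rank}(A-I) \neq \operatorname{rank}(B-I)$, so their Jordan forms differ and they are not similar.

```python
import numpy as np

A = np.array([[ 4.,  6., -15.],
              [ 1.,  3.,  -5.],
              [ 1.,  2.,  -4.]])
B = np.array([[ 1., -3.,  3.],
              [-2., -6., 13.],
              [-1., -4.,  8.]])

# Both matrices have characteristic polynomial (x - 1)^3 ...
same_char = np.allclose(np.poly(A), np.poly(B))

# ... but A - I has rank 1 (Jordan form J_2(1) + J_1(1)) while B - I has
# rank 2 (Jordan form J_3(1)), so A and B are not similar.
rank_A = np.linalg.matrix_rank(A - np.eye(3))
rank_B = np.linalg.matrix_rank(B - np.eye(3))
```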

Paper 1, Section II, F

Let $\mathcal{M}_{n}$ denote the vector space of $n \times n$ matrices over a field $\mathbb{F}=\mathbb{R}$ or $\mathbb{C}$. What is the rank $r(A)$ of a matrix $A \in \mathcal{M}_{n}$?

Show, stating accurately any preliminary results that you require, that $r(A)=n$ if and only if $A$ is non-singular, i.e. $\operatorname{det} A \neq 0$.

Does $\mathcal{M}_{n}$ have a basis consisting of non-singular matrices? Justify your answer.

Suppose that an $n \times n$ matrix $A$ is non-singular and every entry of $A$ is either 0 or 1. Let $c_{n}$ be the largest possible number of 1's in such an $A$. Show that $c_{n} \leqslant n^{2}-n+1$. Is this bound attained? Justify your answer.

[Standard properties of the adjugate matrix can be assumed, if accurately stated.]

Paper 2, Section II, F

Let $V$ be a finite-dimensional vector space over a field. Show that an endomorphism $\alpha$ of $V$ is idempotent, i.e. $\alpha^{2}=\alpha$, if and only if $\alpha$ is a projection onto its image.

Determine whether the following statements are true or false, giving a proof or counterexample as appropriate:

(i) If $\alpha^{3}=\alpha^{2}$, then $\alpha$ is idempotent.

(ii) The condition $\alpha(1-\alpha)^{2}=0$ is equivalent to $\alpha$ being idempotent.

(iii) If $\alpha$ and $\beta$ are idempotent and such that $\alpha+\beta$ is also idempotent, then $\alpha \beta=0$.

(iv) If $\alpha$ and $\beta$ are idempotent and $\alpha \beta=0$, then $\alpha+\beta$ is also idempotent.

Paper 1, Section I, F

Define a basis of a vector space $V$.

If $V$ has a finite basis $\mathcal{B}$, show using only the definition that any other basis $\mathcal{B}^{\prime}$ has the same cardinality as $\mathcal{B}$.

Paper 1, Section II, F

What is the adjugate $\operatorname{adj}(A)$ of an $n \times n$ matrix $A$? How is it related to $\operatorname{det}(A)$?

(a) Define matrices $B_{0}, B_{1}, \ldots, B_{n-1}$ by

$\operatorname{adj}(t I-A)=\sum_{i=0}^{n-1} B_{i} t^{n-1-i}$

and scalars $c_{0}, c_{1}, \ldots, c_{n}$ by

$\operatorname{det}(t I-A)=\sum_{j=0}^{n} c_{j} t^{n-j}$

Find a recursion for the matrices $B_{i}$ in terms of $A$ and the $c_{j}$ 's.

(b) By considering the partial derivatives of the multivariable polynomial

$p\left(t_{1}, t_{2}, \ldots, t_{n}\right)=\operatorname{det}\left(\left(\begin{array}{cccc} t_{1} & 0 & \cdots & 0 \\ 0 & t_{2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & t_{n} \end{array}\right)-A\right)$

show that

$\frac{d}{d t}(\operatorname{det}(t I-A))=\operatorname{Tr}(\operatorname{adj}(t I-A))$

(c) Hence show that the $c_{j}$ 's may be expressed in terms of $\operatorname{Tr}(A), \operatorname{Tr}\left(A^{2}\right), \ldots, \operatorname{Tr}\left(A^{n}\right)$.

Paper 2, Section I, F

If $U$ and $W$ are finite-dimensional subspaces of a vector space $V$, prove that

$\operatorname{dim}(U+W)=\operatorname{dim}(U)+\operatorname{dim}(W)-\operatorname{dim}(U \cap W)$

Let

$\begin{aligned} U &=\left\{\mathbf{x} \in \mathbb{R}^{4} \mid x_{1}=7 x_{3}+8 x_{4}, x_{2}+5 x_{3}+6 x_{4}=0\right\} \\ W &=\left\{\mathbf{x} \in \mathbb{R}^{4} \mid x_{1}+2 x_{2}+3 x_{3}=0, x_{4}=0\right\} . \end{aligned}$

Show that $U+W$ is 3-dimensional and find a linear map $\ell: \mathbb{R}^{4} \rightarrow \mathbb{R}$ such that

$U+W=\left\{\mathbf{x} \in \mathbb{R}^{4} \mid \ell(\mathbf{x})=0\right\}$
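
An added numerical sketch of this computation: the functional used below is one candidate (any non-zero multiple serves equally well), found by solving for a common annihilator of the four spanning vectors.

```python
import numpy as np

# Bases read off from the defining equations:
# U: x1 = 7*x3 + 8*x4 and x2 = -5*x3 - 6*x4, free parameters x3, x4.
U = np.array([[7., -5., 1., 0.],
              [8., -6., 0., 1.]])
# W: x1 = -2*x2 - 3*x3 and x4 = 0, free parameters x2, x3.
W = np.array([[-2., 1., 0., 0.],
              [-3., 0., 1., 0.]])

dim_sum = np.linalg.matrix_rank(np.vstack([U, W]))  # dim(U + W)

# Candidate functional ell(x) = x1 + 2*x2 + 3*x3 + 4*x4; it vanishes on all
# four spanning vectors, hence on U + W.
ell = np.array([1., 2., 3., 4.])
annihilates = np.allclose(np.vstack([U, W]) @ ell, 0)
```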

Paper 2, Section II, F

Let $A$ and $B$ be $n \times n$ matrices over $\mathbb{C}$.

(a) Assuming that $A$ is invertible, show that $A B$ and $B A$ have the same characteristic polynomial.

(b) By considering the matrices $A-s I$, show that $A B$ and $B A$ have the same characteristic polynomial even when $A$ is singular.

(c) Give an example to show that the minimal polynomials $m_{A B}(t)$ and $m_{B A}(t)$ of $A B$ and $B A$ may be different.

(d) Show that $m_{A B}(t)$ and $m_{B A}(t)$ differ at most by a factor of $t$. Stating carefully any results which you use, deduce that if $A B$ is diagonalisable then so is $(B A)^{2}$.

Paper 3, Section II, F

If $q$ is a quadratic form on a finite-dimensional real vector space $V$, what is the associated symmetric bilinear form $\varphi(\cdot, \cdot)$? Prove that there is a basis for $V$ with respect to which the matrix for $\varphi$ is diagonal. What is the signature of $q$?

If $R \leqslant V$ is a subspace such that $\varphi(r, v)=0$ for all $r \in R$ and all $v \in V$, show that $q^{\prime}(v+R)=q(v)$ defines a quadratic form on the quotient vector space $V / R$. Show that the signature of $q^{\prime}$ is the same as that of $q$.

If $e, f \in V$ are vectors such that $\varphi(e, e)=0$ and $\varphi(e, f)=1$, show that there is a direct sum decomposition $V=\operatorname{span}(e, f) \oplus U$ such that the signature of $\left.q\right|_{U}$ is the same as that of $q$.

Paper 4, Section I, F

What is an eigenvalue of a matrix $A$? What is the eigenspace corresponding to an eigenvalue $\lambda$ of $A$?

Consider the matrix

$A=\left(\begin{array}{cccc} a a & a b & a c & a d \\ b a & b b & b c & b d \\ c a & c b & c c & c d \\ d a & d b & d c & d d \end{array}\right)$

for $(a, b, c, d) \in \mathbb{R}^{4}$ a non-zero vector. Show that $A$ has rank 1. Find the eigenvalues of $A$ and describe the corresponding eigenspaces. Is $A$ diagonalisable?

Paper 4, Section II, F

If $U$ is a finite-dimensional real vector space with inner product $\langle\cdot, \cdot\rangle$, prove that the linear map $\phi: U \rightarrow U^{*}$ given by $\phi(u)\left(u^{\prime}\right)=\left\langle u, u^{\prime}\right\rangle$ is an isomorphism. [You do not need to show that it is linear.]

If $V$ and $W$ are inner product spaces and $\alpha: V \rightarrow W$ is a linear map, what is meant by the adjoint $\alpha^{*}$ of $\alpha$ ? If $\left\{e_{1}, e_{2}, \ldots, e_{n}\right\}$ is an orthonormal basis for $V,\left\{f_{1}, f_{2}, \ldots, f_{m}\right\}$ is an orthonormal basis for $W$, and $A$ is the matrix representing $\alpha$ in these bases, derive a formula for the matrix representing $\alpha^{*}$ in these bases.

Prove that $\operatorname{Im}(\alpha)=\operatorname{Ker}\left(\alpha^{*}\right)^{\perp}$.

If $w_{0} \notin \operatorname{Im}(\alpha)$ then the linear equation $\alpha(v)=w_{0}$ has no solution, but we may instead search for a $v_{0} \in V$ minimising $\left\|\alpha(v)-w_{0}\right\|^{2}$, known as a least-squares solution. Show that $v_{0}$ is such a least-squares solution if and only if it satisfies $\alpha^{*} \alpha\left(v_{0}\right)=\alpha^{*}\left(w_{0}\right)$. Hence find a least-squares solution to the linear equation

$\left(\begin{array}{ll} 1 & 0 \\ 1 & 1 \\ 0 & 1 \end{array}\right)\left(\begin{array}{l} x \\ y \end{array}\right)=\left(\begin{array}{l} 1 \\ 2 \\ 3 \end{array}\right)$
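
The normal equations $\alpha^{*} \alpha\left(v_{0}\right)=\alpha^{*}\left(w_{0}\right)$ can be solved numerically for this system. An added check, assuming the least-squares solution $(x, y)=(1/3,\, 7/3)$:

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
b = np.array([1., 2., 3.])

# Solve the normal equations A^T A v = A^T b directly ...
v_normal = np.linalg.solve(A.T @ A, A.T @ b)

# ... and compare with numpy's built-in least-squares solver.
v_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
```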

Paper 1, Section I, E

State the Rank-Nullity Theorem.

If $\alpha: V \rightarrow W$ and $\beta: W \rightarrow X$ are linear maps and $W$ is finite dimensional, show that

$\operatorname{dim} \operatorname{Im}(\alpha)=\operatorname{dim} \operatorname{Im}(\beta \alpha)+\operatorname{dim}(\operatorname{Im}(\alpha) \cap \operatorname{Ker}(\beta))$

If $\gamma: U \rightarrow V$ is another linear map, show that

$\operatorname{dim} \operatorname{Im}(\beta \alpha)+\operatorname{dim} \operatorname{Im}(\alpha \gamma) \leqslant \operatorname{dim} \operatorname{Im}(\alpha)+\operatorname{dim} \operatorname{Im}(\beta \alpha \gamma)$

Paper 1, Section II, E

Define a Jordan block $J_{m}(\lambda)$. What does it mean for a complex $n \times n$ matrix to be in Jordan normal form?

If $A$ is a matrix in Jordan normal form for an endomorphism $\alpha: V \rightarrow V$, prove that

$\operatorname{dim} \operatorname{Ker}\left((\alpha-\lambda I)^{r}\right)-\operatorname{dim} \operatorname{Ker}\left((\alpha-\lambda I)^{r-1}\right)$

is the number of Jordan blocks $J_{m}(\lambda)$ of $A$ with $m \geqslant r$.

Find a matrix in Jordan normal form for $J_{m}(\lambda)^{2}$. [Consider all possible values of $\lambda$.]

Find a matrix in Jordan normal form for the complex matrix

$\left[\begin{array}{cccc} 0 & 0 & 0 & a_{1} \\ 0 & 0 & a_{2} & 0 \\ 0 & a_{3} & 0 & 0 \\ a_{4} & 0 & 0 & 0 \end{array}\right]$

assuming it is invertible.

Paper 2, Section I, E

Let $V$ be a real vector space. Define the dual vector space $V^{*}$ of $V$. If $U$ is a subspace of $V$, define the annihilator $U^{0}$ of $U$. If $x_{1}, x_{2}, \ldots, x_{n}$ is a basis for $V$, define its dual $x_{1}^{*}, x_{2}^{*}, \ldots, x_{n}^{*}$ and prove that it is a basis for $V^{*}$.

If $V$ has basis $x_{1}, x_{2}, x_{3}, x_{4}$ and $U$ is the subspace spanned by

$x_{1}+2 x_{2}+3 x_{3}+4 x_{4} \quad \text { and } \quad 5 x_{1}+6 x_{2}+7 x_{3}+8 x_{4},$

give a basis for $U^{0}$ in terms of the dual basis $x_{1}^{*}, x_{2}^{*}, x_{3}^{*}, x_{4}^{*}$.

Paper 2, Section II, E

If $X$ is an $n \times m$ matrix over a field, show that there are invertible matrices $P$ and $Q$ such that

$Q^{-1} X P=\left[\begin{array}{cc} I_{r} & 0 \\ 0 & 0 \end{array}\right]$

for some $0 \leqslant r \leqslant \min (m, n)$, where $I_{r}$ is the identity matrix of dimension $r$.

For a square matrix of the form $A=\left[\begin{array}{cc}B & D \\ 0 & C\end{array}\right]$ with $B$ and $C$ square matrices, prove that $\operatorname{det}(A)=\operatorname{det}(B) \operatorname{det}(C)$.

If $A \in M_{n \times n}(\mathbb{C})$ and $B \in M_{m \times m}(\mathbb{C})$ have no common eigenvalue, show that the linear map

$\begin{aligned} L: M_{n \times m}(\mathbb{C}) & \longrightarrow M_{n \times m}(\mathbb{C}) \\ X & \longmapsto A X-X B \end{aligned}$

is injective.

Paper 3, Section II, E

State and prove the Cayley-Hamilton Theorem.

Let $A$ be an $n \times n$ complex matrix. Using division of polynomials, show that if $p(x)$ is a polynomial then there is another polynomial $r(x)$ of degree at most $(n-1)$ such that $p(\lambda)=r(\lambda)$ for each eigenvalue $\lambda$ of $A$ and such that $p(A)=r(A)$.

Hence compute the $(1,1)$ entry of the matrix $A^{1000}$ when

$A=\left[\begin{array}{ccc} 2 & -1 & 0 \\ 1 & -1 & 1 \\ -1 & -1 & 1 \end{array}\right]$
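
The interpolation argument gives $\left(A^{1000}\right)_{11}=\left(2^{1002}+1\right)/5$; since the entries are huge, any check must use exact integer arithmetic rather than floating point. A minimal added sketch (the helpers `mat_mul` and `mat_pow` are ad hoc, not from any library):

```python
def mat_mul(X, Y):
    """Exact 3x3 integer matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_pow(X, n):
    """Exact matrix power by repeated squaring."""
    result = [[1 if i == j else 0 for j in range(3)] for i in range(3)]
    while n:
        if n & 1:
            result = mat_mul(result, X)
        X = mat_mul(X, X)
        n >>= 1
    return result

A = [[ 2, -1, 0],
     [ 1, -1, 1],
     [-1, -1, 1]]

# The eigenvalues are 2, i, -i; interpolating lambda^1000 by a quadratic
# gives (A^1000)_{11} = (2^1002 + 1) / 5.
entry = mat_pow(A, 1000)[0][0]
```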

Paper 4, Section I, E

Define a quadratic form on a finite dimensional real vector space. What does it mean for a quadratic form to be positive definite?

Find a basis with respect to which the quadratic form

$x^{2}+2 x y+2 y^{2}+2 y z+3 z^{2}$

is diagonal. Is this quadratic form positive definite?
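
Completing the square gives one diagonalisation, $(x+y)^{2}+(y+z)^{2}+2 z^{2}$, suggesting the form is positive definite. An added numerical check on the Gram matrix (a sketch, not the required working):

```python
import numpy as np

# Gram matrix of x^2 + 2xy + 2y^2 + 2yz + 3z^2.
Q = np.array([[1., 1., 0.],
              [1., 2., 1.],
              [0., 1., 3.]])

# Leading principal minors 1, 1, 2 are all positive; equivalently all
# eigenvalues are positive, so the form is positive definite.
eigenvalues = np.linalg.eigvalsh(Q)
positive_definite = bool(np.all(eigenvalues > 0))
```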

Paper 4, Section II, E

Let $V$ be a finite dimensional inner-product space over $\mathbb{C}$. What does it mean to say that an endomorphism of $V$ is self-adjoint? Prove that a self-adjoint endomorphism has real eigenvalues and may be diagonalised.

An endomorphism $\alpha: V \rightarrow V$ is called positive definite if it is self-adjoint and satisfies $\langle\alpha(x), x\rangle>0$ for all non-zero $x \in V$; it is called negative definite if $-\alpha$ is positive definite. Characterise the property of being positive definite in terms of eigenvalues, and show that the sum of two positive definite endomorphisms is positive definite.

Show that a self-adjoint endomorphism $\alpha: V \rightarrow V$ has all eigenvalues in the interval $[a, b]$ if and only if $\alpha-\lambda I$ is positive definite for all $\lambda<a$ and negative definite for all $\lambda>b$.

Let $\alpha, \beta: V \rightarrow V$ be self-adjoint endomorphisms whose eigenvalues lie in the intervals $[a, b]$ and $[c, d]$ respectively. Show that all of the eigenvalues of $\alpha+\beta$ lie in the interval $[a+c, b+d]$.

Paper 1, Section I, F

State and prove the Steinitz Exchange Lemma.

Deduce that, for a subset $S$ of $\mathbb{R}^{n}$, any two of the following imply the third:

(i) $S$ is linearly independent

(ii) $S$ is spanning

(iii) $S$ has exactly $n$ elements

Let $e_{1}, e_{2}$ be a basis of $\mathbb{R}^{2}$. For which values of $\lambda$ do $\lambda e_{1}+e_{2}, e_{1}+\lambda e_{2}$ form a basis of $\mathbb{R}^{2}$?

Paper 1, Section II, F

Let $U$ and $V$ be finite-dimensional real vector spaces, and let $\alpha: U \rightarrow V$ be a surjective linear map. Which of the following are always true and which can be false? Give proofs or counterexamples as appropriate.

(i) There is a linear map $\beta: V \rightarrow U$ such that $\beta \alpha$ is the identity map on $U$.

(ii) There is a linear map $\beta: V \rightarrow U$ such that $\alpha \beta$ is the identity map on $V$.

(iii) There is a subspace $W$ of $U$ such that the restriction of $\alpha$ to $W$ is an isomorphism from $W$ to $V$.

(iv) If $X$ and $Y$ are subspaces of $U$ with $U=X \oplus Y$ then $V=\alpha(X) \oplus \alpha(Y)$.

(v) If $X$ and $Y$ are subspaces of $U$ with $V=\alpha(X) \oplus \alpha(Y)$ then $U=X \oplus Y$.

Paper 2, Section I, F

State and prove the Rank-Nullity theorem.

Let $\alpha$ be a linear map from $\mathbb{R}^{3}$ to $\mathbb{R}^{3}$ of rank 2. Give an example to show that $\mathbb{R}^{3}$ may be the direct sum of the kernel of $\alpha$ and the image of $\alpha$, and also an example where this is not the case.

Paper 2, Section II, F

Let $\alpha: U \rightarrow V$ and $\beta: V \rightarrow W$ be linear maps between finite-dimensional real vector spaces.

Show that the rank $r(\beta \alpha)$ satisfies $r(\beta \alpha) \leqslant \min (r(\beta), r(\alpha))$. Show also that $r(\beta \alpha) \geqslant r(\alpha)+r(\beta)-\operatorname{dim} V$. For each of these two inequalities, give examples to show that we may or may not have equality.

Now let $V$ have dimension $2 n$ and let $\alpha: V \rightarrow V$ be a linear map of rank $2 n-2$ such that $\alpha^{n}=0$. Find the rank of $\alpha^{k}$ for each $1 \leqslant k \leqslant n-1$.

Paper 3, Section II, F

Let $f$ be a quadratic form on a finite-dimensional real vector space $V$. Prove that there exists a diagonal basis for $f$, meaning a basis with respect to which the matrix of $f$ is diagonal.

Define the rank $r$ and signature $s$ of $f$ in terms of this matrix. Prove that $r$ and $s$ are independent of the choice of diagonal basis.

In terms of $r, s$, and the dimension $n$ of $V$, what is the greatest dimension of a subspace on which $f$ is zero?

Now let $f$ be the quadratic form on $\mathbb{R}^{3}$ given by $f(x, y, z)=x^{2}-y^{2}$. For which points $v$ in $\mathbb{R}^{3}$ is it the case that there is some diagonal basis for $f$ containing $v$ ?

Paper 4, Section I, F

Briefly explain the Gram-Schmidt orthogonalisation process in a real finite-dimensional inner product space $V$.

For a subspace $U$ of $V$, define $U^{\perp}$, and show that $V=U \oplus U^{\perp}$.

For which positive integers $n$ does

$(f, g)=f(1) g(1)+f(2) g(2)+f(3) g(3)$

define an inner product on the space of all real polynomials of degree at most $n$ ?

Paper 4, Section II, F

What is the dual $X^{*}$ of a finite-dimensional real vector space $X$? If $X$ has a basis $e_{1}, \ldots, e_{n}$, define the dual basis, and prove that it is indeed a basis of $X^{*}$.

[No results on the dimension of duals may be assumed without proof.]

Write down (without making a choice of basis) an isomorphism from $X$ to $X^{* *}$. Prove that your map is indeed an isomorphism.

Does every basis of $X^{*}$ arise as the dual basis of some basis of $X$? Justify your answer.

A subspace $W$ of $X^{*}$ is called separating if for every non-zero $x \in X$ there is a $T \in W$ with $T(x) \neq 0$. Show that the only separating subspace of $X^{*}$ is $X^{*}$ itself.

Now let $X$ be the (infinite-dimensional) space of all real polynomials. Explain briefly how we may identify $X^{*}$ with the space of all real sequences. Give an example of a proper subspace of $X^{*}$ that is separating.

Paper 1, Section I, 1F

(a) Consider the linear transformation $\alpha: \mathbb{R}^{3} \rightarrow \mathbb{R}^{3}$ given by the matrix

$\left(\begin{array}{rrr} 5 & -6 & -6 \\ -1 & 4 & 2 \\ 3 & -6 & -4 \end{array}\right)$

Find a basis of $\mathbb{R}^{3}$ in which $\alpha$ is represented by a diagonal matrix.

(b) Give a list of $6 \times 6$ matrices such that any linear transformation $\beta: \mathbb{R}^{6} \rightarrow \mathbb{R}^{6}$ with characteristic polynomial

$(x-2)^{4}(x+7)^{2}$

and minimal polynomial

$(x-2)^{2}(x+7)$

is similar to one of the matrices on your list. No two distinct matrices on your list should be similar. [No proof is required.]
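
For part (a), an added numerical check (assuming the answer: eigenvalues $1, 2, 2$ with a two-dimensional $2$-eigenspace, so the matrix is diagonalisable):

```python
import numpy as np

A = np.array([[ 5., -6., -6.],
              [-1.,  4.,  2.],
              [ 3., -6., -4.]])

# Characteristic polynomial (x - 1)(x - 2)^2.
eigenvalues = np.sort(np.linalg.eigvals(A).real)

# A - 2I has rank 1, so the eigenvalue 2 has a 2-dimensional eigenspace
# and A is diagonalisable.
rank_A_minus_2I = np.linalg.matrix_rank(A - 2 * np.eye(3))
```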

Paper 1, Section II, F

Let $M_{n, n}$ denote the vector space over $F=\mathbb{R}$ or $\mathbb{C}$ of $n \times n$ matrices with entries in $F$. Let $\operatorname{Tr}: M_{n, n} \rightarrow F$ denote the trace functional, i.e., if $A=\left(a_{i j}\right)_{1 \leqslant i, j \leqslant n} \in M_{n, n}$, then

$\operatorname{Tr}(A)=\sum_{i=1}^{n} a_{i i}$

(a) Show that Tr is a linear functional.

(b) Show that $\operatorname{Tr}(A B)=\operatorname{Tr}(B A)$ for $A, B \in M_{n, n}$.

(c) Show that $\operatorname{Tr}$ is unique in the following sense: If $f: M_{n, n} \rightarrow F$ is a linear functional such that $f(A B)=f(B A)$ for each $A, B \in M_{n, n}$, then $f$ is a scalar multiple of the trace functional. If, in addition, $f(I)=n$, then $f=\operatorname{Tr}$.

(d) Let $W \subseteq M_{n, n}$ be the subspace spanned by matrices $C$ of the form $C=A B-B A$ for $A, B \in M_{n, n}$. Show that $W$ is the kernel of Tr.

Paper 2, Section I, F

Find a linear change of coordinates such that the quadratic form

$2 x^{2}+8 x y-6 x z+y^{2}-4 y z+2 z^{2}$

takes the form

$\alpha x^{2}+\beta y^{2}+\gamma z^{2}$

for real numbers $\alpha, \beta$ and $\gamma$.
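
Completing the square is the intended method here; by Sylvester's law of inertia, any valid change of coordinates must produce the same signs among $\alpha, \beta, \gamma$. An added numerical sketch, assuming the signature is one positive and two negative coefficients:

```python
import numpy as np

# Gram matrix of 2x^2 + 8xy - 6xz + y^2 - 4yz + 2z^2.
Q = np.array([[ 2.,  4., -3.],
              [ 4.,  1., -2.],
              [-3., -2.,  2.]])

# Sign pattern of the eigenvalues = sign pattern of any diagonalisation.
eigenvalues = np.linalg.eigvalsh(Q)
n_positive = int(np.sum(eigenvalues > 0))
n_negative = int(np.sum(eigenvalues < 0))
```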

Paper 2, Section II, F

Let $M_{n, n}$ denote the vector space over a field $F=\mathbb{R}$ or $\mathbb{C}$ of $n \times n$ matrices with entries in $F$. Given $B \in M_{n, n}$, consider the two linear transformations $R_{B}, L_{B}: M_{n, n} \rightarrow M_{n, n}$ defined by

$L_{B}(A)=B A, \quad R_{B}(A)=A B$

(a) Show that $\operatorname{det} L_{B}=(\operatorname{det} B)^{n}$.

[For parts (b) and (c), you may assume the analogous result $\operatorname{det} R_{B}=(\operatorname{det} B)^{n}$ without proof.]

(b) Now let $F=\mathbb{C}$. For $B \in M_{n, n}$, write $B^{*}$ for the conjugate transpose of $B$, i.e., $B^{*}:=\bar{B}^{T}$. For $B \in M_{n, n}$, define the linear transformation $M_{B}: M_{n, n} \rightarrow M_{n, n}$ by

$M_{B}(A)=B A B^{*}$

Show that $\operatorname{det} M_{B}=|\operatorname{det} B|^{2 n}$.

(c) Again let $F=\mathbb{C}$. Let $W \subseteq M_{n, n}$ be the set of Hermitian matrices. [Note that $W$ is not a vector space over $\mathbb{C}$ but only over $\mathbb{R}$.] For $B \in M_{n, n}$ and $A \in W$, define $T_{B}(A)=B A B^{*}$. Show that $T_{B}$ is an $\mathbb{R}$-linear operator on $W$, and show that as such,

$\operatorname{det} T_{B}=|\operatorname{det} B|^{2 n}$

Paper 3, Section II, F

Let $\alpha: V \rightarrow V$ be a linear transformation defined on a finite dimensional inner product space $V$ over $\mathbb{C}$. Recall that $\alpha$ is normal if $\alpha$ and its adjoint $\alpha^{*}$ commute. Show that $\alpha$ being normal is equivalent to each of the following statements:

(i) $\alpha=\alpha_{1}+i \alpha_{2}$ where $\alpha_{1}, \alpha_{2}$ are self-adjoint operators and $\alpha_{1} \alpha_{2}=\alpha_{2} \alpha_{1}$;

(ii) there is an orthonormal basis for $V$ consisting of eigenvectors of $\alpha$;

(iii) there is a polynomial $g$ with complex coefficients such that $\alpha^{*}=g(\alpha)$.

Paper 4, Section I, F

For which real numbers $x$ do the vectors

$(x, 1,1,1), \quad(1, x, 1,1), \quad(1,1, x, 1), \quad(1,1,1, x),$

not form a basis of $\mathbb{R}^{4}$? For each such value of $x$, what is the dimension of the subspace of $\mathbb{R}^{4}$ that they span? For each such value of $x$, provide a basis for the spanned subspace, and extend this basis to a basis of $\mathbb{R}^{4}$.
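
The matrix of these vectors is $(x-1) I+J$ with $J$ the all-ones matrix, so its determinant is $(x-1)^{3}(x+3)$ and the vectors fail to be a basis exactly for $x \in\{1,-3\}$. An added numerical check of the ranks at those values:

```python
import numpy as np

def vectors_matrix(x):
    """Rows are the four vectors (x,1,1,1), (1,x,1,1), (1,1,x,1), (1,1,1,x)."""
    return (x - 1) * np.eye(4) + np.ones((4, 4))

# det = (x - 1)^3 (x + 3): the vectors fail to be a basis only at x = 1, -3.
rank_at_1 = np.linalg.matrix_rank(vectors_matrix(1.0))     # all vectors equal
rank_at_minus_3 = np.linalg.matrix_rank(vectors_matrix(-3.0))
rank_generic = np.linalg.matrix_rank(vectors_matrix(0.0))  # e.g. x = 0
```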

Paper 4, Section II, F

(a) Let $\alpha: V \rightarrow W$ be a linear transformation between finite dimensional vector spaces over a field $F=\mathbb{R}$ or $\mathbb{C}$.

Define the dual map of $\alpha$. Let $\delta$ be the dual map of $\alpha$. Given a subspace $U \subseteq V$, define the annihilator $U^{\circ}$ of $U$. Show that $(\operatorname{ker} \alpha)^{\circ}$ and the image of $\delta$ coincide. Conclude that the dimension of the image of $\alpha$ is equal to the dimension of the image of $\delta$. Show that $\operatorname{dim} \operatorname{ker}(\alpha)-\operatorname{dim} \operatorname{ker}(\delta)=\operatorname{dim} V-\operatorname{dim} W$.

(b) Now suppose in addition that $V, W$ are inner product spaces. Define the adjoint $\alpha^{*}$ of $\alpha$. Let $\beta: U \rightarrow V, \gamma: V \rightarrow W$ be linear transformations between finite dimensional inner product spaces. Suppose that the image of $\beta$ is equal to the kernel of $\gamma$. Then show that $\beta \beta^{*}+\gamma^{*} \gamma$ is an isomorphism.

Paper 1, Section I, E

Let $U$ and $V$ be finite dimensional vector spaces and $\alpha: U \rightarrow V$ a linear map. Suppose $W$ is a subspace of $U$. Prove that

$r(\alpha) \geqslant r\left(\left.\alpha\right|_{W}\right) \geqslant r(\alpha)-\operatorname{dim}(U)+\operatorname{dim}(W)$

where $r(\alpha)$ denotes the rank of $\alpha$ and $\left.\alpha\right|_{W}$ denotes the restriction of $\alpha$ to $W$. Give examples showing that each inequality can be both a strict inequality and an equality.

Paper 1, Section II, E

Determine the characteristic polynomial of the matrix

$M=\left(\begin{array}{cccc} x & 1 & 1 & 0 \\ 1-x & 0 & -1 & 0 \\ 2 & 2 x & 1 & 0 \\ 0 & 0 & 0 & 1 \end{array}\right)$

For which values of $x \in \mathbb{C}$ is $M$ invertible? When $M$ is not invertible determine (i) the Jordan normal form $J$ of $M$, (ii) the minimal polynomial of $M$.

Find a basis of $\mathbb{C}^{4}$ such that $J$ is the matrix representing the endomorphism $M: \mathbb{C}^{4} \rightarrow \mathbb{C}^{4}$ in this basis. Give a change of basis matrix $P$ such that $P^{-1} M P=J$.

Paper 2, Section I, 1E

Let $q$ denote a quadratic form on a real vector space $V$. Define the rank and signature of $q$.

Find the rank and signature of the following quadratic forms.

(a) $q(x, y, z)=x^{2}+y^{2}+z^{2}-2 x z-2 y z$.

(b) $q(x, y, z)=x y-x z$.

(c) $q(x, y, z)=x y-2 z^{2}$.
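
An added cross-check of the three signatures via the eigenvalue signs of each Gram matrix (assuming the answers: (a) rank 3, signature 1; (b) rank 2, signature 0; (c) rank 3, signature $-1$):

```python
import numpy as np

def rank_and_signature(Q):
    """Rank and signature (positives minus negatives) from eigenvalue signs."""
    e = np.linalg.eigvalsh(np.asarray(Q, dtype=float))
    pos = int(np.sum(e > 1e-9))
    neg = int(np.sum(e < -1e-9))
    return pos + neg, pos - neg

# (a) x^2 + y^2 + z^2 - 2xz - 2yz
Qa = [[1, 0, -1], [0, 1, -1], [-1, -1, 1]]
# (b) xy - xz
Qb = [[0, 0.5, -0.5], [0.5, 0, 0], [-0.5, 0, 0]]
# (c) xy - 2z^2
Qc = [[0, 0.5, 0], [0.5, 0, 0], [0, 0, -2]]

results = [rank_and_signature(Q) for Q in (Qa, Qb, Qc)]
```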

Paper 2, Section II, E

(i) Suppose $A$ is a matrix that does not have $-1$ as an eigenvalue. Show that $A+I$ is non-singular. Further, show that $A$ commutes with $(A+I)^{-1}$.

(ii) A matrix $A$ is called skew-symmetric if $A^{T}=-A$. Show that a real skew-symmetric matrix does not have $-1$ as an eigenvalue.

(iii) Suppose $A$ is a real skew-symmetric matrix. Show that $U=(I-A)(I+A)^{-1}$ is orthogonal with determinant 1.

(iv) Verify that every orthogonal matrix $U$ with determinant 1 which does not have $-1$ as an eigenvalue can be expressed as $(I-A)(I+A)^{-1}$ where $A$ is a real skew-symmetric matrix.

Paper 3, Section II, E

Let $A_{1}, A_{2}, \ldots, A_{k}$ be $n \times n$ matrices over a field $\mathbb{F}$. We say $A_{1}, A_{2}, \ldots, A_{k}$ are simultaneously diagonalisable if there exists an invertible matrix $P$ such that $P^{-1} A_{i} P$ is diagonal for all $1 \leqslant i \leqslant k$. We say the matrices are commuting if $A_{i} A_{j}=A_{j} A_{i}$ for all $i, j$.

(i) Suppose $A_{1}, A_{2}, \ldots, A_{k}$ are simultaneously diagonalisable. Prove that they are commuting.

(ii) Define an eigenspace of a matrix. Suppose $B_{1}, B_{2}, \ldots, B_{k}$ are commuting $n \times n$ matrices over a field $\mathbb{F}$. Let $E$ denote an eigenspace of $B_{1}$. Prove that $B_{i}(E) \leqslant E$ for all $i$.

(iii) Suppose $B_{1}, B_{2}, \ldots, B_{k}$ are commuting diagonalisable matrices. Prove that they are simultaneously diagonalisable.

(iv) Are the $2 \times 2$ diagonalisable matrices over $\mathbb{C}$ simultaneously diagonalisable? Explain your answer.

Paper 4, Section I, E

Define the dual space $V^{*}$ of a vector space $V$. Given a basis $\left\{x_{1}, \ldots, x_{n}\right\}$ of $V$ define its dual and show it is a basis of $V^{*}$.

Let $V$ be a 3-dimensional vector space over $\mathbb{R}$ and let $\left\{\zeta_{1}, \zeta_{2}, \zeta_{3}\right\}$ be the basis of $V^{*}$ dual to the basis $\left\{x_{1}, x_{2}, x_{3}\right\}$ for $V$. Determine, in terms of the $\zeta_{i}$, the bases dual to each of the following: (a) $\left\{x_{1}+x_{2}, x_{2}+x_{3}, x_{3}\right\}$, (b) $\left\{x_{1}+x_{2}, x_{2}+x_{3}, x_{3}+x_{1}\right\}$.

Paper 4, Section II, E

Suppose $U$ and $W$ are subspaces of a vector space $V$. Explain what is meant by $U \cap W$ and $U+W$ and show that both of these are subspaces of $V$.

Show that if $U$ and $W$ are subspaces of a finite dimensional space $V$ then

$\operatorname{dim} U+\operatorname{dim} W=\operatorname{dim}(U \cap W)+\operatorname{dim}(U+W)$

Determine the dimension of the subspace $W$ of $\mathbb{R}^{5}$ spanned by the vectors

$\left(\begin{array}{c} 1 \\ 3 \\ 3 \\ -1 \\ 1 \end{array}\right),\left(\begin{array}{l} 4 \\ 1 \\ 3 \\ 2 \\ 1 \end{array}\right),\left(\begin{array}{l} 3 \\ 2 \\ 1 \\ 2 \\ 3 \end{array}\right),\left(\begin{array}{c} 2 \\ 2 \\ 5 \\ -1 \\ -1 \end{array}\right)$

Write down a $5 \times 5$ matrix which defines a linear map $\mathbb{R}^{5} \rightarrow \mathbb{R}^{5}$ with $(1,1,1,1,1)^{T}$ in the kernel and with image $W$.

What is the dimension of the space spanned by all linear maps $\mathbb{R}^{5} \rightarrow \mathbb{R}^{5}$

(i) with $(1,1,1,1,1)^{T}$ in the kernel and with image contained in $W$,

(ii) with $(1,1,1,1,1)^{T}$ in the kernel or with image contained in $W$?
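
The dimension of $W$ feeds into all later parts, so it is worth a numerical check. An added sketch (the rank computation treats the four spanning vectors as rows; the dependency it finds is $v_2 = v_3 + v_4 - v_1$):

```python
import numpy as np

# The four spanning vectors of W as rows.
V = np.array([[1, 3, 3, -1,  1],
              [4, 1, 3,  2,  1],
              [3, 2, 1,  2,  3],
              [2, 2, 5, -1, -1]], dtype=float)

dim_W = np.linalg.matrix_rank(V)
# The second vector is a combination of the others: v2 = v3 + v4 - v1.
dependent_relation = np.allclose(V[1], V[2] + V[3] - V[0])
```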

Paper 1, Section I, G

State and prove the Steinitz Exchange Lemma. Use it to prove that, in a finite-dimensional vector space: any two bases have the same size, and every linearly independent set extends to a basis.

Let $e_{1}, \ldots, e_{n}$ be the standard basis for $\mathbb{R}^{n}$. Is $e_{1}+e_{2}, e_{2}+e_{3}, e_{3}+e_{1}$ a basis for $\mathbb{R}^{3}$? Is $e_{1}+e_{2}, e_{2}+e_{3}, e_{3}+e_{4}, e_{4}+e_{1}$ a basis for $\mathbb{R}^{4}$? Justify your answers.
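Both questions reduce to determinant computations, which the following sketch checks (the matrices have the given vectors as columns; a nonzero determinant means the vectors form a basis):

```python
def det(M):
    """Determinant via Laplace expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

# Columns are e1+e2, e2+e3, e3+e1 in R^3:
M3 = [[1, 0, 1],
      [1, 1, 0],
      [0, 1, 1]]
# Columns are e1+e2, e2+e3, e3+e4, e4+e1 in R^4:
M4 = [[1, 0, 0, 1],
      [1, 1, 0, 0],
      [0, 1, 1, 0],
      [0, 0, 1, 1]]
print(det(M3), det(M4))  # 2 0
```

So the three vectors are a basis of $\mathbb{R}^{3}$, while the four vectors are not a basis of $\mathbb{R}^{4}$: indeed $(e_{1}+e_{2})-(e_{2}+e_{3})+(e_{3}+e_{4})-(e_{4}+e_{1})=0$ is an explicit linear dependence.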

Paper 1, Section II, G

Let $V$ be an $n$-dimensional real vector space, and let $T$ be an endomorphism of $V$. We say that $T$ acts on a subspace $W$ if $T(W) \subset W$.

(i) For any $x \in V$, show that $T$ acts on the linear span of $\left\{x, T(x), T^{2}(x), \ldots, T^{n-1}(x)\right\}$.

(ii) If $\left\{x, T(x), T^{2}(x), \ldots, T^{n-1}(x)\right\}$ spans $V$, show directly (i.e. without using the Cayley-Hamilton Theorem) that $T$ satisfies its own characteristic equation.

(iii) Suppose that $T$ acts on a subspace $W$ with $W \neq\{0\}$ and $W \neq V$. Let $e_{1}, \ldots, e_{k}$ be a basis for $W$, and extend to a basis $e_{1}, \ldots, e_{n}$ for $V$. Describe the matrix of $T$ with respect to this basis.

(iv) Using (i), (ii) and (iii) and induction, give a proof of the Cayley-Hamilton Theorem.

[Simple properties of determinants may be assumed without proof.]

Paper 2, Section I, G

State and prove the Rank-Nullity Theorem.

Let $\alpha$ be a linear map from $\mathbb{R}^{5}$ to $\mathbb{R}^{3}$. What are the possible dimensions of the kernel of $\alpha$? Justify your answer.

Paper 2, Section II, G

Define the determinant of an $n \times n$ complex matrix $A$. Explain, with justification, how the determinant of $A$ changes when we perform row and column operations on $A$.

Let $A, B, C$ be complex $n \times n$ matrices. Prove the following statements.

(i) $\operatorname{det}\left(\begin{array}{cc}A & C \\ 0 & B\end{array}\right)=\operatorname{det} A \operatorname{det} B$.

(ii) $\operatorname{det}\left(\begin{array}{cc}A & -B \\ B & A\end{array}\right)=\operatorname{det}(A+i B) \operatorname{det}(A-i B)$.
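Both identities can be sanity-checked numerically. The sketch below uses arbitrarily chosen $2 \times 2$ matrices $A$, $B$, $C$ (the specific entries are illustrative, not from the question) and a naive Laplace-expansion determinant:

```python
def det(M):
    """Determinant via Laplace expansion along the first row (entries may be complex)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

# Arbitrary 2x2 real matrices for the check:
A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
C = [[5, 6], [7, 8]]

# (i) block upper-triangular: det [[A, C], [0, B]] = det A * det B
block_i = [A[0] + C[0], A[1] + C[1], [0, 0] + B[0], [0, 0] + B[1]]
assert det(block_i) == det(A) * det(B)

# (ii) det [[A, -B], [B, A]] = det(A + iB) * det(A - iB)
block_ii = [A[0] + [-x for x in B[0]],
            A[1] + [-x for x in B[1]],
            B[0] + A[0],
            B[1] + A[1]]
ApiB = [[A[i][j] + 1j * B[i][j] for j in range(2)] for i in range(2)]
AmiB = [[A[i][j] - 1j * B[i][j] for j in range(2)] for i in range(2)]
assert abs(det(block_ii) - det(ApiB) * det(AmiB)) < 1e-9
```

For these matrices $\det(A+iB) = -1-4i$ and $\det(A-iB) = -1+4i$, so both sides of (ii) equal $17$.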

Paper 3, Section II, G

Let $q$ be a nonsingular quadratic form on a finite-dimensional real vector space $V$. Prove that we may write $V=P \oplus N$, where the restriction of $q$ to $P$ is positive definite, the restriction of $q$ to $N$ is negative definite, and $q(x+y)=q(x)+q(y)$ for all $x \in P$ and $y \in N$. [No result on diagonalisability may be assumed.]

Show that the dimensions of $P$ and $N$ are independent of the choice of $P$ and $N$. Give an example to show that $P$ and $N$ are not themselves uniquely defined.

Find such a decomposition $V=P \oplus N$ when $V=\mathbb{R}^{3}$ and $q$ is the quadratic form $q((x, y, z))=x^{2}+2 y^{2}-2 x y-2 x z$.
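One decomposition for the final part (a sketch; as the question notes, $P$ and $N$ are not unique) comes from completing the square:

```latex
q(x,y,z) \;=\; x^{2} + 2y^{2} - 2xy - 2xz
        \;=\; (x - y - z)^{2} + (y - z)^{2} - 2z^{2}.
```

In the coordinates $u = x - y - z$, $v = y - z$, $w = z$ the form is $u^{2} + v^{2} - 2w^{2}$, so one may take $P = \{z = 0\} = \langle (1,0,0), (0,1,0) \rangle$, on which $q(x,y,0) = (x-y)^{2} + y^{2}$ is positive definite, and $N = \{u = v = 0\} = \langle (2,1,1) \rangle$, on which $q$ is negative definite (indeed $q((2,1,1)) = -2$).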

Paper 4, Section I, G

Let $V$ denote the vector space of all real polynomials of degree at most 2. Show that

$(f, g)=\int_{-1}^{1} f(x) g(x) d x$

defines an inner product on $V$.

Find an orthonormal basis for $V$.
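A sketch of the Gram-Schmidt computation, done in exact rational arithmetic (the final normalisation introduces square roots, so the code only orthogonalises $1, x, x^{2}$ and records squared norms; dividing by those square roots gives the orthonormal basis $1/\sqrt{2}$, $\sqrt{3/2}\,x$, $\sqrt{5/8}\,(3x^{2}-1)$):

```python
from fractions import Fraction as F

# Polynomials of degree <= 2 as coefficient lists [c0, c1, c2] (c0 + c1*x + c2*x^2).
def inner(p, q):
    """Exact inner product (p, q) = integral of p(x)q(x) over [-1, 1]."""
    prod = [F(0)] * 5
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            prod[i + j] += F(a) * F(b)
    # integral of x^k over [-1, 1] is 2/(k+1) for even k, 0 for odd k
    return sum(c * F(2, k + 1) for k, c in enumerate(prod) if k % 2 == 0)

# Gram-Schmidt on 1, x, x^2 (without the final normalisation, to stay in Q):
ortho = []
for v in ([1, 0, 0], [0, 1, 0], [0, 0, 1]):
    v = [F(c) for c in v]
    for u in ortho:
        t = inner(v, u) / inner(u, u)
        v = [a - t * b for a, b in zip(v, u)]
    ortho.append(v)

norms_sq = [inner(u, u) for u in ortho]
print(norms_sq)  # squared norms 2, 2/3, 8/45 of 1, x, x^2 - 1/3
```

The orthogonal polynomials $1$, $x$, $x^{2}-\tfrac{1}{3}$ are (up to scaling) the first three Legendre polynomials, as expected for this inner product.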

Paper 4, Section II, G

Let $V$ be a real vector space. What is the dual $V^{*}$ of $V$? If $e_{1}, \ldots, e_{n}$ is a basis for $V$, define the dual basis $e_{1}^{*}, \ldots, e_{n}^{*}$ for $V^{*}$, and show that it is indeed a basis for $V^{*}$.

[No result about dimensions of dual spaces may be assumed.]

For a subspace $U$ of $V$, what is the annihilator of $U$? If $V$ is $n$-dimensional, how does the dimension of the annihilator of $U$ relate to the dimension of $U$?

Let $\alpha: V \rightarrow W$ be a linear map between finite-dimensional real vector spaces. What is the dual map $\alpha^{*}$? Explain why the rank of $\alpha^{*}$ is equal to the rank of $\alpha$. Prove that the kernel of $\alpha^{*}$ is the annihilator of the image of $\alpha$, and also that the image of $\alpha^{*}$ is the annihilator of the kernel of $\alpha$.

[Results about the matrices representing a map and its dual may be used without proof, provided they are stated clearly.]

Now let $V$ be the vector space of all real polynomials, and define elements $L_{0}, L_{1}, \ldots$ of $V^{*}$ by setting $L_{i}(p)$ to be the coefficient of $X^{i}$ in $p$ (for each $p \in V$). Do the $L_{i}$ form a basis for $V^{*}$?

Paper 1, Section I, E

What is the adjugate of an $n \times n$ matrix $A$? How is it related to $A^{-1}$? Suppose all the entries of $A$ are integers. Show that all the entries of $A^{-1}$ are integers if and only if $\operatorname{det} A=\pm 1$.
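The relation $A \operatorname{adj}(A) = (\operatorname{det} A)\, I$ underlying this question can be illustrated on a concrete integer matrix with determinant $1$ (the matrix is an arbitrary example chosen for illustration):

```python
def det(M):
    """Determinant via Laplace expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def adjugate(M):
    """adj(M)[i][j] is the (j, i) cofactor: the transpose of the cofactor matrix."""
    n = len(M)
    minor = lambda i, j: [row[:j] + row[j + 1:]
                          for k, row in enumerate(M) if k != i]
    return [[(-1) ** (i + j) * det(minor(j, i)) for j in range(n)]
            for i in range(n)]

# An integer matrix with det A = 1:
A = [[1, 2, 3],
     [0, 1, 4],
     [5, 6, 0]]
d = det(A)
adjA = adjugate(A)
print(d)     # 1, so A^{-1} = adj(A) has integer entries
print(adjA)  # the integer inverse of A

# Check A . adj(A) = det(A) I:
prod = [[sum(A[i][k] * adjA[k][j] for k in range(3)) for j in range(3)]
        for i in range(3)]
assert prod == [[d, 0, 0], [0, d, 0], [0, 0, d]]
```

Since $\operatorname{det} A = 1$ here, $A^{-1} = \operatorname{adj}(A)$ is itself an integer matrix, matching the "if" direction of the question; the "only if" direction follows from $\operatorname{det} A \cdot \operatorname{det} A^{-1} = 1$ with both determinants integers.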

Paper 1, Section II, E

If $V_{1}$ and $V_{2}$ are vector spaces, what is meant by $V_{1} \oplus V_{2}$? If $V_{1}$ and $V_{2}$ are subspaces of a vector space $V$, what is meant by $V_{1}+V_{2}$?

Stating clearly any theorems you use, show that if $V_{1}$ and $V_{2}$ are subspaces of a finite-dimensional vector space $V$, then

$\operatorname{dim} V_{1}+\operatorname{dim} V_{2}=\operatorname{dim}\left(V_{1} \cap V_{2}\right)+\operatorname{dim}\left(V_{1}+V_{2}\right)$