• 1.I.5E

Let $V$ be the subset of $\mathbb{R}^{5}$ consisting of all quintuples $\left(a_{1}, a_{2}, a_{3}, a_{4}, a_{5}\right)$ such that

$a_{1}+a_{2}+a_{3}+a_{4}+a_{5}=0$

and

$a_{1}+2 a_{2}+3 a_{3}+4 a_{4}+5 a_{5}=0$

Prove that $V$ is a subspace of $\mathbb{R}^{5}$. Solve the above equations for $a_{1}$ and $a_{2}$ in terms of $a_{3}, a_{4}$ and $a_{5}$. Hence, exhibit a basis for $V$, explaining carefully why the vectors you give form a basis.
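One way the elimination can go (a sketch, not the only presentation): subtracting the first equation from the second gives

$a_{2}+2 a_{3}+3 a_{4}+4 a_{5}=0, \quad \text { so } \quad a_{2}=-2 a_{3}-3 a_{4}-4 a_{5} \quad \text { and } \quad a_{1}=a_{3}+2 a_{4}+3 a_{5}$

Setting $\left(a_{3}, a_{4}, a_{5}\right)$ equal to the standard unit vectors in turn then produces the candidate basis $(1,-2,1,0,0),(2,-3,0,1,0),(3,-4,0,0,1)$ of $V$.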

• 1.II.14E

(a) Let $U, U^{\prime}$ be subspaces of a finite-dimensional vector space $V$. Prove that $\operatorname{dim}\left(U+U^{\prime}\right)=\operatorname{dim} U+\operatorname{dim} U^{\prime}-\operatorname{dim}\left(U \cap U^{\prime}\right) .$

(b) Let $V$ and $W$ be finite-dimensional vector spaces and let $\alpha$ and $\beta$ be linear maps from $V$ to $W$. Prove that

$\operatorname{rank}(\alpha+\beta) \leqslant \operatorname{rank} \alpha+\operatorname{rank} \beta$

(c) Deduce from this result that

$\operatorname{rank}(\alpha+\beta) \geqslant|\operatorname{rank} \alpha-\operatorname{rank} \beta|$

(d) Let $V=W=\mathbb{R}^{n}$ and suppose that $1 \leqslant r \leqslant s \leqslant n$. Exhibit linear maps $\alpha, \beta: V \rightarrow W$ such that $\operatorname{rank} \alpha=r, \operatorname{rank} \beta=s$ and $\operatorname{rank}(\alpha+\beta)=s-r$. Suppose that $r+s \geqslant n$. Exhibit linear maps $\alpha, \beta: V \rightarrow W$ such that $\operatorname{rank} \alpha=r, \operatorname{rank} \beta=s$ and $\operatorname{rank}(\alpha+\beta)=n$.
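For part (d), one family of examples can be checked in exact arithmetic (an illustrative sketch: the diagonal matrices and the elimination-based rank routine below are our own choices, with $n=4$, $r=1$ or $2$, $s=3$ taken for concreteness, not the only possible answer):

```python
from fractions import Fraction

def rank(rows):
    """Count pivots after Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def diag(entries):
    """Diagonal matrix with the given diagonal entries."""
    n = len(entries)
    return [[entries[i] if i == j else 0 for j in range(n)] for i in range(n)]

def add(A, B):
    """Entrywise sum of two matrices."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

# n = 4, r = 1, s = 3: rank(alpha + beta) = s - r = 2.
alpha = diag([-1, 0, 0, 0])
beta = diag([1, 1, 1, 0])
print(rank(alpha), rank(beta), rank(add(alpha, beta)))  # 1 3 2

# n = 4, r = 2, s = 3 (so r + s >= n): rank(alpha + beta) = n = 4.
alpha2 = diag([1, 0, 0, 1])
beta2 = diag([1, 1, 1, 0])
print(rank(alpha2), rank(beta2), rank(add(alpha2, beta2)))  # 2 3 4
```

The idea is that the $-1$ entries of $\alpha$ cancel against $\beta$ on the shared coordinates in the first case, while in the second case $\alpha$'s rank is spent covering the coordinates $\beta$ misses.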

• 2.I.6E

Let $a_{1}, a_{2}, \ldots, a_{n}$ be distinct real numbers. For each $i$ let $\mathbf{v}_{i}$ be the vector $\left(1, a_{i}, a_{i}^{2}, \ldots, a_{i}^{n-1}\right)$. Let $A$ be the $n \times n$ matrix with rows $\mathbf{v}_{1}, \mathbf{v}_{2}, \ldots, \mathbf{v}_{n}$ and let $\mathbf{c}$ be a column vector of size $n$. Prove that $A \mathbf{c}=\mathbf{0}$ if and only if $\mathbf{c}=\mathbf{0}$. Deduce that the vectors $\mathbf{v}_{1}, \mathbf{v}_{2}, \ldots, \mathbf{v}_{n}$ span $\mathbb{R}^{n}$.

[You may use general facts about matrices if you state them clearly.]
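For orientation (a known fact, not required by the question): $A$ is a Vandermonde matrix, and

$\operatorname{det} A=\prod_{1 \leqslant i<j \leqslant n}\left(a_{j}-a_{i}\right)$

which is non-zero precisely because the $a_{i}$ are distinct; this is consistent with the conclusion that $A \mathbf{c}=\mathbf{0}$ forces $\mathbf{c}=\mathbf{0}$.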

• 2.II.15E

(a) Let $A=\left(a_{i j}\right)$ be an $m \times n$ matrix and for each $k \leqslant n$ let $A_{k}$ be the $m \times k$ matrix formed by the first $k$ columns of $A$. Suppose that $n>m$. Explain why the nullity of $A$ is non-zero. Prove that if $k$ is minimal such that $A_{k}$ has non-zero nullity, then the nullity of $A_{k}$ is 1 .

(b) Suppose that no column of $A$ consists entirely of zeros. Deduce from (a) that there exist scalars $b_{1}, \ldots, b_{k}$ (where $k$ is defined as in (a)) such that $\sum_{j=1}^{k} a_{i j} b_{j}=0$ for every $i \leqslant m$, but whenever $\lambda_{1}, \ldots, \lambda_{k}$ are distinct real numbers there is some $i \leqslant m$ such that $\sum_{j=1}^{k} a_{i j} \lambda_{j} b_{j} \neq 0$.

(c) Now let $\mathbf{v}_{1}, \mathbf{v}_{2}, \ldots, \mathbf{v}_{m}$ and $\mathbf{w}_{1}, \mathbf{w}_{2}, \ldots, \mathbf{w}_{m}$ be bases for the same real $m$-dimensional vector space. Let $\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}$ be distinct real numbers such that for every $j$ the vectors $\mathbf{v}_{1}+\lambda_{j} \mathbf{w}_{1}, \ldots, \mathbf{v}_{m}+\lambda_{j} \mathbf{w}_{m}$ are linearly dependent. For each $j$, let $a_{1 j}, \ldots, a_{m j}$ be scalars, not all zero, such that $\sum_{i=1}^{m} a_{i j}\left(\mathbf{v}_{i}+\lambda_{j} \mathbf{w}_{i}\right)=\mathbf{0}$. By applying the result of (b) to the matrix $\left(a_{i j}\right)$, deduce that $n \leqslant m$.

(d) It follows that the vectors $\mathbf{v}_{1}+\lambda \mathbf{w}_{1}, \ldots, \mathbf{v}_{m}+\lambda \mathbf{w}_{m}$ are linearly dependent for at most $m$ values of $\lambda$. Explain briefly how this result can also be proved using determinants.

• 3.I.7G

Let $\alpha$ be an endomorphism of a finite-dimensional real vector space $U$ and let $\beta$ be another endomorphism of $U$ that commutes with $\alpha$. If $\lambda$ is an eigenvalue of $\alpha$, show that $\beta$ maps the kernel of $\alpha-\lambda \iota$ into itself, where $\iota$ is the identity map. Suppose now that $\alpha$ is diagonalizable with $n$ distinct real eigenvalues where $n=\operatorname{dim} U$. Prove that if there exists an endomorphism $\beta$ of $U$ such that $\alpha=\beta^{2}$, then $\lambda \geqslant 0$ for all eigenvalues $\lambda$ of $\alpha$.

• 3.II.17G

Define the determinant $\operatorname{det}(A)$ of an $n \times n$ complex matrix $A$. Let $A_{1}, \ldots, A_{n}$ be the columns of $A$, let $\sigma$ be a permutation of $\{1, \ldots, n\}$ and let $A^{\sigma}$ be the matrix whose columns are $A_{\sigma(1)}, \ldots, A_{\sigma(n)}$. Prove from your definition of determinant that $\operatorname{det}\left(A^{\sigma}\right)=\epsilon(\sigma) \operatorname{det}(A)$, where $\epsilon(\sigma)$ is the sign of the permutation $\sigma$. Prove also that $\operatorname{det}(A)=\operatorname{det}\left(A^{t}\right)$.

Define the adjugate matrix $\operatorname{adj}(A)$ and prove from your definitions that $A \operatorname{adj}(A)=\operatorname{adj}(A) A=\operatorname{det}(A) I$, where $I$ is the identity matrix. Hence or otherwise, prove that if $\operatorname{det}(A) \neq 0$, then $A$ is invertible.

Let $C$ and $D$ be real $n \times n$ matrices such that the complex matrix $C+i D$ is invertible. By considering $\operatorname{det}(C+\lambda D)$ as a function of $\lambda$ or otherwise, prove that there exists a real number $\lambda$ such that $C+\lambda D$ is invertible. [You may assume that if a matrix $A$ is invertible, then $\operatorname{det}(A) \neq 0$.]

Deduce that if two real matrices $A$ and $B$ are such that there exists an invertible complex matrix $P$ with $P^{-1} A P=B$, then there exists an invertible real matrix $Q$ such that $Q^{-1} A Q=B$.

• 4.I.6G

Let $\alpha$ be an endomorphism of a finite-dimensional real vector space $U$ such that $\alpha^{2}=\alpha$. Show that $U$ can be written as the direct sum of the kernel of $\alpha$ and the image of $\alpha$. Hence or otherwise, find the characteristic polynomial of $\alpha$ in terms of the dimension of $U$ and the rank of $\alpha$. Is $\alpha$ diagonalizable? Justify your answer.
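A consistency check on the expected answer (a sketch, writing $n=\operatorname{dim} U$ and $r=\operatorname{rank} \alpha$): on $\operatorname{ker} \alpha$ the map acts as $0$ and on $\operatorname{im} \alpha$ it acts as the identity, so the direct sum decomposition suggests

$\chi_{\alpha}(x)=\operatorname{det}(\alpha-x \iota)=(-x)^{n-r}(1-x)^{r}$

with eigenvalues $0$ and $1$ only.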

• 4.II.15G

Let $\alpha \in L(U, V)$ be a linear map between finite-dimensional vector spaces. Let

$\begin{gathered} M^{l}(\alpha)=\{\beta \in L(V, U): \beta \alpha=0\} \quad \text { and } \\ M^{r}(\alpha)=\{\beta \in L(V, U): \alpha \beta=0\} . \end{gathered}$

(a) Prove that $M^{l}(\alpha)$ and $M^{r}(\alpha)$ are subspaces of $L(V, U)$ of dimensions

$\begin{gathered} \operatorname{dim} M^{l}(\alpha)=(\operatorname{dim} V-\operatorname{rank} \alpha) \operatorname{dim} U \quad \text { and } \\ \operatorname{dim} M^{r}(\alpha)=\operatorname{dim} \operatorname{ker}(\alpha) \operatorname{dim} V \end{gathered}$

[You may use the result that there exist bases in $U$ and $V$ so that $\alpha$ is represented by

$\left(\begin{array}{cc} I_{r} & 0 \\ 0 & 0 \end{array}\right)$

where $I_{r}$ is the $r \times r$ identity matrix and $r$ is the rank of $\alpha$.]

(b) Let $\Phi: L(U, V) \rightarrow L\left(V^{*}, U^{*}\right)$ be given by $\Phi(\alpha)=\alpha^{*}$, where $\alpha^{*}$ is the dual map induced by $\alpha$. Prove that $\Phi$ is an isomorphism. [You may assume that $\Phi$ is linear, and you may use the result that a finite-dimensional vector space and its dual have the same dimension.]

(c) Prove that

$\Phi\left(M^{l}(\alpha)\right)=M^{r}\left(\alpha^{*}\right) \quad \text { and } \quad \Phi\left(M^{r}(\alpha)\right)=M^{l}\left(\alpha^{*}\right)$

[You may use the results that $(\beta \alpha)^{*}=\alpha^{*} \beta^{*}$ and that $\beta^{* *}$ can be identified with $\beta$ under the canonical isomorphism between a vector space and its double dual.]

(d) Conclude that $\operatorname{rank}(\alpha)=\operatorname{rank}\left(\alpha^{*}\right)$.


• 1.I.5G

Define $f: \mathbb{C}^{3} \rightarrow \mathbb{C}^{3}$ by

$f(a, b, c)=(a+3 b-c, 2 b+c,-4 b-c)$

Find the characteristic polynomial and the minimal polynomial of $f$. Is $f$ diagonalisable? Are $f$ and $f^{2}$ linearly independent endomorphisms of $\mathbb{C}^{3}$ ? Justify your answers.
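A sketch of the first computation (worth re-deriving): with respect to the standard basis, $f$ has matrix

$A=\left(\begin{array}{ccc} 1 & 3 & -1 \\ 0 & 2 & 1 \\ 0 & -4 & -1 \end{array}\right)$

and expanding $\operatorname{det}(A-x I)$ along the first column gives

$(1-x)[(2-x)(-1-x)+4]=(1-x)\left(x^{2}-x+2\right)$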

• 1.II.14G

Let $\alpha$ be an endomorphism of a vector space $V$ of finite dimension $n$.

(a) What is the dimension of the vector space of linear endomorphisms of $V$ ? Show that there exists a non-trivial polynomial $p(X)$ such that $p(\alpha)=0$. Define what is meant by the minimal polynomial $m_{\alpha}$ of $\alpha$.

(b) Show that the eigenvalues of $\alpha$ are precisely the roots of the minimal polynomial of $\alpha$.

(c) Let $W$ be a subspace of $V$ such that $\alpha(W) \subseteq W$ and let $\beta$ be the restriction of $\alpha$ to $W$. Show that $m_{\beta}$ divides $m_{\alpha}$.

(d) Give an example of an endomorphism $\alpha$ and a subspace $W$ as in (c) not equal to $V$ for which $m_{\alpha}=m_{\beta}$, and $\operatorname{deg}\left(m_{\alpha}\right)>1$.

• 2.I.6G

Let $A$ be a complex $4 \times 4$ matrix such that $A^{3}=A^{2}$. What are the possible minimal polynomials of $A$ ? If $A$ is not diagonalisable and $A^{2} \neq 0$, list all possible Jordan normal forms of $A$.
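A useful first observation (a sketch): since $A^{3}=A^{2}$, the minimal polynomial of $A$ divides

$x^{3}-x^{2}=x^{2}(x-1)$

so it is one of $x, x-1, x^{2}, x(x-1)$ or $x^{2}(x-1)$; the hypotheses that $A$ is not diagonalisable and $A^{2} \neq 0$ then rule out all but the last.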

• 2.II.15G

(a) A complex $n \times n$ matrix $U$ is said to be unipotent if $U-I$ is nilpotent, where $I$ is the identity matrix. Show that $U$ is unipotent if and only if 1 is the only eigenvalue of $U$.

(b) Let $T$ be an invertible complex matrix. By considering the Jordan normal form of $T$ show that there exists an invertible matrix $P$ such that

$P T P^{-1}=D_{0}+N$

where $D_{0}$ is an invertible diagonal matrix, $N$ is an upper triangular matrix with zeros in the diagonal and $D_{0} N=N D_{0}$.

(c) Set $D=P^{-1} D_{0} P$ and show that $U=D^{-1} T$ is unipotent.

(d) Conclude that any invertible matrix $T$ can be written as $T=D U$ where $D$ is diagonalisable, $U$ is unipotent and $D U=U D$.

• 3.I.7F

Which of the following statements are true, and which false? Give brief justifications for your answers.

(a) If $U$ and $W$ are subspaces of a vector space $V$, then $U \cap W$ is always a subspace of $V$.

(b) If $U$ and $W$ are distinct subspaces of a vector space $V$, then $U \cup W$ is never a subspace of $V$.

(c) If $U, W$ and $X$ are subspaces of a vector space $V$, then $U \cap(W+X)=$ $(U \cap W)+(U \cap X)$.

(d) If $U$ is a subspace of a finite-dimensional space $V$, then there exists a subspace $W$ such that $U \cap W=\{0\}$ and $U+W=V$.
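For part (c) it may help to test a small candidate configuration (an illustrative sketch): in $\mathbb{R}^{2}$ take $W$ and $X$ to be the two coordinate axes and $U$ the line spanned by $(1,1)$; then

$U \cap(W+X)=U, \quad \text { while } \quad(U \cap W)+(U \cap X)=\{0\}$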

• 3.II.17F

Define the determinant of an $n \times n$ matrix $A$, and prove from your definition that if $A^{\prime}$ is obtained from $A$ by an elementary row operation (i.e. by adding a scalar multiple of the $i$ th row of $A$ to the $j$ th row, for some $j \neq i$ ), then $\operatorname{det} A^{\prime}=\operatorname{det} A$.

Prove also that if $X$ is a $2 n \times 2 n$ matrix of the form

$\left(\begin{array}{ll} A & B \\ O & C \end{array}\right)$

where $O$ denotes the $n \times n$ zero matrix, then $\operatorname{det} X=\operatorname{det} A \operatorname{det} C$. Explain briefly how the $2 n \times 2 n$ matrix

$\left(\begin{array}{ll} B & I \\ O & A \end{array}\right)$

can be transformed into the matrix

$\left(\begin{array}{cc} B & I \\ -A B & O \end{array}\right)$

by a sequence of elementary row operations. Hence or otherwise prove that $\operatorname{det}(A B)=\operatorname{det} A \operatorname{det} B$.

• 4.I.6F

Define the rank and nullity of a linear map between finite-dimensional vector spaces.

State the rank-nullity formula.

Let $\alpha: U \rightarrow V$ and $\beta: V \rightarrow W$ be linear maps. Prove that

$\operatorname{rank}(\alpha)+\operatorname{rank}(\beta)-\operatorname{dim} V \leqslant \operatorname{rank}(\beta \alpha) \leqslant \min \{\operatorname{rank}(\alpha), \operatorname{rank}(\beta)\}$
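Both bounds can be checked on a small example (an illustrative sketch with $U=V=W=\mathbb{R}^{3}$; the coordinate-projection matrices and the exact-arithmetic rank routine are our own choices, picked so that the lower bound is attained):

```python
from fractions import Fraction

def rank(rows):
    """Count pivots after Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def matmul(X, Y):
    """Matrix product X Y (so matmul(B, A) represents the composite beta alpha)."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y))) for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[1, 0, 0], [0, 1, 0], [0, 0, 0]]  # alpha: projection onto first two coordinates, rank 2
B = [[0, 0, 0], [0, 1, 0], [0, 0, 1]]  # beta: projection onto last two coordinates, rank 2
BA = matmul(B, A)                      # beta alpha: only the middle coordinate survives
dimV = 3
print(rank(A), rank(B), rank(BA))  # 2 2 1
print(rank(A) + rank(B) - dimV <= rank(BA) <= min(rank(A), rank(B)))  # True
```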


• 4.II.15F

Define the dual space $V^{*}$ of a finite-dimensional real vector space $V$, and explain what is meant by the basis of $V^{*}$ dual to a given basis of $V$. Explain also what is meant by the statement that the second dual $V^{* *}$ is naturally isomorphic to $V$.

Let $V_{n}$ denote the space of real polynomials of degree at most $n$. Show that, for any real number $x$, the function $e_{x}$ mapping $p$ to $p(x)$ is an element of $V_{n}^{*}$. Show also that, if $x_{1}, x_{2}, \ldots, x_{n+1}$ are distinct real numbers, then $\left\{e_{x_{1}}, e_{x_{2}}, \ldots, e_{x_{n+1}}\right\}$ is a basis of $V_{n}^{*}$, and find the basis of $V_{n}$ dual to it.

Deduce that, for any $(n+1)$ distinct points $x_{1}, \ldots, x_{n+1}$ of the interval $[-1,1]$, there exist scalars $\lambda_{1}, \ldots, \lambda_{n+1}$ such that

$\int_{-1}^{1} p(t) d t=\sum_{i=1}^{n+1} \lambda_{i} p\left(x_{i}\right)$

for all $p \in V_{n}$. For $n=4$ and $\left(x_{1}, x_{2}, x_{3}, x_{4}, x_{5}\right)=\left(-1,-\frac{1}{2}, 0, \frac{1}{2}, 1\right)$, find the corresponding scalars $\lambda_{i}$.
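For the final part, the scalars solve the linear system $\sum_{i} \lambda_{i} x_{i}^{k}=\int_{-1}^{1} t^{k} d t$ for $k=0, \ldots, 4$. A sketch in exact rational arithmetic (the Gauss-Jordan routine here is illustrative, not part of the question):

```python
from fractions import Fraction

# Interpolation points for n = 4.
xs = [Fraction(-1), Fraction(-1, 2), Fraction(0), Fraction(1, 2), Fraction(1)]
n = len(xs)

# Moments: the integral of t^k over [-1, 1] is 2/(k+1) for even k and 0 for odd k.
moments = [Fraction(2, k + 1) if k % 2 == 0 else Fraction(0) for k in range(n)]

# Augmented system: sum_i lam_i * xs[i]**k = moments[k] for k = 0, ..., 4.
A = [[x ** k for x in xs] + [moments[k]] for k in range(n)]

# Gauss-Jordan elimination in exact rational arithmetic.
for c in range(n):
    piv = next(i for i in range(c, n) if A[i][c] != 0)
    A[c], A[piv] = A[piv], A[c]
    A[c] = [a / A[c][c] for a in A[c]]
    for i in range(n):
        if i != c and A[i][c] != 0:
            f = A[i][c]
            A[i] = [a - f * b for a, b in zip(A[i], A[c])]

lam = [A[i][n] for i in range(n)]
print([str(w) for w in lam])  # ['7/45', '32/45', '12/45', '32/45', '7/45']
```

The resulting weights agree with Boole's closed Newton-Cotes rule on five equally spaced points, and they sum to $2$, the length of the interval, as the $k=0$ equation requires.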


• 1.I.5C

Determine for which values of $x \in \mathbb{C}$ the matrix

$M=\left(\begin{array}{ccc} x & 1 & 1 \\ 1-x & 0 & -1 \\ 2 & 2 x & 1 \end{array}\right)$

is invertible. Determine the rank of $M$ as a function of $x$. Find the adjugate and hence the inverse of $M$ for general $x$.
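A check on the first part (a sketch of the computation, worth re-deriving): expanding along the first row,

$\operatorname{det} M=x(0+2 x)-1((1-x)+2)+1(2 x(1-x))=2 x^{2}-(3-x)+2 x-2 x^{2}=3(x-1)$

so $M$ is invertible precisely when $x \neq 1$; at $x=1$ the matrix has rank $2$, since its second row equals the third row minus twice the first.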

• 1.II.14C

(a) Find a matrix $M$ over $\mathbb{C}$ with both minimal polynomial and characteristic polynomial equal to $(x-2)^{3}(x+1)^{2}$. Furthermore find two matrices $M_{1}$ and $M_{2}$ over $\mathbb{C}$ which have the same characteristic polynomial, $(x-3)^{5}(x-1)^{2}$, and the same minimal polynomial, $(x-3)^{2}(x-1)^{2}$, but which are not conjugate to one another. Is it possible to find a third such matrix, $M_{3}$, neither conjugate to $M_{1}$ nor to $M_{2}$ ? Justify your answer.

(b) Suppose $A$ is an $n \times n$ matrix over $\mathbb{R}$ which has minimal polynomial of the form $\left(x-\lambda_{1}\right)\left(x-\lambda_{2}\right)$ for distinct roots $\lambda_{1} \neq \lambda_{2}$ in $\mathbb{R}$. Show that the vector space $V=\mathbb{R}^{n}$ on which $A$ defines an endomorphism $\alpha: V \rightarrow V$ decomposes as a direct sum into $V=\operatorname{ker}\left(\alpha-\lambda_{1} \iota\right) \oplus \operatorname{ker}\left(\alpha-\lambda_{2} \iota\right)$, where $\iota$ is the identity.

[Hint: Express $v \in V$ in terms of $\left(\alpha-\lambda_{1} \iota\right)(v)$ and $\left.\left(\alpha-\lambda_{2} \iota\right)(v) .\right]$

Now suppose that $A$ has minimal polynomial $\left(x-\lambda_{1}\right)\left(x-\lambda_{2}\right) \ldots\left(x-\lambda_{m}\right)$ for distinct $\lambda_{1}, \ldots, \lambda_{m} \in \mathbb{R}$. By induction or otherwise show that

$V=\operatorname{ker}\left(\alpha-\lambda_{1} \iota\right) \oplus \operatorname{ker}\left(\alpha-\lambda_{2} \iota\right) \oplus \ldots \oplus \operatorname{ker}\left(\alpha-\lambda_{m} \iota\right)$

Use this last statement to prove that an arbitrary matrix $A \in M_{n \times n}(\mathbb{R})$ is diagonalizable if and only if all roots of its minimal polynomial lie in $\mathbb{R}$ and have multiplicity $1$.

• 2.I.6C

Show that right multiplication by $A=\left(\begin{array}{ll}a & b \\ c & d\end{array}\right) \in M_{2 \times 2}(\mathbb{C})$ defines a linear transformation $\rho_{A}: M_{2 \times 2}(\mathbb{C}) \rightarrow M_{2 \times 2}(\mathbb{C})$. Find the matrix representing $\rho_{A}$ with respect to the basis

$\left(\begin{array}{ll} 1 & 0 \\ 0 & 0 \end{array}\right),\left(\begin{array}{ll} 0 & 1 \\ 0 & 0 \end{array}\right),\left(\begin{array}{ll} 0 & 0 \\ 1 & 0 \end{array}\right),\left(\begin{array}{ll} 0 & 0 \\ 0 & 1 \end{array}\right)$

of $M_{2 \times 2}(\mathbb{C})$. Prove that the characteristic polynomial of $\rho_{A}$ is equal to the square of the characteristic polynomial of $A$, and that $A$ and $\rho_{A}$ have the same minimal polynomial.

• 2.II.15C

Define the dual $V^{*}$ of a vector space $V$. Given a basis $\left\{v_{1}, \ldots, v_{n}\right\}$ of $V$ define its dual and show it is a basis of $V^{*}$. For a linear transformation $\alpha: V \rightarrow W$ define the dual $\alpha^{*}: W^{*} \rightarrow V^{*}$.

Explain (with proof) how the matrix representing $\alpha: V \rightarrow W$ with respect to given bases of $V$ and $W$ relates to the matrix representing $\alpha^{*}: W^{*} \rightarrow V^{*}$ with respect to the corresponding dual bases of $V^{*}$ and $W^{*}$.

Prove that $\alpha$ and $\alpha^{*}$ have the same rank.

Suppose that $\alpha$ is an invertible endomorphism. Prove that $\left(\alpha^{*}\right)^{-1}=\left(\alpha^{-1}\right)^{*}$.

• 3.I.7C

Determine the dimension of the subspace $W$ of $\mathbb{R}^{5}$ spanned by the vectors

$\left(\begin{array}{r} 1 \\ 2 \\ 2 \\ -1 \\ 1 \end{array}\right),\left(\begin{array}{r} 4 \\ 2 \\ -2 \\ 6 \\ -2 \end{array}\right),\left(\begin{array}{l} 4 \\ 5 \\ 3 \\ 1 \\ 1 \end{array}\right),\left(\begin{array}{r} 5 \\ 4 \\ 0 \\ 5 \\ -1 \end{array}\right)$

Write down a $5 \times 5$ matrix $M$ which defines a linear map $\mathbb{R}^{5} \rightarrow \mathbb{R}^{5}$ whose image is $W$ and which contains $(1,1,1,1,1)^{T}$ in its kernel. What is the dimension of the space of all linear maps $\mathbb{R}^{5} \rightarrow \mathbb{R}^{5}$ with $(1,1,1,1,1)^{T}$ in the kernel, and image contained in $W$ ?
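The dimension can be verified in exact arithmetic (an illustrative sketch; the rank routine is our own and not part of the question). In fact $\mathbf{v}_{3}=2 \mathbf{v}_{1}+\frac{1}{2} \mathbf{v}_{2}$ and $\mathbf{v}_{4}=\mathbf{v}_{1}+\mathbf{v}_{2}$, so the expected answer is $2$:

```python
from fractions import Fraction

def rank(rows):
    """Count pivots after Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

v1 = [1, 2, 2, -1, 1]
v2 = [4, 2, -2, 6, -2]
v3 = [4, 5, 3, 1, 1]
v4 = [5, 4, 0, 5, -1]

print(rank([v1, v2, v3, v4]))  # 2
```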

• 3.II.17C

Let $V$ be a vector space over $\mathbb{R}$. Let $\alpha: V \rightarrow V$ be a nilpotent endomorphism of $V$, i.e. $\alpha^{m}=0$ for some positive integer $m$. Prove that $\alpha$ can be represented by a strictly upper-triangular matrix (with zeros along the diagonal). [You may wish to consider the subspaces $\operatorname{ker}\left(\alpha^{j}\right)$ for $j=1, \ldots, m$.]

Show that if $\alpha$ is nilpotent, then $\alpha^{n}=0$ where $n$ is the dimension of $V$. Give an example of a $4 \times 4$ matrix $M$ such that $M^{4}=0$ but $M^{3} \neq 0$.

Let $A$ be a nilpotent matrix and $I$ the identity matrix. Prove that $I+A$ has all eigenvalues equal to 1 . Is the same true of $(I+A)(I+B)$ if $A$ and $B$ are nilpotent? Justify your answer.
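A candidate for the requested $4 \times 4$ example (a sketch; this is the single nilpotent Jordan block $J_{4}(0)$, one natural choice among many) can be verified directly:

```python
def matmul(X, Y):
    """Product of two square matrices of the same size."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

# Single nilpotent Jordan block J_4(0): ones on the superdiagonal, zeros elsewhere.
M = [[1 if j == i + 1 else 0 for j in range(4)] for i in range(4)]

M2 = matmul(M, M)   # shifts coordinates by two places
M3 = matmul(M2, M)  # shifts by three: a single 1 survives in the corner
M4 = matmul(M3, M)  # shifts by four: everything is annihilated
print(M3[0][3], M4 == [[0, 0, 0, 0]] * 4)  # 1 True
```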

• 4.I.6C

Find the Jordan normal form $J$ of the matrix

$M=\left(\begin{array}{rrrr} 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & -1 & 2 & 0 \\ 0 & 0 & 0 & 2 \end{array}\right)$

and determine both the characteristic and the minimal polynomial of $M$.

Find a basis of $\mathbb{C}^{4}$ such that $J$ (the Jordan normal form of $M$ ) is the matrix representing the endomorphism $M: \mathbb{C}^{4} \rightarrow \mathbb{C}^{4}$ in this basis. Give a change of basis matrix $P$ such that $P^{-1} M P=J$.
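A sketch of the eigenvalue bookkeeping (assuming no arithmetic slips; it is worth re-deriving): expanding $\operatorname{det}(M-x I)$ along the last column gives

$\chi_{M}(x)=(1-x)^{2}(2-x)^{2}$

One then finds that $M-I$ has rank $3$, so the eigenvalue $1$ carries a single $2 \times 2$ Jordan block, while $M-2 I$ has rank $2$, so the eigenvalue $2$ splits into two $1 \times 1$ blocks; this gives $J=J_{2}(1) \oplus J_{1}(2) \oplus J_{1}(2)$ and minimal polynomial $(x-1)^{2}(x-2)$.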

• 4.II.15C

Let $A$ and $B$ be $n \times n$ matrices over $\mathbb{C}$. Show that $A B$ and $B A$ have the same characteristic polynomial. [Hint: Look at $\operatorname{det}(C B C-x C)$ for $C=A+y I$, where $x$ and $y$ are scalar variables.]

Show by example that $A B$ and $B A$ need not have the same minimal polynomial.
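A standard pair for this part (a sketch; any example with $A B=0 \neq B A$ works) can be checked by direct multiplication:

```python
def matmul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

A = [[0, 1], [0, 0]]
B = [[1, 0], [0, 0]]

AB = matmul(A, B)
BA = matmul(B, A)
print(AB)              # [[0, 0], [0, 0]] -> AB = 0, minimal polynomial x
print(BA)              # [[0, 1], [0, 0]] -> BA nonzero
print(matmul(BA, BA))  # [[0, 0], [0, 0]] -> BA^2 = 0, minimal polynomial x^2
```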

Suppose that $A B$ is diagonalizable, and let $p(x)$ be its minimal polynomial. Show that the minimal polynomial of $B A$ must divide $x p(x)$. Using this and the first part of the question prove that $(A B)^{2}$ and $(B A)^{2}$ are conjugate.
