2.II.15E

(a) Let $A=\left(a_{i j}\right)$ be an $m \times n$ matrix and for each $k \leqslant n$ let $A_{k}$ be the $m \times k$ matrix formed by the first $k$ columns of $A$. Suppose that $n>m$. Explain why the nullity of $A$ is non-zero. Prove that if $k$ is minimal such that $A_{k}$ has non-zero nullity, then the nullity of $A_{k}$ is 1 .
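The claim in (a) can be checked numerically. Below is a minimal sketch (the matrix `A` is our own example, not part of the question) with $m = 3$, $n = 5$: the nullity of $A$ is $n - \operatorname{rank}(A) > 0$ by rank-nullity, and at the minimal $k$ for which $A_k$ has non-zero nullity, that nullity is exactly $1$.

```python
import numpy as np

# Example matrix with m = 3 rows and n = 5 > m columns (our own choice).
A = np.array([[1., 0., 0., 1., 2.],
              [0., 1., 0., 1., 3.],
              [0., 0., 1., 1., 4.]])

def nullity(M):
    """Nullity = number of columns minus rank (rank-nullity theorem)."""
    return M.shape[1] - np.linalg.matrix_rank(M)

print(nullity(A))                      # > 0, since n > m forces rank(A) <= m < n

# Minimal k such that A_k (first k columns) has non-zero nullity:
k = next(k for k in range(1, 6) if nullity(A[:, :k]) > 0)
print(k, nullity(A[:, :k]))            # the nullity at this minimal k is 1
```

Here the first three columns are independent, so the minimal $k$ is $4$, and $A_4$ has nullity $1$, consistent with (a).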

(b) Suppose that no column of $A$ consists entirely of zeros. Deduce from (a) that there exist scalars $b_{1}, \ldots, b_{k}$ (where $k$ is defined as in (a)) such that $\sum_{j=1}^{k} a_{i j} b_{j}=0$ for every $i \leqslant m$, but whenever $\lambda_{1}, \ldots, \lambda_{k}$ are distinct real numbers there is some $i \leqslant m$ such that $\sum_{j=1}^{k} a_{i j} \lambda_{j} b_{j} \neq 0$.
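Continuing the hypothetical example matrix from above, the sketch below illustrates (b): minimality of $k$ forces every entry of the null vector $b$ of $A_k$ to be non-zero, and rescaling its entries by distinct $\lambda_j$ produces a vector that is no longer in the kernel (otherwise $A_k$ would have nullity at least $2$).

```python
import numpy as np

# A_4 = first four columns of the example matrix from (a); it has nullity 1.
A4 = np.array([[1., 0., 0., 1.],
               [0., 1., 0., 1.],
               [0., 0., 1., 1.]])

# A kernel vector from the SVD: the last right-singular vector spans the
# one-dimensional null space of A4.
b = np.linalg.svd(A4)[2][-1]
print(np.allclose(A4 @ b, 0))          # True: b is a null vector
print(bool(np.all(np.abs(b) > 1e-9)))  # True: every b_j is non-zero

lam = np.array([1., 2., 3., 4.])       # any distinct lambda_1, ..., lambda_4
print(np.allclose(A4 @ (lam * b), 0))  # False: (lam_j * b_j) is not in the kernel
```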

(c) Now let $\mathbf{v}_{1}, \mathbf{v}_{2}, \ldots, \mathbf{v}_{m}$ and $\mathbf{w}_{1}, \mathbf{w}_{2}, \ldots, \mathbf{w}_{m}$ be bases for the same real $m$-dimensional vector space. Let $\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}$ be distinct real numbers such that for every $j$ the vectors $\mathbf{v}_{1}+\lambda_{j} \mathbf{w}_{1}, \ldots, \mathbf{v}_{m}+\lambda_{j} \mathbf{w}_{m}$ are linearly dependent. For each $j$, let $a_{1 j}, \ldots, a_{m j}$ be scalars, not all zero, such that $\sum_{i=1}^{m} a_{i j}\left(\mathbf{v}_{i}+\lambda_{j} \mathbf{w}_{i}\right)=\mathbf{0}$. By applying the result of (b) to the matrix $\left(a_{i j}\right)$, deduce that $n \leqslant m$.
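A concrete instance of the setup in (c) may help (the bases below are our own example, with $m = 2$). Writing the bases as the columns of matrices $V$ and $W$, the vectors $\mathbf{v}_i + \lambda_j \mathbf{w}_i$ are the columns of $V + \lambda_j W$, and the coefficients $a_{1j}, \ldots, a_{mj}$ form a null vector of that matrix:

```python
import numpy as np

V = np.array([[1., 0.],
              [0., 1.]])           # columns are v_1, v_2 (standard basis)
W = np.array([[0., 1.],
              [1., 0.]])           # columns are w_1, w_2 (swapped basis)

# Here det(V + lam*W) = 1 - lam^2, so dependence occurs only at lam = +-1,
# i.e. at n = 2 <= m = 2 values, as (c) predicts.
for lam in (1.0, -1.0):
    M = V + lam * W
    a = np.linalg.svd(M)[2][-1]    # null vector: sum_i a_ij (v_i + lam*w_i) = 0
    print(np.allclose(M @ a, 0))   # True at both roots

print(np.linalg.matrix_rank(V + 2.0 * W))  # 2: full rank at any other lambda
```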

(d) It follows that the vectors $\mathbf{v}_{1}+\lambda \mathbf{w}_{1}, \ldots, \mathbf{v}_{m}+\lambda \mathbf{w}_{m}$ are linearly dependent for at most $m$ values of $\lambda$. Explain briefly how this result can also be proved using determinants.
