Algebra and Geometry | Part IA, 2001

For a $2 \times 2$ matrix $A=\left(\begin{array}{ll}a & b \\ c & d\end{array}\right)$, prove that $A^{2}=0$ if and only if $a=-d$ and $bc=-a^{2}$. Prove that $A^{3}=0$ if and only if $A^{2}=0$.

[Hint: it is easy to check that $A^{2}-(a+d)A+(ad-bc)I=0$.]
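The hint is the Cayley–Hamilton identity for $2 \times 2$ matrices. One possible route through the problem (a sketch, not an official solution) runs as follows, writing $t = a+d$ for the trace and $\delta = ad-bc$ for the determinant:

```latex
% Sketch of one argument, using the hint in the form A^2 = tA - \delta I,
% where t = a + d and \delta = ad - bc.
If $A^{2}=0$, the hint gives $tA = \delta I$. If $A$ is a multiple of the
identity, say $A = \mu I$, then $A^{2} = \mu^{2} I = 0$ forces $\mu = 0$,
so $A = 0$ and both conditions hold trivially. Otherwise $A$ is not a
multiple of $I$, so $t = 0$ (that is, $a = -d$) and $\delta = 0$, whence
$bc = ad = -a^{2}$. Conversely, $a = -d$ and $bc = -a^{2}$ give
$t = \delta = 0$, so $A^{2} = tA - \delta I = 0$.

For the second part: $A^{3}=0$ gives $(\det A)^{3} = \det(A^{3}) = 0$, so
$\delta = 0$ and the hint reduces to $A^{2} = tA$. Then
$0 = A^{3} = tA^{2} = t^{2}A$, so either $A = 0$ or $t = 0$; in both cases
$A^{2} = tA = 0$.
```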
