Fix me or crush me

Can you make matrices which will fix one lucky vector and crush another to zero?

Problem

 

Imagine that you have a pair of vectors ${\bf F}$ and ${\bf Z}$:

$$

{\bf F}=\begin{pmatrix} 1\\ 1\\ 0 \end{pmatrix}\quad {\bf Z}=\begin{pmatrix} 0\\ 1\\ 1 \end{pmatrix}

$$

Can you construct an example of a matrix ${\bf M}$, other than the identity matrix, which leaves ${\bf F}$ fixed, in that ${\bf M}{\bf F}={\bf F}$? How many such matrices can you find? Which is the simplest? Which is the most complicated?

Can you construct an example of a matrix ${\bf N}$, other than the zero matrix, which crushes ${\bf Z}$ to the zero vector ${\bf 0}$, in that ${\bf N}{\bf Z}={\bf 0}$? How many such matrices can you find? Which is the simplest? Which is the most complicated?

Can you find a matrix which leaves ${\bf F}$ fixed and also crushes ${\bf Z}$?

Can you find any (many?) vectors fixed or crushed by the following matrices? Give examples or convincing arguments if no such vectors exist.

$$

\begin{pmatrix} 1&0&0\\ 0&1&0\\ 0&0&1 \end{pmatrix}, \: \begin{pmatrix} 1&2&3\\ 2&3&4\\ 3&4&5 \end{pmatrix}, \: \begin{pmatrix} \phantom{-}1&-2&\phantom{-}1\\ \phantom{-}1&\phantom{-}1&\phantom{-}0\\ -2&\phantom{-}1&-2 \end{pmatrix}

$$

 

You might find this Matrix Multiplication calculator helpful for testing out your ideas.
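If you would rather check candidates numerically than by hand, a few lines of code do much the same job. The sketch below is a minimal illustration, assuming Python with NumPy; the identity and zero matrices are placeholders only, since the problem asks you to find other examples.

```python
import numpy as np

F = np.array([1, 1, 0])
Z = np.array([0, 1, 1])

# Placeholder candidates: swap in your own matrices to test them.
M = np.eye(3)          # the identity certainly fixes F, but the problem asks for others
N = np.zeros((3, 3))   # the zero matrix certainly crushes Z, but the problem asks for others

print("M fixes F:  ", np.array_equal(M @ F, F))            # True when M F = F
print("N crushes Z:", np.array_equal(N @ Z, np.zeros(3)))  # True when N Z = 0
```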

There are more matrix problems in this feature.



NOTES AND BACKGROUND

Matrices are used to represent transformations of vectors, and the two are usually studied together as an inseparable pair. Although matrices and the rules of matrix multiplication might seem abstract on first encounter, they are in fact very natural and give precise meaning to notions of symmetry and transformation. This problem lets you explore the effect that matrix multiplication has on various vectors.

The eigenvectors of a matrix are those vectors whose direction is unchanged by the action of the matrix; the "fixed" vectors here are eigenvectors with eigenvalue $1$. More generally, an eigenvector ${\bf v}$ of ${\bf M}$ satisfies ${\bf M}{\bf v} = \lambda {\bf v}$, where $\lambda$ is the eigenvalue associated with ${\bf v}$.
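As an illustration (not a solution to the problem above), NumPy can list the eigenvalues and eigenvectors of any matrix; the diagonal matrix below is chosen purely so that the answers can be read off by eye.

```python
import numpy as np

# An illustrative diagonal matrix: its eigenvalues sit on the diagonal
# and its eigenvectors are the standard basis vectors.
A = np.diag([1.0, 2.0, 3.0])

eigenvalues, eigenvectors = np.linalg.eig(A)
for lam, v in zip(eigenvalues, eigenvectors.T):           # eigenvectors are the columns
    print(f"lambda = {lam:.1f}, v = {v}, A v = {A @ v}")  # A v should equal lambda * v
```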

The kernel of a matrix is the set of vectors which are crushed to the zero vector; equivalently, its non-zero members are exactly the eigenvectors with eigenvalue $0$.
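A kernel can likewise be computed numerically. This sketch assumes SciPy, whose `scipy.linalg.null_space` returns an orthonormal basis of the kernel; the singular matrix here is again only an illustration, not one of the matrices from the problem.

```python
import numpy as np
from scipy.linalg import null_space

# A singular matrix chosen for illustration: its second row is twice its first,
# so the rows are linearly dependent and the kernel is non-trivial.
B = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0]])

kernel_basis = null_space(B)   # columns form an orthonormal basis of the kernel
print(kernel_basis)
print(B @ kernel_basis)        # each basis vector is crushed (numerically) to zero
```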

Both concepts are of fundamental importance in higher-level algebra and its applications to science.