| Marcos |
How can we prove that (AB)^T = B^T A^T? (Here M^T denotes the transpose of the matrix M.) I'm looking for an (almost) elegant method that doesn't involve considering two arbitrarily sized matrices and the sums you get after multiplying them together blah blah... Marcos |
|||||||||||
| David Loeffler |
What's wrong with the orthodox proof? In index notation,

    ((AB)^T)_ij = (AB)_ji = Σ_k A_jk B_ki = Σ_k (B^T)_ik (A^T)_kj = (B^T A^T)_ij.

Of course this is exactly what you say that you want to avoid; but the problem is that the definition of matrix multiplication more or less requires you to do it that way; the proof is about as simple as it can be. David |
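For a numerical sanity check as well, here is a minimal sketch using NumPy (the shapes 3x4 and 4x2 are arbitrary choices):

    import numpy as np

    # Arbitrary rectangular matrices: A is 3x4 and B is 4x2, so AB is defined.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 4))
    B = rng.standard_normal((4, 2))

    # (AB)^T and B^T A^T should agree entry by entry.
    print(np.allclose((A @ B).T, B.T @ A.T))  # True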
|||||||||||
| Marcos |
Thanks, that's brilliant.... using the index notation it looks so much better than I had ever dreamt it would (I dunno, but I had a hunch it would look awful doing it that way - note to self: never trust intuition again!) Marcos |
|||||||||||
| Marcos |
One last question (well... for now anyway). If M is square and singular, can you always find a (non-zero) column vector v such that Mv = 0? (It's obvious that if Mv = 0 for some non-zero v then M is singular, of course.) Marcos |
|||||||||||
| David Loeffler |
Yes, you always can. This is somewhat more difficult to prove. Have you come across eigenvalues and eigenvectors? David |
|||||||||||
| Marcos |
Not really, although I've heard the name before... I think I vaguely recall something of the sort: if Mv = kv for some scalar k, a column vector v and a square matrix M, then v is an eigenvector and k is an eigenvalue of M... Of course, I may have completely made this up. Marcos P.S. How do you pronounce the terms? Is it a-ee-gen or e'-ee-gen vector/value? |
|||||||||||
| Julian Pulman |
My lecturers always pronounced it eye-gon vector/value. Your definition is correct, Marcos (with the proviso that v must be non-zero): if v is an eigenvector, then every point on the line through v is mapped back onto that line, scaled by the factor k. |
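A tiny concrete instance, checked in NumPy (the 2x2 matrix is chosen purely for illustration):

    import numpy as np

    # M stretches the direction (1, 1) by a factor of 3, so v = (1, 1) is an
    # eigenvector of M with eigenvalue k = 3.
    M = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    v = np.array([1.0, 1.0])
    print(M @ v)  # [3. 3.]
    print(3 * v)  # [3. 3.]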
|||||||||||
| David Loeffler |
Now, there's a theorem: for any n x n matrix we can find n eigenvalues (not necessarily all distinct, and possibly complex), and the determinant is the product of these eigenvalues. So if the determinant is zero, there is a zero eigenvalue, i.e. a non-zero v such that Mv = 0, which is exactly what we're looking for. David |
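To see the theorem at work, here is a sketch with a matrix that is singular by construction (the second row is twice the first):

    import numpy as np

    # Singular by construction: row 2 = 2 * row 1, so det(M) = 0.
    M = np.array([[1.0, 2.0],
                  [2.0, 4.0]])

    eigenvalues, eigenvectors = np.linalg.eig(M)
    print(np.linalg.det(M))      # 0.0 (up to rounding)
    print(np.prod(eigenvalues))  # also 0.0: the determinant is the product

    # The eigenvector belonging to the zero eigenvalue is a non-zero v with Mv = 0.
    v = eigenvectors[:, np.argmin(np.abs(eigenvalues))]
    print(M @ v)                 # [0. 0.] (up to rounding)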
|||||||||||
| Michael Doré |
But you don't need this theorem to prove the required result. (Or at least you certainly don't need anything involving the FTA.) Suppose Av ≠ 0 for every non-zero v. Then Ae_1, ..., Ae_n are linearly independent, where e_1, ..., e_n is a basis for R^n (if Σ c_j Ae_j = 0 then A(Σ c_j e_j) = 0, so Σ c_j e_j = 0 and every c_j = 0); hence span(Ae_1, ..., Ae_n) = R^n. So for each i you can find a linear combination of Ae_1, ..., Ae_n which equals e_i, and it's easy to construct an inverse matrix A^{-1} from here. |
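To make the last step concrete: finding the combination of the Ae_j's that equals e_i amounts to solving Ax = e_i, and those solutions assemble into the columns of A^{-1}. A sketch, where np.linalg.solve stands in for "find the linear combination":

    import numpy as np

    def inverse_column_by_column(A):
        """Column i of the inverse is the coefficient vector x with A x = e_i."""
        n = A.shape[0]
        cols = []
        for i in range(n):
            e_i = np.zeros(n)
            e_i[i] = 1.0
            cols.append(np.linalg.solve(A, e_i))  # coefficients expressing e_i
        return np.column_stack(cols)

    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])
    print(inverse_column_by_column(A) @ A)  # ~ the identity matrix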
|||||||||||
| David Loeffler |
Of course you are right, Michael, and that is a better proof from a linear algebra viewpoint. However I was going for a proof that proceeds directly from the definition of 'singular' as 'having determinant zero', rather than as 'non-invertible'. David |
|||||||||||
| Demetres Christofides |
There is also an alternative proof:

1) Find an invertible matrix B so that AB is just A with two rows i and j interchanged.
2) Find an invertible matrix B so that AB is just A with all elements of row i multiplied by k (k non-zero).
3) Find an invertible matrix B so that AB is just A with k times row i added to row j.
4) Find an algorithm to show that there are invertible matrices B_1, ..., B_m so that AB_1...B_m is an upper triangular matrix (i.e. all its non-zero elements lie on or above the main diagonal), with all the elements of the main diagonal being 1's and 0's in decreasing order (the 1's first). Call this matrix A'.
5) Now the determinant of A' is just the product of the elements on its main diagonal.
6) Show that det(PQ) = det(P) det(Q) and deduce that det(A') = 0 (assuming A is singular).
7) It follows that A' has a zero on its main diagonal. Find a non-zero vector v so that A'v = 0. Since the B_i's are invertible, u = B_1...B_m v is non-zero. Then Au = A'v = 0 as required.

This proof might be longer than any given in linear algebra textbooks, but it's the only one I can think of which does not use any previous results (e.g. basis/dimension, linear independence, e-vectors/e-values etc.). Of course the most elegant proof is the one Michael has described. Demetres |
|||||||||||
| Demetres Christofides |
Sorry, change rows to columns throughout: multiplying by B on the right acts on the columns of A, not the rows. Demetres |
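With that fix, here is a compact NumPy sketch of steps 1-4 and 7 combined (the function name and tolerance are my own choices; it records the column operations in E and reads off a null vector once a zero column appears):

    import numpy as np

    def null_vector_if_singular(A, tol=1e-10):
        """Column-reduce A, recording the operations in E, so the reduced
        matrix equals A @ E; a zero column there yields u with A u = 0."""
        A = A.astype(float).copy()
        n = A.shape[0]
        E = np.eye(n)                # accumulates the product B_1 ... B_m
        col = 0
        for row in range(n):
            pivots = [j for j in range(col, n) if abs(A[row, j]) > tol]
            if not pivots:
                continue             # nothing to clear in this row
            p = pivots[0]
            A[:, [col, p]] = A[:, [p, col]]  # interchange two columns (step 1)
            E[:, [col, p]] = E[:, [p, col]]
            for j in range(col + 1, n):      # add a multiple of a column (step 3)
                f = A[row, j] / A[row, col]
                A[:, j] -= f * A[:, col]
                E[:, j] -= f * E[:, col]
            col += 1
        if col == n:
            return None              # no zero column appeared: A is invertible
        return E[:, col]             # u = E e_col is non-zero and A u = 0

    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])
    u = null_vector_if_singular(A)
    print(u, A @ u)                  # u = [-2. 1.], A u = [0. 0.]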
|||||||||||
| Marcos |
Thanks everyone... Michael, I don't quite understand your final sentence. Could you (or anyone else) please explain that point a bit more? Thanks, Marcos |
|||||||||||
| Kerwin Hui |
Marcos, we let e_i be the standard basis for R^n (or C^n, the space of all complex n-tuples), i.e. e_i is the n-tuple (0, 0, ..., 0, 1, 0, ..., 0) with the 1 in the i-th place (it should be treated as a column vector in what follows). From span{Ae_1, ..., Ae_n} = R^n (or C^n), we get that any n-vector can be expressed as a linear combination of the Ae_j's. In particular we can express each e_i as a linear combination of the Ae_j's, i.e.

    e_i = Σ_j b_ij Ae_j for all i, where the b_ij are in R or C accordingly,

or equivalently,

    e_i = A(Σ_j b_ij e_j),

and it is immediately obvious what A^{-1} e_i would be, namely Σ_j b_ij e_j. But A^{-1} e_i is the i-th column of the matrix A^{-1}, so we know what A^{-1} is. In fact, if you know the definition of the determinant (as a volume form), it is not difficult to prove that the determinant is zero if and only if the rows (or columns) are linearly dependent, whence you immediately recover a (non-zero) vector v with Av = 0, i.e. an eigenvector with eigenvalue 0. Kerwin |
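The observation that A^{-1} e_i is the i-th column of A^{-1} is just what multiplying a matrix by a standard basis vector does; a quick check on an invertible matrix chosen for illustration:

    import numpy as np

    A_inv = np.linalg.inv(np.array([[2.0, 1.0],
                                    [1.0, 1.0]]))
    for i in range(2):
        e_i = np.eye(2)[:, i]
        print(A_inv @ e_i, A_inv[:, i])  # the two agree: column i of A^{-1}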
|||||||||||
| Marcos |
Thanks, Marcos |