Symmetric Matrices


By George Walker on Friday, January 11, 2002 - 01:48 pm:

I covered this topic in a maths class today and my teacher didn't prove the following...

How can we prove that an n×n symmetric matrix has n REAL eigenvalues?

Thanks
George

ps. If possible don't use notation that I won't have met, since I have only just done matrices and I haven't done much external research on them.


By William Astle on Friday, January 11, 2002 - 03:05 pm:
You probably know this but just in case:

A matrix M is symmetric if it satisfies M^T = M, where T denotes the transpose operation (swapping rows with the respective columns and columns with the respective rows). For the statement to be true you actually need the additional condition that all the entries of the matrix are real numbers. I use u^* for the complex conjugate of a vector u, that is, the vector whose components are the conjugates of the respective components of u; the additional condition is then written M^* = M. (This type of matrix, real and symmetric, belongs to a class of operators called Hermitian operators, which are defined by the condition M^{*T} = M.)

Suppose that Mv = λv (that is, v is an eigenvector of the matrix M with eigenvalue λ). Since v is not the zero vector we have |v|^2 = v^{*T} v > 0 (where |v| is the length, or norm, of the vector v):

λ = λ v^{*T} v / (v^{*T} v) = v^{*T} M v / (v^{*T} v) = v^{*T} M^{*T} v / (v^{*T} v) = (Mv)^{*T} v / (v^{*T} v) = (λv)^{*T} v / (v^{*T} v) = λ^* (v^{*T} v) / (v^{*T} v) = λ^*

So if M has an eigenvalue, it is real. It is not necessarily true that M (when it is n×n) has n distinct eigenvalues (consider the identity matrix, which has only the eigenvalue 1). It is true that M has at least one eigenvalue and no more than n. To prove this you need to assume (because proving it is rather tricky) the fundamental theorem of algebra, which states that every non-constant polynomial has a root in the set of complex numbers C. By induction you can show easily enough that this implies that a polynomial of degree (largest power) n has no more than n distinct roots in C.
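A quick numerical sketch of this conclusion, assuming NumPy is available (the random seed and matrix size here are arbitrary choices, not from the thread):

```python
import numpy as np

# Sketch: the eigenvalues of a real symmetric matrix come out real.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A + A.T                         # M is real and symmetric: M^T = M

eigenvalues = np.linalg.eigvals(M)  # general (non-symmetric) eigenvalue solver
# For a real symmetric input every eigenvalue is real, so the
# imaginary parts are all (numerically) zero.
print(np.max(np.abs(np.imag(eigenvalues))))
```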

The function

χ(x)=det(M-xI)

(where I is the n×n identity matrix) has roots which are precisely the eigenvalues of M: det(M - xI) = 0 exactly when M - xI is not invertible, which happens if and only if there exists a nonzero vector v such that (M - xI)v = 0, i.e. such that Mv = xv. (This might need a little more explaining.)

Since χ(x) is a polynomial of degree n, the result follows.
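The relationship between χ(x) and the eigenvalues can be illustrated numerically; a sketch assuming NumPy (the example matrix is my choice):

```python
import numpy as np

# Sketch: the roots of the characteristic polynomial det(M - xI)
# coincide with the eigenvalues of M.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # real symmetric

coeffs = np.poly(M)                     # coefficients of det(xI - M),
                                        # which has the same roots as det(M - xI)
poly_roots = np.sort(np.roots(coeffs))
eigenvalues = np.sort(np.linalg.eigvals(M))
print(poly_roots)    # the roots 1 and 3
print(eigenvalues)   # the same values, computed as eigenvalues
```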



By George Walker on Friday, January 11, 2002 - 08:23 pm:

I am not sure about one thing in the proof:
v is an eigenvector, that is, an n×1 matrix. Therefore v^{*T} is a matrix with one row.
Thus v^{*T} v IS a 1×1 matrix, BUT this is still a matrix, NOT a number, and you are trying to divide by this matrix in the first line, which I don't think is very nice and I'm not sure you're allowed to do it....

correct me if I'm wrong anyone

George


By Kerwin Hui on Friday, January 11, 2002 - 08:52 pm:

The answer is: we consider a 1×1 matrix the same as its single entry. So, for example, we can define the dot product of two column vectors a, b as

a·b = a^{*T} b

and this will recover the usual result v·v = |v|^2.
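A sketch of this identification in NumPy, with column vectors stored as n×1 arrays (the example values are my choice):

```python
import numpy as np

# Sketch: a^{*T} b is a 1x1 matrix whose single entry is the usual
# dot product, so identifying the matrix with its entry is harmless.
a = np.array([[1.0], [2.0], [3.0]])   # column vectors, i.e. 3x1 matrices
b = np.array([[4.0], [5.0], [6.0]])

product = a.conj().T @ b        # a 1x1 matrix
print(product.shape)            # (1, 1)
print(product.item())           # 32.0, the scalar a.b
print((a.conj().T @ a).item())  # 14.0, which is |a|^2
```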

Kerwin


By George Walker on Friday, January 11, 2002 - 10:54 pm:

How about saying, since (v^{*T} v) is a matrix, the following:

λ

= λ(v^{*T} v)(v^{*T} v)^{-1}

= v^{*T} M v (v^{*T} v)^{-1}

= v^{*T} M^{*T} v (v^{*T} v)^{-1}

= (Mv)^{*T} v (v^{*T} v)^{-1}

= (λv)^{*T} v (v^{*T} v)^{-1}

= λ^*(v^{*T} v)(v^{*T} v)^{-1}

= λ^*
I think I prefer this, although I might not be right.
Isn't this more rigorous, since we are dealing with matrices?

George


By Kerwin Hui on Friday, January 11, 2002 - 11:24 pm:
George,

Your argument is identical to William's. Since the map x ↦ (x), sending a scalar x to the 1×1 matrix (x), is a field isomorphism, we can consider (x) the same as x.

If you are still unhappy about this, consider:

λ v^{*T} v = v^{*T} (λv) = v^{*T} M v = v^{*T} M^{*T} v = (Mv)^{*T} v = (λv)^{*T} v = λ^* v^{*T} v

i.e. (λ - λ^*) v^{*T} v = 0

Since v^{*T} v is nonzero, we have λ = λ^*.
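A numerical sketch of this last argument, assuming NumPy (the matrix and the eigensolver call are my choices):

```python
import numpy as np

# Sketch: for an eigenpair (lam, v) of a real symmetric M, the two
# evaluations of v^{*T} M v agree, which forces lam to equal lam^*.
M = np.array([[0.0, 1.0],
              [1.0, 0.0]])             # real symmetric
lam, V = np.linalg.eigh(M)             # symmetric eigenvalue solver
v = V[:, [0]]                          # eigenvector for lam[0], as a 2x1 matrix

lhs = (lam[0] * v.conj().T @ v).item()  # lam * v^{*T} v
rhs = (v.conj().T @ M @ v).item()       # v^{*T} M v
print(lhs, rhs)   # equal, and both real
```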

Kerwin