Symmetric Matrices
By George Walker on Friday, January 11,
2002 - 01:48 pm:
I covered this topic in a maths class today and my teacher
didn't prove the following question...
How can we prove that an n x n symmetric matrix has n REAL
eigenvalues?
Thanks
George
ps. If possible don't use notation that I won't have met, since I
have only just done matrices and I haven't done much external
research on them.
By William Astle on Friday, January 11,
2002 - 03:05 pm:
You probably know this but just in case: a matrix M is symmetric if it satisfies M^T = M, where T denotes the
transpose operation (swapping rows with the respective columns and columns with
the respective rows). For the statement made to be true you actually need the
additional condition that all the entries in the matrix are real numbers. I
use u* for the complex conjugate of the vector u, that is, the vector whose
components are the conjugates of the respective components of u. We write the
additional condition M* = M. (This type of matrix (real and symmetric) belongs
to a class of operators called Hermitian operators, which are defined by the
condition M*^T = M.)
Suppose that M v = λv (that is, v is an eigenvector of the matrix M with
eigenvalue λ). Then, since v is not the zero vector, we have
|v|^2 = v*^T v > 0 (where |v| is the length (or norm) of the vector v):
λ = λ v*^T v/(v*^T v) = v*^T M v/(v*^T v) = v*^T M*^T v/(v*^T v) = (M v)*^T v/(v*^T v) = (λv)*^T v/(v*^T v) = λ* (v*^T v)/(v*^T v) = λ*
So if M has an eigenvalue, it is real. It is not necessarily true that M
(when it is n x n) has n eigenvalues (consider the identity matrix, which
has only one eigenvalue). It is true that M has at least one eigenvalue and
that it has no more than n. To prove this you need to assume (because proving
it is rather tricky) the fundamental theorem of algebra, which states that
every nonconstant polynomial has a root in the set of complex numbers. By
induction you can show easily enough that this implies that a polynomial of
degree (largest power) n has no more than n distinct roots in C.
The function
c(x)=det(M-x I)
(where I is the (n x n) identity matrix) has roots which are precisely
the eigenvalues of M: det(M - x I) = 0 if and only if M - x I is not invertible,
which happens if and only if there exists a nonzero vector v such that
(M - x I)v = 0, i.e. such that M v = x v. (This might need a little more explaining.)
Since c(x) is a polynomial of degree n, the result follows.
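As a quick numerical sketch of the argument above, here is the 2x2 case worked out in Python. For a real symmetric M = [[a, b], [b, c]], the characteristic polynomial c(x) = det(M - x I) is x^2 - (a + c)x + (ac - b^2), whose discriminant (a - c)^2 + 4b^2 is never negative, so both roots (the eigenvalues) are real. The particular entries below are just example values.

```python
import math

# Example real symmetric matrix M = [[a, b], [b, c]]; these numbers
# are arbitrary, chosen only for illustration.
a, b, c = 2.0, -3.0, 5.0

trace = a + c            # coefficient of -x in c(x)
det = a * c - b * b      # constant term of c(x)
disc = trace * trace - 4.0 * det

# disc = (a - c)^2 + 4 b^2, which can never be negative,
# so the quadratic formula gives two real roots.
assert disc >= 0.0

eig1 = (trace + math.sqrt(disc)) / 2.0
eig2 = (trace - math.sqrt(disc)) / 2.0
print(eig1, eig2)        # the two real eigenvalues

# Sanity check: each root really satisfies det(M - x I) = 0.
for x in (eig1, eig2):
    assert abs((a - x) * (c - x) - b * b) < 1e-9
```

This only checks the 2x2 case, of course; the general claim needs the conjugate-transpose argument above.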
By George Walker on Friday, January 11,
2002 - 08:23 pm:
I am not sure about one thing in the proof:
v is an eigenvector, that is, an n x 1 matrix. Therefore
v*^T is a matrix with one row.
Thus v*^T v IS a 1x1 matrix BUT this is still a matrix,
NOT a number, and you are trying to divide by this matrix in the
first line, which I don't think is very nice and I'm not sure you're
allowed to do it....
correct me if I'm wrong anyone
George
By Kerwin Hui on Friday, January 11, 2002
- 08:52 pm:
The answer is: we consider a 1x1 matrix
the same as its entry. So, for example, we can define the dot
product of two column vectors a, b as
a.b = a*^T b
and this will recover the usual result v.v = |v|^2
.
Kerwin
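Kerwin's convention can be sketched in a few lines of Python: treat the single entry of the 1x1 matrix a*^T b as a scalar. Vectors are plain lists of complex numbers here, and the example vector is made up for illustration.

```python
# a*^T b: conjugate the entries of a, multiply componentwise, sum.
# The single entry of the resulting 1x1 matrix is read off as a scalar.
def dot(a, b):
    return sum(x.conjugate() * y for x, y in zip(a, b))

# Example vector with complex entries (arbitrary, for illustration).
v = [1 + 2j, 3 - 1j]

# v.v recovers |v|^2 = (1 + 4) + (9 + 1) = 15, and it comes out real.
print(dot(v, v))
```

Note that without the conjugate, dot(v, v) for a complex vector could be zero or non-real, which is exactly why the proof uses v*^T v rather than v^T v.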
By George Walker on Friday, January 11,
2002 - 10:54 pm:
How about saying, since (v*^T v) is a matrix, the
following:
λ
= λ (v*^T v)(v*^T v)^-1 = v*^T M v (v*^T v)^-1 = v*^T M*^T v (v*^T v)^-1 = (M v)*^T v (v*^T v)^-1 = (λv)*^T v (v*^T v)^-1 = λ* (v*^T v)(v*^T v)^-1 = λ*
I think I prefer this although I might not be right.
Isn't this more rigorous since we are dealing with
matrices?
George
By Kerwin Hui on Friday, January 11,
2002 - 11:24 pm:
George,
Your argument is identical to William's. Since the map (x) → x is a field
isomorphism, we can consider (x) the same as x, where x is a scalar.
If you are still unhappy about this, consider:
λ v*^T v = v*^T (λv) = v*^T M v = v*^T M*^T v = (M v)*^T v = (λv)*^T v = λ* v*^T v, i.e. (λ - λ*) v*^T v = 0. Since v*^T v is nonzero, we have λ = λ*.
Kerwin