Symmetric Matrices
By George Walker on Friday, January 11, 2002 - 01:48 pm:
I covered this topic in a maths class today and my teacher didn't prove the following...
How can we prove that an $n \times n$ symmetric matrix $A$ has $n$ REAL eigenvalues?
Thanks
George
ps. If possible don't use notation that I won't have met, since I have only just done matrices and I haven't done much external research on them.
By William Astle on Friday, January 11, 2002 - 03:05 pm:
You probably know this but just in case: a matrix $A$ is symmetric if it satisfies $A^T = A$, where $A^T$ denotes the transpose operation (swapping rows with respective columns and columns with respective rows). For the statement to be true you actually need the additional condition that all the entries in the matrix are real numbers. I use $\bar{v}$ for the complex conjugate of the vector $v$, that is, the vector with components conjugate to the respective components of $v$. We write the additional condition $\bar{A} = A$. (This type of matrix (real and symmetric) is part of a set of operators called Hermitian operators, which are defined by the condition $\bar{A}^T = A$.)
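As a quick check of these conditions, here is a small numpy sketch (not from the thread; the matrix below is just a made-up example):

```python
import numpy as np

# A made-up real symmetric matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 4.0],
              [0.0, 4.0, 5.0]])

print(np.array_equal(A, A.T))         # True: symmetric, A^T = A
print(np.array_equal(A, A.conj()))    # True: real entries, conj(A) = A
print(np.array_equal(A, A.conj().T))  # True: Hermitian, conj(A)^T = A
```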
Suppose that $Av = \lambda v$ (that is, $v$ is an eigenvector of the matrix $A$ with eigenvalue $\lambda$). Then, since $v$ is not the zero vector, $\bar{v}^T v = |v|^2 \neq 0$ (where $|v|$ is the length (or norm) of the vector $v$), and:

$$\lambda = \frac{\bar{v}^T A v}{\bar{v}^T v} = \frac{(A^T \bar{v})^T v}{\bar{v}^T v} = \frac{(\overline{Av})^T v}{\bar{v}^T v} = \frac{(\overline{\lambda v})^T v}{\bar{v}^T v} = \frac{\bar{\lambda}\,\bar{v}^T v}{\bar{v}^T v} = \bar{\lambda},$$

using $A^T = A$ and $\bar{A} = A$ in the middle steps.
So if $A$ has an eigenvalue, it is real. It is not necessarily true that $A$ (when it is $n \times n$) has $n$ eigenvalues (consider the identity matrix, which has only one eigenvalue). It is true that $A$ has at least one eigenvalue and that it has no more than $n$; to prove this you need to assume (because proving it is rather tricky) the fundamental theorem of algebra, which states that every non-constant polynomial has a root in the set of complex numbers. By induction you can show easily enough that this implies that a polynomial of degree (largest power) $n$ has no more than $n$ distinct roots in $\mathbb{C}$.
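To make the identity-matrix example concrete, here is a short sketch (not from the thread, assuming numpy): the $3 \times 3$ identity has characteristic polynomial $(t - 1)^3$, a degree-3 polynomial with only one distinct root.

```python
import numpy as np

I3 = np.eye(3)

# Coefficients of the characteristic polynomial of I3: (t - 1)^3 = t^3 - 3t^2 + 3t - 1.
coeffs = np.poly(I3)
print(coeffs)           # [ 1. -3.  3. -1.]

# Degree 3, but only one distinct root, t = 1 (multiplicity 3).
# Repeated roots are numerically delicate, so expect values very close to 1.
print(np.roots(coeffs))
```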
The function $\det(A - \lambda I)$ (where $I$ is the $n \times n$ identity matrix) has roots which are precisely the eigenvalues of $A$: $\det(A - \lambda I) = 0$ precisely when $A - \lambda I$ is not invertible, which happens and only happens when there exists a nonzero vector $v$ such that $(A - \lambda I)v = 0$, that is, such that $Av = \lambda v$ (this might need a little more explaining). Since $\det(A - \lambda I)$ is a polynomial of degree $n$ in $\lambda$, the result follows.
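Putting the two steps together numerically (a sketch, not from the thread, assuming numpy and a random symmetric example): the roots of $\det(A - \lambda I)$ agree with the eigenvalues of $A$, and they all come out real.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T                        # real and symmetric by construction

eigs = np.linalg.eigvals(A)        # eigenvalues computed directly
print(np.max(np.abs(eigs.imag)))   # 0 (or numerically tiny): all eigenvalues are real

roots = np.roots(np.poly(A))       # roots of the characteristic polynomial
print(np.sort(eigs.real))
print(np.sort(roots.real))         # same values, up to rounding
```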
By George Walker on Friday, January 11, 2002 - 08:23 pm:
I am not sure about one thing in the proof: $v$ is an eigenvector, that is, an $n \times 1$ matrix. Therefore $\bar{v}^T$ is a matrix with one row. Thus $\bar{v}^T v$ IS a $1 \times 1$ matrix, BUT this is still a matrix, NOT a number, and you are trying to divide by this matrix in the first line, which I don't think is very nice and I'm not sure you're allowed to do it....
correct me if I'm wrong anyone
George
By Kerwin Hui on Friday, January 11, 2002 - 08:52 pm:
The answer is: we consider a $1 \times 1$ matrix the same as its entry. So, for example, we can define the dot product of two column vectors $a$, $b$ as $a \cdot b = \bar{a}^T b$, and this will recover the usual result $v \cdot v = |v|^2$.
Kerwin
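Kerwin's identification can be seen in a small sketch (not from the thread, assuming numpy column vectors): the $1 \times 1$ matrix $\bar{v}^T v$ has the scalar $|v|^2$ as its single entry.

```python
import numpy as np

# A made-up n x 1 column vector with complex entries.
v = np.array([[1.0 + 2.0j],
              [3.0],
              [-1.0j]])

m = v.conj().T @ v              # conjugate transpose times v: a 1 x 1 matrix
print(m.shape)                  # (1, 1)

# Identifying the 1 x 1 matrix with its single entry recovers |v|^2.
print(m.item().real)            # 15.0
print(np.linalg.norm(v) ** 2)   # 15.0, up to rounding
```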
By George Walker on Friday, January 11, 2002 - 10:54 pm:
How about saying, since $(\bar{v}^T v)$ is a matrix, the following:

$$\lambda\,(\bar{v}^T v) = \bar{v}^T(\lambda v) = \bar{v}^T A v = (\overline{Av})^T v = (\overline{\lambda v})^T v = \bar{\lambda}\,(\bar{v}^T v)$$

I think I prefer this although I might not be right. Isn't this more rigorous since we are dealing with matrices?
George
By Kerwin Hui on Friday, January 11, 2002 - 11:24 pm:
George,
Your argument is identical to William's. Since the map taking a scalar $c$ to the $1 \times 1$ matrix $(c)$ is a field isomorphism, we can consider $\bar{v}^T v$ the same as $|v|^2$, where $|v|^2$ is a scalar.
If you are still unhappy about this, consider:

$$\lambda\,\bar{v}^T v = \bar{v}^T A v = \bar{\lambda}\,\bar{v}^T v,$$

i.e. $(\lambda - \bar{\lambda})\,\bar{v}^T v = 0$; since $\bar{v}^T v = |v|^2$ is nonzero, we have $\lambda = \bar{\lambda}$.
Kerwin
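To close the loop, a sketch (not from the thread, assuming numpy) that checks the three-way equality $\lambda\,\bar{v}^T v = \bar{v}^T A v = \bar{\lambda}\,\bar{v}^T v$ on an actual eigenpair of a real symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
A = B + B.T                             # real symmetric

w, V = np.linalg.eigh(A)                # eigh assumes a symmetric/Hermitian matrix
lam, v = w[0], V[:, [0]]                # one eigenpair, v as a 3 x 1 column

lhs = lam * (v.conj().T @ v).item()
mid = (v.conj().T @ A @ v).item()
rhs = np.conj(lam) * (v.conj().T @ v).item()

# All three agree up to rounding, which forces lam == conj(lam), i.e. lam is real.
print(lhs, mid, rhs)
```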