As you know, even for a square matrix $A$, the properties of $A$ might not be ideal. There might not be a basis of linearly independent eigenvectors, or the eigenvalues and eigenvectors might be complex, or perhaps the eigenspaces of $A$ are not perpendicular.
In contrast, symmetric matrices are guaranteed to have the best
possible properties:
Theorem (``spectral theorem''). If $A$ is a real symmetric matrix, then
(1) the eigenvalues of $A$ are real numbers,
(2) the eigenspaces of $A$ are perpendicular to one another,
(3) $A$ can be diagonalized using a rotation matrix; in other words, $A = RDR^{-1}$ for some rotation matrix $R$ (and some diagonal matrix $D$).
The name ``spectral theorem'' comes from advanced applications in physics where the eigenvalues determine frequencies, like a color spectrum of light.
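All three conclusions of the theorem can be checked numerically. The sketch below uses an illustrative symmetric matrix chosen here (it is not a matrix from these notes) and NumPy's `eigh` routine for symmetric matrices:

```python
import numpy as np

# An illustrative real symmetric matrix (an arbitrary choice for this demo).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is designed for symmetric matrices: it returns real eigenvalues
# (in ascending order) and a matrix R whose columns are eigenvectors.
eigenvalues, R = np.linalg.eigh(A)

# (1) The eigenvalues are real numbers.
print(eigenvalues)

# (2) The eigenvectors are perpendicular (and unit length): R^T R = I.
print(np.allclose(R.T @ R, np.eye(2)))   # True

# (3) A = R D R^{-1}; since R has orthonormal columns, R^{-1} = R^T.
D = np.diag(eigenvalues)
print(np.allclose(A, R @ D @ R.T))       # True
```

Note that `eigh` only guarantees orthonormal columns; its determinant may be $-1$ rather than $+1$, in which case a column's sign must be flipped to get a true rotation, as in the example below.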
Problem DD-1. Explain how (3) implies (1) and (2). (Method: In (3), does $D$ have real entries? Are the eigenvectors perpendicular?)
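The reasoning the problem asks for can be illustrated numerically: start from (3) by building $A = RDR^{-1}$ out of a real diagonal $D$ and a rotation $R$, and observe that the eigenvalues of $A$ are the (real) diagonal entries of $D$, while its eigenvectors, the columns of $R$, are perpendicular. The rotation angle and diagonal entries below are arbitrary choices:

```python
import numpy as np

# An arbitrary rotation matrix R (rotation through 30 degrees)
# and an arbitrary real diagonal matrix D -- both chosen for illustration.
t = np.pi / 6
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
D = np.diag([5.0, -2.0])

# Since R is a rotation, R^{-1} = R^T, so A = R D R^{-1} = R D R^T.
A = R @ D @ R.T

# A is symmetric...
print(np.allclose(A, A.T))                 # True

# ...(1) its eigenvalues are the real entries of D...
print(sorted(np.linalg.eigvals(A).real))   # approximately [-2.0, 5.0]

# ...(2) and its eigenvectors, the columns of R, are perpendicular.
print(np.isclose(R[:, 0] @ R[:, 1], 0.0))  # True
```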
The nice thing in applications is that there is not much to do to find $R$; as the theorem says, the eigenspaces are automatically perpendicular to one another.
Example. Suppose that earlier we diagonalized a symmetric matrix $A$, finding a diagonal matrix $D$ and an eigenvector matrix $E$ with $A = EDE^{-1}$. But $E$ need not be a rotation matrix, since it can violate both (i) and (ii) of the definition. The columns are perpendicular as expected, though.
To fix (i), find the length of each column and divide by it to make all column lengths be 1.
Now check the determinant. Since the columns are now orthonormal, it is $\pm 1$; if it's $-1$, change the sign of a column, say the second.
The resulting matrix $R$ does fit the definition of a rotation. Notice that $R^{-1} = R^{T}$.
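The two fixes in the example can be carried out mechanically. The eigenvector matrix $E$ below is a made-up stand-in with perpendicular but non-unit columns (it is not the matrix from the notes); the steps are exactly the ones described: divide each column by its length, then flip a column's sign if the determinant comes out to $-1$.

```python
import numpy as np

# A stand-in eigenvector matrix E: its columns are perpendicular
# but do not have length 1 (the matrix itself is an arbitrary choice).
E = np.array([[1.0,  2.0],
              [2.0, -1.0]])

# Fix (i): divide each column by its length so all column lengths are 1.
R = E / np.linalg.norm(E, axis=0)

# Fix (ii): the determinant of an orthonormal matrix is +1 or -1;
# if it is -1, change the sign of a column, say the second.
if np.linalg.det(R) < 0:
    R[:, 1] *= -1

# R is now a rotation matrix, and R^{-1} = R^T.
print(np.isclose(np.linalg.det(R), 1.0))   # True
print(np.allclose(np.linalg.inv(R), R.T))  # True
```

For this particular $E$ the normalized determinant is $-1$, so the sign flip actually fires, just as in the example's narrative.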
Problem DD-2. Diagonalize a real symmetric matrix with a rotation matrix.