
4. Proof of the Spectral Theorem

We'll concentrate on the $2 \times 2$ case.

Problem DD-10. For a general symmetric matrix $\left[\begin{array}{cc}a&b\\ b&d\end{array}\right]$,

(a) find the eigenvalues in terms of $a, b, d$. (You will need to use the quadratic formula to get the roots of the characteristic polynomial.) Simplify algebraically to show that the discriminant (the part inside the square root) is the sum of two squares.

(b) How do you know from (a) that the eigenvalues are real?

(c) Show that if the two eigenvalues are equal, then the matrix is a scalar matrix. (Method: Notice that the eigenvalues are equal exactly when the discriminant is zero.)
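
In case it helps to check your algebra, here is a sketch of the computation (one way to organize it, using the same notation as the problem): the characteristic polynomial is $\det\left[\begin{array}{cc}a-\lambda&b\\ b&d-\lambda\end{array}\right] = \lambda^2 - (a+d)\lambda + (ad - b^2)$, so the quadratic formula gives $\lambda = \frac{1}{2}\left((a+d) \pm \sqrt{(a+d)^2 - 4(ad - b^2)}\right)$. Expanding, the discriminant is $(a+d)^2 - 4(ad - b^2) = (a-d)^2 + (2b)^2$, a sum of two squares. A sum of two squares is never negative, so the square root is real and both eigenvalues are real; it is zero only when $a = d$ and $b = 0$, in which case the matrix is $aI$, a scalar matrix.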



Now for perpendicularity of eigenvectors:

Proposition. If $A$ is a symmetric matrix, then two eigenvectors belonging to distinct eigenvalues are perpendicular.



Problem DD-11. Show that if $A$ is symmetric, then for any two vectors $\mathbf{u}, \mathbf{v}$ we have $\langle \mathbf{u}, A\mathbf{v} \rangle = \langle A\mathbf{u}, \mathbf{v} \rangle$.

(Method: If we regard $\mathbf{u}, \mathbf{v}$ as column vectors, then $\langle \mathbf{u}, \mathbf{v} \rangle = \mathbf{u}^{t}\mathbf{v}$. What happens when you put $A$ in, in either location?)
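
A sketch of how the method plays out, assuming only that $A^{t} = A$: $\langle \mathbf{u}, A\mathbf{v} \rangle = \mathbf{u}^{t}(A\mathbf{v}) = (A^{t}\mathbf{u})^{t}\mathbf{v} = (A\mathbf{u})^{t}\mathbf{v} = \langle A\mathbf{u}, \mathbf{v} \rangle$, where the second equality uses the rule $(BC)^{t} = C^{t}B^{t}$ (so $\mathbf{u}^{t}A = (A^{t}\mathbf{u})^{t}$) and the third uses the symmetry $A^{t} = A$.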



Now to prove the Proposition:

If $A\mathbf{v}_1 = \lambda_1 \mathbf{v}_1$ and $A\mathbf{v}_2 = \lambda_2 \mathbf{v}_2$ and $\lambda_1 \neq \lambda_2$, then start from

$\langle A\mathbf{v}_1, \mathbf{v}_2 \rangle = \langle \mathbf{v}_1, A\mathbf{v}_2 \rangle$. This becomes

$\langle \lambda_1 \mathbf{v}_1, \mathbf{v}_2 \rangle = \langle \mathbf{v}_1, \lambda_2 \mathbf{v}_2 \rangle$. Since the dot product is linear in each entry, we get

$\lambda_1 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle = \lambda_2 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle$. Then

$(\lambda_1 - \lambda_2)\langle \mathbf{v}_1, \mathbf{v}_2 \rangle = 0$.

Since $\lambda_1 - \lambda_2 \neq 0$, we must have $\langle \mathbf{v}_1, \mathbf{v}_2 \rangle = 0$. Therefore $\mathbf{v}_1 \perp \mathbf{v}_2$.
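
For a concrete check (an example added here for illustration, not part of the original problem set): take $A = \left[\begin{array}{cc}2&1\\ 1&2\end{array}\right]$. The characteristic polynomial is $\lambda^2 - 4\lambda + 3 = (\lambda - 3)(\lambda - 1)$, so $\lambda_1 = 3$ with eigenvector $\mathbf{v}_1 = (1, 1)$ and $\lambda_2 = 1$ with eigenvector $\mathbf{v}_2 = (1, -1)$, and indeed $\langle \mathbf{v}_1, \mathbf{v}_2 \rangle = (1)(1) + (1)(-1) = 0$, so $\mathbf{v}_1 \perp \mathbf{v}_2$.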



How does all this prove the Spectral Theorem for a $2 \times 2$ matrix $A$? You know from Problem DD-10 that the eigenvalues are real. Also, from the same problem you know that if the eigenvalues are equal then $A$ is a scalar matrix, hence already diagonal, so you can diagonalize it using $I$, which is a rotation. On the other hand, if the eigenvalues are different, then the Proposition shows that the eigenvectors are perpendicular. Now you can make the matrix $P = [\mathbf{v}_1 \mid \mathbf{v}_2]$, scale the columns to make them length 1, and then negate $\mathbf{v}_2$ if necessary to make the determinant equal to 1, so that $P$ is a rotation.
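
Continuing the illustrative example above: for $A = \left[\begin{array}{cc}2&1\\ 1&2\end{array}\right]$, scaling $\mathbf{v}_1 = (1,1)$ and $\mathbf{v}_2 = (1,-1)$ to length 1 gives $P = \frac{1}{\sqrt{2}}\left[\begin{array}{cc}1&1\\ 1&-1\end{array}\right]$, which has determinant $-1$. Negating the second column gives $P = \frac{1}{\sqrt{2}}\left[\begin{array}{cc}1&-1\\ 1&1\end{array}\right]$, which has determinant $1$ and is rotation by $45^{\circ}$, and then $P^{-1}AP = P^{t}AP = \left[\begin{array}{cc}3&0\\ 0&1\end{array}\right]$.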


Kirby A. Baker 2001-11-26