
2. More on orthonormal sets of vectors and rotations

Problem DD-3. Show that an orthonormal set of vectors is linearly independent.

(Method: Given a linear combination $ c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_k \mathbf{v}_k = 0$, take the inner product of both sides with the first vector and expand, using the fact that the inner product is linear in each entry. Now try the second vector, etc.)
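
As a numerical illustration of this method (a sketch only, not a proof; the particular vectors, angle, and coefficients below are made up for the example), taking the inner product of a linear combination with each vector of an orthonormal set isolates the corresponding coefficient:

    import numpy as np

    # A hypothetical orthonormal set in R^3: the standard basis rotated in the xy-plane.
    theta = 0.7
    v1 = np.array([np.cos(theta), np.sin(theta), 0.0])
    v2 = np.array([-np.sin(theta), np.cos(theta), 0.0])
    v3 = np.array([0.0, 0.0, 1.0])

    # Made-up coefficients for the combination c1*v1 + c2*v2 + c3*v3.
    c = np.array([2.0, -1.5, 0.25])
    w = c[0] * v1 + c[1] * v2 + c[2] * v3

    # <w, vi> = ci, since <vi, vj> = 0 for i != j and <vi, vi> = 1.
    # So if w were the zero vector, every coefficient would have to be 0.
    for i, v in enumerate((v1, v2, v3)):
        print(i + 1, np.dot(w, v))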



In the definition of a rotation matrix, notice that by (i), the matrix transformation $ T _ R$ takes the standard basis to an orthonormal basis, so $ T _ R$ acts ``rigidly''. By (ii), orientation is preserved. More generally,



Definition. A square matrix whose columns are orthonormal is called an orthogonal matrix (not good terminology, since it really should be called an ``orthonormal'' matrix). Thus a rotation matrix is an orthogonal matrix with determinant 1. The following fact can be proved:
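
As a quick numerical check of these definitions (a sketch, with an arbitrarily chosen angle), the familiar $ 2 \times 2$ rotation matrix has orthonormal columns and determinant 1:

    import numpy as np

    theta = 0.4  # arbitrary angle for the example
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    # Orthonormal columns: R^t R should be the identity (up to rounding error).
    print(np.allclose(R.T @ R, np.eye(2)))    # True

    # Determinant 1 is what makes this orthogonal matrix a rotation matrix.
    print(np.isclose(np.linalg.det(R), 1.0))  # True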



Theorem. For an $ n \times n$ real matrix $ P$, the following are equivalent:

(1) The columns of $ P$ are orthonormal (i.e., $ P$ is an orthogonal matrix);

(2) $ P ^ t P = I$;

(3) $ P$ is invertible and $ P ^ {-1} = P ^ t$;

(4) The rows of $ P$ are orthonormal;

(5) $ T _ P$ is a rigid transformation, meaning that all distances are preserved.
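
The theorem can also be sampled numerically before it is proved. In the sketch below, an orthogonal matrix is produced from the QR factorization of a random matrix (just a convenient source of examples, not part of the theorem), and conditions (2)-(5) are checked:

    import numpy as np

    rng = np.random.default_rng(0)
    P, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # P has orthonormal columns

    print(np.allclose(P.T @ P, np.eye(4)))             # (2)  P^t P = I
    print(np.allclose(np.linalg.inv(P), P.T))          # (3)  P^{-1} = P^t
    print(np.allclose(P @ P.T, np.eye(4)))             # (4)  rows are orthonormal
    x = rng.standard_normal(4)
    y = rng.standard_normal(4)
    print(np.isclose(np.linalg.norm(P @ x - P @ y),    # (5)  T_P preserves distances
                     np.linalg.norm(x - y)))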



Problem DD-4. Prove $ (1) \Leftrightarrow (2)$. (Think about what happens in the multiplication $ P ^ t P$ when $ P$ is $ 2 \times 2$ or $ 3 \times 3$.)
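
Here is a symbolic sketch of the hint (the symbol names $ a, b, c, d$ are made up): for a generic $ 2 \times 2$ matrix, each entry of $ P ^ t P$ is the inner product of two columns of $ P$, which is exactly why (1) and (2) say the same thing.

    import sympy as sp

    a, b, c, d = sp.symbols('a b c d', real=True)
    P = sp.Matrix([[a, c],
                   [b, d]])       # columns (a, b) and (c, d)

    # Entry (i, j) of P^t P is the dot product of column i with column j.
    print(P.T * P)
    # Matrix([[a**2 + b**2, a*c + b*d], [a*c + b*d, c**2 + d**2]])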



Problem DD-5. Find the inverse of the rotation $ \left[\begin{array}{ccc}0&1&0\\  0&0&1\\  1&0&0\end{array}\right]$.
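
The inverse here should be found by hand, but a short check of a candidate answer might look like this sketch (using part (3) of the theorem as the guess):

    import numpy as np

    P = np.array([[0, 1, 0],
                  [0, 0, 1],
                  [1, 0, 0]])

    candidate = P.T     # guess suggested by (3); write out its entries by hand
    print(np.array_equal(candidate @ P, np.eye(3, dtype=int)))   # True if the guess is correct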



Proposition. If $ P$ is an orthogonal matrix then $ \det P = \pm 1$.



Problem DD-6. Prove this proposition. (Start from (2) and use the facts that $ \det AB = \det A \; \det B$ and that $ \det A ^ t = \det A$.)
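
As a numerical companion to the proposition (a sketch with two made-up $ 2 \times 2$ examples), a rotation has determinant $ +1$, while an orthogonal matrix that reverses orientation, such as a reflection, has determinant $ -1$:

    import numpy as np

    theta = 1.1  # arbitrary angle
    rotation = np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
    reflection = np.array([[1.0,  0.0],
                           [0.0, -1.0]])   # orthogonal, but reverses orientation

    for P in (rotation, reflection):
        print(np.allclose(P.T @ P, np.eye(2)), int(round(np.linalg.det(P))))
    # True 1
    # True -1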



So, rotations are the case $ \det P = +1$.




Kirby A. Baker 2001-11-26