Problem DD-3. Show that an orthonormal set of vectors is linearly independent.
(Method: Given a linear combination $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_k\mathbf{v}_k = \mathbf{0}$, take the inner product of both sides with the first vector and expand, using the fact that the inner product is linear in each entry. Now try the second vector, etc.)
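Concretely (writing $\langle\ ,\ \rangle$ for the inner product), taking the inner product of both sides with $\mathbf{v}_1$ and expanding gives
\[
0 = \langle c_1\mathbf{v}_1 + \cdots + c_k\mathbf{v}_k,\ \mathbf{v}_1 \rangle
  = c_1 \langle \mathbf{v}_1, \mathbf{v}_1 \rangle + \cdots + c_k \langle \mathbf{v}_k, \mathbf{v}_1 \rangle
  = c_1 ,
\]
since $\langle \mathbf{v}_1, \mathbf{v}_1 \rangle = 1$ while $\langle \mathbf{v}_i, \mathbf{v}_1 \rangle = 0$ for $i \neq 1$; repeating with $\mathbf{v}_2, \mathbf{v}_3, \dots$ shows that every coefficient is $0$.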
In the definition of a rotation matrix, notice that by (i), the matrix transformation $\mathbf{x} \mapsto A\mathbf{x}$ takes the standard basis to an orthonormal basis, so $A$ acts ``rigidly''. By (ii), orientation is preserved. More generally,
Definition. A matrix whose columns are orthonormal is called an orthogonal matrix (not good terminology, since it really should be called an ``orthonormal'' matrix). Thus a rotation matrix is an orthogonal matrix with determinant 1. The following fact can be proved:
Theorem. For an $n \times n$ real matrix $A$, the following are equivalent:
(1) the columns of $A$ are orthonormal (i.e., $A$ is an orthogonal matrix);
(2) $A^T A = I$;
(3) $A$ is invertible and $A^{-1} = A^T$;
(4) the rows of $A$ are orthonormal;
(5) the transformation $\mathbf{x} \mapsto A\mathbf{x}$ is a rigid transformation, meaning that all distances are preserved.
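To see the theorem in a concrete case, take the rotation of the plane by $90^\circ$:
\[
A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad
A^T A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},
\]
so (2) holds, $A^{-1} = A^T$, and both the columns and the rows of $A$ are orthonormal. By contrast, the reflection $\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$ is orthogonal but has determinant $-1$, so it is an orthogonal matrix that is not a rotation.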
Problem DD-4. Prove that (1) and (2) are equivalent. (Think about the $(i, j)$ entry obtained in multiplying $A^T$ by $A$, when $i$ is equal to $j$ or different from $j$.)
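As a guide, write $\mathbf{a}_1, \dots, \mathbf{a}_n$ for the columns of $A$ (a notation introduced just for this sketch); then
\[
(A^T A)_{ij} = \mathbf{a}_i \cdot \mathbf{a}_j ,
\]
so the condition $A^T A = I$ says exactly that these dot products are $1$ when $i = j$ and $0$ when $i \neq j$, i.e., that the columns are orthonormal.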
Problem DD-5. Find the inverse of the rotation
\[
\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} .
\]
Proposition. If $A$ is an orthogonal matrix, then $\det A = \pm 1$.
Problem DD-6. Prove this proposition. (Start from (2) and use the facts that $\det(A^T) = \det A$ and that $\det(AB) = (\det A)(\det B)$.)
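In outline, taking determinants in (2) gives
\[
1 = \det I = \det(A^T A) = \det(A^T)\,\det A = (\det A)^2 ,
\]
from which $\det A = \pm 1$.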
So, rotations are the case $\det A = +1$.
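For instance, for the rotation of the plane by the angle $\theta$,
\[
\det \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} = \cos^2\theta + \sin^2\theta = 1 .
\]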