Quiz 6 in discussion section Tuesday, November 6: Be able to prove that every linear transformation on $\mathbb{R}^n$ is of the form $T(x) = Ax$, for some matrix $A$. (You will need to be very clear about what the issues are.)
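The idea behind the proof can be illustrated numerically: a linear transformation is completely determined by what it does to the standard basis vectors, and the matrix whose columns are $T(e_1), \dots, T(e_n)$ reproduces $T$. A minimal sketch in Python (the particular $T$ below is a hypothetical choice, not from the course):

```python
import numpy as np

# Illustration (not the proof itself): for a sample linear transformation T
# on R^2, build the matrix A whose columns are T(e_1) and T(e_2); then A x
# agrees with T(x) for every x.

def T(x):
    # an example linear transformation (hypothetical choice)
    return np.array([3 * x[0] + x[1], 2 * x[1] - x[0]])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = np.column_stack([T(e1), T(e2)])   # columns are T(e_i)

x = np.array([5.0, -2.0])
print(A @ x)   # same vector as T(x)
print(T(x))
```

The quiz asks for the general argument; the computation above only checks one vector for one sample $T$.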
Assignment nominally due in lecture on Wednesday, November 7, but you can hand it in Friday, November 9.
| where | Do but don't hand in | Hand in |
| p. 21 | Ex's 3, 4 | Ex. 6 (any method OK) |
| p. 26 | Ex's 1, 5, 6 | |
| p. 74 | Ex's 11, 12 | |
| p. 84 | Ex. 7 | |
| p. 85 | Ex 1, 4 | Ex. 6 |
| Q | Q-5 | Q-1, Q-2, Q-3, Q-4, Q-6, Q-7 |
| R | | R-1, R-2, R-3, R-4 |
Problem R-1. For a square matrix $A$, an eigenvector is a nonzero vector $x$ for which $Ax = cx$ for some scalar $c$, so $Ax$ lies along the same line as $x$. In other words, $Ax$ will point in the same direction as $x$ or in the opposite direction (except in the case $c = 0$, when $Ax = 0$).

Traditionally, we use $\lambda$ (lambda) for $c$, so the equation is $Ax = \lambda x$. If $A$ is $n \times n$, there are no more than $n$ possible values for $\lambda$, it turns out. These are the eigenvalues of $A$. (The ``eigen'', pronounced ``eye-ghen'', is German for ``own'', meaning that eigenvectors and eigenvalues are special vectors and scalars that ``belong'' to $A$.)
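The defining equation $Ax = \lambda x$ can be checked numerically; the $2 \times 2$ matrix below is just an illustrative choice, not one from the assignment:

```python
import numpy as np

# Verify A v = lambda v for each eigenpair of a sample symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lams, vecs = np.linalg.eig(A)   # eigenvalues; eigenvectors are the COLUMNS of vecs
for lam, v in zip(lams, vecs.T):
    # A v should equal lam * v for each eigenvalue/eigenvector pair
    print(lam, np.allclose(A @ v, lam * v))
```

Note that `np.linalg.eig` returns the eigenvectors as columns, matching the convention that the $j$-th column pairs with the $j$-th eigenvalue.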
Following the link on the class home page, try the ``Eigenvector
Demo''. The directions for finding eigenvalues are on the initial
page before you click on ``Eigen Program''. (There is something
not quite right about the ``Find Eigen'' tab; it may be necessary
to click on its right-hand edge. The same applies going back to
``Select Matrix A''.) The idea is first to choose a matrix and then
to move $x$ around until $x$ and $Ax$ line up--if possible!
The problem: For each of a rotation, a reflection, and a shear, how many different eigenvector lines (``eigenspaces'') can you find? (Answers must be among 0, 1, and 2. Two vectors along the same line count as one eigenspace.)
Also try some transformations you invent yourself, by dragging the column vectors of $A$ on the ``Select Matrix A'' panel, but you don't need to turn those in.
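If you want to double-check what you see in the demo, the count of distinct real eigenvector lines can be computed numerically. This sketch uses an arbitrary sample matrix (the rotation, reflection, and shear matrices you test are up to you):

```python
import numpy as np

def real_eigenlines(A, tol=1e-9):
    """Count distinct real eigenvector lines of a 2x2 matrix."""
    lams, vecs = np.linalg.eig(A)
    lines = []
    for lam, v in zip(lams, vecs.T):
        if abs(lam.imag) > tol:
            continue                      # complex eigenvalue: no real eigenline
        v = v.real / np.linalg.norm(v.real)
        # v and -v lie on the same line, so compare |dot product| to 1
        if not any(abs(abs(v @ w) - 1.0) < tol for w in lines):
            lines.append(v)
    return len(lines)

# Sample matrix (an arbitrary illustrative choice, not one of the problem's):
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
print(real_eigenlines(A))   # eigenvalues 2 and 3 give two distinct eigenlines
```

The demo is the intended tool for the problem; this is only a way to confirm your geometric observations.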
Problem R-2. Diagonal matrices behave very neatly when they are added and multiplied: the diagonal entries are added and multiplied without affecting each other. For example, $\operatorname{diag}(a, b)\,\operatorname{diag}(c, d) = \operatorname{diag}(ac, bd)$. As a result, if you have some polynomial expression such as $p(x) = x^2 + 3$ and apply it to a diagonal matrix such as $D = \operatorname{diag}(d_1, \dots, d_n)$, the value $p(D)$, meaning $D^2 + 3I$, is just $\operatorname{diag}(p(d_1), \dots, p(d_n))$. (Notice that the 3 becomes $3I$; we can't have $3$ plus a matrix!)
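The diagonal-matrix claim is easy to confirm numerically. Here is a sketch with a sample polynomial $p(x) = x^2 + 3$ and a sample diagonal matrix (both illustrative choices):

```python
import numpy as np

# Check that p(D) = diag(p(d_1), ..., p(d_n)) for p(x) = x**2 + 3.
D = np.diag([2.0, 5.0, -1.0])
I = np.eye(3)

pD = D @ D + 3 * I                 # the constant term 3 becomes 3*I
entrywise = np.diag([d**2 + 3 for d in np.diag(D)])

print(np.allclose(pD, entrywise))  # the two agree entry by entry
```

The point of the `3 * I` line is exactly the remark above: a scalar can't be added to a matrix, so the constant term of the polynomial turns into a multiple of the identity.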
(a) For the matrix $A$ given in the assignment, find a polynomial $p(x)$ of degree 2 with coefficients in $\mathbb{R}$ so that $p(A) = 0$. (Use a ``monic'' polynomial--a polynomial for which the coefficient of the highest power of $x$ is 1.)

(b) Same problem for the second given matrix, with $p$ of degree 3.
Problem R-3. Read §1.5 and §1.6. Notice that any elementary row operation can be described as multiplying on the left by an elementary matrix. Also notice that when you do any matrix multiplication $AB$, the columns of $B$ do not affect each other in finding the product; it's like finding $A$ times each column of $B$ separately. In other words, if $B = [\,b_1 \;\cdots\; b_n\,]$ then $AB = [\,Ab_1 \;\cdots\; Ab_n\,]$, where the $b_j$ are the columns of $B$.
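The column-by-column description of matrix multiplication can be checked directly; the matrices below are arbitrary sample values:

```python
import numpy as np

# Check that the j-th column of AB equals A times the j-th column of B.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

AB = A @ B
for j in range(B.shape[1]):
    print(np.allclose(AB[:, j], A @ B[:, j]))  # True for every column j
```

This is the same fact used in part (a): multiplying $A$ on the left by an invertible matrix acts on each column of $A$ independently.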
(a) Show that when an $m \times n$ matrix $A$ is row-reduced to a matrix $B$, there is an isomorphism of $\mathbb{R}^m$ with itself that takes each column of $A$ to the corresponding column of $B$. (It doesn't matter whether $B$ is in row-reduced echelon form or not.)
(b) Use (a) to re-explain why if certain columns of $B$ are a basis for the column space of $B$, then the same columns of $A$ are a basis for the column space of $A$.
Problem R-4. On separate paper from the rest of this assignment, choose one problem from the midterm that you wish you had answered better, and write out a better answer. I'll look at it and comment on it so you can see whether it would be satisfactory. For the most meaningful feedback, say it in your own words instead of copying the solution handed out, although you're welcome to look at the solution beforehand.