Multiplying on the left by an elementary matrix is part of the following general principle:
Proposition.
If you start with a matrix $B$ and multiply on the left by $A$, then in $AB$ every row is a linear combination of the rows of $B$. Moreover, just which linear combinations appear depends on $A$ and not on $B$.
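In symbols, row $i$ of $AB$ is $\sum_j a_{ij}\,(\text{row } j \text{ of } B)$. For instance, if $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ (a matrix chosen here just for illustration) and $B$ has rows $r_1$ and $r_2$, then
\[
AB = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}\begin{pmatrix} r_1 \\ r_2 \end{pmatrix} = \begin{pmatrix} r_1 + 2\,r_2 \\ 3\,r_1 + 4\,r_2 \end{pmatrix},
\]
and these combinations are the same no matter what $B$ is.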
To invent an $A$ so that $AB$ produces particular linear combinations of the rows of $B$, look at the case $B = I$, which gives $AI = A$.
For example, if you want a matrix $E$ whose effect when multiplied on the left is to subtract 2 times row 1 from row 2, then make $E$ by starting with the identity matrix of the right size and subtracting 2 times row 1 from row 2. In the $3 \times 3$ case this would be
\[
E = \begin{pmatrix} 1 & 0 & 0 \\ -2 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
In other words, to get $E$, do to the identity matrix what you want $E$ to do to other matrices when $E$ is used as a multiplier on the left.
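To check, let $B$ be any matrix with three rows $r_1$, $r_2$, $r_3$. By the proposition above,
\[
EB = \begin{pmatrix} 1 & 0 & 0 \\ -2 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} r_1 \\ r_2 \\ r_3 \end{pmatrix} = \begin{pmatrix} r_1 \\ r_2 - 2\,r_1 \\ r_3 \end{pmatrix},
\]
which is $B$ with 2 times row 1 subtracted from row 2, as desired.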
As you know, many religions and cultures have a version of the Golden Rule, which in older English is ``Do unto others what you would have others do unto you.'' So it is said that the Golden Rule for matrix multiplication (on the left) is
``Do unto others what you do unto $I$.''
Problem V-4. (a) Suppose you want to scale the rows of a $3 \times 3$ matrix $B$ by 10, 5, and 2 respectively. By what matrix should you multiply $B$ on the left to accomplish this?
(b) What should you do if instead you want to scale the columns of $B$ by the same three factors?
Problem
V-5. A permutation matrix is a matrix obtained by permuting
the rows of an identity matrix, or equivalently, a matrix that
has zero entries except for a single 1 in each row and column.
An example is
\[
P = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix}.
\]
What does multiplying by $P$ on the left do to a matrix $B$?
(Method: Use the Golden Rule. Of course, this can be computed directly too; the point is that you can tell what happens more intuitively, without doing a computation.)
(a) Calculate $PB$, where $B = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix}$.
(b) If you wanted to produce that effect on $B$ by multiplying on the left, how would you know to use $P$?
(c) Find $P^{-1}$ in your head, without writing anything down except the answer.
(d) Find $N^2$, where $N = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$.
Note. If $N$ is a square matrix such that $N^k = 0$ for some positive integer $k$, then $N$ is said to be nilpotent (meaning ``nothing-power''). Nilpotent matrices turn out to be important building blocks in the theory of ``similar'' matrices.
The matrix $N = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ and its transpose are the simplest nilpotent matrices other than the zero matrix. They often provide counterexamples to statements you might think should be true.
Example: From experience with numbers, you might think that $A^2 = 0$ implies $A = 0$, but that's not true for matrices, and $N$ is a counterexample.
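Indeed, $N \neq 0$, yet a direct computation gives
\[
N^2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} = 0.
\]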
Problem V-7. Show that if $u$ and $v$ are orthogonal column vectors in $\mathbb{R}^n$ (see Problem V-), then the $n \times n$ matrix $uv^T$ is nilpotent.
Proposition. Any invertible matrix $A$ is a product of elementary matrices.
Proof. Start with an $n \times n$ invertible matrix $A$ and row-reduce it to row-reduced echelon form (RREF). Since $A$ is invertible and therefore has rank $n$, the RREF must also have rank $n$ and so is $I$. Therefore the net effect of the row operations is $E_k \cdots E_2 E_1 A = I$ for elementary matrices $E_1, \dots, E_k$. Then $A = E_1^{-1} E_2^{-1} \cdots E_k^{-1}$, and inverses of elementary matrices are elementary.
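For instance, take $A = \begin{pmatrix} 2 & 0 \\ 1 & 1 \end{pmatrix}$, a matrix chosen here just for illustration. Dividing row 1 by 2 and then subtracting row 1 from row 2 reduces $A$ to $I$:
\[
\begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix}\begin{pmatrix} \tfrac{1}{2} & 0 \\ 0 & 1 \end{pmatrix} A = I,
\qquad \text{so} \qquad
A = \begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix},
\]
a product of elementary matrices.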
Corollary. Any product $BA$ with $B$ invertible can be achieved by starting with $A$ and performing elementary row operations.
Corollary. To invert $A$ (assuming it is invertible), start with the augmented matrix $[\,A \mid I\,]$ and row-reduce until you reach the form $[\,I \mid B\,]$. Then $B = A^{-1}$.
Proof. The row reduction amounts to multiplying on the left by some product of elementary matrices equalling a matrix $E$, which turns $[\,A \mid I\,]$ into $[\,EA \mid E\,]$. If $EA = I$, then $E = A^{-1}$, so the second half of the matrix is $A^{-1}$.
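For example, with a matrix chosen here just for practice, subtracting row 1 from row 2 and then subtracting 2 times row 2 from row 1 gives
\[
\left(\begin{array}{cc|cc} 1 & 2 & 1 & 0 \\ 1 & 3 & 0 & 1 \end{array}\right)
\longrightarrow
\left(\begin{array}{cc|cc} 1 & 2 & 1 & 0 \\ 0 & 1 & -1 & 1 \end{array}\right)
\longrightarrow
\left(\begin{array}{cc|cc} 1 & 0 & 3 & -2 \\ 0 & 1 & -1 & 1 \end{array}\right),
\]
so $\begin{pmatrix} 1 & 2 \\ 1 & 3 \end{pmatrix}^{-1} = \begin{pmatrix} 3 & -2 \\ -1 & 1 \end{pmatrix}$.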
Problem V-8. Find the inverse of $\begin{pmatrix} 1 & 2 \\ 3 & 7 \end{pmatrix}$.
Here is a handy shortcut for finding $A^{-1}B$, something that often occurs.
Proposition. If $A$ is an invertible $n \times n$ matrix and $B$ is an $n \times p$ matrix, the row-reduced echelon form of the augmented matrix $[\,A \mid B\,]$ is $[\,I \mid A^{-1}B\,]$.
Proof. Going from $A$ to $I$ means that the net effect of the row reduction is to multiply on the left by $A^{-1}$, and the Golden Rule of matrix multiplication says $B$ is affected the same way, so it becomes $A^{-1}B$.
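If you like checking such things by machine, here is one way to do it in Python with the SymPy library; the matrices $A$ and $B$ below are just a sample chosen for the check, not anything special.
\begin{verbatim}
from sympy import Matrix

A = Matrix([[2, 1],
            [1, 1]])               # an invertible 2x2 matrix (sample)
B = Matrix([[1, 0, 3],
            [0, 1, 4]])            # a 2x3 matrix (sample)

aug = A.row_join(B)                # the augmented matrix [ A | B ]
R, _ = aug.rref()                  # its row-reduced echelon form

print(R[:, :2])                    # left half: the identity matrix
print(R[:, 2:] == A.inv() * B)     # right half equals A^(-1) B: prints True
\end{verbatim}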