
1.2. Multiplying matrices on the left

Multiplying on the left by an elementary matrix is part of the following general principle:

Proposition. If you start with a matrix $ A$ and multiply on the left by $ M$, then in $ MA$ every row is a linear combination of rows of $ A$. Moreover, just which linear combinations depends on $ M$ and not on $ A$.

To invent an $ M$ so that $ MA$ produces particular linear combinations of the rows of $ A$, look at the case $ A = I$, which gives $ MI = M$.

For example, if you want a matrix $ M$ whose effect when multiplied on the left is to subtract 2 times row 1 from row 2, then make $ M$ by starting with the identity matrix of the right size and subtracting 2 times row 1 from row 2. In the $ 3 \times 3$ case this gives $ M = \left[\begin{array}{rrr}1&0&0\\ -2&1&0\\ 0&0&1\end{array}\right]$.

In other words, to get $ M$, do to the identity matrix what you want $ M$ to do to other matrices, when $ M$ is used as a multiplier on the left.
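As a quick numerical check of this principle (a sketch in Python with NumPy; the matrices here are made up for illustration):

```python
import numpy as np

# Build M by doing to the identity what we want M to do on the left:
# subtract 2 times row 1 from row 2 (rows numbered from 1, as in the text).
M = np.eye(3)
M[1] -= 2 * M[0]            # M = [[1,0,0], [-2,1,0], [0,0,1]]

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])

# Performing the same row operation on A directly...
expected = A.copy()
expected[1] -= 2 * expected[0]

# ...agrees with multiplying A on the left by M.
print(np.array_equal(M @ A, expected))   # True
```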

As you know, many religions and cultures have a version of the Golden Rule, which in older English is ``Do unto others what you would have others do unto you.'' So it is said that the Golden Rule for matrix multiplication (on the left) is

``Do unto others what you do unto $ I$.''

Problem V-4. (a) Suppose you want to scale the rows of a $ 3 \times 3$ matrix $ A$ by 10, 5, and 2 respectively. By what matrix should you multiply $ A$ on the left to accomplish this?

(b) What should you do if instead you want to scale the columns of $ A$ by the same three factors?



Problem V-5. A permutation matrix is a matrix obtained by permuting the rows of an identity matrix, or equivalently, a matrix whose entries are zero except for a single 1 in each row and each column. An example is $ P = \small\left[\begin{array}{cccc}0&0&1&0\\ 0&0&0&1\\ 1&0&0&0\\ 0&1&0&0\end{array}\right]$. What does $ \tau _ P$ do to $ \left[\begin{array}{c}x _ 1\\ x _ 2\\ x _ 3\\ x _ 4\end{array}\right]$?

(Method: Use the Golden Rule. Of course, this can be computed directly too; the point is that you can tell what happens more intuitively, without doing a computation.)



Problem V-6. For $ N = \left[\begin{array}{cc}0&0\\ 1&0\end{array}\right]$,

(a) Calculate $ NH$ where $ H = \left[\begin{array}{ccccccc}H&E&L&P&I&A&M\\ S&I&N&K&I&N&G\end{array}\right]$.

(b) If you wanted to produce that effect on $ H$ by multiplying on the left, how would you know to use $ N$?

(c) Find $ N^2$ in your head, without writing anything down except the answer.

(d) Find $ N _ 4^4$, where $ N _ 4 = \left[\begin{array}{cccc}0&0&0&0\\ 1&0&0&0\\ 0&1&0&0\\ 0&0&1&0\end{array}\right]$.



Note. If $ M$ is a square matrix such that $ M^k = 0$ for some $ k$, then $ M$ is said to be nilpotent (meaning ``nothing-power''). Nilpotent matrices turn out to be important building blocks in the theory of ``similar'' matrices.

$ N = \left[\begin{array}{cc}0&0\\ 1&0\end{array}\right]$ and its transpose are the simplest nilpotent matrices other than the zero matrix. They often provide counterexamples to statements you might think should be true.

Example: From experience with numbers, you might expect $ A^2 = 0 \Rightarrow A = 0$, but that is false for matrices, and $ N$ is a counterexample.
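A quick check of the counterexample in Python with NumPy:

```python
import numpy as np

# N is nonzero, yet N^2 is the zero matrix, so A^2 = 0 does not
# force A = 0 for matrices.
N = np.array([[0, 0],
              [1, 0]])
print(N @ N)    # the 2x2 zero matrix
```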



Problem V-7. Show that if $ v$ and $ w$ are orthogonal column vectors in $ \mathbb{R}^n$ (see Problem V-[*]), then the $ n \times n$ matrix $ w v ^ t$ is nilpotent.



Proposition. Any invertible matrix $ M$ is a product of elementary matrices.

Proof. Start with an $ n \times n$ invertible matrix $ M$ and row-reduce it to row-reduced echelon form (RREF). Since $ M$ is invertible, it has rank $ n$, so its RREF also has rank $ n$ and is therefore $ I$. Each row operation is left-multiplication by an elementary matrix, so the net effect is $ E _ k \dots E _ 1 M = I$. Then $ M = E^{-1} _ 1 \dots E^{-1} _ k$, and the inverse of an elementary matrix is elementary.



Corollary. Any product $ MA$ with $ M$ invertible can be achieved by starting with $ A$ and performing elementary row operations.



Corollary. To invert $ A$ (assuming it is invertible), start with $ [A\vert I]$ and row-reduce until you reach the form $ [I\vert M]$. Then $ M = A^{-1}$.

Proof. The row operations amount to multiplying on the left by some product of elementary matrices, say $ M$, which takes $ [A\vert I]$ to $ [MA\vert MI]$. So if $ MA = I$, then $ M = A^{-1}$, and the right half of the matrix is $ A^{-1}$.
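The corollary can be sketched as a short Gauss-Jordan routine (Python with NumPy; the test matrix is hypothetical, and partial pivoting is added for numerical stability, which the hand method does not need):

```python
import numpy as np

def invert_by_row_reduction(A):
    """Row-reduce [A | I] to [I | A^{-1}] by Gauss-Jordan elimination.
    A sketch assuming A is square and invertible."""
    n = len(A)
    aug = np.hstack([A.astype(float), np.eye(n)])    # form [A | I]
    for col in range(n):
        # swap in the row with the largest pivot in this column
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        aug[[col, pivot]] = aug[[pivot, col]]
        aug[col] /= aug[col, col]                    # scale the pivot to 1
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col] # clear the rest of the column
    return aug[:, n:]                                # the right half is A^{-1}

A = np.array([[2., 1.],
              [1., 1.]])
print(invert_by_row_reduction(A))   # A^{-1} = [[1, -1], [-1, 2]]
```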



Problem V-8. Find the inverse of $ \left[\begin{array}{ccc}1&2&3\\ 0&1&4\\ 0&0&1\end{array}\right]$.



Here is a handy shortcut for computing $ A^{-1} B$, a product that arises often.



Proposition. If $ A$ is an invertible $ n \times n$ matrix and $ B$ is an $ n \times m$ matrix, then the row-reduced echelon form of the augmented matrix $ [A\vert B]$ is $ [I\vert A^{-1} B]$.

Proof. Going from $ A$ to $ I$ means the effect of the row-reduction is to multiply by $ A^{-1}$, and the Golden Rule of matrix multiplication says $ B$ is affected the same way.
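As a numerical check (Python with NumPy; the matrices are hypothetical, and the $ 2 \times 2$ example needs no row swaps):

```python
import numpy as np

# Row-reduce [A | B]; by the Golden Rule, the reduction that turns A
# into I turns the augmented columns B into A^{-1} B.
A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[5., 6.],
              [7., 8.]])

aug = np.hstack([A, B])             # form [A | B]
for col in range(2):
    aug[col] /= aug[col, col]       # scale the pivot to 1
    for row in range(2):
        if row != col:
            aug[row] -= aug[row, col] * aug[col]   # clear the column

# The right half now equals A^{-1} B.
print(np.allclose(aug[:, 2:], np.linalg.solve(A, B)))   # True
```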




Kirby A. Baker 2001-11-13