Sample problem and solutions for Quiz 3: If $v_1, v_2, v_3$ are linearly independent vectors in a vector space over $\mathbf{R}$, show that $v_1+v_2$, $v_2+v_3$, $v_1+v_3$ are linearly independent.
Solution #1: Suppose $a(v_1+v_2) + b(v_2+v_3) + c(v_1+v_3) = 0$. Expanding and collecting coefficients we see this is the same as $(a+c)v_1 + (a+b)v_2 + (b+c)v_3 = 0$. Since $v_1, v_2, v_3$ are linearly independent, we have $a+c = 0$, $a+b = 0$, $b+c = 0$, or $\begin{pmatrix}1&0&1\\1&1&0\\0&1&1\end{pmatrix}\begin{pmatrix}a\\b\\c\end{pmatrix} = \begin{pmatrix}0\\0\\0\end{pmatrix}$. Row-reducing we get the identity matrix, so $a = b = c = 0$. Therefore any linear relation between the given vectors is trivial and so they are linearly independent.
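As a sanity check, the homogeneous system in Solution #1 can be solved symbolically. This is a sketch using sympy, assuming the relation reduces to the system $a+c = 0$, $a+b = 0$, $b+c = 0$:

```python
from sympy import symbols, linsolve

a, b, c = symbols('a b c')

# Homogeneous system (assumption): a + c = 0, a + b = 0, b + c = 0.
# linsolve treats each expression as "= 0".
sols = linsolve([a + c, a + b, b + c], [a, b, c])
print(sols)  # {(0, 0, 0)}: only the trivial relation exists
```

Since the only solution is $(0,0,0)$, every linear relation is trivial, matching the row-reduction argument.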
Solution #2: $v_1, v_2, v_3$ are a basis for the subspace $W$ they span, so there is an isomorphism of $\mathbf{R}^3$ with $W$ taking the standard basis vectors $e_1, e_2, e_3$ to $v_1, v_2, v_3$. Via this isomorphism the question becomes whether $e_1+e_2$, $e_2+e_3$, $e_1+e_3$ are linearly independent--in other words, $(1,1,0)$, $(0,1,1)$, and $(1,0,1)$. Make a matrix with these as columns (or rows) and row reduce. We get the identity matrix, so these vectors are linearly independent. Therefore the original vectors were linearly independent.
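The row reduction in Solution #2 can be checked with sympy. A sketch, assuming the three coordinate vectors are $(1,1,0)$, $(0,1,1)$, $(1,0,1)$:

```python
from sympy import Matrix

# Columns are the coordinate vectors (1,1,0), (0,1,1), (1,0,1)
# (an assumption reconstructed from Solution #2).
A = Matrix([[1, 0, 1],
            [1, 1, 0],
            [0, 1, 1]])

R, pivots = A.rref()  # reduced row echelon form and pivot columns
print(R)       # the 3x3 identity matrix
print(pivots)  # (0, 1, 2): every column has a pivot
```

Every column is a pivot column, so the columns are linearly independent.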
Comment on problem done in lecture:
The question asked by one of you was whether the following transformation is an isomorphism: $T\colon$ Pols$_3 \to$ Pols$_4$ given by $T(p(x)) = (x+1)\,p(x)$. This can't be true, since the transformation is between spaces of different dimensions, but let's see if we can get a better understanding of $T$ by using isomorphisms.
This is a little more complicated than some problems because there are two vector spaces involved, of dimensions 4 and 5, and so we are replacing them by two spaces, $\mathbf{R}^4$ and $\mathbf{R}^5$. First, we can multiply out the product of polynomials and collect coefficients of powers to get $T(a_0 + a_1x + a_2x^2 + a_3x^3) = a_0 + (a_0+a_1)x + (a_1+a_2)x^2 + (a_2+a_3)x^3 + a_3x^4$.
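The multiplying-out step can be verified symbolically. A sketch with sympy, under the assumption that $T$ is multiplication by $x+1$:

```python
from sympy import symbols, expand, Poly

x, a0, a1, a2, a3 = symbols('x a0 a1 a2 a3')

# Multiply out (x+1)*p(x) for a general cubic p and collect coefficients
# (assumption: T is multiplication by x+1).
p = a0 + a1*x + a2*x**2 + a3*x**3
product = expand((x + 1) * p)

# all_coeffs lists coefficients from highest degree down
print(Poly(product, x).all_coeffs())  # [a3, a2 + a3, a1 + a2, a0 + a1, a0]
```

Reading the list from lowest degree up reproduces the coefficients $a_0$, $a_0+a_1$, $a_1+a_2$, $a_2+a_3$, $a_3$.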
Now let's use the fact that Pols$_3 \cong \mathbf{R}^4$ via $a_0 + a_1x + a_2x^2 + a_3x^3 \mapsto (a_0, a_1, a_2, a_3)$. We also use the fact that Pols$_4 \cong \mathbf{R}^5$ via $b_0 + b_1x + b_2x^2 + b_3x^3 + b_4x^4 \mapsto (b_0, b_1, b_2, b_3, b_4)$. So we are replacing the spaces on both ends of $T$, and $T$ itself is replaced by $S\colon \mathbf{R}^4 \to \mathbf{R}^5$ given by $S(a_0, a_1, a_2, a_3) = (a_0,\ a_0+a_1,\ a_1+a_2,\ a_2+a_3,\ a_3)$.
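A minimal Python sketch of the coordinate map, assuming $S(a_0,a_1,a_2,a_3) = (a_0,\ a_0+a_1,\ a_1+a_2,\ a_2+a_3,\ a_3)$:

```python
# Sketch of S, assuming T(p(x)) = (x+1)p(x), so that in coordinates
# S(a0, a1, a2, a3) = (a0, a0+a1, a1+a2, a2+a3, a3).
def S(a0, a1, a2, a3):
    return (a0, a0 + a1, a1 + a2, a2 + a3, a3)

# Example: 1 + x has coordinates (1, 1, 0, 0), and
# (x+1)(1+x) = 1 + 2x + x^2 has coordinates (1, 2, 1, 0, 0).
print(S(1, 1, 0, 0))  # (1, 2, 1, 0, 0)
```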
Is $S$ linear? Yes.
Is $S$ one-to-one? We can check whether the kernel of $S$ is $\{0\}$, and the answer turns out to be yes.
Is $S$ onto? We can check whether, given any $(b_0, b_1, b_2, b_3, b_4)$, it is possible to find $(a_0, a_1, a_2, a_3)$ with $S(a_0, a_1, a_2, a_3) = (b_0, b_1, b_2, b_3, b_4)$: This is the same as the equations $a_0 = b_0$, $a_0+a_1 = b_1$, $a_1+a_2 = b_2$, $a_2+a_3 = b_3$, $a_3 = b_4$, where $a_0, a_1, a_2, a_3$ are variables and $b_0, \dots, b_4$ are constants. Either directly or by matrices, we get that the system is consistent only when $b_0 - b_1 + b_2 - b_3 + b_4 = 0$, so every vector in the image of $S$ satisfies $b_0 - b_1 + b_2 - b_3 + b_4 = 0$. Therefore we can't choose just any $(b_0, b_1, b_2, b_3, b_4)$ and $S$ is not onto. Therefore $T$ is not onto, as expected.
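The failure of onto-ness can be seen concretely with sympy: a target such as $(0,0,0,0,1)$, which violates $b_0-b_1+b_2-b_3+b_4 = 0$, should have no preimage. A sketch, assuming $S(a_0,a_1,a_2,a_3) = (a_0,\ a_0+a_1,\ a_1+a_2,\ a_2+a_3,\ a_3)$:

```python
from sympy import symbols, linsolve

a0, a1, a2, a3 = symbols('a0 a1 a2 a3')

# Try to solve S(a) = (0, 0, 0, 0, 1); this target violates
# b0 - b1 + b2 - b3 + b4 = 0, so the system should be inconsistent.
eqs = [a0 - 0, (a0 + a1) - 0, (a1 + a2) - 0, (a2 + a3) - 0, a3 - 1]
print(linsolve(eqs, [a0, a1, a2, a3]))  # EmptySet: no preimage exists
```

The first four equations force $a_0 = a_1 = a_2 = a_3 = 0$, contradicting $a_3 = 1$, so no preimage exists.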
Note: $S$ can be expressed in matrix form as $S\begin{pmatrix}a_0\\a_1\\a_2\\a_3\end{pmatrix} = \begin{pmatrix}1&0&0&0\\1&1&0&0\\0&1&1&0\\0&0&1&1\\0&0&0&1\end{pmatrix}\begin{pmatrix}a_0\\a_1\\a_2\\a_3\end{pmatrix}$. This gives a clearer picture of $S$ than the polynomial version.
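The matrix form makes both questions a rank computation. A sympy sketch, assuming the $5\times 4$ matrix whose column $j$ holds the coefficients of $(x+1)\,x^j$:

```python
from sympy import Matrix

# Matrix of S (assumption): column j is the coefficient vector of (x+1)*x^j,
# giving ones on two adjacent diagonals.
A = Matrix([[1, 0, 0, 0],
            [1, 1, 0, 0],
            [0, 1, 1, 0],
            [0, 0, 1, 1],
            [0, 0, 0, 1]])

print(A.rank())  # 4: full column rank, so S is one-to-one
# rank 4 < 5 = dim R^5, so the columns cannot span R^5 and S is not onto
```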