Problem W-1. Find the best-fitting straight line y = c + dx for the five data points (x_1, y_1), ..., (x_5, y_5). (See the accompanying figure.)
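A minimal numerical sketch of this kind of line fit via the normal equations, using hypothetical data points (the problem's own values are not reproduced here):

```python
import numpy as np

# Hypothetical data points; the problem's actual values differ.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Design matrix for the line y = c + d*x: a column of ones and a column of x.
A = np.column_stack([np.ones_like(xs), xs])

# Normal equations: (A^T A) [c, d]^T = A^T y.
c, d = np.linalg.solve(A.T @ A, A.T @ ys)
print("c =", c, " d =", d)
```

The residual vector ys - A @ [c, d] comes out perpendicular to both columns of A, which is exactly what the normal equations enforce.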
Problem W-2. Find the best-fitting parabola y = c_0 + c_1 x + c_2 x^2 for the five data points (x_1, y_1), ..., (x_5, y_5). (See the accompanying figure.)
Problem W-3. For the case of finding the best-fitting quadratic y = c_0 + c_1 x + c_2 x^2 for data (x_1, y_1), ..., (x_n, y_n), express the normal equations M c = d by giving the entries of M and of d as summations involving the data coordinates.
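A numerical check of the pattern this problem is after, with hypothetical data (the symbols M and d follow the problem statement): entry (i, j) of M = A^T A is the summation of x_k^(i+j) over the data, and entry i of d = A^T y is the summation of x_k^i y_k.

```python
import numpy as np

# Hypothetical data; only the summation pattern matters here.
xs = np.array([0.5, 1.0, 2.0, 3.5])
ys = np.array([1.0, 0.5, 2.0, 4.0])

# Design matrix for the quadratic y = c0 + c1*x + c2*x^2.
A = np.column_stack([xs**0, xs, xs**2])

M = A.T @ A   # left-hand side of the normal equations
d = A.T @ ys  # right-hand side

# M[i, j] = sum_k x_k^(i+j);  d[i] = sum_k x_k^i * y_k.
for i in range(3):
    for j in range(3):
        assert np.isclose(M[i, j], np.sum(xs**(i + j)))
    assert np.isclose(d[i], np.sum(xs**i * ys))
```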
Problem W-4. In the explanation of the method of normal equations, we used the fact that the subspace of all vectors of the form A x contains the columns of A. Which particular vectors x give the various columns of A?
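A small numerical illustration of the fact the problem refers to, for a hypothetical matrix A (choosing x to be a standard basis vector picks out a column):

```python
import numpy as np

# Hypothetical 3x2 matrix; any A would do.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Multiplying A by the j-th standard basis vector e_j yields column j of A,
# so every column of A is a vector of the form A x.
for j in range(A.shape[1]):
    e = np.zeros(A.shape[1])
    e[j] = 1.0
    assert np.allclose(A @ e, A[:, j])
```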
Problem W-5. Carry out the derivation of the normal equations using calculus, as suggested in Section 7, but just for the case of two unknowns and arbitrarily many data points.
Problem W-6. Show that if V is a subspace of R^n and b is any vector in R^n, then there is a vector w* in V so that w* - b is perpendicular to V.
(Method: Let k be the dimension of V and let w_1, ..., w_k be an orthonormal basis of V. Write w = a_1 w_1 + ... + a_k w_k with unknown coefficients a_1, ..., a_k. See if you can find a condition on the a_i so that w - b is perpendicular to w_1. Do the same for w_2, ..., w_k. This is really the same idea as for Gram-Schmidt orthogonalization. Of course, in terms of the geometrical derivation of the least-squares method, w* = A x̂.)
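A numerical sketch of the construction in the hint, with a hypothetical subspace: with an orthonormal basis w_1, ..., w_k, the coefficient of w_i that makes w - b perpendicular to w_i turns out to be the dot product of b with w_i.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-dimensional subspace of R^5: orthonormalize two random vectors.
B = rng.standard_normal((5, 2))
Q, _ = np.linalg.qr(B)          # columns of Q: an orthonormal basis w_1, w_2
b = rng.standard_normal(5)

# w* = (b . w_1) w_1 + (b . w_2) w_2, i.e. coefficients a_i = b . w_i.
w_star = Q @ (Q.T @ b)

# w* - b is perpendicular to each basis vector, hence to the whole subspace.
assert np.allclose(Q.T @ (w_star - b), 0.0)
```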
Problem W-7. Show that in the preceding problem, of all vectors in V, the vector w* is the closest vector to b, by using both a geometrical explanation and an algebraic proof.
Geometrical method: To the diagram in Section 6, add an arbitrary-looking vector w in V. Consider the triangle formed by (the ends of) b, w*, and w. Three points always lie in a plane, so they form an ordinary planar triangle, even if they are in a higher-dimensional space.
Algebraic method: Consider squares of lengths instead of lengths. Thus you want to show that w* is the w in V for which the squared length (w - b) . (w - b) is least. Write an arbitrary w as w = w* + u and write v = w* - b. Rewrite the expression you are minimizing using this notation and expand it using simple algebraic properties of the dot product. You should get something algebraic that corresponds to the geometrical method.
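A numerical illustration of the algebraic identity the expansion produces (for a hypothetical subspace and vector b): for any w in the subspace, |w - b|^2 = |w - w*|^2 + |w* - b|^2, so |w - b|^2 is never smaller than |w* - b|^2.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-dimensional subspace of R^6 with orthonormal basis Q.
Q, _ = np.linalg.qr(rng.standard_normal((6, 3)))
b = rng.standard_normal(6)
w_star = Q @ (Q.T @ b)          # the perpendicular foot from Problem W-6

best = np.dot(w_star - b, w_star - b)
for _ in range(100):
    w = Q @ rng.standard_normal(3)   # an arbitrary vector in the subspace
    lhs = np.dot(w - b, w - b)
    rhs = np.dot(w - w_star, w - w_star) + best
    assert np.isclose(lhs, rhs)      # the Pythagorean expansion
    assert lhs >= best - 1e-12       # hence w* is closest
```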
Problem W-8. Consider an overdetermined system in one variable x.
(a) Find all values of x that minimize the sum of the squares of the errors. Is there more than one such value?
(b) Find the least-squares solution using calculus.
(c) Find the least-squares solution using the normal equations.
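A sketch of parts (b) and (c) for a hypothetical one-variable system (the problem's own equations are not reproduced here); both routes give the same formula.

```python
import numpy as np

# Hypothetical overdetermined system in one variable x:
#   1*x = 1,  2*x = 3,  3*x = 2.
a = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 3.0, 2.0])

# Calculus: setting d/dx of sum_i (a_i x - b_i)^2 to zero gives
# x = (sum a_i b_i) / (sum a_i^2).
x_calc = np.sum(a * b) / np.sum(a * a)

# Normal equations: the 1x1 system (a^T a) x = a^T b.
x_normal = (a @ b) / (a @ a)

assert np.isclose(x_calc, x_normal)
```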
Problem W-9. Suppose you have a least-squares problem with as many equations as unknowns and a nonsingular coefficient matrix A. Is the least-squares solution of A x = b the same as the ordinary solution of the linear equations A x = b? (Explain why or why not.)
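A quick numerical check of the situation the problem asks about, for a hypothetical square nonsingular A: since A x = b then has an exact solution, the minimum residual is zero and the two notions of solution coincide.

```python
import numpy as np

# Hypothetical square, nonsingular system.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

x_exact = np.linalg.solve(A, b)                  # ordinary solution
x_lsq, *_ = np.linalg.lstsq(A, b, rcond=None)    # least-squares solution

assert np.allclose(x_exact, x_lsq)
```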
Problem W-10. (a) Suppose you wanted to find the point of the surface that is closest to the origin. Show how to rephrase this as a least-squares problem. (You are not asked to finish the solution. Notice that this particular surface is linear, or more precisely, affine.)
(b) Sometimes you need to invent a new method from ingredients that you know. Suppose you were given a nonlinear parametric surface and were asked which point of it is closest to the origin. Invent a method to solve such a problem. (Combine ideas of linear approximation, Newtonian iteration, and least squares.)
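One possible method along the lines of part (b), sketched for a hypothetical parametric surface S(u, v) (the surface and its Jacobian below are illustrative choices, not the problem's data): linearize S about the current parameters, solve the resulting linear least-squares problem for a step toward the origin, and iterate. This combination of linear approximation, Newtonian iteration, and least squares is usually called Gauss-Newton iteration.

```python
import numpy as np

def S(u, v):
    """Hypothetical parametric surface: a paraboloid in R^3."""
    return np.array([u, v, u * u + v * v])

def J(u, v):
    """Jacobian of S with respect to (u, v)."""
    return np.array([[1.0, 0.0],
                     [0.0, 1.0],
                     [2 * u, 2 * v]])

u, v = 1.0, 1.0                  # starting guess
for _ in range(50):
    r = S(u, v)                  # residual: we want S(u, v) near the origin
    # Linear approximation S(u+du, v+dv) ~ S(u, v) + J * (du, dv);
    # choose the step that minimizes |S + J*step| in the least-squares sense.
    step, *_ = np.linalg.lstsq(J(u, v), -r, rcond=None)
    u, v = u + step[0], v + step[1]

# For this particular surface the closest point to the origin is (0, 0, 0).
```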
Problem W-11. Find a parametric line p(t) = u + t v so that p(t_1) is close to a given point P_1, p(t_2) is close to P_2, p(t_3) is close to P_3, and p(t_4) is close to P_4. (Method: Similar to the example of a cubic parametric curve given earlier.)
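A sketch of the computation with hypothetical parameter values t_k and points P_k (the problem's own data are not reproduced here); each coordinate of the line is fitted by ordinary least squares.

```python
import numpy as np

# Hypothetical parameter values and target points in the plane.
ts = np.array([0.0, 1.0, 2.0, 3.0])
P = np.array([[0.1, 0.0],
              [1.0, 1.1],
              [2.1, 1.9],
              [2.9, 3.0]])

# For the line p(t) = u + t*v, each row of the design matrix is [1, t_k].
A = np.column_stack([np.ones_like(ts), ts])

# lstsq fits both coordinates at once: column i of P gives (u_i, v_i).
coef, *_ = np.linalg.lstsq(A, P, rcond=None)
u, v = coef[0], coef[1]
print("u =", u, " v =", v)
```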
Problem W-12. Show how to compute a parametric curve p(t) of degree at most 2 so that p(t_1) is close to P_1, p(t_2) is close to P_2, p(t_3) is close to P_3, and p(t_4) is close to P_4. Rather than compute the coefficients numerically, you may leave them as a matrix expression involving matrices with explicit entries. (Matrix inverses can be left uncomputed.) (Method: Similar to the example of a cubic parametric curve given earlier.)