8. Problems

Problem W-1. Find the best-fitting straight line for the data points $(-1,-1)$, $(-1,0)$, $(0,1)$, $(1,2)$, $(1,3)$. (See Figure [*].)

[Figure: Data points (image file book/12dir/linearProb.eps)]
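As an aside (not part of the original problem), an answer to Problem W-1 can be checked numerically with NumPy's least-squares solver; the variable names below are our own. This is a sketch of a check, not the intended hand computation.

```python
import numpy as np

# Data points from Problem W-1
x = np.array([-1.0, -1.0, 0.0, 1.0, 1.0])
y = np.array([-1.0, 0.0, 1.0, 2.0, 3.0])

# Design matrix for the line c0 + c1*x: columns are [1, x]
A = np.column_stack([np.ones_like(x), x])

# Solve the overdetermined system A c ≈ y in the least-squares sense
c, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(c)  # best-fitting line is y = c[0] + c[1]*x
```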

Problem W-2. Find the best-fitting parabola $y = f(x)$ for the data points $(-2,-4)$, $(-1,-4)$, $(0,-1)$, $(1,1)$, $(2,5)$. (See Figure [*].)



[Figure: Data points (image file book/12dir/quadraticProb.eps)]
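Likewise (our addition, not part of the problem), the parabola of Problem W-2 can be spot-checked numerically; the design matrix simply gains a column of squares.

```python
import numpy as np

# Data points from Problem W-2
r = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
s = np.array([-4.0, -4.0, -1.0, 1.0, 5.0])

# Design matrix for the parabola c0 + c1*x + c2*x^2: columns [1, x, x^2]
A = np.column_stack([np.ones_like(r), r, r**2])

# Least-squares coefficients of the best-fitting parabola
c, *_ = np.linalg.lstsq(A, s, rcond=None)
print(c)
```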

Problem W-3. For the case of finding the best-fitting quadratic $c_0 + c_1 x + c_2 x^2$ for data $(r_1, s_1),\dots,(r_m, s_m)$, express the normal equations $N\mathbf{c} = \mathbf{d}$ by giving the entries of $N$ and $\mathbf{d}$ as summations involving the data coordinates.



Problem W-4. In the explanation of the method of normal equations, we used the fact that the subspace $W$ of all vectors of the form $A\mathbf{x}$ contains the columns of $A$. Which particular $\mathbf{x}$ give the various columns of $A$?



Problem W-5. Carry out the derivation of the normal equations using calculus, as suggested in Section 7, but just for the case $n = 2$ and arbitrary $m$.



Problem W-6. Show that if $W$ is a subspace of $\mathbb{R}^n$ and $\mathbf{b}$ is any vector in $\mathbb{R}^n$, then there is a vector $\mathbf{w}_0$ in $W$ such that $\mathbf{w}_0 - \mathbf{b}$ is perpendicular to $W$.

(Method: Let $k$ be the dimension of $W$ and let $\mathbf{w}_1,\dots,\mathbf{w}_k$ be an orthonormal basis of $W$. Write $\mathbf{w}_0 = r_1\mathbf{w}_1 + \dots + r_k\mathbf{w}_k$ with unknown coefficients $r_i$. See if you can find a condition on the $r_i$ so that $\mathbf{w}_0 - \mathbf{b}$ is perpendicular to $\mathbf{w}_1$. Do the same for $\mathbf{w}_2,\dots,\mathbf{w}_k$. This is really the same idea as in Gram-Schmidt orthogonalization. Of course, in terms of the geometrical derivation of the least-squares method, $\mathbf{w}_0 = A\mathbf{x}_0$.)



Problem W-7. Show that, in the preceding problem, of all vectors in $W$ the vector $\mathbf{w}_0$ is the one closest to $\mathbf{b}$. Give both a geometrical explanation and an algebraic proof.

Geometrical method: To the diagram in Section 6, add an arbitrary-looking vector $\mathbf{w}$ in $W$. Consider the triangle formed by (the ends of) $\mathbf{b}$, $\mathbf{w}_0$, and $\mathbf{w}$. Three points always lie in a plane, so they form an ordinary planar triangle, even if they are in a higher-dimensional space.

Algebraic method: Consider squares of lengths instead of lengths. Thus you want to show that $\mathbf{w}_0$ is the $\mathbf{w}$ in $W$ for which $(\mathbf{w} - \mathbf{b}) \cdot (\mathbf{w} - \mathbf{b})$ is least. Write an arbitrary $\mathbf{w}$ as $\mathbf{w} = \mathbf{w}_0 + \Delta\mathbf{w}$ and write $\boldsymbol{\epsilon} = \mathbf{w}_0 - \mathbf{b}$. Rewrite the expression you are minimizing in this notation and expand it using simple algebraic properties of the dot product. You should get something algebraic that corresponds to the geometrical method.



Problem W-8. Consider the overdetermined system in one variable

\begin{displaymath}\begin{array}{lll}
x & \approx & 1\\
x & \approx & 0
\end{array}\end{displaymath}

(a) Find all values of $x$ that minimize $\vert \epsilon_1 \vert + \vert \epsilon_2 \vert$, the sum of the absolute values of the two residuals. Is there more than one such value?

(b) Find the least-squares solution using calculus.

(c) Find the least-squares solution using the normal equations.
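For comparison with your hand computations in parts (b) and (c), here is a quick numerical check (our addition, not part of the problem):

```python
import numpy as np

# Overdetermined system of Problem W-8: x ≈ 1, x ≈ 0
A = np.array([[1.0], [1.0]])
b = np.array([1.0, 0.0])

# The least-squares solution minimizes (x - 1)^2 + (x - 0)^2
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)  # compare with the answers to parts (b) and (c)
```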



Problem W-9. Suppose you have a least-squares problem with $m = n$ and a nonsingular coefficient matrix $A$. Is the least-squares solution of $A\mathbf{x} \approx \mathbf{b}$ the same as the ordinary solution of the linear system $A\mathbf{x} = \mathbf{b}$? (Explain why or why not.)



Problem W-10. (a) Suppose you wanted to find the point of the surface $P(t,u) = (3t-u-5,\; u+2t+2,\; 4t+2u-7)$ that is closest to the origin. Show how to rephrase this as a least-squares problem. (You are not asked to finish the solution. Notice that this particular $P$ is linear, or more precisely, affine.)

(b) Sometimes you need to invent a new method from ingredients that you know. Suppose you were given a nonlinear parametric surface $ P(t,u)$ and were asked which point of it is closest to the origin. Invent a method to solve such a problem. (Combine ideas of linear approximation, Newtonian iteration, and least squares.)



Problem W-11. Find a parametric line $P(t)$ so that $P(0)$ is close to $\left[\begin{array}{c}0\\ 2\end{array}\right]$, $P(1)$ is close to $\left[\begin{array}{c}1\\ 2\end{array}\right]$, $P(2)$ is close to $\left[\begin{array}{c}2\\ 1\end{array}\right]$, and $P(3)$ is close to $\left[\begin{array}{c}2\\ 0\end{array}\right]$.

(Method: Similar to the example of a cubic parametric curve in §[*].)
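As a hedged numerical sketch (our addition): writing the line as $P(t) = \mathbf{a} + t\,\mathbf{b}$, each coordinate is an ordinary least-squares line fit against $t = 0, 1, 2, 3$, and both fits can be done at once by giving the solver a two-column right-hand side.

```python
import numpy as np

# Parameter values and target points from Problem W-11
t = np.array([0.0, 1.0, 2.0, 3.0])
targets = np.array([[0.0, 2.0],
                    [1.0, 2.0],
                    [2.0, 1.0],
                    [2.0, 0.0]])

# P(t) = a + t*b; fit both coordinates simultaneously by least squares
A = np.column_stack([np.ones_like(t), t])
coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)
print(coeffs)  # row 0 is the constant vector a, row 1 is the direction b
```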



Problem W-12. Show how to compute a parametric curve $P(t)$ of degree at most 2 so that $P(0)$ is close to $\left[\begin{array}{c}0\\ 2\end{array}\right]$, $P(1)$ is close to $\left[\begin{array}{c}1\\ 2\end{array}\right]$, $P(2)$ is close to $\left[\begin{array}{c}2\\ 1\end{array}\right]$, and $P(3)$ is close to $\left[\begin{array}{c}2\\ 0\end{array}\right]$.

Rather than compute the coefficients numerically, you may leave them as a matrix expression involving matrices with explicit entries. (Matrix inverses can be left uncomputed.)

(Method: Similar to the example of a cubic parametric curve in §[*].)


Kirby A. Baker 2003-05-13