Let $x_0, x_1, \ldots, x_n$ be a set of distinct abscissae (also called “nodes”) and let $\mathcal{P}_n$ denote the real vector space of polynomials of degree at most $n$ with real coefficients. If $f$ is a real-valued function defined on $\{x_0, \ldots, x_n\}$, then there exists a polynomial $p \in \mathcal{P}_n$ that interpolates $f$ at the abscissae in the sense that $p(x_i) = f(x_i)$ for each $i \in \{0, \ldots, n\}$: namely, $p = \sum_{i=0}^{n} f(x_i)\, \ell_i$, where $\ell_i(x) = \prod_{j \neq i} \frac{x - x_j}{x_i - x_j}$. Furthermore, it is the only such polynomial, for if $q \in \mathcal{P}_n$ also interpolates $f$ at the same abscissae, then $p - q$ has $n + 1$ distinct roots and must therefore be the zero polynomial.
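To make the Lagrange form concrete, here is a minimal Python sketch of direct evaluation; the function name `lagrange_interpolate` and the sample nodes are our own illustrative choices, not part of the text above.

```python
def lagrange_interpolate(xs, fs, x):
    """Evaluate p(x) = sum_i f(x_i) * l_i(x), where
    l_i(x) = prod_{j != i} (x - x_j) / (x_i - x_j)."""
    p = 0.0
    for i, (xi, fi) in enumerate(zip(xs, fs)):
        li = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                li *= (x - xj) / (xi - xj)
        p += fi * li
    return p

# With n = 3 and f(x) = x^3, the interpolant reproduces f exactly.
xs = [0.0, 1.0, 2.0, 3.0]
fs = [xi**3 for xi in xs]
print(lagrange_interpolate(xs, fs, 2.5))  # 15.625 = 2.5**3
```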
The polynomials $\ell_0, \ldots, \ell_n$ constitute a basis of $\mathcal{P}_n$ known as the Lagrange basis. Other bases of $\mathcal{P}_n$ include the monomial basis $m_0, \ldots, m_n$, given by $m_i(x) = x^i$, and the Newton basis $n_0, \ldots, n_n$, given by $n_i(x) = \prod_{j < i} (x - x_j)$.
Divided differences are quantities that can be used (among other things) to compute the coefficients of $p$ in the Newton basis. We define the divided difference $f[x_0, \ldots, x_n]$ as the coefficient of $x^n$ in the monomial basis representation of $p$ (which for simplicity we will refer to as the “leading coefficient” of $p$ despite the fact that it may be zero). Below we present several properties of divided differences.
Coefficients of polynomial interpolant in Newton basis
$$p = \sum_{i=0}^{n} f[x_0, \ldots, x_i]\, n_i.$$

Proof. Write $p = \sum_{i=0}^{n} c_i n_i$. For each $k \in \{0, \ldots, n\}$, the polynomial $\sum_{i=0}^{k} c_i n_i$ interpolates $f$ at $x_0, \ldots, x_k$ since $n_i(x_j) = 0$ for all $i > j$. Clearly, its leading coefficient is $c_k$, so by definition, $c_k = f[x_0, \ldots, x_k]$. ∎
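Assuming the coefficients $f[x_0, \ldots, x_i]$ are already known (they can be computed with the recurrence given below), the Newton form admits a Horner-like nested evaluation. A minimal sketch; `newton_eval` is an assumed name:

```python
def newton_eval(coeffs, xs, x):
    """Evaluate p(x) = sum_i c_i * n_i(x), n_i(x) = prod_{j<i} (x - x_j),
    by nested multiplication:
    p(x) = c_0 + (x - x_0)(c_1 + (x - x_1)(c_2 + ...))."""
    p = coeffs[-1]
    for c, xj in zip(reversed(coeffs[:-1]), reversed(xs[:-1])):
        p = c + (x - xj) * p
    return p
```

Nested evaluation costs $n$ multiplications, compared to roughly $n^2/2$ when the basis polynomials are evaluated term by term.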
Notably, divided differences obey a recurrence relation that allows for their recursive computation.
Recurrence relation
For $n \geq 1$,

$$f[x_0, \ldots, x_n] = \frac{f[x_1, \ldots, x_n] - f[x_0, \ldots, x_{n-1}]}{x_n - x_0},$$

with the base case $f[x_0] = f(x_0)$.

Proof. Suppose that $n \geq 1$. If $q$ interpolates $f$ at $x_1, \ldots, x_n$ and $r$ interpolates $f$ at $x_0, \ldots, x_{n-1}$, then the polynomial $p$ given by

$$p(x) = \frac{(x - x_0)\, q(x) - (x - x_n)\, r(x)}{x_n - x_0}$$

interpolates $f$ at $x_0, \ldots, x_n$; comparing leading coefficients then yields the recurrence. For the base case, observe that the constant polynomial $f(x_0)$ interpolates $f$ at $x_0$. ∎
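The recurrence translates directly into a table-based computation of the Newton coefficients $f[x_0], f[x_0, x_1], \ldots, f[x_0, \ldots, x_n]$. A minimal sketch (the function name is ours), reusing `newton_eval` from above:

```python
def divided_differences(xs, fs):
    """Return [f[x_0], f[x_0,x_1], ..., f[x_0,...,x_n]] via the recurrence.
    After pass k, table[i] holds f[x_i, ..., x_{i+k}]."""
    n = len(xs)
    table = list(fs)  # base case: f[x_i] = f(x_i)
    coeffs = [table[0]]
    for k in range(1, n):
        for i in range(n - k):
            table[i] = (table[i + 1] - table[i]) / (xs[i + k] - xs[i])
        coeffs.append(table[0])
    return coeffs

# Interpolate f(x) = x^3 and evaluate the Newton form at 2.5.
xs = [0.0, 1.0, 2.0, 3.0]
cs = divided_differences(xs, [xi**3 for xi in xs])
print(newton_eval(cs, xs, 2.5))  # 15.625
```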
Interestingly, we observe that $p(x)$ is a convex combination of $q(x)$ and $r(x)$ for $x \in [x_0, x_n]$. The formula above also yields a recursive algorithm for evaluating $p$ known as Neville’s algorithm. Namely, if $p_{i,j}$ interpolates $f$ at $x_i, \ldots, x_j$ so that $p = p_{0,n}$, then

$$p_{i,j}(x) = \frac{(x - x_i)\, p_{i+1,j}(x) - (x - x_j)\, p_{i,j-1}(x)}{x_j - x_i},$$

where $p_{i,i} = f(x_i)$.
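Neville’s algorithm is equally short in code: it evaluates $p(x)$ directly from the $p_{i,j}$ recurrence without ever forming coefficients. A sketch under the same illustrative assumptions as above:

```python
def neville(xs, fs, x):
    """Evaluate p(x) = p_{0,n}(x) by Neville's algorithm.
    After pass k, vals[i] holds p_{i,i+k}(x)."""
    vals = list(fs)  # base case: p_{i,i}(x) = f(x_i)
    n = len(xs)
    for k in range(1, n):
        for i in range(n - k):
            j = i + k
            vals[i] = ((x - xs[i]) * vals[i + 1]
                       - (x - xs[j]) * vals[i]) / (xs[j] - xs[i])
    return vals[0]
```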
An explicit formula for divided differences can also be deduced by inspecting the Lagrange basis representation of $p$:

$$f[x_0, \ldots, x_n] = \sum_{i=0}^{n} \frac{f(x_i)}{\prod_{j \neq i} (x_i - x_j)}.$$
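This formula is easy to check numerically against the recurrence; a small sketch (names are ours), reusing `divided_differences` from above:

```python
def dd_explicit(xs, fs):
    """f[x_0,...,x_n] = sum_i f(x_i) / prod_{j != i} (x_i - x_j)."""
    total = 0.0
    for i, xi in enumerate(xs):
        denom = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                denom *= xi - xj
        total += fs[i] / denom
    return total

xs = [0.0, 1.0, 2.0, 3.0]
fs = [xi**3 for xi in xs]
print(dd_explicit(xs, fs), divided_differences(xs, fs)[-1])  # both 1.0
```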
Linearity
If $\alpha, \beta \in \mathbb{R}$, then

$$(\alpha f + \beta g)[x_0, \ldots, x_n] = \alpha\, f[x_0, \ldots, x_n] + \beta\, g[x_0, \ldots, x_n].$$

Proof. If $p$ and $q$ interpolate $f$ and $g$, respectively, at $x_0, \ldots, x_n$, then $\alpha p + \beta q$ interpolates $\alpha f + \beta g$ at $x_0, \ldots, x_n$. ∎
Symmetry
If $\sigma$ is a permutation of $\{0, \ldots, n\}$, then

$$f[x_{\sigma(0)}, \ldots, x_{\sigma(n)}] = f[x_0, \ldots, x_n].$$

Proof. In view of the definition above, this is immediate since $\{x_{\sigma(0)}, \ldots, x_{\sigma(n)}\} = \{x_0, \ldots, x_n\}$. ∎
Factor property
If $g(x) = (x - x_0)\, f(x)$ and $n \geq 1$, then

$$g[x_0, \ldots, x_n] = f[x_1, \ldots, x_n].$$

Proof. If $p$ interpolates $f$ at $x_1, \ldots, x_n$, then the polynomial $q$ given by $q(x) = (x - x_0)\, p(x)$ interpolates $g$ at $x_0, \ldots, x_n$. ∎
In fact, the recurrence relation (excluding the base case) can be derived solely from linearity, symmetry, and the factor property:

$$f[x_0, \ldots, x_n] = \frac{((x - x_0) f)[x_0, \ldots, x_n] - ((x - x_n) f)[x_0, \ldots, x_n]}{x_n - x_0} = \frac{f[x_1, \ldots, x_n] - f[x_0, \ldots, x_{n-1}]}{x_n - x_0},$$

where the first equality follows from linearity (the two functions in the numerator differ by $(x_n - x_0) f$) and the second from the factor property, applied to the second term after using symmetry to list $x_n$ first. Thus, these three properties along with the property $f[x_0] = f(x_0)$ determine the values of all divided differences.
The identity $f[x_0, x_1] = \frac{f(x_1) - f(x_0)}{x_1 - x_0}$ suggests a relationship between divided differences and derivatives: if, say, $x_0 < x_1$, $f \in C[x_0, x_1]$, and $f'$ exists on $(x_0, x_1)$, then the mean value theorem amounts to the assertion that $f[x_0, x_1] = f'(\xi)$ for some $\xi \in (x_0, x_1)$. This generalizes readily to divided differences and derivatives of higher order.
Mean value theorem
Let $a = \min_i x_i$ and $b = \max_i x_i$. If $f \in C[a, b]$ and $f^{(n)}$ exists on $(a, b)$, then

$$f[x_0, \ldots, x_n] = \frac{f^{(n)}(\xi)}{n!}$$

for some $\xi \in (a, b)$.

Proof. By symmetry, we may assume that $x_0 < x_1 < \cdots < x_n$. If $p$ interpolates $f$ at $x_0, \ldots, x_n$, then $f - p$ has $n + 1$ distinct zeroes in $[a, b]$. By (repeated applications of) Rolle’s theorem, $(f - p)^{(n)} = f^{(n)} - n!\, f[x_0, \ldots, x_n]$ has a zero $\xi \in (a, b)$. ∎
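As a quick numerical illustration (our own example, reusing `divided_differences` from above): with $f = \exp$ we have $f^{(n)} = \exp$, so $n!\, f[x_0, \ldots, x_n]$ must lie strictly between $e^a$ and $e^b$.

```python
import math

xs = [0.1, 0.4, 0.9, 1.3]
fs = [math.exp(xi) for xi in xs]
n = len(xs) - 1
val = math.factorial(n) * divided_differences(xs, fs)[-1]
print(math.exp(xs[0]) < val < math.exp(xs[-1]))  # True
```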
Polynomial interpolation error
Let $a = \min_i x_i$ and $b = \max_i x_i$. If $f \in C[a, b]$ and $f^{(n+1)}$ exists on $(a, b)$, then for all $x \in [a, b]$ there exists a $\xi \in (a, b)$ for which

$$f(x) - p(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!} \prod_{i=0}^{n} (x - x_i).$$

Proof. If $x \in \{x_0, \ldots, x_n\}$, the conclusion is trivial; otherwise, the polynomial $q$ given by $q(t) = p(t) + f[x_0, \ldots, x_n, x] \prod_{i=0}^{n} (t - x_i)$ interpolates $f$ at $x_0, \ldots, x_n, x$, so that evaluating at $t = x$ gives $f(x) - p(x) = f[x_0, \ldots, x_n, x] \prod_{i=0}^{n} (x - x_i)$, and the conclusion follows from the mean value theorem. ∎
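For instance (an illustrative check, not from the text): with $f = \sin$ we have $|f^{(4)}| \leq 1$, so the error of the cubic interpolant is bounded by $\left|\prod_i (x - x_i)\right| / 4!$. This reuses `neville` from the sketch above.

```python
import math

xs = [0.0, 1.0, 2.0, 3.0]
fs = [math.sin(xi) for xi in xs]
x = 1.5
err = abs(math.sin(x) - neville(xs, fs, x))
bound = abs(math.prod(x - xi for xi in xs)) / math.factorial(4)
print(err <= bound)  # True
```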
In the same vein, $f[x_0, x_1] \to f'(x_0)$ as $x_1 \to x_0$ if $f$ is differentiable at $x_0$, the geometric interpretation being that the slope of the secant line through $(x_0, f(x_0))$ and $(x_1, f(x_1))$ tends to that of the tangent line through $(x_0, f(x_0))$. This observation along with the identity

$$(fg)[x_0, x_1] = f[x_0]\, g[x_0, x_1] + f[x_0, x_1]\, g[x_1]$$

allows us to recover the product rule for derivatives. More generally, we have the following identity for divided differences.
Product rule
$$(fg)[x_0, \ldots, x_n] = \sum_{i=0}^{n} f[x_0, \ldots, x_i]\, g[x_i, \ldots, x_n].$$

Proof. If $q$ interpolates $g$ at $x_0, \ldots, x_n$, then $(fg)[x_0, \ldots, x_n] = (fq)[x_0, \ldots, x_n]$ since $fq$ agrees with $fg$ on $\{x_0, \ldots, x_n\}$. By symmetry (listing the nodes in reverse order), $q = \sum_{i=0}^{n} g[x_i, \ldots, x_n]\, \tilde{n}_i$, where $\tilde{n}_i(x) = \prod_{j > i} (x - x_j)$. By linearity and the factor property, we have

$$(fq)[x_0, \ldots, x_n] = \sum_{i=0}^{n} g[x_i, \ldots, x_n]\, (f \tilde{n}_i)[x_0, \ldots, x_n] = \sum_{i=0}^{n} f[x_0, \ldots, x_i]\, g[x_i, \ldots, x_n]. \quad ∎$$
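A numerical sanity check of this identity (our own example, with a small recursive helper):

```python
import math

def dd(xs, fn):
    """Divided difference fn[xs[0], ..., xs[-1]] via the recurrence."""
    if len(xs) == 1:
        return fn(xs[0])
    return (dd(xs[1:], fn) - dd(xs[:-1], fn)) / (xs[-1] - xs[0])

xs = [0.2, 0.5, 1.1]
f = lambda t: t * t
g = math.exp
lhs = dd(xs, lambda t: f(t) * g(t))
rhs = sum(dd(xs[:i + 1], f) * dd(xs[i:], g) for i in range(len(xs)))
print(math.isclose(lhs, rhs))  # True
```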
By Taylor’s theorem,

$$\Delta_h^n f(x) = f^{(n)}(x)\, h^n + o(h^n) \quad (h \to 0)$$

if $f$ is $n$ times differentiable at $x$, where $\Delta_h$ denotes the forward difference operator with step $h$ (that is, $\Delta_h f(x) = f(x + h) - f(x)$). As a result,

$$f[x, x + h, \ldots, x + nh] = \frac{\Delta_h^n f(x)}{n!\, h^n} \to \frac{f^{(n)}(x)}{n!} \quad (h \to 0),$$

and from the above we can derive the generalized product rule for derivatives,

$$(fg)^{(n)} = \sum_{i=0}^{n} \binom{n}{i} f^{(i)} g^{(n-i)}.$$
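To see the limit numerically (an illustrative sketch, with $f = \exp$ and $x = 0$ as our own choices): the divided difference at the equispaced nodes should approach $f^{(3)}(0)/3! = 1/6$.

```python
import math

def equispaced_dd(f, x, n, h):
    """f[x, x+h, ..., x+nh] computed via the recurrence."""
    nodes = [x + i * h for i in range(n + 1)]
    vals = [f(t) for t in nodes]
    for k in range(1, n + 1):
        vals = [(vals[i + 1] - vals[i]) / (nodes[i + k] - nodes[i])
                for i in range(len(vals) - 1)]
    return vals[0]

for h in (1e-1, 1e-2, 1e-3):
    print(h, equispaced_dd(math.exp, 0.0, 3, h))  # -> 1/3! ≈ 0.166667
```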