Last modified: December 22, 2024

This article is written in: 🇺🇸

Gaussian Interpolation

Gaussian Interpolation, often associated with Gauss’s forward and backward interpolation formulas, is a technique that refines the approach of polynomial interpolation when data points are equally spaced. Instead of using the Newton forward or backward interpolation formulas directly from one end of the data interval, Gaussian interpolation centers the interpolation around a midpoint of the data set. This approach can provide better accuracy when the point at which we need to interpolate lies somewhere in the "interior" of the given data points rather than near the boundaries.

In essence, Gaussian interpolation is a variant of Newton’s divided difference interpolation but employs a "central" reference point and finite differences structured around a central node. By choosing a midpoint as a reference and using appropriately shifted indices, Gaussian interpolation formulas often yield more stable and accurate approximations for values near the center of the data set.

Imagine you have a set of equally spaced points $x_0, x_1, \ldots, x_n$ and corresponding function values $f(x_0), f(x_1), \ldots, f(x_n)$.

Newton’s forward or backward interpolation builds a polynomial starting from one end (like $x_0$ or $x_n$). Gaussian interpolation, however, selects a point near the center of the interval, say $x_m$ (the midpoint), and builds the interpolation polynomial outward from this center. This symmetric approach can lead to a polynomial that better represents the function near that central area, potentially reducing error.

Mathematical Formulation

Assume we have a set of equally spaced data points:

$$x_0, x_1, x_2, \ldots, x_n$$

with spacing $h = x_{i+1} - x_i$. Let the central node be $x_m$, where $m \approx n/2$ (the exact middle node when $n$ is even). For convenience, we define a shifted variable:

$$t = \frac{x - x_m}{h}$$

The function values are $y_i = f(x_i)$. We then use central (forward and backward) differences around $x_m$ to construct the interpolation polynomial. The polynomial takes a form that involves binomial-type expansions with central differences, such as:

Gauss’s Forward Interpolation Formula (used when $x$ lies just to the right of the chosen center $x_m$, i.e. $0 < t < 1$):

$$f(x) \approx f(x_m) + t \Delta f(x_m) + \frac{t(t-1)}{2!}\Delta^2 f(x_{m-1}) + \frac{t(t+1)(t-1)}{3!}\Delta^3 f(x_{m-1}) + \cdots$$

Gauss’s Backward Interpolation Formula (used when $x$ lies just to the left of the chosen center $x_m$, i.e. $-1 < t < 0$):

$$f(x) \approx f(x_m) + t \nabla f(x_m) + \frac{t(t+1)}{2!}\nabla^2 f(x_{m+1}) + \frac{t(t+1)(t-1)}{3!}\nabla^3 f(x_{m+1}) + \cdots$$

Here $\Delta$ and $\nabla$ denote forward and backward difference operators, respectively, and the differences are computed around the central index.

The exact form depends on whether forward or backward differences are used and on how the center is chosen. The key point is that the polynomial is expressed in terms of $t$ and central differences, with symmetric, binomial-like coefficients in $t$.
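As an illustrative sketch (not tied to any particular library), the forward formula above can be transcribed term by term in Python; the hypothetical function below simply evaluates the expansion through third-order differences, given the required differences as inputs:

```python
def gauss_forward_3(f_m, d1_m, d2_m1, d3_m1, t):
    """Evaluate the Gauss forward formula through third-order differences.

    f_m   : f(x_m), the value at the central node
    d1_m  : first difference  Δf(x_m)
    d2_m1 : second difference Δ²f(x_{m-1})
    d3_m1 : third difference  Δ³f(x_{m-1})
    t     : (x - x_m) / h
    """
    return (f_m
            + t * d1_m
            + t * (t - 1) / 2 * d2_m1
            + (t + 1) * t * (t - 1) / 6 * d3_m1)
```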

Derivation

I. Starting from Equally Spaced Points:

Given $f(x_0), f(x_1), \ldots, f(x_n)$ at points equally spaced by $h$, define:

$$\Delta f(x_i) = f(x_{i+1}) - f(x_i)$$

and higher-order differences:

$$\Delta^2 f(x_i) = \Delta f(x_{i+1}) - \Delta f(x_i)$$

and so forth.

II. Choosing a Central Point:

Let $x_m$ be the chosen "central" point around which we will build the polynomial. Introduce $t = (x - x_m)/h$ to measure how far $x$ is from $x_m$ in terms of step size $h$.

III. Constructing the Polynomial:

Using Taylor-like expansions of forward or backward differences about the midpoint, you can derive a polynomial that expresses $f(x)$ in terms of $f(x_m)$, the central differences ($\Delta^k f$ or $\nabla^k f$) at points around $x_m$, and binomial-like terms in $t$.

IV. Symmetry and Binomial Coefficients:

The resulting terms often involve products like $t(t-1)$, $t(t+1)$, and factorial denominators, mirroring expansions from Newton’s forward interpolation but recentered so that the polynomial captures behavior near the center more accurately.
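For reference, the forward expansion can be written compactly using generalized binomial coefficients in $t$, which makes this binomial structure explicit:

$$f(x_m + th) \approx f(x_m) + \binom{t}{1}\Delta f(x_m) + \binom{t}{2}\Delta^2 f(x_{m-1}) + \binom{t+1}{3}\Delta^3 f(x_{m-1}) + \binom{t+1}{4}\Delta^4 f(x_{m-2}) + \cdots$$

where $\binom{t}{k} = \frac{t(t-1)\cdots(t-k+1)}{k!}$.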

Algorithm Steps

Input:

Equally spaced nodes $x_0, x_1, \ldots, x_n$ with spacing $h$, the corresponding values $y_i = f(x_i)$, and the target point $x$ at which to interpolate.

I. Identify the Central Point:

Choose $x_m$ near the middle of the table and as close as possible to $x$, then compute the shifted variable $t = (x - x_m)/h$.

II. Compute Central Differences:

Build the finite difference table and read off the differences centered on $x_m$, such as $\Delta f(x_m)$, $\Delta^2 f(x_{m-1})$, $\Delta^3 f(x_{m-1})$, and so on (or their backward counterparts).

III. Apply Gaussian Formula:

Substitute the central differences and the value of $t$ into the chosen Gaussian interpolation formula (forward or backward) to compute $f(x)$.

IV. Calculate Interpolated Value:

Sum the terms up to the desired order of approximation. More terms yield higher accuracy.
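The steps above can be collected into a short Python sketch (assuming equally spaced nodes; the function names `forward_differences` and `gauss_interpolate` are illustrative, not from any particular library). It builds the forward-difference table and then accumulates Gauss forward terms when $t \ge 0$ or Gauss backward terms when $t < 0$:

```python
import numpy as np

def forward_differences(y):
    """Rows of the forward-difference table: row k holds Δ^k f(x_0), Δ^k f(x_1), ..."""
    rows = [np.asarray(y, dtype=float)]
    while len(rows[-1]) > 1:
        rows.append(np.diff(rows[-1]))
    return rows

def gauss_interpolate(x_vals, y_vals, x, m=None):
    """Gauss central-difference interpolation about the node x_m (default: middle node)."""
    x_vals = np.asarray(x_vals, dtype=float)
    h = x_vals[1] - x_vals[0]
    m = len(x_vals) // 2 if m is None else m
    t = (x - x_vals[m]) / h
    rows = forward_differences(y_vals)

    result = rows[0][m]   # f(x_m)
    coeff = 1.0           # running product of the binomial-type factors, divided by k!
    for k in range(1, len(rows)):
        if t >= 0:        # Gauss forward: factors t, t-1, t+1, t-2, ...
            factor = t + k // 2 if k % 2 else t - k // 2
            idx = m - k // 2              # uses Δ^k f(x_{m - floor(k/2)})
        else:             # Gauss backward: factors t, t+1, t-1, t+2, ...
            factor = t - (k - 1) // 2 if k % 2 else t + k // 2
            idx = m - (k + 1) // 2        # uses Δ^k f(x_{m - ceil(k/2)})
        if idx < 0 or idx >= len(rows[k]):
            break         # no more usable differences on this side of the table
        coeff *= factor / k
        result += coeff * rows[k][idx]
    return result
```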

Example

Given Data: Suppose we have points with spacing $h=1$:

$$x_0=0, \quad x_1=1, \quad x_2=2, \quad x_3=3, \quad x_4=4$$

and function values:

$$f(0)=2, f(1)=3.5, f(2)=5, f(3)=5.8, f(4)=6$$

Assume we pick $x_2=2$ as the central point ($m=2$). We want to interpolate $f(1.5)$.

I. Compute differences around $x_2=2$:

First differences: $\Delta f(0)=1.5$, $\Delta f(1)=1.5$, $\Delta f(2)=0.8$, $\Delta f(3)=0.2$. Second differences: $\Delta^2 f(0)=0$, $\Delta^2 f(1)=-0.7$, $\Delta^2 f(2)=-0.6$. Third differences: $\Delta^3 f(0)=-0.7$, $\Delta^3 f(1)=0.1$. Fourth difference: $\Delta^4 f(0)=0.8$.

II. Compute $t=(1.5-2)/1=-0.5$.

III. Apply Gauss’s formula. Since $t=-0.5$ is negative (the target lies to the left of the central point $x_2$), the backward form given earlier is the natural choice:

$$f(1.5) \approx f(2) + t\nabla f(2) + \frac{t(t+1)}{2!}\nabla^2 f(3) + \frac{t(t+1)(t-1)}{3!}\nabla^3 f(3)$$

Inserting $\nabla f(2)=1.5$, $\nabla^2 f(3)=-0.7$, $\nabla^3 f(3)=-0.7$, and $t=-0.5$ gives the terms $5 - 0.75 + 0.0875 - 0.04375$.

IV. Evaluate: summing the terms gives $f(1.5) \approx 4.294$. Carrying the expansion one more term (the fourth-order difference $\nabla^4 f(4)=0.8$) refines this to about $4.31$.

(Note: with the full difference table in hand, these substitutions are mechanical; higher-order terms are added in exactly the same pattern.)
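Using the `gauss_interpolate` sketch from the Algorithm Steps section above, the same example can be checked numerically (this value includes every available difference order, up to the fourth):

```python
x_vals = [0, 1, 2, 3, 4]
y_vals = [2, 3.5, 5, 5.8, 6]

# t = (1.5 - 2) / 1 = -0.5 < 0, so the backward form about x_2 = 2 is used.
print(gauss_interpolate(x_vals, y_vals, 1.5, m=2))   # ≈ 4.3125
```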

Advantages

I. Improved Accuracy Near the Center:

When the interpolation point $x$ is near the midpoint, Gaussian interpolation often yields less error compared to simple forward or backward Newton interpolation from the endpoints.

II. Symmetric Structure:

The formula’s symmetric form around a central point can produce more stable numerical results.

III. Adaptable:

You can choose which direction (forward/backward) and how many terms to include, balancing complexity and accuracy.

Limitations

I. Requires Equally Spaced Points:

Gaussian interpolation formulas are traditionally derived for equally spaced data. If spacing is uneven, this method is not directly applicable.

II. More Complex Setup:

Determining the central point and computing central differences can be more involved than direct Newton forward/backward interpolation.

III. Limited Gain if Not Near Center:

If the interpolation point is not near the data set’s midpoint, there may be no significant advantage over standard methods.
