20 Richardson Extrapolation

Richardson extrapolation is a numerical method for improving truncated numerical approximations by estimating and eliminating their leading error terms.

Consider a variable \(A\) that is approximated by a numerical solution \(A_0(h)\), given a small perturbation \(h\) (\(0 < h < 1\)), and an error that can be expressed by a power series:

(6)\[ A = A_0(h) + a_0 h^{k_0} + a_1 h^{k_1} + \cdots \]

with \(k_0 < k_1\). As \(0 < h < 1\), the leading order of the local error is \(O(h^{k_0})\). To improve the approximations, we can eliminate the leading error term \(a_0 h^{k_0}\).

We choose a constant \(t > 1\) and express \(A\) in terms of the approximation at the smaller step \(h/t\):

(7)\[ A = A_0 \left( \tfrac{h}{t} \right) + a_0 \left( \tfrac{h}{t} \right)^{k_0} + a_1 \left( \tfrac{h}{t} \right)^{k_1} + \cdots \]

Now, taking \(t^{k_0}\) times (7) minus (6) eliminates the \(O(h^{k_0})\) term, and dividing through by \(t^{k_0} - 1\) gives:

\[ A = \frac{t^{k_0} A_0\left(\tfrac{h}{t}\right) - A_0(h)}{t^{k_0} - 1} + O\left(h^{k_1}\right) \]

This can be used as an updated approximation:

\[ A_1(h) = \frac{t^{k_0} A_0 \left( \tfrac{h}{t} \right) - A_0(h)}{t^{k_0} - 1} \]

which has a leading order of \(O\left(h^{k_1}\right)\) for its error.
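As a concrete sketch (the specific choices here are illustrative, not from the text), take the central difference quotient as \(A_0(h)\), whose error series contains only even powers of \(h\), so \(k_0 = 2\) and \(k_1 = 4\), and use \(t = 2\):

```python
import math

def central_diff(f, x, h):
    # Base approximation A_0(h) to f'(x); error series a_0 h^2 + a_1 h^4 + ...
    return (f(x + h) - f(x - h)) / (2 * h)

def richardson_step(f, x, h, t=2.0, k0=2):
    # A_1(h) = (t^{k0} A_0(h/t) - A_0(h)) / (t^{k0} - 1)
    a0_h = central_diff(f, x, h)
    a0_ht = central_diff(f, x, h / t)
    return (t**k0 * a0_ht - a0_h) / (t**k0 - 1)
```

For example, `richardson_step(math.sin, 1.0, 0.1)` approximates \(\cos(1)\) with an \(O(h^4)\) error, several orders of magnitude smaller than the \(O(h^2)\) error of the base central difference at the same \(h\).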

The error of \(A_1(h)\) can be estimated as:

\[\begin{align*} E_1(h) &= A_1(h) - A_0 \left(\tfrac{h}{t} \right)\\ &= \frac{A_0\left(\tfrac{h}{t}\right) - A_0(h)}{t^{k_0} - 1} \end{align*}\]

though note that this estimate itself has a leading order of \(O\left(h^{k_0}\right)\), while the true error of \(A_1(h)\) is \(O\left(h^{k_1}\right)\), so it is likely to over-estimate the error.
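This over-estimation can be observed numerically. Continuing the illustrative central-difference example (\(k_0 = 2\), \(t = 2\), with \(f = \sin\) and the known derivative \(\cos\) as reference):

```python
import math

def central_diff(f, x, h):
    # Base approximation A_0(h): error series a_0 h^2 + a_1 h^4 + ...
    return (f(x + h) - f(x - h)) / (2 * h)

t, k0 = 2.0, 2
f, x, h = math.sin, 1.0, 0.1

a0_h = central_diff(f, x, h)        # A_0(h)
a0_ht = central_diff(f, x, h / t)   # A_0(h/t)
a1_h = (t**k0 * a0_ht - a0_h) / (t**k0 - 1)  # A_1(h)

# Error estimate E_1(h) = (A_0(h/t) - A_0(h)) / (t^{k0} - 1)
e1 = (a0_ht - a0_h) / (t**k0 - 1)
# True error of A_1(h), available here because f' = cos is known
true_err = a1_h - math.cos(x)
```

Here `e1` is on the order of \(h^2\) while `true_err` is on the order of \(h^4\), so the estimate safely over-states the actual error of \(A_1(h)\).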

This method of improving the approximation can be applied recursively, with the improved approximations given by:

\[ A_{i+1} (h) = \frac{t^{k_i} A_i \left( \tfrac{h}{t} \right) - A_i(h)}{t^{k_i} - 1} \]

with a leading error of \(O(h^{k_{i + 1}})\). As before, the error can be estimated as:

\[ E_{i+1}(h) = A_{i+1}(h) - A_i \left( \tfrac{h}{t} \right) \]
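The recursion can be organised as a table: a first column of base approximations \(A_0(h/t^j)\), with each subsequent column eliminating the next error term. A minimal sketch, again assuming the illustrative central-difference case where the error exponents are \(k_i = 2, 4, 6, \ldots\):

```python
import math

def central_diff(f, x, h):
    # Base approximation A_0(h): error series in even powers of h
    return (f(x + h) - f(x - h)) / (2 * h)

def richardson(A0, h, t=2.0, levels=3, k0=2, dk=2):
    # First column: base approximations A_0(h / t^j), j = 0..levels
    col = [A0(h / t**j) for j in range(levels + 1)]
    k = k0
    for _ in range(levels):
        # A_{i+1}(h) = (t^{k_i} A_i(h/t) - A_i(h)) / (t^{k_i} - 1),
        # applied across the column; each pass shortens it by one entry
        col = [(t**k * col[j + 1] - col[j]) / (t**k - 1)
               for j in range(len(col) - 1)]
        k += dk
    return col[0]
```

With three levels of extrapolation, `richardson(lambda h: central_diff(math.sin, 1.0, h), 0.4)` approximates \(\cos(1)\) far more accurately than the base central difference at \(h = 0.4\), despite using only four evaluations of \(A_0\).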