Subject classification: this is a mathematics resource.

## Definitions

Linear multistep methods are used for the numerical solution of ordinary differential equations, in particular the initial value problem

${\displaystyle y'=f(t,y)\quad {\text{with}}\quad y(t_{0})=y_{0}\,.}$

The Adams-Bashforth methods and Adams-Moulton methods are described on the Linear multistep method page.

## Derivation

There are (at least) two ways to derive the Adams-Bashforth methods and Adams-Moulton methods. We will demonstrate both approaches, polynomial interpolation and Taylor's theorem, for the two-step Adams-Bashforth method.

### Deriving the two-step Adams-Bashforth method by polynomial interpolation

From the w:Fundamental theorem of calculus we get

${\displaystyle y(t_{n+1})=y(t_{n})+\int _{t_{n}}^{t_{n+1}}y'(t)\,dt\,.}$

(1)

Set

${\displaystyle A=\int _{t_{n}}^{t_{n+1}}y'(t)\,dt=\int _{t_{n}}^{t_{n+1}}f(t,y(t))\,dt.}$

(2)

To approximate the value of ${\displaystyle A}$, we can replace ${\displaystyle f(t,y(t))}$ by an interpolating polynomial ${\displaystyle P(t)}$. The interpolation polynomial in Lagrange form is the linear combination

${\displaystyle L(x):=\sum _{j=0}^{k}y_{j}\ell _{j}(x)}$

where

${\displaystyle \ell _{j}(x):=\prod _{\begin{smallmatrix}0\leq m\leq k\\m\neq j\end{smallmatrix}}{\frac {x-x_{m}}{x_{j}-x_{m}}}\,.}$
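As a quick illustration (not part of the derivation), the defining property of the Lagrange basis, that ${\displaystyle \ell _{j}(x_{m})}$ equals 1 when ${\displaystyle m=j}$ and 0 otherwise, can be checked numerically. The short Python sketch below uses three arbitrary example nodes; this property is what makes ${\displaystyle L(x)}$ interpolate the data.

```python
# Check that the Lagrange basis polynomials satisfy l_j(x_m) = delta_{jm},
# i.e. each basis function is 1 at its own node and 0 at all the others.
def lagrange_basis(xs, j, x):
    """Evaluate the j-th Lagrange basis polynomial for nodes xs at x."""
    val = 1.0
    for m, xm in enumerate(xs):
        if m != j:
            val *= (x - xm) / (xs[j] - xm)
    return val

nodes = [0.0, 1.0, 2.0]  # example nodes; any distinct values work
table = [[lagrange_basis(nodes, j, xm) for xm in nodes] for j in range(3)]
print(table)  # the identity matrix: 1 on the diagonal, 0 elsewhere
```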

Interpolating ${\displaystyle f(t,y(t))}$ at the two nodes ${\displaystyle t_{n-1}}$ and ${\displaystyle t_{n}}$ gives

${\displaystyle P(t)=f(t_{n},y_{n}){\frac {t-t_{n-1}}{t_{n}-t_{n-1}}}+f(t_{n-1},y_{n-1}){\frac {t-t_{n}}{t_{n-1}-t_{n}}}\,.}$

Thus, (2) becomes

${\displaystyle A=\int _{t_{n}}^{t_{n+1}}f(t,y(t))\,dt\approx \int _{t_{n}}^{t_{n+1}}P(t)\,dt=\int _{t_{n}}^{t_{n+1}}\left[f(t_{n},y_{n}){\frac {t-t_{n-1}}{t_{n}-t_{n-1}}}+f(t_{n-1},y_{n-1}){\frac {t-t_{n}}{t_{n-1}-t_{n}}}\right]dt}$

(3)

Integrating and simplifying, the right-hand side of equation (3) becomes

${\displaystyle {\begin{array}{rcl}&&{\frac {1}{2}}(t_{n}+t_{n+1})(f(t_{n},y_{n})-f(t_{n-1},y_{n-1}))-t_{n-1}f(t_{n},y_{n})+t_{n}f(t_{n-1},y_{n-1})\\\\&=&{\frac {1}{2}}(t_{n}+t_{n+1}-2t_{n-1})f(t_{n},y_{n})+{\frac {1}{2}}(t_{n}-t_{n+1})f(t_{n-1},y_{n-1})\,.\end{array}}}$

Since ${\displaystyle t_{n-1}}$, ${\displaystyle t_{n}}$ and ${\displaystyle t_{n+1}}$ are equally spaced, we have ${\displaystyle t_{n}-t_{n-1}=t_{n+1}-t_{n}=h}$. Therefore, the value of ${\displaystyle A}$ is

${\displaystyle A={\frac {3}{2}}hf(t_{n},y_{n})-{\frac {1}{2}}hf(t_{n-1},y_{n-1})}$

Substituting this value into (1) yields

${\displaystyle y(t_{n+1})\approx y(t_{n})+{\frac {3}{2}}hf(t_{n},y_{n})-{\frac {1}{2}}hf(t_{n-1},y_{n-1})\,.}$

Replacing the exact values ${\displaystyle y(t_{n})}$ by the numerical approximations ${\displaystyle y_{n}}$, the equation ${\displaystyle y_{n+1}=y_{n}+{\frac {3}{2}}hf(t_{n},y_{n})-{\frac {1}{2}}hf(t_{n-1},y_{n-1})}$ is the two-step Adams-Bashforth method.
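The coefficients ${\displaystyle {\tfrac {3}{2}}}$ and ${\displaystyle -{\tfrac {1}{2}}}$ can be cross-checked by integrating the two linear Lagrange basis functions exactly. Below is a short Python sketch using exact rational arithmetic; the equally spaced nodes ${\displaystyle t_{n-1}=0}$, ${\displaystyle t_{n}=1}$, ${\displaystyle t_{n+1}=2}$ (so ${\displaystyle h=1}$) are an arbitrary choice for illustration.

```python
from fractions import Fraction

# Cross-check of the coefficients 3/2 and -1/2: integrate the two linear
# Lagrange basis functions exactly over [t_n, t_{n+1}], with the equally
# spaced example nodes t_{n-1} = 0, t_n = 1, t_{n+1} = 2 (so h = 1).
t_nm1, t_n, t_np1 = Fraction(0), Fraction(1), Fraction(2)

def integral(a, b, c0, c1):
    """Exact integral of the linear function c0 + c1*t over [a, b]."""
    return c0 * (b - a) + c1 * (b * b - a * a) / 2

# Weight of f(t_n, y_n): integrate (t - t_{n-1}) / (t_n - t_{n-1}).
d = t_n - t_nm1
w_n = integral(t_n, t_np1, -t_nm1 / d, 1 / d)

# Weight of f(t_{n-1}, y_{n-1}): integrate (t - t_n) / (t_{n-1} - t_n).
d = t_nm1 - t_n
w_nm1 = integral(t_n, t_np1, -t_n / d, 1 / d)

print(w_n, w_nm1)  # prints: 3/2 -1/2
```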

### Deriving the two-step Adams-Bashforth method by Taylor's theorem

To simplify, let's set ${\displaystyle f_{k}=f(t_{k},y_{k})}$. Then the general form of the ${\displaystyle r}$-step Adams-Bashforth method is

${\displaystyle y_{n+r}=y_{n+r-1}+h\sum _{k=1}^{r}\lambda _{k}f_{n+r-k}}$

(4)

where ${\displaystyle \sum _{k=1}^{r}\lambda _{k}=1}$. For the two-step Adams-Bashforth method, set ${\displaystyle \lambda _{1}=1-\lambda }$ and ${\displaystyle \lambda _{2}=\lambda }$. Then (4) becomes

${\displaystyle {\begin{aligned}y_{n+2}&=y_{n+1}+h((1-\lambda )f_{n+1}+\lambda f_{n})\\&=y(t_{n+1})+h((1-\lambda )y'(t_{n+1})+\lambda y'(t_{n}))\,,\end{aligned}}}$

where the previous values are treated as exact, so ${\displaystyle y_{n+1}=y(t_{n+1})}$, ${\displaystyle f_{n+1}=y'(t_{n+1})}$ and ${\displaystyle f_{n}=y'(t_{n})}$.

By Taylor's theorem, expand ${\displaystyle y'(t_{n})}$ about ${\displaystyle t_{n+1}}$ to get

${\displaystyle y_{n+2}=y(t_{n+1})+h{\big (}(1-\lambda )y'(t_{n+1})+\lambda (y'(t_{n+1})-hy''(t_{n+1})+O(h^{2})){\big )}\,.}$

Thus, the simplified form is

${\displaystyle y_{n+2}=y(t_{n+1})+hy'(t_{n+1})-\lambda h^{2}y''(t_{n+1})+O(h^{3})}$

(5)

Expanding ${\displaystyle y(t_{n+2})}$ about ${\displaystyle t_{n+1}}$ yields

${\displaystyle y(t_{n+2})=y(t_{n+1})+hy'(t_{n+1})+{\frac {1}{2}}h^{2}y''(t_{n+1})+O(h^{3})\,.}$

(6)

Subtracting (5) from (6) and requiring the ${\displaystyle h^{2}}$ terms to cancel gives ${\displaystyle \lambda =-{\frac {1}{2}}}$, and hence ${\displaystyle 1-\lambda ={\frac {3}{2}}}$. The two-step Adams-Bashforth method is then

${\displaystyle y_{n+2}=y_{n+1}+{\frac {3}{2}}hf(t_{n+1},y_{n+1})-{\frac {1}{2}}hf(t_{n},y_{n})}$

Since

${\displaystyle y(t_{n+2})-y_{n+2}=O(h^{3})\,,}$

the local truncation error is of order ${\displaystyle O(h^{3})}$, and thus the method is second order. (See w:Linear multistep method#Consistency and order and w:Truncation error.)
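The claimed second order can also be observed empirically. The Python sketch below is an illustration under stated assumptions: the test problem ${\displaystyle y'=y}$, ${\displaystyle y(0)=1}$ is an arbitrary example, and the second starting value is bootstrapped with a single Heun step, which is one common choice. Halving ${\displaystyle h}$ should roughly quarter the error at ${\displaystyle t=1}$.

```python
import math

def ab2(f, t0, y0, h, n):
    """Two-step Adams-Bashforth for y' = f(t, y) over n steps of size h.

    The method needs two starting values; the second is bootstrapped with
    a single Heun (second-order Runge-Kutta) step so that the overall
    second-order accuracy is not degraded.
    """
    t, y = [t0], [y0]
    k1 = f(t0, y0)
    y.append(y0 + h / 2 * (k1 + f(t0 + h, y0 + h * k1)))  # Heun step
    t.append(t0 + h)
    for k in range(1, n):
        y.append(y[k] + h * (1.5 * f(t[k], y[k]) - 0.5 * f(t[k - 1], y[k - 1])))
        t.append(t[k] + h)
    return t, y

# Error at t = 1 for y' = y, y(0) = 1 (exact solution: e^t).
errs = []
for n in (20, 40, 80):
    _, y = ab2(lambda t, y: y, 0.0, 1.0, 1.0 / n, n)
    errs.append(abs(y[-1] - math.e))

# Halving h should roughly quarter the error for a second-order method.
print(errs[0] / errs[1], errs[1] / errs[2])  # both ratios close to 4
```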

## Exercises

1. Derive the three-step Adams-Bashforth method by using polynomial interpolation.

2. Derive the second-order Adams-Moulton method by using Taylor's theorem.

## Predictor–corrector method

To solve an ordinary differential equation (ODE), a w:Predictor–corrector method proceeds in two steps. First, the prediction step calculates a rough approximation of the desired quantity, typically using an explicit method. Second, the corrector step refines the initial approximation by other means, typically an implicit method.

Here we mainly discuss using the Adams-Bashforth and Adams-Moulton methods as a pair to construct a predictor–corrector method.

Let's start with the two-step Adams methods. The predictor step uses the two-step Adams-Bashforth method:

${\displaystyle P_{n+1}=y_{n}+{\frac {3}{2}}hf(t_{n},y_{n})-{\frac {1}{2}}hf(t_{n-1},y_{n-1})}$

Then the corrector step applies the second-order Adams-Moulton method (the trapezoidal rule), with the predicted value ${\displaystyle P_{n+1}}$ substituted into the implicit right-hand side:

${\displaystyle y_{n+1}=y_{n}+{\tfrac {1}{2}}h{\big (}f(t_{n+1},P_{n+1})+f(t_{n},y_{n}){\big )}}$
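A minimal Python sketch of this two-step predictor-corrector pair follows (the MATLAB program later in this resource implements the four-step version). The test problem ${\displaystyle y'=y}$, ${\displaystyle y(0)=1}$ is an assumed example, and bootstrapping the second starting value with a Heun step is one possible choice.

```python
import math

def pc2(f, t0, y0, h, n):
    """Two-step Adams-Bashforth predictor with trapezoidal corrector."""
    t, y = [t0], [y0]
    # Bootstrap the second starting value with one Heun step.
    k1 = f(t0, y0)
    y.append(y0 + h / 2 * (k1 + f(t0 + h, y0 + h * k1)))
    t.append(t0 + h)
    for k in range(1, n):
        # Predict with the two-step Adams-Bashforth formula ...
        p = y[k] + h * (1.5 * f(t[k], y[k]) - 0.5 * f(t[k - 1], y[k - 1]))
        # ... then correct with the trapezoidal rule, evaluating f at the
        # predicted value instead of solving the implicit equation.
        y.append(y[k] + h / 2 * (f(t[k] + h, p) + f(t[k], y[k])))
        t.append(t[k] + h)
    return t, y

_, y = pc2(lambda t, y: y, 0.0, 1.0, 0.05, 20)
print(abs(y[-1] - math.e))  # small: the pair is second-order accurate
```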

Similarly, pairing the four-step Adams-Bashforth method with the fourth-order Adams-Moulton corrector gives the predictor-corrector formulas:

${\displaystyle {\begin{cases}P_{n+1}=y_{n}+{\frac {h}{24}}(55f(t_{n},y_{n})-59f(t_{n-1},y_{n-1})+37f(t_{n-2},y_{n-2})-9f(t_{n-3},y_{n-3}))\\y_{n+1}=y_{n}+{\frac {h}{24}}(9f(t_{n+1},P_{n+1})+19f(t_{n},y_{n})-5f(t_{n-1},y_{n-1})+f(t_{n-2},y_{n-2}))\\\end{cases}}}$

Note that the four-step Adams-Bashforth method needs four initial values to start the calculation. These must be obtained by other means, for example with a Runge-Kutta method.
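The predictor and corrector coefficients above (55, −59, 37, −9 and 9, 19, −5, 1, all over 24) can be cross-checked by exactly integrating the cubic Lagrange basis polynomials, in the same spirit as the interpolation derivation earlier. The Python sketch below uses rational arithmetic and measures ${\displaystyle t}$ in units of ${\displaystyle h}$ with ${\displaystyle t_{n}=0}$, a convention chosen for illustration.

```python
from fractions import Fraction

def weights(nodes, a, b):
    """Exact integrals over [a, b] of the Lagrange basis polynomials
    for the given interpolation nodes (all quantities in units of h)."""
    ws = []
    for j, xj in enumerate(nodes):
        # Build the coefficient list of l_j(x), constant term first.
        coeffs = [Fraction(1)]
        for m, xm in enumerate(nodes):
            if m == j:
                continue
            d = Fraction(xj - xm)
            new = [Fraction(0)] * (len(coeffs) + 1)
            for i, c in enumerate(coeffs):
                new[i + 1] += c / d   # multiply by x / (xj - xm)
                new[i] -= xm * c / d  # multiply by -xm / (xj - xm)
            coeffs = new
        # Integrate the polynomial exactly from a to b.
        ws.append(sum(c * (Fraction(b) ** (i + 1) - Fraction(a) ** (i + 1)) / (i + 1)
                      for i, c in enumerate(coeffs)))
    return ws

# Predictor: interpolate f at t_n, t_{n-1}, t_{n-2}, t_{n-3} (x = 0, -1, -2, -3)
# and integrate over [t_n, t_{n+1}] (x in [0, 1]).
ab4 = weights([0, -1, -2, -3], 0, 1)  # 55/24, -59/24, 37/24, -9/24
# Corrector: interpolate f at t_{n+1}, t_n, t_{n-1}, t_{n-2} (x = 1, 0, -1, -2).
am4 = weights([1, 0, -1, -2], 0, 1)   # 9/24, 19/24, -5/24, 1/24
print(ab4, am4)
```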

### MATLAB program

The MATLAB code for the four-step Adams predictor-corrector method is:

```matlab
function [x, y] = maadams4(dyfun, xspan, y0, h)
% Use the four-step Adams predictor-corrector method to solve
% the ODE y' = f(x,y), y(x0) = y0.
% Inputs:  dyfun -- the function f(x,y) (function handle)
%          xspan -- the interval [x0, xn]
%          y0    -- the initial value
%          h     -- the step size
% Outputs: x, y  -- the nodes and the approximate values of y
x = xspan(1):h:xspan(2);
% Use the Runge-Kutta method to get the four starting values.
[xx, yy] = marunge(dyfun, [x(1), x(4)], y0, h);
y(1:4) = yy(1:4);
for n = 4:(length(x) - 1)
    % Predictor: four-step Adams-Bashforth.
    p = y(n) + h/24*(55*feval(dyfun, x(n), y(n)) - 59*feval(dyfun, x(n-1), y(n-1)) ...
        + 37*feval(dyfun, x(n-2), y(n-2)) - 9*feval(dyfun, x(n-3), y(n-3)));
    % Corrector: Adams-Moulton, with f evaluated at the predicted value.
    y(n+1) = y(n) + h/24*(9*feval(dyfun, x(n+1), p) + 19*feval(dyfun, x(n), y(n)) ...
        - 5*feval(dyfun, x(n-1), y(n-1)) + feval(dyfun, x(n-2), y(n-2)));
end
x = x'; y = y';
```



The MATLAB code for the classical fourth-order Runge-Kutta method is:

```matlab
function [x, y] = marunge(dyfun, xspan, y0, h)
% Classical fourth-order Runge-Kutta method for y' = f(x,y), y(x0) = y0.
x = xspan(1):h:xspan(2);
y(1) = y0;
for n = 1:(length(x) - 1)
    k1 = feval(dyfun, x(n), y(n));
    k2 = feval(dyfun, x(n) + h/2, y(n) + h/2*k1);
    k3 = feval(dyfun, x(n) + h/2, y(n) + h/2*k2);
    k4 = feval(dyfun, x(n+1), y(n) + h*k3);
    y(n+1) = y(n) + h*(k1 + 2*k2 + 2*k3 + k4)/6;
end
x = x'; y = y';
```