# Numerical Analysis/Differentiation/Examples

When deriving a finite difference approximation of the ${\displaystyle j}$th derivative of a function ${\displaystyle f:\mathbb {R} \rightarrow \mathbb {R} }$, we wish to find ${\displaystyle a_{1},a_{2},...,a_{n}\in \mathbb {R} }$ and ${\displaystyle b_{1},b_{2},...,b_{n}\in \mathbb {R} }$ such that

${\displaystyle f^{(j)}(x_{0})=h^{-j}\sum _{i=1}^{n}a_{i}f(x_{0}+b_{i}h)+O(h^{k}){\text{ as }}h\to 0}$

or, equivalently,

${\displaystyle h^{-j}\sum _{i=1}^{n}a_{i}f(x_{0}+b_{i}h)=f^{(j)}(x_{0})+O(h^{k}){\text{ as }}h\to 0}$

where ${\displaystyle O(h^{k})}$ is the error, the difference between the correct answer and the approximation, expressed using Big-O notation. Because ${\displaystyle h}$ may be presumed to be small, a larger value for ${\displaystyle k}$ is better than a smaller value.

A general method for finding the coefficients is to generate the Taylor expansion of ${\displaystyle h^{-j}\sum _{i=1}^{n}a_{i}f(x_{0}+b_{i}h)}$ and choose ${\displaystyle a_{1},a_{2},...,a_{n}}$ and ${\displaystyle b_{1},b_{2},...,b_{n}}$ such that ${\displaystyle f^{(j)}(x_{0})}$ and the remainder term are the only non-zero terms. If there are no such coefficients, a smaller value for ${\displaystyle k}$ must be chosen.
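As a brief worked instance of this procedure (our own illustration, using two nodes with ${\displaystyle b_{1}=0}$ and ${\displaystyle b_{2}=1}$ to approximate the first derivative): Taylor's theorem gives

${\displaystyle h^{-1}\left(a_{1}f(x_{0})+a_{2}f(x_{0}+h)\right)=h^{-1}(a_{1}+a_{2})f(x_{0})+a_{2}f'(x_{0})+{\frac {a_{2}h}{2}}f''(x_{0})+O(h^{2})\,.}$

Choosing ${\displaystyle a_{1}=-1}$ and ${\displaystyle a_{2}=1}$ eliminates the ${\displaystyle h^{-1}}$ term and yields the forward difference ${\displaystyle {\frac {f(x_{0}+h)-f(x_{0})}{h}}=f'(x_{0})+O(h)}$, that is, ${\displaystyle k=1}$. No choice of ${\displaystyle a_{1},a_{2}}$ also makes the ${\displaystyle f''}$ term vanish while keeping the ${\displaystyle f'}$ term, so ${\displaystyle k=2}$ is unattainable with these two nodes.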

For a function of ${\displaystyle m}$ variables ${\displaystyle g:\mathbb {R} ^{m}\rightarrow \mathbb {R} }$, the procedure is similar, except ${\displaystyle x_{0},b_{1},b_{2},...,b_{n}}$ are replaced by points in ${\displaystyle \mathbb {R} ^{m}}$ and the multivariate extension of Taylor's theorem is used.

## Single-Variable

In all single-variable examples, ${\displaystyle x_{0}\in \mathbb {R} }$ and ${\displaystyle f:\mathbb {R} \rightarrow \mathbb {R} }$ are unknown, and ${\displaystyle h\in \mathbb {R} }$ is small. Additionally, let ${\displaystyle f}$ be 5 times continuously differentiable on ${\displaystyle \mathbb {R} }$.

### First Derivative

Find ${\displaystyle a,b,c\in \mathbb {R} }$ such that ${\displaystyle {\frac {af(x_{0}+h)+bf(x_{0}+ch)}{h}}}$ best approximates ${\displaystyle f'(x_{0})}$.

Let ${\displaystyle f:\mathbb {R} \rightarrow \mathbb {R} }$ be 42 times continuously differentiable on ${\displaystyle \mathbb {R} }$. Find the largest ${\displaystyle n\in \mathbb {N} }$ such that

${\displaystyle {\frac {df}{dx}}(x_{0})={\frac {-f(x_{0}+2h)+8f(x_{0}+h)-8f(x_{0}-h)+f(x_{0}-2h)}{12h}}+O(h^{n}){\text{ as }}h\to 0}$

In other words, find the order of the error of the method.

### Second Derivative

Find ${\displaystyle a,b,c\in \mathbb {R} }$ such that ${\displaystyle {\frac {af(x_{0}-h)+bf(x_{0})+cf(x_{0}+h)}{h^{2}}}}$ best approximates ${\displaystyle f''(x_{0})}$.

## Multivariate

In all two-variable examples, ${\displaystyle x_{0},y_{0}\in \mathbb {R} }$ and ${\displaystyle f:\mathbb {R} ^{2}\rightarrow \mathbb {R} }$ are unknown, and ${\displaystyle h\in \mathbb {R} }$ is small.

### Non-Mixed Derivatives

Because of the nature of partial derivatives, some of them may be calculated using single-variable methods. This is done by holding all but one variable constant to form a new function of one variable. For example, if ${\displaystyle g(y)=f(x_{0},y)}$, then ${\displaystyle {\frac {\partial f}{\partial y}}(x_{0},y_{0})={\frac {dg}{dy}}(y_{0})}$.

Find an approximation of ${\displaystyle {\frac {\partial }{\partial y}}f(x_{0},y_{0})}$.

### Mixed Derivatives

Mixed derivatives may require the multivariate extension of Taylor's theorem.

Let ${\displaystyle f:\mathbb {R} ^{2}\rightarrow \mathbb {R} }$ be 42 times continuously differentiable on ${\displaystyle \mathbb {R} ^{2}}$ and let ${\displaystyle g:\mathbb {R} ^{3}\rightarrow \mathbb {R} }$ be defined as

${\displaystyle g(x_{0},y_{0},h)={\frac {f(x_{0}+h,y_{0}+h)+f(x_{0}-h,y_{0}-h)-f(x_{0}-h,y_{0}+h)-f(x_{0}+h,y_{0}-h)}{4h^{2}}}\,.}$

Find the largest ${\displaystyle n\in \mathbb {N} }$ such that

${\displaystyle {\frac {\partial ^{2}f}{\partial x\partial y}}(x_{0},y_{0})=g(x_{0},y_{0},h)+O(h^{n}){\text{ as }}h\to 0\,.}$

In other words, find the order of the error of the approximation.

## Example Code

Implementing these methods is reasonably simple in programming languages that support higher-order functions. For example, the method from the first example may be implemented in C++ using function pointers, as follows:

```cpp
// Returns a central-difference approximation of the derivative of f at x.
double derivative(double (*f)(double), double x, double h = 0.01) {
    return (f(x + h) - f(x - h)) / (2 * h);
}
```