# Bessel functions

## Definition

The Bessel functions are the canonical solutions of Bessel's differential equation

$x^{2}y''+xy'+(x^{2}-\nu ^{2})y=0,\,\nu \in \mathbb {C} ~.$

Solutions were first introduced by Daniel Bernoulli and later generalized by Friedrich Bessel. The parameter $\nu$ is called the order of the Bessel function; the most common and most important case is $\nu \in \mathbb {Z} ~.$
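As a quick numerical sanity check, one can verify that SciPy's implementation `scipy.special.jv` satisfies Bessel's equation, approximating the derivatives by central finite differences. This is only a sketch: the step size `h`, the orders, and the test points below are arbitrary choices.

```python
from scipy.special import jv  # Bessel function of the first kind J_nu(x)

def bessel_residual(nu, x, h=1e-4):
    """Residual of x^2 y'' + x y' + (x^2 - nu^2) y = 0 for y = J_nu(x),
    with y' and y'' approximated by central finite differences."""
    y = jv(nu, x)
    yp = (jv(nu, x + h) - jv(nu, x - h)) / (2 * h)
    ypp = (jv(nu, x + h) - 2 * y + jv(nu, x - h)) / h ** 2
    return x ** 2 * ypp + x * yp + (x ** 2 - nu ** 2) * y

# The residual vanishes up to finite-difference error for several orders and points.
for nu in (0, 1, 2.5):
    for x in (0.5, 1.0, 5.0):
        assert abs(bessel_residual(nu, x)) < 1e-4
```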

Bessel functions arise when the method of separation of variables is applied to the Laplace or Helmholtz equation in cylindrical or spherical coordinates. They are very important for many problems dealing with physical phenomena, like wave or heat propagation.

## Derivation of the Bessel Function Using Frobenius's Method

Consider the Bessel equation:

$x^{2}y''+xy'+(x^{2}-\nu ^{2})y=0\quad \Leftrightarrow \quad y''+\underbrace {\left({\frac {1}{x}}\right)} _{p(x)}y'+\underbrace {\left(1-{\frac {\nu ^{2}}{x^{2}}}\right)} _{q(x)}y=0$

We are seeking solutions near $x_{0}=0~.$ Since:

${\begin{aligned}xp(x)&=1\\x^{2}q(x)&=x^{2}-\nu ^{2}\end{aligned}}$

are power series in $x$, the point $x_{0}=0$ is a regular singular point of the Bessel equation. This allows Frobenius's method to be applied.

We are seeking solutions of the form:

$y(x)=\sum _{n=0}^{\infty }C_{n}x^{n+r},\,x>0,\,C_{0}\neq 0$

Differentiating yields:

${\begin{aligned}y'(x)&=\sum _{n=0}^{\infty }(n+r)C_{n}x^{n+r-1}\\y''(x)&=\sum _{n=0}^{\infty }(n+r-1)(n+r)C_{n}x^{n+r-2}\end{aligned}}$

Conditions for $C_{n}$ must now be found. Substituting these expressions back into the Bessel equation:

${\begin{aligned}0&=x^{2}y''+xy'+(x^{2}-\nu ^{2})y\\&=\sum _{n=0}^{\infty }(n+r-1)(n+r)C_{n}x^{n+r}+\sum _{n=0}^{\infty }(n+r)C_{n}x^{n+r}+\sum _{n=0}^{\infty }C_{n}x^{n+r+2}-\sum _{n=0}^{\infty }\nu ^{2}C_{n}x^{n+r}\end{aligned}}$

In the third sum, substitute the index $m=n+2~.$ This yields:

${\begin{aligned}0&=\sum _{n=0}^{\infty }\left[(n+r-1)(n+r)+(n+r)-\nu ^{2}\right]C_{n}x^{n+r}+\sum _{m=2}^{\infty }C_{m-2}x^{m+r}\\&=\sum _{n=0}^{\infty }\left[(n+r)^{2}-\nu ^{2}\right]C_{n}x^{n+r}+\sum _{n=2}^{\infty }C_{n-2}x^{n+r}\\&=(r^{2}-\nu ^{2})C_{0}x^{r}+[(r+1)^{2}-\nu ^{2}]C_{1}x^{r+1}+\sum _{n=2}^{\infty }\left\{[(n+r)^{2}-\nu ^{2}]C_{n}+C_{n-2}\right\}x^{n+r}\end{aligned}}$

Dividing the equation above by $x^{r}~(x>0)$ yields:

$0=(r^{2}-\nu ^{2})C_{0}+[(r+1)^{2}-\nu ^{2}]C_{1}x+\sum _{n=2}^{\infty }\left\{[(n+r)^{2}-\nu ^{2}]C_{n}+C_{n-2}\right\}x^{n}$

Since the powers $x^{n}$ are linearly independent, each coefficient must vanish (the identity theorem for power series). It follows that:

${\begin{aligned}&(r^{2}-\nu ^{2})C_{0}=0\\&[(r+1)^{2}-\nu ^{2}]C_{1}=0\\&[(n+r)^{2}-\nu ^{2}]C_{n}+C_{n-2}=0,\,n=2,3,4,\cdots \end{aligned}}$

By assumption, $C_{0}\neq 0~,$ so we define the function:

$h(r):=r^{2}-\nu ^{2}=0\quad {\text{(indicial equation)}}$

The possible values for $r$ are $r=\pm \nu ~.$ Let $r_{1}:=\nu ,\,r_{2}:=-\nu ~$ and, for convenience, let $\nu >0~.$ We obtain the following recurrence relations for $C_{n}$:

${\begin{cases}C_{0}\neq 0\quad {\text{(arbitrarily defined)}}\\C_{1}=0\quad {\text{(follows from }}[(r+1)^{2}-\nu ^{2}]C_{1}=0{\text{)}}\\\underbrace {[(n+r)^{2}-\nu ^{2}]} _{h(n+r)}C_{n}=-C_{n-2},\,n=2,3,4,\cdots \end{cases}}$

To get a solution to the Bessel equation, choose $r_{1}=\nu ~,\,\nu \neq 0~.$ Thus, $h(n+r)=h(n+\nu )\neq 0,\,n=2,3,4,\cdots ~.$ We can now solve for $C_{n}$:

$C_{n}=-{\frac {C_{n-2}}{(n+\nu )^{2}-\nu ^{2}}}=-{\frac {C_{n-2}}{n^{2}+2n\nu }}$

We end up with the recursion:

${\begin{cases}C_{0}\neq 0\\C_{1}=0\\C_{n}=-{\frac {C_{n-2}}{n(n+2\nu )}},\,n=2,3,4,\cdots \end{cases}}$

Since the recursion has depth 2 and $C_{1}=0$, it follows that:

${\begin{cases}C_{0}\neq 0\\C_{2n+1}=0,\,n=0,1,2,\cdots \\C_{2n}=-{\frac {C_{2n-2}}{2n(2n+2\nu )}}=-{\frac {C_{2n-2}}{2^{2}n(n+\nu )}},\,n=1,2,3,\cdots \end{cases}}$

Because of the recursion, we get the following set of terms:

${\begin{aligned}&C_{0}\neq 0\\&C_{2}=-{\frac {C_{0}}{2^{2}\cdot 1\cdot (1+\nu )}}\\&C_{4}=C_{2\cdot 2}=-{\frac {C_{2}}{2^{2}\cdot 2\cdot (2+\nu )}}={\frac {(-1)^{2}C_{0}}{2^{4}\cdot 1\cdot 2\cdot (1+\nu )(2+\nu )}}={\frac {(-1)^{2}C_{0}}{2^{4}\cdot 2!\cdot (1+\nu )(2+\nu )}}\\&C_{6}=C_{2\cdot 3}=-{\frac {C_{4}}{2^{2}\cdot 3\cdot (3+\nu )}}={\frac {(-1)^{3}C_{0}}{2^{6}\cdot 3!\cdot (1+\nu )(2+\nu )(3+\nu )}}\\&\vdots \\&C_{2n}={\frac {(-1)^{n}C_{0}}{2^{2n}\cdot n!\cdot (1+\nu )(2+\nu )\cdots (n+\nu )}},\,n=1,2,3,\cdots \end{aligned}}$

To simplify the expansion of $y$, we normalize $C_{0}$ and choose:

$C_{0}:={\frac {1}{2^{\nu }\Gamma (1+\nu )}}$

This simplifies our general term to:

$C_{2n}={\frac {(-1)^{n}}{2^{2n+\nu }\cdot n!\cdot \Gamma (n+1+\nu )}},\,n=0,1,2,\cdots$

The first solution to the Bessel equation, the Bessel function of the first kind $J_{\nu }$, can then be written as:

$J_{\nu }(x)=\sum _{n=0}^{\infty }{\frac {(-1)^{n}}{n!\cdot \Gamma (n+1+\nu )}}\left({\frac {x}{2}}\right)^{2n+\nu }$

## Gamma Function

### Definition

The gamma function is defined for $x\in \mathbb {R}$, $x>0$, by:

$\Gamma (x):=\int \limits _{0}^{\infty }t^{x-1}e^{-t}dt$

### Properties of the Gamma Function

Here are some theorems for the gamma function:

1. $\Gamma (x+1)=x\Gamma (x)$
2. $\Gamma (1)=1$
3. $\Gamma (n+1)=n!\,,\ n\in \mathbb {N}$
4. $\Gamma \left({\frac {1}{2}}\right)={\sqrt {\pi }}$

## Second Solution of the Bessel Equation

For the case that $\nu \not \in \mathbb {N} ,\,\nu >0$, we can define a second solution to the Bessel equation. In this case, $n+1-\nu \in \mathbb {R} \backslash (-\mathbb {N} \cup \{0\})$ and therefore $\Gamma (n+1-\nu )$ is defined. Consider:

$J_{-\nu }(x):=\sum _{n=0}^{\infty }{\frac {(-1)^{n}}{n!\cdot \Gamma (n+1-\nu )}}\left({\frac {x}{2}}\right)^{2n-\nu }$

Some theorems for this new function:

• $J_{-\nu }(x)~$ solves the Bessel equation.
• $J_{\nu }(x),J_{-\nu }(x)~$ are linearly independent.

These theorems are proved easily, but will not be shown here.
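The two series solutions can also be checked numerically. The sketch below (assuming SciPy is available; the truncation at 30 terms and the test values $\nu = 0.5$, $x = 2$ are arbitrary choices) evaluates the truncated Frobenius series for $J_{\nu }$ and $J_{-\nu }$ and compares it against SciPy's independent implementation `scipy.special.jv`:

```python
import math
from scipy.special import jv

def j_series(nu, x, terms=30):
    """Truncated Frobenius series: sum of (-1)^n / (n! Gamma(n+1+nu)) (x/2)^(2n+nu)."""
    return sum((-1) ** n / (math.factorial(n) * math.gamma(n + 1 + nu))
               * (x / 2) ** (2 * n + nu) for n in range(terms))

nu, x = 0.5, 2.0  # arbitrary non-integer order and test point
assert abs(j_series(nu, x) - jv(nu, x)) < 1e-12    # first solution J_nu
assert abs(j_series(-nu, x) - jv(-nu, x)) < 1e-12  # second solution J_{-nu}
```

The series converges rapidly for moderate $x$, so a 30-term truncation already agrees with SciPy to near machine precision here.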

## Bessel Functions of the Second Kind

A second solution, defined for $\nu \in \mathbb {R} ,\,\nu \not \in \mathbb {Z}$, takes the form:

$Y_{\nu }(x):={\frac {\cos(\nu \pi )J_{\nu }(x)-J_{-\nu }(x)}{\sin(\nu \pi )}}$

Deriving this result is fairly difficult and will not be shown here. This function, called the Bessel function of the second kind of order $\nu$, is linearly independent from $J_{\nu }(x)$.
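The defining formula can be compared against SciPy's independent implementation `scipy.special.yv`. This is a sketch; the non-integer order and the test point below are arbitrary:

```python
import math
from scipy.special import jv, yv

nu, x = 0.3, 2.0  # arbitrary non-integer order and test point
y_formula = (math.cos(nu * math.pi) * jv(nu, x) - jv(-nu, x)) / math.sin(nu * math.pi)
assert abs(y_formula - yv(nu, x)) < 1e-10  # agrees with scipy.special.yv
```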

## Hankel Functions, Bessel Functions of the Third Kind

A third pair of (complex-valued) solutions, for $\nu \in \mathbb {R} ,\,\nu \not \in \mathbb {Z} ,$ is:

$H_{\nu }^{(1)}(x):=J_{\nu }(x)+i\,Y_{\nu }(x)$

$H_{\nu }^{(2)}(x):=J_{\nu }(x)-i\,Y_{\nu }(x)$

These are called the Bessel functions of the third kind, or Hankel functions, of order $\nu$. The Hankel functions $H_{\nu }^{(1)},H_{\nu }^{(2)}$ are linearly independent.
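The defining relations can be verified directly with SciPy's `scipy.special.hankel1` and `scipy.special.hankel2` (a sketch; order and test point are arbitrary):

```python
from scipy.special import jv, yv, hankel1, hankel2

nu, x = 0.3, 2.0  # arbitrary non-integer order and test point
assert abs(hankel1(nu, x) - (jv(nu, x) + 1j * yv(nu, x))) < 1e-10  # H^(1) = J + iY
assert abs(hankel2(nu, x) - (jv(nu, x) - 1j * yv(nu, x))) < 1e-10  # H^(2) = J - iY
```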

## Complete Solution to the Bessel Equation

For all $\nu \in \mathbb {R} ,\lambda \in \mathbb {R} ,$ the complete solution of the Bessel equation:

$x^{2}y''+xy'+(\lambda ^{2}x^{2}-\nu ^{2})y=0$

can be written as:

$y(x)=C_{1}J_{\nu }(\lambda x)+C_{2}Y_{\nu }(\lambda x)$

or:

$y(x)=C_{1}H_{\nu }^{(1)}(\lambda x)+C_{2}H_{\nu }^{(2)}(\lambda x)~.$

If $\nu \in \mathbb {R} \backslash \mathbb {Z} ~,$ then:

$y(x)=C_{1}J_{\nu }(\lambda x)+C_{2}J_{-\nu }(\lambda x)~.$

Moreover:

• $J_{\nu },J_{-\nu },Y_{\nu }$ have countably many zeroes.
• If $\nu \geq 0$, then $J_{\nu }(\lambda x)$ is finite for all $x\in \mathbb {R}$, while $J_{-\nu }(\lambda x)$ and $Y_{\nu }(\lambda x)$ are unbounded in any neighborhood of $0$.
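That $y(x)=C_{1}J_{\nu }(\lambda x)+C_{2}Y_{\nu }(\lambda x)$ solves the scaled equation can be checked numerically. This is a sketch assuming SciPy; the order, scale $\lambda$, constants, step size, and test points below are all arbitrary:

```python
from scipy.special import jv, yv

NU, LAM, C1, C2 = 1.5, 2.0, 0.7, -0.3  # arbitrary order, scale, and constants

def y(x):
    return C1 * jv(NU, LAM * x) + C2 * yv(NU, LAM * x)

def residual(x, h=1e-4):
    """Left-hand side of x^2 y'' + x y' + (lam^2 x^2 - nu^2) y = 0,
    with derivatives approximated by central finite differences."""
    yp = (y(x + h) - y(x - h)) / (2 * h)
    ypp = (y(x + h) - 2 * y(x) + y(x - h)) / h ** 2
    return x ** 2 * ypp + x * yp + (LAM ** 2 * x ** 2 - NU ** 2) * y(x)

for x in (0.5, 1.0, 3.0):
    assert abs(residual(x)) < 1e-5  # ~0 up to finite-difference error
```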

## Identities

Here are some identities for the Bessel function. They can be deduced with reasonable effort.

### Differential Identities

For $\nu \in \mathbb {R}$ :

1. $\left[x^{\nu }J_{\nu }(x)\right]'=x^{\nu }J_{\nu -1}(x)$
2. $\left[x^{-\nu }J_{\nu }(x)\right]'=-x^{-\nu }J_{\nu +1}(x)$
3. $\left[x^{\nu }Y_{\nu }(x)\right]'=x^{\nu }Y_{\nu -1}(x)$
4. $\left[x^{-\nu }Y_{\nu }(x)\right]'=-x^{-\nu }Y_{\nu +1}(x)$

Corollary:

1. $xJ_{\nu }'(x)+\nu J_{\nu }(x)=xJ_{\nu -1}(x)$
2. $xJ_{\nu }'(x)-\nu J_{\nu }(x)=-xJ_{\nu +1}(x)$

Corollary (Recursion Formula):

1. ${\frac {2\nu }{x}}J_{\nu }(x)=J_{\nu -1}(x)+J_{\nu +1}(x)$
2. $J_{\nu }'(x)={\frac {1}{2}}\left[J_{\nu -1}(x)-J_{\nu +1}(x)\right]$

### Integration Identities

For $\nu \in \mathbb {R}$ :

1. $\int x^{\nu }J_{\nu -1}(x)dx=x^{\nu }J_{\nu }(x)+C$
2. $\int x^{-\nu }J_{\nu +1}(x)dx=-x^{-\nu }J_{\nu }(x)+C$

Important special cases $(\nu =0,1)$:

1. $\int xJ_{0}(x)dx=xJ_{1}(x)+C$
2. $\int J_{1}(x)dx=-J_{0}(x)+C$
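The two special cases can be verified numerically by comparing a definite integral over $[0, a]$ with the antiderivative evaluated at the limits. A sketch assuming SciPy; the upper limit $a = 3$ is an arbitrary choice:

```python
from scipy.integrate import quad
from scipy.special import j0, j1  # J_0 and J_1

a = 3.0  # arbitrary upper limit

# Identity 1: integral of x J_0(x) from 0 to a equals a J_1(a) - 0.
val1, _ = quad(lambda x: x * j0(x), 0, a)
assert abs(val1 - a * j1(a)) < 1e-8

# Identity 2: integral of J_1(x) from 0 to a equals -J_0(a) + J_0(0).
val2, _ = quad(j1, 0, a)
assert abs(val2 - (j0(0) - j0(a))) < 1e-8
```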