Let us consider the linear homogeneous differential equation

$$a_n(x)\,y^{(n)}(x)+a_{n-1}(x)\,y^{(n-1)}(x)+\ldots+a_0(x)\,y(x) = 0$$

of order $n$.\, If the coefficient functions $a_\nu(x)$ are continuous and the coefficient $a_n(x)$ of the highest order derivative does not vanish on a certain interval (resp. a domain in $\mathbb{C}$), then all solutions $y(x)$ are continuous on this interval (resp. domain).\, If all coefficients have continuous derivatives up to a certain order, the same holds for the solutions.
If, instead, $a_n(x)$ vanishes at a point $x_0$, this point is in general a singular point.\, After dividing the differential equation by $a_n(x)$ one gets the form

$$y^{(n)}(x)+c_{n-1}(x)\,y^{(n-1)}(x)+\ldots+c_0(x)\,y(x) = 0,$$

where some of the new coefficients $c_\nu(x)$ are discontinuous at the singular point.\, However, if the discontinuity is restricted so that the products

$$(x-x_0)^{n-\nu}c_\nu(x) \qquad (\nu = 0,\,1,\,\ldots,\,n\!-\!1)$$

are continuous, and even analytic at $x_0$, then the point $x_0$ is a regular singular point of the differential equation.\\
We introduce the so-called\, Frobenius method\, for finding solution functions in a neighbourhood of the regular singular point $x_0$, confining ourselves to the case of a second order differential equation.\, When we use the quotient forms

$$c_1(x) = \frac{p(x)}{(x-x_0)\,r(x)}, \qquad c_0(x) = \frac{q(x)}{(x-x_0)^2\,r(x)},$$

where $r(x)$, $p(x)$ and $q(x)$ are analytic in a neighbourhood of $x_0$ and $r(x_0) \neq 0$, our differential equation reads

$$(x-x_0)^2 r(x)\,y''(x)+(x-x_0)\,p(x)\,y'(x)+q(x)\,y(x) = 0. \tag{1}$$
Since the simple change of variable\, $x-x_0 \mapsto x$\, brings us to the case where the singular point is the origin, we may suppose such a starting situation.\, Thus we can study the equation

$$x^2 r(x)\,y''(x)+x\,p(x)\,y'(x)+q(x)\,y(x) = 0, \tag{2}$$
where the coefficients have the convergent power series expansions

$$r(x) = \sum_{n=0}^{\infty}r_n x^n, \quad p(x) = \sum_{n=0}^{\infty}p_n x^n, \quad q(x) = \sum_{n=0}^{\infty}q_n x^n \tag{3}$$

and\, $r_0 \neq 0$.
In the Frobenius method one examines whether the equation (2) allows a series solution of the form

$$y(x) = x^s\sum_{n=0}^{\infty}a_n x^n = a_0 x^s+a_1 x^{s+1}+a_2 x^{s+2}+\ldots, \tag{4}$$

where $s$ is a constant and\, $a_0 \neq 0$.
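Written out explicitly (a routine intermediate step not spelled out above), the termwise derivatives of (4) used in the substitution are

```latex
y'(x)  = \sum_{n=0}^{\infty}(s+n)\,a_n x^{s+n-1}, \qquad
y''(x) = \sum_{n=0}^{\infty}(s+n)(s+n-1)\,a_n x^{s+n-2},
```

so that each of the three terms $x^2 r(x)y''$, $x\,p(x)y'$ and $q(x)y$ of (2) is a power series beginning with the power $x^s$, and the coefficient of each power $x^{s+k}$ can be collected separately.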
Substituting (3) and (4) into the differential equation (2) converts the left hand side to

$$\begin{aligned}
&[r_0 s(s\!-\!1)+p_0 s+q_0]\,a_0 x^s\\
&\quad+\bigl[[r_0(s\!+\!1)s+p_0(s\!+\!1)+q_0]\,a_1+[r_1 s(s\!-\!1)+p_1 s+q_1]\,a_0\bigr]x^{s+1}\\
&\quad+\bigl[[r_0(s\!+\!2)(s\!+\!1)+p_0(s\!+\!2)+q_0]\,a_2+[r_1(s\!+\!1)s+p_1(s\!+\!1)+q_1]\,a_1+[r_2 s(s\!-\!1)+p_2 s+q_2]\,a_0\bigr]x^{s+2}+\ldots
\end{aligned}$$
Our equation becomes clearer when using the notation\, $f_n(s) := r_n s(s\!-\!1)+p_n s+q_n$\, for $n = 0,\,1,\,2,\,\ldots$:

$$f_0(s)\,a_0 x^s+[f_0(s\!+\!1)\,a_1+f_1(s)\,a_0]x^{s+1}+[f_0(s\!+\!2)\,a_2+f_1(s\!+\!1)\,a_1+f_2(s)\,a_0]x^{s+2}+\ldots = 0 \tag{5}$$
Thus the condition for (4) to satisfy the differential equation is the infinite system of equations

$$\begin{cases}
f_0(s)\,a_0 = 0\\
f_0(s\!+\!1)\,a_1+f_1(s)\,a_0 = 0\\
f_0(s\!+\!2)\,a_2+f_1(s\!+\!1)\,a_1+f_2(s)\,a_0 = 0\\
\qquad\cdots\qquad\cdots\qquad\cdots
\end{cases} \tag{6}$$
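The triangular system (6) can be solved mechanically: the equation attached to $x^{s+k}$ gives $f_0(s\!+\!k)\,a_k = -\sum_{j=1}^{k} f_j(s\!+\!k\!-\!j)\,a_{k-j}$. A minimal Python sketch of this recursion (the function names and the Bessel-type test equation $x^2y''+xy'+x^2y=0$ are illustrative assumptions, not part of the original text):

```python
from fractions import Fraction

def f(n, s, r, p, q):
    """f_n(s) = r_n s(s-1) + p_n s + q_n; series coefficients beyond
    the given lists are taken to be zero."""
    rn = r[n] if n < len(r) else 0
    pn = p[n] if n < len(p) else 0
    qn = q[n] if n < len(q) else 0
    return rn * s * (s - 1) + pn * s + qn

def frobenius_coeffs(r, p, q, s, N):
    """Solve the system (6) successively for a_1, ..., a_N, given a root s
    of the indicial equation f_0(s) = 0; a_0 is normalised to 1."""
    a = [Fraction(1)]
    for k in range(1, N + 1):
        d = f(0, s + k, r, p, q)
        if d == 0:  # the roots of the indicial equation differ by the integer k
            raise ValueError("f_0(s + %d) vanishes; the recursion breaks down" % k)
        a.append(-sum(f(j, s + k - j, r, p, q) * a[k - j]
                      for j in range(1, k + 1)) / d)
    return a

# Bessel's equation of order 0:  r(x) = 1, p(x) = 1, q(x) = x^2,
# whose indicial equation s^2 = 0 has the double root s = 0.
coeffs = frobenius_coeffs([1], [1], [0, 0, 1], 0, 6)
print(coeffs)  # a_2 = -1/4, a_4 = 1/64, a_6 = -1/2304: the series of J_0
```

The recursion reproduces the well-known series $J_0(x)=\sum_{m=0}^\infty(-1)^m x^{2m}/(4^m(m!)^2)$; exact rational arithmetic (`Fraction`) is used so that the coefficients can be inspected without rounding.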
In the first place, since\, $a_0 \neq 0$,\, the indicial equation

$$f_0(s) \equiv r_0 s^2+(p_0\!-\!r_0)s+q_0 = 0 \tag{7}$$

must be satisfied.\, Because\, $r_0 \neq 0$,\, this quadratic equation determines two values of $s$, which in special cases may coincide.
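As a concrete illustration (Bessel's equation of order $\nu$, used here as an assumed example and not taken from the original text), one may read off the indicial roots directly:

```latex
% Bessel's equation of order \nu:  x^2 y'' + x y' + (x^2 - \nu^2) y = 0,
% so r(x) = 1,\; p(x) = 1,\; q(x) = x^2 - \nu^2,
% giving r_0 = 1,\; p_0 = 1,\; q_0 = -\nu^2.  The indicial equation reads
f_0(s) \equiv s^2 + (1-1)s - \nu^2 = s^2 - \nu^2 = 0,
\qquad\text{with roots}\quad s_1 = \nu,\; s_2 = -\nu.
```

For non-integer $\nu$ the roots do not differ by an integer and both yield series solutions; for integer $\nu$ they differ by the integer $2\nu$, the exceptional case discussed below.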
The first of the equations (6) leaves $a_0$ arbitrary.\, The subsequent equations, linear in $a_1,\,a_2,\,\ldots$, allow one to solve these constants successively, provided that their first coefficients\, $f_0(s\!+\!1)$,\, $f_0(s\!+\!2)$,\, $\ldots$\, do not vanish; this is evidently the case when the roots of the indicial equation do not differ by an integer (e.g. when the roots are complex conjugates, or when $s$ is the root with the greater real part).\, In any case, one obtains, at least for one of the roots of the indicial equation, definite values of the coefficients $a_n$ in the series (4).\, It is not hard to show that this series then converges in a neighbourhood of the origin.
For obtaining the complete solution of the differential equation (2) it suffices to have only one solution $y_1(x)$ of the form (4), because a second solution $y_2(x)$, linearly independent of $y_1(x)$, is obtained via mere integrations; it is then possible, in the cases where the roots $s_1$ and $s_2$ of the indicial equation differ by an integer, that $y_2(x)$ has no expansion of the form (4).
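For completeness, a standard sketch of these "mere integrations" (reduction of order; this well-known supplementary formula is not part of the original text): writing (2) in the normalised form $y''+\frac{p(x)}{x\,r(x)}\,y'+\frac{q(x)}{x^2 r(x)}\,y=0$, a second solution is

```latex
y_2(x) = y_1(x)\int \frac{1}{y_1(x)^2}\,
         \exp\Bigl(-\int \frac{p(x)}{x\,r(x)}\,dx\Bigr)dx.
```

In the exceptional cases $s_1-s_2\in\mathbb{Z}$ this integral produces, in general, a solution of the form $y_2(x) = C\,y_1(x)\ln x+x^{s_2}\sum_{n=0}^{\infty}b_n x^n$, whose logarithmic term explains why $y_2(x)$ need not admit an expansion of the form (4).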
[1]
- ↑ Pentti Laasonen: *Matemaattisia erikoisfunktioita* (Mathematical special functions).\, Handout No. 261. Teknillisen Korkeakoulun Ylioppilaskunta; Otaniemi, Finland (1969).