Return to computing page for the first course APMA0330
Return to computing page for the second course APMA0340
Return to Mathematica tutorial for the first course APMA0330
Return to Mathematica tutorial for the second course APMA0340
Return to the main page for the course APMA0330
Return to the main page for the course APMA0340
Return to Part V of the course APMA0330
Our main method for solving nonlinear and variable-coefficient linear differential equations is expansion of solutions into power series. Before the middle of the 20th century, it was the standard method for attacking problems of this kind, and many famous equations, such as the Legendre and Chebyshev equations, have series solutions named after them. With the availability of computers, power series came to prominence once again.
In this section, we present an extension and generalization of Picard's iteration scheme. We start with first-order differential equations.
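Recall that the classical Picard scheme generates successive approximations \( y_{n+1} (x) = y_0 + \int_{x_0}^x f\left( s, y_n (s) \right) {\text d}s \) starting from the constant initial guess \( y_0 \). The following sketch computes these iterates symbolically; it is written in Python with SymPy rather than the Mathematica used elsewhere in this tutorial, and the function name and the example equation are our own illustrative choices.

```python
# A minimal sketch of classical Picard iteration for
#     y' = f(x, y),   y(x0) = y0,
# via   y_{n+1}(x) = y0 + Integral_{x0}^{x} f(s, y_n(s)) ds.
import sympy as sp

x, s = sp.symbols('x s')

def picard(f, x0, y0, n):
    """Return the n-th Picard iterate for y' = f(x, y), y(x0) = y0."""
    y = sp.Integer(y0)            # y_0(x) = y0, the constant initial guess
    for _ in range(n):
        y = y0 + sp.integrate(f(s, y.subs(x, s)), (s, x0, x))
    return sp.expand(y)

# Example: y' = x*y, y(0) = 1 has exact solution exp(x**2/2); the iterates
# reproduce its Maclaurin polynomial 1 + x**2/2 + x**4/8 + x**6/48 + ...
f = lambda t, u: t*u
print(picard(f, 0, 1, 3))
```

Each iterate adds one more correct term of the Taylor series, which is the behavior the series method below exploits systematically.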
First order separable equations
Although the initial value problem for a separable differential equation can be solved by separation of variables, we demonstrate the advantage of the iteration scheme on this particular class of equations. We consider the initial value problem
\[
y' = f(x)\, g(y) , \qquad y\left( x_0 \right) = y_0 .
\]
It is assumed that the functions f(x) and g(y) are smooth enough to possess all derivatives needed below. Let h(x) be an antiderivative of f(x), so
\( \displaystyle f(x) = h' (x) . \) The integral is expanded by repeated integration by parts, using the given differential equation at the first step
The integral term in the equation above is the remainder of the expansion; if it vanishes as N → ∞, the series solution to the separable equation becomes
where C is a constant that is specified by the initial condition. In the region of convergence, the series summation gives a function of x and y, so it provides an implicit solution to the given separable equation.
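As a concrete check of this implicit-solution picture, the sketch below (again Python/SymPy rather than Mathematica; the choice \( f(x) = x , \) \( g(y) = 1 + y^2 \) is ours, for illustration only) verifies that the implicit relation produced by separation of variables stays constant along the exact solution.

```python
# For y' = x*(1 + y**2), separation of variables dy/(1 + y**2) = x dx
# gives the implicit solution  arctan(y) - x**2/2 = C.
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

ode = sp.Eq(y(x).diff(x), x*(1 + y(x)**2))
sol = sp.dsolve(ode)              # a solution of the form y = tan(x**2/2 + C1)
if isinstance(sol, list):
    sol = sol[0]
print(sol)

# Substitute the explicit solution into the implicit relation and confirm
# that it is constant, i.e. its x-derivative vanishes identically:
F = sp.atan(sol.rhs) - x**2/2
assert sp.simplify(F.diff(x)) == 0
```

The vanishing derivative confirms that F(x, y) = C is indeed a first integral of the equation, which is exactly what the series construction above produces term by term.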
Example:
Consider the differential equation
\[
y' = f(x)\, y^m ,
\]
where m is a real number. Using the function \( g(y) = y^m , \) we obtain from Eq.\eqref{EqIter.1} that
Extending the process of integration by parts, we arrive at
\[
y - y_0 = \left[ h(s) + \sum_{n=0}^N (-1)^{n+1} \frac{1}{n!} \, s^n g_n (s) \right]_{s=x_0}^{s=x} + \sum_{n=1}^N (-1)^{n} \frac{1}{n!} \int_{x_0}^x s^n h' (s) g'_n (s) \,{\text d} s + (-1)^N\,\frac{1}{N!} \int_{x_0}^x s^N h' (s) g_{N+1} (s) \,{\text d} s .
\]
Assuming that summation and integration can be interchanged in the second term, and that the remainder vanishes in the limit N → ∞, the series expansion becomes
\[
y - h(x) + \sum_{n\ge 0} (-1)^{n} \frac{1}{n!} \, x^n g_n (x) - \int_{x_0}^x {\text d} s \,\sum_{n\ge 0} (-1)^{n} \frac{1}{n!} \, s^n g'_n (s)\,h' (s) = C .
\]
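As a sanity check of this series picture, the following sketch (Python/SymPy; the concrete choices \( f(x) = 1 , \) \( m = 2 , \) \( y(0) = 1 \) are ours for illustration) confirms the exact solution of \( y' = y^2 \) and its Maclaurin expansion, which any correct truncated series solution must reproduce.

```python
# For y' = f(x)*y**m with f(x) = 1, m = 2, y(0) = 1, the exact solution
# is y = 1/(1 - x), obtainable by separation of variables.
import sympy as sp

x = sp.symbols('x')
y_exact = 1/(1 - x)

# Verify that it satisfies y' = y**2 with y(0) = 1 ...
assert sp.simplify(y_exact.diff(x) - y_exact**2) == 0
assert y_exact.subs(x, 0) == 1

# ... and expand it in powers of x (the geometric series):
print(sp.series(y_exact, x, 0, 6))   # 1 + x + x**2 + x**3 + x**4 + x**5 + O(x**6)
```

The geometric series converges only for |x| < 1, illustrating the remark above that the series representation holds in its region of convergence.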
Fairen, V., Lopez, V., Conde, L., Power series approximation to solutions of nonlinear systems of differential equations, American Journal of Physics, 1988, Vol. 56, pp. 57--61; https://doi.org/10.1119/1.15432
Reut, Z., Solution of ordinary differential equations by successive integration by parts, International Journal of Mathematical Education in Science and Technology, 1995, Vol. 26, No. 4, pp. 589--597.
Robin, W.A., Solving differential equations using modified Picard iteration, International Journal of Mathematical Education in Science and Technology, 2010, Vol. 41, No. 5, pp. 659--665. https://doi.org/10.1080/00207391003675182