Theorem:
Consider the first order differential equation \( {\text d}y/{\text d}x = f(y) , \) where f is continuous in a neighborhood of \( y_0 . \) Assume that the solutions through \( y_0 \) are not unique; that is, assume that there are two solutions \( \phi_1 \) and \( \phi_2 \) such that \( \phi_1 (0) = \phi_2 (0) = y_0 \) and that \( \phi_1 \) and \( \phi_2 \) differ in every neighborhood of 0. Then \( f(y_0 ) = 0 . \)
Theorem: Let f(y) be a continuous function on the closed interval [a,b] that has a single zero \( y^{\ast} \in (a,b) , \) namely, \( f(y^{\ast} ) =0 \) and \( f(y) \ne 0 \) for all other points \( y \in (a,b) . \) If the improper integral \( \int_{y^{\ast}} \frac{{\text d}s}{f(s)} \) diverges (on both sides of \( y^{\ast} \)), then the initial value problem for the autonomous differential equation
\[
y' = f(y) , \qquad y(x_0 ) = y^{\ast}
\]
has the unique solution \( y (x) \equiv y^{\ast} . \) If the integral converges, then the initial value problem has multiple solutions.
⧫
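This dichotomy can be explored numerically. The sketch below (plain Python; the helper name tail_integral and the sample slope functions are our own illustrative choices, not part of the tutorial) approximates \( \int_{\epsilon}^1 {\text d}y/f(y) \) for shrinking ε: for \( f(y) = 2\sqrt{y} \) the values approach 1 (convergence, hence nonuniqueness), while for \( f(y) = y \) they grow like \( \log (1/\epsilon ) \) (divergence, hence uniqueness).

```python
import math

def tail_integral(f, eps, upper=1.0, n=20000):
    """Approximate the integral of 1/f(y) over [eps, upper] by the
    midpoint rule on a geometric grid clustered near the zero y = 0."""
    r = (upper / eps) ** (1.0 / n)
    total = 0.0
    y = eps
    for _ in range(n):
        y_next = y * r
        mid = 0.5 * (y + y_next)
        total += (y_next - y) / f(mid)
        y = y_next
    return total

# f(y) = 2*sqrt(y): the integral converges (exact value 1 - sqrt(eps)),
# so y' = 2*sqrt(y), y(0) = 0 has multiple solutions.
for eps in (1e-4, 1e-8, 1e-12):
    print(tail_integral(lambda y: 2.0 * math.sqrt(y), eps))

# f(y) = y: the integral diverges like log(1/eps),
# so y' = y, y(0) = 0 has the unique solution y = 0.
for eps in (1e-4, 1e-8, 1e-12):
    print(tail_integral(lambda y: y, eps))
```

The geometric grid keeps the midpoint rule accurate near the singular endpoint without special-case code.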
Theorem:
Suppose that f(x,y) is uniformly Lipschitz continuous in y (meaning the Lipschitz constant L in the inequality \( |f(x,y_1 ) - f(x, y_2 )| \le L\,|y_1 - y_2 | \) can be taken independent of x) and continuous in x. Then, for some positive value δ, there exists a unique solution \( y = \phi (x) \) to the initial value problem \( y' = f(x,y) , \quad y(x_0 ) = y_0 , \) on the interval \( \left[ x_0 -\delta , x_0 + \delta \right] . \)
⧫
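The Picard iteration behind this theorem can be made concrete with a minimal numerical sketch (plain Python; the grid size, the test equation y' = y, y(0) = 1, and the helper name picard_step are our own choices). Each sweep applies \( \phi_{k+1}(x) = y_0 + \int_{x_0}^x f\left( s, \phi_k (s) \right) {\text d}s \) with trapezoidal quadrature; the iterates converge to \( e^x . \)

```python
import math

def picard_step(f, xs, phi, y0):
    """One Picard sweep: phi_new(x) = y0 + integral of f(s, phi(s)) from
    xs[0] to x, computed by cumulative trapezoidal quadrature on grid xs."""
    out = [y0]
    acc = 0.0
    for i in range(1, len(xs)):
        h = xs[i] - xs[i - 1]
        acc += 0.5 * h * (f(xs[i - 1], phi[i - 1]) + f(xs[i], phi[i]))
        out.append(y0 + acc)
    return out

n = 1000
xs = [i / n for i in range(n + 1)]   # uniform grid on [0, 1]
phi = [1.0] * (n + 1)                # phi_0(x) = y0 = 1
for _ in range(20):                  # 20 Picard sweeps for y' = y, y(0) = 1
    phi = picard_step(lambda x, y: y, xs, phi, 1.0)

print(phi[-1], math.e)               # phi(1) is close to e = 2.71828...
```

After k sweeps the iterate agrees with the Taylor polynomial of \( e^x \) of degree k, which is exactly the convergence mechanism used in the proof of the theorem.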
Proof:
Suppose that the initial value problem
\( y' = f(x,y), \quad y(x_0 ) = y_0 , \)
with Lipschitz slope function f(x,y) has two solutions \( y_1 \) and \( y_2 . \) Let
\[
u(x) = \int_{x_0}^x \left\vert y_1 (s) - y_2 (s) \right\vert {\text d} s \ge 0
\quad \mbox{for} \quad x\ge x_0 .
\]
Since these functions are solutions of the same differential equation, they
also satisfy the equivalent integral equation
\[
y_j (x) = y_0 + \int_{x_0}^x f\left( s, y_j (s) \right) {\text d} s , \qquad j = 1,2 .
\]
Subtracting one equation from the other and applying the Lipschitz condition yields
\[
u' (x) = \left\vert y_1 (x) - y_2 (x) \right\vert \le L \int_{x_0}^x \left\vert y_1 (s) - y_2 (s) \right\vert {\text d} s = L\, u(x) .
\]
Since \( u(x_0 ) = 0 , \) it follows that \( \frac{{\text d}}{{\text d}x} \left[ e^{-L(x - x_0 )} u(x) \right] \le 0 \) and hence \( e^{-L(x - x_0 )} u(x) \le u(x_0 ) = 0 . \) Therefore \( u(x) \le 0 , \) which leads to the conclusion that \( u(x) \equiv 0 \) for all \( x \ge x_0 . \) A similar argument shows also that \( u(x) \equiv 0 \) for all \( x \le x_0 . \) Since the integrand is continuous and nonnegative, \( y_1 (x) = y_2 (x) \) on the whole interval.
⧫
Theorem:
The initial value problem
\[
{\text d}y/{\text d}x = f(x,y), \qquad y\left( x_0 \right) = y_0
\tag{2}
\]
has a unique solution on some interval containing \( x_0 \) if the slope function f(x, y) is continuous in both x and y on some rectangle R centered at \( \left( x_0 , y_0 \right) \) and, for each fixed x, nonincreasing in y on R.
⧫
Theorem:
Let R be the region defined by the inequalities
\( 0 \le x- x_0 < a, \ |s_k - y_k | < b_k , \quad k=0,1,2,\ldots , n-1 , \) where \( y_k \ge 0 \) for k > 0. Suppose the function
\( f(x, s_0 , s_1 , \ldots , s_{n-1} ) \) in the
initial value problem
\[
y^{(n)} = f\left( x, y, y' , \ldots , y^{(n-1)} \right) , \qquad y^{(k)} \left( x_0 \right) = y_k , \quad k = 0, 1, \ldots , n-1 ,
\]
is continuous in R and nonincreasing in each of the variables \( s_0 , s_1 , \ldots , s_{n-1} \) when the remaining variables are held fixed; then the above initial value problem has at most one solution in R.
⧫
Example:
Consider the initial value problem
\[
y' = 2\,\sqrt{y}, \quad y(0)=0 ,
\]
where the slope function \( f(y) = 2\,\sqrt{y} \) is continuous on the infinite interval \( [0, \infty ) \) but not Lipschitz at the origin. So according to Peano's theorem, this initial value problem has a solution. Indeed, we can apply Picard's iteration procedure to obtain the solution \( y(x) \equiv 0 . \) On the other hand, since the given differential equation is autonomous, we can separate variables and integrate:
\[
\int \frac{{\text d}y}{2\sqrt{y}} = \int {\text d}x \qquad \Longrightarrow \qquad \sqrt{y} = x - C ,
\]
where C is an arbitrary constant. Since we consider only the positive branch of the square root function, the above formula is valid only when \( x \ge C . \) Therefore, we get a family of solutions (which is also called the general solution) depending on a parameter C:
\[
y = \begin{cases} \left( x-C \right)^2 , & \quad \mbox{for } x \ge C , \\
0 , & \quad \mbox{for } x <C . \end{cases}
\]
In this sequence of commands, we first enter the family of solutions to the differential equation. Since C is a protected symbol in Mathematica, we use CC instead. Then we use two subroutines, one for plotting solutions and another one for looping with respect to the constant C. Finally, we display all graphs.
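The Mathematica plotting commands themselves are not reproduced here; as a substitute check, the following plain-Python sketch (the helper name y and the sample values of C are our own choices) verifies by a central difference that every member of the family satisfies \( y' = 2\sqrt{y} \) on both branches.

```python
import math

# Check numerically that y_C(x) = (x - C)^2 for x >= C, and 0 otherwise,
# satisfies y' = 2*sqrt(y) for several values of the parameter C.
def y(x, C):
    return (x - C) ** 2 if x >= C else 0.0

h = 1e-6
for C in (-1.0, 0.0, 0.5, 2.0):
    for x in (C - 1.0, C + 0.5, C + 3.0):
        lhs = (y(x + h, C) - y(x - h, C)) / (2 * h)   # central difference y'(x)
        rhs = 2.0 * math.sqrt(y(x, C))                 # slope 2*sqrt(y)
        assert abs(lhs - rhs) < 1e-4, (C, x, lhs, rhs)
print("all checks passed")
```

Because every C gives a solution through (0, 0) when C ≥ 0, the check also illustrates the nonuniqueness discussed above.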
We can also check that the given initial value problem has multiple solutions by evaluating the integral from the theorem above: \( \int_0^1 \frac{{\text d}s}{2\sqrt{s}} = 1 < \infty , \) which converges.
The initial value problem
\[
y' = 4\,|y|^{3/4} , \qquad y(0) = 0 ,
\]
has at least two solutions, y = 0 and \( y = x^4 . \) Actually, the IVP has infinitely many solutions
\[
y = \begin{cases}
\left( x - a \right)^4 , & \ \mbox{ for} \quad x \ge a , \\
0, & \ \mbox{ for} \quad b \le x \le a , \\
-\left( x - b \right)^4 , & \ \mbox{ for} \quad x \le b ,
\end{cases}
\]
for arbitrary constants \( a \ge 0 \ge b . \) This function is everywhere continuously differentiable and satisfies all conditions of the given initial value problem.
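The quartic family can be checked numerically as well. The sketch below (plain Python; the right-hand side \( 4\,|y|^{3/4} \) is our reading of the equation these displayed solutions satisfy, and all names are our own) verifies each branch by a central difference.

```python
# Verify numerically that the piecewise quartic solves y' = 4*|y|^(3/4)
# for a family of parameters a >= 0 >= b.
def y(x, a, b):
    if x >= a:
        return (x - a) ** 4
    if x <= b:
        return -(x - b) ** 4
    return 0.0

h = 1e-6
for a, b in ((0.0, 0.0), (1.0, -2.0), (0.5, -0.5)):
    for x in (b - 1.0, 0.5 * (a + b), a + 1.0):
        lhs = (y(x + h, a, b) - y(x - h, a, b)) / (2 * h)  # y'(x)
        rhs = 4.0 * abs(y(x, a, b)) ** 0.75                # 4*|y|^(3/4)
        assert abs(lhs - rhs) < 1e-4, (a, b, x, lhs, rhs)
print("quartic family checks out")
```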
The initial value problem
\[
y' = y^{2/3}, \quad y(0)=0 ,
\]
has infinitely many solutions
\[
y = \begin{cases}
\frac{1}{27}\left( x - a \right)^3 , & \ \mbox{ for} \quad x \ge a , \\
0, & \ \mbox{ for} \quad b \le x \le a , \\
\frac{1}{27} \left( x - b \right)^3 , & \ \mbox{ for} \quad x \le b ,
\end{cases}
\]
for arbitrary constants \( a \ge 0 \ge b . \) The slope function does not satisfy the Lipschitz condition in a neighborhood of the origin.
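The displayed cubic family satisfies \( y' = y^{2/3} \) when the power is interpreted through the real cube root; the sketch below (plain Python, our own helper names) verifies this by a central difference on all three branches.

```python
import math

def cbrt(t):
    # real cube root, valid for negative arguments as well
    return math.copysign(abs(t) ** (1.0 / 3.0), t)

def y(x, a, b):
    if x >= a:
        return (x - a) ** 3 / 27.0
    if x <= b:
        return (x - b) ** 3 / 27.0
    return 0.0

h = 1e-6
for a, b in ((0.0, 0.0), (1.0, -1.0), (2.0, -0.5)):
    for x in (b - 2.0, 0.5 * (a + b), a + 2.0):
        lhs = (y(x + h, a, b) - y(x - h, a, b)) / (2 * h)  # y'(x)
        rhs = cbrt(y(x, a, b)) ** 2                        # y^(2/3)
        assert abs(lhs - rhs) < 1e-4, (a, b, x, lhs, rhs)
print("cubic family satisfies y' = y^(2/3)")
```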
■
Example:
The initial value problem
\[
y' = 3x\,y^{1/3}, \quad y(0)= a ,
\]
has a unique solution whenever 𝑎 ≠ 0. Indeed, the partial derivative \( \partial f/\partial y = x\,y^{-2/3} \) is continuous at every point of the xy-plane except on the x-axis (y = 0). When 𝑎 ≠ 0, we can certainly draw a rectangle containing (0, 𝑎) that does not intersect the x-axis. In any such rectangle the hypotheses of the existence and uniqueness theorem are satisfied, and therefore the initial value problem does indeed have a unique solution.
However, when 𝑎 = 0, the conditions of the uniqueness theorem do not hold. Since this theorem provides only sufficient conditions, we need to verify directly that the given problem has at least two solutions. One of them is the trivial one y = 0 and another solution is y = x³.
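The second solution can be confirmed directly. The sketch below (plain Python; names are our own) checks by a central difference that \( y = x^3 \) satisfies \( y' = 3x\,y^{1/3} , \) using the real cube root so that negative x is covered too.

```python
import math

def cbrt(t):
    # real cube root, valid for negative arguments
    return math.copysign(abs(t) ** (1.0 / 3.0), t)

h = 1e-6
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    y = lambda s: s ** 3
    lhs = (y(x + h) - y(x - h)) / (2 * h)   # derivative of x^3, i.e. 3x^2
    rhs = 3.0 * x * cbrt(y(x))              # slope 3*x*y^(1/3) = 3x^2
    assert abs(lhs - rhs) < 1e-4, (x, lhs, rhs)
print("y = x^3 solves y' = 3*x*y^(1/3)")
```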
■
Example:
Consider, following Hales and Sells, the n-th order differential equation
\[
y^{(n)} = \left( y^{(n-1)} \right)^{2/3} e^{y^{(n-1)}} , \qquad y(0) = y' (0) = \cdots = y^{(n-1)} (0) = 0 .
\]
It has again a trivial solution y(x) = 0. Let
\( u = y^{(n-1)} ; \) then u(x) is
defined implicitly by the equation
\[
x = \int_0^u t^{-2/3} e^{-t}\,{\text d}t .
\]
As \( u \to \infty , \quad x\to r_0 = \int_0^{\infty}
t^{-2/3} e^{-t}\,{\text d}t . \) Then another solution is given by
\[
y(x) = \int_0^x \frac{\left( x - t \right)^{n-2}}{(n-2)!} \, u(t)\,{\text d}t ,
\]
defined on 0 ≤ x < r0 < ∞.
The function y(x) can be multiplied by an arbitrary continuous, nonnegative, nondecreasing function of x to give another example.
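The endpoint \( r_0 = \int_0^\infty t^{-2/3} e^{-t}\,{\text d}t \) is the gamma-function value \( \Gamma (1/3) \approx 2.6789 . \) A quick numerical confirmation (plain Python; the grid parameters are our own choices):

```python
import math

# Numerically integrate t^(-2/3) * exp(-t) over (0, infinity) on a
# geometric grid clustered at the singular endpoint t = 0, and compare
# the result with math.gamma(1/3).
def r0_numeric(eps=1e-12, upper=60.0, n=200000):
    r = (upper / eps) ** (1.0 / n)
    total, t = 0.0, eps
    # contribution of [0, eps], where exp(-t) is essentially 1:
    # integral of t^(-2/3) over [0, eps] equals 3 * eps^(1/3)
    total += 3.0 * eps ** (1.0 / 3.0)
    for _ in range(n):
        t_next = t * r
        mid = 0.5 * (t + t_next)
        total += (t_next - t) * mid ** (-2.0 / 3.0) * math.exp(-mid)
        t = t_next
    return total

print(r0_numeric(), math.gamma(1.0 / 3.0))   # both near 2.6789
```

So the second solution of the example exists only on the bounded interval \( [0, r_0 ) . \)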
■
It should be noted that the Lipschitz condition is a sufficient but not a necessary condition for uniqueness. An example where there exists a unique solution in spite of the Lipschitz condition not being satisfied is given by the initial value problem
\[
y' = 1 + (4/3) \,y^{1/3} , \qquad y(0) = 0.
\]
It is obvious that the slope function violates the Lipschitz condition in any domain that includes the origin. However, the above initial value problem has a unique solution.
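To see the failure of the Lipschitz condition quantitatively, note that \( |f(y) - f(0)|/|y - 0| = \tfrac{4}{3}\,|y|^{-2/3} \to \infty \) as y → 0, so no single constant L can work. A short check (plain Python, our own helper names):

```python
import math

def f(y):
    # slope function 1 + (4/3) y^(1/3), with the real cube root
    return 1.0 + (4.0 / 3.0) * math.copysign(abs(y) ** (1.0 / 3.0), y)

# Difference quotients |f(y) - f(0)| / |y| blow up as y -> 0,
# so the Lipschitz condition fails on every interval containing 0.
quotients = [abs(f(y) - f(0.0)) / y for y in (1e-3, 1e-6, 1e-9)]
print(quotients)   # grows roughly by a factor of 100 at each step
```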
Example:
Following Dhar, we consider one dimensional motion modeled by Newton's equation
\[
\ddot{x} = f(x) = - \frac{{\text d}\Pi (x)}{{\text d}x} ,
\]
where \( \ddot{x} = {\text d}^2 x/{\text d}t^2 \)
is the acceleration of a unit point mass. It is always assumed that the potential
Π(x) and the force f(x) = - dΠ(x)/dx are
continuous functions of x.
We consider the case when the potential function Π(x) has a single
singularity at the origin in the sense that \( \Pi'' (x) \)
or one of the higher derivatives does not exist there. Multiplying Newton's equation by \( \dot{x} \) and integrating gives the energy relation \( \frac{1}{2}\,\dot{x}^2 + \Pi (x) = E , \) so we consider the
initial value problem
\[
\dot{x} = g(x) = \pm \sqrt{2 \left( E - \Pi (x) \right)} , \qquad x(0) = 0 ,
\]
where \( E = v_0^2 /2 \) is the total energy of the
system and is a constant because Π(0) = 0. The sign of the square root
depends on the sign of the initial velocity \( v_0 . \)
The first order differential equation \( \dot{x} = g(x)
\) is equivalent to Newton's equation \( \ddot{x} =
f(x) \) only if the velocity is not zero because we multiplied by
\( \dot{x} . \) Therefore, we need to consider two
cases depending on whether the initial velocity is zero or not.
If \( v_0 \ne 0 , \) then \( E \ne 0 \) and the reciprocal 1/g(x) is finite and continuous in an interval containing the origin. Therefore, the first order differential equation
\( \dot{x} = g(x) \) can be integrated to give the unique solution \( t = \int_0^x {\text d}\xi/g(\xi ) . \) So the second order equation of motion \( \ddot{x} = f(x) \) has a unique solution irrespective of the type of singularity that Π(x) may have.
If \( v_0 = 0 , \) then E = 0 and the first order equation becomes \( \dot{x} = \pm \sqrt{-2\,\Pi (x)} . \) It can be integrated only when the values of the integrals
\[
t = \int_0^x \frac{{\text d}\xi}{\sqrt{-2\,\Pi (\xi )}}
\]
are real and finite. When these integrals exist, the second order differential equation of motion has an additional solution provided that f(0) = 0.
Now we can formulate the condition for uniqueness of solutions to the IVP for Newton's equation of motion: the solution is unique unless \( v_0 = 0 , \) \( f(0) = 0 , \) and the integrals \( \int_0^{x} {\text d}\xi /\sqrt{-2\,\Pi (\xi )} \) converge near the origin.
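As a concrete illustration (our own example, not taken from Dhar's paper): take \( \Pi (x) = -\tfrac{9}{2}\, |x|^{4/3} , \) so that \( f(x) = 6\,x^{1/3} , \) \( f(0) = 0 , \) and \( \int_0 {\text d}\xi /\sqrt{-2\,\Pi (\xi )} = \int_0 {\text d}\xi / (3\,|\xi |^{2/3}) \) converges. Accordingly, both \( x(t) \equiv 0 \) and \( x(t) = t^3 \) solve \( \ddot{x} = f(x) \) with zero initial data. A numerical check (plain Python):

```python
import math

# Nonuniqueness in Newton's equation: with f(x) = 6*x^(1/3),
# both x(t) = 0 and x(t) = t^3 satisfy x'' = f(x), x(0) = x'(0) = 0.
def cbrt(x):
    # real cube root, valid for negative arguments
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def f(x):
    return 6.0 * cbrt(x)

h = 1e-4
for x_of_t in (lambda t: 0.0, lambda t: t ** 3):
    for t in (0.5, 1.0, 2.0):
        # central second difference approximates x''(t)
        acc = (x_of_t(t + h) - 2 * x_of_t(t) + x_of_t(t - h)) / h ** 2
        assert abs(acc - f(x_of_t(t))) < 1e-4, (t, acc)
print("both trajectories satisfy x'' = f(x)")
```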
■
Review of Definition of Various Means.
For a given finite real number α, the α-th power mean \( m_{\alpha} \) of positive scalars 𝑎 and b is defined as
\begin{equation}
m_{\alpha} = \left( \frac{a^{\alpha} + b^{\alpha}}{2} \right)^{1/\alpha} , \qquad m_0 = \lim_{\alpha \to 0} m_{\alpha} = \sqrt{ab} . \label{power}
\end{equation}
Some related means do not belong to this family; for instance,
\begin{equation}
L = \frac{2 \left( a^2 + ab + b^2 \right)}{3 \left( a + b \right)} \qquad (\mbox{Centroidal mean}) . \label{centroidal}
\end{equation}
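For quick reference, a small sketch (plain Python; function names are our own) computes the power means and the centroidal mean, illustrating the ordering harmonic ≤ geometric ≤ arithmetic ≤ quadratic for a = 2, b = 8:

```python
import math

def power_mean(a, b, alpha):
    """alpha-th power mean of positive a, b; alpha = 0 is the geometric mean."""
    if alpha == 0:
        return math.sqrt(a * b)
    return ((a ** alpha + b ** alpha) / 2.0) ** (1.0 / alpha)

def centroidal_mean(a, b):
    return 2.0 * (a * a + a * b + b * b) / (3.0 * (a + b))

a, b = 2.0, 8.0
# harmonic (alpha = -1), geometric (0), arithmetic (1), quadratic (2)
print([power_mean(a, b, al) for al in (-1, 0, 1, 2)])
print(centroidal_mean(a, b))
```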
Dhar, A., Nonuniqueness in the solutions of Newton's equation of motion, American Journal of Physics, 1993, Vol. 61, No. 1, pp. 58--61; doi: 10.1119/1.17411
Hales, A.W. and Sells, G.R., Multiple solutions of a differential equation, The American Mathematical Monthly, 1966, Vol. 73, No. 6, pp. 672--673.
Wend, D.V.V., Uniqueness of solutions of ordinary differential equations,
The American Mathematical Monthly, 1967, Vol. 74, No. 8, pp. 948--950.