In this chapter we go to work finding solutions to some important differential equations, using for this purpose the processes shown in the preceding chapters.
The beginner, who now knows how easy most of those processes are in themselves, will here begin to realize that integration is an art. As in all arts, so in this, facility can be acquired only by diligent and regular practice. He who would attain that facility must work out examples, and more examples, and yet more examples, such as are found abundantly in all the regular treatises on the Calculus. Our purpose here must be to afford the briefest introduction to serious work.
Example (1.) Find the solution of the differential equation \[ ay + b \frac{dy}{dx} = 0. \]
Transposing we have \[ b \frac{dy}{dx} = -ay. \]
Now the mere inspection of this relation tells us that we have got to do with a case in which $\dfrac{dy}{dx}$ is proportional to $y$. If we think of the curve which will represent $y$ as a function of $x$, it will be such that its slope at any point will be proportional to the ordinate at that point, and will be a negative slope if $y$ is positive. So obviously the curve will be a die-away curve, and the solution will contain $\epsilon^{-x}$ as a factor. But, without presuming on this bit of sagacity, let us go to work.
As both $y$ and $dy$ occur in the equation and on opposite sides, we can do nothing until we get both $y$ and $dy$ to one side, and $dx$ to the other. To do this, we must split our usually inseparable companions $dy$ and $dx$ from one another. \[ \frac{dy}{y} = - \frac{a}{b}\, dx. \]
Having done the deed, we now can see that both sides have got into a shape that is integrable, because we recognize $\dfrac{dy}{y}$, or $\dfrac{1}{y}\, dy$, as a differential that we have met with when differentiating logarithms. So we may at once write down the instructions to integrate, \[ \int \frac{dy}{y} = \int -\frac{a}{b}\, dx; \] and doing the two integrations, we have: \[ \log_\epsilon y = -\frac{a}{b} x + \log_\epsilon C, \] where $\log_\epsilon C$ is the yet undetermined constant of integration. Then, delogarizing, we get: \[ y = C \epsilon^{-\frac{a}{b} x}, \] which is the solution required. Now, this solution looks quite unlike the original differential equation from which it was constructed: yet to an expert mathematician they both convey the same information as to the way in which $y$ depends on $x$.
Now, as to the $C$, its meaning depends on the initial value of $y$. For if we put $x = 0$ in order to see what value $y$ then has, we find that this makes $y = C \epsilon^{-0}$; and as $\epsilon^{-0} = 1$ we see that $C$ is nothing else than the particular value of $y$ at starting. This we may call $y_0$, and so write the solution as \[ y = y_0 \epsilon^{-\frac{a}{b} x}. \]
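It is always well to verify a solution by trial. Differentiating our result, we get \[ \frac{dy}{dx} = -\frac{a}{b}\, y_0 \epsilon^{-\frac{a}{b} x} = -\frac{a}{b}\, y; \] so that \[ ay + b \frac{dy}{dx} = ay - ay = 0, \] and the original equation is satisfied.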
Example (2.)
Let us take as an example to solve \[ ay + b \frac{dy}{dx} = g, \] where $g$ is a constant. Again, inspecting the equation will suggest, (1) that somehow or other $\epsilon^x$ will come into the solution, and (2) that if at any part of the curve $y$ becomes either a maximum or a minimum, so that $\dfrac{dy}{dx} = 0$, then $y$ will have the value $= \dfrac{g}{a}$. But let us go to work as before, separating the differentials and trying to transform the thing into some integrable shape. \begin{align*} b\frac{dy}{dx} &= g -ay; \\ \frac{dy}{dx} &= \frac{a}{b}\left(\frac{g}{a}-y\right); \\ \frac{dy}{y-\dfrac{g}{a}} &= -\frac{a}{b}\, dx. \end{align*}
Now we have done our best to get nothing but $y$ and $dy$ on one side, and nothing but $dx$ on the other. But is the result on the left side integrable?
It is of the same form as the result obtained in Example (1); so, writing the instructions to integrate, we have: \[ \int{\frac{dy}{y-\dfrac{g}{a}}} = - \int{\frac{a}{b}\, dx}; \] and, doing the integration, and adding the appropriate constant, \begin{align*} \log_\epsilon\left(y-\frac{g}{a}\right) &= -\frac{a}{b}x + \log_\epsilon C; \\ \text{whence}\;\; y-\frac{g}{a} &= C\epsilon^{-\frac{a}{b}x}; \\ \text{and finally,}\;\; y &= \frac{g}{a} + C\epsilon^{-\frac{a}{b}x}, \end{align*} which is the solution.
If the condition is laid down that $y = 0$ when $x = 0$ we can find $C$; for then the exponential becomes $= 1$; and we have \begin{align*} 0 &= \frac{g}{a} + C, \\ \text{or}\; C &= -\frac{g}{a}. \end{align*}
Putting in this value, the solution becomes \[ y = \frac{g}{a} (1-\epsilon^{-\frac{a}{b} x}). \]
But further, if $x$ grows indefinitely, $y$ will grow to a maximum; for when $x=\infty$, the exponential $= 0$, giving $y_{\text{max.}} = \dfrac{g}{a}$. Substituting this, we get finally \[ y = y_{\text{max.}}(1-\epsilon^{-\frac{a}{b} x}). \]
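As a check upon the work, differentiate this solution: \[ \frac{dy}{dx} = y_{\text{max.}}\, \frac{a}{b}\, \epsilon^{-\frac{a}{b} x} = \frac{g}{b}\, \epsilon^{-\frac{a}{b} x}; \] whence \[ ay + b \frac{dy}{dx} = g\left(1-\epsilon^{-\frac{a}{b} x}\right) + g \epsilon^{-\frac{a}{b} x} = g, \] as it ought to be; and $y = 0$ when $x = 0$, as was laid down.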
This result is also of importance in physical science.
Example (3.) Let $ay+b\frac{dy}{dt} = g · \sin 2\pi nt$.
We shall find this much less tractable than the preceding. First divide through by $b$. \[ \frac{dy}{dt} + \frac{a}{b}y = \frac{g}{b} \sin 2\pi nt. \]
Now, as it stands, the left side is not integrable. But it can be made so by the artifice–and this is where skill and practice suggest a plan–of multiplying all the terms by $\epsilon^{\frac{a}{b} t}$, giving us: \[ \frac{dy}{dt} \epsilon^{\frac{a}{b} t} + \frac{a}{b} y \epsilon^{\frac{a}{b} t} = \frac{g}{b} \epsilon^{\frac{a}{b} t} · \sin 2 \pi nt, \] which is the same as \[ \frac{dy}{dt} \epsilon^{\frac{a}{b} t} + y \frac{d(\epsilon^{\frac{a}{b} t})}{dt} = \frac{g}{b} \epsilon^{\frac{a}{b} t} · \sin 2 \pi nt; \] and this being a perfect differential may be integrated thus:–since, if $u = y\epsilon^{\frac{a}{b} t}$, $\dfrac{du}{dt} = \dfrac{dy}{dt} \epsilon^{\frac{a}{b} t} + y \dfrac{d(\epsilon^{\frac{a}{b} t})}{dt}$, \begin{align*} y \epsilon^{\frac{a}{b} t} &= \frac{g}{b} \int \epsilon^{\frac{a}{b} t} · \sin 2 \pi nt · dt + C, \\ \text{or}\quad y &= \frac{g}{b} \epsilon^{-\frac{a}{b} t} \int \epsilon^{\frac{a}{b} t} · \sin 2\pi nt · dt + C\epsilon^{-\frac{a}{b} t}. \tag*{[A]} \end{align*}
The last term is obviously a term which will die out as $t$ increases, and may be omitted. The trouble now comes in to find the integral that appears as a factor. To tackle this we resort to the device of integration by parts, the general formula for which is $\int u\, dv = uv - \int v\, du$. For this purpose write \begin{align*} &\left\{ \begin{aligned} u &= \epsilon^{\frac{a}{b} t}; \\ dv &= \sin 2\pi nt · dt. \end{aligned} \right. \\ \end{align*} We shall then have \begin{align*} &\left\{ \begin{aligned} du &= \epsilon^{\frac{a}{b} t} × \frac{a}{b}\, dt; \\ v &= - \frac{1}{2\pi n} \cos 2\pi nt. \end{aligned} \right. \end{align*}
Inserting these, the integral in question becomes: \begin{align*} \int \epsilon^{\frac{a}{b} t} &{} · \sin 2 \pi n t · dt \\ &= -\frac{1}{2 \pi n} · \epsilon^{\frac{a}{b} t} · \cos 2 \pi nt -\int -\frac{1}{2\pi n} \cos 2 \pi nt · \epsilon^{\frac{a}{b} t} · \frac{a}{b}\, dt \\ &= -\frac{1}{2 \pi n} \epsilon^{\frac{a}{b} t} \cos 2 \pi nt +\frac{a}{2 \pi nb} \int \epsilon^{\frac{a}{b} t} · \cos 2 \pi nt · dt. \tag*{[B]} \end{align*}
The last integral is still irreducible. To evade the difficulty, repeat the integration by parts of the left side, but treating it in the reverse way by writing: \begin{align*} &\left\{ \begin{aligned} u &= \sin 2 \pi n t ; \\ dv &= \epsilon^{\frac{a}{b} t} · dt; \end{aligned} \right. \\[1ex] \text{whence}\; &\left\{ \begin{aligned} du &= 2 \pi n · \cos 2 \pi n t · dt; \\ v &= \frac{b}{a} \epsilon^{\frac{a}{b} t}. \end{aligned} \right. \end{align*}
Inserting these, we get \begin{align*} \int \epsilon^{\frac{a}{b} t} &{} · \sin 2 \pi n t · dt\\ &= \frac{b}{a} · \epsilon^{\frac{a}{b} t} · \sin 2 \pi n t - \frac{2 \pi n b}{a} \int \epsilon^{\frac{a}{b} t} · \cos 2 \pi n t · dt. \tag*{[C]} \end{align*}
Noting that the final intractable integral in [C] is the same as that in [B], we may eliminate it, by multiplying [B] by $\dfrac{2 \pi nb}{a}$, and multiplying [C] by $\dfrac{a}{2 \pi nb}$, and adding them.
The result, when cleared down, is: \begin{align*} \int \epsilon^{\frac{a}{b} t} · \sin 2 \pi n t · dt &= \epsilon^{\frac{a}{b} t} \left\{\frac{ ab · \sin 2 \pi nt - 2 \pi n b^2 · \cos 2 \pi n t}{ a^2 + 4 \pi^2 n^2 b^2 } \right\} \tag*{[D]} &\\ \end{align*} Inserting this value in [A], we get \begin{align*} y &= g \left\{\frac{ a · \sin 2 \pi n t - 2 \pi n b · \cos 2 \pi nt}{ a^2 + 4 \pi^2 n^2 b^2}\right\}. & \end{align*}
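Before passing on, it is well to test [D] by differentiation. By the rule for differentiating a product, the right-hand side of [D] gives $\epsilon^{\frac{a}{b} t}$ multiplied by \[ \frac{\dfrac{a}{b}\left(ab · \sin 2 \pi nt - 2 \pi n b^2 · \cos 2 \pi nt\right) + 2 \pi n ab · \cos 2 \pi nt + 4 \pi^2 n^2 b^2 · \sin 2 \pi nt}{a^2 + 4 \pi^2 n^2 b^2} = \sin 2 \pi nt, \] the cosine terms cancelling; so that [D] does indeed give back the quantity we set out to integrate.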
To simplify still further, let us imagine an angle $\phi$ such that $\tan \phi = \dfrac{2 \pi n b}{ a}$. Then \[ \sin \phi = \frac{2 \pi nb}{\sqrt{a^2 + 4 \pi^2 n^2 b^2}}, \] and \[ \cos \phi = \frac{a}{\sqrt{a^2 + 4 \pi^2 n^2 b^2}}. \\ \] Substituting these, we get: \[ y = g \frac{\cos \phi · \sin 2 \pi nt - \sin \phi · \cos 2 \pi nt}{\sqrt{a^2 + 4 \pi^2 n^2 b^2}}, \\ \] which may be written \[ y = g \frac{\sin(2 \pi nt - \phi)}{\sqrt{a^2 + 4 \pi^2 n^2 b^2}}, \] which is the solution desired.
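To make sure of this solution, let us put it back into the original equation. Differentiating, \[ \frac{dy}{dt} = \frac{2 \pi n g · \cos(2 \pi nt - \phi)}{\sqrt{a^2 + 4 \pi^2 n^2 b^2}}; \] so that \begin{align*} ay + b\frac{dy}{dt} &= g\, \frac{a · \sin(2 \pi nt - \phi) + 2 \pi n b · \cos(2 \pi nt - \phi)}{\sqrt{a^2 + 4 \pi^2 n^2 b^2}} \\ &= g\left\{\cos \phi · \sin(2 \pi nt - \phi) + \sin \phi · \cos(2 \pi nt - \phi)\right\} \\ &= g · \sin 2 \pi nt, \end{align*} as it should be. The term $C\epsilon^{-\frac{a}{b} t}$ which we dropped contributes nothing, for, as in Example (1), it satisfies $ay + b\dfrac{dy}{dt} = 0$ of itself.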
This is indeed none other than the equation of an alternating electric current, where $g$ represents the amplitude of the electromotive force, $n$ the frequency, $a$ the resistance, $b$ the coefficient of self-induction of the circuit, and $\phi$ is an angle of lag.
Example (4.) Suppose that $M\, dx + N\, dy = 0.$
We could integrate this expression directly, if $M$ were a function of $x$ only, and $N$ a function of $y$ only; but, if both $M$ and $N$ are functions that depend on both $x$ and $y$, how are we to integrate it? Is it itself an exact differential? That is: have $M$ and $N$ each been formed by partial differentiation from some common function $U$, or not? If they have, then \[\left\{ \begin{aligned} \frac{\partial U}{\partial x} = M, \\ \frac{\partial U}{\partial y} = N. \end{aligned} \right. \] And if such a common function exists, then \[ \frac{\partial U}{\partial x}\, dx + \frac{\partial U}{\partial y}\, dy \] is an exact differential.
Now the test of the matter is this. If the expression is an exact differential, it must be true that \begin{align*} \frac{\partial M}{\partial y} &= \frac{\partial N}{\partial x}; \\ \text{for then}\; \frac{\partial^2 U}{\partial y\, \partial x} &= \frac{\partial^2 U}{\partial x\, \partial y}, \end{align*} which is necessarily true.
Take as an illustration the equation \[ (1 + 3 xy)\, dx + x^2\, dy = 0. \]
Is this an exact differential or not? Apply the test. \[\left\{ \begin{aligned} \frac{\partial(1 + 3xy)}{\partial y} &= 3x, \\ \frac{\partial(x^2)}{\partial x} &= 2x, \end{aligned} \right. \] which do not agree. Therefore, it is not an exact differential, and the two functions $1+3xy$ and $x^2$ have not come from a common original function.
It is possible in such cases, however, to discover an integrating factor, that is to say, a factor such that, if the expression is multiplied throughout by it, it becomes an exact differential. There is no one rule for discovering such an integrating factor; but experience will usually suggest one. In the present instance $2x$ will act as such. Multiplying by $2x$, we get \[ (2x + 6x^2y)\, dx + 2x^3\, dy = 0. \]
Now apply the test to this. \[ \left\{ \begin{aligned} \frac{\partial(2x + 6x^2y)}{\partial y} &= 6x^2, \\ \frac{\partial(2x^3)}{\partial x} &= 6x^2, \end{aligned} \right. \] which agrees. Hence this is an exact differential, and may be integrated. Now, if $w = 2x^3y$, \[ dw=6x^2y\, dx + 2x^3\, dy. \] Hence \[ \int 6x^2y\, dx + \int 2x^3\, dy=w=2x^3y; \] so that we get \[ U = x^2 + 2x^3y + C. \]
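As a check, differentiate the result: if $U = x^2 + 2x^3y + C$, \[ dU = 2x\, dx + 6x^2y\, dx + 2x^3\, dy = (2x + 6x^2y)\, dx + 2x^3\, dy, \] which is exactly what we had after multiplying by the factor $2x$; and dividing by $2x$ restores the original expression $(1 + 3xy)\, dx + x^2\, dy$. The solution of the given equation is therefore $x^2 + 2x^3y = \text{constant}$.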
Example (5.) $\dfrac{d^2 y}{dt^2} + n^2 y = 0$.

In this case we have a differential equation of the second order, in which $y$ appears in the form of a second differential coefficient, as well as in person.
Transposing, we have $\dfrac{d^2 y}{dt^2} = - n^2 y$.
It appears from this that we have to do with a function such that its second differential coefficient is proportional to itself, but with reversed sign. In Chapter XV. we found that there was such a function–namely, the sine (or the cosine also) which possessed this property. So, without further ado, we may infer that the solution will be of the form $y = A \sin (pt + q)$. However, let us go to work.
Multiply both sides of the original equation by $2\dfrac{dy}{dt}$ and integrate, giving us $2\dfrac{d^2 y}{dt^2}\, \dfrac{dy}{dt} + 2n^2 y \dfrac{dy}{dt} = 0$, and, as \[ 2 \frac{d^2y}{dt^2}\, \frac{dy}{dt} = \frac{d \left(\dfrac{dy}{dt}\right)^2}{dt},\quad \left(\frac{dy}{dt}\right)^2 + n^2 (y^2-C^2) = 0, \] $C$ being a constant. Then, taking the square roots, \[ \frac{dy}{dt} = n \sqrt{C^2 - y^2}\quad \text{and}\quad \frac{dy}{\sqrt{C^2 - y^2}} = n · dt. \]
But it can be shown that \[ \frac{1}{\sqrt{C^2 - y^2}} = \frac{d \left(\arcsin \dfrac{y}{C}\right)}{dy}; \] whence, passing from angles to sines, \[ \arcsin \frac{y}{C} = nt + C_1\quad \text{and}\quad y = C \sin (nt + C_1), \] where $C_1$ is a constant angle that comes in by integration.
Or, preferably, this may be written \[ y = A \sin nt + B \cos nt, \text{ which is the solution.} \]
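That this is really a solution may be seen at once by differentiating it twice; for \[ \frac{d^2y}{dt^2} = -An^2 \sin nt - Bn^2 \cos nt = -n^2 y, \] so that $\dfrac{d^2y}{dt^2} + n^2y = 0$. And the two forms agree; for $C \sin (nt + C_1) = C \cos C_1 · \sin nt + C \sin C_1 · \cos nt$, so that $A = C \cos C_1$ and $B = C \sin C_1$.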
Example (6.) $\dfrac{d^2 y}{dx^2} - n^2 y = 0$.
Here we have obviously to deal with a function $y$ which is such that its second differential coefficient is proportional to itself. The only function we know that has this property is the exponential function, and we may be certain therefore that the solution of the equation will be of that form.
Proceeding as before, by multiplying through by $2 \dfrac{dy}{dx}$, and integrating, we get $2\dfrac{d^2 y}{dx^2}\, \dfrac{dy}{dx} - 2n^2 y \dfrac{dy}{dx}=0$, and, as \[ 2\frac{d^2 y}{dx^2}\, \frac{dy}{dx} = \frac{d \left(\dfrac{dy}{dx}\right)^2}{dx},\quad \left(\frac{dy}{dx}\right)^2 - n^2 (y^2 + c^2) = 0, \\ \frac{dy}{dx} - n \sqrt{y^2 + c^2} = 0, \] where $c$ is a constant, and $\dfrac{dy}{\sqrt{y^2 + c^2}} = n\, dx$.
Now, if \[ \quad w = \log_\epsilon ( y+ \sqrt{y^2+ c^2}) = \log_\epsilon u,\\ \frac{dw}{du} = \frac{1}{u},\quad \frac{du}{dy} = 1 + \frac{y}{\sqrt{y^2 + c^2}} = \frac{y + \sqrt{ y^2 + c^2}}{\sqrt{y^2 + c^2}} \\ \] and \[ \frac{dw}{dy} = \frac{1}{\sqrt{ y^2 + c^2}}. \]
Hence, integrating, this gives us \[ \log_\epsilon (y + \sqrt{y^2 + c^2} ) = nx + \log_\epsilon C, \\ y + \sqrt{y^2 + c^2} = C \epsilon^{nx}. \tag*{(1)} \\ \] \[ \text{Now}\; \qquad ( y + \sqrt{y^2 + c^2} ) × ( -y + \sqrt{y^2 + c^2} ) = c^2 ; \\ \text{whence}\; \qquad -y + \sqrt{y^2 + c^2} = \dfrac{c^2}{C} \epsilon^{-nx}. \tag*{(2)} \]
Subtracting (2) from (1) and dividing by $2$, we then have \[ y = \frac{1}{2} C \epsilon^{nx} - \frac{1}{2}\, \frac{c^2}{C} \epsilon^{-nx}, \] which is more conveniently written \[ y = A \epsilon^{nx} + B \epsilon^{-nx}. \] Or, the solution, which at first sight does not look as if it had anything to do with the original equation, shows that $y$ consists of two terms, one of which grows exponentially as $x$ increases, and of a second term which dies away as $x$ increases.
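Here again the result may be verified by differentiating twice: \[ \frac{dy}{dx} = nA \epsilon^{nx} - nB \epsilon^{-nx}, \quad \frac{d^2y}{dx^2} = n^2 A \epsilon^{nx} + n^2 B \epsilon^{-nx} = n^2 y; \] so that $\dfrac{d^2y}{dx^2} - n^2 y = 0$, as required.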
Example (7.) Let \begin{align*} b \frac{d^2y}{dt^2} + a \frac{dy}{dt} + gy &= 0. \end{align*}
Examination of this expression will show that, if $b = 0$, it has the form of Example (1), the solution of which was a negative exponential. On the other hand, if $a = 0$, its form becomes the same as that of Example (6), the solution of which is the sum of a positive and a negative exponential. It is therefore not very surprising to find that the solution of the present example is \begin{align*} y &= (\epsilon^{-mt})(A \epsilon^{nt} + B \epsilon^{-nt}), \\ \text{where}\; m &= \frac{a}{2b}\quad \text{and}\quad n = \sqrt{\frac{a^2}{4b^2} - \frac{g}{b}}. \end{align*}
The steps by which this solution is reached are not given here; they may be found in advanced treatises.
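The form of the solution may, however, be verified by trial. Assume $y = \epsilon^{pt}$, where $p$ is a constant; then \[ b \frac{d^2y}{dt^2} + a \frac{dy}{dt} + gy = (bp^2 + ap + g)\epsilon^{pt}, \] which vanishes only if $bp^2 + ap + g = 0$; whence \[ p = -\frac{a}{2b} \pm \sqrt{\frac{a^2}{4b^2} - \frac{g}{b}} = -m \pm n, \] and the sum of the two exponentials so obtained, $A\epsilon^{(-m+n)t} + B\epsilon^{(-m-n)t}$, is precisely $\epsilon^{-mt}(A\epsilon^{nt} + B\epsilon^{-nt})$.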
Example (8.) \[ \frac{d^2y}{dt^2} = a^2 \frac{d^2y}{dx^2}. \]
It was seen earlier that this equation was derived from the original \[ y = F(x+at) + f(x-at), \] where $F$ and $f$ were any arbitrary functions of their arguments $x+at$ and $x-at$.
Another way of dealing with it is to transform it by a change of variables into \[ \frac{d^2y}{du · dv} = 0, \] where $u = x + at$, and $v = x - at$, leading to the same general solution. If we consider a case in which $F$ vanishes, then we have simply \[ y = f(x-at); \] and this merely states that, at the time $t = 0$, $y$ is a particular function of $x$, and may be looked upon as denoting that the curve of the relation of $y$ to $x$ has a particular shape. Then any change in the value of $t$ is equivalent simply to an alteration in the origin from which $x$ is reckoned. That is to say, it indicates that, the form of the function being conserved, it is propagated along the $x$ direction with a uniform velocity $a$; so that whatever the value of the ordinate $y$ at any particular time $t_0$ at any particular point $x_0$, the same value of $y$ will appear at the subsequent time $t_1$ at a point further along, the abscissa of which is $x_0 + a(t_1 - t_0)$. In this case the simplified equation represents the propagation of a wave (of any form) at a uniform speed along the $x$ direction.
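That $y = f(x - at)$ really satisfies the equation is easily verified; for, writing $f'$ and $f''$ for the first and second derived functions of $f$, \[ \frac{dy}{dt} = -a f'(x-at), \quad \frac{d^2y}{dt^2} = a^2 f''(x-at); \qquad \frac{dy}{dx} = f'(x-at), \quad \frac{d^2y}{dx^2} = f''(x-at); \] so that $\dfrac{d^2y}{dt^2} = a^2 \dfrac{d^2y}{dx^2}$. The same holds for $F(x+at)$, and therefore for the sum of the two.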
If the differential equation had been written \[ m \frac{d^2y}{dt^2} = k\, \frac{d^2y}{dx^2}, \] the solution would have been the same, but the velocity of propagation would have had the value \[ a = \sqrt{\frac{k}{m}}. \]
You have now been personally conducted over the frontiers into the enchanted land. And in order that you may have a handy reference to the principal results, the author, in bidding you farewell, begs to present you with a passport in the shape of a convenient collection of standard forms.

In the middle column are set down a number of the functions which most commonly occur. The results of differentiating them are set down on the left; the results of integrating them are set down on the right. May you find them useful!