Questions of uniqueness and existence are important elements in differential equations. Here's a very general form of a differential equation: \dv{x}{t} = F(t,x). First, here are the function behavior tests.

Continuity
The weakest statement. A function f is continuous at a point y if and only if:

\forall \epsilon > 0, \exists \delta > 0 \text{ such that } |x - y| < \delta \implies |f(x) - f(y)| < \epsilon
Lipschitz Condition
A stronger statement. The Lipschitz Condition is a stronger test than continuity, requiring that:

|F(t,x) - F(t,y)| \leq L |x - y|

for all t \in I and x, y \in \omega, with L \in (0,\infty) named the "Lipschitz Constant", in the dependent variable x. Reshaping this into a one-dimensional statement about slopes, we have that:

\frac{|F(t,x) - F(t,y)|}{|x - y|} \leq L
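The Lipschitz Condition can be read as a bound on every secant slope of F in x. As a numerical illustration (an assumed example, not from the notes), we can estimate the Lipschitz constant of F(t, x) = \sin(x) by sampling secant slopes; since the derivative of \sin is \cos, which is bounded by 1, the true constant is L = 1.

```python
# Estimate the Lipschitz constant of F(t, x) = sin(x) in x by sampling
# secant slopes |F(t,x) - F(t,y)| / |x - y| over random pairs (x, y).
import math
import random

def F(t, x):
    return math.sin(x)  # F ignores t here, but the test is the same shape

random.seed(0)
slopes = []
for _ in range(10_000):
    t = random.uniform(0.0, 1.0)
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    if x != y:
        slopes.append(abs(F(t, x) - F(t, y)) / abs(x - y))

L_est = max(slopes)
print(f"estimated Lipschitz constant: {L_est:.4f}")  # typically just under 1
```

Bounded secant slopes for every pair of points is exactly the Lipschitz Condition; the sampled maximum approaches L = 1 from below but never exceeds it.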
The important thing here is that it's the same L for all t. However, the slope itself need not settle down: it can oscillate anywhere within the bound (e.g. f(x) = |x| is Lipschitz with L = 1 yet has no derivative at 0).

Differentiable
We finally have the strongest statement.
To make something differentiable, its difference quotients have to not only stay bounded but converge to a constant C (the derivative) at each point.

Existence and Uniqueness Check for a differential equation
Assume some F : I \times \omega \to \mathbb{R}^{n} (a function F whose domain is in some space I \times \omega) is bounded and continuous and satisfies the Lipschitz Condition, and let x_{0} \in \omega. Then there exists T_{0} > 0 and a unique solution x(t) passing through x_{0} of the standard first-order differential equation

\dv{x}{t} = F(t,x), \quad x(t_{0}) = x_{0}

for |t - t_{0}| < T_{0}. To actually check that F satisfies the Lipschitz Condition, we usually just take the partial derivative of F with respect to x (the dependent variable, yes it's x); if that derivative exists and is bounded on some region, F satisfies the Lipschitz Condition on that region. (For example, F(t,x) = x^{2} has \pdv{F}{x} = 2x, which is bounded on any bounded \omega, so F is Lipschitz there.)

Proof
So we started at:

\dv{x}{t} = F(t,x)
We can separate this expression and integrate:

\dd{x} = F(t,x) \dd{t}
At this point, if F is separable, we could separate it out by \dd{t} and take the integral on the right. However, we are only interested in existence and uniqueness, so we will do something named…

Picard Integration
Picard Integration is an inductive iteration scheme which leverages the Lipschitz Condition to show that the integral form of the equation converges. Begin with the result that all first-order differential equations have the shape (after forcibly separating):

x(t) = x_{0} + \int_{t_{0}}^{t} F(s, x(s)) \dd{s}
We hope that the inductive sequence:

x_{n+1}(t) = x_{0} + \int_{t_{0}}^{t} F(s, x_{n}(s)) \dd{s}
converges to the same result above (that is, the functions x_{n}(s) stop varying, and therefore we converge to a solution x(s)), which shows existence. This is hard! Here's a digression/example: if we fix a time t = 10, we hope to say that:
\forall \epsilon > 0, \exists M < \infty, \forall n > M, \quad |x_{n}(10) - x(10)| < \epsilon
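A numerical sketch of this digression (an assumed example, not from the notes, with the fixed time taken as t = 1 rather than 10 to keep the numbers tame): for \dv{x}{t} = x, x(0) = 1 on [0, 1], the Picard iterates computed with trapezoid integration approach the true solution e^{t} at every fixed time.

```python
# Run the Picard iteration x_{n+1}(t) = x0 + integral of F(s, x_n(s)) ds
# numerically for dx/dt = x, x(0) = 1, whose exact solution is e^t.
import math

def picard(F, x0, t_grid, n_iters):
    """Iterate the Picard scheme, approximating the integral by trapezoids."""
    x = [x0] * len(t_grid)                  # x_0(t) = x0, the constant first guess
    for _ in range(n_iters):
        integrand = [F(t, xi) for t, xi in zip(t_grid, x)]
        new_x, acc = [x0], 0.0
        for i in range(1, len(t_grid)):
            dt = t_grid[i] - t_grid[i - 1]
            acc += 0.5 * (integrand[i] + integrand[i - 1]) * dt
            new_x.append(x0 + acc)
        x = new_x                           # one step of the inductive sequence
    return x

t_grid = [i / 1000 for i in range(1001)]    # t in [0, 1]
x = picard(lambda t, xi: xi, 1.0, t_grid, 20)
err = max(abs(xi - math.exp(t)) for t, xi in zip(t_grid, x))
print(f"max error vs e^t on [0, 1]: {err:.2e}")
print(f"x(1) = {x[-1]:.6f}, e = {math.e:.6f}")
```

Each pass of the outer loop is one step of the inductive sequence; after 20 iterations the iterate at the fixed time t = 1 has stopped varying to within discretization error, which is exactly the pointwise convergence the digression asks for.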
Now, the thing is, for the iteration above to converge uniformly, we hope that M stays fixed \forall t (that is, that all of the domain converges at once, after the same number of iterations). Taking the original expression, we apply the following page of algebra to it. Finally, we apply the Lipschitz Condition (which we may, because our setup is that F satisfies it), and we have that:
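In standard presentations of Picard iteration, the resulting chain of inequalities is (the supremum below is over s in the interval):

```latex
% One application of the Lipschitz Condition to consecutive iterates:
|x_{n+1}(t) - x_{n}(t)|
  \le \int_{t_{0}}^{t} \left| F(s, x_{n}(s)) - F(s, x_{n-1}(s)) \right| \dd{s}
  \le L \int_{t_{0}}^{t} |x_{n}(s) - x_{n-1}(s)| \dd{s}
% Feeding this bound into itself n times gives factorial decay:
|x_{n+1}(t) - x_{n}(t)|
  \le \frac{(L |t - t_{0}|)^{n}}{n!} \sup_{s} |x_{1}(s) - x_{0}(s)|
% Since (L T_{0})^{n} / n! \to 0 independently of t, the same M works for
% every t with |t - t_{0}| < T_{0}: the convergence is uniform.
```

This factorial bound is precisely why M can stay fixed across the whole domain: the tail estimate does not depend on which t we look at.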