Differential equation

A differential equation is a mathematical equation that relates some function with its derivatives. In applications, the functions represent physical quantities, the derivatives represent their rates of change, and the equation defines a relationship between the two. Because such relations are common, differential equations play a prominent role in many disciplines, including engineering, physics, and biology. In pure mathematics, differential equations are studied from several different perspectives, mostly concerned with their solutions: the set of functions that satisfy the equation. Only the simplest differential equations are solvable by explicit formulas. When a closed-form expression for the solution is not available, the solution may be approximated numerically using computers. The theory of dynamical systems puts emphasis on qualitative analysis of systems described by differential equations, while many numerical methods have been developed to determine solutions to a given degree of accuracy. Differential equations first came into existence with the invention of calculus by Newton and Leibniz.

In Chapter 2 of his 1671 work Methodus fluxionum et Serierum Infinitarum, Isaac Newton listed three kinds of differential equations: dy/dx = f(x), dy/dx = f(x, y), and x₁∂y/∂x₁ + x₂∂y/∂x₂ = y. He solved these examples and others using infinite series and discussed the non-uniqueness of solutions. Jacob Bernoulli proposed the Bernoulli differential equation in 1695; this is an ordinary differential equation of the form y′ + P(x)y = Q(x)y^n, for which Leibniz obtained solutions the following year by simplifying it. The problem of a vibrating string, such as that of a musical instrument, was studied by Jean le Rond d'Alembert, Leonhard Euler, Daniel Bernoulli, and Joseph-Louis Lagrange. In 1746, d'Alembert discovered the one-dimensional wave equation, and within ten years Euler discovered the three-dimensional wave equation. The Euler–Lagrange equation was developed in the 1750s by Euler and Lagrange in connection with their studies of the tautochrone problem. This is the problem of determining a curve on which a weighted particle will fall to a fixed point in a fixed amount of time, independent of the starting point.
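In modern notation, the simplification by which Leibniz solved the Bernoulli equation can be sketched as the substitution v = y^(1−n), which turns it into a linear first-order equation (the symbols P, Q, v follow today's conventions, not the original sources):

```latex
% Bernoulli equation:
y' + P(x)\,y = Q(x)\,y^{n}
% Divide by y^{n} (assuming y \neq 0):
y^{-n} y' + P(x)\, y^{1-n} = Q(x)
% Substitute v = y^{1-n}, so that v' = (1-n)\, y^{-n} y':
\frac{v'}{1-n} + P(x)\, v = Q(x)
\quad\Longrightarrow\quad
v' + (1-n)\,P(x)\, v = (1-n)\,Q(x)
```

The last line is linear in v and can be solved by an integrating factor, after which y is recovered from v = y^(1−n).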

Lagrange solved this problem in 1755 and sent the solution to Euler. Both further developed Lagrange's method and applied it to mechanics, which led to the formulation of Lagrangian mechanics. In 1822, Fourier published his work on heat flow in Théorie analytique de la chaleur, in which he based his reasoning on Newton's law of cooling: the flow of heat between two adjacent molecules is proportional to the small difference of their temperatures. Contained in this book was Fourier's proposal of his heat equation for conductive diffusion of heat; this partial differential equation is now taught to every student of mathematical physics. Differential equations arise naturally throughout the sciences. For example, in classical mechanics, the motion of a body is described by its position and velocity as time varies. Newton's laws allow these variables to be expressed dynamically as a differential equation for the unknown position of the body as a function of time. In some cases, this differential equation may be solved explicitly. An example of modelling a real-world problem using differential equations is the determination of the velocity of a ball falling through the air, considering only gravity and air resistance.
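In modern notation, the heat equation mentioned here takes the following form in one spatial dimension; u(x, t) denotes temperature and α the thermal diffusivity (these symbol choices are today's conventions, not Fourier's own):

```latex
\frac{\partial u}{\partial t} = \alpha \, \frac{\partial^{2} u}{\partial x^{2}}
```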

The ball's acceleration towards the ground is the acceleration due to gravity minus the deceleration due to air resistance. Gravity is considered constant, and air resistance may be modeled as proportional to the ball's velocity. This means that the ball's acceleration, which is the derivative of its velocity, depends on the velocity. Finding the velocity as a function of time involves solving a differential equation and verifying its validity. Differential equations can be divided into several types. Apart from describing the properties of the equation itself, these classes of differential equations can help inform the choice of approach to a solution. Commonly used distinctions include whether the equation is ordinary or partial, linear or non-linear, and homogeneous or inhomogeneous; this list is far from exhaustive. An ordinary differential equation is an equation containing an unknown function of one real or complex variable x, its derivatives, and some given functions of x.
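The falling-ball model described above can be written as dv/dt = g − kv, with v(0) = 0. A minimal sketch comparing the closed-form solution with a numerical one, where k is a hypothetical drag coefficient chosen for illustration:

```python
import math

def velocity_exact(t, g=9.81, k=0.5):
    # Closed-form solution of dv/dt = g - k*v with v(0) = 0:
    #   v(t) = (g/k) * (1 - exp(-k*t)).
    # k is an illustrative drag coefficient, not a measured value.
    return (g / k) * (1.0 - math.exp(-k * t))

def velocity_euler(t, g=9.81, k=0.5, steps=100_000):
    # Numerical check via the forward Euler method:
    # repeatedly step v by its derivative (g - k*v) times dt.
    dt = t / steps
    v = 0.0
    for _ in range(steps):
        v += (g - k * v) * dt
    return v
```

As t grows, the exponential term vanishes and the velocity approaches the terminal value g/k, which is the qualitative behavior the text describes: acceleration depends on (and is eventually cancelled by) velocity.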

Mathematical analysis

Mathematical analysis is the branch of mathematics dealing with limits and related theories, such as differentiation, integration, measure, infinite series, and analytic functions. These theories are usually studied in the context of real and complex numbers and functions. Analysis evolved from calculus, which involves the elementary concepts and techniques of analysis. Analysis may be distinguished from geometry; however, it can be applied to any space of mathematical objects that has a definition of nearness or distance. Mathematical analysis formally developed in the 17th century during the Scientific Revolution, but many of its ideas can be traced back to earlier mathematicians. Early results in analysis were implicitly present in the early days of ancient Greek mathematics. For instance, an infinite geometric sum is implicit in Zeno's paradox of the dichotomy. Greek mathematicians such as Eudoxus and Archimedes made more explicit, but informal, use of the concepts of limits and convergence when they used the method of exhaustion to compute the areas and volumes of regions and solids. The explicit use of infinitesimals appears in Archimedes' The Method of Mechanical Theorems, a work rediscovered in the 20th century.

In Asia, the Chinese mathematician Liu Hui used the method of exhaustion in the 3rd century AD to find the area of a circle. Zu Chongzhi established a method that would later be called Cavalieri's principle to find the volume of a sphere in the 5th century. The Indian mathematician Bhāskara II gave examples of the derivative and used what is now known as Rolle's theorem in the 12th century. In the 14th century, Madhava of Sangamagrama developed infinite series expansions, such as the power series and the Taylor series, of functions such as sine, cosine, and arctangent. Alongside his development of the Taylor series of the trigonometric functions, he estimated the magnitude of the error terms created by truncating these series and gave a rational approximation of an infinite series. His followers at the Kerala School of Astronomy and Mathematics further expanded his works, up to the 16th century. The modern foundations of mathematical analysis were established in 17th-century Europe. Descartes and Fermat independently developed analytic geometry, and a few decades later Newton and Leibniz independently developed infinitesimal calculus, which grew, with the stimulus of applied work that continued through the 18th century, into analysis topics such as the calculus of variations, partial differential equations, Fourier analysis, and generating functions.
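The kind of truncated series and error estimate attributed to Madhava can be illustrated with the familiar Taylor series of sine; this is a sketch in modern terms (function and parameter names are mine), using the standard bound on the remainder after truncation:

```python
import math

def sine_series(x, terms=6):
    # Partial sum of the Taylor (Madhava) series for sine:
    #   sin x = x - x^3/3! + x^5/5! - ...
    total = 0.0
    for k in range(terms):
        total += (-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
    return total

def error_bound(x, terms=6):
    # Bound on the truncation error: the magnitude of the first
    # omitted term, |x|^(2*terms+1) / (2*terms+1)!, which dominates
    # the remainder of this alternating series.
    n = 2 * terms + 1
    return abs(x) ** n / math.factorial(n)
```

With six terms, the partial sum at x = 1 already agrees with sin(1) to better than nine decimal places, and the actual error stays within the computed bound.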

During this period, calculus techniques were applied to approximate discrete problems by continuous ones. In the 18th century, Euler introduced the notion of mathematical function. Real analysis began to emerge as an independent subject when Bernard Bolzano introduced the modern definition of continuity in 1816, but Bolzano's work did not become known until the 1870s. In 1821, Cauchy began to put calculus on a firm logical foundation by rejecting the principle of the generality of algebra used in earlier work, particularly by Euler. Instead, Cauchy formulated calculus in terms of geometric ideas and infinitesimals. Thus, his definition of continuity required an infinitesimal change in x to correspond to an infinitesimal change in y. He also introduced the concept of the Cauchy sequence and started the formal theory of complex analysis. Poisson, Liouville, and others studied partial differential equations and harmonic analysis. The contributions of these mathematicians and others, such as Weierstrass, developed the (ε, δ)-definition of limit approach, thus founding the modern field of mathematical analysis.

In the middle of the 19th century, Riemann introduced his theory of integration. The last third of the century saw the arithmetization of analysis by Weierstrass, who thought that geometric reasoning was inherently misleading and introduced the "epsilon-delta" definition of limit. Mathematicians started worrying that they were assuming the existence of a continuum of real numbers without proof. Dedekind then constructed the real numbers by Dedekind cuts, in which irrational numbers are formally defined and serve to fill the "gaps" between rational numbers, thereby creating a complete set: the continuum of real numbers, which had already been developed by Simon Stevin in terms of decimal expansions. Around that time, the attempts to refine the theorems of Riemann integration led to the study of the "size" of the set of discontinuities of real functions. Also, "monsters" (nowhere continuous functions, continuous but nowhere differentiable functions, space-filling curves) began to be investigated. In this context, Jordan developed his theory of measure, Cantor developed what is now called naive set theory, and Baire proved the Baire category theorem.
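The "epsilon-delta" definition of limit referred to here can be stated in modern notation as follows:

```latex
% Weierstrass's epsilon-delta definition of the limit of f at a:
\lim_{x \to a} f(x) = L
\;\iff\;
\forall \varepsilon > 0 \;\; \exists \delta > 0 \;\; \forall x :
\; 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon
```

This replaces Cauchy's appeal to infinitesimals with a statement quantified entirely over ordinary real numbers, which is what made the arithmetization of analysis possible.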

In the early 20th century, calculus was formalized using an axiomatic set theory. Lebesgue solved the problem of measure, and Hilbert introduced Hilbert spaces to solve integral equations. The idea of normed vector space was in the air, and in the 1920s Banach created functional analysis. In mathematics, a metric space is a set where a notion of distance between elements of the set is defined. Much of analysis happens in some metric space. Examples of analysis without a metric include measure theory and functional analysis. Formally, a metric space is an ordered pair (M, d), where M is a set and d is a metric on M, that is, a function d : M × M → ℝ satisfying the metric axioms.
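A metric pairs a set M with a distance function d satisfying non-negativity, identity of indiscernibles, symmetry, and the triangle inequality. A small sketch (function names are illustrative) that spot-checks these axioms on a finite sample of points, using the Euclidean distance on the plane as the canonical example:

```python
import itertools
import math

def euclidean(p, q):
    # Standard Euclidean distance on R^2, a canonical metric.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def is_metric_on_sample(d, points, tol=1e-12):
    # Spot-check the metric axioms on a finite sample of points.
    for x, y in itertools.product(points, repeat=2):
        if d(x, y) < -tol:                      # non-negativity
            return False
        if (d(x, y) < tol) != (x == y):         # identity of indiscernibles
            return False
        if abs(d(x, y) - d(y, x)) > tol:        # symmetry
            return False
    for x, y, z in itertools.product(points, repeat=3):
        if d(x, z) > d(x, y) + d(y, z) + tol:   # triangle inequality
            return False
    return True
```

Such a check can only refute, never prove, the axioms in general, but it makes the definition concrete; for instance, squared Euclidean distance passes the first three checks yet fails the triangle inequality on collinear points.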