# Differential (infinitesimal)

The term *differential* is used in calculus to refer to an infinitesimal (infinitely small) change in some varying quantity. For example, if x is a variable, then a change in the value of x is often denoted Δx (pronounced "delta x"). The differential dx represents an infinitely small change in the variable x. The idea of an infinitely small or infinitely slow change is, intuitively, extremely useful, and there are a number of ways to make the notion mathematically precise.

Using calculus, it is possible to relate the infinitely small changes of various variables to each other mathematically using derivatives. If y is a function of x, then the differential dy of y is related to dx by the formula

${\displaystyle dy={\frac {dy}{dx}}\,dx,}$

where ${\displaystyle {\frac {dy}{dx}}\,}$ denotes the derivative of y with respect to x. This formula summarizes the intuitive idea that the derivative of y with respect to x is the limit of the ratio of differences Δy/Δx as Δx becomes infinitesimal.
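This limiting behaviour can be checked numerically. The following sketch (not part of the article; the function y = x² is chosen purely for illustration) compares the actual change Δy with the differential dy = (dy/dx) dx as Δx shrinks; the discrepancy vanishes quadratically.

```python
# For y = x**2 the derivative is dy/dx = 2x, so the differential formula
# gives dy = 2x * dx.  As dx shrinks, the actual change
# delta_y = (x + dx)**2 - x**2 approaches the differential dy.
def f(x):
    return x ** 2

def dfdx(x):
    return 2 * x

x = 3.0
for dx in (1.0, 0.1, 0.01, 0.001):
    delta_y = f(x + dx) - f(x)   # actual change in y
    dy = dfdx(x) * dx            # differential dy = (dy/dx) dx
    print(dx, delta_y, dy, delta_y - dy)
```

For this particular function the error Δy − dy equals (Δx)² exactly, which is why the ratio Δy/Δx converges to the derivative as Δx becomes infinitesimal.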

There are several approaches for making the notion of differentials mathematically precise.

1. Differentials as linear maps. This approach underlies the definition of the derivative and the exterior derivative in differential geometry.[1]
2. Differentials as nilpotent elements of commutative rings. This approach is popular in algebraic geometry.[2]
3. Differentials in smooth models of set theory. This approach is known as synthetic differential geometry or smooth infinitesimal analysis and is closely related to the algebraic geometric approach, except that ideas from topos theory are used to hide the mechanisms by which nilpotent infinitesimals are introduced.[3]
4. Differentials as infinitesimals in hyperreal number systems, which are extensions of the real numbers that contain invertible infinitesimals and infinitely large numbers. This is the approach of nonstandard analysis pioneered by Abraham Robinson.[4]

These approaches are very different from each other, but they have in common the idea of being quantitative, i.e., saying not just that a differential is infinitely small, but how small it is.