# Gradient

In vector calculus, the **gradient** of a scalar-valued differentiable function *f* of several variables is the vector field (or vector-valued function) $\nabla f$ whose value at a point $p$ is the vector whose components are the partial derivatives of $f$ at $p$.[1][2][3][4][5][6][7][8][9] That is, for $f \colon \mathbb{R}^n \to \mathbb{R}$, its gradient $\nabla f \colon \mathbb{R}^n \to \mathbb{R}^n$ is defined at the point $p = (x_1, \ldots, x_n)$ in *n*-dimensional space as the vector:

$$\nabla f(p) = \begin{bmatrix} \dfrac{\partial f}{\partial x_1}(p) \\ \vdots \\ \dfrac{\partial f}{\partial x_n}(p) \end{bmatrix}.$$
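As an illustrative sketch (not part of the standard presentation), the componentwise definition above can be approximated numerically with central finite differences; the function $f(x, y) = x^2 + 3y$ and the step size `h` below are arbitrary choices for the example, chosen so the analytic gradient $(2x, 3)$ is easy to check:

```python
def f(v):
    x, y = v
    return x**2 + 3*y

def numerical_gradient(f, p, h=1e-6):
    """Central-difference approximation of the gradient of f at point p:
    the i-th component estimates the partial derivative with respect to x_i."""
    grad = []
    for i in range(len(p)):
        forward = list(p)
        forward[i] += h
        backward = list(p)
        backward[i] -= h
        grad.append((f(forward) - f(backward)) / (2 * h))
    return grad

p = [1.0, 2.0]
g = numerical_gradient(f, p)
# The analytic gradient at (1, 2) is (2*1, 3) = (2, 3); g agrees closely.
```

Each loop iteration perturbs a single coordinate, mirroring the fact that the gradient's components are the partial derivatives taken one variable at a time.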


The nabla symbol $\nabla$, written as an upside-down triangle and pronounced "del", denotes the vector differential operator.

The gradient vector can be interpreted as the "direction and rate of fastest increase". If the gradient of a function is non-zero at a point *p*, the direction of the gradient is the direction in which the function increases most quickly from *p*, and the magnitude of the gradient is the rate of increase in that direction, the greatest absolute directional derivative.[10][11][12][13][14][15][16] Further, the gradient is the zero vector at a point if and only if it is a stationary point (where the derivative vanishes). The gradient thus plays a fundamental role in optimization theory, where it is used to maximize a function by gradient ascent.
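Gradient ascent exploits exactly this property: repeatedly stepping in the direction of the gradient moves toward a local maximum. A minimal sketch, using the illustrative concave function $f(x, y) = -(x-1)^2 - (y+2)^2$ (maximum at $(1, -2)$) and arbitrary learning-rate and step-count choices:

```python
def grad(p):
    """Analytic gradient of f(x, y) = -(x-1)**2 - (y+2)**2."""
    x, y = p
    return [-2 * (x - 1), -2 * (y + 2)]

def gradient_ascent(grad, p, lr=0.1, steps=200):
    """Follow the gradient uphill: p <- p + lr * grad(p), repeated."""
    for _ in range(steps):
        g = grad(p)
        p = [pi + lr * gi for pi, gi in zip(p, g)]
    return p

p_max = gradient_ascent(grad, [5.0, 5.0])
# p_max converges to the maximizer (1, -2), where the gradient vanishes.
```

Note that the iteration stalls exactly where the gradient is the zero vector, matching the stationary-point characterization above.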

The gradient is dual to the total derivative $df$: the value of the gradient at a point is a *tangent* vector – a vector at each point; while the value of the derivative at a point is a *co*tangent vector – a linear function on vectors. They are related in that the dot product of the gradient of *f* at a point *p* with another tangent vector **v** equals the directional derivative of *f* at *p* along **v**; that is, $\nabla f(p) \cdot \mathbf{v} = \frac{\partial f}{\partial \mathbf{v}}(p) = df_p(\mathbf{v})$.
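This identity is easy to check numerically. The sketch below (function, point, and direction are arbitrary choices for illustration) compares the dot product of the analytic gradient with a central-difference estimate of the directional derivative, for $f(x, y) = x^2 y$ at $p = (1, 2)$ along the unit vector $\mathbf{v} = (0.6, 0.8)$:

```python
def f(v):
    x, y = v
    return x**2 * y

def grad_f(v):
    """Analytic gradient of f(x, y) = x**2 * y, namely (2xy, x**2)."""
    x, y = v
    return [2 * x * y, x**2]

def directional_derivative(f, p, v, h=1e-6):
    """Central-difference estimate of the derivative of f at p along v."""
    fp = f([pi + h * vi for pi, vi in zip(p, v)])
    fm = f([pi - h * vi for pi, vi in zip(p, v)])
    return (fp - fm) / (2 * h)

p, v = [1.0, 2.0], [0.6, 0.8]  # v is a unit vector
dot = sum(gi * vi for gi, vi in zip(grad_f(p), v))
dd = directional_derivative(f, p, v)
# dot and dd agree: grad f(p) . v equals the directional derivative along v.
```

Here $\nabla f(1, 2) = (4, 1)$, so the dot product is $4 \cdot 0.6 + 1 \cdot 0.8 = 3.2$, which the finite-difference estimate reproduces.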
The gradient admits multiple generalizations to more general functions on manifolds; see § Generalizations.