# Polar coordinate system

In mathematics, the **polar coordinate system** is a two-dimensional coordinate system in which each point on a plane is determined by a distance from a reference point and an angle from a reference direction. The reference point (analogous to the origin of a Cartesian coordinate system) is called the *pole*, and the ray from the pole in the reference direction is the *polar axis*. The distance from the pole is called the *radial coordinate*, *radial distance* or simply *radius*, and the angle is called the *angular coordinate*, *polar angle*, or *azimuth*.[1] Angles in polar notation are generally expressed in either degrees or radians (2π rad being equal to 360°).
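The relationship between polar coordinates (r, θ) and Cartesian coordinates (x, y) follows from trigonometry: x = r cos θ, y = r sin θ, and conversely r = √(x² + y²), θ = atan2(y, x). The sketch below illustrates these standard formulas; the function names are chosen for this example only:

```python
import math

def polar_to_cartesian(r, theta):
    """Convert polar (r, theta in radians) to Cartesian (x, y)."""
    return r * math.cos(theta), r * math.sin(theta)

def cartesian_to_polar(x, y):
    """Convert Cartesian (x, y) to polar (r, theta in (-pi, pi])."""
    # math.hypot computes sqrt(x**2 + y**2); atan2 picks the correct quadrant.
    return math.hypot(x, y), math.atan2(y, x)
```

Note that atan2 is used rather than a plain arctangent so the angle lands in the correct quadrant for all four sign combinations of x and y.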

Grégoire de Saint-Vincent and Bonaventura Cavalieri independently introduced the concepts in the mid-17th century, though the actual term *polar coordinates* has been attributed to Gregorio Fontana in the 18th century. The initial motivation for the introduction of the polar system was the study of circular and orbital motion.

Polar coordinates are most appropriate in any context where the phenomenon being considered is inherently tied to direction and length from a center point in a plane, such as spirals. Planar physical systems with bodies moving around a central point, or phenomena originating from a central point, are often simpler and more intuitive to model using polar coordinates.

The polar coordinate system is extended to three dimensions in two ways: the cylindrical and spherical coordinate systems.