# Phase (waves)

In physics and mathematics, the phase of a periodic function $F$ of some real variable $t$ (such as time) is an angle-like quantity representing the fraction of the cycle covered up to $t$. It is denoted $\phi(t)$ and expressed on a scale such that it varies by one full turn as the variable $t$ goes through each period (and $F(t)$ goes through each complete cycle). It may be measured in any angular unit such as degrees or radians, thus increasing by 360° or $2\pi$ as the variable $t$ completes a full period.[1]
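As a minimal sketch of this definition (the function name, the radian unit, and the cycle-start parameter `t0` are illustrative choices, not from the source), the phase is simply the elapsed fraction of a period scaled to a full turn:

```python
import math

def phase(t, period, t0=0.0):
    """Phase of a periodic signal at time t, in radians.

    t0 is the (arbitrary) start of a cycle; the result grows by
    2*pi (one full turn) each time t advances by one full period.
    """
    return 2 * math.pi * (t - t0) / period

# Advancing t by one full period increases the phase by 2*pi.
assert math.isclose(phase(3.0, period=3.0), 2 * math.pi)
```

Measuring in degrees instead would just replace the factor $2\pi$ with 360.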

This convention is especially appropriate for a sinusoidal function, since its value at any argument $t$ can then be expressed as the sine of the phase $\phi(t)$, multiplied by some factor (the amplitude of the sinusoid). (The cosine may be used instead of the sine, depending on where one considers each period to start.)

Usually, whole turns are ignored when expressing the phase, so that $\phi(t)$ is also a periodic function, with the same period as $F$, that repeatedly scans the same range of angles as $t$ goes through each period. Then $F$ is said to be "at the same phase" at two argument values $t_1$ and $t_2$ (that is, $\phi(t_1)=\phi(t_2)$) if the difference between them is a whole number of periods.
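The two ideas above can be sketched as follows (function names and the choice of the interval $[0, 2\pi)$ are illustrative assumptions): discarding whole turns is a modulo operation, and two arguments are at the same phase exactly when their difference is a whole number of periods.

```python
import math

def wrapped_phase(t, period, t0=0.0):
    # Ignore whole turns: map the phase into the interval [0, 2*pi).
    return (2 * math.pi * (t - t0) / period) % (2 * math.pi)

def same_phase(t1, t2, period):
    # t1 and t2 are at the same phase iff they differ by a whole
    # number of periods.
    ratio = (t2 - t1) / period
    return math.isclose(ratio, round(ratio))

assert same_phase(0.2, 2.2, period=1.0)       # differ by 2 periods
assert not same_phase(0.2, 0.7, period=1.0)   # differ by half a period
```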

The numeric value of the phase $\phi(t)$ depends on the arbitrary choice of the start of each period, and on the interval of angles that each period is to be mapped to.

The term "phase" is also used when comparing a periodic function $F$ with a shifted version $G$ of it. If the shift in $t$ is expressed as a fraction of the period and then scaled to an angle $\varphi$ spanning a whole turn, one gets the phase shift, phase offset, or phase difference of $G$ relative to $F$. If $F$ is a "canonical" function for a class of signals, as $\sin(t)$ is for all sinusoidal signals, then $\varphi$ is called the initial phase of $G$.
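This conversion from a time shift to a phase shift can be sketched directly (the function name and the radian/wrapping conventions are illustrative assumptions): the shift is divided by the period and scaled to a full turn.

```python
import math

def phase_shift(dt, period):
    # Express the shift dt as a fraction of the period, scale it to a
    # whole turn (2*pi radians), and discard whole turns.
    return (2 * math.pi * dt / period) % (2 * math.pi)

# G(t) = F(t - period/4) is shifted by a quarter period, i.e. a
# quarter turn (90 degrees, or pi/2 radians) relative to F.
assert math.isclose(phase_shift(0.25, period=1.0), math.pi / 2)
```

A shift by a whole number of periods yields a phase shift of zero, matching the convention of ignoring whole turns.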