# Normal distribution

In probability theory, a **normal** (or **Gaussian** or **Gauss** or **Laplace–Gauss**) **distribution** is a type of continuous probability distribution for a real-valued random variable. The general form of its probability density function is

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}.$$
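The density can be evaluated directly from this formula; the sketch below is a minimal illustration (the parameter values, grid, and integration range are arbitrary choices, not part of the definition) that also confirms numerically that the density integrates to 1 over the real line.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution with mean mu and standard deviation sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# A midpoint-rule sum over [mu - 10*sigma, mu + 10*sigma] approximates the
# integral of the density; it should come out very close to 1.
mu, sigma = 1.5, 2.0
n = 100_000
lo, hi = mu - 10 * sigma, mu + 10 * sigma
dx = (hi - lo) / n
total = sum(normal_pdf(lo + (i + 0.5) * dx, mu, sigma) for i in range(n)) * dx
print(total)  # very close to 1.0
```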

*(Figures: the probability density function, with the red curve showing the standard normal distribution, and the cumulative distribution function.)*

| Property | Value |
|---|---|
| Notation | $\mathcal{N}(\mu,\sigma^2)$ |
| Parameters | $\mu \in \mathbb{R}$ = mean (location), $\sigma^2 > 0$ = variance (squared scale) |
| Support | $x \in \mathbb{R}$ |
| PDF | $\frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$ |
| CDF | $\Phi\left(\frac{x-\mu}{\sigma}\right) = \frac{1}{2}\left[1 + \operatorname{erf}\left(\frac{x-\mu}{\sigma\sqrt{2}}\right)\right]$ |
| Quantile | $\mu + \sigma\sqrt{2}\,\operatorname{erf}^{-1}(2p-1)$ |
| Mean | $\mu$ |
| Median | $\mu$ |
| Mode | $\mu$ |
| Variance | $\sigma^2$ |
| MAD (mean absolute deviation) | $\sigma\sqrt{2/\pi}$ |
| Skewness | $0$ |
| Ex. kurtosis | $0$ |
| Entropy | $\frac{1}{2}\ln(2\pi e \sigma^2)$ |
| MGF | $\exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)$ |
| CF | $\exp\left(i\mu t - \frac{\sigma^2 t^2}{2}\right)$ |
| Fisher information | $\mathcal{I}(\mu,\sigma) = \begin{pmatrix} 1/\sigma^2 & 0 \\ 0 & 2/\sigma^2 \end{pmatrix}$ |
| Kullback–Leibler divergence | $\frac{1}{2}\left[\frac{\sigma_0^2}{\sigma_1^2} + \frac{(\mu_1-\mu_0)^2}{\sigma_1^2} - 1 + \ln\frac{\sigma_1^2}{\sigma_0^2}\right]$ from $\mathcal{N}(\mu_0,\sigma_0^2)$ to $\mathcal{N}(\mu_1,\sigma_1^2)$ |
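Several of these moments (mean, variance, mean absolute deviation) can be checked numerically by integrating against the density. The following sketch uses simple midpoint-rule quadrature; the parameter values and grid resolution are illustrative choices.

```python
import math

def pdf(x, mu, sigma):
    """Normal density with mean mu and standard deviation sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def expect(f, mu, sigma, n=200_000):
    """Midpoint-rule estimate of E[f(X)] for X ~ N(mu, sigma^2),
    truncating the integral at 12 standard deviations from the mean."""
    lo, hi = mu - 12 * sigma, mu + 12 * sigma
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        total += f(x) * pdf(x, mu, sigma)
    return total * dx

mu, sigma = 2.0, 3.0
mean = expect(lambda x: x, mu, sigma)               # should match mu
var = expect(lambda x: (x - mu) ** 2, mu, sigma)    # should match sigma^2
mad = expect(lambda x: abs(x - mu), mu, sigma)      # should match sigma * sqrt(2/pi)
print(mean, var, mad)
```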


The parameter $\mu$ is the mean or expectation of the distribution (and also its median and mode), while the parameter $\sigma$ is its standard deviation.[1] The variance of the distribution is $\sigma^2$.[2] A random variable with a Gaussian distribution is said to be **normally distributed**, and is called a **normal deviate**.

Normal distributions are important in statistics and are often used in the natural and social sciences to represent real-valued random variables whose distributions are not known.[3][4] Their importance is partly due to the central limit theorem, which states that, under some conditions, the average of many samples (observations) of a random variable with finite mean and variance is itself a random variable whose distribution converges to a normal distribution as the number of samples increases. Therefore, physical quantities that are expected to be the sum of many independent processes, such as measurement errors, often have distributions that are nearly normal.[5]
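The central limit theorem can be demonstrated with a short simulation: averages of i.i.d. Uniform(0, 1) draws (mean 1/2, variance 1/12) have mean 1/2 and variance 1/(12n), and their distribution is close to normal even though a single uniform draw is far from bell-shaped. The sample sizes below are arbitrary illustrative choices.

```python
import math
import random
import statistics

random.seed(0)

# Average n uniform draws, repeated over many trials.
n, trials = 48, 20_000
averages = [statistics.fmean(random.random() for _ in range(n))
            for _ in range(trials)]

print(statistics.fmean(averages))      # close to 1/2
print(statistics.pvariance(averages))  # close to 1/(12*n)

# If the averages are near normal, roughly 68% of them should fall
# within one standard deviation of the mean.
sd = math.sqrt(1 / (12 * n))
frac = sum(abs(a - 0.5) <= sd for a in averages) / trials
print(frac)  # roughly 0.68
```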

Moreover, Gaussian distributions have some unique properties that are valuable in analytic studies. For instance, any linear combination of a fixed collection of normal deviates is a normal deviate. Many results and methods, such as propagation of uncertainty and least squares parameter fitting, can be derived analytically in explicit form when the relevant variables are normally distributed.
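The linear-combination property can likewise be checked by simulation: for independent normals $X$ and $Y$, the combination $aX + bY + c$ is normal with mean $a\mu_X + b\mu_Y + c$ and variance $a^2\sigma_X^2 + b^2\sigma_Y^2$. The coefficients and parameters in this sketch are arbitrary illustrative choices.

```python
import random
import statistics

random.seed(42)

# Fixed coefficients and component distributions (illustrative values).
a, b, c = 2.0, -3.0, 1.0
mu_x, s_x = 1.0, 0.5
mu_y, s_y = -2.0, 1.5

# Sample the linear combination a*X + b*Y + c of two independent normals.
samples = [a * random.gauss(mu_x, s_x) + b * random.gauss(mu_y, s_y) + c
           for _ in range(200_000)]

# Theory: mean = a*mu_x + b*mu_y + c = 9.0,
#         variance = a^2*s_x^2 + b^2*s_y^2 = 21.25.
print(statistics.fmean(samples))      # close to 9.0
print(statistics.pvariance(samples))  # close to 21.25
```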

A normal distribution is sometimes informally called a **bell curve**.[6] However, many other distributions are bell-shaped (such as the Cauchy, Student's *t*, and logistic distributions).