# Series (mathematics)

In mathematics, a series is, roughly speaking, a description of the operation of adding infinitely many quantities, one after the other, to a given starting quantity.[1] The study of series is a major part of calculus and its generalization, mathematical analysis. Series are used in most areas of mathematics, even for studying finite structures (such as in combinatorics) through generating functions. In addition to their ubiquity in mathematics, infinite series are also widely used in other quantitative disciplines such as physics, computer science, statistics and finance.

For a long time, the idea that such a potentially infinite summation could produce a finite result was considered paradoxical. This paradox was resolved using the concept of a limit during the 17th century. Zeno's paradox of Achilles and the tortoise illustrates this counterintuitive property of infinite sums: Achilles runs after a tortoise, but when he reaches the position of the tortoise at the beginning of the race, the tortoise has reached a second position; when he reaches this second position, the tortoise is at a third position, and so on. Zeno concluded that Achilles could never reach the tortoise, and thus that movement does not exist. Zeno divided the race into infinitely many sub-races, each requiring a finite amount of time, so that the total time for Achilles to catch the tortoise is given by a series. The resolution of the paradox is that, although the series has an infinite number of terms, it has a finite sum, which gives the time necessary for Achilles to catch up with the tortoise.
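The series in question can be made concrete under assumed figures (an illustration only; the specific speeds and head start are not part of Zeno's account): if the tortoise starts 100 m ahead and Achilles runs ten times as fast, then each sub-race covers one tenth of the distance of the previous one, and the distances form a geometric series with a finite sum:

```latex
% Assumed figures: 100 m head start, Achilles ten times as fast.
% Total distance Achilles runs before catching the tortoise:
100 + 10 + 1 + \tfrac{1}{10} + \cdots
  = \sum_{i=0}^{\infty} \frac{100}{10^{i}}
  = \frac{100}{1 - \tfrac{1}{10}}
  = \frac{1000}{9} \approx 111.1\ \text{m}.
```

Since Achilles runs at a constant speed, the total time is finite in exactly the same way.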

In modern terminology, any (ordered) infinite sequence ${\displaystyle (a_{1},a_{2},a_{3},\ldots )}$ of terms (that is, numbers, functions, or anything that can be added) defines a series, which is the operation of adding the ${\displaystyle a_{i}}$ one after the other. To emphasize that there are an infinite number of terms, a series may be called an infinite series. Such a series is represented (or denoted) by an expression like

${\displaystyle a_{1}+a_{2}+a_{3}+\cdots ,}$

or, using the summation sign,

${\displaystyle \sum _{i=1}^{\infty }a_{i}.}$

The infinite sequence of additions expressed by a series cannot be carried out explicitly in any finite amount of time. However, if the set to which the terms and their finite sums belong has a notion of limit, it is sometimes possible to assign a value to a series, called the sum of the series. This value is the limit, as n tends to infinity (if the limit exists), of the finite sums of the first n terms of the series, which are called the nth partial sums of the series. That is,

${\displaystyle \sum _{i=1}^{\infty }a_{i}=\lim _{n\to \infty }\sum _{i=1}^{n}a_{i}.}$

When this limit exists, one says that the series is convergent or summable, or that the sequence ${\displaystyle (a_{1},a_{2},a_{3},\ldots )}$ is summable. In this case, the limit is called the sum of the series. Otherwise, the series is said to be divergent.[2]
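As a concrete numerical illustration (a minimal sketch, not from the source), the geometric series with terms ${\displaystyle a_{i}=1/2^{i}}$ has partial sums that approach 1, its sum:

```python
def partial_sums(n_terms):
    """Return the first n_terms partial sums s_n = sum_{i=1}^{n} 1/2**i
    of the geometric series 1/2 + 1/4 + 1/8 + ..., whose sum is 1."""
    sums = []
    total = 0.0
    for i in range(1, n_terms + 1):
        total += 1.0 / 2**i  # add the i-th term to the running total
        sums.append(total)
    return sums

print(partial_sums(5))  # [0.5, 0.75, 0.875, 0.9375, 0.96875]
```

Each partial sum falls short of 1 by exactly ${\displaystyle 1/2^{n}}$, so the limit of the partial sums, and hence the sum of the series, is 1.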

The notation ${\textstyle \sum _{i=1}^{\infty }a_{i}}$ denotes both the series—that is the implicit process of adding the terms one after the other indefinitely—and, if the series is convergent, the sum of the series—the result of the process. This is a generalization of the similar convention of denoting by ${\displaystyle a+b}$ both the addition—the process of adding—and its result—the sum of a and b.

Generally, the terms of a series come from a ring, often the field ${\displaystyle \mathbb {R} }$ of the real numbers or the field ${\displaystyle \mathbb {C} }$ of the complex numbers. In this case, the set of all series is itself a ring (and even an associative algebra), in which the addition consists of adding the series term by term, and the multiplication is the Cauchy product.