$$ \newcommand{\RR}{\mathbb{R}} \newcommand{\QQ}{\mathbb{Q}} \newcommand{\CC}{\mathbb{C}} \newcommand{\NN}{\mathbb{N}} \newcommand{\ZZ}{\mathbb{Z}} \newcommand{\FF}{\mathbb{F}} \renewcommand{\epsilon}{\varepsilon} % ALTERNATE VERSIONS % \newcommand{\uppersum}[1]{{\textstyle\sum^+_{#1}}} % \newcommand{\lowersum}[1]{{\textstyle\sum^-_{#1}}} % \newcommand{\upperint}[1]{{\textstyle\smallint^+_{#1}}} % \newcommand{\lowerint}[1]{{\textstyle\smallint^-_{#1}}} % \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} \newcommand{\uppersum}[1]{U_{#1}} \newcommand{\lowersum}[1]{L_{#1}} \newcommand{\upperint}[1]{U_{#1}} \newcommand{\lowerint}[1]{L_{#1}} \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} % extra auxiliary and additional topic/proof \newcommand{\extopic}{\bigstar} \newcommand{\auxtopic}{\blacklozenge} \newcommand{\additional}{\oplus} \newcommand{\partitions}[1]{\mathcal{P}_{#1}} \newcommand{\sampleset}[1]{\mathcal{S}_{#1}} \newcommand{\erf}{\operatorname{erf}} $$

13  Convergence Tests

Highlights of this Chapter: Finding the value of a series explicitly is difficult, so we develop some theory to determine convergence without explicitly finding the limit. In particular, we provide a simple criterion to determine if a series diverges, formalize the technique of comparison, and develop the root and ratio tests as means of comparing with geometric series. We also introduce the notion of absolute convergence, and prove the convergence of alternating series.

In this section, we build up some technology to prove the convergence (and divergence) of series without explicitly being able to compute the limit of partial sums. Such results will prove incredibly useful, as in the future we will encounter many theorems of the form if \(\sum a_n\) converges, then… and we will need a method of proving convergence to continue.

13.1 Divergent and Alternating Series

We begin with some low-hanging fruit: easy-to-check conditions on the terms of a series which either guarantee its convergence or divergence.

Corollary 13.1 (Divergence Test) If a series \(\sum a_n\) converges, then \(\lim a_n=0\). Equivalently, if \(a_n\not\to 0\) then \(\sum a_n\) diverges.

Proof. Assume the sequence \(s_N=\sum_{n=0}^Na_n\) of partial sums converges to some limit \(L\). Then \(s_{N-1}\) also converges to \(L\), as truncating the first term of a sequence doesn’t affect convergence. Thus, we may use the limit law for subtraction to conclude the sequence \(s_N-s_{N-1}\) converges, with limit zero. But \[s_N-s_{N-1}=\sum_{n=0}^N a_n -\sum_{n=0}^{N-1}a_n = a_{N}\] That is, \(a_N\) converges and \(\lim a_N =0\).

Here’s an alternative proof using the Cauchy criterion:

Proof. Let’s apply the Cauchy condition to a sum consisting of the single term \(a_m\). This says that for all \(\epsilon>0\) there is some \(N\) where for all \(m>N\) we have \[\left|\sum_{k=m}^{m}a_k\right|=|a_m|<\epsilon\]

But making \(|a_m|<\epsilon\) for all \(m>N\) is exactly the definition of \(a_m\to 0\).

Remark 13.1. The converse of the divergence test is false as the harmonic series \(\sum \frac{1}{n}\) has terms going to zero, but we’ve already seen the overall sum diverges.
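For readers who like to experiment, here is a short numerical sketch in Python illustrating the remark (an illustration only, not a proof): the terms \(1/n\) shrink to zero, yet the partial sums keep passing every milestone we print.

```python
# Numerical sketch: the harmonic series shows the converse of the
# divergence test fails.  Its terms a_n = 1/n go to zero, yet the
# partial sums keep growing past any bound.
partial = 0.0
for n in range(1, 100_001):
    partial += 1.0 / n
    if n in (10, 1000, 100_000):
        print(f"n = {n:6d}   a_n = {1.0 / n:.6f}   s_n = {partial:.4f}")
```

Of course no finite computation can prove divergence; the point is only that the terms and the partial sums behave very differently.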

Happily, there are some mild extra conditions, beyond the terms going to zero, that do ensure convergence! The most famous such set of conditions requires the series to be alternating, with terms converging monotonically to zero.

Theorem 13.1 (Alternating Series Test) Let \(a_n\to 0\) be a monotonically decreasing sequence. Then \(\sum (-1)^na_n\) converges.

Proof. We will show the sequence \(s_n=\sum_{k=0}^n (-1)^k a_k\) of partial sums converges by showing it is Cauchy. Looking at a concrete example, \[s_{12}-s_{5}=(-1)^6a_6+(-1)^7a_7+(-1)^8a_8+(-1)^9a_9+(-1)^{10}a_{10}+(-1)^{11}a_{11}+(-1)^{12}a_{12}\] \[=a_6-a_7+a_8-a_9+a_{10}-a_{11}+a_{12}\]

We can group the terms in two different ways to bound the difference \(|s_{12}-s_{5}|\): first, \[=(a_6-a_7)+(a_8-a_9)+(a_{10}-a_{11})+a_{12}\] Here each set of parentheses encloses a nonnegative quantity (since \(a_k\) is monotone decreasing), so the sum is nonnegative: \(s_{12}-s_{5}\geq 0\). But, sliding our groupings down by one, \[=a_6+(-a_7+a_8)+(-a_9+a_{10})+(-a_{11}+a_{12})\]

now each set of parentheses encloses a non-positive quantity, which we are taking away from \(a_6\). Thus \(s_{12}-s_{5}\leq a_6\). Together these bounds imply \(|s_{12}-s_{5}|\leq a_6\). Indeed, one can confirm this holds generally: for \(m<n\), \(|s_n-s_m|\) is bounded above by \(a_m\) (in our example, \(|s_{12}-s_5|\leq a_6\leq a_5\) by monotonicity). Using that \(a_k\to 0\), for any \(\epsilon>0\) there’s an \(N\) beyond which \(0\leq a_m<\epsilon\), and thus, for any \(n>m>N\) we have \[|s_n-s_m|\leq a_m<\epsilon\] So, the sequence of partial sums is Cauchy, and thus convergent, as required.
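For those who like to experiment, here is a quick numerical check in Python of the bound \(|s_n-s_m|\leq a_m\) used in the proof (an illustration, not a proof), taking the alternating harmonic series with \(a_k=1/(k+1)\):

```python
# Numerical check of the key bound from the proof: |s_n - s_m| <= a_m
# for all m < n, using the monotone decreasing terms a_k = 1/(k+1).
a = [1.0 / (k + 1) for k in range(200)]

s, total = [], 0.0
for k, term in enumerate(a):
    total += (-1) ** k * term
    s.append(total)

ok = all(abs(s[n] - s[m]) <= a[m] + 1e-12
         for m in range(len(s)) for n in range(m + 1, len(s)))
print(ok)  # → True
```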

Exercise 13.1 Prove the general inequality used in the proof above: if \(s_n=\sum_{k=0}^n (-1)^ka_k\) is the partial sums of a monotone decreasing sequence \(a_n\to 0\), then \[|s_n-s_m|\leq a_m\hspace{0.5cm}\textrm{for all }m<n\]

Hint: think about the different cases, depending on whether \(m\) and \(n\) are even or odd.

Exercise 13.2 Give an alternative proof of the alternating series test, using the Nested Interval Theorem. Here’s a potential outline:

Let \(a_n\to 0\) be monotone decreasing and \(s_n=\sum_{k=0}^n(-1)^ka_k\) be its partial sums.

  • Show the even subsequence \(s_0,s_2,s_4,\ldots\) is monotone decreasing
  • Show the odd subsequence \(s_1,s_3,s_5,\ldots\) is monotone increasing
  • Show the intervals \([s_{2n-1},s_{2n}]\) are nested, and their lengths are going to zero.
  • Show there is a unique point in their intersection, and argue this is the limit of the partial sums \(s_n\).

The monotonicity hypothesis of the Alternating Series test cannot be dropped, as the following example shows.

Example 13.1 (Monotonicity is Required) Consider the infinite series \[\frac{1}{\sqrt{2}-1}-\frac{1}{\sqrt{2}+1}+\frac{1}{\sqrt{3}-1}-\frac{1}{\sqrt{3}+1}+\cdots\]

This series is alternating and its terms converge to zero, but they do not decrease monotonically. To see it diverges, look at the sum of each consecutive pair of terms:

\[\frac{1}{\sqrt{n}-1}-\frac{1}{\sqrt{n}+1}=\frac{(\sqrt{n}+1)-(\sqrt{n}-1)}{n-1}=\frac{2}{n-1}\]

Thus, if we add up the first \(2(N-1)\) terms of the series we get \[\left(\frac{1}{\sqrt{2}-1}-\frac{1}{\sqrt{2}+1}\right)+\left(\frac{1}{\sqrt{3}-1}-\frac{1}{\sqrt{3}+1}\right)+\cdots+\left(\frac{1}{\sqrt{N}-1}-\frac{1}{\sqrt{N}+1}\right)\] \[=\left(\frac{2}{1}\right)+\left(\frac{2}{2}\right)+\cdots +\left(\frac{2}{N-1}\right)\] \[=2\left(1+\frac{1}{2}+\frac{1}{3}+\cdots+\frac{1}{N-1}\right)\] This is twice the sum of the first \(N-1\) terms of the harmonic series, which we know diverges! Thus the sequence of partial sums of our series has a divergent subsequence, and so the series diverges as well.
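A short numerical sketch in Python confirms this computation (an illustration only): summing the terms in pairs for \(n=2\) up to \(N\) reproduces twice a harmonic partial sum, which grows without bound.

```python
import math

# Numerical sketch of Example 13.1: the pair sums match twice the
# harmonic partial sums, which diverge.
def pair_sum(N):
    """Sum of the first 2(N - 1) terms of the series, i.e. pairs n = 2..N."""
    return sum(1.0 / (math.sqrt(n) - 1) - 1.0 / (math.sqrt(n) + 1)
               for n in range(2, N + 1))

for N in (10, 100, 1000):
    harmonic = sum(1.0 / k for k in range(1, N))
    print(N, pair_sum(N), 2 * harmonic)  # the last two columns agree
```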

13.2 Absolute vs Conditional Convergence

Below we will develop several theorems that apply exclusively to series of positive terms. That may seem at first to be a significant obstacle, as many series involve both addition and subtraction! So, we take some time here to assuage such worries, and provide a means of probing a general series using information about its nonnegative counterpart.

Definition 13.1 (Absolute Convergence) A series \(\sum a_n\) converges absolutely if the associated series of absolute values \(\sum |a_n|\) is convergent.

Of course, such a definition is only useful if facts about the nonnegative series imply facts about the original. Happily, that is the case.

Theorem 13.2 (Absolute Convergence Implies Convergence) Every absolutely convergent series is a convergent series.

Proof. Let \(\sum a_n\) be absolutely convergent. Then \(\sum |a_n|\) converges, and its partial sums satisfy the Cauchy criterion. This means for any \(\epsilon>0\) we can find an \(N\) where, for all \(n>m>N\), \[|a_m|+|a_{m+1}|+\cdots+|a_n|<\epsilon\]

But, by the triangle inequality we know that \[|a_m+a_{m+1}+\cdots+a_n|\leq |a_m|+|a_{m+1}|+\cdots+|a_n|\] Thus, our original series \(\sum a_k\) satisfies the Cauchy criterion, as \[\left|\sum_{k=m}^na_k\right|<\epsilon\] And, since the Cauchy criterion is equivalent to convergence, this implies \(\sum a_k\) is a convergent series.

Definition 13.2 A series converges conditionally if it converges, but is not absolutely convergent.

Such series caused much trouble in the foundations of analysis, as they can exhibit rather strange behavior. We met one such series in the introduction, the alternating sum of \(1/n\) which seemed to converge to different values depending on the order we added its terms. Here we begin an investigation into such phenomena.
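That strange behavior can be glimpsed numerically. The Python sketch below (an illustration only, not a proof of anything) sums the alternating harmonic series in its usual order, and then in a rearranged order taking two positive terms for every negative one; the two orderings appear to settle on different values.

```python
import math

# Numerical sketch of conditional convergence's strangeness: the same
# terms, added in two different orders, settle near different values.
N = 30_000
usual = sum((-1) ** (n + 1) / n for n in range(1, N + 1))

pos = iter(range(1, 10 * N, 2))   # odd denominators:  +1/1, +1/3, ...
neg = iter(range(2, 10 * N, 2))   # even denominators: -1/2, -1/4, ...
rearranged = 0.0
for _ in range(N // 3):           # blocks of (+, +, -)
    rearranged += 1.0 / next(pos) + 1.0 / next(pos) - 1.0 / next(neg)

print(usual, math.log(2))         # usual order: near ln(2) ≈ 0.693
print(rearranged)                 # rearranged: near 1.040, a different value!
```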

13.3 Comparison

One of the most useful convergence tests for series is comparison. This lets us show that a series we care about (which may be hard to compute with) converges or diverges by comparing it to a simpler series, much like the squeeze theorem did for us with sequences. This theorem gives less information than the squeeze theorem (it doesn’t tell us the exact value of the series we are interested in) but it is also easier to use (it requires only a one-sided bound, not upper and lower bounds with the same limit).

Theorem 13.3 (Comparison For Series) Let \(\sum a_n\) and \(\sum b_n\) be two series of nonnegative terms, with \(0\leq a_n\leq b_n\).

  • If \(\sum b_n\) converges, then \(\sum a_n\) converges.
  • If \(\sum a_n\) diverges, then \(\sum b_n\) diverges.

The proof is just a rehashing of our old friend, Monotone Convergence.

Proof. We prove the first of the two claims, and leave the second as an exercise. If \(x_n\geq 0\), then the sequence \(s_n=\sum_{k=0}^n x_k\) of partial sums is monotone increasing (by definition \(s_{n}=s_{n-1}+x_n\) with \(x_n\geq 0\), so \(s_{n}\geq s_{n-1}\) for all \(n\)).

Thus, the partial sums of \(\sum a_n\) and \(\sum b_n\) are monotone sequences. If \(\sum b_n\) converges, we know by the Monotone Convergence Theorem that its limit \(\beta\) is the supremum of the partial sums, so for all \(n\) \[\sum_{k=0}^n b_k \leq \beta\] But, since \(a_k\leq b_k\) for all \(k\), we see the same is true of the partial sums \[\sum_{k=0}^n a_k\leq \sum_{k=0}^n b_k\] Stringing these inequalities together, we see that the partial sums of \(\sum a_k\) are bounded above by \(\beta\). Since they are also monotone (as sums of nonnegative terms), the Monotone Convergence Theorem assures us that they converge, as claimed.
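Here is a quick numerical sketch of comparison in action, using a series we invent purely for illustration: since \(0\leq 1/(2^n+n)\leq 1/2^n\) and the geometric series \(\sum 1/2^n\) converges to \(2\), the partial sums of the smaller series increase monotonically while staying trapped below \(2\).

```python
# Numerical sketch of comparison (illustrative series, not from the text):
# termwise 0 <= 1/(2**n + n) <= 1/2**n, and sum(1/2**n) = 2.
a = [1.0 / (2 ** n + n) for n in range(60)]
b = [1.0 / 2 ** n for n in range(60)]
assert all(0 <= x <= y for x, y in zip(a, b))  # termwise comparison

print(sum(a))  # bounded above by sum(b) = 2, consistent with convergence
```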

Exercise 13.3 Let \(\sum a_n\) and \(\sum b_n\) be two series of nonnegative terms, with \(0\leq a_n\leq b_n\). Prove that if \(\sum a_n\) diverges, then \(\sum b_n\) diverges.

A very effective means of proving the convergence or divergence of certain series is to compare them with geometric series, which we understand completely. Such comparisons only work if the terms of our series shrink at least as fast as a geometric progression (that is, exponentially fast), so coarse methods like this are bound to prove unhelpful for various particular examples. Nonetheless, the ease of use of such comparisons is unparalleled, making them an essential element of our toolkit.

13.3.1 The Root Test

Geometric series are particularly simple to work with, as the \(n^{th}\) term is just a constant raised to the \(n^{th}\) power. Said another way, the \(n^{th}\) root of the \(n^{th}\) term is constant! This suggests a concrete way to compare a series with a geometric series: comparing the roots of its terms with a constant. This is commonly known as the root test.

Proposition 13.1 (The Root Test) Let \(\sum a_n\) be a series, and assume that \(\lim \sqrt[n]{|a_n|}\) converges to some limit \(L\). Then

  • \(\sum a_n\) converges if \(L<1\)
  • \(\sum a_n\) diverges if \(L>1\)

Proof. We work in cases, starting with \(L<1\). Here, for any \(r\) strictly between \(L\) and \(1\), we can eventually bound our sequence \(\sqrt[n]{|a_n|}\) above by \(r\): setting \(\epsilon=r-L\), there’s an \(N\) beyond which \(\left|\sqrt[n]{|a_n|}-L\right|<\epsilon\), so \(\sqrt[n]{|a_n|}<L+\epsilon=r\).

Taking the \(n^{th}\) power of both sides, we see that for all \(n\geq N\) we have \(|a_n|< r^n\), and so we can compare the series \(\sum_{n\geq N} |a_n|\) with the geometric series \(\sum_{n\geq N}r^n\), which converges as \(|r|=r<1\). Thus \(\sum_{n\geq N}|a_n|\) converges, and as the first finitely many terms of a series do not affect convergence, \(\sum_{n\geq 0}|a_n|\) converges as well. But now we are done: this is exactly the statement that \(\sum a_n\) converges absolutely, and absolute convergence implies convergence.

If \(L>1\), we can use the same sort of reasoning for any \(1<r<L\) to eventually bound our sequence \(\sqrt[n]{|a_n|}\) below by \(r\), so that \(|a_n|>r^n\) for all \(n\geq N\). This sets up a comparison with a divergent geometric series. But even more directly: since \(r>1\) we have \(r^n>1\), so \(|a_n|>1\) for all sufficiently large \(n\), and it’s impossible that \(a_n\to 0\). Thus \(\sum a_n\) diverges by the Divergence Test.

Example 13.2 (Using the Root Test) Consider the series \[ \sum_{n=1}^{\infty} \left( \frac{2n}{3n+1} \right)^n. \]

Applying the root test means we must take the \(n^{th}\) root of the absolute value of \(a_n = \left( \frac{2n}{3n+1} \right)^n\). Since for all \(n\in\NN\) these terms are positive, this is \[ \sqrt[n]{|a_n|} = \frac{2n}{3n+1}. \]

Using the limit laws, we can check this limit exists: \[ \lim \frac{2n}{3n+1} = \frac{2}{3} \]

Since \(\frac{2}{3} < 1\), the root test guarantees that the series \(\sum_{n=1}^{\infty} \left( \frac{2n}{3n+1} \right)^n\) converges absolutely, and thus converges.
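A quick numerical sketch in Python (illustration only) matches this conclusion: the \(n^{th}\) roots approach \(2/3\), and the partial sums level off.

```python
# Numerical sketch of Example 13.2: the n-th roots of the terms tend
# to 2/3 < 1, and the partial sums of the series stabilize.
roots = [(2 * n) / (3 * n + 1) for n in range(1, 10_001)]
print(roots[-1])  # very close to 2/3

partial = sum(((2 * n) / (3 * n + 1)) ** n for n in range(1, 200))
print(partial)    # the series converges, so this value has stabilized
```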

Remark 13.2. When \(\lim \sqrt[n]{|a_n|}=1\) we gain no information, as there are both convergent series (for example \(\sum\frac{1}{n^2}\)) and divergent series (for example, \(\sum \frac{1}{n}\)) with this property. Recalling that \(n^{1/n}\to 1\), we see

\[\lim \sqrt[n]{\frac{1}{n}}=\lim \frac{1}{\sqrt[n]{n}}=\frac{1}{\lim n^{1/n}}=1\] \[\lim \sqrt[n]{\frac{1}{n^2}}=\lim \frac{1}{\sqrt[n]{n^2}}=\frac{1}{\lim n^{2/n}}=\frac{1}{\left(\lim n^{1/n}\right)^2}=1\]

The root test is very powerful when it applies, but one of its hypotheses is that the limit \(\lim \sqrt[n]{|a_n|}\) must exist. This is a rather big ask. Remembering the limsup, one might wonder if we could instead prove an analog of the root test which looks at \(\limsup \sqrt[n]{|a_n|}\), as we know this quantity always exists so long as the sequence is bounded. Indeed we can, and after refreshing our memory of the definition of limsup, we see the proof barely changes either!

Exercise 13.4 (The Root Test with Limsup) Prove that if \(\sum a_n\) is a series and \(L=\limsup\sqrt[n]{|a_n|}\), the series converges if \(L<1\) and diverges for \(L>1\). (Here we write \(L=\infty\) as a shorthand to mean the sequence \(\sqrt[n]{|a_n|}\) is unbounded.)

This more general version will be crucial to our understanding of general power series in the next chapter. It is still not exhaustive, because it gives no information when the limsup is exactly \(1\); but this is the only case of ambiguity.

Corollary 13.2 If \(\sum a_n\) is a convergent series, then \(\limsup \sqrt[n]{|a_n|}\leq 1\).

Proof. We proceed by process of elimination. If the \(\limsup\) were not \(\leq 1\), then either (1) it is greater than \(1\), or (2) it does not exist. In the first case, the root test directly proves the series diverges. In the second case: since a \(\limsup\) exists for every bounded sequence, the sequence of \(n^{th}\) roots must be unbounded. But then the original sequence of terms is also unbounded, so the series diverges (by the Divergence Test, as the terms aren’t going to zero).

Below is an example where the need for \(\limsup\), over the more familiar \(\lim\), arises in practice.

Example 13.3 (Using the Limsup Root Test) Consider the series \(\sum_{n=1}^{\infty} \left( \frac{1 + (-1)^n}{3} \right)^n\). Computing the \(n^{th}\) roots of the absolute values required for the root test,

\[\sqrt[n]{\left|\left( \frac{1 + (-1)^n}{3} \right)^n\right|}= \left|\frac{1 + (-1)^n}{3}\right|=\begin{cases} \frac{2}{3}, & \text{if } n \text{ is even}, \\ 0, & \text{if } n \text{ is odd}. \end{cases}\]

The limit of this sequence does not exist, because it oscillates between two values. Rigorously, the even subsequence converges to \(2/3\) and the odd subsequence converges to \(0\) (both are constant sequences), but for a convergent sequence all subsequences converge to the same limit. Thus, the original form of the root test does not apply, as the limit does not even exist.

However, the limsup does exist. Let \(r_n\) be the sequence of roots, so \(r_n=2/3\) if \(n\) is even and \(r_n=0\) if \(n\) is odd. Then for all \(N\) \[\sup_{n\geq N}r_n=\sup\{\tfrac{2}{3},0,\tfrac{2}{3},0,\tfrac{2}{3},\ldots\}=\frac{2}{3}\] And thus \[\limsup r_n = \lim_{N}\sup_{n\geq N} r_n=\frac{2}{3}\] as it is the limit of a constant sequence.

Because \(2/3<1\), the limsup version of the root test applies without issue, proving that our series converges absolutely, and hence converges (even though our original test was not strong enough to say anything at all).
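The computation of the tail suprema here can be seen numerically as well; the Python sketch below (illustration only) checks that every tail supremum of the roots equals \(2/3\), even though the roots themselves oscillate.

```python
# Numerical sketch of Example 13.3: the n-th roots oscillate between
# 0 and 2/3, so the limit fails to exist -- but every tail supremum
# equals 2/3, which is the limsup.
roots = [abs((1 + (-1) ** n) / 3) for n in range(1, 2001)]
tail_sups = {max(roots[N:]) for N in range(1990)}
print(tail_sups)  # → {0.6666666666666666}

partial = sum(((1 + (-1) ** n) / 3) ** n for n in range(1, 200))
print(partial)    # ≈ 0.8: only the even terms (2/3)**n contribute
```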

However, while good for building theory, the root test is rather annoying to use in practice: nobody wants to compute limits of \(n^{th}\) roots of arbitrary expressions! So it’s beneficial to look for another, perhaps simpler method of comparing with a geometric series, as we do below.

13.3.2 The Ratio Test

Consecutive terms of a geometric series \(\sum r^n\) have the common ratio \(r\). Thus a natural means of comparing with a geometric series is to investigate the ratios \(a_{n+1}/a_n\) of a series’ consecutive terms:

Proposition 13.2 (The Ratio Test) Let \(\sum a_n\) be a series and assume that the sequence of ratios \(\left|\frac{a_{n+1}}{a_n}\right|\) converges, with limit \(L\). Then \(\sum a_n\) converges if \(L<1\) and \(\sum a_n\) diverges when \(L>1\).

Proof. We first consider the case \(L<1\). For any \(r\) with \(L<r<1\), since \(\lim |a_{n+1}/a_n|=L<r\), our sequence of ratios is eventually less than \(r\) (setting \(\epsilon = r-L\), there is an \(N\) beyond which \(\left||a_{n+1}/a_{n}|-L\right|<\epsilon\), so \(|a_{n+1}/a_{n}|<L+\epsilon = r\)). But this means that our sequence is eventually shrinking by a factor of \(r\) with each consecutive term:

\[\left|\frac{a_{n+1}}{a_n}\right|<r\,\implies\, |a_{n+1}|<r|a_{n}|\hspace{1cm}\textrm{for }n\geq N\]

Thus, beyond \(N\) our series is bounded by a geometric series:

\[|a_{N+1}|<r |a_N|\hspace{0.5cm}|a_{N+2}|<r|a_{N+1}|<r^2|a_N|\hspace{0.5cm}|a_{N+3}|<r|a_{N+2}|<r^3|a_N|\] Continuing inductively yields that for all \(k\), \(|a_{N+k}|<|a_N|r^k\). Thus, by comparison with a convergent geometric series, \(\sum_{n\geq N}|a_n|\) converges: \[\sum_{n\geq N}|a_n|=\sum_{k\geq 0}|a_{N+k}|<\sum_{k\geq 0}|a_N|r^k=|a_N|\sum_{k\geq 0}r^k\] Since the first finitely many terms do not affect convergence, \(\sum|a_n|\) converges as well; that is, \(\sum a_n\) converges absolutely, and hence converges.

Exercise 13.5 Prove the divergence of \(\sum a_n\) in the case \(\lim|a_{n+1}/a_n|>1\).

Example 13.4 Consider the series \[\sum_{n=1}^{\infty} \frac{3^n}{n!}.\]

The \(n^{th}\) term is defined by \(a_n = \frac{3^n}{n!}\), so applying the ratio test requires us to compute \(\left|\frac{a_{n+1}}{a_n}\right|\). Since all terms involved are positive, we can drop the absolute values and compute \[\left| \frac{a_{n+1}}{a_n} \right| = \frac{3^{n+1} / (n+1)!}{3^n / n!} = \frac{3^{n+1} \cdot n!}{3^n \cdot (n+1)!}.\] Simplifying, \[ \left| \frac{a_{n+1}}{a_n} \right| = \frac{3^{n+1}}{3^n} \cdot \frac{n!}{(n+1)!} = \frac{3 \cdot 3^n}{3^n} \cdot \frac{1}{n+1} = \frac{3}{n+1}. \]

Now, taking the limit using our limit laws and basic limits \[ \lim_{n \to \infty} \frac{3}{n+1} = 0. \]

Since \(0 < 1\), the ratio test guarantees that the series \(\sum_{n=1}^{\infty} \frac{3^n}{n!}\) converges absolutely, and thus converges.
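A numerical sketch (illustration only) agrees: the ratios \(3/(n+1)\) shrink to zero, and the partial sums stabilize very quickly; in fact they approach \(e^3-1\), since the \(n=0\) term is omitted here.

```python
import math

# Numerical sketch of Example 13.4: ratios 3/(n+1) tend to 0, and the
# partial sums of 3**n/n! stabilize (at exp(3) - 1, lacking the n = 0 term).
ratios = [3.0 / (n + 1) for n in range(1, 6)]
print(ratios)  # → [1.5, 1.0, 0.75, 0.6, 0.5]

partial = sum(3.0 ** n / math.factorial(n) for n in range(1, 60))
print(partial, math.exp(3) - 1)  # the two values agree closely
```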

Remark 13.3. When \(\lim \left|\frac{a_{n+1}}{a_n}\right|=1\) we gain no information, as there are both convergent series (for example \(\sum\frac{1}{n^2}\)) and divergent series (for example, \(\sum \frac{1}{n}\)) with this property. \[\lim \frac{\frac{1}{n+1}}{\frac{1}{n}}=\lim\frac{n}{n+1}=\lim\frac{1}{1+\frac{1}{n}}=\frac{1}{1+\lim\frac{1}{n}}=1\] \[\lim \frac{\frac{1}{(n+1)^2}}{\frac{1}{n^2}}=\lim\left(\frac{n}{n+1}\right)^2=\left(\lim \frac{n}{n+1}\right)^2=1^2=1\]

As with the root test, one might seek a version of the ratio test which doesn’t require the limit \(\lim |a_{n+1}/a_n|\) to exist. Again, we succeed by replacing \(\lim\) with \(\limsup\) and barely modifying the proof.

Exercise 13.6 (The Ratio Test with Limsup) Prove that if \(\sum a_n\) is a series and \(L=\limsup\left|\frac{a_{n+1}}{a_n}\right|\), the series converges if \(L<1\) and diverges for \(L>1\).

Example 13.5 (Using the Limsup Ratio Test) Consider the series \(\sum_{n\geq 0} a_n\) where \(a_n=2^{-n}\) when \(n\) is even and \(a_n=\frac{3}{2}\cdot 2^{-n}\) when \(n\) is odd. The consecutive ratios are \[\left|\frac{a_{n+1}}{a_n}\right|=\begin{cases}\frac{3}{4}, & \text{if } n \text{ is even},\\ \frac{1}{3}, & \text{if } n \text{ is odd},\end{cases}\] so the limit of the ratios does not exist. However, for every \(N\) the tail supremum \(\sup_{n\geq N}\left|\frac{a_{n+1}}{a_n}\right|\) equals \(\frac{3}{4}\), so \[\limsup\left|\frac{a_{n+1}}{a_n}\right|=\frac{3}{4}<1\] and the limsup version of the ratio test proves the series converges absolutely, and hence converges.
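Here is a quick numerical sketch in Python of this kind of situation (the particular series, with terms \(2^{-n}\) at even \(n\) and \(\frac{3}{2}\cdot 2^{-n}\) at odd \(n\), is our own illustration): the consecutive ratios take only the two values \(3/4\) and \(1/3\), so they have no limit, yet the limsup is \(3/4<1\) and the series converges.

```python
# Numerical sketch of a series whose consecutive ratios oscillate
# between 3/4 and 1/3: no limit of ratios, but limsup = 3/4 < 1.
def a(n):
    return 2.0 ** (-n) if n % 2 == 0 else 1.5 * 2.0 ** (-n)

ratios = sorted({round(a(n + 1) / a(n), 10) for n in range(50)})
print(ratios)  # → [0.3333333333, 0.75]

partial = sum(a(n) for n in range(200))
print(partial)  # ≈ 7/3: the series indeed converges
```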

The ratio test, while easy to apply, has some obvious failure modes even in this more general version. What if some of the terms being added are zero, so that consecutive ratios are undefined? (For example, if every other term of the series is zero, then the consecutive ratios alternate between being zero and undefined, completely independently of the values of the nonzero terms.) One might be tempted to fix this problem by re-indexing: removing all zero terms before applying the ratio test. While this would remove some problems, the fact remains that comparing consecutive ratios just isn’t that fine-grained a tool to work with, and we cannot take it as a one-size-fits-all solution.