$$ \newcommand{\RR}{\mathbb{R}} \newcommand{\QQ}{\mathbb{Q}} \newcommand{\CC}{\mathbb{C}} \newcommand{\NN}{\mathbb{N}} \newcommand{\ZZ}{\mathbb{Z}} \newcommand{\FF}{\mathbb{F}} \renewcommand{\epsilon}{\varepsilon} % ALTERNATE VERSIONS % \newcommand{\uppersum}[1]{{\textstyle\sum^+_{#1}}} % \newcommand{\lowersum}[1]{{\textstyle\sum^-_{#1}}} % \newcommand{\upperint}[1]{{\textstyle\smallint^+_{#1}}} % \newcommand{\lowerint}[1]{{\textstyle\smallint^-_{#1}}} % \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} \newcommand{\uppersum}[1]{U_{#1}} \newcommand{\lowersum}[1]{L_{#1}} \newcommand{\upperint}[1]{U_{#1}} \newcommand{\lowerint}[1]{L_{#1}} \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} % extra auxiliary and additional topic/proof \newcommand{\extopic}{\bigstar} \newcommand{\auxtopic}{\blacklozenge} \newcommand{\additional}{\oplus} \newcommand{\partitions}[1]{\mathcal{P}_{#1}} \newcommand{\sampleset}[1]{\mathcal{S}_{#1}} \newcommand{\erf}{\operatorname{erf}} $$

13  Convergence Tests

Highlights of this Chapter: Finding the value of a series explicitly is difficult, so we develop some theory to determine convergence without explicitly finding the limit. Our main tool is comparison, which is built using the Monotone Convergence Theorem; in particular, comparison with a geometric series yields the Ratio Test. Along the way to developing this theory we study a few important special series:

  • We prove the harmonic series \(\sum\frac{1}{n}\) diverges.
  • In contrast, we prove that the sum of reciprocal squares \(\sum\frac{1}{n^2}\) converges. (Later we will show its value is \(\pi^2/6\)).

In this section, we build up some technology to prove the convergence (and divergence) of series without explicitly being able to compute the limit of partial sums. Such results will prove incredibly useful, as in the future we will encounter many theorems of the form if \(\sum a_n\) converges, then… and we will need a method of proving convergence to continue.

For sequences, after some work we were able to find a definition equivalent to the original notion of convergence, which did not mention the precise value of the limit. This is exactly the sort of thing we seek for our investigation into series, so we carry it over directly here:

Definition 13.1 (Cauchy Criterion) A series \(s_n=\sum a_n\) satisfies the Cauchy criterion if for every \(\epsilon>0\) there is an \(N\) such that for any \(n\geq m>N\) we have \[\left|\sum_{k=m}^n a_k\right|<\epsilon\]

Exercise 13.1 Prove a series satisfies the Cauchy criterion if and only if its sequence of partial sums is a Cauchy sequence.

Because we know that being convergent and being Cauchy are equivalent, all series that satisfy the Cauchy criterion are convergent, and conversely, if a series does not satisfy it, then the series must diverge. We use this second observation to construct an easy-to-apply test for divergence:

Corollary 13.1 (Divergence Test) If a series \(\sum a_n\) converges, then \(\lim a_n=0\). Equivalently, if \(a_n\not\to 0\) then \(\sum a_n\) diverges.

Proof. Let’s apply the Cauchy condition with \(n=m\), so the sum contains the single term \(a_m\). This says for all \(\epsilon>0\) there is some \(N\) where for \(m>N\) we have \[\left|\sum_{k=m}^{m}a_k\right|=|a_m|<\epsilon\]

But making \(|a_m|<\epsilon\) for all \(m>N\) is exactly the definition of \(a_m\to 0\).

This is useful mostly to immediately rule out the possibility that certain series converge. For instance, it tells us that \(\sum (1+\frac{1}{n})\) must diverge, as the terms approach \(1\), not zero. But when the terms do approach zero it’s not very helpful: there are many series with \(a_n\to 0\) which converge, and many which diverge. To distinguish between these, we need to build up some more powerful tools.
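As a quick numerical sketch (an illustration, not a proof), we can watch both the terms and the partial sums of \(\sum(1+\frac{1}{n})\): the terms settle near \(1\), and the partial sums grow without bound.

```python
# Numerical sketch of the divergence test for sum (1 + 1/n):
# the terms tend to 1 (not 0), so the series cannot converge.
N = 10_000
terms = [1 + 1/n for n in range(1, N + 1)]

print(terms[-1])   # 1.0001 -- nowhere near 0
print(sum(terms))  # the partial sum already exceeds N
```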

13.1 Comparison

One of the most useful convergence tests for a series is comparison. This lets us show that a series we care about (that may be hard to compute with) converges or diverges by comparing it to a simpler series, much like the squeeze theorem did for us with sequences. This theorem gives less information than the squeeze theorem (it doesn’t give us the exact value of the series we are interested in), but it is also easier to use (it only requires a one-sided bound, not an upper and lower bound with the same limit).

Theorem 13.1 (Comparison For Series) Let \(\sum a_n\) and \(\sum b_n\) be two series of nonnegative terms, with \(0\leq a_n\leq b_n\).

  • If \(\sum b_n\) converges, then \(\sum a_n\) converges.
  • If \(\sum a_n\) diverges, then \(\sum b_n\) diverges.

The proof is just a rehashing of our old friend, Monotone Convergence.

Proof. We prove the first of the two claims, and leave the second as an exercise. If \(x_n\geq 0\), then the sequence of partial sums \(s_n=\sum_{k=0}^n x_k\) is monotone increasing (since by definition \(s_{n}=s_{n-1}+x_n\) and \(x_n\geq 0\), we see \(s_{n}\geq s_{n-1}\) for all \(n\)).

Thus, the partial sums of \(\sum a_n\) and \(\sum b_n\) are monotone sequences. If \(\sum b_n\) converges, we know by the Monotone Convergence Theorem that its limit \(\beta\) is the supremum of the partial sums, so for all \(n\) \[\sum_{k=0}^n b_k \leq \beta\] But since \(a_k\leq b_k\) for all \(k\), the same is true of the partial sums \[\sum_{k=0}^n a_k\leq \sum_{k=0}^n b_k\] Stringing these inequalities together, we see that the partial sums of \(\sum a_k\) are bounded above by \(\beta\). Since they are monotone (as sums of nonnegative terms) as well, Monotone Convergence assures us that the series converges, as claimed.
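Here is a numerical sketch of comparison in action, using the hypothetical pair \(a_n = \frac{1}{2^n+n} \leq b_n = \frac{1}{2^n}\) (the larger series is geometric, with sum \(2\), so it supplies the bound \(\beta\)):

```python
# Comparison sketch: 0 <= 1/(2^n + n) <= 1/2^n, and sum 1/2^n is a
# geometric series converging to 2. The smaller series' partial sums
# are therefore trapped below beta = 2, forcing convergence.
N = 50
a_partial = sum(1 / (2**n + n) for n in range(N))
b_partial = sum(1 / 2**n for n in range(N))

print(a_partial, b_partial)  # both stay below the bound beta = 2
```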

Exercise 13.2 Let \(\sum a_n\) and \(\sum b_n\) be two series of nonnegative terms, with \(0\leq a_n\leq b_n\). Prove that if \(\sum a_n\) diverges, then \(\sum b_n\) diverges.

13.1.1 Summing Reciprocals

The comparison test is incredibly useful: some of the most famous series it lets us understand are left as exercises below.

Exercise 13.3 Show the harmonic series \(\sum \frac{1}{n}\) diverges, by comparing it with the partial sums of \[1, 1/2, 1/4, 1/4, 1/8, 1/8, 1/8, 1/8, 1/16,...\]

Exercise 13.4 Prove that \(\sum \frac{1}{n^2}\) converges. Hint: compare with \(\frac{1}{(n-1)n}\), which telescopes.

Exercise 13.5 Prove that for \(s\geq 2\) that \(\sum \frac{1}{n^s}\) converges.
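The contrasting behaviors in these exercises are easy to see numerically; the sketch below (which suggests, but of course does not prove, the results) compares the partial sums of the two series from the chapter highlights:

```python
import math

def partial_sum(f, N):
    """Compute the partial sum sum_{n=1}^{N} f(n)."""
    return sum(f(n) for n in range(1, N + 1))

# Harmonic series: the partial sums keep creeping upward (like log N).
for N in (10**2, 10**4, 10**6):
    print(N, partial_sum(lambda n: 1/n, N))

# Reciprocal squares: the partial sums stabilize near pi^2/6.
print(partial_sum(lambda n: 1/n**2, 10**6), math.pi**2 / 6)
```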

13.1.2 Absolute & Conditional Convergence

Below we will develop several theorems that apply exclusively to series of positive terms. That may seem at first to be a significant obstacle, as many series involve both addition and subtraction! So, we take some time here to assuage such worries, and provide a means of probing a general series using information about its nonnegative counterpart.

Definition 13.2 (Absolute Convergence) A series \(\sum a_n\) converges absolutely if the associated series of absolute values \(\sum |a_n|\) is convergent.

Of course, such a definition is only useful if facts about the nonnegative series imply facts about the original. Happily, that is the case.

Theorem 13.2 (Absolute Convergence Implies Convergence) Every absolutely convergent series is a convergent series.

Proof. Let \(\sum a_n\) be absolutely convergent. Then \(\sum |a_n|\) converges, and its partial sums satisfy the Cauchy criterion. This means for any \(\epsilon>0\) we can find an \(N\) where for all \(n\geq m>N\) \[|a_m|+|a_{m+1}|+\cdots+|a_n|<\epsilon\]

But, by the triangle inequality we know that \[|a_m+a_{m+1}+\cdots+a_n|\leq |a_m|+|a_{m+1}|+\cdots+|a_n|\] Thus, our original series \(\sum a_k\) satisfies the Cauchy criterion, as \[\left|\sum_{k=m}^na_k\right|<\epsilon\] And, since Cauchy is equivalent to convergence, this implies \(\sum a_k\) is a convergent series.

Definition 13.3 A series converges conditionally if it converges, but is not absolutely convergent.

Such series caused much trouble in the foundations of analysis, as they can exhibit rather strange behavior. We met one such series in the introduction, the alternating sum of \(1/n\) which seemed to converge to different values depending on the order we added its terms. Here we begin an investigation into such phenomena.

13.2 Ratio & Root Tests

Give the tests in terms of limits, then point out that these limits may not always exist, so the general statements need \(\limsup\). Give the (optional) if-and-only-if version using \(\limsup\).

Emphasize in the proofs of the ratio and root tests that we are doing comparison with a geometric series.

The ratio test will be the most important to us, but we do need the root test to prove the Cauchy–Hadamard theorem (and the root test has the virtue that it does not rely on consecutive terms, but looks at each term on its own).
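Though the statements and proofs are deferred, the phenomenon behind the ratio test is easy to illustrate numerically. Here is a sketch using the (illustrative) series \(\sum 2^n/n!\), whose consecutive ratios tend to \(0<1\):

```python
import math

# For a_n = 2^n / n!, consecutive ratios a_{n+1}/a_n = 2/(n+1) shrink
# toward 0, so the terms are eventually dominated by a geometric series
# and the partial sums settle down (in fact, to e^2).
a = [2**n / math.factorial(n) for n in range(30)]
ratios = [a[n + 1] / a[n] for n in range(len(a) - 1)]

print(ratios[:5])         # 2.0, 1.0, 0.666..., 0.5, 0.4 -- heading to 0
print(sum(a), math.e**2)  # partial sum already very close to e^2
```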

13.3 Alternating Series

Definition 13.4 (Alternating Series) An alternating series is a series of the form \(\sum (-1)^n a_n\) for \((a_n)\) a sequence of nonnegative terms. That is, the terms of the series alternate in sign.

Theorem 13.3 (Alternating Series Test) An alternating series \(\sum (-1)^na_n\) converges if \(a_n\) decreases monotonically with limit zero.

Before jumping in, it’s helpful to take a look at a few partial sums to start. For example, \(s_4\):

\[s_4=a_0-a_1+a_2-a_3+a_4=(a_0-a_1)+(a_2-a_3)+a_4\]

Grouping the terms of this finite sum like so shows that \(s_4\) is a sum of nonnegative numbers (since \(a_n\) is decreasing, \(a_n-a_{n+1}\geq 0\)): thus \(s_4\geq 0\).

\[s_4=a_0-a_1+a_2-a_3+a_4=a_0-(a_1-a_2)-(a_3-a_4)\] This grouping shows \(s_4\) is equal to \(a_0\) minus a collection of nonnegative terms: thus \(s_4\leq a_0\). This extends directly:

Exercise 13.6 Let \(s_n=\sum_{k=0}^n (-1)^ka_k\) be an alternating series with \(a_n\to 0\) monotonically. Prove by induction that

  • All the partial sums \(s_n\) are nonnegative.
  • All partial sums are bounded above by the first term \(a_0\).

Corollary 13.2 Starting the sum at \(N\) instead of \(0\), the same argument shows that \(\left|\sum_{k=N}^n (-1)^ka_k\right|\leq |a_N|\) for all \(n\geq N\).

What other patterns can we notice? Increasing from \(s_4\) to \(s_6\) we see \[s_6=a_0-a_1+a_2-a_3+a_4-a_5+a_6\] \[=s_4-a_5+a_6=s_4-(a_5-a_6)\] Thus \(s_6\leq s_4\). A similar look at \(s_3\) and \(s_5\) shows \[s_5=a_0-a_1+a_2-a_3+a_4-a_5=s_3+(a_4-a_5)\] So \(s_5\geq s_3\)! This is a sort of pattern we’ve seen before, where it’s helpful to look at the even versus odd subsequences individually:

Exercise 13.7 Let \(s_n=\sum_{k=0}^n (-1)^ka_k\) be an alternating series, and prove by induction that

  • The even subsequence is monotone decreasing
  • The odd subsequence is monotone increasing

Because each of these subsequences is monotone and bounded (by the previous exercise) they converge via monotone convergence. Now, all we need to see is they converge to the same limit to assure convergence of the entire series, by Theorem 9.1.

Proposition 13.1 Let \(s_n=\sum_{k=0}^n (-1)^ka_k\) be an alternating series with \(a_n\to 0\) monotonically. Then \(s_n\) converges.

Proof. Let \(e_n=s_{2n}\) and \(o_n=s_{2n+1}\) be the even and odd subsequences respectively, and note that \(o_n=e_n-a_{2n+1}\). Then, since we know the subsequence \(a_{2n+1}\) converges to zero (as \(a_n\to 0\), so all subsequences have the same limit) we can apply the limit theorems and see \[\lim o_n = \lim (e_n-a_{2n+1})=\lim e_n-\lim a_{2n+1}=\lim e_n\] So, the odd and even subsequences do have the same limit, as required.
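The structure of this proof is easy to see numerically. Here is a sketch for the alternating series with \(a_n = \frac{1}{n+1}\), whose sum is \(\ln 2\): the even partial sums decrease, the odd ones increase, and the two squeeze together.

```python
import math

# Sketch of the proof's structure for a_n = 1/(n+1): compute the first
# 200 partial sums of sum (-1)^n a_n and split them by parity.
partials, s = [], 0.0
for n in range(200):
    s += (-1)**n / (n + 1)
    partials.append(s)

evens = partials[0::2]  # s_0 >= s_2 >= s_4 >= ... : monotone decreasing
odds = partials[1::2]   # s_1 <= s_3 <= s_5 <= ... : monotone increasing

assert all(x >= y for x, y in zip(evens, evens[1:]))
assert all(x <= y for x, y in zip(odds, odds[1:]))
print(odds[-1], evens[-1], math.log(2))  # odds < ln 2 < evens, both close
```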

We now look at the main example of a conditionally convergent series.

Example 13.1 \(\sum \frac{(-1)^n}{n}\) is conditionally convergent:

  • It converges, by the alternating series test.
  • But it is not absolutely convergent, as \(\sum \frac{1}{n}\) diverges by Exercise 13.3.

This series is famous from the introduction to our course, where we saw that its value when summed is the natural logarithm of 2, but that this value changes when we reorder the terms! This is a general behavior of conditionally convergent series; one hint of why is that the sums of their positive and negative terms separately diverge to \(\pm\infty\).
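We can watch this strange behavior numerically (a sketch, not a proof): summing in the usual order drifts toward \(\ln 2\), while taking two positive terms for every negative one drifts toward a different value.

```python
import math

N = 10**6
# Usual order: 1 - 1/2 + 1/3 - 1/4 + ... tends to ln 2.
usual = sum((-1)**(n + 1) / n for n in range(1, N + 1))

# Rearranged: two positive (odd) terms, then one negative (even) term:
# 1 + 1/3 - 1/2 + 1/5 + 1/7 - 1/4 + ... drifts to (3/2) ln 2 instead.
rearranged, odd, even = 0.0, 1, 2
for _ in range(N // 3):
    rearranged += 1/odd + 1/(odd + 2) - 1/even
    odd += 4
    even += 2

print(usual, math.log(2))             # close to ln 2
print(rearranged, 1.5 * math.log(2))  # close to (3/2) ln 2
```

Same terms, different order, visibly different limits: exactly the trouble this section warns about.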

Theorem 13.4 If \(\sum a_k\) is conditionally convergent, let \(p_k\) be the subsequence of all positive terms of \(a_k\) and \(n_k\) the subsequence of all negative terms. Then \[\sum p_k\to\infty\hspace{1cm}\sum n_k\to-\infty\]

For an absolutely convergent series this cannot happen: the sum of all the positive terms converges, as does the sum of all the negative terms.

Exercise 13.8 Prove that if \(\sum a_n\) is absolutely convergent, then its subseries of positive terms and its subseries of negative terms both converge.
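For contrast with the conditionally convergent case, here is a numerical sketch using the illustrative absolutely convergent series \(\sum_{n\geq 1} (-1)^n/n^2\): its positive and negative subseries each settle down to a finite value.

```python
# For the absolutely convergent series sum_{n>=1} (-1)^n / n^2, the
# positive terms (even n) and negative terms (odd n) each form a
# convergent series on their own.
pos = sum(1 / n**2 for n in range(2, 10**6, 2))   # even n: positive terms
neg = sum(-1 / n**2 for n in range(1, 10**6, 2))  # odd n: negative terms

print(pos)        # settles near pi^2/24
print(neg)        # settles near -pi^2/8
print(pos + neg)  # the whole series: near -pi^2/12
```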

13.4 Additional

Cauchy Condensation Test? (Find good applications, first) Pg 209