$$ \newcommand{\RR}{\mathbb{R}} \newcommand{\QQ}{\mathbb{Q}} \newcommand{\CC}{\mathbb{C}} \newcommand{\NN}{\mathbb{N}} \newcommand{\ZZ}{\mathbb{Z}} \newcommand{\FF}{\mathbb{F}} \renewcommand{\epsilon}{\varepsilon} % ALTERNATE VERSIONS % \newcommand{\uppersum}[1]{{\textstyle\sum^+_{#1}}} % \newcommand{\lowersum}[1]{{\textstyle\sum^-_{#1}}} % \newcommand{\upperint}[1]{{\textstyle\smallint^+_{#1}}} % \newcommand{\lowerint}[1]{{\textstyle\smallint^-_{#1}}} % \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} \newcommand{\uppersum}[1]{U_{#1}} \newcommand{\lowersum}[1]{L_{#1}} \newcommand{\upperint}[1]{U_{#1}} \newcommand{\lowerint}[1]{L_{#1}} \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} % extra auxiliary and additional topic/proof \newcommand{\extopic}{\bigstar} \newcommand{\auxtopic}{\blacklozenge} \newcommand{\additional}{\oplus} \newcommand{\partitions}[1]{\mathcal{P}_{#1}} \newcommand{\sampleset}[1]{\mathcal{S}_{#1}} \newcommand{\erf}{\operatorname{erf}} $$

14  Limits of Sums

Highlights of this Chapter: we consider the delicate problem of switching the order of a limit and an infinite sum. We prove a theorem - the Dominated Convergence Theorem for Sums - that provides a condition under which this interchange is allowed, and explore a couple of consequences for double summations. This Dominated Convergence Theorem is the first of several analogous theorems that will play an important role in what follows.

The fact that an infinite series is defined as a limit - precisely the limit of partial sums - has been of great utility so far, as all of our techniques for dealing with series fundamentally rest on limit theorems for sequences!

\[\sum_{k\geq 0} a_k :=\lim_{N\to\infty}\sum_{k=0}^N a_k\]

But once we start to deal with multiple series at a time, this can present newfound difficulties. Indeed, it’s rather common in practice to end up with an infinite sequence of infinite series.

\[ s_n=\sum_{k\geq 0}a_{n,k}\hspace{1cm} s=\lim_n s_n=\lim_n\lim_N \sum_{k=0}^N a_{n,k}\]

There’s an intuitive urge to just switch the order of the limits - equivalently, to “pull the limit inside the sum”. But such an operation is not always justified. It’s easy to come up with examples of limits that cannot be switched:

Example 14.1 \[\begin{align*} 1 &= \frac{1}{2}+\frac{1}{2}\\ &=\frac{1}{4}+\frac{1}{4}+\frac{1}{4}+\frac{1}{4}\\ &=\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8} \end{align*}\]

Taking the termwise limits and adding them up gives \[1=0+0+0+\cdots +0 = 0\]

This is nonsense! And the nonsense arises from implicitly exchanging two limits. To make this precise, one may define for each \(n\) the sequence of terms \[a_n(k)=\begin{cases} 1/2^n & 0\leq k<2^n\\ 0 & \mathrm{else} \end{cases} \]

Then each of the rows above is the sum \(1=\sum_{k\geq 0}a_n(k)\) for \(n=1,2,3\). Since this sequence of sums is constant, its limit is certainly \(1\); but the sum of the termwise limits is zero, so the limit of the sums is not the sum of the limits:

\[1=\lim_n \sum_{k\geq 0}a_n(k) \neq \sum_{k\geq 0}\lim_n a_n(k)=0\]

So it’s hopefully clear that to be able to use series in realistic contexts, we are in desperate need of a theorem telling us when we can interchange limits and summations.

TO DO:

Give examples of what goes wrong! Pictures (with sums as bar-graphs) of what is going on in these examples: we need to prevent area from “leaking out to infinity”
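One further example along the lines that note suggests (the notation \(b_k(n)\) is introduced just for this illustration): for each \(n\) define

\[b_k(n)=\begin{cases} 1 & k=n\\ 0 & \mathrm{else} \end{cases}\]

Then \(\sum_{k\geq 0}b_k(n)=1\) for every \(n\), while for each fixed \(k\) we have \(\lim_n b_k(n)=0\), so

\[1=\lim_n \sum_{k\geq 0}b_k(n) \neq \sum_{k\geq 0}\lim_n b_k(n)=0\]

In a bar-graph picture, the single bar of area \(1\) slides further and further to the right as \(n\) grows: the area has “leaked out to infinity,” and none of it survives at any fixed position \(k\).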

14.1 Dominated Convergence

Because limit interchange is so fundamental to analysis, there are many theorems of this sort, of varying strengths and complexities. The one we will visit here is usually called Tannery’s theorem (named for Jules Tannery, an analyst working at the end of the 1800s). With the luxury of hindsight, we now realize Tannery’s theorem is a special case of a much more general result called Dominated Convergence, of which we will meet other special cases in the chapters to come. As such, I will call it by its more descriptive and general name throughout.

First, let’s set the stage precisely. For each \(n\), we have an infinite series \(s_n\), and we are interested in the limit \(\lim_n s_n\) (here we will always write subscripts on the limit, as multiple variables are involved!). For each fixed \(n\), the series \(s_n\) is an infinite sum, over some summation index \(k\):

\[s_n=\sum_{k\geq 0}a_k(n)\]

where for each \(k\) we write the term as \(a_k(n)\) to remember that it also depends on \(n\) (the notation \(a_{k,n}\) is also perfectly acceptable). We seek a theorem that gives us conditions under which we can take the term-wise limit, that is, when

\[\lim_n \sum_{k\geq 0}a_k(n)=\sum_{k\geq 0}\lim_n a_k(n)\]

Dominated convergence assures us that such a switch is justified so long as the entire process is controlled: all of the \(a_k(n)\)s are bounded, uniformly in \(n\), by the terms of a convergent series.

Theorem 14.1 (Dominated Convergence for Series) For each \(k\) let \(a_k(n)\) be a function of \(n\), and assume the following:

  • For each \(k\), the sequence \(a_k(n)\) is convergent (as \(n\to\infty\)).
  • For each \(n\), the series \(\sum_k a_k(n)\) is convergent.
  • For each \(k\), there is an \(M_k\) with \(|a_k(n)|\leq M_k\) for all \(n\).
  • \(\sum_k M_k\) is convergent.

Then \(\sum_k\lim_n a_k(n)\) is convergent, and \[\lim_n\sum_k a_k(n)=\sum_k\lim_n a_k(n)\]

Proof. Write \(a_k=\lim_n a_k(n)\), which exists by the first hypothesis. First, we show that \(\sum_k a_k\) converges. Since \(|a_k(n)|\leq M_k\) for all \(n\), this bound persists in the limit, so \(|a_k|=\lim_n |a_k(n)|\leq M_k\). Thus, by comparison we see \(\sum_k |a_k|\) converges, and hence so does \(\sum_k a_k\).

Now, the main event. Let \(\epsilon>0\). To show that \(\lim_n \sum_k a_k(n)=\sum_k a_k\) we will show that there is some \(N\) beyond which these two sums always differ by less than \(\epsilon\).

Since \(\sum_k M_k\) converges, by the Cauchy criterion there is some \(L\) where \[\sum_{k\geq L}M_k<\frac{\epsilon}{3}\]

For arbitrary \(n\), we compute

\[\begin{align*} \left|\sum_{k\geq 0} a_k(n)-\sum_{k\geq 0}a_k\right| &= \left|\sum_{k< L} (a_k(n)-a_k)+\sum_{k\geq L}a_k(n)-\sum_{k\geq L}a_k\right|\\ &\leq \left|\sum_{k< L} (a_k(n)-a_k)\right|+\left|\sum_{k\geq L}a_k(n)\right|+\left|\sum_{k\geq L}a_k\right|\\ &\leq \sum_{k< L}|a_k(n)-a_k| +\sum_{k\geq L}|a_k(n)|+\sum_{k\geq L}|a_k|\\ &\leq \sum_{k< L}|a_k(n)-a_k|+ 2\sum_{k\geq L}M_k\\ & < \sum_{k< L}|a_k(n)-a_k|+\frac{2\epsilon}{3} \end{align*}\]

That is, for an arbitrary \(n\) we can bound the difference essentially in terms of the first \(L\) terms: the contribution of all the rest is uniformly less than \(2\epsilon/3\). But for each of these \(L\) terms, we know that \(a_k(n)\to a_k\), so we can make each difference as small as we like. For each \(k<L\), choose \(N_k\) such that \(|a_k(n)-a_k|<\epsilon/3L\) whenever \(n>N_k\), and then take

\[N=\max\{N_0,N_1,\ldots N_{L-1}\}\]

Now, for any \(n>N\) we are guaranteed that \(|a_k(n)-a_k|<\epsilon/3L\) for each \(k<L\), and thus that

\[\sum_{k<L}|a_k(n)-a_k|< L\frac{\epsilon}{3L}=\frac{\epsilon}{3}\]

Combining with the above, we now have for all \(n>N\), \[\left|\sum_{k\geq 0} a_k(n)-\sum_{k\geq 0}a_k\right|<\epsilon\] as required.
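It is worth checking the hypotheses against Example 14.1, where the interchange failed (a quick remark, not needed in what follows): it is the domination condition that breaks down there. For fixed \(k\), the largest value of \(a_n(k)\) over all \(n\) occurs at the smallest \(n\) with \(2^n>k\), and since that \(n\) satisfies \(2^n<2(k+1)\), any dominating bound must have

\[M_k\geq \sup_n a_n(k) > \frac{1}{2(k+1)}\]

But \(\sum_k \frac{1}{2(k+1)}\) is a multiple of the harmonic series, so it diverges, and by comparison so does any candidate \(\sum_k M_k\). Thus Example 14.1 does not contradict the theorem: it simply fails its hypotheses.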

There is a natural version of this theorem for products as well (though we will not need it in this course, I will state it here anyway).

Theorem 14.2 (\(\bigstar\) Dominated Convergence for Products) For each \(k\) let \(a_k(n)\) be a function of \(n\), and assume the following:

  • For each \(k\), the sequence \(a_k(n)\) is convergent (as \(n\to\infty\)).
  • For each \(n\), the product \(\prod_{k\geq 0} (1+a_k(n))\) is convergent.
  • For each \(k\), there is an \(M_k\) with \(|a_k(n)|\leq M_k\) for all \(n\).
  • \(\sum_k M_k\) is convergent.

Then \(\prod_{k\geq 0}\lim_n (1+a_{k}(n))\) is convergent, and \[\lim_n\prod_{k\geq 0} (1+a_k(n))=\prod_{k\geq 0}(1+\lim_n a_k(n))\]
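As a small illustration of the product version (an example of my own, not needed later): take \(a_k(n)=\frac{1}{k^2+n}\) for \(k\geq 1\). Then \(|a_k(n)|\leq \frac{1}{k^2}=M_k\) with \(\sum_k M_k\) convergent, and \(\lim_n a_k(n)=0\) for each fixed \(k\) (and for each fixed \(n\) the product converges, since \(\sum_k \frac{1}{k^2+n}\) does). The theorem then gives

\[\lim_n \prod_{k\geq 1}\left(1+\frac{1}{k^2+n}\right)=\prod_{k\geq 1}(1+0)=1\]

a limit that would be tedious to evaluate directly.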

14.2 Applications

Exercise 14.1 Use Dominated Convergence to prove that

\[\frac{1}{2}=\lim_n \left[\frac{1+2^n}{2^n\cdot 3 +4}+\frac{1+2^n}{2^n\cdot 3^2 +4^2}+\frac{1+2^n}{2^n\cdot 3^3 +4^3}+\cdots\right]\]

  • Write in summation notation, and give a formula for the terms \(a_k(n)\)
  • Show that \(\lim_n a_k(n) =\frac{1}{3^k}\)
  • Show that for all \(n\), \(|a_k(n)|\leq \frac{2}{3^k}\)

Use these facts to show that the hypotheses of dominated convergence hold true, and then use the theorem to help you take the limit.
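A further classical application of Tannery’s theorem, offered here as an extra worked example in the same spirit: for any fixed \(x\), the binomial theorem gives

\[\left(1+\frac{x}{n}\right)^n=\sum_{k= 0}^n\binom{n}{k}\frac{x^k}{n^k}=\sum_{k\geq 0}a_k(n)\hspace{1cm} a_k(n)=\frac{n(n-1)\cdots(n-k+1)}{n^k}\,\frac{x^k}{k!}\]

where \(a_k(n)=0\) for \(k>n\). For each fixed \(k\), \(\lim_n a_k(n)=\frac{x^k}{k!}\), and since the fraction \(\frac{n(n-1)\cdots(n-k+1)}{n^k}\) lies between \(0\) and \(1\), we have \(|a_k(n)|\leq \frac{|x|^k}{k!}=M_k\) with \(\sum_k M_k\) convergent. Dominated convergence then lets us pass the limit inside the sum:

\[\lim_n\left(1+\frac{x}{n}\right)^n=\sum_{k\geq 0}\frac{x^k}{k!}\]

the power series commonly used to define the exponential function.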

TO DO: MORE EXAMPLES