$$ \newcommand{\RR}{\mathbb{R}} \newcommand{\QQ}{\mathbb{Q}} \newcommand{\CC}{\mathbb{C}} \newcommand{\NN}{\mathbb{N}} \newcommand{\ZZ}{\mathbb{Z}} \newcommand{\FF}{\mathbb{F}} \renewcommand{\epsilon}{\varepsilon} % ALTERNATE VERSIONS % \newcommand{\uppersum}[1]{{\textstyle\sum^+_{#1}}} % \newcommand{\lowersum}[1]{{\textstyle\sum^-_{#1}}} % \newcommand{\upperint}[1]{{\textstyle\smallint^+_{#1}}} % \newcommand{\lowerint}[1]{{\textstyle\smallint^-_{#1}}} % \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} \newcommand{\uppersum}[1]{U_{#1}} \newcommand{\lowersum}[1]{L_{#1}} \newcommand{\upperint}[1]{U_{#1}} \newcommand{\lowerint}[1]{L_{#1}} \newcommand{\rsum}[1]{{\textstyle\sum_{#1}}} % extra auxiliary and additional topic/proof \newcommand{\extopic}{\bigstar} \newcommand{\auxtopic}{\blacklozenge} \newcommand{\additional}{\oplus} \newcommand{\partitions}[1]{\mathcal{P}_{#1}} \newcommand{\sampleset}[1]{\mathcal{S}_{#1}} \newcommand{\erf}{\operatorname{erf}} $$

20  Power Series

Highlights of this Chapter: we introduce the definition of a power series, and develop tests for its convergence via roots and ratios.

Polynomials are finite sums of multiples of powers of \(x\). The natural infinite analog is a power series, arising as the limit of a sequence of polynomials of increasing degree.

Definition 20.1 (Power Series) A power series is a function defined as the limit of a sequence of polynomials \[f(x)=\sum_{n\geq 0}a_n x^n\] for a sequence \(a_n\) of real numbers. For each \(x\), this defines an infinite series; the domain of a power series is the subset \(D\subset\RR\) of \(x\) values where the series converges.

The simplest power series are polynomials themselves, which have \(a_n=0\) after some finite \(N\). Perhaps the second simplest power series is the one with \(a_n=1\) for all \(n\):

\[f(x)=1+x+x^2+x^3+x^4+\cdots + x^n+\cdots\]

This is none other than the geometric series in \(x\)! So, it converges whenever the common ratio \(x\) satisfies \(|x|<1\): its domain is the interval \((-1,1)\).
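To see this concretely, here is a short numerical sketch (the helper `geometric_partial_sum` is ours, written for illustration) comparing partial sums against the closed form \(1/(1-x)\) at a point inside the interval of convergence:

```python
# Compare partial sums of the geometric series 1 + x + x^2 + ...
# against its limit 1/(1-x) at a point inside (-1,1).

def geometric_partial_sum(x, N):
    """Sum of x^n for n = 0, ..., N-1."""
    total = 0.0
    for n in range(N):
        total += x ** n
    return total

x = 0.5
approx = geometric_partial_sum(x, 50)
exact = 1 / (1 - x)  # the limit of the series, valid only for |x| < 1
assert abs(approx - exact) < 1e-12
```

For \(|x|\geq 1\) the partial sums fail to settle down at all, matching the divergence of the series outside \((-1,1)\).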

Power series are an extremely versatile tool to reach beyond the arithmetic of polynomials, while staying close to the fundamental operations of addition/subtraction and multiplication/division. One of our main uses of them will be to provide efficient means of computing important functions (exponentials, logs, trigonometric functions, etc).

Definition 20.2 (Power Series Representation) Given a function \(f(x)\), a power series representation of \(f\) is a series \(s(x)=\sum a_n x^n\) such that \(s(x)=f(x)\) whenever \(s(x)\) converges.

20.1 Convergence

The most important thing to understand about a power series is its domain: where does it actually converge?

Definition 20.3 (Interval of Convergence) The domain of a power series, also called its interval of convergence, is the set of all \(x\) for which it converges.

Proposition 20.1 If a power series converges at \(u\) then it converges at all \(v\) with \(|v|<|u|\).

Proof. Suppose \(\sum a_n u^n\) converges. Then its terms tend to zero, so there is some \(M\) with \(|a_nu^n|\leq M\) for all \(n\). For any \(v\) with \(|v|<|u|\) we have \[|a_nv^n|=|a_nu^n|\left|\frac{v}{u}\right|^n\leq M\left|\frac{v}{u}\right|^n\] and since \(|v/u|<1\), the geometric series \(\sum M|v/u|^n\) converges. By comparison, \(\sum a_nv^n\) converges absolutely, hence converges.

Exercise 20.1 If a power series diverges at \(u\) then it also diverges at all \(v\) with \(|v|>|u|\).

Thus, the domain of a power series must be an interval of the form \((-r,r)\), \([-r,r]\), \([-r,r)\) or \((-r,r]\). It can often be difficult to determine exactly what happens at the endpoints of this interval, where the series may converge absolutely, conditionally, or not at all. Speaking of the radius (and thereby avoiding the issue of convergence at the endpoints) is thus often useful.

Definition 20.4 (Radius of Convergence) The radius of convergence of a power series is the supremum of all \(r\geq 0\) such that the series converges on \((-r,r)\) (possibly \(0\) or \(\infty\)).

Corollary 20.1 (Absolute Convergence of Power Series) Let \(f(x)=\sum a_n x^n\) be a power series with radius of convergence \(r\), and let \(u\in(-r,r)\). Then \(f\) converges absolutely at \(u\).

Proof. Choose \(y\) with \(|u|<y<r\); then the series converges at \(y\), so its terms \(a_ny^n\) are bounded by some \(M\). Thus \(|a_nu^n|\leq M|u/y|^n\) with \(|u/y|<1\), and comparison with this geometric series shows \(\sum|a_nu^n|\) converges.

Thus within the radius of convergence, absolute convergence means that the terms of a power series can be re-arranged without changing the limiting value: infinite addition is commutative here. (Note that this may not apply at the endpoints of the interval of convergence.)

20.1.1 Finding the Radius of Convergence

Comparison has already taught us a lot about the convergence of series, but it can do a lot more for us. Indeed, writing the absolute values of the terms of \(\sum a_n x^n\) as \(\left(|a_n|^{1/n}|x|\right)^n\) suggests a natural comparison with a geometric series when \(|a_n|^{1/n}\) converges:

Proposition 20.2 Let \(\sum a_n x^n\) be a power series, and \(\alpha = \lim \sqrt[n]{|a_n|}\). Then the radius of convergence is \(r=1/\alpha\) (where \(\alpha=\infty\) means \(r=0\) and \(\alpha=0\) means convergence on all of \(\RR\)).

Proof. Let \(|x|<1/\alpha\) and choose \(q\) with \(\alpha|x|<q<1\). Since \(\sqrt[n]{|a_n|}\to\alpha\), eventually \(\sqrt[n]{|a_n|}\,|x|<q\), so \(|a_nx^n|<q^n\); comparison with the geometric series \(\sum q^n\) gives (absolute) convergence. If instead \(|x|>1/\alpha\), then eventually \(\sqrt[n]{|a_n|}\,|x|>1\), so \(|a_nx^n|>1\): the terms do not tend to zero and the series diverges.

This test is clear and rather powerful - it applies to almost all series you will encounter in practice. But it's not completely general, as we assumed that \(\lim \sqrt[n]{|a_n|}\) exists as a hypothesis, and this may not be the case. Happily, the easy technical fix of replacing \(\lim\) with \(\limsup\) (which always exists) yields a completely general theorem.

Theorem 20.1 (Finding the Radius of Convergence: Cauchy-Hadamard) Let \(\sum a_n x^n\) be a power series, and \(\alpha = \limsup \sqrt[n]{|a_n|}\). Then the radius of convergence is \(r=1/\alpha\) (where \(\alpha=\infty\) means \(r=0\) and \(\alpha=0\) means convergence on all of \(\RR\)).

Exercise 20.2 Generalize the proof of Proposition 20.2 to prove the above theorem, using the limsup.
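A numerical sketch may help build intuition for the quantity \(\alpha\). (This is only suggestive: a finite computation proves nothing about a limit, and the helper `nth_root_of_coeff` is ours, written for illustration.)

```python
# Estimate alpha = lim |a_n|^(1/n) for two coefficient sequences,
# illustrating the Cauchy-Hadamard formula r = 1/alpha.
import math

def nth_root_of_coeff(a_n, n):
    """|a_n|^(1/n) for a single coefficient a_n."""
    return abs(a_n) ** (1.0 / n)

# a_n = 2^n: the nth roots equal 2, so alpha = 2 and r = 1/2.
assert abs(nth_root_of_coeff(2.0 ** 500, 500) - 2.0) < 1e-9

# a_n = 1/n!: the nth roots tend to 0, so the series converges on all of R.
# We compute (1/n!)^(1/n) via the log-gamma function to avoid huge numbers.
n = 200
assert math.exp(-math.lgamma(n + 1) / n) < 0.05
```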

This test is incredibly useful theoretically, as it gives precise conditions for when a power series converges or diverges. But for specific series it's rather difficult to apply in practice: who wants to compute a sequence of \(n^{th}\) roots? So we now seek a more practical test for convergence, one that is easy to apply in specific cases without worrying about total generality. And we find one in our other standardized comparison with geometric series, the ratio test!

Theorem 20.2 Let \(\sum a_n x^n\) be a power series, and assume the sequence of ratios \(\left|\tfrac{a_{n+1}}{a_n}\right|\) converges to some \(\alpha\in[0,\infty]\). Then the radius of convergence is \(r=1/\alpha\) (where \(\alpha=\infty\) means \(r=0\) and \(\alpha=0\) means convergence on all of \(\RR\)).

Proof. Let \(|x|<1/\alpha\) and choose \(q\) with \(\alpha|x|<q<1\). Since \(\left|\tfrac{a_{n+1}}{a_n}\right||x|\to\alpha|x|<q\), there is an \(N\) past which \(|a_{n+1}x^{n+1}|<q|a_nx^n|\), and so \(|a_nx^n|\leq Cq^n\) for the constant \(C=|a_Nx^N|/q^N\). Comparison with the geometric series \(\sum Cq^n\) gives absolute convergence. If instead \(|x|>1/\alpha\), the ratios of successive terms eventually exceed \(1\), so the terms increase in absolute value and cannot tend to zero: the series diverges.
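Numerically, ratios are much friendlier to compute than \(n^{th}\) roots. A quick sketch (the helper `coeff_ratio` is ours, and again a finite computation only suggests the limit):

```python
# Coefficient ratios |a_{n+1}/a_n| for two power series.
import math

def coeff_ratio(a, n):
    """|a(n+1)/a(n)| for a coefficient function a."""
    return abs(a(n + 1) / a(n))

# a_n = n: ratios (n+1)/n tend to 1, suggesting radius r = 1 for sum n x^n.
assert abs(coeff_ratio(lambda n: n, 10_000) - 1.0) < 1e-3

# a_n = 1/n!: ratios 1/(n+1) tend to 0, suggesting convergence on all of R.
assert coeff_ratio(lambda n: 1 / math.factorial(n), 100) < 0.01
```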

One weakness is that it relies on consecutive ratios, and these aren't always defined: for example, the series \(\sum x^{2n}\) has coefficients \(1,0,1,0,1,0,\ldots\). This, however, is easy to fix: we can apply the ratio test to the nonzero terms of a power series and still get a useful result:

Exercise 20.3 Let \(\sum_k a_k x^{n_k}\) be a power series written in terms of its nonzero coefficients \(a_k\) (where \(n_k\) is an increasing sequence of exponents, skipping any power whose coefficient would have been zero). Prove that the series converges at any \(x\) where the ratios of successive terms satisfy \(\lim_k \left|\tfrac{a_{k+1}x^{n_{k+1}}}{a_kx^{n_k}}\right|<1\), and diverges at any \(x\) where this limit exceeds \(1\).

20.2 Continuity

Theorem 20.3 (Continuity within Radius of Convergence) Let \(f(x)=\sum_k a_kx^k\) be a power series with radius of convergence \(r\). Then if \(|x|<r\), \(f\) is continuous at \(x\).

Proof. Without loss of generality take \(x>0\), and let \(x_n\) be an arbitrary sequence in \((-r,r)\) converging to \(x\). We aim to show that \(f(x_n)\to f(x)\).

As \(x<r\), choose some \(y\) with \(x<y<r\) (say, \(y=(x+r)/2\) if \(r\) is finite). Since \(x_n\to x\) there is some \(N\) past which \(0<x_n<y\) (take \(\epsilon=\min\{x,y-x\}\) and apply the definition of \(x_n\to x\)). As truncating finitely many terms of the sequence does not change its limit, we may without loss of generality assume that \(0<x_n<y\) for all \(n\). Thus, we may define \(M_k = |a_k| y^k\), and we are in a situation to verify the hypotheses of Dominated Convergence:

  • Since \(x_n\to x\), we have \(a_kx_n^k\to a_kx^k\) by the limit theorems.
  • For each \(n\), \(f(x_n)=\sum_k a_k x_n^k\) is convergent as \(x_n\) is within the radius of convergence.
  • \(M_k=|a_k|y^k\) bounds \(|a_kx_n^k|\) for all \(n\), as \(0<x_n<y\).
  • \(\sum_k M_k\) converges, as \(y\) lies within the radius of convergence, so the series converges absolutely at \(y\) (Corollary 20.1).

Applying the theorem, we see \[\lim_n f(x_n)=\lim_n\sum_k a_kx_n^k=\sum_k \lim_n a_kx_n^k=\sum_k a_kx^k=f(x)\]

Thus for arbitrary \(x_n\to x\) we have \(f(x_n)\to f(x)\), so \(f\) is continuous at \(x\).

20.2.1 \(\blacklozenge\) Behavior at Boundary

The following theorem, due to Abel, describes this boundary behavior; one standard proof uses summation by parts, and we will re-prove it in the chapter on Functional Analysis using uniform convergence.

Theorem 20.4 (Abel) If a power series converges at an endpoint of its interval of convergence, it is continuous there.

Proof.

This has many important corollaries for power series representations of continuous functions: if \(\sum a_n x^n\) represents \(f(x)\) on \((-r,r)\) and converges at \(r\), then \(\sum a_n r^n = f(r)\). We will use this in later chapters to derive beautiful formulas that converge to \(\pi\), among other things.
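As a concrete preview, consider the series \(\sum_{n\geq 1}x^n/n\) (which reappears as Example 20.1 below). Assuming the representation \(\sum_{n\geq 1}x^n/n=-\log(1-x)\) on \((-1,1)\), which we have not yet proven, the series converges at the endpoint \(x=-1\), and Abel's theorem then forces its value there to be \(-\log 2\). The partial sums bear this out numerically (the helper `endpoint_partial_sum` is ours):

```python
# Partial sums of sum x^n / n at the endpoint x = -1.
# The alternating series sum (-1)^n / n converges; its value -log(2)
# is consistent with Abel's theorem applied to the logarithm series
# (a representation we establish in later chapters).
import math

def endpoint_partial_sum(N):
    """Sum of (-1)^n / n for n = 1, ..., N."""
    return sum((-1) ** n / n for n in range(1, N + 1))

assert abs(endpoint_partial_sum(100_000) - (-math.log(2))) < 1e-4
```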

20.3 Advanced

Product of Power Series

Composition and Division of Power Series (pg 499 in Amazing)

Abel Summability (p 464)

20.4 \(\bigstar\) Plugging Everything into Power Series

Power series are an incredibly useful tool, both for computation and for creativity. Since they are defined using only the operations of addition and multiplication, power series make sense in any mathematical domain where we (1) have an operation of \(+\) and \(\times\) and (2) have a notion of convergence.
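For instance, square matrices come equipped with \(+\) and \(\times\), and we can speak of entrywise convergence, so it makes sense to plug a matrix into a power series. A sketch in pure Python (the helpers `mat_mul`, `mat_add`, and `series_at_matrix` are ours, written for illustration), feeding a \(2\times 2\) matrix into the series \(\sum x^n/n!\):

```python
# Sketch: plugging a 2x2 matrix into the power series sum x^n / n!.
# Matrices have + and x, and we can take entrywise limits, so the
# partial sums make sense; here they visibly stabilize.

def mat_mul(A, B):
    """Product of two 2x2 matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(A, B):
    """Entrywise sum of two 2x2 matrices."""
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def series_at_matrix(A, N):
    """Partial sum of A^n / n! for n = 0, ..., N."""
    total = [[1.0, 0.0], [0.0, 1.0]]  # n = 0 term: the identity matrix
    term = [[1.0, 0.0], [0.0, 1.0]]
    for n in range(1, N + 1):
        term = mat_mul(term, A)                           # multiply by A...
        term = [[entry / n for entry in row] for row in term]  # ...divide by n
        total = mat_add(total, term)
    return total

A = [[0.0, 1.0], [0.0, 0.0]]  # nilpotent: A^2 = 0
S = series_at_matrix(A, 30)
expected = [[1.0, 1.0], [0.0, 1.0]]  # the series terminates: I + A exactly
assert all(abs(S[i][j] - expected[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

For the nilpotent matrix used here the series terminates after two terms, so the limit is computed exactly; for a general matrix the partial sums converge entrywise, by the scalar theory applied entry by entry.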

20.5 Problems

20.5.1 Example Power Series

By allowing a limiting process in their definition, power series provide explicit formulas for functions that we have not been able to describe thus far. For instance, we will soon see that the power series below is an exponential function.

Exercise 20.4 Show the power series \(\sum\frac{x^n}{n!}\) converges for all \(x\in\RR\).
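Before proving this, one can watch the convergence numerically: each term is the previous one multiplied by \(x/n\), which shrinks below \(1/2\) in size once \(n>2|x|\), so the partial sums stabilize even for large \(|x|\). (Comparing against `math.exp` anticipates the exponential identification mentioned above; that the base is \(e\) is a fact we take on faith here. The helper `exp_series_partial_sum` is ours.)

```python
# Partial sums of the power series sum x^n / n!.
import math

def exp_series_partial_sum(x, N):
    """Sum of x^n / n! for n = 0, ..., N-1."""
    total, term = 0.0, 1.0
    for n in range(1, N + 1):
        total += term       # add the current term x^(n-1)/(n-1)!
        term *= x / n       # build the next term incrementally
    return total

# Stabilization: adding more terms no longer changes the value.
assert abs(exp_series_partial_sum(10.0, 60) -
           exp_series_partial_sum(10.0, 120)) < 1e-8
# Consistent with the exponential function we will soon identify it as:
assert abs(exp_series_partial_sum(10.0, 120) - math.exp(10.0)) < 1e-6
```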

When a power series converges on a finite interval, its behavior at each endpoint may require a different argument than the ratio test (as the ratio will tend to \(1\) there and tell you nothing).

Example 20.1 Show the power series \(\sum\frac{x^n}{n}\) has domain \([-1,1)\).

Exercise 20.5 Show the power series \(\sum\frac{x^n}{n^2}\) has domain \([-1,1]\).

When the radius of convergence is \(0\), the power series converges at a single point:

Exercise 20.6 Show the power series \(\sum n!x^n\) diverges for all \(x\neq 0\).
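Numerically the divergence is dramatic: the terms themselves blow up, since each is the previous one multiplied by \((n+1)x\), which eventually exceeds \(1\) in size for any fixed \(x\neq 0\). A sketch (the helper `term` is ours):

```python
# The terms n! * x^n of the series sum n! x^n, computed incrementally
# as a product of the factors k * x for k = 1, ..., n.

def term(x, n):
    """n! * x^n, built up one factor at a time."""
    t = 1.0
    for k in range(1, n + 1):
        t *= k * x
    return t

# Even for small x the terms eventually explode, so the series
# fails the divergence test at every x != 0.
assert abs(term(0.1, 200)) > 1e100
```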

Exercise 20.7 Show the series \(\sum 2^nx^n\) converges exactly on \((-1/2,1/2)\). Hint: substitute \(y=2x\).

Example 20.2 Where does \(\sum 2^nx^{3n}\) converge? This is trickier: here we must account for the exponents being \(3n\) rather than \(n\).
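One way to proceed (a sketch): since the only powers appearing are \(x^{3n}\), the series is geometric in the quantity \(2x^3\),

\[\sum_{n\geq 0}2^nx^{3n}=\sum_{n\geq 0}\left(2x^3\right)^n,\]

which converges exactly when the common ratio satisfies \(|2x^3|<1\), that is, for \(|x|<1/\sqrt[3]{2}\). Note the radius is not \(1/2\), even though the ratio of successive nonzero coefficients is \(2\): the gaps in the exponents matter.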