16 Definition & Properties
Highlights of this Chapter: we formalize the concept of continuity, one of the foundational definitions in the analysis of functions. We provide an equivalent definition built out of sequences, and use it to prove ‘continuity analogs’ of the limit theorems. We also give the related definitions of function limits.
What does continuity mean? In pre-calculus classes, we often first hear something like “you can draw the graph without picking up your pencil”. This is a good guide to start with for a formal definition: it’s clearly capturing some property that is easy to check by visual inspection! But it’s not precise: terms like “you” and “pencil”, as well as modal phrases like “can draw”, are nowhere to be found in the axioms of ordered fields! How can we say the same thing using words we do have access to?
16.1 Epsilons and Deltas
First, a function is an input-output machine, so we should rephrase things in terms of inputs and outputs. When a graph makes a jump (where you’d have to pick up your pencil), the output changes a lot even when the input barely does. Thus, not having to pick up your pencil means that if you change the input by a little bit, the output changes by only a little bit.
This is totally something we can make precise! A good start is to give names to things: we want to say that for any change in the input smaller than some \(\delta\), the output can’t change by much: its change stays below some other small number \(\epsilon\):
Definition 16.1 (Continuity with \(\epsilon-\delta\)) A function \(f\) is continuous at a point \(a\) in its domain if for every \(\epsilon>0\) there is some threshold \(\delta\) where if \(x\) is within \(\delta\) of \(a\), then \(f(x)\) is within \(\epsilon\) of \(f(a)\). As a logic sentence: \[\forall \epsilon>0\,\exists\delta>0\, \forall x\, |x-a|<\delta\implies |f(x)-f(a)|<\epsilon\]
A function is continuous on a set \(X\subset\RR\) if it is continuous at \(a\) for each \(a\in X\). A function is continuous if it is continuous on its domain.
16.1.1 Working with this Definition
This definition looks a lot like the sequence definition, at least in terms of the order of the quantifiers. And so we can work with it the same way: playing the “\(\epsilon\)-\(\delta\) game” instead of the \(\epsilon\)-\(N\) game.
Example 16.1 Any constant function \(f(x)=c\) is continuous at every real number \(a\).
To prove this, we choose arbitrary \(\epsilon>0\), and observe that for any \(x\in\RR\), \(|f(x)-f(a)|=|c-c|=0\), which is less than \(\epsilon\). Thus, for any \(\delta>0\) (and if we want to be specific, choose say \(\delta=1\)), if \(|x-a|<\delta\) then \(|f(x)-f(a)|<\epsilon\).
Example 16.2 The function \(f(x)=cx\) is continuous at every real number \(a\).
Here’s the scratch work: note that if \(c=0\) then \(f(x)=0\) is constant, and we are done by the previous example. So, we may assume \(c\neq 0\). Given an arbitrary \(a\in\RR\), choose \(\epsilon>0\), and note that \(|f(x)-f(a)|=|cx-ca|=|c||x-a|\). If \(|x-a|<\delta\) this means \(|f(x)-f(a)|<|c|\delta\), so we may choose \(\delta = \epsilon/|c|\).
Remark: our value of \(\delta\) is allowed to depend on \(\epsilon\), as well as on properties of our function (like the constant \(c\) here).
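As a numerical sanity check on this scratch work (never a substitute for the proof), here is a small Python sketch; the helper name `delta_seems_to_work` and the sampling grid are our own inventions for illustration. It samples points within \(\delta\) of \(a\) and checks that the outputs stay within \(\epsilon\) of \(f(a)\); a finite check like this can only provide evidence, not a proof.

```python
def delta_seems_to_work(f, a, eps, delta, samples=10_000):
    """Numerically test whether |f(x) - f(a)| < eps whenever |x - a| < delta.

    A finite grid of sample points can only provide evidence, never a proof.
    """
    fa = f(a)
    for i in range(samples):
        # sweep x across the open interval (a - delta, a + delta)
        x = (a - delta) + 2 * delta * (i + 0.5) / samples
        if abs(f(x) - fa) >= eps:
            return False
    return True

# The function f(x) = 5x at a = 2, with the choice delta = eps/|c| = eps/5 from the scratch work:
print(delta_seems_to_work(lambda x: 5 * x, a=2, eps=0.01, delta=0.01 / 5))  # True
print(delta_seems_to_work(lambda x: 5 * x, a=2, eps=0.01, delta=0.01))      # False: delta too big
```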
While the \(\epsilon-\delta\) definition is nice in that it looks like the sequence definition, we still end up having to play the \(\epsilon\) game in every argument. Indeed, while some functions are well-suited to this, for other relatively simple-looking functions, picking the right \(\delta\) actually turns out to be a bit of work!
Example 16.3 The function \(f(x)=x^2\) is continuous.
Scratch Work: Given \(a\in\RR\), we will prove \(f\) is continuous at \(a\) (here we do the case \(a>0\); the cases \(a\leq 0\) require only small modifications: can you complete them?). Start by choosing arbitrary \(\epsilon>0\). We seek a \(\delta\) such that when \(|x-a|<\delta\), we can ensure \(|f(x)-f(a)|=|x^2-a^2|<\epsilon\). Using the difference of squares, \[|x^2-a^2|=|x+a||x-a|<|x+a|\delta\] for our future value of \(\delta\). To make further progress, let’s decide to always choose a value of \(\delta\) which is \(\leq 1\) (if you originally had a larger \(\delta\), taking a smaller value will of course also work, so it’s no trouble to impose a maximum size). Then \(|x-a|<\delta\) means \(x\) is always within \(1\) of \(a\), so \(x\) can never be bigger than \(a+1\). Thus \(x+a\) can never be bigger than \((a+1)+a=2a+1\) (and, since \(a>0\), it is also never smaller than \(-(2a+1)\)), so we know \[|x^2-a^2|<(2a+1)\delta\]
For this to be less than \(\epsilon\), we can solve for \(\delta\): taking \(\delta = \min\{1,\epsilon/(2a+1)\}\) works (the minimum with \(1\) enforces our earlier promise that \(\delta\leq 1\)). Writing a rigorous proof by essentially starting with this claimed value for \(\delta\) and “working backwards” confirms this works.
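As a quick spot-check on this choice of \(\delta\) (again illustrative Python of our own, not a replacement for the written proof), the sketch below confirms numerically that \(\delta=\min\{1,\epsilon/(2a+1)\}\) keeps \(|x^2-a^2|\) below \(\epsilon\) on a sample of points near several values of \(a\).

```python
def check_square(a, eps, samples=10_000):
    """Spot-check delta = min(1, eps/(2a+1)) for f(x) = x**2 at a point a > 0."""
    delta = min(1.0, eps / (2 * a + 1))
    for i in range(samples):
        x = (a - delta) + 2 * delta * (i + 0.5) / samples   # points with |x - a| < delta
        if abs(x**2 - a**2) >= eps:
            return False
    return True

print(all(check_square(a, eps) for a in [0.5, 1, 3, 10] for eps in [1, 0.1, 0.001]))  # True
```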
Like any definition, it’s good after seeing a few examples to also turn and look at some non-examples:
Example 16.4 The step function \[h(x)=\begin{cases} 0 & x\leq 0\\ 1 & x>0 \end{cases}\] is discontinuous at \(0\), but is continuous at all other real numbers.
At \(0\), we prove discontinuity by fixing \(\epsilon=1/2\), and showing that for any \(\delta>0\) there are points within \(\delta\) of \(0\) whose values under \(h\) differ from \(h(0)\) by more than \(\epsilon\). Indeed, we can just take \(x=\delta/2\): this is positive, and \(|x-0|<\delta\), but \(h(x)=1\) whereas \(h(0)=0\), so \(|h(x)-h(0)|=1>\epsilon\). However, for any nonzero \(a\in\RR\), \(h\) is continuous at \(a\). Fixing an arbitrary \(\epsilon>0\), we can take \(\delta=|a|\), and note that \(x\) being within \(\delta\) of \(a\) implies \(x\) has the same sign as \(a\) (either positive or negative). Thus \(h(x)=h(a)\), so \(|h(x)-h(a)|=0\), which is certainly less than \(\epsilon\).
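The witness \(x=\delta/2\) in the discontinuity argument works no matter how small \(\delta\) is, and a short loop (an illustrative Python sketch of our own, not part of the proof) makes this vivid: shrinking \(\delta\) never shrinks the jump in the outputs.

```python
def h(x):
    """The step function from Example 16.4."""
    return 1 if x > 0 else 0

for delta in [1, 0.1, 0.001, 1e-9]:
    x = delta / 2                    # a point with |x - 0| < delta
    print(delta, abs(h(x) - h(0)))   # the output gap is always 1, never below eps = 1/2
```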
Thus, a function with a jump in it is discontinuous right at the jump, as we expect. This shows it’s possible for a function to be discontinuous at a single point, but things can get much stranger!
Example 16.5 The characteristic function of the rational numbers is discontinuous everywhere. \[b(x)=\begin{cases} 1 & x\in\QQ\\ 0 & x\not\in\QQ \end{cases}\]
Setting \(\epsilon=1/2\), note that proving discontinuity at \(a\) means showing that for *any* \(\delta>0\) we can find an \(x\) within \(\delta\) of \(a\) where \(b(x)\) differs from \(b(a)\) by more than \(1/2\). The proof breaks into two cases depending on the (ir)rationality of \(a\). First, for irrational \(a\), by the density of the rationals we may for any \(\delta>0\) find a rational number \(x\) with \(a-\delta< x < a+\delta\), so \(|x-a|<\delta\). But \(b(a)=0\) since \(a\) is irrational and \(b(x)=1\) since \(x\) is rational, thus \(|b(x)-b(a)|=1>\epsilon\). The case of \(a\) rational is similar, but now we use the density of the irrationals to find an appropriate \(x\).
We saw above a function that is discontinuous at a single point, and then one that is discontinuous everywhere. What’s harder to imagine is a function that is continuous at a single point. Try thinking about what this might mean!
Exercise 16.1 Show that the following function is continuous at \(0\) and discontinuous everywhere else:
\[g(x)=\begin{cases} x & x\in \QQ\\ 0 & x\not\in\QQ \end{cases}\]
There are even stranger functions out there: for instance, the Thomae function \[ \tau(x)=\begin{cases} \frac{1}{q} & x\in\QQ\textrm{ and }x=\frac{p}{q}\textrm{ in lowest terms, with }q>0\\ 0 &x\not\in\QQ \end{cases} \] is continuous at every irrational number, and discontinuous at every rational.
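The Thomae function is easy to evaluate at rational inputs, and doing so numerically already hints at its strange behavior: rationals that approximate an irrational number well tend to need large denominators, so their Thomae values are small. Here is a small sketch using Python’s exact `Fraction` type (the helper name `thomae` is our own, purely for illustration); it can only be fed rational inputs, which matches the definition above.

```python
from fractions import Fraction

def thomae(x: Fraction) -> Fraction:
    """Value of the Thomae function at the rational number x = p/q in lowest terms."""
    return Fraction(1, x.denominator)   # Fraction always stores p/q reduced, with q > 0

# Rationals approximating the irrational sqrt(2) = 1.41421... need large denominators,
# so their Thomae values are small -- consistent with continuity at the irrationals.
for q in [10, 100, 10_000]:
    x = Fraction(round(q * 2**0.5), q)  # a rational within 1/q of sqrt(2)
    print(x, thomae(x))
```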
As an example of proving something using continuity, we prove the useful fact that when a continuous function is nonzero at some point, it actually stays nonzero for a little bit on each side.
Proposition 16.1 (Nonzero on a Neighborhood) If \(f\) is continuous at \(a\) and \(f(a)\neq 0\), then there is a small open interval about \(a\) on which \(f\) is nonzero.
Proof. Suppose \(f(a)\neq 0\), and set \(\epsilon= |f(a)|/2\). By continuity, there is some \(\delta>0\) such that if \(|x-a|<\delta\) we know \(|f(x)-f(a)|<\epsilon\). Unpacking this, for all \(x\in(a-\delta, a+\delta)\) we know
\[-\epsilon=\frac{-|f(a)|}{2} < f(x)-f(a)<\frac{|f(a)|}{2}=\epsilon\]
And thus
\[f(a)-\frac{|f(a)|}{2}< f(x)<f(a)+\frac{|f(a)|}{2}\]
If \(f(a)\) is positive, then the lower bound here is \(f(a)/2\), which is still positive, so \(f(x)\) is positive throughout the interval. And if \(f(a)\) is negative, the upper bound here is \(f(a)/2\), which is still negative: thus \(f(x)\) is negative throughout the interval.
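To see the proposition in action concretely, here is a quick numerical illustration (a Python sketch of our own, not part of the argument): for \(f(x)=x^2-2\) at \(a=2\) we have \(f(a)=2\neq 0\), and taking \(\epsilon=|f(a)|/2\) together with the \(\delta\) found in Example 16.3 keeps every sampled point of \((a-\delta,a+\delta)\) strictly positive.

```python
def f(x):
    return x**2 - 2                        # f(2) = 2, which is nonzero

a = 2
eps = abs(f(a)) / 2                        # the epsilon used in the proof
delta = min(1.0, eps / (2 * a + 1))        # the delta from Example 16.3 controls |x^2 - a^2|

samples = [(a - delta) + 2 * delta * (i + 0.5) / 1000 for i in range(1000)]
print(all(f(x) > 0 for x in samples))      # True: f stays positive on (a - delta, a + delta)
```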
16.2 Continuity and Sequences
We have spent a lot of time working with sequences so far, so it would be nice if we could leverage some of that knowledge as more than just an analogy. And indeed we can: the formal result is below.
Theorem 16.1 (Sequences and Continuity) Let \(f\) be a real function, and \(a\) a point of its domain. Then \(f\) is continuous at \(a\) if and only if for every sequence \(a_n\) in the domain with \(a_n\to a\), we have \(f(a_n)\to f(a)\).
This theorem is an equivalence of definitions or an if-and-only-if result, so the proof requires two parts: first we show that continuity implies sequence continuity, and then we show the converse.
Proof (Continuity Implies Sequence Continuity). Let \(f\) be continuous at \(a\), and \(x_n\) an arbitrary sequence converging to \(a\). We wish to show the sequence \(f(x_n)\) converges to \(f(a)\). Choosing an \(\epsilon>0\), we use the assumed continuity to get a \(\delta>0\) where \(|x-a|<\delta\) implies that \(|f(x)-f(a)|<\epsilon\).
But since \(x_n\to a\), we know there must be some \(N\) such that for \(n>N\) we have \(|x_n-a|<\delta\): thus for this same \(N\) we have \(|f(x_n)-f(a)|<\epsilon\).
Putting this all together, this is just the definition of convergence of the sequence \(f(x_n)\) to \(f(a)\): starting with \(\epsilon>0\) we produced an \(N\) such that for \(n>N\) we can guarantee \(|f(x_n)-f(a)|<\epsilon\). So we are done.
Proof (Sequence Continuity Implies Continuity). Here we prove the contrapositive: that if \(f\) is not continuous at \(a\) then it is also not sequence continuous there.
If \(f\) is not continuous at \(a\), then there is some \(\epsilon>0\) such that for every \(\delta>0\) we can find points within \(\delta\) of \(a\) where \(f(x)\) is at least \(\epsilon\) away from \(f(a)\). From this we need to somehow produce a sequence, so we will take a sequence of such \(\delta\)’s and for each pick some such bad point \(x\).
For example, if we let \(\delta=1/n\), then call \(x_n\) a point with \(|x_n-a|<1/n\) but \(|f(x_n)-f(a)|\geq\epsilon\). Doing this for all \(n\) produces a sequence where \[a-\frac{1}{n}<x_n<a+\frac{1}{n}\] And so by the squeeze theorem we see that \(x_n\) converges, and its limit is \(a\). But we also know (by our choice of \(x_n\)) that for every element of this sequence \(|f(x_n)-f(a)|\geq\epsilon\), so there’s no way that \(f(x_n)\) converges to \(f(a)\).
Thus, we’ve exhibited a sequence showing that our function is not sequence continuous at \(a\), as required.
When working with this definition of continuity, it’s important to remember that we need to check \(f(\lim x_n)=\lim f(x_n)\) for all sequences \(x_n\to a\). If it fails for any individual sequence, that is enough to show the function is not continuous at that point. Thus when proving continuity we will always start with “let \(x_n\) be an arbitrary sequence converging to \(a\)” and make use of convergence theorems to help us (since we cannot know the particular sequence), whereas for proving discontinuity all we need to do is produce a specific example sequence that fails.
With this characterization, we can bring all of our theory of sequence limits to bear. We see many of its benefits below; here we pause merely to re-do a single example for illustrative purposes:
Example 16.6 The function \(f(x)=x^2\) is continuous on all of \(\RR\).
Let \(a\in\RR\) be arbitrary, and choose an arbitrary sequence \(a_n\) of real numbers with \(a_n\to a\) (we know at least one such sequence exists since we have proven every real number is the limit of a sequence of rationals). By the limit theorem for products, since \(a_n\to a\) we know \(a_n\cdot a_n\to a\cdot a\). And as \(f(x)=x^2\), we can rewrite this as \(f(a_n)\to f(a)\). Since \(a_n\to a\) was arbitrary, this holds for all such sequences, and so \(f\) is continuous at \(a\). But since \(a\in\RR\) was arbitrary, \(f\) is continuous on the entire real line.
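Numerically, sequence continuity says that feeding any sequence converging to \(a\) through \(f\) produces a sequence converging to \(f(a)\). The sketch below (illustrative Python; the particular sequences are arbitrary choices of ours) tries this for \(f(x)=x^2\) with two different sequences converging to \(a=3\).

```python
f = lambda x: x**2
a = 3

# two different sequences converging to a = 3 (arbitrary choices, for illustration only)
seq1 = [3 + 1 / n for n in range(1, 10_001)]
seq2 = [3 - (-0.5)**n for n in range(1, 41)]

for seq in (seq1, seq2):
    gaps = [abs(f(x) - f(a)) for x in seq]
    print(gaps[0], gaps[len(gaps) // 2], gaps[-1])   # the gaps |f(x_n) - f(a)| shrink toward 0
```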
16.3 Building Continuous Functions
Because we have an equivalent characterization of continuity in terms of sequence convergence, and we have many limit theorems for sequences, we can use this characterization to rephrase those theorems as results about continuity.
Proposition 16.2 (Continuity of Constant Multiples) If \(f\) is continuous at \(a\in\RR\) and \(k\in\RR\) is a constant, then the function \(kf\colon x\mapsto kf(x)\) is continuous at \(a\).
Proof (Using \(\epsilon-\delta\)). First note that if \(k=0\) we are done, as \(kf(x)=0\) is a constant function. Otherwise, let \(\epsilon>0\): since \(f\) is continuous at \(a\) there exists a \(\delta>0\) such that \(|x-a|<\delta\) implies \(|f(x)-f(a)|<\epsilon/|k|\). But this implies \[|kf(x)-kf(a)|=|k||f(x)-f(a)|<|k|\frac{\epsilon}{|k|}=\epsilon\] So \(|x-a|<\delta\) implies \(|kf(x)-kf(a)|<\epsilon\), and \(kf\) is continuous at \(a\).
Proof (Using Sequences). Let \(x_n\) be an arbitrary sequence converging to \(a\). Since \(f\) is continuous at \(a\), we know \(f(x_n)\to f(a)\). Then by the limit theorem for constant multiples, \(kf(x_n)\to kf(a)\). This says exactly that \(\lim (kf)(x_n)=(kf)(\lim x_n)\), so \(kf\) is continuous at \(a\).
Theorem 16.2 (Continuity and the Field Operations) Let \(f,g\) be functions which are continuous at a point \(a\). Then the functions \(f(x)+g(x),f(x)-g(x)\) and \(f(x)g(x)\) are all continuous at \(a\). Furthermore if \(g(a)\neq 0\) then \(f(x)/g(x)\) is also continuous at \(a\).
Proof. We prove the case for sums, and leave the rest as an exercise. Let \(f,g\) be two functions continuous at the point \(a\), and let \(x_n\) be any sequence converging to \(a\). Since \(f\) is continuous we know that \(\lim f(x_n)=f(\lim x_n)=f(a)\), and similarly by the continuity of \(g\), \(\lim g(x_n)=g(\lim x_n)=g(a)\). Thus by the limit theorem for sums, the sequence \(f(x_n)+g(x_n)\) is convergent, with \[\lim\left(f(x_n)+g(x_n)\right)=\lim f(x_n)+\lim g(x_n)=f(a)+g(a)\] Since \(x_n\) was an arbitrary sequence converging to \(a\), this shows \(f+g\) is continuous at \(a\). The same argument applies to subtraction, multiplication, and division, using the respective limit theorems for sequences.
Exercise 16.2 Prove the remaining “continuity theorems”.
Exercise 16.3 (Continuity of Polynomials) Prove that every polynomial is a continuous function on the entire real line. Hint: prove \(x^n\) is continuous for each \(n\) by induction. Then prove the result for polynomials by induction on their degree!
Exercise 16.4 (Continuity of Rational Functions) A rational function is a quotient of polynomials \(r(x)=p(x)/q(x)\). Prove that every rational function is continuous at every point of its domain.
One of the most important operations on functions is composition: if \(f\colon \RR\to\RR\) and \(g\colon\RR\to\RR\) then the function \(g\circ f\colon\RR\to\RR\) is defined as \(g\circ f(x):=g\left(f(x)\right)\). More generally, so long as the range of \(f\) is a subset of the domain of \(g\), the composition \(g\circ f\) is well defined.
Theorem 16.3 (Continuity of Compositions) Let \(f, g\) be functions such that \(f\) is continuous at \(a\), and \(g\) is continuous at \(f(a)\). Then the composition \(g\circ f(x):=g(f(x))\) is continuous at \(a\).
Proof. Let \(x_n\) be an arbitrary sequence converging to \(a\in\RR\): we wish to show that \(\lim g(f(x_n))=g(f(\lim x_n))=g(f(a))\). Since \(f\) is continuous at \(x=a\), we see immediately that \(f(x_n)\) is a convergent sequence with \(f(x_n)\to f(a)\). And now, since \(g\) is assumed to be continuous at \(f(a)\) and \(f(x_n)\) is a sequence converging to this point, we know \(g(f(x_n))\to g(f(a))\), as required.
Another limit theorem we had was the limit theorem for the square root, which translates directly into a continuity theorem as well!
Theorem 16.4 (Continuity of Roots) The function \(R(x)=\sqrt{x}\) is continuous on \([0,\infty)\).
Proof. We actually already proved this, as a limit theorem about the square root! That theorem states that if \(x\geq 0\) and \(x_n\) is any sequence of nonnegative numbers converging to \(x\), then \(\lim\sqrt{x_n}=\sqrt{\lim x_n}\). Thus \(\sqrt{x}\) is continuous at \(x\), and as \(x\) is an arbitrary nonnegative value, it’s continuous on its domain.
Corollary 16.1 (Continuity of Absolute Value) The absolute value satisfies \(|x|=\sqrt{x^2}\) for all real \(x\). This is a composition of two continuous functions, and thus is continuous.
The same is true for \(n^{th}\) roots, though we do not stop to prove it here; you may wish to do so for practice! This is a special case of a more general result on the continuity of inverse functions (as the square root is the inverse of \(x^2\) on \([0,\infty)\)).
Theorem 16.5 (Continuity of Inverse Functions) Let \(f\colon A\to B\) be a continuous invertible function, where \(A,B\subset\RR\) are closed and bounded subsets. Then \(f^{-1}\) is continuous.
Proof (By Contradiction). Assume for the sake of contradiction that \(f\colon A\to B\) is continuous and invertible with \(A,B\subset \RR\) closed and bounded, but \(f^{-1}\) is not continuous. Then there would be some convergent sequence \(b_n\to b\) in \(B\) where \(f^{-1}(b_n)\not\to f^{-1}(b)\).
This sequence \(f^{-1}(b_n)\) lies in \(A\), which is bounded, so it contains a convergent subsequence by Bolzano-Weierstrass; and since \(A\) is closed, the limit of that subsequence lies in \(A\). It’s further possible to select such a subsequence \(f^{-1}(b_{n_k})\) converging to some value \(a\in A\) with \(a\neq f^{-1}(b)\) (if this were impossible, all convergent subsequences would converge to \(f^{-1}(b)\), and so our bounded sequence would have converged to this value!).
Now we use the fact that \(f\) is continuous. Since \(f^{-1}(b_{n_k})\to a\), we see \(f\left(f^{-1}(b_{n_k})\right)\to f(a)\), and since \(f^{-1}\) is the inverse of \(f\), this just means that \(b_{n_k}\to f(a)\). But since \(f\) is invertible, it’s one-to-one, so the fact that \(a\neq f^{-1}(b)\) means that \(f(a)\neq b\). That is, we have found a subsequence \(b_{n_k}\) of \(b_n\) which does not converge to \(b\).
But this implies that the sequence \(b_n\) itself does not converge to \(b\) (else all subsequences would converge to \(b\)!), and this is a contradiction, as we assumed \(b_n\to b\) at the very start. Thus, \(f^{-1}\) is actually continuous, as desired.
We can apply this to functions we care about like \(n^{th}\) powers, to prove the continuity of \(n^{th}\) roots.
Corollary 16.2 The function \(\sqrt[n]{x}\) is continuous on the positive reals.
Proof. Let \(x> 0\); we want to show the \(n^{th}\) root function is continuous at \(x\). So, we need to choose closed and bounded sets \(A,B\) for our domains, to make sure things work. Taking \(B=[0,x+1]\) will do, as it contains \(x\), and then \(A=[0,\sqrt[n]{x+1}]\), so \(t\mapsto t^n\) is a continuous invertible function from \(A\) onto \(B\). By the theorem above its inverse is continuous, meaning \(\sqrt[n]{x}\) is continuous on the interval \(B=[0,x+1]\), which contains \(x\). Thus it’s continuous at \(x\), and as \(x\) was an arbitrary positive real number, the \(n^{th}\) root is continuous on the positive reals.
Exercise 16.5 Our argument above showed \(\sqrt[n]{x}\) is continuous at all positive inputs. Show it’s continuous at zero as well.
Note to students! If you can think of a better proof of this (especially one that avoids this awkward boundedness/Bolzano-Weierstrass business), let me know. If it’s slick enough, I’ll replace the proof in the textbook and thank you in a footnote of future editions!
The combination of these theorems allows us to prove that many complicated functions are continuous which would have been quite difficult to handle directly from the definition!
Example 16.7 The following function is continuous on the entire real line. \[\frac{(x+|x^2-1|)^4}{1+\sqrt[3]{1+|x-1|^7}}\]
To prove it, we work from the inside out, like we do when using the limit laws. Starting with the numerator: \(x^2-1\) is continuous as it’s a polynomial, so \(|x^2-1|\) is continuous as the composition of two continuous functions, \(x+|x^2-1|\) is continuous as the sum of two continuous functions, and raising to the fourth power preserves continuity (it’s a product of continuous functions). For the denominator, we similarly start with the continuity of \(x-1\), compose with the absolute value to get the continuity of \(|x-1|\), then compose with the seventh power to see \(|x-1|^7\) is continuous, add the constant function 1 (which is continuous), take the cube root (composing with the continuous function \(\sqrt[3]{x}\)), and finally add the continuous function 1 once more. So the numerator and denominator are continuous. Finally, the denominator is at least \(2\) for all \(x\) (hence nonzero), so the quotient is continuous.
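For a concrete check that nothing goes wrong when evaluating this formula, here it is typed out directly (an illustrative Python sketch of our own); in particular the denominator is always at least \(2\), so there is no division-by-zero worry anywhere on the real line.

```python
def F(x):
    numerator = (x + abs(x**2 - 1))**4
    denominator = 1 + (1 + abs(x - 1)**7)**(1 / 3)   # cube root of a quantity that is >= 1
    return numerator / denominator

# The denominator is 1 + cuberoot(1 + nonnegative) >= 1 + 1 = 2, so it is never zero.
assert all(1 + (1 + abs(x - 1)**7)**(1 / 3) >= 2 for x in [-10, -1, 0, 0.5, 1, 2, 10])
print([round(F(x), 3) for x in [-2, 0, 1, 3]])
```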
Exercise 16.6 (Continuity of Max and Min) Prove that for any \(a,b\in\RR\), we have \[\max\{a,b\}=\frac{a+b+|a-b|}{2}\] Intuitively, notice \((a+b)/2\) is the midway point between \(a\) and \(b\), and \(|a-b|/2\) is half the distance between them. So \((a+b)/2+|a-b|/2\) is the midway point plus half the distance, so it’s the larger of the two. But give a rigorous argument, perhaps by cases.
Use this to conclude that if \(f,g\) are two continuous functions then \(M(x)=\max\{f(x),g(x)\}\) is also continuous. Propose and prove a similar formula for the minimum, and show that \(\min\{f(x),g(x)\}\) is continuous in \(x\).
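Before writing the case analysis, a quick randomized spot-check (illustrative Python of our own, and of course no substitute for the requested proof) can build confidence in the identity for the maximum.

```python
import random

random.seed(0)   # reproducible sampling
ok = True
for _ in range(10_000):
    a, b = random.uniform(-100, 100), random.uniform(-100, 100)
    ok = ok and abs(max(a, b) - (a + b + abs(a - b)) / 2) < 1e-9
print(ok)   # True: the identity holds on every sampled pair
```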
16.4 Function Limits
A related but slightly different concept is the limit of a function. We include it here as the definition and techniques tie very closely to those for continuity; we will have use for this material when we introduce the derivative, and in other cases where we need to understand the behavior of a function near a point without actually being able to compute the function’s value at that point (perhaps that point is outside the function’s domain).
Definition 16.2 (Limits of Functions) Let \(f\colon D\to \RR\) and let \(a\) be a limit point of \(D\). Then we write \(\lim_{x\to a}f(x)=L\) if for every \(\epsilon>0\) there is a \(\delta>0\) such that if \(x\in D\) and \(0<|x-a|<\delta\), then \(|f(x)-L|<\epsilon\).
One can alternatively phrase this in terms of sequences:
Exercise 16.7 Prove the following definition is equivalent to \(\lim_{x\to a}f(x)=L\): Given any sequence \(\{x_n\}\) in \(D\) with \(x_n\neq a\) for all \(n\), \(x_n\to a\) implies that \(f(x_n)\to L\).
Example 16.8 We compute \[\lim_{x\to 2}\frac{x^2-4}{x-2}\]
Let \(x_n\) be any sequence converging to \(2\), for which \(x_n\neq 2\) for all \(n\). Then since \(x_n\neq 2\) the denominator of \((x^2-4)/(x-2)\) is never zero, and we can simplify with algebra: \[\frac{x_n^2-4}{x_n-2}=\frac{(x_n+2)(x_n-2)}{x_n-2}=x_n+2\]
Thus \[\lim \frac{x^2_n-4}{x_n-2}=\lim (x_n+2)=\lim(x_n)+2=4\]
Since \(x_n\) was arbitrary, this holds for all sequences and
\[\lim_{x\to 2}\frac{x^2-4}{x-2}=4\]
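Numerically, this limit shows up as the values of \((x^2-4)/(x-2)\) clustering near \(4\) as \(x\) approaches \(2\) while avoiding \(2\) itself; the snippet below (illustrative Python, with an arbitrary sequence of our own choosing) evaluates the unsimplified formula along one such sequence.

```python
ratio = lambda x: (x**2 - 4) / (x - 2)   # not defined at x = 2 itself

for n in [1, 10, 100, 10_000]:
    x = 2 + (-1)**n / n                  # a sequence converging to 2, never equal to 2
    print(x, ratio(x))                   # the outputs approach 4
```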
We will be most interested in taking limits of functions in cases where things are not actually defined at \(a\), like the example above: the most important example being the derivative, defined as the limit \(f^\prime(a)=\lim_{x\to a}(f(x)-f(a))/(x-a)\). However, a good sanity check with a new definition is to see that it performs as expected in known situations.
Theorem 16.6 (Limits of Continuous Functions) If \(f\) is continuous at \(a\), then \(\lim_{x\to a}f(x)=f(a)\).
Proof. Let \(x_n\) be a sequence converging to \(a\), but not equal to \(a\) at any term. Since \(f\) is continuous at \(a\), we know the sequence \(f(x_n)\) converges to \(f(a)\). Thus by the sequence definition of function limits \(\lim_{x\to a}f(x)=f(a)\).
As an exercise, re-prove this result using the \(\epsilon-\delta\) definition.
16.4.1 \(\blacklozenge\) One-Sided Limits
The definition of the function limit requires understanding all sequences limiting to \(a\) but not equal to \(a\). In applications, it’s often important to consider more restricted limits, looking only at what happens when we approach \(a\) from above or from below.
Definition 16.3 (Left- and Right-Sided Limits) Let \(f\) be a function. We write \(\lim_{x\to a^+}f(x)=L\) if for every \(\epsilon>0\) there is a \(\delta>0\) such that \(a<x<a+\delta\) implies \(|f(x)-L|<\epsilon\). Similarly, \(\lim_{x\to a^-}f(x)=L\) if for every \(\epsilon>0\) there is a \(\delta>0\) such that \(a-\delta<x<a\) implies \(|f(x)-L|<\epsilon\).
Similarly to above, these definitions have sequence counterparts (prove this, as an exercise):
Definition 16.4 Let \(f\) be a function. Then \(\lim_{x\to a^+}f(x)=L\) if for every sequence \(x_n\to a\) with \(x_n>a\) we have \(f(x_n)\to L\). Similarly \(\lim_{x\to a^-}f(x)=L\) if for every \(x_n\to a\) with \(x_n<a\) we have \(f(x_n)\to L\).
Exercise 16.8 (Limit Exists when Both Sides Agree) Let \(f\) be a function defined on an interval containing \(a\) (but perhaps not at \(a\)). Then \(\lim_{x\to a}f(x)\) exists if and only if \(\lim_{x\to a^+}f\) and \(\lim_{x\to a^-}f\) both exist and are equal, in which case \(\lim_{x\to a}f\) equals their common value.
Exercise 16.9 (One Sided Limits of Monotone Functions) Let \(f\) be a bounded monotone function on the interval \((a,b)\). Then both of the one sided limits exist \[\lim_{x\to a^+}f(x)\hspace{1cm}\lim_{x\to b^-}f(x)\] Hint: show they are the inf and sup of \(\{f(x)\mid x\in (a,b)\}\)
This proves useful in many cases where we know only that our function is monotone, but cannot compute its values. For us, the most important application is Proposition 24.1 where we show exponential functions are differentiable, when we have only assumed they are continuous.
16.5 Problems
Exercise 16.10 Let \(f(x)\) be a continuous function, and assume that \(f(x)^2\) is a constant function. Prove that \(f(x)\) is constant. To show continuity is an essential assumption, give an example of an \(f(x)\) where \(f(x)^2\) is constant, but \(f\) is not.
Exercise 16.11 Recall that a function \(f\) is a contraction map if there exists a \(k\in(0,1)\) with \(|f(x)-f(y)|<k|x-y|\) for all \(x,y\). Prove that contraction maps are continuous.
Exercise 16.12 Prove that if \(f\) is continuous at a point \(a\), then \(|f|\) is continuous there. Hint: use the reverse triangle inequality.
Exercise 16.13 Show that the function \[\mathrm{sgn}(x)=\begin{cases} -1 & x<0\\ 0 & x=0\\ 1 &x>0 \end{cases}\] is discontinuous at \(x=0\), but continuous at every other real number.
Exercise 16.14 (Removable and Jump Discontinuities) First consider the function \[f(x)=\begin{cases} 0&x<0\\ 17 & x=0\\ x& x>0 \end{cases}\] Show that \(\lim_{x\to 0}f(x)=0\).
Next consider \[g(x)=\begin{cases} 0&x<0\\ 17 & x=0\\ x^2+1& x>0 \end{cases}\] Show that \(\lim_{x\to 0}g(x)\) does not exist.
Exercise 16.15 (The Pasting Lemma) Let \(f,g\) be two continuous functions, and let \(a\in\RR\) be a point such that \(f(a)=g(a)\). Prove that the piecewise function below is continuous at \(a\). \[h(x)=\begin{cases} f(x)& x\leq a\\ g(x) & x> a \end{cases}\]