Friday, January 11, 2013 CC-BY-NC
Continuous functions on closed, bounded intervals

Maintainer: admin

Let's start off today with a lemma - one of those useful little building blocks of proper theorems and their proofs. If you took the exam for Analysis 1, Fall 2012, this will look familiar:

Lemma: If $g : A \to \mathbb{R}$ is continuous at $c \in A$ and $g(c) \neq 0$, then there exists a neighborhood of $c$ on which $g(x) \neq 0$.

Proof: Since $g(c) \neq 0$, we may let $\epsilon = \frac{|g(c)|}{2} > 0$. Since $g$ is continuous at $x=c$, $\exists \delta > 0$ such that $|x - c| < \delta$ implies $|g(x) - g(c)| < \epsilon$. By the reverse triangle inequality, $|g(c)| - |g(x)| \leq |g(c) - g(x)| < \frac{|g(c)|}{2}$. Rearranging this, we see that $|g(c)| - \frac{|g(c)|}{2} < |g(x)|$, thus $0 < \frac{|g(c)|}{2} < |g(x)|$. So on this $\delta$-neighborhood of $c$, $g(x) \neq 0$.
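As a quick numerical sanity check of the lemma (a sketch, not a proof - the particular $g$, $c$, and $\delta$ below are illustrative assumptions, not part of the argument), we can sample a $\delta$-neighborhood and confirm that $|g(x)|$ stays above $|g(c)|/2$:

```python
# Illustrative check of the lemma; g, c, and delta are example choices.
def g(x):
    return x * x - 2.0  # continuous everywhere; g(1) = -1 != 0

c = 1.0
eps = abs(g(c)) / 2.0  # epsilon = |g(c)| / 2, as in the proof

# A delta small enough that |g(x) - g(c)| < eps holds on (c - delta, c + delta).
delta = 0.2

samples = [c - delta + 2.0 * delta * k / 1000 for k in range(1, 1000)]
assert all(abs(g(x) - g(c)) < eps for x in samples)       # continuity bound
assert all(abs(g(x)) > abs(g(c)) / 2.0 for x in samples)  # hence g(x) != 0
```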

We can also get the following corollary out of this:

Corollary: If $g : A \to \mathbb{R}$ is continuous at $c \in A$ and $g(c) \neq 0$, then $1 / g(x)$ is continuous at $c$.

Proof: By the lemma, there exists a $\delta > 0$ such that $|g(x)| > \frac{|g(c)|}{2}$ for all $x$ in the $\delta$-neighborhood of $c$. Thus, $\frac{1}{|g(x)|} < \frac{2}{|g(c)|}$ for those $x$, and so:

$$ \left|\frac{1}{g(x)} - \frac{1}{g(c)}\right| = \frac{|g(c) - g(x)|}{|g(x)||g(c)|} \leq \frac{2 \cdot |g(x) - g(c)|}{|g(c)|^2} $$

But $g$ is continuous at $c$, so for a given $\epsilon > 0$, there exists a $\delta_1 > 0$ such that if $|x - c| < \delta_1$, then $|g(x) - g(c)| < \frac{|g(c)|^2}{2}\epsilon$. Thus, if we choose $\delta_2 := \min\{\delta, \delta_1\}$ and require $|x - c| < \delta_2$, then:

$$ \left|\frac{1}{g(x)} - \frac{1}{g(c)}\right| \leq \frac{2 \cdot |g(x) - g(c)|}{|g(c)|^2} < \epsilon $$

This completes the proof.
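The key inequality in the corollary's proof can also be checked at sample points (again a sketch under assumptions - the $g$, $c$, and $\delta$ below are example choices):

```python
# Check |1/g(x) - 1/g(c)| <= 2|g(x) - g(c)| / |g(c)|^2 on a delta-neighborhood.
# g, c, and delta are illustrative assumptions.
def g(x):
    return 3.0 + x  # continuous, g(0) = 3 != 0

c = 0.0
delta = 1.0  # on (c - 1, c + 1), |g(x)| >= 2 > |g(c)| / 2 = 1.5

for k in range(1, 200):
    x = c - delta + 2.0 * delta * k / 200
    lhs = abs(1.0 / g(x) - 1.0 / g(c))
    rhs = 2.0 * abs(g(x) - g(c)) / (g(c) ** 2)
    assert lhs <= rhs + 1e-12  # small slack for floating-point rounding
```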

More combinations of continuous functions

Now let's talk about functions that are continuous everywhere. It's easy to see that $f(x) = x$ is a continuous function, and as we've seen, the product of a continuous function with a scalar or with another continuous function is continuous, as is the sum of continuous functions. Therefore polynomials are continuous everywhere, since any polynomial can be built from $f(x) = x$ and constants using sums and products. By the corollary above, rational functions are also continuous everywhere on their domain.
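The "build a polynomial out of $f(x) = x$ and constants" idea can be made concrete. Here is a small sketch (the combinator names are my own, not from the notes) where a polynomial is assembled using only the identity function, constant functions, sums, and products - exactly the operations the theorems above cover:

```python
# Assemble a polynomial from continuous building blocks: the identity,
# constants, sums, and products. Names here are illustrative.
identity = lambda x: x
const = lambda a: (lambda x: a)
add = lambda f, g: (lambda x: f(x) + g(x))
mul = lambda f, g: (lambda x: f(x) * g(x))

# p(x) = 2x^2 - 3x + 1, built only from the combinators above
x_squared = mul(identity, identity)
p = add(add(mul(const(2.0), x_squared), mul(const(-3.0), identity)), const(1.0))

assert p(0.0) == 1.0
assert p(1.0) == 0.0  # 2 - 3 + 1
assert p(2.0) == 3.0  # 8 - 6 + 1
```

Since each building block is continuous and each combinator preserves continuity, $p$ is continuous everywhere.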

What about the composition of continuous functions? It turns out these are also continuous, which we'll state and prove as follows:

Theorem: Given $A, B \subseteq \mathbb{R}$, let $f : A \to \mathbb{R}$ and $g : B \to \mathbb{R}$ be functions where $f(A) \subseteq B$. If $f$ is continuous at $c \in A$ and $g$ is continuous at $f(c)$ then $g \circ f : A \to \mathbb{R}$ is continuous at $c$.

Proof: $g$ is continuous at $b = f(c)$, so for a given $\epsilon > 0$, there exists a $\delta > 0$ such that if $|y - b| < \delta$, then $|g(y) - g(b)| < \epsilon$. $f$ is continuous at $c$, so there exists a $\delta_1 > 0$ such that if $|x - c| < \delta_1$, then $|f(x) - f(c)| < \delta$. Combining the two: if $|x - c| < \delta_1$, then $|f(x) - b| < \delta$, and hence $|g(f(x)) - g(f(c))| < \epsilon$, thus completing our proof.
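The $\delta$-chasing in this proof can be watched numerically (a sketch under assumptions - the particular $f$, $g$, $c$, and tolerances below are example choices):

```python
# Composition of continuous functions at concrete points.
# f, g, c, eps, and delta1 are illustrative assumptions.
import math

f = lambda x: x * x          # continuous at c = 2, with f(2) = 4
g = lambda y: math.sqrt(y)   # continuous at f(2) = 4

c, eps = 2.0, 1e-3
delta1 = 1e-4  # |x - c| < delta1 should force |g(f(x)) - g(f(c))| < eps

for k in range(1, 100):
    x = c - delta1 + 2.0 * delta1 * k / 100
    assert abs(g(f(x)) - g(f(c))) < eps
```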

Closed, bounded intervals

Now we give a definition that will be useful (and is quite obvious):

Definition: A function $f : A \to \mathbb{R}$ is said to be bounded on $A$ if there exists some $M > 0$ such that $|f(x)| \leq M$ for all $x \in A$.

In general, a continuous function on an interval need not be bounded, for example the function $1/x$ on $(0,1)$. However, a continuous function on a closed, bounded interval is bounded on that interval:
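To see concretely why $1/x$ on $(0,1)$ defeats any proposed bound (a numerical sketch; the witness points are my own choice), note that for any $M$ the point $x = 1/(M+1)$ lies in $(0,1)$ and has $1/x = M + 1 > M$:

```python
# For any proposed bound M, exhibit a point of (0, 1) where 1/x exceeds M.
# The choice of witness x = 1/(M + 1) is illustrative.
def exceeds_bound(M):
    x = 1.0 / (M + 1.0)  # lies in (0, 1), and 1/x = M + 1 > M
    return 0.0 < x < 1.0 and 1.0 / x > M

assert all(exceeds_bound(M) for M in [1.0, 10.0, 1e6])
```

The failure is only possible because $(0,1)$ is not closed: the troublesome points pile up at the excluded endpoint $0$.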

Theorem: Let $I = [a, b]$ be a closed, bounded interval and $f : I \to \mathbb{R}$ be a continuous function on $I$. Then $f$ is bounded on $I$.

Proof: Suppose $f$ is not bounded. Then for each $M \in \mathbb{N}$ there exists some $x_M \in I$ such that $|f(x_M)| > M$. Thus, $(x_M)$ is a sequence in $I$. But $I$ is a closed and bounded interval, so by Bolzano-Weierstrass it has a convergent subsequence $(x_{M_k})$ that converges to some $x_0 \in I$. As $f$ is continuous on $I$, $f(x_{M_k}) \to f(x_0)$; in particular, $(f(x_{M_k}))$ is convergent, and convergent sequences are bounded. But $|f(x_{M_k})| > M_k \geq k$ for every $k$, so $(f(x_{M_k}))$ is unbounded - a contradiction. Hence $f$ is bounded on $I$.

This is an important result to know as it is used often. And now for another definition:

Definition: Let $A \subseteq \mathbb{R}$, $f : A \to \mathbb{R}$. We say $f$ has an absolute maximum on $A$ if there exists $c \in A$ such that $f(c) \geq f(x)$ for all $x \in A$. Similarly, $f$ has an absolute minimum on $A$ if there exists $d \in A$ such that $f(d) \leq f(x)$ for all $x \in A$.

This is a pretty intuitive definition and doesn't require much thought. Now let's make use of this new definition in the following theorem:

Theorem: Let $I = [a, b]$ be a closed, bounded interval, and $f : I \to \mathbb{R}$ be continuous on $I$. Then $f$ achieves both an absolute maximum and an absolute minimum on $I$.

Proof: By the previous theorem, we know there exists $M > 0$ such that $|f(x)| \leq M$, so $f(x) \leq |f(x)| \leq M$ for all $x \in I$. Thus the set $f(I) = \{f(x) : x \in I\}$ is bounded above, and therefore has a supremum $S$. Another way of saying this is that for each $n \in \mathbb{N}$, $S - 1/n$ is not an upper bound. So we can choose a sequence $(x_n)$ in $I$ such that $S - 1/n < f(x_n) \leq S$. By Bolzano-Weierstrass, there is a convergent subsequence $(x_{n_k})$ that converges to some $x_0 \in I$. Since $S - 1/n_k < f(x_{n_k}) \leq S$, the squeeze theorem gives $f(x_{n_k}) \to S$; and since $f$ is continuous at $x_0$, $f(x_{n_k}) \to f(x_0)$. Hence $f(x_0) = S$, giving us the absolute maximum. The proof for a minimum follows similarly.
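As a numerical companion to the theorem (a sketch, not a proof - the function, interval, and sampling density are illustrative assumptions), dense sampling on a closed, bounded interval approximates the attained maximum:

```python
# On a closed, bounded interval a continuous function attains its supremum;
# dense sampling approximates it. f, a, b, and n are illustrative choices.
def f(x):
    return x * (1.0 - x)  # continuous on [0, 1]; maximum 1/4, attained at x = 1/2

a, b = 0.0, 1.0
n = 100_000
samples = [f(a + (b - a) * k / n) for k in range(n + 1)]
approx_max = max(samples)

# The true maximum f(1/2) = 0.25 is attained, and sampling finds it.
assert abs(approx_max - 0.25) < 1e-6
```

Note that the same sampling scheme on an open interval like $(0,1)$ with $f(x) = 1/x$ would never stabilize: there is no maximum to find, which is exactly what the closed-and-bounded hypothesis rules out.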