1. Lipschitz functions
Last time we talked about the concept of uniform continuity. It turns out that proving a function is uniformly continuous directly from the definition is often difficult, except in special cases. With that in mind, let's introduce a new definition:
Definition: Let $A \subseteq \mathbb{R}$ and $f : A \to \mathbb{R}$. If there exists some $k > 0$ such that $|f(x) - f(y)| \leq k|x - y|$ for all $x, y \in A$, then $f$ is said to be a Lipschitz function on $A$.
What is this saying intuitively? Well, notice that when $x \neq y$ we can rewrite this condition as:
$$ \left| \frac{f(x) - f(y)}{x - y} \right| \leq k $$
which can be interpreted as saying that the slope of the secant line between any two points on the graph of $f$ is bounded by $k$.
What's the significance of Lipschitz functions? Well, it turns out that:
Theorem: If $f : A \to \mathbb{R}$ is Lipschitz on $A$ then $f$ is uniformly continuous on $A$.
Proof: Let $\epsilon > 0$ be given, and let $k > 0$ be a Lipschitz constant for $f$. Set $\delta = \frac{\epsilon}{k}$. Then for any $x, y \in A$ with $|x-y| < \delta$, we have $|f(x) - f(y)| \leq k|x - y| < k \cdot \frac{\epsilon}{k} = \epsilon$ $\blacksquare$
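As a quick numerical sanity check of the proof (a sketch, not a substitute for it), we can take a sample Lipschitz function, say $f(x) = 3x$ with $k = 3$, and verify that $\delta = \epsilon / k$ keeps $|f(x) - f(y)|$ below $\epsilon$:

```python
# Sanity check of the proof: for f(x) = 3x (Lipschitz with k = 3),
# delta = eps / k guarantees |f(x) - f(y)| < eps whenever |x - y| < delta.
import random

k = 3.0
f = lambda x: k * x
eps = 0.01
delta = eps / k

random.seed(0)
for _ in range(10_000):
    x = random.uniform(-100, 100)
    # pick y strictly within delta of x
    y = x + random.uniform(-delta, delta) * 0.999
    assert abs(f(x) - f(y)) < eps
print("delta = eps/k worked for all sampled pairs")
```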
1.1. Showing a function is Lipschitz
Example: Show the function $f(x) = kx$ for $k \in \mathbb{R}$ is Lipschitz on $\mathbb{R}$.
This one is easy: $|f(x) - f(y)| = |kx - ky| = |k||x - y| \leq (|k| + 1)|x-y| = M|x-y|$ for $M = |k| + 1$. (We use $|k| + 1$ rather than $|k|$ so that the Lipschitz constant is strictly positive even when $k = 0$.) There's nothing more to it.
Example: Show $f(x) = x^2$ is Lipschitz on $[0, b]$ for any $b > 0$.
$|f(x) - f(y)| = |x^2 - y^2| = |x+y||x-y| \leq (|x| + |y|)|x-y| \leq 2b|x-y|$ as $x, y \in [0, b]$.
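We can illustrate this bound numerically (a sketch, with an arbitrary choice of $b$): the secant slope $|x^2 - y^2| / |x - y| = |x + y|$ between sampled points of $[0, b]$ never exceeds $2b$.

```python
# Numerically check the Lipschitz bound for f(x) = x^2 on [0, b]:
# the secant slope |x^2 - y^2| / |x - y| = x + y is at most 2b.
import random

b = 5.0
random.seed(1)
max_slope = 0.0
for _ in range(100_000):
    x, y = random.uniform(0, b), random.uniform(0, b)
    if abs(x - y) > 1e-9:  # avoid catastrophic cancellation in the quotient
        slope = abs(x * x - y * y) / abs(x - y)
        max_slope = max(max_slope, slope)

assert max_slope <= 2 * b
print(f"largest sampled slope: {max_slope:.4f} <= 2b = {2 * b}")
```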
However, $x^2$ is not Lipschitz on $[0, \infty)$. Intuitively, the slope of $x^2$ grows without bound as $x \to \infty$, so it cannot be bounded by any single constant. In fact, $x^2$ is not even uniformly continuous on $[0, \infty)$, which by the theorem above implies it is not Lipschitz there. We can show this rigorously as follows: let $\epsilon = \frac{1}{2}$. For any $\delta > 0$ there exists $M \in \mathbb{N}$ such that $0 < \frac{1}{M} < \delta$. Let $x = M + \frac{1}{M}$ and $y = M$, so $|x-y| = \frac{1}{M} < \delta$ and:
$$ |f(x) - f(y)| = \left| M^2 + 2 + \frac{1}{M^2} - M^2 \right| = 2 + \frac{1}{M^2} > \frac{1}{2} = \epsilon $$
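The unbounded-slope intuition is easy to see numerically (a sketch): the secant slope of $x^2$ between $y = M$ and $x = M + \frac{1}{M}$ is exactly $2M + \frac{1}{M}$, which grows without bound as $M$ increases.

```python
# Secant slopes of f(x) = x^2 between y = M and x = M + 1/M grow without
# bound as M increases, so no single Lipschitz constant can work on [0, inf).
f = lambda x: x * x

for M in [1, 10, 100, 1000]:
    x, y = M + 1 / M, M
    slope = (f(x) - f(y)) / (x - y)  # equals 2M + 1/M exactly
    print(f"M = {M:5d}: slope = {slope:.4f}")
```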
Note that not every uniformly continuous function is Lipschitz. Consider $g(x) = \sqrt{x}$ on the interval $[0, 2]$. Since the interval is closed and bounded and $g$ is continuous, $g$ is uniformly continuous on $[0,2]$. However, suppose there exists some $k > 0$ such that $|g(x) - g(y)| \leq k|x-y|$ for all $x, y \in [0,2]$ (i.e. $g$ is Lipschitz on the interval). Taking $y = 0$, there must exist some $k > 0$ such that $\sqrt{x} \leq kx$ for all $x \in [0,2]$. We'll handle two cases:
(i) $k \leq 1$: then $\sqrt{x} \leq kx \leq x$. If $x = \frac{1}{4}$ then $\frac{1}{2} \leq \frac{1}{4}$. Contradiction.
(ii) $k > 1$: take $x = \frac{1}{k^4}$, so:
$$ \frac{1}{k^2} \leq k\frac{1}{k^4} \implies k \leq 1 $$
which is another contradiction. Hence $\sqrt{x}$ is not Lipschitz on $[0,2]$.
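Numerically, the failure is visible at the left endpoint (a sketch, not part of the proof above): the secant slope of $\sqrt{x}$ from $0$ to $x$ is $\frac{\sqrt{x}}{x} = \frac{1}{\sqrt{x}}$, which blows up as $x \to 0^+$.

```python
# The secant slope of g(x) = sqrt(x) between 0 and x is 1/sqrt(x),
# which is unbounded as x -> 0+, so no Lipschitz constant works on [0, 2].
import math

for x in [1.0, 1e-2, 1e-4, 1e-6]:
    slope = (math.sqrt(x) - math.sqrt(0)) / (x - 0)
    print(f"x = {x:8.0e}: slope = {slope:.1f}")
```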
This ends our coverage of section 5.4. We're skipping 5.5 because it covers gauges, which are only relevant to the book's treatment of integration; we will instead develop integration via Riemann sums.
2. Increasing and decreasing functions
This is pretty much a repeat of increasing/decreasing/monotone sequences, but nonetheless we provide the following definitions:
Definition: Let $A \subseteq \mathbb{R}$ and $f : A \to \mathbb{R}$. $f$ is said to be increasing on $A$ if for all $x_1, x_2 \in A$, $x_1 \leq x_2$ implies $f(x_1) \leq f(x_2)$. $f$ is said to be strictly increasing on $A$ if $x_1 < x_2$ implies $f(x_1) < f(x_2)$. The definitions of decreasing and strictly decreasing follow similarly.
Definition: A function $f : A \to \mathbb{R}$ is said to be monotone on $A$ if it is either increasing on $A$ or decreasing on $A$. It is strictly monotone if it is either strictly increasing or strictly decreasing on $A$.
A pretty trivial observation is that if $f$ is decreasing then $-f$ is increasing and vice-versa.
While increasing/decreasing functions are not necessarily continuous, they are "almost" continuous in the following sense:
Theorem: Let $I \subseteq \mathbb{R}$ be an interval, $f : I \to \mathbb{R}$ be monotone increasing, and $c \in I$ such that $c$ is not an endpoint of $I$. Then:
$$ \text{(i)} \lim_{x \to c^-} f(x) = \sup\{f(x) : x \in I, x < c\} \\ \text{(ii)} \lim_{x \to c^+} f(x) = \inf\{f(x) : x \in I, x > c\} $$
Proof: We'll show (i), as the proof for (ii) follows similarly. Define $A := \{f(x) : x \in I, x < c\}$. Since $c$ is not an endpoint, $A \neq \varnothing$. Since $f$ is increasing, $A$ is bounded above by $f(c)$, hence $s = \sup A$ exists. We will show that $\displaystyle \lim_{x \to c^-} f(x) = s$. For a given $\epsilon > 0$, $s - \epsilon$ is not an upper bound for $A$, so there exists $x_0 \in I$ with $x_0 < c$ such that $s - \epsilon < f(x_0) \leq s$. Let $\delta = c - x_0$. If $0 < c - x < \delta = c - x_0$, then $x_0 < x < c$, so $f(x_0) \leq f(x)$ and $f(x) \leq s$ (as $f(x) \in A$). Thus $s - \epsilon < f(x_0) \leq f(x) \leq s$, and therefore $|f(x) - s| = s - f(x) < \epsilon$ $\blacksquare$
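To see the theorem in action on a monotone function that is *not* continuous, here is a sketch with a hypothetical step function: on $I = (0, 2)$, let $f(x) = x$ for $x < 1$ and $f(x) = x + 1$ for $x \geq 1$. At $c = 1$ the left limit is $\sup\{f(x) : x < 1\} = 1$ and the right limit is $\inf\{f(x) : x > 1\} = 2$, so $f$ jumps at $c$ yet both one-sided limits exist.

```python
# A monotone increasing step on I = (0, 2): f(x) = x for x < 1, x + 1 for x >= 1.
# At c = 1: sup{f(x) : x < 1} = 1 (left limit), inf{f(x) : x > 1} = 2 (right limit).
def f(x):
    return x if x < 1 else x + 1

c = 1.0
h = 1e-9
left = f(c - h)   # approaches the sup from below
right = f(c + h)  # approaches the inf from above
print(f"left limit ~ {left:.6f}, right limit ~ {right:.6f}, f(c) = {f(c)}")
```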