Final review CC-BY-NC

Maintainer: admin

The final examination will be held from 6pm-9pm on Wednesday, December 19, in Burnside basement (the specific room depends on your last name and which section you're registered for).

Sections covered: everything up to and including section 5.2.

Format of the exam: 6 multi-part short-answer questions (no multiple choice). Questions will be similar to those seen in the assignments, the homework and worked examples in the textbook.

Source: the content below is curated from the course textbook (Introduction to Real Analysis by Robert G. Bartle and Donald R. Sherbert). It omits much of the content and mainly attempts to summarise the important parts, and although certain (small) sections are copied verbatim from the text this usage should constitute fair dealing. Any errors or omissions should be assumed to be the fault of the author of this page (@dellsystem) and not of the textbook itself. If you have any questions or want to make a correction, feel free to either contact @dellsystem or edit the page directly.

Other useful resources when preparing for the final:

1Preliminaries

1.1Sets and functions

1.1.1Sets

easy

1.1.2Functions

Inverse image of a subset $H$ (of the codomain) under $f$: all the things in the domain of $f$ that map to something in $H$. Direct image of a subset $G$ of the domain: all the things that get mapped to by things in $G$.

1.1.3Exercises

Answers to select homework exercises

1.2Mathematical induction

Well-ordering property

Proof by induction (easy)

Some useful formulae:

  • $1 + 2 + \ldots + n = \frac{1}{2} n (n+1)$
  • $1^2+2^2+\ldots+n^2 = \frac{1}{6} n(n+1)(2n+1)$
  • $a-b$ divides $a^n-b^n$
  • $2^n > 2n+1$ for $n \geq 3$
  • $2^n \leq (n+1)!$
  • $\displaystyle 1 + r + r^2 + \ldots + r^n = \frac{1-r^{n+1}}{1-r}$ (for $r \neq 1$)
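
For instance, the induction step for the first formula (a quick sketch of the standard argument): assuming it holds for $n$,

$$1 + 2 + \ldots + n + (n+1) = \frac{1}{2}n(n+1) + (n+1) = \frac{1}{2}(n+1)(n+2)$$

which is the formula with $n+1$ in place of $n$; the base case $n=1$ is immediate.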

1.2.1Exercises

Answers to select homework exercises

1.3Finite and infinite sets

Countably infinite or denumerable: bijection with $\mathbb N$.

Rationals, $\mathbb N \times \mathbb N$ (and more generally $\mathbb N^n$), unions of countably many countable sets, etc. are all countable.

Cantor's theorem: there is no surjection from $A$ onto its power set. Proof by contradiction: consider the set of all elements that don't belong to the set they're mapped to; nothing can map to that set.

1.3.1Exercises

Answers to select homework exercises

2The real numbers

2.1The algebraic and order properties of R

2.1.1The algebraic properties

A1: Commutativity of addition
A2: Associativity of addition
A3: Existence of zero
A4: Existence of negatives
M1: Commutativity of multiplication
M2: Associativity of multiplication
M3: Existence of one
M4: Existence of reciprocals (for non-zero elements)
D: Distributivity

2.1.2Rationals and irrationals

Proving that $\sqrt{2}$ is irrational: Assume that $\frac{p}{q} = \sqrt{2}$ for some $p, q \in \mathbb Z$, where $p$ and $q$ are positive and have no common factors other than 1 (so the fraction is written in its lowest form). Then, if we square both sides, we get that $p^2/q^2 = 2$, so $p^2 = 2q^2$. So $p^2$ is clearly even. This implies that $p$ is even as well, for if it were odd, its square would be odd as well. What about $q$? Well, $q$ must be odd, otherwise it would have 2 as a factor and thus $p$ and $q$ would have a common factor other than 1, contradicting one of the premises. So $q$ must be odd. However, since $p$ is even, then $p = 2m$ for some $m \in \mathbb N$, and hence $p^2 = 2q^2$ implies $4m^2 = 2q^2$ and so $q^2 = 2m^2$. So $q^2$ is even, which means that $q$ is even as well. So we get a contradiction, which tells us that $\sqrt{2}$ is irrational.

To prove that the square root of a prime other than 2 is irrational, just use divisibility by that prime instead of even/odd properties.

2.1.3The order properties

There is a set of positive numbers $\mathbb P$ which is closed under addition and multiplication. Furthermore, we have the trichotomy property: any real number is either positive, negative, or 0. We can define > and < in terms of membership in $\mathbb P$: if $a - b \in \mathbb P$, then $a > b$. Properties of > are fairly easy to derive.

Theorem: If $a \in \mathbb R$ is such that $0 \leq a < \epsilon$ for every $\epsilon > 0$, then $a = 0$. Proof: if $a > 0$, then letting $\epsilon = \frac{1}{2}a$, we have that $0 < \epsilon < a$ which contradicts the premise. So $a = 0$.

2.1.4Solving inequalities

If it's linear, it should be easy. If it's quadratic, factor it and check the cases. If it's a fraction, make the constant term 0 (move things over to the other side) and check cases.

2.1.5The arithmetic-geometric mean inequality

$$\sqrt{ab} \leq \frac{1}{2}(a +b)$$

The geometric mean is the left side, the arithmetic mean is the right (here $a$ and $b$ are non-negative). The proof of this comes from the fact that $(\sqrt{a} - \sqrt{b})^2 \geq 0$. Then we just need to expand that square and move things around.
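
Spelled out, the expansion looks like this:

$$0 \leq (\sqrt{a} - \sqrt{b})^2 = a - 2\sqrt{ab} + b \quad\Longrightarrow\quad 2\sqrt{ab} \leq a + b \quad\Longrightarrow\quad \sqrt{ab} \leq \frac{1}{2}(a+b)$$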

Equality occurs if and only if $a = b$. We can prove this by squaring both sides of $\sqrt{ab} = \frac{1}{2}(a+b)$ and then multiplying by 4, from which we get that $(a-b)^2 = 0$ and so $a-b=0$ which means $a=b$.

2.1.6Bernoulli's inequality

$$(1+x)^n \geq 1+nx$$

for all $n\in \mathbb N$ and $x > -1$. Proof: by induction.
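
The induction step, written out (multiplying by $1+x$ preserves the inequality precisely because $x > -1$):

$$(1+x)^{n+1} = (1+x)^n(1+x) \geq (1+nx)(1+x) = 1 + (n+1)x + nx^2 \geq 1 + (n+1)x$$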

2.1.7Exercises

Answers to select homework exercises

2.2Absolute value and the real line

2.2.1Absolute value

Some properties of the absolute value operator:

(a) $|ab| = |a||b|$
(b) $|a|^2 = a^2$
(c) $-c \leq a \leq c$ implies $|a| \leq c$ and vice versa (for $c \geq 0$)
(d) $-|a| \leq a \leq |a|$

2.2.2The triangle inequality

$$|a+b| \leq |a| + |b|$$

Proof: Use the fact that $-|a| \leq a \leq |a|$ and $-|b| \leq b \leq |b|$ (from (d)). Adding those together gives us

$$-(|a| + |b|) \leq a+b \leq |a| + |b|$$

and so $|a + b| \leq |a| + |b|$ by (c).

Other variations:

  • $||a| - |b|| \leq |a-b|$. We derive this by writing $a$ as $a - b + b = (a-b) + b$ and then applying the triangle inequality to get $|(a-b) + b| \leq |a-b| + |b|$. Then we subtract $|b|$ from both sides to get $|(a-b) + b| - |b| \leq |a-b|$ which is really just $|a| - |b| \leq |a-b|$. We then do the same thing with $b = b - a + a$, resulting in $-|a-b| = -|b-a| \leq |a| - |b|$. Combining the two inequalities gives us $-|a-b| \leq |a| - |b| \leq |a-b|$ and so by (c) we have $||a|-|b|| \leq |a-b|$.
  • $|a-b| \leq |a| + |b|$. Derive this by replacing $b$ with $-b$.

2.2.3Solving inequalities involving absolute value

When trying to find the set of all $x$ that satisfy some inequality, you can either consider cases, and take the union of the resulting sets, or use the fact that $a < b$ if and only if $a^2 < b^2$ for non-negative $a$ and $b$. The case-by-case approach is the more general one.

When finding a constant $M$ such that $|f(x)| \leq M$ for all $x$ in some interval: usually $f(x)$ will be a fraction. Then we just use techniques like the triangle inequality, multiplying by the conjugate, etc. We find an upper bound for the (absolute value of the) numerator and a lower bound for the (absolute value of the) denominator; the reciprocal of the denominator's lower bound is an upper bound, and multiplying the two upper bounds together gives $M$.
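
A small made-up example (not from the text): bound $f(x) = \dfrac{2x+3}{x^2+1}$ on $[1, 3]$. The numerator satisfies $|2x+3| \leq 2 \cdot 3 + 3 = 9$ and the denominator satisfies $x^2 + 1 \geq 1^2 + 1 = 2$ there, so

$$|f(x)| \leq \frac{9}{2} \quad \text{for all } x \in [1, 3]$$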

2.2.4Neighbourhoods

The $\epsilon$ neighbourhood of $a$ is the set $V_{\epsilon}(a) = \{x \in \mathbb R: |x-a| < \epsilon\}$ (where $\epsilon > 0$ is given). If $x$ is in the $\epsilon$ neighbourhood of $a$ for every $\epsilon$, then $x = a$.

2.2.5Exercises

Answers to select homework exercises

2.3The completeness property of R

2.3.1Suprema and infima

Upper bound $u$: $s \leq u$ for all $s \in S$

Lower bound $w$: $s \geq w$ for all $s \in S$

Supremum $u$: least upper bound. First, $u$ is an upper bound of $S$, and if $v$ is any upper bound of $S$, then $v \geq u$. Another formulation: if $v < u$, then there exists $s \in S$ such that $v < s$ (meaning that anything less than $u$ is not an upper bound). Alternatively: for every $\epsilon > 0$ there exists $s \in S$ such that $u - \epsilon < s$.

2.3.2The completeness property

Every nonempty set of real numbers that has an upper bound also has a supremum in $\mathbb R$.

2.3.3Exercises

Answers to select homework exercises

2.4Applications of the supremum property

2.4.1Properties of suprema and infima

If $u = \sup S$, then $\sup\{a + s: s \in S\} = a + u$. Proof: easy, use definition.

If $a \leq b$ for all $a \in A, b \in B$, then $\sup A \leq \inf B$. Proof: same.

2.4.2Bounds of functions

Just use the range. Pretty trivial.

2.4.3The Archimedean property

For any $x \in \mathbb R$, there exists $n \in \mathbb N$ such that $n \geq x$. Proof: by contradiction. If this is not true, then $\mathbb N$ has an upper bound, and thus it has a supremum in $\mathbb R$ (by the completeness property), which we will denote $u$. Consider $u-1$. This number is smaller than the supremum, so it's not an upper bound of $\mathbb N$, meaning that there exists $m \in \mathbb N$ such that $u-1 < m$. If we add 1 to both sides we get $u < m + 1$. But $m + 1 \in \mathbb N$. Then $u$ is not an upper bound of $\mathbb N$, since there is something in $\mathbb N$ that is greater than $u$. So $\mathbb N$ does not have an upper bound. $\blacksquare$

Using this property, we can prove that the infimum of $\{\frac{1}{n}:n\in \mathbb N \}$ is 0. We know it's not an empty set, and since it's bounded below (by 0), it must have an infimum. We will denote this infimum by $w \geq 0$. For any $\epsilon > 0$, the Archimedean Property tells us that there exists an $n$ such that $n > \frac{1}{\epsilon}$ and so $\epsilon > \frac{1}{n}$. Then we know that $0 \leq w < \epsilon$ for every $\epsilon > 0$ and so $w = 0$.

Some (other) corollaries of the Archimedean property:

  • If $t > 0$, there exists $n \in \mathbb N$ such that $0 < \frac{1}{n} < t$. Proof: since $\inf\{1/n : n \in \mathbb N\} = 0$ and $t > 0$, $t$ is not a lower bound of that set, so some $1/n$ is below it.
  • If $y > 0$, there exists $n \in \mathbb N$ such that $n - 1 \leq y < n$. Proof: the set $\{m \in \mathbb N: y < m\}$ is non-empty (by the Archimedean property), so by the well-ordering property it has a least element $n$; since $n-1$ is not in the set, $n - 1 \leq y < n$.

2.4.4The existence of irrationals

We can use the supremum property to show that one irrational, $\sqrt{2}$, exists. Let $S = \{s \in \mathbb R: 0 \leq s, s^2 \leq 2 \}$. Now, we know the set is not empty; for instance, 1 is in the set. Also, the set is bounded above (for instance, 2 is an upper bound: if $t > 2$, then $t^2 > 4$ and so $t \notin S$). Then, by the supremum property, $S$ has a supremum $x \in \mathbb R$. We will prove that $x = \sqrt{2}$, i.e. $x^2 = 2$, by ruling out the other possibilities: $x^2 < 2$ and $x^2 > 2$.

To show that $x^2 < 2$ leads to a contradiction, we find an $n \in \mathbb N$ such that $x + \frac{1}{n} \in S$, which would imply that $x$ is not an upper bound for $S$. How do we choose $n$? Well, note that $1/n^2 \leq 1/n$, so that

$$\left ( x + \frac{1}{n} \right )^2 = x^2 + \frac{2x}{n} + \frac{1}{n^2} \leq x^2 + \frac{1}{n} (2x+1)$$

So, if we choose $n$ such that

$$\frac{1}{n}(2x+1) < 2-x^2$$

then we have that $(x+1/n)^2 < x^2 + (2-x^2) = 2$. We can use the Archimedean property to find such an $n$.

Similarly, if $x^2 > 2$, we have $(x-\frac{1}{m})^2 = x^2 - \frac{2x}{m} + \frac{1}{m^2} > x^2 - \frac{2x}{m}$, so we just need to choose $m$ (via the Archimedean property) with $\frac{2x}{m} < x^2 - 2$; then $(x - \frac{1}{m})^2 > 2$, which makes $x - \frac{1}{m}$ an upper bound of $S$ smaller than $x$, contradicting $x = \sup S$. So $x^2=2$. In fact, we can do this with any positive real number $a$ instead of 2.

2.4.5The density of rational numbers

Qualitatively, the fact that $\mathbb Q$ is dense in $\mathbb R$ means that between any two different real numbers there is a rational number. (In fact, there are infinitely many rational numbers between them, but we're concerned with finding just one for now.)

The density theorem: if $x$ and $y$ are any real numbers with $x < y$, then there exists a rational number $r$ such that $x < r < y$. Proof: since $y > x$, $y-x > 0$. By a corollary of the Archimedean property, we know that there exists an $n \in \mathbb N$ such that $1/n < y-x$, so $1 < ny - nx$, i.e., $nx + 1 < ny$. Applying another corollary of the Archimedean property (we may assume $x > 0$, so that $nx > 0$; the other cases reduce to this one), we get that there exists $m \in \mathbb N$ such that $m-1 \leq nx < m$. Consequently, $m \leq nx+1 < ny$, and so $nx < m < ny$. Then if we let $r = m/n$, $x < r < y$ is satisfied.

2.4.6The density of irrational numbers

The irrational numbers $\mathbb R \setminus \mathbb Q$ are dense as well!! To prove this, we apply the density theorem to the real numbers $x/\sqrt{2}$ and $y/\sqrt{2}$, so we get that there exists a rational number $r$ such that

$$\frac{x}{\sqrt{2}} < r < \frac{y}{\sqrt{2}}$$

Then, $r' = r\sqrt{2}$ is irrational (we can arrange $r \neq 0$, e.g. by applying the density theorem again if it hands us 0) and satisfies $x < r' < y$.

2.4.7Exercises

Answers to select homework exercises

2.5Intervals

2.5.1The characterisation theorem

If $S$ is a subset of $\mathbb R$ that contains at least two points and has the property that if $x, y \in S$ and $x < y$, then $[x, y] \subseteq S$, then $S$ is an interval.

Proof: by cases. First, the case where $S$ is bounded, and thus has an infimum $a$ and a supremum $b$. Then $S \subseteq [a, b]$, and also $(a, b) \subseteq S$, so $S$ is one of the four intervals with endpoints $a$ and $b$. The unbounded cases are handled similarly.

2.5.2Nested intervals

Nested interval property: a nested sequence of closed bounded intervals $I_n = [a_n, b_n]$, $n \in \mathbb N$, always has a common point $x \in \mathbb R$. Proof: the set of left endpoints $\{a_n\}$ is bounded above (by nestedness, every $b_m$ is an upper bound for it), so it has a supremum $x$; then $a_n \leq x \leq b_n$, i.e. $x \in I_n$, for all $n$. Also, this common point is unique provided $\inf\{b_n - a_n: n \in \mathbb N\} = 0$.

2.5.3The uncountability of R

The unit interval $I = [0, 1]$ is uncountable. Proof using nested intervals: if it were countable, we could enumerate its elements as $I = \{x_1, x_2, \ldots\}$. Then we select a closed subinterval $I_1$ of $I$ such that $x_1 \notin I_1$, and a closed subinterval $I_2$ of $I_1$ such that $x_2 \notin I_2$, and so on. Then, we obtain nonempty closed intervals that satisfy

$$I \supseteq I_1 \supseteq I_2 \supseteq \ldots \supseteq I_n \supseteq \ldots$$

By the nested interval property, there is a point $\xi$ contained within all of these intervals. But then $\xi \in I$ while $\xi \neq x_n$ for every $n$, contradicting the assumption that the enumeration listed all of $I$. Thus, uncountable.

2.5.4Binary representations

If something is in the left half, 0; right half, 1. Pretty trivial.

2.5.5Periodic decimals

A positive real number is rational if and only if its decimal representation is periodic.

Proof of $\to$: long division, get a cycle, decimals repeat.

Proof of $\leftarrow$: multiply the decimal by a power of ten (shifting by one period), subtract the original, get an integer. Use that to write the fraction.
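
A quick illustration of the $\leftarrow$ direction (my own example): for $x = 0.313131\ldots$, the repeating block has length 2, so

$$100x - x = 31.3131\ldots - 0.3131\ldots = 31 \quad\Longrightarrow\quad x = \frac{31}{99}$$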

2.5.6Cantor's diagonal argument

Another proof of uncountability (also due to Cantor) uses decimal expansions: given any purported enumeration, create a new number whose $n$th digit differs from the $n$th digit of the $n$th number in the list; it can't equal any number in the enumeration. QED.

2.5.7Exercises

Answers to select homework exercises

3Sequences and series

3.1Sequences and their limits

3.1.1Definition of the limit

A sequence $X = (x_n)$ in $\mathbb R$ is said to converge to $x \in \mathbb R$ (with $x$ being the limit of $(x_n)$) if, for every $\epsilon > 0$, there exists $K(\epsilon) \in \mathbb N$ such that for all $n \geq K(\epsilon)$, $|x_n - x| < \epsilon$.

Proof of uniqueness: suppose both $x'$ and $x''$ are limits. Then we have $|x_n - x'| < \epsilon/2$ for all $n \geq K'$, and $|x_n - x''| < \epsilon / 2$ for all $n \geq K''$. Let $K$ be the larger of $K'$ and $K''$. Then, by the triangle inequality, for $n \geq K$:

$$|x' - x''| = |x' - x_n + x_n - x''| \leq |x' - x_n| + |x_n - x''| < \epsilon/2 + \epsilon/2 = \epsilon$$

Since $\epsilon > 0$ was arbitrary, $|x' - x''| = 0$ and so $x' = x''$.

3.1.2Equivalent limit definitions

(a) $X$ converges to $x$
(b) For every $\epsilon > 0$, there exists a natural number $K$ such that for all $n \geq K$, the terms $x_n$ satisfy $|x_n-x| < \epsilon$.
(c) For every $\epsilon > 0$, there exists a natural number $K$ such that for all $n \geq K$, the terms $x_n$ satisfy $x-\epsilon < x_n < x + \epsilon$.
(d) For every $\epsilon$-neighbourhood $V_{\epsilon}(x)$ of $x$, there exists a natural number $K$ such that for all $n\geq K$, the terms $x_n$ belong to $V_{\epsilon}(x)$.

Proof: (a) $\to$ (b) is just the definition. (b) $\leftrightarrow$ (c) uses a pretty trivial property of the absolute value operator ($|x_n-x| < \epsilon$ iff $x-\epsilon < x_n < x+\epsilon$). (c) $\leftrightarrow$ (d) is just how $\epsilon$-neighbourhoods are defined. So this is pretty trivial.

3.1.3Proving the limit of a sequence

Some examples, using the definition.

(a) $\lim(1/n) = 0$. If $\epsilon > 0$, then $1/\epsilon > 0$. By the Archimedean property, there exists a natural number $K$ such that $K > 1/\epsilon$. Multiplying both sides by $1/K$ and $\epsilon$ gives us that $\epsilon > 1/K$. Then, if $n \geq K$, we have $1/n \leq 1/K < \epsilon$. Then:

$$\left | \frac{1}{n} - 0 \right | = \frac{1}{n} < \epsilon$$

(b) $\lim(1/(n^2+1)) = 0$. Use inequalities, show that it's less than $1/n$, etc.

(c) $\displaystyle \lim\left ( \frac{3n+2}{n+1} \right ) = 3$. First we simplify the expression:

$$\left | \frac{3n+2}{n+1}-3 \right | = \left | \frac{3n+2-3(n+1)}{n+1} \right | = \frac{1}{n+1} < \frac{1}{n}$$

so the same choice of $K$ as in (a) works.

(d) If $0 < b < 1$, then $\lim(b^n) = 0$. We can prove this using properties of the natural logarithm function (lol?) to find a $K$ for which it works: $b^n < \epsilon$ iff $n \ln b < \ln \epsilon$ iff $n > \ln\epsilon / \ln b$ (the inequality flips because $\ln b < 0$). So take $K > \ln\epsilon / \ln b$.

Some other examples, using the following theorem.

Theorem: If $|x_n-x| \leq Ca_n$ where $C > 0$ is a constant and $a_n$ is something with a limit of 0, then $\lim(x_n) = x$. Proof: use $\epsilon/C$, show that $a_n < \epsilon/C$, then we get that $|x_n-x| \leq Ca_n < C(\epsilon/C) = \epsilon$.

(a) If $a > 0$, then $\displaystyle \lim \left ( \frac{1}{1+na} \right ) = 0$. Proof: use the fact that it's bounded above by $(1/a) \cdot (1/n)$.

(b) If $0 < b < 1$, then $\lim(b^n) = 0$. Proof: similar to above. Write $b = 1/(1+a)$ with $a > 0$, use Bernoulli's inequality, dominate it by $1/(na)$.

(c) If $c > 0$, then $\lim(c^{1/n}) = 1$. Trivial for $c=1$. For $c>1$, we write $c^{1/n} = 1 + d_n$ for some $d_n > 0$. Then, by Bernoulli's inequality, we have that

$$c = (1+d_n)^n \geq 1+nd_n$$

and so $c-1 \geq nd_n$, which tells us that $d_n \leq (c-1)/n$. So then we have

$$|c^{1/n} - 1| = d_n \leq (c-1)\frac{1}{n}$$

so by the theorem above (with $C = c-1$ and $a_n = 1/n$) the limit is just 1.

For $0 < c < 1$, we write it as $c^{1/n} = 1/(1+h_n)$ with $h_n > 0$. Use Bernoulli's inequality again, dominate $|c^{1/n} - 1|$ by a constant times $1/n$, etc.

(d) $\lim(n^{1/n}) = 1$. Proof: write $n^{1/n}$ as $1 + k_n$ for some $k_n \geq 0$, apply the binomial theorem to $n = (1+k_n)^n$ and keep only the first term and the $k_n^2$ term, getting $k_n^2 \leq 2/n$ (for $n > 1$). Then, if $\epsilon$ is given, there exists a natural number $N$ such that $2/N < \epsilon^2$. Then for $n \geq \sup\{2, N\}$ we have $2/n < \epsilon^2$ and so

$$0 < n^{1/n} - 1 = k_n \leq (2/n)^{1/2} < \epsilon$$
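
The binomial-theorem step used above, written out: keeping only the first term and the $k_n^2$ term,

$$n = (1+k_n)^n \geq 1 + \frac{n(n-1)}{2}k_n^2 \quad\Longrightarrow\quad n - 1 \geq \frac{n(n-1)}{2}k_n^2 \quad\Longrightarrow\quad k_n^2 \leq \frac{2}{n} \quad (n > 1)$$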

3.1.4Disproving convergence to a number

If we want to prove that a sequence does not converge to a given number $x$, we just need to find a particular $\epsilon > 0$ such that no matter what value we use for $K$, there is some $n \geq K$ for which $|x_n - x| < \epsilon$ is not satisfied.

3.1.5Tails of sequences

The $m$-tail of a sequence converges if and only if the original sequence converges. Proof: just take the epsilon definition for both the tail and the original and see what you can do. (Hint: add $m$ to $K$.)

3.1.6Exercises

Answers to select homework exercises

3.2Limit theorems

3.2.1Bounded sequences

A sequence is bounded if there exists a real number $M > 0$ such that $|x_n| \leq M$ for all $n$. To prove that convergent sequences are bounded, let $\epsilon = 1$ (or just let $\epsilon$ remain $\epsilon$, up to you). So we know that $|x_n - x| < 1$ for all $n \geq K$. Then, applying the triangle inequality, we get: $|x_n| = |x_n - x + x| \leq |x_n - x| + |x| < 1 + |x|$ for all $n \geq K$. So if we take $M = \sup\{|x_1|, |x_2|, \ldots, |x_{K-1}|, 1+|x|\}$, then $|x_n| \leq M$ for all $n$. $\blacksquare$

3.2.2Combining sequences

Adding, subtracting, multiplying, and multiplying by constants preserves the limit (for convergent sequences). For quotients, the denominator's limit can't be zero.

Proof for addition/subtraction: triangle inequality, using $\epsilon/2$, and $K = \sup\{K_1, K_2\}$. Specifically: $|(x_n + y_n) - (x+y)| = |(x_n -x) + (y_n -y)| \leq |x_n-x| + |y_n - y| < \epsilon/2 + \epsilon/2 = \epsilon$. $\blacksquare$

Proof for multiplication: $|x_ny_n - xy| = |(x_ny_n - x_ny) + (x_ny - xy)| \leq |x_n(y_n-y)| + |(x_n-x)y| = |x_n||y_n-y| + |x_n-x||y|$. We know that $|x_n|$ has an upper bound $M_1$. So let $M = \sup\{M_1, |y|\}$. So then we have $|x_ny_n - xy| \leq M|y_n-y| + M|x_n-x|$ and using $\epsilon/(2M)$ we get $|x_ny_n - xy| < M(\epsilon/(2M)) + M(\epsilon/(2M)) = \epsilon$. $\blacksquare$

Proof for quotients: first show that the sequence of reciprocals $(1/y_n)$ converges to $1/y$ (use the fact that $|y_n| \geq \frac{1}{2}|y|$ for all large enough $n$, so that $|1/y_n - 1/y| = \frac{|y - y_n|}{|y_n||y|} \leq \frac{2}{|y|^2}|y_n - y|$), then apply the product rule.

3.2.3Theorems for limits

If $(x_n)$ converges and $x_n \geq 0$ for all $n$, then $\lim(x_n) \geq 0$. Proof: by contradiction; assume the limit $x < 0$; then $-x$ is positive, so set $\epsilon = -x$. But then for large $n$ we get $x_n < x + \epsilon = 0$, contradicting $x_n \geq 0$.

If $x_n \leq y_n$, $\lim(x_n) \leq \lim(y_n)$. Proof: $Z = Y - X$. The limit of $Z$ is $\geq 0$ by the previous theorem.

If $a \leq x_n \leq b$, $a \leq \lim(x_n) \leq b$. Easy - use the constant sequences $A$ and $B$.

If the sequence converges, the sequence of absolute values converges. Proof: triangle inequality. It's short.

If a non-negative sequence converges, the sequence of square roots converges to the square root of the limit. Proof: use $\epsilon^2$ when the limit is 0; otherwise multiply by the conjugate.
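
For the case where the limit $x$ is strictly positive, the conjugate trick gives

$$|\sqrt{x_n} - \sqrt{x}| = \frac{|x_n - x|}{\sqrt{x_n} + \sqrt{x}} \leq \frac{1}{\sqrt{x}}|x_n - x|$$

so the difference is dominated by a constant multiple of $|x_n - x|$.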

3.2.4Squeeze theorem

Suppose that $x_n \leq y_n \leq z_n$ and $\lim(x_n) = \lim(z_n)$. Then $(y_n)$ converges and its limit is the same.

Proof: If $w$ is the limit, then $|x_n-w| < \epsilon$, and $|z_n -w| <\epsilon$. From the inequality in the hypothesis, we have that $x_n-w \leq y_n- w \leq z_n-w$. Also, $-\epsilon < x_n - w < \epsilon$ and $-\epsilon < z_n -w < \epsilon$. So then $-\epsilon < y_n-w < \epsilon$ and so $|y_n-w| < \epsilon$ for all $\epsilon$ which shows that $w$ is the limit.

Example uses:

  • $\displaystyle \lim \left ( \frac{\sin n}{n} \right ) = 0$. Since $\sin(n)$ is bounded between -1 and 1, then $-1/n \leq \sin n / n \leq 1/n$. Squeeze theorem that shit, the limit is 0.

3.2.5Proving that a sequence is divergent

(a) $(n)$ is divergent. Proof: Archimedean property.

(b) $((-1)^n)$ is divergent. Proof: play the $K$-$\epsilon$ game and let $\epsilon = 1$.

3.2.6Ratio test

If $(x_n)$ is a sequence of strictly positive numbers and $L = \lim(x_{n+1}/x_n)$ exists (it must exist) with $L < 1$, then $\lim(x_n) = 0$.

Proof: Let $r$ be a number such that $L < r < 1$, and let $\epsilon = r - L > 0$. Then there exists $K \in \mathbb N$ such that if $n \geq K$, $|x_{n+1}/x_n - L| < \epsilon$. Then, if we add $L$ to both sides and use the triangle inequality, we get

$$\frac{x_{n+1}}{x_n} < L+ \epsilon = L + (r-L) = r$$

So then we have $0 < x_{n+1} < x_nr < x_{n-1}r^2 < \ldots < x_Kr^{n-K+1}$. Then if we set $C = x_K / r^K$, we see that $0 < x_{n+1} < Cr^{n+1}$. Since $0 < r < 1$, then $\lim(r^n) = 0$ and so $\lim(x_n) = 0$ (since $|x_n - 0|$ is dominated by a constant multiple of a sequence that converges to zero).
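
A quick example of the ratio test in action:

$$x_n = \frac{n}{2^n} \quad\Longrightarrow\quad \frac{x_{n+1}}{x_n} = \frac{n+1}{2^{n+1}} \cdot \frac{2^n}{n} = \frac{1}{2}\left(1 + \frac{1}{n}\right) \to \frac{1}{2} < 1$$

so $\lim(n/2^n) = 0$.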

3.2.7Exercises

Answers to select homework exercises

3.3Monotone sequences

3.3.1Recap of proving convergence

Ways to prove that a sequence converges so far:

  • Use the $\epsilon$ definition
  • Dominate $|x_n-x|$ by a multiple of the terms in a sequence known to converge to 0
  • Take tails, algebraic combinations, absolute values, or square roots of other sequences
  • Squeeze theorem
  • Ratio test

3.3.2Monotone convergence theorem

A monotone sequence $X$ is convergent if and only if it's bounded.

Proof: If $X$ is bounded and increasing, there exists $M \in \mathbb R$ such that $x_n \leq M$. Also, according to the completeness property of $\mathbb R$, the set $\{x_n:n\in \mathbb N\}$ has a supremum in $\mathbb R$, which we will call $x^*$. We will show that $x^* = \lim(x_n)$. Given $\epsilon > 0$, $x^*-\epsilon$ is not an upper bound of the set, by definition of the supremum; consequently, there exists $x_k$ such that $x^*-\epsilon < x_k$. Also, we have that $x_k \leq x_n$ whenever $n \geq k$ since the sequence is increasing. Consequently, we have:

$$x^*-\epsilon < x_k \leq x_n \leq x^* \leq x^* + \epsilon, \quad n \geq k$$

and so $-\epsilon < x_n - x^* < \epsilon$ which means that $|x_n-x^*| < \epsilon$. So the sequence converges to $x^*$.

For decreasing sequences, we can use the negative of the sup to get the inf, or we can just prove it in the same way.

Using the monotone convergence theorem

(a) $\lim(1/\sqrt{n}) = 0$. 0 is a lower bound and the sequence is decreasing, so the limit $x$ exists; since $x^2 = \lim(1/n) = 0$ (product rule), $x = 0$.

(b) $x_n = 1 + 1/2 + \ldots + 1/n$ does not converge, since the sequence is increasing but not bounded above (group the terms into blocks ending at the powers of 2 to see this)

(c) $y_1 = 1$, $y_{n+1} = \frac{1}{4}(2y_n+3)$. First, prove by induction that it's bounded above by 2: $y_1 = 1 < 2$, and if $y_k < 2$ then $y_{k+1} = \frac{1}{4}(2y_k+3) < \frac{1}{4}(4+3) = \frac{7}{4} < 2$. Now show that it's increasing by induction (for the base case, $y_2 = 5/4 > 1 = y_1$). Also pretty easy. Then to find the actual limit, use the fact that the 1-tail $(y_{n+1})$ has the same limit $y$ as the sequence itself, so we can write $y = \frac{1}{4}(2y+3) = y/2 + 3/4$, and so $y/2 = 3/4$, thus $y = 3/2$.

3.3.3Calculating square roots

To define a sequence that converges to $\sqrt{a}$ (for $a > 0$): $s_1 > 0$ is arbitrary, $s_{n+1} = \frac{1}{2}(s_n + a/s_n)$. We show that it's bounded below by $\sqrt{a}$ from the second term onwards ($s_n$ satisfies $s_n^2 - 2s_{n+1}s_n + a = 0$, so the quadratic $x^2 - 2s_{n+1}x + a$ has a real root, meaning its discriminant $4s_{n+1}^2 - 4a$ is non-negative, i.e. $s_{n+1}^2 \geq a$) and that it's ultimately decreasing ($s_n - s_{n+1} = \frac{1}{2}(s_n - a/s_n) = \frac{s_n^2 - a}{2s_n} \geq 0$ for $n \geq 2$). Then the limit $s$ exists and we just have $s = \frac{1}{2}(s + a/s)$, from which we get $s = a/s$, or $s^2 = a$, so $s= \sqrt{a}$.
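
For example, with $a = 2$ and $s_1 = 1$ (my own quick computation, just to see the convergence):

$$s_2 = \frac{1}{2}\left(1 + \frac{2}{1}\right) = \frac{3}{2}, \quad s_3 = \frac{1}{2}\left(\frac{3}{2} + \frac{4}{3}\right) = \frac{17}{12} \approx 1.4167, \quad s_4 = \frac{1}{2}\left(\frac{17}{12} + \frac{24}{17}\right) = \frac{577}{408} \approx 1.41422$$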

3.3.4Euler's number

$$\lim\left (\left ( 1 + \frac{1}{n}\right )^n\right ) = e$$

Proof: using the monotone convergence theorem. We show that it's increasing using the binomial theorem: the expression for $e_n$ contains $n+1$ terms and the expression for $e_{n+1}$ has $n+2$ terms, and each term of $e_{n+1}$ is at least as large as the corresponding term of $e_n$ (plus there's one extra positive term). So it's increasing. For boundedness: each term of the expansion is at most $1/p!$ (since every factor $(1-p/n)$ is less than 1), and $2^{p-1} \leq p!$, so $1/p! \leq 1/2^{p-1}$. Then $e_n < 1 + 1 + 1/2 + 1/2^2 + \ldots + 1/2^{n-1} = 2 + (1 - 1/2^{n-1}) < 3$.
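
The binomial expansion that drives both halves of the argument:

$$e_n = \left(1 + \frac{1}{n}\right)^n = 1 + 1 + \frac{1}{2!}\left(1 - \frac{1}{n}\right) + \frac{1}{3!}\left(1 - \frac{1}{n}\right)\left(1 - \frac{2}{n}\right) + \ldots + \frac{1}{n!}\left(1 - \frac{1}{n}\right)\cdots\left(1 - \frac{n-1}{n}\right)$$

Each factor $(1 - p/n)$ gets larger when $n$ is replaced by $n+1$ (monotonicity), and each term is at most $1/p!$ (boundedness).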

3.3.5Exercises

Answers to select homework exercises

3.4Subsequences and the Bolzano-Weierstrass theorem

3.4.1Subsequences

Same order as the original sequence, just, drop select terms. A tail of a sequence is a special type of subsequence.

3.4.2Convergent subsequences

Theorem: if a sequence of real numbers converges to something, then any subsequence also converges to it. Proof: $n_k \geq k$, so if $k \geq K(\epsilon)$ then $n_k \geq K(\epsilon)$ and thus $|x_{n_k} - x| < \epsilon$.

An example using this theorem: to find the limit of $x_n = b^n$ if $0 < b < 1$ (the sequence is decreasing and bounded below by 0, so the monotone convergence theorem says the limit exists; denote it by $x$), we consider the subsequence $x_{2n} = b^{2n} = (b^n)^2 = x^2_n$, which has a limit of $x^2$ (from the limit laws, etc). But it also has the same limit $x$ as the sequence itself, due to the theorem above. So then $x = \lim(x_{2n}) = (\lim(x_n))^2 = x^2$. Then, either $x=0$ or $x=1$. Since the sequence is decreasing, $x \leq x_1 = b < 1$, so $x=0$.

Another example: find the limit of $z_n = c^{1/n}$ for $c > 1$, which we will denote by $z$. Consider the subsequence $z_{2n} = c^{1/(2n)} = (c^{1/n})^{1/2} = z_n^{1/2}$. By limit laws in 3.2, we have that $z = \lim(z_{2n}) = (\lim(z_n))^{1/2} = z^{1/2}$ and so $z^2=z$, as before. In this case, since $z_n > 1$, it can't be 0, so it must be that $z=1$.

3.4.3Divergent subsequences

Theorem: TFAE:

(i) A sequence $X = (x_n)$ does not converge to $x$.
(ii) There exists an $\epsilon > 0$ such that for any $k \in \mathbb{N}$, there exists $n_k \in \mathbb{N}$ such that $n_k \geq k$ and $|x_{n_k} - x| \geq \epsilon$.
(iii) There exists an $\epsilon > 0$ and a subsequence $(x_{n_k})$ of $X$ such that $|x_{n_k} - x| \geq \epsilon$ for all $k \in \mathbb{N}$.

Proof for (i) $\to$ (ii): just the definition of converging to a limit.

Proof for (ii) $\to$ (iii): Just find terms that fit the definition and make a subsequence out of them.

Proof for (iii) $\to$ (i): A subsequence satisfying the condition in (iii) (meaning that it does not converge to $x$) exists. So, the sequence as a whole cannot converge to $x$ either.

Divergence criteria (is this covered or not?): a sequence is divergent if: 1) it contains two convergent subsequences with different limits; or 2) it is unbounded.

Example: $X=((-1)^n)$ is divergent, because the even terms converge to 1, and the odd terms converge to -1.

Another example: $S=(\sin(n))$ is divergent. Proof: $\sin(\pi/6) = \sin(5\pi/6) = 1/2$ and $\sin x > 1/2$ on the interval $(\pi/6, 5\pi/6)$. Since the length of this interval is greater than 2, there are at least 2 natural numbers in it. We take the first one, and we do this for every such interval (since sine is periodic). Then, we form a subsequence $(\sin n_k)$ for which everything lies between 1/2 and 1. Then, we form another subsequence using the interval $(7\pi/6, 11\pi/6)$; since $\sin x < -1/2$ in this interval, and the length is the same, we can make another subsequence with values between -1 and $-1/2$. Then, given any real number $c$, we see that at least one of the subsequences lies outside the $1/2$-neighbourhood of $c$, so $c$ cannot be a limit of $S$. Thus nothing can be a limit of $S$ and so it's divergent.

3.4.4The monotone subsequence theorem

Every sequence (even sequences that are not themselves monotone) has a monotone subsequence.

Proof: Consider the concept of a "peak" (a term in a sequence that is greater than everything that follows it). Any given sequence $X$ can have either finitely many (including zero) or infinitely many peaks. In the first case, we can simply take the first term after the last peak. Since this term is not a peak, there must be a term after it that is greater than it, and so on, until we get an increasing subsequence. In the second case, the subsequence of peaks is decreasing and thus monotone.

3.4.5The Bolzano-Weierstrass theorem

A bounded sequence of real numbers has a convergent subsequence.

Proof: By the monotone subsequence theorem, this sequence must contain a monotone subsequence. Since it's also bounded, we can use the monotone convergence theorem to show that this subsequence is convergent as well.

3.4.6Limits of subsequences

If every convergent subsequence of a sequence converges to $x$, so does the sequence itself.

Proof: by contradiction, using Bolzano-Weierstrass.

3.4.7Exercises

Answers to select homework exercises

3.5The Cauchy criterion

3.5.1Introduction to Cauchy sequences

A sequence $X = (x_n)$ of real numbers is said to be a Cauchy sequence if for every $\epsilon > 0$ there exists a natural number $H(\epsilon)$ such that for all natural numbers $n, m \geq H(\epsilon)$, the terms $x_n$, $x_m$ satisfy $|x_n - x_m| < \epsilon$.

Example: $(1/n)$. Proof: given $\epsilon > 0$, choose $H = H(\epsilon)$ such that $H > 2/\epsilon$. Then if $m, n \geq H$, we have $1/n \leq 1/H < \epsilon/2$ and similarly $1/m < \epsilon/2$. Then, we have:

$$\left | \frac{1}{n} - \frac{1}{m} \right | \leq \frac{1}{n} + \frac{1}{m} < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$$

which tells us that $(1/n)$ is a Cauchy sequence.

Example of something that is not a Cauchy sequence: $(1 + (-1)^n)$. The difference between consecutive terms is always 2, so with $\epsilon = 2$ (say) no choice of $H$ works. Remember that game?

Note that to prove that a sequence is a Cauchy sequence, we cannot assume a relationship between $m$ and $n$, for the inequality $|x_n - x_m| < \epsilon$ must hold for all $n, m \geq H(\epsilon)$. However, to prove that a sequence is not a Cauchy sequence, we can make use of a relation between $m$ and $n$ (as long as it shows that for arbitrarily large values of $m$ and $n$, the inequality does not hold).

Lemma: if a sequence of real numbers is convergent, it is a Cauchy sequence. Proof: $\epsilon$ definition, using $|x_n - x| < \epsilon/2$ and $|x_m - x| < \epsilon/2$.
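
Written out, for $n, m \geq K(\epsilon/2)$:

$$|x_n - x_m| = |(x_n - x) + (x - x_m)| \leq |x_n - x| + |x_m - x| < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$$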

Lemma: Cauchy sequences of real numbers are bounded. Proof: by the triangle inequality and use of suprema. If we let $\epsilon = 1$ and fix $m = H(1)$, then $|x_n - x_m| < 1$ for all $n \geq H$, so $|x_n| \leq |x_m| + 1$ for those $n$; taking the sup of $|x_1|, \ldots, |x_{H-1}|$ and $|x_m| + 1$ gives a bound for the whole sequence.

3.5.2The Cauchy convergence criterion

A sequence of real numbers is convergent if and only if it is a Cauchy sequence.

Proof: For →, we use the first lemma in the previous section. For ←, we first note that a Cauchy sequence is bounded (second lemma in previous section). By Bolzano-Weierstrass, there is a subsequence that converges to some number $x'$. We then use the $\epsilon$ definition for the sequence and the subsequence, and then we can sort of combine the two and then use the triangle inequality to conclude that the sequence itself converges to $x'$, and so the sequence is convergent. (Use $\epsilon/2$.)

3.5.3Exercises

Answers to select homework exercises

3.6Properly divergent sequences

A properly divergent sequence is one whose limit is $\pm \infty$. We can define an infinite limit as follows: $(x_n)$ tends to $+\infty$ if for every $\alpha \in \mathbb R$ there exists $K(\alpha) \in \mathbb{N}$ such that if $n \geq K(\alpha)$, then $x_n > \alpha$. Similar for $-\infty$.

Examples of things that tend to $\infty$: $n$, $n^2$, $c^n$ for any $c > 1$.

A monotone sequence is properly divergent if and only if it is unbounded. Not too difficult to prove, just think about it.

Also, this is kind of obvious, but if you have two sequences, and the first tends to infinity and all of its terms are less than those of the other sequence, then the other sequence tends to infinity as well.

3.6.1Exercises

Answers to select homework exercises

4Limits

4.1Limits of functions

4.1.1Cluster points

A cluster point of a set $A \subseteq \mathbb{R}$ is a point $c \in \mathbb R$ such that for every $\delta > 0$, there exists at least one point $x \in A$ ($x \neq c$) such that $|x-c| < \delta$.

Note that $c$ may or may not be a member of $A$. Even if it is, we ignore the point itself.

A cluster point is really just something that elements in the set get arbitrarily close to.

Theorem: a number $c \in \mathbb R$ is a cluster point of a subset $A \in \mathbb R$ iff there exists a sequence $(a_n)$ in $A$ such that $\lim(a_n) = c$ and $a_n \neq c$ for all $n \in \mathbb N$.

Examples: for the open interval $(0, 1)$, every point of the closed interval $[0, 1]$ is a cluster point. Incidentally, 0 and 1 are cluster points of the open interval even though they don't belong to it.

A finite set has no cluster points, and the infinite set $\mathbb N$ has no cluster points.

The set $\{1/n:n\in \mathbb N\}$ has only one cluster point: 0.

Every point of the closed interval $[0, 1]$ is a cluster point of the set of rationals in $[0, 1]$; this follows from the density theorem. (More generally, every real number is a cluster point of $\mathbb Q$.)

4.1.2The definition of the limit

The function $f: A \to \mathbb R$ has a limit $L \in \mathbb R$ at a cluster point $c$ of $A$ if, for any given $\epsilon > 0$, there exists $\delta > 0$ such that if $x \in A$ and $|x-c| < \delta$ (where $x \neq c$), then $|f(x) - L| < \epsilon$.

Note that the value of $\delta$ depends on $\epsilon$.

If a limit does not exist at a point $c$, then $f$ diverges at $c$. Otherwise, it has a unique limit at that point. To prove this, do the standard thing where you get $\epsilon/2 + \epsilon/2 = \epsilon$ from which you can conclude that $L - L' = 0$, etc.

We can rephrase this in terms of neighbourhoods by saying that $x$ belongs to the $\delta$-neighbourhood of $c$.

Examples:

  • the constant function's limit at any point is obvious
  • $\displaystyle \lim_{x \to c} x = c$. To prove, just replace $f(x)$ with $x$.
  • $\displaystyle \lim_{x \to c} x^2 = c^2$, this proof is a bit longer but we use the fact that $x^2-c^2$ (which we want to make arbitrarily small) factors as $(x-c)(x+c)$, and if we restrict to $|x-c| < 1$ then the triangle inequality gives $|x+c| \leq 2|c|+1$. Then take $\delta$ to be the inf of 1 and $\epsilon/(2|c|+1)$.
  • $\displaystyle \lim_{x \to c} 1/x = 1/c$ if $c > 0$. Basically just take the absolute value of the difference, then get it in terms of $|x-c|$. Use the fact that we can limit the neighbourhood if necessary (to get a bound on the coefficient of $|x-c|$).
  • $\displaystyle \lim_{x\to 2} \frac{x^3-4}{x^2+1} = \frac{4}{5}$. Same idea as the previous ones: restrict to the 1-neighbourhood of 2 (so $1 < x < 3$), find an upper bound for the numerator factor and a lower bound for the denominator, and get an upper bound of $\frac{15}{2} |x-2|$ (expanded below). So $\delta$ is the lesser of 1 and $\frac{2}{15}\epsilon$.
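
Expanding that last bound (my own working, following the same pattern as the earlier examples):

$$\frac{x^3-4}{x^2+1} - \frac{4}{5} = \frac{5(x^3-4) - 4(x^2+1)}{5(x^2+1)} = \frac{(x-2)(5x^2+6x+12)}{5(x^2+1)}$$

and for $1 < x < 3$ we have $5x^2+6x+12 \leq 75$ and $5(x^2+1) \geq 10$, so the difference is at most $\frac{15}{2}|x-2|$.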

4.1.3The sequential criterion for limits

TFAE:

(i) $\displaystyle \lim_{x \to c} f = L$
(ii) For every sequence $(x_n)$ in $A$ that converges to $c$ (with $x_n \neq c$), then the sequence $(f(x_n))$ also converges to $L$.

Proof: (→) it involves $\delta$ and $\epsilon$ and $K$ (the definitions). ← contrapositive, look at neighbourhoods.

4.1.4Divergence criteria

  • If there exists a sequence $(x_n)$ in $A$ with $x_n \neq c$ that converges to $c$, but $(f(x_n))$ does not converge to $L$, then $f$ does not converge to $L$ at $c$
  • If there exists such a sequence $(x_n)$ for which $(f(x_n))$ does not converge at all, then $f$ has no limit at $c$

Proof: left as exercise.

Examples:

  • $1/x$ as $x \to 0$ does not converge. Take the sequence $1/n$. Then $1/n \to 0$ but $1/(1/n) = n$ which diverges.
  • The signum function does not have a limit at 0. (Defined by $x/|x|$ for $x \neq 0$, and 0 at 0.) Take the sequence $(-1)^n/n$ which goes to 0, but the signum function turns it into just $(-1)^n$ which diverges.
  • $\sin(1/x)$ diverges. Make a sequence using the periodicity of sine, etc

4.1.5Exercises

Answers to select homework exercises

4.2Limit theorems

4.2.1Bounded on a neighbourhood

Bounded on a neighbourhood: exactly what it sounds like. If a function has a limit at a point, then it's bounded on some neighbourhood of the point. Proof: same as usual; triangle inequality, $\epsilon = 1$, take the sup.

Sum, product, quotient laws, squeeze theorem, etc. Same idea as 3.2.

Theorem: if the limit is strictly positive/negative, there exists a $\delta$ neighbourhood of $c$ where $f(x)$ is also strictly positive/negative.

4.2.2Exercises

Answers to select homework exercises

4.3Some extensions of the limit concept

4.3.1One-sided limits

If $c$ is a cluster point of the set $A \cap (c, \infty) = \{x \in A: x > c\}$, then $L \in \mathbb R$ is the right-hand limit of $f$ at $c$ if (standard limit def follows, with $c < x < c + \delta$).

Theorem: the (two-sided) limit exists at $c$ if and only if the left-hand and right-hand limits both exist and agree.

The rest seems unimportant.

4.3.2Exercises

Answers to select homework exercises

5Continuous functions

5.1Continuous functions

Definition: $A \subseteq \mathbb R$, $f: A \to \mathbb R$, $c \in A$. $f$ is continuous at $c$ if, given any number $\epsilon > 0$, there exists $\delta > 0$ such that if $x \in A$ is any point satisfying $|x-c| <\delta$, then $|f(x) - f(c)| < \epsilon$. We can also think of this in terms of neighbourhoods.

Note that if $c$ is a cluster point of $A$, this is the same as requiring $f(c) = \lim_{x\to c} f(x)$. (If $c \in A$ is not a cluster point, $f$ is automatically continuous there.)

5.1.1The sequential criterion for continuity

$f: A \to \mathbb R$ is continuous at the point $c \in A$ if and only if for every sequence $(x_n)$ in $A$ that converges to $c$, the sequence $(f(x_n))$ converges to $f(c)$.

The discontinuity criterion is the negation of this.

5.1.2Continuity of functions

The function's gotta be continuous at every point of the given set $B \subseteq A$

5.1.3Examples of continuity

  • $f(x) = b$ is continuous because the limit is $b$ at every point
  • $g(x) = x$ cuz $g(c) = c = $ the limit
  • same for $x^2$
  • $1/x$ is not continuous at $x=0$, because the function is not defined there
  • the signum function is not continuous, as the limit does not exist at 0
  • Dirichlet's discontinuous function: 1 at rationals, 0 at irrationals. This is not continuous anywhere, due to the density theorem.
  • Thomae's function: 0 at irrationals, $1/m$ at rationals (where the rational is $n/m$ in lowest form). This is discontinuous at rationals and continuous at irrationals. Discontinuous at rationals: let $(x_n)$ be a sequence of irrationals that converges to a rational $a = n/m$. Then the limit of $(f(x_n))$ is 0, but $f(a) = 1/m > 0$. Continuous at irrationals: given $\epsilon > 0$, use the Archimedean property to get $n_0$ with $1/n_0 < \epsilon$; only finitely many rationals near an irrational point $b$ have denominator at most $n_0$, so we can choose $\delta$ small enough that the $\delta$-neighbourhood of $b$ contains none of them. Then $|f(x) - f(b)| = f(x) < 1/n_0 < \epsilon$ on that neighbourhood.

5.1.4Exercises

Answers to select homework exercises

5.2Combinations of continuous functions

Product, sum, scalar multiple, difference of continuous functions = continuous. Quotient too if the denominator is non-zero. Proof: easy, apply the corresponding limit theorems. Also, the absolute value and square root of a continuous function are continuous (the latter only if the function is non-negative). Pretty much the same as 3.2/4.2.

Compositions of continuous functions are continuous. Proof: using neighbourhoods (pull an $\epsilon$-neighbourhood of $g(f(c))$ back to a neighbourhood around $f(c)$, then back to one around $c$).

5.2.1Exercises

Answers to select homework exercises