**Maintainer:** admin

The third midterm will be held during class on Thursday, November 15. The rooms are assigned based on your last name:

A to K - go to RPHYS 112

L to N - go to ENGMC 13

O to V - go to BURN 1B45

W to Z - go to BURN 1B39

Sections covered: everything up until section 4.1 (inclusive) of the textbook (except the divergence criterion and with only basic treatment of cluster points). The previous midterm covered up to section 3.3 (inclusive), so this midterm will most likely focus on sections 3.4-4.1.

Source: the content below is curated from the course textbook (*Introduction to Real Analysis* by Robert G. Bartle and Donald R. Sherbert). It omits much of the content and mainly attempts to summarise the important parts, and although certain (small) sections are copied verbatim from the text this usage should constitute fair dealing. Any errors or omissions should be assumed to be the fault of the author of this page (@dellsystem) and not of the textbook itself. If you have any questions or want to make a correction, feel free to either contact @dellsystem or edit the page directly.

*1*Midterm 1 review¶

Very basic stuff. Under construction.

*2*Midterm 2 review¶

Under construction.

*3*Midterm 3 review¶

*3.1*Section 3.4: Subsequences and the Bolzano-Weierstrass theorem¶

*3.1.1*Subsequences¶

A subsequence keeps the same order as the original sequence but drops selected terms (the chosen indices $n_1 < n_2 < \cdots$ must be strictly increasing). A **tail** of a sequence is a special type of subsequence.

*3.1.1.1*Convergent subsequences¶

Theorem: if a sequence of real numbers converges to something, then any subsequence also converges to it.

An example using this theorem: to find the limit of $x_n = b^n$ for $0 < b < 1$, which we will denote by $x$, we consider the subsequence $x_{2n} = b^{2n} = (b^n)^2 = x_n^2$, which has a limit of $x^2$ (from the limit laws, etc). But it also has the same limit $x$ as the sequence itself, due to the theorem above. So then $x = \lim(x_{2n}) = (\lim(x_n))^2 = x^2$. Then, either $x=0$ or $x=1$. Since the sequence is decreasing with $x_n \leq x_1 = b < 1$, the limit cannot be 1, so $x=0$.

Another example: find the limit of $z_n = c^{1/n}$ for $c > 1$, which we will denote by $z$. Consider the subsequence $z_{2n} = c^{1/(2n)} = (c^{1/n})^{1/2} = z_n^{1/2}$. By the limit laws in 3.2, we have $z = \lim(z_{2n}) = (\lim(z_n))^{1/2} = z^{1/2}$, and so $z^2 = z$, as before. In this case, since $z_n > 1$ for all $n$, the limit satisfies $z \geq 1$; it can't be 0, so it must be that $z = 1$.
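The subsequence trick in these two examples can be sanity-checked numerically. A sketch (the variable names and sample values $b = 0.5$, $c = 2$ are my own choices), not a proof:

```python
# Spot check of the subsequence argument for x_n = b^n with b = 0.5:
# the identity x_{2n} = (x_n)^2, and both terms heading to the limit 0.
b = 0.5
n = 50
x_n = b ** n            # a far-out term of the sequence
x_2n = b ** (2 * n)     # the corresponding term of the subsequence (x_{2n})
assert abs(x_2n - x_n ** 2) < 1e-30   # x_{2n} = (x_n)^2
assert x_n < 1e-10 and x_2n < 1e-10   # both already very close to 0

# Same idea for z_n = c^(1/n) with c = 2: the terms approach 1 from above.
c = 2.0
z_n = c ** (1.0 / 10**6)
assert 1.0 < z_n < 1.0 + 1e-5
```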

*3.1.2*Divergent subsequences¶

Probably not the best title for this section, but, whatever.

Theorem: TFAE:

(i) A sequence $X = (x_n)$ does not converge to $x$.

(ii) There exists an $\epsilon > 0$ such that for any $k \in \mathbb{N}$, there exists $n_k \in \mathbb{N}$ such that $n_k \geq k$ and $|x_{n_k} - x| \geq \epsilon$.

(iii) There exists an $\epsilon > 0$ and a subsequence $(x_{n_k})$ of $X$ such that $|x_{n_k} - x| \geq \epsilon$ for all $k \in \mathbb{N}$.

Proof for (i) $\to$ (ii): this is just the negation of the definition of converging to a limit.

Proof for (ii) $\to$ (iii): Just find terms that fit the definition and make a subsequence out of them.

Proof for (iii) $\to$ (i): the subsequence in (iii) stays at least $\epsilon$ away from $x$, so it does not converge to $x$. But if the whole sequence converged to $x$, every subsequence would converge to $x$ too (by the theorem above), so the sequence cannot converge to $x$.

Divergence criteria (per the coverage note at the top, the divergence criterion is excluded from this midterm): a sequence is divergent if: 1) it contains two convergent subsequences with different limits; or 2) it is unbounded.

Example: $X=((-1)^n)$ is divergent, because the even terms converge to 1, and the odd terms converge to -1.
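A quick numerical restatement of this example (an illustration, not part of the proof):

```python
# The even-indexed and odd-indexed subsequences of ((-1)^n) are constant,
# converging to 1 and -1 respectively, so the full sequence diverges.
seq = [(-1) ** n for n in range(1, 21)]   # x_1, ..., x_20
evens = seq[1::2]    # x_2, x_4, ... (even n)
odds = seq[0::2]     # x_1, x_3, ... (odd n)
assert all(t == 1 for t in evens)
assert all(t == -1 for t in odds)
```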

Another example: $\sin(n)$ is divergent. The proof given in the textbook is confusing to me.

*3.1.3*The monotone subsequence theorem¶

Every sequence (even sequences that are not themselves monotone) has a monotone subsequence.

Proof: Call a term of a sequence a "peak" if it is greater than every term that follows it. Any given sequence $X$ has either finitely many (possibly zero) or infinitely many peaks. In the first case, take the first term after the last peak. Since this term is not a peak, some later term is greater than it; that term is not a peak either, so we can continue in this way to build an increasing subsequence. In the second case, the subsequence of peaks is decreasing and thus monotone.
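The "infinitely many peaks" branch of the argument translates directly into code. The helper below is a hypothetical illustration for finite lists (where the last element is trivially a peak), not something from the textbook:

```python
def peak_subsequence(xs):
    """Return the terms of xs that are >= every later term ("peaks").
    In a finite list these always form a non-increasing subsequence,
    mirroring the infinitely-many-peaks case of the proof."""
    n = len(xs)
    peaks = [i for i in range(n)
             if all(xs[i] >= xs[j] for j in range(i + 1, n))]
    return [xs[i] for i in peaks]

sub = peak_subsequence([3, 1, 4, 1, 5, 9, 2, 6])
assert all(a >= b for a, b in zip(sub, sub[1:]))   # monotone (non-increasing)
```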

*3.1.4*The Bolzano-Weierstrass theorem¶

A bounded sequence of real numbers has a convergent subsequence.

Proof: By the monotone subsequence theorem, this sequence must contain a monotone subsequence. Since it's also bounded, we can use the monotone convergence theorem to show that this subsequence is convergent as well.

*3.1.5*Limits of subsequences¶

If every convergent subsequence of a *bounded* sequence converges to $x$, so does the sequence itself. (Boundedness is needed: consider $(1, 2, 1, 4, 1, 6, \dots)$, whose convergent subsequences all converge to 1 even though the sequence diverges.)

Proof: by contradiction, using Bolzano-Weierstrass. If the sequence did not converge to $x$, there would be an $\epsilon > 0$ and a subsequence whose terms all stay at least $\epsilon$ away from $x$. That subsequence is bounded, so by Bolzano-Weierstrass it has a convergent subsequence, whose limit is at least $\epsilon$ away from $x$, contradicting the hypothesis.

*3.2*Section 3.5: The Cauchy Criterion¶

*3.2.1*Introduction to Cauchy sequences¶

A sequence $X = (x_n)$ of real numbers is said to be a Cauchy sequence if for every $\epsilon > 0$ there exists a natural number $H(\epsilon)$ such that for all natural numbers $n, m \geq H(\epsilon)$, the terms $x_n$, $x_m$ satisfy $|x_n - x_m| < \epsilon$.

This is probably an important definition.

Example: $(1/n)$ is a Cauchy sequence. Proof: given $\epsilon > 0$, choose $H = H(\epsilon)$ such that $H > 2/\epsilon$. Then if $m, n \geq H$, we have $1/n \leq 1/H < \epsilon/2$ and similarly $1/m < \epsilon/2$. Then, we have:

$$\left | \frac{1}{n} - \frac{1}{m} \right | \leq \frac{1}{n} + \frac{1}{m} < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$$

which tells us that $(1/n)$ is a Cauchy sequence.
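The choice $H(\epsilon) > 2/\epsilon$ can be spot-checked numerically for a few sample values of $\epsilon$ (a check over a finite window of indices, not a proof):

```python
import math

for eps in (0.5, 0.1, 0.01):
    H = math.floor(2 / eps) + 1    # smallest natural number H with H > 2/eps
    # check the Cauchy inequality |1/n - 1/m| < eps on a window of indices >= H
    worst = max(abs(1 / n - 1 / m)
                for n in range(H, H + 50)
                for m in range(H, H + 50))
    assert worst < eps
```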

Example of something that is *not* a Cauchy sequence: $(1 + (-1)^n)$. Consecutive terms always differ by 2, so take $\epsilon = 2$: for any $H$, choosing $n = H$ and $m = H + 1$ gives $|x_n - x_m| = 2 \geq \epsilon$, so no $H(\epsilon)$ can work.

Note that to prove that a sequence *is* a Cauchy sequence, we cannot assume a relationship between $m$ and $n$, for the inequality $|x_n - x_m| < \epsilon$ must hold for *all* $n, m \geq H(\epsilon)$. However, to prove that a sequence is *not* a Cauchy sequence, we can make use of a relation between $m$ and $n$ (as long as it shows that for arbitrarily large values of $m$ and $n$, the inequality does not hold).

Lemma: if a sequence of real numbers is convergent, it is a Cauchy sequence. Proof: the $\epsilon$ definition: if $x_n \to x$, choose $K$ with $|x_n - x| < \epsilon/2$ for $n \geq K$; then for $n, m \geq K$, $|x_n - x_m| \leq |x_n - x| + |x - x_m| < \epsilon$.

Lemma: Cauchy sequences of real numbers are bounded. (The restriction to the reals isn't what matters here; completeness matters later, for the convergence criterion.) Proof: take $\epsilon = 1$ to bound all terms from some index $H$ onward by $|x_H| + 1$ via the triangle inequality, then take the supremum of this bound and the finitely many earlier terms.

*3.2.2*The Cauchy convergence criterion¶

A sequence of real numbers is convergent if and only if it is a Cauchy sequence.

Proof: For →, we use the first lemma in the previous section. For ←, we first note that a Cauchy sequence is bounded (second lemma in the previous section). By Bolzano-Weierstrass, there is a subsequence that converges to some number $x'$. Applying the $\epsilon$ definition to both the Cauchy condition and the convergent subsequence, and combining the two with the triangle inequality, we conclude that the sequence itself converges to $x'$, and so the sequence is convergent.

*3.2.3*Contractive sequences¶

A sequence $X = (x_n)$ is contractive if there exists a constant $C$ with $0 < C < 1$ such that

$$|x_{n+2} - x_{n+1}| \leq C |x_{n+1} - x_n|$$

for all $n \in \mathbb{N}$. $C$ is called the **constant** of this contractive sequence.

Every contractive sequence is a Cauchy sequence, and thus is convergent. Proof: applying the definition (with $C$) repeatedly gives $|x_{n+1} - x_n| \leq C^{n-1}|x_2 - x_1|$. For $m > n$, the triangle inequality and the geometric sum give

$$|x_m - x_n| \leq (C^{n-1} + \cdots + C^{m-2})|x_2 - x_1| \leq \frac{C^{n-1}}{1-C}|x_2 - x_1|$$

which tends to 0 as $n \to \infty$, so the sequence is Cauchy.
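As a concrete example (my own choice, not from the textbook), the affine iteration $x_{n+1} = x_n/2 + 1$ is contractive with constant $C = 1/2$ and converges to its fixed point 2:

```python
# Iterate x_{n+1} = x_n/2 + 1 starting from x_1 = 0.
xs = [0.0]
for _ in range(60):
    xs.append(xs[-1] / 2 + 1)

C = 0.5
# Contractive condition |x_{n+2} - x_{n+1}| <= C |x_{n+1} - x_n|:
assert all(abs(xs[i + 2] - xs[i + 1]) <= C * abs(xs[i + 1] - xs[i]) + 1e-12
           for i in range(len(xs) - 2))
# The differences shrink geometrically, so the sequence is Cauchy;
# here the limit is the fixed point 2:
assert abs(xs[-1] - 2.0) < 1e-12
```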

*3.3*Section 3.6: Properly divergent sequences¶

A properly divergent sequence is one whose limit is $\pm \infty$. We can define an infinite limit as follows: $(x_n)$ tends to $+\infty$ if for every $\alpha \in \mathbb R$ there exists $K(\alpha) \in \mathbb{N}$ such that if $n \geq K(\alpha)$, then $x_n > \alpha$. Similar for $-\infty$.

Examples of things that tend to $\infty$: $n$, $n^2$, $c^n$ for any $c > 1$.

A monotone sequence is properly divergent if and only if it is unbounded. Proof sketch: a bounded monotone sequence converges (monotone convergence theorem), so it is not properly divergent; conversely, an unbounded increasing sequence eventually exceeds any $\alpha$, and by monotonicity stays above it.

Also, this is kind of obvious, but: if $(x_n)$ tends to $+\infty$ and $x_n \leq y_n$ for all $n$, then $(y_n)$ tends to $+\infty$ as well.

*3.4*Section 3.7: Introduction to infinite series¶

Did we even do this? Did I just sleep through this week?

*3.5*Section 4.1: Limits of functions¶

*3.5.1*Cluster points¶

A **cluster point** of a set $A \subseteq \mathbb{R}$ is a point $c \in \mathbb{R}$ such that for every $\delta > 0$, there exists at least one point $x \in A$ ($x \neq c$) such that $|x-c| < \delta$.

Note that $c$ may or may not be a member of $A$. Even if it is, we ignore the point itself.

A cluster point is really just something that elements in the set get arbitrarily close to.

Theorem: a number $c \in \mathbb R$ is a cluster point of a subset $A \subseteq \mathbb R$ iff there exists a sequence $(a_n)$ in $A$ such that $\lim(a_n) = c$ and $a_n \neq c$ for all $n \in \mathbb N$.

Examples: for the open interval $(0, 1)$, every point of the closed interval $[0, 1]$ is a cluster point; in particular, 0 and 1 are cluster points of the open interval even though they don't belong to it.

A finite set has no cluster points, and the infinite set $\mathbb N$ has no cluster points.

The set $\{1/n:n\in \mathbb N\}$ has only one cluster point: 0.
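This can be spot-checked: for any $\delta > 0$, taking $n > 1/\delta$ gives a point of the set within $\delta$ of 0 (and distinct from it). A small sketch:

```python
import math

# 0 is a cluster point of {1/n : n in N}: every delta-neighbourhood of 0
# contains a point of the set other than 0 itself.
for delta in (0.5, 0.01, 1e-6):
    n = math.floor(1 / delta) + 1    # a natural number n with n > 1/delta
    assert 0 < 1 / n < delta
```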

Every point of a closed interval such as $[0, 1]$ is a cluster point of $\mathbb Q \cap [0, 1]$, by the density of the rationals (in fact, every real number is a cluster point of $\mathbb Q$).

*3.5.2*The definition of the limit¶

The function $f: A \to \mathbb R$ has a limit $L \in \mathbb R$ at a cluster point $c$ of $A$ if, for any given $\epsilon > 0$, there exists $\delta > 0$ such that if $x \in A$ and $|x-c| < \delta$ (where $x \neq c$), then $|f(x) - L| < \epsilon$.

Note that the value of $\delta$ depends on $\epsilon$.

If a limit does not exist at a point $c$, then we say $f$ diverges at $c$. Otherwise, the limit at that point is unique. To prove uniqueness, do the standard thing: assume two limits $L$ and $L'$, get $|L - L'| < \epsilon/2 + \epsilon/2 = \epsilon$ for every $\epsilon > 0$, and conclude that $L - L' = 0$.

We can rephrase this in terms of neighbourhoods by saying that $x$ belongs to the $\delta$-neighbourhood of $c$.

Examples:

- the constant function's limit at any point is obvious
- $\lim_{x \to c} x = c$. To prove, just replace $f(x)$ with $x$.
- $\lim_{x \to c} x^2 = c^2$. This proof is a bit longer: write $|x^2-c^2| = |x-c||x+c|$ (which we want to make arbitrarily small), note that $|x-c| \leq 1$ forces $|x+c| \leq 2|c| + 1$ by the triangle inequality, and choose $\delta = \min\{1, \epsilon/(2|c|+1)\}$.
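For the $x^2$ example, one standard choice is $\delta = \min\{1, \epsilon/(2|c|+1)\}$, since $|x+c| \leq 2|c|+1$ whenever $|x-c| \leq 1$. A numerical spot check at sample values $c = 3$, $\epsilon = 0.01$ (my own choices):

```python
import random

random.seed(0)
c, eps = 3.0, 0.01
delta = min(1.0, eps / (2 * abs(c) + 1))
for _ in range(1000):
    x = c + random.uniform(-delta, delta)   # a sample x with |x - c| <= delta
    # |x^2 - c^2| = |x - c| * |x + c| < delta * (2|c| + 1) <= eps
    assert abs(x * x - c * c) < eps
```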

I don't know what anything means anymore

*3.5.3*The sequential criterion for limits¶

TFAE:

(i) $\displaystyle \lim_{x \to c} f = L$

(ii) For every sequence $(x_n)$ in $A$ that converges to $c$ (with $x_n \neq c$ for all $n$), the sequence $(f(x_n))$ converges to $L$.

Proof: (→) given $\epsilon$, the limit definition yields a $\delta$; since $x_n \to c$, there is a $K$ such that $|x_n - c| < \delta$ for $n \geq K$, and then $|f(x_n) - L| < \epsilon$. (←) by contrapositive, looking at neighbourhoods: if the limit is not $L$, there is an $\epsilon$ such that every $(1/n)$-neighbourhood of $c$ contains a point $x_n \in A$, $x_n \neq c$, with $|f(x_n) - L| \geq \epsilon$; then $(x_n)$ converges to $c$ but $(f(x_n))$ does not converge to $L$.
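A numerical illustration of direction (i) → (ii), using $f(x) = x^2$ and the sequence $x_n = c + 1/n$ (the function and sequence are my own choices):

```python
# f has limit c^2 at c, so for any sequence x_n -> c with x_n != c,
# the image sequence f(x_n) must converge to c^2.
def f(x):
    return x * x

c = 3.0
x_n = [c + 1.0 / n for n in range(1, 10001)]    # converges to c, never equals c
tail = [f(x) for x in x_n[-100:]]               # images of far-out terms
assert all(abs(y - c * c) < 1e-2 for y in tail)
```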

*3.5.4*Divergence criteria¶

I think we may not have to know this. In that case, I won't.