1 About the course
The homework questions will be harder than exam questions. The exam will be open book. The midterm will take place on October 10, in class. For more information, refer to the syllabus.
Important note for office hours: if you come to them, don't just wait outside the office, even if there's someone in there; feel free to barge right in.
2 Finite automata
2.1 Introduction to finite automata
The first thing we'll study in this course is the concept of finite automata. While the origins of this field lie in the theory of neurons, its current incarnation is quite removed from these origins, being more mathematical than anything else.
What exactly is a finite automaton? One way of describing it is as a finite state machine (FSM). You can do quite a bit with these machines, but not everything -- they're not as powerful as Turing machines, for example. On the other hand, there is a nice algebra underlying these machines, and so we'll cover this topic until the midterm. After the midterm, we'll look at pushdown automata, a richer kind of automaton that uses a stack for memory; these are more powerful than FSMs but still less powerful than Turing machines. Later, we'll study the pumping lemma, which gives us a way of showing that certain things cannot be done by these kinds of automata.
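To make the finite-state-machine idea concrete, here is a minimal sketch of a DFA in Python; it accepts binary strings containing an even number of 1s. This is just my own illustration (the alphabet, state names, and transition table are assumptions, not anything from the lecture).

```python
# A minimal sketch of a deterministic finite automaton (DFA): a two-state
# machine over the alphabet {0, 1} that accepts strings with an even number
# of 1s. State names and the transition table are illustrative choices only.

ACCEPTING = {"even"}
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def accepts(word: str, start: str = "even") -> bool:
    """Run the DFA on `word` and report whether it halts in an accepting state."""
    state = start
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING

assert accepts("0110")      # two 1s: accepted
assert accepts("")          # zero 1s: accepted
assert not accepts("010")   # one 1: rejected
```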
Incidentally, the syntax of essentially all programming languages is context-free.
2.2 The limits of computability
Are there limits to what can be computed, even with a general (Turing) machine? Indeed, there are. Alan Turing and Alonzo Church both showed the existence of unsolvable problems. For a telling illustration, recall the quote on the back of the sweater the professor was wearing:
I DON'T KNOW IF I HALT OR NOT
3 Presumed knowledge
- Set notation: membership ($\in$), containment ($\subseteq$)
- Binary relations and functions
- $\forall$, $\exists$, and the Abelard-Heloise game (theorem: a statement is true exactly when Heloise has a winning strategy; a worked example follows this list)
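For a concrete (illustrative) instance of the game: the statement $\forall x \in \mathbb N\; \exists y \in \mathbb N,\, y > x$ is true because Heloise has a winning strategy: whatever $x$ Abelard plays, she answers $y = x + 1$. Swapping the quantifiers to $\exists y \in \mathbb N\; \forall x \in \mathbb N,\, y > x$ reverses the order of play, and now Abelard wins by answering $x = y$, so that statement is false.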
4 Well-founded induction
This section is covered fairly quickly, with some of the details omitted; I assume that it's because much of it is presumed knowledge.
4.1 Equivalence relations
We'll build up the definition of induction from scratch. Let's start from the basics: an equivalence relation on a set $S$ is a binary relation (denoted by $\sim$, $\equiv$, or $\simeq$) that has the following properties:
- Reflexivity
- $\forall a \in S, \,a \sim a$
- Symmetry
- $\forall a, b \in S,\, a \sim b \to b \sim a$
- Transitivity
- $\forall a, b, c \in S, \, a \sim b \land b \sim c \to a \sim c$
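As a quick sanity check of these definitions, here is a small sketch (my own, not from the lecture) that tests the three properties for a relation given explicitly as a set of ordered pairs over a finite set:

```python
# Sketch: decide whether a binary relation R on a finite set S is an
# equivalence relation, with R given as a set of ordered pairs.

def is_equivalence(S: set, R: set) -> bool:
    reflexive = all((a, a) in R for a in S)
    symmetric = all((b, a) in R for (a, b) in R)
    transitive = all((a, d) in R
                     for (a, b) in R
                     for (c, d) in R
                     if b == c)
    return reflexive and symmetric and transitive

# "Same parity" on {0, 1, 2, 3} is an equivalence relation:
S = {0, 1, 2, 3}
R = {(a, b) for a in S for b in S if a % 2 == b % 2}
assert is_equivalence(S, R)
```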
Next we define the concept of an equivalence class: if $\sim \subseteq S \times S$, and $t \in S$, then we define $[t]$ to be
$$[t] = \{ s \in S \,|\, s \sim t\}$$
i.e., the set of all things related to $t$. This is called the equivalence class of $t$. Incidentally, any two equivalence classes are either disjoint or equal; in other words, the equivalence classes form a partition of $S$. The proof follows from symmetry and transitivity.
$S/\sim$ is the collection of equivalence classes resulting from the equivalence relation $\sim$.
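Continuing the illustrative sketch above, the equivalence classes, and hence $S/\sim$, can be computed directly from the definition of $[t]$; note how the resulting classes are pairwise disjoint and cover $S$:

```python
# Sketch: compute [t] and the quotient S/~ for the "same parity" relation
# from the previous example.

def equivalence_class(t, S, R):
    return frozenset(s for s in S if (s, t) in R)

def quotient(S, R):
    return {equivalence_class(t, S, R) for t in S}

S = {0, 1, 2, 3}
R = {(a, b) for a in S for b in S if a % 2 == b % 2}

assert equivalence_class(0, S, R) == frozenset({0, 2})
# The classes {0, 2} and {1, 3} are disjoint and their union is S: a partition.
assert quotient(S, R) == {frozenset({0, 2}), frozenset({1, 3})}
```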
4.2 Partial orders
A partial order, $\leq \subseteq S \times S$, is a relation with the following properties:
- Reflexivity
- $\forall a \in S, \,a \leq a$
- Antisymmetry
- $\forall a, b \in S,\, a \leq b \land b \leq a \to a = b$
- Transitivity
- $\forall a, b, c \in S, \, a \leq b \land b \leq c \to a \leq c$
The standard example of a partial order is $\mathbb N$ with the usual $\leq$ ordering. This also happens to be a total order (aka linear order), since every element of $\mathbb N$ is comparable to every other element (mathematically, $\forall a, b \in \mathbb N$, either $a \leq b$ or $b \leq a$).
Another, more interesting example is that of the subsets of a set of 3 elements, ordered by inclusion. If $A = \{a, b, c\}$ and we look at the power set of $A$, we have 8 elements. The element $\{\}$, for instance, is a subset of every other subset of $A$, so that's the minimal element. Some elements cannot be compared, however, making this not a total order; for instance, neither of $\{a\}$ and $\{b\}$ is contained within the other. The lattice (Hasse diagram) for this partial order appears in my MATH 318 lecture notes.
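Here is a small sketch (again my own illustration) that enumerates this poset and exhibits both the minimal element and a pair of incomparable elements:

```python
from itertools import combinations

# Sketch: the power set of A = {a, b, c}, ordered by inclusion.
A = {"a", "b", "c"}
power_set = [frozenset(c) for r in range(len(A) + 1)
             for c in combinations(A, r)]
assert len(power_set) == 8

leq = lambda x, y: x <= y  # for frozensets, <= is set inclusion

# The empty set is below every subset of A: it is the minimal element.
assert all(leq(frozenset(), x) for x in power_set)

# {a} and {b} are incomparable, so this is not a total order.
assert not leq(frozenset({"a"}), frozenset({"b"}))
assert not leq(frozenset({"b"}), frozenset({"a"}))
```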
Food for thought: how would we create a partial order for $\mathbb C$? If we were to just order by magnitude, that certainly wouldn't result in a partial order, as antisymmetry would be violated. On the other hand, if we were to incorporate the angle as well, we would probably get somewhere. My guess is that we could view it as ever-expanding concentric circles in the complex plane: first we order by magnitude, then, among numbers of the same magnitude, we order by the angle, starting at 0 degrees and moving counterclockwise. Another option would be to order first by the $a$ in $a + bi$, then by the $b$ (so $7 + 3i \leq 7 + 4i \leq 8 - 10i$, etc). But this is all conjecture.
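To make the second option above concrete, here is a sketch of the lexicographic order on $a + bi$ (compare real parts first, then imaginary parts). This is only my attempt to flesh out the conjecture, not anything stated in class; as it happens, this order is total, so in particular it is a partial order.

```python
# Sketch: lexicographic order on complex numbers, comparing the real part
# first and breaking ties with the imaginary part.

def lex_leq(z: complex, w: complex) -> bool:
    if z.real != w.real:
        return z.real < w.real
    return z.imag <= w.imag

# The example from the text: 7 + 3i <= 7 + 4i <= 8 - 10i.
assert lex_leq(7 + 3j, 7 + 4j)
assert lex_leq(7 + 4j, 8 - 10j)

# Unlike ordering by magnitude alone, antisymmetry is not violated here:
# i and -i have equal magnitude but are not related in both directions.
assert not (lex_leq(1j, -1j) and lex_leq(-1j, 1j))
```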
4.3 Induction
Now we finally get to the concept of induction. This is a technique of proof that can be used on any set with a well-founded order. First, we define well-foundedness:
A set $S$ (with a partial order) is well-founded if every non-empty subset of $S$ has a minimal element, i.e., an element with nothing strictly smaller than it in that subset. (Note that "minimal", as used here, is not quite the same as "least": "least" would mean the element is less than or equal to everything else in the subset, which is not a requirement. This distinction ensures that non-total orders can be considered.)
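For example, $(\mathbb N, \leq)$ is well-founded: every non-empty subset of $\mathbb N$ has a least (and hence minimal) element. By contrast, $(\mathbb Z, \leq)$ is not well-founded, since $\mathbb Z$ itself is a non-empty subset with no minimal element.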
Next, we define an inductive order: a poset (partially ordered set) $(S, \leq)$ is inductive if, for any predicate¹ $P$, the following holds: whenever $\forall x \in S,\; (\forall y < x,\, P(y)) \to P(x)$, then $P$ holds for everything in the set. Note that not every order has this property! Furthermore, this is all that we require to define an inductive order - no separate discussion of "base cases" is required. (For $\mathbb N$, the base case is hidden in the hypothesis: when $x = 0$ there is no $y < 0$, so $\forall y < 0,\, P(y)$ holds vacuously, and we must establish $P(0)$ outright.)
4.3.1 Equivalence of inductive orders and well-founded sets
Now we get to the crux of this section. An order is inductive if and only if the underlying set is well-founded. We prove this as follows:
($\Leftarrow$) Assume the order is well-founded, and assume that $\forall x \in S$, $(\forall y < x,\, Py) \to Px$ holds. Let $U = \{x \,|\, \neg Px\}$, i.e., the set of things that don't satisfy $P$. If $U$ is non-empty, it must have a minimal element, since the order is well-founded (by assumption). Then we get a standard (but nice) proof by contradiction: if $u$ is a minimal element of $U$, then $\forall y < u$, $y \notin U$, thus $Py$ holds. But then $Pu$ holds as well! This is a contradiction, as $u$ is supposed to be in the set of elements that don't satisfy $P$. So there is no such $u$, meaning $U$ is empty, and $P$ holds everywhere. $\blacksquare$
($\Rightarrow$) Assume that induction holds. We need to prove that the set is well-founded. Suppose $U$ is a subset of $S$ that has no minimal element, and define a predicate $P$ by: $Px$ holds exactly when $x \notin U$. We claim that $\forall x$, $(\forall y < x,\, Py) \to Px$. Why? Suppose every $y < x$ satisfies $Py$, i.e., no $y < x$ lies in $U$; if $Px$ failed, then $x$ would be in $U$ with nothing of $U$ below it, making $x$ a minimal element of $U$, which we assumed does not exist. So by induction, $P$ holds everywhere, i.e., nothing is in $U$, so $U = \varnothing$. Hence the only subset with no minimal element is the empty set, and $S$ is well-founded. $\blacksquare$
4.4 Supplemental Notes
Prakash Panangaden provides supplemental notes on well-foundedness and induction, available on his website.
¹ Just a way of defining a subset of $S$: a predicate can be thought of as a property that members of $S$ may or may not possess, and it is determined by the subset of members that satisfy it.