THESE NOTES HAVE NOT BEEN VERIFIED BY THE INSTRUCTOR. USE AT YOUR OWN RISK.
Kripke automata, Stone duality, Brzozowski's algorithm. Detailed notes on duality are available here.
1 Sidebar on vector spaces
What follows is just an introduction to duality via vector spaces, so don't worry too much about it.
Let $V$ be a vector space over $\mathbb R$ of finite dimension $n$, and let $\lambda$ be a linear map $V \to \mathbb R$ (so $\lambda(\alpha u + \beta v) = \alpha \lambda (u) + \beta \lambda (v)$).
The collection of all such linear maps $V \to \mathbb R$ is itself an $n$-dimensional vector space, which we can call $V^*$. Let $v_1, \ldots, v_n$ be a basis for $V$. Then a basis for $V^*$ is given by $\sigma_1, \ldots, \sigma_n$ where
$$\sigma_i(v_j) = \begin{cases} 0 & \text{if } i \neq j \\ 1 & \text{if } i = j \end{cases}$$
$V$ and $V^*$ are isomorphic, but mirror images, if you will. Specifically, if $M$ is a linear function $V \to U$, then $M^*$ is a map $U^* \to V^*$. If $\alpha \in U^*$ and $v \in V$, $M^*(\alpha)(v) = \alpha(M(v))$.
So the $^*$ is just the adjoint/transpose operator, really. Also, $V^{**} \simeq V$ (canonical isomorphism).
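The adjoint-as-transpose claim can be checked numerically. The matrix and vectors below are arbitrary, made-up values, just to exercise the defining equation (using NumPy):

```python
import numpy as np

# An arbitrary linear map M: V -> U with dim V = 3, dim U = 2
# (all values made up for illustration).
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

# A functional alpha in U* and a vector v in V.
alpha = np.array([2.0, -1.0])
v = np.array([1.0, 1.0, 1.0])

# The dual map M* sends alpha to alpha ∘ M, which is just the
# transpose of M acting on alpha.
M_star_alpha = M.T @ alpha

# Defining property: M*(alpha)(v) = alpha(M(v)).
assert np.isclose(M_star_alpha @ v, alpha @ (M @ v))
```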
2 Stone duality for automata theory
2.1 Kripke automata
Consider a Kripke automaton (or Moore machine): there are buttons, and lights, which can be on or off. The states are characterised by exactly which lights are on, and transitions between states are governed by the buttons. Formally, we have $(S, A, O, \delta, \gamma)$, where:
 $S$: states
 $A$: alphabet, or actions
 $O$: observations (lights)
 $\delta$: transition function, $S \times A \to S$
 $\gamma$: the possible observations for each state, $S \to \mathcal P(O)$
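Such a machine fits comfortably in a couple of dictionaries. A minimal sketch, where the states, buttons, and lights are all invented for illustration:

```python
# A toy Kripke automaton (all names invented): two states,
# buttons "a" and "b", and lights "blue" and "red".
S = {"s0", "s1"}
A = {"a", "b"}
O = {"blue", "red"}

# delta: S x A -> S, the transition function.
delta = {("s0", "a"): "s1", ("s0", "b"): "s0",
         ("s1", "a"): "s0", ("s1", "b"): "s1"}

# gamma: S -> P(O), the lights that are on in each state.
gamma = {"s0": {"blue"}, "s1": {"red"}}
```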
2.1.1 Introduction to tests
Tests/experiments that can be done:
 Any $w \in O$ is a test (e.g., "is the blue light on")
 If $t$ is a test and $a\in A$ then $at$ is a test (e.g., "press this button then check if the blue light is on")
We can then have tests like $abaccB$ where $a, b, c$ are all actions and $B$ is an observation.
Let $s \in S$ and let $t$ be a test. We can define what it means to satisfy a test inductively. If $t$ consists of just an observation, then $s \models t$ ($s$ satisfies $t$) if $t \in \gamma(s)$, i.e., if the observation $t$ is a possibility from the state $s$. Now consider $at$ for $a \in A$: $s \models at$ if $\delta(s, a) \models t$. More generally, if $w$ is a sequence of actions, then $s \models wt$ if $\delta^*(s, w) \models t$, where $\delta^*$ extends $\delta$ to sequences of actions. In other words, we perform all the actions of the test, then check whether the observation holds.
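The inductive definition unrolls into a short loop: follow $\delta^*$ along the action prefix, then check the observation in the resulting state. The two-state machine below is a made-up example:

```python
# A made-up two-state machine to test against.
delta = {("s0", "a"): "s1", ("s0", "b"): "s0",
         ("s1", "a"): "s0", ("s1", "b"): "s1"}
gamma = {"s0": {"blue"}, "s1": {"red"}}

def satisfies(s, actions, obs):
    """s |= wt: follow delta* along the actions w, then check
    whether the observation obs is possible in the final state."""
    for a in actions:
        s = delta[(s, a)]
    return obs in gamma[s]

assert satisfies("s0", "", "blue")    # base case: blue is on in s0
assert satisfies("s0", "a", "red")    # press a, land in s1, red is on
assert not satisfies("s0", "ab", "blue")
```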
Two tests $t_1$ and $t_2$ are equivalent if $\forall s \in S$, $s \models t_1 \iff s \models t_2$.
This defines an equivalence relation on tests. There are of course infinitely many tests^{1}, but the number of equivalence classes is bounded above by $2^{|S|}$, since any test is fully specified by the subset of states that satisfy it. Formally, we define $⟦t⟧ = \{s \mid s \models t\}$. Then $t_1 \sim t_2 \iff ⟦t_1⟧ = ⟦t_2⟧$.
In addition, we define an equivalence relation on states as follows: $s_1 \equiv s_2$ if for all tests $t$, $s_1 \models t \iff s_2 \models t$ (so two states are equivalent if there is no observable difference in behaviour between them that can be revealed by tests).
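On a small machine the test equivalence classes can be computed by brute force: bucket tests by their denotation $⟦t⟧$. The machine and the bound on test length below are made-up examples; the point is that the number of buckets never exceeds $2^{|S|}$ however long the tests get.

```python
from itertools import product

# Made-up machine, as before.
S = {"s0", "s1"}
A = ["a", "b"]
O = ["blue", "red"]
delta = {("s0", "a"): "s1", ("s0", "b"): "s0",
         ("s1", "a"): "s0", ("s1", "b"): "s1"}
gamma = {"s0": {"blue"}, "s1": {"red"}}

def denotation(actions, obs):
    """⟦t⟧ = {s | s |= t} for the test t = actions followed by obs."""
    sat = set()
    for s in S:
        cur = s
        for a in actions:
            cur = delta[(cur, a)]
        if obs in gamma[cur]:
            sat.add(s)
    return frozenset(sat)

# Bucket all tests with up to 3 actions by their denotation.
classes = {}
for n in range(4):
    for w in product(A, repeat=n):
        for o in O:
            classes.setdefault(denotation(w, o), []).append((w, o))

# At most 2^|S| buckets, no matter how many tests we enumerate.
assert len(classes) <= 2 ** len(S)
```

On this particular machine every test denotes either $\{s_0\}$ or $\{s_1\}$, so only two classes appear.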
2.1.2 Creating the dual machine
Now, since there are finitely many equivalence classes for the equivalence relation on tests, we can create a dual machine which uses these equivalence classes for tests as states, and states as observations. We define this machine as $K = (S', A, O', \delta', \gamma')$:
 $S' = \text{Tests} / \sim$ (i.e., the equivalence classes for tests mentioned earlier)
 $O' = S$
 $\delta'([t], a) = [at]$^{2}
 $\gamma'([t]) = \{s \mid s \models t \} = ⟦t⟧$ as we defined earlier.
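Concretely, the dual machine can be built by representing each class $[t]$ by its denotation $⟦t⟧$: start from the denotations of the bare observations and close under the dual transitions, since $⟦at⟧$ is the preimage of $⟦t⟧$ under the action $a$. The machine below is the same made-up example:

```python
# Made-up machine, as before.
S = {"s0", "s1"}
A = ["a", "b"]
O = ["blue", "red"]
delta = {("s0", "a"): "s1", ("s0", "b"): "s0",
         ("s1", "a"): "s0", ("s1", "b"): "s1"}
gamma = {"s0": {"blue"}, "s1": {"red"}}

def dual_step(block, a):
    """delta'([t], a) = [at]: s |= at iff delta(s, a) |= t, so ⟦at⟧
    is the preimage of the block ⟦t⟧ under the action a."""
    return frozenset(s for s in S if delta[(s, a)] in block)

# Dual states: denotations of the bare observations, closed under
# dual transitions.  gamma'([t]) is just the block ⟦t⟧ itself.
dual_states = {frozenset(s for s in S if o in gamma[s]) for o in O}
frontier = list(dual_states)
while frontier:
    block = frontier.pop()
    for a in A:
        nxt = dual_step(block, a)
        if nxt not in dual_states:
            dual_states.add(nxt)
            frontier.append(nxt)
```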
This machine is similar to the original in many ways, except that it's backwards, in a sense.
Now consider what happens if you take the dual of the dual machine. Do you get the original machine? Not quite: you get the minimal machine. Why is this? Recall our equivalence relation $\equiv$ on states: if there are no distinct states $s \neq s'$ with $s \equiv s'$, then the machine is minimal. But that's exactly what results from taking the dual twice!
2.1.3 Brzozowski's algorithm
We can formalise this method of obtaining the minimal machine as an algorithm, known as Brzozowski's algorithm (a mouthful of a name if I ever saw one). It works as follows:
 Take a DFA and reverse the arrows.
 Make the initial state final, and the final states initial. This typically produces an NFA (reversing the arrows can create nondeterministic transitions and multiple initial states).
 Determinise the NFA and take the reachable subset.
 Repeat steps 1–3 on the reachable subset.
 Rejoice, for you have created a minimal machine.
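The steps above can be sketched as one reverse-and-determinise pass applied twice. The DFA at the bottom is a made-up example (a three-state machine for $a^+$ with a redundant state):

```python
def reverse_determinise(states, alphabet, delta, start, finals):
    """Reverse all arrows, swap initial and final states, then run
    the subset construction, keeping only reachable subsets."""
    new_start = frozenset(finals)
    subsets = {new_start}
    new_delta = {}
    frontier = [new_start]
    while frontier:
        block = frontier.pop()
        for a in alphabet:
            # In the reversed NFA, the a-successors of block are the
            # a-predecessors in the original machine.
            nxt = frozenset(s for s in states if delta[(s, a)] in block)
            new_delta[(block, a)] = nxt
            if nxt not in subsets:
                subsets.add(nxt)
                frontier.append(nxt)
    new_finals = {b for b in subsets if start in b}
    return subsets, new_delta, new_start, new_finals

def brzozowski(states, alphabet, delta, start, finals):
    """Reverse-determinise twice to obtain the minimal DFA."""
    s1, d1, q1, f1 = reverse_determinise(states, alphabet, delta,
                                         start, finals)
    return reverse_determinise(s1, alphabet, d1, q1, f1)

# A made-up 3-state DFA for the language a+ (one redundant state).
states = {"q0", "q1", "q2"}
delta = {("q0", "a"): "q1", ("q1", "a"): "q2", ("q2", "a"): "q2"}
min_states, min_delta, min_start, min_finals = brzozowski(
    states, ["a"], delta, "q0", {"q1", "q2"})
assert len(min_states) == 2   # the minimal DFA for a+ has two states
```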
Unfortunately, this algorithm isn't entirely practical: the running time is (potentially) exponential, which is much worse than other algorithms that accomplish the same thing, such as Hopcroft's $O(n \log n)$ minimisation algorithm. However, the median running time is pretty good.

You could just keep prepending actions, for instance. ↩

We should probably show that this is well-defined. To do so, we'd have to show that when $t_1 \sim t_2$, then $at_1 \sim at_2$. Well: $\forall s$, $s \models at_1 \iff \delta(s, a) \models t_1 \iff \delta(s, a) \models t_2 \iff s \models at_2$. $\blacksquare$ ↩