Friday, October 28, 2011 CC-BY-NC
More Peano arithmetic

Maintainer: admin

1 Proofs in Peano arithmetic

From last class, we have:

PA $\vdash \forall x (x = 0 \lor \exists y (x = Sy))$
PA $\vdash (\forall x \forall y \forall z(x + (y+z) = (x+y) + z))$ (associativity of addition)
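
These facts can also be checked mechanically. Throughout these notes I'll add small, optional Lean 4 sketches that mirror the proofs. They use a hand-rolled unary type `N` with `add` given exactly by the recursion equations A1 and A2; all of the names (`PA`, `N`, `add`, and the theorem names) are mine, not the lecture's, and later snippets reuse the definitions from earlier ones. Here is the setup, plus the two facts recalled above:

```lean
namespace PA

-- Unary naturals: zero and successor.
inductive N where
  | zero : N
  | succ : N → N
open N

-- Addition, defined by the recursion equations of PA.
def add : N → N → N
  | x, zero   => x                -- A1: x + 0 = x
  | x, succ y => succ (add x y)   -- A2: x + Sy = S(x + y)

-- Every number is zero or a successor.
theorem zero_or_succ (x : N) : x = zero ∨ ∃ y, x = succ y := by
  cases x with
  | zero   => exact Or.inl rfl
  | succ y => exact Or.inr ⟨y, rfl⟩

-- Associativity of addition, by induction on z.
theorem add_assoc (x y z : N) : add x (add y z) = add (add x y) z := by
  induction z with
  | zero      => rfl                  -- A1 on both sides
  | succ z ih => simp only [add, ih]  -- A2 on both sides, then the IH

end PA
```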

1.1 Proof that 0 + x = x

Even the axioms can be proven in Peano arithmetic. Axiom A1, for instance, can be proven in two lines - one a use of a tautology, the other a use of monotonicity. We can also prove the following similar theorem:

$$\forall x(0 + x = x)$$

Basis step:

$0 + 0 = 0$, by A1

Induction step:

Assume $0 + x = x$.
$0 + Sx = S(0+x)$ by A2
$S(0 + x) = Sx$ by the induction hypothesis and substitution
$0 + Sx = Sx$ by the transitivity of equality $\blacksquare$

From this, we can also conclude that $x + 0 = 0 + x$.

N.B. $x + S0 = S(x + 0) = Sx$ by A2 and A1; that is, PA $\vdash \forall x (x + S0 = Sx)$.
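
Continuing the Lean sketch (names are mine), the same induction and the N.B. look like this:

```lean
namespace PA
open N

-- 0 + x = x, by induction on x.
theorem zero_add (x : N) : add zero x = x := by
  induction x with
  | zero      => rfl                  -- basis: 0 + 0 = 0, by A1
  | succ x ih => simp only [add, ih]  -- 0 + Sx = S(0 + x) = Sx, by A2 and the IH

-- The N.B.: x + S0 = S(x + 0) = Sx, by unfolding A2 and then A1.
theorem add_one (x : N) : add x (succ zero) = succ x := by
  simp only [add]

end PA
```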

1.2 Proof that S0 + x = Sx

$$\text{PA } \vdash \forall x (S0 + x = Sx)$$

Basis:

$S0 + 0 = S0$ by A1

Induction step:

Assume $S0 + x = Sx$ (induction hypothesis)
Then $S0 + Sx = S(S0 + x)$ by A2, which equals $SSx$ by the induction hypothesis $\blacksquare$
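
In the Lean sketch (my naming, same induction):

```lean
namespace PA
open N

-- S0 + x = Sx, by induction on x.
theorem one_add (x : N) : add (succ zero) x = succ x := by
  induction x with
  | zero      => rfl                  -- basis: S0 + 0 = S0, by A1
  | succ x ih => simp only [add, ih]  -- S0 + Sx = S(S0 + x) = SSx

end PA
```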

1.3 Proof that addition is commutative

$$\text{PA } \vdash \forall x \forall y (x + y = y + x)$$

This can be proved by induction on y. Basis:

$x + 0 = x = 0 + x$ (from above)

Induction step:

Assume $x + y = y + x$.
Then, $x + Sy = S(x + y)$ by A2.
Also, $Sy + x = (y + S0) + x$ (since $Sy = y + S0$, as noted above),
which is equal to $y + (S0 + x)$ (due to the associativity of addition),
which is equal to $y + Sx = S(y + x)$ (since $S0 + x = Sx$ from above, and by A2).

So by the induction hypothesis and substitution, we get $S(x + y) = S(y + x)$, so indeed $x + Sy = Sy + x$. $\blacksquare$
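
In the Lean sketch the route is slightly different: instead of detouring through $S0 + x$ and associativity, it proves $Sy + x = S(y + x)$ directly. That helper (`succ_add`) is my own shortcut, not the lecture's chain:

```lean
namespace PA
open N

-- Sy + x = S(y + x), by induction on x.
theorem succ_add (y x : N) : add (succ y) x = succ (add y x) := by
  induction x with
  | zero      => rfl
  | succ x ih => simp only [add, ih]

-- Commutativity of addition, by induction on y.
theorem add_comm (x y : N) : add x y = add y x := by
  induction y with
  | zero      => simp only [add, zero_add]     -- x + 0 = x = 0 + x
  | succ y ih => simp only [add, succ_add, ih] -- x + Sy = S(x+y) = S(y+x) = Sy + x

end PA
```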

1.4 Proof that 0 * x = 0

$$\text{PA } \vdash \forall x (0 \cdot x = 0)$$

(This is like M1, but the other way around.)

Basis:

$0 \cdot 0 = 0$ by M1

Induction step:

Assume $0 \cdot x = 0$.
Then, $0 \cdot Sx = 0 \cdot x + 0$ by M2
$= 0 + 0$ by the induction hypothesis
$= 0$ by A1 $\blacksquare$
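
The Lean version, with multiplication defined by M1 and M2 (again, all names mine):

```lean
namespace PA
open N

-- Multiplication, defined by the recursion equations of PA.
def mul : N → N → N
  | _, zero   => zero              -- M1: x · 0 = 0
  | x, succ y => add (mul x y) x   -- M2: x · Sy = x·y + x

-- 0 · x = 0, by induction on x.
theorem zero_mul (x : N) : mul zero x = zero := by
  induction x with
  | zero      => rfl                       -- basis: 0 · 0 = 0, by M1
  | succ x ih => simp only [mul, ih, add]  -- 0 · Sx = 0·x + 0 = 0 + 0 = 0

end PA
```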

1.5 Proof that S0 * x = x

Basis:

$S0 \cdot 0 = 0$ by M1

Induction step:

Assume $S0 \cdot x = x$
Then, $S0 \cdot Sx = (S0 \cdot x) + S0$ by M2
$= x + S0$ by the induction hypothesis
$ = Sx$ (previously proved, in the N.B. above) $\blacksquare$
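
In the Lean sketch:

```lean
namespace PA
open N

-- S0 · x = x, by induction on x.
theorem one_mul (x : N) : mul (succ zero) x = x := by
  induction x with
  | zero      => rfl                       -- basis: S0 · 0 = 0, by M1
  | succ x ih => simp only [mul, ih, add]  -- S0 · Sx = S0·x + S0 = x + S0 = Sx

end PA
```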

1.6 Proof of the distributivity of multiplication

$$\text{PA } \vdash \forall x \forall y \forall z(x \cdot (y + z) = (x \cdot y) + (x \cdot z))$$

This can be proved by induction on $z$.[1] Incidentally, if you have a commutative law, one distributive law follows from the other. But it's actually harder to prove commutativity than to prove either distributive law, so we just prove both distributive laws directly.

Basis:

$x \cdot (y + 0) = x \cdot y$ by A1 and substitution
and $(x \cdot y) + (x \cdot 0) = (x \cdot y) + 0$ by M1
$= x \cdot y$ by A1

Induction step:

Assume $x \cdot (y + z) = (x \cdot y) + (x \cdot z)$.
Then, $x \cdot (y + Sz) = x \cdot (S(y + z))$ by A2
$= x \cdot (y + z) + x$ by M2
$= (x \cdot y + x \cdot z) + x$ by the induction hypothesis
$= x \cdot y + (x \cdot z + x)$ by the associativity of addition
$=x \cdot y + x \cdot Sz$ by M2 $\blacksquare$
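
In the Lean sketch, the chain above is handled by unfolding A2/M2, the induction hypothesis, and associativity:

```lean
namespace PA
open N

-- x · (y + z) = x·y + x·z, by induction on z.
theorem left_distrib (x y z : N) :
    mul x (add y z) = add (mul x y) (mul x z) := by
  induction z with
  | zero      => rfl   -- basis: both sides reduce to x·y by A1 and M1
  | succ z ih =>
      -- x·(y+Sz) = x·S(y+z) = x·(y+z) + x = (x·y + x·z) + x = x·y + (x·z + x)
      simp only [add, mul, ih, add_assoc]

end PA
```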

1.7 Proof of the other distributivity law

$$\text{PA } \vdash \forall x \forall y \forall z ((x + y) \cdot z = x \cdot z + y \cdot z)$$

Basis:

$(x + y) \cdot 0 = 0$ by M1
$x \cdot 0 + y \cdot 0 = 0 + 0 = 0$ by M1 and A1

Induction step:

Assume $(x + y) \cdot z = x \cdot z + y \cdot z$.
Then $(x + y) \cdot Sz = (x + y) \cdot z + (x + y)$ by M2
$=(x \cdot z + y \cdot z) + (x + y)$ by the induction hypothesis
$=((x \cdot z + y \cdot z) + x) + y$ by the associativity of addition
$= (x \cdot z + (y \cdot z + x)) + y$ by the associativity of addition
$=(x \cdot z + (x + y \cdot z)) + y$ by the commutativity of addition
$=((x \cdot z + x) + y \cdot z) + y$ by the associativity of addition
$=(x \cdot z + x) + (y \cdot z + y)$ by the associativity of addition
$=x \cdot Sz + y \cdot Sz$ by M2 $\blacksquare$
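
In the Lean sketch, the associativity/commutativity shuffling in the middle is packaged into one helper (`add_add_add` is my name for it), after which the induction is short:

```lean
namespace PA
open N

-- (a + b) + (c + d) = (a + c) + (b + d): the rearrangement used above.
theorem add_add_add (a b c d : N) :
    add (add a b) (add c d) = add (add a c) (add b d) := by
  rw [add_assoc (add a b) c d, ← add_assoc a b c, add_comm b c,
      add_assoc a c b, ← add_assoc (add a c) b d]

-- (x + y) · z = x·z + y·z, by induction on z.
theorem right_distrib (x y z : N) :
    mul (add x y) z = add (mul x z) (mul y z) := by
  induction z with
  | zero      => rfl   -- basis: both sides reduce to 0 by M1 and A1
  | succ z ih =>
      simp only [mul, ih]                       -- (x+y)·Sz = (x·z + y·z) + (x + y)
      exact add_add_add (mul x z) (mul y z) x y

end PA
```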

1.8 Proof of the associativity of multiplication

$$\text{PA } \vdash \forall x \forall y \forall z (x \cdot (y \cdot z) = (x \cdot y) \cdot z)$$

Basis:

$x \cdot (y \cdot 0) = x \cdot 0 = 0$ by M1, twice
$(x \cdot y) \cdot 0 = 0$ by M1 (once)

Induction step:

Assume $x \cdot (y \cdot z) = (x \cdot y) \cdot z$.
Then $x \cdot (y \cdot Sz) = x \cdot (y \cdot z + y)$ by M2
$=x \cdot (y \cdot z) + x \cdot y$ by distributivity
$=(x \cdot y) \cdot z + (x\cdot y)$ by the induction hypothesis
$=(x \cdot y) \cdot Sz$ by M2 $\blacksquare$
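
In the Lean sketch:

```lean
namespace PA
open N

-- x · (y · z) = (x · y) · z, by induction on z.
theorem mul_assoc (x y z : N) : mul x (mul y z) = mul (mul x y) z := by
  induction z with
  | zero      => rfl   -- basis: both sides reduce to 0 by M1
  | succ z ih =>
      -- x·(y·Sz) = x·(y·z + y) = x·(y·z) + x·y = (x·y)·z + x·y = (x·y)·Sz
      simp only [mul, left_distrib, ih]

end PA
```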

1.9 Proof of the commutativity of multiplication

$$\text{PA } \vdash \forall x \forall y (x \cdot y = y \cdot x)$$

Basis:

$x \cdot 0 = 0 = 0 \cdot x$, by M1 and the theorem $0 \cdot x = 0$ proved above

Induction step:

Assume $x \cdot y = y \cdot x$.
Then $x \cdot Sy = x \cdot y + x$ by M2
$= y \cdot x + x$ by the induction hypothesis
$=y \cdot x + S0 \cdot x$ (previously proved)
$=(y+S0) \cdot x$ by distributivity
$=Sy \cdot x$ (previously proved) $\blacksquare$
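
In the Lean sketch, the same chain goes through `add_one`, `one_mul`, and the second distributive law:

```lean
namespace PA
open N

-- x · y = y · x, by induction on y.
theorem mul_comm (x y : N) : mul x y = mul y x := by
  induction y with
  | zero      => simp only [mul, zero_mul]  -- x·0 = 0 = 0·x
  | succ y ih =>
      -- x·Sy = x·y + x = y·x + x = y·x + S0·x = (y + S0)·x = Sy·x
      simp only [mul, ih]
      rw [← add_one y, right_distrib, one_mul]

end PA
```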

1.10 Some new symbols

Let's throw in a new symbol: $x * y$, which means $x^y$ (exponentiation). Also, $\Pi(x)$, which abbreviates "$x$ is prime" (i.e. $S0 < x \land \forall y \forall z (y \cdot z = x \to (y = S0 \lor z = S0))$). And recall that $x \mid y$ abbreviates $\exists z (x \cdot z = y)$ (which is a partial order: transitive due to the associativity of multiplication, and antisymmetric only for the positive integers).
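
These abbreviations can be added to the Lean sketch too. Two caveats: the notes don't restate the ordering axioms, so the definition of `lt` below is my guess at the "axiom L" characterization used later ($x < y$ iff $x + w = y$ for some $w \neq 0$), and the recursion equations for `pow` are likewise my own, since no axioms for $*$ are given here.

```lean
namespace PA
open N

-- x < y, in the axiom-L style used below (an assumption on my part).
def lt (x y : N) : Prop := ∃ w, w ≠ zero ∧ add x w = y

-- x ∣ y: some z with x·z = y.
def divides (x y : N) : Prop := ∃ z, mul x z = y

-- Π(x): x is prime.
def isPrime (x : N) : Prop :=
  lt (succ zero) x ∧ ∀ y z, mul y z = x → y = succ zero ∨ z = succ zero

-- x * y in the sense of the notes, i.e. x^y (my own recursion equations).
def pow : N → N → N
  | _, zero   => succ zero          -- x^0 = S0
  | x, succ y => mul (pow x y) x    -- x^(Sy) = x^y · x

end PA
```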

1.11 Strong induction

$$\text{PA } \vdash \forall x (S0 < x \to \exists y (y \mid x \land \Pi(y)))$$

We can't really prove this by regular induction, because you can't figure out the factors of $Sx$ from the factors of $x$.

So instead, we use strong induction, which makes a stronger assumption than regular induction. Here is the strong induction schema (for a formula $\varphi(x, \bar{y})$):

$$\forall \bar{y}\, (\forall x (\forall z(z < x \to \varphi(z, \bar{y})) \to \varphi(x, \bar{y})) \to \forall x\, \varphi(x, \bar{y}))$$

Basically, when proving the statement for $x$, you get to assume that it is true not just of the predecessor of $x$, but of all the numbers below $x$.

1.11.1 A proof by strong induction

In this case, the theorem $\varphi(x)$ that we want to prove is as follows:

$$S0 < x \to \exists y(\Pi(y) \land y \mid x)$$

Our induction hypothesis is $\forall z (z < x \to \varphi(z))$ (there are no side parameters $\bar{y}$ here).

From this, we want to be able to conclude that $S0 < x \to \exists y(\Pi(y) \land y \mid x)$.

We can assume that $S0 < x$. If $\Pi(x)$, then since $x \mid x$, $\exists y (\Pi(y) \land y \mid x)$. Else, if $\neg \Pi(x)$, then there exist $u, v$ such that $x = u \cdot v$ with $S0 < u \land S0 < v$.

Claim: $u < x \land v < x$. Why? Well, since $S0 < u$, we have $S0 + w = u$ for some $w \neq 0$ (axiom L).

So, $x = u \cdot v = (S0 + w) \cdot v = S0 \cdot v + w \cdot v = v + w \cdot v$.

It remains to check that $w \cdot v \neq 0$. Since $S0 < v$, say $S0 + t = v$, so $v = St$ (as $S0 + t = St$). Since $w \neq 0$, $w = Sk$ for some $k$. Then $w \cdot v = w \cdot St = w \cdot t + w = w \cdot t + Sk = S(w \cdot t + k) \neq 0$, since a successor is never $0$.

So $v + w \cdot v = x$ with $w \cdot v \neq 0$, and by axiom L, $v < x$ (similarly, $u < x$).

By the induction hypothesis, there is a $y$ such that $\Pi(y)$ and $y \mid u$. So $y \cdot z = u$ for some $z$. So $(y \cdot z) \cdot v = u \cdot v = x$. So $y \cdot (z \cdot v) = x$, so $y\mid x$.

By strong induction, $\forall x \varphi (x)$. $\blacksquare$

Next, we justify strong induction by deriving it from ordinary induction. First, a lemma.

Lemma: $\forall z \forall x (z < Sx \iff (z < x \lor z = x))$.

Proof: say $z < Sx$. Then, by L, $z + w = Sx$ for some $w \neq 0$. So $w = Su$ for some $u$. So $z + Su = Sx$, so $S(z + u) = Sx$ (by A2), and $z + u = x$ (by S2). If $u \neq 0$, then by L, $z < x$. If $u = 0$, then by A1, $z = x$.

Now for the other direction. Note that $x < Sx$, since $x + S0 = Sx$ and $S0 \neq 0$. So if $z < x$, then $z < Sx$ by the transitivity of $<$;[2] and if $z = x$, then $z < Sx$. $\blacksquare$
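
Over Lean's built-in `Nat`, this lemma is a one-liner for the `omega` decision procedure (assuming a reasonably recent toolchain where `omega` is available without extra imports):

```lean
-- z < Sx ↔ z < x ∨ z = x, over the built-in Nat.
example (z x : Nat) : z < x + 1 ↔ z < x ∨ z = x := by omega
```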

With the lemma in hand, here is the derivation of strong induction from ordinary induction.

Given $\varphi(x, \bar{y})$, let $\Psi(x, \bar{y})$ be $\forall z(z < x \to \varphi(z, \bar{y}))$. Assuming $\forall x (\forall z(z < x \to \varphi(z, \bar{y})) \to \varphi(x, \bar{y}))$, we will show $\Psi(0, \bar{y}) \land \forall x (\Psi(x, \bar{y}) \to \Psi(Sx, \bar{y}))$. Then, by ordinary induction, we get $\forall x \Psi(x, \bar{y})$, from which $\forall x \varphi(x, \bar{y})$ follows.

$\Psi(0, \bar{y})$ says $\forall z (z < 0 \to \varphi(z, \bar{y}))$, which holds vacuously, since $z < 0$ is always false. So $\Psi(0, \bar{y})$ is true.

Now suppose $\Psi(x, \bar{y})$, i.e. $\forall z(z < x \to \varphi(z, \bar{y}))$. Using our assumption, we get $\varphi(x, \bar{y})$.

By our lemma, $z < Sx \iff z < x \lor z = x$.

But we have both $\forall z(z < x \to \varphi(z, \bar{y}))$ and $\varphi(x, \bar{y})$.

So $\forall z(z < Sx \to \varphi(z, \bar{y}))$, i.e. $\Psi(Sx, \bar{y})$.

By ordinary induction, then, $\forall x \Psi(x, \bar{y})$. In particular, for any $x$, $\Psi(Sx, \bar{y})$ holds, and since $x < Sx$, this gives $\varphi(x, \bar{y})$. So $\varphi(x, \bar{y})$ is true for all $x$. $\blacksquare$
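
Here is the same derivation over Lean's built-in `Nat` (not the custom `N`, since we never defined `<` for it): $\Psi$ becomes the inner `∀ z, z < x → φ z`, ordinary induction is the `induction` tactic, and the small arithmetic facts are discharged with `omega` and two core lemmas (`Nat.not_lt_zero`, `Nat.lt_succ_self`).

```lean
-- Strong induction from ordinary induction, via Ψ x := ∀ z < x, φ z.
theorem strong_induction {φ : Nat → Prop}
    (h : ∀ x, (∀ z, z < x → φ z) → φ x) : ∀ x, φ x := by
  -- Step 1: Ψ holds of every x, by ordinary induction on x.
  have psi : ∀ x, ∀ z, z < x → φ z := by
    intro x
    induction x with
    | zero =>
        -- Ψ(0) is vacuously true: z < 0 is impossible.
        intro z hz
        exact (Nat.not_lt_zero z hz).elim
    | succ x ih =>
        -- Ψ(x) → Ψ(Sx): given z < Sx, either z < x (use Ψ(x)),
        -- or z = x (use the assumption h together with Ψ(x)).
        intro z hz
        by_cases hlt : z < x
        · exact ih z hlt
        · have hzx : z = x := by omega
          subst hzx
          exact h _ ih
  -- Step 2: φ x follows from Ψ(Sx), since x < Sx.
  intro x
  exact psi (x + 1) x (Nat.lt_succ_self x)
```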

1.12 Proof that there are infinitely many primes

A very informal proof that there are infinitely many primes, by contradiction.

$$\forall x \exists y (x < y \land \Pi(y))$$

Suppose there's some $x$ such that every prime is $\leq x$.

Then we can list all the primes as $y_1, y_2, \ldots, y_n$. Let $N = y_1 \cdot \ldots \cdot y_n + 1$ (the product of all the primes, plus one). Clearly $N > 1$, so there is a prime $p$ such that $p \mid N$. But $p$ can't be one of the $y_i$'s, because dividing $N$ by any $y_i$ leaves a remainder of 1. So $p$ is a prime missing from the list, contradicting the assumption that the list contained every prime.
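
For comparison, over the built-in `Nat` this statement is available in Mathlib. Unlike the earlier snippets, this one assumes `import Mathlib`, and the lemma name `Nat.exists_infinite_primes` is quoted from memory (its form is $\exists p,\ n \le p \land \Pi(p)$, so we feed it $x + 1$ to get the strict inequality):

```lean
import Mathlib

-- ∀ x, there is a prime strictly above x, via Mathlib's Nat.exists_infinite_primes.
example (x : Nat) : ∃ p, x < p ∧ p.Prime := by
  obtain ⟨p, hp, hprime⟩ := Nat.exists_infinite_primes (x + 1)
  exact ⟨p, by omega, hprime⟩
```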

1.13 Euclid's division algorithm

$$\forall x \forall y(0 < y \to \exists z \exists w(x = y\cdot z + w \land w < y \land \forall u \forall v (x = y \cdot u + v \land v < y \to u = z \land v = w)))$$

Proof of existence, by strong induction on $x$.

If $x < y$, let $z = 0$ and $w = x$.

Assume the induction hypothesis: $\forall t (t < x \to \exists z \exists w(t = y \cdot z + w \land w < y))$. Claim: from this, we get the same for $x$. We may assume $y = x$ or $y < x$, since the case $x < y$ was handled above. If $y = x$, let $z = S0$ and $w = 0$. If $y < x$, then $y + t = x$ for some $t > 0$, and $t < x$ because $y \neq 0$. So there are $z$ and $w$ such that $t = y \cdot z + w \land w < y$. Then $x = y + t = y + (y \cdot z + w) = (y \cdot S0 + y \cdot z) + w = y \cdot (S0 + z) + w$ with $w < y$, so we are done.
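
Over the built-in `Nat`, the existence half can be discharged without redoing the strong induction, by leaning on the library's own division and modulus (the core lemma names `Nat.div_add_mod` and `Nat.mod_lt` are quoted from memory):

```lean
-- Existence half of the division theorem: z is x / y and w is x % y.
theorem div_mod_exists (x y : Nat) (hy : 0 < y) :
    ∃ z w, x = y * z + w ∧ w < y :=
  ⟨x / y, x % y, (Nat.div_add_mod x y).symm, Nat.mod_lt x hy⟩
```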

Incidentally, the natural numbers are well-ordered (by the strict, total order $<$), meaning that every non-empty subset of $\mathbb{N}$ has a least element. Not sure why this was mentioned in this lecture, but there you go.

  1. If you have a theorem with multiple variables, you can typically do induction on the variable farthest to the right, e.g. $z$ in $\forall x \forall y \forall z$, because the recursion in the axioms (A2, M2) is on the rightmost argument.

  2. We'll have to prove this, eventually.