
Natural proof system (propositional logic)

From Learning Logic for Computer Science

A natural (formal) proof is a proof in a proof system which tries to mimic how mathematicians argue when they prove theorems.

Introduction

When theorems are proved, the setup is typically as follows. Given are a set of premises and a goal which hopefully is a correct logical consequence of the premises. The prover then uses (simple) logical arguments to derive conclusions from the premises. The new conclusions are used to draw further conclusions, again with (simple) logical arguments. And so on and so forth. Hopefully, the original goal will be among the conclusions drawn. Some of the arguments may use (temporary) assumptions to derive, for instance, contradictions in indirect arguments.

Here is an example of how this might go. Suppose the only premise we start from is $X_0 \rightarrow X_1$. Then we may want to prove $X_0 \vee X_2 \rightarrow X_1 \vee X_2$ (which is indeed a logical consequence of the premise).

A simple logical argument is the following: if one wants to prove $\varphi \rightarrow \psi$, then it is sufficient to prove $\psi$ under the assumption $\varphi$. So in our case, we may make the assumption that $X_0 \vee X_2$ holds and try to derive that $X_1 \vee X_2$ holds.

Now do we have a logical argument that tells us how we can exploit a disjunction like $\varphi \vee \psi$ (in our case, the disjunction is $X_0 \vee X_2$) in a proof? A fairly general argument is: if under the assumption $\varphi$ we can prove $\chi$ and under the assumption $\psi$ we can prove $\chi$, then we can derive $\chi$ in general.

So let us assume $X_0$. From the premise, $X_0 \rightarrow X_1$, we conclude $X_1$ (by another simple logical argument). So, in particular, we have $X_1 \vee X_2$ (by yet another simple logical argument). Next, we assume $X_2$. Then, clearly, we can conclude $X_1 \vee X_2$. So we can apply the above logical argument for disjunction and obtain $X_1 \vee X_2$.

The latter is exactly what we needed to complete the proof.

Here is the way we will write down this proof:

\begin{array}{lll} 1.1 & X_0 \vee X_2 & \text{assumption}\\ 1.2.1 & X_0 & \text{assumption}\\ 1.2.2 & X_0 \rightarrow X_1 & \text{premise}\\ 1.2.3 & X_1 & (\rightarrow\text e) \text{ with } 1.2.1 \text{ and } 1.2.2\\ 1.2.4 & X_1 \vee X_2 & (\vee\text i) \text{ with } 1.2.3\\ 1.3.1 & X_2 & \text{assumption}\\ 1.3.2 & X_1 \vee X_2 & (\vee\text i) \text{ with } 1.3.1\\ 1.4 & X_1 \vee X_2 & (\vee\text e) \text{ with } 1.2 \text{ and } 1.3\\ 2 & X_0 \vee X_2 \rightarrow X_1 \vee X_2 & (\rightarrow\text i) \text{ with } 1 \end{array}

The line numbers reflect that we make temporary assumptions and draw conclusions from them. In some sense, these line numbers reflect how the proof is structured into subproofs; a reference such as 1.2 stands for the entire subproof consisting of lines 1.2.1 through 1.2.4, not for a single line. The comments at the end of the lines explain which logical argument is applied to derive the current formula. The single letter "e" stands for "elimination" and the single letter "i" stands for "introduction".
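
As a small sketch, not taken from the book, the following Python snippet shows one way such a nested proof could be represented and printed with hierarchical line numbers. The list layout, the helper name print_proof, and the plain-string formulas are illustrative choices, not part of the formal definitions below.

```python
def print_proof(proof, prefix=""):
    """Print a nested proof; each subproof gets an extra level of numbering."""
    for i, step in enumerate(proof, start=1):
        number = f"{prefix}{i}"
        if isinstance(step, list):              # a nested list is a subproof
            print_proof(step, prefix=number + ".")
        else:                                   # a (formula, justification) pair
            formula, justification = step
            print(f"{number:<7} {formula:<22} {justification}")

# the proof of X0 ∨ X2 → X1 ∨ X2 written down above
example = [
    [
        ("X0 ∨ X2", "assumption"),
        [("X0", "assumption"),
         ("X0 → X1", "premise"),
         ("X1", "(→e) with 1.2.1 and 1.2.2"),
         ("X1 ∨ X2", "(∨i) with 1.2.3")],
        [("X2", "assumption"),
         ("X1 ∨ X2", "(∨i) with 1.3.1")],
        ("X1 ∨ X2", "(∨e) with 1.2 and 1.3"),
    ],
    ("X0 ∨ X2 → X1 ∨ X2", "(→i) with 1"),
]

print_proof(example)
```

Running the snippet reproduces exactly the numbered lines shown above.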

All of this is formalized in what follows.

Proof rules

The proof system we use assumes that formulas are in proof normal form, that is, formulas built using the connectives $\rightarrow$, $\vee$, $\wedge$, and $\neg$ only and with $\bot$ as the only symbol for a logical constant. So $\top$ is not permitted.
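
For instance, assuming the standard semantics of $\top$, the formula $\top \rightarrow X_0$ is not in proof normal form, but it can be replaced by the semantically equivalent formula \begin{align} \neg\bot \rightarrow X_0 \enspace, \end{align} because $\top$ and $\neg\bot$ are satisfied by exactly the same variable assignments.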

For each of these symbols there is a rule to introduce it and one to eliminate it. In some cases, there are two such rules, a left and a right variant.

Here is the full set of rules: \begin{array}{l|c|c} \def\arraystretch{1.5} & \text{introduction} & \text{elimination}\\ \hline \hline \vee & \frac{\varphi_0}{\varphi_0 \vee \varphi_1} \quad (\vee\text{i-l}) \qquad \frac{\varphi_1}{\varphi_0 \vee \varphi_1} \quad (\vee\text{i-r}) & \frac{\varphi_0 \dots \varphi_2 \quad \varphi_1 \dots \varphi_2 \quad \varphi_0 \vee \varphi_1}{\varphi_2} \quad (\vee\text e)\\ \hline \wedge & \frac{\varphi_0 \quad \varphi_1}{\varphi_0 \wedge \varphi_1} \quad (\wedge \text i) & \frac{\varphi_0 \wedge \varphi_1}{\varphi_0} \quad (\wedge \text{e-l}) \qquad \frac{\varphi_0 \wedge \varphi_1}{\varphi_1} \quad (\wedge \text{e-r})\\ \hline \rightarrow & \frac{\varphi_0 \dots \varphi_1}{\varphi_0 \rightarrow \varphi_1} \quad (\rightarrow \text i) & \frac{\varphi_0 \quad \varphi_0 \rightarrow \varphi_1}{\varphi_1} \quad (\rightarrow\text e)\\ \hline \neg & \frac{\varphi_0 \dots \bot}{\neg \varphi_0} \quad (\neg \text i) & \frac{\neg \neg \varphi_0}{\varphi_0} \quad (\neg \text e)\\ \hline \bot & \frac{\varphi_0 \quad \neg \varphi_0}{\bot} \quad (\bot \text i) & \frac{\bot}{\varphi_0} \quad (\bot \text e) \end{array}

Three dots $(\dots)$ indicate a subproof.

The upper part of a proof rule contains its premises, the lower part its conclusion. If one allowed multiple conclusions, there would be exactly two rules per connective; for instance, one could write \begin{align} \frac{\varphi_0}{\varphi_0 \vee \varphi_1 \quad \varphi_1 \vee \varphi_0} \end{align} to capture the two introduction rules for $\vee$.
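
As an illustration, the application of $(\vee\text e)$ in the introductory proof is the instance of the corresponding schema with $\varphi_0 = X_0$, $\varphi_1 = X_2$, and $\varphi_2 = X_1 \vee X_2$: \begin{align} \frac{X_0 \dots X_1 \vee X_2 \qquad X_2 \dots X_1 \vee X_2 \qquad X_0 \vee X_2}{X_1 \vee X_2} \enspace. \end{align}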

Formal proofs

A formal proof is a sequence, possibly nested, of formulas. The set of all proofs is defined inductively. The definition not only specifies what proofs are, it also says what the assumptions of a proof are and what its conclusions are. The notation we use is $P \colon A \mid C$, which means that $P$ is a proof with set of assumptions $A$ and set of conclusions $C$. The formal definition is somewhat more liberal than the informal description above.

Base elements:

  • The empty sequence is a proof with no assumption and no conclusion.

Inductive rules: There is one rule that allows assumptions to be added, and there is one inductive rule for each proof rule.

  • Adding assumptions. Suppose $P \colon A \mid C$ and suppose $\varphi$ is a formula. Then $P, \varphi \colon A \cup \{\varphi\} \mid C$ is a proof.
  • $\vee$-introduction on the left. Suppose $P \colon A \mid C$ and suppose $\varphi_0 \in C$. Then $P \colon A \mid C \cup \{\varphi_0 \vee \varphi_1\}$ is a proof.
  • $\vee$-introduction on the right. Suppose $P \colon A \mid C$ and suppose $\varphi_1 \in C$. Then $P \colon A \mid C \cup \{\varphi_0 \vee \varphi_1\}$ is a proof.
  • $\vee$-elimination. Suppose $P_i \colon A_i \mid C_i$ for $i<3$, $\varphi_0 \vee \varphi_1 \in C_0$, and $\varphi_2 \in C_1 \cap C_2$. Then $$P_0, \langle P_1 \rangle, \langle P_2 \rangle, \varphi_2 \colon A_0 \cup (A_1 \setminus (C_0 \cup \{\varphi_0\})) \cup (A_2 \setminus (C_0 \cup \{\varphi_1\})) \mid C_0 \cup \{\varphi_2\}$$ is a proof.
  • ...

The remaining inductive rules are along the same lines.
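
To make the bookkeeping concrete, here is a minimal sketch, not taken from the book, of how the base element and the first three inductive rules above could be transcribed into Python. The tuple encoding of formulas and all names (Proof, add_assumption, or_intro_left) are illustrative choices.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Proof:
    """A proof P : A | C, i.e. a sequence of lines together with the sets A and C."""
    lines: tuple = ()                      # the (possibly nested) sequence of formulas
    assumptions: frozenset = frozenset()   # the set A
    conclusions: frozenset = frozenset()   # the set C

# base element: the empty sequence, with no assumption and no conclusion
EMPTY = Proof()

def add_assumption(p, phi):
    """Adding assumptions: from P : A | C obtain P, phi : A ∪ {phi} | C."""
    return Proof(p.lines + (phi,), p.assumptions | {phi}, p.conclusions)

def or_intro_left(p, phi0, phi1):
    """∨-introduction on the left: if phi0 ∈ C, then P : A | C ∪ {phi0 ∨ phi1}."""
    assert phi0 in p.conclusions, "phi0 must already be a conclusion of P"
    return Proof(p.lines, p.assumptions, p.conclusions | {("or", phi0, phi1)})

# example: the premise X0 → X1 added as an assumption to the empty proof
p = add_assumption(EMPTY, ("imp", "X0", "X1"))
assert p.assumptions == frozenset({("imp", "X0", "X1")})
```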

Derivability relation and syntactic equivalence

A formula $\psi$ can be derived from a set of formulas $\Phi$ if there is a proof $P \colon A \mid C$ such that $A \subseteq \Phi$ and $\psi \in C$. That is, there is a proof which has $\psi$ as a conclusion and where all its assumptions are among the premises.

Special notation is as follows:

  • $\NatImplies \psi$ instead of $\emptyset \NatImplies \psi$.
  • $\varphi_0, \dots, \varphi_{n-1} \NatImplies \psi$ instead of $\{\varphi_0, \dots, \varphi_{n-1}\} \NatImplies \psi$.
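
With this notation, the proof from the introduction shows, for example, \begin{align} X_0 \rightarrow X_1 \NatImplies X_0 \vee X_2 \rightarrow X_1 \vee X_2 \enspace, \end{align} because that proof has $X_0 \vee X_2 \rightarrow X_1 \vee X_2$ among its conclusions and only the premise $X_0 \rightarrow X_1$ among its assumptions.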

The semantic notion of equivalence between formulas can also be phrased syntactically: formulas $\varphi$ and $\psi$ are syntactically equivalent if $\varphi \NatImplies \psi$ and $\psi \NatImplies \varphi$, which is written as $\varphi \SynEquiv \psi$.

Soundness and completeness

The important fact about the above proof system is that every formula that can be derived from a set of formulas is entailed by it (soundness) and that every formula that is entailed by a set of formulas can be derived from it (completeness).

Theorem Let $\psi$ be a propositional formula and $\Phi$ a set of propositional formulas where $\neg, \vee, \wedge, \rightarrow, \bot$ are the only connectives and constant symbols used.

Soundness.
If $\Phi \NatImplies \psi$, then $\Phi \models \psi$.
Completeness.
If $\Phi \models \psi$, then $\Phi \NatImplies \psi$.

Corollary For all propositional formulas $\varphi$ and $\psi$, \begin{align} \varphi \SemEquiv \psi \qquad \text{ iff } \qquad \varphi \SynEquiv \psi \enspace. \end{align}
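
For example, by soundness, the derivation from the introduction also establishes the entailment \begin{align} X_0 \rightarrow X_1 \models X_0 \vee X_2 \rightarrow X_1 \vee X_2 \enspace, \end{align} and, by completeness, every entailment of this kind can, at least in principle, be established by such a formal proof.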

Exercises

Exercise Show, by giving suitable formal proofs, that $\wedge$ and $\vee$ are commutative:

  1. $\varphi_0 \wedge \varphi_1 \SynEquiv \varphi_1 \wedge \varphi_0$
  2. $\varphi_0 \vee \varphi_1 \SynEquiv \varphi_1 \vee \varphi_0$

Exercise Show, by giving suitable formal proofs:

  1. $X_0 \rightarrow X_1, X_0 \rightarrow \neg X_1 \vdash \neg X_0$.
  2. $\neg X_0 \vee X_1 \SynEquiv X_0 \rightarrow X_1$. (In this way you prove a logical law syntactically.)
  3. $X_0 \vee X_1$, $\neg X_1 \vee X_2 \vdash X_0 \vee X_2$. (In this way you prove a special form of the resolution rule syntactically.)
  4. $\neg X_0 \rightarrow \bot \vdash X_0$. (If you find a proof, then you have learned something about the soundness of the rule

\begin{align} \frac{\neg X_0 \dots \bot}{X_0} \end{align} which we could add to our set of rules. It represents proofs by contradiction.)



Solution

  1. Proof of $X_0 \rightarrow X_1, X_0 \rightarrow \neg X_1 \vdash \neg X_0$: \begin{array}{lll} 1 & X_0 \rightarrow X_1 &\text{premise}\\ 2 & X_0 \rightarrow \neg X_1 &\text{premise}\\ 3.1 & X_0 &\text{assumption}\\ 3.2 & X_1 &(\rightarrow\text e) \text{ with } 3.1 \text{ and } 1\\ 3.3 & \neg X_1 &(\rightarrow\text e) \text{ with } 3.1 \text{ and } 2\\ 3.4 & \bot &(\bot\text i) \text{ with } 3.2 \text{ and } 3.3 \\ 4 & \neg X_0 &(\neg\text i) \text{ with } 3 \\ \end{array}
  2. Proof of $\neg X_0 \vee X_1 \SynEquiv X_0 \rightarrow X_1$, one direction at a time.
    1. Proof of $\neg X_0 \vee X_1 \vdash X_0 \rightarrow X_1$: \begin{array}{lll} 1 & \neg X_0 \vee X_1 & \text{premise}\\ 2.1 & X_0 & \text{assumption}\\ 2.2.1 & X_1 & \text{assumption}\\ 2.3.1 & \neg X_0 & \text{assumption}\\ 2.3.2 & \bot & (\bot \text i) \text{ with } 2.3.1 \text{ and } 2.1\\ 2.3.3 & X_1 & (\bot \text e) \text{ with } 2.3.2\\ 2.4 & X_1 &(\vee\text e) \text{ with } 1, 2.2 \text{ and } 2.3 \\ 3 & X_0 \rightarrow X_1 &(\rightarrow\text i) \text{ with } 2 \end{array}
    2. Proof of $X_0 \rightarrow X_1 \vdash \neg X_0 \vee X_1$: \begin{array}{lll} 1 & X_0 \rightarrow X_1 & \text{premise}\\ 2.1 & \neg(X_0 \vee \neg X_0) & \text{assumption}\\ 2.2.1 & X_0 & \text{assumption}\\ 2.2.2 & X_0 \vee \neg X_0 & (\vee\text{i-l})\text{ with } 2.2.1\\ 2.2.3 & \bot & (\bot\text i)\text{ with } 2.2.2\text{ and } 2.1\\ 2.3 & \neg X_0 & (\neg\text i)\text{ with }2.2 \\ 2.4 & X_0 \vee \neg X_0 & (\vee\text{i-r})\text{ with } 2.3\\ 2.5 & \bot & (\bot\text i)\text{ with } 2.4\text{ and } 2.1\\ 3 & \neg\neg(X_0 \vee \neg X_0) & (\neg\text i)\text{ with } 2\\ 4 & X_0 \vee \neg X_0 & (\neg\text e)\text{ with } 3\\ 5.1 & X_0 &\text{assumption}\\ 5.2 & X_1 &(\rightarrow\text e)\text{ with } 5.1 \text{ and } 1\\ 5.3 & \neg X_0 \vee X_1 &(\vee\text{i-r})\text{ with } 5.2\\ 6.1 & \neg X_0 &\text{assumption}\\ 6.2 & \neg X_0 \vee X_1 &(\vee\text{i-l})\text{ with } 6.1\\ 7 & \neg X_0 \vee X_1 &(\vee\text e) \text{ with } 4, 5, \text{ and } 6 \end{array}
  3. Proof of $X_0 \vee X_1$, $\neg X_1 \vee X_2 \vdash X_0 \vee X_2$: \begin{array}{lll} 1 & X_0 \vee X_1 & \text{premise}\\ 2.1 & X_0 & \text{assumption}\\ 2.2 & X_0 \vee X_2 & (\vee\text{i-l}) \text{ with } 2.1\\ 3.1 & X_1 & \text{assumption}\\ 3.2 & \neg X_1 \vee X_2 & \text{premise}\\ 3.3.1 & \neg X_1 & \text{assumption}\\ 3.3.2 & \bot & (\bot \text i) \text{ with } 3.1 \text{ and } 3.3.1\\ 3.3.3 & X_2 & (\bot \text e) \text{ with } 3.3.2\\ 3.4.1 & X_2 & \text{assumption} \\ 3.5 & X_2 &(\vee\text e) \text{ with } 3.3, 3.4 \text{ and } 3.2\\ 3.6 & X_0 \vee X_2 &(\vee\text{i-r}) \text{ with } 3.5 \\ 4 & X_0 \vee X_2 &(\vee\text e) \text{ with } 2, 3, \text{ and } 1 \end{array}
  4. Proof of $\neg X_0 \rightarrow \bot \vdash X_0$: \begin{array}{lll} 1 & \neg X_0 \rightarrow \bot & \text{premise}\\ 2.1 & \neg X_0 & \text{assumption}\\ 2.2 & \bot & (\rightarrow\text e)\text{ with } 2.1 \text{ and } 1\\ 3 & \neg \neg X_0 & (\neg\text i)\text{ with } 2\\ 4 & X_0 & (\neg\text e)\text{ with } 3\\ \end{array}

Exercise Show, by giving suitable formal proofs, that $X_1\wedge X_2\SynEquiv\neg(\neg X_1\vee\neg X_2)$.