Journal of Automated Reasoning, Volume 58, Issue 1, pp 149–179

Soundness and Completeness Proofs by Coinductive Methods

  • Jasmin Christian Blanchette
  • Andrei Popescu
  • Dmitriy Traytel

Abstract

We show how codatatypes can be employed to produce compact, high-level proofs of key results in logic: the soundness and completeness of proof systems for variations of first-order logic. For the classical completeness result, we first establish an abstract property of possibly infinite derivation trees. The abstract proof can be instantiated for a wide range of Gentzen and tableau systems for various flavors of first-order logic. Soundness becomes interesting as soon as one allows infinite proofs of first-order formulas. This forms the subject of several cyclic proof systems for first-order logic augmented with inductive predicate definitions studied in the literature. All the discussed results are formalized using Isabelle/HOL’s recently introduced support for codatatypes and corecursion. The development illustrates some unique features of Isabelle/HOL’s new coinductive specification language such as nesting through non-free types and mixed recursion–corecursion.

Keywords

Codatatypes · Lazy evaluation · First-order logic · Soundness · Completeness · Gentzen systems · Proof assistants · Isabelle/HOL

1 Introduction

Gödel’s completeness theorem [23] is a major result about first-order logic (FOL). It forms the basis of results and techniques in various areas, including mathematical logic, automated deduction, and program verification. It can be stated as follows: If a set of formulas is satisfied by all structures, then it has a proof. The theorem enjoys many accounts in the literature that generalize and simplify the original proof; indeed, a textbook on mathematical logic would be incomplete without a proof of this fundamental theorem.

Formal logic has always been a battleground between semantic and syntactic methods. Generally speaking, mathematicians belong to the semantic school, whereas computer scientists tend to take the other side of the argument. The completeness theorem, which combines syntax and semantics, is also disputed, with the result that each school has its own proof. In his review of Gallier’s Logic for Computer Science [22], Pfenning, a fellow “syntactician,” notes the following [41]:

All too often, proof-theoretic methods are neglected in favor of shorter, and superficially more elegant semantic arguments. [In contrast, in Gallier’s book] the treatment of the proof theory of the Gentzen system is oriented towards computation with proofs. For example, a pseudo-Pascal version of a complete search procedure for first-order cut-free Gentzen proofs is presented.

In the context of completeness, the “superficially more elegant semantic arguments” are proofs that rely on Hilbert systems. These systems have several axioms but only one or two deduction rules, providing minimal support for presenting the structure of proofs or for modeling proof search. A proof of completeness based on Hilbert systems follows the Henkin style: It employs a heavy bureaucratic apparatus to establish facts about deduction and conservative language extensions, culminating in a highly nonconstructive step: an application of Zorn’s lemma to extend any syntactically consistent set of formulas to a maximally consistent one, from which a model is produced.

In contrast, a proof of completeness based on more elaborate Gentzen or tableau systems follows the Beth–Hintikka style [31]. It performs a search that builds either a finite deduction tree yielding a proof (or refutation, depending on the system) or an infinite tree from which a countermodel (or model) can be extracted. Such completeness proofs have an intuitive content that stresses the tension of the argument: The deduction system systematically tries to prove the goal; a failure yields, at the limit, a countermodel.

The intuitive appeal of the Beth–Hintikka approach comes at a price: It requires reasoning about infinite derivation trees and infinite paths. Unfortunately, convenient means to reason about infinite (or lazy) data structures are lacking in mainstream mathematics. For example, an otherwise extremely rigorous textbook such as Bell and Machover’s [1] becomes surprisingly informal when defining and using possibly infinite refutation tableau trees:

A tableau is a set of elements, called nodes, partially ordered and classified into levels as explained below. With each node is associated a finite set of formulas. We shall usually identify a given node with its associated set of formulas; this is somewhat imprecise (since in fact the same set of formulas can be associated with different nodes) but will not cause confusion.

Each node belongs to a unique level, which is labeled by some natural number. There is just one node of level 0, called the initial node of the tableau. Each node at level \(n+1\) is a successor of a unique node, which must be of level n.

In textbooks, at best the trees are defined rigorously (e.g., as prefix-closed sets), but the reasoning is performed informally, disregarding the original definition and relying on the intuitive notion of trees, as Gallier does. One could argue that trees are intuitive and do not need a formal treatment, but the same holds for the syntax of formulas, which is treated very rigorously in most of the textbooks.

The main contribution of this article is a rigorous Beth–Hintikka-style proof of the completeness theorem, based on a Gentzen system. The potentially infinite trees are captured by codatatypes (also called coinductive datatypes or final coalgebras) [29]. Another novel aspect of the proof is its modularity: The core tree construction argument is isolated from the proof system and concrete formula syntax, with the concrete syntactic details concealed behind an abstract Herbrandness assumption (Sect. 3). This assumption can be verified in concrete cases (by performing the standard Herbrand model construction) for a wide range of Gentzen and tableau systems for FOL, various flavors of FOL (e.g., with or without predicates, equality, or sorts), and even modal logics with explicit-world Gentzen systems (Sect. 4). This modularization replaces the textbook proofs by analogy. The core of the argument amounts to reasoning about a functional program over lazy data structures.

A second contribution of this article is an application of the same coinductive machinery (infinite trees and streams and corecursive functions between them) to some interesting recent results from the automated deduction literature: the soundness of infinite (including cyclic) proofs for FOL with inductive definitions and related logics, studied by Brotherston et al. [12, 13, 14, 15, 16]. For formalizing these results, we follow rather closely the abstract constructions of Brotherston et al. [16], except that we use coinduction and corecursion to streamline the development. The presentation follows the same path as for completeness: The main result is stated abstractly (Sect. 5) and instantiated by concrete examples (Sect. 6).

The proofs of the abstract results are formalized in Isabelle/HOL (Sect. 7). The definitions of infinite trees and paths rely on a new definitional package for codatatypes [6, 11], which automates the derivation of characteristic theorems from high-level specifications of types and functions. Through Isabelle’s code generator [25], the corecursive construction gives rise to a Haskell program that implements a semidecision procedure for validity instantiable with various proof systems, yielding verified sound and complete provers.

The completeness proof has been applied to the formalization of optimized translations between sorted and unsorted FOL [4, 7]. The soundness proofs of these translations rest on the Löwenheim–Skolem theorem, a corollary of a slightly generalized version of the completeness theorem. The previous formal proofs of the completeness theorem, including two in Isabelle, support a more restrictive logic than many-sorted FOL (Sect. 8).

An earlier version of this article was presented at the IJCAR 2014 conference in Vienna, Austria, under a different title [9]. The article considerably extends the conference paper with infinite-proof soundness (Sects. 5, 6) as a second application of coinductive methods. It also provides more details about the Beth–Hintikka-style proof of the completeness theorem (Sects. 3, 4).

Conventions Isabelle/HOL [39] is a proof assistant based on classical higher-order logic (HOL) with Hilbert choice, the axiom of infinity, and rank-1 polymorphism. It is the logic of Gordon’s original HOL system [24] and of its many successors. HOL notations are similar to those of functional programming languages, but they also include many traditional symbols and syntaxes from mathematics, notably to denote simply typed sets. We refer to Nipkow and Klein [38, Part 1] for a modern introduction. In this article, the logic is viewed not as a formal system but rather as a framework for expressing mathematics, much like set theory is employed by working mathematicians. In keeping with the standard semantics of HOL, types \(\alpha \) are identified with sets.

2 Preliminaries on First-Order Logic

The soundness and completeness results we formalize in this article apply to an array of variations of first-order logic and beyond (modal logics, separation logic, etc.). To give some concrete grounding for the forthcoming abstract development, we recall basic (unsorted) first-order logic and its extension with inductive predicates.

2.1 Classical First-Order Logic

We fix a first-order language: a countably infinite set \({\textsf {var}}\) of variables \(x,\, y,\, z\) and countable sets \({\textsf {fsym}}\) and \({\textsf {psym}}\) of function symbols f and predicate symbols p together with an assignment \({\textsf {ar}}: {\textsf {fsym}}\mathrel {\uplus } {\textsf {psym}}\rightarrow {\textsf {nat}}\) of numeric arities. Terms \(t \in {\textsf {term}}\) are symbolic expressions built inductively from variables by application of function symbols \(f \in {\textsf {fsym}}\) to tuples of arguments whose lengths respect the arities: \(f (t_1,\ldots ,t_{{\textsf {ar}}\,f})\). Atoms \(a \in {\textsf {atom}}\) are symbolic expressions of the form \(p (t_1,\ldots ,t_{{\textsf {ar}}\,p})\), where \(p \in {\textsf {psym}}\) and \(t_1,\ldots ,t_{{\textsf {ar}}\,p} \in {\textsf {term}}\).

Formulas \(\varphi ,\, \psi \) may be atoms, negations, conjunctions, or universal quantifications. They are defined as follows:
$$\begin{aligned} \varphi ,\,\psi \;\,{:}{:}{=}\;\, {\textsf {Atm}}\;a \mid {\textsf {Neg}}\;\varphi \mid {\textsf {Conj}}\;\varphi \;\psi \mid {\textsf {All}}\;x\;\varphi \end{aligned}$$
As usual, we define the (syntactic) implication of two formulas \(\varphi _1,\varphi _2\) by \({\textsf {Imp}}\;\varphi _1\;\varphi _2 = {\textsf {Neg}}\;({\textsf {Conj}}\;\varphi _1\;({\textsf {Neg}}\;\varphi _2))\).

To distinguish between the first-order language of study and the HOL metalanguage, we use the constructor names \({\textsf {Neg}}\), \({\textsf {Conj}}\), and \({\textsf {All}}\) for the former, keeping the traditional symbols \(\lnot \), \(\wedge \), and \(\forall \) for the latter. We often write a instead of \({\textsf {Atm}}\;a\), thus pretending that atoms are included in formulas.
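As a concrete point of reference (not part of the formal development), this syntax can be mirrored by a small Haskell sketch; the names Term, Fm, and imp are ours, and, unlike in the formal development, arities are not enforced by the types.

    -- A sketch of the term and formula syntax; arities are not enforced here.
    type Var  = String
    type FSym = String
    type PSym = String

    data Term = V Var | Fn FSym [Term]
      deriving (Eq, Show)

    data Fm = Atm PSym [Term]       -- atoms p(t1, ..., tn)
            | Neg Fm
            | Conj Fm Fm
            | All Var Fm
      deriving (Eq, Show)

    -- Syntactic implication, as defined in the text.
    imp :: Fm -> Fm -> Fm
    imp phi1 phi2 = Neg (Conj phi1 (Neg phi2))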

A structure \(\mathscr {S}= \bigl (\textit{S},\, (F_{f})_{f \,\in \, {\textsf {fsym}}},\, (P_{p})_{p \,\in \, {\textsf {psym}}}\bigr )\) for the given language consists of a carrier set \(\textit{S}\), together with a function \(F_{f} : \textit{S}^{n} \rightarrow S\) for each n-ary \(f \in {\textsf {fsym}}\) and a predicate \(P_{p} : \textit{S}^{n} \rightarrow {\textsf {bool}}\) for each n-ary \(p \in {\textsf {psym}}\). The notions of interpretation of a term t and satisfaction of a formula \(\varphi \) by a structure \(\mathscr {S}\) with respect to a variable valuation \(\xi : {\textsf {var}}\rightarrow \textit{S}\) are defined in the standard way. For terms (by structural recursion):
$$\begin{aligned} \llbracket x\rrbracket ^{\mathscr {S}}_{\xi } = \xi x \quad \quad \llbracket f (t_1,\ldots ,t_n)\rrbracket ^{\mathscr {S}}_{\xi } = F_{f}\;\bigl (\llbracket t_1\rrbracket ^{\mathscr {S}}_{\xi },\ldots ,\llbracket t_n\rrbracket ^{\mathscr {S}}_{\xi }\bigr ) \end{aligned}$$
For atoms:
$$\begin{aligned} \mathscr {S}\models _{\xi } p (t_1,\ldots ,t_n) \,\iff \, P_{p}\;\bigl (\llbracket t_1\rrbracket ^{\mathscr {S}}_{\xi },\ldots ,\llbracket t_n\rrbracket ^{\mathscr {S}}_{\xi }\bigr ) \end{aligned}$$
For formulas (by structural recursion):
$$\begin{aligned}&\mathscr {S}\models _{\xi } {\textsf {Atm}}\;a \,\iff \, \mathscr {S}\models _{\xi } a \quad \quad \mathscr {S}\models _{\xi } {\textsf {Neg}}\;\varphi \,\iff \, \lnot \;(\mathscr {S}\models _{\xi } \varphi ) \\&\mathscr {S}\models _{\xi } {\textsf {Conj}}\;\varphi \;\psi \,\iff \, \mathscr {S}\models _{\xi } \varphi \mathrel \wedge \mathscr {S}\models _{\xi } \psi \quad \quad \mathscr {S}\models _{\xi } {\textsf {All}}\;x\;\varphi \,\iff \, \forall a \mathbin \in \textit{S}.\;\mathscr {S}\models _{\xi [x \leftarrow a]} \varphi \end{aligned}$$
Above, \(\xi [x \leftarrow a]\) denotes the valuation that sends x to a and all other variables y to \(\xi \;y\).

We define the notion of a structure \(\mathscr {S}\) satisfying a formula \(\varphi \), written \(\mathscr {S}\models \varphi \), to mean satisfaction with respect to all valuations: \(\forall \xi .\;\mathscr {S}\models _\xi \varphi \).

A sequent is a pair \(\Gamma \mathrel {\rhd }\Delta \) of finite sets of formulas. Satisfaction is extended to sequents: \(\mathscr {S}\models \Gamma \mathrel {\rhd }\Delta \)  iff   \((\forall \varphi \mathbin {\in }\Gamma .\;\mathscr {S}\models \varphi ) \Rightarrow (\exists \psi \mathbin {\in }\Delta .\;\mathscr {S}\models \psi )\). We can think of \(\Gamma \mathrel {\rhd }\Delta \) as an implication between the conjunction of the formulas of \(\Gamma \) and the disjunction of the formulas of \(\Delta \). If \(\mathscr {S}\models \Gamma \mathrel {\rhd }\Delta \), we also call \(\mathscr {S}\) a model of \(\Gamma \mathrel {\rhd }\Delta \). By contrast, if \(\mathscr {S}\not \models \Gamma \mathrel {\rhd }\Delta \), we call \(\mathscr {S}\) a countermodel of \(\Gamma \mathrel {\rhd }\Delta \).
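As a simple executable illustration (again not part of the formal development), the following Haskell sketch evaluates these notions over a structure whose carrier is given as a finite list, so that the \({\textsf {All}}\) case can be decided by enumeration; Struct, Val, sat, and satSeq are hypothetical names.

    -- Satisfaction over a structure with an explicitly finite carrier.
    type Var = String
    data Term = V Var | Fn String [Term]                      -- as above
    data Fm   = Atm String [Term] | Neg Fm | Conj Fm Fm | All Var Fm

    data Struct a = Struct
      { carrier :: [a]                     -- the carrier S (finite here)
      , funs    :: String -> [a] -> a      -- F_f
      , preds   :: String -> [a] -> Bool   -- P_p
      }

    type Val a = Var -> a                  -- valuations xi : var -> S

    evalT :: Struct a -> Val a -> Term -> a
    evalT _ xi (V x)     = xi x
    evalT s xi (Fn f ts) = funs s f (map (evalT s xi) ts)

    sat :: Struct a -> Val a -> Fm -> Bool
    sat s xi (Atm p ts)  = preds s p (map (evalT s xi) ts)
    sat s xi (Neg phi)   = not (sat s xi phi)
    sat s xi (Conj p q)  = sat s xi p && sat s xi q
    sat s xi (All x phi) =
      all (\a -> sat s (\y -> if y == x then a else xi y) phi) (carrier s)

    -- Sequent satisfaction: if every formula of Gamma holds, so does some
    -- formula of Delta.
    satSeq :: Struct a -> Val a -> ([Fm], [Fm]) -> Bool
    satSeq s xi (gamma, delta) =
      not (all (sat s xi) gamma) || any (sat s xi) delta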

A standard proof system on sequents is defined inductively as follows, where the notation \(\Gamma ,\varphi \) abbreviates the set \(\Gamma \cup \{\varphi \}\):
$$\begin{aligned}&\frac{}{\Gamma ,{\textsf {Atm}}\;a \mathrel {\rhd }\Delta ,{\textsf {Atm}}\;a}\;\textsc {Ax} \quad \quad \frac{\Gamma \mathrel {\rhd }\Delta ,\varphi }{\Gamma ,{\textsf {Neg}}\;\varphi \mathrel {\rhd }\Delta }\;\textsc {Neg}\textsc {L} \quad \quad \frac{\Gamma ,\varphi \mathrel {\rhd }\Delta }{\Gamma \mathrel {\rhd }\Delta ,{\textsf {Neg}}\;\varphi }\;\textsc {Neg}\textsc {R} \\&\frac{\Gamma ,\varphi ,\psi \mathrel {\rhd }\Delta }{\Gamma ,{\textsf {Conj}}\;\varphi \;\psi \mathrel {\rhd }\Delta }\;{\textsc {Conj}}\textsc {L} \quad \quad \frac{\Gamma \mathrel {\rhd }\Delta ,\varphi \qquad \Gamma \mathrel {\rhd }\Delta ,\psi }{\Gamma \mathrel {\rhd }\Delta ,{\textsf {Conj}}\;\varphi \;\psi }\;{\textsc {Conj}}\textsc {R} \\&\frac{\Gamma ,{\textsf {All}}\;x\;\varphi ,\varphi [t/x] \mathrel {\rhd }\Delta }{\Gamma ,{\textsf {All}}\;x\;\varphi \mathrel {\rhd }\Delta }\;{\textsc {All}}\textsc {L} \quad \quad \frac{\Gamma \mathrel {\rhd }\Delta ,\varphi [y/x]}{\Gamma \mathrel {\rhd }\Delta ,{\textsf {All}}\;x\;\varphi }\;{\textsc {All}}\textsc {R}\;(y\hbox { fresh for }\Gamma \hbox { and }{\textsf {All}}\;x\;\varphi ) \end{aligned}$$
Since here sequents are sets, we do not need the structural rules (weakening, contraction, and exchange).

In a proof, the rules are applied from bottom to top. One chooses a formula from either side of the sequent, the eigenformula, and applies a rule according to the topmost connective or quantifier. For a given choice of eigenformula, at most one rule is applicable. The aim of applying the rules is to prove the sequent by building a finite derivation tree whose branches are closed by an axiom (Ax). Provability of a sequent in this system is denoted by prefixing \(\vdash \) to the sequent (e.g., \(\vdash \Gamma \mathrel {\rhd }\Delta \)).

The soundness theorem states that any provable sequent is satisfied by all structures:
$$\begin{aligned} \vdash \Gamma \mathrel {\rhd }\Delta \,\implies \, (\forall \mathscr {S}.\,\mathscr {S}\models \Gamma \mathrel {\rhd }\Delta ) \end{aligned}$$
The completeness theorem states the converse—namely, any sequent that is satisfied by all structures is provable:
$$\begin{aligned} (\forall \mathscr {S}.\,\mathscr {S}\models \Gamma \mathrel {\rhd }\Delta ) \,\implies \, {\vdash \Gamma \mathrel {\rhd }\Delta } \end{aligned}$$
Section 3 presents abstract versions of these classic soundness and completeness theorems.

2.2 First-Order Logic with Inductive Predicates

Another logic we consider is first-order logic with inductive predicates (FOL\(_{\textsf {ind}}\)). In addition to the first-order language given by \({\textsf {var}}\), \({\textsf {fsym}}\), \({\textsf {psym}}\), and \({\textsf {ar}}: {\textsf {fsym}}\mathrel {\uplus } {\textsf {psym}}\rightarrow {\textsf {nat}}\), we fix a subset of predicate symbols \({\textsf {ipsym}}\subseteq {\textsf {psym}}\) which are called inductive. Moreover, we fix, for each \(p \in {\textsf {ipsym}}\), a set \({\textsf {ind}}_p\) of Horn clauses specifying p, so that each element of \({\textsf {ind}}_p\) is of the form \({\textsf {Imp}}\;({\textsf {Conj}}\;\psi _1\;\ldots \;\psi _m\;\varphi _1\;\ldots \;\varphi _n)\;\varphi \), where
  • \({\textsf {Conj}}\) denotes the conjunction of multiple formulas;

  • \(\varphi \) is an atom of the form \(p(t_1,\ldots ,t_{{\textsf {ar}}\,p})\);

  • each \(\varphi _i\) is an atom of the form \(p'(t_1,\ldots ,t_{{\textsf {ar}}\,p'})\) for some \(p' \in {\textsf {ipsym}}\);

  • each \(\psi _j\) is a formula not containing any \(p' \in {\textsf {ipsym}}\).

The above restrictions aim to ensure monotonicity of the Horn clauses interpreted as inductive definitions. To simplify the exposition, we choose a rather strong restriction, but more flexible ones are possible [12].

A first-order structure \(\mathscr {S}= \bigl (\textit{S},\, (F_{f})_{f \,\in \, {\textsf {fsym}}},\, (P_{p})_{p \,\in \, {\textsf {psym}}}\bigr )\) is said to be inductive (with respect to \({\textsf {ipsym}}\) and \(({\textsf {ind}}_p)_{p \in {\textsf {ipsym}}}\)) if the interpretation of each \(p \in {\textsf {ipsym}}\) is indeed inductive, i.e., the family \((P_{p})_{p \,\in \, {\textsf {ipsym}}}\) constitutes the least fixpoint of all the inductive Horn clauses interpreted in \(\mathscr {S}\), meaning that \((P_p)_{p \in {\textsf {ipsym}}}\) is the least family of predicates (with respect to component-wise implication) such that \(\mathscr {S}\models \chi \) for all \(\chi \in \bigcup _{p \in {\textsf {psym}}} {\textsf {ind}}_p\).

Example 1

Assume \({\textsf {fsym}}= \{0,{{\textsf {Suc}}}\}\), \({\textsf {ipsym}}= {\textsf {psym}}= \{{{\textsf {even}}},{{\textsf {odd}}}\}\), and
  • \({\textsf {ind}}_{{\textsf {even}}}\) consists of the following clauses:
    • \({{\textsf {even}}}(0)\);

    • \({\textsf {Imp}}\;({{\textsf {even}}}(x))\;({{\textsf {even}}}({{\textsf {Suc}}}({{\textsf {Suc}}}(x))))\);

  • \({\textsf {ind}}_{{\textsf {odd}}}\) consists of the following clauses:
    • \({{\textsf {odd}}}({{\textsf {Suc}}}(0))\);

    • \({\textsf {Imp}}\;({{\textsf {odd}}}(x))\;({{\textsf {odd}}}({{\textsf {Suc}}}({{\textsf {Suc}}}(x))))\).

A structure \(\mathscr {S}= \bigl (\textit{S},\,(F_{0},\,F_{{\textsf {Suc}}}),\,(P_{{\textsf {even}}},\,P_{{\textsf {odd}}})\bigr )\) for this language is inductive iff, for fixed \(\bigl (\textit{S},\,(F_{0},\,F_{{\textsf {Suc}}})\bigr )\), the pair \((P_{{\textsf {even}}},\,P_{{\textsf {odd}}})\) is the least (i.e., strongest) pair of unary predicates on \(\textit{S}\) such that
  • \(P_{{\textsf {even}}}(F_0)\);

  • \(\forall n \in S.\;P_{{\textsf {even}}}(n) \implies P_{{\textsf {even}}}(F_{{\textsf {Suc}}}(F_{{\textsf {Suc}}}(n)))\);

  • \(P_{{\textsf {odd}}}(F_{{\textsf {Suc}}}(F_0))\);

  • \(\forall n \in S.\;P_{{\textsf {odd}}}(n) \implies P_{{\textsf {odd}}}(F_{{\textsf {Suc}}}(F_{{\textsf {Suc}}}(n)))\).

In particular, if \(\bigl (\textit{S},\,(F_{0},\,F_{{\textsf {Suc}}})\bigr )\) is the standard set of natural numbers with zero and successor, then \(P_{{\textsf {even}}}\) and \(P_{{\textsf {odd}}}\) must be the standard parity predicates.
Obviously, there are more admissible rules for inductive structures than for arbitrary structures. The following rules are admissible for \({{\textsf {even}}}\), where \(\Gamma [t/x]\) denotes the (capture-avoiding) substitution of the term t for the variable x in all formulas of \(\Gamma \):
The direct rules \(\textsc {Even}_0\) and \(\textsc {Even}_{{\textsf {Suc}}}\) (corresponding to the two cases in the inductive specification of \({{\textsf {even}}}\)) are admissible since all inductive structures satisfy the Horn clauses for \({{\textsf {even}}}\). Moreover, the inversion rule \(\textsc {Even}_{\mathrm{split}}\) is admissible by the least fixpoint assumption—the rule expresses that, if \({{\textsf {even}}}(x)\) is known, then it must have been obtained by an application of one of the inductive clauses for \({{\textsf {even}}}\). Similar rules hold for \({{\textsf {odd}}}\):
In general, for all inductive predicates \(p \in {\textsf {ipsym}}\), the following rules are sound:
There is one \(p_{\chi }\) rule for each \(p \in {\textsf {ipsym}}\) and \(\chi \in {\textsf {ind}}_p\), with \({\textsf {prems}}(\chi )\) denoting the premises of \(\chi \) and \({\textsf {concl}}(\chi )\) its conclusion. In addition, there is one \(p_{\mathrm{split}}\) rule for each \(p \in {\textsf {ipsym}}\) such that
  • \(\chi '\) is a variant of \(\chi \) where all the free variables are fresh for \(\Gamma \,\cup \, \Delta \);

  • \(p(\overline{t})\) is \({\textsf {concl}}(\chi ')\)—with \(\overline{t}\) thus being a tuple of terms \((t_1,\ldots ,t_{{\textsf {ar}}\;p})\);

  • \(\overline{x}\) is a tuple of distinct variables \((x_1,\ldots ,x_{{\textsf {ar}}\;p})\).

The Gentzen system for FOL\(_{\textsf {ind}}\) consists of the FOL rules from Sect. 2.1 extended with the above rules \(p_{\chi }\), \(p_{\mathrm{split}}\) and the following substitution rule:
This rule is designed to complement the direct rule \(p_{\chi }\), for the purpose of applying the Horn clauses of p to particular instances.

The inversion rule \(p_{\mathrm{split}}\) is not as powerful as it could be, due to the restriction that the eigenformula, \(p(\overline{x})\), must be a predicate symbol applied to variables. In a FOL\(_{\textsf {ind}}\) variant with equality, this could be strengthened to speak about arbitrary terms instead of variables [13, Section 4.1].

Beyond the additional admissible rules, a crucial insight of Brotherston [12] is that, for inductive structures, a certain type of proof circularity is permissible. It allows the proof trees to have as leaf nodes not only axioms, but also backward links to other sequents in the tree. This insight is useful for automating induction [12]. Brotherston et al. [16] give an abstract, logic-independent argument for why allowing such circularities is sound. In Sect. 5, we show that this argument can be naturally formalized using a coinductive machinery similar to the one we use for classic completeness.

3 Abstract Soundness and Completeness

Before studying applications, we first develop an abstract framework that allows us to state and prove soundness and completeness for an unspecified syntax, class of structures, and satisfaction relation. The framework is obtained by distilling the corresponding concrete results for FOL.

For soundness, we simply assume the rules to be locally sound for the models, which immediately yields global soundness. Completeness is more elaborate. The proof is divided into two parts. The first part, captured in this section, focuses on the core of the completeness argument in an abstract, syntax-free manner. This level captures the tension between the existence of a proof and the existence of a countermodel-producing path, introduced through what we call an escape path—an infinite sequence of rule applications that “escapes” the proof attempt. The tension is distilled in the following result:

Either there exists a finite derivation tree or there exists an infinite derivation tree containing a suitable escape path.

The second part maps the escape path to a concrete, proof-system-specific countermodel employing a construction due to Herbrand. At the abstract level, we assume a “Herbrand function” that produces countermodels from escape paths. In Sect. 4, we instantiate this function for the Gentzen system introduced in Sect. 2.1.

3.1 Sequents and Structures

We abstract away from the syntax of formulas and sequents and from the specific rules of the proof system. We fix countable sets \({\textsf {sequent}}\) and \({\textsf {rule}}\) of sequents and rules. Our abstract sequents represent the formal statements of the logic; they may be concrete sequents or take other forms.

Moreover, we abstract away from the concrete form of structures, assuming a fixed class \({\textsf {structure}}\) and a satisfaction relation
$$\begin{aligned} {\models } \mathrel {:} {\textsf {structure}}\rightarrow {\textsf {sequent}}\rightarrow {\textsf {bool}}\end{aligned}$$
where \(\textit{S} \models s\) indicates that \(\textit{S}\) satisfies (is a model of) s. We write \(\models s\) to indicate that s is satisfied by all models in \({\textsf {structure}}\): \(\forall \textit{S} \mathbin \in {\textsf {structure}}.\;\textit{S} \models s\).

3.2 Rule Systems

We assume that the meaning of the rules is given by an effect relation
$$\begin{aligned} {\textsf {eff}}: {\textsf {rule}}\rightarrow {\textsf {sequent}}\rightarrow {\textsf {sequent}}\;\textsf {{fset}}\rightarrow {\textsf {bool}}\end{aligned}$$
where \(\alpha \;\textsf {{fset}}\) denotes the set of finite subsets of \(\alpha \). The reading of \({\textsf {eff}}\;r\;s\;{ ss }\) is as follows: Starting from sequent s, applying rule r expands s into the sequents \({ ss }\). We can think of sequents as proof goals, each goal being replaced by zero or more subgoals by applying a rule. The triple \(\mathscr {R}= ({\textsf {sequent}},{\textsf {rule}},{\textsf {eff}})\) forms a rule system.

Example 2

The Gentzen system from Sect. 2.1 can be presented as a rule system. The set \({\textsf {sequent}}\) is the set of sequents, and \({\textsf {rule}}\) consists of the following: a rule \({\textsc {Ax}}_a\) for each atom a; rules \(\textsc {Neg}\textsc {L}_\varphi \) and \(\textsc {Neg}\textsc {R}_\varphi \) for each formula \(\varphi \); rules \({\textsc {Conj}}\textsc {L}_{\varphi ,\psi }\) and \({\textsc {Conj}}\textsc {R}_{\varphi ,\psi }\) for each pair of formulas \(\varphi \) and \(\psi \); a rule \({\textsc {All}}\textsc {L}_{x,\varphi ,t}\) for each variable x, formula \(\varphi \), and term t; and a rule \({\textsc {All}}\textsc {R}_{x,\varphi }\) for each variable x and formula \(\varphi \).

The eigenformula is part of the rule. Hence we have a countably infinite number of rules. The effect is defined as follows, where we use semicolons (; ) to separate set elements:
$$\begin{aligned}&{\textsf {eff}}\;{\textsc {Ax}}_a\;(\Gamma ,{\textsf {Atm}}\;a \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta ,{\textsf {Atm}}\;a)\;\emptyset \\&{\textsf {eff}}\;\textsc {Neg}\textsc {R}_{\varphi }\;(\Gamma \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta {,}\;{\textsf {Neg}}\;\varphi )\;\{\Gamma ,\varphi \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta \} \\&{\textsf {eff}}\;\textsc {Neg}\textsc {L}_{\varphi }\;(\Gamma ,{\textsf {Neg}}\;\varphi \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta )\;\{\Gamma \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta ,\varphi \} \\&{\textsf {eff}}\;{\textsc {Conj}}\textsc {L}_{\varphi ,\psi }\;(\Gamma ,{\textsf {Conj}}\;\varphi \;\psi \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta )\;\{\Gamma ,\varphi ,\psi \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta \} \\&{\textsf {eff}}\;{\textsc {Conj}}\textsc {R}_{\varphi ,\psi }\;(\Gamma \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta {,}\;{\textsf {Conj}}\;\varphi \;\psi )\;\{\Gamma \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta ,\varphi ;\, \Gamma \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta ,\psi \} \\&{\textsf {eff}}\;{\textsc {All}}\textsc {L}_{x,\varphi ,t}\;(\Gamma ,{\textsf {All}}\;x\;\varphi \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta )\;\{\Gamma ,{\textsf {All}}\;x\;\varphi ,\varphi [t / x] \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta \} \\&{\textsf {eff}}\;{\textsc {All}}\textsc {R}_{x,\varphi }\;(\Gamma \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta {,}\;{\textsf {All}}\;x\;\varphi )\;\{\Gamma \mathbin {\mathrel {{\mathrel {\rhd }}}} \Delta {,}\;\varphi [y / x]\} \quad \text {where }y\hbox { is fresh for }\Gamma \hbox { and }{\textsf {All}}\;x\;\varphi \end{aligned}$$
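To give a feel for how such an effect relation might be rendered executably, here is a partial Haskell sketch covering three of the rules; eff becomes a partial function (Nothing meaning “not enabled”), sequents are pairs of formula lists rather than finite sets, and all names are ours.

    -- A partial sketch of the effect relation for three of the rules.
    data Term = V String | Fn String [Term]
      deriving Eq
    data Fm = Atm String [Term] | Neg Fm | Conj Fm Fm | All String Fm
      deriving Eq
    data Rule = Ax String [Term] | NegL Fm | ConjR Fm Fm      -- and so on

    type Sequent = ([Fm], [Fm])            -- (Gamma, Delta)

    eff :: Rule -> Sequent -> Maybe [Sequent]
    eff (Ax p ts) (gamma, delta)
      | Atm p ts `elem` gamma && Atm p ts `elem` delta = Just []
    eff (NegL phi) (gamma, delta)
      | Neg phi `elem` gamma =
          Just [(filter (/= Neg phi) gamma, phi : delta)]
    eff (ConjR phi psi) (gamma, delta)
      | Conj phi psi `elem` delta =
          let delta' = filter (/= Conj phi psi) delta
          in Just [(gamma, phi : delta'), (gamma, psi : delta')]
    eff _ _ = Nothing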

3.3 Derivation Trees

Finitely branching, possibly infinite trees with nodes labeled by elements in a set \(\alpha \) are represented by the following codatatype:
$$\begin{aligned} \mathbf {codatatype}\;\,\alpha \;{\textsf {tree}}= {\textsf {Node}}\;({{\textsf {lab}}}: \alpha )\;({{\textsf {sub}}}: (\alpha \;{\textsf {tree}})\;\textsf {{fset}}) \end{aligned}$$
This definition introduces a constructor \({\textsf {Node}}: \alpha \rightarrow (\alpha \;{\textsf {tree}})\;\textsf {{fset}}\rightarrow \alpha \;{\textsf {tree}}\) and two selectors \({{\textsf {lab}}}: \alpha \;{\textsf {tree}}\rightarrow \alpha \) and \({{\textsf {sub}}}: \alpha \;{\textsf {tree}}\rightarrow (\alpha \;{\textsf {tree}})\;\textsf {{fset}}\). Trees have the form \({\textsf {Node}}\;a\;{ Ts }\), where a is the tree’s label and \({ Ts }\) is the finite set of its (immediate) subtrees. The \(\mathbf {codatatype}\) keyword indicates that, unlike for inductive datatypes, this tree formation rule may be iterated an infinite number of times to create infinitely deep objects.
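In Haskell, where every datatype is lazy, an ordinary declaration already behaves like this codatatype (with lists standing in for finite sets), and infinitely deep trees can be written down directly:

    -- Finitely branching, possibly infinite trees.
    data Tree a = Node { lab :: a, sub :: [Tree a] }

    -- An infinitely deep tree, obtained by iterating the formation rule forever.
    infTree :: Tree Integer
    infTree = go 0 where go n = Node n [go (n + 1)]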

Remark 3

Inductive datatypes can also store infinite objects, provided the constructors are infinitely branching. However, the infiniteness of these objects manifests itself in breadth only. To illustrate this distinction, consider a datatype \(\alpha \;{\textsf {tree}}_1\) of trees branching over the type \({\textsf {nat}}\) and its codatatype counterpart \(\alpha \;{\textsf {tree}}_2\). Due to infinite branching over \({\textsf {nat}}\), the elements of the datatype \(\alpha \;{\textsf {tree}}_1\) can be infinite, and indeed of unbounded depth—however, they will never have a genuinely infinite depth, as witnessed by an infinite path. By contrast, the elements of the codatatype \(\alpha \;{\textsf {tree}}_2\) can have infinite paths (but need not).
A step combines the current sequent and the rule to be applied. Derivation trees are defined as trees labeled by steps:
$$\begin{aligned} {\textsf {step}}= {\textsf {sequent}}\times {\textsf {rule}}\quad {\textsf {dtree}}= {\textsf {step}}\;{\textsf {tree}}\end{aligned}$$
We think of the root’s label (s, r) as representing the proved goal s and the first (backwards) applied rule r. The well-formed derivation trees are captured by the predicate \({\textsf {wf}}: {\textsf {dtree}}\rightarrow {\textsf {bool}}\) defined by the coinductive rule
$$\begin{aligned} \frac{{\textsf {eff}}\;r\;s\;(\textsf {{image}}\;(\textsf {{fst}}\circ {{\textsf {lab}}})\;{ Ts }) \qquad \forall T \mathbin \in { Ts }.\;{\textsf {wf}}\;T}{{\textsf {wf}}\;({\textsf {Node}}\;(s,r)\;{ Ts })} \end{aligned}$$
The term \(\textsf {{image}}\;f\;A\) denotes the image of set A through function f, and \(\textsf {{fst}}\) is the left projection operator (i.e., \(\textsf {{fst}}\;(x,y) = x\)). The first assumption requires that the rule r from the root be applied to obtain the subtrees’ labels. The second assumption requires that wellformedness holds for the immediate subtrees. The coinductive interpretation of the definition ensures that the iteration of this rule can cover infinite trees; this would not be the case with an inductive interpretation.
Double lines distinguish coinductive rules from their inductive counterparts. Thus, the predicate \({\textsf {wf}}\) is the greatest (weakest) solution to the equivalence
$$\begin{aligned} {\textsf {wf}}\;({\textsf {Node}}\;(s,r)\;{ Ts }) \,\iff \, {\textsf {eff}}\;r\;s\;(\textsf {{image}}\;(\textsf {{fst}}\circ {{\textsf {lab}}})\;{ Ts }) \mathrel \wedge (\forall T \mathbin \in { Ts }.\;{\textsf {wf}}\;T) \end{aligned}$$
which is also the greatest solution to the implication
$$\begin{aligned} {\textsf {wf}}\;({\textsf {Node}}\;(s,r)\;{ Ts }) \,\implies \, {\textsf {eff}}\;r\;s\;(\textsf {{image}}\;(\textsf {{fst}}\circ {{\textsf {lab}}})\;{ Ts }) \mathrel \wedge (\forall T \mathbin \in { Ts }.\;{\textsf {wf}}\;T) \end{aligned}$$
To establish a fact of the form \(\forall T.\;P\;T \implies {\textsf {wf}}\;T\) with \(P : {\textsf {dtree}}\rightarrow {\textsf {bool}}\), a proof by coinduction on the definition of\({\textsf {wf}}\) proceeds by simply showing that P is also a solution of the same implication:
$$\begin{aligned} P\;({\textsf {Node}}\;(s,r)\;{ Ts }) \,\implies \, {\textsf {eff}}\;r\;s\;(\textsf {{image}}\;(\textsf {{fst}}\circ {{\textsf {lab}}})\;{ Ts }) \mathrel \wedge (\forall T \mathbin \in { Ts }.\;P\;T) \end{aligned}$$
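The coinductive predicate wf cannot be computed outright on an infinite tree, but it can be approximated by checking the defining implication down to a bounded depth. The following sketch illustrates the idea; effFn is a functional presentation of eff, lists stand in for finite sets (so the comparison is order-sensitive), and the names are ours.

    -- A depth-bounded approximation of wf: check the defining implication
    -- only for the first n levels of a (possibly infinite) tree.
    data Tree a = Node { lab :: a, sub :: [Tree a] }

    wfUpTo :: Eq sequent
           => (rule -> sequent -> Maybe [sequent])   -- effFn
           -> Int -> Tree (sequent, rule) -> Bool
    wfUpTo _     0 _                = True
    wfUpTo effFn n (Node (s, r) ts) =
         effFn r s == Just (map (fst . lab) ts)      -- rule r really was applied
      && all (wfUpTo effFn (n - 1)) ts               -- and recursively below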

3.4 Proofs

The finite derivation trees can be carved out of the codatatype \({\textsf {dtree}}\) using the predicate \({\textsf {finite}}\) defined inductively (i.e., as a least fixpoint) by the rule
$$\begin{aligned} \frac{\forall T \mathbin \in { Ts }.\;{\textsf {finite}}\;T}{{\textsf {finite}}\;({\textsf {Node}}\;(s,r)\;{ Ts })}\;\textsc {Fin} \end{aligned}$$
Indeed, the inductive interpretation is the right one for defining \({\textsf {finite}}\), since we want to enforce the well-founded iteration of the rule. (By contrast, a coinductive interpretation would classify all trees as “finite,” which is certainly not desirable.)

A proof of a sequent s is a finite well-formed derivation tree with s at its root:
$$\begin{aligned} {\textsf {proof}}\;T\;s \,\iff \, {\textsf {finite}}\;T \mathrel \wedge {\textsf {wf}}\;T \mathrel \wedge \textsf {{fst}}\,({{\textsf {lab}}}\;T) = s \end{aligned}$$
An infinite well-formed derivation tree represents a failed proof attempt.
Fig. 1  A proof

Example 4

Given the instantiation of Example 2, Fig. 1 shows a finite derivation tree for the sequent \({\textsf {All}}\;x\;(p(x)) \mathrel {{\mathrel {\rhd }}}{\textsf {Conj}}\;(p(y))\;(p(z))\) written using the familiar syntax for logical symbols. Figure 2 shows an infinite tree for the same sequent.

Fig. 2  A failed proof attempt

3.5 Soundness

We assume that the rules are locally sound:
  • Local Soundness: \(\forall r\;s\;{ ss }.\;\,{\textsf {eff}}\;r\;s\;{ ss }\,\Rightarrow \, \bigl (\forall \textit{S} \mathbin \in {\textsf {structure}}.\;(\forall s' \mathbin \in { ss }.\;\textit{S} \models s') \Rightarrow \textit{S} \models s\bigr )\).

The soundness theorem follows by induction on the finiteness of trees representing proofs.

Theorem 5

Assume the rule system fulfills Local Soundness. Then every sequent that has a proof is satisfied by all structures. Formally:
$$\begin{aligned} \forall s.\;(\exists T.\;{\textsf {proof}}\;T\;s) \,\implies \, {\models s} \end{aligned}$$

3.6 Infinite Paths and König’s Lemma

An infinite path in a derivation tree can be regarded as a way to “escape” the proof. To represent infinite paths independently of trees, we introduce the codatatype of streams over a type \(\alpha \) with the constructor \({{\textsf {SCons}}}\) and the selectors \({\textsf {shead}}\) and \({\textsf {stail}}\):
$$\begin{aligned} \mathbf {codatatype}\;\,\alpha \;{\textsf {stream}}= {{\textsf {SCons}}}\;({\textsf {shead}}: \alpha )\;({\textsf {stail}}: \alpha \;{\textsf {stream}}) \end{aligned}$$
The coinductive predicate \({\textsf {ipath}}: {\textsf {dtree}}\rightarrow {\textsf {step}}\;{\textsf {stream}}\rightarrow {\textsf {bool}}\) determines whether a stream of steps is an infinite path in a tree:
$$\begin{aligned} \frac{T \mathbin \in { Ts } \qquad {\textsf {ipath}}\;T\;{\sigma }}{{\textsf {ipath}}\;({\textsf {Node}}\;(s,r)\;{ Ts })\;({{\textsf {SCons}}}\;(s,r)\;{\sigma })} \end{aligned}$$
Our notion of a tree being finite can be shown to coincide with the more standard notion of having a finite number of nodes. Hence the following result is a variant of König’s lemma. Its proof allows us to show a first simple corecursive definition.

Lemma 6

If the tree T is infinite (i.e., non-finite), there exists an infinite path \({\sigma }\) in T.

Proof

By the contrapositive of Fin, if \({\textsf {Node}}\;(s,r)\;{ Ts }\) is infinite, there exists an infinite subtree \(T \in { Ts }\). Let \(f : \{T \mathbin \in {\textsf {dtree}}\mid \lnot \;{\textsf {finite}}\;T\} \rightarrow \{T \mathbin \in {\textsf {dtree}}\mid \lnot \;{\textsf {finite}}\;T\}\) be a function witnessing this fact—i.e., \(f\;T\) is an immediate infinite subtree of T. The desired infinite path \({\textsf {p}} : \{T \mathbin \in {\textsf {dtree}}.\;\lnot \;{\textsf {finite}}\;T\} \rightarrow {\textsf {step}}\;{\textsf {stream}}\) can be defined by primitive corecursion over the codatatype of streams: \({\textsf {p}}\;T = {{\textsf {SCons}}}\;({{\textsf {lab}}}\;T)\;({\textsf {p}}\;(f\;T))\). Equivalently, in terms of the selectors:
$$\begin{aligned} {\textsf {shead}}\;({\textsf {p}}\;T) = {{\textsf {lab}}}\;T \quad {\textsf {stail}}\;({\textsf {p}}\;T) = {\textsf {p}}\;(f\;T) \end{aligned}$$
Thus, \({\textsf {ipath}}\;T\;({\textsf {p}}\;T)\) by straightforward coinduction on the definition of \({\textsf {ipath}}\). \(\square \)
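The corecursive definition of p can be rendered directly in Haskell, with lazy lists playing the role of streams and the witness function f passed as a parameter (a sketch, with names of our choosing):

    -- The corecursive path construction from the proof of Lemma 6: given a
    -- function f that picks an infinite immediate subtree of any infinite
    -- tree, produce the corresponding infinite stream of labels.
    data Tree a = Node { lab :: a, sub :: [Tree a] }

    p :: (Tree a -> Tree a) -> Tree a -> [a]
    p f t = lab t : p f (f t)

Although p never terminates on its own, it is productive: forcing any finite prefix of p f t terminates, which is the computational counterpart of primitive corecursion.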

Remark 7

Essentially the same proof would hold if we allowed our trees to be infinitely branching, by replacing finite sets with countable sets in the definition of the \({\textsf {tree}}\) codatatype. This may seem counterintuitive if we think of the standard formulation of König’s lemma, but \({\textsf {finite}}\) would no longer mean having a finite number of nodes—it would simply mean “well founded.”

The following extension of König’s lemma applies to well-formed derivation trees, allowing one to construct an infinite path satisfying a unary invariant on its steps and a binary invariant on pairs of neighboring steps. The latter additionally takes the transition rule between the neighboring steps into account. The lemma’s statement involves the “always” predicate defined coinductively for streams over any set \(\beta \), namely, \({\textsf {alw}}: (\beta \;{\textsf {stream}}\rightarrow {\textsf {bool}}) \rightarrow \beta \;{\textsf {stream}}\rightarrow {\textsf {bool}}\), where \({\textsf {alw}}\;P\;{ xs }\) states that the predicate P holds for all suffixes of \({ xs }\):
$$\begin{aligned} \frac{P\;{ xs } \qquad {\textsf {alw}}\;P\;({\textsf {stail}}\;{ xs })}{{\textsf {alw}}\;P\;{ xs }} \end{aligned}$$

Lemma 8

Fix the set \(\alpha \) and the predicates \(I : {\textsf {sequent}}\times \alpha \rightarrow {\textsf {bool}}\) and \(P : {\textsf {sequent}}\times \alpha \rightarrow {\textsf {rule}}\rightarrow {\textsf {sequent}}\times \alpha \rightarrow {\textsf {bool}}\). Assume that
$$\begin{aligned} \begin{array}{l} \forall r\;s\;{ ss }\;a.\;\, {\textsf {eff}}\;r\;s\;{ ss }\mathrel \wedge I\;(s,a) \implies (\exists s'\;a'.\;s' \in { ss }\mathrel \wedge I\;(s',a') \mathrel \wedge P\;(s,a)\;r\;(s',a')) \end{array} \end{aligned}$$
If the tree T is well formed, there exists a stream \(\rho \in ({\textsf {step}}\times \alpha )\;{\textsf {stream}}\) such that its first projection is an infinite path in T (formally, \({\textsf {ipath}}\;T\,(\textsf {{smap}}\;\textsf {{fst}}\;\rho )\)) and \(\rho \) satisfies the predicate stating that the invariant I holds at every position of \(\rho \) and that P relates every position, via the rule applied there, to the next one.

In the above lemma, the assumption is that, for any sequent s, rule r, and element \(a \in \alpha \) such that the predicate I holds for \((s,a)\), there exists a premise sequent \(s'\) along the backward application of r and an element \(a'\) such that I again holds for \((s',a')\) and the predicate P holds for \((s,a)\), r, and \((s',a')\). The conclusion is that there exists an infinite path in the tree along which I and P always hold.

The proof is similar to that of Lemma 6, but it requires a slightly more complex function f, namely \(f : B \rightarrow B\), where \(B = \{(T,a) \in {\textsf {dtree}}\times \alpha \mid I\;(\textsf {{fst}}\;({{\textsf {lab}}}\;T), a)\}\), such that \( P\;(\textsf {{fst}}\;({{\textsf {lab}}}\;T),\, a)\;r\;(\textsf {{fst}}\;({{\textsf {lab}}}\;T'),\, a')\) holds whenever \((T,a) \in B\) and \((T',a') = f\,(T,a)\). The lemma’s assumption ensures such a choice of f is possible.

3.7 Escape Paths

An escape path is a stream of steps that can form an infinite path in a derivation tree. It is defined coinductively as the predicate \({\textsf {epath}}: {\textsf {step}}\;{\textsf {stream}}\rightarrow {\textsf {bool}}\), which requires that every element in the given stream be obtained by applying an existing rule and choosing one of the resulting sequents:
$$\begin{aligned} \frac{{\textsf {eff}}\;r\;s\;{ ss } \qquad \textsf {{fst}}\;({\textsf {shead}}\;{\sigma }) \mathbin \in { ss } \qquad {\textsf {epath}}\;{\sigma }}{{\textsf {epath}}\;({{\textsf {SCons}}}\;(s,r)\;{\sigma })} \end{aligned}$$
The following lemma is easy to prove by coinduction on the definition of \({\textsf {epath}}\).

Lemma 9

For any stream \({\sigma }\) and tree T, if \({\textsf {wf}}\;T\) and \({\textsf {ipath}}\;T\;{\sigma }\), then \({\textsf {epath}}\;{\sigma }\).

Example 10

The stream
$$\begin{aligned} \begin{aligned}&(\forall x.\, p(x) \mathrel {\rhd }p(y) \wedge p(z) ,\, {{\textsc {Conj}}\textsc {R}_{p(y),\, p(z)}}) \cdot (\forall x.\, p(x) \mathrel {\rhd }p(z) ,\, {{\textsc {All}}\textsc {L}_{x,p(x),y}}) \cdot {} \\&(\forall x.\, p(x), p(y) \mathrel {\rhd }p(z) ,\, {{\textsc {All}}\textsc {L}_{x,p(x),y}})^\infty \end{aligned}\end{aligned}$$
where \((s,r) \cdot {\sigma }= {{\textsf {SCons}}}\;(s,r)\;{\sigma }\) and \((s,r)^\infty = (s,r) \cdot (s,r) \cdot \ldots \,\), is an escape path for the tree of Fig. 2.

3.8 Countermodel Paths

A countermodel path is a sequence of steps that witnesses the unprovability of a sequent s. Any escape path starting at s is a candidate for a countermodel path, given that it indicates a way to apply the proof rules without reaching any result. For it to be a genuine countermodel path, all possible proofs must have been attempted. More specifically, whenever a rule becomes enabled along the escape path, it is eventually applied later in the sequence. For FOL with its standard sequents, such paths can be used to produce actual countermodels by interpreting all statements along the path on the left of the sequents as true, and all statements on the right as false.

Formally, a rule r is enabled in a sequent s if it has an effect (i.e., \({\textsf {eff}}\;r\;s\;{ ss }\) for some \({ ss }\)). This is written \({\textsf {enabled}}\;r\;s\). For any rule r and stream \({\sigma }\):
  • \({\textsf {taken}}_r\;{\sigma }\) states that r is taken at the start of the stream (i.e., \({\textsf {shead}}\;{\sigma }= (s,r)\) for some s);

  • \({\textsf {enabledAt}}_r\;{\sigma }\) states that r is enabled at the beginning of the stream (i.e., if \({\textsf {shead}}\;{\sigma }= (s, r')\), then \({\textsf {enabled}}\;r\;s\)).

Recall that, given any set \(\alpha \), predicate \(P : \alpha \;{\textsf {stream}}\rightarrow {\textsf {bool}}\), and stream \({ xs }\in \alpha \;{\textsf {stream}}\), the predicate \({\textsf {alw}}\;P\;{ xs }\) (“always P”) states that P is true for all suffixes of \({ xs }\). Dually, we take \({\textsf {ev}}\;P\;{ xs }\) (“eventually P”) to mean that P is true for some suffix of \({ xs }\).
A stream \({\sigma }\) is saturated if, at each point, any enabled rule is taken at a later point:
$$\begin{aligned} {\textsf {saturated}}\;{\sigma }\,\iff \, (\forall r \mathbin \in {\textsf {rule}}.\;\, {\textsf {alw}}\;(\lambda {\sigma }'.\,\, {\textsf {enabledAt}}_r\;{\sigma }' \Rightarrow {\textsf {ev}}\;{\textsf {taken}}_r\;{\sigma }')\;{\sigma }) \end{aligned}$$
A countermodel path for a sequent s is a saturated escape path \({\sigma }\) starting at s:
$$\begin{aligned} {\textsf {countermodelPath}}\;s\;{\sigma }\,\iff \, {\textsf {epath}}\;{\sigma }\mathrel \wedge {\textsf {saturated}}\;{\sigma }\mathrel \wedge \textsf {{fst}}\;({\textsf {shead}}\;{\sigma }) = s \end{aligned}$$
Fig. 3  A derivation tree with a countermodel path

Example 11

The escape path presented in Example 10 is not saturated, since the rule \({\textsc {All}}\textsc {L}_{x,p(x),z}\) is enabled starting from the first position but never taken.

Example 12

The escape path in the tree of Fig. 3 is a countermodel path for \(\forall x.\,p(x) \mathrel {{\mathrel {\rhd }}}q(y)\), assuming that each possible term occurs infinitely often in the sequence \(t_1, t_2, \ldots \). The enabled rules along the escape path are all of the form \({\textsc {All}}\textsc {L}_{x,p(x),\_}\), and they are all always eventually taken.

3.9 Completeness

For the proof of completeness, we assume that the set of rules fulfills two properties:
  • Availability: For each sequent, at least one rule is enabled (i.e., \(\forall s.\;\exists r.\;{\textsf {enabled}}\;r\;s\)).

  • Persistence: For each sequent, if a rule is enabled but not taken, it remains enabled (i.e., \(\forall s\, r\,r'\, s'\, { ss }.\;{\textsf {enabled}}\;r'\,s \mathrel \wedge r' \not = r \mathrel \wedge {\textsf {eff}}\;r\;s\;{ ss }\mathrel \wedge s'\in \textsf {{set}}\;{ ss }\,\Rightarrow \, {\textsf {enabled}}\;r'\,s'\)).

(We will later remove the first condition with Theorem 18.) The above conditions are local properties of the effects of rules, not global properties of the proof system. This makes them easy to verify for particular systems.

Remark 13

The saturation condition on streams of steps from Sect. 3.8 is stronger than the standard properties of fairness and justice [21]. Fairness would require the rules to be continuously enabled to guarantee that they are eventually taken. Justice is stronger in that it would require the rules to be enabled infinitely often, but not necessarily continuously. Saturation goes further: If a rule is ever enabled, it will certainly be chosen at a later point. Saturation may seem too strong for the task at hand. However, in the presence of Persistence, a rule enabled at some point and never taken will be continuously (in particular, infinitely often) enabled; hence the notions of fairness, justice, and saturation all coincide.

In addition to Availability and Persistence, we assume a function \({\textsf {herbrand}}: {\textsf {sequent}}\rightarrow {\textsf {step}}\;{\textsf {stream}}\rightarrow {\textsf {structure}}\) that maps countermodel paths to actual countermodels:
  • Herbrandness: \(\forall s\;{\sigma }.\;\,{\textsf {countermodelPath}}\;s\;{\sigma }\,\Rightarrow \, {\textsf {herbrand}}\;s\;{\sigma }\not \models s\).

A countermodel path provides an argument against provability. That this argument fully complements provability is the essence of the completeness theorem in its abstract form:

Lemma 14

Assume the rule system fulfills Availability and Persistence. Then every sequent admits a proof or a countermodel path. Formally:
$$\begin{aligned} \forall s.\;\, (\exists T.\;{\textsf {proof}}\;T\;s) \mathrel \vee (\exists {\sigma }.\;{\textsf {countermodelPath}}\;s\;{\sigma }) \end{aligned}$$

Proof

The proof uses the following combinators:
  • \({\textsf {stake}}\): \(\alpha \;{\textsf {stream}}\rightarrow {\textsf {nat}}\rightarrow \alpha \;{\textsf {list}}\) maps \(\rho \) and n to the list of the first n elements of \(\rho \);

  • \(\textsf {{smap}}\): \((\alpha \rightarrow \beta ) \rightarrow \alpha \;{\textsf {stream}}\rightarrow \beta \;{\textsf {stream}}\) maps a function to every element of the stream;

  • \({\textsf {nats}}\): \({\textsf {nat}}\;{\textsf {stream}}\) denotes the stream of natural numbers: \(0 \cdot 1 \cdot 2 \cdot 3 \cdot \ldots \,\);

  • \({\textsf {flat}}\): \((\alpha \;{\textsf {list}})\;{\textsf {stream}}\rightarrow \alpha \;{\textsf {stream}}\) maps a stream of finite nonempty lists to the stream obtained by concatenating those lists;

  • \({\textsf {sdropWhile}}\): \((\alpha \rightarrow {\textsf {bool}}) \rightarrow \alpha \;{\textsf {stream}}\rightarrow \alpha \;{\textsf {stream}}\) removes the maximal prefix of elements that fulfill a given predicate from a given stream (or returns an irrelevant default value if the predicate holds for all elements).

We start by constructing a fair stream of rules \({\textsf {fenum}}\)—i.e., every rule occurs infinitely often in \({\textsf {fenum}}\). Let \({\textsf {enum}}\) be a stream whose elements cover the entire set \({\textsf {rule}}\), which is required to be countable. Take \({\textsf {fenum}}= {\textsf {flat}}\;(\textsf {{smap}}\;({\textsf {stake}}\;{\textsf {enum}})\;({\textsf {stail}}\;{\textsf {nats}}))\). Thus, if \({\textsf {enum}}= r_1 \cdot r_2 \cdot r_3 \cdot \ldots \), then \({\textsf {fenum}}= r_1 \cdot r_1 \cdot r_2 \cdot r_1 \cdot r_2 \cdot r_3 \cdot \ldots \).
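Rendered in Haskell with lazy lists as streams, the construction of \({\textsf {fenum}}\) reads as follows (a sketch; enum is assumed to be a stream covering the entire, countable set of rules):

    -- The fair enumeration of rules from the proof.
    stake :: [a] -> Int -> [a]     -- first n elements of a stream
    stake xs n = take n xs

    flat :: [[a]] -> [a]           -- concatenate a stream of (nonempty) lists
    flat = concat

    fenum :: [rule] -> [rule]
    fenum enum = flat (map (stake enum) (tail [0 ..]))
    -- e.g., fenum (r1 : r2 : r3 : ...) begins r1, r1, r2, r1, r2, r3, ...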
Let s be a sequent. Using \({\textsf {fenum}}\), we build a derivation tree \(T_0\) labeled with s such that all its infinite paths are saturated. Let \({\textsf {fair}}\) be the subset of \({\textsf {rule}}\;{\textsf {stream}}\) consisting of the fair streams. Clearly, any suffix of an element in \({\textsf {fair}}\) also belongs to \({\textsf {fair}}\). In particular, \({\textsf {fenum}}\) and all its suffixes belong to \({\textsf {fair}}\). Given \(\rho \in {\textsf {fair}}\) and \(s\in {\textsf {sequent}}\), \({\textsf {sdropWhile}}\;(\lambda r.\;\lnot \;{\textsf {enabled}}\;r\;s)\;\rho \) has the form \({{\textsf {SCons}}}\;r\;\rho '\), making r the first enabled rule in \(\rho \). Such a rule exists because, by Availability, at least one rule is enabled at s and, by fairness, all the rules occur in \(\rho \). Since \({\textsf {enabled}}\;r\;s\), we can pick a set of sequents \({ ss }\) such that \({\textsf {eff}}\;r\;s\;{ ss }\). We define \({\textsf {mkTree}}: {\textsf {fair}}\rightarrow {\textsf {sequent}}\rightarrow {\textsf {dtree}}\) corecursively as
$$\begin{aligned} {\textsf {mkTree}}\;\rho \;s = {\textsf {Node}}\; (s, r) \; (\textsf {{image}}\;({\textsf {mkTree}}\;\rho ')\;{ ss }) \end{aligned}$$
We prove that, for all \(\rho \in {\textsf {fair}}\) and s, the derivation tree \({\textsf {mkTree}}\;\rho \;s\) is well formed and all its infinite paths are saturated. Wellformedness is obvious because at each point the continuation is built starting with the effect of a rule. For saturation, we show that if rule r is enabled at sequent s and \({\textsf {ipath}}\;({\textsf {mkTree}}\;\rho \;s)\;{\sigma }\), then r appears along \({\sigma }\) (i.e., there exists a sequent \(s'\) such that \((s',r)\) is in \({\sigma }\)). This follows by induction on the position of r in \(\rho \), \({\textsf {pos}}\;r\;\rho \)—formally, the length of the shortest list \(\rho _0\) such that \(\rho = \rho _0 \mathbin {@} {{\textsf {SCons}}}\;r\;\_\), where @ denotes concatenation.

Let \(r'\) be the first rule from \(\rho \) enabled at sequent s. If \(r = r'\), then \({\textsf {mkTree}}\;\rho \;s\) has label (s, r) already. Otherwise, \(\rho \) has the form \(\rho _1 \mathbin {@} [r'] \mathbin {@} \rho '\), with r not in \(\rho _1\), hence \({\textsf {pos}}\;r\;\rho ' < {\textsf {pos}}\;r\;\rho \). From the definitions of \({\textsf {ipath}}\) and \({\textsf {mkTree}}\), it follows that \({\textsf {ipath}}\;({\textsf {mkTree}}\;\rho '\;s')\;({\textsf {stail}}\;{\sigma })\) holds for some \(s' \in { ss }\), where \({ ss }\) is such that \({\textsf {eff}}\;r'\;s\;{ ss }\). By Persistence, r is still enabled at \(s'\); hence, by the induction hypothesis, r appears along \({\textsf {stail}}\;{\sigma }\), and therefore along \({\sigma }\) as desired. In particular, \(T_0 = {\textsf {mkTree}}\;{\textsf {fenum}}\;s\) is well formed and all its infinite paths are saturated.

Finally, if \(T_0\) is finite, it is the desired proof of s. Otherwise, by Lemma 6 (König) it has an infinite path. This path is necessarily saturated; by Lemma 9, it is the desired countermodel path. \(\square \)
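For readers who want to see the corecursive tree construction as a program, here is a Haskell sketch of \({\textsf {mkTree}}\), with lazy lists playing the role of both streams and finite sets and with effFn a functional presentation of \({\textsf {eff}}\); the names are ours, and the sketch assumes that the rule stream is fair and that some rule is always enabled.

    -- A sketch of the corecursive construction mkTree from the proof of Lemma 14.
    data Tree a = Node a [Tree a]

    mkTree :: (rule -> sequent -> Maybe [sequent])    -- effFn
           -> [rule]                                  -- a fair stream of rules
           -> sequent
           -> Tree (sequent, rule)
    mkTree effFn rho s = Node (s, r) (map (mkTree effFn rho') ss)
      where
        enabled r' = case effFn r' s of { Just _ -> True; Nothing -> False }
        (r : rho') = dropWhile (not . enabled) rho    -- first enabled rule in rho
        Just ss    = effFn r s                        -- its premises at s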

Lemma 14 captures the abstract essence of arguments from the literature, although this is sometimes hard to grasp under the thick forest of syntactic details and concrete strategies for fair enumeration: A fair tree is constructed, which attempts a proof; in case of failure, the tree exhibits a saturated escape path. By Herbrandness, we immediately obtain completeness:

Theorem 15

Fix a rule system and assume that it fulfills Availability, Persistence, and Herbrandness. Then every sequent that is satisfied by all structures admits a proof:
$$\begin{aligned} \forall s.\;{\models s} \,\implies \, (\exists T.\;{\textsf {proof}}\;T\;s) \end{aligned}$$

Proof

Given s such that \(\models s\), assume by absurdity that there exists no T such that \({\textsf {proof}}\;T\;s\). By Lemma 14, we obtain \({\sigma }\) such that \({\textsf {countermodelPath}}\;s\;{\sigma }\). Then, by Herbrandness, we have \({\textsf {herbrand}}\;s\;{\sigma }\not \models s\), which contradicts \(\models s\). \(\square \)

Remark 16

If we are not interested in witnessing the proof attempt closely, Lemma 14 can be established more directly by building the fair path without going through an intermediate fair tree. The key observation is that if a sequent s has no proof and \({\textsf {eff}}\;r\;s\;{ ss }\), there must exist a sequent \(s'\in { ss }\) that has no proof. (Otherwise, we could compose the proofs of all \(s'\) into a proof of s by applying rule r.) Let \({\textsf {pick}}\;r\;s\;{ ss }\) denote such an \(s'\). We proceed directly to the construction of a saturated escape path as a corecursive function \({\textsf {mkPath}}: {\textsf {fair}}\rightarrow \{s \in {\textsf {sequent}}\mid s \text { has no proof}\} \rightarrow {\textsf {step}}\;{\textsf {stream}}\) following the same idea as for the previous tree construction (function \({\textsf {mkTree}}\)):
$$\begin{aligned} {\textsf {mkPath}}\;\,\rho \;s = {{\textsf {SCons}}}\; (s, r) \; ({\textsf {mkPath}}\;\,\rho '\;({\textsf {pick}}\;r\;s\;{ ss })) \end{aligned}$$
where again \({{\textsf {SCons}}}\;r\;\rho '={\textsf {sdropWhile}}\;(\lambda r.\;\lnot \;{\textsf {enabled}}\;r\;s)\;\rho \) and \({ ss }\) is such that \({\textsf {eff}}\;r\;s\;{ ss }\). Saturation of \({\textsf {mkPath}}\;\rho \;s\) follows by an argument similar to the one used for the infinite paths of the tree constructed by \({\textsf {mkTree}}\).
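A Haskell sketch of \({\textsf {mkPath}}\), under the same assumptions and conventions as the earlier \({\textsf {mkTree}}\) sketch, with pick supplied as a parameter:

    -- The direct path construction mkPath from Remark 16.
    mkPath :: (rule -> sequent -> Maybe [sequent])        -- effFn
           -> (rule -> sequent -> [sequent] -> sequent)   -- pick
           -> [rule] -> sequent -> [(sequent, rule)]
    mkPath effFn pick rho s = (s, r) : mkPath effFn pick rho' (pick r s ss)
      where
        enabled r' = case effFn r' s of { Just _ -> True; Nothing -> False }
        (r : rho') = dropWhile (not . enabled) rho
        Just ss    = effFn r s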

3.10 Omitting the Availability Assumption

The abstract completeness result (Lemma 14) assumes Availability and Persistence. Among these assumptions, Persistence is essential: It ensures that the constructed fair path is saturated, meaning that every rule available at any point is eventually applied. Availability, by contrast, can be ensured after the fact, without affecting the system’s behavior, by introducing a special “idle” rule.

Lemma 17

A rule system \(\mathscr {R}= ({\textsf {sequent}},{\textsf {rule}},{\textsf {eff}})\) that fulfills Persistence can be transformed into the rule system \(\mathscr {R}_\mathrm {idle}= ({\textsf {sequent}},{\textsf {rule}}_\mathrm {idle},{\textsf {eff}}_\mathrm {idle})\) that fulfills both Persistence and Availability, with \({\textsf {rule}}_\mathrm {idle}= {\textsf {rule}}\;\cup \;\{{\textsc {Idle}}\}\) and \({\textsf {eff}}_\mathrm {idle}\) behaving like \({\textsf {eff}}\) on \({\textsf {rule}}\) and \({\textsf {eff}}_\mathrm {idle}\;{\textsc {Idle}}\;s\;{ ss }\,\iff \, { ss }= \{s\}\).

Proof

Availability for the modified system follows from the continuous enabledness of \({\textsc {Idle}}\). Persistence follows from the Persistence of the original system together with the property that \({\textsc {Idle}}\) is continuously enabled and does not alter the sequent. The modified system is equivalent to the original one because \({\textsc {Idle}}\) does not alter the sequent. \(\square \)
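The construction of \(\mathscr {R}_\mathrm {idle}\) is easy to express programmatically; a minimal sketch, with effFn a functional presentation of \({\textsf {eff}}\):

    -- Extending a rule system with the Idle rule of Lemma 17: Idle is always
    -- enabled and leaves the sequent unchanged.
    data RuleIdle rule = Orig rule | Idle

    effIdle :: (rule -> sequent -> Maybe [sequent])       -- the original effFn
            -> RuleIdle rule -> sequent -> Maybe [sequent]
    effIdle effFn (Orig r) s = effFn r s
    effIdle _     Idle     s = Just [s]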

Now we can rephrase Theorem 15 to assume Persistence and a slight variation of the Herbrandness condition. All the concepts refer as before to the rule system \(\mathscr {R}\), except for \({\textsf {countermodelPath}}_\mathrm {idle}\), which refers to \(\mathscr {R}_\mathrm {idle}\):

Theorem 18

Let \(\mathscr {R}\) be a rule system that fulfills Persistence and Herbrandness\(_\mathrm {idle}\). Then every sequent that is satisfied by all structures admits a proof in \(\mathscr {R}\).

Proof

We apply Lemma 14 to the system \(\mathscr {R}_\mathrm {idle}\) to obtain that every sequent admits either a proof or a countermodel path, both in this system. Since \(\mathscr {R}_\mathrm {idle}\) is an extension of \(\mathscr {R}\) with a single rule, any proof in \(\mathscr {R}\) corresponds to a proof in \(\mathscr {R}_\mathrm {idle}\). Conversely, any proof in \(\mathscr {R}_\mathrm {idle}\) can be transformed into a proof in \(\mathscr {R}\) by omitting all applications of \({\textsc {Idle}}\). We thus proved that every sequent admits a proof in \(\mathscr {R}\) or a countermodel path over \(\mathscr {R}_\mathrm {idle}\). We can then apply Herbrandness\(_\mathrm {idle}\), just as we did with Herbrandness for Theorem 15. \(\square \)

Remark 19

The addition of \({\textsc {Idle}}\) is inspired by, and similarly motivated as, that of idle transitions to Kripke structures in the context of temporal logic, where it is technically convenient to consider only infinite paths.

4 Concrete Instances of Soundness and Completeness

The abstract soundness is based on the Local Soundness assumption, which is easy to verify for all the considered instances. Therefore, below we focus on completeness.

4.1 Classical First-Order Logic

The abstract completeness proof is parameterized by a rule system. This section concretizes the result for the Gentzen system from Sect. 2.1 to derive the standard completeness theorem. Example 2 recasts it as a rule system; we must verify that it fulfills the Persistence and Herbrandness conditions.

The Gentzen rules are syntax-directed in that they operate on formulas having specific connectives or quantifiers at the top. This is what guarantees Persistence. For example, an application of \({\textsc {Ax}}_{a}\) (which affects only the atom a) leaves any potential enabledness of \({\textsc {All}}\textsc {L}_{x,\varphi ,t}\) (which affects only formulas with \({\textsf {All}}\) at the top) unchanged, and vice versa; moreover, \({\textsc {Ax}}_{a}\) does not overlap with \({\textsc {Ax}}_{b}\) for \(a \not = b\). A minor subtlety concerns \({\textsc {All}}\textsc {R}_{x,\varphi }\), which requires the existence of a fresh y in order to be enabled. Persistence holds because the sequents are finite, so we can always find a fresh variable in the countably infinite set \({\textsf {var}}\). On the other hand, Availability does not hold; for example, the sequent \(p(x) \mathrel {{\mathrel {\rhd }}}q(x)\) has no enabled rule. Hence, we need Theorem 18 and its Idle rule.

To infer the standard completeness theorem from Theorem 18, it suffices to define a suitable function \({\textsf {herbrand}}\). Let \({\sigma }\) be a countermodel path for \(\Gamma \mathrel {{\mathrel {\rhd }}}\Delta \) (i.e., a saturated escape path starting at \(\Gamma \mathrel {{\mathrel {\rhd }}}\Delta \)). Let \(\Gamma _{\sigma }\) be the union of the left-hand sides of sequents occurring in \({\sigma }\), and let \(\Delta _{\sigma }\) be the union of the corresponding right-hand sides. Clearly, \(\Gamma \subseteq \Gamma _{\sigma }\) and \(\Delta \subseteq \Delta _{\sigma }\). We define \({\textsf {herbrand}}\;(\Gamma \mathrel {{\mathrel {\rhd }}}\Delta )\;{\sigma }\) to be \(\mathscr {S}= (\textit{S}, F, P)\), where
  • S is the set of terms, \({\textsf {term}}\);

  • for each n-ary f and p and each \(t_1,\ldots ,t_n \in \textit{S}\):
$$\begin{aligned} F_{f}\;(t_1,\ldots ,t_n) = f (t_1,\ldots ,t_n) \quad \quad P_{p}\;(t_1,\ldots ,t_n) \,\iff \, p (t_1,\ldots ,t_n) \in \Gamma _{\sigma } \end{aligned}$$

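A small programmatic sketch of this structure, with gammaSigma a list approximation of the atoms occurring in the (possibly infinite) set \(\Gamma _{\sigma }\); the names are ours.

    -- The Herbrand structure: terms as carrier, syntactic interpretation of
    -- function symbols, and predicate interpretation by membership in the
    -- atoms collected from the left-hand sides along the path.
    data Term = V String | Fn String [Term]
      deriving Eq

    herbrandF :: String -> [Term] -> Term
    herbrandF f ts = Fn f ts                 -- F_f (t1, ..., tn) = f(t1, ..., tn)

    herbrandP :: [(String, [Term])] -> String -> [Term] -> Bool
    herbrandP gammaSigma p ts = (p, ts) `elem` gammaSigma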
In what follows, we employ the substitution lemma, which relates the notions of satisfaction and substitution:

Lemma 20

\(\mathscr {S}\models _\xi \varphi [t / x]\) iff \(\mathscr {S}\models _{\xi [x \leftarrow \llbracket t\rrbracket ^{\mathscr {S}}_{\xi }]} \varphi \).

Lemma 21

The structure \({\textsf {herbrand}}\;(\Gamma \mathrel {{\mathrel {\rhd }}}\Delta )\;{\sigma }\) is a countermodel for \(\Gamma \mathrel {{\mathrel {\rhd }}}\Delta \), meaning that there exists a valuation \(\xi : {\textsf {var}}\rightarrow S\) such that \(\mathscr {S}\not \models _\xi \Gamma \mathrel {{\mathrel {\rhd }}}\Delta \).

Proof

First, the pair Open image in new window from the definition of \({\textsf {herbrand}}\) can be shown to be well behaved with respect to all the connectives and quantifiers in the following sense:A pair Open image in new window fulfilling these properties is sometimes called a Hintikka set [1, 22]. These properties follow from the saturation of \({\sigma }\) with respect to the corresponding rules. The proofs are routine. For example:
  1. If the atom a belongs to both \(\Gamma _{\sigma }\) and \(\Delta _{\sigma }\), the rule \({\textsc {Ax}}_a\) is enabled in \({\sigma }\) and hence, by saturation, eventually taken. This is impossible, since \({\textsc {Ax}}_a\) has no premises, so its application would prevent the continuation of the infinite sequence \({\sigma }\). Hence no atom belongs to both \(\Gamma _{\sigma }\) and \(\Delta _{\sigma }\) (property 1).

  6. If \({\textsf {All}}\;x\;\varphi \in \Gamma _{\sigma }\) and t is a term, \({\textsc {All}}\textsc {L}_{x,\varphi ,t}\) is enabled in \({\sigma }\) and hence eventually taken, ensuring that \(\varphi [t / x] \in \Gamma _{\sigma }\) (property 6).
Let \(\xi \) be the embedding of variables into terms. To prove \(\mathscr {S}\not \models _\xi \Gamma \mathrel {{\mathrel {\rhd }}}\Delta \), it suffices to show that \(\mathscr {S}\models _\xi \varphi \) for all \(\varphi \in \Gamma _{\sigma }\) and \(\mathscr {S}\not \models _\xi \varphi \) for all \(\varphi \in \Delta _{\sigma }\). These two facts follow together by induction on the depth of \(\varphi \). In the base case, if \({\textsf {Atm}}\;a \in \Gamma _{\sigma }\), then \(\mathscr {S}\models _\xi {\textsf {Atm}}\;a\) follows directly from the definition of \(\mathscr {S}\); moreover, if \({\textsf {Atm}}\;a \in \Delta _{\sigma }\), then by property 1 \({\textsf {Atm}}\;a \notin \Gamma _{\sigma }\), hence again \(\mathscr {S}\not \models _\xi {\textsf {Atm}}\;a\) follows from the definition of \(\mathscr {S}\). The only nontrivial inductive case is \({\textsf {All}}\), which requires Lemma 20. Assume \({\textsf {All}}\;x\;\varphi \in \Gamma _{\sigma }\). By property 6, we have \(\varphi [t / x] \in \Gamma _{\sigma }\) for any t. Hence, by the induction hypothesis, \(\mathscr {S}\models _{\xi } \varphi [t / x]\). By Lemma 20, \(\mathscr {S}\models _{\xi [x \leftarrow t]} \varphi \) for all t; that is, \(\mathscr {S}\models _\xi {\textsf {All}}\;x\;\varphi \). The second fact, concerning \({\textsf {All}}\;x\;\varphi \in \Delta _{\sigma }\), follows similarly from property 7. \(\square \)

We have thus obtained:

Corollary 22

A sequent is provable by a (finite) proof in the Gentzen system of classical FOL iff it is satisfied by all FOL structures.

Remark 23

The rule \({\textsc {All}}\textsc {L}\) stores, in the left context, a copy of the universal formula \({\textsf {All}}\;x\;\varphi \) when applied backwards. This is crucial for concrete completeness since a fair enumeration should try all the t instances of the universally quantified variable x, which requires Availability of \({\textsf {All}}\;x\;\varphi \) even after its use. If we labeled \({\textsc {All}}\textsc {L}\) as \({\textsc {All}}\textsc {L}_{x,\varphi }\) instead of \({\textsc {All}}\textsc {L}_{x,\varphi ,t}\), thereby delegating the choice of t to the nondeterminism of \({\textsf {eff}}\), the system would still be persistent as required by the abstract completeness proof, but Lemma 21 (and hence concrete completeness) would not hold—property 6 from the lemma’s proof would fail.

4.2 Further Instances

Theorem 18 is applicable to classical FOL Gentzen systems from the literature, in several variants: with sequent components represented as lists, multisets, or sets, one-sided or two-sided, and so on. This includes the systems G\('\), GCNF\('\), G, and G\(_{\text {=}}\) from Gallier [22] and G1, G2, G3, GS1, GS2, and GS3 from Troelstra and Schwichtenberg [51]. Persistence is easy to check. The syntax-independent part of the argument is provided by Theorem 18, while an ad hoc step analogous to Lemma 21 is required to build a concrete countermodel from a countermodel path to complete the proof.

Several FOL refutation systems based on tableaux or resolution are instances of the abstract theorem, provided that we read the abstract notion of “proof” as “refutation” and “countermodel” as “model.” Nondestructive tableaux [26]—including those presented in Bell and Machover [1] and in Fitting [20]—are usually persistent when regarded as derivation systems. After an application of Theorem 18, the ad hoc argument for interpreting the abstract model is similar to that for Gentzen systems (Lemma 21).

Regrettably, abstract completeness is not directly applicable beyond classical logic. It is generally not clear how to extract a specific model of a nonstandard logic from an abstract (proof-theoretic) model. Another issue is that standard sequent systems for nonclassical variations of FOL such as modal or intuitionistic logics do not have Persistence. A typical right rule MustR for the modal operator \(\Box \) (“must”) requires, to be applicable, that all the formulas in the context surrounding the eigenformula have \(\Box \) or \(\Diamond \) at the top [51]. Other rules may remove these operators, or introduce formulas that do not have them, thus disabling MustR.

Recent work targeted at simplifying completeness arguments [37] organizes modal logics as labeled transition systems, for which Kripke completeness is derived. (The technique also applies to the completeness of intuitionistic logic with respect to Kripke semantics.) In the proposed systems, the right rule for \(\Box \) is recast using explicit labels for worlds (\(w{,}\,w'\)) and explicit bookkeeping of the accessibility relation R, which make it possible to formulate the rule so that either no facts or only resilient facts are ever assumed about the surrounding context. The resulting proof system fulfills Persistence, enabling Theorem 18. The Kripke countermodel construction is roughly as for classical FOL Gentzen systems.

5 Abstract Infinite-Proof Soundness

In the previous sections, completeness is established by analyzing the interplay between the existence of finite proof trees (representing valid proofs) and certain infinite proof trees (used to produce countermodels). In this section, we look into the question: When can infinite proof trees be accepted as valid proofs?

We give a coinductive account of the abstract development of Brotherston et al. [16], in a slightly more general form, since we work with arbitrary infinite proofs, which may be acyclic. We start by recalling some motivation and intuition about cyclic proofs, in the context of the FOL\(_{\textsf {ind}}\) logic from Sect. 2.2. It is clear that the rules of the FOL\(_{\textsf {ind}}\) Gentzen system are (locally) sound for inductive structures. Hence, any finite proof tree built using these rules represents a valid proof, in that its root sequent is known to be satisfied by all inductive structures. But certain infinite proofs can be accepted as valid as well.

For the language of Example 1 from Sect. 2.2, consider the cyclic tree of Fig. 4, where one of the leaf nodes is not an axiom, but rather a link L “back” to the root of the tree (decorated with L). A cyclic proof indicates that, when L is reached, the (backward) proof continues with the sequent that L points to. Thus, the intended proof corresponding to the cyclic tree from Fig. 4 is the infinite tree from Fig. 5.
Fig. 4

A cyclic proof tree

Fig. 5

The infinite proof tree corresponding to the cyclic proof tree of Fig. 4

5.1 Soundness of Infinite Proof Trees

We fix the countable sets \({\textsf {sequent}}\) and \({\textsf {rule}}\) for sequents and rules, the class \({\textsf {structure}}\), and the satisfaction relation \({\models } : {\textsf {structure}}\rightarrow {\textsf {sequent}}\rightarrow {\textsf {bool}}\), writing \(\models s\) for \(\forall S \mathbin \in {\textsf {structure}}.\;S \models s\). In addition, we fix the following:
  • a set \({\textsf {marker}}\) of items called markers;

  • a marking function \({\textsf {mark}}: \{(s,r,s') \in {\textsf {sequent}}\times {\textsf {rule}}\times {\textsf {sequent}}\mid \exists { ss }.\;{\textsf {eff}}\;s\;r\;{ ss }\mathrel \wedge s' \in { ss }\} \rightarrow ({\textsf {marker}}\times {\textsf {bool}}\times {\textsf {marker}})\;\textsf {{set}}\);

  • an ordinal \({{\textsf {ord}}}\), with < and \(\le \) denoting its strict and nonstrict order relations;

  • a height function \({\textsf {height}}: {\textsf {marker}}\times {\textsf {structure}}\rightarrow {{\textsf {ord}}}\).

The marking function associates a set of triples \((M,b,M')\) with every backward application \((s,r,s')\) of every rule. In \((s,r,s')\), the rule r is thought of as being applied to s and then one of the resulting sequents, \(s'\), being chosen. In \((M,b,M')\), the Boolean value b being \(\textsf {{False}}\) means “stay” and b being \(\textsf {{True}}\) means “decrease.” The “stay” and “decrease” directives refer to how the height function \({\textsf {height}}\) should evolve along this rule application: Given a sequent s, a countermodel S for it, and a rule r that yields a set of sequents \({ ss }\) when applied to s, there should exist a sequent \(s' \in { ss }\) and a countermodel \(S'\) of \(s'\) such that, for all \((M,b,M') \in {\textsf {mark}}\;(s,r,s')\), when moving from \((M,S)\) to \((M',S')\):
  • the height should stay the same or decrease when b says “stay”;

  • the height should decrease when b says “decrease.”

Formally, we postulate the following condition:

Descent:For all S and s such that \(S \not \models s\) and all r, \({ ss }\) such that \({\textsf {eff}}\;s\;r\;{ ss }\), there exist \(S'\) and \(s' \in { ss }\) such that \(S' \not \models s'\) and \({{\textsf {descent}}}\;(s,S)\;r\;(s',S')\).

In the above, \({{\textsf {descent}}}: {\textsf {sequent}}\times {\textsf {structure}}\rightarrow {\textsf {rule}}\rightarrow {\textsf {sequent}}\times {\textsf {structure}}\rightarrow {\textsf {bool}}\) is defined as follows:
$$\begin{aligned} \begin{array}{l} {{\textsf {descent}}}\;(s,S)\;r\;(s',S') \,\iff \, {} \\ (\forall M\;b\;M'.\;\, (M,b,M') \in {\textsf {mark}}\;(s,r,s') \,\implies {} \\ \phantom {(\forall M\;b\;M'.\;\,}(b = \textsf {{False}}\mathrel \wedge {\textsf {height}}\;(M',S') \le {\textsf {height}}\;(M,S)) \;\vee \; \\ \phantom {(\forall M\;b\;M'.\;\,}(b = \textsf {{True}}\mathrel \wedge {\textsf {height}}\;(M',S') < {\textsf {height}}\;(M,S))) \end{array} \end{aligned}$$

Remark 24

The Descent condition is a strengthening of the Local Soundness condition from Sect. 3.5: Removing the “and \({{\textsf {descent}}}\;(s,S)\;r\;(s',S')\)” part yields the contrapositive of Local Soundness. Essentially, Descent is a form of Local Soundness with the additional requirement that the marking function’s directives must be respected.

Under the Descent assumption, we can identify certain “good” infinite proof trees that can be accepted as valid proofs along with the finite ones. First, we need the notion of a stream of Booleans and a stream of markers following an infinite path. The predicate \({\textsf {follow}}: {\textsf {bool}}\;{\textsf {stream}}\rightarrow {\textsf {marker}}\;{\textsf {stream}}\rightarrow {\textsf {step}}\;{\textsf {stream}}\rightarrow {\textsf {bool}}\) is defined coinductively. Intuitively, \({\textsf {follow}}\;{ bs }\;{ Ms }\;{\sigma }\) states that, for any consecutive steps \((s,r),(s',\_)\) in \({\sigma }\), the corresponding markers \(M,M'\) in \({ Ms }\) and the Boolean b in \({ bs }\) corresponding to the first of these satisfy \((M,b,M') \in {\textsf {mark}}\;(s,r,s')\); in other words, the streams \({ bs }\) and \({ Ms }\) represent choices of the sets of markers for \({\sigma }\).
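For concreteness, one natural way to render the single introduction rule of this coinductive definition, consistent with the informal reading just given (the formalization may differ in inessential details), is
$$\begin{aligned} \frac{(M,b,M') \in {\textsf {mark}}\;(s,r,s') \qquad {\textsf {follow}}\;{ bs }\;({{\textsf {SCons}}}\;M'\;{ Ms })\;({{\textsf {SCons}}}\;(s',r')\;{\sigma })}{{\textsf {follow}}\;({{\textsf {SCons}}}\;b\;{ bs })\;({{\textsf {SCons}}}\;M\;({{\textsf {SCons}}}\;M'\;{ Ms }))\;({{\textsf {SCons}}}\;(s,r)\;({{\textsf {SCons}}}\;(s',r')\;{\sigma }))} \end{aligned}$$
where \({{\textsf {SCons}}}\) is the stream constructor (also used in the proof of Theorem 25 below).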
We define a tree to be good if each of its infinite paths has some streams of Booleans and markers that eventually follow it (i.e., follow a suffix of it), in such a way that the Booleans indicate infinite decrease:
$$\begin{aligned}{\textsf {good}}\;T \iff (\forall {\sigma }.\; {\textsf {ipath}}\;T\;{\sigma }\implies {\textsf {ev}}\;(\lambda {\sigma }'.\;\exists { bs }\;{ Ms }.\;{\textsf {follow}}\;{ bs }\;{ Ms }\;{\sigma }' \mathbin \wedge {\textsf {infDecr}}\;{ bs })\;{\sigma }) \end{aligned}$$
where the infinite decrease of a Boolean stream simply means \(\textsf {{True}}\) (“decrease”) occurring infinitely often:
$$\begin{aligned} {\textsf {infDecr}}={\textsf {alw}}\;({\textsf {ev}}\;(\lambda \;{ bs }.\;{\textsf {shead}}\;{ bs }= \textsf {{True}})) \end{aligned}$$
A (potentially) infinite proof of a sequent, or iproof, is then defined as a good, well-formed tree having that sequent at the root:
$$\begin{aligned} {\textsf {iproof}}\;T\;s \,\iff \, {\textsf {wf}}\;T \mathrel \wedge {\textsf {good}}\;T \mathrel \wedge \textsf {{fst}}\;({{\textsf {lab}}}\;T) = s \end{aligned}$$
Since finite trees have no infinite paths, they are trivially good; hence (finite) proofs are particular cases of iproofs. Our goal is to show that iproofs are also sound.

Theorem 25

Assume the rule system fulfills Descent. Then every sequent that has an iproof is satisfied by all structures:
$$\begin{aligned} \forall s.\;\, (\exists T.\;{\textsf {iproof}}\;T\;s) \implies {\models s} \end{aligned}$$

Proof

Fix s and T such that \({\textsf {iproof}}\;T\;s\) and assume that \(\not \models s\), meaning there exists S such that \(S \not \models s\). To obtain a contradiction, we proceed as follows:
  1. Applying Lemma 8 (Sect. 3.6) for \(\alpha = {\textsf {structure}}\), \(I = \lambda (s,\textit{S}).\;\textit{S} \not \models s\), and \(P = {{\textsf {descent}}}\) to the Descent assumption, we obtain a stream \(\delta \) of step–structure pairs (i.e., of sequent–rule–structure triples) \(((s,r),\textit{S})\) such that, at each point in the stream, the structure \(\textit{S}\) is a countermodel for the sequent s and \({{\textsf {descent}}}\) holds for any two consecutive elements; formally, \({\textsf {alwCmodDesc}}\;\delta \) (“always countermodel and descending”) holds. Moreover, the step components of \(\delta \) form an infinite path in T: \({\textsf {ipath}}\;T\;(\textsf {{smap}}\;\textsf {{fst}}\;\delta )\).
     
  2. Using the goodness of T, we obtain streams \({ bs }\) and \({ Ms }\) that follow a suffix \({\sigma }'\) of \(\textsf {{smap}}\;\textsf {{fst}}\;\delta \):
    $$\begin{aligned} \exists {\sigma }'\;{\sigma }''\;{ bs }\;{ Ms }.\;\,\textsf {{smap}}\;\textsf {{fst}}\;\delta = {\sigma }'' {{@}}\, {\sigma }' \mathrel \wedge {\textsf {follow}}\;{ bs }\;{ Ms }\;{\sigma }' \mathrel \wedge {\textsf {infDecr}}\;{ bs }\end{aligned}$$
    where \({{@}}\) denotes concatenation of a list and a stream.
     
  3. Taking the suffix \(\delta '\) of \(\delta \) that corresponds to the suffix \({\sigma }'\) of its step component \(\textsf {{smap}}\;\textsf {{fst}}\;\delta \) (formally, defining \(\delta '\) to be the stream obtained by dropping the first \({{\textsf {length}}}\;{\sigma }''\) elements of \(\delta \)), we end up with the streams \(\delta '\), \({ bs }\), and \({ Ms }\) such that
    • \({\textsf {follow}}\;{ bs }\;{ Ms }\;(\textsf {{smap}}\;\textsf {{fst}}\;\delta ')\);

    • \({\textsf {infDecr}}\;{ bs }\);

    • \({\textsf {alwCmodDesc}}\;\delta '\) (because \({\textsf {alw}}\) is invariant under suffix).

     
  4. Let \({\textsf {zip}}\) denote the pair-zipping of two streams, defined by primitive corecursion as \({\textsf {zip}}\;({{\textsf {SCons}}}\;x\;{ xs })\;({{\textsf {SCons}}}\;y\;{ ys }) = {{\textsf {SCons}}}\;(x,y)\;({\textsf {zip}}\;{ xs }\;{ ys })\). Taking \({ ks }= \textsf {{smap}}\;{\textsf {height}}\;({\textsf {zip}}\;{ Ms }\;(\textsf {{smap}}\;\textsf {{snd}}\;\delta '))\) (a stream of \({{\textsf {ord}}}\) elements), we have that
    • \({ ks }\) is always nonstrictly decreasing: \({\textsf {alw}}\;(\lambda { ks }'.\;{\textsf {shead}}\;{ ks }' \ge {\textsf {shead}}\;({\textsf {stail}}\;{ ks }'))\;{ ks }\);

    • \({ ks }\) is infinitely often strictly decreasing: \({\textsf {alw}}\;({\textsf {ev}}\;(\lambda { ks }'.\;{\textsf {shead}}\;{ ks }' > {\textsf {shead}}\;({\textsf {stail}}\;{ ks }')))\;{ ks }\).

    This follows easily by coinduction using the gathered properties for \(\delta '\), \({ bs }\), and \({ Ms }\).
     
  5. From \({ ks }\), we obtain the substream \({ ks }'\) that is always strictly decreasing:
    $$\begin{aligned} {\textsf {alw}}\;(\lambda { ks }'.\;{\textsf {shead}}\;{ ks }' > {\textsf {shead}}\;({\textsf {stail}}\;{ ks }'))\;{ ks }' \end{aligned}$$
    which is in contradiction with the wellfoundedness of the order < on the ordinal \({{\textsf {ord}}}\). \(\square \)
     
One detail not explained in the above proof is the construction of \({ ks }'\), a stream of strictly decreasing items, from \({ ks }\), a stream of nonstrictly decreasing but always eventually strictly decreasing items. Informally, if \({ ks }\) is of the form \(k_0 \cdot k_1 \cdot \ldots \) with \(k_0 = k_1 = \cdots = k_n> k_{n+1} = k_{n+2} = \cdots = k_{n+m} > k_{n+m+1} = \cdots \), then \({ ks }'\) will be \(k_0 \cdot k_{n+1} \cdot k_{n+m+1} \cdot \ldots \); that is, \({ ks }'\) will only contain the “jump” elements. Formally, this construction can be compactly described as \({ ks }' = {\textsf {bfilter}}\,(>,{ ks })\), where \(>\; : {{\textsf {ord}}}\rightarrow {{\textsf {ord}}}\rightarrow {\textsf {bool}}\) is the dual strict order on \({{\textsf {ord}}}\). The general-purpose polymorphic “binary filter” combinator, \({\textsf {bfilter}}: D_\alpha \rightarrow \alpha \;{\textsf {stream}}\), has as domain the subset \(D_\alpha \) of \((\alpha \rightarrow \alpha \rightarrow {\textsf {bool}}) \times \alpha \;{\textsf {stream}}\) consisting of pairs \((P,{ xs })\) such that, for every element x in \({ xs }\), there exists an element y occurring later in \({ xs }\) such that \(P\;x\;y\):
$$\begin{aligned} D_\alpha = \{(P,{ xs }) \mid {\textsf {alw}}\;(\lambda { xs }'.\,{\textsf {ev}}\;(\lambda { xs }''.\,P\,({\textsf {shead}}\;{ xs }')\,({\textsf {shead}}\;{ xs }''))\,({\textsf {stail}}\;{ xs }'))\;{ xs }\} \end{aligned}$$
The \({\textsf {bfilter}}\) function is defined by cases on whether the first two elements of the stream are related by P, where \({\textsf {bfilter}}_P\) abbreviates \({\textsf {bfilter}}\;(P,\_)\): if \(P\;({\textsf {shead}}\;{ xs })\;({\textsf {shead}}\;({\textsf {stail}}\;{ xs }))\) holds, then \({\textsf {shead}}\;{ xs }\) is produced and \({\textsf {bfilter}}_P\) continues corecursively on \({\textsf {stail}}\;{ xs }\) (the ‘then’ branch); otherwise, the second element is dropped and \({\textsf {bfilter}}_P\) recurses on \({{\textsf {SCons}}}\,({\textsf {shead}}\;{ xs })\,({\textsf {stail}}\,({\textsf {stail}}\;{ xs }))\) (the ‘else’ branch).
Thus, \({\textsf {bfilter}}_P\) collects from a stream all the pairs of elements proximally related by P and forms a new stream with them. Its definition is neither purely recursive nor purely corecursive. Whereas the call on the ‘then’ branch is corecursive (showing how to produce the first element in the result stream, \({\textsf {shead}}\;{ xs }\)), the call on the ‘else’ branch does not exhibit any explicit productivity. However, the overall function is well defined (and indeed productive) because, thanks to the choice of the domain, at each point there will be at most a finite number of consecutive calls on the ‘else’ branch, until an element satisfying \(P\;({\textsf {shead}}\;{ xs })\) is met. In summary, the recursive call on the ‘else’ branch and the corecursive call on the ‘then’ branch jointly ensure the welldefinedness of this function.
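To convey the computational content, here is a minimal Haskell sketch of such a binary filter, with streams modeled as infinite lazy lists; the names are illustrative and the domain condition \(D_\alpha \) becomes an implicit assumption on the input (the code is not the formalized definition):
```haskell
-- Binary filter on infinite lazy lists: keep an element x and continue from
-- the next element y only when p x y holds; otherwise drop y and retry.
-- Productivity relies on the domain condition: after every element, some
-- later element is related to it by p.
bfilter :: (a -> a -> Bool) -> [a] -> [a]
bfilter p (x : y : ys)
  | p x y     = x : bfilter p (y : ys)  -- 'then' branch: corecursive, emits x
  | otherwise = bfilter p (x : ys)      -- 'else' branch: recursive, drops y
bfilter _ xs  = xs                      -- only infinite inputs are intended

-- numP z ys counts the elements consumed from ys before reaching some y with
-- p z y; it is the measure that makes the chain of 'else' calls terminate.
numP :: (a -> a -> Bool) -> a -> [a] -> Int
numP p z ys = length (takeWhile (not . p z) ys)

-- Example: take 3 (bfilter (>) ([5,5,5,4,4,3] ++ repeat 0)) == [5,4,3]
```
In the Isabelle formalization, the same behavior is obtained through the mixed recursion–corecursion discussed next.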
Thanks to recent infrastructure development [11], Isabelle accepts this mixed recursive–corecursive definition, after the user has shown that the chain of recursive ‘else’ calls terminates: If \((P,{ xs }) \in D_\alpha \), then in particular \({\textsf {ev}}\;(\lambda { xs }''.\,P\,({\textsf {shead}}\;{ xs })\,({\textsf {shead}}\;{ xs }''))\) holds for \({\textsf {stail}}\;{ xs }\). If we write \({ xs }'\) for the input to the recursive call, \({{\textsf {SCons}}}\,({\textsf {shead}}\;{ xs })\,({\textsf {stail}}\,({\textsf {stail}}\;{ xs }))\), we have that \({\textsf {shead}}\;{ xs }' = {\textsf {shead}}\;{ xs }\) and \({\textsf {stail}}\;{ xs }' = {\textsf {stail}}\,({\textsf {stail}}\;{ xs })\). Hence:
$$\begin{aligned} {\textsf {num}}_P\,({\textsf {shead}}\;{ xs }')\,({\textsf {stail}}\;{ xs }') ={}&{\textsf {num}}_P\,({\textsf {shead}}\;{ xs })\,({\textsf {stail}}\,({\textsf {stail}}\;{ xs })) \\ {} <{}&{\textsf {num}}_P\,({\textsf {shead}}\;{ xs })\,({\textsf {stail}}\;{ xs }) \end{aligned}$$
where \({\textsf {num}}_P\,z\;{ ys }\) is the number of elements that need to be consumed from \({ ys }\) before reaching some y such that \(P\;z\;y\). The use of such sophisticated corecursive machinery for this example may seem excessive. But note that \({\textsf {bfilter}}\) is a general-purpose combinator, defined once and usable for arbitrary predicates P.
For \(P = {>}\) and \(k_0 = k_1 = \cdots = k_n> k_{n+1} = k_{n+2} = \cdots = k_{n+m} > k_{n+m+1} = \cdots \), the execution of \({\textsf {bfilter}}\) first performs n consecutive ‘else’ steps (each dropping one of the elements equal to \(k_0\)), then a ‘then’ step producing \(k_0\); continuing in the same fashion, it produces \(k_{n+1}\), then \(k_{n+m+1}\), and so on.

5.2 From Cyclic Trees to Infinite Trees

With the general result for the soundness of infinite trees in place, let us come back to the notion of cyclic tree. In addition to the sets \({\textsf {sequent}}\) and \({\textsf {rule}}\), we assume a set \({\textsf {link}}\) of links. The set \({\textsf {cdtree}}\) of cyclic derivation trees is introduced as a datatype with a \({\textsf {Node}}\) constructor analogous to that of \({\textsf {dtree}}\) and an additional \({\textsf {Link}}\) constructor. Thus, there are two differences between the derivation trees we used so far (\({\textsf {dtree}}\)) and the cyclic derivation trees: the latter are restricted to be finite and are allowed to have, in addition to usual nodes, links to other cyclic derivation trees. To animate the links, we fix a function \({\textsf {pointsTo}}: {\textsf {link}}\rightarrow {\textsf {cdtree}}\) specifying, for each link, where it points to, subject to a requirement on where links may point; in particular, \({\textsf {pointsTo}}\;L\) is never itself of the form \({\textsf {Link}}\;L'\).
Each cyclic derivation tree T yields a derivation tree \({\textsf {treeOf}}\;T\) as follows: T is traversed as if recursively producing \({\textsf {treeOf}}\;T\) by the application of the same rules, and when reaching a \({\textsf {Link}}\;L\) one moves to the pointed cyclic tree \({\textsf {pointsTo}}\;L\). Because \({\textsf {pointsTo}}\;L\) can be arbitrarily large (in particular, larger than T), the definition of \({\textsf {treeOf}}: {\textsf {cdtree}}\rightarrow {\textsf {dtree}}\) cannot proceed recursively on the domain (\({\textsf {cdtree}}\)). However, it can naturally proceed corecursively on the codomain, \({\textsf {dtree}}\):
$$\begin{aligned} {\textsf {treeOf}}\;({\textsf {Node}}\;(s,r)\;{ Ts })= & {} {\textsf {Node}}\;(s,r)\;(\textsf {{fimage}}\;{\textsf {treeOf}}\;{ Ts })\\ {\textsf {treeOf}}\;({\textsf {Link}}\;L)= & {} {\textsf {treeOf}}\;({\textsf {pointsTo}}\;L) \end{aligned}$$
Strictly speaking, the definition is not purely corecursive: In the \({\textsf {Link}}\) case, it is not apparent what the first layer of the result tree is. However, since \({\textsf {pointsTo}}\;L\) is guaranteed not to have the form \({\textsf {Link}}\;L'\) and hence starts with the \({\textsf {Node}}\) constructor, we know that after exactly two calls we reach the \({\textsf {Node}}\) case; so Isabelle has no problem accepting this definition, again as an instance of mixed recursion–corecursion.
The root sequent of a cyclic derivation tree, \({\textsf {seqOf}}: {\textsf {cdtree}}\rightarrow {\textsf {sequent}}\), is defined as expected, following the link when necessary:
$$\begin{aligned} {\textsf {seqOf}}\;({\textsf {Node}}\;(s,r)\;{ Ts })= & {} s\\ {\textsf {seqOf}}\;({\textsf {Link}}\;L)= & {} {\textsf {seqOf}}\;({\textsf {pointsTo}}\;L) \end{aligned}$$
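To illustrate how the link-chasing corecursion plays out operationally, here is a minimal Haskell sketch of cyclic and plain derivation trees, with the finite sets of children simplified to lists; all names are illustrative rather than taken from the formalization:
```haskell
-- Plain (possibly infinite) derivation trees and finite cyclic trees with links.
data DTree seq rule       = DNode (seq, rule) [DTree seq rule]
data CDTree seq rule link = CNode (seq, rule) [CDTree seq rule link]
                          | CLink link

-- Unfold a cyclic tree into a (possibly infinite) derivation tree, following
-- links via pointsTo; laziness plays the role of corecursion here, and
-- productivity relies on pointsTo never returning a link.
treeOf :: (link -> CDTree seq rule link) -> CDTree seq rule link -> DTree seq rule
treeOf pointsTo (CNode sr ts) = DNode sr (map (treeOf pointsTo) ts)
treeOf pointsTo (CLink l)     = treeOf pointsTo (pointsTo l)

-- Root sequent of a cyclic tree, following the link when necessary.
seqOf :: (link -> CDTree seq rule link) -> CDTree seq rule link -> seq
seqOf _        (CNode (s, _) _) = s
seqOf pointsTo (CLink l)        = seqOf pointsTo (pointsTo l)
```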
Even though cyclic derivation trees are finite entities, they can exhibit infinite behavior through the links. Therefore, we must define their wellformedness \({\textsf {cwf}}: {\textsf {cdtree}}\rightarrow {\textsf {bool}}\) not inductively, but coinductively, via two rules, Cwf-Node and Cwf-Link.
The Cwf-Node rule is essentially the same as the wellformedness rule for derivation trees, whereas Cwf-Link requires the trees pointed to by links to be well formed as well.
Now, we can define the notion of being a cyclic proof for a sequent:
$$\begin{aligned} {\textsf {cproof}}\;T\;s \,\iff \, {\textsf {cwf}}\;T \mathrel \wedge {\textsf {good}}\;({\textsf {treeOf}}\;T) \mathrel \wedge {\textsf {seqOf}}\;T = s \end{aligned}$$
Goodness, being a property stated in terms of infinite paths, naturally pertains to the (infinite) derivation tree generated by a cyclic derivation tree. It is easy to see that the root sequent of a cyclic tree T is the same as that of its generated derivation tree \({\textsf {treeOf}}\;T\), and we can prove by induction that the wellformedness of T implies the wellformedness of \({\textsf {treeOf}}\;T\). Together with Theorem 25, this immediately gives the desired soundness theorem for cyclic derivation trees.

Theorem 26

Assume the rule system fulfills Descent and the function \({\textsf {pointsTo}}\) fulfills Good Links. Then every sequent that has a cyclic proof is satisfied by all structures:
$$\begin{aligned} \forall s.\;(\exists T.\;{\textsf {cproof}}\;T\;s) \,\implies \, {\models s} \end{aligned}$$

The above theorem is a slight generalization of the result by Brotherston et al. [16]. It is more general in that it does not require the cyclic proof trees to be regular. If we wanted to model precisely the regular trees, we would need to further restrain \({\textsf {pointsTo}}\); however, such restrictions do not affect soundness.

6 Concrete Instances of Infinite-Proof Soundness

We consider instances of the abstract development establishing soundness of infinite proofs. We discuss in detail our running example, FOL with inductive definitions. Then we briefly mention other possible instances.

6.1 First-Order Logic with Inductive Predicates

Recall FOL\(_{\textsf {ind}}\) from Sect. 2.2 and its associated Gentzen system. The abstract structures (elements of the set \({\textsf {structure}}\)) are not instantiated by mere concrete structures, but rather by pairs \((\mathscr {S},\xi )\) where \(\mathscr {S}= \bigl (\textit{S},\, (F_{f})_{f \,\in \, {\textsf {fsym}}},\, (P_{p})_{p \,\in \, {\textsf {psym}}}\bigr )\) is a structure and \(\xi : {\textsf {var}}\rightarrow S\) is a variable valuation. We instantiate the satisfaction relation, \((\mathscr {S},\xi ) \models \varphi \), with \(\mathscr {S}\models _\xi \varphi \).

To instantiate Theorems 25 and 26, we must define the notion of marker (forming the set \({\textsf {marker}}\)), the ordinal \({{\textsf {ord}}}\), and the functions \({\textsf {mark}}: \{(s,r,s') \in {\textsf {sequent}}\times {\textsf {rule}}\times {\textsf {sequent}}\mid \exists { ss }.\;{\textsf {eff}}\;s\;r\;{ ss }\mathrel \wedge s' \in { ss }\} \rightarrow ({\textsf {marker}}\times {\textsf {bool}}\times {\textsf {marker}})\;\textsf {{set}}\) and \({\textsf {height}}: {\textsf {marker}}\times {\textsf {structure}}\rightarrow {{\textsf {ord}}}\) and verify the Descent property.

We let \({\textsf {marker}}\) be the set of inductive predicate atoms: \(\{p(\overline{t}) \mid p \in {\textsf {ipsym}}\}\). When defining \({\textsf {mark}}\) on \((s,r,s')\), we distinguish three cases:
  • Rule r is not of the form \(\textsc {Subst}_{t,x}\) or \(p_{\mathrm{split}}\). Then, assuming \(s = (\Gamma \mathrel {{\mathrel {\rhd }}}\Delta )\), we set
    $$\begin{aligned} {\textsf {mark}}\;(s,r,s') =\{(M,\textsf {{False}},M) \mid M \in {\textsf {marker}}\mathrel \cap \Gamma \} \end{aligned}$$
  • Rule r has the form \(\textsc {Subst}_{t,x}\). Then s has the form \(\Gamma [t/x] \mathrel {{\mathrel {\rhd }}}\Delta [t/x]\), and we set
    $$\begin{aligned} {\textsf {mark}}\;(s,r,s') =\{(M[t/x],\textsf {{False}},M) \mid M \in {\textsf {marker}}\mathrel \cap \Gamma \} \end{aligned}$$
  • Rule r has the form \(p_{\mathrm{split}}\) for some \(p \in {\textsf {ipsym}}\). Then s has the form \(\Gamma {,}\;p(\overline{x}) \mathrel {{\mathrel {\rhd }}}\Delta \), and \(s'\) has the form \(\Gamma [\overline{t}/\overline{x}],\,{\textsf {prems}}(\chi ') \mathrel {{\mathrel {\rhd }}}\Delta [\overline{t}/\overline{x}]\) for some \(\chi \in {\textsf {ind}}_p\). We set \({\textsf {mark}}\;(s,r,s')\) to contain a “stay” triple \((M,\textsf {{False}},M[\overline{t}/\overline{x}])\) for each \(M \in {\textsf {marker}}\mathrel \cap \Gamma \) and a “decrease” triple \((p(\overline{x}),\textsf {{True}},q(\overline{t}))\) for each inductive atom \(q(\overline{t}) \in {\textsf {marker}}\mathrel \cap {\textsf {prems}}(\chi ')\).
We take \({{\textsf {ord}}}\) to be \({\textsf {nat}}\), the set of natural numbers ordered by the natural order.2 Let \(\mathscr {S}= \bigl (\textit{S},\, (F_{f})_{f \,\in \, {\textsf {fsym}}},\, (P_{p})_{p \,\in \, {\textsf {psym}}}\bigr )\) be a fixed inductive structure. We define the family of interpretations \((P_{p,n} : S^{{\textsf {ar}}\;p} \rightarrow {\textsf {bool}})_{p \in {\textsf {ipsym}},n \in {\textsf {nat}}}\) recursively as follows:
  • \(P_{p,0}\,\overline{a} \,\iff \, \textsf {{False}}\);

  • \(P_{p,n+1}\,\overline{a} \,\iff \, P_{p,n}\,\overline{a} \mathrel {\vee } (\exists \chi \in {\textsf {ind}}_p.\;\exists \xi .\;\overline{a} = \xi \;({\textsf {varsOf}}({\textsf {concl}}(\chi ))) \mathrel \wedge \mathscr {S}_n \models _\xi {\textsf {prems}}(\chi ))\), where \(\mathscr {S}_n = \bigl (\textit{S},\, (F_{f})_{f \,\in \, {\textsf {fsym}}},\, (P_{p,n})_{p \,\in \, {\textsf {psym}}}\bigr )\) and \({\textsf {varsOf}}({\textsf {concl}}(\chi ))\) is the tuple \(\overline{x}\) appearing in the conclusion \(p(\overline{x})\) of \(\chi \).

The above predicates are nothing but the finitary approximations of the predicates \((P_{p} : S^{{\textsf {ar}}\;p} \rightarrow {\textsf {bool}})_{p \in {\textsf {ipsym}}}\). Thanks to the inductiveness of \(\mathscr {S}\), it is well known that the approximations converge to their target:

Lemma 27

For all \(p \in {\textsf {ipsym}}\) and \(\overline{a} \in S^{{\textsf {ar}}\;p}\), we have \(P_p\,\overline{a}\) iff \(\exists n.\;P_{p,n}\,\overline{a}\).

We are now ready to define \({\textsf {height}}\;(M,(\mathscr {S},\xi ))\). We distinguish two cases:
  • Assume \(\mathscr {S}\models _\xi M\), and assume M has the form \(p(\overline{t})\) for \(p \in {\textsf {ipsym}}\). Then \(P_p(\llbracket \overline{t}\rrbracket ^{\mathscr {S}}_{\xi })\). We let \({\textsf {height}}\;(M,(\mathscr {S},\xi ))\) be the smallest n such that \(P_{p,n}(\llbracket \overline{t}\rrbracket ^{\mathscr {S}}_{\xi })\) (which exists by Lemma 27).

  • Assume \(\mathscr {S}\not \models _\xi M\). Then the value of \({\textsf {height}}\;(M,(\mathscr {S},\xi ))\) is irrelevant, and we can put anything here.
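As a small illustration of the least-n reading of \({\textsf {height}}\), the following Haskell sketch computes finitary approximations and the resulting height for a single hypothetical unary inductive predicate (evenness on the naturals, given by an assumed one-step clause unfolding evenStep); it is an illustration only, not the formalized construction and not the clauses of Example 1:
```haskell
-- n-th finitary approximation of an inductive predicate, given a one-step
-- operator that unfolds the clauses once against a previous approximation.
approx :: ((a -> Bool) -> a -> Bool) -> Int -> a -> Bool
approx _    0 _ = False                         -- the 0th approximation is empty
approx step n a = approx step (n - 1) a         -- previous approximation ...
               || step (approx step (n - 1)) a  -- ... extended by one unfolding

-- Height of a true atom: the least n at which it enters the approximation
-- (Lemma 27 guarantees such an n exists when the atom holds inductively).
heightOf :: ((a -> Bool) -> a -> Bool) -> a -> Int
heightOf step a = head [n | n <- [0 ..], approx step n a]

-- Hypothetical clauses: even(0), and even(n) whenever even(n - 2).
evenStep :: (Integer -> Bool) -> Integer -> Bool
evenStep prev n = n == 0 || (n >= 2 && prev (n - 2))

-- heightOf evenStep 6 == 4: the atom even(6) first appears at stage 4.
```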

Finally, we verify the Descent condition. Let \((\mathscr {S},\xi )\) and s be such that \(\mathscr {S}\not \models _\xi s\), and let r and \({ ss }\) be such that \({\textsf {eff}}\;s\;r\;{ ss }\). We need to provide \(s' \in { ss }\) and \((\mathscr {S}',\xi ')\) such that \(\mathscr {S}' \not \models _{\xi '} s'\) and \({{\textsf {descent}}}\;(s,(\mathscr {S},\xi ))\;r\;(s',(\mathscr {S}',\xi '))\) holds. We will actually take \(\mathscr {S}'\) to be \(\mathscr {S}\), so we only need to provide \(s'\) and \(\xi '\).
We distinguish several cases, following the definition of \({\textsf {mark}}\):
  • Rule r is not of the form \(\textsc {Subst}_{t,x}\) or \(p_{\mathrm{split}}\). We further distinguish some subcases:

  • r is Ax. This is impossible, since Ax is sound for all (inductive) structures, which contradicts the hypothesis \(\mathscr {S}\not \models _\xi s\).

  • r is a single-premise rule involving no freshness side condition, i.e., is one of NegL, NegR, ConjL, AllL, and \(p_\chi \). Then we take \(s'\) to be the single premise and \(\xi ' = \xi \).

  • r is ConjR. Then we take \(\xi ' = \xi \) and \(s'\) to be one of the two premises, say, \(s_i\), such that \(\mathscr {S}\not \models _\xi s_i\) (which is known to exist by the soundness of ConjR).

  • r is AllR. Then \(s = (\Gamma \mathrel {{\mathrel {\rhd }}}\Delta {,}\;{\textsf {All}}\;x\;\varphi )\), and we take \(s'\) to be the only premise of r, namely, \(s' = \Gamma \mathrel {{\mathrel {\rhd }}}\Delta {,}\;\varphi [y / x]\), where y is known to be fresh. Since \(\mathscr {S}\not \models _\xi {\textsf {All}}\;x\;\varphi \), we obtain \(a \in S\) such that \(\mathscr {S}\not \models _{\xi [x \leftarrow a]} \varphi \); by the freshness of y, this also implies \(\mathscr {S}\not \models _{\xi [y \leftarrow a]_{\phantom {]}}} \varphi [y/x]\). We let \(\xi ' = \xi [y \leftarrow a]\).

  • Rule r has the form \(\textsc {Subst}_{t,x}\). Then s has the form \(\Gamma [t/x] \mathrel {{\mathrel {\rhd }}}\Delta [t/x]\). We set \(\xi ' = \xi [x \leftarrow \llbracket t\rrbracket ^{\mathscr {S}}_{\xi }]\) and \(s'\) to the only premise of r.

  • Rule r has the form \(p_{\mathrm{split}}\) for some \(p \in {\textsf {ipsym}}\). Then s has the form \(\Gamma {,}\;p(\overline{x}) \mathrel {{\mathrel {\rhd }}}\Delta \). Since \(\mathscr {S}\not \models _\xi s\) and \(p(\overline{x})\) occurs in the left context, we have \(\mathscr {S}\models _\xi p(\overline{x})\), hence \(P_p\,(\xi \,\overline{x})\); let m be the least number such that \(P_{p,m}\,(\xi \,\overline{x})\). By the definition of \(P_{p,\_}\), \(m > 0\) and there exist \(\chi \in {\textsf {ind}}_p\) and a valuation \(\xi ''\) such that \(\xi \,\overline{x} = \xi ''\,\overline{x}\) and \(\mathscr {S}_{m-1} \models _{\xi ''} {\textsf {prems}}(\chi ')\).3 We take \(s'\) to be the premise of r corresponding to \(\chi \), namely \(s' = (\Gamma [\overline{t}/\overline{x}],\,{\textsf {prems}}(\chi ') \mathrel {{\mathrel {\rhd }}}\Delta [\overline{t}/\overline{x}])\). Finally, we define \(\xi '\) as follows:
    $$\begin{aligned} \xi '\;z = \left\{ \begin{array}{ll} \xi \;z &{} \text{ if } z \text{ appears free in } \Gamma \,\cup \, \Delta \\ \xi ''\;z &{} \text{ if } z \text{ appears free in } {\textsf {prems}}(\chi ')\\ \text{ anything } &{} \text{ otherwise } \end{array} \right. \end{aligned}$$
For all the cases, it is routine to check (when necessary with the help of Lemma 20) that \(\mathscr {S}' \not \models _{\xi '} s'\) and \({{\textsf {descent}}}\;(s,(\mathscr {S},\xi ))\;r\;(s',(\mathscr {S}',\xi '))\). When checking the latter for \(r = p_{\mathrm{split}}\), we rely on the following observations:
  • For all \(M \in {\textsf {marker}}\mathrel \cap \Gamma \), we have \(\llbracket M[\overline{t}/\overline{x}]\rrbracket ^{\mathscr {S}}_{\xi '} = \llbracket M\rrbracket ^{\mathscr {S}}_{\xi }\).

  • For all \(q\,(\overline{t}) \in {\textsf {marker}}\mathrel \cap {\textsf {prems}}(\chi ')\), we have \(P_{q,m-1}(\llbracket \overline{t}\rrbracket ^{\mathscr {S}}_{\xi '})\); hence the smallest number n such that \(P_{q,n}(\llbracket \overline{t}\rrbracket ^{\mathscr {S}}_{\xi '})\) is at most \(m-1\), which is strictly smaller than \({\textsf {height}}\;(p(\overline{x}),(\mathscr {S},\xi )) = m\).

Corollary 28

Assume that, in the Gentzen system of FOL\(_{\textsf {ind}}\), a sequent \(\Gamma \mathrel {{\mathrel {\rhd }}}\Delta \) either has an iproof or has a cyclic proof and \({\textsf {pointsTo}}\) fulfills Good Links. Then \(\Gamma \mathrel {{\mathrel {\rhd }}}\Delta \) is satisfied by all pairs \((\mathscr {S},\xi )\), i.e., \(\mathscr {S}\models _\xi \Gamma \mathrel {{\mathrel {\rhd }}}\Delta \). Hence, \(\mathscr {S}\models \Gamma \mathrel {{\mathrel {\rhd }}}\Delta \).

Example 29

The infinite tree T of Fig. 5 is an iproof, i.e., is well formed and satisfies the goodness predicate. To see the latter, observe that the only infinite path in this tree is \({\sigma }= ((s_1,r_1) \cdot (s_2,r_2) \cdot (s_3,r_3))^\omega \), where
  • \(s_1 = ({{\textsf {even}}}(x) \mathrel {{\mathrel {\rhd }}}{{\textsf {odd}}}({{\textsf {Suc}}}(x)))\), \(r_1 = \textsc {Even}_{\mathrm{split}}\);

  • \(s_2 = ({{\textsf {even}}}(y) \mathrel {{\mathrel {\rhd }}}{{\textsf {odd}}}({{\textsf {Suc}}}({{\textsf {Suc}}}({{\textsf {Suc}}}(y)))))\), \(r_2 = \textsc {Odd}{}_{{\textsf {Suc}}}\);

  • \(s_3 = ({{\textsf {even}}}(y) \mathrel {{\mathrel {\rhd }}}{{\textsf {odd}}}({{\textsf {Suc}}}(y)))\), \(r_3 = \textsc {Subst}_{y,x}\).

If we let \({ bs }= (\textsf {{True}}\cdot \textsf {{False}}\cdot \textsf {{False}})^\omega \) and \({ Ms }= ({{\textsf {even}}}(x) \cdot {{\textsf {even}}}(y) \cdot {{\textsf {even}}}(y))^\omega \), we have that \({\textsf {follow}}\;{ bs }\;{ Ms }\;{\sigma }\). Indeed, \({ bs }\) and \({ Ms }\) “follow” \({\sigma }\) from the start, since \(({{\textsf {even}}}(x),\textsf {{True}},{{\textsf {even}}}(y)) \in {\textsf {mark}}\;(s_1,r_1,s_2)\), \(({{\textsf {even}}}(y),\textsf {{False}},{{\textsf {even}}}(y)) \in {\textsf {mark}}\;(s_2,r_2,s_3)\), and \(({{\textsf {even}}}(y),\textsf {{False}},{{\textsf {even}}}(x)) \in {\textsf {mark}}\;(s_3,r_3,s_1)\). Moreover, for the cyclic tree \(T'\) in Fig. 4, taking \({\textsf {pointsTo}}\;L\) to be \(T'\), we obtain that \(T'\) is a well-formed cyclic proof and \({\textsf {treeOf}}\;T' = T\).

For accepting a cyclic or infinite proof in FOL\(_{\textsf {ind}}\) as a good (and hence valid) proof, any infinite path should infinitely often apply the split rule to the (persistent) instance of the same inductive predicate p.4 This can be seen from the fact that a triple \((\_,\textsf {{True}},\_)\), meaning “decrease,” only appears in the definition of \({\textsf {mark}}\) for the split rule. In the above example, the predicate p that ensures goodness along the tree’s unique infinite path is \({{\textsf {even}}}\).

6.2 Other Instances

Variations of Gentzen systems for FOL, as discussed in Sect. 4.2, can in principle be extended to FOL\(_{\textsf {ind}}\), and the abstract infinite-proof soundness result would apply. Other instances include extensions of modal logic with inductive definitions. In such logics, the structures can be viewed as tuples \(\mathscr {S}= \bigl ((\textit{S}_w)_{w \in W},\, (F_{f,w})_{f \,\in \, {\textsf {fsym}}, w \in W},\, (P_{p,w})_{p \,\in \, {\textsf {psym}}, w \in W}\bigr )\), where W is a set of “worlds,” perhaps endowed with additional structure (algebraic, order-theoretic, etc.). The inductiveness condition for predicates can be stated similarly to that of FOL\(_{\textsf {ind}}\), but can also spread across different worlds. (The split rule can be adapted accordingly.)

Separation logic can be regarded as a variation of the above, where the structure carrier \(\textit{S}_w\) is fixed to some \(\textit{S}\) and the worlds are heaps, i.e., partial functions from a fixed set of locations to \(\textit{S}\). (In separation logic terminology, the worlds are the heaps, whereas the valuations \(\xi \) are the stacks.) Two such instances are described by Brotherston et al. [16], one for entailment [15] and one for termination proofs [14].

7 Formalization and Implementation

The definitions, lemmas, and theorems presented in Sects. 3 and 5, pertaining to the abstract soundness and completeness results, have been formalized in the proof assistant Isabelle/HOL. The instantiation step of Sect. 4.1 is formalized for a richer version of FOL, with sorts and interpreted equality, as required by our motivating application (efficient encodings of sorts in unsorted FOL [7]). The formal development is publicly available [8, 10].

The necessary codatatypes and corecursive definitions are realized using a recently introduced definitional package for (co)datatypes [6] with support for mutual and nested (co)recursion [50] and mixed recursive–corecursive function definitions [11]. The \({\textsf {tree}}\) codatatype illustrates the support for corecursion through permutative data structures (with non-free constructors) such as finite sets, a feature that is not available in any other proof assistant. The formalization is close to this article’s presentation, with a few differences originating from Isabelle’s lack of support for dependent types.

For generating code, we make the additional assumption that the effect relation is deterministic, and hence corresponds to a partial function \({\textsf {eff}}' : {\textsf {rule}}\rightarrow {\textsf {sequent}}\rightarrow ({\textsf {sequent}}\;\textsf {{fset}})\;{\textsf {option}}\), where the Isabelle datatype \(\alpha \;{\textsf {option}}\) enriches a copy of \(\alpha \) with a special value \({\textsf {None}}\).5 From this function, we build the relational \({\textsf {eff}}\) as the partial function’s graph. Isabelle’s code generator [25] can then produce Haskell code for the computable part of our completeness proof—the abstract prover \({\textsf {mkTree}}\), defined corecursively in the proof of Theorem 15.
Finite sets are represented as lists. The functions \({\textsf {isJust}} : \alpha \;{\textsf {option}}\rightarrow {\textsf {bool}}\) and \({\textsf {fromJust}} : \alpha \;{\textsf {option}}\rightarrow \alpha \) are the Haskell-style discriminator and selector for \({\textsf {option}}\). Since the Isabelle formalization is parametric over rule systems \(({\textsf {sequent}},{\textsf {rule}},{\textsf {eff}})\), the code for \({\textsf {mkTree}}\) explicitly takes \({\textsf {eff}}\) as a parameter.
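The generated program is, in essence, a fair, lazily evaluated proof search. The following hand-written Haskell sketch conveys its shape; it is not the code produced by Isabelle, and all names and types are illustrative assumptions:
```haskell
-- A corecursive abstract prover in the spirit of mkTree: scan the fair rule
-- enumeration for the first rule enabled on the current sequent, apply it,
-- and lazily expand all resulting sequents with the remaining enumeration.
data Tree seq rule = Node (seq, rule) [Tree seq rule]

mkTree :: (rule -> seq -> Maybe [seq])  -- deterministic effect function
       -> [rule]                        -- fair rule enumeration (infinite)
       -> seq                           -- sequent to prove
       -> Tree seq rule
mkTree eff rho s = Node (s, r) (map (mkTree eff rho') ss)
  where
    (r, ss, rho') = pick rho
    pick (r0 : rest) = case eff r0 s of
      Just ss0 -> (r0, ss0, rest)
      Nothing  -> pick rest
    pick []          = error "assumed: an infinite, fair rule enumeration"
```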

Although Isabelle’s code generator was not designed with codatatypes in mind, it is general enough to handle them. Internally, it reduces Isabelle specifications to higher-order rewrite systems [35] and generates functional code in Haskell, OCaml, Scala, or Standard ML. Partial correctness is guaranteed regardless of the target language’s evaluation strategy. However, for the guarantee to be non-vacuous for corecursive definitions, one needs a language with a lazy evaluation strategy, such as Haskell.

The verified contract of the program reads as follows: Given an available and persistent rule system \(({\textsf {sequent}},{\textsf {rule}},{\textsf {eff}})\), a fair rule enumeration \(\rho \), and a sequent s representing the formula to prove, \({\textsf {mkTree}}\;{\textsf {eff}}\;\rho \;s\) either yields a finite derivation tree of s or produces an infinite fair derivation tree whose infinite paths are all countermodel paths. These guarantees involve only partial correctness of ground term evaluation.

The generated code is a generic countermodel-producing semidecision procedure parameterized by the proof system. Moreover, the fair rule enumeration parameter \(\rho \) can be instantiated to various choices that may perform better than the simple scheme described in Sect. 3. However, more research is needed to understand how our framework or a refinement of it can accommodate state-of-the-art proof search procedures, such as conflict-driven clause learning [5]. For example, one would rather want to specify \(\rho \) lazily (using unification) during, and not before, the evaluation of \({\textsf {mkTree}}\).

8 Related Work

This article joins a series of pearls aimed at reclaiming mathematical concepts and results for coinductive methods, including streams [43, 47], regular expressions [44, 46], and automata [45]. Some developments pass the ultimate test of formalization, usually in Agda and Coq, the codatatype-aware proof assistants par excellence: the sieve of Eratosthenes [3], real number basics [18], and temporal logic for red–blue trees [36].

So why write yet another formalized manifesto for coinduction and corecursion? First, because we finally could—with the new codatatype package, Isabelle has caught up with its rivals in this area, and has even superseded them in some respects; for example, the mixture of recursion and corecursion allowed in function definitions is unique to Isabelle. Second, because although codatatypes are a good match for the completeness and the infinite-proof soundness theorems (as we hope to have shown), there seems to be no proof in the literature that takes advantage of this.

There are many accounts of the completeness theorem for FOL and related logics, including Petria’s very abstract account [40] within institution-independent model theory [19]. However, most of these accounts favor the more mathematical Henkin style, which obfuscates the rich structure of proof and failure. This preference has a long history. It is positively motivated by the ability to support uncountable languages. More crucially, it is negatively motivated by the lack of rigor perceived in the alternative: “geometric” reasoning about infinite trees. Negri [37] gives a revealing account in the context of modal logic, quoting reviews that were favorable to Kripke’s completeness result [32] but critical of his informal argument based on infinite tableau trees.6 Kaplan [30] remarks that “although the author extracts a great deal of information from his tableau constructions, a completely rigorous development along these lines would be extremely tedious.”

A few textbooks venture in a proof-theoretic presentation of completeness, notably Gallier’s [22]. Such a treatment highlights not only the structure, but also the algorithmic content of the proofs. The price is usually a lack of rigor, in particular a gap between the definition of derivation trees and its use in the completeness argument. This lack of rigor should not be taken lightly, as it may lead to serious ambiguities or errors: In the context of a tableau completeness proof development, Hähnle [26] first performs an implicit transition from finite to possibly infinite tableaux, and then claims that tableau chain suprema exist by wrongly invoking Zorn’s lemma [26, Definition 3.16].7

Orthogonally, we wanted to isolate and reuse the abstract core of the argument involving potentially infinite derivation trees and countermodel paths. Except for syntactic details, the different accounts are after the same goal, and they reach it in a variety of more or less colorful, if not noisy, ways; most of them do acknowledge that their approach is similar to previous ones, but cannot refer to a given abstract result that addresses this goal. Consequently, they have to repeat a variation of the same argument. For example, Gallier’s monograph [22] repeats the argument four times, for logics of increasing complexity: propositional logic, FOL with no function symbols or equality, FOL with function symbols but no equality, and finally full FOL; Bell and Machover [1] employ a different fair tree generation strategy, to the same effect; for a world-instrumented system for modal logic, Negri [37] employs yet another strategy.

Regrettably, one of the completeness results not covered by our abstract completeness is connected to our other case study: logics admitting infinite proofs. Brotherston and Simpson [17] show that completeness for FOL\(_{\textsf {ind}}\) (with equality) can be achieved by allowing infinite proofs satisfying the Goodness condition from Sect. 5.1. This nonstandard completeness result is not captured by our framework from Sect. 3, which requires finite proofs.

Unlike the infinite-proof soundness theorem (which represents a newer line of research), the completeness theorem has been mechanized before in proof assistants. Schlöder and Koepke, in Mizar [49], formalize a Henkin-style argument for possibly uncountable languages. Building on an early insight by Krivine [33] concerning the expressibility of the completeness proof in intuitionistic second-order logic, Ilik [28] analyzes Henkin-style arguments for classical and intuitionistic logic with respect to standard and Kripke models and formalizes them in Coq (without employing codatatypes).

At least four proofs were developed using HOL-based systems. Harrison [27], in HOL Light, and Berghofer [2], in Isabelle, opted for Henkin-style arguments. Berghofer’s work was recently extended by Schlichtkrull [48] to prove the completeness of first-order resolution. Ridge and Margetson [34, 42], also in Isabelle, employ proof trees constructed as graphs of nodes that carry their levels as natural numbers. This last work has the merits of analyzing the computational content of proofs in the style of Gallier [22] and discussing an OCaml implementation. Our formalization relates to this work in a similar way to which our presentation relates to Gallier’s: The newly introduced support for codatatypes and corecursion in Isabelle provides suitable abstraction mechanisms for reasoning about infinite trees, avoiding boilerplate for tree manipulation based on numeric indexing. Moreover, codatatypes are mapped naturally to Haskell types, allowing Isabelle’s code generator to produce certified Haskell code. Finally, our proof is abstract and applies to a wide range of FOL variants and beyond.

9 Conclusion

The completeness theorem is a fundamental result about classical logic. Its proof is presented in many variants in the literature. Few of these presentations emphasize the algorithmic content, and none of them uses codatatypes. Despite the variety of approaches proposed in textbooks and formalizations, we found them lacking in rigor or readability. Gallier’s pseudo-Pascal code is inspiring, but we prefer “pseudo-Haskell,” i.e., Isabelle/HOL with codatatypes, to combine computational intuition and full mathematical rigor.

In our view, coinduction is the key to formulate an account that is both mathematically rigorous and abundant in algorithmic content. This applies to both of our case studies: classic completeness and infinite-proof soundness. The definition of the abstract prover \({\textsf {mkTree}}\) is stated rigorously, is accessible to functional programmers, and replaces pages of verbose descriptions.

The advantages of machine-checked metatheory are well known from programming language research, where new results are often formalized and proof assistants are used in the classroom. This article reported on some steps we have taken to apply the same methods to formal logic and automated reasoning.

Footnotes

  1. Given formulas \(\psi _1,\ldots ,\psi _k\), we let \({\textsf {Conj}}\;\psi _1\;\ldots \;\psi _k\) denote \({\textsf {Conj}}\;\psi _1\;({\textsf {Conj}}\;\psi _2\;(\ldots \psi _k)\ldots )\). In particular, when \(k = 0\) it denotes the “true” formula \(\top \), defined in a standard way, e.g., as \({\textsf {Imp}}\;a\;a\) for some atom a.

  2. This is acceptable here, since we employ finitary Horn clauses and the language is countable. Different assumptions may require larger ordinals.

  3. The definition of \(P_{p,\_}\) works with the original clauses \(\chi \in {\textsf {ind}}_p\), whereas here we apply it to the “copies” \(\chi '\) of \(\chi \) guaranteed to have their variables fresh for \(\Gamma \) and \(\Delta \), as stipulated in the \(p_{\mathrm{split}}\) rule. This is unproblematic, since it is easy to verify that the definition of \(P_{p,\_}\) is invariant under bijective renaming of variables in the clauses \(\chi \).

  4. Goodness is decidable for cyclic trees in logics where rule application is decidable, such as FOL\(_{\textsf {ind}}\) [16].

  5. In the proof system from Example 2, \({\textsf {eff}}\) is not deterministic due to the rule AllR. It can be made deterministic by refining the rule with a systematic choice of the fresh variable y.

  6. And Kripke’s degree of rigor in this early article is not far from today’s state of the art in proof theory; see, e.g., Troelstra and Schwichtenberg [51].

  7. This is the only error we found in this otherwise excellent chapter on tableaux.


Acknowledgments

Tobias Nipkow made this work possible. Mark Summerfield and the anonymous reviewers suggested many textual improvements to earlier versions of this article. The reviewers read the submitted paper carefully and made useful and insightful comments and suggestions. Blanchette was partially supported by the Deutsche Forschungsgemeinschaft (DFG) project Hardening the Hammer (Grant NI 491/14-1). Popescu was partially supported by the EPSRC project Verification of Web-based Systems (VOWS, Grant EP/N019547/1) and by the DFG project Security Type Systems and Deduction (Grant NI 491/13-3). Traytel was supported by the DFG program Program and Model Analysis (PUMA, Doctorate Program 1480). The authors are listed alphabetically.

References

  1. Bell, J.L., Machover, M.: A Course in Mathematical Logic. North-Holland, Amsterdam (1977)
  2. Berghofer, S.: First-order logic according to Fitting. In: Klein, G., Nipkow, T., Paulson, L. (eds.) Archive of Formal Proofs. http://www.isa-afp.org/entries/FOL-Fitting.shtml (2007)
  3. Bertot, Y.: Filters on coinductive streams, an application to Eratosthenes’ sieve. In: Urzyczyn, P. (ed.) TLCA 2005, LNCS, vol. 3461, pp. 102–115. Springer (2005)
  4. Blanchette, J.C., Böhme, S., Popescu, A., Smallbone, N.: Encoding monomorphic and polymorphic types. In: Piterman, N., Smolka, S. (eds.) TACAS 2013, LNCS, vol. 7795, pp. 493–507. Springer (2013)
  5. Blanchette, J.C., Fleury, M., Weidenbach, C.: A verified SAT solver framework with learn, forget, restart, and incrementality. In: Olivetti, N., Tiwari, A. (eds.) IJCAR 2016, LNCS, vol. 9706. Springer (2016)
  6. Blanchette, J.C., Hölzl, J., Lochbihler, A., Panny, L., Popescu, A., Traytel, D.: Truly modular (co)datatypes for Isabelle/HOL. In: Klein, G., Gamboa, R. (eds.) ITP 2014, LNCS, vol. 8558, pp. 93–110. Springer (2014)
  7. Blanchette, J.C., Popescu, A.: Mechanizing the metatheory of Sledgehammer. In: Fontaine, P., Ringeissen, C., Schmidt, R.A. (eds.) FroCoS 2013, LNCS, vol. 8152, pp. 245–260. Springer (2013)
  8. Blanchette, J.C., Popescu, A., Traytel, D.: Abstract completeness. In: Klein, G., Nipkow, T., Paulson, L. (eds.) Archive of Formal Proofs. http://www.isa-afp.org/entries/Abstract_Completeness.shtml (2014)
  9. Blanchette, J.C., Popescu, A., Traytel, D.: Unified classical logic completeness—a coinductive pearl. In: Demri, S., Kapur, D., Weidenbach, C. (eds.) IJCAR 2014, LNCS, vol. 8562, pp. 46–60. Springer (2014)
  10. Blanchette, J.C., Popescu, A., Traytel, D.: Formal development associated with this paper. http://people.inf.ethz.ch/trayteld/compl-journal-devel.tgz (2015)
  11. Blanchette, J.C., Popescu, A., Traytel, D.: Foundational extensible corecursion: a proof assistant perspective. In: Fisher, K., Reppy, J.H. (eds.) ICFP 2015, pp. 192–204. ACM (2015)
  12. Brotherston, J.: Cyclic proofs for first-order logic with inductive definitions. In: Beckert, B. (ed.) TABLEAUX 2005, LNCS, vol. 3702, pp. 78–92. Springer (2005)
  13. Brotherston, J.: Sequent calculus proof systems for inductive definitions. Ph.D. thesis, University of Edinburgh (2006)
  14. Brotherston, J., Bornat, R., Calcagno, C.: Cyclic proofs of program termination in separation logic. In: Necula, G.C., Wadler, P. (eds.) POPL 2008, pp. 101–112. ACM (2008)
  15. Brotherston, J., Distefano, D., Petersen, R.L.: Automated cyclic entailment proofs in separation logic. In: Bjørner, N., Sofronie-Stokkermans, V. (eds.) CADE-23, LNCS, vol. 6803, pp. 131–146. Springer (2011)
  16. Brotherston, J., Gorogiannis, N., Petersen, R.L.: A generic cyclic theorem prover. In: Jhala, R., Igarashi, A. (eds.) APLAS 2012, LNCS, vol. 7705, pp. 350–367. Springer (2012)
  17. Brotherston, J., Simpson, A.: Complete sequent calculi for induction and infinite descent. In: LICS 2007, pp. 51–62. IEEE Computer Society (2007)
  18. Ciaffaglione, A., Gianantonio, P.D.: A certified, corecursive implementation of exact real numbers. Theor. Comput. Sci. 351(1), 39–51 (2006)
  19. Diaconescu, R.: Institution-Independent Model Theory. Studies in Universal Logic. Birkhäuser, Basel (2008)
  20. Fitting, M.: First-Order Logic and Automated Theorem Proving. Graduate Texts in Computer Science, 2nd edn. Springer, Berlin (1996)
  21. Francez, N.: Fairness. Texts and Monographs in Computer Science. Springer, Berlin (1986)
  22. Gallier, J.H.: Logic for Computer Science: Foundations of Automatic Theorem Proving. Computer Science and Technology. Harper & Row, New York (1986)
  23. Gödel, K.: Über die Vollständigkeit des Logikkalküls. Ph.D. thesis, Universität Wien (1929)
  24. Gordon, M.J.C., Melham, T.F. (eds.): Introduction to HOL: A Theorem Proving Environment for Higher Order Logic. Cambridge University Press, Cambridge (1993)
  25. Haftmann, F., Nipkow, T.: Code generation via higher-order rewrite systems. In: Blume, M., Kobayashi, N., Vidal, G. (eds.) FLOPS 2010, LNCS, vol. 6009, pp. 103–117. Springer (2010)
  26. Hähnle, R.: Tableaux and related methods. In: Robinson, A., Voronkov, A. (eds.) Handbook of Automated Reasoning, vol. I, pp. 100–178. Elsevier, Amsterdam (2001)
  27. Harrison, J.: Formalizing basic first order model theory. In: Grundy, J., Newey, M.C. (eds.) TPHOLs ’98, LNCS, vol. 1479, pp. 153–170. Springer (1998)
  28. Ilik, D.: Constructive completeness proofs and delimited control. Ph.D. thesis, École polytechnique (2010)
  29. Jacobs, B., Rutten, J.: A tutorial on (co)algebras and (co)induction. Bull. Eur. Assoc. Theor. Comput. Sci. 62, 222–259 (1997)
  30. Kaplan, D.: Review of Kripke (1959) [32]. J. Symb. Log. 31, 120–122 (1966)
  31. Kleene, S.C.: Mathematical Logic. Wiley, London (1967)
  32. Kripke, S.: A completeness theorem in modal logic. J. Symb. Log. 24(1), 1–14 (1959)
  33. Krivine, J.L.: Une preuve formelle et intuitionniste du théorème de complétude de la logique classique. Bull. Symb. Log. 2(4), 405–421 (1996)
  34. Margetson, J., Ridge, T.: Completeness theorem. In: Klein, G., Nipkow, T., Paulson, L. (eds.) Archive of Formal Proofs. http://www.isa-afp.org/entries/Completeness.shtml (2004)
  35. Mayr, R., Nipkow, T.: Higher-order rewrite systems and their confluence. Theor. Comput. Sci. 192(1), 3–29 (1998)
  36. Nakata, K., Uustalu, T., Bezem, M.: A proof pearl with the fan theorem and bar induction: walking through infinite trees with mixed induction and coinduction. In: Yang, H. (ed.) APLAS 2011, LNCS, vol. 7078, pp. 353–368. Springer (2011)
  37. Negri, S.: Kripke completeness revisited. In: Primiero, G., Rahman, S. (eds.) Acts of Knowledge: History, Philosophy and Logic: Essays Dedicated to Göran Sundholm, pp. 247–282. College Publications, London (2009)
  38. Nipkow, T., Klein, G.: Concrete Semantics: With Isabelle/HOL. Springer, Berlin (2014)
  39. Nipkow, T., Paulson, L.C., Wenzel, M.: Isabelle/HOL: A Proof Assistant for Higher-Order Logic, LNCS, vol. 2283. Springer (2002)
  40. Petria, M.: An institutional version of Gödel’s completeness theorem. In: CALCO 2007, pp. 409–424 (2007)
  41. Pfenning, F.: Review of “Jean H. Gallier: Logic for Computer Science, Harper & Row, New York 1986” [22]. J. Symb. Log. 54(1), 288–289 (1989)
  42. Ridge, T., Margetson, J.: A mechanically verified, sound and complete theorem prover for first order logic. In: Hurd, J., Melham, T.F. (eds.) TPHOLs 2005, LNCS, vol. 3603, pp. 294–309. Springer (2005)
  43. Roşu, G.: Equality of streams is a \(\Pi _2^0\)-complete problem. In: Reppy, J.H., Lawall, J.L. (eds.) ICFP ’06. ACM (2006)
  44. Roşu, G.: An effective algorithm for the membership problem for extended regular expressions. In: Seidl, H. (ed.) FoSSaCS 2007, LNCS, vol. 4423, pp. 332–345. Springer (2007)
  45. Rutten, J.J.M.M.: Automata and coinduction (an exercise in coalgebra). In: Sangiorgi, D., de Simone, R. (eds.) CONCUR ’98, LNCS, vol. 1466, pp. 194–218. Springer (1998)
  46. Rutten, J.J.M.M.: Regular expressions revisited: a coinductive approach to streams, automata, and power series. In: Backhouse, R.C., Oliveira, J.N. (eds.) MPC 2000, LNCS, vol. 1837, pp. 100–101. Springer (2000)
  47. Rutten, J.J.M.M.: Elements of stream calculus (an extensive exercise in coinduction). Electron. Notes Theor. Comput. Sci. 45, 358–423 (2001)
  48. Schlichtkrull, A.: Formalization of the resolution calculus for first-order logic. In: Blanchette, J.C., Merz, S. (eds.) ITP 2016, LNCS, vol. 9807. Springer (2016)
  49. Schlöder, J.J., Koepke, P.: The Gödel completeness theorem for uncountable languages. Formaliz. Math. 20(3), 199–203 (2012)
  50. Traytel, D., Popescu, A., Blanchette, J.C.: Foundational, compositional (co)datatypes for higher-order logic: category theory applied to theorem proving. In: LICS 2012, pp. 596–605. IEEE Computer Society (2012)
  51. Troelstra, A.S., Schwichtenberg, H.: Basic Proof Theory, 2nd edn. Cambridge University Press, Cambridge (2000)

Copyright information

© Springer Science+Business Media Dordrecht 2016

Authors and Affiliations

  1. Inria Nancy and LORIA, Villers-lès-Nancy, France
  2. Max-Planck-Institut für Informatik, Saarbrücken, Germany
  3. Department of Computer Science, School of Science and Technology, Middlesex University, London, UK
  4. Institute of Information Security, Department of Computer Science, ETH Zurich, Zurich, Switzerland
  5. Institute of Mathematics Simion Stoilow of the Romanian Academy, Bucharest, Romania
