Deep Inference
This page tries to be a comprehensive account of the ongoing research on deep inference. Deep inference started as a personal project, but it is now growing fast and I am struggling to keep pace with all the developments; some of them I do not understand in detail. Please keep in mind that what I write below may be subjective and may not reflect the opinions of the people involved in this research.
If you disagree or find inaccuracies, please let me know! If you are an expert in proof theory and want to quickly understand what deep inference is about, go to Deep Inference in One Minute.
News
2016–2019: Three-year ANR-FWF project The Fine Structure of Formal Proof Systems and their Computational Interpretations (FISP) at Univ. Paris Diderot/PPS, INRIA Saclay/École Polytechnique, University of Innsbruck, and Vienna University of Technology.
[There is a version of this introduction with bibliographic references and examples in the paper Deep Inference.]
Deep inference could succinctly be described as an extreme form of linear logic. It is a methodology for designing proof formalisms that generalise Gentzen formalisms, i.e., the sequent calculus and natural deduction. In a sense, deep inference is obtained by applying some of the main concepts behind linear logic to the formalisms, i.e., to the rules by which proof systems are designed. By doing so, we obtain a better proof theory than the traditional one due to Gentzen. In fact, in deep inference we can provide proof systems for more logics, in a more regular and modular way, with smaller proofs, less syntax, less bureaucracy and we have a chance to make substantial progress towards a solution to the century-old problem of the identity of proofs. The first manuscript on deep inference appeared in 1999 and the first refereed papers in 2001. So far, two formalisms have been designed and developed in deep inference: the calculus of structures and open deduction. A third one, nested sequents, introduces deep inference features into a more traditional Gentzen formalism.
Essentially, deep inference tries to understand proof composition and proof normalisation (in a very liberal sense including cut elimination) in the most logic-agnostic way. Thanks to it we obtain a deeper understanding of the nature of normalisation. It seems that normalisation is a primitive, simple phenomenon that manifests itself in more or less complicated ways that depend more on the choice of representation for proofs than on their true mathematical nature. By dropping syntactic constraints, as we do in deep inference compared to Gentzen, we get closer to the semantic nature of proof and proof normalisation.
The early inspiration for deep inference came from linear logic. Linear logic, among other ideas, supports the notion that logic has a geometric nature, and that a more perspicuous analysis of proofs is possible if we uncover their geometric shape, hidden behind their syntax. We can give technical meaning to this notion by looking for linearity in proofs. In the computing world, linearity can be interpreted as a way to deal with quantity or resource. The significance of linear logic for computer science has stimulated a remarkable amount of research that continues to this day and ranges from the most theoretical investigations in categorical semantics to the implementation of languages and compilers and the verification of software.
Linear logic expresses linearity by relying on Gentzen's formalisms. However, these had been developed for classical mathematical logic, for which linearity is not a primitive, natural notion. While attempting to relate process algebras (which are foundational models of concurrent computation) to linear logic, I realised that Gentzen's formalisms were inherently inadequate to express the most primitive notion of composition in computer science: sequential composition. Sequential composition is indeed linear, but its linearity is of a different kind from that naturally supported by linear logic.
I realised then that the linear logic ideas were to be carried all the way through and that the formalisms themselves had to be 'linearised'. Technically, this turned out to be possible by dropping one of the assumptions that Gentzen implicitly used, namely that the (geometric) shape of proofs is directly related to the shape of the formulae that they prove. In deep inference, we do not make this assumption, and we get proofs whose shape is much more liberally determined than in Gentzen's formalisms. As an immediate consequence, we were able to capture the sequential composition of process algebras, but we soon realised that the new formalism offered unprecedented opportunities both for a more satisfying general theory of proofs and for more applications in computer science.
1.1 Proof Systems
The difference between Gentzen formalisms and deep inference ones is that in deep inference we compose proofs by the same connectives as formulae: if $\mathrm{\Phi}$ and $\mathrm{\Psi}$ are two proofs with, respectively, premisses $A$ and $C$ and conclusions $B$ and $D$, then $\mathrm{\Phi}\wedge\mathrm{\Psi}$ and $\mathrm{\Phi}\vee\mathrm{\Psi}$ are valid proofs with, respectively, premisses $A\wedge C$ and $A\vee C$, and conclusions $B\wedge D$ and $B\vee D$. Significantly, while $\mathrm{\Phi}\wedge \mathrm{\Psi}$ can be represented in Gentzen, $\mathrm{\Phi}\vee \mathrm{\Psi}$ cannot. That is basically the definition of deep inference, and it holds for every language, not just propositional classical logic.
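The "deep" applicability of rules can be made concrete with a small sketch. The following Python fragment is entirely illustrative (the term encoding and the function name are my own, not part of any deep inference implementation): it applies the switch rule $A\wedge(B\vee C) \to (A\wedge B)\vee C$ at the first position found anywhere inside a formula, not just at its main connective.

```python
# A minimal sketch of deep rule application on formula trees.
# Formulas: atoms are strings; compound formulas are 3-tuples
# ("and", A, B) or ("or", A, B).

def rewrite_switch(f):
    """Apply the switch rule  A and (B or C)  ->  (A and B) or C
    at the first position found, searching subformulas recursively.
    Returns the rewritten formula, or None if switch is not applicable."""
    if isinstance(f, tuple):
        op, left, right = f
        # try to rewrite at the root of this subformula
        if op == "and" and isinstance(right, tuple) and right[0] == "or":
            _, b, c = right
            return ("or", ("and", left, b), c)
        # otherwise recurse into the subformulas: this is the "deep" part
        new_left = rewrite_switch(left)
        if new_left is not None:
            return (op, new_left, right)
        new_right = rewrite_switch(right)
        if new_right is not None:
            return (op, left, new_right)
    return None

# The redex sits under an outer disjunction, where a Gentzen-style rule,
# restricted to the main connective, could not reach it directly:
formula = ("or", "d", ("and", "a", ("or", "b", "c")))
print(rewrite_switch(formula))
# -> ('or', 'd', ('or', ('and', 'a', 'b'), 'c'))
```

In a Gentzen formalism a rule only acts on the main connective of a formula in a sequent; here the rule fires inside a subformula, which is exactly what the deep inference methodology permits.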
As an example, I will show the standard deep inference system for propositional logic. System $\mathrm{SKS}$ is a proof system defined by the following structural inference rules (where $a$ and $\stackrel{-}{a}$ are dual atoms)
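In the standard presentation, which I reproduce here for reference (writing $S\{\;\}$ for an arbitrary context, to indicate that rules apply at any depth, and $\mathrm{t}$ and $\mathrm{f}$ for the units true and false), the structural rules of $\mathrm{SKS}$ come in dual down/up pairs: atomic identity/cut, weakening/coweakening and contraction/cocontraction:

$$
\mathrm{ai{\downarrow}}\,\frac{S\{\mathrm{t}\}}{S\{a\vee\bar{a}\}}\qquad
\mathrm{aw{\downarrow}}\,\frac{S\{\mathrm{f}\}}{S\{a\}}\qquad
\mathrm{ac{\downarrow}}\,\frac{S\{a\vee a\}}{S\{a\}}
$$

$$
\mathrm{ai{\uparrow}}\,\frac{S\{a\wedge\bar{a}\}}{S\{\mathrm{f}\}}\qquad
\mathrm{aw{\uparrow}}\,\frac{S\{a\}}{S\{\mathrm{t}\}}\qquad
\mathrm{ac{\uparrow}}\,\frac{S\{a\}}{S\{a\wedge a\}}
$$

together with the two logical rules, switch and medial:

$$
\mathrm{s}\,\frac{S\{A\wedge(B\vee C)\}}{S\{(A\wedge B)\vee C\}}\qquad
\mathrm{m}\,\frac{S\{(A\wedge B)\vee(C\wedge D)\}}{S\{(A\vee C)\wedge(B\vee D)\}}
$$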
A cut-free derivation is a derivation where $\mathrm{ai\uparrow}$ is not used, i.e., a derivation in $\mathrm{SKS}\setminus \left\{\mathrm{ai\uparrow}\right\}$. In addition to these rules, there is a rule
$${\scriptstyle =}\frac{C}{D}$$
such that $C$ and $D$ are opposite sides in one of the following equations:
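In the standard presentation of $\mathrm{SKS}$, these are the associativity, commutativity and unit equations:

$$
\begin{array}{rclcrcl}
A\vee B &=& B\vee A &\quad& A\wedge B &=& B\wedge A\\
(A\vee B)\vee C &=& A\vee(B\vee C) && (A\wedge B)\wedge C &=& A\wedge(B\wedge C)\\
A\vee\mathrm{f} &=& A && A\wedge\mathrm{t} &=& A\\
\mathrm{t}\vee\mathrm{t} &=& \mathrm{t} && \mathrm{f}\wedge\mathrm{f} &=& \mathrm{f}
\end{array}
$$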
A general principle in deep inference is that structural rules on generic formulae (for example, cocontraction) can be replaced by corresponding structural rules on atoms (in this case $\mathrm{ac\uparrow}$).
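For example, a cocontraction on the compound formula $a\vee b$ can be replaced by two atomic cocontractions followed by a medial, in the standard construction:

$$
a\vee b \;\xrightarrow{\;\mathrm{ac\uparrow},\,\mathrm{ac\uparrow}\;}\; (a\wedge a)\vee(b\wedge b) \;\xrightarrow{\;\mathrm{m}\;}\; (a\vee b)\wedge(a\vee b)
$$

where each step is read top-down as an inference from premiss to conclusion.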
It is interesting to note that the inference rules for classical logic, as well as the great majority of rules for any logic, all derive from a common template which has been distilled from the semantics of a purely linear logic in the first deep inference paper. Since this phenomenon is very surprising, especially for structural rules such as weakening and contraction, we believe that we might be dealing with a rather deep aspect of logic and we are currently investigating it.
1.2 Proof-Theoretical Properties
Locality and linearity are foundational concepts for deep inference, in the same spirit as they are for linear logic. Going for locality and linearity basically means going for complexity bounded by a constant. This last idea introduces geometry into the picture, because bounded complexity leads us to equivalence modulo continuous deformation. In a few words, the simple and natural definition of deep inference that we have seen above captures these ideas about linearity, locality and geometry, and can consequently be exploited in many ways, and notably:
to recover a De Morgan premiss-conclusion symmetry that is lost in Gentzen;
to obtain new notions of normalisation in addition to cut elimination;
to shorten analytic proofs by exponential factors compared to Gentzen;
to obtain quasipolynomial-time normalisation for propositional logic;
to express logics that cannot be expressed in Gentzen;
to make the proof theory of a vast range of logics regular and modular;
to get proof systems whose inference rules are local, which is usually impossible in Gentzen;
to inspire a new generation of proof nets and semantics of proofs;
to investigate the nature of cut elimination;
to type optimised versions of the λ-calculus that are not typeable in Gentzen;
to model process algebras;
to model quantum causal evolution ...
... and much more.
One of the open questions is whether deep inference might have a positive influence on the proof-search-as-computation paradigm and possibly on focusing. This subject has been so far almost unexplored, but some preliminary work looks very promising.
The core topic of every proof-theoretic investigation, namely normalisation, deserves a special mention. Traditionally, normalisation is at the core of proof theory, and this is of course the same for deep inference. Normalisation in deep inference is not much different, in principle, from normalisation in Gentzen theory. In practice, however, the more liberal proof composition mechanism of deep inference completely invalidates the techniques (and the intuition) behind cut elimination procedures in Gentzen systems. Much of the effort of these 15 years of research on deep inference went into recovering a normalisation theory. One of the main ideas is called splitting, and at present it is the most general method we know for eliminating cuts in deep inference.
On the other hand, we now have techniques that are not as widely applicable but that are of a completely different nature from splitting, which is combinatorial. A surprising, relatively recent result consists in exploiting deep inference's locality to obtain the first purely geometric normalisation procedure, by a topological device that we call atomic flows. This means that, at least for classical logic and logics that extend it, cut elimination can be understood as a process that is completely independent from logical information: only the shape of the proof, determined by its structural information (creation, duplication and erasure of atoms), matters. Logical information, such as the connectives in formulae, does not matter. This hints at a deeper nature of normalisation than we have understood so far.
1.3 History
Deep inference comes from linear logic and process algebras; more specifically, it comes from seeing proofs as concurrent processes. The first development was the definition of the calculus of structures and a cut elimination proof for the logic BV, which was studied as the logical counterpart of the core of the process algebra CCS. We realised that the techniques developed for BV had a much wider applicability, so we broadly developed the calculus of structures and studied its many novel normalisation properties. The initial phase of development took place in Dresden, from 1999 to 2003; now deep inference is developed in several laboratories around the world. The recent results on modal and intuitionistic logics, proof nets and semantics, and implementations complete the establishment of deep inference as a solid and comprehensive methodology in proof theory.
Nested Deduction in Logical Foundations for Computation
Nicolas Guenot
Abstract: In this thesis, we study different approaches to giving computational content to logical systems based on the deep inference principle, which generalise the traditional formalisms (such as natural deduction and the sequent calculus) by allowing inference rules to be applied at any position inside a formula. First, we follow the Curry-Howard correspondence approach, defining deep inference systems for intuitionistic logic, notably in the calculus of structures, and describing a purely syntactic normalisation procedure. This makes it possible to interpret the proofs of such a system as lambda-terms and to connect normalisation to reduction in a lambda-calculus with explicit substitutions. In particular, we study the extensions of the lambda-calculus that can be obtained by exploiting the specific features of our logical systems. Finally, we consider proof search as computation, in particular through the adaptation of the focusing technique (from linear logic) to the calculus of structures, which drastically reduces the nondeterminism of proof construction. We also show how proof search in intuitionistic logic can be related to reduction in the lambda-calculus with explicit substitutions.
Pdf, 10 April 2013
PhD thesis, defended on 10 April 2013
Towards a Theory of Proofs of Classical Logic
Lutz Straßburger
Pdf, 3 December 2010
Habilitation thesis
A General View of Normalisation Through Atomic Flows
Tom Gundersen
Abstract: Atomic flows are a geometric invariant of classical propositional proofs in deep inference. In this thesis we use atomic flows to describe new normal forms of proofs, of which the traditional normal forms are special cases; we also give several normalisation procedures for obtaining the normal forms. We define, and use to present our results, a new deep-inference formalism called the functorial calculus, which is more flexible than the traditional calculus of structures. To our surprise we are able to 1) normalise proofs without looking at their logical connectives or logical rules; and 2) normalise proofs in less than exponential time.
Pdf, 12 August 2010
PhD thesis, defended on 10 November 2009
You can buy this book at Lambert Academic Publishing
Nested Sequents
Kai Brünnler
Abstract: We see how nested sequents, a natural generalisation of hypersequents, allow us to develop a systematic proof theory for modal logics. As opposed to other prominent formalisms, such as the display calculus and labelled sequents, nested sequents stay inside the modal language and allow for proof systems which enjoy the subformula property in the literal sense.
In the first part we study a systematic set of nested sequent systems for all normal modal logics formed by some combination of the axioms for seriality, reflexivity, symmetry, transitivity and euclideanness. We establish soundness and completeness and some of their good properties, such as invertibility of all rules, admissibility of the structural rules, termination of proof-search, as well as syntactic cut-elimination.
In the second part we study the logic of common knowledge, a modal logic with a fixpoint modality. We look at two infinitary proof systems for this logic: an existing one based on ordinary sequents, for which no syntactic cut-elimination procedure is known, and a new one based on nested sequents. We see how nested sequents, in contrast to ordinary sequents, allow for syntactic cut-elimination and thus allow us to obtain an ordinal upper bound on the length of proofs.
Pdf, 13 April 2010
Habilitation thesis
Nondeterminism and Language Design in Deep Inference
Ozan Kahramanoğulları
Abstract: This thesis studies the design of deep-inference deductive systems. In the systems with deep inference, in contrast to traditional proof-theoretic systems, inference rules can be applied at any depth inside logical expressions. Deep applicability of inference rules provides a rich combinatorial analysis of proofs. Deep inference also makes it possible to design deductive systems that are tailored for computer science applications and otherwise provably not expressible.
By applying the inference rules deeply, logical expressions can be manipulated starting from their sub-expressions. This way, we can simulate analytic proofs in traditional deductive formalisms. Furthermore, we can also construct much shorter analytic proofs than in these other formalisms. However, deep applicability of inference rules causes much greater nondeterminism in proof construction.
This thesis attacks the problem of dealing with nondeterminism in proof search while preserving the shorter proofs that are available thanks to deep inference. By redesigning the deep inference deductive systems, some redundant applications of the inference rules are prevented. By introducing a new technique which reduces nondeterminism, it becomes possible to obtain a more immediate access to shorter proofs, without breaking certain proof theoretical properties such as cut-elimination. Different implementations presented in this thesis allow us to perform experiments on the techniques that we developed and observe the performance improvements. Within a computation-as-proof-search perspective, we use deep-inference deductive systems to develop a common proof-theoretic language for the two fields of planning and concurrency.
Pdf, 11 September 2006
PhD thesis, defended on 21 December 2006
You can buy this book at Lambert Academic Publishing
Categorical Models of First Order Classical Proofs
Richard McKinley
Abstract: This thesis introduces the notion of a classical doctrine: a semantics for proofs in first-order classical logic derived from the classical categories of Führmann and Pym, using Lawvere's notion of hyperdoctrine. We introduce a hierarchy of classes of model, increasing in the strength of cut-reduction theory they model; the weakest captures cut reduction, and the strongest gives De Morgan duality between quantifiers as an isomorphism. Whereas classical categories admit the elimination of logical cuts as equalities (and cuts against structural rules as inequalities), classical doctrines admit certain logical cuts as inequalities only. This is a result of the additive character of the quantifier introduction rules, as is illustrated by a concrete model based on families of sets and relations, using an abstract Geometry of Interaction construction.
We establish that each class of models is sound and complete with respect to the relevant cut-reduction theory on proof nets based on those of Robinson for propositional classical logic. We show also that classical categories and classical doctrines are not only a class of models for the sequent calculus, but also for deep inference calculi due to Brünnler for classical logic. Of particular interest are the local systems for classical logic, which we show are modelled by categorical models with an additional axiom forcing monoidality of certain functors; these categorical models correspond to multiplicative presentations of the sequent calculus with additional additive features.
Pdf, 30 March 2006
PhD thesis, defended on 17 March 2006
Deep Inference and Its Normal Form of Derivations
Kai Brünnler
Abstract: We see a notion of normal derivation for the calculus of structures, which is based on a factorisation of derivations and which is more general than the traditional notion of cut-free proof in this formalism.
Pdf, 29 March 2006. CiE 2006, LNCS 3988, pp. 65–74
Deep Inference and Symmetry in Classical Proofs
Kai Brünnler
Abstract: In this thesis we see deductive systems for classical propositional and predicate logic which use deep inference, i.e. inference rules apply arbitrarily deep inside formulas, and a certain symmetry, which provides an involution on derivations. Like sequent systems, they have a cut rule which is admissible. Unlike sequent systems, they enjoy various new interesting properties. Not only the identity axiom, but also cut, weakening and even contraction are reducible to atomic form. This leads to inference rules that are local, meaning that the effort of applying them is bounded, and finitary, meaning that, given a conclusion, there is only a finite number of premises to choose from. The systems also enjoy new normal forms for derivations and, in the propositional case, a cut elimination procedure that is drastically simpler than the ones for sequent systems.
Pdf, February 2004
PhD thesis, defended on 22 September 2003, published by Logos Verlag
You can buy this book at Logos-Verlag and at Amazon
Linear Logic and Noncommutativity in the Calculus of Structures
Lutz Straßburger
Abstract: In this thesis I study several deductive systems for linear logic, its fragments, and some noncommutative extensions. All systems will be designed within the calculus of structures, which is a proof theoretical formalism for specifying logical systems, in the tradition of Hilbert's formalism, natural deduction, and the sequent calculus. Systems in the calculus of structures are based on two simple principles: deep inference and top-down symmetry. Together they have remarkable consequences for the properties of the logical systems. For example, for linear logic it is possible to design a deductive system, in which all rules are local. In particular, the contraction rule is reduced to an atomic version, and there is no global promotion rule. I will also show an extension of multiplicative exponential linear logic by a noncommutative, self-dual connective which is not representable in the sequent calculus. All systems enjoy the cut elimination property. Moreover, this can be proved independently from the sequent calculus via techniques that are based on the new top-down symmetry. Furthermore, for all systems, I will present several decomposition theorems which constitute a new type of normal form for derivations.
Pdf, Pdf in booklet format, 25 July 2003
PhD thesis, defended on 24 July 2003
In the rest of the section, all the papers I know of are listed according to their subject, in no particular order.
2.1 Classical and Intuitionistic Logic
So far, for classical logic in the calculus of structures, the following has been achieved:
the cut rule trivially reduces to atomic form;
one can show cut elimination for the propositional fragment by the simplest cut elimination argument to date;
the propositional fragment is fully local, including contraction;
first order classical logic can be entirely made finitary;
cut elimination and decomposition theorems are proved;
a typed lambda-calculus with explicit sharing can be obtained via a Curry-Howard interpretation of deep inference.
We can present intuitionistic logic in the calculus of structures with a fully local, cut-free system. The logic of bunched implications BI can be presented in the calculus of structures. Japaridze's cirquent calculus benefits from a deep-inference presentation, in particular in the case of propositional logic.
No Complete Linear Term Rewriting System for Propositional Logic
Anupam Das and Lutz Straßburger
Abstract: Recently it has been observed that the set of all sound linear inference rules in propositional logic is already coNP-complete, i.e. that every boolean tautology can be written as a (left- and right-) linear rewrite rule. This raises the question of whether there is a rewriting system on linear terms of propositional logic that is sound and complete for the set of all such rewrite rules. We show in this paper that, as long as reduction steps are polynomial-time decidable, such a rewriting system does not exist unless coNP = NP.
We draw tools and concepts from term rewriting, boolean function theory and graph theory, in order to access the required intermediate results. At the same time we make several connections between these areas that, to our knowledge, have not yet been presented and constitute a rich theoretical framework for reasoning about linear TRSs for propositional logic.
Pdf, 14 February 2015
Accepted at RTA 2015, LIPIcs
Symmetric Normalisation for Intuitionistic Logic
Nicolas Guenot and Lutz Straßburger
Abstract: We present two proof systems for implication-only intuitionistic logic in the calculus of structures. The first is a direct adaptation of the standard sequent calculus to the deep inference setting, and we describe a procedure for cut elimination, similar to the one from the sequent calculus, but using a non-local rewriting. The second system is the symmetric completion of the first, as normally given in deep inference for logics with a De Morgan duality: all inference rules have duals, as cut is dual to the identity axiom. We prove a generalisation of cut elimination, that we call symmetric normalisation, where all rules dual to standard ones are permuted up in the derivation. The result is a decomposition theorem having cut elimination and interpolation as corollaries.
Pdf, 20 May 2014. CSL-LICS 2014, pp. 45:1–10
A Proof of Strong Normalisation of the Typed Atomic Lambda-Calculus
Tom Gundersen, Willem Heijltjes and Michel Parigot
Abstract: The atomic lambda-calculus is a typed lambda-calculus with explicit sharing, which originates in a Curry-Howard interpretation of a deep-inference system for intuitionistic logic. It has been shown that it allows fully lazy sharing to be reproduced in a typed setting. In this paper we prove strong normalization of the typed atomic lambda-calculus using Tait's reducibility method.
Pdf, 10 October 2013. LPAR-19, LNCS 8312, pp. 340–354
Atomic Lambda Calculus: A Typed Lambda-Calculus with Explicit Sharing
Tom Gundersen, Willem Heijltjes and Michel Parigot
Abstract: An explicit-sharing lambda-calculus is presented, based on a Curry-Howard-style interpretation of the deep inference proof formalism. Duplication of subterms during reduction proceeds 'atomically', i.e. on individual constructors, similar to optimal graph reduction in the style of Lamping. The calculus preserves strong normalisation with respect to the lambda-calculus, and achieves fully lazy sharing.
Pdf, 21 May 2013. LICS 2013 Proceedings (IEEE), pp. 311–320
Nested Proof Search as Reduction in the λ-calculus
Nicolas Guenot
Abstract: We present here a proof system called JS for purely implicative intuitionistic logic at the propositional level in the formalism of the calculus of structures, which is a generalisation of the sequent calculus implementing the deep inference methodology. We show that this system is sound and complete with respect to the usual sequent calculus LJ, and consider a restricted system JLSd for a restricted class of formulas. Moreover, we show how to encode λ-terms with explicit substitutions inside formulas of JLSd and prove that there is a correspondence between proof search in JLSd and reduction in a λ-calculus with explicit substitutions. Finally, we present a restriction JLSn of JLSd which allows us to establish the same correspondence with the standard λ-calculus equipped with β-reduction, and show that we can prove results on the reductions of our λ-calculus with explicit substitutions, as well as the correspondence between the standard β-reduction and explicit substitutions, by proving results on derivations in the JLSd and JLSn systems.
Pdf, 12 May 2011. PPDP 2011, ACM, pp. 183–193
Breaking Paths in Atomic Flows for Classical Logic
Alessio Guglielmi, Tom Gundersen and Lutz Straßburger
Abstract: This work belongs to a wider effort aimed at eliminating syntactic bureaucracy from proof systems. In this paper, we present a novel cut elimination procedure for classical propositional logic. It is based on the recently introduced atomic flows: they are purely graphical devices that abstract away from much of the typical bureaucracy of proofs. We make crucial use of the path breaker, an atomic-flow construction that avoids some nasty termination problems, and that can be used in any proof system with sufficient symmetry. This paper contains an original 2-dimensional-diagram exposition of atomic flows, which helps us to connect atomic flows with other known formalisms.
Pdf, 29 April 2010. LICS 2010 Proceedings (IEEE), pp. 284–293
An Algorithmic Interpretation of a Deep Inference System
Kai Brünnler and Richard McKinley
Abstract: We set out to find something that corresponds to deep inference in the same way that the lambda-calculus corresponds to natural deduction. Starting from natural deduction for the conjunction-implication fragment of intuitionistic logic we design a corresponding deep inference system together with reduction rules on proofs that allow a fine-grained simulation of beta-reduction.
Pdf, 22 September 2008. LPAR 08, LNCS 5330, pp. 482–496
Cirquent Calculus Deepened
Giorgi Japaridze
Abstract: Cirquent calculus is a new proof-theoretic and semantic framework, whose main distinguishing feature is being based on circuit-style structures (called cirquents), as opposed to the more traditional approaches that deal with tree-like objects such as formulas, sequents or hypersequents. Among its advantages are greater efficiency, flexibility and expressiveness. This paper presents a detailed elaboration of a deep-inference cirquent logic, which is naturally and inherently resource conscious. It shows that classical logic, both syntactically and semantically, can be seen to be just a special, conservative fragment of this more general and, in a sense, more basic logic — the logic of resources in the form of cirquent calculus. The reader will find various arguments in favor of switching to the new framework, such as arguments showing the insufficiency of the expressive power of linear logic or other formula-based approaches to developing resource logics, exponential improvements over the traditional approaches in both representational and proof complexities offered by cirquent calculus (including the existence of polynomial size cut-, substitution- and extension-free cirquent calculus proofs for the notoriously hard pigeonhole principle), and more. Among the main purposes of this paper is to provide an introductory-style starting point for what, as the author wishes to hope, might have a chance to become a new line of research in proof theory — a proof theory based on circuits instead of formulas.
Pdf, 1 April 2008. Journal of Logic and Computation, 18 (6) 2008, pp. 983–1028
Normalisation Control in Deep Inference Via Atomic Flows
Alessio Guglielmi and Tom Gundersen
Abstract: We introduce 'atomic flows': they are graphs obtained from derivations by tracing atom occurrences and forgetting the logical structure. We study simple manipulations of atomic flows that correspond to complex reductions on derivations. This allows us to prove, for propositional logic, a new and very general normalisation theorem, which contains cut elimination as a special case. We operate in deep inference, which is more general than other syntactic paradigms, and where normalisation is more difficult to control. We argue that atomic flows are a significant technical advance for normalisation theory, because 1) the technique they support is largely independent of syntax; 2) indeed, it is largely independent of logical inference rules; 3) they constitute a powerful geometric formalism, which is more intuitive than syntax.
Journal page (pdf/postscript), 31 March 2008. Logical Methods in Computer Science 4 (1:9) 2008, pp. 1–36
The Logic of Bunched Implications: A Memoir
Benjamin Robert Horsfall
Abstract: This is a study of the semantics and proof theory of the logic of bunched implications (BI), which is promoted as a logic of (computational) resources, and is a foundational component of separation logic, an approach to program analysis. BI combines an additive, or intuitionistic, fragment with a multiplicative fragment. The additive fragment has full use of the structural rules of weakening and contraction, and the multiplicative fragment has none. Thus it contains two conjunctive and two implicative connectives. At various points, we illustrate a resource view of BI based upon the Kripke resource semantics. Our first original contribution is the formulation of a proof system for BI in the newly developed proof-theoretical formalism of the calculus of structures. The calculus of structures is distinguished by its employment of deep inference, but we already see deep inference in a limited form in the established proof theory for BI. We show that our system is sound with respect to the elementary Kripke resource semantics for BI, and complete with respect to the partially-defined monoid (PDM) semantics. Our second contribution is the development from a semantic standpoint of preliminary ideas for a hybrid logic of bunched implications (HBI). We give a Kripke semantics for HBI in which nominal propositional atoms can be seen as names for resources, rather than as names for locations, as is the case with related proposals for BI-Loc and for intuitionistic hybrid logic. The cost of this approach is the loss of intuitionistic monotonicity in the semantics. But this is perhaps not such a grave loss, given that our guiding analogy is of states of models with resources, rather than with states of knowledge, as is standard for intuitionistic logic.
Pdf, June 2007
MSc thesis, successfully defended in August 2007
A Characterisation of Medial as Rewriting Rule Lutz Straßburger AbstractMedial is an inference rule scheme that appears in various deductive systems based on deep inference. In this paper we investigate the properties of medial as a rewriting rule, independently of logic. We present a graph theoretical criterion for checking whether there exists a medial rewriting path between two formulas. Finally, we return to logic and apply our criterion for giving a combinatorial proof of a decomposition theorem, i.e., a proof-theoretical statement about syntax. Pdf13 April 2007 RTA 2007, LNCS 4533, pp. 344–358
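To make the rule concrete, here is a minimal sketch of mine in Python (the paper works with a graph-theoretical criterion, not code; the tuple encoding and names are my own illustration): formulas are nested tuples and medial is the rewrite step (A∧B)∨(C∧D) → (A∨C)∧(B∨D).

```python
# Formulas as nested tuples: ('var', name), ('and', f, g), ('or', f, g).
# Illustrative encoding, not from the paper.

def medial(f):
    """Apply medial, (A∧B)∨(C∧D) → (A∨C)∧(B∨D), at the root;
    return None if the formula is not a redex of that shape."""
    if f[0] == 'or' and f[1][0] == 'and' and f[2][0] == 'and':
        (_, a, b), (_, c, d) = f[1], f[2]
        return ('and', ('or', a, c), ('or', b, d))
    return None

a, b, c, d = (('var', v) for v in 'abcd')
assert medial(('or', ('and', a, b), ('and', c, d))) == \
       ('and', ('or', a, c), ('or', b, d))
assert medial(('and', a, b)) is None  # not a redex
```

A medial rewriting path, as studied in the paper, is then a sequence of such steps applied at arbitrary subformula positions.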
A Local System for Intuitionistic Logic Alwen Tiu AbstractThis paper presents systems for first-order intuitionistic logic and several of its extensions in which all the propositional rules are local, in the sense that, in applying the rules of the system, one needs only a fixed amount of information about the logical expressions involved. The main source of non-locality is the contraction rules. We show that the contraction rules can be restricted to the atomic ones, provided we employ deep inference, i.e., allow rules to apply anywhere inside logical expressions. We further show that the use of deep inference allows for modular extensions of intuitionistic logic to Dummett's intermediate logic LC, Gödel logic and classical logic. We present the systems in the calculus of structures, a proof-theoretic formalism which supports deep inference. Cut elimination for these systems is proved indirectly by simulating the cut-free sequent systems, or the hypersequent systems in the cases of Dummett's LC and Gödel logic, in the cut-free systems in the calculus of structures.
Locality for Classical Logic Kai Brünnler AbstractIn this paper we will see deductive systems for classical propositional and predicate logic in the calculus of structures. Like sequent systems, they have a cut rule which is admissible. Unlike sequent systems, they drop the restriction that rules only apply to the main connective of a formula: their rules apply anywhere deeply inside a formula. This allows one to observe very clearly the symmetry between the identity axiom and the cut rule. This symmetry makes it possible to reduce the cut rule to atomic form, in a way which is dual to reducing the identity axiom to atomic form. We also reduce weakening and even contraction to atomic form. This leads to inference rules that are local: they do not require the inspection of expressions of arbitrary size. Pdf10 March 2006 Notre Dame Journal of Formal Logic 47 (4), pp. 557–580, 2006 Review by Sara Negri
Note about the review: All the concerns of the reviewer are addressed in the paper Two Restrictions on Contraction (which is among the references of the reviewed paper).
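The two distinctive ingredients of the abstract, rules that apply anywhere inside a formula and contraction restricted to atoms, can be sketched as term rewriting. This is my own Python illustration (formulas as nested tuples, names hypothetical), not code from the paper:

```python
def rewrite_deep(f, rule):
    """Try a rule at the root, else recurse into subformulas:
    the 'deep' in deep inference is that redexes may occur anywhere."""
    r = rule(f)
    if r is not None:
        return r
    if f[0] in ('and', 'or'):
        left = rewrite_deep(f[1], rule)
        if left is not None:
            return (f[0], left, f[2])
        right = rewrite_deep(f[2], rule)
        if right is not None:
            return (f[0], f[1], right)
    return None

def atomic_contraction(f):
    """a∨a → a for an atom a only: the rule is local because it
    inspects a fixed, bounded part of the formula."""
    if f[0] == 'or' and f[1][0] == 'var' and f[1] == f[2]:
        return f[1]
    return None

g = ('and', ('var', 'b'), ('or', ('var', 'a'), ('var', 'a')))
assert rewrite_deep(g, atomic_contraction) == ('and', ('var', 'b'), ('var', 'a'))
```

The generic contraction rule, by contrast, would duplicate a subformula of arbitrary size, which is exactly what locality rules out.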
Cut Elimination Inside a Deep Inference System for Classical Predicate Logic Kai Brünnler AbstractDeep inference is a natural generalisation of the one-sided sequent calculus where rules are allowed to apply deeply inside formulas, much like rewrite rules in term rewriting. This freedom in applying inference rules allows one to express logical systems that are difficult or impossible to express in the cut-free sequent calculus, and it also allows for a more fine-grained analysis of derivations than the sequent calculus. However, the same freedom also makes it harder to carry out this analysis; in particular, it is harder to design cut elimination procedures. In this paper we see a cut elimination procedure for a deep inference system for classical predicate logic. As a consequence we derive Herbrand's Theorem, which we express as a factorisation of derivations. Pdf9 March 2005 Studia Logica 82 (1), pp. 51–71, 2006
A First Order System with Finite Choice of Premises Kai Brünnler and Alessio Guglielmi AbstractWe present an inference system for classical first order logic in which each inference rule, including the cut, only has a finite set of premises to choose from. The main conceptual contribution of this paper is the possibility of separating different sources of infinite choice, which happen to be entangled in the traditional cut rule. Pdf1 December 2003
Presented at First Order Logic 75 under the title A Finitary System for First Order Logic; appeared in Hendricks et al. (editors), First-Order Logic Revisited, Logos Verlag, Berlin, 2004, pp. 59–74
Consistency Without Cut Elimination Kai Brünnler and Alessio Guglielmi AbstractWe show how to get consistency for first order classical logic, in a purely syntactic way, without going through cut elimination. The procedure is very simple and it also shows that finitariness is actually a triviality (contrary to what one would guess from textbooks). Pdf10 September 2003It is contained in A First Order System with Finite Choice of Premises
Atomic Cut Elimination for Classical Logic Kai Brünnler AbstractSystem SKS is a set of rules for classical propositional logic presented in the calculus of structures. Like sequent systems and unlike natural deduction systems, it has an explicit cut rule, which is admissible. In contrast to sequent systems, the cut rule can easily be restricted to atoms. This allows for a very simple cut elimination procedure based on plugging in parts of a proof, like normalisation in natural deduction and unlike cut elimination in the sequent calculus. It should thus be a good common starting point for investigations into both proof search as computation and proof normalisation as computation. Pdf10 April 2003 CSL 2003, LNCS 2803, pp. 86–97
A Local System for Classical Logic Kai Brünnler and Alwen Fernanto Tiu AbstractThe calculus of structures is a framework for specifying logical systems, which is similar to the one-sided sequent calculus but more general. We present a system of inference rules for propositional classical logic in this new framework and prove cut elimination for it. The system also enjoys a decomposition theorem for derivations that is not available in the sequent calculus. The main novelty of our system is that all the rules are local: contraction, in particular, is reduced to atomic form. This should be interesting for distributed proof-search and also for complexity theory, since the computational cost of applying each rule is bounded. Pdf2 October 2001Now replaced by Locality for Classical Logic LPAR 2001, LNCS 2250, pp. 347–361
2.2 Proof Complexity
The basic proof complexity properties of propositional logic in the calculus of structures are known. Deep inference is as powerful as Frege systems and, in the restriction to analytic systems, more powerful than Gentzen systems.
On the Power of Substitution in the Calculus of Structures Novak Novaković and Lutz Straßburger AbstractThere are two contributions in this article. First, we give a direct proof of the known fact that Frege systems with substitution can be p-simulated by the calculus of structures (CoS) extended with the substitution rule. This is done without referring to the p-equivalence of extended Frege systems and Frege systems with substitution. Second, we then show that the cut-free CoS with substitution is p-equivalent to the cut-free CoS with extension. Pdf7 April 2015 ACM Transactions on Computational Logic 16 (3:19) 2015, pp. 1–20
On the Relative Proof Complexity of Deep Inference Via Atomic Flows Anupam Das AbstractWe consider the proof complexity of the minimal complete fragment of standard deep inference, denoted KS. To examine the size of proofs we employ atomic flows, diagrams that trace structural changes through a proof but ignore logical information. As results we obtain a polynomial simulation of dag-like cut-free Gentzen and Resolution, along with some extensions. We also show that these systems, as well as bounded-depth Frege systems, cannot polynomially simulate KS, by giving polynomial-size proofs of certain variants of the propositional pigeonhole principle in KS.
Conference version pdf28 May 2012 CiE 2012, LNCS 7318, pp. 139–150
The title of this version is Complexity of Deep Inference Via Atomic Flows
On the Pigeonhole and Related Principles in Deep Inference and Monotone Systems Anupam Das AbstractWe construct quasipolynomial-size proofs of the propositional pigeonhole principle in the deep inference system KS, addressing an open problem raised in previous works and matching the best known upper bound for the more general class of monotone proofs.
We make significant use of monotone formulae computing boolean threshold functions, an idea previously considered in works of Atserias et al. The main construction, monotone proofs witnessing the symmetry of such functions, involves an implementation of merge-sort in the design of proofs in order to tame the structural behaviour of atoms, and so the complexity of normalization. Proof transformations from previous work on atomic flows are then employed to yield appropriate KS proofs.
As further results we show that our constructions can be applied to provide quasipolynomial-size KS proofs of the parity principle and the generalized pigeonhole principle. These bounds are inherited for the class of monotone proofs, and we are further able to construct n^{O(log log n)}-size monotone proofs of the weak pigeonhole principle, thereby also improving the best known bounds for monotone proofs. PdfExtended version PDF16 May 2014 CSL-LICS 2014, pp. 36:1–10
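The monotone threshold formulae in question can be illustrated by the standard divide-and-conquer construction (a sketch of mine in Python, not the authors' actual proof machinery; `thr` is a hypothetical helper building a monotone formula that is true iff at least k of the given atoms are true):

```python
from itertools import product

TRUE, FALSE = ('const', True), ('const', False)

def thr(k, xs):
    """Monotone formula, over ∧/∨ only, true iff at least k of the
    atoms xs are true; built by splitting xs in half, which keeps
    the size quasipolynomial in len(xs)."""
    n = len(xs)
    if k <= 0:
        return TRUE
    if k > n:
        return FALSE
    if n == 1:
        return ('var', xs[0])
    l, r = xs[:n // 2], xs[n // 2:]
    # at least k true overall iff, for some 0 <= i <= k,
    # at least i are true on the left and at least k-i on the right
    f = ('and', thr(0, l), thr(k, r))
    for i in range(1, k + 1):
        f = ('or', f, ('and', thr(i, l), thr(k - i, r)))
    return f

def ev(f, v):
    if f[0] == 'const': return f[1]
    if f[0] == 'var':   return v[f[1]]
    if f[0] == 'and':   return ev(f[1], v) and ev(f[2], v)
    return ev(f[1], v) or ev(f[2], v)

xs = ['x1', 'x2', 'x3', 'x4']
assert all(ev(thr(2, xs), dict(zip(xs, bits))) == (sum(bits) >= 2)
           for bits in product([False, True], repeat=4))
```

The paper's hard part is not building such formulae but proving, inside KS, that they are symmetric in their arguments; the merge-sort organisation mentioned in the abstract serves that proof.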
Quasipolynomial Normalisation in Deep Inference Via Atomic Flows and Threshold Formulae Paola Bruscoli, Alessio Guglielmi, Tom Gundersen and Michel Parigot AbstractJeřábek showed that cuts in propositional-logic deep-inference proofs can be eliminated in quasipolynomial time. The proof is indirect and it relies on a result of Atserias, Galesi and Pudlák about monotone sequent calculus and a correspondence between that system and cut-free deep-inference proofs. In this paper we give a direct proof of Jeřábek's result: we give a quasipolynomial-time cut-elimination procedure in propositional-logic deep inference. The main new ingredient is the use of a computational trace of deep-inference proofs called atomic flows, which are both very simple (they only trace structural rules and forget logical rules) and strong enough to faithfully represent the cut-elimination procedure.
Conference version pdf25 June 2010 LPAR-16, LNCS 6355, pp. 136–153
The title of this version is A Quasipolynomial Cut-Elimination Procedure in Deep Inference Via Atomic Flows and Threshold Formulae
Rewriting with Linear Inferences in Propositional Logic Anupam Das AbstractLinear inferences are sound implications of propositional logic where each variable appears exactly once in the premiss and conclusion. We consider a specific set of these inferences, MS, first studied by Straßburger, corresponding to the logical rules in deep inference proof theory. Despite previous results characterising the individual rules of MS, we show that there is no polynomial-time characterisation of MS, assuming that integers cannot be factorised in polynomial time.
We also examine the length of rewrite paths in MS, utilising a notion of trivialisation to reduce the case with units to the case without, amongst other observations on MS-rewriting and the set of linear inferences in general. Pdf15 February 2013 RTA 2013, LIPIcs 21, pp. 158–173
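The notion of a linear inference is easy to state operationally. Here is a small check of mine in Python (not from the paper, and deliberately naive: soundness is tested by brute-force truth tables): linearity is a syntactic count of variable occurrences, and both switch and medial, the rules behind MS, pass.

```python
from itertools import product

def atoms(f):
    """List of atom occurrences (a multiset: duplicates matter)."""
    return [f[1]] if f[0] == 'var' else atoms(f[1]) + atoms(f[2])

def ev(f, v):
    if f[0] == 'var': return v[f[1]]
    if f[0] == 'and': return ev(f[1], v) and ev(f[2], v)
    return ev(f[1], v) or ev(f[2], v)

def is_linear_inference(prem, conc):
    """Each variable occurs exactly once in premiss and conclusion,
    and prem -> conc is a tautology (checked exhaustively here)."""
    vs = atoms(prem)
    if len(set(vs)) != len(vs) or sorted(vs) != sorted(atoms(conc)):
        return False
    return all(not ev(prem, dict(zip(vs, bits))) or ev(conc, dict(zip(vs, bits)))
               for bits in product([False, True], repeat=len(vs)))

a, b, c, d = (('var', v) for v in 'abcd')
switch = (('and', a, ('or', b, c)), ('or', ('and', a, b), c))
medial = (('or', ('and', a, b), ('and', c, d)),
          ('and', ('or', a, c), ('or', b, d)))
assert is_linear_inference(*switch) and is_linear_inference(*medial)
```

The exponential truth-table check is of course not a characterisation; the paper's point is precisely that no polynomial-time characterisation of MS exists, under the stated assumption.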
Extension without Cut Lutz Straßburger AbstractIn proof theory one distinguishes sequent proofs with cut and cut-free sequent proofs, while for proof complexity one distinguishes Frege-systems and extended Frege-systems. In this paper we show how deep inference can provide a uniform treatment for both classifications, such that we can define cut-free systems with extension, which is neither possible with Frege-systems, nor with the sequent calculus. We show that the propositional pigeon-hole principle admits polynomial-size proofs in a cut-free system with extension. We also define cut-free systems with substitution and show that the system with extension p-simulates the system with substitution. Pdf31 July 2012 Annals of Pure and Applied Logic 163(12) 2012, pp. 1995–2007
On the Proof Complexity of Cut-Free Bounded Deep Inference Anupam Das AbstractIt has recently been shown that cut-free deep inference systems exhibit an exponential speed-up over cut-free sequent systems, in terms of proof size. While this is good for proof complexity, there remains the problem of typically high proof search non-determinism induced by the deep inference methodology: the higher the depth of inference, the higher the non-determinism. In this work we improve on the proof search side by demonstrating that, for propositional logic, the same exponential speed-up in proof size can be obtained in bounded-depth cut-free systems. These systems retain the top-down symmetry of deep inference, but can otherwise be designed at the same depth level of sequent systems. As a result, the non-determinism arising from the choice of rules at each stage of a proof is smaller than that of unbounded deep inference, while still giving access to the short proofs of deep inference. Pdf25 April 2011 Tableaux 2011, LNCS 6793, pp. 134–148
On the Proof Complexity of Deep Inference Paola Bruscoli and Alessio Guglielmi AbstractWe obtain two results about the proof complexity of deep inference: 1) deep-inference proof systems are as powerful as Frege ones, even when both are extended with the Tseitin extension rule or with the substitution rule; 2) there are analytic deep-inference proof systems that exhibit an exponential speed-up over analytic Gentzen proof systems that they polynomially simulate. Pdf19 April 2009 ACM Transactions on Computational Logic 10 (2:14) 2009, pp. 1–34
Proof Complexity of the Cut-Free Calculus of Structures Emil Jeřábek AbstractWe investigate the proof complexity of analytic subsystems of the deep inference proof system SKSg (the calculus of structures). Exploiting the fact that the cut rule (i↑) of SKSg corresponds to the ¬-left rule in the sequent calculus, we establish that the “analytic” system KSg + c↑ has essentially the same complexity as the monotone Gentzen calculus MLK. In particular, KSg + c↑ quasipolynomially simulates SKSg, and admits polynomial-size proofs of some variants of the pigeonhole principle. Pdf30 April 2008 Journal of Logic and Computation 19 (2) 2009, pp. 323–339
2.3 Nested Sequents
A new formalism called 'nested sequents' has been defined, which is especially suitable for modal logics.
The following papers exist, in addition to Nested Sequents, mentioned above:
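The basic idea is simple: a nested sequent is a tree of one-sided sequents, and it can be read directly as a modal formula, with each bracketed subsequent going under a box. A minimal sketch of mine in Python (the encoding is illustrative, not from any of the papers below):

```python
# A nested sequent is a pair (formulas, children): a list of formulas
# plus a list of child nested sequents, each child read under a box.
# Γ = A1,…,Am,[Δ1],…,[Δn] reads as A1 ∨ … ∨ Am ∨ □⟨Δ1⟩ ∨ … ∨ □⟨Δn⟩.

def to_formula(seq):
    """Corresponding formula of a (non-empty) nested sequent."""
    fmls, children = seq
    parts = list(fmls) + [('box', to_formula(ch)) for ch in children]
    f = parts[0]
    for p in parts[1:]:
        f = ('or', f, p)
    return f

# A, [B, [C]]  reads as  A ∨ □(B ∨ □C)
s = ([('var', 'A')], [([('var', 'B')], [([('var', 'C')], [])])])
assert to_formula(s) == \
    ('or', ('var', 'A'), ('box', ('or', ('var', 'B'), ('box', ('var', 'C')))))
```

Deep inference then means applying rules at any node of this tree, not just at its root; this is what makes the calculi below "internal", since every sequent in a proof denotes a formula.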
Label-free Modular Systems for Classical and Intuitionistic Modal Logics
Sonia Marin and Lutz Straßburger AbstractIn this paper we show for each of the modal axioms d, t, b, 4, and 5 an equivalent set of inference rules in a nested sequent system, such that, when added to the basic system for the modal logic K, the resulting system admits cut elimination. Then we show the same result also for intuitionistic modal logic. We achieve this by combining structural and logical rules. Pdf6 June 2014 Advances in Modal Logic 2014, pp. 387–406
Annotation-Free Sequent Calculi for Full Intuitionistic Linear Logic Ranald Clouston, Jeremy Dawson, Rajeev Goré and Alwen Tiu AbstractFull Intuitionistic Linear Logic (FILL) is multiplicative intuitionistic linear logic extended with par. Its proof theory has been notoriously difficult to get right, and existing sequent calculi all involve inference rules with complex annotations to guarantee soundness and cut-elimination. We give a simple and annotation-free display calculus for FILL which satisfies Belnap’s generic cut-elimination theorem. To do so, our display calculus actually handles an extension of FILL, called Bi-Intuitionistic Linear Logic (BiILL), with an ‘exclusion’ connective defined via an adjunction with par. We refine our display calculus for BiILL into a cut-free nested sequent calculus with deep inference in which the explicit structural rules of the display calculus become admissible. A separation property guarantees that proofs of FILL formulae in the deep inference calculus contain no trace of exclusion. Each such rule is sound for the semantics of FILL, thus our deep inference calculus and display calculus are conservative over FILL. The deep inference calculus also enjoys the subformula property and terminating backward proof search, which gives the NP-completeness of BiILL and FILL. Pdf8 July 2013 CSL 2013, LIPIcs 23, pp. 197–214
Nested Sequent Calculi for Normal Conditional Logics
Régis Alenda, Nicola Olivetti and Gian Luca Pozzato AbstractNested sequent calculi are a useful generalization of ordinary sequent calculi, where sequents are allowed to occur within sequents. Nested sequent calculi have been profitably used in the area of (multi)-modal logic to obtain analytic and modular proof systems for these logics. In this work, we extend the realm of nested sequents by providing nested sequent calculi for the basic conditional logic CK and some of its significant extensions. We also provide a calculus for the Kraus, Lehmann and Magidor cumulative logic C. The calculi are internal (a sequent can be directly translated into a formula), cut-free and analytic. Moreover, they can be used to design (sometimes optimal) decision procedures for the respective logics, and to obtain complexity upper bounds. Our calculi are an argument in favour of nested sequent calculi for modal logics and the like, showing their versatility and power. Pdf1 June 2013
To appear in the Journal of Logic and Computation
Algebra, Proof Theory and Applications for an Intuitionistic Logic of Propositions, Actions and Adjoint Modal Operators Roy Dyckhoff, Mehrnoosh Sadrzadeh and Julien Truffaut AbstractWe develop a cut-free nested sequent calculus as basis for a proof search procedure for an intuitionistic modal logic of actions and propositions. The actions act on propositions via a dynamic modality (the weakest precondition of program logics), whose left adjoint we refer to as “update” (the strongest postcondition). The logic has agent-indexed adjoint pairs of epistemic modalities: the left adjoints encode agents’ uncertainties and the right adjoints encode their beliefs. The rules for the “update” modality encode learning as a result of discarding uncertainty. We prove admissibility of Cut, and hence the soundness and completeness of the logic with respect to an algebraic semantics. We interpret the logic on epistemic scenarios that consist of honest and dishonest communication actions, add assumption rules to encode them, and prove that the calculus with the assumption rules still has the admissibility results. We apply the calculus to encode (and allow reasoning about) the classic epistemic puzzles of dirty children (a.k.a. “muddy children”) and drinking logicians and some versions with dishonesty or noise; we also give an application where the actions are movements of a robot rather than announcements. Pdf31 May 2013 ACM Transactions on Computational Logic 14 (4:34) 2013, pp. 1–37
Cut Elimination in Nested Sequents for Intuitionistic Modal Logics Lutz Straßburger AbstractWe present cut-free deductive systems without labels for the intuitionistic variants of the modal logics obtained by extending IK with a subset of the axioms d, t, b, 4, and 5. For this, we use the formalism of nested sequents, which allows us to give a uniform cut elimination argument for all 15 logics in the intuitionistic S5 cube. Pdf7 January 2013 FoSSaCS 2013, LNCS 7794, pp. 209–224
Labelled Tree Sequents, Tree Hypersequents and Nested (Deep) Sequents Rajeev Goré and Revantha Ramanayake AbstractWe identify a subclass of labelled sequents called "labelled tree sequents" and show that these are notational variants of tree-hypersequents in the sense that a sequent of one type can be represented naturally as a sequent of the other type. This relationship can be extended to nested (deep) sequents using the relationship between tree-hypersequents and nested (deep) sequents, which we also show. We apply this result to transfer proof-theoretic results such as syntactic cut-admissibility between the tree-hypersequent calculus CSGL and the labelled sequent calculus G3GL for provability logic GL. This answers in full a question posed by Poggiolesi about the exact relationship between these calculi. Our results pave the way to obtain cut-free tree-hypersequent and nested (deep) sequent calculi for large classes of logics using the known calculi for labelled sequents, and also to obtain a large class of labelled sequent calculi for bi-intuitionistic tense logics from the known nested (deep) sequent calculi for these logics. Importing proof-theoretic results between notational variant systems in this manner alleviates the need for independent proofs in each system. Identifying which labelled systems can be rewritten as labelled tree sequent systems may provide a method for determining the expressive limits of the nested sequent formalism. Pdf5 October 2012 Advances in Modal Logic 2012, pp. 279–299
Nested Sequent Calculi for Conditional Logics
Régis Alenda, Nicola Olivetti and Gian Luca Pozzato AbstractNested sequent calculi are a useful generalization of ordinary sequent calculi, where sequents are allowed to occur within sequents. Nested sequent calculi have been profitably employed in the area of (multi)-modal logic to obtain analytic and modular proof systems for these logics. In this work, we extend the realm of nested sequents by providing nested sequent calculi for the basic conditional logic CK and some of its significant extensions. The calculi are internal (a sequent can be directly translated into a formula), cut-free and analytic. Moreover, they can be used to design (sometimes optimal) decision procedures for the respective logics, and to obtain complexity upper bounds. Our calculi are an argument in favour of nested sequent calculi for modal logics and the like, showing their versatility and power. Pdf16 July 2012 JELIA 2012, LNCS 7519, pp. 14–27
Nested Sequents for Intuitionistic Logics Melvin Fitting AbstractRelatively recently nested sequent systems for modal logics have come to be seen as an attractive deep-reasoning extension of familiar sequent calculi. In an earlier paper I showed that there was a strong connection between modal nested sequents and modal prefixed tableaux. In this paper I show that the connection continues to intuitionistic logic, both standard and constant domain, relating nested intuitionistic sequent calculi to intuitionistic prefixed tableaux. Modal nested sequent machinery generalizes one-sided sequent calculi — intuitionistic nested sequents similarly generalize two-sided sequents. It is noteworthy that the resulting system for constant domain intuitionistic logic is particularly simple. It amounts to a combination of intuitionistic propositional rules and classical quantifier rules, a combination that is known to be inadequate when conventional intuitionistic sequent systems are used. Pdf8 July 2012 Notre Dame Journal of Formal Logic 55 (1), pp. 41–61, 2014
Grammar Logics in Nested Sequent Calculus: Proof Theory and Decision Procedures Alwen Tiu, Egor Ianovski and Rajeev Goré AbstractA grammar logic refers to an extension of the multi-modal logic K in which the modal axioms are generated from a formal grammar. We consider a proof theory, in nested sequent calculus, of grammar logics with converse, i.e., every modal operator [a] comes with a converse [ā]. Extending previous work on nested sequent systems for tense logics, we show that all grammar logics (with or without converse) can be formalised in nested sequent calculi, where the axioms are internalised in the calculi as structural rules. Syntactic cut-elimination for these calculi is proved using a procedure similar to that for display logics. If the grammar is context-free, then one can get rid of all structural rules, in favor of deep inference and additional propagation rules. We give a novel semi-decision procedure for context-free grammar logics, using nested sequent calculus with deep inference, and show that, in the case where the given context-free grammar is regular, this procedure terminates. Unlike all other existing decision procedures for regular grammar logics in the literature, our procedure does not assume that a finite state automaton encoding the axioms is given. Pdf15 June 2012 Advances in Modal Logic 2012, pp. 516–537
Syntactic Cut-Elimination for a Fragment of the Modal Mu-Calculus Kai Brünnler and Thomas Studer AbstractFor some modal fixed point logics, there are deductive systems that enjoy syntactic cut elimination. An early example is the system in Pliuskevicius (1991) [15] for LTL. More recent examples are the systems by the authors of this paper for the logic of common knowledge (Brünnler and Studer, 2009) [5] and by Hill and Poggiolesi (2010) for PDL [8], which are based on a form of deep inference. These logics can be seen as fragments of the modal mu-calculus. Here we are interested in how far this approach can be pushed in general. To this end, we introduce a nested sequent system with syntactic cut-elimination which is incomplete for the modal mu-calculus, but complete with respect to a restricted language that allows only fixed points of a certain form. This restricted language includes the logic of common knowledge and PDL. There is also a traditional sequent system for the modal mu-calculus by Jäger et al. (2008) [9], without a syntactic cut-elimination procedure. We embed that system into ours and vice versa, thus establishing cut-elimination also for the shallow system, when only the restricted language is considered. Pdf14 May 2012 Annals of Pure and Applied Logic 163(12), pp. 1838–1853, 2012
Proving Completeness for Nested Sequent Calculi Melvin Fitting AbstractProving the completeness of classical propositional logic by using maximal consistent sets is perhaps the most common method there is, going back to Lindenbaum (though not actually published by him). It has been extended to a variety of logical formalisms, sometimes combined with the addition of Henkin constants to handle quantifiers. Recently a deep-reasoning formalism called nested sequents has been introduced by Kai Brünnler, able to handle a larger variety of modal logics than are possible with standard Gentzen-type sequent calculi. In this paper we sketch how yet another variation on the maximality method of Lindenbaum allows one to prove completeness for nested sequent calculi. It is certainly not the only method available, but it should be entered into the record as one more useful tool available to the logician. Pdf4 October 2011 Logic without Frontiers, College Publications, pp. 145–154, 2011
Prefixed Tableaus and Nested Sequents Melvin Fitting AbstractNested sequent systems for modal logics are a relatively recent development, within the general area known as deep reasoning. The idea of deep reasoning is to create systems within which one operates at lower levels in formulas than just those involving the main connective or operator. Prefixed tableaus go back to 1972, and are modal tableau systems with extra machinery to represent accessibility in a purely syntactic way. We show that modal nested sequents and prefixed modal tableaus are notational variants of each other, roughly in the same way that Gentzen sequent calculi and tableaus are notational variants. This immediately gives rise to new modal nested sequent systems which may be of independent interest. We discuss some of these, including those for some justification logics that include standard modal operators. Pdf14 September 2011 Annals of Pure and Applied Logic 163(3), pp. 291–313, 2012
On the Correspondence Between Display Postulates and Deep Inference in Nested Sequent Calculi for Tense Logics Rajeev Goré, Linda Postniece and Alwen Tiu AbstractWe consider two styles of proof calculi for a family of tense logics, presented in a formalism based on nested sequents. A nested sequent can be seen as a tree of traditional single-sided sequents. Our first style of calculi is what we call “shallow calculi”, where inference rules are only applied at the root node in a nested sequent. Our shallow calculi are extensions of Kashima’s calculus for tense logic and share an essential characteristic with display calculi, namely, the presence of structural rules called “display postulates”. Shallow calculi enjoy a simple cut elimination procedure, but are unsuitable for proof search due to the presence of display postulates and other structural rules. The second style of calculi uses deep-inference, whereby inference rules can be applied at any node in a nested sequent. We show that, for a range of extensions of tense logic, the two styles of calculi are equivalent, and there is a natural proof theoretic correspondence between display postulates and deep inference. The deep inference calculi enjoy the subformula property and have no display postulates or other structural rules, making them a better framework for proof search.
Conference version pdf1 May 2009 Tableaux 2009, LNCS 5607, pp. 189–204
The title of this version is Taming Displayed Tense Logics Using Nested Sequents With Deep Inference
Proof Theory and Proof Search of Bi-Intuitionistic and Tense Logic Linda Postniece AbstractIn this thesis, we consider bi-intuitionistic logic and tense logic, as well as the combined bi-intuitionistic tense logic. Each of these logics contains a pair of dual connectives, for example, Rauszer's bi-intuitionistic logic contains intuitionistic implication and dual intuitionistic exclusion. The interaction between these dual connectives makes it non-trivial to develop a cut-free sequent calculus for these logics.
In the first part of this thesis we develop a new extended sequent calculus for bi-intuitionistic logic using a framework of derivations and refutations. This is the first purely syntactic cut-free sequent calculus for bi-intuitionistic logic and thus solves an open problem. Our calculus is sound, semantically complete and allows terminating backward proof search, hence giving rise to a decision procedure for bi-intuitionistic logic.
In the second part of this thesis we consider the broader problem of taming proof search in display calculi, using bi-intuitionistic logic and tense logic as case studies. While the generality of display calculi makes it an excellent framework for designing sequent calculi for logics where traditional sequent calculi fail, this generality also leads to a large degree of non-determinism, which is problematic for backward proof-search. We control this non-determinism in two ways:
1. First, we limit the structural connectives used in the calculi and, consequently, the number of display postulates. Specifically, we work with nested structures which can be viewed as a tree of traditional Gentzen sequents, called nested sequents, which have been used previously by Kashima and, independently, by Brünnler and Straßburger and Poggiolesi to present several modal and tense logics.
2. Second, since residuation rules are largely responsible for the difficulty in finding a proof search procedure for display-like calculi, we show how to eliminate these residuation rules using deep inference in nested sequents.
Finally, we study the combined bi-intuitionistic tense logic, which contains the well-known intuitionistic modal logic as a sublogic. We give a nested sequent calculus for bi-intuitionistic tense logic that has cut-elimination, and a derived deep inference nested sequent calculus that is complete with respect to the first calculus and where contraction and residuation rules are admissible. We also show how our calculi can capture Simpson's intuitionistic modal logic and Ewald's intuitionistic tense logic. Pdf21 November 2010
PhD thesis, defended in March 2011
How to Universally Close the Existential Rule Kai Brünnler AbstractThis paper introduces a nested sequent system for predicate logic. The system features a structural universal quantifier and a universally closed existential rule. One nice consequence of this is that proofs of sentences cannot contain free variables. Another nice consequence is that the assumption of a non-empty domain is isolated in a single inference rule. This rule can be removed or added at will, leading to a system for free logic or classical predicate logic, respectively. The system for free logic is interesting because it has no need for an existence predicate. We see syntactic cut-elimination and completeness results for these two systems as well as two standard applications: Herbrand's Theorem and interpolation. Pdf (long version)31 July 2010 LPAR-17, LNCS 6397, pp. 172–186
Cut-Elimination and Proof Search for Bi-Intuitionistic Tense Logic Rajeev Goré, Linda Postniece and Alwen Tiu AbstractWe consider an extension of bi-intuitionistic logic with the traditional modalities from tense logic Kt. Proof theoretically, this extension is obtained simply by extending an existing sequent calculus for bi-intuitionistic logic with typical inference rules for the modalities used in display logics. As it turns out, the resulting calculus, LBiKt, seems to be more basic than most intuitionistic tense or modal logics considered in the literature, in particular, those studied by Ewald and Simpson, as it does not assume any a priori relationship between the diamond and the box modal operators. We recover Ewald's intuitionistic tense logic and Simpson's intuitionistic modal logic by modularly extending LBiKt with additional structural rules. The calculus LBiKt is formulated in a variant of display calculus, using a form of sequents called nested sequents. Cut elimination is proved for LBiKt, using a technique similar to that used in display calculi. As in display calculi, the inference rules of LBiKt are ''shallow'' rules, in the sense that they act on top-level formulae in a nested sequent. The calculus LBiKt is ill-suited for backward proof search due to the presence of certain structural rules called ''display postulates'' and the contraction rules on arbitrary structures. We show that these structural rules can be made redundant in another calculus, DBiKt, which uses deep inference, allowing one to apply inference rules at an arbitrary depth in a nested sequent. We prove the equivalence between LBiKt and DBiKt and outline a proof search strategy for DBiKt. We also give a Kripke semantics and prove that LBiKt is sound with respect to the semantics, but completeness is still an open problem. We then discuss various extensions of LBiKt. Pdf29 June 2010 Advances in Modal Logic 2010, pp 156–177
A Syntactic Realization Theorem for Justification Logics Kai Brünnler, Remo Goetschi and Roman Kuznets AbstractJustification logics are refinements of modal logics where modalities are replaced by justification terms. They are connected to modal logics via so-called realization theorems. We present a syntactic proof of a single realization theorem which uniformly connects all the normal modal logics formed from the axioms d, t, b, 4, and 5 with their justification counterparts. The proof employs cut-free nested sequent systems together with Fitting's realization merging technique. We further strengthen the realization theorem for KB5 and S5 by showing that the positive introspection operator is superfluous. Pdf3 April 2010 Advances in Modal Logic 2010, pp 39–58
A Note on Uniform Interpolation Proofs in Modal Deep Inference Calculi Marta Bílková AbstractThis paper answers one rather particular question: how to perform a proof of the uniform interpolation property in deep inference calculi for modal logics. We show how to perform such a proof in a deep inference calculus for the basic modal logic K via forgetting a variable in a certain normal form constructed by backward proof search. For that purpose we modify the framework of deep inference calculi, using a cover modality on the meta level to structure deep sequents.
21 September 2009 TbiLLC 2009, LNAI 6618, pp. 30–45
Deep Inference in Bi-intuitionistic Logic Linda Postniece AbstractBi-intuitionistic logic is the extension of intuitionistic logic with exclusion, a connective dual to implication. Cut-elimination in bi-intuitionistic logic is complicated due to the interaction between these two connectives, and various extended sequent calculi, including a display calculus, have been proposed to address this problem.
In this paper, we present a new extended sequent calculus DBiInt for bi-intuitionistic logic which uses nested sequents and “deep inference”, i.e., inference rules can be applied at any level in the nested sequent. We show that DBiInt can simulate our previous “shallow” sequent calculus LBiInt. In particular, we show that deep inference can simulate the residuation rules in the display-like shallow calculus LBiInt. We also consider proof search and give a simple restriction of DBiInt which allows terminating proof search. Thus our work is another step towards addressing the broader problem of proof search in display logic. Pdf7 June 2009 WoLLIC 2009, LNCS 5514, pp. 320–334
Modular Sequent Systems for Modal Logic Kai Brünnler and Lutz Straßburger AbstractWe see cut-free sequent systems for the basic normal modal logics formed by any combination of the axioms d, t, b, 4, 5. These systems are modular in the sense that each axiom has a corresponding rule and each combination of these rules is complete for the corresponding frame conditions. The systems are based on nested sequents, a natural generalisation of hypersequents. Nested sequents stay inside the modal language, as opposed to both the display calculus and labelled sequents. The completeness proof is via syntactic cut elimination. Pdf23 April 2009 Tableaux 2009, LNCS 5607, pp. 152–166
Attention, this paper contains some serious mistakes. The authors are working on a corrected version.
Syntactic Cut-Elimination for Common Knowledge Kai Brünnler and Thomas Studer AbstractWe first look at an existing infinitary sequent system for common knowledge for which there is no known syntactic cut-elimination procedure and also no known non-trivial bound on the proof-depth. We then present another infinitary sequent system based on nested sequents that are essentially trees and with inference rules that apply deeply inside these trees. Thus we call this system "deep" while we call the former system "shallow". In contrast to the shallow system, the deep system allows one to give a straightforward syntactic cut-elimination procedure. Since both systems can be embedded into each other, this also yields a syntactic cut-elimination procedure for the shallow system. For both systems we thus obtain an upper bound of φ_{2}0 on the depth of proofs, where φ is the Veblen function.
Cut-Elimination and Proof-Search for Bi-Intuitionistic Logic Using Nested Sequents Rajeev Goré, Linda Postniece and Alwen Tiu AbstractWe propose a new sequent calculus for bi-intuitionistic logic which sits somewhere between display calculi and traditional sequent calculi by using nested sequents. Our calculus enjoys a simple (purely syntactic) cut-elimination proof as do display calculi. But it has an easily derivable variant calculus which is amenable to automated proof search as are (some) traditional sequent calculi. We first present the initial calculus and its cut-elimination proof. We then present the derived calculus, and then present a proof-search strategy which allows it to be used for automated proof search. We prove that this search strategy is terminating and complete by showing how it can be used to mimic derivations obtained from an existing calculus GBiInt for bi-intuitionistic logic. As far as we know, our new calculus is the first sequent calculus for bi-intuitionistic logic which uses no semantic additions like labels, which has a purely syntactic cut-elimination proof, and which can be used naturally for backwards proof-search. Pdf15 September 2008 Advances in Modal Logic 2008, pp 43–66
Deep Sequent Systems for Modal Logic Kai Brünnler AbstractWe see a systematic set of cut-free axiomatisations for all the basic normal modal logics formed by some combination of the axioms d, t, b, 4, 5. They employ a form of deep inference but otherwise stay very close to Gentzen’s sequent calculus, in particular they enjoy a subformula property in the literal sense. No semantic notions are used inside the proof systems, in particular there is no use of labels. All their rules are invertible and the rules cut, weakening and contraction are admissible. All systems admit a straightforward terminating proof search procedure as well as a syntactic cut elimination procedure.
We can present systematically several normal propositional modal logics, including S5, B and K5, for which cut elimination is proved. We also investigate geometric theories, some of which we express in the calculus of structures.
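As a hedged sketch of how such deep modal rules look (rule shapes in the style of Brünnler's deep sequent systems; Γ{ } is a context with a hole at arbitrary depth in the sequent tree):

```latex
% Box rule: proving \Box A opens a fresh child node containing A.
\[
  \frac{\Gamma\{[A]\}}{\Gamma\{\Box A\}}\;\Box
\]
% Diamond rule for K: \Diamond A is propagated into an existing child;
% keeping \Diamond A in the conclusion makes contraction admissible.
\[
  \frac{\Gamma\{\Diamond A, [A, \Delta]\}}{\Gamma\{\Diamond A, [\Delta]\}}\;\Diamond
\]
```

Because both rules apply inside an arbitrary context Γ{ }, every rule step manipulates genuine subformulae, which is how a subformula property in the literal sense can hold.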
Deep Inference for Hybrid Logic Lutz Straßburger AbstractThis paper describes work in progress on using deep inference for designing a deductive system for hybrid logic. We will see a cut-free system and prove its soundness and completeness. An immediate observation about the system is that there is no need for additional rewrite rules as in Blackburn’s tableaux, nor substitution rules as in Seligman’s sequent system. Pdf15 May 2007
Proceedings of International Workshop on Hybrid Logic 2007 (HyLo 2007), pp. 13–22
Classical Modal Display Logic in the Calculus of Structures and Minimal and Cut-free Deep Inference Calculi for S5 Rajeev Goré and Alwen Tiu AbstractWe begin by showing how to faithfully encode the Classical Modal Display Logic (CMDL) of Wansing into the Calculus of Structures (CoS) of Guglielmi. Since every CMDL calculus enjoys cut-elimination, we obtain a cut-elimination theorem for all corresponding CoS calculi. We then show how our result leads to a minimal cut-free CoS calculus for modal logic S5. As far as we know, no other existing CoS calculi for S5 enjoy both these properties simultaneously. Pdf4 May 2007 Journal of Logic and Computation 17 (4) 2007, pp. 767–794
A Deep Inference System for the Modal Logic S5 Phiniki Stouppa AbstractWe present a cut-admissible system for the modal logic S5 in a framework that makes explicit and intensive use of deep inference. Deep inference is induced by the methods applied so far in conceptually pure systems for this logic. Thus, the formulation of a system in such a framework is an evolutionary process and leads to positive proof theoretical results. The system enjoys systematicity and modularity, two important properties sought in modal systems. Furthermore, it enjoys a simple and direct design: the rules are few and the modal rules are in exact correspondence to the modal axioms. Pdf1 March 2006 Studia Logica 85 (2) 2007, pp. 199–214
Purity Through Unravelling Robert Hein and Charles Stewart AbstractWe divide attempts to give the structural proof theory of modal logics into two kinds, those pure formulations whose inference rules characterise modality completely by means of manipulations of boxes and diamonds, and those labelled formulations that leverage the use of labels in giving inference rules. The widespread adoption of labelled formulations is driven by their ability to model features of the model theory of modal logic in its proof theory.
We describe here an approach to the structural proof theory of modal logic that aims to bring under one roof the benefits of both the pure and the labelled formulations. We introduce two proof calculi, one labelled sequent formulation and one pure formulation in the calculus of structures, that are shown to be in a systematic correlation, where the latter calculus uses deep inference with shaped modal rules to capture in a pure manner the manipulations that the former calculus mediates through the use of labels.
We situate this work within a larger investigation into the proof theory of modal logic that solves problems that existed with the earlier investigation based on prefix modal rules. We hold that this development provides yet stronger evidence justifying the claim that good, pure proof theory for modal logic needs deep inference. Pdf25 June 2005
Proceedings of Structures and Deduction '05, pp. 126–143
Geometric Theories and Modal Logic in the Calculus of Structures Robert Hein AbstractMuch of the success of modal logic can be attributed to the adoption of relational semantics. Consequently, modal logic is seen as a logic of relational structures, where logical axioms correspond to structural properties. Alex Simpson, in his 1993 PhD thesis, introduced a labelled proof theory for modal logic that allows cut-elimination for a class of modal logics, which is characterised by so-called geometric theories. This includes important and well-known logics such as M, B, S4 and S5. This thesis tries to build a bridge between Simpson's result and purely symbolic proof theory. We introduce a method to characterise frame relational properties by means of deep inference in the Calculus of Structures. The results are only partial: only what we call 3/4-Scott-Lemmon logics are characterised, and we only give plausible reasons, rather than a proof, that the cut-elimination argument can be transferred too. Gzipped postscript20 March 2005
MSc thesis, successfully defended on 23 March 2005
The Design of Modal Proof Theories: The Case of S5 Phiniki Stouppa AbstractThe sequent calculus does not seem to be capable of supporting cut-admissible formulations for S5. Through a survey of existing cut-admissible systems for this logic, we investigate the solutions proposed to overcome this defect. Accordingly, the systems can be divided into two categories: those which allow semantically oriented formulae and those which allow formulae in positions not reachable by the usual systems in the sequent calculus. The first solution is not desirable because it is conceptually impure, that is, these systems express concepts of frame semantics in the language of the logic.
Consequently, we focus on the systems of the second group for which we define notions related to deep inference—the ability to apply rules deep inside structures—as well as other desirable properties good systems should enjoy. We classify these systems accordingly and examine how these properties are affected in the presence of deep inference. Finally, we present a cut-admissible system for S5 in a formalism which makes explicit use of deep inference, the calculus of structures, and give reasons for its effectiveness in providing good modal formulations. Pdf20 October 2004
MSc thesis, successfully defended on 27 October 2004
A Systematic Proof Theory for Several Modal Logics Charles Stewart and Phiniki Stouppa AbstractThe family of normal propositional modal logic systems is given a very systematic organisation by their model theory. This model theory is generally given using frame semantics, and it is systematic in the sense that for the most important systems we have a clean, exact correspondence between their constitutive axioms as they are usually given in a Hilbert-Lewis style and conditions on the accessibility relation on frames.
By contrast, the usual structural proof theory of modal logic, as given in Gentzen systems, is ad-hoc. While we can formulate several modal logics in the sequent calculus that enjoy cut-elimination, their formalisation arises through system-by-system fine tuning to ensure that the cut-elimination holds, and the correspondence to the formulation in the Hilbert-Lewis systems becomes opaque.
This paper introduces a systematic presentation for the systems K, D, M, S4, and S5 in the calculus of structures, a structural proof theory that employs deep inference. Because of this, we are able to axiomatise the modal logics in a manner directly analogous to the Hilbert-Lewis axiomatisation. We show that the calculus possesses a cut-elimination property directly analogous to cut-elimination for the sequent calculus for these systems, and we discuss the extension to several other modal logics. Postscript2 September 2004 Advances in Modal Logic 2004, pp. 309–333
2.5Linear Logic
Linear logic enjoys presentations in deep inference that obtain the expected properties of locality, with rather astounding decomposition theorems and the usual, general normalisation results.
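A minimal sketch of what locality means here, assuming the usual calculus-of-structures rule shapes for multiplicative linear logic (S{ } is an arbitrary context, a an atom, premise above the line):

```latex
% Requires \usepackage{cmll} for the \parr symbol.
% Atomic interaction: identity is reduced to atoms, so the rule
% inspects only a bounded piece of the structure (locality).
\[
  \frac{S\{\mathbf{1}\}}{S\{a \parr a^{\perp}\}}\;\mathsf{ai}{\downarrow}
\]
% Switch: mediates between \otimes and \parr at any depth.
\[
  \frac{S\{(A \parr B) \otimes C\}}{S\{A \parr (B \otimes C)\}}\;\mathsf{s}
\]
```

The general interaction rule on arbitrary formulae is then derivable from its atomic version together with switch, which is the typical pattern behind the decomposition theorems mentioned above.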
Equality and Fixpoints in the Calculus of Structures Kaustuv Chaudhuri and Nicolas Guenot AbstractThe standard proof theory for logics with equality and fixpoints suffers from limitations of the sequent calculus, where reasoning is separated from computational tasks such as unification or rewriting. We propose in this paper an extension of the calculus of structures, a deep inference formalism, that supports incremental and contextual reasoning with equality and fixpoints in the setting of linear logic. This system allows deductive and computational steps to mix freely in a continuum which integrates smoothly into the usual versatile rules of multiplicative-additive linear logic in deep inference. Pdf2 June 2014 CSL-LICS 2014, pp. 30:1–10
Interaction and Depth Against Nondeterminism in Proof Search Ozan Kahramanoğulları AbstractDeep inference is a proof theoretic methodology that generalizes the standard notion of inference of the sequent calculus, whereby inference rules become applicable at any depth inside logical expressions. Deep inference provides more freedom in the design of deductive systems for different logics and a rich combinatoric analysis of proofs. In particular, construction of exponentially shorter analytic proofs becomes possible, however with the cost of a greater nondeterminism than in the sequent calculus. In this paper, we show that the nondeterminism in proof search can be reduced without losing the shorter proofs and without sacrificing proof theoretic cleanliness. For this, we exploit an interaction and depth scheme in the logical expressions. We demonstrate our method on deep inference systems for multiplicative linear logic and classical logic, discuss its proof complexity and its relation to focusing, and present implementations. Pdf29 May 2014 Logical Methods in Computer Science 10 (2:5) 2014, pp. 1–49 Interana: interactive multiplicative linear logic prover
The Focused Calculus of Structures Kaustuv Chaudhuri, Nicolas Guenot and Lutz Straßburger AbstractThe focusing theorem identifies a complete class of sequent proofs that have no inessential nondeterministic choices and restrict the essential choices to a particular normal form. Focused proofs are therefore well suited both for the search and for the representation of sequent proofs. The calculus of structures is a proof formalism that allows rules to be applied deep inside a formula. Through this freedom it can be used to give analytic proof systems for a wider variety of logics than the sequent calculus, but standard presentations of this calculus are too permissive, allowing too many proofs. In order to make it more amenable to proof search, we transplant the focusing theorem from the sequent calculus to the calculus of structures. The key technical contribution is an incremental treatment of focusing that avoids trivializing the calculus of structures. We give a direct inductive proof of the completeness of the focused calculus of structures with respect to a more standard unfocused form. We also show that any focused sequent proof can be represented in the focused calculus of structures, and, conversely, any proof in the focused calculus of structures corresponds to a focused sequent proof. Pdf25 August 2011 CSL 2011, LIPIcs 12, pp. 159–173
Focused Proof Search for Linear Logic in the Calculus of Structures Nicolas Guenot AbstractThe proof-theoretic approach to logic programming has benefited from the introduction of focused proof systems, through the non-determinism reduction and control they provide when searching for proofs in the sequent calculus. However, this technique was not available in the calculus of structures, known for inducing even more non-determinism than other logical formalisms. This work in progress aims at translating the notion of focusing into the presentation of linear logic in this setting, and use some of its specific features, such as deep application of rules and fine granularity, in order to improve proof search procedures. The starting point for this research line is the multiplicative fragment of linear logic, for which a simple focused proof system can be built. Pdf23 June 2010 ICLP 2010, technical communication, LIPIcs 7, pp. 84–93
Some Observations on the Proof Theory of Second Order Propositional Multiplicative Linear Logic Lutz Straßburger AbstractWe investigate the question of what constitutes a proof when quantifiers and multiplicative units are both present. On the technical level this paper provides two new aspects of the proof theory of MLL2 with units. First, we give a novel proof system in the framework of the calculus of structures. The main feature of the new system is the consequent use of deep inference, which allows us to observe a decomposition which is a version of Herbrand’s theorem that is not visible in the sequent calculus. Second, we show a new notion of proof nets which is independent from any deductive system. We have “sequentialisation” into the calculus of structures as well as into the sequent calculus. Since cut elimination is terminating and confluent, we have a category of MLL2 proof nets. The treatment of the units is such that this category is star-autonomous. Pdf10 April 2009 TLCA 2009, LNCS 5608, pp. 309–324 Pdf with proofs in appendix
MELL in the Calculus of Structures Lutz Straßburger AbstractThe calculus of structures is a new proof-theoretical framework, like natural deduction, the sequent calculus and proof nets, for specifying logical systems syntactically. In a rule in the calculus of structures, the premise as well as the conclusion are structures, which are expressions that share properties of formulae and sequents. In this paper, I study a system for MELL, the multiplicative exponential fragment of linear logic, in the calculus of structures. It has the following features: a local promotion rule, no non-deterministic splitting of the context in the times rule and a modular proof for the cut-elimination theorem. Further, derivations have a new property, called decomposition, that cannot be observed in any other known proof-theoretical framework. Pdf24 November 2003 Theoretical Computer Science 309, pp. 213–285
A Local System for Linear Logic Lutz Straßburger AbstractIn this paper I will present a deductive system for linear logic, in which all rules are local. In particular, the contraction rule is reduced to an atomic version, and there is no global promotion rule. In order to achieve this, it is necessary to depart from the sequent calculus to the calculus of structures, which is a generalization of the one-sided sequent calculus. In a rule, premise and conclusion are not sequents, but structures, which are expressions that share properties of formulae and sequents.
Full paper pdf10 June 2002
Technical Report WV-02-01, Technische Universität Dresden
2.6Commutative/Non-commutative Linear Logic
We conservatively extend mixed multiplicative and multiplicative exponential linear logic with a self-dual non-commutative operator. The systems so obtained cannot be presented in the sequent calculus, but they enjoy the usual properties of locality, decomposition and cut elimination available in the calculus of structures. We can present Yetter's cyclic linear logic in the calculus of structures and prove cut elimination; interestingly, cyclicity is naturally subsumed by deep inference. New, purely proof-theoretical, techniques are developed for reducing the non-determinism in the calculus of structures.
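As an illustrative sketch, the characteristic rule governing BV's self-dual seq operator ⟨ ; ⟩ can be written as follows (shape after System BV; premise above the line, S{ } an arbitrary context):

```latex
% q\downarrow: a seq of pars implies a par of seqs.
\[
  \frac{S\{\langle [R, T] \,;\, [U, V] \rangle\}}
       {S\{[\,\langle R \,;\, U \rangle ,\ \langle T \,;\, V \rangle\,]\}}
  \;\mathsf{q}{\downarrow}
\]
% Self-duality: \langle R ; T \rangle^{\perp} = \langle R^{\perp} ; T^{\perp} \rangle,
% so no separate co-seq connective is needed.
```

Since the rule, like every rule in the calculus of structures, fires inside an arbitrary context S{ }, cyclic permutations of contexts come for free, which is one way to see why cyclic linear logic is naturally subsumed by deep inference.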
A Deep Inference System with a Self-Dual Binder Which Is Complete for Linear Lambda Calculus Luca Roversi AbstractWe recall that SBV, a proof system developed under the methodology of deep inference, extends multiplicative linear logic with the self-dual non-commutative logical operator Seq. We introduce SBVB that extends SBV by adding the self-dual binder Sdb on names. The system SBVB is consistent because we prove that (the analogous of) cut-elimination holds for it. Moreover, the resulting interplay between Seq and Sdb can model β-reduction of linear λ-calculus inside the cut-free subsystem BVB of SBVB. The long-term aim is to keep developing a programme whose goal is to give pure logical accounts of computational primitives under the proof-search-as-computation analogy, by means of minimal and incremental extensions of SBV. Pdf25 January 2014
To appear in the Journal of Logic and Computation
A Logical Basis for Quantum Evolution and Entanglement Richard F. Blute, Alessio Guglielmi, Ivan T. Ivanov, Prakash Panangaden and Lutz Straßburger AbstractWe reconsider discrete quantum causal dynamics where quantum systems are viewed as discrete structures, namely directed acyclic graphs. In such a graph, events are considered as vertices and edges depict propagation between events. Evolution is described as happening between a special family of spacelike slices, which were referred to as locative slices. Such slices are not so large as to result in acausal influences, but large enough to capture nonlocal correlations.
In our logical interpretation, edges are assigned logical formulas in a special logical system, called BV, an instance of a deep inference system. We demonstrate that BV, with its mix of commutative and noncommutative connectives, is precisely the right logic for such analysis. We show that the commutative tensor encodes (possible) entanglement, and the noncommutative seq encodes causal precedence. With this interpretation, the locative slices are precisely the derivable strings of formulas. Several new technical results about BV are developed as part of this analysis. Pdf18 July 2013
In Categories and Types in Logic, Language, and Physics, LNCS 8222, pp. 90–107
Communication and Concurrency with Logic-Based Restriction Inside a Calculus of Structures Luca Roversi AbstractIt is well known that we can use structural proof theory to refine, or generalize, existing paradigmatic computational primitives, or to discover new ones. From this point of view we keep developing a programme whose goal is establishing a correspondence between proof-search of a logical system and computations in a process algebra. We give a purely logical account of a process algebra operation which strictly includes the behavior of restriction on actions that we find in Milner's CCS. This is possible inside a logical system in the Calculus of Structures of Deep Inference endowed with a self-dual quantifier. Using proof-search of cut-free proofs of such a logical system we show how to solve reachability problems in a process algebra that subsumes a significant fragment of Milner's CCS. Pdf19 December 2012
Submitted
Linear Lambda Calculus and Deep Inference Luca Roversi AbstractWe introduce a deep inference logical system SBVr which extends SBV with Rename, a self-dual atom-renaming operator. We prove that the cut-free subsystem BVr of SBVr exists. We embed the terms of linear lambda-calculus with explicit substitutions into formulas of SBVr. Our embedding recalls the one of full lambda-calculus into pi-calculus. The proof-search inside SBVr and BVr is complete with respect to the evaluation of linear lambda-calculus with explicit substitutions, while only soundness of proof-search in SBVr holds. Rename is crucial to let proof-search simulate the substitution of a linear lambda-term for a variable in the course of linear beta-reduction. Although SBVr is a minimal extension of SBV, its proof-search can compute all boolean functions, exactly as linear lambda-calculus with explicit substitutions can.
Full paper pdf16 November 2010
The title of this version is Linear Lambda Calculus with Explicit Substitutions as Proof-Search in Deep Inference
A System of Interaction and Structure V: The Exponentials and Splitting Alessio Guglielmi and Lutz Straßburger AbstractSystem NEL is the mixed commutative/non-commutative linear logic BV augmented with linear logic’s exponentials, or, equivalently, it is MELL augmented with the non-commutative self-dual connective seq. NEL is presented in deep inference, because no Gentzen formalism can express it in such a way that the cut rule is admissible. Other recent works show that system NEL is Turing-complete, is able to directly express process algebra sequential composition and faithfully models causal quantum evolution. In this paper, we show cut elimination for NEL, based on a technique that we call splitting. The splitting theorem shows how and to what extent we can recover a sequent-like structure in NEL proofs. Together with a ‘decomposition’ theorem, proved in the previous paper of this series, splitting yields a cut-elimination procedure for NEL. Pdf21 September 2010 Mathematical Structures in Computer Science, 21 (3) 2011, pp. 563–584
This paper is the second of the two parts that resulted from splitting the paper A System of Interaction and Structure IV: The Exponentials (which was previously titled A Non-commutative Extension of Multiplicative Exponential Linear Logic and was the journal version of A Non-commutative Extension of MELL).
A System of Interaction and Structure IV: The Exponentials and Decomposition Lutz Straßburger and Alessio Guglielmi AbstractWe study a system, called NEL, which is the mixed commutative/non-commutative linear logic BV augmented with linear logic’s exponentials. Equivalently, NEL is MELL augmented with the non-commutative self-dual connective seq. In this paper, we show a basic compositionality property of NEL, which we call decomposition. This result leads to a cut-elimination theorem, which is proved in the next paper of this series. To control the induction measure for the theorem, we rely on a novel technique that extracts from NEL proofs the structure of exponentials, into what we call !-?-Flow-Graphs. Pdf27 July 2010 ACM Transactions on Computational Logic 12 (4:23) 2011, pp. 1–39
This paper is the first of the two parts that resulted from splitting the paper A System of Interaction and Structure IV: The Exponentials (which was previously titled A Non-commutative Extension of Multiplicative Exponential Linear Logic and was the journal version of A Non-commutative Extension of MELL).
Deep Inference and Probabilistic Coherence Spaces Richard Blute, Prakash Panangaden and Sergey Slavnov AbstractThis paper proposes a definition of categorical model of the deep inference system BV, defined by Guglielmi. Deep inference introduces the idea of performing a deduction in the interior of a formula, at any depth. Traditional sequent calculus rules only see the roots of formulae. However in these new systems, one can rewrite at any position in the formula tree. Deep inference in particular allows the syntactic description of logics for which there is no sequent calculus. One such system is BV, which extends linear logic to include a noncommutative self-dual connective. This is the logic our paper proposes to model. Our definition is based on the notion of a linear functor, due to Cockett and Seely. A BV-category is a linearly distributive category, possibly with negation, with an additional tensor product which, when viewed as a bivariant functor, is linear with a degeneracy condition. We show that this simple definition implies all of the key isomorphisms of the theory. We consider Girard’s category of probabilistic coherence spaces and show that it contains a self-dual monoidal structure in addition to the *-autonomous structure exhibited by Girard. This structure makes the category a BV-category. We believe this structure is also of independent interest, as well-behaved noncommutative operators generally are. Pdf10 February 2009 Applied Categorical Structures 20 2012, pp. 209–228
System BV Is NP-Complete Ozan Kahramanoğulları AbstractSystem BV is an extension of multiplicative linear logic (MLL) with the rules mix, nullary mix, and a self-dual, non-commutative logical operator, called seq. While the rules mix and nullary mix extend the deductive system, the operator seq extends the language of MLL. Due to the operator seq, system BV extends the applications of MLL to those where sequential composition is crucial, e.g., concurrency theory. System FBV is an extension of MLL with the rules mix and nullary mix. In this paper, by relying on the fact that system BV is a conservative extension of system FBV, I show that system BV is NP-complete by encoding the 3-Partition problem in FBV. I provide a simple completeness proof of this encoding by resorting to a novel proof theoretical method for reducing the nondeterminism in proof search, which is also of independent interest.
A System of Interaction and Structure. Alessio Guglielmi. Abstract: This paper introduces a logical system, called BV, which extends multiplicative linear logic by a non-commutative self-dual logical operator. This extension is particularly challenging for the sequent calculus, and so far it has not been achieved therein. It becomes very natural in a new formalism, called the calculus of structures, which is the main contribution of this work. Structures are formulae subject to certain equational laws typical of sequents. The calculus of structures is obtained by generalising the sequent calculus in such a way that a new top-down symmetry of derivations is observed, and it employs inference rules that rewrite inside structures at any depth. These properties, in addition to allowing the design of BV, yield a modular proof of cut elimination. Pdf. 27 January 2007. ACM Transactions on Computational Logic 8 (1:1) 2007, pp. 1–64.
The journal version has been butchered by the editorial process, but the preprint (linked here) is fine.
At the end of 2012, this was the most cited article to have appeared in ACM TOCL in the previous five years.
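The mechanism at the heart of this paper, rewriting inside an arbitrary context rather than only at the root, can be pictured with a small sketch. This is an illustrative instance only, using the conventional calculus-of-structures notation in which [R,T] is par, (R,T) is times, and S{ } is an arbitrary context:

```latex
% The switch rule s rewrites inside an arbitrary context S{ }:
%
%        S{([R,U],T)}
%   s  ----------------
%        S{[(R,T),U]}
%
% Instantiating S{ } = [b,{ }], R = a, U = c, T = d gives one
% inference step applied strictly below the root of the formula:
\[
  \mathsf{s}\;
  \frac{\,[\,b,\;([a,c],d)\,]\,}{\,[\,b,\;[(a,d),c]\,]\,}
\]
```

No sequent calculus rule can act under the par context [b, { }] in this way, which is exactly the extra freedom that deep inference provides.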
Reducing Nondeterminism in the Calculus of Structures. Ozan Kahramanoğulları. Abstract: The calculus of structures is a proof theoretical formalism which generalizes the sequent calculus with the feature of deep inference: in contrast to the sequent calculus, inference rules can be applied at any depth inside a formula, yielding shorter proofs than in any other formalism supporting analytic proofs. However, deep applicability of the inference rules causes greater nondeterminism than in the sequent calculus regarding proof search. In this paper, we introduce a new technique which reduces nondeterminism without breaking proof theoretical properties and provides more immediate access to shorter proofs. We present this technique on system BV, the smallest technically non-trivial system in the calculus of structures, extending multiplicative linear logic with the rules mix, nullary mix, and a self-dual non-commutative logical operator. Because our technique exploits a scheme common to all the systems in the calculus of structures, we argue that it generalizes to the systems for classical logic, linear logic, and modal logics. Pdf. 23 August 2006. LPAR 2006, LNCS 4246, pp. 272–286.
A System of Interaction and Structure II: The Need for Deep Inference. Alwen Tiu. Abstract: This paper studies properties of the logic BV, which is an extension of multiplicative linear logic (MLL) with a self-dual non-commutative operator. BV is presented in the calculus of structures, a proof theoretic formalism that supports deep inference, in which inference rules can be applied anywhere inside logical expressions. The use of deep inference results in a simple logical system for MLL extended with the self-dual non-commutative operator, which to date has not been known to be expressible in the sequent calculus. In this paper, deep inference is shown to be crucial for the logic BV, that is, any restriction on the "depth" of the inference rules of BV would result in a strictly less expressive logical system. Pdf. 3 April 2006. Logical Methods in Computer Science 2 (2:4) 2006, pp. 1–24.
There are pictures that help in understanding this paper in Alwen Tiu's MSc thesis Properties of a Logical System in the Calculus of Structures.
Structures for Multiplicative Cyclic Linear Logic: Deepness vs Cyclicity. Pietro Di Gianantonio. Abstract: The aim of this work is to give an alternative presentation for the multiplicative fragment of Yetter’s cyclic linear logic. The new presentation is inspired by the calculus of structures, and has the interesting feature of avoiding the cyclic rule. The main point in this work is to show how cyclicity can be substituted by deepness, i.e. the possibility of applying an inference rule at any point of a formula. We finally derive, through a new proof technique, the cut elimination property of the calculus. Pdf. 15 April 2004. CSL 2004, LNCS 3210, pp. 130–144.
A Non-commutative Extension of MELL. Alessio Guglielmi and Lutz Straßburger. Abstract: We extend multiplicative exponential linear logic (MELL) by a non-commutative, self-dual logical operator. The extended system, called NEL, is defined in the formalism of the calculus of structures, which is a generalisation of the sequent calculus and provides a more refined analysis of proofs. We should then be able to extend the range of applications of MELL, by modelling a broad notion of sequentiality and providing new properties of proofs. We show some proof theoretical results: decomposition and cut elimination. The new operator represents a significant challenge: to get our results we use here for the first time some novel techniques, which constitute a uniform and modular approach to cut elimination, contrary to what is possible in the sequent calculus. Pdf. 9 August 2002. LPAR 2002, LNCS 2514, pp. 231–246.
The Undecidability of System NEL. Lutz Straßburger. Abstract: System NEL is a conservative extension of multiplicative exponential linear logic (MELL) by a self-dual non-commutative connective called seq which lives between the par and the times. In this paper, I will show that system NEL is undecidable by encoding two counter machines into NEL. Although the encoding is quite simple, the proof of the faithfulness is a little intricate because there is no sequent calculus and no phase semantics available for NEL.
Combining A1- and AC1-Unification Sharing Unit. Alwen Fernanto Tiu. Gzipped postscript. 19 February 2002.
Technical Report WV-01-09, Technische Universität Dresden
Properties of a Logical System in the Calculus of Structures. Alwen Fernanto Tiu. Abstract: The calculus of structures is a new framework for presenting logical systems. It is a generalisation of a traditional framework, the one-sided sequent calculus. One of the main features of the calculus of structures is that the inference rules are deep: they can be applied anywhere inside logical expressions. Rules in the sequent calculus are, in contrast, shallow. A certain logical system in the calculus of structures, called System BV, is studied here. We see that the deep nesting of rules is a real distinguishing feature between the two frameworks. To this purpose a notion of shallow systems is introduced, such that sequent systems are particular instances. A counterexample, a sort of fractal structure, is then presented to show that there is no shallow system for the logic behind BV, and hence no sequent system for BV. This result contributes to justifying the claim that the calculus of structures is a better logical framework than the sequent calculus, for certain logics. Pdf. 12 September 2001. Now replaced by A System of Interaction and Structure II: The Need for Deep Inference.
MSc thesis, successfully defended on 1 August 2001, Technical Report WV-01-06, Technische Universität Dresden
Non-commutativity and MELL in the Calculus of Structures. Alessio Guglielmi and Lutz Straßburger. Abstract: We introduce the calculus of structures: it is more general than the sequent calculus and it allows for cut elimination and the subformula property. We show a simple extension of multiplicative linear logic, by a self-dual non-commutative operator inspired by CCS, that seems not to be expressible in the sequent calculus. Then we show that multiplicative exponential linear logic benefits from its presentation in the calculus of structures, especially because we can replace the ordinary, global promotion rule by a local version. These formal systems, for which we prove cut elimination, outline a range of techniques and properties that were not previously available. Contrary to what happens in the sequent calculus, the cut elimination proof is modular. Pdf. 28 June 2001. CSL 2001, LNCS 2142, pp. 54–68.
A Calculus of Order and Interaction. Alessio Guglielmi. This paper has been thoroughly rewritten as A System of Interaction and Structure. The introduction of the new paper has not been written under the effect of psychotropic substances. Please forget about this paper.
Technical Report WV-99-04, Technische Universität Dresden
2.7 Proof Nets, Semantics of Proofs and the War on Bureaucracy
Deep inference and the calculus of structures are influencing the design of a new generation of proof nets. Moreover, they offer new insights into the semantics of proofs and categorical proof theory. Finally, they open decisive new perspectives in the fight against bureaucracy.
The Structure of Interaction. Stéphane Gimenez and Georg Moser. Abstract: Interaction nets form a local and strongly confluent model of computation that is per se parallel. We introduce a Curry–Howard correspondence between well-formed interaction nets and a deep-inference deduction system based on linear logic. In particular, linear logic itself is easily expressed in the system and its computational aspects materialise through the correspondence. The system of interaction nets obtained is a typed variant of already well-known sharing graphs. Due to a strong confluence property, strong normalisation for this system follows from weak normalisation. The latter is obtained via an adaptation of Girard’s reducibility method. The approach is modular, readily gives rise to generalisations (e.g. second order, known as polymorphism to the programmer) and could therefore be extended to various systems of interaction nets. Pdf. 4 July 2013. CSL 2013, LIPIcs 23, pp. 316–331.
A Proof Calculus Which Reduces Syntactic Bureaucracy. Alessio Guglielmi, Tom Gundersen and Michel Parigot. Abstract: In usual proof systems, like the sequent calculus, only a very limited way of combining proofs is available through the tree structure. We present in this paper a logic-independent proof calculus, where proofs can be freely composed by connectives, and prove its basic properties. The main advantage of this proof calculus is that it allows one to avoid certain types of syntactic bureaucracy inherent to all usual proof systems, in particular the sequent calculus. Proofs in this system closely reflect their atomic flow, which traces the behaviour of atoms through structural rules. The general definition is illustrated by the standard deep-inference system for propositional logic, for which there are known rewriting techniques that achieve cut elimination based only on the information in atomic flows. Pdf. 12 April 2010. RTA 2010, Leibniz International Proceedings in Informatics (LIPIcs) 6, pp. 135–150.
This paper supersedes the note Formalism A.
What Is the Problem with Proof Nets for Classical Logic? Lutz Straßburger. Abstract: This paper is an informal (and non-exhaustive) overview of some existing notions of proof nets for classical logic, and gives some hints why they might be considered to be unsatisfactory. Pdf. 6 April 2010. CiE 2010, LNCS 6158, pp. 406–416.
From Deep Inference to Proof Nets Via Cut Elimination. Lutz Straßburger. Abstract: This article shows how derivations in the deep inference system SKS for classical propositional logic can be translated into proof nets. Since an SKS derivation contains more information about a proof than the corresponding proof net, we observe a loss of information which can be understood as “eliminating bureaucracy”. Technically this is achieved by cut reduction on proof nets. As an intermediate step between the two extremes, SKS derivations and proof nets, we will see proof graphs representing derivations in “Formalism A”.
On the Axiomatisation of Boolean Categories with and without Medial. Lutz Straßburger. Abstract: In its most general meaning, a Boolean category is to categories what a Boolean algebra is to posets. In a more specific meaning a Boolean category should provide the abstract algebraic structure underlying the proofs in Boolean Logic, in the same sense as a Cartesian closed category captures the proofs in intuitionistic logic and a *-autonomous category captures the proofs in linear logic. However, recent work has shown that there is no canonical axiomatisation of a Boolean category. In this work, we will see a series (with increasing strength) of possible such axiomatisations, all based on the notion of *-autonomous category. We will particularly focus on the medial map, which has its origin in an inference rule in KS, a cut-free deductive system for Boolean logic in the calculus of structures. Finally, we will present a category of proof nets as a particularly well-behaved example of a Boolean category.
What Is a Logic, and What Is a Proof? Lutz Straßburger. Abstract: I will discuss the two problems of how to define identity between logics and how to define identity between proofs. For the identity of logics, I propose to simply use the notion of preorder equivalence. This might be considered to be folklore, but is exactly what is needed from the viewpoint of the problem of the identity of proofs: if the proofs are considered to be part of the logic, then preorder equivalence becomes equivalence of categories, whose arrows are the proofs. For identifying these, the concept of proof nets is discussed. Pdf. 23 October 2006. Logica Universalis—Towards a General Theory of Logic, pp. 135–152, Birkhäuser, 2007.
Proof Nets and the Identity of Proofs. Lutz Straßburger. Abstract: These are the notes for a 5-lecture course given at ESSLLI 2006 in Malaga, Spain. The URL of the school is http://esslli2006.lcc.uma.es/. The course is intended to be introductory. That means no prior knowledge of proof nets is required. However, the student should be familiar with the basics of propositional logic, and should have seen formal proofs in some formal deductive system (e.g., sequent calculus, natural deduction, resolution, tableaux, calculus of structures, Frege-Hilbert systems, ...). It is probably helpful if the student knows already what cut elimination is, but this is not strictly necessary. In these notes, I will introduce the concept of “proof nets” from the viewpoint of the problem of the identity of proofs. I will proceed in a rather informal way. The focus will be more on presenting ideas than on presenting technical details. The goal of the course is to give the student an overview of the theory of proof nets and make the vast amount of literature on the topic more easily accessible to the beginner. For introducing the basic concepts of the theory, I will in the first part of the course stick to the unit-free multiplicative fragment of linear logic because of its rather simple notion of proof nets. In the second part of the course we will see proof nets for more sophisticated logics. Pdf. 20 October 2006.
Technical Report 6013, INRIA
From Proof Nets to the Free *-Autonomous Category. François Lamarche and Lutz Straßburger. Abstract: In the first part of this paper we present a theory of proof nets for full multiplicative linear logic, including the two units. It naturally extends the well-known theory of unit-free multiplicative proof nets. A linking is no longer a set of axiom links but a tree in which the axiom links are subtrees. These trees will be identified according to an equivalence relation based on a simple form of graph rewriting. We show the standard results of sequentialization and strong normalization of cut elimination. In the second part of the paper we show that the identifications enforced on proofs are such that the class of two-conclusion proof nets defines the free *-autonomous category. Pdf. 5 October 2006. Journal version of On Proof Nets for Multiplicative Linear Logic with Units. Logical Methods in Computer Science 2 (4:3) 2006, pp. 1–44.
Completeness of MLL Proof-Nets w.r.t. Weak Distributivity. Jean-Baptiste Joinet. Abstract: We examine ‘weak distributivity’ as a rewriting rule →^{WD} defined on multiplicative proof-structures (so, in particular, on multiplicative proof-nets: MLL). This rewriting does not preserve the type of proof-nets, but does nevertheless preserve their correctness. The specific contribution of this paper is to give a direct proof of completeness for →^{WD}: starting from a set of simple generators (proof-nets which are an n-ary ⊗ of ℘-ized axioms), any mono-conclusion MLL proof-net can be reached by →^{WD} rewriting (up to ⊗ and ℘ associativity and commutativity). Pdf. 8 September 2006. The Journal of Symbolic Logic 72 (1) 2007, pp. 159–170.
Invited talk at WoLLIC 2003, under the title 'Calculus of Structures and Proof Nets', Electronic Notes in Theoretical Computer Science 84; a short version appeared in the proceedings of Structures and Deduction '05, pp. 81–94
Exploring the Gap between Linear and Classical Logic. François Lamarche. Abstract: The Medial rule was first devised as a deduction rule in the Calculus of Structures. In this paper we explore it from the point of view of category theory, as additional structure on a *-autonomous category. This gives us some insights on the denotational semantics of classical propositional logic, and allows us to construct new models for it, based on suitable generalizations of the theory of coherence spaces. Pdf. 27 June 2006. Theory and Applications of Categories 18 (17) 2007, pp. 473–535.
The Three Dimensions of Proofs. Yves Guiraud. Abstract: In this document, we study a 3-polygraphic translation for the proofs of SKS, a formal system for classical propositional logic. We prove that the free 3-category generated by this 3-polygraph describes the proofs of classical propositional logic modulo structural bureaucracy. We give a 3-dimensional generalization of Penrose diagrams and use it to provide several pictures of a proof. We sketch how local transformations of proofs yield a non-contrived example of 4-dimensional rewriting.
On Two Forms of Bureaucracy in Derivations. Kai Brünnler and Stéphane Lengrand. Abstract: We call irrelevant information in derivations bureaucracy. An example of such irrelevant information is the order between two consecutive inference rules that trivially permute. Building on ideas by Guglielmi, we identify two forms of bureaucracy that occur in the calculus of structures (and, in fact, in every non-trivial term rewriting derivation). We develop term calculi that provide derivations that do not contain this bureaucracy. We also give a normalisation procedure that removes bureaucracy from derivations and find that in a certain sense the normalisation process is a process of cut elimination. Pdf. 10 June 2005.
Proceedings of Structures and Deduction '05, pp. 69–80
This is further expanded in Ch. 11 of Stéphane Lengrand's PhD thesis Normalisation & Equivalence in Proof Theory & Type Theory (2007 Ackermann Award of the EACSL)
Constructing Free Boolean Categories. François Lamarche and Lutz Straßburger. Abstract: By Boolean category we mean something which is to a Boolean algebra what a category is to a poset. We propose an axiomatic system for Boolean categories, which is different in several respects from the ones proposed recently. In particular everything is done from the start in a *-autonomous category and not in a weakly distributive one, which simplifies issues like the Mix rule. An important axiom, which is introduced later, is a "graphical" condition, which is closely related to denotational semantics and the Geometry of Interaction. Then we show that a previously constructed category of proof nets is the free "graphical" Boolean category in our sense. This validates our categorical axiomatization with respect to a real-life example. Another important aspect of this work is that we do not assume a priori the existence of units in the *-autonomous categories we use. This has some retroactive interest for the semantics of linear logic, and is motivated by the properties of our example with respect to units.
Classical Categories and Deep Inference. Richard McKinley. Abstract: Deep inference is a proof-theoretic notion in which proof rules apply arbitrarily deeply inside a formula. We show that the essence of deep inference is the bifunctoriality of the connectives. We demonstrate that, when given an inequational theory that models cut-reduction, a deep inference calculus for classical logic (SKSg) is a categorical model of the classical sequent calculus LK in the sense of Führmann and Pym. We observe that this gives a notion of cut-reduction for derivations in SKSg, for which the usual notion of cut in SKSg is a special case. Viewing SKSg as a model of the sequent calculus uncovers new insights into the Craig interpolation lemma and intuitionistic provability. Pdf. 31 March 2005.
Proceedings of Structures and Deduction '05, pp. 19–33
Naming Proofs in Classical Propositional Logic. François Lamarche and Lutz Straßburger. Abstract: We present a theory of proof denotations in classical propositional logic. The abstract definition is in terms of a semiring of weights, and two concrete instances are explored. With the Boolean semiring we get a theory of classical proof nets, with a geometric correctness criterion, a sequentialization theorem, and a strongly normalizing cut-elimination procedure. With the semiring of natural numbers, we obtain a sound semantics for classical logic, in which fewer proofs are identified. Though a "real" sequentialization theorem is missing, these proof nets have a grip on complexity issues. In both cases the cut elimination procedure is closely related to its equivalent in the calculus of structures, and we get "Boolean" categories which are not posets. Pdf. 31 January 2005. TLCA 2005, LNCS 3461, pp. 246–261.
Deep Inference Proof Theory Equals Categorical Proof Theory Minus Coherence. Dominic Hughes. Abstract: This paper links deep inference proof theory, as studied by Guglielmi et al., to categorical proof theory in the sense of Lambek et al. It observes how deep inference proof theory is categorical proof theory, minus the coherence diagrams/laws. Coherence yields a ready-made and well studied notion of equality on deep inference proofs. The paper notes a precise correspondence between the symmetric deep inference system for multiplicative linear logic (the linear fragment of SKSg) and the presentation of *-autonomous categories as symmetric linearly distributive categories with negation. Contraction and weakening in SKSg correspond precisely to the presence of (co)monoids. Pdf. 6 October 2004.
On Proof Nets for Multiplicative Linear Logic with Units. Lutz Straßburger and François Lamarche. Abstract: In this paper we present a theory of proof nets for full multiplicative linear logic, including the two units. It naturally extends the well-known theory of unit-free multiplicative proof nets. A linking is no longer a set of axiom links but a tree in which the axiom links are subtrees. These trees will be identified according to an equivalence relation based on a simple form of graph rewriting. We show the standard results of sequentialization and strong normalization of cut elimination. Furthermore, the identifications enforced on proofs are such that the proof nets, as they are presented here, form the arrows of the free (symmetric) *-autonomous category. Pdf. 30 June 2004. CSL 2004, LNCS 3210, pp. 145–159.
2.8 Language Design
Thanks to a self-dual non-commutative extension of linear logic one gets the first purely logical account of sequentiality in proof search. The new logical operators make possible a new approach to partial order planning and its relation to concurrency.
On Linear Logic Planning and Concurrency. Ozan Kahramanoğulları. Abstract: We present an approach to linear logic planning where an explicit correspondence between partial order plans and multiplicative exponential linear logic proofs is established. This is performed by extracting partial order plans from a sound and complete encoding of planning problems in multiplicative exponential linear logic. These partial order plans exhibit a non-interleaving behavioural concurrency semantics, i.e., labelled event structures. Relying on this fact, we argue that this work is a crucial step towards establishing a common language for concurrency and planning that will allow one to carry techniques and methods between these two fields.
A Deductive Compositional Approach to Petri Nets for Systems Biology. Ozan Kahramanoğulları. Abstract: We introduce the language CP, a compositional language for place/transition Petri nets for the purpose of modelling signalling pathways in complex biological systems. We give the operational semantics of the language CP by means of a proof theoretical deductive system which extends multiplicative exponential linear logic with a self-dual non-commutative logical operator. This allows one to express parallel and sequential composition of processes at the same syntactic level as in process algebra, and perform logical reasoning on these processes. We demonstrate the use of the language on a model of a signalling pathway for Fc receptor-mediated phagocytosis. Pdf. 18 September 2007.
Presented as poster at CMSB '07
Towards Planning as Concurrency. Ozan Kahramanoğulları. Abstract: We present a purely logical framework for planning where we bring the sequential and parallel composition in the plans to the same level, as in process algebras. The problem of expressing causality, which is very challenging for common logics and traditional deductive systems, is solved by resorting to a recently developed extension of multiplicative exponential linear logic with a self-dual, noncommutative operator. We present an encoding of the conjunctive planning problems in this logic, and provide a constructive soundness and completeness result. We argue that this work is the first, but crucial, step of a uniform deductive formalism that connects planning and concurrency inside a common language, and allows one to transfer methods from concurrency to planning. Pdf. 19 October 2004. Artificial Intelligence and Applications 2005, ACTA Press, pp. 197–202.
A Purely Logical Account of Sequentiality in Proof Search. Paola Bruscoli. Abstract: We establish a strict correspondence between the proof-search space of a logical formal system and computations in a simple process algebra. Sequential composition in the process algebra corresponds to a logical relation in the formal system—in this sense our approach is purely logical, no axioms or encodings are involved. The process algebra is a minimal restriction of CCS to parallel and sequential composition; the logical system is a minimal extension of multiplicative linear logic. This way we get the first purely logical account of sequentiality in proof search. Since we restrict attention to a small but meaningful fragment, which is then of very broad interest, our techniques should become a common basis for several possible extensions. In particular, we argue about this work being the first step in a two-step research for capturing most of CCS in a purely logical fashion. Pdf. 12 August 2002. ICLP 2002, LNCS 2401, pp. 302–316.
2.9 Implementations
Ozan Kahramanoğulları, Pierre-Etienne Moreau and Antoine Reilles are implementing calculus-of-structures proof systems in Maude and in Tom. Ozan managed to achieve efficiency without sacrificing proof theoretic cleanliness, and he is obtaining results of independent theoretical interest. There are two slide presentations:
Max Schäfer has built a graphical proof editor in Java, called GraPE, for the Maude modules written by Ozan Kahramanoğulları; this means that one can interactively build and find proofs in several deep-inference systems.
Ingredients of a Deep Inference Theorem Prover. Ozan Kahramanoğulları. Abstract: Deep inference deductive systems for classical logic provide exponentially shorter proofs than the sequent calculus systems, however at the cost of higher nondeterminism and a larger search space in proof search. We report on our ongoing work on proof search with deep inference deductive systems. We present systems for classical logic where nondeterminism in proof search is reduced by constraining the context management rule of these systems. We argue that a deep inference system for classical logic can outperform sequent calculus deductive systems in proof search when nondeterminism and the application of the contraction rule are controlled by means of invertible rules. Pdf. 24 June 2008.
Short paper at CL&C'08
Maude as a Platform for Designing and Implementing Deep Inference Systems. Ozan Kahramanoğulları. Abstract: Deep inference is a proof theoretical methodology that generalizes the traditional notion of inference in the sequent calculus: in contrast to the sequent calculus, the deductive systems with deep inference do not rely on the notion of main connective, and permit the application of the inference rules at any depth inside logical expressions, in a way which resembles the application of term rewriting rules. Deep inference provides a richer combinatoric analysis of proofs for different logics. In particular, the construction of exponentially shorter proofs becomes possible. In this paper, aiming at the development of computation-as-proof-search tools, we propose the Maude language as a means for designing and implementing different deep inference deductive systems and proof strategies that work on these systems. We demonstrate these ideas on classical logic and argue that they can be analogously carried over to other deductive systems for other logics. Pdf. 16 September 2007. RULE '07, Electronic Notes in Theoretical Computer Science 219, 2008, pp. 35–50.
Canonical Abstract Syntax Trees. Antoine Reilles. Abstract: This paper presents GOM, a language for describing abstract syntax trees and generating a Java implementation for those trees. GOM includes features that allow one to specify and modify the interface of the data structure. These features provide in particular the capability to maintain the internal representation of data in canonical form with respect to a rewrite system. This explicitly guarantees that the client program only manipulates normal forms for this rewrite system, a feature which is only implicitly used in many implementations. Pdf. 28 March 2006.
Proceedings of 6th International Workshop on Rewriting Logic and Its Applications, Electronic Notes in Theoretical Computer Science 176, 2007, pp. 165–179
Implementing Deep Inference in TOM. Ozan Kahramanoğulları, Pierre-Etienne Moreau, Antoine Reilles. Abstract: The calculus of structures is a proof theoretical formalism which generalizes the sequent calculus with the feature of deep inference: in contrast to the sequent calculus, the calculus of structures does not rely on the notion of main connective and, as in term rewriting, it permits the application of inference rules at any depth inside a formula. Tom is a pattern matching processor that integrates term rewriting facilities into imperative languages. In this paper, relying on the correspondence between the systems in the calculus of structures and term rewriting systems, we present an implementation of system BV of the calculus of structures in Java by exploiting the term rewriting features of Tom. This way, by means of the expressive power due to Java, it becomes possible to implement different search strategies. Since the systems in the calculus of structures follow a common scheme, we argue that our implementation can be generalized to other systems in the calculus of structures for classical logic, modal logics, and different fragments of linear logic. Pdf. 23 April 2005.
Proceedings of Structures and Deduction '05, pp. 158–172
System BV Without the Equalities for Unit. Ozan Kahramanoğulları. Abstract: System BV is an extension of multiplicative linear logic with a non-commutative self-dual operator. In this paper, we present systems equivalent to system BV where equalities for unit are oriented from left to right and new structural rules are introduced to preserve completeness. While the first system allows units to appear in the structures, the second system makes it possible to completely remove the units from the language of BV by proving the normal forms of the structures that are provable in BV. The resulting systems provide better performance in automated proof search by disabling redundant applications of inference rules due to the unit. As evidence, we provide a comparison of the performance of these systems in a Maude implementation. Pdf. 4 August 2004. ISCIS 2004, LNCS 3280, pp. 986–995.
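The idea of orienting the unit equalities from left to right can be pictured with a small sketch. This is an illustrative toy encoding only, not the paper's Maude implementation: structures are nested tuples, 'o' stands for the unit, and the equalities [R, o] = R, (R, o) = R, <R; o> = <o; R> = R are applied as left-to-right rewrite rules until a normal form is reached.

```python
# Toy sketch (hypothetical encoding, not the paper's systems): orient the
# unit equalities of BV left to right and normalise.  Binary structures are
# tuples ('par' | 'times' | 'seq', left, right); atoms are strings; 'o' is
# the unit.

UNIT = 'o'

def normalise(s):
    """Erase units by rewriting [R, o] -> R, (R, o) -> R, <R; o> -> R
    (and symmetrically), innermost first, until no unit redex remains."""
    if not isinstance(s, tuple):
        return s
    tag, l, r = s
    l, r = normalise(l), normalise(r)  # normalise substructures first
    if l == UNIT:
        return r
    if r == UNIT:
        return l
    return (tag, l, r)

# (a, o) par <o; b> normalises to a par b:
print(normalise(('par', ('times', 'a', 'o'), ('seq', 'o', 'b'))))
```

Once units can no longer be introduced or moved around freely, the redundant rule applications that the abstract mentions simply disappear from the search space.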
From the Calculus of Structures to Term Rewriting Systems
Steffen Hölldobler and Ozan Kahramanoğulları
Abstract: The calculus of structures is a recently developed proof theoretical formalism that extends one-sided sequent calculus, with the gain of interesting proof theoretical properties. In contrast to sequent calculus, it does not rely on the notion of main connective and, like in term rewriting, permits the application of inference rules anywhere deep inside a formula. In this paper, exploiting this resemblance, we present a procedure turning derivations in the calculus of structures in four steps into rewritings in a term rewriting system modulo equality.
13 June 2004
Technical Report WV-04-03, Technische Universität Dresden
Implementing System BV of the Calculus of Structures in Maude
Ozan Kahramanoğulları
Abstract: System BV is an extension of multiplicative linear logic with a noncommutative self-dual operator. We first map derivations of system BV of the calculus of structures to rewritings in a term rewriting system modulo equality, and then express this rewriting system as a Maude system module. This results in an automated proof search implementation for this system, and provides a recipe for implementing existing calculus of structures systems for other logics. Our result is interesting from the view of applications, especially where sequentiality is essential, e.g., planning and natural language processing. In particular, we argue that we can express plans as logical formulae by using the sequential operator of BV and reason on them in a purely logical way.
Pdf
13 June 2004
Proceedings of ESSLLI 2004-Student Session
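The deep-rewriting reading shared by the implementation papers above can be pictured with a minimal sketch, which is my own illustration and not the code of any of these systems: formulae are terms, and an inference rule, here the switch rule (A or B) and C -> (A and C) or B, may fire at the root or at any depth inside a formula. The encoding and the names are assumptions made for the example.

```python
# Toy sketch (assumed encoding, not the Maude or Tom code of the papers
# above): deep inference as term rewriting.  A formula is an atom (a
# string) or a binary node ('and', X, Y) / ('or', X, Y).

def switch(f):
    """Yield the results of applying the switch rule
    (A or B) and C  ->  (A and C) or B  at the root of f, if any."""
    if isinstance(f, tuple) and f[0] == 'and':
        _, left, c = f
        if isinstance(left, tuple) and left[0] == 'or':
            _, a, b = left
            yield ('or', ('and', a, c), b)

def deep(rule, f):
    """Yield every formula reachable by one application of rule at any
    depth of f: this recursion into subformulae is the deep inference
    feature that sequent calculi lack."""
    yield from rule(f)                  # apply at the root ...
    if isinstance(f, tuple):
        op, l, r = f
        for l2 in deep(rule, l):        # ... inside the left subformula,
            yield (op, l2, r)
        for r2 in deep(rule, r):        # ... and inside the right one.
            yield (op, l, r2)

# Two switch redexes: one at the root, one nested in the right conjunct.
f = ('and', ('or', 'a', 'b'), ('and', ('or', 'c', 'd'), 'e'))
for g in deep(switch, f):
    print(g)
```

Proof search then amounts to iterating this one-step relation with the rules read from conclusion to premiss; the Java/Tom and Maude implementations add rewriting modulo the associativity and commutativity equations, plus a search strategy, which this sketch omits.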
3 Counterexamples
Finding counterexamples is very important to us. Counterexamples are typically of a very combinatorial nature, due to the new combinatorial possibilities offered by deep inference.
Confluent and Natural Cut Elimination in Classical Logic
Alessio Guglielmi and Benjamin Ralph
We show that, in deep inference, there is a natural and confluent cut elimination procedure that has a strikingly simple semantic justification. We proceed in two phases: we first tackle the propositional case with a construction that we call the 'experiments method'. Here we build a proof made of as many derivations as there are models of the given tautology. Then we lift the experiments method to the predicate calculus, by tracing all the existential witnesses, and in so doing we reconstruct the Herbrand theorem. The confluence of the procedure is essentially taken care of by the commutativity and associativity of disjunction.
31 March 2015
Accepted at Proof, Computation, Complexity 2015

Subatomic Proof Systems
Andrea Aler Tubella and Alessio Guglielmi
We show a proof system that generates propositional proofs by employing a single, linear, simple and regular inference rule scheme. The main idea is to consider atoms as self-dual, noncommutative binary logical relations and build formulae by freely composing units by atoms, disjunction and conjunction. If we restrict proofs to formulae where no atom occurs in the scope of another atom, we fully and faithfully recover a deduction system for propositional logics in the usual sense, where traditional proof analysis and transformation techniques such as cut elimination can be studied.
30 March 2015
Accepted at Proof, Computation, Complexity 2015

A Subatomic Proof System
Andrea Aler Tubella and Alessio Guglielmi
In this work we show a proof system, called SA, that generates propositional proofs by employing a single, linear, simple and regular inference rule scheme. The main idea is to consider atoms as self-dual, noncommutative binary logical relations and build formulae by freely composing units by atoms, disjunction and conjunction. If we restrict proofs to formulae where no atom occurs in the scope of another atom, we fully and faithfully recover a deduction system for propositional logic in the usual sense, where traditional proof analysis and transformation techniques such as cut elimination can be studied.
5 July 2014

The Commutative/Noncommutative Linear Logic BV
Alessio Guglielmi
This brief survey contains an informal presentation of the commutative/noncommutative linear logic BV in terms of a naive spatio-temporal model. BV improves on the ability of linear logic to model spatial structure by satisfactorily capturing temporal structure in programming languages and quantum physics. I provide a guide to the literature on BV and suggest that the only satisfactory treatment for it can be obtained in the proof-theoretic methodology of deep inference.
8 May 2014
Finitary Cut
Alessio Guglielmi
We all know that the traditional cut rule is considered infinitary. But if we reduce the cut rule to atomic form, as we always can do in the calculus of structures, is atomic cut still infinitary? Not really.
22 September 2007
Superseded by A First Order System with Finite Choice of Premises

Butterflies
Alessio Guglielmi
If you're interested in NEL and BV, try and find the mistake in this note.
29 July 2005

Some News on Subatomic Logic
Alessio Guglielmi
I found a simple explanation for the soundness of the subatomic inference rule.
28 July 2005

Red and Blue
Alessio Guglielmi
The sequent calculus is often colour blind, but not always; the calculus of structures has a consistent behaviour.
26 July 2005

Formalism B
Alessio Guglielmi
Inference rules operate on derivations.
20 December 2004
Using BV to Describe Causal Quantum Evolution
Prakash Panangaden
In this note I describe how to capture the kinematics of quantum causal evolution using a logic called BV developed by the Calculus of Structures group at Dresden.
5 July 2004

Resolution in the Calculus of Structures
Alessio Guglielmi
It is possible to search for cut-free proofs by the resolution strategy in the calculus of structures; the sequent calculus does not allow the same freedom.
29 September 2003

Mismatch
Alessio Guglielmi
There is a mismatch between meta and object levels in many calculi, like the sequent calculus and natural deduction. The mismatch has undesirable proof theoretical consequences, the most important being the inability to design deductive systems.
24 August 2003
Brief note for the specialist, posted to the Foundations of Mathematics and Proof Theory lists

Normalisation Without Cut Elimination
Alessio Guglielmi
I propose a possible notion of equivalence for proofs in classical logic, sort of a normal form for case analysis.
25 February 2003

Subatomic Logic
Alessio Guglielmi
One can unify classical and linear logic by using only two simple, linear, 'hyper' inference rules; they generate nice systems for all the fragments, where all rules are local. The various logics are determined by the algebra of their units, for example boolean algebra determines classical logic. We can prove cut elimination easily and once and for all in the two-rule system, and the procedure scales down to all the derived fragments.
21 November 2002

On Lafont's Counterexample
Alessio Guglielmi
Lafont's counterexample shows why cut elimination in the sequent calculus of classical logic is not confluent. I show here that the counterexample doesn't work in the calculus of structures.
1 November 2002

Recipe
Alessio Guglielmi
A family of tautologies that hold in most logics can be used, following a simple recipe, to produce deductive systems where all rules are local, including cut.
27 October 2002

Goodness, Perfection and Miracles
Alessio Guglielmi
I argue about three properties closely related to cut elimination and interpolation theorems.
18 October 2002
There is a very friendly mailing list devoted to deep inference, proof nets, structads, the calculus of structures and other amphibians of structural proof theory, named Frogs. Join it!
These are the current and recent grants I am aware of, regarding deep inference:
The Fine Structure of Formal Proof Systems and their Computational Interpretations (FISP)
2016–2019. Three-year ANR-FWF project at Université Paris Diderot/PPS, INRIA Saclay/École Polytechnique, University of Innsbruck and Vienna University of Technology. 690,647 EUR. Coordinator is Michel Parigot.

Efficient and Natural Proof Systems
2013–2016. Three-year EPSRC research project at the University of Bath. 756,061 GBP (full economic costing, including indexation and a research studentship funded by the University of Bath). Principal investigator is Alessio Guglielmi, co-investigator is Guy McCusker.

Extending the Realm of the Curry-Howard Correspondence
2011–2012. French-Swiss Academic Research Collaboration Programme (PAI Germaine De Stael). Exchange programme between INRIA Futurs and IAM, Universität Bern. The two sides get around 4,300 EUR each. Principal investigators are Lutz Straßburger and Kai Brünnler.

Structural and Computational Proof Theory (Structural)
2010–2012. ANR Programme Blanc International/FWF 2010. Three-year project at PPS, LIX, Innsbruck and Vienna. The project gets 441,229 EUR from the Agence Nationale de la Recherche, out of a total cost in France of 2,032,556 EUR, and 291,577 EUR from the FWF. Coordinator and principal investigator is Michel Parigot.

REDO: Redesigning Logical Syntax
2009–2010. INRIA Action de Recherche Collaborative. Two-year project of the teams Calligramme/Démosthène and Parsifal (INRIA) and the Computer Science Department of the University of Bath. The project gets 80,000 EUR. Principal investigators are Alessio Guglielmi, François Lamarche and Lutz Straßburger (coordinator).

Identity and Geometric Essence of Proofs (Démosthène)
2008–2010. ANR Chaire d'excellence. Two-year project at LORIA. The project gets about 355,000 EUR from the Agence Nationale de la Recherche, out of a total cost of 750,448 EUR; the rest is covered by INRIA. Principal investigator is Alessio Guglielmi.

Complexity and Non-Determinism in Deep Inference
2007–2008. Seventeen-month project at the University of Bath. 118,072 GBP, out of a total cost of 147,591 GBP, from the Engineering and Physical Sciences Research Council (EPSRC). Principal investigator is Alessio Guglielmi, co-investigator is Paola Bruscoli.

Deep Inference and the Essence of Proofs
2007–2008. French-Swiss Academic Research Collaboration Programme (PAI Germaine De Stael). Exchange programme between INRIA Futurs and IAM, Universität Bern. The two sides get around 5,000 EUR each. Principal investigators are Lutz Straßburger and Kai Brünnler.

The Realm of Cut Elimination
2007–2008. French-Austrian Academic Research Collaboration Programme (PAI Amadeus). Exchange programme between INRIA Futurs and the Theory and Logic Group, Technische Universität Wien. The two sides get around 5,000 EUR each. Principal investigators are Lutz Straßburger and Agata Ciabattoni.

Theory and Application of Deep Inference (Infer)
ANR Programme Blanc 2006. Three-year project at LIX, LORIA and PPS. The project gets 230,400 EUR, out of a total cost of 1,280,598 EUR, from the Agence Nationale de la Recherche. Coordinator and principal investigator is Lutz Straßburger; the other principal investigators are François Lamarche and Michel Parigot.

Analyticity and Proof Search for Modal Logics in Deep Inference
2006–2008. British-German Academic Research Collaboration Programme. Two-year exchange programme between the University of Bath and the Technische Universität Dresden. Bath gets 2,650 GBP from the British Council (principal investigator Alessio Guglielmi) and Dresden gets 5,694 EUR from the German Academic Exchange Service (principal investigator Steffen Hölldobler).

New Deductive Systems, Normalisation Methods and Semantics for Deep Inference
2006–2007. Alliance Franco-British Partnership Programme. Two-year exchange programme between the University of Bath and the PPS group in Paris. Bath gets 4,300 GBP from the British Council (principal investigator Alessio Guglielmi) and PPS gets 6,250 EUR from the French MAE and the Ministry of Research (principal investigator Michel Parigot).
I did not indicate individual travel grants.
7 Beans

Journals and chapters in books: 41
In press (3): Journal of Logic and Computation (2), Logical Methods in Computer Science
2015 (3): Mathematical Logic and Foundations; Logical Methods in Computer Science; ACM Transactions on Computational Logic
2014 (3): Categories and Types in Logic, Language, and Physics; Logical Methods in Computer Science; Notre Dame Journal of Formal Logic
2013 (2): Logical Methods in Computer Science; ACM Transactions on Computational Logic
2012 (4): Annals of Pure and Applied Logic (3); Applied Categorical Structures
2011 (5): Mathematical Structures in Computer Science; Journal of Logic and Computation; ACM Transactions on Computational Logic; Logical Methods in Computer Science; Logic without Frontiers (Festschrift for Walter Alexandre Carnielli)
2009 (6): ACM Transactions on Computational Logic; Electronic Notes in Theoretical Computer Science; Journal of Logic and Computation; Annals of Pure and Applied Logic; Information and Computation; Archive for Mathematical Logic
2008 (3): Annals of Pure and Applied Logic; Logical Methods in Computer Science; Journal of Logic and Computation
2007 (5): ACM Transactions on Computational Logic; Studia Logica; Journal of Logic and Computation; Theory and Applications of Categories (2)
2006 (5): Studia Logica; Annals of Pure and Applied Logic; Logical Methods in Computer Science (2); Notre Dame Journal of Formal Logic
2003 (2): Logic Journal of the IGPL; Theoretical Computer Science

Conferences: 56
2015 (1): RTA
2014 (4): CSL-LICS (3), AiML
2013 (6): FoSSaCS, LICS, RTA, CSL (2), LPAR
2012 (4): CiE, AiML (2), JELIA
2011 (3): TLCA, Tableaux, CSL
2010 (7): LPAR (2), RTA, LICS, CiE, AiML (2)
2009 (5): Tableaux (2), WoLLIC, TLCA, TbiLLC
2008 (3): LATA, LPAR, AiML
2007 (1): RTA
2006 (4): AiML, CiE, LPAR (2)
2005 (5): WoLLIC, Logica Universalis, TLCA, AIA, LICS
2004 (4): AiML, CSL (2), ISCIS
2003 (4): CSL, FOL75, WoLLIC (2, one of which invited talk)
2002 (3): LPAR (2), ICLP
2001 (2): LPAR, CSL

Habilitation theses: 2
2011 (1): Lutz Straßburger
2010 (1): Kai Brünnler

PhD theses: 8
2013 (2): Nicolas Guenot, Anupam Das
2011 (1): Linda Postniece
2009 (1): Tom Gundersen
2006 (2): Richard McKinley, Ozan Kahramanoğulları
2003 (2): Kai Brünnler, Lutz Straßburger

MSc theses: 4
2007 (1): Benjamin Robert Horsfall
2005 (1): Robert Hein
2004 (1): Phiniki Stouppa
2001 (1): Alwen Tiu
8 Events
Deep inference was one of the main attractions at the following international events: