Alessio Guglielmi's Research and Teaching / Deep Inference
This page tries to be a comprehensive account of the ongoing research on deep inference.
Deep inference started as a personal project, but it is now growing fast and I'm struggling to keep pace with all the developments. Some of them I do not understand in detail.
Please keep in mind that what I wrote below may be quite subjective and may not reflect the opinions of the people involved in this research.
If you disagree or find inaccuracies, please let me know!
If you are an expert in proof theory and want to quickly understand what deep inference is about, go to Deep Inference in One Minute.
19 December 2012: The three-year EPSRC project Efficient and Natural Proof Systems at the University of Bath has been approved.
20 January 2012: The two-year Royal Society International Exchanges Grant Sharing and Sequentiality in Proof Systems with Locality between the University of Bath and the Università di Torino has been approved.
15-17 June 2011: First Structural meeting in Paris.
2 December 2010: The three-year ANR/FWF project Structural has been approved (about 2,324,000 EUR).
14-16 September 2010: Meeting REDO: Redesigning Logical Syntax in Bath.
9-20 August 2010: Course Introduction to Proof Theory, by Lutz Straßburger, at ESSLLI 2010 in Copenhagen.
23-24 February 2010: Meeting Geometric and Logic Approaches to Computation in Nancy.
This page contains:
Other pages contain more detailed information:
Deep inference is a new methodology in proof theory, which is the discipline that studies mathematical proofs. Deep inference is about designing proof systems and formalisms with excellent properties of analyticity, proof complexity and semantics of proofs. With deep inference, we express logics that elude the traditional methods of proof theory; in particular, we represent modal logics and logics related to process algebras. Deep inference provides beautiful and simple proof systems and formalisms.
The following picture illustrates a point of view on deep inference: most of traditional proof theory adopts a methodology that we call shallow inference. The methodologies of shallow and deep inference inspire the design of formalisms, which, in turn, express proof systems for several logics. At the bottom of the picture, solid arrows signal the existence of widely established proof systems in the given formalisms, while dashed arrows stand for a shabbier relationship. Some formalisms in deep inference are under development (dashed border).
We will now explore the concepts in the picture in more detail, and we will summarise the results we obtained so far in deep inference, and the current research themes.
A formal proof consists in breaking down a mathematical argument into small inference steps and connecting them together. Its validity can then be checked by local inspection of the inference steps, as dictated by syntactic inference rules. Checking a proof is a mechanical procedure, and this is much of its value: human error is ruled out, and computers can be employed for checking, and even for discovering, proofs.
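The mechanical nature of proof checking can be made concrete with a toy example. The following sketch uses a hypothetical Hilbert-style setting with only axioms and modus ponens (not any of the systems discussed on this page): each step of a proof is validated purely by local inspection of the lines it refers to.

```python
# Toy proof checker (hypothetical rules, for illustration only).
# A proof is a list of steps; each step is ('axiom', formula) or
# ('mp', i, j, formula), meaning the formula follows from lines i and j
# by modus ponens. Formulas: atoms are strings, ('->', A, B) is implication.

def check(proof):
    derived = []
    for step in proof:
        if step[0] == 'axiom':
            derived.append(step[1])
        elif step[0] == 'mp':
            _, i, j, f = step
            # Local check: line j must be exactly (line i -> f).
            if derived[j] != ('->', derived[i], f):
                return False
            derived.append(f)
        else:
            return False
    return True

proof = [
    ('axiom', 'p'),
    ('axiom', ('->', 'p', 'q')),
    ('mp', 0, 1, 'q'),
]
print(check(proof))  # True
```

Each step is checked in isolation, which is why the whole procedure can be mechanised: nothing about the global shape of the argument needs to be understood.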
A logic can be considered a class of theorems. Normally, mathematics is performed in first-order classical logic, but, in many cases, we use stronger or weaker systems. In computer science, we use many logics, some of which are rather far apart from classical logic.
A set of inference rules is called a (proof) system. For any given logic, several different proof systems exist, all proving the same theorems. However, they might differ greatly in several properties, notably:
In our research, these three properties of proof systems play a crucial role.
Proof systems can be classified according to the style they adhere to. These classes are not always mathematically defined, but, in practice, they are recognisable. We say that proof systems belong to different formalisms. The most important traditional formalisms are the Frege-Hilbert formalism, the sequent calculus and natural deduction, and there are many others; the sequent calculus is special because it supports analyticity.
Different formalisms allow for proof systems with different properties. For example, systems in the natural deduction formalism do not lend themselves easily to proof search. Systems with excellent proof search properties, due to analyticity, can be designed in the sequent calculus; however, they produce proofs which are exponentially bigger than those in natural deduction, on certain classes of theorems.
Often, formalisms entail interesting notions of normalisation inside their proof systems. For example, natural deduction systems possess a notion of normalisation that corresponds closely to a very important notion of computation, in what we call the Curry-Howard correspondence: normalising proofs in certain systems for intuitionistic logic corresponds to computing in the simply-typed lambda-calculus.
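As an illustration of this Curry-Howard reading, here is a minimal sketch (an illustration only, with capture-avoiding substitution deliberately elided) in which proof normalisation appears as beta-reduction of lambda-terms:

```python
# A minimal sketch of the Curry-Howard reading: proofs correspond to
# lambda-terms, and proof normalisation corresponds to beta-reduction.
# Terms are tuples: ('var', name), ('lam', name, body), ('app', f, a).

def substitute(term, name, value):
    """Naive substitution; capture avoidance is elided (assume distinct bound names)."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        return term if term[1] == name else ('lam', term[1], substitute(term[2], name, value))
    return ('app', substitute(term[1], name, value), substitute(term[2], name, value))

def normalise(term):
    """Repeatedly contract the leftmost-outermost beta-redex."""
    kind = term[0]
    if kind == 'app':
        f = normalise(term[1])
        if f[0] == 'lam':
            return normalise(substitute(f[2], f[1], term[2]))
        return ('app', f, normalise(term[2]))
    if kind == 'lam':
        return ('lam', term[1], normalise(term[2]))
    return term

# The proof of A -> A, applied (by modus ponens, i.e. application) to a
# proof x of A, normalises to x itself: cut elimination in miniature.
identity = ('lam', 'x', ('var', 'x'))
print(normalise(('app', identity, ('var', 'y'))))  # ('var', 'y')
```

In a serious implementation one would of course handle variable capture; the point here is only the correspondence between normalising a proof and computing with the term that encodes it.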
It is sometimes very challenging to design proof systems for a given logic in a given formalism: for example, many variations of modal logic, which are easily expressible in Frege-Hilbert systems, only find awkward presentations in sequent-calculus systems. Often, the reason for adopting the sequent calculus is to obtain analyticity; however, in some cases, even with great effort, one only gets sequent-calculus systems that are not analytic.
For many logics, like modal logic, it might be necessary to adopt non-traditional formalisms, which allow for more freedom in designing inference rules. For example, many modal logic variations can be expressed with analytic proof systems in a formalism called hypersequents, which is a generalisation of the sequent calculus.
Formalisms, to a very large extent, dictate the design of inference rules. For example, natural deduction prescribes that, for every connective, two rules are given: one that introduces it and one that eliminates it.
In all traditional formalisms, and in the modern ones derived from them, a methodology that we call shallow inference is adopted. `Shallow´ inference rules operate on connectives that appear in close proximity to the root of formulae, when we consider them as trees. For example, the introduction and elimination rules of natural deduction operate on root connectives of formulae. The sequent calculus usually goes one level deeper than natural deduction, and in some cases two levels deeper. The hypersequents formalism, which is derived from the sequent calculus, does the same.
Shallow inference is a very natural methodology, because it is about generating proofs by a straightforward structural induction on the formulae they prove. However, shallow inference is not optimal regarding the three properties of proof systems mentioned above, and in particular:
Shallow inference also has difficulties in coping with modal logic: modal logic theories can be defined in Frege-Hilbert systems, but obtaining analyticity for them (in sequent systems) can be very difficult, and in some cases it has not been achieved. It is equally difficult or impossible to express proof systems for some logics involving non-commutativity, linearity and other features typical of computer-science concurrency languages, like process algebras.
All these seemingly heterogeneous difficulties can largely be ascribed to one cause: the adoption of the shallow-inference methodology. In fact, and very roughly, all shallow-inference formalisms require the adoption of a so-called meta, or structural, level for organising the `pieces´ of formulae obtained by the structural induction they adopt. For example, the sequent calculus organises these pieces into sets and trees. Because of the historical development of proof theory, the meta level, in a sense, coincides with the algebraic structure of classical logic. This becomes an increasingly serious impediment the more the logics one wants to express depart from classical logic.
Deep inference is based on a very simple idea: formalisms adopt the algebraic structure of the given logic itself to keep the pieces of proofs organised; in other words, there is no meta level. This means that inference rules must be able to operate at any level (`deeply´) inside formula trees.
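To make `deep´ applicability concrete, the following sketch (a toy illustration, not an official implementation) takes one rule of the calculus of structures, the switch rule, and applies it at any node of a formula tree rather than only at the root:

```python
# A toy illustration of deep rule application. The switch rule of the
# calculus of structures rewrites (A or B) and C into (A and C) or B,
# and deep inference lets it fire at ANY subformula, not just the root.
# Formulas: atoms are strings; ('and', A, B) and ('or', A, B) are trees.

def switch_at_root(f):
    """Apply switch at the root, when the shape matches; else None."""
    if f[0] == 'and' and isinstance(f[1], tuple) and f[1][0] == 'or':
        (_, a, b), c = f[1], f[2]
        return ('or', ('and', a, c), b)
    return None

def deep_applications(f):
    """All formulas obtainable by one switch step anywhere inside f."""
    results = []
    root = switch_at_root(f)
    if root is not None:
        results.append(root)
    if isinstance(f, tuple):
        op, l, r = f
        results += [(op, l2, r) for l2 in deep_applications(l)]
        results += [(op, l, r2) for r2 in deep_applications(r)]
    return results

# A shallow rule could only rewrite at the root; here switch fires
# inside the right conjunct of the outer 'and'.
f = ('and', 'p', ('and', ('or', 'a', 'b'), 'c'))
for g in deep_applications(f):
    print(g)
```

The enumeration of all deep rewriting positions also hints at the cost of this freedom: the non-determinism in proof search grows with the number of positions a rule can fire at, which is exactly the challenge mentioned below.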
The idea is simple, but the adoption of deep inference poses two difficult challenges:
We developed the proof theory of a deep-inference formalism that we call the calculus of structures. This is the simplest formalism conceivable in deep inference, because inference rules behave as rewrite rules in term rewriting systems. The calculus of structures is a milestone in the development of deep inference, because of its simplicity and its resemblance to traditional formalisms. Moreover, the normalisation techniques of the calculus of structures are applicable also to other formalisms in deep inference, currently under development.
We achieved the following results for the calculus of structures:
The calculus of structures generalises most shallow-inference formalisms, in particular the sequent calculus. This means that every proof in shallow-inference formalisms can be `mimicked´ in the calculus of structures, by preserving complexity and without losing any structural property.
It might be desirable, for example for constraining proof search and reducing non-determinism, to reach a compromise between the sequent calculus and the calculus of structures. A new deep-inference formalism, called nested sequents, has been developed, especially targeting modal logics. In nested sequents (and so, in the calculus of structures), modal logics like B and K5, which do not enjoy analytic presentations in the sequent calculus, find simple analytic systems.
The cirquent calculus, a new formalism recently developed by Giorgi Japaridze, benefits from a deep-inference presentation.
The calculus of structures prompted the discovery of a new class of proof nets for classical and linear logic. Proof nets are not proof systems, because they cannot be checked by local inspection, but they play a crucial role in understanding the semantics of proofs, which is one of the most active research areas in proof theory, in close connection with theoretical computer science.
Designing syntax is an eminently semantic activity: the only way to avoid creating monsters is to have clear semantic objectives and guidance. The ambition of deep inference is to provide a unifying, simple syntax of proofs which, above all, reflects the meaning of proofs. The semantics of proofs depends heavily on syntax, because proofs, more than formulae, are objects that are built; they are constructions, not just statements. For lack of adequate syntax, at present we are basically unable to answer questions like `Are two given proofs the same?´ (This is the so-called Hilbert's 24th problem.) The only answers we can give right now are either trivial (and uninteresting) or excessively technical. One of the main objectives of deep inference is to provide a simple answer to this question.
Deep inference comes from linear logic and process algebras; more specifically, it comes from seeing proofs as concurrent processes. The first development was the definition of the calculus of structures and a cut elimination proof for the logic BV, which was studied as the logical counterpart of the core of the process algebra CCS. We realised that the techniques developed for BV had a much wider applicability, so we broadly developed the calculus of structures and studied its many novel normalisation properties. The initial phase of development took place in Dresden, from 1999 to 2003; now, deep inference is developed in several laboratories around the world. The recent results on modal and intuitionistic logics, proof nets and semantics, and implementations, complete the establishment of deep inference as a solid and comprehensive methodology in proof theory.
Please do not hesitate to ask questions, either by email to me, or, even better, by using our mailing list.
The following material is broad in scope; if you are new to deep inference and the calculus of structures, start here:
Nested Deduction in Logical Foundations for Computation
Abstract: In this thesis, we study different approaches to giving computational content to logical systems following the principle of deep inference, which generalise traditional formalisms (such as natural deduction and the sequent calculus) by allowing inference rules to apply at any position inside a formula. First, we follow the Curry-Howard correspondence approach, defining deep-inference systems for intuitionistic logic, notably in the calculus of structures, and describing a purely syntactic normalisation procedure. This allows the proofs of such a system to be interpreted as lambda-terms, and normalisation to be connected to reduction in a lambda-calculus with explicit substitutions. In particular, we study the extensions of the lambda-calculus that can be obtained by exploiting the peculiarities of our logical systems. Finally, we consider proof search as computation, in particular through the adaptation of the focusing technique (from linear logic) to the calculus of structures, which drastically reduces the non-determinism of proof construction. We also show how proof search in intuitionistic logic can be related to reduction in the lambda-calculus with explicit substitutions.
Pdf, 10 April 2013
PhD thesis, defended on 10 April 2013
A General View of Normalisation Through Atomic Flows
Abstract: Atomic flows are a geometric invariant of classical propositional proofs in deep inference. In this thesis we use atomic flows to describe new normal forms of proofs, of which the traditional normal forms are special cases; we also give several normalisation procedures for obtaining the normal forms. We define, and use to present our results, a new deep-inference formalism called the functorial calculus, which is more flexible than the traditional calculus of structures. To our surprise, we are able to 1) normalise proofs without looking at their logical connectives or logical rules; and 2) normalise proofs in less than exponential time.
Pdf, 12 August 2010
PhD thesis, defended on 10 November 2009
You can buy this book at Lambert Academic Publishing
Abstract: We see how nested sequents, a natural generalisation of hypersequents, allow us to develop a systematic proof theory for modal logics. As opposed to other prominent formalisms, such as the display calculus and labelled sequents, nested sequents stay inside the modal language and allow for proof systems which enjoy the subformula property in the literal sense.
In the first part we study a systematic set of nested sequent systems for all normal modal logics formed by some combination of the axioms for seriality, reflexivity, symmetry, transitivity and euclideanness. We establish soundness and completeness and some of their good properties, such as invertibility of all rules, admissibility of the structural rules, termination of proof-search, as well as syntactic cut-elimination.
In the second part we study the logic of common knowledge, a modal logic with a fixpoint modality. We look at two infinitary proof systems for this logic: an existing one based on ordinary sequents, for which no syntactic cut-elimination procedure is known, and a new one based on nested sequents. We see how nested sequents, in contrast to ordinary sequents, allow for syntactic cut-elimination and thus allow us to obtain an ordinal upper bound on the length of proofs.
Pdf, 13 April 2010
Nondeterminism and Language Design in Deep Inference
Abstract: This thesis studies the design of deep-inference deductive systems. In systems with deep inference, in contrast to traditional proof-theoretic systems, inference rules can be applied at any depth inside logical expressions. Deep applicability of inference rules provides a rich combinatorial analysis of proofs. Deep inference also makes it possible to design deductive systems that are tailored for computer science applications and otherwise provably not expressible.
By applying the inference rules deeply, logical expressions can be manipulated starting from their sub-expressions. This way, we can simulate analytic proofs in traditional deductive formalisms. Furthermore, we can also construct much shorter analytic proofs than in these other formalisms. However, deep applicability of inference rules causes much greater nondeterminism in proof construction.
This thesis attacks the problem of dealing with nondeterminism in proof search while preserving the shorter proofs that are available thanks to deep inference. By redesigning the deep-inference deductive systems, some redundant applications of the inference rules are prevented. By introducing a new technique which reduces nondeterminism, it becomes possible to obtain more immediate access to shorter proofs, without breaking certain proof-theoretical properties such as cut elimination. Different implementations presented in this thesis make it possible to experiment with the techniques that we developed and to observe the performance improvements. Within a computation-as-proof-search perspective, we use deep-inference deductive systems to develop a common proof-theoretic language for the two fields of planning and concurrency.
Pdf, 11 September 2006
PhD thesis, defended on 21 December 2006
You can buy this book at Lambert Academic Publishing
Categorical Models of First Order Classical Proofs
Abstract: This thesis introduces the notion of a classical doctrine: a semantics for proofs in first-order classical logic derived from the classical categories of Führmann and Pym, using Lawvere's notion of hyperdoctrine. We introduce a hierarchy of classes of model, increasing in the strength of the cut-reduction theory they model; the weakest captures cut reduction, and the strongest gives De Morgan duality between quantifiers as an isomorphism. Whereas classical categories admit the elimination of logical cuts as equalities (and cuts against structural rules as inequalities), classical doctrines admit certain logical cuts as inequalities only. This is a result of the additive character of the quantifier introduction rules, as is illustrated by a concrete model based on families of sets and relations, using an abstract Geometry of Interaction construction.
We establish that each class of models is sound and complete with respect to the relevant cut-reduction theory on proof nets based on those of Robinson for propositional classical logic. We show also that classical categories and classical doctrines are not only a class of models for the sequent calculus, but also for deep inference calculi due to Brünnler for classical logic. Of particular interest are the local systems for classical logic, which we show are modelled by categorical models with an additional axiom forcing monoidality of certain functors; these categorical models correspond to multiplicative presentations of the sequent calculus with additional additive features.
Pdf, 30 March 2006
PhD thesis, defended on 17 March 2006
Deep Inference and Its Normal Form of Derivations
Abstract: We see a notion of normal derivation for the calculus of structures, which is based on a factorisation of derivations and which is more general than the traditional notion of cut-free proof in this formalism.
Pdf, 29 March 2006
CiE 2006, LNCS 3988, pp. 65-74
Deep Inference and Symmetry in Classical Proofs
Abstract: In this thesis we see deductive systems for classical propositional and predicate logic which use deep inference, i.e. inference rules apply arbitrarily deep inside formulas, and a certain symmetry, which provides an involution on derivations. Like sequent systems, they have a cut rule which is admissible. Unlike sequent systems, they enjoy various new interesting properties. Not only the identity axiom, but also cut, weakening and even contraction are reducible to atomic form. This leads to inference rules that are local, meaning that the effort of applying them is bounded, and finitary, meaning that, given a conclusion, there is only a finite number of premises to choose from. The systems also enjoy new normal forms for derivations and, in the propositional case, a cut elimination procedure that is drastically simpler than the ones for sequent systems.
PhD thesis, defended on 22 September 2003, published by Logos Verlag
You can buy this book at Logos-Verlag and at Amazon
Linear Logic and Noncommutativity in the Calculus of Structures
Abstract: In this thesis I study several deductive systems for linear logic, its fragments, and some noncommutative extensions. All systems will be designed within the calculus of structures, which is a proof theoretical formalism for specifying logical systems, in the tradition of Hilbert's formalism, natural deduction, and the sequent calculus. Systems in the calculus of structures are based on two simple principles: deep inference and top-down symmetry. Together they have remarkable consequences for the properties of the logical systems. For example, for linear logic it is possible to design a deductive system, in which all rules are local. In particular, the contraction rule is reduced to an atomic version, and there is no global promotion rule. I will also show an extension of multiplicative exponential linear logic by a noncommutative, self-dual connective which is not representable in the sequent calculus. All systems enjoy the cut elimination property. Moreover, this can be proved independently from the sequent calculus via techniques that are based on the new top-down symmetry. Furthermore, for all systems, I will present several decomposition theorems which constitute a new type of normal form for derivations.
Pdf | Pdf in booklet format, 25 July 2003
PhD thesis, defended on 24 July 2003
In the rest of the section, all the papers I know of are listed according to their subject, in no particular order.
So far, for classical logic in the calculus of structures we achieved:
We can present intuitionistic logic in the calculus of structures with a fully local, cut-free system. The logic of bunched implications BI can be presented in the calculus of structures. Japaridze's cirquent calculus benefits from a deep-inference presentation, in particular in the case of propositional logic.
The following papers exist, in addition to Deep Inference and Its Normal Form of Derivations and Deep Inference and Symmetry in Classical Proofs, mentioned above:
Atomic Lambda Calculus: A Typed Lambda-Calculus with Explicit Sharing
Tom Gundersen, Willem Heijltjes and Michel Parigot
Abstract: An explicit-sharing lambda calculus is presented, based on a Curry-Howard-style interpretation of the deep-inference proof formalism. Duplication of subterms during reduction proceeds `atomically´, i.e. on individual constructors, similar to optimal graph reduction in the style of Lamping. The calculus preserves strong normalisation with respect to the lambda-calculus, and achieves fully lazy sharing.
16 January 2013
Accepted by LICS 2013
Breaking Paths in Atomic Flows for Classical Logic
Alessio Guglielmi, Tom Gundersen and Lutz Straßburger
Abstract: This work belongs to a wider effort aimed at eliminating syntactic bureaucracy from proof systems. In this paper, we present a novel cut elimination procedure for classical propositional logic. It is based on the recently introduced atomic flows: they are purely graphical devices that abstract away from much of the typical bureaucracy of proofs. We make crucial use of the path breaker, an atomic-flow construction that avoids some nasty termination problems, and that can be used in any proof system with sufficient symmetry. This paper contains an original 2-dimensional-diagram exposition of atomic flows, which helps us to connect atomic flows with other known formalisms.
Pdf, 29 April 2010
LICS 2010 Proceedings (IEEE), pp. 284-293
An Algorithmic Interpretation of a Deep Inference System
Kai Brünnler and Richard McKinley
Abstract: We set out to find something that corresponds to deep inference in the same way that the lambda-calculus corresponds to natural deduction. Starting from natural deduction for the conjunction-implication fragment of intuitionistic logic, we design a corresponding deep inference system together with reduction rules on proofs that allow a fine-grained simulation of beta-reduction.
Pdf, 22 September 2008
LPAR 08, LNCS 5330, pp. 482-496
Cirquent Calculus Deepened
Abstract: Cirquent calculus is a new proof-theoretic and semantic framework, whose main distinguishing feature is being based on circuit-style structures (called cirquents), as opposed to the more traditional approaches that deal with tree-like objects such as formulas, sequents or hypersequents. Among its advantages are greater efficiency, flexibility and expressiveness. This paper presents a detailed elaboration of a deep-inference cirquent logic, which is naturally and inherently resource conscious. It shows that classical logic, both syntactically and semantically, can be seen to be just a special, conservative fragment of this more general and, in a sense, more basic logic: the logic of resources in the form of cirquent calculus. The reader will find various arguments in favor of switching to the new framework, such as arguments showing the insufficiency of the expressive power of linear logic or other formula-based approaches to developing resource logics, exponential improvements over the traditional approaches in both representational and proof complexities offered by cirquent calculus (including the existence of polynomial size cut-, substitution- and extension-free cirquent calculus proofs for the notoriously hard pigeonhole principle), and more. Among the main purposes of this paper is to provide an introductory-style starting point for what, as the author wishes to hope, might have a chance to become a new line of research in proof theory: a proof theory based on circuits instead of formulas.
Pdf, 1 April 2008
Journal of Logic and Computation, 18 (6) 2008, pp. 983-1028
Normalisation Control in Deep Inference via Atomic Flows
Alessio Guglielmi and Tom Gundersen
Abstract: We introduce `atomic flows´: they are graphs obtained from derivations by tracing atom occurrences and forgetting the logical structure. We study simple manipulations of atomic flows that correspond to complex reductions on derivations. This allows us to prove, for propositional logic, a new and very general normalisation theorem, which contains cut elimination as a special case. We operate in deep inference, which is more general than other syntactic paradigms, and where normalisation is more difficult to control. We argue that atomic flows are a significant technical advance for normalisation theory, because 1) the technique they support is largely independent of syntax; 2) indeed, it is largely independent of logical inference rules; 3) they constitute a powerful geometric formalism, which is more intuitive than syntax.
Journal page (pdf/postscript), 31 March 2008
Logical Methods in Computer Science 4 (1:9) 2008, pp. 1-36
The Logic of Bunched Implications: A Memoir
Benjamin Robert Horsfall
Abstract: This is a study of the semantics and proof theory of the logic of bunched implications (BI), which is promoted as a logic of (computational) resources, and is a foundational component of separation logic, an approach to program analysis. BI combines an additive, or intuitionistic, fragment with a multiplicative fragment. The additive fragment has full use of the structural rules of weakening and contraction, and the multiplicative fragment has none. Thus it contains two conjunctive and two implicative connectives. At various points, we illustrate a resource view of BI based upon the Kripke resource semantics. Our first original contribution is the formulation of a proof system for BI in the newly developed proof-theoretical formalism of the calculus of structures. The calculus of structures is distinguished by its employment of deep inference, but we already see deep inference in a limited form in the established proof theory for BI. We show that our system is sound with respect to the elementary Kripke resource semantics for BI, and complete with respect to the partially-defined monoid (PDM) semantics. Our second contribution is the development from a semantic standpoint of preliminary ideas for a hybrid logic of bunched implications (HBI). We give a Kripke semantics for HBI in which nominal propositional atoms can be seen as names for resources, rather than as names for locations, as is the case with related proposals for BI-Loc and for intuitionistic hybrid logic. The cost of this approach is the loss of intuitionistic monotonicity in the semantics. But this is perhaps not such a grave loss, given that our guiding analogy is of states of models with resources, rather than with states of knowledge, as is standard for intuitionistic logic.
MSc thesis, successfully defended in August 2007
A Characterisation of Medial as Rewriting Rule
Abstract: Medial is an inference rule scheme that appears in various deductive systems based on deep inference. In this paper we investigate the properties of medial as a rewriting rule, independently from logic. We present a graph-theoretical criterion for checking whether there exists a medial rewriting path between two formulas. Finally, we return to logic and apply our criterion to give a combinatorial proof of a decomposition theorem, i.e., a proof-theoretical statement about syntax.
Pdf, 13 April 2007
RTA 2007, LNCS 4533, pp. 344-358
A Local System for Intuitionistic Logic
Abstract: This paper presents systems for first-order intuitionistic logic and several of its extensions in which all the propositional rules are local, in the sense that, in applying the rules of the system, one needs only a fixed amount of information about the logical expressions involved. The main source of non-locality is the contraction rules. We show that the contraction rules can be restricted to the atomic ones, provided we employ deep inference, i.e., allow rules to apply anywhere inside logical expressions. We further show that the use of deep inference allows for modular extensions of intuitionistic logic to Dummett's intermediate logic LC, Gödel logic and classical logic. We present the systems in the calculus of structures, a proof-theoretic formalism which supports deep inference. Cut elimination for these systems is proved indirectly, by simulating the cut-free sequent systems, or the hypersequent systems in the cases of Dummett's LC and Gödel logic, in the cut-free systems in the calculus of structures.
Locality for Classical Logic
Abstract: In this paper we will see deductive systems for classical propositional and predicate logic in the calculus of structures. Like sequent systems, they have a cut rule which is admissible. Unlike sequent systems, they drop the restriction that rules only apply to the main connective of a formula: their rules apply anywhere deeply inside a formula. This makes it possible to observe very clearly the symmetry between the identity axiom and the cut rule. This symmetry allows the cut rule to be reduced to atomic form, in a way which is dual to reducing the identity axiom to atomic form. We also reduce weakening and even contraction to atomic form. This leads to inference rules that are local: they do not require the inspection of expressions of arbitrary size.
Pdf, 10 March 2006
Notre Dame Journal of Formal Logic 47 (4), pp. 557-580, 2006
Review by Sara Negri
Note about the review: All the concerns of the reviewer are addressed in the paper Two Restrictions on Contraction (which is among the references of the reviewed paper).
Cut Elimination Inside a Deep Inference System for Classical Predicate Logic
AbstractDeep inference is a natural generalisation of the one-sided sequent calculus where rules are allowed to apply deeply inside formulas, much like rewrite rules in term rewriting. This freedom in applying inference rules makes it possible to express logical systems that are difficult or impossible to express in the cut-free sequent calculus, and it also allows for a more fine-grained analysis of derivations than the sequent calculus. However, the same freedom also makes it harder to carry out this analysis; in particular, it is harder to design cut elimination procedures. In this paper we see a cut elimination procedure for a deep inference system for classical predicate logic. As a consequence we derive Herbrand's Theorem, which we express as a factorisation of derivations.
Pdf, 9 March 2005
Studia Logica 82 (1), pp. 51–71, 2006
A First Order System with Finite Choice of Premises
Kai Brünnler and Alessio Guglielmi
AbstractWe present an inference system for classical first order logic in which each inference rule, including the cut, only has a finite set of premises to choose from. The main conceptual contribution of this paper is the possibility of separating different sources of infinite choice, which happen to be entangled in the traditional cut rule.
Pdf, 1 December 2003
Presented at First Order Logic 75 under the title A Finitary System for First Order Logic; appeared in Hendricks et al., editors, First-Order Logic Revisited, Logos Verlag, Berlin, 2004, pp. 59–74
Two Restrictions on Contraction
AbstractI show two simple limitations of sequent systems with multiplicative context treatment: contraction can be restricted neither to atoms nor to the bottom of a proof tree.
Pdf, 24 November 2003
Logic Journal of the Interest Group in Pure and Applied Logics 11 (5), pp. 525–529
Consistency Without Cut Elimination
Kai Brünnler and Alessio Guglielmi
AbstractWe show how to get consistency for first order classical logic, in a purely syntactic way, without going through cut elimination. The procedure is very simple and it also shows how finitariness is actually a triviality (contrary to what one would guess from textbooks).
Pdf, 10 September 2003. It is contained in A First Order System with Finite Choice of Premises.
Atomic Cut Elimination for Classical Logic
AbstractSystem SKS is a set of rules for classical propositional logic presented in the calculus of structures. Like sequent systems and unlike natural deduction systems, it has an explicit cut rule, which is admissible. In contrast to sequent systems, the cut rule can easily be restricted to atoms. This allows for a very simple cut elimination procedure based on plugging in parts of a proof, like normalisation in natural deduction and unlike cut elimination in the sequent calculus. It should thus be a good common starting point for investigations into both proof search as computation and proof normalisation as computation.
Pdf, 10 April 2003
CSL 2003, LNCS 2803, pp. 86–97
A Local System for Classical Logic
Kai Brünnler and Alwen Fernanto Tiu
AbstractThe calculus of structures is a framework for specifying logical systems, which is similar to the one-sided sequent calculus but more general. We present a system of inference rules for propositional classical logic in this new framework and prove cut elimination for it. The system also enjoys a decomposition theorem for derivations that is not available in the sequent calculus. The main novelty of our system is that all the rules are local: contraction, in particular, is reduced to atomic form. This should be interesting for distributed proof-search and also for complexity theory, since the computational cost of applying each rule is bounded.
Pdf, 2 October 2001. Now replaced by Locality for Classical Logic.
LPAR 2001, LNCS 2250, pp. 347–361
The basic proof complexity properties of propositional logic in the calculus of structures are known. Deep inference is as powerful as Frege systems, and more powerful than Gentzen systems, in the restriction to analytic systems.
The following papers exist:
The Pigeonhole Principle and Related Counting Arguments in Weak Monotone Systems
AbstractWe construct quasipolynomial-size proofs of the propositional pigeonhole principle for the fragment of the sequent calculus with no cuts between ancestors of left and right negation, weakening and contraction rules.
The main construction of our argument, inspired by previous work on the monotone calculus by Atserias et al., provides formal proofs that permute the inputs of formulae computing threshold functions, essentially by implementing merge sort as a template.
Since it is non-trivial to eliminate the remaining weakening/contraction cuts efficiently, our arguments are implemented in deep inference where known normalization procedures exist, although the work is mostly self-contained. This also (partially) answers previous questions raised about the size of proofs of the pigeonhole principle in certain deep inference systems.
Pdf, 28 February 2013
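For readers unfamiliar with the statement being proved, the propositional pigeonhole principle says that if n+1 pigeons each occupy one of n holes, then some hole holds two pigeons; it is a tautology, which the following Python sketch (an illustration written for this page, not code from the paper) verifies by brute force for small n:

```python
from itertools import product, combinations

def php_holds(n):
    """Check by brute force that the propositional pigeonhole principle
    PHP_n is a tautology: variable p(i, j) says pigeon i sits in hole j,
    for n+1 pigeons and n holes."""
    pigeons, holes = range(n + 1), range(n)
    for bits in product([False, True], repeat=(n + 1) * n):
        p = lambda i, j: bits[i * n + j]
        # Premiss: every pigeon sits in at least one hole.
        premiss = all(any(p(i, j) for j in holes) for i in pigeons)
        # Conclusion: some hole contains two distinct pigeons.
        conclusion = any(p(i1, j) and p(i2, j)
                         for j in holes
                         for i1, i2 in combinations(pigeons, 2))
        if premiss and not conclusion:
            return False  # counter-assignment found: not a tautology
    return True

print(php_holds(2))  # True: PHP_2 is a tautology
```

Proofs of PHP_n in weak systems grow quickly with n, which is what makes it a standard benchmark in proof complexity.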
Rewriting with Linear Inferences in Propositional Logic
AbstractLinear inferences are sound implications of propositional logic where each variable appears exactly once in the premiss and conclusion. We consider a specific set of these inferences, MS, first studied by Straßburger, corresponding to the logical rules in deep inference proof theory. Despite previous results characterising the individual rules of MS, we show that there is no polynomial-time characterisation of MS, assuming that integers cannot be factorised in polynomial time.
We also examine the length of rewrite paths in MS, utilising a notion of trivialisation to reduce the case with units to the case without, amongst other observations on MS-rewriting and the set of linear inferences in general.
Pdf, 15 February 2013
Accepted by RTA 2013
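The rules in question are the switch and medial inferences: in each, every variable occurs exactly once in the premiss and once in the conclusion. Their soundness as Boolean implications can be confirmed by brute force, as in this Python sketch (an illustration written for this page, not code from the paper):

```python
from itertools import product

def implies(premiss, conclusion, nvars):
    """Check that premiss -> conclusion holds under every assignment."""
    return all((not premiss(*v)) or conclusion(*v)
               for v in product([False, True], repeat=nvars))

# Switch: A and (B or C)  implies  (A and B) or C
switch_ok = implies(lambda a, b, c: a and (b or c),
                    lambda a, b, c: (a and b) or c, 3)

# Medial: (A and B) or (C and D)  implies  (A or C) and (B or D)
medial_ok = implies(lambda a, b, c, d: (a and b) or (c and d),
                    lambda a, b, c, d: (a or c) and (b or d), 4)

print(switch_ok, medial_ok)  # True True: both linear inferences are sound
```

Checking a single candidate inference this way takes exponential time in the number of variables; the paper's point is that no polynomial-time characterisation of derivability in MS exists under the stated assumption.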
Extension without Cut
AbstractIn proof theory one distinguishes sequent proofs with cut and cut-free sequent proofs, while for proof complexity one distinguishes Frege systems and extended Frege systems. In this paper we show how deep inference can provide a uniform treatment for both classifications, such that we can define cut-free systems with extension, which is possible neither with Frege systems nor with the sequent calculus. We show that the propositional pigeonhole principle admits polynomial-size proofs in a cut-free system with extension. We also define cut-free systems with substitution and show that the system with extension p-simulates the system with substitution.
Pdf, 31 July 2012
Annals of Pure and Applied Logic 163(12) 2012, pp. 1995–2007
Some Results on the Relative Proof Complexity of Deep Inference Via Atomic Flows
AbstractWe consider the proof complexity of the minimal complete fragment of standard deep inference, denoted KS. To examine the size of proofs we employ atomic flows, diagrams that trace structural changes through a proof but ignore logical information. As results we obtain a polynomial simulation of dag-like cut-free Gentzen and Resolution, along with some extensions. We also show that these systems, as well as bounded-depth Frege systems, cannot polynomially simulate KS, by giving polynomial-size proofs of certain variants of the propositional pigeonhole principle in KS.
On the Proof Complexity of Cut-Free Bounded Deep Inference
AbstractIt has recently been shown that cut-free deep inference systems exhibit an exponential speed-up over cut-free sequent systems, in terms of proof size. While this is good for proof complexity, there remains the problem of typically high proof search non-determinism induced by the deep inference methodology: the higher the depth of inference, the higher the non-determinism. In this work we improve on the proof search side by demonstrating that, for propositional logic, the same exponential speed-up in proof size can be obtained in bounded-depth cut-free systems. These systems retain the top-down symmetry of deep inference, but can otherwise be designed at the same depth level of sequent systems. As a result, the non-determinism arising from the choice of rules at each stage of a proof is smaller than that of unbounded deep inference, while still giving access to the short proofs of deep inference.
Pdf, 25 April 2011
Tableaux 2011, LNCS 6793, pp. 134–148
A Quasipolynomial Normalisation Procedure in Deep Inference
Paola Bruscoli, Alessio Guglielmi, Tom Gundersen and Michel Parigot
AbstractJeřábek showed in 2008 that cuts in propositional-logic deep-inference proofs can be eliminated in quasipolynomial time. The proof is an indirect one relying on a result of Atserias, Galesi and Pudlák about monotone sequent calculus and a correspondence between this system and cut-free deep-inference proofs. In this paper we give a direct proof of Jeřábek’s result: we give a quasipolynomial-time cut-elimination procedure in propositional-logic deep inference. The main new ingredient is the use of a computational trace of deep-inference proofs called atomic flows, which are both very simple (they trace only structural rules and forget logical rules) and strong enough to faithfully represent the cut-elimination procedure. We also show how the technique can be extended to obtain a more general notion of normalisation called streamlining.
On the Proof Complexity of Deep Inference
Paola Bruscoli and Alessio Guglielmi
AbstractWe obtain two results about the proof complexity of deep inference: 1) deep-inference proof systems are as powerful as Frege ones, even when both are extended with the Tseitin extension rule or with the substitution rule; 2) there are analytic deep-inference proof systems that exhibit an exponential speed-up over analytic Gentzen proof systems that they polynomially simulate.
Pdf, 19 April 2009
ACM Transactions on Computational Logic 10 (2:14) 2009, pp. 1–34
Proof Complexity of the Cut-Free Calculus of Structures
AbstractWe investigate the proof complexity of analytic subsystems of the deep inference proof system SKSg (the calculus of structures). Exploiting the fact that the cut rule (i↑) of SKSg corresponds to the ¬-left rule in the sequent calculus, we establish that the “analytic” system KSg + c↑ has essentially the same complexity as the monotone Gentzen calculus MLK. In particular, KSg + c↑ quasipolynomially simulates SKSg, and admits polynomial-size proofs of some variants of the pigeonhole principle.
Pdf, 30 April 2008
Journal of Logic and Computation 19 (2) 2009, pp. 323–339
A new formalism called 'nested sequents' has been defined, which is especially suitable for modal logics.
The following papers exist, in addition to Nested Sequents, mentioned above:
Cut Elimination in Nested Sequents for Intuitionistic Modal Logics
AbstractWe present cut-free deductive systems without labels for the intuitionistic variants of the modal logics obtained by extending IK with a subset of the axioms d, t, b, 4, and 5. For this, we use the formalism of nested sequents, which allows us to give a uniform cut elimination argument for all 15 logics in the intuitionistic S5 cube.
Pdf, 7 January 2013
Accepted by FoSSaCS 2013
Labelled Tree Sequents, Tree Hypersequents and Nested (Deep) Sequents
Rajeev Goré and Revantha Ramanayake
AbstractWe identify a subclass of labelled sequents called "labelled tree sequents" and show that these are notational variants of tree-hypersequents in the sense that a sequent of one type can be represented naturally as a sequent of the other type. This relationship can be extended to nested (deep) sequents using the relationship between tree-hypersequents and nested (deep) sequents, which we also show. We apply this result to transfer proof-theoretic results such as syntactic cut-admissibility between the tree-hypersequent calculus CSGL and the labelled sequent calculus G3GL for provability logic GL. This answers in full a question posed by Poggiolesi about the exact relationship between these calculi. Our results pave the way to obtain cut-free tree-hypersequent and nested (deep) sequent calculi for large classes of logics using the known calculi for labelled sequents, and also to obtain a large class of labelled sequent calculi for bi-intuitionistic tense logics from the known nested (deep) sequent calculi for these logics. Importing proof-theoretic results between notational variant systems in this manner alleviates the need for independent proofs in each system. Identifying which labelled systems can be rewritten as labelled tree sequent systems may provide a method for determining the expressive limits of the nested sequent formalism.
Pdf, 5 October 2012
Advances in Modal Logic 2012, pp. 279–299
Nested Sequent Calculi for Conditional Logics
Régis Alenda, Nicola Olivetti and Gian Luca Pozzato
AbstractNested sequent calculi are a useful generalization of ordinary sequent calculi, where sequents are allowed to occur within sequents. Nested sequent calculi have been profitably employed in the area of (multi)-modal logic to obtain analytic and modular proof systems for these logics. In this work, we extend the realm of nested sequents by providing nested sequent calculi for the basic conditional logic CK and some of its significant extensions. The calculi are internal (a sequent can be directly translated into a formula), cut-free and analytic. Moreover, they can be used to design (sometimes optimal) decision procedures for the respective logics, and to obtain complexity upper bounds. Our calculi are an argument in favour of nested sequent calculi for modal logics and alike, showing their versatility and power.
Pdf, 16 July 2012
JELIA 2012, LNCS 7519, pp. 14–27
Nested Sequents for Intuitionistic Logics
AbstractNested sequent systems for modal logics were introduced by Kai Brünnler, and have come to be seen as an attractive deep reasoning extension of familiar sequent calculi. In an earlier paper I showed there was a connection between modal nested sequents and modal prefixed tableaus. In this paper I extend the nested sequent machinery to intuitionistic logic, both standard and constant domain, and relate the resulting sequent calculi to intuitionistic prefixed tableaus. Modal nested sequent machinery generalizes one-sided sequent calculi; the present work similarly generalizes two-sided sequents. It is noteworthy that the resulting system for constant domain intuitionistic logic is particularly simple. It amounts to a combination of intuitionistic propositional rules and classical quantifier rules, a combination that is known to be inadequate when conventional intuitionistic sequent systems are used.
Pdf, 8 July 2012
Grammar Logics in Nested Sequent Calculus: Proof Theory and Decision Procedures
Alwen Tiu and Egor Ianovski and Rajeev Goré
AbstractA grammar logic refers to an extension of the multi-modal logic K in which the modal axioms are generated from a formal grammar. We consider a proof theory, in nested sequent calculus, of grammar logics with converse, i.e., every modal operator [a] comes with a converse [ā]. Extending previous works on nested sequent systems for tense logics, we show that all grammar logics (with or without converse) can be formalised in nested sequent calculi, where the axioms are internalised in the calculi as structural rules. Syntactic cut-elimination for these calculi is proved using a procedure similar to that for display logics. If the grammar is context-free, then one can get rid of all structural rules, in favor of deep inference and additional propagation rules. We give a novel semi-decision procedure for context-free grammar logics, using nested sequent calculus with deep inference, and show that, in the case where the given context-free grammar is regular, this procedure terminates. Unlike all other existing decision procedures for regular grammar logics in the literature, our procedure does not assume that a finite state automaton encoding the axioms is given.
Pdf, 15 June 2012
Advances in Modal Logic 2012, pp. 516–537
Syntactic Cut-Elimination for a Fragment of the Modal Mu-Calculus
Kai Brünnler and Thomas Studer
AbstractFor some modal fixed point logics, there are deductive systems that enjoy syntactic cut elimination. An early example is the system in Pliuskevicius (1991) for LTL. More recent examples are the systems by the authors of this paper for the logic of common knowledge (Brünnler and Studer, 2009) and by Hill and Poggiolesi (2010) for PDL, which are based on a form of deep inference. These logics can be seen as fragments of the modal mu-calculus. Here we are interested in how far this approach can be pushed in general. To this end, we introduce a nested sequent system with syntactic cut-elimination which is incomplete for the modal mu-calculus, but complete with respect to a restricted language that allows only fixed points of a certain form. This restricted language includes the logic of common knowledge and PDL. There is also a traditional sequent system for the modal mu-calculus by Jäger et al. (2008), without a syntactic cut-elimination procedure. We embed that system into ours and vice versa, thus establishing cut-elimination also for the shallow system, when only the restricted language is considered.
Pdf, 14 May 2012
Annals of Pure and Applied Logic 163(12), pp. 1838–1853, 2012
Proving Completeness for Nested Sequent Calculi
AbstractProving the completeness of classical propositional logic by using maximal consistent sets is perhaps the most common method there is, going back to Lindenbaum (though not actually published by him). It has been extended to a variety of logical formalisms, sometimes combined with the addition of Henkin constants to handle quantifiers. Recently a deep-reasoning formalism called nested sequents has been introduced by Kai Brünnler, able to handle a larger variety of modal logics than are possible with standard Gentzen-type sequent calculi. In this paper we sketch how yet another variation on the maximality method of Lindenbaum allows one to prove completeness for nested sequent calculi. It is certainly not the only method available, but it should be entered into the record as one more useful tool available to the logician.
Pdf, 4 October 2011
Logic without Frontiers, College Publications, pp. 145–154, 2011
Prefixed Tableaus and Nested Sequents
AbstractNested sequent systems for modal logics are a relatively recent development, within the general area known as deep reasoning. The idea of deep reasoning is to create systems within which one operates at lower levels in formulas than just those involving the main connective or operator. Prefixed tableaus go back to 1972, and are modal tableau systems with extra machinery to represent accessibility in a purely syntactic way. We show that modal nested sequents and prefixed modal tableaus are notational variants of each other, roughly in the same way that Gentzen sequent calculi and tableaus are notational variants. This immediately gives rise to new modal nested sequent systems which may be of independent interest. We discuss some of these, including those for some justification logics that include standard modal operators.
Pdf, 14 September 2011
Annals of Pure and Applied Logic 163(3), pp. 291–313, 2012
On the Correspondence Between Display Postulates and Deep Inference in Nested Sequent Calculi for Tense Logics
Rajeev Goré, Linda Postniece and Alwen Tiu
AbstractWe consider two styles of proof calculi for a family of tense logics, presented in a formalism based on nested sequents. A nested sequent can be seen as a tree of traditional single-sided sequents. Our first style of calculi is what we call “shallow calculi”, where inference rules are only applied at the root node in a nested sequent. Our shallow calculi are extensions of Kashima’s calculus for tense logic and share an essential characteristic with display calculi, namely, the presence of structural rules called “display postulates”. Shallow calculi enjoy a simple cut elimination procedure, but are unsuitable for proof search due to the presence of display postulates and other structural rules. The second style of calculi uses deep-inference, whereby inference rules can be applied at any node in a nested sequent. We show that, for a range of extensions of tense logic, the two styles of calculi are equivalent, and there is a natural proof theoretic correspondence between display postulates and deep inference. The deep inference calculi enjoy the subformula property and have no display postulates or other structural rules, making them a better framework for proof search.
Proof Theory and Proof Search of Bi-Intuitionistic and Tense Logic
AbstractIn this thesis, we consider bi-intuitionistic logic and tense logic, as well as the combined bi-intuitionistic tense logic. Each of these logics contains a pair of dual connectives, for example, Rauszer's bi-intuitionistic logic contains intuitionistic implication and dual intuitionistic exclusion. The interaction between these dual connectives makes it non-trivial to develop a cut-free sequent calculus for these logics.
In the first part of this thesis we develop a new extended sequent calculus for bi-intuitionistic logic using a framework of derivations and refutations. This is the first purely syntactic cut-free sequent calculus for bi-intuitionistic logic and thus solves an open problem. Our calculus is sound, semantically complete and allows terminating backward proof search, hence giving rise to a decision procedure for bi-intuitionistic logic.
In the second part of this thesis we consider the broader problem of taming proof search in display calculi, using bi-intuitionistic logic and tense logic as case studies. While the generality of display calculi makes it an excellent framework for designing sequent calculi for logics where traditional sequent calculi fail, this generality also leads to a large degree of non-determinism, which is problematic for backward proof-search. We control this non-determinism in two ways:
1. First, we limit the structural connectives used in the calculi and consequently, the number of display postulates. Specifically, we work with nested structures which can be viewed as a tree of traditional Gentzen sequents, called nested sequents, which have been used previously by Kashima and, independently, by Brünnler and Straßburger and Poggiolesi to present several modal and tense logics.
2. Second, since residuation rules are largely responsible for the difficulty in finding a proof search procedure for display-like calculi, we show how to eliminate these residuation rules using deep inference in nested sequents.
Finally, we study the combined bi-intuitionistic tense logic, which contains the well-known intuitionistic modal logic as a sublogic. We give a nested sequent calculus for bi-intuitionistic tense logic that has cut-elimination, and a derived deep inference nested sequent calculus that is complete with respect to the first calculus and where contraction and residuation rules are admissible. We also show how our calculi can capture Simpson's intuitionistic modal logic and Ewald's intuitionistic tense logic.
Pdf, 21 November 2010
PhD thesis, defended in March 2011
How to Universally Close the Existential Rule
AbstractThis paper introduces a nested sequent system for predicate logic. The system features a structural universal quantifier and a universally closed existential rule. One nice consequence of this is that proofs of sentences cannot contain free variables. Another nice consequence is that the assumption of a non-empty domain is isolated in a single inference rule. This rule can be removed or added at will, leading to a system for free logic or classical predicate logic, respectively. The system for free logic is interesting because it has no need for an existence predicate. We see syntactic cut-elimination and completeness results for these two systems as well as two standard applications: Herbrand's Theorem and interpolation.
Pdf (long version), 31 July 2010
LPAR-17, LNCS 6397, pp. 172–186
Cut-Elimination and Proof Search for Bi-Intuitionistic Tense Logic
Rajeev Goré, Linda Postniece and Alwen Tiu
AbstractWe consider an extension of bi-intuitionistic logic with the traditional modalities from tense logic Kt. Proof theoretically, this extension is obtained simply by extending an existing sequent calculus for bi-intuitionistic logic with typical inference rules for the modalities used in display logics. As it turns out, the resulting calculus, LBiKt, seems to be more basic than most intuitionistic tense or modal logics considered in the literature, in particular, those studied by Ewald and Simpson, as it does not assume any a priori relationship between the diamond and the box modal operators. We recover Ewald's intuitionistic tense logic and Simpson's intuitionistic modal logic by modularly extending LBiKt with additional structural rules. The calculus LBiKt is formulated in a variant of display calculus, using a form of sequents called nested sequents. Cut elimination is proved for LBiKt, using a technique similar to that used in display calculi. As in display calculi, the inference rules of LBiKt are "shallow" rules, in the sense that they act on top-level formulae in a nested sequent. The calculus LBiKt is ill-suited for backward proof search due to the presence of certain structural rules called "display postulates" and the contraction rules on arbitrary structures. We show that these structural rules can be made redundant in another calculus, DBiKt, which uses deep inference, allowing one to apply inference rules at an arbitrary depth in a nested sequent. We prove the equivalence between LBiKt and DBiKt and outline a proof search strategy for DBiKt. We also give a Kripke semantics and prove that LBiKt is sound with respect to the semantics, but completeness is still an open problem. We then discuss various extensions of LBiKt.
Pdf, 29 June 2010
Advances in Modal Logic 2010, pp. 156–177
A Syntactic Realization Theorem for Justification Logics
Kai Brünnler, Remo Goetschi and Roman Kuznets
AbstractJustification logics are refinements of modal logics where modalities are replaced by justification terms. They are connected to modal logics via so-called realization theorems. We present a syntactic proof of a single realization theorem which uniformly connects all the normal modal logics formed from the axioms d, t, b, 4, and 5 with their justification counterparts. The proof employs cut-free nested sequent systems together with Fitting's realization merging technique. We further strengthen the realization theorem for KB5 and S5 by showing that the positive introspection operator is superfluous.
Pdf, 3 April 2010
Advances in Modal Logic 2010, pp. 39–58
A Note on Uniform Interpolation Proofs in Modal Deep Inference Calculi
AbstractThis paper answers one rather particular question: how to perform a proof of uniform interpolation property in deep inference calculi for modal logics. We show how to perform a proof of uniform interpolation property in deep inference calculus for the basic modal logic K via forgetting a variable in a certain normal form constructed by backward proof search. For that purpose we modify the framework of deep inference calculus using a cover modality on the meta level to structure deep sequents.
21 September 2009
TbiLLC 2009, LNAI 6618, pp. 30–45
Deep Inference in Bi-intuitionistic Logic
AbstractBi-intuitionistic logic is the extension of intuitionistic logic with exclusion, a connective dual to implication. Cut-elimination in bi-intuitionistic logic is complicated due to the interaction between these two connectives, and various extended sequent calculi, including a display calculus, have been proposed to address this problem.
In this paper, we present a new extended sequent calculus DBiInt for bi-intuitionistic logic which uses nested sequents and “deep inference”, i.e., inference rules can be applied at any level in the nested sequent. We show that DBiInt can simulate our previous “shallow” sequent calculus LBiInt. In particular, we show that deep inference can simulate the residuation rules in the display-like shallow calculus LBiInt. We also consider proof search and give a simple restriction of DBiInt which allows terminating proof search. Thus our work is another step towards addressing the broader problem of proof search in display logic.
Pdf, 7 June 2009
WoLLIC 2009, LNCS 5514, pp. 320–334
Modular Sequent Systems for Modal Logic
Kai Brünnler and Lutz Straßburger
AbstractWe see cut-free sequent systems for the basic normal modal logics formed by any combination of the axioms d, t, b, 4, 5. These systems are modular in the sense that each axiom has a corresponding rule and each combination of these rules is complete for the corresponding frame conditions. The systems are based on nested sequents, a natural generalisation of hypersequents. Nested sequents stay inside the modal language, as opposed to both the display calculus and labelled sequents. The completeness proof is via syntactic cut elimination.
Pdf, 23 April 2009
Tableaux 2009, LNCS 5607, pp. 152–166
Attention: this paper contains some serious mistakes. The authors are working on a corrected version.
Syntactic Cut-Elimination for Common Knowledge
Kai Brünnler and Thomas Studer
AbstractWe first look at an existing infinitary sequent system for common knowledge for which there is no known syntactic cut-elimination procedure and also no known non-trivial bound on the proof-depth. We then present another infinitary sequent system based on nested sequents that are essentially trees and with inference rules that apply deeply inside these trees. Thus we call this system "deep" while we call the former system "shallow". In contrast to the shallow system, the deep system allows one to give a straightforward syntactic cut-elimination procedure. Since both systems can be embedded into each other, this also yields a syntactic cut-elimination procedure for the shallow system. For both systems we thus obtain an upper bound of φ20 on the depth of proofs, where φ is the Veblen function.
Cut-Elimination and Proof-Search for Bi-Intuitionistic Logic Using Nested Sequents
Rajeev Goré, Linda Postniece and Alwen Tiu
AbstractWe propose a new sequent calculus for bi-intuitionistic logic which sits somewhere between display calculi and traditional sequent calculi by using nested sequents. Our calculus enjoys a simple (purely syntactic) cut-elimination proof as do display calculi. But it has an easily derivable variant calculus which is amenable to automated proof search as are (some) traditional sequent calculi. We first present the initial calculus and its cut-elimination proof. We then present the derived calculus, and then present a proof-search strategy which allows it to be used for automated proof search. We prove that this search strategy is terminating and complete by showing how it can be used to mimic derivations obtained from an existing calculus GBiInt for bi-intuitionistic logic. As far as we know, our new calculus is the first sequent calculus for bi-intuitionistic logic which uses no semantic additions like labels, which has a purely syntactic cut-elimination proof, and which can be used naturally for backwards proof-search.
Pdf, 15 September 2008
Advances in Modal Logic 2008, pp. 43–66
Deep Sequent Systems for Modal Logic
AbstractWe see a systematic set of cut-free axiomatisations for all the basic normal modal logics formed by some combination of the axioms d, t, b, 4, 5. They employ a form of deep inference but otherwise stay very close to Gentzen’s sequent calculus, in particular they enjoy a subformula property in the literal sense. No semantic notions are used inside the proof systems, in particular there is no use of labels. All their rules are invertible and the rules cut, weakening and contraction are admissible. All systems admit a straightforward terminating proof search procedure as well as a syntactic cut elimination procedure.
We can present systematically several normal propositional modal logics, including S5, B and K5, for which cut elimination is proved. We have also investigated geometric theories, some of which we expressed in the calculus of structures.
The following papers exist:
Deep Inference for Hybrid Logic
AbstractThis paper describes work in progress on using deep inference for designing a deductive system for hybrid logic. We will see a cut-free system and prove its soundness and completeness. An immediate observation about the system is that there is no need for additional rewrite rules as in Blackburn’s tableaux, nor substitution rules as in Seligman’s sequent system.
Pdf15 May 2007
Proceedings of International Workshop on Hybrid Logic 2007 (HyLo 2007), pp. 13–22
Classical Modal Display Logic in the Calculus of Structures and Minimal and Cut-free Deep Inference Calculi for S5
Rajeev Goré and Alwen Tiu
AbstractWe begin by showing how to faithfully encode the Classical Modal Display Logic (CMDL) of Wansing into the Calculus of Structures (CoS) of Guglielmi. Since every CMDL calculus enjoys cut-elimination, we obtain a cut-elimination theorem for all corresponding CoS calculi. We then show how our result leads to a minimal cut-free CoS calculus for modal logic S5. As far as we know, no other existing CoS calculi for S5 enjoy both these properties simultaneously.
Pdf4 May 2007
Journal of Logic and Computation 17 (4) 2007, pp. 767–794
A Deep Inference System for the Modal Logic S5
AbstractWe present a cut-admissible system for the modal logic S5 in a framework that makes explicit and intensive use of deep inference. Deep inference is induced by the methods applied so far in conceptually pure systems for this logic. Thus, the formulation of a system in such a framework is an evolutionary process and leads to positive proof-theoretical results. The system enjoys systematicity and modularity, two important properties sought in modal systems. Furthermore, it enjoys a simple and direct design: the rules are few and the modal rules are in exact correspondence with the modal axioms.
Pdf1 March 2006
Studia Logica 85 (2) 2007, pp. 199–214
Purity Through Unravelling
Robert Hein and Charles Stewart
AbstractWe divide attempts to give the structural proof theory of modal logics into two kinds, those pure formulations whose inference rules characterise modality completely by means of manipulations of boxes and diamonds, and those labelled formulations that leverage the use of labels in giving inference rules. The widespread adoption of labelled formulations is driven by their ability to model features of the model theory of modal logic in its proof theory.
We describe here an approach to the structural proof theory of modal logic that aims to bring under one roof the benefits of both the pure and the labelled formulations. We introduce two proof calculi, one labelled sequent formulation and one pure formulation in the calculus of structures, that are shown to be in a systematic correlation, where the latter calculus uses deep inference with shaped modal rules to capture in a pure manner the manipulations that the former calculus mediates through the use of labels.
We situate this work within a larger investigation into the proof theory of modal logic that solves problems that existed with the earlier investigation based on prefix modal rules. We hold that this development provides yet stronger evidence justifying the claim that good, pure proof theory for modal logic needs deep inference.
Pdf25 June 2005
Proceedings of Structures and Deduction '05, pp. 126–143
Geometric Theories and Modal Logic in the Calculus of Structures
AbstractMuch of the success of modal logic can be attributed to the adoption of relational semantics. Consequently, modal logic is seen as a logic of relational structures, where logical axioms correspond to structural properties. Alex Simpson, in his 1993 PhD thesis, introduced a labelled proof theory for modal logic that allows cut-elimination for a class of modal logics characterised by so-called geometric theories. This includes important and well-known logics such as M, B, S4 and S5. This thesis tries to build a bridge between Simpson's result and purely symbolic proof theory. We introduce a method to characterise frame relational properties by means of deep inference in the Calculus of Structures. The results are only partial. Only what we call 3/4-Scott-Lemmon logics are characterised, and we only give plausible reasons, rather than a proof, that the cut-elimination argument can be transferred too.
Gzipped postscript20 March 2005
MSc thesis, successfully defended on 23 March 2005
The Design of Modal Proof Theories: The Case of S5
AbstractThe sequent calculus does not seem to be capable of supporting cut-admissible formulations for S5. Through a survey of existing cut-admissible systems for this logic, we investigate the solutions proposed to overcome this defect. Accordingly, the systems can be divided into two categories: those which allow semantic-oriented formulae and those which allow formulae in positions not reachable by the usual systems in the sequent calculus. The first solution is not desirable because it is conceptually impure, that is, these systems express concepts of frame semantics in the language of the logic.
Consequently, we focus on the systems of the second group, for which we define notions related to deep inference (the ability to apply rules deep inside structures) as well as other desirable properties good systems should enjoy. We classify these systems accordingly and examine how these properties are affected in the presence of deep inference. Finally, we present a cut-admissible system for S5 in a formalism which makes explicit use of deep inference, the calculus of structures, and give reasons for its effectiveness in providing good modal formulations.
Pdf20 October 2004
MSc thesis, successfully defended on 27 October 2004
A Systematic Proof Theory for Several Modal Logics
Charles Stewart and Phiniki Stouppa
AbstractThe family of normal propositional modal logic systems is given a very systematic organisation by their model theory. This model theory is generally given using frame semantics, and it is systematic in the sense that for the most important systems we have a clean, exact correspondence between their constitutive axioms as they are usually given in a Hilbert-Lewis style and conditions on the accessibility relation on frames.
By contrast, the usual structural proof theory of modal logic, as given in Gentzen systems, is ad-hoc. While we can formulate several modal logics in the sequent calculus that enjoy cut-elimination, their formalisation arises through system-by-system fine tuning to ensure that the cut-elimination holds, and the correspondence to the formulation in the Hilbert-Lewis systems becomes opaque.
This paper introduces a systematic presentation for the systems K, D, M, S4, and S5 in the calculus of structures, a structural proof theory that employs deep inference. Because of this, we are able to axiomatise the modal logics in a manner directly analogous to the Hilbert-Lewis axiomatisation. We show that the calculus possesses a cut-elimination property directly analogous to cut-elimination for the sequent calculus for these systems, and we discuss the extension to several other modal logics.
Postscript2 September 2004
Advances in Modal Logic 2004, pp. 309–333
Linear logic enjoys presentations in deep inference that obtain the expected properties of locality, with rather astounding decomposition theorems and the usual, general normalisation results.
The following papers exist, in addition to Linear Logic and Noncommutativity in the Calculus of Structures, mentioned above:
Interaction and Depth Against Nondeterminism in Proof Search
AbstractDeep inference is a proof-theoretic methodology that generalises the traditional notion of inference of the sequent calculus: inference rules can be applied at any depth inside logical expressions. Deep inference provides more freedom in the design of deductive systems for different logics and a rich combinatorial analysis of proofs. In particular, the construction of exponentially shorter analytic proofs becomes possible, but at the cost of greater nondeterminism than in the sequent calculus. In this paper, we extend our previous work on proof search with deep inference deductive systems. We show that, by exploiting an interaction and depth scheme in the logical expressions, the nondeterminism in proof search can be reduced without losing the shorter proofs and without sacrificing proof-theoretic cleanliness. We demonstrate this on deep inference systems for multiplicative linear logic and classical logic.
Pdf19 March 2013
Interana: interactive multiplicative linear logic prover
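The extra nondeterminism that deep applicability brings to proof search can be made concrete with a toy count (an illustrative Python sketch under an assumed tuple encoding, unrelated to the actual systems above): with shallow inference a rule can fire only at the root of an expression, whereas with deep inference it can fire at every matching subterm, so the number of available rule instances grows with the size of the formula.

```python
# Toy illustration (assumed nested-tuple encoding, not the authors' system):
# in a shallow formalism a rule fires only at the root of an expression,
# while with deep inference it fires at every matching subterm, so the
# branching factor of naive proof search grows with formula size.

def subterms(f):
    """Yield every subterm of a formula given as nested tuples."""
    yield f
    if isinstance(f, tuple):
        for child in f[1:]:
            yield from subterms(child)

def branching(f, applies_at):
    """Number of available rule instances under deep application."""
    return sum(1 for g in subterms(f) if applies_at(g))

# pretend rule: applicable at any binary 'par' node
applies = lambda g: isinstance(g, tuple) and g[0] == 'par'

f = ('par', ('ten', ('par', 'a', 'b'), ('par', 'c', 'd')), 'e')
print(branching(f, applies))   # deep: 3 (every 'par' subterm)
print(1 if applies(f) else 0)  # shallow: 1 (root only)
```

Reducing this gap without losing the short proofs is exactly the design problem the papers in this section address.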
The Focused Calculus of Structures
Kaustuv Chaudhuri, Nicolas Guenot and Lutz Straßburger
AbstractThe focusing theorem identifies a complete class of sequent proofs that have no inessential nondeterministic choices and restrict the essential choices to a particular normal form. Focused proofs are therefore well suited both for the search and for the representation of sequent proofs. The calculus of structures is a proof formalism that allows rules to be applied deep inside a formula. Through this freedom it can be used to give analytic proof systems for a wider variety of logics than the sequent calculus, but standard presentations of this calculus are too permissive, allowing too many proofs. In order to make it more amenable to proof search, we transplant the focusing theorem from the sequent calculus to the calculus of structures. The key technical contribution is an incremental treatment of focusing that avoids trivializing the calculus of structures. We give a direct inductive proof of the completeness of the focused calculus of structures with respect to a more standard unfocused form. We also show that any focused sequent proof can be represented in the focused calculus of structures, and, conversely, any proof in the focused calculus of structures corresponds to a focused sequent proof.
Pdf25 August 2011
CSL 2011, LIPIcs 12, pp. 159–173
Focused Proof Search for Linear Logic in the Calculus of Structures
AbstractThe proof-theoretic approach to logic programming has benefited from the introduction of focused proof systems, through the non-determinism reduction and control they provide when searching for proofs in the sequent calculus. However, this technique was not available in the calculus of structures, known for inducing even more non-determinism than other logical formalisms. This work in progress aims at translating the notion of focusing into the presentation of linear logic in this setting, and at using some of its specific features, such as deep application of rules and fine granularity, in order to improve proof search procedures. The starting point for this research line is the multiplicative fragment of linear logic, for which a simple focused proof system can be built.
Pdf23 June 2010
ICLP 2010, technical communication, LIPIcs 7, pp. 84–93
Some Observations on the Proof Theory of Second Order Propositional Multiplicative Linear Logic
AbstractWe investigate the question of what constitutes a proof when quantifiers and multiplicative units are both present. On the technical level this paper provides two new aspects of the proof theory of MLL2 with units. First, we give a novel proof system in the framework of the calculus of structures. The main feature of the new system is the consequent use of deep inference, which allows us to observe a decomposition which is a version of Herbrand’s theorem that is not visible in the sequent calculus. Second, we show a new notion of proof nets which is independent of any deductive system. We have “sequentialisation” into the calculus of structures as well as into the sequent calculus. Since cut elimination is terminating and confluent, we have a category of MLL2 proof nets. The treatment of the units is such that this category is star-autonomous.
Pdf10 April 2009
TLCA 2009, LNCS 5608, pp. 309–324
Pdf with proofs in appendix
MELL in the Calculus of Structures
AbstractThe calculus of structures is a new proof-theoretical framework, like natural deduction, the sequent calculus and proof nets, for specifying logical systems syntactically. In a rule in the calculus of structures, the premise as well as the conclusion are structures, which are expressions that share properties of formulae and sequents. In this paper, I study a system for MELL, the multiplicative exponential fragment of linear logic, in the calculus of structures. It has the following features: a local promotion rule, no non-deterministic splitting of the context in the times rule and a modular proof for the cut-elimination theorem. Further, derivations have a new property, called decomposition, that cannot be observed in any other known proof-theoretical framework.
Pdf24 November 2003
Theoretical Computer Science 309, pp. 213–285
A Local System for Linear Logic
AbstractIn this paper I will present a deductive system for linear logic, in which all rules are local. In particular, the contraction rule is reduced to an atomic version, and there is no global promotion rule. In order to achieve this, it is necessary to depart from the sequent calculus to the calculus of structures, which is a generalization of the one-sided sequent calculus. In a rule, premise and conclusion are not sequents, but structures, which are expressions that share properties of formulae and sequents.
We conservatively extend mixed multiplicative and multiplicative exponential linear logic with a self-dual non-commutative operator. The systems so obtained cannot be presented in the sequent calculus, but they enjoy the usual properties of locality, decomposition and cut elimination available in the calculus of structures. We can present Yetter's cyclic linear logic in the calculus of structures and prove cut elimination; interestingly, cyclicity is naturally subsumed by deep inference. New, purely proof-theoretical, techniques are developed for reducing the non-determinism in the calculus of structures.
The following papers exist, in addition to Linear Logic and Noncommutativity in the Calculus of Structures, mentioned above:
Communication, and Concurrency with Logic-Based Restriction Inside a Calculus of Structures
AbstractIt is well known that we can use structural proof theory to refine, or generalize, existing paradigmatic computational primitives, or to discover new ones. Under such a point of view we keep developing a programme whose goal is establishing a correspondence between proof-search of a logical system and computations in a process algebra. We give a purely logical account of a process algebra operation which strictly includes the behavior of restriction on actions that we find in Milner's CCS. This is possible inside a logical system in the Calculus of Structures of Deep Inference endowed with a self-dual quantifier. Using proof-search of cut-free proofs of such a logical system, we show how to solve reachability problems in a process algebra that subsumes a significant fragment of Milner's CCS.
Pdf19 December 2012
Extending a System in the Calculus of Structures with a Self-Dual Quantifier
AbstractWe recall that SBV, a proof system developed under the methodology of deep inference, extends multiplicative linear logic with the self-dual non-commutative logical operator Seq. We introduce SBVQ, which extends SBV by adding the self-dual quantifier Sdq. The system SBVQ is consistent because we prove that (the analogue of) cut elimination holds for it. Its new logical operator Sdq operationally behaves as a binder, in such a way that the interplay between Seq and Sdq can model β-reduction of the linear λ-calculus inside the cut-free subsystem BVQ of SBVQ. The long-term aim is to keep developing a programme whose goal is to give purely logical accounts of computational primitives under the proof-search-as-computation analogy, by means of minimal and incremental extensions of SBV.
Pdf18 December 2012
A Logical Basis for Quantum Evolution and Entanglement
Richard F. Blute, Alessio Guglielmi, Ivan T. Ivanov, Prakash Panangaden and Lutz Straßburger
AbstractWe reconsider discrete quantum causal dynamics, where quantum systems are viewed as discrete structures, namely directed acyclic graphs. In such a graph, events are considered as vertices and edges depict propagation between events. Evolution is described as happening between a special family of spacelike slices, which were referred to as locative slices. Such slices are not so large as to result in acausal influences, but large enough to capture nonlocal correlations. In our logical interpretation, edges are assigned logical formulas in a special logical system, called BV, an instance of a deep inference system. We demonstrate that BV, with its mix of commutative and noncommutative connectives, is precisely the right logic for such analysis. We show that the commutative tensor encodes (possible) entanglement, and the noncommutative seq encodes causal precedence. With this interpretation, the locative slices are precisely the derivable strings of formulas.
Pdf12 May 2011
Linear Lambda Calculus and Deep Inference
AbstractWe introduce a deep inference logical system SBVr which extends SBV with Rename, a self-dual atom-renaming operator. We prove that the cut-free subsystem BVr of SBVr exists. We embed the terms of the linear lambda-calculus with explicit substitutions into formulas of SBVr. Our embedding recalls the one of the full lambda-calculus into the pi-calculus. Proof-search inside SBVr and BVr is complete with respect to the evaluation of the linear lambda-calculus with explicit substitutions. Instead, only soundness of proof-search in SBVr holds. Rename is crucial to let proof-search simulate the substitution of a linear lambda-term for a variable in the course of linear beta-reduction. Although SBVr is a minimal extension of SBV, its proof-search can compute all boolean functions, exactly as the linear lambda-calculus with explicit substitutions can.
A System of Interaction and Structure V: The Exponentials and Splitting
Alessio Guglielmi and Lutz Straßburger
AbstractSystem NEL is the mixed commutative/non-commutative linear logic BV augmented with linear logic’s exponentials, or, equivalently, it is MELL augmented with the non-commutative self-dual connective seq. NEL is presented in deep inference, because no Gentzen formalism can express it in such a way that the cut rule is admissible. Other recent works show that system NEL is Turing-complete, is able to directly express process algebra sequential composition and faithfully models causal quantum evolution. In this paper, we show cut elimination for NEL, based on a technique that we call splitting. The splitting theorem shows how and to what extent we can recover a sequent-like structure in NEL proofs. Together with a ‘decomposition’ theorem, proved in the previous paper of this series, splitting yields a cut-elimination procedure for NEL.
Pdf21 September 2010
Mathematical Structures in Computer Science, 21 (3) 2011, pp. 563–584
This paper is the second of the two parts that resulted from splitting the paper A System of Interaction and Structure IV: The Exponentials (which was previously titled A Non-commutative Extension of Multiplicative Exponential Linear Logic and was the journal version of A Non-commutative Extension of MELL).
A System of Interaction and Structure IV: The Exponentials and Decomposition
Lutz Straßburger and Alessio Guglielmi
AbstractWe study a system, called NEL, which is the mixed commutative/non-commutative linear logic BV augmented with linear logic’s exponentials. Equivalently, NEL is MELL augmented with the non-commutative self-dual connective seq. In this paper, we show a basic compositionality property of NEL, which we call decomposition. This result leads to a cut-elimination theorem, which is proved in the next paper of this series. To control the induction measure for the theorem, we rely on a novel technique that extracts from NEL proofs the structure of exponentials, into what we call !-?-Flow-Graphs.
Pdf27 July 2010
ACM Transactions on Computational Logic 12 (4:23) 2011, pp. 1–39
This paper is the first of the two parts that resulted from splitting the paper A System of Interaction and Structure IV: The Exponentials (which was previously titled A Non-commutative Extension of Multiplicative Exponential Linear Logic and was the journal version of A Non-commutative Extension of MELL).
Deep Inference and Probabilistic Coherence Spaces
Richard Blute, Prakash Panangaden and Sergey Slavnov
AbstractThis paper proposes a definition of a categorical model of the deep inference system BV, defined by Guglielmi. Deep inference introduces the idea of performing a deduction in the interior of a formula, at any depth. Traditional sequent calculus rules only see the roots of formulae. However in these new systems, one can rewrite at any position in the formula tree. Deep inference in particular allows the syntactic description of logics for which there is no sequent calculus. One such system is BV, which extends linear logic to include a noncommutative self-dual connective. This is the logic our paper proposes to model. Our definition is based on the notion of a linear functor, due to Cockett and Seely. A BV-category is a linearly distributive category, possibly with negation, with an additional tensor product which, when viewed as a bivariant functor, is linear with a degeneracy condition. We show that this simple definition implies all of the key isomorphisms of the theory. We consider Girard’s category of probabilistic coherence spaces and show that it contains a self-dual monoidal structure in addition to the *-autonomous structure exhibited by Girard. This structure makes the category a BV-category. We believe this structure is also of independent interest, as well-behaved noncommutative operators generally are.
Pdf10 February 2009
Applied Categorical Structures 20 2012, pp. 209–228
System BV Is NP-Complete
AbstractSystem BV is an extension of multiplicative linear logic (MLL) with the rules mix, nullary mix, and a self-dual, non-commutative logical operator, called seq. While the rules mix and nullary mix extend the deductive system, the operator seq extends the language of MLL. Due to the operator seq, system BV extends the applications of MLL to those where sequential composition is crucial, e.g., concurrency theory. System FBV is an extension of MLL with the rules mix and nullary mix. In this paper, by relying on the fact that system BV is a conservative extension of system FBV, I show that system BV is NP-complete by encoding the 3-Partition problem in FBV. I provide a simple completeness proof of this encoding by resorting to a novel proof theoretical method for reducing the nondeterminism in proof search, which is also of independent interest.
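For readers unfamiliar with the target of the encoding: 3-Partition asks whether 3m given numbers can be split into m disjoint triples that all have the same sum. A brute-force checker (an illustrative Python sketch, unrelated to the BV encoding itself) makes the problem concrete:

```python
# Illustrative brute-force decision procedure for 3-Partition, the
# NP-hard problem encoded into FBV in the paper. This sketch is for
# understanding the problem only; it has nothing to do with the encoding.
from itertools import combinations

def three_partition(nums):
    """Can nums (length 3m) be split into m disjoint triples of equal sum?"""
    if not nums:
        return True
    if len(nums) % 3:
        return False
    m = len(nums) // 3
    total = sum(nums)
    if total % m:
        return False
    target = total // m

    def go(remaining):
        if not remaining:
            return True
        first, rest = remaining[0], remaining[1:]
        # fix the first element into its triple to avoid symmetric repeats
        for i, j in combinations(range(len(rest)), 2):
            if first + rest[i] + rest[j] == target:
                nxt = [x for k, x in enumerate(rest) if k not in (i, j)]
                if go(nxt):
                    return True
        return False

    return go(list(nums))

print(three_partition([20, 23, 25, 30, 49, 45, 27, 30, 30, 40, 22, 19]))  # True
```

Here the twelve numbers split into four triples each summing to 90 (for instance 49+22+19), so the checker answers True; the exponential search this code performs is what the logical encoding replaces by proof search in FBV.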
A System of Interaction and Structure
AbstractThis paper introduces a logical system, called BV, which extends multiplicative linear logic by a non-commutative self-dual logical operator. This extension is particularly challenging for the sequent calculus, and so far it is not achieved therein. It becomes very natural in a new formalism, called the calculus of structures, which is the main contribution of this work. Structures are formulae subject to certain equational laws typical of sequents. The calculus of structures is obtained by generalising the sequent calculus in such a way that a new top-down symmetry of derivations is observed, and it employs inference rules that rewrite inside structures at any depth. These properties, in addition to allowing the design of BV, yield a modular proof of cut elimination.
Pdf27 January 2007
ACM Transactions on Computational Logic 8 (1:1) 2007, pp. 1–64
The journal version has been butchered by the editorial process, but the preprint (linked here) is fine.
At the end of 2012, this was the most cited article appeared in ACM TOCL in the previous five years.
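The deep rewriting described in the abstract above can be sketched in a few lines (a hypothetical Python encoding, not the paper's formalism): taking the switch rule of the calculus of structures read top-down, S{(A ⅋ B) ⊗ C} → S{(A ⊗ C) ⅋ B}, a deep system may apply it at any subterm position, not just at the root.

```python
# Toy sketch (not from the paper): formulas as nested tuples,
# ('par', A, B) for A ⅋ B and ('ten', A, B) for A ⊗ B.
# The switch rule of the calculus of structures, read top-down,
#   S{(A ⅋ B) ⊗ C}  →  S{(A ⊗ C) ⅋ B},
# may fire at ANY subterm position — that is deep inference.

def switch_redexes(f):
    """Yield every formula obtainable by one deep application of switch."""
    if not isinstance(f, tuple):
        return
    # try to apply switch at the root
    if f[0] == 'ten' and isinstance(f[1], tuple) and f[1][0] == 'par':
        a, b, c = f[1][1], f[1][2], f[2]
        yield ('par', ('ten', a, c), b)
    # recurse: the rule may also fire deep inside either subformula
    tag, left, right = f
    for l2 in switch_redexes(left):
        yield (tag, l2, right)
    for r2 in switch_redexes(right):
        yield (tag, left, r2)

f = ('par', ('ten', ('par', 'a', 'b'), 'c'), 'd')
for g in switch_redexes(f):
    print(g)  # switch applied inside the left subformula of the root
```

A sequent-style (shallow) system would only inspect the root connective of `f`; the recursion over subformulas is what the calculus of structures adds.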
Reducing Nondeterminism in the Calculus of Structures
AbstractThe calculus of structures is a proof-theoretical formalism which generalizes the sequent calculus with the feature of deep inference: in contrast to the sequent calculus, inference rules can be applied at any depth inside a formula, yielding shorter proofs than in any other formalism supporting analytic proofs. However, deep applicability of the inference rules causes greater nondeterminism than in the sequent calculus with regard to proof search. In this paper, we introduce a new technique which reduces nondeterminism without breaking proof-theoretical properties and provides more immediate access to shorter proofs. We present this technique on system BV, the smallest technically non-trivial system in the calculus of structures, extending multiplicative linear logic with the rules mix, nullary mix, and a self-dual non-commutative logical operator. Because our technique exploits a scheme common to all the systems in the calculus of structures, we argue that it generalizes to these systems for classical logic, linear logic, and modal logics.
Pdf23 August 2006
LPAR 2006, LNCS 4246, pp. 272–286
A System of Interaction and Structure II: The Need for Deep Inference
AbstractThis paper studies properties of the logic BV, which is an extension of multiplicative linear logic (MLL) with a self-dual non-commutative operator. BV is presented in the calculus of structures, a proof-theoretic formalism that supports deep inference, in which inference rules can be applied anywhere inside logical expressions. The use of deep inference results in a simple logical system for MLL extended with the self-dual non-commutative operator, which to date is not known to be expressible in the sequent calculus. In this paper, deep inference is shown to be crucial for the logic BV: that is, any restriction on the "depth" of the inference rules of BV would result in a strictly less expressive logical system.
Pdf3 April 2006
Logical Methods in Computer Science 2 (2:4) 2006, pp. 1–24
Alwen Tiu's MSc thesis Properties of a Logical System in the Calculus of Structures contains pictures that help in understanding this paper.
Structures for Multiplicative Cyclic Linear Logic: Deepness vs Cyclicity
Pietro Di Gianantonio
AbstractThe aim of this work is to give an alternative presentation for the multiplicative fragment of Yetter’s cyclic linear logic. The new presentation is inspired by the calculus of structures, and has the interesting feature of avoiding the cyclic rule. The main point in this work is to show how cyclicity can be substituted by deepness, i.e. the possibility of applying an inference rule at any point of a formula. We finally derive, through a new proof technique, the cut elimination property of the calculus.
Pdf15 April 2004
CSL 2004, LNCS 3210, pp. 130–144
A Non-commutative Extension of MELL
Alessio Guglielmi and Lutz Straßburger
AbstractWe extend multiplicative exponential linear logic (MELL) by a non-commutative, self-dual logical operator. The extended system, called NEL, is defined in the formalism of the calculus of structures, which is a generalisation of the sequent calculus and provides a more refined analysis of proofs. We should then be able to extend the range of applications of MELL, by modelling a broad notion of sequentiality and providing new properties of proofs. We show some proof theoretical results: decomposition and cut elimination. The new operator represents a significant challenge: to get our results we use here for the first time some novel techniques, which constitute a uniform and modular approach to cut elimination, contrary to what is possible in the sequent calculus.
Pdf9 August 2002
LPAR 2002, LNCS 2514, pp. 231–246
The Undecidability of System NEL
AbstractSystem NEL is a conservative extension of multiplicative exponential linear logic (MELL) by a self-dual non-commutative connective called seq which lives between the par and the times. In this paper, I will show that system NEL is undecidable by encoding two counter machines into NEL. Although the encoding is quite simple, the proof of the faithfulness is a little intricate because there is no sequent calculus and no phase semantics available for NEL.
Properties of a Logical System in the Calculus of Structures
Alwen Fernanto Tiu
AbstractThe calculus of structures is a new framework for presenting logical systems. It is a generalisation of a traditional framework, the one-sided sequent calculus. One of the main features of the calculus of structures is that the inference rules are deep: they can be applied anywhere inside logical expressions. Rules in the sequent calculus are, in contrast, shallow. A certain logical system in the calculus of structures, called System BV, is studied here. We see that the deep nesting of rules is a real distinguishing feature between the two frameworks. To this purpose a notion of shallow systems is introduced, such that sequent systems are particular instances. A counterexample, a sort of fractal structure, is then presented to show that there is no shallow system for the logic behind BV, and hence no sequent system for BV. This result contributes to justifying the claim that the calculus of structures is a better logical framework than the sequent calculus, for certain logics.
Pdf12 September 2001. Now replaced by A System of Interaction and Structure II: The Need for Deep Inference.
MSc thesis, successfully defended on 1 August 2001, Technical Report WV-01-06, Technische Universität Dresden
Non-commutativity and MELL in the Calculus of Structures
Alessio Guglielmi and Lutz Straßburger
AbstractWe introduce the calculus of structures: it is more general than the sequent calculus and it allows for cut elimination and the subformula property. We show a simple extension of multiplicative linear logic, by a self-dual non-commutative operator inspired by CCS, that seems not to be expressible in the sequent calculus. Then we show that multiplicative exponential linear logic benefits from its presentation in the calculus of structures, especially because we can replace the ordinary, global promotion rule by a local version. These formal systems, for which we prove cut elimination, outline a range of techniques and properties that were not previously available. Contrary to what happens in the sequent calculus, the cut elimination proof is modular.
Pdf28 June 2001
CSL 2001, LNCS 2142, pp. 54–68
A Calculus of Order and Interaction
This paper has been thoroughly rewritten as A System of Interaction and Structure. The introduction of the new paper has not been written under the effect of psychotropic substances. Please forget about this paper.
Technical Report WV-99-04, Technische Universität Dresden
Deep inference and the calculus of structures are influencing the design of a new generation of proof nets. Moreover, they offer new insight for semantics of proofs and categorical proof theory. Finally, they open decisive new perspectives in the fight against bureaucracy.
The following papers exist, in addition to A General View of Normalisation Through Atomic Flows and Categorical Models of First Order Classical Proofs, mentioned above:
A Proof Calculus Which Reduces Syntactic Bureaucracy
Alessio Guglielmi, Tom Gundersen and Michel Parigot
AbstractIn usual proof systems, like the sequent calculus, only a very limited way of combining proofs is available through the tree structure. We present in this paper a logic-independent proof calculus, where proofs can be freely composed by connectives, and prove its basic properties. The main advantage of this proof calculus is that it allows one to avoid certain types of syntactic bureaucracy inherent in all usual proof systems, in particular the sequent calculus. Proofs in this system closely reflect their atomic flow, which traces the behaviour of atoms through structural rules. The general definition is illustrated by the standard deep-inference system for propositional logic, for which there are known rewriting techniques that achieve cut elimination based only on the information in atomic flows.
Pdf12 April 2010
RTA 2010, Leibniz International Proceedings in Informatics (LIPIcs) 6, pp. 135150
This paper supersedes the note Formalism A.
What Is the Problem with Proof Nets for Classical Logic?
Abstract: This paper is an informal (and non-exhaustive) overview of some existing notions of proof nets for classical logic, and gives some hints as to why they might be considered unsatisfactory.
Pdf, 6 April 2010
CiE 2010, LNCS 6158, pp. 406–416
From Deep Inference to Proof Nets via Cut Elimination
Abstract: This article shows how derivations in the deep inference system SKS for classical propositional logic can be translated into proof nets. Since an SKS derivation contains more information about a proof than the corresponding proof net, we observe a loss of information which can be understood as “eliminating bureaucracy”. Technically this is achieved by cut reduction on proof nets. As an intermediate step between the two extremes, SKS derivations and proof nets, we will see proof graphs representing derivations in “Formalism A”.
On the Axiomatisation of Boolean Categories with and without Medial
Abstract: In its most general meaning, a Boolean category is to categories what a Boolean algebra is to posets. In a more specific meaning a Boolean category should provide the abstract algebraic structure underlying the proofs in Boolean Logic, in the same sense as a Cartesian closed category captures the proofs in intuitionistic logic and a *-autonomous category captures the proofs in linear logic. However, recent work has shown that there is no canonical axiomatisation of a Boolean category. In this work, we will see a series (with increasing strength) of possible such axiomatisations, all based on the notion of *-autonomous category. We will particularly focus on the medial map, which has its origin in an inference rule in KS, a cut-free deductive system for Boolean logic in the calculus of structures. Finally, we will present a category of proof nets as a particularly well-behaved example of a Boolean category.
What Is a Logic, and What Is a Proof?
Abstract: I will discuss the two problems of how to define identity between logics and how to define identity between proofs. For the identity of logics, I propose to simply use the notion of preorder equivalence. This might be considered to be folklore, but it is exactly what is needed from the viewpoint of the problem of the identity of proofs: if the proofs are considered to be part of the logic, then preorder equivalence becomes equivalence of categories, whose arrows are the proofs. For identifying these, the concept of proof nets is discussed.
Pdf, 23 October 2006
Logica Universalis: Towards a General Theory of Logic, pp. 135–152, Birkhäuser, 2007
Proof Nets and the Identity of Proofs
Abstract: These are the notes for a 5-lecture course given at ESSLLI 2006 in Malaga, Spain. The URL of the school is http://esslli2006.lcc.uma.es/. The course is intended to be introductory. That means no prior knowledge of proof nets is required. However, the student should be familiar with the basics of propositional logic, and should have seen formal proofs in some formal deductive system (e.g., sequent calculus, natural deduction, resolution, tableaux, calculus of structures, Frege-Hilbert systems, ...). It is probably helpful if the student knows already what cut elimination is, but this is not strictly necessary. In these notes, I will introduce the concept of “proof nets” from the viewpoint of the problem of the identity of proofs. I will proceed in a rather informal way. The focus will be more on presenting ideas than on presenting technical details. The goal of the course is to give the student an overview of the theory of proof nets and make the vast amount of literature on the topic more easily accessible to the beginner. For introducing the basic concepts of the theory, I will in the first part of the course stick to the unit-free multiplicative fragment of linear logic because of its rather simple notion of proof nets. In the second part of the course we will see proof nets for more sophisticated logics.
Pdf, 20 October 2006
Technical Report 6013, INRIA
From Proof Nets to the Free *-Autonomous Category
François Lamarche and Lutz Straßburger
Abstract: In the first part of this paper we present a theory of proof nets for full multiplicative linear logic, including the two units. It naturally extends the well-known theory of unit-free multiplicative proof nets. A linking is no longer a set of axiom links but a tree in which the axiom links are subtrees. These trees will be identified according to an equivalence relation based on a simple form of graph rewriting. We show the standard results of sequentialization and strong normalization of cut elimination. In the second part of the paper we show that the identifications enforced on proofs are such that the class of two-conclusion proof nets defines the free *-autonomous category.
Pdf, 5 October 2006. Journal version of On Proof Nets for Multiplicative Linear Logic with Units
Logical Methods in Computer Science 2 (4:3) 2006, pp. 1–44
Completeness of MLL Proof-Nets w.r.t. Weak Distributivity
Abstract: We examine ‘weak distributivity’ as a rewriting rule WD→ defined on multiplicative proof-structures (so, in particular, on multiplicative proof-nets: MLL). This rewriting does not preserve the type of proof-nets, but does nevertheless preserve their correctness. The specific contribution of this paper is to give a direct proof of completeness for WD→: starting from a set of simple generators (proof-nets which are an n-ary ⊗ of ℘-ized axioms), any mono-conclusion MLL proof-net can be reached by WD→ rewriting (up to ⊗ and ℘ associativity and commutativity).
Pdf, 8 September 2006
The Journal of Symbolic Logic 72 (1) 2007, pp. 159–170
Invited talk at WoLLIC 2003, under the title ‘Calculus of Structures and Proof Nets’, Electronic Notes in Theoretical Computer Science 84; a short version appeared in the proceedings of Structures and Deduction '05, pp. 81–94
Exploring the Gap between Linear and Classical Logic
Abstract: The Medial rule was first devised as a deduction rule in the Calculus of Structures. In this paper we explore it from the point of view of category theory, as additional structure on a *-autonomous category. This gives us some insights on the denotational semantics of classical propositional logic, and allows us to construct new models for it, based on suitable generalizations of the theory of coherence spaces.
Pdf, 27 June 2006
Theory and Applications of Categories 18 (17) 2007, pp. 473–535
The Three Dimensions of Proofs
Abstract: In this document, we study a 3-polygraphic translation for the proofs of SKS, a formal system for classical propositional logic. We prove that the free 3-category generated by this 3-polygraph describes the proofs of classical propositional logic modulo structural bureaucracy. We give a 3-dimensional generalization of Penrose diagrams and use it to provide several pictures of a proof. We sketch how local transformations of proofs yield a non-contrived example of 4-dimensional rewriting.
On Two Forms of Bureaucracy in Derivations
Kai Brünnler and Stéphane Lengrand
Abstract: We call irrelevant information in derivations bureaucracy. An example of such irrelevant information is the order between two consecutive inference rules that trivially permute. Building on ideas by Guglielmi, we identify two forms of bureaucracy that occur in the calculus of structures (and, in fact, in every non-trivial term rewriting derivation). We develop term calculi that provide derivations that do not contain this bureaucracy. We also give a normalisation procedure that removes bureaucracy from derivations and find that in a certain sense the normalisation process is a process of cut elimination.
Pdf, 10 June 2005
Proceedings of Structures and Deduction '05, pp. 69–80
This is further expanded in Ch. 11 of Stéphane Lengrand's PhD thesis Normalisation & Equivalence in Proof Theory & Type Theory (2007 Ackermann Award of the EACSL)
Constructing Free Boolean Categories
François Lamarche and Lutz Straßburger
Abstract: By Boolean category we mean something which is to a Boolean algebra what a category is to a poset. We propose an axiomatic system for Boolean categories, which is different in several respects from the ones proposed recently. In particular everything is done from the start in a *-autonomous category and not in a weakly distributive one, which simplifies issues like the Mix rule. An important axiom, which is introduced later, is a "graphical" condition, which is closely related to denotational semantics and the Geometry of Interaction. Then we show that a previously constructed category of proof nets is the free "graphical" Boolean category in our sense. This validates our categorical axiomatization with respect to a real-life example. Another important aspect of this work is that we do not assume a priori the existence of units in the *-autonomous categories we use. This has some retroactive interest for the semantics of linear logic, and is motivated by the properties of our example with respect to units.
Classical Categories and Deep Inference
Abstract: Deep inference is a proof-theoretic notion in which proof rules apply arbitrarily deeply inside a formula. We show that the essence of deep inference is the bifunctoriality of the connectives. We demonstrate that, when given an inequational theory that models cut-reduction, a deep inference calculus for classical logic (SKSg) is a categorical model of the classical sequent calculus LK in the sense of Führmann and Pym. We observe that this gives a notion of cut-reduction for derivations in SKSg, for which the usual notion of cut in SKSg is a special case. Viewing SKSg as a model of the sequent calculus uncovers new insights into the Craig interpolation lemma and intuitionistic provability.
Pdf, 31 March 2005
Proceedings of Structures and Deduction '05, pp. 19–33
Naming Proofs in Classical Propositional Logic
François Lamarche and Lutz Straßburger
Abstract: We present a theory of proof denotations in classical propositional logic. The abstract definition is in terms of a semiring of weights, and two concrete instances are explored. With the Boolean semiring we get a theory of classical proof nets, with a geometric correctness criterion, a sequentialization theorem, and a strongly normalizing cut-elimination procedure. With the semiring of natural numbers, we obtain a sound semantics for classical logic, in which fewer proofs are identified. Though a "real" sequentialization theorem is missing, these proof nets have a grip on complexity issues. In both cases the cut elimination procedure is closely related to its equivalent in the calculus of structures, and we get "Boolean" categories which are not posets.
Pdf, 31 January 2005
TLCA 2005, LNCS 3461, pp. 246–261
Deep Inference Proof Theory Equals Categorical Proof Theory Minus Coherence
Abstract: This paper links deep inference proof theory, as studied by Guglielmi et al., to categorical proof theory in the sense of Lambek et al. It observes how deep inference proof theory is categorical proof theory, minus the coherence diagrams/laws. Coherence yields a ready-made and well-studied notion of equality on deep inference proofs. The paper notes a precise correspondence between the symmetric deep inference system for multiplicative linear logic (the linear fragment of SKSg) and the presentation of *-autonomous categories as symmetric linearly distributive categories with negation. Contraction and weakening in SKSg correspond precisely to the presence of (co)monoids.
Pdf, 6 October 2004
On Proof Nets for Multiplicative Linear Logic with Units
Lutz Straßburger and François Lamarche
Abstract: In this paper we present a theory of proof nets for full multiplicative linear logic, including the two units. It naturally extends the well-known theory of unit-free multiplicative proof nets. A linking is no longer a set of axiom links but a tree in which the axiom links are subtrees. These trees will be identified according to an equivalence relation based on a simple form of graph rewriting. We show the standard results of sequentialization and strong normalization of cut elimination. Furthermore, the identifications enforced on proofs are such that the proof nets, as they are presented here, form the arrows of the free (symmetric) *-autonomous category.
Pdf, 30 June 2004
CSL 2004, LNCS 3210, pp. 145–159
Thanks to a self-dual non-commutative extension of linear logic one gets the first purely logical account of sequentiality in proof search. The new logical operators make possible a new approach to partial order planning and its relation to concurrency.
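The distinctive feature of this operator can be sketched in a few lines. The following is a toy encoding of my own (the tuple representation and atom naming are illustrative assumptions, not taken from any of the papers below): in BV-style systems, negation exchanges par with its dual, while the non-commutative operator seq is self-dual and preserves the order of its arguments.

```python
# Toy sketch of BV-style duality (illustrative encoding, not from the papers):
# structures are nested tuples; negation dualises 'par' into 'cop' and back,
# while the non-commutative 'seq' is self-dual and order-preserving.

def dual(s):
    """Return the De Morgan dual of a structure given as nested tuples."""
    if isinstance(s, str):                       # atoms: toggle a '~' mark
        return s[1:] if s.startswith('~') else '~' + s
    op, a, b = s
    if op == 'par':                              # par dualises to copar
        return ('cop', dual(a), dual(b))
    if op == 'cop':                              # and vice versa
        return ('par', dual(a), dual(b))
    if op == 'seq':                              # seq is self-dual: order kept
        return ('seq', dual(a), dual(b))
    raise ValueError('unknown operator: ' + op)

# the dual of <a; b> is <~a; ~b>, with the order of the arguments preserved
print(dual(('seq', 'a', 'b')))      # ('seq', '~a', '~b')
# duality is an involution
print(dual(dual(('par', 'a', ('seq', 'b', 'c')))))
```

Self-duality of the sequential operator is exactly what lets sequential composition of processes sit at the same level as the logical connectives, rather than being encoded.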
The following papers exist, in addition to Nondeterminism and Language Design in Deep Inference, mentioned above:
On Linear Logic Planning and Concurrency
Abstract: We present an approach to linear logic planning where an explicit correspondence between partial order plans and multiplicative exponential linear logic proofs is established. This is performed by extracting partial order plans from a sound and complete encoding of planning problems in multiplicative exponential linear logic. These partial order plans exhibit a non-interleaving behavioural concurrency semantics, i.e., labelled event structures. Relying on this fact, we argue that this work is a crucial step towards establishing a common language for concurrency and planning that will allow techniques and methods to be carried between these two fields.
A Deductive Compositional Approach to Petri Nets for Systems Biology
Abstract: We introduce the language CP, a compositional language for place-transition Petri nets, for the purpose of modelling signalling pathways in complex biological systems. We give the operational semantics of the language CP by means of a proof theoretical deductive system which extends multiplicative exponential linear logic with a self-dual non-commutative logical operator. This allows parallel and sequential composition of processes to be expressed at the same syntactic level, as in process algebra, and logical reasoning to be performed on these processes. We demonstrate the use of the language on a model of a signalling pathway for Fc receptor-mediated phagocytosis.
Pdf, 18 September 2007
Presented as a poster at CMSB '07
Towards Planning as Concurrency
Abstract: We present a purely logical framework for planning where we bring the sequential and parallel composition in the plans to the same level, as in process algebras. The problem of expressing causality, which is very challenging for common logics and traditional deductive systems, is solved by resorting to a recently developed extension of multiplicative exponential linear logic with a self-dual, noncommutative operator. We present an encoding of conjunctive planning problems in this logic, and provide a constructive soundness and completeness result. We argue that this work is the first, but crucial, step towards a uniform deductive formalism that connects planning and concurrency inside a common language, and allows methods to be transferred from concurrency to planning.
Pdf, 19 October 2004
Artificial Intelligence and Applications 2005, ACTA Press, pp. 197–202
A Purely Logical Account of Sequentiality in Proof Search
Abstract: We establish a strict correspondence between the proof-search space of a logical formal system and computations in a simple process algebra. Sequential composition in the process algebra corresponds to a logical relation in the formal system; in this sense our approach is purely logical: no axioms or encodings are involved. The process algebra is a minimal restriction of CCS to parallel and sequential composition; the logical system is a minimal extension of multiplicative linear logic. This way we get the first purely logical account of sequentiality in proof search. Since we restrict attention to a small but meaningful fragment, which is then of very broad interest, our techniques should become a common basis for several possible extensions. In particular, we argue that this work is the first step in a two-step research programme for capturing most of CCS in a purely logical fashion.
Pdf, 12 August 2002
ICLP 2002, LNCS 2401, pp. 302–316
Ozan Kahramanogullari, Pierre-Etienne Moreau and Antoine Reilles are implementing calculus-of-structures proof systems in Maude and in Tom. Ozan managed to achieve efficiency without sacrificing proof theoretic cleanliness, and he is obtaining results of independent theoretical interest. There are two slide presentations:
Implementing Deep Inference, Ozan Kahramanogullari. Talk, 24 November 2004
Interaction and Depth Against Nondeterminism in Proof Search, Ozan Kahramanogullari. Talk, 15 December 2006
Max Schäfer has built a graphical proof editor in Java, called GraPE, for the Maude modules written by Ozan Kahramanogullari; this means that one can interactively build and find proofs in several deep-inference systems.
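The idea recurring in the papers below — that deep inference rules apply like rewrite rules at any depth inside a formula — can be sketched in a few lines. This is a toy illustration of mine (nested-tuple formulas, a single atomic-contraction rule), unrelated to the actual Maude and Tom implementations:

```python
# Toy illustration of deep inference as term rewriting (my own sketch, not
# any of the cited implementations): an inference rule is a rewrite rule
# that may fire at any depth inside a formula, not only at the root.

def is_atom(f):
    return isinstance(f, str)

def rewrite_everywhere(formula, rule):
    """Yield every formula obtained by applying `rule` once, at any depth."""
    top = rule(formula)                 # try the rule at the root
    if top is not None:
        yield top
    if not is_atom(formula):            # recurse, keeping the context intact
        op, left, right = formula
        for l in rewrite_everywhere(left, rule):
            yield (op, l, right)
        for r in rewrite_everywhere(right, rule):
            yield (op, left, r)

def atomic_contraction(f):
    """Contraction restricted to atoms:  a v a  ->  a."""
    if not is_atom(f) and f[0] == 'or' and f[1] == f[2] and is_atom(f[1]):
        return f[1]
    return None

# The redex sits inside a conjunction, where a shallow (root-only) rule
# could not reach it:
results = list(rewrite_everywhere(('and', ('or', 'a', 'a'), 'b'),
                                  atomic_contraction))
print(results)                          # [('and', 'a', 'b')]
```

A shallow-inference system would only attempt the rule at the root and find nothing here; enumerating all deep redexes like this is also where the extra nondeterminism in deep-inference proof search comes from.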
The following papers exist, in addition to Nondeterminism and Language Design in Deep Inference, mentioned above:
Ingredients of a Deep Inference Theorem Prover
Abstract: Deep inference deductive systems for classical logic provide exponentially shorter proofs than the sequent calculus systems, however at the cost of higher nondeterminism and a larger search space in proof search. We report on our ongoing work on proof search with deep inference deductive systems. We present systems for classical logic where nondeterminism in proof search is reduced by constraining the context management rule of these systems. We argue that a deep inference system for classical logic can outperform sequent calculus deductive systems in proof search when nondeterminism and the application of the contraction rule are controlled by means of invertible rules.
Pdf, 24 June 2008
Short paper at CL&C '08
Maude as a Platform for Designing and Implementing Deep Inference Systems
Abstract: Deep inference is a proof theoretical methodology that generalizes the traditional notion of inference in the sequent calculus: in contrast to the sequent calculus, the deductive systems with deep inference do not rely on the notion of main connective, and permit the application of the inference rules at any depth inside logical expressions, in a way which resembles the application of term rewriting rules. Deep inference provides a richer combinatoric analysis of proofs for different logics. In particular, construction of exponentially shorter proofs becomes possible. In this paper, aiming at the development of computation-as-proof-search tools, we propose the Maude language as a means for designing and implementing different deep inference deductive systems and proof strategies that work on these systems. We demonstrate these ideas on classical logic and argue that they can be analogously carried to other deductive systems for other logics.
Pdf, 16 September 2007
RULE '07, Electronic Notes in Theoretical Computer Science 219, 2008, pp. 35–50
Canonical Abstract Syntax Trees
Abstract: This paper presents GOM, a language for describing abstract syntax trees and generating a Java implementation for those trees. GOM includes features allowing one to specify and modify the interface of the data structure. These features provide in particular the capability to maintain the internal representation of data in canonical form with respect to a rewrite system. This explicitly guarantees that the client program only manipulates normal forms for this rewrite system, a feature which is only implicitly used in many implementations.
Pdf, 28 March 2006
Proceedings of the 6th International Workshop on Rewriting Logic and Its Applications, Electronic Notes in Theoretical Computer Science 176, 2007, pp. 165–179
Implementing Deep Inference in TOM
Ozan Kahramanogullari, Pierre-Etienne Moreau and Antoine Reilles
Abstract: The calculus of structures is a proof theoretical formalism which generalizes the sequent calculus with the feature of deep inference: in contrast to the sequent calculus, the calculus of structures does not rely on the notion of main connective and, as in term rewriting, it permits the application of inference rules at any depth inside a formula. Tom is a pattern matching processor that integrates term rewriting facilities into imperative languages. In this paper, relying on the correspondence between the systems in the calculus of structures and term rewriting systems, we present an implementation of system BV of the calculus of structures in Java by exploiting the term rewriting features of Tom. This way, by means of the expressive power due to Java, it becomes possible to implement different search strategies. Since the systems in the calculus of structures follow a common scheme, we argue that our implementation can be generalized to other systems in the calculus of structures for classical logic, modal logics, and different fragments of linear logic.
Pdf, 23 April 2005
Proceedings of Structures and Deduction '05, pp. 158–172
System BV Without the Equalities for Unit
Abstract: System BV is an extension of multiplicative linear logic with a non-commutative self-dual operator. In this paper, we present systems equivalent to system BV where the equalities for unit are oriented from left to right and new structural rules are introduced to preserve completeness. While the first system allows units to appear in the structures, the second system makes it possible to completely remove the units from the language of BV by proving the normal forms of the structures that are provable in BV. The resulting systems provide better performance in automated proof search by disabling redundant applications of inference rules due to the unit. As evidence, we provide a comparison of the performance of these systems in a Maude implementation.
Pdf, 4 August 2004
ISCIS 2004, LNCS 3280, pp. 986–995
From the Calculus of Structures to Term Rewriting Systems
Steffen Hölldobler and Ozan Kahramanogullari
Abstract: The calculus of structures is a recently developed proof theoretical formalism that extends one-sided sequent calculus, with the gain of interesting proof theoretical properties. In contrast to the sequent calculus, it does not rely on the notion of main connective and, as in term rewriting, permits the application of inference rules anywhere deep inside a formula. In this paper, exploiting this resemblance, we present a procedure turning derivations in the calculus of structures, in four steps, into rewritings in a term rewriting system modulo equality.
13 June 2004
Technical Report WV-04-03, Technische Universität Dresden
Implementing System BV of the Calculus of Structures in Maude
Abstract: System BV is an extension of multiplicative linear logic with a noncommutative self-dual operator. We first map derivations of system BV of the calculus of structures to rewritings in a term rewriting system modulo equality, and then express this rewriting system as a Maude system module. This results in an automated proof search implementation for this system, and provides a recipe for implementing existing calculus of structures systems for other logics. Our result is interesting from the point of view of applications, especially where sequentiality is essential, e.g., planning and natural language processing. In particular, we argue that we can express plans as logical formulae by using the sequential operator of BV and reason on them in a purely logical way.
Pdf, 13 June 2004
Proceedings of the ESSLLI 2004 Student Session
On this page I list all the open and currently explored research subjects I am aware of. The solutions to most of these problems are instrumental in reaching the common goal of a comprehensive bureaucracy-free proof theory based on geometric methods.
Finding counterexamples is one of the most fun sides of our activity. They are typically of a very combinatorial nature, due to the new combinatorial possibilities offered by deep inference.
The Logic BV and Quantum Causality, Richard Blute, Prakash Panangaden and Lutz Straßburger. We describe how a logic with commutative and noncommutative connectives can be used for capturing the essence of discrete quantum causal propagation. 12 September 2008
Polynomial Size Deep-Inference Proofs Instead of Exponential Size Shallow-Inference Proofs, Alessio Guglielmi. By a simple example, I show how deep inference can provide for an exponential speed-up in the size of proofs with respect to shallow inference. 22 September 2007. Slides. Superseded by On the Proof Complexity of Deep Inference
Finitary Cut, Alessio Guglielmi. We all know that the traditional cut rule is considered infinitary. But if we reduce the cut rule to atomic form, as we always can do in the calculus of structures, is atomic cut still infinitary? Not really. 22 September 2007. Superseded by A First Order System with Finite Choice of Premises
On Analytic Inference Rules in the Calculus of Structures, Paola Bruscoli and Alessio Guglielmi. We discuss the notion of analytic inference rule for propositional logics in the calculus of structures. 25 November 2009
Interaction and Depth Against Nondeterminism in Proof Search, Ozan Kahramanogullari. We argue that, by exploiting an interaction and depth scheme in the logical expressions, the nondeterminism in proof search can be reduced without losing the shorter proofs and without breaking proof theoretical properties. 23 April 2007. Presented at the Automated Reasoning Workshop 2007
Intuitionistic Logic in the Calculus of Structures, Kai Brünnler. In this note I present preliminary ideas about a system for minimal logic. 7 February 2006
Butterflies, Alessio Guglielmi. If you're interested in NEL and BV, try and find the mistake in this note. 29 July 2005
Some News on Subatomic Logic, Alessio Guglielmi. I found a simple explanation for the soundness of the subatomic inference rule. 28 July 2005
Red and Blue, Alessio Guglielmi. The sequent calculus is often colour blind, but not always; the calculus of structures has a consistent behaviour. 26 July 2005
The Problem of Bureaucracy and Identity of Proofs from the Perspective of Deep Inference, Alessio Guglielmi. Some discussion and then the first attempt at wired/weird deduction. 17 June 2005. Proceedings of Structures and Deduction '05, pp. 53–68
Getting Formalisms A and B by Proof-Terms and Typing Systems, Stéphane Lengrand and Kai Brünnler. 11 February 2005. Expanded into On Two Forms of Bureaucracy in Derivations
Formalism B, Alessio Guglielmi. Inference rules operate on derivations. 20 December 2004
Using BV to Describe Causal Quantum Evolution, Prakash Panangaden. In this note I describe how to capture the kinematics of quantum causal evolution using a logic called BV, developed by the Calculus of Structures group at Dresden. 5 July 2004
Formalism A, Alessio Guglielmi. I generalise the notion of derivation to take care of a certain kind of bureaucracy. 23 April 2004. Superseded by A Proof Calculus Which Reduces Syntactic Bureaucracy
Resolution in the Calculus of Structures, Alessio Guglielmi. It is possible to search for cut-free proofs by the resolution strategy in the calculus of structures; the sequent calculus does not allow the same freedom. 29 September 2003
Mismatch, Alessio Guglielmi. There is a mismatch between meta and object levels in many calculi, like the sequent calculus and natural deduction. The mismatch has undesirable proof theoretical consequences, the most important being the inability to design deductive systems. 24 August 2003. Brief note for the specialist, posted to the Foundations of Mathematics and Proof Theory lists
Normalisation Without Cut Elimination, Alessio Guglielmi. I propose a possible notion of equivalence for proofs in classical logic, a sort of normal form for case analysis. 25 February 2003
Subatomic Logic, Alessio Guglielmi. One can unify classical and linear logic by using only two simple, linear, ‘hyper’ inference rules; they generate nice systems for all the fragments, where all rules are local. The various logics are determined by the algebra of their units; for example, boolean algebra determines classical logic. We can prove cut elimination easily and once and for all in the two-rule system, and the procedure scales down to all the derived fragments. 21 November 2002
On Lafont's Counterexample, Alessio Guglielmi. Lafont's counterexample shows why cut elimination in the sequent calculus of classical logic is not confluent. I show here that the counterexample doesn't work in the calculus of structures. 1 November 2002
Recipe, Alessio Guglielmi. A family of tautologies that hold in most logics can be used, following a simple recipe, to produce deductive systems where all rules are local, including cut. 27 October 2002
Goodness, Perfection and Miracles, Alessio Guglielmi. I argue about three properties closely related to cut elimination and interpolation theorems. 18 October 2002
Freedom from Bureaucracy, FF. 15 February 1963
There is a very friendly mailing list devoted to deep inference, proof nets, structads, the calculus of structures and other amphibians of structural proof theory, named Frogs. Join it!
These are the current and recent grants I am aware of, regarding deep inference:
Efficient and Natural Proof Systems20132016. Three-year EPSRC research project at University of Bath. 756,061 GBP (full economic costing, including indexation and a research studentship funded by the University of Bath). Principal investigator is Alessio Guglielmi, co-investigator is Guy McCusker.
Sharing and Sequentiality in Proof Systems with Locality20122014. Two-year Royal Society International Exchanges Scheme. Exchange programme between University of Bath and Università di Torino. The two sides get 11,960 GBP. Principal investigators: Alessio Guglielmi and Luca Roversi.
Extending the Realm of the Curry-Howard Correspondence20112012. French-Swiss Academic Research Collaboration Programme (PAI Germaine De Stael). Exchange programme between INRIA Futurs and IAM, Universität Bern. The two sides get around 4,300 EUR each. Principal investigators: Lutz Straßburger and Kai Brünnler.
Structural and Computational Proof Theory (Structural)20102012. ANRProgramme Blanc International/FWF 2010. Three-year project at PPS, LIX, Innsbruck and Vienna. The project gets 441,229 EUR, out of a total cost in France of 2,032,556 EUR, from the Agence Nationale de la Recherche and it gets 291,577 EUR from the FWF. Coordinator and principal investigator is Michel Parigot.
REDO: Redesigning Logical Syntax20092010. INRIAAction de Recherche Collaborative. Two-year project of the teams Calligramme/Démosthène and Parsifal (INRIA) and the Computer Science Department of the University of Bath. The project gets 80,000 EUR. Principal investigators are Alessio Guglielmi, François Lamarche and Lutz Straßburger (coordinator).
Identity and Geometric Essence of Proofs (Démosthène)ANRChaire d’excellence 20082010. Two-year project at LORIA. The project gets about 355,000 EUR from the Agence Nationale de la Recherche, out of a total cost of 750,448 EUR, the rest of which is covered by INRIA. Principal investigator is Alessio Guglielmi.
Complexity and Non-Determinism in Deep Inference20072008. 17 month project at University of Bath. 118,072 GBP, out of a total cost of 147,591 GBP, from the Engineering and Physical Sciences Research Council (EPSRC) Principal investigator is Alessio Guglielmi, co-investigator is Paola Bruscoli.
Deep Inference and the Essence of Proofs20072008. French-Swiss Academic Research Collaboration Programme (PAI Germaine De Stael). Exchange programme between INRIA Futurs and IAM, Universität Bern. The two sides get around 5,000 EUR each. Principal investigators: Lutz Straßburger and Kai Brünnler.
The Realm of Cut Elimination20072008. French-Austrian Academic Research Collaboration Programme (PAI Amadeus). Exchange programme between INRIA Futurs and Theory and Logic Group, Technische Universität Wien. The two sides get around 5,000 EUR each. Principal investigators: Lutz Straßburger and Agata Ciabattoni.
Theory and Application of Deep Inference (Infer)ANRProgramme Blanc 2006. Three-year project at LIX, LORIA and PPS. The project gets 230,400 EUR, out of a total cost of 1,280,598 EUR, from the Agence Nationale de la Recherche. Coordinator and principal investigator is Lutz Straßburger, other principal investigators are François Lamarche and Michel Parigot.
Analyticity and Proof Search for Modal Logics in Deep Inference, 2006–2008. British-German Academic Research Collaboration Programme. Two-year exchange programme between the University of Bath and the Technische Universität Dresden. Bath gets 2,650 GBP from the British Council (principal investigator Alessio Guglielmi) and Dresden gets 5,694 EUR from the German Academic Exchange Service (principal investigator Steffen Hölldobler).
New Deductive Systems, Normalisation Methods and Semantics for Deep Inference, 2006–2007. Alliance Franco-British Partnership Programme. Two-year exchange programme between the University of Bath and the PPS group in Paris. Bath gets 4,300 GBP from the British Council (principal investigator Alessio Guglielmi) and PPS gets 6,250 EUR from the French MAE and the Ministry of Research (principal investigator Michel Parigot).
Individual travel grants are not listed here.
| Year | Papers | Journals |
|------|--------|----------|
| 2013 | 1 | Logical Methods in Computer Science |
| 2012 | 4 | Annals of Pure and Applied Logic (3), Applied Categorical Structures |
| 2011 | 5 | Mathematical Structures in Computer Science, Journal of Logic and Computation, ACM Transactions on Computational Logic, Logical Methods in Computer Science, Logic without Frontiers (Festschrift for Walter Alexandre Carnielli) |
| 2009 | 6 | ACM Transactions on Computational Logic, Electronic Notes in Theoretical Computer Science, Journal of Logic and Computation, Annals of Pure and Applied Logic, Information and Computation, Archive for Mathematical Logic |
| 2008 | 3 | Annals of Pure and Applied Logic, Logical Methods in Computer Science, Journal of Logic and Computation |
| 2007 | 5 | ACM Transactions on Computational Logic, Studia Logica, Journal of Logic and Computation, Theory and Applications of Categories (2) |
| 2006 | 5 | Studia Logica, Annals of Pure and Applied Logic, Logical Methods in Computer Science (2), Notre Dame Journal of Formal Logic |
| 2003 | 2 | Logic Journal of the IGPL, Theoretical Computer Science |
| Year | Papers | Conferences |
|------|--------|-------------|
| 2013 | 3 | FoSSaCS, LICS, RTA |
| 2012 | 4 | CiE, AiML (2), JELIA |
| 2011 | 3 | TLCA, Tableaux, CSL |
| 2010 | 7 | LPAR (2), RTA, LICS, CiE, AiML (2) |
| 2009 | 5 | Tableaux (2), WoLLIC, TLCA, TbiLLC |
| 2008 | 3 | LATA, LPAR, AiML |
| 2006 | 4 | AiML, CiE, LPAR (2) |
| 2005 | 5 | WoLLIC, Logica Universalis, TLCA, AIA, LICS |
| 2004 | 4 | AiML, CSL (2), ISCIS |
| 2003 | 4 | CSL, FOL75, WoLLIC (2, one of which an invited talk) |
| 2002 | 3 | LPAR (2), ICLP |
| Year | Papers | Workshops |
|------|--------|-----------|
| 2007 | 6 | RULE '07, HyLo 2007, ARW '07, LPAR '07, CMSB '07, M4M 5 |
| 2006 | 2 | WRLA06, Cl&C '06 |
| Year | Theses | Authors |
|------|--------|---------|
| 2006 | 2 | Richard McKinley, Ozan Kahramanogullari |
| 2003 | 2 | Kai Brünnler, Lutz Straßburger |
| Year | Theses | Authors |
|------|--------|---------|
| 2007 | 1 | Benjamin Robert Horsfall |
Deep inference was one of the main attractions at the following international events:
Deep inference and the calculus of structures have been taught at
The Virginia Lake LaTeX macros help with typesetting deep-inference structures, derivations and atomic flows.
The DedStraker TeX macros are now obsolete.
This BibTeX database of deep-inference publications is kept up to date.