Alessio Guglielmi's Research and Teaching / Deep Inference
Deep Inference
Cloud-Based Quantum Bio-Cryptography and Creative Nano-Security for the Blockchain of Unicorns
This page tries to be a comprehensive account of the ongoing research on deep inference. Deep inference started as a personal project, but it is now growing fast and I'm struggling to keep pace with all the developments. Some of them I do not understand in detail. Please keep in mind that what I wrote below can be very subjective and not reflect the opinions of the people involved in this research. If you disagree or find inaccuracies, please let me know! If you are an expert in proof theory and want to quickly understand what deep inference is about, go to Deep Inference in One Minute.
News
2018 Andrea Aler Tubella's PhD thesis has been shortlisted in the top four for the E.W. Beth Dissertation Prize 2018.
7 July 2018 Workshop on Twenty Years of Deep Inference in Oxford.
2018–2021 Three-year EPSRC project Typed Lambda-Calculi with Sharing and Unsharing at the University of Bath.
8–9 September 2017 4th International Workshop on Structures and Deduction in Oxford.
8–10 June 2017 Second FISP Meeting in Paris.
15–17 November 2016 Workshop on The Fine Structure of Formal Proof Systems and Their Computational Interpretations in Innsbruck.
2016–2019 Three-year ANR-FWF project The Fine Structure of Formal Proof Systems and their Computational Interpretations (FISP) at Univ. Paris Diderot/PPS, INRIA Saclay/École Polytechnique, University of Innsbruck, Vienna University of Technology.
14–16 December 2015 Workshop on Efficient and Natural Proof Systems in Bath.
3–14 August 2015 Two courses on deep inference at ESSLLI 2015.
2013–2016 Three-year EPSRC project Efficient and Natural Proof Systems at the University of Bath.
Contents
 Introduction
 Proof Systems
 Proof-Theoretical Properties
 History
 Papers
 Classical and Intuitionistic Logic
 Proof Complexity
 Nested Sequents
 Modal Logic
 Linear Logic
 Commutative/Non-commutative Linear Logic
 Proof Nets, Semantics of Proofs and the War on Bureaucracy
 Language Design
 Implementations
 Notes
 Mailing List
 Grants
 Beans
 Events
 TeX Macros and BibTeX database
1 Introduction
[There is a version of this introduction with bibliographic references and examples in the paper Deep Inference.]
Deep inference could succinctly be described as an extreme form of linear logic. It is a methodology for designing proof formalisms that generalise Gentzen formalisms, i.e., the sequent calculus and natural deduction. In a sense, deep inference is obtained by applying some of the main concepts behind linear logic to the formalisms, i.e., to the rules by which proof systems are designed. By doing so, we obtain a better proof theory than the traditional one due to Gentzen. In fact, in deep inference we can provide proof systems for more logics, in a more regular and modular way, with smaller proofs, less syntax and less bureaucracy, and we have a chance to make substantial progress towards a solution to the century-old problem of the identity of proofs. The first manuscript on deep inference appeared in 1999 and the first refereed papers in 2001. So far, two formalisms have been designed and developed in deep inference: the calculus of structures and open deduction. A third one, nested sequents, introduces deep inference features into a more traditional Gentzen formalism.
Essentially, deep inference tries to understand proof composition and proof normalisation (in a very liberal sense, including cut elimination) in the most logic-agnostic way. Thanks to this, we obtain a deeper understanding of the nature of normalisation. It seems that normalisation is a primitive, simple phenomenon that manifests itself in more or less complicated ways depending on the choice of representation for proofs rather than on their true mathematical nature. By dropping syntactic constraints, as we do in deep inference compared to Gentzen, we get closer to the semantic nature of proof and proof normalisation.
The early inspiration for deep inference came from linear logic. Linear logic, among other ideas, supports the notion that logic has a geometric nature, and that a more perspicuous analysis of proofs is possible if we uncover their geometric shape, hidden behind their syntax. We can give technical meaning to this notion by looking for linearity in proofs. In the computing world, linearity can be interpreted as a way to deal with quantity or resource. The significance of linear logic for computer science has stimulated a remarkable amount of research, which continues to this day, and which ranges from the most theoretical investigations in categorical semantics to the implementation of languages and compilers and the verification of software.
Linear logic expresses locality by relying on Gentzen's formalisms. However, these had been developed for classical mathematical logic, for which linearity is not a primitive, natural notion. While attempting to relate process algebras (which are foundational models of concurrent computation) to linear logic, I realised that Gentzen's formalisms were inherently inadequate to express the most primitive notion of composition in computer science: sequential composition. This is indeed linear, but its linearity is of a different kind from that naturally supported by linear logic.
I realised then that the ideas of linear logic had to be carried all the way through, and that the formalisms themselves had to be 'linearised'. Technically, this turned out to be possible by dropping one of the assumptions that Gentzen implicitly used, namely that the (geometric) shape of proofs is directly related to the shape of the formulae that they prove. In deep inference, we do not make this assumption, and we get proofs whose shape is much more liberally determined than in Gentzen's formalisms. As an immediate consequence, we were able to capture the sequential composition of process algebras, but we soon realised that the new formalism offered unprecedented opportunities both for a more satisfying general theory of proofs and for more applications in computer science.
1.1 Proof Systems
The difference between Gentzen formalisms and deep inference ones is that in deep inference we compose proofs with the same connectives that compose formulae: if
\(\Phi=\array{A\\ \Vert\\ B\\}\) and \(\Psi=\array{C\\ \Vert\\ D\\}\)
are two proofs with, respectively, premisses \(A\) and \(C\), and conclusions \(B\) and \(D\), then
\(\Phi\land\Psi=\array{A\land C\\ \Vert\\ B\land D\\}\) and \(\Phi\lor\Psi=\array{A\lor C\\ \Vert\\ B\lor D\\}\)
are valid proofs with, respectively, premisses \(A\land C\) and \(A\lor C\), and conclusions \(B\land D\) and \(B\lor D\). Significantly, while \(\Phi\land\Psi\) can be represented in Gentzen, \(\Phi\lor\Psi\) cannot. That is basically the definition of deep inference and it holds for every language – not just for propositional classical logic.
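To make this composition mechanism concrete, here is a minimal sketch in Python; the representation (formulae as nested tuples) and all names are mine, chosen for illustration, and not taken from any deep inference implementation.

```python
# A minimal sketch of derivations composed by the connectives of formulae.
# Formulae are nested tuples such as ("and", "A", "C"); derivation bodies
# are elided, keeping only premiss and conclusion.

class Derivation:
    """A derivation from a premiss down to a conclusion."""
    def __init__(self, premiss, conclusion):
        self.premiss = premiss
        self.conclusion = conclusion

    def conj(self, other):
        # Phi ∧ Psi: from A ∧ C down to B ∧ D
        return Derivation(("and", self.premiss, other.premiss),
                          ("and", self.conclusion, other.conclusion))

    def disj(self, other):
        # Phi ∨ Psi: from A ∨ C down to B ∨ D; this composition has no
        # counterpart in Gentzen formalisms
        return Derivation(("or", self.premiss, other.premiss),
                          ("or", self.conclusion, other.conclusion))

phi = Derivation("A", "B")
psi = Derivation("C", "D")
print(phi.conj(psi).premiss)     # ('and', 'A', 'C')
print(phi.disj(psi).conclusion)  # ('or', 'B', 'D')
```

Note that, in this data-structure view, `conj` and `disj` are perfectly symmetric: the asymmetry only appears when one tries to express the resulting object in a Gentzen formalism.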
As an example, I will show the standard deep inference system for propositional logic. System \(\SKS\) is a proof system defined by the following structural inference rules (where \(a\) and \(\bar a\) are dual atoms):
\(\inf\id{\t}{a\lor\bar a}\) (identity), \(\inf\wd{\f}{a}\) (weakening), \(\inf\cd{a\lor a}{a}\) (contraction);

\(\inf\iu{a\land\bar a}{\f}\) (cut), \(\inf\wu{a}{\t}\) (coweakening), \(\inf\cu{a}{a\land a}\) (cocontraction),
and by the following two logical inference rules:
\(\inf\s{A\land(B\lor C)}{(A\land B)\lor C}\) (switch) and \(\inf\m{(A\land B)\lor(C\land D)}{(A\lor C)\land(B\lor D)}\) (medial).
A cut-free derivation is a derivation where \(\iu\) is not used, i.e., a derivation in \(\SKS\setminus\{\iu\}\). In addition to these rules, there is a rule
\[
\inf{=}{C}{D}
\quad,
\]
such that \(C\) and \(D\) are opposite sides in one of the following equations:
\[
\begin{array}{rcl}
A\lor B&=&B\lor A\\
A\land B&=&B\land A\\
(A\lor B)\lor C&=&A\lor(B\lor C)\\
(A\land B)\land C&=&A\land(B\land C)\\
\end{array}
\qquad
\begin{array}{rcl}
A\lor\f&=&A\\
A\land\t&=&A\\
\t\lor\t&=&\t\\
\f\land\f&=&\f\\
\end{array}
\quad.
\]
We do not always show the instances of rule \(=\), and when we do show them, we gather several contiguous instances into one.
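The equational theory of the \(=\) rule is simple enough to be decided by normalising both sides. The following sketch (the nested-tuple representation and all names are my own, for illustration only) flattens associativity, sorts arguments for commutativity, and applies the four unit equations:

```python
# Deciding the "=" rule: two formulae are related iff they have the same
# normal form modulo commutativity, associativity and the unit equations.
# Atoms and the units "t"/"f" are strings; compound formulae are tuples.

def norm(f):
    if isinstance(f, str):               # an atom or a unit
        return f
    op = f[0]                            # "or" or "and"
    unit = "f" if op == "or" else "t"
    args = []
    for sub in f[1:]:
        sub = norm(sub)
        if isinstance(sub, tuple) and sub[0] == op:
            args.extend(sub[1:])         # associativity: flatten nested op
        elif sub != unit:
            args.append(sub)             # unit laws: A ∨ f = A, A ∧ t = A
    # t ∨ t = t and f ∧ f = f: collapse duplicates of the other unit
    dual_unit = "t" if op == "or" else "f"
    while args.count(dual_unit) > 1:
        args.remove(dual_unit)
    if not args:
        return unit
    if len(args) == 1:
        return args[0]
    return (op,) + tuple(sorted(args, key=str))   # commutativity: sort

def eq(c, d):
    return norm(c) == norm(d)

print(eq(("or", "a", "f"), "a"))                           # True
print(eq(("and", ("and", "a", "b"), "c"),
         ("and", "a", ("and", "b", "c"))))                 # True
print(eq(("or", "a", "b"), ("and", "a", "b")))             # False
```

The normal form is deliberately faithful to the equations listed above: for instance \(a\land\f\) is not simplified, because \(A\land\f=\f\) is not among them.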
For example, this is a valid derivation:
\[
\array{(a\lor b)\land a\\\Vert\\((a\lor b)\land a)\land((a\lor b)\land a)}
\equiv
\vcenter{\fbox{\(\inf\m{\fbox{\(\inf\cu{a}{a\land a}\)}
\lor
\fbox{\(\inf\cu{b}{b\land b}\)}}
{(a\lor b)\land(a\lor b)}
\)}}
\land
\vcenter{\fbox{\(\inf\cu{a}{a\land a}\)}}
\quad.
\]
This derivation illustrates a general principle in deep inference: structural rules on generic formulae (in this case a cocontraction) can be replaced by corresponding structural rules on atoms (in this case \(\cu\)).
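Because the structural rules of \(\SKS\) act only on single atoms, checking whether a premiss/conclusion pair is an instance of one of them takes constant time. Here is an illustrative recogniser; the representation and names are mine, and the trailing `*` marking the dual atom is a convention chosen here, not the page's notation.

```python
# A recogniser for the six atomic structural rules of SKS, given as
# (premiss, conclusion) pairs. Atoms are strings, "a*" is the dual of "a",
# and "t"/"f" are the units.

def is_atom(x):
    return isinstance(x, str) and x not in ("t", "f")

def dual(a):
    return a[:-1] if a.endswith("*") else a + "*"

def rule_of(premiss, conclusion):
    # identity:  t -> a ∨ ā            cut:  a ∧ ā -> f
    if premiss == "t" and isinstance(conclusion, tuple) \
            and conclusion[0] == "or" and len(conclusion) == 3:
        _, a, b = conclusion
        if is_atom(a) and b == dual(a):
            return "identity"
    if conclusion == "f" and isinstance(premiss, tuple) \
            and premiss[0] == "and" and len(premiss) == 3:
        _, a, b = premiss
        if is_atom(a) and b == dual(a):
            return "cut"
    # weakening:  f -> a               coweakening:  a -> t
    if premiss == "f" and is_atom(conclusion):
        return "weakening"
    if conclusion == "t" and is_atom(premiss):
        return "coweakening"
    # contraction:  a ∨ a -> a         cocontraction:  a -> a ∧ a
    if is_atom(conclusion) and premiss == ("or", conclusion, conclusion):
        return "contraction"
    if is_atom(premiss) and conclusion == ("and", premiss, premiss):
        return "cocontraction"
    return None

print(rule_of("t", ("or", "a", "a*")))   # identity
print(rule_of("a", ("and", "a", "a")))   # cocontraction
```

The constant-time check is exactly the locality property discussed below: no rule instance requires inspecting an unbounded portion of a formula.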
It is interesting to note that the inference rules for classical logic, as well as the great majority of rules for any logic, all derive from a common template which has been distilled from the semantics of a purely linear logic in the first deep inference paper. Since this phenomenon is very surprising, especially for structural rules such as weakening and contraction, we believe that we might be dealing with a rather deep aspect of logic and we are currently investigating it.
1.2 Proof-Theoretical Properties
Locality and linearity are foundational concepts for deep inference, in the same spirit as they are for linear logic. Going for locality and linearity basically means going for complexity bounded by a constant. This last idea introduces geometry into the picture because bounded complexity leads us to equivalence modulo continuous deformation. In a few words, the simple and natural definition of deep inference that we have seen above captures these ideas about linearity, locality and geometry, and can consequently be exploited in many ways, and notably:
 to recover a De Morgan premiss-conclusion symmetry that is lost in Gentzen;
 to obtain new notions of normalisation in addition to cut elimination;
 to shorten analytic proofs by exponential factors compared to Gentzen;
 to obtain quasi-polynomial-time normalisation for propositional logic;
 to express logics that cannot be expressed in Gentzen;
 to make the proof theory of a vast range of logics regular and modular;
 to get proof systems whose inference rules are local, which is usually impossible in Gentzen;
 to inspire a new generation of proof nets and semantics of proofs;
 to investigate the nature of cut elimination;
 to type optimised versions of the λ-calculus that are not typeable in Gentzen;
 to model process algebras;
 to model quantum causal evolution ...
 ... and much more.
One of the open questions is whether deep inference might have a positive influence on the proof-search-as-computation paradigm and possibly on focusing. This subject has so far been almost unexplored, but some preliminary work looks very promising.
The core topic of every prooftheoretic investigation, namely normalisation, deserves a special mention. Traditionally, normalisation is at the core of proof theory, and this is, of course, the same for deep inference. Normalisation in deep inference is not much different, in principle, from normalisation in Gentzen theory. In practice, however, the more liberal proof composition mechanism of deep inference completely invalidates the techniques (and the intuition) behind cut elimination procedures in Gentzen systems. Much of the effort of these 15 years of research on deep inference went into recovering a normalisation theory. One of the main ideas is called splitting and at present it is the most general method we know for eliminating cuts in deep inference.
On the other hand, we now have techniques that are not as widely applicable but that are of a completely different nature from splitting, which is combinatorial. A surprising, relatively recent result consists in exploiting deep inference's locality to obtain the first purely geometric normalisation procedure, by a topological device that we call atomic flows. This means that, at least for classical logic and logics that extend it, cut elimination can be understood as a process that is completely independent of logical information: only the shape of the proof, determined by its structural information (the creation, duplication and erasure of atoms), matters. Logical information, such as the connectives in formulae, does not matter. This hints at a deeper nature of normalisation than we had previously thought.
1.3 History
Deep inference comes from linear logic and process algebras; more specifically, it comes from seeing proofs as concurrent processes. The first development was the definition of the calculus of structures and a cut elimination proof for the logic BV, which was studied as the logical counterpart of the core of the process algebra CCS. We realised that the techniques developed for BV had a much wider applicability, so we broadly developed the calculus of structures and studied its many novel normalisation properties. The initial phase of development took place in Dresden, from 1999 to 2003; deep inference is now developed in several laboratories around the world. The recent results on modal and intuitionistic logics, proof nets and semantics, and implementations complete the establishment of deep inference as a solid and comprehensive methodology in proof theory.
2 Papers
The following material is broad in scope; if you are new to deep inference and the calculus of structures, start here:
 Modular Normalisation of Classical Proofs
Benjamin Ralph
The main contribution of this thesis is to present a study of two normalisation theorems and proofs in classical logic: one propositional, one first-order. For propositional logic, we show a local cycle removal procedure through reductions on merge contractions that ensures that proofs can be decomposed—that contractions can be pushed to the bottom of a proof—in a straightforward way. For first-order logic, we show how decomposition of proofs can correspond to two presentations of Herbrand’s Theorem, and how we can use translations into expansion proofs to give a new, indirect cut elimination theorem for first-order logic.
In addition, an old but interesting cut elimination method for propositional logic, the experiments method, is formally presented for the first time, and we extend the theory of merge contractions to first-order logic.
Pdf 8 February 2019
PhD thesis, defended on 30 October 2018
 The Atomic LambdaMu Calculus
Fanny He
A cornerstone of theoretical computer science is the Curry-Howard correspondence, where formulas are types, proofs are programs, and proof normalization is computation. In this framework we introduce the atomic λμ-calculus, an interpretation of a classical deep inference proof system. It is based on two extensions of the λ-calculus, the λμ-calculus and the atomic λ-calculus. The former interprets classical logic, featuring continuation-like constructs, while the latter interprets intuitionistic deep inference, featuring explicit sharing operators.
The main property of the atomic λ-calculus is reduction on individual constructors, derived from atomicity in deep inference. We thus work in open deduction, a deep inference formalism allowing composition with connectives and with derivations, and using the medial rule to obtain atomicity. One challenge is to find a suitable formulation for deriving a computational interpretation of classical natural deduction. A second design challenge leads us to work on a variant of the λμ-calculus, the ΛμS-calculus, adding streams and dropping names.
We show that our calculus has preservation of strong normalization (PSN), confluence, fully-lazy sharing, and subject reduction in the typed case. There are two challenges with PSN. First, we need to show that sharing reductions strongly normalize, underlining that only β,μ-reductions create divergence. Our proof is new and follows a graphical approach to terms close to the idea of sharing. Second, infinite reductions of the atomic calculus can appear in weakenings, creating infinite atomic paths corresponding to finite ΛμS-paths. Our solution is to separate the proof into two parts, isolating the problem of sharing from that of weakening. We first translate into an intermediate weakening calculus, which unfolds shared terms while keeping weakened ones, and preserves infinite reductions. We then design a reduction strategy preventing infinite paths from falling into weakenings.
Pdf 11 July 2018
PhD thesis, defended on 10 April 2018
 A Study of Normalisation Through Subatomic Logic
Andrea Aler Tubella
We introduce subatomic logic, a new methodology where by looking inside of atoms we are able to represent a wide variety of proof systems in such a way that every rule is an instance of a single, regular, linear rule scheme. We show the generality of the subatomic approach by presenting how it can be applied to several different systems with very different expressivity.
In this thesis we use subatomic logic to study two normalisation procedures: cut-elimination and decomposition. In particular, we study cut-elimination by characterising a whole class of substructural logics and giving a generalised cut-elimination procedure for them, and we study decomposition by providing generalised rewriting rules for derivations that we can then apply to decompose derivations and to eliminate cycles.
Pdf 31 January 2017
PhD thesis, defended on 21 December 2016, shortlisted in the top four for the E.W. Beth Dissertation Prize 2018
 Deep Inference
Alessio Guglielmi
Pdf 30 November 2014
All About Proofs, Proofs for All, Mathematical Logic and Foundations, College Publications, pp. 164–172, 2015
 The Complexity of Propositional Proofs in Deep Inference
Anupam Das
Deep inference is a relatively recent proof methodology whose systems differ from traditional systems by allowing inference rules to operate on any connective appearing in a formula, rather than just the main connective. Its distinguishing feature, from a structural proof-theoretic point of view, is that its systems are local: inference steps can be checked in constant time, a property impossible to achieve in Gentzen sequent calculi.
Due to the greater flexibility in applying inference rules, deep inference systems introduce more normal forms to classical proof theory, splitting Gentzen cut-elimination into smaller steps. While the complexity of full cut-elimination in the sequent calculus is necessarily exponential in the size of the input proof, these intermediate procedures exhibit varying complexities, while still entailing properties of proof-theoretic interest, such as consistency and decidability.
This dissertation contributes to the classification of these intermediate classes of proof, from the point of view of proof complexity, and introduces new techniques to analyse their complexity.
Pdf 20 October 2014
PhD thesis, defended on 18 October 2013
 Nested Deduction in Logical Foundations for Computation
Nicolas Guenot
In this thesis, we study various approaches to giving computational content to logical systems based on the deep inference principle, which generalises traditional formalisms (such as natural deduction and the sequent calculus) by allowing inference rules to be applied at any position inside a formula. First, we follow the Curry-Howard correspondence approach, defining deep inference systems for intuitionistic logic, in particular in the calculus of structures, and describing a purely syntactic normalisation procedure. This allows us to interpret the proofs of such a system as lambda-terms and to connect normalisation to reduction in a lambda-calculus with explicit substitutions. In particular, we study the extensions of the lambda-calculus that can be obtained by exploiting the specific features of our logical systems. Finally, we consider proof search as computation, notably through the adaptation of the focusing technique (from linear logic) to the calculus of structures, which drastically reduces the non-determinism of proof construction. We also show how proof search in intuitionistic logic can be related to reduction in the lambda-calculus with explicit substitutions.
Pdf 10 April 2013
PhD thesis, defended on 10 April 2013
 Towards a Theory of Proofs of Classical Logic
Lutz Straßburger
Pdf 3 December 2010
Habilitation thesis

A General View of Normalisation Through Atomic Flows
Tom Gundersen
Atomic flows are a geometric invariant of classical propositional proofs in deep inference. In this thesis we use atomic flows to describe new normal forms of proofs, of which the traditional normal forms are special cases; we also give several normalisation procedures for obtaining the normal forms. We define, and use to present our results, a new deep-inference formalism called the functorial calculus, which is more flexible than the traditional calculus of structures. To our surprise, we are able to 1) normalise proofs without looking at their logical connectives or logical rules; and 2) normalise proofs in less than exponential time.
Pdf 12 August 2010
PhD thesis, defended on 10 November 2009
You can buy this book at Lambert Academic Publishing
 Nested Sequents
Kai Brünnler
We see how nested sequents, a natural generalisation of hypersequents, allow us to develop a systematic proof theory for modal logics. As opposed to other prominent formalisms, such as the display calculus and labelled sequents, nested sequents stay inside the modal language and allow for proof systems which enjoy the subformula property in the literal sense.
In the first part we study a systematic set of nested sequent systems for all normal modal logics formed by some combination of the axioms for seriality, reflexivity, symmetry, transitivity and euclideanness. We establish soundness and completeness and some of their good properties, such as invertibility of all rules, admissibility of the structural rules, termination of proof-search, as well as syntactic cut-elimination.
In the second part we study the logic of common knowledge, a modal logic with a fixpoint modality. We look at two infinitary proof systems for this logic: an existing one based on ordinary sequents, for which no syntactic cut-elimination procedure is known, and a new one based on nested sequents. We see how nested sequents, in contrast to ordinary sequents, allow for syntactic cut-elimination and thus allow us to obtain an ordinal upper bound on the length of proofs.
Pdf 13 April 2010
Habilitation thesis

Nondeterminism and Language Design in Deep Inference
Ozan Kahramanoğulları
This thesis studies the design of deep-inference deductive systems. In systems with deep inference, in contrast to traditional proof-theoretic systems, inference rules can be applied at any depth inside logical expressions. Deep applicability of inference rules provides a rich combinatorial analysis of proofs. Deep inference also makes it possible to design deductive systems that are tailored for computer science applications and are otherwise provably not expressible.
By applying the inference rules deeply, logical expressions can be manipulated starting from their subexpressions. This way, we can simulate analytic proofs in traditional deductive formalisms. Furthermore, we can also construct much shorter analytic proofs than in these other formalisms. However, deep applicability of inference rules causes much greater nondeterminism in proof construction.
This thesis attacks the problem of dealing with nondeterminism in proof search while preserving the shorter proofs that are available thanks to deep inference. By redesigning the deep-inference deductive systems, some redundant applications of the inference rules are prevented. By introducing a new technique which reduces nondeterminism, it becomes possible to obtain more immediate access to shorter proofs, without breaking certain proof-theoretical properties such as cut-elimination. Different implementations presented in this thesis allow us to perform experiments on the techniques that we developed and to observe the performance improvements. Within a computation-as-proof-search perspective, we use deep-inference deductive systems to develop a common proof-theoretic language for the two fields of planning and concurrency.
Pdf 11 September 2006
PhD thesis, defended on 21 December 2006
You can buy this book at Lambert Academic Publishing
 Categorical Models of First Order Classical Proofs
Richard McKinley
This thesis introduces the notion of a classical doctrine: a semantics for proofs in first-order classical logic derived from the classical categories of Führmann and Pym, using Lawvere's notion of hyperdoctrine. We introduce a hierarchy of classes of model, increasing in the strength of the cut-reduction theory they model; the weakest captures cut reduction, and the strongest gives De Morgan duality between quantifiers as an isomorphism. Whereas classical categories admit the elimination of logical cuts as equalities (and cuts against structural rules as inequalities), classical doctrines admit certain logical cuts as inequalities only. This is a result of the additive character of the quantifier introduction rules, as is illustrated by a concrete model based on families of sets and relations, using an abstract Geometry of Interaction construction.
We establish that each class of models is sound and complete with respect to the relevant cut-reduction theory on proof nets based on those of Robinson for propositional classical logic. We show also that classical categories and classical doctrines are models not only for the sequent calculus, but also for deep inference calculi due to Brünnler for classical logic. Of particular interest are the local systems for classical logic, which we show are modelled by categorical models with an additional axiom forcing monoidality of certain functors; these categorical models correspond to multiplicative presentations of the sequent calculus with additional additive features.
Pdf 30 March 2006
PhD thesis, defended on 17 March 2006
 Deep Inference and Its Normal Form of Derivations
Kai Brünnler
We see a notion of normal derivation for the calculus of structures, which is based on a factorisation of derivations and which is more general than the traditional notion of cutfree proof in this formalism.
Pdf 29 March 2006
CiE 2006, LNCS 3988, pp. 65–74

Deep Inference and Symmetry in Classical Proofs
Kai Brünnler
In this thesis we see deductive systems for classical propositional and predicate logic which use deep inference, i.e. inference rules apply arbitrarily deep inside formulas, and a certain symmetry, which provides an involution on derivations. Like sequent systems, they have a cut rule which is admissible. Unlike sequent systems, they enjoy various new interesting properties. Not only the identity axiom, but also cut, weakening and even contraction are reducible to atomic form. This leads to inference rules that are local, meaning that the effort of applying them is bounded, and finitary, meaning that, given a conclusion, there is only a finite number of premises to choose from. The systems also enjoy new normal forms for derivations and, in the propositional case, a cut elimination procedure that is drastically simpler than the ones for sequent systems.
Pdf February 2004
PhD thesis, defended on 22 September 2003, published by Logos Verlag
You can buy this book at LogosVerlag and at Amazon
 Linear Logic and Noncommutativity in the Calculus of Structures
Lutz Straßburger
In this thesis I study several deductive systems for linear logic, its fragments, and some noncommutative extensions. All systems will be designed within the calculus of structures, which is a proof-theoretical formalism for specifying logical systems, in the tradition of Hilbert's formalism, natural deduction, and the sequent calculus. Systems in the calculus of structures are based on two simple principles: deep inference and top-down symmetry. Together they have remarkable consequences for the properties of the logical systems. For example, for linear logic it is possible to design a deductive system in which all rules are local. In particular, the contraction rule is reduced to an atomic version, and there is no global promotion rule. I will also show an extension of multiplicative exponential linear logic by a noncommutative, self-dual connective which is not representable in the sequent calculus. All systems enjoy the cut elimination property. Moreover, this can be proved independently of the sequent calculus via techniques that are based on the new top-down symmetry. Furthermore, for all systems, I will present several decomposition theorems which constitute a new type of normal form for derivations.
Pdf Pdf in booklet format 25 July 2003
PhD thesis, defended on 24 July 2003
In the rest of the section, all the papers I know of are listed according to their subject, in no particular order.
2.1 Classical and Intuitionistic Logic
So far, for classical logic in the calculus of structures, we have achieved the following:
 the cut rule trivially reduces to atomic form;
 one can show cut elimination for the propositional fragment by the simplest cut elimination argument to date;
 the propositional fragment is fully local, including contraction;
 first-order classical logic can be made entirely finitary;
 cut elimination and decomposition theorems are proved;
 a typed lambda-calculus with explicit sharing can be obtained via a Curry-Howard interpretation of deep inference.
We can present intuitionistic logic in the calculus of structures with a fully local, cut-free system. The logic of bunched implications BI can be presented in the calculus of structures. Japaridze's cirquent calculus benefits from a deep-inference presentation, in particular in the case of propositional logic.
The following papers exist, in addition to Deep Inference and Its Normal Form of Derivations, Deep Inference and Symmetry in Classical Proofs and Towards a Theory of Proofs of Classical Logic, mentioned above:
 A Natural Proof System for Herbrand’s Theorem
Benjamin Ralph
The reduction of undecidable first-order logic to decidable propositional logic via Herbrand’s theorem has long been of interest to theoretical computer science, with the notion of a Herbrand proof motivating the definition of expansion proofs. The problem of building a natural proof system around expansion proofs, with composition of proofs and cut-free completeness, has been approached from a variety of different angles. In this paper we construct a simple deep inference system for first-order logic, KSh2, based around the notion of expansion proofs, as a starting point for developing a rich proof theory around this foundation. Translations between proofs in this system and expansion proofs are given, retaining much of the structure in each direction.
Pdf 14 November 2017
LFCS 2018, LNCS 10703, pp. 289–308 – Winner of the Rosser Prize for best student paper
 Removing Cycles from Proofs
Andrea Aler Tubella, Alessio Guglielmi and Benjamin Ralph
If we track atom occurrences in classical propositional proofs in deep inference, we see that they can form cyclic structures between cuts and identity steps. These cycles are an obstacle to a very natural form of normalisation, that simply unfolds all the contractions in a proof. This mechanism, which we call 'decomposition', has many points of contact with explicit substitutions in lambda calculi. In the presence of cycles, decomposition does not terminate, and this is an obvious drawback if we want to interpret proofs computationally. One way of eliminating cycles is eliminating cuts. However, we could ask ourselves whether it is possible to eliminate cycles independently of (general) cut elimination. This paper shows an efficient way to do so, therefore establishing the independence of decomposition from cut elimination. In other words, cut elimination in propositional logic can be separated into three separate procedures: 1) cycle elimination, 2) unfolding of contractions, 3) elimination of cuts in the linear fragment.
Pdf 20 June 2017
CSL 2017, Leibniz International Proceedings in Informatics (LIPIcs) 82, pp. 9:1–17
 On Linear Rewriting Systems for Boolean Logic and Some Applications to Proof Theory
Anupam Das and Lutz Straßburger
Linear rules have played an increasing role in structural proof theory in recent years. It has been observed that the set of all sound linear inference rules in Boolean logic is already coNP-complete, i.e. that every Boolean tautology can be written as a (left- and right-)linear rewrite rule. In this paper we study properties of systems consisting only of linear inferences. Our main result is that the length of any ‘nontrivial’ derivation in such a system is bounded by a polynomial. As a consequence there is no polynomial-time decidable sound and complete system of linear inferences, unless coNP = NP. We draw tools and concepts from term rewriting, Boolean function theory and graph theory in order to access some required intermediate results. At the same time we make several connections between these areas that, to our knowledge, have not yet been presented and constitute a rich theoretical framework for reasoning about linear TRSs for Boolean logic.
 Symmetric Normalisation for Intuitionistic Logic
Nicolas Guenot and Lutz Straßburger
We present two proof systems for implication-only intuitionistic logic in the calculus of structures. The first is a direct adaptation of the standard sequent calculus to the deep inference setting, and we describe a procedure for cut elimination, similar to the one from the sequent calculus, but using a non-local rewriting. The second system is the symmetric completion of the first, as normally given in deep inference for logics with a De Morgan duality: all inference rules have duals, just as cut is dual to the identity axiom. We prove a generalisation of cut elimination, that we call symmetric normalisation, where all rules dual to standard ones are permuted up in the derivation. The result is a decomposition theorem having cut elimination and interpolation as corollaries.
Pdf 20 May 2014
CSL-LICS 2014 Proceedings (ACM), pp. 45:1–10
 A Proof of Strong Normalisation of the Typed Atomic Lambda-Calculus
Tom Gundersen, Willem Heijltjes and Michel Parigot
The atomic lambda-calculus is a typed lambda-calculus with explicit sharing, which originates in a Curry-Howard interpretation of a deep-inference system for intuitionistic logic. It has been shown that it allows fully lazy sharing to be reproduced in a typed setting. In this paper we prove strong normalisation of the typed atomic lambda-calculus using Tait’s reducibility method.
Pdf 10 October 2013
LPAR-19, LNCS 8312, pp. 340–354
 Atomic Lambda Calculus: A Typed Lambda-Calculus with Explicit Sharing
Tom Gundersen, Willem Heijltjes and Michel Parigot
An explicit-sharing lambda-calculus is presented, based on a Curry-Howard-style interpretation of the deep inference proof formalism. Duplication of subterms during reduction proceeds ‘atomically’, i.e. on individual constructors, similar to optimal graph reduction in the style of Lamping. The calculus preserves strong normalisation with respect to the lambda-calculus, and achieves fully lazy sharing.
Pdf 21 May 2013
LICS 2013 Proceedings (IEEE), pp. 311–320
 Nested Proof Search as Reduction in the λ-calculus
Nicolas Guenot
We present here a proof system called JS for purely implicative intuitionistic logic at the propositional level in the formalism of the calculus of structures, which is a generalisation of the sequent calculus implementing the deep inference methodology. We show that this system is sound and complete with respect to the usual sequent calculus LJ, and consider a restricted system JLSd for a restricted class of formulas. Moreover, we show how to encode λ-terms with explicit substitutions inside formulas of JLSd and prove that there is a correspondence between proof search in JLSd and reduction in a λ-calculus with explicit substitutions. Finally, we present a restriction JLSn of JLSd which allows us to establish the same correspondence with the standard λ-calculus equipped with β-reduction, and show that we can prove results on the reductions of our λ-calculus with explicit substitutions, as well as the correspondence between the standard β-reduction and explicit substitutions, by proving results on derivations in the JLSd and JLSn systems.
Pdf 12 May 2011
PPDP 2011, ACM, pp. 183–193
 Breaking Paths in Atomic Flows for Classical Logic
Alessio Guglielmi, Tom Gundersen and Lutz Straßburger
This work belongs to a wider effort aimed at eliminating syntactic bureaucracy from proof systems. In this paper, we present a novel cut elimination procedure for classical propositional logic. It is based on the recently introduced atomic flows: they are purely graphical devices that abstract away from much of the typical bureaucracy of proofs. We make crucial use of the path breaker, an atomic-flow construction that avoids some nasty termination problems, and that can be used in any proof system with sufficient symmetry. This paper contains an original 2-dimensional-diagram exposition of atomic flows, which helps us to connect atomic flows with other known formalisms.
Pdf 29 April 2010
LICS 2010 Proceedings (IEEE), pp. 284–293
 An Algorithmic Interpretation of a Deep Inference System
Kai Brünnler and Richard McKinley
We set out to find something that corresponds to deep inference in the same way that the lambda-calculus corresponds to natural deduction. Starting from natural deduction for the conjunction-implication fragment of intuitionistic logic we design a corresponding deep inference system together with reduction rules on proofs that allow a fine-grained simulation of beta-reduction.
Pdf 22 September 2008
LPAR 08, LNCS 5330, pp. 482–496
 Cirquent Calculus Deepened
Giorgi Japaridze
Cirquent calculus is a new proof-theoretic and semantic framework, whose main distinguishing feature is being based on circuit-style structures (called cirquents), as opposed to the more traditional approaches that deal with tree-like objects such as formulas, sequents or hypersequents. Among its advantages are greater efficiency, flexibility and expressiveness. This paper presents a detailed elaboration of a deep-inference cirquent logic, which is naturally and inherently resource conscious. It shows that classical logic, both syntactically and semantically, can be seen to be just a special, conservative fragment of this more general and, in a sense, more basic logic — the logic of resources in the form of cirquent calculus. The reader will find various arguments in favor of switching to the new framework, such as arguments showing the insufficiency of the expressive power of linear logic or other formula-based approaches to developing resource logics, exponential improvements over the traditional approaches in both representational and proof complexities offered by cirquent calculus (including the existence of polynomial-size cut-, substitution- and extension-free cirquent calculus proofs for the notoriously hard pigeonhole principle), and more. Among the main purposes of this paper is to provide an introductory-style starting point for what, as the author wishes to hope, might have a chance to become a new line of research in proof theory — a proof theory based on circuits instead of formulas.
Pdf 1 April 2008
Journal of Logic and Computation, 18 (6) 2008, pp. 983–1028
 Normalisation Control in Deep Inference Via Atomic Flows
Alessio Guglielmi and Tom Gundersen
We introduce 'atomic flows': they are graphs obtained from derivations by tracing atom occurrences and forgetting the logical structure. We study simple manipulations of atomic flows that correspond to complex reductions on derivations. This allows us to prove, for propositional logic, a new and very general normalisation theorem, which contains cut elimination as a special case. We operate in deep inference, which is more general than other syntactic paradigms, and where normalisation is more difficult to control. We argue that atomic flows are a significant technical advance for normalisation theory, because 1) the technique they support is largely independent of syntax; 2) indeed, it is largely independent of logical inference rules; 3) they constitute a powerful geometric formalism, which is more intuitive than syntax.
Pdf 31 March 2008
Logical Methods in Computer Science 4 (1:9) 2008, pp. 1–36
 The Logic of Bunched Implications: A Memoir
Benjamin Robert Horsfall
This is a study of the semantics and proof theory of the logic of bunched implications (BI), which is promoted as a logic of (computational) resources, and is a foundational component of separation logic, an approach to program analysis. BI combines an additive, or intuitionistic, fragment with a multiplicative fragment. The additive fragment has full use of the structural rules of weakening and contraction, and the multiplicative fragment has none. Thus it contains two conjunctive and two implicative connectives. At various points, we illustrate a resource view of BI based upon the Kripke resource semantics. Our first original contribution is the formulation of a proof system for BI in the newly developed proof-theoretical formalism of the calculus of structures. The calculus of structures is distinguished by its employment of deep inference, but we already see deep inference in a limited form in the established proof theory for BI. We show that our system is sound with respect to the elementary Kripke resource semantics for BI, and complete with respect to the partially-defined monoid (PDM) semantics. Our second contribution is the development from a semantic standpoint of preliminary ideas for a hybrid logic of bunched implications (HBI). We give a Kripke semantics for HBI in which nominal propositional atoms can be seen as names for resources, rather than as names for locations, as is the case with related proposals for BI-Loc and for intuitionistic hybrid logic. The cost of this approach is the loss of intuitionistic monotonicity in the semantics. But this is perhaps not such a grave loss, given that our guiding analogy is of states of models with resources, rather than with states of knowledge, as is standard for intuitionistic logic.
Pdf June 2007
MSc thesis, successfully defended in August 2007
 A Characterisation of Medial as Rewriting Rule
Lutz Straßburger
Medial is an inference rule scheme that appears in various deductive systems based on deep inference. In this paper we investigate the properties of medial as a rewriting rule, independently of logic. We present a graph-theoretical criterion for checking whether there exists a medial rewriting path between two formulas. Finally, we return to logic and apply our criterion to give a combinatorial proof of a decomposition theorem, i.e., a proof-theoretical statement about syntax.
Pdf 13 April 2007
RTA 2007, LNCS 4533, pp. 344–358
 A Local System for Intuitionistic Logic
Alwen Tiu
This paper presents systems for first-order intuitionistic logic and several of its extensions in which all the propositional rules are local, in the sense that, in applying the rules of the system, one needs only a fixed amount of information about the logical expressions involved. The main source of non-locality is the contraction rules. We show that the contraction rules can be restricted to the atomic ones, provided we employ deep inference, i.e., allow rules to apply anywhere inside logical expressions. We further show that the use of deep inference allows for modular extensions of intuitionistic logic to Dummett's intermediate logic LC, Gödel logic and classical logic. We present the systems in the calculus of structures, a proof-theoretic formalism which supports deep inference. Cut elimination for these systems is proved indirectly by simulating the cut-free sequent systems, or the hypersequent systems in the cases of Dummett's LC and Gödel logic, in the cut-free systems in the calculus of structures.
 Locality for Classical Logic
Kai Brünnler
In this paper we will see deductive systems for classical propositional and predicate logic in the calculus of structures. Like sequent systems, they have a cut rule which is admissible. Unlike sequent systems, they drop the restriction that rules only apply to the main connective of a formula: their rules apply anywhere deeply inside a formula. This allows us to observe very clearly the symmetry between the identity axiom and the cut rule. This symmetry allows us to reduce the cut rule to atomic form in a way which is dual to reducing the identity axiom to atomic form. We also reduce weakening and even contraction to atomic form. This leads to inference rules that are local: they do not require the inspection of expressions of arbitrary size.
Pdf 10 March 2006
Notre Dame Journal of Formal Logic 47 (4) 2006, pp. 557–580
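The reduction of contraction to atomic form described above hinges on the medial rule; the rules involved can be sketched as follows in standard SKS-style notation (a sketch only; the paper's exact presentation may differ):

```latex
% General contraction, its atomic restriction and medial,
% each written with the premiss above and the conclusion below.
\[
  \mathsf{c}{\downarrow}:\ \frac{A \lor A}{A}
  \qquad
  \mathsf{ac}{\downarrow}:\ \frac{a \lor a}{a}
  \qquad
  \mathsf{m}:\ \frac{(A \land B) \lor (C \land D)}{(A \lor C) \land (B \lor D)}
\]
% Medial permutes contraction below conjunction: a contraction on
% A ∧ B is replaced by a medial step followed by contractions on A and
% on B, so that eventually only atomic contractions ac↓ remain.
```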
 Cut Elimination Inside a Deep Inference System for Classical Predicate Logic
Kai Brünnler
Deep inference is a natural generalisation of the one-sided sequent calculus where rules are allowed to apply deeply inside formulas, much like rewrite rules in term rewriting. This freedom in applying inference rules allows us to express logical systems that are difficult or impossible to express in the cut-free sequent calculus and it also allows for a more fine-grained analysis of derivations than the sequent calculus. However, the same freedom also makes it harder to carry out this analysis, in particular it is harder to design cut elimination procedures. In this paper we see a cut elimination procedure for a deep inference system for classical predicate logic. As a consequence we derive Herbrand's Theorem, which we express as a factorisation of derivations.
Pdf 9 March 2005
Studia Logica 82 (1) 2006, pp. 51–71
 A First Order System with Finite Choice of Premises
Kai Brünnler and Alessio Guglielmi
We present an inference system for classical first order logic in which each inference rule, including the cut, only has a finite set of premises to choose from. The main conceptual contribution of this paper is the possibility of separating different sources of infinite choice, which happen to be entangled in the traditional cut rule.
Pdf 1 December 2003
Presented at First Order Logic 75 under the title A Finitary System for First Order Logic; appeared in Hendricks et al., editors, First-Order Logic Revisited, Logos Verlag, Berlin, 2004, pp. 59–74
 Two Restrictions on Contraction
Kai Brünnler
I show two simple limitations of sequent systems with multiplicative context treatment: contraction can neither be restricted to atoms nor to the bottom of a proof tree.
Pdf 24 November 2003
Logic Journal of the Interest Group in Pure and Applied Logics 11 (5) 2003, pp. 525–529
 Consistency Without Cut Elimination
Kai Brünnler and Alessio Guglielmi
We show how to get consistency for first order classical logic, in a purely syntactic way, without going through cut elimination. The procedure is very simple and it also shows how finitariness is actually a triviality (contrary to what one would guess from textbooks).
Pdf 10 September 2003. It is contained in A First Order System with Finite Choice of Premises
 Atomic Cut Elimination for Classical Logic
Kai Brünnler
System SKS is a set of rules for classical propositional logic presented in the calculus of structures. Like sequent systems and unlike natural deduction systems, it has an explicit cut rule, which is admissible. In contrast to sequent systems, the cut rule can easily be restricted to atoms. This allows for a very simple cut elimination procedure based on plugging in parts of a proof, like normalisation in natural deduction and unlike cut elimination in the sequent calculus. It should thus be a good common starting point for investigations into both proof search as computation and proof normalisation as computation.
Pdf 10 April 2003
CSL 2003, LNCS 2803, pp. 86–97
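The atomic identity and cut rules of SKS can be sketched as follows (standard notation, with ā the dual atom of a, and t, f the units true and false; a sketch rather than the paper's exact presentation):

```latex
\[
  \mathsf{ai}{\downarrow}:\ \frac{\mathrm{t}}{a \lor \bar{a}}
  \qquad\qquad
  \mathsf{ai}{\uparrow}:\ \frac{a \land \bar{a}}{\mathrm{f}}
\]
% Atomic identity (ai↓) and atomic cut (ai↑) are dual: each is obtained
% from the other by flipping the rule upside down and negating the
% formulae. Cut elimination removes the ai↑ instances from a proof.
```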
 A Local System for Classical Logic
Kai Brünnler and Alwen Fernanto Tiu
The calculus of structures is a framework for specifying logical systems, which is similar to the one-sided sequent calculus but more general. We present a system of inference rules for propositional classical logic in this new framework and prove cut elimination for it. The system also enjoys a decomposition theorem for derivations that is not available in the sequent calculus. The main novelty of our system is that all the rules are local: contraction, in particular, is reduced to atomic form. This should be interesting for distributed proof search and also for complexity theory, since the computational cost of applying each rule is bounded.
Pdf 2 October 2001. Now replaced by Locality for Classical Logic
LPAR 2001, LNCS 2250, pp. 347–361
2.2 Proof Complexity
The basic proof complexity properties of propositional logic in the calculus of structures are known. Deep inference is as powerful as Frege systems, and, in the restriction to analytic systems, it is more powerful than Gentzen systems.
The following papers exist, in addition to The Complexity of Propositional Proofs in Deep Inference:
 On the Length of Medial-Switch-Mix Derivations
Paola Bruscoli and Lutz Straßburger
Switch and medial are two inference rules that play a central role in many deep inference proof systems. In specific proof systems, the mix rule may also be present. In this paper we show that the maximal length of a derivation using only the inference rules for switch, medial, and mix, modulo associativity and commutativity of the two binary connectives involved, is quadratic in the size of the formula at the conclusion of the derivation. This shows, at the same time, the termination of the rewrite system.
Pdf 26 June 2017
WoLLIC 2017, LNCS 10388, pp. 68–79
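For orientation, the three rules can be written as follows (common calculus-of-structures notation; a sketch, with formulae read modulo associativity and commutativity of ∧ and ∨):

```latex
\[
  \mathsf{s}:\ \frac{(A \lor B) \land C}{A \lor (B \land C)}
  \qquad
  \mathsf{m}:\ \frac{(A \land B) \lor (C \land D)}{(A \lor C) \land (B \lor D)}
  \qquad
  \mathsf{mix}:\ \frac{A \land B}{A \lor B}
\]
% Switch (s), medial (m) and mix: each is a sound linear implication,
% with every variable occurring exactly once in the premiss and once
% in the conclusion.
```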
 Quasipolynomial Normalisation in Deep Inference Via Atomic Flows and Threshold Formulae
Paola Bruscoli, Alessio Guglielmi, Tom Gundersen and Michel Parigot
Jeřábek showed that cuts in classical propositional logic proofs in deep inference can be eliminated in quasipolynomial time. The proof is indirect and it relies on a result of Atserias, Galesi and Pudlák about the monotone sequent calculus and a correspondence between that system and cut-free deep-inference proofs. In this paper we give a direct proof of Jeřábek's result: we give a quasipolynomial-time cut-elimination procedure for classical propositional logic in deep inference. The main new ingredient is the use of a computational trace of deep-inference proofs called atomic flows, which are both very simple (they only trace structural rules and forget logical rules) and strong enough to faithfully represent the cut-elimination procedure.
 From Positive and Intuitionistic Bounded Arithmetic to Monotone Proof Complexity
Anupam Das
We study versions of secondorder bounded arithmetic where induction and comprehension formulae are positive or where the underlying logic is intuitionistic, examining their relationships to monotone and deep inference proof systems for propositional logic. Both settings induce a distinction between polynomial induction (PIND) and usual induction (IND) independent of quantifier complexity, namely due to the restrictions on negation/implication, and we exploit this to distinguish systems at the propositional level.
In the monotone setting a restriction of the Paris-Wilkie (PW) translation immediately induces quasipolynomial-size monotone propositional proofs from Π_{1}^{0} arithmetic theorems, as expected. We further show that, when only PIND is used, quasipolynomial-size normal deep inference proofs may be obtained, via a graph-rewriting normalisation procedure from earlier work.
For the intuitionistic setting we calibrate the PW translation with the Brouwer-Heyting-Kolmogorov interpretation of intuitionistic implication. This induces a hierarchy of propositional systems that includes monotone proofs. We show that these theories are powerful enough to prove converse results, namely the soundness of their respective propositional systems, thereby establishing a full correspondence.
Pdf 19 April 2016
LICS 2016 Proceedings (ACM), pp. 126–135
 On the Power of Substitution in the Calculus of Structures
Novak Novaković and Lutz Straßburger
There are two contributions in this article. First, we give a direct proof of the known fact that Frege systems with substitution can be p-simulated by the calculus of structures (CoS) extended with the substitution rule. This is done without referring to the p-equivalence of extended Frege systems and Frege systems with substitution. Second, we show that the cut-free CoS with substitution is p-equivalent to the cut-free CoS with extension.
Pdf 7 April 2015
ACM Transactions on Computational Logic, 16 (3:19) 2015, pp. 1–20
 On the Relative Proof Complexity of Deep Inference Via Atomic Flows
Anupam Das
We consider the proof complexity of the minimal complete fragment of standard deep inference, denoted KS. To examine the size of proofs we employ atomic flows, diagrams that trace structural changes through a proof but ignore logical information. As results we obtain a polynomial simulation of dag-like cut-free Gentzen and Resolution, along with some extensions. We also show that these systems, as well as bounded-depth Frege systems, cannot polynomially simulate KS, by giving polynomial-size proofs of certain variants of the propositional pigeonhole principle in KS.
 On the Pigeonhole and Related Principles in Deep Inference and Monotone Systems
Anupam Das
We construct quasipolynomial-size proofs of the propositional pigeonhole principle in the deep inference system KS, addressing an open problem raised in previous works and matching the best known upper bound for the more general class of monotone proofs.
We make significant use of monotone formulae computing Boolean threshold functions, an idea previously considered in works of Atserias et al. The main construction, monotone proofs witnessing the symmetry of such functions, involves an implementation of merge-sort in the design of proofs in order to tame the structural behaviour of atoms, and so the complexity of normalisation. Proof transformations from previous work on atomic flows are then employed to yield appropriate KS proofs.
As further results we show that our constructions can be applied to provide quasipolynomial-size KS proofs of the parity principle and the generalized pigeonhole principle. These bounds are inherited for the class of monotone proofs, and we are further able to construct n^{O(log log n)}-size monotone proofs of the weak pigeonhole principle, thereby also improving the best known bounds for monotone proofs.
Pdf (with extended version) 16 May 2014
CSL-LICS 2014 Proceedings (ACM), pp. 36:1–10
 Rewriting with Linear Inferences in Propositional Logic
Anupam Das
Linear inferences are sound implications of propositional logic where each variable appears exactly once in the premiss and conclusion. We consider a specific set of these inferences, MS, first studied by Straßburger, corresponding to the logical rules in deep inference proof theory. Despite previous results characterising the individual rules of MS, we show that there is no polynomial-time characterisation of MS, assuming that integers cannot be factorised in polynomial time.
We also examine the length of rewrite paths in MS, utilising a notion of trivialisation to reduce the case with units to the case without, amongst other observations on MS-rewriting and the set of linear inferences in general.
Pdf 15 February 2013
RTA 2013, LIPIcs 21, pp. 158–173
 Extension without Cut
Lutz Straßburger
In proof theory one distinguishes sequent proofs with cut and cut-free sequent proofs, while for proof complexity one distinguishes Frege systems and extended Frege systems. In this paper we show how deep inference can provide a uniform treatment for both classifications, such that we can define cut-free systems with extension, which is possible neither with Frege systems nor with the sequent calculus. We show that the propositional pigeonhole principle admits polynomial-size proofs in a cut-free system with extension. We also define cut-free systems with substitution and show that the system with extension p-simulates the system with substitution.
Pdf 31 July 2012
Annals of Pure and Applied Logic 163 (12) 2012, pp. 1995–2007
 On the Proof Complexity of CutFree Bounded Deep Inference
Anupam Das
It has recently been shown that cut-free deep inference systems exhibit an exponential speedup over cut-free sequent systems, in terms of proof size. While this is good for proof complexity, there remains the problem of typically high proof search nondeterminism induced by the deep inference methodology: the higher the depth of inference, the higher the nondeterminism. In this work we improve on the proof search side by demonstrating that, for propositional logic, the same exponential speedup in proof size can be obtained in bounded-depth cut-free systems. These systems retain the top-down symmetry of deep inference, but can otherwise be designed at the same depth level as sequent systems. As a result, the nondeterminism arising from the choice of rules at each stage of a proof is smaller than that of unbounded deep inference, while still giving access to the short proofs of deep inference.
Pdf 25 April 2011
Tableaux 2011, LNCS 6793, pp. 134–148
 On the Proof Complexity of Deep Inference
Paola Bruscoli and Alessio Guglielmi
We obtain two results about the proof complexity of deep inference: 1) deep-inference proof systems are as powerful as Frege ones, even when both are extended with the Tseitin extension rule or with the substitution rule; 2) there are analytic deep-inference proof systems that exhibit an exponential speedup over analytic Gentzen proof systems that they polynomially simulate.
Pdf 19 April 2009
ACM Transactions on Computational Logic, 10 (2:14) 2009, pp. 1–34
 Proof Complexity of the Cut-Free Calculus of Structures
Emil Jeřábek
We investigate the proof complexity of analytic subsystems of the deep inference proof system SKSg (the calculus of structures). Exploiting the fact that the cut rule (i↑) of SKSg corresponds to the ¬-left rule in the sequent calculus, we establish that the "analytic" system KSg + c↑ has essentially the same complexity as the monotone Gentzen calculus MLK. In particular, KSg + c↑ quasipolynomially simulates SKSg, and admits polynomial-size proofs of some variants of the pigeonhole principle.
Pdf 30 April 2008
Journal of Logic and Computation 19 (2) 2009, pp. 323–339
2.3 Nested Sequents
A new formalism called 'nested sequents' has been defined, which is especially suitable for modal logics.
The following papers exist, in addition to Nested Sequents and Nested Deduction in Logical Foundations for Computation, mentioned above:
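To give a flavour of the formalism: a nested sequent is a tree of sequents, where bracketing corresponds to the modality □ (a sketch in the notation common since Brünnler's work; conventions vary between papers):

```latex
% The nested sequent
\[
  A,\; B,\; [\,C,\; D,\; [\,E\,]\,],\; [\,F\,]
\]
% has the formula interpretation
\[
  A \lor B \lor \Box\bigl(C \lor D \lor \Box E\bigr) \lor \Box F,
\]
% and inference rules may be applied at any depth inside the brackets,
% which is what makes nested sequents a form of deep inference.
```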
 Maehara-Style Modal Nested Calculi
Roman Kuznets and Lutz Straßburger
We develop multi-conclusion nested sequent calculi for the fifteen logics of the intuitionistic modal cube between IK and IS5. The proof of cut-free completeness for all logics is provided both syntactically via a Maehara-style translation and semantically by constructing an infinite birelational countermodel from a failed proof search. Interestingly, the Maehara-style translation for proving soundness syntactically fails due to the hierarchical structure of nested sequents. Consequently, we only provide the semantic proof of soundness. The countermodel construction used to prove completeness required a completely novel approach to deal with two independent sources of non-termination in the proof search present in the case of transitive and Euclidean logics.
Pdf 18 July 2018
Archive for Mathematical Logic, 58 (3–4) 2018, pp. 359–385
 Modularisation of Sequent Calculi for Normal and Non-Normal Modalities
Björn Lellmann and Elaine Pimentel
In this work, we explore the connections between (linear) nested sequent calculi and ordinary sequent calculi for normal and non-normal modal logics. By proposing local versions of ordinary sequent rules, we obtain linear nested sequent calculi for a number of logics, including, to our knowledge, the first nested sequent calculi for a large class of simply dependent multimodal logics and for many standard non-normal modal logics. The resulting systems are modular and have separate left and right introduction rules for the modalities, which makes them amenable to specification as bipole clauses. While this granulation of the sequent rules introduces more choices for proof search, we show how linear nested sequent calculi can be restricted to blocked derivations, which directly correspond to ordinary sequent derivations.
Pdf 16 November 2017
ACM Transactions on Computational Logic, 20 (2:7) 2019, pp. 1–46
 A Uniform Framework for Substructural Logics with Modalities
Björn Lellmann, Carlos Olarte and Elaine Pimentel
It is well known that context dependent logical rules can be problematic both to implement and reason about. This is one of the factors driving the quest for better behaved, i.e., local, logical systems. In this work we investigate such a local system for linear logic (LL) based on linear nested sequents (LNS). Relying on that system, we propose a general framework for modularly describing systems combining, coherently, substructural behaviors inherited from LL with simply dependent multimodalities. This class of systems includes linear, elementary, affine, bounded and subexponential linear logics and extensions of multiplicative additive linear logic (MALL) with normal modalities, as well as general combinations of them. The resulting LNS systems can be adequately encoded into (plain) linear logic, supporting the idea that LL is, in fact, a “universal framework” for the specification of logical systems. From the theoretical point of view, we give a uniform presentation of LL featuring different axioms for its modal operators. From the practical point of view, our results lead to a generic way of constructing theorem provers for different logics, all of them based on the same grounds. This opens the possibility of using the same logical framework for reasoning about all such logical systems.
Pdf 4 May 2017
LPAR-21, EPiC 46, pp. 435–455
 Modular Focused Proof Systems for Intuitionistic Modal Logics
Kaustuv Chaudhuri, Sonia Marin and Lutz Straßburger
Focusing is a general technique for syntactically compartmentalizing the nondeterministic choices in a proof system, which not only improves proof search but also has the representational benefit of distilling sequent proofs into synthetic normal forms. However, since focusing is usually specified as a restriction of the sequent calculus, the technique has not been transferred to logics that lack a (shallow) sequent presentation, as is the case for some of the logics of the modal cube. We have recently extended the focusing technique to classical nested sequents, a generalization of ordinary sequents. In this work we further extend focusing to intuitionistic nested sequents, which can capture all the logics of the intuitionistic S5 cube in a modular fashion. We present an internal cut-elimination procedure for the focused system which in turn is used to show its completeness.
Pdf 1 May 2016
FSCD 2016, LIPIcs 52, pp. 16:1–18
 Focused and Synthetic Nested Sequents
Kaustuv Chaudhuri, Sonia Marin and Lutz Straßburger
Focusing is a general technique for transforming a sequent proof system into one with a syntactic separation of nondeterministic choices without sacrificing completeness. This not only improves proof search, but also has the representational benefit of distilling sequent proofs into synthetic normal forms. We show how to apply the focusing technique to nested sequent calculi, a generalization of ordinary sequent calculi to tree-like instead of list-like structures. We thus improve the reach of focusing to the most commonly studied modal logics, the logics of the modal S5 cube. Among our key contributions is a focused cut-elimination theorem for focused nested sequents.
Pdf 11 January 2016
FoSSaCS 2016, LNCS 9634, pp. 390–407
 Proof Search in Nested Sequent Calculi
Björn Lellmann and Elaine Pimentel
We propose a notion of focusing for nested sequent calculi for modal logics which brings down the complexity of proof search to that of the corresponding sequent calculi. The resulting systems are amenable to specifications in linear logic. Examples include modal logic K, a simply dependent bimodal logic and the standard non-normal modal logics. As a by-product we obtain the first nested sequent calculi for the considered non-normal modal logics.
Pdf 15 September 2015
LPAR-20, LNCS 9450, pp. 558–574
 On Nested Sequents for Constructive Modal Logics
Ryuta Arisaka, Anupam Das and Lutz Straßburger
We present deductive systems for various modal logics that can be obtained from the constructive variant of the normal modal logic CK by adding combinations of the axioms d, t, b, 4, and 5. This includes the constructive variants of the standard modal logics K4, S4, and S5. We use for our presentation the formalism of nested sequents and give a syntactic proof of cut elimination.
Pdf 3 September 2015
Logical Methods in Computer Science 11 (3:7) 2015, pp. 1–33
 Modal Interpolation Via Nested Sequents
Melvin Fitting and Roman Kuznets
The main method of proving the Craig Interpolation Property (CIP) constructively uses cut-free sequent proof systems. Until now, however, no such method has been known for proving the CIP using more general sequent-like proof formalisms, such as hypersequents, nested sequents, and labelled sequents. In this paper, we start closing this gap by presenting an algorithm for proving the CIP for modal logics by induction on a nested-sequent derivation. This algorithm is applied to all the logics of the so-called modal cube.
Pdf 12 August 2015
Annals of Pure and Applied Logic 166 (3) 2015, pp. 274–305
 Nested Sequents for Provability Logic GLP
Daniyar Shamkanov
We present a proof system for the provability logic GLP in the formalism of nested sequents and prove the cut elimination theorem for it. As an application, we obtain a syntactic proof of the reduction of GLP to its important fragment called J.
Pdf 27 October 2014
Logic Journal of the Interest Group in Pure and Applied Logics 23 (5) 2015, pp. 789–815
 Label-free Modular Systems for Classical and Intuitionistic Modal Logics
Sonia Marin and Lutz Straßburger
In this paper we show for each of the modal axioms d, t, b, 4, and 5 an equivalent set of inference rules in a nested sequent system, such that, when added to the basic system for the modal logic K, the resulting system admits cut elimination. Then we show the same result also for intuitionistic modal logic. We achieve this by combining structural and logical rules.
Pdf 6 June 2014
Advances in Modal Logic 2014, pp. 387–406
 Annotation-Free Sequent Calculi for Full Intuitionistic Linear Logic
Ranald Clouston, Jeremy Dawson, Rajeev Goré and Alwen Tiu
Full Intuitionistic Linear Logic (FILL) is multiplicative intuitionistic linear logic extended with par. Its proof theory has been notoriously difficult to get right, and existing sequent calculi all involve inference rules with complex annotations to guarantee soundness and cut-elimination. We give a simple and annotation-free display calculus for FILL which satisfies Belnap’s generic cut-elimination theorem. To do so, our display calculus actually handles an extension of FILL, called Bi-Intuitionistic Linear Logic (BiILL), with an ‘exclusion’ connective defined via an adjunction with par. We refine our display calculus for BiILL into a cut-free nested sequent calculus with deep inference in which the explicit structural rules of the display calculus become admissible. A separation property guarantees that proofs of FILL formulae in the deep inference calculus contain no trace of exclusion. Each such rule is sound for the semantics of FILL, thus our deep inference calculus and display calculus are conservative over FILL. The deep inference calculus also enjoys the subformula property and terminating backward proof search, which gives the NP-completeness of BiILL and FILL.
Pdf 8 July 2013
CSL 2013, LIPIcs 23, pp. 197–214
 Nested Sequent Calculi for Normal Conditional Logics
Régis Alenda, Nicola Olivetti and Gian Luca Pozzato
Nested sequent calculi are a useful generalization of ordinary sequent calculi, where sequents are allowed to occur within sequents. Nested sequent calculi have been profitably used in the area of (multi)modal logic to obtain analytic and modular proof systems for these logics. In this work, we extend the realm of nested sequents by providing nested sequent calculi for the basic conditional logic CK and some of its significant extensions. We provide also a calculus for the Kraus–Lehmann–Magidor cumulative logic C. The calculi are internal (a sequent can be directly translated into a formula), cut-free and analytic. Moreover, they can be used to design (sometimes optimal) decision procedures for the respective logics, and to obtain complexity upper bounds. Our calculi are an argument in favour of nested sequent calculi for modal logics and alike, showing their versatility and power.
Pdf 1 June 2013
Journal of Logic and Computation 26 (1) 2016, pp. 7–50
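For readers new to the formalism, the sense in which such calculi are "internal" can be sketched as follows. This is the standard formula translation in the classical modal setting (in Brünnler's style), given here only as an illustrative example and not as the specific translation used for conditional logics in the paper above:

```latex
% A (one-sided) nested sequent is a multiset of formulas and of
% bracketed nested sequents, the brackets standing for the modal box:
%   \Gamma ::= A_1, \dots, A_m, [\Gamma_1], \dots, [\Gamma_n]
% Its formula translation, read disjunctively:
\mathit{fm}(A_1,\dots,A_m,[\Gamma_1],\dots,[\Gamma_n]) =
  A_1 \lor \dots \lor A_m \lor \Box\,\mathit{fm}(\Gamma_1)
  \lor \dots \lor \Box\,\mathit{fm}(\Gamma_n)
```

A sequent is provable exactly when its formula translation is, which is what distinguishes these internal calculi from labelled ones.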
 Algebra, Proof Theory and Applications for an Intuitionistic Logic of Propositions, Actions and Adjoint Modal Operators
Roy Dyckhoff, Mehrnoosh Sadrzadeh and Julien Truffaut
We develop a cut-free nested sequent calculus as basis for a proof search procedure for an intuitionistic modal logic of actions and propositions. The actions act on propositions via a dynamic modality (the weakest precondition of program logics), whose left adjoint we refer to as “update” (the strongest postcondition). The logic has agent-indexed adjoint pairs of epistemic modalities: the left adjoints encode agents’ uncertainties and the right adjoints encode their beliefs. The rules for the “update” modality encode learning as a result of discarding uncertainty. We prove admissibility of Cut, and hence the soundness and completeness of the logic with respect to an algebraic semantics. We interpret the logic on epistemic scenarios that consist of honest and dishonest communication actions, add assumption rules to encode them, and prove that the calculus with the assumption rules still has the admissibility results. We apply the calculus to encode (and allow reasoning about) the classic epistemic puzzles of dirty children (a.k.a. “muddy children”) and drinking logicians and some versions with dishonesty or noise; we also give an application where the actions are movements of a robot rather than announcements.
Pdf 31 May 2013
ACM Transactions on Computational Logic, 14 (4:34) 2013, pp. 1–37
 Cut Elimination in Nested Sequents for Intuitionistic Modal Logics
Lutz Straßburger
We present cut-free deductive systems without labels for the intuitionistic variants of the modal logics obtained by extending IK with a subset of the axioms d, t, b, 4, and 5. For this, we use the formalism of nested sequents, which allows us to give a uniform cut elimination argument for all 15 logics in the intuitionistic S5 cube.
Pdf 7 January 2013
FoSSaCS 2013, LNCS 7794, pp. 209–224
 Labelled Tree Sequents, Tree Hypersequents and Nested (Deep) Sequents
Rajeev Goré and Revantha Ramanayake
We identify a subclass of labelled sequents called "labelled tree sequents" and show that these are notational variants of tree-hypersequents in the sense that a sequent of one type can be represented naturally as a sequent of the other type. This relationship can be extended to nested (deep) sequents using the relationship between tree-hypersequents and nested (deep) sequents, which we also show. We apply this result to transfer proof-theoretic results such as syntactic cut-admissibility between the tree-hypersequent calculus CSGL and the labelled sequent calculus G3GL for provability logic GL. This answers in full a question posed by Poggiolesi about the exact relationship between these calculi. Our results pave the way to obtain cut-free tree-hypersequent and nested (deep) sequent calculi for large classes of logics using the known calculi for labelled sequents, and also to obtain a large class of labelled sequent calculi for bi-intuitionistic tense logics from the known nested (deep) sequent calculi for these logics. Importing proof-theoretic results between notational-variant systems in this manner alleviates the need for independent proofs in each system. Identifying which labelled systems can be rewritten as labelled tree sequent systems may provide a method for determining the expressive limits of the nested sequent formalism.
Pdf 5 October 2012
Advances in Modal Logic 2012, pp. 279–299
 Nested Sequent Calculi for Conditional Logics
Régis Alenda, Nicola Olivetti and Gian Luca Pozzato
Nested sequent calculi are a useful generalization of ordinary sequent calculi, where sequents are allowed to occur within sequents. Nested sequent calculi have been profitably employed in the area of (multi)modal logic to obtain analytic and modular proof systems for these logics. In this work, we extend the realm of nested sequents by providing nested sequent calculi for the basic conditional logic CK and some of its significant extensions. The calculi are internal (a sequent can be directly translated into a formula), cut-free and analytic. Moreover, they can be used to design (sometimes optimal) decision procedures for the respective logics, and to obtain complexity upper bounds. Our calculi are an argument in favour of nested sequent calculi for modal logics and alike, showing their versatility and power.
Pdf 16 July 2012
JELIA 2012, LNCS 7519, pp. 14–27
 Nested Sequents for Intuitionistic Logics
Melvin Fitting
Relatively recently nested sequent systems for modal logics have come to be seen as an attractive deep-reasoning extension of familiar sequent calculi. In an earlier paper I showed that there was a strong connection between modal nested sequents and modal prefixed tableaux. In this paper I show that the connection continues to intuitionistic logic, both standard and constant domain, relating nested intuitionistic sequent calculi to intuitionistic prefixed tableaux. Modal nested sequent machinery generalizes one-sided sequent calculi — intuitionistic nested sequents similarly generalize two-sided sequents. It is noteworthy that the resulting system for constant domain intuitionistic logic is particularly simple. It amounts to a combination of intuitionistic propositional rules and classical quantifier rules, a combination that is known to be inadequate when conventional intuitionistic sequent systems are used.
Pdf 8 July 2012
Notre Dame Journal of Formal Logic 55 (1) 2014, pp. 41–61
 Grammar Logics in Nested Sequent Calculus: Proof Theory and Decision Procedures
Alwen Tiu, Egor Ianovski and Rajeev Goré
A grammar logic refers to an extension of the multimodal logic K in which the modal axioms are generated from a formal grammar. We consider a proof theory, in nested sequent calculus, of grammar logics with converse, i.e., where every modal operator [a] comes with a converse [ā]. Extending previous work on nested sequent systems for tense logics, we show that all grammar logics (with or without converse) can be formalised in nested sequent calculi, where the axioms are internalised in the calculi as structural rules. Syntactic cut-elimination for these calculi is proved using a procedure similar to that for display logics. If the grammar is context-free, then one can get rid of all structural rules, in favor of deep inference and additional propagation rules. We give a novel semi-decision procedure for context-free grammar logics, using nested sequent calculus with deep inference, and show that, in the case where the given context-free grammar is regular, this procedure terminates. Unlike all other existing decision procedures for regular grammar logics in the literature, our procedure does not assume that a finite state automaton encoding the axioms is given.
Pdf 15 June 2012
Advances in Modal Logic 2012, pp. 516–537
 Syntactic Cut-Elimination for a Fragment of the Modal Mu-Calculus
Kai Brünnler and Thomas Studer
For some modal fixed point logics, there are deductive systems that enjoy syntactic cut elimination. An early example is the system in Pliuskevicius (1991) [15] for LTL. More recent examples are the systems by the authors of this paper for the logic of common knowledge (Brünnler and Studer, 2009) [5] and by Hill and Poggiolesi (2010) for PDL [8], which are based on a form of deep inference. These logics can be seen as fragments of the modal mu-calculus. Here we are interested in how far this approach can be pushed in general. To this end, we introduce a nested sequent system with syntactic cut-elimination which is incomplete for the modal mu-calculus, but complete with respect to a restricted language that allows only fixed points of a certain form. This restricted language includes the logic of common knowledge and PDL. There is also a traditional sequent system for the modal mu-calculus by Jäger et al. (2008) [9], without a syntactic cut-elimination procedure. We embed that system into ours and vice versa, thus establishing cut-elimination also for the shallow system, when only the restricted language is considered.
Pdf 14 May 2012
Annals of Pure and Applied Logic 163 (12) 2012, pp. 1838–1853
 Proving Completeness for Nested Sequent Calculi
Melvin Fitting
Proving the completeness of classical propositional logic by using maximal consistent sets is perhaps the most common method there is, going back to Lindenbaum (though not actually published by him). It has been extended to a variety of logical formalisms, sometimes combined with the addition of Henkin constants to handle quantifiers. Recently a deep-reasoning formalism called nested sequents has been introduced by Kai Brünnler, able to handle a larger variety of modal logics than are possible with standard Gentzen-type sequent calculi. In this paper we sketch how yet another variation on the maximality method of Lindenbaum allows one to prove completeness for nested sequent calculi. It is certainly not the only method available, but it should be entered into the record as one more useful tool available to the logician.
Pdf 4 October 2011
Logic without Frontiers, College Publications 2011, pp. 145–154
 Prefixed Tableaus and Nested Sequents
Melvin Fitting
Nested sequent systems for modal logics are a relatively recent development, within the general area known as deep reasoning. The idea of deep reasoning is to create systems within which one operates at lower levels in formulas than just those involving the main connective or operator. Prefixed tableaus go back to 1972, and are modal tableau systems with extra machinery to represent accessibility in a purely syntactic way. We show that modal nested sequents and prefixed modal tableaus are notational variants of each other, roughly in the same way that Gentzen sequent calculi and tableaus are notational variants. This immediately gives rise to new modal nested sequent systems which may be of independent interest. We discuss some of these, including those for some justification logics that include standard modal operators.
Pdf 14 September 2011
Annals of Pure and Applied Logic 163 (3) 2012, pp. 291–313
 On the Correspondence Between Display Postulates and Deep Inference in Nested Sequent Calculi for Tense Logics
Rajeev Goré, Linda Postniece and Alwen Tiu
We consider two styles of proof calculi for a family of tense logics, presented in a formalism based on nested sequents. A nested sequent can be seen as a tree of traditional single-sided sequents. Our first style of calculi is what we call "shallow calculi", where inference rules are only applied at the root node in a nested sequent. Our shallow calculi are extensions of Kashima's calculus for tense logic and share an essential characteristic with display calculi, namely, the presence of structural rules called "display postulates". Shallow calculi enjoy a simple cut elimination procedure, but are unsuitable for proof search due to the presence of display postulates and other structural rules. The second style of calculi uses deep inference, whereby inference rules can be applied at any node in a nested sequent. We show that, for a range of extensions of tense logic, the two styles of calculi are equivalent, and there is a natural proof theoretic correspondence between display postulates and deep inference. The deep inference calculi enjoy the subformula property and have no display postulates or other structural rules, making them a better framework for proof search.
 Proof Theory and Proof Search of Bi-Intuitionistic and Tense Logic
Linda Postniece
In this thesis, we consider bi-intuitionistic logic and tense logic, as well as the combined bi-intuitionistic tense logic. Each of these logics contains a pair of dual connectives; for example, Rauszer's bi-intuitionistic logic contains intuitionistic implication and dual intuitionistic exclusion. The interaction between these dual connectives makes it nontrivial to develop a cut-free sequent calculus for these logics.
In the first part of this thesis we develop a new extended sequent calculus for bi-intuitionistic logic using a framework of derivations and refutations. This is the first purely syntactic cut-free sequent calculus for bi-intuitionistic logic and thus solves an open problem. Our calculus is sound, semantically complete and allows terminating backward proof search, hence giving rise to a decision procedure for bi-intuitionistic logic.
In the second part of this thesis we consider the broader problem of taming proof search in display calculi, using bi-intuitionistic logic and tense logic as case studies. While the generality of display calculi makes it an excellent framework for designing sequent calculi for logics where traditional sequent calculi fail, this generality also leads to a large degree of nondeterminism, which is problematic for backward proof search. We control this nondeterminism in two ways:
1. First, we limit the structural connectives used in the calculi and consequently the number of display postulates. Specifically, we work with nested structures which can be viewed as trees of traditional Gentzen sequents, called nested sequents, which have been used previously by Kashima and, independently, by Brünnler and Straßburger and by Poggiolesi to present several modal and tense logics.
2. Second, since residuation rules are largely responsible for the difficulty in finding a proof search procedure for display-like calculi, we show how to eliminate these residuation rules using deep inference in nested sequents.
Finally, we study the combined bi-intuitionistic tense logic, which contains the well-known intuitionistic modal logic as a sublogic. We give a nested sequent calculus for bi-intuitionistic tense logic that has cut-elimination, and a derived deep inference nested sequent calculus that is complete with respect to the first calculus and in which contraction and residuation rules are admissible. We also show how our calculi can capture Simpson's intuitionistic modal logic and Ewald's intuitionistic tense logic.
Pdf 21 November 2010
PhD thesis, defended in March 2011
 How to Universally Close the Existential Rule
Kai Brünnler
This paper introduces a nested sequent system for predicate logic. The system features a structural universal quantifier and a universally closed existential rule. One nice consequence of this is that proofs of sentences cannot contain free variables. Another nice consequence is that the assumption of a nonempty domain is isolated in a single inference rule. This rule can be removed or added at will, leading to a system for free logic or classical predicate logic, respectively. The system for free logic is interesting because it has no need for an existence predicate. We see syntactic cut-elimination and completeness results for these two systems as well as two standard applications: Herbrand's Theorem and interpolation.
Pdf (long version) 31 July 2010
LPAR-17, LNCS 6397, pp. 172–186
 Cut-Elimination and Proof Search for Bi-Intuitionistic Tense Logic
Rajeev Goré, Linda Postniece and Alwen Tiu
We consider an extension of bi-intuitionistic logic with the traditional modalities from tense logic Kt. Proof theoretically, this extension is obtained simply by extending an existing sequent calculus for bi-intuitionistic logic with typical inference rules for the modalities used in display logics. As it turns out, the resulting calculus, LBiKt, seems to be more basic than most intuitionistic tense or modal logics considered in the literature, in particular those studied by Ewald and Simpson, as it does not assume any a priori relationship between the diamond and the box modal operators. We recover Ewald's intuitionistic tense logic and Simpson's intuitionistic modal logic by modularly extending LBiKt with additional structural rules. The calculus LBiKt is formulated in a variant of display calculus, using a form of sequents called nested sequents. Cut elimination is proved for LBiKt, using a technique similar to that used in display calculi. As in display calculi, the inference rules of LBiKt are "shallow" rules, in the sense that they act on top-level formulae in a nested sequent. The calculus LBiKt is ill-suited for backward proof search due to the presence of certain structural rules called "display postulates" and the contraction rules on arbitrary structures. We show that these structural rules can be made redundant in another calculus, DBiKt, which uses deep inference, allowing one to apply inference rules at an arbitrary depth in a nested sequent. We prove the equivalence between LBiKt and DBiKt and outline a proof search strategy for DBiKt. We also give a Kripke semantics and prove that LBiKt is sound with respect to the semantics, but completeness is still an open problem. We then discuss various extensions of LBiKt.
Pdf 29 June 2010
Advances in Modal Logic 2010, pp. 156–177
 A Syntactic Realization Theorem for Justification Logics
Kai Brünnler, Remo Goetschi and Roman Kuznets
Justification logics are refinements of modal logics where modalities are replaced by justification terms. They are connected to modal logics via so-called realization theorems. We present a syntactic proof of a single realization theorem which uniformly connects all the normal modal logics formed from the axioms d, t, b, 4, and 5 with their justification counterparts. The proof employs cut-free nested sequent systems together with Fitting's realization merging technique. We further strengthen the realization theorem for KB5 and S5 by showing that the positive introspection operator is superfluous.
Pdf 3 April 2010
Advances in Modal Logic 2010, pp. 39–58
 A Note on Uniform Interpolation Proofs in Modal Deep Inference Calculi
Marta Bílková
This paper answers one rather particular question: how to prove the uniform interpolation property in deep inference calculi for modal logics. We show how to prove uniform interpolation in a deep inference calculus for the basic modal logic K via forgetting a variable in a certain normal form constructed by backward proof search. For that purpose we modify the framework of deep inference calculi, using a cover modality at the meta level to structure deep sequents.
21 September 2009
TbiLLC 2009, LNAI 6618, pp. 30–45
 Deep Inference in Bi-intuitionistic Logic
Linda Postniece
Bi-intuitionistic logic is the extension of intuitionistic logic with exclusion, a connective dual to implication. Cut-elimination in bi-intuitionistic logic is complicated due to the interaction between these two connectives, and various extended sequent calculi, including a display calculus, have been proposed to address this problem.
In this paper, we present a new extended sequent calculus DBiInt for bi-intuitionistic logic which uses nested sequents and "deep inference", i.e., inference rules can be applied at any level in the nested sequent. We show that DBiInt can simulate our previous "shallow" sequent calculus LBiInt. In particular, we show that deep inference can simulate the residuation rules in the display-like shallow calculus LBiInt. We also consider proof search and give a simple restriction of DBiInt which allows terminating proof search. Thus our work is another step towards addressing the broader problem of proof search in display logic.
Pdf 7 June 2009
WoLLIC 2009, LNCS 5514, pp. 320–334
 Modular Sequent Systems for Modal Logic
Kai Brünnler and Lutz Straßburger
We see cut-free sequent systems for the basic normal modal logics formed by any combination of the axioms d, t, b, 4, 5. These systems are modular in the sense that each axiom has a corresponding rule and each combination of these rules is complete for the corresponding frame conditions. The systems are based on nested sequents, a natural generalisation of hypersequents. Nested sequents stay inside the modal language, as opposed to both the display calculus and labelled sequents. The completeness proof is via syntactic cut elimination.
Pdf 23 April 2009
Tableaux 2009, LNCS 5607, pp. 152–166
Attention, this paper contains some serious mistakes. The authors are working on a corrected version.
 Syntactic Cut-Elimination for Common Knowledge
Kai Brünnler and Thomas Studer
We first look at an existing infinitary sequent system for common knowledge for which there is no known syntactic cut-elimination procedure and also no known nontrivial bound on the proof-depth. We then present another infinitary sequent system based on nested sequents that are essentially trees and with inference rules that apply deeply inside these trees. Thus we call this system "deep" while we call the former system "shallow". In contrast to the shallow system, the deep system allows one to give a straightforward syntactic cut-elimination procedure. Since both systems can be embedded into each other, this also yields a syntactic cut-elimination procedure for the shallow system. For both systems we thus obtain an upper bound of φ₂0 on the depth of proofs, where φ is the Veblen function.
 Cut-Elimination and Proof-Search for Bi-Intuitionistic Logic Using Nested Sequents
Rajeev Goré, Linda Postniece and Alwen Tiu
We propose a new sequent calculus for bi-intuitionistic logic which sits somewhere between display calculi and traditional sequent calculi by using nested sequents. Our calculus enjoys a simple (purely syntactic) cut-elimination proof as do display calculi. But it has an easily derivable variant calculus which is amenable to automated proof search as are (some) traditional sequent calculi. We first present the initial calculus and its cut-elimination proof. We then present the derived calculus, and then present a proof-search strategy which allows it to be used for automated proof search. We prove that this search strategy is terminating and complete by showing how it can be used to mimic derivations obtained from an existing calculus GBiInt for bi-intuitionistic logic. As far as we know, our new calculus is the first sequent calculus for bi-intuitionistic logic which uses no semantic additions like labels, which has a purely syntactic cut-elimination proof, and which can be used naturally for backwards proof search.
Pdf 15 September 2008
Advances in Modal Logic 2008, pp. 43–66
 Deep Sequent Systems for Modal Logic
Kai Brünnler
We see a systematic set of cut-free axiomatisations for all the basic normal modal logics formed by some combination of the axioms d, t, b, 4, 5. They employ a form of deep inference but otherwise stay very close to Gentzen's sequent calculus; in particular, they enjoy a subformula property in the literal sense. No semantic notions are used inside the proof systems, in particular there is no use of labels. All their rules are invertible and the rules cut, weakening and contraction are admissible. All systems admit a straightforward terminating proof search procedure as well as a syntactic cut elimination procedure.
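To illustrate what "deep inference" means in such systems, here is a sketch of box and diamond rules in the style of Brünnler's deep sequent systems (read bottom-up; Γ{ } denotes a nested sequent with a hole, which may lie arbitrarily deep inside the brackets). This is an informal reconstruction; the exact rule names and side conditions should be checked against the paper itself:

```latex
% The box rule opens a new bracket; the k-style diamond rule copies
% a formula into an existing bracket, at any depth in the sequent:
\frac{\Gamma\{[A]\}}{\Gamma\{\Box A\}}\;\Box
\qquad
\frac{\Gamma\{\Diamond A,\,[A,\Delta]\}}{\Gamma\{\Diamond A,\,[\Delta]\}}\;\Diamond
```

Because the context Γ{ } can surround the active formulas at any nesting depth, no structural rules for moving material to the root are needed, which is what makes the subformula property hold in the literal sense.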
2.4 Modal Logic
We can systematically present several normal propositional modal logics, including S5, B and K5, for which cut elimination is proved. We have also investigated geometric theories, some of which we have expressed in the calculus of structures.
The following papers exist:
 Deep Inference for Hybrid Logic
Lutz Straßburger
This paper describes work in progress on using deep inference for designing a deductive system for hybrid logic. We will see a cut-free system and prove its soundness and completeness. An immediate observation about the system is that there is no need for additional rewrite rules as in Blackburn's tableaux, nor substitution rules as in Seligman's sequent system.
Pdf 15 May 2007
Proceedings of International Workshop on Hybrid Logic 2007 (HyLo 2007), pp. 13–22
 Classical Modal Display Logic in the Calculus of Structures and Minimal and Cut-free Deep Inference Calculi for S5
Rajeev Goré and Alwen Tiu
We begin by showing how to faithfully encode the Classical Modal Display Logic (CMDL) of Wansing into the Calculus of Structures (CoS) of Guglielmi. Since every CMDL calculus enjoys cut-elimination, we obtain a cut-elimination theorem for all corresponding CoS calculi. We then show how our result leads to a minimal cut-free CoS calculus for modal logic S5. As far as we know, no other existing CoS calculi for S5 enjoy both these properties simultaneously.
Pdf 4 May 2007
Journal of Logic and Computation 17 (4) 2007, pp. 767–794
 A Deep Inference System for the Modal Logic S5
Phiniki Stouppa
We present a cut-admissible system for the modal logic S5 in a framework that makes explicit and intensive use of deep inference. Deep inference is induced by the methods applied so far in conceptually pure systems for this logic. Thus, the formulation of a system in such a framework is an evolutionary process and leads to positive proof-theoretical results. The system enjoys systematicity and modularity, two important properties sought of modal systems. Furthermore, it enjoys a simple and direct design: the rules are few and the modal rules are in exact correspondence to the modal axioms.
Pdf 1 March 2006
Studia Logica 85 (2) 2007, pp. 199–214
 Purity Through Unravelling
Robert Hein and Charles Stewart
We divide attempts to give the structural proof theory of modal logics into two kinds, those pure formulations whose inference rules characterise modality completely by means of manipulations of boxes and diamonds, and those labelled formulations that leverage the use of labels in giving inference rules. The widespread adoption of labelled formulations is driven by their ability to model features of the model theory of modal logic in its proof theory.
We describe here an approach to the structural proof theory of modal logic that aims to bring under one roof the benefits of both the pure and the labelled formulations. We introduce two proof calculi, one labelled sequent formulation and one pure formulation in the calculus of structures, that are shown to be in a systematic correlation, where the latter calculus uses deep inference with shaped modal rules to capture in a pure manner the manipulations that the former calculus mediates through the use of labels.
We situate this work within a larger investigation into the proof theory of modal logic that solves problems that existed with the earlier investigation based on prefix modal rules. We hold that this development provides yet stronger evidence justifying the claim that good, pure proof theory for modal logic needs deep inference.
Pdf 25 June 2005
Proceedings of Structures and Deduction '05, pp. 126–143
 Geometric Theories and Modal Logic in the Calculus of Structures
Robert Hein
Much of the success of modal logic can be attributed to the adoption of relational semantics. Consequently, modal logic is seen as a logic of relational structures, where logical axioms correspond to structural properties. Alex Simpson, in his 1993 PhD thesis, introduced a labelled proof theory for modal logic that allows cut-elimination for a class of modal logics characterised by so-called geometric theories. This includes important and well-known logics such as M, B, S4 and S5. This thesis tries to build a bridge between Simpson's result and purely symbolic proof theory. We introduce a method to characterise frame-relational properties by means of deep inference in the Calculus of Structures. The results are only partial: only what we call 3/4-Scott-Lemmon logics are characterised, and we only give a plausibility argument, rather than a proof, that the cut-elimination argument can be transferred too.
Gzipped postscript 20 March 2005
MSc thesis, successfully defended on 23 March 2005
 The Design of Modal Proof Theories: The Case of S5
Phiniki Stouppa
The sequent calculus does not seem to be capable of supporting cut-admissible formulations for S5. Through a survey of existing cut-admissible systems for this logic, we investigate the solutions proposed to overcome this defect. The systems can be divided into two categories: those which allow semantic-oriented formulae and those which allow formulae in positions not reachable by the usual systems in the sequent calculus. The first solution is not desirable because it is conceptually impure, that is, these systems express concepts of frame semantics in the language of the logic.
Consequently, we focus on the systems of the second group, for which we define notions related to deep inference—the ability to apply rules deep inside structures—as well as other desirable properties good systems should enjoy. We classify these systems accordingly and examine how these properties are affected in the presence of deep inference. Finally, we present a cut-admissible system for S5 in a formalism which makes explicit use of deep inference, the calculus of structures, and give reasons for its effectiveness in providing good modal formulations.
Pdf 20 October 2004
MSc thesis, successfully defended on 27 October 2004
 A Systematic Proof Theory for Several Modal Logics
Charles Stewart and Phiniki Stouppa
The family of normal propositional modal logic systems is given a very systematic organisation by their model theory. This model theory is generally given using frame semantics, and it is systematic in the sense that for the most important systems we have a clean, exact correspondence between their constitutive axioms as they are usually given in a Hilbert-Lewis style and conditions on the accessibility relation on frames.
By contrast, the usual structural proof theory of modal logic, as given in Gentzen systems, is ad hoc. While we can formulate several modal logics in the sequent calculus that enjoy cut elimination, their formalisation arises through system-by-system fine-tuning to ensure that cut elimination holds, and the correspondence to the formulation in the Hilbert-Lewis systems becomes opaque.
This paper introduces a systematic presentation for the systems K, D, M, S4, and S5 in the calculus of structures, a structural proof theory that employs deep inference. Because of this, we are able to axiomatise the modal logics in a manner directly analogous to the Hilbert-Lewis axiomatisation. We show that the calculus possesses a cut-elimination property directly analogous to cut elimination for the sequent calculus for these systems, and we discuss the extension to several other modal logics.
Postscript 2 September 2004
Advances in Modal Logic 2004, pp. 309–333
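To give a flavour of what such systematic deep modal rules look like, here is a sketch in approximate calculus-of-structures notation (the rule shape is illustrative, not quoted from the paper): S{ } denotes an arbitrary, possibly deep, context and [.,.] denotes disjunction.

```latex
% Illustrative sketch: a deep rule corresponding to modal axiom K.
% S{ } is an arbitrary context; the rule applies at any depth.
\[
  \mathsf{k}{\downarrow}:\;
  \frac{S\{\Box[R,T]\}}{S[\Box R,\Diamond T]}
\]
```

Read top-down, the rule rewrites a subformula of the form □(R ∨ T) into □R ∨ ◇T anywhere inside a structure; systems for the various logics are then obtained by adding further rules of this shape, mirroring the Hilbert-Lewis axioms.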
2.5 Linear Logic
Linear logic enjoys presentations in deep inference that achieve the expected locality properties, together with rather astounding decomposition theorems and the usual, general normalisation results.
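As a rough illustration of these locality properties (a sketch in approximate calculus-of-structures notation, not taken from any one paper below): rules rewrite inside an arbitrary context S{ }, and interaction is restricted to single atoms, as in

```latex
% Illustrative sketch: [.,.] is par, (.,.) is times, S{ } a deep context.
\[
  \mathsf{s}:\;
  \frac{S([R,T],U)}{S[(R,U),T]}
  \qquad
  \mathsf{ai}{\downarrow}:\;
  \frac{S\{1\}}{S[a,\bar{a}]}
\]
```

Because rules such as ai↓ only ever handle a bounded amount of information (here, one pair of dual atoms), every inference step can be checked locally, which is what locality means in this context.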
The following papers exist, in addition to Linear Logic and Noncommutativity in the Calculus of Structures, mentioned above:
 Deep Inference and Expansion Trees for Second-Order Multiplicative Linear Logic
Lutz Straßburger
In this paper, we introduce the notion of expansion tree for linear logic. As in Miller's original work, we have a shallow reading of an expansion tree that corresponds to the conclusion of the proof, and a deep reading which is a formula that can be proved by propositional rules. We focus our attention on MLL2, and we also present a deep inference system for that logic. This allows us to give a syntactic proof of a version of Herbrand's theorem.
Pdf 3 December 2018
Mathematical Structures in Computer Science, in press
 Deep Proof Search in MELL
Ozan Kahramanoğulları
The deep inference presentation of multiplicative exponential linear logic (MELL) benefits from a rich combinatorial analysis with many more proofs in comparison to its sequent calculus presentation. In the deep inference setting, all the sequent calculus proofs are preserved; moreover, many other proofs become available, and some of these proofs are much shorter. However, proof search in deep inference is subject to greater nondeterminism, and this nondeterminism constitutes a bottleneck for applications. We therefore address the problem of reducing nondeterminism in MELL by refining and extending our technique that has previously been applied to multiplicative linear logic and classical logic. We show that, besides the nondeterminism in commutative contexts, the nondeterminism in exponential contexts can be reduced in a proof-theoretically clean manner. The method conserves the exponential speed-up in proof construction due to deep inference, exemplified by Statman tautologies. We validate the improvement in access to the shorter proofs by experiments with our implementations.
Pdf 4 May 2017 Talk
LPAR21, EPiC 46, pp. 106–124
 Equality and Fixpoints in the Calculus of Structures
Kaustuv Chaudhuri and Nicolas Guenot
The standard proof theory for logics with equality and fixpoints suffers from limitations of the sequent calculus, where reasoning is separated from computational tasks such as unification or rewriting. We propose in this paper an extension of the calculus of structures, a deep inference formalism, that supports incremental and contextual reasoning with equality and fixpoints in the setting of linear logic. This system allows deductive and computational steps to mix freely in a continuum which integrates smoothly into the usual versatile rules of multiplicativeadditive linear logic in deep inference.
Pdf 2 June 2014
CSLLICS 2014 Proceedings (ACM), pp. 30:1–10
 Interaction and Depth Against Nondeterminism in Proof Search
Ozan Kahramanoğulları
Deep inference is a proof-theoretic methodology that generalizes the standard notion of inference of the sequent calculus, whereby inference rules become applicable at any depth inside logical expressions. Deep inference provides more freedom in the design of deductive systems for different logics and a rich combinatorial analysis of proofs. In particular, the construction of exponentially shorter analytic proofs becomes possible, albeit at the cost of greater nondeterminism than in the sequent calculus. In this paper, we show that the nondeterminism in proof search can be reduced without losing the shorter proofs and without sacrificing proof-theoretic cleanliness. For this, we exploit an interaction and depth scheme in the logical expressions. We demonstrate our method on deep inference systems for multiplicative linear logic and classical logic, discuss its proof complexity and its relation to focusing, and present implementations.
Pdf 29 May 2014
Logical Methods in Computer Science 10 (2:5) 2014, pp. 1–49
Interana: interactive prover for multiplicative linear logic
 The Focused Calculus of Structures
Kaustuv Chaudhuri, Nicolas Guenot and Lutz Straßburger
The focusing theorem identifies a complete class of sequent proofs that have no inessential nondeterministic choices and restrict the essential choices to a particular normal form. Focused proofs are therefore well suited both for the search and for the representation of sequent proofs. The calculus of structures is a proof formalism that allows rules to be applied deep inside a formula. Through this freedom it can be used to give analytic proof systems for a wider variety of logics than the sequent calculus, but standard presentations of this calculus are too permissive, allowing too many proofs. In order to make it more amenable to proof search, we transplant the focusing theorem from the sequent calculus to the calculus of structures. The key technical contribution is an incremental treatment of focusing that avoids trivializing the calculus of structures. We give a direct inductive proof of the completeness of the focused calculus of structures with respect to a more standard unfocused form. We also show that any focused sequent proof can be represented in the focused calculus of structures, and, conversely, any proof in the focused calculus of structures corresponds to a focused sequent proof.
Pdf 25 August 2011
CSL 2011, LIPIcs 12, pp. 159–173
 Focused Proof Search for Linear Logic in the Calculus of Structures
Nicolas Guenot
The proof-theoretic approach to logic programming has benefited from the introduction of focused proof systems, through the nondeterminism reduction and control they provide when searching for proofs in the sequent calculus. However, this technique was not available in the calculus of structures, known for inducing even more nondeterminism than other logical formalisms. This work in progress aims at translating the notion of focusing into the presentation of linear logic in this setting, and at using some of its specific features, such as deep application of rules and fine granularity, in order to improve proof search procedures. The starting point for this research line is the multiplicative fragment of linear logic, for which a simple focused proof system can be built.
Pdf 23 June 2010
ICLP 2010, technical communication, LIPIcs 7, pp. 84–93
 Some Observations on the Proof Theory of Second Order Propositional Multiplicative Linear Logic
Lutz Straßburger
We investigate the question of what constitutes a proof when quantifiers and multiplicative units are both present. On the technical level this paper provides two new aspects of the proof theory of MLL2 with units. First, we give a novel proof system in the framework of the calculus of structures. The main feature of the new system is the consequent use of deep inference, which allows us to observe a decomposition which is a version of Herbrand's theorem that is not visible in the sequent calculus. Second, we present a new notion of proof nets which is independent of any deductive system. We have "sequentialisation" into the calculus of structures as well as into the sequent calculus. Since cut elimination is terminating and confluent, we have a category of MLL2 proof nets. The treatment of the units is such that this category is star-autonomous.
Pdf 10 April 2009
TLCA 2009, LNCS 5608, pp. 309–324
Pdf with proofs in appendix
 MELL in the Calculus of Structures
Lutz Straßburger
The calculus of structures is a new proof-theoretical framework, like natural deduction, the sequent calculus and proof nets, for specifying logical systems syntactically. In a rule in the calculus of structures, the premise as well as the conclusion are structures, which are expressions that share properties of formulae and sequents. In this paper, I study a system for MELL, the multiplicative exponential fragment of linear logic, in the calculus of structures. It has the following features: a local promotion rule, no nondeterministic splitting of the context in the times rule, and a modular proof of the cut-elimination theorem. Further, derivations have a new property, called decomposition, that cannot be observed in any other known proof-theoretical framework.
Pdf 24 November 2003
Theoretical Computer Science 309, pp. 213–285
 A Local System for Linear Logic
Lutz Straßburger
In this paper I will present a deductive system for linear logic, in which all rules are local. In particular, the contraction rule is reduced to an atomic version, and there is no global promotion rule. In order to achieve this, it is necessary to depart from the sequent calculus to the calculus of structures, which is a generalization of the one-sided sequent calculus. In a rule, premise and conclusion are not sequents, but structures, which are expressions that share properties of formulae and sequents.
2.6 Commutative/Noncommutative Linear Logic
We conservatively extend mixed multiplicative and multiplicative exponential linear logic with a self-dual noncommutative operator. The systems so obtained cannot be presented in the sequent calculus, but they enjoy the usual properties of locality, decomposition and cut elimination available in the calculus of structures. We can present Yetter's cyclic linear logic in the calculus of structures and prove cut elimination; interestingly, cyclicity is naturally subsumed by deep inference. New, purely proof-theoretical, techniques are developed for reducing the nondeterminism in the calculus of structures.
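For orientation, the characteristic new ingredient can be sketched as follows (approximate notation, not quoted from the papers below): the noncommutative operator seq, written ⟨ ; ⟩, is self-dual under negation, and its rule applies at any depth inside a context S{ }.

```latex
% Illustrative sketch: [.,.] is par, <.;.> is the self-dual seq,
% S{ } an arbitrary deep context.
\[
  \overline{\langle R;T\rangle}=\langle\overline{R};\overline{T}\rangle
  \qquad
  \mathsf{q}{\downarrow}:\;
  \frac{S\langle[R,T];[U,V]\rangle}{S[\langle R;U\rangle,\langle T;V\rangle]}
\]
```

Deep applicability of rules such as q↓ is essential: as Tiu's paper below shows, no shallow, sequent-style restriction of these rules is complete for BV.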
The following papers exist, in addition to Linear Logic and Noncommutativity in the Calculus of Structures, mentioned above:
 Constructing Weak Simulations from Linear Implications for Processes with Private Names
Ross Horne and Alwen Tiu
This paper clarifies that linear implication defines a branching-time preorder, preserved in all contexts, when used to compare embeddings of processes in noncommutative logic. The logic considered is a first-order extension of the proof system BV featuring a de Morgan dual pair of nominal quantifiers, called BV1. An embedding of π-calculus processes as formulae in BV1 is defined, and the soundness of linear implication in BV1 with respect to a notion of weak simulation in the π-calculus is established. A novel contribution of this work is that we generalise the notion of a ‘left proof’ to a class of formulae sufficiently large to compare embeddings of processes, from which simulating execution steps are extracted. We illustrate the expressive power of BV1 by demonstrating that results extend to the internal π-calculus, where privacy of inputs is guaranteed. We also remark that linear implication is strictly finer than any interleaving preorder.
Pdf 29 March 2019
Mathematical Structures in Computer Science, in press
 Subatomic Proof Systems: Splittable Systems
Andrea Aler Tubella and Alessio Guglielmi
This paper presents the first in a series of results that allow us to develop a theory providing finer control over the complexity of normalisation, and in particular of cut elimination. By considering atoms as self-dual noncommutative connectives, we are able to classify a vast class of inference rules in a uniform and very simple way. This allows us to define simple conditions that are easily verifiable and that ensure normalisation and cut elimination by way of a general theorem. In this paper we define and consider splittable systems, which essentially comprise a large class of linear logics, including MLL and BV, and we prove for them a splitting theorem, guaranteeing cut elimination and other admissibility results as corollaries. In papers to follow, we will extend this result to nonlinear logics. The final outcome will be a comprehensive theory giving a uniform treatment for most existing logics and providing a blueprint for the design of future proof systems.
Pdf 4 December 2017
ACM Transactions on Computational Logic, 19 (1:5) 2018, pp. 1–33
 Semantics for Specialising Attack Trees based on Linear Logic
Ross Horne, Sjouke Mauw and Alwen Tiu
Attack trees profile the subgoals of the proponent of an attack. Attack trees have a variety of semantics depending on the kind of question posed about the attack, where questions are captured by an attribute domain. We observe that one of the most general semantics for attack trees, the multiset semantics, coincides with a semantics expressed using linear logic propositions. The semantics can be used to compare attack trees to determine whether one attack tree is a specialisation of another attack tree. Building on these observations, we propose two new semantics for an extension of attack trees named causal attack trees. Such attack trees are extended with an operator capturing the causal order of subgoals in an attack. These two semantics extend the multiset semantics to sets of series-parallel graphs closed under certain graph homomorphisms, where each semantics respects a class of attribute domains. We define a sound logical system with respect to each of these semantics, by using a recently introduced extension of linear logic, called MAV, featuring a noncommutative operator. The noncommutative operator models causal dependencies in causal attack trees. Similarly to linear logic for attack trees, implication defines a decidable preorder for specialising causal attack trees that soundly respects a class of attribute domains.
Pdf 15 March 2017
Fundamenta Informaticae, 153 (12) 2017, pp. 57–86
 Private Names in Non-Commutative Logic
Ross Horne, Alwen Tiu, Bogdan Aman and Gabriel Ciobanu
We present an expressive but decidable first-order system (named MAV1) defined by using the calculus of structures, a generalisation of the sequent calculus. In addition to first-order universal and existential quantifiers the system incorporates a de Morgan dual pair of nominal quantifiers called 'new' and 'wen', distinct from the self-dual Gabbay-Pitts and Miller-Tiu nominal quantifiers. The novelty of the operators 'new' and 'wen' is that they are polarised in the sense that 'new' distributes over positive operators while 'wen' distributes over negative operators. This greater control of bookkeeping enables private names to be modelled in processes embedded as predicates in MAV1. Modelling processes as predicates in MAV1 has the advantage that linear implication defines a precongruence over processes that fully respects causality and branching. The transitivity of this precongruence is established by novel techniques for handling first-order quantifiers in the cut elimination proof.
Pdf 22 June 2016
CONCUR 2016, LIPIcs 59, pp. 31:1–16
 The Consistency and Complexity of Multiplicative Additive System Virtual
Ross Horne
This paper investigates the proof theory of multiplicative additive system virtual (MAV). MAV combines two established proof calculi: multiplicative additive linear logic (MALL) and basic system virtual (BV). Due to the presence of the self-dual noncommutative operator from BV, the calculus MAV is defined in the calculus of structures — a generalisation of the sequent calculus where inference rules can be applied in any context. A generalised cut elimination result is proven for MAV, thereby establishing the consistency of linear implication defined in the calculus. The cut elimination proof involves a termination measure based on multisets of multisets of natural numbers to handle subtle interactions between operators of BV and MAV. Proof search in MAV is proven to be a PSPACE-complete decision problem. The study of this calculus is motivated by observations about applications in computer science to the verification of protocols and to querying.
Pdf 21 December 2015
Scientific Annals of Computer Science, 25 (2) 2015, pp. 245–316
 A Deep Inference System with a Self-Dual Binder Which Is Complete for Linear Lambda Calculus
Luca Roversi
We recall that SBV, a proof system developed under the methodology of deep inference, extends multiplicative linear logic with the self-dual noncommutative logical operator Seq. We introduce SBVB, which extends SBV by adding the self-dual binder Sdb on names. The system SBVB is consistent because we prove that (the analogue of) cut elimination holds for it. Moreover, the resulting interplay between Seq and Sdb can model β-reduction of linear λ-calculus inside the cut-free subsystem BVB of SBVB. The long-term aim is to keep developing a programme whose goal is to give pure logical accounts of computational primitives under the proof-search-as-computation analogy, by means of minimal and incremental extensions of SBV.
Pdf 25 January 2014
Journal of Logic and Computation, 26 (2) 2016, pp. 677–698
 A Logical Basis for Quantum Evolution and Entanglement
Richard F. Blute, Alessio Guglielmi, Ivan T. Ivanov, Prakash Panangaden and Lutz Straßburger
We reconsider discrete quantum causal dynamics where quantum systems are viewed as discrete structures, namely directed acyclic graphs. In such a graph, events are considered as vertices and edges depict propagation between events. Evolution is described as happening between a special family of spacelike slices, which were referred to as locative slices. Such slices are not so large as to result in acausal influences, but large enough to capture nonlocal correlations.
In our logical interpretation, edges are assigned logical formulas in a special logical system, called BV, an instance of a deep inference system. We demonstrate that BV, with its mix of commutative and noncommutative connectives, is precisely the right logic for such analysis. We show that the commutative tensor encodes (possible) entanglement, and the noncommutative seq encodes causal precedence. With this interpretation, the locative slices are precisely the derivable strings of formulas. Several new technical results about BV are developed as part of this analysis.
Pdf 18 July 2013
In Categories and Types in Logic, Language, and Physics, LNCS 8222, pp. 90–107
 Communication and Concurrency with Logic-Based Restriction Inside a Calculus of Structures
Luca Roversi
It is well known that we can use structural proof theory to refine, or generalize, existing paradigmatic computational primitives, or to discover new ones. Under such a point of view we keep developing a programme whose goal is establishing a correspondence between proof search in a logical system and computations in a process algebra. We give a purely logical account of a process algebra operation which strictly includes the behaviour of the restriction on actions that we find in Milner's CCS. This is possible inside a logical system in the Calculus of Structures of Deep Inference endowed with a self-dual quantifier. Using proof search for cut-free proofs of such a logical system, we show how to solve reachability problems in a process algebra that subsumes a significant fragment of Milner's CCS.
Pdf 19 December 2012
Submitted
 Linear Lambda Calculus and Deep Inference
Luca Roversi
We introduce a deep inference logical system SBVr which extends SBV with Rename, a self-dual atom-renaming operator. We prove that the cut-free subsystem BVr of SBVr exists. We embed the terms of linear lambda calculus with explicit substitutions into formulas of SBVr. Our embedding recalls the one of full lambda calculus into pi-calculus. Proof search inside SBVr and BVr is complete with respect to the evaluation of linear lambda calculus with explicit substitutions; instead, only soundness of proof search in SBVr holds. Rename is crucial to let proof search simulate the substitution of a linear lambda term for a variable in the course of linear beta-reduction. Although SBVr is a minimal extension of SBV, its proof search can compute all boolean functions, exactly like linear lambda calculus with explicit substitutions can.
 A System of Interaction and Structure V: The Exponentials and Splitting
Alessio Guglielmi and Lutz Straßburger
System NEL is the mixed commutative/noncommutative linear logic BV augmented with linear logic's exponentials or, equivalently, MELL augmented with the noncommutative self-dual connective seq. NEL is presented in deep inference because no Gentzen formalism can express it in such a way that the cut rule is admissible. Other recent works show that system NEL is Turing-complete, is able to directly express process algebra sequential composition and faithfully models causal quantum evolution. In this paper, we show cut elimination for NEL, based on a technique that we call splitting. The splitting theorem shows how and to what extent we can recover a sequent-like structure in NEL proofs. Together with a 'decomposition' theorem, proved in the previous paper of this series, splitting yields a cut-elimination procedure for NEL.
Pdf 21 September 2010
Mathematical Structures in Computer Science, 21 (3) 2011, pp. 563–584
This paper is the second of two parts resulting from splitting the paper A System of Interaction and Structure IV: The Exponentials (which was previously titled A Noncommutative Extension of Multiplicative Exponential Linear Logic and was the journal version of A Noncommutative Extension of MELL).
 A System of Interaction and Structure IV: The Exponentials and Decomposition
Lutz Straßburger and Alessio Guglielmi
We study a system, called NEL, which is the mixed commutative/noncommutative linear logic BV augmented with linear logic's exponentials. Equivalently, NEL is MELL augmented with the noncommutative self-dual connective seq. In this paper, we show a basic compositionality property of NEL, which we call decomposition. This result leads to a cut-elimination theorem, which is proved in the next paper of this series. To control the induction measure for the theorem, we rely on a novel technique that extracts from NEL proofs the structure of exponentials, into what we call !-?-flow graphs.
Pdf 27 July 2010
ACM Transactions on Computational Logic, 12 (4:23) 2011, pp. 1–39
This paper is the first of two parts resulting from splitting the paper A System of Interaction and Structure IV: The Exponentials (which was previously titled A Noncommutative Extension of Multiplicative Exponential Linear Logic and was the journal version of A Noncommutative Extension of MELL).
 Deep Inference and Probabilistic Coherence Spaces
Richard Blute, Prakash Panangaden and Sergey Slavnov
This paper proposes a definition of categorical model of the deep inference system BV, defined by Guglielmi. Deep inference introduces the idea of performing a deduction in the interior of a formula, at any depth. Traditional sequent calculus rules only see the roots of formulae. However in these new systems, one can rewrite at any position in the formula tree. Deep inference in particular allows the syntactic description of logics for which there is no sequent calculus. One such system is BV, which extends linear logic to include a noncommutative self-dual connective. This is the logic our paper proposes to model. Our definition is based on the notion of a linear functor, due to Cockett and Seely. A BV-category is a linearly distributive category, possibly with negation, with an additional tensor product which, when viewed as a bivariant functor, is linear with a degeneracy condition. We show that this simple definition implies all of the key isomorphisms of the theory. We consider Girard's category of probabilistic coherence spaces and show that it contains a self-dual monoidal structure in addition to the *-autonomous structure exhibited by Girard. This structure makes the category a BV-category. We believe this structure is also of independent interest, as well-behaved noncommutative operators generally are.
Pdf 10 February 2009
Applied Categorical Structures 20 2012, pp. 209–228
 System BV Is NP-Complete
Ozan Kahramanoğulları
System BV is an extension of multiplicative linear logic (MLL) with the rules mix, nullary mix, and a self-dual, noncommutative logical operator, called seq. While the rules mix and nullary mix extend the deductive system, the operator seq extends the language of MLL. Due to the operator seq, system BV extends the applications of MLL to those where sequential composition is crucial, e.g., concurrency theory. System FBV is an extension of MLL with the rules mix and nullary mix. In this paper, by relying on the fact that system BV is a conservative extension of system FBV, I show that system BV is NP-complete by encoding the 3-Partition problem in FBV. I provide a simple completeness proof of this encoding by resorting to a novel proof-theoretical method for reducing the nondeterminism in proof search, which is also of independent interest.
 A System of Interaction and Structure
Alessio Guglielmi
This paper introduces a logical system, called BV, which extends multiplicative linear logic by a noncommutative self-dual logical operator. This extension is particularly challenging for the sequent calculus, and so far it is not achieved therein. It becomes very natural in a new formalism, called the calculus of structures, which is the main contribution of this work. Structures are formulae subject to certain equational laws typical of sequents. The calculus of structures is obtained by generalising the sequent calculus in such a way that a new top-down symmetry of derivations is observed, and it employs inference rules that rewrite inside structures at any depth. These properties, in addition to allowing the design of BV, yield a modular proof of cut elimination.
Pdf 27 January 2007
ACM Transactions on Computational Logic, 8 (1:1) 2007, pp. 1–64
The journal version has been butchered by the editorial process, but the preprint (linked here) is fine.
At the end of 2012, this was the most cited article to have appeared in ACM TOCL in the previous five years.
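The equational laws mentioned in this abstract are, roughly, the associativity, commutativity and unit laws that the sequent calculus keeps implicit in the notion of sequent; as a sketch (with ∘ as the unit and [.,.] as par):

```latex
% Illustrative sketch of the equational theory on structures.
\[
  [R,[T,U]]=[[R,T],U],\qquad [R,T]=[T,R],\qquad [R,\circ]=R
\]
```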
 Reducing Nondeterminism in the Calculus of Structures
Ozan Kahramanoğulları
The calculus of structures is a proof-theoretical formalism which generalizes the sequent calculus with the feature of deep inference: in contrast to the sequent calculus, inference rules can be applied at any depth inside a formula, yielding shorter proofs than in any other formalism supporting analytic proofs. However, deep applicability of the inference rules causes greater nondeterminism than in the sequent calculus with regard to proof search. In this paper, we introduce a new technique which reduces nondeterminism without breaking proof-theoretical properties and provides more immediate access to shorter proofs. We present this technique on system BV, the smallest technically nontrivial system in the calculus of structures, extending multiplicative linear logic with the rules mix, nullary mix, and a self-dual noncommutative logical operator. Because our technique exploits a scheme common to all the systems in the calculus of structures, we argue that it generalizes to the systems for classical logic, linear logic, and modal logics.
Pdf 23 August 2006
LPAR 2006, LNCS 4246, pp. 272–286
 A System of Interaction and Structure II: The Need for Deep Inference
Alwen Tiu
This paper studies properties of the logic BV, which is an extension of multiplicative linear logic (MLL) with a self-dual noncommutative operator. BV is presented in the calculus of structures, a proof-theoretic formalism that supports deep inference, in which inference rules can be applied anywhere inside logical expressions. The use of deep inference results in a simple logical system for MLL extended with the self-dual noncommutative operator, which to date is not known to be expressible in the sequent calculus. In this paper, deep inference is shown to be crucial for the logic BV; that is, any restriction on the "depth" of the inference rules of BV would result in a strictly less expressive logical system.
Pdf 3 April 2006
Logical Methods in Computer Science 2 (2:4) 2006, pp. 1–24
There are pictures that help in understanding this paper in Alwen Tiu's MSc thesis Properties of a Logical System in the Calculus of Structures
 Structures for Multiplicative Cyclic Linear Logic: Deepness vs Cyclicity
Pietro Di Gianantonio
The aim of this work is to give an alternative presentation for the multiplicative fragment of Yetter's cyclic linear logic. The new presentation is inspired by the calculus of structures, and has the interesting feature of avoiding the cyclic rule. The main point in this work is to show how cyclicity can be substituted by deepness, i.e. the possibility of applying an inference rule at any point of a formula. We finally derive, through a new proof technique, the cut elimination property of the calculus.
Pdf 15 April 2004
CSL 2004, LNCS 3210, pp. 130–144
 A Noncommutative Extension of MELL
Alessio Guglielmi and Lutz Straßburger
We extend multiplicative exponential linear logic (MELL) by a noncommutative, selfdual logical operator. The extended system, called NEL, is defined in the formalism of the calculus of structures, which is a generalisation of the sequent calculus and provides a more refined analysis of proofs. We should then be able to extend the range of applications of MELL, by modelling a broad notion of sequentiality and providing new properties of proofs. We show some proof theoretical results: decomposition and cut elimination. The new operator represents a significant challenge: to get our results we use here for the first time some novel techniques, which constitute a uniform and modular approach to cut elimination, contrary to what is possible in the sequent calculus.
Pdf 9 August 2002
LPAR 2002, LNCS 2514, pp. 231–246
 The Undecidability of System NEL
Lutz Straßburger
System NEL is a conservative extension of multiplicative exponential linear logic (MELL) by a self-dual non-commutative connective called seq, which lives between the par and the times. In this paper, I will show that system NEL is undecidable by encoding two-counter machines into NEL. Although the encoding is quite simple, the proof of its faithfulness is a little intricate, because there is no sequent calculus and no phase semantics available for NEL.
 A1-Unification
Alwen Fernanto Tiu
Gzipped postscript 19 February 2002
Technical Report WV-01-08, Technische Universität Dresden
 Combining A1- and AC1-Unification Sharing Unit
Alwen Fernanto Tiu
Gzipped postscript 19 February 2002
Technical Report WV-01-09, Technische Universität Dresden
 Properties of a Logical System in the Calculus of Structures
Alwen Fernanto Tiu
The calculus of structures is a new framework for presenting logical systems. It is a generalisation of a traditional framework, the one-sided sequent calculus. One of the main features of the calculus of structures is that the inference rules are deep: they can be applied anywhere inside logical expressions. Rules in the sequent calculus are, in contrast, shallow. A certain logical system in the calculus of structures, called System BV, is studied here. We see that the deep nesting of rules is a real distinguishing feature between the two frameworks. To this purpose a notion of shallow system is introduced, of which sequent systems are particular instances. A counterexample, a sort of fractal structure, is then presented to show that there is no shallow system for the logic behind BV, and hence no sequent system for BV. This result contributes to justifying the claim that the calculus of structures is a better logical framework than the sequent calculus, for certain logics.
Pdf 12 September 2001 Now replaced by A System of Interaction and Structure II: The Need for Deep Inference
MSc thesis, successfully defended on 1 August 2001, Technical Report WV-01-06, Technische Universität Dresden
 Noncommutativity and MELL in the Calculus of Structures
Alessio Guglielmi and Lutz Straßburger
We introduce the calculus of structures: it is more general than the sequent calculus and it allows for cut elimination and the subformula property. We show a simple extension of multiplicative linear logic, by a self-dual non-commutative operator inspired by CCS, that seems not to be expressible in the sequent calculus. Then we show that multiplicative exponential linear logic benefits from its presentation in the calculus of structures, especially because we can replace the ordinary, global promotion rule by a local version. These formal systems, for which we prove cut elimination, outline a range of techniques and properties that were not previously available. Contrary to what happens in the sequent calculus, the cut elimination proof is modular.
Pdf 28 June 2001
CSL 2001, LNCS 2142, pp. 54–68
 A Calculus of Order and Interaction
Alessio Guglielmi
This paper has been thoroughly rewritten as A System of Interaction and Structure. The introduction of the new paper has not been written under the effect of psychotropic substances. Please forget about this paper.
Technical Report WV-99-04, Technische Universität Dresden
2.7 Proof Nets, Semantics of Proofs and the War on Bureaucracy
Deep inference and the calculus of structures are influencing the design of a new generation of proof nets. Moreover, they offer new insight for semantics of proofs and categorical proof theory. Finally, they open decisive new perspectives in the fight against bureaucracy.
The following papers exist, in addition to A General View of Normalisation Through Atomic Flows, Categorical Models of First Order Classical Proofs and Towards a Theory of Proofs of Classical Logic, mentioned above:
 Combinatorial Flows and Their Normalisation
Lutz Straßburger
This paper introduces combinatorial flows, which generalize combinatorial proofs so that they also include cut and substitution as methods of proof compression. We show a normalization procedure for combinatorial flows, and how syntactic proofs are translated into combinatorial flows and vice versa.
Pdf 21 August 2017
FSCD 2017, LIPIcs 84, pp. 31:1–17
 True Concurrency of Deep Inference Proofs
Ozan Kahramanoğulları
We give an event-structures-based true-concurrency characterization of deep inference proofs. The method is general to all deep inference systems that can be expressed as term rewriting systems. This delivers three consequences in a spectrum from theoretical to practical: the event structure characterization (i) provides a qualification of proof identity akin to proof nets for multiplicative linear logic and to atomic flows for classical logic; (ii) provides a concurrency-theoretic interpretation for applications in logic programming; (iii) reduces the length of the proofs, and thereby extends the margin of proof search applications.
Pdf 6 May 2016
WoLLIC 2016, LNCS 9803, pp. 249–264
 The Structure of Interaction
Stéphane Gimenez and Georg Moser
Interaction nets form a local and strongly confluent model of computation that is per se parallel. We introduce a Curry–Howard correspondence between well-formed interaction nets and a deep-inference deduction system based on linear logic. In particular, linear logic itself is easily expressed in the system and its computational aspects materialise through the correspondence. The system of interaction nets obtained is a typed variant of the already well-known sharing graphs. Due to a strong confluence property, strong normalisation for this system follows from weak normalisation. The latter is obtained via an adaptation of Girard's reducibility method. The approach is modular, readily gives rise to generalisations (e.g. second order, known as polymorphism to the programmer) and could therefore be extended to various systems of interaction nets.
Pdf 4 July 2013
CSL 2013, LIPIcs 23, pp. 316–331
 A Proof Calculus Which Reduces Syntactic Bureaucracy
Alessio Guglielmi, Tom Gundersen and Michel Parigot
In usual proof systems, like the sequent calculus, only a very limited way of combining proofs is available through the tree structure. We present in this paper a logic-independent proof calculus, where proofs can be freely composed by connectives, and prove its basic properties. The main advantage of this proof calculus is that it makes it possible to avoid certain types of syntactic bureaucracy inherent to all usual proof systems, in particular the sequent calculus. Proofs in this system closely reflect their atomic flow, which traces the behaviour of atoms through structural rules. The general definition is illustrated by the standard deep-inference system for propositional logic, for which there are known rewriting techniques that achieve cut elimination based only on the information in atomic flows.
Pdf 12 April 2010
RTA 2010, Leibniz International Proceedings in Informatics (LIPIcs) 6, pp. 135–150
This paper supersedes the note Formalism A.
 What Is the Problem with Proof Nets for Classical Logic?
Lutz Straßburger
This paper is an informal (and non-exhaustive) overview of some existing notions of proof nets for classical logic, and gives some hints as to why they might be considered unsatisfactory.
Pdf 6 April 2010
CiE 2010, LNCS 6158, pp. 406–416
 From Deep Inference to Proof Nets Via Cut Elimination
Lutz Straßburger
This article shows how derivations in the deep inference system SKS for classical propositional logic can be translated into proof nets. Since an SKS derivation contains more information about a proof than the corresponding proof net, we observe a loss of information which can be understood as "eliminating bureaucracy". Technically this is achieved by cut reduction on proof nets. As an intermediate step between the two extremes, SKS derivations and proof nets, we will see proof graphs representing derivations in "Formalism A".

 On the Axiomatisation of Boolean Categories with and without Medial
Lutz Straßburger
In its most general meaning, a Boolean category is to categories what a Boolean algebra is to posets. In a more specific meaning, a Boolean category should provide the abstract algebraic structure underlying the proofs in Boolean logic, in the same sense that a Cartesian closed category captures the proofs in intuitionistic logic and a *-autonomous category captures the proofs in linear logic. However, recent work has shown that there is no canonical axiomatisation of a Boolean category. In this work, we will see a series (with increasing strength) of possible such axiomatisations, all based on the notion of *-autonomous category. We will particularly focus on the medial map, which has its origin in an inference rule in KS, a cut-free deductive system for Boolean logic in the calculus of structures. Finally, we will present a category of proof nets as a particularly well-behaved example of a Boolean category.

 What Is a Logic, and What Is a Proof?
Lutz Straßburger
I will discuss the two problems of how to define identity between logics and how to define identity between proofs. For the identity of logics, I propose to simply use the notion of preorder equivalence. This might be considered to be folklore, but is exactly what is needed from the viewpoint of the problem of the identity of proofs: If the proofs are considered to be part of the logic, then preorder equivalence becomes equivalence of categories, whose arrows are the proofs. For identifying these, the concept of proof nets is discussed.
Pdf 23 October 2006
Logica Universalis—Towards a General Theory of Logic, pp. 135–152, Birkhäuser, 2007
 Proof Nets and the Identity of Proofs
Lutz Straßburger
These are the notes for a 5-lecture course given at ESSLLI 2006 in Malaga, Spain. The URL of the school is http://esslli2006.lcc.uma.es/. The course is intended to be introductory. That means no prior knowledge of proof nets is required. However, the student should be familiar with the basics of propositional logic, and should have seen formal proofs in some formal deductive system (e.g., sequent calculus, natural deduction, resolution, tableaux, calculus of structures, Frege–Hilbert systems, ...). It is probably helpful if the student knows already what cut elimination is, but this is not strictly necessary. In these notes, I will introduce the concept of "proof nets" from the viewpoint of the problem of the identity of proofs. I will proceed in a rather informal way. The focus will be more on presenting ideas than on presenting technical details. The goal of the course is to give the student an overview of the theory of proof nets and make the vast amount of literature on the topic more easily accessible to the beginner. For introducing the basic concepts of the theory, I will in the first part of the course stick to the unit-free multiplicative fragment of linear logic because of its rather simple notion of proof nets. In the second part of the course we will see proof nets for more sophisticated logics.
Pdf 20 October 2006
Technical Report 6013, INRIA
 From Proof Nets to the Free *-Autonomous Category
François Lamarche and Lutz Straßburger
In the first part of this paper we present a theory of proof nets for full multiplicative linear logic, including the two units. It naturally extends the well-known theory of unit-free multiplicative proof nets. A linking is no longer a set of axiom links but a tree in which the axiom links are subtrees. These trees will be identified according to an equivalence relation based on a simple form of graph rewriting. We show the standard results of sequentialization and strong normalization of cut elimination. In the second part of the paper we show that the identifications enforced on proofs are such that the class of two-conclusion proof nets defines the free *-autonomous category.
Pdf 5 October 2006 Journal version of On Proof Nets for Multiplicative Linear Logic with Units
Logical Methods in Computer Science 2 (4:3) 2006, pp. 1–44
 Completeness of MLL Proof-Nets w.r.t. Weak Distributivity
JeanBaptiste Joinet
We examine 'weak distributivity' as a rewriting rule \(\xrightarrow{WD}\) defined on multiplicative proof-structures (so, in particular, on multiplicative proof-nets: MLL). This rewriting does not preserve the type of proof-nets, but does nevertheless preserve their correctness. The specific contribution of this paper is to give a direct proof of completeness for \(\xrightarrow{WD}\): starting from a set of simple generators (proof-nets which are an n-ary \(\otimes\) of \(\wp\)-ized axioms), any mono-conclusion MLL proof-net can be reached by \(\xrightarrow{WD}\) rewriting (up to \(\otimes\) and \(\wp\) associativity and commutativity).
Pdf 8 September 2006
The Journal of Symbolic Logic 72 (1) 2007, pp. 159–170
Invited talk at WoLLIC 2003, under the title 'Calculus of Structures and Proof Nets', Electronic Notes in Theoretical Computer Science 84; a short version appeared in the proceedings of Structures and Deduction '05, pp. 81–94
 Exploring the Gap between Linear and Classical Logic
François Lamarche
The Medial rule was first devised as a deduction rule in the calculus of structures. In this paper we explore it from the point of view of category theory, as additional structure on a *-autonomous category. This gives us some insights on the denotational semantics of classical propositional logic, and allows us to construct new models for it, based on suitable generalizations of the theory of coherence spaces.
Pdf 27 June 2006
Theory and Applications of Categories 18 (17) 2007, pp. 473–535
 The Three Dimensions of Proofs
Yves Guiraud
In this document, we study a 3-polygraphic translation for the proofs of SKS, a formal system for classical propositional logic. We prove that the free 3-category generated by this 3-polygraph describes the proofs of classical propositional logic modulo structural bureaucracy. We give a 3-dimensional generalization of Penrose diagrams and use it to provide several pictures of a proof. We sketch how local transformations of proofs yield a non-contrived example of 4-dimensional rewriting.
 On Two Forms of Bureaucracy in Derivations
Kai Brünnler and Stéphane Lengrand
We call irrelevant information in derivations bureaucracy. An example of such irrelevant information is the order between two consecutive inference rules that trivially permute. Building on ideas by Guglielmi, we identify two forms of bureaucracy that occur in the calculus of structures (and, in fact, in every nontrivial term rewriting derivation). We develop term calculi that provide derivations that do not contain this bureaucracy. We also give a normalisation procedure that removes bureaucracy from derivations and find that in a certain sense the normalisation process is a process of cut elimination.
Pdf 10 June 2005
Proceedings of Structures and Deduction '05, pp. 69–80
This is further expanded in Ch. 11 of Stéphane Lengrand's PhD thesis Normalisation & Equivalence in Proof Theory & Type Theory (2007 Ackermann Award of the EACSL)
 Constructing Free Boolean Categories
François Lamarche and Lutz Straßburger
By Boolean category we mean something which is to a Boolean algebra what a category is to a poset. We propose an axiomatic system for Boolean categories, which is different in several respects from the ones proposed recently. In particular, everything is done from the start in a *-autonomous category and not in a weakly distributive one, which simplifies issues like the Mix rule. An important axiom, which is introduced later, is a "graphical" condition, which is closely related to denotational semantics and the Geometry of Interaction. Then we show that a previously constructed category of proof nets is the free "graphical" Boolean category in our sense. This validates our categorical axiomatization with respect to a real-life example. Another important aspect of this work is that we do not assume a priori the existence of units in the *-autonomous categories we use. This has some retroactive interest for the semantics of linear logic, and is motivated by the properties of our example with respect to units.
 Classical Categories and Deep Inference
Richard McKinley
Deep inference is a proof-theoretic notion in which proof rules apply arbitrarily deeply inside a formula. We show that the essence of deep inference is the bifunctoriality of the connectives. We demonstrate that, when given an inequational theory that models cut-reduction, a deep inference calculus for classical logic (SKSg) is a categorical model of the classical sequent calculus LK in the sense of Führmann and Pym. We observe that this gives a notion of cut-reduction for derivations in SKSg, for which the usual notion of cut in SKSg is a special case. Viewing SKSg as a model of the sequent calculus uncovers new insights into the Craig interpolation lemma and intuitionistic provability.
Pdf 31 March 2005
Proceedings of Structures and Deduction '05, pp. 19–33
 Naming Proofs in Classical Propositional Logic
François Lamarche and Lutz Straßburger
We present a theory of proof denotations in classical propositional logic. The abstract definition is in terms of a semiring of weights, and two concrete instances are explored. With the Boolean semiring we get a theory of classical proof nets, with a geometric correctness criterion, a sequentialization theorem, and a strongly normalizing cut-elimination procedure. With the semiring of natural numbers, we obtain a sound semantics for classical logic, in which fewer proofs are identified. Though a "real" sequentialization theorem is missing, these proof nets have a grip on complexity issues. In both cases the cut-elimination procedure is closely related to its equivalent in the calculus of structures, and we get "Boolean" categories which are not posets.
Pdf 31 January 2005
TLCA 2005, LNCS 3461, pp. 246–261
 Deep Inference Proof Theory Equals Categorical Proof Theory Minus Coherence
Dominic Hughes
This paper links deep inference proof theory, as studied by Guglielmi et al., to categorical proof theory in the sense of Lambek et al. It observes how deep inference proof theory is categorical proof theory, minus the coherence diagrams/laws. Coherence yields a ready-made and well-studied notion of equality on deep inference proofs. The paper notes a precise correspondence between the symmetric deep inference system for multiplicative linear logic (the linear fragment of SKSg) and the presentation of *-autonomous categories as symmetric linearly distributive categories with negation. Contraction and weakening in SKSg correspond precisely to the presence of (co)monoids.
Pdf 6 October 2004
 On Proof Nets for Multiplicative Linear Logic with Units
Lutz Straßburger and François Lamarche
In this paper we present a theory of proof nets for full multiplicative linear logic, including the two units. It naturally extends the well-known theory of unit-free multiplicative proof nets. A linking is no longer a set of axiom links but a tree in which the axiom links are subtrees. These trees will be identified according to an equivalence relation based on a simple form of graph rewriting. We show the standard results of sequentialization and strong normalization of cut elimination. Furthermore, the identifications enforced on proofs are such that the proof nets, as they are presented here, form the arrows of the free (symmetric) *-autonomous category.
Pdf 30 June 2004
CSL 2004, LNCS 3210, pp. 145–159
2.8 Language Design
Thanks to a self-dual non-commutative extension of linear logic, one gets the first purely logical account of sequentiality in proof search. The new logical operators make possible a new approach to partial-order planning and its relation to concurrency.
The following papers exist, in addition to Nondeterminism and Language Design in Deep Inference, mentioned above:
 Behavioural Analysis of Sessions Using the Calculus of Structures
Gabriel Ciobanu and Ross Horne
This paper describes an approach to the behavioural analysis of sessions. The approach is made possible by the calculus of structures — a deep-inference proof calculus, generalising the sequent calculus, where inference rules are applied in any context. The approach involves specifications of global and local sessions inspired by the Scribble language. The calculus features a novel operator that synchronises parts of a protocol that must be treated atomically. Firstly, the calculus can be used to determine whether local sessions can be composed in a type-safe fashion such that the sessions are capable of successfully completing. Secondly, the calculus defines a subtyping relation for sessions that allows causal dependencies to be weakened while retaining termination potential. Consistency and complexity results follow from the proof theory.
Pdf 2 December 2015
PSI 2015, LNCS 9609, pp. 91–106
 On Linear Logic Planning and Concurrency
Ozan Kahramanoğulları
We present an approach to linear logic planning in which an explicit correspondence between partial-order plans and multiplicative exponential linear logic proofs is established. This is performed by extracting partial-order plans from a sound and complete encoding of planning problems in multiplicative exponential linear logic. These partial-order plans exhibit a non-interleaving behavioural concurrency semantics, i.e., labelled event structures. Relying on this fact, we argue that this work is a crucial step towards establishing a common language for concurrency and planning that will allow techniques and methods to be carried between the two fields.
 A Deductive Compositional Approach to Petri Nets for Systems Biology
Ozan Kahramanoğulları
We introduce the language CP, a compositional language for place-transition Petri nets, for the purpose of modelling signalling pathways in complex biological systems. We give the operational semantics of the language CP by means of a proof-theoretical deductive system which extends multiplicative exponential linear logic with a self-dual non-commutative logical operator. This makes it possible to express parallel and sequential composition of processes at the same syntactic level, as in process algebra, and to perform logical reasoning on these processes. We demonstrate the use of the language on a model of a signalling pathway for Fc receptor-mediated phagocytosis.
Pdf 18 September 2007
Presented as a poster at CMSB '07
 Labeled Event Structure Semantics of Linear Logic Planning
Ozan Kahramanoğulları
Pdf 17 January 2005
Presented at 1st World Congress on Universal Logic
 Towards Planning as Concurrency
Ozan Kahramanoğulları
We present a purely logical framework for planning, in which we bring the sequential and parallel composition in the plans to the same level, as in process algebras. The problem of expressing causality, which is very challenging for common logics and traditional deductive systems, is solved by resorting to a recently developed extension of multiplicative exponential linear logic with a self-dual, non-commutative operator. We present an encoding of conjunctive planning problems in this logic, and provide a constructive soundness and completeness result. We argue that this work is the first, but crucial, step of a uniform deductive formalism that connects planning and concurrency inside a common language, and allows methods to be transferred from concurrency to planning.
Pdf 19 October 2004
Artificial Intelligence and Applications 2005, ACTA Press, pp. 197–202
 A Purely Logical Account of Sequentiality in Proof Search
Paola Bruscoli
We establish a strict correspondence between the proof-search space of a logical formal system and computations in a simple process algebra. Sequential composition in the process algebra corresponds to a logical relation in the formal system—in this sense our approach is purely logical: no axioms or encodings are involved. The process algebra is a minimal restriction of CCS to parallel and sequential composition; the logical system is a minimal extension of multiplicative linear logic. This way we get the first purely logical account of sequentiality in proof search. Since we restrict attention to a small but meaningful fragment, which is then of very broad interest, our techniques should become a common basis for several possible extensions. In particular, we argue that this work is the first step in a two-step research programme for capturing most of CCS in a purely logical fashion.
Pdf 12 August 2002
ICLP 2002, LNCS 2401, pp. 302–316
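To make the correspondence above concrete, here is a toy rendering (ours, with invented syntax, not the paper's) of how a process built from atoms, parallel composition and sequential composition can be read as a BV-style structure, with parallel composition written as a par [ , ] and sequential composition as a seq < ; >:

```python
# Toy translation from a CCS-like fragment (atoms, parallel, sequential)
# into BV-style structures. The bracket syntax is purely illustrative.

def translate(proc):
    """proc is a nested tuple ('par', p, q) or ('seq', p, q), or an atom string."""
    if isinstance(proc, str):
        return proc
    op, p, q = proc
    if op == 'par':
        return f"[{translate(p)}, {translate(q)}]"   # parallel composition -> par
    if op == 'seq':
        return f"<{translate(p)}; {translate(q)}>"   # sequential composition -> seq
    raise ValueError(f"unknown operator: {op}")

# The process a.(b | c) becomes the structure <a; [b, c]>
print(translate(('seq', 'a', ('par', 'b', 'c'))))  # <a; [b, c]>
```

The sketch only illustrates the shape of the mapping between process terms and structures; the content of the paper is the correspondence between computations and proof search, which is not captured here.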
2.9 Implementations
Ozan Kahramanoğulları, Pierre-Etienne Moreau and Antoine Reilles have implemented calculus-of-structures proof systems in Maude (MLL and MELL) and in Tom. Ozan managed to achieve efficiency without sacrificing proof-theoretic cleanliness, and he is obtaining results of independent theoretical interest. There are two slide presentations:
Max Schäfer has built a graphical proof editor in Java, called GraPE, for the Maude modules written by Ozan Kahramanoğulları; this means that one can interactively build and find proofs in several deep-inference systems.
The following papers exist, in addition to Nondeterminism and Language Design in Deep Inference, mentioned above:
 Ingredients of a Deep Inference Theorem Prover
Ozan Kahramanoğulları
Deep-inference deductive systems for classical logic provide exponentially shorter proofs than sequent calculus systems, however at the cost of higher nondeterminism and a larger search space in proof search. We report on our ongoing work on proof search with deep-inference deductive systems. We present systems for classical logic where nondeterminism in proof search is reduced by constraining the context management rule of these systems. We argue that a deep-inference system for classical logic can outperform sequent calculus deductive systems in proof search when nondeterminism and the application of the contraction rule are controlled by means of invertible rules.
Pdf 24 June 2008
Short paper at CL&C '08
 Maude as a Platform for Designing and Implementing Deep Inference Systems
Ozan Kahramanoğulları
Deep inference is a proof-theoretical methodology that generalizes the traditional notion of inference in the sequent calculus: in contrast to the sequent calculus, deductive systems with deep inference do not rely on the notion of main connective, and permit the application of inference rules at any depth inside logical expressions, in a way which resembles the application of term rewriting rules. Deep inference provides a richer combinatorial analysis of proofs for different logics. In particular, the construction of exponentially shorter proofs becomes possible. In this paper, aiming at the development of computation-as-proof-search tools, we propose the Maude language as a means for designing and implementing different deep-inference deductive systems and proof strategies that work on these systems. We demonstrate these ideas on classical logic and argue that they can be analogously carried over to other deductive systems for other logics.
Pdf 16 September 2007
RULE '07, Electronic Notes in Theoretical Computer Science 219, 2008, pp. 35–50
MLL and MELL in Maude
 Canonical Abstract Syntax Trees
Antoine Reilles
This paper presents GOM, a language for describing abstract syntax trees and generating a Java implementation for those trees. GOM includes features that allow one to specify and modify the interface of the data structure. In particular, these features provide the capability to maintain the internal representation of data in canonical form with respect to a rewrite system. This explicitly guarantees that the client program only manipulates normal forms for this rewrite system, a feature which is only implicitly used in many implementations.
Pdf 28 March 2006
Proceedings of 6th International Workshop on Rewriting Logic and Its Applications, Electronic Notes in Theoretical Computer Science 176, 2007, pp. 165–179
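The key idea of maintaining canonical forms by construction can be sketched as follows (a minimal Python analogue of ours; GOM itself generates Java and its actual rewrite-system machinery is not shown here). The constructor normalises with respect to a tiny rewrite system — flattening an associative-commutative operator and removing its unit — so that client code only ever sees normal forms:

```python
# Sketch of canonical-by-construction syntax trees: the constructor of an
# associative-commutative operator with unit 'I' normalises its arguments,
# so every Par object the client can observe is already in normal form.

class Par:
    UNIT = 'I'

    def __init__(self, *args):
        flat = []
        for a in args:
            if isinstance(a, Par):
                flat.extend(a.args)        # associativity: flatten nested Par
            elif a != Par.UNIT:
                flat.append(a)             # unit: drop occurrences of I
        self.args = tuple(sorted(flat))    # commutativity: canonical argument order

    def __repr__(self):
        return '[' + ', '.join(self.args) + ']'

# Nesting, units and argument order all disappear at construction time:
print(Par('b', Par('c', 'I', 'a')))  # [a, b, c]
```

The design choice this illustrates is the one the abstract describes: instead of normalising lazily and hoping clients call the normaliser, the data structure makes non-normal forms unrepresentable.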
 Implementing Deep Inference in TOM
Ozan Kahramanoğulları, Pierre-Etienne Moreau and Antoine Reilles
The calculus of structures is a proof-theoretical formalism which generalizes the sequent calculus with the feature of deep inference: in contrast to the sequent calculus, the calculus of structures does not rely on the notion of main connective and, as in term rewriting, it permits the application of inference rules at any depth inside a formula. Tom is a pattern-matching processor that integrates term rewriting facilities into imperative languages. In this paper, relying on the correspondence between systems in the calculus of structures and term rewriting systems, we present an implementation of system BV of the calculus of structures in Java, exploiting the term rewriting features of Tom. This way, by means of the expressive power of Java, it becomes possible to implement different search strategies. Since the systems in the calculus of structures follow a common scheme, we argue that our implementation can be generalized to other systems in the calculus of structures for classical logic, modal logics, and different fragments of linear logic.
Pdf 23 April 2005
Proceedings of Structures and Deduction '05, pp. 158–172
 System BV Without the Equalities for Unit
Ozan Kahramanoğulları
System BV is an extension of multiplicative linear logic with a non-commutative self-dual operator. In this paper, we present systems equivalent to system BV where the equalities for unit are oriented from left to right and new structural rules are introduced to preserve completeness. While the first system allows units to appear in the structures, the second system makes it possible to completely remove the units from the language of BV by proving the normal forms of the structures that are provable in BV. The resulting systems provide better performance in automated proof search by disabling redundant applications of inference rules due to the unit. As evidence, we provide a comparison of the performance of these systems in a Maude implementation.
Pdf 4 August 2004
ISCIS 2004, LNCS 3280, pp. 986–995
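The orientation of the unit equalities can be pictured as a small terminating rewrite system (a schematic sketch in our own simplified syntax, not the actual rules of the paper): instead of treating an equality such as [R, I] = R as usable in both directions during proof search, only the simplifying direction [R, I] → R is kept, applied anywhere inside a structure:

```python
# Oriented unit equalities as a terminating rewrite system. Structures are
# atoms (strings) or tuples (op, left, right); 'I' stands for the unit.
# The equalities (op I R) = R and (op R I) = R are used left-to-right only.

def simplify(term):
    """Return the unit-free normal form of a structure."""
    if isinstance(term, str):
        return term
    op, l, r = term
    l, r = simplify(l), simplify(r)   # rewrite deep inside both subterms first
    if l == 'I':                      # oriented rule: (op I R) -> R
        return r
    if r == 'I':                      # oriented rule: (op R I) -> R
        return l
    return (op, l, r)

print(simplify(('par', ('seq', 'I', 'a'), ('par', 'b', 'I'))))  # ('par', 'a', 'b')
```

Because every rule strictly shrinks the term, the system terminates, which is what lets an implementation normalise units away once instead of re-applying the two-way equality during search.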
 From the Calculus of Structures to Term Rewriting Systems
Steffen Hölldobler and Ozan Kahramanoğulları
The calculus of structures is a recently developed proof-theoretical formalism that extends the one-sided sequent calculus, with the gain of interesting proof-theoretical properties. In contrast to the sequent calculus, it does not rely on the notion of main connective and, as in term rewriting, permits the application of inference rules anywhere deep inside a formula. In this paper, exploiting this resemblance, we present a procedure turning derivations in the calculus of structures, in four steps, into rewritings in a term rewriting system modulo equality.
13 June 2004
Technical Report WV-04-03, Technische Universität Dresden
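The resemblance the correspondence rests on — a deep inference rule is a rewrite that may fire at any position inside a formula, not only at the root — can be sketched in a few lines (our own illustration, with an invented commutativity rule as the example, not the translation procedure of the paper):

```python
# Deep rule application on formula trees. A term is an atom (string) or a
# tuple (connective, left, right); a rule maps a term to its rewritten form
# or returns None when it does not apply at that position.

def apply_deep(rule, term):
    """Yield every term obtained by applying `rule` at exactly one position."""
    rewritten = rule(term)
    if rewritten is not None:
        yield rewritten                    # application at the root
    if isinstance(term, tuple):
        op, l, r = term
        for l2 in apply_deep(rule, l):     # application inside the left subterm
            yield (op, l2, r)
        for r2 in apply_deep(rule, r):     # application inside the right subterm
            yield (op, l, r2)

def comm(t):
    """Example rule: commute a conjunction, (and A B) -> (and B A)."""
    if isinstance(t, tuple) and t[0] == 'and':
        return ('and', t[2], t[1])
    return None

# The rule fires inside the disjunction, where a shallow system could not reach:
results = list(apply_deep(comm, ('or', ('and', 'a', 'b'), 'c')))
print(results)  # [('or', ('and', 'b', 'a'), 'c')]
```

Enumerating all one-step rewrites at all positions is exactly what makes deep-inference proof search resemble term rewriting, and also why its search space is larger than in shallow systems.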
 Implementing System BV of the Calculus of Structures in Maude
Ozan Kahramanoğulları
System BV is an extension of multiplicative linear logic with a non-commutative self-dual operator. We first map derivations of system BV of the calculus of structures to rewritings in a term rewriting system modulo equality, and then express this rewriting system as a Maude system module. This results in an automated proof-search implementation for this system, and provides a recipe for implementing existing calculus-of-structures systems for other logics. Our result is interesting from the point of view of applications, especially those where sequentiality is essential, e.g., planning and natural language processing. In particular, we argue that we can express plans as logical formulae by using the sequential operator of BV and reason about them in a purely logical way.
Pdf 13 June 2004
Proceedings of the ESSLLI 2004 Student Session
3 Notes
 An Unavoidable Contraction Loop in Monotone Deep Inference Anupam Das 25 June 2017
 On Analyticity in Deep Inference Paola Bruscoli and Alessio Guglielmi In this note, we discuss the notion of analyticity in deep inference and propose a formal definition for it. The idea is to obtain a notion that would guarantee the same properties that analyticity in Gentzen theory guarantees, in particular, some reasonable starting point for algorithmic proof search. Given that deep inference generalises Gentzen proof theory, the notion of analyticity discussed here could be useful in general, and we offer some reasons why this might be the case. 13 November 2016
 Confluent and Natural Cut Elimination in Classical Logic Alessio Guglielmi and Benjamin Ralph We show that, in deep inference, there is a natural and confluent cut elimination procedure that has a strikingly simple semantic justification. We proceed in two phases: we first tackle the propositional case with a construction that we call the 'experiments method'. Here we build a proof made of as many derivations as there are models of the given tautology. Then we lift the experiments method to the predicate calculus, by tracing all the existential witnesses, and in so doing we reconstruct Herbrand's theorem. The confluence of the procedure is essentially taken care of by the commutativity and associativity of disjunction. 31 March 2015 Accepted at Proof, Computation, Complexity 2015
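The experiments method described above assembles one derivation per model of the given tautology. As a hedged illustration of the counting involved (not the authors' construction; the tuple encoding and function names are invented), the following Python sketch enumerates the models of a formula, so a tautology over n atoms yields 2^n experiments.

```python
from itertools import product

def atoms(f):
    # Collect the atom names occurring in a formula.
    if isinstance(f, tuple):
        return atoms(f[1]) | atoms(f[2])
    return {f.lstrip('~')}

def holds(f, v):
    # Evaluate a formula under the assignment v (atom name -> bool).
    if isinstance(f, tuple):
        op, l, r = f
        return (holds(l, v) and holds(r, v)) if op == 'and' \
            else (holds(l, v) or holds(r, v))
    return not v[f[1:]] if f.startswith('~') else v[f]

def models(f):
    # All satisfying assignments of f; for a tautology this is every
    # assignment, and the experiments method builds one derivation per model.
    names = sorted(atoms(f))
    return [dict(zip(names, bits))
            for bits in product([False, True], repeat=len(names))
            if holds(f, dict(zip(names, bits)))]
```

For instance, the excluded middle over one atom has two models, so its proof by experiments consists of two derivations, one per truth-value of the atom.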
 Subatomic Proof Systems Andrea Aler Tubella and Alessio Guglielmi We show a proof system that generates propositional proofs by employing a single, linear, simple and regular inference rule scheme. The main idea is to consider atoms as self-dual, non-commutative binary logical relations and build formulae by freely composing units by atoms, disjunction and conjunction. If we restrict proofs to formulae where no atom occurs in the scope of another atom, we fully and faithfully recover a deduction system for propositional logic in the usual sense, where traditional proof analysis and transformation techniques such as cut elimination can be studied. 30 March 2015 Presented at Proof, Computation, Complexity 2015, then at Women in Logic 2017 (Pdf) and then at SYSMICSW2 2018 (Pdf)
 A Subatomic Proof System Andrea Aler Tubella and Alessio Guglielmi In this work we show a proof system, called SA, that generates propositional proofs by employing a single, linear, simple and regular inference rule scheme. The main idea is to consider atoms as self-dual, non-commutative binary logical relations and build formulae by freely composing units by atoms, disjunction and conjunction. If we restrict proofs to formulae where no atom occurs in the scope of another atom, we fully and faithfully recover a deduction system for propositional logic in the usual sense, where traditional proof analysis and transformation techniques such as cut elimination can be studied. 5 July 2014
 The Commutative/Non-commutative Linear Logic BV Alessio Guglielmi This brief survey contains an informal presentation of the commutative/non-commutative linear logic BV in terms of a naïve spatio-temporal model. BV improves on the ability of linear logic to model spatial structure by satisfactorily capturing temporal structure in programming languages and quantum physics. I provide a guide to the literature on BV and suggest that the only satisfactory treatment for it can be obtained in the proof-theoretic methodology of deep inference. 8 May 2014
 The Logic BV and Quantum Causality Richard Blute, Prakash Panangaden and Lutz Straßburger We describe how a logic with commutative and non-commutative connectives can be used for capturing the essence of discrete quantum causal propagation. 12 September 2008 Superseded by A Logical Basis for Quantum Evolution and Entanglement
 Polynomial Size Deep-Inference Proofs Instead of Exponential Size Shallow-Inference Proofs Alessio Guglielmi By a simple example, I show how deep inference can provide for an exponential speedup in the size of proofs with respect to shallow inference. 22 September 2007 Slides Superseded by On the Proof Complexity of Deep Inference
 Finitary Cut Alessio Guglielmi We all know that the traditional cut rule is considered infinitary. But if we reduce the cut rule to atomic form, as we can always do in the calculus of structures, is atomic cut still infinitary? Not really. 22 September 2007 Superseded by A First Order System with Finite Choice of Premises
 On Analytic Inference Rules in the Calculus of Structures Paola Bruscoli and Alessio Guglielmi We discuss the notion of analytic inference rule for propositional logics in the calculus of structures. 25 November 2009 Superseded by On Analyticity in Deep Inference
 Interaction and Depth Against Nondeterminism in Proof Search Ozan Kahramanoğulları We argue that, by exploiting an interaction and depth scheme in logical expressions, the nondeterminism in proof search can be reduced without losing shorter proofs or breaking proof-theoretical properties. 23 April 2007 Presented at Automated Reasoning Workshop 2007
 Butterflies Alessio Guglielmi If you're interested in NEL and BV, try and find the mistake in this note. 29 July 2005
 Some News on Subatomic Logic Alessio Guglielmi I found a simple explanation for the soundness of the subatomic inference rule. 28 July 2005 Superseded by A Study of Normalisation Through Subatomic Logic
 Red and Blue Alessio Guglielmi The sequent calculus is often colour blind, but not always; the calculus of structures has a consistent behaviour. 26 July 2005
 The Problem of Bureaucracy and Identity of Proofs from the Perspective of Deep Inference Alessio Guglielmi Some discussion and then the first attempt at wired/weird deduction. 17 June 2005 Proceedings of Structures and Deduction '05, pp. 53–68
 Getting Formalisms A and B by Proof-Terms and Typing Systems Stéphane Lengrand and Kai Brünnler 11 February 2005 Expanded into On Two Forms of Bureaucracy in Derivations
 Formalism B Alessio Guglielmi Inference rules operate on derivations. 20 December 2004
 Using BV to Describe Causal Quantum Evolution Prakash Panangaden In this note I describe how to capture the kinematics of quantum causal evolution using a logic called BV developed by the Calculus of Structures group at Dresden. 5 July 2004
 Formalism A Alessio Guglielmi I generalise the notion of derivation for taking care of a certain kind of bureaucracy. 23 April 2004 Superseded by A Proof Calculus Which Reduces Syntactic Bureaucracy
 Resolution in the Calculus of Structures Alessio Guglielmi It is possible to search for cutfree proofs by the resolution strategy in the calculus of structures; the sequent calculus does not allow the same freedom. 29 September 2003
 Mismatch Alessio Guglielmi There is a mismatch between meta and object levels in many calculi, like the sequent calculus and natural deduction. The mismatch has undesirable proof theoretical consequences, the most important being the inability to design deductive systems. 24 August 2003 Brief note for the specialist, posted to the Foundations of Mathematics and Proof Theory lists
 Normalisation Without Cut Elimination Alessio Guglielmi I propose a possible notion of equivalence for proofs in classical logic, sort of a normal form for case analysis. 25 February 2003
 Subatomic Logic Alessio Guglielmi One can unify classical and linear logic by using only two simple, linear, 'hyper' inference rules; they generate nice systems for all the fragments, where all rules are local. The various logics are determined by the algebra of their units; for example, Boolean algebra determines classical logic. We can prove cut elimination easily and once and for all in the two-rule system, and the procedure scales down to all the derived fragments. 21 November 2002 Superseded by A Study of Normalisation Through Subatomic Logic
 On Lafont's Counterexample Alessio Guglielmi Lafont's counterexample shows why cut elimination in the sequent calculus of classical logic is not confluent. I show here that the counterexample doesn't work in the calculus of structures. 1 November 2002
 Recipe Alessio Guglielmi A family of tautologies that hold in most logics can be used, following a simple recipe, to produce deductive systems where all rules are local, including cut. 27 October 2002
 Goodness, Perfection and Miracles Alessio Guglielmi I argue about three properties closely related to cut elimination and interpolation theorems. 18 October 2002
 Freedom from Bureaucracy FF 15 February 1963
4 Mailing List
There is a very friendly mailing list devoted to deep inference, proof nets, structads, the calculus of structures and other amphibians of structural proof theory, named Frogs. Join it!
5 Grants
These are the current and recent grants regarding deep inference that I am aware of:
 Typed Lambda-Calculi with Sharing and Unsharing 2018–2021. Three-year EPSRC research project at the University of Bath. 400,250 GBP. Principal investigator is Willem Heijltjes. The project was ranked first at the prioritisation panel.
 The Fine Structure of Formal Proof Systems and their Computational Interpretations (FISP) 2016–2019. Three-year ANR-FWF project at Université Paris Diderot/PPS, INRIA Saclay/École Polytechnique, University of Innsbruck, Vienna University of Technology. 690,647 EUR. Coordinator is Michel Parigot.
 Efficient and Natural Proof Systems 2013–2016. Three-year EPSRC research project at the University of Bath. 756,061 GBP (full economic costing, including indexation and a research studentship funded by the University of Bath). Principal investigator is Alessio Guglielmi, co-investigator is Guy McCusker.
 Sharing and Sequentiality in Proof Systems with Locality 2012–2014. Two-year Royal Society International Exchanges Scheme. Exchange programme between the University of Bath and the Università di Torino. The two sides get 11,960 GBP. Principal investigators: Alessio Guglielmi and Luca Roversi.
 Extending the Realm of the Curry-Howard Correspondence 2011–2012. French-Swiss Academic Research Collaboration Programme (PAI Germaine De Stael). Exchange programme between INRIA Futurs and IAM, Universität Bern. The two sides get around 4,300 EUR each. Principal investigators: Lutz Straßburger and Kai Brünnler.
 Structural and Computational Proof Theory (Structural) 2010–2012. ANR—Programme Blanc International/FWF 2010. Three-year project at PPS, LIX, Innsbruck and Vienna. The project gets 441,229 EUR, out of a total cost in France of 2,032,556 EUR, from the Agence Nationale de la Recherche and it gets 291,577 EUR from the FWF. Coordinator and principal investigator is Michel Parigot.
 REDO: Redesigning Logical Syntax 2009–2010. INRIA—Action de Recherche Collaborative. Two-year project of the teams Calligramme/Démosthène and Parsifal (INRIA) and the Computer Science Department of the University of Bath. The project gets 80,000 EUR. Principal investigators are Alessio Guglielmi, François Lamarche and Lutz Straßburger (coordinator).
 Identity and Geometric Essence of Proofs (Démosthène) ANR—Chaire d'excellence 2008–2010. Two-year project at LORIA. The project gets about 355,000 EUR from the Agence Nationale de la Recherche, out of a total cost of 750,448 EUR, the rest of which is covered by INRIA. Principal investigator is Alessio Guglielmi.
 Complexity and Non-Determinism in Deep Inference 2007–2008. 17-month project at the University of Bath. 118,072 GBP, out of a total cost of 147,591 GBP, from the Engineering and Physical Sciences Research Council (EPSRC). Principal investigator is Alessio Guglielmi, co-investigator is Paola Bruscoli.
 Deep Inference and the Essence of Proofs 2007–2008. French-Swiss Academic Research Collaboration Programme (PAI Germaine De Stael). Exchange programme between INRIA Futurs and IAM, Universität Bern. The two sides get around 5,000 EUR each. Principal investigators: Lutz Straßburger and Kai Brünnler.
 The Realm of Cut Elimination 2007–2008. French-Austrian Academic Research Collaboration Programme (PAI Amadeus). Exchange programme between INRIA Futurs and Theory and Logic Group, Technische Universität Wien. The two sides get around 5,000 EUR each. Principal investigators: Lutz Straßburger and Agata Ciabattoni.
 Theory and Application of Deep Inference (Infer) ANR—Programme Blanc 2006. Three-year project at LIX, LORIA and PPS. The project gets 230,400 EUR, out of a total cost of 1,280,598 EUR, from the Agence Nationale de la Recherche. Coordinator and principal investigator is Lutz Straßburger, other principal investigators are François Lamarche and Michel Parigot.
 Analyticity and Proof Search for Modal Logics in Deep Inference 2006–2008. British-German Academic Research Collaboration Programme. Two-year exchange programme between the University of Bath and the Technische Universität Dresden. Bath gets 2,650 GBP from the British Council (principal investigator Alessio Guglielmi) and Dresden gets 5,694 EUR from the German Academic Exchange Service (principal investigator Steffen Hölldobler).
 New Deductive Systems, Normalisation Methods and Semantics for Deep Inference 2006–2007. Alliance—Franco-British Partnership Programme. Two-year exchange programme between the University of Bath and the PPS group in Paris. Bath gets 4,300 GBP from the British Council (principal investigator Alessio Guglielmi) and PPS gets 6,250 EUR from the French MAE and the Ministry of Research (principal investigator Michel Parigot).
I did not indicate individual travel grants.
6 Beans
 Journals and chapters in books: 52
  2019 (4): ACM Transactions on Computational Logic; Archive for Mathematical Logic; Mathematical Structures in Computer Science (2)
  2018 (1): ACM Transactions on Computational Logic
  2017 (1): Fundamenta Informaticae
  2016 (4): Journal of Logic and Computation (2); Logical Methods in Computer Science (2)
  2015 (7): ACM Transactions on Computational Logic; Annals of Pure and Applied Logic; Logic Journal of the IGPL; Logical Methods in Computer Science (2); Mathematical Logic and Foundations; Scientific Annals of Computer Science
  2014 (3): Categories and Types in Logic, Language, and Physics; Logical Methods in Computer Science; Notre Dame Journal of Formal Logic
  2013 (2): ACM Transactions on Computational Logic; Logical Methods in Computer Science
  2012 (4): Annals of Pure and Applied Logic (3); Applied Categorical Structures
  2011 (5): ACM Transactions on Computational Logic; Journal of Logic and Computation; Logical Methods in Computer Science; Logic Without Frontiers (Festschrift for Walter Alexandre Carnielli); Mathematical Structures in Computer Science
  2009 (6): ACM Transactions on Computational Logic; Annals of Pure and Applied Logic; Archive for Mathematical Logic; Electronic Notes in Theoretical Computer Science; Information and Computation; Journal of Logic and Computation
  2008 (3): Annals of Pure and Applied Logic; Journal of Logic and Computation; Logical Methods in Computer Science
  2007 (5): ACM Transactions on Computational Logic; Journal of Logic and Computation; Studia Logica; Theory and Applications of Categories (2)
  2006 (5): Annals of Pure and Applied Logic; Logical Methods in Computer Science (2); Notre Dame Journal of Formal Logic; Studia Logica
  2003 (2): Logic Journal of the IGPL; Theoretical Computer Science
 Conferences: 69
  2018 (1): LFCS
  2017 (5): CSL, FSCD, LPAR (2), WoLLIC
  2016 (5): CONCUR, FoSSaCS, FSCD, LICS, WoLLIC
  2015 (3): LPAR, PSI, RTA
  2014 (4): AiML, CSL-LICS (3)
  2013 (6): CSL (2), FoSSaCS, LICS, LPAR, RTA
  2012 (4): AiML (2), CiE, JELIA
  2011 (3): CSL, Tableaux, TLCA
  2010 (7): AiML (2), CiE, LICS, LPAR (2), RTA
  2009 (5): Tableaux (2), TbiLLC, TLCA, WoLLIC
  2008 (3): AiML, LATA, LPAR
  2007 (1): RTA
  2006 (4): AiML, CiE, LPAR (2)
  2005 (5): AIA, LICS, Logica Universalis, TLCA, WoLLIC
  2004 (4): AiML, CSL (2), ISCIS
  2003 (4): CSL, FOL75, WoLLIC (2, one of which invited talk)
  2002 (3): ICLP, LPAR (2)
  2001 (2): CSL, LPAR
 Habilitation theses: 2
  2011 (1): Lutz Straßburger
  2010 (1): Kai Brünnler
 PhD theses: 11
  2018 (2): Fanny He, Benjamin Ralph
  2016 (1): Andrea Aler Tubella
  2013 (2): Nicolas Guenot, Anupam Das
  2011 (1): Linda Postniece
  2009 (1): Tom Gundersen
  2006 (2): Richard McKinley, Ozan Kahramanoğulları
  2003 (2): Kai Brünnler, Lutz Straßburger
 MSc theses: 4
  2007 (1): Benjamin Robert Horsfall
  2005 (1): Robert Hein
  2004 (1): Phiniki Stouppa
  2001 (1): Alwen Tiu
7 Events
Deep inference was one of the main attractions at the following international events:
Deep inference has been taught at
8 TeX Macros and BibTeX database
The Virginia Lake LaTeX macros help typeset deep-inference and open-deduction structures, derivations and atomic flows. Willem Heijltjes also provides macros for open deduction, which should be more intuitive than Virginia Lake.
The DedStraker TeX macros are now obsolete.
This BibTeX database of deep-inference publications is kept up to date.
5.4.2019 Alessio Guglielmi email