Alessio Guglielmi's Research and Teaching / Deep Inference

Deep Inference
Cloud-Based Quantum Bio-Cryptography and Creative Nano-Security for the Blockchain of Queer Unicorns

This page tries to be a comprehensive account of the ongoing research on deep inference. Deep inference started as a personal project, but it is now growing fast and I'm struggling to keep pace with all the developments. Some of them I do not understand in detail. Please keep in mind that what I wrote below can be very subjective and not reflect the opinions of the people involved in this research. If you disagree or find inaccuracies, please let me know! If you are an expert in proof theory and want to quickly understand what deep inference is about, go to Deep Inference in One Minute.

News

2020–2024 (–2027) Four-year (extensible to seven-year) UKRI Future Leaders Fellowship Structure vs. Invariants in Proofs (StrIP) awarded to Anupam Das.

5–9 August 2019 An Introduction to Deep Inference at ESSLLI 2019.

29–30 June 2019 5th International Workshop on Structures and Deduction in Dortmund.

2018 Andrea Aler Tubella's PhD thesis was shortlisted in the top four for the E.W. Beth Dissertation Prize 2018.

7 July 2018 Workshop on Twenty Years of Deep Inference in Oxford.

2018–2021 Three-year EPSRC project Typed Lambda-Calculi with Sharing and Unsharing at the University of Bath.

8–9 September 2017 4th International Workshop on Structures and Deduction in Oxford.

17–21 July 2017 Efficient Proof Systems for Modal Logics at ESSLLI 2017.

8–10 June 2017 Second FISP Meeting in Paris.

15–17 November 2016 Workshop on The Fine Structure of Formal Proof Systems and Their Computational Interpretations in Innsbruck.

2016–2019 Three-year ANR-FWF project The Fine Structure of Formal Proof Systems and their Computational Interpretations (FISP) at Univ. Paris Diderot/PPS, INRIA Saclay/École Polytechnique, University of Innsbruck, Vienna University of Technology.

14–16 December 2015 Workshop on Efficient and Natural Proof Systems in Bath.

3–14 August 2015 Two courses on deep inference at ESSLLI 2015.

2013–2016 Three-year EPSRC project Efficient and Natural Proof Systems at the University of Bath.


Image credit: Theodor W. Hänsch (Nobel Prize)

Contents

  1. Introduction
    1. Proof Systems
    2. Proof-Theoretical Properties
    3. History
  2. Papers
    1. Classical and Intuitionistic Logic
    2. Proof Complexity
    3. Nested Sequents
    4. Modal Logic
    5. Linear Logic
    6. Commutative/Non-commutative and Graph Logics
    7. Proof Nets, Semantics of Proofs and the War on Bureaucracy
    8. Language Design
    9. Implementations
  3. Notes
  4. Mailing List
  5. Grants
  6. Beans
  7. Events
  8. TeX Macros and BibTeX database

1 Introduction

[There is a version of this introduction with bibliographic references and examples in the paper Deep Inference.]

Deep inference could succinctly be described as an extreme form of linear logic. It is a methodology for designing proof formalisms that generalise Gentzen formalisms, i.e., the sequent calculus and natural deduction. In a sense, deep inference is obtained by applying some of the main concepts behind linear logic to the formalisms themselves, i.e., to the rules by which proof systems are designed. By doing so, we obtain a better proof theory than the traditional one due to Gentzen. In fact, in deep inference we can provide proof systems for more logics, in a more regular and modular way, with smaller proofs, less syntax and less bureaucracy, and we have a chance to make substantial progress towards a solution to the century-old problem of the identity of proofs. The first manuscript on deep inference appeared in 1999 and the first refereed papers in 2001. So far, two formalisms have been designed and developed in deep inference: the calculus of structures and open deduction. A third one, nested sequents, introduces deep inference features into a more traditional Gentzen formalism.

Essentially, deep inference tries to understand proof composition and proof normalisation (in a very liberal sense including cut elimination) in the most logic-agnostic way. Thanks to it we obtain a deeper understanding of the nature of normalisation. It seems that normalisation is a primitive, simple phenomenon that manifests itself in more or less complicated ways that depend on the choice of representation for proofs rather than their true mathematical nature. By dropping syntactic constraints, as we do in deep inference compared to Gentzen, we get closer to the semantic nature of proof and proof normalisation.

The early inspiration for deep inference came from linear logic. Linear logic, among other ideas, supports the notion that logic has a geometric nature, and that a more perspicuous analysis of proofs is possible if we uncover their geometric shape, hidden behind their syntax. We can give technical meaning to this notion by looking for linearity in proofs. In the computing world, linearity can be interpreted as a way to deal with quantity or resource. The significance of linear logic for computer science has stimulated a remarkable amount of research, which continues to this day and ranges from the most theoretical investigations in categorical semantics to the implementation of languages and compilers and the verification of software.

Linear logic expresses locality by relying on Gentzen's formalisms. However, these had been developed for classical mathematical logic, for which linearity is not a primitive, natural notion. While attempting to relate process algebras (which are foundational models of concurrent computation) to linear logic, I realised that Gentzen's formalisms were inherently inadequate to express the most primitive notion of composition in computer science: sequential composition. Sequential composition is indeed linear, but its linearity is of a different kind from that naturally supported by linear logic.

I then realised that the ideas of linear logic had to be carried all the way through and that the formalisms themselves had to be 'linearised'. Technically, this turned out to be possible by dropping one of the assumptions that Gentzen implicitly used, namely that the (geometric) shape of proofs is directly related to the shape of the formulae that they prove. In deep inference we do not make this assumption, and we get proofs whose shape is much more liberally determined than in Gentzen's formalisms. As an immediate consequence, we were able to capture the sequential composition of process algebras, but we soon realised that the new formalism offered unprecedented opportunities both for a more satisfying general theory of proofs and for more applications in computer science.

1.1 Proof Systems

The difference between Gentzen formalisms and deep-inference ones is that in deep inference we compose proofs by the same connectives as formulae: if

\(\Phi=\array{A\\ \Vert\\ B\\}\) and \(\Psi=\array{C\\ \Vert\\ D\\}\)

are two proofs with, respectively, premisses \(A\) and \(C\), and conclusions \(B\) and \(D\), then

\(\Phi\land\Psi=\array{A\land C\\ \Vert\\ B\land D\\}\) and \(\Phi\lor\Psi=\array{A\lor C\\ \Vert\\ B\lor D\\}\)

are valid proofs with, respectively, premisses \(A\land C\) and \(A\lor C\), and conclusions \(B\land D\) and \(B\lor D\). Significantly, while \(\Phi\land\Psi\) can be represented in Gentzen, \(\Phi\lor\Psi\) cannot. That is basically the definition of deep inference and it holds for every language – not just for propositional classical logic.
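To make this composition principle concrete, here is a minimal sketch in Python of derivations that carry a premise and a conclusion and that are composed horizontally by the connectives. All names are illustrative assumptions of mine; this is not one of the implementations discussed in Section 2.9.

from dataclasses import dataclass

@dataclass
class Formula:
    op: str            # 'and', 'or', 'atom', 't' or 'f'
    args: tuple = ()   # subformulae for 'and'/'or', an atom name for 'atom'

@dataclass
class Derivation:
    premise: Formula       # the formula A at the top of the derivation
    conclusion: Formula    # the formula B at the bottom

def compose_and(phi, psi):
    # From derivations A..B and C..D build a derivation (A and C)..(B and D).
    return Derivation(Formula('and', (phi.premise, psi.premise)),
                      Formula('and', (phi.conclusion, psi.conclusion)))

def compose_or(phi, psi):
    # From A..B and C..D build (A or C)..(B or D): this is the composition
    # that has no counterpart in Gentzen formalisms.
    return Derivation(Formula('or', (phi.premise, psi.premise)),
                      Formula('or', (phi.conclusion, psi.conclusion)))

Note that horizontal composition only manipulates the endpoints of the two derivations, which is why it is available for every connective of the language.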

As an example, I will show the standard deep inference system for propositional logic. System \(\SKS\) is a proof system defined by the following structural inference rules (where \(a\) and \(\bar a\) are dual atoms):

\(\inf\id{\t}{a\lor\bar a}\) \(\inf\wd{\f}{a}\) \(\inf\cd{a\lor a}{a}\)
identity weakening contraction

\(\inf\iu{a\land\bar a}{\f}\) \(\inf\wu{a}{\t}\) \(\inf\cu{a}{a\land a}\)
cut coweakening cocontraction

and by the following two logical inference rules:

\(\inf\s{A\land(B\lor C)}{(A\land B)\lor C}\) \(\inf\m{(A\land B)\lor(C\land D)}{(A\lor C)\land(B\lor D)}\)  .
switch medial

A cut-free derivation is a derivation where \(\iu\) is not used, i.e., a derivation in \(\SKS\setminus\{\iu\}\). In addition to these rules, there is a rule \[ \inf{=}{C}{D} \quad, \] such that \(C\) and \(D\) are opposite sides in one of the following equations: \[ \begin{array}{rcl} A\lor B&=&B\lor A\\ A\land B&=&B\land A\\ (A\lor B)\lor C&=&A\lor(B\lor C)\\ (A\land B)\land C&=&A\land(B\land C)\\ \end{array} \qquad \begin{array}{rcl} A\lor\f&=&A\\ A\land\t&=&A\\ \t\lor\t&=&\t\\ \f\land\f&=&\f\\ \end{array} \quad. \] We do not always show the instances of rule \(=\), and when we do show them, we gather several contiguous instances into one.

For example, this is a valid derivation: \[ \array{(a\lor b)\land a\\\Vert\\((a\lor b)\land a)\land((a\lor b)\land a)} \equiv \vcenter{\fbox{\(\inf\m{\fbox{\(\inf\cu{a}{a\land a}\)} \lor \fbox{\(\inf\cu{b}{b\land b}\)}} {(a\lor b)\land(a\lor b)} \)}} \land \vcenter{\fbox{\(\inf\cu{a}{a\land a}\)}} \quad. \] This derivation illustrates a general principle in deep inference: structural rules on generic formulae (in this case a cocontraction) can be replaced by corresponding structural rules on atoms (in this case \(\cu\)).
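The medial rule serves the same purpose for contraction. As an illustration in the same notation, a contraction on the formula \(a\land b\), i.e., a derivation from \((a\land b)\lor(a\land b)\) to \(a\land b\), can be obtained from one medial and two atomic contractions: \[ \array{(a\land b)\lor(a\land b)\\\Vert\\a\land b} \equiv \vcenter{\fbox{\(\inf\m{(a\land b)\lor(a\land b)} {\fbox{\(\inf\cd{a\lor a}{a}\)} \land \fbox{\(\inf\cd{b\lor b}{b}\)}} \)}} \quad. \]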

It is interesting to note that the inference rules for classical logic, as well as the great majority of rules for any logic, all derive from a common template which has been distilled from the semantics of a purely linear logic in the first deep inference paper. Since this phenomenon is very surprising, especially for structural rules such as weakening and contraction, we believe that we might be dealing with a rather deep aspect of logic and we are currently investigating it.
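Another way of reading the definition of deep inference is as rewriting: every inference rule, such as switch, can be applied to any subformula, at any depth, inside any context. The following toy Python sketch conveys the idea; the representation and the function names are illustrative assumptions and do not come from any of the implementations mentioned in Section 2.9.

# Formulae are nested tuples like ('and', A, B) or ('or', A, B), or atom names.
# The switch rule  A and (B or C)  ->  (A and B) or C  may be applied at any depth.

def switch(f):
    # Return the result of one switch step at the root of f, or None if it does not apply.
    if isinstance(f, tuple) and f[0] == 'and' and \
       isinstance(f[2], tuple) and f[2][0] == 'or':
        a, (_, b, c) = f[1], f[2]
        return ('or', ('and', a, b), c)
    return None

def apply_anywhere(rule, f):
    # Yield every formula obtained by applying 'rule' to some subformula of f.
    r = rule(f)
    if r is not None:
        yield r
    if isinstance(f, tuple):
        op, left, right = f
        for l in apply_anywhere(rule, left):
            yield (op, l, right)
        for r in apply_anywhere(rule, right):
            yield (op, left, r)

# Example: the switch redex sits below a disjunction.
example = ('or', ('and', 'a', ('or', 'b', 'c')), 'd')
print(list(apply_anywhere(switch, example)))
# [('or', ('or', ('and', 'a', 'b'), 'c'), 'd')]

Nothing in apply_anywhere depends on the particular rule or logic, which is one way of seeing why the same recipe works for every language.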

1.2 Proof-Theoretical Properties

Locality and linearity are foundational concepts for deep inference, in the same spirit as they are for linear logic. Going for locality and linearity basically means going for complexity bounded by a constant: for example, contracting an atom takes constant space, whereas contracting a generic formula duplicates an arbitrarily large object. This last idea introduces geometry into the picture, because bounded complexity leads us to equivalence modulo continuous deformation. In a few words, the simple and natural definition of deep inference that we have seen above captures these ideas about linearity, locality and geometry, and it can consequently be exploited in many ways.

One of the open questions is whether deep inference might have a positive influence on the proof-search-as-computation paradigm and possibly on focusing. So far, this subject has been almost unexplored, but some preliminary work looks very promising.

The core topic of every proof-theoretic investigation, namely normalisation, deserves a special mention. Traditionally, normalisation is at the core of proof theory, and this is, of course, the same for deep inference. Normalisation in deep inference is not much different, in principle, from normalisation in Gentzen theory. In practice, however, the more liberal proof composition mechanism of deep inference completely invalidates the techniques (and the intuition) behind cut elimination procedures in Gentzen systems. Much effort went into recovering a normalisation theory. One of the main ideas is called splitting and, at present, it is the most general method we know for eliminating cuts in deep inference.

On the other hand, we now have techniques that are not as widely applicable but that are of a completely different nature from splitting, which is combinatorial. A surprising, relatively recent result consists in exploiting deep inference's locality to obtain the first purely geometric normalisation procedure, by means of a topological device that we call atomic flows. This means that, at least for classical logic and logics that extend it, cut elimination can be understood as a process that is completely independent of logical information: only the shape of the proof, determined by its structural information (the creation, duplication and erasing of atoms), matters. Logical information, such as the connectives in formulae, does not matter. This hints at a deeper nature of normalisation than we previously thought: normalisation appears to be a primitive, simple phenomenon that manifests itself in more or less complicated ways, depending on the choice of representation for proofs rather than on their true mathematical nature.
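As a rough indication of how little information atomic flows retain, here is a heavily simplified sketch, in Python, of the kind of data they record: a graph whose vertices are labelled by the structural rules of \(\SKS\) and whose edges trace atom occurrences downwards through a derivation. The names and the representation are illustrative assumptions, not the actual definition.

# In/out degrees of the vertex kinds of an atomic-flow-like graph,
# corresponding to the structural rules of SKS; logical rules leave no trace.
ARITY = {
    'identity':      (0, 2),   # creates two dual atom occurrences
    'cut':           (2, 0),   # destroys two dual atom occurrences
    'weakening':     (0, 1),   # creates one atom occurrence
    'coweakening':   (1, 0),   # erases one atom occurrence
    'contraction':   (2, 1),   # merges two occurrences of the same atom
    'cocontraction': (1, 2),   # duplicates an occurrence of an atom
}

def well_formed(vertices, edges):
    # vertices: {name: kind}; edges: set of (source_vertex, target_vertex).
    # Check that every vertex has exactly the in/out degree of its kind.
    for v, kind in vertices.items():
        ins = sum(1 for (_, t) in edges if t == v)
        outs = sum(1 for (s, _) in edges if s == v)
        if (ins, outs) != ARITY[kind]:
            return False
    return True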

1.3 History

Deep inference comes from linear logic and process algebras; more specifically, it comes from seeing proofs as concurrent processes. The first development was the definition of the calculus of structures and a cut elimination proof for the logic BV, which we studied as the logical counterpart of the core of the process algebra CCS. We realised that the techniques developed for BV had a much wider applicability, so we broadly developed the calculus of structures and studied its many novel normalisation properties. The initial phase of development took place in Dresden, from 1999 to 2003; deep inference is now developed in several laboratories around the world. The recent results on modal and intuitionistic logics, proof nets and semantics, and implementations complete the establishment of deep inference as a solid and comprehensive methodology in proof theory.

2 Papers

The following material is broad in scope; if you are new to deep inference and the calculus of structures, start here:

In the rest of the section, all the papers I know of are listed according to their subject, in no particular order.

2.1 Classical and Intuitionistic Logic

So far, for classical logic in the calculus of structures we achieved:

We can present intuitionistic logic in the calculus of structures with a fully local, cut-free system. The logic of bunched implications BI can be presented in the calculus of structures. Japaridze's cirquent calculus benefits from a deep-inference presentation, in particular in the case of propositional logic.

The following papers exist, in addition to Deep Inference and Its Normal Form of Derivations, Deep Inference and Symmetry in Classical Proofs and Towards a Theory of Proofs of Classical Logic, mentioned above:

2.2 Proof Complexity

The basic proof complexity properties of propositional logic in the calculus of structures are known. Deep inference is as powerful as Frege systems and, in the restriction to analytic systems, it is more powerful than Gentzen systems.

The following papers exist, in addition to The Complexity of Propositional Proofs in Deep Inference:

2.3 Nested Sequents

A new formalism called 'nested sequents' has been defined, which is especially suitable for modal logics.

The following papers exist, in addition to Nested Sequents and Nested Deduction in Logical Foundations for Computation, mentioned above:

2.4 Modal Logic

We can present systematically several normal propositional modal logics, including S5, B and K5, for which cut elimination is proved. We also investigated geometric theories, some of which we expressed in the calculus of structures.

The following papers exist:

2.5 Linear Logic

Linear logic enjoys presentations in deep inference that obtain the expected properties of locality, with rather astounding decomposition theorems and the usual, general normalisation results.

The following papers exist, in addition to Linear Logic and Noncommutativity in the Calculus of Structures, mentioned above:

2.6 Commutative/Non-commutative and Graph Logics

We conservatively extend mixed multiplicative and multiplicative exponential linear logic with a self-dual non-commutative operator. The systems so obtained cannot be presented in the sequent calculus, but they enjoy the usual properties of locality, decomposition and cut elimination available in the calculus of structures. We can present Yetter's cyclic linear logic in the calculus of structures and prove cut elimination; interestingly, cyclicity is naturally subsumed by deep inference. We go further and design proof systems that operate on generic graphs rather than just formulae.

The following papers exist, in addition to Linear Logic and Noncommutativity in the Calculus of Structures, mentioned above:

2.7 Proof Nets, Semantics of Proofs and the War on Bureaucracy

Deep inference and the calculus of structures are influencing the design of a new generation of proof nets. Moreover, they offer new insights into the semantics of proofs and categorical proof theory. Finally, they open decisive new perspectives in the fight against bureaucracy.

The following papers exist, in addition to A General View of Normalisation Through Atomic Flows, Categorical Models of First Order Classical Proofs and Towards a Theory of Proofs of Classical Logic, mentioned above:

2.8 Language Design

Thanks to a self-dual non-commutative extension of linear logic, one gets the first purely logical account of sequentiality in proof search. The new logical operators make possible a new approach to partial-order planning and its relation to concurrency.

The following papers exist, in addition to Nondeterminism and Language Design in Deep Inference, mentioned above:

2.9 Implementations

Ozan Kahramanoğulları, Pierre-Etienne Moreau and Antoine Reilles implemented calculus-of-structures proof systems in Maude (MLL and MELL) and in Tom. Ozan managed to achieve efficiency without sacrificing proof-theoretic cleanliness, and he is obtaining results of independent theoretical interest. There are two slide presentations:

Max Schäfer has built a graphical proof editor in Java, called GraPE, for the Maude modules written by Ozan Kahramanoğulları; this means that one can interactively build and find proofs in several deep-inference systems.

The following papers exist, in addition to Nondeterminism and Language Design in Deep Inference, mentioned above:

3 Notes

4 Mailing List

There is a very friendly mailing list devoted to deep inference, proof nets, structads, the calculus of structures and other amphibians of structural proof theory, named Frogs. Join it!

5 Grants

These are the current and recent grants I am aware of, regarding deep inference:

I did not indicate individual travel grants.

6 Beans

7 Events

Deep inference was one of the main attractions at the following international events:

Deep inference has been taught at

8 TeX Macros and BibTeX database

The Virginia Lake LaTeX macros help with typesetting deep-inference and open-deduction structures, derivations and atomic flows. Willem Heijltjes also provides macros for open deduction, which should be more intuitive than Virginia Lake.

The DedStraker TeX macros are now obsolete.

This BibTeX database of deep-inference publications is kept updated.

1.11.2022 Alessio Guglielmi