Alessio Guglielmi's Research and Teaching / Deep Inference

Deep Inference
Web-Based Quantum Bio-Cryptography and Creative Nano-Security for the Cloud

This page tries to be a comprehensive account of the ongoing research on deep inference. Deep inference started as a personal project, but it is now growing fast and I'm struggling to keep pace with all the developments. Some of them I do not understand in detail. Please keep in mind that what I wrote below can be very subjective and might not reflect the opinions of the people involved in this research. If you disagree or find inaccuracies, please let me know! If you are an expert in proof theory and want to quickly understand what deep inference is about, go to Deep Inference in One Minute.

Latest News

4 September 2015: On 15–16 December 2015 in Bath there will be a workshop on Efficient and Natural Proof Systems.

3 December 2014: There are two courses on deep inference at ESSLLI 2015.

19 December 2012: The three-year EPSRC project Efficient and Natural Proof Systems at the University of Bath has been approved.

Image credit: Theodor W. Hänsch (Nobel Prize)


  1. Introduction
    1. Proof Systems
    2. Proof-Theoretical Properties
    3. History
  2. Papers
    1. Classical and Intuitionistic Logic
    2. Proof Complexity
    3. Nested Sequents
    4. Modal Logic
    5. Linear Logic
    6. Commutative/Non-commutative Linear Logic
    7. Proof Nets, Semantics of Proofs and the War Against Bureaucracy
    8. Language Design
    9. Implementations
  3. Counterexamples
  4. Notes
  5. Mailing List
  6. Grants
  7. Beans
  8. Events
  9. TeX Macros and BibTeX database


[There is a version of this introduction with bibliographic references and examples in the paper Deep Inference.]

Deep inference could succinctly be described as an extreme form of linear logic. It is a methodology for designing proof formalisms that generalise Gentzen formalisms, i.e., the sequent calculus and natural deduction. In a sense, deep inference is obtained by applying some of the main concepts behind linear logic to the formalisms, i.e., to the rules by which proof systems are designed. By doing so, we obtain a better proof theory than the traditional one due to Gentzen. In fact, in deep inference we can provide proof systems for more logics, in a more regular and modular way, with smaller proofs, less syntax and less bureaucracy, and we have a chance to make substantial progress towards a solution to the century-old problem of the identity of proofs. The first manuscript on deep inference appeared in 1999 and the first refereed papers in 2001. So far, two formalisms have been designed and developed in deep inference: the calculus of structures and open deduction. A third one, nested sequents, introduces deep inference features into a more traditional Gentzen formalism.

Essentially, deep inference tries to understand proof composition and proof normalisation (in a very liberal sense, including cut elimination) in the most logic-agnostic way. Thanks to this we obtain a deeper understanding of the nature of normalisation. It seems that normalisation is a primitive, simple phenomenon that manifests itself in more or less complicated ways, depending more on the choice of representation for proofs than on their true mathematical nature. By dropping syntactic constraints, as we do in deep inference compared to Gentzen, we get closer to the semantic nature of proof and proof normalisation.

The early inspiration for deep inference came from linear logic. Linear logic, among other ideas, supports the notion that logic has a geometric nature, and that a more perspicuous analysis of proofs is possible if we uncover their geometric shape, hidden behind their syntax. We can give technical meaning to this notion by looking for linearity in proofs. In the computing world, linearity can be interpreted as a way to deal with quantity or resource. The significance of linear logic for computer science has stimulated a remarkable amount of research, which continues to this day and ranges from the most theoretical investigations in categorical semantics to the implementation of languages and compilers and the verification of software.

Linear logic expresses locality by relying on Gentzen's formalisms. However, these had been developed for classical mathematical logic, for which linearity is not a primitive, natural notion. While attempting to relate process algebras (which are foundational models of concurrent computation) to linear logic, I realised that Gentzen's formalisms were inherently inadequate to express the most primitive notion of composition in computer science: sequential composition. This is indeed linear, but of a different kind of linearity from that naturally supported by linear logic.

I realised then that the linear logic ideas were to be carried all the way through and that the formalisms themselves had to be 'linearised'. Technically, this turned out to be possible by dropping one of the assumptions that Gentzen implicitly used, namely that the (geometric) shape of proofs is directly related to the shape of the formulae that they prove. In deep inference we do not make this assumption, and we get proofs whose shape is much more liberally determined than in Gentzen's formalisms. As an immediate consequence, we were able to capture the sequential composition of process algebras, but we soon realised that the new formalism was offering unprecedented opportunities both for a more satisfying general theory of proofs and for more applications in computer science.

1.1 Proof Systems

The difference between Gentzen formalisms and deep inference ones is that in deep inference we compose proofs with the same connectives as formulae: if Φ is a proof with premiss A and conclusion B, and Ψ is a proof with premiss C and conclusion D, then

Φ ∧ Ψ, with premiss A ∧ C and conclusion B ∧ D,

and

Φ ∨ Ψ, with premiss A ∨ C and conclusion B ∨ D,

are both valid proofs. Significantly, while Φ ∧ Ψ can be represented in Gentzen formalisms, Φ ∨ Ψ cannot. That is basically the definition of deep inference, and it holds for every language, not just propositional classical logic.
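As a small illustration of this compositionality, here is a minimal sketch in Python (hypothetical code, not taken from this page or from any of the implementations mentioned below): a derivation is modelled only by its premiss and conclusion, and composing two derivations by a connective composes premisses and conclusions by that same connective.

    # A minimal, hypothetical sketch of proof composition in the deep inference style.
    class Derivation:
        def __init__(self, premiss, conclusion):
            self.premiss = premiss        # a formula, kept as a string for simplicity
            self.conclusion = conclusion  # a formula

        def compose(self, other, connective):
            # Composing two derivations by a connective composes their premisses
            # and their conclusions by the same connective.
            return Derivation(f"({self.premiss} {connective} {other.premiss})",
                              f"({self.conclusion} {connective} {other.conclusion})")

    phi = Derivation("A", "B")    # Φ proves B from A
    psi = Derivation("C", "D")    # Ψ proves D from C

    conj = phi.compose(psi, "∧")  # premiss (A ∧ C), conclusion (B ∧ D)
    disj = phi.compose(psi, "∨")  # premiss (A ∨ C), conclusion (B ∨ D):
                                  # the composition that Gentzen formalisms cannot express

    print(disj.premiss, "/", disj.conclusion)   # prints: (A ∨ C) / (B ∨ D)

Nothing here is specific to classical logic: the point is only that proof composition is defined uniformly, by the same operations that build formulae.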

As an example, I will show the standard deep inference system for propositional logic. System SKS is a proof system defined by the following structural inference rules, each written here as premiss / conclusion, where a and ā are dual atoms:

i↓ (identity):        t / (a ∨ ā)
w↓ (weakening):       f / a
c↓ (contraction):     (a ∨ a) / a

i↑ (cut):             (a ∧ ā) / f
w↑ (coweakening):     a / t
c↑ (cocontraction):   a / (a ∧ a)

and by the following two logical inference rules:

s (switch):   A ∧ (B ∨ C) / (A ∧ B) ∨ C
m (medial):   (A ∧ B) ∨ (C ∧ D) / (A ∨ C) ∧ (B ∨ D)

A cut-free derivation is a derivation where i↑ is not used, i.e., a derivation in SKS ∖ {i↑}. In addition to these rules, there is a rule

=:   C / D

such that C and D are opposite sides in one of the following equations:

A ∨ B = B ∨ A                A ∨ f = A
A ∧ B = B ∧ A                A ∧ t = A
(A ∨ B) ∨ C = A ∨ (B ∨ C)    t ∨ t = t
(A ∧ B) ∧ C = A ∧ (B ∧ C)    f ∧ f = f

We do not always show the instances of rule = , and when we do show them, we gather several contiguous instances into one.

For example, this is a valid derivation:

         (a ∨ b) ∧ a
    c↑ ----------------------------------
         (a ∨ b) ∧ (a ∧ a)
    c↑ ----------------------------------
         ((a ∧ a) ∨ b) ∧ (a ∧ a)
    c↑ ----------------------------------
         ((a ∧ a) ∨ (b ∧ b)) ∧ (a ∧ a)
    m  ----------------------------------
         ((a ∨ b) ∧ (a ∨ b)) ∧ (a ∧ a)
    =  ----------------------------------
         ((a ∨ b) ∧ a) ∧ ((a ∨ b) ∧ a)

This derivation illustrates a general principle in deep inference: structural rules on generic formulae (in this case a cocontraction) can be replaced by corresponding structural rules on atoms (in this case c↑ ).
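What makes derivations like the one above possible is that inference rules may be applied anywhere inside a formula, not only at its root. The following sketch (hypothetical Python, unrelated to the Maude and Tom implementations mentioned below) shows the idea for the switch rule, with formulae represented as nested tuples; replacing the premiss of a rule instance by its conclusion, at any depth, is exactly one deep inference step.

    # Hypothetical sketch: applying the switch rule at arbitrary depth.
    def switch_step(f):
        # Instances of switch, A ∧ (B ∨ C) / (A ∧ B) ∨ C, at the root of f only.
        if isinstance(f, tuple) and f[0] == "and":
            a, rest = f[1], f[2]
            if isinstance(rest, tuple) and rest[0] == "or":
                b, c = rest[1], rest[2]
                return [("or", ("and", a, b), c)]
        return []

    def deep_apply(rule, f):
        # Apply `rule` once, at any depth inside f: this is what 'deep' means.
        results = list(rule(f))
        if isinstance(f, tuple):
            op, left, right = f
            results += [(op, l, right) for l in deep_apply(rule, left)]
            results += [(op, left, r) for r in deep_apply(rule, right)]
        return results

    # One switch step applied deep inside the second conjunct of d ∧ (a ∧ (b ∨ c)),
    # yielding d ∧ ((a ∧ b) ∨ c):
    print(deep_apply(switch_step, ("and", "d", ("and", "a", ("or", "b", "c")))))
    # [('and', 'd', ('or', ('and', 'a', 'b'), 'c'))]

The price of this freedom is a larger space of applicable rule instances at each step, which is the non-determinism that some of the work on proof search mentioned below tries to control.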

It is interesting to note that the inference rules for classical logic, as well as the great majority of rules for any logic, all derive from a common template which has been distilled from the semantics of a purely linear logic in the first deep inference paper. Since this phenomenon is very surprising, especially for structural rules such as weakening and contraction, we believe that we might be dealing with a rather deep aspect of logic and we are currently investigating it.

1.2 Proof-Theoretical Properties

Locality and linearity are foundational concepts for deep inference, in the same spirit as they are for linear logic. Going for locality and linearity basically means going for complexity bounded by a constant. This last idea introduces geometry into the picture, because bounded complexity leads us to equivalence modulo continuous deformation. In a few words, the simple and natural definition of deep inference that we have seen above captures these ideas about linearity, locality and geometry, and can consequently be exploited in many ways.
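To see what 'complexity bounded by a constant' means, compare a generic contraction rule with its atomic version (a standard contrast, recalled here for illustration; only the atomic rule belongs to SKS):

    generic contraction:   (C ∨ C) / C
    atomic contraction:    (a ∨ a) / a     (the rule c↓ of SKS)

Reading bottom-up, the generic rule duplicates a formula C of arbitrary size, whereas the atomic rule only ever duplicates a single atom, so every one of its instances has constant size. In SKS, generic contraction is derivable from atomic contraction together with medial and the = rule, in the same way in which the derivation in the previous section replaces a generic cocontraction by instances of c↑.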

One of the open questions is whether deep inference might have a positive influence on the proof-search-as-computation paradigm and possibly on focusing. This subject has been so far almost unexplored, but some preliminary work looks very promising.

The core topic of every proof-theoretic investigation, namely normalisation, deserves a special mention. Traditionally, normalisation is at the core of proof theory, and this is of course also the case for deep inference. Normalisation in deep inference is not much different, in principle, from normalisation in Gentzen theory. In practice, however, the more liberal proof composition mechanism of deep inference completely invalidates the techniques (and the intuition) behind cut elimination procedures in Gentzen systems. Much of the effort of these 15 years of research on deep inference went into recovering a normalisation theory. One of the main ideas is called splitting, and at present it is the most general method we know for eliminating cuts in deep inference.

On the other hand, we now have techniques that are not as widely applicable but that are of a completely different nature from splitting, which is combinatorial. A surprising, relatively recent result consists in exploiting deep inference's locality to obtain the first purely geometric normalisation procedure, by means of a topological device that we call atomic flows. This means that, at least for classical logic and logics that extend it, cut elimination can be understood as a process that is completely independent of logical information: only the shape of the proof, determined by its structural information (the creation, duplication and erasure of atoms), matters. Logical information, such as the connectives in formulae, does not matter. This hints at a deeper nature of normalisation than we thought so far. It seems that normalisation is a primitive, simple phenomenon that manifests itself in more or less complicated ways, depending more on the choice of representation for proofs than on their true mathematical nature.


1.3 History

Deep inference comes from linear logic and process algebras; more specifically, it comes from seeing proofs as concurrent processes. The first development was the definition of the calculus of structures and a cut elimination proof for the logic BV, which was studied as the logical counterpart of the core of the process algebra CCS. We realised that the techniques developed for BV had a much wider applicability, so we broadly developed the calculus of structures and studied its many novel normalisation properties. The initial phase of development took place in Dresden, from 1999 to 2003; now deep inference is developed in several laboratories around the world. The recent results on modal and intuitionistic logics, proof nets and semantics, and implementations complete the establishment of deep inference as a solid and comprehensive methodology in proof theory.


2 Papers

The following material is broad in scope; if you are new to deep inference and the calculus of structures, start here:


In the rest of the section, all the papers I know of are listed according to their subject, in no particular order.

2.1 Classical and Intuitionistic Logic

So far, for classical logic in the calculus of structures we achieved:

We can present intuitionistic logic in the calculus of structures with a fully local, cut-free system. The logic of bunched implications BI can be presented in the calculus of structures. Japaridze's cirquent calculus benefits from a deep-inference presentation, in particular in the case of propositional logic.

The following papers exist, in addition to Deep Inference and Its Normal Form of Derivations and Deep Inference and Symmetry in Classical Proofs, mentioned above:


2.2 Proof Complexity

The basic proof complexity properties of propositional logic in the calculus of structures are known. Deep inference is as powerful as Frege systems and, when restricted to analytic systems, more powerful than Gentzen systems.

The following papers exist:


2.3 Nested Sequents

A new formalism called 'nested sequents' has been defined, which is especially suitable for modal logics.

The following papers exist, in addition to Nested Sequents, mentioned above:


2.4 Modal Logic

We can present systematically several normal propositional modal logics, including S5, B and K5, for which cut elimination is proved. We also investigated geometric theories, some of which we expressed in the calculus of structures.

The following papers exist:


2.5 Linear Logic

Linear logic enjoys presentations in deep inference that obtain the expected properties of locality, with rather astounding decomposition theorems and the usual, general normalisation results.

The following papers exist, in addition to Linear Logic and Noncommutativity in the Calculus of Structures, mentioned above:


2.6 Commutative/Non-commutative Linear Logic

We conservatively extend mixed multiplicative and multiplicative exponential linear logic with a self-dual non-commutative operator. The systems so obtained cannot be presented in the sequent calculus, but they enjoy the usual properties of locality, decomposition and cut elimination available in the calculus of structures. We can present Yetter's cyclic linear logic in the calculus of structures and prove cut elimination; interestingly, cyclicity is naturally subsumed by deep inference. New, purely proof-theoretical, techniques are developed for reducing the non-determinism in the calculus of structures.

The following papers exist, in addition to Linear Logic and Noncommutativity in the Calculus of Structures, mentioned above:


2.7 Proof Nets, Semantics of Proofs and the War Against Bureaucracy

Deep inference and the calculus of structures are influencing the design of a new generation of proof nets. Moreover, they offer new insight for semantics of proofs and categorical proof theory. Finally, they open decisive new perspectives in the fight against bureaucracy.

The following papers exist, in addition to A General View of Normalisation Through Atomic Flows and Categorical Models of First Order Classical Proofs, mentioned above:


2.8 Language Design

Thanks to a self-dual non-commutative extension of linear logic one gets the first purely logical account of sequentiality in proof search. The new logical operators make possible a new approach to partial order planning and its relation to concurrency.

The following papers exist, in addition to Nondeterminism and Language Design in Deep Inference, mentioned above:



2.9 Implementations

Ozan Kahramanoğulları, Pierre-Etienne Moreau and Antoine Reilles are implementing calculus-of-structures proof systems in Maude and in Tom. Ozan managed to achieve efficiency without sacrificing proof-theoretic cleanliness, and he is obtaining results of independent theoretical interest. There are two slide presentations:

Max Schäfer has built a graphical proof editor in Java, called GraPE, for the Maude modules written by Ozan Kahramanoğulları; this means that one can interactively build and find proofs in several deep-inference systems.

The following papers exist, in addition to Nondeterminism and Language Design in Deep Inference, mentioned above:



3 Counterexamples

Finding counterexamples is very important to us. Counterexamples are typically of a very combinatorial nature, due to the new combinatorial possibilities offered by deep inference.

  1. Classical Logic
    1. Atomic contraction is not achievable in the sequent calculus
    2. Contraction cannot be pushed to the root of sequent calculus proofs
    3. Cocontraction does not replace contraction
    4. Binary tautologies and atomic contraction
  2. Modal Logic
    1. 4 rules cannot be interchanged for K4
  3. BV
    1. BV cannot be expressed in the sequent calculus
    2. Comerge does not permute over merge
    3. Interaction can only be pushed up in the presence of coseq


5 Mailing List

There is a very friendly mailing list devoted to deep inference, proof nets, structads, the calculus of structures and other amphibians of structural proof theory, named Frogs. Join it!


6 Grants

These are the current and recent grants I am aware of, regarding deep inference:

I did not indicate individual travel grants.



8 Events

Deep inference was one of the main attractions at the following international events:

Deep inference has been taught at:

9 TeX Macros and BibTeX database

The Virginia Lake LaTeX macros help typeset deep inference structures, derivations and atomic flows.

The DedStraker TeX macros are now obsolete.

This BibTeX database of deep-inference publications is kept updated.

4.9.2015 Alessio Guglielmi (email)