Address: Department of Computer Science, University of Bath
I do proof theory from a theoretical computer science perspective. I am mainly interested in deep inference, a new methodology in proof theory that my collaborators and I have introduced.
Theoretical computer science mainly helps proof theory by bringing to it quantitative and semantic ideas, and proof theory mainly helps computer science by supporting the design of programming languages and the verification of software correctness.
The heart of what I do in proof theory is normalisation. This is still a mysterious combinatorial phenomenon, intimately connected to fundamental questions about how much we can compress information. Proof-theoretic research takes place in the purest setting where these problems can be defined and, as such, it is a fundamental study that has had, and will continue to have, far-reaching applications in computer science.
Two problems motivated the development of this research: understanding normalisation and the identity of proofs.
One research objective is to understand normalisation in propositional logics (first order and, at a later time, second order). Deep inference reveals finer and finer structure in inference rules. With the methods developed for deep inference, we can see that cut elimination is only a special instance of a much broader normalisation phenomenon. Our techniques unveil properties of derivations that cannot be observed in the sequent calculus. One of the most interesting discoveries is that deep-inference rules exhibit an unprecedented regularity in shape across very different logics. In what I call `subatomic proof theory´, it seems possible to describe most known logics with just one simple inference rule.
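To give a flavour of this finer structure, here is a small sketch in the notation of system SKS, the standard deep-inference presentation of classical logic. It shows how the general identity rule reduces to its atomic form; the key point is that the switch rule can be applied arbitrarily deep inside a context, something the sequent calculus does not allow. This is only an illustrative fragment, not a full presentation of the system.

```latex
% Sketch in the style of system SKS (classical logic in deep inference).
% Switch:    s           : A \wedge (B \vee C) \to (A \wedge B) \vee C
% Identity:  i\downarrow : t \to A \vee \bar{A}
%
% Reducing identity on a compound formula B \wedge C to identities on
% its subformulae, by applying switch deep inside a context:
\[
\begin{array}{rll}
  \mathsf{t}
    & = \; \mathsf{t} \wedge \mathsf{t} & \\[2pt]
    & \to \; (B \vee \bar{B}) \wedge (C \vee \bar{C})
    & \text{(identities on the smaller formulae $B$ and $C$)} \\[2pt]
    & \to \; \bigl((B \vee \bar{B}) \wedge C\bigr) \vee \bar{C}
    & \text{(switch)} \\[2pt]
    & \to \; \bigl((B \wedge C) \vee \bar{B}\bigr) \vee \bar{C}
    & \text{(switch, applied inside the left disjunct)} \\[2pt]
    & = \; (B \wedge C) \vee (\bar{B} \vee \bar{C})
    & \text{(associativity)}
\end{array}
\]
```

Iterating this reduction, identity only ever needs to be applied to atoms, which is one instance of the broader normalisation phenomenon mentioned above.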
I am also interested in proof complexity: since deep inference generalises the sequent calculus, proofs in deep inference can only be shorter than in the sequent calculus. Indeed, on some classes of tautologies, such as Statman's, there is an exponential speed-up of an analytic deep-inference system over all analytic sequent-calculus ones. We have only recently started to study the proof complexity of deep inference. Deep inference is a powerful methodology in this respect, on a par with Frege-Hilbert systems (but with better properties). We wonder whether deep inference could give good insights into the many open questions in proof complexity.
We are obtaining a broad theory of normalisation and, more generally, of the design of deductive systems, based on simpler and more basic principles than traditional proof theory offers.
Another research objective is to come up with useful, interesting notions of identity of proofs. It is embarrassing for proof theory that the natural question `when are two proofs to be considered identical?´ lacks good answers. (This question is tightly connected to the equally poorly understood problem of deciding when two algorithms are the same.) One reason for our embarrassment is that the available notion of normalisation, viz. cut elimination, is inadequate to decide the question; I believe the problem is that this is the only (reasonable) notion of normalisation available in the sequent calculus and natural deduction. Thanks to its finer granularity, deep inference offers several natural new notions of normalisation, thereby improving our ability to study new concepts of identity.
The calculus of structures is now a fully developed deep-inference formalism, but I do not believe it is the natural setting in which to study the proof-identity problem: there is still too much bureaucracy in it. I believe that deductive nets will be the right setting. Proof nets do, in fact, eliminate much of the bureaucracy of calculi like the sequent calculus or the calculus of structures. But which proof nets? Linear-logic proof nets are certainly inadequate, because of the poor way they deal with `sharing´, and because they are not deductive, in the sense that they do not form a proof system. We need a much more refined notion, suitable also for classical logic.
We observed that deep inference yields some interesting notions towards deductive nets. In fact, deep inference allows us to design deductive systems whose rules are all local, i.e., their computational complexity is constant (this is provably impossible in traditional proof theory). This property lends itself naturally to a geometric understanding of proofs. In other words, we can now see a geometric structure lurking behind the syntax, and we can control this structure thanks to our local inference rules. We are already able to use part of this geometric structure to control normalisation, by what we call atomic flows. I think that in a few years we will be able to exploit the full potential of the geometry behind proofs in order to have a satisfactory, topological answer to the problem of the identity of proofs.
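As an illustration of locality, again sketched in SKS notation rather than in the full generality of the cited papers: the generic contraction rule duplicates an arbitrarily large formula, so checking one of its instances costs time proportional to the size of that formula, whereas its atomic counterpart only ever duplicates an atom and therefore has constant cost.

```latex
% Generic contraction duplicates a whole formula A (non-local):
\[
  c{\downarrow} \;:\; \frac{A \vee A}{A}
\]
% Its atomic counterpart duplicates only an atom a, so every instance
% has bounded size; together with the medial rule,
\[
  ac{\downarrow} \;:\; \frac{a \vee a}{a}
  \qquad\qquad
  m \;:\; \frac{(A \wedge B) \vee (C \wedge D)}{(A \vee C) \wedge (B \vee D)},
\]
% it derives generic contraction, which is how deep inference obtains
% systems whose rules are all local.
```

It is this confinement of duplication and creation to atoms that the atomic flows mentioned above trace geometrically.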
Deep inference reveals regularities that traditional formalisms cannot detect, and it offers new proof-theoretic properties of special interest for normalisation. My colleagues and I are exploiting these regularities and properties in a unified theory of several logics. This will yield deductive nets, for which the problem of identity of proofs will be solved in a satisfactory way.
I have three main reasons to work in this area: 1) it is beautiful, in that this research involves working with elegant mathematical objects; 2) it is interesting, in that it addresses important, challenging problems in proof theory; 3) it is useful for designing computer-science languages, and indeed my research started from questions raised by computer languages for concurrency.
The following web pages contain more details about my current research:
Perhaps you might be interested in my political opinions.
A Logical Basis for Quantum Evolution and Entanglement
Richard F. Blute, Alessio Guglielmi, Ivan T. Ivanov, Prakash Panangaden and Lutz Straßburger
Abstract
Pdf, 18 July 2013
In Categories and Types in Logic, Language, and Physics, LNCS 8222, pp. 90–107
A Quasipolynomial Normalisation Procedure in Deep Inference
Paola Bruscoli, Alessio Guglielmi, Tom Gundersen and Michel Parigot
Abstract
A System of Interaction and Structure V: The Exponentials and Splitting
Alessio Guglielmi and Lutz Straßburger
Abstract
Pdf, 21 September 2010
Mathematical Structures in Computer Science, 21 (3) 2011, pp. 563–584
This paper is the second of two parts resulting from splitting the paper A System of Interaction and Structure IV: The Exponentials (previously titled A Non-commutative Extension of Multiplicative Exponential Linear Logic, itself the journal version of A Non-commutative Extension of MELL).
A System of Interaction and Structure IV: The Exponentials and Decomposition
Lutz Straßburger and Alessio Guglielmi
Abstract
Pdf, 27 July 2010
ACM Transactions on Computational Logic 12 (4:23) 2011, pp. 1–39
This paper is the first of two parts resulting from splitting the paper A System of Interaction and Structure IV: The Exponentials (previously titled A Non-commutative Extension of Multiplicative Exponential Linear Logic, itself the journal version of A Non-commutative Extension of MELL).
Breaking Paths in Atomic Flows for Classical Logic
Alessio Guglielmi, Tom Gundersen and Lutz Straßburger
Abstract
Pdf, 29 April 2010
LICS 2010 Proceedings (IEEE), pp. 284–293
A Proof Calculus Which Reduces Syntactic Bureaucracy
Alessio Guglielmi, Tom Gundersen and Michel Parigot
Abstract
Pdf, 12 April 2010
RTA 2010, Leibniz International Proceedings in Informatics (LIPIcs) 6, pp. 135–150
This paper supersedes the note Formalism A.
Personal Portrait of Giorgio Levi
Alessio Guglielmi
Pdf, 23 October 2009
Theoretical Computer Science 410 (46) 2009, pp. 4605–4607
On the Proof Complexity of Deep Inference
Paola Bruscoli and Alessio Guglielmi
Abstract
Pdf, 19 April 2009
ACM Transactions on Computational Logic 10 (2:14) 2009, pp. 1–34
Normalisation Control in Deep Inference via Atomic Flows
Alessio Guglielmi and Tom Gundersen
Abstract
Journal page (pdf/postscript), 31 March 2008
Logical Methods in Computer Science 4 (1:9) 2008, pp. 1–36
A System of Interaction and Structure
Alessio Guglielmi
Abstract
Pdf, 27 January 2007
ACM Transactions on Computational Logic, Vol. 8 (1:1) 2007, pp. 1–64
The journal version has been butchered by the editorial process, but the preprint (linked here) is fine.
At the end of 2012, this was the most cited article to have appeared in ACM TOCL in the previous five years.
On Structuring Proof Search for First Order Linear Logic
Paola Bruscoli and Alessio Guglielmi
Abstract
A Non-commutative Extension of MELL
Alessio Guglielmi and Lutz Straßburger
Abstract
Pdf, 9 August 2002
LPAR 2002, LNCS 2514, pp. 231–246
A First Order System with Finite Choice of Premises
Kai Brünnler and Alessio Guglielmi
Abstract
Pdf, 1 December 2003
Presented at First Order Logic 75 under the title A Finitary System for First Order Logic; appeared in Hendricks et al., editor, First-Order Logic Revisited, Logos Verlag, Berlin, 2004, pp. 59–74
A Tutorial on Proof Theoretic Foundations of Logic Programming
Paola Bruscoli and Alessio Guglielmi
Abstract
Pdf, 10 September 2003
Invited tutorial at ICLP '03, LNCS 2916, pp. 109–127
Non-commutativity and MELL in the Calculus of Structures
Alessio Guglielmi and Lutz Straßburger
Abstract
Pdf, 28 June 2001
CSL 2001, LNCS 2142, pp. 54–68
The Commutative/Noncommutative Linear Logic BV
Alessio Guglielmi
This brief survey contains an informal presentation of the commutative/noncommutative linear logic BV in terms of a naive spatio-temporal model. I argue that BV improves on the ability of linear logic to model structure that is spatial in nature by satisfactorily capturing temporal structure in programming languages and quantum physics. I provide a guide to the literature on BV and suggest that the only satisfactory treatment for it can be obtained in the proof-theoretic methodology of deep inference.
15 April 2014

Polynomial Size Deep-Inference Proofs Instead of Exponential Size Shallow-Inference Proofs
Alessio Guglielmi
By a simple example, I show how deep inference can provide for an exponential speed-up in the size of proofs with respect to shallow inference.
22 September 2007
Slides
Superseded by On the Proof Complexity of Deep Inference

Finitary Cut
Alessio Guglielmi
We all know that the traditional cut rule is considered infinitary. But if we reduce the cut rule to atomic form, as we always can do in the calculus of structures, is atomic cut still infinitary? Not really.
22 September 2007
Superseded by A First Order System with Finite Choice of Premises

On Analytic Inference Rules in the Calculus of Structures
Paola Bruscoli and Alessio Guglielmi
We discuss the notion of analytic inference rule for propositional logics in the calculus of structures.
25 November 2009

Butterflies
Alessio Guglielmi
If you are interested in NEL and BV, try to find the mistake in this note.
29 July 2005

Some News on Subatomic Logic
Alessio Guglielmi
I found a simple explanation for the soundness of the subatomic inference rule.
28 July 2005

Red and Blue
Alessio Guglielmi
The sequent calculus is often colour blind, but not always; the calculus of structures has a consistent behaviour.
26 July 2005

The Problem of Bureaucracy and Identity of Proofs from the Perspective of Deep Inference
Alessio Guglielmi
Some discussion and then the first attempt at wired/weird deduction.
17 June 2005
Proceedings of Structures and Deduction '05, pp. 53–68

Formalism B
Alessio Guglielmi
Inference rules operate on derivations.
20 December 2004

Formalism A
Alessio Guglielmi
I generalise the notion of derivation to take care of a certain kind of bureaucracy.
23 April 2004
Superseded by A Proof Calculus Which Reduces Syntactic Bureaucracy

Resolution in the Calculus of Structures
Alessio Guglielmi
It is possible to search for cut-free proofs by the resolution strategy in the calculus of structures; the sequent calculus does not allow the same freedom.
29 September 2003

Mismatch
Alessio Guglielmi
There is a mismatch between the meta and object levels in many calculi, such as the sequent calculus and natural deduction. The mismatch has undesirable proof-theoretic consequences, the most important being the inability to design deductive systems.
24 August 2003
Brief note for the specialist, posted to the Foundations of Mathematics and Proof Theory lists

Normalisation Without Cut Elimination
Alessio Guglielmi
I propose a possible notion of equivalence for proofs in classical logic, a sort of normal form for case analysis.
25 February 2003

Subatomic Logic
Alessio Guglielmi
One can unify classical and linear logic by using only two simple, linear, `hyper´ inference rules; they generate nice systems for all the fragments, where all rules are local. The various logics are determined by the algebra of their units; for example, boolean algebra determines classical logic. We can prove cut elimination easily and once and for all in the two-rule system, and the procedure scales down to all the derived fragments.
21 November 2002

On Lafont's Counterexample
Alessio Guglielmi
Lafont's counterexample shows why cut elimination in the sequent calculus of classical logic is not confluent. I show here that the counterexample does not work in the calculus of structures.
1 November 2002

Recipe
Alessio Guglielmi
A family of tautologies that hold in most logics can be used, following a simple recipe, to produce deductive systems where all rules are local, including cut.
27 October 2002

Goodness, Perfection and Miracles
Alessio Guglielmi
I argue about three properties closely related to cut elimination and interpolation theorems.
18 October 2002
Proof Composition Mechanisms and their Geometric Interpretation
Theory and Application of Formal Proofs, LIX Colloquium 2013 (École Polytechnique)
5 November 2013

Geometric Ideas in the Design of Efficient and Natural Proof Systems
Third Workshop of the Amadeus Project on Proof Compression (INRIA Nancy-Grand Est) and Workshop on Algebraic Proof Theory (Gudauri)
16 and 24 September 2013

Towards More Efficient and Natural Proof Systems
14th Wessex Theory Seminar
27 June 2012

Redesigning Logical Syntax with a Bit of Topology
Technische Universität Dresden
31 May 2011

Removing Syntax From Proof Theory
University of Bath (videolink with Swansea)
9 December 2010

Some Ideas on How to Find Better Proof Representations
University of Bath
28 November 2010

Geometric Normalisation with Atomic Flows
Journées GEOCAL-LAC, Nice
16 March 2010

Normalisation with Atomic Flows
Università di Bologna e di Torino
20–21 January 2010

Ten Years of Deep Inference
Università di Pisa
19 January 2010

Some News on the Proof Complexity of Deep Inference
University of Bath
11 November 2009

Beyond
University of Oslo, invited talk at Gentzen Systems and Beyond, a workshop co-located with Tableaux 2009
6 July 2009
2013–2016: Andrea Aler Tubella (PhD).
2013–2016: Daniel Schmitter (PhD).
2010–2013: Anupam Das (PhD). Now a post-doc at LIX & INRIA–Saclay-Île-de-France, École Polytechnique (Paris).
2006–2009: Tom Gundersen (PhD), University of Bath. Until December 2013 a post-doc at the PPS lab of Paris VII/CNRS (Paris). Thesis (Pdf).
2003–2006: Ozan Kahramanogullari (PhD). Now a researcher at The Microsoft Research-University of Trento Centre for Computational and Systems Biology, Trento. Thesis (Pdf).
2000–2003: Kai Brünnler (PhD). Until 2011 a researcher at the University of Bern. Thesis (Pdf).
2000–2003: Lutz Straßburger (PhD). Now a researcher at LIX & INRIA–Saclay-Île-de-France, École Polytechnique, Palaiseau (Paris). Thesis (Pdf).
2000–2001: Alwen Tiu (MSc). Now an assistant professor at Nanyang Technological University (Singapore). Thesis (Pdf).
Efficient and Natural Proof Systems, 2013–2016. Three-year EPSRC research project at the University of Bath. 756,061 GBP (full economic costing, including indexation and a research studentship funded by the University of Bath). Principal investigator Alessio Guglielmi, co-investigator Guy McCusker.
Sharing and Sequentiality in Proof Systems with Locality, 2012–2014. Two-year Royal Society International Exchanges Scheme, an exchange programme between the University of Bath and the Università di Torino. The two sides get 11,960 GBP. Principal investigators Alessio Guglielmi and Luca Roversi.
REDO: Redesigning Logical Syntax, 2009–2010. INRIA Action de Recherche Collaborative, a two-year project of the teams Calligramme/Démosthène and Parsifal (INRIA) and the Computer Science Department of the University of Bath. The project gets 80,000 EUR. Principal investigators Alessio Guglielmi, François Lamarche and Lutz Straßburger (coordinator).
Identity and Geometric Essence of Proofs (Démosthène), ANR Chaire d’excellence, 2008–2010. Two-year project at LORIA. The project gets about 355,000 EUR from the Agence Nationale de la Recherche, out of a total cost of 750,448 EUR, the rest of which is covered by INRIA. Principal investigator Alessio Guglielmi.
Complexity and Non-Determinism in Deep Inference, 2007–2008. 17-month project at the University of Bath. 118,072 GBP (out of a total cost of 147,591 GBP) from the Engineering and Physical Sciences Research Council (EPSRC). Principal investigator Alessio Guglielmi, co-investigator Paola Bruscoli.
New Deductive Systems, Normalisation Methods and Semantics for Deep Inference, 2006–2007. Alliance Franco-British Partnership Programme, a two-year exchange programme between the University of Bath and the PPS group in Paris. Bath gets 4,300 GBP from the British Council (principal investigator Alessio Guglielmi) and PPS gets 6,250 EUR from the French MAE and the Ministry of Research (principal investigator Michel Parigot).
Analyticity and Proof Search for Modal Logics in Deep Inference, 2006–2008. British-German Academic Research Collaboration Programme, a two-year exchange programme between the University of Bath and the Technische Universität Dresden. Bath gets 2,650 GBP from the British Council (principal investigator Alessio Guglielmi) and Dresden gets 5,694 EUR from the German Academic Exchange Service (principal investigator Steffen Hölldobler).
I help run a mailing list devoted to proof theory: click here to subscribe or here to unsubscribe.
I also manage a mailing list specifically devoted to structural proof theory and deep inference: click here to subscribe or here to unsubscribe.
In my teaching, I insist on the practical motivations behind science. One of my deepest convictions is that being grounded in practice is an essential ingredient of interesting research as well. I tend to build my courses around `important questions´. I ask students to think about natural questions and problems and to explore different paths and possible lines of attack before I show them the solution. I privilege concrete knowledge over abstract knowledge: I prefer to insist on a few core concepts and on the ability to solve (perhaps conceptual) exercises rather than to show one more theorem. I also try to convey the idea that abstraction might become necessary in order to be effective at solving practical problems. This worked for me in the conception of deep inference, where the desire to manage certain concrete problems of concurrent languages led me to reformulate the very foundations of proof theory. In summary, I prefer to transmit basic and general methods whenever possible. I am sure that a solid education in the fundamentals is also useful for the majority of students, who do not intend to pursue a research career: higher education should provide mental flexibility, not just notions.
In the past I have taught, among other courses: Formal Logic and Semantics; Logic and Its Applications; Computability and Decidability; Proof Theory and Deep Inference; Complexity Theory; Logic, Deep Inference and the Calculus of Structures; Sequent Calculus and Abstract Logic Programming; Deduction Systems; and Networking and Advanced Programming Principles.
15 April 2014 · Alessio Guglielmi · email