The event is open to anyone, without registration.
Dale does not know about this event: please make sure it stays that way until the event!
The following schedule is subject to change; please watch this space for updates.
The event will take place in the Sophie Germain building of IRIF, Université Paris Diderot - Paris 7.
All talks will be in the Alan Turing amphitheater, to your left as you enter the building.
Lunch on Thursday will take place at Paris Milan. Note that this is not a free lunch.
We will provide several options for the Friday lunch; this will be organized on Thursday.
There will be a cocktail on Thursday evening. It is open to anyone and will take place at IRIF, in the Sophie Germain building. You are welcome to bring a bottle as a gift to Dale.
A social dinner will take place on Thursday evening, chez Trassoudaine. The restaurant will offer a special menu for the occasion, at 45€ per participant. Please get in touch with the local organizers to register if you would like to participate: we can only accommodate a few more people.
We give a semantics for a classical variant of Dale and Alwen's logic FO-lambda-nabla. The only semantics previously proposed for the nabla operator seems to be the one given by U. Schöpp. Our semantics validates the rule that nabla x implies exists x, but is otherwise faithful to Dale and Alwen's original intentions. It is based on a category of so-called nabla-sets, which we define as presheaves over the poset of natural numbers with additional generic elements at each level. The semantics is sound, and completeness is work in progress.
Tensorial logic is a primitive logic of tensor and negation which refines linear logic by relaxing the hypothesis that tensorial negation (A -> ~A) is involutive. The resulting logic of linear continuations provides a proof-theoretic account of game semantics, where the formulas and proofs of the logic correspond one-to-one to dialogue games and innocent strategies. In the present paper, we introduce a topologically-aware version of tensorial logic, called ribbon tensorial logic. We associate to every proof of the logic a ribbon tangle which tracks the flow of tensorial negations inside the proof. The translation is functorial: it is performed by exhibiting a correspondence between the notion of dialogue category in proof theory and the notion of ribbon category in knot theory. Our main theorem is that the translation is faithful: two tensorial proofs are equal modulo commuting conversions if and only if the associated ribbon tangles are equal up to topological deformation. The "proof-as-tangle" theorem may also be understood as a coherence theorem for ribbon dialogue categories. By connecting tensorial logic and knot theory in this functorial and faithful way, we hope to unveil the topological nature of proofs and programs, and of their dialogical interpretation in game semantics.
In this note, we discuss the notion of analyticity in deep inference and propose a formal definition for it. The idea is to obtain a notion that would guarantee the same properties that analyticity in Gentzen theory guarantees, in particular, some reasonable starting point for algorithmic proof search. Given that deep inference generalises Gentzen proof theory, the notion of analyticity discussed here could be useful in general, and we offer some reasons why this might be the case.
Max Kanovich, Stepan Kuznetsov, Vivek Nigam and Andre Scedrov: On the Proof Theory of Non-Commutative Subexponentials (PDF)
Linear logical frameworks with subexponentials have been used for the specification of, among other systems, proof systems, concurrent programming languages and linear authorization logics. In these frameworks, subexponentials can be configured to allow or disallow the application of the contraction and weakening rules, while the exchange rule can always be applied. This means that formulas in such frameworks can only be organized as sets and multisets of formulas; it is not possible to organize formulas as lists. This paper investigates the proof theory of linear logic proof systems that do not allow the application of the exchange rule on some subexponentials, called non-commutative subexponentials. We identify conditions under which cut elimination is admissible in the presence of non-commutative subexponentials, studying the interaction of the exchange rule with local and non-local contraction rules. We build the foundations towards linear logical frameworks with possibly non-commutative subexponentials by proposing novel dyadic proof systems.
We conduct a complexity-theoretic study of focussed proof systems, relating the alternation of synchronous and asynchronous phases in a proof to an appropriate alternating time hierarchy, for instance the polynomial hierarchy. We propose a notion of 'over-focussing' that admits non-branching invertible rules during synchronous phases, since deterministic computations can equally be carried out by a nondeterministic machine. As an application, we develop an over-focussed system for a fragment of intuitionistic propositional logic which we show to be already PSPACE-complete by a refinement of Statman's translation from true QBFs. We show that this translation has a well-behaved inverse, preserving quantifier complexity, in the form of a QBF encoding of proof search for the over-focussed system, demonstrating the usefulness of considering such systems. Consequently we are able to delineate intuitionistic tautologies according to the polynomial hierarchy and derive further proof-theoretic consequences for intuitionistic logic.
Tomer Libal and Marco Volpe: A general proof certification framework for modal logic (PDF)
One of the main issues in proof certification is that different theorem provers, even when designed for the same logic, tend to use different proof formalisms and to produce outputs in different formats. The ProofCert project promotes the usage of a common specification language and of a small and trusted kernel in order to check proofs coming from different sources and for different logics. By relying on that idea and by using a classical focused sequent calculus as a kernel, we propose here a general framework for checking modal proofs. We present the implementation of the framework in a Prolog-like language and show how it is possible to specialize it in a simple and modular way in order to cover different proof formalisms, such as tableaux, labeled systems, sequent calculi and nested sequent calculi. We illustrate the method for the logic K by providing several examples and discuss how to extend the approach to other modal logics.
Roberto Blanco and Zakaria Chihani: An interactive assistance for the definition of proof certificates (PDF)
The Foundational Proof Certificate (FPC) approach to proof evidence offers a flexible framework for the formal definition of proof semantics, described through its relationship to focused proof systems. The certificates produced by tools are executable when interpreted on top of a suitable logic engine, and can thus be independently verified by trusted proof checkers.
The fundamental obstacle encountered here lies in translating the proof evidence produced by a tool into the terms of a formal definition in the system. These formal definitions are akin to domain-specific languages (in which proofs can be written) programmed in the assembly language of the underlying proof systems: a delicate task for which both expert knowledge and great care are needed.
To facilitate broader adoption, we begin to explore techniques that abstract away part of this complexity and bring the FPC framework closer to a user-friendly, programmable platform in which a wide range of high-level certificate definitions can be easily encoded.
Lutz Strassburger: Combinatorial flows as proof certificates with built-in proof compression
Ferruccio Guidi, Claudio Sacerdoti Coen and Enrico Tassi: Implementing Type Theory in Higher Order Constraint Logic Programming (PDF)
In this paper we are interested in high-level programming languages to implement the core components of an interactive theorem prover for a dependently typed language: the kernel --- responsible for type-checking closed terms --- and the elaborator --- which manipulates terms with holes or, equivalently, partial proof terms.
In the first part of the paper we confirm that $\lambda$Prolog, the language developed by Miller and Nadathur since the 80s, is extremely suitable for implementing the kernel, even when efficient techniques like reduction machines are employed.
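As a rough illustration of why $\lambda$Prolog fits the kernel so well, the typing rules of even a simply-typed fragment can be transcribed almost literally as clauses, with the `pi`/`=>` combination handling binders; the snippet below is a standard textbook-style sketch under that assumption, not code from the paper (all names are illustrative):

```prolog
% Illustrative λProlog sketch (not from the paper): a minimal
% typing kernel for the simply-typed lambda calculus.
kind tm, ty  type.
type arr     ty -> ty -> ty.         % function types A -> B
type app     tm -> tm -> tm.         % application
type lam     ty -> (tm -> tm) -> tm. % HOAS abstraction

type of      tm -> ty -> o.
% An application is well typed when its head has an arrow type.
of (app M N) B :- of M (arr A B), of N A.
% Binders are delegated to λProlog itself: pi introduces a fresh
% variable x, and => puts the hypothesis (of x A) in scope.
of (lam A F) (arr A B) :- pi x\ of x A => of (F x) B.
```

The point of the sketch is that no explicit machinery for fresh names, renaming or contexts is needed: the meta-language supplies it.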
In the second part of the paper we turn our attention to the elaborator and we observe that the eager generative semantics inherited from Prolog makes it impossible to reason by induction over terms containing metavariables. We also conclude that the minimal extension to $\lambda$Prolog that makes this possible is the ability to delay inductive predicates over flexible terms, turning them into (sets of) constraints to be propagated according to user-provided constraint propagation rules.
Therefore we propose extensions to $\lambda$Prolog to declare and manipulate higher order constraints, and we implement the proposed extensions in the ELPI system. Our test case is the implementation of an elaborator for a type theory as a CLP extension to a kernel written in plain $\lambda$Prolog.
David Thibodeau and Alberto Momigliano and Brigitte Pientka: A Case-Study in Programming Coinductive Proofs: Howe's Method
Bisimulation proofs play a central role in programming languages in establishing rich properties such as contextual equivalence. They are also challenging to mechanize, since they require a combination of inductive and coinductive reasoning on open terms.
In this paper we describe mechanizing the property that similarity in the call-by-name lambda calculus is a pre-congruence using Howe's method in the Beluga proof environment. The formal development relies on three key ingredients: 1) we give a HOAS encoding of lambda-terms together with their operational semantics as intrinsically typed terms, thereby not only avoiding the need to deal with binders, renaming and substitutions, but also keeping all typing invariants implicit; 2) we take advantage of Beluga's support for representing open terms using first-class contexts and simultaneous substitutions: this allows us to directly state a notion such as open simulation without resorting to the usual inductive closure operation and to encode very elegantly notoriously painful proofs such as the substitutivity of the Howe relation; 3) we exploit the possibility of reasoning by coinduction in Beluga's reasoning logic.
The end result is succinct and elegant, thanks to the high-level abstractions and primitives Beluga provides. We substantiate this claim by comparing this proof with a similar one carried out in Abella, and with less closely related developments in Isabelle/HOL and Coq. We hence believe that this mechanization is a textbook example that illustrates Beluga's strength at mechanizing challenging (co)inductive proofs using higher-order abstract syntax encodings.
In this paper we propose a logical foundation of processes and their focused normal forms. We use a linear meta-language based on substructural operational semantics to describe focused forms of processes, and compare them to standard π-calculus processes with their respective operational semantics. The overall goal of this research is to understand how to reason about processes, multi-party communication and global types, and how to mechanize properties such as deadlock freedom and liveness. We are also interested in establishing the limitations of this approach.
Ross Horne and Alwen Tiu: Towards Proofs as Successful Executions of Processes (PDF)
We further the understanding of the relationship between process calculi and non-commutative logic. This work focuses on BV1, a first-order extension of the proof calculus BV featuring a de Morgan dual pair of nominal quantifiers. An embedding of pi-calculus processes as predicates in BV1 is defined, and a procedure is provided for extracting successful executions from proofs of embedded processes. This procedure is used to establish the soundness of linear implication in BV1 with respect to trace inclusion in the pi-calculus. We illustrate the expressive power of BV1 by demonstrating that these techniques extend also to the internal pi-calculus, where the privacy of inputs is guaranteed. We emphasise that linear implication is strictly finer than trace inclusion, providing a tight refinement semantics for processes respecting both causality and the scope of private names.
In the paper “Structural Cut Elimination”, Pfenning gives a proof of the admissibility of cut for intuitionistic and classical logic. The proof is remarkable in that it does not rely on difficult termination metrics, but rather on a nested lexicographic induction on the structure of formulas and derivations. A crucial requirement for this proof to go through is that contraction is not an inference rule of the system. Because of this, it is necessary to change the inference rules so that contraction becomes an admissible rule rather than an inference rule. This change also requires that weakening is admissible, hence it is not directly applicable to logics in which only contraction is admissible (e.g. relevance logic).
We present a sequent calculus which admits a unified structural cut elimination proof that encompasses Intuitionistic MALL and its affine, strict and intuitionistic extensions. A nice feature of the calculus is that, for instance, moving from linear to strict logic is as simple as allowing the presence of a rule corresponding to contraction.
Finally, based on the insights we obtain from this design, we present a strongly focused sequent calculus for strict logic (i.e. Intuitionistic MALL with free contraction).
Herbrand's theorem is one of the most fundamental insights in logic. From the syntactic point of view it suggests a compact representation of proofs in classical first- and higher-order logic by recording which instances have been chosen for which quantifiers, known in the literature as expansion trees.
Such a representation is inherently analytic and hence corresponds to a cut-free sequent calculus proof. Recently several extensions of such proof representations to proofs with cut have been proposed. These extensions are based on graphical formalisms similar to proof nets and are limited to prenex formulas.
In this paper we present a new approach that directly extends Miller's expansion trees by cuts and covers also non-prenex formulas. We describe a cut-elimination procedure for our expansion trees with cut that is based on the natural reduction steps and show that it is weakly normalizing.
David Baelde, Amina Doumane, Lucca Hirschi and Alexis Saurin: Circular Proofs, Theory and Certificates
I will present three recent results on infinitary proof systems (cut-elimination and focalization for muMALL, and completeness for linear-time temporal logic) and briefly discuss to what extent this supports the use of infinitary proofs as foundations of functional programming, logic programming, and proof certificates.
Local organization is handled by David Baelde and Alexis Saurin.