Publications


Featured research published by Stéphane Lengrand.


Conference on Computability in Europe | 2006

LJQ: a strongly focused calculus for intuitionistic logic

Roy Dyckhoff; Stéphane Lengrand

LJQ is a focused sequent calculus for intuitionistic logic, with a simple restriction on the first premiss of the usual left introduction rule for implication. We discuss its history (going back to about 1950, or beyond), present the underlying theory and its applications both to terminating proof-search calculi and to call-by-value reduction in lambda calculus.
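For reference, the call-by-value discipline mentioned here restricts β-reduction so that it fires only when the argument is already a value, i.e. a variable or an abstraction. A minimal LaTeX sketch in standard notation (ours, not the paper's):

% Values and the call-by-value beta rule (standard notation)
\[
V ::= x \mid \lambda x.M
\qquad\qquad
(\lambda x.M)\,V \;\rightarrow_{\beta_v}\; M[V/x]
\]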


Italian Conference on Theoretical Computer Science | 2005

The language X: circuits, computations and classical logic

Steffen van Bakel; Stéphane Lengrand; Pierre Lescanne

We present the syntax and reduction rules for χ, an untyped language that is well suited to describe structures which we call “circuits” and which are made of parts that are connected by wires. To demonstrate that χ gives an expressive platform, we will show how, even in an untyped setting, we can faithfully embed algebraic objects and elaborate calculi, like the naturals, the λ-calculus, Bloo and Rose’s calculus of explicit substitutions λx, and Parigot’s λμ.
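To make the explicit-substitution idea concrete, here is a minimal executable sketch of Bloo and Rose's λx in Haskell: β-reduction creates an explicit closure instead of substituting at once, and separate rules push that closure through the term. The datatype, rule names and the Barendregt-convention assumption are ours, not the paper's.

-- A sketch of Bloo and Rose's lambda-x: lambda-calculus with explicit
-- substitutions. Assumes all bound variables are distinct (Barendregt's
-- convention), so a closure can be pushed under a binder without renaming.
data Term
  = Var String
  | Lam String Term
  | App Term Term
  | Sub Term String Term        -- Sub m x n  represents  m<x := n>
  deriving Show

-- One leftmost-outermost reduction step, if any.
step :: Term -> Maybe Term
step (App (Lam x m) n)   = Just (Sub m x n)                    -- B: fire beta, delay substitution
step (Sub (Var y) x n)   = Just (if y == x then n else Var y)  -- Var: substitute or discard
step (Sub (App m p) x n) = Just (App (Sub m x n) (Sub p x n))  -- App: duplicate the closure
step (Sub (Lam y m) x n) = Just (Lam y (Sub m x n))            -- Lam: go under the binder
step (App m n)           = case step m of
                             Just m' -> Just (App m' n)
                             Nothing -> App m <$> step n
step (Lam x m)           = Lam x <$> step m
step (Sub m x n)         = (\m' -> Sub m' x n) <$> step m      -- m is itself a closure here
step (Var _)             = Nothing

-- Iterate to normal form (may diverge on non-normalising untyped terms).
normalise :: Term -> Term
normalise t = maybe t normalise (step t)

-- Example: (\x. x x) y  normalises to  y y.
example :: Term
example = normalise (App (Lam "x" (App (Var "x") (Var "x"))) (Var "y"))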


Logical Methods in Computer Science | 2011

A focused sequent calculus framework for proof-search in pure type systems

Stéphane Lengrand; Roy Dyckhoff; James McKinna

Basic proof-search tactics in logic and type theory can be seen as the root-first applications of rules in an appropriate sequent calculus, preferably without the redundancies generated by permutation of rules. This paper addresses the issues of defining such sequent calculi for Pure Type Systems (PTS, which were originally presented in natural deduction style) and then organizing their rules for effective proof-search. We introduce the idea of Pure Type Sequent Calculus with meta-variables, by enriching the syntax of a permutation-free sequent calculus for propositional logic due to Herbelin, which is strongly related to natural deduction and already well adapted to proof-search. The operational semantics is adapted from Herbelin's and is defined by a system of local rewrite rules as in cut-elimination, using explicit substitutions. We prove confluence for this system. Restricting our attention to PTSC, a type system for the ground terms of this system, we obtain the Subject Reduction property and show that each PTSC is logically equivalent to its corresponding PTS, and the former is strongly normalising iff the latter is. We show how to make the logical rules of PTSC into a syntax-directed system PS for proof-search, by incorporating the conversion rules as in syntax-directed presentations of the PTS rules for type-checking. Finally, we consider how to use the explicitly scoped meta-variables of PTSC to represent partial proof-terms, and use them to analyse interactive proof construction. This sets up a framework PE in which we are able to study proof-search strategies, type inhabitant enumeration and (higher-order) unification.
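For orientation, Herbelin's permutation-free calculus underlying this framework is usually presented with two sequent forms: one typing a term, and one typing a list of arguments (a spine) consumed by a formula in focus. A simply-typed LaTeX sketch of the idea (our rendering, not the paper's PTS-level rules):

% \Gamma \vdash t : A types a term; \Gamma ; B \vdash l : A types a spine
% consuming the focused formula B.
\[
\frac{}{\Gamma ; A \vdash [\,] : A}
\qquad
\frac{\Gamma \vdash u : A \quad \Gamma ; B \vdash l : C}
     {\Gamma ; A \rightarrow B \vdash u :: l : C}
\qquad
\frac{(x : A) \in \Gamma \quad \Gamma ; A \vdash l : B}
     {\Gamma \vdash x\,l : B}
\]

Root-first application of these rules builds terms directly in head-variable-then-arguments shape, which is what makes the calculus well adapted to proof-search.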


Logical Methods in Computer Science | 2013

Non-idempotent intersection types and strong normalisation

Alexis Bernadet; Stéphane Lengrand

We present a typing system with non-idempotent intersection types, typing a term syntax covering three different calculi: the pure λ-calculus, the calculus with explicit substitutions λS, and the calculus with explicit substitutions, contractions and weakenings λlxr. In each of the three calculi, a term is typable if and only if it is strongly normalising, as is the case in (many) systems with idempotent intersections. Non-idempotency brings extra information into typing trees, such as simple bounds on the longest reduction sequence reducing a term to its normal form. Strong normalisation follows, without requiring reducibility techniques. Using this, we revisit models of the λ-calculus based on filters of intersection types, and extend them to λS and λlxr. Non-idempotency simplifies a methodology, based on such filter models, that produces modular proofs of strong normalisation for well-known typing systems (e.g. System F). We also present a filter model by means of orthogonality techniques, i.e. as an instance of an abstract notion of orthogonality model formalised in this paper and inspired by classical realisability. Compared to other instances based on terms (one of which rephrases a now standard proof of strong normalisation for the λ-calculus), the instance based on filters is shown to be better at proving strong normalisation results for λS and λlxr. Finally, the bounds on the longest reduction sequence, read off our typing trees, are refined into an exact measure, read off a specific typing tree (called principal); in each of the three calculi, a specific reduction sequence of such length is identified. In the case of the λ-calculus, this complexity result is, for longest reduction sequences, the counterpart of de Carvalho's result for linear head-reduction sequences.
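One standard way to set up non-idempotent intersections, going back to de Carvalho, is to treat an intersection as a multiset of types, so that the application rule requires one argument derivation per use. A generic LaTeX sketch of such a system (not the paper's exact rules):

% Intersections are multisets [sigma_1,...,sigma_n]; + is pointwise
% multiset union of contexts, so repeated uses of a variable are counted.
\[
\frac{}{x : [\sigma] \vdash x : \sigma}
\qquad
\frac{\Gamma, x : M \vdash t : \sigma}{\Gamma \vdash \lambda x.t : M \rightarrow \sigma}
\qquad
\frac{\Gamma \vdash t : [\sigma_i]_{i \in I} \rightarrow \tau \quad (\Delta_i \vdash u : \sigma_i)_{i \in I}}
     {\Gamma + \sum_{i \in I} \Delta_i \vdash t\,u : \tau}
\]

Since each β-step consumes one such argument derivation, derivation size strictly decreases along reduction, which is the intuition behind the bounds mentioned above.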


Journal of Logic and Computation | 2007

Call-by-Value λ-calculus and LJQ

Roy Dyckhoff; Stéphane Lengrand

LJQ is a focused sequent calculus for intuitionistic logic, with a simple restriction on the first premiss of the usual left introduction rule for implication. In a previous paper we discussed its history (going back to about 1950, or beyond) and presented its basic theory and some applications; here we discuss in detail its relation to call-by-value reduction in lambda calculus, establishing a connection between LJQ and the CBV calculus λC of Moggi. In particular, we present an equational correspondence between these two calculi that forms a bijection between the two sets of normal terms and allows reductions in each to be simulated by reductions in the other.
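For context, Moggi's λC extends the λ-calculus with a let construct that sequences computations; its theory includes, among others, the three monad-law-shaped rules below (our rendering of the standard presentation, not taken from the paper):

% let-rules of the computational lambda-calculus
% (V a value; x not free in P)
\[
\mathtt{let}\;x = V\;\mathtt{in}\;M \;=\; M[V/x]
\qquad
\mathtt{let}\;x = M\;\mathtt{in}\;x \;=\; M
\]
\[
\mathtt{let}\;y = (\mathtt{let}\;x = M\;\mathtt{in}\;N)\;\mathtt{in}\;P
\;=\;
\mathtt{let}\;x = M\;\mathtt{in}\;(\mathtt{let}\;y = N\;\mathtt{in}\;P)
\]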


Joint European Conferences on Theory and Practice of Software | 2011

Complexity of strongly normalising λ-terms via non-idempotent intersection types

Alexis Bernadet; Stéphane Lengrand

We present a typing system for the λ-calculus, with non-idempotent intersection types. As is the case in (some) systems with idempotent intersections, a λ-term is typable if and only if it is strongly normalising. Non-idempotency brings some further information into typing trees, such as a bound on the longest β-reduction sequence reducing a term to its normal form. We actually present these results in Klop's extension of the λ-calculus, where the bound that is read in the typing tree of a term is refined into an exact measure of the longest reduction sequence. This complexity result is, for longest reduction sequences, the counterpart of de Carvalho's result for linear head-reduction sequences.


Computer Science Logic | 2006

A sequent calculus for type theory

Stéphane Lengrand; Roy Dyckhoff; James McKinna

Based on natural deduction, Pure Type Systems (PTS) can express a wide range of type theories. In order to express proof-search in such theories, we introduce the Pure Type Sequent Calculi (PTSC) by enriching a sequent calculus due to Herbelin, adapted to proof-search and strongly related to natural deduction. PTSC are equipped with a normalisation procedure, adapted from Herbelin’s and defined by local rewrite rules as in Cut-elimination, using explicit substitutions. It satisfies Subject Reduction and it is confluent. A PTSC is logically equivalent to its corresponding PTS, and the former is strongly normalising if and only if the latter is. We show how the conversion rules can be incorporated inside logical rules (as in syntax-directed rules for type checking), so that basic proof-search tactics in type theory are merely the root-first application of our inference rules.


Computer Science Logic | 2011

Filter Models: Non-idempotent Intersection Types, Orthogonality and Polymorphism

Alexis Bernadet; Stéphane Lengrand

This paper revisits models of typed lambda calculus based on filters of intersection types: By using non-idempotent intersections, we simplify a methodology that produces modular proofs of strong normalisation based on filter models. Building such a model for some type theory shows that typed terms can be typed with intersections only, and are therefore strongly normalising. Non-idempotent intersections provide a decreasing measure proving a key termination property, simpler than the reducibility techniques used with idempotent intersections. Such filter models are shown to be captured by orthogonality techniques: we formalise an abstract notion of orthogonality model inspired by classical realisability, and express a filter model as one of its instances, along with two term-models (one of which captures a now common technique for strong normalisation). Applying the above range of model constructions to Curry-style System F describes at different levels of detail how the infinite polymorphism of System F can systematically be reduced to the finite polymorphism of intersection types.
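To illustrate the final point: a System F type quantifies over all types at once, while an intersection type records only the finitely many instances a given context actually uses, e.g. (our illustration, not the paper's example):

\[
\vdash \lambda x.x : \forall X.\,X \rightarrow X
\qquad\text{versus}\qquad
\vdash \lambda x.x : (A \rightarrow A) \wedge (B \rightarrow B)
\]

The paper's model constructions make this reduction from infinite to finite polymorphism systematic.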


Foundations of Software Science and Computation Structures | 2008

Strong normalisation of cut-elimination that simulates β-reduction

Kentaro Kikuchi; Stéphane Lengrand

This paper is concerned with strong normalisation of cut-elimination for a standard intuitionistic sequent calculus. The cut-elimination procedure is based on a rewrite system for proof-terms with cut-permutation rules allowing the simulation of β-reduction. Strong normalisation of the typed terms is inferred from that of the simply-typed λ-calculus, using the notions of safe and minimal reductions as well as a simulation in Nederpelt and Klop's λI-calculus. It is also shown that the type-free terms enjoy the preservation of strong normalisation (PSN) property with respect to β-reduction in an isomorphic image of the type-free λ-calculus.
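Two ingredients named here are worth unpacking. In Nederpelt and Klop's λI-calculus every abstraction λx.M must satisfy x ∈ FV(M), so reduction never erases subterms, which makes it a convenient target for normalisation arguments. And the PSN property states, roughly (our paraphrase in LaTeX):

\[
M \text{ strongly normalising for } \beta
\;\Longrightarrow\;
\overline{M} \text{ strongly normalising for cut-elimination}
\]
% where \overline{M} is the image of the lambda-term M in the proof-term syntax.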


International Joint Conference on Automated Reasoning | 2006

Strong cut-elimination systems for Hudelmaier's depth-bounded sequent calculus for implicational logic

Roy Dyckhoff; Delia Kesner; Stéphane Lengrand

Inspired by the Curry-Howard correspondence, we study normalisation procedures in the depth-bounded intuitionistic sequent calculus of Hudelmaier (1988) for the implicational case, thus strengthening existing approaches to Cut-admissibility. We decorate proofs with terms and introduce various term-reduction systems representing proof transformations. In contrast to previous papers, which gave different arguments for Cut-admissibility suggesting only weakly normalising procedures for Cut-elimination, our main reduction system and all its variations are strongly normalising, with the variations corresponding to different optimisations, some of them with good properties such as confluence.
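The characteristic ingredient of such depth-bounded calculi is the left rule for a nested implication, which replaces unrestricted contraction by premisses that are strictly smaller under a suitable degree measure; in the contraction-free presentations familiar from Hudelmaier's and Dyckhoff's work it reads roughly as follows (our rendering, not the paper's exact formulation):

\[
\frac{\Gamma,\; B \rightarrow C \;\vdash\; A \rightarrow B
      \qquad
      \Gamma,\; C \;\vdash\; G}
     {\Gamma,\; (A \rightarrow B) \rightarrow C \;\vdash\; G}
\]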

Collaboration


Dive into Stéphane Lengrand's collaborations.

Top Co-Authors

Roy Dyckhoff
University of St Andrews

James McKinna
University of St Andrews

Pierre Lescanne
École normale supérieure de Lyon