Publication


Featured research published by Damien Doligez.


International Conference on Logic Programming | 2007

Zenon: an extensible automated theorem prover producing checkable proofs

Richard Bonichon; David Delahaye; Damien Doligez

We present Zenon, an automated theorem prover for first-order classical logic (with equality), based on the tableau method. Zenon is intended to be the dedicated prover of the Focal environment, an object-oriented algebraic specification and proof system, which is able to produce OCaml code for execution and Coq code for certification. Zenon can directly generate Coq proofs (proof scripts or proof terms), which can be reinserted in the Coq specifications produced by Focal. Zenon can also be extended, which makes specific (and possibly local) automation possible in Focal.
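
The tableau method at the core of Zenon can be illustrated on a propositional fragment. The following is a minimal OCaml sketch of the idea (all names are hypothetical and it is far simpler than Zenon's first-order engine): to prove a formula, negate it and show that every branch of the tableau closes on a contradiction.

(* Minimal propositional tableau sketch (hypothetical, not Zenon's API):
   a formula is refuted when every tableau branch contains both a literal
   and its negation. *)
type formula =
  | Atom of string
  | Not of formula
  | And of formula * formula
  | Or of formula * formula

(* A branch is closed when it contains a formula and its negation. *)
let closed branch =
  List.exists (fun f -> List.mem (Not f) branch) branch

(* Process the work list: literals move onto the branch; And puts both
   conjuncts on the same branch; Or forks the branch in two, and both
   forks must close. *)
let rec close branch = function
  | [] -> closed branch
  | (Atom _ as a) :: rest -> close (a :: branch) rest
  | (Not (Atom _) as l) :: rest -> close (l :: branch) rest
  | Not (Not f) :: rest -> close branch (f :: rest)
  | And (f, g) :: rest -> close branch (f :: g :: rest)
  | Not (Or (f, g)) :: rest -> close branch (Not f :: Not g :: rest)
  | Or (f, g) :: rest -> close branch (f :: rest) && close branch (g :: rest)
  | Not (And (f, g)) :: rest ->
      close branch (Not f :: rest) && close branch (Not g :: rest)

(* To prove f, refute its negation. *)
let proves f = close [] [ Not f ]

let () =
  let p = Atom "p" and q = Atom "q" in
  assert (proves (Or (Not (And (p, q)), p)));
  print_endline "tautology proved"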


International Joint Conference on Automated Reasoning | 2010

Verifying safety properties with the TLA+ proof system

Kaustuv Chaudhuri; Damien Doligez; Leslie Lamport; Stephan Merz

TLAPS, the TLA+ proof system, is a platform for the development and mechanical verification of TLA+ proofs. The TLA+ proof language is declarative, and understanding proofs requires little background beyond elementary mathematics. The language supports hierarchical and non-linear proof construction and verification, and it is independent of any verification tool or strategy. Proofs are written in the same language as specifications; engineers do not have to translate their high-level designs into the language of a particular verification tool. A proof manager interprets a TLA+ proof as a collection of proof obligations to be verified, which it sends to backend verifiers that include theorem provers, proof assistants, SMT solvers, and decision procedures. The first public release of TLAPS is available from [1], distributed with a BSD-like license. It handles almost all the non-temporal part of TLA+ as well as the temporal reasoning needed to prove standard safety properties, in particular invariance and step simulation, but not liveness properties. Intuitively, a safety property asserts what is permitted to happen; a liveness property asserts what must happen; for a more formal overview, see [3,10].
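
As a rough, hypothetical illustration of the architecture this abstract describes (the types and toy backends below are invented for the example, not TLAPS's actual interfaces), a proof manager can be seen as flattening a hierarchical proof into independent obligations and trying a list of backend verifiers on each.

(* Invented types for illustration: an obligation is just a goal to check,
   and a backend is anything that can try to discharge it. *)
type obligation = { id : int; goal : string }
type backend = { name : string; prove : obligation -> bool }

(* Try each backend in turn and report which one discharged the obligation. *)
let discharge backends ob =
  match List.find_opt (fun b -> b.prove ob) backends with
  | Some b -> Printf.printf "obligation %d proved by %s\n" ob.id b.name; true
  | None -> Printf.printf "obligation %d failed\n" ob.id; false

let () =
  (* Toy stand-ins for real verifiers such as SMT solvers or Isabelle. *)
  let smt = { name = "SMT"; prove = fun ob -> String.length ob.goal < 20 } in
  let isabelle = { name = "Isabelle"; prove = fun _ -> true } in
  (* The two obligations of a standard invariance proof. *)
  let obligations =
    [ { id = 1; goal = "Init => Inv" };
      { id = 2; goal = "Inv /\\ [Next]_vars => Inv'" } ] in
  if List.for_all (discharge [ smt; isabelle ]) obligations then
    print_endline "invariant proved"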


Symposium on Principles of Programming Languages | 2009

A foundation for flow-based program matching: using temporal logic and model checking

Julien Brunel; Damien Doligez; René Rydhof Hansen; Julia L. Lawall; Gilles Muller

Reasoning about program control-flow paths is an important functionality of a number of recent program matching languages and associated searching and transformation tools. Temporal logic provides a well-defined means of expressing properties of control-flow paths in programs, and indeed an extension of the temporal logic CTL has been applied to the problem of specifying and verifying the transformations commonly performed by optimizing compilers. Nevertheless, in developing the Coccinelle program transformation tool for performing Linux collateral evolutions in systems code, we have found that existing variants of CTL do not adequately support rules that transform subterms other than the ones matching an entire formula. Being able to transform any of the subterms of a matched term seems essential in the domain targeted by Coccinelle. In this paper, we propose an extension to CTL named CTL-VW (CTL with variables and witnesses) that is a suitable basis for the semantics and implementation of Coccinelle's program matching language. Our extension to CTL includes existential quantification over program fragments, which allows metavariables in the program matching language to range over different values within different control-flow paths, and a notion of witnesses that record such existential bindings for use in the subsequent program transformation process. We formalize CTL-VW and describe its use in the context of Coccinelle. We then assess the performance of the approach in practice, using a transformation rule that fixes several reference count bugs in Linux code.
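
To make the idea concrete, here is a small, hypothetical OCaml sketch of the kind of formula and witness structure the abstract describes (the constructors and the example rule are invented; Coccinelle's actual semantic patch language is richer).

(* Invented constructors illustrating CTL extended with metavariables:
   Exists binds a metavariable over program fragments, and witnesses
   record the bindings found along each matching control-flow path. *)
type metavar = string

type formula =
  | Pred of string * metavar list   (* a CFG node matching e.g. lock(x) *)
  | Exists of metavar * formula     (* existential binding of a metavariable *)
  | And of formula * formula
  | Not of formula
  | AX of formula                   (* holds in all immediate successors *)
  | AF of formula                   (* eventually holds on all paths *)

(* A witness records which fragment a metavariable was bound to at a
   matched state, plus the witnesses justifying the subformulas. *)
type witness = { var : metavar; fragment : string; justification : witness list }

(* Roughly the shape of a rule "some lock x is taken and, on every path
   from there, x is eventually released". *)
let example : formula =
  Exists ("x", And (Pred ("lock", [ "x" ]), AF (Pred ("unlock", [ "x" ]))))

let () =
  let w = { var = "x"; fragment = "&dev->mutex"; justification = [] } in
  ignore example;
  Printf.printf "witness binds %s to %s\n" w.var w.fragment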


Ecology | 2008

Spatial scale of local breeding habitat quality and adjustment of breeding decisions

Blandine Doligez; Anne Berthouly; Damien Doligez; Marion Tanner; Verena Saladin; Danielle Bonfils; Heinz Richner

Experimental studies provide evidence that, in spatially and temporally heterogeneous environments, individuals track variation in breeding habitat quality to adjust breeding decisions to local conditions. However, most experiments consider environmental variation at one spatial scale only, while the ability to detect the influence of a factor depends on the scale of analysis. We show that different breeding decisions by adults are based on information about habitat quality at different spatial scales. We manipulated (increased or decreased) local breeding habitat quality through food availability and parasite prevalence at a small (territory) and a large (patch) scale simultaneously in a wild population of Great Tits (Parus major). Females laid earlier in high-quality large-scale patches, but laying date did not depend on small-scale territory quality. Conversely, offspring sex ratio was higher (i.e., biased toward males) in high-quality, small-scale territories but did not depend on large-scale patch quality. Clutch size and territory occupancy probability did not depend on our experimental manipulation of habitat quality, but territories located at the edge of patches were more likely to be occupied than central territories. These results suggest that integrating different decisions taken by breeders according to environmental variation at different spatial scales is required to understand patterns of breeding strategy adjustment.


International Colloquium on Theoretical Aspects of Computing | 2010

The TLA+ proof system: building a heterogeneous verification platform

Kaustuv Chaudhuri; Damien Doligez; Leslie Lamport; Stephan Merz

Model checking has proved to be an efficient technique for finding subtle bugs in concurrent and distributed algorithms and systems. However, it is usually limited to the analysis of small instances of such systems, due to the problem of state space explosion. When model checking finds no more errors, one can attempt to verify the correctness of a model using theorem proving, which also requires efficient tool support.


International Conference on Logic Programming | 2013

Zenon Modulo: When Achilles Outruns the Tortoise Using Deduction Modulo

David Delahaye; Damien Doligez; Frédéric Gilbert; Pierre Halmagrand; Olivier Hermant

We propose an extension of the tableau-based first-order automated theorem prover Zenon to deduction modulo. The theory of deduction modulo is an extension of predicate calculus that allows us to rewrite terms as well as propositions, and that is well suited for proof search in axiomatic theories, as it turns axioms into rewrite rules. We also present a heuristic to perform this latter step automatically, and assess our approach with experimental results obtained on benchmarks from the TPTP library, where this heuristic is able to prove difficult problems, in set theory in particular. Finally, we describe an additional backend for Zenon that outputs proof certificates for Dedukti, a proof checker based on the λΠ-calculus modulo.
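
The central move, turning an axiom into a rewrite rule that is applied during proof search, can be sketched in a few lines of OCaml (hypothetical representation, not Zenon Modulo's): the axiom x ∈ {y} ⇔ x = y becomes the rule x ∈ {y} → x = y.

(* Hypothetical term representation and rewrite rules for illustration. *)
type term =
  | Var of string
  | App of string * term list   (* e.g. App ("mem", [x; App ("singleton", [y])]) *)

(* A rule rewrites a given head symbol applied to arguments. *)
type rule = { head : string; rewrite : term list -> term option }

(* x ∈ {y}  -->  x = y, expressed on the term representation. *)
let mem_singleton =
  { head = "mem";
    rewrite = function
      | [ x; App ("singleton", [ y ]) ] -> Some (App ("eq", [ x; y ]))
      | _ -> None }

(* Normalize a term bottom-up, applying rules until none applies. *)
let rec normalize rules t =
  match t with
  | Var _ -> t
  | App (f, args) ->
      let args = List.map (normalize rules) args in
      let t = App (f, args) in
      (match List.find_opt (fun r -> r.head = f) rules with
       | Some r ->
           (match r.rewrite args with
            | Some t' -> normalize rules t'
            | None -> t)
       | None -> t)

let () =
  let goal = App ("mem", [ Var "a"; App ("singleton", [ Var "b" ]) ]) in
  match normalize [ mem_singleton ] goal with
  | App ("eq", [ Var "a"; Var "b" ]) -> print_endline "rewrote a ∈ {b} to a = b"
  | _ -> print_endline "no rewrite"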


Formal Methods | 1999

Cache Coherence Verification with TLA+

Homayoon Akhiani; Damien Doligez; Paul Harter; Leslie Lamport; Joshua Scheid; Mark R. Tuttle; Yuan Yu

We used the specification language TLA+ to analyze the correctness of two cache-coherence protocols for shared-memory multiprocessors based on two generations (EV6 and EV7) of the Alpha processor. A memory model defines the relationship between the values written by one processor and the values read by another, and a cache-coherence protocol manipulates the caches to preserve this relationship. The cache-coherence protocol is a fundamental component of any shared-memory multiprocessor design. Proving that the coherence protocol implements the memory model is a high-leverage application of formal methods. The analysis of the first protocol was largely a research project, but the analysis of the second protocol was a part of the engineers’ own verification process.
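
As a toy illustration of the memory-model relationship mentioned above (entirely invented, nothing to do with the actual TLA+ specifications of EV6 or EV7), coherence on a single memory location can be phrased as: every read observes the value of the most recent write, regardless of which processor issued it.

(* Check a single-location trace against "a read sees the latest write". *)
type event =
  | Write of int * int   (* processor, value written *)
  | Read of int * int    (* processor, value observed *)

let coherent trace =
  let ok, _ =
    List.fold_left
      (fun (ok, last) ev ->
        match ev with
        | Write (_, v) -> (ok, Some v)
        | Read (_, v) -> (ok && Some v = last, last))
      (true, None) trace
  in
  ok

let () =
  let good = [ Write (0, 1); Read (1, 1); Write (1, 2); Read (0, 2) ] in
  let bad  = [ Write (0, 1); Write (1, 2); Read (0, 1) ] in
  Printf.printf "good trace coherent: %b\nbad trace coherent: %b\n"
    (coherent good) (coherent bad)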


Journal of Automated Reasoning | 2003

Algorithms and Proofs Inheritance in the FOC Language

Virgile Prevosto; Damien Doligez

In this paper, we present the FOC language, dedicated to the development of certified computer algebra libraries (that is, sets of programs). These libraries are based on a hierarchy of implementations of mathematical structures. After presenting the core set of features of our language, we describe the static analyses that reject inconsistent programs. We then show how we translate FOC definitions into OCaml and Coq, our target languages for the computational part and the proof checking, respectively.
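
The hierarchy of mathematical structures that FOC libraries are built on maps naturally onto OCaml's module language; the following is a small illustrative sketch (module names invented, and the Coq side, where the proofs live, is omitted).

(* Each interface extends the previous one; a concrete module provides
   the operations, and derived operations are defined once at the level
   of the interface and inherited by every implementation. *)
module type SETOID = sig
  type t
  val equal : t -> t -> bool
end

module type MONOID = sig
  include SETOID
  val neutral : t
  val op : t -> t -> t
end

(* Integers under addition implement the MONOID interface. *)
module IntAdd : MONOID with type t = int = struct
  type t = int
  let equal = ( = )
  let neutral = 0
  let op = ( + )
end

(* A derived operation shared by all monoids. *)
module MonoidOps (M : MONOID) = struct
  let rec power x n = if n <= 0 then M.neutral else M.op x (power x (n - 1))
end

let () =
  let module P = MonoidOps (IntAdd) in
  Printf.printf "2 * 5 via repeated addition = %d\n" (P.power 2 5)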


Proceedings of the 5th International Conference on Abstract State Machines, Alloy, B, TLA, VDM, and Z (ABZ 2016) | 2016

Proving Determinacy of the PharOS Real-Time Operating System

Selma Azaiez; Damien Doligez; Matthieu Lemerre; Tomer Libal; Stephan Merz

Executions in the PharOS real-time system are deterministic in the sense that the sequence of local states for every process is independent of the order in which processes are scheduled. The essential ingredient for achieving this property is that a temporal window of execution is associated with every instruction. Messages become visible to receiving processes only after the time window of the sending instruction has elapsed. We present a high-level model of PharOS in TLA+ and formally state and prove determinacy using the TLA+ Proof System.
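
A hypothetical sketch of the visibility rule described above (invented types, far simpler than the TLA+ model): a receiver reading at logical time t sees exactly the messages whose sender's time window has elapsed, in a canonical order, so its view does not depend on the scheduling interleaving.

(* Each message carries the deadline of its sender's time window. *)
type message = { deadline : int; payload : string }

(* Messages visible at time [now]: window elapsed, sorted canonically. *)
let visible now mailbox =
  mailbox
  |> List.filter (fun m -> m.deadline <= now)
  |> List.sort compare

let () =
  let mailbox = [ { deadline = 8; payload = "b" };
                  { deadline = 3; payload = "a" };
                  { deadline = 12; payload = "c" } ] in
  visible 10 mailbox
  |> List.iter (fun m ->
       Printf.printf "at t=10 sees %s (deadline %d)\n" m.payload m.deadline)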


ACM Workshop on Programming Languages and Analysis for Security | 2012

Development of secured systems by mixing programs, specifications and proofs in an object-oriented programming environment: a case study within the FoCaLiZe environment

Damien Doligez; Mathieu Jaume; Renaud Rioboo

FoCaLiZe is an object-oriented programming environment that combines specifications, programs and proofs in the same language. This paper describes how its features can be used to formally express specifications and to develop, by stepwise refinement, the design and implementation of secured systems, while proving that the implementation meets its specification or design requirements. We thus obtain a modular implementation of a generic framework for the definition of security policies, together with a certified enforcement mechanism for these policies.
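
As a rough, hypothetical sketch of what such a framework looks like on the programming side (module names invented; in FoCaLiZe the policy and the proof that the enforcement mechanism respects it live in the same development): a policy is a decision function over access requests, and a generic wrapper guards every operation with that decision.

(* A policy decides whether a subject may perform an action on a resource. *)
module type POLICY = sig
  type subject
  type resource
  type action
  val permitted : subject -> action -> resource -> bool
end

(* Generic enforcement: run the operation only when the policy allows it. *)
module Enforce (P : POLICY) = struct
  let guarded s a r f = if P.permitted s a r then Some (f r) else None
end

(* A toy instance: anyone may read a file, only its owner may write it. *)
module Ownership = struct
  type subject = string
  type resource = { owner : string; name : string }
  type action = Read | Write
  let permitted s a r =
    match a with Read -> true | Write -> s = r.owner
end

let () =
  let module E = Enforce (Ownership) in
  let file = { Ownership.owner = "alice"; name = "notes.txt" } in
  match E.guarded "bob" Ownership.Write file (fun f -> f.Ownership.name) with
  | Some _ -> print_endline "write allowed"
  | None -> print_endline "write denied by policy"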

Collaboration


Dive into Damien Doligez's collaboration.

Top Co-Authors

Xavier Leroy
École Normale Supérieure

David Delahaye
University of Montpellier