The Laboratoire Spécification & Vérification (LSV) organizes a two-day workshop to celebrate LSV's 20th anniversary. The event is free and open to the public, but registration is required for logistical purposes (please register before April 30th). The workshop will take place at Ecole normale supérieure Paris-Saclay, in the Amphitheatre Marie Curie on the ground floor of the d'Alembert building.
|Thursday, May 11, 2017|
|09:30||Registration -- Coffee|
|Chair: Stéphane Demri|
Welcome at ENS Paris-Saclay: Pierre-Paul Zalio, Keitaro Nakatani
Population protocols (Angluin et al., PODC 2004) are a formal model of sensor networks consisting of identical, finite-state mobile devices. When two devices come within range of each other, they interact and change their states. Computations are infinite sequences of pairwise interactions where the interacting processes are picked by a fair scheduler.
A population protocol is well specified if for every initial configuration C of devices and for every fair computation starting at C, all devices eventually agree on a consensus value that only depends on C. If a protocol is well specified, then it is said to compute the predicate that assigns to each initial configuration its consensus value. The two main verification problems for population protocols are: Is a given protocol well specified? Does a given protocol compute a given predicate?
While the class of predicates computable by population protocols was already established in 2007 (Angluin et al., Distributed Computing), the decidability of the verification problems remained open until 2015, when my colleagues and I finally proved it (Esparza et al., CONCUR 2015, improved version to appear in Acta Informatica). However, the complexity of our decision procedures is very high. We have recently identified a class of protocols that, while being as expressive as the complete class, is far more tractable. I will report on these results, as well as on some experimental findings.
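As a small illustration of the model (my own sketch, not a protocol from the talk or the cited papers), the following simulates a simple majority-style protocol under a random scheduler. The states, interaction rules, and the weak-weak tie-break biased toward A are assumptions chosen for the example:

```python
import random

# Illustrative majority-style population protocol (an assumption for this
# example, not a protocol from the paper). States: "A"/"B" are strong
# opinions, "a"/"b" are weak ones. Interactions are ordered (initiator,
# responder); pairs not listed leave both states unchanged.
RULES = {
    ("A", "B"): ("a", "b"),  # opposite strong opinions cancel out
    ("B", "A"): ("b", "a"),
    ("A", "b"): ("A", "a"),  # a strong opinion converts a weak opposite
    ("B", "a"): ("B", "b"),
    ("a", "b"): ("a", "a"),  # weak-weak tie-break (biased toward A here)
}

def run(population, steps=100_000, seed=0):
    """A random scheduler (fair with probability 1) picks interacting pairs."""
    rng = random.Random(seed)
    pop = list(population)
    for _ in range(steps):
        i, j = rng.sample(range(len(pop)), 2)
        pop[i], pop[j] = RULES.get((pop[i], pop[j]), (pop[i], pop[j]))
    return pop

# With a strict majority of A's, strong A's cancel every B and then
# convert the remaining weak b's, so all agents end with opinion A/a.
print(run(["A"] * 6 + ["B"] * 4))
```

Here the consensus value depends only on the initial configuration (which opinion holds the majority), which is exactly the well-specification property defined above.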
I will present a survey on QCTL (Quantified CTL), a temporal logic that extends CTL with quantification over atomic propositions. We will consider different possible semantics for QCTL, present several results about its expressive power (e.g., its relationship with MSO and behavioural equivalences), and discuss the complexity of its model-checking and satisfiability problems.
These results are mainly based on the following two papers:
|12:00||Lunch -- Hall Villon, ENS Paris-Saclay|
|Chair: Hubert Comon-Lundh|
In this talk, I will introduce a novel rule for the synthesis of reactive systems, applicable to systems made of n components, each with its own objectives. This rule is based on the notion of admissible strategies. Contrary to previous proposals in the literature, the « assume admissible » rule defines sets of solutions which are rectangular. This property leads to solutions that are robust and resilient, and allows one to synthesize strategies separately for each component. Algorithms with optimal complexity, as well as an abstraction framework compatible with the new rule, are also available.
Cryptographic protocols aim at securing communications over insecure networks like the Internet. Over the past decades, numerous decision procedures and tools have been developed to automatically analyse the security of protocols. The field has now reached a good level of maturity, with efficient techniques for protocols analysed in isolation.
However, protocols are typically built from several components, and tools do not scale well to complex protocols.
For example, may protocols (securely) share long-term keys? Is it possible to use any public key infrastructure (PKI) to implement a public key protocol? When does a protocol securely implement a secure channel?
In this talk, we will explore when it is possible to securely compose protocols, in several scenarios.
In this talk we will consider some aspects of the formal specification and verification of database-driven systems. The emphasis is on the data flow rather than on the control flow. We will survey some of the main results, featuring automata models, logics and model theory, in the presence of data.
|Chair: Patricia Bouyer-Decitre|
Pierre-Alain Reynier (LIF - Université Aix-Marseille & CNRS) :
Simplification of transducers: removing two-wayness and non-determinism
Transducers constitute a classical extension of finite-state automata allowing to model word-to-word transformations. Unlike automata, they are not robust under extensions like non-determinism and two-wayness.
This yields natural subclass decision problems:
In this talk, I will present recent results on these problems for functional word-to-word transformations.
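To give a flavour of why two-wayness matters (a toy example of my own, not one from the talk): a two-way transducer can output the mirror image of its input by a left-to-right pass followed by a right-to-left pass, a word-to-word transformation that no one-way transducer realises.

```python
# Toy two-way transducer, simulated directly (an illustration, not an
# example from the talk): it outputs w#reverse(w) by scanning the input
# once to the right, then once back to the left. This is why removing
# two-wayness, when possible at all, is a non-trivial simplification.
def two_way_mirror(word):
    out, pos = [], 0
    while pos < len(word):          # right pass: copy the input
        out.append(word[pos])
        pos += 1
    out.append("#")                 # right endmarker reached: emit separator
    pos -= 1
    while pos >= 0:                 # left pass: copy the input reversed
        out.append(word[pos])
        pos -= 1
    return "".join(out)

print(two_way_mirror("abc"))  # abc#cba
```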
I will reveal a few secrets about computer science at ENS Cachan and the creation of the LSV; I will also talk about my cognitive research (confidential within the LSV) on how the mind works, as well as my latest (unpublished) results on WSTS.
To be published.
|19:00||Dinner -- Hall Villon, ENS Paris-Saclay|
|Friday, May 12, 2017|
|Chair: Paul Gastin|
Madhavan Mukund (Chennai Mathematical Institute) :
Replicated data, from practice to theory
In cloud based services, information is replicated across multiple servers to improve availability and provide resilience against network failures. To reason about replicated data, we have to relax traditional notions of distributed data consistency and work with weaker guarantees such as eventual consistency. This has also led to the development of conflict-free replicated data types that automatically guarantee strong eventual consistency. Tools and techniques from concurrency theory can be used to formally specify and verify replicated datatypes. Despite the progress in theoretical foundations in this area, much remains to be done to develop formal models that accurately represent features of systems that occur in practice. We provide a glimpse of what has been achieved and some of the challenges that remain.
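As a small illustration of a conflict-free replicated data type (a standard textbook example, not taken from the talk), here is a grow-only counter: merge takes a pointwise maximum, which is commutative, associative and idempotent, so any two replicas that have exchanged states agree on the value.

```python
# Grow-only counter (G-Counter), a textbook CRDT. Each replica only
# increments its own slot; merge takes the pointwise maximum, so merging
# in any order, any number of times, yields the same state: this is the
# strong eventual consistency guarantee mentioned above, obtained
# without any coordination between replicas.
class GCounter:
    def __init__(self, replica_id, n_replicas):
        self.id = replica_id
        self.counts = [0] * n_replicas

    def increment(self):
        self.counts[self.id] += 1       # only touch our own slot

    def value(self):
        return sum(self.counts)         # total increments seen so far

    def merge(self, other):
        self.counts = [max(x, y) for x, y in zip(self.counts, other.counts)]

# Two replicas update independently, then exchange states in both directions.
a, b = GCounter(0, 2), GCounter(1, 2)
a.increment(); a.increment()
b.increment()
a.merge(b); b.merge(a)
print(a.value(), b.value())  # both report 3
```

Verifying that merge really is commutative, associative and idempotent for richer datatypes (sets, sequences, maps) is where the concurrency-theoretic tools mentioned above come into play.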
We consider a population of identical NFA, and view it as a 2-player game: the first player chooses which action to play, and the second one resolves the non-determinism in each copy of the NFA.
The objective for the first player is to synchronize all copies to a target state. In this talk, we will report on decidability and complexity of the following parameterized control problem: for every population size, does the first player have a winning strategy?
|Chair: Philippe Schnoebelen|
Cryptosense software finds and helps to fix security vulnerabilities caused by errors in the use of cryptography. The company was founded in Paris in 2013 as an INRIA spin-off. During this talk I will explain how the project evolved from a pure research idea that attracted industry interest into a growing company with clients in financial services and government across Europe and North America. In particular, I'll focus on the differences between our original research prototype and the product we sell now, and on what I see as the differences between running research projects as a « chargé de recherche » and running a start-up.
Automata and proof systems are two fundamental notions in theoretical computer science. But these two notions might in fact be just one.
|13:00||Lunch -- Hall Villon, ENS Paris-Saclay|
|Chair: Laurent Fribourg|
Etienne André (LIPN, Université Paris 13 & CNRS) :
Verification of real-time systems under uncertainty
Critical real-time systems must be formally verified to prevent any undesired behavior. When some timing constants are known with only a limited precision, traditional timed model-checking techniques may not apply anymore.
Here we use the formalism of parametric timed automata. Despite notoriously hard problems (in fact, most interesting decision problems are undecidable), parametric timed automata allow us to model distributed real-time systems; using IMITATOR, we derive best- and worst-case execution times for a real-time system designed by Thales, subject to uncertainty in some task periods.
We also briefly report on recent techniques to speed-up the verification time using abstractions, and on distributed parametric model checking.
Sylvain Schmitz (LSV - ENS Paris-Saclay & CNRS) :
Complexity Analysis for Termination by a Well-Quasi-Order
A typical application of well-quasi-orders is program termination, which can be shown by mapping any execution sequence of a program to a so-called bad sequence of longer or equal length: as bad sequences are finite, so are executions, and this shows that the program terminates. I will show how such termination proofs can be instrumented to yield complexity upper bounds, with a particular focus on the algorithm for reachability in vector addition systems.
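To illustrate the bad-sequence argument (a toy example of mine, not the algorithm of the talk): configurations of the loop below live in N², which is well-quasi-ordered componentwise by Dickson's lemma. The run checks on the fly that its configurations form a bad sequence; that check succeeding at every step is exactly the termination argument.

```python
# Toy termination proof via a well-quasi-order (my illustration, not the
# talk's algorithm): componentwise <= is a wqo on N^2 by Dickson's
# lemma, so every bad sequence (no earlier element componentwise below a
# later one) is finite. Mapping each execution to such a bad sequence
# therefore proves termination.
def leq(x, y):
    return x[0] <= y[0] and x[1] <= y[1]

def step(x, y):
    # a toy loop: decrement y while possible,
    # otherwise decrement x and reset y to the new x
    if y > 0:
        return (x, y - 1)
    return (x - 1, x - 1)

def run(conf):
    trace = []
    while conf[0] > 0:
        # if some earlier configuration were componentwise below the
        # current one, the sequence would be "good" and nothing would
        # force the loop to stop
        assert not any(leq(old, conf) for old in trace), "not a bad sequence"
        trace.append(conf)
        conf = step(*conf)
    return trace

print(len(run((3, 2))))  # 8 configurations before termination
```

Bounding the maximal length of such bad sequences is precisely what turns this termination proof into a complexity upper bound, as the abstract explains.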
To get to the Amphitheatre Marie Curie, go up the horseshoe driveway and turn left. Read general access information to the LSV here.
The organization committee consists of Patricia Bouyer-Decitre, Stéphane Demri, Laurent Fribourg, Alexis Goudou, Virginie Guénard, Francis Hulin-Hubard, Imane Mimouni, Hugues Moretto-Viry.
This event is financially supported by CNRS-INS2I, ENS Paris-Saclay, Inria.