Programme: 2012-2013
Evangeline Pollard (25/04/2013)
Titre: Data fusion and multitarget tracking for Intelligent Transportation Systems
Résumé: This presentation is divided into two parts. In the first part, I will present IMARA, where I currently work; in the second part, I will present some of my scientific work. IMARA is a project-team at INRIA. Its motivation is to improve road transportation in terms of safety, efficiency and comfort, and also to minimize nuisances. The technical approaches are based on driver aids, going all the way to full driving automation. The project-team provides the teams cooperating with it with important resources, such as a fleet of a dozen computer-driven vehicles, various sensors and advanced computing facilities, including simulation tools. In the second part of my talk, I will give an overview of my work on perception. I will first discuss my PhD work, and more specifically a generalization of the Kalman filter: the PHD (Probability Hypothesis Density) filter. I will then present some of the research axes developed in the perception team at IMARA: laser-based SLAM, stereo-vision-based SLAM, numerical differentiation, step detection, obstacle detection and tracking, and ontologies for ITS.
Benoit Geller (11/04/2013)
Titre: Synchronization for Low-Power and Sensor Networks
Résumé: The last decade has witnessed the convergence of three giant worlds, namely electronics, computer science and telecommunications, and the next decade should extend this convergence to most of our daily activities with the generalization of sensor networks. The most critical issue for sensor networks is energy saving, which limits the lifetime of the network. Sensor devices very often need to remain silent, and they communicate using simple short-block protocols. In particular, classical signal processing algorithms have been revived in this context. In this tutorial, we will review some of the basics of synchronization, with a special focus on digital synchronization loops and the Cramér-Rao bounds that can theoretically be reached.
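The digital synchronization loops mentioned in the abstract can be illustrated with a toy first-order phase-locked loop that recovers an unknown carrier phase from noisy pilot symbols. The signal model, loop gain and noise level below are illustrative assumptions, not taken from the tutorial itself:

```python
import cmath
import random

random.seed(1)
TRUE_PHASE = 0.7   # unknown carrier phase offset (radians) -- illustrative
GAIN = 0.1         # loop gain: trades acquisition speed against noise jitter

est = 0.0
for _ in range(500):
    noise = complex(random.gauss(0.0, 0.1), random.gauss(0.0, 0.1))
    r = cmath.exp(1j * TRUE_PHASE) + noise        # received known pilot symbol
    err = cmath.phase(r * cmath.exp(-1j * est))   # phase detector output
    est += GAIN * err                             # loop filter + NCO update

print(est)  # converges close to TRUE_PHASE
```

A smaller gain reduces the steady-state jitter caused by noise but slows acquisition; this trade-off is precisely what the Cramér-Rao bound lets one assess against the best theoretically achievable estimation variance.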
Christophe Avenel (28/03/2013)
Titre: Massively parallel birth and death process for cell nuclei extraction in histopathological images
Résumé: Cell nuclei extraction from histopathology images is necessary for breast cancer grading, and has become one of the major problems in automatic image analysis.
Stochastic marked point processes combined with birth and death processes are promising tools for such extraction, but they are extremely compute-intensive, especially on large images such as scanned microscope slides. We show that the original birth and death process applied to marked point processes is inherently sequential. We therefore rewrite this algorithm to obtain a massively parallel birth and death process. This algorithm is efficiently deployed on multi-core and many-core architectures, and the corresponding performance results are analyzed.
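As a rough illustration of the kind of birth-and-death dynamics involved, the sketch below runs a greatly simplified, greedy, sequential variant on a toy 1-D problem; the blob positions, energy terms and death probability are all illustrative assumptions, not the talk's 2-D marked point process or its parallel rewriting:

```python
import random

random.seed(0)
TRUE_BLOBS = [10.0, 30.0, 55.0]   # hypothetical "nuclei" centres on a line

def fit(x):
    # Data term: negative (good) when x lies within 3 of a true blob centre.
    return min(abs(x - b) for b in TRUE_BLOBS) - 3.0

def energy(config):
    # Total energy: data terms plus a penalty for overlapping detections.
    overlap = sum(1 for i, a in enumerate(config)
                  for b in config[i + 1:] if abs(a - b) < 5.0)
    return sum(fit(x) for x in config) + 10.0 * overlap

def birth_death(steps=2000):
    config = []
    for _ in range(steps):
        # Birth: propose a new object anywhere in the domain,
        # keep it only if it lowers the energy.
        x = random.uniform(0.0, 60.0)
        if energy(config + [x]) < energy(config):
            config.append(x)
        # Death: remove an object when removal lowers the energy,
        # plus a small random death probability.
        for i in reversed(range(len(config))):
            without = config[:i] + config[i + 1:]
            if energy(config) - energy(without) > 0 or random.random() < 0.01:
                config = without
    return sorted(config)

print(birth_death())
```

The inherent sequentiality is visible here: each birth and death decision reads the current configuration, so proposals cannot naively be evaluated concurrently without changing the process, which is the problem the talk's massively parallel reformulation addresses.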
Jan Olaf Blech (26/11/2012)
Titre: Using Formal Behavioral Descriptions of Software Components at Development and Runtime of Systems
Résumé: This talk gives an overview of the use of formal behavioral descriptions of software components during the development and at runtime of a system. The talk highlights different usage scenarios for behavioral descriptions and presents particular efforts that have been undertaken. These comprise the use of formal verification and certification techniques (e.g., by means of higher-order theorem provers) in the development process and at runtime. Furthermore, ongoing work on integrating these techniques into Eclipse-based software development tools is presented; a goal here is to make the techniques available to a larger community of developers.
Barbara Petit (23/11/2012)
Titre: Type Systems for Certified Complexity Analysis
Résumé: Static complexity analysis is an important problem in programming: before running a program, can we know whether its execution will terminate, and how long it will take? This is undecidable in general, and the more expressive the programming language, the harder the problem becomes. In particular, analysing the complexity of programs that include recursive operations can be tricky.
I will present a fairly new approach to this issue in functional programming: type systems. Indeed, type theory is well studied for functional languages and is directly connected with logic, so we can use intuitions coming from a "logic of resources" (namely Linear Logic) to develop a type system carrying complexity information.
I will show how this type system (and its automatic type inference) can be used in practice to obtain precise complexity bounds for (recursive) functional programs, and how these bounds can be certified by formal provers.
Mickaël Delahaye (14/11/2012)
Titre: Combining testing and proof to improve the quality of critical software
Résumé: Given the complexity and ubiquity of software, automating software verification and validation tasks has become indispensable. I have focused in particular on automatic test data generation and on the problem posed by infeasible paths in programs. In this area, I propose a method that exploits the detection of a single infeasible path during test data generation to deduce a whole family of infeasible paths, and thereby speed up the generation. The method combines symbolic execution, constraint-based reasoning and static analyses. Through this contribution and a brief summary of others, I aim to show the value of combining testing and verification methods to improve software quality and reliability in the domains of critical and embedded applications.
Cinzia Di Giusto (26/10/2012)
Titre: Adaptable processes
Résumé: I will introduce the concept of adaptable processes as a way of overcoming the limitations of process calculi for describing patterns of dynamic process evolution. Such patterns rely on direct ways of controlling the behavior and location of running processes, and so they are at the heart of the adaptation capabilities present in many modern concurrent systems, such as workflows or cloud computing scenarios. Rather than a specification language, our calculus is intended as a basis for investigating the fundamental properties of evolvable processes and for developing richer languages with evolvability capabilities.
I will present a core calculus of adaptable processes and propose a framework for verifying and ensuring properties of the designed systems along two directions. First, I will show two specific verification problems and a way of generalizing them. These properties ensure that the number of consecutive erroneous states traversed during a computation is bounded by some given number k, or that a state without errors is eventually reached. We study the (un)decidability of these two problems and their generalization in several variants of the calculus. Finally, I will propose a type system for statically ensuring that well-typed processes guarantee safe and secure updates: for instance, we guarantee that no active protocol is interrupted by an update.
Assalé Adjé (23/10/2012)
Titre: An introduction to Abstract Interpretation: Basics, Kleene and Policy Iteration
Résumé: In this talk, after a brief motivation for static analysis of programs, the basic theory of abstract interpretation will be presented. Next, we will focus on the fixed point equations to be solved. Considering the abstraction in the interval domain, we make the link with game theory. Finally, we compute the fixed point using two techniques: Kleene iteration and policy iteration.
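A minimal sketch of Kleene iteration in the interval domain, analysing the hypothetical loop `x = 0; while x < 100: x = x + 1`; the example program, the widening operator and the single narrowing step are illustrative choices, not material from the talk:

```python
import math

def join(a, b):
    # Least upper bound of two intervals (lo, hi).
    return (min(a[0], b[0]), max(a[1], b[1]))

def leq(a, b):
    # Interval inclusion: a is contained in b.
    return a[0] >= b[0] and a[1] <= b[1]

def widen(x, y):
    # Classic interval widening: any unstable bound jumps to infinity.
    return (x[0] if y[0] >= x[0] else -math.inf,
            x[1] if y[1] <= x[1] else math.inf)

def transfer(x):
    # Abstract effect of one loop body: filter by the guard x < 100, add 1.
    lo, hi = x
    return (lo + 1, min(hi, 99) + 1)

def analyse(init):
    x = init
    while True:
        nxt = join(init, transfer(x))
        if leq(nxt, x):          # post-fixed point reached
            break
        x = widen(x, nxt)        # widening guarantees termination
    # One descending (narrowing) step recovers precision lost by widening.
    return join(init, transfer(x))

print(analyse((0, 0)))  # -> (0, 100): x always stays in [0, 100]
```

Plain Kleene iteration would converge here after 100 ascending steps; widening reaches a post-fixed point in two steps at the cost of precision, which the descending step then restores.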
Sylvain Conchon (19/10/2012)
Titre: Cubicle: A Parallel SMT-based Model Checker for Parameterized Systems
Résumé: Cubicle verifies safety properties of array-based systems, i.e. parameterized transition systems whose states are represented as infinite arrays indexed by an arbitrary number of processes. This class of systems is convenient for describing, for instance, cache coherence protocols and mutual exclusion algorithms. Cubicle model-checks such systems by performing a symbolic backward reachability analysis on infinite sets of states represented by cubes, i.e. existentially quantified conjunctions of predicates. The main loop repeatedly computes the pre-images of cubes; safety and fixpoint checks are discharged to an SMT solver. The tool takes advantage of the symbolic representation of states as cubes (by finding symmetries, inclusions, etc.), as well as a tightly integrated ad hoc SMT solver. Its implementation can also exploit multi-core architectures.
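The main loop described in the abstract can be sketched on a toy explicit-state example; this is only a shape of the algorithm, since Cubicle itself manipulates symbolic cubes over unboundedly many processes and discharges the safety and fixpoint checks to an SMT solver, and the transition system below is a hypothetical illustration:

```python
def backward_reachable(states, transitions, init, bad):
    # Start from the bad states and walk transitions backwards.
    visited = set()
    frontier = set(bad)
    while frontier:
        if frontier & init:
            return "unsafe"            # a bad state is reachable from init
        visited |= frontier
        # Pre-image: states with a successor already known to reach "bad".
        pre = {s for s in states
               if any(t(s) in visited for t in transitions)}
        frontier = pre - visited       # fixpoint check: nothing new => safe
    return "safe"

# Example: a counter modulo 6 stepping by 2, started at 0 (even states only).
states = range(6)
transitions = [lambda s: (s + 2) % 6]
print(backward_reachable(states, transitions, init={0}, bad={3}))  # safe
print(backward_reachable(states, transitions, init={0}, bad={4}))  # unsafe
```

In Cubicle, `visited` and `frontier` hold cubes rather than explicit states, the pre-image is computed symbolically per transition, and both the `frontier & init` emptiness test and the `pre - visited` inclusion test become SMT queries.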
Luca Saiu (25/09/2012)
Titre: GNU epsilon, an extensible programming language
Résumé: Reductionism is a realistic technique for designing and implementing real programming languages, and it leads to solutions that are easier to extend, experiment with and analyse. I will introduce the semantics and implementation of an extensible programming language, based on a minimalistic first-order imperative core language equipped with strong abstraction mechanisms: Lisp-style macros, and code-to-code transformations rewriting extended expressions into core expressions.