Laurent Delahoche
University of Picardie Jules Verne
Publications
Featured research published by Laurent Delahoche.
1st IEEE International Conference on E-Learning in Industrial Electronics | 2006
Vincent Ricquebourg; David Menga; David Durand; Bruno Marhic; Laurent Delahoche; Christophe Loge
This general paper presents the Smart Home concept. We detail (a) the Smart Home concept, (b) the various network infrastructures specific to the habitat, and (c) our concepts for modelling the habitat and providing the services best adapted to its inhabitants. In contrast to other projects, we direct our work towards a sensor-based approach and an ontology model of the Smart Home. The originality of our work is that it takes into account the real heterogeneity of the information present in a habitat and uses a service-oriented approach (SOA). The paper thus provides a good overview of what a Smart Home is and of the hardware and software components necessary to build one.
Pattern Recognition | 1999
Pascal Vasseur; Claude Pégard; El Mustapha Mouaddib; Laurent Delahoche
Abstract In this paper, we propose an application of perceptual organization based on the Dempster–Shafer theory. The method is divided into two parts, which respectively rectify segmentation mistakes by restoring the coherence of the segments and detect objects in the scene by forming groups of primitives. We show how we apply the Dempster–Shafer theory, usually used in data fusion, to obtain an optimal match between this tool and the perceptual organization problem. We show that, without any prior knowledge or thresholds, our bottom-up algorithm efficiently detects the different objects even in cluttered environments. Moreover, we demonstrate its robustness and flexibility on indoor and outdoor scenes without any modification of the parameters.
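The Dempster–Shafer combination the abstract relies on can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the two-hypothesis frame ("same" object vs. "diff"), the grouping cues and their masses are invented for the example.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination of two mass functions
    (dicts mapping frozenset -> mass), normalising out the conflict."""
    fused, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass falling on the empty set
    return {s: v / (1.0 - conflict) for s, v in fused.items()}, conflict

# Two grouping cues voting on whether two segments belong to the same object
m_proximity = {frozenset({"same"}): 0.7,
               frozenset({"same", "diff"}): 0.3}   # 0.3 left as ignorance
m_alignment = {frozenset({"same"}): 0.6,
               frozenset({"diff"}): 0.1,
               frozenset({"same", "diff"}): 0.3}
fused, conflict = dempster_combine(m_proximity, m_alignment)
print(fused[frozenset({"same"})])  # pooled belief that they form one object
```

Because the fused masses are renormalised, the hypothesis with the highest pooled mass can be selected directly, which is consistent with the threshold-free grouping the abstract claims.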
International Conference on Robotics and Automation | 1999
Cyril Drocourt; Laurent Delahoche; Claude Pégard; Arnaud Clerentin
We present an absolute localization system based on stereoscopic omnidirectional vision. To do so, we use an original perception system which allows our omnidirectional vision sensor, SYCLOP, to move along a rail. The first part of our study deals with the problem of building the sensorial model from the two stereoscopic omnidirectional images. To solve this problem, we propose an approach based on the fusion of several criteria according to the Dempster–Shafer rules. The second part is devoted to exploiting this sensorial model to localize the robot by matching the sensorial primitives with the environment map. We analyze the performance of our global absolute localization system over several elementary robot moves in different environments.
International Conference on Robotics and Automation | 1998
Laurent Delahoche; Claude Pégard; El Mustapha Mouaddib; Pascal Vasseur
In this article we present a navigation system allowing a mobile robot to localize itself in an indoor environment that is only partially known. This system integrates an environment-map updating module that lets the mobile robot estimate the position of new vertical landmarks along its path. An extended Kalman filter is used, on the one hand, to estimate the mobile robot's position and, on the other hand, to extract the observations used to determine the positions of unlisted landmarks. The integration of new landmarks into the global environment map is managed via the covariance matrix associated with each unlisted landmark. We present the experimental results obtained with SARAH, our mobile robot.
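The EKF loop described above (odometric prediction followed by a landmark-based correction) can be sketched as follows. This is a generic range-bearing EKF under assumed motion and observation models, not the paper's actual filter; the landmark position, noise matrices and measurements are invented for the example.

```python
import numpy as np

def ekf_predict(x, P, u, Q):
    """Predict step. State x = (px, py, theta); u = (d, dtheta) odometry."""
    d, dth = u
    th = x[2]
    x_pred = x + np.array([d * np.cos(th), d * np.sin(th), dth])
    F = np.array([[1.0, 0.0, -d * np.sin(th)],   # Jacobian of the motion model
                  [0.0, 1.0,  d * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, landmark, R):
    """Correct with a (range, bearing) observation of a known landmark."""
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    q = dx * dx + dy * dy
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x[2]])
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [ dy / q,          -dx / q,         -1.0]])
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

x, P = np.zeros(3), np.eye(3) * 0.01
x, P = ekf_predict(x, P, (1.0, 0.0), np.eye(3) * 0.001)        # move 1 m ahead
x, P = ekf_update(x, P, np.array([4.0, 0.0]), (5.0, 0.0),      # see landmark
                  np.eye(2) * 0.01)
```

The trace of the covariance `P` after the update quantifies the remaining position uncertainty; in the paper this same covariance gates whether an unlisted landmark is added to the map.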
Intelligent Robots and Systems | 1997
Laurent Delahoche; Claude Pégard; Bruno Marhic; Pascal Vasseur
In this paper we present a dynamic localization system which allows a mobile robot to navigate autonomously in a structured environment. Our system is based on the use of two sensors: an odometer and an omnidirectional vision system, which provides a reference relative to a set of natural beacons. Our navigation algorithm gives a reliable position estimate thanks to systematic dynamic resetting. To merge the data, we use the extended Kalman filter. The proposed method allows us to treat efficiently the noise problems linked to primitive extraction, which contributes to the robustness of our system. We have thus developed a reliable and fast navigation system that can deal with the constraints of moving robots in an industrial environment. We give the experimental results obtained from a mission carried out in an a priori known environment.
Cocos | 2003
Cyril Drocourt; Laurent Delahoche; Eric Brassart; Bruno Marhic; Arnaud Clerentin
This paper deals with an original simultaneous localisation and map building (SLAM) paradigm based, on the one hand, on the use of an omnidirectional stereoscopic vision system and, on the other hand, on an interval-analysis formalism for state estimation. The first part of our study concerns the problem of building the sensorial model. The second part is devoted to exploiting this sensorial model to localise the robot in the sense of interval analysis. The third part introduces the problem of map updating and deals with matching the stereo sensorial model with an environment map integrating all the previous primitive observations. The SLAM algorithm was tested in several large, structured environments, and experimental results are presented.
Robotics and Autonomous Systems | 2005
Arnaud Clerentin; Laurent Delahoche; Eric Brassart; Cyril Drocourt
Abstract In this article, a dynamic localization method based on multi-target tracking is presented. The originality of this method is its ability to manage and propagate uncertainties during the localization process. This multi-level uncertainty propagation stage is based on the Dempster–Shafer theory. The perception system we use is composed of an omnidirectional vision system and a panoramic range finder. It enables us to treat complementary and redundant data and thus to construct a robust sensorial model integrating a large number of significant primitives. Based on this model, we address the problem of maintaining a matching and propagating the uncertainty on each matched primitive in order to obtain a global uncertainty about the robot's configuration.
Information Fusion | 2012
Bruno Marhic; Laurent Delahoche; Clément Solau; Anne Marie Jolly-Desodt; Vincent Ricquebourg
We address the problem of recognizing abnormal behaviour of the inhabitant of a smart home in the presence of unreliable sensors. The cornerstone of this work is a two-level sensor-fusion architecture based on the Transferable Belief Model (TBM). The novelty of our work lies in the way we detect both unreliable sensors and abnormal behaviour within our architecture, using a temporal analysis of the conflict resulting from the fusion of the sensors. Detection of abnormal behaviour is based on a prediction/observation process, and the influence of faulty sources is discarded through discounting coefficients. Our architecture is tested in a real-life setting using three heterogeneous sensors enabling the detection of impossible transitions between three possible postures: sitting, standing and lying. The impact of faulty-sensor management is also tested in the real-life posture-detection experiment.
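The conflict-based detection idea can be illustrated with the unnormalised (TBM) conjunctive rule, where the mass left on the empty set measures disagreement between a predicted posture and an observed one. The posture masses and the discounting factor below are invented for the illustration; the paper's actual architecture and coefficients are not reproduced here.

```python
def conjunctive_combine(m1, m2):
    """TBM conjunctive rule: unnormalised, so mass may remain on the
    empty frozenset() -- that residual mass is the conflict."""
    out = {}
    for a, ma in m1.items():
        for b, mb in m2.items():
            out[a & b] = out.get(a & b, 0.0) + ma * mb
    return out

def discount(m, alpha, frame):
    """Shafer discounting: keep a fraction alpha of each mass and move
    the rest to total ignorance (the whole frame)."""
    omega = frozenset(frame)
    out = {a: alpha * v for a, v in m.items()}
    out[omega] = out.get(omega, 0.0) + (1.0 - alpha)
    return out

POSTURES = {"sitting", "standing", "lying"}
m_predict = {frozenset({"sitting"}): 0.9, frozenset(POSTURES): 0.1}
m_sensor  = {frozenset({"lying"}): 0.8,  frozenset(POSTURES): 0.2}

# Prediction says "sitting", the sensor insists on "lying": high conflict
conflict = conjunctive_combine(m_predict, m_sensor).get(frozenset(), 0.0)

# Persistent conflict over time -> suspect the sensor and discount it
m_weak = discount(m_sensor, 0.5, POSTURES)
conflict_after = conjunctive_combine(m_predict, m_weak).get(frozenset(), 0.0)
```

Tracking this conflict value over time is what lets a single scalar distinguish "the sensor is broken" (conflict stays high for one source) from "the inhabitant's behaviour is abnormal" (conflict appears across sources).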
Intelligent Robots and Systems | 2009
Arnaud Clerentin; Laurent Delahoche; Bruno Marhic; Mélanie Delafosse; Benjamin Allart
In this paper, we present an original advanced driver assistance system (ADAS) based on the use of omnidirectional vision and an evidential fusion architecture. The panoramic perception solution allows us to address efficiently both the detection of close vehicles and the monitoring of side traffic. The fusion and integration of this stream of sensorial data is handled by a credibilist architecture based on Smets' transferable belief model (TBM). This paradigm allows false alarms to be filtered efficiently through optimal management of the uncertainty estimates.
Autonomous Robots | 2008
Arnaud Clerentin; Mélanie Delafosse; Laurent Delahoche; Bruno Marhic; Anne-Marie Jolly-Desodt
Abstract This article deals with the treatment of uncertainty and imprecision during the mobile robot localization process. The imprecision determination is based on the interval formalism. The mobile robot is equipped with an exteroceptive sensor and odometers, and the imprecise data given by these two sensors are fused by constraint propagation on intervals. At the end of the algorithm, we obtain a 3D localization subpaving which is guaranteed to contain the robot's position. The uncertainty is managed through a propagation architecture based on Smets' Transferable Belief Model. This architecture makes it possible to propagate uncertainty from low-level (sensor) data in order to quantify the global uncertainty of the robot localization estimate.
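Constraint propagation on intervals, the core of the imprecision handling described above, can be illustrated in one dimension. This is a sketch only: the landmark position, odometry bounds and measurement bounds are invented for the example, and the paper works with a full 3D subpaving rather than a single interval.

```python
class Interval:
    """Closed interval [lo, hi] with just the operations we need."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        # Interval subtraction: [a,b] - [c,d] = [a-d, b-c]
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def intersect(self, other):
        return Interval(max(self.lo, other.lo), min(self.hi, other.hi))

    def width(self):
        return self.hi - self.lo

# Odometry bounds the robot's x position in a guaranteed way
x_odo = Interval(5.5, 6.5)

# A range measurement to a landmark at x = 10 is known to lie in
# [3.8, 4.2]; the constraint x = landmark - range contracts the domain
landmark = Interval(10.0, 10.0)
rng = Interval(3.8, 4.2)
x_contracted = x_odo.intersect(landmark - rng)
```

The intersection can only shrink the box, so the result still contains the true position whenever the measurement bounds do; that is the "guaranteed" property the abstract refers to.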