Claude Pégard
University of Picardie Jules Verne
Publications
Featured research published by Claude Pégard.
international conference on robotics and automation | 1996
Claude Pégard; El Mustapha Mouaddib
Mobile robots currently use a combination of internal and external sensors to determine their position and orientation during path following. Incremental encoders and gyrometers are generally used to give an approximate estimate of the localization. Nevertheless, the cumulative drift of these internal sensors must be periodically corrected with an exteroceptive sensor. The authors therefore present, in this paper, an optical omnidirectional sensor which, with suitable software, can provide an absolute localization. This sensor consists of a CCD video camera associated with a cone-shaped reflector, so that a view of a 2π radian field is available to compute the position of the robot. The authors report the matching algorithm that registers a locally observed scene within the global navigation area. The authors analyze the accuracy of this global positioning system and present experimental results.
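A minimal sketch of the kind of absolute localization such a sensor enables: landmark azimuths read off the omnidirectional image are fitted to a map of known landmarks by bearing-only resection. The pixel-to-azimuth model, the Gauss-Newton fit and all names below are assumptions for illustration, not the paper's implementation, and the image-to-map matching is taken as already done.

```python
import numpy as np

def pixel_to_azimuth(u, v, cu, cv):
    """Azimuth (rad) of a detected landmark around the image centre (cu, cv)."""
    return np.arctan2(v - cv, u - cu)

def resect_pose(landmarks, bearings, pose0, iters=20):
    """Gauss-Newton fit of (x, y, theta) to bearings of known map landmarks."""
    pose = np.asarray(pose0, dtype=float)
    for _ in range(iters):
        x, y, th = pose
        J, r = [], []
        for (lx, ly), b in zip(landmarks, bearings):
            dx, dy = lx - x, ly - y
            q = dx * dx + dy * dy
            pred = np.arctan2(dy, dx) - th
            res = np.arctan2(np.sin(b - pred), np.cos(b - pred))  # wrapped residual
            r.append(res)
            J.append([dy / q, -dx / q, -1.0])   # d(pred)/d(x, y, theta)
        J, r = np.asarray(J), np.asarray(r)
        pose += np.linalg.lstsq(J, r, rcond=None)[0]
    return pose
```

At least three matched landmarks are needed to constrain the three pose unknowns.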
international conference on robotics and automation | 2007
Cédric Demonceaux; Pascal Vasseur; Claude Pégard
Attitude is one of the most important parameters for a UAV during a flight. Attitude computation methods based vision generally use the horizon line as reference. However, the horizon line becomes an inadequate feature in urban environment. We then propose in this paper an omnidirectional vision system based on straight lines (very frequent in urban environment) that is able to compute the roll and pitch angles. The method consists in finding bundles of horizontal and vertical parallel lines in order to obtain an absolute reference for the attitude computation. We also develop here a new and efficient method for line extraction and bundle of parallel line detection. An original method of horizontal and vertical plane detection is also provided. We show experimental results on different images extracted from video sequences.
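A hedged sketch of how roll and pitch can be recovered once a bundle of vertical parallel lines has been detected: in a central catadioptric camera each 3D line projects onto a great circle of the unit sphere, and for lines parallel to a direction d every great-circle normal n_i satisfies n_i · d = 0, so d is the smallest singular vector of the stacked normals. The roll/pitch convention and all names are assumptions, not the paper's exact formulation.

```python
import numpy as np

def vertical_direction(normals):
    """Common direction of a bundle of parallel lines from great-circle normals."""
    N = np.asarray(normals, dtype=float)
    _, _, vt = np.linalg.svd(N)
    d = vt[-1]                       # approximate null direction of the normals
    return d if d[2] >= 0 else -d    # keep the 'upward' sign

def roll_pitch_from_vertical(v):
    """Roll/pitch such that v = Ry(pitch) @ Rx(roll) @ [0, 0, 1]."""
    roll = np.arcsin(-v[1])
    pitch = np.arctan2(v[0], v[2])
    return roll, pitch
```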
international conference on robotics and automation | 2005
Khaled Kaaniche; Benjamin Champion; Claude Pégard; Pascal Vasseur
This paper presents a vision system for road traffic surveillance from sequences acquired by an unmanned aerial vehicle (UAV). This UAV is able to follow a path, considered as the surveillance area, defined by a set of ordered GPS points. During the navigation of the UAV, the vision system acquires sequences which are processed in real time in order to detect vehicles. This detection makes it possible to perform a traffic estimation or to track a designated vehicle. The detection of vehicles is based on the spatiotemporal grouping of primitives formulated as a normalized cuts problem. A verification step based on the Dempster-Shafer theory is also proposed in order to recognize the vehicles.
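For illustration, a minimal sketch of one bipartition step of normalized cuts, the grouping formulation mentioned above: primitives are nodes of a graph with a symmetric affinity matrix W, and the sign of the second-smallest generalized eigenvector splits them into two groups. The affinity values are toy placeholders, not the authors' motion/appearance features.

```python
import numpy as np

def normalized_cut_bipartition(W):
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
    _, vecs = np.linalg.eigh(L_sym)          # eigenvalues in ascending order
    y = D_inv_sqrt @ vecs[:, 1]              # second-smallest generalized eigenvector
    return y >= 0                            # boolean group labels

# Toy usage: two primitives that move together, one that does not.
W = np.array([[0.0, 0.9, 0.1],
              [0.9, 0.0, 0.1],
              [0.1, 0.1, 0.0]])
print(normalized_cut_bipartition(W))
```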
Pattern Recognition | 1999
Pascal Vasseur; Claude Pégard; El Mustapha Mouaddib; Laurent Delahoche
In this paper, we propose an application of perceptual organization based on the Dempster–Shafer theory. This method is divided into two parts which respectively rectify segmentation mistakes by restoring the coherence of the segments and detect objects in the scene by forming groups of primitives. We show how we apply the Dempster–Shafer theory, usually used in data fusion, in order to obtain an optimal match between the perceptual organization problem and this tool. We show that, without any prior knowledge and without any threshold, our bottom-up algorithm efficiently detects the different objects even in cluttered environments. Moreover, we demonstrate its robustness and flexibility on indoor and outdoor scenes without any modification of parameters.
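A short sketch of Dempster's rule of combination, the fusion step underlying such grouping decisions. The mass assignments in the usage example are illustrative, not taken from the paper.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions given as dicts {frozenset hypothesis: mass}."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict                 # normalisation by the non-conflicting mass
    return {h: v / k for h, v in combined.items()}

# Two sources weakly agreeing that segments s1 and s2 belong to the same object.
theta = frozenset({"same", "different"})
m1 = {frozenset({"same"}): 0.6, theta: 0.4}
m2 = {frozenset({"same"}): 0.5, frozenset({"different"}): 0.2, theta: 0.3}
print(dempster_combine(m1, m2))
```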
Archive | 2013
Luis Rodolfo García Carrillo; Alejandro Enrique Dzul López; Rogelio Lozano; Claude Pégard
This chapter presents the modeling of a quad-rotor UAV. A general overview of the quad-rotor helicopter and its operation principle is given. Next, the quad-rotor modeling is addressed using two different approaches: Euler–Lagrange and Newton–Euler. How to derive Lagrange's equations from Newton's equations is also shown. Finally, the Newton–Euler modeling of an "X-Flyer" quad-rotor configuration is presented.
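A minimal sketch of the Newton–Euler form of the quad-rotor dynamics under the usual rigid-body assumptions (thrust along the body z-axis, torque-driven rotational dynamics). Parameter values are illustrative placeholders, not the chapter's.

```python
import numpy as np

m, g = 1.2, 9.81                      # mass [kg], gravity [m/s^2] (placeholders)
J = np.diag([0.015, 0.015, 0.026])    # inertia matrix [kg m^2] (placeholder)

def newton_euler_step(p, v, R, omega, thrust, torque, dt):
    """One explicit-Euler step of the translational and rotational dynamics."""
    # Translation: m * a = R @ [0, 0, thrust] - m * g * e3
    a = R @ np.array([0.0, 0.0, thrust]) / m - np.array([0.0, 0.0, g])
    # Rotation: J * omega_dot = torque - omega x (J @ omega)
    omega_dot = np.linalg.solve(J, torque - np.cross(omega, J @ omega))
    # Attitude kinematics: R_dot = R @ skew(omega)
    wx, wy, wz = omega
    skew = np.array([[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]])
    return p + dt * v, v + dt * a, R + dt * (R @ skew), omega + dt * omega_dot
```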
international conference on advanced robotics | 2011
Abdelhamid Rabhi; Mohammed Chadli; Claude Pégard
In this paper, we propose an algorithm based on fuzzy control to ensure the stability of the quadrotor. After giving the nonlinear model of the robot, its representation by a Takagi-Sugeno (T-S) fuzzy model is first discussed. Next, a fuzzy controller is synthesized which guarantees the desired control performance. The controller is designed using numerical tools (linear matrix inequalities, LMIs). The simulation results show the effectiveness and robustness of the proposed method.
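A hedged sketch of the parallel-distributed-compensation idea behind a T-S fuzzy controller: local linear state-feedback gains K_i (which would come from the LMI synthesis in the paper) are blended by the normalized membership functions of the premise variable. The gains and membership functions below are illustrative placeholders, not the paper's design.

```python
import numpy as np

def ts_fuzzy_control(x, premise, gains, memberships):
    """u = -sum_i h_i(premise) * K_i @ x, with the h_i normalised to sum to 1."""
    h = np.array([mu(premise) for mu in memberships])
    h = h / h.sum()                          # normalised firing strengths
    return -sum(hi * (Ki @ x) for hi, Ki in zip(h, gains))

# Two local rules on the roll angle: 'small' and 'large' (placeholder design).
memberships = [lambda a: max(0.0, 1.0 - abs(a) / 0.5),   # small angle
               lambda a: min(1.0, abs(a) / 0.5)]          # large angle
gains = [np.array([[2.0, 0.8]]), np.array([[3.5, 1.2]])]  # K_i placeholders
x = np.array([0.2, -0.1])                                  # [roll, roll rate]
print(ts_fuzzy_control(x, premise=x[0], gains=gains, memberships=memberships))
```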
intelligent robots and systems | 2006
Cédric Demonceaux; Pascal Vasseur; Claude Pégard
Attitude (roll and pitch) is essential data for the navigation of a UAV. Rather than using inertial sensors, we propose a catadioptric vision system allowing a fast, robust and accurate estimation of these angles. We show that the optimization of a sky/ground partitioning criterion, associated with the specific geometric characteristics of the catadioptric sensor, provides very interesting results. Experimental results obtained on real sequences are presented and compared with inertial sensor measurements.
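A rough sketch of what a sky/ground partitioning criterion can look like: for each candidate attitude, pixels back-projected onto the unit sphere are split by the corresponding horizon plane and the split is scored with a Fisher-like separability of the two intensity classes. The grid search, the scoring and the back-projection are assumptions for illustration, not the paper's exact criterion.

```python
import numpy as np

def separability(ints_a, ints_b):
    """Between-class over within-class variance of two intensity sets (np arrays)."""
    within = ints_a.var() + ints_b.var() + 1e-9
    return (ints_a.mean() - ints_b.mean()) ** 2 / within

def best_attitude(rays, intensities, rolls, pitches):
    """rays: Nx3 unit vectors on the sphere; intensities: N gray levels."""
    best, best_score = (0.0, 0.0), -np.inf
    for roll in rolls:
        for pitch in pitches:
            # World vertical in the camera frame for this (roll, pitch) hypothesis.
            v = np.array([np.sin(pitch) * np.cos(roll),
                          -np.sin(roll),
                          np.cos(pitch) * np.cos(roll)])
            sky = rays @ v > 0.0                 # pixels above the horizon plane
            if sky.all() or not sky.any():
                continue
            score = separability(intensities[sky], intensities[~sky])
            if score > best_score:
                best, best_score = (roll, pitch), score
    return best
```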
international conference on robotics and automation | 1999
Cyril Drocourt; Laurent Delahoche; Claude Pégard; Arnaud Clerentin
This paper presents an absolute localization system based on stereoscopic omnidirectional vision. To do so, we use an original perception system which allows our omnidirectional vision sensor SYCLOP to move along a rail. The first part of our study deals with the problem of building the sensorial model with the help of the two stereoscopic omnidirectional images. To solve this problem, we propose an approach based on the fusion of several criteria carried out according to the Dempster-Shafer rules. The second part is devoted to exploiting this sensorial model to localize the robot by matching the sensorial primitives with the environment map. We analyze the performance of our global absolute localization system on several elementary robot moves, in different environments.
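A small sketch of the geometric side of the stereoscopic set-up: the same vertical landmark is observed from two positions a known baseline apart along the rail, and the landmark is triangulated from its two azimuths. The Dempster-Shafer matching that pairs the azimuths is assumed to be done already; the function and values below are illustrative.

```python
import numpy as np

def triangulate(az1, az2, baseline):
    """Landmark (x, y) in the frame of the first viewpoint.

    az1, az2: azimuths [rad] of the landmark from viewpoints 1 and 2,
    with viewpoint 2 displaced by `baseline` along the +x axis.
    """
    denom = np.sin(az2 - az1)
    if abs(denom) < 1e-9:
        raise ValueError("rays are (nearly) parallel, no reliable intersection")
    r1 = baseline * np.sin(az2) / denom      # range from viewpoint 1
    return r1 * np.cos(az1), r1 * np.sin(az1)

# Landmark at (2.0, 1.5) m, rail baseline 0.5 m: recovers the same point.
print(triangulate(np.arctan2(1.5, 2.0), np.arctan2(1.5, 1.5), 0.5))
```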
international conference on robotics and automation | 1998
Laurent Delahoche; Claude Pégard; El Mustapha Mouaddib; Pascal Vasseur
In this article we present a navigation system allowing a mobile robot to be localized in an indoor environment which is only partially known. This system integrates an environment map updating module allowing the mobile robot to estimate the positions of new vertical landmarks along its path. An extended Kalman filter is used on the one hand to estimate the mobile robot's position and on the other hand to extract the observations which will be used to determine the positions of unlisted landmarks. The integration of new landmarks into the global environment map is managed from the covariance matrix associated with each unlisted landmark. We present the experimental results we have obtained with SARAH, our mobile robot.
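For illustration, a sketch of a standard map-augmentation step of this kind: a newly observed landmark is appended to the state, and its covariance is derived from the robot pose covariance and the measurement noise through the Jacobians of the initialisation function. The range-bearing observation model here is a simplification of the paper's omnidirectional observations, and all names are assumptions.

```python
import numpy as np

def add_landmark(x, P, z, R_meas):
    """x = [xr, yr, th, ...landmarks]; z = (range, bearing) of the new landmark."""
    xr, yr, th = x[:3]
    r, b = z
    # Landmark position implied by the current pose estimate.
    lx = xr + r * np.cos(th + b)
    ly = yr + r * np.sin(th + b)
    # Jacobians of (lx, ly) w.r.t. the robot pose and the measurement.
    Gx = np.array([[1.0, 0.0, -r * np.sin(th + b)],
                   [0.0, 1.0,  r * np.cos(th + b)]])
    Gz = np.array([[np.cos(th + b), -r * np.sin(th + b)],
                   [np.sin(th + b),  r * np.cos(th + b)]])
    Pxx = P[:3, :3]
    Pll = Gx @ Pxx @ Gx.T + Gz @ R_meas @ Gz.T     # new landmark covariance
    Pxl = P[:, :3] @ Gx.T                          # cross-covariance with the state
    x_aug = np.concatenate([x, [lx, ly]])
    P_aug = np.block([[P, Pxl], [Pxl.T, Pll]])
    return x_aug, P_aug
```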
intelligent robots and systems | 1997
Laurent Delahoche; Claude Pégard; Bruno Marhic; Pascal Vasseur
In this paper we present a dynamic localization system which allows a mobile robot to navigate autonomously in a structured environment. Our system is based on the use of two sensors: an odometer and an omnidirectional vision system which gives a reference with respect to a set of natural beacons. Our navigation algorithm gives a reliable position estimate thanks to a systematic dynamic resetting. To merge the data obtained, we use the extended Kalman filter. The proposed method allows us to treat efficiently the noise problems linked to the primitive extraction, which contributes to the robustness of our system. Thus, we have developed a reliable and fast navigation system which can deal with the constraints of moving robots in an industrial environment. We give the experimental results obtained from a mission carried out in an a priori known environment.
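A minimal sketch of such an odometry/vision fusion loop: an extended Kalman filter predicts the pose from odometry and corrects it with the measured bearing of a known beacon seen by the omnidirectional sensor. The unicycle motion model, the bearing-only measurement model and the names below are assumptions for illustration, not the paper's exact formulation or tuning.

```python
import numpy as np

def wrap(a):
    return np.arctan2(np.sin(a), np.cos(a))

def predict(x, P, d, dth, Q):
    """Odometry step: travelled distance d and heading change dth."""
    xr, yr, th = x
    x_pred = np.array([xr + d * np.cos(th), yr + d * np.sin(th), wrap(th + dth)])
    F = np.array([[1, 0, -d * np.sin(th)],
                  [0, 1,  d * np.cos(th)],
                  [0, 0,  1]])
    return x_pred, F @ P @ F.T + Q

def update_bearing(x, P, beacon, z, r_var):
    """Correction from the measured bearing z [rad] of a known beacon (x, y)."""
    dx, dy = beacon[0] - x[0], beacon[1] - x[1]
    q = dx * dx + dy * dy
    z_hat = wrap(np.arctan2(dy, dx) - x[2])
    H = np.array([[dy / q, -dx / q, -1.0]])
    S = H @ P @ H.T + r_var
    K = P @ H.T / S
    x_new = x + (K * wrap(z - z_hat)).ravel()
    x_new[2] = wrap(x_new[2])
    return x_new, (np.eye(3) - K @ H) @ P
```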