Franck Ruffier
Aix-Marseille University
Publications
Featured research published by Franck Ruffier.
Proceedings of the National Academy of Sciences of the United States of America | 2013
Dario Floreano; Ramon Pericet-Camara; Stéphane Viollet; Franck Ruffier; Andreas Brückner; Robert Leitel; Wolfgang Buss; M. Menouni; Fabien Expert; Raphaël Juston; Michal Karol Dobrzynski; Géraud L’Eplattenier; Fabian Recktenwald; Hanspeter A. Mallot; Nicolas Franceschini
In most animal species, vision is mediated by compound eyes, which offer lower resolution than vertebrate single-lens eyes, but significantly larger fields of view with negligible distortion and spherical aberration, as well as high temporal resolution in a tiny package. Compound eyes are ideally suited for fast panoramic motion perception. Engineering a miniature artificial compound eye is challenging because it requires accurate alignment of photoreceptive and optical components on a curved surface. Here, we describe a unique design method for biomimetic compound eyes featuring a panoramic, undistorted field of view in a very thin package. The design consists of three planar layers of separately produced arrays, namely, a microlens array, a neuromorphic photodetector array, and a flexible printed circuit board that are stacked, cut, and curved to produce a mechanically flexible imager. Following this method, we have prototyped and characterized an artificial compound eye bearing a hemispherical field of view with embedded and programmable low-power signal processing, high temporal resolution, and local adaptation to illumination. The prototyped artificial compound eye possesses several characteristics similar to the eye of the fruit fly Drosophila and other arthropod species. This design method opens up additional vistas for a broad range of applications in which wide field motion detection is at a premium, such as collision-free navigation of terrestrial and aerospace vehicles, and for the experimental testing of insect vision theories.
international conference on robotics and automation | 2004
Franck Ruffier; Nicolas Franceschini
We have developed a visually based autopilot which is able to make a micro air vehicle (MAV) automatically take off, cruise and land, while reacting adequately to wind disturbances. We built a proof-of-concept, tethered rotorcraft that can travel indoors over an environment composed of contrasting features randomly arranged on the floor. Here we show the feasibility of a visuomotor control loop that acts upon the thrust so as to maintain the optic flow (OF) estimated in the downward direction at a reference value. The sensor involved in this OF regulator is an elementary motion detector (EMD). The functional structure of the EMD was inspired by that of the housefly, which was previously investigated at our laboratory by performing electrophysiological recordings while applying optical microstimuli to single photoreceptor cells of the compound eye. The vision-based autopilot, which we have called OCTAVE (optic flow control system for aerospace vehicles), solves complex problems such as terrain following, controls risky maneuvers such as take-off and landing, and responds appropriately to wind disturbances. All these reputedly demanding tasks are performed with one and the same visuomotor control loop. The non-emissive sensor and simple processing system are particularly suitable for use on MAVs, since the tolerated avionic payload of these micro-aircraft is only a few grams. The OCTAVE autopilot could also help relieve a remote operator of the demanding task of continuously piloting and guiding a UAV, and could provide guidance assistance to pilots of manned aircraft.
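The OF-regulation principle described in this abstract (hold the downward optic flow at a set-point by acting on thrust) can be sketched in a few lines. This is a minimal one-dimensional illustration under assumed gains and first-order height dynamics, not the published OCTAVE controller:

```python
# Minimal sketch of an optic-flow (OF) regulator. The ventral OF of a
# craft flying forward at speed v over flat ground at height h is
# omega = v / h; the loop commands a climb/descent rate so that omega
# tracks a set-point. Gains, time step, and dynamics are illustrative
# assumptions.

def ventral_of(v_forward, height):
    """Ventral optic flow (rad/s): omega = v / h."""
    return v_forward / height

def simulate(v_forward=2.0, h0=3.0, of_setpoint=1.0, k=2.0,
             dt=0.01, steps=2000):
    """Integrate the height under proportional OF feedback.

    If the measured OF is below the set-point, the craft is too high
    and descends; if above, it climbs. Returns the final height,
    which settles at v_forward / of_setpoint.
    """
    h = h0
    for _ in range(steps):
        error = of_setpoint - ventral_of(v_forward, h)
        h -= k * error * dt  # positive error (OF too low) -> descend
    return h
```

With the default values the craft settles at h = v/ω* = 2 m; doubling the forward speed doubles the cruise height, which is the terrain-following behaviour described above, obtained without any explicit height or speed measurement.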
international symposium on circuits and systems | 2003
Franck Ruffier; Stéphane Viollet; S. Amic; Nicolas Franceschini
In the framework of our research on biologically inspired microrobotics, we have developed a visually based autopilot for micro air vehicles (MAVs), which we have called OCTAVE (optical altitude control system for autonomous vehicles). Here, we show the feasibility of a joint altitude and speed control system based on a low-complexity optronic velocity sensor that estimates the optic flow in the downward direction. This velocity sensor draws on electrophysiological findings on the fly's elementary motion detectors (EMDs) obtained at our laboratory. We built an elementary, 100-gram tethered helicopter system that carries out terrain following above a randomly textured ground. The overall processing system is light enough to be mounted on board MAVs with an avionic payload of only a few grams.
Autonomous Robots | 2008
Julien Serres; D. Dray; Franck Ruffier; Nicolas Franceschini
In our project on the autonomous guidance of Micro-Air Vehicles (MAVs) in confined indoor and outdoor environments, we have developed a vision-based autopilot, with which a miniature hovercraft travels along a corridor by automatically controlling both its speed and its clearance from the walls. A hovercraft is an air vehicle endowed with natural roll and pitch stabilization characteristics, in which planar flight control systems can be developed conveniently. Our hovercraft is fully actuated by two rear and two lateral thrusters. It travels at a constant altitude (∼2 mm) and senses the environment by means of two lateral eyes that measure the right and left optic flows (OFs). The visuo-motor control system, which is called LORA III (Lateral Optic flow Regulation Autopilot, Mark III), is a dual OF regulator consisting of two intertwined feedback loops, each of which has its own OF set-point and controls the vehicle’s translation in one degree of freedom (surge or sway). Our computer-simulated experiments show that the hovercraft can navigate along a straight or tapered corridor at a relatively high speed (up to 1 m/s). It also reacts to any major step perturbations in the lateral OF (provided by a moving wall) and to any disturbances caused by a tapered corridor. The minimalistic visual system (comprising only 4 pixels) suffices for the hovercraft to control both its clearance from the walls and its forward speed jointly, without ever measuring speed and distance. The non-emissive visual sensors and the simple control system developed here are suitable for use on MAVs with a permissible avionic payload of only a few grams. This study also accounts quantitatively for previous ethological findings on honeybees flying freely in a straight or tapered corridor.
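The dual OF regulator can be illustrated with a toy planar simulation. The pairing of the two loops used here (surge servoed on the summed OF, sway servoed on the larger OF) as well as the gains, set-points, and first-order dynamics are our own illustrative assumptions, not the LORA III design:

```python
# Toy simulation of a dual optic-flow regulator for corridor travel.
# Neither speed nor distance is ever measured; only the two lateral
# optic flows are fed back.

def simulate_corridor(D=1.0, v0=0.5, y0=0.4,
                      of_fwd_sp=2.0, of_side_sp=1.5,
                      kv=1.0, ky=0.5, dt=0.01, steps=3000):
    """Hovercraft at forward speed v, at distance y from the left wall
    of a corridor of width D; the two lateral eyes see optic flows
    v/y (left) and v/(D-y) (right). Returns the final (v, y)."""
    v, y = v0, y0
    for _ in range(steps):
        of_left, of_right = v / y, v / (D - y)
        # Surge loop: forward speed settles where the sum of the two
        # lateral OFs equals its set-point.
        v += kv * (of_fwd_sp - (of_left + of_right)) * dt
        # Sway loop: clearance from the nearer wall is servoed so that
        # the larger of the two OFs equals its set-point.
        e_side = of_side_sp - max(of_left, of_right)
        y += (-ky if of_left >= of_right else ky) * e_side * dt
    return v, y
```

At steady state the larger OF sits at its 1.5 rad/s set-point and the summed OF at 2.0 rad/s, which with these numbers corresponds to wall-following at a quarter of the corridor width: speed and clearance are both by-products of the two OF set-points.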
Naturwissenschaften | 2008
Julien Serres; Guillaume P. Masson; Franck Ruffier; Nicolas Franceschini
In an attempt to better understand the mechanism underlying lateral collision avoidance in flying insects, we trained honeybees (Apis mellifera) to fly through a large (95-cm wide) flight tunnel. We found that, depending on the entrance and feeder positions, honeybees would either center along the corridor midline or fly along one wall. Bees kept following one wall even when a major (150-cm long) part of the opposite wall was removed. These findings cannot be accounted for by the “optic flow balance” hypothesis that has been put forward to explain the typical bees’ “centering response” observed in narrower corridors. Both centering and wall-following behaviors are well accounted for, however, by a control scheme called the lateral optic flow regulator, i.e., a feedback system that strives to maintain the unilateral optic flow constant. The power of this control scheme is that it would allow the bee to guide itself visually in a corridor without having to measure its speed or distance from the walls.
Journal of Field Robotics | 2011
Fabien Expert; Stéphane Viollet; Franck Ruffier
Considerable attention has been paid during the past decade to navigation systems based on the use of visual optic flow cues. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots designed to travel under specific lighting conditions. Many algorithms based on conventional cameras or custom-made sensors are used nowadays to process visual motion. In this paper, we assess the reliability of our optical sensors, which measure the local one-dimensional angular speed of the visual scene beneath robots flying outdoors, in terms of accuracy, range, refresh rate, and sensitivity to illuminance variations. We have designed, constructed, and characterized two miniature custom-made visual motion sensors: (i) the APIS (adaptive pixels for insect-based sensors) local motion sensor, built around an array custom-made in Very-Large-Scale Integration (VLSI) technology and equipped with Delbruck-type autoadaptive pixels, and (ii) the LSC-based local motion sensor (LSC is a component purchased from iC-Haus), built around off-the-shelf linearly amplified photosensors and equipped with an on-chip preamplification circuit. By combining these photodetectors with a low-cost optical assembly and a bio-inspired visual processing algorithm, highly effective miniature sensors were obtained for measuring the visual angular speed in field experiments. The present study focused on the static characteristics and the dynamic responses of these local motion sensors over a wide range of illuminance values, from 50 to 10,000 lux, both indoors and outdoors. Although outdoor experiments are of primary interest for equipping micro-air vehicles with visual motion sensors, we also performed indoor experiments for comparison. The LSC-based visual motion sensor was found to be more accurate in a narrow, 1.5-decade illuminance range, whereas the APIS-based visual motion sensor was more robust to illuminance changes over a larger, 3-decade range. The method presented in this study provides a new benchmark test for thoroughly characterizing visual motion and optic flow sensors designed to operate outdoors under various lighting conditions, in unknown environments where future micro-aerial vehicles will be able to navigate safely.
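The kind of local motion sensor characterized above can be illustrated by the "time of travel" principle: two neighbouring photoreceptors separated by a known inter-receptor angle see the same contrast edge with a delay, and the local angular speed is that angle divided by the delay. The sketch below is our own toy version operating on synthetic signals, not the APIS or LSC processing chain, which filters the photoreceptor signals before thresholding:

```python
# Toy "time of travel" angular-speed measurement from two
# photoreceptor signal trains sampled at interval dt.

def first_crossing(samples, threshold, dt):
    """Time at which a photoreceptor signal first exceeds a threshold."""
    for i, s in enumerate(samples):
        if s > threshold:
            return i * dt
    return None

def local_angular_speed(ph1, ph2, delta_phi_deg, threshold=0.5, dt=1e-3):
    """Time-of-travel estimate: omega = delta_phi / (t2 - t1)."""
    t1 = first_crossing(ph1, threshold, dt)
    t2 = first_crossing(ph2, threshold, dt)
    if t1 is None or t2 is None or t2 <= t1:
        return None  # no edge, or motion in the anti-preferred direction
    return delta_phi_deg / (t2 - t1)

# Synthetic contrast edge crossing the two photoreceptors 50 ms apart;
# with an inter-receptor angle of 4 degrees this is 80 deg/s.
ph1 = [0.0] * 100 + [1.0] * 200
ph2 = [0.0] * 150 + [1.0] * 150
```

The measurement is purely local and one-dimensional, which is what makes such sensors so light and so cheap to characterize over wide illuminance ranges.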
Proceedings of SPIE | 2003
Franck Ruffier; Nicolas Franceschini
We have developed a visually based autopilot for Micro Air Vehicles (MAVs), which we have called OCTAVE (Optical altitude Control sysTem for Autonomous VEhicles). First we built a miniature MAV and an indoor test-bed. The mini-helicopter is tethered to a whirling arm and rotates around a central pole equipped with ground-truth positioning sensors for experimental evaluation. The 100-gram rotorcraft lifts itself by means of a single rotor that can also be tilted forward (pitch) to give the craft a horizontal thrust component (propulsive force). The helicopter’s eye is automatically oriented downwards over an environment composed of contrasting features randomly arranged on the floor. Here we show the feasibility of a ground-avoidance system based on a low-complexity opto-electronic system. The latter relies on an Elementary Motion Detector (EMD) that estimates the optic flow in the downward direction. The EMD functional structure is directly inspired by that of the fly’s EMDs, the functional scheme of which was elucidated at our laboratory by performing electrophysiological recordings while applying optical microstimuli to the retina. The OCTAVE autopilot makes the aircraft capable of effective terrain following at various speeds: the MAV performs reproducible maneuvers such as smooth cruise flight over a planar ground and hill climbing. The overall processing electronics is very lightweight, which makes it highly suitable for mounting on board micro air vehicles with an avionic payload of only a few grams.
intelligent robots and systems | 2008
Franck Ruffier; Nicolas Franceschini
We have developed a new vision-based autopilot able to make a micro-air vehicle automatically navigate over steep relief. It uses onboard optic flow sensors inspired by the housefly’s Elementary Motion Detectors (EMDs), which were previously investigated at our laboratory with electrophysiological and micro-optical techniques. The paper investigates how the ground-avoidance performance of the former OCTAVE robot could be enhanced to cope with steep relief. The idea is to combine frontal and ventral OF sensors and to merge feedback and feedforward loops. In the new robot, called OCTAVE(2), a feedback loop adjusts the lift so as to keep the ventral OF constant, while a feedforward loop based on a forward-looking EMD sensor serves to anticipate the steep relief. We test the new autopilot on a 100-gram tethered rotorcraft that circles indoors over an environment composed of contrasting features randomly arranged on the floor. We show that OCTAVE(2) succeeds in following a relatively steep relief (maximum slope 17°) while navigating close to the ground (ground height in the order of 1 m). This risky task is performed thanks to a minimalist electronic visual system: the OF sensor suite is lightweight (4.3 grams including the lenses), and is therefore mounted onboard.
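The feedback/feedforward merge can be sketched as a single lift command. The structure (ventral-OF feedback plus a forward-looking feedforward term) follows the description above, but the gains, trim value, and units are illustrative assumptions rather than the OCTAVE(2) control law:

```python
# Sketch of a lift command merging a ventral-OF feedback loop with a
# frontal-OF feedforward term that anticipates rising terrain.

def lift_command(of_ventral, of_frontal, of_setpoint=1.0,
                 k_fb=2.0, k_ff=1.5, lift_trim=9.81):
    """Vertical lift command (arbitrary units).

    Feedback term: positive when the ventral OF exceeds its set-point,
    i.e. the ground below is expanding too fast, so the craft climbs.
    Feedforward term: positive whenever the forward-looking EMD
    reports optic flow, i.e. terrain is rising ahead, so the craft
    starts climbing before the ventral loop would react.
    """
    feedback = k_fb * (of_ventral - of_setpoint)
    feedforward = k_ff * of_frontal
    return lift_trim + feedback + feedforward
```

At cruise (ventral OF on its set-point, no frontal OF) the command reduces to the trim lift; a rising slope shows up first in the frontal term and only later in the ventral one, which is what allows steeper relief to be cleared.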
International Journal of Advanced Robotic Systems | 2006
Julien Serres; Franck Ruffier; Stéphane Viollet; Nicolas Franceschini
In our ongoing project on the autonomous guidance of Micro-Air Vehicles (MAVs) in confined indoor and outdoor environments, we have developed a bio-inspired optic flow based autopilot enabling a hovercraft to travel safely and avoid the walls of a corridor. The hovercraft is an air vehicle endowed with natural roll and pitch stabilization characteristics, in which planar flight control can be developed conveniently. It travels at a constant ground height (∼2 mm) and senses the environment by means of two lateral eyes that measure the right and left optic flows (OFs). The visuomotor feedback loop, which is called LORA(1) (Lateral Optic flow Regulation Autopilot, Mark 1), consists of a lateral OF regulator that adjusts the hovercraft’s yaw velocity so as to keep the OF on one wall constant and equal to an OF set-point. Simulations have shown that the hovercraft manages to navigate in a corridor at a “preset” groundspeed (1 m/s) without requiring a supervisor to make it switch abruptly between the control laws corresponding to behaviours such as automatic wall-following, automatic centring, and automatically reacting to an opening encountered on a wall. The passive visual sensors and the simple control system used here are suitable for use on MAVs with an avionic payload of only a few grams.
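A unilateral OF regulator acting on yaw can be sketched as a toy planar simulation. The geometry (OF approximated as v/y on the followed wall), the damping term, and all gains are our illustrative assumptions, not the LORA(1) control law:

```python
import math

def simulate_wall_following(v=1.0, of_sp=2.0, k_psi=1.0, k_damp=4.0,
                            y0=1.0, dt=0.01, steps=3000):
    """Planar wall-following: y is the distance to the followed wall,
    psi the heading angle (positive = steering away from the wall).

    The yaw loop turns toward the wall when the unilateral OF is below
    its set-point and away when it is above; a damping term on psi
    keeps the approach from oscillating. Returns the final (y, psi).
    """
    y, psi = y0, 0.0
    for _ in range(steps):
        of_wall = v / y                          # unilateral optic flow
        dpsi = -k_psi * (of_sp - of_wall) - k_damp * psi
        psi += dpsi * dt
        y += v * math.sin(psi) * dt
    return y, psi
```

The craft settles parallel to the wall at y = v/ω* (0.5 m with these numbers) without measuring either speed or distance; raising the forward speed shifts the followed clearance outward in proportion, as the lateral OF regulation scheme predicts.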
international conference on robotics and automation | 2010
Florent Valette; Franck Ruffier; Stéphane Viollet; Tobias Seidl
Autonomous landing on unknown extraterrestrial bodies requires fast, noise-resistant motion processing to elicit appropriate steering commands. Flying insects excel at visual motion sensing, coping with highly parallel data at a low energy cost by means of dedicated motion-processing circuits. Results obtained in neurophysiological, behavioural, and biorobotic studies on insect flight control were used to safely land a spacecraft on the Moon in a simulated environment. ESA’s Advanced Concepts Team has identified autonomous lunar landing as a relevant situation for testing the potential applications of innovative bio-inspired visual guidance systems to space missions. Biomimetic optic flow-based strategies for controlling automatic landing were tested in a very realistic simulated Moon environment. Visual information was provided by the PANGU software program and used to regulate the optic flow generated during the landing of a two-degrees-of-freedom spacecraft. The results of the simulation showed that a single elementary motion detector coupled to a regulator robustly controlled the autonomous descent and approach of the simulated Moon lander. The “low gate”, located approximately 10 m above the ground, was reached with acceptable vertical and horizontal speeds of 4 m/s and 5 m/s, respectively. It was also established that optic flow sensing methods can successfully cope with temporary sensor blinding and poor lighting conditions.
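The benefit of regulating optic flow during descent can be seen in a toy one-dimensional profile. The sketch below is our own simplification (fixed flight-path angle, horizontal speed slaved to the OF set-point), not the paper's two-degrees-of-freedom lander model:

```python
import math

def descent_profile(h0=100.0, omega_sp=0.05, theta_deg=10.0,
                    dt=0.1, steps=2000):
    """Height history of a lander whose ventral OF (v_h / h) is held
    at omega_sp while descending along a fixed flight-path angle.

    Because v_h = omega_sp * h, the sink rate is proportional to the
    height, and the approach decays geometrically toward touchdown.
    """
    tan_theta = math.tan(math.radians(theta_deg))
    h = h0
    heights = [h]
    for _ in range(steps):
        v_h = omega_sp * h          # horizontal speed set by the OF law
        v_z = v_h * tan_theta       # fixed flight-path angle
        h -= v_z * dt
        heights.append(h)
    return heights
```

Both speeds shrink together with the height, which is what makes the final approach gentle without any dedicated height or speed sensor: holding the OF constant automatically brakes the lander as the ground nears.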