Fabien Expert
Aix-Marseille University
Publication
Featured research published by Fabien Expert.
Proceedings of the National Academy of Sciences of the United States of America | 2013
Dario Floreano; Ramon Pericet-Camara; Stéphane Viollet; Franck Ruffier; Andreas Brückner; Robert Leitel; Wolfgang Buss; M. Menouni; Fabien Expert; Raphaël Juston; Michal Karol Dobrzynski; Géraud L’Eplattenier; Fabian Recktenwald; Hanspeter A. Mallot; Nicolas Franceschini
In most animal species, vision is mediated by compound eyes, which offer lower resolution than vertebrate single-lens eyes, but significantly larger fields of view with negligible distortion and spherical aberration, as well as high temporal resolution in a tiny package. Compound eyes are ideally suited for fast panoramic motion perception. Engineering a miniature artificial compound eye is challenging because it requires accurate alignment of photoreceptive and optical components on a curved surface. Here, we describe a unique design method for biomimetic compound eyes featuring a panoramic, undistorted field of view in a very thin package. The design consists of three planar layers of separately produced arrays, namely, a microlens array, a neuromorphic photodetector array, and a flexible printed circuit board that are stacked, cut, and curved to produce a mechanically flexible imager. Following this method, we have prototyped and characterized an artificial compound eye bearing a hemispherical field of view with embedded and programmable low-power signal processing, high temporal resolution, and local adaptation to illumination. The prototyped artificial compound eye possesses several characteristics similar to the eye of the fruit fly Drosophila and other arthropod species. This design method opens up additional vistas for a broad range of applications in which wide field motion detection is at a premium, such as collision-free navigation of terrestrial and aerospace vehicles, and for the experimental testing of insect vision theories.
Journal of Field Robotics | 2011
Fabien Expert; Stéphane Viollet; Franck Ruffier
Considerable attention has been paid during the past decade to navigation systems based on the use of visual optic flow cues. Optic-flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots designed to travel under specific lighting conditions. Many algorithms based on conventional cameras or custom-made sensors are being used nowadays to process visual motion. In this paper, we focus on the reliability of our optical sensors (their accuracy, range, refresh rate, and sensitivity to illuminance variations) when used to measure the local one-dimensional angular speed experienced by robots flying outdoors over a visual scene. We have designed, constructed, and characterized two miniature custom-made visual motion sensors: (i) the APIS (adaptive pixels for insect-based sensors)-based local motion sensor, built around an array custom-made in Very-Large-Scale Integration (VLSI) technology and equipped with Delbrück-type auto-adaptive pixels, and (ii) the LSC-based local motion sensor (LSC is a component purchased from iC-Haus), built around off-the-shelf linearly amplified photosensors equipped with an on-chip preamplification circuit. By combining these photodetectors with a low-cost optical assembly and a bio-inspired visual processing algorithm, highly effective miniature sensors were obtained for measuring the visual angular speed in field experiments. The present study focused on the static characteristics and the dynamic responses of these local motion sensors over a wide range of illuminance values, from 50 to 10,000 lux, both indoors and outdoors. Although outdoor experiments are of greatest interest for equipping micro-air vehicles with visual motion sensors, we also performed indoor experiments for comparison.
The LSC-based visual motion sensor was found to be more accurate in a narrow, 1.5-decade illuminance range, whereas the APIS-based visual motion sensor was more robust to illuminance changes in a larger, 3-decade range. The method presented in this study provides a new benchmark test for thoroughly characterizing visual motion and optic flow sensors designed to operate outdoors under various lighting conditions, in unknown environments where future micro-aerial vehicles will be able to navigate safely.
Bioinspiration & Biomimetics | 2015
Fabien Expert; Franck Ruffier
Two bio-inspired guidance principles involving no reference frame are presented here and were implemented on a rotorcraft equipped with panoramic optic flow (OF) sensors but (as in flying insects) no accelerometer. To test these two guidance principles, we built an 80-gram tethered tandem rotorcraft called BeeRotor, which was tested flying along a high-roofed tunnel. The aerial robot adjusts its pitch, and hence its speed, hugs the ground, and lands safely without any need for an inertial reference frame. The rotorcraft's altitude and forward speed are adjusted via two OF regulators piloting the lift and the pitch angle on the basis of the common-mode and differential rotor speeds, respectively. The robot, equipped with two wide-field OF sensors, was tested in order to assess the performance of the following two guidance systems involving no inertial reference frame: (i) a system with a fixed eye orientation based on the curved artificial compound eye (CurvACE) sensor, and (ii) an active reorientation system based on a quasi-panoramic eye which constantly realigns its gaze, keeping it parallel to the nearest surface followed. Safe automatic terrain following and landing were obtained with CurvACE under dim-light to daylight conditions, and with the active eye-reorientation system over rugged, changing terrain, without any need for an inertial reference frame.
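The dual OF-regulator idea described in this abstract can be sketched as a pair of proportional feedback loops. The gains, setpoints, and the use of the maximum of the ventral/dorsal OF below are illustrative assumptions for the sketch, not BeeRotor's actual tuned controller:

```python
def of_regulators(of_ventral, of_dorsal,
                  of_set_alt=2.5, of_set_speed=4.0,
                  k_lift=0.1, k_pitch=0.05):
    """Minimal proportional sketch of two optic-flow (OF) regulators.

    The larger of the ventral/dorsal OF readings (rad/s) drives the
    common-mode rotor speed (lift, hence altitude), while the sum of
    both drives the differential rotor speed (pitch, hence forward
    speed). All gains and setpoints here are illustrative values.
    """
    of_max = max(of_ventral, of_dorsal)
    # Common-mode command: regulate the nearest-surface OF to its setpoint.
    common_mode = k_lift * (of_set_alt - of_max)
    # Differential command: regulate the summed OF to its setpoint.
    differential = k_pitch * (of_set_speed - (of_ventral + of_dorsal))
    return common_mode, differential
```

With a ventral OF above its setpoint (surface too close or moving too fast), the common-mode command turns negative, reducing lift, while the differential command adjusts pitch to slow the robot down.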
IEEE Sensors | 2011
Frédéric L. Roubieu; Fabien Expert; Marc Boyron; Benoît-Jérémy Fuschlock; Stéphane Viollet; Franck Ruffier
Autopilots for micro aerial vehicles (MAVs) with a maximum permissible avionic payload of only a few grams need lightweight, low-power sensors to be able to navigate safely when flying through unknown environments. To meet these demanding specifications, we developed a simple functional model of an Elementary Motion Detector (EMD) circuit based on the common housefly's visual system. During the last two decades, several insect-based visual motion sensors have been designed and implemented on various robots, and considerable improvements have been made in terms of their mass, size and power consumption. The new lightweight visual motion sensor presented here generates 5 simultaneous neighboring measurements of the 1-D angular speed of a natural scene within a measurement range of more than one decade [25°/s; 350°/s]. Using a new sensory fusion method consisting of computing the median value of the 5 local motion units, we ended up with a more robust, more accurate and more frequently refreshed measurement of the 1-D angular speed.
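The median-based sensory fusion step can be illustrated with a minimal sketch; the `None` handling for units that miss a measurement is an assumption for the example, not part of the published design:

```python
import statistics

def fuse_lms_outputs(lms_measurements):
    """Fuse several local motion sensor (LMS) readings (deg/s) into one
    robust angular-speed estimate by taking their median, which rejects
    outliers produced by individual motion-detector units.

    Assumption: a unit that produced no measurement reports None.
    """
    valid = [m for m in lms_measurements if m is not None]
    if not valid:
        return None  # no unit delivered a measurement this cycle
    return statistics.median(valid)
```

Because the median ignores extreme values, one aberrant unit (480°/s below) barely affects the fused estimate, which is the robustness property the abstract describes.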
Sensors | 2014
Stéphane Viollet; S. Godiot; Robert Leitel; Wolfgang Buss; P. Breugnon; M. Menouni; Raphaël Juston; Fabien Expert; Fabien Colonnier; Géraud L'Eplattenier; Andreas Brückner; Felix Kraze; Hanspeter A. Mallot; Nicolas Franceschini; Ramon Pericet-Camara; Franck Ruffier; Dario Floreano
The demand for bendable sensors is constantly increasing in the challenging field of soft and micro-scale robotics. We present here, in more detail, the flexible, functional, insect-inspired curved artificial compound eye (CurvACE) that was previously introduced in the Proceedings of the National Academy of Sciences (PNAS, 2013). This cylindrically bent sensor, composed of 630 artificial ommatidia covering a large panoramic field of view of 180° × 60°, weighs only 1.75 g and is extremely compact and power-lean (0.9 W), while achieving unique visual motion sensing performance (1950 frames per second) over a five-decade range of illuminance. In particular, this paper details the innovative Very Large Scale Integration (VLSI) sensing layout, the accurate assembly fabrication process, the new fast read-out interface, and the auto-adaptive dynamic response of the CurvACE sensor. Starting from photodetectors and micro-optics on wafer substrates and a flexible printed circuit board, the complete assembly of CurvACE was performed in a planar configuration, ensuring high alignment accuracy and compatibility with state-of-the-art assembly processes. The characteristics of the photodetector of one artificial ommatidium have been assessed in terms of its dynamic response to light steps. We also characterized the local auto-adaptability of the CurvACE photodetectors in response to large illuminance changes: this feature will certainly be of great interest for future applications in real indoor and outdoor environments.
IEEE Sensors Journal | 2013
Frédéric L. Roubieu; Fabien Expert; Guillaume Sabiron; Franck Ruffier
Optic-flow-based autopilots for Micro-Aerial Vehicles (MAVs) need lightweight, low-power sensors to be able to fly safely through unknown environments. The new tiny 6-pixel visual motion sensor presented here meets these demanding requirements in terms of its mass, size, and power consumption. This 1-gram, low-power, fly-inspired sensor accurately gauges visual motion using only its 6-pixel array, tested with two different panoramas and illuminance conditions. The new visual motion sensor's output results from a smart combination of the information collected by several 2-pixel Local Motion Sensors (LMSs), on the basis of the “time of travel” scheme originally inspired by the common housefly's Elementary Motion Detector (EMD) neurons. The proposed sensory fusion method enables the new visual sensor to measure the visual angular speed and determine the main direction of the visual motion without any prior knowledge. By computing the median value of the output from several LMSs, we also end up with a more robust, more accurate, and more frequently refreshed measurement of the one-dimensional angular speed.
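The core of the “time of travel” scheme, estimating angular speed from the delay a contrast feature takes to travel between two adjacent photoreceptors, can be sketched as follows (the interreceptor angle and timings in the example are illustrative, not the sensor's actual parameters):

```python
def time_of_travel_speed(t1, t2, delta_phi_deg):
    """Estimate 1-D angular speed (deg/s) with the 'time of travel' scheme.

    t1, t2: times (s) at which the same contrast feature was detected
            by photoreceptor 1 and photoreceptor 2 of a 2-pixel LMS.
    delta_phi_deg: interreceptor angle (deg), an illustrative constant.

    The sign of the result encodes the direction of motion; a zero
    delay yields no measurement.
    """
    dt = t2 - t1  # time of travel between the two pixels
    if dt == 0:
        return None
    return delta_phi_deg / dt
```

A 1.4° interreceptor angle crossed in 10 ms gives 140°/s; reversing the detection order flips the sign, which is how the direction of motion is recovered.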
international conference on complex medical engineering | 2012
Franck Ruffier; Fabien Expert
In previous studies, we described how complicated tasks such as ground avoidance, terrain following, takeoff and landing can be performed using optic flow sensors mounted on a tethered flying robot called OCTAVE. In the present study, a new programmable visual motion sensor connected to a lightweight Bluetooth module was mounted on a free-flying 50-gram helicopter called TwinCoax. This small helicopter, equipped with 3 IR-reflective markers, was flown in a dedicated room equipped with a VICON system to record its trajectory. The results of this study show that despite the complex, adverse lighting conditions, the optic flow measured onboard matched the ground-truth optic flow generated by the free-flying helicopter's trajectory very closely.
IEEE Sensors | 2012
Fabien Expert; Frédéric L. Roubieu; Franck Ruffier
Insects' optic-flow-based (OF-based) flying abilities provide attractive bio-inspired models for Micro Aerial Vehicles (MAVs) endowed with limited computational power. Most OF-sensing robots developed so far have used numerically complex algorithms requiring large computational resources, often run offline. The present study shows the performance of our bio-inspired Visual Motion Sensor (VMS) based on a 3×4 matrix of auto-adaptive aVLSI photoreceptors belonging to a custom-made bio-inspired chip called APIS (Adaptive Pixels for Insect-based Sensors). To achieve such processing with the limited computational power of a tiny microcontroller (μC), the μC-based implementation of the “time of travel” scheme, which normally requires at least a 1 kHz sampling rate, was modified by linearly interpolating the photoreceptor signals so that the algorithm can run at a lower sampling rate. The accuracy of the measurements was assessed for various sampling rates in simulation, and the best tradeoff between computational load and accuracy, found at 200 Hz, was implemented onboard a tiny μC. By interpolating the photoreceptor signals and fusing the output of several Local Motion Sensors (LMSs), we ended up with an accurate and frequently refreshed VMS measuring a visual angular speed while requiring more than 4 times fewer computational resources.
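The linear-interpolation trick that lets the “time of travel” algorithm run below 1 kHz can be sketched as refining a threshold-crossing time between two successive samples; the sample values and threshold below are illustrative:

```python
def interpolated_crossing_time(t_prev, t_curr, v_prev, v_curr, threshold):
    """Refine a threshold-crossing time by linear interpolation.

    Between two successive samples (t_prev, v_prev) and (t_curr, v_curr)
    of a photoreceptor signal that straddle `threshold`, assume the
    signal is locally linear and solve for the crossing instant. This
    recovers sub-sample timing accuracy, allowing the time-of-travel
    algorithm to run at a lower sampling rate (e.g. 200 Hz instead of
    1 kHz).
    """
    # Fraction of the sampling interval at which the signal crossed.
    frac = (threshold - v_prev) / (v_curr - v_prev)
    return t_prev + frac * (t_curr - t_prev)
```

At a 200 Hz rate (5 ms between samples), a signal rising from 0.2 to 1.0 crosses a 0.6 threshold halfway through the interval, so the estimated crossing time is 2.5 ms rather than being quantized to a full 5 ms step.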
Procedia Computer Science | 2011
Ramon Pericet-Camara; Michal Karol Dobrzynski; Géraud L’Eplattenier; Jean-Christophe Zufferey; Fabien Expert; Raphaël Juston; Franck Ruffier; Nicolas Franceschini; Stéphane Viollet; M. Menouni; S. Godiot; Andreas Brückner; Wolfgang Buss; Robert Leitel; Fabian Recktenwald; Chunrong Yuan; Hanspeter A. Mallot; Dario Floreano
CURVACE aims at designing, developing, and assessing CURVed Artificial Compound Eyes, a radically novel family of vision systems. This innovative approach will provide more efficient visual abilities for embedded applications that require motion analysis in low-power, small packages. Compared to conventional cameras, artificial compound eyes will offer a much larger field of view with negligible distortion and exceptionally high temporal resolution, at a smaller size and weight that will fit the requirements of a wide range of applications.
2017 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS) | 2017
A. Desbiez; Fabien Expert; Marc Boyron; Julien Diperi; Stéphane Viollet; Franck Ruffier
The X-Morf robot is a 380-g quadrotor consisting of two independent arms, each carrying tandem rotors, forming an actuated scissor joint. The X-Morf robot is able to actively change its X-geometry in flight by changing the angle between its two arms. The magnetic and electrical joint between the quadrotor's arms makes them easily removable and resistant to crashes, while providing the propellers with sufficient power and ensuring high-quality signal transmission during flight. The dynamic model on which the X-Morf robot was based was also used to design an adaptive controller. A Model Reference Adaptive Control (MRAC) law was implemented to deal with the uncertainties about the inertia and the center of mass due to the quadrotor's reconfigurable architecture, and for in-flight span-adapting purposes. The tests performed with the X-Morf robot showed that it is able to decrease and increase its span dynamically by up to 28.5% within 0.5 s during flight, while maintaining good stability and attitude-tracking performance.
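As a rough illustration of the MRAC idea (a scalar toy system, not the X-Morf's actual multivariable controller; all dynamics, gains, and the MIT-rule adaptation below are assumptions for the sketch):

```python
def mrac_step(theta, x, x_ref, r, gamma=0.5, a_ref=2.0, dt=0.01):
    """One Euler step of a scalar Model Reference Adaptive Control sketch.

    theta: current adaptive feedback gain
    x:     plant state (e.g. an attitude rate whose dynamics change
           with the reconfigurable geometry)
    x_ref: state of the reference model the plant should imitate
    r:     command input
    gamma, a_ref, dt: adaptation gain, reference-model pole, and time
           step; all illustrative values.

    The gain adapts to drive the tracking error e = x - x_ref to zero
    even when the plant parameters (inertia, center of mass) drift.
    """
    e = x - x_ref                              # tracking error
    theta_new = theta - gamma * e * x * dt     # MIT-rule gain adaptation
    u = theta_new * x + r                      # control with adapted gain
    x_ref_new = x_ref + dt * a_ref * (r - x_ref)  # first-order reference model
    return theta_new, u, x_ref_new
```

The appeal for a morphing quadrotor is that the controller does not need an exact inertia model: as the arm angle changes mid-flight, the adapted gain absorbs the resulting parameter shift.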