Julien Serres
Aix-Marseille University
Publications
Featured research published by Julien Serres.
BMC Neuroscience | 2007
Philippe Pinel; Bertrand Thirion; Sébastien Mériaux; Antoinette Jobert; Julien Serres; Denis Le Bihan; Jean-Baptiste Poline; Stanislas Dehaene
Background: Although cognitive processes such as reading and calculation are associated with reproducible cerebral networks, inter-individual variability is considerable. Understanding the origins of this variability will require the elaboration of large multimodal databases compiling behavioral, anatomical, genetic and functional neuroimaging data over hundreds of subjects. With this goal in mind, we designed a simple and fast acquisition procedure based on a 5-minute functional magnetic resonance imaging (fMRI) sequence that can be run as easily and as systematically as an anatomical scan, and is therefore used in every subject undergoing fMRI in our laboratory. This protocol captures the cerebral bases of auditory and visual perception, motor actions, reading, language comprehension and mental calculation at an individual level.
Results: 81 subjects were successfully scanned. Before describing inter-individual variability, we demonstrate in the present study the reliability of the individual functional data obtained with this short protocol. Given the anatomical variability, we then needed to describe individual functional networks correctly in a voxel-free space. We therefore applied non-voxel-based methods that automatically extract the main features of individual activation patterns: group analyses performed on these individual data not only converge with those obtained by a more conventional voxel-based random-effects analysis, but also retain information about the variance in location and degree of activation across subjects.
Conclusion: This collection of individual fMRI data will help describe the inter-subject variability of the cerebral correlates of language, calculation and sensorimotor tasks. In association with demographic, anatomical, behavioral and genetic data, this protocol will serve as the cornerstone for a hybrid database of hundreds of subjects, suitable for studying the range and causes of variation in the cerebral bases of numerous mental processes.
Autonomous Robots | 2008
Julien Serres; D. Dray; Franck Ruffier; Nicolas Franceschini
In our project on the autonomous guidance of Micro-Air Vehicles (MAVs) in confined indoor and outdoor environments, we have developed a vision-based autopilot with which a miniature hovercraft travels along a corridor by automatically controlling both its speed and its clearance from the walls. A hovercraft is an air vehicle endowed with natural roll and pitch stabilization characteristics, in which planar flight control systems can be developed conveniently. Our hovercraft is fully actuated by two rear and two lateral thrusters. It travels at a constant altitude (∼2 mm) and senses the environment by means of two lateral eyes that measure the right and left optic flows (OFs). The visuo-motor control system, which is called LORA III (Lateral Optic flow Regulation Autopilot, Mark III), is a dual OF regulator consisting of two intertwined feedback loops, each of which has its own OF set-point and controls the vehicle’s translation in one degree of freedom (surge or sway). Our computer-simulated experiments show that the hovercraft can navigate along a straight or tapered corridor at relatively high speeds (up to 1 m/s). It also reacts to major step perturbations in the lateral OF (provided by a moving wall) and to the disturbances caused by a tapered corridor. This minimalistic visual system (comprising only 4 pixels) suffices for the hovercraft to control both its clearance from the walls and its forward speed jointly, without ever measuring speed or distance. The non-emissive visual sensors and the simple control system developed here are suitable for use on MAVs with a permissible avionic payload of only a few grams. This study also accounts quantitatively for previous ethological findings on honeybees flying freely in a straight or tapered corridor.
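The dual-regulator structure lends itself to a compact illustration. The Python sketch below is our own rough rendering, not the authors’ controller: it assumes the surge loop holds the sum of the two lateral OFs at one set-point while the sway loop holds the larger (nearer-wall) OF at another, with purely proportional action; all gains, set-points and sign conventions are hypothetical.

    # Minimal sketch of a dual lateral optic-flow (OF) regulator in the
    # spirit of LORA III. All numerical values and the exact OF
    # combinations used by each loop are illustrative assumptions.
    OF_SUM_SETPOINT = 4.6       # rad/s, hypothetical surge set-point
    OF_MAX_SETPOINT = 2.8       # rad/s, hypothetical sway set-point
    K_SURGE, K_SWAY = 0.5, 0.5  # hypothetical proportional gains

    def dual_of_regulator(of_left, of_right):
        """Return (forward_thrust_cmd, side_thrust_cmd) from the two lateral OFs."""
        # Surge loop: too much total OF -> slow down; too little -> speed up.
        forward_cmd = K_SURGE * (OF_SUM_SETPOINT - (of_left + of_right))
        # Sway loop: an excess of OF on the nearer side pushes the vehicle
        # toward the opposite wall (positive command = thrust to the right).
        excess = max(of_left, of_right) - OF_MAX_SETPOINT
        side_cmd = K_SWAY * excess if of_left >= of_right else -K_SWAY * excess
        return forward_cmd, side_cmd

    # Example: the left wall looms faster than the right one.
    print(dual_of_regulator(3.1, 1.2))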
Naturwissenschaften | 2008
Julien Serres; Guillaume P. Masson; Franck Ruffier; Nicolas Franceschini
In an attempt to better understand the mechanism underlying lateral collision avoidance in flying insects, we trained honeybees (Apis mellifera) to fly through a large (95-cm wide) flight tunnel. We found that, depending on the entrance and feeder positions, honeybees would either center along the corridor midline or fly along one wall. Bees kept following one wall even when a major (150-cm long) part of the opposite wall was removed. These findings cannot be accounted for by the “optic flow balance” hypothesis that has been put forward to explain the bees’ typical “centering response” observed in narrower corridors. Both centering and wall-following behaviors are well accounted for, however, by a control scheme called the lateral optic flow regulator, i.e., a feedback system that strives to keep the unilateral optic flow constant. The power of this control scheme is that it would allow the bee to guide itself visually along a corridor without having to measure its speed or its distance from the walls.
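The geometry behind this scheme can be stated in one line (a standard optic-flow relation, not an analysis taken from this paper): for pure translation at speed V parallel to a wall at distance D, the lateral optic flow is ω = V/D. A regulator that holds the unilateral ω at a set-point ω_set therefore makes the bee settle at a clearance D = V/ω_set from the followed wall, with no explicit measurement of either V or D; centering then appears as the special case in which both walls happen to satisfy the same constraint at once.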
International Journal of Advanced Robotic Systems | 2006
Julien Serres; Franck Ruffier; Stéphane Viollet; Nicolas Franceschini
In our ongoing project on the autonomous guidance of Micro-Air Vehicles (MAVs) in confined indoor and outdoor environments, we have developed a bio-inspired, optic flow based autopilot enabling a hovercraft to travel safely and avoid the walls of a corridor. The hovercraft is an air vehicle endowed with natural roll and pitch stabilization characteristics, in which planar flight control can be developed conveniently. It travels at a constant ground height (∼2 mm) and senses the environment by means of two lateral eyes that measure the right and left optic flows (OFs). The visuomotor feedback loop, which is called LORA(1) (Lateral Optic flow Regulation Autopilot, Mark 1), consists of a lateral OF regulator that adjusts the hovercraft’s yaw velocity so as to keep the lateral OF generated by one wall constant and equal to an OF set-point. Simulations have shown that the hovercraft manages to navigate in a corridor at a “preset” groundspeed (1 m/s) without requiring a supervisor to make it switch abruptly between control laws corresponding to behaviours such as automatic wall-following, automatic centring, and automatic reaction to an opening encountered on a wall. The passive visual sensors and the simple control system used here are suitable for use on MAVs with an avionic payload of only a few grams.
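Because LORA(1) acts on a single degree of freedom, the regulator reduces to a one-line control law. The snippet below is an illustrative proportional form with hypothetical values, not the paper’s actual controller; it assumes the followed wall is on one side and that a positive command turns the vehicle away from it.

    # Sketch of a unilateral OF regulator acting on yaw, LORA(1)-style.
    OF_SETPOINT = 2.0  # rad/s, hypothetical
    K_YAW = 0.8        # hypothetical proportional gain

    def yaw_rate_command(of_wall):
        """Positive output = turn away from the followed wall (assumed convention)."""
        # Too much OF (wall too close for the current speed) -> steer away;
        # too little -> steer back toward the followed wall.
        return K_YAW * (of_wall - OF_SETPOINT)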
Archive | 2009
Nicolas Franceschini; Franck Ruffier; Julien Serres; Stéphane Viollet
Insects and birds have been in the sensory-motor control business for more than 100 million years. The manned aircraft developed over the last 100 years rely on a lift similar to that generated by birds’ wings. Aircraft designers have paid little attention, however, to the pilot’s visual sensor that finely controls these wings, although it is definitely the most sophisticated avionic sensor ever known to exist. For this reason, the thinking that prevails in the field of aeronautics does not help us much in grasping the visuo-motor control laws that animals and humans bring into play to control their flight. To control an aircraft, it has been deemed essential to measure state variables such as barometric altitude, ground height, groundspeed, and descent speed. Yet the sensors developed for this purpose (usually emissive sensors such as Doppler radars, radar altimeters, or forward-looking infrared sensors) are far too cumbersome for insects or even birds to carry and to power. Natural flyers must therefore have developed other systems for controlling their flight. Flying insects are agile creatures that navigate swiftly through the most unpredictable environments. Equipped with “only” about one million neurons and only 3000 pixels in each eye, the housefly, for example, achieves 3D navigation at an impressive 700 body lengths per second. The lightness of the processing system at work onboard a fly makes us turn pale when we realize that this creature actually achieves just what is being sought in the field of aerial robotics: dynamic stabilization, 3D autonomous navigation, ground avoidance, collision avoidance with stationary and nonstationary obstacles, tracking, docking, autonomous takeoff and landing, etc. Houseflies add insult to injury by being able to land gracefully on ceilings. The last seven decades have provided evidence that flying insects guide themselves through their environments by processing the optic flow (OF) that is generated on their eyes as a consequence of their locomotion. In the animal’s reference frame, the translational OF is the angular speed ω at which contrasting objects in the environment move past the animal (Kennedy, 1939; Gibson, 1950; Lee, 1980; Koenderink, 1986). In the present chapter, we summarize our attempts to model the visuo-motor control system that provides flying insects with a means of autonomous guidance at close range. The aim of these studies was not (yet) to produce a detailed neural circuit but rather to obtain a more functional overall picture, that is, a picture that abstracts some basic control principles.
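Since the chapter leans entirely on this definition of the translational OF, it is worth writing out the standard geometric relation behind it (a textbook formula, not a result of the chapter): a contrasting feature at distance r, viewed at an angle Φ from the direction of translation at speed V, sweeps past the eye at the angular speed ω = (V/r)·sin Φ. The OF is thus maximal abeam (Φ = 90°, where ω = V/r) and vanishes straight ahead, and it confounds speed and distance in a single ratio — precisely the quantity that the OF regulators described in the abstracts above hold constant.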
IEEE International Conference on Biomedical Robotics and Biomechatronics | 2006
Julien Serres; Franck Ruffier; Nicolas Franceschini
In our project on the autonomous guidance of micro-air vehicles (MAVs) in confined indoor and outdoor environments, we have developed a bio-inspired, optic flow based autopilot with which the speed of a miniature hovercraft is controlled and the walls of a straight or tapered corridor are safely avoided. A hovercraft is an air vehicle endowed with natural roll and pitch stabilization characteristics, in which planar flight control can be developed conveniently. Our own hovercraft is fully actuated by two rear and two lateral thrusters. It travels at a constant ground height (~2 mm) and senses the environment by means of two lateral eyes that measure the right and left optic flows (OFs). The complete visuo-motor control system, which is called LORA(2) (lateral optic flow regulation autopilot), consists of two lateral OF regulators sharing a single OF set-point: (i) the first lateral OF regulator adjusts the hovercraft’s forward thrust so as to keep the mean of the two (right and left) measured OFs equal to the set-point; (ii) the second lateral OF regulator controls the hovercraft’s side-slip thrust so as to keep the measured unilateral OF equal to that same set-point. Interestingly, this makes the distance to the left (DL) or right (DR) wall proportional to the forward speed Vx determined in (i): the faster the hovercraft is travelling, the further away from the walls it will be. Simulations have shown that the hovercraft manages to navigate in a straight or tapered corridor at speeds of up to 1 m/s, although it has only a minimalistic visual system. The passive visual sensors and the simple control system used here are suitable for use on MAVs with an avionic payload of only a few grams. A major outcome of this work is that the LORA(2) autopilot makes the hovercraft navigate without any need for range sensors or speed sensors.
Journal of Bionic Engineering | 2015
Julien Serres; Franck Ruffier
A bio-inspired autopilot is presented, in which body-saccadic and intersaccadic systems are combined. This autopilot enables a simulated hovercraft to travel along corridors comprising L-junctions and U-shaped and S-shaped turns, relying on minimalistic motion-vision cues alone, without measuring its speed or its distance from the walls, in much the same way as flies and bees manage their flight in similar situations. The saccadic system, responsible for avoiding frontal collisions, triggers yawing body saccades with appropriately quantified angles based simply on a few local optic flow measurements giving the angle of incidence with respect to a frontal wall. The simulated robot negotiates sharp bends by triggering body saccades that realign its trajectory, thus travelling parallel with the wall along a corridor comprising sharp turns. Direct comparison shows that the performance of this new body-saccade-based autopilot closely resembles the behavior of a fly using a similar body-saccade strategy when flying along a corridor with an S-shaped turn, despite the huge differences between the two in terms of inertia.
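To make the saccade generator concrete, here is a deliberately simplified sketch of the pattern just described; it is our own illustration, not the paper’s algorithm. It assumes the incidence angle can be recovered from the asymmetry of two frontolateral OF measurements and that the saccade amplitude is chosen to leave the robot parallel to the frontal wall; the trigger threshold and the incidence formula are hypothetical.

    import math

    # Illustrative body-saccade generator triggered by frontal optic flow.
    OF_TRIGGER = 3.5  # rad/s, hypothetical frontal-OF threshold

    def saccade_angle(of_front_left, of_front_right):
        """Return a yaw saccade in radians (0.0 = no saccade; positive = turn left)."""
        if max(of_front_left, of_front_right) < OF_TRIGGER:
            return 0.0  # no imminent frontal collision: stay in intersaccadic mode
        # Hypothetical incidence estimate: symmetric OFs mean a head-on approach
        # (90 deg incidence); strong asymmetry means a grazing approach.
        asymmetry = (of_front_left - of_front_right) / (of_front_left + of_front_right)
        incidence = (math.pi / 2.0) * (1.0 - abs(asymmetry))
        # Turn away from the side producing the larger OF, by the incidence
        # angle, so that the new heading runs parallel to the wall.
        return -incidence if of_front_left > of_front_right else incidence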
Sensors | 2017
Erik Vanhoutte; Stefano Mafrica; Franck Ruffier; Reinoud J. Bootsma; Julien Serres
For use in autonomous micro air vehicles, visual sensors must not only be small, lightweight and insensitive to light variations; on-board autopilots also require fast and accurate optical flow measurements over a wide range of speeds. Using a bio-inspired Michaelis–Menten Auto-adaptive Pixel (M2APix) analog silicon retina, we present in this article comparative tests of two optical flow calculation algorithms operating under lighting conditions from 6×10⁻⁷ to 1.6×10⁻² W·cm⁻² (i.e., from 0.2 to 12,000 lux for human vision). The contrast “time of travel” between two adjacent light-sensitive pixels was determined either by thresholding or by cross-correlating the two pixels’ signals, at measurement frequencies of up to 5 kHz for the 10 local motion sensors of the M2APix sensor. While both algorithms adequately measured optical flow between 25°/s and 1000°/s, thresholding yielded lower precision, mainly because of a larger number of outliers at higher speeds. Compared to thresholding, cross-correlation also provided a higher optical flow output rate (1195 Hz versus 99 Hz) but required substantially more computational resources.
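The “time of travel” principle itself fits in a few lines. The sketch below is our own illustration of the cross-correlation variant, with an assumed inter-pixel viewing angle and sampling rate (not the sensor’s actual parameters): the delay at which two adjacent photoreceptor signals best match is converted into an angular speed.

    import numpy as np

    # "Time of travel" optic-flow estimation by cross-correlation.
    DELTA_PHI_DEG = 4.0  # assumed angle between the two pixels' optical axes
    FS_HZ = 5000.0       # assumed sampling frequency

    def of_from_time_of_travel(sig_a, sig_b):
        """Angular speed (deg/s) from two adjacent pixel signals (1-D arrays)."""
        a = (sig_a - sig_a.mean()) / (sig_a.std() + 1e-12)  # normalize both signals
        b = (sig_b - sig_b.mean()) / (sig_b.std() + 1e-12)
        xcorr = np.correlate(b, a, mode="full")      # full cross-correlation
        lag = int(np.argmax(xcorr)) - (len(a) - 1)   # best-matching delay, in samples
        if lag <= 0:
            return float("nan")  # contrast did not travel from pixel a to pixel b
        time_of_travel = lag / FS_HZ                 # seconds taken to cross the angle
        return DELTA_PHI_DEG / time_of_travel        # omega = delta_phi / delta_t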
Flying Insects and Robots | 2009
Nicolas Franceschini; Franck Ruffier; Julien Serres
The explicit control schemes presented here explain how insects may navigate on the sole basis of optic flow (OF) cues without requiring any distance or speed measurements: how they take off and land, follow the terrain, avoid the lateral walls in a corridor, and control their forward speed automatically. The optic flow regulator, a feedback system controlling either the lift, the forward thrust, or the lateral thrust, is described. Three OF regulators account for various insect flight patterns observed over the ground and over still water, under calm and windy conditions, and in straight and tapered corridors. These control schemes were simulated experimentally and/or implemented onboard two types of aerial robots, a micro-helicopter (MH) and a hovercraft (HO), which behaved much like insects when placed in similar environments. These robots were equipped with opto-electronic OF sensors inspired by our electrophysiological findings on houseflies’ motion-sensitive visual neurons. The simple, parsimonious control schemes described here require no conventional avionic devices such as rangefinders, groundspeed sensors, or GPS receivers. They are consistent with the neural repertory of flying insects and meet the low avionic payload requirements of autonomous micro-aerial and space vehicles.
international conference on event based control communication and signal processing | 2016
Julien Serres; Thibaut Raharijaona; Erik Vanhoutte; Franck Ruffier
In view of neuro-ethological findings on honeybees and our previously developed vision-based autopilot, in-silico experiments were performed in which a “simulated bee” was made to travel along a doubly tapering tunnel incorporating, for the first time, event-based controllers. The “simulated bee” was equipped with: a minimalistic compound eye comprising 10 local motion sensors measuring the optic flow magnitude; two optic flow regulators updating the control signals whenever specific optic flow criteria changed; and three event-based controllers taking the error signals into account, each one in charge of its own translational dynamics. A MORSE/Blender-based simulation engine delivered what each of the 20 “simulated photoreceptors” saw in the tunnel, which was lined with high-resolution natural 2D images. The “simulated bee” managed to travel safely along the doubly tapering tunnel without requiring any speed or distance measurements, using only a Gibsonian point of view: it concomitantly adjusted its side thrust, vertical lift and forward thrust whenever a change was detected in the optic flow based error signals, thereby avoiding collisions with the surfaces of the doubly tapering tunnel and decreasing or increasing its speed depending on the clutter rate perceived by its motion sensors.
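“Event-based” here means that each controller recomputes its command only when its error signal actually changes, rather than at a fixed clock rate. A minimal sketch of that pattern follows (our own illustration with a hypothetical dead-band and gain, not the paper’s controller); the simulated bee would use one such controller per translational degree of freedom (surge, sway and heave).

    # Event-based update rule for one translational degree of freedom:
    # the command is recomputed only when the optic-flow error has moved
    # by more than a dead-band. Threshold and gain are assumed values.
    EVENT_THRESHOLD = 0.05  # rad/s, hypothetical dead-band on the OF error
    K = 0.6                 # hypothetical proportional gain

    class EventBasedAxisController:
        def __init__(self):
            self.last_error = 0.0
            self.command = 0.0

        def update(self, of_setpoint, of_measured):
            error = of_setpoint - of_measured
            if abs(error - self.last_error) > EVENT_THRESHOLD:  # event condition
                self.command = K * error
                self.last_error = error
            return self.command  # held constant between events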