Stéphane Bazeille
Istituto Italiano di Tecnologia
Publications
Featured research published by Stéphane Bazeille.
international conference on robotics and automation | 2014
Alexander W. Winkler; Ioannis Havoutis; Stéphane Bazeille; Jesús Ortiz; Michele Focchi; Rüdiger Dillmann; Darwin G. Caldwell; Claudio Semini
We present a framework for quadrupedal locomotion over highly challenging terrain where the choice of appropriate footholds is crucial for the success of the behaviour. We use a path planning approach which shares many similarities with the results of the DARPA Learning Locomotion challenge and extend it to allow more flexibility and increased robustness. During execution we incorporate an on-line force-based foothold adaptation mechanism that updates the planned motion according to the perceived state of the environment. This way we exploit the active compliance of our system to smoothly interact with the environment, even when this is inaccurately perceived or dynamically changing, and update the planned path on-the-fly. In tandem we use a virtual model controller that provides the feed-forward torques that allow increased accuracy together with highly compliant behaviour on an otherwise naturally very stiff robotic system. We leverage the full set of benefits that a high performance torque controlled quadruped robot can provide and demonstrate the flexibility and robustness of our approach on a set of experimental trials of increasing difficulty.
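As a rough illustration of the kind of on-line force-based foothold adaptation described above, the sketch below nudges a planned foothold when the measured ground-reaction force deviates from the expected one. The gain, frames, and numbers are invented for illustration and are not the values used on the robot.

import numpy as np

def adapt_foothold(planned_foothold, measured_grf, expected_grf,
                   terrain_normal=np.array([0.0, 0.0, 1.0]), gain=0.002):
    """Shift a planned foothold along the terrain surface when the measured
    ground-reaction force (GRF) deviates from the expected one.
    Illustrative only: gain, frames and thresholds are made up."""
    force_error = measured_grf - expected_grf
    # Project the force error onto the terrain plane: tangential components
    # suggest the foot is slipping or poorly supported.
    tangential_error = force_error - terrain_normal * np.dot(force_error, terrain_normal)
    # Move the foothold opposite to the tangential force error.
    return planned_foothold - gain * tangential_error

# Example: foot pushed sideways harder than expected -> nudge the foothold.
planned = np.array([0.35, 0.10, 0.0])       # metres, base frame (hypothetical)
expected = np.array([0.0, 0.0, 180.0])      # N, mostly vertical load
measured = np.array([25.0, -10.0, 170.0])   # N, unexpected tangential forces
print(adapt_foothold(planned, measured, expected))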
Intelligent Service Robotics | 2014
Stéphane Bazeille; Victor Barasuol; Michele Focchi; Ioannis Havoutis; Marco Frigerio; Jonas Buchli; Darwin G. Caldwell; Claudio Semini
Legged robots have the potential to navigate in challenging terrain, and thus to exceed the mobility of wheeled vehicles. However, their control is more difficult as legged robots need to deal with foothold computation, leg trajectories and posture control in order to achieve successful navigation. In this paper, we present a new framework for the hydraulic quadruped robot HyQ, which performs goal-oriented navigation on unknown rough terrain using inertial measurement data and stereo vision. This work uses our previously presented reactive controller framework with balancing control and extends it with visual feedback to enable closed-loop gait adjustment. On the one hand, the camera images are used to keep the robot walking towards a visual target by correcting its heading angle if the robot deviates from it. On the other hand, the stereo camera is used to estimate the size of the obstacles on the ground plane and thus the terrain roughness. The locomotion controller then adjusts the step height and the velocity according to the size of the obstacles. This results in robust, autonomous goal-oriented navigation over difficult terrain while subject to disturbances from ground irregularities or external forces. Indoor and outdoor experiments with our quadruped robot show the effectiveness of this framework.
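The visual heading correction can be pictured with a small sketch like the following, which turns the horizontal offset of the target in the image into a proportional yaw-rate command. The pinhole model, field of view, and gain are assumptions for illustration, not HyQ's actual parameters.

import math

def heading_correction(target_px_x, image_width=640, hfov_deg=90.0, k_yaw=0.8):
    """Proportional yaw-rate command that re-centres a visual target.
    Pixel-to-angle mapping and gain are illustrative values."""
    # Horizontal offset of the target from the image centre, in pixels.
    offset = target_px_x - image_width / 2.0
    # Approximate bearing error in radians (small-angle pinhole model).
    bearing_error = offset / image_width * math.radians(hfov_deg)
    # Turn towards the target: target left of centre -> positive (left) yaw rate.
    return -k_yaw * bearing_error

print(heading_correction(200))   # target left of centre -> turn left (positive)
print(heading_correction(480))   # target right of centre -> turn right (negative)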
intelligent robots and systems | 2013
Ioannis Havoutis; Jesús Ortiz; Stéphane Bazeille; Victor Barasuol; Claudio Semini; Darwin G. Caldwell
This paper presents a framework developed to increase the autonomy and versatility of a large (~75 kg) hydraulically actuated quadrupedal robot. It combines onboard perception with two locomotion strategies: a dynamic trot and a static crawl gait. This way the robot can perceive its environment and arbitrate between the two behaviours according to the situation at hand. All computations are performed on board and are carried out on two separate computers: one handles the high-level processes while the other is dedicated to the low-level hard real-time control. The perception and subsequently the appropriate gait modifications are performed autonomously. We present outdoor experimental trials of the robot trotting over unknown terrain, perceiving a large obstacle, altering its behaviour to the cautious crawl gait and stepping onto the obstacle. This allows the robot to locomote quickly on relatively flat terrain and gives it the ability to overcome large irregular obstacles when required.
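A minimal sketch of the gait arbitration idea, assuming a single perceived obstacle-height measure and made-up switching thresholds with hysteresis; the real system uses richer perception and behaviour logic.

def select_gait(max_obstacle_height_m, current_gait,
                crawl_threshold=0.10, trot_threshold=0.06):
    """Arbitrate between a fast trot and a cautious crawl from perceived
    terrain difficulty. Thresholds and hysteresis are illustrative values,
    not those used on the robot."""
    if current_gait == "trot" and max_obstacle_height_m > crawl_threshold:
        return "crawl"          # large obstacle ahead: switch to careful gait
    if current_gait == "crawl" and max_obstacle_height_m < trot_threshold:
        return "trot"           # terrain flat again: go back to the faster gait
    return current_gait         # hysteresis band: keep the current behaviour

gait = "trot"
for h in [0.02, 0.12, 0.08, 0.04]:   # perceived obstacle heights in metres
    gait = select_gait(h, gait)
    print(h, "->", gait)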
2013 IEEE Conference on Technologies for Practical Robot Applications (TePRA) | 2013
Stéphane Bazeille; Victor Barasuol; Michele Focchi; Ioannis Havoutis; Marco Frigerio; Jonas Buchli; Claudio Semini; Darwin G. Caldwell
Legged robots have the potential to navigate in more challenging terrain than wheeled robots do. Unfortunately, their control is more difficult because they have to deal with the traditional mapping and path planning problems, as well as foothold computation, leg trajectories and posture control, in order to achieve successful navigation. Many parameters need to be adjusted in real time to keep the robot stable and safe while it is moving. In this paper, we present a new framework for a quadruped robot, which performs goal-oriented navigation on unknown rough terrain by using inertial measurement data and stereo vision. This framework includes perception and control, and allows the robot to navigate in a straight line towards a visual goal in a difficult environment. The developed rough terrain locomotion system does not need any mapping or path planning: the stereo camera is used to visually guide the robot and evaluate the terrain roughness, and an inertial measurement unit (IMU) is used for posture control. This new framework is an important step towards fully autonomous navigation because, even if SLAM-based mapping fails, a reactive locomotion controller remains active. This ensures stable locomotion on rough terrain by combining direct visual feedback and inertial measurements. Building on this controller, we developed a goal-oriented autonomous navigation system that overcomes disturbances from the ground, the robot's weight, or external forces. Indoor and outdoor experiments with our quadruped robot show the effectiveness and robustness of this framework.
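The step-height and velocity adjustment can be sketched as a simple mapping from perceived obstacle size to gait parameters; all limits and gains below are illustrative placeholders rather than the tuned values of the controller.

def adjust_gait_parameters(obstacle_height_m,
                           nominal_step_height=0.08, nominal_velocity=0.5,
                           max_step_height=0.16, min_velocity=0.1):
    """Scale step height up and forward velocity down as the perceived
    obstacles get larger. All numbers are illustrative placeholders."""
    # Clearance: step at least a couple of centimetres above the obstacle.
    step_height = min(max(nominal_step_height, obstacle_height_m + 0.02),
                      max_step_height)
    # Slow down proportionally to how much extra clearance is needed.
    slowdown = (step_height - nominal_step_height) / (max_step_height - nominal_step_height)
    velocity = max(nominal_velocity * (1.0 - slowdown), min_velocity)
    return step_height, velocity

for h in [0.0, 0.05, 0.10, 0.20]:   # perceived obstacle heights in metres
    print(h, adjust_gait_parameters(h))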
international conference on robotics and automation | 2017
Marco Camurri; Maurice Fallon; Stéphane Bazeille; Andreea Radulescu; Victor Barasuol; Darwin G. Caldwell; Claudio Semini
Reliable state estimation is crucial for stable planning and control of legged locomotion. A fundamental component of a state estimator in legged platforms is Leg Odometry, which only requires information about kinematics and contacts. Many legged robots use dedicated sensors on each foot to detect ground contacts. However, this choice is impractical for many agile legged robots in field operations, as these sensors often degrade and break. Instead, this paper focuses on the development of a robust Leg Odometry module, which does not require contact sensors. The module estimates the probability of reliable contact and detects foot impacts using internal force sensing. This knowledge is then used to improve the kinematics-inertial state estimate of the robot's base. We show how our approach can reach performance comparable to systems with foot sensors. Extensive experimental results lasting over 1 h are presented on our 85 kg quadrupedal robot HyQ carrying out a variety of gaits.
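A minimal sketch of the contact-probability idea, assuming the vertical foot force has already been estimated from joint torques; the logistic parameters and the probability-weighted fusion are illustrative, not the estimator from the paper.

import numpy as np

def contact_probability(vertical_force_N, f0=40.0, slope=0.2):
    """Logistic mapping from estimated vertical foot force (derived from joint
    torques, no foot sensors) to a probability of reliable contact.
    f0 and slope are illustrative, not calibrated values."""
    return 1.0 / (1.0 + np.exp(-slope * (vertical_force_N - f0)))

def fuse_leg_odometry(leg_velocities, vertical_forces):
    """Probability-weighted average of per-leg base-velocity estimates.
    Legs with doubtful contact contribute less to the base velocity."""
    w = np.array([contact_probability(f) for f in vertical_forces])
    v = np.asarray(leg_velocities, dtype=float)   # one 3D estimate per leg
    if w.sum() < 1e-6:
        return np.zeros(3)                        # flight phase: no support
    return (w[:, None] * v).sum(axis=0) / w.sum()

legs_v = [[0.41, 0.0, 0.01], [0.39, 0.02, 0.0], [0.10, 0.3, 0.0], [0.40, 0.01, 0.0]]
forces = [180.0, 160.0, 5.0, 150.0]               # third leg is in swing
print(fuse_leg_odometry(legs_v, forces))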
international symposium on safety, security, and rescue robotics | 2013
Marco Hutter; Michael Bloesch; Jonas Buchli; Claudio Semini; Stéphane Bazeille; Ludovic Righetti; Jeannette Bohg
This paper presents the vision of the recently launched project AGILITY, which aims to join forces across Europe to bring robots with legs and arms into highly unstructured outdoor environments such as disaster areas. Building upon state-of-the-art torque-controllable quadrupedal robots, we jointly investigate environment perception, motion planning, and whole-body control strategies that enable rough terrain locomotion and manipulation. The developed machines will be able to autonomously navigate through challenging ground by optimally adapting the contact forces to the ground (e.g., propping feet against an obstacle), by using whole-body motions to extend their standard workspace (e.g., twisting the body to reach), or by dynamic maneuvers (e.g., jumping or leaping). The proposed methods will be evaluated in a rescue scenario using (future versions of) our torque-controllable quadrupedal robots HyQ and StarlETH.
The International Journal of Robotics Research | 2017
Sylvain Lanneau; Frédéric Boyer; Vincent Lebastard; Stéphane Bazeille
In this article we address the issue of shape estimation using electric sense, inspired by the active electric fish. These fish can perceive their environment by measuring the perturbations in a self-generated electric field caused by nearby objects. The approach proceeded in three stages. Firstly, the object was detected and its electric properties (insulator or conductor) were identified. Secondly, the object was localized using the multiple signal classification (MUSIC) algorithm, which was originally developed to localize a radio-wave emitter using a network of antennas. Thirdly, the shape estimation relied on the concept of the generalized polarization tensor, which enabled us to model the electric response of an object polarized by an ambient electric field. We describe the implementation of the approach through numerous experiments. The system estimated shapes with an average error of 16%, opening the way toward further improvements. In particular, self-aligning the sensor with the ellipsoid through reactive feedback made the shape estimation error drop to 10%.
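The paper's electric-sense forward model is considerably more involved, but the MUSIC localization step can be illustrated on a toy near-field amplitude model with a made-up sensor layout, as in the sketch below.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2D setup: 8 point sensors on a line, one fluctuating source.
sensors = np.stack([np.linspace(-0.5, 0.5, 8), np.zeros(8)], axis=1)
true_pos = np.array([0.15, 0.40])

def steering(p):
    """Near-field amplitude model: each sensor sees the source scaled by
    1/distance. A stand-in for the electric-sense response model."""
    d = np.linalg.norm(sensors - p, axis=1)
    a = 1.0 / d
    return a / np.linalg.norm(a)

# Simulate T snapshots of a fluctuating source plus measurement noise.
T = 500
s = rng.normal(size=T)
X = np.outer(steering(true_pos), s) + 0.01 * rng.normal(size=(len(sensors), T))

# MUSIC: split the sample covariance into signal and noise subspaces.
R = X @ X.T / T
eigval, eigvec = np.linalg.eigh(R)
noise_subspace = eigvec[:, :-1]          # all but the largest eigenvector

def music_spectrum(p):
    # Peaks where the steering vector is orthogonal to the noise subspace.
    a = steering(p)
    return 1.0 / np.linalg.norm(noise_subspace.T @ a) ** 2

# Grid search for the location with the largest pseudospectrum value.
grid = [(x, y) for x in np.linspace(-0.4, 0.4, 41) for y in np.linspace(0.1, 0.8, 36)]
best = max(grid, key=lambda p: music_spectrum(np.array(p)))
print("estimated position:", best, "true:", true_pos)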
Robotica | 2017
Stéphane Bazeille; Jesús Ortiz; Francesco Rovida; Marco Camurri; Anis Meguenani; Darwin G. Caldwell; Claudio Semini
Legged robots have the potential to navigate in more challenging terrains than wheeled robots. Unfortunately, their control is more demanding, because they have to deal with the common tasks of mapping and path planning as well as more specific issues of legged locomotion, like balancing and foothold planning. In this paper, we present the integration and development of a stabilized vision system on the fully torque-controlled hydraulically actuated quadruped robot HyQ. The active head added onto the robot is composed of a fast pan and tilt unit (PTU) and a high-resolution wide-angle stereo camera. The PTU enables camera gaze shifting to a specific area in the environment (both to extend and refine the map) or to track an object while navigating. Moreover, as quadruped locomotion induces strong regular vibrations, impacts, or slippage on rough terrain, we took advantage of the PTU to mechanically compensate for the robot's motions. We demonstrate the influence of legged locomotion on the quality of the visual data stream through a detailed study of HyQ's motions, compared against those of a rough-terrain wheeled robot of the same size. Our proposed Inertial Measurement Unit (IMU)-based controller allows us to decouple the camera from the robot's motions. We show through experiments that, by stabilizing the image feedback, we can improve the onboard vision-based processes of tracking and mapping. In particular, during the outdoor tests on the quadruped robot, the use of our camera stabilization system improved the accuracy of the 3D maps by 25% and reduced mapping failures by 50%.
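A first-order sketch of the IMU-based gaze stabilization, assuming the desired gaze direction is fixed in the world frame; roll coupling, latency, and the actual PTU limits are ignored, so this is illustrative rather than the controller used on HyQ.

import math

def stabilize_gaze(base_roll, base_pitch, base_yaw,
                   desired_azimuth=0.0, desired_elevation=-0.2):
    """First-order pan/tilt commands (rad) that keep the camera pointed at a
    fixed world-frame gaze direction despite base rotations measured by the IMU.
    Ignores roll coupling and real PTU limits; a full controller would not."""
    pan_cmd = desired_azimuth - base_yaw        # counter-rotate the yaw motion
    tilt_cmd = desired_elevation - base_pitch   # counter-rotate the pitch motion
    # Clamp to plausible (made-up) PTU limits.
    pan_cmd = max(-math.pi / 2, min(math.pi / 2, pan_cmd))
    tilt_cmd = max(-math.pi / 3, min(math.pi / 3, tilt_cmd))
    return pan_cmd, tilt_cmd

# Trotting-induced pitch oscillation: the tilt command mirrors it.
for pitch in [0.00, 0.05, -0.05, 0.08]:
    print(stabilize_gaze(0.0, pitch, 0.02))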
intelligent agents | 2016
Ronald Thenius; Daniel Moser; Joshua Cherian Varughese; Serge Kernbach; Igor Kuksin; Olga Kernbach; Elena Kuksina; Nikola Mišković; Stjepan Bogdan; Tamara Petrovic; Anja Babić; Frédéric Boyer; Vincent Lebastard; Stéphane Bazeille; Graziano William Ferrari; Elisa Donati; Riccardo Pelliccia; Donato Romano; Godfried Jansen van Vuuren; Cesare Stefanini; Matteo Morgantin; Alexandre Campo; Thomas Schmickl
international conference on multisensor fusion and integration for intelligent systems | 2015
Marco Camurri; Stéphane Bazeille; Darwin G. Caldwell; Claudio Semini