Publications

Featured research published by Jesús Pestana.


Advances in Computing and Communications | 2014

Computer vision based general object following for GPS-denied multirotor unmanned vehicles

Jesús Pestana; Jose Luis Sanchez-Lopez; Srikanth Saripalli; Pascual Campoy

The motivation of this research is to show that vision-based object tracking and following is reliable using a cheap, GPS-denied multirotor platform such as the AR Drone 2.0. Our architecture allows the user to specify an object in the image that the robot then follows at an approximately constant distance. At the current stage of our development, if image tracking is lost the system hovers and waits for tracking recovery or a second detection, which requires the use of odometry measurements for self-stabilization. During the following task, our software uses the forward-facing camera images and part of the IMU data to calculate the references for the four on-board low-level control loops. To obtain stronger wind-disturbance rejection and improved navigation performance, a yaw heading reference based on the IMU data is internally kept and updated by our control algorithm. We validate the architecture using an AR Drone 2.0 and the OpenTLD tracker in outdoor suburban areas. The experimental tests have shown robustness against wind perturbations, target occlusion and illumination changes, as well as the system's capability to track a great variety of objects present in suburban areas, for instance: walking or running people, windows, AC machines, static and moving cars, and plants.
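As an illustration of how a bounding-box tracker can drive the on-board low-level loops, the minimal sketch below maps the tracked box's centre offset and apparent size to yaw-rate, vertical-speed and forward-speed references. The interface, gains and the area-based distance cue are assumptions for demonstration, not the authors' implementation.

```python
# Illustrative sketch of an image-based object-following loop in the spirit of
# the approach described above. Tracker interface, gains and reference mapping
# are assumptions for demonstration only.

def follow_references(bbox, img_w, img_h, ref_area_ratio=0.05,
                      k_yaw=1.0, k_alt=1.0, k_fwd=2.0):
    """Map a tracked bounding box (x, y, w, h) in pixels to normalized
    commands: (yaw rate, vertical speed, forward speed)."""
    x, y, w, h = bbox
    cx = x + w / 2.0
    cy = y + h / 2.0

    # Horizontal offset of the target from the image centre -> yaw rate command.
    yaw_rate_cmd = k_yaw * (cx - img_w / 2.0) / (img_w / 2.0)

    # Vertical offset -> climb/descend command (image y grows downwards).
    vertical_cmd = -k_alt * (cy - img_h / 2.0) / (img_h / 2.0)

    # The apparent size of the target encodes distance: keep its area ratio constant.
    area_ratio = (w * h) / float(img_w * img_h)
    forward_cmd = k_fwd * (ref_area_ratio - area_ratio)

    clip = lambda v: max(-1.0, min(1.0, v))
    return clip(yaw_rate_cmd), clip(vertical_cmd), clip(forward_cmd)
```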


International Symposium on Safety, Security, and Rescue Robotics | 2013

Vision Based GPS-denied Object Tracking and Following for Unmanned Aerial Vehicles

Jesús Pestana; Jose Luis Sanchez-Lopez; Pascual Campoy; Srikanth Saripalli

We present a vision-based control strategy for tracking and following objects using an Unmanned Aerial Vehicle. We have developed an image-based visual servoing method that uses only a forward-looking camera for tracking and following objects from a multi-rotor UAV, without any dependence on GPS systems. Our proposed method tracks a user-specified object continuously, maintaining a fixed distance from the object while simultaneously keeping it in the centre of the image plane. The algorithm is validated using a Parrot AR Drone 2.0 in outdoor conditions while tracking and following people and fast-moving objects, including under occlusions, showing the robustness of the proposed system against perturbations and illumination changes. Our experiments show that the system is able to track a great variety of objects present in suburban areas, among others: people, windows, AC machines, cars and plants.


Journal of Intelligent and Robotic Systems | 2014

An Approach Toward Visual Autonomous Ship Board Landing of a VTOL UAV

Jose Luis Sanchez-Lopez; Jesús Pestana; Srikanth Saripalli; Pascual Campoy

We present the design and implementation of a vision-based autonomous landing algorithm using a downward-looking camera. To demonstrate the efficacy of our algorithms, we emulate the dynamics of the ship deck, for various sea states and different ships, using a six-degrees-of-freedom motion platform. We then present the design and implementation of our robust computer vision system to measure the pose of the ship deck with respect to the vehicle. A Kalman filter is used in conjunction with our vision system to ensure the robustness of the estimates. We demonstrate the accuracy and robustness of our system to occlusions, variation in intensity, etc., using our testbed.
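To make the vision/filter coupling concrete, here is a minimal one-dimensional constant-velocity Kalman filter that smooths noisy vision measurements of a single deck-pose coordinate and bridges short detection dropouts. The state model, noise values and interface are illustrative assumptions, not the filter described in the paper.

```python
import numpy as np

# Minimal constant-velocity Kalman filter over one pose coordinate. All
# matrices and noise levels are illustrative assumptions only.

class DeckPoseFilter1D:
    def __init__(self, dt=0.03, q=0.5, r=0.05):
        self.x = np.zeros(2)                        # state: [position, velocity]
        self.P = np.eye(2)
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
        self.Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])
        self.H = np.array([[1.0, 0.0]])             # vision measures position only
        self.R = np.array([[r]])

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[0]

    def update(self, z):
        # Skip the correction when the vision system reports no detection,
        # so the estimate coasts on the motion model during occlusions.
        if z is None:
            return self.x[0]
        y = np.array([z]) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]
```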


Journal of Intelligent and Robotic Systems | 2016

A Reliable Open-Source System Architecture for the Fast Designing and Prototyping of Autonomous Multi-UAV Systems: Simulation and Experimentation

Jose Luis Sanchez-Lopez; Jesús Pestana; Paloma de la Puente; Pascual Campoy

During the process of designing and developing an autonomous multi-UAV system, two main problems appear. The first is the difficulty of designing all the modules and behaviors of the aerial multi-robot system. The second is the difficulty of providing the developers with an autonomous prototype of the system that allows them to test the performance of each module even at an early stage of the project. These two problems motivate this paper. A multipurpose system architecture for autonomous multi-UAV platforms is presented. This versatile system architecture can be used by system designers as a template when developing their own systems, and it is general enough to be used in a wide range of applications, as demonstrated in the paper. It aims to be a reference for all designers. Additionally, to allow for the fast prototyping of autonomous multi-aerial systems, an open-source framework based on the previously defined system architecture is introduced. It gives developers a flight-proven multi-aerial system ready to use, so that they can test their algorithms even at an early stage of the project. The implementation of this framework, introduced in the paper under the name "CVG Quadrotor Swarm", which also has the advantages of being modular and compatible with different aerial platforms, can be found at https://github.com/Vision4UAV/cvg_quadrotor_swarm together with a consistent catalog of available modules. The good performance of this framework is demonstrated by choosing a basic instance of it and carrying out simulation and experimental tests, whose results are summarized and discussed in this paper.


International Conference on Unmanned Aircraft Systems | 2013

Toward visual autonomous ship board landing of a VTOL UAV

Jose Luis Sanchez-Lopez; Srikanth Saripalli; Pascual Campoy; Jesús Pestana; Changhong Fu

In this paper we tackle the problem of landing a helicopter autonomously on a ship deck, using an on-board colour camera as the main sensor. To create a test-bed, we first simulate the movement of a ship landing platform at sea, for different sea states and different ships, in a random yet realistic manner; a commercial parallel robot is used to reproduce this movement. We then developed an accurate and robust computer vision system to measure the pose of the helipad with respect to the on-board camera. To deal with noise and possible failures of the computer vision system, a state estimator was created. With all of this, we are now able to develop and test a controller that closes the loop and completes the autonomous landing task.
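The following toy generator sketches the kind of deck motion such a test-bed has to reproduce, as a sum of sinusoids per axis. The amplitudes, frequencies and "sea state" table are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Toy generator of ship-deck heave/roll/pitch trajectories as sums of
# sinusoids. Parameters per "sea state" are made up for illustration.

SEA_STATES = {
    # sea_state: (heave_amp_m, roll_amp_deg, pitch_amp_deg, base_freq_hz)
    3: (0.4, 3.0, 2.0, 0.12),
    4: (0.9, 6.0, 4.0, 0.10),
    5: (1.8, 10.0, 7.0, 0.08),
}

def deck_motion(t, sea_state=4, seed=0):
    """Return (heave [m], roll [deg], pitch [deg]) at time t seconds."""
    heave_a, roll_a, pitch_a, f0 = SEA_STATES[sea_state]
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0, 2 * np.pi, size=3)
    # Two harmonics per axis give a less regular, more sea-like motion.
    heave = heave_a * (np.sin(2 * np.pi * f0 * t + phases[0])
                       + 0.3 * np.sin(2 * np.pi * 2.3 * f0 * t))
    roll = roll_a * np.sin(2 * np.pi * 1.1 * f0 * t + phases[1])
    pitch = pitch_a * np.sin(2 * np.pi * 0.9 * f0 * t + phases[2])
    return heave, roll, pitch

if __name__ == "__main__":
    for t in np.linspace(0.0, 10.0, 5):
        h, r, p = deck_motion(t, sea_state=4)
        print(f"t={t:4.1f}s heave={h:+.2f} m roll={r:+.1f} deg pitch={p:+.1f} deg")
```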


International Conference on Unmanned Aircraft Systems | 2016

AEROSTACK: An architecture and open-source software framework for aerial robotics

Jose Luis Sanchez-Lopez; Ramón Suárez Fernández; Hriday Bavle; Carlos Sampedro; Martin Molina; Jesús Pestana; Pascual Campoy

To simplify the use of Unmanned Aerial Systems (UAS) and extend their use to a great number of applications, fully autonomous operation is needed. There are many open-source architecture frameworks for UAS that claim autonomous operation, but they still have two main open issues: (1) level of autonomy, which in most cases is limited, and (2) versatility, since most of them are designed specifically for certain applications or aerial platforms. As a response to these needs and issues, this paper presents Aerostack, a system architecture and open-source multi-purpose software framework for autonomous multi-UAS operation. To provide higher degrees of autonomy, Aerostack's system architecture integrates state-of-the-art concepts of intelligent, cognitive and social robotics, based on five layers: reactive, executive, deliberative, reflective, and social. To be a highly versatile practical solution, Aerostack's open-source software framework includes the main components to execute the architecture for fully autonomous missions of swarms of UAS; a collection of ready-to-use and flight-proven modular components that can be reused by users and developers; and compatibility with five well-known aerial platforms as well as a high number of sensors. Aerostack has been validated over three years through its successful use in many research projects, international competitions and exhibitions. To corroborate this, the paper also presents Aerostack carrying out a fictional, fully autonomous indoor search and rescue mission.
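As a rough illustration of the layered decomposition named above, the toy sketch below runs five placeholder layers over a shared blackboard once per cycle. The class names, blackboard keys and tick() interface are assumptions for illustration, not Aerostack's actual API.

```python
# Toy sketch of a five-layer architecture (reactive, executive, deliberative,
# reflective, social). Everything here is an illustrative assumption.

from abc import ABC, abstractmethod

class Layer(ABC):
    @abstractmethod
    def tick(self, blackboard: dict) -> None:
        """Read from and write to a shared blackboard once per control cycle."""

class DeliberativeLayer(Layer):
    def tick(self, blackboard):
        # e.g., mission and path planning
        blackboard["plan"] = [1.0, 2.0, 3.0]

class ExecutiveLayer(Layer):
    def tick(self, blackboard):
        # e.g., sequence the current behavior from the plan
        blackboard["setpoint"] = blackboard.get("plan", [0.0])[0]

class ReactiveLayer(Layer):
    def tick(self, blackboard):
        # e.g., low-level control and obstacle reflexes
        blackboard["motor_cmd"] = blackboard.get("setpoint", 0.0)

class ReflectiveLayer(Layer):
    def tick(self, blackboard):
        # e.g., supervise health and performance of the other layers
        blackboard["healthy"] = True

class SocialLayer(Layer):
    def tick(self, blackboard):
        # e.g., share state with other agents or operators
        blackboard["broadcast"] = blackboard.get("healthy", False)

def run_cycle(layers, blackboard):
    # Layers that produce inputs for others run first in this toy ordering.
    for layer in layers:
        layer.tick(blackboard)
    return blackboard

if __name__ == "__main__":
    layers = [DeliberativeLayer(), ExecutiveLayer(), ReactiveLayer(),
              ReflectiveLayer(), SocialLayer()]
    print(run_cycle(layers, {}))
```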


International Conference on Unmanned Aircraft Systems | 2014

A system for the design and development of vision-based multi-robot quadrotor swarms

Jose Luis Sanchez-Lopez; Jesús Pestana; Paloma de la Puente; Ramon Suarez-Fernandez; Pascual Campoy

This paper presents a cost-effective framework for the prototyping of vision-based quadrotor multi-robot systems, whose core characteristics are modularity, compatibility with different platforms, and being flight-proven. The framework is fully operative, which is shown in the paper through simulations and real flight tests with up to 5 drones, and was demonstrated through participation in an international micro aerial vehicle competition, where it was awarded the First Prize in the Indoors Autonomy Challenge. The motivation of this framework is to allow developers to focus on their own research by decoupling the development of dependent modules, leading to more cost-effective progress in the project. The basic instance of the framework that we propose, which is flight-proven with the cost-efficient and reliable Parrot AR Drone 2.0 platform and is open source, includes several modules that can be reused and modified, such as: a basic sequential mission planner, a basic 2D trajectory planner, an odometry state estimator, localization and mapping modules that obtain absolute position measurements using visual markers, a trajectory controller, and a visualization module.


International Conference on Unmanned Aircraft Systems | 2014

A Vision-based Quadrotor Swarm for the participation in the 2013 International Micro Air Vehicle Competition

Jesús Pestana; Jose Luis Sanchez-Lopez; Paloma de la Puente; Adrian Carrio; Pascual Campoy

This paper presents a completely autonomous solution for participating in the 2013 International Micro Air Vehicle Indoor Flight Competition (IMAV2013). Our proposal is a modular multi-robot swarm architecture, based on the Robot Operating System (ROS) software framework, where the only information shared among swarm agents is each robot's position. Each swarm agent consists of an AR Drone 2.0 quadrotor connected to a laptop that runs the software architecture. In order to present a completely vision-based solution, the localization problem is simplified by the use of ArUco visual markers. These visual markers are used to sense and map obstacles and to improve the pose estimation based on IMU and optical flow data by means of an Extended Kalman Filter localization and mapping method. The presented solution and the performance of the CVG UPM team were awarded the First Prize in the Indoors Autonomy Challenge of the IMAV2013 competition.
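To illustrate how observing a mapped visual marker yields an absolute pose correction for drifting odometry, here is a minimal 2D sketch using homogeneous transforms. The marker map, IDs and the planar simplification are assumptions for illustration, not the paper's EKF formulation.

```python
import numpy as np

# Sketch: a known-map marker observation converted into an absolute robot pose.
# Marker IDs, poses and the 2D simplification are illustrative assumptions.

def se2(x, y, yaw):
    """Homogeneous 2D transform."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

# Map of marker poses in the world frame (assumed known beforehand).
MARKER_MAP = {7: se2(2.0, 1.0, np.pi / 2)}

def robot_pose_from_marker(marker_id, T_robot_marker):
    """Given the marker pose observed in the robot frame, return the robot
    pose in the world frame: T_world_robot = T_world_marker * inv(T_robot_marker)."""
    T_world_marker = MARKER_MAP[marker_id]
    return T_world_marker @ np.linalg.inv(T_robot_marker)

if __name__ == "__main__":
    # The robot sees marker 7 one metre straight ahead, facing it.
    T_robot_marker = se2(1.0, 0.0, np.pi)
    T_world_robot = robot_pose_from_marker(7, T_robot_marker)
    x, y = T_world_robot[0, 2], T_world_robot[1, 2]
    yaw = np.arctan2(T_world_robot[1, 0], T_world_robot[0, 0])
    print(f"x={x:.2f}, y={y:.2f}, yaw={np.degrees(yaw):.1f} deg")
```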


Journal of Intelligent and Robotic Systems | 2016

A Vision-based Quadrotor Multi-robot Solution for the Indoor Autonomy Challenge of the 2013 International Micro Air Vehicle Competition

Jesús Pestana; Jose Luis Sanchez-Lopez; Paloma de la Puente; Adrian Carrio; Pascual Campoy

This paper presents a completely autonomous solution for participating in the Indoor Challenge of the 2013 International Micro Air Vehicle Competition (IMAV 2013). Our proposal is a multi-robot system with no centralized coordination whose robotic agents share their position estimates. The capability of each agent to navigate avoiding collisions is a consequence of the resulting emergent behavior. Each agent consists of a ground station running an instance of the proposed architecture that communicates over WiFi with an AR Drone 2.0 quadrotor. Visual markers are employed to sense and map obstacles and to improve the pose estimation based on Inertial Measurement Unit (IMU) and ground optical flow data. Based on our architecture, each robotic agent can navigate avoiding obstacles and other members of the multi-robot system. The solution is demonstrated and the achieved navigation performance is evaluated by means of experimental flights. This work also analyzes the capabilities of the presented solution in simulated flights of the IMAV 2013 Indoor Challenge. The performance of the CVG_UPM team was awarded the First Prize in the Indoor Autonomy Challenge of the IMAV 2013 competition.


International Conference on Unmanned Aircraft Systems | 2015

A vision based aerial robot solution for the Mission 7 of the International Aerial Robotics Competition

Jose Luis Sanchez-Lopez; Jesús Pestana; Jean-François Collumeau; Ramon Suarez-Fernandez; Pascual Campoy; Martin Molina

The International Aerial Robotics Competition (IARC) aims at pushing forward the state of the art in UAVs. The Mission 7 challenge deals mainly with GPS/laser-denied navigation, robot-robot interaction and obstacle avoidance, in the setting of a ground-robot herding problem. We present in this paper our UAV, which took part in the 2014 competition at the China venue. That year the mission was not completed by any participant, but our team at the Technical University of Madrid (UPM) was awarded two special prizes: Best Target Detection and Best System Control. The platform, hardware and software developed for this top-level competition are presented in this paper. The software has three main components: the visual localization and mapping algorithm, the control algorithms, and the mission planner. A statement of the safety measures integrated in the drone and of our efforts to ensure field testing in conditions as close as possible to those of the challenge is also included.

Collaboration


Dive into Jesús Pestana's collaborations.

Top Co-Authors

Pascual Campoy (Technical University of Madrid)
Jose Luis Sanchez-Lopez (Spanish National Research Council)
Adrian Carrio (Spanish National Research Council)
Changhong Fu (Spanish National Research Council)
Ignacio Mellado-Bataller (Spanish National Research Council)
Ramon Suarez-Fernandez (Spanish National Research Council)
Friedrich Fraundorfer (Graz University of Technology)
Paloma de la Puente (Technical University of Madrid)
Martin Molina (Technical University of Madrid)