Carol Martinez
Technical University of Madrid
Publications
Featured research published by Carol Martinez.
Journal of Intelligent and Robotic Systems | 2009
Jorge Artieda; José M. Sebastián; Pascual Campoy; Juan F. Correa; Iván F. Mondragón; Carol Martinez; Miguel Olivares
The aim of this paper is to present, test, and discuss the implementation of Visual SLAM techniques on images taken from Unmanned Aerial Vehicles (UAVs) outdoors, in partially structured environments. Each stage of the process is discussed with the goal of obtaining more accurate localization and mapping from UAV flights. First, the issues related to the visual features of objects in the scene, their distance to the UAV, and the associated image acquisition system and its calibration are evaluated to improve the whole process. Other important issues considered relate to the image processing techniques, such as interest point detection, the matching procedure, and the scaling factor. The whole system has been tested using the COLIBRI mini UAV in partially structured environments. The localization results, tested against the GPS information of the flights, show that Visual SLAM delivers reliable localization and mapping, making it suitable for some outdoor applications when flying UAVs.
Journal of Intelligent and Robotic Systems | 2009
Pascual Campoy; Juan F. Correa; Iván F. Mondragón; Carol Martinez; Miguel Olivares; Luis Mejias; Jorge Artieda
Computer vision is much more than a technique to sense and recover environmental information from a UAV. It should play a major role in UAV functionality because of the large amount of information that can be extracted, its possible uses and applications, and its natural connection to human-driven tasks, given that vision is our main interface to world understanding. Our current research focus lies in the development of techniques that allow UAVs to maneuver in spaces using visual information as their main input source. This task involves the creation of techniques that allow a UAV to maneuver towards features of interest whenever a GPS signal is not reliable or sufficient, e.g. when signal dropouts occur (which usually happens in urban areas, when flying through terrestrial urban canyons, or when operating on remote planetary bodies), or when tracking or inspecting visual targets (including moving ones) without knowing their exact UTM coordinates. This paper also investigates visual servoing control techniques that use the velocity and position of suitable image features to compute the references for flight control. The paper aims to give a global view of the main aspects of the research field of computer vision for UAVs, clustered in four main active research lines: visual servoing and control, stereo-based visual navigation, image processing algorithms for detection and tracking, and visual SLAM. Finally, the results of applying these techniques in several applications are presented and discussed, encompassing power line inspection, mobile target tracking, stereo distance estimation, mapping, and positioning.
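As an illustration of the visual servoing research line mentioned above, the sketch below shows a classical image-based visual servoing law for a single point feature, using the standard interaction matrix. This is a textbook formulation, not the paper's implementation; the depth Z, the gain lam, and the use of normalised image coordinates are assumptions.

```python
import numpy as np

def ibvs_point_velocity(s, s_star, Z, lam=0.5):
    """Classical image-based visual servoing law for one point feature:
    camera velocity v = -lam * pinv(L) @ (s - s_star), where L is the
    standard interaction matrix of a normalised image point (x, y) at
    depth Z.  Returns a 6-vector (vx, vy, vz, wx, wy, wz)."""
    x, y = s
    L = np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1 + y * y, -x * y, -x],
    ])
    e = np.array(s) - np.array(s_star)  # feature error in the image
    return -lam * np.linalg.pinv(L) @ e
```

When the feature sits at its desired location the commanded velocity is zero; otherwise the law drives the feature error exponentially towards zero at rate lam.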
Autonomous Robots | 2010
Iván F. Mondragón; Miguel A. Olivares-Mendez; Pascual Campoy; Carol Martinez; Luis Mejias
This paper presents an implementation of an aircraft pose and motion estimator using visual systems as the principal sensor for controlling an Unmanned Aerial Vehicle (UAV), or as a redundant system for an Inertial Measurement Unit (IMU) and gyroscopic sensors. First, we explore the applications of the unified theory for central catadioptric cameras to attitude and heading estimation, explaining how the skyline is projected on the catadioptric image and how it is segmented and used to calculate the UAV's attitude. Then we use appearance images to obtain a visual compass, from which we calculate the relative rotation and heading of the aerial vehicle. Additionally, we show the use of a stereo system to calculate the aircraft's height and to measure the UAV's motion. Finally, we present a visual tracking system based on fuzzy controllers working on both a UAV and a camera pan-and-tilt platform. Every part is tested using the COLIBRI UAV platform to validate the different approaches, including comparison of the estimated data with the inertial values measured onboard the helicopter platform and validation of the tracking schemes in real flights.
International Conference on Robotics and Automation | 2010
Iván F. Mondragón; Pascual Campoy; Carol Martinez; Miguel A. Olivares-Mendez
This article presents a real-time Unmanned Aerial Vehicle (UAV) 3D pose estimation method using planar object tracking, intended for use in the control system of a UAV. The method exploits the rich information obtained by a projective transformation of planar objects on a calibrated camera. The algorithm obtains the metric and projective components of a reference object (landmark or helipad) with respect to the UAV camera coordinate system, using robust real-time object tracking based on homographies. The algorithm is validated on real flights that compare the estimated data against that obtained by the inertial measurement unit (IMU), showing that the proposed method robustly estimates the helicopter's 3D position with respect to a reference landmark, with high-quality position and orientation estimates when the aircraft is flying at low altitudes, a situation in which GPS information is often inaccurate. The obtained results indicate that the proposed algorithm is suitable for complex control tasks, such as autonomous landing, accurate low-altitude positioning, and dropping of payloads.
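The homography-based pose recovery described above can be sketched with the textbook decomposition for a planar landmark lying on the world plane z = 0; the paper's exact method may differ, and the intrinsics matrix K and the scale/sign handling here are assumptions.

```python
import numpy as np

def pose_from_homography(H, K):
    """Recover camera pose (R, t) relative to a planar landmark on the
    world plane z = 0 from a homography H and intrinsics K.  For such a
    plane, H ~ K [r1 r2 t], so the first two rotation columns and the
    translation can be read off after removing K."""
    M = np.linalg.inv(K) @ H
    # H is only defined up to scale: normalise by the first column.
    lam = 1.0 / np.linalg.norm(M[:, 0])
    if M[2, 2] < 0:            # keep the landmark in front of the camera
        lam = -lam
    r1 = lam * M[:, 0]
    r2 = lam * M[:, 1]
    t = lam * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # Project onto the closest true rotation matrix (SVD orthonormalisation).
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

Libraries such as OpenCV expose comparable functionality (e.g. its homography decomposition routines), which return the full set of candidate solutions rather than the single one chosen here.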
Journal of Intelligent and Robotic Systems | 2011
Carol Martinez; Iván F. Mondragón; Miguel A. Olivares-Mendez; Pascual Campoy
In this paper, two techniques to control UAVs (Unmanned Aerial Vehicles) based on visual information are presented. The first is based on the detection and tracking of planar structures from an onboard camera, while the second is based on the detection and 3D reconstruction of the position of the UAV from an external camera system. Both strategies are tested with a VTOL (Vertical Take-Off and Landing) UAV, and results show good behavior of the visual systems (precision in the estimation and frame rate) when estimating the helicopter's position and using the extracted information to control the UAV.
Robotics and Autonomous Systems | 2010
Iván F. Mondragón; Pascual Campoy; Carol Martinez; Miguel Olivares
This paper presents an aircraft attitude and heading estimator that uses catadioptric images as the principal sensor for a UAV, or as a redundant system for an IMU (Inertial Measurement Unit) and gyro sensors. First, we explain how the unified theory for central catadioptric cameras is used for attitude and heading estimation, describing how the skyline is projected on the catadioptric image and how it is segmented and used to calculate the UAV's attitude. Then, we use appearance images to obtain a visual compass, and we calculate the relative rotation and heading of the aerial vehicle. Finally, the tests and results using the UAV COLIBRI platform and their validation in real flights are presented, comparing the estimated data with the inertial values measured on board.
IEEE International Conference on Fuzzy Systems | 2010
Miguel A. Olivares-Mendez; Iván F. Mondragón; Pascual Campoy; Carol Martinez
This paper presents a fuzzy control application for the landing task of an Unmanned Aerial Vehicle, using 3D position estimation based on visual tracking of piecewise planar objects. This application allows the UAV to land in scenarios in which only visual information is available to obtain the position of the vehicle. The use of the homography permits real-time estimation of the UAV's pose with respect to a helipad using a monocular camera. Fuzzy logic allows the definition of a model-free control system for the UAV. The fuzzy controller analyzes the visual information to generate altitude commands for the UAV to perform the landing task.
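A minimal sketch of a model-free fuzzy altitude rule base in the spirit described above; the membership breakpoints, singleton command values, and three-rule structure are illustrative assumptions, not the paper's controller.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_descend_cmd(height_error):
    """Map the height error (m) relative to the helipad setpoint to a
    vertical-speed command (m/s) with three rules: below -> climb,
    near -> hold, above -> descend.  Singleton (Sugeno-style)
    defuzzification by weighted average of the rule consequents."""
    mu_neg = tri(height_error, -4.0, -2.0, 0.0)   # below the setpoint
    mu_zero = tri(height_error, -2.0, 0.0, 2.0)   # roughly on the setpoint
    mu_pos = tri(height_error, 0.0, 2.0, 4.0)     # above the setpoint
    # Saturate so large errors still fully fire the outer sets.
    if height_error <= -2.0:
        mu_neg = 1.0
    if height_error >= 2.0:
        mu_pos = 1.0
    cmds = (+0.5, 0.0, -0.5)   # climb, hold, descend (m/s)
    mus = (mu_neg, mu_zero, mu_pos)
    total = sum(mus)
    return sum(m * c for m, c in zip(mus, cmds)) / total if total else 0.0
```

A real rule base would also take the error derivative as a second input, as is common in fuzzy flight controllers, but the defuzzification step is the same.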
Robotics and Autonomous Systems | 2013
Carol Martinez; Thomas S. Richardson; Peter Thomas; Jonathan Luke du Bois; Pascual Campoy
Autonomous aerial refueling is a key enabling technology for both manned and unmanned aircraft where extended flight duration or range is required. The results presented within this paper offer one potential vision-based sensing solution, together with a unique test environment. A hierarchical visual tracking algorithm based on direct methods is proposed and developed for the purposes of tracking a drogue during the capture stage of autonomous aerial refueling, and of estimating its 3D position. Intended to be applied in real time to a video stream from a single monocular camera mounted on the receiver aircraft, the algorithm is shown to be highly robust, and capable of tracking large, rapid drogue motions within the image frame. The proposed strategy has been tested using a complex robotic testbed and with actual flight hardware consisting of a full-size probe and drogue. Results show that the vision tracking algorithm can detect and track the drogue at real-time frame rates of more than thirty frames per second, obtaining a robust position estimate even with strong motions and multiple occlusions of the drogue.
Intelligent Robots and Systems | 2009
Miguel A. Olivares-Mendez; Pascual Campoy; Carol Martinez; Iván F. Mondragón
This paper presents an implementation of two fuzzy logic controllers working in parallel for a pan-tilt camera platform on a UAV. The implementation uses a basic Lucas-Kanade tracker algorithm, which sends the error between the center of the tracked object and the center of the image to the fuzzy controller. This information is enough for the controller to follow the object by moving a two-axis servo platform, regardless of the UAV's vibrations and movements. The two fuzzy controllers, one per axis, work with a rule base of 49 rules, two inputs, and one output, with a more significant sector defined to improve the controllers' behavior. The controllers have shown very good performance in real flights with static objects, tested on the Colibri prototypes.
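The error signal fed to controllers like the ones described above can be sketched as follows: the pixel offset between the tracked object's centre and the image centre is converted into pan/tilt angular errors. This is a generic pinhole-camera computation, not the paper's code; the image size and focal length are assumptions.

```python
import math

def tracking_error_deg(obj_px, image_size, focal_px):
    """Convert the pixel offset between a tracked object's centre
    (obj_px) and the image centre into pan and tilt angular errors in
    degrees, given the focal length in pixels.  These angular errors
    are the natural inputs to a per-axis fuzzy rule base driving a
    pan-tilt servo platform."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    dx, dy = obj_px[0] - cx, obj_px[1] - cy
    pan = math.degrees(math.atan2(dx, focal_px))   # horizontal error
    tilt = math.degrees(math.atan2(dy, focal_px))  # vertical error
    return pan, tilt
```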
Intelligent Robots and Systems | 2009
Carol Martinez; Pascual Campoy; Iván F. Mondragón; Miguel A. Olivares-Mendez
In this paper we introduce a real-time trinocular system to control rotary-wing Unmanned Aerial Vehicles based on the 3D information extracted by cameras located on the ground. The algorithm uses key features onboard the UAV to estimate the vehicle's position and orientation. The algorithm is validated against onboard sensors and known 3D positions, showing that the proposed camera configuration robustly estimates the helicopter's position with adequate resolution, improving the position estimation, especially the height estimation. The obtained results show that the proposed algorithm is suitable to complement or replace GPS-based position estimation in situations where GPS information is unavailable or inaccurate, allowing the vehicle to perform tasks at low heights, such as autonomous landing, take-off, and positioning, using the extracted 3D information as visual feedback to the flight controller.
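Reconstructing the UAV's 3D position from several calibrated ground cameras, as described above, can be illustrated with standard linear (DLT) triangulation; the projection matrices and pixel measurements below are illustrative assumptions, not the paper's data.

```python
import numpy as np

def triangulate(proj_mats, points_px):
    """Linear (DLT) triangulation of one 3D point from its projections
    in several calibrated cameras.  Each camera contributes two rows to
    a homogeneous system A X = 0; the solution is the right singular
    vector of A with the smallest singular value."""
    rows = []
    for P, (u, v) in zip(proj_mats, points_px):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.array(rows)
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # homogeneous 3D point
    return X[:3] / X[3]
```

With three ground cameras whose baselines are well spread, the extra views over-determine the system and noticeably tighten the height estimate, which matches the improvement the abstract reports.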