Eduardo J. Molinos
University of Alcalá
Publications
Featured research published by Eduardo J. Molinos.
Sensors | 2013
Angel Llamazares; Vladimir Ivan; Eduardo J. Molinos; Manuel Ocaña; Sethu Vijayakumar
The goal of this paper is to solve the problem of dynamic obstacle avoidance for a mobile platform by using the stochastic optimal control framework to compute paths that are optimal in terms of safety and energy efficiency under constraints. We propose a three-dimensional extension of the Bayesian Occupancy Filter (BOF) (Coué et al. Int. J. Rob. Res. 2006, 25, 19–30) to deal with the noise in the sensor data, improving the perception stage. We reduce the computational cost of the perception stage by estimating the velocity of each obstacle using optical flow tracking and blob filtering. While several obstacle avoidance systems in the literature address the safety and the optimality of the robot motion separately, we apply the approximate inference framework to this problem to combine multiple goals, constraints and priors in a structured way. It is important to remark that the problem involves obstacles that can be moving; therefore, classical techniques based on reactive control are not optimal from the point of view of energy consumption. Experimental results are presented, including comparisons against classical algorithms that highlight these advantages.
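To make the grid representation concrete, here is a minimal Python sketch of a probabilistic occupancy/velocity grid in the spirit of the BOF extension described above: occupancy is maintained as log-odds and each cell carries a velocity estimate fed by blob tracking. The inverse sensor model, grid size and smoothing factor are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Sketch of an occupancy/velocity grid in the spirit of the BOF extension;
# the inverse sensor model and smoothing factor are illustrative assumptions.

P_HIT, P_MISS = 0.7, 0.4    # assumed inverse sensor model probabilities

def logodds(p):
    return np.log(p / (1.0 - p))

class OccupancyVelocityGrid:
    def __init__(self, shape=(100, 100)):
        self.L = np.zeros(shape)             # log-odds occupancy per cell
        self.vel = np.zeros(shape + (2,))    # per-cell 2D velocity estimate

    def update_occupancy(self, hits, misses):
        # hits/misses: lists of (i, j) cells classified by the range sensor
        for i, j in hits:
            self.L[i, j] += logodds(P_HIT)
        for i, j in misses:
            self.L[i, j] += logodds(P_MISS)

    def update_velocity(self, blob_cells, flow_uv, alpha=0.5):
        # flow_uv: blob displacement rate from optical-flow tracking (cells/s)
        for i, j in blob_cells:
            self.vel[i, j] = (1 - alpha) * self.vel[i, j] + alpha * np.asarray(flow_uv)

    def occupancy(self):
        # recover probabilities from log-odds
        return 1.0 - 1.0 / (1.0 + np.exp(self.L))

grid = OccupancyVelocityGrid()
grid.update_occupancy(hits=[(50, 50)], misses=[(50, 48), (50, 49)])
grid.update_velocity(blob_cells=[(50, 50)], flow_uv=(0.0, 1.5))
```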
Sensors | 2017
Elena López; Sergio García; Rafael Barea; Luis Miguel Bergasa; Eduardo J. Molinos; Roberto Arroyo; Eduardo Romera; Samuel Pardo
One of the main challenges of aerial robot navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely calculates the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot by integrating different state-of-the-art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is assumed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. This makes it possible to improve the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by solving scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can easily be incorporated into the SLAM system, obtaining a local 2.5D map and a footprint estimation of the robot position that improves the 6D pose estimation through the EKF. We present experimental results with two different commercial platforms and validate the system by applying it to their position control.
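One concrete piece of this pipeline is resolving the scale ambiguity of monocular SLAM. The sketch below recovers a metric scale factor by fitting the unscaled visual height track to altimeter readings with a closed-form 1D least squares; the formulation and sample values are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

# Illustrative sketch: recover the metric scale of a monocular SLAM
# trajectory from altimeter readings; the least-squares formulation and
# sample values are assumptions, not the paper's estimator.

def estimate_scale(slam_z, altimeter_z):
    """Fit s so that s * slam_z ~ altimeter_z (both series made zero-mean)."""
    slam_z = np.asarray(slam_z) - np.mean(slam_z)
    alt_z = np.asarray(altimeter_z) - np.mean(altimeter_z)
    # closed-form 1D least squares: s = <slam, alt> / <slam, slam>
    return float(np.dot(slam_z, alt_z) / np.dot(slam_z, slam_z))

slam_z = [0.0, 0.31, 0.58, 0.90]       # unscaled heights from visual SLAM
alt_z  = [0.0, 0.62, 1.17, 1.81]       # metric heights from the altimeter
print(estimate_scale(slam_z, alt_z))   # ~2.0 metres per visual-SLAM unit
```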
Robot | 2016
Elena López; Rafael Barea; Alejandro Gómez; Álvaro Saltos; Luis Miguel Bergasa; Eduardo J. Molinos; Abdelkrim Nemra
This paper presents research in progress on Simultaneous Localization and Mapping (SLAM) for Micro Aerial Vehicles (MAVs) in the context of rescue and/or reconnaissance navigation tasks in indoor environments. In this kind of application, the MAV must rely on its own onboard sensors to autonomously navigate in unknown, hostile and GPS-denied environments, such as ruined or semi-demolished buildings. This article investigates a new SLAM technique that fuses laser and visual information, together with measurements from the inertial unit, to robustly obtain the 6DOF pose estimation of a MAV within a local map of the environment. The laser is used to obtain a local 2D map and a footprint estimation of the MAV position, while a monocular visual SLAM algorithm extends the pose estimation through an Extended Kalman Filter (EKF). The system consists of a commercial drone and a remote control unit that computationally affords the SLAM algorithms using a distributed node system based on ROS (Robot Operating System). Experimental results show how sensor fusion improves the position estimation and the obtained map under different test conditions.
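As a sketch of how the laser footprint can correct the filtered pose, the following minimal EKF measurement update fuses a 2D (x, y, yaw) estimate from scan matching into a 6DOF state; the state layout and noise values are illustrative assumptions rather than the paper's tuned parameters.

```python
import numpy as np

# Minimal EKF measurement update fusing a 2D laser "footprint" (x, y, yaw)
# into a 6DOF state [x y z roll pitch yaw]; noise values are assumptions.

x = np.zeros(6)                 # current 6DOF state estimate
P = np.eye(6) * 0.5             # state covariance

H = np.zeros((3, 6))            # the laser observes x, y and yaw only
H[0, 0] = H[1, 1] = H[2, 5] = 1.0
R = np.diag([0.02, 0.02, 0.01]) # assumed laser measurement noise

def ekf_update(x, P, z, H, R):
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

z_laser = np.array([1.20, 0.35, 0.10])  # footprint pose from scan matching
x, P = ekf_update(x, P, z_laser, H, R)
```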
Robotica | 2016
Fernando Herranz; Angel Llamazares; Eduardo J. Molinos; Manuel Ocaña; Miguel Ángel Sotelo
Localization and mapping in indoor environments, such as airports and hospitals, are key tasks for almost every robotic platform. Some researchers suggest the use of Range-Only (RO) sensors based on WiFi (Wireless Fidelity) technology with SLAM (Simultaneous Localization And Mapping) techniques to solve both problems. The current state of the art in RO SLAM is mainly focused on the filtering approach, while the study of smoothing approaches with RO sensors is quite incomplete. This paper presents a comparison between filtering algorithms, such as EKF and FastSLAM, and a smoothing algorithm, SAM (Smoothing And Mapping). Experimental results are obtained in indoor environments using WiFi sensors. The results demonstrate the feasibility of the smoothing approach using WiFi sensors in an indoor environment.
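The filtering side of this comparison hinges on the range-only measurement model. Below is a toy 2D EKF update for a single WiFi beacon, with the Jacobian of the range function derived in code; the beacon position and noise level are illustrative assumptions.

```python
import numpy as np

# Toy range-only EKF update for one WiFi beacon; the beacon position and
# range noise are illustrative assumptions.

def ro_ekf_update(x, P, r_meas, beacon, sigma_r=0.5):
    """x = [px, py] robot position; r_meas = measured range to the beacon."""
    d = x - beacon
    r_pred = np.linalg.norm(d)
    H = (d / r_pred).reshape(1, 2)          # Jacobian of h(x) = ||x - beacon||
    S = H @ P @ H.T + sigma_r ** 2          # innovation covariance
    K = P @ H.T / S                         # Kalman gain
    x_new = x + (K * (r_meas - r_pred)).ravel()
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new

x, P = np.array([0.0, 0.0]), np.eye(2) * 4.0
x, P = ro_ekf_update(x, P, r_meas=5.2, beacon=np.array([4.0, 3.0]))
```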
International Conference on Advanced Robotics | 2015
José Cano; Eduardo J. Molinos; Vijay Nagarajan; Sethu Vijayakumar
In distributed (mobile) robotics environments, the different computing substrates offer flexible resource allocation options for performing computations that implement an overall system goal. The AnyScale concept that we introduce and describe in this paper exploits this redundancy by dynamically allocating tasks to appropriate substrates (or scales), and migrating them as resource and performance parameters change, in order to optimize system performance. In this paper, we demonstrate this concept with a general ROS-based infrastructure that solves the task allocation problem by optimizing system performance while correctly reacting to unpredictable events at the same time. Assignment decisions are based on a characterization of the static/dynamic parameters that represent the system and its interaction with the environment. We instantiate our infrastructure on a case study application, in which a mobile robot navigates along the floor of a building trying to reach a predefined goal. Experimental validation demonstrates more robust performance (around a one-third improvement in the measured metrics) under the AnyScale implementation framework.
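To illustrate the kind of decision the AnyScale infrastructure has to make, here is a toy allocation step that scores each substrate from current resource and performance parameters and assigns the task to the best feasible one. The fields and scoring weights are illustrative assumptions, not the paper's characterization.

```python
from dataclasses import dataclass

# Toy allocation step in the spirit of the AnyScale concept; the fields and
# scoring weights are illustrative assumptions, not the paper's model.

@dataclass
class Substrate:
    name: str
    cpu_free: float      # fraction of CPU currently available
    latency_ms: float    # round-trip latency between robot and substrate

def score(sub: Substrate, cpu_needed: float, deadline_ms: float) -> float:
    # Rule out substrates that are too loaded or too slow for the task.
    if sub.cpu_free < cpu_needed or sub.latency_ms > deadline_ms:
        return float("-inf")
    # Prefer spare capacity, penalize latency relative to the task deadline.
    return sub.cpu_free - sub.latency_ms / deadline_ms

def allocate(substrates, cpu_needed=0.3, deadline_ms=50.0):
    return max(substrates, key=lambda s: score(s, cpu_needed, deadline_ms)).name

subs = [Substrate("onboard", 0.2, 1.0), Substrate("server", 0.8, 20.0)]
print(allocate(subs))    # -> "server" given this toy resource state
```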
2016 International Conference on Autonomous Robot Systems and Competitions (ICARSC) | 2016
Sergio García; M. Elena Lopez; Rafael Barea; Luis Miguel Bergasa; Alejandro Gómez; Eduardo J. Molinos
This paper presents research in progress on Simultaneous Localization and Mapping (SLAM) for Micro Aerial Vehicles (MAVs) in the context of rescue and/or reconnaissance navigation tasks in indoor environments. In this kind of application, the MAV must rely on its own onboard sensors to autonomously navigate in unknown, hostile and GPS-denied environments, such as ruined or semi-demolished buildings. This article investigates a SLAM technique that fuses visual information and measurements from the inertial measurement unit (IMU) to robustly obtain the 6DOF pose estimation of a MAV within a local map of the environment. The monocular visual SLAM algorithm, together with the IMU, calculates the pose estimation through an Extended Kalman Filter (EKF). The system consists of a low-cost commercial drone and a remote control unit that computationally affords the SLAM algorithms using a distributed node system based on ROS (Robot Operating System). Experimental results show how sensor fusion improves the position estimation and the obtained map under different test conditions.
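Complementing the measurement updates sketched earlier, the prediction half of such a visual/inertial EKF integrates IMU readings between camera frames. The planar constant-velocity model below is an illustrative assumption, not the paper's filter.

```python
import numpy as np

# EKF prediction step driven by IMU readings; the planar state layout and
# additive process noise are illustrative assumptions.

def imu_predict(x, P, gyro_z, accel_xy, dt, q=0.05):
    """x = [px, py, yaw, vx, vy]; integrate yaw rate and body-frame accel."""
    px, py, yaw, vx, vy = x
    yaw_new = yaw + gyro_z * dt
    # rotate body-frame acceleration into the world frame
    c, s = np.cos(yaw), np.sin(yaw)
    ax = c * accel_xy[0] - s * accel_xy[1]
    ay = s * accel_xy[0] + c * accel_xy[1]
    x_new = np.array([px + vx * dt, py + vy * dt, yaw_new,
                      vx + ax * dt, vy + ay * dt])
    P_new = P + np.eye(5) * q * dt       # simplified additive process noise
    return x_new, P_new

x, P = np.zeros(5), np.eye(5) * 0.1
x, P = imu_predict(x, P, gyro_z=0.02, accel_xy=(0.5, 0.0), dt=0.01)
```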
IEEE/SICE International Symposium on System Integration | 2014
Eduardo J. Molinos; Angel Llamazares; Manuel Ocaña; Fernando Herranz
Traditionally, obstacle avoidance algorithms have been developed and applied successfully to mobile robots that work in controlled, static environments. However, when working in real scenarios the problem becomes more complex, since the environment is dynamic and the algorithms must be enhanced to deal with moving objects. In this paper we propose a new method, based on the well-known Curvature Velocity Method (CVM) and a probabilistic 3D occupancy and velocity grid developed by the authors, that can deal with these dynamic scenarios. The proposal is validated in real and simulated environments.
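A minimal sketch of a CVM-style selection loop follows: candidate (v, w) velocity commands are scored by speed, final heading and clearance along their circular arc. The cost weights and clearance model are illustrative assumptions; the paper's method additionally accounts for obstacle motion via the 3D occupancy and velocity grid.

```python
import numpy as np

# CVM-style command selection; cost weights and the clearance model are
# illustrative assumptions, and obstacles are treated as static here.

def clearance(v, w, obstacles, horizon=2.0, steps=20):
    """Distance travelled along the (v, w) arc before coming within 0.3 m of an obstacle."""
    x = y = th = 0.0
    dt = horizon / steps
    for k in range(steps):
        x += v * np.cos(th) * dt
        y += v * np.sin(th) * dt
        th += w * dt
        if any(np.hypot(x - ox, y - oy) < 0.3 for ox, oy in obstacles):
            return v * dt * k          # arc is blocked after this distance
    return v * horizon                 # arc is free over the whole horizon

def cvm_select(obstacles, goal_heading=0.0, horizon=2.0):
    best, best_cost = None, float("inf")
    for v in np.linspace(0.1, 1.0, 10):          # candidate linear velocities
        for w in np.linspace(-1.0, 1.0, 21):     # candidate angular velocities
            heading_err = abs(goal_heading - w * horizon)
            cost = -1.0 * v + 0.5 * heading_err - 2.0 * clearance(v, w, obstacles)
            if cost < best_cost:
                best, best_cost = (v, w), cost
    return best

print(cvm_select(obstacles=[(1.0, 0.0)]))    # steers around the obstacle ahead
```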
Computer Aided Systems Theory | 2011
Angel Llamazares; Eduardo J. Molinos; Manuel Ocaña; Luis Miguel Bergasa; Noelia Hernández; Fernando Herranz
In this paper we present a technique to build 3D maps of the environment using a 2D laser scanner combined with the robot's action model. The paper demonstrates that it is possible to build 3D maps inexpensively using an angled 2D laser. We introduce a scan matching method to minimize the odometry errors of the robotic platform and a calibration method to improve the accuracy of the system. Experimental results and conclusions are presented.
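The core geometric step, sketched below, projects each range reading from the tilted 2D laser into a 3D world point using the robot pose supplied by the action model; the mounting height and tilt angle are illustrative assumptions.

```python
import numpy as np

# Project ranges from a downward-tilted 2D laser into 3D world points using
# the robot pose; mounting height and tilt angle are assumptions.

TILT = np.deg2rad(30.0)     # assumed downward tilt of the laser
MOUNT_Z = 0.5               # assumed mounting height above the floor (m)

def scan_to_3d(ranges, angles, robot_pose):
    """robot_pose = (x, y, yaw) from odometry corrected by scan matching."""
    rx, ry, yaw = robot_pose
    pts = []
    for r, a in zip(ranges, angles):
        # point in the laser frame, then tilted about the lateral axis
        xl, yl = r * np.cos(a), r * np.sin(a)
        xb = xl * np.cos(TILT)
        zb = MOUNT_Z - xl * np.sin(TILT)
        # rotate into the world frame by robot yaw and translate
        xw = rx + xb * np.cos(yaw) - yl * np.sin(yaw)
        yw = ry + xb * np.sin(yaw) + yl * np.cos(yaw)
        pts.append((xw, yw, zb))
    return np.array(pts)

pts = scan_to_3d([1.0, 1.2], [np.deg2rad(-5), np.deg2rad(5)], (0.0, 0.0, 0.0))
```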
International Conference on Robotics and Automation | 2014
Fernando Herranz; Angel Llamazares; Eduardo J. Molinos; Manuel Ocaña
Localization and mapping in indoor environments, such as airports and hospitals, are key tasks for almost every robotic platform. Some researchers suggest the use of RO (Range-Only) sensors based on WiFi (Wireless Fidelity) technology with SLAM (Simultaneous Localization And Mapping) techniques. The current state of the art in RO SLAM is mainly focused on the filtering approach, while the study of smoothing approaches with RO sensors is quite incomplete. This paper presents a comparison between a filtering algorithm, the EKF, and a smoothing algorithm, SAM (Smoothing And Mapping). Experimental results are obtained, first in an outdoor environment using two types of RO sensors and then in an indoor environment with WiFi sensors. The results demonstrate the feasibility of the smoothing approach with WiFi sensors indoors.
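In contrast to the per-measurement EKF update, a SAM-style smoother keeps the whole trajectory and the beacon positions as unknowns and solves one nonlinear least-squares problem over odometry and range factors. The toy 2D formulation below (using SciPy's generic solver rather than a factor-graph library) is an illustrative assumption, not the paper's exact system.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy SAM-style smoothing: jointly optimize all poses and one beacon over
# odometry and range factors; the data and formulation are assumptions.

odom = [np.array([1.0, 0.0])] * 3              # relative motion between poses
ranges = {(0, 0): 5.0, (1, 0): 4.1, (2, 0): 3.2, (3, 0): 2.4}  # (pose, beacon): range
n_poses, n_beacons = 4, 1

def residuals(theta):
    poses = theta[:2 * n_poses].reshape(n_poses, 2)
    beacons = theta[2 * n_poses:].reshape(n_beacons, 2)
    res = list(poses[0])                                 # prior anchoring pose 0
    for i, d in enumerate(odom):                         # odometry factors
        res.extend(poses[i + 1] - poses[i] - d)
    for (i, j), r in ranges.items():                     # range factors
        res.append(np.linalg.norm(poses[i] - beacons[j]) - r)
    return res

theta0 = np.zeros(2 * n_poses + 2 * n_beacons) + 0.1     # mild symmetry break
sol = least_squares(residuals, theta0)
print(sol.x[-2:])    # estimated beacon position
```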
Robot | 2014
Eduardo J. Molinos; Angel Llamazares; Noelia Hernández; Roberto Arroyo; Andres F. Cela; José Javier Yebes; Manuel Ocaña; Luis Miguel Bergasa
This paper presents different techniques to achieve the tasks proposed in the DARPA (Defense Advanced Research Projects Agency) VRC (Virtual Robotics Challenge), which entail recognizing objects, localizing the robot, and mapping the simulated environments of the Challenge. Data acquisition relies on several sensors, such as a stereo camera, a 2D laser, an IMU (Inertial Measurement Unit) and stress sensors. Using the map and the position of the robot inside it, we propose a safe path-planning approach to navigate through the environment with an Atlas humanoid robot.
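As an example of the safe-planning idea, the sketch below runs A* over an occupancy grid after inflating obstacles by the robot's radius so the resulting path keeps a safety margin; the grid, costs and inflation radius are illustrative assumptions, not the Challenge setup.

```python
import heapq

# A* over an inflated occupancy grid; the grid, unit step costs and the
# inflation radius are illustrative assumptions.

def inflate(grid, radius=1):
    """Mark every cell within `radius` of an obstacle as occupied."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                for dr in range(-radius, radius + 1):
                    for dc in range(-radius, radius + 1):
                        if 0 <= r + dr < rows and 0 <= c + dc < cols:
                            out[r + dr][c + dc] = 1
    return out

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start, [start])]    # (f-score, cell, path so far)
    seen = set()
    while open_set:
        _, (r, c), path = heapq.heappop(open_set)
        if (r, c) == goal:
            return path
        if (r, c) in seen:
            continue
        seen.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                g = len(path)                                # unit step cost
                h = abs(nr - goal[0]) + abs(nc - goal[1])    # Manhattan heuristic
                heapq.heappush(open_set, (g + h, (nr, nc), path + [(nr, nc)]))
    return None                                              # no safe path

grid = [[0, 0, 0, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0]]
print(astar(inflate(grid), (0, 0), (4, 4)))
```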