Helen Oleynikova
ETH Zurich
Publications
Featured research published by Helen Oleynikova.
Intelligent Robots and Systems | 2014
Dominik Honegger; Helen Oleynikova; Marc Pollefeys
Recent developments make smartphones an ideal platform for robotics and computer vision applications: they are small, powerful embedded devices with low-power mobile CPUs. However, although their computational power has increased substantially in recent years, smartphones are still not capable of performing intense computer vision tasks in real time, at high frame rates and low latency. We present a combination of an FPGA and a mobile CPU to overcome the computational and latency limitations of mobile CPUs alone. With the FPGA as an additional layer between the image sensor and the CPU, the system is capable of accelerating computer vision algorithms to real-time performance. Low-latency calculation allows for direct usage within the control loops of mobile robots. A stereo camera setup with disparity estimation based on the semi-global matching algorithm is implemented as an accelerated example application. The system calculates dense disparity images with 752×480 pixel resolution at 60 frames per second. The overall latency of the disparity estimation is less than 2 milliseconds. The system is suitable for any mobile robot application due to its light weight and low power consumption.
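As a rough CPU-side analogue of the FPGA pipeline described above, the sketch below computes a dense disparity map from a rectified stereo pair with OpenCV's semi-global block matcher. It is an illustration only: the parameter values are assumptions, and the paper's actual implementation runs semi-global matching in FPGA logic at 752×480 / 60 Hz.

```python
# Illustrative CPU analogue of an SGM-based stereo pipeline (not the paper's FPGA code).
import cv2

def compute_disparity(left_gray, right_gray, max_disparity=64, block_size=5):
    """Return a dense disparity map from a rectified grayscale stereo pair."""
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=max_disparity,      # must be a multiple of 16
        blockSize=block_size,
        P1=8 * block_size * block_size,    # SGM smoothness penalties
        P2=32 * block_size * block_size,
    )
    # StereoSGBM returns fixed-point disparities scaled by 16.
    return sgbm.compute(left_gray, right_gray).astype("float32") / 16.0

# Example usage (assumes rectified images left.png / right.png exist):
# left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
# right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
# disparity = compute_disparity(left, right)
```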
International Conference on Robotics and Automation | 2016
Andreas Bircher; Mina Kamel; Kostas Alexis; Helen Oleynikova; Roland Siegwart
This paper presents a novel path planning algorithm for the autonomous exploration of unknown space using aerial robotic platforms. The proposed planner employs a receding horizon “next-best-view” scheme: in an online-computed random tree, it finds the best branch, whose quality is determined by the amount of unmapped space that can be explored. Only the first edge of this branch is executed at every planning step, and repeating this procedure leads to complete exploration. The proposed planner is capable of running online, onboard a robot with limited resources. Its high performance is evaluated in detailed simulation studies as well as in a challenging real-world experiment using a rotorcraft micro aerial vehicle. An analysis of the algorithm's computational complexity is provided, and its good scaling properties enable the handling of large-scale and complex problem setups.
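A minimal, self-contained sketch of the receding-horizon “next-best-view” idea on a toy 2D grid, under my own simplifying assumptions (square sensor footprint, random-walk branches): grow a small random tree of candidate poses, score each branch by how many unknown cells its viewpoints would uncover, and execute only the first edge of the best branch before replanning. It is not the paper's implementation.

```python
# Toy receding-horizon next-best-view step on a 2D occupancy grid (illustrative only).
import random

UNKNOWN, FREE = -1, 0
random.seed(0)

def visible_unknown(grid, pose, radius=3):
    """Count unknown cells within a square sensor footprint around a pose."""
    x, y = pose
    count = 0
    for i in range(x - radius, x + radius + 1):
        for j in range(y - radius, y + radius + 1):
            if 0 <= i < len(grid) and 0 <= j < len(grid[0]) and grid[i][j] == UNKNOWN:
                count += 1
    return count

def sample_branch(start, depth=4, step=2):
    """Random walk of candidate viewpoints starting from the current pose."""
    branch, pose = [], start
    for _ in range(depth):
        dx, dy = random.choice([(step, 0), (-step, 0), (0, step), (0, -step)])
        pose = (pose[0] + dx, pose[1] + dy)
        branch.append(pose)
    return branch

def next_best_edge(grid, pose, n_branches=50):
    """Return the first viewpoint of the highest-gain branch (one planning step)."""
    best_gain, best_edge = -1, pose
    for _ in range(n_branches):
        branch = sample_branch(pose)
        gain = sum(visible_unknown(grid, p) for p in branch)
        if gain > best_gain:
            best_gain, best_edge = gain, branch[0]
    return best_edge

# Toy usage: a 20x20 map where only the robot's start area is known.
grid = [[UNKNOWN] * 20 for _ in range(20)]
for i in range(5):
    for j in range(5):
        grid[i][j] = FREE
print(next_best_edge(grid, (2, 2)))
```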
Intelligent Robots and Systems | 2015
Michael Burri; Helen Oleynikova; Markus W. Achtelik; Roland Siegwart
In this work, we present an MAV system that is able to relocalize itself, create consistent maps, and plan paths in full 3D in previously unknown environments. This is based solely on vision and IMU measurements, with all components running onboard and in real time. We use visual-inertial odometry to keep the MAV safely airborne in its local frame, as well as for exploration of the environment based on high-level operator input. A globally consistent map is constructed in the background and is then used to correct for drift of the visual odometry algorithm. This map serves as an input to our proposed global planner, which finds dynamic 3D paths to any previously visited place in the map without the use of teach-and-repeat algorithms. In contrast to previous work, all components are executed onboard and in real time without any prior knowledge of the environment.
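The sketch below illustrates, under my own assumptions and in 2D for brevity, the frame handling described above: the drifting visual-odometry estimate stays untouched for local control, while a slowly updated map-to-odometry transform re-anchors it in the globally consistent map frame. The transform names are illustrative, not the paper's notation.

```python
# Illustrative frame composition: global pose = map-to-odometry correction * local VIO pose.
import numpy as np

def se2(x, y, yaw):
    """Homogeneous 2D transform (the real system works in 3D)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x], [s, c, y], [0, 0, 1]])

T_odom_body = se2(10.0, 2.0, 0.1)      # drifting VIO estimate (local frame)
T_map_odom = se2(-0.5, 0.3, -0.02)     # correction estimated against the global map

T_map_body = T_map_odom @ T_odom_body  # globally aligned pose used for planning
print(T_map_body)
```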
International Conference on Robotics and Automation | 2015
Helen Oleynikova; Dominik Honegger; Marc Pollefeys
High-speed, low-latency obstacle avoidance is essential for enabling Micro Aerial Vehicles (MAVs) to function in cluttered and dynamic environments. While other systems exist that perform high-level mapping and 3D path planning for obstacle avoidance, most of these systems require high-powered on-board CPUs or off-board control from a ground station. We present a novel, entirely on-board approach, leveraging a lightweight, low-power stereo vision system on an FPGA. Our approach runs at 60 frames per second on VGA-sized images and minimizes latency between image acquisition and reactive maneuvers, allowing MAVs to fly more safely and robustly in complex environments. We also propose our system as a lightweight safety layer for systems undertaking more complex tasks, such as mapping the environment. Finally, we show our algorithm implemented on a lightweight, very computationally constrained platform and demonstrate obstacle avoidance in a variety of environments.
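A hypothetical sketch of a disparity-based reactive layer in the spirit of the approach above (the thresholds, the three-way image split, and the discrete commands are my own simplifications, not the paper's controller): large disparities mean near obstacles, so the vehicle steers away from the image region containing the most of them.

```python
# Toy reactive avoidance from a dense disparity image (illustrative only).
import numpy as np

def reactive_command(disparity, near_threshold=40.0):
    """Return a simple avoidance command from a dense disparity image."""
    thirds = np.array_split(disparity, 3, axis=1)          # left / center / right
    left, center, right = [np.mean(t > near_threshold) for t in thirds]
    if center < 0.05:                                       # path ahead is clear
        return "forward"
    return "turn_left" if right > left else "turn_right"

# Toy usage: a synthetic disparity image with an obstacle ahead and to the right.
disp = np.zeros((480, 752), dtype=np.float32)
disp[:, 300:] = 60.0
print(reactive_command(disp))   # -> "turn_left"
```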
Autonomous Robots | 2018
Andreas Bircher; Mina Kamel; Kostas Alexis; Helen Oleynikova; Roland Siegwart
In this paper, a new path planning algorithm for autonomous robotic exploration and inspection is presented. The proposed method plans online in a receding-horizon fashion by sampling possible future configurations in a geometric random tree. The choice of the objective function enables planning for either the exploration of unknown volume or the inspection of a given surface manifold, in both known and unknown volume. Application to rotorcraft Micro Aerial Vehicles is presented, although planning for other types of robotic platforms is possible, even in the absence of a boundary value solver and subject to nonholonomic constraints. Furthermore, the method allows the integration of a wide variety of sensor models. The presented analysis of computational complexity and a thorough simulation-based evaluation indicate good scaling properties with respect to scenario complexity. Feasibility and practical applicability are demonstrated in real-life experimental test cases with full on-board computation.
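The exploration/inspection switch mentioned above comes down to which gain function scores a candidate configuration. The sketch below is a loose, self-contained illustration with placeholder data structures and a trivial visibility model, not the paper's gain formulation.

```python
# Two interchangeable objective functions for the same receding-horizon planner (illustrative).
def exploration_gain(view, unknown_voxels, visible):
    """Reward unmapped volume visible from a candidate view."""
    return sum(1 for v in unknown_voxels if visible(view, v))

def inspection_gain(view, surface_faces, inspected, visible):
    """Reward surface patches not yet inspected but visible from the view."""
    return sum(1 for f in surface_faces if f not in inspected and visible(view, f))

# Toy usage with an "everything is visible" model.
always = lambda view, x: True
print(exploration_gain((0, 0), [(1, 1), (2, 2)], always))        # -> 2
print(inspection_gain((0, 0), ["f1", "f2"], {"f1"}, always))     # -> 1
```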
Intelligent Robots and Systems | 2015
Helen Oleynikova; Michael Burri; Simon Lynen; Roland Siegwart
Localization is essential for robots to operate autonomously, especially for extended periods of time, when estimator drift tends to destroy alignment to any global map. Though there has been extensive work in vision-based localization in recent years, including several systems that show real-time performance, none have been demonstrated running entirely on-board in closed loop on robotic platforms. We propose a fast, real-time localization system that keeps the existing local visual-inertial odometry frame consistent for controllers and collision avoidance, while correcting drift and alignment to a global coordinate frame. We demonstrate our localization system entirely on-board an aerial and ground robot, showing a collaboration experiment where both robots are able to localize against the same map accurately enough to allow the multicopter to land on top of the ground robot. We also perform extensive evaluations for the proposed closed-loop system on ground-truth datasets from MAV flight in an industrial setting.
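One ingredient of correcting drift against a global map is rigidly aligning locally observed landmark positions to their known map positions. The sketch below is my own 2D illustration of that step (a standard Kabsch/Umeyama least-squares alignment), not the paper's estimator, which fuses such corrections with visual-inertial odometry on-board.

```python
# Least-squares rigid alignment of local landmark observations to a global map (2D, illustrative).
import numpy as np

def align_2d(local_pts, map_pts):
    """Find rotation R and translation t such that map ≈ R @ local + t."""
    mu_l, mu_m = local_pts.mean(axis=0), map_pts.mean(axis=0)
    H = (local_pts - mu_l).T @ (map_pts - mu_m)      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                         # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_l
    return R, t

# Toy usage: landmarks seen by the drifting odometry vs. the global map.
lm_local = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
theta = 0.05
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
lm_map = lm_local @ R_true.T + np.array([0.3, -0.2])
R, t = align_2d(lm_local, lm_map)
print(np.round(R, 3), np.round(t, 3))                # recovers the drift correction
```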
International Conference on Robotics and Automation | 2016
Michael Burri; Janosch Nikolic; Helen Oleynikova; Markus W. Achtelik; Roland Siegwart
As applications of Micro Aerial Vehicles (MAVs) become more complex and require highly dynamic motions, it becomes essential to have an accurate dynamic model of the MAV. Such a model can be used for reliable state estimation, control, and realistic simulation. A good model requires accurate estimates of the physical parameters of the system, which we aim to estimate from recorded flight data. In this paper, we present a detailed physical model of the MAV and a maximum likelihood estimation scheme for determining the dominant parameters, such as the inertia matrix, the center of gravity (CoG) with respect to the IMU, and parameters related to the aerodynamics. To incorporate all information given by the IMU and the physical MAV model, we propose to use two process models in the optimization. We show the effectiveness of the method on simulated data, as well as on a real platform.
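The core idea, boiled down to a toy example of my own (not the paper's full model): under Gaussian measurement noise, maximum-likelihood estimation of a physical parameter reduces to least squares on the model residuals. Here a single scalar inertia is recovered from simulated torque and angular-acceleration pairs, whereas the paper estimates the full inertia matrix, CoG offset, and aerodynamic parameters from real IMU data.

```python
# Maximum-likelihood (= least-squares under Gaussian noise) fit of a scalar inertia (toy example).
import numpy as np

rng = np.random.default_rng(1)
I_zz_true = 0.021                                    # kg m^2 (made-up value)
ang_accel = rng.uniform(-20.0, 20.0, size=500)       # rad/s^2, simulated "flight data"
torque = I_zz_true * ang_accel + rng.normal(0.0, 1e-3, size=500)   # noisy torques

# Model: torque = I_zz * ang_accel; ML estimate is ordinary least squares through the origin.
I_zz_hat = np.sum(torque * ang_accel) / np.sum(ang_accel ** 2)
print(f"estimated I_zz = {I_zz_hat:.4f} (true {I_zz_true})")
```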
Archive | 2019
Christos Papachristos; Mina Kamel; Marija Popovic; Shehryar Khattak; Andreas Bircher; Helen Oleynikova; Tung Dang; Frank Mascarich; Kostas Alexis; Roland Siegwart
This use-case chapter presents a set of algorithms for the problems of autonomous exploration, terrain monitoring, and optimized inspection path planning using aerial robots. The autonomous exploration algorithms described employ a receding horizon structure to iteratively derive the action that the robot should take to optimally explore its environment when no prior map is available, with an extension to localization uncertainty-aware planning. Terrain monitoring is tackled by a finite-horizon informative planning algorithm that further respects time budget limitations. For the problem of optimized inspection with a model of the environment known a priori, an offline path planning algorithm is proposed. All proposed methods are characterized by computational efficiency and have been tested thoroughly via multiple experiments. The Robot Operating System (ROS) is the common middleware for the outlined family of methods. By the end of this chapter, the reader should be able to use the open-source implementations of the algorithms presented, implement them from scratch, or modify them to further fit the needs of a particular autonomous exploration, terrain monitoring, or structural inspection mission using aerial robots. Four different open-source ROS packages (compatible with ROS Indigo, Jade, and Kinetic) are released, and the repository https://github.com/unr-arl/informative-planning stands as a single point of reference for all of them.
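To give a flavor of budget-limited informative planning, the sketch below greedily picks measurement sites by information-per-cost until a flight-time budget is spent. The greedy strategy, names, and numbers are my own simplification for illustration, not the chapter's algorithms or the released ROS packages.

```python
# Budget-constrained greedy selection of informative measurement sites (illustrative only).
def plan_with_budget(candidates, info, cost, budget):
    """Pick sites by information-per-cost ratio until the budget is exhausted."""
    plan, remaining = [], budget
    for site in sorted(candidates, key=lambda s: info[s] / cost[s], reverse=True):
        if cost[site] <= remaining:
            plan.append(site)
            remaining -= cost[site]
    return plan

# Toy usage with three candidate sites and a flight-time budget of 60 s.
info = {"A": 5.0, "B": 3.0, "C": 4.0}      # expected information gain
cost = {"A": 40.0, "B": 10.0, "C": 25.0}   # flight time in seconds
print(plan_with_budget(["A", "B", "C"], info, cost, budget=60.0))  # -> ['B', 'C']
```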
Intelligent Robots and Systems | 2016
Helen Oleynikova; Michael Burri; Zachary Taylor; Juan I. Nieto; Roland Siegwart; Enric Galceran
Intelligent Robots and Systems | 2017
Helen Oleynikova; Zachary Taylor; Marius Fehr; Roland Siegwart; Juan I. Nieto