Stefan Hrabar
Commonwealth Scientific and Industrial Research Organisation
Publications
Featured research published by Stefan Hrabar.
International Conference on Robotics and Automation | 2004
Peter Corke; Stefan Hrabar; Ronald A. Peterson; Daniela Rus; Srikanth Saripalli; Gaurav S. Sukhatme
We describe a sensor network deployment method using autonomous flying robots. Such networks are suitable for tasks such as large-scale environmental monitoring or for command and control in emergency situations. We describe in detail the algorithms used for deployment and for measuring network connectivity, and provide experimental data we collected from field trials. A particular focus is on determining gaps in connectivity of the deployed network and generating a plan for a second, repair, pass to complete the connectivity. This project is the result of a collaboration between three robotics labs (CSIRO, USC, and Dartmouth).
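A minimal sketch of the gap-finding idea, assuming a simple disc radio model: nodes within a fixed range can communicate, disconnected clusters are found by graph search, and midpoints between the closest nodes of separate clusters are proposed as drop points for the repair pass. The node positions, COMM_RANGE value and the components/repair_points helpers are illustrative assumptions, not the paper's algorithm or measured data.

```python
# Sketch of connectivity-gap detection for a deployed sensor network.
# The node positions, the disc radio model and the midpoint repair heuristic
# are illustrative assumptions, not the algorithm or data from the paper.
import itertools
import math

COMM_RANGE = 30.0  # assumed radio range in metres

def components(nodes):
    """Group node indices into connected components under the disc model."""
    unvisited = set(range(len(nodes)))
    comps = []
    while unvisited:
        stack = [unvisited.pop()]
        comp = {stack[0]}
        while stack:
            i = stack.pop()
            for j in list(unvisited):
                if math.dist(nodes[i], nodes[j]) <= COMM_RANGE:
                    unvisited.remove(j)
                    comp.add(j)
                    stack.append(j)
        comps.append(comp)
    return comps

def repair_points(nodes):
    """Propose drop locations: midpoints of the closest pair bridging each gap."""
    repairs = []
    for a, b in itertools.combinations(components(nodes), 2):
        i, j = min(((i, j) for i in a for j in b),
                   key=lambda p: math.dist(nodes[p[0]], nodes[p[1]]))
        repairs.append(((nodes[i][0] + nodes[j][0]) / 2,
                        (nodes[i][1] + nodes[j][1]) / 2))
    return repairs

if __name__ == "__main__":
    deployed = [(0, 0), (20, 5), (60, 0), (75, 10)]  # two disconnected clusters
    print(repair_points(deployed))  # -> [(40.0, 2.5)], one drop point in the gap
```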
Intelligent Robots and Systems | 2008
Stefan Hrabar
We present a synthesis of techniques for rotorcraft UAV navigation through unknown environments which may contain obstacles. D* Lite and probabilistic roadmaps are combined for path planning, together with stereo vision for obstacle detection and dynamic path updating. A 3D occupancy map is used to represent the environment, and is updated online using stereo data. The target application is autonomous helicopter-based structure inspections, which require the UAV to fly safely close to the structures it is inspecting. Results are presented from simulation and with real flight hardware mounted onboard a cable array robot, demonstrating successful navigation through unknown environments containing obstacles.
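The online map-update and replanning loop can be sketched roughly as below, with a set-based voxel map and a pluggable replan callback standing in for the paper's D* Lite / probabilistic roadmap planner; the voxel size and trigger logic are assumptions for illustration.

```python
# Minimal sketch of the online map-update / replan loop described above.
# The voxel size, the set-based occupancy map and the replan trigger are
# simplified assumptions; the paper combines D* Lite with probabilistic
# roadmaps over a 3D occupancy map built from stereo.
import numpy as np

VOXEL = 0.5  # assumed voxel edge length in metres

def to_voxel(p):
    """Map a 3D point to its integer voxel index."""
    return tuple((np.asarray(p, dtype=float) // VOXEL).astype(int))

class OccupancyMap:
    def __init__(self):
        self.occupied = set()

    def insert_stereo_points(self, points_world):
        """Mark the voxels hit by stereo range points as occupied."""
        for p in points_world:
            self.occupied.add(to_voxel(p))

    def path_blocked(self, waypoints):
        """Return True if any path waypoint falls inside an occupied voxel."""
        return any(to_voxel(w) in self.occupied for w in waypoints)

def navigation_step(occ_map, path, stereo_points, replan):
    """One cycle: fuse the latest stereo data, then replan if the path is blocked."""
    occ_map.insert_stereo_points(stereo_points)
    if occ_map.path_blocked(path):
        path = replan(occ_map)  # stand-in for a D* Lite / roadmap query
    return path

occ = OccupancyMap()
path = [(0.0, 0.0, 2.0), (5.0, 0.0, 2.0), (10.0, 0.0, 2.0)]
hits = [(5.1, 0.1, 2.0)]  # a stereo return landing on the planned route
print(navigation_step(occ, path, hits, replan=lambda m: []))  # -> [] (replanned)
```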
International Conference on Robotics and Automation | 2003
Stefan Hrabar; Gaurav S. Sukhatme
We present the design and implementation of an omnidirectional vision system used for sideways-looking sensing on an autonomous helicopter. To demonstrate the capabilities of the system, a visual servoing task was designed which required the helicopter to locate and move towards the centroid of a number of visual targets. Results are presented showing that the task was successfully completed by a Pioneer ground robot equipped with the same omnidirectional vision system, and preliminary test flight results show that the system can generate appropriate control commands for the helicopter.
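A rough sketch of such a centroid-seeking servo loop, assuming the omnidirectional system reports an azimuth bearing for each detected target; the gains and the simple proportional law are illustrative, not the controller used in the paper.

```python
# Rough sketch of a centroid-seeking visual servo, assuming the omnidirectional
# system reports an azimuth bearing for each detected target. The gains and the
# simple proportional law are illustrative, not the controller from the paper.
import math

K_YAW = 0.8  # assumed gain on heading error
K_FWD = 0.5  # assumed forward-speed gain

def servo_command(target_bearings_rad):
    """Return (forward_speed, yaw_rate) steering toward the bearing centroid."""
    if not target_bearings_rad:
        return 0.0, 0.0
    # Average the bearings on the unit circle to avoid angle wrap-around.
    cx = sum(math.cos(b) for b in target_bearings_rad)
    cy = sum(math.sin(b) for b in target_bearings_rad)
    heading_error = math.atan2(cy, cx)
    yaw_rate = K_YAW * heading_error
    forward = K_FWD * max(0.0, math.cos(heading_error))  # slow down while turning
    return forward, yaw_rate

print(servo_command([math.radians(20), math.radians(40)]))  # steer toward ~30 degrees
```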
Intelligent Robots and Systems | 2004
Stefan Hrabar; Gaurav S. Sukhatme
We present a comparison of two camera configurations for avoiding obstacles in 3D-space using optic flow. The two configurations were developed for use on an autonomous helicopter, with the aim of enabling it to fly in environments with tall obstacles (e.g. urban canyons). The comparison is made based on real data captured from two sideways-looking cameras and an omnidirectional camera mounted onboard an autonomous helicopter. Optic flow information from the images is used to determine the relative distance to obstacles on each side of the helicopter. We show that on average, both camera configurations are equally effective and that they can be used to tell which of the canyon walls is closer with an accuracy of 74%. It is noted that each configuration is however more effective under certain conditions, and so a suitable hybrid approach is suggested. We also show that there is a linear relationship between the optic flow ratios and the position of the helicopter with respect to the center of the canyon. We use this relationship to develop a proportional control strategy for flying the helicopter along the Voronoi line between buildings.
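The flow-balancing idea behind this kind of control can be sketched as below, using the standard pure-translation flow model for walls parallel to the flight path; the gain and sign convention are assumptions, not the proportional strategy reported in the paper.

```python
# Sketch of a flow-balancing (centering) controller, assuming the standard
# pure-translation flow model for walls parallel to the flight path. The gain
# and sign convention are illustrative, not the strategy reported in the paper.
import math

K_LAT = 1.5  # assumed gain on the normalised flow imbalance

def side_flow(speed, wall_dist, cam_angle_rad):
    """Flow magnitude (rad/s) from a parallel wall, pinhole pure-translation model."""
    return speed * math.sin(cam_angle_rad) ** 2 / wall_dist

def centering_command(flow_left, flow_right):
    """Lateral command: move away from the side producing the larger flow."""
    imbalance = (flow_left - flow_right) / (flow_left + flow_right + 1e-9)
    return -K_LAT * imbalance  # negative here means move right, away from the left wall

# Helicopter 4 m from the left canyon wall and 8 m from the right wall:
fl = side_flow(2.0, 4.0, math.radians(90))
fr = side_flow(2.0, 8.0, math.radians(90))
print(centering_command(fl, fr))  # left flow is larger, so the command moves away from the left wall
```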
Faculty of Built Environment and Engineering | 2006
Peter Corke; Stefan Hrabar; Ronald A. Peterson; Daniela Rus; Srikanth Saripalli; Gaurav S. Sukhatme
We consider multi-robot systems that include sensor nodes and aerial or ground robots networked together. Such networks are suitable for tasks such as large-scale environmental monitoring or for command and control in emergency situations. We present a sensor network deployment method using autonomous aerial vehicles, describe in detail the algorithms used for deployment and for measuring network connectivity, and provide experimental data collected from field trials. A particular focus is on determining gaps in connectivity of the deployed network and generating a plan for a repair pass to complete the connectivity. This project is the result of a collaboration between three robotics labs (CSIRO, USC, and Dartmouth).
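As a complement to the gap-finding sketch earlier, the connectivity check itself might look like the following, assuming link quality is summarised as a packet-delivery ratio per node pair; the threshold and link table are hypothetical, whereas the paper works from connectivity measured in field trials.

```python
# Complementary sketch: deciding from measured link quality whether the deployed
# network is fully connected. The packet-delivery-ratio threshold and link table
# are hypothetical; the paper reports connectivity measured in field trials.
PDR_THRESHOLD = 0.8  # assumed minimum packet-delivery ratio for a usable link

def is_connected(num_nodes, link_pdr):
    """link_pdr maps (i, j) node pairs to a measured packet-delivery ratio."""
    adj = {i: set() for i in range(num_nodes)}
    for (i, j), pdr in link_pdr.items():
        if pdr >= PDR_THRESHOLD:
            adj[i].add(j)
            adj[j].add(i)
    seen, stack = {0}, [0]
    while stack:
        for n in adj[stack.pop()]:
            if n not in seen:
                seen.add(n)
                stack.append(n)
    return len(seen) == num_nodes

print(is_connected(3, {(0, 1): 0.95, (1, 2): 0.4}))  # False: node 2 is unreachable
```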
Intelligent Robots and Systems | 2011
Stefan Hrabar
We present a goal-directed 3D reactive obstacle avoidance algorithm specifically designed for Rotorcraft Unmanned Aerial Vehicles (RUAVs) that fly point-to-point trajectories. The algorithm detects potential collisions within a cylindrical Safety Volume projected ahead of the UAV in a 3D occupancy map representation of the environment. An expanding elliptical search is performed to find an Escape Point: a waypoint that offers a collision-free route past obstacles and towards a goal waypoint. An efficient occupied-voxel checking technique is employed which approximates the Safety Volume by a series of spheres and uses an approximate nearest-neighbour search in a Bkd-tree representation of the occupied voxels. Tests show the algorithm can typically find an Escape Point in under 100 ms using onboard UAV processing in a cluttered environment with 20,000 occupied voxels. Successful collision avoidance results are presented from simulation experiments and from flights with an autonomous helicopter equipped with stereo and laser range sensors.
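The sphere-based safety-volume check can be sketched as follows. SciPy's cKDTree stands in for the Bkd-tree and approximate nearest-neighbour search used in the paper, and the safety radius, sphere spacing and synthetic voxel wall are assumed values for illustration.

```python
# Sketch of the sphere-approximation collision check. SciPy's cKDTree stands in
# for the Bkd-tree and approximate nearest-neighbour search used in the paper,
# and the safety radius and sphere spacing are assumed values.
import numpy as np
from scipy.spatial import cKDTree

SAFETY_RADIUS = 1.5   # assumed radius of the cylindrical Safety Volume (m)
SPHERE_SPACING = 1.0  # assumed spacing of the approximating spheres (m)

def segment_clear(kdtree, start, end):
    """Approximate the cylinder from start to end by spheres and test each one."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    n = max(2, int(np.linalg.norm(end - start) / SPHERE_SPACING) + 1)
    centres = start + np.outer(np.linspace(0.0, 1.0, n), end - start)
    dists, _ = kdtree.query(centres, k=1)  # nearest occupied voxel per sphere
    return bool(np.all(dists > SAFETY_RADIUS))

# Occupied-voxel centres from the map (here a small synthetic wall at x = 5 m).
occupied = np.array([[5.0, y, z] for y in np.arange(-2, 2, 0.5)
                     for z in np.arange(0, 3, 0.5)])
tree = cKDTree(occupied)
print(segment_clear(tree, [0, 0, 1], [10, 0, 1]))  # False: route passes through the wall
print(segment_clear(tree, [0, 5, 1], [10, 5, 1]))  # True: the offset route is clear
```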
Intelligent Robots and Systems | 2006
Stefan Hrabar; Gaurav S. Sukhatme
We present analytical and empirical investigations into the optimum camera angle to use for the optic flow-based centering response. This technique is commonly used to guide both ground-based and aerial robots between obstacles. A variety of camera angles have been implemented by researchers in the past, but surprisingly little mention is made of the motivation for these camera angle choices, nor has an investigation into the optimum camera angle been conducted. Our investigation shows that camera angle plays a key role in the performance of control strategies for the centering response, and both empirical and analytical investigations show the optimum camera angle to be 45 degrees when traveling between parallel obstacles.
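For context, the underlying pure-translation flow geometry can be written down directly: a camera angled theta away from the direction of travel views a point on a parallel wall at range D/sin(theta), giving a flow magnitude of v*sin(theta)^2/D. The short sketch below only tabulates that standard model; the 45-degree result itself comes from the paper's own analysis and experiments.

```python
# Tabulating the standard pure-translation flow model: a camera angled theta
# from the direction of travel views a point on a parallel wall at range
# D / sin(theta), so the flow magnitude is v * sin(theta)**2 / D. This only
# illustrates the underlying geometry; the 45-degree optimum is the result of
# the paper's own analysis and experiments.
import math

def wall_flow(speed, wall_dist, cam_angle_deg):
    """Optic flow (rad/s) from a wall parallel to the motion, pure translation."""
    theta = math.radians(cam_angle_deg)
    return speed * math.sin(theta) ** 2 / wall_dist

for angle in (15, 30, 45, 60, 90):
    print(angle, round(wall_flow(2.0, 5.0, angle), 4))  # flow grows toward the side view
```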
Intelligent Robots and Systems | 2014
Inkyu Sa; Stefan Hrabar; Peter Corke
We present an approach for the inspection of vertical pole-like infrastructure using a vertical take-off and landing (VTOL) unmanned aerial vehicle and shared autonomy. Inspecting vertical structures such as light and power distribution poles is a time-consuming, dangerous and expensive task with high operator workload. To address these issues, we propose a VTOL platform that can operate at close quarters whilst maintaining a safe stand-off distance and rejecting environmental disturbances. We adopt an Image-Based Visual Servoing (IBVS) technique using only two line features to stabilise the vehicle with respect to a pole. Visual, inertial and sonar data are used, making the approach suitable for indoor or GPS-denied environments. Results from simulation and outdoor flight experiments demonstrate the system is able to successfully inspect and circumnavigate a pole.
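A much-reduced sketch of servoing on the pole's two image edges is given below. The paper formulates a proper IBVS law on line features; here the edges are collapsed to a midpoint and an apparent width, and the gains and desired width are assumed values.

```python
# Much-reduced sketch of servoing on the pole's two image edges. The paper
# formulates a proper IBVS law on line features; here the edges are collapsed
# to a midpoint and an apparent width, and the gains are assumed values.
K_LATERAL = 0.6   # assumed gain on the horizontal offset of the pole in the image
K_STANDOFF = 0.4  # assumed gain on the apparent-width (stand-off) error

def pole_servo(x_left, x_right, desired_width):
    """x_left / x_right: normalised image coordinates of the pole edges in [-1, 1]."""
    midpoint = 0.5 * (x_left + x_right)   # nonzero means the pole is off image centre
    width = x_right - x_left              # grows as the vehicle closes on the pole
    v_lateral = -K_LATERAL * midpoint                 # recentre the pole in the image
    v_forward = K_STANDOFF * (desired_width - width)  # hold the stand-off distance
    return v_lateral, v_forward

print(pole_servo(-0.1, 0.3, desired_width=0.3))  # off-centre and too close: corrective command
```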
Digital Image Computing: Techniques and Applications | 2008
Paul Westall; Jason J. Ford; Peter O'Shea; Stefan Hrabar
Searching for humans lost in vast stretches of ocean has always been a difficult task. In this paper, a range of machine vision approaches are investigated as candidate tools to mitigate the risk of human fatigue and complacency after long hours performing this kind of search task. Our two-phase approach uses point-target detection followed by temporal tracking of these targets. Four different point-target detection techniques and two tracking techniques are evaluated, and we also evaluate the use of different colour spaces for target detection. The paper has a particular focus on Hidden Markov Model-based tracking techniques, which appear best able to incorporate a priori knowledge about the maritime search problem to improve detection performance.
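A toy version of the two-phase pipeline is sketched below: a simple local-contrast detector and a persistence counter stand in for the paper's point-target detectors and Hidden Markov Model tracker, and the thresholds are assumptions.

```python
# Toy version of the two-phase pipeline: per-frame point-target detection
# followed by temporal filtering. A local-contrast detector and a persistence
# counter stand in for the paper's detectors and Hidden Markov Model tracker;
# the thresholds below are assumptions.
import numpy as np

DETECT_SIGMA = 4.0  # assumed contrast threshold, in standard deviations
PERSISTENCE = 3     # assumed number of consecutive hits to confirm a target

def detect_points(frame):
    """Boolean mask of pixels that stand out from the surrounding sea clutter."""
    return frame > frame.mean() + DETECT_SIGMA * (frame.std() + 1e-9)

def confirm_tracks(frames):
    """Confirm pixels detected in PERSISTENCE consecutive frames (static target)."""
    hits = np.zeros(frames[0].shape, dtype=int)
    confirmed = np.zeros(frames[0].shape, dtype=bool)
    for frame in frames:
        hits = np.where(detect_points(frame), hits + 1, 0)
        confirmed |= hits >= PERSISTENCE
    return confirmed

# Synthetic sequence: Gaussian clutter with one bright, persistent pixel.
rng = np.random.default_rng(0)
seq = [rng.normal(0.0, 1.0, (64, 64)) for _ in range(5)]
for f in seq:
    f[10, 20] += 30.0
print(np.argwhere(confirm_tracks(seq)))  # -> [[10 20]]
```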
Journal of Field Robotics | 2012
Stefan Hrabar
We present an evaluation of stereo vision and laser-based range sensing for rotorcraft unmanned aerial vehicle (RUAV) obstacle avoidance. Our focus is on sensors that are suitable for mini-RUAV class vehicles in terms of weight and power consumption. The study is limited to the avoidance of large static obstacles such as trees. We compare two commercially available devices that are representative of the state of the art in two-dimensional scanning laser and stereo-based sensing. Stereo is evaluated with three different focal length lenses to assess the tradeoff between range resolution and field of view (FOV). The devices are evaluated in the context of obstacle avoidance through extensive flight trials with an RUAV. We discuss the merits and limitations of each sensor type, including sensing range, FOV, accuracy, and susceptibility to lighting conditions. We show that the stereo device fitted with 8-mm lenses has a better sensing range and vertical FOV than the laser device; however, it relies on careful calibration and is affected by high-contrast outdoor lighting conditions. The laser has a wider horizontal FOV and is more reliable at detecting obstacles that are within a 20-m range. Overall the laser produced superior obstacle avoidance performance, with a success rate of 84% compared to 42% for 8-mm stereo.
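The range-resolution versus field-of-view tradeoff discussed here follows from the standard stereo relation dZ = Z^2 * d_disparity / (f * B). The sketch below evaluates it for a few focal lengths; the baseline, pixel pitch and disparity-error values are assumptions for illustration, not the parameters of the devices tested in the paper.

```python
# The range-resolution versus field-of-view tradeoff follows from the standard
# stereo relation dZ = Z**2 * d_disparity / (f * B). The baseline, pixel pitch
# and disparity error below are assumptions for illustration, not the
# parameters of the devices evaluated in the paper.
PIXEL_PITCH_MM = 0.0074  # assumed pixel size
BASELINE_M = 0.24        # assumed stereo baseline
DISPARITY_ERR_PX = 0.25  # assumed sub-pixel matching error

def range_error_m(z_m, focal_mm):
    """Approximate depth uncertainty at range z_m for a lens of focal_mm."""
    focal_px = focal_mm / PIXEL_PITCH_MM
    return z_m ** 2 * DISPARITY_ERR_PX / (focal_px * BASELINE_M)

for focal in (4.0, 8.0, 12.0):  # shorter lens: wider FOV but coarser range resolution
    print(focal, [round(range_error_m(z, focal), 2) for z in (10.0, 20.0, 30.0)])
```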