Andrew Dankers
Australian National University
Publication
Featured research published by Andrew Dankers.
Machine Vision and Applications | 2004
Andrew Dankers; Alexander Zelinsky
We report on the development of a multi-purpose active visual sensor system for real-world application. The Cable-Drive Active-Vision Robot, CeDAR, has been designed for use on a diverse range of platforms to perform a diverse range of tasks. The novel, biologically inspired design has evolved from a systems-based approach. The mechanism is compact and lightweight and is capable of motions that exceed human visual performance and earlier mechanical designs. The control system complements the mechanical design to implement the basic visual behaviours of fixation, smooth pursuit and saccade, with stability during high-speed motions, high precision and repeatability. Real-time algorithms have been developed that process stereo colour images, resulting in a suite of basic visual competencies. We have developed a scheme to fuse the results of the visual algorithms into robust task-oriented behaviours by adopting a statistical framework. CeDAR has been successfully used for experiments in autonomous vehicle guidance, object tracking and visual sensing for mobile robot experiments.
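The abstract mentions fusing the outputs of several visual algorithms within a statistical framework but does not specify the scheme. As an illustration only (the function, weights and cue maps below are hypothetical, not from the paper), one common statistical form is log-linear fusion of per-pixel cue confidences, with the fixation target taken at the peak of the fused map:

```python
import numpy as np

def fuse_cues(cue_maps, weights):
    """Fuse per-pixel confidence maps from independent visual cues
    (e.g. colour, motion, disparity) into one activation map.

    Log-linear (naive-Bayes-style) fusion: treating cues as
    independent, their confidences multiply, so their log-confidences
    add, scaled by per-cue weights.
    """
    fused = np.zeros_like(cue_maps[0], dtype=float)
    for cue, w in zip(cue_maps, weights):
        fused += w * np.log(np.clip(cue, 1e-6, 1.0))
    # Shift for numerical stability, then normalise to a distribution.
    fused = np.exp(fused - fused.max())
    return fused / fused.sum()

# Toy 2x2 maps: both cues agree that pixel (0, 1) is salient.
colour = np.array([[0.1, 0.9], [0.2, 0.3]])
motion = np.array([[0.2, 0.8], [0.1, 0.1]])
fused = fuse_cues([colour, motion], weights=[1.0, 1.0])
target = np.unravel_index(np.argmax(fused), fused.shape)
```

Here `target` lands on the pixel where the weighted evidence from all cues peaks, which is where a fixation behaviour would direct the cameras.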
Computer Vision and Image Understanding | 2007
Andrew Dankers; Nick Barnes; Alexander Zelinsky
A maximum a posteriori probability zero disparity filter (MAP ZDF) ensures coordinated stereo fixation upon an arbitrarily moving, rotating, re-configuring hand, performing marker-less pixel-wise segmentation of the hand. Active stereo fixation permits real-time foveal hand tracking and segmentation over a large visual workspace, allowing investigation of unrestricted natural human gesturing. Hand segmentation is shown to be robust to lighting conditions, defocus, hand colour variation, foreground and background clutter including non-tracked hands, and partial or gross occlusions including those due to non-tracked hands. The system operates at approximately 27 fps on a 3 GHz single-processor PC.
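The core cue behind a zero disparity filter can be sketched simply: when verged cameras fixate a target, the target projects to the same coordinates in both foveal images, so pixels whose left/right intensities agree are likely on the fixated object. The sketch below shows only that photometric cue; the published MAP ZDF wraps it in a probabilistic model with spatial support, and the threshold and images here are illustrative assumptions:

```python
import numpy as np

def zero_disparity_mask(left, right, tau=0.1):
    """Crude zero-disparity cue: mark pixels whose left and right
    foveal intensities agree within tau. With the cameras verged on a
    target, such pixels tend to lie on the target surface, while
    background at other depths is shifted between the two views and
    disagrees.
    """
    return np.abs(left.astype(float) - right.astype(float)) < tau

# Toy foveae: columns 0-1 belong to the fixated object (identical in
# both views); column 2 is background at a different depth.
left  = np.array([[0.5, 0.5, 0.9],
                  [0.5, 0.5, 0.2]])
right = np.array([[0.5, 0.5, 0.1],
                  [0.5, 0.5, 0.8]])
mask = zero_disparity_mask(left, right)
```

On real images this raw cue is noisy, which is why the paper's formulation adds a MAP framework rather than thresholding alone.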
Intelligent Vehicles Symposium | 2005
Andrew Dankers; Nick Barnes; Alexander Zelinsky
We present a mapping approach to road scene awareness based on active stereo vision. We generalise traditional static multi-camera rectification techniques to enable active epipolar rectification with a mosaic representation of the output. The approach is used to apply standard static depth mapping and optical flow techniques to the active case. We use the framework to extract the ground plane and segment moving objects in dynamic scenes using arbitrarily moving cameras on a moving vehicle. The approach enables an estimation of the velocity of the vehicle relative to the road, and the velocity of objects in the scene. We provide footage of preliminary results of the system operating in real-time, including dynamic object extraction and tracking, ground plane extraction, and recovery of vehicle velocity.
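Ground-plane extraction from a rectified depth/disparity map, as described above, can be illustrated with a planar disparity model: for a camera looking along a flat road, ground disparity varies approximately linearly with image row. The sketch below assumes the plane parameters are already known (in practice they would be estimated online, e.g. from a v-disparity histogram); all names and values are illustrative, not the paper's implementation:

```python
import numpy as np

def ground_plane_mask(disparity, a, b, tau=1.0):
    """Label pixels consistent with a planar ground surface.

    Ground disparity follows d(v) = a*v + b for image row v; pixels
    departing from this model by more than tau are candidate obstacles
    or moving objects to be segmented and tracked.
    """
    rows = np.arange(disparity.shape[0])[:, None]
    expected = a * rows + b          # broadcast over columns
    return np.abs(disparity - expected) <= tau

# Toy 4x3 scene: ground disparity equals the row index (a=1, b=0),
# with an obstacle of constant disparity 6 occupying column 2.
disp = np.tile(np.arange(4.0)[:, None], (1, 3))
disp[1:3, 2] = 6.0
mask = ground_plane_mask(disp, a=1.0, b=0.0, tau=0.5)
```

Pixels where `mask` is False are exactly those inconsistent with the road plane, the starting point for the dynamic-object segmentation the abstract describes.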
International Symposium on Experimental Robotics | 2009
Andrew Dankers; Nick Barnes; Walter F. Bischof; Alexander Zelinsky
Perception in the visual cortex and dorsal stream of the primate brain includes important visual competencies, such as a consistent representation of visual space despite eye movement; egocentric spatial perception; attentional gaze deployment; and coordinated stereo fixation upon dynamic objects. These competencies have emerged through observation of the real world, and constitute a vision system that is optimised, in some sense, for perception and interaction. We present a robotic vision system that incorporates these competencies. We hypothesise that similarities between the underlying robotic system model and that of the primate vision system will elicit accordingly similar gaze behaviours. Psychophysical trials were conducted to record human gaze behaviour when free-viewing a reproducible, dynamic, 3D scene. Identical trials were conducted with the robotic system. A statistical comparison of robotic and human gaze behaviour has shown that the two are remarkably similar. Enabling a humanoid to mimic the optimised gaze strategies of humans may be a significant step towards facilitating human-like perception.
International Conference on Computer Vision Systems | 2003
Andrew Dankers; Alexander Zelinsky
This paper reports on the development of a multi-purpose active visual sensor system for real-world application. The Cable-Drive Active-Vision Robot (CeDAR) has been designed for use on a diverse range of platforms, to perform a diverse range of tasks. The novel, biologically inspired design has evolved from a systems-based approach. The mechanism is compact and lightweight, and is capable of motions that exceed human visual performance and earlier mechanical designs. The control system complements the mechanical design to implement the basic visual behaviours of fixation, smooth pursuit and saccade, with stability during high-speed motions, high precision and repeatability. Real-time vision processing algorithms have been developed that process stereo colour images at 30 Hz, resulting in a suite of basic visual competencies. We have developed a scheme to fuse the results of the visual algorithms into robust task-oriented behaviours by adopting a statistical framework. CeDAR has been successfully used for experiments in autonomous vehicle guidance, object tracking, and visual sensing for mobile robot experiments.
Field and Service Robotics | 2006
Andrew Dankers; Nick Barnes; Alexander Zelinsky
We present a biologically inspired active vision system that incorporates two modes of perception. A peripheral mode provides a broad and coarse perception of where mass is in the scene in the vicinity of the current fixation point, and how that mass is moving. It involves fusion of actively acquired depth data into a 3D occupancy grid. A foveal mode then ensures coordinated stereo fixation upon mass/objects in the scene, and enables extraction of the mass/object using a maximum a posteriori probability zero disparity filter. Foveal processing is limited to the vicinity of the camera optical centres. Results for each mode and both modes operating in parallel are presented. The regime operates at approximately 15 Hz on a 3 GHz single-processor PC.
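Fusing depth observations into a 3D occupancy grid, as in the peripheral mode above, is commonly done with per-cell log-odds updates, which let repeated observations from the moving cameras accumulate additively. The sketch below is a generic occupancy-grid update under that standard scheme; the constants, grid size and cell lists are illustrative assumptions, not values from the paper:

```python
import numpy as np

LOG_ODDS_HIT, LOG_ODDS_MISS = 0.85, -0.4  # illustrative constants

def update_grid(log_odds, occupied_cells, free_cells):
    """Fuse one depth observation into a 3D occupancy grid.

    Each cell stores the log-odds of being occupied. Cells traversed
    by a stereo ray before the measured depth are observed free and
    decremented; the cell at the measured depth is observed occupied
    and incremented. Additive log-odds updates accumulate evidence
    across observations.
    """
    for c in free_cells:
        log_odds[c] += LOG_ODDS_MISS
    for c in occupied_cells:
        log_odds[c] += LOG_ODDS_HIT
    return log_odds

grid = np.zeros((4, 4, 4))
# One ray along x at y=z=0, striking mass at x=2: cells before the
# hit are free, the hit cell is occupied.
grid = update_grid(grid, occupied_cells=[(2, 0, 0)],
                   free_cells=[(0, 0, 0), (1, 0, 0)])
occupied = grid > 0
```

Thresholding the log-odds (here at zero) yields the coarse "where is mass" map that the foveal mode can then fixate and segment.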
Archive | 2007
Andrew Dankers; Nick Barnes; Alexander Zelinsky
International Conference on Robotics and Automation | 2004
Andrew Dankers; Nick Barnes; Alexander Zelinsky
International Conference on Robotics and Automation | 2003
Andrew Dankers; Luke Fletcher; Lars Petersson; Alexander Zelinsky
Nature Precedings | 2007
Andrew Dankers; Nick Barnes; Walter F. Bischof; Alexander Zelinsky; Locked Bag