
Publication


Featured research published by Richard James Donald Moore.


Intelligent Robots and Systems | 2009

A vision-based system for attitude estimation of UAVs

Saul Thurrowgood; Dean Soccol; Richard James Donald Moore; Daniel Bland; Mandyam V. Srinivasan

This paper describes a technique for estimating the attitude of a UAV by monitoring the visual horizon. An algorithm is developed that makes the best use of color and intensity information in an image to determine the position and orientation of the horizon, and infer the aircraft's attitude. The technique is accurate, reliable, and fully capable of real-time operation. Furthermore, it can be incorporated into any existing vision system, irrespective of the way in which the environment is imaged (e.g. through lenses or mirrors).
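The horizon-based attitude recovery described above can be sketched in a few lines. This is a simplified illustration, not the paper's algorithm: it assumes the color/intensity classification has already produced a binary sky mask, fits a straight line to the sky-ground boundary, and reads roll off the line's slope and an approximate pitch off its height in the image (the function name and the small-angle pitch approximation are mine):

```python
import numpy as np

def horizon_attitude(sky_mask, fov_v_deg=90.0):
    """Estimate roll and pitch (degrees) from a binary sky mask (True = sky).

    For each image column, find the row where sky meets ground, fit a
    line to those boundary points, then read roll off the line's slope
    and an approximate pitch off the line's offset from image centre.
    """
    h, w = sky_mask.shape
    cols, rows = [], []
    for x in range(w):
        boundary = int(np.argmin(sky_mask[:, x]))  # first non-sky row
        if 0 < boundary < h:  # skip all-sky / all-ground columns
            cols.append(x)
            rows.append(boundary)
    m, c = np.polyfit(cols, rows, 1)        # row = m * col + c
    roll = np.degrees(np.arctan(m))
    centre_row = m * (w / 2.0) + c          # horizon height at image centre
    pitch = (centre_row - h / 2.0) / h * fov_v_deg  # small-angle approximation
    return roll, pitch
```

For a level horizon at mid-image, both angles come out near zero; a tilted boundary yields a proportionally tilted roll estimate.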


Proceedings of the National Academy of Sciences of the United States of America | 2014

Selective attention in the honeybee optic lobes precedes behavioral choices

Angelique C. Paulk; Jacqueline A. Stacey; Thomas Pearson; Gavin J. Taylor; Richard James Donald Moore; Mandyam V. Srinivasan; Bruno van Swinderen

Significance: Attention, observed in a wide variety of animals from insects to humans, involves selectively attending to behaviorally relevant stimuli while filtering out other stimuli. We designed a paradigm that allowed us to record brain activity in tethered, walking bees selecting virtual visual objects. We found that stimulus-specific brain activity increased when the bees controlled the position of the visual objects, and that activity decreased when bees were not in control. When bees were presented with competing objects, brain activity in the optic lobes preceded behavioral choices; this suggests that in animals with tiny brains, such as bees, attention-like processes are pushed far out into the sensory periphery. This trait is likely important for efficiently navigating complex visual environments.

Attention allows animals to respond selectively to competing stimuli, enabling some stimuli to evoke a behavioral response while others are ignored. How the brain does this remains mysterious, although it is increasingly evident that even animals with the smallest brains display this capacity. For example, insects respond selectively to salient visual stimuli, but it is unknown where such selectivity occurs in the insect brain, or whether neural correlates of attention might predict the visual choices made by an insect. Here, we investigate neural correlates of visual attention in behaving honeybees (Apis mellifera). Using a closed-loop paradigm that allows tethered, walking bees to actively control visual objects in a virtual reality arena, we show that behavioral fixation increases neuronal responses to flickering, frequency-tagged stimuli. Attention-like effects were reduced in the optic lobes during replay of the same visual sequences, when bees were not able to control the visual displays. When bees were presented with competing frequency-tagged visual stimuli, selectivity in the medulla (an optic ganglion) preceded behavioral selection of a stimulus, suggesting that modulation of early visual processing centers precedes eventual behavioral choices made by these insects.
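The frequency-tagging analysis behind such measurements can be illustrated with a toy example: each stimulus flickers at its own tag frequency, and the power of the recorded trace at each tag frequency indexes how strongly that stimulus is represented. The sketch below uses a synthetic trace rather than real recordings; `tag_power` is a hypothetical helper, not the authors' analysis code:

```python
import numpy as np

def tag_power(signal, fs, f_tag):
    """Power of a recorded trace at a single tag frequency (via FFT)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - f_tag))]

# Synthetic 4 s trace at 500 Hz: an "attended" 7 Hz component is four
# times stronger than a competing 11 Hz component.
fs = 500.0
t = np.arange(0, 4, 1 / fs)
trace = 2.0 * np.sin(2 * np.pi * 7 * t) + 0.5 * np.sin(2 * np.pi * 11 * t)

# Selectivity index: power ratio between the two tag frequencies.
selectivity = tag_power(trace, fs, 7.0) / tag_power(trace, fs, 11.0)
```

Because power scales with amplitude squared, the 4:1 amplitude ratio shows up as a 16:1 power ratio between the two tags.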


Intelligent Robots and Systems | 2009

A stereo vision system for UAV guidance

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel, vision-based system for guidance of UAVs. The system uses two cameras, each associated with a specially-shaped reflective surface, to obtain stereo information on the height above ground and the distances to potential obstacles. The camera-mirror system has the advantage that it remaps the world onto a cylindrical co-ordinate system that simplifies and speeds up range computations, and defines a collision-free cylinder through which the aircraft can pass without encountering obstacles. The result is a computationally efficient approach to vision-based aircraft guidance that is particularly suited to terrain and gorge following, obstacle avoidance, and landing. The feasibility of the system is demonstrated in laboratory and field tests.
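The coaxial camera-mirror geometry admits a very simple range calculation, which is roughly why the cylindrical remapping speeds computations up: for two viewpoints separated along a common axis, the radial distance to a point follows directly from the two elevation angles at which it is seen. A minimal sketch under that idealized geometry (function and variable names are illustrative, not from the paper):

```python
import math

def radial_range(alpha1, alpha2, baseline):
    """Radial distance to a point seen from two coaxial viewpoints.

    alpha1, alpha2: elevation angles (radians) of the same point as
    seen from the lower and upper viewpoint; baseline: their axial
    separation (metres). With the point at radial distance r and axial
    offset z, tan(a1) = z / r and tan(a2) = (z - b) / r, so
    r = b / (tan(a1) - tan(a2)).
    """
    disparity = math.tan(alpha1) - math.tan(alpha2)
    return baseline / disparity
```

In this coordinate system the "disparity" is a simple difference of tangents per image column, which is what makes dense range maps cheap to compute.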


Journal of Field Robotics | 2014

A Biologically Inspired, Vision-based Guidance System for Automatic Landing of a Fixed-wing Aircraft

Saul Thurrowgood; Richard James Donald Moore; Dean Soccol; Michael Knight; Mandyam V. Srinivasan

We describe a guidance system for achieving automatic landing of a fixed-wing aircraft in unstructured outdoor terrain, using onboard video cameras. The system uses optic flow information for sensing and controlling the height above the ground, and information on the horizon profile, also acquired by the vision system, for stabilizing roll and controlling pitch and, if required, for the control and stabilization of yaw and flight direction. At low heights, when optic flow is unreliable, stereo information is used to guide the descent close to touchdown. While rate gyro information is used to augment attitude stabilization in one of the designs, this is not mandatory and can be replaced by visual information. Smooth, safe landings are achieved with a success rate of 92.5%. The system does not emit active radiation and does not rely on any external information such as a global positioning system or an instrument landing system.
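One way to see how optic flow can control height during landing: the ventral flow of the ground is forward speed divided by height (w = v/h), so a controller that holds flow constant makes speed track height, and both decay smoothly toward touchdown. The toy simulation below illustrates that relationship on a fixed glide slope; it is an idealized sketch of the constant-flow idea, not the paper's control law:

```python
import math

def constant_flow_descent(h0, w_set, glide_deg=5.0, dt=0.1, steps=200):
    """Simulate a descent that regulates ventral optic flow.

    The observed flow of the ground below is w = v / h; commanding
    forward speed v = w_set * h holds the flow at w_set, so on a fixed
    glide slope height and speed decay together toward touchdown.
    Returns the final (height, speed).
    """
    h = h0
    sink_per_speed = math.tan(math.radians(glide_deg))
    for _ in range(steps):
        v = w_set * h                  # speed command that keeps v/h = w_set
        h -= v * sink_per_speed * dt   # sink rate along the glide slope
    return h, w_set * h
```

Because speed is slaved to height, the flow stays at the set-point throughout the descent and the touchdown is asymptotically gentle.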


Journal of Neuroscience Methods | 2014

FicTrac: a visual method for tracking spherical motion and generating fictive animal paths.

Richard James Donald Moore; Gavin J. Taylor; Angelique C. Paulk; Thomas Pearson; Bruno van Swinderen; Mandyam V. Srinivasan

Studying how animals interface with a virtual reality can further our understanding of how attention, learning and memory, sensory processing, and navigation are handled by the brain, at both the neurophysiological and behavioural levels. To this end, we have developed a novel vision-based tracking system, FicTrac (Fictive path Tracking software), for estimating the path an animal makes whilst rotating an air-supported sphere using only input from a standard camera and computer vision techniques. We have found that the accuracy and robustness of FicTrac outperforms a low-cost implementation of a standard optical mouse-based approach for generating fictive paths. FicTrac is simple to implement for a wide variety of experimental configurations and, importantly, is fast to execute, enabling real-time sensory feedback for behaving animals. We have used FicTrac to record the behaviour of tethered honeybees, Apis mellifera, whilst presenting visual stimuli in both open-loop and closed-loop experimental paradigms. We found that FicTrac could accurately register the fictive paths of bees as they walked towards bright green vertical bars presented on an LED arena. Using FicTrac, we have demonstrated closed-loop visual fixation in both the honeybee and the fruit fly, Drosophila melanogaster, establishing the flexibility of this system. FicTrac provides the experimenter with a simple yet adaptable system that can be combined with electrophysiological recording techniques to study the neural mechanisms of behaviour in a variety of organisms, including walking vertebrates.
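Once per-frame ball rotations have been estimated from the camera, generating the fictive path is a matter of integrating them in the animal's heading frame. A minimal sketch of that integration step (the delta convention and function name are assumptions for illustration, not FicTrac's actual interface):

```python
import math

def integrate_path(deltas, radius=1.0):
    """Integrate per-frame ball rotations into a fictive 2D path.

    Each delta is (d_forward, d_side, d_heading) in radians of ball
    rotation; arc length = angle * ball radius. Forward/side steps are
    rotated into world coordinates by the current heading.
    """
    x, y, heading = 0.0, 0.0, 0.0
    path = [(x, y)]
    for d_fwd, d_side, d_head in deltas:
        heading += d_head
        step_f = d_fwd * radius
        step_s = d_side * radius
        x += step_f * math.cos(heading) - step_s * math.sin(heading)
        y += step_f * math.sin(heading) + step_s * math.cos(heading)
        path.append((x, y))
    return path
```

Ten small forward rotations trace a straight line; a quarter-turn followed by a forward step walks off at a right angle, which is the behaviour a fictive-path reconstruction needs to capture.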


Intelligent Robots and Systems | 2011

A fast and adaptive method for estimating UAV attitude from the visual horizon

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel method for automatically obtaining the attitude of an aircraft from the visual horizon. A wide-angle view of the environment, including the visual horizon, is captured and the input images are classified into fuzzy sky and ground regions using the spectral and intensity properties of the pixels. The classifier is updated continuously using an online reinforcement strategy and is therefore able to adapt to the changing appearance of the sky and ground, without requiring prior training offline. A novel approach to obtaining the attitude of the aircraft from the classified images is described, which is reliable, accurate, and computationally efficient to implement. This method is therefore suited to real-time operation and we present results from flight tests that demonstrate the ability of this vision-based approach to outperform an inexpensive inertial system.
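The online reinforcement idea, continuously re-estimating the appearance of sky and ground so the classifier tracks changing conditions without offline training, can be sketched with running class means and exponential forgetting. This is a deliberately simplified stand-in for the paper's spectral/intensity classifier; the class structure and update rule are mine:

```python
import numpy as np

class AdaptiveSkyGroundClassifier:
    """Online sky/ground pixel classifier with exponential forgetting.

    Each class keeps a running mean colour; pixels are labelled by the
    nearer mean, and the means are then nudged toward the newly
    labelled pixels so the model adapts to gradual appearance changes.
    """

    def __init__(self, sky_init, ground_init, rate=0.05):
        self.sky = np.asarray(sky_init, float)
        self.ground = np.asarray(ground_init, float)
        self.rate = rate  # forgetting factor: higher adapts faster

    def classify(self, pixels):
        pixels = np.asarray(pixels, float)
        d_sky = np.linalg.norm(pixels - self.sky, axis=-1)
        d_gnd = np.linalg.norm(pixels - self.ground, axis=-1)
        return d_sky < d_gnd  # True = sky

    def update(self, pixels):
        pixels = np.asarray(pixels, float)
        labels = self.classify(pixels)
        r = self.rate
        if labels.any():
            self.sky = (1 - r) * self.sky + r * pixels[labels].mean(axis=0)
        if (~labels).any():
            self.ground = (1 - r) * self.ground + r * pixels[~labels].mean(axis=0)
        return labels
```

The forgetting factor trades stability against adaptation speed; too high and a cloud bank can capture the ground model, too low and the classifier lags behind lighting changes.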


International Conference on Robotics and Automation | 2010

UAV altitude and attitude stabilisation using a coaxial stereo vision system

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel, vision-based system for guidance of UAVs. The system uses two coaxially aligned cameras, each associated with a specially-shaped reflective surface, to obtain stereo information on the height above ground and the distances to potential obstacles. The camera-mirror system has the advantage that it remaps the world onto a cylindrical co-ordinate system that simplifies and speeds up range computations, and defines a collision-free cylinder through which the aircraft can pass without encountering obstacles. We describe an approach, using this vision system, in which the attitude and altitude of an aircraft can be controlled directly, making the system particularly suited to terrain following, obstacle avoidance, and landing. The autonomous guidance of an aircraft performing a terrain following task using the system is demonstrated in field tests.
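With height above ground available from the stereo system, terrain following reduces, in the simplest view, to closing a loop on the clearance error. The toy proportional pitch controller below is purely illustrative, not the paper's control scheme; the gain and pitch limit are arbitrary:

```python
def terrain_follow_pitch(height_est, height_set, gain=0.8, limit=15.0):
    """Proportional pitch command (degrees) for terrain following.

    Climb when the stereo height estimate falls below the desired
    clearance, descend when above it, clamped to a safe pitch envelope.
    """
    cmd = gain * (height_set - height_est)
    return max(-limit, min(limit, cmd))
```

Saturating the command keeps a large height error (e.g. at the edge of a gorge) from demanding an unsafe pitch attitude.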


Intelligent Robots and Systems | 2012

Vision-only estimation of wind field strength and direction from an aerial platform

Richard James Donald Moore; Saul Thurrowgood; Mandyam V. Srinivasan

This study describes a novel method for estimating the strength and direction of the local wind field from a mobile airborne platform. An iterative optimisation is derived that allows the properties of the wind field to be determined from successive measurements of the heading direction and ground track of the aircraft only. We have previously described methods for estimating these parameters using a single vision system. This approach therefore constitutes a purely visual method for estimating the properties of the local wind field. We present results from simulated and real-world flight tests that demonstrate the accuracy and robustness of the proposed method and its practicality in uncontrolled environmental conditions. These properties and the simplicity of the implementation should make this approach useful as an alternative means for estimating the properties of the local wind field from small-scale, fixed-wing UAVs.
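The paper derives an iterative optimisation, but the underlying relation is easy to see: ground velocity is air-relative velocity plus wind, so under the added assumption of a constant (unknown) airspeed the model v_g = Va·[cos ψ, sin ψ] + w is linear in (Va, wx, wy) and can be solved in one shot by least squares, provided the headings vary (e.g. during turns). A sketch of that simplified variant, with illustrative names:

```python
import numpy as np

def estimate_wind(headings, ground_vels):
    """Least-squares wind estimate from headings and ground velocities.

    headings: heading angles psi_i (radians); ground_vels: measured
    ground-velocity vectors (n, 2). Model, assuming constant airspeed:
        v_ground_i = Va * [cos(psi_i), sin(psi_i)] + [wx, wy]
    which is linear in the unknowns (Va, wx, wy). Returns (Va, wind).
    """
    psi = np.asarray(headings, float)
    vg = np.asarray(ground_vels, float)
    n = len(psi)
    A = np.zeros((2 * n, 3))
    A[0::2, 0] = np.cos(psi)  # x-equations: Va*cos(psi) + wx
    A[0::2, 1] = 1.0
    A[1::2, 0] = np.sin(psi)  # y-equations: Va*sin(psi) + wy
    A[1::2, 2] = 1.0
    x, *_ = np.linalg.lstsq(A, vg.reshape(-1), rcond=None)
    return x[0], x[1:]
```

If all headings are equal, the system is rank-deficient and airspeed cannot be separated from wind, which is why heading variation is essential for observability.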


Archive | 2012

From biology to engineering: Insect vision and applications to robotics

Mandyam V. Srinivasan; Richard James Donald Moore; Saul Thurrowgood; Dean Soccol; Daniel Bland

The past two decades have witnessed a growing interest not only in understanding sensory biology, but also in applying the principles gleaned from these studies to the design of new, biologically inspired sensors for a variety of engineering applications. This chapter provides a brief account of this interdisciplinary endeavour in the field of insect vision and flight guidance. Despite their diminutive eyes and brains, flying insects display superb agility and remarkable navigational competence. This review describes our current understanding of how insects use vision to stabilize flight, avoid collisions with objects, regulate flight speed, navigate to a distant food source, and orchestrate smooth landings. It also illustrates how some of these insights from biology are being used to develop novel algorithms for the guidance of terrestrial and airborne vehicles. We use this opportunity to also highlight some of the outstanding questions in this particular area of sensing and control.


International Conference on Robotics and Automation | 2010

UAV attitude control using the visual horizon

Saul Thurrowgood; Richard James Donald Moore; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

Collaboration

Top co-authors of Richard James Donald Moore:

Dean Soccol, University of Queensland
Daniel Bland, University of Queensland
Thomas Pearson, University of Queensland
David Ball, Peter MacCallum Cancer Centre