Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Saul Thurrowgood is active.

Publication


Featured research published by Saul Thurrowgood.


intelligent robots and systems | 2009

A vision based system for attitude estimation of UAVs

Saul Thurrowgood; Dean Soccol; Richard James Donald Moore; Daniel Bland; Mandyam V. Srinivasan

This paper describes a technique for estimating the attitude of a UAV by monitoring the visual horizon. An algorithm is developed that makes the best use of color and intensity information in an image to determine the position and orientation of the horizon, and infer the aircraft's attitude. The technique is accurate, reliable, and fully capable of real-time operation. Furthermore, it can be incorporated into any existing vision system, irrespective of the way in which the environment is imaged (e.g. through lenses or mirrors).
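The geometry behind a horizon-based attitude estimate can be sketched as follows. This is a hypothetical illustration, not the paper's algorithm: it assumes a binary sky/ground mask is already available, finds the sky/ground boundary per image column, and reads roll from the slope of a fitted line and pitch from its vertical offset (sign conventions depend on the camera mounting).

```python
import numpy as np

def attitude_from_horizon(sky_mask, focal_px):
    """Estimate roll and pitch (radians) from a binary sky mask (True = sky).

    Sketch: for each image column, take the lowest sky pixel as the
    horizon boundary, fit a straight line to those boundary points,
    then read roll from the line's slope and pitch from its vertical
    offset relative to the image centre.
    """
    h, w = sky_mask.shape
    cols, rows = [], []
    for x in range(w):
        ys = np.flatnonzero(sky_mask[:, x])
        if ys.size:                       # column contains some sky
            cols.append(x)
            rows.append(ys.max())         # lowest sky pixel = horizon boundary
    slope, intercept = np.polyfit(np.asarray(cols, float),
                                  np.asarray(rows, float), 1)
    roll = -np.arctan(slope)              # bank angle from horizon slope
    y_centre = slope * (w / 2) + intercept            # horizon row at image centre
    pitch = np.arctan((y_centre - h / 2) / focal_px)  # offset below centre
    return roll, pitch
```

With a level, flat horizon the fitted slope is zero (zero roll), and pitch follows directly from how far the horizon sits above or below the optical axis.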


intelligent robots and systems | 2009

A stereo vision system for UAV guidance

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel, vision-based system for guidance of UAVs. The system uses two cameras, each associated with a specially-shaped reflective surface, to obtain stereo information on the height above ground and the distances to potential obstacles. The camera-mirror system has the advantage that it remaps the world onto a cylindrical co-ordinate system that simplifies and speeds up range computations, and defines a collision-free cylinder through which the aircraft can pass without encountering obstacles. The result is a computationally efficient approach to vision-based aircraft guidance that is particularly suited to terrain and gorge following, obstacle avoidance, and landing. The feasibility of the system is demonstrated in laboratory and field tests.
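The appeal of the cylindrical remapping can be illustrated with a toy sketch (hypothetical; the function and data layout are assumptions, not the paper's implementation): once stereo ranges are expressed radially about the flight axis, the largest collision-free cylinder is simply the minimum measured range, and height above ground is the range along the downward-looking ray.

```python
import numpy as np

def guidance_from_cylindrical_ranges(ranges, azimuths):
    """ranges: radial distances about the flight axis (metres), one per ray;
    azimuths: ray angles in radians, with 0 = straight down.

    In the cylindrical remap, the largest obstacle-free cylinder the
    aircraft can fly through has radius equal to the smallest measured
    range, and height above ground is the range of the downward ray.
    """
    clear_radius = float(np.min(ranges))
    down = int(np.argmin(np.abs(azimuths)))   # ray closest to straight down
    height_agl = float(ranges[down])
    return clear_radius, height_agl
```

No per-frame trigonometry or 3-D reconstruction is needed for these two quantities, which is the computational saving the remapping buys.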


advanced video and signal based surveillance | 2006

An Optical System for Guidance of Terrain Following in UAVs

Mandyam V. Srinivasan; Saul Thurrowgood; Dean Soccol

There is considerable interest in designing guidance systems for UAVs that use passive sensing (such as vision), rather than active sensing which can be bulky, expensive and stealth-compromising. Here we describe an optical sensor, based partly on principles of insect vision and optic flow analysis, for measurement and control of height above the ground. A video camera is used in conjunction with a specially shaped reflective surface to simplify the computation of optic flow, and extend the range of aircraft speeds over which accurate data can be obtained. The imaging system also provides a useful geometrical remapping of the environment, which facilitates obstacle avoidance and computation of 3-D terrain maps.
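The core optic-flow relation used for height control can be stated in one line. The sketch below assumes a derotated, downward-looking flow measurement over flat ground (the function name and interface are illustrative, not from the paper): the translational flow rate is ω = v / h, so height follows once ground speed is known.

```python
def height_from_optic_flow(flow_rate_rad_s, ground_speed_m_s):
    """For a camera looking straight down at flat ground, the translational
    optic flow rate is omega = v / h, so h = v / omega.
    Assumes rotation-induced flow has already been removed."""
    return ground_speed_m_s / flow_rate_rad_s
```

Equivalently, holding the measured flow rate constant regulates height in proportion to speed, which is the insect-inspired control strategy: no explicit height estimate is required for terrain following.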


Journal of Field Robotics | 2014

A Biologically Inspired, Vision-based Guidance System for Automatic Landing of a Fixed-wing Aircraft

Saul Thurrowgood; Richard James Donald Moore; Dean Soccol; Michael Knight; Mandyam V. Srinivasan

We describe a guidance system for achieving automatic landing of a fixed-wing aircraft in unstructured outdoor terrain, using onboard video cameras. The system uses optic flow information for sensing and controlling the height above the ground, and information on the horizon profile, also acquired by the vision system for stabilizing roll and controlling pitch, and additionally, if required, for the control and stabilization of yaw and flight direction. At low heights, when optic flow is unreliable, stereo information is used to guide descent close to touchdown. While rate gyro information is used to augment attitude stabilization in one of the designs, this is not mandatory and it can be replaced by visual information. Smooth, safe landings are achieved with a success rate of 92.5%. The system does not emit active radiation and does not rely on any external information such as a global positioning system or an instrument landing system.


IEEE Robotics & Automation Magazine | 2009

Competent vision and navigation systems

Mandyam V. Srinivasan; Saul Thurrowgood; Dean Soccol

In this article, we describe how flying insects use vision for guidance, especially in the contexts of regulating flight speed, negotiating narrow gaps, avoiding obstacles, and performing smooth landings. We show that many of these maneuvers, which were traditionally believed to involve relatively complex and high-level perception, can be achieved through the use of low-level cues and relatively simple computation. We also describe tests of the effectiveness of some of these strategies for autonomous guidance of small-scale terrestrial and aerial vehicles in the contexts of corridor navigation, altitude control, and terrain following and landing. We also describe a novel, mirror- based imaging system that is tailored for these tasks and facilitates the requisite visual computations.


intelligent robots and systems | 2011

A fast and adaptive method for estimating UAV attitude from the visual horizon

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel method for automatically obtaining the attitude of an aircraft from the visual horizon. A wide-angle view of the environment, including the visual horizon, is captured and the input images are classified into fuzzy sky and ground regions using the spectral and intensity properties of the pixels. The classifier is updated continuously using an online reinforcement strategy and is therefore able to adapt to the changing appearance of the sky and ground, without requiring prior training offline. A novel approach to obtaining the attitude of the aircraft from the classified images is described, which is reliable, accurate, and computationally efficient to implement. This method is therefore suited to real-time operation and we present results from flight tests that demonstrate the ability of this vision-based approach to outperform an inexpensive inertial system.
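The online-adaptation idea can be sketched with a deliberately minimal classifier (an assumption-laden illustration, not the paper's fuzzy classifier): keep one running mean colour per class, label each pixel by its nearest mean, then nudge the winning mean toward its members so the model tracks the changing appearance of sky and ground without offline training.

```python
import numpy as np

class AdaptiveSkyGroundClassifier:
    """Minimal sketch of an online sky/ground classifier: nearest-mean
    labelling plus exponential updates of the class means."""

    def __init__(self, sky_init, ground_init, rate=0.05):
        self.sky = np.asarray(sky_init, dtype=float)      # running sky colour
        self.ground = np.asarray(ground_init, dtype=float)
        self.rate = rate                                  # update rate

    def classify(self, pixels):
        """pixels: (N, C) array; True where a pixel is nearer the sky mean."""
        d_sky = np.linalg.norm(pixels - self.sky, axis=1)
        d_gnd = np.linalg.norm(pixels - self.ground, axis=1)
        return d_sky < d_gnd

    def update(self, pixels):
        """Classify, then move each class mean toward its members."""
        is_sky = self.classify(pixels)
        if is_sky.any():
            self.sky += self.rate * (pixels[is_sky].mean(axis=0) - self.sky)
        if (~is_sky).any():
            self.ground += self.rate * (pixels[~is_sky].mean(axis=0) - self.ground)
        return is_sky
```

Because every frame both classifies and refines the means, gradual lighting changes are absorbed continuously, which is the property the paper exploits to avoid prior offline training.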


international conference on robotics and automation | 2010

UAV altitude and attitude stabilisation using a coaxial stereo vision system

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel, vision-based system for guidance of UAVs. The system uses two coaxially aligned cameras, each associated with a specially-shaped reflective surface, to obtain stereo information on the height above ground and the distances to potential obstacles. The camera-mirror system has the advantage that it remaps the world onto a cylindrical co-ordinate system that simplifies and speeds up range computations, and defines a collision-free cylinder through which the aircraft can pass without encountering obstacles. We describe an approach, using this vision system, in which the attitude and altitude of an aircraft can be controlled directly, making the system particularly suited to terrain following, obstacle avoidance, and landing. The autonomous guidance of an aircraft performing a terrain following task using the system is demonstrated in field tests.


intelligent robots and systems | 2012

Vision-only estimation of wind field strength and direction from an aerial platform

Richard James Donald Moore; Saul Thurrowgood; Mandyam V. Srinivasan

This study describes a novel method for estimating the strength and direction of the local wind field from a mobile airborne platform. An iterative optimisation is derived that allows the properties of the wind field to be determined from successive measurements of the heading direction and ground track of the aircraft only. We have previously described methods for estimating these parameters using a single vision system. This approach therefore constitutes a purely visual method for estimating the properties of the local wind field. We present results from simulated and real-world flight tests that demonstrate the accuracy and robustness of the proposed method and its practicality in uncontrolled environmental conditions. These properties and the simplicity of the implementation should make this approach useful as an alternative means for estimating the properties of the local wind field from small-scale, fixed-wing UAVs.
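The underlying wind-triangle constraint can be sketched as a small estimation problem. This is an illustration under stated assumptions, not the paper's iterative optimisation: airspeed is taken as known and constant, so the requirement that ground velocity a·h + w be parallel to the observed track t, cross(a·h + w, t) = 0, becomes linear in the wind vector w and solvable by least squares (samples at several different headings are needed for good conditioning).

```python
import numpy as np

def estimate_wind(headings, tracks, airspeed):
    """Estimate the 2-D wind vector from paired heading and ground-track
    directions (radians) at known, constant airspeed.

    Each sample gives one linear constraint from the wind triangle:
        cross(a*h_i + w, t_i) = 0
    i.e.  w x t_i = -a * (h_i x t_i), with x the 2-D scalar cross product.
    """
    h = np.column_stack([np.cos(headings), np.sin(headings)])
    t = np.column_stack([np.cos(tracks), np.sin(tracks)])
    # w x t_i = wx*t_y - wy*t_x  ->  rows of the system A w = b
    A = np.column_stack([t[:, 1], -t[:, 0]])
    b = -airspeed * (h[:, 0] * t[:, 1] - h[:, 1] * t[:, 0])
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    speed = float(np.hypot(w[0], w[1]))
    direction = float(np.arctan2(w[1], w[0]))
    return w, speed, direction
```

Since heading and track can both be obtained from the vision system, nothing beyond visual measurements (plus an airspeed estimate in this simplified sketch) is required to recover wind strength and direction.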


Archive | 2012

From biology to engineering: Insect vision and applications to robotics

Mandyam V. Srinivasan; Richard James Donald Moore; Saul Thurrowgood; Dean Soccol; Daniel Bland

The past two decades have witnessed a growing interest not only in understanding sensory biology, but also in applying the principles gleaned from these studies to the design of new, biologically inspired sensors for a variety of engineering applications. This chapter provides a brief account of this interdisciplinary endeavour in the field of insect vision and flight guidance. Despite their diminutive eyes and brains, flying insects display superb agility and remarkable navigational competence. This review describes our current understanding of how insects use vision to stabilize flight, avoid collisions with objects, regulate flight speed, navigate to a distant food source, and orchestrate smooth landings. It also illustrates how some of these insights from biology are being used to develop novel algorithms for the guidance of terrestrial and airborne vehicles. We use this opportunity to also highlight some of the outstanding questions in this particular area of sensing and control.


Archive | 2011

A Bio-Inspired Stereo Vision System for Guidance of Autonomous Aircraft

Richard D. Moore; Saul Thurrowgood; Dean Soccol; Daniel Bland; Mandyam V. Srinivasan

Unmanned aerial vehicles (UAVs) are increasingly replacing manned systems in situations that are either too dangerous, too remote, or too difficult for manned aircraft to access. Modern UAVs are capable of accurately controlling their position and orientation in space using systems such as the Global Positioning System (GPS) and the Attitude and Heading Reference System (AHRS). However, they are unable to perform crucial guidance tasks such as obstacle avoidance, low-altitude terrain or gorge following, or landing in an uncontrolled environment using these systems only. For such tasks, the aircraft must be able to continuously monitor its surroundings. Active sensors, such as laser range finders or radar, can be bulky, low-bandwidth, and stealth-compromising. Therefore, there is considerable benefit to be gained by designing guidance systems for UAVs that utilise passive sensing, such as vision. Over the last two decades, a significant amount of research has shown that biological visual systems can inspire novel, vision-based solutions to some of the challenges facing autonomous aircraft guidance. A recent trend in biologically inspired vision systems has been to exploit optical flow information for collision avoidance, terrain and gorge following, and landing. However, systems that rely on optical flow for extracting range information need to discount the components of optical flow that are induced by rotations of the aircraft. Furthermore, altitude cannot be controlled in a precise manner using measurements of optical flow only, as optical flow also depends upon the aircraft's velocity. Stereo vision, on the other hand, allows the aircraft's altitude to be directly computed and controlled, irrespective of the attitude or ground speed of the aircraft, and independently of its rotations about the roll, pitch, and yaw axes.
Additionally, we will show that a stereo vision system can also allow the computation and control of the orientation of the aircraft with respect to the ground. Stereo vision therefore provides an attractive approach to solving some of the problems of providing guidance for autonomous aircraft operating in low-altitude or cluttered environments. In this chapter, we will explore how stereo vision may be applied to facilitate the guidance of an autonomous aircraft. In particular, we will discuss a wide-angle stereo vision system that has been tailored for the specific needs of aircraft guidance, such as terrain following, obstacle avoidance, and landing. Finally, results from closed-loop flight tests conducted using this system will be presented to demonstrate how stereo vision can be successfully utilised to provide guidance for an autonomous aircraft performing real-world tasks.

Collaboration


Dive into Saul Thurrowgood's collaborations.

Top Co-Authors

Dean Soccol, University of Queensland
Daniel Bland, University of Queensland
Reuben Strydom, University of Queensland
Farid Kendoul, Commonwealth Scientific and Industrial Research Organisation
Michael Knight, University of Queensland
Adam Postula, University of Queensland