
Publications


Featured research published by Dean Soccol.


Intelligent Robots and Systems | 2009

A vision-based system for attitude estimation of UAVs

Saul Thurrowgood; Dean Soccol; Richard James Donald Moore; Daniel Bland; Mandyam V. Srinivasan

This paper describes a technique for estimating the attitude of a UAV by monitoring the visual horizon. An algorithm is developed that makes the best use of color and intensity information in an image to determine the position and orientation of the horizon, and infer the aircraft's attitude. The technique is accurate, reliable, and fully capable of real-time operation. Furthermore, it can be incorporated into any existing vision system, irrespective of the way in which the environment is imaged (e.g. through lenses or mirrors).
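As a rough illustration of the geometry such a system relies on (not the authors' implementation), the sketch below assumes a pinhole camera and a horizon already segmented from the colour and intensity cues described above; the function and parameter names are hypothetical.

```python
import numpy as np

def attitude_from_horizon(p_left, p_right, focal_px, principal_point):
    """Estimate roll and pitch (degrees) from two points on the detected
    horizon, assuming a pinhole camera and a distant, level horizon.
    p_left / p_right are (x, y) pixel coordinates of the horizon at the
    left and right image borders; names are illustrative only."""
    (x0, y0), (x1, y1) = p_left, p_right
    cx, cy = principal_point

    # Roll mirrors the apparent tilt of the horizon in the image.
    roll = -np.arctan2(y1 - y0, x1 - x0)

    # Pitch follows from the vertical offset of the horizon at the image
    # centre, converted to an angle through the focal length.
    y_mid = y0 + (y1 - y0) * (cx - x0) / (x1 - x0)
    pitch = np.arctan2(y_mid - cy, focal_px)
    return np.degrees(roll), np.degrees(pitch)
```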


Intelligent Robots and Systems | 2009

A stereo vision system for UAV guidance

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel, vision-based system for guidance of UAVs. The system uses two cameras, each associated with a specially-shaped reflective surface, to obtain stereo information on the height above ground and the distances to potential obstacles. The camera-mirror system has the advantage that it remaps the world onto a cylindrical co-ordinate system that simplifies and speeds up range computations, and defines a collision-free cylinder through which the aircraft can pass without encountering obstacles. The result is a computationally efficient approach to vision-based aircraft guidance that is particularly suited to terrain and gorge following, obstacle avoidance, and landing. The feasibility of the system is demonstrated in laboratory and field tests.
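To make the cylindrical-remapping idea concrete, here is a minimal sketch assuming an idealised constant-gain relation between disparity and radial distance from the flight axis; it illustrates the principle rather than the paper's calibration or code.

```python
import numpy as np

def radial_range_from_disparity(disparity_px, baseline_m, gain_px):
    """Simplified model of the cylindrical stereo idea: after the mirror
    remapping, points at the same radial distance from the camera axis give
    the same axial disparity between the two views, so range reduces to a
    single division.  The constant-gain model and names are assumptions."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    return baseline_m * gain_px / np.maximum(disparity_px, 1e-6)

def collision_free_radius(radial_ranges):
    """Radius of the largest obstacle-free cylinder around the flight axis,
    bounded by the remapped point nearest to the axis."""
    return float(np.min(radial_ranges))
```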


Advanced Video and Signal Based Surveillance | 2006

An Optical System for Guidance of Terrain Following in UAVs

Mandyam V. Srinivasan; Saul Thurrowgood; Dean Soccol

There is considerable interest in designing guidance systems for UAVs that use passive sensing (such as vision), rather than active sensing which can be bulky, expensive and stealth-compromising. Here we describe an optical sensor, based partly on principles of insect vision and optic flow analysis, for measurement and control of height above the ground. A video camera is used in conjunction with a specially shaped reflective surface to simplify the computation of optic flow, and extend the range of aircraft speeds over which accurate data can be obtained. The imaging system also provides a useful geometrical remapping of the environment, which facilitates obstacle avoidance and computation of 3-D terrain maps.
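The underlying optic-flow relation is simple enough to state directly; the following sketch expresses it in code under the usual flat-ground, level-flight assumptions (names are illustrative, not the sensor's firmware).

```python
def height_from_optic_flow(ground_speed_mps, flow_rate_radps):
    """Insect-inspired principle the sensor builds on: for level flight over
    flat ground, the optic flow of the terrain directly below is roughly
    v / h, so height follows from measured flow and ground speed."""
    if flow_rate_radps <= 0:
        raise ValueError("flow rate must be positive")
    return ground_speed_mps / flow_rate_radps

# Example: 20 m/s ground speed and 1.0 rad/s of downward optic flow
# indicate a height of about 20 m.
```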


Journal of Field Robotics | 2014

A Biologically Inspired, Vision-based Guidance System for Automatic Landing of a Fixed-wing Aircraft

Saul Thurrowgood; Richard James Donald Moore; Dean Soccol; Michael Knight; Mandyam V. Srinivasan

We describe a guidance system for achieving automatic landing of a fixed-wing aircraft in unstructured outdoor terrain, using onboard video cameras. The system uses optic flow information for sensing and controlling the height above the ground, and information on the horizon profile, also acquired by the vision system, for stabilizing roll and controlling pitch and, if required, for controlling and stabilizing yaw and flight direction. At low heights, when optic flow is unreliable, stereo information is used to guide descent close to touchdown. While rate gyro information is used to augment attitude stabilization in one of the designs, this is not mandatory and can be replaced by visual information. Smooth, safe landings are achieved with a success rate of 92.5%. The system does not emit active radiation and does not rely on any external information such as a global positioning system or an instrument landing system.
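A minimal sketch of the optic-flow-to-stereo hand-over mentioned above, with an assumed switching height (the paper does not specify this exact rule or threshold):

```python
def height_estimate(h_flow, h_stereo, h_switch_m=3.0):
    """Optic flow provides height during the approach, but close to the
    ground it becomes unreliable, so below an assumed switching height the
    stereo estimate takes over.  The 3 m threshold is illustrative."""
    if h_stereo is not None and h_stereo < h_switch_m:
        return h_stereo
    return h_flow
```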


IEEE Robotics & Automation Magazine | 2009

Competent vision and navigation systems

Mandyam V. Srinivasan; Saul Thurrowgood; Dean Soccol

In this article, we describe how flying insects use vision for guidance, especially in the contexts of regulating flight speed, negotiating narrow gaps, avoiding obstacles, and performing smooth landings. We show that many of these maneuvers, which were traditionally believed to involve relatively complex and high-level perception, can be achieved through the use of low-level cues and relatively simple computation. We also describe tests of the effectiveness of some of these strategies for autonomous guidance of small-scale terrestrial and aerial vehicles in the contexts of corridor navigation, altitude control, and terrain following and landing. We also describe a novel, mirror-based imaging system that is tailored for these tasks and facilitates the requisite visual computations.
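One of the low-level cues mentioned above, the optic-flow balancing rule used for corridor centring, can be written in a few lines; this is a hedged sketch of the principle, not the vehicles' controller.

```python
def centering_steer_command(flow_left, flow_right, gain=1.0):
    """Bee-inspired centring strategy: steer away from the side seeing the
    larger optic-flow magnitude, which tends to equalise the distances to
    the two corridor walls.  Sign convention and gain are illustrative."""
    return gain * (flow_left - flow_right)  # positive -> steer right (assumed)
```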


Intelligent Robots and Systems | 2011

A fast and adaptive method for estimating UAV attitude from the visual horizon

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel method for automatically obtaining the attitude of an aircraft from the visual horizon. A wide-angle view of the environment, including the visual horizon, is captured and the input images are classified into fuzzy sky and ground regions using the spectral and intensity properties of the pixels. The classifier is updated continuously using an online reinforcement strategy and is therefore able to adapt to the changing appearance of the sky and ground, without requiring prior training offline. A novel approach to obtaining the attitude of the aircraft from the classified images is described, which is reliable, accurate, and computationally efficient to implement. This method is therefore suited to real-time operation and we present results from flight tests that demonstrate the ability of this vision-based approach to outperform an inexpensive inertial system.
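To illustrate what fuzzy classification with online adaptation can look like in practice, here is a minimal sketch using running Gaussian colour models; the model form, names, and update rate are assumptions rather than the paper's method.

```python
import numpy as np

def sky_probability(pixels, sky_mean, sky_cov_inv, ground_mean, ground_cov_inv):
    """Soft (0..1) sky membership for each pixel's colour/intensity vector,
    scored against running Gaussian models of sky and ground."""
    def mahalanobis2(x, mean, cov_inv):
        d = x - mean
        return np.einsum('...i,ij,...j->...', d, cov_inv, d)

    s = np.exp(-0.5 * mahalanobis2(pixels, sky_mean, sky_cov_inv))
    g = np.exp(-0.5 * mahalanobis2(pixels, ground_mean, ground_cov_inv))
    return s / (s + g + 1e-12)

def update_class_mean(mean, confident_samples, rate=0.05):
    """Online adaptation: nudge a class mean toward pixels confidently
    labelled using the previous attitude estimate (a simple exponential
    moving average; the reinforcement scheme in the paper is more elaborate)."""
    return (1.0 - rate) * mean + rate * confident_samples.mean(axis=0)
```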


International Conference on Robotics and Automation | 2010

UAV altitude and attitude stabilisation using a coaxial stereo vision system

Richard James Donald Moore; Saul Thurrowgood; Daniel Bland; Dean Soccol; Mandyam V. Srinivasan

This study describes a novel, vision-based system for guidance of UAVs. The system uses two coaxially aligned cameras, each associated with a specially-shaped reflective surface, to obtain stereo information on the height above ground and the distances to potential obstacles. The camera-mirror system has the advantage that it remaps the world onto a cylindrical co-ordinate system that simplifies and speeds up range computations, and defines a collision-free cylinder through which the aircraft can pass without encountering obstacles. We describe an approach, using this vision system, in which the attitude and altitude of an aircraft can be controlled directly, making the system particularly suited to terrain following, obstacle avoidance, and landing. The autonomous guidance of an aircraft performing a terrain following task using the system is demonstrated in field tests.
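As a hedged sketch of how a stereo-derived height estimate can drive terrain following directly (the gains and structure here are assumptions, not the published control law):

```python
def terrain_following_pitch_command(height_est_m, height_target_m,
                                    gain_deg_per_m=2.0, limit_deg=10.0):
    """Feed the stereo-derived height straight into a proportional pitch
    command for terrain following.  Gains, limits, and names are
    illustrative only."""
    cmd = gain_deg_per_m * (height_target_m - height_est_m)
    return max(-limit_deg, min(limit_deg, cmd))
```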


Applied Optics | 2008

Rugged, obstruction-free, mirror-lens combination for panoramic imaging

Wolfgang Stürzl; Dean Soccol; Jochen Zeil; Norbert Boeddeker; Mandyam V. Srinivasan

We present a new combination of lenses and reflective surfaces for obstruction-free wide-angle imaging. The panoramic imaging system consists of a reflective surface machined into solid Perspex which, together with an embedded lens, can be attached to a video camera lens. Unlike vision sensors with a single mirror mounted in front of a camera, the view in the forward direction (i.e., the direction of the optical axis) is not obstructed. Light rays contributing to the central region of the image are refracted at a centrally positioned lens and at the Perspex enclosure. For the outer image region, rays are reflected at a mirror surface of constant angular gain machined into the Perspex and coated with silver. The design produces a field of view of approximately 260 degrees with only a small separation of viewpoints. The shape of the enclosing Perspex is specifically designed to minimize internal reflections.
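The constant-angular-gain property mentioned above gives a particularly simple image-to-world mapping for the mirror region; the sketch below shows that mapping under an idealised perspective-camera model, ignoring refraction at the lens and Perspex (an illustration, not the calibrated model).

```python
import numpy as np

def world_elevation_from_pixel(r_px, focal_px, angular_gain):
    """Constant-angular-gain idea: the mirror is shaped so that the elevation
    of the incoming world ray is a fixed multiple of the camera ray angle,
    giving a simple pixel-radius-to-angle mapping.  Names and the
    perspective-camera assumption are illustrative."""
    theta_cam = np.arctan(np.asarray(r_px, dtype=float) / focal_px)
    return angular_gain * theta_cam
```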


Journal of Robotic Systems | 2003

Review: The Benefits and Applications of Bioinspired Flight Capabilities

Sarita Thakoor; Nathalie A. Cabrol; Norman Lay; Javaan Chahl; Dean Soccol; Butler Hine; Steven Zornetzer

This paper addresses the challenges of flight on Mars, which today carries the same element of novelty that flight on Earth did in the Kitty Hawk era almost 100 years ago. It details the scientific need for such flyers, highlights the bioinspired engineering of exploration systems (BEES) flyer development, and finally describes a few viable mission architecture options that allow reliable data return from the BEES flyers using the limited telecom infrastructure that can be made available with a lander-base-to-orbiter combination on Mars. We describe our recent developments, inspired by biology, that are opening the pathway to demonstrating flight capability for Mars exploration. These developments hold substantial spin-offs for a variety of applications for both NASA and DoD. Unmanned exploration to date suggests that Mars once had abundant liquid water (considered essential for life as we know it). It is not clear what transpired in the Martian climate to turn the planet into the desert it is today. Developing a comprehensive understanding of past and present climatic events on Mars may provide important information relevant to the future of our own planet. Such exploration missions are enabled by the BEES technology.


Archive | 2012

From biology to engineering: Insect vision and applications to robotics

Mandyam V. Srinivasan; Richard James Donald Moore; Saul Thurrowgood; Dean Soccol; Daniel Bland

The past two decades have witnessed a growing interest not only in understanding sensory biology, but also in applying the principles gleaned from these studies to the design of new, biologically inspired sensors for a variety of engineering applications. This chapter provides a brief account of this interdisciplinary endeavour in the field of insect vision and flight guidance. Despite their diminutive eyes and brains, flying insects display superb agility and remarkable navigational competence. This review describes our current understanding of how insects use vision to stabilize flight, avoid collisions with objects, regulate flight speed, navigate to a distant food source, and orchestrate smooth landings. It also illustrates how some of these insights from biology are being used to develop novel algorithms for the guidance of terrestrial and airborne vehicles. We use this opportunity to also highlight some of the outstanding questions in this particular area of sensing and control.

Collaboration


Dive into Dean Soccol's collaboration.

Top Co-Authors

Daniel Bland (University of Queensland)
Sarita Thakoor (Jet Propulsion Laboratory)
Javaan S. Chahl (Australian National University)
Michael Knight (University of Queensland)