
Publication


Featured research published by Javaan Chahl.


The International Journal of Robotics Research | 2004

Landing strategies in honeybees and applications to uninhabited airborne vehicles

Javaan Chahl; Mandyam V. Srinivasan; Shaowu Zhang

An application of insect visuomotor behavior to automatic control of landing is explored. Insects, being perhaps more reliant on image motion cues than mammals or higher vertebrates, are proving to be excellent organisms in which to investigate how information on optic flow is exploited to guide locomotion and navigation. We have observed how bees perform grazing landings on a flat surface and have deduced the algorithmic basis for the behavior. A smooth landing is achieved by a surprisingly simple and elegant strategy: image velocity is held constant as the surface is approached, thus automatically ensuring that flight speed is close to zero at touchdown. No explicit knowledge of flight speed or height above the ground is necessary. The feasibility of this landing strategy was tested by implementation in a robotic gantry. We also outline our current efforts at exploring the applicability of this and related techniques to the guidance of uninhabited airborne vehicles (UAVs). Aspects of the algorithm were tested on a small UAV using real imagery to control descent rate.
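The control rule is simple enough to sketch in a few lines. The following is an illustrative simulation of the constant-image-velocity landing strategy, not the authors' gantry or UAV code; the gain, glide angle, and update rate are assumed values.

```python
import numpy as np

def constant_flow_landing(h0=10.0, v0=2.0, w_ref=None, dt=0.01, gain=1.0):
    """Simulate a grazing landing that holds the ventral image velocity
    (forward speed divided by height) constant, as described for honeybees.

    h0: initial height (m); v0: initial forward speed (m/s);
    w_ref: target image velocity (rad/s), defaults to v0 / h0.
    Returns arrays of time, height and forward speed.
    """
    w_ref = v0 / h0 if w_ref is None else w_ref
    glide_angle = np.deg2rad(30.0)           # assumed constant descent angle
    t, h, v = 0.0, h0, v0
    ts, hs, vs = [t], [h], [v]
    while h > 0.01:
        w = v / h                            # observed ventral optic flow
        v += gain * (w_ref - w) * h * dt     # slow down/speed up to hold flow
        h -= v * np.tan(glide_angle) * dt    # descend along the glide slope
        t += dt
        ts.append(t); hs.append(h); vs.append(v)
    return np.array(ts), np.array(hs), np.array(vs)

# Because the flow v/h is held constant, v shrinks in proportion to h,
# so forward speed is close to zero at touchdown without ever measuring
# speed or height explicitly.
```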


Journal of Comparative Physiology A: Neuroethology, Sensory, Neural, and Behavioral Physiology | 2002

Anisotropic imaging in the dragonfly median ocellus: a matched filter for horizon detection

Gert Stange; Sally Stowe; Javaan Chahl; A. Massaro

It is suggested that the dragonfly median ocellus is specifically adapted to detect horizontally extended features rather than merely changes in overall intensity. Evidence is presented from the optics, tapetal reflections and retinal ultrastructure. The underfocused ocelli of adult insects are generally incapable of resolving images. However, in the dragonfly median ocellus the geometry of the lens indicates that some image detail is present at the retina in the vertical dimension. Details in the horizontal dimension are blurred by the strongly astigmatic lens. In the excised eye the image of a point source forms a horizontal streak at the level of the retina. Tapetal reflections from the intact eye show that the field of view is not circular as in most other insects but elliptical with the major axis horizontal, and that resolution in the vertical direction is better than in the horizontal. Measurements of tapetal reflections in locust ocelli confirm that their visual fields are wide and circular and their optics strongly underfocused. The ultrastructure suggests adaptation for resolution, sensitivity and a high metabolic rate, with long, widely separated rhabdoms, retinulae cupped by reflecting pigment, abundant tracheoles and mitochondria, and convoluted, amplified retinula cell plasma membranes.


Journal of Robotic Systems | 2003

Bioinspired engineering of exploration systems: a horizon sensor/attitude reference system based on the dragonfly ocelli for Mars exploration applications

Javaan Chahl; Sarita Thakoor; Naig Le Bouffant; Gert Stange; Mandyam V. Srinivasan; Butler Hine; Steven Zornetzer

Bioinspired engineering of exploration systems (BEES) is a fast emerging new discipline. It focuses on distilling the principles found in successful, nature-tested mechanisms of specific crucial functions that are hard to accomplish by conventional methods, but are accomplished rather deftly in nature by biological organisms. The intent is not just to mimic operational mechanisms found in a specific biological organism but to imbibe the salient principles from a variety of diverse organisms for the desired crucial function. Thereby, we can build exploration systems that have specific capabilities endowed beyond nature, as they will possess a mix of the best nature-tested mechanisms for each particular function. Insects (for example, honey bees and dragonflies) cope remarkably well with their world, despite possessing a brain that carries less than 0.01% as many neurons as ours does. Although most insects have immobile eyes, fixed focus optics, and lack stereo vision, they use a number of ingenious strategies for perceiving their world in three dimensions and navigating successfully in it. We are distilling some of these insect-inspired strategies for utilizing optical cues to obtain unique solutions to navigation, hazard avoidance, altitude hold, stable flight, terrain following, and smooth deployment of payload. Such functionality can enable access to otherwise unreachable exploration sites for much sought-after data. A BEES approach to developing autonomous flight systems, particularly at small scale, can thus have a tremendous impact on autonomous airborne navigation of these biomorphic flyers, particularly for planetary exploration missions, for example to Mars, which offers unique challenges due to its thin atmosphere, low gravity, and lack of magnetic field. Incorporating these successful strategies of bioinspired navigation into biomorphic sensors such as the horizon sensor described herein fulfills for the first time the requirements of a variety of potential future Mars exploration applications described in this paper. Specifically, we have obtained lightweight (~6 g), low power (<40 mW), and robust autonomous horizon sensing for flight stabilization based on distilling the principles of the dragonfly ocelli. Such levels of miniaturization of navigation sensors are essential to enable biomorphic microflyers (<1 kg) that can be deployed in large numbers for distributed measurements. In this paper we present the first experimental test results of a biomorphic flyer platform with an embedded biomorphic ocellus (the dragonfly-inspired horizon sensor/attitude reference system). These results from the novel hardware implementation of a horizon sensor demonstrate the advantage of our approach in adapting principles proven successful in nature to accomplish navigation for Mars exploration.


Proceedings of the IEEE Workshop on Omnidirectional Vision | 2000

A complete panoramic vision system, incorporating imaging, ranging, and three dimensional navigation

Javaan Chahl; Mandyam V. Srinivasan

A complete panoramic imaging, processing and autonomous navigation strategy is presented. New techniques for filtering panoramic images are developed to allow rangefinding and image filtering. The technique utilises the motion of the autonomous agent to compute range visually. Using the range data, a safe path through the environment is found. Each stage of processing requires novel approaches to processing and interpreting panoramic images. The entire system was tested in an artificial environment using a robotic gantry and a model terrain.
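The core of visually computing range from the agent's own motion is the standard motion-parallax relation; the sketch below illustrates that relation only and is not the authors' processing pipeline.

```python
import numpy as np

def range_from_flow(speed, bearing_rad, angular_velocity):
    """Estimate range to a static feature from the agent's own translation.

    For pure translation at `speed` (m/s), a feature seen at `bearing_rad`
    from the direction of travel drifts across the panoramic image at
    angular_velocity = speed * sin(bearing) / range, so
    range = speed * sin(bearing) / angular_velocity.
    """
    return speed * np.sin(bearing_rad) / angular_velocity

# e.g. a feature 90 degrees off the track, drifting at 0.2 rad/s while the
# robot translates at 0.5 m/s, is roughly 2.5 m away.
print(range_from_flow(0.5, np.pi / 2, 0.2))
```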


IEEE Sensors Journal | 2012

Biomimetic Attitude and Orientation Sensors

Javaan Chahl; Akiko Mizutani

We developed and flight-tested two biomimetic sensors that use the spectral, spatial and polarization distribution of light in the environment for navigation and stabilization. A sky polarization compass was constructed and methodologies for precise calibration were developed. In static and flight testing, the calibrated device was found to be comparable in accuracy to a solid state magnetic compass. A biomimetic version of the optical stabilization organ of dragonflies known as the ocelli was constructed. A technique of spectral opponency in ultraviolet and green wavelengths was demonstrated to be effective in reducing the biasing effect of the sun. In flight testing, the biomimetic ocelli were implemented as part of the autopilot for maintaining level flight and shown to be effective. The successful results indicate that biomimetic sensors may have a role in the quest to miniaturize the autopilots of small unmanned aerial vehicles.
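A minimal sketch of how ultraviolet/green spectral opponency can be turned into an ocelli-like attitude cue is given below; the function names, detector geometry and example values are assumptions for illustration, not the flight hardware or autopilot code described in the paper.

```python
def uv_green_opponency(uv, green, eps=1e-9):
    """Normalised UV-minus-green opponent signal for one wide-field detector.

    The sky is relatively UV-rich and the ground UV-poor, so the sign of the
    signal indicates how much sky the detector sees, while the normalisation
    cancels overall brightness and much of the sun's biasing effect.
    """
    return (uv - green) / (uv + green + eps)

def roll_error(left_uv, left_green, right_uv, right_green):
    """Ocelli-style roll cue: difference between the opponent signals of a
    left-looking and a right-looking detector. Zero when both see equal sky."""
    return (uv_green_opponency(left_uv, left_green)
            - uv_green_opponency(right_uv, right_green))

# Rolled to the right: the right detector sees mostly ground (UV-poor) and the
# left mostly sky (UV-rich), so the positive error commands a corrective roll
# back to the left.
print(roll_error(left_uv=0.9, left_green=0.4, right_uv=0.2, right_green=0.5))
```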


Artificial Life | 2002

Bioinspired engineering of exploration systems for NASA and DoD

Sarita Thakoor; Javaan Chahl; Mandyam V. Srinivasan; L. Young; Frank S. Werblin; Butler Hine; Steven Zornetzer

A new approach called bioinspired engineering of exploration systems (BEES) and its value for solving pressing NASA and DoD needs are described. Insects (for example honeybees and dragonflies) cope remarkably well with their world, despite possessing a brain containing less than 0.01% as many neurons as the human brain. Although most insects have immobile eyes with fixed focus optics and lack stereo vision, they use a number of ingenious, computationally simple strategies for perceiving their world in three dimensions and navigating successfully within it. We are distilling selected insect-inspired strategies to obtain novel solutions for navigation, hazard avoidance, altitude hold, stable flight, terrain following, and gentle deployment of payload. Such functionality provides potential solutions for future autonomous robotic space and planetary explorers. A BEES approach to developing lightweight low-power autonomous flight systems should be useful for flight control of such biomorphic flyers for both NASA and DoD needs. Recent biological studies of mammalian retinas confirm that representations of multiple features of the visual world are systematically parsed and processed in parallel. Features are mapped to a stack of cellular strata within the retina. Each of these representations can be efficiently modeled in semiconductor cellular nonlinear network (CNN) chips. We describe recent breakthroughs in exploring the feasibility of the unique blending of insect strategies of navigation with mammalian visual search, pattern recognition, and image understanding into hybrid biomorphic flyers for future planetary and terrestrial applications. We describe a few future mission scenarios for Mars exploration, uniquely enabled by these newly developed biomorphic flyers.


International Journal of Pattern Recognition and Artificial Intelligence | 1997

Robot Navigation by Visual Dead-Reckoning: Inspiration from Insects

Mandyam V. Srinivasan; Javaan Chahl; Shaowu Zhang

Insects such as ants and bees are capable of surprisingly good navigation, despite the small size and relative simplicity of their brains. Recent experimental research in our laboratory, summarized here, indicates that honeybees estimate the distance travelled to a food site in terms of the image motion that they experience en route. This finding has inspired us to design and build a robot that navigates using a visually-driven odometer.
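The visually driven odometer amounts to integrating image motion over the journey; the sketch below shows that idea under assumed units and a fixed range to the surface, and is not the robot implementation from the paper.

```python
import numpy as np

def visual_odometer(flow_per_frame, distance_to_surface, dt):
    """Accumulate distance travelled from lateral/ventral image motion.

    flow_per_frame: sequence of measured image angular velocities (rad/s).
    distance_to_surface: assumed or separately estimated range to the
    textured surface generating the flow (m).
    dt: frame interval (s).
    Distance ~ integral of angular image motion, scaled back into metres
    by the range to the surface.
    """
    return float(np.sum(np.asarray(flow_per_frame) * distance_to_surface * dt))

# 100 frames at 50 Hz with 0.5 rad/s of flow past a wall 1 m away
# corresponds to roughly 1 m of travel.
print(visual_odometer([0.5] * 100, distance_to_surface=1.0, dt=0.02))
```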


Sensors | 2017

Real Time Apnoea Monitoring of Children Using the Microsoft Kinect Sensor: A Pilot Study

Ali Al-Naji; Kim Gibson; Sang-Heon Lee; Javaan Chahl

The objective of this study was to design a non-invasive system for the observation of respiratory rates and detection of apnoea using analysis of real time image sequences captured in any given sleep position and under any light conditions (even in dark environments). A Microsoft Kinect sensor was used to visualize the variations in the thorax and abdomen from the respiratory rhythm. These variations were magnified, analyzed and detected at a distance of 2.5 m from the subject. A modified motion magnification system and frame subtraction technique were used to identify breathing movements by detecting rapid motion areas in the magnified frame sequences. The experimental results on a set of video data from five subjects (3 h for each subject) showed that our monitoring system can accurately measure respiratory rate and therefore detect apnoea in infants and young children. The proposed system is feasible, accurate and safe, and has low computational complexity, making it an efficient alternative for non-contact home sleep monitoring systems and advancing health care applications.
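A minimal sketch of the frame-subtraction idea is shown below; the threshold and the simple peak-counting rate estimate are illustrative assumptions, not the calibrated pipeline or the Kinect interface used in the paper.

```python
import numpy as np

def respiration_signal(frames, motion_threshold=10.0):
    """Frame-subtraction sketch of a breathing detector.

    frames: iterable of grayscale frames (2-D arrays) from any depth or IR
    camera. Consecutive frames are subtracted, small changes are thresholded
    away, and the count of moving pixels per frame forms a respiration
    waveform whose peaks give the breathing rate.
    """
    frames = [np.asarray(f, dtype=np.float32) for f in frames]
    signal = []
    for prev, curr in zip(frames, frames[1:]):
        diff = np.abs(curr - prev)                           # inter-frame change
        signal.append(int(np.sum(diff > motion_threshold)))  # moving pixels
    return np.array(signal, dtype=float)

def breaths_per_minute(signal, fps):
    """Very rough rate estimate: count local maxima above the mean level."""
    peaks = np.sum((signal[1:-1] > signal[:-2]) &
                   (signal[1:-1] > signal[2:]) &
                   (signal[1:-1] > signal.mean()))
    return peaks * 60.0 * fps / len(signal)
```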


Journal of Robotic Systems | 2003

Review: The Benefits and Applications of Bioinspired Flight Capabilities

Sarita Thakoor; Nathalie A. Cabrol; Norman Lay; Javaan Chahl; Dean Soccol; Butler Hine; Steven Zornetzer

This paper addresses the challenges of flight on Mars, which today carries the same element of novelty that flight on Earth did in the Kitty Hawk era almost 100 years ago; details the scientific need for such flyers; highlights the bioinspired engineering of exploration systems (BEES) flyer development; and finally describes a few viable mission architecture options that allow reliable data return from the BEES flyers using the limited telecom infrastructure that can be made available with a lander base to orbiter combination on Mars. Our recent developments using inspiration from biology that are enabling the pathway to demonstrate flight capability for Mars exploration are described. These developments hold substantial spin-offs for a variety of applications both for NASA and DoD. Unmanned exploration to date suggests that Mars once had abundant liquid water (considered essential for life as we know it). It is not clear what happened to the Martian climate to turn the planet into the desert that it is today. Developing a comprehensive understanding of the past and present climatic events of Mars may provide important information relevant to the future of our own planet. Such exploration missions are enabled using the BEES technology.


Biomedical Signal Processing and Control | 2016

Remote respiratory monitoring system based on developing motion magnification technique

Ali Al-Naji; Javaan Chahl

The aim of this study is to detect and measure the rate and timing parameters of the respiratory cycle of a baby at a distance, in different sleeping positions, based on video imagery. This study relied on amplifying motion resulting from movement of the chest caused by inhalation and exhalation. A motion magnification technique based on a wavelet decomposition and an elliptic filter was used to magnify breathing movement that is difficult to see with the naked eye. A novel measuring method based on motion detection was used to measure respiratory rate and its time parameters by detecting the fastest moving areas in the magnified video frame sequences. The video frames were converted into a corresponding logical matrix. The experimental results on several videos of the baby in different sleeping positions show that the remote respiratory monitoring system has an accuracy of 99%. The proposed system has very low computational complexity, and is feasible and safe, making it suitable for the design of next generation non-contact vital signs monitoring systems.
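The temporal filtering stage can be illustrated with a standard elliptic band-pass filter; the filter order, ripple, and pass band below are assumptions for illustration, and the wavelet-based magnification stage from the paper is not reproduced here.

```python
import numpy as np
from scipy.signal import ellip, filtfilt

def breathing_band(signal, fps, low_hz=0.2, high_hz=1.0):
    """Isolate the respiratory band of a per-frame motion/intensity signal.

    An elliptic band-pass filter (here 4th order, 1 dB ripple, 40 dB
    stop-band attenuation, all assumed values) keeps roughly 0.2-1.0 Hz,
    i.e. about 12-60 breaths per minute, rejecting both slow illumination
    drift and high-frequency noise.
    """
    nyq = fps / 2.0
    b, a = ellip(4, 1, 40, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, signal)

# Example: a noisy 0.4 Hz "breathing" trace sampled at 30 fps.
fps = 30
t = np.arange(0, 60, 1.0 / fps)
raw = np.sin(2 * np.pi * 0.4 * t) + 0.5 * np.random.randn(t.size)
clean = breathing_band(raw, fps)
```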

Collaboration


Dive into Javaan Chahl's collaborations.

Top Co-Authors

Akiko Mizutani (Defence Science and Technology Organisation)
Ali Al-Naji (University of South Australia)
Kent Rosser (Defence Science and Technology Organisation)
Sang-Heon Lee (University of South Australia)
Huajian Liu (University of South Australia)
David Carr (Defence Science and Technology Organisation)
Aakash Dawadee (University of South Australia)
Asanka G. Perera (University of South Australia)
Jia-Ming Kok (University of South Australia)