Publication


Featured research published by Javaan S. Chahl.


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2003

Catchment areas of panoramic snapshots in outdoor scenes

Jochen Zeil; Martin I. Hofmann; Javaan S. Chahl

We took panoramic snapshots in outdoor scenes at regular intervals in two- or three-dimensional grids covering 1 m² or 1 m³ and determined how the root mean square pixel differences between each of the images and a reference image acquired at one of the locations in the grid develop over distance from the reference position. We then asked whether the reference position can be pinpointed from a random starting position by moving the panoramic imaging device in such a way that the image differences relative to the reference image are minimized. We find that on time scales of minutes to hours, outdoor locations are accurately defined by a clear, sharp minimum in a smooth three-dimensional (3D) volume of image differences (the 3D difference function). 3D difference functions depend on the spatial-frequency content of natural scenes and on the spatial layout of objects therein. They become steeper in the vicinity of dominant objects. Their shape and smoothness, however, are affected by changes in illumination and shadows. The difference functions generated by rotation are similar in shape to those generated by translation, but their plateau values are higher. Rotational difference functions change little with distance from the reference location. Simple gradient descent methods are surprisingly successful in recovering a goal location, even if faced with transient changes in illumination. Our results show that view-based homing with panoramic images is in principle feasible in natural environments and does not require the identification of individual landmarks. We discuss the relevance of our findings to the study of robot and insect homing.
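
The homing scheme the abstract describes fits in a few lines. Below is a minimal sketch, assuming a toy grid of synthetic "panoramas" whose difference function has a single sharp minimum; the function names and the synthetic scene are ours, not the authors' code.

```python
import numpy as np

def rms_difference(img_a, img_b):
    """Root mean square pixel difference between two snapshots."""
    return np.sqrt(np.mean((img_a - img_b) ** 2))

def descend_to_goal(snapshots, goal, start):
    """Greedy gradient descent on the image difference function.

    snapshots: dict mapping grid coordinates -> panoramic image.
    Moves to whichever 4-neighbour snapshot most reduces the RMS
    difference to the goal image; stops at a local minimum.
    """
    goal_img = snapshots[goal]
    pos, path = start, [start]
    while True:
        candidates = [pos] + [(pos[0] + dx, pos[1] + dy)
                              for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                              if (pos[0] + dx, pos[1] + dy) in snapshots]
        best = min(candidates,
                   key=lambda p: rms_difference(snapshots[p], goal_img))
        if best == pos:          # local minimum of the difference function
            return path
        pos = best
        path.append(pos)

# Toy scene: a smooth luminance bump that translates with the viewing
# position, giving a smooth difference function with one sharp minimum.
xs, ys = np.meshgrid(np.arange(64.0), np.arange(64.0), indexing="ij")

def snapshot(i, j, width=6.0):
    cx, cy = 3 * i + 18, 3 * j + 18      # keep the bump clear of the edges
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * width ** 2))

grid = {(i, j): snapshot(i, j) for i in range(10) for j in range(10)}
print(descend_to_goal(grid, goal=(5, 5), start=(0, 9)))  # path ends at (5, 5)
```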


Applied Optics | 1997

Reflective surfaces for panoramic imaging

Javaan S. Chahl; Mandyam V. Srinivasan

A family of reflective surfaces is presented that, when imaged by a camera, can capture a global view of the visual environment. By using these surfaces in conjunction with conventional imaging devices, it is possible to produce fields of view in excess of 180 degrees that are not affected by the distortions and aberrations found in refractive wide-angle imaging devices. By solving a differential equation expressing the camera viewing angle as a function of the angle of incidence on a reflective surface, a family of appropriate surfaces has been derived. The surfaces preserve a linear relationship between the angle of incidence of light onto the surface and the angle of reflection onto the imaging device, as does a normal mirror. However, the gradient of this linear relationship can be varied as desired to produce a larger or smaller field of view. The resulting family of surfaces has a number of applications in surveillance and machine vision.
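
The angle-linearity condition pins the surface family down to a single ordinary differential equation. The sketch below is our own reconstruction of that geometry, not the paper's derivation: with the camera at the origin, a mirror point seen at camera angle theta (from the optical axis) must emit a scene ray at angle psi = k * theta (measured from the downward vertical, so k = 1 reproduces a flat mirror), and the requirement that the surface normal bisect the two ray directions yields the profile checked below.

```python
import numpy as np

# Our reconstruction (hedged: names and angle conventions are ours).
# The bisector condition gives dr/dtheta = r * tan((1 + k) * theta / 2),
# hence r(theta) = r0 * cos((1 + k) * theta / 2) ** (-2 / (1 + k));
# k = 1 collapses to r = r0 / cos(theta), an ordinary plane mirror.

def profile(theta, k=9.0, r0=1.0):
    """Mirror cross-section in polar form (camera at the origin)."""
    return r0 * np.cos(0.5 * (1 + k) * theta) ** (-2.0 / (1 + k))

def scene_angle(theta, k=9.0, h=1e-6):
    """Reflect the camera ray off the surface numerically, returning the
    outgoing scene-ray angle to check that the linear gain is preserved."""
    point = lambda t: profile(t, k) * np.array([np.sin(t), np.cos(t)])
    p, q = point(theta), point(theta + h)
    tang = (q - p) / np.linalg.norm(q - p)     # local surface tangent
    d = p / np.linalg.norm(p)                  # camera ray direction
    out = 2 * np.dot(d, tang) * tang - d       # law of reflection
    return np.arctan2(out[0], -out[1])         # angle from the downward axis

for deg in (2.0, 4.0, 6.0, 8.0):
    print(f"camera {deg:.0f} deg -> scene "
          f"{np.rad2deg(scene_angle(np.deg2rad(deg))):.2f} deg")
# prints ~18, 36, 54, 72: a 9x wider field of view than the lens alone.
```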


Biological Cybernetics | 2000

How honeybees make grazing landings on flat surfaces.

Mandyam V. Srinivasan; Shaowu Zhang; Javaan S. Chahl; Erhardt Barth; Svetha Venkatesh

Freely flying bees were filmed as they landed on a flat, horizontal surface, to investigate the underlying visuomotor control strategies. The results reveal that (1) landing bees approach the surface at a relatively shallow descent angle; (2) they tend to hold the angular velocity of the image of the surface constant as they approach it; and (3) the instantaneous speed of descent is proportional to the instantaneous forward speed. These characteristics reflect a surprisingly simple and effective strategy for achieving a smooth landing, by which the forward and descent speeds are automatically reduced as the surface is approached and are both close to zero at touchdown. No explicit knowledge of flight speed or height above the ground is necessary. A model of the control scheme is developed and its predictions are verified. It is also shown that, during landing, the bee decelerates continuously and in such a way as to keep the projected time to touchdown constant as the surface is approached. The feasibility of this landing strategy is demonstrated by implementation in a robotic gantry equipped with vision.
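
Points (2) and (3) define a complete landing controller that is easy to simulate. A minimal sketch with made-up constants (omega, c and the initial height are ours):

```python
# Hold image angular velocity omega = forward_speed / height constant and
# keep descent speed a fixed fraction of forward speed; both speeds then
# decay to zero together, as in the bees' grazing landings.

omega = 2.0                  # rad/s image angular velocity, held constant
c = 0.2                      # descent speed = c * forward speed
h, dt, t = 2.0, 0.01, 0.0    # initial height (m), time step (s), clock (s)

while h > 1e-3:
    v_forward = omega * h        # enforce constant image angular velocity
    v_descent = c * v_forward    # descent proportional to forward speed
    h -= v_descent * dt
    t += dt

# h decays as h0 * exp(-c * omega * t), so speed at touchdown is ~0.
print(f"touchdown after ~{t:.1f} s, final forward speed "
      f"{1000 * omega * h:.1f} mm/s")
```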


Robotics and Autonomous Systems | 1999

Robot navigation inspired by principles of insect vision

Mandyam V. Srinivasan; Javaan S. Chahl; K. Weber; Svetha Venkatesh; Martin G. Nagle; Shaowu Zhang

Recent studies of insect visual behaviour and navigation reveal a number of elegant strategies that can be profitably applied to the design of autonomous robots. The peering behaviour of grasshoppers, for example, has inspired the design of new rangefinding systems. The centring response of bees flying through a tunnel has led to simple methods for navigating through corridors. Experimental investigation of the bees' “odometer” has led to the implementation of schemes for visually driven odometry. These and other visually mediated insect behaviours are described along with a number of applications to robot navigation.
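
The centring response mentioned above reduces to a one-line control law: steer away from the eye that sees faster image motion. A hedged sketch of that idea as a corridor-following controller (the gains and geometry are our own invention):

```python
def centring_step(y, v=1.0, width=2.0, gain=0.5, dt=0.05):
    """One control step; y is the lateral offset from the corridor midline.

    A robot translating at speed v past a wall at lateral distance d sees
    image angular velocity ~ v / d, so balancing the left and right image
    speeds drives the robot toward the midline."""
    omega_left = v / (width / 2 + y)
    omega_right = v / (width / 2 - y)
    return y + gain * (omega_left - omega_right) * dt

y = 0.7                        # start close to the right wall
for _ in range(200):
    y = centring_step(y)
print(f"lateral offset after 10 s: {y:.4f} m")   # -> ~0, centred
```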


Neural Computation | 2004

Insect-inspired estimation of egomotion

Matthias O. Franz; Javaan S. Chahl; Holger G. Krapp

Tangential neurons in the fly brain are sensitive to the typical optic flow patterns generated during egomotion. In this study, we examine whether a simplified linear model based on the organization principles in tangential neurons can be used to estimate egomotion from the optic flow. We present a theory for the construction of an estimator consisting of a linear combination of optic flow vectors that incorporates prior knowledge about the distance distribution of the environment and about the noise and egomotion statistics of the sensor. The estimator is tested on a gantry carrying an omnidirectional vision sensor. The experiments show that the proposed approach leads to accurate and robust estimates of rotation rates, whereas translation estimates are of reasonable quality, albeit less reliable.
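
The construction lends itself to a compact numerical sketch. Everything below is our notation with a stand-in sensitivity matrix, not the authors' code: if the flow is modelled as linear in the egomotion, flow = A x + noise, then the linear estimator that folds in a prior egomotion covariance S_x and a noise covariance S_n is W = S_x A^T (A S_x A^T + S_n)^-1.

```python
import numpy as np

rng = np.random.default_rng(0)

n_dirs = 200                               # viewing directions in the flow field
A = rng.standard_normal((2 * n_dirs, 6))   # stand-in sensitivity matrix; the
                                           # real one encodes sensor geometry
                                           # and the assumed distance prior
S_x = np.eye(6)                            # prior egomotion covariance
S_n = 0.1 * np.eye(2 * n_dirs)             # sensor noise covariance

W = S_x @ A.T @ np.linalg.inv(A @ S_x @ A.T + S_n)

x_true = rng.standard_normal(6)            # 3 rotation + 3 translation rates
flow = A @ x_true + np.sqrt(0.1) * rng.standard_normal(2 * n_dirs)
x_hat = W @ flow                           # one matrix-vector product per frame
print(np.round(x_true, 2))
print(np.round(x_hat, 2))                  # close to x_true
```

The estimator is a single fixed matrix applied to the flow vector, which is what makes the tangential-neuron analogy (a fixed linear combination of local motion signals) attractive for real-time use.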


Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering | 2004

An overview of insect-inspired guidance for application in ground and airborne platforms

Mandyam V. Srinivasan; Shaowu Zhang; Javaan S. Chahl; Gert Stange; Matthew A. Garratt

Flying insects provide a clear demonstration that living organisms can display surprisingly competent mechanisms of guidance and navigation, despite possessing relatively small brains and simple nervous systems. Consequently, they are proving to be excellent organisms in which to investigate how visual information is exploited to guide locomotion and navigation. Four illustrative examples are described here, in the context of navigation to a destination. Bees negotiate narrow gaps by balancing the speeds of the images in the two eyes. Flight speed is regulated by holding constant the average image velocity as seen by the two eyes. This automatically ensures that flight speed is reduced to a safe level when the passage narrows. Smooth landings on a horizontal surface are achieved by holding image velocity constant as the surface is approached, thus automatically ensuring that flight speed is close to zero at touchdown. Roll and pitch are stabilized by balancing the signals registered by three visual organs, the ocelli, that view the horizon in the left, right and forward directions respectively. Tests of the feasibility of these navigational strategies, by implementation in robots, are described.
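
The ocellar scheme in the last example is simple enough to state as a two-line control law. A toy sketch (the sensor model, sign conventions and gains are all ours):

```python
def attitude_from_ocelli(left, right, front, k_roll=1.0, k_pitch=1.0):
    """Map three ocellar brightness readings to corrective commands.

    Balancing the left and right ocelli stabilises roll; comparing the
    forward ocellus against their mean stabilises pitch."""
    roll_cmd = k_roll * (right - left)
    pitch_cmd = k_pitch * (front - 0.5 * (left + right))
    return roll_cmd, pitch_cmd

# Level flight: all three ocelli see the same sky/ground split.
print(attitude_from_ocelli(0.5, 0.5, 0.5))      # (0.0, 0.0)
# Rolled right: the right ocellus sees more (darker) ground, so the
# controller commands a corrective roll back to the left.
roll, pitch = attitude_from_ocelli(0.6, 0.4, 0.5)
print(round(roll, 3), round(pitch, 3))           # -0.2 0.0
```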


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1997

Range estimation with a panoramic visual sensor

Javaan S. Chahl; Mandyam V. Srinivasan

A method that uses a moving panoramic visual sensor to estimate range is presented. Range estimation is based on the fact that local, motion-induced deformation of the panoramic image is range dependent, regardless of the nature of the deformation. The nature of the deformation depends on azimuthal viewing direction: It is an expansion in the direction of motion, a contraction in the opposite direction, a translation for viewing directions perpendicular to the motion, and a complex deformation along other viewing directions. Range in each direction is estimated by comparing the magnitude of the image deformation measured in that direction with the magnitude of the deformation that would be produced by a surface at a known, standard, minimum range. The local image deformation is measured by using a version of the image interpolation algorithm. The results of this technique are a uniform and efficient treatment of the entire panoramic image and a robust estimation of range in every direction. With this method a single translatory motion provides range information along every azimuthal direction.
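
The ranging rule itself is one division per viewing direction. A hedged sketch (our simplification, substituting an ideal angular shift for the paper's image-interpolation measurement of deformation):

```python
import numpy as np

# After a small translation d, a feature at azimuth theta and range R
# shifts by roughly (d / R) * sin(theta) radians, so the ratio between
# the shift a surface at a known reference range would produce and the
# measured shift gives the range in every direction.

def range_from_deformation(measured_shift, theta, d=0.05, r_ref=1.0):
    """Range estimate (same units as r_ref) from measured angular shifts."""
    ref_shift = (d / r_ref) * np.sin(theta)   # deformation at the known range
    return r_ref * ref_shift / measured_shift

# Simulated check with noise-free shifts for objects at known ranges.
theta = np.deg2rad([30.0, 60.0, 90.0, 120.0])
true_range = np.array([0.8, 1.5, 2.0, 3.0])
measured = (0.05 / true_range) * np.sin(theta)
print(range_from_deformation(measured, theta))   # [0.8 1.5 2.  3. ]
```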


The Biological Bulletin | 2001

Landing strategies in honeybees, and possible applications to autonomous airborne vehicles.

Mandyam V. Srinivasan; Shaowu Zhang; Javaan S. Chahl

Insects, being perhaps more reliant on image motion cues than mammals or higher vertebrates, are proving to be excellent organisms in which to investigate how information on optic flow is exploited to guide locomotion and navigation. This paper describes one example, illustrating how bees perform grazing landings on a flat surface. A smooth landing is achieved by a surprisingly simple and elegant strategy: image velocity is held constant as the surface is approached, thus automatically ensuring that flight speed is close to zero at touchdown. No explicit knowledge of flight speed or height above the ground is necessary. The feasibility of this landing strategy is tested by implementation in a robotic gantry, and its applicability to autonomous airborne vehicles is discussed.
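
The strategy admits a one-line closed form. As a worked equation (our derivation, consistent with the abstract): writing h for height, v for forward speed, omega for the image angular velocity held constant, and k for the fixed ratio of descent to forward speed,

```latex
\[
\frac{dh}{dt} = -k\,v = -k\,\omega\,h
\quad\Longrightarrow\quad
h(t) = h_0\,e^{-k\omega t},
\qquad
v(t) = \omega\,h(t) = \omega\,h_0\,e^{-k\omega t},
\]
```

so forward and descent speeds decay to zero together at touchdown, without the bee needing to know v or h individually.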


Biological Cybernetics | 1996

Visual computation of egomotion using an image interpolation technique

Javaan S. Chahl; Mandyam V. Srinivasan

A novel technique is presented for the computation of the parameters of egomotion of a mobile device, such as a robot or a mechanical arm, equipped with two visual sensors. Each sensor captures a panoramic view of the environment. We show that the parameters of egomotion can be computed by interpolating the position of the image captured by one of the sensors at the robot's present location, with respect to the images captured by the two sensors at the robot's previous location. The algorithm delivers the distance travelled and angle rotated, without the explicit measurement or integration of velocity fields. The result is obtained in a single step, without any iteration or successive approximation. Tests of the algorithm on real and synthetic images reveal an accuracy to within 5% of the actual motion. Implementation of the algorithm on a mobile robot reveals that stepwise rotation and translation can be measured to within 10% accuracy in a three-dimensional world of unknown structure. The position and orientation of the robot at the end of a 30-step trajectory can be estimated with accuracies of 5% and 5°, respectively.
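
The interpolation step at the heart of the method has a closed-form, non-iterative solution, which is why no successive approximation is needed. A hedged 1D sketch in the spirit of that technique (our simplification of it, with a toy scan line):

```python
import numpy as np

def interp_shift(i_cur, i_ref, i_plus, i_minus, delta):
    """Estimate the shift of i_cur relative to i_ref in one step.

    i_plus / i_minus are reference images displaced by +delta / -delta
    pixels; the current image is modelled as an interpolation between
    them, and the displacement drops out of a linear least-squares fit."""
    g = (i_plus - i_minus) / 2.0             # finite-difference gradient
    return delta * np.sum((i_cur - i_ref) * g) / np.sum(g * g)

# Toy panoramic scan line and shifted copies of it.
x = np.linspace(0, 2 * np.pi, 512, endpoint=False)
scene = np.sin(3 * x) + 0.5 * np.sin(7 * x + 1.0)
i_ref = scene
i_plus, i_minus = np.roll(scene, 10), np.roll(scene, -10)
i_cur = np.roll(scene, 3)                    # shift unknown to the estimator
print(interp_shift(i_cur, i_ref, i_plus, i_minus, delta=10))  # ~3
```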


Proceedings of the Second International Workshop on Biologically Motivated Computer Vision (BMCV '02) | 2002

Insect-Inspired Estimation of Self-Motion

Matthias O. Franz; Javaan S. Chahl

The tangential neurons in the fly brain are sensitive to the typical optic flow patterns generated during self-motion. In this study, we examine whether a simplified linear model of these neurons can be used to estimate self-motion from the optic flow. We present a theory for the construction of an optimal linear estimator incorporating prior knowledge about the environment. The optimal estimator is tested on a gantry carrying an omnidirectional vision sensor. The experiments show that the proposed approach leads to accurate and robust estimates of rotation rates, whereas translation estimates turn out to be less reliable.

Collaboration


Dive into Javaan S. Chahl's collaborations.

Top Co-Authors

Sarita Thakoor (Jet Propulsion Laboratory)
Shaowu Zhang (Australian National University)
Martin G. Nagle (Australian National University)
Matthew A. Garratt (University of New South Wales)
Gert Stange (Australian National University)