Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Andres Castano is active.

Publication


Featured research published by Andres Castano.


Autonomous Robots | 2005

Obstacle Detection and Terrain Classification for Autonomous Off-Road Navigation

Roberto Manduchi; Andres Castano; Ashit Talukder; Larry H. Matthies

Autonomous navigation in cross-country environments presents many new challenges with respect to more traditional, urban environments. The lack of highly structured components in the scene complicates the design of even basic functionalities such as obstacle detection. In addition to the geometric description of the scene, terrain typing is also an important component of the perceptual system. Recognizing the different classes of terrain and obstacles enables the path planner to choose the most efficient route toward the desired goal. This paper presents new sensor processing algorithms that are suitable for cross-country autonomous navigation. We consider two sensor systems that complement each other in an ideal sensor suite: a color stereo camera and a single-axis ladar. We propose an obstacle detection technique, based on stereo range measurements, that does not rely on typical structural assumptions about the scene (such as the presence of a visible ground plane); a color-based classification system to label the detected obstacles according to a set of terrain classes; and an algorithm for the analysis of ladar data that allows one to discriminate between grass and obstacles (such as tree trunks or rocks), even when such obstacles are partially hidden in the grass. These algorithms have been developed and implemented by the Jet Propulsion Laboratory (JPL) as part of its involvement in a number of projects sponsored by the US Department of Defense, and have enabled safe autonomous navigation in highly vegetated, off-road terrain.
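
A minimal sketch of the kind of ground-plane-free obstacle test described above: two 3D points from the stereo range data support an obstacle hypothesis when the segment joining them is steep enough and spans a plausible height. The thresholds and the brute-force pairwise search are illustrative placeholders, not the paper's implementation.

```python
import numpy as np

def compatible(p, q, h_min=0.1, h_max=0.5, slope_deg=45.0):
    """Two 3D points (x, y, z with z up) support an obstacle hypothesis if the
    segment joining them is steep enough and its vertical extent is plausible."""
    dz = abs(p[2] - q[2])
    dxy = np.hypot(p[0] - q[0], p[1] - q[1])
    steep = dz >= dxy * np.tan(np.radians(slope_deg))
    return h_min <= dz <= h_max and steep

def label_obstacle_points(points):
    """Brute-force O(n^2) pass: a point is flagged as an obstacle if it is
    compatible with at least one other point. Real-time versions restrict the
    search to a local image neighborhood."""
    points = np.asarray(points, dtype=float)
    n = len(points)
    flags = np.zeros(n, dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            if compatible(points[i], points[j]):
                flags[i] = flags[j] = True
    return flags
```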


Journal of Field Robotics | 2007

OASIS: Onboard Autonomous Science Investigation System for Opportunistic Rover Science

Rebecca Castano; Tara Estlin; Robert C. Anderson; Daniel M. Gaines; Andres Castano; Benjamin J. Bornstein; Caroline Chouinard; M. A. Judd

The Onboard Autonomous Science Investigation System has been developed to enable a rover to identify and react to serendipitous science opportunities. Using the FIDO rover in the Mars Yard at JPL, we have successfully demonstrated a fully autonomous opportunistic science system. The closed-loop system tests included the rover acquiring image data, finding rocks in the image, analyzing rock properties and identifying rocks that merit further investigation. When the onboard system alerts the rover to take additional measurements of interesting rocks, the planning and scheduling component determines if there are enough resources to meet this additional science data request. The rover is then instructed either to turn toward the rock or to move closer to the rock to take an additional, close-up image. Prototype dust devil and cloud detection algorithms were delivered to an infusion task which refined the algorithms specifically for the Mars Exploration Rovers (MER). These algorithms have been integrated into the MER flight software and were recently uploaded to the rovers on Mars.
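
A schematic sketch of the closed-loop resource check described above, using a made-up Rock record and hypothetical follow-up costs; OASIS's actual planning and scheduling component is far more capable than this greedy filter.

```python
from dataclasses import dataclass

@dataclass
class Rock:
    image_xy: tuple      # pixel location of the detection (hypothetical record)
    interest: float      # science interest score in [0, 1]

# Hypothetical cost of one follow-up close-up image (illustrative, not mission values).
FOLLOWUP_COST = {"energy_Wh": 5.0, "time_s": 120.0, "data_Mb": 8.0}

def plan_followups(rocks, budget, interest_threshold=0.8):
    """Greedy resource check: request close-up imaging of interesting rocks
    only while the remaining budget (same keys as FOLLOWUP_COST) covers the cost."""
    remaining = dict(budget)
    plan = []
    for rock in sorted(rocks, key=lambda r: r.interest, reverse=True):
        if rock.interest < interest_threshold:
            break
        if all(remaining[k] >= v for k, v in FOLLOWUP_COST.items()):
            for k, v in FOLLOWUP_COST.items():
                remaining[k] -= v
            plan.append(rock)
    return plan
```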


International Symposium on 3D Data Processing, Visualization and Transmission | 2004

Enhanced real-time stereo using bilateral filtering

Adnan Ansar; Andres Castano; Larry H. Matthies

In recent years, there have been significant strides in increasing quality of range from stereo using global techniques such as energy minimization. These methods cannot yet achieve real-time performance. However, the need to improve range quality for real-time applications persists. All real-time stereo implementations rely on a simple correlation step which employs some local similarity metric between the left and right image. Typically, the correlation takes place on an image pair modified in some way to compensate for photometric variations between the left and right cameras. Improvements and modifications to such algorithms tend to fall into one of two broad categories: those which address the correlation step itself (e.g., shiftable windows, adaptive windows) and those which address the preprocessing of input imagery (e.g., band-pass filtering, Rank, Census). Our efforts lie in the latter area. We present in this paper a modification of the standard band-pass filtering technique used by many SSD- and SAD-based correlation algorithms. By using the bilateral filter of Tomasi and Manduchi (1998), we minimize blurring at the filtering stage. We show that in conjunction with SAD correlation, our new method improves stereo quality at range discontinuities while maintaining real-time performance.
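
The idea above can be illustrated with a small sketch: replace the low-pass image used in the usual band-pass prefilter with a bilateral-filtered image, so the subtraction removes photometric bias without blurring across depth edges. This is a slow reference version with illustrative parameters, not the real-time implementation from the paper.

```python
import numpy as np

def bilateral_filter(img, radius=3, sigma_s=2.0, sigma_r=10.0):
    """Plain (slow) bilateral filter: each output pixel is a weighted mean of
    its neighbors, weighted by spatial distance and by intensity similarity,
    so edges are smoothed much less than homogeneous regions."""
    img = img.astype(float)
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            rng = np.exp(-((patch - img[y, x]) ** 2) / (2 * sigma_r**2))
            wgt = spatial * rng
            out[y, x] = (wgt * patch).sum() / wgt.sum()
    return out

def prefilter(img):
    """Band-pass-style preprocessing: subtract the bilateral-filtered image
    instead of a Gaussian-blurred one, so offsets between the left and right
    cameras are removed without blurring across range discontinuities."""
    return img.astype(float) - bilateral_filter(img)
```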


Machine Vision and Applications | 2008

Automatic detection of dust devils and clouds on Mars

Andres Castano; Alex Fukunaga; Jeffrey J. Biesiadecki; Lynn D. V. Neakrase; P. L. Whelley; Ronald Greeley; Mark T. Lemmon; Rebecca Castano; Steve Chien

The acquisition of science data in space applications is shifting from teleoperated data collection to an automated onboard analysis, resulting in improved data quality, as well as improved usage of limited resources such as onboard memory, CPU, and communications bandwidth. Science instruments onboard a modern deep-space spacecraft can acquire much more data than can be downloaded to Earth, given the limited communication bandwidth. Onboard data analysis offers a means of compressing the huge amounts of data collected and downloading only the most valuable subset of the collected data. In this paper, we describe algorithms for detecting dust devils and clouds onboard Mars rovers, and summarize the results. These algorithms achieve the accuracy required by planetary scientists, as well as the runtime, CPU, memory, and bandwidth constraints set by the engineering mission parameters. The detectors have been uploaded to the Mars Exploration Rovers, and are currently operational. These detectors are the first onboard science analysis processes on Mars.
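
A minimal change-detection sketch in the spirit of the detectors described above (the flight algorithms are not reproduced here): build a per-pixel background from an image sequence and flag frames that contain many high-residual pixels. The thresholds are illustrative.

```python
import numpy as np

def detect_motion(frames, k_sigma=4.0, min_pixels=20):
    """Flag frames in a fixed-pointing image sequence whose deviation from a
    per-pixel background (median over the sequence) has at least min_pixels
    pixels exceeding k_sigma times the estimated noise level."""
    stack = np.stack([f.astype(float) for f in frames])
    background = np.median(stack, axis=0)
    noise = stack.std(axis=0).mean() + 1e-6
    detections = []
    for idx, frame in enumerate(stack):
        residual = np.abs(frame - background)
        mask = residual > k_sigma * noise
        if mask.sum() >= min_pixels:
            detections.append((idx, int(mask.sum())))
    return detections
```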


Intelligent Robots and Systems | 2002

Autonomous terrain characterisation and modelling for dynamic control of unmanned vehicles

Ashit Talukder; Roberto Manduchi; Rebecca Castano; Ken Owens; Larry H. Matthies; Andres Castano; Robert W. Hogg

We discuss techniques to predict the dynamic vehicle response to various natural obstacles. This method can then be used to adjust the vehicle dynamics to optimize performance (e.g. speed) while ensuring that the vehicle is not damaged. This capability opens up a new area of obstacle negotiation for UGVs, where the vehicle moves over certain obstacles, rather than avoiding them, thereby resulting in more effective achievement of objectives. Robust obstacle negotiation and vehicle dynamics prediction requires several key technologies that are discussed in this paper. We detect and segment (label) obstacles using a novel 3D obstacle algorithm. The material of each labelled obstacle (rock, vegetation, etc.) is then determined using a texture or color classification scheme. Terrain load-bearing surface models are then constructed using vertical springs to model the compressibility and traversability of each obstacle in front of the vehicle. The terrain model is then combined with the vehicle suspension model to yield an estimate of the maximum safe velocity, and predict the vehicle dynamics as the vehicle follows a path. This end-to-end obstacle negotiation system is envisioned to be useful in optimized path planning and vehicle navigation in terrain conditions cluttered with vegetation, bushes, rocks, etc. Results on natural terrain with various natural materials are presented.
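
A toy illustration of the vertical-spring terrain idea, assuming a made-up stiffness table and a crude vertical-acceleration bound; the paper's vehicle suspension model and its parameters are not reproduced here.

```python
import numpy as np

# Illustrative stiffness values (N/m) per material class; not from the paper.
STIFFNESS = {"grass": 2e3, "bush": 8e3, "soil": 5e4, "rock": 5e5}

def load_bearing_height(surface_height, material, wheel_load=1500.0):
    """Vertical-spring model of compressibility: the wheel compresses soft
    material (grass, bushes) by load/stiffness, so the effective load-bearing
    surface lies below the sensed surface."""
    k = STIFFNESS[material]
    return surface_height - wheel_load / k

def max_safe_speed(heights, materials, spacing, accel_limit=5.0):
    """Crude bound: limit speed so that the vertical acceleration implied by
    the effective surface curvature stays below accel_limit (m/s^2)."""
    z = np.array([load_bearing_height(h, m) for h, m in zip(heights, materials)])
    curv = np.abs(np.diff(z, 2)) / spacing**2          # second derivative d2z/dx2
    peak = curv.max() if curv.size else 1e-6
    return np.sqrt(accel_limit / max(peak, 1e-6))      # since a_z ~ v^2 * d2z/dx2
```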


IEEE Aerospace Conference | 2005

Current results from a rover science data analysis system

Rebecca Castano; Michele Judd; Tara Estlin; Robert C. Anderson; Daniel M. Gaines; Andres Castano; Ben Bornstein; Tim Stough; Kiri L. Wagstaff

The Onboard Autonomous Science Investigation System (OASIS) evaluates geologic data gathered by a planetary rover. This analysis is used to prioritize the data for transmission, so that the data with the highest science value is transmitted to Earth. In addition, the onboard analysis results are used to identify science opportunities. A planning and scheduling component of the system enables the rover to take advantage of the identified science opportunity. OASIS is a NASA-funded research project that is currently being tested on the FIDO rover at JPL for use on future missions. In this paper, we provide a brief overview of the OASIS system, and then describe our recent successes in integrating with and using rover hardware. OASIS currently works in a closed-loop fashion with onboard control software (e.g., navigation and vision) and has the ability to autonomously perform the following sequence of steps: analyze grayscale images to find rocks, extract the properties of the rocks, identify rocks of interest, retask the rover to take additional imagery of the identified target and then allow the rover to continue on its original mission. We also describe the early 2004 ground test validation of specific OASIS components on selected Mars Exploration Rover (MER) images. These components include the rock-finding algorithm, RockIT, and the rock size feature extraction code. Our team also developed the RockIT GUI, an interface that allows users to easily visualize and modify the rock-finder results. This interface has allowed us to conduct preliminary testing and validation of the rock-finder's performance.
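
A small sketch of the downlink-prioritization idea described above, using a hypothetical science score (rock count plus size and albedo diversity) and a greedy bandwidth budget; the features, weights, and dictionary keys are invented for illustration and are not OASIS's actual metric.

```python
import numpy as np

def science_score(rock_sizes, rock_albedos):
    """Hypothetical score: reward images with many rocks and diverse sizes/albedos."""
    if len(rock_sizes) == 0:
        return 0.0
    return (len(rock_sizes)
            + 2.0 * np.std(rock_sizes)
            + 2.0 * np.std(rock_albedos))

def prioritize_for_downlink(images, bandwidth_budget_mb):
    """Order images by score and keep as many as the downlink budget allows.
    Each image is assumed to be a dict with 'sizes', 'albedos', and 'size_mb' keys."""
    ranked = sorted(images,
                    key=lambda im: science_score(im["sizes"], im["albedos"]),
                    reverse=True)
    selected, used = [], 0.0
    for im in ranked:
        if used + im["size_mb"] <= bandwidth_budget_mb:
            selected.append(im)
            used += im["size_mb"]
    return selected
```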


International Conference on Robotics and Automation | 2003

Foliage discrimination using a rotating ladar

Andres Castano; Larry H. Matthies

An outdoor environment presents to a robot objects that are drivable, such as tall grass and small bushes, and non-drivable, such as trees and rocks. Due to the difficulty of discriminating between these classes, traditionally a robot searches for paths free of any objects, drivable or not. Although this approach prevents collisions with objects misclassified as drivable, it also eliminates a large number of drivable paths and by doing so, it may eliminate the only path to a desired destination. We present a real-time algorithm that detects foliage using range data from a rotating ladar. Objects not classified as foliage are conservatively labeled as non-drivable obstacles. In contrast to related work that uses range statistics to classify the objects, we exploit the expected localities and continuities of an obstacle, in both space and time. Also, instead of attempting to find a single accurate discriminating factor for every ladar return, we hypothesize the class of a few returns and then spread the confidence (and classification) to other returns using the locality constraints. The Urbie robot is presently using this algorithm to discriminate drivable grass from obstacles during outdoor autonomous navigation tasks.
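
A minimal sketch of the confidence-spreading idea described above: a few returns receive seed labels from whatever strong cue is available, those labels diffuse to spatially nearby returns, and anything left without positive foliage evidence stays an obstacle. The neighborhood radius, mixing weight, and iteration count are illustrative, and the temporal-locality part of the paper's method is omitted.

```python
import numpy as np

def classify_ladar_returns(points, seed_labels, radius=0.2, iterations=5):
    """Spread sparse seed labels (+1 foliage, -1 obstacle, 0 unknown) to nearby
    ladar returns via repeated local averaging; returns without positive foliage
    confidence are conservatively labeled as obstacles.

    points: (N, 3) array of ladar returns; seed_labels: length-N array."""
    points = np.asarray(points, dtype=float)
    conf = np.asarray(seed_labels, dtype=float).copy()
    for _ in range(iterations):
        updated = conf.copy()
        for i in range(len(points)):
            d = np.linalg.norm(points - points[i], axis=1)
            near = (d < radius) & (d > 0)
            if near.any():
                updated[i] = 0.5 * conf[i] + 0.5 * conf[near].mean()
        conf = updated
    return np.where(conf > 0, "foliage", "obstacle"), conf
```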


ISRR | 2005

Obstacle Detection in Foliage with Ladar and Radar

Larry H. Matthies; Chuck Bergh; Andres Castano; Jose Macedo; Roberto Manduchi

Autonomous off-road navigation is central to several important applications of unmanned ground vehicles. This requires the ability to detect obstacles in vegetation. We examine the prospects for doing so with scanning ladar and with a linear array of 2.2 GHz micro-impulse radar transceivers. For ladar, we summarize our work to date on algorithms for detecting obstacles in tall grass with single-axis ladar, then present a simple probabilistic model of the distance into tall grass at which ladar-based obstacle detection is possible. This model indicates that the ladar “penetration depth” can range from on the order of 10 cm to several meters, depending on the plant type. We also present an experimental investigation of mixed pixel phenomenology for a time-of-flight SICK ladar and discuss briefly how this bears on the problem. For radar, we show results of applying an existing algorithm for multi-frequency diffraction tomography to a set of 45 scans taken with one sensor translating laterally 4 cm/scan to mimic a linear array of transceivers. This produces a high-resolution 2-D map of scattering surfaces in front of the array and clearly reveals a large tree trunk behind over 2.5 m of thick foliage. Both types of sensor warrant further development and exploitation for this problem.
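
A minimal sketch of the kind of penetration model discussed above, assuming grass blades act as randomly placed occluders of width w and areal density rho, so beam survival decays exponentially with depth and the mean penetration depth is 1/(rho*w). The functional form and the numbers are illustrative, not the paper's fitted model.

```python
import numpy as np

def penetration_probability(depth_m, blades_per_m2, blade_width_m):
    """Poisson-style occlusion model: the expected number of blades crossing a
    thin beam over depth d is rho * w * d, so the probability that the beam
    reaches depth d unobstructed decays exponentially with depth."""
    lam = blades_per_m2 * blade_width_m          # effective occluders per meter
    return np.exp(-lam * np.asarray(depth_m))

def mean_penetration_depth(blades_per_m2, blade_width_m):
    """Mean free path of the beam, 1 / (rho * w)."""
    return 1.0 / (blades_per_m2 * blade_width_m)

# Illustrative check: sparse, thin grass gives meters; dense grass gives centimeters.
print(mean_penetration_depth(blades_per_m2=50, blade_width_m=0.005))    # ~4 m
print(mean_penetration_depth(blades_per_m2=2000, blade_width_m=0.005))  # ~0.1 m
```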


IEEE Aerospace Conference | 2006

Opportunistic rover science: finding and reacting to rocks, clouds and dust devils

Rebecca Castano; Tara Estlin; Daniel M. Gaines; Andres Castano; Caroline Chouinard; Ben Bornstein; Robert C. Anderson; Steve Chien; Alex Fukunaga; Michele Judd

The goal of the Onboard Autonomous Science Investigation System (OASIS) project at NASA's Jet Propulsion Laboratory (JPL) is to evaluate, and autonomously act upon, science data gathered by in-situ spacecraft, such as planetary landers and rovers. Using the FIDO rover in the Mars Yard at JPL, we have successfully demonstrated a closed-loop system test of the rover acquiring image data, finding rocks in the image, analyzing rock properties and identifying rocks that merit further investigation. When the onboard system alerts the rover to take additional measurements of interesting rocks, the planning and scheduling component determines if there are enough resources to meet this additional science data request. The rover is then instructed either to turn toward the rock or to move closer to the rock to take an additional close-up picture. In addition to these hardware integration successes, the OASIS team has also continued its autonomous science research by collaboratively working with other scientists and technologists to identify and react to other scientific phenomena, such as clouds and dust devils. Prototype dust devil and cloud detection algorithms were delivered to an infusion task which has refined the algorithms specifically for the Mars Exploration Rovers (MER) and is integrating the code into the next release of MER flight software.


IEEE Aerospace Conference | 2007

Onboard Autonomous Rover Science

Rebecca Castano; Tara Estlin; Daniel M. Gaines; Caroline Chouinard; B. Bornstein; Robert C. Anderson; Michael C. Burl; David R. Thompson; Andres Castano; M. A. Judd

The Onboard Autonomous Science Investigation System (OASIS) was used in the first formal demonstration of closed-loop opportunistic detection and reaction during a rover traverse on the FIDO rover at NASA's Jet Propulsion Laboratory. In addition to hardware demonstrations, the system has been demonstrated and exercised in simulation using the Rover Analysis, Modeling, and Simulation (ROAMS) planetary rover simulator (Jain et al., 2003). We discuss several system enhancements, including new planning and scheduling capabilities and image prioritization. We also describe the new end-of-traverse capability that includes taking a partial panorama of images, assessing these for targets of interest, and collecting narrow-angle images of selected targets. Finally, we present several methods for estimating properties of rocks and provide a comparative assessment. Understanding the relationship of these methods is important to correctly interpret autonomous rock analyses performed during a traverse.

Collaboration


Dive into Andres Castano's collaboration.

Top Co-Authors

Tara Estlin, California Institute of Technology
Benjamin J. Bornstein, California Institute of Technology
Daniel M. Gaines, California Institute of Technology
Rebecca Castano, California Institute of Technology
Ramon Abel Castano, California Institute of Technology
M. A. Judd, Jet Propulsion Laboratory
Robert C. Anderson, California Institute of Technology
Ronald Greeley, Arizona State University