
Publication


Featured research published by Larry H. Matthies.


Intelligent Robots and Systems | 2004

Real-time detection of moving objects from moving vehicles using dense stereo and optical flow

Ashit Talukder; Larry H. Matthies

Dynamic scene perception is very important for autonomous vehicles operating around other moving vehicles and humans. Most work on real-time, camera-based object tracking from moving platforms has used sparse features or assumed flat scene structures. We have recently extended a real-time, dense stereo system to include real-time, dense optical flow, enabling more comprehensive dynamic scene analysis. We describe algorithms to robustly estimate 6-DOF robot egomotion in the presence of moving objects using dense flow and dense stereo. We then use dense stereo and the egomotion estimates to identify other moving objects while the robot itself is moving. We present results showing accurate egomotion estimation and detection of moving people and vehicles under general 6-DOF motion of the robot and independently moving objects. The system runs at 18.3 Hz on a 1.4 GHz Pentium M laptop, computing 160 x 120 disparity maps and optical flow fields, egomotion, and moving object segmentation. We believe this is a significant step toward general unconstrained dynamic scene analysis for mobile robots, as well as toward improved position estimation where GPS is unavailable.
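The detection step described in this abstract compares measured dense optical flow against the flow that the estimated egomotion alone would induce, given dense stereo depth; pixels with large residual flow belong to independently moving objects. A minimal sketch of that residual-flow idea follows; the function names, the simple pinhole model, and the threshold are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def predicted_flow(depth, K, R, t):
    """Predict the per-pixel image flow induced purely by camera egomotion.

    depth : HxW array of stereo depths (meters)
    K     : 3x3 camera intrinsics
    R, t  : estimated rotation (3x3) and translation (3,) between frames
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3 x N
    rays = np.linalg.inv(K) @ pix              # back-project pixels to rays
    pts = rays * depth.reshape(1, -1)          # 3-D points in frame 1
    pts2 = R @ pts + t[:, None]                # same points seen from frame 2
    proj = K @ pts2
    proj = proj[:2] / proj[2]                  # perspective divide
    return (proj - pix[:2]).T.reshape(H, W, 2)

def moving_object_mask(measured_flow, depth, K, R, t, thresh=1.5):
    """Flag pixels whose measured flow deviates from the egomotion-predicted
    flow by more than `thresh` pixels as independently moving."""
    residual = measured_flow - predicted_flow(depth, K, R, t)
    return np.linalg.norm(residual, axis=-1) > thresh
```

For a static scene and a perfect egomotion estimate the residual is zero everywhere; a person walking through the scene shows up as a compact region of large residual.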


IEEE Aerospace Conference | 2004

Path following using visual odometry for a Mars rover in high-slip environments

Daniel M. Helmick; Yang Cheng; Daniel S. Clouse; Larry H. Matthies; Stergios I. Roumeliotis

A system for autonomous operation of Mars rovers in high slip environments has been designed, implemented, and tested. This system is composed of several key technologies that enable the rover to accurately follow a designated path, compensate for slippage, and reach intended goals independent of the terrain over which it is traversing (within the mechanical constraints of the mobility system). These technologies include: visual odometry, full vehicle kinematics, a Kalman filter pose estimator, and a slip compensation/path follower. Visual odometry tracks distinctive scene features in stereo imagery to estimate rover motion between successively acquired stereo image pairs using a maximum likelihood motion estimation algorithm. The full vehicle kinematics for a rocker-bogie suspension system estimates motion, with a no-slip assumption, by measuring wheel rates, and rocker, bogie, and steering angles. The Kalman filter merges data from an inertial measurement unit (IMU) and visual odometry. This merged estimate is then compared to the kinematic estimate to determine (taking into account estimate uncertainties) if and how much slippage has occurred. If no statistically significant slippage has occurred then the kinematic estimate is used to complement the Kalman filter estimate. If slippage has occurred then a slip vector is calculated by differencing the current Kalman filter estimate from the kinematic estimate. This slip vector is then used, in conjunction with the inverse kinematics, to determine the necessary wheel velocities and steering angles to compensate for slip and follow the desired path.
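The slip test described above compares the filtered (IMU plus visual odometry) pose with the no-slip kinematic pose, taking the estimate uncertainties into account. A simplified 3-DOF illustration of such a test is sketched below as a Mahalanobis (chi-square) comparison; the state layout, threshold, and function name are assumptions for illustration, not the authors' code:

```python
import numpy as np

def detect_slip(kf_pose, kf_cov, kin_pose, kin_cov, chi2_thresh=7.81):
    """Decide whether statistically significant slip has occurred by
    comparing the Kalman filter pose with the no-slip kinematic pose.

    Poses are (x, y, heading); covariances are 3x3.  chi2_thresh is the
    95% chi-square value for 3 degrees of freedom.  Returns the slip
    vector (the pose difference) if slip is significant, else zeros."""
    innovation = kf_pose - kin_pose
    S = kf_cov + kin_cov                         # combined uncertainty
    d2 = innovation @ np.linalg.inv(S) @ innovation
    if d2 > chi2_thresh:
        return innovation                        # feed back to path follower
    return np.zeros(3)                           # no significant slip
```

In the paper's scheme, a nonzero slip vector is then combined with the inverse kinematics to pick wheel velocities and steering angles that compensate for the slip.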


The International Journal of Robotics Research | 2007

Autonomous Stair Climbing for Tracked Vehicles

Anastasios I. Mourikis; Nikolas Trawny; Stergios I. Roumeliotis; Daniel M. Helmick; Larry H. Matthies

In this paper, an algorithm for autonomous stair climbing with a tracked vehicle is presented. The proposed method achieves robust performance under real-world conditions, without assuming prior knowledge of the stair geometry, the dynamics of the vehicle's interaction with the stair surface, or the lighting conditions. The approach relies on fast and accurate estimation of the robot's heading and its position relative to the stair boundaries. An extended Kalman filter is used for quaternion-based attitude estimation, fusing rotational velocity measurements from a 3-axis gyroscope with measurements of the stair edges acquired by an onboard camera. A two-tiered controller, composed of a centering-control and a heading-control module, uses these estimates to guide the robot rapidly, safely, and accurately upstairs. Both the theoretical analysis and the implementation of the algorithm are presented in detail, and extensive experimental results demonstrating the algorithm's performance are described.
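The two-tiered controller pairs a heading module (drive the yaw error relative to the stair edges to zero) with a centering module (steer back toward the middle of the staircase). A toy proportional version is sketched below; the gains, nominal speed, and slow-down heuristic are all invented for illustration and are not the paper's controller:

```python
def stair_climb_command(heading_err, lateral_offset, k_h=1.2, k_c=0.8, v_nom=0.3):
    """Toy two-tiered stair-climbing command.

    heading_err    : estimated yaw error w.r.t. the stair edges (rad)
    lateral_offset : estimated offset from the staircase centerline (m)
    Returns (forward_velocity, turn_rate)."""
    # heading tier plus centering tier, both simple proportional terms
    turn = -k_h * heading_err - k_c * lateral_offset
    # slow down while correcting large errors so the tracks keep purchase
    v = v_nom / (1.0 + abs(turn))
    return v, turn
```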


IEEE Transactions on Circuits and Systems for Video Technology | 1997

Multiresolution image sensor

Sabrina E. Kemeny; Roger Panicacci; Bedabrata Pain; Larry H. Matthies; Eric R. Fossum

The development of the CMOS active pixel sensor (APS) has, for the first time, permitted large-scale integration of supporting circuitry and smart-camera functions on the same chip as a high-performance image sensor. This paper reports on the demonstration of a new 128 x 128 CMOS APS with programmable multiresolution readout capability. By placing signal processing circuitry on the imaging focal plane, the image sensor can output data at varying resolutions, which can decrease the computational load of downstream image processing. For instance, software-intensive image pyramid reconstruction can be eliminated. The circuit uses a passive switched-capacitor network to average arbitrarily large neighborhoods of pixels, which can then be read out at any user-defined resolution by configuring a set of digital shift registers. The full-resolution frame rate is 30 Hz, with higher rates for all other image resolutions. The sensor achieved 80 dB of dynamic range while dissipating only 5 mW of power. Circuit error was less than -34 dB and introduced no objectionable fixed-pattern noise or other artifacts into the image.
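The effect of the on-chip neighborhood averaging can be mimicked in software to show what a multiresolution readout produces. The sensor does this with a switched-capacitor network before readout; this NumPy sketch is only a functional analogue:

```python
import numpy as np

def block_average(frame, n):
    """Average each n x n neighborhood of pixels, producing the
    lower-resolution readout a multiresolution sensor would emit.

    frame : HxW array with H and W divisible by n."""
    H, W = frame.shape
    return frame.reshape(H // n, n, W // n, n).mean(axis=(1, 3))
```

A 128 x 128 frame read out with n = 4 yields a 32 x 32 image, trading resolution for a fourfold reduction per axis in downstream pixel count.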


IEEE Intelligent Systems | 2004

The Mars Exploration Rovers' descent image motion estimation system

Yang Cheng; Jay D. Goguen; Andrew Edie Johnson; Chris Leger; Larry H. Matthies; A. Miguel San Martin; Reg G. Willson

The descent image motion estimation system (DIMES) is the first machine-vision system for estimating lander velocity during planetary descent. Composed of sensors and software, DIMES features a descent imager, a radar altimeter, an inertial measurement unit, and an algorithm for combining the sensor measurements to estimate horizontal velocity - the speed at which the lander travels across the planet's surface as it descends. Although the sensors are not novel technology, the algorithm and flight software that combine them are new. This algorithm combines radar, image, and inertial data in a novel way to create a low-cost, robust, and computationally efficient solution to the horizontal-velocity estimation problem. This article describes the DIMES algorithm, its testing, and its performance during both Mars Exploration Rover landings.
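The core geometric relation this kind of system exploits - a feature's pixel displacement between two descent images, scaled by altitude over focal length, gives ground displacement - can be sketched as follows. This omits the IMU-based rotation compensation and the template matching of the real system; the function and its parameters are illustrative assumptions:

```python
import numpy as np

def horizontal_velocity(pix_shift, altitude, focal_px, dt):
    """Back-of-the-envelope horizontal velocity for a nadir-pointed camera.

    pix_shift : (du, dv) feature displacement between frames, in pixels
    altitude  : radar altitude above the surface (m)
    focal_px  : camera focal length expressed in pixels
    dt        : time between the two images (s)
    Returns the (vx, vy) velocity estimate in m/s."""
    ground_shift = np.asarray(pix_shift, float) * altitude / focal_px
    return ground_shift / dt
```

For example, a 10-pixel shift seen from 1000 m with a 1000-pixel focal length over one second corresponds to 10 m/s of horizontal motion.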


Intelligent Robots and Computer Vision XVI: Algorithms, Techniques, Active Vision, and Materials Handling | 1997

Lightweight Rovers for Mars Science Exploration and Sample Return

Paul S. Schenker; Lee F. Sword; A. J. Ganino; Donald B. Bickler; Gregory Scott Hickey; D. K. Brown; Eric T. Baumgartner; Larry H. Matthies; Brian H. Wilcox; Tucker R. Balch; Hrand Aghazarian; Michael Garrett

We report on the development of new mobile robots for Mars exploration missions. These lightweight survivable rover (LSR) systems are of potential interest to both space and terrestrial applications, and are distinguished from more conventional designs by their use of new composite materials, collapsible running gear, integrated thermal-structural chassis, and other mechanical features enabling improved mobility and environmental robustness at reduced mass, volume, and power. Our first demonstrated rover of this architecture, LSR-1, introduces running gear based on 2D composite struts and 3D machined composite joints, a novel collapsible hybrid composite-aluminum wheel design, a unit-body structural-thermal chassis with improved internal temperature isolation and stabilization, and a spot-pushbroom laser/CCD sensor enabling accurate, fast hazard detection and terrain mapping. LSR-1 is an approximately .7 x 1.0 meter² (W x L) footprint, six-wheel (20 cm dia.) rocker-bogie geometry vehicle of approximately 30 cm ground clearance, weighing only 7 kilograms with an onboard .3 kilogram multi-spectral imager and spectroscopic photometer. By comparison, NASA/JPL's recently flown Mars Pathfinder rover Sojourner is an 11+ kilogram flight experiment (carrying a 1 kg APXS instrument) having an approximately .45 x .6 meter² (W x L) footprint and 15 cm ground clearance, and about half the warm electronics enclosure (WEE) volume with twice the diurnal temperature swing (-40 to +40 degrees Celsius) of LSR-1 in nominal Mars environments. We are also developing a new, smaller 5 kilogram class LSR-type vehicle for Mars sample return -- the travel to, localization of, pick-up, and transport back to an Earth-return ascent vehicle of a sample cache collected by earlier science missions. This sample retrieval rover R&D prototype has a completely collapsible mobility system enabling rover stowage to approximately 25% of operational volume, as well as an actively articulated axle allowing changeable pose of the wheel-strut geometry for improved traverse and manipulation characteristics.


Unmanned Ground Vehicle Technology Conference | 2003

Detecting water hazards for autonomous off-road navigation

Larry H. Matthies; Paolo Bellutta; Mike McHenry

Detecting water hazards for autonomous, off-road navigation of unmanned ground vehicles is a largely unexplored problem. In this paper, we catalog environmental variables that affect the difficulty of this problem, including day vs. night operation, whether the water reflects sky or other terrain features, the size of the water body, and other factors. We briefly survey sensors that are applicable to detecting water hazards in each of these conditions. We then present analyses and results for water detection for four specific sensor cases: (1) using color image classification to recognize sky reflections in water during the day, (2) using ladar to detect the presence of water bodies and to measure their depth, (3) using short-wave infrared (SWIR) imagery to detect water bodies, as well as snow and ice, and (4) using mid-wave infrared (MWIR) imagery to recognize water bodies at night. For color imagery, we demonstrate solid results with a classifier that runs at nearly video rate on a 433 MHz processor. For ladar, we present a detailed propagation analysis that shows the limits of water body detection and depth estimation as a function of lookahead distance, water depth, and ladar wavelength. For SWIR and MWIR, we present sample imagery from a variety of data collections that illustrate the potential of these sensors. These results demonstrate significant progress on this problem.


Solid State Sensor Arrays and CCD Cameras | 1996

Programmable multiresolution CMOS active pixel sensor

Roger Panicacci; Sabrina E. Kemeny; Larry H. Matthies; Bedabrata Pain; Eric R. Fossum

CMOS active pixel sensors (APS) allow the flexibility of placing signal processing circuitry on the imaging focal plane. The multiresolution CMOS APS can perform block averaging on-chip to eliminate off-chip, software-intensive image processing. This 128 x 128 APS array can be read out at any user-defined resolution by configuring a set of digital shift registers. The full-resolution frame rate is 30 Hz, with higher rates for all other image resolutions.


Journal of Geophysical Research | 2006

Spirit rover localization and topographic mapping at the landing site of Gusev crater, Mars

Rongxing Li; Brent A. Archinal; Raymond E. Arvidson; James F. Bell; Philip R. Christensen; Larry S. Crumpler; David J. Des Marais; Kaichang Di; T. Duxbury; M. Golombek; John A. Grant; Ronald Greeley; Joe Guinn; Andrew Edie Johnson; Randolph L. Kirk; Mark W. Maimone; Larry H. Matthies; M. C. Malin; Tim Parker; Mike Sims; Shane D. Thompson; Steven W. Squyres; Larry Soderblom

By sol 440, the Spirit rover had traversed a distance of 3.76 km (actual distance traveled, not odometry). Localization of the lander and the rover along the traverse has been successfully performed at the Gusev crater landing site. We localized the lander in the Gusev crater using two-way Doppler radio positioning and cartographic triangulation through landmarks visible in both orbital and ground images. Additional high-resolution orbital images were used to verify the determined lander position. Visual odometry and bundle adjustment technologies were applied to compensate for wheel slippage, azimuthal angle drift, and other navigation errors (which were as large as 10.5% in the Husband Hill area). We generated topographic products, including 72 ortho maps and three-dimensional (3-D) digital terrain models, 11 horizontal and vertical traverse profiles, and one 3-D crater model (up to sol 440). Also discussed in this paper are uses of the data for science operations planning, geological traverse surveys, surveys of wind-related features, and other science applications.


Proceedings of the 24th US Army Science Conference | 2006

Daytime Water Detection by Fusing Multiple Cues for Autonomous Off-Road Navigation

Arturo L. Rankin; Larry H. Matthies; Andres Huertas
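The daytime color cue used in the water-detection papers above - sky reflections in water tend to be bright, low-saturation, and blue-dominant - can be sketched per pixel as follows. The thresholds and function name are illustrative assumptions, not values from the papers:

```python
import numpy as np

def sky_reflection_score(rgb):
    """Crude per-pixel daytime water cue.

    rgb : HxWx3 float image with channels in [0, 1].
    Returns a boolean mask of pixels that look like sky reflected in water:
    blue-dominant, bright, and weakly saturated."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    brightness = rgb.mean(axis=-1)
    saturation = rgb.max(axis=-1) - rgb.min(axis=-1)   # simple sat proxy
    blue_dominant = (b > r) & (b > g)
    return blue_dominant & (brightness > 0.5) & (saturation < 0.25)
```

A practical classifier, as the papers describe, fuses this cue with others (stereo range, texture, terrain reflections) rather than relying on color alone.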

Collaboration


Larry H. Matthies's top co-authors and their affiliations.

Top Co-Authors

Arturo L. Rankin (California Institute of Technology)
Andres Huertas (Jet Propulsion Laboratory)
Max Bajracharya (California Institute of Technology)
Yang Cheng (California Institute of Technology)
Kaichang Di (Chinese Academy of Sciences)
Andrew Howard (University of California)
Daniel M. Helmick (California Institute of Technology)
Mark W. Maimone (Carnegie Mellon University)
Adnan Ansar (Jet Propulsion Laboratory)
Anelia Angelova (Jet Propulsion Laboratory)