
Publication


Featured research published by Reg G. Willson.


International Journal of Computer Vision | 2007

Computer Vision on Mars

Larry H. Matthies; Mark W. Maimone; Andrew Edie Johnson; Yang Cheng; Reg G. Willson; Carlos Y. Villalpando; Steve B. Goldberg; Andres Huertas; Andrew Neil Stein; Anelia Angelova

Increasing the level of spacecraft autonomy is essential for broadening the reach of solar system exploration. Computer vision has played, and will continue to play, an important role in increasing the autonomy of both spacecraft and Earth-based robotic vehicles. This article addresses progress on computer vision for planetary rovers and landers and has four main parts. First, we review major milestones in the development of computer vision for robotic vehicles over the last four decades. Since research on applications for Earth and space has often been closely intertwined, the review includes elements of both. Second, we summarize the design and performance of the computer vision algorithms used on Mars in the NASA/JPL Mars Exploration Rover (MER) mission, which was a major step forward in the use of computer vision in space. These algorithms performed stereo vision and visual odometry for rover navigation, and feature tracking for horizontal velocity estimation on the landers. Third, we summarize ongoing research to improve vision systems for planetary rovers, covering noise reduction, FPGA implementation, and vision-based slip perception. Finally, we briefly survey other opportunities for computer vision to impact rovers, landers, and orbiters in future solar system exploration missions.
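As an aside for readers unfamiliar with the geometry these algorithms rely on, here is a minimal sketch of depth from stereo disparity (Z = f * B / d), the relation underlying stereo-vision terrain mapping. The focal length, baseline, and disparity values are hypothetical, not MER camera parameters.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map (pixels) to depth (meters) via Z = f * B / d."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0                      # zero disparity corresponds to a point at infinity
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

# Hypothetical cameras: 1024 px focal length, 20 cm stereo baseline.
disparities = np.array([[4.0, 8.0], [16.0, 0.0]])
print(disparity_to_depth(disparities, focal_px=1024.0, baseline_m=0.20))
# 4 px of disparity maps to ~51 m of range, 16 px to ~12.8 m; 0 px stays at infinity.
```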


international symposium on experimental robotics | 2006

Autonomous Navigation Results from the Mars Exploration Rover (MER) Mission

Mark W. Maimone; Andrew Edie Johnson; Yang Cheng; Reg G. Willson; Larry H. Matthies

In January 2004, the Mars Exploration Rover (MER) mission landed two rovers, Spirit and Opportunity, on the surface of Mars. Several autonomous navigation capabilities were employed in space for the first time in this mission. In the Entry, Descent, and Landing (EDL) phase, both landers used a vision system called the Descent Image Motion Estimation System (DIMES) to estimate horizontal velocity during the last 2000 meters (m) of descent by tracking features on the ground with a downlooking camera, in order to control retro-rocket firing to reduce horizontal velocity before impact. During surface operations, the rovers navigate autonomously using stereo vision for local terrain mapping and a local, reactive planning algorithm called Grid-based Estimation of Surface Traversability Applied to Local Terrain (GESTALT) for obstacle avoidance. In areas of high slip, stereo vision-based visual odometry has been used to estimate rover motion. As of mid-June, Spirit had traversed 3405 m, of which 1253 m were done autonomously; Opportunity had traversed 1264 m, of which 224 m were autonomous. These results have contributed substantially to the success of the mission and paved the way for increased levels of autonomy in future missions.
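For illustration only, here is a toy traversability-scoring routine in the spirit of a grid-based approach like GESTALT: fit a plane to the stereo-derived terrain points falling in a grid cell and penalize slope and residual roughness. The scoring rule, thresholds, and synthetic terrain below are assumptions for the sketch, not the flight software.

```python
import numpy as np

def cell_goodness(points_xyz, max_slope_deg=20.0, max_roughness_m=0.10):
    """Return a 0..1 traversability score for one grid cell of terrain points."""
    pts = np.asarray(points_xyz, dtype=float)
    if len(pts) < 3:
        return 0.0                                   # too little data: treat as unsafe
    # Least-squares plane fit z = a*x + b*y + c
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    a, b, _ = coeffs
    slope_deg = np.degrees(np.arctan(np.hypot(a, b)))   # tilt of the fitted plane
    roughness = np.std(pts[:, 2] - A @ coeffs)           # residual scatter about the plane
    slope_score = max(0.0, 1.0 - slope_deg / max_slope_deg)
    rough_score = max(0.0, 1.0 - roughness / max_roughness_m)
    return min(slope_score, rough_score)

# Example: a gently tilted, slightly bumpy patch of synthetic terrain.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(50, 2))
z = 0.05 * xy[:, 0] + 0.01 * rng.standard_normal(50)
print(round(cell_goodness(np.c_[xy, z]), 2))
```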


International Journal of Computer Vision | 2007

Design Through Operation of an Image-Based Velocity Estimation System for Mars Landing

Andrew Edie Johnson; Reg G. Willson; Yang Cheng; Jay D. Goguen; Chris Leger; Miguel Sanmartin; Larry H. Matthies

During the Mars Exploration Rover (MER) landings, the Descent Image Motion Estimation System (DIMES) was used for horizontal velocity estimation. The DIMES algorithm combined measurements from a descent camera, a radar altimeter, and an inertial measurement unit. To deal with large changes in scale and orientation between descent images, the algorithm used altitude and attitude measurements to rectify the images to a level ground plane. Feature selection and tracking were employed in the rectified images to compute the horizontal motion between images. Differences of consecutive motion estimates were then compared to inertial measurements to verify correct feature tracking. DIMES combined sensor data from multiple sources in a novel way to create a low-cost, robust, and computationally efficient velocity estimation solution, and it was the first robotic vision system used to control a spacecraft during planetary landing. This paper presents the design and implementation of the DIMES algorithm, an assessment of its performance using a high-fidelity Monte Carlo simulation, validation of performance using field test data, and detailed results from the two landings on Mars. DIMES was used successfully during both MER landings. In the case of Spirit, had DIMES not been used onboard, the total velocity would have been at the limits of the airbag capability. Fortunately, DIMES computed the actual steady-state horizontal velocity, and it was used by the thruster firing logic to reduce the total velocity prior to landing. For Opportunity, DIMES computed the correct velocity, and the velocity was small enough that the lander performed no action to remove it.
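To make the velocity-estimation geometry concrete, the sketch below converts the pixel displacement of a feature tracked between two ground-plane-rectified, nadir-looking images into horizontal velocity using the radar altitude, the camera focal length, and the time between exposures. The pinhole/nadir simplification and all numbers are illustrative assumptions, not the DIMES flight algorithm.

```python
import numpy as np

def horizontal_velocity(track_px_0, track_px_1, altitude_m, focal_px, dt_s):
    """Estimate horizontal velocity (m/s) from one tracked feature."""
    dpix = np.asarray(track_px_1, float) - np.asarray(track_px_0, float)
    meters_per_pixel = altitude_m / focal_px     # ground sample distance (pinhole, nadir view)
    return dpix * meters_per_pixel / dt_s        # (vx, vy) in the camera frame

# Example: a feature shifts 12 px in x over 3.75 s at 1400 m altitude,
# with a hypothetical 1000-pixel focal length.
print(horizontal_velocity((512, 512), (524, 512),
                          altitude_m=1400.0, focal_px=1000.0, dt_s=3.75))
# -> roughly [4.48, 0.] m/s of horizontal motion.
```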


international conference on robotics and automation | 2005

Field Testing of the Mars Exploration Rovers Descent Image Motion Estimation System

Andrew Edie Johnson; Reg G. Willson; Jay D. Goguen; James W. Alexander; David Meller

The Mars Exploration Rover (MER) Descent Image Motion Estimation System (DIMES) is the first autonomous machine vision system used to safely land a robotic payload on another planet. DIMES consists of a descent camera and an algorithm for estimating horizontal velocity using image, inertial, and altitude measurements. Before DIMES was accepted by MER for inclusion in the mission, its performance was validated through field testing using a manned helicopter to image three Mars analog test sites. Statistical analysis of the resulting 1900+ test cases showed that DIMES met its velocity estimation requirement. This paper describes the DIMES field test approach and the associated results.
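As a small illustration of the kind of statistical check such a field test implies, the sketch below computes the fraction of test runs whose velocity-estimation error falls within a requirement threshold. The threshold and the synthetic error distribution are made-up assumptions, not values from the paper.

```python
import numpy as np

def pass_rate(errors_mps, requirement_mps):
    """Fraction of runs whose absolute velocity error meets the requirement."""
    errors = np.abs(np.asarray(errors_mps, dtype=float))
    return float(np.mean(errors <= requirement_mps))

rng = np.random.default_rng(1)
errors = rng.normal(0.0, 1.5, size=1900)             # hypothetical per-run errors (m/s)
print(f"{pass_rate(errors, requirement_mps=5.0):.3f}")  # fraction meeting a 5 m/s requirement
```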


systems, man and cybernetics | 2005

Mars Exploration Rover mobility and robotic arm operational performance

Edward Tunstel; Mark W. Maimone; Ashitey Trebi-Ollennu; Jeng Yen; Rich Petras; Reg G. Willson

Increased attention has been focused in recent years on human-machine systems, how they are architected, and how they should operate. The purpose of this paper is to describe an actual instance of a practical human-robot system used on a NASA Mars rover mission that has been underway since January 2004, involving daily interaction between humans on Earth and mobile robots on Mars. The emphasis is on the human-robot collaborative arrangement and the performance enabled by mobility and robotic arm software functionality during the first 90 days of the mission. Mobile traverse distance, accuracy, and rate, as well as robotic arm operational accuracy achieved by the system, are presented.


Archive | 2015

Curiosity’s robotic arm-mounted Mars Hand Lens Imager (MAHLI): Characterization and calibration status

Kenneth S. Edgett; Robin Aileen Yingst; Michael A. Caplinger; M. C. Caplinger; J. N. Maki; Michael A. Ravine; Fatemeh Ghaemi; F. Tony Ghaemi; Sean McNair; Kenneth E. Herkenhoff; Brian M. Duston; Reg G. Willson; R. Aileen Yingst; Megan R. Kennedy; M. E. Minitti; Aaron J. Sengstacken; Kimberley D. Supulver; Leslie J. Lipkaman; Gillian M. Krezoski; Marie J. McBride; Tessa L. Jones; Brian E. Nixon; Jason K. Van Beek; Daniel Krysak; Randolph L. Kirk



Optical Engineering | 2004

Sun-induced veiling glare in dusty camera optics

Carl Christian Liebe; Lawrence M. Scherr; Reg G. Willson

The National Aeronautics and Space Administration (NASA) is planning to send two Mars Exploration Rovers (MER) to Mars in 2003. Onboard these rovers will be a number of scientific and engineering cameras. Mars is a dusty place, so dust will accumulate on the front surface of the camera optics. When the sun shines on the dusty front surface, light will be scattered to the detector. This increases glare and reduces contrast. The rover lenses must work even when the sun shines on the front element. Therefore, the veiling glare has been evaluated by experiments. We discuss these experiments and the results.
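For intuition, a simplified model treats veiling glare as a roughly uniform offset V added across the image, which lowers Michelson contrast from (Imax - Imin)/(Imax + Imin) to (Imax - Imin)/(Imax + Imin + 2V). The uniform-offset assumption and the numbers below are illustrative, not measurements from these experiments.

```python
def michelson_contrast(i_max, i_min, veiling=0.0):
    """Michelson contrast of a bright/dark pair with a uniform glare offset added."""
    return (i_max - i_min) / (i_max + i_min + 2.0 * veiling)

clean = michelson_contrast(200.0, 50.0)                 # 0.600 with no glare
dusty = michelson_contrast(200.0, 50.0, veiling=75.0)   # 0.375 with a uniform glare offset
print(clean, dusty)
```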


Space Science Reviews | 2012

Curiosity's Mars Hand Lens Imager (MAHLI) investigation

Kenneth S. Edgett; R. Aileen Yingst; Michael A. Ravine; Michael A. Caplinger; J. N. Maki; F. Tony Ghaemi; J. A. Schaffner; James F. Bell; Laurence J. Edwards; Kenneth E. Herkenhoff; Ezat Heydari; Linda C. Kah; Mark T. Lemmon; M. E. Minitti; Timothy S. Olson; Timothy J. Parker; Scott K. Rowland; Juergen Schieber; Robert J. Sullivan; Dawn Y. Sumner; Peter C. Thomas; Elsa Jensen; John J. Simmonds; Aaron J. Sengstacken; Reg G. Willson; W. Goetz


Archive | 2010

The Mars Science Laboratory (MSL) Mast-mounted Cameras (Mastcams) Flight Instruments

Michael C. Malin; Michael A. Caplinger; Kenneth S. Edgett; F. T. Ghaemi; Michael A. Ravine; J. A. Schaffner; Jennifer Baker; Jason Dante Bardis; Dan Dibiase; J. N. Maki; Reg G. Willson; James F. Bell; William E. Dietrich; Laurence J. Edwards; Bernard Hallet; Kenneth E. Herkenhoff; Ezat Heydari; Linda C. Kah; Mark T. Lemmon; M. E. Minitti; Timothy S. Olson; Timothy J. Parker; Scott K. Rowland; Juergen Schieber; R. Sullivan; Dawn Y. Sumner; Peter C. Thomas; Robin Aileen Yingst


Archive | 2005

An Optical Model for Image Artifacts Produced by Dust Particles on Lenses

Reg G. Willson; Mark W. Maimone; Andrew Edie Johnson; Larry Scherr

Collaboration


Dive into Reg G. Willson's collaborations.

Top Co-Authors

J. N. Maki (Jet Propulsion Laboratory)
M. E. Minitti (Planetary Science Institute)
Kenneth E. Herkenhoff (United States Geological Survey)
Robin Aileen Yingst (University of Wisconsin–Green Bay)
Andrew Edie Johnson (California Institute of Technology)
Dawn Y. Sumner (University of California)
Ezat Heydari (Jackson State University)
James F. Bell (Arizona State University)
Juergen Schieber (Indiana University Bloomington)