Publications


Featured research published by Laurence J. Edwards.


Autonomous Robots | 2001

Virtual Reality Interfaces for Visualization and Control of Remote Vehicles

Laurent Nguyen; Maria Bualat; Laurence J. Edwards; Lorenzo Flueckiger; Charles F. Neveu; Kurt Schwehr; Michael Wagner; Eric Zbinden

The Autonomy and Robotics Area (ARA) at NASA Ames Research Center has investigated the use of various types of Virtual Reality-based operator interfaces to remotely control complex robotic mechanisms. In this paper, we describe the major accomplishments and technology applications of the ARA in this area, and highlight the advantages and issues related to this technology.


Systems, Man and Cybernetics | 2005

Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations

Laurence J. Edwards; Michael H. Sims; Clayton Kunz; David Lees; Judd D. Bowman

Modern NASA planetary exploration missions employ complex systems of hardware and software, managed by large teams of engineers and scientists, to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover (MER) mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo Pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo Pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.
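For context on the terrain reconstruction step mentioned above, the sketch below shows depth-from-disparity triangulation, the core geometric relation behind stereo terrain reconstruction. The camera parameters and function name are illustrative assumptions, not MER or Ames Stereo Pipeline values.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_length_px, baseline_m):
    """Convert a stereo disparity map (pixels) to depth (meters).

    depth = focal_length * baseline / disparity
    Illustrative sketch only; a real pipeline also handles calibration,
    rectification, subpixel matching, and outlier filtering.
    """
    disparity = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity, np.nan)
    valid = disparity > 0                      # zero disparity means the point is at infinity
    depth[valid] = focal_length_px * baseline_m / disparity[valid]
    return depth

# Hypothetical stereo rig: 1000 px focal length, 30 cm baseline.
disparities = np.array([[20.0, 10.0], [5.0, 0.0]])
print(disparity_to_depth(disparities, focal_length_px=1000.0, baseline_m=0.3))
# [[15. 30.]
#  [60. nan]]
```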


Earth and Space Science | 2017

The Mars Science Laboratory (MSL) Mast cameras and Descent Imager: Investigation and instrument descriptions

Michael C. Malin; Michael A. Ravine; Michael A. Caplinger; F. Tony Ghaemi; J. A. Schaffner; J. N. Maki; James F. Bell; James F. Cameron; William E. Dietrich; Kenneth S. Edgett; Laurence J. Edwards; James B. Garvin; Bernard Hallet; Kenneth E. Herkenhoff; Ezat Heydari; Linda C. Kah; Mark T. Lemmon; M. E. Minitti; Timothy S. Olson; Timothy J. Parker; Scott K. Rowland; Juergen Schieber; Ron Sletten; Robert J. Sullivan; Dawn Y. Sumner; R. Aileen Yingst; Brian M. Duston; Sean McNair; Elsa Jensen

The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam‐34 has an f/8, 34 mm focal length lens, and the M‐100 an f/10, 100 mm focal length lens. The M‐34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M‐100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M‐34 can focus from 0.5 m to infinity, and the M‐100 from ~1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the ~2 m tall Remote Sensing Mast, have a 360° azimuth and ~180° elevation field of regard. Mars Descent Imager is fixed‐mounted to the bottom left front side of the rover at ~66 cm above the surface. Its fixed focus lens is in focus from ~2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of ~70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group and predictive lossless compression.
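The quoted fields of view and IFOVs can be cross-checked directly, since for a narrow-angle camera the full field of view is approximately the per-pixel IFOV times the pixel count. A short consistency check using only the numbers from the abstract above:

```python
import math

def fov_deg(ifov_urad, pixels):
    """Approximate full field of view in degrees from a per-pixel angular
    resolution (IFOV, in microradians) and a pixel count.
    Small-angle approximation: FOV ~= IFOV * pixels."""
    return math.degrees(ifov_urad * 1e-6 * pixels)

# Mastcam-34: 218 urad IFOV over a 1600 x 1200 pixel image
print(fov_deg(218, 1600), fov_deg(218, 1200))   # ~20.0 x 15.0 degrees
# Mastcam-100: 74 urad IFOV over the same pixel count
print(fov_deg(74, 1600), fov_deg(74, 1200))     # ~6.8 x 5.1 degrees
```

Both results match the stated 20° × 15° and 6.8° × 5.1° fields of view.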


Space | 2006

Human-Robot Site Survey and Sampling for Space Exploration

Terrence Fong; Maria Bualat; Laurence J. Edwards; Clayton Kunz; Susan Y. Lee; Eric Park; Hans Utz; Nir Ackner; Nicholas Armstrong-Crews; Joseph Gannon

NASA is planning to send humans and robots back to the Moon before 2020. For extended missions to be productive, high-quality maps of lunar terrain and resources are required. Although orbital images can provide much information, many features (local topography, resources, etc.) will have to be characterized directly on the surface. To address this need, we are developing a system to perform site survey and sampling. The system includes multiple robots and humans operating in a variety of team configurations, coordinated via peer-to-peer human-robot interaction. In this paper, we present our system design and describe planned field tests.


Field and Service Robotics | 2008

Autonomous Robotic Inspection for Lunar Surface Operations

Maria Bualat; Laurence J. Edwards; Terrence Fong; Michael Broxton; Lorenzo Flueckiger; Susan Y. Lee; Eric Park; Vinh To; Hans Utz; Vandi Verma; Clayton Kunz; Matt MacMahon

In this paper, we describe NASA Ames Research Center’s K10 rover as used in the 2006 Coordinated Field Demonstration at Meteor Crater, Arizona. We briefly discuss the control software architecture and describe a high dynamic range imaging system and panoramic display system used for the remote inspection of an EVA crew vehicle.


International Conference on Image Processing | 2014

Planetary rover localization within orbital maps

Ara V. Nefian; Xavier Bouyssounouse; Laurence J. Edwards; Taemin Kim; Emily Hand; Jared Rhizor; Matthew C. Deans; George Bebis; Terrence Fong

This paper introduces an advanced rover localization system suitable for autonomous planetary exploration in the absence of Global Positioning System (GPS) infrastructure. Given an existing terrain map (image and elevation) obtained from satellite imagery, together with images provided by the rover's stereo camera system, the proposed method determines the best rover location through visual odometry, 3D terrain matching, and horizon matching. The system is tested on data from a 3 km traverse of the Basalt Hills quarry in California, where a GPS track is used as ground truth. Experimental results show that the system reduces the localization error of wheel odometry by over 60%.
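A minimal sketch of the kind of evaluation described above: composing incremental odometry estimates into a trajectory and measuring its positional error against a ground-truth track (such as GPS). The 2D simplification, noise levels, and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def integrate_odometry(start_pose, increments):
    """Chain relative (dx, dy, dtheta) odometry increments, expressed in the
    rover body frame, into global 2D poses (x, y, theta)."""
    poses = [np.array(start_pose, dtype=float)]
    for dx, dy, dtheta in increments:
        x, y, th = poses[-1]
        # Rotate the body-frame step into the global frame, then accumulate.
        gx = x + dx * np.cos(th) - dy * np.sin(th)
        gy = y + dx * np.sin(th) + dy * np.cos(th)
        poses.append(np.array([gx, gy, th + dtheta]))
    return np.array(poses)

def mean_position_error(estimated, ground_truth):
    """Mean Euclidean distance between estimated and ground-truth positions."""
    return float(np.mean(np.linalg.norm(estimated[:, :2] - ground_truth[:, :2], axis=1)))

# Illustrative comparison: the same path integrated from noise-free increments
# (stand-in for ground truth) versus noisy wheel-odometry increments.
rng = np.random.default_rng(0)
true_increments = [(1.0, 0.0, 0.02)] * 100
truth = integrate_odometry((0.0, 0.0, 0.0), true_increments)
wheel = integrate_odometry((0.0, 0.0, 0.0),
                           [(dx + rng.normal(0, 0.05), dy, dth + rng.normal(0, 0.005))
                            for dx, dy, dth in true_increments])
print(mean_position_error(wheel, truth))   # dead-reckoning drift, in meters
```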


International Conference on Image Processing | 2016

Horizon based orientation estimation for planetary surface navigation

Xavier Bouyssounouse; Ara V. Nefian; A. Thomas; Laurence J. Edwards; Matthew C. Deans; Terrence Fong

Planetary rovers navigate in extreme environments in which a Global Positioning System (GPS) is unavailable, maps are restricted to the relatively low resolution provided by orbital imagery, and compass information is often lacking due to weak or nonexistent magnetic fields. However, accurate rover localization is particularly important for mission success: reaching science targets, avoiding negative obstacles visible only in orbital maps, and maintaining good communication connections with the ground. This paper describes a horizon-based solution for precise rover orientation estimation. The horizon detected in imagery from the onboard navigation cameras is matched with the horizon rendered from the existing terrain model. The set of rotation parameters (roll, pitch, yaw) that minimizes the cost function between the two horizon curves corresponds to the estimated rover orientation.
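A minimal sketch of the matching step described above: search for the (roll, pitch, yaw) that minimizes a sum-of-squared-differences cost between a detected horizon curve and a horizon rendered from the terrain model. The render_horizon callable is a hypothetical stand-in for terrain-model rendering, and the toy terrain is invented; only the overall cost-minimization formulation follows the abstract.

```python
import numpy as np
from scipy.optimize import minimize

def horizon_cost(rpy, detected_horizon, render_horizon):
    """Sum of squared differences between the detected horizon (one elevation
    angle per image column, in radians) and the horizon rendered from the
    terrain model at orientation rpy = (roll, pitch, yaw)."""
    rendered = render_horizon(*rpy)            # same length as detected_horizon
    return float(np.sum((detected_horizon - rendered) ** 2))

def estimate_orientation(detected_horizon, render_horizon, initial_rpy=(0.0, 0.0, 0.0)):
    """Estimate (roll, pitch, yaw) by minimizing the horizon-matching cost."""
    result = minimize(horizon_cost, x0=np.array(initial_rpy),
                      args=(detected_horizon, render_horizon),
                      method="Nelder-Mead")
    return result.x

# Toy example: a synthetic horizon whose shape shifts with roll, pitch, and yaw.
columns = np.linspace(-0.5, 0.5, 64)            # column viewing angles (rad)
def toy_render_horizon(roll, pitch, yaw):
    return 0.1 * np.sin(3 * (columns + yaw)) + pitch + roll * columns

true_rpy = (0.01, 0.05, 0.2)
detected = toy_render_horizon(*true_rpy)
print(estimate_orientation(detected, toy_render_horizon))  # close to (0.01, 0.05, 0.2)
```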


Visualization and Data Analysis | 2005

Designing visualization software for ships and robotic vehicles

Kurt Schwehr; Alexander Derbes; Laurence J. Edwards; Laurent Nguyen; Eric Zbinden

One of the challenges of visualization software design is providing real-time tools capable of concurrently displaying data that varies temporally and in scale from kilometers to micrometers, such as the data prevalent in planetary exploration and deep-sea marine research. The Viz software developed by NASA Ames and the additions of the X-Core extensions solve this problem by providing a flexible framework for rapidly developing visualization software capable of accessing and displaying large dynamic data sets. This paper describes the Viz/X-Core design and illustrates the operation of both systems over a number of deployments ranging from marine research to Martian exploration. Highlights include a 2002 integration with live ship operations and the Mars Exploration Rovers Spirit and Opportunity.


Space Science Reviews | 2012

Curiosity's Mars Hand Lens Imager (MAHLI) investigation

Kenneth S. Edgett; R. Aileen Yingst; Michael A. Ravine; Michael A. Caplinger; J. N. Maki; F. Tony Ghaemi; J. A. Schaffner; James F. Bell; Laurence J. Edwards; Kenneth E. Herkenhoff; Ezat Heydari; Linda C. Kah; Mark T. Lemmon; M. E. Minitti; Timothy S. Olson; Timothy J. Parker; Scott K. Rowland; Juergen Schieber; Robert J. Sullivan; Dawn Y. Sumner; Peter C. Thomas; Elsa Jensen; John J. Simmonds; Aaron J. Sengstacken; Reg G. Willson; W. Goetz


Archive | 2008

The Ames Stereo Pipeline: Automated 3D Surface Reconstruction from Orbital Imagery

Michael Broxton; Laurence J. Edwards

Collaboration


Dive into Laurence J. Edwards's collaborations.

Top Co-Authors

James F. Bell
Arizona State University

Dawn Y. Sumner
University of California

Ezat Heydari
Jackson State University

J. N. Maki
Jet Propulsion Laboratory

Juergen Schieber
Indiana University Bloomington

Kenneth E. Herkenhoff
United States Geological Survey

Linda C. Kah
University of Tennessee

M. E. Minitti
Planetary Science Institute