Robert O. Ambrose
Texas Medical Center
Publications
Featured research published by Robert O. Ambrose.
international conference on robotics and automation | 2011
Myron A. Diftler; Joshua S. Mehling; Muhammad E. Abdallah; Nicolaus A. Radford; Lyndon Bridgwater; A.M. Sanders; R.S. Askew; Douglas Martin Linn; John D. Yamokoski; F.A. Permenter; Brian Hargrave; Robert Platt; R.T. Savely; Robert O. Ambrose
NASA and General Motors have developed the second-generation Robonaut, Robonaut 2 or R2, which is scheduled to arrive on the International Space Station in early 2011 and undergo initial testing by mid-year. This state-of-the-art, dexterous, anthropomorphic robotic torso has significant technical improvements over its predecessor, making it a far more valuable tool for astronauts. Upgrades include increased force sensing, greater range of motion, higher bandwidth, and improved dexterity. R2's integrated mechatronic design results in a more compact and robust distributed control system with a fraction of the wiring of the original Robonaut. Modularity is prevalent throughout the hardware and software, along with innovative and layered approaches for sensing and control. The most important aspects of the Robonaut philosophy are clearly present in this latest model's ability to allow comfortable human interaction and in its design to perform significant work using the same hardware and interfaces used by people. The following describes the mechanisms, integrated electronics, control strategies, and user interface that make R2 a promising addition to the Space Station and other environments where humanoid robots can assist people.
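The abstract does not detail the avionics, but a distributed control system of this kind is commonly organized as local joint-level servo loops coordinated by a supervisor that only distributes setpoints. The sketch below illustrates that pattern; the class names, gains, and update scheme are assumptions for illustration, not the R2 design.

```python
"""Minimal sketch of a distributed joint-control pattern.

Hypothetical illustration only: class names, gains, and the update scheme
are assumptions, not the actual Robonaut 2 avionics design.
"""
from dataclasses import dataclass


@dataclass
class JointState:
    position: float = 0.0
    velocity: float = 0.0


class JointController:
    """Local loop: each joint tracks its own setpoint with a PD law."""

    def __init__(self, kp: float, kd: float, torque_limit: float):
        self.kp, self.kd, self.torque_limit = kp, kd, torque_limit
        self.setpoint = 0.0

    def update(self, state: JointState) -> float:
        # PD control with saturation, standing in for the embedded servo loop.
        torque = self.kp * (self.setpoint - state.position) - self.kd * state.velocity
        return max(-self.torque_limit, min(self.torque_limit, torque))


class Supervisor:
    """Central node: distributes setpoints, leaves servoing to the joints."""

    def __init__(self, controllers: dict[str, JointController]):
        self.controllers = controllers

    def command_pose(self, pose: dict[str, float]) -> None:
        for name, target in pose.items():
            self.controllers[name].setpoint = target


if __name__ == "__main__":
    joints = {name: JointController(kp=50.0, kd=2.0, torque_limit=10.0)
              for name in ("shoulder", "elbow", "wrist")}
    sup = Supervisor(joints)
    sup.command_pose({"shoulder": 0.3, "elbow": -0.8, "wrist": 0.1})
    print({n: round(c.update(JointState()), 2) for n, c in joints.items()})
```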
Autonomous Robots | 2003
William Bluethmann; Robert O. Ambrose; Myron A. Diftler; R. Scott Askew; Eric Huber; Michael Goza; Fredrik Rehnmark; Chris Lovchik; Darby Magruder
The Robotics Technology Branch at the NASA Johnson Space Center is developing robotic systems to assist astronauts in space. One such system, Robonaut, is a humanoid robot with dexterity approaching that of a suited astronaut. Robonaut currently has two dexterous arms and hands, a three degree-of-freedom articulating waist, and a two degree-of-freedom neck used as a camera and sensor platform. In contrast to other space manipulator systems, Robonaut is designed to work within existing corridors and use the same tools as space-walking astronauts. Robonaut is envisioned as working with astronauts, both autonomously and by teleoperation, performing a variety of tasks including routine maintenance, setting up and breaking down worksites, assisting crew members outside of spacecraft, and serving in a rapid response capacity.
international conference on robotics and automation | 2003
Myron A. Diftler; C.J. Culbert; Robert O. Ambrose; Robert Platt; William Bluethmann
The NASA/DARPA Robonaut system is evolving from a purely teleoperator-controlled anthropomorphic robot towards a humanoid system with multiple control pathways. Robonaut is a human-scale robot designed to approach the dexterity of a space-suited astronaut. Under teleoperator control, Robonaut has been able to perform many high-payoff tasks, indicating that it could significantly reduce the maintenance workload for humans working in space. Throughout its development, Robonaut has been augmented with new sensors and software, resulting in increased skills that allow for more shared control with the teleoperator and ever increasing levels of autonomy. These skills range from simple compliance control and short-term memory to, most recently, reflexive grasping and haptic object identification using a custom tactile glove, and real-time visual object tracking.
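As a rough illustration of what a tactile-triggered grasp reflex can look like, the sketch below closes each finger until its tactile reading indicates contact and then regulates toward a light holding force. The sensor layout, thresholds, and closing law are assumptions for illustration, not the Robonaut glove or hand software.

```python
"""Minimal sketch of a tactile-triggered reflexive grasp.

Illustrative only: sensor names, thresholds, and the closing law are
assumptions, not the Robonaut glove or hand control software.
"""
import numpy as np

CONTACT_THRESHOLD = 0.5   # normalized tactile reading that counts as contact
GRIP_TARGET = 2.0         # normalized force to hold once contact is detected


def reflexive_grasp_step(finger_forces: np.ndarray, finger_closure: np.ndarray,
                         step: float = 0.02) -> np.ndarray:
    """Close any finger that has not yet made contact; hold the rest.

    finger_forces  -- current tactile reading per finger
    finger_closure -- current closure command per finger (0 open, 1 closed)
    """
    closure = finger_closure.copy()
    for i, force in enumerate(finger_forces):
        if force < CONTACT_THRESHOLD:
            closure[i] = min(1.0, closure[i] + step)         # keep closing
        elif force < GRIP_TARGET:
            closure[i] = min(1.0, closure[i] + 0.25 * step)  # gentle squeeze
        # at or above GRIP_TARGET: hold position
    return closure


if __name__ == "__main__":
    forces = np.array([0.0, 0.6, 2.1, 0.0, 0.4])   # simulated glove readings
    closure = np.zeros(5)
    print(reflexive_grasp_step(forces, closure))
```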
international conference on robotics and automation | 2004
Robert O. Ambrose; Robert T. Savely; S. M. Goza; Philip Strawser; Myron A. Diftler; Ivan M. Spain; Nicolaus A. Radford
The Johnson Space Center has developed a new mobile manipulation system that combines a Robonaut upper body mounted on a Segway mobile base. The objective is to study fluid and coordinated control of dexterous limbs on a mobile robot. The system has been demonstrated interacting with people, tools, and urban interfaces built for humans. Human interactions have included manually exchanging objects with humans, following people, and tracking people with hand-held objects such as flashlights. Like other configurations of the Robonaut family, the upper body provides dexterity for using tools such as wire cutters, shovels, and space flight gear, and for handling flexible tethers and fabrics. The Segway base is a custom version called the Robotic Mobility Platform (RMP), built for DARPA and provided to NASA for this collaborative effort, and it gives Robonaut mobility for Earth-based testing. The RMP's active balance gives Robonaut a relatively small footprint for its height, allowing it to pass through doors and elevators built for humans and use wheelchair-accessible ramps and lifts. Lessons learned from this development are presented to improve the design of future mobile manipulation systems.
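The RMP's active balance works on the wheeled-inverted-pendulum principle: the wheels are accelerated to keep the body's pitch near vertical. The sketch below simulates a minimal pitch-stabilization loop; the gains, simplified dynamics, and time step are assumptions, not the Segway RMP's actual control law.

```python
"""Minimal sketch of pitch stabilization for a wheeled inverted pendulum,
the principle behind the RMP base's active balance.

Illustrative only: gains, dynamics, and time step are assumptions; this is
not the Segway RMP's actual controller.
"""
import numpy as np

G, L = 9.81, 1.0          # gravity and assumed body (pendulum) length
KP, KD = 40.0, 8.0        # balance gains on pitch and pitch rate
DT = 0.01


def simulate(steps: int = 300) -> float:
    pitch, pitch_rate = 0.1, 0.0          # start tipped forward 0.1 rad
    for _ in range(steps):
        # Wheel acceleration commanded to drive pitch back to zero.
        wheel_accel = KP * pitch + KD * pitch_rate
        # Simplified dynamics: gravity tips the body, base acceleration rights it.
        pitch_accel = (G * np.sin(pitch) - wheel_accel * np.cos(pitch)) / L
        pitch_rate += pitch_accel * DT
        pitch += pitch_rate * DT
    return pitch


if __name__ == "__main__":
    print(round(simulate(), 4))   # pitch decays toward zero
```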
international conference on robotics and automation | 2004
Toby B. Martin; Robert O. Ambrose; Myron A. Diftler; Robert Platt; Melissa Butzer
Tactile data from rugged gloves are providing the foundation for developing autonomous grasping skills for the NASA/DARPA Robonaut, a dexterous humanoid robot. These custom gloves complement the human-like dexterity available in the Robonaut hands. Multiple versions of the gloves are discussed, showing a progression in using advanced materials and construction techniques to enhance sensitivity and overall sensor coverage. The force data provided by the gloves can be used to improve dexterous, tool, and power grasping primitives. Experiments with the latest gloves focus on the use of tools, specifically a power drill used to approximate an astronaut's torque tool.
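One simple way glove force data can gate a power (tool) grasp primitive is to require palm contact plus several fingertip contacts before a tool such as the drill is enabled. The sketch below shows that kind of check; the sensor layout and thresholds are assumptions, not the Robonaut glove processing pipeline.

```python
"""Minimal sketch of checking a power (tool) grasp from glove force data.

Illustrative only: the sensor layout, thresholds, and the stability test are
assumptions, not the Robonaut glove processing pipeline.
"""
import numpy as np


def tool_grasp_is_stable(fingertip_forces: np.ndarray, palm_force: float,
                         min_contacts: int = 3, min_palm: float = 1.0) -> bool:
    """Require palm contact plus several fingertip contacts before
    enabling a tool such as a power drill."""
    contacts = int(np.sum(fingertip_forces > 0.5))
    return palm_force >= min_palm and contacts >= min_contacts


if __name__ == "__main__":
    fingertips = np.array([1.2, 0.9, 0.7, 0.1, 0.8])  # simulated readings
    print(tool_grasp_is_stable(fingertips, palm_force=1.5))  # True
```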
Journal of Field Robotics | 2015
Nicolaus A. Radford; Philip Strawser; Kimberly A. Hambuchen; Joshua S. Mehling; William K. Verdeyen; A. Stuart Donnan; James Holley; Jairo Sanchez; Vienny Nguyen; Lyndon Bridgwater; Reginald Berka; Robert O. Ambrose; Mason M. Markee; Nathan Fraser-Chanpong; Christopher McQuin; John D. Yamokoski; Stephen Hart; Raymond Guo; Adam H. Parsons; Brian J. Wightman; Paul Dinh; Barrett Ames; Charles Blakely; Courtney Edmondson; Brett Sommers; Rochelle Rea; Chad Tobler; Heather Bibby; Brice Howard; Lei Niu
In December 2013, 16 teams from around the world gathered at Homestead Speedway near Miami, FL, to participate in the DARPA Robotics Challenge (DRC) Trials, an aggressive robotics competition partly inspired by the aftermath of the Fukushima Daiichi reactor incident. While the focus of the DRC Trials is to advance robotics for use in austere and inhospitable environments, the objectives of the DRC are to progress the areas of supervised autonomy and mobile manipulation for everyday robotics. NASA's Johnson Space Center led a team comprising numerous partners to develop Valkyrie, NASA's first bipedal humanoid robot. Valkyrie is a 44 degree-of-freedom, series elastic actuator-based robot that draws upon over 18 years of humanoid robotics design heritage. Valkyrie's application intent is aimed not only at responding to events like Fukushima, but also at advancing human spaceflight endeavors in extraterrestrial planetary settings. This paper presents a brief system overview, detailing Valkyrie's mechatronic subsystems, followed by a summary of the inverse kinematics-based walking algorithm employed at the Trials. Next, the software and control architectures are highlighted along with a description of the operator interface tools. Finally, some closing remarks are given about the competition, and a vision of future work is provided.
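Although the walking algorithm is only summarized in the paper, the core of any inverse-kinematics-based gait is solving for leg joint angles that place the swing foot on a desired trajectory. The sketch below shows resolved-rate (Jacobian pseudo-inverse) IK for a planar two-link leg; the link lengths, gains, and target are assumptions, and this is not the Valkyrie implementation.

```python
"""Minimal sketch of resolved-rate (Jacobian) inverse kinematics for a
planar two-link leg, in the spirit of an IK-based walking controller.

Illustrative only: link lengths, gains, and the swing-foot target are
assumptions; this is not the Valkyrie walking algorithm.
"""
import numpy as np

L1, L2 = 0.4, 0.4  # assumed thigh and shank lengths (m)


def foot_position(q: np.ndarray) -> np.ndarray:
    """Forward kinematics: hip/knee angles -> foot (x, z) in the hip frame."""
    x = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    z = -L1 * np.cos(q[0]) - L2 * np.cos(q[0] + q[1])
    return np.array([x, z])


def jacobian(q: np.ndarray) -> np.ndarray:
    """Partial derivatives of the foot position with respect to the joints."""
    j11 = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    j12 = L2 * np.cos(q[0] + q[1])
    j21 = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    j22 = L2 * np.sin(q[0] + q[1])
    return np.array([[j11, j12], [j21, j22]])


def ik_step(q: np.ndarray, target: np.ndarray, gain: float = 0.5) -> np.ndarray:
    """One resolved-rate update toward the desired foot position."""
    error = target - foot_position(q)
    dq = np.linalg.pinv(jacobian(q)) @ (gain * error)
    return q + dq


if __name__ == "__main__":
    q = np.array([0.1, 0.2])                 # initial hip, knee angles (rad)
    target = np.array([0.15, -0.70])         # desired foot point in hip frame
    for _ in range(50):
        q = ik_step(q, target)
    print(np.round(foot_position(q), 3))     # converges toward the target
```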
human factors in computing systems | 2004
S. M. Goza; Robert O. Ambrose; Myron A. Diftler; Ivan M. Spain
Engineers at the Johnson Space Center recently combined the upper body of the National Aeronautics and Space Administration (NASA) / Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Robotic Mobility Platform (RMP) to make an extremely mobile humanoid robot designed to interact with human teammates. Virtual reality gear that immerses a human operator into Robonaut's working environment provides the primary control pathway for remote operations. Human/robot interface challenges are addressed in the control system for teleoperators, console operators, and humans working directly with Robonaut. Multiple control modes, selectable by the operator depending on the type of grasp required, are available for controlling the five-fingered dexterous robot hands. A relative positioning system is used to maximize operator comfort during arm and head motions. Foot pedals control the mobility base. Initial tasks that include working with human-rated tools, navigating hallways, and cutting wires are presented and show the effectiveness of telepresence control for this class of robot.
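A relative (clutched) positioning scheme typically latches reference frames for the operator and the robot when a clutch is engaged, so the operator can re-center their hands without moving the robot and then resume commanding offsets. The sketch below illustrates that mapping; the interface names and scaling are assumptions, not the Robonaut teleoperation software.

```python
"""Minimal sketch of a clutched ("relative") teleoperation mapping.

Illustrative only: names and scaling are assumptions, not the Robonaut
teleoperation software. While the clutch is engaged, operator hand
displacements are added to the robot hand pose; while it is released,
the operator may re-center without moving the robot.
"""
import numpy as np


class RelativePositioner:
    def __init__(self, scale: float = 1.0):
        self.scale = scale
        self.engaged = False
        self._operator_ref = np.zeros(3)
        self._robot_ref = np.zeros(3)

    def engage(self, operator_pos: np.ndarray, robot_pos: np.ndarray) -> None:
        """Latch reference frames when the operator engages the clutch."""
        self.engaged = True
        self._operator_ref = operator_pos.copy()
        self._robot_ref = robot_pos.copy()

    def release(self) -> None:
        self.engaged = False

    def robot_command(self, operator_pos: np.ndarray,
                      robot_pos: np.ndarray) -> np.ndarray:
        """Robot target: hold position when disengaged, else track offsets."""
        if not self.engaged:
            return robot_pos
        return self._robot_ref + self.scale * (operator_pos - self._operator_ref)


if __name__ == "__main__":
    tele = RelativePositioner(scale=0.5)
    robot = np.array([0.4, 0.0, 0.3])
    tele.engage(operator_pos=np.array([0.0, 0.0, 1.2]), robot_pos=robot)
    # Operator moves 0.1 m in x; robot target moves 0.05 m in x.
    print(tele.robot_command(np.array([0.1, 0.0, 1.2]), robot))
```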
ieee aerospace conference | 2008
Dan A. Harrison; Robert O. Ambrose; Bill Bluethmann; Lucien Q. Junkin
As NASA refines its plans for the return of humans to the lunar surface, it becomes very clear that surface mobility will be critical to outpost buildup and exploration activities. NASA's Exploration Technology Development Program is investing in a broad range of surface mobility projects. Within this range of projects falls a rover vehicle capable of moving suited crew members and cargo. A prototype, known as Chariot, has been developed. This prototype is a multipurpose, reconfigurable, modular lunar surface vehicle, and with the right attachments and/or crew accommodations, Chariot will be capable of serving a large number of functions.
ieee-ras international conference on humanoid robots | 2004
William Bluethmann; Robert O. Ambrose; Myron A. Diftler; Eric Huber; Andrew H. Fagg; Michael T. Rosenstein; Robert Platt; Roderic A. Grupen; Cynthia Breazeal; Andrew G. Brooks; Andrea Lockerd; Richard Alan Peters; Odest Chadwicke Jenkins; Maja J. Matarić; Magdalena D. Bugajska
To make the transition from a technological curiosity to productive tools, humanoid robots will require key advances in many areas, including mechanical design, sensing, embedded avionics, power, and navigation. Using the NASA Johnson Space Center's Robonaut as a testbed, the DARPA Mobile Autonomous Robot Software (MARS) humanoids team is investigating technologies that will enable humanoid robots to work effectively with humans and autonomously work with tools. A novel learning approach is being applied that enables the robot to learn both from a remote human teleoperating the robot and from an adjacent human giving instruction. When the remote human performs tasks teleoperatively, the robot learns the salient sensory-motor features for executing the task. Once learned, the task may be carried out by fusing the required skills, guided by on-board sensing. The adjacent human takes advantage of previously learned skills to sequence their execution. Preliminary results from initial experiments using a drill to tighten lug nuts on a wheel are discussed.
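One simple stand-in for learning "salient sensory-motor features" from a teleoperated demonstration is to keep only the configurations where the demonstrated motion comes to rest, which can later be sequenced and replayed under on-board sensing. The sketch below extracts such keyframes; the feature choice and data layout are assumptions, not the MARS team's learning method.

```python
"""Minimal sketch of extracting keyframes from a teleoperated demonstration.

Illustrative only: the feature choice (low-speed points) and data layout are
assumptions, not the MARS team's learning approach.
"""
import numpy as np


def extract_keyframes(trajectory: np.ndarray, speed_threshold: float = 0.005):
    """Return indices where the demonstrated motion nearly pauses.

    trajectory -- (T, D) array of joint or hand positions sampled over time
    """
    speeds = np.linalg.norm(np.diff(trajectory, axis=0), axis=1)
    paused = speeds < speed_threshold
    keyframes = [0]
    for t in range(1, len(paused)):
        if paused[t] and not paused[t - 1]:   # motion just came to rest
            keyframes.append(t)
    if keyframes[-1] != len(trajectory) - 1:
        keyframes.append(len(trajectory) - 1)
    return keyframes


if __name__ == "__main__":
    # Synthetic demonstration: hold, move, then hold again.
    t = np.linspace(0.0, 1.0, 100)[:, None]
    demo = np.hstack([np.clip(t, 0.2, 0.8), 0.5 * np.clip(t, 0.2, 0.8)])
    print(extract_keyframes(demo))   # start, the mid-trajectory pause, the end
```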
IEEE Transactions on Robotics | 2006
Christina Louise Campbell; Richard Alan Peters; Robert E. Bodenheimer; William Bluethmann; Eric Huber; Robert O. Ambrose
This paper reports that the superposition of a small set of behaviors, learned via teleoperation, can lead to robust completion of an articulated reach-and-grasp task. The results support the hypothesis that a robot can learn to interact purposefully with its environment through a developmental acquisition of sensory-motor coordination. Teleoperation can bootstrap the process by enabling the robot to observe its own sensory responses to actions that lead to specific outcomes within an environment. It is shown that a reach-and-grasp task, learned by an articulated robot through a small number of teleoperated trials, can be performed autonomously with success in the face of significant variations in the environment and perturbations of the goal. In particular, teleoperation of the robot to reach and grasp an object at nine different locations in its workspace enabled robust autonomous performance of the task anywhere within the workspace. Superpositioning was performed using the Verbs and Adverbs algorithm that was developed originally for the graphical animation of articulated characters. The work was performed on Robonaut, the NASA space-capable humanoid at Johnson Space Center, Houston, TX.
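As a rough stand-in for how a handful of demonstrated reaches can be superposed for a new goal, the sketch below blends time-aligned example trajectories with weights based on how close each example's goal is to the new one. This inverse-distance weighting is an assumption for illustration; the actual Verbs and Adverbs algorithm interpolates the exemplars with radial basis functions combined with a linear approximation.

```python
"""Minimal sketch of blending demonstrated trajectories for a new goal.

Illustrative only: simple inverse-distance weighting over example motions,
standing in for the Verbs and Adverbs interpolation used on Robonaut.
"""
import numpy as np


def blend_trajectories(goal: np.ndarray, example_goals: np.ndarray,
                       example_trajs: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Weight each time-aligned example trajectory by proximity of its goal.

    goal          -- (G,) target position for the new reach
    example_goals -- (N, G) goal positions of the demonstrations
    example_trajs -- (N, T, D) time-aligned joint trajectories
    """
    dists = np.linalg.norm(example_goals - goal, axis=1)
    weights = 1.0 / (dists + eps)
    weights /= weights.sum()
    # Weighted superposition of the exemplar motions.
    return np.tensordot(weights, example_trajs, axes=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    goals = rng.uniform(-0.3, 0.3, size=(9, 3))   # nine demo goal points
    trajs = rng.normal(size=(9, 50, 7))           # nine 7-DOF joint trajectories
    new_goal = np.array([0.05, -0.1, 0.2])
    blended = blend_trajectories(new_goal, goals, trajs)
    print(blended.shape)                          # (50, 7)
```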