
Publication


Featured research published by Roger V. Bostelman.


Journal of Robotic Systems | 1993

The NIST robocrane

James S. Albus; Roger V. Bostelman; Nicholas G. Dagalakis

The Robot Systems Division of the National Institute of Standards and Technology (NIST) has been experimenting for several years with new concepts for robot cranes. These concepts utilize the basic idea of the Stewart platform parallel link manipulator. The unique feature of the NIST approach is to use cables as the parallel links and to use winches as the actuators. As long as the cables are all in tension, the load is kinematically constrained and the cables resist perturbing forces and moments with equal stiffness to both positive and negative loads. The result is that the suspended load is constrained with a mechanical stiffness determined by the elasticity of the cables, the suspended weight, and the geometry of the mechanism. Based on these concepts, a revolutionary new type of robot crane, the NIST ROBOCRANE, has been developed that can control the position, velocity, and force of tools and heavy machinery in all six degrees of freedom (x, y, z, roll, pitch, and yaw). Depending on what is suspended from its work platform, the ROBOCRANE can perform a variety of tasks. Examples are: cutting, excavating and grading, shaping and finishing, lifting, and positioning. A 6-m version of the ROBOCRANE has been built and critical performance characteristics analyzed.
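
The cable-suspension idea described above reduces, kinematically, to computing the six cable lengths that realize a desired platform pose. The sketch below shows that inverse-kinematics step for a generic six-cable suspended platform; the winch and platform attachment coordinates are made-up placeholders, not the geometry of the actual ROBOCRANE.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X Euler rotation (yaw about z, pitch about y, roll about x)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def cable_lengths(base_pts, platform_pts, position, rpy):
    """Inverse kinematics of a six-cable suspended platform:
    cable i runs from fixed base point b_i to platform point p_i,
    so its length is |b_i - (t + R p_i)| for the pose (t, R)."""
    R = rotation_matrix(*rpy)
    world_platform = position + platform_pts @ R.T   # platform points in the world frame
    return np.linalg.norm(base_pts - world_platform, axis=1)

# Placeholder geometry (meters): six upper support points and six platform corners.
base = np.array([[ 3.0,  0.0, 6.0], [ 3.0,  0.5, 6.0],
                 [-1.5,  2.6, 6.0], [-2.0,  2.3, 6.0],
                 [-1.5, -2.6, 6.0], [-2.0, -2.3, 6.0]])
plat = np.array([[ 0.5,  0.9, 0.0], [ 1.0,  0.0, 0.0],
                 [ 0.5, -0.9, 0.0], [-1.0,  0.3, 0.0],
                 [-1.0, -0.3, 0.0], [-0.5,  0.9, 0.0]])

lengths = cable_lengths(base, plat, position=np.array([0.0, 0.0, 2.0]),
                        rpy=(0.0, 0.0, np.deg2rad(10)))
print(lengths)  # commanded winch pay-out for this pose
```

As long as all six cables stay in tension, commanding these lengths fully constrains the platform in all six degrees of freedom, which is the property the abstract highlights.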


Journal of Research of the National Institute of Standards and Technology | 1992

The NIST SPIDER, a robot crane

J. Albus; Roger V. Bostelman; Nicholas G. Dagalakis

The Robot Systems Division of the National Institute of Standards and Technology has been experimenting for several years with new concepts for robot cranes. These concepts utilize the basic idea of the Stewart Platform parallel link manipulator. The unique feature of the NIST approach is to use cables as the parallel links and to use winches as the actuators. So long as the cables are all in tension, the load is kinematically constrained, and the cables resist perturbing forces and moments with equal stiffness to both positive and negative loads. The result is that the suspended load is constrained with a mechanical stiffness determined by the elasticity of the cables, the suspended weight, and the geometry of the mechanism. Based on these concepts, a revolutionary new type of robot crane, the NIST SPIDER (Stewart Platform Instrumented Drive Environmental Robot) has been developed that can control the position, velocity, and force of tools and heavy machinery in all six degrees of freedom (x, y, z, roll, pitch, and yaw). Depending on what is suspended from its work platform, the SPIDER can perform a variety of tasks. Examples are: cutting, excavating and grading, shaping and finishing, lifting and positioning. A 6 m version of the SPIDER has been built and critical performance characteristics analyzed.


Autonomous Robots | 2008

Learning traversability models for autonomous mobile vehicles

Michael O. Shneier; Tommy Chang; Tsai Hong; William P. Shackleford; Roger V. Bostelman; James S. Albus

Autonomous mobile robots need to adapt their behavior to the terrain over which they drive, and to predict the traversability of the terrain so that they can effectively plan their paths. Such robots usually make use of a set of sensors to investigate the terrain around them and build up an internal representation that enables them to navigate. This paper addresses the question of how to use sensor data to learn properties of the environment and use this knowledge to predict which regions of the environment are traversable. The approach makes use of sensed information from range sensors (stereo or ladar), color cameras, and the vehicle’s navigation sensors. Models of terrain regions are learned from subsets of pixels that are selected by projection into a local occupancy grid. The models include color and texture as well as traversability information obtained from an analysis of the range data associated with the pixels. The models are learned without supervision, deriving their properties from the geometry and the appearance of the scene. The models are used to classify color images and assign traversability costs to regions. The classification does not use the range or position information, but only color images. Traversability determined during the model-building phase is stored in the models. This enables classification of regions beyond the range of stereo or ladar using the information in the color images. The paper describes how the models are constructed and maintained, how they are used to classify image regions, and how the system adapts to changing environments. Examples are shown from the implementation of this algorithm in the DARPA Learning Applied to Ground Robots (LAGR) program, and an evaluation of the algorithm against human-provided ground truth is presented.
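
As a rough illustration of the unsupervised model-building step described above, the sketch below accumulates color histograms for a "traversable" model and an "obstacle" model from pixels whose labels came from a range-based geometric analysis, then classifies new pixels from color alone. It is a simplified stand-in, not the LAGR implementation; the histogram size and the matching rule are arbitrary choices.

```python
import numpy as np

BINS = 8  # quantize each RGB channel into 8 bins -> an 8**3-bin histogram

def color_bin(pixel):
    """Map an RGB pixel (0-255 per channel) to a single histogram bin index."""
    r, g, b = (int(c) * BINS // 256 for c in pixel)
    return (r * BINS + g) * BINS + b

class TraversabilityModel:
    """Color-histogram model for one terrain class (e.g. ground or obstacle)."""
    def __init__(self):
        self.hist = np.zeros(BINS ** 3)

    def learn(self, pixels):
        """Unsupervised update: the pixels were labeled by range-based geometry,
        not by a human, so the model adapts as the vehicle drives."""
        for p in pixels:
            self.hist[color_bin(p)] += 1

    def likelihood(self, pixel):
        total = self.hist.sum()
        return self.hist[color_bin(pixel)] / total if total else 0.0

def classify(pixel, ground_model, obstacle_model):
    """Color-only classification, usable beyond stereo/ladar range."""
    if ground_model.likelihood(pixel) >= obstacle_model.likelihood(pixel):
        return "traversable"
    return "obstacle"

# Toy usage with made-up pixel samples
ground, obstacle = TraversabilityModel(), TraversabilityModel()
ground.learn([(90, 70, 40), (100, 80, 50), (95, 75, 45)])      # brownish dirt
obstacle.learn([(30, 90, 30), (25, 100, 35), (35, 95, 28)])    # green bushes
print(classify((98, 78, 48), ground, obstacle))
```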


international conference on networking, sensing and control | 2006

Applications of a 3D Range Camera Towards Healthcare Mobility Aids

Roger V. Bostelman; Peter Russo; James S. Albus; Tsai Hong Hong; Rajmohan Madhavan

The National Institute of Standards and Technology (NIST) has recently studied a new 3D range camera for use on mobile robots. These robots have potential applications in manufacturing, healthcare, and perhaps several other service-related areas beyond the scope of this paper. In manufacturing, the 3D range camera shows promise for standard-size obstacle detection, possibly augmenting existing safety systems on automated guided vehicles. We studied the use of this new 3D range imaging camera for advancing safety standards for automated guided vehicles. In healthcare, these cameras show promise for guiding the blind and assisting the disabled who are wheelchair-dependent. Further development beyond standards efforts allowed NIST to combine the 3D camera with stereo audio feedback to help the blind or visually impaired stereophonically hear where a clear path is from room to room as objects are detected with the camera. This paper describes the 3D range camera and the control algorithm that combines the camera with stereo audio to help guide people around objects, including the detection of low-hanging objects typically undetected by a white cane.
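
A minimal sketch of how a clear-path direction found in a range image might be mapped to stereo audio panning, in the spirit of the system described above, is shown below. The column-wise free-space test and the constant-power panning law are assumptions for illustration, not the NIST algorithm.

```python
import numpy as np

def clear_path_direction(range_image, min_clear_m=1.5):
    """Pick the image column whose closest return is farthest away,
    i.e. the direction with the most free space ahead."""
    nearest_per_column = range_image.min(axis=0)     # closest object in each column
    col = int(np.argmax(nearest_per_column))
    clear = nearest_per_column[col] >= min_clear_m   # is that direction actually clear?
    return col, clear

def stereo_gains(col, n_cols):
    """Constant-power pan: a clear path to the left sounds louder in the left ear."""
    pan = col / (n_cols - 1)               # 0.0 = far left, 1.0 = far right
    theta = pan * np.pi / 2
    return np.cos(theta), np.sin(theta)    # (left_gain, right_gain)

# Toy 3D-range-camera frame: rows x columns of range values in meters
frame = np.full((48, 64), 5.0)
frame[20:, 40:] = 0.8                      # low obstacle on the right side
col, clear = clear_path_direction(frame)
left, right = stereo_gains(col, frame.shape[1])
print(f"steer toward column {col}, clear={clear}, gains L={left:.2f} R={right:.2f}")
```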


international conference on advanced robotics | 2005

Towards AGV safety and navigation advancement obstacle detection using a TOF range camera

Roger V. Bostelman; Tsai Hong Hong; Rajmohan Madhavan

The performance evaluation of an obstacle detection and segmentation algorithm for automated guided vehicle (AGV) navigation using a 3D real-time range camera is the subject of this paper. Our approach has been tested successfully on object sizes and materials recommended by the British safety standard, placed on the vehicle path. The segmented (mapped) obstacles are then verified using absolute measurements obtained with a relatively accurate 2D scanning laser rangefinder. Sensor mounting and sensor modulation issues are also described through representative data sets.
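
A simplified version of the detection-and-segmentation idea: drop returns near the floor plane, then group the remaining range points into obstacle blobs with connected-component labeling. The flat-floor assumption and the thresholds below are placeholders, not the parameters used in the paper.

```python
import numpy as np
from scipy import ndimage  # used only for 2D connected-component labeling

def segment_obstacles(points_z, floor_z=0.0, height_thresh=0.05, min_pixels=20):
    """points_z: HxW array of point heights (m) from the TOF camera,
    already transformed into a floor-referenced frame.
    Returns a label image in which each obstacle blob has its own integer id."""
    above_floor = points_z > (floor_z + height_thresh)   # drop floor returns
    labels, n = ndimage.label(above_floor)               # 4-connected blobs
    for blob_id in range(1, n + 1):
        if (labels == blob_id).sum() < min_pixels:       # discard speckle noise
            labels[labels == blob_id] = 0
    return labels

# Toy frame: mostly floor, one box-like obstacle 0.4 m tall
z = np.zeros((60, 80))
z[25:40, 30:50] = 0.4
labeled = segment_obstacles(z)
print("obstacle blobs found:", len(np.unique(labeled)) - 1)
```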


Journal of Intelligent and Robotic Systems | 1992

High-level mobility controller for a remotely operated unmanned land vehicle

Sandor S. Szabo; Harry A. Scott; Karl Murphy; Steven Legowik; Roger V. Bostelman

The U.S. Army Laboratory Command, as part of the Department of Defense Robotics Testbed Program, is developing a testbed for cooperative, real-time control of unmanned land vehicles. The program entails the development and integration of many elements which allow the vehicles to perform both autonomous and teleoperated functions. The National Institute of Standards and Technology (NIST) is supporting this program by developing the vehicle control system using the Real-time Control System (RCS) architecture. RCS is a hierarchical, sensory-based control system, initially developed for the control of industrial robots and automated manufacturing systems. NIST is developing the portions of RCS that control all vehicle mobility functions, coordinate the operations of the other subsystems on the vehicle, and communicate between the vehicle and the remote operator control station. This paper reviews the overall control system architecture, the design and implementation of the mobility and communication functions, and results from recent testing.
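
RCS organizes control as a hierarchy of levels, each running its own sense-model-act cycle and passing commands down and status up. The skeleton below is only a schematic of that structure under assumed level names and cycle times; it is not NIST's RCS code.

```python
from dataclasses import dataclass, field

@dataclass
class RCSLevel:
    """One level of a hierarchical, sensory-based controller: it refines the
    command from the level above into commands for the level below,
    using its own world-model state."""
    name: str
    cycle_s: float                      # assumed cycle time for this level
    world_model: dict = field(default_factory=dict)

    def sense(self, status_from_below):
        self.world_model.update(status_from_below)

    def plan(self, command_from_above):
        # Placeholder decomposition: annotate the command with this level's name.
        return {"task": command_from_above, "decomposed_by": self.name}

# Assumed three-level mobility hierarchy (mission -> path -> servo)
levels = [RCSLevel("mission", 1.0), RCSLevel("path", 0.1), RCSLevel("servo", 0.01)]

def run_once(goal):
    """One top-down pass: each level senses status and refines the command."""
    command, status = goal, {}
    for level in levels:
        level.sense(status)
        command = level.plan(command)
        status = {f"{level.name}_ok": True}   # stand-in for real feedback
    return command

print(run_once("follow route to waypoint A"))
```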


Journal of Field Robotics | 2006

Learning in a hierarchical control system: 4D/RCS in the DARPA LAGR program

James S. Albus; Roger V. Bostelman; Tommy Chang; Tsai Hong Hong; William P. Shackleford; Michael O. Shneier

The Defense Advanced Research Projects Agency (DARPA) Learning Applied to Ground Robots (LAGR) program aims to develop algorithms for autonomous vehicle navigation that learn how to operate in complex terrain. Over many years, the National Institute of Standards and Technology (NIST) has developed a reference model control system architecture called 4D/RCS that has been applied to many kinds of robot control, including autonomous vehicle control. For the LAGR program, NIST has embedded learning into a 4D/RCS controller to enable the small robot used in the program to learn to navigate through a range of terrain types. The vehicle learns in several ways. These include learning by example, learning by experience, and learning how to optimize traversal. Learning takes place in the sensory processing, world modeling, and behavior generation parts of the control system. The 4D/RCS architecture is explained in the paper, its application to LAGR is described, and the learning algorithms are discussed. Results are shown of the performance of the NIST control system on independently conducted tests. Further work on the system and its learning capabilities is discussed.
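
One of the learning modes mentioned above, learning by experience, can be illustrated as updating per-cell traversal costs in a world-model grid from what actually happened when the vehicle drove there. The exponential-average update and the cost values below are illustrative assumptions, not the 4D/RCS implementation.

```python
import numpy as np

class CostMap:
    """World-model grid whose traversal costs are learned from experience."""
    def __init__(self, shape, prior_cost=0.5, learning_rate=0.3):
        self.cost = np.full(shape, prior_cost)   # 0 = easy, 1 = impassable
        self.alpha = learning_rate

    def update_from_experience(self, cell, outcome_cost):
        """Blend the observed outcome into the stored cost (exponential average).
        outcome_cost: 0.0 for an easy traverse, 1.0 for a bump or stall."""
        r, c = cell
        self.cost[r, c] = (1 - self.alpha) * self.cost[r, c] + self.alpha * outcome_cost

# The robot drives over cell (4, 7) three times without trouble,
# then bumps an obstacle at cell (4, 8).
grid = CostMap((20, 20))
for _ in range(3):
    grid.update_from_experience((4, 7), outcome_cost=0.0)
grid.update_from_experience((4, 8), outcome_cost=1.0)
print(grid.cost[4, 7], grid.cost[4, 8])   # learned: low cost vs. raised cost
```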


oceans conference | 1993

Stability of an underwater work platform suspended from an unstable reference

Roger V. Bostelman; James S. Albus

A work platform suspended by six cables and controlled by six winches located on a vessel can precisely control the position, velocity, and force of tools, grippers, and machinery in all six degrees of freedom (x, y, z, roll, pitch, and yaw) while the vessel is moving due to sea surface conditions. Based on the Robocrane, the platform can be maneuvered via operator controls (e.g. a joystick) to do work without disturbing the sea bottom, by controlling the cable tensions and the buoyancy of the work platform. Large forces and torques can be exerted with the work platform while it is controlled from a remote position. Many applications are possible, from underwater laying and repair to construction, even in other than mild weather conditions.
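
The stabilization idea amounts to recomputing the six cable lengths each control cycle so that, as the vessel (and with it the winch attachment points) moves, the platform's world-frame pose stays fixed. Below is a minimal sketch under an assumed translation-only vessel-motion model; it reuses the same cable-length geometry as the Robocrane sketch above and is not the controller described in the paper.

```python
import numpy as np

def cable_lengths(winch_pts_world, platform_pts_world):
    """Length of each cable from a winch point to its platform attachment."""
    return np.linalg.norm(winch_pts_world - platform_pts_world, axis=1)

def compensate(vessel_offset, winch_pts_vessel, platform_pts_world):
    """Recompute cable lengths after the vessel translates by vessel_offset,
    so the platform stays at platform_pts_world despite the vessel's motion.
    (Vessel rotation is ignored in this simplified sketch.)"""
    winch_pts_world = winch_pts_vessel + vessel_offset
    return cable_lengths(winch_pts_world, platform_pts_world)

# Placeholder geometry (meters): winch points on the vessel, platform points fixed in the world
winches = np.array([[ 2.0,  1.0, 0.0], [ 2.0, -1.0, 0.0],
                    [-2.0,  1.0, 0.0], [-2.0, -1.0, 0.0],
                    [ 0.0,  2.0, 0.0], [ 0.0, -2.0, 0.0]])
platform = np.array([[ 0.5,  0.5, -10.0], [ 0.5, -0.5, -10.0],
                     [-0.5,  0.5, -10.0], [-0.5, -0.5, -10.0],
                     [ 0.0,  0.8, -10.0], [ 0.0, -0.8, -10.0]])

calm = compensate(np.zeros(3), winches, platform)
heave = compensate(np.array([0.0, 0.0, 0.7]), winches, platform)   # vessel rises 0.7 m
print(heave - calm)   # required change in cable pay-out to hold the platform still
```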


Advanced Robotics | 2008

Robotic Patient Transfer and Rehabilitation Device for Patient Care Facilities or the Home

Roger V. Bostelman; James S. Albus

This paper describes a novel Home Lift, Position and Rehabilitation (HLPR) Chair, designed at the National Institute of Standards and Technology, to provide independent patient mobility for indoor tasks, such as moving to and placing a person on a toilet or bed, and lift assistance for tasks such as accessing kitchen or other tall shelves. These functionalities are currently out of reach of most wheelchair users. One of the design motivations of the HLPR Chair is to reduce back injury, typically an important issue in the care of this group. The HLPR Chair is currently being extended to be an autonomous mobility device to assist cognition by route and trajectory planning. This paper describes the design of the HLPR Chair, its control architecture, and algorithms for autonomous planning and control using its unique kinematics. Also included here is a description of the plan, simulation results, and sensor performance measurements completed in preparation for an autonomous HLPR Chair demonstration, as well as the recent modifications to advance the HLPR Chair toward a commercial system.
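
The route-planning capability mentioned above can be sketched as a standard search over a gridded indoor floor plan. The breadth-first search and the toy occupancy map below are illustrative choices, not the planner used on the HLPR Chair.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over a 4-connected occupancy grid.
    grid[r][c] == 1 means occupied (wall or furniture); returns a list of cells."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    frontier = deque([start])
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:          # walk parents back to the start
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parents):
                parents[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None                              # no route to the goal

# Toy floor plan: 0 = free floor, 1 = wall; plan from one room corner to another
floor = [[0, 0, 0, 1, 0],
         [1, 1, 0, 1, 0],
         [0, 0, 0, 0, 0],
         [0, 1, 1, 1, 0],
         [0, 0, 0, 0, 0]]
print(plan_route(floor, start=(0, 0), goal=(4, 4)))
```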


international symposium on safety, security, and rescue robotics | 2005

3D range imaging for urban search and rescue robotics research

Roger V. Bostelman; Tsai Hong; Raj Madhavan; Brian Weiss

Urban search and rescue (USAR) operations can be extremely dangerous for human rescuers during disaster response. Human task forces, carrying the necessary tools and equipment and having the required skills and techniques, are deployed to rescue victims of structural collapse. Instead of sending human rescuers into such dangerous structures, it is hoped that robots will one day meet the requirements to perform such tasks so that rescuers are not at risk of being hurt or worse. Recently, the National Institute of Standards and Technology, sponsored by the Defense Advanced Research Projects Agency, created reference test arenas that simulate collapsed structures for evaluating the performance of autonomous mobile robots performing USAR tasks. At the same time, the NIST Industrial Autonomous Vehicles Project has been studying advanced 3D range sensors for improved robot safety in warehouses and manufacturing environments. This paper discusses combined applications in which advanced 3D range sensors also show promise for improving robot performance in collapsed-structure navigation and rescue during USAR operations.

Collaboration


Dive into Roger V. Bostelman's collaborations.

Top Co-Authors

Tsai Hong Hong, National Institute of Standards and Technology
William P. Shackleford, National Institute of Standards and Technology
Tommy Chang, National Institute of Standards and Technology
Michael O. Shneier, National Institute of Standards and Technology
Steven Legowik, National Institute of Standards and Technology
Rajmohan Madhavan, National Institute of Standards and Technology
Adam Jacoff, National Institute of Standards and Technology
Nicholas G. Dagalakis, National Institute of Standards and Technology
Tsai H. Hong, National Institute of Standards and Technology