
Publication


Featured research published by Paul Michelman.


international conference on robotics and automation | 1992

Trajectory filtering and prediction for automated tracking and grasping of a moving object

Peter K. Allen; Aleksandar Timcenko; Billibon H. Yoshimi; Paul Michelman

The authors explore the requirements for grasping a moving object. This task requires proper coordination between at least three separate subsystems: real-time vision sensing, trajectory planning/arm control, and grasp planning. As with humans, the system first visually tracks the object's 3-D position. Because the object is in motion, this must be done in real time to coordinate the motion of the robotic arm as it tracks the object. The vision system is used to feed an arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: filtering and prediction, and kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. Experimental results are presented in which a moving model train was tracked, stably grasped, and picked up by the system.
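
To make the filtering-and-prediction step concrete, here is a minimal sketch of one common approach: a constant-velocity alpha-beta filter that smooths noisy position measurements and extrapolates a look-ahead target for the arm. The choice of alpha-beta filtering, the gains, and the update rate are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch (not the paper's exact filter): a constant-velocity
# alpha-beta filter for one Cartesian axis of the tracked object, plus a
# look-ahead prediction an arm controller could use as its target.

class AlphaBetaFilter:
    def __init__(self, alpha=0.85, beta=0.005, dt=1 / 60):
        self.alpha, self.beta, self.dt = alpha, beta, dt
        self.x = 0.0  # filtered position estimate
        self.v = 0.0  # filtered velocity estimate

    def update(self, z):
        """Fold one noisy position measurement z into the track."""
        x_pred = self.x + self.v * self.dt  # propagate the state
        r = z - x_pred                      # innovation (residual)
        self.x = x_pred + self.alpha * r
        self.v = self.v + (self.beta / self.dt) * r
        return self.x

    def predict(self, horizon):
        """Extrapolate the track `horizon` seconds ahead for the arm."""
        return self.x + self.v * horizon
```

One such filter per Cartesian axis would feed the kinematic transformation step, which converts the predicted point into arm joint targets.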


international conference on robotics and automation | 1998

Precision object manipulation with a multifingered robot hand

Paul Michelman

This paper outlines several key issues associated with precision manipulation for robot hands. Precision manipulation is defined as the control of a grasped object using fingertip contacts alone. A set of primitive manipulation functions is defined. They are generalizable in the sense that they take parameters for different object geometry, speed, and direction of motion. A single manipulation can be performed with a number of different grasp topologies. With each manipulation there are several associated computations: 1) the trajectories of the contact points can be calculated a priori from knowledge of the desired object motion; 2) a workspace analysis is performed to determine that the manipulation is within the workspaces of all of the fingers simultaneously; and 3) task partitioning is performed to specify force- and position-controlled directions of the contact points. This partitioning controls the grasping forces on the object during the grasping and manipulation phases. The paper also describes how the primitive manipulations can be combined into complex tasks. The complex example of removing a top from a childproof bottle is presented. The manipulations were implemented on a Utah/MIT dextrous robot-hand system.
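
As an illustration of computation 1) above, the sketch below generates fingertip contact-point trajectories for a desired object rotation by applying the rigid-body transform to the initial contacts. The NumPy formulation, the function names, and the rotation-about-z example are assumptions for illustration, not the paper's code.

```python
# Hypothetical sketch: computing contact-point trajectories a priori from a
# desired object motion, here a rotation by `theta_total` about the object
# z-axis. Contacts are assumed to follow the rigid-body motion.

import numpy as np

def rotation_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def contact_trajectories(contacts, theta_total, steps):
    """Return an array (steps + 1, n_contacts, 3) of contact positions.

    contacts    : (n, 3) initial fingertip contact points (object frame)
    theta_total : total rotation about the object z-axis, radians
    steps       : number of interpolation steps (a speed parameter)
    """
    contacts = np.asarray(contacts, dtype=float)
    traj = []
    for k in range(steps + 1):
        R = rotation_z(theta_total * k / steps)
        traj.append(contacts @ R.T)  # rotate every contact point
    return np.stack(traj)

# Example: three fingertips on a 4 cm cap, rotated 30 degrees.
tips = [[0.02, 0.0, 0.0], [-0.01, 0.017, 0.0], [-0.01, -0.017, 0.0]]
path = contact_trajectories(tips, np.deg2rad(30), steps=50)
```

Each waypoint would then be checked against the fingers' workspaces and partitioned into force- and position-controlled directions, per computations 2) and 3).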


international conference on robotics and automation | 1994

Forming complex dextrous manipulations from task primitives

Paul Michelman; Peter K. Allen

This paper discusses the implementation of complex manipulation tasks with a dextrous hand. The approach used is to build a set of primitive manipulation functions and combine them to form complex tasks. Only fingertip, or precision, manipulations are considered. Each function performs a simple two-dimensional translation or rotation that can be generalized to work with objects of different sizes and using different grasping forces. Complex tasks are sequential combinations of the primitive functions. They are formed by analyzing the workspaces of the individual tasks and controlled by finite state machines. We present a number of examples, including a complex manipulation (removing the top of a child-proof medicine bottle) that incorporates different hybrid position/force specifications of the primitive functions of which it is composed. The work has been implemented with a robot hand system using a Utah-MIT hand.
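
A hedged sketch of how such a finite state machine might sequence primitives for the child-proof bottle example follows; the primitive names (`translate`, `rotate`), the `hand` API, and all parameters are hypothetical stand-ins, not the paper's actual interface.

```python
# Hypothetical sketch: a finite state machine sequencing primitive
# manipulations into the press-twist-lift cycle of a child-proof cap.

def remove_childproof_top(hand, max_attempts=10):
    state = "PRESS"
    attempts = 0
    while state != "DONE":
        if state == "PRESS":
            hand.translate(axis="z", distance=-0.005, force=5.0)  # push cap down
            state = "TWIST"
        elif state == "TWIST":
            hand.rotate(axis="z", angle=-0.5, force=2.0)  # turn while pressing
            state = "CHECK"
        elif state == "CHECK":
            attempts += 1
            if hand.cap_released() or attempts >= max_attempts:
                state = "LIFT"
            else:
                state = "PRESS"  # regrasp and repeat the press-twist cycle
        elif state == "LIFT":
            hand.translate(axis="z", distance=0.02, force=1.0)  # lift cap off
            state = "DONE"
```

Each transition corresponds to a workspace-checked primitive with its own hybrid position/force specification, as the abstract describes.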


intelligent robots and systems | 1994

Shared autonomy in a robot hand teleoperation system

Paul Michelman; Peter K. Allen

This paper considers adding autonomy to robot hands used in teleoperation systems. Currently, the finger positions of robot hands in teleoperation systems are controlled via a robot master using a Dataglove or exoskeleton. There are several difficulties with this approach: accurate calibration is hard to achieve; robot hands have different capabilities from human hands; and complex force reflection is difficult. In this paper we propose a model of hand teleoperation in which the input device commands the motions of a grasped object rather than the joint displacements of the fingers. To achieve this goal, the hand requires greater autonomy and the capability to perform high-level functions with minimal external input. Therefore, a set of general, primitive manipulation functions that can be performed automatically is defined. These elementary functions control simple rotations and translations of the grasped objects. They are incorporated into a teleoperation system by using a simple input device as a control signal. Preliminary implementations with a Utah/MIT hand are discussed.
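
The proposed model might look like the sketch below, where the device commands object motion and the hand realizes it autonomously through the manipulation primitives; the `device` and `hand` interfaces are invented for illustration.

```python
# Hypothetical sketch of the shared-autonomy teleoperation model: the
# operator's input device commands grasped-object motion, not finger joints.

def teleop_step(device, hand, dt=0.02):
    vx, vy, vz, wz = device.read()  # e.g. joystick axes: 3 linear + 1 twist
    if abs(wz) > 1e-3:
        # Rotation request: rotate the grasped object as a whole.
        hand.rotate_object(axis="z", angle=wz * dt)
    elif (vx, vy, vz) != (0.0, 0.0, 0.0):
        hand.translate_object(direction=(vx, vy, vz), distance=dt)
    # Grasp-force regulation stays autonomous inside the primitives,
    # so no calibration of the operator's hand is needed.
```

This sidesteps the calibration and force-reflection difficulties the abstract lists, since the operator never commands joint displacements directly.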


international conference on robotics and automation | 1989

An integrated system for dextrous manipulation

Peter K. Allen; Paul Michelman; Kenneth S. Roberts

The authors describe an integrated system for dextrous manipulation using a Utah-MIT hand that makes it possible to look at the higher levels of control in a number of grasping and manipulation tasks. The system consists of a number of low-level system primitives for integrated hand and robotic arm movement, tactile sensors mounted on the fingertips, sensing primitives that utilize joint position, tendon force, and tactile array feedback, and a high-level programming environment that allows task-level scripts to be created. Grasping and manipulation tasks that have been implemented with this system are described.
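
As one example of what a sensing primitive combining these feedback sources might look like, here is a hypothetical guarded-close routine; the hand API, method names, and thresholds are assumptions, not the system's actual calls.

```python
# Hypothetical sketch of a sensing primitive: close the fingers in small
# increments until either tendon forces or the fingertip tactile arrays
# report contact, then freeze the grasp.

def guarded_close(hand, force_thresh=2.0, tactile_thresh=0.1):
    while True:
        hand.step_close(increment=0.002)   # small joint increments
        forces = hand.tendon_forces()      # per-finger tendon readings
        taxels = hand.tactile_arrays()     # fingertip pressure arrays
        if max(forces) > force_thresh or max(map(max, taxels)) > tactile_thresh:
            return hand.freeze()           # stop: contact achieved
```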


international conference on robotics and automation | 1993

Compliant manipulation with a dextrous robot hand

Paul Michelman; Peter K. Allen

The control of precise, compliant manipulation tasks with multifingered robots is discussed. Emphasis is placed on performing manipulations of grasped objects that are themselves undergoing compliant motion. This class of manipulations includes common tasks such as using tools, writing, and sliding an object on a surface. A task-level formulation is presented and illustrated. Results of experiments are presented to demonstrate the feasibility of performing precision manipulations with a dextrous hand.
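
A minimal sketch of a hybrid position/force step for one such task, sliding a grasped object on a surface, follows: force is regulated along the surface normal while position is controlled in the tangent plane. The selection-matrix formulation, gains, and function names are illustrative assumptions, not the paper's formulation.

```python
# Hypothetical sketch: one hybrid position/force control step for a grasped
# object sliding on a surface with unit normal `normal`.

import numpy as np

def hybrid_command(x, x_des, f, f_des, normal, kp=100.0, kf=0.02):
    """Return a Cartesian velocity command for the grasped object."""
    n = np.asarray(normal, float) / np.linalg.norm(normal)
    S = np.outer(n, n)   # selects the force-controlled (normal) direction
    I = np.eye(3)
    # Position control in the tangent plane, force control along the normal.
    v_pos = kp * ((I - S) @ (np.asarray(x_des) - np.asarray(x)))
    v_frc = kf * (S @ (np.asarray(f_des) - np.asarray(f)))
    return v_pos + v_frc
```

The selection matrix is what makes the motion compliant: errors along the normal are corrected by force, never by position, so the object stays in controlled contact with the surface.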


systems man and cybernetics | 1990

A system for programming and controlling a multisensor robotic hand

Peter K. Allen; Paul Michelman; Kenneth S. Roberts

A system for programming and controlling a multisensor robotic hand (Utah-MIT Hand) is described. Using this system, a number of autonomous tasks that are easily programmed and include combinations of hand-arm actuation with force, position, and tactile sensing have been implemented. The system is controlled at the software level by a programming language, DIAL, that provides an easy method for expressing the parallel operation of robotic devices. It also provides a convenient way to implement task-level scripts that can then be bound to particular sensors, actuators, and methods for accomplishing a generic grasping or manipulation task. Experiments using the system to pick up and pour from a pitcher, unscrew a lightbulb, and explore planar surfaces are presented.
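
DIAL's actual notation is not reproduced here; the sketch below expresses the same idea, parallel tracks of device actions joined into a task-level script, in plain Python with hypothetical `arm` and `hand` objects.

```python
# Illustrative sketch of DIAL-style parallel device operation, written as
# plain Python threads. The arm/hand objects and their methods are invented.

from threading import Thread

def in_parallel(*actions):
    """Run zero-argument callables concurrently and wait for all of them."""
    threads = [Thread(target=a) for a in actions]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

# A task-level script: preshape the hand while the arm approaches the
# pitcher, then grasp, then pour by rotating the wrist.
def pour_from_pitcher(arm, hand):
    in_parallel(lambda: arm.move_to("pitcher_approach"),
                lambda: hand.preshape("wrap_grasp"))
    hand.close(force_limit=8.0)
    arm.rotate_wrist(angle=1.6)   # pour
    arm.rotate_wrist(angle=-1.6)  # return upright
```

The value of the script abstraction, as the abstract notes, is that the same task outline can be rebound to different sensors and actuators.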


[1989] Proceedings. Workshop on Interpretation of 3D Scenes | 1989

Acquisition and interpretation of 3-D sensor data from touch

Peter K. Allen; Paul Michelman

A description is given of the use of touch sensing as part of a larger system the authors are building for 3-D shape recovery and object recognition using touch and vision methods. The authors focus on three exploratory procedures (EPs) they have built to acquire and interpret sparse 3-D touch data: grasping by containment, planar surface exploration, and surface contour exploration. Experimental results for each of these procedures are presented. The EPs can be used in a coarse-to-fine sensing strategy that tries to build shape descriptions at a number of levels. An important feature of this system is the multiple representations used in recovering and reasoning about shape.
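
As an illustration of the planar surface exploration EP, the sketch below drags a force-controlled fingertip across a surface, collects sparse contact points, and fits a plane by least squares; the `finger` API is hypothetical, while the plane fit is standard NumPy.

```python
# Hypothetical sketch of a planar-surface exploratory procedure: maintain
# light contact, slide, record sparse contact points, then fit a plane.

import numpy as np

def explore_planar_surface(finger, n_points=20, step=0.01):
    pts = []
    for _ in range(n_points):
        finger.press(force=0.5)             # maintain light contact
        pts.append(finger.contact_point())  # (x, y, z) from tactile/joint data
        finger.slide(distance=step)         # move tangentially along surface
    P = np.asarray(pts)
    # Fit z = a*x + b*y + c; the recovered plane normal is (a, b, -1).
    A = np.column_stack([P[:, 0], P[:, 1], np.ones(len(P))])
    (a, b, c), *_ = np.linalg.lstsq(A, P[:, 2], rcond=None)
    return np.array([a, b, -1.0]), c
```

The sparse points and fitted plane are exactly the kind of coarse description a coarse-to-fine strategy would refine with further EPs.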


IEEE Computer | 1989

Current research in robotics and automation-an intelligent grasping system

Peter K. Allen; Paul Michelman; Kenneth S. Roberts

A research project is described that focuses on building a comprehensive grasping environment capable of performing tasks such as locating moving objects and picking them up, manipulating man-made objects such as tools, and recognizing unknown objects through touch. In addition, an integrated programming environment is being designed that will allow grasping and manipulation primitives to be used within an overall robotic control and programming system that includes dextrous hands, vision sensors, and multiple-degree-of-freedom manipulators. A system overview is given, and the applications are discussed.


Sensor Fusion III: 3D Perception and Recognition | 1991

Hand-eye coordination for grasping moving objects

Peter K. Allen; Billibon H. Yoshimi; Alexander Timcenko; Paul Michelman

Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination between at least 3 separate subsystems: dynamic vision sensing, real-time arm control, and grasp control. As with humans, our system first visually tracks the object’s 3-D position. Because the object is in motion, this must be done in a dynamic manner to coordinate the motion of the robotic arm as it tracks the object. The dynamic vision system is used to feed a real-time arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. We present 3 different strategies for intercepting the object and results from the tracking algorithm.
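
The abstract does not spell out the three interception strategies, so the sketch below shows one natural possibility, a fixed-point iteration that finds a rendezvous point where the arm and a constant-velocity target arrive at the same time; it is an assumption for illustration, not necessarily one of the paper's strategies.

```python
# Hypothetical sketch: iterate on a rendezvous point so the arm and a
# constant-velocity object arrive at the same place simultaneously.

import numpy as np

def interception_point(obj_pos, obj_vel, arm_pos, arm_speed, iters=10):
    """Fixed-point iteration for intercepting a constant-velocity target."""
    obj_pos, obj_vel, arm_pos = map(np.asarray, (obj_pos, obj_vel, arm_pos))
    t = 0.0
    for _ in range(iters):
        target = obj_pos + obj_vel * t                     # object at time t
        t = np.linalg.norm(target - arm_pos) / arm_speed   # arm travel time
    return obj_pos + obj_vel * t, t
```

In practice the predicted velocity would come from the filtering-and-prediction stage described above, and the returned point would be re-solved every vision update as the track improves.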
