Matei T. Ciocarlie
Columbia University
Publications
Featured research published by Matei T. Ciocarlie.
The International Journal of Robotics Research | 2009
Matei T. Ciocarlie; Peter K. Allen
In this paper we focus on the concept of low-dimensional posture subspaces for artificial hands. We begin by discussing the applicability of a hand configuration subspace to the problem of automated grasp synthesis; our results show that low-dimensional optimization can be instrumental in deriving effective pre-grasp shapes for a number of complex robotic hands. We then show that the computational advantages of using a reduced dimensionality framework enable it to serve as an interface between the human and automated components of an interactive grasping system. We present an on-line grasp planner that allows a human operator to perform dexterous grasping tasks using an artificial hand. In order to achieve the computational rates required for effective user interaction, grasp planning is performed in a hand posture subspace of highly reduced dimensionality. The system also uses real-time input provided by the operator, further simplifying the search for stable grasps to the point where solutions can be found at interactive rates. We demonstrate our approach on a number of different hand models and target objects, in both real and virtual environments.
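The core idea above can be illustrated with a small sketch (not the authors' code; names, dimensions, and the grid search are illustrative assumptions): joint-angle samples are reduced with PCA to a two-dimensional posture subspace, and pre-grasp shapes are then searched over the resulting amplitude coordinates instead of the full joint space.

```python
# Illustrative sketch of planning in a low-dimensional hand posture subspace.
import numpy as np

def build_posture_subspace(joint_samples, n_dims=2):
    """joint_samples: (N, D) array of hand joint angles; returns mean and basis."""
    mean = joint_samples.mean(axis=0)
    centered = joint_samples - mean
    # Principal directions of variation in hand posture
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_dims]                 # basis: (n_dims, D)

def amplitudes_to_joints(amplitudes, mean, basis):
    """Map low-dimensional amplitudes back to a full joint configuration."""
    return mean + amplitudes @ basis

def plan_pregrasp(joint_samples, quality_fn, grid=np.linspace(-1.0, 1.0, 11)):
    """Hypothetical search over a 2-D amplitude grid, scored by a supplied
    grasp-quality function (e.g., a contact-based quality metric)."""
    mean, basis = build_posture_subspace(joint_samples, n_dims=2)
    best = None
    for a1 in grid:
        for a2 in grid:
            q = amplitudes_to_joints(np.array([a1, a2]), mean, basis)
            score = quality_fn(q)
            if best is None or score > best[0]:
                best = (score, q)
    return best
```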
International Conference on Robotics and Automation | 2009
Corey Goldfeder; Matei T. Ciocarlie; Hao Dang; Peter K. Allen
Collecting grasp data for learning and benchmarking purposes is very expensive. It would be helpful to have a standard database of graspable objects, along with a set of stable grasps for each object, but no such database exists. In this work we show how to automate the construction of a database consisting of several hands, thousands of objects, and hundreds of thousands of grasps. Using this database, we demonstrate a novel grasp planning algorithm that exploits geometric similarity between a 3D model and the objects in the database to synthesize form closure grasps. Our contributions are this algorithm, and the database itself, which we are releasing to the community as a tool for both grasp planning and benchmarking.
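A minimal sketch of the data-driven retrieval step, assuming a simplified data layout rather than the released database format: precomputed grasps are collected from the database objects whose shape descriptors are nearest to the query, then re-evaluated on the query model.

```python
# Rough sketch of similarity-based grasp retrieval (placeholder data layout).
import numpy as np

class GraspDatabase:
    def __init__(self, descriptors, grasps):
        # descriptors: (N, K) shape descriptors for N database objects
        # grasps: list of N lists of precomputed grasp parameters
        self.descriptors = np.asarray(descriptors)
        self.grasps = grasps

    def neighbors(self, query_descriptor, k=3):
        d = np.linalg.norm(self.descriptors - query_descriptor, axis=1)
        return np.argsort(d)[:k]

def plan_from_database(db, query_descriptor, evaluate_on_query, k=3):
    """Collect grasps from the k most similar objects and keep the best ones.
    evaluate_on_query is assumed to test, e.g., form closure on the query model."""
    candidates = []
    for idx in db.neighbors(query_descriptor, k):
        candidates.extend(db.grasps[idx])
    scored = [(evaluate_on_query(g), g) for g in candidates]
    scored.sort(key=lambda t: t[0], reverse=True)
    return [g for _, g in scored[:10]]
```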
Intelligent Robots and Systems | 2010
Kaijen Hsiao; Sachin Chitta; Matei T. Ciocarlie; E. Gil Jones
Robotic grasping in unstructured environments requires the ability to select grasps for unknown objects and execute them while dealing with uncertainty due to sensor noise or calibration errors. In this work, we propose a simple but robust approach to grasp selection for unknown objects, and a reactive adjustment approach to deal with uncertainty in object location and shape. The grasp selection method uses 3D sensor data directly to determine a ranked set of grasps for objects in a scene, using heuristics based on both the overall shape of the object and its local features. The reactive grasping approach uses tactile feedback from fingertip sensors to execute a compliant robust grasp. We present experimental results to validate our approach by grasping a wide range of unknown objects. Our results show that reactive grasping can correct for a fair amount of uncertainty in the measured position or shape of the objects, and that our grasp selection approach is successful in grasping objects with a variety of shapes.
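The two components described above can be sketched as follows; the scoring terms, weights, and hand interface are illustrative stand-ins rather than the heuristics or drivers used in the paper.

```python
# Simplified sketch: heuristic grasp ranking from a point cloud, plus a
# reactive closing loop driven by fingertip contact feedback.
import numpy as np

def rank_grasps(candidates, cloud):
    """Score candidates using overall-shape and local-feature cues (stand-ins)."""
    centroid = cloud.mean(axis=0)
    scored = []
    for g in candidates:
        # Overall shape: prefer grasps near the object's centroid
        global_score = -np.linalg.norm(g.position - centroid)
        # Local features: prefer clear approach and a graspable local width
        local_score = g.clearance - abs(g.local_width - g.hand_opening)
        scored.append((global_score + local_score, g))
    return [g for _, g in sorted(scored, key=lambda t: t[0], reverse=True)]

def reactive_close(hand, max_steps=100):
    """Close fingers until fingertip sensors report contact, then tighten.
    `hand` is a hypothetical interface exposing fingers and tactile readings."""
    for _ in range(max_steps):
        contacts = hand.read_fingertip_contacts()
        if all(contacts):
            break
        for finger, touching in zip(hand.fingers, contacts):
            if not touching:
                finger.close_increment()   # only move fingers not yet in contact
    hand.apply_grasp_force()
```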
Intelligent Robots and Systems | 2007
Matei T. Ciocarlie; Corey Goldfeder; Peter K. Allen
In this paper, we build upon recent advances in neuroscience research which have shown that control of the human hand during grasping is dominated by movement in a configuration space of highly reduced dimensionality. We extend this concept to robotic hands and show how a similar dimensionality reduction can be defined for a number of different hand models. This framework can be used to derive planning algorithms that produce stable grasps even for highly complex hand designs. Furthermore, it offers a unified approach for controlling different hands, even if the kinematic structures of the models are significantly different. We illustrate these concepts by building a comprehensive grasp planner that can be used on a large variety of robotic hands under various constraints.
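The unified-interface idea can be shown with a short sketch (hand models and subspace bases below are placeholders): the same low-dimensional command drives hands with very different kinematics through per-hand subspace definitions.

```python
# Minimal sketch of one low-dimensional command space shared by several hands.
import numpy as np

class HandSubspace:
    def __init__(self, origin, basis):
        self.origin = np.asarray(origin)   # "neutral" joint posture, length D_hand
        self.basis = np.asarray(basis)     # (d, D_hand) subspace directions

    def to_joints(self, amplitudes):
        return self.origin + np.asarray(amplitudes) @ self.basis

# Two hands with different joint counts share the same 2-D command space
# (random bases here are placeholders for hand-specific subspace definitions).
barrett = HandSubspace(origin=np.zeros(4),  basis=np.random.rand(2, 4))
shadow  = HandSubspace(origin=np.zeros(20), basis=np.random.rand(2, 20))

command = [0.6, -0.2]                      # one planner output, two hands
q_barrett = barrett.to_joints(command)
q_shadow  = shadow.to_joints(command)
```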
International Symposium on Experimental Robotics | 2014
Matei T. Ciocarlie; Kaijen Hsiao; Edward Gil Jones; Sachin Chitta; Radu Bogdan Rusu; Ioan A. Şucan
We present a complete software architecture for reliable grasping of household objects. Our work combines aspects such as scene interpretation from 3D range data, grasp planning, motion planning, and grasp failure identification and recovery using tactile sensors. We build upon, and add several new contributions to the significant prior work in these areas. A salient feature of our work is the tight coupling between perception (both visual and tactile) and manipulation, aiming to address the uncertainty due to sensor and execution errors. This integration effort has revealed new challenges, some of which can be addressed through system and software engineering, and some of which present opportunities for future research. Our approach is aimed at typical indoor environments, and is validated by long running experiments where the PR2 robotic platform was able to consistently grasp a large variety of known and unknown objects. The set of tools and algorithms for object grasping presented here have been integrated into the open-source Robot Operating System (ROS).
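A high-level sketch of the pipeline stages described above, written with plain Python placeholders rather than the actual ROS interfaces released with this work:

```python
# Condensed pipeline: perception -> grasp planning -> motion planning ->
# execution with tactile failure detection and recovery.
def pick_object(sensors, perception, grasp_planner, motion_planner, arm, hand):
    # 1. Scene interpretation: segment objects from 3D range data
    cloud = sensors.get_point_cloud()
    target = perception.segment(cloud)[0]

    # 2. Grasp planning on the target (known model or raw cluster)
    for grasp in grasp_planner.plan(target):
        # 3. Motion planning to the pre-grasp pose
        trajectory = motion_planner.plan(arm.current_state(), grasp.pregrasp_pose)
        if trajectory is None:
            continue                       # unreachable, try the next grasp
        arm.execute(trajectory)

        # 4. Grasp execution with tactile failure identification and recovery
        hand.close()
        if hand.fingertip_contacts_consistent_with(grasp):
            return True                    # grasp succeeded
        hand.open()                        # recover and try the next candidate
    return False
```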
Human-Robot Interaction | 2012
Adam Leeper; Kaijen Hsiao; Matei T. Ciocarlie; Leila Takayama; David Gossow
Human-in-the-loop robotic systems have the potential to handle complex tasks in unstructured environments, by combining the cognitive skills of a human operator with autonomous tools and behaviors. Along these lines, we present a system for remote human-in-the-loop grasp execution. An operator uses a computer interface to visualize a physical robot and its surroundings, and a point-and-click mouse interface to command the robot. We implemented and analyzed four different strategies for performing grasping tasks, ranging from direct, real-time operator control of the end-effector pose, to autonomous motion and grasp planning that is simply adjusted or confirmed by the operator. Our controlled experiment (N=48) results indicate that people were able to successfully grasp more objects and caused fewer unwanted collisions when using the strategies with more autonomous assistance. We used an untethered robot over wireless communications, making our strategies applicable for remote, human-in-the-loop robotic applications.
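A toy sketch of the spectrum of assistance levels compared in the study; the strategy names are paraphrased and the interfaces are placeholders, not the system's actual components.

```python
# Dispatcher over the four (paraphrased) levels of operator assistance.
def execute_grasp(strategy, operator, robot, planner):
    if strategy == "direct_teleop":
        # Operator streams end-effector pose commands in real time
        robot.follow_pose_stream(operator.pose_commands())
    elif strategy == "operator_specified_grasp":
        # Operator clicks a grasp pose; robot plans and executes the motion
        pose = operator.pick_grasp_pose()
        robot.execute(planner.plan_to(pose))
    elif strategy == "adjusted_autonomy":
        # Planner proposes a grasp; operator adjusts it before execution
        pose = operator.adjust(planner.propose_grasp())
        robot.execute(planner.plan_to(pose))
    elif strategy == "confirmed_autonomy":
        # Planner proposes a grasp; operator only confirms or rejects
        pose = planner.propose_grasp()
        if operator.confirm(pose):
            robot.execute(planner.plan_to(pose))
```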
Intelligent Robots and Systems | 2009
Corey Goldfeder; Matei T. Ciocarlie; Jaime Peretzman; Hao Dang; Peter K. Allen
To grasp a novel object, we can index it into a database of known 3D models and use precomputed grasp data for those models to suggest a new grasp. We refer to this idea as data-driven grasping, and we have previously introduced the Columbia Grasp Database for this purpose. In this paper we demonstrate a data-driven grasp planner that requires only partial 3D data of an object in order to grasp it. To achieve this, we introduce a new shape descriptor for partial 3D range data, along with an alignment method that can rigidly register partial 3D models to models that are globally similar but not identical. Our method uses SIFT features of depth images, and encapsulates “nearby” views of an object in a compact shape descriptor.
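A loose sketch of the "SIFT features of depth images" idea; the aggregation into a single fixed-length vector below is a simple stand-in, not the shape descriptor or alignment method defined in the paper.

```python
# Compute SIFT features on a depth image and collapse them into a compact
# vector for nearest-neighbor lookup against a database of known models.
import cv2
import numpy as np

def depth_image_descriptor(depth_image):
    """depth_image: float array of range values; returns a fixed-length vector."""
    # Normalize depth to 8-bit so SIFT can operate on it like a grayscale image
    valid = np.isfinite(depth_image)
    d = np.zeros(depth_image.shape, dtype=np.float64)
    d[valid] = depth_image[valid]
    img = cv2.normalize(d, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    sift = cv2.SIFT_create()
    _, descriptors = sift.detectAndCompute(img, None)
    if descriptors is None:
        return np.zeros(128)
    # Mean of keypoint descriptors: a crude stand-in for the paper's descriptor
    return descriptors.mean(axis=0)

def most_similar(query_descriptor, database_descriptors):
    dists = np.linalg.norm(database_descriptors - query_descriptor, axis=1)
    return int(np.argmin(dists))
```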
IEEE Robotics & Automation Magazine | 2013
Tiffany L. Chen; Matei T. Ciocarlie; Steve Cousins; Phillip M. Grice; Kelsey P. Hawkins; Kaijen Hsiao; Charles C. Kemp; Chih-Hung King; Daniel A. Lazewatsky; Adam Leeper; Hai Nguyen; Andreas Paepcke; Caroline Pantofaru; William D. Smart; Leila Takayama
Assistive mobile manipulators (AMMs) have the potential to one day serve as surrogates and helpers for people with disabilities, giving them the freedom to perform tasks such as scratching an itch, picking up a cup, or socializing with their families.
Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2007
Matei T. Ciocarlie; Claire Lackner; Peter K. Allen
This paper presents a method for building analytical contact models for soft fingers. Friction constraints are derived based on general expressions for non-planar contacts of elastic bodies, taking into account the local geometry and structure of the objects in contact. These constraints are then formulated as a linear complementarity problem, the solution of which provides the normal and frictional forces applied at each contact, as well as the relative velocity of the bodies involved. This approach captures frictional effects such as coupling between tangential force and frictional torque. We illustrate this method by analyzing manipulation tasks performed by an anthropomorphic robotic hand equipped with soft fingerpads.
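For context, a generic sketch of solving a linear complementarity problem of the form w = Mz + q, w ≥ 0, z ≥ 0, zᵀw = 0 with projected Gauss-Seidel; constructing M and q from the soft-finger contact geometry is the substance of the paper and is not reproduced here.

```python
# Projected Gauss-Seidel iteration for a generic LCP (assumes M has a
# positive diagonal); z typically collects contact force magnitudes and
# w = M z + q the corresponding relative velocities.
import numpy as np

def solve_lcp_pgs(M, q, iterations=200):
    n = len(q)
    z = np.zeros(n)
    for _ in range(iterations):
        for i in range(n):
            # Residual for row i excluding the diagonal term
            r = q[i] + M[i] @ z - M[i, i] * z[i]
            # Complementarity projection: keep z_i >= 0
            z[i] = max(0.0, -r / M[i, i])
    return z
```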
IEEE Robotics & Automation Magazine | 2012
Sachin Chitta; Edward Gil Jones; Matei T. Ciocarlie; Kaijen Hsiao
Unstructured human environments present a substantial challenge to effective robotic operation. Mobile manipulation in human environments requires dealing with novel unknown objects, cluttered workspaces, and noisy sensor data. We present an approach to mobile pick and place in such environments using a combination of two-dimensional (2-D) and three-dimensional (3-D) visual processing, tactile and proprioceptive sensor data, fast motion planning, reactive control and monitoring, and reactive grasping. We demonstrate our approach by using a two-arm mobile manipulation system to pick and place objects. Reactive components allow our system to account for uncertainty arising from noisy sensors, inaccurate perception (e.g., object detection or registration), or dynamic changes in the environment. We also present a set of tools that allows our system to be easily configured within a short time for a new robotic system.
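A condensed sketch of the pick-and-place loop with reactive monitoring, using placeholder interfaces; the real system is a distributed ROS application with far more structure than shown here.

```python
# Pick-and-place with re-perception on each attempt and a tactile/
# proprioceptive check before committing to the place motion.
def pick_and_place(perception, planner, arm, gripper, place_pose, max_attempts=3):
    for _ in range(max_attempts):
        # Re-run 2-D/3-D perception each attempt so the plan reflects the
        # current scene rather than a stale detection
        target = perception.detect_target()
        grasp = planner.select_grasp(target)

        arm.execute(planner.plan_to(grasp.pregrasp_pose))
        gripper.close()

        # Reactive monitoring: verify the object is actually held
        if not gripper.holding_object():
            gripper.open()
            continue

        arm.execute(planner.plan_to(place_pose))
        gripper.open()
        return True
    return False
```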