Network


Latest external collaborations at the country level. Click the dots to dive into the details.

Hotspot


Dive into the research topics where Sachin Chitta is active.

Publication


Featured research published by Sachin Chitta.


International Conference on Robotics and Automation | 2011

STOMP: Stochastic trajectory optimization for motion planning

Mrinal Kalakrishnan; Sachin Chitta; Evangelos A. Theodorou; Peter Pastor; Stefan Schaal

We present a new approach to motion planning using a stochastic trajectory optimization framework. The approach relies on generating noisy trajectories to explore the space around an initial (possibly infeasible) trajectory, which are then combined to produce an updated trajectory with lower cost. A cost function based on a combination of obstacle and smoothness costs is optimized in each iteration. No gradient information is required for the particular optimization algorithm that we use, so general costs for which derivatives may not be available (e.g., costs corresponding to constraints and motor torques) can be included in the cost function. We demonstrate the approach both in simulation and on a mobile manipulation system for unconstrained and constrained tasks. We experimentally show that the stochastic nature of STOMP allows it to overcome local minima that gradient-based methods like CHOMP can get stuck in.
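The update described above can be sketched in a few lines: sample noisy copies of the current trajectory, score them, and combine the noise with exponentiated-cost weights so that no gradients are needed. This is a toy 1-D illustration, not the paper's implementation; the cost function, weights, and parameters are all illustrative.

```python
import math
import random

def stomp_step(theta, cost_fn, rng, num_rollouts=32, noise_std=0.2, temperature=10.0):
    """One STOMP-like iteration: sample noisy trajectories around theta and
    combine the noise with exponentiated-cost weights (no gradients needed)."""
    n = len(theta)
    rollouts, costs = [], []
    for _ in range(num_rollouts):
        # Endpoints stay fixed; only interior waypoints are perturbed.
        eps = [0.0] + [rng.gauss(0.0, noise_std) for _ in range(n - 2)] + [0.0]
        rollouts.append(eps)
        costs.append(cost_fn([t + e for t, e in zip(theta, eps)]))
    c_min, c_max = min(costs), max(costs)
    span = max(c_max - c_min, 1e-12)
    # Lower-cost rollouts get exponentially larger weight.
    weights = [math.exp(-temperature * (c - c_min) / span) for c in costs]
    w_sum = sum(weights)
    delta = [sum(w * r[i] for w, r in zip(weights, rollouts)) / w_sum
             for i in range(n)]
    return [t + d for t, d in zip(theta, delta)]

# Toy cost: smoothness plus a penalty for passing near an obstacle at 0.5.
def cost(traj):
    smooth = sum((b - a) ** 2 for a, b in zip(traj, traj[1:]))
    near = sum(max(0.0, 0.2 - abs(x - 0.5)) for x in traj)
    return smooth + 10.0 * near

rng = random.Random(0)
traj = [i / 9 for i in range(10)]   # straight line from 0 to 1
best = traj
for _ in range(100):
    traj = stomp_step(traj, cost, rng)
    if cost(traj) < cost(best):
        best = traj
```

Because the update is a weighted average over sampled perturbations, it can escape shallow local minima that a pure gradient step would fall into.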


IEEE Transactions on Robotics | 2011

Human-Inspired Robotic Grasp Control With Tactile Sensing

Joseph M. Romano; Kaijen Hsiao; Günter Niemeyer; Sachin Chitta; Katherine J. Kuchenbecker

We present a novel robotic grasp controller that allows a sensorized parallel jaw gripper to gently pick up and set down unknown objects once a grasp location has been selected. Our approach is inspired by the control scheme that humans employ for such actions, which is known to centrally depend on tactile sensation rather than vision or proprioception. Our controller processes measurements from the gripper's fingertip pressure arrays and hand-mounted accelerometer in real time to generate robotic tactile signals that are designed to mimic human SA-I, FA-I, and FA-II channels. These signals are combined into tactile event cues that drive the transitions between six discrete states in the grasp controller: Close, Load, Lift and Hold, Replace, Unload, and Open. The controller selects an appropriate initial grasping force, detects when an object is slipping from the grasp, increases the grasp force as needed, and judges when to release an object to set it down. We demonstrate the promise of our approach through implementation on the PR2 robotic platform, including grasp testing on a large number of real-world objects.
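The six-state controller can be pictured as a small event-driven state machine. The sketch below uses the state names from the abstract, but the event names, transition table, and force-update rule are hypothetical placeholders for the tactile cues the paper derives from its SA-I/FA-I/FA-II-like signals.

```python
# Hypothetical tactile events driving transitions between the six states.
TRANSITIONS = {
    ("Close", "contact"): "Load",             # fingertips touch the object
    ("Load", "stable_grip"): "LiftHold",      # target grip force reached
    ("LiftHold", "place_request"): "Replace",
    ("Replace", "surface_contact"): "Unload", # vibration cue on table contact
    ("Unload", "force_released"): "Open",
}

class GraspController:
    def __init__(self):
        self.state = "Close"
        self.grip_force = 1.0  # hypothetical initial force, arbitrary units

    def on_event(self, event):
        # Slip while holding: tighten the grip but stay in the same state.
        if self.state == "LiftHold" and event == "slip":
            self.grip_force *= 1.25
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

A pick-and-place run is then just a stream of tactile events pushed through `on_event`, with the slip cue raising the grip force mid-hold.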


Intelligent Robots and Systems | 2010

Contact-reactive grasping of objects with partial shape information

Kaijen Hsiao; Sachin Chitta; Matei T. Ciocarlie; E. Gil Jones

Robotic grasping in unstructured environments requires the ability to select grasps for unknown objects and execute them while dealing with uncertainty due to sensor noise or calibration errors. In this work, we propose a simple but robust approach to grasp selection for unknown objects, and a reactive adjustment approach to deal with uncertainty in object location and shape. The grasp selection method uses 3D sensor data directly to determine a ranked set of grasps for objects in a scene, using heuristics based on both the overall shape of the object and its local features. The reactive grasping approach uses tactile feedback from fingertip sensors to execute a compliant robust grasp. We present experimental results to validate our approach by grasping a wide range of unknown objects. Our results show that reactive grasping can correct for a fair amount of uncertainty in the measured position or shape of the objects, and that our grasp selection approach is successful in grasping objects with a variety of shapes.
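The grasp-selection idea, ranking candidates with heuristics that mix an overall-shape cue and a local-feature cue, can be sketched as a simple scoring function. The field names, cues, and weights below are illustrative inventions, not the paper's actual heuristics.

```python
def rank_grasps(candidates, centroid):
    """Rank grasp candidates by a toy heuristic: prefer grasps whose center
    is close to the object centroid (overall-shape cue) and that enclose
    many sensed 3D points (local-feature cue). Weights are illustrative."""
    def score(g):
        dist = sum((a - b) ** 2 for a, b in zip(g["position"], centroid)) ** 0.5
        return g["support_points"] - 50.0 * dist
    return sorted(candidates, key=score, reverse=True)

# Two hypothetical candidates from a segmented point-cloud cluster.
candidates = [
    {"name": "top",  "position": (0.0, 0.0, 0.10), "support_points": 40},
    {"name": "side", "position": (0.05, 0.0, 0.05), "support_points": 25},
]
ranked = rank_grasps(candidates, centroid=(0.0, 0.0, 0.05))
```

In the paper the ranked grasps are then executed reactively, with fingertip tactile feedback correcting for pose and shape uncertainty during the close.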


International Conference on Robotics and Automation | 2011

Skill learning and task outcome prediction for manipulation

Peter Pastor; Mrinal Kalakrishnan; Sachin Chitta; Evangelos A. Theodorou; Stefan Schaal

Learning complex motor skills for real world tasks is a hard problem in robotic manipulation that often requires painstaking manual tuning and design by a human expert. In this work, we present a Reinforcement Learning based approach to acquiring new motor skills from demonstration. Our approach allows the robot to learn fine manipulation skills and significantly improve its success rate and skill level starting from a possibly coarse demonstration. Our approach aims to incorporate task domain knowledge, where appropriate, by working in a space consistent with the constraints of a specific task. In addition, we also present an approach to using sensor feedback to learn a predictive model of the task outcome. This allows our system to learn the proprioceptive sensor feedback needed to monitor subsequent executions of the task online and abort execution in the event of predicted failure. We illustrate our approach using two example tasks executed with the PR2 dual-arm robot: a straight and accurate pool stroke and a box flipping task using two chopsticks as tools.
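The outcome-prediction idea, learning what nominal sensor feedback looks like and aborting when an execution deviates, can be sketched with a per-timestep statistical model. This is a minimal stand-in, assuming scalar sensor traces of equal length; the paper's actual model and thresholds differ.

```python
import statistics

def fit_outcome_model(success_traces):
    """Per-timestep mean and stdev of a proprioceptive signal,
    estimated from successful executions of the task."""
    model = []
    for t in range(len(success_traces[0])):
        vals = [trace[t] for trace in success_traces]
        model.append((statistics.mean(vals), statistics.pstdev(vals)))
    return model

def monitor(trace, model, k=3.0, floor=1e-3):
    """Return the first timestep whose reading deviates more than k sigma
    from the successful executions (predicted failure), or None."""
    for t, (x, (mu, sd)) in enumerate(zip(trace, model)):
        if abs(x - mu) > k * max(sd, floor):
            return t
    return None

# Hypothetical force readings from three successful pool strokes.
successes = [[1.0, 2.0, 3.0], [1.1, 2.1, 3.1], [0.9, 1.9, 2.9]]
model = fit_outcome_model(successes)
```

During a new execution, `monitor` runs online; a non-`None` result is the cue to abort before the task visibly fails.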


International Conference on Robotics and Automation | 2012

FCL: A general purpose library for collision and proximity queries

Jia Pan; Sachin Chitta; Dinesh Manocha

We present a new collision and proximity library that integrates several techniques for fast and accurate collision checking and proximity computation. Our library is based on hierarchical representations and designed to perform multiple proximity queries on different model representations. The set of queries includes discrete collision detection, continuous collision detection, separation distance computation, and penetration depth estimation. The input models may correspond to triangulated rigid or deformable models and articulated models. Moreover, FCL can perform probabilistic collision checking between noisy point clouds that are captured using cameras or LIDAR sensors. The main benefit of FCL lies in the fact that it provides a unified interface that can be used by various applications. Furthermore, its flexible architecture makes it easier to implement new algorithms within this framework. The runtime performance of the library is comparable to state-of-the-art collision and proximity algorithms. We demonstrate its performance on synthetic datasets as well as motion planning and grasping computations performed using a two-armed mobile manipulation robot.
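The query types FCL unifies can be illustrated with a toy example on spheres. This is not FCL's API; it is a minimal sketch of three of the query types named above (discrete collision, separation distance / penetration depth, and a naively sampled continuous collision check).

```python
import math

class Sphere:
    def __init__(self, center, radius):
        self.center, self.radius = center, radius

def distance_query(a, b):
    """Signed proximity between two spheres: positive values are the
    separation distance, negative values the penetration depth."""
    return math.dist(a.center, b.center) - (a.radius + b.radius)

def collide(a, b):
    """Discrete collision check, derived from the same distance query."""
    return distance_query(a, b) <= 0.0

def continuous_collide(a, vel_a, b, steps=100, dt=0.01):
    """Naive sampled continuous collision check: sweep sphere a along its
    velocity and report the first colliding time, or None."""
    for i in range(steps + 1):
        t = i * dt
        moved = Sphere([c + v * t for c, v in zip(a.center, vel_a)], a.radius)
        if collide(moved, b):
            return t
    return None
```

Real libraries replace the spheres with bounding-volume hierarchies over meshes and the time sampling with conservative advancement, but the unified-interface idea, one distance primitive serving several query types, is the same.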


International Conference on Robotics and Automation | 2010

Autonomous door opening and plugging in with a personal robot

Wim Meeussen; Melonee Wise; Stuart Glaser; Sachin Chitta; Conor McGann; Patrick Mihelich; Eitan Marder-Eppstein; Marius Muja; Victor Eruhimov; Tully Foote; John M. Hsu; Radu Bogdan Rusu; Bhaskara Marthi; Gary R. Bradski; Kurt Konolige; Brian P. Gerkey; Eric Berger

We describe an autonomous robotic system capable of navigating through an office environment, opening doors along the way, and plugging itself into electrical outlets to recharge as needed. We demonstrate through extensive experimentation that our robot executes these tasks reliably, without requiring any modification to the environment. We present robust detection algorithms for doors, door handles, and electrical plugs and sockets, combining vision and laser sensors. We show how to overcome the unavoidable shortcomings of perception by integrating compliant control into manipulation motions. We present a visual-differencing approach to high-precision plug insertion that avoids the need for high-precision hand-eye calibration.
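The visual-differencing idea can be sketched in one function: instead of calibrating the camera against the arm precisely, command a relative end-effector motion proportional to the plug-socket offset observed in the image, and iterate. The gain and pixel-to-meter scale below are invented placeholders; only a rough scale is needed because the loop converges on the observed difference.

```python
def visual_servo_step(plug_px, socket_px, gain=0.5, px_to_m=0.0005):
    """One visual-differencing step: a relative end-effector motion (meters)
    proportional to the plug-socket offset seen in the image, so no
    high-precision hand-eye calibration is required."""
    return tuple(gain * px_to_m * (s - p) for p, s in zip(plug_px, socket_px))

# Simulated loop: each commanded motion maps back into the image, so the
# observed pixel error shrinks geometrically toward zero.
plug, socket = [320.0, 260.0], [300.0, 240.0]
for _ in range(20):
    dx, dy = visual_servo_step(plug, socket)
    plug[0] += dx / 0.0005   # toy camera model: invert the same rough scale
    plug[1] += dy / 0.0005
```

Because both the plug and the socket are observed in the same image, calibration error cancels out of the difference, which is the point of the approach.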


International Symposium on Experimental Robotics | 2014

Towards Reliable Grasping and Manipulation in Household Environments

Matei T. Ciocarlie; Kaijen Hsiao; Edward Gil Jones; Sachin Chitta; Radu Bogdan Rusu; Ioan A. Şucan

We present a complete software architecture for reliable grasping of household objects. Our work combines aspects such as scene interpretation from 3D range data, grasp planning, motion planning, and grasp failure identification and recovery using tactile sensors. We build upon, and add several new contributions to, the significant prior work in these areas. A salient feature of our work is the tight coupling between perception (both visual and tactile) and manipulation, aiming to address the uncertainty due to sensor and execution errors. This integration effort has revealed new challenges, some of which can be addressed through system and software engineering, and some of which present opportunities for future research. Our approach is aimed at typical indoor environments, and is validated by long-running experiments where the PR2 robotic platform was able to consistently grasp a large variety of known and unknown objects. The set of tools and algorithms for object grasping presented here have been integrated into the open-source Robot Operating System (ROS).


IEEE Robotics & Automation Magazine | 2012

MoveIt! [ROS Topics]

Sachin Chitta; Ioan Alexandru Sucan; Steve Cousins

Robots are increasingly finding applications in domains where they have to work in close proximity to humans. Industrial robotic applications are starting to examine the possibility of robots and humans as coworkers, sharing tasks and workspace. Autonomous robotic cars operating on crowded streets and freeways have to share space with pedestrians and cyclists in addition to other vehicles. Domestic robots, in particular mobile manipulation systems, will be confronted with cluttered, messy environments where obstacles exist at every corner, and people are continuously moving in and out of the workspace of the robots. Robots working in human environments clearly have to be aware of their surroundings and must actively attempt to avoid collisions with humans and other obstacles. MoveIt! is a set of software packages integrated with the Robot Operating System (ROS) and designed specifically to provide such capabilities, especially for mobile manipulation. MoveIt! will allow robots to build up a representation of their environment using data fused from three-dimensional (3-D) and other sensors, generate motion plans that effectively and safely move the robot around in the environment, and execute the motion plans while constantly monitoring the environment for changes.
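The plan, execute, and monitor-for-changes loop described above can be sketched on a toy 2-D grid. This is not MoveIt!'s API; the breadth-first planner stands in for a real motion planner, and the "sensor updates" are just sets of newly occupied cells that trigger replanning when they invalidate the remaining path.

```python
from collections import deque

def plan(start, goal, obstacles, size=6):
    """Breadth-first grid search, a stand-in for a real motion planner."""
    frontier, came = deque([start]), {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = [cur]
            while came[path[-1]] is not None:
                path.append(came[path[-1]])
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in obstacles and nxt not in came):
                came[nxt] = cur
                frontier.append(nxt)
    return None  # goal unreachable

def execute_monitored(start, goal, obstacles, sensed_updates):
    """Step along the plan while fusing new sensor data; replan whenever a
    newly observed obstacle invalidates the remaining path."""
    pos, path = start, plan(start, goal, obstacles)
    while path and pos != goal:
        if sensed_updates:                       # environment changed mid-run
            obstacles |= sensed_updates.pop(0)
        if any(cell in obstacles for cell in path):
            path = plan(pos, goal, obstacles)    # replan from where we are
            continue
        pos = path[path.index(pos) + 1]
    return pos
```

The robot either routes around a partial wall sensed during execution or stops safely when no collision-free path remains.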


Intelligent Robots and Systems | 2009

Real-time perception-guided motion planning for a personal robot

Radu Bogdan Rusu; Ioan Alexandru Sucan; Brian P. Gerkey; Sachin Chitta; Michael Beetz; Lydia E. Kavraki

This paper presents significant steps towards the online integration of 3D perception and manipulation for personal robotics applications. We propose a modular and distributed architecture, which seamlessly integrates the creation of 3D maps for collision detection and semantic annotations, with a real-time motion replanning framework. To validate our system, we present results obtained during a comprehensive mobile manipulation scenario, which includes the fusion of the above components with a higher level executive.


IEEE Transactions on Robotics | 2011

Tactile Sensing for Mobile Manipulation

Sachin Chitta; Jürgen Sturm; Matthew Piccoli; Wolfram Burgard

Tactile information is valuable in determining properties of objects that are inaccessible from visual perception. In this paper, we present a tactile perception strategy that allows a mobile robot with tactile sensors in its gripper to measure a generic set of tactile features while manipulating an object. We propose a switching velocity-force controller that grasps an object safely and reveals, at the same time, its deformation properties. By gently rolling the object, the robot can extract additional information about the contents of the object. As an application, we show that a robot can use these features to distinguish the internal state of bottles and cans, purely from tactile sensing, from a small training set. The robot can distinguish open bottles and cans from closed ones, and full ones from empty ones. We also show how the high-frequency component in tactile information can be used to detect movement inside a container, e.g., in order to detect the presence of liquid. To show that this is a hard recognition problem, we also conducted a comparative study with 17 human test subjects. The recognition rates of the human subjects were comparable with that of the robot.
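Classifying internal state from a small training set of tactile features can be sketched as nearest-neighbor matching. The feature values below (a stiffness-like deformation cue and a high-frequency vibration cue) are invented for illustration; the paper's actual features and classifier differ.

```python
def classify(sample, training):
    """1-nearest-neighbor over tactile features to label a container's
    internal state from a small labeled training set."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training, key=lambda example: dist2(sample, example[0]))[1]

# Hypothetical (deformation, high-frequency vibration) feature pairs.
training = [
    ((0.9, 0.1), "closed-full"),    # stiff grasp, no internal movement
    ((0.2, 0.1), "open-empty"),     # compliant grasp, no movement
    ((0.8, 0.7), "closed-liquid"),  # stiff grasp, sloshing detected
]
```

The high-frequency channel is what separates the last two classes: a liquid-filled container vibrates under gentle rolling while a solid one does not.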

Collaboration


Dive into Sachin Chitta's collaborations.

Top Co-Authors

Maxim Likhachev (Carnegie Mellon University)

Benjamin J. Cohen (University of Pennsylvania)

Vijay Kumar (University of Pennsylvania)

Dinesh Manocha (University of North Carolina at Chapel Hill)

Mark Yim (University of Pennsylvania)