Neil Dantam
Georgia Institute of Technology
Publications
Featured research published by Neil Dantam.
IEEE Transactions on Robotics | 2013
Neil Dantam; Mike Stilman
We present the Motion Grammar: an approach to represent and verify robot control policies based on context-free grammars. The production rules of the grammar represent a top-down task decomposition of robot behavior. The terminal symbols of this language represent sensor readings that are parsed in real time. Efficient algorithms for context-free parsing guarantee that online parsing is computationally tractable. We analyze verification properties and language constraints of this linguistic modeling approach, show a linguistic basis that unifies several existing methods, and demonstrate effectiveness through experiments on a 14-degree-of-freedom (DOF) manipulator interacting with 32 objects (chess pieces) and an unpredictable human adversary. We provide many of the algorithms discussed as Open Source, permissively licensed software.
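To make the linguistic framing concrete, the sketch below shows a toy grammar-driven control loop in which sensor readings are the terminal symbols and each production both decomposes the task and commands an action. The grammar, token names, and actions are invented for illustration and are simplified to one token per expansion; this is not the paper's parser.

```python
# Toy sketch of grammar-driven online control (hypothetical grammar, tokens,
# and actions; not the Motion Grammar implementation). Productions decompose
# the task top-down, terminals are sensor-derived tokens parsed online, and
# each expansion commands the robot.

# (nonterminal, lookahead sensor token) -> (production body, action)
RULES = {
    ("GRASP", "object_far"):  (["GRASP"], "move_closer"),
    ("GRASP", "object_near"): (["CLOSE"], "close_gripper"),
    ("CLOSE", "contact"):     ([],        "lift"),
    ("CLOSE", "no_contact"):  (["GRASP"], "open_gripper"),
}

def run(start, sensor_tokens, execute):
    """Predictive parse: expand nonterminals top-down, consuming one sensor
    token per expansion and executing the selected production's action."""
    stack = [start]
    for tok in sensor_tokens:
        if not stack:
            return True                      # derivation complete
        sym = stack.pop()
        body, action = RULES.get((sym, tok), (None, None))
        if body is None:
            return False                     # token outside the task language
        execute(action)                      # command the robot
        stack.extend(reversed(body))         # continue the derivation
    return not stack

# Canned token stream standing in for live sensor parsing:
tokens = ["object_far", "object_near", "no_contact", "object_near", "contact"]
run("GRASP", iter(tokens), lambda a: print("cmd:", a))
```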
Robotics: Science and Systems | 2016
Neil Dantam; Zachary Kingston; Swarat Chaudhuri; Lydia E. Kavraki
We present a new algorithm for task and motion planning (TMP) and discuss the requirements and abstractions necessary to obtain robust solutions for TMP in general. Our Iteratively Deepened Task and Motion Planning (IDTMP) method is probabilistically-complete and offers improved performance and generality compared to a similar, state-of-the-art, probabilistically-complete planner. The key idea of IDTMP is to leverage incremental constraint solving to efficiently add and remove constraints on motion feasibility at the task level. We validate IDTMP on a physical manipulator and evaluate scalability on scenarios with many objects and long plans, showing order-of-magnitude gains compared to the benchmark planner and a four-times self-comparison speedup from our extensions. Finally, in addition to describing a new method for TMP and its implementation on a physical robot, we also put forward requirements and abstractions for the development of similar planners in the future.
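The sketch below illustrates the feedback loop the abstract describes: a task-level constraint solver proposes candidate plans at increasing horizons, a motion check rejects infeasible ones, and each failure is pushed back incrementally as a blocking constraint. The tiny domain, the fake feasibility test, and the encoding are invented for illustration (Z3 stands in for an incremental constraint solver); this is not IDTMP's actual encoding.

```python
# Requires the z3-solver package. Hypothetical three-action domain.
from z3 import And, Bool, Not, Or, Solver, is_true, sat

ACTIONS = ["move_clear(B)", "pick(A)", "place(A,goal)"]

def motion_feasible(plan):
    # Stand-in for a motion planner: A can only be picked once B is cleared.
    return "pick(A)" not in plan or (
        "move_clear(B)" in plan
        and plan.index("move_clear(B)") < plan.index("pick(A)"))

def plan_task_and_motion(max_horizon=3):
    s = Solver()
    for horizon in range(1, max_horizon + 1):        # iterative deepening
        s.push()                                     # horizon-scoped constraints
        x = [[Bool(f"x_{t}_{a}") for a in ACTIONS] for t in range(horizon)]
        for t in range(horizon):                     # exactly one action per step
            s.add(Or(x[t]))
            for i in range(len(ACTIONS)):
                for j in range(i + 1, len(ACTIONS)):
                    s.add(Not(And(x[t][i], x[t][j])))
        # Task goal: pick A at some step and place it at the goal last.
        s.add(Or([x[t][ACTIONS.index("pick(A)")] for t in range(horizon)]))
        s.add(x[horizon - 1][ACTIONS.index("place(A,goal)")])
        while s.check() == sat:
            m = s.model()
            chosen = [(t, i) for t in range(horizon) for i in range(len(ACTIONS))
                      if is_true(m.evaluate(x[t][i], model_completion=True))]
            plan = [ACTIONS[i] for _, i in chosen]
            if motion_feasible(plan):
                return plan                          # feasible task and motion plan
            # Motion failure: incrementally block exactly this assignment.
            s.add(Not(And([x[t][i] for t, i in chosen])))
        s.pop()                                      # retract and deepen
    return None

print(plan_task_and_motion())   # -> ['move_clear(B)', 'pick(A)', 'place(A,goal)']
```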
IEEE-RAS International Conference on Humanoid Robots | 2012
Neil Dantam; Mike Stilman
We present a new Interprocess Communication (IPC) mechanism and library. Ach is uniquely suited for coordinating drivers, controllers, and algorithms in complex robotic systems such as humanoid robots. Ach eliminates the Head-of-Line Blocking problem for applications that always require access to the newest message. Ach is efficient, robust, and formally verified. It has been tested and demonstrated on a variety of physical robotic systems, and we discuss the implementation on our humanoid robot Golem Krang. Finally, the source code for Ach is available under an Open Source permissive license.
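The "always the newest message" semantics can be pictured with the short sketch below: a bounded ring buffer lets a reader that falls behind skip stale frames instead of blocking behind them, which is the head-of-line blocking a POSIX FIFO or stream socket would exhibit. This is a conceptual Python illustration only; Ach itself is a C library, and nothing here reflects its actual API.

```python
import threading
from collections import deque

# Conceptual sketch of latest-message semantics (not Ach's actual API):
# a bounded ring buffer where old frames are overwritten and a reader can
# always fetch the newest sample without draining everything before it.
class LatestChannel:
    def __init__(self, depth=8):
        self._buf = deque(maxlen=depth)       # oldest frames are overwritten
        self._cond = threading.Condition()
        self._seq = 0                         # sequence number of newest frame

    def put(self, frame):
        with self._cond:
            self._seq += 1
            self._buf.append((self._seq, frame))
            self._cond.notify_all()

    def get_latest(self, last_seen=0, timeout=None):
        """Return (seq, frame) for the newest frame, waiting only if the
        reader has already seen it; intermediate frames may be skipped."""
        with self._cond:
            self._cond.wait_for(
                lambda: self._buf and self._buf[-1][0] > last_seen,
                timeout=timeout)
            return self._buf[-1] if self._buf else (last_seen, None)

chan = LatestChannel()
for i in range(3):
    chan.put({"joint_angle": 0.1 * i})
print(chan.get_latest())   # -> (3, {'joint_angle': 0.2}); older frames skipped
```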
Intelligent Robots and Systems | 2012
Neil Dantam; Irfan A. Essa; Mike Stilman
We demonstrate the automatic transfer of an assembly task from human to robot. This work extends efforts showing the utility of linguistic models in verifiable robot control policies by now performing real visual analysis of human demonstrations to automatically extract a policy for the task. This method tokenizes each human demonstration into a sequence of object connection symbols, then transforms the set of sequences from all demonstrations into an automaton, which represents the task-language for assembling a desired object. Finally, we combine this assembly automaton with a kinematic model of a robot arm to reproduce the demonstrated task.
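As a concrete picture of the "sequences to automaton" step, the sketch below builds a prefix-tree acceptor from tokenized demonstrations; the connection symbols are invented, and the paper's construction additionally merges states rather than stopping at a prefix tree.

```python
# Build a small acceptor from tokenized demonstrations (illustrative symbols;
# a prefix tree only, without the state merging a real construction would do).
demos = [
    ["connect(peg,base)", "connect(bar,peg)", "connect(cap,bar)"],
    ["connect(bar,peg)", "connect(peg,base)", "connect(cap,bar)"],
]

def build_automaton(sequences):
    """Return (transitions, accepting) for a prefix-tree acceptor; state 0 is initial."""
    transitions, accepting, next_state = {}, set(), 1
    for seq in sequences:
        state = 0
        for symbol in seq:
            key = (state, symbol)
            if key not in transitions:
                transitions[key] = next_state
                next_state += 1
            state = transitions[key]
        accepting.add(state)
    return transitions, accepting

def accepts(transitions, accepting, seq):
    state = 0
    for symbol in seq:
        if (state, symbol) not in transitions:
            return False
        state = transitions[(state, symbol)]
    return state in accepting

transitions, accepting = build_automaton(demos)
print(accepts(transitions, accepting, demos[1]))   # True: a demonstrated ordering
```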
International Conference on Robotics and Automation | 2011
Neil Dantam; Pushkar Kolhe; Mike Stilman
We introduce the Motion Grammar, a powerful new representation for robot decision making, and validate its properties through the successful implementation of a physical human-robot game. The Motion Grammar is a formal tool for task decomposition and hybrid control in the presence of significant online uncertainty. In this paper, we describe the Motion Grammar, introduce some of the formal guarantees it can provide, and represent the entire game of human-robot chess through a single formal language. This language includes game-play, safe handling of human motion, uncertainty in piece positions, and misplaced and collapsed pieces. We demonstrate the simple and effective language formulation through experiments on a 14-DOF manipulator interacting with 32 objects (chess pieces) and an unpredictable human adversary.
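A flavor of how ordinary play and error recovery can coexist as alternatives in one language is sketched below; the nonterminals, tokens, and productions are invented for illustration and are not the grammar used in the paper.

```python
# Illustrative production rules (BNF in comments) showing recovery from
# misplaced and collapsed pieces as ordinary grammar alternatives.
CHESS_GRAMMAR = {
    # <turn>       ::= <human-move> <robot-move>
    "turn": [["human_move", "robot_move"]],
    # <human-move> ::= wait hand_clear observe_board
    "human_move": [["wait", "hand_clear", "observe_board"]],
    # <robot-move> ::= grasp piece_ok move place
    #                | grasp piece_misplaced relocate <robot-move>
    #                | grasp piece_collapsed upright  <robot-move>
    "robot_move": [
        ["grasp", "piece_ok", "move", "place"],
        ["grasp", "piece_misplaced", "relocate", "robot_move"],
        ["grasp", "piece_collapsed", "upright", "robot_move"],
    ],
}
```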
Robotics: Science and Systems | 2011
Neil Dantam; Mike Stilman
Presented at the 2011 Robotics: Science and Systems Conference VII (RSS), 27-30 June 2011, Los Angeles, CA.
IEEE Robotics & Automation Magazine | 2015
Neil Dantam; Daniel M. Lofaro; Ayonga Hereid; Paul Oh; Aaron D. Ames; Mike Stilman
Correct real-time software is vital for robots in safety-critical roles such as service and disaster response. These systems depend on software for locomotion, navigation, manipulation, and even seemingly innocuous tasks such as safely regulating battery voltage. A multiprocess software design increases robustness by isolating errors to a single process, allowing the rest of the system to continue operation. This approach also assists with modularity and concurrency. For real-time tasks, such as dynamic balance and force control of manipulators, it is critical to communicate the latest data sample with minimum latency. There are many communication approaches intended for both general-purpose and real-time needs [9], [13], [15], [17], [19]. Typical methods focus on reliable communication or network transparency and accept a tradeoff of increased message latency or the potential to discard newer data. By focusing instead on the specific case of real-time communication on a single host, we reduce communication latency and guarantee access to the latest sample. We present a new interprocess communication (IPC) library, Ach, which addresses this need, and discuss its application for real-time multiprocess control on three humanoid robots (Figure 1). (Ach is available at http://www.golems.org/projects/ach.html. The name Ach comes from the common abbreviation for the motor neurotransmitter Acetylcholine and the computer networking term ACK.)
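The latency argument can be seen in a few lines: with FIFO semantics, a control loop that falls behind acts on stale state, whereas taking only the newest sample bounds the data age at the cost of dropping intermediate messages. The queue-draining helper below is a conceptual stand-in; Ach provides this behavior directly on a single-host channel in C.

```python
import queue

def newest(q):
    """Return the most recent item in the queue, discarding anything older."""
    item = q.get()                      # block for at least one sample
    while True:
        try:
            item = q.get_nowait()       # keep draining until the queue is empty
        except queue.Empty:
            return item

q = queue.Queue()
for t in range(5):                      # the producer ran ahead of the consumer
    q.put({"t": t, "position": 0.1 * t})
print(newest(q))                        # -> the sample with t == 4
```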
2013 IEEE Conference on Technologies for Practical Robot Applications (TePRA) | 2013
Michael X. Grey; Neil Dantam; Daniel M. Lofaro; Aaron F. Bobick; Magnus Egerstedt; Paul Y. Oh; Mike Stilman
Humanoid robots require greater software reliability than traditional mechatronic systems if they are to perform useful tasks in typical human-oriented environments. This paper presents a software architecture that distributes the load of computation and control tasks over multiple processes, enabling fail-safes within the software. These fail-safes ensure that unexpected crashes or latency do not produce damaging behavior in the robot. The distribution also offers benefits for future software development by making the architecture modular and extensible. Utilizing a low-latency inter-process communication protocol (Ach), processes are able to communicate at high control frequencies. The key motivation of this software architecture is to provide a practical framework for safe and reliable humanoid robot software development. The authors test and verify this framework on a HUBO2 Plus humanoid robot.
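One common form such a fail-safe takes is a heartbeat watchdog: every process periodically publishes a timestamp, and a monitor commands a safe state if any heartbeat goes stale. The sketch below is a single-process illustration with invented names and thresholds, not the architecture's actual implementation (which runs components as separate processes communicating over Ach).

```python
import time

HEARTBEAT_TIMEOUT = 0.1    # seconds without a heartbeat before reacting (assumed)

def stale_processes(last_beat, now):
    """Return the processes whose heartbeat has gone stale."""
    return [name for name, t in last_beat.items() if now - t > HEARTBEAT_TIMEOUT]

last_beat = {
    "walking-controller": time.monotonic(),
    "vision": time.monotonic() - 0.5,        # this process has stopped updating
}

stale = stale_processes(last_beat, time.monotonic())
if stale:
    print("entering safe mode; stale processes:", stale)   # e.g., hold posture
```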
International Conference on Robotics and Automation | 2010
Pushkar Kolhe; Neil Dantam; Mike Stilman
This paper presents three effective manipulation strategies for wheeled, dynamically balancing robots with articulated links. By comparing these strategies through analysis, simulation, and robot experiments, we show that contact placement and body posture have a significant impact on the robot's ability to accelerate and displace environment objects. Given object geometry and friction parameters, we determine the most effective methods for utilizing wheel torque to perform non-prehensile manipulation.
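As a rough illustration of why contact and traction matter, the back-of-the-envelope sketch below bounds the push force by both wheel torque and wheel-ground friction. The model and all numbers are invented for illustration and ignore the balancing dynamics that the paper actually analyzes.

```python
# Illustrative numbers only; not parameters from the paper.
MU = 0.8              # wheel-ground friction coefficient (assumed)
MASS = 100.0          # robot mass, kg (assumed)
G = 9.81              # gravity, m/s^2
WHEEL_RADIUS = 0.25   # m (assumed)
WHEEL_TORQUE = 150.0  # N*m available at the wheels (assumed)

torque_limited = WHEEL_TORQUE / WHEEL_RADIUS   # force the motors can produce
traction_limited = MU * MASS * G               # force before the wheels slip
print("max push force ~", min(torque_limited, traction_limited), "N")
```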
IEEE-RAS International Conference on Humanoid Robots | 2014
Neil Dantam; Heni Ben Amor; Henrik I. Christensen; Mike Stilman
We demonstrate that millimeter-level bimanual manipulation accuracy can be achieved without the static camera registration typically required for visual servoing. We register multiple cameras online, converging in seconds, by visually tracking features on the robot hands and filtering the result. Then, we compute and track continuous-velocity relative workspace trajectories for the end-effector. We demonstrate the approach using Schunk LWA4 and SDH manipulators and Logitech C920 cameras, showing accurate relative positioning for pen-capping and object hand-off tasks. Our filtering software is available under a permissive license.
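The registration step can be pictured as a point-set alignment: given feature positions known in the robot frame (from the kinematic model) and the same features observed by a camera, estimate the rigid transform between the two frames. The SVD-based (Kabsch/Umeyama) sketch below shows that single-shot computation; the paper's method additionally tracks the features visually and filters the estimate over time.

```python
import numpy as np

def register(points_robot, points_camera):
    """Return (R, t) mapping camera-frame points into the robot frame:
    points_robot[i] ~= R @ points_camera[i] + t."""
    cr = points_robot.mean(axis=0)
    cc = points_camera.mean(axis=0)
    H = (points_camera - cc).T @ (points_robot - cr)    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
    R = Vt.T @ D @ U.T
    t = cr - R @ cc
    return R, t

# Synthetic check: recover a known rotation about z and a translation.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
pts_cam = np.random.rand(6, 3)
pts_rob = pts_cam @ R_true.T + t_true
R, t = register(pts_rob, pts_cam)
print(np.allclose(R, R_true), np.allclose(t, t_true))   # True True
```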