Dipendra Kumar Misra
Cornell University
Publication
Featured research published by Dipendra Kumar Misra.
The International Journal of Robotics Research | 2016
Dipendra Kumar Misra; Jaeyong Sung; Kevin Lee; Ashutosh Saxena
It is important for a robot to be able to interpret natural language commands given by a human. In this paper, we consider performing a sequence of mobile manipulation tasks with instructions described in natural language. Given a new environment, even a simple task such as boiling water would be performed quite differently depending on the presence, location, and state of the objects. We start by collecting a dataset of task descriptions in free-form natural language and the corresponding grounded task-logs of the tasks performed in an online robot simulator. We then build a library of verb–environment instructions that represents the possible instructions for each verb in that environment; these may or may not be valid for a different environment and task context. We present a model that takes into account the variations in natural language and the ambiguities in grounding them to robotic instructions with appropriate environment context and task constraints. Our model also handles incomplete or noisy natural language instructions. It is based on an energy function that encodes such properties in a form isomorphic to a conditional random field. We evaluate our model on tasks given in a robotic simulator and show that it outperforms the state of the art with 61.8% accuracy. We also demonstrate a grounded robotic instruction sequence on a PR2 robot using the Learning from Demonstration approach.
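The abstract's core idea, an energy function over candidate instruction sequences whose node and edge terms mirror a chain CRF, can be sketched as follows. The features, weights, and instruction representation here are hypothetical illustrations, not the paper's actual model:

```python
# Hedged sketch of a CRF-like energy over a candidate instruction
# sequence: lower energy means a better grounding of the NL command.
# All features and weights below are toy assumptions.

def environment_mismatch(instr, environment):
    # Toy node feature: penalize instructions referring to absent objects.
    return 0.0 if instr["object"] in environment else 1.0

def transition_cost(prev, curr):
    # Toy edge feature: penalize grasping twice in a row.
    return 1.0 if prev["verb"] == curr["verb"] == "grasp" else 0.0

def energy(instructions, environment, weights):
    total = 0.0
    for i, instr in enumerate(instructions):
        # Node potential: fit between instruction and environment state.
        total += weights["env"] * environment_mismatch(instr, environment)
        # Edge potential: compatibility with the previous instruction.
        if i > 0:
            total += weights["seq"] * transition_cost(instructions[i - 1], instr)
    return total

def best_grounding(candidates, environment, weights):
    # Inference: pick the minimum-energy candidate sequence.
    return min(candidates, key=lambda c: energy(c, environment, weights))
```

A candidate that mentions objects absent from the environment accumulates mismatch penalties and loses to an environment-consistent grounding, which is the behavior the abstract attributes to its environment-aware model.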
international joint conference on natural language processing | 2015
Dipendra Kumar Misra; Kejia Tao; Percy Liang; Ashutosh Saxena
We focus on the task of interpreting complex natural language instructions to a robot, in which we must ground high-level commands such as "microwave the cup" to low-level actions such as grasping. Previous approaches that learn a lexicon during training have inadequate coverage at test time, and pure search strategies cannot handle the exponential search space. We propose a new hybrid approach that leverages the environment to induce new lexical entries at test time, even for new verbs. Our semantic parsing model jointly reasons about the text, logical forms, and environment over multi-stage instruction sequences. We introduce a new dataset and show that our approach is able to successfully ground new verbs such as "distribute", "mix", and "arrange" to complex logical forms, each containing up to four predicates.
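The test-time lexicon-induction step the abstract describes can be sketched minimally: when a verb is missing from the lexicon, use the environment to filter candidate logical forms and add the surviving entry. The lexicon contents, logical-form syntax, and executability check below are illustrative stand-ins for the paper's joint model:

```python
import re

# Toy lexicon mapping verbs to logical forms (contents are assumptions).
LEXICON = {"microwave": "in(cup, microwave) ^ on(microwave)"}

def is_executable(form, environment):
    # Toy check: every object mentioned in the form exists in the environment.
    groups = re.findall(r"\((.*?)\)", form)
    names = {n.strip() for grp in groups for n in grp.split(",")}
    return names <= environment

def induce_entry(verb, candidate_forms, environment):
    """Induce a lexical entry for an unseen verb at test time."""
    consistent = [f for f in candidate_forms if is_executable(f, environment)]
    if not consistent:
        return None
    entry = consistent[0]      # toy tie-break: first executable candidate
    LEXICON[verb] = entry      # new entry is added during parsing, not training
    return entry

def parse_verb(verb, candidate_forms, environment):
    if verb in LEXICON:
        return LEXICON[verb]
    return induce_entry(verb, candidate_forms, environment)
```

The key design point mirrored here is that coverage gaps are closed lazily: the environment, rather than the training data, decides which logical form an unseen verb receives.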
empirical methods in natural language processing | 2016
Dipendra Kumar Misra; Yoav Artzi
We present a shift-reduce CCG semantic parser. Our parser uses a neural network architecture that balances model capacity and computational cost. We train by transferring a model from a computationally expensive log-linear CKY parser. Our learner addresses two challenges: selecting the best parse for learning when the CKY parser generates multiple correct trees, and learning from partial derivations when the CKY parser fails to parse. We evaluate on AMR parsing. Our parser performs comparably to the CKY parser, while doing significantly fewer operations. We also present results for greedy semantic parsing with a relatively small drop in performance.
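The shift-reduce transition system underlying such a parser can be illustrated with a minimal stack-and-buffer loop. The token categories and the single reduce rule below are toy assumptions standing in for the actual CCG combinators and neural action scorer:

```python
# Minimal shift-reduce skeleton: shift moves the next token onto the
# stack; reduce combines the top two stack items when a rule applies.
# The greedy control policy here is a simplification of a learned scorer.

def can_reduce(stack, reduce_rule):
    return len(stack) >= 2 and reduce_rule(stack[-2], stack[-1]) is not None

def shift_reduce(tokens, reduce_rule):
    stack, buffer = [], list(tokens)
    while buffer or can_reduce(stack, reduce_rule):
        if can_reduce(stack, reduce_rule):
            right = stack.pop()
            left = stack.pop()
            stack.append(reduce_rule(left, right))   # combine top two items
        else:
            stack.append(buffer.pop(0))              # shift next token
    return stack
```

Because each token is shifted once and each reduce consumes a stack item, the parser runs in time linear in the derivation size, which is the efficiency contrast with CKY that the abstract draws.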
robotics: science and systems | 2014
Dipendra Kumar Misra; Jaeyong Sung; Kevin Lee; Ashutosh Saxena
arXiv: Artificial Intelligence | 2014
Ashutosh Saxena; Ashesh Jain; Ozan Sener; Aditya Jami; Dipendra Kumar Misra; Hema Swetha Koppula
empirical methods in natural language processing | 2017
Dipendra Kumar Misra; John Langford; Yoav Artzi
arXiv: Artificial Intelligence | 2018
Claudia Yan; Dipendra Kumar Misra; Andrew Bennett; Aaron Walsman; Yonatan Bisk; Yoav Artzi
international conference on machine learning | 2018
Kavosh Asadi; Dipendra Kumar Misra; Michael L. Littman
empirical methods in natural language processing | 2018
Dipendra Kumar Misra; Andrew Bennett; Valts Blukis; Eyvind Niklasson; Max Shatkhin; Yoav Artzi
empirical methods in natural language processing | 2018
Dipendra Kumar Misra; Ming-Wei Chang; Xiaodong He; Wen-tau Yih