Vaibhav V. Unhelkar
Massachusetts Institute of Technology
Publications
Featured research published by Vaibhav V. Unhelkar.
human-robot interaction | 2014
Vaibhav V. Unhelkar; Ho Chit Siu; Julie A. Shah
There is an emerging desire across manufacturing industries to deploy robots that support people in their manual work rather than replace human workers. This paper explores one such opportunity: fielding a mobile robotic assistant that travels between part carts and the automotive final assembly line, delivering tools and materials to human workers. We compare the performance of a mobile robotic assistant to that of a human assistant to gain a better understanding of the factors that impact its effectiveness. Statistically significant differences emerge based on the type of assistant, human or robot: interaction times and idle times are significantly higher for the robotic assistant than for the human assistant. We report additional differences in participants' subjective responses regarding team fluency, situational awareness, comfort, and safety. Finally, we discuss how results from the experiment inform the design of a more effective assistant. Categories and Subject Descriptors: H.1.2 [Models and Principles]: User/Machine Systems; I.2.9 [Artificial Intelligence]: Robotics. General Terms: Experimentation, Performance, Human Factors.
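The abstract reports statistically significant differences in interaction and idle times between the robotic and human assistant conditions. As a rough illustration only, not the paper's actual analysis or data, a between-subjects comparison of this kind could be run as in the sketch below; the timing arrays and the 0.05 threshold are placeholders.

```python
# Illustrative sketch only: hypothetical timing data, not the study's measurements.
import numpy as np
from scipy import stats

# Placeholder interaction times (seconds) for the two assistant conditions.
human_assistant_times = np.array([12.1, 10.4, 11.8, 13.0, 9.7, 12.5])
robot_assistant_times = np.array([18.3, 17.2, 19.9, 16.8, 20.1, 18.7])

# Welch's two-sample t-test (no equal-variance assumption) on interaction time.
t_stat, p_value = stats.ttest_ind(robot_assistant_times,
                                  human_assistant_times,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference in interaction time is statistically significant.")
```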
international conference on robotics and automation | 2015
Vaibhav V. Unhelkar; Leia Stirling; Julie A. Shah
Mobile, interactive robots that operate in human-centric environments need the capability to safely and efficiently navigate around humans. This requires the ability to sense and predict human motion trajectories and to plan around them. In this paper, we present a study that supports the existence of statistically significant biomechanical turn indicators of human walking motions. Further, we demonstrate the effectiveness of these turn indicators as features in the prediction of human motion trajectories. Human motion capture data is collected with predefined goals to train and test a prediction algorithm. Use of anticipatory features results in improved performance of the prediction algorithm. Lastly, we demonstrate the closed-loop performance of the prediction algorithm using an existing algorithm for motion planning within dynamic environments. The anticipatory indicators of human walking motion can be used with different prediction and/or planning algorithms for robotics; the chosen planning and prediction algorithm demonstrates one such implementation for human-robot co-navigation.
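To make the idea of "turn indicators as features" concrete, the sketch below trains a simple classifier over hypothetical gait features (head yaw rate, torso yaw rate, lateral foot offset) standing in for the biomechanical indicators described above; it is not the paper's prediction algorithm, and all feature values and labels are invented for illustration.

```python
# Illustrative sketch: a simple classifier over hypothetical gait features,
# standing in for the biomechanical turn indicators described in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [head_yaw_rate, torso_yaw_rate, lateral_foot_offset] observed a few
# steps before a possible turn; labels mark whether the person turned (1) or not (0).
X_train = np.array([
    [0.02, 0.01, 0.00],   # walking straight
    [0.35, 0.20, 0.05],   # turning
    [0.01, 0.00, 0.01],   # walking straight
    [0.40, 0.25, 0.06],   # turning
])
y_train = np.array([0, 1, 0, 1])

clf = LogisticRegression().fit(X_train, y_train)

# Predict the probability of an upcoming turn from a new observation window.
x_new = np.array([[0.30, 0.18, 0.04]])
print("P(turn) =", clf.predict_proba(x_new)[0, 1])
```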
international conference on robotics and automation | 2014
Vaibhav V. Unhelkar; Jorge Perez; James C. Boerkoel; Johannes Bix; Stefan Bartscher; Julie A. Shah
There is increasing demand for mobile, interactive robots that assist humans with repetitive, non-value-added tasks in the manufacturing domain. Our aim is to develop a mobile robotic assistant for fetch-and-deliver tasks in human-oriented assembly line environments. Assembly lines present a niche yet novel challenge for mobile robots: the robot must precisely control its position on a surface that may be stationary, moving, or split (e.g., when the robot straddles the moving assembly line and remains partially on the stationary surface). In this paper, we present a control and sensing solution for a mobile robotic assistant as it traverses a moving-floor assembly line. Solutions readily exist for the control of wheeled mobile robots on static surfaces; we build on the open-source Robot Operating System (ROS) software architecture and generalize the algorithms to the moving-line environment. Off-the-shelf sensors and localization algorithms are explored to sense the moving surface, and a customized solution is presented using PX4Flow optic flow sensors and a laser scanner-based localization algorithm. Validation of the control and sensing system is carried out both in simulation and in hardware experiments on a customized treadmill. Initial demonstrations of the hardware system yield promising results: the robot successfully maintains its position while on, and while straddling, the moving line.
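The core difficulty of holding position on a moving surface can be illustrated with a minimal velocity-compensation sketch: feed the belt velocity sensed by an optic-flow sensor forward into the robot's commanded velocity while a feedback term drives the position error to zero. This is only a conceptual sketch under assumed gains and measurements, not the controller developed in the paper.

```python
# Illustrative sketch (not the paper's controller): holding position on a moving
# surface by adding the sensed belt velocity to the robot's commanded velocity.

def position_hold_command(position_error_m, belt_velocity_mps, kp=1.5):
    """Proportional position hold with feed-forward belt compensation.

    position_error_m : desired position minus current position (meters)
    belt_velocity_mps: surface velocity estimated from an optic-flow sensor (m/s)
    kp               : assumed proportional gain (placeholder value)
    """
    feedback = kp * position_error_m      # drive the position error to zero
    feed_forward = belt_velocity_mps      # cancel the conveyor's motion
    return feedback + feed_forward        # commanded robot velocity (m/s)

# Example: robot is 0.05 m behind its target while the line moves at 0.1 m/s.
print(position_hold_command(position_error_m=0.05, belt_velocity_mps=0.1))
```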
human-robot interaction | 2015
Vaibhav V. Unhelkar; Julie A. Shah
Industrial robots are on the verge of emerging from their cages and entering final assembly to work alongside humans. Toward this, we are developing a collaborative robot capable of assisting humans in final automotive assembly. Several algorithmic and design challenges arise when robots enter the unpredictable, human-centric, and time-critical environment of final assembly. In this work, we briefly discuss a few of these challenges, along with developed solutions and proposed methodologies, and their implications for improving human-robot collaboration.
international joint conference on artificial intelligence | 2018
Vaibhav V. Unhelkar; Julie A. Shah
Artificial agents (both embodied robots and software agents) that interact with humans are increasing at an exceptional rate. Yet, achieving seamless collaboration between artificial agents and humans in the real world remains an active problem [Thomaz et al., 2016]. A key challenge is that the agents need to make decisions without complete information about their shared environment and collaborators. For instance, a human-robot team performing a rescue operation after a disaster may not have an accurate map of their surroundings. Even in structured domains, such as manufacturing, a robot might not know the goals or preferences of its human collaborators [Unhelkar et al., 2018]. Algorithmically, this challenge manifests itself as a problem of decision-making under uncertainty in which the agent has to reason about the latent states of its environment and human collaborator. However, in practice, quantifying this uncertainty (i.e., the state transition function) and even specifying the features (i.e., the relevant states) of human-machine collaboration is difficult. Thus, the objective of this thesis research is to develop novel algorithms that enable artificial agents to learn and reason about the latent states of human-machine collaboration and achieve fluent interaction.
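The kind of reasoning about latent states referred to above can be illustrated with a discrete Bayesian belief update over an unobserved goal of the human collaborator. The goals, observation model, and numbers below are hypothetical placeholders, not the thesis's models; the sketch only shows the belief-update step itself.

```python
# Illustrative sketch: a discrete Bayesian belief update over a latent state of
# the human collaborator (e.g., an unobserved goal). All values are hypothetical.
import numpy as np

goals = ["fetch_tool", "inspect_part"]   # assumed latent goals of the human
belief = np.array([0.5, 0.5])            # uniform prior over the goals

# Assumed observation model P(observation | goal), e.g., the human is seen
# walking toward the tool cart, which is more likely under "fetch_tool".
obs_likelihood = np.array([0.8, 0.2])

# Bayes update: posterior is proportional to likelihood times prior.
belief = obs_likelihood * belief
belief /= belief.sum()

for goal, prob in zip(goals, belief):
    print(f"P({goal} | observation) = {prob:.2f}")
```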
IEEE Robotics & Automation Magazine | 2018
Vaibhav V. Unhelkar; Stefan Dorr; Alexander Bubeck; Przemyslaw A. Lasota; Jorge Perez; Ho Chit Siu; James C. Boerkoel; Quirin Tyroller; Johannes Bix; Stefan Bartscher; Julie A. Shah
Robots that operate alongside or cooperatively with humans are envisioned as the next generation of robotics. Toward this vision, we present the first mobile robot system designed for and capable of operating on the moving floors of automotive final assembly lines (AFALs). AFALs represent a distinct challenge for mobile robots in the form of dynamic surfaces: the conveyor belts that transport cars throughout the factory during final assembly.
national conference on artificial intelligence | 2016
Vaibhav V. Unhelkar; Julie A. Shah
human-robot interaction | 2017
X. Jessie Yang; Vaibhav V. Unhelkar; Kevin Li; Julie A. Shah
international conference on robotics and automation | 2018
Vaibhav V. Unhelkar; Przemyslaw A. Lasota; Quirin Tyroller; Rares-Darius Buhai; Laurie Marceau; Barbara Deml; Julie A. Shah
Archive | 2018
Vaibhav V. Unhelkar; Julie A. Shah