Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Daniel Rakita is active.

Publication


Featured research published by Daniel Rakita.


human-robot interaction | 2017

A Motion Retargeting Method for Effective Mimicry-based Teleoperation of Robot Arms

Daniel Rakita; Bilge Mutlu; Michael Gleicher

In this paper, we introduce a novel interface that allows novice users to effectively and intuitively tele-operate robot manipulators. The premise of our method is that an interface that allows its user to direct a robot arm using the natural 6-DOF space of his/her hand would afford effective direct control of the robot; however, a direct mapping between the user's hand and the robot's end effector is impractical because the robot has different kinematic and speed capabilities than the human arm. Our key technical idea is that by relaxing the constraint of a direct mapping between hand position and orientation and end-effector configuration, a system can provide the user with the feel of direct control while still achieving the practical requirements for telemanipulation, such as motion smoothness and singularity avoidance. We present methods for implementing a motion retargeting solution that achieves this relaxed control using constrained optimization and describe a system that utilizes it to provide real-time control of a robot arm. We demonstrate the effectiveness of our approach in a user study that shows novice users can complete a range of tasks more efficiently and enjoyably using our relaxed-mimicry-based interface compared to standard interfaces.
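The relaxed-mapping idea can be illustrated as a per-timestep optimization that trades off matching the user's hand position against joint-space smoothness, rather than enforcing an exact mapping. The following is a minimal sketch using a toy planar 2-link arm; the function names, weights, and use of a general-purpose solver are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np
from scipy.optimize import minimize

def fk(q, link_lengths=(1.0, 1.0)):
    """Forward kinematics of a toy planar 2-link arm: joint angles -> end-effector xy."""
    l1, l2 = link_lengths
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def retarget_step(q_prev, hand_goal, w_match=1.0, w_smooth=0.1):
    """One timestep of relaxed retargeting: following the user's hand is a soft
    objective, traded off against joint-space smoothness instead of being an
    exact (and often infeasible) hard constraint."""
    def objective(q):
        match = np.sum((fk(q) - hand_goal) ** 2)   # track the hand position
        smooth = np.sum((q - q_prev) ** 2)         # penalize jerky joint motion
        return w_match * match + w_smooth * smooth
    return minimize(objective, q_prev, method="BFGS").x

q0 = np.array([0.2, 0.4])
q1 = retarget_step(q0, hand_goal=np.array([1.2, 0.8]))
```

Because the hand-tracking term is soft, the solver can sacrifice a little tracking accuracy to keep joint motion smooth, which is the essence of the relaxation described above.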


robot and human interactive communication | 2016

Evaluating intent-expressive robot arm motion

Christopher Bodden; Daniel Rakita; Bilge Mutlu; Michael Gleicher

Planning effective arm motions is integral to manipulation tasks. In general, motion synthesis methods have focused on functional objectives, such as minimizing time and maximizing efficiency. However, recent work in human-robot collaboration suggests that choices in motion design can influence collaboration performance and quality. Some motion designs are easier than others for human observers to interpret. In this paper, we explore the tradeoffs in robot arm movements designed to be observed by people. Through a series of human-subjects experiments, we compare collaboration performance between several motion-synthesis methods explored by prior work. We find that a number of factors, including the design of the robot arm and metric for success, affect the relative merits of different approaches.


human-robot interaction | 2018

An Autonomous Dynamic Camera Method for Effective Remote Teleoperation

Daniel Rakita; Bilge Mutlu; Michael Gleicher

In this paper, we present a method that improves the ability of remote users to teleoperate a manipulation robot arm by continuously providing them with an effective viewpoint using a second camera-in-hand robot arm. The user controls the manipulation robot using any teleoperation interface, and the camera-in-hand robot automatically servos to provide a view of the remote environment that is estimated to best support effective manipulations. Our method avoids occlusions with the manipulation arm to improve visibility, provides context and detailed views of the environment by varying the camera-target distance, utilizes motion prediction to cover the space of the user's next manipulation actions, and actively corrects views to avoid disorienting the user as the camera moves. Through two user studies, we show that our method improves teleoperation performance over alternative methods of providing visual support for teleoperation. We discuss the implications of our findings for real-world teleoperation and for future research.
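The viewpoint criteria listed above (useful camera-target distance, occlusion avoidance, limited camera motion) can be sketched as a scoring function over candidate camera positions. This is a simplified stand-in, assuming illustrative weights and distance thresholds; the paper's actual servoing method is not reproduced here.

```python
import numpy as np

def viewpoint_score(cam_pos, target, prev_cam_pos, occluder,
                    ideal_dist=1.5, w_dist=1.0, w_occ=2.0, w_move=0.5):
    """Score a candidate camera position (higher is better): stay near a useful
    camera-target distance, keep clear of the occluding manipulation arm, and
    avoid large camera jumps that could disorient the user."""
    dist_term = -w_dist * abs(np.linalg.norm(cam_pos - target) - ideal_dist)
    occ_term = -w_occ * max(0.0, 0.5 - np.linalg.norm(cam_pos - occluder))
    move_term = -w_move * np.linalg.norm(cam_pos - prev_cam_pos)
    return dist_term + occ_term + move_term

def pick_viewpoint(candidates, target, prev_cam, occluder):
    """Choose the best-scoring candidate viewpoint for the camera arm."""
    return max(candidates, key=lambda c: viewpoint_score(c, target, prev_cam, occluder))
```

In a running system, a scoring function like this would be evaluated continuously as the manipulation arm and predicted user actions change.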


international conference on computer graphics and interactive techniques | 2016

Authoring directed gaze for full-body motion capture

Tomislav Pejsa; Daniel Rakita; Bilge Mutlu; Michael Gleicher

We present an approach for adding directed gaze movements to characters animated using full-body motion capture. Our approach provides a comprehensive authoring solution that automatically infers plausible directed gaze from the captured body motion, provides convenient controls for manual editing, and adds synthetic gaze movements onto the original motion. The foundation of the approach is an abstract representation of gaze behavior as a sequence of gaze shifts and fixations toward targets in the scene. We present methods for automatic inference of this representation by analyzing the head and torso kinematics and scene features. We introduce tools for convenient editing of the gaze sequence and target layout that allow an animator to adjust the gaze behavior without worrying about the details of pose and timing. A synthesis component translates the gaze sequence into coordinated movements of the eyes, head, and torso, and blends these with the original body motion. We evaluate the effectiveness of our inference methods, the efficiency of the authoring process, and the quality of the resulting animation.
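The inference step analyzes head and torso kinematics to segment the motion into gaze shifts and fixations. One simple way to sketch that kind of kinematic analysis is to threshold head angular velocity; the threshold value and one-axis signal here are illustrative assumptions, not the paper's full method.

```python
import numpy as np

def detect_gaze_shift_intervals(head_yaw, fps=30, vel_thresh=30.0):
    """Segment a head-yaw signal (degrees, one sample per frame) into candidate
    gaze-shift intervals by thresholding angular velocity; everything below the
    threshold is treated as fixation."""
    vel = np.abs(np.gradient(head_yaw) * fps)   # deg/s
    moving = vel > vel_thresh
    intervals, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i                            # shift begins
        elif not m and start is not None:
            intervals.append((start, i))         # shift ends
            start = None
    if start is not None:
        intervals.append((start, len(moving)))
    return intervals
```

A full system would combine several such kinematic cues with scene features to pick plausible gaze targets for each detected shift.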


human-robot interaction | 2018

Shared Dynamic Curves: A Shared-Control Telemanipulation Method for Motor Task Training

Daniel Rakita; Bilge Mutlu; Michael Gleicher; Laura M. Hiatt

In this paper, we present a novel shared-control telemanipulation method that is designed to incrementally improve a user's motor ability. Our method initially corrects for the user's suboptimal control trajectories, gradually giving the user more direct control over a series of training trials as he/she naturally gets more accustomed to the task. Our shared-control method, called Shared Dynamic Curves, blends suboptimal user translation and rotation control inputs with known translation and rotation paths needed to complete a task. Shared Dynamic Curves provide a translation and rotation path in space along which the user can easily guide the robot, and this curve can bend and flex in real-time as a dynamical system to pull the user's motion gracefully toward a goal. We show through a user study that Shared Dynamic Curves afford effective motor learning on certain tasks compared to alternative training methods. We discuss our findings in the context of shared control and speculate on how this method could be applied in real-world scenarios such as job training or stroke rehabilitation.
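The core blending idea can be sketched as an arbitration between the user's raw input and a spring-like pull toward the known task path, with the arbitration shifting toward the user as training progresses. The linear schedule and gain below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def shared_control_step(user_delta, pos, curve_point, trial, n_trials=10, gain=0.5):
    """Blend the user's raw motion with a pull toward the known task curve.
    alpha grows with the trial index, handing the user progressively more
    direct control (alpha = 0: full assistance, alpha = 1: full user control)."""
    alpha = min(1.0, trial / n_trials)
    pull = gain * (curve_point - pos)            # spring-like correction toward the curve
    return pos + alpha * user_delta + (1 - alpha) * pull
```

Early trials are dominated by the corrective pull, so even suboptimal inputs produce good trajectories; late trials pass the user's motion through nearly unmodified.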


The International Journal of Robotics Research | 2018

A flexible optimization-based method for synthesizing intent-expressive robot arm motion

Christopher Bodden; Daniel Rakita; Bilge Mutlu; Michael Gleicher

We present an approach to synthesize robot arm trajectories that effectively communicate the robot’s intent to a human collaborator while achieving task goals. Our approach uses nonlinear constrained optimization to encode task requirements and desired motion properties. Our implementation allows for a wide range of constraints and objectives. We introduce a novel objective function to optimize robot arm motions for intent-expressiveness that works in a range of scenarios and robot arm types. Our formulation supports experimentation with different theories of how viewers interpret robot motion. Through a series of human-subject experiments on real and simulated robots, we demonstrate that our method leads to improved collaborative performance against other methods, including the current state of the art. These experiments also show how our perception heuristic can affect collaborative outcomes.
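One common way to formalize intent-expressiveness (used here only as an illustration, not necessarily the paper's exact perception heuristic) is to ask, at each point along a trajectory, how strongly an observer would infer the true goal from the motion so far, assuming observers expect roughly cost-efficient motion toward goals.

```python
import numpy as np

def legibility_score(traj, goals, goal_idx, beta=1.0):
    """Average, over the trajectory, of the inferred probability of the true
    goal: each goal is scored by how well the path-so-far plus the straight-line
    cost-to-go explains motion toward it, relative to going there directly."""
    probs = []
    start = traj[0]
    for t in range(1, len(traj)):
        so_far = np.sum(np.linalg.norm(np.diff(traj[:t + 1], axis=0), axis=1))
        scores = []
        for g in goals:
            togo = np.linalg.norm(g - traj[t])
            direct = np.linalg.norm(g - start)
            scores.append(np.exp(-beta * (so_far + togo - direct)))
        scores = np.array(scores)
        probs.append(scores[goal_idx] / scores.sum())
    return float(np.mean(probs))
```

An objective like this can be dropped into a constrained trajectory optimizer alongside task constraints, which is the kind of flexibility the formulation above is after.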


human-robot interaction | 2017

Methods for Effective Mimicry-based Teleoperation of Robot Arms

Daniel Rakita

In this research, I report on novel methods to afford more intuitive and efficient robot teleoperation control using human motion. The overall premise of this work is that allowing users to control robots using the natural input space of their arms will lead to task performance and subjective measure benefits over more traditional interfaces. In this paper, I outline completed work on a mimicry-based teleoperation control system that enables improved task proficiency for novice users by mapping their arm motion to a robot arm in real-time, as well as ongoing and future research that will further improve this control paradigm.


robot and human interactive communication | 2016

Motion synopsis for robot arm trajectories

Daniel Rakita; Bilge Mutlu; Michael Gleicher

Monitoring, analyzing, or comparing the motions of a robot can be a critical activity but a tedious and inefficient one in research settings and practical applications. In this paper, we present an approach we call motion synopsis for providing users with a global view of a robot's motion trajectory as a set of key poses in a static 2D image, allowing for more efficient robot motion review, preview, analysis, and comparisons. To accomplish this presentation, we construct a 3D scene, select a camera view direction and position based on the motion data, decide what interior poses should be shown based on robot motion features, and organize the robot mesh models and graphical information in a way that provides the user with an at-a-glance view of the motion. Through examples and a user study, we document how our approach performs against alternative summarization techniques and highlight where the approach offers benefit and where it is limited.
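The step of deciding which interior poses to show can be sketched as a coverage problem over the trajectory. The greedy farthest-point scheme below is a simple stand-in for the paper's feature-based selection, assuming poses are represented as joint-configuration vectors.

```python
import numpy as np

def select_key_poses(traj, k=5):
    """Greedy key-pose selection for a static synopsis image: always keep the
    start and end frames, then repeatedly add the frame farthest (in joint
    space) from the poses already chosen, so the selection covers the motion."""
    chosen = [0, len(traj) - 1]
    while len(chosen) < k:
        dists = [min(np.linalg.norm(traj[i] - traj[j]) for j in chosen)
                 for i in range(len(traj))]
        chosen.append(int(np.argmax(dists)))
    return sorted(chosen)
```

The selected frames would then be rendered as overlaid robot meshes from a single camera view to form the 2D synopsis image.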


international conference on computer graphics and interactive techniques | 2015

Inferring gaze shifts from captured body motion

Daniel Rakita; Tomislav Pejsa; Bilge Mutlu; Michael Gleicher

Motion-captured performances seldom include eye gaze, because capturing this motion requires eye tracking technology that is not typically part of a motion capture setup. Yet having eye gaze information is important, as it tells us what the actor was attending to during capture and it adds to the expressivity of their performance.


robotics science and systems | 2018

RelaxedIK: Real-time Synthesis of Accurate and Feasible Robot Arm Motion

Daniel Rakita; Bilge Mutlu; Michael Gleicher

Collaboration


Dive into Daniel Rakita's collaborations.

Top Co-Authors

Michael Gleicher, University of Wisconsin-Madison
Bilge Mutlu, University of Wisconsin-Madison
Christopher Bodden, University of Wisconsin-Madison
Tomislav Pejsa, University of Wisconsin-Madison
Guru Subramani, University of Wisconsin-Madison
Hongyi Wang, University of Wisconsin-Madison
Jordan Black, University of Wisconsin-Madison
Laura M. Hiatt, United States Naval Research Laboratory
Michael R. Zinn, University of Wisconsin-Madison
Oliver Liu, University of Wisconsin-Madison