Human-centered Control of a Growing Soft Robot for Object Manipulation
Fabio Stroppa, Ming Luo, Giada Gerboni, Margaret M. Coad, Julie M. Walker, Allison M. Okamura
Abstract — We present a user-friendly interface to teleoperate a soft robot manipulator in a complex environment. Key components of the system include a manipulator with a grasping end-effector that grows via tip eversion, gesture-based control, and haptic display to the operator for feedback and guidance. In this initial work, the operator uses the soft robot to build a tower of blocks; future work will extend this to shared autonomy scenarios in which the human operator and robot intelligence are both necessary for task completion.
I. INTRODUCTION

Robots in domestic environments have the potential to both autonomously assist humans and enable a physical presence for remote operators. A variety of operation modalities can potentially lie in the spectrum between autonomy and teleoperation, providing a rich and useful set of robot capabilities. However, this also creates numerous challenges for robot design and intuitive human-in-the-loop control. Whether supervising an autonomous robot or directly controlling a teleoperated one, human operators require intuitive interfaces and a sense of immersion while dealing with a remote environment. Haptic feedback can improve intuition and immersion by directly communicating physical interactions and constraints, and there is the opportunity to design haptic interfaces with the specific feedback modalities and degrees of freedom appropriate for shared autonomy scenarios.

In this work, we present a setup including a user-friendly Motion Capture (MoCap) system to teleoperate a growing soft robot during a manipulation task. The MoCap allows the operator to interact with the robot using an intuitive approach, removing the burden of learning the mapping required by a joystick or another physical interface. Furthermore, the operator is also equipped with a light holdable controller providing kinesthetic and tactile haptics for feedback and guidance, facilitating manipulation in complex and unstructured environments.

Although the scenario considered here consists of simple reaching, grasping, and manipulation of blocks, there are many other applications for shared control of robots in homes or other complex environments. Fig. 1 shows a possible example, in which a remote human operator uses a soft robot to help other humans in a kitchen.

*Toyota Research Institute ("TRI") provided funds to assist the authors with their research, but this article solely reflects the opinions and conclusions of its authors and not TRI or any other Toyota entity. The authors thank A. Thackston and S. Zapolsky for their ideas related to this work. The authors are with the Mechanical Engineering Department, Stanford University, Stanford, CA 94305, USA. [email protected]
In particular, the operator may have a direct line of sight of the robot and be physically in front of it, or can interact through a visual interface (e.g., a screen or a head-mounted display) for remote teleoperation.

Fig. 1. Concept of teleoperation and shared autonomy with a soft robot in a human environment.

II. BACKGROUND

Robot manipulators are desirable for assistance with tasks in domestic environments, although their safety is often limited by their inherent mechanical properties. Soft continuum robots offer the potential for safe physical interactions with humans, and exhibit access and manipulation capabilities in constrained and cluttered environments not achievable by traditional robots [1]. However, environmental contact can drastically alter the motion of soft robots, complicating their control and limiting interaction forces [2]. Prior studies by the authors introduced a novel soft robot that extends by growing from its tip and controls the direction of growth by reversible bending [3]. This robot has been used to navigate cluttered environments and steer toward targets [4], but its use in manipulation has not yet been studied.

When the manipulation task is performed in a shared-autonomy scenario and the operator can exert control over some aspects of the task, sharing the control of the object handling forces is the more natural approach, and it also allows the human operator to receive force information as feedback [5]. Artificial haptic sensations can present information to operators, but the majority of existing devices have focused on applying forces to the operator's finger pads or recreating textures [6], [7]. A previous study from the authors proposed a device that generates force and torque sensations by applying tangential cues to the finger pads [8], specifically for motion guidance. This device is used in the current setup as part of the interface for sharing the autonomy between the operator and the robot.

Fig. 2. System components for human-centered control of a soft robot.

III. MATERIALS AND METHODS

The proposed task consists of manipulating toy blocks in order to build a "block tower" by stacking them on top of one another. The robot is attached to the ceiling and grasps the blocks from above. Fig. 2 shows the proposed system, composed of the soft robot, two main interfaces, two high-level software modules, and a set of cameras for both the MoCap system and the 3D tracking.
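Before describing each component, the sketch below illustrates, under assumed interfaces, how they could be connected in a single control loop: MoCap-driven gesture commands flow to the robot, while camera-based tracking drives guidance cues on the holdable haptic device. All object names and methods here (mocap, interpreter, robot, tracker, haptic) are hypothetical placeholders, not the actual implementation.

```python
# Minimal sketch of a teleoperation loop connecting the system components.
# Every object and method name below is an illustrative assumption.
import time

COMMAND_PERIOD = 0.1  # the robot accepts commands at 10 Hz

def teleoperation_loop(mocap, interpreter, robot, tracker, haptic):
    """Run one command cycle per 100 ms until interrupted."""
    while True:
        t_start = time.time()

        # 1. Read the operator's latest tracked hand pose (sampled at 270 Hz).
        hand_pose = mocap.latest_pose()

        # 2. Map the gesture to robot commands (grow/retract, steer, grasp).
        command = interpreter.interpret(hand_pose)
        robot.apply(command)

        # 3. Update the 3D scene model (robot tip and block poses).
        scene = tracker.update()

        # 4. Render a guidance cue on the holdable haptic device:
        #    toward the desired block, away from dangerous regions.
        haptic.render(scene.guidance_cue())

        # Hold the loop at the 10 Hz command rate.
        time.sleep(max(0.0, COMMAND_PERIOD - (time.time() - t_start)))
```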
Robot: The soft robot (Fig. 3(a)) can grow via pneumatic actuation to an arbitrary length and steer to reach the blocks in the workspace. Once a block is reached, it can be grasped and handled by the gripper mounted at the tip of the robot (see Fig. 3(b)). The base of the gripper is composed of two rings, one inside and one outside the robot, which are connected by rolling magnets. The material of the soft robot can slide along the surface between the magnets when growing or retracting, such that the gripper remains stable at the tip.
Interfaces: The MoCap system is used to control the robot. The PhaseSpace Impulse X2E (phasespace.com) was chosen to track the movements of the operator. Commands are sampled at 270 Hz and sent to the robot at 10 Hz. The haptic device is a holdable interface that allows the operator to move around freely in a large workspace. It consists of a hand-held kinesthetic gripper (Fig. 3(c)) that provides guidance cues in 4 degrees of freedom through skin stretch and kinesthetic feedback at the fingertips. A motor at the hinge allows the fingers to open and close to perform grasping movements, which controls the gripper at the tip of the robot.
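As a rough illustration of how the 270 Hz MoCap stream could be reduced to the 10 Hz command rate, the following sketch buffers recent hand positions and averages them into one command per cycle; the class name and the averaging scheme are assumptions for illustration, not the system's actual implementation.

```python
# Minimal sketch: reduce 270 Hz MoCap samples to one 10 Hz robot command.
from collections import deque
import numpy as np

class CommandDownsampler:
    """Buffers high-rate hand positions and emits one command per 100 ms."""

    def __init__(self, sample_rate=270.0, command_rate=10.0):
        self.window = int(sample_rate / command_rate)  # ~27 samples per command
        self.buffer = deque(maxlen=self.window)

    def add_sample(self, hand_position):
        """Called at 270 Hz with the tracked hand position (3-vector)."""
        self.buffer.append(np.asarray(hand_position, dtype=float))

    def command(self):
        """Called at 10 Hz; averages the buffered samples to reject jitter."""
        if not self.buffer:
            return None
        return np.mean(list(self.buffer), axis=0)
```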
Software: A gesture interpreter is used to map the operator's movements to the kinematics of the soft robot. It recognizes multiple commands such as grow/retract, left/right, and up/down, which can be given simultaneously, and it is independent of the MoCap reference frame, so the operator is not required to stand in a particular position and is free to move comfortably.
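A minimal sketch of such a frame-independent mapping is shown below: the hand offset is expressed relative to a neutral pose captured at start-up, so the operator's absolute position in the MoCap frame does not matter. The deadband value, axis conventions, and command structure are assumptions for illustration only.

```python
# Illustrative gesture interpreter; thresholds and axes are assumed values.
import numpy as np

DEADBAND = 0.05  # meters of hand motion ignored as noise (assumed value)

class GestureInterpreter:
    def __init__(self):
        # Neutral pose and operator-aligned frame, set by calibrate().
        self.neutral = np.zeros(3)
        self.frame = np.eye(3)

    def calibrate(self, hand_position, hand_rotation):
        """Store the operator's neutral hand pose and its orientation."""
        self.neutral = np.asarray(hand_position, dtype=float)
        self.frame = np.asarray(hand_rotation, dtype=float)  # 3x3 rotation

    def interpret(self, hand_position):
        """Map the hand offset from the neutral pose to robot commands.

        The components are independent, so grow/retract, left/right, and
        up/down commands can be given simultaneously.
        """
        offset_world = np.asarray(hand_position, dtype=float) - self.neutral
        # Express the offset in the operator's own frame (frame independence).
        offset = self.frame.T @ offset_world
        offset[np.abs(offset) < DEADBAND] = 0.0
        return {
            "grow": offset[0],      # forward/backward -> grow/retract
            "steer_lr": offset[1],  # sideways -> left/right steering
            "steer_ud": offset[2],  # vertical -> up/down steering
        }
```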
A set of RGB-D cameras (2D images from RGB optical cameras and depth data from infrared cameras) is used for 3D reconstruction of the environment. The robot and the blocks are recognized and tracked in real time by a 3D-processing engine that analyzes their poses in space in order to evaluate a strategy for the robot to reach the targets and handle the task. Based on this evaluation, the module sends cues to the haptic device to help the operator navigate the environment, pointing in the direction of the desired target or away from any dangerous regions.
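The sketch below illustrates one plausible way such a cue could be computed from the tracked poses: attraction toward the desired block combined with repulsion from marked dangerous regions. The gains and hazard radius are assumed values, not parameters of the actual module.

```python
# Illustrative guidance-cue computation; gains and radius are assumptions.
import numpy as np

def guidance_cue(tip_pos, target_pos, hazards, k_attract=1.0,
                 k_repel=0.5, hazard_radius=0.15):
    """Return a unit vector cue pointing the operator toward the target."""
    tip = np.asarray(tip_pos, dtype=float)
    cue = k_attract * (np.asarray(target_pos, dtype=float) - tip)

    for hazard in hazards:
        away = tip - np.asarray(hazard, dtype=float)
        dist = np.linalg.norm(away)
        if 0.0 < dist < hazard_radius:
            # Push away from nearby dangerous regions.
            cue += k_repel * (away / dist) * (hazard_radius - dist)

    norm = np.linalg.norm(cue)
    return cue / norm if norm > 0.0 else cue
```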
Fig. 3. Existing components of the human-centered soft robot manipulation system: (a) a steerable soft robot, (b) a one-degree-of-freedom gripper, and (c) a four-degree-of-freedom holdable device for haptic feedback.

IV. CONCLUSION

Future work includes further development of shared autonomy and teleoperation scenarios, defining metrics for performance, integrating new soft robot and haptic device designs, and creating novel interactions with a variety of perception, modeling, and planning/control strategies.

REFERENCES

[1] D. Trivedi, C. D. Rahn, W. M. Kier, and I. D. Walker, "Soft robotics: Biological inspiration, state of the art, and future research," Applied Bionics and Biomechanics, vol. 5, no. 3, pp. 99–117, 2008.
[2] M. C. Yip and D. B. Camarillo, "Model-less feedback control of continuum manipulators in constrained environments," IEEE Transactions on Robotics, vol. 30, no. 4, pp. 880–889, 2014.
[3] J. D. Greer, T. K. Morimoto, A. M. Okamura, and E. W. Hawkes, "Series pneumatic artificial muscles (sPAMs) and application to a soft continuum robot," in IEEE International Conference on Robotics and Automation (ICRA), 2017, pp. 5503–5510.
[4] M. M. Coad, L. H. Blumenschein, S. Cutler, J. A. R. Zepeda, N. D. Naclerio, H. El-Hussieny, U. Mehmood, J.-H. Ryu, E. W. Hawkes, and A. M. Okamura, "Vine robots: Design, teleoperation, and deployment for navigation and exploration," arXiv preprint arXiv:1903.00069, 2019.
[5] W. B. Griffin, W. R. Provancher, and M. R. Cutkosky, "Feedback strategies for telemanipulation with shared control of object handling forces," Presence: Teleoperators & Virtual Environments, vol. 14, no. 6, pp. 720–731, 2005.
[6] H. Benko, C. Holz, M. Sinclair, and E. Ofek, "NormalTouch and TextureTouch: High-fidelity 3D haptic shape rendering on handheld virtual reality controllers," in Proceedings of the 29th Annual Symposium on User Interface Software and Technology, 2016, pp. 717–728.
[7] E. Whitmire, H. Benko, C. Holz, E. Ofek, and M. Sinclair, "Haptic Revolver: Touch, shear, texture, and shape rendering on a reconfigurable virtual reality controller," in Proc. CHI Conference on Human Factors in Computing Systems, 2018, p. 86.
[8] J. M. Walker, N. Zemiti, P. Poignet, and A. M. Okamura, "Holdable haptic device for 4-DoF motion guidance," in IEEE World Haptics Conference (WHC), 2019.