Katherine J. Kuchenbecker
University of Pennsylvania
Publications
Featured research published by Katherine J. Kuchenbecker.
IEEE Transactions on Visualization and Computer Graphics | 2006
Katherine J. Kuchenbecker; Jonathan Fiene; Günter Niemeyer
Tapping on surfaces in a typical virtual environment feels like contact with soft foam rather than a hard object. The realism of such interactions can be dramatically improved by superimposing event-based, high-frequency transient forces over traditional position-based feedback. When scaled by impact velocity, hand-tuned pulses and decaying sinusoids produce haptic cues that resemble those experienced during real impacts. Our new method for generating appropriate transients inverts a dynamic model of the haptic device to determine the motor forces required to create prerecorded acceleration profiles at the user's fingertips. After development, the event-based haptic paradigm and the method of acceleration matching were evaluated in a carefully controlled user study. Sixteen individuals blindly tapped on nine virtual and three real samples, rating the degree to which each felt like real wood. Event-based feedback achieved significantly higher realism ratings than the traditional rendering method. The display of transient signals made virtual objects feel similar to a real sample of wood on a foam substrate, while position feedback alone received ratings similar to those of foam. This work provides an important new avenue for increasing the realism of contact in haptic interactions.
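As a rough illustration of the transient signals described above, the following sketch generates a decaying-sinusoid force pulse scaled by impact velocity. The function name and all numeric parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

def decaying_sinusoid_transient(impact_velocity, amplitude_gain=1.0,
                                frequency_hz=300.0, decay_rate=90.0,
                                duration_s=0.05, sample_rate_hz=10000):
    """Open-loop force transient for event-based haptic contact.

    Returns a force pulse shaped like a decaying sinusoid and scaled
    linearly by the measured impact velocity. All numeric defaults are
    illustrative assumptions, not tuned values from the paper.
    """
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    envelope = np.exp(-decay_rate * t)
    return amplitude_gain * impact_velocity * envelope * np.sin(2 * np.pi * frequency_hz * t)

# At the moment of virtual impact, this transient would be superimposed
# on the traditional position-based (spring) force for a few milliseconds.
transient = decaying_sinusoid_transient(impact_velocity=0.3)  # m/s
```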
IEEE Transactions on Robotics | 2011
Joseph M. Romano; Kaijen Hsiao; Günter Niemeyer; Sachin Chitta; Katherine J. Kuchenbecker
We present a novel robotic grasp controller that allows a sensorized parallel jaw gripper to gently pick up and set down unknown objects once a grasp location has been selected. Our approach is inspired by the control scheme that humans employ for such actions, which is known to centrally depend on tactile sensation rather than vision or proprioception. Our controller processes measurements from the gripper's fingertip pressure arrays and hand-mounted accelerometer in real time to generate robotic tactile signals that are designed to mimic human SA-I, FA-I, and FA-II channels. These signals are combined into tactile event cues that drive the transitions between six discrete states in the grasp controller: Close, Load, Lift and Hold, Replace, Unload, and Open. The controller selects an appropriate initial grasping force, detects when an object is slipping from the grasp, increases the grasp force as needed, and judges when to release an object to set it down. We demonstrate the promise of our approach through implementation on the PR2 robotic platform, including grasp testing on a large number of real-world objects.
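The six-state structure of the controller lends itself to a simple event-driven state machine. The sketch below is a minimal Python rendering of that idea; the cue names and the slip-handling helper are assumptions for illustration, not the published implementation.

```python
from enum import Enum, auto

class GraspState(Enum):
    """The six discrete states of the grasp controller."""
    CLOSE = auto()
    LOAD = auto()
    LIFT_AND_HOLD = auto()
    REPLACE = auto()
    UNLOAD = auto()
    OPEN = auto()

def next_state(state, cues):
    """Advance the controller on tactile event cues.

    `cues` is a dict of booleans derived from the SA-I/FA-I/FA-II-like
    signals (contact made, grip force reached, set-down detected, etc.);
    the cue names and this flat dispatch are illustrative assumptions.
    """
    transitions = {
        (GraspState.CLOSE, "contact"): GraspState.LOAD,
        (GraspState.LOAD, "force_reached"): GraspState.LIFT_AND_HOLD,
        (GraspState.LIFT_AND_HOLD, "set_down_requested"): GraspState.REPLACE,
        (GraspState.REPLACE, "surface_contact"): GraspState.UNLOAD,
        (GraspState.UNLOAD, "force_released"): GraspState.OPEN,
    }
    for (from_state, cue), to_state in transitions.items():
        if state is from_state and cues.get(cue, False):
            return to_state
    return state  # no triggering cue: stay in the current state

def hold_force(current_force, slip_detected, increment=1.0):
    """Raise the grasp force when vibration-like cues indicate slip."""
    return current_force + increment if slip_detected else current_force
```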
IEEE Transactions on Haptics | 2012
Joseph M. Romano; Katherine J. Kuchenbecker
Modern haptic interfaces are adept at conveying the large-scale shape of virtual objects, but they often provide unrealistic or no feedback when it comes to the microscopic details of surface texture. Direct texture rendering challenges the state of the art in haptics because it requires a finely detailed model of the surface's properties, real-time dynamic simulation of complex interactions, and high-bandwidth haptic output to enable the user to feel the resulting contacts. This paper presents a new, fully realized solution for creating realistic virtual textures. Our system employs a sensorized handheld tool to capture the feel of a given texture, recording three-dimensional tool acceleration, tool position, and contact force over time. We reduce the three-dimensional acceleration signals to a perceptually equivalent one-dimensional signal, and then we use linear predictive coding to distill this raw haptic information into a database of frequency-domain texture models. Finally, we render these texture models in real time on a Wacom tablet using a stylus augmented with small voice coil actuators. The resulting virtual textures provide a compelling simulation of contact with the real surfaces, which we verify through a human subject study.
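The modeling pipeline (reduce to one dimension, fit a linear predictive model, resynthesize) can be sketched in a few lines. The following is a minimal LPC fit-and-resynthesis example assuming the acceleration data has already been reduced to a 1-D signal; the model order and gain handling are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

def fit_lpc_texture_model(accel_1d, order=30):
    """Fit an all-pole (LPC) model to a 1-D texture acceleration signal.

    Solves the Yule-Walker normal equations for the predictor
    coefficients; the model order is an illustrative assumption.
    """
    r = np.correlate(accel_1d, accel_1d, mode="full")[len(accel_1d) - 1:]
    r = r / len(accel_1d)  # biased autocorrelation estimate
    coeffs = solve_toeplitz(r[:order], r[1:order + 1])
    residual_var = r[0] - coeffs @ r[1:order + 1]
    return coeffs, max(residual_var, 0.0)

def synthesize_texture(coeffs, residual_var, n_samples, seed=0):
    """Drive the all-pole filter with white noise to resynthesize vibration."""
    excitation = np.random.default_rng(seed).standard_normal(n_samples)
    excitation *= np.sqrt(residual_var)
    return lfilter([1.0], np.concatenate(([1.0], -coeffs)), excitation)
```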
Proceedings of the IEEE | 2013
Seungmoon Choi; Katherine J. Kuchenbecker
This paper reviews the technology and applications of vibrotactile display, an effective information transfer modality for the emerging area of haptic media. Our emphasis is on summarizing foundational knowledge in this area and providing implementation guidelines for application designers who do not yet have a background in haptics. Specifically, we explain the relevant human vibrotactile perceptual capabilities, detail the main types of commercial vibrotactile actuators, and describe how to build both monolithic and localized vibrotactile displays. We then identify exemplary vibrotactile display systems in application areas ranging from the presentation of physical object properties to broadcasting vibrotactile media content.
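As a toy example of driving a single vibrotactile actuator, the sketch below synthesizes a ramped sine wave near 250 Hz, where human (Pacinian-channel) vibrotactile sensitivity is greatest; every parameter value is an illustrative assumption.

```python
import numpy as np

def vibrotactile_waveform(duration_s=0.2, carrier_hz=250.0,
                          amplitude=0.8, ramp_s=0.01, sample_rate_hz=8000):
    """Drive signal for a voice-coil vibrotactile actuator.

    Uses a carrier near 250 Hz, close to the peak of human vibrotactile
    sensitivity; the ramped envelope avoids abrupt onset transients.
    Parameter values are illustrative assumptions.
    """
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    envelope = np.minimum(1.0, np.minimum(t, duration_s - t) / ramp_s)
    return amplitude * envelope * np.sin(2 * np.pi * carrier_hz * t)
```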
IEEE Transactions on Haptics | 2011
William McMahan; Jamie Gewirtz; Dorsey Standish; Paul Martin; Jacquelyn A. Kunkel; Magalie Lilavois; Alexei Wedmid; David I. Lee; Katherine J. Kuchenbecker
Minimally invasive telerobotic surgical systems enable surgeons to perform complicated procedures without large incisions. Unfortunately, these systems typically do not provide the surgeon with sensory feedback aside from stereoscopic vision. We have therefore developed VerroTouch, a sensing and actuating device that can be added to Intuitive Surgical's existing da Vinci S Surgical System to provide auditory and vibrotactile feedback of tool contact accelerations. These cues let the surgeon feel and hear contact with rough textures as well as the making and breaking of contact with objects and other tools. To evaluate the merits of this approach, we had 11 surgeons use an augmented da Vinci S to perform three in vitro manipulation tasks under four different feedback conditions: with no acceleration feedback, with audio feedback, with haptic feedback, and with both audio and haptic feedback. Subjects expressed a significant preference for the inclusion of tool contact acceleration feedback, although they disagreed over which sensory modality was best. Other survey responses and qualitative written comments indicate that the feedback may have improved the subjects' concentration and situational awareness by strengthening the connection between the surgeon and the surgical instruments. Analysis of quantitative task metrics shows that the feedback neither improves nor impedes the performance of the chosen tasks.
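A minimal sketch of this style of acceleration feedback appears below: high-pass filter the tool accelerations to isolate contact transients, then combine them into one drive signal for a voice-coil actuator and a speaker. The filter order, cutoff, and axis summation are assumptions, not the VerroTouch design parameters.

```python
import numpy as np
from scipy.signal import butter, lfilter

def contact_acceleration_feedback(accel_xyz, sample_rate_hz=5000.0,
                                  cutoff_hz=50.0, gain=1.0):
    """Turn tool-mounted accelerometer data into a vibrotactile drive signal.

    `accel_xyz` is an (N, 3) array of acceleration samples. High-pass
    filtering each axis isolates contact transients from gross tool
    motion; summing the axes yields a single channel for the actuator
    and speaker. Cutoff, order, and gain are illustrative assumptions.
    """
    b, a = butter(2, cutoff_hz / (sample_rate_hz / 2), btype="highpass")
    filtered = lfilter(b, a, accel_xyz, axis=0)
    return gain * filtered.sum(axis=1)
```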
The International Journal of Robotics Research | 2005
William R. Provancher; Mark R. Cutkosky; Katherine J. Kuchenbecker; Günter Niemeyer
We present a new tactile display for use in dexterous telemanipulation and virtual reality. Our system renders the location of the contact centroid moving on the user’s fingertip. Constructed in a thimble-sized package and mounted on a haptic force-feedback device, it provides the user with concurrent feedback of contact location and interaction forces. We believe such a design will enable more versatile object manipulation and richer haptic interactions. To evaluate this display concept, we conducted two perceptual experiments. First, human subjects judged object curvature using both direct manipulation of physical models and virtual manipulation via the device. Results show similar levels of discrimination in real and virtual interactions, indicating the device can effectively portray contact information. Second, we investigated virtual interactions with rolling and anchored objects and demonstrated that users are able to distinguish the interaction type using our device. These experiments give insight into the sensitivity of human perception and suggest that even a simple display of the contact centroid location may significantly enhance telerobotic or virtual grasping tasks.
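For reference, a contact centroid can be computed from a fingertip pressure array as a pressure-weighted average of taxel locations, as in the sketch below; the array layout and function name are hypothetical, not the paper's sensing pipeline.

```python
import numpy as np

def contact_centroid(pressure, taxel_positions):
    """Pressure-weighted contact centroid on the fingertip.

    `pressure` is a 1-D array of taxel readings and `taxel_positions`
    an (N, 2) array of their locations on the fingertip surface; both
    layouts are illustrative assumptions.
    """
    total = pressure.sum()
    if total <= 0.0:
        return None  # no contact to display
    return (pressure[:, None] * taxel_positions).sum(axis=0) / total
```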
ASME 2003 International Mechanical Engineering Congress and Exposition | 2003
Katherine J. Kuchenbecker; June Gyu Park; Günter Niemeyer
Haptic displays provide the user with a sense of touch in both simulation of virtual environments and teleoperation of remote robots. The instantaneous impedance of the user’s hand affects this force interaction, changing the transients experienced during activities such as exploratory tapping. This research characterizes the behavior of the human wrist joint while holding a stylus in a three-fingered grasp. Nonparametric identification methods, evaluating frequency- and time-responses, support a second-order system model. Further analysis shows a positive linear correlation between grip force and wrist impedance for all subjects, though each individual’s trend is unique. These findings suggest that a quick calibration procedure and a real-time grip force measurement could enable a haptic display to predict user response characteristics throughout an interaction. Such knowledge would enable haptic control algorithms to adapt continuously to the user’s instantaneous state for improved performance.
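The proposed scheme (calibrate a per-user line from grip force to wrist stiffness, then evaluate a second-order model in real time) might look like the following sketch; the damping and mass placeholders are assumptions, not identified values from the study.

```python
import numpy as np

def calibrate_stiffness_vs_grip(grip_forces, stiffnesses):
    """Least-squares line relating measured wrist stiffness to grip force.

    Reflects the positive linear correlation reported in the paper;
    slopes and intercepts differ per subject, so calibration is per user.
    """
    slope, intercept = np.polyfit(grip_forces, stiffnesses, 1)
    return slope, intercept

def predicted_wrist_force(x, x_dot, x_ddot, grip_force, slope, intercept,
                          damping=0.05, mass=0.01):
    """Second-order user model F = M*x_ddot + B*x_dot + K*x, with K
    predicted from the real-time grip force. The damping and mass
    values are placeholders, not parameters identified in the study."""
    stiffness = slope * grip_force + intercept
    return mass * x_ddot + damping * x_dot + stiffness * x
```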
IEEE International Conference on Robotics and Automation | 2013
Vivian Chu; Ian McMahon; Lorenzo Riano; Craig G. McDonald; Qin He; Jorge Martinez Perez-Tejada; Michael Arrigo; Naomi T. Fitter; John C. Nappo; Trevor Darrell; Katherine J. Kuchenbecker
Delivering on the promise of real-world robotics will require robots that can communicate with humans through natural language by learning new words and concepts through their daily experiences. Our research strives to create a robot that can learn the meaning of haptic adjectives by directly touching objects. By equipping the PR2 humanoid robot with state-of-the-art biomimetic tactile sensors that measure temperature, pressure, and fingertip deformations, we created a platform uniquely capable of feeling the physical properties of everyday objects. The robot used five exploratory procedures to touch 51 objects that were annotated by human participants with 34 binary adjective labels. We present both static and dynamic learning methods to discover the meaning of these adjectives from the labeled objects, achieving average F1 scores of 0.57 and 0.79 on a set of eight previously unfelt items.
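A minimal version of the static learning variant could train one binary classifier per adjective and report the mean F1 score on held-out objects, as sketched below with scikit-learn; the feature design and choice of logistic regression are assumptions, not the paper's exact learners.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

def train_adjective_classifiers(features, labels):
    """One binary classifier per haptic adjective (static variant).

    `features` is (n_objects, n_features) of summary statistics from the
    tactile signals and `labels` is (n_objects, n_adjectives) of binary
    annotations; the feature design and logistic-regression choice are
    illustrative assumptions.
    """
    return [LogisticRegression(max_iter=1000).fit(features, labels[:, j])
            for j in range(labels.shape[1])]

def average_f1(classifiers, test_features, test_labels):
    """Mean F1 across adjectives on previously unfelt objects."""
    scores = [f1_score(test_labels[:, j], clf.predict(test_features))
              for j, clf in enumerate(classifiers)]
    return float(np.mean(scores))
```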
Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2005
Katherine J. Kuchenbecker; Jonathan Fiene; Günter Niemeyer
Contact in a typical haptic environment resembles the experience of tapping on soft foam, rather than on a hard object. Event-based, high-frequency transient forces must be superimposed with traditional proportional feedback to provide realistic haptic cues at impact. We have developed a new method for matching the accelerations experienced during real contact, inverting a dynamic model of the device to compute appropriate force feedback transients. We evaluated this haptic rendering paradigm by conducting a study in which users blindly rated the realism of tapping on a variety of virtually rendered surfaces as well as on three real objects. Event-based feedback significantly increased the realism of the virtual surfaces, and the acceleration matching strategy was rated similarly to a sample of real wood on a foam substrate. This work provides a new avenue for achieving realism of contact in haptic interactions.
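The model-inversion step can be sketched with the simplest possible device model: treat the device as a mass-damper, integrate the prerecorded acceleration profile once for velocity, and compute the open-loop motor force. The mass-damper structure and its parameter values below are illustrative assumptions about the device dynamics, not the identified model from the paper.

```python
import numpy as np

def acceleration_matching_forces(desired_accel, sample_rate_hz=1000.0,
                                 mass=0.1, damping=2.0):
    """Invert a mass-damper device model to get open-loop motor forces.

    Given a prerecorded acceleration profile (m/s^2), integrates it once
    for velocity and computes F = m*a + b*v at each sample. The model
    structure and parameter values are illustrative assumptions.
    """
    dt = 1.0 / sample_rate_hz
    velocity = np.cumsum(desired_accel) * dt
    return mass * desired_accel + damping * velocity
```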
IEEE Transactions on Biomedical Engineering | 2016
Claudio Pacchierotti; Domenico Prattichizzo; Katherine J. Kuchenbecker
Despite the expected clinical benefits of haptic feedback, current teleoperated surgical robots do not provide it to the surgeon, largely because grounded forces can destabilize the system's closed-loop controller. This paper presents an alternative approach that enables the surgeon to feel fingertip contact deformations and vibrations while guaranteeing the teleoperator's stability. We implemented our cutaneous feedback solution on an Intuitive Surgical da Vinci Standard robot by mounting a SynTouch BioTac tactile sensor to the distal end of a surgical instrument and a custom cutaneous display to the corresponding master controller. As the user probes the remote environment, the contact deformations, DC pressure, and AC pressure (vibrations) sensed by the BioTac are directly mapped to input commands for the cutaneous device's motors using a model-free algorithm based on look-up tables. The cutaneous display continually moves, tilts, and vibrates a flat plate at the operator's fingertip to optimally reproduce the tactile sensations experienced by the BioTac. We tested the proposed approach by having eighteen subjects use the augmented da Vinci robot to palpate a heart model with no haptic feedback, only deformation feedback, and deformation plus vibration feedback. Fingertip deformation feedback significantly improved palpation performance by reducing the task completion time, the pressure exerted on the heart model, and the subjects' absolute error in detecting the orientation of the embedded plastic stick. Vibration feedback significantly improved palpation performance only for the seven subjects who dragged the BioTac across the model, rather than pressing straight into it.
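The look-up-table mapping is essentially one-dimensional interpolation from a sensed quantity to a motor command, as in the sketch below; the breakpoint values are illustrative assumptions, not calibrated device data.

```python
import numpy as np

def motor_command_from_lookup(sensed, table_inputs, table_outputs):
    """Model-free sensor-to-actuator mapping via a look-up table.

    Linearly interpolates a sensed BioTac quantity (e.g. DC pressure)
    into a motor command; the table contents are illustrative
    assumptions, not calibrated device data.
    """
    return np.interp(sensed, table_inputs, table_outputs)

# Example: map DC pressure (arbitrary sensor units) to plate-motor position.
pressure_breakpoints = np.array([0.0, 500.0, 1500.0, 3000.0])
motor_positions = np.array([0.0, 0.2, 0.6, 1.0])  # normalized travel
command = motor_command_from_lookup(800.0, pressure_breakpoints, motor_positions)
```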