Publication


Featured research published by Heather Culbertson.


IEEE Transactions on Haptics | 2014

Modeling and Rendering Realistic Textures from Unconstrained Tool-Surface Interactions

Heather Culbertson; Juliette Unwin; Katherine J. Kuchenbecker

Texture gives real objects an important perceptual dimension that is largely missing from virtual haptic interactions due to limitations of standard modeling and rendering approaches. This paper presents a set of methods for creating a haptic texture model from tool-surface interaction data recorded by a human in a natural and unconstrained manner. The recorded high-frequency tool acceleration signal, which varies as a function of normal force and scanning speed, is segmented and modeled as a piecewise autoregressive (AR) model. Each AR model is labeled with the source segment's median force and speed values and stored in a Delaunay triangulation to create a model set for a given texture. We use these texture model sets to render synthetic vibration signals in real time as a user interacts with our TexturePad system, which includes a Wacom tablet and a stylus augmented with a Haptuator. We ran a human-subject study with two sets of ten participants to evaluate the realism of our virtual textures and the strengths and weaknesses of this approach. The results indicated that our virtual textures accurately capture and recreate the roughness of real textures, but other modeling and rendering approaches are required to completely match surface hardness and slipperiness.
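
As an illustration of the model-lookup structure described in this abstract, the sketch below (not the authors' code; the point labels and values are invented) stores hypothetical (force, speed) labels in a scipy Delaunay triangulation and returns, for a query point, the three enclosing models and their barycentric weights.

```python
# A minimal sketch of organizing a texture model set: each AR model is labeled
# with its segment's median force and speed, and the labels are triangulated so
# a query (force, speed) point maps to three surrounding models plus weights.
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical (force [N], speed [mm/s]) labels for the stored AR models.
labels = np.array([[0.2, 40.0], [0.5, 60.0], [0.9, 120.0],
                   [0.3, 150.0], [0.7, 30.0], [1.1, 90.0]])
tri = Delaunay(labels)

def lookup(force, speed):
    """Return indices of the three enclosing AR models and their barycentric weights."""
    query = np.array([[force, speed]])
    simplex = tri.find_simplex(query)[0]
    if simplex == -1:                      # query lies outside the convex hull
        return None
    vertices = tri.simplices[simplex]      # indices of the three labeled models
    T = tri.transform[simplex]             # affine map to barycentric coordinates
    b = T[:2].dot(query[0] - T[2])
    weights = np.append(b, 1.0 - b.sum())
    return vertices, weights

print(lookup(0.6, 80.0))
```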


World Haptics Conference | 2013

Generating haptic texture models from unconstrained tool-surface interactions

Heather Culbertson; Juliette Unwin; Benjamin E. Goodman; Katherine J. Kuchenbecker

If you pick up a tool and drag its tip across a table, a rock, or a swatch of fabric, you are able to feel variations in the textures even though you are not directly touching them. These vibrations are characteristic of the material and the motions made when interacting with the surface. This paper presents a new method for creating haptic texture models from data recorded during natural and unconstrained motions using a new haptic recording device. The recorded vibration data is parsed into short segments that represent the feel of the surface at the associated tool force and speed. We create a low-order auto-regressive (AR) model for each data segment and construct a Delaunay triangulation of models in force-speed space for each surface. During texture rendering, we stably interpolate between these models using barycentric coordinates and drive the interpolated model with white noise to output synthetic vibrations. Our methods were validated through application to data recorded by eight human subjects and the experimenter interacting with six textures. We present a new spectral metric for determining perceptual match of the models in order to evaluate the effectiveness and consistency of the segmenting and modeling approach. Multidimensional scaling (MDS) on the pairwise differences in the synthesized vibrations shows that the 54 created texture models cluster by texture in a two-dimensional perceptual space.
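
The synthesis step described here, driving a low-order AR model with white noise, can be sketched as follows; the coefficients, noise level, and 10 kHz rate are placeholders rather than values from the paper.

```python
# A minimal sketch of AR-based vibration synthesis: a low-order AR model
# (coefficients and residual variance) is driven with white noise to produce a
# vibration-like signal. All numbers below are invented for illustration.
import numpy as np
from scipy.signal import lfilter

a = np.array([1.0, -1.6, 0.7])     # AR(2) polynomial: x[n] = 1.6 x[n-1] - 0.7 x[n-2] + e[n]
noise_std = 0.05                   # standard deviation of the driving white noise
fs = 10_000                        # assumed output sampling rate in Hz

e = np.random.default_rng(0).normal(0.0, noise_std, size=fs)  # one second of excitation
vibration = lfilter([1.0], a, e)   # all-pole filter implements the AR recursion
```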


IEEE Haptics Symposium | 2014

One hundred data-driven haptic texture models and open-source methods for rendering on 3D objects

Heather Culbertson; Juan Jose Lopez Delgado; Katherine J. Kuchenbecker

This paper introduces the Penn Haptic Texture Toolkit (HaTT), a publicly available repository of haptic texture models for use by the research community. HaTT includes 100 haptic texture and friction models, the recorded data from which the models were made, images of the textures, and the code and methods necessary to render these textures using an impedance-type haptic interface such as a SensAble Phantom Omni. This paper reviews our previously developed methods for modeling haptic virtual textures, describes our technique for modeling Coulomb friction between a tooltip and a surface, discusses the adaptation of our rendering methods for display using an impedance-type haptic device, and provides an overview of the information included in the toolkit. Each texture and friction model was based on a ten-second recording of the force, speed, and high-frequency acceleration experienced by a handheld tool moved by an experimenter against the surface in a natural manner. We modeled each texture's recorded acceleration signal as a piecewise autoregressive (AR) process and stored the individual AR models in a Delaunay triangulation as a function of the force and speed used when recording the data. To increase the adaptability and utility of HaTT, we developed a method for resampling the texture models so they can be rendered at a sampling rate other than the 10 kHz used when recording data. Measurements of the user's instantaneous normal force and tangential speed are used to synthesize texture vibrations in real time. These vibrations are transformed into a texture force vector that is added to the friction and normal force vectors for display to the user.
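
A rough sketch of the per-sample force composition described above is given below; the function names, friction coefficient, and the choice to apply the vibration along the direction of motion are illustrative assumptions, not HaTT's actual code.

```python
# A minimal sketch (under assumed conventions) of composing one haptic force
# sample from normal, Coulomb friction, and texture-vibration terms.
import numpy as np

def render_step(f_normal, v_tangential, surface_normal, synth_vibration, mu=0.3, gain=1.0):
    """Compose one force sample; synth_vibration is a hypothetical model callable."""
    speed = np.linalg.norm(v_tangential)
    # Normal force pushes the tool out of the surface.
    f_n = f_normal * surface_normal
    # Coulomb friction opposes the tangential motion.
    f_fric = -mu * f_normal * (v_tangential / speed) if speed > 1e-6 else np.zeros(3)
    # Texture vibration (a scalar sample from the data-driven model) mapped to a
    # force along the direction of motion; other mappings are possible.
    a_tex = synth_vibration(f_normal, speed)
    f_tex = gain * a_tex * (v_tangential / speed) if speed > 1e-6 else np.zeros(3)
    return f_n + f_fric + f_tex
```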


IEEE Haptics Symposium | 2012

Refined methods for creating realistic haptic virtual textures from tool-mediated contact acceleration data

Heather Culbertson; Joseph M. Romano; Pablo Castillo; Max Mintz; Katherine J. Kuchenbecker

Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one's scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control. We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user's current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.
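
As a simplified illustration of the modeling step, the sketch below fits a low-order AR model to a recorded acceleration segment via the Yule-Walker equations; the paper itself uses ARMA models selected with spectral-match and prediction-error metrics, so this conveys only the general idea.

```python
# A simplified sketch of data-driven vibration modeling: estimate AR coefficients
# and driving-noise variance for one recorded segment with Yule-Walker.
import numpy as np

def fit_ar_yule_walker(x, order):
    """Return AR coefficients a[1..p] and white-noise variance for segment x."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimates r[0..p].
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])          # x[n] ~= sum_k a[k] * x[n-k]
    sigma2 = r[0] - np.dot(a, r[1:])       # variance of the white-noise drive
    return a, sigma2

# Example on a synthetic stand-in for a recorded segment:
rng = np.random.default_rng(1)
segment = rng.normal(size=2000)
print(fit_ar_yule_walker(segment, order=4))
```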


IEEE Transactions on Haptics | 2017

Importance of Matching Physical Friction, Hardness, and Texture in Creating Realistic Haptic Virtual Surfaces

Heather Culbertson; Katherine J. Kuchenbecker

Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.
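
One common way to represent the tapping-transient component is a decaying sinusoid whose amplitude scales with impact velocity; the sketch below shows that general form with invented parameters rather than the paper's fitted per-surface values.

```python
# A minimal sketch of a tapping transient: a velocity-scaled decaying sinusoid.
# Frequency, decay rate, and gain are placeholders, not fitted surface parameters.
import numpy as np

def tapping_transient(v_in, fs=10_000, duration=0.03, freq=300.0, decay=120.0, gain=2.0):
    """Return an acceleration transient for an impact at incoming speed v_in (m/s)."""
    t = np.arange(int(duration * fs)) / fs
    return gain * v_in * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)

transient = tapping_transient(v_in=0.15)   # transient for a 0.15 m/s tap
```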


IEEE Haptics Symposium | 2016

Modeling and design of asymmetric vibrations to induce ungrounded pulling sensation through asymmetric skin displacement

Heather Culbertson; Julie M. Walker; Allison M. Okamura

When subjected to asymmetric accelerations at the fingertips, a human perceives an ungrounded pulling sensation. This paper presents a model and analysis of the underlying physics and perception behind this pulling sensation. We design a system with voicecoil actuators that are driven with step-ramp current pulses to create asymmetric accelerations. When the voicecoils are attached to a handle, the force pulses are translated to skin displacement at the user's fingertips. This skin displacement, which is asymmetric in both amplitude and speed, is perceived as a directional pulling force, rather than as a vibration. Our analysis shows that the frequency and pulse width of the actuator inputs are important in creating effective skin displacement profiles. Both the amplitude and speed of skin displacement should be maximized in the desired direction, and minimized in the opposite direction. Our physical model of the system allows us to optimize the parameters of the actuator commands without time-intensive human-subject studies.
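
In the spirit of the asymmetric drive signals described above, the sketch below generates a zero-mean pulse train whose brief, strong segment points in the desired direction; the pulse rate, duty cycle, and amplitudes are illustrative and not the paper's optimized actuator commands.

```python
# A minimal sketch of an asymmetric drive signal: each period has a short, strong
# segment in the desired direction and a longer, weaker return, so the signal has
# zero mean but asymmetric peak amplitudes.
import numpy as np

def asymmetric_pulse_train(fs=10_000, f_pulse=40.0, duty=0.15, seconds=1.0):
    period = int(fs / f_pulse)
    n_fast = int(duty * period)                  # short, strong segment
    n_slow = period - n_fast                     # long, weak return segment
    fast = np.full(n_fast, 1.0)
    slow = np.full(n_slow, -n_fast / n_slow)     # chosen so each period sums to zero
    one_period = np.concatenate([fast, slow])
    return np.tile(one_period, int(seconds * f_pulse))

drive = asymmetric_pulse_train()                 # one second of drive samples
```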


World Haptics Conference | 2015

Should haptic texture vibrations respond to user force and speed?

Heather Culbertson; Katherine J. Kuchenbecker

Dragging a tool across a textured surface produces vibrations that convey important perceptual information about the interaction and the underlying qualities of the surface. These vibrations depend on the motions of the tool and respond to both normal force and tangential speed. This paper explores various methods of simulating haptic texture interactions by rendering tool vibrations that are based on recorded data. We designed and ran a human-subject study (N=15) to analyze the importance of creating virtual texture vibrations that respond to user force and speed. Our analysis of data from fifteen textures showed that removing speed responsiveness did cause a statistically significant decrease in perceived realism, but removing force responsiveness did not. This result indicates that virtual textures aiming to simulate real surfaces should vary the rendered vibrations with user speed but may not need to vary them with user force.


User Interface Software and Technology | 2017

Grabity: A Wearable Haptic Interface for Simulating Weight and Grasping in Virtual Reality

Inrak Choi; Heather Culbertson; Mark R. Miller; Alex Olwal; Sean Follmer

Ungrounded haptic devices for virtual reality (VR) applications lack the ability to convincingly render the sensations of a grasped virtual object's rigidity and weight. We present Grabity, a wearable haptic device designed to simulate kinesthetic pad opposition grip forces and weight for grasping virtual objects in VR. The device is mounted on the index finger and thumb and enables precision grasps with a wide range of motion. A unidirectional brake creates rigid grasping force feedback. Two voice coil actuators create virtual force tangential to each finger pad through asymmetric skin deformation. These forces can be perceived as gravitational and inertial forces of virtual objects. The rotational orientation of the voice coil actuators is passively aligned with the real direction of gravity through a revolute joint, causing the virtual forces to always point downward. This paper evaluates the performance of Grabity through two user studies, finding promising ability to simulate different levels of weight with convincing object rigidity. The first user study shows that Grabity can convey various magnitudes of weight and force sensations to users by manipulating the amplitude of the asymmetric vibration. The second user study shows that users can differentiate different weights in a virtual environment using Grabity.


Human Factors in Computing Systems | 2017

WAVES: A Wearable Asymmetric Vibration Excitation System for Presenting Three-Dimensional Translation and Rotation Cues

Heather Culbertson; Julie M. Walker; Michael Raitor; Allison M. Okamura

WAVES, a Wearable Asymmetric Vibration Excitation System, is a novel wearable haptic device for presenting three dimensions of translation and rotation guidance cues. In contrast to traditional vibration feedback, which usually requires that users learn to interpret a binary cue, asymmetric vibrations have been shown to induce a pulling sensation in a desired direction. When attached to the fingers, a single voicecoil actuator presents a translation guidance cue and a pair of voicecoil actuators presents a rotation guidance cue. The directionality of mechanoreceptors in the skin led to our choice of the location and orientation of the actuators in order to elicit very strong sensations in certain directions. For example, users distinguished a left cue versus a right cue 94.5% of the time. When presented with one of six possible direction cues, users on average correctly identified the direction of translation cues 86.1% of the time and rotation cues 69.0% of the time.


IEEE Haptics Symposium | 2016

A dual-flywheel ungrounded haptic feedback system provides single-axis moment pulses for clear direction signals

Julie M. Walker; Michael Raitor; Alex Mallery; Heather Culbertson; Philipp J. Stolka; Allison M. Okamura

Flywheel rotation provides a means for ungrounded kinesthetic (force) feedback, but devices that use a single flywheel to generate moment pulses inherently generate undesirable torque in directions orthogonal to the desired torque. We present a novel dual-flywheel gyro-moment haptic device capable of delivering short moment pulses about isolated axes in three-dimensional space. Rotating a single spinning flywheel about an axis creates a reaction moment about an orthogonal axis. Summing the moments created by two flywheels spinning in opposite directions eliminates moment components due to gyroscopic effects, leaving a moment about a constant axis. A dynamic model and torque output measurements of the dual-flywheel system demonstrate that the dual-flywheel approach generates moments about any 3D axis with no orthogonal moment components. We hypothesize that this system presents easily discernible haptic feedback that users can interpret as a direction signal. A prototype device and user study demonstrate experimentally that moment pulses generated by the dual-flywheel system resulted in easily distinguishable direction signals.
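
The cancellation argument can be checked with a toy calculation (an illustration under assumed inertia and spin values, not the paper's dynamic model): each flywheel's gyroscopic moment is the tilt rate crossed with its spin angular momentum, and opposite spin directions make those terms sum to zero, leaving only the inertial reaction about the fixed tilt axis.

```python
# A simplified numeric check of the gyroscopic cancellation: two flywheels with
# opposite spin, tilted at the same rate, produce gyroscopic moments that cancel.
import numpy as np

I_s = 2.0e-5          # assumed flywheel spin inertia [kg m^2]
omega_s = 500.0       # assumed spin speed [rad/s]
Omega = np.array([0.0, 10.0, 0.0])         # shared tilt rate about the y axis [rad/s]
s_hat = np.array([0.0, 0.0, 1.0])          # spin axis along z

L1 = I_s * omega_s * s_hat                 # angular momentum of flywheel 1
L2 = -L1                                   # flywheel 2 spins in the opposite direction
M_gyro = np.cross(Omega, L1) + np.cross(Omega, L2)
print(M_gyro)                              # -> [0. 0. 0.]: gyroscopic terms cancel
```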

Collaboration


Dive into Heather Culbertson's collaborations.

Top Co-Authors

Juliette Unwin
University of Pennsylvania