Publication


Featured research published by Netta Gurari.


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2009

Stiffness discrimination with visual and proprioceptive cues

Netta Gurari; Katherine J. Kuchenbecker; Allison M. Okamura

This study compares the Weber fraction for human perception of stiffness among three conditions: vision, proprioceptive motion feedback, and their combination. To make comparisons between these feedback conditions, a novel haptic device was designed that senses the spring behavior through encoder and force measurements and implements a controller to render linear virtual springs, so that the stimuli displayed haptically could be compared with their visual counterparts. The custom-designed, torque-controlled haptic interface non-invasively controls the availability of proprioceptive motion feedback in unimpaired individuals using a virtual environment. When proprioception is available, the user feels an MCP joint rotation that is proportional to his or her finger force. When proprioception is not available, the actual finger is not allowed to move, but a virtual finger displayed graphically moves in proportion to the user's applied force. Visual feedback is provided and removed by turning this graphical display on and off. Weber fractions were generated from an experiment in which users examined pairs of springs and attempted to identify the spring with the higher stiffness. To account for slight trial-to-trial variations in the relationship between force and position in the proprioceptive feedback conditions, our analysis uses measurements of the actual rendered stiffness rather than the commanded stiffness. Results for 10 users give average Weber fractions of 0.056 for vision, 0.036 for proprioception, and 0.039 for their combination, indicating that proprioception is important for stiffness perception in this experimental setup. The long-term goal of this research is to motivate and develop methods for providing proprioceptive feedback to wearers of dexterous upper-limb prostheses.
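
The analysis above uses the rendered rather than the commanded stiffness. The sketch below shows one way such a per-trial value could be recovered from logged encoder and force data, assuming a simple least-squares line fit; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def rendered_stiffness(position_m: np.ndarray, force_n: np.ndarray) -> float:
    """Estimate the stiffness actually rendered in one trial by fitting a line
    to the logged force-versus-position samples; the slope is the stiffness in
    N/m and the intercept absorbs any constant force offset."""
    slope, _intercept = np.polyfit(position_m, force_n, deg=1)
    return float(slope)

# Synthetic check: a nominal 300 N/m spring rendered with small force noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 0.02, 200)                  # finger displacement, m
f = 300.0 * x + rng.normal(0.0, 0.02, x.size)    # measured force, N
print(f"estimated rendered stiffness: {rendered_stiffness(x, f):.1f} N/m")
```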


IEEE International Conference on Rehabilitation Robotics | 2007

Effects of Visual and Proprioceptive Motion Feedback on Human Control of Targeted Movement

Katherine J. Kuchenbecker; Netta Gurari; Allison M. Okamura

This research seeks to ascertain the relative value of visual and proprioceptive motion feedback during force-based control of a non-self entity such as a powered prosthesis. Accurately controlling such a device is very difficult when the operator cannot see or feel the movement that results from applied forces. As an analogy to prosthesis use, we tested the relative importance of visual and proprioceptive motion feedback during targeted force-based movement. Thirteen human subjects performed a virtual finger-pointing task in which the virtual finger's velocity was always programmed to be directly proportional to the MCP joint torque applied by the subject's right index finger. During successive repetitions of the pointing task, the system conveyed the virtual finger's motion to the user through four combinations of graphical display (vision) and finger movement (proprioception). Success rate, speed, and qualitative ease of use were recorded, and visual motion feedback was found to increase all three performance measures. Proprioceptive motion feedback significantly improved success rate and ease of use, but it yielded slower motions. The results indicate that proprioceptive motion feedback improves human control of targeted movement in both sighted and unsighted conditions, supporting the pursuit of artificial proprioception for prosthetics and underscoring the importance of motion feedback for other force-controlled human-machine systems, such as interactive virtual environments and teleoperators.
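
As a rough illustration of the rate-control mapping described above, the virtual finger could be driven as sketched below; the gain value and all names are our own, not taken from the study.

```python
def virtual_finger_velocity(mcp_torque_nm: float, gain_deg_per_s_per_nm: float = 200.0) -> float:
    """Rate control: the virtual finger's angular velocity (deg/s) is directly
    proportional to the MCP joint torque applied by the user (N*m)."""
    return gain_deg_per_s_per_nm * mcp_torque_nm

# Integrating at the control rate turns a sustained fingertip force into a
# steady pointing motion of the graphical finger.
angle_deg, dt_s = 0.0, 0.001
for _ in range(1000):                      # one simulated second at 1 kHz
    angle_deg += virtual_finger_velocity(0.05) * dt_s
print(f"angle after 1 s of 0.05 N*m torque: {angle_deg:.1f} deg")
```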


Journal of Neuroscience Methods | 2014

Customization, control, and characterization of a commercial haptic device for high-fidelity rendering of weak forces

Netta Gurari; Gabriel Baud-Bovy

BACKGROUND The emergence of commercial haptic devices offers new research opportunities to enhance our understanding of the human sensory-motor system. Yet commercial device capabilities have limitations that need to be addressed. This paper describes the customization of a commercial force-feedback device for displaying forces with a precision that exceeds the human force perception threshold. NEW METHOD The device was outfitted with a multi-axis force sensor and closed-loop controlled to improve its transparency. Additionally, two force-sensing resistors were attached to the device to measure grip force. Force errors were modeled in the frequency and time domains to identify contributions from mass, viscous friction, and Coulomb friction during open- and closed-loop control. The effect of user interaction on system stability was assessed in the context of a user study that aimed to measure force perceptual thresholds. RESULTS Findings based on 15 participants demonstrate that the system maintains stability when rendering forces ranging from 0 to 0.20 N, with an average maximum absolute force error of 0.041 ± 0.013 N. Modeling the force errors revealed that Coulomb friction and inertia were the main contributors to force distortions during slow and fast motions, respectively. COMPARISON WITH EXISTING METHODS Existing commercial force-feedback devices cannot render forces with the precision required for certain testing scenarios. Building on existing robotics work, this paper shows how a device can be customized to make it reliable for studying the perception of weak forces. CONCLUSIONS The customized and closed-loop controlled device is suitable for measuring force perceptual thresholds.
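
A minimal sketch of a time-domain force-error model consistent with the error sources named above (apparent mass, viscous friction, Coulomb friction), assuming logged acceleration, velocity, and force-error traces; the formulation and names are ours and may differ from the paper's.

```python
import numpy as np

def fit_force_error_model(acc_m_s2: np.ndarray, vel_m_s: np.ndarray,
                          force_error_n: np.ndarray):
    """Fit force_error ~ m*acc + b*vel + fc*sign(vel) by linear least squares.
    Returns (m, b, fc): apparent mass, viscous coefficient, Coulomb friction."""
    regressors = np.column_stack([acc_m_s2, vel_m_s, np.sign(vel_m_s)])
    (m, b, fc), *_ = np.linalg.lstsq(regressors, force_error_n, rcond=None)
    return m, b, fc

# In such a fit, the fc*sign(vel) term dominating at low speeds and the m*acc
# term dominating during fast motions mirrors the slow- versus fast-motion
# result reported above.
```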


IEEE International Conference on Rehabilitation Robotics | 2009

Environment discrimination with vibration feedback to the foot, arm, and fingertip

Netta Gurari; Kathryn Smith; Manu S. Madhav; Allison M. Okamura

Haptic feedback for upper-limb prostheses is desirable to enable a user to interact naturally with his or her environment, including operating the limb without vision and performing activities of daily living. We present a noninvasive method of providing one type of haptic feedback, vibration, to an upper-limb prosthesis user to enable discrimination of environment properties. Using a telemanipulation system that emulates an ideal prosthesis, able-bodied subjects tapped on materials of varying stiffness while vibration signals were recorded using an accelerometer. The vibrations were displayed in real time to the user through a C2 tactor mounted on the fingertip, foot, or upper arm. A three-alternative forced choice experiment was conducted, in which pairs of materials were presented. The subjects identified the stiffer surface or stated that they were of equal stiffness. Differing visual and force cues among the materials were eliminated through the use of the teleoperator and a graphical display. Results for five users indicate that vibration feedback to the foot enables environment discrimination comparable to that of the fingertip, and that the foot is better than the upper arm. The foot is a promising location for haptic feedback because of its sensitivity to haptic stimuli and the convenience of placing small devices and power sources within the shoe.


IEEE Transactions on Human-Machine Systems | 2013

Perception of Springs With Visual and Proprioceptive Motion Cues: Implications for Prosthetics

Netta Gurari; Katherine J. Kuchenbecker; Allison M. Okamura

Manipulating objects with an upper-limb prosthesis requires significantly more visual attention than doing the same task with an intact limb. Prior work and comments from individuals lacking proprioception indicate that conveying prosthesis motion through a nonvisual sensory channel would reduce and possibly remove the need to watch the prosthesis. To motivate the design of suitable sensory substitution devices, this study investigates the difference between seeing a virtual prosthetic limb move and feeling one's real limb move. Fifteen intact subjects controlled a virtual prosthetic finger in a one-degree-of-freedom rotational spring discrimination task. A custom haptic device was used to measure both real finger position and applied finger force, and the resulting prosthetic finger movement was displayed visually (on a computer screen) and/or proprioceptively (by allowing the subject's real finger to move). Spring discrimination performance was tested for three experimental sensory conditions (visual motion, proprioceptive motion, and combined visual and proprioceptive motion) using the method of constant stimuli, with a reference stiffness of 290 N/m. During each trial, subjects sequentially pressed the right index finger on a pair of hard-surfaced virtual springs and decided which was stiffer. No significant performance differences were found between the three experimental sensory conditions, but subjects perceived proprioceptive motion to be significantly more useful than visual motion. These results imply that relaying proprioceptive information through a nonvisual channel could reduce visual attention during prosthesis control while maintaining task performance, thus improving the upper-limb prosthesis experience.
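
For context, the sketch below shows one common way a discrimination threshold can be extracted from method-of-constant-stimuli data like these, by fitting a cumulative-Gaussian psychometric function; the data values, the 75%-correct criterion, and all names are illustrative, not results from the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(comparison_stiffness, mu, sigma):
    """Probability of judging the comparison spring stiffer than the reference."""
    return norm.cdf(comparison_stiffness, loc=mu, scale=sigma)

# Illustrative data: comparison stiffnesses (N/m) around the 290 N/m reference
# and the fraction of "comparison is stiffer" responses at each level.
levels = np.array([230.0, 260.0, 275.0, 290.0, 305.0, 320.0, 350.0])
p_stiffer = np.array([0.05, 0.20, 0.35, 0.50, 0.68, 0.82, 0.96])

(mu, sigma), _ = curve_fit(psychometric, levels, p_stiffer, p0=[290.0, 30.0])
jnd = sigma * norm.ppf(0.75)   # one common 75%-correct criterion
print(f"PSE = {mu:.0f} N/m, JND = {jnd:.0f} N/m, Weber fraction = {jnd / 290.0:.3f}")
```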


Biomedical Circuits and Systems Conference | 2008

The feeling of color: A haptic feedback device for the visually disabled

Jonathan Tapson; Netta Gurari; Javier Díaz; Elisabetta Chicca; David Sander; Philippe O. Pouliquen; Ralph Etienne-Cummings

We describe a sensory augmentation system designed to provide the visually disabled with a sense of color. Our system consists of a glove with short-range optical color sensors mounted on its fingertips, and a torso-worn belt on which tactors (haptic feedback actuators) are mounted. Each fingertip sensor detects the observed object's color. This information is encoded as tactor vibrations at the corresponding locations with varying modulation. Early results suggest that detection of primary colors is possible with near 100% accuracy and moderate latency, with minimal training.


IEEE Haptics Symposium | 2012

Conveying the configuration of a virtual human hand using vibrotactile feedback

Andrew F. Cheng; Kirk A. Nichols; Heidi M. Weeks; Netta Gurari; Allison M. Okamura

Upper-limb prosthesis users lack proprioception of their artificial arm and rely heavily on vision to understand its configuration. With the goal of reducing the reliance on visual cues during upper-limb prosthesis use, this study investigates whether haptic feedback can relay the configuration of a virtual hand in the absence of sight. Two mappings from waistbelt-mounted tactor vibration patterns to hand configuration are explored: (1) Synergy-based hand motions derived from the results of a principal component analysis run on an aggregate of hand motions, and (2) Decoupled hand motions, which include experimenter-selected motions such as finger grasp and finger spread. Results show that users can identify complex hand configurations with vibrotactile feedback patterns based on both the Synergies and Decoupled methods, although 30-45 seconds are required to achieve this task. Findings also demonstrate that users tend to memorize a correspondence between the overall feel of a tactor pattern and a hand configuration, rather than constructing the hand configuration by isolating and considering each tactor individually. Last, results indicate that hand configuration is most accurately conveyed by maximizing information along a synergy-based space.
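
A minimal sketch of the synergy-extraction step described in mapping (1), assuming hand postures are logged as joint-angle vectors; the two-synergy default and all names are illustrative.

```python
import numpy as np

def hand_synergies(joint_angles: np.ndarray, n_synergies: int = 2):
    """Principal component analysis of a (samples x joints) matrix of hand
    joint angles: mean-center, then take the leading right singular vectors.
    Returns the mean posture and the first n_synergies synergy directions."""
    mean_posture = joint_angles.mean(axis=0)
    centered = joint_angles - mean_posture
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_posture, vt[:n_synergies]

# Any posture is then approximated as mean_posture + sum_i w_i * synergy_i;
# the weights w_i are the low-dimensional quantities a tactor pattern can convey.
```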


PLOS ONE | 2017

Perception of force and stiffness in the presence of low-frequency haptic noise

Netta Gurari; Allison M. Okamura; Katherine J. Kuchenbecker

Objective This work lays the foundation for future research on quantitative modeling of human stiffness perception. Our goal was to develop a method by which a human's ability to perceive suprathreshold haptic force stimuli and haptic stiffness stimuli can be systematically affected by adding haptic noise. Methods Five human participants performed a same-different task with a one-degree-of-freedom force-feedback device. Participants used the right index finger to actively interact with variations of force (∼5 and ∼8 N) and stiffness (∼290 N/m) stimuli that included one of four scaled amounts of haptically rendered noise (None, Low, Medium, High). The haptic noise was zero-mean Gaussian white noise that was low-pass filtered with a 2 Hz cut-off frequency; the resulting low-frequency signal was added to the force rendered while the participant interacted with the force and stiffness stimuli. Results We found that the precision with which participants could identify the magnitude of both the force and stiffness stimuli was affected by the magnitude of the low-frequency haptically rendered noise added to the haptic stimulus, as well as by the magnitude of the haptic stimulus itself. The Weber fraction correlated strongly with the standard deviation of the low-frequency haptic noise, with a Pearson product-moment correlation coefficient of ρ > 0.83. The mean standard deviation of the low-frequency haptic noise in the haptic stimuli ranged from 0.184 N to 1.111 N across the four haptically rendered noise levels, and the corresponding mean Weber fractions spanned 0.042 to 0.101. Conclusions The human ability to perceive both suprathreshold haptic force and stiffness stimuli degrades in the presence of added low-frequency haptic noise. Future work can use the reported methods to investigate how force perception and stiffness perception may relate, with possible applications in haptic watermarking and in the assessment of the functionality of peripheral pathways in individuals with haptic impairments.
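
A minimal sketch of how the rendered noise described above could be generated. The abstract specifies only zero-mean Gaussian white noise low-pass filtered at 2 Hz; the Butterworth filter, its order, the zero-phase filtering, and the rescaling to a target standard deviation are our assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def low_frequency_haptic_noise(duration_s: float, fs_hz: float, target_std_n: float,
                               cutoff_hz: float = 2.0, order: int = 2,
                               seed=None) -> np.ndarray:
    """Zero-mean Gaussian white noise, low-pass filtered at cutoff_hz, then
    rescaled so the filtered force signal has the requested standard deviation (N)."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(int(duration_s * fs_hz))
    b, a = butter(order, cutoff_hz, btype="low", fs=fs_hz)
    low = filtfilt(b, a, white)
    return target_std_n * (low - low.mean()) / low.std()

# e.g. one "High"-like noise trace at a 1 kHz haptic update rate (values illustrative)
noise_n = low_frequency_haptic_noise(duration_s=5.0, fs_hz=1000.0, target_std_n=1.0)
```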


Clinical Neurophysiology | 2017

Individuals with chronic hemiparetic stroke can correctly match forearm positions within a single arm

Netta Gurari; Justin M. Drogos; Julius P. A. Dewald

OBJECTIVE Previous studies determined, using between-arms position matching assessments, that at least one-half of individuals with stroke have an impaired position sense. We investigated whether individuals with chronic stroke who have impairments in mirroring arm positions also have impairments in identifying the location of each arm in space. METHODS Participants with chronic hemiparetic stroke and age-matched participants without neurological impairments (controls) performed a between-forearms position matching task based on a clinical assessment and a single-forearm position matching task, using passive and active movements, based on a robotic assessment. RESULTS Twelve of our 14 participants with stroke who had clinically determined between-forearms position matching impairments had greater errors than the controls in both their paretic and non-paretic arm when matching positions during passive movements; yet the stroke participants performed comparably to the controls during active movements. CONCLUSIONS Many individuals with chronic stroke may have impairments matching positions in both their paretic and non-paretic arm if their arm is moved for them, yet not within either arm if these individuals control their own movements. SIGNIFICANCE The neural mechanisms governing arm location perception in the stroke population may differ depending on whether arm movements are made passively versus actively.


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2014

Static Force Rendering Performance of Two Commercial Haptic Systems

Fabio Tatti; Netta Gurari; Gabriel Baud-Bovy

The performance of a haptic device generally varies across its workspace. This work measures the force rendering accuracy and maximum force output capabilities of two commercial devices, the Falcon and the Omega, in six directions at 46 and 49 set locations, respectively, in the mechanical workspace. The results are compared to several theoretical measures of performance based on force ellipsoids. Findings quantify the differences in rendering capability across the tested locations and show that this variability can be mostly explained by the highly nonlinear nature of the Jacobian.
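
One standard way to compute a force ellipsoid from a device Jacobian, consistent with the comparison described above; the paper's exact formulation may differ, and the example Jacobian values are arbitrary.

```python
import numpy as np

def force_ellipsoid(jacobian: np.ndarray):
    """Force-transmission ellipsoid for unit joint torques (||tau|| <= 1): the
    achievable tip forces F satisfy F^T (J J^T) F <= 1, so the ellipsoid's
    semi-axis lengths are 1/sigma_i along the left singular vectors of J."""
    u, sigma, _ = np.linalg.svd(jacobian)
    return u, 1.0 / sigma

# Evaluating this at each sampled workspace location shows how the achievable
# force output stretches and shrinks as the Jacobian changes with configuration.
axes, semi_axis_lengths = force_ellipsoid(np.array([[0.10, 0.02, 0.00],
                                                    [0.00, 0.09, 0.03],
                                                    [0.01, 0.00, 0.11]]))
```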

Collaboration


Dive into Netta Gurari's collaborations.

Top Co-Authors

Gabriel Baud-Bovy (Istituto Italiano di Tecnologia)
Amy L. Shelton (Johns Hopkins University)
Heidi M. Weeks (Johns Hopkins University)
Kathryn Smith (Johns Hopkins University)
Manu S. Madhav (Johns Hopkins University)