Shelly Levy-Tzedek
Ben-Gurion University of the Negev
Publications
Featured research published by Shelly Levy-Tzedek.
IEEE Engineering in Medicine and Biology Magazine | 2008
Hermano Igo Krebs; Laura Dipietro; Shelly Levy-Tzedek; Susan E. Fasoli; Avrielle Rykman-Berland; Johanna Zipse; Jennifer A. Fawcett; Joel Stein; Howard Poizner; Albert C. Lo; Bruce T. Volpe; Neville Hogan
Therapeutic robots enhance clinician productivity in facilitating patient recovery. In this article, we present an overview of the remarkable growth of activity in therapeutic robotics and of our experience with these devices. We briefly review the published clinical literature in this emerging field and our initial clinical results in stroke. We also report our initial efforts beyond stroke, broadening the potential population that might benefit from this class of technology, by discussing case studies of applications to other neurological diseases. Finally, we highlight the underexploited potential of this technology as an evaluation tool.
Restorative Neurology and Neuroscience | 2014
Sami Abboud; Shlomi Hanassy; Shelly Levy-Tzedek; Shachar Maidenbaum; Amir Amedi
PURPOSE Sensory-substitution devices (SSDs) provide auditory or tactile representations of visual information. These devices often generate unpleasant sensations and mostly lack color information. We present here a novel SSD aimed at addressing these issues. METHODS We developed the EyeMusic, a novel visual-to-auditory SSD for the blind, providing both shape and color information. Our design uses musical notes on a pentatonic scale, generated by natural instruments, to convey the visual information in a pleasant manner. A short behavioral protocol was used to train the blind to extract shape and color information, and to test their acquired abilities. Finally, we conducted a survey and a comparison task to assess the pleasantness of the generated auditory stimuli. RESULTS We show that basic shape and color information can be decoded from the generated auditory stimuli. High performance levels were achieved by all participants after as little as 2-3 hours of training. Furthermore, we show that users indeed found the stimuli pleasant and potentially tolerable for prolonged use. CONCLUSIONS The novel EyeMusic algorithm provides an intuitive and relatively pleasant way for the blind to extract shape and color information. We suggest that this might help facilitate visual rehabilitation because of the added functionality and enhanced pleasantness.
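The abstract describes the sonification scheme only at a high level. The sketch below illustrates the general idea in Python: the image is swept column by column from left to right, the row of a lit pixel sets its pitch on a pentatonic scale, and its color selects the instrument. The specific frequencies, sweep rate, and the COLOR_INSTRUMENT table are illustrative assumptions, not the device's actual parameters.

```python
import numpy as np

# Illustrative pentatonic pitches (Hz) for image rows; assumed values, not the device's.
PENTATONIC_HZ = [392.0, 440.0, 523.3, 587.3, 659.3]   # G4 A4 C5 D5 E5

# Hypothetical color-to-instrument table; the real device uses its own assignment.
COLOR_INSTRUMENT = {"white": "choir", "red": "organ", "blue": "trumpet"}

def sonify(image, column_duration=0.1):
    """Turn a small color-labelled image into a list of note events.

    `image` is a 2-D array of color names ('' for background), one entry per
    pixel, with as many rows as PENTATONIC_HZ. The image is scanned column by
    column from left to right; each lit pixel becomes a note whose pitch codes
    its row and whose instrument codes its color.
    """
    rows, cols = image.shape
    assert rows == len(PENTATONIC_HZ)
    events = []                                     # (onset time, pitch, instrument)
    for c in range(cols):                           # left-to-right sweep
        onset = c * column_duration
        for r in range(rows):
            color = image[r, c]
            if color:                               # skip background pixels
                pitch = PENTATONIC_HZ[rows - 1 - r]  # top row -> highest pitch
                events.append((onset, pitch, COLOR_INSTRUMENT.get(color, "piano")))
    return events

# Example: a 5x5 image with a red vertical bar on the left and a white bar on top.
img = np.full((5, 5), "", dtype=object)
img[:, 0] = "red"
img[0, :] = "white"
print(sonify(img)[:5])
```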
Restorative Neurology and Neuroscience | 2012
Shelly Levy-Tzedek; Shlomi Hanassy; Sami Abboud; Shachar Maidenbaum; Amir Amedi
PURPOSE Visual sensory substitution devices (SSDs) use sound or touch to convey information that is normally perceived by vision. The primary focus of prior research using SSDs was the perceptual components of learning to use SSDs and their neural correlates. However, sensorimotor integration is critical in the effort to make SSDs relevant for everyday tasks, like grabbing a cup of coffee efficiently. The purpose of this study was to test the use of a novel visual-to-auditory SSD to guide a fast reaching movement. METHODS Using sound, the SSD device relays location, shape and color information. Participants were asked to make fast reaching movements to targets presented by the SSD. RESULTS After only a short practice session, blindfolded sighted participants performed fast and accurate movements to presented targets, which did not differ significantly from movements performed with visual feedback in terms of movement time, peak speed, and path length. A small but significant difference was found between the endpoint accuracy of movements under the two feedback conditions; remarkably, in both cases the average error was smaller than 0.5 cm. CONCLUSIONS Our findings combine with previous brain-imaging studies to support a theory of a modality-independent representation of spatial information. Task-specificity, rather than modality-specificity, of brain functions is crucially important for the rehabilitative use of SSDs in the blind and the visually impaired. We present the first direct comparison between movement trajectories performed with an SSD and ones performed under visual guidance. The accuracy level reached in this study demonstrates the potential applicability of using the visual-to-auditory SSD for performance of daily tasks which require fast, accurate reaching movements, and indicates a potential for rehabilitative use of the device.
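The comparison between SSD-guided and visually guided reaches rests on standard kinematic measures. The sketch below shows one straightforward way to compute movement time, peak speed, path length, and endpoint error from a sampled 2-D trajectory; the function name and the fixed sampling interval are illustrative assumptions, not the study's analysis code.

```python
import numpy as np

def reach_metrics(xy, target, dt=0.005):
    """Basic kinematic measures for one reach.

    xy     : (N, 2) array of hand positions sampled every `dt` seconds
    target : (2,) array, target location in the same units as xy
    """
    xy = np.asarray(xy, dtype=float)
    steps = np.diff(xy, axis=0)                      # per-sample displacement
    step_len = np.linalg.norm(steps, axis=1)
    speed = step_len / dt

    return {
        "movement_time": (len(xy) - 1) * dt,         # seconds
        "peak_speed": float(speed.max()),            # units per second
        "path_length": float(step_len.sum()),        # total distance travelled
        "endpoint_error": float(np.linalg.norm(xy[-1] - np.asarray(target))),
    }

# Example: a straight 20 cm reach sampled at 200 Hz over 0.5 s.
t = np.linspace(0.0, 0.5, 101)
trajectory = np.column_stack([0.20 * t / 0.5, np.zeros_like(t)])  # metres
print(reach_metrics(trajectory, target=np.array([0.20, 0.0])))
```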
PLOS ONE | 2013
Shachar Maidenbaum; Shelly Levy-Tzedek; Daniel-Robert Chebat; Amir Amedi
Virtual worlds and environments are becoming an increasingly central part of our lives, yet they are still far from accessible to the blind. This is especially unfortunate, as such environments hold great potential for the blind, for uses such as social interaction and online education, and especially for becoming familiar with a real environment virtually, from the comfort and safety of one's own home, before visiting it in the real world. We have implemented a simple algorithm to improve this situation using single-point depth information, enabling the blind to use a virtual cane, modeled on the “EyeCane” electronic travel aid, within any virtual environment with minimal pre-processing. Use of the Virtual-EyeCane enables this virtual experience to potentially be applied later in real-world environments, where the stimuli will be identical to those experienced in the virtual environment. We show that practical use of this algorithm for navigation in simple environments is learned quickly.
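The EyeCane-style mapping described here takes a single depth reading and turns it into a repeating auditory cue that grows faster (and higher) as obstacles get closer. The sketch below illustrates that idea; the distance range, beep rates, and pitch values are illustrative placeholders, not the EyeCane's actual parameters.

```python
def depth_to_cue(distance_m, max_range_m=5.0,
                 min_rate_hz=1.0, max_rate_hz=10.0,
                 min_pitch_hz=300.0, max_pitch_hz=1200.0):
    """Map a single-point distance reading to a beep rate and pitch.

    Closer obstacles -> faster, higher beeps; beyond max_range_m -> silence.
    All numeric ranges here are placeholders, not the real device's values.
    """
    if distance_m >= max_range_m:
        return None                                  # nothing in range: stay silent
    closeness = 1.0 - distance_m / max_range_m       # 0 (far) .. 1 (touching)
    rate = min_rate_hz + closeness * (max_rate_hz - min_rate_hz)
    pitch = min_pitch_hz + closeness * (max_pitch_hz - min_pitch_hz)
    return {"beep_rate_hz": rate, "pitch_hz": pitch}

# In a virtual environment the same function is fed the distance returned by a
# single ray cast from the virtual cane, so the cues match those of the real device.
for d in (4.5, 2.5, 0.5):
    print(d, depth_to_cue(d))
```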
Scientific Reports | 2012
Shelly Levy-Tzedek; Itai Novick; Roni Arbel; Sami Abboud; Shachar Maidenbaum; Eilon Vaadia; Amir Amedi
Visual-to-auditory sensory-substitution devices allow users to perceive a visual image using sound. Using a motor-learning task, we found that new sensory-motor information was generalized across sensory modalities. We imposed a rotation when participants reached to visual targets, and found that not only seeing, but also hearing the location of targets via a sensory-substitution device resulted in biased movements. When the rotation was removed, aftereffects occurred whether the location of targets was seen or heard. Our findings demonstrate that sensory-motor learning was not sensory-modality-specific. We conclude that novel sensory-motor information can be transferred between sensory modalities.
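The rotation manipulation described here is a standard visuomotor-rotation paradigm: the feedback of hand position (seen, or here also heard through the SSD) is rotated about the start point, so participants must adapt their reaches to compensate, and removing the rotation reveals aftereffects. The sketch below shows the geometric core of the manipulation; the 30-degree angle is an arbitrary illustrative choice, not necessarily the angle used in the study.

```python
import numpy as np

def rotate_feedback(hand_xy, origin_xy, angle_deg=30.0):
    """Rotate the displayed (or sonified) hand position about the start point.

    The participant's hand is actually at hand_xy; the feedback they receive is
    hand_xy rotated by angle_deg around origin_xy, which forces adaptation.
    """
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    offset = np.asarray(hand_xy, dtype=float) - np.asarray(origin_xy, dtype=float)
    return np.asarray(origin_xy, dtype=float) + rot @ offset

# A reach straight ahead is fed back as if it veered 30 degrees to the left,
# so over trials participants learn to aim to the right to hit the target.
print(rotate_feedback(hand_xy=[0.0, 0.15], origin_xy=[0.0, 0.0]))
```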
Experimental Brain Research | 2011
Shelly Levy-Tzedek; Hermano Igo Krebs; Jeffrey E. Arle; Jay L. Shils; Howard Poizner
Previous studies examining discrete movements of Parkinson’s disease (PD) patients have found that, in addition to performing movements that were slower than those of control participants, they exhibit specific deficits in movement coordination and in the sensorimotor integration required to accurately guide movements. With medication, movement speed was normalized, but the coordinative aspects of movement were not. This led to the hypothesis that dopaminergic medication more readily compensates for intensive aspects of movement (such as speed) than for coordinative aspects (such as coordination of different limb segments) (Schettino et al., Exp Brain Res 168:186–202, 2006). We tested this hypothesis on rhythmic, continuous movements of the forearm. In our task, target peak speed and amplitude, availability of visual feedback, and medication state (on/off) were varied. Consistent with the discrete-movement results, we found that peak speed (an intensive aspect) was normalized by medication, while accuracy, which required coordination of speed and amplitude modulation (a coordinative aspect), was not normalized by dopaminergic treatment. However, our finding that amplitude, also an intensive aspect of movement, was not normalized by medication suggests that a simple pathway gain increase does not remediate all intensive aspects of movement to the same extent: while it normalized movement peak speed, it did not normalize movement amplitude. Furthermore, we found that when visual feedback was not available, all participants (PD and controls) made faster movements. The effects of dopaminergic medication and of the availability of visual feedback on movement speed were additive. The finding that movement speed increased uniformly in both the PD and the control groups suggests that visual feedback may be necessary for calibration of peak speed, which is otherwise underestimated by the motor control system.
Experimental Brain Research | 2010
Shelly Levy-Tzedek; Hermano Igo Krebs; D. Song; Neville Hogan; Howard Poizner
We tested 23 healthy participants who performed rhythmic horizontal movements of the elbow. The required amplitude and frequency ranges of the movements were specified to the participants using a closed shape on a phase-plane display, showing angular velocity versus angular position, such that participants had to continuously control both the speed and the displacement of their forearm. We found that the combined accuracy in velocity and position throughout the movement was not a monotonic function of movement speed. Our findings suggest that specific combinations of required movement frequency and amplitude give rise to two distinct types of movements: one of a more rhythmic nature, and the other of a more discrete nature.
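One plausible way to render the "closed shape" described here: a sinusoidal movement of amplitude A and frequency f traces an ellipse on the phase plane with semi-axes A (position) and 2πfA (velocity), so a band between an inner ellipse (lower amplitude/frequency bound) and an outer ellipse (upper bound) constrains displacement and speed at once. The sketch below scores how much of a trajectory stays inside such a band; this construction and the numeric bounds are illustrative assumptions, and the exact shape used in the study may differ.

```python
import numpy as np

def inside_phase_plane_band(pos, vel, amp_bounds, freq_bounds):
    """Fraction of (position, velocity) samples inside an elliptical band.

    The inner ellipse corresponds to the lower amplitude/frequency bound and
    the outer ellipse to the upper bound (an illustrative construction).
    """
    a_lo, a_hi = amp_bounds                       # rad
    f_lo, f_hi = freq_bounds                      # Hz
    w_lo, w_hi = 2 * np.pi * f_lo, 2 * np.pi * f_hi

    pos = np.asarray(pos, dtype=float)
    vel = np.asarray(vel, dtype=float)
    inner = (pos / a_lo) ** 2 + (vel / (w_lo * a_lo)) ** 2   # < 1: inside inner ellipse
    outer = (pos / a_hi) ** 2 + (vel / (w_hi * a_hi)) ** 2   # > 1: outside outer ellipse
    in_band = (inner >= 1.0) & (outer <= 1.0)
    return float(in_band.mean())

# A 0.5 rad, 1 Hz movement should sit inside a band spanning 0.4-0.6 rad and 0.8-1.2 Hz.
t = np.arange(0.0, 5.0, 0.01)
x = 0.5 * np.sin(2 * np.pi * 1.0 * t)
v = 0.5 * 2 * np.pi * 1.0 * np.cos(2 * np.pi * 1.0 * t)
print(inside_phase_plane_band(x, v, amp_bounds=(0.4, 0.6), freq_bounds=(0.8, 1.2)))
```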
Brain Research Bulletin | 2011
Shelly Levy-Tzedek; M. Ben Tov; Amir Karniel
In everyday life, we frequently alternate between performing discrete and rhythmic movements. When performing a periodic movement, two distinct movement types can be distinguished: highly harmonic vs. discrete-like. The harmonicity of the movement is used to classify it as one or the other. We asked: (1) whether the frequency at which a periodic movement is performed affects the harmonicity of the resultant movement; and (2) what underlies switching between these movement types. To answer these questions, we studied horizontal flexion/extension forearm movements in 13 young adults over a wide range of frequencies. Movements were performed either at a fixed frequency, or at gradually increasing or decreasing target frequencies. We found movement harmonicity to depend on the frequency of the movement. Furthermore, we found a reverse hysteresis behavior, where participants switched movement type in anticipation of the future-required frequency. These findings suggest that predictive control is employed in switching between movement types.
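The abstract does not spell out the harmonicity measure; in this literature it is typically an acceleration-based, per-half-cycle index after Guiard. As a simpler, clearly labelled proxy, the sketch below scores how much of the movement's power sits at its dominant frequency: near 1 for a nearly sinusoidal (harmonic) movement, lower when the movement contains dwells and discrete-like segments. It illustrates the concept only and is not the measure used in the paper.

```python
import numpy as np

def harmonicity_proxy(position):
    """Fraction of the movement's spectral power at its dominant frequency.

    An illustrative proxy only (not the index used in the study): values near 1
    indicate a nearly sinusoidal, harmonic movement, while dwells and
    discrete-like segments spread power across higher harmonics.
    """
    x = np.asarray(position, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    power[0] = 0.0                          # discard any residual DC component
    total = power.sum()
    return float(power.max() / total) if total > 0 else float("nan")

# A smooth 1 Hz sinusoid versus a 1 Hz square-like movement with endpoint dwells.
t = np.arange(0.0, 10.0, 0.01)
print(harmonicity_proxy(np.sin(2 * np.pi * t)),           # ~1.0
      harmonicity_proxy(np.sign(np.sin(2 * np.pi * t))))  # ~0.81
```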
Multisensory Research | 2014
Shachar Maidenbaum; Shelly Levy-Tzedek; Daniel-Robert Chebat; Rinat Namer-Furstenberg; Amir Amedi
Mobility training programs that help the blind navigate through unknown places with a White-Cane significantly improve their mobility. However, what is the effect of new assistive technologies, which offer more information to the blind user, on the underlying premises of these programs, such as navigation patterns? We developed the virtual-EyeCane, a minimalistic sensory-substitution device that translates single-point distance into auditory cues identical to those of the EyeCane in the real world. We compared performance in virtual environments when using the virtual-EyeCane, a virtual-White-Cane, no device, and visual navigation. We show that the characteristics of virtual-EyeCane navigation differ from navigation with a virtual-White-Cane or no device, that virtual-EyeCane users complete more levels successfully, taking shorter paths and with fewer collisions than these groups, and we demonstrate the relative similarity of virtual-EyeCane and visual navigation patterns. This suggests that additional distance information indeed changes navigation patterns from those of virtual-White-Cane use, and brings them closer to visual navigation.
Frontiers in Neuroscience | 2014
Shelly Levy-Tzedek; Dar Riemer; Amir Amedi
Visual-to-auditory sensory substitution devices (SSDs) convey visual information via sound, with the primary goal of making visual information accessible to blind and visually impaired individuals. We developed the EyeMusic SSD, which transforms shape, location, and color information into musical notes. We tested the “visual” acuity of 23 individuals (13 blind and 10 blindfolded sighted) on the Snellen tumbling-E test, with the EyeMusic. Participants were asked to determine the orientation of the letter “E.” The test was repeated twice: in one test, the letter “E” was drawn with a single color (white), and in the other test, with two colors (red and white). In the latter case, the vertical line in the letter, when upright, was drawn in red, with the three horizontal lines drawn in white. We found no significant differences in performance between the blind and the sighted groups. We found a significant effect of the added color on the “visual” acuity. The highest acuity participants reached in the monochromatic test was 20/800, whereas with the added color, acuity doubled to 20/400. We conclude that color improves “visual” acuity via sound.
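For readers unfamiliar with Snellen notation: a 20/20 letter subtends 5 arcminutes (each stroke 1 arcminute), so a letter that must be k times larger to be identified corresponds to 20/(20·k). The sketch below converts the angular size of the smallest correctly identified tumbling E into a Snellen denominator; it is the standard conversion shown for illustration, not the paper's analysis code, and the example letter sizes and viewing distance are chosen only to reproduce the reported acuities arithmetically.

```python
import math

def snellen_denominator(letter_height_m, viewing_distance_m):
    """Snellen '20/X' denominator from the smallest correctly identified letter.

    A 20/20 letter subtends 5 arcminutes; a letter subtending 5*k arcminutes
    corresponds to 20/(20*k).
    """
    angle_arcmin = math.degrees(2 * math.atan(letter_height_m /
                                              (2 * viewing_distance_m))) * 60
    return 20 * angle_arcmin / 5.0

# An E of ~5.8 cm viewed from 1 m subtends ~200 arcmin, i.e. about 20/800
# (the best monochromatic acuity reported); halving the letter gives ~20/400.
print(round(snellen_denominator(0.0582, 1.0)), round(snellen_denominator(0.0291, 1.0)))
```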