
Publication


Featured research published by Stephen I. Helms Tillery.


Neuron | 2010

Transcranial Pulsed Ultrasound Stimulates Intact Brain Circuits

Yusuf Tufail; Alexei Matyushov; Nathan Baldwin; Monica L. Tauchmann; Joseph Georges; Anna Yoshihiro; Stephen I. Helms Tillery; William J. Tyler

Electromagnetic-based methods of stimulating brain activity require invasive procedures or have other limitations. Deep-brain stimulation requires surgically implanted electrodes. Transcranial magnetic stimulation does not require surgery, but suffers from low spatial resolution. Optogenetic-based approaches have unrivaled spatial precision, but require genetic manipulation. In search of a potential solution to these limitations, we began investigating the influence of transcranial pulsed ultrasound on neuronal activity in the intact mouse brain. In motor cortex, ultrasound-stimulated neuronal activity was sufficient to evoke motor behaviors. Deeper in subcortical circuits, we used targeted transcranial ultrasound to stimulate neuronal activity and synchronous oscillations in the intact hippocampus. We found that ultrasound triggers TTX-sensitive neuronal activity in the absence of a rise in brain temperature (<0.01 degrees C). Here, we also report that transcranial pulsed ultrasound for intact brain circuit stimulation has a lateral spatial resolution of approximately 2 mm and does not require exogenous factors or surgical invasion.


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2003

Information conveyed through brain-control: cursor versus robot

Dawn M. Taylor; Stephen I. Helms Tillery; Andrew B. Schwartz

Microwire electrode arrays were implanted in the motor and premotor cortical areas of rhesus macaques. The recorded activity was used to control the three-dimensional movements of a virtual cursor and of a robotic arm in real time. The goal was to move the cursor or robot to one of eight targets. Average information conveyed about the intended target was calculated from the observed trajectories at 30-ms intervals throughout the movements. Most of the information about intended target was conveyed within the first second of the movement. For the brain-controlled cursor, the instantaneous information transmission rate was at its maximum at the beginning of each movement (averaged 4.8 to 5.5 bits/s depending on the calculation method used). However, this instantaneous rate quickly slowed down as the movement progressed and additional information became redundant. Information was conveyed more slowly through the brain-controlled robot due to the dynamics and noise of the robot system. The brain-controlled cursor data was also used to demonstrate a method for optimizing information transmission rate in the case where repeated cursor movements are used to make long strings of sequential choices such as in a typing task.
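The information-rate analysis described above can be sketched in a simple form. This is a minimal illustration under assumed conditions, not the paper's own calculation: it computes the mutual information between intended and decoded targets from a confusion matrix of trial counts (all names here are hypothetical); dividing the result by trial duration would yield a rate in bits/s.

```python
import numpy as np

def mutual_information_bits(confusion):
    """Mutual information I(intended; decoded) in bits, from a confusion
    matrix of counts (rows: intended target, cols: decoded target)."""
    joint = confusion / confusion.sum()          # joint probability p(t, d)
    pt = joint.sum(axis=1, keepdims=True)        # marginal p(intended)
    pd = joint.sum(axis=0, keepdims=True)        # marginal p(decoded)
    nz = joint > 0                               # skip zero cells in the log
    return float((joint[nz] * np.log2(joint[nz] / (pt @ pd)[nz])).sum())

# Perfect decoding of 8 equiprobable targets conveys log2(8) = 3 bits.
perfect = np.eye(8) * 10
print(mutual_information_bits(perfect))  # → 3.0
```

A chance-level confusion matrix (all cells equal) yields 0 bits, so the measure directly reflects how much the observed trajectories narrow down the intended target.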


Current Opinion in Neurobiology | 2001

Extraction algorithms for cortical control of arm prosthetics.

Andrew B. Schwartz; Dawn M. Taylor; Stephen I. Helms Tillery

Now that recordings of multiple, individual action potentials are being made with chronic electrodes, it seems that the simple encoding of movement parameters demonstrated in earlier work can be exploited as a real-time control signal for prosthetic arms. Efficient extraction algorithms can compensate for the limited ensemble sample acquired with this emerging technology.


Journal of Neural Engineering | 2006

Selection and parameterization of cortical neurons for neuroprosthetic control

Remy Wahnoun; Jiping He; Stephen I. Helms Tillery

When designing neuroprosthetic interfaces for motor function, it is crucial to have a system that can extract reliable information from available neural signals and produce an output suitable for real life applications. Systems designed to date have relied on establishing a relationship between neural discharge patterns in motor cortical areas and limb movement, an approach not suitable for patients who require such implants but who are unable to provide proper motor behavior to initially tune the system. We describe here a method that allows rapid tuning of a population vector-based system for neural control without arm movements. We trained highly motivated primates to observe a 3D center-out task as the computer played it very slowly. Based on only 10-12 s of neuronal activity observed in M1 and PMd, we generated an initial mapping between neural activity and device motion that the animal could successfully use for neuroprosthetic control. Subsequent tunings of the parameters led to improvements in control, but the initial selection of neurons and estimated preferred direction for those cells remained stable throughout the remainder of the day. Using this system, we have observed that the contribution of individual neurons to the overall control of the system is very heterogeneous. We thus derived a novel measure of unit quality and an indexing scheme that allowed us to rate each neuron's contribution to the overall control. In offline tests, we found that fewer than half of the units made positive contributions to the performance. We tested this experimentally by having the animals control the neuroprosthetic system using only the 20 best neurons. We found that performance in this case was better than when the entire set of available neurons was used. Based on these results, we believe that, with careful task design, it is feasible to parameterize control systems without any overt behaviors and that subsequent control system design will be enhanced with cautious unit selection. These improvements can lead to systems demanding lower bandwidth and computational power, and will pave the way for more feasible clinical systems.
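The population vector scheme referred to above can be sketched roughly as follows. This is a simplified illustration, not the authors' implementation: each unit is assumed to have a preferred direction and a baseline rate (which the paper fits from roughly 10-12 s of observed activity), and the decoded direction is the sum of preferred directions weighted by rate above baseline. The ensemble here is simulated, with all parameters invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: each unit gets a random preferred direction
# (unit vector in 3D) and a baseline firing rate.
n_units = 30
pref_dirs = rng.normal(size=(n_units, 3))
pref_dirs /= np.linalg.norm(pref_dirs, axis=1, keepdims=True)
baselines = rng.uniform(5.0, 20.0, size=n_units)

def population_vector(rates):
    """Each unit 'votes' along its preferred direction, weighted by
    its firing rate above baseline; the sum is the decoded direction."""
    weights = rates - baselines
    return weights @ pref_dirs          # (n_units,) @ (n_units, 3) -> (3,)

# Simulate cosine-tuned responses to a known target direction.
target = np.array([1.0, 0.0, 0.0])
rates = baselines + 10.0 * (pref_dirs @ target)
decoded = population_vector(rates)
decoded /= np.linalg.norm(decoded)
print(np.dot(decoded, target))          # close to 1 for a well-sampled ensemble
```

Dropping the poorly contributing units, as the paper reports doing, amounts to zeroing their weights before the sum; with well-chosen units the decoded vector stays accurate with far fewer channels.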


PLOS ONE | 2011

The proprioceptive map of the arm is systematic and stable, but idiosyncratic.

Liliana Rincon-Gonzalez; Christopher A. Buneo; Stephen I. Helms Tillery

Visual and somatosensory signals participate together in providing an estimate of the hand's spatial location. While the ability of subjects to identify the spatial location of their hand based on visual and proprioceptive signals has previously been characterized, relatively few studies have examined in detail the spatial structure of the proprioceptive map of the arm. Here, we reconstructed and analyzed the spatial structure of the estimation errors that resulted when subjects reported the location of their unseen hand across a 2D horizontal workspace. Hand position estimation was mapped under four conditions: with and without tactile feedback, and with the right and left hands. In the task, we moved each subject's hand to one of 100 targets in the workspace while their eyes were closed. Then, we either a) applied tactile stimulation to the fingertip by allowing the index finger to touch the target or b) as a control, hovered the fingertip 2 cm above the target. After returning the hand to a neutral position, subjects opened their eyes to verbally report where their fingertip had been. We measured and analyzed both the direction and magnitude of the resulting estimation errors. Tactile feedback reduced the magnitude of these estimation errors, but did not change their overall structure. In addition, the spatial structure of these errors was idiosyncratic: each subject had a unique pattern of errors that was stable between hands and over time. Finally, we found that at the population level the magnitude of the estimation errors had a characteristic distribution over the workspace: errors were smallest closer to the body. The stability of estimation errors across conditions and time suggests the brain constructs a proprioceptive map that is reliable, even if it is not necessarily accurate. The idiosyncrasy across subjects emphasizes that each individual constructs a map that is unique to their own experiences.
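The error analysis described above, computing the direction and magnitude of each estimation error, can be sketched simply. The data here are hypothetical, chosen only to show the vector arithmetic; in the study each subject reported 100 targets per condition.

```python
import numpy as np

# Hypothetical 2D target locations and reported fingertip locations, in cm.
targets = np.array([[10.0, 20.0], [30.0, 20.0], [20.0, 40.0]])
reports = np.array([[12.0, 21.0], [27.0, 24.0], [20.0, 43.0]])

errors = reports - targets                       # per-target error vectors
magnitudes = np.linalg.norm(errors, axis=1)      # error size (cm)
directions = np.degrees(np.arctan2(errors[:, 1], errors[:, 0]))

print(magnitudes)   # error sizes in cm: sqrt(5), 5, 3
```

Plotting the error vectors over the target grid gives the kind of spatial map whose structure the study compares across conditions, hands, and time.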


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2011

Haptic Interaction of Touch and Proprioception: Implications for Neuroprosthetics

Liliana Rincon-Gonzalez; Jay P. Warren; David M. Meller; Stephen I. Helms Tillery

Somatosensation is divided into multiple discrete modalities that we think of separably: e.g., tactile, proprioceptive, and temperature sensation. However, in processes such as haptics, those modalities all interact. If one intended to artificially generate a sensation that could be used for stereognosis, for example, it would be crucial to understand these interactions. We are presently examining the relationship between tactile and proprioceptive modalities in this context. In this overview of some of our recent work, we show that signals that would normally be attributed to two of these systems separately, tactile contact and self-movement, interact both perceptually and physiologically in ways that complicate the understanding of haptic processing. In the first study described here, we show that a tactile illusion on the fingertips, the cutaneous rabbit effect, can be abolished by changing the posture of the fingers. We then discuss activity in primary somatosensory cortical neurons illustrating the interrelationship of tactile and postural signals. In this study, we used a robot-enhanced virtual environment to show that many neurons in primary somatosensory cortex with cutaneous receptive fields encode elements both of tactile contact and self-motion. We then show the results of studies examining the structure of the process which extracts the spatial location of the hand from proprioceptive signals. The structure of the spatial errors in these maps indicates that the proprioceptive-spatial map is stable but individually constructed. These seemingly disparate studies lead us to suggest that tactile sensation is encoded in a 2-D map, but one which undergoes continual dynamic modification by an underlying proprioceptive map. Understanding how the disparate signals that comprise the somatosensory system are processed to produce sensation is an important step in realizing the kind of seamless integration aspired to in neuroprosthetics.


Current Opinion in Neurobiology | 2004

Signal acquisition and analysis for cortical control of neuroprosthetics

Stephen I. Helms Tillery; Dawn M. Taylor

Work in cortically controlled neuroprosthetic systems has concentrated on decoding natural behaviors from neural activity, with the idea that if the behavior could be fully decoded it could be duplicated using an artificial system. Initial estimates from this approach suggested that a high-fidelity signal comprised of many hundreds of neurons would be required to control a neuroprosthetic system successfully. However, recent studies are showing hints that these systems can be controlled effectively using only a few tens of neurons. Attempting to decode the pre-existing relationship between neural activity and natural behavior is not nearly as important as choosing a decoding scheme that can be more readily deployed and trained to generate the desired actions of the artificial system. These artificial systems need not resemble or behave similarly to any natural biological system. Effective matching of discrete and continuous neural command signals to appropriately configured device functions will enable effective control of both natural and abstract artificial systems using compatible thought processes.


PLOS ONE | 2011

Effects of fusion between tactile and proprioceptive inputs on tactile perception.

Jay P. Warren; Marco Santello; Stephen I. Helms Tillery

Tactile perception is typically considered the result of cortical interpretation of afferent signals from a network of mechanical sensors underneath the skin. Yet, tactile illusion studies suggest that tactile perception can be elicited without afferent signals from mechanoreceptors. Therefore, the extent to which tactile perception arises from isomorphic mapping of tactile afferents onto the somatosensory cortex remains controversial. We tested whether isomorphic mapping of tactile afferent fibers onto the cortex leads directly to tactile perception by examining whether it is independent of proprioceptive input: we evaluated the impact of different hand postures on the perception of a tactile illusion across the fingertips. Using the Cutaneous Rabbit Effect, a well-studied illusion evoking the perception that a stimulus occurs at a location where none has been delivered, we found that hand posture has a significant effect on the perception of the illusion across the fingertips. This finding emphasizes that tactile perception arises from integration of perceived mechanical and proprioceptive input and not purely from tactile interaction with the external environment.


Experimental Brain Research | 2010

Electrotactile stimuli delivered across fingertips inducing the Cutaneous Rabbit Effect

Jay P. Warren; Marco Santello; Stephen I. Helms Tillery

Previous studies have been unable to induce the Cutaneous Rabbit Effect (CRE) when the most likely perceived location of the illusory stimulus is on a non-continuous skin area. To determine whether the CRE could be elicited when each of the delivered stimuli was on a non-continuous skin area, we developed a new electrotactile stimulation paradigm attempting to induce the CRE across the fingertips. Though our stimulation paradigm differed from classic reduced CRE paradigms through the use of electrotactile stimuli, the focusing of the subject's attention on a 'likely' illusory site, and the inclusion of a fourth stimulation site (two stimuli after the illusory stimulus), these factors were not the cause of the illusory effect we observed. Experiments conducted on the forearm validated that our paradigm elicited similar results to those reported in previous CRE studies that used either 3-stimulation-point mechanical or electrotactile stimuli with subject attention focused on the 'likely' illusory site. Across the fingertips, we observed an increase in stimulus mislocalization onto the middle fingertip, the 'likely' perceived location of the illusory stimuli, under Illusory Rabbit Trains compared to the Motion Bias Trains. Because the Motion Bias Trains should not induce a perceived location shift of the illusory stimulus but stimulate the adjacent digits in a similar way to the Illusory Rabbit Trains, the differences in mislocalization rates between these trains indicate that the CRE can be induced across the fingertips. These results provide the first evidence that the CRE can 'jump' when the stimuli occur across non-continuous skin areas.


Frontiers in Human Neuroscience | 2015

A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss

Randall B. Hellman; Eric Chang; Justin Tanner; Stephen I. Helms Tillery; Veronica J. Santos

Many upper limb amputees experience an incessant, post-amputation “phantom limb pain” and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech “rubber hand” illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion, and new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds, such as the “BairClaw” presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger–object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions while temperature, vibration, and skin deformation are provided by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain, and increased prosthesis use due to improved functionality and reduced cognitive burden.

Collaboration


Dive into Stephen I. Helms Tillery's collaborations.

Top Co-Authors

Jay P. Warren, Arizona State University
Marco Santello, Arizona State University
Dawn M. Taylor, Arizona State University
Jiping He, Arizona State University
Remy Wahnoun, Arizona State University