Publication


Featured research published by Ruey-Song Huang.


Nature Neuroscience | 2006

A human parietal face area contains aligned head-centered visual and tactile maps.

Martin I. Sereno; Ruey-Song Huang

Visually guided eating, biting and kissing, and avoiding objects moving toward the face and toward which the face moves require prompt, coordinated processing of spatial visual and somatosensory information in order to protect the face and the brain. Single-cell recordings in parietal cortex have identified multisensory neurons with spatially restricted, aligned visual and somatosensory receptive fields, but so far, there has been no evidence for a topographic map in this area. Here we mapped the organization of a multisensory parietal face area in humans by acquiring functional magnetic resonance images while varying the polar angle of facial air puffs and close-up visual stimuli. We found aligned maps of tactile and near-face visual stimuli at the highest level of human association cortex—namely, in the superior part of the postcentral sulcus. We show that this area may code the location of visual stimuli with respect to the face, not with respect to the retina.
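
Polar-angle maps of this kind are typically recovered with a phase-encoded (traveling-wave) analysis: the air puffs or visual stimuli sweep around the face at a fixed cycling frequency, and each voxel's response phase at that frequency indicates its preferred polar angle. Below is a minimal sketch of such an analysis, assuming detrended voxel time series in a NumPy array and an illustrative number of stimulus cycles; none of these values or names are taken from the study itself.

```python
import numpy as np

def phase_encoded_map(bold, n_cycles):
    """Estimate preferred polar angle per voxel from phase-encoded fMRI data.

    bold     : array (n_voxels, n_timepoints), detrended BOLD time series
    n_cycles : number of full stimulus rotations during the scan
    """
    spec = np.fft.rfft(bold, axis=1)        # spectrum of each voxel time series
    signal = spec[:, n_cycles]              # component at the stimulus frequency
    # Response phase at the stimulus frequency maps onto polar angle
    # (up to a hemodynamic delay that would normally be corrected for).
    polar_angle = np.angle(signal)
    # Coherence: amplitude at the stimulus frequency relative to all non-DC
    # frequencies; used to threshold which voxels carry a significant map.
    amp = np.abs(spec[:, 1:])
    coherence = np.abs(signal) / np.sqrt(np.sum(amp ** 2, axis=1))
    return polar_angle, coherence

# Synthetic example: 1000 voxels, 256 time points, 8 stimulus cycles.
rng = np.random.default_rng(0)
fake_bold = rng.standard_normal((1000, 256))
angle, coh = phase_encoded_map(fake_bold, n_cycles=8)
```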


The Journal of Neuroscience | 2006

Wide-Field Retinotopy Defines Human Cortical Visual Area V6

Sabrina Pitzalis; Claudio Galletti; Ruey-Song Huang; Fabiana Patria; Giorgia Committeri; Gaspare Galati; Patrizia Fattori; Martin I. Sereno

The retinotopic organization of a newly identified visual area near the midline in the dorsalmost part of the human parieto-occipital sulcus was mapped using high-field functional magnetic resonance imaging, cortical surface-based analysis, and wide-field retinotopic stimulation. This area was found in all 34 subjects that were mapped. It represents the contralateral visual hemifield in both hemispheres of all subjects, with upper fields located anterior and medial to areas V2/V3, and lower fields medial and slightly anterior to areas V3/V3A. It contains a representation of the center of gaze distinct from V3A, a large representation of the visual periphery, and a mirror-image representation of the visual field. Based on similarity in position, visuotopic organization, and relationship with the neighboring extrastriate visual areas, we suggest it might be the human homolog of macaque area V6, and perhaps of area M (medial) or DM (dorsomedial) of New World primates.


The Journal of Neuroscience | 2009

Multiple Parietal Reach Regions in Humans: Cortical Representations for Visual and Proprioceptive Feedback during On-Line Reaching

Flavia Filimon; Jonathan D. Nelson; Ruey-Song Huang; Martin I. Sereno

Reaching toward a visual target involves at least two sources of information. One is the visual feedback from the hand as it approaches the target. Another is proprioception from the moving limb, which informs the brain of the location of the hand relative to the target even when the hand is not visible. Where these two sources of information are represented in the human brain is unknown. In the present study, we investigated the cortical representations for reaching with or without visual feedback from the moving hand, using functional magnetic resonance imaging. To identify reach-dominant areas, we compared reaching with saccades. Our results show that a reach-dominant region in the anterior precuneus (aPCu), extending into medial intraparietal sulcus, is equally active in visual and nonvisual reaching. A second region, at the superior end of the parieto-occipital sulcus (sPOS), is more active for visual than for nonvisual reaching. These results suggest that aPCu is a sensorimotor area whose sensory input is primarily proprioceptive, while sPOS is a visuomotor area that receives visual feedback during reaching. In addition to the precuneus, medial, anterior intraparietal, and superior parietal cortex were also activated during both visual and nonvisual reaching, with more anterior areas responding to hand movements only and more posterior areas responding to both hand and eye movements. Our results suggest that cortical networks for reaching are differentially activated depending on the sensory conditions during reaching. This indicates the involvement of multiple parietal reach regions in humans, rather than a single homogenous parietal reach region.


Proceedings of the IEEE | 2008

Noninvasive Neural Prostheses Using Mobile and Wireless EEG

Chin-Teng Lin; Li-Wei Ko; Jin-Chern Chiou; Jeng-Ren Duann; Ruey-Song Huang; Sheng-Fu Liang; Tzai-Wen Chiu; Tzyy-Ping Jung

Neural prosthetic technologies have helped many patients by restoring vision, hearing, or movement and relieving chronic pain or neurological disorders. While most neural prosthetic systems to date have used invasive or implantable devices for patients with inoperative or malfunctioning external body parts or internal organs, a much larger population of “healthy” people who suffer episodic or progressive cognitive impairments in daily life can benefit from noninvasive neural prostheses. For example, reduced alertness, lack of attention, or poor decision-making during monotonous, routine tasks can have catastrophic consequences. This study proposes a noninvasive mobile prosthetic platform for continuously monitoring high-temporal resolution brain dynamics without requiring application of conductive gels on the scalp. The proposed system features dry microelectromechanical system electroencephalography sensors, low-power signal acquisition, amplification and digitization, wireless telemetry, online artifact cancellation, and signal processing. Its implications for neural prostheses are examined in two sample studies: 1) cognitive-state monitoring of participants performing realistic driving tasks in the virtual-reality-based dynamic driving simulator and 2) the neural correlates of motion sickness in driving. The experimental results of these studies provide new insights into the understanding of complex brain functions of participants actively performing ordinary tasks in natural body positions and situations within real operational environments.
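
In a platform like the one described, the downstream signal processing for cognitive-state monitoring typically reduces to tracking band-limited EEG power over a sliding window (for example, theta and alpha power as alertness indicators). The following is a minimal, self-contained sketch of that computation, assuming a single already artifact-cleaned channel; the sampling rate, window length, and band limits are illustrative and not taken from the paper.

```python
import numpy as np

def band_power(segment, fs, band):
    """Mean power of one EEG segment within a frequency band (Hz)."""
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(segment * np.hanning(len(segment)))) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

def monitor_stream(eeg, fs=256, win_sec=2.0, step_sec=0.5):
    """Slide a window over a single (cleaned) EEG channel and report
    theta and alpha power over time.

    Returns a list of (time_sec, theta_power, alpha_power) tuples.
    """
    win, step = int(win_sec * fs), int(step_sec * fs)
    out = []
    for start in range(0, len(eeg) - win + 1, step):
        seg = eeg[start:start + win]
        out.append((start / fs,
                    band_power(seg, fs, (4.0, 8.0)),    # theta band
                    band_power(seg, fs, (8.0, 12.0))))  # alpha band
    return out
```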


Current Opinion in Neurobiology | 2014

Multisensory maps in parietal cortex

Martin I. Sereno; Ruey-Song Huang

Highlights
• A new parietal multisensory area integrates lower body and lower visual field.
• Rearrangement of parietal areas in human and non-human primates is rationalized.
• In vivo myelin mapping outlines some parietal multisensory areas.
• Multisensory parietal areas transform visual maps into non-retinocentric coordinates.


Proceedings of the National Academy of Sciences of the United States of America | 2012

Mapping multisensory parietal face and body areas in humans

Ruey-Song Huang; Ching-fu Chen; Alyssa T. Tran; Katie L. Holstein; Martin I. Sereno

Detection and avoidance of impending obstacles is crucial to preventing head and body injuries in daily life. To safely avoid obstacles, locations of objects approaching the body surface are usually detected via the visual system and then used by the motor system to guide defensive movements. Mediating between visual input and motor output, the posterior parietal cortex plays an important role in integrating multisensory information in peripersonal space. We used functional MRI to map parietal areas that see and feel multisensory stimuli near or on the face and body. Tactile experiments using full-body air-puff stimulation suits revealed somatotopic areas of the face and multiple body parts forming a higher-level homunculus in the superior posterior parietal cortex. Visual experiments using wide-field looming stimuli revealed retinotopic maps that overlap with the parietal face and body areas in the postcentral sulcus at the most anterior border of the dorsal visual pathway. Starting at the parietal face area and moving medially and posteriorly into the lower-body areas, the median of visual polar-angle representations in these somatotopic areas gradually shifts from near the horizontal meridian into the lower visual field. These results suggest the parietal face and body areas fuse multisensory information in peripersonal space to guard an individual from head to toe.


international conference on foundations of augmented cognition | 2009

Tonic Changes in EEG Power Spectra during Simulated Driving

Ruey-Song Huang; Tzyy-Ping Jung; Scott Makeig

Electroencephalographic (EEG) correlates of driving performance were studied using an event-related lane-departure paradigm. High-density EEG data were analyzed using independent component analysis (ICA) and Fourier analysis. Across subjects and sessions, when reaction time to lane-departure events increased, several clusters of independent component activities in the occipital, posterior parietal, and middle temporal cortex showed tonic power increases in the delta, theta, and alpha bands. The strongest of these tonic power increases occurred in the alpha band in the occipital and parietal regions. Other independent component clusters in the somatomotor and frontal regions showed weaker or no significant increases in any frequency band as reaction time increased. This study provides additional evidence of the close and specific links between cortical brain activities (via changes in EEG spectral power) and performance (reaction time) during sustained-attention tasks. These results may also provide insights into the development of human-computer interfaces as countermeasures for drowsy driving.
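
The pipeline summarized above, unmixing the sensor recordings into independent components and then examining each component's band power as reaction time varies, can be sketched roughly as follows. This is a simplified stand-in that uses scikit-learn's FastICA rather than the specific ICA algorithm used in the study; the array shapes, component count, and band limits are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

def component_band_power(eeg, fs, band=(8.0, 12.0), n_components=20):
    """Unmix multi-channel EEG into independent components and return the
    mean spectral power of each component within a frequency band.

    eeg : array (n_channels, n_samples)
    """
    ica = FastICA(n_components=n_components, random_state=0)
    # FastICA expects (n_samples, n_features); transpose back to
    # (n_components, n_samples) for spectral analysis.
    activations = ica.fit_transform(eeg.T).T
    freqs = np.fft.rfftfreq(activations.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(activations, axis=1)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[:, mask].mean(axis=1)   # one band-power value per component

# Contrasting slow- versus fast-reaction epochs would then amount to computing
# this per epoch and comparing the resulting alpha-power values.
```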


international conference on foundations of augmented cognition | 2007

Event-related brain dynamics in continuous sustained-attention tasks

Ruey-Song Huang; Tzyy-Ping Jung; Scott Makeig

Event-related brain dynamics of electroencephalographic (EEG) activity in a continuous compensatory tracking task (CTT) and in a continuous driving simulation were analyzed by independent component analysis (ICA) and time-frequency techniques. We showed that changes in the level of subject performance are accompanied by distinct changes in the EEG spectrum of a class of bilateral posterior independent EEG components. During periods of high-error (drowsy) performance, tonic alpha-band EEG power was significantly elevated compared to that during periods of low-error (alert) performance. In addition, characteristic transient (phasic) increases and decreases in alpha and other bands followed critical task events, depending on current performance level. These performance-related and event-related spectral changes were consistently observed across subjects and sessions, and were remarkably similar across the two continuous sustained-attention tasks.


international conference on acoustics, speech, and signal processing | 2007

Multi-Scale EEG Brain Dynamics During Sustained Attention Tasks

Ruey-Song Huang; Tzyy-Ping Jung; Scott Makeig

We present a novel experimental paradigm and data analysis methodology for studying brain dynamics during sustained-attention tasks. 256-channel EEG data were recorded while subjects participated in hour-long simulated driving sessions. Every few seconds, the vehicle drifted away from the center of the left lane, and subjects were instructed to steer back to the lane center. The error of each drifting event was measured by the maximum absolute distance from the vehicle's position at deviation onset. EEG data were analyzed using independent component analysis and time-frequency analysis. An independent component with equivalent dipole sources located bilaterally in lateral occipital cortex exhibited multi-scale brain dynamics. Tonic (~20s) alpha-band power increased in high-error compared to low-error epochs, while phasic (~1s) alpha power was suppressed briefly after deviation onset, then increased strongly just before response offset. Other components also exhibited distinct tonic and/or phasic activity patterns relating to deviation onsets or response onsets.
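
Below is a minimal sketch of the per-event error measure defined above (maximum absolute displacement from the vehicle's position at deviation onset) and of a median split into high- and low-error epochs for the tonic comparison. The helper names and the median split are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def event_errors(position, onsets, offsets):
    """Per-event driving error: maximum absolute displacement of the vehicle
    from its position at deviation onset.

    position        : 1-D array of lateral vehicle position over time
    onsets, offsets : index arrays marking the start and end of each drift event
    """
    return np.array([np.max(np.abs(position[a:b] - position[a]))
                     for a, b in zip(onsets, offsets)])

def split_epochs(errors, onsets):
    """Label each event as high- or low-error by a median split, e.g. for
    contrasting tonic alpha power between the two groups of epochs."""
    median = np.median(errors)
    return onsets[errors >= median], onsets[errors < median]
```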


The Open Neuroimaging Journal | 2013

Bottom-up Retinotopic Organization Supports Top-down Mental Imagery

Ruey-Song Huang; Martin I. Sereno

Finding a path between locations is a routine task in daily life. Mental navigation is often used to plan a route to a destination that is not visible from the current location. We first used functional magnetic resonance imaging (fMRI) and surface-based averaging methods to find high-level brain regions involved in imagined navigation between locations in a building very familiar to each participant. This revealed a mental navigation network that includes the precuneus, retrosplenial cortex (RSC), parahippocampal place area (PPA), occipital place area (OPA), supplementary motor area (SMA), premotor cortex, and areas along the medial and anterior intraparietal sulcus. We then visualized retinotopic maps in the entire cortex using wide-field, natural scene stimuli in a separate set of fMRI experiments. This revealed five distinct visual streams or ‘fingers’ that extend anteriorly into middle temporal, superior parietal, medial parietal, retrosplenial and ventral occipitotemporal cortex. By using spherical morphing to overlap these two data sets, we showed that the mental navigation network primarily occupies areas that also contain retinotopic maps. Specifically, scene-selective regions RSC, PPA and OPA have a common emphasis on the far periphery of the upper visual field. These results suggest that bottom-up retinotopic organization may help to efficiently encode scene and location information in an eye-centered reference frame for top-down, internally generated mental navigation. This study pushes the border of visual cortex further anterior than was initially expected.

Collaboration


Dive into Ruey-Song Huang's collaborations.

Top Co-Authors

Tzyy-Ping Jung (University of California)
Ching-fu Chen (University of California)
Scott Makeig (University of California)
Chung J. Kuo (National Chung Cheng University)
Li-Wei Ko (National Chiao Tung University)
Arnaud Delorme (University of California)
Fabiana Patria (University of California)
Jin-Chern Chiou (National Chiao Tung University)