Publications


Featured research published by Hongliang Gong.


The Journal of Neuroscience | 2012

Equivalent Representation of Real and Illusory Contours in Macaque V4

Yanxia Pan; Minggui Chen; Jiapeng Yin; Xu An; Xian Zhang; Yiliang Lu; Hongliang Gong; Wu Li; Wei Wang

The cortical processing of illusory contours provides a unique window for exploring the brain mechanisms underlying visual perception. Previous electrophysiological single-cell recordings demonstrate that a subgroup of cells in macaque V1 and V2 signal the presence of illusory contours, whereas recent human brain imaging studies reveal higher-order visual cortices playing a central role in illusory figure processing. It seems that the processing of illusory contours/figures may engage multiple cortical interactions between hierarchically organized processing stages in the ventral visual pathway of primates. However, it is not yet known in which brain areas illusory contours are represented in the same manner as real contours at both the population and single-cell levels. Here, by combining intrinsic optical imaging in anesthetized rhesus macaques with single-cell recordings in awake ones, we found a complete overlap of orientation domains in visual cortical area V4 for processing real and illusory contours. In contrast, the orientation domains mapped in early visual areas V1 and V2 mainly encoded the local physical stimulus features inducing the subjective perception of global illusory contours. Our results indicate that real and illusory contours are encoded equivalently by the same functional domains in V4, suggesting that V4 is a key cortical locus for integration of local features into global contours.


The Journal of Neuroscience | 2012

Distinct Functional Organizations for Processing Different Motion Signals in V1, V2, and V4 of Macaque

Xu An; Hongliang Gong; Liling Qian; Xiaochun Wang; Yanxia Pan; Xian Zhang; Yupeng Yang; Wei Wang

Motion perception is qualitatively invariant across different objects and forms, namely, the same motion information can be conveyed by many different physical carriers, and it requires the processing of motion signals consisting of direction, speed, and axis or trajectory of motion defined by a moving object. Compared with the representation of orientation, the cortical processing of these different motion signals within the early ventral visual pathway of the primate remains poorly understood. Using drifting full-field noise stimuli and intrinsic optical imaging, along with cytochrome-oxidase staining, we found that the orientation domains in macaque V1, V2, and V4 that processed orientation signals also served to process motion signals associated with the axis and speed of motion. In contrast, direction domains within the thick stripes of V2 demonstrated preferences that were independent of motion speed. The population responses encoding the orientation and motion axis could be precisely reproduced by a spatiotemporal energy model. Thus, our observation of orientation domains with dual functions in V1, V2, and V4 directly supports the notion that the linear representation of the temporal series of retinotopic activations may serve as another motion-processing strategy in the primate ventral visual pathway, contributing directly to fine form and motion analysis. Our findings further reveal that different types of motion information are differentially processed in parallel and segregated compartments within the primate early visual cortices, before these motion features are fully combined in high-tier visual areas.
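
The "spatiotemporal energy model" invoked here is, in its standard form, a bank of space-time oriented filters whose quadrature outputs are squared and summed. Below is a minimal one-dimensional sketch of a single such unit with arbitrary filter parameters; it illustrates the construction only, not the physiologically constrained model fitted in the study.

import numpy as np

def gabor_st(x, t, sf=1.0, tf=2.0, sigma_x=0.5, sigma_t=0.1, phase=0.0):
    # Space-time oriented Gabor: spatial frequency sf (cyc/deg), temporal
    # frequency tf (Hz); the filter's preferred speed is tf / sf (deg/s).
    envelope = np.exp(-x**2 / (2 * sigma_x**2) - t**2 / (2 * sigma_t**2))
    carrier = np.cos(2 * np.pi * (sf * x + tf * t) + phase)
    return envelope * carrier

def motion_energy(stimulus, x, t, **kw):
    # Quadrature pair, squared and summed: phase-invariant motion energy.
    r_even = np.sum(stimulus * gabor_st(x, t, phase=0.0, **kw))
    r_odd = np.sum(stimulus * gabor_st(x, t, phase=np.pi / 2, **kw))
    return r_even**2 + r_odd**2

# Drifting gratings sharing the filter's motion axis but moving in opposite
# directions; the unit responds far more strongly to its preferred direction.
x = np.linspace(-2, 2, 128)        # space (deg)
t = np.linspace(-0.3, 0.3, 64)     # time (s)
X, T = np.meshgrid(x, t, indexing="ij")
preferred = np.cos(2 * np.pi * (1.0 * X + 2.0 * T))
opposite = np.cos(2 * np.pi * (1.0 * X - 2.0 * T))
print(motion_energy(preferred, X, T), motion_energy(opposite, X, T))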


PLOS ONE | 2014

Orientation-Cue Invariant Population Responses to Contrast-Modulated and Phase-Reversed Contour Stimuli in Macaque V1 and V2

Xu An; Hongliang Gong; Jiapeng Yin; Xiaochun Wang; Yanxia Pan; Xian Zhang; Yiliang Lu; Yupeng Yang; Zoltan G. Toth; Ingo Schiessl; Niall McLoughlin; Wei Wang

Visual scenes can be readily decomposed into a variety of oriented components, the processing of which is vital for object segregation and recognition. In primate V1 and V2, most neurons have small spatio-temporal receptive fields responding selectively to oriented luminance contours (first order), while only a subgroup of neurons signal non-luminance defined contours (second order). So how is the orientation of second-order contours represented at the population level in macaque V1 and V2? Here we compared the population responses in macaque V1 and V2 to two types of second-order contour stimuli generated either by modulation of contrast or phase reversal with those to first-order contour stimuli. Using intrinsic signal optical imaging, we found that the orientation of second-order contour stimuli was represented invariantly in the orientation columns of both macaque V1 and V2. A physiologically constrained spatio-temporal energy model of V1 and V2 neuronal populations could reproduce all the recorded population responses. These findings suggest that, at the population level, the primate early visual system processes the orientation of second-order contours initially through a linear spatio-temporal filter mechanism. Our results of population responses to different second-order contour stimuli support the idea that the orientation maps in primate V1 and V2 can be described as a spatio-temporal energy map.
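
To make the stimulus classes being contrasted concrete, the sketch below constructs a luminance-defined (first-order) contour and two second-order variants, one contrast-modulated and one phase-reversed. The carrier, spatial frequencies, and orientation are hypothetical choices for illustration, not the parameters used in the experiments.

import numpy as np

size, sf_env = 256, 4                       # image size (px), envelope cycles/image
y, x = np.mgrid[0:size, 0:size] / size
theta = np.deg2rad(45)                      # orientation of the contour
u = x * np.cos(theta) + y * np.sin(theta)
envelope = 0.5 + 0.5 * np.sin(2 * np.pi * sf_env * u)

# First-order: the contour is carried directly by luminance modulation.
first_order = envelope

# Second-order, contrast-modulated: mean luminance is constant everywhere;
# the contour exists only in the contrast of a noise carrier.
carrier = np.random.default_rng(0).uniform(-1, 1, (size, size))
contrast_modulated = 0.5 + 0.5 * envelope * carrier

# Second-order, phase-reversed (one common construction): a high-frequency
# grating whose spatial phase flips by pi across the envelope, so the contour
# is defined by phase reversals with no luminance or contrast modulation.
phase_flip = np.where(np.sin(2 * np.pi * sf_env * u) > 0, 0.0, np.pi)
phase_reversed = 0.5 + 0.5 * np.sin(2 * np.pi * 32 * x + phase_flip)

# A purely linear filter tuned to sf_env responds to first_order but not to the
# second-order images; an energy (filter-rectify-filter) stage recovers the
# envelope orientation in all three cases.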


PLOS ONE | 2014

The mechanism for processing random-dot motion at various speeds in early visual cortices.

Xu An; Hongliang Gong; Niall McLoughlin; Yupeng Yang; Wei Wang

All moving objects generate sequential retinotopic activations representing a series of discrete locations in space and time (motion trajectory). How direction-selective neurons in mammalian early visual cortices process the motion trajectory remains to be clarified. Using single-cell recording and optical imaging of intrinsic signals along with mathematical simulation, we studied response properties of cat visual areas 17 and 18 to random dots moving at various speeds. We found that the motion trajectory at low speed was encoded primarily as a direction signal by groups of neurons preferring that motion direction. Above certain transition speeds, the motion trajectory was perceived as a spatial orientation representing the motion axis of the moving dots. In both areas studied, above these speeds, other groups of direction-selective neurons with perpendicular direction preferences were activated to encode the motion trajectory as motion-axis information. This applied to both simple and complex neurons. The average transition speed for switching between encoding motion direction and axis was about 31°/s in area 18 and 15°/s in area 17. A spatio-temporal energy model predicted the transition speeds accurately in both areas, but not the direction-selectivity indexes to random-dot stimuli in area 18. In addition, above the transition speeds, the change of direction preferences of population responses recorded by optical imaging could be revealed using the vector-maximum but not the vector-summation method. Together, this speed-dependent, combined processing of motion direction and axis by neurons with orthogonal direction preferences may serve as a common principle of early visual motion processing.
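
The final point, that the shift in direction preference is revealed by a vector-maximum but not a vector-summation readout, can be illustrated with synthetic tuning profiles (hypothetical responses, not the recorded data): when two groups with opposite preferred directions along the motion axis are co-activated, their contributions cancel in the vector sum, while the maximally responding channel still reveals the change.

import numpy as np

dirs = np.arange(0, 360, 30)                      # direction channels (deg)

def tuning(center, width=30.0):
    # Circular Gaussian response profile across the direction channels.
    d = (dirs - center + 180) % 360 - 180
    return np.exp(-0.5 * (d / width)**2)

def vector_sum(responses):
    # Population vector: angle and magnitude of the response-weighted sum.
    v = np.sum(responses * np.exp(1j * np.deg2rad(dirs)))
    return np.rad2deg(np.angle(v)) % 360, np.abs(v)

def vector_max(responses):
    # Preferred direction of the maximally responding channel.
    return dirs[int(np.argmax(responses))]

low = tuning(90)                   # low speed: channels near the motion direction
high = tuning(0) + tuning(180)     # high speed: channels orthogonal to the axis

for name, r in (("low speed", low), ("high speed", high)):
    angle, magnitude = vector_sum(r)
    print(name, "sum:", round(angle), round(magnitude, 2), "max:", vector_max(r))

# At high speed the two co-active groups cancel in the vector sum (magnitude
# near zero, angle meaningless), whereas the vector maximum reports the new,
# orthogonal preference.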


Proceedings of the Royal Society B: Biological Sciences, 282(1813) | 2015

Breaking cover: neural responses to slow and fast camouflage-breaking motion

Jiapeng Yin; Hongliang Gong; Xu An; Zheyuan Chen; Yiliang Lu; Ian M. Andolina; Niall McLoughlin; Wei Wang

Primates need to detect and recognize camouflaged animals in natural environments. Camouflage-breaking movements are often the only visual cue available to accomplish this. Specifically, sudden movements are often detected before full recognition of the camouflaged animal is made, suggesting that initial processing of motion precedes the recognition of motion-defined contours or shapes. What are the neuronal mechanisms underlying this initial processing of camouflaged motion in the primate visual brain? We investigated this question using intrinsic-signal optical imaging of macaque V1, V2 and V4, along with computer simulations of the neural population responses. We found that camouflaged motion at low speed was processed as a direction signal by both direction- and orientation-selective neurons, whereas at high speed camouflaged motion was encoded as a motion-streak signal primarily by orientation-selective neurons. No population responses were found to be invariant to the camouflage contours. These results suggest that the initial processing of camouflaged motion at low and high speeds is encoded as direction and motion-streak signals in primate early visual cortices. These processes are consistent with a spatio-temporal filter mechanism that provides for fast processing of motion signals, prior to full recognition of camouflage-breaking animals.
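
The switch from a direction signal at low speed to a motion-streak signal at high speed follows from a simple relation: within a neuron's temporal integration window T, a dot moving at speed v smears into a streak of length v × T, and once that streak exceeds the receptive-field scale it behaves like an oriented contour along the motion axis. A small worked example with illustrative numbers; the integration window and receptive-field size below are assumptions, not values reported in the paper.

T = 0.1     # hypothetical temporal integration window (s)
rf = 1.5    # hypothetical receptive-field extent (deg)

for v in (5, 15, 30, 60):                   # dot speeds (deg/s)
    streak = v * T                          # streak length within one window
    regime = "motion streak (orientation)" if streak >= rf else "direction"
    print(f"{v:>3} deg/s -> streak {streak:.1f} deg -> encoded mainly as {regime}")

# Under these assumptions the transition speed is v* = rf / T = 15 deg/s.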


Archive | 2016

The Neural Mechanism of Direction- and Orientation-Selective Neurons for Processing Direction, Speed, and Axis of Motion in Early Visual Cortices

Hongliang Gong; Xu An; Liling Qian; Jiapeng Yin; Yiliang Lu; Wei Wang

Visual motion is fundamentally different from physical motion, because the former represents sequential retinotopic neuronal activations in time generated by a moving physical object. Traditionally, cortical direction-selective neurons are regarded as motion detectors and orientation-selective neurons as contour detectors. However, orientation-selective neurons also respond vigorously to motion stimuli. What, then, is the common neural mechanism underlying early motion processing in higher mammals? Here we demonstrate that the motion trajectory was encoded primarily as a direction signal only at low speed, by both direction- and orientation-selective neurons preferring that motion direction; at high speed, other groups of direction- and orientation-selective neurons with perpendicular preferences were activated to encode the motion trajectory as motion-axis information. Thus, depending on the motion speed, the combined processing of motion direction and axis by neurons with orthogonal direction and orientation preferences may serve as a fundamental principle of visual motion processing in the early visual areas of higher mammals.


Archive | 2016

The Application of Spatiotemporal Energy Model in the Simulation of Population Responses in Early Visual Cortices

Yiliang Lu; Xu An; Hongliang Gong; Wei Wang

The early visual cortices (V1 and V2) are traditionally regarded as local feature detectors or filters of spatiotemporal components, while higher visual areas are regarded as centers for global feature or object recognition in higher mammals. Natural scenes are complicated, so synthesized visual stimuli are often composed of various local and global visual cues. Here we propose that the spatiotemporal energy model can simulate the population responses to the local components of most visual stimuli in the early visual cortices. The population responses to stimuli of either luminance or contrast and texture modulations, recorded using intrinsic optical imaging in both V1 and V2 of macaques and cats, could be successfully predicted by the energy model. However, it failed to predict the population responses to other complex stimuli such as illusory and kinetic contours. These results illustrate the applications and limitations of the spatiotemporal energy model in accounting for population responses in the early visual cortices.


Neuron | 2018

Revealing Detail along the Visual Hierarchy: Neural Clustering Preserves Acuity from V1 to V4

Yiliang Lu; Jiapeng Yin; Zheyuan Chen; Hongliang Gong; Ye Liu; Liling Qian; Xiaohong Li; Rui Liu; Ian M. Andolina; Wei Wang


Advances in Psychological Science (心理科学进展) | 2016

A study of spatial frequency selectivity in areas V1, V2, and V4 of the non-human primate macaque

Yiliang Lu; Hongliang Gong; Jiapeng Yin; Zheyuan Chen; Ian M. Andolina; Wei Wang


Archive | 2015

Electronic supplementary material (ESM) for Yin et al., 6 July 2015

Jiapeng Yin; Hongliang Gong; Xu An; Zheyuan Chen; Yiliang Lu; Ian M. Andolina; Niall McLoughlin; Wei Wang

Collaboration


Dive into Hongliang Gong's collaborations.

Top Co-Authors

Wei Wang (Chinese Academy of Sciences)
Xu An (Chinese Academy of Sciences)
Jiapeng Yin (Chinese Academy of Sciences)
Yiliang Lu (Chinese Academy of Sciences)
Xian Zhang (Chinese Academy of Sciences)
Yanxia Pan (Chinese Academy of Sciences)
Yupeng Yang (University of Science and Technology of China)
Liling Qian (Chinese Academy of Sciences)
Xiaochun Wang (Chinese Academy of Sciences)
Zheyuan Chen (Chinese Academy of Sciences)