Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Benjamin A. Rowland is active.

Publication


Featured research published by Benjamin A. Rowland.


Nature Reviews Neuroscience | 2014

Development of multisensory integration from the perspective of the individual neuron

Barry E. Stein; Terrence R. Stanford; Benjamin A. Rowland

The ability to use cues from multiple senses in concert is a fundamental aspect of brain function. It maximizes the brain's use of the information available to it at any given moment and enhances the physiological salience of external events. Because each sense conveys a unique perspective of the external world, synthesizing information across senses affords computational benefits that cannot otherwise be achieved. Multisensory integration not only has substantial survival value but can also create unique experiences that emerge when signals from different sensory channels are bound together. However, neurons in a newborn's brain are not capable of multisensory integration, and studies in the midbrain have shown that the development of this process is not predetermined. Rather, its emergence and maturation critically depend on cross-modal experiences that alter the underlying neural circuit in a way that optimizes multisensory integrative capabilities for the environment in which the animal will function.


The Journal of Neuroscience | 2012

Incorporating Cross-Modal Statistics in the Development and Maintenance of Multisensory Integration

Jinghong Xu; Liping Yu; Benjamin A. Rowland; Terrence R. Stanford; Barry E. Stein

Development of multisensory integration capabilities in superior colliculus (SC) neurons was examined in cats whose visual–auditory experience was restricted to a circumscribed period during early life (postnatal day 30–8 months). Animals were periodically exposed to visual and auditory stimuli appearing either randomly in space and time, or always in spatiotemporal concordance. At all other times animals were maintained in darkness. Physiological testing was initiated at ∼2 years of age. Exposure to random visual and auditory stimuli proved insufficient to spur maturation of the ability to integrate cross-modal stimuli, but exposure to spatiotemporally concordant cross-modal stimuli was highly effective. The multisensory integration capabilities of neurons in the latter group resembled those of normal animals and were retained for >16 months in the absence of subsequent visual–auditory experience. Furthermore, the neurons were capable of integrating stimuli having physical properties differing significantly from those in the exposure set. These observations suggest that acquiring the rudiments of multisensory integration requires little more than exposure to consistent relationships between the modality-specific components of a cross-modal event, and that continued experience with such events is not necessary for their maintenance. Apparently, the statistics of cross-modal experience early in life define the spatial and temporal filters that determine whether the components of cross-modal stimuli are to be integrated or treated as independent events, a crucial developmental process that determines the spatial and temporal rules by which cross-modal stimuli are integrated to enhance both sensory salience and the likelihood of eliciting an SC-mediated motor response.


The Journal of Neuroscience | 2015

Relative Unisensory Strength and Timing Predict Their Multisensory Product

Ryan L. Miller; Scott R. Pluta; Barry E. Stein; Benjamin A. Rowland

Understanding the principles by which the brain combines information from different senses provides us with insight into the computational strategies used to maximize their utility. Prior studies of the superior colliculus (SC) neuron as a model suggest that the relative timing with which sensory cues appear is an important factor in this context. Cross-modal cues that are near-simultaneous are likely to be derived from the same event, and the neural inputs they generate are integrated more strongly than those from cues that are temporally displaced from one another. However, the present results from studies of cat SC neurons show that this “temporal principle” of multisensory integration is more nuanced than previously thought and reveal that the integration of temporally displaced sensory responses is also highly dependent on the relative efficacies with which they drive their common target neuron. Larger multisensory responses were achieved when stronger responses were advanced in time relative to weaker responses. This new temporal principle of integration suggests an inhibitory mechanism that better accounts for the sensitivity of the multisensory product to differences in the timing of cross-modal cues than do earlier mechanistic hypotheses based on response onset alignment or response overlap.


European Journal of Neuroscience | 2014

Noise-rearing disrupts the maturation of multisensory integration.

Jinghong Xu; Liping Yu; Benjamin A. Rowland; Terrence R. Stanford; Barry E. Stein

It is commonly believed that the ability to integrate information from different senses develops according to associative learning principles as neurons acquire experience with co‐active cross‐modal inputs. However, previous studies have not distinguished between requirements for co‐activation versus co‐variation. To determine whether cross‐modal co‐activation is sufficient for this purpose in visual–auditory superior colliculus (SC) neurons, animals were reared in constant omnidirectional noise. By masking most spatiotemporally discrete auditory experiences, the noise created a sensory landscape that decoupled stimulus co‐activation and co‐variance. Although a near‐normal complement of visual–auditory SC neurons developed, the vast majority could not engage in multisensory integration, revealing that visual–auditory co‐activation was insufficient for this purpose. That experience with co‐varying stimuli is required for multisensory maturation is consistent with the role of the SC in detecting and locating biologically significant events, but it also seems likely that this is a general requirement for multisensory maturation throughout the brain.


European Journal of Neuroscience | 2013

Development of cortical influences on superior colliculus multisensory neurons: effects of dark-rearing

Liping Yu; Jinghong Xu; Benjamin A. Rowland; Barry E. Stein

Rearing cats from birth to adulthood in darkness prevents neurons in the superior colliculus (SC) from developing the capability to integrate visual and non‐visual (e.g. visual‐auditory) inputs. Presumably, this developmental anomaly is due to a lack of experience with the combination of those cues, which is essential to form associative links between them. The visual‐auditory multisensory integration capacity of SC neurons has also been shown to depend on the functional integrity of converging visual and auditory inputs from the ipsilateral association cortex. Disrupting these cortico‐collicular projections at any stage of life results in a pattern of outcomes similar to those found after dark‐rearing; SC neurons respond to stimuli in both sensory modalities, but cannot integrate the information they provide. Thus, it is possible that dark‐rearing compromises the development of these descending tecto‐petal connections and the essential influences they convey. However, the results of the present experiments, using cortical deactivation to assess the presence of cortico‐collicular influences, demonstrate that dark‐rearing does not prevent the association cortex from developing robust influences over SC multisensory responses. In fact, dark‐rearing may increase their potency over that observed in normally‐reared animals. Nevertheless, their influences are still insufficient to support SC multisensory integration. It appears that cross‐modal experience shapes the cortical influence to selectively enhance responses to cross‐modal stimulus combinations that are likely to be derived from the same event. In the absence of this experience, the cortex develops an indiscriminate excitatory influence over its multisensory SC target neurons.


The Journal of Neuroscience | 2014

Brief Cortical Deactivation Early in Life Has Long-Lasting Effects on Multisensory Behavior

Benjamin A. Rowland; Wan Jiang; Barry E. Stein

Detecting and locating environmental events are markedly enhanced by the midbrain's ability to integrate visual and auditory cues. Its capacity for multisensory integration develops in cats 1–4 months after birth but only after acquiring extensive visual–auditory experience. However, briefly deactivating specific regions of association cortex during this period induced long-term disruption of this maturational process, such that even 1 year later animals were unable to integrate visual and auditory cues to enhance their behavioral performance. The data from this animal model reveal a window of sensitivity within which association cortex mediates the encoding of cross-modal experience in the midbrain. Surprisingly, however, 3 years later, and without any additional intervention, the capacity appeared fully developed. This suggests that, although sensitivity degrades with age, the potential for acquiring or modifying multisensory integration capabilities extends well into adulthood.


Journal of Neurophysiology | 2015

What does a neuron learn from multisensory experience?

Jinghong Xu; Liping Yu; Terrence R. Stanford; Benjamin A. Rowland; Barry E. Stein

The brain's ability to integrate information from different senses is acquired only after extensive sensory experience. However, whether early life experience instantiates a general integrative capacity in multisensory neurons or one limited to the particular cross-modal stimulus combinations to which one has been exposed is not known. By selectively restricting either visual-nonvisual or auditory-nonauditory experience during the first few months of life, the present study found that trisensory neurons in cat superior colliculus (as well as their bisensory counterparts) became adapted to the cross-modal stimulus combinations specific to each rearing environment. Thus, even at maturity, trisensory neurons did not integrate all cross-modal stimulus combinations to which they were capable of responding, but only those that had been linked via experience to constitute a coherent spatiotemporal event. This selective maturational process determines which environmental events will become the most effective targets for superior colliculus-mediated shifts of attention and orientation.


The Journal of Neuroscience | 2017

Multisensory integration uses a real-time unisensory-multisensory transform

Ryan Miller; Barry E. Stein; Benjamin A. Rowland

The manner in which the brain integrates different sensory inputs to facilitate perception and behavior has been the subject of numerous speculations. By examining multisensory neurons in cat superior colliculus, the present study demonstrated that two operational principles are sufficient to understand how this remarkable result is achieved: (1) unisensory signals are integrated continuously and in real time as soon as they arrive at their common target neuron and (2) the resultant multisensory computation is modified in shape and timing by a delayed, calibrating inhibition. These principles were tested for descriptive sufficiency by embedding them in a neurocomputational model and using it to predict a neuron's moment-by-moment multisensory response given only knowledge of its responses to the individual modality-specific component cues. The predictions proved to be highly accurate, reliable, and unbiased and were, in most cases, not statistically distinguishable from the neuron's actual instantaneous multisensory response at any phase throughout its entire duration. The model was also able to explain why different multisensory products are often observed in different neurons at different time points, as well as the higher-order properties of multisensory integration, such as the dependency of multisensory products on the temporal alignment of cross-modal cues. These observations not only reveal this fundamental integrative operation, but also identify quantitatively the multisensory transform used by each neuron. As a result, they provide a means of comparing the integrative profiles among neurons and evaluating how they are affected by changes in intrinsic or extrinsic factors. SIGNIFICANCE STATEMENT Multisensory integration is the process by which the brain combines information from multiple sensory sources (e.g., vision and audition) to maximize an organism's ability to identify and respond to environmental stimuli. The actual transformative process by which the neural products of multisensory integration are achieved is poorly understood. By focusing on the millisecond-by-millisecond differences between a neuron's unisensory component responses and its integrated multisensory response, it was found that this multisensory transform can be described by two basic principles: unisensory information is integrated in real time and the multisensory response is shaped by calibrating inhibition. It is now possible to use these principles to predict a neuron's multisensory response accurately, armed only with knowledge of its unisensory responses.
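The two principles in this abstract can be illustrated with a toy simulation. This is a minimal sketch, not the published neurocomputational model: the function name, the Gaussian rate profiles, and the parameter values (`w_inhib`, `delay`) are all illustrative assumptions chosen only to show real-time summation of unisensory inputs followed by a delayed, calibrating inhibition.

```python
import numpy as np

def multisensory_transform(r_visual, r_auditory, w_inhib=0.5, delay=20):
    """Toy sketch of a real-time unisensory-to-multisensory transform.

    Principle 1: unisensory inputs are summed continuously, sample by sample.
    Principle 2: a delayed, calibrating inhibition (here, a lagged, scaled
    copy of the summed drive) subtracts from the running total.

    Parameter names and values are illustrative assumptions, not values
    taken from the published model.
    """
    drive = r_visual + r_auditory             # principle 1: real-time summation
    inhib = np.zeros_like(drive)
    inhib[delay:] = w_inhib * drive[:-delay]  # principle 2: delayed inhibition
    return np.maximum(drive - inhib, 0.0)     # firing rates cannot go negative

# Simulated unisensory responses: Gaussian rate profiles (spikes/s) over 200 ms
t = np.arange(200)
r_v = 30.0 * np.exp(-((t - 60) ** 2) / (2 * 15.0 ** 2))  # visual response
r_a = 20.0 * np.exp(-((t - 80) ** 2) / (2 * 10.0 ** 2))  # auditory response

r_multi = multisensory_transform(r_v, r_a)
```

Shifting one Gaussian relative to the other in this sketch changes the predicted multisensory product, mirroring the abstract's point that the transform accounts for the dependency of integration on the temporal alignment of cross-modal cues.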


Cerebral Cortex | 2016

Multisensory Plasticity in Superior Colliculus Neurons is Mediated by Association Cortex

Liping Yu; Jinghong Xu; Benjamin A. Rowland; Barry E. Stein

The ability to integrate information from different senses, and thereby facilitate detecting and localizing events, normally develops gradually in cat superior colliculus (SC) neurons as experience with cross-modal events is acquired. Here, we demonstrate that the portal for this experience-based change is association cortex. Unilaterally deactivating this cortex whenever visual-auditory events were present resulted in the failure of ipsilateral SC neurons to develop the ability to integrate those cross-modal inputs, even though they retained the ability to respond to them. In contrast, their counterparts in the opposite SC developed this capacity normally. The deficits were eliminated by providing cross-modal experience when cortex was active. These observations underscore the collaborative developmental processes that take place among different levels of the neuraxis to adapt the brain's multisensory (and sensorimotor) circuits to the environment in which they will be used.


Scientific Reports | 2017

The normal environment delays the development of multisensory integration

Jinghong Xu; Liping Yu; Benjamin A. Rowland; Barry E. Stein

Multisensory neurons in animals whose cross-modal experiences are compromised during early life fail to develop the ability to integrate information across those senses. Consequently, they lack the ability to increase the physiological salience of the events that provide the convergent cross-modal inputs. The present study demonstrates that superior colliculus (SC) neurons in animals whose visual-auditory experience is compromised early in life by noise-rearing can develop visual-auditory multisensory integration capabilities rapidly when periodically exposed to a single set of visual-auditory stimuli in a controlled laboratory paradigm. However, they remain compromised if their experiences are limited to a normal housing environment. These observations seem counterintuitive given that multisensory integrative capabilities ordinarily develop during early life in normal environments, in which a wide variety of sensory stimuli facilitate the functional organization of complex neural circuits at multiple levels of the neuraxis. However, the very richness and inherent variability of sensory stimuli in normal environments will lead to a less regular coupling of any given set of cross-modal cues than does the otherwise “impoverished” laboratory exposure paradigm. That this poses no significant problem for the neonate, but does for the adult, indicates a maturational shift in the requirements for the development of multisensory integration capabilities.

Collaboration


Dive into Benjamin A. Rowland's collaborations.

Top Co-Authors
Jinghong Xu

East China Normal University

Liping Yu

East China Normal University

Ryan Miller

Wake Forest University
