Manuel R. Mercier
Albert Einstein College of Medicine
Publications
Featured research published by Manuel R. Mercier.
The Journal of Neuroscience | 2011
Ian C. Fiebelkorn; John J. Foxe; John S. Butler; Manuel R. Mercier; Adam C. Snyder; Sophie Molholm
The simultaneous presentation of a stimulus in one sensory modality often enhances target detection in another sensory modality, but the neural mechanisms that govern these effects are still under investigation. Here, we test a hypothesis proposed in the neurophysiological literature: that auditory facilitation of visual-target detection operates through cross-sensory phase reset of ongoing neural oscillations (Lakatos et al., 2009). To date, measurement limitations have prevented this potentially powerful neural mechanism from being directly linked with its predicted behavioral consequences. The present experiment uses a psychophysical approach in humans to demonstrate, for the first time, stimulus-locked periodicity in visual-target detection, following a temporally informative sound. Our data further demonstrate that periodicity in behavioral performance is strongly influenced by the probability of audiovisual co-occurrence. We argue that fluctuations in visual-target detection result from cross-sensory phase reset, both at the moment it occurs and persisting for seconds thereafter. The precise frequency at which this periodicity operates remains to be determined through a method that allows for a higher sampling rate.
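The kind of stimulus-locked periodicity reported here can be illustrated with a toy simulation. This is a hypothetical sketch, not the study's actual analysis: the 8 Hz modulation frequency, trial counts, and delay grid are all invented, and the behavioral oscillation is recovered simply by taking an FFT of the mean-removed hit-rate time course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: detection probability oscillates at f_mod
# after a phase-resetting sound (all parameters invented).
f_mod = 8.0                              # assumed modulation frequency (Hz)
delays = np.arange(0.0, 1.0, 0.01)       # cue-to-target delays, 10 ms steps
p_hit = 0.5 + 0.15 * np.cos(2 * np.pi * f_mod * delays)

n_trials = 500                           # simulated trials per delay
hits = rng.binomial(n_trials, p_hit) / n_trials   # observed hit rate per delay

# Recover the behavioral periodicity from the mean-removed hit-rate series.
spectrum = np.abs(np.fft.rfft(hits - hits.mean()))
freqs = np.fft.rfftfreq(len(delays), d=0.01)
peak_freq = freqs[np.argmax(spectrum)]
print(f"recovered modulation frequency: {peak_freq:.1f} Hz")
```

With a 10 ms delay grid the spectral resolution is 1 Hz, which is exactly the sampling-rate limitation the abstract notes: a finer delay grid would be needed to pin down the true frequency.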
NeuroImage | 2013
Manuel R. Mercier; John J. Foxe; Ian C. Fiebelkorn; John S. Butler; Theodore H. Schwartz; Sophie Molholm
Findings in animal models demonstrate that activity within hierarchically early sensory cortical regions can be modulated by cross-sensory inputs through resetting of the phase of ongoing intrinsic neural oscillations. Here, subdural recordings evaluated whether phase resetting by auditory inputs would impact multisensory integration processes in human visual cortex. Results clearly showed auditory-driven phase reset in visual cortices and, in some cases, frank auditory event-related potentials (ERP) were also observed over these regions. Further, when audiovisual bisensory stimuli were presented, this led to robust multisensory integration effects which were observed in both the ERP and in measures of phase concentration. These results extend findings from animal models to human visual cortices, and highlight the impact of cross-sensory phase resetting by a non-primary stimulus on multisensory integration in ostensibly unisensory cortices.
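Phase concentration of the kind measured here is commonly quantified as inter-trial phase coherence (ITC): the length of the mean resultant vector of single-trial phases at a given time point. The sketch below uses simulated data with invented parameters, not the paper's pipeline; a phase-reset condition (phase realigned to stimulus onset on every trial) yields high ITC, while random phases yield ITC near zero.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(1)

fs, f, n_trials = 500, 7.0, 200          # sampling rate, frequency, trials (invented)
t = np.arange(0, 1.0, 1 / fs)

def itc(phases):
    """Length of the mean resultant vector of single-trial phases (0..1)."""
    return np.abs(np.mean(np.exp(1j * phases)))

# "Reset" condition: oscillation phase realigned to onset, small per-trial jitter.
reset = np.array([np.cos(2 * np.pi * f * t + rng.normal(0, 0.3))
                  for _ in range(n_trials)])
# "No reset" condition: phase uniformly random across trials.
no_reset = np.array([np.cos(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
                     for _ in range(n_trials)])

# Instantaneous phase at one post-stimulus sample via the analytic signal.
sample = 100                              # 200 ms after onset
itc_reset = itc(np.angle(hilbert(reset, axis=1))[:, sample])
itc_none = itc(np.angle(hilbert(no_reset, axis=1))[:, sample])
print(itc_reset, itc_none)
```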
The Journal of Neuroscience | 2011
John S. Butler; Sophie Molholm; Ian C. Fiebelkorn; Manuel R. Mercier; Theodore H. Schwartz; John J. Foxe
Certain features of objects or events can be represented by more than a single sensory system, such as roughness of a surface (sight, sound, and touch), the location of a speaker (audition and sight), and the rhythm or duration of an event (by all three major sensory systems). Thus, these properties can be said to be sensory-independent or amodal. A key question is whether common multisensory cortical regions process these amodal features, or does each sensory system contain its own specialized region(s) for processing common features? We tackled this issue by investigating simple duration-detection mechanisms across audition and touch; these systems were chosen because fine duration discriminations are possible in both. The mismatch negativity (MMN) component of the human event-related potential provides a sensitive metric of duration processing and has been elicited independently during both auditory and somatosensory investigations. Employing high-density electroencephalographic recordings in conjunction with intracranial subdural recordings, we asked whether fine duration discriminations, represented by the MMN, were generated in the same cortical regions regardless of the sensory modality being probed. Scalp recordings pointed to statistically distinct MMN topographies across senses, implying differential underlying cortical generator configurations. Intracranial recordings confirmed these noninvasive findings, showing generators of the auditory MMN along the superior temporal gyrus with no evidence of a somatosensory MMN in this region, whereas a robust somatosensory MMN was recorded from postcentral gyrus in the absence of an auditory MMN. The current data clearly argue against a common circuitry account for amodal duration processing.
NeuroImage | 2013
Ian C. Fiebelkorn; Adam C. Snyder; Manuel R. Mercier; John S. Butler; Sophie Molholm; John J. Foxe
Functional networks are comprised of neuronal ensembles bound through synchronization across multiple intrinsic oscillatory frequencies. Various coupled interactions between brain oscillators have been described (e.g., phase-amplitude coupling), but with little evidence that these interactions actually influence perceptual sensitivity. Here, electroencephalographic (EEG) recordings were made during a sustained-attention task to demonstrate that cross-frequency coupling has significant consequences for perceptual outcomes (i.e., whether participants detect a near-threshold visual target). The data reveal that phase-detection relationships at higher frequencies are dependent on the phase of lower frequencies, such that higher frequencies alternate between periods when their phase is either strongly or weakly predictive of visual-target detection. Moreover, the specific higher frequencies and scalp topographies linked to visual-target detection also alternate as a function of lower-frequency phase. Cross-frequency coupling between lower (i.e., delta and theta) and higher frequencies (e.g., low- and high-beta) thus results in dramatic fluctuations of visual-target detection.
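Phase-amplitude coupling of the sort described above is often quantified with a mean-vector modulation index (in the style of Canolty and colleagues). The sketch below is illustrative only: it simulates a theta-modulated beta signal with invented frequencies and uses a simple FFT band-pass rather than the filters an EEG pipeline would use.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(2)

fs = 500
t = np.arange(0, 10, 1 / fs)
theta = np.cos(2 * np.pi * 6 * t)                 # 6 Hz "theta" (invented)
beta_amp = 1 + 0.8 * np.cos(2 * np.pi * 6 * t)    # beta amplitude follows theta phase
coupled = theta + beta_amp * np.cos(2 * np.pi * 20 * t) + 0.1 * rng.normal(size=t.size)

def modulation_index(sig, f_phase, f_amp, bw_p=2.0, bw_a=8.0):
    """|mean(A_high * exp(i*phi_low))| after crude FFT band-pass filtering.
    Note bw_a must be wide enough to keep the modulation sidebands."""
    def bandpass(x, lo, hi):
        X = np.fft.rfft(x)
        fr = np.fft.rfftfreq(x.size, 1 / fs)
        X[(fr < lo) | (fr > hi)] = 0
        return np.fft.irfft(X, x.size)
    phi = np.angle(hilbert(bandpass(sig, f_phase - bw_p, f_phase + bw_p)))
    amp = np.abs(hilbert(bandpass(sig, f_amp - bw_a, f_amp + bw_a)))
    return np.abs(np.mean(amp * np.exp(1j * phi)))

mi_coupled = modulation_index(coupled, 6, 20)
uncoupled = theta + np.cos(2 * np.pi * 20 * t) + 0.1 * rng.normal(size=t.size)
mi_flat = modulation_index(uncoupled, 6, 20)
print(mi_coupled, mi_flat)
```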
The Journal of Neuroscience | 2013
Ali Bahramisharif; Marcel A. J. van Gerven; Erik J. Aarnoutse; Manuel R. Mercier; Theodore H. Schwartz; John J. Foxe; Nick F. Ramsey; Ole Jensen
Neocortical neuronal activity is characterized by complex spatiotemporal dynamics. Although slow oscillations have been shown to travel over space in terms of consistent phase advances, it is unknown how this phenomenon relates to neuronal activity in other frequency bands. We here present electrocorticographic data from three male and one female human subject and demonstrate that gamma power is phase locked to traveling alpha waves. Given that alpha activity has been proposed to coordinate neuronal processing reflected in the gamma band, we suggest that alpha waves are involved in coordinating neuronal processing in both space and time.
The Journal of Neuroscience | 2015
Manuel R. Mercier; Sophie Molholm; Ian C. Fiebelkorn; John S. Butler; Theodore H. Schwartz; John J. Foxe
Even simple tasks rely on information exchange between functionally distinct and often relatively distant neuronal ensembles. Considerable work indicates oscillatory synchronization through phase alignment is a major agent of inter-regional communication. In the brain, different oscillatory phases correspond to low- and high-excitability states. Optimally aligned phases (or high-excitability states) promote inter-regional communication. Studies have also shown that sensory stimulation can modulate or reset the phase of ongoing cortical oscillations. For example, auditory stimuli can reset the phase of oscillations in visual cortex, influencing processing of a simultaneous visual stimulus. Such cross-regional phase reset represents a candidate mechanism for aligning oscillatory phase for inter-regional communication. Here, we explored the role of local and inter-regional phase alignment in driving a well established behavioral correlate of multisensory integration: the redundant target effect (RTE), which refers to the fact that responses to multisensory inputs are substantially faster than to unisensory stimuli. In a speeded detection task, human epileptic patients (N = 3) responded to unisensory (auditory or visual) and multisensory (audiovisual) stimuli with a button press, while electrocorticography was recorded over auditory and motor regions. Visual stimulation significantly modulated auditory activity via phase reset in the delta and theta bands. During the period between stimulation and subsequent motor response, transient synchronization between auditory and motor regions was observed. Phase synchrony to multisensory inputs was faster than to unisensory stimulation. This sensorimotor phase alignment correlated with behavior such that stronger synchrony was associated with faster responses, linking the commonly observed RTE with phase alignment across a sensorimotor network.
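Inter-regional phase synchrony of the kind reported between auditory and motor sites is typically quantified with a phase-locking value (PLV) across trials. A minimal sketch on simulated phases, with invented trial counts and jitter; the study's actual electrocorticographic analysis is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(3)

n_trials, fs, f = 100, 500, 5.0
t = np.arange(0, 1, 1 / fs)

def plv(phase_a, phase_b):
    """Consistency of the phase difference across trials (0 = none, 1 = perfect)."""
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b)), axis=0))

# Site A: same oscillation on every trial, random starting phase per trial.
pa = np.array([2 * np.pi * f * t + rng.uniform(0, 2 * np.pi)
               for _ in range(n_trials)])
# Synchronized site B: fixed 90-degree lag plus small per-trial jitter.
pb_sync = pa + np.pi / 2 + rng.normal(0, 0.2, size=pa.shape)
# Unsynchronized site B: phase unrelated to site A.
pb_rand = rng.uniform(0, 2 * np.pi, size=pa.shape)

plv_sync = plv(pa, pb_sync).mean()
plv_rand = plv(pa, pb_rand).mean()
print(plv_sync, plv_rand)
```

Note that PLV is high for any *consistent* phase lag, not only zero lag, which is why it suits inter-regional communication measures where conduction delays are expected.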
The Journal of Neuroscience | 2012
John S. Butler; John J. Foxe; Ian C. Fiebelkorn; Manuel R. Mercier; Sophie Molholm
The frequency of environmental vibrations is sampled by two of the major sensory systems, audition and touch, notwithstanding that these signals are transduced through very different physical media and entirely separate sensory epithelia. Psychophysical studies have shown that manipulating frequency in audition or touch can have a significant cross-sensory impact on perceived frequency in the other sensory system, pointing to intimate links between these senses during computation of frequency. In this regard, the frequency of a vibratory event can be thought of as a multisensory perceptual construct. In turn, electrophysiological studies point to temporally early multisensory interactions that occur in hierarchically early sensory regions where convergent inputs from the auditory and somatosensory systems are to be found. A key question pertains to the level of processing at which the multisensory integration of featural information, such as frequency, occurs. Do the sensory systems calculate frequency independently before this information is combined, or is this feature calculated in an integrated fashion during preattentive sensory processing? The well characterized mismatch negativity, an electrophysiological response that indexes preattentive detection of a change within the context of a regular pattern of stimulation, served as our dependent measure. High-density electrophysiological recordings were made in humans while they were presented with separate blocks of somatosensory, auditory, and audio-somatosensory “standards” and “deviants,” where the deviant differed in frequency. Multisensory effects were identified beginning at ∼200 ms, with the multisensory mismatch negativity (MMN) significantly different from the sum of the unisensory MMNs. This provides compelling evidence for preattentive coupling between the somatosensory and auditory channels in the cortical representation of frequency.
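The additive-model logic used here (multisensory MMN versus the sum of the unisensory MMNs) can be sketched on toy data. Everything below is invented for illustration: Gaussian-shaped MMN waves in arbitrary units, with the bisensory response made super-additive by construction so the comparison has something to find.

```python
import numpy as np

rng = np.random.default_rng(4)

fs = 500
t = np.arange(-0.1, 0.5, 1 / fs)         # peristimulus time (s)

def mmn(peak_amp):
    """Toy MMN: negative deflection peaking near 200 ms, plus trial noise."""
    wave = -peak_amp * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))
    return (wave + rng.normal(0, 0.2, size=(100, t.size))).mean(axis=0)

aud = mmn(1.0)        # auditory deviant-minus-standard wave
som = mmn(1.0)        # somatosensory deviant-minus-standard wave
multi = mmn(2.5)      # bisensory wave, super-additive by construction

# Multisensory effect = bisensory MMN minus the sum of the unisensory MMNs
# (negative values mean a larger-than-additive MMN, since the MMN is negative).
window = (t > 0.18) & (t < 0.22)
effect = multi[window].mean() - (aud + som)[window].mean()
print(f"effect in window: {effect:.2f} (arbitrary units)")
```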
Journal of Neuroscience Methods | 2017
David M. Groppe; Stephan Bickel; Andrew R. Dykstra; Pierre Mégevand; Manuel R. Mercier; Fred A. Lado; Ashesh D. Mehta; Christopher J. Honey
BACKGROUND: Intracranial electrical recordings (iEEG) and brain stimulation (iEBS) are invaluable human neuroscience methodologies. However, the value of such data is often unrealized as many laboratories lack tools for localizing electrodes relative to anatomy. To remedy this, we have developed a MATLAB toolbox for intracranial electrode localization and visualization, iELVis. NEW METHOD: iELVis uses existing tools (BioImage Suite, FSL, and FreeSurfer) for preimplant magnetic resonance imaging (MRI) segmentation, neuroimaging coregistration, and manual identification of electrodes in postimplant neuroimaging. Subsequently, iELVis implements methods for correcting electrode locations for postimplant brain shift with millimeter-scale accuracy and provides interactive visualization on 3D surfaces or in 2D slices with optional functional neuroimaging overlays. iELVis also localizes electrodes relative to FreeSurfer-based atlases and can combine data across subjects via the FreeSurfer average brain. RESULTS: It takes 30-60 min of user time and 12-24 h of computer time to localize and visualize electrodes from one brain. We demonstrate iELVis's functionality by showing that three methods for mapping primary hand somatosensory cortex (iEEG, iEBS, and functional MRI) provide highly concordant results. COMPARISON WITH EXISTING METHODS: iELVis is the first public software for electrode localization that corrects for brain shift, maps electrodes to an average brain, and supports neuroimaging overlays. Moreover, its interactive visualizations are powerful and its tutorial material is extensive. CONCLUSIONS: iELVis promises to speed the progress and enhance the robustness of intracranial electrode research. The software and extensive tutorial materials are freely available as part of the EpiSurg software project: https://github.com/episurg/episurg.
European Journal of Neuroscience | 2015
Gizely N. Andrade; John S. Butler; Manuel R. Mercier; Sophie Molholm; John J. Foxe
When sensory inputs are presented serially, response amplitudes to stimulus repetitions generally decrease as a function of presentation rate, diminishing rapidly as inter‐stimulus intervals (ISIs) fall below 1 s. This ‘adaptation’ is believed to represent mechanisms by which sensory systems reduce responsivity to consistent environmental inputs, freeing resources to respond to potentially more relevant inputs. While auditory adaptation functions have been relatively well characterized, considerably less is known about visual adaptation in humans. Here, high‐density visual‐evoked potentials (VEPs) were recorded while two paradigms were used to interrogate visual adaptation. The first presented stimulus pairs with varying ISIs, comparing VEP amplitude to the second stimulus with that of the first (paired‐presentation). The second involved blocks of stimulation (N = 100) at various ISIs and comparison of VEP amplitude between blocks of differing ISIs (block‐presentation). Robust VEP modulations were evident as a function of presentation rate in the block‐paradigm, with strongest modulations in the 130–150 ms and 160–180 ms visual processing phases. In paired‐presentations, with ISIs of just 200–300 ms, an enhancement of VEP was evident when comparing S2 with S1, with no significant effect of presentation rate. Importantly, in block‐presentations, adaptation effects were statistically robust at the individual participant level. These data suggest that a more taxing block‐presentation paradigm is better suited to engage visual adaptation mechanisms than a paired‐presentation design. The increased sensitivity of the visual processing metric obtained in the block‐paradigm has implications for the examination of visual processing deficits in clinical populations.
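Why a block design might tax adaptation harder than a pair can be sketched with a toy first-order model: each stimulus depletes a response resource by a fixed fraction, which recovers exponentially over the ISI. All constants are invented, and note the model predicts only suppression; it does not capture the S2 *enhancement* the study actually observed at short paired ISIs.

```python
import numpy as np

def simulate(isi, n_stim, tau=2.0, depress=0.5):
    """Toy resource-depletion model (hypothetical parameters).
    Response amplitude tracks available resources a in [0, 1]."""
    a, resp = 1.0, []
    for _ in range(n_stim):
        resp.append(a)                        # response to this stimulus
        a *= (1 - depress)                    # depression at each stimulus
        a = 1 - (1 - a) * np.exp(-isi / tau)  # exponential recovery over the ISI
    return np.array(resp)

paired = simulate(0.25, 2)      # paired-presentation: S1 vs S2 at 250 ms ISI
block = simulate(0.25, 100)     # block-presentation: 100 repetitions
print(paired, block[-1])
```

Under this model the steady-state amplitude late in a block falls well below the S2 amplitude of an isolated pair, consistent with the block paradigm engaging adaptation mechanisms more robustly.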
NeuroImage | 2017
Manuel R. Mercier; Stephan Bickel; Pierre Mégevand; David M. Groppe; Charles E. Schroeder; Ashesh D. Mehta; Fred A. Lado
While there is a strong interest in meso-scale field potential recording using intracranial electroencephalography with penetrating depth electrodes (i.e. stereotactic EEG or S-EEG) in humans, the signal recorded in the white matter remains ignored. White matter is generally considered electrically neutral and often included in the reference montage. Moreover, re-referencing electrophysiological data is a critical preprocessing choice that could drastically impact signal content and consequently the results of any given analysis. In the present stereotactic electroencephalography study, we first illustrate empirically the consequences of commonly used references (subdermal, white matter, global average, local montage) on inter-electrode signal correlation. Since most of these reference montages incorporate white matter signal, we next consider the difference between signals recorded in cortical gray matter and white matter. Our results reveal that electrode contacts located in the white matter record a mixture of activity, with part arising from the volume conduction (zero time delay) of activity from nearby gray matter. Furthermore, our analysis shows that white matter signal may be correlated with distant gray matter signal. While residual passive electrical spread from nearby gray matter may account for this relationship, our results suggest the possibility that this long-distance correlation arises from the white matter fiber tracts themselves (i.e. activity from distant gray matter traveling along axonal fibers with time lag larger than zero); yet definitive conclusions about the origin of the white matter signal would require further experimental substantiation. By characterizing the properties of signals recorded in white matter and in gray matter, this study illustrates the importance of including anatomical prior knowledge when analyzing S-EEG data.
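The arithmetic behind two of the reference montages compared above (global average and local bipolar) can be sketched on toy data. This is only an illustration, not the study's pipeline: the shared component stands in for signal introduced at a common reference contact, and the contact geometry is invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy recording: 8 contacts x 1000 samples of independent activity, plus a
# component common to every contact (e.g. picked up at the reference).
n_chan, n_samp = 8, 1000
data = rng.normal(size=(n_chan, n_samp)) + rng.normal(size=n_samp)

# Global average reference: subtract the instantaneous mean across contacts.
car = data - data.mean(axis=0, keepdims=True)

# Local bipolar montage: difference between neighbouring contacts on a shaft.
bipolar = data[1:] - data[:-1]

# Both montages cancel the shared component; the spurious inter-contact
# correlation it induced disappears after re-referencing.
shared_before = np.corrcoef(data)[0, 1]
shared_after = np.corrcoef(car)[0, 1]
print(shared_before, shared_after)
```

Either montage removes the common term exactly, but each redistributes signal differently (the average reference mixes all contacts into every channel; the bipolar montage keeps only local gradients), which is the kind of consequence for inter-electrode correlation the study documents empirically.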