Publication


Featured research published by Sophie Molholm.


NeuroImage | 2005

Neural mechanisms involved in error processing: A comparison of errors made with and without awareness

Robert Hester; John J. Foxe; Sophie Molholm; Marina Shpaner; Hugh Garavan

The ability to detect an error in one's own performance and then to improve ongoing performance based on this error processing is critical for effective behaviour. In our event-related fMRI experiment, we show that explicit awareness of a response inhibition commission error and subsequent post-error behaviour were associated with bilateral prefrontal and parietal brain activation. Activity in the anterior cingulate region, typically associated with error detection, was equivalent for both errors subjects were aware of and those they were not aware of making. While anterior cingulate activation has repeatedly been associated with error-related processing, these results suggest that, in isolation, it is not sufficient for conscious awareness of errors or post-error adaptation of response strategies. Instead, it appears, irrespective of awareness, to detect information about stimuli/responses that requires interpretation in other brain regions for strategic implementation of post-error adjustments of behaviour.


The Journal of Neuroscience | 2006

The Anterior Cingulate and Error Avoidance

Elena Magno; John J. Foxe; Sophie Molholm; Ian H. Robertson; Hugh Garavan

The precise role of the anterior cingulate cortex in monitoring, evaluating, and correcting behavior remains unclear despite numerous theories and much empirical data implicating it in cognitive control. The present event-related functional magnetic resonance imaging study was able to separate monitoring from error-specific functions by allowing subjects to reject a trial so as to avoid errors. Cingulate and left dorsolateral prefrontal activity was greatest on rejection trials but comparable for correct and error trials, whereas an error-specific response was observed in bilateral insula. A dissociation was also observed between the cingulate and the nucleus accumbens with the latter more active for error than reject trials. These results reveal that the functional role of the cingulate is not particular to errors but instead is related to an evaluative function concerned with on-line behavioral adjustment in the service of avoiding losses.


Proceedings of the National Academy of Sciences of the United States of America | 2008

A human intracranial study of long-range oscillatory coherence across a frontal-occipital-hippocampal brain network during visual object processing.

Pejman Sehatpour; Sophie Molholm; Theodore H. Schwartz; Jeannette R. Mahoney; Ashesh D. Mehta; Daniel C. Javitt; Patric K. Stanton; John J. Foxe

Visual object-recognition is thought to involve activation of a distributed network of cortical regions, nodes of which include the lateral prefrontal cortex, the so-called lateral occipital complex (LOC), and the hippocampal formation. It has been proposed that long-range oscillatory synchronization is a major mode of coordinating such a distributed network. Here, intracranial recordings were made from three humans as they performed a challenging visual object-recognition task that required them to identify barely recognizable fragmented line-drawings of common objects. Subdural electrodes were placed over the prefrontal cortex and LOC, and depth electrodes were placed within the hippocampal formation. Robust beta-band coherence was evident in all subjects during processing of recognizable fragmented images. Significantly lower coherence was evident during processing of unrecognizable scrambled versions of the same. The results indicate that transient beta-band oscillatory coupling between these three distributed cortical regions may reflect a mechanism for effective communication during visual object processing.


The Journal of Neuroscience | 2011

Oscillatory alpha-band mechanisms and the deployment of spatial attention to anticipated auditory and visual target locations: Supramodal or sensory-specific control mechanisms?

Snigdha Banerjee; Adam C. Snyder; Sophie Molholm; John J. Foxe

Oscillatory alpha-band activity (8–15 Hz) over parieto-occipital cortex in humans plays an important role in suppression of processing for inputs at to-be-ignored regions of space, with increased alpha-band power observed over cortex contralateral to locations expected to contain distractors. It is unclear whether similar processes operate during deployment of spatial attention in other sensory modalities. Evidence from lesion patients suggests that parietal regions house supramodal representations of space. The parietal lobes are prominent generators of alpha oscillations, raising the possibility that alpha is a neural signature of supramodal spatial attention. Furthermore, when spatial attention is deployed within vision, processing of task-irrelevant auditory inputs at attended locations is also enhanced, pointing to automatic links between spatial deployments across senses. Here, we asked whether lateralized alpha-band activity is also evident in a purely auditory spatial-cueing task and whether it had the same underlying generator configuration as in a purely visuospatial task. If common to both sensory systems, this would provide strong support for “supramodal” attention theory. Alternately, alpha-band differences between auditory and visual tasks would support a sensory-specific account. Lateralized shifts in alpha-band activity were indeed observed during a purely auditory spatial task. Crucially, there were clear differences in scalp topographies of this alpha activity depending on the sensory system within which spatial attention was deployed. Findings suggest that parietally generated alpha-band mechanisms are central to attentional deployments across modalities but that they are invoked in a sensory-specific manner. The data support an “interactivity account,” whereby a supramodal system interacts with sensory-specific control systems during deployment of spatial attention.


Journal of Cognitive Neuroscience | 2006

Objects Are Highlighted by Spatial Attention

Antigona Martinez; Wolfgang A. Teder-Sälejärvi; M. Vazquez; Sophie Molholm; John J. Foxe; Daniel C. Javitt; F. Di Russo; Michael S. Worden; Steven A. Hillyard

Selective attention may be focused upon a region of interest within the visual surroundings, thereby improving the perceptual quality of stimuli at that location. It has been debated whether this spatially selective mechanism plays a role in the attentive selection of whole objects in a visual scene. The relationship between spatial and object-selective attention was investigated here through recordings of event-related brain potentials (ERPs) supplemented with functional magnetic brain imaging (fMRI). Subjects viewed a display consisting of two bar-shaped objects and directed attention to sequences of stimuli (brief corner offsets) at one end of one of the bars. Unattended stimuli belonging to the same object as the attended stimuli elicited spatiotemporal patterns of neural activity in the visual cortex closely resembling those elicited by the attended stimuli themselves, albeit smaller in amplitude. This enhanced neural activity associated with object-selective attention was localized by use of ERP dipole modeling and fMRI to the lateral occipital extrastriate cortex. We conclude that object-selective attention shares a common neural mechanism with spatial attention that entails the facilitation of sensory processing of stimuli within the boundaries of an attended object.


The Journal of Neuroscience | 2011

Oscillatory Sensory Selection Mechanisms during Intersensory Attention to Rhythmic Auditory and Visual Inputs: A Human Electrocorticographic Investigation

Manuel Gomez-Ramirez; Simon P. Kelly; Sophie Molholm; Pejman Sehatpour; Theodore H. Schwartz; John J. Foxe

Oscillatory entrainment mechanisms are invoked during attentional processing of rhythmically occurring stimuli, whereby their phase alignment regulates the excitability state of neurons coding for anticipated inputs. These mechanisms have been examined in the delta band (1–3 Hz), where entrainment frequency matches the stimulation rate. Here, we investigated entrainment for subdelta rhythmic stimulation, recording from intracranial electrodes over human auditory cortex during an intersensory audiovisual task. Audiovisual stimuli were presented at 0.67 Hz while participants detected targets within one sensory stream and ignored the other. It was found that entrainment operated at twice the stimulation rate (1.33 Hz), and this was reflected by higher amplitude values in the FFT spectrum, cyclic modulation of alpha-amplitude, and phase–amplitude coupling between delta phase and alpha power. In addition, we found that alpha-amplitude was relatively increased in auditory cortex coincident with to-be-ignored auditory stimuli during attention to vision. Thus, the data suggest that entrainment mechanisms operate within a delimited passband such that for subdelta task rhythms, oscillatory harmonics are invoked. The phase of these delta-entrained oscillations modulates alpha-band power. This may in turn increase or decrease responsiveness to relevant and irrelevant stimuli, respectively.


Cerebral Cortex | 2013

The Development of Multisensory Integration in High-Functioning Autism: High-Density Electrical Mapping and Psychophysical Measures Reveal Impairments in the Processing of Audiovisual Inputs

Alice B. Brandwein; John J. Foxe; John S. Butler; Natalie Russo; Ted S. Altschuler; Hilary Gomes; Sophie Molholm

Successful integration of auditory and visual inputs is crucial for both basic perceptual functions and for higher-order processes related to social cognition. Autism spectrum disorders (ASD) are characterized by impairments in social cognition and are associated with abnormalities in sensory and perceptual processes. Several groups have reported that individuals with ASD are impaired in their ability to integrate socially relevant audiovisual (AV) information, and it has been suggested that this contributes to the higher-order social and cognitive deficits observed in ASD. However, successful integration of auditory and visual inputs also influences detection and perception of nonsocial stimuli, and integration deficits may impair earlier stages of information processing, with cascading downstream effects. To assess the integrity of basic AV integration, we recorded high-density electrophysiology from a cohort of high-functioning children with ASD (7-16 years) while they performed a simple AV reaction time task. Children with ASD showed considerably less behavioral facilitation to multisensory inputs, deficits that were paralleled by less effective neural integration. Evidence for processing differences relative to typically developing children was seen as early as 100 ms poststimulation, and topographic analysis suggested that children with ASD relied on different cortical networks during this early multisensory processing stage.


Autism Research | 2010

Multisensory processing in children with autism: high-density electrical mapping of auditory–somatosensory integration

Natalie Russo; John J. Foxe; Alice B. Brandwein; Ted S. Altschuler; Hilary Gomes; Sophie Molholm

Successful integration of signals from the various sensory systems is crucial for normal sensory–perceptual functioning, allowing for the perception of coherent objects rather than a disconnected cluster of fragmented features. Several prominent theories of autism suggest that automatic integration is impaired in this population, but there have been few empirical tests of this thesis. A standard electrophysiological metric of multisensory integration (MSI) was used to test the integrity of auditory–somatosensory integration in children with autism (N=17, aged 6–16 years), compared to age‐ and IQ‐matched typically developing (TD) children. High‐density electrophysiology was recorded while participants were presented with either auditory or somatosensory stimuli alone (unisensory conditions), or as a combined auditory–somatosensory stimulus (multisensory condition), in randomized order. Participants watched a silent movie during testing, ignoring concurrent stimulation. Significant differences between neural responses to the multisensory auditory–somatosensory stimulus and the unisensory stimuli (the sum of the responses to the auditory and somatosensory stimuli when presented alone) served as the dependent measure. The data revealed group differences in the integration of auditory and somatosensory information that appeared at around 175 ms, and were characterized by the presence of MSI for the TD but not the autism spectrum disorder (ASD) children. Overall, MSI was less extensive in the ASD group. These findings are discussed within the framework of current knowledge of MSI in typical development as well as in relation to theories of ASD.
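The dependent measure described above is the standard additive criterion for multisensory integration: the evoked response to the combined auditory–somatosensory stimulus is compared against the sum of the two unisensory responses. A minimal sketch of that comparison is given below; the variable names and the simulated data are illustrative assumptions, not the study's actual pipeline or recordings.

```python
import numpy as np

# Illustrative data: trial-by-time EEG epochs at one electrode (simulated).
rng = np.random.default_rng(0)
n_trials, n_samples = 100, 512
auditory = rng.normal(0.0, 1.0, (n_trials, n_samples))       # A alone
somatosensory = rng.normal(0.0, 1.0, (n_trials, n_samples))  # S alone
multisensory = rng.normal(0.0, 1.0, (n_trials, n_samples))   # A + S together

# Trial-averaged event-related potentials for each condition.
erp_a = auditory.mean(axis=0)
erp_s = somatosensory.mean(axis=0)
erp_as = multisensory.mean(axis=0)

# Additive model: integration is indexed wherever the multisensory ERP
# deviates from the sum of the unisensory ERPs (the difference wave).
difference_wave = erp_as - (erp_a + erp_s)
```

In practice, such a difference wave would be tested for significance across time points and electrodes; with the purely random data above it should hover around zero.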


Cerebral Cortex | 2015

Severe Multisensory Speech Integration Deficits in High-Functioning School-Aged Children with Autism Spectrum Disorder (ASD) and Their Resolution During Early Adolescence

John J. Foxe; Sophie Molholm; Victor A. Del Bene; Hans-Peter Frey; Natalie Russo; Daniella Blanco; Dave Saint-Amour; Lars A. Ross

Under noisy listening conditions, visualizing a speaker's articulations substantially improves speech intelligibility. This multisensory speech integration ability is crucial to effective communication, and the appropriate development of this capacity greatly impacts a child's ability to successfully navigate educational and social settings. Research shows that multisensory integration abilities continue developing late into childhood. The primary aim here was to track the development of these abilities in children with autism, since multisensory deficits are increasingly recognized as a component of the autism spectrum disorder (ASD) phenotype. The abilities of high-functioning ASD children (n = 84) to integrate seen and heard speech were assessed cross-sectionally, while environmental noise levels were systematically manipulated, comparing them with age-matched neurotypical children (n = 142). Severe integration deficits were uncovered in ASD, which were increasingly pronounced as background noise increased. These deficits were evident in school-aged ASD children (5-12 year olds), but were fully ameliorated in ASD children entering adolescence (13-15 year olds). The severity of multisensory deficits uncovered has important implications for educators and clinicians working in ASD. We consider the observation that the multisensory speech system recovers substantially in adolescence as an indication that it is likely amenable to intervention during earlier childhood, with potentially profound implications for the development of social communication abilities in ASD children.


The Journal of Neuroscience | 2011

Ready, Set, Reset: Stimulus-Locked Periodicity in Behavioral Performance Demonstrates the Consequences of Cross-Sensory Phase Reset

Ian C. Fiebelkorn; John J. Foxe; John S. Butler; Manuel R. Mercier; Adam C. Snyder; Sophie Molholm

The simultaneous presentation of a stimulus in one sensory modality often enhances target detection in another sensory modality, but the neural mechanisms that govern these effects are still under investigation. Here, we test a hypothesis proposed in the neurophysiological literature: that auditory facilitation of visual-target detection operates through cross-sensory phase reset of ongoing neural oscillations (Lakatos et al., 2009). To date, measurement limitations have prevented this potentially powerful neural mechanism from being directly linked with its predicted behavioral consequences. The present experiment uses a psychophysical approach in humans to demonstrate, for the first time, stimulus-locked periodicity in visual-target detection, following a temporally informative sound. Our data further demonstrate that periodicity in behavioral performance is strongly influenced by the probability of audiovisual co-occurrence. We argue that fluctuations in visual-target detection result from cross-sensory phase reset, both at the moment it occurs and persisting for seconds thereafter. The precise frequency at which this periodicity operates remains to be determined through a method that allows for a higher sampling rate.
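The psychophysical logic of the study above is that, if a sound resets ongoing oscillations, detection performance should wax and wane periodically as a function of the delay between sound and visual target. A minimal sketch of that kind of analysis, under assumed parameters (simulated hit rates, a hypothetical 10 Hz oscillation, 20 ms delay steps) rather than the study's actual data or methods, is:

```python
import numpy as np

# Simulated hit rate as a function of cue-to-target delay (SOA), with an
# assumed 10 Hz periodic modulation riding on a constant baseline plus noise.
rng = np.random.default_rng(1)
soas = np.arange(0.0, 1.0, 0.02)  # 50 delays in 20 ms steps (illustrative)
hit_rate = 0.6 + 0.1 * np.sin(2 * np.pi * 10 * soas) \
    + rng.normal(0.0, 0.02, soas.size)

# Remove the mean and inspect the amplitude spectrum of performance over SOA:
# a peak at a nonzero frequency indicates stimulus-locked behavioral periodicity.
spectrum = np.abs(np.fft.rfft(hit_rate - hit_rate.mean()))
freqs = np.fft.rfftfreq(soas.size, d=0.02)
peak_freq = freqs[spectrum.argmax()]  # recovers the 10 Hz modulation
```

The sampling-rate caveat in the abstract maps directly onto this sketch: with 20 ms steps, only modulations up to the 25 Hz Nyquist limit are resolvable.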

Collaboration


Dive into Sophie Molholm's collaborations.

Top Co-Authors

John J. Foxe
University of Rochester

Walter Ritter
Nathan Kline Institute for Psychiatric Research

Ian C. Fiebelkorn
Albert Einstein College of Medicine

Manuel R. Mercier
Albert Einstein College of Medicine

Daniel C. Javitt
Nathan Kline Institute for Psychiatric Research

Hilary Gomes
City University of New York

Adam C. Snyder
University of Pittsburgh