Publication


Featured research published by Antonia Thelen.


The Journal of Neuroscience | 2012

Looming Signals Reveal Synergistic Principles of Multisensory Integration

Céline Cappe; Antonia Thelen; Vincenzo Romei; Gregor Thut; Micah M. Murray

Multisensory interactions are a fundamental feature of brain organization. Principles governing multisensory processing have been established by varying stimulus location, timing and efficacy independently. Determining whether and how such principles operate when stimuli vary dynamically in their perceived distance (as when looming/receding) provides an assay for synergy among the above principles and also means for linking multisensory interactions between rudimentary stimuli with higher-order signals used for communication and motor planning. Human participants indicated movement of looming or receding versus static stimuli that were visual, auditory, or multisensory combinations while 160-channel EEG was recorded. Multivariate EEG analyses and distributed source estimations were performed. Nonlinear interactions between looming signals were observed at early poststimulus latencies (∼75 ms) in analyses of voltage waveforms, global field power, and source estimations. These looming-specific interactions positively correlated with reaction time facilitation, providing direct links between neural and performance metrics of multisensory integration. Statistical analyses of source estimations identified looming-specific interactions within the right claustrum/insula extending inferiorly into the amygdala and also within the bilateral cuneus extending into the inferior and lateral occipital cortices. Multisensory effects common to all conditions, regardless of perceived distance and congruity, followed (∼115 ms) and manifested as faster transition between temporally stable brain networks (vs summed responses to unisensory conditions). We demonstrate the early-latency, synergistic interplay between existing principles of multisensory interactions. Such findings change the manner in which to model multisensory interactions at neural and behavioral/perceptual levels. We also provide neurophysiologic backing for the notion that looming signals receive preferential treatment during perception.


Neuropsychologia | 2016

The multisensory function of the human primary visual cortex

Micah M. Murray; Antonia Thelen; Gregor Thut; Vincenzo Romei; Roberto Martuzzi; Pawel J. Matusz

It has been nearly 10 years since Ghazanfar and Schroeder (2006) proposed that the neocortex is essentially multisensory in nature. However, it is only recently that sufficient evidence supporting this proposal has accrued. We review evidence that activity within the human primary visual cortex plays an active role in multisensory processes and directly impacts behavioural outcome. This evidence emerges from a full palette of human brain imaging and brain mapping methods with which multisensory processes are quantitatively assessed by taking advantage of the particular strengths of each technique as well as advances in signal analyses. Several general conclusions about multisensory processes in the primary visual cortex of humans are relatively solidly supported. First, haemodynamic methods (fMRI/PET) show that both convergence and integration occur within primary visual cortex. Second, primary visual cortex is involved in multisensory processes during early post-stimulus stages (as revealed by EEG/ERPs/ERFs as well as TMS). Third, multisensory effects in primary visual cortex directly impact behaviour and perception, as revealed by correlational (EEG/ERPs/ERFs) as well as more causal measures (TMS/tACS). While the provocative claim of Ghazanfar and Schroeder (2006) that the whole of neocortex is multisensory in function has yet to be demonstrated, it can now be considered established in the case of the human primary visual cortex.


Cognition | 2015

Single-trial multisensory memories affect later auditory and visual object discrimination

Antonia Thelen; Durk Talsma; Micah M. Murray

Multisensory memory traces established via single-trial exposures can impact subsequent visual object recognition. This impact appears to depend on the meaningfulness of the initial multisensory pairing, implying that multisensory exposures establish distinct object representations that are accessible during later unisensory processing. Multisensory contexts may be particularly effective in influencing auditory discrimination, given the purportedly inferior recognition memory in this sensory modality. This study focused on whether such effects generalize to audition and whether they are equivalent when memory discrimination is performed in the visual vs. auditory modality. First, we demonstrate that visual object discrimination is affected by the context of prior multisensory encounters, replicating and extending previous findings by controlling for the probability of multisensory contexts during initial as well as repeated object presentations. Second, we provide the first evidence that single-trial multisensory memories impact subsequent auditory object discrimination. Auditory object discrimination was enhanced when initial presentations entailed semantically congruent multisensory pairs and was impaired after semantically incongruent multisensory encounters, compared to sounds that had been encountered only in a unisensory manner. Third, the impact of single-trial multisensory memories upon unisensory object discrimination was greater when the task was performed in the auditory vs. visual modality. Fourth, there was no evidence for correlation between effects of past multisensory experiences on visual and auditory processing, suggestive of largely independent object processing mechanisms between modalities. We discuss these findings in terms of the conceptual short term memory (CSTM) model and predictive coding. Our results suggest differential recruitment and modulation of conceptual memory networks according to the sensory task at hand.


Current Biology | 2014

Multisensory context portends object memory

Antonia Thelen; Pawel J. Matusz; Micah M. Murray

Summary Multisensory processes facilitate perception of currently-presented stimuli and can likewise enhance later object recognition. Memories for objects originally encountered in a multisensory context can be more robust than those for objects encountered in an exclusively visual or auditory context [1], upturning the assumption that memory performance is best when encoding and recognition contexts remain constant [2]. Here, we used event-related potentials (ERPs) to provide the first evidence for direct links between multisensory brain activity at one point in time and subsequent object discrimination abilities. Across two experiments we found that individuals showing a benefit and those impaired during later object discrimination could be predicted by their brain responses to multisensory stimuli upon their initial encounter. These effects were observed despite the multisensory information being meaningless, task-irrelevant, and presented only once. We provide critical insights into the advantages associated with multisensory interactions; they are not limited to the processing of current stimuli, but likewise encompass the ability to determine the benefit of one's memories for object recognition in later, unisensory contexts.


Multisensory Research | 2013

The efficacy of single-trial multisensory memories

Antonia Thelen; Micah M. Murray

This review article summarizes evidence that multisensory experiences at one point in time have long-lasting effects on subsequent unisensory visual and auditory object recognition. The efficacy of single-trial exposure to task-irrelevant multisensory events lies in its ability to modulate memory performance and brain activity to unisensory components of these events presented later in time. Object recognition (either visual or auditory) is enhanced if the initial multisensory experience had been semantically congruent and can be impaired if this multisensory pairing was either semantically incongruent or entailed meaningless information in the task-irrelevant modality, when compared to objects encountered exclusively in a unisensory context. Processes active during encoding cannot straightforwardly explain these effects; performance on all initial presentations was indistinguishable despite leading to opposing effects with stimulus repetitions. Brain responses to unisensory stimulus repetitions differ during early processing stages (∼100 ms post-stimulus onset) according to whether or not they had been initially paired in a multisensory context. Moreover, the network exhibiting differential responses varies according to whether memory performance is enhanced or impaired. The collective findings we review indicate that multisensory associations formed via single-trial learning exert influences on later unisensory processing to promote distinct object representations that manifest as differentiable brain networks whose activity is correlated with memory performance. These influences occur incidentally, despite many intervening stimuli, and are distinguishable from the encoding/learning processes during the formation of the multisensory associations. These consequences of multisensory interactions persist over time to impact memory retrieval and object discrimination.


European Journal of Neuroscience | 2015

The role of auditory cortices in the retrieval of single-trial auditory-visual object memories

Pawel J. Matusz; Antonia Thelen; Sarah Amrein; Eveline Geiser; Jacques Anken; Micah M. Murray

Single‐trial encounters with multisensory stimuli affect both memory performance and early‐latency brain responses to visual stimuli. Whether and how auditory cortices support memory processes based on single‐trial multisensory learning is unknown and may differ qualitatively and quantitatively from comparable processes within visual cortices due to purported differences in memory capacities across the senses. We recorded event‐related potentials (ERPs) as healthy adults (n = 18) performed a continuous recognition task in the auditory modality, discriminating initial (new) from repeated (old) sounds of environmental objects. Initial presentations were either unisensory or multisensory; the latter entailed synchronous presentation of a semantically congruent or a meaningless image. Repeated presentations were exclusively auditory, thus differing only according to the context in which the sound was initially encountered. Discrimination abilities (indexed by d’) were increased for repeated sounds that were initially encountered with a semantically congruent image versus sounds initially encountered with either a meaningless or no image. Analyses of ERPs within an electrical neuroimaging framework revealed that early stages of auditory processing of repeated sounds were affected by prior single‐trial multisensory contexts. These effects followed from significantly reduced activity within a distributed network, including the right superior temporal cortex, suggesting an inverse relationship between brain activity and behavioural outcome on this task. The present findings demonstrate how auditory cortices contribute to long‐term effects of multisensory experiences on auditory object discrimination. We propose a new framework for the efficacy of multisensory processes to impact both current multisensory stimulus processing and unisensory discrimination abilities later in time.


Multisensory Research | 2013

The neural bases of cross-modal correspondences: Reality or wishful thinking?

Antonia Thelen; Sarah Amrein; Micah M. Murray

There has recently been resurgent interest in the notion of cross-modal correspondences — i.e. stimulus features that may be preferentially integrated. The extent to which any such correspondences emanate from intrinsic anatomical connectivity or instead from learned (presumably statistical) regularities in the environment remains unresolved. Anatomical data from non-human primates would suggest that high-frequency auditory representations together with peripheral visual field representations in primary cortices are a preferred locus of low-level integration, whereas functional studies in humans have repeatedly demonstrated effects with centrally-presented stimuli and a range of auditory pitches/bandwidths. The present psychophysics and EEG study examined whether auditory–visual integration systematically varies with acoustic pitch and visual eccentricity. Subjects viewed two annuli (either foveally-presented or at 12.5° eccentricity, with surface area controlled for cortical magnification) and indicated which, if either, changed its brightness. The paradigm followed a 3 × 3 within-subject design: (no, foveal, or peripheral brightness change) × (no tone, 500 Hz, or 4000 Hz pure tone presentation; the latter two of which were controlled for perceived loudness). Accuracy data were analyzed according to signal detection theory, using sensitivity (d′). Reaction time data were analyzed after dividing by detection rates, using the multisensory response enhancement metric (MRE; see Rach et al., 2011, Psychological Research). Preliminary data suggest there to be a main effect of pitch (i.e. larger d′ and MRE when the multisensory conditions included a high pitch vs. low pitch sound). There was no evidence of a main effect of eccentricity or interaction between factors. Ongoing EEG analyses will likewise be discussed.
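The two behavioural measures in this abstract are standard: d′ from signal detection theory (z-transformed hit rate minus z-transformed false-alarm rate) and the multisensory response enhancement metric. As a rough illustration only, a minimal sketch of d′ and one common formulation of MRE (percent RT speed-up relative to the fastest unisensory condition; the Rach et al., 2011 procedure additionally divides RTs by detection rates, which is omitted here). All numeric values below are hypothetical, not data from the study.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

def mre(rt_av: float, rt_a: float, rt_v: float) -> float:
    """Multisensory response enhancement: percent speed-up of the
    audiovisual RT relative to the faster unisensory RT."""
    best_uni = min(rt_a, rt_v)
    return 100.0 * (best_uni - rt_av) / best_uni

# Hypothetical example values (milliseconds for RTs):
print(round(d_prime(0.85, 0.20), 2))                      # sensitivity
print(round(mre(rt_av=310.0, rt_a=355.0, rt_v=340.0), 1))  # percent gain
```

A positive MRE indicates the multisensory condition was faster than the best unisensory condition; a negative value indicates a multisensory cost.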


Seeing and Perceiving | 2012

Electrical neuroimaging of memory discrimination based on single-trial multisensory learning

Antonia Thelen; Céline Cappe; Micah M. Murray

Multisensory experiences influence subsequent memory performance and brain responses. Studies have thus far concentrated on semantically congruent pairings, leaving unresolved the influence of stimulus pairing and memory sub-types. Here, we paired images with unique, meaningless sounds during a continuous recognition task to determine if purely episodic, single-trial multisensory experiences can incidentally impact subsequent visual object discrimination. Psychophysics and electrical neuroimaging analyses of visual evoked potentials (VEPs) compared responses to repeated images either paired or not with a meaningless sound during initial encounters. Recognition accuracy was significantly impaired for images initially presented as multisensory pairs and could not be explained in terms of differential attention or transfer of effects from encoding to retrieval. VEP modulations occurred at 100–130 ms and 270–310 ms and stemmed from topographic differences indicative of network configuration changes within the brain. Distributed source estimations localized the earlier effect to regions of the right posterior superior temporal gyrus (STG) and the later effect to regions of the middle temporal gyrus (MTG). Responses in these regions were stronger for images previously encountered as multisensory pairs. Only the later effect correlated with performance, such that greater MTG activity in response to repeated visual stimuli was linked with greater performance decrements. The present findings suggest that brain networks involved in this discrimination may critically depend on whether multisensory events facilitate or impair later visual memory performance. More generally, the data support models whereby effects of multisensory interactions persist to incidentally affect subsequent behavior as well as visual processing during its initial stages.


Seeing and Perceiving | 2012

Heterogeneous auditory–visual integration: Effects of pitch, bandwidth and visual eccentricity

Antonia Thelen; Micah M. Murray

The identification of monosynaptic connections between primary cortices in non-human primates has recently been complemented by observations of early-latency and low-level non-linear interactions in brain responses in humans as well as observations of facilitative effects of multisensory stimuli on behavior/performance in both humans and monkeys. While there is some evidence in favor of causal links between early-latency interactions within low-level cortices and behavioral facilitation, it remains unknown if such effects are subserved by direct anatomical connections between primary cortices. In non-human primates, the above monosynaptic projections from primary auditory cortex terminate within peripheral visual field representations within primary visual cortex, suggestive of there being a potential bias for the integration of eccentric visual stimuli and pure tone (vs. broad-band) sounds. To date, behavioral effects in humans (and monkeys) have been observed after presenting (para)foveal stimuli with any of a range of auditory stimuli from pure tones to noise bursts. The present study aimed to identify any heterogeneity in the integration of auditory–visual stimuli. To this end, we employed a 3 × 3 within-subject design that varied the visual eccentricity of an annulus (2.5°, 5.7°, 8.9°) and auditory pitch (250, 1000, 4000 Hz) of multisensory stimuli while subjects completed a simple detection task. We also varied the auditory bandwidth (pure tone vs. pink noise) across blocks of trials that a subject completed. To ensure attention to both modalities, multisensory stimuli were equi-probable with both unisensory visual and unisensory auditory trials that themselves varied along the abovementioned dimensions. Median reaction times for each stimulus condition as well as the percentage gain/loss of each multisensory condition vs. the best constituent unisensory condition were measured.
The preliminary results reveal that multisensory interactions (as measured from simple reaction times) are indeed heterogeneous across the tested dimensions and may provide a means for delimiting the anatomo-functional substrates of behaviorally-relevant early-latency neural response interactions. Interestingly, preliminary results suggest selective interactions for visual stimuli when presented with broadband stimuli but not when presented with pure tones. More precisely, centrally presented visual stimuli show the greatest index of multisensory facilitation when coupled to a high pitch tone embedded in pink noise, while visual stimuli presented at approximately 5.7° of visual angle show the greatest slowing of reaction times.


Seeing and Perceiving | 2012

Determinants of the efficacy of single-trial multisensory learning

Antonia Thelen; Micah M. Murray

Single-trial multisensory learning has been reliably shown to impact the later ability to discriminate images. The present study had the following three aims: (1) to determine if single-trial multisensory learning would elicit corresponding effects on auditory discrimination, (2) to determine if there were links between the impact of multisensory learning on auditory discrimination and its impact on visual discrimination within individual participants, and (3) to determine the bases of inter-individual differences in the efficacy of single-trial multisensory learning. On two sessions separated by one week, participants discriminated initial from repeated presentations of either images or sounds during a continuous recognition task. Half of the initial presentations were auditory–visual multisensory pairings (semantically congruent, semantically incongruent, or meaningless). The remaining half of initial presentations were unisensory. Half of the repeated stimuli were presented in an identical manner to their initial encounter, and the remaining half were presented in the complementary manner (i.e., those initially presented in a unisensory manner were now presented as multisensory pairs and vice versa). The results show that the efficacy of single-trial multisensory learning across the senses varies according to an individual's propensity to exhibit repetition priming with sounds (i.e., faster RTs and higher accuracy for repeated vs. initial unisensory sound presentations). Individuals exhibiting such priming also showed facilitative effects of single-trial multisensory learning on both the auditory and visual discrimination tasks. Those who did not exhibit such priming did not benefit from single-trial multisensory learning. Single-trial multisensory learning is therefore an effective tool across the senses.

Collaboration


Antonia Thelen's top co-authors and their affiliations.

Céline Cappe

École Polytechnique Fédérale de Lausanne

Roberto Martuzzi

École Polytechnique Fédérale de Lausanne

Eveline Geiser

Massachusetts Institute of Technology
