
Publication


Featured research published by Anton L. Beer.


Experimental Brain Research | 2011

Diffusion tensor imaging shows white matter tracts between human auditory and visual cortex

Anton L. Beer; Tina Plank; Mark W. Greenlee

Although it is known that sounds can affect visual perception, the neural correlates for crossmodal interactions are still disputed. Previous tracer studies in non-human primates revealed direct anatomical connections between auditory and visual brain areas. We examined the structural connectivity of the auditory cortex in normal humans by diffusion-weighted tensor magnetic resonance imaging and probabilistic tractography. Tracts were seeded in Heschl’s region or the planum temporale. Fibres crossed hemispheres at the posterior corpus callosum. Ipsilateral fibres seeded in Heschl’s region projected to the superior temporal sulcus, the supramarginal gyrus and intraparietal sulcus and the occipital cortex including the calcarine sulcus. Fibres seeded in the planum temporale terminated primarily in the superior temporal sulcus, the supramarginal gyrus, the central sulcus and adjacent regions. Our findings suggest the existence of direct white matter connections between auditory and visual cortex—in addition to subcortical, temporal and parietal connections.


European Journal of Neuroscience | 2005

Attending to visual or auditory motion affects perception within and across modalities: an event-related potential study

Anton L. Beer; Brigitte Röder

The present event‐related potential (ERP) study examined the role of dynamic features in multisensory binding. It was tested whether endogenous attention to the direction of motion affects processing of visual and auditory stimuli within and across modalities. Human participants perceived horizontally moving dot patterns and sounds that were presented either continuously (standards) or briefly interrupted (infrequent deviants). Their task was to detect deviants moving in a particular direction within a primary modality, but to detect all deviants irrespective of their motion direction within the secondary modality. Attending to the direction of visual motion resulted in a broad selection negativity (SN) starting at about 200 ms post‐stimulus onset, and attending to the direction of auditory motion resulted in a positive difference wave at 150 ms that was followed by a broad negativity starting at about 200 ms (unimodal effects). Moreover, dot patterns moving in a direction that was attended within audition were detected faster and more accurately than oppositely moving stimuli and elicited a cross‐modal SN wave. Corresponding cross‐modal behavioural and ERP results were obtained for sounds moving in a direction that was attended within vision. Unimodal and cross‐modal ERP attention effects partially differed in their scalp topography. The present study shows that dynamic features (direction of motion) may be used to link input across modalities and demonstrates for the first time that these multisensory interactions take place as early as about 200 ms after stimulus onset.


European Journal of Neuroscience | 2006

Tight covariation of BOLD signal changes and slow ERPs in the parietal cortex in a parametric spatial imagery task with haptic acquisition

Tobias Schicke; Lars Muckli; Anton L. Beer; Michael Wibral; Wolf Singer; Rainer Goebel; Frank Rösler; Brigitte Röder

The present study investigated the relation of brain activity patterns measured with functional magnetic resonance imaging (fMRI) and slow event‐related potentials (ERPs) associated with a complex cognitive task. A second goal was to examine the neural correlates of spatial imagery of haptically – instead of visually – acquired representations. Using a mental image scanning task, spatial imagery requirements were systematically manipulated by parametrically varying the distance between haptically acquired landmarks. Results showed a close relation between slow ERPs and the blood oxygenation level dependent (BOLD) signal in human parietal lobe. Reaction times of mental scanning correlated with the distances between landmarks on the learned display. In parallel, duration and amplitude of slow ERPs and duration of the haemodynamic response systematically varied as a function of mental scanning distance. Source analysis confirmed that the ERP imagery effect likely originated from the same cortical substrate as the corresponding BOLD effect. This covariation of the BOLD signal with slow ERPs is in line with recent findings in animals demonstrating a tight link between local field potentials and the BOLD signal. The parietal location of the imagery effect is consistent with the idea that externally triggered (perceptual) and mentally driven (imagery) spatial processes are both mediated by the same supramodal brain areas.


European Journal of Neuroscience | 2009

3D surface perception from motion involves a temporal–parietal network

Anton L. Beer; Takeo Watanabe; Rui Ni; Yuka Sasaki; George J. Andersen

Previous research has suggested that three‐dimensional (3D) structure‐from‐motion (SFM) perception in humans involves several motion‐sensitive occipital and parietal brain areas. By contrast, SFM perception in nonhuman primates seems to involve the temporal lobe including areas MT, MST and FST. The present functional magnetic resonance imaging study compared several motion‐sensitive regions of interest including the superior temporal sulcus (STS) while human observers viewed horizontally moving dots that defined either a 3D corrugated surface or a 3D random volume. Low‐level stimulus features such as dot density and velocity vectors as well as attention were tightly controlled. Consistent with previous research we found that 3D corrugated surfaces elicited stronger responses than random motion in occipital and parietal brain areas including area V3A, the ventral and dorsal intraparietal sulcus, the lateral occipital sulcus and the fusiform gyrus. Additionally, 3D corrugated surfaces elicited stronger activity in area MT and the STS but not in area MST. Brain activity in the STS but not in area MT correlated with interindividual differences in 3D surface perception. Our findings suggest that area MT is involved in the analysis of optic flow patterns such as speed gradients and that the STS in humans plays a greater role in the analysis of 3D SFM than previously thought.


Frontiers in Integrative Neuroscience | 2013

Combined diffusion-weighted and functional magnetic resonance imaging reveals a temporal-occipital network involved in auditory-visual object processing

Anton L. Beer; Tina Plank; Georg Meyer; Mark W. Greenlee

Functional magnetic resonance imaging (MRI) showed that the superior temporal and occipital cortex are involved in multisensory integration. Probabilistic fiber tracking based on diffusion-weighted MRI suggests that multisensory processing is supported by white matter connections between auditory cortex and the temporal and occipital lobe. Here, we present a combined functional MRI and probabilistic fiber tracking study that reveals multisensory processing mechanisms that remained undetected by either technique alone. Ten healthy participants passively observed visually presented lip or body movements, heard speech or body action sounds, or were exposed to a combination of both. Bimodal stimulation engaged a temporal-occipital brain network including the multisensory superior temporal sulcus (msSTS), the lateral superior temporal gyrus (lSTG), and the extrastriate body area (EBA). A region-of-interest (ROI) analysis showed multisensory interactions (e.g., subadditive responses to bimodal compared to unimodal stimuli) in the msSTS, the lSTG, and the EBA region. Moreover, sounds elicited responses in the medial occipital cortex. Probabilistic tracking revealed white matter tracts between the auditory cortex and the medial occipital cortex, the inferior occipital cortex (IOC), and the superior temporal sulcus (STS). However, STS terminations of auditory cortex tracts showed limited overlap with the msSTS region. Instead, msSTS was connected to primary sensory regions via intermediate nodes in the temporal and occipital cortex. Similarly, the lSTG and EBA regions showed limited direct white matter connections but instead were connected via intermediate nodes. Our results suggest that multisensory processing in the STS is mediated by separate brain areas that form a distinct network in the lateral temporal and inferior occipital cortex.


Experimental Brain Research | 2009

Specificity of auditory-guided visual perceptual learning suggests crossmodal plasticity in early visual cortex

Anton L. Beer; Takeo Watanabe

Sounds modulate visual perception. Blind humans show altered brain activity in early visual cortex. However, it is still unclear whether crossmodal activity in visual cortex results from unspecific top-down feedback, a lack of visual input, or genuinely reflects crossmodal interactions at early sensory levels. We examined how sounds affect visual perceptual learning in sighted adults. Visual motion discrimination was tested prior to and following eight sessions in which observers were exposed to irrelevant moving dots while detecting sounds. After training, visual discrimination improved more strongly for motion directions that were paired with a relevant sound during training than for other directions. Crossmodal learning was limited to visual field locations that overlapped with the sound source and was little affected by attention. The specificity and automatic nature of these learning effects suggest that sounds automatically guide visual plasticity at a relatively early level of processing.


Cognitive, Affective, & Behavioral Neuroscience | 2004

Unimodal and crossmodal effects of endogenous attention to visual and auditory motion

Anton L. Beer; Brigitte Röder

Three experiments were conducted examining unimodal and crossmodal effects of attention to motion. Horizontally moving sounds and dot patterns were presented and participants’ task was to discriminate their motion speed or whether they were presented with a brief gap. In Experiments 1 and 2, stimuli of one modality and of one direction were presented with a higher probability (p = .7) than other stimuli. Sounds and dot patterns moving in the expected direction were discriminated faster than stimuli moving in the unexpected direction. In Experiment 3, participants had to respond only to stimuli moving in one direction within the primary modality, but to all stimuli regardless of their direction within the rarer secondary modality. Stimuli of the secondary modality moving in the attended direction were discriminated faster than were oppositely moving stimuli. Results suggest that attending to the direction of motion affects perception within vision and audition, but also across modalities.


PLOS ONE | 2008

A motion illusion reveals mechanisms of perceptual stabilization.

Anton L. Beer; Andreas Heckel; Mark W. Greenlee

Visual illusions are valuable tools for the scientific examination of the mechanisms underlying perception. In the peripheral drift illusion special drift patterns appear to move although they are static. During fixation small involuntary eye movements generate retinal image slips which need to be suppressed for stable perception. Here we show that the peripheral drift illusion reveals the mechanisms of perceptual stabilization associated with these micromovements. In a series of experiments we found that illusory motion was only observed in the peripheral visual field. The strength of illusory motion varied with the degree of micromovements. However, drift patterns presented in the central (but not the peripheral) visual field modulated the strength of illusory peripheral motion. Moreover, although central drift patterns were not perceived as moving, they elicited illusory motion of neutral peripheral patterns. Central drift patterns modulated illusory peripheral motion even when micromovements remained constant. Interestingly, perceptual stabilization was only affected by static drift patterns, but not by real motion signals. Our findings suggest that perceptual instabilities caused by fixational eye movements are corrected by a mechanism that relies on visual rather than extraretinal (proprioceptive or motor) signals, and that drift patterns systematically bias this compensatory mechanism. These mechanisms may be revealed by utilizing static visual patterns that give rise to the peripheral drift illusion, but remain undetected with other patterns. Accordingly, the peripheral drift illusion is of unique value for examining processes of perceptual stabilization.


PLOS ONE | 2016

Surface-Based Analyses of Anatomical Properties of the Visual Cortex in Macular Degeneration

Doety Prins; Tina Plank; Heidi A. Baseler; Andre Gouws; Anton L. Beer; Antony B. Morland; Mark W. Greenlee; Frans W. Cornelissen

Introduction

Macular degeneration (MD) can cause a central visual field defect. In a previous study, we found volumetric reductions along the entire visual pathways of MD patients, possibly indicating degeneration of inactive neuronal tissue. This may have important implications. In particular, new therapeutic strategies to restore retinal function rely on intact visual pathways and cortex to reestablish visual function. Here we reanalyze the data of our previous study using surface-based morphometry (SBM) rather than voxel-based morphometry (VBM). This can help determine the robustness of the findings and will lead to a better understanding of the nature of neuroanatomical changes associated with MD.

Methods

The metrics of interest were acquired by performing SBM analysis on T1-weighted MRI data acquired from 113 subjects: patients with juvenile MD (JMD; n = 34), patients with age-related MD (AMD; n = 24) and healthy age-matched controls (HC; n = 55).

Results

Relative to age-matched controls, JMD patients showed a thinner cortex, a smaller cortical surface area and a lower grey matter volume in V1 and V2, while AMD patients showed thinning of the cortex in V2. Neither patient group showed a significant difference in mean curvature of the visual cortex.

Discussion

The thinner cortex, smaller surface area and lower grey matter volume in the visual cortex of JMD patients are consistent with our previous results showing a volumetric reduction in their visual cortex. Finding comparable results using two rather different analysis techniques suggests the presence of marked cortical degeneration in the JMD patients. In the AMD patients, we found a thinner cortex in V2 but not in V1. In contrast to our previous VBM analysis, SBM revealed no volumetric reductions of the visual cortex. This suggests that the cortical changes in AMD patients are relatively subtle, as they apparently can be missed by one of the methods.


Seeing and Perceiving | 2011

Spatial shifts of audio-visual interactions by perceptual learning are specific to the trained orientation and eye.

Melissa A. Batson; Anton L. Beer; Aaron R. Seitz; Takeo Watanabe

A large proportion of the human cortex is devoted to visual processing. Contrary to the traditional belief that multimodal integration takes place in multimodal processing areas separate from visual cortex, several studies have found that sounds may directly alter processing in visual brain areas. Furthermore, recent findings show that perceptual learning can change the perceptual mechanisms that relate auditory and visual senses. However, there is still a debate about the systems involved in cross-modal learning. Here, we investigated the specificity of audio-visual perceptual learning. Audio-visual cueing effects were tested on a Gabor orientation task and an object discrimination task in the presence of lateralised sound cues before and after eight days of cross-modal task-irrelevant perceptual learning. During training, the sound cues were paired with visual stimuli that were misaligned at a proximal (trained) visual field location relative to the sound. Training was performed with one eye patched and with only one Gabor orientation. Consistent with previous findings, we found that cross-modal perceptual training shifted the audio-visual cueing effect towards the trained retinotopic location. However, this shift in audio-visual tuning was only observed for the trained stimulus (Gabors), at the trained orientation, and in the trained eye. This specificity suggests that multimodal interactions resulting from cross-modal (audio-visual) task-irrelevant perceptual learning involve so-called unisensory visual processing areas in humans. Our findings provide further support for recent anatomical and physiological findings that suggest relatively early interactions in cross-modal processing.

Collaboration


Dive into Anton L. Beer's collaborations.

Top Co-Authors
Tina Plank

University of Regensburg

Georg Meyer

University of Liverpool

Amy Spray

University of Liverpool
