Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Frank Bremmer is active.

Publication


Featured research published by Frank Bremmer.


Neuron | 2001

Polymodal Motion Processing in Posterior Parietal and Premotor Cortex: A Human fMRI Study Strongly Implies Equivalencies between Humans and Monkeys

Frank Bremmer; Anja Schlack; N. Jon Shah; Oliver Zafiris; Michael Kubischik; Klaus-Peter Hoffmann; Karl Zilles; Gereon R. Fink

In monkeys, posterior parietal and premotor cortex play an important integrative role in polymodal motion processing. In contrast, our understanding of the convergence of senses in humans is only at its beginning. To test for equivalencies between macaque and human polymodal motion processing, we used functional MRI in normals while presenting moving visual, tactile, or auditory stimuli. Increased neural activity evoked by all three stimulus modalities was found in the depth of the intraparietal sulcus (IPS), ventral premotor, and lateral inferior postcentral cortex. The observed activations strongly suggest that polymodal motion processing in humans and monkeys is supported by equivalent areas. The activations in the depth of IPS imply that this area constitutes the human equivalent of macaque area VIP.


Nature | 1997

Spatial invariance of visual receptive fields in parietal cortex neurons.

Frank Bremmer; Suliann Ben Hamed; Werner Graf; Marcelin Bertelot

Spatial information is conveyed to the primary visual cortex in retinal coordinates. Movement trajectory programming, however, requires a transformation from this sensory frame of reference into a frame appropriate for the selected part of the body, such as the eye, head or arms. To achieve this transformation, visual information must be combined with information from other sources: for instance, the location of an object of interest can be defined with respect to the observer's head if the position of the eyes in the orbit is known and is added to the object's retinal coordinates. Here we show that in a subdivision of the monkey parietal lobe, the ventral intraparietal area (VIP), the activity of visual neurons is modulated by eye-position signals, as in many other areas of the cortical visual system. We find that individual receptive fields of a population of VIP neurons are organized along a continuum, from eye to head coordinates. In the latter case, neurons encode the azimuth and/or elevation of a visual stimulus, independently of the direction in which the eyes are looking, thus representing spatial locations explicitly in at least a head-centred frame of reference.
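The coordinate transformation described in this abstract (head-centred location = retinal coordinates + eye position) can be sketched in a few lines. The function name and degree units below are illustrative, not taken from the paper:

```python
def head_centred_azimuth(retinal_azimuth_deg: float, eye_position_deg: float) -> float:
    """Head-centred azimuth of a stimulus: its retinal azimuth plus the
    horizontal position of the eyes in the orbit (all in degrees)."""
    return retinal_azimuth_deg + eye_position_deg

# A stimulus 10 deg right of the fovea, viewed with the eyes rotated
# 5 deg left, lies 5 deg right of the head's midline.
azimuth = head_centred_azimuth(10.0, -5.0)
```

A neuron encoding azimuth in head coordinates would respond to a fixed value of this sum regardless of where the eyes are looking.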


The Journal of Neuroscience | 2005

Multisensory Space Representations in the Macaque Ventral Intraparietal Area

Anja Schlack; Susanne J. Sterbing-D'Angelo; Klaus Hartung; Klaus-Peter Hoffmann; Frank Bremmer

Animals can use different sensory signals to localize objects in the environment. Depending on the situation, the brain either integrates information from multiple sensory sources or it chooses the modality conveying the most reliable information to direct behavior. This suggests that somehow, the brain has access to a modality-invariant representation of external space. Accordingly, neural structures encoding signals from more than one sensory modality are best suited for spatial information processing. In primates, the posterior parietal cortex (PPC) is a key structure for spatial representations. One substructure within human and macaque PPC is the ventral intraparietal area (VIP), known to represent visual, vestibular, and tactile signals. In the present study, we show for the first time that macaque area VIP neurons also respond to auditory stimulation. Interestingly, the strength of the responses to the acoustic stimuli greatly depended on the spatial location of the stimuli [i.e., most of the auditory responsive neurons had surprisingly small spatially restricted auditory receptive fields (RFs)]. Given this finding, we compared the auditory RF locations with the respective visual RF locations of individual area VIP neurons. In the vast majority of neurons, the auditory and visual RFs largely overlapped. Additionally, neurons with well aligned visual and auditory receptive fields tended to encode multisensory space in a common reference frame. This suggests that area VIP constitutes a part of a neuronal circuit involved in the computation of a modality-invariant representation of external space.


European Journal of Neuroscience | 2002

Visual–vestibular interactive responses in the macaque ventral intraparietal area (VIP)

Frank Bremmer; François Klam; Jean-René Duhamel; Suliann Ben Hamed; Werner Graf

Self‐motion detection requires the interaction of a number of sensory systems for correct perceptual interpretation of a given movement and an eventual motor response. Parietal cortical areas are thought to play an important role in this function, and we have thus studied the encoding of multimodal signals and their spatiotemporal interactions in the ventral intraparietal area of macaque monkeys. Thereby, we have identified for the first time the presence of vestibular sensory input to this area and described its interaction with somatosensory and visual signals, via extracellular single‐cell recordings in awake head‐fixed animals. Visual responses were driven by large field stimuli that simulated either backward or forward self‐motion (contraction or expansion stimuli, respectively), or movement in the frontoparallel plane (visual increments moving simultaneously in the same direction). While the dominant sensory modality in most neurons was visual, about one third of all recorded neurons responded to horizontal rotation. These vestibular responses were typically in phase with head velocity, but in some cases they could signal acceleration or even showed integration to position. The associated visual responses were always codirectional with the vestibular on‐direction, i.e. noncomplementary. Somatosensory responses were in register with the visual preferred direction, either in the same or in the opposite direction, thus signalling translation or rotation in the horizontal plane. These results, taken together with data on responses to optic flow stimuli obtained in a parallel study, strongly suggest an involvement of area VIP in the analysis and the encoding of self‐motion.


Experimental Brain Research | 2001

Representation of the visual field in the lateral intraparietal area of macaque monkeys: a quantitative receptive field analysis

S. Ben Hamed; Jean-René Duhamel; Frank Bremmer; Werner Graf

The representation of the visual field in the primate lateral intraparietal area (LIP) was examined using a rapid, computer-driven receptive field (RF) mapping procedure. RF characteristics of single LIP neurons could thus be measured repeatedly under different behavioral conditions. Here we report data obtained using a standard ocular fixation task during which the animals were required to monitor small changes in the color of the fixated target. In a first step, statistical analyses were conducted to establish the experimental limits of the mapping procedure on 171 LIP neurons recorded from three hemispheres of two macaque monkeys. The characteristics of the receptive fields of LIP neurons were analyzed at the single-cell and at the population level. Although for many neurons a simple two-dimensional Gaussian profile, with maximal excitability at the center and progressively decreasing response strength toward the periphery, represents the spatial structure of the RF relatively accurately, about 19% of the cells had a markedly asymmetrical shape. At the population level, we observed, in agreement with prior studies, a systematic relation between RF size and eccentricity. However, we also found a more accentuated overrepresentation of the central visual field than had been previously reported, and no marked differences between the upper and lower visual representations of space. This observation correlates with an extension of the definition of LIP from the posterior third of the lateral intraparietal sulcus to most of the middle and posterior thirds. Detailed histological analyses of the recorded hemispheres suggest that this newly defined unitary functional cortical area has a coarse but systematic topographical organization that supports the distinction between its dorsal and ventral regions, LIPd and LIPv, respectively.
Paralleling the physiological data, the central visual field is represented mostly in the middle dorsal region, and the visual periphery more ventrally and posteriorly. An anteroposterior gradient from the lower to the upper visual field representations can also be identified. In conclusion, this study provides the basis for a reliable mapping method in awake monkeys and a reference for the organization of the visual space representation in an area LIP that is extended with respect to the previously described LIP and shows a relative emphasis on the central visual field.
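The two-dimensional Gaussian RF model described in this abstract can be illustrated as a nonlinear least-squares fit to a mapped response grid. This is a minimal sketch on synthetic data, assuming NumPy and SciPy; all parameter values are hypothetical, not taken from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_rf(coords, amp, x0, y0, sigma, baseline):
    """Isotropic 2D Gaussian RF: maximal response at (x0, y0),
    decaying with distance from the center, on top of a baseline rate."""
    x, y = coords
    return baseline + amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

# Synthetic mapping grid (degrees of visual angle) with a known RF.
xx, yy = np.meshgrid(np.linspace(-30, 30, 21), np.linspace(-30, 30, 21))
rng = np.random.default_rng(0)
true_params = dict(amp=40.0, x0=-5.0, y0=8.0, sigma=6.0, baseline=5.0)
rates = gaussian_rf((xx.ravel(), yy.ravel()), **true_params)
rates = rates + rng.normal(0.0, 1.0, xx.size)  # measurement noise

# Recover RF center, size, and peak response by least squares.
popt, _ = curve_fit(gaussian_rf, (xx.ravel(), yy.ravel()), rates,
                    p0=[30.0, 0.0, 0.0, 10.0, 0.0])
amp, x0, y0, sigma, baseline = popt
```

An asymmetrical RF, like those reported for about 19% of the cells, would show large systematic residuals under this symmetric model.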


NeuroImage | 2001

Space Coding in Primate Posterior Parietal Cortex

Frank Bremmer; Anja Schlack; Jean-René Duhamel; Werner Graf; Gereon R. Fink

Neuropsychological studies of patients with lesions of right frontal (premotor) or posterior parietal cortex often show severe impairments of attentive sensorimotor behavior. Such patients frequently manifest symptoms like hemispatial neglect or extinction. Interestingly, these behavioral deficits occur across different sensory modalities and are often organized in head- or body-centered coordinates. These neuropsychological data provide evidence for the existence of a network of polymodal areas in (primate) premotor and inferior parietal cortex representing visual spatial information in a nonretinocentric frame of reference. In the monkey, a highly modular structural and functional specialization has been demonstrated especially within posterior parietal cortex. One such functionally specialized area is the ventral intraparietal area (VIP). This area is located in the fundus of the intraparietal sulcus and contains many neurons that show polymodal directionally selective discharges, i.e., these neurons respond to moving visual, tactile, vestibular, or auditory stimuli. Many of these neurons also encode sensory information from different modalities in a common, probably head-centered, frame of reference. Functional imaging data on humans reveal a network of cortical areas that respond to polymodal stimuli conveying motion information. One of these regions of activation is located in the depth of human intraparietal sulcus. Accordingly, it is suggested that this area constitutes the human equivalent of monkey area VIP. The functional role of area VIP for polymodal spatial perception in normals as well as the functional implications of lesions of area VIP in parietal patients needs to be established in further experiments.


European Journal of Neuroscience | 2002

Heading encoding in the macaque ventral intraparietal area (VIP).

Frank Bremmer; Jean-René Duhamel; Suliann Ben Hamed; Werner Graf

We recorded neuronal responses to optic flow stimuli in the ventral intraparietal area (VIP) of two awake macaque monkeys. In accordance with previous studies on optic flow responses in monkey extrastriate cortex, we used different stimulus classes: frontoparallel motion, radial stimuli (expansion and contraction) and rotational stimuli (clockwise and counter‐clockwise). Seventy‐five percent of the cells showed statistically significant responses to one or more of these optic flow stimuli. Shifting the location of the singularity of the optic flow stimuli within the visual field led to a response modulation in almost all cases. For the majority of neurons, this modulatory influence could be approximated in a statistically significant manner by a two‐dimensional linear regression. Gradient directions, derived from the regression parameters and indicating the direction of the steepest increase in the responses, were uniformly distributed. At the population level, an unbiased average response for the stimuli with different focus locations was observed. By applying a population code, termed ‘isofrequency encoding’, we demonstrate the capability of the recorded neuronal ensemble to retrieve the focus location from its population discharge. Responses to expansion and contraction stimuli cannot be predicted based on quantitative data on a neuron's frontoparallel preferred stimulus direction and the location and size of its receptive field. These results, taken together with data on polymodal motion responses in this area, suggest an involvement of area VIP in the analysis and the encoding of heading.
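The two-dimensional linear regression of response strength on focus location, and the gradient direction derived from its parameters, can be sketched as a plane fit. Synthetic data, NumPy assumed; all numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical focus-of-expansion positions (deg) and firing rates (Hz)
# following a planar trend: rate ≈ a*x + b*y + c, plus noise.
x = rng.uniform(-20.0, 20.0, 50)
y = rng.uniform(-20.0, 20.0, 50)
rates = 0.8 * x - 0.5 * y + 30.0 + rng.normal(0.0, 1.0, 50)

# Least-squares plane fit via the design matrix [x, y, 1].
A = np.column_stack([x, y, np.ones_like(x)])
(a, b, c), *_ = np.linalg.lstsq(A, rates, rcond=None)

# Gradient direction: direction in the visual field along which the
# neuron's response increases most steeply.
gradient_dir_deg = np.degrees(np.arctan2(b, a))
```

Across a population, a uniform distribution of these gradient directions is what yields the unbiased average response reported in the abstract.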


The Journal of Neuroscience | 1996

Optic Flow Processing in Monkey STS: A Theoretical and Experimental Approach

Markus Lappe; Frank Bremmer; Martin Pekel; Alexander Thiele; Klaus-Peter Hoffmann

How does the brain process visual information about self-motion? In monkey cortex, the analysis of visual motion is performed by successive areas specialized in different aspects of motion processing. Whereas neurons in the middle temporal (MT) area are direction-selective for local motion, neurons in the medial superior temporal (MST) area respond to motion patterns. A neural network model attempts to link these properties to the psychophysics of human heading detection from optic flow. It proposes that populations of neurons represent specific directions of heading. We quantitatively compared single-unit recordings in area MST with single-neuron simulations in this model. Predictions were derived from simulations and subsequently tested in recorded neurons. Neuronal activities depended on the position of the singular point in the optic flow. Best responses to opposing motions occurred for opposite locations of the singular point in the visual field. Excitation by one type of motion is paired with inhibition by the opposite motion. Activity maxima often occur for peripheral singular points. The averaged recorded shape of the response modulations is sigmoidal, which is in agreement with model predictions. We also tested whether the activity of the neuronal population in MST can represent the directions of heading in our stimuli. A simple least-mean-square minimization could retrieve the direction of heading from the neuronal activities with a precision of 4.3°. Our results show good agreement between the proposed model and the neuronal responses in area MST and further support the hypothesis that area MST is involved in visual navigation.
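The least-mean-square readout of heading from population activity described above can be illustrated with a toy population of sigmoidally tuned units, matching the sigmoidal response modulations reported. All tuning parameters and noise levels below are invented for illustration; NumPy assumed:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical MST-like population: each unit's response is a sigmoid of
# heading azimuth, with excitation on one side and suppression on the other.
n_units = 60
centers = rng.uniform(-40.0, 40.0, n_units)  # inflection points (deg)
slopes = rng.uniform(0.1, 0.4, n_units)      # steepness of each sigmoid
signs = rng.choice([-1.0, 1.0], n_units)     # which side is excitatory

def population_response(heading_deg):
    """Normalized response of every unit to a given heading azimuth."""
    return 1.0 / (1.0 + np.exp(-signs * slopes * (heading_deg - centers)))

def decode_heading(activity, candidates=np.linspace(-40.0, 40.0, 801)):
    """Least-mean-square readout: the candidate heading whose predicted
    population pattern best matches the observed activity."""
    errs = [np.mean((population_response(h) - activity) ** 2) for h in candidates]
    return candidates[int(np.argmin(errs))]

true_heading = 12.0
noisy_activity = population_response(true_heading) + rng.normal(0.0, 0.03, n_units)
estimate = decode_heading(noisy_activity)
```

With a population of this size and modest noise, the readout recovers the heading to within a few degrees, in the spirit of the 4.3° precision reported for the recorded MST population.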


Nature | 2003

Neural correlates of implied motion

Bart Krekelberg; Sabine Dannenberg; Klaus-Peter Hoffmann; Frank Bremmer; John Ross

Current views of the visual system assume that the primate brain analyses form and motion along largely independent pathways; they provide no insight into why form is sometimes interpreted as motion. In a series of psychophysical and electrophysiological experiments in humans and macaques, here we show that some form information is processed in the prototypical motion areas of the superior temporal sulcus (STS). First, we show that STS cells respond to dynamic Glass patterns, which contain no coherent motion but suggest a path of motion. Second, we show that when motion signals conflict with form signals suggesting a different path of motion, both humans and monkeys perceive motion in a compromised direction. This compromise also has a correlate in the responses of STS cells, which alter their direction preferences in the presence of conflicting implied motion information. We conclude that cells in the prototypical motion areas in the dorsal visual cortex process form that implies motion. Estimating motion by combining motion cues with form cues may be a strategy to deal with the complexities of motion perception in our natural environment.


European Journal of Neuroscience | 2002

Interaction of linear vestibular and visual stimulation in the macaque ventral intraparietal area (VIP)

Anja Schlack; Klaus-Peter Hoffmann; Frank Bremmer

Navigation in space requires the brain to combine information arising from different sensory modalities with the appropriate motor commands. Sensory information about self‐motion in particular is provided by the visual and the vestibular system. The macaque ventral intraparietal area (VIP) has recently been shown to be involved in the processing of self‐motion information provided by optical flow, to contain multimodal neurons and to receive input from areas involved in the analysis of vestibular information. By studying responses to linear vestibular, visual and bimodal stimulation we aimed at gaining more insight into the mechanisms involved in multimodal integration and self‐motion processing. A large proportion of cells (77%) revealed a significant response to passive linear translation of the monkey. Of these cells, 59% encoded information about the direction of self‐motion. The phase relationship between vestibular stimulation and neuronal responses covered a broad spectrum, demonstrating the complexity of the spatio‐temporal pattern of vestibular information encoded by neurons in area VIP. For 53% of the direction‐selective neurons the preferred directions for stimuli of both modalities were the same; they were opposite for the remaining 47% of the neurons. During bimodal stimulation the responses of neurons with opposite direction selectivity in the two modalities were determined either by the visual (53%) or the vestibular (47%) modality. These heterogeneous responses to unimodal and bimodal stimulation might be used to prevent misjudgements about self‐ and/or object‐motion, which could be caused by relying on information of one sensory modality alone.

Collaboration


Dive into Frank Bremmer's collaborations.

Top Co-Authors

Jean-René Duhamel

Centre national de la recherche scientifique
