Michael Kubischik
Ruhr University Bochum
Publications
Featured research published by Michael Kubischik.
Neuron | 2001
Frank Bremmer; Anja Schlack; N. Jon Shah; Oliver Zafiris; Michael Kubischik; Klaus-Peter Hoffmann; Karl Zilles; Gereon R. Fink
In monkeys, posterior parietal and premotor cortex play an important integrative role in polymodal motion processing. In contrast, our understanding of the convergence of senses in humans is only at its beginning. To test for equivalencies between macaque and human polymodal motion processing, we used functional MRI in normals while presenting moving visual, tactile, or auditory stimuli. Increased neural activity evoked by all three stimulus modalities was found in the depth of the intraparietal sulcus (IPS), ventral premotor, and lateral inferior postcentral cortex. The observed activations strongly suggest that polymodal motion processing in humans and monkeys is supported by equivalent areas. The activations in the depth of IPS imply that this area constitutes the human equivalent of macaque area VIP.
Annals of the New York Academy of Sciences | 1999
Frank Bremmer; Michael Kubischik; Martin Pekel; Markus Lappe; Klaus-Peter Hoffmann
Abstract: The present study was aimed at investigating the sensitivity to linear vestibular stimulation of neurons in the medial superior temporal area (MST) of the macaque monkey. Two monkeys were moved on a parallel swing while single‐unit activity was recorded. About one‐half of the cells (28/51) responded in the dark either to forward motion (n= 10), or to backward motion (n= 11), or to both (n= 7). Twenty cells responding to vestibular stimulation in darkness were also tested for their responses to optic flow stimulation simulating forward and backward self‐motion. Forty‐five percent (9/20) of them preferred the same self‐motion directions, that is, they combined visual and vestibular signals in a synergistic manner. Thirty percent (6/20) of the cells were not responsive to visual stimulation alone. The remaining 25% (5/20) preferred directions that were antialigned. Our results provide strong evidence that neurons in the MST area are at least in part involved in the processing of self‐motion.
The Journal of Neuroscience | 2009
Frank Bremmer; Michael Kubischik; Klaus-Peter Hoffmann; Bart Krekelberg
We make fast, ballistic eye movements called saccades more often than our heart beats. Although every saccade causes a large movement of the image of the environment on our retina, we never perceive this motion. This aspect of perceptual stability is often referred to as saccadic suppression: a reduction of visual sensitivity around the time of saccades. Here, we investigated the neural basis of this perceptual phenomenon with extracellular recordings from awake, behaving monkeys in the middle temporal, medial superior temporal, ventral intraparietal, and lateral intraparietal areas. We found that, in each of these areas, the neural response to a visual stimulus changes around an eye movement. The perisaccadic response changes are qualitatively different in each of these areas, suggesting that they do not arise from a change in a common input area. Importantly, our data show that the suppression in the dorsal stream starts well before the eye movement. This clearly shows that the suppression is not just a consequence of the changes in visual input during the eye movement but rather must involve a process that actively modulates neural activity just before a saccade.
Neuron | 2003
Bart Krekelberg; Michael Kubischik; Klaus-Peter Hoffmann; Frank Bremmer
While reading this text, your eyes jump from word to word. Yet you are unaware of the motion this causes on your retina; the brain somehow compensates for these displacements and creates a stable percept of the world. This compensation is not perfect; perisaccadically, perceptual space is distorted. We show that this distortion can be traced to a representation of retinal position in the middle temporal and medial superior temporal areas. These cells accurately represent retinal position during fixation, but perisaccadically, the same cells distort the representation of space. The time course and magnitude of this distortion are similar to the mislocalization found psychophysically in humans. This challenges the assumption in many psychophysical studies that the perisaccadic retinal position signal is veridical.
Current Biology | 2012
Adam P. Morris; Michael Kubischik; Klaus-Peter Hoffmann; Bart Krekelberg; Frank Bremmer
BACKGROUND: Many visual areas of the primate brain contain signals related to the current position of the eyes in the orbit. These cortical eye-position signals are thought to underlie the transformation of retinal input, which changes with every eye movement, into a stable representation of visual space. For this coding scheme to work, such signals would need to be updated fast enough to keep up with the eye during normal exploratory behavior. We examined the dynamics of cortical eye-position signals in four dorsal visual areas of the macaque brain: the lateral and ventral intraparietal areas (LIP; VIP), the middle temporal area (MT), and the medial-superior temporal area (MST). We recorded extracellular activity of single neurons while the animal performed sequences of fixations and saccades in darkness.

RESULTS: The data show that eye-position signals are updated predictively, such that the representation shifts in the direction of a saccade prior to (<100 ms) the actual eye movement. Despite this early start, eye-position signals remain inaccurate until shortly after (10-150 ms) the eye movement. By using simulated behavioral experiments, we show that this brief misrepresentation of eye position provides a neural explanation for the psychophysical phenomenon of perisaccadic mislocalization, in which observers misperceive the positions of visual targets flashed around the time of saccadic eye movements.

CONCLUSIONS: Together, these results suggest that eye-position signals in the dorsal visual system are updated rapidly across eye movements and play a direct role in perceptual localization, even when they are erroneous.
Experimental Brain Research | 2010
Frank Bremmer; Michael Kubischik; Martin Pekel; Klaus-Peter Hoffmann; Markus Lappe
The control of self-motion is supported by visual, vestibular, and proprioceptive signals. Recent research has shown how these signals interact in the monkey medio-superior temporal area (area MST) to enhance and disambiguate the perception of heading during self-motion. Area MST is a central stage for self-motion processing from optic flow, and integrates flow field information with vestibular self-motion and extraretinal eye movement information. Such multimodal cue integration is clearly important to solidify perception. However, to understand the information processing capabilities of the brain, one must also ask how much information can be deduced from a single cue alone. This is particularly pertinent for optic flow, where controversies over its usefulness for self-motion control have existed ever since Gibson proposed his direct approach to ecological perception. In our study, we therefore tested macaque MST neurons for their heading selectivity in highly complex flow fields based on purely visual mechanisms. We recorded responses of MST neurons to simple radial flow fields and to distorted flow fields that simulated a self-motion plus an eye movement. About half of the cells compensated for such distortion and kept the same heading selectivity in both cases. Our results strongly support the notion of an involvement of area MST in the computation of heading.
Science | 2002
Alexander Thiele; Phillip Henning; Michael Kubischik; Klaus-Peter Hoffmann
Journal of Vision | 2010
Adam P. Morris; Michael Kubischik; Klaus-Peter Hoffmann; Bart Krekelberg; Frank Bremmer
Archive | 2012
Adam P. Morris; Michael Kubischik; Klaus-Peter Hoffmann; Bart Krekelberg; Frank Bremmer
Archive | 2010
Syed A. Chowdhury; Katsumasa Takahashi; Gregory C. DeAngelis; Dora E. Angelaki; Wei Song Ong; Nina Hooshvar; Mingsha Zhang; James W. Bisley; Frank Bremmer; Michael Kubischik; Klaus-Peter Hoffmann; Bart Krekelberg; Leanne Chukoskie; J. A. Movshon; Keishi Fujiwara; Teppei Akao; Sergei Kurkin; Kikuro Fukushima