Publication


Featured research published by T. Mergner.


Neuroscience Letters | 2000

Vestibular, visual, and somatosensory contributions to human control of upright stance

C. Maurer; T. Mergner; B. Bolha; F. Hlavačka

We investigated the changes of human posture control of upright stance which occur when vestibular cues (VEST) are absent and visual and somatosensory orientation cues (VIS, SOM) are removed. Postural responses to sinusoidal tilts of a motion platform in the sagittal plane (+/-2 degrees, f=0.05, 0.1, 0.2 and 0.4 Hz) were studied in normal subjects (Ns) and patients with bilateral vestibular loss (Ps). We found that absence of VEST (Ps, visual reference) and removal of VIS (Ns, no visual reference) had little effect on stabilization of upright body posture in space. In the absence of both VEST and VIS (Ps, no visual reference) somatosensory graviception still provided some information on body orientation in space at 0.05 and 0.1 Hz. However, at the higher frequencies Ps qualitatively changed their behavior; they then tended to actively align their bodies with respect to the motion platform. The findings confirm predictions of a novel postural control model.


Neuroscience Letters | 2001

Human balance control during cutaneous stimulation of the plantar soles

C. Maurer; T. Mergner; B. Bolha; F. Hlavačka

Previous work on human postural control of upright stance, performed in the absence of visual and vestibular orientation cues, suggests that somatosensory cues in the feet enable subjects to maintain equilibrium during low-frequency platform tilts. Here we confirm earlier studies which indicated that stimulation of plantar cutaneous mechanoreceptors can lead to postural responses. Yet, this stimulation did not modify considerably the postural reactions of normal subjects and vestibular loss patients during platform tilts. We therefore suggest that it is necessary to differentiate between (i) cues from plantar cutaneous receptors involved in exteroceptive functions, like the evaluation of the support structure or of relative foot-to-surface motion, and (ii) cues from deep receptors which subserve proprioceptive functions like the control of center of pressure shifts within the limits of the foot support base.


Experimental Brain Research | 1992

Role of vestibular and neck inputs for the perception of object motion in space

T. Mergner; G. Rottler; H. Kimmig; W. Becker

The contribution of vestibular and neck inputs to the perception of visual object motion in space was studied in the absence of a visual background (in the dark) in normal human subjects (Ss). Measures of these contributions were obtained by means of a closed loop nulling procedure; Ss fixed their eyes on a luminous spot (object) and nulled its actual or apparent motion in space during head rotation in space (vestibular stimulus) and/or trunk rotation relative to the head (neck stimulus) with the help of a joystick. Vestibular and neck contributions were expressed in terms of gain and phase with respect to the visuo-oculomotor/joystick feedback loop which was assumed to have almost ideal transfer characteristics. The stimuli were applied as sinusoidal rotations in the horizontal plane (f = 0.025–0.8 Hz; peak angular displacements, 1–16°). Results: (1) During vestibular stimulation, Ss perceived the object, when kept in fixed alignment with the moving body, as moving in space. However, they underestimated the object motion; the gain was only about 0.7 at 0.2–0.8 Hz and clearly decreased at lower stimulus frequencies, while the phase exhibited a small lead. (2) During pure neck stimulation (trunk rotating relative to the stationary head), the object, when stationary, appeared to move in space counter to the trunk excursion. This neck-contingent object motion illusion was small at 0.2–0.8 Hz, but increased considerably with decreasing frequency, while its phase developed a small lag. (3) Vestibular, neck, and visuo-oculomotor effects summed linearly during combined stimulations. (4) The erroneous vestibular and neck contributions to the object motion perception were complementary to each other, and the perception became about veridical (G≈1, φ≈0°), when both inputs were combined during head rotation with the trunk stationary.
The results are simulated by an extended version of a computer model that previously had been developed to describe vestibular and neck effects on human perception of head motion in space. In the model, the perception of object motion in space is derived from the superposition of three signals, representing “object to head” (visuo-oculomotor; head coordinates), “head on trunk” (neck; trunk coordinates), and “trunk in space” (vestibular-neck interaction; space coordinates).
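The three-signal superposition can be sketched numerically. The following is an illustrative sketch, not the authors' code: the variable names and the unity gains are assumptions, chosen to show that summing the three coordinate-transform signals recovers object motion in space when all transfer characteristics are ideal.

```python
import numpy as np

# Illustrative sketch of the three-signal superposition described above,
# with idealized unity-gain signals (the paper reports frequency-dependent,
# non-ideal gains, e.g. a vestibular gain of about 0.7 at 0.2-0.8 Hz).

f = 0.1                                   # stimulus frequency (Hz), within the tested range
t = np.linspace(0.0, 1.0 / f, 1000)

# Head rotates sinusoidally in space; trunk and object are stationary.
head_in_space = 8.0 * np.sin(2 * np.pi * f * t)   # deg
trunk_in_space = np.zeros_like(t)
object_in_space = np.zeros_like(t)

# The three internal signals of the model:
object_to_head = object_in_space - head_in_space  # visuo-oculomotor (head coordinates)
head_on_trunk = head_in_space - trunk_in_space    # neck (trunk coordinates)
# trunk_in_space itself: vestibular-neck interaction (space coordinates)

# Superposition recovers object motion in space (veridical perception, G = 1).
perceived_object_in_space = object_to_head + head_on_trunk + trunk_in_space
assert np.allclose(perceived_object_in_space, object_in_space)
```

With ideal gains the estimate is exact; the paper's point is that the real vestibular and neck contributions are individually erroneous but complementary, so their sum still approaches G ≈ 1 during head rotation with the trunk stationary.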


Experimental Brain Research | 1999

MR-eyetracker: a new method for eye movement recording in functional magnetic resonance imaging

H. Kimmig; Mark W. Greenlee; F. Huethe; T. Mergner

We present a method for recording saccadic and pursuit eye movements in the magnetic resonance tomograph designed for visual functional magnetic resonance imaging (fMRI) experiments. To reliably classify brain areas as pursuit or saccade related it is important to carefully measure the actual eye movements. For this purpose, infrared light, created outside the scanner by light-emitting diodes (LEDs), is guided via optic fibers into the head coil and onto the eye of the subject. Two additional fiber optical cables pick up the light reflected by the iris. The illuminating and detecting cables are mounted in a plastic eyepiece that is manually lowered to the level of the eye. By means of differential amplification, we obtain a signal that covaries with the horizontal position of the eye. Calibration of eye position within the scanner yields an estimate of eye position with a resolution of 0.2° at a sampling rate of 1000 Hz. Experiments are presented that employ echoplanar imaging with 12 image planes through visual, parietal and frontal cortex while subjects performed saccadic and pursuit eye movements. The distribution of BOLD (blood oxygen level dependent) responses is shown to depend on the type of eye movement performed. Our method yields high temporal and spatial resolution of the horizontal component of eye movements during fMRI scanning. Since the signal is purely optical, there is no interaction between the eye movement signals and the echoplanar images. This reasonably priced eye tracker can be used to control eye position and monitor eye movements during fMRI.


Experimental Brain Research | 1998

Eye movements evoked by proprioceptive stimulation along the body axis in humans

T. Mergner; G. Schweigart; F. Botti; A. Lehmann

Proprioceptive input arising from torsional body movements elicits small reflexive eye movements. The functional relevance of these eye movements is still unknown. We evaluated their slow components as a function of stimulus frequency and velocity. The horizontal eye movements of seven adult subjects were recorded using an infrared device, while horizontal rotations were applied at three segmental levels of the body [i.e., between head and shoulders (neck stimulus), shoulders and pelvis (trunk stimulus), and pelvis and feet (leg stimulus)]. The following results were obtained: (1) Sinusoidal leg stimulation evoked an eye response with the slow component in the direction of the movement of the feet, while the response to trunk and neck stimulation was oriented in the opposite direction (i.e., in that of the head). (2) In contrast, the gain behavior of all three responses was similar, with very low gain at mid- to high frequencies (tested up to 0.4 Hz) but increasing gain at low frequencies (down to 0.0125 Hz). We show that this gain behavior is mainly due to a gain nonlinearity for low angular velocities. (3) The responses were compatible with linear summation when an interaction series was tested in which the leg stimulus was combined with a vestibular stimulus. (4) There was good correspondence of the median gain curves when eye responses were compared with psychophysical responses (perceived body rotation in space; additionally recorded in the interaction series). However, correlation of gain values on a single-trial basis was poor. (5) During transient neck stimulation (smoothed position ramp), the neck response notably consisted of two components – an initial head-directed eye shift (phasic component) followed by a shift in the opposite direction (compensatory tonic component). Both leg and neck responses can be described by one simple, dynamic model.
In the model the proprioceptive input is fed into the gaze network via two pathways which differ in their dynamics and directional sign. The model simulates either leg or neck responses by selecting an appropriate weight for the gain of one of the pathways (phasic component). The interaction results can also be simulated when a vestibular path is added. This model has similarities to one we recently proposed for human self-motion perception and postural control. A major difference, though, is that the proprioceptive input to the gaze-stabilizing network is weak (restricted to low velocities), unlike that used for perception and postural control. We hold that the former undergoes involution during ontogenesis, as subjects depend on the functionally more appropriate vestibulo-ocular reflex. Yet, the weak proprioceptive eye responses that remain may have some functional relevance. Their tonic component tends to stabilize the eyes by slowly shifting them toward the primary head position relative to the body support. This applies solely to the earth-horizontal plane in which the vestibular signal has no static sensitivity.


Brain Research Bulletin | 1996

Control of the body vertical by vestibular and proprioceptive inputs

F. Hlavacka; T. Mergner; M. Krizkova

The study examines the influence of vestibular and leg proprioceptive cues on the maintenance of the body vertical in human stance. Vestibular body orientation cues were changed by applying bipolar currents to both mastoid bones (cosine-bell wave form of 3.3 s duration, 1 mA current intensity). Proprioceptive input was modified by vibrating the tibialis anterior muscle (at f = 90 Hz, step of 5 s duration and 1 mm amplitude). Furthermore, the vestibular stimulus was paired with the muscle vibration using three different temporal relationships between the stimuli. Body lean responses were analyzed in terms of sway trajectories of the center of foot pressure on the body support surface (horizontal plane). With the anode on the right mastoid, vestibular body lean response was essentially straight towards the right side, and with the anode on left mastoid towards the left side. Vibration of right tibialis anterior muscle induced an almost straight body lean forward and to the right. Upon combined stimulation, responses with complex trajectory resulted, which depended on the stimulus interval. These responses reflected a superposition of the individual vestibular and proprioceptive effects. The results show that the body vertical is under the continuous control of leg proprioceptive and vestibular inputs, which sum linearly. We present a concept according to which these inputs are used for establishing a reference system for the control of the body vertical.


Experimental Brain Research | 2006

A cognitive intersensory interaction mechanism in human postural control

A. Blümle; Christoph Maurer; G. Schweigart; T. Mergner

Human control of upright body posture involves inputs from several senses (visual, vestibular, proprioceptive, somatosensory) and their central interactions. We recently studied visual effects on posture control and their intersensory interactions and found evidence for the existence of an indirect and presumably cognitive mode of interaction, in addition to a direct interaction (we found, e.g., that a ‘virtual reality’ visual stimulus has a weaker postural effect than a ‘real world’ scene, because of its illusory character). Here we focus on the presumed cognitive interaction mechanism. We report experiments in healthy subjects and vestibular loss patients. We investigated to what extent a postural response to lateral platform tilt is modulated by tilt of a visual scene in an orthogonal rotational plane (anterior–posterior, a–p, direction). The a–p visual stimulus did not evoke a lateral postural response on its own. But it enhanced the response to the lateral platform tilt (i.e., it increased the evoked body excursion). The effect was related to the velocity of the visual stimulus, showed a threshold at 0.31°/s, and increased monotonically with increasing velocity. These characteristics were similar in normals and patients, but body excursions were larger in patients. In conclusion, the orthogonal stimulus arrangement in our experiments allowed us to selectively assess a cognitive intersensory interaction that upon co-planar stimulation tends to be merged with direct interaction. The observed threshold corresponds to the conscious perceptual detection threshold of the visual motion, which is clearly higher than the visual postural response threshold. This finding is in line with our notion of a cognitive phenomenon. We postulate that the cognitive mechanism in normals interferes with a central visual–vestibular interaction mechanism. 
This appears to be similar in vestibular loss patients, but patients use less effective somatosensory instead of vestibular anti-gravity mechanisms.


Neuroscience Letters | 1992

Interaction of vestibular and proprioceptive inputs for human self-motion perception

F. Hlavačka; T. Mergner; G. Schweigart

Human perception of horizontal self(body)-motion in space was studied during various combinations of vestibular and leg-proprioceptive stimuli in the dark. During sinusoidal rotations of the trunk relative to the stationary feet (functionally synergistic combination) the perception was almost veridical over the frequency range tested (0.025-0.4 Hz). This finding suggested a dominance of the proprioceptive over the vestibular input, since the quantitative aspects of the perception (gain, phase, and detection threshold): (a) closely resembled those of the proprioceptive foot-to-trunk perception, and (b) clearly differed from those of the vestibular self-motion perception. However, when using other combinations, the self-motion perception changed monotonically as a function of the two inputs, indicating that the two inputs interact linearly. In a model of these findings the interaction occurs in two stages: (1) summation of a vestibular trunk-in-space signal and a (dynamically matched) proprioceptive foot-to-trunk signal yields an internal representation of foot support motion in space; (2) superposition of the latter by an almost ideal proprioceptive trunk-to-foot signal results in a representation of trunk-in-space motion (essentially proprioception-dependent and ideal when the feet are stationary).
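The two-stage model can be illustrated with a minimal numerical sketch. This is not the authors' implementation; the variable names and the idealized unity-gain, perfectly matched signals are assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch of the two-stage vestibular-proprioceptive interaction
# described above, with idealized signals (no noise, unity gain, perfect
# dynamic matching).

f = 0.1                                            # stimulus frequency (Hz)
t = np.linspace(0.0, 1.0 / f, 500)

trunk_in_space = 5.0 * np.sin(2 * np.pi * f * t)   # deg; sinusoidal trunk rotation
feet_in_space = np.zeros_like(t)                   # feet stationary

# Proprioception encodes relative rotation between segments.
foot_to_trunk = feet_in_space - trunk_in_space     # feet relative to trunk
trunk_to_foot = trunk_in_space - feet_in_space     # trunk relative to feet

# Stage 1: vestibular trunk-in-space + proprioceptive foot-to-trunk
# -> internal representation of foot-support motion in space.
support_in_space = trunk_in_space + foot_to_trunk

# Stage 2: superposition with the trunk-to-foot signal
# -> representation of trunk-in-space motion.
perceived_trunk_in_space = support_in_space + trunk_to_foot

assert np.allclose(support_in_space, feet_in_space)
assert np.allclose(perceived_trunk_in_space, trunk_in_space)
```

With the feet stationary, stage 1 correctly reports a stationary support and the trunk-in-space estimate reduces to the proprioceptive trunk-to-foot signal, matching the abstract's observation that the percept is essentially proprioception-dependent in that condition.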


Vision Research | 1997

Gaze Stabilization by Optokinetic Reflex (OKR) and Vestibulo-ocular Reflex (VOR) During Active Head Rotation in Man

G Schweigart; T. Mergner; I Evdokimidis; S Morand; Wolfgang Becker

Vestibulo-ocular reflex (VOR)-optokinetic reflex (OKR) interaction was studied in normal human subjects during active sine-like head movements in the horizontal plane for a variety of vestibular-optokinetic stimulus combinations (frequency range, 0.05-1.6 Hz). At low to mid frequencies (< 0.2 Hz) the eyes tended to be stabilized on the optokinetic pattern, independently of whether the head, the pattern, or both were rotated. At higher frequencies, the OKR gain was attenuated and, in each of the differing stimulus combinations, the eyes became increasingly stabilized in space. Qualitatively similar results were obtained when, for the same visual-vestibular combinations, the head was passively rotated at 0.05 and 0.8 Hz. The data could be simulated by a model which assumes a linear interaction of vestibular and optokinetic signals. It considers the OKR with its negative feedback loop of primordial importance for image stabilization on the retina and the VOR only as a useful addition which compensates for the limited bandwidth of the OKR during high frequency/velocity head rotations in a stationary visual environment.
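The division of labor between OKR and VOR can be sketched as a complementary pair of linear elements. The first-order dynamics and the corner frequency below are assumptions chosen for illustration, not the paper's fitted model.

```python
import numpy as np

# Illustrative sketch: OKR modeled as a first-order low-pass element, VOR as
# its complement, so their linear sum yields unity gaze stabilization while
# the OKR alone is band-limited.
f_c = 0.2                                 # assumed OKR bandwidth (Hz), motivated
                                          # by the < 0.2 Hz regime described above

okr_gains = []
for f in (0.05, 0.2, 0.8, 1.6):           # frequencies spanning the study's range
    s = 1j * f / f_c                      # normalized frequency
    h_okr = 1.0 / (1.0 + s)               # OKR: dominant at low frequencies
    h_vor = 1.0 - h_okr                   # VOR: compensates OKR's limited bandwidth
    assert abs(h_okr + h_vor - 1.0) < 1e-12   # linear summation -> stable gaze
    okr_gains.append(abs(h_okr))

# OKR gain is attenuated with increasing frequency, as observed in the data.
assert all(a > b for a, b in zip(okr_gains, okr_gains[1:]))
```

The design point mirrors the abstract's conclusion: the OKR's negative feedback loop handles image stabilization at low frequencies, and the VOR fills in exactly where the OKR's bandwidth runs out.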


Neuropsychologia | 1999

Brain imaging in a patient with hemimicropsia

Jan Kassubek; M Otte; T Wolter; Mark W. Greenlee; T. Mergner; Carl Hermann Lücking

Hemimicropsia is an isolated misperception of the size of objects in one hemifield (objects appear smaller) which is, as a phenomenon of central origin, very infrequently reported in the literature. We present a case of hemimicropsia as a selective deficit of size and distance perception in the left hemifield without hemianopsia caused by a cavernous angioma with hemorrhage in the right occipitotemporal area. The symptom occurred only intermittently and was considered the consequence of a local irritation by the hemorrhage. Imaging data including a volume-rendering MR data set of the patient's brain were transformed to the 3-D stereotactic grid system by Talairach and warped to a novel digital 3-D brain atlas. Imaging analysis included functional MRI (fMRI) to analyse the patient's visual cortex areas (mainly V5) in relation to the localization of the hemangioma to establish physiological landmarks with respect to visual stimulation. The lesion was localized in the peripheral visual association cortex, Brodmann area (BA) 19, adjacent to BA 37, both of which are part of the occipitotemporal visual pathway. Additional psychophysical measurements revealed an elevated threshold for perceiving coherent motion, which we relate to a partial loss of function in V5, a region adjacent to the cavernoma. In our study, we localized for the first time a cerebral lesion causing micropsia by digital mapping in Talairach space using a 3-D brain atlas and topologically related it to fMRI data for visual motion. The localization of the brain lesion affecting BA 19 and the occipitotemporal visual pathway is discussed with respect to experimental and case report findings about the neural basis of object size perception.

Collaboration

Dive into T. Mergner's collaboration.

Top Co-Authors

H. Kimmig (University of Freiburg)
A. Lehmann (University of Freiburg)
C. Maurer (University of Freiburg)
C. Siebold (University of Freiburg)
F. Botti (University of Freiburg)