
Publications

Featured research published by G. Schweigart.


Experimental Brain Research | 1998

Eye movements evoked by proprioceptive stimulation along the body axis in humans

T. Mergner; G. Schweigart; F. Botti; A. Lehmann

Proprioceptive input arising from torsional body movements elicits small reflexive eye movements. The functional relevance of these eye movements is still unknown. We evaluated their slow components as a function of stimulus frequency and velocity. The horizontal eye movements of seven adult subjects were recorded using an infrared device, while horizontal rotations were applied at three segmental levels of the body [i.e., between head and shoulders (neck stimulus), shoulders and pelvis (trunk stimulus), and pelvis and feet (leg stimulus)]. The following results were obtained: (1) Sinusoidal leg stimulation evoked an eye response with the slow component in the direction of the movement of the feet, while the response to trunk and neck stimulation was oriented in the opposite direction (i.e., in that of the head). (2) In contrast, the gain behavior of all three responses was similar, with very low gain at mid- to high frequencies (tested up to 0.4 Hz) but increasing gain at low frequencies (down to 0.0125 Hz). We show that this gain behavior is mainly due to a gain nonlinearity for low angular velocities. (3) The responses were compatible with linear summation when an interaction series was tested in which the leg stimulus was combined with a vestibular stimulus. (4) There was good correspondence of the median gain curves when eye responses were compared with psychophysical responses (perceived body rotation in space; additionally recorded in the interaction series). However, correlation of gain values on a single-trial basis was poor. (5) During transient neck stimulation (smoothed position ramp), the neck response notably consisted of two components: an initial head-directed eye shift (phasic component) followed by a shift in the opposite direction (compensatory tonic component). Both leg and neck responses can be described by one simple dynamic model.
In the model the proprioceptive input is fed into the gaze network via two pathways which differ in their dynamics and directional sign. The model simulates either leg or neck responses by selecting an appropriate weight for the gain of one of the pathways (phasic component). The interaction results can also be simulated when a vestibular path is added. This model has similarities to one we recently proposed for human self-motion perception and postural control. A major difference, though, is that the proprioceptive input to the gaze-stabilizing network is weak (restricted to low velocities), unlike that used for perception and postural control. We hold that the former undergoes involution during ontogenesis, as subjects depend on the functionally more appropriate vestibulo-ocular reflex. Yet, the weak proprioceptive eye responses that remain may have some functional relevance. Their tonic component tends to stabilize the eyes by slowly shifting them toward the primary head position relative to the body support. This applies solely to the earth-horizontal plane in which the vestibular signal has no static sensitivity.
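The two-pathway structure described above can be sketched as a minimal discrete-time simulation. Everything numerical below (filter form, time constants, weights) is an illustrative assumption, not the paper's fitted model: a high-pass "phasic" pathway drives the eyes in the head direction, a low-pass "tonic" pathway drives them the opposite, compensatory way, and the two sum linearly.

```python
def neck_eye_response(stim_vel, dt=0.01, tau_phasic=0.3, tau_tonic=5.0,
                      w_phasic=0.2, w_tonic=0.1):
    """Toy two-pathway model of the proprioceptive eye response.

    stim_vel: neck stimulus velocity samples (deg/s, sampled at dt seconds).
    Returns slow-phase eye velocity; positive = head-directed.
    Time constants and weights are illustrative, not fitted values.
    """
    lp_phasic = 0.0  # low-pass state used to form the high-pass output
    lp_tonic = 0.0   # low-pass state of the tonic pathway
    out = []
    for v in stim_vel:
        lp_phasic += dt / tau_phasic * (v - lp_phasic)
        lp_tonic += dt / tau_tonic * (v - lp_tonic)
        phasic = v - lp_phasic   # high-pass: transient, head-directed
        tonic = lp_tonic         # low-pass: sustained, compensatory
        out.append(w_phasic * phasic - w_tonic * tonic)
    return out
```

For a sustained velocity step the sketch reproduces the biphasic shape reported above: an initial head-directed transient that decays, followed by a sustained response in the opposite, compensatory direction.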


Experimental Brain Research | 2006

A cognitive intersensory interaction mechanism in human postural control

A. Blümle; Christoph Maurer; G. Schweigart; T. Mergner

Human control of upright body posture involves inputs from several senses (visual, vestibular, proprioceptive, somatosensory) and their central interactions. We recently studied visual effects on posture control and their intersensory interactions and found evidence for the existence of an indirect and presumably cognitive mode of interaction, in addition to a direct interaction (we found, e.g., that a ‘virtual reality’ visual stimulus has a weaker postural effect than a ‘real world’ scene, because of its illusory character). Here we focus on the presumed cognitive interaction mechanism. We report experiments in healthy subjects and vestibular loss patients. We investigated to what extent a postural response to lateral platform tilt is modulated by tilt of a visual scene in an orthogonal rotational plane (anterior–posterior, a–p, direction). The a–p visual stimulus did not evoke a lateral postural response on its own. But it enhanced the response to the lateral platform tilt (i.e., it increased the evoked body excursion). The effect was related to the velocity of the visual stimulus, showed a threshold at 0.31°/s, and increased monotonically with increasing velocity. These characteristics were similar in normals and patients, but body excursions were larger in patients. In conclusion, the orthogonal stimulus arrangement in our experiments allowed us to selectively assess a cognitive intersensory interaction that upon co-planar stimulation tends to be merged with direct interaction. The observed threshold corresponds to the conscious perceptual detection threshold of the visual motion, which is clearly higher than the visual postural response threshold. This finding is in line with our notion of a cognitive phenomenon. We postulate that the cognitive mechanism in normals interferes with a central visual–vestibular interaction mechanism. 
This appears to be similar in vestibular loss patients, but patients use less effective somatosensory instead of vestibular anti-gravity mechanisms.
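The velocity dependence reported above (no effect below the ~0.31°/s perceptual detection threshold, monotonic growth above it) can be written as a simple gain-modulation rule. Only the threshold value comes from the abstract; the linear-above-threshold form and the slope are hypothetical illustration.

```python
VISUAL_DETECTION_THRESHOLD = 0.31  # deg/s, threshold reported in the study

def postural_gain_modulation(visual_velocity, slope=0.1):
    """Hypothetical modulation of the lateral postural response by an
    orthogonal (a-p) visual stimulus; 1.0 means no enhancement.
    Linear form and slope are illustrative assumptions, not fitted data.
    """
    excess = max(0.0, abs(visual_velocity) - VISUAL_DETECTION_THRESHOLD)
    return 1.0 + slope * excess
```

Sub-threshold stimuli leave the response unchanged; above threshold the enhancement grows monotonically with stimulus velocity, as in the reported data.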


Neuroscience Letters | 1992

Interaction of vestibular and proprioceptive inputs for human self-motion perception

F. Hlavačka; T. Mergner; G. Schweigart

Human perception of horizontal self(body)-motion in space was studied during various combinations of vestibular and leg-proprioceptive stimuli in the dark. During sinusoidal rotations of the trunk relative to the stationary feet (functionally synergistic combination), the perception was almost veridical over the frequency range tested (0.025-0.4 Hz). This finding suggested a dominance of the proprioceptive over the vestibular input, since the quantitative aspects of the perception (gain, phase, and detection threshold): (a) closely resembled those of the proprioceptive foot-to-trunk perception, and (b) clearly differed from those of the vestibular self-motion perception. However, with other stimulus combinations the self-motion perception changed monotonically as a function of the two inputs, indicating that the two inputs interact linearly. In a model of these findings, the interaction occurs in two stages: (1) summation of a vestibular trunk-in-space signal and a (dynamically matched) proprioceptive foot-to-trunk signal yields an internal representation of foot-support motion in space; (2) superposition of the latter with an almost ideal proprioceptive trunk-to-foot signal results in a representation of trunk-in-space motion (essentially proprioception-dependent and ideal when the feet are stationary).
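The two-stage interaction can be sketched numerically. In the sketch below the vestibular and foot-to-trunk proprioceptive signals share the same assumed first-order high-pass dynamics, which is what "dynamically matched" buys: their imperfections cancel in stage 1, and with stationary feet the stage 2 output reduces to the near-ideal trunk-to-foot signal. The filter form and time constant are illustrative assumptions.

```python
def high_pass(x, dt, tau):
    # first-order high-pass; stands in for the (matched) sensor dynamics
    out, lp = [], 0.0
    for v in x:
        lp += dt / tau * (v - lp)
        out.append(v - lp)
    return out

def perceived_trunk_in_space(trunk, foot, dt=0.01, tau=2.0):
    """Two-stage model sketch. trunk/foot: angular positions in space (deg)."""
    vest = high_pass(trunk, dt, tau)                      # vestibular trunk-in-space
    foot_to_trunk = [t - f for t, f in zip(trunk, foot)]  # trunk relative to feet
    prop = high_pass(foot_to_trunk, dt, tau)              # matched dynamics
    # stage 1: internal estimate of foot-support motion in space
    foot_est = [v - p for v, p in zip(vest, prop)]
    # stage 2: superpose the near-ideal proprioceptive trunk-to-foot signal
    return [fe + ft for fe, ft in zip(foot_est, foot_to_trunk)]
```

With the feet stationary, the stage 1 estimate is zero and the output equals the true trunk motion at every frequency, matching the near-veridical perception reported for the synergistic combination.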


Experimental Brain Research | 1999

Eye movements during combined pursuit, optokinetic and vestibular stimulation in macaque monkey.

G. Schweigart; T. Mergner; G. Barnes

During natural behaviour in a visual environment, smooth pursuit eye movements (SP) usually override the vestibulo-ocular reflex (VOR) and the optokinetic reflex (OKR), which stem from head-in-space and scene-relative-to-eye motion, respectively. We investigated the interaction of SP, VOR, and OKR, which is not fully understood to date. Eye movements were recorded in two macaque monkeys while applying various combinations of smooth eye pursuit, vestibular and optokinetic stimuli (sinusoidal horizontal rotations of visual target, chair and optokinetic pattern, respectively, at 0.025, 0.05, 0.1, 0.2, 0.4, and 0.8 Hz, corresponding to peak stimulus velocities of 1.25–40°/s for a standard stimulus of ±8°). Slow eye responses were analysed in terms of gain and phase. During SP at mid-frequencies, the eyes were almost perfectly on target (gain 0.98 at 0.1 Hz), independently of a concurrent vestibular or optokinetic stimulus. Pursuit gain at lower frequencies, although almost ideal (0.98 at 0.025 Hz with pursuit-only stimulation), became modified by the optokinetic input (gain increase above unity when the optokinetic stimulus had the same direction as the target, decrease with opposite direction). At higher stimulus frequencies, pursuit gain decreased (down to 0.69 at 0.8 Hz), and the pursuit response became modified by vestibular input (gain increase during functionally synergistic combinations, decrease in antagonistic combinations). Thus, the pursuit system in monkey dominates during SP–OKR–VOR interaction, but it does so effectively only in the mid-frequency range. The results can be described in the form of a simple dynamic model in which it is assumed that the three systems interact by linear summation.
In the model, SP and OKR dominate the VOR in the low- to mid-frequency/velocity range because they represent closed-loop systems with high internal gain values (>>1) at these frequencies/velocities, whereas the VOR represents an open-loop system with approximately unity gain (up to very high frequencies). SP dominance over OKR is obtained by allowing an 'attentional/volitional' mechanism to boost SP gain and a predictive mechanism to improve its dynamics.
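The dominance argument rests on the standard negative-feedback relation: a closed-loop system with internal gain G has overall gain G/(1+G), so any G >> 1 pins the response near unity, while an open-loop system transmits its gain directly. The sketch below illustrates this; the specific internal-gain values are illustrative back-calculations, not fitted parameters from the monkey data.

```python
def closed_loop_gain(internal_gain):
    # unity negative feedback: overall gain = G / (1 + G)
    return internal_gain / (1.0 + internal_gain)

# high internal SP gain at mid frequencies  -> near-perfect tracking
# low internal gain at high frequencies     -> reduced pursuit gain
```

Under this relation, the reported pursuit gain of 0.98 at 0.1 Hz corresponds to an internal gain of about 49 (0.98 = 49/50), whereas the reduced gain of 0.69 at 0.8 Hz corresponds to an internal gain near 2, consistent with the claim that closed-loop dominance holds only where the internal gain is high.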


Annals of the New York Academy of Sciences | 2009

Posture Control in Vestibular-Loss Patients

Thomas Mergner; G. Schweigart; Luminous Fennell; Christoph Maurer

Patients with chronic bilateral loss of vestibular function normally replace it with visual or haptic referencing to stationary surroundings, resulting in almost normal stance control. With eyes closed, however, they show abnormally large body sway and may tend to fall when there are external disturbances to the body or when standing on an unstable support surface. Patients’ postural responses depend on joint-angle proprioception and ground reaction–force cues (occasionally referred to as “somatosensory graviception”). The question is why the force cues do not allow patients to fully substitute for the lost vestibular cues. In recent years, four experimental situations were identified in which patients with eyes closed show impaired stance control or may even fall: (1) with unstable or compliant support (“inevitable falls”); (2) with large external disturbances such as support surface tilts or pull stimuli impacting on their bodies (leading to abnormally large body movements); (3) with fast body–support tilts (also abnormally large body movements); and (4) with transient support tilt (overshooting body–support stabilization and abnormally late body–space stabilization). When patients’ data were modeled, it was found that their problems stem mainly from the force cues. It was hypothesized that patients have difficulty decomposing this sensory information into its constituents in order to discard an active-force component. Normals do not have this difficulty, because the vestibular system performs the decomposition.


Studies in Visual Information Processing | 1995

Eye movements evoked by leg-proprioceptive and vestibular stimulation

Fabio M. Botti; G. Schweigart; Thomas Mergner

In intact humans, leg-proprioceptive stimulation by rotation of the feet under the stationary body induces nystagmus. The slow component of this leg-eye response reaches a considerable magnitude only at low stimulus frequencies/velocities. It appears to sum linearly with the vestibulo-ocular reflex (VOR) and to prevent VOR gain attenuation at low frequencies if the body is rotated on the stationary feet. Its normal function could be to aid eye stabilization during slow body sways.


Archive | 2007

Sensorimotor Control of Human Dynamic Behavior in Space Implemented into a Hominoid Robot

Thomas Mergner; Christoph Maurer; G. Schweigart

To what extent can we claim nowadays that we understand sensorimotor control of human dynamic behavior in space? We try here to answer this question by exploring whether the available knowledge base suffices to build a hominoid robot such that its sensorimotor control functions mimic those of humans. It is, actually, our aim to build such a robot. We want to use it, in a systems approach, for simulations to better understand human sensorimotor control functions. We posit that a systems approach is necessary to deal with this complex non-linear control. We are especially interested in the sensory aspects of the control, the inter-sensory interactions (‘multisensory integration’ or sensor fusion) and the spatio-temporal coordination. Psychophysical work in our laboratory showed that the brain creates from sensory inputs internal estimates of the physical stimuli in the outside world (i.e., of the external constellation that caused a particular set of sensor stimuli). For example, the brain derives from vestibular and proprioceptive signals an estimate of body support surface motion. It then uses these estimates for sensorimotor feedback control (rather than the ‘raw’ sensory signals such as the vestibular signal). We hold that this internal reconstruction of the external physics is required for appropriate spatio-temporal coordination of the behavior. However, a problem arises from non-ideal sensors. An example is the vestibular sensor, which shows pronounced low-frequency noise. The solution of this problem involves sensory re-weighting mechanisms. Based on the discovered sensor fusion principles, we built a hominoid robot for control of upright stance (which we consider a simple prototype of sensorimotor control). It mimics human stance control even in complex behavioral situations. We aim to use it to better understand sensorimotor deficits in neurological patients and to develop new therapy designs.


Archive | 1999

Proprioceptive Evoked Eye Movements

G. Schweigart; F. Botti; A. Lehmann; T. Mergner

Stimulation of neck afferents by torsion of the head relative to the trunk elicits the cervico-ocular reflex (COR; Barany 1906, 1918/19; Jurgens and Mergner 1989), and rotations of the lower trunk or the legs relative to the stationary upper body also elicit reflexive eye movements (Grahe 1926; Warabi 1978; Botti et al. 1995). We reevaluated these proprioceptive eye responses since their functional relevance was still unknown.


Studies in Visual Information Processing | 1994

Influence of Vestibular and Optokinetic Stimulation on Eye Fixation in the Macaque Monkey

G. Schweigart; Thomas Mergner

Eye fixation of a head-stationary visual target during head rotation in space (vestibular stimulation) and/or rotation of a visual scene relative to the head (optokinetic stimulation) was studied in a highly trained monkey using sinusoidal stimulation (0.025–0.8 Hz) in the horizontal plane. Fixation was found to be almost perfect in the 0.1–0.6 Hz frequency range, independent of the way in which the stimuli were combined. Suppression of both the vestibular and optokinetic reflexes (VOR and OKR, respectively) became clearly incomplete at frequencies below 0.1 Hz. In addition, VOR suppression also became incomplete at 0.8 Hz. A detailed interpretation of these findings was hampered by the fact that the periodicity of the stimulation affected the eye responses in different ways: VOR and OKR showed a habituation-like attenuation of gain across all frequencies tested, whereas smooth-pursuit eye movements appeared to profit from stimulus periodicity. Therefore, further studies with aperiodic stimulation appear necessary.


Experimental Brain Research | 1991

Human perception of horizontal trunk and head rotation in space during vestibular and neck stimulation

T. Mergner; C. Siebold; G. Schweigart; Wolfgang Becker

Collaboration

Top co-authors of G. Schweigart:

T. Mergner (University of Freiburg)
A. Lehmann (University of Freiburg)
F. Botti (University of Freiburg)
C. Siebold (University of Freiburg)