
Publication


Featured research published by Marc O. Ernst.


Nature | 2002

Humans integrate visual and haptic information in a statistically optimal fashion

Marc O. Ernst; Martin S. Banks

When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual–haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose that a general principle, which minimizes variance in the final estimate, determines the degree to which vision or haptics dominates. This principle is realized by using maximum-likelihood estimation to combine the inputs. To investigate cue combination quantitatively, we first measured the variances associated with visual and haptic estimation of height. We then used these measurements to construct a maximum-likelihood integrator. This model behaved very similarly to humans in a visual–haptic task. Thus, the nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator. Visual dominance occurs when the variance associated with visual estimation is lower than that associated with haptic estimation.
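
The maximum-likelihood rule described here amounts to a reliability-weighted average of the unimodal estimates, with more weight given to the less noisy cue. A minimal sketch of that computation is shown below; the numbers are purely illustrative and are not measurements from the paper.

```python
import numpy as np

def mle_combine(est_v, sigma_v, est_h, sigma_h):
    """Reliability-weighted (maximum-likelihood) combination of a visual
    and a haptic estimate. Weights are proportional to the inverse
    variance (reliability) of each cue."""
    r_v, r_h = 1.0 / sigma_v**2, 1.0 / sigma_h**2   # cue reliabilities
    w_v = r_v / (r_v + r_h)                          # visual weight
    w_h = 1.0 - w_v                                  # haptic weight
    est_vh = w_v * est_v + w_h * est_h               # combined estimate
    sigma_vh = np.sqrt(1.0 / (r_v + r_h))            # combined noise
    return est_vh, sigma_vh

# Illustrative values (not data from the paper): a low-noise 55 mm visual
# height estimate dominates a noisier 60 mm haptic estimate.
print(mle_combine(est_v=55.0, sigma_v=1.0, est_h=60.0, sigma_h=3.0))
```

With these illustrative noise levels the combined estimate lies close to the visual value, and its predicted standard deviation is lower than either unimodal one, which is the signature of optimal integration the paper tests.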


Trends in Cognitive Sciences | 2004

Merging the Senses into a Robust Percept

Marc O. Ernst; Heinrich H. Bülthoff

To perceive the external environment our brain uses multiple sources of sensory information derived from several different modalities, including vision, touch and audition. All these different sources of information have to be efficiently merged to form a coherent and robust percept. Here we highlight some of the mechanisms that underlie this merging of the senses in the brain. We show that, depending on the type of information, different combination and integration strategies are used and that prior knowledge is often required for interpreting the sensory signals.


PLOS ONE | 2011

The Rubber Hand Illusion: Feeling of Ownership and Proprioceptive Drift Do Not Go Hand in Hand

Marieke Rohde; Massimiliano Di Luca; Marc O. Ernst

In the Rubber Hand Illusion, the feeling of ownership of a rubber hand displaced from a participant’s real occluded hand is evoked by synchronously stroking both hands with paintbrushes. A change of perceived finger location towards the rubber hand (proprioceptive drift) has been reported to correlate with this illusion. To measure the time course of proprioceptive drift during the Rubber Hand Illusion, we regularly interrupted stroking (performed by robot arms) to measure perceived finger location. Measurements were made by projecting a probe dot into the field of view (using a semi-transparent mirror) and asking participants whether the dot was to the left or to the right of their invisible hand (Experiment 1) or to adjust the position of the dot to that of their invisible hand (Experiment 2). We varied both the measurement frequency (every 10 s, 40 s, 120 s) and the mode of stroking (synchronous, asynchronous, just vision). Surprisingly, with frequent measurements, proprioceptive drift occurs not only in the synchronous stroking condition but also in the two control conditions (asynchronous stroking, just vision). Proprioceptive drift in the synchronous stroking condition is never higher than in the just vision condition. Only continuous exposure to asynchronous stroking prevents proprioceptive drift and thus replicates the differences in drift reported in the literature. By contrast, complementary subjective ratings (questionnaire) show that the feeling of ownership requires synchronous stroking and is not present in the asynchronous stroking condition. Thus, subjective ratings and drift are dissociated. We conclude that different mechanisms of multisensory integration are responsible for proprioceptive drift and the feeling of ownership. Proprioceptive drift relies on visuoproprioceptive integration alone, a process that is inhibited by asynchronous stroking, the most common control condition in Rubber Hand Illusion experiments. This dissociation implies that conclusions about feelings of ownership cannot be drawn from measuring proprioceptive drift alone.


Journal of Vision | 2005

Focus Cues Affect Perceived Depth

Simon J. Watt; Kurt Akeley; Marc O. Ernst; Martin S. Banks

Depth information from focus cues (accommodation and the gradient of retinal blur) is typically incorrect in three-dimensional (3-D) displays because the light comes from a planar display surface. If the visual system incorporates information from focus cues into its calculation of 3-D scene parameters, this could cause distortions in perceived depth even when the 2-D retinal images are geometrically correct. In Experiment 1 we measured the direct contribution of focus cues to perceived slant by varying independently the physical slant of the display surface and the slant of a simulated surface specified by binocular disparity (binocular viewing) or perspective/texture (monocular viewing). In the binocular condition, slant estimates were unaffected by display slant. In the monocular condition, display slant had a systematic effect on slant estimates. Estimates were consistent with a weighted average of slant from focus cues and slant from disparity/texture, where the cue weights are determined by the reliability of each cue. In Experiment 2, we examined whether focus cues also have an indirect effect on perceived slant via the distance estimate used in disparity scaling. We varied independently the simulated distance and the focal distance to a disparity-defined 3-D stimulus. Perceived slant was systematically affected by changes in focal distance. Accordingly, depth constancy (with respect to simulated distance) was significantly reduced when focal distance was held constant compared to when it varied appropriately with the simulated distance to the stimulus. The results of both experiments show that focus cues can contribute to estimates of 3-D scene parameters. Inappropriate focus cues in typical 3-D displays may therefore contribute to distortions in perceived space.


Nature Neuroscience | 2000

Touch can change visual slant perception

Marc O. Ernst; Martin S. Banks; Heinrich H. Bülthoff

The visual system uses several signals to deduce the three-dimensional structure of the environment, including binocular disparity, texture gradients, shading and motion parallax. Although each of these sources of information is independently insufficient to yield reliable three-dimensional structure from everyday scenes, the visual system combines them by weighting the available information; altering the weights would therefore change the perceived structure. We report that haptic feedback (active touch) increases the weight of a consistent surface-slant signal relative to inconsistent signals. Thus, the appearance of a subsequently viewed surface is changed: the surface appears slanted in the direction specified by the haptically reinforced signal.


Journal of Vision | 2008

The statistical determinants of adaptation rate in human reaching

Johannes Burge; Marc O. Ernst; Martin S. Banks

Rapid reaching to a target is generally accurate but also contains random and systematic error. Random errors result from noise in visual measurement, motor planning, and reach execution. Systematic error results from systematic changes in the mapping between the visual estimate of target location and the motor command necessary to reach the target (e.g., new spectacles, muscular fatigue). Humans maintain accurate reaching by recalibrating the visuomotor system, but no widely accepted computational model of the process exists. Given certain boundary conditions, a statistically optimal solution is a Kalman filter. We compared human behavior with Kalman filter behavior to determine how humans take into account the statistical properties of errors and the reliability with which those errors can be measured. For most conditions, human and Kalman filter behavior was similar: Increasing measurement uncertainty caused similar decreases in recalibration rate; directionally asymmetric uncertainty caused different rates in different directions; more variation in systematic error increased recalibration rate. However, behavior differed in one respect: Inserting random error by perturbing feedback position caused slower adaptation in Kalman filters but had no effect in humans. This difference may be due to how biological systems remain responsive to changes in environmental statistics. We discuss the implications of this work.
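
For intuition, the Kalman filter referred to here can be reduced to a scalar filter that tracks a slowly drifting visuomotor offset; its gain (the fraction of each observed reach error corrected on the next trial) plays the role of the adaptation rate. The sketch below is a generic illustration under assumed noise parameters, not the specific model fitted in the paper.

```python
import numpy as np

def kalman_adaptation(errors, q=0.05, r=1.0):
    """Scalar Kalman filter tracking a drifting visuomotor offset.

    errors : observed reach errors (visual feedback), one per trial
    q      : assumed variance of random drift in the true mapping
    r      : assumed variance of the error measurement (feedback noise)
    Returns the offset estimate after each trial; the Kalman gain sets
    how fast the estimate (and hence the reach) is recalibrated.
    """
    x, p = 0.0, 1.0            # initial offset estimate and its uncertainty
    estimates = []
    for e in errors:
        p = p + q              # predict: uncertainty grows with drift
        k = p / (p + r)        # Kalman gain = adaptation rate
        x = x + k * (e - x)    # update estimate toward the observed error
        p = (1.0 - k) * p      # uncertainty shrinks after the measurement
        estimates.append(x)
    return np.array(estimates)

# Illustrative run: a constant 2-unit perturbation corrupted by feedback noise.
rng = np.random.default_rng(0)
observed_errors = 2.0 + rng.normal(0.0, 1.0, size=50)
print(kalman_adaptation(observed_errors)[-5:])   # estimates converge toward ~2
```

In this sketch, raising r (less reliable error measurements) lowers the gain and slows recalibration, while raising q (more variable systematic error) raises it, mirroring the qualitative comparisons described above.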


Psychological Science | 2001

Viewpoint Dependence in Visual and Haptic Object Recognition

Fiona N. Newell; Marc O. Ernst; Bosco S. Tjan; Heinrich H. Bülthoff

On the whole, people recognize objects best when they see the objects from a familiar view and worse when they see the objects from views that were previously occluded from sight. Unexpectedly, we found haptic object recognition to be viewpoint-specific as well, even though hand movements were unrestricted. This viewpoint dependence was due to the hands preferring the back “view” of the objects. Furthermore, when the sensory modalities (visual vs. haptic) differed between learning an object and recognizing it, recognition performance was best when the objects were rotated back-to-front between learning and recognition. Our data indicate that the visual system recognizes the front view of objects best, whereas the hand recognizes objects best from the back.


Experimental Brain Research | 2007

Optimal integration of shape information from vision and touch

Hanna B. Helbig; Marc O. Ernst

Many tasks can be carried out by using several sources of information. For example, an object’s size and shape can be judged based on visual as well as haptic cues. It has been shown recently that human observers integrate visual and haptic size information in a statistically optimal fashion, in the sense that the integrated estimate is most reliable (Ernst and Banks in Nature 415:429–433, 2002). In the present study, we tested whether this holds also for visual and haptic shape information. In previous studies virtual stimuli were used to test for optimality in integration. Virtual displays may, however, contain additional inappropriate cues that provide conflicting information and thus affect cue integration. Therefore, we studied optimal integration using real objects. Furthermore, we presented visual information via mirrors to create a spatial separation between visual and haptic cues while observers saw their hand touching the object and thus knew that they were seeing and feeling the same object. Does this knowledge promote integration even though the signals are spatially discrepant, which has been shown to lead to a breakdown of integration (Gepshtein et al. in J Vis 5:1013–1023, 2005)? Consistent with the model predictions, observers weighted visual and haptic cues to shape according to their reliability: progressively more weight was given to haptics when visual information became less reliable. Moreover, the integrated visual–haptic estimate was more reliable than either unimodal estimate. These findings suggest that observers integrate visual and haptic shape information of real 3D objects. Thereby, knowledge that multisensory signals arise from the same object seems to promote integration.


Journal of Vision | 2005

The combination of vision and touch depends on spatial proximity

Sergei Gepshtein; Johannes Burge; Marc O. Ernst; Martin S. Banks

The nervous system often combines visual and haptic information about object properties such that the combined estimate is more precise than with vision or haptics alone. We examined how the system determines when to combine the signals. Presumably, signals should not be combined when they come from different objects. The likelihood that signals come from different objects is highly correlated with the spatial separation between the signals, so we asked how the spatial separation between visual and haptic signals affects their combination. To do this, we first created conditions for each observer in which the effect of combination (the increase in discrimination precision with two modalities relative to performance with one modality) should be maximal. Then under these conditions, we presented visual and haptic stimuli separated by different spatial distances and compared human performance with predictions of a model that combined signals optimally. We found that discrimination precision was essentially optimal when the signals came from the same location, and that discrimination precision was poorer when the signals came from different locations. Thus, the mechanism of visual-haptic combination is specialized for signals that coincide in space.


Experimental Brain Research | 2005

Feeling what you hear: auditory signals can modulate tactile tap perception

Jean-Pierre Bresciani; Marc O. Ernst; Knut Drewing; Guillaume Bouyer; Vincent Maury; Abderrahmane Kheddar

We tested whether auditory sequences of beeps can modulate the tactile perception of sequences of taps (two to four taps per sequence) delivered to the index fingertip. In the first experiment, the auditory and tactile sequences were presented simultaneously. The number of beeps delivered in the auditory sequence was either the same as, fewer than, or more than the number of taps of the simultaneously presented tactile sequence. Though task-irrelevant (subjects were instructed to focus on the tactile stimuli), the auditory stimuli systematically modulated subjects’ tactile perception; in other words, subjects’ responses depended significantly on the number of delivered beeps. Such modulation only occurred when the auditory and tactile stimuli were similar enough. In the second experiment, we tested whether the automatic auditory-tactile integration depends on simultaneity or whether a bias can be evoked when the auditory and tactile sequences are presented in temporal asynchrony. Audition significantly modulated tactile perception when the stimuli were presented simultaneously, but this effect gradually disappeared when a temporal asynchrony was introduced between the auditory and tactile stimuli. These results show that when provided with auditory and tactile sensory signals that are likely to be generated by the same stimulus, the central nervous system (CNS) tends to automatically integrate these signals.
