Publication


Featured research published by Robert Volcic.


Experimental Brain Research | 2008

Allocentric and egocentric reference frames in the processing of three-dimensional haptic space

Robert Volcic; Astrid M. L. Kappers

The main goal of our study is to gain insight into the reference frames involved in three-dimensional haptic spatial processing. Previous research has shown that two-dimensional haptic spatial processing is prone to large systematic deviations. A weighted average model that identifies the origin of the systematic error patterns in the biasing influence of an egocentric reference frame on the allocentric reference frame was proposed as an explanation of the results. The basis of the egocentric reference frame was linked either to the hand or to the body. In the present study, participants had to construct a field of parallel bars that could be oriented in three dimensions. First, systematic error patterns were also found in this three-dimensional haptic parallelity task. Second, among the different models tested for their accuracy in explaining the error patterns, the hand-centered weighted average model proved to resemble the data most closely. A participant-specific weighting factor determined the biasing influence of the hand-centered egocentric reference frame. A shift of approximately 20% from the allocentric towards the egocentric frame of reference was observed. These results support the hypothesis that haptic spatial processing is a product of the interplay of diverse but synergistically operating frames of reference.
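In compact form, and purely as an illustrative sketch (the notation below is assumed here rather than taken from the paper), such a weighted average model can be written as:

```latex
% Illustrative weighted-average reference-frame model (assumed notation):
%   \varphi_{\mathrm{allo}} : orientation that would be veridically parallel
%   \varphi_{\mathrm{ego}}  : orientation predicted by the hand-centered frame
%   w                       : participant-specific egocentric weight
\bar{\varphi} = (1 - w)\,\varphi_{\mathrm{allo}} + w\,\varphi_{\mathrm{ego}},
\qquad w \approx 0.2
```

where a weight w of roughly 0.2 corresponds to the approximately 20% shift toward the egocentric frame reported above.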


The Journal of Neuroscience | 2013

Visuomotor adaptation changes stereoscopic depth perception and tactile discrimination

Robert Volcic; Carlo Fantoni; Corrado Caudek; John A. Assad; Fulvio Domini

Perceptual judgments of relative depth from binocular disparity are systematically distorted in humans, despite in principle having access to reliable 3D information. Interestingly, these distortions vanish at a natural grasping distance, as if perceived stereo depth is contingent on a specific reference distance for depth-disparity scaling that corresponds to the length of our arm. Here we show that the brain's representation of the arm indeed powerfully modulates depth perception, and that this internal calibration can be quickly updated. We used a classic visuomotor adaptation task in which subjects execute reaching movements with the visual feedback of their reaching finger displaced farther in depth, as if they had a longer arm. After adaptation, 3D perception changed dramatically and became accurate at the “new” natural grasping distance, the updated disparity-scaling reference distance. We further tested whether the rapid adaptive changes were restricted to the visual modality or were characteristic of sensory systems in general. Remarkably, we found an improvement in tactile discrimination consistent with a magnified internal image of the arm. This suggests that the brain integrates sensory signals with information about arm length, and quickly adapts to an artificially updated body structure. These adaptive processes are most likely a relic of the mechanisms needed to optimally correct for changes in size and shape of the body during ontogenesis.
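As a minimal sketch of the adaptation manipulation (all names and the choice of depth axis are assumptions; the paper's actual implementation is not given in the abstract), the visual feedback of the reaching finger can be rendered farther in depth than the physical fingertip, as if the arm were longer:

```python
import numpy as np

def displaced_feedback(finger_pos, depth_offset):
    """Rendered fingertip position during visuomotor adaptation.

    finger_pos   : (3,) array, physical fingertip position (x, y, z),
                   with z taken here as the depth axis (assumption).
    depth_offset : extra displacement in depth, simulating a longer arm.
    """
    rendered = finger_pos.copy()
    rendered[2] += depth_offset  # feedback displaced farther in depth
    return rendered

# Example: fingertip at 40 cm depth, feedback rendered at 46 cm
print(displaced_feedback(np.array([0.0, 0.0, 0.40]), 0.06))
```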


Attention, Perception, & Psychophysics | 2007

Haptic parallelity perception on the frontoparallel plane: The involvement of reference frames

Robert Volcic; Astrid M. L. Kappers; Jan J. Koenderink

It has been established that spatial representation in the haptic modality is subject to systematic distortions. In this study, the haptic perception of parallelity on the frontoparallel plane was investigated in a bimanual matching paradigm. Eight reference orientations and 23 combinations of stimulus locations were used. The current hypothesis from studies conducted on the horizontal and midsagittal planes presupposes that what is haptically perceived as parallel is a product of weighted contributions from both egocentric and allocentric reference frames. In our study, we assessed the correlation between deviations from veridicality and hand/arm postures, and found support for the role of an intermediate frame of reference in modulating haptic parallelity on the frontoparallel plane as well. Moreover, a subject-dependent biasing influence of the egocentric reference frame determines both the reversal of the oblique effect and a scaling effect in deviations as a function of bar position.


Experimental Brain Research | 2009

Haptic perception disambiguates visual perception of 3D shape

Maarten W. A. Wijntjes; Robert Volcic; Sylvia C. Pont; Jan J. Koenderink; Astrid M. L. Kappers

We studied the influence of haptics on visual perception of three-dimensional shape. Observers were shown pictures of an oblate spheroid in two different orientations. A gauge-figure task was used to measure their perception of the global shape. In the first two sessions only vision was used. The results showed that observers made large errors and interpreted the oblate spheroid as a sphere. They also mistook the rotated oblate spheroid for a prolate spheroid. In two subsequent sessions observers were allowed to touch the stimulus while performing the task. The visual input remained unchanged: the observers were looking at the picture and could not see their hands. The results revealed that observers perceived a shape that was different from the vision-only sessions and closer to the veridical shape. Whereas, in general, vision is subject to ambiguities that arise from interpreting the retinal projection, our study shows that haptic input helps to disambiguate and reinterpret the visual input more veridically.


Acta Psychologica | 2009

Haptic mental rotation revisited: multiple reference frame dependence

Robert Volcic; Maarten W. A. Wijntjes; Astrid M. L. Kappers

The nature of reference frames involved in haptic spatial processing was addressed by means of a haptic mental rotation task. Participants assessed the parity of two objects located in various spatial locations by exploring them with different hand orientations. The resulting response times were fitted with a triangle wave function. Phase shifts were found to depend on the relation between the hands and the objects, and between the objects and the body. We rejected the possibility that a single reference frame drives spatial processing. Instead, we found evidence of multiple interacting reference frames with the hand-centered reference frame playing the dominant role. We propose that a weighted average of the allocentric, the hand-centered and the body-centered reference frames influences the haptic encoding of spatial information. In addition, we showed that previous results can be reinterpreted within the framework of multiple reference frames. This mechanism has proved to be ubiquitously present in haptic spatial processing.
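As a hedged illustration of this analysis (function form, names, and starting values are assumptions, not taken from the paper), a triangle wave with a free phase shift can be fitted to response times as a function of the angular disparity between the two objects:

```python
import numpy as np
from scipy.optimize import curve_fit

def triangle_wave(angle_deg, baseline, slope, phase_deg):
    # Response time grows linearly with the angular distance from the
    # phase shift, folded into the range [0, 180] degrees.
    folded = np.abs(((angle_deg - phase_deg + 180.0) % 360.0) - 180.0)
    return baseline + slope * folded

# Illustrative data: angular disparities (deg) and mean response times (s)
angles = np.arange(0.0, 360.0, 45.0)
rts = triangle_wave(angles, 1.2, 0.008, 25.0)

params, _ = curve_fit(triangle_wave, angles, rts, p0=[1.0, 0.01, 0.0])
print("estimated phase shift: %.1f deg" % params[2])
```

A non-zero fitted phase shift indicates that response times are not minimal at 0° disparity, which is how the dependence on hand and body position manifests itself.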


Experimental Brain Research | 2010

Cross-modal visuo-haptic mental rotation: comparing objects between senses

Robert Volcic; Maarten W. A. Wijntjes; Erik C. Kool; Astrid M. L. Kappers

The simple experience of a coherent percept while looking and touching an object conceals an intriguing issue: different senses encode and compare information in different modality-specific reference frames. We addressed this problem in a cross-modal visuo-haptic mental rotation task. Two objects in various orientations were presented at the same spatial location, one visually and one haptically. Participants had to identify the objects as same or different. The relative angle between viewing direction and hand orientation was manipulated (Aligned versus Orthogonal). In an additional condition (Delay), a temporal delay was introduced between haptic and visual explorations while the viewing direction and the hand orientation were orthogonal to each other. Whereas the phase shift of the response time function was close to 0° in the Aligned condition, we observed a consistent phase shift in the hand’s direction in the Orthogonal condition. A phase shift, although reduced, was also found in the Delay condition. Counterintuitively, these results mean that seen and touched objects do not need to be physically aligned for optimal performance to occur. The present results suggest that the information about an object is acquired in separate visual and hand-centered reference frames, which directly influence each other and which combine in a time-dependent manner.


Journal of Neurophysiology | 2014

Effect of visual and haptic feedback on grasping movements

Chiara Bozzacchi; Robert Volcic; Fulvio Domini

Perceptual estimates of three-dimensional (3D) properties, such as the distance and depth of an object, are often inaccurate. Given the accuracy and ease with which we pick up objects, it may be expected that perceptual distortions do not affect how the brain processes 3D information for reach-to-grasp movements. Nonetheless, empirical results show that grasping accuracy is reduced when visual feedback of the hand is removed. Here we studied whether specific types of training could correct grasping behavior to perform adequately even when any form of feedback is absent. Using a block design paradigm, we recorded the movement kinematics of subjects grasping virtual objects located at different distances in the absence of visual feedback of the hand and haptic feedback of the object, before and after different training blocks with different feedback combinations (vision of the thumb and vision of thumb and index finger, with and without tactile feedback of the object). In the Pretraining block, we found systematic biases of the terminal hand position, the final grip aperture, and the maximum grip aperture like those reported in perceptual tasks. Importantly, the distance at which the object was presented modulated all these biases. In the Posttraining blocks only the hand position was partially adjusted, but final and maximum grip apertures remained unchanged. These findings show that when visual and haptic feedback are absent systematic distortions of 3D estimates affect reach-to-grasp movements in the same way as they affect perceptual estimates. Most importantly, accuracy cannot be learned, even after extensive training with feedback.
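For readers unfamiliar with these dependent measures, here is a minimal sketch of how grip aperture is typically derived from motion-capture data (array shapes and names are assumptions, not the authors' code):

```python
import numpy as np

def grip_aperture(thumb, index):
    """Frame-by-frame Euclidean distance between thumb and index markers.

    thumb, index : (n_frames, 3) arrays of 3D marker positions.
    """
    return np.linalg.norm(index - thumb, axis=1)

def mga_fga(thumb, index):
    """Maximum grip aperture (peak of the trace during the reach) and
    final grip aperture (value at movement offset)."""
    aperture = grip_aperture(thumb, index)
    return aperture.max(), aperture[-1]
```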


Experimental Brain Research | 2008

Differential effects of non-informative vision and visual interference on haptic spatial processing

Robert Volcic; Joram J. van Rheede; Albert Postma; Astrid M. L. Kappers

The primary purpose of this study was to examine the effects of non-informative vision and visual interference upon haptic spatial processing, which supposedly derives from an interaction between an allocentric and egocentric reference frame. To this end, a haptic parallelity task served as baseline to determine the participant-dependent biasing influence of the egocentric reference frame. As expected, large systematic participant-dependent deviations from veridicality were observed. In the second experiment we probed the effect of non-informative vision on the egocentric bias. Moreover, orienting mechanisms (gazing directions) were studied with respect to the presentation of haptic information in a specific hemispace. Non-informative vision proved to have a beneficial effect on haptic spatial processing. No effect of gazing direction or hemispace was observed. In the third experiment we investigated the effect of simultaneously presented interfering visual information on the haptic bias. Interfering visual information parametrically influenced haptic performance. The interplay of reference frames that subserves haptic spatial processing was found to be related to both the effects of non-informative vision and visual interference. These results suggest that spatial representations are influenced by direct cross-modal interactions; inter-participant differences in the haptic modality resulted in differential effects of the visual modality.


Experimental Brain Research | 2016

On-line visual control of grasping movements

Robert Volcic; Fulvio Domini

Even though it is recognized that vision plays an important role in grasping movements, it is not yet fully understood how the visual feedback of the hand contributes to on-line control. Visual feedback could be used to shape the posture of the hand and fingers, to adjust the trajectory of the moving hand, or a combination of both. Here, we used a dynamic perturbation method that altered the position of the visual feedback relative to the actual position of the thumb and index finger to virtually increase or decrease the visually sensed grip aperture. Subjects grasped objects in a virtual 3D environment with haptic feedback and with visual feedback provided by small virtual spheres anchored to their unseen fingertips. We found that the effects of the visually perturbed grip aperture arose primarily late in the movement, when the hand was in the object's proximity. The on-line visual feedback assisted both the scaling of the grip aperture to properly conform it to the object's dimensions and the transport of the hand to correctly position the digits on the object's surface. However, the extent of these compensatory adjustments was contingent on the viewing geometry. The visual control of the actual grip aperture was mainly observed when the final grasp axis orientation was approximately perpendicular to the viewing direction. On the contrary, when the final grasp axis was aligned with the viewing direction, the visual control was predominantly concerned with the guidance of the digits toward the visible final contact point.
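A minimal sketch of the dynamic perturbation (all names are assumptions; the abstract does not specify the implementation): the two feedback spheres are shifted apart or together along the thumb-index axis, so that the visually sensed grip aperture differs from the actual one while the midpoint of the hand is left untouched:

```python
import numpy as np

def perturb_feedback(thumb, index, delta):
    """Perturbed positions of the two visual feedback spheres.

    thumb, index : (3,) arrays, actual fingertip positions.
    delta        : change in visual grip aperture; positive values show a
                   larger aperture than the real one, negative a smaller one.
    """
    axis = index - thumb
    axis = axis / np.linalg.norm(axis)  # unit thumb-to-index axis
    return thumb - 0.5 * delta * axis, index + 0.5 * delta * axis
```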


Proceedings of SPIE | 2014

A framework for the study of vision in active observers

Carlo Nicolini; Carlo Fantoni; Giovanni Mancuso; Robert Volcic; Fulvio Domini

We present a framework for the study of active vision, i.e., the functioning of the visual system during actively self-generated body movements. In laboratory settings, human vision is usually studied with a static observer looking at static or, at best, dynamic stimuli. In the real world, however, humans constantly move within dynamic environments. The resulting visual inputs are thus an intertwined mixture of self- and externally-generated movements. To fill this gap, we developed a virtual environment integrated with a head-tracking system in which the influence of self- and externally-generated movements can be manipulated independently. As a proof of principle, we studied perceptual stationarity of the visual world during lateral translation or rotation of the head. The movement of the visual stimulus was thus parametrically tethered to self-generated movements. We found that estimates of object stationarity were less biased and more precise during head rotation than during translation. In both cases the visual stimulus had to partially follow the head movement to be perceived as immobile. We discuss a range of possible uses for our setup, including the study of shape perception in active and passive conditions, in which the same optic flow is replayed to stationary observers.
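The parametric tethering described above reduces to a single gain term; a minimal sketch with assumed names (not from the paper):

```python
def tethered_position(rest_pos, head_displacement, gain):
    """Stimulus position tethered to self-generated head movement.

    gain = 0 keeps the stimulus world-stationary, gain = 1 locks it to
    the head; intermediate values make it partially follow the head,
    the regime in which observers judged the stimulus as immobile.
    """
    return rest_pos + gain * head_displacement

# e.g., during a 5 cm lateral head translation, a gain of 0.2 moves the
# stimulus 1 cm in the same direction:
print(tethered_position(0.0, 0.05, 0.2))
```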

Collaboration


Dive into Robert Volcic's collaborations.

Top Co-Authors

Chiara Bozzacchi
Sapienza University of Rome

Maarten W. A. Wijntjes
Delft University of Technology

Jan J. Koenderink
Katholieke Universiteit Leuven

Ivan Camponogara
New York University Abu Dhabi

Zoltan Derzsi
New York University Abu Dhabi