
Publication


Featured research published by Dimitris Voudouris.


Gait & Posture | 2009

Visual feedback training improves postural adjustments associated with moving obstacle avoidance in elderly women

Vassilia Hatzitaki; Dimitris Voudouris; Thomas Nikodelis; Ioannis G. Amiridis

The study examined the impact of visually guided weight shifting (WS) practice on the postural adjustments evoked in elderly women when avoiding collision with a moving obstacle while standing. Fifty-six healthy elderly women (70.9+/-5.7 years, 87.5+/-9.6 kg) were randomly assigned to one of three groups: a group that completed 12 sessions (25 min, 3 sessions/week) of WS practice in the anterior/posterior direction (A/P group, n=20), a group that performed the same practice in the medio/lateral direction (M/L group, n=20), and a control group (n=16). Pre- and post-training, participants were tested in a moving obstacle avoidance task. As a result of practice, postural response onset shifted closer to the time of collision with the obstacle. Side-to-side WS resulted in a reduction of the M/L sway amplitude and an increase of the trunk's velocity during avoidance. It is concluded that visually guided WS practice enhances the elderly's ability for on-line visuo-motor processing when avoiding collision, eliminating reliance on anticipatory scaling. Specifying the direction of WS seems to be critical for optimizing the transfer of training adaptations.


Journal of Neurophysiology | 2013

Ultra-fast selection of grasping points

Dimitris Voudouris; Jeroen B. J. Smeets; Eli Brenner

To grasp an object one needs to determine suitable positions on its surface for placing the digits and to move the digits to those positions. If the object is displaced during a reach-to-grasp movement, the digit movements are quickly adjusted. Do these fast adjustments only guide the digits to previously chosen positions on the surface of the object, or is the choice of contact points also constantly reconsidered? Subjects grasped a ball or a cube that sometimes rotated briefly when the digits started moving. The digits followed the rotation within 115 ms. When the object was a ball, subjects quickly counteracted the initial following response by reconsidering their choice of grasping points so that the digits ended at different positions on the rotated surface of the ball, and the ball was grasped with the preferred orientation of the hand. When the object was a cube, subjects sometimes counteracted the initial following response to grasp the cube by a different pair of sides. This altered choice of grasping points was evident within ∼160 ms of rotation onset, which is shorter than regular reaction times.


Human Movement Science | 2012

Do obstacles affect the selection of grasping points?

Dimitris Voudouris; Jeroen B. J. Smeets; Eli Brenner

The selection of grasping points, the positions at which the digits make contact with an object's surface in order to pick it up, depends on several factors. In this study, we examined the influence of obstacles on the selection of grasping points. Subjects reached to grasp a sphere placed on a table. Obstacles were placed either near the anticipated grasping points or near the anticipated elbow position at the time of contact with the object. In all cases, subjects adjusted the way they moved when there was an obstacle nearby, but only obstacles near the thumb had a consistent influence across subjects. In general, the influence of the obstacle increased as it was placed closer to the digit or elbow, rather than the subject grasping in a manner that would be appropriate for all conditions. This suggests that under these circumstances the configuration of the arm and hand at the moment of contact was a critical factor when selecting at which points to grasp the objects.


Journal of Motor Behavior | 2012

Do humans prefer to see their grasping points?

Dimitris Voudouris; Jeroen B. J. Smeets; Eli Brenner

To grasp an object the digits need to be placed at suitable positions on its surface. The selection of such grasping points depends on several factors. Here the authors examined whether being able to see 1 of the selected grasping points is such a factor. Subjects grasped large cylinders or oriented blocks that would normally be grasped with the thumb continuously visible and the final part of the index finger's trajectory occluded by the object in question. An opaque screen that hid the thumb's usual grasping point was used to examine whether individuals would choose a grip that was oriented differently to maintain vision of the thumb's grasping point. A transparent screen was used as a control. Occluding the thumb's grasping point made subjects move more carefully (adopting a larger grip aperture) and choose a slightly different grip orientation. However, the change in grip orientation was much too small to keep the thumb visible. The authors conclude that humans do not particularly aim for visible grasping points.


Experimental Brain Research | 2010

Does planning a different trajectory influence the choice of grasping points?

Dimitris Voudouris; Eli Brenner; Willemijn D. Schot; Jeroen B. J. Smeets

We examined whether the movement path is considered when selecting the positions at which the digits will contact the object’s surface (grasping points). Subjects grasped objects of different heights but with the same radius at various locations on a table. At some locations, one digit crossed to the side of the object opposite of where it started. In doing so, it moved over a short object whereas it curved around a tall object. This resulted in very different paths for different objects. Importantly, the selection of grasping points was unaffected. That subjects do not appear to consider the path when selecting grasping points suggests that the grasping points are selected before planning the movements towards those points.


PLOS ONE | 2016

Fixation Biases towards the Index Finger in Almost-Natural Grasping

Dimitris Voudouris; Jeroen B. J. Smeets; Eli Brenner

We use visual information to guide our grasping movements. When grasping an object with a precision grip, the two digits need to reach two different positions more or less simultaneously, but the eyes can only be directed to one position at a time. Several studies that have examined eye movements in grasping have found that people tend to direct their gaze near where their index finger will contact the object. Here we aimed at better understanding why people do so by asking participants to lift an object off a horizontal surface. They were to grasp the object with a precision grip while movements of their hand, eye and head were recorded. We confirmed that people tend to look closer to positions that a digit needs to reach more accurately. Moreover, we show that where they look as they reach for the object depends on where they were looking before, presumably because they try to minimize the time during which the eyes are moving so fast that no new visual information is acquired. Most importantly, we confirmed that people have a bias to direct gaze towards the index finger’s contact point rather than towards that of the thumb. In our study, this cannot be explained by the index finger contacting the object before the thumb. Instead, it appears to be because the index finger moves to a position that is hidden behind the object that is grasped, probably making this the place at which one is most likely to encounter unexpected problems that would benefit from visual guidance. However, this cannot explain the bias that was found in previous studies, where neither contact point was hidden, so it cannot be the only explanation for the bias.


Journal of Neurophysiology | 2017

Reach-relevant somatosensory signals modulate tactile suppression

Hanna Gertz; Dimitris Voudouris; Katja Fiehler

Tactile stimuli on moving limbs are typically attenuated during reach planning and execution. This phenomenon has been related to internal forward models that predict the sensory consequences of a movement. Tactile suppression is considered to occur due to a match between the actual and predicted sensory consequences of a movement, which might free capacities to process novel or task-relevant sensory signals. Here, we examined whether and how tactile suppression depends on the relevance of somatosensory information for reaching. Participants reached with their left or right index finger to the unseen index finger of their other hand (body target) or an unseen pad on a screen (external target). In the body target condition, somatosensory signals from the static hand were available for localizing the reach target. Vibrotactile stimuli were presented on the moving index finger before or during reaching or in a separate no-movement baseline block, and participants indicated whether they detected a stimulus. As expected, detection thresholds before or during reaching were higher compared with baseline. Tactile suppression was also stronger for reaches to body targets than external targets, as reflected by higher detection thresholds and lower precision of detectability. Moreover, detection thresholds were higher when reaching with the left than with the right hand. Our results suggest that tactile suppression is modulated by position signals from the target limb that are required to reach successfully to the own body. Moreover, limb dominance seems to affect tactile suppression, presumably due to disparate uncertainty of feedback signals from the moving limb.

NEW & NOTEWORTHY Tactile suppression on a moving limb has been suggested to release computational resources for processing other relevant sensory events. In the current study, we show that tactile sensitivity on the moving limb decreases more when reaching to body targets than external targets. This indicates that tactile perception can be modulated by allocating processing capacities to movement-relevant somatosensory information at the target location. Our results contribute to understanding tactile processing and predictive mechanisms in the brain.


Journal of Experimental Psychology: Human Perception and Performance | 2017

Enhancement and suppression of tactile signals during reaching.

Dimitris Voudouris; Katja Fiehler

The perception of tactile stimuli presented on a moving hand is systematically suppressed. Such suppression has been attributed to the limited capacity of the brain to process task-irrelevant sensory information. Here, we examined whether humans do not only suppress movement-irrelevant but also enhance in parallel movement-relevant tactile signals when performing a goal-directed reaching movement. Participants reached to either a visual (LED) or somatosensory target (thumb or index finger of their unseen static hand) and discriminated 2 simultaneously presented tactile stimuli: a reference stimulus on the little finger of their static hand and a comparison stimulus on the index finger of their moving hand. Thus, during somatosensory reaching the location of the reference stimulus was task-relevant. Tactile suppression, as reflected by the increased points-of-subjective-equality (PSE) and just-noticeable-differences (JND), was stronger during reaching to somatosensory than visual targets. In Experiment 2, we presented the reference stimulus at a task-irrelevant location (sternum) and found similar suppression for somatosensory and visual reaching. This suggests that participants enhanced the sensation of the reference stimulus at the target hand during somatosensory reaching in Experiment 1. This suggestion was confirmed in Experiment 3 using a detection task in which we found lower detection thresholds on the target hand during somatosensory but not during visual reaching. We postulate that humans can flexibly modulate their tactile sensitivity by suppressing movement-irrelevant and enhancing movement-relevant signals in parallel when executing a reaching movement.


Experimental Brain Research | 2015

Grasping an object comfortably: orientation information is held in memory.

K. Roche; Rebekka Verheij; Dimitris Voudouris; H. Chainay; Jeroen B. J. Smeets

It has been shown that memorized information can influence real-time visuomotor control. For instance, a previously seen object (prime) influences grasping movements toward a target object. In this study, we examined how general the priming effect is: does it depend on the orientation of the target object and the similarity between the prime and the target? To do so, we examined whether priming effects occurred for different orientations of the prime and the target objects and for primes that were either identical to the target object or only half of the target object. We found that for orientations of the target object that did not require an awkward grasp, the orientation of the prime could influence the initiation time and the final grip orientation. The priming effects on initiation time were only found when the whole target object was presented as prime, but not when only half of the target object was presented. The results suggest that a memory effect on real-time control is constrained by end-state comfort and by the relevance of the prime for the grasping movement, which might mean that the interactions between the ventral and dorsal pathways are task specific.


Attention Perception & Psychophysics | 2017

Spatial specificity of tactile enhancement during reaching

Dimitris Voudouris; Katja Fiehler

Tactile signals on a hand that serves as movement goal are enhanced during movement planning and execution. Here, we examined how spatially specific tactile enhancement is when humans reach to their own static hand. Participants discriminated two brief and simultaneously presented tactile stimuli: a comparison stimulus on the left thumb or little finger and a reference stimulus on the sternum. Tactile stimuli were presented either during right-hand reaching towards the left thumb or little finger or while holding both hands still (baseline). Consistent with our previous findings, stimuli on the left hand were perceived as stronger during movement than at baseline. However, tactile enhancement was not stronger when the stimuli were presented on the digit that served as reach target; thus, perception across the whole hand was uniformly enhanced. In Experiment 2, we also presented stimuli on the upper arm in half of the trials to reduce the expectation of the stimulus location. Tactile stimuli on the target hand, but not on the upper arm, were generally enhanced, supporting the idea of a spatial gradient of tactile enhancement. Overall, our findings argue for low spatial specificity of tactile enhancement at movement-relevant body parts, here the target hand.

Collaboration


Dive into Dimitris Voudouris's collaborations.

Top Co-Authors

Eli Brenner

VU University Amsterdam

Vassilia Hatzitaki

Aristotle University of Thessaloniki

Ioannis G. Amiridis

Aristotle University of Thessaloniki