Network


Latest external collaborations at the country level.

Hotspot


Research topics in which Krista Overvliet is active.

Publication


Featured research published by Krista Overvliet.


Neuropsychologia | 2011

Somatosensory saccades reveal the timing of tactile spatial remapping.

Krista Overvliet; Elena Azañón; Salvador Soto-Faraco

Remapping tactile events from skin to external space is an essential process for human behaviour. It allows us to refer tactile sensations to their actual externally based location by combining anatomically based somatosensory information with proprioceptive information about the current body posture. We examined the time course of tactile remapping by recording speeded saccadic responses to somatosensory stimuli delivered to the hands. We conducted two experiments in which arm posture varied (crossed or uncrossed), so that anatomical and external frames of reference were either put in spatial conflict or aligned. Saccade onset latencies were slower in the crossed-hands condition than in the uncrossed-hands condition, suggesting that, with the hands crossed, remapping had to be completed before a correct saccade could be executed. Saccades to tactile stimuli with the hands crossed were sometimes initiated in the wrong direction and then corrected in flight, resulting in a turn-around saccade. These turn-around saccades were more likely to occur at short latencies than at the onset latencies of saccades that went straight to the target, suggesting that participants postponed their saccades until the tactile event was represented according to the current body posture. We propose that the difference in saccade onset latency between crossed and uncrossed hand postures, and the difference between the onset of a turn-around saccade and of a straight saccade in the crossed posture, reveal the timing of tactile spatial remapping.


Acta Psychologica | 2008

The use of proprioception and tactile information in haptic search

Krista Overvliet; Jeroen B. J. Smeets; Eli Brenner

To investigate how tactile and proprioceptive information are used in haptic object discrimination, we conducted a haptic search task in which participants had to search for either a cylinder, a bar, or a rotated cube within a grid of aligned cubes. Tactile information from one finger is enough to detect a cylinder amongst the cubes. For detecting a bar or a rotated cube amongst cubes, touch alone is not enough. For the rotated cube this is evident because its shape is identical to that of the non-targets, so proprioception must provide information about the orientation of the fingers and hand when touching it. For the bar, one needs either proprioceptive information about the distance and direction of a single finger's movements along the surfaces, or proprioceptive information from several fingers when they touch it simultaneously. When using only one finger, search times for the bar were much longer than those for the other two targets. When the whole hand or both hands were used, the search times were similar for all shapes. Most errors were made when searching for the rotated cube, probably due to systematic posture-related biases in judging orientation on the basis of proprioception. The results suggest that tactile and proprioceptive information are readily combined for shape discrimination.


Experimental Brain Research | 2007

Haptic search with finger movements: using more fingers does not necessarily reduce search times

Krista Overvliet; Jeroen B. J. Smeets; Eli Brenner

Two haptic serial search tasks were used to investigate how the separations between items, and the number of fingers used to scan them, influence the search time and search strategy. In both tasks participants had to search for a target (cross) between a fixed number of non-targets (circles). The items were placed in a straight line. The target’s position was varied within blocks, and inter-item separation was varied between blocks. In the first experiment participants used their index finger to scan the display. As expected, search time depended on target position as well as on item separation. For larger separations participants’ movements were jerky, resembling ‘saccades’ and ‘fixations’, while for the shortest separation the movements were smooth. When only considering time in contact with an item, search times were the same for all separation conditions. Furthermore, participants never continued their movement after they encountered the target. These results suggest that participants did not use the time during which they were moving between the items to process information about the items. The search times were a little shorter than those in a static search experiment (Overvliet et al. in Percept Psychophys, 2007a), where multiple items were presented to the fingertips simultaneously. To investigate whether this is because the finger was moving or because only one finger was stimulated, we conducted a second experiment in which we asked participants to put three fingers in line and use them together to scan the items. Doing so increased the time in contact with the items for all separations, so search times were presumably longer in the static search experiment because multiple fingers were involved. This may be caused by the time that it takes to switch from one finger to the other.


Experimental Brain Research | 2011

Relative finger position influences whether you can localize tactile stimuli

Krista Overvliet; Helen A. Anema; Eli Brenner; H. C. Dijkerman; Jeroen B. J. Smeets

To investigate whether the relative positions of the fingers influence tactile localization, participants were asked to localize tactile stimuli applied to their fingertips. We measured the location and rate of errors for three finger configurations: fingers stretched out and together so that they touched each other, fingers stretched out and spread apart maximally, and fingers stretched out with the two hands on top of each other so that the fingers were interwoven. When the fingers contact each other, the error rate for adjacent fingers is likely to be higher than when the fingers are spread apart; in particular, we reasoned that localization would probably improve when the fingers are spread. We aimed to assess whether such adjacency is defined in external coordinates (taking proprioception into account) or on the body (in skin coordinates). The results confirmed that the error rate was lower when the fingers were spread. However, there was no decrease in the error rate for neighbouring fingertips in the fingers-spread condition in comparison with the fingers-together condition. In an additional experiment, we showed that the lower error rate when the fingers were spread was not related to the continuous tactile input from the neighbouring fingers when the fingers were together. The current results suggest that information from proprioception is taken into account in perceiving the location of a stimulus on one of the fingertips.


Attention Perception & Psychophysics | 2007

Parallel and serial search in haptics.

Krista Overvliet; Jeroen B. J. Smeets; Eli Brenner

We propose a model that distinguishes between parallel and serial search in haptics. To test this model, participants performed three haptic search experiments in which a target and distractors were presented to their fingertips. The participants indicated a target’s presence by lifting the corresponding finger, or its absence by lifting all fingers. In one experiment, the target was a cross and the distractors were circles. In another, the target was a vertical line and the distractors were horizontal lines. In both cases, we found a serial search pattern. In a final experiment, the target was a horizontal line and the distractors were surfaces without any contours. In this case, we found a parallel search pattern. We conclude that the model can describe our data very well.
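The parallel/serial distinction in search tasks is usually read off the slope of response time against the number of items: a serial pattern grows roughly linearly with set size, while a parallel pattern stays flat. As a minimal illustration of that slope analysis (this is a standard textbook sketch with made-up numbers, not the authors' actual model or data):

```python
import numpy as np

def search_slope(set_sizes, rts):
    """Least-squares slope of response time vs. set size (ms per item)."""
    slope, _intercept = np.polyfit(set_sizes, rts, 1)
    return slope

# Hypothetical mean response times (ms) for displays of 2, 4, and 6 items.
set_sizes = np.array([2, 4, 6])
serial_present = np.array([540, 620, 700])  # ~40 ms/item: serial, self-terminating
serial_absent  = np.array([580, 740, 900])  # ~80 ms/item: every item must be checked
parallel_rts   = np.array([510, 512, 515])  # near-flat: parallel search

print(search_slope(set_sizes, serial_present))  # 40 ms/item
print(search_slope(set_sizes, serial_absent))   # 80 ms/item
print(search_slope(set_sizes, parallel_rts))    # ~1 ms/item
```

In a serial self-terminating search, target-absent slopes are about twice the target-present slopes, since on absent trials every item is inspected while on present trials on average half are.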


Experimental Brain Research | 2010

Serial search for fingers of the same hand but not for fingers of different hands

Krista Overvliet; Jeroen B. J. Smeets; Eli Brenner

In most haptic search tasks, tactile stimuli are presented to the fingers of both hands. In such tasks, the search pattern for some object features, such as the shape of raised line symbols, has been found to be serial. The question is whether this search is serial over all fingers irrespective of the hand, or whether it is serial over the fingers of each hand and parallel over the two hands. To investigate this issue, we determined the speed of static haptic search when two items are presented to two fingers of the same hand and when two items are presented to two fingers of different hands. We compared the results with predictions for parallel and serial search based on the results of a previous study using the same items and a similar task. The results indicate that two fingers of the same hand process information in a serial manner, while two fingers of two different hands process information in parallel. Thus, considering the individual fingers as independent units in haptic search may not be justified, because the hand that they belong to matters.


Neuropsychologia | 2011

Integration of tactile input across fingers in a patient with finger agnosia.

Helen A. Anema; Krista Overvliet; Jeroen B. J. Smeets; Eli Brenner; H. Chris Dijkerman

Finger agnosia has been described as an inability to explicitly individuate between the fingers, possibly due to fused neural representations of these fingers. Are patients with finger agnosia therefore unable to keep tactile information perceived over several fingers separate? Here, we tested a patient with finger agnosia (GO) on two tasks that measured the ability to keep tactile information simultaneously perceived by individual fingers separate. In Experiment 1, GO performed a haptic search task in which a target (the absence of a protruding line) needed to be identified among distractors (protruding lines). The lines were presented simultaneously to the fingertips of both hands. Like the controls, her reaction time decreased when her fingers were aligned as compared to when her fingers were stretched and in an unaligned position. This suggests that she can keep tactile input from different fingers separate. In Experiment 2, GO was required to judge the position of a target tactile stimulus to the index finger relative to a reference tactile stimulus to the middle finger, with the fingers both uncrossed and crossed. GO was able to indicate the relative position of the target stimulus as well as healthy controls could, which indicates that she was able to keep tactile information perceived by two neighbouring fingers separate. Interestingly, GO performed better than the healthy controls in the fingers-crossed condition. Together, these results suggest that GO is able to implicitly distinguish between tactile information perceived by multiple fingers. We therefore conclude that finger agnosia is not caused by minor disruptions of low-level somatosensory processing. These findings further underpin the idea that a selectively impaired higher-order body representation restricted to the fingers is the underlying cause of finger agnosia.


Psychology and Aging | 2013

The Effects of Aging on Haptic 2D Shape Recognition

Krista Overvliet; Johan Wagemans; Ralf Thomas Krampe

We use the image-mediation model (Klatzky & Lederman, 1987) as a framework to investigate potential sources of adult age differences in the haptic recognition of two-dimensional (2D) shapes. This model states that the low-resolution, temporally sequential haptic input is translated into a visual image, which is then reperceived through the visual processors before it is matched against a long-term memory representation and named. In three experiments, we tested three groups of 12 older adults (mean age 73.11) and three groups of 12 young adults (mean age 22.80). In Experiment 1 we confirmed age-related differences in haptic 2D shape recognition and found the typical age × complexity interaction. In Experiment 2 we showed that if the visual translation process is facilitated, age differences become smaller, but only with simple shapes and not with the more complex everyday objects. In Experiment 3 we targeted the last step in the model (matching and naming) for complex stimuli, and found that age differences in exploration time were considerably reduced when this component process was facilitated by providing a category name. We conclude that the image-mediation model can explain adult age differences in haptic recognition, particularly if the role of working memory in forming the transient visual image is considered. Our findings suggest that sensorimotor skills thought to rely mostly on peripheral processes are critically constrained by age-related changes in central processing capacity in later adulthood.


Perception | 2016

Perceptual grouping affects haptic enumeration over the fingers

Krista Overvliet; Myrthe A. Plaisier

Spatial arrangement is known to influence enumeration times in vision. In haptic enumeration, it has been shown that dividing the total number of items over the two hands can speed up enumeration. Here we investigated how the spatial arrangement of items and non-items presented to the individual fingers affects enumeration times. More specifically, we tested whether grouping by proximity facilitates haptic serial enumeration (counting). Participants were asked to report the number of tangible items, amongst non-items, presented to the finger pads of both hands. In the first experiment, we divided the tangible items into one, two, or three groups defined by proximity (i.e., one non-item in between two groups) and found that the number of groups, and not the number of items, was the critical factor in enumeration times. In a second experiment, we found that this grouping even takes place when groups extend across fingers of both hands. These results suggest that grouping by proximity affects haptic serial enumeration and that this grouping takes place on a spatial level, possibly in addition to the somatotopic level. Our results support the idea that grouping by proximity, a principle introduced in vision, also greatly affects the haptic processing of spatial information.
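The key finding here is that enumeration time tracks the number of proximity-defined groups rather than the number of items. A minimal sketch of that grouping rule, assuming a purely illustrative encoding of the finger pads as 1 (item) and 0 (non-item), which is our own simplification and not the authors' analysis:

```python
def count_groups(fingers):
    """Count contiguous runs of items (1s) in a sequence of finger
    stimuli, where 1 = tangible item and 0 = non-item. Under grouping
    by proximity, serial enumeration time scales with this count."""
    groups = 0
    prev = 0
    for f in fingers:
        if f == 1 and prev == 0:  # a run of items starts here
            groups += 1
        prev = f
    return groups

# Five items over six finger pads, split by one non-item into two groups:
print(count_groups((1, 1, 0, 1, 1, 1)))  # 2
# The same five items as a single contiguous group:
print(count_groups((1, 1, 1, 1, 1, 0)))  # 1
```

On this account the two arrangements above contain the same number of items but predict different counting times, because only the number of runs matters.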


Journal of Experimental Psychology: Human Perception and Performance | 2017

Haptic search for movable parts

Myrthe A. Plaisier; Krista Overvliet

How do we know that we are touching 1 single object instead of 2 different ones? An important cue is movability: when different sources of input can move independently, it is likely that they belong to different objects or that the object consists of movable parts. We hypothesized that the haptic feature "movability" is used for making this differentiation, and we expected movability to be detected efficiently. We investigated this hypothesis using a haptic search task. In Experiment 1, participants were asked to press down on piano-like keys and respond whether 1 key was movable while the rest were static, or the other way around (detection only). Search strategy was determined by comparing the performance of 4 response time models. This showed that the search slopes for the target-absent and target-present trials were the same (detection-without-localization model). In Experiment 2, we asked participants to localize the target, in order to investigate whether localization is an extra processing step. In this case, our localization-after-detection model described the data best. This suggests that the target was detected independently of localization. To our knowledge, this is the first time such a search strategy has been reported in haptic search, and it highlights the special role of the detection of movability.

Collaboration


Dive into Krista Overvliet's collaborations.

Top Co-Authors

Ralf Krampe, Katholieke Universiteit Leuven
Johan Wagemans, Katholieke Universiteit Leuven
Eli Brenner, VU University Amsterdam
Elvin Karana, Delft University of Technology