Lisa M. Pritchett
York University
Publications
Featured research published by Lisa M. Pritchett.
Experimental Brain Research | 2011
Lisa M. Pritchett; Laurence R. Harris
The location of a touch to the skin, first coded in body coordinates, may be transformed into retinotopic coordinates to facilitate visual-tactile integration. In order for the touch location to be transformed into a retinotopic reference frame, the positions of the eyes and head must be taken into account. Previous studies have found eye position–related errors (Harrar and Harris in Exp Brain Res 203:615–620, 2009) and head position–related errors (Ho and Spence in Brain Res 1144:136–141, 2007) in tactile localization, indicating that imperfect versions of the eye and head signals may be used in the body-to-visual coordinate transformation. Here, we investigated the combined effects of head and eye position on the perceived location of a mechanical touch to the arm. Subjects reported the perceived position of a touch that was presented while their head was positioned to the left, right, or center of the body and their eyes were positioned to the left, right, or center in their orbits. The perceived location of the touch shifted in the direction of both the head and the eyes, by approximately the same amount for each. We interpret these shifts as consistent with touch location being coded in a visual reference frame, with a gaze signal used to compute the transformation.
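To make the transformation concrete: in one horizontal dimension it amounts to subtracting the eye-in-head and head-on-body angles from the body-referenced touch azimuth. The sketch below is a minimal illustration in our own notation, not code or a model from the paper.

```python
# Minimal 1-D sketch (our notation, not the authors' model) of the
# body-to-retinotopic transform described above: a touch coded in body
# coordinates is re-expressed relative to the line of sight by subtracting
# eye-in-head and head-on-body angles. Angles in degrees, positive = right.

def body_to_retinotopic(touch_body_deg: float,
                        eye_in_head_deg: float,
                        head_on_body_deg: float) -> float:
    """Azimuth of the touch relative to the fovea."""
    gaze_on_body = eye_in_head_deg + head_on_body_deg
    return touch_body_deg - gaze_on_body

# A touch at the body midline, with head and eyes each turned 15 deg right,
# falls 30 deg to the left of the fovea:
print(body_to_retinotopic(0.0, 15.0, 15.0))   # -30.0
```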
Experimental Brain Research | 2012
Lisa M. Pritchett; Michael J. Carnevale; Laurence R. Harris
The position of gaze (eye plus head position) relative to the body is known to alter the perceived locations of sensory targets. This effect suggests that perceptual space is at least partially coded in a gaze-centered reference frame. However, the direction of the effects reported has not been consistent. Here, we investigate the cause of a discrepancy between reported directions of shift in tactile localization related to head position. We demonstrate that head eccentricity can cause errors in touch localization in either the same or the opposite direction to that in which the head is turned, depending on the procedure used. When head position is held eccentric during both the presentation of a touch and the response, there is a shift in the direction opposite to the head. When the head is returned to center before reporting, the shift is in the same direction as the head eccentricity. We rule out a number of possible explanations for the difference and conclude that when the head is moved between touch and response, the touch is coded in a predominantly gaze-centered reference frame, whereas when the head remains stationary a predominantly body-centered reference frame is used. The mechanism underlying these displacements in perceived location is proposed to involve an underestimated gaze signal. We propose a model demonstrating how this single neural error could cause localization errors in either direction, depending on whether the gaze or the body midline is used as a reference. This model may be useful in explaining gaze-related localization errors in other modalities.
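The qualitative point of the proposed model, that a single underestimated gaze signal can push localization errors in either direction depending on the readout reference, can be illustrated with a toy one-dimensional calculation. This sketch is ours, not the published model: the gain value and the mapping of conditions to error directions are illustrative assumptions only.

```python
# Toy 1-D illustration (not the authors' published model) of how one gain
# error on the gaze signal yields shifts of opposite sign depending on the
# gaze used when the stored code is read out. All values are arbitrary.

GAZE_GAIN = 0.8   # assumed underestimation: neural gaze estimate = 0.8 * true gaze

def perceived_touch(touch_body_deg: float,
                    gaze_at_touch_deg: float,
                    gaze_at_report_deg: float,
                    gain: float = GAZE_GAIN) -> float:
    """Encode the touch against an underestimated gaze signal, then read
    it out against the actual gaze at response time."""
    gaze_code = touch_body_deg - gain * gaze_at_touch_deg   # stored gaze-centred code
    return gaze_code + gaze_at_report_deg                   # converted back for report

touch = 0.0  # touch at the body midline
print(perceived_touch(touch, 30.0, 30.0))  # 6.0: gaze held eccentric, shift toward gaze
print(perceived_touch(touch, 30.0, 0.0))   # -24.0: gaze returned to centre, shift away
```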
Frontiers in Psychology | 2015
Laurence R. Harris; Michael J. Carnevale; Sarah D’Amour; Lindsey E. Fraser; Vanessa Harrar; Adria E. N. Hoover; Charles Mander; Lisa M. Pritchett
Incorporating the fact that the senses are embodied is necessary for an organism to interpret sensory information. Before a unified perception of the world can be formed, sensory signals must be processed with reference to body representation. The various attributes of the body, such as shape, proportion, posture, and movement, can both be derived from the various sensory systems and affect perception of the world (including of the body itself). In this review we examine the relationships between sensory and motor information, body representations, and perceptions of the world and the body. We provide several examples of how the body affects perception (including but not limited to body perception). First, we show that body orientation affects visual distance perception and object orientation. Visual–auditory crossmodal correspondences also depend on the orientation of the body: auditory “high” frequencies correspond to a visual “up” defined by both gravity and body coordinates. Next, we show that the perceived location of touch is affected by the orientation of the head and eyes on the body, suggesting a visual component to the coding of body locations. Additionally, the reference frame used for coding touch locations seems to depend on whether gaze is static or moved relative to the body during the tactile task. Perceived attributes of the body, such as body size, affect tactile perception even at the level of detection thresholds and two-point discrimination. Next, long-range tactile masking provides clues to the posture of the body in a canonical body schema. Finally, ownership of seen body parts depends on the orientation and perspective of the body part in view. Together, these findings demonstrate how sensory and motor information, body representations, and perceptions (of the body and the world) are interdependent.
Multisensory Research | 2013
Vanessa Harrar; Lisa M. Pritchett; Laurence R. Harris
Previous research showing systematic localisation errors in touch perception related to eye and head position has suggested that touch is at least partially localised in a visual reference frame. However, many previous studies had participants report the location of tactile stimuli relative to a visual probe, which may force coding into a visual reference frame. The visual probe could also itself be subject to an effect of eye or head position. Thus, the perceived position of a tactile stimulus must be assessed with a within-modality measure before definitive conclusions can be drawn about the coordinate system in which touch is coded. Here, we present a novel method for measuring the perceived location of a touch in body coordinates: the Segmented Space Method (SSM). In the SSM, participants imagine the region within which the stimulus could be presented divided into several equally spaced, numbered segments, and simply report the number of the segment in which they perceived the stimulus. The SSM is a simple method that can easily be extended to other modalities by dividing any response space into numbered segments centred on an appropriate reference point (e.g., the head, the torso, the hand, or a point in space off the body). Here we apply the SSM to the forearm during eccentric viewing and report localisation errors for touch similar to those previously reported using a crossmodal comparison. The data collected with the SSM strengthen the theory that tactile spatial localisation is generally coded in a visual reference frame even when visual coding is not required by the task.
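Scoring SSM responses is simple, which is part of the method's appeal: a reported segment number maps directly onto a position in the response space. The sketch below shows one hypothetical way to do this; the segment count, extent, and function names are our assumptions, not details from the paper.

```python
# Hypothetical SSM scoring sketch: map a reported segment number onto the
# centre of that segment and compute a signed localisation error. The
# 8-segment, 24 cm forearm strip used here is an illustrative assumption.

def segment_to_position(segment: int, n_segments: int,
                        start_cm: float, end_cm: float) -> float:
    """Centre of the reported segment, in the units of the response space."""
    if not 1 <= segment <= n_segments:
        raise ValueError("segment number out of range")
    width = (end_cm - start_cm) / n_segments
    return start_cm + (segment - 0.5) * width

def localisation_error(segment: int, true_position_cm: float,
                       n_segments: int = 8,
                       start_cm: float = 0.0, end_cm: float = 24.0) -> float:
    """Signed error: positive = reported beyond the true position."""
    return segment_to_position(segment, n_segments, start_cm, end_cm) - true_position_cm

print(segment_to_position(5, 8, 0.0, 24.0))   # 13.5 cm
print(localisation_error(5, 12.0))            # +1.5 cm
```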
Journal of Experimental Psychology: Human Perception and Performance | 2015
Sarah D'Amour; Lisa M. Pritchett; Laurence R. Harris
To accurately interpret tactile information, the brain needs an accurate representation of the body to which to refer sensations. Despite this, body representation has only recently been incorporated into the study of tactile perception. Here, we investigate whether distortions of body representation affect tactile sensations. We perceptually altered the length of the arm and the width of the waist using a tendon vibration illusion and measured tactile spatial acuity and sensitivity. Surprisingly, we found a reduction in both tactile acuity and sensitivity when the arm or waist was perceptually altered, indicating a general disruption of low-level tactile processing. We postulate that these disruptive changes correspond to an early stage in which the body representation is starting to change, and they may give new insight into sensory processing in people with long-term or sudden abnormal body representation, such as is found in eating disorders or following amputation.
Multisensory Research | 2013
Laurence R. Harris; Sarah D’Amour; Lisa M. Pritchett; Michael J. Carnevale; Vanessa Harrar
How the body is represented in the brain has been an important topic of investigation for over a hundred years, with neurological, philosophical, and physiological implications. Tactile processing, even at the lowest level, including localization, detection thresholds, and two-point discrimination thresholds, turns out to depend on factors other than the cutaneous receptors. How these properties are affected, and by what, can give insight into how the body is represented in the brain. Distortions in localization related to gaze direction suggest spatial coding in multiple reference frames. Distortions in detection and two-point discrimination thresholds related to alterations in perceived body shape indicate distortions in the relative size of different parts of the representation, perhaps related to receptive-field changes in a cortical map. I will review several recent studies that together demonstrate that the body’s representation in the brain can be matched to various task requirements and dynamically updated in response to changes in posture and perceived body dimensions.
Multisensory Research | 2013
Lisa M. Pritchett; Laurence R. Harris
Head position affects touch localization on the torso in different directions depending on whether the head returns to centre or remains eccentric during testing. These observations have been taken to suggest that touch is coded in either a gaze-centred or a body-centred reference frame depending on the task (Pritchett et al., 2012). Here we tested localization of tactile stimuli applied to the back using the same paradigms to distinguish gaze-centred from body-centred coding. Tactile stimulation was applied using a horizontal array of eight tactors mounted on a belt, centred on the backbone. Participants reported the perceived location of touches on a visual scale presented on a screen. The head was held eccentric during the stimulation and either returned to centre or remained eccentric during the response. When the head was returned to centre, as for touches on the front of the torso, perceived touch locations on the back were shifted in the head’s direction, with the largest effects on the side to which the head was turned. When the head remained eccentric, perceived touch location was unaffected by head position. Returning the head to centre evokes a gaze-centred strategy, with errors related to errors in coding gaze; leaving the head eccentric is associated with a body-centred coding strategy. The lack of errors on the back under this condition, compared to the front of the torso, may indicate a body-centred representation of the back that is more stable than that of the front, perhaps related to the availability of the backbone as a prominent body landmark.
Seeing and Perceiving | 2012
Lisa M. Pritchett; Michael J. Carnevale; Laurence R. Harris
We have previously reported that head position affects the perceived location of touch differently depending on the dynamics of the task. When touch was delivered and responses were made with the head rotated, touch location shifted in the direction opposite to the head position, consistent with body-centered coding. When touch was delivered with the head rotated but the response was made with the head centered, touch location shifted in the same direction as the head, consistent with gaze-centered coding. Here we tested whether moving the head between touch and response would modulate the effects of head position on touch location. Each trial consisted of three periods: in the first, arrows and LEDs guided the subject to a randomly chosen head orientation (90° left, right, or center) and a vibration stimulus was delivered. Next, subjects were either guided to turn their head or to remain in the same orientation. In the final period, they were again guided to turn or to remain in place before reporting the perceived location of the touch on a visual scale using a mouse and computer screen. Reported touch location was shifted in the direction opposite to the head orientation during touch presentation, regardless of the orientation during the response or whether a movement was made before the response. The size of the effect was much reduced compared with our previous results. These results are consistent with touch location being coded in both a gaze-centered and a body-centered reference frame under dynamic conditions.
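As a concrete reading of the design, each trial draws a head orientation for each of the three periods; the sketch below generates one hypothetical fully crossed, shuffled trial list. The factor structure and labels are our assumptions, not the authors' actual design.

```python
# Hypothetical trial-list generator for the three-period design described
# above: head orientation at touch, after the first move cue, and at response.
# A full crossing of the three orientations is an assumption on our part.

import itertools
import random

ORIENTATIONS = ("left_90", "center", "right_90")

def make_trials(repeats: int = 10, seed: int = 0) -> list:
    conditions = [
        {"touch_head": t, "mid_head": m, "response_head": r}
        for t, m, r in itertools.product(ORIENTATIONS, repeat=3)
    ]
    trials = conditions * repeats          # 27 conditions x repeats
    random.Random(seed).shuffle(trials)
    return trials

trials = make_trials()
print(len(trials), trials[0])   # 270 trials in a shuffled order
```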
Seeing and Perceiving | 2012
Laurence R. Harris; Lisa M. Pritchett; Sarah D’Amour
Two-point discrimination threshold depends on the number and size of receptive fields between the touches. But what determines the size of the receptive fields? Are they anatomically fixed? Or are they related to perceived body size? To answer this question we manipulated perceived arm length using the Pinocchio illusion. The test arm was held at the wrist and the holding arm was made to feel perceptually more extended than it was by applying vibration to the tendon of the biceps (cf. de Vignemont et al., 2005). For control trials the holding arm was vibrated elsewhere. An array of tactors, separated by 3 cm, was placed on the upper surface of the arm and covered with a cloth. Vibro-tactile stimulation was applied to either one or two tactors in two periods. Subjects identified which period contained two stimuli. A psychometric function was drawn through the probability of correct response as a function of tactor separation to determine the threshold distance. In a separate experiment, subjects estimated the perceived location of each tactor against a scale laid on top of the cloth. The estimated locations of the tactors on the tested arm were displaced by tendon vibration of the holding arm compatible with a perceptual lengthening of the arm. The threshold for two-touch discrimination was significantly increased from 4.5 (±0.6) cm with no tendon stimulation to 5.7 (±0.5) cm when the arm was perceptually extended. We conclude that two-point touch discrimination depends on the size of central receptive fields that become larger when the arm is perceptually lengthened.
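The threshold estimate described here is a standard psychophysical fit; the sketch below shows a generic version using a cumulative-Gaussian psychometric function with fabricated example data, not the authors' analysis code. For a two-interval task chance is 0.5, so the threshold is taken as the 75%-correct separation.

```python
# Generic psychometric-fit sketch (fabricated data, not the study's): fit a
# cumulative Gaussian to proportion correct vs. tactor separation and read
# off the 75%-correct point, which for this parameterisation is simply mu.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(separation, mu, sigma):
    # Two-interval task: performance rises from 0.5 (guessing) toward 1.0.
    return 0.5 + 0.5 * norm.cdf(separation, loc=mu, scale=sigma)

separations = np.array([1.5, 3.0, 4.5, 6.0, 7.5])        # cm between tactors
p_correct   = np.array([0.52, 0.61, 0.74, 0.88, 0.97])   # illustrative values

(mu, sigma), _ = curve_fit(psychometric, separations, p_correct, p0=[4.5, 1.5])
print(f"two-point threshold (75% correct) ~ {mu:.1f} cm")
```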
Seeing and Perceiving | 2012
Michael J. Carnevale; Lisa M. Pritchett; Laurence R. Harris
Eccentric gaze systematically biases touch localization on the arm and waist. These perceptual errors suggest that touch location is at least partially coded in a visual reference frame. Here we investigated whether touches to non-visible parts of the body are also affected by gaze position. If so, can the direction of mislocalization tell us how these parts are laid out in the visual representation? To test this, an array of vibro-tactors was attached to either the lower back or the forehead. During trials, participants were guided to orient their head (90° left, right, or straight ahead for touches on the lower back) or their head and eyes (combinations of ±15° left, ±15° right, or straight-ahead head and eye positions for touches on the forehead) using LED fixation targets and a head-mounted laser. Participants then re-oriented to straight ahead and reported the perceived touch location on a visual scale using a mouse and computer screen. As in earlier experiments on the arm and waist, perceived touch location on the forehead and lower back was biased in the same direction as the eccentric head and eye position. This is evidence that perceived touch location is at least partially coded in a visual reference frame even for parts of the body that are not typically seen.