Denise Y. P. Henriques
York University
Publications
Featured research published by Denise Y. P. Henriques.
Annual Review of Neuroscience | 2011
J. Douglas Crawford; Denise Y. P. Henriques; W. Pieter Medendorp
Much of the central nervous system is involved in visuomotor transformations for goal-directed gaze and reach movements. These transformations are often described in terms of stimulus location, gaze fixation, and reach endpoints, as viewed through the lens of translational geometry. Here, we argue that the intrinsic (primarily rotational) 3-D geometry of the eye-head-reach systems determines the spatial relationship between extrinsic goals and effector commands, and therefore the required transformations. This approach provides a common theoretical framework for understanding both gaze and reach control. Combined with an assessment of the behavioral, neurophysiological, imaging, and neuropsychological literature, this framework leads us to conclude that (a) the internal representation and updating of visual goals are dominated by gaze-centered mechanisms, but (b) these representations must then be transformed as a function of eye and head orientation signals into effector-specific 3-D movement commands.
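To make the review's central point concrete, here is a minimal sketch (our illustration, not the authors' model) of the rotational reference-frame transformation it describes: a target coded in gaze-centered coordinates is remapped into a body-fixed frame by chaining eye-in-head and head-on-body rotations. The angles, and the simplification to rotations about a single vertical axis (ignoring translations between the centers of rotation), are assumptions for illustration.

```python
import numpy as np

def rot_z(angle_deg: float) -> np.ndarray:
    """Rotation matrix for a horizontal rotation about the vertical axis."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# Hypothetical orientation signals: eye rotated 20 deg left in the head,
# head rotated 10 deg right on the body.
R_eye_in_head  = rot_z(20.0)
R_head_on_body = rot_z(-10.0)

# Target location expressed in gaze-centered coordinates (metres).
target_in_eye = np.array([0.5, 0.1, 0.0])

# Chaining the rotations yields the target in body-centered coordinates,
# the frame in which an effector-specific reach command can be formed.
target_in_body = R_head_on_body @ R_eye_in_head @ target_in_eye
print(target_in_body)
```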
Experimental Brain Research | 2003
Denise Y. P. Henriques; W.P. Medendorp; C.C.A.M. Gielen; J. D. Crawford
Eye–hand coordination is geometrically complex. To compute the location of a visual target relative to the hand, the brain must consider every anatomical link in the chain from retinas to fingertips. Here we focus on the first three links, studying how the brain handles information about the angles of the two eyes and the head. It is known that people, even in darkness, reach more accurately when the eye looks toward the target, rather than right or left of it. We show that reaching is also impaired when the binocular fixation point is displaced from the target in depth: reaching becomes not just sloppy, but systematically inaccurate. Surprisingly, though, in normal Gaze-On-Target reaching we found no strong correlations between errors in aiming the eyes and hand onto the target site. We also asked people to reach when the head was not facing the target. When the eyes were on-target, people reached accurately, but when gaze was off-target, performance degraded. Taking all these findings together, we suggest that the brain's computational networks have learned the complex geometry of reaching for well-practiced tasks, but that the networks are poorly calibrated for less common tasks such as Gaze-Off-Target reaching.
Experimental Brain Research | 2010
Erin K. Cressman; Danielle Salomonczyk; Denise Y. P. Henriques
Previous studies have shown that both young and older subjects adapt their reaches in response to a visuomotor distortion. It has been suggested that one’s continued ability to adapt to a visuomotor distortion with advancing age is due to the preservation of implicit learning mechanisms, where implicit learning mechanisms include processes that realign sensory inputs (i.e. shift one’s felt hand position to match the visual representation). The present study examined this proposal by determining if changes in sense of felt hand position (i.e. proprioceptive recalibration) follow visuomotor adaptation in older subjects. We also examined the influence of age on proprioceptive recalibration by comparing young and older subjects’ estimates of the position at which they felt their hand was aligned with a visual reference marker before and after aiming with a misaligned cursor that was gradually rotated 30° clockwise relative to the actual hand location. On estimation trials, subjects moved their hand along a robot-generated constrained pathway. At the end of the movement, a reference marker appeared and subjects indicated whether their hand was left or right of the marker. Results indicated that all subjects adapted their reaches at a similar rate and to the same extent across the reaching trials. More importantly, we found that both young and older subjects recalibrated proprioception, such that they felt their hand was aligned with a reference marker when it was approximately 6° to the left (or counterclockwise) of the marker following reaches with a rotated cursor. The leftward shift in both young and older subjects’ estimates was in the same direction as, and roughly one third the magnitude of, the adapted movements. Given that the changes in the estimate of felt hand position were only a fraction of the changes observed in the reaching movements, it is unlikely that sensory recalibration was the only source driving changes in reaches. Thus, we propose that proprioceptive recalibration combines with adapted sensorimotor mappings to produce changes in reaching movements. From the results of the present study, it is clear that changes in both sensory and motor systems are possible in older adults and could contribute to the preserved visuomotor adaptation.
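The bias reported above is the kind of quantity a left/right judgement task yields. As one way to see how, the sketch below (ours; the abstract does not specify the fitting procedure, and all numbers are hypothetical) fits a cumulative Gaussian to simulated response proportions and reads the bias off its 50% point.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical data: hand position relative to the reference marker (deg)
# and the proportion of "hand felt right of the marker" responses.
offsets = np.array([-12.0, -8.0, -4.0, 0.0, 4.0, 8.0, 12.0])
p_right = np.array([0.02, 0.05, 0.20, 0.55, 0.85, 0.95, 0.99])

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: mu is the bias (felt-alignment shift),
    sigma the judgement uncertainty."""
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(psychometric, offsets, p_right, p0=(0.0, 4.0))
print(f"bias: {mu:.1f} deg, uncertainty: {sigma:.1f} deg")
```

A shift of this 50% point after training with a rotated cursor is the proprioceptive recalibration the study measures.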
Journal of Motor Behavior | 2012
Denise Y. P. Henriques; Erin K. Cressman
ABSTRACT Motor learning, in particular motor adaptation, is driven by information from multiple senses. For example, when arm control is faulty, vision, touch, and proprioception can all report on the arm's movements and help guide the adjustments necessary for correcting motor error. In recent years we have learned a great deal about how the brain integrates information from multiple senses for the purpose of perception. However, less is known about how multisensory data guide motor learning. Most models of, and studies on, motor learning focus almost exclusively on the ensuing changes in motor performance without exploring the implications for sensory plasticity. Nor do they consider how discrepancies in sensory information (e.g., vision and proprioception) related to hand position may affect motor learning. Here, we discuss research from our lab and others that shows how motor learning paradigms affect proprioceptive estimates of hand position, and how even the mere discrepancy between visual and proprioceptive feedback can affect learning and plasticity. Our results suggest that sensorimotor learning mechanisms do not exclusively rely on motor plasticity and motor memory, and that sensory plasticity, in particular proprioceptive recalibration, plays a unique and important role in motor learning.
Cerebral Cortex | 2014
Simona Monaco; Ying Chen; W.P. Medendorp; J. D. Crawford; Katja Fiehler; Denise Y. P. Henriques
Grasping behaviors require the selection of grasp-relevant object dimensions, independent of overall object size. Previous neuroimaging studies found that the intraparietal cortex processes object size, but it is unknown whether the graspable dimension (i.e., grasp axis between selected points on the object) or the overall size of objects triggers activation in that region. We used functional magnetic resonance imaging adaptation to investigate human brain areas involved in processing the grasp-relevant dimension of real 3-dimensional objects in grasping and viewing tasks. Trials consisted of 2 sequential stimuli in which the object's grasp-relevant dimension, its global size, or both were novel or repeated. We found that calcarine and extrastriate visual areas adapted to object size regardless of the grasp-relevant dimension during viewing tasks. In contrast, the superior parietal occipital cortex (SPOC) and lateral occipital complex of the left hemisphere adapted to the grasp-relevant dimension regardless of object size and task. Finally, the dorsal premotor cortex adapted to the grasp-relevant dimension in grasping, but not in viewing, tasks, suggesting that motor processing was complete at this stage. Taken together, our results outline a cortical circuit for the progressive transformation of general object properties into grasp-related responses.
Journal of Neurophysiology | 2008
Michael Vesia; Xiaogang Yan; Denise Y. P. Henriques; Lauren E. Sergio; J. D. Crawford
Posterior parietal cortex (PPC) has been implicated in the integration of visual and proprioceptive information for the planning of action. We previously reported that single-pulse transcranial magnetic stimulation (TMS) over dorsal-lateral PPC perturbs the early stages of spatial processing for memory-guided reaching. However, our data did not distinguish whether TMS disrupted the reach goal or the internal estimate of initial hand position needed to calculate the reach vector. To test between these hypotheses, we investigated reaching in six healthy humans during left and right parietal TMS while varying visual feedback of the movement. We reasoned that if TMS were disrupting the internal representation of hand position, visual feedback from the hand might still recalibrate this signal. We tested four viewing conditions: 1) final vision of hand position; 2) full vision of hand position; 3) initial and final vision of hand position; and 4) middle and final vision of hand position. During the final vision condition, left parietal stimulation significantly increased endpoint variability, whereas right parietal stimulation produced a significant leftward shift in both visual fields. However, these errors significantly decreased with visual feedback of the hand during both planning and control stages of the reach movement. These new findings demonstrate that 1) visual feedback of hand position during the planning and early execution of the reach can recalibrate the perturbed signal and, importantly, 2) TMS over dorsal-lateral PPC does not disrupt the internal representation of the visual goal, but rather the reach vector, or more likely the sense of initial hand position that is used to calculate this vector.
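The competing hypotheses turn on the standard vectorial planning scheme, written here in generic notation (an expository assumption, not an equation from the paper):

```latex
\[
  \vec{m} \;=\; \hat{\vec{x}}_{\mathrm{target}} \;-\; \hat{\vec{x}}_{\mathrm{hand}}
\]
```

TMS could degrade the reach either through the goal estimate or through the initial hand-position estimate entering this subtraction; the visual-feedback manipulations are what distinguish the two.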
Experimental Brain Research | 2010
Johanna Reuschel; Knut Drewing; Denise Y. P. Henriques; Frank Rösler; Katja Fiehler
Many studies have demonstrated higher accuracy in perception and action when more than one sense is used. The maximum-likelihood estimation (MLE) model offers a recent account of how perceptual information is integrated across different sensory modalities, suggesting that integration is statistically optimal. The purpose of the present study was to investigate how visual and proprioceptive movement information is integrated for the perception of trajectory geometry. To test this, participants sat in front of an apparatus that moved a handle along a horizontal plane. Participants had to decide whether two consecutive trajectories formed an acute or an obtuse movement path. Judgments had to be based on information from a single modality alone, i.e., vision or proprioception, or on the combined information of both modalities. We estimated both the bias and variance for each single modality condition and predicted these parameters for the bimodal condition using the MLE model. Consistent with previous findings, variability decreased for perceptual judgments about trajectory geometry based on combined visual-proprioceptive information. Furthermore, the observed bimodal data corresponded well to the predicted parameters. Our results suggest that visual and proprioceptive movement information for the perception of trajectory geometry is integrated in a statistically optimal manner.
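The predictions tested above follow from the MLE cue-combination equations in their standard textbook form (not reproduced from the paper): the bimodal estimate is a reliability-weighted average of the visual (V) and proprioceptive (P) estimates, and its variance is never larger than that of the better single cue.

```latex
\begin{align}
  \hat{S}_{VP} &= w_V \hat{S}_V + w_P \hat{S}_P,
  \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_P^2},
  \qquad w_P = 1 - w_V, \\
  \sigma_{VP}^2 &= \frac{\sigma_V^2\,\sigma_P^2}{\sigma_V^2 + \sigma_P^2}
  \;\le\; \min\!\left(\sigma_V^2, \sigma_P^2\right).
\end{align}
```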
Neuropsychologia | 2010
Stephanie A. H. Jones; Denise Y. P. Henriques
We examined the effect of gaze direction relative to target location on reach endpoint errors made to proprioceptive and multisensory targets. We also explored whether and how visual and proprioceptive information about target location are integrated to guide reaches. Participants reached to their unseen left hand in one of three target locations (left of body midline, body midline, or right of body midline), while it remained at a target site (online), or after it was removed from this location (remembered), and also after the target hand had been briefly lit before reaching (multisensory target). The target hand was guided to a target location using a robot-generated path. Reaches were made with the right hand in complete darkness, while gaze was varied in one of four eccentric directions. Horizontal reach errors systematically varied relative to gaze for all target modalities; not only for visually remembered and online proprioceptive targets as has been found in previous studies, but for the first time, also for remembered proprioceptive targets and proprioceptive targets that were briefly visible. These results suggest that the brain represents the locations of online and remembered proprioceptive reach targets, as well as visual-proprioceptive reach targets relative to gaze, along with other motor-related representations. Our results, however, do not suggest that visual and proprioceptive information are optimally integrated when coding the location of multisensory reach targets in this paradigm.
Neuropsychologia | 2011
Danielle Salomonczyk; Erin K. Cressman; Denise Y. P. Henriques
Reaching with misaligned visual feedback of the hand leads to reach adaptation (motor recalibration) and also results in partial sensory recalibration, where proprioceptive estimates of hand position are changed in a way that is consistent with the visual distortion. The goal of the present study was to explore the relationship between changes in sensory and motor systems by examining these processes following (1) prolonged reach training and (2) training with increasing visuomotor distortions. To examine proprioceptive recalibration, we determined the position at which subjects felt their hand was aligned with a reference marker after completing three blocks of reach training trials with a cursor that was rotated 30° clockwise (CW) for all blocks, or with a visuomotor distortion that was increased incrementally across the training blocks up to 70° CW relative to actual hand motion. On average, subjects adapted their reaches by 16° and recalibrated their sense of felt hand position by 7° leftwards following the first block of reach training trials in which they reached with a cursor that was rotated 30° CW relative to the hand, compared to baseline values. There was no change in these values for the 30° training group across subsequent training blocks. However, subjects training with increasing levels of visuomotor distortion showed increased reach adaptation (up to 34° leftward movement aftereffects) and sensory recalibration (up to 15° leftwards). Analysis of motor and sensory changes following each training block did not reveal any significant correlations, suggesting that the processes underlying motor adaptation and proprioceptive recalibration occur simultaneously yet independently of each other.
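One common way to see why a ramped schedule can drive larger aftereffects than a fixed rotation is a single-rate state-space model of adaptation; the sketch below is that textbook model with illustrative parameters, not the analysis used in the paper.

```python
# Single-rate state-space model: compensation x is partially retained
# (rate A) and updated from the visual error on each reach (rate B).
A, B = 0.99, 0.1
TRIALS_PER_BLOCK = 99

def simulate(rotations_deg):
    """Return the adaptation level after a schedule of imposed rotations."""
    x = 0.0
    for r in rotations_deg:
        error = r - x          # cursor rotation minus current compensation
        x = A * x + B * error
    return x

fixed  = [30.0] * (3 * TRIALS_PER_BLOCK)                  # 30 deg throughout
ramped = ([30.0] * TRIALS_PER_BLOCK + [50.0] * TRIALS_PER_BLOCK
          + [70.0] * TRIALS_PER_BLOCK)                    # 30 -> 50 -> 70 deg
print(f"fixed 30 deg schedule: {simulate(fixed):.1f} deg of adaptation")
print(f"ramped 30-70 deg schedule: {simulate(ramped):.1f} deg of adaptation")
```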
Experimental Brain Research | 2010
Katja Fiehler; Frank Rösler; Denise Y. P. Henriques
There is considerable evidence that targets for action are represented in a dynamic gaze-centered frame of reference, such that each gaze shift requires an internal updating of the target. Here, we investigated the effect of eye movements on the spatial representation of targets used for position judgements. Participants had their hand passively moved to a location, and then judged whether this location was left or right of a remembered visual or remembered proprioceptive target, while gaze direction was varied. Estimates of position of the remembered targets relative to the unseen position of the hand were assessed with an adaptive psychophysical procedure. These positional judgements significantly varied relative to gaze for both remembered visual and remembered proprioceptive targets. Our results suggest that relative target positions may also be represented in eye-centered coordinates. This implies similar spatial reference frames for action control and space perception when positions are coded relative to the hand.
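The abstract names only "an adaptive psychophysical procedure"; a 1-up/1-down staircase, sketched below with a simulated observer, is one common such procedure (the specific rule and all numbers are our assumptions).

```python
import random

def staircase(true_bias=3.0, noise=2.0, start=10.0, step=1.0, n_reversals=10):
    """Estimate the 50% point of left/right judgements (the felt target
    position relative to the hand) from the mean of reversal points."""
    x, last_response, reversals = start, None, []
    while len(reversals) < n_reversals:
        # Simulated observer: reports "right" when the probed position
        # feels right of the remembered target (true_bias) plus noise.
        says_right = (x + random.gauss(0.0, noise)) > true_bias
        x += -step if says_right else step      # 1-up/1-down rule
        if last_response is not None and says_right != last_response:
            reversals.append(x)
        last_response = says_right
    return sum(reversals) / len(reversals)

random.seed(1)
print(f"estimated bias: {staircase():.1f} deg (true bias: 3.0 deg)")
```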