Paul L. Gribble
University of Western Ontario
Publications
Featured research published by Paul L. Gribble.
Attention Perception & Psychophysics | 1996
Kevin G. Munhall; Paul L. Gribble; L. Sacco; M. Ward
Three experiments are reported on the influence of different timing relations on the McGurk effect. In the first experiment, it is shown that strict temporal synchrony between auditory and visual speech stimuli is not required for the McGurk effect. Subjects were strongly influenced by the visual stimuli when the auditory stimuli lagged the visual stimuli by as much as 180 msec. In addition, a stronger McGurk effect was found when the visual and auditory vowels matched. In the second experiment, we paired auditory and visual speech stimuli produced under different speaking conditions (fast, normal, clear). The results showed that the manipulations in both the visual and auditory speaking conditions independently influenced perception. In addition, there was a small but reliable tendency for the better matched stimuli to elicit more McGurk responses than unmatched conditions. In the third experiment, we combined auditory and visual stimuli produced under different speaking conditions (fast, clear) and delayed the acoustics with respect to the visual stimuli. The subjects showed the same pattern of results as in the second experiment. Finally, the delay did not cause different patterns of results for the different audiovisual speaking style combinations. The results suggest that perceivers may be sensitive to the concordance of the time-varying aspects of speech but they do not require temporal coincidence of that information.
Neuron | 2005
Andrew A. G. Mattar; Paul L. Gribble
Learning complex motor behaviors like riding a bicycle or swinging a golf club is based on acquiring neural representations of the mechanical requirements of movement (e.g., coordinating muscle forces to control the club). Here we provide evidence that mechanisms matching observation and action facilitate motor learning. Subjects who observed a video depicting another person learning to reach in a novel mechanical environment (imposed by a robot arm) performed better when later tested in the same environment than subjects who observed similar movements but no learning; moreover, subjects who observed learning of a different environment performed worse. We show that this effect is not based on conscious strategies but instead depends on the implicit engagement of neural systems for movement planning and control.
Nature | 2001
Stephen H. Scott; Paul L. Gribble; Kirsten M. Graham; D. William Cabel
The population vector hypothesis was introduced almost twenty years ago to illustrate that a population vector constructed from neural activity in primary motor cortex (MI) of non-human primates could predict the direction of hand movement during reaching. Alternative explanations for this population signal have been suggested but could not be tested experimentally owing to movement complexity in the standard reaching model. We re-examined this issue by recording the activity of neurons in contralateral MI of monkeys while they made reaching movements with their right arms oriented in the horizontal plane—where the mechanics of limb motion are measurable and anisotropic. Here we found systematic biases between the population vector and the direction of hand movement. These errors were attributed to a non-uniform distribution of preferred directions of neurons and the non-uniformity covaried with peak joint power at the shoulder and elbow. These observations contradict the population vector hypothesis and show that non-human primates are capable of generating reaching movements to spatial targets even though population vectors based on MI activity do not point in the direction of hand motion.
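A minimal sketch (not the authors' code) of how a population vector is assembled from cosine-tuned motor cortical activity, and how a non-uniform distribution of preferred directions can bias the vector away from the true hand direction. The tuning model and all parameters below are assumptions for illustration.

```python
# A sketch of population-vector decoding with cosine-tuned neurons.
# Preferred directions are drawn from a (hypothetical) non-uniform distribution,
# which biases the estimate away from the true hand direction.
import numpy as np

rng = np.random.default_rng(0)
n_cells = 200

# Non-uniform preferred directions: more cells tuned near 0 rad.
pd = rng.vonmises(mu=0.0, kappa=1.0, size=n_cells)

def population_vector(move_dir, pd):
    """Sum each cell's preferred-direction unit vector, weighted by its
    cosine-tuned rate modulation for a movement in move_dir (radians)."""
    rates = np.cos(move_dir - pd)            # modulation about baseline
    vx = np.sum(rates * np.cos(pd))
    vy = np.sum(rates * np.sin(pd))
    return np.arctan2(vy, vx)

for true_dir in np.deg2rad([0.0, 45.0, 90.0, 135.0]):
    est = population_vector(true_dir, pd)
    print(f"movement {np.rad2deg(true_dir):6.1f} deg -> "
          f"population vector {np.rad2deg(est):6.1f} deg")
```

With a uniform distribution of preferred directions the decoded vector matches the movement direction; concentrating preferred directions along one axis produces systematic biases for intermediate directions, which is the kind of dissociation described above.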
The Journal of Neuroscience | 2010
David J. Ostry; Mohammad Darainy; Andrew A. G. Mattar; Jeremy Wong; Paul L. Gribble
Motor learning is dependent upon plasticity in motor areas of the brain, but does it occur in isolation, or does it also result in changes to sensory systems? We examined changes to somatosensory function that occur in conjunction with motor learning. We found that even after periods of training as brief as 10 min, sensed limb position was altered and the perceptual change persisted for 24 h. The perceptual change was reflected in subsequent movements; limb movements following learning deviated from the prelearning trajectory by an amount comparable in magnitude to, and in the same direction as, the perceptual shift. Crucially, the perceptual change was dependent upon motor learning. When the limb was displaced passively such that subjects experienced similar kinematics but without learning, no sensory change was observed. The findings indicate that motor learning not only affects motor areas of the brain but also changes sensory function.
Nature | 2002
Paul L. Gribble; Stephen H. Scott
A hallmark of the human motor system is its ability to adapt motor patterns for different environmental conditions, such as when a skilled ice-hockey player accurately shoots a puck with or without protective equipment. Each object (stick, shoulder pad, elbow pad) imparts a distinct load upon the limb, and a key problem in motor neuroscience is to understand how the brain controls movement for different mechanical contexts. We addressed this issue by training non-human primates to make reaching movements with and without viscous loads applied to the shoulder and/or elbow joints, and then examined neural representations in primary motor cortex (MI) for each load condition. Even though the shoulder and elbow loads are mechanically independent, we found that some neurons responded to both of these single-joint loads. Furthermore, changes in activity of individual neurons during multi-joint loads could be predicted from their response to the constituent single-joint loads. These findings suggest that neural representations of different mechanical contexts in MI are organized in a highly structured manner that may provide a neural basis for how complex motor behaviour is learned from simpler motor tasks.
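A toy sketch of the two ingredients of this task, with made-up numbers: a velocity-dependent (viscous) torque at each joint, and the idea that a neuron's rate under the combined shoulder-and-elbow load can be predicted from its responses to the single-joint loads. The additive rule and all values are illustrative assumptions, not the paper's fitted model.

```python
# Toy numbers only: a viscous (velocity-dependent) joint load, and an additive
# prediction of a neuron's rate under the combined load from its responses to
# the two single-joint loads. The additive rule is an illustrative assumption.
import numpy as np

b = 0.2                                  # hypothetical viscous coefficient (N*m*s/rad)
joint_vel = np.array([1.5, -2.0])        # [shoulder, elbow] angular velocity (rad/s)
load_torque = -b * joint_vel             # velocity-dependent torque at each joint
print("load torques (N*m):", load_torque)

baseline = 20.0                          # unloaded firing rate (spikes/s), made up
d_shoulder, d_elbow = 6.0, -3.5          # rate changes under each single-joint load
predicted_combined = baseline + d_shoulder + d_elbow
print(f"predicted rate under combined load: {predicted_combined:.1f} spikes/s")
```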
Experimental Brain Research | 1998
Paul L. Gribble; David J. Ostry
The aim of this study was to examine the possibility of independent muscle coactivation at the shoulder and elbow. Subjects performed rapid point-to-point movements in a horizontal plane from different initial limb configurations to a single target. Electromyographic (EMG) activity was measured from flexor and extensor muscles acting at the shoulder (pectoralis clavicular head and posterior deltoid) and elbow (biceps long head and triceps lateral head) and from flexor and extensor muscles acting at both joints (biceps short head and triceps long head). Muscle coactivation was assessed by measuring tonic levels of EMG activity after limb position stabilized following the end of the movements. It was observed that tonic EMG levels following movements to the same target varied as a function of the amplitude of shoulder and elbow motion. Moreover, for the movements tested here, the coactivation of shoulder and elbow muscles was found to be independent: tonic EMG activity of shoulder muscles increased in proportion to shoulder movement but was unrelated to elbow motion, whereas elbow and double-joint muscle coactivation varied with the amplitude of elbow movement and were not correlated with shoulder motion. In addition, tonic EMG levels were higher for movements in which the shoulder and elbow rotated in the same direction than for those in which the joints rotated in opposite directions. In this respect, muscle coactivation may reflect a simple strategy to compensate for forces introduced by multijoint limb dynamics.
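One way to compute the tonic EMG measure described above is sketched below: mean rectified activity in a window after the limb comes to rest. The sampling rate, window length, and simulated signal are illustrative assumptions, not the study's recording parameters.

```python
# Illustrative only: tonic EMG taken as mean rectified activity in a window
# after the limb comes to rest.
import numpy as np

def tonic_emg(emg, fs, movement_end_s, window_s=0.5):
    """Mean rectified EMG (arbitrary units) in a post-movement window."""
    i0 = int(movement_end_s * fs)
    i1 = i0 + int(window_s * fs)
    return np.mean(np.abs(emg[i0:i1]))

fs = 1000.0
t = np.arange(0.0, 1.5, 1.0 / fs)
rng = np.random.default_rng(3)
emg = 0.05 * rng.standard_normal(t.size)                         # tonic background
emg[t < 0.8] += 0.4 * rng.standard_normal(int((t < 0.8).sum()))  # phasic burst
print("tonic EMG after movement end at 0.8 s:", round(tonic_emg(emg, fs, 0.8), 3))
```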
Experimental Brain Research | 2006
Nicholas Cothros; Jeremy D. Wong; Paul L. Gribble
In recent studies of human motor learning, subjects learned to move the arm while grasping a robotic device that applied novel patterns of forces to the hand. Here, we examined the generality of force field learning. We tested the idea that contextual cues associated with grasping a novel object promote the acquisition and use of a distinct internal model, associated with that object. Subjects learned to produce point-to-point arm movements to targets in a horizontal plane while grasping a robotic linkage that applied either a velocity-dependent counter-clockwise or clockwise force field to the hand. Following adaptation, subjects let go of the robot and were asked to generate the same movements in free space. Small but reliable after-effects were observed during the first eight movements in free space; however, these after-effects were significantly smaller than those observed for control subjects who moved the robot in a null field. No reduction in retention was observed when subjects subsequently returned to the force field after moving in free space. In contrast, controls who reached with the robot in a null field showed much poorer retention when returning to a force field. These findings are consistent with the idea that contextual cues associated with grasping a novel object may promote the acquisition of a distinct internal model of the dynamics of the object, separate from internal models used to control limb dynamics alone.
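A minimal sketch of the standard velocity-dependent "curl" field used in robotic force-field studies of this kind: the robot pushes on the hand at right angles to its velocity, either clockwise or counter-clockwise. The gain and sign convention here are assumptions for illustration, not the exact field used in the experiment.

```python
# A generic velocity-dependent "curl" field: force perpendicular to hand
# velocity, clockwise or counter-clockwise.
import numpy as np

def curl_field_force(hand_vel, gain=15.0, clockwise=True):
    """Force on the hand (N) for a given hand velocity (m/s)."""
    s = 1.0 if clockwise else -1.0
    B = gain * np.array([[0.0, s],
                         [-s, 0.0]])      # 90-degree rotation of the velocity
    return B @ hand_vel

v = np.array([0.3, 0.1])                  # example hand velocity (m/s)
print("CW  force (N):", curl_field_force(v, clockwise=True))
print("CCW force (N):", curl_field_force(v, clockwise=False))
```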
Experimental Brain Research | 2002
Paul L. Gribble; Stefan Everling; Kristen A. Ford; Andrew A. G. Mattar
Visually guided arm movements such as reaching or pointing are accompanied by saccadic eye movements that typically begin prior to motion of the arm. In the past, some degree of coupling between the oculomotor and limb motor systems has been demonstrated by assessing the relative onset times of eye and arm movement, and by the demonstration of a gap effect for arm movement reaction times. However, measures of limb movement onset time based on kinematics are affected by factors such as the relatively high inertia of the limb and neuromechanical delays. The goal of the present study was thus to assess the relative timing of rapid eye and arm movements made to visual targets by examining electromyographic (EMG) activity of limb muscles in conjunction with eye and arm position measures. The observation of a positive correlation between eye and limb EMG onset latencies, and the presence of a gap effect for limb EMG onset times (a reduction in reaction time when a temporal gap is introduced between the disappearance of a central fixation point and the appearance of a new target) both support the idea that eye and arm movement initiation are linked. However, limb EMG onset in most cases precedes saccade onset, and the magnitude of EMG activity prior to eye movement is correlated with both the direction and amplitude of the upcoming arm movement. This suggests that, for the rapid movements studied here, arm movement direction and distance are specified prior to the onset of saccades.
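The EMG-based timing measure relies on detecting burst onset in the muscle signal. Below is a sketch of one common onset rule (first sample of the rectified, smoothed signal exceeding the baseline mean plus a multiple of its standard deviation); the rule and all parameters are assumptions, not necessarily the criterion used in the study.

```python
# One common onset rule, with illustrative parameters: first sample of the
# rectified, smoothed EMG exceeding the baseline mean + k standard deviations.
import numpy as np

def emg_onset(emg, fs, baseline_s=0.1, k=3.0, smooth_s=0.01):
    """Return estimated onset time (s) of an EMG burst, or None if not found."""
    rect = np.abs(emg - np.mean(emg))                  # remove offset, rectify
    win = max(1, int(smooth_s * fs))
    smoothed = np.convolve(rect, np.ones(win) / win, mode="same")
    n_base = int(baseline_s * fs)
    thresh = smoothed[:n_base].mean() + k * smoothed[:n_base].std()
    above = np.flatnonzero(smoothed[n_base:] > thresh)
    return (n_base + above[0]) / fs if above.size else None

fs = 1000.0
t = np.arange(0.0, 0.5, 1.0 / fs)
rng = np.random.default_rng(2)
emg = 0.02 * rng.standard_normal(t.size)
emg[t > 0.25] += 0.3 * rng.standard_normal(int((t > 0.25).sum()))  # burst at 250 ms
print("estimated onset (s):", emg_onset(emg, fs))
```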
PLOS ONE | 2010
Elizabeth T. Wilson; Jeremy D. Wong; Paul L. Gribble
Relatively few studies have been reported that document how proprioception varies across the workspace of the human arm. Here we examined proprioceptive function across a horizontal planar workspace, using a new method that avoids active movement and interactions with other sensory modalities. We systematically mapped both proprioceptive acuity (sensitivity to hand position change) and bias (perceived location of the hand) across a horizontal-plane 2D workspace. Proprioception of both the left and right arms was tested at nine workspace locations and in two orthogonal directions (left-right and forward-backward). Subjects made repeated judgments about the position of their hand with respect to a remembered proprioceptive reference position, while grasping the handle of a robotic linkage that passively moved their hand to each judgment location. To rule out the possibility that the memory component of the proprioceptive testing procedure may have influenced our results, we repeated the procedure in a second experiment using a persistent visual reference position. Both methods resulted in qualitatively similar findings. Proprioception is not uniform across the workspace. Acuity was greater for limb configurations in which the hand was closer to the body, and was greater in the forward-backward direction than in the left-right direction. A robust difference in proprioceptive bias was observed across both experiments. At all workspace locations, the left hand was perceived to be to the left of its actual position, and the right hand was perceived to be to the right of its actual position. Finally, bias was smaller for hand positions closer to the body. The results of this study provide a systematic map of proprioceptive acuity and bias across the workspace of the limb that may be used to augment computational models of sensory-motor control, and to inform clinical assessment of sensory function in patients with sensory-motor deficits.
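Acuity and bias of this kind are typically estimated by fitting a psychometric function to binary position judgments: the 50% point gives the bias (perceived reference location) and the slope gives the acuity. The sketch below uses simulated responses and a logistic fit; the numbers and the exact fitting procedure are assumptions, not the study's data or analysis.

```python
# Simulated data only: fit a logistic psychometric function to binary
# "judged right of the reference" responses. The 50% point estimates bias;
# the offset needed to reach 75% estimates acuity.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, bias, slope):
    return 1.0 / (1.0 + np.exp(-(x - bias) / slope))

rng = np.random.default_rng(1)
offsets = np.linspace(-30.0, 30.0, 13)        # hand offset from reference (mm)
true_bias, true_slope = 5.0, 8.0              # hypothetical values
n_trials = 40
k = rng.binomial(n_trials, logistic(offsets, true_bias, true_slope))

(bias, slope), _ = curve_fit(logistic, offsets, k / n_trials, p0=[0.0, 5.0])
acuity = slope * np.log(3.0)                  # offset from bias to the 75% point
print(f"estimated bias: {bias:.1f} mm, acuity: {acuity:.1f} mm")
```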
Journal of Neurophysiology | 2010
Dinant A. Kistemaker; Jeremy D. Wong; Paul L. Gribble
It has been widely suggested that the many degrees of freedom of the musculoskeletal system may be exploited by the CNS to minimize energy cost. We tested this idea by having subjects make point-to-point movements while grasping a robotic manipulandum. The robot created a force field chosen such that the minimal energy hand paths for reaching movements differed substantially from those observed in a null field. The results show that after extended exposure to the force field, subjects continued to move exactly as they did in the null field and thus used substantially more energy than needed. Even after practicing to move along the minimal energy path, subjects did not adapt their freely chosen hand paths to reduce energy expenditure. The results of this study indicate that for point-to-point arm movements, minimization of energy cost is not a dominant factor that influences how the CNS arrives at kinematics and associated muscle activation patterns.
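A toy illustration of how a force field can make the habitual straight hand path energetically costly: a viscous field whose damping is strongest along the straight corridor between the targets, so a lateral detour requires less work against the robot. The field, paths, and numbers are constructed for illustration and are not the field used in the study.

```python
# A constructed example, not the study's field: viscous damping that is
# strongest along the straight corridor between the targets, so a lateral
# detour needs less work against the robot than the habitual straight path.
import numpy as np

def damping(x):
    """Viscous coefficient (N*s/m); largest near the straight corridor x = 0."""
    return 5.0 + 100.0 * np.exp(-(x / 0.02) ** 2)

def work_against_field(path, duration=0.8):
    """Work (J) done against F = -b(x)*v while traversing `path` (N x 2, metres)
    in `duration` seconds, ignoring the limb's own dynamics."""
    dt = duration / (len(path) - 1)
    vel = np.gradient(path, dt, axis=0)
    power = damping(path[:, 0]) * np.sum(vel ** 2, axis=1)    # b * |v|^2
    return np.sum(power) * dt

t = np.linspace(0.0, 1.0, 400)
straight = np.column_stack([np.zeros_like(t), 0.15 * t])           # 15 cm reach
curved = np.column_stack([0.06 * np.sin(np.pi * t), 0.15 * t])     # lateral detour

print(f"work along straight path: {work_against_field(straight):.2f} J")
print(f"work along curved path:   {work_against_field(curved):.2f} J")
```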