David J. Ostry
McGill University
Publications
Featured research published by David J. Ostry.
Journal of Motor Behavior | 1993
J. Randall Flanagan; David J. Ostry; Anatol G. Feldman
Human reaching movements to fixed and displaced visual targets were recorded and compared with simulated movements generated by using a two-joint arm model based on the equilibrium-point (EP) hypothesis (lambda model) of motor control (Feldman, 1986). The aim was to investigate the form of central control signals underlying these movements. According to this hypothesis, movements result from changes in control variables that shift the equilibrium position (EP) of the arm. At any time, muscle activations and forces will depend on the difference between the arm's EP and its actual position and on the limb's velocity. In this article, we suggest that the direction of EP shift in reaching is specified at the hand level, whereas the rate of EP shift may be specified at the hand or joint level. A common mechanism underlying reaching to fixed and displaced targets is proposed whereby the EP of the hand shifts in a straight line toward the current target. After the target is displaced, the direction of the hand EP shift is modified toward the second target. The results suggest that the rate of shift of the hand EP may be modified for movements in different parts of the workspace. The model, with control signals that vary in a simple fashion over time, is able to generate the kinematic patterns observed empirically.
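The control scheme the abstract describes can be illustrated with a minimal single-joint sketch (not the authors' two-joint model; the stiffness, damping, and inertia values here are illustrative assumptions, not values from the paper). Torque depends on the difference between the joint's current position and a centrally shifted equilibrium position, plus a velocity-dependent term, and the EP is ramped toward the target at a constant rate:

```python
# Minimal single-joint sketch of the equilibrium-point (lambda) idea.
# Muscle torque depends on the difference between the joint angle and a
# centrally specified equilibrium position (EP), plus a velocity term.
# Stiffness k, damping b, and inertia are hypothetical values.

def ep_torque(theta, omega, ep, k=5.0, b=0.5):
    """Restoring torque toward the equilibrium position `ep` (rad)."""
    return -k * (theta - ep) - b * omega

def simulate_reach(ep_start, ep_end, shift_time=0.4, dt=0.001, t_end=1.5):
    """Shift the EP linearly toward the target, then hold; the limb follows."""
    inertia = 0.1
    theta, omega = ep_start, 0.0
    t = 0.0
    while t < t_end:
        # Ramp control signal: EP moves at a constant rate, then holds.
        frac = min(t / shift_time, 1.0)
        ep = ep_start + frac * (ep_end - ep_start)
        alpha = ep_torque(theta, omega, ep) / inertia
        omega += alpha * dt
        theta += omega * dt
        t += dt
    return theta

# The limb settles near the shifted equilibrium position.
print(round(simulate_reach(0.0, 1.0), 3))
```

A displaced target would be handled the same way: mid-ramp, the EP's direction of shift is simply redirected toward the new target, with no change to the control rule itself.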
Journal of the Acoustical Society of America | 1985
David J. Ostry; Kevin G. Munhall
A computerized pulsed-ultrasound system was used to monitor tongue dorsum movements during the production of consonant-vowel sequences in which speech rate, vowel, and consonant were varied. The kinematics of tongue movement were analyzed by measuring the lowering gesture of the tongue to give estimates of movement amplitude, duration, and maximum velocity. All three subjects in the study showed reliable correlations between the amplitude of the tongue dorsum movement and its maximum velocity. Further, the ratio of the maximum velocity to the extent of the gesture, a kinematic indicator of articulator stiffness, was found to vary inversely with the duration of the movement. This relationship held both within individual conditions and across all conditions in the study such that a single function was able to accommodate a large proportion of the variance due to changes in movement duration. As similar findings have been obtained both for abduction and adduction gestures of the vocal folds and for rapid voluntary limb movements, the data suggest that a wide range of changes in the duration of individual movements might all have a similar origin. The control of movement rate and duration through the specification of biomechanical characteristics of speech articulators is discussed.
Nature | 2003
Stephanie Tremblay; Douglas M. Shiller; David J. Ostry
The hypothesis that speech goals are defined acoustically and maintained by auditory feedback is a central idea in speech production research. An alternative proposal is that speech production is organized in terms of control signals that subserve movements and associated vocal-tract configurations. Indeed, the capacity for intelligible speech by deaf speakers suggests that somatosensory inputs related to movement play a role in speech production—but studies that might have documented a somatosensory component have been equivocal. For example, mechanical perturbations that have altered somatosensory feedback have simultaneously altered acoustics. Hence, any adaptation observed under these conditions may have been a consequence of acoustic change. Here we show that somatosensory information on its own is fundamental to the achievement of speech movements. This demonstration involves a dissociation of somatosensory and auditory feedback during speech production. Over time, subjects correct for the effects of a complex mechanical load that alters jaw movements (and hence somatosensory feedback), but which has no measurable or perceptible effect on acoustic output. The findings indicate that the positions of speech articulators and associated somatosensory inputs constitute a goal of speech movements that is wholly separate from the sounds produced.
Experimental Brain Research | 2003
David J. Ostry; Anatol G. Feldman
The ability to formulate explicit mathematical models of motor systems has played a central role in recent progress in motor control research. As a result of these modeling efforts, and in particular the incorporation of concepts drawn from control systems theory, ideas about motor control have changed substantially. There is growing emphasis on motor learning and particularly on predictive or anticipatory aspects of control that are related to the neural representation of dynamics. Two ideas have become increasingly prominent in mathematical modeling of motor function: forward internal models and inverse dynamics. The notion of forward internal models, which draws on work in adaptive control, arises from the recognition that the nervous system takes account of dynamics in motion planning. Inverse dynamics, a complementary way of adjusting control signals to deal with dynamics, has proved a simple means to establish the joint torques necessary to produce desired movements. In this paper, we review the force control formulation in which inverse dynamics and forward internal models play a central role. We present evidence in its favor and describe its limitations. We note that inverse dynamics and forward models are potential solutions to general problems in motor control: how the nervous system establishes a mapping between desired movements and associated control signals, and how control signals are adjusted in the context of motor learning, dynamics, and loads. However, we find little empirical evidence that specifically supports the inverse dynamics or forward internal model proposals per se. We further conclude that the central idea of the force control hypothesis, that control levels operate through the central specification of forces, is flawed. This is specifically evident in the context of attempts to incorporate physiologically realistic muscle and reflex mechanisms into the force control model. In particular, the formulation offers no means to shift between postures without triggering resistance due to postural stabilizing mechanisms.
The Journal of Neuroscience | 2010
David J. Ostry; Mohammad Darainy; Andrew A. G. Mattar; Jeremy Wong; Paul L. Gribble
Motor learning is dependent upon plasticity in motor areas of the brain, but does it occur in isolation, or does it also result in changes to sensory systems? We examined changes to somatosensory function that occur in conjunction with motor learning. We found that even after periods of training as brief as 10 min, sensed limb position was altered and the perceptual change persisted for 24 h. The perceptual change was reflected in subsequent movements: limb movements following learning deviated from the prelearning trajectory in the same direction as the perceptual shift and by a comparable magnitude. Crucially, the perceptual change was dependent upon motor learning. When the limb was displaced passively, such that subjects experienced similar kinematics but without learning, no sensory change was observed. The findings indicate that motor learning affects not only motor areas of the brain but sensory function as well.
Journal of Experimental Psychology: Human Perception and Performance | 1985
Kevin G. Munhall; David J. Ostry; Avraham Parush
The control of individual speech gestures was investigated by examining laryngeal and tongue movements during vowel and consonant production. A number of linguistic manipulations known to alter the durational characteristics of speech (i.e., speech rate, lexical stress, and phonemic identity) were tested. In all cases a consistent pattern was observed in the kinematics of the laryngeal and tongue gestures. The ratio of maximum instantaneous velocity to movement amplitude, a kinematic index of mass-normalized stiffness, was found to increase systematically as movement duration decreased. Specifically, the ratio of maximum velocity to movement amplitude varied as a function of a parameter, C, times the reciprocal of movement duration. The conformity of the data to this relation indicates that durational change is accomplished by scalar adjustment of a base velocity form. These findings are consistent with the idea that kinematic change is produced by the specification of articulator stiffness.
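The kinematic relation reported here (and in the ultrasound study of tongue movement above) can be written compactly; symbols follow the abstract, with $T$ standing for movement duration:

```latex
\[
\frac{v_{\max}}{A} = \frac{C}{T}
\]
```

Since the ratio $v_{\max}/A$ indexes mass-normalized stiffness, halving the movement duration $T$ doubles that ratio: durational change reduces to scalar adjustment of a base velocity form, as the abstract concludes.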
The Journal of Neuroscience | 2004
Nicole Malfait; David J. Ostry; Tomáš Paus
Substantial neurophysiological evidence points to the posterior parietal cortex (PPC) as playing a key role in the coordinate transformation necessary for visually guided reaching. Our goal was to examine the role of PPC in the context of learning new dynamics of arm movements. We assessed this possibility by stimulating PPC with transcranial magnetic stimulation (TMS) while subjects learned to make reaching movements with their right hand in a velocity-dependent force field. We reasoned that, if PPC is necessary to adjust the trajectory of the arm as it interacts with a novel mechanical system, interfering with the functioning of PPC would impair adaptation. Single pulses of TMS were applied over the left PPC 40 msec after the onset of movement during adaptation. As a control, another group of subjects was stimulated over the visual cortex. During early stages of learning, the magnitude of the error (measured as the deviation of the hand paths) was similar across groups. By the end of the learning period, however, error magnitudes decreased to baseline levels for controls but remained significantly larger for the group stimulated over PPC. Our findings are consistent with a role of PPC in the adjustment of motor commands necessary for adapting to a novel mechanical environment.
The Journal of Neuroscience | 2011
Shahabeddin Vahdat; Mohammad Darainy; Theodore E. Milner; David J. Ostry
Motor learning changes the activity of cortical motor and subcortical areas of the brain, but does learning affect sensory systems as well? We examined in humans the effects of motor learning using fMRI measures of functional connectivity under resting conditions and found persistent changes in networks involving both motor and somatosensory areas of the brain. We developed a technique that allows us to distinguish changes in functional connectivity that can be attributed to motor learning from those that are related to perceptual changes that occur in conjunction with learning. Using this technique, we identified a new network in motor learning involving second somatosensory cortex, ventral premotor cortex, and supplementary motor cortex whose activation is specifically related to perceptual changes that occur in conjunction with motor learning. We also found changes in a network comprising cerebellar cortex, primary motor cortex, and dorsal premotor cortex that were linked to the motor aspects of learning. In each network, we observed highly reliable linear relationships between neuroplastic changes and behavioral measures of either motor learning or perceptual function. Motor learning thus results in functionally specific changes to distinct resting-state networks in the brain.
Journal of Phonetics | 2008
Carol A. Fowler; Valery Sramko; David J. Ostry; Sarah Rowland; Pierre A. Hallé
We examined the voice onset times (VOTs) of monolingual and bilingual speakers of English and French to ask whether cross-language phonetic influences occur particularly in simultaneous bilinguals (that is, speakers who learned both languages from birth). Speakers produced sentences containing target words with initial /p/, /t/, or /k/. In French, natively bilingual speakers produced significantly longer VOTs than monolingual French speakers, and French VOTs were longer still in bilinguals who learned English before French. The outcome was analogous in English: natively bilingual speakers produced shorter English VOTs than monolingual speakers, and English VOTs were shorter still in the speech of bilinguals who learned French before English. Bilingual speakers nonetheless had significantly longer VOTs in their English speech than in their French. Accordingly, the cross-language effects do not arise because natively bilingual speakers adopt a single set of voiceless stop categories, intermediate between those of native English and French speakers, that serves both languages. Monolingual speakers of French or English in Montreal had VOTs nearly identical to those of monolingual Parisian French and monolingual Connecticut English speakers, respectively. These results suggest that mere exposure to a second language does not underlie the cross-language phonetic effect; however, these findings must be reconciled with others that appear to show an effect of overhearing.
Proceedings of the National Academy of Sciences of the United States of America | 2009
Takayuki Ito; Mark Tiede; David J. Ostry
Somatosensory signals from the facial skin and muscles of the vocal tract provide a rich source of sensory input in speech production. We show here that the somatosensory system is also involved in the perception of speech. We use a robotic device to create patterns of facial skin deformation that would normally accompany speech production. We find that when we stretch the facial skin while people listen to words, it alters the sounds they hear. The systematic perceptual variation we observe in conjunction with speech-like patterns of skin stretch indicates that somatosensory inputs affect the neural processing of speech sounds and shows the involvement of the somatosensory system in the perceptual processing of speech.