Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where David A. Crowe is active.

Publication


Featured research published by David A. Crowe.


Proceedings of the National Academy of Sciences of the United States of America | 2002

Parallel processing of serial movements in prefrontal cortex.

Bruno B. Averbeck; Matthew V. Chafee; David A. Crowe; Apostolos P. Georgopoulos

A key idea in Lashley's formulation of the problem of serial order in behavior is the postulated neural representation of all serial elements before the action begins. We studied this question by recording the activity of individual neurons simultaneously in small ensembles in prefrontal cortex while monkeys copied geometrical shapes shown on a screen. Monkeys drew the shapes as sequences of movement segments, and these segments were associated with distinct patterns of neuronal ensemble activity. Here we show that these patterns were present during the time preceding the actual drawing. The rank of the strength of representation of a segment in the neuronal population during this time, as assessed by discriminant analysis, predicted the serial position of the segment in the motor sequence. An analysis of errors in copying and their neural correlates supplied additional evidence for this code and provided a neural basis for Lashley's hypothesis that errors in motor sequences would be most likely to occur when executing elements that had prior representations of nearly equal strength.
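
The ranking step described above can be illustrated with a small, self-contained sketch. It assumes simulated ensemble spike counts and scikit-learn's LinearDiscriminantAnalysis; the data, ensemble size, and variable names are placeholders rather than the study's actual recordings or analysis code.

```python
# Sketch: rank pre-movement representation strength of movement segments
# with a linear discriminant classifier (simulated data, not the original).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_trials, n_neurons, n_segments = 200, 30, 4

# Simulated ensemble spike counts during drawing, labeled by segment.
segment_labels = rng.integers(0, n_segments, n_trials)
tuning = rng.normal(0, 1, (n_segments, n_neurons))            # per-segment mean rates
drawing_activity = tuning[segment_labels] + rng.normal(0, 1, (n_trials, n_neurons))

# Train the discriminant model on activity recorded while segments are drawn.
lda = LinearDiscriminantAnalysis().fit(drawing_activity, segment_labels)

# Apply it to pre-movement activity from one trial whose sequence is known.
premovement = tuning[[2, 0, 3]].mean(axis=0) + rng.normal(0, 1, n_neurons)
posterior = lda.predict_proba(premovement.reshape(1, -1))[0]

# Rank of each segment's posterior probability ~ predicted serial position.
rank = np.argsort(-posterior)
print("representation strength per segment:", np.round(posterior, 3))
print("predicted serial order (strongest first):", rank)
```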


The Journal of Neuroscience | 2010

Rapid sequences of population activity patterns dynamically encode task-critical spatial information in parietal cortex

David A. Crowe; Bruno B. Averbeck; Matthew V. Chafee

We characterized the temporal dynamics of population activity in parietal cortex of monkeys as they solved a spatial cognitive problem posed by an object construction task. We applied pattern classification techniques to characterize patterns of activity coding object-centered side, a task-defined variable specifying whether an object component was located on the left or right side of a reference object, regardless of its retinocentric position. During a period in which the value of object-centered side, as defined by task events, remained constant, parietal cortex represented this variable using a dynamic neural code by activating neurons with the same spatial preference in rapid succession so that the pattern of active neurons changed dramatically while the spatial information they collectively encoded remained stable. Furthermore, if the neurons shared the same spatial preference, then their pretrial activity (measured before objects were shown) was correlated to a degree that scaled as a positive linear function of how close together in time the neurons would be activated later in the trial. Finally, we found that while parietal cortex represented task-critical spatial information using a dynamic neural code, it simultaneously represented task-irrelevant spatial information using a stationary neural code. These data demonstrate that dynamic spatial representations exist in parietal cortex, provide novel insight into the synaptic mechanisms that generate them, and suggest they may preferentially encode task-critical spatial information.
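
A minimal sketch of time-resolved population decoding of a binary spatial variable follows, assuming simulated data in which different neurons carry the signal in different time bins; it illustrates the general approach (per-bin classification), not the paper's specific pipeline.

```python
# Sketch: time-resolved decoding of a binary spatial variable from
# population activity (simulated data; illustrates the approach only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_neurons, n_bins = 120, 40, 10
side = rng.integers(0, 2, n_trials)                  # left (0) vs right (1)

# Different neurons carry the signal in different time bins (dynamic code).
activity = rng.normal(0, 1, (n_trials, n_bins, n_neurons))
for b in range(n_bins):
    carriers = rng.choice(n_neurons, 5, replace=False)
    activity[:, b, carriers] += np.outer(side, np.ones(5))

# Decode side separately in each bin: accuracy stays high even though
# the informative neurons (and thus the activity pattern) keep changing.
for b in range(n_bins):
    acc = cross_val_score(LogisticRegression(max_iter=1000),
                          activity[:, b, :], side, cv=5).mean()
    print(f"bin {b}: decoding accuracy = {acc:.2f}")
```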


Nature Neuroscience | 2013

Prefrontal neurons transmit signals to parietal neurons that reflect executive control of cognition

David A. Crowe; Shikha Jain Goodwin; Rachael K. Blackman; Sofia Sakellaridi; Scott R. Sponheim; Angus W. MacDonald; Matthew V. Chafee

Prefrontal cortex influences behavior largely through its connections with other association cortices; however, the nature of the information conveyed by prefrontal output signals and what effect these signals have on computations performed by target structures is largely unknown. To address these questions, we simultaneously recorded the activity of neurons in prefrontal and posterior parietal cortices of monkeys performing a rule-based spatial categorization task. Parietal cortex receives direct prefrontal input, and parietal neurons, like their prefrontal counterparts, exhibit signals that reflect rule-based cognitive processing in this task. By analyzing rapid fluctuations in the cognitive information encoded by activity in the two areas, we obtained evidence that signals reflecting rule-dependent categories were selectively transmitted in a top-down direction from prefrontal to parietal neurons, suggesting that prefrontal output is important for the executive control of distributed cognitive processing.
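
One simple way to probe directed transmission of this kind is a lagged correlation between the moment-to-moment information time courses of the two areas. The sketch below uses simulated signals and a plain cross-correlation as a stand-in; the original analysis was more involved, so treat this purely as an illustration of the logic.

```python
# Sketch: lagged correlation between moment-to-moment category information
# in two areas (simulated); a peak at a positive lag suggests the first
# area leads the second. This is an illustration, not the paper's analysis.
import numpy as np

rng = np.random.default_rng(2)
n_t = 500
pfc_info = rng.normal(0, 1, n_t)                 # fluctuating category signal in PFC
noise = rng.normal(0, 1, n_t)
lag_true = 5                                     # parietal follows PFC by 5 bins
parietal_info = np.roll(pfc_info, lag_true) + 0.5 * noise

def lagged_corr(x, y, lag):
    """Correlation of x(t) with y(t + lag)."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    if lag < 0:
        return np.corrcoef(x[-lag:], y[:lag])[0, 1]
    return np.corrcoef(x, y)[0, 1]

lags = range(-20, 21)
corrs = [lagged_corr(pfc_info, parietal_info, k) for k in lags]
best = list(lags)[int(np.argmax(corrs))]
print(f"peak correlation at lag {best} bins (positive = PFC leads)")
```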


European Journal of Neuroscience | 2010

Understanding the parietal lobe syndrome from a neurophysiological and evolutionary perspective

Roberto Caminiti; Matthew V. Chafee; Alexandra Battaglia-Mayer; Bruno B. Averbeck; David A. Crowe; Apostolos P. Georgopoulos

In human and nonhuman primates parietal cortex is formed by a multiplicity of areas. For those of the superior parietal lobule (SPL) there exists a certain homology between man and macaques. As a consequence, optic ataxia, a disturbed visual control of hand reaching, has similar features in man and monkeys. Establishing such correspondence has proven difficult for the areas of the inferior parietal lobule (IPL). This difficulty depends on many factors. First, no physiological information is available in man on the dynamic properties of cells in the IPL. Second, the number of IPL areas identified in the monkey is paradoxically higher than that so far described in man, although this issue will probably be reconsidered in future years, thanks to comparative imaging studies. Third, the consequences of parietal lesions in monkeys do not always match those observed in humans. This is another paradox if one considers that, in certain cases, the functional properties of neurons in the monkey’s IPL would predict the presence of behavioral skills, such as construction capacity, that however do not seem to emerge in the wild. Therefore, constructional apraxia, which is well characterized in man, has never been described in monkeys and apes. Finally, only certain aspects, i.e. hand directional hypokinesia and gaze apraxia (Balint’s psychic paralysis of gaze), of the multifaceted syndrome hemispatial neglect have been described in monkeys. These similarities, differences and paradoxes, among many others, make the study of the evolution and function of parietal cortex a challenging case.


The Journal of Neuroscience | 2008

Neural Ensemble Decoding Reveals a Correlate of Viewer- to Object-Centered Spatial Transformation in Monkey Parietal Cortex

David A. Crowe; Bruno B. Averbeck; Matthew V. Chafee

The parietal cortex contains representations of space in multiple coordinate systems including retina-, head-, body-, and world-based systems. Previously, we found that when monkeys are required to perform spatial computations on objects, many neurons in parietal area 7a represent position in an object-centered coordinate system as well. Because visual information enters the brain in a retina-centered reference frame, generation of an object-centered reference requires the brain to perform computation on the visual input. We provide evidence that area 7a contains a correlate of that computation. Specifically, area 7a contains neurons that code information in retina- and object-centered coordinate systems. The information in retina-centered coordinates emerges first, followed by the information in object-centered coordinates. We found that the strength and accuracy of these representations is correlated across trials. Finally, we found that retina-centered information could be used to predict subsequent object-centered signals, but not vice versa. These results are consistent with the hypothesis that either area 7a, or an area that precedes area 7a in the visual processing hierarchy, is performing the retina- to object-centered transformation.


The Journal of Neuroscience | 2014

Dynamic Representation of the Temporal and Sequential Structure of Rhythmic Movements in the Primate Medial Premotor Cortex

David A. Crowe; Wilbert Zarco; Ramon Bartolo; Hugo Merchant

We determined the encoding properties of single cells and the decoding accuracy of cell populations in the medial premotor cortex (MPC) of Rhesus monkeys to represent in a time-varying fashion the duration and serial order of six intervals produced rhythmically during a synchronization-continuation tapping task. We found that MPC represented the temporal and sequential structure of rhythmic movements by activating small ensembles of neurons that encoded the duration or the serial order in rapid succession, so that the pattern of active neurons changed dramatically within each interval. Interestingly, the width of the encoding or decoding function for serial order increased as a function of duration. Finally, we found that the strength of correlation in spontaneous activity of the individual cells varied as a function of the timing of their recruitment. These results demonstrate the existence of dynamic representations in MPC for the duration and serial order of intervals produced rhythmically and suggest that this dynamic code depends on ensembles of interconnected neurons that provide a strong synaptic drive to the next ensemble in a consecutive chain of neural events.
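
The duration-dependent width of serial-order tuning can be illustrated by fitting a Gaussian tuning curve to simulated firing rates at two interval durations. The cell, rates, and widths below are invented for illustration; only the fitting procedure (scipy's curve_fit) is meant to convey how such a width estimate could be obtained.

```python
# Sketch: Gaussian tuning of a single cell to serial order, with tuning
# width that grows with interval duration (simulated data).
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, amp, center, width, base):
    return base + amp * np.exp(-0.5 * ((x - center) / width) ** 2)

rng = np.random.default_rng(6)
orders = np.arange(1, 7)                       # six produced intervals

for duration, true_width in [(450, 0.8), (850, 1.4)]:
    # Simulated mean rates of a cell tuned to serial order 3.
    rates = gauss(orders, 20, 3, true_width, 5) + rng.normal(0, 1, orders.size)
    popt, _ = curve_fit(gauss, orders, rates, p0=[15, 3, 1, 5], maxfev=5000)
    print(f"{duration} ms intervals: fitted tuning width = {abs(popt[2]):.2f}")
```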


Neuron | 2005

Dynamics of Parietal Neural Activity during Spatial Cognitive Processing

David A. Crowe; Bruno B. Averbeck; Matthew V. Chafee; Apostolos P. Georgopoulos

Dynamic neural processing unrelated to changes in sensory input or motor output is likely to be a hallmark of cognitive operations. Here we show that neural representations of space in parietal cortex are dynamic while monkeys perform a spatial cognitive operation on a static visual stimulus. We recorded neural activity in area 7a during a visual maze task in which monkeys mentally followed a path without moving their eyes. We found that the direction of the followed path could be recovered from neuronal population activity. When the monkeys covertly processed a path that turned, the population representation of path direction shifted in the direction of the turn. This neural population dynamic took place during a period of unchanging visual input and showed characteristics of both serial and parallel processing. The data suggest that the dynamic evolution of parietal neuronal activity is associated with the progression of spatial cognitive operations.
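
A population-vector readout gives a simple picture of how a represented direction can shift while the stimulus stays fixed. The sketch below uses simulated cosine-tuned cells and is an illustration of the decoding idea, not the study's analysis.

```python
# Sketch: population-vector readout of direction from direction-tuned cells
# (simulated), showing how the decoded direction can shift over time while
# the visual input stays fixed. Illustrative only.
import numpy as np

rng = np.random.default_rng(3)
n_neurons = 60
preferred = rng.uniform(0, 2 * np.pi, n_neurons)      # preferred directions

def rates_for(direction):
    # Cosine tuning around each cell's preferred direction.
    return 1.0 + np.cos(direction - preferred)

def population_vector(rates):
    # Sum of preferred-direction unit vectors weighted by firing rate.
    x = np.sum(rates * np.cos(preferred))
    y = np.sum(rates * np.sin(preferred))
    return np.arctan2(y, x)

# Represented direction rotates from 0 to 90 degrees across time bins,
# mimicking a covert shift during processing of a turning maze path.
for t, d in enumerate(np.linspace(0, np.pi / 2, 5)):
    decoded = population_vector(rates_for(d) + rng.normal(0, 0.1, n_neurons))
    print(f"bin {t}: decoded direction = {np.degrees(decoded):.1f} deg")
```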


Journal of Cognitive Neuroscience | 2000

Mental Maze Solving

David A. Crowe; Bruno B. Averbeck; Matthew V. Chafee; John H. Anderson; Apostolos P. Georgopoulos

We sought to determine how a visual maze is mentally solved. Human subjects (N = 13) viewed mazes with orthogonal, unbranched paths; each subject solved 200-600 mazes in any specific experiment below. There were four to six openings at the perimeter of the maze, of which four were labeled: one was the entry point and the remainder were potential exits marked by Arabic numerals. Starting at the entry point, in some mazes the path exited, whereas in others it terminated within the maze. Subjects were required to type the number corresponding to the true exit (if the path exited) or type zero (if the path did not exit). In all cases, the only required hand movement was a key press, and thus the hand never physically traveled through the maze. Response times (RT) were recorded and analyzed using a multiple linear regression model. RT increased as a function of key parameters of the maze, namely the length of the main path, the number of turns in the path, the direct distance from entry to termination, and the presence of an exit. The dependence of RT on the number of turns was present even when the path length was fixed in a separate experiment (N = 10 subjects). In a different experiment, subjects solved large and small mazes (N = 3 subjects). The former was the same as the latter but was scaled up by 1.77 times. Thus both kinds of mazes contained the same number of squares but each square subtended 1.77 degrees of visual angle (DVA) in the large maze, as compared to 1 DVA in the small one. We found that the average RT was practically the same in both cases. A multiple regression analysis revealed that the processing coefficients related to maze distance (i.e., path length and direct distance) were reduced by approximately one-half when solving large mazes, as compared to solving small mazes. This means that the efficiency in processing distance-related information almost doubled for scaled-up mazes. In contrast, the processing coefficients for number of turns and exit status were practically the same in the two cases. Finally, the eye movements of three subjects were recorded during maze solution. They consisted of sequences of saccades and fixations. The number of fixations in a trial increased as a linear function of the path length and number of turns. With respect to the fixations themselves, the eyes tended to fixate on the main path and to follow it along its course, such that fixations occurring later in time were positioned at progressively longer distances from the entry point. Furthermore, the time the eyes spent at each fixation point increased as a linear function of the length and number of turns in the path segment between the current and the upcoming fixation points. These findings suggest that the maze segment from the current fixation spot to the next is being processed during the fixation time (FT), and that a significant aspect of this processing relates to the length and turns in that segment. We interpreted these relations to mean that the maze was mentally traversed. We then estimated the distance and endpoint of the path mentally traversed within a specific FT; we also hypothesized that the next portion of the main path would be traversed during the ensuing FT, and so on for the whole path. A prediction of this hypothesis is that the upcoming saccade would land the eyes at or near the locus on the path where the mental traversing ended, so that the eyes would pick up where the mental traversal left off.
In this way, a portion of the path would be traversed during a fixation and successive such portions would be strung together closely along the main path to complete the processing of the whole path. We tested this prediction by analyzing the relations between the path distance of mental traverse and the distance along the path between the current and the next fixation spot. Indeed, we found that these distances were practically the same and that the endpoint of the hypothesized mental path traversing was very close to the point where the eye landed by the saccade to initiate a new mental traversing. This forward progression of fixation points along the maze path, coupled with the ongoing analysis of the path between successive fixation points, would constitute an algorithm for the routine solution of a maze.
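
The regression model described above maps naturally onto a short sketch: fit response time against path length, number of turns, direct distance, and exit status. The data below are simulated and the coefficient values are arbitrary; only the structure of the regression follows the description.

```python
# Sketch: multiple linear regression of response time on maze parameters
# (simulated trials; predictor names follow the description above).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n_trials = 400
path_length = rng.uniform(5, 25, n_trials)      # squares along the main path
n_turns = rng.integers(0, 6, n_trials)          # turns in the path
direct_dist = rng.uniform(1, 15, n_trials)      # entry-to-termination distance
has_exit = rng.integers(0, 2, n_trials)         # 1 if the path exits

# Simulated RT (ms) built from additive contributions plus noise.
rt = (800 + 40 * path_length + 120 * n_turns
      + 25 * direct_dist + 150 * has_exit + rng.normal(0, 100, n_trials))

X = np.column_stack([path_length, n_turns, direct_dist, has_exit])
model = LinearRegression().fit(X, rt)
for name, coef in zip(["path length", "turns", "direct distance", "exit"],
                      model.coef_):
    print(f"{name}: {coef:.1f} ms per unit")
```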


Frontiers in Systems Neuroscience | 2011

Top-down spatial categorization signal from prefrontal to posterior parietal cortex in the primate.

Hugo Merchant; David A. Crowe; Melissa S. Robertson; Antonio F. Fortes; Apostolos P. Georgopoulos

In the present study we characterized the strength and time course of category-selective responses in prefrontal cortex and area 7a of the posterior parietal cortex during a match-to-sample spatial categorization task. A monkey was trained to categorize whether the height of a horizontal sample bar, presented in a rectangular frame at one of three vertical locations, was “high” or “low,” depending on whether its position was above or below the frame's midline. After the display of this sample bar, and after a delay, choice bars were sequentially flashed in two locations: at the top and at the bottom of the frame (“choice” epoch). If the monkey timed its response to the display of the choice bar that matched the sample bar, he was rewarded. We found that cells in prefrontal cortex discriminated category early after the initial sample bar was shown, and continued to differentiate “up” from “down” trials throughout the delay and choice periods. In contrast, parietal cells did not differentiate category until the choice period. Therefore, our results support the notion of a top-down categorical signal that originates in prefrontal cortex and that is only represented in parietal cortex when it is necessary to express the categorical decision through a movement.


Cortex | 2009

Differential contribution of superior parietal and dorsal-lateral prefrontal cortices in copying

Bruno B. Averbeck; David A. Crowe; Matthew V. Chafee; Apostolos P. Georgopoulos

In this study we examined the differential contribution of superior parietal cortex (SPC) and caudal dorsal-lateral prefrontal cortex (dlPFC) to drawing geometrical shapes. Monkeys were trained to draw triangles, squares, trapezoids and inverted triangles while we recorded the activity of small ensembles of neurons in caudal area 46 and areas 5 and 2 of parietal cortex. We analyzed the drawing factors encoded by individual neurons by fitting a step-wise general-linear model using as our dependent variable the firing rate averaged over segments of the produced trajectories. This analysis demonstrated that both cognitive (shape and segment serial position) and motor (maximum speed, position and direction of segment) factors modulated the activity of individual neurons. Furthermore, SPC had an enriched representation of both shape and motor factors, with the motor enrichment being stronger than the shape enrichment. Following this we used the activity in the simultaneously recorded neural ensembles to predict the hand velocity. In these analyses we found that the prediction of the hand velocity was better when we estimated different linear decoding functions for each shape than when we estimated a single function across shapes, although it was a subtle effect. Furthermore, we also found that ensembles of caudal dlPFC neurons carried considerable information about hand velocity, a purely motor factor. However, the SPC ensembles carried more information at the ensemble level as a function of the ensemble size than the caudal dlPFC ensembles, although the differences were not dramatic. Finally, an analysis of the response latencies of individual neurons showed that the caudal dlPFC representation was more sensory than the SPC representation, which was equally sensory and motor. Thus, this neurophysiological evidence suggests that both SPC and caudal dlPFC have a role in drawing, but that SPC plays a larger role in both the cognitive and the motor components.
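
The comparison between a single pooled velocity decoder and shape-specific decoders can be sketched as follows, using simulated ensemble activity and ridge regression as a generic linear decoder (an assumption; the paper's exact decoding functions are not reproduced here).

```python
# Sketch: comparing a single pooled linear decoder of hand velocity with
# shape-specific decoders (simulated ensembles; illustrative only).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_samples, n_neurons, n_shapes = 600, 25, 4
shape = rng.integers(0, n_shapes, n_samples)
rates = rng.normal(0, 1, (n_samples, n_neurons))

# Velocity depends on ensemble activity through weights that differ
# slightly from shape to shape.
base_w = rng.normal(0, 1, n_neurons)
shape_w = base_w + 0.3 * rng.normal(0, 1, (n_shapes, n_neurons))
velocity = np.einsum("ij,ij->i", rates, shape_w[shape]) + rng.normal(0, 0.5, n_samples)

pooled = cross_val_score(Ridge(), rates, velocity, cv=5, scoring="r2").mean()
per_shape = np.mean([
    cross_val_score(Ridge(), rates[shape == s], velocity[shape == s],
                    cv=5, scoring="r2").mean()
    for s in range(n_shapes)
])
print(f"pooled decoder R^2: {pooled:.2f}")
print(f"mean shape-specific decoder R^2: {per_shape:.2f}")
```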

Collaboration


Dive into David A. Crowe's collaborations.

Top Co-Authors

Apostolos P. Georgopoulos (Johns Hopkins University School of Medicine)
Bruno B. Averbeck (National Institutes of Health)
Hugo Merchant (National Autonomous University of Mexico)
Ramon Bartolo (National Autonomous University of Mexico)
Wilbert Zarco (National Autonomous University of Mexico)
Adele L. DeNicola (United States Department of Veterans Affairs)