Publication


Featured research published by Yale E. Cohen.


Nature Reviews Neuroscience | 2002

A common reference frame for movement plans in the posterior parietal cortex

Yale E. Cohen; Richard A. Andersen

Orchestrating a movement towards a sensory target requires many computational processes, including a transformation between reference frames. This transformation is important because the reference frames in which sensory stimuli are encoded often differ from those of motor effectors. The posterior parietal cortex has an important role in these transformations. Recent work indicates that a significant proportion of parietal neurons in two cortical areas transforms the sensory signals that are used to guide movements into a common reference frame. This common reference frame is an eye-centred representation that is modulated by eye-, head-, body- or limb-position signals. A common reference frame might facilitate communication between different areas that are involved in coordinating the movements of different effectors. It might also be an efficient way to represent the locations of different sensory targets in the world.
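For readers unfamiliar with the reference-frame terminology, the conversion the abstract describes can be pictured, to a first approximation, as a vector subtraction: a target's eye-centered direction is its head-centered direction minus the current eye-in-head position. The short Python sketch below illustrates only this general idea; the function and variable names are illustrative and are not taken from the paper.

# Illustrative sketch only (not the paper's analysis): to a first
# approximation, a target's eye-centered direction equals its
# head-centered direction minus the current eye-in-head position.
# Angles are in degrees; all names are hypothetical.
def head_to_eye_centered(target_head_deg, eye_position_deg):
    """Approximate eye-centered azimuth/elevation of a target."""
    return [t - e for t, e in zip(target_head_deg, eye_position_deg)]

# Example: a sound 20 deg right and 5 deg up of the head, with the eyes
# deviated 10 deg to the right, lies 10 deg right and 5 deg up of the fovea.
print(head_to_eye_centered([20.0, 5.0], [10.0, 0.0]))  # -> [10.0, 5.0]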


Neuron | 2000

Reaches to Sounds Encoded in an Eye-Centered Reference Frame

Yale E. Cohen; Richard A. Andersen

A recent hypothesis suggests that neurons in the lateral intraparietal area (LIP) and the parietal reach region (PRR) encode movement plans in a common eye-centered reference frame. To test this hypothesis further, we examined how PRR neurons encode reach plans to auditory stimuli. We found that PRR activity was affected by eye and initial hand position. Population analyses, however, indicated that PRR neurons were affected more strongly by eye position than by initial hand position. These eye position effects were appropriate to maintain coding in eye coordinates. Indeed, a significant population of PRR neurons encoded reaches to auditory stimuli in an eye-centered reference frame. These results extend the hypothesis that, regardless of the modality of the sensory input or the eventual action, PRR and LIP neurons represent movement plans in a common, eye-centered representation.


American Journal of Otolaryngology | 1993

Middle-Ear Development V: Development of Umbo Sensitivity in the Gerbil

Yale E. Cohen; Daryl E. Doan; David M. Rubin; James C. Saunders

PURPOSE: The development of the umbo response in the gerbil was studied in order to further elucidate the contribution of the middle ear to the development of auditory function. MATERIALS AND METHODS: Laser interferometry was used to study the development of umbo velocity in Mongolian gerbils between 10 days after birth and maturity. RESULTS: Before 15 days after birth, immaturities in the middle ear prevented any reliable measures of middle-ear motion. However, between 15 and 20 days after birth, a 10 dB improvement in umbo velocity was noted in the low-frequency (0.5 to 2.0 kHz) region of the umbo response. This improvement in sensitivity was correlated with an increased admittance due to an expanding bulla volume. Interestingly, umbo velocity remained relatively constant in the mid- and high-frequency regions of the response curve between 15 and 42 days after birth. The umbo response in the adult gerbil was decidedly different from the response at 42 days after birth. CONCLUSION: We speculate that a decrease in bulla volume along with increased ossicular mass contributed to the changes in the adult umbo response. When the maturation of the umbo response was compared with more central ontogenetic measures, it became apparent that structures more central to the middle ear continued to develop well past the time the middle ear was structurally and functionally mature.


Trends in Neurosciences | 1999

Maps versus clusters: different representations of auditory space in the midbrain and forebrain

Yale E. Cohen; Eric I. Knudsen

The auditory system determines the location of stimuli based on the evaluation of specific cues. The analysis begins in the tonotopic pathway, where these cues are processed in parallel, frequency-specific channels. This frequency-specific information is processed further in the midbrain and in the forebrain by specialized, space-processing pathways that integrate information across frequency channels, creating high-order neurons tuned to specific locations in space. Remarkably, the results of this integrative step are represented very differently in the midbrain and forebrain: in the midbrain, space is represented in maps, whereas, in the forebrain, space is represented in clusters of similarly tuned neurons. We propose that these different representations reflect the different roles that these two brain areas have in guiding behavior.


Journal of Cognitive Neuroscience | 2005

The Neurophysiology of Functionally Meaningful Categories: Macaque Ventrolateral Prefrontal Cortex Plays a Critical Role in Spontaneous Categorization of Species-Specific Vocalizations

Gordon W. Gifford; Katherine A. MacLean; Marc D. Hauser; Yale E. Cohen

Neurophysiological studies in nonhuman primates have demonstrated that the prefrontal cortex (PFC) plays a critical role in the acquisition of learned categories following training. What is presently unclear is whether this cortical area also plays a role in the spontaneous recognition and discrimination of natural categories. Here, we explore this possibility by recording from neurons in the PFC while rhesus monkeys listen to species-specific vocalizations that vary in terms of their social function and acoustic morphology. We found that ventral prefrontal cortex (vPFC) activity, on average, did not differentiate between food calls that were associated with the same functional category, despite their having different acoustic properties. In contrast, vPFC activity differentiated between food calls associated with different functional classes and, specifically, carried information about the quality and motivational value of the food. These results suggest that the vPFC is involved in the categorization of socially meaningful signals, thereby both extending its previously conceived role in the acquisition of learned categories and showing the significance of using natural categorical distinctions in the study of neural mechanisms.


Cerebral Cortex | 2009

Motor-Related Signals in the Intraparietal Cortex Encode Locations in a Hybrid, rather than Eye-Centered Reference Frame

O'Dhaniel A. Mullette-Gillman; Yale E. Cohen; Jennifer M. Groh

The reference frame used by intraparietal cortex neurons to encode locations is controversial. Many previous studies have suggested eye-centered coding, whereas we have reported that visual and auditory signals employ a hybrid reference frame (i.e., a combination of head- and eye-centered information) (Mullette-Gillman et al. 2005). One possible explanation for this discrepancy is that sensory-related activity, which we studied previously, is hybrid, whereas motor-related activity might be eye centered. Here, we examined the reference frame of visual and auditory saccade-related activity in the lateral and medial banks of the intraparietal sulcus (areas lateral intraparietal area [LIP] and medial intraparietal area [MIP]) of 2 rhesus monkeys. We recorded from 275 single neurons as monkeys performed visual and auditory saccades from different initial eye positions. We found that both visual and auditory signals reflected a hybrid of head- and eye-centered coordinates during both target and perisaccadic task periods rather than shifting to an eye-centered format as the saccade approached. This account differs from numerous previous recording studies. We suggest that the geometry of the receptive field sampling in prior studies was biased in favor of an eye-centered reference frame. Consequently, the overall hybrid nature of the reference frame was overlooked because the non-eye-centered response patterns were not fully characterized.
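As a rough intuition for what a "hybrid" reference frame means (this is an illustrative model, not the analysis used in the paper), one can picture a neuron whose tuning is anchored to a weighted mix of eye-centered and head-centered target coordinates. The Python sketch below makes that weighting explicit; all names and parameter values are hypothetical.

# Illustrative sketch only: a Gaussian-tuned response anchored to a
# weighted mix of eye-centered and head-centered coordinates.
# w = 1 would be purely eye-centered, w = 0 purely head-centered;
# intermediate w gives a hybrid frame. Angles are in degrees.
import math

def hybrid_response(target_head_deg, eye_position_deg, w=0.5,
                    preferred_deg=0.0, width_deg=20.0):
    """Firing-rate-like response under a hybrid reference frame."""
    target_eye_deg = target_head_deg - eye_position_deg   # eye-centered coordinate
    x = w * target_eye_deg + (1.0 - w) * target_head_deg  # hybrid coordinate
    return math.exp(-0.5 * ((x - preferred_deg) / width_deg) ** 2)

# Changing eye position shifts the tuning only partially when 0 < w < 1,
# unlike a fully eye-centered (w = 1) neuron, whose tuning shifts completely.
print(hybrid_response(10.0, 0.0), hybrid_response(10.0, 20.0))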


The Journal of Neuroscience | 2004

Selectivity for the Spatial and Nonspatial Attributes of Auditory Stimuli in the Ventrolateral Prefrontal Cortex

Yale E. Cohen; Brian E. Russ; Gordon W. Gifford; Ruwan Kiringoda; Katherine A. MacLean

Spatial and nonspatial auditory processing is hypothesized to occur in parallel dorsal and ventral pathways, respectively. In this study, we tested the spatial and nonspatial sensitivity of auditory neurons in the ventrolateral prefrontal cortex (vPFC), a cortical area in the hypothetical nonspatial pathway. We found that vPFC neurons were modulated significantly by both the spatial and nonspatial attributes of an auditory stimulus. When comparing these responses with those in the anterolateral belt region of the auditory cortex, which is hypothesized to be specialized for processing the nonspatial attributes of auditory stimuli, we found that the nonspatial sensitivity of vPFC neurons was poorer, whereas their spatial selectivity was better than that of anterolateral belt neurons. Also, the spatial and nonspatial sensitivity of vPFC neurons was comparable to that seen in the lateral intraparietal area, a cortical area that is part of the dorsal pathway. These data suggest that substantial spatial and nonspatial processing occurs in both the dorsal and ventral pathways.


Neuron | 1999

Who Goes There?

Yale E. Cohen; C. Mark Wessinger

Are there parallel processing streams for auditory spatial and object processing? At this point, the data do not adequately support this position. As we have discussed, auditory spatial and object processing appear to be mediated in the same cortical regions, which suggests that there are not independent processing streams. At best, studies have indicated that different cortical areas are involved preferentially in spatial or object processing (cf. Weeks et al. 1999). Of course, this may not be surprising, since most of the discussed cortical areas are strongly interconnected (Kaas and Hackett 1998; Romanski et al. 1999). More experimentation is necessary to further specify the contribution of different cortical areas to spatial and object processing. For example, it will be important to determine whether, in the prefrontal and parietal cortices, there are distinct neuronal populations that separately mediate auditory spatial and object processing or whether the same neuronal populations can subserve both processes (Rao et al. 1997).


Biology Letters | 2006

Spontaneous processing of abstract categorical information in the ventrolateral prefrontal cortex

Yale E. Cohen; Marc D. Hauser; Brian E. Russ

In various aspects of linguistic analysis and human cognition, some forms of observed variation are ignored in the service of handling more abstract categories. In the absence of training, rhesus monkeys discriminate between different types of vocalizations based on the information conveyed, as opposed to their acoustic morphologies. We hypothesized that neurons in the ventrolateral prefrontal cortex (vPFC), an area involved in auditory-object processing, might be involved in this spontaneous categorization. To test this hypothesis, we recorded vPFC activity while rhesus monkeys listened to vocalizations conveying information about food and non-food events. Results showed between-category, but not within-category, discrimination. That is, vPFC neurons discriminated between vocalizations associated with food versus non-food events, but not within the class of food calls associated with differences in quality. These results indicate that the vPFC plays a significant role in spontaneously processing abstract categorical information.


Experimental Brain Research | 2005

Spatial and non-spatial auditory processing in the lateral intraparietal area

Gordon W. Gifford; Yale E. Cohen

We tested the responses of neurons in the lateral intraparietal area (area LIP) for their sensitivity to the spatial and non-spatial attributes of an auditory stimulus. We found that the firing rates of LIP neurons were modulated by both of these attributes. These data indicate that, while area LIP is involved in spatial processing, non-spatial processing is not restricted to independent channels.

Collaboration


Dive into Yale E. Cohen's collaborations.

Top Co-Authors

Joji Tsunada, University of Pennsylvania
Adam M. Gifford, University of Pennsylvania
James C. Saunders, University of Pennsylvania
Alan A. Stocker, University of Pennsylvania
Sharath Bennur, University of Pennsylvania
Jung Hoon Lee, University of Pennsylvania