Publication


Featured research published by Jennifer M. Groh.


Neuron | 2001

Eye Position Influences Auditory Responses in Primate Inferior Colliculus

Jennifer M. Groh; Amanda S. Trause; Abigail M. Underhill; Kimberly Rose Clark; Souheil Inati

We examined the frame of reference of auditory responses in the inferior colliculus in monkeys fixating visual stimuli at different locations. Eye position modulated the level of auditory responses in 33% of the neurons we encountered, but it did not appear to shift their spatial tuning. The effect of eye position on auditory responses was substantial, comparable in magnitude to that of sound location. The eye position signal appeared to interact with the auditory responses in at least a partly multiplicative fashion. We conclude that the representation of sound location in primate IC is distributed and that the frame of reference is intermediate between head- and eye-centered coordinates. The information contained in these neurons appears to be sufficient for later neural stages to calculate the positions of sounds with respect to the eyes.
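
The multiplicative interaction described above amounts to a "gain field": eye position scales the level of a head-centered tuning curve without moving its peak. The sketch below illustrates this in Python; the Gaussian tuning shape and the linear gain term are illustrative assumptions, not parameters fitted to the data.

    import numpy as np

    def auditory_response(sound_az, eye_az, pref_az=-40.0, width=30.0, slope=0.01):
        # Head-centered spatial tuning; a Gaussian is a placeholder shape
        tuning = np.exp(-0.5 * ((sound_az - pref_az) / width) ** 2)
        # Multiplicative eye-position gain, clipped so rates stay non-negative
        gain = max(0.0, 1.0 + slope * eye_az)
        return tuning * gain

    # Eye position scales the response but leaves the preferred sound
    # location (the peak of the tuning curve) unchanged.
    for eye_az in (-24, 0, 24):
        rates = [auditory_response(az, eye_az) for az in range(-90, 91, 10)]
        print(eye_az, round(max(rates), 3))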


Current Biology | 2003

Eye Position Affects Activity in Primary Auditory Cortex of Primates

Uri Werner-Reiss; Kristin A. Kelly; Amanda S. Trause; Abigail M. Underhill; Jennifer M. Groh

BACKGROUND: Neurons in primary auditory cortex are known to be sensitive to the locations of sounds in space, but the reference frame for this spatial sensitivity has not been investigated. Conventional wisdom holds that the auditory and visual pathways employ different reference frames, with the auditory pathway using a head-centered reference frame and the visual pathway using an eye-centered reference frame. Reconciling these discrepant reference frames is therefore a critical component of multisensory integration.

RESULTS: We tested the reference frame of neurons in the auditory cortex of primates trained to fixate visual stimuli at different orbital positions. We found that eye position altered the activity of about one third of the neurons in this region (35 of 113, or 31%). Eye position affected not only the responses to sounds (26 of 113, or 23%), but also the spontaneous activity (14 of 113, or 12%). Such effects were also evident when monkeys moved their eyes freely in the dark. Eye position and sound location interacted to produce a representation for auditory space that was neither head- nor eye-centered in reference frame.

CONCLUSIONS: Taken together with emerging results in both visual and other auditory areas, these findings suggest that neurons whose responses reflect complex interactions between stimulus position and eye position set the stage for the eventual convergence of auditory and visual information.
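
The reference frame test can be pictured as asking whether tuning curves collected at different fixation positions superimpose best with no shift (head-centered) or when shifted by the change in eye position (eye-centered); a hybrid frame gives similar, intermediate fits for both. The following sketch is a hypothetical illustration of that logic, not the authors' published analysis.

    import numpy as np

    def frame_correlations(tuning_by_eye, azimuth_step):
        eyes = sorted(tuning_by_eye)
        ref = np.asarray(tuning_by_eye[eyes[0]], float)
        head, eye = [], []
        for e in eyes[1:]:
            cur = np.asarray(tuning_by_eye[e], float)
            # Head-centered prediction: curves superimpose as recorded
            head.append(np.corrcoef(ref, cur)[0, 1])
            # Eye-centered prediction: curves superimpose after undoing
            # the eye displacement (np.roll wraps at the edges, a
            # simplification acceptable for this toy example)
            shift = int(round((e - eyes[0]) / azimuth_step))
            eye.append(np.corrcoef(ref, np.roll(cur, -shift))[0, 1])
        return np.mean(head), np.mean(eye)

    az = np.arange(-90, 91, 10)
    curves = {0: np.exp(-0.5 * ((az - 10) / 25) ** 2),
              20: np.exp(-0.5 * ((az - 30) / 25) ** 2)}
    print(frame_correlations(curves, azimuth_step=10))  # eye-centered wins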


Biological Cybernetics | 2001

Converting neural signals from place codes to rate codes

Jennifer M. Groh

The nervous system uses two basic types of formats for encoding information. The parameters of many sensory (and some premotor) signals are represented by the pattern of activity among an array of neurons each of which is optimally responsive to a different parameter value. This type of code is commonly referred to as a place code. Motor commands, in contrast, use rate coding: the desired force of a muscle is specified as a monotonic function of the aggregate rate of discharge across all of its motor neurons. Generating movements based on sensory information often requires converting signals from a place code to a rate code. In this paper I discuss three possible models for how the brain does this.
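
One of the candidate mechanisms, summation through graded synaptic weights, is simple enough to sketch directly. The weighting and normalization below are illustrative choices, not a claim about which of the three models the paper favors.

    import numpy as np

    def place_to_rate(place_activity, preferred_values):
        # Synaptic weights graded in proportion to each unit's preferred
        # value; normalizing by total activity yields a readout rate that
        # varies monotonically with the encoded parameter.
        act = np.asarray(place_activity, float)
        prefs = np.asarray(preferred_values, float)
        return float(prefs @ act / act.sum())

    # A place-coded activity bump centered near +20 deg reads out as a
    # rate proportional to +20.
    prefs = np.arange(-90, 91, 10)
    bump = np.exp(-0.5 * ((prefs - 20) / 15.0) ** 2)
    print(place_to_rate(bump, prefs))  # ~20.0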


Cerebral Cortex | 2009

Motor-Related Signals in the Intraparietal Cortex Encode Locations in a Hybrid, rather than Eye-Centered Reference Frame

O'Dhaniel A. Mullette-Gillman; Yale E. Cohen; Jennifer M. Groh

The reference frame used by intraparietal cortex neurons to encode locations is controversial. Many previous studies have suggested eye-centered coding, whereas we have reported that visual and auditory signals employ a hybrid reference frame (i.e., a combination of head- and eye-centered information) (Mullette-Gillman et al. 2005). One possible explanation for this discrepancy is that sensory-related activity, which we studied previously, is hybrid, whereas motor-related activity might be eye centered. Here, we examined the reference frame of visual and auditory saccade-related activity in the lateral and medial banks of the intraparietal sulcus (the lateral intraparietal area [LIP] and the medial intraparietal area [MIP]) of 2 rhesus monkeys. We recorded from 275 single neurons as monkeys performed visual and auditory saccades from different initial eye positions. We found that both visual and auditory signals reflected a hybrid of head- and eye-centered coordinates during both target and perisaccadic task periods rather than shifting to an eye-centered format as the saccade approached. This account differs from that of numerous previous recording studies. We suggest that the geometry of the receptive field sampling in prior studies was biased in favor of an eye-centered reference frame. Consequently, the overall hybrid nature of the reference frame was overlooked because the non-eye-centered response patterns were not fully characterized.
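
A common way to summarize where a neuron falls on the head- to eye-centered continuum is the slope of its preferred location (measured in head-centered coordinates) against eye position: 0 indicates head-centered coding, 1 eye-centered, and intermediate values a hybrid frame. The sketch below illustrates that summary measure; it is not the paper's exact analysis.

    import numpy as np

    def displacement_index(preferred_azimuths, eye_positions):
        # Least-squares slope of head-centered preferred azimuth against
        # eye position: 0 = head-centered, 1 = eye-centered, else hybrid
        return np.polyfit(np.asarray(eye_positions, float),
                          np.asarray(preferred_azimuths, float), 1)[0]

    # A neuron whose tuning shifts by half the eye displacement: hybrid
    print(displacement_index([-10.0, 0.0, 10.0], [-20.0, 0.0, 20.0]))  # 0.5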


The Journal of Neuroscience | 2008

A Rate Code for Sound Azimuth in Monkey Auditory Cortex: Implications for Human Neuroimaging Studies

Uri Werner-Reiss; Jennifer M. Groh

Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with studies in monkeys and other animals, in which single-unit recording studies have found stronger evidence for spatial sensitivity. Does this apparent discrepancy reflect a difference between humans and animals, or does it reflect differences in the sensitivity of the methods used for assessing the representation of sound location? The sensitivity of imaging methods such as functional magnetic resonance imaging depends on the following two key aspects of the underlying neuronal population: (1) what kind of spatial sensitivity individual neurons exhibit and (2) whether neurons with similar response preferences are clustered within the brain. To address this question, we conducted a single-unit recording study in monkeys. First, we investigated the nature of spatial sensitivity in individual auditory cortical neurons to determine whether they have receptive fields (place code) or monotonic (rate code) sensitivity to sound azimuth. Second, we tested how strongly the population of neurons favors contralateral locations. We report here that the majority of neurons show predominantly monotonic azimuthal sensitivity, forming a rate code for sound azimuth, but that at the population level the degree of contralaterality is modest. This suggests that the weakness of the evidence for spatial sensitivity in human neuroimaging studies of auditory cortex may be attributable to limited lateralization at the population level, despite what may be considerable spatial sensitivity in individual neurons.
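
The population-level argument can be illustrated with a toy contralaterality index, (contra - ipsi) / (contra + ipsi): even when individual neurons have steep azimuth slopes, a population whose preferences are nearly balanced across hemifields produces only a small aggregate contrast of the kind imaging methods measure. All numbers in this simulation are hypothetical.

    import numpy as np

    def contralaterality_index(contra_rate, ipsi_rate):
        # +1 = fully contralateral population, 0 = balanced hemifields
        return (contra_rate - ipsi_rate) / (contra_rate + ipsi_rate)

    # Simulated population: strong per-neuron azimuth slopes, but only
    # a slight bias toward contralateral preferences.
    rng = np.random.default_rng(0)
    slopes = rng.normal(0.1, 1.0, size=200)       # spikes/s per degree
    contra = np.maximum(0, 10 + slopes * 45)      # rate, sound 45 deg contra
    ipsi = np.maximum(0, 10 - slopes * 45)        # rate, sound 45 deg ipsi
    print(contralaterality_index(contra.mean(), ipsi.mean()))  # modest value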


Biological Cybernetics | 1993

Two models for transforming auditory signals from head-centered to eye-centered coordinates

Jennifer M. Groh; David L. Sparks

Two models for transforming auditory signals from head-centered to eye-centered coordinates are presented. The vector subtraction model subtracts a rate-coded eye position signal from a topographically weighted auditory target position signal to produce a rate code of target location with respect to the eye. The rate code is converted into a place code through a graded synaptic weighting scheme and inhibition. The dendrite model performs a mapping of head-centered auditory space onto the dendrites of eye-centered units. Individual dendrites serve as logical comparators of target location and eye position. Both models produce a topographic map of auditory space in eye-centered coordinates like that found in the primate superior colliculus. Either type can be converted into a model for transforming visual signals from retinal to head-centered coordinates.
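
The vector subtraction model's three stages are compact enough to sketch end to end. The threshold spacing and comparison window below are illustrative stand-ins for the graded weighting and inhibition described in the abstract.

    import numpy as np

    def vector_subtraction(head_place, prefs, eye_position):
        act = np.asarray(head_place, float)
        prefs = np.asarray(prefs, float)
        # 1. Place -> rate via synaptic weights graded by preferred azimuth
        target_rate = float(prefs @ act / act.sum())
        # 2. Subtract the rate-coded eye position signal
        eye_centered = target_rate - eye_position
        # 3. Rate -> place: each output unit fires when the rate lands near
        #    its threshold (inhibition would sharpen this into a map)
        return (np.abs(prefs - eye_centered) < 10).astype(float)

    prefs = np.arange(-90, 91, 10).astype(float)
    bump = np.exp(-0.5 * ((prefs - 20) / 15.0) ** 2)  # sound at +20 (head)
    print(vector_subtraction(bump, prefs, eye_position=30.0))  # unit at -10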


Current Opinion in Neurobiology | 2006

Seeing sounds: visual and auditory interactions in the brain

David A. Bulkin; Jennifer M. Groh

Objects and events can often be detected by more than one sensory system. Interactions between sensory systems can offer numerous benefits for the accuracy and completeness of the perception. Recent studies involving visual-auditory interactions have highlighted the perceptual advantages of combining information from these two modalities and have suggested that predominantly unimodal brain regions play a role in multisensory processing.


Journal of Cognitive Neuroscience | 2003

A Monotonic Code for Sound Azimuth in Primate Inferior Colliculus

Jennifer M. Groh; Kristin A. Kelly; Abigail M. Underhill

We investigated the format of the code for sound location in the inferior colliculi of three awake monkeys (Macaca mulatta). We found that roughly half of our sample of 99 neurons was sensitive to the free-field locations of broadband noise presented in the frontal hemisphere. Such neurons nearly always responded monotonically as a function of sound azimuth, with stronger responses for more contralateral sound locations. Few, if any, neurons had circumscribed receptive fields. Spatial sensitivity was broad: the proportion of the total sample of neurons responding to a sound at a given location ranged from 30% for ipsilateral locations to 80% for contralateral locations. These findings suggest that sound azimuth is represented via a population rate code of very broadly responsive neurons in primate inferior colliculi. This representation differs in format from the place code used for encoding the locations of visual and tactile stimuli and poses problems for the eventual convergence of auditory and visual or somatosensory signals. Accordingly, models for converting this representation into a place code are discussed.
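
A crude way to flag the monotonic tuning described above is a rank correlation between azimuth and firing rate across the tested locations. This is a hypothetical criterion for illustration, not the statistical procedure used in the paper.

    import numpy as np
    from scipy.stats import spearmanr

    def monotonic_azimuth_tuning(azimuths, rates, alpha=0.05):
        # A significant Spearman rank correlation indicates a monotonic
        # trend; its sign indicates which hemifield drives stronger
        # responses under the chosen azimuth convention.
        rho, p = spearmanr(azimuths, rates)
        return p < alpha, rho

    az = np.arange(-90, 91, 18)
    rates = 10 + 0.1 * az  # rate grows steadily toward one hemifield
    print(monotonic_azimuth_tuning(az, rates))  # (True, 1.0)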


Nature Neuroscience | 2000

Predicting perception from population codes

Jennifer M. Groh

Treue and colleagues use electrophysiological recordings in monkeys and psychophysical experiments in humans to suggest that the shape of a population response in a motion-sensitive region of the brain (area MT), rather than the peak of the response, determines motion perception.
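
The contrast between a peak readout and a shape-sensitive readout is easy to make concrete: skewing a population response leaves the most active unit in place while pulling an activity-weighted average toward the skew, so the two readouts make different perceptual predictions. The numbers below are hypothetical.

    import numpy as np

    def peak_readout(prefs, rates):
        # Winner-take-all: perception follows the most active unit
        return prefs[np.argmax(rates)]

    def centroid_readout(prefs, rates):
        # Shape-sensitive: activity-weighted mean of preferred directions
        rates = np.asarray(rates, float)
        return float(np.asarray(prefs, float) @ rates / rates.sum())

    # A secondary bump skews the population response: the peak stays at
    # 90 deg while the centroid shifts toward 150 deg.
    prefs = np.arange(0.0, 181.0, 15.0)
    rates = (np.exp(-0.5 * ((prefs - 90) / 30) ** 2)
             + 0.5 * np.exp(-0.5 * ((prefs - 150) / 30) ** 2))
    print(peak_readout(prefs, rates), centroid_readout(prefs, rates))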


Frontiers in Neural Circuits | 2012

Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus

Kurtis G. Gruters; Jennifer M. Groh

The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC “hears” would seem to be passed both “upward” to thalamus and thence to auditory cortex and beyond, as well as “downward” via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.

Collaboration


Top co-authors of Jennifer M. Groh.

David L. Sparks, Baylor College of Medicine
Kristin K. Porter, University of Alabama at Birmingham
William T. Newsome, Howard Hughes Medical Institute