Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where G. Christopher Stecker is active.

Publication


Featured research published by G. Christopher Stecker.


JARO: Journal of the Association for Research in Otolaryngology | 2015

The precedence effect in sound localization.

Andrew D. Brown; G. Christopher Stecker; Daniel J. Tollin

In ordinary listening environments, acoustic signals reaching the ears directly from real sound sources are followed after a few milliseconds by early reflections arriving from nearby surfaces. Early reflections are spectrotemporally similar to their source signals but commonly carry spatial acoustic cues unrelated to the source location. Humans and many other animals, including nonmammalian and even invertebrate animals, are nonetheless able to effectively localize sound sources in such environments, even in the absence of disambiguating visual cues. Robust source localization despite concurrent or nearly concurrent spurious spatial acoustic information is commonly attributed to an assortment of perceptual phenomena collectively termed “the precedence effect,” characterizing the perceptual dominance of spatial information carried by the first-arriving signal. Here, we highlight recent progress and changes in the understanding of the precedence effect and related phenomena.


JARO: Journal of the Association for Research in Otolaryngology | 2016

Tuning to Binaural Cues in Human Auditory Cortex

Susan A. McLaughlin; Nathan C. Higgins; G. Christopher Stecker

Interaural level and time differences (ILD and ITD), the primary binaural cues for sound localization in azimuth, are known to modulate the tuned responses of neurons in mammalian auditory cortex (AC). The majority of these neurons respond best to cue values that favor the contralateral ear, such that contralateral bias is evident in the overall population response and thereby expected in population-level functional imaging data. Human neuroimaging studies, however, have not consistently found contralaterally biased binaural response patterns. Here, we used functional magnetic resonance imaging (fMRI) to parametrically measure ILD and ITD tuning in human AC. For ILD, contralateral tuning was observed, using both univariate and multivoxel analyses, in posterior superior temporal gyrus (pSTG) in both hemispheres. Response-ILD functions were U-shaped, revealing responsiveness to both contralateral and—to a lesser degree—ipsilateral ILD values, consistent with rate coding by unequal populations of contralaterally and ipsilaterally tuned neurons. In contrast, for ITD, univariate analyses showed modest contralateral tuning only in left pSTG, characterized by a monotonic response-ITD function. A multivoxel classifier, however, revealed ITD coding in both hemispheres. Although sensitivity to ILD and ITD was distributed in similar AC regions, the differently shaped response functions and different response patterns across hemispheres suggest that basic ILD and ITD processes are not fully integrated in human AC. The results support opponent-channel theories of ILD but not necessarily ITD coding, the latter of which may involve multiple types of representation that differ across hemispheres.
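
The interpretation of the U-shaped response-ILD functions in terms of unequal contralaterally and ipsilaterally tuned populations can be illustrated with a toy opponent-channel model. The following is a minimal sketch of that idea, not the authors' analysis; the channel shapes, slopes, offsets, and weights are arbitrary assumptions.

```python
import numpy as np

# Toy opponent-channel model: two sigmoidal populations, one preferring
# contralateral ILDs and one preferring ipsilateral ILDs, summed with
# unequal weights to approximate a pooled (e.g., BOLD-like) response.
ild = np.linspace(-30, 30, 61)   # ILD in dB; positive = contralateral ear louder

def channel(ild_db, sign, slope=0.3, offset=10.0):
    """Sigmoidal rate response of a population preferring one hemifield."""
    return 1.0 / (1.0 + np.exp(-slope * (sign * ild_db - offset)))

w_contra, w_ipsi = 1.0, 0.4      # assumed unequal population weights
response = w_contra * channel(ild, +1) + w_ipsi * channel(ild, -1)

# The summed response is largest for strongly contralateral ILDs, dips near
# 0 dB, and rises again (to a lesser degree) for ipsilateral ILDs: a U shape.
for x, r in zip(ild[::15], response[::15]):
    print(f"ILD {x:+5.0f} dB -> summed response {r:.3f}")
```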


NeuroImage | 2015

Monaural and binaural contributions to interaural-level-difference sensitivity in human auditory cortex.

G. Christopher Stecker; Susan A. McLaughlin; Nathan C. Higgins

Whole-brain functional magnetic resonance imaging was used to measure blood-oxygenation-level-dependent (BOLD) responses in human auditory cortex (AC) to sounds with intensity varying independently in the left and right ears. Echoplanar images were acquired at 3 Tesla with sparse image acquisition once per 12-second block of sound stimulation. Combinations of binaural intensity and stimulus presentation rate were varied between blocks, and selected to allow measurement of response-intensity functions in three configurations: monaural 55–85 dB SPL, binaural 55–85 dB SPL with intensity equal in both ears, and binaural with average binaural level of 70 dB SPL and interaural level differences (ILD) ranging ±30 dB (i.e., favoring the left or right ear). Comparison of response functions equated for contralateral intensity revealed that BOLD-response magnitudes (1) generally increased with contralateral intensity, consistent with positive drive of the BOLD response by the contralateral ear, (2) were larger for contralateral monaural stimulation than for binaural stimulation, consistent with negative effects (e.g., inhibition) of ipsilateral input, which were strongest in the left hemisphere, and (3) also increased with ipsilateral intensity when contralateral input was weak, consistent with additional, positive, effects of ipsilateral stimulation. Hemispheric asymmetries in the spatial extent and overall magnitude of BOLD responses were generally consistent with previous studies demonstrating greater bilaterality of responses in the right hemisphere and stricter contralaterality in the left hemisphere. Finally, comparison of responses to fast (40/s) and slow (5/s) stimulus presentation rates revealed significant rate-dependent adaptation of the BOLD response that varied across ILD values.
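
The binaural-ILD condition can be unpacked with simple arithmetic: holding the average binaural level (ABL) at 70 dB SPL, an ILD of x dB corresponds to per-ear levels of ABL ± x/2. A minimal sketch under the assumption that the ILD was split symmetrically about the ABL (the paper's exact level assignment is not restated here):

```python
# Illustrative arithmetic for the binaural-ILD conditions: ABL fixed at
# 70 dB SPL, ILD varied over +/-30 dB. The symmetric split ABL +/- ILD/2
# (positive ILD = right ear louder) is an assumption used only to make the
# conditions concrete.
ABL = 70.0
for ild in range(-30, 31, 10):
    left, right = ABL - ild / 2.0, ABL + ild / 2.0
    print(f"ILD {ild:+3d} dB -> left {left:4.1f} dB SPL, right {right:4.1f} dB SPL")
```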


Journal of the Acoustical Society of America | 2015

Temporal weighting of binaural information at low frequencies: Discrimination of dynamic interaural time and level differences

Anna C. Diedesch; G. Christopher Stecker

The importance of sound onsets in binaural hearing has been addressed in many studies, particularly at high frequencies, where the onset of the envelope may carry much of the useful binaural information. Some studies suggest that sound onsets might play a similar role in the processing of binaural cues [e.g., fine-structure interaural time differences (ITD)] at low frequencies. This study measured listeners' sensitivity to ITD and interaural level differences (ILD) present in early (i.e., onset) and late parts of 80-ms pure tones of 250-, 500-, and 1000-Hz frequency. Following previous studies, tones carried static interaural cues or dynamic cues that peaked at sound onset and diminished to zero at sound offset or vice versa. Although better thresholds were observed in static than dynamic conditions overall, ITD discrimination was especially impaired, regardless of frequency, when cues were not available at sound onset. Results for ILD followed a similar pattern at 1000 Hz; at lower frequencies, ILD thresholds did not differ significantly between dynamic-cue conditions. The results support the onset hypothesis of Houtgast and Plomp [(1968). J. Acoust. Soc. Am. 44, 807-812] for ITD discrimination, but not necessarily ILD discrimination, in low-frequency pure tones.
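
A minimal sketch of one dynamic-cue condition described above: an 80-ms pure tone whose ITD peaks at onset and diminishes linearly to zero at offset. The sample rate, ramp shape, and peak ITD value are illustrative assumptions, not the authors' stimulus parameters.

```python
import numpy as np

# Sketch of a dynamic-ITD tone: the instantaneous ITD starts at a peak value
# at onset and ramps linearly to zero at offset. The ITD is realized by
# advancing the left channel's time argument by the instantaneous delay.
fs = 48000            # sample rate (Hz), assumed
dur = 0.080           # 80-ms tone, as in the study
f = 500.0             # one of the tested frequencies (Hz)
itd_peak = 200e-6     # assumed 200-us peak ITD

t = np.arange(int(fs * dur)) / fs
itd_t = itd_peak * (1.0 - t / dur)            # peak at onset, zero at offset
right = np.sin(2 * np.pi * f * t)
left = np.sin(2 * np.pi * f * (t + itd_t))    # left channel leads by itd_t
stimulus = np.column_stack([left, right])     # 2-channel signal for headphones
```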


Frontiers in Human Neuroscience | 2016

Early blindness results in developmental plasticity for auditory motion processing within auditory and occipital cortex

Fang Jiang; G. Christopher Stecker; Geoffrey M. Boynton; Ione Fine

Early blind subjects exhibit superior abilities for processing auditory motion, which are accompanied by enhanced BOLD responses to auditory motion within hMT+ and reduced responses within right planum temporale (rPT). Here, by comparing BOLD responses to auditory motion in hMT+ and rPT within sighted controls, early blind, late blind, and sight-recovery individuals, we were able to separately examine the effects of developmental and adult visual deprivation on cortical plasticity within these two areas. We find that both the enhanced auditory motion responses in hMT+ and the reduced functionality in rPT are driven by the absence of visual experience early in life; neither loss nor recovery of vision later in life had a discernible influence on plasticity within these areas. Cortical plasticity as a result of blindness has generally been presumed to be mediated by competition across modalities within a given cortical region. The reduced functionality within rPT as a result of early visual loss implicates an additional mechanism for cross-modal plasticity as a result of early blindness: competition across different cortical areas for functional role.


Frontiers in Neuroscience | 2014

Processing of spatial sounds in human auditory cortex during visual, discrimination and 2-back tasks

Teemu Rinne; Heidi Ala-Salomäki; G. Christopher Stecker; Jukka Pätynen; Tapio Lokki

Previous imaging studies on the brain mechanisms of spatial hearing have mainly focused on sounds varying in the horizontal plane. In this study, we compared activations in human auditory cortex (AC) and adjacent inferior parietal lobule (IPL) to sounds varying in horizontal location, distance, or space (i.e., different rooms). In order to investigate both stimulus-dependent and task-dependent activations, these sounds were presented during visual discrimination, auditory discrimination, and auditory 2-back memory tasks. Consistent with previous studies, activations in AC were modulated by the auditory tasks. During both auditory and visual tasks, activations in AC were stronger to sounds varying in horizontal location than along other feature dimensions. However, in IPL, this enhancement was detected only during auditory tasks. Based on these results, we argue that IPL is not primarily involved in stimulus-level spatial analysis but that it may represent such information for more general processing when relevant to an active auditory task.


Journal of Neurophysiology | 2015

Evidence for a neural source of the precedence effect in sound localization

Andrew D. Brown; Heath G. Jones; Alan H. Kan; Tanvi Thakkar; G. Christopher Stecker; Matthew J. Goupell; Ruth Y. Litovsky

Normal-hearing human listeners and a variety of studied animal species localize sound sources accurately in reverberant environments by responding to the directional cues carried by the first-arriving sound rather than spurious cues carried by later-arriving reflections, which are not perceived discretely. This phenomenon is known as the precedence effect (PE) in sound localization. Despite decades of study, the biological basis of the PE remains unclear. Though the PE was once widely attributed to central processes such as synaptic inhibition in the auditory midbrain, a more recent hypothesis holds that the PE may arise essentially as a by-product of normal cochlear function. Here we evaluated the PE in a unique human patient population with demonstrated sensitivity to binaural information but without functional cochleae. Users of bilateral cochlear implants (CIs) were tested in a psychophysical task that assessed the number and location(s) of auditory images perceived for simulated source-echo (lead-lag) stimuli. A parallel experiment was conducted in a group of normal-hearing (NH) listeners. Key findings were as follows: 1) Subjects in both groups exhibited lead-lag fusion. 2) Fusion was marginally weaker in CI users than in NH listeners but could be augmented by systematically attenuating the amplitude of the lag stimulus to coarsely simulate adaptation observed in acoustically stimulated auditory nerve fibers. 3) Dominance of the lead in localization varied substantially among both NH and CI subjects but was evident in both groups. Taken together, data suggest that aspects of the PE can be elicited in CI users, who lack functional cochleae, thus suggesting that neural mechanisms are sufficient to produce the PE.
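
A minimal sketch of the simulated source-echo (lead-lag) stimuli described above, assuming simple binaural clicks for illustration: a lead pulse carrying one interaural delay, followed a few milliseconds later by a lag pulse carrying the opposite delay, with optional lag attenuation. The delay, ITD, and gain values here are arbitrary; the study's actual acoustic and CI stimulus parameters are not reproduced.

```python
import numpy as np

fs = 48000  # sample rate (Hz), assumed

def binaural_click(itd_s, delay_s=0.0, gain=1.0, length_s=0.020):
    """One impulse per channel; the lagging ear is delayed by |itd_s| seconds."""
    out = np.zeros((int(fs * length_s), 2))
    start = int(fs * delay_s)
    shift = int(round(fs * abs(itd_s)))
    if itd_s >= 0:                  # positive ITD: left ear leads
        out[start, 0] = gain
        out[start + shift, 1] = gain
    else:                           # negative ITD: right ear leads
        out[start + shift, 0] = gain
        out[start, 1] = gain
    return out

lead = binaural_click(+400e-6)                          # lead favors the left
lag = binaural_click(-400e-6, delay_s=0.004, gain=0.5)  # lag 4 ms later, ~6 dB down
stimulus = lead + lag                                   # 2-channel lead-lag pair
```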


Proceedings of the National Academy of Sciences of the United States of America | 2017

Evidence for cue-independent spatial representation in the human auditory cortex during active listening

Nathan C. Higgins; Susan A. McLaughlin; Teemu Rinne; G. Christopher Stecker

Significance: Individuals determine horizontal sound location based on precise calculations of sound level and sound timing differences at the two ears. Although these cues are processed independently at lower levels of the auditory system, their cortical processing remains poorly understood. This study seeks to address two key questions. (i) Are these cues integrated to form a cue-independent representation of space? (ii) How does active listening to sound location alter cortical response to these cues? We use functional brain imaging to address these questions, demonstrating that cue responses overlap in the cortex, that voxel patterns from one cue can predict the other cue and vice versa, and that active spatial listening enhances cortical responses independently of specific features of the sound.

Few auditory functions are as important or as universal as the capacity for auditory spatial awareness (e.g., sound localization). That ability relies on sensitivity to acoustical cues—particularly interaural time and level differences (ITD and ILD)—that correlate with sound-source locations. Under nonspatial listening conditions, cortical sensitivity to ITD and ILD takes the form of broad contralaterally dominated response functions. It is unknown, however, whether that sensitivity reflects representations of the specific physical cues or a higher-order representation of auditory space (i.e., integrated cue processing), nor is it known whether responses to spatial cues are modulated by active spatial listening. To investigate, sensitivity to parametrically varied ITD or ILD cues was measured using fMRI during spatial and nonspatial listening tasks. Task type varied across blocks where targets were presented in one of three dimensions: auditory location, pitch, or visual brightness. Task effects were localized primarily to lateral posterior superior temporal gyrus (pSTG) and modulated binaural-cue response functions differently in the two hemispheres. Active spatial listening (location tasks) enhanced both contralateral and ipsilateral responses in the right hemisphere but maintained or enhanced contralateral dominance in the left hemisphere. Two observations suggest integrated processing of ITD and ILD. First, overlapping regions in medial pSTG exhibited significant sensitivity to both cues. Second, successful classification of multivoxel patterns was observed for both cue types and—critically—for cross-cue classification. Together, these results suggest a higher-order representation of auditory space in the human auditory cortex that at least partly integrates the specific underlying cues.
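
The cross-cue classification logic can be illustrated with a toy multivoxel analysis: train a decoder on patterns evoked by one cue and test it on patterns evoked by the other. The sketch below uses simulated data and scikit-learn as stand-ins; the voxel count, noise level, labels, and decoder choice are all assumptions, not the study's pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulate multivoxel patterns that share a common "spatial" axis regardless
# of which cue (ITD or ILD) defines the location, then test cross-cue transfer.
rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 200
shared_axis = rng.normal(size=n_voxels)        # hypothetical cue-independent code
labels = np.repeat([0, 1], n_trials // 2)      # 0 = left hemifield, 1 = right

def simulate_patterns(noise=3.0):
    signed = 2 * labels - 1
    return np.outer(signed, shared_axis) + noise * rng.normal(size=(len(labels), n_voxels))

X_itd = simulate_patterns()   # patterns evoked by ITD-defined locations
X_ild = simulate_patterns()   # patterns evoked by ILD-defined locations

clf = LogisticRegression(max_iter=1000).fit(X_itd, labels)   # train on ITD
print("cross-cue accuracy:", clf.score(X_ild, labels))       # test on ILD
```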


Frontiers in Systems Neuroscience | 2017

Sensitivity to an Illusion of Sound Location in Human Auditory Cortex

Nathan C. Higgins; Susan A. McLaughlin; Sandra Da Costa; G. Christopher Stecker

Human listeners place greater weight on the beginning of a sound compared to the middle or end when determining sound location, creating an auditory illusion known as the Franssen effect. Here, we exploited that effect to test whether human auditory cortex (AC) represents the physical vs. perceived spatial features of a sound. We used functional magnetic resonance imaging (fMRI) to measure AC responses to sounds that varied in perceived location due to interaural level differences (ILD) applied to sound onsets or to the full sound duration. Analysis of hemodynamic responses in AC revealed sensitivity to ILD in both full-cue (veridical) and onset-only (illusory) lateralized stimuli. Classification analysis revealed regional differences in the sensitivity to onset-only ILDs, where better classification was observed in posterior compared to primary AC. That is, restricting the ILD to sound onset—which alters the physical but not the perceptual nature of the spatial cue—did not eliminate cortical sensitivity to that cue. These results suggest that perceptual representations of auditory space emerge or are refined in higher-order AC regions, supporting the stable perception of auditory space in noisy or reverberant environments and forming the basis of illusions such as the Franssen effect.
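
A minimal sketch contrasting the two stimulus classes described above: an ILD applied either over the full sound duration ("full-cue", veridical) or only during a brief onset segment ("onset-only", illusory). The durations, onset window, ILD magnitude, and noise carrier are illustrative assumptions, not the study's stimuli.

```python
import numpy as np

fs = 48000
dur, onset_dur = 0.300, 0.020        # 300-ms sound, 20-ms onset window (assumed)
ild_db = 10.0                        # assumed ILD magnitude
g = 10 ** (ild_db / 20.0)            # linear gain corresponding to a 10-dB ILD

rng = np.random.default_rng(1)
noise = rng.standard_normal(int(fs * dur))

# Full-cue stimulus: the left channel is louder for the entire duration.
full_cue = np.column_stack([noise * g, noise])

# Onset-only stimulus: the same ILD is applied only to the first 20 ms; the
# remainder is diotic, yet the sound is heard at the onset-cued location.
gain_env = np.ones_like(noise)
gain_env[: int(fs * onset_dur)] = g
onset_only = np.column_stack([noise * gain_env, noise])
```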


Journal of the Acoustical Society of America | 2018

Validating auditory spatial awareness with virtual reality and vice-versa

G. Christopher Stecker; Steven Carter; Travis M. Moore; Monica L. Folkerts

“Immersive” technologies such as virtual (VR) and augmented reality (AR) stand to transform sensory and perceptual science, clinical assessments, and habilitation of spatial awareness. This talk explores some of the general challenges and opportunities for VR- and AR-enabled research, illustrated by specific studies in the area of spatial hearing. In one study, free-field localization and discrimination measures were compared across conditions which used VR to show, hide, or alter the visible locations of loudspeakers from trial to trial. The approach is well suited to understanding potential biases and cuing effects in real-world settings. A second study used headphone presentation to understand contextual effects on the relative weighting of binaural timing and intensity cues. Previous studies have used adjustment and lateralization to “trade” time and level cues in a sound-booth setting, but none have attempted to measure how listeners weight cues in realistic multisensory scenes or in realistic tempora...

Collaboration


Dive into G. Christopher Stecker's collaboration.

Top Co-Authors

Fang Jiang, University of Washington
Ione Fine, University of Washington
Teemu Rinne, University of Helsinki
Alan H. Kan, University of Wisconsin-Madison