
Publications


Featured research published by Stefanie E. Kuchinsky.


The Journal of Neuroscience | 2013

The Cingulo-Opercular Network Provides Word-Recognition Benefit

Kenneth I. Vaden; Stefanie E. Kuchinsky; Stephanie L. Cute; Jayne B. Ahlstrom; Judy R. Dubno; Mark A. Eckert

Recognizing speech in difficult listening conditions requires considerable focus of attention that is often demonstrated by elevated activity in putative attention systems, including the cingulo-opercular network. We tested the prediction that elevated cingulo-opercular activity provides word-recognition benefit on a subsequent trial. Eighteen healthy, normal-hearing adults (10 females; aged 20–38 years) performed word recognition (120 trials) in multi-talker babble at +3 and +10 dB signal-to-noise ratios during a sparse sampling functional magnetic resonance imaging (fMRI) experiment. Blood oxygen level-dependent (BOLD) contrast was elevated in the anterior cingulate cortex, anterior insula, and frontal operculum in response to poorer speech intelligibility and response errors. These brain regions exhibited significantly greater correlated activity during word recognition compared with rest, supporting the premise that word-recognition demands increased the coherence of cingulo-opercular network activity. Consistent with an adaptive control network explanation, general linear mixed model analyses demonstrated that increased magnitude and extent of cingulo-opercular network activity were significantly associated with correct word recognition on subsequent trials. These results indicate that elevated cingulo-opercular network activity is not simply a reflection of poor performance or error but also supports word recognition in difficult listening conditions.
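
The abstract describes a trial-level analysis in which cingulo-opercular activity on one trial predicts word recognition on the following trial. As a rough illustration of that kind of analysis, the sketch below fits a logistic GEE clustered by subject, a simpler stand-in for the general linear mixed model the paper reports. The data are simulated and every column name (subject, snr_db, co_bold, correct) is hypothetical; this is not the authors' pipeline.

```python
# Sketch only: relate cingulo-opercular BOLD on the current trial to word
# recognition on the NEXT trial. Simulated data, hypothetical column names.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated trial table: 18 subjects x 120 word-recognition trials.
n_sub, n_trial = 18, 120
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_sub), n_trial),
    "snr_db": rng.choice([3, 10], size=n_sub * n_trial),
    "co_bold": rng.normal(size=n_sub * n_trial),  # cingulo-opercular contrast
    "correct": rng.binomial(1, 0.7, size=n_sub * n_trial),
})

# Outcome: accuracy on the following trial, computed within each subject.
df["correct_next"] = df.groupby("subject")["correct"].shift(-1)
df = df.dropna(subset=["correct_next"])

# Logistic GEE clustered by subject: does current-trial cingulo-opercular
# activity predict next-trial accuracy, controlling for SNR and
# current-trial accuracy?
model = smf.gee(
    "correct_next ~ co_bold + C(snr_db) + correct",
    groups="subject",
    data=df,
    family=sm.families.Binomial(),
)
print(model.fit().summary())
```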


Psychophysiology | 2013

Pupil size varies with word listening and response selection difficulty in older adults with hearing loss.

Stefanie E. Kuchinsky; Jayne B. Ahlstrom; Kenneth I. Vaden; Stephanie L. Cute; Larry E. Humes; Judy R. Dubno; Mark A. Eckert

Listening to speech in noise can be exhausting, especially for older adults with impaired hearing. Pupil dilation is thought to track the difficulty associated with listening to speech at various intelligibility levels for young and middle-aged adults. This study examined changes in the pupil response with acoustic and lexical manipulations of difficulty in older adults with hearing loss. Participants identified words at two signal-to-noise ratios (SNRs) among options that could include a similar-sounding lexical competitor. Growth curve analyses revealed that the pupil response was affected by an SNR × lexical competition interaction, such that it was larger and more delayed and sustained in the harder SNR condition, particularly in the presence of lexical competition. Pupillometry detected these effects for correct trials and across reaction times, suggesting that it provides evidence of task difficulty beyond what behavioral measures alone reveal.
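
For readers unfamiliar with growth curve analysis of pupil data, the sketch below shows the general approach on simulated data: polynomial time terms crossed with condition in a linear mixed model with by-subject random effects. All names and values are hypothetical placeholders, not the authors' model specification.

```python
# Sketch of a growth curve analysis (GCA) of the task-evoked pupil response:
# linear and quadratic time terms crossed with condition, random intercept
# per subject. Simulated data, hypothetical column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated pupil time courses: 20 subjects x 2 SNRs x 2 competition levels
# x 50 time bins after word onset.
subjects, bins = 20, 50
rows = []
for s in range(subjects):
    for snr in ("easy", "hard"):
        for comp in ("absent", "present"):
            t = np.arange(bins)
            pupil = 0.1 * t - 0.001 * t**2 + rng.normal(0, 0.2, bins)
            rows.append(pd.DataFrame({
                "subject": s, "snr": snr, "competitor": comp,
                "time_bin": t, "pupil": pupil,
            }))
df = pd.concat(rows, ignore_index=True)

# Centered linear and quadratic time terms (approximately orthogonal for
# equally spaced bins), so higher-order terms capture curve shape.
t = df["time_bin"]
df["ot1"] = (t - t.mean()) / t.std()
df["ot2"] = df["ot1"] ** 2 - (df["ot1"] ** 2).mean()

# Linear mixed model: time-course shape by SNR x lexical competition,
# with a random intercept per subject (random slopes omitted for brevity).
m = smf.mixedlm(
    "pupil ~ (ot1 + ot2) * snr * competitor",
    data=df,
    groups="subject",
)
print(m.fit().summary())
```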


The Journal of Neuroscience | 2015

Cortical Activity Predicts Which Older Adults Recognize Speech in Noise and When

Kenneth I. Vaden; Stefanie E. Kuchinsky; Jayne B. Ahlstrom; Judy R. Dubno; Mark A. Eckert

Speech recognition in noise can be challenging for older adults and elicits elevated activity throughout a cingulo-opercular network that is hypothesized to monitor and modify behaviors to optimize performance. A word recognition in noise experiment was used to test the hypothesis that cingulo-opercular engagement provides performance benefit for older adults. Healthy older adults (N = 31; 50–81 years of age; mean pure tone thresholds <32 dB HL from 0.25 to 8 kHz, best ear; species: human) performed word recognition in multitalker babble at 2 signal-to-noise ratios (SNR = +3 or +10 dB) during a sparse sampling fMRI experiment. Elevated cingulo-opercular activity was associated with an increased likelihood of correct recognition on the following trial independently of SNR and performance on the preceding trial. The cingulo-opercular effect increased for participants with the best overall performance. These effects were lower for older adults compared with a younger, normal-hearing adult sample (N = 18). Visual cortex activity also predicted trial-level recognition for the older adults, which resulted from discrete decreases in activity before errors and occurred for the oldest adults with the poorest recognition. Participants demonstrating larger visual cortex effects also had reduced fractional anisotropy in an anterior portion of the left inferior frontal-occipital fasciculus, which projects between frontal and occipital regions where activity predicted word recognition. Together, the results indicate that older adults experience performance benefit from elevated cingulo-opercular activity, but not to the same extent as younger adults, and that declines in attentional control can limit word recognition.


Psychophysiology | 2014

Speech-perception training for older adults with hearing loss impacts word recognition and effort.

Stefanie E. Kuchinsky; Jayne B. Ahlstrom; Stephanie L. Cute; Larry E. Humes; Judy R. Dubno; Mark A. Eckert

The current pupillometry study examined the impact of speech-perception training on word recognition and cognitive effort in older adults with hearing loss. Trainees identified more words at the follow-up than at the baseline session. Training also resulted in an overall larger and faster peaking pupillary response, even when controlling for performance and reaction time. Perceptual and cognitive capacities affected the peak amplitude of the pupil response across participants but did not diminish the impact of training on the other pupil metrics. Thus, we demonstrated that pupillometry can be used to characterize training-related and individual differences in effort during a challenging listening task. Importantly, the results indicate that speech-perception training not only affects overall word recognition, but also a physiological metric of cognitive effort, which has the potential to be a biomarker of hearing loss intervention outcome.


NeuroImage | 2012

Multiple imputation of missing fMRI data in whole brain analysis

Kenneth I. Vaden; Mulugeta Gebregziabher; Stefanie E. Kuchinsky; Mark A. Eckert

Whole brain fMRI analyses rarely include the entire brain because of missing data that result from data acquisition limits and susceptibility artifact, in particular. This missing data problem is typically addressed by omitting voxels from analysis, which may exclude brain regions that are of theoretical interest and increase the potential for Type II error at cortical boundaries or Type I error when spatial thresholds are used to establish significance. Imputation could significantly expand statistical map coverage, increase power, and enhance interpretations of fMRI results. We examined multiple imputation for group level analyses of missing fMRI data using methods that leverage the spatial information in fMRI datasets for both real and simulated data. Available case analysis, neighbor replacement, and regression-based imputation approaches were compared in a general linear model framework to determine the extent to which these methods quantitatively (effect size) and qualitatively (spatial coverage) increased the sensitivity of group analyses. In both real and simulated data analysis, multiple imputation provided 1) variance that was most similar to estimates for voxels with no missing data, 2) fewer false positive errors in comparison to mean replacement, and 3) fewer false negative errors in comparison to available case analysis. Compared to the standard analysis approach of omitting voxels with missing data, imputation methods increased brain coverage in this study by 35% (from 33,323 to 45,071 voxels). In addition, multiple imputation increased the size of significant clusters by 58% and the number of significant clusters across statistical thresholds, compared to the standard voxel omission approach. While neighbor replacement produced similar results, we recommend multiple imputation because it uses an informed sampling distribution to deal with missing data across subjects that can include neighbor values and other predictors. Multiple imputation is anticipated to be particularly useful for 1) large fMRI data sets with inconsistent missing voxels across subjects and 2) addressing the problem of increased artifact at ultra-high field, which significantly limits the extent of whole brain coverage and interpretations of results.
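
To make the imputation-and-pooling idea concrete, here is a minimal sketch for a group-level analysis of a subjects × voxels matrix, using scikit-learn's IterativeImputer with posterior sampling and Rubin's rules. It simplifies heavily: voxels are treated as generic features rather than spatial neighbors, and the data are simulated, so it illustrates the general technique rather than the paper's implementation.

```python
# Sketch only: multiple imputation of missing voxel values across subjects,
# followed by Rubin's rules to pool per-voxel group statistics.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(2)

# Simulated contrast estimates: 30 subjects x 50 voxels, ~15% missing.
n_sub, n_vox = 30, 50
data = rng.normal(loc=0.3, scale=1.0, size=(n_sub, n_vox))
data[rng.random((n_sub, n_vox)) < 0.15] = np.nan

m = 10                      # number of imputations
est, var = [], []
for seed in range(m):
    imputer = IterativeImputer(sample_posterior=True, n_nearest_features=10,
                               max_iter=5, random_state=seed)
    completed = imputer.fit_transform(data)
    mean = completed.mean(axis=0)                 # per-voxel group mean
    se2 = completed.var(axis=0, ddof=1) / n_sub   # squared standard error
    est.append(mean)
    var.append(se2)

est, var = np.array(est), np.array(var)

# Rubin's rules: pool estimates and variances across imputations.
pooled = est.mean(axis=0)
within = var.mean(axis=0)
between = est.var(axis=0, ddof=1)
total_var = within + (1 + 1 / m) * between
t_stat = pooled / np.sqrt(total_var)
print("pooled t statistics, first 5 voxels:", np.round(t_stat[:5], 2))
```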


Cerebral Cortex | 2012

Word Intelligibility and Age Predict Visual Cortex Activity during Word Listening

Stefanie E. Kuchinsky; Kenneth I. Vaden; Noam I. Keren; Kelly C. Harris; Jayne B. Ahlstrom; Judy R. Dubno; Mark A. Eckert

The distractibility that older adults experience when listening to speech in challenging conditions has been attributed in part to reduced inhibition of irrelevant information within and across sensory systems. Whereas neuroimaging studies have shown that younger adults readily suppress visual cortex activation when listening to auditory stimuli, it is unclear to what extent declining inhibition in older adults results in reduced suppression or compensatory engagement of other sensory cortices. The current functional magnetic resonance imaging study examined the effects of age and stimulus intelligibility in a word listening task. Across all participants, auditory cortex was engaged when listening to words. However, increasing age and declining word intelligibility had independent and spatially similar effects: both were associated with increasing engagement of visual cortex. Visual cortex activation was not explained by age-related differences in vascular reactivity but rather auditory and visual cortices were functionally connected across word listening conditions. The nature of this correlation changed with age: younger adults deactivated visual cortex when activating auditory cortex, middle-aged adults showed no relation, and older adults synchronously activated both cortices. These results suggest that age and stimulus integrity are additive modulators of crossmodal suppression and activation.


Experimental Aging Research | 2016

Task-Related Vigilance During Word Recognition in Noise for Older Adults with Hearing Loss

Stefanie E. Kuchinsky; Kenneth I. Vaden; Jayne B. Ahlstrom; Stephanie L. Cute; Larry E. Humes; Judy R. Dubno; Mark A. Eckert

Background/Study Context: Vigilance refers to the ability to sustain and adapt attentional focus in response to changing task demands. For older adults with hearing loss, vigilant listening may be particularly effortful and variable across individuals. This study examined the extent to which neural responses to sudden, unexpected changes in task structure (e.g., from rest to word recognition epochs) were related to pupillometry measures of listening effort. Methods: Individual differences in the task-evoked pupil response during word recognition were used to predict functional magnetic resonance imaging (fMRI) estimates of neural responses to salient transitions between quiet rest, noisy rest, and word recognition in unintelligible, fluctuating background noise. Participants included 29 older adults (M = 70.2 years old) with hearing loss (pure tone average across all frequencies = 36.1 dB HL [hearing level], SD = 6.7). Results: Individuals with a greater average pupil response exhibited a more vigilant pattern of responding on a standardized continuous performance test (response time variability across varying interstimulus intervals: r(27) = .38, p = .04). Across participants there was widespread engagement of attention- and sensory-related cortices in response to transitions between blocks of rest and word recognition conditions. Individuals who exhibited larger task-evoked pupil dilation also showed greater activity in the right primary auditory cortex in response to changes in task structure. Conclusion: Pupillometric estimates of word recognition effort predicted variation in activity within cortical regions that were responsive to salient changes in the environment for older adults with hearing loss. The results of the current study suggest that vigilant attention is increased amongst older adults who exert greater listening effort.


Experimental Aging Research | 2016

Cingulo-Opercular Function During Word Recognition in Noise for Older Adults with Hearing Loss.

Kenneth I. Vaden; Stefanie E. Kuchinsky; Jayne B. Ahlstrom; Susan Teubner-Rhodes; Judy R. Dubno; Mark A. Eckert

Background/Study Context: Adaptive control, reflected by elevated activity in cingulo-opercular brain regions, optimizes performance in challenging tasks by monitoring outcomes and adjusting behavior. For example, cingulo-opercular function benefits trial-level word recognition in noise for normal-hearing adults. Because auditory system deficits may limit the communicative benefit from adaptive control, we examined the extent to which cingulo-opercular engagement supports word recognition in noise for older adults with hearing loss (HL). Methods: Participants were selected to form groups with Less HL (n = 12; mean pure tone average [PTA] = 19.2 ± 4.8 dB HL [hearing level]) and More HL (n = 12; PTA = 38.4 ± 4.5 dB HL, 0.25–8 kHz, both ears). A word recognition task was performed with words presented in multitalker babble at +3 or +10 dB signal-to-noise ratios (SNRs) during a sparse acquisition fMRI experiment. The participants were middle-aged and older (ages: 64.1 ± 8.4 years) English speakers with no history of neurological or psychiatric diagnoses. Results: Elevated cingulo-opercular activity was associated with an increased likelihood of correct word recognition on the next trial (t(23) = 3.28, p = .003), and this association did not differ between hearing loss groups. During trials with word recognition errors, the More HL group exhibited higher blood oxygen level-dependent (BOLD) contrast in occipital and parietal regions compared with the Less HL group. Across listeners, more pronounced cingulo-opercular activity during recognition errors was associated with better overall word recognition performance. Conclusion: The trial-level word recognition benefit from cingulo-opercular activity was equivalent for both hearing loss groups. When speech audibility and performance levels are similar for older adults with mild to moderate hearing loss, cingulo-opercular adaptive control contributes to word recognition in noise.


Psychophysiology | 2017

Pupillometry reveals changes in physiological arousal during a sustained listening task.

Ronan Mcgarrigle; Piers Dawes; Andrew J. Stewart; Stefanie E. Kuchinsky; Kevin J. Munro

Hearing loss is associated with anecdotal reports of fatigue during periods of sustained listening. However, few studies have attempted to measure changes in arousal, as a potential marker of fatigue, over the course of a sustained listening task. The present study aimed to examine subjective, behavioral, and physiological indices of listening-related fatigue. Twenty-four normal-hearing young adults performed a speech-picture verification task at different signal-to-noise ratios (SNRs) while their pupil size was monitored and response times recorded. Growth curve analysis revealed a significantly steeper linear decrease in pupil size in the more challenging SNR, but only in the second half of the trial block. Changes in pupil dynamics over the course of the block with the more challenging listening condition suggest a reduction in physiological arousal. Behavioral and self-report measures did not reveal any differences between listening conditions. This is the first study to show reduced physiological arousal during a sustained listening task, with changes over time consistent with the onset of fatigue.


Neuropsychologia | 2011

Inferior Frontal Sensitivity to Common Speech Sounds Is Amplified by Increasing Word Intelligibility.

Kenneth I. Vaden; Stefanie E. Kuchinsky; Noam I. Keren; Kelly C. Harris; Jayne B. Ahlstrom; Judy R. Dubno; Mark A. Eckert

The left inferior frontal gyrus (LIFG) exhibits increased responsiveness when people listen to words composed of speech sounds that frequently co-occur in the English language (Vaden, Piquado, & Hickok, 2011), termed high phonotactic frequency (Vitevitch & Luce, 1998). The current experiment aimed to further characterize the relation of phonotactic frequency to LIFG activity by manipulating word intelligibility in participants of varying age. Thirty-six native English speakers, 19–79 years old (mean = 50.5, SD = 21.0), indicated with a button press whether they recognized 120 binaurally presented consonant-vowel-consonant words during a sparse sampling fMRI experiment (TR = 8 s). Word intelligibility was manipulated by low-pass filtering (cutoff frequencies of 400 Hz, 1000 Hz, 1600 Hz, and 3150 Hz). Group analyses revealed a significant positive correlation between phonotactic frequency and LIFG activity, which was unaffected by age and hearing thresholds. A region of interest analysis revealed that the relation between phonotactic frequency and LIFG activity was significantly strengthened for the most intelligible words (low-pass cutoff at 3150 Hz). These results suggest that the responsiveness of the left inferior frontal cortex to phonotactic frequency reflects the downstream impact of word recognition rather than support of word recognition, at least when there are no speech production demands.

Collaboration


Stefanie E. Kuchinsky's most frequent co-authors and their affiliations.

Top Co-Authors

Mark A. Eckert (Medical University of South Carolina)
Judy R. Dubno (Medical University of South Carolina)
Kenneth I. Vaden (Medical University of South Carolina)
Jayne B. Ahlstrom (Medical University of South Carolina)
Stephanie L. Cute (Medical University of South Carolina)
Kevin J. Munro (Central Manchester University Hospitals NHS Foundation Trust)
Piers Dawes (University of Manchester)
Larry E. Humes (Indiana University Bloomington)