Sharon Coffey-Corina
University of California, Davis
Publications
Featured research published by Sharon Coffey-Corina.
Developmental Neuropsychology | 1997
Debra L. Mills; Sharon Coffey-Corina; Helen J. Neville
The purpose of this study was to examine developmental changes in the organization of brain activity linked to comprehension of single words in 13‐ to 20‐month‐old infants. Event‐related potentials (ERPs) were recorded as children listened to a series of words whose meanings were understood by the child, words whose meanings the child did not understand, and backward words. The results were consistent with a previous study suggesting that ERPs differed as a function of word meaning within 200 ms after word onset. At 13 to 17 months, ERP differences between comprehended and unknown words were bilateral and broadly distributed over anterior and posterior regions. In contrast, at 20 months of age these effects were limited to temporal and parietal regions of the left hemisphere. The results are discussed in relation to the general effects of maturation, the maturation of language‐relevant brain systems, and the development of brain systems linked to level of ability independent of chronological age. We offer...
Journal of Cognitive Neuroscience | 1993
Debra L. Mills; Sharon Coffey-Corina; Helen J. Neville
The purpose of the present study was to examine patterns of neural activity relevant to language processing in 20-month-old infants, and to determine whether or not changes in cerebral organization occur as a function of specific changes in language development. Event-related potentials (ERPs) were recorded as children listened to a series of words whose meaning was understood by the child, words whose meaning the child did not understand, and backward words. The results showed that specific and different ERP components discriminated comprehended words from unknown and from backward words. Distinct lateral and anterior-posterior specializations were apparent in ERP responsiveness to the different types of words. Moreover, the results suggested that increasing language abilities were associated with increasing cerebral specialization for language processing over the temporal and parietal regions of the left hemisphere.
Journal of Cognitive Neuroscience | 2001
Giordana Grossi; Donna Coch; Sharon Coffey-Corina; Phillip J. Holcomb; Helen J. Neville
We employed a visual rhyming priming paradigm to characterize the development of brain systems important for phonological processing in reading. We studied 109 right-handed, native English speakers within eight age groups: 7-8, 9-10, 11-12, 13-14, 15-16, 17-18, 19-20, and 21-23 years. Participants decided whether two written words (prime-target) rhymed (JUICE-MOOSE) or not (CHAIR-MOOSE). In similar studies of adults, two main event-related potential (ERP) effects have been described: a negative slow wave to primes, larger over anterior regions of the left hemisphere and hypothesized to index rehearsal of the primes, and a negative deflection to targets, peaking at 400-450 msec, maximal over right temporal-parietal regions, larger for nonrhyming than rhyming targets, and hypothesized to index phonological matching. In this study, these two ERP effects were observed in all age groups; however, the two effects showed different developmental timecourses. On the one hand, the frontal asymmetry to primes increased with age; moreover, this asymmetry was correlated with reading and spelling scores, even after controlling for age. On the other hand, the distribution and onset of the more posterior rhyming effect (RE) were stable across age groups, suggesting that phonological matching relied on similar neural systems across these ages. Behaviorally, both reaction times and accuracy improved with age. These results suggest that different aspects of phonological processing rely on different neural systems that have different developmental timecourses.
Biological Psychiatry | 2010
Ali Mazaheri; Sharon Coffey-Corina; George R. Mangun; Evelijne M. Bekker; Anne S. Berry; Blythe A. Corbett
BACKGROUND Current pathophysiologic models of attention-deficit/hyperactivity disorder (ADHD) suggest that impaired functional connectivity within brain attention networks may contribute to the disorder. In this electroencephalographic (EEG) study, we analyzed cross-frequency amplitude correlations to investigate differences in cue-induced functional connectivity in typically developing children and children with ADHD. METHODS Electroencephalographic activity was recorded in 25 children aged 8 to 12 years (14 with ADHD) while they performed a cross-modal attention task in which cues signaled the most likely (.75 probability) modality of an upcoming target. The power spectra of the EEG in the theta (3-5 Hz) and alpha (8-12 Hz) bands were calculated for the 1-sec interval after the cue and before the target while subjects prepared to discriminate the expected target. RESULTS Both groups showed behavioral benefits of the predictive attentional cues, being faster and more accurate for validly cued targets (e.g., visual target preceded by a cue predicting a visual target) than for invalidly cued targets (e.g., visual target preceded by a cue predicting an auditory target); in addition, independent of cue-target validity, typical children were faster to respond overall. In the typically developing children, the alpha activity was differentially modulated by the two cues and anticorrelated with midfrontal theta activity; these EEG correlates of attentional control were not observed in the children with ADHD. CONCLUSIONS Our findings provide neurophysiological evidence for a specific deficit in top-down attentional control in children with ADHD that is manifested as a functional disconnection between frontal and occipital cortex.
Developmental Science | 2002
Donna Coch; Giordana Grossi; Sharon Coffey-Corina; Phillip J. Holcomb; Helen J. Neville
In a simple auditory rhyming paradigm requiring a button-press response (rhyme/nonrhyme) to the second word (target) of each spoken stimulus pair, both the early (P50, N120, P200, N240) and late (CNV, N400, P300) components of the ERP waveform evidenced considerable change from middle childhood to adulthood. In addition, behavioral accuracy and reaction time improved with increasing age. In contrast, the size, distribution and latency of each of several rhyming effects (including the posterior N400 rhyming effect, a left hemisphere anterior rhyming effect, and early rhyming effects on P50 latency, N120 latency and P200 amplitude) remained constant from age 7 to adulthood. These results indicate that the neurocognitive networks involved in processing auditory rhyme information, as indexed by the present task, are well established and have an adult-like organization at least by the age of 7.
PLOS ONE | 2013
Patricia K. Kuhl; Sharon Coffey-Corina; Denise Padden; Jeffrey Munson; Annette Estes; Geraldine Dawson
Autism Spectrum Disorder (ASD) is a developmental disability that affects social behavior and language acquisition. ASD exhibits great variability in outcomes, with some individuals remaining nonverbal and others exhibiting average or above average function. Cognitive ability contributes to heterogeneity in autism and serves as a modest predictor of later function. We show that a brain measure (event-related potentials, ERPs) of word processing in children with ASD, assessed at the age of 2 years (N = 24), is a broad and robust predictor of receptive language, cognitive ability, and adaptive behavior at ages 4 and 6 years, regardless of the form of intensive clinical treatment during the intervening years. The predictive strength of this brain measure increases over time, and exceeds the predictive strength of a measure of cognitive ability, used here for comparison. These findings have theoretical implications and may eventually lead to neural measures that allow early prediction of developmental outcomes as well as more individually tailored clinical interventions, with the potential for greater effectiveness in treating children with ASD.
Biological Psychiatry | 2014
Ali Mazaheri; Catherine Fassbender; Sharon Coffey-Corina; Tadeus A. Hartanto; Julie B. Schweitzer; George R. Mangun
BACKGROUND A neurobiological-based classification of attention-deficit/hyperactivity disorder (ADHD) subtypes has thus far remained elusive. The aim of this study was to use oscillatory changes in the electroencephalogram (EEG) related to informative cue processing, motor preparation, and top-down control to investigate neurophysiological differences between typically developing (TD) adolescents, and those diagnosed with predominantly inattentive (IA) or combined (CB) (associated with symptoms of inattention as well as impulsivity/hyperactivity) subtypes of ADHD. METHODS The EEG was recorded from 57 rigorously screened adolescents (12 to 17 years of age; 23 TD, 17 IA, and 17 CB), while they performed a cued flanker task. We examined the oscillatory changes in theta (3-5 Hz), alpha (8-12 Hz), and beta (22-25 Hz) EEG bands after cues that informed participants with which hand they would subsequently be required to respond. RESULTS Relative to TD adolescents, the IA group showed significantly less postcue alpha suppression, suggesting diminished processing of the cue in the visual cortex, whereas the CB group showed significantly less beta suppression at the electrode contralateral to the cued response hand, suggesting poor motor planning. Finally, both ADHD subtypes showed weak functional connectivity between frontal theta and posterior alpha, suggesting common top-down control impairment. CONCLUSIONS We found both distinct and common task-related neurophysiological impairments in ADHD subtypes. Our results suggest that task-induced changes in EEG oscillations provide an objective measure, which in conjunction with other sources of information might help distinguish between ADHD subtypes and therefore aid in diagnoses and evaluation of treatment.
Journal of the Acoustical Society of America | 2008
Sharon Coffey-Corina; Denise Padden; Patricia K. Kuhl
Children with Autism Spectrum Disorder (ASD) participated in a research study that involved both electrophysiological and behavioral measures. Event related brain potentials (ERPs) were recorded during auditory presentation of known and unknown words. Behavioral measures of language/cognitive function and severity of autism symptoms were also collected at the time of ERP testing and again one year later. In general, higher functioning children with ASD exhibited more localized brain effects for differences between known and unknown words. Lower functioning children with ASD had more diffuse patterns of response to the different word classes and also exhibited a stronger right hemisphere lateralization. That is, they showed differences between known and unknown words at many electrode sites and larger differences in the right hemisphere. In addition, significant correlations were obtained between specific brain wave measurements for both known and unknown words and the various behavioral measures. Patterns of ERPs effectively predicted later behavioral scores.
Frontiers in Psychology | 2017
David P. Corina; Shane Blau; Todd LaMarr; Laurel A. Lawyer; Sharon Coffey-Corina
Deaf children who receive a cochlear implant early in life and engage in intensive oral/aural therapy often make great strides in spoken language acquisition. However, despite clinicians’ best efforts, there is a great deal of variability in language outcomes. One concern is that cortical regions which normally support auditory processing may become reorganized for visual function, leaving fewer available resources for auditory language acquisition. The conditions under which these changes occur are not well understood, but we may begin investigating this phenomenon by looking for interactions between auditory and visual evoked cortical potentials in deaf children. If children with abnormal auditory responses show increased sensitivity to visual stimuli, this may indicate the presence of maladaptive cortical plasticity. We recorded evoked potentials, using both auditory and visual paradigms, from 25 typical hearing children and 26 deaf children (ages 2–8 years) with cochlear implants. An auditory oddball paradigm was used (85% /ba/ syllables vs. 15% frequency modulated tone sweeps) to elicit an auditory P1 component. Visual evoked potentials (VEPs) were recorded during presentation of an intermittent peripheral radial checkerboard while children watched a silent cartoon, eliciting a P1–N1 response. We observed reduced auditory P1 amplitudes and a lack of the latency shift associated with normative aging in our deaf sample. We also observed shorter latencies in N1 VEPs to visual stimulus offset in deaf participants. While these data demonstrate cortical changes associated with auditory deprivation, we did not find evidence for a relationship between cortical auditory evoked potentials and the VEPs. This is consistent with descriptions of intra-modal plasticity within visual systems of deaf children, but does not provide evidence for cross-modal plasticity. In addition, we note that sign language experience had no effect on deaf children’s early auditory and visual ERP responses.
Journal of the Acoustical Society of America | 2006
Patricia K. Kuhl; Sharon Coffey-Corina; Denise Padden; Maritza Rivera-Gaxiola
Infants raised in monolingual families are equally good at native and non‐native speech discrimination early in life. By 12 months, performance on native speech has significantly improved while non‐native performance declines. We tested bilingual American infants at 7 and 11 months of age on native (/ta‐pa/) and non‐native (Mandarin affricate‐fricative) contrasts used in the monolingual tests. Phonetic discrimination was assessed using behavioral (conditioned head‐turn) and brain (event‐related potential) measures. The MacArthur Communicative Development Inventory estimated infants’ developing language skills at 14, 18, 24, and 30 months of age. The monolingual data [Kuhl et al., Language Learning and Development (2005)] demonstrated that at 7 months of age, infants’ native and non‐native speech perception skills predicted their later language development, but differentially. Better native phonetic perception predicted more rapid language development between 14 and 30 months, whereas better non‐native phone...