Publication


Featured research published by Katherine J. Midgley.


Psychological Science | 2015

A Thousand Words Are Worth a Picture: Snapshots of Printed-Word Processing in an Event-Related Potential Megastudy

Stéphane Dufau; Jonathan Grainger; Katherine J. Midgley; Phillip J. Holcomb

In the experiment reported here, approximately 1,000 words were presented to 75 participants in a go/no-go lexical decision task while event-related potentials (ERPs) were recorded. Partial correlations were computed for variables selected to reflect orthographic, lexical, and semantic processing, as well as for a novel measure of the visual complexity of written words. Correlations were based on the item-level ERPs at each electrode site and time slice, with a false-discovery-rate correction applied. Early effects of visual complexity were seen around 50 ms after word onset, followed by the earliest sustained orthographic effects around 100 to 150 ms, with the bulk of orthographic and lexical influences arising after 200 ms. Effects of a semantic variable (concreteness) emerged later, at around 300 ms. The overall time course of these ERP effects is in line with hierarchical, cascaded, interactive accounts of word recognition, in which fast feed-forward influences are consolidated by top-down feedback via recurrent processing loops.
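The electrode-by-time correlation mapping this abstract describes can be sketched briefly in code. The following is a rough illustration only, not the authors' pipeline: the array layout, the predictor names, and the choice of SciPy/statsmodels routines are assumptions made for the example.

```python
# Rough sketch of an electrode-by-time partial-correlation analysis (NOT the
# authors' code). Assumed inputs: `erp` with shape (n_items, n_electrodes,
# n_times) holding item-level ERP amplitudes, and a DataFrame `preds` with one
# row per item and hypothetical columns such as 'frequency', 'length',
# 'concreteness', 'complexity'.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.multitest import fdrcorrection

def partial_corr(y, x, covars):
    """Correlation between y and x after regressing the covariates out of both."""
    C = np.column_stack([np.ones(len(x)), covars])
    rx = x - C @ np.linalg.lstsq(C, x, rcond=None)[0]
    ry = y - C @ np.linalg.lstsq(C, y, rcond=None)[0]
    return stats.pearsonr(rx, ry)  # (r, p)

def variable_map(erp, preds, target, covariates):
    """Partial correlation of one lexical variable with the ERP at every
    electrode and time slice, FDR-corrected across the whole map."""
    n_items, n_elec, n_times = erp.shape
    x = preds[target].to_numpy(float)
    covars = preds[covariates].to_numpy(float)
    r = np.zeros((n_elec, n_times))
    p = np.zeros((n_elec, n_times))
    for e in range(n_elec):
        for t in range(n_times):
            r[e, t], p[e, t] = partial_corr(erp[:, e, t], x, covars)
    rejected, _ = fdrcorrection(p.ravel(), alpha=0.05)
    return r, rejected.reshape(n_elec, n_times)
```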


Journal of Neurolinguistics | 2016

Neural changes underlying early stages of L2 vocabulary acquisition

He Pu; Phillip J. Holcomb; Katherine J. Midgley

Research has shown neural changes following second language (L2) acquisition after weeks or months of instruction. But are such changes detectable even earlier than previously shown? The present study examines the electrophysiological changes underlying the earliest stages of second language vocabulary acquisition by recording event-related potentials (ERPs) within the first week of learning. Adult native English speakers with no previous Spanish experience completed less than four hours of Spanish vocabulary training, with pre- and post-training ERPs recorded during a backward translation task. Results indicate that beginning L2 learners show rapid neural changes following learning, manifested in changes to the N400, an ERP component sensitive to lexicosemantic processing and degree of L2 proficiency. Specifically, learners in early stages of L2 acquisition show growth in N400 amplitude to L2 words following learning as well as a backward translation N400 priming effect that was absent pre-training. These results were shown within days of minimal L2 training, suggesting that the neural changes captured during adult second language acquisition are more rapid than previously shown. Such findings are consistent with models of early stages of bilingualism in adult learners of L2 (e.g. Kroll and Stewart's RHM) and reinforce the use of ERP measures to assess L2 learning.


Brain and Language | 2017

Implicit co-activation of American Sign Language in deaf readers: An ERP study

Gabriela Meade; Katherine J. Midgley; Zed Sevcikova Sehyr; Phillip J. Holcomb; Karen Emmorey

Highlights:
- Deaf bilinguals read word pairs; some had ASL translations with form overlap.
- N400-like (anterior) and RT effects in the group unaware of form overlap at debrief.
- Lexicosemantic interactivity underlies implicit co-activation of sign language.
- Weaker suppression of the non-target language from another modality leads to an RT effect.
- Late ERP effect in the group aware of the form manipulation related to explicit translation.

In an implicit phonological priming paradigm, deaf bimodal bilinguals made semantic relatedness decisions for pairs of English words. Half of the semantically unrelated pairs had phonologically related translations in American Sign Language (ASL). As in previous studies with unimodal bilinguals, targets in pairs with phonologically related translations elicited smaller negativities than targets in pairs with phonologically unrelated translations within the N400 window. This suggests that the same lexicosemantic mechanism underlies implicit co-activation of a non-target language, irrespective of language modality. In contrast to unimodal bilingual studies that find no behavioral effects, we observed phonological interference, indicating that bimodal bilinguals may not suppress the non-target language as robustly. Further, there was a subset of bilinguals who were aware of the ASL manipulation (determined by debrief), and they exhibited an effect of ASL phonology in a later time window (700–900 ms). Overall, these results indicate modality-independent language co-activation that persists longer for bimodal bilinguals.


Language, Cognition and Neuroscience | 2018

Electrophysiological evidence for the interaction of prosody and thematic fit during sentence comprehension

Shannon M. Sheppard; Katherine J. Midgley; Tracy Love; Lewis P. Shapiro; Phillip J. Holcomb

This study investigated the interaction of prosody and thematic fit/plausibility information during the processing of sentences containing temporary early closure (correct) or late closure (incorrect) syntactic ambiguities, using event-related potentials (ERPs). Early closure sentences with congruent and incongruent prosody were presented, where the temporarily ambiguous NP was either a plausible or an implausible continuation for the subordinate verb (e.g. “While the band played the song/beer pleased all the customers.”). N400 and P600 components were examined at critical points in each condition. The closure positive shift (CPS) was examined in sentences with congruent prosody. Prosodic and thematic fit cues interacted immediately (N400–P600) at the implausible NP (beer) when it was paired with incongruent prosody. Incongruent prosody paired with a plausible NP (song) resulted in garden-path effects (N400–P600) at the critical verb (pleased). These findings provide strong evidence that prosodic and thematic fit/plausibility cues interact to aid the parser in syntactic structure building.


Language, Cognition and Neuroscience | 2018

Phonological and semantic priming in American Sign Language: N300 and N400 effects

Gabriela Meade; Brittany Lee; Katherine J. Midgley; Phillip J. Holcomb; Karen Emmorey

This study investigated the electrophysiological signatures of phonological and semantic priming in American Sign Language (ASL). Deaf signers made semantic relatedness judgments to pairs of ASL signs separated by a 1300 ms prime-target SOA. Phonologically related sign pairs shared two of three phonological parameters (handshape, location, and movement). Target signs preceded by phonologically related and semantically related prime signs elicited smaller negativities within the N300 and N400 windows than those preceded by unrelated primes. N300 effects, typically reported in studies of picture processing, are interpreted to reflect the mapping from the visual features of the signs to more abstract linguistic representations. N400 effects, consistent with rhyme priming effects in the spoken language literature, are taken to index lexico-semantic processes that appear to be largely modality independent. Together, these results highlight both the unique visual-manual nature of sign languages and the linguistic processing characteristics they share with spoken languages.


Archive | 2017

A Neuro-cognitive View of the Bilingual Brain

Katherine J. Midgley

This neurocognitive view of the bilingual brain presents different directions and methods of exploration that have led us to our current understanding of bilingual language processing. Keeping in mind that the question of bilingual language processing is vast, here our primary focus is on the processing of words in one or both of a bilingual’s two languages.


Neuropsychologia | 2017

The N170 ERP component differs in laterality, distribution, and association with continuous reading measures for deaf and hearing readers

Karen Emmorey; Katherine J. Midgley; Casey B. Kohen; Zed Sevcikova Sehyr; Phillip J. Holcomb

The temporo-occipitally distributed N170 ERP component is hypothesized to reflect print-tuning in skilled readers. This study investigated whether skilled deaf and hearing readers (matched on reading ability, but not phonological awareness) exhibit similar N170 patterns, given their distinct experiences learning to read. Thirty-two deaf and 32 hearing adults viewed words and symbol strings in a familiarity judgment task. In the N170 epoch (120–240 ms), hearing readers produced greater negativity for words than symbols at left hemisphere (LH) temporo-parietal and occipital sites, while deaf readers only showed this asymmetry at occipital sites. Linear mixed effects regression was used to examine the influence of continuous measures of reading, spelling, and phonological skills on the N170 (120–240 ms). For deaf readers, better reading ability was associated with a larger N170 over the right hemisphere (RH), but for hearing readers better reading ability was associated with a smaller RH N170. Better spelling ability was related to larger occipital N170s in deaf readers, but this relationship was weak in hearing readers. Better phonological awareness was associated with smaller N170s in the LH for hearing readers, but this association was weaker and in the RH for deaf readers. The results support the phonological mapping hypothesis for a left-lateralized temporo-parietal N170 in hearing readers and indicate that skilled reading is characterized by distinct patterns of neural tuning to print in deaf and hearing adults.

Highlights:
- Deaf and hearing adults (equal reading ability) saw words and symbol strings in an ERP study.
- Deaf readers (poor phonological skill) showed a bilateral temporo-parietal N170.
- Better deaf readers had a larger RH N170; better hearing readers had a smaller RH N170.
- Better spellers in both groups had larger occipital N170s.
- Results support the phonological mapping hypothesis for the LH N170 in hearing readers.
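The linear mixed effects regression mentioned in the abstract above could, in outline, look like the sketch below. The data layout, column names, and use of statsmodels are assumptions made for illustration; the paper's actual model specification is not reproduced here.

```python
# Illustrative mixed-effects regression in the spirit of the N170 analysis
# described above (not the authors' code). Assumed DataFrame `df` with one row
# per participant x hemisphere, holding the mean amplitude in the 120-240 ms
# window ('n170'), standardized skill scores, and a 'group' column
# ('deaf' / 'hearing'); all column names are hypothetical.
import statsmodels.formula.api as smf

model = smf.mixedlm(
    "n170 ~ group * hemisphere * (reading + spelling + phon_awareness)",
    data=df,
    groups=df["subject"],  # random intercept per participant
)
result = model.fit()
print(result.summary())
```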


Neuropsychologia | 2018

Orthographic and phonological selectivity across the reading system in deaf skilled readers

Laurie S. Glezer; Jill Weisberg; Cindy O’Grady Farnady; Stephen McCullough; Katherine J. Midgley; Phillip J. Holcomb; Karen Emmorey

People who are born deaf often have difficulty learning to read. Recently, several studies have examined the neural substrates involved in reading in deaf people and found a left-lateralized reading system similar to that of hearing people, involving temporo-parietal, inferior frontal, and ventral occipito-temporal cortices. Previous studies in typical hearing readers show that within this reading network there are separate regions that specialize in processing orthography and phonology. We used fMRI rapid adaptation in deaf adults who were skilled readers to examine neural selectivity in three functional ROIs in the left hemisphere: temporoparietal cortex (TPC), inferior frontal gyrus (IFG), and the visual word form area (VWFA). Results show that in deaf skilled readers, the left VWFA showed selectivity for orthography similar to what has been reported for hearing readers, the TPC showed less sensitivity to phonology than previously reported for hearing readers using the same paradigm, and the IFG showed selectivity to orthography, but not phonology (similar to what has been reported previously for hearing readers). These results provide evidence that while skilled deaf readers demonstrate coarsely tuned phonological representations in the TPC, they develop finely tuned representations for the orthography of written words in the VWFA and IFG. This result suggests that phonological tuning in the TPC may have little impact on the neural network associated with skilled reading for deaf adults.

Highlights:
- An fMRI rapid adaptation paradigm explored neural tuning in skilled deaf readers.
- Both the left and right VWFA showed selectivity to whole-word orthography.
- Sensitivity, but not selectivity, to phonology was found in temporoparietal cortex.
- Skilled deaf readers develop finely tuned neural representations of written words.
- Phonological tuning may have little impact on reading success for deaf adults.
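As a rough illustration of the adaptation logic described above (selectivity shows up as a release from adaptation when the probed dimension changes), a minimal ROI contrast might look like the following. The data frame layout, condition labels, and function names are hypothetical and not taken from the paper.

```python
# Conceptual sketch of an ROI release-from-adaptation contrast (not the
# authors' analysis). Assumed DataFrame `roi` with one row per
# subject x ROI x condition, where 'psc' is the mean percent signal change and
# 'condition' is one of 'same', 'diff_orth' (orthographic change), or
# 'diff_phon' (phonological change) -- hypothetical labels.
import pandas as pd
from scipy import stats

def release_from_adaptation(roi, region, change_condition):
    """Mean response difference (change vs. exact repetition) and paired t-test."""
    wide = (roi[roi["roi"] == region]
            .pivot(index="subject", columns="condition", values="psc"))
    t, p = stats.ttest_rel(wide[change_condition], wide["same"])
    return wide[change_condition].mean() - wide["same"].mean(), t, p

for region in ["VWFA", "TPC", "IFG"]:
    for change in ["diff_orth", "diff_phon"]:
        effect, t, p = release_from_adaptation(roi, region, change)
        print(f"{region} {change}: release = {effect:.3f}, t = {t:.2f}, p = {p:.3f}")
```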


Language, Cognition and Neuroscience | 2018

An electrophysiological megastudy of spoken word recognition

Kurt Winsler; Katherine J. Midgley; Jonathan Grainger; Phillip J. Holcomb

This study used electrophysiological recordings to a large sample of spoken words to track the time-course of word frequency, phonological neighbourhood density, concreteness, and stimulus duration effects in two experiments. Fifty subjects were presented with more than a thousand spoken words during either a go/no-go lexical decision task (Experiment 1) or a go/no-go semantic categorisation task (Experiment 2) while EEG was collected. Linear mixed effects modelling was used to analyse the data. Effects of word frequency were found on the N400 and also as early as 100 ms in Experiment 1 but not Experiment 2. Phonological neighbourhood density produced an early effect around 250 ms and the typical N400 effect. Concreteness elicited effects in later epochs on the N400. Stimulus duration affected all epochs and its influence reflected changes in the timing of the ERP components. Overall the results support cascaded interactive models of spoken word recognition.


Frontiers in Psychology | 2018

An ERP Investigation of L2–L1 Translation Priming in Adult Learners

Gabriela Meade; Katherine J. Midgley; Phillip J. Holcomb

A longstanding debate centers around how beginning adult bilinguals process words in their second language (L2). Do they access the meaning of the L2 words directly or do they first activate the native language (L1) translation equivalents in order to access meaning? To address this question, we used ERPs to investigate how newly learned L2 words influence processing of their L1 translation equivalents. We taught participants the meanings of 80 novel L2 (pseudo)words by presenting them with pictures of familiar objects. After 3 days of learning, participants were tested in a backward translation priming paradigm with a short (140 ms) stimulus onset asynchrony. L1 targets preceded by their L2 translations elicited faster responses and smaller amplitude negativities than the same L1 targets preceded by unrelated L2 words. The bulk of the ERP translation priming effect occurred within the N400 window (350–550 ms), suggesting that the new L2 words were automatically activating their semantic representations. A weaker priming effect in the preceding window (200–350 ms) was found at anterior sites, providing some evidence that the forms of the L1 translation equivalents had also been activated. These results have implications for models of L2 processing at the earliest stages of learning.
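A minimal sketch of how the priming effects described in this abstract could be quantified is given below. It is illustrative only; the column names and data layout are assumptions, not the authors' analysis code.

```python
# Rough sketch of a translation-priming contrast (not the authors' code).
# Assumed long-format DataFrame `trials` with one row per trial and
# hypothetical columns: 'subject', 'prime_type' ('translation' or 'unrelated'),
# 'rt' (response time in ms), and 'n400_amp' (mean amplitude over the
# 350-550 ms window at the analysed sites).
import pandas as pd
from scipy import stats

def priming_effect(trials, measure):
    """Per-subject unrelated-minus-translation difference and a paired t-test."""
    per_subj = (trials.groupby(["subject", "prime_type"])[measure]
                      .mean().unstack("prime_type"))
    t, p = stats.ttest_rel(per_subj["unrelated"], per_subj["translation"])
    return (per_subj["unrelated"] - per_subj["translation"]).mean(), t, p

print(priming_effect(trials, "rt"))        # behavioural priming
print(priming_effect(trials, "n400_amp"))  # N400 priming effect
```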

Collaboration


Dive into Katherine J. Midgley's collaborations.

Top Co-Authors

Karen Emmorey (San Diego State University)
Gabriela Meade (University of California)
Kurt Winsler (San Diego State University)
Lewis P. Shapiro (San Diego State University)
Tracy Love (San Diego State University)
Brittany Lee (University of California)