Corianne Rogalsky
Arizona State University
Publications
Featured research published by Corianne Rogalsky.
Neuropsychologia | 2015
Corianne Rogalsky; Tasha Poppa; Kuan Hua Chen; Steven W. Anderson; Hanna Damasio; Tracy Love; Gregory Hickok
For more than a century, speech repetition has been used as an assay for gauging the integrity of the auditory-motor pathway in aphasia, classically thought to involve a linkage between Wernicke's area and Broca's area via the arcuate fasciculus. During the last decade, evidence primarily from functional imaging in healthy individuals has refined this picture both computationally and anatomically, suggesting the existence of a cortical hub located at the parietal-temporal boundary (area Spt) that functions to integrate auditory and motor speech networks for both repetition and spontaneous speech production. While functional imaging research can pinpoint the regions activated in repetition/auditory-motor integration, lesion-based studies are needed to infer causal involvement. Previous lesion studies of repetition have yielded mixed results with respect to Spt's critical involvement in speech repetition. The present study used voxel-based lesion-symptom mapping (VLSM) to investigate the neuroanatomy of repetition of both real words and non-words in a sample of 47 patients with focal left-hemisphere brain damage. The VLSM analyses identified a large voxel cluster spanning gray and white matter in the left temporal-parietal junction, including area Spt, where damage was significantly related to poor non-word repetition. Repetition of real words implicated a very similar dorsal network, including area Spt. Cortical regions including Spt were implicated in repetition performance even when white matter damage was factored out. In addition, removing variance associated with speech perception abilities did not alter the overall lesion pattern for either task. Together with past functional imaging work, our results suggest that area Spt is integral to both word and non-word repetition, that its contribution is above and beyond that made by white matter pathways, and that it is not driven by perceptual processes alone. These findings are highly consistent with the claim that Spt is an area of sensory-motor translation in speech processing.
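To make the VLSM logic concrete, here is a minimal sketch of the core computation, not the authors' pipeline: at each voxel, patients are split by lesion status and their behavioral scores are compared with a t-test. The variable names, toy data, and thresholds are all assumptions for illustration.

```python
# Minimal voxel-based lesion-symptom mapping (VLSM) sketch.
# Illustrative only; real analyses add lesion-volume covariates,
# minimum-overlap thresholds, and multiple-comparison correction.
import numpy as np
from scipy import stats

n_patients, n_voxels = 47, 1000                       # toy dimensions
rng = np.random.default_rng(0)
lesions = rng.integers(0, 2, (n_patients, n_voxels))  # 1 = voxel lesioned
scores = rng.normal(80, 10, n_patients)               # repetition accuracy (%)

t_map = np.full(n_voxels, np.nan)
p_map = np.full(n_voxels, np.nan)
for v in range(n_voxels):
    lesioned = scores[lesions[:, v] == 1]
    spared = scores[lesions[:, v] == 0]
    if len(lesioned) >= 5 and len(spared) >= 5:       # minimum group size
        t_map[v], p_map[v] = stats.ttest_ind(lesioned, spared)

# Voxels where damage predicts significantly worse performance
candidates = np.where((p_map < 0.05) & (t_map < 0))[0]
```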
Frontiers in Human Neuroscience | 2014
Gregory Hickok; Corianne Rogalsky; Rong Chen; Edward H. Herskovits; Sarah Townsley; Argye E. Hillis
We tested the hypothesis that motor planning and programming of speech articulation and verbal short-term memory (vSTM) depend on partially overlapping networks of neural regions. We evaluated this proposal by testing 76 individuals with acute ischemic stroke for impairment in motor planning of speech articulation (apraxia of speech, AOS) and vSTM within the first day of stroke, before the opportunity for recovery or reorganization of structure-function relationships. We also evaluated areas of both infarct and low blood flow that might have contributed to AOS or impaired vSTM in each person. We found that AOS was associated with tissue dysfunction in motor-related areas (posterior primary motor cortex, pars opercularis, premotor cortex, insula) and sensory-related areas (primary somatosensory cortex, secondary somatosensory cortex, parietal operculum/auditory cortex), whereas impaired vSTM was associated with primarily motor-related areas (pars opercularis, pars triangularis, premotor cortex, and primary motor cortex). These results are consistent with the hypothesis, also supported by functional imaging data, that both speech praxis and vSTM rely on partially overlapping networks of brain regions.
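One simple way to test a region-by-deficit association of the kind described above is a 2x2 contingency test per region; a sketch, with invented counts (this is not the authors' statistical pipeline):

```python
# Region-wise association sketch for the acute-stroke analysis above:
# does tissue dysfunction (infarct or low blood flow) in a region
# co-occur with AOS more often than expected? Counts are invented.
from scipy.stats import fisher_exact

# 2x2 table for one region, e.g. pars opercularis (hypothetical counts):
#                  AOS present   AOS absent
# dysfunction          14             8
# no dysfunction        4            50
odds_ratio, p = fisher_exact([[14, 8], [4, 50]], alternative="greater")
print(f"OR = {odds_ratio:.1f}, p = {p:.4f}")
```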
Language, Cognition and Neuroscience | 2015
Corianne Rogalsky; Diogo Almeida; Jon Sprouse; Gregory Hickok
The role of Broca's area in sentence processing is hotly debated. Hypotheses include that Broca's area supports sentence comprehension via syntax-specific processes, hierarchical structure building, or working memory. Here we adopt a within-subject functional magnetic resonance imaging (fMRI) approach using sentence-level contrasts and non-sentential comparison tasks to address these hypotheses. Standard syntactic movement distance effects were replicated, but no difference was found between movement and non-movement sentences in Broca's area in the group analysis or consistently in the individual-subject analyses. Group and individual results both identify subregions of Broca's area that are selective for sentence structure. Group, but not individual-subject, results suggest shared resources for sentence processing and articulation in Broca's area. We conclude that Broca's area is not selectively processing syntactic movement, but that subregions are selectively responsive to sentence structure. Our findings reinforce Fedorenko and Kanwisher's call for individual-subject analyses in Broca's area, as group findings can obscure selective responses.
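As an illustration of the kind of within-subject sentence-level contrast described here, a first-level fMRI model could be fit with nilearn. This is a sketch, not the study's actual analysis; the file names, TR, and condition labels are assumptions.

```python
# Hypothetical first-level fMRI contrast: movement vs. non-movement sentences.
# A sketch using nilearn; paths, TR, and condition names are made up.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

events = pd.DataFrame({
    "onset":      [0, 12, 24, 36],          # seconds
    "duration":   [6, 6, 6, 6],
    "trial_type": ["movement", "non_movement", "movement", "non_movement"],
})

model = FirstLevelModel(t_r=2.0, smoothing_fwhm=5.0)
model = model.fit("subj01_bold.nii.gz", events=events)

# z-map for the movement > non-movement contrast at each voxel
z_map = model.compute_contrast("movement - non_movement", output_type="z_score")
z_map.to_filename("subj01_movement_gt_nonmovement_z.nii.gz")
```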
Frontiers in Psychology | 2015
Arianna N. LaCroix; Alvaro Diaz; Corianne Rogalsky
The relationship between the neurobiology of speech and music has been investigated for more than a century. There remains no widespread agreement regarding how (or to what extent) music perception utilizes the neural circuitry that is engaged in speech processing, particularly at the cortical level. Prominent models such as Patel's Shared Syntactic Integration Resource Hypothesis (SSIRH) and Koelsch's neurocognitive model of music perception suggest a high degree of overlap, particularly in the frontal lobe, but also perhaps more distinct representations in the temporal lobe, with hemispheric asymmetries. The present meta-analysis used activation likelihood estimation (ALE) analyses to identify the brain regions consistently activated for music as compared to speech across the functional neuroimaging (fMRI and PET) literature. Eighty music and 91 speech neuroimaging studies of healthy adult control subjects were analyzed. Peak activations reported in the music and speech studies were divided into four paradigm categories: passive listening, discrimination tasks, error/anomaly detection tasks, and memory-related tasks. We then compared activation likelihood estimates within each category for music vs. speech, and compared each music condition with passive listening. We found that listening to music and to speech preferentially activates distinct bilateral temporo-parietal cortical networks. We also found music and speech to have shared resources in the left pars opercularis but speech-specific resources in the left pars triangularis. The extent to which music recruited speech-activated frontal resources was modulated by task. While there are certainly limitations to meta-analysis techniques, particularly regarding sensitivity, this work suggests that the extent of shared resources between speech and music may be task-dependent, and it highlights the need to consider how task effects may affect conclusions regarding the neurobiology of speech and music.
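The core of ALE is straightforward: each reported peak is blurred with a Gaussian kernel into a per-study "modeled activation" map, and the maps are combined as a probabilistic union. A sketch on a toy 1-D grid (real ALE tools such as GingerALE or NiMARE work in 3-D, with empirically derived kernel widths and permutation-based significance testing; the peaks and kernel width below are invented):

```python
# Core of an activation likelihood estimation (ALE) analysis, sketched
# in numpy on a toy 1-D "brain" for clarity.
import numpy as np

grid = np.arange(0, 100.0)                 # toy 1-D brain grid (mm)
sigma = 5.0                                # Gaussian kernel width (assumed)
study_peaks = [[20, 45], [22], [48, 80]]   # reported peaks per study (toy)

def modeled_activation(peaks):
    """Per-study map: probability a voxel lies near any reported peak."""
    maps = [np.exp(-((grid - p) ** 2) / (2 * sigma**2)) for p in peaks]
    return 1 - np.prod([1 - m for m in maps], axis=0)

ma_maps = [modeled_activation(p) for p in study_peaks]
# ALE value: probability that at least one study activates each voxel
ale = 1 - np.prod([1 - m for m in ma_maps], axis=0)
print(grid[np.argmax(ale)])                # location of peak convergence
```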
Neuropsychologia | 2016
Kayoko Okada; Corianne Rogalsky; Lucinda O'Grady; Leila Hanaumi; Ursula Bellugi; David P. Corina; Gregory Hickok
Since the discovery of mirror neurons, there has been a great deal of interest in understanding the relationship between perception and action, and the role of the human mirror system in language comprehension and production. Two questions have dominated research. One concerns the role of Broca's area in speech perception. The other concerns the role of the motor system more broadly in understanding action-related language. The current study investigates both of these questions in a way that bridges research on language with research on manual actions. We studied the neural basis of observing and executing American Sign Language (ASL) object and action signs. In an fMRI experiment, deaf signers produced signs depicting actions and objects, and also observed/comprehended signs of actions and objects. Different patterns of activation were found for observation and execution, although with overlap in Broca's area, providing prima facie support for the claim that the motor system participates in language perception. In contrast, we found no evidence that action-related signs differentially involved the motor system compared to object-related signs. These findings are discussed in the context of lesion studies of sign language execution and observation. In this broader context, we conclude that the activation in Broca's area during ASL observation is not causally related to sign language understanding.
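Observation/execution overlap of the kind reported here is often assessed with a conjunction of thresholded statistical maps; a minimal sketch (the z-map file names and the z > 3.1 threshold are assumptions, not the study's settings):

```python
# Hypothetical conjunction analysis: voxels active during both sign
# observation and sign execution, using nilearn's math_img.
from nilearn.image import math_img

overlap = math_img(
    "1.0 * ((obs > 3.1) & (exe > 3.1))",   # both contrasts suprathreshold
    obs="group_observation_z.nii.gz",
    exe="group_execution_z.nii.gz",
)
overlap.to_filename("observation_AND_execution_overlap.nii.gz")
```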
Neurobiology of Language | 2016
Corianne Rogalsky
There are numerous computations necessary for successful sentence comprehension, including phonological, semantic, syntactic, and combinatorial processes. Although Broca’s area has long been the focus of sentence-level research, the anterior temporal lobe (ATL) has emerged as a strong candidate for supporting sentence processing because there is ample evidence that the ATL is sensitive to the presence of sentence structure. This chapter focuses on how the ATL may contribute to sentence comprehension, in particular in relation to basic syntactic and combinatorial semantic operations.
Journal of Cognitive Neuroscience | 2018
Corianne Rogalsky; Arianna N. LaCroix; Kuan Hua Chen; Steven W. Anderson; Hanna Damasio; Tracy Love; Gregory Hickok
Broca's area has long been implicated in sentence comprehension. Damage to this region is thought to be the central source of "agrammatic comprehension," in which performance is substantially worse (and near chance) on sentences with noncanonical word orders compared with canonical word-order sentences (in English). This claim is supported by functional neuroimaging studies demonstrating greater activation in Broca's area for noncanonical versus canonical sentences. However, functional neuroimaging studies also have frequently implicated the anterior temporal lobe (ATL) in sentence processing more broadly, and recent lesion-symptom mapping studies have implicated the ATL and mid-temporal regions in agrammatic comprehension. This study investigates these seemingly conflicting findings in 66 left-hemisphere patients with chronic focal cerebral damage. Patients completed two sentence comprehension measures: sentence-picture matching and plausibility judgments. Patients with damage including Broca's area (but excluding the temporal lobe; n = 11) on average did not exhibit the expected agrammatic comprehension pattern; for example, their performance was >80% on noncanonical sentences in the sentence-picture matching task. Patients with ATL damage (n = 18) also did not exhibit an agrammatic comprehension pattern. Across our entire patient sample, the lesions of patients with agrammatic comprehension patterns in either task had maximal overlap in posterior superior temporal and inferior parietal regions. Using voxel-based lesion-symptom mapping, we find that lower performance on canonical and noncanonical sentences in each task is associated with damage to a large left superior temporal-inferior parietal network including portions of the ATL, but not Broca's area. Notably, however, response bias in plausibility judgments was significantly associated with damage to inferior frontal cortex, including gray and white matter in Broca's area, suggesting that the contribution of Broca's area to sentence comprehension may be related to task-related cognitive demands.
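In a plausibility-judgment task, response bias can be separated from sensitivity using standard signal detection theory; a minimal sketch of that computation (the trial counts below are invented and the log-linear correction is one common convention, not necessarily the one used in the paper):

```python
# Separating sensitivity (d') from response bias (criterion c) in a
# yes/no plausibility-judgment task.
from scipy.stats import norm

def dprime_and_bias(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
    return d_prime, criterion

# e.g., a patient who says "plausible" too often shows a liberal bias (c < 0)
d, c = dprime_and_bias(hits=38, misses=2, false_alarms=25, correct_rejections=15)
print(f"d' = {d:.2f}, criterion c = {c:.2f}")
```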
Frontiers in Psychology | 2018
Lisa M. Johnson; Megan C. Fitzhugh; Yuji Yi; Soren Mickelsen; Leslie C. Baxter; Pamela Howard; Corianne Rogalsky
The neurobiology of sentence comprehension is well studied, but the properties and characteristics of sentence processing networks remain unclear and highly debated. Sign languages (i.e., visual-manual languages), like spoken languages, have complex grammatical structures and thus can provide valuable insights into the specificity and function of brain regions supporting sentence comprehension. The present study aims to characterize how these well-studied spoken language networks can adapt in adults to be responsive to sign language sentences, which contain combinatorial semantic and syntactic visual-spatial linguistic information. Twenty native English-speaking undergraduates who had completed introductory American Sign Language (ASL) courses viewed videos of the following conditions during fMRI acquisition: signed sentences, signed word lists, English sentences, and English word lists. Overall, our results indicate that native language (L1) sentence processing resources are responsive to ASL sentence structures in late L2 learners, but that certain L1 sentence processing regions respond differently to L2 ASL sentences, likely due to the nature of their contribution to language comprehension. For example, L1 sentence regions in Broca's area were significantly more responsive to L2 than to L1 sentences, supporting the hypothesis that Broca's area contributes to sentence comprehension as a cognitive resource when increased processing is required. Anterior temporal L1 sentence regions were sensitive to L2 ASL sentence structure but demonstrated no significant difference in activation between L1 and L2 sentences, suggesting that their contribution to sentence processing is modality-independent. Posterior superior temporal L1 sentence regions also responded to ASL sentence structure but were more activated by English than by ASL sentences. An exploratory analysis of the neural correlates of L2 ASL proficiency indicates that ASL proficiency is positively correlated with increased activation in response to ASL sentences in L1 sentence processing regions. Overall, these results suggest that the well-established fronto-temporal spoken language networks involved in sentence processing exhibit functional plasticity with late L2 ASL exposure, and thus are adaptable to syntactic structures widely different from those in an individual's native language. Our findings also provide valuable insights into the unique contributions of the inferior frontal and superior temporal regions that are frequently implicated in sentence comprehension but whose exact roles remain highly debated.
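The exploratory proficiency analysis amounts to a brain-behavior correlation; a toy sketch of the idea (the proficiency scores and ROI beta values are invented, and the actual analysis may have been voxelwise rather than ROI-based):

```python
# Correlate each learner's L2 ASL proficiency with mean activation
# (beta) to ASL sentences in an L1 sentence-processing ROI. Toy data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
proficiency = rng.normal(70, 10, 20)                     # 20 learners (toy)
roi_betas = 0.02 * proficiency + rng.normal(0, 0.3, 20)  # ROI betas (toy)

r, p = pearsonr(proficiency, roi_betas)
print(f"r = {r:.2f}, p = {p:.3f}")
```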
Brain and Language | 2018
Visar Berisha; Davis Gilton; Leslie C. Baxter; Steven R. Corman; Chris Blais; Gene A. Brewer; Scott W. Ruston; B. Hunter Ball; Kimberly M. Wingert; Beate Peter; Corianne Rogalsky
Highlights: Structural MRIs of monolinguals and Farsi-English bilinguals were compared. A decision tree classifier correctly classified bilinguals as such 85% of the time. Decision tree classifiers are a promising tool for predicting bilingualism.
The neurobiology of bilingualism is hotly debated. The present study examines whether normalized cortical measurements can be used to reliably classify monolinguals versus bilinguals in a structural MRI dataset of Farsi-English bilinguals and English monolinguals. A decision tree classifier classified bilinguals with an average correct classification rate of 85%, and monolinguals with a rate of 71.4%. The most relevant regions for classification were the right supramarginal gyrus, left inferior temporal gyrus, and left inferior frontal gyrus. Larger studies with carefully matched monolingual and bilingual samples are needed to confirm that features of these regions can reliably categorize monolingual and bilingual brains. Nonetheless, the present findings suggest that a single structural MRI scan, analyzed with measures readily available using default procedures in free open-access software (FreeSurfer), can be used to reliably predict an individual's language experience using a decision tree classifier, and that Farsi-English bilingualism implicates regions identified in previous group-level studies of bilingualism in other languages.
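A minimal sketch of this classification approach: a decision tree trained on per-subject cortical measurements, with cross-validated accuracy and feature importances. The feature matrix and labels below are invented; the paper's actual features come from FreeSurfer's default outputs.

```python
# Decision tree classification of monolingual vs. bilingual brains
# from normalized cortical measurements. Toy data for illustration.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
# rows = subjects; columns = e.g. thickness/volume of candidate regions
# (right supramarginal, left inferior temporal, left inferior frontal, ...)
X = rng.normal(size=(40, 6))
y = np.array([0] * 20 + [1] * 20)          # 0 = monolingual, 1 = bilingual

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # cross-validated accuracy
print(f"mean accuracy: {scores.mean():.2f}")

# Which measurements drive the split decisions?
clf.fit(X, y)
print(clf.feature_importances_)
```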
Cortex | 2018
Gregory Hickok; Corianne Rogalsky; William Matchin; Alexandra Basilakos; Julia Cai; Michelle Ferrill; Soren Mickelsen; Steven W. Anderson; Tracy Love; Jeffrey R. Binder; Julius Fridriksson