
Publication

Featured research published by Sarah Chabal.


PLOS ONE | 2012

CLEARPOND: Cross-Linguistic Easy-Access Resource for Phonological and Orthographic Neighborhood Densities

Viorica Marian; James Bartolotti; Sarah Chabal; Anthony Shook

Past research has demonstrated cross-linguistic, cross-modal, and task-dependent differences in neighborhood density effects, indicating a need to control for neighborhood variables when developing and interpreting research on language processing. The goals of the present paper are two-fold: (1) to introduce CLEARPOND (Cross-Linguistic Easy-Access Resource for Phonological and Orthographic Neighborhood Densities), a centralized database of phonological and orthographic neighborhood information, both within and between languages, for five commonly-studied languages: Dutch, English, French, German, and Spanish; and (2) to show how CLEARPOND can be used to compare general properties of phonological and orthographic neighborhoods across languages. CLEARPOND allows researchers to input a word or list of words and obtain phonological and orthographic neighbors, neighborhood densities, mean neighborhood frequencies, word lengths by number of phonemes and graphemes, and spoken-word frequencies. Neighbors can be defined by substitution, deletion, and/or addition, and the database can be queried separately along each metric or summed across all three. Neighborhood values can be obtained both within and across languages, and outputs can optionally be restricted to neighbors of higher frequency. To enable researchers to more quickly and easily develop stimuli, CLEARPOND can also be searched by features, generating lists of words that meet precise criteria, such as a specific range of neighborhood sizes, lexical frequencies, and/or word lengths. CLEARPOND is freely-available to researchers and the public as a searchable, online database and for download at http://clearpond.northwestern.edu.
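The neighbor metrics described above (substitution, deletion, and addition, queryable separately or summed) can be illustrated with a short sketch. This is not CLEARPOND's actual implementation, only a minimal stand-in over a toy lexicon showing how the three neighbor definitions combine into a neighborhood density count:

```python
# Sketch of CLEARPOND-style orthographic neighbor counting (illustrative
# only; the toy lexicon and function names are assumptions, not the
# database's real code or word lists).

def substitution_neighbors(word, lexicon):
    """Lexicon words of equal length differing in exactly one position."""
    return {w for w in lexicon
            if len(w) == len(word) and w != word
            and sum(a != b for a, b in zip(word, w)) == 1}

def deletion_neighbors(word, lexicon):
    """Lexicon words formed by deleting one letter from the target."""
    candidates = {word[:i] + word[i + 1:] for i in range(len(word))}
    return candidates & set(lexicon)

def addition_neighbors(word, lexicon):
    """Lexicon words formed by inserting one letter into the target."""
    return {w for w in lexicon
            if len(w) == len(word) + 1
            and any(w[:i] + w[i + 1:] == word for i in range(len(w)))}

def neighborhood(word, lexicon):
    """All neighbors, summed across the three metrics."""
    return (substitution_neighbors(word, lexicon)
            | deletion_neighbors(word, lexicon)
            | addition_neighbors(word, lexicon))

lexicon = ["cat", "bat", "cot", "cast", "at", "cats", "dog"]
print(sorted(neighborhood("cat", lexicon)))  # ['at', 'bat', 'cast', 'cats', 'cot']
```

Here the neighborhood density of "cat" is 5; restricting the output to higher-frequency neighbors, as the database allows, would simply filter this set by a frequency threshold.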


Neuropsychologia | 2015

Task dependent lexicality effects support interactive models of reading: A meta-analytic neuroimaging review

Chris McNorgan; Sarah Chabal; Daniel O'Young; Sladjana Lukic; James R. Booth

Models of reading must explain how orthographic input activates a phonological representation, and elicits the retrieval of word meaning from semantic memory. Comparisons between tasks that theoretically differ with respect to the degree to which they rely on connections between orthographic, phonological and semantic systems during reading can thus provide valuable insight into models of reading, but such direct comparisons are not well-represented in the literature. An ALE meta-analysis explored lexicality effects directly contrasting words and pseudowords using the lexical decision task and overt or covert naming, which we assume rely most on the semantic and phonological systems, respectively. Interactions between task and lexicality effects demonstrate that different demands of the lexical decision and naming tasks lead to different manifestations of lexicality effects.


Journal of Experimental Psychology: General | 2015

Speakers of different languages process the visual world differently.

Sarah Chabal; Viorica Marian

Language and vision are highly interactive. Here we show that people activate language when they perceive the visual world, and that this language information impacts how speakers of different languages focus their attention. For example, when searching for an item (e.g., clock) in the same visual display, English and Spanish speakers look at different objects. Whereas English speakers searching for the clock also look at a cloud, Spanish speakers searching for the clock also look at a gift, because the Spanish names for gift (regalo) and clock (reloj) overlap phonologically. These different looking patterns emerge despite an absence of direct language input, showing that linguistic information is automatically activated by visual scene processing. We conclude that the varying linguistic information available to speakers of different languages affects visual perception, leading to differences in how the visual world is processed.


Attention, Perception, & Psychophysics | 2015

Audio-visual object search is changed by bilingual experience

Sarah Chabal; Scott R. Schroeder; Viorica Marian

The current study examined the impact of language experience on the ability to efficiently search for objects in the face of distractions. Monolingual and bilingual participants completed an ecologically-valid, object-finding task that contained conflicting, consistent, or neutral auditory cues. Bilinguals were faster than monolinguals at locating the target item, and eye movements revealed that this speed advantage was driven by bilinguals’ ability to overcome interference from visual distractors and focus their attention on the relevant object. Bilinguals fixated the target object more often than did their monolingual peers, who, in contrast, attended more to a distracting image. Moreover, bilinguals’, but not monolinguals’, object-finding ability was positively associated with their executive control ability. We conclude that bilinguals’ executive control advantages extend to real-world visual processing and object finding within a multi-modal environment.


Archive | 2015

In the Mind's Eye: Eye-Tracking and Multi-modal Integration During Bilingual Spoken-Language Processing

Sarah Chabal; Viorica Marian

The human language system integrates information from multiple sources and modalities. Bilinguals, who experience increased processing demands due to competition between their two languages, may be especially likely to rely on cues from multiple modalities. The cross-modal integration involved in language processing has been frequently studied using eye-tracking, an approach that can accommodate the simultaneous presence of both auditory and visual inputs. Eye-tracking research has demonstrated that bilinguals activate both of their languages in parallel (e.g., Spivey and Marian 1999; Weber and Cutler 2004), leading to competition both within (e.g., the English word “marker” competes with the phonologically similar word “marble”) and between (e.g., the English word “marker” competes with the Russian word “marka”) their two languages (Marian and Spivey 2003a, b). Interestingly, this competition arises even when languages do not share phonology (e.g., bimodal bilinguals; Shook and Marian 2012) and in the absence of explicit linguistic input (Chabal and Marian 2015), demonstrating the highly interactive nature of bilingual language processing. This interactivity in the language system yields changes to bilingual cognitive function. For example, bilinguals’ need to control phonological competition is associated with enhanced executive control (Blumenfeld and Marian 2011), and their experience suppressing information from the non-target language improves their ability to learn novel vocabulary (Bartolotti and Marian 2012). We conclude that multi-modal investigations of language processing such as those employing eye-tracking are not only ecologically valid, as they closely resemble real-world multi-modal situations, but also can demonstrate how language interacts with other cognitive and perceptual functions in a non-modular mind.


Brain and Language | 2014

Differential recruitment of executive control regions during phonological competition in monolinguals and bilinguals

Viorica Marian; Sarah Chabal; James Bartolotti; Kailyn A.L. Bradley; Arturo E. Hernandez


Cognitive Science | 2013

Language experience modifies the processing of visual input.

Sarah Chabal; Viorica Marian


Cognitive Science | 2013

CLEARPOND: Phonological and orthographic neighborhood information for multiple languages.

Anthony Shook; Sarah Chabal; James Bartolotti; Viorica Marian


Proceedings of the Annual Meeting of the Cognitive Science Society | 2011

The Impact of Musical Experience on Statistical Language Learning

Anthony Shook; Viorica Marian; James Bartolotti; Scott R. Schroeder; Sarah Chabal


Collaboration

Dive into Sarah Chabal's collaborations.

Top Co-Authors
Kailyn A.L. Bradley

Icahn School of Medicine at Mount Sinai
