Publication


Featured research published by Arielle Borovsky.


Cognition | 2013

Fast mapping, slow learning: Disambiguation of novel word–object mappings in relation to vocabulary learning at 18, 24, and 30 months

Ricardo A. H. Bion; Arielle Borovsky; Anne Fernald

When hearing a novel name, children tend to select a novel object rather than a familiar one, a bias known as disambiguation. Using online processing measures with 18-, 24-, and 30-month-olds, we investigate how the development of this bias relates to word learning. Children's proportion of looking time to a novel object after hearing a novel name related to their success in retention of the novel word, and also to their vocabulary size. However, skill in disambiguation and retention of novel words developed gradually: 18-month-olds did not show a reliable preference for the novel object after labeling; 24-month-olds reliably looked at a novel object on Disambiguation trials but showed no evidence of retention; and 30-month-olds succeeded on Disambiguation trials and showed only fragile evidence of retention. We conclude that the ability to find the referent of a novel word in ambiguous contexts is a skill that improves from 18 to 30 months of age. Word learning is characterized as an incremental process that is related to, but not dependent on, the emergence of disambiguation biases.


Cognition | 2010

Learning to use words: Event-related potentials index single-shot contextual word learning

Arielle Borovsky; Marta Kutas; Jeffrey L. Elman

Humans have the remarkable capacity to learn words from a single instance. The goal of this study was to examine the impact of initial learning context on the understanding of novel word usage using event-related brain potentials. Participants saw known and unknown words in strongly or weakly constraining sentence contexts. After each sentence context, word usage knowledge was assessed via plausibility ratings of these words as the objects of transitive verbs. Plausibility effects were observed in the N400 component to the verb only when the upcoming novel word object had initially appeared in a strongly constraining context. These results demonstrate that rapid word learning is modulated by contextual constraint and reveal a rapid mental process that is sensitive to novel word usage.


Journal of Child Language | 2006

Language input and semantic categories: a relation between cognition and early word learning.

Arielle Borovsky; Jeffrey L. Elman

Variations in the amount and nature of early language to which children are exposed have been linked to their subsequent language ability (e.g. Huttenlocher, Haight, Bryk, Seltzer & Lyons, 1991; Hart & Risley, 1995). In three computational simulations, we explore how differences in linguistic experience can explain differences in word learning ability through changes in the development of semantic category structure. More specifically, we manipulate the amount of language input, sentential complexity, and the frequency distribution of words within categories. In each of these simulations, improvements in category structure are tightly correlated with subsequent improvements in word learning ability, even when the nature of the input remains the same over time. These simulations suggest that variation in early language environments may result in differences in lexical proficiency by altering underlying cognitive abilities like categorization.


Language Learning and Development | 2012

Once Is Enough: N400 Indexes Semantic Integration of Novel Word Meanings from a Single Exposure in Context.

Arielle Borovsky; Jeffrey L. Elman; Marta Kutas

We investigated the impact of contextual constraint on the integration of novel word meanings into semantic memory. Adults read strongly or weakly constraining sentences ending in known or unknown (novel) words while electrical brain activity was recorded from the scalp. Word knowledge was assessed via a lexical decision task in which recently seen known and unknown word sentence endings served as primes for semantically related, unrelated, and synonym/identical target words. As expected, N400 amplitudes to target words preceded by known word primes were reduced by prime-target relatedness. Critically, N400 amplitudes to targets preceded by novel primes also varied with prime-target relatedness, but only when the novel primes had initially appeared in highly constraining sentences. This demonstrates for the first time that fast-mapped word representations can develop strong associations with semantically related word meanings and reveals a rapid neural process that can integrate information about word meanings into the mental lexicon of young adults.


Developmental Science | 2016

Lexical leverage: Category knowledge boosts real-time novel word recognition in 2-year-olds

Arielle Borovsky; Erica M. Ellis; Julia L. Evans; Jeffrey L. Elman

Recent research suggests that infants tend to add words to their vocabulary that are semantically related to other known words, though it is not clear why this pattern emerges. In this paper, we explore whether infants leverage their existing vocabulary and semantic knowledge when interpreting novel label-object mappings in real time. We initially identified categorical domains for which individual 24-month-old infants have relatively higher and lower levels of knowledge, irrespective of overall vocabulary size. Next, we taught infants novel words in these higher and lower knowledge domains and then asked if their subsequent real-time recognition of these items varied as a function of their category knowledge. While our participants successfully acquired the novel label-object mappings in our task, there were important differences in the way infants recognized these words in real time. Namely, infants showed more robust recognition of high (vs. low) domain knowledge words. These findings suggest that dense semantic structure facilitates early word learning and real-time novel word recognition.


Journal of Communication Disorders | 2013

Lexical activation during sentence comprehension in adolescents with history of Specific Language Impairment.

Arielle Borovsky; Erin Burns; Jeffrey L. Elman; Julia L. Evans

One remarkable characteristic of speech comprehension in typically developing (TD) children and adults is the speed with which the listener can integrate information across multiple lexical items to anticipate upcoming referents. Although children with Specific Language Impairment (SLI) show lexical deficits (Sheng & McGregor, 2010) and slower speed of processing (Leonard et al., 2007), relatively little is known about how these deficits manifest in real-time sentence comprehension. In this study, we examine lexical activation in the comprehension of simple transitive sentences in adolescents with a history of SLI and age-matched, TD peers. Participants listened to sentences of the form Article-Agent-Action-Article-Theme (e.g., The pirate chases the ship) while viewing pictures of four objects that varied in their relationship to the Agent and Action of the sentence (Target, Agent-Related, Action-Related, and Unrelated). Adolescents with SLI were as fast as their TD peers to fixate on the sentence's final item (the Target) but differed in their post-action-onset visual fixations to the Action-Related item. Additional exploratory analyses of the spatial distribution of their visual fixations revealed that the SLI group had a qualitatively different pattern of fixations to object images than did the control group. The findings indicate that adolescents with SLI integrate lexical information across words to anticipate likely or expected meanings with the same relative fluency and speed as do their TD peers. However, the failure of the SLI group to show increased fixations to Action-Related items after the onset of the action suggests lexical integration deficits that result in a failure to consider alternate sentence interpretations. Learning outcomes: As a result of this paper, the reader will be able to describe several benefits of using eye-tracking methods to study populations with language disorders. They should also recognize several potential explanations for lexical deficits in SLI, including possible reduced speed of processing and degraded lexical representations. Finally, they should recall the main outcomes of this study, including that adolescents with SLI show different timing and location of eye fixations while interpreting sentences than their age-matched peers.


Journal of Experimental Psychology: Learning, Memory and Cognition | 2015

Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

Amy M. Lieberman; Arielle Borovsky; Marla Hatrak; Rachel I. Mayberry

Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age of onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals are highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1) and in deaf adults who were late learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sublexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received.


Cognitive Science | 2017

Maternal Socioeconomic Status Influences the Range of Expectations during Language Comprehension in Adulthood.

Melissa Troyer; Arielle Borovsky

In infancy, maternal socioeconomic status (SES) is associated with real-time language processing skills, but whether or not (and if so, how) this relationship carries into adulthood is unknown. We explored the effects of maternal SES on college-aged adults' performance in eye-tracked spoken sentence comprehension tasks using the visual world paradigm. When sentences ended in highly plausible, expected target nouns (Exp. 1), higher SES was associated with a greater likelihood of considering alternative endings related to the action of the sentence. Moreover, for unexpected sentence endings (Exp. 2), individuals from higher SES backgrounds were sensitive to whether the ending was action-related (plausible) or unrelated (implausible), showing a benefit for plausible endings. Individuals from lower SES backgrounds did not show this advantage. This suggests that maternal SES can influence the dynamics of sentence processing even in adulthood, with consequences for processing unexpected content. These findings highlight the importance of early lexical experience for adult language skills.


Language, Cognition and Neuroscience | 2017

The amount and structure of prior event experience affects anticipatory sentence interpretation

Arielle Borovsky

Listeners easily interpret speech about novel events in everyday conversation; however, much of the research on mechanisms of spoken language comprehension, by design, capitalises on event knowledge that is familiar to most listeners. This paper explores how listeners generalise from previous experience during incremental processing of novel spoken sentences. In two studies, participants initially heard stories that conveyed novel event mappings between agents, actions and objects, and their ability to interpret a novel, related event in real time was measured via eye-tracking. A single exposure to a novel event was not sufficient to support generalisation in real-time sentence processing. When each story event was repeated with either the same agent or a different, related agent, listeners generalised in the repetition condition, but not in the multiple-agent condition. These findings shed light on the conditions under which listeners leverage prior event experience while interpreting novel linguistic signals in everyday speech.


Journal of the Acoustical Society of America | 2007

Multiple means of conveying information through sound: Comparisons of environmental sounds and spoken language processing using converging methodologies

Robert Leech; Alycia Cummings; Arielle Borovsky; Ayse Pinar Saygin

Environmental sounds are increasingly viewed as an attractive nonlinguistic analog for studying meaningful speech in that they can convey referential—or at least associative—information about objects, scenes, and events that unfold over time. However, environmental sounds also differ significantly from speech along other perceptual and informational parameters. These cross‐domain similarities and differences have proved useful in uncovering the perceptual and cognitive divisions of labor in the developing and mature brain. Our group has directly compared environmental sound and spoken language understanding in a series of behavioral and neuroimaging studies with infants, typically and atypically developing children, healthy adults, and aphasic patients. In general, our results suggest that environmental sounds and language share many of the same processing and neural resources over the lifespan. [This research is supported by the National Institutes of Health and the Medical Research Council.]

Collaboration


Dive into Arielle Borovsky's collaborations.

Top Co-Authors

Marta Kutas (University of California)
Julia L. Evans (University of California)
Erica M. Ellis (San Diego State University)
Kim Sweeney (University of California)
Marla Hatrak (University of California)
Melissa Troyer (University of California)