Publication


Featured research published by Daniel Swingley.


Proceedings of the National Academy of Sciences of the United States of America | 2012

At 6-9 months, human infants know the meanings of many common nouns.

Elika Bergelson; Daniel Swingley

It is widely accepted that infants begin learning their native language not by learning words, but by discovering features of the speech signal: consonants, vowels, and combinations of these sounds. Learning to understand words, as opposed to just perceiving their sounds, is said to come later, between 9 and 15 mo of age, when infants develop a capacity for interpreting others’ goals and intentions. Here, we demonstrate that this consensus about the developmental sequence of human language learning is flawed: in fact, infants already know the meanings of several common words from the age of 6 mo onward. We presented 6- to 9-mo-old infants with sets of pictures to view while their parent named a picture in each set. Over this entire age range, infants directed their gaze to the named pictures, indicating their understanding of spoken words. Because the words were not trained in the laboratory, the results show that even young infants learn ordinary words through daily experience with language. This surprising accomplishment indicates that, contrary to prevailing beliefs, either infants can already grasp the referential intentions of adults at 6 mo or infants can learn words before this ability emerges. The precocious discovery of word meanings suggests a perspective in which learning vocabulary and learning the sound structure of spoken language go hand in hand as language acquisition begins.


Psychological Science | 1998

Rapid Gains in Speed of Verbal Processing by Infants in the 2nd Year

Anne Fernald; John P. Pinto; Daniel Swingley; Amy Weinberg; Gerald W. McRoberts

Infants improve substantially in language ability during their 2nd year. Research on the early development of speech production shows that vocabulary begins to expand rapidly around the age of 18 months. During this period, infants also make impressive gains in understanding spoken language. We examined the time course of word recognition in infants ages 15 to 24 months, tracking their eye movements as they looked at pictures in response to familiar spoken words. The speed and efficiency of verbal processing increased dramatically over the 2nd year. Although 15-month-old infants did not orient to the correct picture until after the target word was spoken, 24-month-olds were significantly faster, shifting their gaze to the correct picture before the end of the spoken word. By 2 years of age, children are progressing toward the highly efficient performance of adults, making decisions about words based on incomplete acoustic information.
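
The measure at the heart of this paradigm is, in rough outline, the latency of the infant's first gaze shift to the named picture after target-word onset, on trials that begin with the infant fixating the other picture. The Python sketch below shows one minimal way such trial data might be reduced; the data layout, column names, and values are assumptions for illustration, not the authors' actual analysis pipeline.

```python
# Illustrative sketch (not the authors' analysis code) of how two-picture
# "looking-while-listening" trials can be reduced to a gaze-shift latency.
# Every column name and value below is hypothetical.
import pandas as pd

# One row per eye-tracking sample: trial id, time in ms from target-word
# onset, and which picture the infant is fixating.
samples = pd.DataFrame({
    "trial":    [1, 1, 1, 1, 2, 2, 2, 2],
    "time_ms":  [0, 100, 200, 300, 0, 100, 200, 300],
    "fixation": ["distractor", "distractor", "target", "target",
                 "target", "target", "target", "target"],
})

def shift_latency(trial_samples):
    """Latency (ms) of the first target fixation after word onset, counted
    only on trials where the infant starts on the distractor."""
    trial_samples = trial_samples.sort_values("time_ms")
    if trial_samples.iloc[0]["fixation"] != "distractor":
        return float("nan")  # already on the target; no shift to time
    on_target = trial_samples[trial_samples["fixation"] == "target"]
    return float(on_target["time_ms"].min()) if not on_target.empty else float("nan")

latencies = samples.groupby("trial").apply(shift_latency).dropna()
print("mean shift latency (ms):", latencies.mean())
```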


Cognitive Psychology | 2007

Lexical competition in young children's word learning.

Daniel Swingley; Richard N. Aslin

In two experiments, 1.5-year-olds were taught novel words whose sound patterns were phonologically similar to familiar words (novel neighbors) or were not (novel nonneighbors). Learning was tested using a picture-fixation task. In both experiments, children learned the novel nonneighbors but not the novel neighbors. In addition, exposure to the novel neighbors impaired recognition performance on familiar neighbors. Finally, children did not spontaneously use phonological differences to infer that a novel word referred to a novel object. Thus, lexical competition (inhibitory interaction among words in speech comprehension) can prevent children from using their full phonological sensitivity in judging words as novel. These results suggest that word learning in young children, as in adults, relies not only on the discrimination and identification of phonetic categories, but also on evaluating the likelihood that an utterance conveys a new word.
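
A "neighbor" here is a familiar word whose sound pattern differs only minimally from the new one. One common operationalization (assumed here for illustration; the paper defines its neighbor items experimentally) is an edit distance of one between phoneme strings, as in the sketch below over a made-up toy lexicon.

```python
# Minimal sketch of one common way to define phonological neighbors:
# two words are neighbors if their phoneme strings differ by a single
# substitution, insertion, or deletion (edit distance 1). The toy lexicon
# and transcriptions below are invented for illustration.

def edit_distance(a, b):
    """Standard Levenshtein distance over sequences of phoneme symbols."""
    prev = list(range(len(b) + 1))
    for i, pa in enumerate(a, start=1):
        cur = [i]
        for j, pb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (pa != pb)))   # substitution
        prev = cur
    return prev[-1]

# Hypothetical child lexicon, phonemically transcribed as tuples of symbols.
lexicon = {
    "ball": ("b", "A", "l"),
    "doll": ("d", "A", "l"),
    "dog":  ("d", "O", "g"),
}

def neighbors(novel, lexicon):
    """Familiar words within edit distance 1 of the novel form."""
    return [w for w, phones in lexicon.items()
            if edit_distance(novel, phones) == 1]

print(neighbors(("g", "A", "l"), lexicon))  # neighbor-like novel form -> ['ball', 'doll']
print(neighbors(("m", "i", "p"), lexicon))  # nonneighbor novel form -> []
```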


Language and Speech | 2003

Phonetic detail in the developing lexicon

Daniel Swingley

Although infants show remarkable sensitivity to linguistically relevant phonetic variation in speech, young children sometimes appear not to make use of this sensitivity. Here, children's knowledge of the sound forms of familiar words was assessed using a visual fixation task. Dutch 19-month-olds were shown pairs of pictures and heard correct pronunciations and mispronunciations of familiar words naming one of the pictures. Mispronunciations were word-initial in Experiment 1 and word-medial in Experiment 2, and in both experiments involved substituting one segment with [d] (a common sound in Dutch) or [g] (a rare sound). In both experiments, word recognition performance was better for correct pronunciations than for mispronunciations involving either substituted consonant. These effects did not depend upon children's knowledge of lexical or nonlexical phonological neighbors of the tested words. The results indicate the encoding of phonetic detail in words at 19 months.


Philosophical Transactions of the Royal Society B | 2009

Contributions of infant word learning to language development

Daniel Swingley

Infants learn the forms of words by listening to the speech they hear. Though little is known about the degree to which these forms are meaningful for young infants, the words still play a role in early language development. Words guide the infant to his or her first syntactic intuitions, aid in the development of the lexicon, and, it is proposed, may help infants learn phonetic categories.


Psychological Science | 2010

Conceptual Penetration of Visual Processing

Gary Lupyan; Sharon L. Thompson-Schill; Daniel Swingley

In traditional hierarchical models of information processing, visual representations feed into conceptual systems, but conceptual categories do not exert an influence on visual processing. We provide evidence, across four experiments, that conceptual information can in fact penetrate early visual processing, rather than merely biasing the output of perceptual systems. Participants performed physical-identity judgments on visually equidistant pairs of letter stimuli that were either in the same conceptual category (Bb) or in different categories (Bp). In the case of nonidentical letters, response times were longer when the stimuli were from the same conceptual category, but only when the letters were presented sequentially. The difference in effect size between simultaneous and sequential trials rules out a decision-level account. An additional experiment using animal silhouettes replicated the major effects found with letters. Thus, performance on an explicitly visual task was influenced by conceptual categories. This effect depended on processing time, immediately preceding experience, and stimulus typicality, which suggests that it was produced by the direct influence of category knowledge on perception, rather than by a postperceptual decision bias.


Language Learning and Development | 2010

Fast Mapping and Slow Mapping in Children's Word Learning

Daniel Swingley

When young children encounter a word they do not know, their guesses about what the word might mean are often surprisingly accurate. This is true not only with respect to the particular instance that the speaker refers to at that moment but also with respect to the entire category of things, states, situations, or events to which the word may refer in the language.

For more than 30 years, understanding how this is possible has been the central empirical and theoretical concern of most of the developmental psychologists and linguists who study the process of word learning experimentally. This attention has not been misplaced. The domain of word learning has provided a fertile ground for testing competing accounts of children’s understanding of reference in language, their use of ontological divisions and other world knowledge in categorization, and their grasp of syntactic regularities. On the whole, children have performed surprisingly well in these experiments—by the age of two or three, they make efficient and appropriate use of a wide range of sources of information in determining what speakers are referring to in the moment, and in evaluating how novel words may be used in future situations.

Most such tests have taken place in contrived but well-controlled experimental situations in which a brief exposure to a novel word, in a particular social or linguistic context, is revealed to lead children to choose one object or scene rather than another as a referent of the word. What gives such studies their force is the careful manipulation of the precise content of the introducing event, and the selection of the alternatives offered to the child, which pit one possibly tempting interpretation against another. The point is not that the experimental situation closely mimics children’s daily lives but rather that children’s interpretations reveal antecedent knowledge either innately specified or gained in development.

The experiments acknowledged as the primary intellectual ancestors to this research tradition are those reported in Carey and Bartlett (1978; see also Brown, 1957; Katz, Baker, & MacNamara, 1974). Carey and Bartlett (1978) introduced the term “fast mapping,” which has become central to developmental psychology’s narrative about how words are learned. In this narrative, it is children’s accuracy in fast mapping that cries out for explanation. How can children arrive at the correct meaning of a word given only indirect and incomplete evidence? Yet in Carey and Bartlett’s famous “chromium” study, fast mapping was not so successful. Fewer than one in ten of the 3-year-olds appeared to have linked the word to its intended meaning (olive green). The children who had been exposed to the word in the study’s naturalistic teaching context (“bring …


Journal of Experimental Psychology: Human Perception and Performance | 2009

Supervised and Unsupervised Learning of Multidimensional Acoustic Categories

Martijn Goudbeek; Daniel Swingley; Roel Smits

Learning to recognize the contrasts of a language-specific phonemic repertoire can be viewed as forming categories in a multidimensional psychophysical space. Research on the learning of distributionally defined visual categories has shown that categories defined over 1 dimension are easy to learn and that learning multidimensional categories is more difficult but tractable under specific task conditions. In 2 experiments, adult participants learned either a unidimensional or a multidimensional category distinction with or without supervision (feedback) during learning. The unidimensional distinctions were readily learned and supervision proved beneficial, especially in maintaining category learning beyond the learning phase. Learning the multidimensional category distinction proved to be much more difficult and supervision was not nearly as beneficial as with unidimensionally defined categories. Maintaining a learned multidimensional category distinction was only possible when the distributional information that identified the categories remained present throughout the testing phase. We conclude that listeners are sensitive to both trial-by-trial feedback and the distributional information in the stimuli. Even given limited exposure, listeners learned to use 2 relevant dimensions, albeit with considerable difficulty.
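
As a rough illustration of what "distributionally defined" categories mean, the toy simulation below draws two Gaussian "acoustic" categories over one or two stimulus dimensions and forms category prototypes either from labeled stimuli (supervised) or from the unlabeled distribution via k-means (unsupervised). All distribution parameters, the prototype learner, and the use of scikit-learn are assumptions of this sketch, not the experimental design or analysis reported in the paper.

```python
# Toy illustration (not the paper's design or analysis): two "acoustic"
# categories are defined only by their distributions in a 1-D or 2-D
# stimulus space. Supervised learning forms prototypes (category means)
# from labeled stimuli; unsupervised learning estimates them with k-means
# from unlabeled stimuli. All parameters here are arbitrary.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def make_categories(n_dims, n=200, separation=2.0):
    """Two Gaussian categories whose mean difference is spread evenly over
    all dimensions, so a 2-D distinction requires combining both cues."""
    offset = np.full(n_dims, separation / np.sqrt(n_dims))
    a = rng.normal(np.zeros(n_dims), 1.0, size=(n, n_dims))
    b = rng.normal(offset, 1.0, size=(n, n_dims))
    return np.vstack([a, b]), np.array([0] * n + [1] * n)

def prototype_accuracy(x, y, prototypes):
    """Classify each stimulus by its nearest prototype and score against
    the true labels (allowing for swapped cluster labels)."""
    dists = np.linalg.norm(x[:, None, :] - prototypes[None, :, :], axis=2)
    acc = (dists.argmin(axis=1) == y).mean()
    return max(acc, 1 - acc)

for n_dims in (1, 2):
    x, y = make_categories(n_dims)
    supervised = np.stack([x[y == 0].mean(axis=0), x[y == 1].mean(axis=0)])
    unsupervised = KMeans(n_clusters=2, n_init=10, random_state=0).fit(x).cluster_centers_
    print(f"{n_dims}-D  supervised: {prototype_accuracy(x, y, supervised):.2f}"
          f"  unsupervised: {prototype_accuracy(x, y, unsupervised):.2f}")
```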


Quarterly Journal of Experimental Psychology | 2012

Self-directed speech affects visual search performance.

Gary Lupyan; Daniel Swingley

People often talk to themselves, yet very little is known about the functions of this self-directed speech. We explore effects of self-directed speech on visual processing by using a visual search task. According to the label feedback hypothesis (Lupyan, 2007a), verbal labels can change ongoing perceptual processing; for example, actually hearing “chair” compared to simply thinking about a chair can temporarily make the visual system a better “chair detector”. Participants searched for common objects while sometimes being asked to speak the target's name aloud. Speaking facilitated search, particularly when there was a strong association between the name and the visual target. As the discrepancy between the name and the target increased, speaking began to impair performance. Together, these results speak to the power of words to modulate ongoing visual processing.


Language Learning and Development | 2015

Early Word Comprehension in Infants: Replication and Extension.

Elika Bergelson; Daniel Swingley

A handful of recent experimental reports have shown that infants of 6–9 months know the meanings of some common words. Here, we replicate and extend these findings. With a new set of items, we show that when young infants (age 6–16 months, n = 49) are presented with side-by-side video clips depicting various common early words, and one clip is named in a sentence, they look at the named video at above-chance rates. We demonstrate anew that infants understand common words by 6–9 months and that performance increases substantially around 14 months. The results imply that 6- to 9-month-olds’ failure to understand words not referring to objects (verbs, adjectives, performatives) in a similar prior study is not attributable to the use of dynamic video depictions. Thus, 6- to 9-month-olds’ experience of spoken language includes some understanding of common words for concrete objects, but relatively impoverished comprehension of other words.
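
"Above-chance rates" in a two-alternative display is typically evaluated against the 50% looking expected by chance, for example with a one-sample t-test on each infant's proportion of target looking. The sketch below illustrates that kind of comparison on invented proportions; it is not the paper's data or its exact statistical analysis.

```python
# Illustrative only: with two videos on screen, chance looking to the named
# video is 50%, so "above chance" can be tested with a one-sample t-test on
# per-infant proportions of target looking. The values below are invented.
from scipy import stats

proportions = [0.58, 0.61, 0.47, 0.55, 0.63, 0.52, 0.49, 0.60]  # hypothetical data

t, p_two_sided = stats.ttest_1samp(proportions, popmean=0.5)
p_one_sided = p_two_sided / 2 if t > 0 else 1 - p_two_sided / 2  # directional test

print(f"t = {t:.2f}, one-sided p = {p_one_sided:.3f}")
```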

Collaboration


Dive into Daniel Swingley's collaborations.

Top Co-Authors

Gary Lupyan

University of Wisconsin-Madison


Colman Humphrey

University of Pennsylvania
