Publication


Featured research published by Athena Vouloumanos.


Journal of Cognitive Neuroscience | 2001

Detection of Sounds in the Auditory Stream: Event-Related fMRI Evidence for Differential Activation to Speech and Nonspeech

Athena Vouloumanos; Kent A. Kiehl; Janet F. Werker; Peter F. Liddle

The detection of speech in an auditory stream is a requisite first step in processing spoken language. In this study, we used event-related fMRI to investigate the neural substrates mediating detection of speech compared with that of nonspeech auditory stimuli. Unlike previous studies addressing this issue, we contrasted speech with nonspeech analogues that were matched along key temporal and spectral dimensions. In an oddball detection task, listeners heard nonsense speech sounds, matched sine wave analogues (complex nonspeech), or single tones (simple nonspeech). Speech stimuli elicited significantly greater activation than both complex and simple nonspeech stimuli in classic receptive language areas, namely the middle temporal gyri bilaterally and in a locus lateralized to the left posterior superior temporal gyrus. In addition, speech activated a small cluster of the right inferior frontal gyrus. The activation of these areas in a simple detection task, which requires neither identification nor linguistic analysis, suggests they play a fundamental role in speech processing.


Developmental Psychology | 2009

Infants' Learning of Novel Words in a Stochastic Environment

Athena Vouloumanos; Janet F. Werker

In everyday word learning words are only sometimes heard in the presence of their referent, making the acquisition of novel words a particularly challenging task. The current study investigated whether children (18-month-olds who are novice word learners) can track the statistics of co-occurrence between words and objects to learn novel mappings in a stochastic environment. Infants were briefly trained on novel word-novel object pairs with variable degrees of co-occurrence: Words were either paired reliably with 1 referent or stochastically paired with 2 different referents with varying probabilities. Infants were sensitive to the co-occurrence statistics between words and referents, tracking not just the strongest available contingency but also low-frequency information. The statistical strength of the word-referent mapping may also modulate real-time online lexical processing in infants. Infants are thus able to track stochastic relationships between words and referents in the process of learning novel words.


Nature Neuroscience | 2003

Does Broca's play by the rules?

Gary F. Marcus; Athena Vouloumanos; Ivan A. Sag

Languages may all share and be constrained by a universal grammar. A new study shows that Broca's area (long thought to participate in grammatical aspects of language) becomes increasingly active as participants acquire rules from a foreign language, but not as they acquire comparable rules that are inconsistent with real languages. Could Broca's area be a neural substrate for universal grammar?


Cognition | 2006

From semantics to syntax and back again: argument structure in the third year of life.

Keith J. Fernandes; Gary F. Marcus; Jennifer A. Di Nubila; Athena Vouloumanos

An essential part of the human capacity for language is the ability to link conceptual or semantic representations with syntactic representations. On the basis of data from spontaneous production, it has been suggested that young children acquire such links on a verb-by-verb basis, with little in the way of a general understanding of linguistic argument structure. Here, we suggest that a receptive understanding of argument structure--including principles linking syntax and conceptual/semantic structure--appears earlier. In a forced-choice pointing task we have shown that toddlers in the third year of life can map a single scene (involving a novel causative action paired with a novel verb) onto two distinct syntactic frames (transitive and intransitive). This suggests that even before toddlers begin generalizing argument structure in their own speech, they have some representation of conceptual/semantic categories, syntactic categories, and a system that links the two.


Attention Perception & Psychophysics | 2007

Discriminating languages by speech-reading

Salvador Soto-Faraco; Jordi Navarra; Whitney M. Weikum; Athena Vouloumanos; Núria Sebastián-Gallés; Janet F. Werker

The goal of this study was to explore the ability to discriminate languages using the visual correlates of speech (i.e., speech-reading). Participants were presented with silent video clips of an actor pronouncing two sentences (in Catalan and/or Spanish) and were asked to judge whether the sentences were in the same language or in different languages. Our results established that Spanish-Catalan bilingual speakers could discriminate running speech from their two languages on the basis of visual cues alone (Experiment 1). However, we found that this ability was critically restricted by linguistic experience, since Italian and English speakers who were unfamiliar with the test languages could not successfully discriminate the stimuli (Experiment 2). A test of Spanish monolingual speakers revealed that knowledge of only one of the two test languages was sufficient to achieve the discrimination, although at a lower level of accuracy than that seen in bilingual speakers (Experiment 3). Finally, we evaluated the ability to identify the language by speech-reading particularly distinctive words (Experiment 4). The results obtained are in accord with recent proposals arguing that the visual speech signal is rich in informational content, above and beyond what traditional accounts based solely on visemic confusion matrices would predict.


NeuroImage | 2003

Abnormal processing of speech during oddball target detection in schizophrenia

Elton T.C. Ngan; Athena Vouloumanos; Tara A. Cairo; Kristin R. Laurens; Alan T. Bates; Cameron M. Anderson; Janet F. Werker; Peter F. Liddle

Healthy subjects show increased activation in left temporal lobe regions in response to speech sounds compared to complex nonspeech sounds. Abnormal lateralization of speech-processing regions in the temporal lobes has been posited to be a cardinal feature of schizophrenia. Event-related fMRI was used to test the hypothesis that schizophrenic patients would show an abnormal pattern of hemispheric lateralization when detecting speech compared with complex nonspeech sounds in an auditory oddball target-detection task. We predicted that differential activation for speech in the vicinity of the superior temporal sulcus would be greater in schizophrenic patients than in healthy subjects in the right hemisphere, but less in patients than in healthy subjects in the left hemisphere. Fourteen patients with schizophrenia (selected from an outpatient population, 2 females, 12 males, mean age 35.1 years) and 29 healthy subjects (8 females, 21 males, mean age 29.3 years) were scanned while they performed an auditory oddball task in which the oddball stimuli were either speech sounds or complex nonspeech sounds. Compared to controls, individuals with schizophrenia showed greater differential activation between speech and nonspeech in right temporal cortex, left superior frontal cortex, and the left temporal-parietal junction. The magnitude of the difference in the left temporal-parietal junction was significantly correlated with severity of disorganized thinking. This study supports the hypothesis that aberrant functional lateralization of speech processing is an underlying feature of schizophrenia and suggests the magnitude of the disturbance in speech-processing circuits may be associated with severity of disorganized thinking.


Proceedings of the National Academy of Sciences of the United States of America | 2012

Twelve-month-old infants recognize that speech can communicate unobservable intentions

Athena Vouloumanos; Kristine H. Onishi; Amanda Pogue

Much of our knowledge is acquired not from direct experience but through the speech of others. Speech allows rapid and efficient transfer of information that is otherwise not directly observable. Do infants recognize that speech, even if unfamiliar, can communicate about an important aspect of the world that cannot be directly observed: a person’s intentions? Twelve-month-olds saw a person (the Communicator) attempt but fail to achieve a target action (stacking a ring on a funnel). The Communicator subsequently directed either speech or a nonspeech vocalization to another person (the Recipient) who had not observed the attempts. The Recipient either successfully stacked the ring (Intended outcome), attempted but failed to stack the ring (Observable outcome), or performed a different stacking action (Related outcome). Infants recognized that speech could communicate about unobservable intentions, looking longer at Observable and Related outcomes than the Intended outcome when the Communicator used speech. However, when the Communicator used nonspeech, infants looked equally at the three outcomes. Thus, for 12-month-olds, speech can transfer information about unobservable aspects of the world such as internal mental states, which provides preverbal infants with a tool for acquiring information beyond their immediate experience.


Schizophrenia Research | 2006

Do you hear what I hear? Neural correlates of thought disorder during listening to speech in schizophrenia

Sara Weinstein; Janet F. Werker; Athena Vouloumanos; Todd S. Woodward; Elton T.C. Ngan

Thought disorder is a fundamental symptom of schizophrenia, observable as irregularities in speech. It has been associated with functional and structural abnormalities in brain regions involved in language processing, including left temporal regions, during language production tasks. We were interested in the neural correlates of thought disorder during receptive language processing, as this function is relatively preserved despite relying on the same brain regions as expressive language. Twelve patients with schizophrenia and 11 controls listened to 30-s speech samples while undergoing fMRI scanning. Thought disorder and global symptom ratings were obtained for each patient. Thought disorder but not global symptomatology correlated positively with the BOLD response in the left posterior superior temporal lobe while listening to comprehensible speech (cluster-level corrected p=.023). The pattern of brain activity associated with thought disorder during listening to comprehensible speech differs from that seen during language generation tasks, where a reduction of the leftward laterality of language has often been observed. As receptive language is spared in thought disorder, we propose that the increase in activation reflects compensatory processing allowing for normal performance.


Developmental Science | 2014

Do 6-month-olds understand that speech can communicate?

Athena Vouloumanos; Alia Martin; Kristine H. Onishi

Adults and 12-month-old infants recognize that even unfamiliar speech can communicate information between third parties, suggesting that they can separate the communicative function of speech from its lexical content. But do infants recognize that speech can communicate due to their experience understanding and producing language, or do they appreciate that speech is communicative earlier, with little such experience? We examined whether 6-month-olds recognize that speech can communicate information about an object. Infants watched a Communicator selectively grasp one of two objects (target). During test, the Communicator could no longer reach the objects; she turned to a Recipient and produced speech (a nonsense word) or non-speech (coughing). Infants looked longer when the Recipient selected the non-target than the target object when the Communicator spoke but not when she coughed, unless the Recipient had previously witnessed the Communicator's selective grasping of the target object. Our results suggest that at 6 months, with a receptive vocabulary of no more than a handful of commonly used words, infants possess some abstract understanding of the communicative function of speech. This understanding may provide an early mechanism for language and knowledge acquisition.


Developmental Science | 2014

Neural Specialization for Speech in the First Months of Life.

Sarah Shultz; Athena Vouloumanos; Randi H. Bennett; Kevin A. Pelphrey

How does the brain’s response to speech change over the first months of life? Although behavioral findings indicate that neonates’ listening biases are sharpened over the first months of life, with a species-specific preference for speech emerging by 3 months, the neural substrates underlying this developmental change are unknown. We examined neural responses to speech compared with biological non-speech sounds in 1- to 4-month-old infants using fMRI. Infants heard speech and biological non-speech sounds, including heterospecific vocalizations and human non-speech. We observed a left-lateralized response in temporal cortex for speech compared to biological non-speech sounds, indicating that this region is highly selective for speech by the first month of life. Moreover, this brain region becomes increasingly selective for speech over the next 3 months as neural substrates become less responsive to non-speech sounds. These results reveal specific changes in neural responses during a developmental period characterized by rapid behavioral changes.

Collaboration


Dive into Athena Vouloumanos's collaborations.

Top Co-Authors

Janet F. Werker
University of British Columbia

Kevin A. Pelphrey
George Washington University

Whitney M. Weikum
University of British Columbia