Eugenio Parise
Lancaster University
Publications
Featured research published by Eugenio Parise.
Psychological Science | 2012
Eugenio Parise; Gergely Csibra
Early word learning in infants relies on statistical, prosodic, and social cues that support speech segmentation and the attachment of meaning to words. It is debated whether such early word knowledge represents mere associations between sound patterns and visual object features, or reflects referential understanding of words. By measuring an event-related brain potential component known as the N400, we demonstrated that 9-month-old infants can detect the mismatch between an object appearing from behind an occluder and a preceding label with which their mother introduces it. Differential N400 amplitudes have been shown to reflect semantic priming in adults, and the absence of this effect in infants has been interpreted as a sign of associative word learning. By setting up a live communicative situation for referring to objects, we demonstrated that a similar priming effect also occurs in young infants. This finding may indicate that word meaning is referential from the outset of word learning and that referential expectation drives, rather than results from, vocabulary acquisition in humans.
Child Development | 2009
Stefanie Hoehl; Vincent Reid; Eugenio Parise; Andrea Handl; Letizia Palumbo; Tricia Striano
The importance of eye gaze as a means of communication is indisputable. However, there is debate about whether a dedicated neural module functions as an eye gaze detector, and about when infants become able to use eye gaze cues in a referential way. The application of neuroscience methodologies to developmental psychology has provided new insights into early social cognitive development. This review integrates findings on the development of eye gaze processing with research on the neural mechanisms underlying infant and adult social cognition. This research shows how a cognitive neuroscience approach can improve our understanding of social development and autism spectrum disorder.
Developmental Psychology | 2009
Vincent Reid; Stefanie Hoehl; Maren Grigutsch; Anna Groendahl; Eugenio Parise; Tricia Striano
The sequential nature of action ensures that an individual can anticipate the conclusion of an observed action via the use of semantic rules. The semantic processing of language and action has been linked to the N400 component of the event-related potential (ERP). The authors developed an ERP paradigm in which infants and adults observed simple sequences of actions. In one condition the conclusion of the sequence was anticipated, whereas in the other condition it was not. Adults and infants at 9 and 7 months of age were assessed with the same neural measures: the N400 component and analysis of the theta frequency band. Results indicated that adults and 9-month-old infants produced N400-like responses when anticipating action conclusions. The 7-month-old infants displayed no N400 component. Analysis of the theta frequency band provided support for the relation between the N400 and semantic processing. This study suggests that 9-month-old infants anticipate goals and use cognitive mechanisms similar to those of adults in this task. In addition, this result suggests that language processing may derive from understanding action in early development.
Frontiers in Human Neuroscience | 2010
Tobias Grossmann; Eugenio Parise; Angela D. Friederici
A precondition for successful communication between people is the detection of signals indicating the intention to communicate, such as eye contact or calling a person's name. In adults, establishing communication by eye contact or calling a person's name results in overlapping activity in right prefrontal cortex, suggesting that, regardless of modality, the intention to communicate is detected by the same brain region. We measured prefrontal cortex responses in 5-month-olds using near-infrared spectroscopy (NIRS) to examine the neural basis of detecting communicative signals across modalities in early development. Infants watched human faces that either signaled eye contact or directed their gaze away from the infant, and they also listened to voices that addressed them with their own name or another name. The results revealed that infants recruit adjacent but non-overlapping regions in the left dorsal prefrontal cortex when they process eye contact and their own name. Moreover, a correlation analysis revealed that infants who responded sensitively to eye contact in one prefrontal region were also more likely to respond sensitively to their own name in the adjacent prefrontal region, suggesting that responding to communicative signals in these two regions might be functionally related. These NIRS results suggest that infants selectively process and attend to communicative signals directed at them. However, unlike adults, infants do not seem to recruit a common prefrontal region when processing communicative signals of different modalities. The implications of these findings for our understanding of infants’ developing communicative abilities are discussed.
Social Neuroscience | 2008
Eugenio Parise; Vincent M. Reid; Manuela Stets; Tricia Striano
Do 5-month-old infants show differences in processing objects as a function of a prior interaction with an adult? Using a live ERP paradigm, we addressed this question in a within-subjects design. Infants saw objects during two pretest phases with an adult experimenter. We recorded event-related potentials to the presentation of objects following the interactive pretest phases. Experimental conditions differed only in the nature of eye contact between the infant and the experimenter during the pretests. In one condition the experimenter engaged the infant with direct eye contact. In a second condition the experimenter looked only at the infant's chest. We found that the negative component, related to attentional processes, showed differences between experimental conditions in left fronto-central locations. These data show that 5-month-old infants allocate more attention to objects that have previously been seen during direct eye-contact interaction. In addition, these results clarify the functional nature of the negative component.
PLOS ONE | 2010
Eugenio Parise; Angela D. Friederici; Tricia Striano
An infant's own name is a unique social cue. Infants are sensitive to their own name by 4 months of age, but whether they use their name as a social cue is unknown. The electroencephalogram (EEG) was recorded as infants heard their own name or strangers' names and while they looked at novel objects. Event-related brain potentials (ERPs) in response to names revealed that infants differentiate their own name from stranger names from the first phoneme. The amplitude of the ERPs to objects indicated that infants attended more to objects after hearing their own name than after hearing another name. Thus, by 5 months of age infants not only detect their name, but also use it as a social cue to guide their attention to events and objects in the world.
Brain & Development | 2010
Daniel Stahl; Eugenio Parise; Stefanie Hoehl; Tricia Striano
Event-related potential (ERP) studies with infants are often limited by a small number of measurements. We introduce a weighted general linear mixed model analysis with a time-varying covariate, which allows for the efficient analysis of all available event-related potential data from infants. This method makes it possible to control for the effect of small and varying trial numbers on the signal-to-noise ratio of averaged ERP estimates. It enables the analysis of infant ERP data sets that could often not be analyzed otherwise. We illustrate the method by analyzing an experimental study and discuss its advantages over currently used methods as well as its potential limitations. In this study, 6-month-old infants saw a face showing a neutral or an angry expression in combination with direct or averted eye gaze. We examined how the infant brain processes facial expressions and whether the direction of eye gaze has an influence on this processing. We focused on the infant Negative Central ERP component (Nc). The neutral expression elicited a larger Nc amplitude and an earlier peak than the angry expression. An interaction between emotion and gaze was found for Nc latency, suggesting that emotions are processed in combination with eye gaze in infancy.
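The sketch below is not the authors' published analysis code; it only illustrates, with invented numbers and assumed column names, the general idea of weighting averaged infant ERP amplitudes by the number of trials behind each average, using Python's statsmodels (weighted least squares, plus an unweighted mixed model as a rough analogue of the weighted general linear mixed model described above).

```python
# Illustrative sketch only: all subjects, trial counts, and amplitudes below
# are invented, and the model is a simplified stand-in for the published
# weighted general linear mixed model analysis.
import pandas as pd
import statsmodels.formula.api as smf

# One row per infant x condition: the averaged Nc amplitude (microvolts) and
# the number of artifact-free trials contributing to that average.
df = pd.DataFrame({
    "subject":   ["s1", "s1", "s2", "s2", "s3", "s3", "s4", "s4", "s5", "s5"],
    "emotion":   ["neutral", "angry"] * 5,
    "n_trials":  [12, 7, 20, 15, 9, 11, 14, 6, 18, 10],
    "amplitude": [-8.1, -10.4, -7.3, -9.8, -9.0, -11.2, -8.6, -10.9, -7.9, -10.1],
})

# Weighted least squares: averages built from more trials have a better
# signal-to-noise ratio, so they receive proportionally larger weights.
wls_fit = smf.wls("amplitude ~ emotion", data=df, weights=df["n_trials"]).fit()
print(wls_fit.summary())

# The published approach additionally models subjects as random effects with a
# time-varying covariate; an unweighted mixed-model analogue for comparison:
mixed_fit = smf.mixedlm("amplitude ~ emotion", data=df, groups=df["subject"]).fit()
print(mixed_fit.summary())
```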
Social Neuroscience | 2014
Stefanie Hoehl; Christine Michel; Vincent M. Reid; Eugenio Parise; Tricia Striano
We examined infants’ oscillatory brain activity during a live interaction with an adult who showed them novel objects. Activation in the alpha frequency range was assessed. Nine-month-old infants responded with desynchronization of alpha-band activity when looking at an object together with an adult during a social interaction involving eye contact. When infant and experimenter only looked at the object without engaging in eye contact, no such effect was observed. Results are interpreted in terms of activation of a generic semantic knowledge system induced by eye contact during a social interaction.
Child Development | 2011
Eugenio Parise; Andrea Handl; Letizia Palumbo; Angela D. Friederici
Eye gaze is an important communicative signal, both as mutual eye contact and as referential gaze to objects. To examine whether attention to speech versus nonspeech stimuli in 4- to 5-month-olds (n=15) varies as a function of eye gaze, event-related brain potentials were used. Faces with mutual or averted gaze were presented in combination with forward- or backward-spoken words. Infants rapidly processed gaze and spoken words in combination. A late Slow Wave suggests an interaction of the two factors, separating the combination of backward-spoken words and direct gaze from all other conditions. An additional experiment (n=15) extended the results to referential gaze. The current findings suggest that interactions between visual and auditory cues are present early in infancy.
PLOS ONE | 2015
Estefanía Domínguez-Martínez; Eugenio Parise; Tommy Strandvall; Vincent M. Reid
In a typical visual Event-Related Potential (ERP) study, the stimulus is presented centrally on the screen. Normally an ERP response will be measured provided that the participant directs their gaze towards the stimulus. The aim of this study was to assess how the N400 component of an ERP was affected when the stimulus was presented in the foveal, parafoveal or peripheral vision of the participant’s visual field. Utilizing stimuli that have previously produced an N400 response to action incongruities, we presented the same stimulus sequences at 0°, 4°, 8° and 12° of visual angle from a fixation location. In addition to the EEG data, eye tracking data were recorded to act as a fixation control method and to allow for eye artifact detection. The results show a significant N400 effect in the right parieto-temporal electrodes in the 0° visual angle condition. In the other conditions, the N400 effect was reduced (4°) or absent (8° and 12°). Our results suggest that the disappearance of the N400 effect with eccentricity is due to the distance of the stimulus from fixation. However, variables such as attentional allocation could also have had an impact on the results. This study highlights the importance of presenting a stimulus within the foveal vision of the participant in order to maximize ERP effects related to higher order cognitive processes.
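For orientation, eccentricities in degrees of visual angle translate to on-screen offsets only via the viewing distance, which the abstract does not report. Assuming a purely hypothetical viewing distance of \(d = 60\) cm, the offset is \(x = d\tan\theta\), i.e. roughly \(4.2\) cm at \(4^\circ\), \(8.4\) cm at \(8^\circ\), and \(12.8\) cm at \(12^\circ\).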