
Publication


Featured research published by Peter Hagoort.


Trends in Cognitive Sciences | 2005

On Broca, brain, and binding: a new framework.

Peter Hagoort

In speaking and comprehending language, word information is retrieved from memory and combined into larger units (unification). Unification operations take place in parallel at the semantic, syntactic and phonological levels of processing. This article proposes a new framework that connects psycholinguistic models to a neurobiological account of language. According to this proposal the left inferior frontal gyrus (LIFG) plays an important role in unification. Research in other domains of cognition indicates that left prefrontal cortex has the necessary neurobiological characteristics for its involvement in unification for language. I offer here a psycholinguistic perspective on the nature of language unification and the role of LIFG.


Language and Cognitive Processes | 1993

The syntactic positive shift (SPS) as an ERP measure of syntactic processing

Peter Hagoort; Colin M. Brown; Jolanda Groothusen

This paper presents event-related brain potential (ERP) data from an experiment on syntactic processing. Subjects read individual sentences containing one of three different kinds of violations of the syntactic constraints of Dutch. The ERP results provide evidence for an electrophysiological response to syntactic processing that is qualitatively different from established ERP responses to semantic processing. We refer to this electrophysiological manifestation of parsing as the Syntactic Positive Shift (SPS). The SPS was observed in an experiment in which no task demands, other than to read the input, were imposed on the subjects. The pattern of responses to the different kinds of syntactic violations suggests that the SPS indicates the impossibility for the parser to assign the preferred structure to an incoming string of words, irrespective of the specific syntactic nature of this preferred structure. The implications of these findings for further research on parsing are discussed.


Journal of Cognitive Neuroscience | 1993

The processing nature of the n400: Evidence from masked priming

Colin M. Brown; Peter Hagoort

The N400 is an endogenous event-related brain potential (ERP) that is sensitive to semantic processes during language comprehension. The general question we address in this paper is which aspects of the comprehension process are manifest in the N400. The focus is on the sensitivity of the N400 to the automatic process of lexical access, or to the controlled process of lexical integration. The former process is the reflex-like and effortless behavior of computing a form representation of the linguistic signal, and of mapping this representation onto corresponding entries in the mental lexicon. The latter process concerns the integration of a spoken or written word into a higher-order meaning representation of the context within which it occurs. ERPs and reaction times (RTs) were acquired to target words preceded by semantically related and unrelated prime words. The semantic relationship between a prime and its target has been shown to modulate the amplitude of the N400 to the target. This modulation can arise from lexical access processes, reflecting the automatic spread of activation between words related in meaning in the mental lexicon. Alternatively, the N400 effect can arise from lexical integration processes, reflecting the relative ease of meaning integration between the prime and the target. To assess the impact of automatic lexical access processes on the N400, we compared the effect of masked and unmasked presentations of a prime on the N400 to a following target. Masking prevents perceptual identification, and as such it is claimed to rule out effects from controlled processes. It therefore enables a stringent test of the possible impact of automatic lexical access processes on the N400. The RT study showed a significant semantic priming effect under both unmasked and masked presentations of the prime. The result for masked priming reflects the effect of automatic spreading of activation during the lexical access process. 
The ERP study showed a significant N400 effect for the unmasked presentation condition, but no such effect for the masked presentation condition. This indicates that the N400 is not a manifestation of lexical access processes, but reflects aspects of semantic integration processes.


Journal of Cognitive Neuroscience | 1999

Semantic Integration in Sentences and Discourse: Evidence from the N400

Jos J. A. Van Berkum; Peter Hagoort; Colin M. Brown

In two ERP experiments we investigated how and when the language comprehension system relates an incoming word to semantic representations of an unfolding local sentence and a wider discourse. In Experiment 1, subjects were presented with short stories. The last sentence of these stories occasionally contained a critical word that, although acceptable in the local sentence context, was semantically anomalous with respect to the wider discourse (e.g., Jane told the brother that he was exceptionally slow in a discourse context where he had in fact been very quick). Relative to coherent control words (e.g., quick), these discourse-dependent semantic anomalies elicited a large N400 effect that began at about 200 to 250 msec after word onset. In Experiment 2, the same sentences were presented without their original story context. Although the words that had previously been anomalous in discourse still elicited a slightly larger average N400 than the coherent words, the resulting N400 effect was much reduced, showing that the large effect observed in stories depended on the wider discourse. In the same experiment, single sentences that contained a clear local semantic anomaly elicited a standard sentence-dependent N400 effect (e.g., Kutas & Hillyard, 1980). The N400 effects elicited in discourse and in single sentences had the same time course, overall morphology, and scalp distribution. We argue that these findings are most compatible with models of language processing in which there is no fundamental distinction between the integration of a word in its local (sentence-level) and its global (discourse-level) semantic context.


Brain and Language | 2007

Neural evidence for the interplay between language, gesture, and action: A review

Roel M. Willems; Peter Hagoort

Co-speech gestures embody a form of manual action that is tightly coupled to the language system. As such, the co-occurrence of speech and co-speech gestures is an excellent example of the interplay between language and action. There are, however, other ways in which language and action can be thought of as closely related. In this paper we give an overview of studies in cognitive neuroscience that examine the neural underpinnings of links between language and action. Topics include neurocognitive studies of motor representations of speech sounds, action-related language, sign language and co-speech gestures. It is concluded that there is strong evidence for the interaction between speech and gestures in the brain. This interaction, however, shares general properties with other domains in which there is interplay between language and action.


Journal of Cognitive Neuroscience | 1999

The Neural Circuitry Involved in the Reading of German Words and Pseudowords: A PET Study

Peter Hagoort; Peter Indefrey; Colin M. Brown; Hans Herzog; Helmuth Steinmetz; Rüdiger J. Seitz

Silent reading and reading aloud of German words and pseudowords were used in a PET study using [15O]butanol to examine the neural correlates of reading and of the phonological conversion of legal letter strings, with or without meaning. The results of 11 healthy, right-handed volunteers in the age range of 25 to 30 years showed activation of the lingual gyri during silent reading in comparison with viewing a fixation cross. Comparisons between the reading of words and pseudowords suggest the involvement of the middle temporal gyri in retrieving both the phonological and semantic code for words. The reading of pseudowords activates the left inferior frontal gyrus, including the ventral part of Broca's area, to a larger extent than the reading of words. This suggests that this area might be involved in the sublexical conversion of orthographic input strings into phonological output codes. (Pre)motor areas were found to be activated during both silent reading and reading aloud. On the basis of the obtained activation patterns, it is hypothesized that the articulation of high-frequency syllables requires the retrieval of their concomitant articulatory gestures from the SMA and that the articulation of low-frequency syllables recruits the left medial premotor cortex.


Neuron | 2011

Neuronal Dynamics Underlying High- and Low-Frequency EEG Oscillations Contribute Independently to the Human BOLD Signal

René Scheeringa; Pascal Fries; Karl Magnus Petersson; Robert Oostenveld; Iris Grothe; David G. Norris; Peter Hagoort; Marcel C. M. Bastiaansen

Work on animals indicates that BOLD is preferentially sensitive to local field potentials, and that it correlates most strongly with gamma band neuronal synchronization. Here we investigate how the BOLD signal in humans performing a cognitive task is related to neuronal synchronization across different frequency bands. We simultaneously recorded EEG and BOLD while subjects engaged in a visual attention task known to induce sustained changes in neuronal synchronization across a wide range of frequencies. Trial-by-trial BOLD fluctuations correlated positively with trial-by-trial fluctuations in high-frequency EEG gamma power (60-80 Hz) and negatively with alpha and beta power. Gamma power on the one hand, and alpha and beta power on the other hand, independently contributed to explaining BOLD variance. These results indicate that the BOLD-gamma coupling observed in animals can be extrapolated to humans performing a task, and that the neuronal dynamics underlying high- and low-frequency synchronization contribute independently to the BOLD signal.


NeuroImage | 2003

How the brain solves the binding problem for language: A neurocomputational model of syntactic processing

Peter Hagoort

Syntax is one of the components in the architecture of language processing that allows the listener/reader to bind single-word information into a unified interpretation of multiword utterances. This paper discusses ERP effects that have been observed in relation to syntactic processing. The fact that these effects differ from the semantic N400 indicates that the brain honors the distinction between semantic and syntactic binding operations. Two models of syntactic processing attempt to account for syntax-related ERP effects. One type of model is serial, with a first phase that is purely syntactic in nature (syntax-first model). The other type of model is parallel and assumes that information immediately guides the interpretation process once it becomes available. This is referred to as the immediacy model. ERP evidence is presented in support of the latter model. Next, an explicit computational model is proposed to explain the ERP data. This Unification Model assumes that syntactic frames are stored in memory and retrieved on the basis of the spoken or written word form input. The syntactic frames associated with the individual lexical items are unified by a dynamic binding process into a structural representation that spans the whole utterance. On the basis of a meta-analysis of imaging studies on syntax, it is argued that the left posterior inferior frontal cortex is involved in binding syntactic frames together, whereas the left superior temporal cortex is involved in retrieval of the syntactic frames stored in memory. Lesion data that support the involvement of this left frontotemporal network in syntactic processing are discussed.


Journal of Cognitive Neuroscience | 2008

The neural integration of speaker and message

Jos J. A. Van Berkum; Daniëlle Van den Brink; Cathelijne M. J. Y. Tesink; Miriam Kos; Peter Hagoort

When do listeners take into account who the speaker is? We asked people to listen to utterances whose content sometimes did not match inferences based on the identity of the speaker (e.g., "If only I looked like Britney Spears" in a male voice, or "I have a large tattoo on my back" spoken with an upper-class accent). Event-related brain responses revealed that the speaker's identity is taken into account as early as 200-300 msec after the beginning of a spoken word, and is processed by the same early interpretation mechanism that constructs sentence meaning based on just the words. This finding is difficult to reconcile with standard Gricean models of sentence interpretation in which comprehenders initially compute a local, context-independent meaning for the sentence (semantics) before working out what it really means given the wider communicative context and the particular speaker (pragmatics). Because the observed brain response hinges on voice-based and usually stereotype-dependent inferences about the speaker, it also shows that listeners rapidly classify speakers on the basis of their voices and bring the associated social stereotypes to bear on what is being said. According to our event-related potential results, language comprehension takes very rapid account of the social context, and the construction of meaning based on language alone cannot be separated from the social aspects of language use. The linguistic brain relates the message to the speaker immediately.


Journal of Cognitive Neuroscience | 2001

Electrophysiological Evidence for Early Contextual Influences during Spoken-Word Recognition: N200 Versus N400 Effects

Daniëlle Van den Brink; Colin M. Brown; Peter Hagoort

An event-related brain potential experiment was carried out to investigate the time course of contextual influences on spoken-word recognition. Subjects were presented with spoken sentences that ended with a word that was either (a) congruent, (b) semantically anomalous, but beginning with the same initial phonemes as the congruent completion, or (c) semantically anomalous beginning with phonemes that differed from the congruent completion. In addition to finding an N400 effect in the two semantically anomalous conditions, we obtained an early negative effect in the semantically anomalous condition where word onset differed from that of the congruent completions. It was concluded that the N200 effect is related to the lexical selection process, where word-form information resulting from an initial phonological analysis and content information derived from the context interact.
