Bastien Boutonnet
Bangor University
Publications
Featured research published by Bastien Boutonnet.
Brain Research | 2012
Bastien Boutonnet; Panos Athanasopoulos; Guillaume Thierry
Does language modulate perception and categorisation of everyday objects? Here, we approach this question from the perspective of grammatical gender in bilinguals. We tested Spanish-English bilinguals and control native speakers of English in a semantic categorisation task on triplets of pictures in an all-in-English context while measuring event-related brain potentials (ERPs). Participants were asked to press a button when the third picture of a triplet belonged to the same semantic category as the first two, and another button when it belonged to a different category. Unbeknownst to them, in half of the trials the Spanish name of the third picture had the same grammatical gender as those of the first two, and the opposite gender in the other half. We found no behavioural priming effect of either semantic relatedness or gender consistency. In contrast, ERPs revealed not only the expected semantic priming effect in both groups, but also a negative modulation by gender inconsistency in Spanish-English bilinguals exclusively. These results provide evidence for spontaneous and unconscious access to grammatical gender in participants functioning in a context requiring no access to such information, thereby supporting linguistic relativity effects in the grammatical domain.
Journal of Cognitive Neuroscience | 2013
Bastien Boutonnet; Benjamin Dering; Nestor Viñas-Guasch; Guillaume Thierry
Recent streams of research support the Whorfian hypothesis, according to which language affects one's perception of the world. However, studies of object categorization across languages have relied heavily on behavioral measures that are fuzzy and inconsistent. Here, we provide the first electrophysiological evidence for unconscious effects of language terminology on object perception. Whereas English has two words for cup and mug, Spanish labels both objects with the single word “taza.” We tested native speakers of Spanish and English in an object detection task using a visual oddball paradigm, while measuring event-related brain potentials. The early deviant-related negativity elicited by deviant stimuli was greater in English than in Spanish participants. This effect, which relates to the existence of two labels in English versus one in Spanish, substantiates the neurophysiological evidence that language-specific terminology affects object categorization.
Cognitive, Affective, & Behavioral Neuroscience | 2016
Rafał Jończyk; Bastien Boutonnet; Kamil Musiał; Katie Hoemann; Guillaume Thierry
Neurobilingualism research has failed to reveal significant language differences in the processing of affective content. However, the evidence to date derives mostly from studies in which affective stimuli are presented out of context, which is unnatural and fails to capture the complexity of everyday sentence-based communication. Here we investigated semantic integration of affectively salient stimuli in sentential context in the first (L1) and second language (L2) of late fluent Polish–English bilinguals living in the UK. Nineteen participants indicated whether Polish and English sentences ending with a semantically and affectively congruent or incongruent adjective of controlled affective valence made sense, while undergoing behavioral and electrophysiological recordings. We focused on the N400, a wave of event-related potentials known to index semantic integration. We expected N400 amplitude to index increased processing demands in L2 English comprehension, and potential language–valence interactions to reveal differences in affective processing between languages. Contrary to our initial expectation, we found increased N400 for sentences in L1 Polish, possibly driven by greater affective salience of sentences in the native language. Critically, language interacted with affective valence, such that N400 amplitudes were reduced for English sentences ending in a negative fashion as compared to all other conditions. We interpreted this as a sign that bilinguals suppress negatively valenced L2 content embedded in naturalistic L2 sentences, thus extending the findings of previous research on single words in clinical and linguistic research.
bioRxiv | 2016
Jason Samaha; Bastien Boutonnet; Gary Lupyan
Perceptual experience results from a complex interplay of bottom-up input and prior knowledge about the world, yet the extent to which knowledge affects perception, the neural mechanisms underlying these effects, and the stages of processing at which these two sources of information converge are still unclear. In a series of experiments we show that verbal cues not only help recognition of ambiguous “Mooney” images, but also improve accuracy and RTs in a same/different discrimination task. We then used electroencephalography (EEG) to better understand the mechanisms of this effect. The improved discrimination of images previously made meaningful was accompanied by a larger occipital-parietal P1 evoked response to the meaningful versus meaningless target stimuli. Time-frequency analysis of the interval between the two stimuli (just prior to the target stimulus) revealed increases in the power of posterior alpha-band (8–14 Hz) oscillations when the meaning of the stimuli to be compared was trained. The magnitude of the prestimulus alpha difference and the P1 amplitude difference were positively correlated across individuals. These results suggest that prior knowledge prepares the brain for upcoming perception via the modulation of prestimulus alpha-band oscillations, and that this preparatory state influences early (~120 ms) stages of subsequent visual processing.
Significance Statement: What we see is affected by what we know, but what kind of knowledge affects our perception, and at what stages of perceptual processing do such effects occur? We show that verbal hints vastly improve people’s ability to recognize ambiguous images and improve objective performance on a visual discrimination task. Using electrophysiology (EEG), we then show that knowing in advance the meaning of an ambiguous image increases alpha-band oscillations prior to image onset, and that visual-evoked potentials show rapid enhancement ~120 ms following image onset. These results suggest that alpha is involved in bringing prior knowledge to bear on the interpretation of sensory stimuli, demonstrating that perception is constructed from both sensory input and prior knowledge about the world.
Frontiers in Psychology | 2014
Bastien Boutonnet; Rhonda McClain; Guillaume Thierry
Linguistic relativity theory has received empirical support in domains such as color perception and object categorization. It is unknown, however, whether relations between words idiosyncratic to language impact non-verbal representations and conceptualizations. For instance, would one consider the concepts of horse and sea as related were it not for the existence of the compound seahorse? Here, we investigated such arbitrary conceptual relationships using a non-linguistic picture relatedness task in participants undergoing event-related brain potential recordings. Picture pairs arbitrarily related because of a compound and presented in the compound order elicited N400 amplitudes similar to unrelated pairs. Surprisingly, however, pictures presented in the reverse order (as in the sequence horse–sea) reduced N400 amplitudes significantly, demonstrating the existence of a link in memory between these two concepts otherwise unrelated. These results break new ground in the domain of linguistic relativity by revealing predicted semantic associations driven by lexical relations intrinsic to language.
Scientific Reports | 2018
Jason Samaha; Bastien Boutonnet; Bradley R. Postle; Gary Lupyan
Perceptual experience results from a complex interplay of bottom-up input and prior knowledge about the world, yet the extent to which knowledge affects perception, the neural mechanisms underlying these effects, and the stages of processing at which these two sources of information converge, are still unclear. In several experiments we show that language, in the form of verbal labels, both aids recognition of ambiguous “Mooney” images and improves objective visual discrimination performance in a match/non-match task. We then used electroencephalography (EEG) to better understand the mechanisms of this effect. The improved discrimination of images previously labeled was accompanied by a larger occipital-parietal P1 evoked response to the meaningful versus meaningless target stimuli. Time-frequency analysis of the interval between the cue and the target stimulus revealed increases in the power of posterior alpha-band (8–14 Hz) oscillations when the meaning of the stimuli to be compared was trained. The magnitude of the pre-target alpha difference and the P1 amplitude difference were positively correlated across individuals. These results suggest that prior knowledge prepares the brain for upcoming perception via the modulation of alpha-band oscillations, and that this preparatory state influences early (~120 ms) stages of visual processing.
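The alpha-band power measure at the heart of this study can be illustrated with a minimal sketch. This is not the authors' actual analysis pipeline (which used time-frequency decomposition of multi-channel EEG); it is a simplified single-channel example, using SciPy's Welch method, of how mean power in the 8–14 Hz band might be estimated from a signal. The sampling rate and synthetic trials below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def alpha_band_power(signal, sfreq, band=(8.0, 14.0)):
    """Estimate mean spectral power in the alpha band (8-14 Hz) via Welch's method."""
    # nperseg of one second gives 1 Hz frequency resolution
    freqs, psd = welch(signal, fs=sfreq, nperseg=min(len(signal), int(sfreq)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Synthetic example: a 10 Hz oscillation embedded in noise vs. noise alone
rng = np.random.default_rng(0)
sfreq = 250.0                                  # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / sfreq)             # two seconds of signal
alpha_trial = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
noise_trial = 0.5 * rng.standard_normal(t.size)

print(alpha_band_power(alpha_trial, sfreq) > alpha_band_power(noise_trial, sfreq))  # True
```

In a full analysis one would compute this per trial and per posterior channel in the cue-target interval, then compare labeled versus unlabeled conditions.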
Journal of Memory and Language | 2013
Clara D. Martin; Guillaume Thierry; Jan Rouke Kuipers; Bastien Boutonnet; Alice Foucart; Albert Costa
Archive | 2016
Panos Athanasopoulos; Bastien Boutonnet
Archive | 2016
Gary Lupyan; Jason Samaha; Bastien Boutonnet
Archive | 2015
Rafał Jończyk; Bastien Boutonnet