
Publication


Featured research published by Pascal Belin.


Trends in Cognitive Sciences | 2002

Structure and function of auditory cortex: music and speech

Robert J. Zatorre; Pascal Belin; Virginia B. Penhune

We examine the evidence that speech and musical sounds exploit different acoustic cues: speech is highly dependent on rapidly changing broadband sounds, whereas tonal patterns tend to be slower, although small and precise changes in frequency are important. We argue that the auditory cortices in the two hemispheres are relatively specialized, such that temporal resolution is better in left auditory cortical areas and spectral resolution is better in right auditory cortical areas. We propose that cortical asymmetries might have developed as a general solution to the need to optimize processing of the acoustic environment in both temporal and frequency domains.


Nature Neuroscience | 2004

Abnormal cortical voice processing in autism

Hélène Gervais; Pascal Belin; Nathalie Boddaert; Marion Leboyer; Arnaud Coez; Ignacio Sfaello; Catherine Barthélémy; Francis Brunelle; Yves Samson; Monica Zilbovicius

Impairments in social interaction are a key feature of autism and are associated with atypical social information processing. Here we report functional magnetic resonance imaging (fMRI) results showing that individuals with autism failed to activate superior temporal sulcus (STS) voice-selective regions in response to vocal sounds, whereas they showed a normal activation pattern in response to nonvocal sounds. These findings suggest abnormal cortical processing of socially relevant auditory information in autism.


Neurology | 1996

Recovery from nonfluent aphasia after melodic intonation therapy: A PET study

Pascal Belin; Monica Zilbovicius; Ph. Remy; C. Francois; S. Guillaume; F. Chain; G. Rancurel; Yves Samson

We examined mechanisms of recovery from aphasia in seven nonfluent aphasic patients, who were successfully treated with melodic intonation therapy (MIT) after a lengthy absence of spontaneous recovery. We measured changes in relative cerebral blood flow (CBF) with positron emission tomography (PET) during hearing and repetition of simple words, and during repetition of MIT-loaded words. Without MIT, language tasks abnormally activated right hemisphere regions, homotopic to those activated in the normal subject, and deactivated left hemisphere language zones. In contrast, repeating words with MIT reactivated Broca's area and the left prefrontal cortex, while deactivating the counterpart of Wernicke's area in the right hemisphere. The recovery process induced by MIT in these patients probably coincides with this reactivation of left prefrontal structures. In contrast, the right hemisphere regions abnormally activated during simple language tasks seem to be associated with the initial persistence of the aphasia. This study supports the idea that abnormal activation patterns in the lesioned brain are not necessarily related to the recovery process. Neurology 1996;47:1504-1511.


Journal of Cognitive Neuroscience | 1998

Lateralization of Speech and Auditory Temporal Processing

Pascal Belin; Monica Zilbovicius; Sophie Crozier; Lionel Thivard; Anne Fontaine; Marie-Cécile Masure; Yves Samson

To investigate the role of temporal processing in language lateralization, we monitored asymmetry of cerebral activation in human volunteers using positron emission tomography (PET). Subjects were scanned during passive auditory stimulation with nonverbal sounds containing rapid (40 msec) or extended (200 msec) frequency transitions. Bilateral symmetric activation was observed in the auditory cortex for slow frequency transitions. In contrast, left-biased asymmetry was observed in response to rapid frequency transitions due to reduced response of the right auditory cortex. These results provide direct evidence that auditory processing of rapid acoustic transitions is lateralized in the human brain. Such functional asymmetry in temporal processing is likely to contribute to language lateralization from the lowest levels of cortical processing.


Nature Neuroscience | 2002

Where is 'where' in the human auditory cortex?

Robert J. Zatorre; Marc Bouffard; Pierre Ahad; Pascal Belin

We examine the functional characteristics of auditory cortical areas that are sensitive to spatial cues in the human brain, and determine whether they can be dissociated from parietal lobe mechanisms. Three positron emission tomography (PET) experiments were conducted using a speaker array permitting quasi free-field sound presentation within the scanner. Posterior auditory cortex responded to sounds that varied in their spatial distribution, but only when multiple complex stimuli were presented simultaneously, implicating this cortical system in disambiguation of overlapping auditory sources. We also found that the right inferior parietal cortex is specifically recruited in localization tasks, and that its activity predicts behavioral performance, consistent with its involvement in sensorimotor integration and spatial transformation. These findings clarify the functional roles of posterior auditory and parietal cortices, and help to reconcile competing models of auditory cortical organization.


NeuroImage | 1998

Event-Related fMRI of the Auditory Cortex

Pascal Belin; Robert J. Zatorre; Richard D. Hoge; Alan C. Evans; Bruce Pike

An event-related protocol was designed to permit auditory fMRI studies minimally affected by the echo-planar noise artifact; a long time interval (TR = 10 s) between each cerebral volume acquisition was combined with stroboscopic data acquisition, and event-related curves were reconstructed with a 1-s resolution. The cerebral hemodynamic-response time course to a target auditory stimulus was measured in five individual subjects using this method. Clear bell-shaped event-related responses were observed bilaterally in all individuals in primary auditory cortex (A1) as well as in laterally extending secondary cortical fields. Group-average event-related curves attained their maxima (0.5-0.7%) 3 s after stimulus onset in A1 (4 s for more anterior and lateral regions of auditory cortex), and signal had returned to near-baseline level 6 s after stimulus onset. The stroboscopic event-related method appeared effective in minimizing effects of the interaction between scanning noise and experimental auditory stimulation; it adds useful temporal information to the spatial resolution afforded by fMRI in studies of human auditory function, while allowing presentation of auditory stimuli on a silent background.
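The core idea of the stroboscopic protocol described above can be illustrated with a toy simulation: with a long TR, each acquired volume samples the hemodynamic response at only one post-stimulus delay, so jittering stimulus onset relative to acquisition across trials lets the full response curve be rebuilt at 1-s resolution. The sketch below is a minimal, hypothetical illustration of that bookkeeping (the response shape, noise level, and trial counts are invented, not the study's actual data or analysis code).

```python
import numpy as np

# Sketch of "stroboscopic" sparse sampling: TR = 10 s between volumes, and the
# stimulus-to-acquisition delay is varied across trials so that, pooled over
# trials, the response curve is sampled at 1-s resolution.
TR = 10                       # seconds between volume acquisitions
delays = np.arange(0, 10)     # post-stimulus delays sampled across trials (s)

def hrf(t):
    """Toy bell-shaped hemodynamic response peaking near 3 s (illustrative only)."""
    return np.where(t > 0, (t / 3.0) ** 3 * np.exp(3.0 - t), 0.0)

rng = np.random.default_rng(0)
n_trials_per_delay = 20
curve = np.zeros(len(delays))
for i, d in enumerate(delays):
    # Each trial contributes one noisy sample acquired d seconds after
    # stimulus onset; averaging over trials gives the point at that delay.
    samples = hrf(np.full(n_trials_per_delay, float(d))) \
              + rng.normal(0.0, 0.05, n_trials_per_delay)
    curve[i] = samples.mean()

peak_delay = delays[np.argmax(curve)]
print(peak_delay)  # delay (s) at which the toy reconstructed curve peaks
```

Because each volume is acquired long after the previous scanner noise burst has decayed, the sampled signal reflects the experimental stimulus rather than the noise-stimulus interaction, which is the point of the long-TR design.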


Behavior Research Methods | 2008

The Montreal Affective Voices: A validated set of nonverbal affect bursts for research on auditory affective processing

Pascal Belin; Sarah Fillion-Bilodeau; Frédéric Gosselin

The Montreal Affective Voices consist of 90 nonverbal affect bursts corresponding to the emotions of anger, disgust, fear, pain, sadness, surprise, happiness, and pleasure (plus a neutral expression), recorded by 10 different actors (5 of them male and 5 female). Ratings of valence, arousal, and intensity for eight emotions were collected for each vocalization from 30 participants. Analyses revealed high recognition accuracies for most of the emotional categories (mean of 68%). They also revealed significant effects of both the actors’ and the participants’ gender: The highest hit rates (75%) were obtained for female participants rating female vocalizations, and the lowest hit rates (60%) for male participants rating male vocalizations. Interestingly, the mixed situations— that is, male participants rating female vocalizations or female participants rating male vocalizations—yielded similar, intermediate ratings. The Montreal Affective Voices are available for download at vnl.psy.gla.ac.uk/ (Resources section).
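The gender breakdown of hit rates reported above amounts to cross-tabulating trials by (participant gender, actor gender) and computing the fraction recognized correctly in each cell. The sketch below shows that bookkeeping on invented toy records; the trial data are purely illustrative and do not reproduce the published numbers.

```python
from collections import defaultdict

# Each toy trial: (participant_gender, actor_gender, recognized_correctly).
# These records are invented for demonstration only.
trials = [
    ("F", "F", True), ("F", "F", True), ("F", "F", True), ("F", "F", False),
    ("M", "M", True), ("M", "M", False), ("M", "M", False), ("M", "M", True),
    ("F", "M", True), ("F", "M", True), ("F", "M", False),
    ("M", "F", True), ("M", "F", False), ("M", "F", True),
]

counts = defaultdict(lambda: [0, 0])  # (rater, actor) -> [hits, total]
for rater, actor, correct in trials:
    counts[(rater, actor)][0] += int(correct)
    counts[(rater, actor)][1] += 1

hit_rates = {cell: hits / total for cell, (hits, total) in counts.items()}
for (rater, actor), rate in sorted(hit_rates.items()):
    print(f"rater {rater} / actor {actor}: {rate:.0%}")
```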


PLOS Biology | 2013

Speech rhythms and multiplexed oscillatory sensory coding in the human brain

Joachim Gross; Nienke Hoogenboom; Gregor Thut; Philippe G. Schyns; Stefano Panzeri; Pascal Belin; Simon Garrod

A neuroimaging study reveals how coupled brain oscillations at different frequencies align with quasi-rhythmic features of continuous speech such as prosody, syllables, and phonemes.


NeuroImage | 2007

Amygdala responses to nonlinguistic emotional vocalizations

Shirley Fecteau; Pascal Belin; Yves Joanette; Jorge L. Armony

Whereas there is ample evidence for a role of the amygdala in the processing of visual emotional stimuli, particularly those with negative value, discrepant results have been reported regarding amygdala responses to emotional auditory stimuli. The present study used event-related functional magnetic resonance imaging to investigate cerebral activity underlying processing of emotional nonlinguistic vocalizations, with a particular focus on neural changes in the amygdala. Fourteen healthy volunteers were scanned while performing a gender identification task. Stimuli, previously validated on emotional valence, consisted of positive (happiness and sexual pleasure) and negative (sadness and fear) vocalizations, as well as emotionally neutral sounds (e.g., coughs). Results revealed bilateral amygdala activation in response to all emotional vocalizations when compared to neutral stimuli. These findings suggest that the generally accepted involvement of the amygdala in the perception of emotional visual stimuli, such as facial expressions, also applies to stimuli within the auditory modality. Importantly, this amygdala response was observed for both positive and negative emotional vocalizations.


Pain | 2008

Recognition and discrimination of prototypical dynamic expressions of pain and emotions

Daniela Simon; Kenneth D. Craig; Frédéric Gosselin; Pascal Belin; Pierre Rainville

Facial expressions of pain and emotions provide powerful social signals, which impart information about a person's state. Unfortunately, research on pain and emotion expression has been conducted largely in parallel with few bridges allowing for direct comparison of the expressive displays and their impact on observers. Moreover, although facial expressions are highly dynamic, previous research has relied mainly on static photographs. Here we directly compare the recognition and discrimination of dynamic facial expressions of pain and basic emotions by naïve observers. One-second film clips were recorded in eight actors displaying neutral facial expressions and expressions of pain and the basic emotions of anger, disgust, fear, happiness, sadness and surprise. Results based on the Facial Action Coding System (FACS) confirmed the distinct (and prototypical) configuration of pain and basic emotion expressions reported in previous studies. Volunteers' evaluations of those dynamic expressions on intensity, arousal and valence demonstrate the high sensitivity and specificity of the observers' judgement. Additional rating data further suggest that, for comparable expression intensity, pain is perceived as more arousing and more unpleasant. This study strongly supports the claim that the facial expression of pain is distinct from the expression of basic emotions. This set of dynamic facial expressions provides unique material to explore the psychological and neurobiological processes underlying the perception of pain expression, its impact on the observer, and its role in the regulation of social behaviour.

Collaboration


Pascal Belin's top co-authors and their affiliations:

Robert J. Zatorre (Montreal Neurological Institute and Hospital)

Marianne Latinus (François Rabelais University)

Ian Charest (Cognition and Brain Sciences Unit)

Cyril Pernet (University of Edinburgh)

Yves Samson (Centre national de la recherche scientifique)