Publication


Featured research published by Ian Charest.


NeuroImage | 2015

The human voice areas: Spatial organization and inter-individual variability in temporal and extra-temporal cortices.

Cyril Pernet; Philip McAleer; Marianne Latinus; Krzysztof J. Gorgolewski; Ian Charest; Patricia E. G. Bestelmeyer; Rebecca Watson; David Fleming; Frances Crabbe; Mitchell Valdés-Sosa; Pascal Belin

fMRI studies increasingly examine functions and properties of non-primary areas of human auditory cortex. However, there is currently no standardized localization procedure to reliably identify specific areas across individuals, such as the standard ‘localizers’ available in the visual domain. Here we present an fMRI ‘voice localizer’ scan allowing rapid and reliable localization of the voice-sensitive ‘temporal voice areas’ (TVA) of human auditory cortex. We describe results obtained using this standardized localizer scan in a large cohort of normal adult subjects. Most participants (94%) showed bilateral patches of significantly greater response to vocal than non-vocal sounds along the superior temporal sulcus/gyrus (STS/STG). Individual activation patterns, although reproducible, showed high inter-individual variability in precise anatomical location. Cluster analysis of individual peaks from the large cohort highlighted three bilateral clusters of voice-sensitivity, or “voice patches”, along posterior (TVAp), mid (TVAm) and anterior (TVAa) STS/STG, respectively. A series of extra-temporal areas including bilateral inferior prefrontal cortex and amygdalae showed small but reliable voice-sensitivity as part of a large-scale cerebral voice network. Stimuli for the voice localizer scan and probabilistic maps in MNI space are available for download.


Proceedings of the National Academy of Sciences of the United States of America | 2014

Unique semantic space in the brain of each beholder predicts perceived similarity

Ian Charest; Rogier A. Kievit; Taylor W. Schmitz; Diana Deca; Nikolaus Kriegeskorte

Significance Everyone is different. Understanding the unique way an individual perceives the world is a fundamental goal of psychology and brain science. Using novel methods for analyzing functional MRI (fMRI) data, we show that each person viewing a set of objects represents the objects uniquely in his or her brain. Moreover, given an individual’s measured brain-activity patterns, idiosyncrasies in his or her perception of the similarities among the objects can be predicted. Prediction accuracy is modest using current technology. However, our results demonstrate that fMRI has the power to reveal individually unique representations of particular objects in the human brain. The novel method might help us understand the biological substrate of individual experience in mental health and disease. The unique way in which each of us perceives the world must arise from our brain representations. If brain imaging could reveal an individual’s unique mental representation, it could help us understand the biological substrate of our individual experiential worlds in mental health and disease. However, imaging studies of object vision have focused on commonalities between individuals rather than individual differences and on category averages rather than representations of particular objects. Here we investigate the individually unique component of brain representations of particular objects with functional MRI (fMRI). Subjects were presented with unfamiliar and personally meaningful object images while we measured their brain activity on two separate days. We characterized the representational geometry by the dissimilarity matrix of activity patterns elicited by particular object images. The representational geometry remained stable across scanning days and was unique in each individual in early visual cortex and human inferior temporal cortex (hIT). The hIT representation predicted perceived similarity as reflected in dissimilarity judgments. 
Importantly, hIT predicted the individually unique component of the judgments when the objects were personally meaningful. Our results suggest that hIT brain representational idiosyncrasies accessible to fMRI are expressed in an individual's perceptual judgments. The unique way each of us perceives the world thus might reflect the individually unique representation in high-level visual areas.


Cortex | 2014

People-selectivity, audiovisual integration and heteromodality in the superior temporal sulcus

Rebecca Watson; Marianne Latinus; Ian Charest; Frances Crabbe; Pascal Belin

The functional role of the superior temporal sulcus (STS) has been implicated in a number of studies, including those investigating face perception, voice perception, and face–voice integration. However, the nature of the STS preference for these ‘social stimuli’ remains unclear, as does the location within the STS for specific types of information processing. The aim of this study was to directly examine properties of the STS in terms of selective response to social stimuli. We used functional magnetic resonance imaging (fMRI) to scan participants whilst they were presented with auditory, visual, or audiovisual stimuli of people or objects, with the intention of localising areas preferring both faces and voices (i.e., ‘people-selective’ regions) and audiovisual regions designed to specifically integrate person-related information. Results highlighted a ‘people-selective, heteromodal’ region in the trunk of the right STS which was activated by both faces and voices, and a restricted portion of the right posterior STS (pSTS) with an integrative preference for information from people, as compared to objects. These results point towards the dedicated role of the STS as a ‘social-information processing’ centre.


Cerebral Cortex | 2013

Cerebral Processing of Voice Gender Studied Using a Continuous Carryover fMRI Design

Ian Charest; Cyril Pernet; Marianne Latinus; Frances Crabbe; Pascal Belin

Normal listeners effortlessly determine a person's gender by voice, but the cerebral mechanisms underlying this ability remain unclear. Here, we demonstrate 2 stages of cerebral processing during voice gender categorization. Using voice morphing along with an adaptation-optimized functional magnetic resonance imaging design, we found that secondary auditory cortex, including the anterior part of the temporal voice areas in the right hemisphere, responded primarily to acoustical distance from the previously heard stimulus. In contrast, a network of bilateral regions involving inferior prefrontal and anterior and posterior cingulate cortex reflected perceived stimulus ambiguity. These findings suggest that voice gender recognition involves neuronal populations along the auditory ventral stream responsible for auditory feature extraction, functioning in tandem with the prefrontal cortex in voice gender perception.


NeuroImage: Clinical | 2013

Binge drinking influences the cerebral processing of vocal affective bursts in young adults.

Pierre Maurage; Patricia E. G. Bestelmeyer; Julien Rouger; Ian Charest; Pascal Belin

Binge drinking is now considered a central public health issue and is associated with emotional and interpersonal problems, but the neural implications of these deficits remain unexplored. The present study aimed at offering the first insights into the effects of binge drinking on the neural processing of vocal affect. On the basis of an alcohol-consumption screening phase (204 students), 24 young adults (12 binge drinkers and 12 matched controls, mean age: 23.8 years) were selected and performed an emotional categorisation task on morphed vocal stimuli (drawn from a morphed fear–anger continuum) during fMRI scanning. In comparison to controls, binge drinkers presented (1) worse behavioural performance in emotional affect categorisation; (2) reduced activation of bilateral superior temporal gyrus; and (3) increased activation of right middle frontal gyrus. These results constitute the first evidence of altered cerebral processing of emotional stimuli in binge drinking and confirm that binge drinking leads to marked cerebral changes, which has important implications for research and clinical practice.


Language, cognition and neuroscience | 2015

The brain of the beholder: honouring individual representational idiosyncrasies

Ian Charest; Nikolaus Kriegeskorte

In the early days of neuroimaging, brain function was investigated by averaging across voxels within a region, stimuli within a category, and individuals within a group. These three forms of averaging discard important neuroscientific information. Recent studies have explored analyses that combine the evidence in better-motivated ways. Multivariate pattern analyses enable researchers to reveal representations in distributed population codes, honouring the unique information contributed by different voxels (or neurons). Condition-rich designs more richly sample the stimulus space and can treat each stimulus as a unique entity. Finally, each individual's brain is unique, and recent studies have found ways to model and analyse the interindividual representational variability. Here we review our field's journey towards more sophisticated analyses that honour these important idiosyncrasies of brain representations. We describe an emerging framework for investigating individually unique pattern representations of particular stimuli in the brain. The framework models stimuli, responses and individuals multivariately and relates representations by means of representational dissimilarity matrices. Important components are computational models and multivariate descriptions of brain and behavioural responses. These recent developments promise a new paradigm for studying the individually unique brain at unprecedented levels of representational detail.


Cortex | 2014

Automatic domain-general processing of sound source identity in the left posterior middle frontal gyrus

Bruno L. Giordano; Cyril Pernet; Ian Charest; Guylaine Belizaire; Robert J. Zatorre; Pascal Belin

Identifying sound sources is fundamental to developing a stable representation of the environment in the face of variable auditory information. The cortical processes underlying this ability have received little attention. In two fMRI experiments, we investigated passive adaptation to (Exp. 1) and explicit discrimination of (Exp. 2) source identities for different categories of auditory objects (voices, musical instruments, environmental sounds). All cortical effects of source identity were independent of high-level category information, and were accounted for by sound-to-sound differences in low-level structure (e.g., loudness). A conjunction analysis revealed that the left posterior middle frontal gyrus (pMFG) adapted to identity repetitions during both passive listening and active discrimination tasks. These results indicate that the comparison of sound source identities in a stream of auditory stimulation recruits the pMFG in a domain-general way, i.e., independent of the sound category, based on information contained in the low-level acoustical structure. pMFG recruitment during both passive listening and explicit identity comparison tasks also suggests its automatic engagement in sound source identity processing.


Nature Neuroscience | 2018

Author Correction: Retrieval induces adaptive forgetting of competing memories via cortical pattern suppression

Maria Wimber; Arjen Alink; Ian Charest; Nikolaus Kriegeskorte; Michael C. Anderson

In the published version of this article, a detail is missing from the Methods section “Experimental procedure.” The following sentence is to be inserted at the end of its fourth paragraph: “If participants failed to respond within 3.5 s, we assumed that they were unable to successfully recognize the item and coded the corresponding trial as an error.” The critical behavioral forgetting effect is significant irrespective of whether these timeouts are coded as errors (t(23) = 4.91, P < 0.001) or as missing data (t(23) = 3.31, P < 0.01). The original article has not been corrected.


Archive | 2013

Audiovisual Integration of Face–Voice Gender Studied Using “Morphed Videos”

Rebecca Watson; Ian Charest; Julien Rouger; Christoph Casper; Marianne Latinus; Pascal Belin

Both the face and the voice provide us with not only linguistic information but also a wealth of paralinguistic information, including gender cues. However, the way in which we integrate these two sources in our perception of gender has remained largely unexplored. In the following study, we used a bimodal perception paradigm in which varying degrees of incongruence were created between facial and vocal information within audiovisual stimuli. We found that in general participants were able to combine both sources of information, with gender of the face being influenced by that of the voice and vice versa. However, in conditions that directed attention to either modality, we observed that participants were unable to ignore the gender of the voice, even when instructed to. Overall, our results point to a larger role of the voice in gender perception, when more controlled visual stimuli are used.


Current Biology | 2010

Vocal Attractiveness Increases by Averaging

Laetitia Bruckert; Patricia E. G. Bestelmeyer; Marianne Latinus; Julien Rouger; Ian Charest; Guillaume A. Rousselet; Hideki Kawahara; Pascal Belin

Collaboration


Dive into Ian Charest's collaborations.

Top Co-Authors

Pascal Belin
Université de Montréal

Marianne Latinus
François Rabelais University

Cyril Pernet
University of Edinburgh

Nikolaus Kriegeskorte
Cognition and Brain Sciences Unit

Pierre Maurage
Université catholique de Louvain