Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Laurence Chaby is active.

Publication


Featured research published by Laurence Chaby.


Brain Research | 2009

The time course of repetition effects for familiar faces and objects: An ERP study

Cécile Guillaume; Bérengère Guillery-Girard; Laurence Chaby; Karine Lebreton; Laurent Hugueville; Francis Eustache; Nicole Fiori

Face and object priming has been extensively studied, but less is known about which repetition processes are specific to each material and which are common to both. In order to track the time course of these repetition processes, EEG was recorded while 12 healthy young subjects performed a long-term perceptual repetition priming task using faces and object drawings. Item repetition induced early (N170) and late (P300 and 400-600 ms time-window) event-related potential (ERP) modulations. The N170 component was reduced in response to primed stimuli even with several hundred intervening items, and this repetition effect was larger for objects than for faces. This early repetition effect may reflect the implicit retrieval of perceptual features. The late repetition effects showed enhanced positivity for primed items at centro-parietal, central and frontal sites. During this later time-window (400-600 ms at central and frontal sites), ERP repetition effects were more pronounced over the left side for objects and over the right side for faces. ERP repetition effects were also larger for famous faces during this time-window. These later repetition effects may reflect deeper semantic processing and/or greater involvement of involuntary explicit retrieval processes for the famous faces. Taken together, these results suggest that, among the implicit and explicit memory processes elicited by a perceptual priming task, some are modulated by the type of item that is repeated.


Neuroreport | 1999

Differential processing of part-to-whole and part-to-part face priming: an ERP study

Boutheina Jemel; Nathalie George; Laurence Chaby; Nicole Fiori; Bernard Renault

We provide electrophysiological evidence supporting the hypothesis that part and whole face processing involve distinct functional mechanisms. We used a congruency judgment task and studied part-to-whole and part-to-part priming effects. Neither the part-to-whole nor the part-to-part condition elicited early congruency effects on face-specific ERP components, suggesting that activation of the internal representations occurs later on. However, these components showed differential responsiveness to whole faces and isolated eyes. In addition, although late ERP components were affected in both conditions when the eye targets were not associated with the prime, their temporal and topographical features depended on the priming condition. These differential effects suggest the existence of distributed neural networks in the inferior temporal cortex where part and whole facial representations may be stored.


Psychology and Aging | 2011

Older adults' configural processing of faces: role of second-order information.

Laurence Chaby; Pauline Narme; Nathalie George

Problems with face recognition are frequent in older adults. However, the mechanisms involved remain only partially understood. In particular, it is unknown to what extent these problems may be related to changes in configural face processing. Here, we investigated the face inversion effect (FIE) together with the ability to detect modifications in the vertical or horizontal second-order relations between facial features. We used a same/different unfamiliar face discrimination task with 33 young and 33 older adults. The results showed dissociations in the performance of older versus younger adults. There was no inversion effect during the recognition of original faces by older adults. However, for modified faces, older adults showed a pattern of performance similar to that of young participants, with a preserved FIE for vertically modified faces and no detectable FIE for horizontally modified faces. Most importantly, the detection of vertical modifications was preserved in older relative to young adults, whereas the detection of horizontal modifications was markedly diminished. We conclude that age has dissociable effects on configural face-encoding processes, with a relative preservation of vertical compared to horizontal second-order relations processing. These results help to reconcile some divergent results in the literature and may explain the spared familiar face identification abilities in the daily lives of older adults.


Neuroscience Letters | 2003

Age-related changes in brain responses to personally known faces: an event-related potential (ERP) study in humans.

Laurence Chaby; Nathalie George; Bernard Renault; Nicole Fiori

The midlife period has so far received little investigation regarding the association between brain responses and spared face-processing abilities. This study examines the effects of midlife aging on behavioural performance and event-related potentials (ERPs) during the perception of personally known faces. Ten middle-aged adults (aged 45-60) and 12 young adults (aged 20-30) performed a visual discrimination task based on the detection of modified eye colours. We found that this task was performed as accurately by middle-aged as by young adults. However, midlife aging was associated with specific ERP latency delays and important changes in scalp ERP distribution. These results, interpreted according to a compensation hypothesis, suggest that, compared to young adults, the changes in brain activity observed in middle-aged adults may contribute to their maintained behavioural performance.


Psychologie & Neuropsychiatrie Du Vieillissement | 2009

La reconnaissance des visages et de leurs expressions faciales au cours du vieillissement normal et dans les pathologies neurodégénératives [Face recognition and facial expression recognition in normal aging and neurodegenerative diseases]

Laurence Chaby; Pauline Narme

The ability to recognize facial identity and emotional facial expression is central to social relationships. This paper reviews studies concerning face recognition and emotional facial expression during normal aging as well as in neurodegenerative diseases occurring in the elderly. It focuses on Alzheimer's disease, frontotemporal and semantic dementia, and also Parkinson's disease. The results of studies on healthy elderly individuals show subtle alterations in the recognition of facial identity and emotional facial expression from the age of 50 years, increasing after 70. Studies in neurodegenerative diseases show that, during their initial stages, face recognition and facial expression processing can be specifically affected. Little has been done to assess these difficulties in clinical practice. They could constitute a useful marker for differential diagnosis, especially for the clinical differentiation of Alzheimer's disease (AD) from frontotemporal dementia (FTD). Social difficulties and some behavioural problems observed in these patients may, at least partly, result from these deficits in face processing. Thus, it is important to specify the possible underlying anatomofunctional substrates of these deficits as well as to plan suitable remediation programs.


Frontiers in Psychology | 2015

A Multidimensional Approach to the Study of Emotion Recognition in Autism Spectrum Disorders

Jean Xavier; Violaine Vignaud; Rosa Ruggiero; Nicolas Bodeau; David Cohen; Laurence Chaby

Although deficits in emotion recognition have been widely reported in autism spectrum disorder (ASD), experiments have been restricted to either facial or vocal expressions. Here, we explored multimodal emotion processing in children with ASD (N = 19) and with typical development (TD, N = 19), considering unimodal (faces and voices presented separately) and multimodal (faces and voices presented simultaneously) stimuli as well as developmental comorbidities (neuro-visual, language and motor impairments). Compared to TD controls, children with ASD had rather high and heterogeneous emotion recognition scores but also showed several significant differences: lower emotion recognition scores for visual stimuli and for the neutral emotion, and a greater number of saccades during the visual task. Multivariate analyses showed that: (1) the difficulties they experienced with visual stimuli were partially alleviated with multimodal stimuli; (2) developmental age was significantly associated with emotion recognition in TD children, whereas in children with ASD this was the case only for the multimodal task; (3) language impairments tended to be associated with the emotion recognition scores of ASD children in the auditory modality. Conversely, in the visual and bimodal (visuo-auditory) tasks, no impact of developmental coordination disorder or neuro-visual impairments was found. We conclude that impaired emotion processing constitutes a dimension to explore in the field of ASD, as research has the potential to define more homogeneous subgroups and tailored interventions. However, developmental age, the nature of the stimuli, and other developmental comorbidities must also be taken into account when studying this dimension.


Neurocase | 2014

A case of bilateral frontal tumors without “frontal syndrome”

Monique Plaza; V. du Boullay; A. Perrault; Laurence Chaby; Laurent Capelle

We report the longitudinal case study of a right-handed patient harboring two frontal tumors who benefited from bilateral simultaneous surgery. The tumors were WHO Grade II gliomas located in the left inferior frontal area (including the cingulate gyrus) and the right anterior superior frontal gyrus. The double tumor resection was guided by direct electrical stimulation of brain areas while the patient was awake. Neuropsychological assessments were administered before and after the surgery to analyse how the brain functions in the presence of two frontal gliomas affecting both hemispheres, and how it reacts to a bilateral resection, which can abruptly compromise the neuronal connectivity progressively established during the infiltration process. We showed that neither the tumor infiltration nor the bilateral resection led to a “frontal syndrome” or a “dysexecutive syndrome” as predicted by localization models. However, a subtle fragility was observed in fine-grained language, memory and emotional skills. This case study reveals the significance of brain plasticity in the reorganization of cognitive networks, even in cases of bilateral tumors. It also confirms the clinical relevance of hodotopical brain models, which consider the brain to be organized in parallel-distributed networks around cortical centers and epicenters.


Frontiers in Psychology | 2015

Compensating for age limits through emotional crossmodal integration

Laurence Chaby; Viviane Luherne-du Boullay; Mohamed Chetouani; Monique Plaza

Social interactions in daily life necessitate the integration of social signals from different sensory modalities. In the aging literature, it is well established that the recognition of emotion in facial expressions declines with advancing age, and the same holds for vocal expressions. By contrast, crossmodal integration processing in healthy aging individuals is less documented. Here, we investigated age-related effects on emotion recognition when faces and voices were presented alone or simultaneously, allowing for crossmodal integration. In this study, 31 young adults (M = 25.8 years) and 31 older adults (M = 67.2 years) were instructed to identify several basic emotions (happiness, sadness, anger, fear, disgust) and a neutral expression, which were displayed as visual (facial expressions), auditory (non-verbal affective vocalizations) or crossmodal (simultaneous, congruent facial and vocal affective expressions) stimuli. The results showed that older adults were slower and worse than younger adults at recognizing negative emotions from isolated faces and voices. In the crossmodal condition, although slower, older adults were as accurate as younger adults except for anger. Importantly, additional analyses using the “race model” demonstrated that older adults benefited to the same extent as younger adults from the combination of facial and vocal emotional stimuli. These results help explain some conflicting results in the literature and may clarify emotional abilities related to daily life that are partially spared among older adults.
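The “race model” analysis referenced in this abstract is commonly Miller's (1982) race model inequality, which bounds the bimodal reaction-time distribution by the sum of the unimodal ones: F_AV(t) ≤ F_A(t) + F_V(t). Where the bimodal CDF exceeds this bound, the speed-up cannot be explained by two independent unimodal processes racing, which is taken as evidence of crossmodal integration. A minimal sketch of the test (the function name and the sample reaction times are hypothetical illustrations, not the study's data or code):

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av, t_grid):
    """Miller's (1982) race model inequality: F_AV(t) <= F_A(t) + F_V(t).
    Returns F_AV(t) minus the (capped) unimodal bound at each time point;
    positive values indicate violations, i.e. evidence of coactivation."""
    def ecdf(samples, t):
        samples = np.sort(np.asarray(samples, dtype=float))
        return np.searchsorted(samples, t, side="right") / len(samples)
    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return ecdf(rt_av, t_grid) - bound

# Hypothetical reaction times (ms): bimodal trials faster than either unimodal set
t = np.linspace(250, 900, 14)
violation = race_model_violation([500, 600, 700], [520, 620, 720],
                                 [300, 310, 320], t)
```

In practice the test is run per participant on RT quantiles and violations are compared across age groups, which is how equivalent crossmodal benefit in older and younger adults can be assessed.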


Schizophrenia Research | 2015

Facial, vocal and cross-modal emotion processing in early-onset schizophrenia spectrum disorders

Marianna Giannitelli; Jean Xavier; Anne François; Nicolas Bodeau; Claudine Laurent; David Cohen; Laurence Chaby

Recognition of emotional expressions plays an essential role in children's healthy development. Anomalies in these skills may result in empathy deficits, social interaction difficulties and premorbid emotional problems in children and adolescents with schizophrenia. Twenty-six subjects with early-onset schizophrenia spectrum (EOSS) disorders and twenty-eight matched healthy controls (HC) were instructed to identify five basic emotions and a neutral expression. The assessment entailed presenting visual, auditory and congruent cross-modal stimuli. Using a generalized linear mixed model, we found no significant association for handedness, age or gender. However, significant associations emerged for emotion type, perception modality, and group. EOSS patients performed worse than HC in uni- and cross-modal emotional tasks, with a specific impairment pattern for negative emotion processing. There was no relationship between emotion identification scores and positive or negative symptoms, self-reported empathy traits or a positive history of developmental disorders. However, we found a significant association between emotion identification scores and nonverbal communication impairments. We conclude that cumulative dysfunctions in both nonverbal communication and emotion processing contribute to the social vulnerability and morbidity found in youths with EOSS disorders.


Frontiers in Psychology | 2017

Gaze Behavior Consistency among Older and Younger Adults When Looking at Emotional Faces

Laurence Chaby; Isabelle Hupont; Marie Avril; Viviane Luherne-du Boullay; Mohamed Chetouani

The identification of non-verbal emotional signals, and especially of facial expressions, is essential for successful social communication among humans. Previous research has reported an age-related decline in facial emotion identification and argued for socio-emotional or aging-brain model explanations. However, perceptual differences in the gaze strategies that accompany facial emotion processing with advancing age remain under-explored. In this study, 22 young (22.2 years) and 22 older (70.4 years) adults were instructed to look at basic facial expressions while their gaze movements were recorded by an eye-tracker. Participants were then asked to identify each emotion, and the unbiased hit rate was used as the performance measure. Gaze data were first analyzed using traditional measures of fixations over two preferential regions of the face (upper and lower areas) for each emotion. Then, to better capture core gaze changes with advancing age, spatio-temporal gaze behaviors were examined in more depth using data-driven analyses (dimension reduction, clustering). Results first confirmed that older adults performed worse than younger adults at identifying facial expressions, except for “joy” and “disgust,” and this was accompanied by a gaze preference toward the lower face. Interestingly, this phenomenon was maintained during the whole time course of stimulus presentation. More importantly, trials corresponding to older adults were more tightly clustered, suggesting that the gaze behavior patterns of older adults are more consistent than those of younger adults. This study demonstrates that, confronted with emotional faces, younger and older adults do not prioritize or ignore the same facial areas. Older adults mainly adopted a focused-gaze strategy, focusing only on the lower part of the face throughout the whole stimulus display time. This consistency may constitute a robust and distinctive “social signature” of emotional identification in aging. Younger adults, however, were more dispersed in terms of gaze behavior and used a more exploratory-gaze strategy, repeatedly visiting both facial areas.
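The unbiased hit rate used here is Wagner's (1993) measure, which corrects raw accuracy for response bias: for each emotion category, the squared number of correct responses is divided by the product of how often that emotion was presented and how often it was chosen as a response. A minimal sketch computing it from a stimulus-by-response confusion matrix (the function name and the toy matrix are hypothetical illustrations, not the study's data):

```python
import numpy as np

def unbiased_hit_rate(confusion):
    """Wagner's (1993) unbiased hit rate per stimulus category i:
    Hu_i = correct_i**2 / (times_i_was_shown * times_i_was_answered).
    Rows index the presented emotion, columns the chosen response."""
    confusion = np.asarray(confusion, dtype=float)
    correct = np.diag(confusion)
    stim_totals = confusion.sum(axis=1)   # how often each emotion was shown
    resp_totals = confusion.sum(axis=0)   # how often each response was given
    return correct**2 / (stim_totals * resp_totals)

# Toy confusion matrix for three emotions (rows: shown, cols: answered)
m = [[8, 1, 1],
     [2, 6, 2],
     [0, 2, 8]]
hu = unbiased_hit_rate(m)  # Hu per emotion: 0.64, 0.40, ~0.58
```

Unlike raw proportion correct, Hu penalizes a participant who inflates hits for one emotion by over-using that response label, which matters when comparing groups with different response biases.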

Collaboration


Dive into Laurence Chaby's collaborations.

Top Co-Authors

Monique Plaza (Centre national de la recherche scientifique)

Nicole Fiori (Centre national de la recherche scientifique)

Bernard Renault (Centre national de la recherche scientifique)

Pauline Narme (University of Picardie Jules Verne)

Stéphanie Hun (University of Nice Sophia Antipolis)

Sylvie Serret (University of Nice Sophia Antipolis)