Elisa Back
Kingston University
Publications
Featured research published by Elisa Back.
Visual Cognition | 2009
Elisa Back; Timothy R. Jordan; Sharon M. Thomas
The ability to recognize mental states from facial expressions is essential for effective social interaction. However, previous investigations of mental state recognition have used only static faces, so the benefit of dynamic information for recognizing mental states remains to be determined. Experiment 1 found that dynamic faces produced higher levels of recognition accuracy than static faces, suggesting that the additional information contained within dynamic faces can facilitate mental state recognition. Experiment 2 explored the facial regions that are important for providing dynamic information in mental state displays. This involved using a new technique to freeze motion in a particular facial region (eyes, nose, mouth) so that this region was static while the remainder of the face was naturally moving. Findings showed that dynamic information in the eyes and the mouth was important and that the region of influence depended on the mental state. Processes involved in mental state recognition are discussed.
Research in Developmental Disabilities | 2013
Mary Hanley; Deborah M. Riby; Stephen Caswell; Sinead Rooney; Elisa Back
Individuals with the neuro-developmental disorder Williams syndrome (WS) are characterised by a combination of features that makes this group socially vulnerable, including mild-moderate cognitive difficulties, pro-social drive, and indiscriminate trust. The purpose of this study was to explore a key socio-communicative skill in individuals with WS, namely, mental state recognition abilities. We explored this skill in detail by looking at how well individuals with WS recognise complex everyday mental states, and how they allocate their attention while making these judgements. Participants with WS were matched to two typically developing groups for comparison purposes: a verbal-ability-matched group and a chronological-age-matched group. While eye movements were recorded, participants were shown displays of eight different mental states in static and dynamic form, and they performed a forced-choice judgement on the mental state. Mental states were easier to recognise in dynamic form than in static form. Mental state recognition ability for individuals with WS was poorer than expected for their chronological age, and at the level expected for their verbal ability. However, the pattern of mental state recognition for participants with WS varied according to mental state, and we found some interesting links between the ease/difficulty of recognising some mental states (worried/do not trust) and the classic behavioural profile associated with WS (high anxiety/indiscriminate trust). Furthermore, eye-tracking data revealed that participants with WS allocated their attention atypically, with less time spent attending to information from the face regions. This challenges the widely held understanding of WS as being associated with prolonged face and eye gaze, and indicates that there is more heterogeneity within this disorder in terms of socio-perception than previous reports would suggest.
Neuropsychologia | 2010
Deborah M. Riby; Elisa Back
The Williams syndrome (WS) social phenotype is characterised by a high level of social engagement, heightened empathy, and prolonged attention to people's faces. These behaviours appear to contradict research reporting problems recognising and interpreting basic emotions and more complex mental states from other people. The current task involved dynamic (moving) face stimuli of an actor depicting complex mental states (e.g., worried, disinterested). Cues from the eye and mouth regions were systematically frozen and kept neutrally expressive to help identify the source of mental state information in typical development and WS. Eighteen individuals with WS (aged 8-23 years) and matched groups of typically developing participants were most accurate at inferring mental states from whole dynamic faces. In this condition, individuals with WS performed at a level predicted by chronological age. When face parts (eyes or mouth) were frozen and neutrally expressive, individuals with WS showed the greatest decrement in performance when the eye region was uninformative. We propose that, with moving whole-face stimuli, individuals with WS can infer mental states, and that the eye region plays a particularly important role in performance.
PLOS ONE | 2014
Elisa Back; Timothy R. Jordan
Although a great deal of research has been conducted on the recognition of basic facial emotions (e.g., anger, happiness, sadness), much less research has been carried out on the more subtle facial expressions of an individual's mental state (e.g., anxiety, disinterest, relief). Of particular concern is that these mental state expressions provide a crucial source of communication in everyday life, yet little is known about the accuracy with which natural dynamic facial expressions of mental states are identified and, in particular, the variability in mental state perception that is produced. Here we report the findings of two studies that investigated the accuracy and variability with which dynamic facial expressions of mental states were identified by participants. Both studies used stimuli carefully constructed using procedures adopted in previous research, and free-report (Study 1) and forced-choice (Study 2) measures of response accuracy and variability. The findings of both studies showed levels of response accuracy that were accompanied by substantial variation in the labels assigned by observers to each mental state. Thus, when mental states are identified from facial expressions in experiments, the identities attached to these expressions appear to vary considerably across individuals. This variability raises important issues for understanding the identification of mental states in everyday situations and for the use of responses in facial expression research.
Cognition | 2008
Ian A. Apperly; Elisa Back; Dana Samson
Child Development | 2007
Elisa Back; Danielle Ropar; Peter Mitchell
Cognition | 2010
Elisa Back; Ian A. Apperly
Archive | 2010
Elisa Back; H. Hunt; A. Lindell
Archive | 2007
Elisa Back; Peter Mitchell; Danielle Ropar
Archive | 2015
Elisa Back; Ian A. Apperly