Megan L. Willis
Macquarie University
Publications
Featured research published by Megan L. Willis.
Neuropsychologia | 2011
Romina Palermo; Megan L. Willis; Davide Rivolta; Elinor McKone; C. Ellie Wilson; Andrew J. Calder
Research highlights
► Congenital prosopagnosics show weak holistic coding of expression and identity.
► Normal expression recognition can result from compensatory strategies.
► There may be a common stage of holistic coding for expression and identity.
► Holistic coding of identity is functionally involved in face identification ability.
Emotion | 2011
Megan L. Willis; Romina Palermo; Darren Burke
The aim of the current study was to examine how emotional expressions displayed by the face and body influence the decision to approach or avoid another individual. In Experiment 1, we examined approachability judgments provided to faces and bodies presented in isolation that were displaying angry, happy, and neutral expressions. Results revealed that angry expressions were associated with the most negative approachability ratings, for both faces and bodies. The effect of happy expressions was shown to differ for faces and bodies, with happy faces judged more approachable than neutral faces, whereas neutral bodies were considered more approachable than happy bodies. In Experiment 2, we sought to examine how we integrate emotional expressions depicted in the face and body when judging the approachability of face-body composite images. Our results revealed that approachability judgments given to face-body composites were driven largely by the facial expression. In Experiment 3, we then aimed to determine how the categorization of body expression is affected by facial expressions. This experiment revealed that body expressions were less accurately recognized when the accompanying facial expression was incongruent than when neutral. These findings suggest that the meaning extracted from a body expression is critically dependent on the valence of the associated facial expression.
Neuropsychologia | 2010
Megan L. Willis; Romina Palermo; Darren Burke; Ky McGrillen; Laurie A. Miller
Facial expressions of emotion display a wealth of important social information that we use to guide our social judgements. The aim of the current study was to investigate whether patients with orbitofrontal cortex (OFC) lesions exhibit an impaired ability to judge the approachability of emotional faces. Furthermore, we also intended to establish whether impaired approachability judgements provided to emotional faces emerged in the presence of preserved explicit facial expression recognition. Using non-parametric statistics, we found that patients with OFC lesions had a particular difficulty using negative facial expressions to guide approachability judgements, compared to healthy controls and patients with frontal lesions sparing the OFC. Importantly, this deficit arose in the absence of an explicit facial expression recognition deficit. In our sample of healthy controls, we also demonstrated that the capacity to recognise facial expressions was not significantly correlated with approachability judgements given to emotional faces. These results demonstrate that the integrity of the OFC is critical for the appropriate assessment of approachability from negatively valenced faces and this ability is functionally dissociable from the capacity to explicitly recognise facial expressions.
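To make the analytic approach concrete, the following is a minimal sketch (not the authors' code; the scores, group sizes and variable names are invented) of how non-parametric group comparisons of this kind are typically run in Python, using a Kruskal-Wallis omnibus test followed by pairwise Mann-Whitney U tests:

```python
# Hedged sketch: non-parametric comparison of approachability ratings across
# three groups (OFC lesion, non-OFC frontal lesion, healthy control).
# The data values and group sizes here are invented for illustration only.
import numpy as np
from scipy import stats

# Mean approachability ratings for negative faces, one value per participant
ofc      = np.array([-0.2, 0.1, 0.3, -0.1, 0.4, 0.0, 0.2])
frontal  = np.array([-1.1, -0.8, -1.3, -0.9, -1.0, -1.2])
controls = np.array([-1.0, -1.4, -0.9, -1.2, -1.1, -1.3, -1.0, -0.8])

# Omnibus test across the three groups
h, p_omnibus = stats.kruskal(ofc, frontal, controls)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p_omnibus:.3f}")

# Pairwise follow-ups (two-sided Mann-Whitney U tests)
for name, group in [("frontal", frontal), ("controls", controls)]:
    u, p = stats.mannwhitneyu(ofc, group, alternative="two-sided")
    print(f"OFC vs {name}: U = {u:.1f}, p = {p:.3f}")
```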
NeuroImage | 2010
Megan L. Willis; Romina Palermo; Darren Burke; Carmen Atkinson; Genevieve McArthur
The aim of this study was to determine if, and when, the neural processes involved in switching associations formed with angry and happy faces start to diverge. We measured event-related potentials (ERPs) and behavioural responses while participants performed a reversal learning task with angry and happy faces. In the task, participants were simultaneously presented with two neutral faces and learned to associate one of the faces with an emotional expression (either angry or happy), which was displayed by the face when correctly selected. After three to seven trials, the face that had consistently been displaying an emotional expression when selected would instead remain neutral, signalling the participant to switch their response and select the other face on the subsequent trial. The neural processes involved in switching associations formed with angry and happy faces diverged 375 ms after stimulus onset. Specifically, P3a amplitude was reduced and P3b latency was delayed when participants were cued to switch associations formed with angry expressions compared to happy expressions. This difference was also evident in later behavioural responses, which showed that it was more difficult to switch associations made with angry expressions than happy expressions. These findings may reflect an adaptive mechanism that facilitates the maintenance of our memory of threatening individuals by associating them with their potential threat.
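As an illustration of how component measures such as P3a amplitude and P3b latency are commonly extracted from an averaged waveform, here is a hedged sketch; the sampling rate, measurement windows and the simulated ERP are assumptions for demonstration, not the study's parameters:

```python
# Hedged sketch: extracting a P3a mean amplitude and a P3b peak latency from an
# averaged ERP waveform. Sampling rate, time windows and the simulated waveform
# are illustrative assumptions, not the study's actual parameters.
import numpy as np

sfreq = 500                                # samples per second (assumed)
times = np.arange(-0.2, 0.8, 1 / sfreq)    # epoch from -200 ms to 800 ms
erp = np.random.randn(times.size) * 0.5    # stand-in for a real averaged ERP (in microvolts)

def mean_amplitude(erp, times, tmin, tmax):
    """Mean voltage within a latency window (e.g. a P3a measurement window)."""
    mask = (times >= tmin) & (times <= tmax)
    return erp[mask].mean()

def peak_latency(erp, times, tmin, tmax):
    """Latency of the maximum positive deflection within a window (e.g. P3b)."""
    mask = (times >= tmin) & (times <= tmax)
    return times[mask][np.argmax(erp[mask])]

p3a_amp = mean_amplitude(erp, times, 0.30, 0.40)   # assumed P3a window
p3b_lat = peak_latency(erp, times, 0.35, 0.60)     # assumed P3b window
print(f"P3a mean amplitude: {p3a_amp:.2f} uV, P3b peak latency: {p3b_lat * 1000:.0f} ms")
```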
Social Cognitive and Affective Neuroscience | 2015
Megan L. Willis; Jillian M. Murphy; Nicole J. Ridley; Ans Vercammen
The orbitofrontal cortex (OFC) has been implicated in the capacity to accurately recognise facial expressions. The aim of the current study was to determine if anodal transcranial direct current stimulation (tDCS) targeting the right OFC in healthy adults would enhance facial expression recognition, compared with a sham condition. Across two counterbalanced sessions of tDCS (i.e. anodal and sham), 20 undergraduate participants (18 female) completed a facial expression labelling task comprising angry, disgusted, fearful, happy, sad and neutral expressions, and a control (social judgement) task comprising the same expressions. Responses on the labelling task were scored for accuracy, median reaction time and overall efficiency (i.e. combined accuracy and reaction time). Anodal tDCS targeting the right OFC enhanced facial expression recognition, reflected in greater efficiency and speed of recognition across emotions, relative to the sham condition. In contrast, there was no effect of tDCS on responses to the control task. This is the first study to demonstrate that anodal tDCS targeting the right OFC boosts facial expression recognition. This finding provides a solid foundation for future research to examine the efficacy of this technique as a means to treat facial expression recognition deficits, particularly in individuals with OFC damage or dysfunction.
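The abstract describes efficiency as combined accuracy and reaction time without giving the formula; one common operationalisation is the inverse efficiency score (median correct RT divided by proportion correct). The sketch below illustrates that assumed computation with hypothetical data:

```python
# Hedged sketch of one common way to combine accuracy and reaction time into a
# single efficiency measure: the inverse efficiency score (median correct RT
# divided by proportion correct). The abstract does not state the exact formula
# used, so treat this as an illustration rather than the study's computation.
import statistics

def inverse_efficiency(correct_rts_ms, n_correct, n_trials):
    """Lower values = more efficient (fast AND accurate) responding."""
    accuracy = n_correct / n_trials
    return statistics.median(correct_rts_ms) / accuracy

# Hypothetical data for one participant labelling angry expressions
rts = [812, 754, 901, 688, 840, 779]   # RTs on correct trials, in ms
print(inverse_efficiency(rts, n_correct=6, n_trials=8))   # ~1061 ms
```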
Neuropsychology | 2014
Megan L. Willis; Romina Palermo; Ky McGrillen; Laurie A. Miller
OBJECTIVE: Orbitofrontal cortex (OFC) damage has been associated with facial expression recognition deficits in some, but not all, previous studies. The pattern of performance of a group of patients with OFC damage was assessed across a series of facial expression recognition tasks. We aimed to determine whether some tasks were more sensitive at detecting deficits than others. METHOD: Seven patients with damage to the OFC, 6 control patients with frontal lesions that spared the OFC, and a group of healthy controls completed a series of facial expression recognition tasks, including 2 labeling tasks and 2 matching tasks. RESULTS: The OFC patient group demonstrated impaired labeling of negative facial expressions (i.e., anger, disgust, fear, and sadness) shown for a short time (500 ms) relative to the comparison groups. When facial expressions were shown for a longer time (5,000 ms), the OFC patient group's performance did not differ significantly from either comparison group. The OFC patient group was impaired when matching subtle, low-intensity negative facial expressions, but not when matching high-intensity, prototypical facial expressions. CONCLUSIONS: The pattern of performance across tasks revealed that only certain facial expression recognition tasks appear to be sufficiently sensitive to detect deficits in patients with OFC damage. These findings have important implications for the assessment of facial expression recognition difficulties in patients with OFC damage and, more broadly, for special populations.
PLOS ONE | 2013
Megan L. Willis; Helen F. Dodd; Romina Palermo
The aim of the current study was to examine the relationship between individual differences in anxiety and the social judgements of trustworthiness and approachability. We assessed levels of state and trait anxiety in eighty-two participants who rated the trustworthiness and approachability of a series of unexpressive faces. Higher levels of trait anxiety (controlling for age, sex and state anxiety) were associated with the judgement of faces as less trustworthy. In contrast, there was no significant association between trait anxiety and judgements of approachability. These findings indicate that trait anxiety is a significant predictor of trustworthiness evaluations and illustrate the importance of considering the role of individual differences in the evaluation of trustworthiness. We propose that trait anxiety may be an important variable to control for in future studies assessing the cognitive and neural mechanisms underlying trustworthiness. This is likely to be particularly important for studies involving clinical populations who often experience atypical levels of anxiety.
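As an illustration of the kind of analysis described (trait anxiety predicting trustworthiness ratings while controlling for age, sex and state anxiety), here is a minimal regression sketch with simulated data; the column names and effect sizes are hypothetical, not taken from the study:

```python
# Hedged sketch: regressing mean trustworthiness ratings on trait anxiety while
# controlling for age, sex and state anxiety. Column names and the simulated
# data are hypothetical; this is not the authors' analysis script.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 82
df = pd.DataFrame({
    "trait_anxiety": rng.normal(40, 10, n),
    "state_anxiety": rng.normal(35, 8, n),
    "age": rng.integers(18, 45, n),
    "sex": rng.choice(["F", "M"], n),
})
# Simulate ratings so that higher trait anxiety -> lower trustworthiness
df["trustworthiness"] = 5 - 0.03 * df["trait_anxiety"] + rng.normal(0, 0.5, n)

model = smf.ols("trustworthiness ~ trait_anxiety + state_anxiety + age + C(sex)",
                data=df).fit()
print(model.summary().tables[1])   # coefficient for trait_anxiety should be negative
```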
PLOS ONE | 2015
Megan L. Willis; Natalie A. Windsor; Danielle L. Lawson; Nicole J. Ridley
Facial expressions of emotion play a key role in guiding social judgements, including deciding whether or not to approach another person. However, no research has examined how situational context modulates approachability judgements assigned to emotional faces, or the relationship between perceived threat and approachability judgements. Fifty-two participants provided approachability judgements to angry, disgusted, fearful, happy, neutral, and sad faces across three situational contexts: no context, when giving help, and when receiving help. Participants also rated the emotional faces for level of perceived threat and labelled the facial expressions. Results indicated that context modulated approachability judgements to faces depicting negative emotions. Specifically, faces depicting distress-related emotions (i.e., sadness and fear) were considered more approachable in the giving help context than both the receiving help and neutral context. Furthermore, higher ratings of threat were associated with the assessment of angry, happy and neutral faces as less approachable. These findings are the first to demonstrate the significant role that context plays in the evaluation of an individual’s approachability and illustrate the important relationship between perceived threat and the evaluation of approachability.
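A design like this (six emotions crossed with three contexts, all within subjects) is typically analysed with a repeated-measures ANOVA. The sketch below shows that general approach with simulated ratings; it is not the authors' analysis, and the effect built into the data is purely illustrative:

```python
# Hedged sketch: a 6 (emotion) x 3 (context) repeated-measures ANOVA on
# approachability ratings, in long format. The ratings are simulated; the
# factor levels mirror the design described in the abstract.
import itertools
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
emotions = ["angry", "disgusted", "fearful", "happy", "neutral", "sad"]
contexts = ["no_context", "giving_help", "receiving_help"]

rows = []
for subject in range(1, 53):                       # 52 participants
    for emotion, context in itertools.product(emotions, contexts):
        rating = rng.normal(0, 1)
        # Build in the reported pattern: distress-related emotions are rated
        # more approachable when the participant imagines giving help.
        if emotion in ("sad", "fearful") and context == "giving_help":
            rating += 0.8
        rows.append({"subject": subject, "emotion": emotion,
                     "context": context, "rating": rating})

df = pd.DataFrame(rows)
anova = AnovaRM(df, depvar="rating", subject="subject",
                within=["emotion", "context"]).fit()
print(anova)
```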
Frontiers in Psychology | 2015
Megan L. Willis; Danielle L. Lawson; Nicole J. Ridley; Peter Koval; Peter G. Rendell
Previous research on approachability judgments has indicated that facial expressions modulate how these judgments are made, but the relationship between emotional empathy and context in this decision-making process has not yet been examined. This study examined the contribution of emotional empathy to approachability judgments assigned to emotional faces in different contexts. One hundred and twenty female participants completed a questionnaire measure of emotional empathy. Participants provided approachability judgments to faces displaying angry, disgusted, fearful, happy, neutral, and sad expressions in three different contexts, evaluating whether they would approach another individual: (1) to receive help; (2) to give help; or (3) when no contextual information was provided. Participants were also required to rate perceived threat and emotional intensity, and to label the facial expressions. Emotional empathy significantly predicted approachability ratings for specific emotions in each context, over and above the contribution of perceived threat and intensity, which were associated with emotional empathy. Higher emotional empathy predicted less willingness to approach people with angry and disgusted faces to receive help, and a greater willingness to approach people with happy faces to receive help. Higher emotional empathy also predicted a greater willingness to approach people with sad faces to offer help, and more willingness to approach people with happy faces when no contextual information was provided. These results highlight the important contribution of individual differences in emotional empathy in predicting how approachability judgments are assigned to facial expressions in context.
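Testing whether empathy predicts ratings "over and above" threat and intensity amounts to comparing nested regression models. A minimal sketch of that logic, with simulated data and hypothetical variable names, follows:

```python
# Hedged sketch: does emotional empathy explain variance in approachability
# ratings over and above perceived threat and emotional intensity? Compare a
# covariates-only model against one that adds empathy. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 120
df = pd.DataFrame({
    "threat": rng.normal(3, 1, n),
    "intensity": rng.normal(5, 1, n),
    "empathy": rng.normal(50, 10, n),
})
# Hypothetical outcome: approachability of happy faces in one context
df["approach"] = 0.04 * df["empathy"] - 0.3 * df["threat"] + rng.normal(0, 0.6, n)

base = smf.ols("approach ~ threat + intensity", data=df).fit()
full = smf.ols("approach ~ threat + intensity + empathy", data=df).fit()

print(f"R^2 change: {full.rsquared - base.rsquared:.3f}")
print(full.compare_f_test(base))   # F test for the added empathy term
```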
Clinical EEG and Neuroscience | 2012
Adam Bentvelzen; Genevieve McArthur; Blake W. Johnson; Megan L. Willis; Stuart Lee; Greg Savage
The N400 is a human neuroelectric response to semantic incongruity in on-line sentence processing, and implausibility in context has been identified as one of the factors that influence the size of the N400. In this paper we investigate whether predictors derived from Latent Semantic Analysis, language models, and Roark's parser are significant in modeling the N400m (the neuromagnetic version of the N400). We also investigate the significance of a novel pairwise-priming language model based on the IBM Model 1 translation model. Our experiments show that all the predictors are significant. Moreover, we show that predictors based on the 4-gram language model and the pairwise-priming language model are highly correlated with the manual annotation of contextual plausibility, suggesting that these predictors are capable of playing the same role as the manual annotations in predicting the N400m response. We also show that the proposed predictors can be grouped into two clusters of significant predictors, suggesting that each cluster captures a different characteristic of the N400m response.
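To give a concrete sense of what an n-gram predictor of this kind looks like, the sketch below computes per-word surprisal from a toy 4-gram model with add-one smoothing; the corpus and smoothing scheme are illustrative assumptions and bear no relation to the models actually used in the paper:

```python
# Hedged sketch: per-word surprisal from a 4-gram language model, the kind of
# quantity used as a predictor of the N400m. The toy corpus and the add-one
# smoothing scheme are illustrative assumptions only.
import math
from collections import Counter

corpus = ("the cat sat on the mat . the dog sat on the rug . "
          "the cat chased the dog .").split()

n = 4
ngrams = Counter(tuple(corpus[i:i + n]) for i in range(len(corpus) - n + 1))
contexts = Counter(tuple(corpus[i:i + n - 1]) for i in range(len(corpus) - n + 2))
vocab = set(corpus)

def surprisal(context, word):
    """-log2 P(word | previous three words), with add-one smoothing."""
    num = ngrams[tuple(context) + (word,)] + 1
    den = contexts[tuple(context)] + len(vocab)
    return -math.log2(num / den)

# Surprisal after "cat sat on": lower for an attested continuation ("the")
# than for an unattested one ("rug").
print(surprisal(("cat", "sat", "on"), "the"))   # ~2.3 bits
print(surprisal(("cat", "sat", "on"), "rug"))   # ~3.3 bits
```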