Andrés Fernández-Martín
University of La Laguna
Publications
Featured research published by Andrés Fernández-Martín.
Cognition | 2012
Manuel G. Calvo; Andrés Fernández-Martín; Lauri Nummenmaa
Why is a face with a smile but non-happy eyes likely to be interpreted as happy? We used blended expressions in which a smiling mouth was incongruent with the eyes (e.g., angry eyes), as well as genuine expressions with congruent eyes and mouth (e.g., both happy or angry). Tasks involved detection of a smiling mouth (perceptual), categorization of the expression (semantic), and valence evaluation (affective). The face stimulus display duration and stimulus onset asynchrony (SOA) were varied to assess the time course of each process. Results indicated that (a) a smiling mouth was visually more salient than the eyes both in truly happy and blended expressions; (b) a smile led viewers to categorize blended expressions as happy similarly for upright and inverted faces; (c) truly happy, but not blended, expressions primed the affective evaluation of probe scenes 550 ms following face onset; (d) both truly happy and blended expressions primed the detection of a smile in a probe scene by 170 ms post-stimulus; and (e) smile detection and expression categorization had similar processing thresholds and preceded affective evaluation. We conclude that the saliency of single physical features such as the mouth shape makes the smile quickly accessible to the visual system, which initially speeds up expression categorization regardless of congruence with the eyes. Only when the eye expression is later configurally integrated with the mouth, will affective discrimination begin. The present research provides support for serial models of facial expression processing.
Clinical Psychological Science | 2015
Jenny Yiend; Andrew Mathews; Tom Burns; Kevin Dutton; Andrés Fernández-Martín; George A. Georgiou; Michael Luckie; Alexandra Rose; Riccardo Russo; Elaine Fox
A well-established literature has identified different selective attentional orienting mechanisms underlying anxiety-related attentional bias, such as engagement and disengagement of attention. These mechanisms are thought to contribute to the onset and maintenance of anxiety disorders. However, conclusions to date have relied heavily on experimental work from subclinical samples. We therefore investigated individuals with diagnosed generalized anxiety disorder (GAD), healthy volunteers, and individuals with high trait anxiety (but not meeting GAD diagnostic criteria). Across two experiments we found faster disengagement from negative (angry and fearful) faces in GAD groups, an effect opposite to that expected on the basis of the subclinical literature. Together these data challenge current assumptions that we can generalize, to those with GAD, the pattern of selective attentional orienting to threat found in subclinical groups. We suggest a decisive two-stage experiment identifying stimuli of primary salience in GAD, then using these to reexamine orienting mechanisms across groups.
Biological Psychology | 2014
Manuel G. Calvo; David Beltrán; Andrés Fernández-Martín
We investigated the time course and processes in the recognition of facial expressions in peripheral vision (10.5°). Happy faces were categorized more accurately and faster than angry, fearful, sad, and neutral faces. Consistently, the N1 (90–130 ms post-stimulus) and N2pc (200–300 ms) event-related potential (ERP) components were more negative, and the SPWs (slow positive waves; 700–800 ms) were smaller, for happy than for non-happy faces. Computational modeling revealed that the smiling mouth became visually salient earlier (95 ms) than any other region, in temporal correspondence with the N1, thus showing an attentional capture by the smile. The N2pc presumably reflected the subsequent selective allocation of processing resources to happy faces. As a result, the reduced SPWs suggest that the decision process in expression categorization became less demanding for happy faces. We propose that facial expression recognition in peripheral vision is mainly driven by perceptual processing, without affective discrimination.
Journal of Cognitive Psychology | 2012
Manuel G. Calvo; Aida Gutiérrez; Andrés Fernández-Martín
We investigated whether anxiety facilitates detection of threat stimuli outside the focus of overt attention, and the time course of the interference produced by threat distractors. Threat or neutral word distractors were presented in attended (foveal) and unattended (parafoveal) locations followed by an unrelated probe word at 300 ms (Experiments 1 and 2) or 1000 ms (Experiment 2) stimulus onset asynchrony (SOA) in a lexical decision task. Results showed: (1) no effects of trait anxiety on selective saccades to the parafoveal threat distractors; (2) interference with probe processing (i.e., slowed lexical decision times) following a foveal threat distractor at 300 ms SOA for all participants, regardless of anxiety, but only for high-anxiety participants at 1000 ms SOA; and (3) no interference effects of parafoveal threat distractors. These findings suggest that anxiety does not enhance preattentive semantic processing of threat words. Rather, anxiety leads to delays in the inhibitory control of attended task-irrelevant threat stimuli.
Journal of Cognitive Psychology | 2017
Manuel G. Calvo; Patricia Álvarez-Plaza; Andrés Fernández-Martín
What expressive facial features and processing mechanisms make a person look trustworthy, relative to happy? Participants judged the un/happiness or un/trustworthiness of people with dynamic expressions in which the eyes and/or the mouth unfolded from neutral to happy or vice versa. Faces with an unfolding smile looked more trustworthy and happier than faces with a neutral mouth, regardless of the eye expression. Unfolding happy eyes increased both trustworthiness and happiness only in the presence of a congruent unfolding smiling mouth. Nevertheless, the contribution of the mouth was greater for happiness than for trustworthiness; and the mouth was especially visually salient for expressions favouring happiness more than trustworthiness. We conclude that the categorisation of facial happiness is more automatically driven by the visual saliency of a single feature, that is, the smiling mouth, while perception of trustworthiness is more strategic, with the eyes being necessarily incorporated into a configural face representation.
Consciousness and Cognition | 2017
Andrés Fernández-Martín; Aida Gutiérrez-García; Juan I. Capafóns; Manuel G. Calvo
We investigated selective attention to emotional scenes in peripheral vision, as a function of adaptive relevance of scene affective content for male and female observers. Pairs of emotional-neutral images appeared peripherally, with perceptual stimulus differences controlled, while viewers were fixating on a different stimulus in central vision. Early selective orienting was assessed by the probability of directing the first fixation towards either scene, and the time until first fixation. Emotional scenes selectively captured covert attention even when they were task-irrelevant, thus revealing involuntary, automatic processing. Sex of observers and specific emotional scene content (e.g., male-to-female aggression, families and babies, etc.) interactively modulated covert attention, depending on adaptive priorities and goals for each sex, both for pleasant and unpleasant content. The attentional system exhibits domain-specific and sex-specific biases and attunements, probably rooted in evolutionary pressures to enhance reproductive and protective success. Emotional cues selectively capture covert attention based on their bio-social significance.
Cognition | 2016
Andrés Fernández-Martín; Manuel G. Calvo
We investigated the relative attentional capture by positive versus simultaneously presented negative images in extrafoveal vision for female observers. Pairs of task-irrelevant pleasant and unpleasant visual scenes were displayed peripherally (≥5° away from fixation) during a task-relevant letter-discrimination task at fixation. Selective attentional orienting was assessed by the probability of first fixating each scene and the time until first fixation. Results revealed a higher first fixation probability and shorter entry times, followed by longer dwell times, for pleasant relative to unpleasant scenes. The attentional capture advantage by pleasant scenes occurred in the absence of differences in perceptual properties. Processing of affective scene significance thus occurs early and automatically through covert attention in peripheral vision. At least in non-threatening conditions, the attentional system is tuned to initially orient to pleasant images when competing with unpleasant ones.
Quarterly Journal of Experimental Psychology | 2018
Manuel G. Calvo; Eva G. Krumhuber; Andrés Fernández-Martín
A happy facial expression makes a person look (more) trustworthy. Do perceptions of happiness and trustworthiness rely on the same face regions and visual attention processes? In an eye-tracking study, eye movements and fixations were recorded while participants judged the un/happiness or the un/trustworthiness of dynamic facial expressions in which the eyes and/or the mouth unfolded from neutral to happy or vice versa. A smiling mouth and happy eyes enhanced perceived happiness and trustworthiness similarly, with a greater contribution of the smile relative to the eyes. This comparable judgement output for happiness and trustworthiness was reached through shared as well as distinct attentional mechanisms: (a) entry times and (b) initial fixation thresholds for each face region were equivalent for both judgements, thereby revealing the same attentional orienting in happiness and trustworthiness processing. However, (c) greater and (d) longer fixation density for the mouth region in the happiness task, and for the eye region in the trustworthiness task, demonstrated different selective attentional engagement. Relatedly, (e) mean fixation duration across face regions was longer in the trustworthiness task, thus showing increased attentional intensity or processing effort.
Visual Cognition | 2015
Andrés Fernández-Martín; Manuel G. Calvo
Pairs of emotional (pleasant or unpleasant) and neutral scenes were presented peripherally (≥5° away from fixation) during a central letter-discrimination task. Selective attentional capture was assessed by means of eye movement orienting, i.e., probability of first fixating a scene and the time until first fixation. Static and dynamic visual saliency values of the scenes were computationally modelled. Results revealed selective orienting to both pleasant and unpleasant relative to neutral scenes. Importantly, such effects remained in the absence of visual saliency differences, even though saliency influenced eye movements. This suggests that selective attention to emotional scenes is genuinely driven by the processing of affective significance in extrafoveal vision.
Frontiers in Psychology | 2018
Manuel G. Calvo; Andrés Fernández-Martín; Guillermo Recio; Daniel Lundqvist
Most experimental studies of facial expression processing have used static stimuli (photographs), yet facial expressions in daily life are generally dynamic. In its original photographic format, the Karolinska Directed Emotional Faces (KDEF) has been frequently utilized. In the current study, we validate a dynamic version of this database, the KDEF-dyn. To this end, we applied animation between neutral and emotional expressions (happy, sad, angry, fearful, disgusted, and surprised; 1,033-ms unfolding) to 40 KDEF models, with morphing software. Ninety-six human observers categorized the expressions of the resulting 240 video-clip stimuli, and automated face analysis assessed the evidence for 6 expressions and 20 facial action units (AUs) at 31 intensities. Low-level image properties (luminance, signal-to-noise ratio, etc.) and other purely perceptual factors (e.g., size, unfolding speed) were controlled. Human recognition performance (accuracy, efficiency, and confusions) patterns were consistent with prior research using static and other dynamic expressions. Automated assessment of expressions and AUs was sensitive to intensity manipulations. Significant correlations emerged between human observers’ categorization and automated classification. The KDEF-dyn database aims to provide a balance between experimental control and ecological validity for research on emotional facial expression processing. The stimuli and the validation data are available to the scientific community.
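The intensity sampling described in the KDEF-dyn abstract (31 intensity levels across a 1,033-ms unfolding from neutral to full emotion) can be illustrated as a simple linear schedule. This is a minimal sketch of such a sampling scheme, not the authors' actual morphing pipeline; the linear spacing and the `morph_schedule` function are assumptions for illustration only.

```python
def morph_schedule(n_levels=31, duration_ms=1033):
    """Return (time_ms, intensity) pairs for a linear morph unfolding.

    intensity 0.0 = neutral face, 1.0 = full emotional expression.
    Assumes evenly spaced intensity levels over the clip duration
    (a simplifying assumption, not the validated KDEF-dyn timing).
    """
    weights = [i / (n_levels - 1) for i in range(n_levels)]
    return [(round(w * duration_ms, 1), w) for w in weights]

schedule = morph_schedule()
# First sample is the neutral frame at t = 0 ms; the last is the
# full-intensity expression at t = 1033 ms.
```

Each intermediate weight would correspond to one blending step between the neutral and emotional photographs in the morphing software.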