Alla Yankouskaya
University of Oxford
Publications
Featured research published by Alla Yankouskaya.
Attention, Perception, & Psychophysics | 2012
Alla Yankouskaya; David A. Booth; Glyn W. Humphreys
Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247–279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that emotion and facial identity interact during face processing.
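The race-model test cited above (Miller, 1982) has a simple computational form: under independent processing, the cumulative RT distribution for redundant targets can never exceed the sum of the two single-target distributions at any time point. A minimal sketch of this inequality test on synthetic reaction times (the data and function name are illustrative, not taken from the study):

```python
import numpy as np

def miller_inequality_violated(rt_redundant, rt_single_a, rt_single_b, t_grid):
    """Check Miller's (1982) race-model inequality:
    P(RT < t | redundant) <= P(RT < t | A) + P(RT < t | B).
    Returns True if the redundant-target CDF exceeds the race-model
    bound at any probed time point, i.e., evidence for coactivation."""
    def ecdf(samples, t):
        # Empirical CDF of the RT sample evaluated at each point of t
        return np.mean(np.asarray(samples)[:, None] < t, axis=0)
    bound = np.minimum(ecdf(rt_single_a, t_grid) + ecdf(rt_single_b, t_grid), 1.0)
    return bool(np.any(ecdf(rt_redundant, t_grid) > bound))

# Illustrative RTs (ms): redundant targets markedly faster than either single target.
rng = np.random.default_rng(0)
rt_a = rng.normal(600, 50, 200)
rt_b = rng.normal(610, 50, 200)
rt_red = rng.normal(480, 40, 200)   # faster than independent racing channels allow
t_grid = np.arange(300, 800, 10)
print(miller_inequality_violated(rt_red, rt_a, rt_b, t_grid))
```

A violation at any time point rules out the whole class of independent race models, which is why the test probes the full RT distributions rather than mean RTs.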
Quarterly Journal of Experimental Psychology | 2017
Moritz Stolte; Glyn W. Humphreys; Alla Yankouskaya; Jie Sui
We examined whether self-biases in perceptual matching reflect the positive valence of self-related stimuli. Participants associated geometric shapes with either personal labels (e.g., you, friend, stranger) or faces with different emotional expressions (e.g., happy, neutral, sad). They then judged whether shape–label or shape–face pairs were as originally shown or re-paired. Match times were faster to self-associated stimuli and to stimuli associated with the most positive valence. In addition, both the self-bias and the positive emotion bias were reliable across individuals in different test sessions. In contrast, there was no sign of a correlation between the self-bias and the emotion-bias effects. We argue that the self-bias and the bias to stimuli linked to positive emotion are separate and may reflect different underlying processes.
Journal of Experimental Psychology: Learning, Memory, and Cognition | 2014
Alla Yankouskaya; Glyn W. Humphreys; Pia Rotshtein
We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and measures of processing capacity for the different types of stimuli. There was evidence for coactive processing of identity and emotion that was linked to super capacity for own-race but not for other-race faces. In addition, the size of the redundancy gain for other-race faces varied with the amount of social contact participants had with individuals from the other race. The data demonstrate qualitative differences in the processing of facial identity and emotion cues in own and other races. The results also demonstrate that the level of integration of identity and emotion cues in faces may be determined by life experience and exposure to individuals of different ethnicities.
Frontiers in Human Neuroscience | 2014
Alla Yankouskaya; Glyn W. Humphreys; Pia Rotshtein
Facial identity and emotional expression are two important sources of information for daily social interaction. However, the link between these two aspects of face processing has been the focus of an unresolved debate for the past three decades. Three views have been advocated: (1) separate and parallel processing of identity and emotional expression signals derived from faces; (2) asymmetric processing, with the computation of emotion in faces depending on facial identity coding but not vice versa; and (3) integrated processing of facial identity and emotion. We present studies with healthy participants that primarily apply methods from mathematical psychology, formally testing the relations between the processing of facial identity and emotion. Specifically, we focus on the “Garner” paradigm, the composite face effect, and divided attention tasks. We further ask whether the architecture of face-related processes is fixed or flexible and whether (and how) it can be shaped by experience. We conclude that formal methods of testing the relations between processes show that the processing of facial identity and expression interact, and hence are not fully independent. We further demonstrate that the architecture of these relations depends on experience, with greater experience leading to a higher degree of inter-dependence between the processing of identity and expression. We propose that this change occurs because integrative processing is more efficient than parallel processing. Finally, we argue that the dynamic aspects of face processing need to be incorporated into theories in this field.
Journal of Aging Research | 2014
Alla Yankouskaya; Pia Rotshtein; Glyn W. Humphreys
We tested how aging affects the integration of visual information from faces. Three groups of participants aged 20–30, 40–50, and 60–70 performed a divided attention task in which they had to detect the presence of a target facial identity or a target facial expression. Three target stimuli were used: (1) with the target identity but not the target expression, (2) with the target expression but not the target identity, and (3) with both the target identity and target expression (the redundant target condition). On nontarget trials the faces contained neither the target identity nor expression. All groups were faster in responding to a face containing both the target identity and emotion than to faces containing either single target. Furthermore, the redundancy gains for combined targets exceeded performance limits predicted by the independent processing of facial identity and emotion. These results held across the age range. The results suggest that there is interactive processing of facial identity and emotion which is independent of the effects of cognitive aging. Older participants demonstrated reliably larger redundancy gains than the young group, which may reflect greater experience with faces. Alternative explanations are discussed.
Visual Cognition | 2015
Zargol Moradi; Alla Yankouskaya; Mihaela Duta; Miles Hewstone; Glyn W. Humphreys
There is ample evidence showing that decision times are shorter when detecting two targets for which the same response is required, compared to when only one of the targets is present—resulting in a redundancy gain. Though effects of perceptual manipulations on redundancy gains are established, effects of social associations are still unclear. Here, we examined for the first time whether associating arbitrary stimuli with in-group as opposed to out-group targets modulates redundancy gains. Participants made associations between a shape, a colour, and either in- or out-group labels. They then had to discriminate whether in- or out-group stimuli appeared (single or redundant features). Responses to in-group but not to out-group stimuli violated predictions of models in which the associated features are processed independently, and were consistent with in-group stimuli being processed with super-capacity. Our results, replicated across two experiments, provide the first evidence that there is enhanced perceptual integration for information associated with an in-group.
Social Cognitive and Affective Neuroscience | 2017
Alla Yankouskaya; Glyn W. Humphreys; Moritz Stolte; Mark G. Stokes; Zahra Zargol Moradi; Jie Sui
Although theoretical discourse and experimental studies on self- and reward-biases have a long tradition, currently we have only a limited understanding of how the biases are represented in the brain and, more importantly, how they relate to each other. We used multi-voxel pattern analysis to test for common representations of self and reward in perceptual matching in healthy human subjects. Voxels across an anterior–posterior axis in ventromedial prefrontal cortex (vmPFC) distinguished (i) self–others and (ii) high–low reward, but cross-generalization between these dimensions decreased from anterior to posterior vmPFC. The vmPFC is characterized by a shift from a common currency for value to independent, distributed representations of self and reward across an anterior–posterior axis. This shift reflected changes in functional connectivity between the posterior part of the vmPFC and the frontal pole when processing self-associated stimuli, and the middle frontal gyrus when processing stimuli associated with high reward. The changes in functional connectivity were correlated with behavioral biases, respectively, to the self and reward. The distinct representations of self and reward in the posterior vmPFC are associated with self- and reward-biases in behavior.
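Cross-generalization of the kind described here is typically assessed by training a classifier on one dimension (self vs. other) and testing it on the other (high vs. low reward): above-chance transfer implies a shared neural code. A schematic sketch with scikit-learn on synthetic voxel patterns (the variable names and simulated data are illustrative, not the study's actual pipeline):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_trials, n_voxels = 100, 50

# Synthetic voxel patterns: one shared "value" axis drives both dimensions,
# mimicking a common-currency code as described for anterior vmPFC.
shared_axis = rng.normal(size=n_voxels)
self_labels = rng.integers(0, 2, n_trials)     # 1 = self, 0 = other
reward_labels = rng.integers(0, 2, n_trials)   # 1 = high, 0 = low
X_self = rng.normal(size=(n_trials, n_voxels)) + np.outer(self_labels, shared_axis)
X_reward = rng.normal(size=(n_trials, n_voxels)) + np.outer(reward_labels, shared_axis)

# Train on self vs. other, test on high vs. low reward: accuracy above
# chance (0.5) indicates a representation shared across the two dimensions.
clf = LogisticRegression(max_iter=1000).fit(X_self, self_labels)
cross_acc = clf.score(X_reward, reward_labels)
print(f"cross-generalization accuracy: {cross_acc:.2f}")
```

If the two dimensions were coded by independent voxel patterns instead of a shared axis (as reported for posterior vmPFC), the same transfer test would fall to chance.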
Quarterly Journal of Experimental Psychology | 2017
Alla Yankouskaya; Diahann Palmer; Moritz Stolte; Jie Sui; Glyn W. Humphreys
We present novel data on the role of attention in eliciting enhanced processing of stimuli associated with self. Participants were required to make pro- or anti-saccades according to whether learned shape–label pairings matched or mismatched. When stimuli matched, participants were required to make an anti-saccade, and when the stimuli mismatched, a pro-saccade was required. We found that anti-saccades were difficult to make to stimuli associated with self when compared to stimuli associated with a friend and a stranger. In contrast, anti-saccades to friend-stimuli were easier to make than anti-saccades to stranger-stimuli. In addition, a correct anti-saccade to a self-associated stimulus disrupted subsequent pro-saccade trials, relative to when the preceding anti-saccade was made to other stimuli. The data indicate that self-associated stimuli provide a strong cue for explicit shifts of attention to them, and that correct anti-saccades to such stimuli demand high levels of inhibition (which carries over to subsequent pro-saccade trials). The self exerts an automatic draw on attention.
Brain and Cognition | 2017
Alla Yankouskaya; Moritz Stolte; Zargol Moradi; Pia Rotshtein; Glyn W. Humphreys
Separate neural systems have been implicated in the recognition of facial identity and emotional expression. A growing number of studies now provide evidence against this modular view by demonstrating that integration of identity and emotion information enhances face processing. Yet, the neural mechanisms that shape this integration remain largely unknown. We hypothesize that the presence of both personal and emotional expression target information triggers changes in functional connectivity between frontal and extrastriate areas in the brain. We report and discuss three important findings. First, the presence of target identity and emotional expression in the same face was associated with super capacity and violations of the independent processing of identity and expression cues. Second, activity in the orbitofrontal cortex (OFC) was associated with the presence of redundant targets and changes in functional connectivity between a particular region of the right OFC (BA11/47) and bilateral visual brain regions (the inferior occipital gyrus (IOG)). Third, these changes in connectivity showed a strong link to behavioural measures of processing capacity. We suggest that the changes in functional connectivity between the right OFC and IOG reduce variability of BOLD responses in the IOG, enhancing integration of identity and emotional expression cues in faces. Highlights: Facilitation effects in faces are modulated by higher cognitive functions. Increasing functional connectivity supports the interaction between identity and emotion. The OFC plays a crucial role in integrating identity and emotional cues in faces.
Systems Factorial Technology: A Theory-Driven Methodology for the Identification of Perceptual and Cognitive Mechanisms | 2017
Alla Yankouskaya; Jie Sui; Zargol Moradi; Pia Rotshtein; Glyn W. Humphreys
We review three studies in which we investigated the effects of social factors (race, in-group, and self-biases) on perceptual processes using capacity analysis. Specifically, we demonstrate how measures of processing efficiency can be used to quantify the effects of social and motivational biases (i.e., race bias, in-group bias, self-bias, and monetary reward bias) on visual perception. In contrast to previous studies, in which capacity measures were employed within the double factorial experimental design, these three studies of social biases provide a new application of the capacity framework by combining the divided attention task with a recently developed associative learning task. We found that social biases enhance the integration of information by modulating perceptual processing, and that these modulatory effects reflect increases in processing efficiency. We suggest that increased processing efficiency can be sourced from: (i) learned configural properties of perceptual objects (such as facial configuration), (ii) stronger perceptual and conceptual representations for objects associated with self or high reward, and (iii) currently salient social categorization (e.g., team membership). Future directions in applying the capacity framework to issues in social cognition are discussed.
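The capacity analysis used in this framework compares the redundant (double-target) condition against a baseline of independent parallel channels via the capacity coefficient C(t) = log S_AB(t) / [log S_A(t) + log S_B(t)], where S is the survivor function of the RT distribution; C(t) > 1 indicates super-capacity. A minimal sketch on synthetic reaction times (the data and function name are illustrative):

```python
import numpy as np

def capacity_coefficient(rt_ab, rt_a, rt_b, t_grid):
    """Capacity coefficient C(t) = log S_AB(t) / (log S_A(t) + log S_B(t)),
    with S the empirical survivor function. C(t) > 1 indicates
    super-capacity (more efficient than independent parallel channels)."""
    def survivor(samples, t):
        # Empirical survivor function P(RT > t) at each point of t
        return np.mean(np.asarray(samples)[:, None] > t, axis=0)
    s_ab, s_a, s_b = (survivor(x, t_grid) for x in (rt_ab, rt_a, rt_b))
    # Keep only time points where all survivor functions lie strictly in
    # (0, 1), so the logs are finite and the ratio is defined.
    valid = ((s_ab > 0) & (s_ab < 1) & (s_a > 0) & (s_a < 1)
             & (s_b > 0) & (s_b < 1))
    c = np.log(s_ab[valid]) / (np.log(s_a[valid]) + np.log(s_b[valid]))
    return c, t_grid[valid]

# Illustrative RTs (ms): a markedly faster redundant condition.
rng = np.random.default_rng(1)
rt_a = rng.normal(600, 50, 300)
rt_b = rng.normal(600, 50, 300)
rt_ab = rng.normal(470, 40, 300)
c, t = capacity_coefficient(rt_ab, rt_a, rt_b, np.arange(400, 700, 10))
print("super-capacity at some t:", bool(np.any(c > 1)))
```

C(t) ≈ 1 corresponds to unlimited-capacity independent parallel processing, so sustained values above 1, as reported for own-race, in-group, and self-associated stimuli, mark integration beyond what independent channels can produce.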