Network


Latest external collaboration at the country level.

Hotspot


Dive into the research topics where Alejandra Sel is active.

Publication


Featured research published by Alejandra Sel.


The Journal of Neuroscience | 2014

The Emotional Homunculus: ERP Evidence for Independent Somatosensory Responses during Facial Emotional Processing

Alejandra Sel; Bettina Forster; Beatriz Calvo-Merino

Current models of face perception propose that initial visual processing is followed by activation of nonvisual somatosensory areas that contribute to emotion recognition. To test whether there is a pure and independent involvement of somatosensory cortex (SCx) during face processing over and above visual responses, we directly measured participants' somatosensory-evoked activity by tactually probing the state of SCx (105 ms after the visual facial stimuli) during an emotion discrimination task while controlling for visual effects. Discrimination of emotional versus neutral expressions enhanced early somatosensory-evoked activity between 40 and 80 ms after stimulus onset, suggesting visual emotion processing in SCx. This effect was source localized within primary, secondary, and associative somatosensory cortex. Emotional face processing influenced somatosensory responses to both face (congruent body part) and finger (control site) tactile stimulation, suggesting a general process that includes nonfacial cortical representations. Gender discrimination of the same facial expressions did not modulate somatosensory-evoked activity. We provide novel evidence that SCx activation is not a byproduct of visual processing but is independently shaped by face emotion processing.


PLOS ONE | 2012

How the emotional content of discourse affects language comprehension.

Laura Jiménez-Ortega; Manuel Martín-Loeches; Pilar Casado; Alejandra Sel; Sabela Fondevila; Annekathrin Schacht; Werner Sommer

Emotion effects on cognition have often been reported. However, only a few studies have investigated emotional effects on subsequent language processing, and in most cases these effects were induced by non-linguistic stimuli such as films, faces, or pictures. Here, we used event-related brain potentials (ERPs) to investigate how a paragraph of positive, negative, or neutral emotional valence affects the processing of a subsequent emotionally neutral sentence, which contained either a semantic violation, a syntactic violation, or no violation. Behavioral data revealed strong effects of emotion; error rates and reaction times increased significantly in sentences preceded by a positive paragraph relative to negative and neutral ones. In the ERPs, the N400 to semantic violations was not affected by emotion. In the syntactic experiment, however, clear emotion effects were observed on the ERPs. The left anterior negativity (LAN) to syntactic violations, which was not visible in the neutral condition, was present in the negative and positive conditions. This is interpreted as reflecting modulatory effects of prior emotions on syntactic processing, which is discussed in the light of three alternative or complementary explanations based on emotion-induced cognitive styles, working memory, and arousal models. The present effects of emotion on the LAN are especially remarkable considering that syntactic processing has often been regarded as encapsulated and autonomous.


Cerebral Cortex | 2016

Heartfelt Self: Cardio-Visual Integration Affects Self-Face Recognition and Interoceptive Cortical Processing

Alejandra Sel; Ruben T. Azevedo

The sense of body-ownership relies on the representation of both interoceptive and exteroceptive signals coming from one's body. However, it remains unknown how the integration of bodily signals coming from "outside" and "inside" the body is instantiated in the brain. Here, we used a modified version of the Enfacement Illusion to investigate whether the integration of visual and cardiac information can alter self-face recognition (Experiment 1) and neural responses to heartbeats (Experiment 2). We projected a pulsing shade that was synchronous or asynchronous with the participant's heartbeat onto a picture depicting the participant's face morphed with the face of an unfamiliar other. Results revealed that synchronous (vs. asynchronous) cardio-visual stimulation led to increased self-identification with the other's face (Experiment 1), while during stimulation, synchronicity modulated the amplitude of the Heartbeat Evoked Potential, an electrophysiological index of cortical interoceptive processing (Experiment 2). Importantly, the magnitude of the illusion-related effects was dependent on, and increased linearly with, the participants' Interoceptive Accuracy. These results provide the first direct neural evidence for the integration of interoceptive and exteroceptive signals in bodily self-awareness.


Frontiers in Psychology | 2014

Predictive codes of interoception, emotion, and the self

Alejandra Sel

Interoception is the ability to perceive and integrate physiological signals from within the body. It is closely related to the autonomic system and is a key component in the generation of affective states and abstract representations of the self (Critchley et al., 2004; Ainley and Tsakiris, 2013). Seth proposes a predictive coding (PC) model of interoception that involves a free-energy based explanation of emotion awareness and selfhood. In this model, emotions, and in turn the sense of self, rely on predictions of the causes of interoceptive signals. Within this framework, the interoceptive system minimizes free-energy, or the discrepancy between predictions and interoceptive signals. Free-energy can be minimized either by updating predictions about the causes of the sensory signals (perceptual updating), or by acting to change autonomic states such that bodily states are more predictable (active inference).
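
As a rough illustration of the free-energy idea summarized above (a sketch added for clarity, not drawn from the commentary itself; the symbols s, \mu, g and \Pi are illustrative), the quantity the interoceptive system is assumed to minimize can be written, under Gaussian assumptions, as a precision-weighted prediction error:

\[
  F \;\approx\; \tfrac{1}{2}\,\big(s - g(\mu)\big)^{\top}\,\Pi\,\big(s - g(\mu)\big)
\]

where s is the incoming interoceptive signal (e.g., afferent cardiac input), \mu is the brain's current estimate of its cause, g(\mu) is the signal predicted from that estimate, and \Pi is the precision (inverse variance) assigned to the interoceptive channel. Perceptual updating lowers F by revising \mu, whereas active inference lowers F by acting on autonomic states so that s itself moves toward g(\mu).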


Social Neuroscience | 2012

The sacred and the absurd – an electrophysiological study of counterintuitive ideas (at sentence level)

Sabela Fondevila; Manuel Martín-Loeches; Laura Jiménez-Ortega; Pilar Casado; Alejandra Sel; Anabel Fernández-Hernández; Werner Sommer

Religious beliefs are both catchy and durable: they exhibit a high degree of adherence to our cognitive system, as evidenced by their successful transmission and spread throughout history. A prominent explanation for religion's cultural success comes from the "MCI hypothesis," according to which religious beliefs are both easy to recall and desirable to transmit because they are minimally counterintuitive (MCI). This hypothesis has been empirically tested at the concept and narrative levels by recall measures. However, the neural correlates of MCI concepts remain poorly understood. We used the N400 component of the event-related brain potential as a measure of the counterintuitiveness of violations, comparing religious and non-religious sentences, both counterintuitive, when presented in isolation. Around 80% of the sentences in either condition were core-knowledge violations. We found smaller N400 amplitudes for religious as compared to non-religious counterintuitive ideas, suggesting that religious ideas are less semantically anomalous. Moreover, behavioral measures revealed that religious ideas are not readily detected as unacceptable. Finally, systematic analyses of our materials, according to conceptual features proposed in cognitive models of religion, did not reveal any outstanding variable significantly contributing to these differences. Refinements of cognitive models of religion should elucidate which combination of factors renders an anomaly less counterintuitive and thus more suitable for recall and transmission.


Social Cognitive and Affective Neuroscience | 2015

When you smile, the world smiles at you: ERP evidence for self-expression effects on face processing

Alejandra Sel; Beatriz Calvo-Merino; Simone Tuettenberg; Bettina Forster

Current models of emotion simulation propose that intentionally posing a facial expression can change one's subjective feelings, which in turn influences the processing of visual input. However, the underlying neural mechanism whereby one's facial emotion modulates the visual cortical responses to others' facial expressions remains unknown. To understand how one's facial expression affects visual processing, we measured participants' visual evoked potentials (VEPs) during a facial emotion judgment task with positive and neutral faces. To control for the effects of facial muscles on VEPs, we asked participants to smile (adopting an expression of happiness), to purse their lips (incompatible with smiling) or to pose with a neutral face, in separate blocks. Results showed that the smiling expression modulates face-specific visual processing components (N170/vertex positive potential) while watching others' facial expressions. Specifically, when making a happy expression, neutral faces are processed similarly to happy faces. When making a neutral expression or pursing the lips, however, responses to neutral and happy faces are significantly different. This effect was source localized within multisensory associative areas, angular gyrus, associative visual cortex and somatosensory cortex. We provide novel evidence that one's own emotional expression acts as a top-down influence modulating low-level neural encoding during facial perception.


PLOS ONE | 2009

Encouraging Expressions Affect the Brain and Alter Visual Attention

Manuel Martín-Loeches; Alejandra Sel; Pilar Casado; Laura Jiménez; Luis R. Castellanos

Background: Very often, encouraging or discouraging expressions are used in competitive contexts, such as sports practice, aiming to provoke an emotional reaction in the listener and, consequently, an effect on subsequent cognition and/or performance. However, the actual efficiency of these expressions has not been tested scientifically. Methodology/Principal Findings: To fill this gap, we studied the effects of encouraging, discouraging, and neutral expressions on event-related brain electrical activity during a visual selective attention task in which targets were determined by location, shape, and color. Although the expressions preceded the attentional task, both encouraging and discouraging messages elicited a similar long-lasting brain emotional response that was present during the visuospatial task. In addition, encouraging expressions were able to alter the customary working pattern of the visual attention system for shape selection in the attended location, increasing the P1 and SP modulations while the SN simultaneously faded away. Conclusions/Significance: This was interpreted as an enhancement of the attentional processes for shape in the attended location after an encouraging expression. It can be stated, therefore, that encouraging expressions, such as those used in sports practice as well as in many other contexts and situations, do seem to be effective in eliciting emotional reactions and measurable effects on cognition.


PLOS ONE | 2010

How Is Sentence Processing Affected by External Semantic and Syntactic Information? Evidence from Event-Related Potentials

Annekathrin Schacht; Manuel Martín-Loeches; Pilar Casado; Alejandra Sel; Werner Sommer

Background: A crucial question for understanding sentence comprehension is the openness of syntactic and semantic processes to other sources of information. Using event-related potentials in a dual task paradigm, we had previously found that sentence processing takes into consideration task-relevant sentence-external semantic but not syntactic information. In that study, internal and external information both varied within the same linguistic domain, either semantic or syntactic. Here we investigated whether across-domain sentence-external information would impact within-sentence processing. Methodology: In one condition, adjectives within visually presented sentences of the structure [Det]-[Noun]-[Adjective]-[Verb] were semantically correct or incorrect. Simultaneously with the noun, auditory adjectives were presented that morphosyntactically matched or mismatched the visual adjectives with respect to gender. Findings: As expected, semantic violations within the sentence elicited N400 and P600 components in the ERP. However, these components were not modulated by syntactic matching of the sentence-external auditory adjective. In a second condition, syntactic within-sentence correctness variations were combined with semantic matching variations between the auditory and the visual adjective. Here, syntactic within-sentence violations elicited a LAN and a P600 that did not interact with semantic matching of the auditory adjective. However, semantic mismatching of the latter elicited a frontocentral positivity, presumably related to an increase in discourse-level complexity. Conclusion: The current findings underscore the open versus algorithmic nature of semantic and syntactic processing, respectively, during sentence comprehension.


NeuroImage | 2016

Electrophysiological correlates of self-specific prediction errors in the human brain

Alejandra Sel; Rachel Harding

Recognising one's self, vs. others, is a key component of self-awareness, crucial for social interactions. Here we investigated whether processing self-face and self-body images can be explained by the brain's prediction of sensory events, based on regularities in the given context. We measured evoked cortical responses while participants observed alternating sequences of self-face or other-face images (experiment 1) and self-body or other-body images (experiment 2), which were embedded in an identity-irrelevant task. In experiment 1, the expected sequences were violated by deviant morphed images, which contained 33%, 66% or 100% of the self-face when the other's face was expected (and vice versa). In experiment 2, the anticipated sequences were violated by deviant images of the self when the other's image was expected (and vice versa), or by two deviant images composed of pictures of the self-face attached to the other's body, or the other's face attached to the self-body. This manipulation allowed control of the prediction error associated with the self or the other's image. Deviant self-images (but not deviant images of the other) elicited a visual mismatch response (vMMR), a cortical index of violations of regularity. This was source localised to face- and body-related visual, sensorimotor and limbic areas and had an amplitude proportional to the amount of deviance from the self-image. We provide novel evidence that self-processing can be described by the brain's prediction error system, which accounts for self-bias in visual processing. These findings are discussed in the light of recent predictive coding models of self-processing.


Human Brain Mapping | 2018

Affective interoceptive inference: Evidence from heart-beat evoked brain potentials

Antje Gentsch; Alejandra Sel; Amanda C. Marshall; Simone Schütz-Bosbach

The perception of internal bodily signals (interoception) is central to many theories of emotion and embodied cognition. According to recent theoretical views, the sensory processing of visceral signals such as one's own heartbeat is determined by top-down predictions about the expected interoceptive state of the body (interoceptive inference). In this EEG study we examined neural responses to heartbeats following expected and unexpected emotional stimuli. We used a modified stimulus repetition task in which pairs of facial expressions were presented with repeating or alternating emotional content, and we manipulated the emotional valence and the likelihood of stimulus repetition. We found that affective predictions of external socially relevant information modulated the heartbeat-evoked potential (HEP), a marker of cardiac interoception. Crucially, the HEP changes depended strongly on the expected emotional content of the facial expression. Thus, expected negative faces led to a decreased HEP amplitude, whereas no such effect was observed after an expected neutral face. These results suggest that valence-specific affective predictions, and their uniquely associated predicted bodily sensory state, can reduce or amplify cardiac interoceptive responses. In addition, the affective repetition effects were dependent on repetition probability, highlighting the influence of top-down exteroceptive predictions on interoception. Our results are in line with recent models of interoception supporting the idea that predicted bodily states influence sensory processing of salient external information.

Collaboration


Dive into Alejandra Sel's collaborations.

Top Co-Authors

Manuel Martín-Loeches

Complutense University of Madrid

Pilar Casado

Complutense University of Madrid

Werner Sommer

Humboldt University of Berlin

Laura Jiménez-Ortega

Complutense University of Madrid
