Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Ana Tajadura-Jiménez is active.

Publication


Featured research published by Ana Tajadura-Jiménez.


Consciousness and Cognition | 2013

Bodily ownership and self-location: Components of bodily self-consciousness

Andrea Serino; Adrian Alsmith; Marcello Costantini; Alisa Mandrigin; Ana Tajadura-Jiménez; Christophe Lopez

Recent research on bodily self-consciousness has assumed that it consists of three distinct components: the experience of owning a body (body ownership); the experience of being a body with a given location within the environment (self-location); and the experience of taking a first-person, body-centered, perspective on that environment (perspective). Here we review recent neuroimaging studies suggesting that at least two of these components, body ownership and self-location, are implemented in rather distinct neural substrates, located, respectively, in the premotor cortex and in the temporo-parietal junction. We examine these results and consider them in relation to clinical evidence from patients with altered body perception and work on a variety of multisensory, body-related illusions, such as the rubber hand illusion, the full body illusion, the body swap illusion and the enfacement illusion. We conclude by providing a preliminary synthesis of the data on bodily self-consciousness and its neural correlates.


PLOS ONE | 2012

The other in me: interpersonal multisensory stimulation changes the mental representation of the self.

Ana Tajadura-Jiménez; Stephanie Grehl

Background: Recent studies have shown that the well-known effect of multisensory stimulation on body-awareness can be extended to self-recognition. Seeing someone else’s face being touched at the same time as one’s own face elicits changes in the mental representation of the self-face. We sought to further elucidate the underlying mechanisms and the effects of interpersonal multisensory stimulation (IMS) on the mental representation of the self and others.

Methodology/Principal Findings: Participants saw an unfamiliar face being touched synchronously or asynchronously with their own face, as if they were looking in the mirror. Following synchronous, but not asynchronous, IMS, participants assimilated features of the other’s face in the mental representation of their own face as evidenced by the change in the point of subjective equality for morphed pictures of the two faces. Interestingly, synchronous IMS resulted in a unidirectional change in the self-other distinction, affecting recognition of one’s own face, but not recognition of the other’s face. The participants’ autonomic responses to objects approaching the other’s face were higher following synchronous than asynchronous IMS, but this increase was not specific to the pattern of IMS in interaction with the viewed object. Finally, synchronous, as compared to asynchronous, IMS resulted in significant differences in participants’ ratings of their experience, but unlike other bodily illusions, positive changes in subjective experience were related to the perceived physical similarity between the two faces, and not to identification.

Conclusions/Significance: Synchronous IMS produces quantifiable changes in the mental representations of one’s face, as measured behaviorally. Changes in autonomic responses and in the subjective experience of self-identification were broadly consistent with patterns observed in other bodily illusions, but less robust. Overall, shared multisensory experiences between self and other can change the mental representation of one’s identity, and the perceived similarity of others relative to one’s self.


Current Biology | 2012

Action sounds recalibrate perceived tactile distance

Ana Tajadura-Jiménez; Aleksander Väljamäe; Iwaki Toshima; Toshitaka Kimura; Norimichi Kitagawa

Almost every bodily movement, from the most complex to the most mundane, such as walking, can generate impact sounds that contain 360° spatial information of high temporal resolution. Given the strong connection of auditory cues to our body actions, and the dependency of body-awareness on the interaction between peripheral sensory inputs and mental body-representations, one could assume that audition plays a specific role in this interaction. Despite the conclusive evidence for the role that the integration of vision, touch and proprioception plays in updating body-representations [1,2], hardly any study has looked at the contribution of audition. We show that the representation of a key property of one’s body, such as its length, is affected by the sound of one’s actions. Participants tapped on a surface while progressively extending their right arm sideways, and in synchrony with each tap participants listened to a tapping sound. In the critical condition, the sound originated at double the distance at which participants actually tapped. After exposure to this condition, tactile distances on the test right arm, as compared to distances on the reference left arm, felt bigger than those before the exposure. No evidence of changes in tactile distance reports was found in the quadruple tapping sound distance or the asynchronous auditory feedback conditions. Our results suggest that tactile perception is referenced to an implicit body-representation which is informed by auditory feedback. This is the first evidence of the contribution of self-produced sounds to body-representation, addressing the auditory-dependent plasticity of body-representation and its spatial boundaries.


Consciousness and Cognition | 2012

The person in the mirror: using the enfacement illusion to investigate the experiential structure of self-identification

Ana Tajadura-Jiménez; Matthew R. Longo; Rosie Coleman

How do we acquire a mental representation of our own face? Recently, synchronous, but not asynchronous, interpersonal multisensory stimulation (IMS) between one’s own and another person’s face has been used to evoke changes in self-identification (enfacement illusion). We investigated the conscious experience of these changes with principal component analyses (PCA), which revealed that while the conscious experience during synchronous IMS focused on resemblance and similarity with the other’s face, during asynchronous IMS it focused on multisensory stimulation. Analyses of the identified common factor structure revealed significant quantitative differences between synchronous and asynchronous IMS on self-identification and perceived similarity with the other’s face. Experiment 2 revealed that participants with lower interoceptive sensitivity experienced a stronger enfacement illusion. Overall, self-identification and body-ownership rely on similar basic mechanisms of multisensory integration, but the effects of multisensory input on their experience are qualitatively different, possibly underlying the face’s unique role as a marker of selfhood.


Emotion | 2010

Embodied auditory perception: The emotional impact of approaching and receding sound sources.

Ana Tajadura-Jiménez; Aleksander Väljamäe; Erkin Asutay; Daniel Västfjäll

Research has shown the existence of perceptual and neural bias toward sound sources perceived as approaching versus receding from a listener. It has been suggested that a greater biological salience of approaching auditory sources may account for these effects. In addition, these effects may hold only for those sources critical for our survival. In the present study, we bring support to these hypotheses by quantifying the emotional responses to different sounds with changing intensity patterns. In 2 experiments, participants were exposed to artificial and natural sounds simulating approaching or receding sources. The auditory-induced emotional effect was reflected in the performance of participants in an emotion-related behavioral task, their self-reported emotional experience, and their physiology (electrodermal activity and facial electromyography). The results of this study suggest that approaching unpleasant sound sources evoke more intense emotional responses in listeners than receding ones, whereas such an effect of perceived sound motion does not exist for pleasant or neutral sound sources. The emotional significance attributed to the sound source itself, the loudness of the sound, and loudness change duration seem to be relevant factors in this disparity.


Journal of Experimental Psychology: General | 2014

Balancing the "inner" and the "outer" self: interoceptive sensitivity modulates self-other boundaries.

Ana Tajadura-Jiménez

Distinguishing self from other is necessary for self-awareness and social interactions. This distinction is thought to depend on multisensory integration dominated by visual feedback. However, self-awareness also relies on the processing of interoceptive signals. We contrasted the exteroceptive and interoceptive models of the self to investigate the hitherto unexplored interaction between the perception of the self from the outside and from within. Multisensory stimulation between self and other was used to induce controlled changes in the representation of one’s identity. Interoceptive sensitivity predicted the malleability of self-representations in response to multisensory integration across behavioral, physiological, and introspective responses, suggesting that interoception plays a key modulating role in the self-recognition system. In particular, only participants with low interoceptive sensitivity experienced changes in self-other boundaries in response to multisensory stimulation. These results support the view that interoceptive predictive coding models are used to monitor and assign the sources of sensory input either to the self or to others, and support the hypothesis of the insular cortex as a convergence zone in the processing and global representation of the material self, given its involvement in interoceptive feelings, multisensory integration, and self-processing.


Human Factors in Computing Systems | 2014

Motivating people with chronic pain to do physical activity: opportunities for technology design

Aneesha Singh; Annina Klapper; Jinni Jia; Antonio Rei Fidalgo; Ana Tajadura-Jiménez; Natalie Kanakam; Nadia Bianchi-Berthouze; Amanda C. de C. Williams

Physical activity is important for improving quality of life in people with chronic pain. However, actual or anticipated pain exacerbation, and lack of confidence when doing physical activity, make it difficult to maintain and build towards long-term activity goals. Research guiding the design of interactive technology to motivate and support physical activity in people with chronic pain is lacking. We conducted studies with: (1) people with chronic pain, to understand how they maintained and increased physical activity in daily life and what factors deterred them; and (2) pain-specialist physiotherapists, to understand how they supported people with chronic pain. Building on this understanding, we investigated the use of auditory feedback to address some of the psychological barriers and needs identified and to increase self-efficacy, motivation and confidence in physical activity. We conclude by discussing further design opportunities based on the overall findings.


Neuropsychologia | 2009

Auditory-somatosensory multisensory interactions are spatially modulated by stimulated body surface and acoustic spectra

Ana Tajadura-Jiménez; Norimichi Kitagawa; Aleksander Väljamäe; Massimiliano Zampini; Micah M. Murray; Charles Spence

Previous research has provided inconsistent results regarding the spatial modulation of auditory-somatosensory interactions. The present study reports three experiments designed to investigate the nature of these interactions in the space close to the head. Human participants made speeded detection responses to unimodal auditory, somatosensory, or simultaneous auditory-somatosensory stimuli. In Experiment 1, electrocutaneous stimuli were presented to either earlobe, while auditory stimuli were presented from the same versus opposite sides, and from one of two distances (20 vs. 70 cm) from the participants head. The results demonstrated a spatial modulation of auditory-somatosensory interactions when auditory stimuli were presented from close to the head. In Experiment 2, electrocutaneous stimuli were delivered to the hands, which were placed either close to or far from the head, while the auditory stimuli were again presented at one of two distances. The results revealed that the spatial modulation observed in Experiment 1 was specific to the particular body part stimulated (head) rather than to the region of space (i.e. around the head) where the stimuli were presented. The results of Experiment 3 demonstrate that sounds that contain high-frequency components are particularly effective in eliciting this auditory-somatosensory spatial effect. Taken together, these findings help to resolve inconsistencies in the previous literature and suggest that auditory-somatosensory multisensory integration is modulated by the stimulated body surface and acoustic spectra of the stimuli presented.


Human Factors in Computing Systems | 2015

As Light as your Footsteps: Altering Walking Sounds to Change Perceived Body Weight, Emotional State and Gait

Ana Tajadura-Jiménez; Maria Basia; Ophelia Deroy; Merle T. Fairhurst; Nicolai Marquardt; Nadia Bianchi-Berthouze

An ever more sedentary lifestyle is a serious problem in our society. Enhancing people’s exercise adherence through technology remains an important research challenge. We propose a novel approach for a system supporting walking that draws from basic findings in neuroscience research. Our shoe-based prototype senses a person’s footsteps and alters in real-time the frequency spectra of the sound they produce while walking. The resulting sounds are consistent with those produced by either a lighter or heavier body. Our user study showed that modified walking sounds change one’s own perceived body weight and lead to a related gait pattern. In particular, augmenting the high frequencies of the sound leads to the perception of having a thinner body and enhances the motivation for physical activity, inducing a more dynamic swing and a shorter heel strike. We here discuss the opportunities and the questions our findings open.


Cerebral Cortex | 2015

Plasticity in Unimodal and Multimodal Brain Areas Reflects Multisensory Changes in Self-Face Identification

Matthew A. J. Apps; Ana Tajadura-Jiménez; Marty Sereno; Olaf Blanke

Nothing provides as strong a sense of self as seeing one’s face. Nevertheless, it remains unknown how the brain processes the sense of self during the multisensory experience of looking at one’s face in a mirror. Synchronized visuo-tactile stimulation on one’s own and another’s face, an experience that is akin to looking in the mirror but seeing another’s face, causes the illusory experience of ownership over the other person’s face and changes in self-recognition. Here, we investigate the neural correlates of this enfacement illusion using fMRI. We examine activity in the human brain as participants experience tactile stimulation delivered to their face, while observing either temporally synchronous or asynchronous tactile stimulation delivered to another’s face on either a specularly congruent or incongruent location. Activity in the multisensory right temporo-parietal junction, intraparietal sulcus, and the unimodal inferior occipital gyrus showed an interaction between the synchronicity and the congruency of the stimulation and varied with the self-reported strength of the illusory experience, which was recorded after each stimulation block. Our results highlight the important interplay between unimodal and multimodal information processing for self-face recognition, and elucidate the neurobiological basis for the plasticity required for identifying with our continuously changing visual appearance.

Collaboration


Dive into Ana Tajadura-Jiménez's collaborations.

Top Co-Authors
Aleksander Väljamäe

Chalmers University of Technology


Norimichi Kitagawa

Nippon Telegraph and Telephone


Mendel Kleiner

Chalmers University of Technology


Penny Bergman

Chalmers University of Technology


Ophelia Deroy

School of Advanced Study
