Maria Alessandra Umiltà
University of Parma
Publications
Featured research published by Maria Alessandra Umiltà.
Experimental Brain Research | 2010
Magali J. Rochat; Fausto Caruana; Ahmad Jezzini; Ludovic Escola; Irakli Intskirveli; Franck Grammont; Vittorio Gallese; Giacomo Rizzolatti; Maria Alessandra Umiltà
Mirror neurons are a distinct class of neurons that discharge both during the execution of a motor act and during observation of the same or similar motor act performed by another individual. However, the extent to which mirror neurons coding a motor act with a specific goal (e.g., grasping) might also respond to the observation of a motor act having the same goal, but achieved with artificial effectors, is not yet established. In the present study, we addressed this issue by recording mirror neurons from the ventral premotor cortex (area F5) of two monkeys trained to grasp objects with pliers. Neuron activity was recorded during the observation and execution of grasping performed with the hand, with pliers, and during observation of an experimenter spearing food with a stick. The results showed that virtually all neurons responding to the observation of hand grasping also responded to the observation of grasping with pliers, and many of them also to the observation of spearing with a stick. However, the intensity and pattern of the response differed among conditions. Hand grasping observation elicited the earliest and the strongest discharge, while pliers grasping and spearing observation triggered weaker responses at longer latencies. We conclude that F5 grasping mirror neurons respond to the observation of a family of stimuli leading to the same goal. However, the response pattern depends upon the similarity between the observed motor act and the one executed by the hand, the natural motor template.
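The condition-wise comparison of discharge intensity and response latency described above is typically built on peri-stimulus time histograms. The sketch below is a generic, minimal version of that kind of analysis, not the study's actual pipeline; the bin width, baseline window, threshold, and all variable names are illustrative assumptions.

```python
# Hedged sketch: peri-stimulus time histogram (PSTH) and a simple latency
# estimate for comparing response strength and timing across observation
# conditions. Parameters below are assumptions, not the study's settings.
import numpy as np

def psth(spike_times, onsets, window=(-0.5, 1.0), bin_s=0.02):
    """spike_times: spike times (s); onsets: stimulus-onset times (s), one per trial."""
    edges = np.arange(window[0], window[1] + bin_s, bin_s)
    counts = np.zeros(len(edges) - 1)
    for t0 in onsets:
        rel = spike_times[(spike_times >= t0 + window[0]) & (spike_times < t0 + window[1])] - t0
        counts += np.histogram(rel, bins=edges)[0]
    rate = counts / (len(onsets) * bin_s)      # mean firing rate in spikes/s
    return rate, edges[:-1]

def response_latency(rate, bins, baseline_end=0.0, n_sd=2.0):
    """First post-onset bin exceeding baseline mean + n_sd * SD (None if no response)."""
    base = rate[bins < baseline_end]
    above = (bins >= 0) & (rate > base.mean() + n_sd * base.std())
    return bins[above][0] if above.any() else None
```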
PLOS ONE | 2010
Thierry Chaminade; Massimiliano Zecca; Sarah-Jayne Blakemore; Atsuo Takanishi; Chris Frith; Silvestro Micera; Paolo Dario; Giacomo Rizzolatti; Vittorio Gallese; Maria Alessandra Umiltà
Background The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet we might utilize different neural processes than those used for reading the emotions in human agents. Methodology Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expression of Anger, Joy, Disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted. Principal Findings Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, like left Broca's area for the perception of speech, and in areas involved in the processing of emotions, like the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased the response to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance. Conclusions Motor resonance towards a humanoid robot's, but not a human's, display of facial emotion is increased when attention is directed towards judging emotions. Significance Artificial agents can be used to assess how factors like anthropomorphism affect the neural response to the perception of human actions.
Experimental Brain Research | 2010
Alena Streltsova; Cristina Berchio; Vittorio Gallese; Maria Alessandra Umiltà
The main aim of the present study was to explore, by means of high-density EEG, the intensity and the temporal pattern of event-related sensory-motor alpha desynchronization (ERD) during the observation of different types of hand motor acts and gestures. In particular, we aimed to investigate whether the sensory-motor ERD would show a specific modulation during the observation of hand behaviors differing in goal-relatedness (hand grasping of an object and meaningless hand movements) and social relevance (communicative hand gestures and grasping within a social context). Time course analysis of alpha suppression showed that all types of hand behaviors were effective in triggering sensory-motor alpha ERD, but to a different degree depending on the category of observed hand motor acts and gestures. Meaningless gestures and hand grasping were the most effective stimuli, resulting in the strongest ERD. The observation of social hand behaviors, such as social grasping and communicative gestures, triggered a more dynamic time course of ERD compared to that driven by the observation of simple grasping and meaningless gestures. These findings indicate that the observation of hand motor acts and gestures evokes the activation of a motor resonance mechanism that differs on the basis of the goal-relatedness and the social relevance of the observed hand behavior.
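As a rough illustration of how event-related alpha desynchronization can be quantified from epoched EEG, the sketch below computes the percentage change of alpha-band power relative to a pre-stimulus baseline. It is a minimal sketch under assumed parameters (sampling rate, band limits, windows, array shapes), not the analysis pipeline used in the study.

```python
# Hedged sketch: alpha-band ERD from already-epoched EEG of one sensory-motor
# channel. Negative values indicate desynchronization. All parameters are
# illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_erd(epochs, fs=500.0, band=(8.0, 13.0),
              baseline=(-1.0, 0.0), window=(0.0, 1.0), t0=1.0):
    """epochs: (n_trials, n_samples); t0: stimulus onset (s) from epoch start."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=1)
    power = np.abs(hilbert(filtered, axis=1)) ** 2       # instantaneous alpha power
    t = np.arange(epochs.shape[1]) / fs - t0             # time axis in seconds
    base = power[:, (t >= baseline[0]) & (t < baseline[1])].mean()
    act = power[:, (t >= window[0]) & (t < window[1])].mean(axis=1)
    return 100.0 * (act - base) / base                   # % change per trial

# Example use: alpha_erd(epochs_grasping).mean() vs. alpha_erd(epochs_gesture).mean()
```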
Current Opinion in Neurobiology | 2007
Thomas Brochier; Maria Alessandra Umiltà
The skilled use of the hand for grasping and manipulation of objects is a fundamental feature of the primate motor system. Grasping movements involve transforming the visual information about an object into a motor command appropriate for the coordinated activation of hand and finger muscles. The cerebral cortex and its descending projections to the spinal cord are known to play a crucial role in the control of grasp. Recent studies in non-human primates have provided some striking new insights into the respective contributions of the parietal and frontal motor cortical areas to the control of grasp. In addition, new approaches have made it possible to investigate the coupling of grasp-related activity across different cortical areas for the control of the descending motor command.
PLOS ONE | 2013
Beatrice Sbriscia-Fioretti; Cristina Berchio; David A. Freedberg; Vittorio Gallese; Maria Alessandra Umiltà
The aim of this study was to test the involvement of sensorimotor cortical circuits during the beholding of the static consequences of hand gestures devoid of any meaning. In order to verify this hypothesis we performed an EEG experiment presenting participants with images of abstract works of art with marked traces of brushstrokes. The EEG data were analyzed by using Event Related Potentials (ERPs). We aimed to demonstrate a direct involvement of sensorimotor cortical circuits during the beholding of these selected works of abstract art. The stimuli consisted of three different abstract black and white paintings by Franz Kline. Results verified our experimental hypothesis, showing the activation of premotor and motor cortical areas during stimuli observation. In addition, the observation of abstract works of art elicited the activation of reward-related orbitofrontal areas and cognitive categorization-related prefrontal areas. The cortical sensorimotor activation is a fundamental neurophysiological demonstration of the direct involvement of the cortical motor system in the perception of static meaningless images belonging to abstract art. These results support the role of embodied simulation of the artist's gestures in the perception of works of art.
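For context, an ERP analysis of this kind reduces to baseline-correcting epoched EEG and averaging across trials for each stimulus category. The following is a minimal sketch with assumed shapes and window values, not the study's actual processing chain.

```python
# Hedged sketch: baseline correction + trial averaging to obtain an ERP per
# condition. Shapes, sampling rate, and windows are illustrative assumptions.
import numpy as np

def erp(epochs, fs=500.0, t0=0.2, baseline=(-0.2, 0.0)):
    """epochs: (n_trials, n_channels, n_samples); t0: stimulus onset (s) from epoch start."""
    t = np.arange(epochs.shape[-1]) / fs - t0
    mask = (t >= baseline[0]) & (t < baseline[1])
    corrected = epochs - epochs[..., mask].mean(axis=-1, keepdims=True)
    return corrected.mean(axis=0), t        # (n_channels, n_samples) average waveform

# Example: average waveforms per painting category can then be compared via the
# mean amplitude within a chosen time window over sensorimotor electrodes.
```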
Behavioral and Brain Sciences | 2001
Vittorio Gallese; Pier Francesco Ferrari; Maria Alessandra Umiltà
Empathy is the phenomenal experience of mirroring ourselves into others. It can be explained in terms of simulations of actions, sensations, and emotions which constitute a shared manifold for intersubjectivity. Simulation, in turn, can be sustained at the subpersonal level by a series of neural mirror matching systems. The article by Preston & de Waal applies the notion of simulation to several domains: action understanding (see Gallese 2000a; 2000b; 2001; see also Rizzolatti et al. 2001), emotion perception (as simulation of the perceived emotion; see Adolphs 1999; Adolphs et al. 2000; Gallese 2001), and mindreading. Simulation theory in fact holds that we understand others' thoughts by pretending to be in their "mental shoes," and by using our own mind/body as a model for the minds of others (Gallese & Goldman 1998; Goldman 1989; Gordon 1986; Harris 1989). Is there a further level of description that can provide a common and coherent explanatory frame for all these different simulation mechanisms? We propose, yes: such a level could be represented by the neural matching system constituted by mirror neurons (Gallese et al. 1996; 2002; Rizzolatti et al. 1996; Umiltà et al. 2001; see also Rizzolatti et al. 2001) – or by equivalent neural systems described in the human brain (Fadiga et al. 1995; Iacoboni et al. 1999; Nishitani & Hari 2000). Mirror neurons could underpin a direct, automatic, nonpredicative, and noninferential simulation mechanism, by means of which the observer would be able to recognize, understand, and imitate the behavior of others. The authors maintain that "mirror neurons . . . provide concrete cellular evidence for the shared representations of perception and action" (see target article, sect. 3.1). They fail, nevertheless, to draw the correct conclusions from such a statement. It is true, as they argue, that mirror neurons do not produce per se any empathy. However, if an action-perception matching is crucial for the production of empathy, as the authors suggest, mirror neurons represent the most parsimonious neural system so far described enabling such a matching to occur. The trick here is not to confound the phenomenal aspect of behavior, its functional level of description, and the neural mechanism at its base. Preliminary results suggest that a mirror matching system could be at the basis of our capacity to perceive in a meaningful way not only the actions, but also the sensations and the emotions of others (see Gallese 2001). Single-neuron recording experiments in humans have demonstrated that the same neurons become active when the subject either feels pain or observes others feeling pain (Hutchison et al. 1999). Furthermore, a recent fMRI study has shown that the amygdala becomes active not only during the observation, but also during the active expression of facial emotions, especially when imitation is involved (Carr et al. 2001). In conclusion, these recent findings suggest that a neural matching system is present also in a variety of apparently nonmotor-related human brain structures. Thus, different simulation mechanisms are applied in different domains, being sustained by a mirror-matching, dual mode of operation (action-driven and perception-driven) of given brain structures. We propose that such simulation mechanisms may together constitute a shared manifold of intersubjectivity (see Gallese 2001).
Frontiers in Human Neuroscience | 2013
Mariateresa Sestito; Maria Alessandra Umiltà; Giancarlo De Paola; Renata Fortunati; Andrea Raballo; Emanuela Leuci; Simone Maffei; Matteo Tonna; Mario Amore; Carlo Maggini; Vittorio Gallese
Emotional facial expression is an important low-level mechanism contributing to the experience of empathy, thereby lying at the core of social interaction. Schizophrenia is associated with pervasive social cognitive impairments, including emotional processing of facial expressions. In this study we test a novel paradigm in order to investigate the evaluation of the emotional content of perceived emotions presented through dynamic expressive stimuli, the facial mimicry evoked by the same stimuli, and their functional relation. Fifteen healthy controls and 15 patients diagnosed with schizophrenia were presented with stimuli portraying positive (laugh), negative (cry) and neutral (control) emotional content in the visual and auditory modalities in isolation, and congruently or incongruently associated. Participants were requested to recognize and quantitatively rate the emotional value of the perceived stimuli, while the electromyographic activity of the Corrugator and Zygomaticus muscles was recorded. All participants correctly judged the perceived emotional stimuli and prioritized the visual over the auditory modality in identifying the emotion when they were incongruently associated (Audio-Visual Incongruent condition). The neutral emotional stimuli did not evoke any muscle responses and were judged by all participants as emotionally neutral. The control group responded with rapid and congruent mimicry to emotional stimuli, and in the Incongruent condition muscle responses were driven by what participants saw rather than by what they heard. The patient group showed a similar pattern only with respect to negative stimuli, whereas it showed a lack of, or a non-specific, Zygomaticus response when positive stimuli were presented. Finally, we found that only patients with reduced facial mimicry (Internalizers) judged both positive and negative emotions as significantly more neutral than controls. The relevance of these findings for studying emotional deficits in schizophrenia is discussed.
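Facial-mimicry effects of the sort reported here are usually quantified as baseline-corrected changes in rectified, smoothed EMG for each muscle. The sketch below illustrates one generic way to do this; the sampling rate, filter settings, time windows, and variable names are assumptions, not the parameters used in the study.

```python
# Hedged sketch: quantifying facial-mimicry EMG (e.g., Zygomaticus vs. Corrugator)
# as percentage change from a pre-stimulus baseline. Parameters are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def emg_response(emg, fs=1000.0, t0=1.0, baseline=(-1.0, 0.0), window=(0.0, 2.0)):
    """emg: (n_trials, n_samples) for one muscle; returns % change per trial."""
    rectified = np.abs(emg - emg.mean(axis=1, keepdims=True))   # crude rectification
    b, a = butter(4, 20.0 / (fs / 2), btype="low")              # smooth the envelope
    envelope = filtfilt(b, a, rectified, axis=1)
    t = np.arange(emg.shape[1]) / fs - t0
    base = envelope[:, (t >= baseline[0]) & (t < baseline[1])].mean(axis=1)
    resp = envelope[:, (t >= window[0]) & (t < window[1])].mean(axis=1)
    return 100.0 * (resp - base) / base

# A congruent response to a laughing face would then appear as a Zygomaticus
# increase with little or no Corrugator increase.
```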
PLOS ONE | 2013
Martina Ardizzi; Francesca Martini; Maria Alessandra Umiltà; Mariateresa Sestito; Roberto Ravera; Vittorio Gallese
Facial expression of emotions is a powerful vehicle for communicating information about others’ emotional states and it normally induces facial mimicry in the observers. The aim of this study was to investigate whether early aversive experiences could interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of “street-boys” and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of the autonomic system promoting social behaviors and social predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger, and reduced EMG responses during the observation of both positive and negative expressions, only among street-boys. Street-boys also showed lower RSA after observation of facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, inducing lower social predisposition after viewing facial expressions and an ineffective recruitment of defensive behavior in response to non-threatening expressions.
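RSA is commonly operationalized as the natural log of high-frequency heart-rate-variability power derived from R-R intervals. The sketch below shows one generic way to compute such an index; the peak-detection input, resampling rate, and band limits are assumptions and do not reproduce the study's exact method.

```python
# Hedged sketch: an RSA index as ln(HF power) of the R-R interval series
# (~0.15-0.40 Hz). Parameters and names are illustrative assumptions.
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def rsa_from_rpeaks(rpeak_times, fs_resample=4.0, band=(0.15, 0.40)):
    """rpeak_times: R-peak times in seconds; returns ln(HF power)."""
    rr = np.diff(rpeak_times)                               # R-R intervals (s)
    t_rr = rpeak_times[1:]
    t_even = np.arange(t_rr[0], t_rr[-1], 1.0 / fs_resample)
    rr_even = interp1d(t_rr, rr, kind="cubic")(t_even)      # evenly resampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs_resample,
                   nperseg=min(256, len(rr_even)))
    sel = (f >= band[0]) & (f <= band[1])
    return np.log(np.trapz(pxx[sel], f[sel]))

# RSA suppression can then be expressed as the drop in this index from a resting
# baseline period to the stimulus-observation period.
```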
PLOS ONE | 2013
Hiroaki Ishida; Luca Fornia; Laura Clara Grandi; Maria Alessandra Umiltà; Vittorio Gallese
The posterior inner perisylvian region, including the secondary somatosensory cortex (area SII) and the adjacent region of posterior insular cortex (pIC), has been implicated in haptic processing by integrating somato-motor information during hand manipulation, both in humans and in non-human primates. However, its motor-related properties during hand manipulation are still largely unknown. To investigate motor-related activity in the hand region of SII/pIC, two macaque monkeys were trained to perform a hand-manipulation task requiring 3 different grip types (precision grip, finger exploration, side grip), both in light and in dark conditions. Our results showed that 70% (n = 33/48) of task-related neurons within SII/pIC were only activated during the monkeys’ active hand manipulation. Of those 33 neurons, 15 (45%) began to discharge before hand-target contact, while the remaining neurons were tonically active after contact. Thirty percent (n = 15/48) of the studied neurons responded both to passive somatosensory stimulation and to the motor task. A considerable percentage of task-related neurons in SII/pIC was selectively activated during finger exploration (FE) and precision grasping (PG) execution, suggesting that they play a pivotal role in the control of skilled finger movements. Furthermore, hand-manipulation-related neurons also responded when visual feedback was absent in the dark. Altogether, our results suggest that somato-motor neurons in SII/pIC likely contribute to haptic processing from the initial to the final phase of grasping and object manipulation. Such motor-related activity could also provide the somato-motor binding principle enabling the translation of diachronic somatosensory inputs into a coherent image of the explored object.
IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2013
Pouya Ahmadian; Saeid Sanei; Luca Ascari; Lara González-Villanueva; Maria Alessandra Umiltà
One of the changes seen in electroencephalography (EEG) data preceding human voluntary movement is a cortical potential called the readiness potential (RP). Detection of this potential can benefit researchers in the clinical neurosciences working on rehabilitation of the malfunctioning brain, as well as those working on brain-computer interfacing who need a suitable mechanism for detecting the intention to move. Here, a constrained blind source extraction (CBSE) approach is attempted for the detection of the RP. A suitable constraint is defined and applied. The results are also compared with those of traditional blind source separation in terms of true positive rate, false positive rate, and computation time. The results show that, overall, the CBSE approach has superior performance.
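The sketch below is not the paper's CBSE algorithm; it is a crude stand-in using off-the-shelf tools that mimics the idea of steering source extraction toward an RP-like component, plus the true/false positive rate bookkeeping used for the comparison. Component selection by template correlation, the detection interface, and all names are assumptions.

```python
# Hedged sketch: unconstrained ICA followed by template-based component selection
# as a rough surrogate for constrained blind source extraction, and TPR/FPR
# scoring of trial-level detections. Not the method published in the paper.
import numpy as np
from sklearn.decomposition import FastICA

def extract_rp_like_source(eeg, template):
    """eeg: (n_channels, n_samples); template: (n_samples,) slow RP-like reference."""
    sources = FastICA(n_components=eeg.shape[0], random_state=0).fit_transform(eeg.T).T
    corr = [abs(np.corrcoef(s, template)[0, 1]) for s in sources]
    return sources[int(np.argmax(corr))]                  # best-matching component

def tpr_fpr(detected, labels):
    """detected, labels: boolean arrays over trials (True = movement intention)."""
    tp = np.sum(detected & labels); fn = np.sum(~detected & labels)
    fp = np.sum(detected & ~labels); tn = np.sum(~detected & ~labels)
    return tp / (tp + fn), fp / (fp + tn)
```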