Publication


Featured research published by Massimiliano Zecca.


Intelligent Robots and Systems | 2004

Effective emotional expressions with emotion expression humanoid robot WE-4RII: integration of humanoid robot hand RCH-1

Hiroyasu Miwa; Kazuko Itoh; Munemichi Matsumoto; Massimiliano Zecca; Hideaki Takanobu; Stefano Roccella; Maria Chiara Carrozza; Paolo Dario; Atsuo Takanishi

The authors have been developing humanoid robots in order to develop new mechanisms and functions for a humanoid robot that can communicate naturally with a human by expressing human-like emotion. We consider that human hands play an important role in communication because they have grasping, sensing, and emotional expression abilities. We therefore developed the emotion expression humanoid robot WE-4RII (Waseda Eye No.4 Refined II) by integrating the new humanoid robot hands RCH-1 (RoboCasa Hand No.1) into the emotion expression humanoid robot WE-4R. Furthermore, we confirmed that RCH-1 and WE-4RII have effective emotional expression ability, because the correct recognition rate of WE-4RII's emotional expressions was higher than that of WE-4R. In this paper, we describe the mechanical features of WE-4RII.


International Conference on Robotics and Automation | 2003

Experimental analysis of an innovative prosthetic hand with proprioceptive sensors

Maria Chiara Carrozza; Fabrizio Vecchi; Fabrizio Sebastiani; Giovanni Cappiello; Stefano Roccella; Massimiliano Zecca; Roberto Lazzarini; Paolo Dario

This paper presents an underactuated artificial hand intended for functional replacement of the natural hand in upper limb amputees. The natural hand has three basic functionalities: grasping, manipulation, and exploration. To accomplish the goal of restoring these capabilities by implanting an artificial hand, two fundamental steps are necessary: to develop an artificial hand equipped with artificial proprioceptive and exteroceptive sensors, and to fabricate an appropriate interface able to exchange sensory-motor signals with the amputee's body and central nervous system. In order to address these objectives, we have studied an underactuated hand according to a biomimetic approach, and we have exploited robotic and microengineering technologies to design and fabricate its building blocks. The architecture of the hand comprises the following modules: an actuator system embedded in the underactuated mechanical structure (artificial musculoskeletal system), a proprioceptive sensory system (position and force sensors), an exteroceptive sensory system (3D force sensors distributed on the cosmetic glove), an embedded control unit, and a human/machine interface. The first prototype of the artificial hand has been designed and fabricated. The hand is underactuated and is equipped with an opposable thumb and a proprioceptive sensory system. This paper presents the fabrication and experimental characterization of the hand, focusing on the mechanical structure, the actuator system, and the proprioceptive sensory system.


PLOS ONE | 2010

Brain Response to a Humanoid Robot in Areas Implicated in the Perception of Human Emotional Gestures

Thierry Chaminade; Massimiliano Zecca; Sarah-Jayne Blakemore; Atsuo Takanishi; Chris Frith; Silvestro Micera; Paolo Dario; Giacomo Rizzolatti; Vittorio Gallese; Maria Alessandra Umiltà

Background: The humanoid robot WE-4RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet we might use different neural processes than those used for reading emotions in human agents. Methodology: Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expression of Anger, Joy, Disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted. Principal Findings: Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, like left Broca's area for the perception of speech, and in areas involved in the processing of emotions, like the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased the response to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance. Conclusions: Motor resonance towards a humanoid, but not a human, display of facial emotion is increased when attention is directed towards judging emotions. Significance: Artificial agents can be used to assess how factors like anthropomorphism affect the neural response to the perception of human actions.


Robot and Human Interactive Communication | 2009

Whole body emotion expressions for KOBIAN humanoid robot — preliminary experiments with different emotional patterns

Massimiliano Zecca; Yu Mizoguchi; Keita Endo; Fumiya Iida; Yousuke Kawabata; Nobutsuna Endo; Kazuko Itoh; Atsuo Takanishi

Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, actively participating in joint work and community life with humans as partners and friends. In particular, these robots are expected to be fundamental for helping and assisting elderly and disabled people during their activities of daily living (ADLs). To achieve this, personal robots should also be capable of human-like emotion expression. To this purpose we developed a new whole-body emotion expressing bipedal humanoid robot, named KOBIAN, which is capable of expressing human-like emotions. In this paper we present three different evaluations of the emotional expressiveness of KOBIAN. In particular, we analyze the roles of the face, the body, and their combination in emotional expressions, and we compare emotional patterns created by a photographer and a cartoonist with the ones created by us. Overall, although the experimental results are not as good as we were expecting, we confirmed that the robot can clearly express its emotions and that very high recognition ratios are possible.


IEEE-RAS International Conference on Humanoid Robots | 2008

Design of the humanoid robot KOBIAN - preliminary analysis of facial and whole body emotion expression capabilities

Massimiliano Zecca; Nobutsuna Endo; Shimpei Momoki; Kazuko Itoh; Atsuo Takanishi

Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, actively participating in joint work and community life with humans as partners and friends. In particular, these robots are expected to be fundamental for helping and assisting elderly and disabled people during their activities of daily living (ADLs). To achieve this, personal robots should be capable of human-like emotion expression; in addition, human-like bipedal walking is the best solution for robots that must be active in the human living environment. Although several bipedal robots and several emotional expression robots have been developed in recent years, until now there was no robot that integrated all of these functions. Therefore we developed a new bipedal walking robot, named KOBIAN, which is also capable of expressing human-like emotions. In this paper, we present the design and the preliminary evaluation of the new emotional expression head. The preliminary results showed that the emotion expressed by the head alone cannot be easily understood by users. However, the presence of a full body clearly enhances the emotion expression capability of the robot, thus proving the effectiveness of the proposed approach.


International Conference on Robotics and Automation | 2008

Development of whole-body emotion expression humanoid robot

Nobutsuna Endo; Shimpei Momoki; Massimiliano Zecca; Minoru Saito; Yu Mizoguchi; Kazuko Itoh; Atsuo Takanishi

Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, actively participating in joint work and community life with humans as partners and friends. The authors think that the emotion expression of a robot is effective in joint activities of humans and robots. In addition, we think that bipedal walking is necessary for robots that are active in the human living environment. However, there was no robot with both of these functions, and it is not clear which functions are actually effective. Therefore we developed a new bipedal walking robot capable of expressing emotions. In this paper, we present the design and the preliminary evaluation of the new head of the robot, which uses only a small number of degrees of freedom for facial expression.


Journal of Intelligent and Robotic Systems | 2013

A Methodology for the Performance Evaluation of Inertial Measurement Units

Salvatore Sessa; Massimiliano Zecca; Zhuohua Lin; Luca Bartolomeo; Hiroyuki Ishii; Atsuo Takanishi

This paper presents a methodology for a reliable comparison among Inertial Measurement Units or attitude estimation devices in a Vicon environment. The misalignment among the reference systems and the lack of synchronization among the devices are the main problems for a correct performance evaluation using Vicon as the reference measurement system. We propose a genetic algorithm coupled with Dynamic Time Warping (DTW) to solve these issues. To validate the efficacy of the methodology, a performance comparison is carried out between the WB-3 ultra-miniaturized Inertial Measurement Unit (IMU), developed by our group, and the commercial IMU InertiaCube3™ by InterSense.
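
The paper's alignment procedure is not detailed in this abstract; as a rough illustration of the synchronization idea only, the Python sketch below computes a plain Dynamic Time Warping cost between two angle traces (the genetic-algorithm search over the reference-frame misalignment is omitted, and all signal names and sample values are hypothetical).

import numpy as np

def dtw_distance(a, b):
    # Minimal Dynamic Time Warping cost between two 1-D signals; a small
    # cost means the traces describe the same motion up to a time shift.
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Hypothetical usage: compare a roll-angle trace from the IMU under test
# against the reference trace derived from Vicon marker data.
imu_roll = np.sin(np.linspace(0.0, 10.0, 200))        # placeholder signal
vicon_roll = np.sin(np.linspace(0.2, 10.2, 200))      # slightly delayed copy
print(dtw_distance(imu_roll, vicon_roll))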


IEEE-RAS International Conference on Humanoid Robots | 2005

Behavior model of humanoid robots based on operant conditioning

Kazuko Itoh; Hiroyasu Miwa; Munemichi Matsumoto; Massimiliano Zecca; Hideaki Takanobu; Stefano Roccella; Maria Chiara Carrozza; Paolo Dario; Atsuo Takanishi

Personal robots, which are expected to become popular in the near future, are required to be active in work and community life alongside humans. Therefore, we have been developing new mechanisms and functions in order to realize natural bilateral interaction by expressing emotions, behaviors, and personality in a human-like manner. We have proposed a mental model with emotion, mood, personality, and needs. However, the robot's behavior was very simple, since the robot showed just a single kind of behavior in response to a given mental state. In this paper, we present a new behavior model for humanoid robots based on operant conditioning, a well-known psychological behavior model. We implemented this new behavior model into the emotion expression humanoid robot WE-4RII (Waseda Eye No.4 Refined II), developed in 2004. Through the experimental evaluations, we confirmed that the robot with the new behavior model could autonomously select suitable behavior for the situation within a predefined behavior list.
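
The behavior model itself is not specified in this abstract; the following is only a generic sketch of operant conditioning as reward-driven behavior selection (a softmax over learned values), with hypothetical behavior names and reward signals, to illustrate the general mechanism of reinforcing behaviors that lead to positive outcomes.

import math
import random

class OperantSelector:
    # Toy operant-conditioning selector: behaviors that are followed by a
    # reward become more likely to be chosen again.
    def __init__(self, behaviors, learning_rate=0.1, temperature=0.5):
        self.values = {b: 0.0 for b in behaviors}
        self.lr = learning_rate
        self.temp = temperature

    def choose(self):
        # Softmax over learned values: higher value -> higher probability.
        weights = [math.exp(v / self.temp) for v in self.values.values()]
        return random.choices(list(self.values), weights=weights)[0]

    def reinforce(self, behavior, reward):
        # Move the stored value toward the obtained reward (reinforcement).
        self.values[behavior] += self.lr * (reward - self.values[behavior])

# Hypothetical behavior list and reward signal; the actual predefined
# behavior list and mental-state feedback of WE-4RII are not shown here.
selector = OperantSelector(["wave_hand", "nod", "look_at_person"])
for _ in range(100):
    b = selector.choose()
    reward = 1.0 if b == "look_at_person" else 0.0
    selector.reinforce(b, reward)
print(selector.values)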


International Conference on Robotics and Automation | 2004

Various emotional expressions with emotion expression humanoid robot WE-4RII

Kazuko Itoh; Hiroyasu Miwa; Munemichi Matsumoto; Massimiliano Zecca; Hideaki Takanobu; Stefano Roccella; Maria Chiara Carrozza; Paolo Dario; Atsuo Takanishi

The authors have been developing humanoid robots in order to develop new mechanisms and functions for a humanoid robot that can communicate naturally with a human by expressing human-like emotion. In 2004, we developed the emotion expression humanoid robot WE-4RII (Waseda Eye No.4 Refined II) by integrating the new humanoid robot hands RCH-1 (RoboCasa Hand No.1) into the emotion expression humanoid robot WE-4R. We confirmed that WE-4RII can effectively express its emotion.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2011

Development of the wireless ultra-miniaturized inertial measurement unit WB-4: Preliminary performance evaluation

Zhuohua Lin; Massimiliano Zecca; Salvatore Sessa; Luca Bartolomeo; Hiroyuki Ishii; Atsuo Takanishi

This paper presents the preliminary performance evaluation of our new wireless ultra-miniaturized inertial measurement unit (IMU) WB-4 by comparing it with the Vicon motion capture system. The WB-4 IMU primarily contains a mother board for motion sensing, a Bluetooth module for wireless data transmission to a PC, and a Li-Polymer battery for the power supply. The mother board is provided with a microcontroller and 9-axis inertial sensors (miniaturized MEMS accelerometer, gyroscope, and magnetometer) to measure orientation. A quaternion-based extended Kalman filter (EKF), integrated with an R-Adaptive algorithm for automatic estimation of the measurement covariance matrix, is implemented for the sensor fusion to retrieve the attitude. The experimental results showed that the wireless ultra-miniaturized WB-4 IMU could provide high-accuracy performance for the roll and pitch angles. The yaw angle, which showed reasonable performance, needs to be further evaluated.
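
The quaternion-based EKF and the R-Adaptive covariance estimation cannot be reconstructed from this abstract; the Python sketch below only shows the two basic ingredients such a filter fuses, namely gyroscope integration of a quaternion and roll/pitch tilt derived from the accelerometer's gravity vector. Function names and sensor samples are illustrative, not taken from the WB-4 firmware.

import numpy as np

def quat_mult(q, r):
    # Hamilton product of two quaternions given as [w, x, y, z].
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def predict(q, gyro, dt):
    # Propagate the attitude quaternion with the measured angular rate (rad/s).
    omega = np.concatenate(([0.0], gyro))
    q = q + 0.5 * dt * quat_mult(q, omega)
    return q / np.linalg.norm(q)

def roll_pitch_from_accel(accel):
    # Tilt angles from the gravity direction measured by the accelerometer;
    # an EKF would use these as the measurement that corrects the prediction.
    ax, ay, az = accel / np.linalg.norm(accel)
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
    return roll, pitch

# Hypothetical one-step usage with made-up sensor readings.
q = np.array([1.0, 0.0, 0.0, 0.0])                      # initial attitude
q = predict(q, gyro=np.array([0.01, -0.02, 0.0]), dt=0.01)
print(roll_pitch_from_accel(np.array([0.0, 0.2, 9.7])))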

Collaboration


Dive into Massimiliano Zecca's collaborations.

Top Co-Authors

Paolo Dario

Sant'Anna School of Advanced Studies

Maria Chiara Carrozza

Sant'Anna School of Advanced Studies
