Tatsuhiro Kishi
Waseda University
Publications
Featured research published by Tatsuhiro Kishi.
Intelligent Robots and Systems | 2012
Tatsuhiro Kishi; Takuya Otani; Nobutsuna Endo; Przemyslaw Kryczka; Kenji Hashimoto; Kei Nakata; Atsuo Takanishi
This paper describes the development of a new expressive robotic head for a bipedal humanoid robot. The facial expressions of our previous robotic head had low recognition rates, so to improve them we asked amateur cartoonists to create computer graphics (CG) images of the expressions. To realize the expressions found in the CGs, the new head was provided with 24 DoFs and facial color. We designed compact mechanisms that fit into a head whose dimensions are based on the average adult Japanese female. We conducted a survey with pictures and videos to evaluate the head's expressive ability. Results showed that facial expression recognition rates for the 6 basic emotions increased compared to the old KOBIAN head.
International Conference on Social Robotics | 2012
Gabriele Trovato; Tatsuhiro Kishi; Nobutsuna Endo; Kenji Hashimoto; Atsuo Takanishi
Communication between humans and robots is a critical step for the integration of social robots into society, and emotion expression through a robotic face is one of its key elements. Despite the most recent efforts, no matter how much expressive capabilities improve, facial expression recognition is often hampered by a cultural divide between the subjects who participate in surveys. The purpose of this work is to take advantage of the 24-degrees-of-freedom head of the humanoid social robot KOBIAN-R to make it capable of displaying different versions of the same expressions, using face and neck, so that they are easy to understand for both Japanese and Western subjects. We present a system based on relevant studies of human communication and facial anatomy, as well as on the work of illustrators and cartoonists. The expression generator we developed can be adapted to specific cultures. Results confirmed the in-group advantage, showing that the recognition rate of this system is higher when the nationality of the subjects and the cultural characterisation of the shown expressions coincide. We conclude that this system could be used, in the future, on robots that have to interact in a social environment with people of different cultural backgrounds.
International Conference on Robotics and Automation | 2012
Kenji Hashimoto; Yuki Takezaki; Hiromitsu Motohashi; Takuya Otani; Tatsuhiro Kishi; Hun-ok Lim; Atsuo Takanishi
This paper describes a walking stabilization control based on gait analysis for a biped humanoid robot. We have developed a human-like foot mechanism mimicking the medial longitudinal arch to clarify the function of the foot arch structure. To evaluate the arch function through walking experiments with a robot, the walking stabilization control should also be designed based on gait analysis. Physiologists have proposed the ankle, hip and stepping strategies, but these strategies were derived from measurements of humans who were not “walking” but “standing” against force disturbances. Therefore, we first conducted a gait analysis and modeled the human walking strategy in a form that can be implemented on humanoid robots. We obtained the following two findings from the gait analysis: i) the foot-landing point lies on the line joining the stance leg and the projection of the CoM onto the ground, and ii) the distance between steps is modified to keep the mechanical energy at landing within a certain value. A walking stabilization control is designed based on this gait analysis, and the proposed control is verified through experiments with the human-sized humanoid robot WABIAN-2R. Experimental videos are provided as supplementary material.
International Journal of Humanoid Robotics | 2013
Gabriele Trovato; Massimiliano Zecca; Tatsuhiro Kishi; Nobutsuna Endo; Kenji Hashimoto; Atsuo Takanishi
Communication between humans and robots is a very important aspect in the field of Humanoid Robotics. For natural interaction, robots capable of nonverbal communication must be developed. However, despite the most recent efforts, robots can still show only limited expressive capabilities. The purpose of this work is to create a facial expression generator that can be applied to the 24-DoF head of the humanoid robot KOBIAN-R. In this manuscript, we present a system that, based on relevant studies of human communication and facial anatomy, can produce thousands of combinations of facial and neck movements. The wide range of expressions covers not only primary emotions, but also complex or blended ones, as well as communication acts that are not strictly categorized as emotions. Results showed that the recognition rate of expressions produced by this system is comparable to the recognition rate of the most common facial expressions. Context-based recognition, which is especially important in the case of more complex communication acts, was also evaluated. Results proved that the produced robotic expressions can alter the meaning of a sentence in the same way as human expressions do. We conclude that our system can successfully improve the communication abilities of KOBIAN-R, making it capable of complex interaction in the future.
Robotics and Biomimetics | 2013
Sarah Cosentino; Tatsuhiro Kishi; Massimiliano Zecca; Salvatore Sessa; Luca Bartolomeo; Kenji Hashimoto; Takashi Nozawa; Atsuo Takanishi
In this paper, we describe a human gesture recognition system developed to make a humanoid robot understand non-verbal human social behaviors, and we present the results of preliminary experiments demonstrating the feasibility of the proposed method. In particular, we have focused on the detection and recognition of laughter, a very peculiar human social signal: although it is a direct form of social interaction, laughter is classified as a semi-voluntary action, can be elicited by several different stimuli, and is strongly associated with positive emotion and physical well-being. The ability to recognize, and eventually elicit, laughter will help the humanoid robot interact with humans in a more natural way, build positive relationships, and thus become more socially integrated in human society.
International Journal of Social Robotics | 2013
Gabriele Trovato; Tatsuhiro Kishi; Nobutsuna Endo; Massimiliano Zecca; Kenji Hashimoto; Atsuo Takanishi
Emotion display through facial expressions is an important channel of communication. However, humans differ in the way they assign meaning to facial cues, depending on their background culture. This leads to a gap in recognition rates of expressions, a problem that is also present when displaying a robotic face: a robot’s facial expression recognition is often hampered by a cultural divide, and poor recognition rates may lead to poor acceptance and interaction. It would be desirable if robots could switch their output facial configuration flexibly, adapting to different cultural backgrounds. To achieve this, we built a generation system that produces facial expressions and applied it to the 24-degrees-of-freedom head of the humanoid social robot KOBIAN-R; thanks to the work of illustrators and cartoonists, the system can generate two versions of the same expression, in order to be easily recognisable by Japanese and by Western subjects. As a tool for making recognition easier, the display of Japanese comic symbols on the robotic face was also introduced and evaluated. We conducted a cross-cultural study aimed at assessing this recognition gap and finding solutions for it; the investigation was extended to Egyptian subjects as a sample of another culture. Results confirmed the differences in recognition rates, the effectiveness of customising expressions, and the usefulness of symbol display, suggesting that this approach might be valuable for robots that will interact in a multi-cultural environment in the future.
IEEE-RAS International Conference on Humanoid Robots | 2012
Gabriele Trovato; Tatsuhiro Kishi; Nobutsuna Endo; Kenji Hashimoto; Atsuo Takanishi
Human-robot communication is a very important aspect in the field of Humanoid Robotics, and non-verbal communication is one of the components that make interaction natural. However, despite the most recent efforts, robots can still show only limited expressive capabilities. The purpose of this work is to create a facial expression generator that can be applied to the new 24-DoF head of the humanoid robot KOBIAN-R. In this paper, we present a system based on relevant studies of human communication and facial anatomy, adapted to this specific robotic face. It makes use of polynomial classifiers and is able to produce over 600,000 combinations of facial cues, together with appropriate neck movement. Results showed that the recognition rate of expressions produced by this system is comparable to the recognition rate of the most common facial expressions. We conclude that our system can successfully improve the communication capabilities of the robot KOBIAN-R, and that there is potential for using it to implement more complex interaction.
Frontiers in Psychology | 2015
Matthieu Destephe; Martim Brandao; Tatsuhiro Kishi; Massimiliano Zecca; Kenji Hashimoto; Atsuo Takanishi
The Uncanny valley hypothesis, which posits that almost-human characteristics in a robot or a device can cause uneasiness in human observers, is an important research theme in the Human-Robot Interaction (HRI) field. Yet the phenomenon is still not well understood. Many have investigated the external design of humanoid robot faces and bodies, but only a few studies have focused on the influence of robot movements on our perception and feelings of the Uncanny valley. Moreover, no research has investigated the possible relation between our feeling of uneasiness and whether or not we would accept robots having a job in an office, a hospital or elsewhere. To better understand the Uncanny valley, we explore several factors which might influence our perception of robots, whether related to the subjects, such as culture or attitude toward robots, or related to the robot, such as the emotions and emotional intensity displayed in its motion. We asked 69 subjects (N = 69) to rate the motions of a humanoid robot (Perceived Humanity, Eeriness, and Attractiveness) and to state where they would rather see the robot performing a task. Our results suggest that, among the factors we chose to test, attitude toward robots is the main influence on the perception of the robot related to the Uncanny valley. Robot occupation acceptability was affected only by Attractiveness, mitigating any Uncanny valley effect. We discuss the implications of these findings for the Uncanny valley and the acceptability of a robotic worker in our society.
Archive | 2013
Tatsuhiro Kishi; Takuya Otani; Nobutsuna Endo; Przemyslaw Kryczka; Kenji Hashimoto; Kei Nakata; Atsuo Takanishi
This paper describes the development of a new expressive robotic head for a bipedal humanoid robot. Through a preliminary experiment, the authors defined representative facial expressions for the 6 basic emotions. To realize these facial expressions, 24 DoFs of mechanisms allowing a wide movable range, as well as facial color, were needed on the face. We designed compact mechanisms that fit into a head whose major dimensions are based on the average adult Japanese female. We conducted questionnaire surveys to evaluate the facial expression recognition rates. The results show that facial expression recognition rates for the 6 basic emotions increased compared to the old head.
2012 First International Conference on Innovative Engineering Systems | 2012
Gabriele Trovato; Tatsuhiro Kishi; Nobutsuna Endo; Kenji Hashimoto; Atsuo Takanishi
Human-Robot Interaction is one of the biggest challenges in Humanoid Robotics. Interaction can be carried out through different channels, as humans communicate through complex languages, including the use of non-verbal communication. In particular, facial expressions are important for conveying emotions and communicative intentions. Several humanoid robots can already perform a certain set of expressions, but their capabilities are somewhat limited and, as a result, interaction is not natural. One aspect that could help robots be perceived as real is asymmetry. That is why, in this paper, we perform an evaluation study of symmetrical and asymmetrical facial expressions of the humanoid robot KOBIAN-R. These expressions are produced by a generator based on relevant studies of facial anatomy and non-verbal communication, and on the work of illustrators and cartoonists. Survey results confirmed the effectiveness of the generator and the importance of asymmetry. We conclude that this system improves the robot's communication capabilities, making possible the development of more advanced interaction in the future.