Yugo Takeuchi
Shizuoka University
Publications
Featured research published by Yugo Takeuchi.
Speech Communication | 2003
Noriko Suzuki; Yugo Takeuchi; Kazuo Ishii; Michio Okada
Our research goal is to investigate interpersonal relations involving empathy in human-computer interaction. We focus on mimicry behavior and its ability to elicit the intentional stance of a partner in interaction. In this study, we conducted a psychological experiment to examine how prosodic mimicry by computers affects people. The interactive system in this experiment uses an animated character that echoically mimics the prosodic features of a human's voice by synthesizing hummed sounds. The sounds consist only of prosodic components, similar to continuous humming of the open vowel /a/ or /o/, without any linguistic information. The subjects completed a questionnaire to evaluate the character at different mimicry ratios. The results indicated the following possibilities: first, people favorably interpret a computer's simple response, such as echoic mimicry using hummed sounds mixed with a slightly constant prosody response; second, people may establish interpersonal relations with a computer through such facilitated interaction.
Robotics and Autonomous Systems | 2000
Noriko Suzuki; Yugo Takeuchi; Kazuo Ishii; Michio Okada
This paper describes an interactive system called "Talking Eye", which casually chats with humans by voice, and reports the results of a psychological experiment evaluating social behaviors in interaction with the Talking Eye system. We have been investigating the mechanism of chatting in order to create local interaction between humans and computers through the dialogue process. We constructed a prototype system, Talking Eye, with a mechanism for producing conversational behaviors according to the current situation using a multi-agent architecture. The results of the experiments demonstrated that humans can find affinity with the system through the process of interaction even if the conversational purpose is not achieved; this follows from deriving conversational behaviors from the current situation based on the multi-agent architecture. The main goal of this study is to build autonomous creatures with a mechanism that promotes further empathic interaction with humans.
Robot and Human Interactive Communication | 2014
Ryo Sato; Yugo Takeuchi
In this study, we propose a method to coordinate turn-taking and talking in multi-party conversations through the gaze of a robot that participates as a side participant. We use an experimental paradigm named the "Cooperative Turn-taking Game in a Non-verbal Situation", a simplified multi-party conversation environment. We investigated whether designing such a robot's eye gaze can coordinate turn-taking and talking in multi-party conversations, and we found that the robot's gaze could do so. Our study is expected to effectively encourage desirable talking in multi-party conversations such as collaborative learning scenes.
International Conference on Knowledge-Based and Intelligent Information and Engineering Systems | 2000
Toru Takahashi; Yugo Takeuchi; Yasuhiro Katagiri
We discuss the design of a life-like agent interface that considers the social aspects of human-agent interaction. The underlying hypothesis of the design is that human social behaviors toward life-like agents are on a par with those toward humans. Humans tend to sympathize with and follow other humans based on their affiliation needs; the same should therefore hold for life-like agents. Based on this hypothesis, we designed and incorporated life-like guide agents into an exhibition guide system. The system was used at the 12th ATR Open House, where we had the opportunity to study the behaviors of its users. We observed that the users' behaviors were directed toward maintaining good social relationships with their life-like agents, thereby supporting our hypothesis. The results indicated the importance of incorporating, in a system design, an agent's capability to induce the effect of human affiliation needs.
FIRA RoboWorld Congress | 2009
Hisashi Naito; Yugo Takeuchi
Recently, agents have become widespread as entities that interact with humans. In face-to-face communication, we can confidently communicate through each other's bodies. In the coming ubiquitous society, there will be growing awareness that the place where information is received and the content of that information are closely related. In this study, through a cooperative task experiment, we clarified the body's role in real-world information processing activities with agents and how the relation between information and environment influences agent evaluation. We found that people are more likely to follow the instructions of an agent with a body in the real world than those of an agent in the virtual world, suggesting that the body plays an important role in real-world interaction.
robot and human interactive communication | 2015
Genta Yoshioka; Takafumi Sakamoto; Yugo Takeuchi
This paper reports an analytic finding that humans inferred the emotional states of a simple, flat robot that autonomously moves on a floor in all directions, based on Russell's circumplex model of affect, depending on the human's spatial position. We observed the physical interaction between humans and a robot in an experiment where participants sought a treasure in a given field while the robot expressed its affective state solely through its simple movements. This result will contribute to the basic design of HRI.
Human-Agent Interaction | 2014
Takafumi Sakamoto; Yugo Takeuchi
Humans can interact with strangers because they can communicate with each other. On the other hand, developing a relationship with an unknown artifact is difficult. To address this problem, existing studies have explored various approaches to the artifact's behavioral design. However, little research has been done on interaction where there is no information about the interaction partner and no experimental task. Clarification of how people come to regard unknown objects as interaction partners is required. We believe that a stage of subconscious interaction plays a role in this process. We created an experimental environment to observe the interaction between a human and a robot whose behavior was actually mapped from another human. We observed this interaction under two conditions: one where the participants knew, and one where they did not know, that the robot could interact. Both sets of participants approached or avoided the robot, but differences in the interaction properties between the conditions were confirmed. The results of our experiment suggest that a stage of subconscious interaction does exist in the recognition of artifacts as interaction partners.
Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction | 2012
Saori Yamamoto; Nazomu Teraya; Yumika Nakamura; Narumi Watanabe; Yande Lin; Mayumi Bono; Yugo Takeuchi
This paper presents a prototype system that provides a natural multi-party conversation environment among participants in different places. Eye gaze is an important feature for maintaining smooth multi-party conversations because it indicates whom the speech is addressing or nominates the next speaker. Nevertheless, most popular video conversation systems, such as Skype or FaceTime, do not support eye-gaze interaction, which causes serious confusion in multi-party video conversations: who is the addressee of the speech? Who is the next speaker? We propose a simple multi-party video conversation environment called Ptolemaeus that realizes eye-gaze interaction among three or more participants without any special equipment. This system provides natural turn-taking in face-to-face video conversations and can be implemented more easily than previous schemes for eye-gaze interaction.
International Conference on Online Communities and Social Computing | 2009
Yugo Takeuchi; Hikaru Nakagami
This paper investigates how people attribute individual autonomy to a remotely operated robot. An experiment was conducted in which participants remotely operated a goalkeeper robot to defend its goal from a kicker robot. Participants were assigned to one of two experimental conditions. Participants in the first condition watched video images that captured the motion of the kicker robot from behind the goal. Participants in the second condition watched video images of the kicker robot from the position of the goalkeeper robot. The results suggest that people are not concerned with an avatar's autonomy when they are focused on the avatar's situation.
International Conference on Human-Computer Interaction | 2017
Riya Banerjee; Yugo Takeuchi
This paper details a proposal to build a mobile application for communicating with other users of the app who are in the vicinity of the communication initiator. The app enables users to send text messages that may contain location-specific information.