
Publication


Featured research published by Eun-Sook Jee.


robot and human interactive communication | 2007

Emotion Interaction System for a Service Robot

Dong-Soo Kwon; Yoon Keun Kwak; Jong C. Park; Myung Jin Chung; Eun-Sook Jee; Kh Park; Hyoung-Rock Kim; Young-Min Kim; Jong-Chan Park; Eun Ho Kim; Kyung Hak Hyun; Hye-Jin Min; Hui Sung Lee; Jeong Woo Park; Su Hun Jo; S.M. Park; Kyung-Won Lee

This paper introduces an emotion interaction system for a service robot. The purpose of emotion interaction systems in service robots is to make people feel that the robot is not a mere machine but a reliable living assistant in the home. The emotion interaction system is composed of emotion recognition, generation, and expression systems. A user's emotion is recognized through multiple modalities, such as voice, dialogue, and touch. The robot's emotion is generated according to a psychological theory of emotion, the OCC (Ortony, Clore, and Collins) model, which takes into account the user's emotional state as well as information about the environment and the robot itself. The generated emotion is expressed through the robot's facial expression, gesture, and musical sound. Because the proposed system comprises all three components necessary for a full emotional interaction cycle, it can be implemented and tested on a real robot system. Even though the multi-modality in emotion recognition and expression is still at a rudimentary stage, the proposed system is shown to be highly useful in service robot applications. Furthermore, the proposed framework can serve as a cornerstone for the design of emotion interaction and generation systems for robots.
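As a reading aid, the recognition-generation-expression cycle described above can be sketched in a few lines of Python. All names, fusion rules, and thresholds below are illustrative assumptions, not the authors' actual implementation or the OCC model in full.

```python
# Hypothetical sketch of the recognition -> generation -> expression cycle.
from dataclasses import dataclass

@dataclass
class UserState:
    voice_valence: float     # assumed range -1 (negative) .. +1 (positive)
    dialogue_valence: float
    touch_valence: float

def recognize_emotion(state: UserState) -> float:
    """Fuse the multimodal cues into one valence estimate (simple average, assumed)."""
    return (state.voice_valence + state.dialogue_valence + state.touch_valence) / 3.0

def generate_emotion(user_valence: float, goal_progress: float) -> str:
    """OCC-style appraisal sketch: emotion follows from how events relate to goals.

    goal_progress is a hypothetical stand-in for the robot's appraisal of its
    environment and its own task state.
    """
    if user_valence > 0.3 and goal_progress > 0.5:
        return "joy"
    if user_valence < -0.3:
        return "distress"
    return "neutral"

def express_emotion(emotion: str) -> dict:
    """Map the generated emotion onto the three expression channels."""
    return {
        "face": f"{emotion}_expression",
        "gesture": f"{emotion}_gesture",
        "sound": f"{emotion}_musical_clip",
    }

if __name__ == "__main__":
    user = UserState(voice_valence=0.6, dialogue_valence=0.4, touch_valence=0.5)
    emotion = generate_emotion(recognize_emotion(user), goal_progress=0.8)
    print(express_emotion(emotion))
```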


robot and human interactive communication | 2007

Composition of Musical Sound Expressing an Emotion of Robot Based on Musical Factors

Eun-Sook Jee; Chong Hui Kim; Soon-Young Park; Kyung-Won Lee

In human-robot interaction, how to express a robot's emotion to humans is an important issue. In this paper, we propose music as a medium for expressing a robot's emotion. Since emotions and the feelings evoked by music are somewhat subjective, we composed the music with attention to musical factors in order to reduce ambiguity. As earlier work, we composed music for happiness, sadness, fear, and dislike, analyzed their musical factors, and explained the motives behind the compositions. To validate the effect of the composed music for these four emotions, we conducted experiments and compared the results with those for the robot's facial expression. The results showed that the composed music is very effective in conveying emotions. In addition, emotion expression was even better when music and facial expression were presented simultaneously.


robot and human interactive communication | 2009

Composition of musical sound to express robot's emotion with intensity and synchronized expression with robot's behavior

Eun-Sook Jee; S.M. Park; Chong Hui Kim; Hisato Kobayashi

In human-robot interaction, a robot's emotion expression is an important issue for interacting with humans more dynamically. In this paper, we consider two issues from the viewpoint of real robot implementation: emotional intensity expression and expression synchronized with the robot's behavior. To express emotional intensity, we regulate the tempo, pitch, and volume of a base emotion sound, which are parameters controllable by the robot's computer system. We also suggest a musical structure with a repeatable section that can be synchronized with the robot's behavior.
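A minimal sketch of the intensity mechanism described above: a pre-composed base emotion sound is replayed with its tempo, pitch, and volume scaled by an intensity value. The specific scaling ranges below are assumptions for illustration, not the parameter values used in the paper.

```python
# Hypothetical sketch: modulating a base emotion sound by intensity.
from dataclasses import dataclass

@dataclass
class SoundParams:
    tempo_bpm: float
    pitch_semitones: float  # transposition relative to the base clip
    volume: float           # 0.0 .. 1.0

def modulate(base: SoundParams, intensity: float) -> SoundParams:
    """Scale the controllable parameters with intensity (0 = weakest, 1 = strongest)."""
    intensity = max(0.0, min(1.0, intensity))
    return SoundParams(
        tempo_bpm=base.tempo_bpm * (0.8 + 0.4 * intensity),      # 80%..120% of base tempo (assumed)
        pitch_semitones=base.pitch_semitones + 2.0 * intensity,  # raise pitch up to 2 semitones (assumed)
        volume=base.volume * (0.5 + 0.5 * intensity),            # 50%..100% of base volume (assumed)
    )

# A "happy" base clip played at low and high intensity.
happy_base = SoundParams(tempo_bpm=120.0, pitch_semitones=0.0, volume=0.8)
print(modulate(happy_base, 0.2))
print(modulate(happy_base, 0.9))
```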


robot and human interactive communication | 2013

Research trends in Art and Entertainment Robots (AnE Robots)

Si Jung Kim; Jeong Hyeon Cheon; Steven Forsyth; Eun-Sook Jee

The world robot population has just reached around 10 million, roughly the population of a country such as Belgium or Hungary; in other words, a number equal to about 0.16% of the world's human population are robots. This paper describes trends in Art and Entertainment Robots (AnE robots) based on a qualitative study. The study focused on investigating the qualitative and quantitative aspects of AnE robots published in three major conferences over the past three years.


international conference on ubiquitous robots and ambient intelligence | 2012

Novel musical notation for emotional sound expression of interactive robot

Kyoung Soo Chun; Eun-Sook Jee; Dong-Soo Kwon

Through the voice, people can understand each other's emotional states. Since the auditory channel has an older communicative history than the visual channel, expression through sound is fundamentally more direct. Therefore, sound expression should be an important part of human-robot interaction research. In this paper, we propose a musical notation based on the spectrogram. With this notation, we designed 30 sound patterns for the expression of the English teacher robot.
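One way to picture a spectrogram-based notation is as a list of time-frequency segments that can be rendered directly to audio. The encoding below is an assumption made for illustration; the paper's actual notation and sound patterns may differ.

```python
# Hypothetical sketch: a sound pattern written as (start_s, duration_s, frequency_hz)
# segments on a time-frequency grid, then rendered to a mono waveform.
import numpy as np

SAMPLE_RATE = 16000

def render(pattern, total_s=1.0):
    """Render a time-frequency pattern to a waveform of sine segments."""
    signal = np.zeros(int(total_s * SAMPLE_RATE))
    for start_s, dur_s, freq_hz in pattern:
        t = np.arange(int(dur_s * SAMPLE_RATE)) / SAMPLE_RATE
        seg = 0.5 * np.sin(2 * np.pi * freq_hz * t)
        i = int(start_s * SAMPLE_RATE)
        signal[i:i + len(seg)] += seg
    return signal

# A hypothetical "greeting" pattern: a rising two-tone gesture.
greeting = [(0.0, 0.3, 440.0), (0.4, 0.3, 660.0)]
audio = render(greeting)
print(audio.shape)  # (16000,)
```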


IFAC Proceedings Volumes | 2008

Emotional Exchange of a Socially Interactive Robot

Dong-Soo Kwon; Myung Jin Chung; Jong C. Park; Chang D. Yoo; Eun-Sook Jee; Kh Park; Young-Min Kim; Hyoung-Rock Kim; Jong-Chan Park; Hye-Jin Min; Jeong Woo Park; Sungrack Yun; Kyung-Won Lee

This paper presents an emotional exchange framework for a socially interactive robot. The purpose of emotional exchange in social interaction between a robot and people is to make people feel that the robot is a believable living assistant, not a mere machine for information translation. Our emotional exchange framework is composed of emotion recognition, generation, and expression systems. A user's emotion is recognized through multiple modalities such as voice, dialogue, and touch. The robot's emotion is generated according to a psychological theory about cognitive emotions caused by social interaction among people. Furthermore, the emotion intensity is regulated by the robot's loyalty level to different users. The generated emotion is dynamically expressed through the robot's facial expression, gesture, and musical sound. The proposed system, which comprises all three components necessary for a full emotional interaction cycle, is implemented and tested on a real robot system. The proposed framework can serve as a cornerstone for the design of emotion interaction and generation systems for robots.
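The loyalty-based intensity regulation mentioned above can be illustrated by a single scaling step. The linear scaling and the 0..1 loyalty scale are assumptions for illustration, not the authors' actual formulation.

```python
# Hypothetical sketch: scaling the generated emotion intensity by loyalty.
def regulate_intensity(base_intensity: float, loyalty: float) -> float:
    """Scale emotion intensity by the robot's loyalty to the current user.

    Both base_intensity and loyalty are assumed to lie in [0, 1].
    """
    return max(0.0, min(1.0, base_intensity * loyalty))

print(regulate_intensity(0.8, 0.5))  # a less loyal relationship yields a weaker expression
```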


Intelligent Service Robotics | 2010

Sound design for emotion and intention expression of socially interactive robots

Eun-Sook Jee; Yong-Jeon Jeong; Chong Hui Kim; Hisato Kobayashi


Archive | 2009

Sound Production for the Emotional Expression of Socially Interactive Robots

Eun-Sook Jee; Yong-Jeon Cheong; Chong Hui Kim; Dong-Soo Kwon; Hisato Kobayashi


Journal of Robotics and Mechatronics | 2011

Modulation of Musical Sound Clips for Robot’s Dynamic Emotional Expression

Eun-Sook Jee; Chong Hui Kim; Hisato Kobayashi


The Fifth IARP-IEEE/RAS-EURON Joint Workshop on Technical Challenges for Dependable Robots in Human Environments | 2007

Emotion Recognition, Generation, Expression System for Dependable Home Service Robot

Dong-Soo Kwon; Yoon Keun Kwak; J.C. Park; Myung-Jin Chung; Eun-Sook Jee; Hyoung-Rock Kim; Young-Geun Kim
