Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jong-Chan Park is active.

Publication


Featured research published by Jong-Chan Park.


Robot and Human Interactive Communication | 2007

Emotion Interaction System for a Service Robot

Dong-Soo Kwon; Yoon Keun Kwak; Jong C. Park; Myung Jin Chung; Eun-Sook Jee; Kh Park; Hyoung-Rock Kim; Young-Min Kim; Jong-Chan Park; Eun Ho Kim; Kyung Hak Hyun; Hye-Jin Min; Hui Sung Lee; Jeong Woo Park; Su Hun Jo; S.M. Park; Kyung-Won Lee

This paper introduces an emotion interaction system for a service robot. The purpose of emotion interaction systems in service robots is to make people feel that the robot is not a mere machine, but a reliable living assistant in the home. The emotion interaction system is composed of the emotion recognition, generation, and expression systems. A user's emotion is recognized through multiple modalities, such as voice, dialogue, and touch. The robot's emotion is generated according to a psychological theory of emotion, the OCC (Ortony, Clore, and Collins) model, which focuses on the user's emotional state and information about the environment and the robot itself. The generated emotion is expressed through the robot's facial expression, gesture, and musical sound. Because the proposed system comprises all three components necessary for a full emotional interaction cycle, it can be implemented in a real robot system and tested. Even though the multi-modality in emotion recognition and expression is still in its rudimentary stages, the proposed system is shown to be extremely useful in service robot applications. Furthermore, the proposed framework can be a cornerstone for the design of emotion interaction and generation systems for robots.
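The abstract describes a three-stage cycle (recognition, generation, expression). Below is a minimal sketch of that cycle; the heuristics, thresholds, and function names are illustrative assumptions that reduce the OCC model to a toy appraisal mapping, not the authors' implementation.

```python
# Minimal sketch of the recognize -> generate -> express cycle described above.
# All names and appraisal rules are illustrative assumptions, not the authors'
# implementation; the OCC model is reduced here to a tiny rule table.

from dataclasses import dataclass

@dataclass
class Appraisal:
    desirability: float      # how desirable the sensed event is
    praiseworthiness: float  # how praiseworthy the user's action is

def recognize(voice: str, dialogue: str, touch: str) -> Appraisal:
    """Fuse multi-modal cues into a coarse appraisal (placeholder heuristics)."""
    desirability = 1.0 if "thank" in dialogue.lower() else -0.5 if "stop" in dialogue.lower() else 0.0
    praiseworthiness = 0.5 if touch == "pat" else -0.5 if touch == "hit" else 0.0
    return Appraisal(desirability, praiseworthiness)

def generate(appraisal: Appraisal) -> str:
    """OCC-style mapping from appraisal variables to a discrete emotion."""
    if appraisal.desirability > 0 and appraisal.praiseworthiness >= 0:
        return "joy"
    if appraisal.praiseworthiness < 0:
        return "anger"
    if appraisal.desirability < 0:
        return "distress"
    return "neutral"

def express(emotion: str) -> dict:
    """Route one emotion to the three expression channels named in the paper."""
    return {"face": emotion, "gesture": f"{emotion}_gesture", "sound": f"{emotion}_melody"}

if __name__ == "__main__":
    emotion = generate(recognize(voice="rising pitch", dialogue="Thank you!", touch="pat"))
    print(express(emotion))  # {'face': 'joy', 'gesture': 'joy_gesture', 'sound': 'joy_melody'}
```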


Robot and Human Interactive Communication | 2008

Design of a robot head for emotional expression: EEEX

Hyunsoo Song; Young-Min Kim; Jong-Chan Park; Chong Hui Kim; Dong-Soo Kwon

This paper describes the design of a new robot head with three special parts for emotional expression. The emotion of robots has become an issue of interest in the field of robotics. This robot head, EEEX (exaggerating emotion expresser), is developed to efficiently express various emotions and is useful in human-robot interaction. EEEX has three special elements, which humans do not possess, to express emotions clearly. The first is a pair of arm-type antennae, each consisting of a 3-DOF link system, used to create dynamic emotional movements; they are specialized parts that show emotion by copying human arm motion. Second, emoticon-eyes help represent emotions intuitively, since the eyes are the most important parts for conveying emotion. Lastly, an exaggerating jaw and cheeks assist in expressing exaggerated emotions, particularly surprise and anger. These three parts, the arm-type antennae, emoticon-eyes, and exaggerating jaw and cheeks, help humans recognize a robot's emotions more clearly.


Robot and Human Interactive Communication | 2007

Behavior Coordination of Socially Interactive Robot using Sentiment Relation Model

Young-Min Kim; Jong-Chan Park; Dong-Soo Kwon

The social capability of a robot has become one of the important issues in human-robot interaction (HRI). In particular, having a robot form a social relationship with people is significant for improving the robot's believability through more natural communication. In this paper, we propose a formal approach for making a robot establish and learn a social relationship based on affective relations from a sociological perspective. The main idea is to represent the sentiment relation (liking/disliking) among social individuals, which is regarded in sociology as a basis for forming interpersonal relations. Our Sentiment Relation model can be applied to implementing loyalty in service robots, under the assumptions that a service robot must have a highly positive relationship with its host and tends to behave so as to minimize the tension (stress) caused by unbalanced states, which arise from differing affective states between individuals in a social group. To confirm the feasibility of our model, reinforcement learning-based behavior coordination using the loyalty level is simulated in a simple grid world.
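To make the tension-minimization idea concrete, here is a hedged sketch that treats the Sentiment Relation model as Heider-style triad balance: a triad is unbalanced when the product of its sentiment signs is negative, and the accumulated imbalance plays the role of the tension a behavior coordinator would try to reduce. The agents, sentiment values, and formula are assumptions for illustration, not the paper's exact model.

```python
# Illustrative sketch of "tension from unbalanced states", using Heider-style
# triad balance as a stand-in for the paper's Sentiment Relation model.
# The agents, sentiment values, and tension formula are assumptions.

from itertools import combinations

# Signed sentiment (liking > 0, disliking < 0) between pairs of individuals.
sentiment = {
    ("robot", "host"): +1.0,   # loyalty: strongly positive toward the host
    ("robot", "guest"): +0.5,
    ("host", "guest"): -0.5,
}

def s(a: str, b: str) -> float:
    return sentiment.get((a, b), sentiment.get((b, a), 0.0))

def tension(agents: list[str]) -> float:
    """Sum of imbalance over all triads: a triad is unbalanced when the
    product of its three sentiment signs is negative."""
    total = 0.0
    for a, b, c in combinations(agents, 3):
        product = s(a, b) * s(b, c) * s(a, c)
        total += max(0.0, -product)  # only unbalanced triads add tension
    return total

print(tension(["robot", "host", "guest"]))  # 0.25 -> the robot is under social stress
# A learning-based coordinator would choose behaviors (e.g. siding with the
# host) whose predicted sentiment updates reduce this tension value.
```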


International Conference on Industrial Electronics, Control and Instrumentation | 1991

A framework for the evaluation and selection of assembly plans

Jong-Chan Park; Dong-Soo Kwon; Myung-Jin Chung

The authors introduce four criteria for the evaluation of assembly plans, along with quantitative measures corresponding to the criteria. The quantitative measures are used to search for an optimal assembly plan from the modified AND/OR graph representation of a product's assembly plans using a branch-and-bound algorithm. The graph-search technique is used to avoid complete enumeration and evaluation of all feasible assembly plans of a product and to improve the efficiency of selecting the best plan. In the process of searching for an optimal assembly plan, the structure of the assembly system and the assembly tasks are determined.
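As an illustration of the search strategy, the sketch below selects a minimum-cost plan from a tiny AND/OR graph with pruning in the spirit of branch and bound. The graph, the costs, and the collapse of the four criteria into one scalar cost per assembly task are assumptions made for the example.

```python
# Hedged sketch of selecting a minimum-cost assembly plan from an AND/OR graph
# with branch-and-bound pruning. The graph encoding and cost values are
# illustrative; the paper's four evaluation criteria are collapsed here into a
# single scalar cost per assembly task.

# Each subassembly (OR node) maps to alternative decompositions (AND arcs):
# (task_cost, left_subassembly, right_subassembly). Single parts cost nothing.
AND_OR = {
    "ABCD": [(3.0, "AB", "CD"), (5.0, "ABC", "D")],
    "ABC":  [(2.0, "AB", "C")],
    "AB":   [(1.0, "A", "B")],
    "CD":   [(2.5, "C", "D")],
}

def best_plan(node: str, bound: float = float("inf")) -> float:
    """Return the cheapest cost of assembling `node`, pruning branches whose
    partial cost already exceeds the best complete plan found so far."""
    if node not in AND_OR:          # a single part: already "assembled"
        return 0.0
    best = bound
    # expand cheaper tasks first so the bound tightens early
    for task_cost, left, right in sorted(AND_OR[node]):
        if task_cost >= best:       # bound: this branch cannot win
            continue
        cost = task_cost + best_plan(left, best - task_cost)
        if cost >= best:
            continue
        cost += best_plan(right, best - cost)
        best = min(best, cost)
    return best

print(best_plan("ABCD"))  # 6.5 -> assemble AB (1.0) and CD (2.5), then join them (3.0)
```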


Advanced Robotics and its Social Impacts | 2010

Robot's behavior expressions according to the sentence types and emotions with modification by personality

Jong-Chan Park; Hyunsoo Song; Seongyong Koo; Young-Min Kim; Dong-Soo Kwon

Expression has become an important part of human-robot interaction as an intuitive communication channel between humans and robots. However, it is very difficult to construct a robot's behaviors one by one, so developers must consider how to create various robot motions easily. We therefore propose a behavior expression method based on sentence types and emotions. In this paper, robots express behaviors using multi-modal motion sets described as combinations of sentence types and emotions. To gather the data for the multi-modal motion sets, we used video analysis of an actress for the human-like modalities and conducted user tests for the non-human modalities. We developed a behavior edit toolkit to create and modify robot behaviors easily, and we also proposed stereotyped actions according to the robot's personality to diversify behavior expression. The 25 behaviors defined from the sentence types and emotions are applied to Silbot, a test-bed robot of the CIR in Korea, and used for English education.
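The selection mechanism can be pictured as a lookup keyed by (sentence type, emotion) whose result is then colored by personality. The sketch below illustrates that idea; the table entries, personality gains, and function names are hypothetical, and only a few of the 25 defined behaviors are mocked up.

```python
# Sketch of behavior selection by (sentence type, emotion) with personality
# "coloring". Table entries and the scaling rule are illustrative assumptions.

BEHAVIOR_TABLE = {
    ("declarative", "joy"):       {"gesture": "open_arms", "speed": 1.0},
    ("interrogative", "neutral"): {"gesture": "head_tilt", "speed": 0.8},
    ("imperative", "anger"):      {"gesture": "point_forward", "speed": 1.2},
}

PERSONALITY_GAIN = {"introvert": 0.7, "extrovert": 1.3}

def select_behavior(sentence_type: str, emotion: str, personality: str) -> dict:
    """Pick the motion set for the utterance and scale its dynamics by personality."""
    behavior = dict(BEHAVIOR_TABLE.get((sentence_type, emotion),
                                       {"gesture": "idle", "speed": 1.0}))
    behavior["speed"] *= PERSONALITY_GAIN.get(personality, 1.0)
    return behavior

print(select_behavior("declarative", "joy", "extrovert"))
# {'gesture': 'open_arms', 'speed': 1.3}
```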


The Journal of Korea Robotics Society | 2013

Design and Control of the Active Split Offset Caster based Omni-directional Vehicle

Han-Gyeol Kim; Do Ngoc Huan; Jong-Chan Park; Dong-Soo Kwon

This research investigates in depth the operation of an omni-directional mobile robot that can move with high acceleration. For high-acceleration performance, the vehicle utilizes the structure of Active Split Offset Casters (ASOCs). This paper focuses on the inverse kinematics of the structure, the hardware design that secures durability and preserves the wheels' contact with the ground during high acceleration, and the localization used for real-time position control.
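For a sense of what the inverse kinematics of one ASOC module involves, the following sketch maps a desired pivot velocity to the two wheel speeds of a single split offset caster. The geometry parameters (offset d, wheel separation s, wheel radius r) and the frame conventions are assumptions for this example rather than the paper's exact formulation.

```python
# Simplified inverse kinematics for a single Active Split Offset Caster (ASOC)
# module, as an illustrative sketch. Geometry and frame conventions are
# assumptions made for this example, not the paper's formulation.

import math

def asoc_wheel_speeds(vx: float, vy: float, heading: float,
                      d: float = 0.05, s: float = 0.10, r: float = 0.04) -> tuple[float, float]:
    """Map a desired pivot velocity (vx, vy) in the vehicle frame to the two
    wheel angular velocities of one caster module.

    heading: current orientation of the module relative to the vehicle frame.
    d: offset from pivot to wheel axle, s: wheel separation, r: wheel radius.
    """
    # express the desired pivot velocity in the module frame
    ux =  math.cos(heading) * vx + math.sin(heading) * vy   # along the rolling direction
    uy = -math.sin(heading) * vx + math.cos(heading) * vy   # lateral, produced via the offset
    # forward speed comes from the wheel-pair average, lateral speed from their
    # difference acting through the offset d
    w_left  = (ux - (s / (2.0 * d)) * uy) / r
    w_right = (ux + (s / (2.0 * d)) * uy) / r
    return w_left, w_right

# e.g. pure sideways motion is produced by counter-rotating the wheel pair
print(asoc_wheel_speeds(vx=0.0, vy=0.2, heading=0.0))  # (-5.0, 5.0) rad/s
```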


International Conference on Advanced Intelligent Mechatronics | 2012

Design of the active split offset casters based omni-directional vehicle in high acceleration condition

Han-Gyeol Kim; Do Ngoc Huan; Jong-Chan Park; Dong-Soo Kwon

Up to now, most studies of omni-directional vehicles have focused on omni-directional maneuverability. In contrast, this research investigates in depth the operation of an omni-directional mobile robot that can move with high acceleration. To fulfill that goal, the vehicle utilizes the structure of Active Split Offset Casters (ASOCs). This paper focuses on the hardware design and control algorithm that achieve high acceleration while preserving the vehicle's isotropy and optimal controllability.


IFAC Proceedings Volumes | 2008

Emotional Exchange of a Socially Interactive Robot

Dong-Soo Kwon; Myung Jin Chung; Jong C. Park; Chang D. Yoo; Eun-Sook Jee; Kh Park; Young-Min Kim; Hyoung-Rock Kim; Jong-Chan Park; Hye-Jin Min; Jeong Woo Park; Sungrack Yun; Kyung-Won Lee

This paper presents an emotional exchange framework for a socially interactive robot. The purpose of emotional exchange in social interaction between a robot and people is to make people feel that the robot is a believable living assistant, not a mere machine for information translation. Our emotional exchange framework is composed of the emotion recognition, generation, and expression systems. A user's emotion is recognized through multiple modalities such as voice, dialogue, and touch. The robot's emotion is generated according to a psychological theory about cognitive emotions caused by social interaction among people. Furthermore, the emotion intensity is regulated by the robot's loyalty level toward different users. The generated emotion is dynamically expressed through the robot's facial expression, gesture, and musical sound. The proposed system, which comprises all three components necessary for a full emotional interaction cycle, is implemented in a real robot system and tested. The proposed framework can be a cornerstone for the design of emotion interaction and generation systems for robots.
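The loyalty-based intensity regulation mentioned above can be pictured as a simple scaling step between emotion generation and expression; the scaling rule and values below are illustrative assumptions, not the authors' method.

```python
# Tiny sketch of "intensity regulated by loyalty": the same generated emotion
# is expressed more or less strongly depending on the robot's loyalty toward
# the current user. The scaling rule and loyalty values are assumptions.

def regulate_intensity(emotion: str, base_intensity: float, loyalty: float) -> float:
    """Scale emotion intensity by loyalty in [0, 1]; negative emotions toward
    high-loyalty users are damped more strongly (an illustrative choice)."""
    if emotion in ("anger", "distress"):
        return base_intensity * (1.0 - 0.5 * loyalty)
    return base_intensity * (0.5 + 0.5 * loyalty)

print(regulate_intensity("joy", 0.8, loyalty=1.0))    # 0.8 toward the host
print(regulate_intensity("anger", 0.8, loyalty=1.0))  # 0.4 -> softened toward the host
```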


Robot and Human Interactive Communication | 2009

Robot's individual emotion generation model and action coloring according to the robot's personality

Jong-Chan Park; Hyoung-Rock Kim; Young-Min Kim; Dong-Soo Kwon


Human-Robot Interaction | 2011

Make your wishes to 'genie in the lamp': physical push with a socially intelligent robot

Hye-Jin Min; Jong-Chan Park

Collaboration


Dive into Jong-Chan Park's collaboration.
