Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Gyung-hye Yang is active.

Publication


Featured research published by Gyung-hye Yang.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2004

Development of stress monitoring system based on personal digital assistant (PDA)

Mi-Hee Lee; Gyung-hye Yang; Hyoung-Ki Lee; Seok-won Bang

We have developed a nonintrusive stress monitoring system based on a PDA (Personal Digital Assistant). The system separates the sensing of physiological signals from the estimation of stress states. First, the sensing part consists of four electrodes: one PPG electrode, two EDA electrodes, and one SKT electrode. It measures heart rate, skin temperature variation, and electrodermal activity, all of which can be acquired from the finger without discomfort. Second, the estimating part was developed and verified on a physiological signal database obtained from multiple subjects by presenting stress stimuli elaborated to effectively induce stress. The system is a useful measure of human stress in portable devices such as PDAs and smart phones.
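The abstract names three signal types (PPG, SKT, EDA) feeding the stress estimator. The paper does not publish its feature-extraction code, so the following is only a minimal illustrative sketch of how such features might be computed from raw finger-sensor arrays; the function name, sampling rate, and peak-counting heuristic are assumptions, not the authors' method:

```python
import numpy as np

def extract_features(ppg, skt, eda, fs=100):
    """Illustrative features from the three signals the paper describes:
    heart rate (from PPG), skin-temperature variation (SKT), and
    electrodermal activity level (EDA). All inputs are 1-D arrays
    sampled at fs Hz. This is a hypothetical sketch, not the paper's code."""
    # Heart rate: count local maxima of the PPG waveform that lie above its mean
    above = ppg > ppg.mean()
    interior = above[1:-1] & (ppg[1:-1] > ppg[:-2]) & (ppg[1:-1] >= ppg[2:])
    n_beats = int(np.count_nonzero(interior))
    heart_rate = 60.0 * n_beats / (len(ppg) / fs)   # beats per minute

    skt_variation = skt.max() - skt.min()           # temperature swing over the window
    eda_level = eda.mean()                          # tonic skin-conductance level

    return np.array([heart_rate, skt_variation, eda_level])
```

A downstream classifier trained on the subjects' database would then map such feature vectors to stress states.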


International Symposium on Industrial Electronics | 2006

Recognition of Grip-Patterns by Using Capacitive Touch Sensors

Wook Chang; Kee-Eung Kim; Hyun-Jeong Lee; Joon Kee Cho; Byung Seok Soh; Jung Hyun Shim; Gyung-hye Yang; Sung-jung Cho; Joonah Park

A novel and intuitive way of accessing applications on mobile devices is presented. The key idea is to use the grip-pattern, which is naturally produced when a user tries to use the mobile device, as a clue to determine which application to launch. To this end, a capacitive touch sensor system is carefully designed and installed underneath the housing of the mobile device to capture the user's grip-pattern. The captured data is then recognized by a minimum distance classifier and a naive Bayes classifier. A recognition test is performed to validate the feasibility of the proposed user interface system.
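The abstract names a minimum distance classifier as one of the two recognizers. As a minimal sketch of that standard technique (the class name, feature layout, and training data are assumptions; the paper's actual sensor dimensionality and code are not given here):

```python
import numpy as np

class MinimumDistanceClassifier:
    """Assigns a sample to the class whose training-set centroid is
    nearest in Euclidean distance -- the textbook minimum distance
    classifier the abstract refers to. Illustrative sketch only."""

    def fit(self, X, y):
        # X: (n_samples, n_channels) capacitive readings; y: class labels
        self.classes_ = np.unique(y)
        self.centroids_ = np.array(
            [X[y == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, X):
        # Distance from every sample to every class centroid, then argmin
        d = np.linalg.norm(
            X[:, None, :] - self.centroids_[None, :, :], axis=2
        )
        return self.classes_[d.argmin(axis=1)]
```

Each grip produces a vector of capacitance readings; after fitting on labeled grips, `predict` returns the grip class (and hence the application) nearest in feature space.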


IEEE Transactions on Consumer Electronics | 2006

Vibrotactile rendering for simulating virtual environment in a mobile game

Sang-Youn Kim; Kyu Yong Kim; Byung Seok Soh; Gyung-hye Yang; Sang Ryong Kim

Vibrotactile rendering is the process of computing and generating haptic information in response to a user's interaction with virtual objects. The crucial step in vibrotactile rendering is to transform the simulated behavior of a virtual object into vibrotactile information according to the user's action. This paper presents a vibrotactile rendering method that expresses the reaction of a car on a road surface in a racing game using vibrotactile information. To this end, we first design a miniaturized vibrotactile rendering system with an eccentric vibration motor and a solenoid actuator, which generates vibrotactile information with a large bandwidth and amplitude. We also construct an interactive racing game as a test bed for the proposed vibrotactile rendering method. For the experiments, we designed voltage input patterns that humans can haptically discriminate. The proposed vibrotactile rendering, based on these patterns, generates control input to the vibrotactile actuators to make users realistically feel the sensation of a collision, driving over a bump, and driving on a hard shoulder. To evaluate the proposed method, nine participants experienced two kinds of racing games: one with the proposed vibrotactile rendering and one without it. After the experiment, participants were asked to compare and appraise the two games based on suggested criteria. The experiment clearly shows the effectiveness and feasibility of the proposed vibrotactile rendering method, which implies its applicability to mobile devices.
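The core of the method is a mapping from game events (collision, bump, hard shoulder) to drive patterns for the two actuators. The paper's actual voltage patterns are not reproduced here; the table below is a purely hypothetical sketch of the data structure such a renderer might use, with made-up voltages and durations:

```python
# Hypothetical event-to-pattern table. Each pattern is a list of
# (motor_voltage, solenoid_voltage, duration_ms) steps: the eccentric
# motor carries low-frequency rumble, the solenoid sharp transients.
# All numbers are illustrative assumptions, not the paper's values.
EVENT_PATTERNS = {
    "collision":     [(3.0, 5.0, 40), (1.5, 0.0, 60)],      # solenoid hit, motor decay
    "bump":          [(0.0, 4.0, 20), (2.0, 0.0, 30)],      # brief jolt then rumble
    "hard_shoulder": [(2.5, 0.0, 15), (0.0, 0.0, 15)] * 4,  # periodic rumble bursts
}

def render(event):
    """Return the actuator drive pattern for a simulated game event,
    or a single silent step when the event has no pattern."""
    return EVENT_PATTERNS.get(event, [(0.0, 0.0, 0)])
```

At run time, the game loop would detect the simulated car's state each frame, call `render`, and stream the resulting voltage steps to the actuator drivers.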


The International Journal of Fuzzy Logic and Intelligent Systems | 2006

GripLaunch: a Novel Sensor-Based Mobile User Interface with Touch Sensing Housing

Wook Chang; Joonah Park; Hyun-Jeong Lee; Joon Kee Cho; Byung Seok Soh; Jung Hyun Shim; Gyung-hye Yang; Sung-jung Cho

This paper describes a novel way of applying capacitive sensing technology to a mobile user interface. The key idea is to use the grip-pattern, which is naturally produced when a user tries to use the mobile device, as a clue to determine which application to launch. To this end, a capacitive touch sensing system is carefully designed and installed underneath the housing of the mobile device to capture the user's grip-pattern. The captured data is then recognized by dedicated recognition algorithms. The feasibility of the proposed user interface system is thoroughly evaluated with various recognition tests.
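The companion conference paper above names a naive Bayes classifier as one of the "dedicated recognition algorithms". A minimal Gaussian naive Bayes sketch of that standard technique follows (class name, feature layout, and the Gaussian-per-channel assumption are illustrative, not taken from the paper):

```python
import numpy as np

class GaussianNaiveBayes:
    """Naive Bayes with an independent Gaussian likelihood per sensor
    channel and class. Illustrative sketch of the standard algorithm."""

    def fit(self, X, y):
        # Per-class mean, variance, and log-prior over the channels
        self.classes_ = np.unique(y)
        self.mu_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.var_ = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes_])
        self.logprior_ = np.log([np.mean(y == c) for c in self.classes_])
        return self

    def predict(self, X):
        # Sum of per-channel Gaussian log-likelihoods, plus the log-prior
        ll = -0.5 * (
            np.log(2 * np.pi * self.var_)[None]
            + (X[:, None, :] - self.mu_[None]) ** 2 / self.var_[None]
        ).sum(axis=2)
        return self.classes_[(ll + self.logprior_).argmax(axis=1)]
```

The independence assumption across channels is what makes the classifier cheap enough for a mobile device: training reduces to per-channel means and variances, and prediction to a handful of array operations.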


Archive | 2013

Display apparatus and method thereof

Soo-yeoun Youn; O-jae Kwon; Yoo-tai Kim; Bong-Hyun Cho; Gyung-hye Yang; Eun-Hee Park


Archive | 2006

Haptic button and haptic device using the same

Kyu-yong Kim; Sang-youn Kim; Byung-seok Soh; Gyung-hye Yang; Yong-beom Lee


Archive | 2005

System and method for identifying objects in a space

Gyung-hye Yang; Hyoung-Ki Lee


Archive | 2006

Method and medium for variably arranging content menu and display device using the same

Gyung-hye Yang; Jung-hyun Shim; Hyun-Jeong Lee; Joonah Park


Archive | 2006

Input device supporting various input modes and apparatus using the same

Kyu-yong Kim; Sang-youn Kim; Yong-beom Lee; Byung-seok Soh; Gyung-hye Yang


Archive | 2006

Method and apparatus for providing touch screen user interface, and electronic devices including the same

Jung-hyun Shim; Gyung-hye Yang; Hyun-Jeong Lee; Joonah Park

Collaboration


Dive into Gyung-hye Yang's collaborations.
