Publication


Featured research published by Inwook Hwang.


International Conference on Control, Automation and Systems | 2008

Design and control of omni-directional mobile robot for Mobile Haptic Interface

Kyung-Lyong Han; Oh Kyu Choi; In Lee; Inwook Hwang; Jin S. Lee; Seungmoon Choi

A mobile haptic interface (MHI) is a system in which a grounded force-feedback haptic interface is mounted on a mobile robot to give the user an effectively unlimited workspace, especially for large virtual environments. Because the mobile base of an MHI needs to change its movement direction quickly, an omni-directional robot is preferred. In this paper, we present a novel omni-directional mobile robot designed for a mobile haptic interface (MHI), which uses four custom-made Mecanum wheels to provide higher operational stability. We also implemented two PI control methods (conventional independent motor control, and motor control combined with Cartesian velocity control) and empirically evaluated their performance. The experimental results indicated that the developed holonomic mobile robot allows the accurate velocity control required for the MHI.
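As a rough illustration of the second control scheme, mapping a desired Cartesian body velocity onto per-wheel PI loops through the standard Mecanum inverse kinematics, consider the minimal sketch below; the wheel geometry, gains, and helper names are hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical robot geometry (placeholders, not the paper's values)
R = 0.05   # wheel radius [m]
L = 0.20   # half of wheelbase (front-back) [m]
W = 0.15   # half of track width (left-right) [m]

def mecanum_inverse_kinematics(vx, vy, wz):
    """Map a desired Cartesian body velocity (vx, vy, wz) to the four
    Mecanum wheel angular velocities (standard X-configuration)."""
    k = L + W
    return np.array([
        (vx - vy - k * wz) / R,   # front-left
        (vx + vy + k * wz) / R,   # front-right
        (vx + vy - k * wz) / R,   # rear-left
        (vx - vy + k * wz) / R,   # rear-right
    ])

class WheelPI:
    """Per-wheel PI velocity controller (gains are placeholders)."""
    def __init__(self, kp=2.0, ki=5.0, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, w_ref, w_meas):
        err = w_ref - w_meas          # wheel speed error [rad/s]
        self.integral += err * self.dt
        return self.kp * err + self.ki * self.integral  # motor command

# Usage: command the base to translate forward while rotating slowly.
controllers = [WheelPI() for _ in range(4)]
w_ref = mecanum_inverse_kinematics(vx=0.3, vy=0.0, wz=0.1)
w_meas = np.zeros(4)  # would come from wheel encoders
commands = [c.update(r, m) for c, r, m in zip(controllers, w_ref, w_meas)]
```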


Industrial Robot: An International Journal | 2009

Cooperative robotic assistant with drill‐by‐wire end‐effector for spinal fusion surgery

Jongwon Lee; Inwook Hwang; Keehoon Kim; Seungmoon Choi; Wan Kyun Chung; Young Soo Kim

Purpose – The purpose of this paper is to present a surgical robot for spinal fusion and its control framework, which provide higher operation accuracy, greater flexibility of robot position control, and improved ergonomics. Design/methodology/approach – A human-guided robot for spinal fusion surgery has been developed with a dexterous end-effector that is capable of high-speed drilling for cortical layer gimleting and tele-operated insertion of screws into the vertebrae. The end-effector is position-controlled by a five degrees-of-freedom robot body with a kinematically closed structure that withstands the strong reaction forces occurring during surgery. The robot also allows the surgeon to cooperatively control the position and orientation of the end-effector, providing maximum flexibility in exploiting his or her expertise. Also incorporated for improved safety is a "drill-by-wire" mechanism in which a screw is tele-drilled by the surgeon through a mechanically decoupled master/slave system. Finally, ...


IEEE Haptics Symposium | 2010

Perceptual space and adjective rating of sinusoidal vibrations perceived via mobile device

Inwook Hwang; Seungmoon Choi

Over the past five years, how to use haptic technology to improve the limited user interfaces of mobile devices has emerged as an attractive research topic. In this paper, we report two kinds of perceptual data related to vibrotactile signals perceived through a mobile device held in the hand. In Experiment I, we estimated perceptual dissimilarities between sinusoidal vibrations at seven frequencies between 40 and 250 Hz and two amplitudes of 30 and 40 dB SL. Multi-dimensional scaling was then applied to the perceptual distances, yielding a two-dimensional perceptual space. In this space, the vibrations of the two amplitudes formed distinct groups, and the two groups showed similar structures with respect to frequency. In particular, the two perceptual dimensions spanned by a low frequency range (40–100 Hz) and a high frequency range (100–250 Hz) were close to orthogonal. In Experiment II, we evaluated the subjective qualities of sinusoidal vibrations of different frequencies via adjective rating. Thirteen adjective pairs were carefully selected and rated for sinusoidal vibrations played through the mobile device. The ratings were regressed onto the perceptual space, revealing several adjective pairs that largely account for the distribution of vibration points in the space, such as 'dark-bright,' 'dull-clear,' 'slow-fast,' 'vague-distinct,' 'thick-thin,' and 'heavy-light.' These findings can help in understanding the perceptual characteristics and subjective impressions of mobile device vibrations.
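For readers unfamiliar with the analysis pipeline, the following is a minimal sketch of applying multi-dimensional scaling to a precomputed dissimilarity matrix; the matrix here is a random placeholder standing in for the measured pairwise dissimilarities, and only the scikit-learn usage is meant to be illustrative.

```python
import numpy as np
from sklearn.manifold import MDS

# Placeholder 14x14 dissimilarity matrix for 7 frequencies x 2 amplitudes;
# the actual matrix would come from the pairwise dissimilarity judgments.
rng = np.random.default_rng(0)
d = rng.random((14, 14))
dissim = (d + d.T) / 2          # symmetrize
np.fill_diagonal(dissim, 0.0)   # zero self-dissimilarity

# Embed the stimuli into a 2-D perceptual space from the precomputed distances.
mds = MDS(n_components=2, dissimilarity='precomputed', random_state=0)
coords = mds.fit_transform(dissim)   # one 2-D point per vibration stimulus
print(coords.shape)                  # (14, 2)
```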


IEEE Transactions on Haptics | 2014

Consonance of Vibrotactile Chords

Yongjae Yoo; Inwook Hwang; Seungmoon Choi

This paper is concerned with the perception of complex vibrotactile stimuli in which a few sinusoidal vibrations with different frequencies are superimposed. We begin with the observation that such vibrotactile signals are analogous to musical chords in which multiple notes are played simultaneously. A set of so-called "vibrotactile chords" is designed on the basis of musical chords, and the degrees of consonance (harmony) that participants perceive are evaluated through a perceptual experiment. Experimental results indicate that participants can reliably rate the degree of consonance of vibrotactile chords, and they establish a well-defined function that relates the degree of consonance to the base and chordal frequencies of a vibrotactile chord. These findings have direct implications for the design of complex vibrotactile signals that can be produced by current wideband actuators such as voice-coil, piezoelectric, and electroactive polymer actuators.
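A vibrotactile chord in this sense is simply a superposition of sinusoids at a base frequency and a chordal frequency. The sketch below synthesizes such a signal; the function name, frequencies, and amplitudes are chosen only for illustration.

```python
import numpy as np

def vibrotactile_chord(base_hz, chordal_hz, duration=1.0, fs=8000, amp=0.5):
    """Superimpose two sinusoidal vibrations to form a 'vibrotactile chord'."""
    t = np.arange(int(duration * fs)) / fs
    signal = amp * np.sin(2 * np.pi * base_hz * t) \
           + amp * np.sin(2 * np.pi * chordal_hz * t)
    return signal / np.max(np.abs(signal))   # normalize for the actuator

# Example: an 80 Hz base with a 160 Hz chordal component (an octave-like chord).
chord = vibrotactile_chord(80, 160)
```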


World Haptics Conference | 2011

The haptic crayola effect: Exploring the role of naming in learning haptic stimuli

Inwook Hwang; Karon E. MacLean; Matthew Brehmer; Jeff C. Hendy; Andreas Sotirakopoulos; Seungmoon Choi

A haptic icon is a short physical stimulus attached to a simple meaning, which provides information and feedback to a user. To scale the utility demonstrated for small icon sets to larger ones, we need efficient strategies to help users learn subtle distinctions among stimuli, in a modality for which they may not hold detailed descriptive percepts. This paper investigates the effect of naming haptic stimuli, i.e. explicitly creating a linguistic marker, on the accuracy with which users are able to identify, distinguish, and recall stimuli. We conducted a between-subjects experiment with 60 participants divided equally among three naming conditions: no names, pre-selected non-descriptive names, and self-selected names. The experiment examined the impact of naming strategy on participants' ability to identify stimuli in a nonverbal matching test and to remember stimulus names. For this challenging task and the degree of learning afforded, naming did not significantly impact the accuracy of matching stimuli to meanings across all participants. However, more than twice as many of those allowed to choose their own names reported being able to remember and distinguish the stimuli as those required to use non-descriptive names, and many participants felt that the names were useful. Among middle-performing participants, the self-selected names group performed significantly better than the non-descriptive names group and appeared to progress more quickly in learning. We summarize evidence for a trend that might widen with refined naming strategies and more extensive learning.


Korea-Japan Joint Workshop on Frontiers of Computer Vision | 2015

A gesture based TV control interface for visually impaired: Initial design and user study

Inwook Hwang; Hyun-Cheol Kim; Jihun Cha; Chung-Hyun Ahn; Karam Kim; Jong-Il Park

We introduce our initial design of a gesture interface for TV control by visually impaired users, along with its implementation and a user study. Two bare-hand gesture sets were designed using simple linear and circular arm motions. For the two gesture sets, a rule-based recognition system was developed using a Kinect sensor. In the user study, the linear and circular gesture sets running on our recognition system scored similarly to each other and received equal or better subjective ratings than a commercial pointer-type gesture interface. The users were negative toward pointer-type gestures and expressed concerns about the large motions required by linear gestures and the difficulty of matching circular gestures.
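As a hypothetical illustration of what a rule-based recognizer for the linear gesture set might look like, the sketch below classifies a buffered hand trajectory by its dominant displacement axis; the thresholds, axis conventions, and gesture labels are assumptions, not the rules used in the paper.

```python
import numpy as np

def classify_linear_gesture(hand_xyz, min_dist=0.25):
    """Classify a buffered hand trajectory (N x 3 positions in meters) as a
    linear swipe along its dominant axis, or None if the motion is too small.
    The distance threshold and labels are placeholders."""
    disp = hand_xyz[-1] - hand_xyz[0]        # net displacement over the window
    axis = int(np.argmax(np.abs(disp)))      # dominant axis: 0=x, 1=y, 2=z
    if np.abs(disp[axis]) < min_dist:
        return None                          # not enough motion to be a gesture
    names = [('left', 'right'), ('down', 'up'), ('pull', 'push')]
    return names[axis][int(disp[axis] > 0)]

# Example: a 30-frame trajectory moving about 0.3 m to the right.
traj = np.linspace([0.0, 0.0, 2.0], [0.3, 0.0, 2.0], 30)
print(classify_linear_gesture(traj))   # 'right'
```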


World Haptics Conference | 2011

TAXEL: Initial progress toward self-morphing visio-haptic interface

Ki-Uk Kyung; Jeong-Mook Lim; Yo-An Lim; Suntak Park; Seung Koo Park; Inwook Hwang; Seungmoon Choi; Jongman Seo; Sang-Youn Kim; Tae-Heon Yang; Dong-Soo Kwon

This paper proposes TAXEL, a new interactive interface aimed at a reconfigurable, self-morphing visio-haptic display. We first present the overall architecture and concept of a self-morphing visio-haptic interface. Key hardware components are developed and tested, including three tactile actuators, based on a piezoelectric active linear actuator, a passive MR-fluid actuator, and a thin film-type actuator, respectively, and a flexible visual display based on light-waveguide technology. Using the developed components, a tactile platform that includes an 8×16 array of the linear actuators is implemented as a proof of concept. A rendering engine is also designed for the tactile platform, with emphasis on the use of haptic feedback together with a GUI. We also carried out a user study with virtual button simulation as a benchmark to evaluate the performance of the TAXEL tactile platform. Lastly, an integrated system with a visual display is demonstrated along with several application examples.
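To give a sense of the rendering engine's job, the hypothetical sketch below rasterizes a GUI button rectangle onto commanded heights for the 8×16 actuator array; the screen resolution, pixel-to-taxel mapping, and function name are assumptions, with only the array size taken from the paper.

```python
import numpy as np

ROWS, COLS = 8, 16                 # taxel array size from the paper
SCREEN_W, SCREEN_H = 320, 160      # hypothetical display resolution (pixels)

def render_button(x, y, w, h, height=1.0):
    """Raise the taxels under a GUI button given in screen pixels.
    Returns an 8x16 array of commanded actuator heights (0..1)."""
    taxels = np.zeros((ROWS, COLS))
    c0 = int(x / SCREEN_W * COLS)
    r0 = int(y / SCREEN_H * ROWS)
    c1 = min(int((x + w) / SCREEN_W * COLS), COLS - 1)
    r1 = min(int((y + h) / SCREEN_H * ROWS), ROWS - 1)
    taxels[r0:r1 + 1, c0:c1 + 1] = height
    return taxels

# Example: a button occupying the lower-left quarter of the screen.
frame = render_button(0, 80, 160, 80)
```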


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2014

Improved Haptic Music Player with Auditory Saliency Estimation

Inwook Hwang; Seungmoon Choi

This paper presents improvements to our previous haptic music player, designed to enhance the music listening experience on mobile devices. The previous player featured dual-band rendering: it delivers the bass beats in music as rough superimposed vibrations and the salient high-frequency features as smooth high-frequency vibrations. This work extends the previous algorithm by taking auditory saliency into account when determining the intensity of the rendered vibration. The auditory saliency is estimated in real time from several auditory features of the music. The feasibility of multiband rendering was also tested using a wideband actuator. We carried out a user study to evaluate the subjective performance of three haptic music playing modes: saliency-improved dual-band rendering, saliency-improved multiband rendering, and our previous dual-band rendering. Experimental results showed that the new dual-band mode has perceptual merits over the multiband mode and the previous dual-band mode, particularly for rock and dance music. These results can contribute to enhancing the multimedia experience by means of vibrotactile rendering of music.
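As a rough sketch of the dual-band idea, the code below band-splits an audio signal, extracts the amplitude envelope of each band, and scales the result by a placeholder saliency weight that would drive two vibration carriers; the filter cutoffs, envelope method, and weighting are assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def dual_band_envelopes(audio, fs, low_cut=200.0, high_cut=2000.0):
    """Split audio into a bass band and a high band and return their
    amplitude envelopes, which would modulate two vibration carriers."""
    b_lo, a_lo = butter(4, low_cut / (fs / 2), btype='low')
    b_hi, a_hi = butter(4, high_cut / (fs / 2), btype='high')
    bass = filtfilt(b_lo, a_lo, audio)
    high = filtfilt(b_hi, a_hi, audio)
    return np.abs(hilbert(bass)), np.abs(hilbert(high))

# Toy "music": a 60 Hz bass tone plus a weak 5 kHz component.
fs = 44100
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 60 * t) + 0.3 * np.sin(2 * np.pi * 5000 * t)
env_bass, env_high = dual_band_envelopes(audio, fs)

saliency = 0.7                       # placeholder auditory saliency (0..1)
vib_amp_low = env_bass * saliency    # drives the low-frequency rough vibration
vib_amp_high = env_high * saliency   # drives the high-frequency smooth vibration
```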


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2012

Effect of mechanical ground on the vibrotactile perceived intensity of a handheld object

Inwook Hwang; Seungmoon Choi

This study investigates the effect of mechanical ground on the perceived intensity of vibration transmitted through a handheld object. To this end, we carried out an intensity matching experiment in which the points of subjective equality were measured between grounded and ungrounded conditions. Results showed that the grounded vibrations were perceived to be 1.63–1.86 times stronger than the ungrounded vibrations, and this intensity difference decreased with increasing vibration frequency. Our results are in line with the general finding that afferent movements, which are more apparent under the ungrounded condition, may induce tactile suppression.
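For reference, if the reported ratios are interpreted as amplitude ratios, they correspond to roughly 4 to 5 dB, as the short conversion below shows; this interpretation is an assumption, not stated in the abstract.

```python
import numpy as np

# Interpreting the PSE ratios as vibration amplitude ratios, the
# grounded/ungrounded difference expressed in decibels would be:
ratios = np.array([1.63, 1.86])
db = 20 * np.log10(ratios)
print(db)   # approximately [4.2, 5.4] dB
```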


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2009

System improvements in Mobile Haptic Interface

In Lee; Inwook Hwang; Kyung-Lyoung Han; Oh Kyu Choi; Seungmoon Choi; Jin S. Lee

A Mobile Haptic Interface (MHI), a force-feedback haptic interface with a mobile base, makes it possible to render very large virtual objects in a safe and portable manner. In this paper, we present a novel MHI system featuring: 1) an extended horizontal workspace using an omni-directional mobile base, 2) an extended vertical workspace using a linear lift, 3) high-accuracy estimation of the position of the haptic interface tool, 4) an efficient motion planning algorithm that moves the mobile base while avoiding collisions with the user and other objects, and 5) closed-loop force control that compensates for the undesired effect of the mobile base dynamics on the final rendering force perceived by the user. As a consequence, our MHI system can provide high-quality haptic rendering of large virtual objects.
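The closed-loop force control in item 5 can be pictured as trimming the commanded force with the measured force error at the tool; the sketch below is a minimal hypothetical PI compensator, with gains and interfaces that are assumptions rather than the paper's controller.

```python
class ForceCompensator:
    """Hypothetical PI loop that trims the force command so that the force
    actually felt at the tool tracks the desired rendering force, despite
    disturbances introduced by the moving base."""
    def __init__(self, kp=0.8, ki=10.0, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, f_desired, f_measured):
        err = f_desired - f_measured          # rendering force error [N]
        self.integral += err * self.dt
        return f_desired + self.kp * err + self.ki * self.integral

# Example: desired 2 N at the tool, force sensor currently reads 1.7 N.
comp = ForceCompensator()
command = comp.update(2.0, 1.7)
```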

Collaboration


Dive into Inwook Hwang's collaborations.

Top Co-Authors

Seungmoon Choi (Pohang University of Science and Technology)
Jongman Seo (Pohang University of Science and Technology)
Oh Kyu Choi (Pohang University of Science and Technology)
Hyeseon Lee (Pohang University of Science and Technology)
In Lee (Pohang University of Science and Technology)
Jin S. Lee (Pohang University of Science and Technology)
Myongchan Kim (Pohang University of Science and Technology)
Yongjae Yoo (Pohang University of Science and Technology)
Chung-Hyun Ahn (Electronics and Telecommunications Research Institute)