Publication


Featured research published by Sunghoon Yim.


IEEE Transactions on Consumer Electronics | 2009

Gesture-recognizing hand-held interface with vibrotactile feedback for 3D interaction

Sangki Kim; Gunhyuk Park; Sunghoon Yim; Seungmoon Choi; Seungjin Choi

This article presents a hand-held interface system for 3D interaction with digital media content. The system features 1) tracking of the full 6-degrees-of-freedom position and orientation of a hand-held controller, 2) robust gesture recognition using continuous hidden Markov models based on acceleration and position measurements, and 3) dual-mode vibrotactile feedback using both a vibration motor and a voice-coil actuator. We also demonstrate the advantages of the system through a usability experiment.
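The continuous-HMM classification scheme mentioned in the abstract can be illustrated with a minimal sketch: one Gaussian HMM is trained per gesture on acceleration/position feature sequences, and an unknown sequence is assigned to the model with the highest log-likelihood. The sketch below uses the hmmlearn library and hypothetical feature arrays; it illustrates the general technique only and is not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): continuous-HMM gesture
# classification with one Gaussian HMM per gesture class.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_gesture_models(training_data, n_states=5):
    """training_data: dict mapping gesture name -> list of (T_i, D) feature
    sequences (e.g., acceleration + position samples). Returns one HMM per gesture."""
    models = {}
    for gesture, sequences in training_data.items():
        X = np.vstack(sequences)                   # concatenate all sequences
        lengths = [len(seq) for seq in sequences]  # per-sequence lengths
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        hmm.fit(X, lengths)
        models[gesture] = hmm
    return models

def classify(models, sequence):
    """Assign the sequence to the gesture whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda g: models[g].score(sequence))
```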


IEEE Haptics Symposium | 2012

Haptic simulation of refrigerator door

Sunghwan Shin; In Lee; Hojin Lee; Gabjong Han; Kyungpyo Hong; Sunghoon Yim; Jongwon Lee; Young Jin Park; Byeong Ki Kang; Dae Ho Ryoo; Dae Whan Kim; Seungmoon Choi; Wan Kyun Chung

Recently, haptics has begun to impact consumer products, e.g., mobile phones and automobiles. In this paper, we introduce one such new application, haptic simulation of refrigerator operation, and present an initial prototype for the front door. A one-degree-of-freedom haptic interface is designed and built to provide torque feedback for the front door. Simulation software consisting of system control, a graphic renderer, and a haptic renderer is also developed. For haptic rendering, the motion dynamics of the refrigerator door are modeled, and the haptic renderer is implemented based on this dynamics model. Lastly, we report a user experiment carried out to assess the perceived similarity between simulated and real door operation, i.e., the realism, which shows promising results.
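The abstract does not detail the door dynamics model, but the idea of rendering torque for a 1-DOF door can be sketched as below. The parameter values and the specific terms (viscous damping, dry hinge friction, a gasket "suction" spring near the closed position) are assumptions for illustration, not the paper's model.

```python
# Illustrative sketch (assumed model, not the paper's): 1-DOF torque feedback
# for a refrigerator front door.
import math

def door_torque(theta, omega,
                b=0.08,                            # viscous damping [N*m*s/rad] (assumed)
                tau_friction=0.15,                 # dry hinge friction [N*m] (assumed)
                k_gasket=4.0,                      # gasket stiffness [N*m/rad] (assumed)
                theta_gasket=math.radians(5.0)):   # gasket engagement angle (assumed)
    """Return the torque [N*m] to command on the 1-DOF haptic interface,
    given door angle theta [rad] (0 = closed) and angular velocity omega [rad/s]."""
    tau = -b * omega                               # damping opposes motion
    if abs(omega) > 1e-3:
        tau -= math.copysign(tau_friction, omega)  # Coulomb friction opposes motion
    if theta < theta_gasket:
        tau -= k_gasket * (theta_gasket - theta)   # gasket pulls the door shut near closed
    return tau
```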


IEEE Transactions on Human-Machine Systems | 2013

Haptic Assistance for Memorization of 2-D Selection Sequences

Hojin Lee; Gabjong Han; In Lee; Sunghoon Yim; Kyungpyo Hong; Hyeseon Lee; Seungmoon Choi

This paper investigates the effect of haptic feedback on the learning of a 2-D sequential selection task, used as an abstraction of complex industrial manual assembly tasks. This mnemonic-motor task requires memorizing the selection order of points scattered on a 2-D plane and reproducing this order using entire-arm movements. Four information presentation methods are considered: visual information only, visual information + enactment, visual information + haptic guidance, and visual information + haptic disturbance. The latter three methods provide different levels of kinesthetic haptic feedback to the trainee. We carried out a user study to assess the quantitative performance differences among the four training methods using a custom-built visuo-haptic training system. Experimental results showed the relative advantages and disadvantages of each information presentation method for both short-term and long-term memorization. In particular, training with only visual information was the best option for short-term memory, while training with added haptic disturbance was the most effective for long-term memory. Our findings have implications for designing a training method that suits given training requirements.


IEEE Transactions on Haptics | 2016

Data-Driven Haptic Modeling and Rendering of Viscoelastic and Frictional Responses of Deformable Objects

Sunghoon Yim; Seokhee Jeon; Seungmoon Choi

In this paper, we present an extended data-driven haptic rendering method capable of reproducing force responses during pushing and sliding interaction on a large surface area. The main part of the approach is a novel input variable set for training an interpolation model, which incorporates the position of a proxy, an imaginary contact point on the undeformed surface. This allows us to estimate friction in both the sliding and sticking states in a unified framework. The proxy position is estimated in real time by simulation using a sliding yield surface, a surface defining the border between the sliding and sticking regions in the external force space. During modeling, the sliding yield surface is first identified via an automated palpation procedure. Then, through manual palpation of a target surface, input data and the resultant force data are acquired and used to build a radial basis interpolation model. During rendering, this input-output mapping model estimates force responses in real time in accordance with the interaction input. A physical performance evaluation demonstrates that our approach achieves reasonably high estimation accuracy. A user study also shows plausible perceptual realism under diverse and extensive exploration.
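The interpolation step described above maps a recorded input-variable set to a measured force response. A minimal sketch of that idea, using SciPy's radial basis function interpolator and a hypothetical feature layout (tool position plus proxy position), is shown below; the actual input variables and model details follow the paper, not this sketch.

```python
# Illustrative sketch (assumptions noted): radial-basis interpolation from a
# recorded input-variable set to a 3-D force response.
import numpy as np
from scipy.interpolate import RBFInterpolator

def build_force_model(inputs, forces):
    """inputs: (P, D) array of recorded input variables, e.g., tool position (3)
    and proxy position (3); forces: (P, 3) array of measured reaction forces."""
    return RBFInterpolator(inputs, forces, kernel="thin_plate_spline")

def render_force(model, tool_pos, proxy_pos):
    """Query the interpolation model with the current interaction input."""
    x = np.concatenate([tool_pos, proxy_pos])[None, :]  # shape (1, D)
    return model(x)[0]                                  # estimated 3-D force
```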


IEEE Haptics Symposium | 2012

Shape modeling of soft real objects using force-feedback haptic interface

Sunghoon Yim; Seungmoon Choi

In this paper, we propose an interactive shape modeling system for soft, deformable real objects that uses a regular force-feedback haptic interface. The modeling system consists of the haptic interface and a force sensor. The user taps on the surface of an object to collect sample points. Our system detects a contact and then compensates for the error in the sampled point position caused by surface deformation. The sampled points are then processed by the alpha-shape algorithm to reconstruct a mesh model of the object. The performance of the proposed system is experimentally compared with that of a standard 3D optical scanner. The proposed system allows the user to easily model the shape of soft real objects with a common haptic interface, without any further equipment, and can be a viable alternative for deformable objects whose optical properties are inadequate for optical scanners.
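The deformation compensation mentioned above can be pictured with a small sketch: each sampled point is pushed back along the contact normal by the surface deflection estimated from the measured tapping force and a local stiffness. The compensation rule, the stiffness value, and the normal estimate here are assumptions for illustration, not the paper's method.

```python
# Illustrative sketch (assumed compensation rule): move a sampled contact point
# back to the estimated undeformed surface along the contact normal.
import numpy as np

def compensate_sample(point, contact_normal, contact_force, stiffness=300.0):
    """point: (3,) measured tool-tip position at contact detection [m]
    contact_normal: (3,) estimated outward surface normal
    contact_force: (3,) measured force [N]; stiffness in N/m (assumed)."""
    n = contact_normal / np.linalg.norm(contact_normal)
    deflection = max(0.0, float(np.dot(contact_force, n))) / stiffness
    return point + deflection * n   # shift outward by the estimated deflection
```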


World Haptics Conference | 2015

Data-driven haptic modeling and rendering of deformable objects including sliding friction

Sunghoon Yim; Seokhee Jeon; Seungmoon Choi

This paper presents an extended data-driven haptic rendering method capable of reproducing force responses during sliding interaction on a large surface area. The core of the approach is a novel input variable set for training the data interpolation model, which includes the position of a proxy, the contact point on the undeformed surface. This enables us to estimate friction in both the sliding and sticking states. The behavior of the proxy is simulated in real time based on a sliding yield surface, a cone-shaped surface separating the sliding and sticking regions in the external force space, which is identified through an automated palpation procedure. During modeling, input data and the resultant force data are collected through manual palpation of a target object and used to build a radial-basis interpolation model. During rendering, this interpolation model estimates force responses in real time from the interaction inputs. A performance evaluation shows that our approach achieves reasonable estimation accuracy. This work is among the first attempts to fully support sliding exploration in the data-driven paradigm.
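The stick/slip decision against a cone-shaped yield surface in force space can be sketched as follows. The real method identifies the cone from palpation data; in this sketch a single aperture coefficient mu stands in for it, and the proxy-update rule is a simplification for illustration only.

```python
# Illustrative sketch (assumptions noted): stick/slip test against a cone-shaped
# sliding yield surface in force space, with a simple proxy update.
import numpy as np

def update_proxy(proxy, tool, surface_normal, force, mu=0.6, gain=1.0):
    """proxy, tool: (3,) positions; surface_normal: (3,) unit normal;
    force: (3,) current external force estimate. Returns the new proxy position."""
    n = surface_normal / np.linalg.norm(surface_normal)
    f_n = np.dot(force, n)                 # normal force component (signed)
    f_t = force - f_n * n                  # tangential force component
    if np.linalg.norm(f_t) <= mu * max(f_n, 0.0):
        return proxy                       # inside the cone: sticking
    # Outside the cone: sliding -- move the proxy toward the tool's
    # tangential projection on the undeformed surface.
    d = tool - proxy
    d_t = d - np.dot(d, n) * n
    return proxy + gain * d_t
```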


International Conference on Virtual Reality | 2007

Design and evaluation of a hybrid display system for motion-following tasks

Sangyoon Lee; Sunghoon Yim; Gerard Jounghyun Kim; Ungyeon Yang; Chang Hun Kim

Hybrid display systems combine different types of displays to exploit the complementary characteristics of the constituent displays. In this paper, we introduce a hybrid system that combines a stereoscopic optical see-through head-mounted display (HMD) and a large projection display for a multi-user ship painting training scenario. The projection display of the proposed hybrid system provides a large FOV and a physical metaphor for the ship surface with natural depth perception, while the HMD provides a personal, unoccluded display of the motion training guides. To quantify its effectiveness, we conducted a human subject experiment comparing the subjects' motion-following task performance among three display systems: the large projection display, the head-mounted display, and the hybrid system. The preliminary results show that, given the same FOV and despite problems with registration between the real and virtual worlds, the hybrid system performed on par with the head-mounted display and better than the projection display. Thus, the hybrid display is expected to yield even higher task performance when its larger FOV is exploited.


IEEE Transactions on Haptics | 2015

Topography Compensation for Haptization of a Mesh Object and Its Stiffness Distribution

Sunghoon Yim; Seokhee Jeon; Seungmoon Choi

This work was motivated by the need for perceptualizing nano-scale scientific data, e.g., those acquired by a scanning probe microscope, where collocated topography and stiffness distribution of a surface can be measured. Previous research showed that when the topography of a surface with spatially varying stiffness is rendered using the conventional penalty-based haptic rendering method, the topography perceived by the user could be significantly distorted from its original model. In the worst case, a higher region with a smaller stiffness value can be perceived to be lower than a lower region with a larger stiffness value. This problem was explained by the theory of force constancy: the user tends to maintain an invariant contact force when s/he strokes the surface to perceive its topography. In this paper, we present a haptization algorithm that can render the shape of a mesh surface and its stiffness distribution with high perceptual accuracy. Our algorithm adaptively changes the surface topography on the basis of the force constancy theory to deliver adequate shape information to the user while preserving the stiffness perception. We also evaluated the performance of the proposed haptization algorithm in comparison to the constraint-based algorithm by examining relevant proximal stimuli and carrying out a user experiment. Results demonstrated that our algorithm could improve the perceptual accuracy of shape and reduce the exploration time, thereby leading to more accurate and efficient haptization.
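The force constancy idea above admits a one-line illustration: a user stroking with a roughly constant contact force F_c penetrates by about F_c / k(x), so the felt height is approximately h(x) - F_c / k(x). Adding that penetration back into the rendered topography keeps the felt shape close to the modeled one. The sketch below is an assumed simplification of this idea, not the paper's adaptive algorithm.

```python
# Illustrative sketch (assumed form): compensate the rendered height for the
# force-constancy penetration so the felt topography matches the model.
def compensated_height(h_model, k, x, f_contact=1.0):
    """h_model(x): modeled surface height [m]; k(x): local stiffness [N/m];
    f_contact: assumed typical stroking force [N]."""
    return h_model(x) + f_contact / k(x)   # felt height = h'(x) - F_c/k(x) = h_model(x)
```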


AsiaHaptics | 2015

Normal and Tangential Force Decomposition and Augmentation Based on Contact Centroid

Sunghoon Yim; Seokhee Jeon; Seungmoon Choi

This study presents a simple but effective approach to extract and selectively augment the normal and tangential force components of the reaction force when a user interacts with a real object using a probe. The approach first approximates the behavior of the contact area with a single representative point, i.e., the contact centroid, which we use to decompose the reaction force into a normal and a tangential component. For augmentation, the two components are selectively amplified or diminished. For demonstration, we applied the approach to a breast tumor palpation scenario, where the inhomogeneity due to hard nodules is amplified for better detectability of abnormalities.
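The decomposition and selective amplification step can be sketched in a few lines: project the measured force onto the contact normal estimated at the contact centroid, take the remainder as the tangential part, and rescale the two parts with separate gains. The gains and the normal estimate in the sketch are assumptions for illustration.

```python
# Illustrative sketch: decompose a measured reaction force about the contact
# normal at the contact centroid, then scale the two components separately.
import numpy as np

def augment_force(force, contact_normal, gain_normal=1.5, gain_tangential=1.0):
    n = contact_normal / np.linalg.norm(contact_normal)
    f_n = np.dot(force, n) * n     # normal component
    f_t = force - f_n              # tangential component
    return gain_normal * f_n + gain_tangential * f_t
```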


International Conference on Ubiquitous Robots and Ambient Intelligence | 2014

Data-driven haptic modeling and rendering of frictional sliding contact with soft objects for medical training

Sunghoon Yim; Seokhee Jeon; Seungmoon Choi

Haptic rendering of soft objects is an essential component of medical haptic applications. In this paper, we follow a data-driven haptic rendering approach, which generates haptic feedback based on recorded signals for high-fidelity rendering. The paper introduces a new data-driven haptic rendering framework that, in particular, can model and render frictional sliding contact with a deformable object.

Collaboration


Dive into Sunghoon Yim's collaborations.

Top Co-Authors

Seungmoon Choi (Pohang University of Science and Technology)
Gabjong Han (Pohang University of Science and Technology)
Hojin Lee (Pohang University of Science and Technology)
In Lee (Pohang University of Science and Technology)
Kyungpyo Hong (Pohang University of Science and Technology)
Gunhyuk Park (Pohang University of Science and Technology)
Jane Hwang (Pohang University of Science and Technology)
Sangki Kim (Pohang University of Science and Technology)