
Publication


Featured research published by Seungmoon Choi.


Proceedings of the IEEE | 2013

Vibrotactile Display: Perception, Technology, and Applications

Seungmoon Choi; Katherine J. Kuchenbecker

This paper reviews the technology and applications of vibrotactile display, an effective information transfer modality for the emerging area of haptic media. Our emphasis is on summarizing foundational knowledge in this area and providing implementation guidelines for application designers who do not yet have a background in haptics. Specifically, we explain the relevant human vibrotactile perceptual capabilities, detail the main types of commercial vibrotactile actuators, and describe how to build both monolithic and localized vibrotactile displays. We then identify exemplary vibrotactile display systems in application areas ranging from the presentation of physical object properties to broadcasting vibrotactile media content.
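
For readers without a haptics background, the sketch below shows the kind of drive waveform a vibrotactile actuator is typically commanded with: a short sinusoidal burst near the skin's most sensitive frequency band. This is a minimal illustration; the 250 Hz carrier, envelope shape, and sample rate are assumed values, not parameters from the paper.

```python
import numpy as np

def vibrotactile_burst(freq_hz=250.0, duration_s=0.1, amplitude=1.0, sample_rate=8000):
    """Synthesize a sinusoidal vibrotactile burst with a raised-cosine envelope.

    A carrier near 250 Hz targets the most sensitive (Pacinian) band of the
    skin; all parameter values here are illustrative, not from the paper.
    """
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    carrier = amplitude * np.sin(2.0 * np.pi * freq_hz * t)
    envelope = 0.5 * (1.0 - np.cos(2.0 * np.pi * t / duration_s))  # smooth on/off ramps
    return carrier * envelope

if __name__ == "__main__":
    burst = vibrotactile_burst()
    print(f"{burst.size} samples, peak amplitude {burst.max():.2f}")
```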


IEEE Haptics Symposium | 2010

Effects of haptic guidance and disturbance on motor learning: Potential advantage of haptic disturbance

Jaebong Lee; Seungmoon Choi

One of the primary goals of haptic guidance is to facilitate the learning of complex human motor skills by providing haptic cues that help induce the desired movements. Nevertheless, a majority of previous studies have found that haptic guidance is ineffective for, or sometimes even detrimental to, motor skill learning. In this paper, we propose the opposite concept, haptic disturbance, and evaluate its efficacy. In haptic disturbance, haptic cues that interfere with the movements of a learner are presented during training. We designed two methods of haptic disturbance using repulsive and noise-like forces, respectively. The effects of these methods were experimentally assessed and compared with the conventional methods of visual-only learning and progressive haptic guidance. The motor task was to track a dot moving on a 2D plane with a haptic interface operated with one arm. We found that progressive haptic guidance yielded the best tracking accuracy during training, but in immediate and delayed retention tests, the noise-like haptic disturbance led to the best performance. The results suggest that haptic disturbance has strong potential as a general strategy for expediting motor learning.
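
To make the training conditions concrete, here is a minimal per-frame sketch of a guidance force versus the two disturbance types studied (repulsive and noise-like). The proportional control form, the gains, and the per-frame noise renewal are assumptions made for illustration, not the controller used in the paper.

```python
import numpy as np

def guidance_force(pos, target, gain=50.0):
    """Attractive force pulling the hand toward the target dot (guidance).

    A simple proportional pull; the paper's progressive guidance gradually
    fades assistance over training, which is not modeled here.
    """
    return gain * (np.asarray(target) - np.asarray(pos))

def repulsive_disturbance(pos, target, gain=50.0):
    """Repulsive force pushing the hand away from the target (disturbance)."""
    return -guidance_force(pos, target, gain)

def noise_disturbance(max_force=2.0, rng=np.random.default_rng()):
    """Noise-like disturbance: a random planar force renewed every frame."""
    return rng.uniform(-max_force, max_force, size=2)

if __name__ == "__main__":
    pos, target = [0.02, 0.00], [0.00, 0.01]   # meters, illustrative
    print("guidance   ", guidance_force(pos, target))
    print("repulsive  ", repulsive_disturbance(pos, target))
    print("noise-like ", noise_disturbance())
```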


Presence: Teleoperators and Virtual Environments | 2009

Haptic augmented reality: Taxonomy and an example of stiffness modulation

Seokhee Jeon; Seungmoon Choi

Haptic augmented reality (AR) enables the user to feel a real environment augmented with synthetic haptic stimuli. This article addresses two important topics in haptic AR. First, a new taxonomy for haptic AR is established based on a composite visuo-haptic reality-virtuality continuum extended from the conventional continuum for visual AR. Previous studies related to haptic AR are reviewed and classified using the composite continuum, and associated research issues are discussed. Second, the feasibility of haptically modulating the feel of a real object with the aid of virtual force feedback is investigated, with stiffness as the target haptic property. All required algorithms for contact detection, stiffness modulation, and force control are developed, and their individual performances are thoroughly evaluated. The resulting haptic AR system is also assessed in a psychophysical experiment, demonstrating its competent perceptual performance for stiffness modulation. To our knowledge, this work is among the first efforts in haptic AR toward the systematic augmentation of real object attributes with virtual forces, and it serves as an initial building block for a general haptic AR system. Finally, several research issues identified during the feasibility study are introduced, with the aim of eliciting more research interest in this exciting yet largely unexplored area.
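
The core idea of stiffness modulation can be sketched as a virtual spring force added on top of the real contact force. The linear-spring assumption and the numeric values below are purely illustrative; the paper's contribution lies in the contact detection, modulation, and force control algorithms needed to realize this reliably.

```python
def augmentation_force(penetration_m, k_real, k_desired):
    """Virtual force that shifts the perceived stiffness of a real surface.

    If the real object behaves like a linear spring (F_real = k_real * x),
    adding F_virtual = (k_desired - k_real) * x makes the total reaction
    force approximate k_desired * x. Real contact is nonlinear, so this is
    only the core idea, not the paper's full algorithm.
    """
    return (k_desired - k_real) * penetration_m

if __name__ == "__main__":
    x = 0.005                            # 5 mm penetration, illustrative
    k_real, k_desired = 400.0, 900.0     # N/m, illustrative values
    f_virtual = augmentation_force(x, k_real, k_desired)
    print(f"virtual force {f_virtual:.2f} N, total ~{k_real * x + f_virtual:.2f} N")
```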


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2006

Detection Threshold and Mechanical Impedance of the Hand in a Pen-Hold Posture

Ali Israr; Seungmoon Choi; Hong Z. Tan

We report position and force detection thresholds for sinusoidal waveforms in the frequency range of 10-500 Hz delivered through a stylus. The participants were required to hold the stylus in a way similar to that of holding the stylus of a force-feedback device. A minishaker moved the stylus along its length so that the majority of vibrations were presented tangentially to the skin of the hand. The measured position thresholds decreased initially with increasing stimulus frequency and formed a U-shaped curve in the high-frequency region. The thresholds for high-frequency vibrations were lower than those reported previously for vibrations perpendicular to the skin, but were similar to the thresholds reported earlier for vibrations tangential to the skin. A similar force threshold curve was obtained using a force sensor attached to one end of the stylus. The mechanical impedance of the skin, derived from velocity estimates and force measurements, indicated that the skin and tissues in the hand holding the stylus can be modeled with mass-, damper-, and spring-like elements. A comparison of the mechanical impedance from the present study with those reported previously showed similar results for vibrations delivered in the tangential and normal directions to the skin.
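
The mass-, damper-, and spring-like interpretation corresponds to a standard lumped impedance model. The sketch below evaluates such a model, Z(jw) = b + j(m*w - k/w), across the tested frequency range; the parameter values are placeholders, not the ones identified in the study.

```python
import numpy as np

def impedance(freq_hz, mass, damping, stiffness):
    """Mechanical impedance Z(jw) = F/V of a mass-damper-spring model.

    Z(jw) = b + j*(m*w - k/w), with w in rad/s. The parameter values passed
    below are placeholders, not those estimated in the paper.
    """
    w = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
    return damping + 1j * (mass * w - stiffness / w)

if __name__ == "__main__":
    freqs = [10, 50, 100, 250, 500]     # Hz, spanning the tested range
    Z = impedance(freqs, mass=0.01, damping=2.0, stiffness=500.0)
    for f, z in zip(freqs, Z):
        print(f"{f:4d} Hz: |Z| = {abs(z):6.2f} N*s/m")
```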


IEEE Transactions on Visualization and Computer Graphics | 2009

Real-Time Depth-of-Field Rendering Using Anisotropically Filtered Mipmap Interpolation

Sungkil Lee; G. Jounghyun Kim; Seungmoon Choi

This article presents a real-time GPU-based post-filtering method for rendering acceptable depth-of-field effects suited for virtual reality. Blurring is achieved by nonlinearly interpolating mipmap images generated from a pinhole image. Major artifacts common in post-filtering techniques, such as the bilinear magnification artifact, intensity leakage, and blurring discontinuity, are practically eliminated via magnification with a circular filter, anisotropic mipmapping, and smoothing of blurring degrees. The whole framework is accelerated using GPU programs for the constant and scalable real-time performance required in virtual reality. We also compare our method with recent GPU-based methods in terms of image quality and rendering performance.
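
As a rough illustration of the post-filtering pipeline, the sketch below maps a thin-lens circle of confusion to a fractional mipmap level; fractional levels are then resolved by interpolating the two nearest mipmap images. The log2 mapping and camera parameters are common heuristics assumed here, not the paper's exact formulation.

```python
import math

def coc_pixels(depth, focus_depth, focal_len, aperture, px_per_m):
    """Circle-of-confusion diameter in pixels under a thin-lens camera model."""
    coc_m = aperture * focal_len * abs(depth - focus_depth) / (depth * (focus_depth - focal_len))
    return coc_m * px_per_m

def mip_level(coc_px):
    """Continuous mipmap level whose texel footprint matches the blur diameter."""
    return max(0.0, math.log2(max(coc_px, 1.0)))

if __name__ == "__main__":
    coc = coc_pixels(depth=4.0, focus_depth=2.0, focal_len=0.05,
                     aperture=0.02, px_per_m=5000.0)   # illustrative camera setup
    print(f"CoC = {coc:.2f} px, mipmap level ~ {mip_level(coc):.2f}")
```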


ACM Transactions on Applied Perception | 2005

Force constancy and its effect on haptic perception of virtual surfaces

Seungmoon Choi; Laron Walker; Hong Z. Tan; S. Crittenden; R. Reifenberger

The force-constancy hypothesis states that the user of a force-feedback device maintains a constant penetration force when stroking virtual surfaces in order to perceive their topography. The hypothesis was developed to address a real-world data perceptualization problem in which the perception of surface topography was distorted when the surface stiffness was nonuniform. Two experiments were conducted. In Experiment I, we recorded the penetration depths of the probe tip while the user stroked two surfaces with equal height but different stiffness values. We found that the data could be quantitatively modeled by the force-constancy hypothesis when the virtual surfaces were neither too soft nor too hard. In Experiment II, we demonstrated that, given two adjacent surfaces, their perceived height difference depended on both the surface stiffness values and the relative heights of the surfaces. Specifically, we showed that the higher but softer surface could be perceived to be lower than, at the same height as, or higher than the other surface, depending on how much higher it was than the other surface. The results were consistent with the predictions of the force-constancy hypothesis. Our findings underscore the importance of understanding the interplay of haptic rendering parameters.
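
The hypothesis itself reduces to simple arithmetic: with a constant penetration force F, the probe sinks by F/k into a surface of stiffness k, so the felt height is the true height minus that penetration. The sketch below uses assumed values (1 N force, arbitrary stiffnesses) to show how a higher but softer surface can nonetheless feel lower.

```python
def perceived_height(true_height, stiffness, penetration_force=1.0):
    """Perceived surface height under the force-constancy hypothesis.

    With a constant penetration force F (N) and surface stiffness k (N/m),
    the probe sinks by F / k, so the felt height is true_height - F / k.
    The force and stiffness values used below are illustrative only.
    """
    return true_height - penetration_force / stiffness

if __name__ == "__main__":
    # Surface B is 0.3 mm higher than A but much softer, so it feels lower.
    h_a = perceived_height(true_height=0.0000, stiffness=2000.0)   # m, N/m
    h_b = perceived_height(true_height=0.0003, stiffness=800.0)
    print(f"A feels at {h_a * 1e3:.2f} mm, B feels at {h_b * 1e3:.2f} mm")
```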


Presence: Teleoperators and Virtual Environments | 2004

Perceived Instability of Virtual Haptic Texture. I. Experimental Studies

Seungmoon Choi; Hong Z. Tan

This paper presents a quantitative characterization of the instability that a human user often experiences while interacting with a virtual textured surface rendered with a force-reflecting haptic interface. First, we quantified the degree of stability/instability during haptic texture rendering through psychophysical experiments. The stiffness of the virtual textured surface upon detection of instability was measured under a variety of experimental conditions using two texture rendering methods, two exploration modes, and various texture model parameters. We found that the range of stiffness values for stable texture rendering was quite limited. Second, we investigated the attributes of the proximal stimuli experienced by a human hand while exploring the virtual textured surface in an attempt to identify the sources of perceived instability. Position, force, and acceleration were measured and then analyzed in the frequency domain. The results were characterized by sensation levels in terms of spectral intensity in dB relative to the human detection threshold at the same frequency. We found that the spectral bands responsible for texture and instability perception were well separated in frequency such that they excited different mechanoreceptors and were, therefore, perceptually distinctive. Furthermore, we identified the high-frequency dynamics of the device to be a likely source of perceived instability. Our work has implications for displaying textured surfaces through a force-feedback device in a virtual environment.
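
The sensation-level characterization is a ratio expressed in decibels: spectral amplitude divided by the detection threshold at the same frequency. The sketch below computes it for two hypothetical spectral bands; the amplitude and threshold numbers are invented for illustration, not measurements from the study.

```python
import numpy as np

def sensation_level_db(spectral_amplitude, detection_threshold):
    """Sensation level: spectral intensity in dB re the detection threshold
    at the same frequency (positive values are above threshold)."""
    return 20.0 * np.log10(np.asarray(spectral_amplitude) /
                           np.asarray(detection_threshold))

if __name__ == "__main__":
    amplitudes = [12.0, 0.8]    # hypothetical peak amplitudes in two bands
    thresholds = [1.5, 0.2]     # hypothetical detection thresholds in the same bands
    print(sensation_level_db(amplitudes, thresholds))   # dB SL per band
```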


IEEE Haptics Symposium | 2010

Initial study for creating linearly moving vibrotactile sensation on mobile device

Jongman Seo; Seungmoon Choi

This paper investigates the feasibility of creating spatially moving vibrotactile sensations using two vibrotactile actuators in a mobile device. The idea is based on the well-known tactile illusions of apparent tactile motion and the "phantom" sensation. The phantom sensation refers to a perceptual phenomenon in which spatially separated vibrotactile actuators that stimulate different skin zones induce a single tactile sensation midway between the two stimulation points. We tested whether such a sensation can also be elicited in a mobile device via a psychophysical experiment that employed an open-response paradigm. The experimental conditions differed in vibration rendering method, signal duration, and sensation movement direction. The subjects reported the perceived positions and intensities of the vibrotactile sensations by drawing graphs with respect to time. The results demonstrated that such a "vibrotactile flow" can be reliably produced in a mobile device and that a performance trade-off exists depending on the rendering method and signal duration used. The findings of this paper can be applied to the user interface design of mobile devices with enriched vibrotactile sensations and an improved information transfer bandwidth.
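
One common way to model the phantom sensation is an intensity panning law across the two actuators. The energy-preserving square-root law sketched below is an assumption for illustration; it is not necessarily one of the rendering methods compared in the paper.

```python
import numpy as np

def phantom_amplitudes(position, total_intensity=1.0):
    """Actuator amplitudes that place a phantom sensation at a normalized
    position in [0, 1] between two vibrotactile actuators.

    Uses an energy-preserving (square-root) panning law, one common model of
    the phantom sensation; the paper's rendering methods may differ.
    """
    p = float(np.clip(position, 0.0, 1.0))
    return np.sqrt(1.0 - p) * total_intensity, np.sqrt(p) * total_intensity

if __name__ == "__main__":
    # Sweeping the phantom position over time yields a "moving" vibration.
    for p in np.linspace(0.0, 1.0, 5):
        left, right = phantom_amplitudes(p)
        print(f"p={p:.2f}: left={left:.2f}, right={right:.2f}")
```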


IEEE Transactions on Haptics | 2012

Rendering Virtual Tumors in Real Tissue Mock-Ups Using Haptic Augmented Reality

Seokhee Jeon; Seungmoon Choi; Matthias Harders

Haptic augmented reality (AR) is an emerging research area, which targets the modulation of haptic properties of real objects by means of virtual feedback. In our research, we explore the feasibility of using this technology for medical training systems. As a possible demonstration example, we currently examine the use of augmentation in the context of breast tumor palpation. The key idea in our prototype system is to augment the real feedback of a silicone breast mock-up with simulated forces stemming from virtual tumors. In this paper, we introduce and evaluate the underlying algorithm to provide these force augmentations. This includes a method for the identification of the contact dynamics model via measurements on real sample objects. The performance of our augmentation is examined quantitatively as well as in a user study. Initial results show that the haptic feedback of indenting a real silicone tumor with a rod can be approximated reasonably well with our algorithm. The advantage of such an augmentation approach over physical training models is the ability to create a nearly infinite variety of palpable findings.
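
As a sketch of what a contact dynamics model for the virtual tumor might look like, the code below uses a Hunt-Crossley-style nonlinear law. The functional form and constants are placeholders: the paper identifies its own model from measurements on real silicone samples.

```python
def tumor_force(indentation_m, velocity_m_s, k=1200.0, n=1.5, b=30.0):
    """Reaction force of a virtual tumor under a nonlinear contact model.

    A Hunt-Crossley-style law F = k * x**n * (1 + b * xdot) is a common choice
    for soft-tissue contact; the form and constants here are placeholders, not
    the model identified in the paper.
    """
    if indentation_m <= 0.0:
        return 0.0
    return k * indentation_m ** n * (1.0 + b * velocity_m_s)

if __name__ == "__main__":
    # The force felt through the rod is the real mock-up's response plus this term.
    print(f"added virtual force: {tumor_force(0.004, 0.01):.3f} N")
```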


IEEE Transactions on Visualization and Computer Graphics | 2009

Real-Time Tracking of Visually Attended Objects in Virtual Environments and Its Application to LOD

Sungkil Lee; Gerard Jounghyun Kim; Seungmoon Choi

This paper presents a real-time framework for computationally tracking the objects visually attended by the user while navigating interactive virtual environments. In addition to the conventional bottom-up (stimulus-driven) saliency map, the proposed framework uses top-down (goal-directed) contexts inferred from the user's spatial and temporal behaviors, and identifies the most plausibly attended objects among candidates in the object saliency map. The computational framework was implemented on the GPU, exhibiting computational performance adequate for interactive virtual environments. A user experiment was also conducted to evaluate the prediction accuracy of the tracking framework by comparing the objects regarded as visually attended by the framework with actual human gaze data collected with an eye tracker. The results indicated that the accuracy was at a level well supported by the theory of human cognition for visually identifying single and multiple attentive targets, especially owing to the addition of top-down contextual information. Finally, we demonstrate how the visual attention tracking framework can be applied to managing the level of detail in virtual environments without any hardware for head or eye tracking.
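
A minimal sketch of combining a bottom-up saliency score with a top-down context score and feeding the result into an LOD policy is given below. The linear combination, the weight, and the LOD mapping are assumptions made for illustration, not the paper's actual framework.

```python
def attended_object(objects, top_down_weight=0.5):
    """Pick the most plausibly attended object from per-object scores.

    Each object carries a bottom-up saliency score and a top-down context
    score (e.g., inferred from the user's spatial and temporal behavior).
    The linear combination and weight are illustrative assumptions.
    """
    def score(obj):
        return (1.0 - top_down_weight) * obj["saliency"] + top_down_weight * obj["context"]
    return max(objects, key=score)

def lod_level(is_attended, max_level=4):
    """Toy LOD policy: full detail for the attended object, reduced otherwise."""
    return max_level if is_attended else max_level // 2

if __name__ == "__main__":
    scene = [
        {"name": "door",  "saliency": 0.7, "context": 0.2},
        {"name": "guide", "saliency": 0.4, "context": 0.9},
    ]
    focus = attended_object(scene)
    print(focus["name"], "-> LOD", lod_level(True))
```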

Collaboration


Dive into Seungmoon Choi's collaborations.

Top Co-Authors

Inwook Hwang (Pohang University of Science and Technology)
Jaebong Lee (Pohang University of Science and Technology)
In Lee (Pohang University of Science and Technology)
Sunghoon Yim (Pohang University of Science and Technology)
Gunhyuk Park (Pohang University of Science and Technology)
Hojin Lee (Pohang University of Science and Technology)
Gabjong Han (Pohang University of Science and Technology)
Jongman Seo (Pohang University of Science and Technology)