Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Seokhee Jeon is active.

Publication


Featured research published by Seokhee Jeon.


Teleoperators and Virtual Environments | 2009

Haptic augmented reality: Taxonomy and an example of stiffness modulation

Seokhee Jeon; Seungmoon Choi

Haptic augmented reality (AR) enables the user to feel a real environment augmented with synthetic haptic stimuli. This article addresses two important topics in haptic AR. First, a new taxonomy for haptic AR is established based on a composite visuo-haptic reality-virtuality continuum extended from the conventional continuum for visual AR. Previous studies related to haptic AR are reviewed and classified using the composite continuum, and associated research issues are discussed. Second, the feasibility of haptically modulating the feel of a real object with the aid of virtual force feedback is investigated, with the stiffness as a goal haptic property. All required algorithms for contact detection, stiffness modulation, and force control are developed, and their individual performances are thoroughly evaluated. The resulting haptic AR system is also assessed in a psychophysical experiment, demonstrating its competent perceptual performance for stiffness modulation. To our knowledge, this work is among the first efforts in haptic AR for systematic augmentation of real object attributes with virtual forces, and it serves as an initial building block toward a general haptic AR system. Finally, several research issues identified during the feasibility study are introduced, with the aim of eliciting more research interest in this exciting yet unexplored area.
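
The abstract does not give the rendering equation, but the core idea of stiffness modulation can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the linear spring assumption, and the numeric values are assumptions made here for clarity.

import numpy as np

def virtual_augmentation_force(penetration_depth, contact_normal,
                               k_real, k_target):
    """Extra force the haptic device should render on top of the real contact.

    penetration_depth : how far the tool has pressed into the real object (m)
    contact_normal    : unit vector pointing out of the object surface
    k_real            : estimated stiffness of the real object (N/m)
    k_target          : stiffness the user should perceive (N/m)
    """
    # The real object already pushes back with roughly k_real * depth,
    # so the virtual layer only supplies the difference.
    k_extra = k_target - k_real        # positive: feel stiffer, negative: feel softer
    magnitude = k_extra * penetration_depth
    return magnitude * np.asarray(contact_normal, dtype=float)

# Example: make a sponge (k_real ~ 200 N/m) feel like rubber (k_target ~ 800 N/m).
f = virtual_augmentation_force(0.005, [0.0, 0.0, 1.0], k_real=200.0, k_target=800.0)
print(f)   # -> [0. 0. 3.] N pushing back along the surface normal

A negative difference (target softer than the real object) would require the device to pull against the real reaction force, which is exactly the kind of modulation the paper investigates.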


Virtual Reality Software and Technology | 2006

Interaction techniques in large display environments using hand-held devices

Seokhee Jeon; Jane Hwang; Gerard Jounghyun Kim; Mark Billinghurst

Hand-held devices hold great potential as interaction devices owing to their present-day ubiquity, and their multi-modal sensing and display capabilities present an opportunity to devise new and unique ways of interaction. This paper introduces user interaction techniques (for selection, translation, scaling, and rotation of objects) using a camera-equipped hand-held device, such as a mobile phone or a PDA, in large shared display environments. We propose three intuitive interaction techniques for 2D and 3D objects in such an environment. The first approach uses motion flow information to estimate the relative motion of the hand-held device and interact with the large display. The marker-object and marker-cursor approaches both use software markers, placed on the interaction object or on the cursor, for the various interactive tasks. The proposed interaction techniques can be further combined with many auxiliary functions and wireless services of the hand-held devices for seamless information sharing and exchange among multiple users. A formal usability analysis is currently ongoing.
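
As a rough sketch of the motion-flow approach described above, frame-to-frame optical flow from the hand-held camera can be mapped to cursor motion on the large display. OpenCV is used here only as a stand-in; the GAIN value and the send_cursor_delta() hook are hypothetical placeholders, not part of the paper.

import cv2
import numpy as np

def send_cursor_delta(dx, dy):
    # Placeholder: in a real system this would go over the wireless link
    # to the machine driving the large display.
    print(f"move cursor by ({dx:.1f}, {dy:.1f}) px")

cap = cv2.VideoCapture(0)                      # hand-held device camera
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
GAIN = 40.0                                    # cursor pixels per unit of flow (assumed)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow between consecutive frames
    flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Average flow approximates the device's lateral motion; the sign is
    # flipped because moving the camera right makes the scene flow left.
    dx, dy = -flow[..., 0].mean(), -flow[..., 1].mean()
    send_cursor_delta(GAIN * dx, GAIN * dy)
    prev = gray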


IEEE Transactions on Haptics | 2012

Rendering Virtual Tumors in Real Tissue Mock-Ups Using Haptic Augmented Reality

Seokhee Jeon; Seungmoon Choi; Matthias Harders

Haptic augmented reality (AR) is an emerging research area, which targets the modulation of haptic properties of real objects by means of virtual feedback. In our research, we explore the feasibility of using this technology for medical training systems. As a possible demonstration example, we currently examine the use of augmentation in the context of breast tumor palpation. The key idea in our prototype system is to augment the real feedback of a silicone breast mock-up with simulated forces stemming from virtual tumors. In this paper, we introduce and evaluate the underlying algorithm to provide these force augmentations. This includes a method for the identification of the contact dynamics model via measurements on real sample objects. The performance of our augmentation is examined quantitatively as well as in a user study. Initial results show that the haptic feedback of indenting a real silicone tumor with a rod can be approximated reasonably well with our algorithm. The advantage of such an augmentation approach over physical training models is the ability to create a nearly infinite variety of palpable findings.
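
A hedged sketch of the augmentation idea: on top of the real force returned by the silicone mock-up, the device renders an extra force produced by a virtual tumor when the tool presses into the region where the tumor is placed. The Hunt-Crossley-style model form and all parameter values below are assumptions for illustration; the paper identifies its own contact dynamics model from measurements on real samples.

import numpy as np

def tumor_force(tool_pos, tool_vel, tumor_center, tumor_radius,
                k=1500.0, n=1.5, b=2.0):
    """Extra force (3-vector) from a stiff spherical inclusion."""
    offset = np.asarray(tool_pos, dtype=float) - np.asarray(tumor_center, dtype=float)
    dist = np.linalg.norm(offset)
    if dist < 1e-9:
        return np.zeros(3)
    depth = tumor_radius - dist            # > 0 once the tool enters the tumor region
    if depth <= 0.0:
        return np.zeros(3)
    normal = offset / dist                 # push the tool back outwards
    v_n = float(np.dot(tool_vel, normal))  # approach velocity along the normal
    # Nonlinear spring plus velocity-dependent damping (assumed model form)
    magnitude = k * depth**n + b * depth**n * (-v_n)
    return max(magnitude, 0.0) * normal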


Ubiquitous Computing | 2010

Interaction with large ubiquitous displays using camera-equipped mobile phones

Seokhee Jeon; Jane Hwang; Gerard Jounghyun Kim; Mark Billinghurst

In the ubiquitous computing environment, people will interact with everyday objects (or the computers embedded in them) in ways different from the usual and familiar desktop user interface. One such typical situation is interacting with applications through large displays such as televisions, mirror displays, and public kiosks. With these applications, conventional keyboard and mouse input is usually not viable for practical reasons. In this setting, the mobile phone has emerged as an excellent device for novel interaction. This article introduces user interaction techniques using a camera-equipped hand-held device, such as a mobile phone or a PDA, for large shared displays. In particular, we consider two specific but typical situations: (1) sharing the display from a distance, and (2) interacting with a touch screen display at a close distance. Using two basic computer vision techniques, motion flow and marker recognition, we show how a camera-equipped hand-held device can effectively be used to replace a mouse and to share, select, and manipulate 2D and 3D objects, as well as navigate within the environment presented through the large display.
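
The marker-recognition technique can be sketched as follows: a fiducial marker shown on the large display (or attached to the cursor) is recognized in the phone camera image, and its position relative to the image centre indicates where the user is pointing. OpenCV's ArUco module (4.7+ API) is used here purely as a stand-in for the "software markers" of the paper, and the pointing_offset helper is a hypothetical illustration.

import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def pointing_offset(frame):
    """Return (dx, dy) from the image centre to the detected marker, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None or len(ids) == 0:
        return None
    cx, cy = corners[0][0].mean(axis=0)          # centre of the first detected marker
    h, w = gray.shape
    return cx - w / 2.0, cy - h / 2.0            # how far the phone is aimed off the marker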


Teleoperators and Virtual Environments | 2011

Real stiffness augmentation for haptic augmented reality

Seokhee Jeon

Haptic augmented reality (AR) mixes a real environment with computer-generated virtual haptic stimuli, enabling the system to modulate the haptic attributes of a real object to desired values. This paper reports our second study on this functionality, with stiffness as the target modulation property. Our first study explored the potential of haptic AR by presenting an effective stiffness modulation system for simple 1D interaction. This paper extends the system so that a user can interact with a real object in any 3D exploratory pattern while perceiving its augmented stiffness. We develop a complete set of algorithms for contact detection, deformation estimation, force rendering, and force control. The core part is the deformation estimation, where the magnitude and direction of real object deformation are estimated using a contact dynamics model identified in a preprocessing step. All algorithms are designed to maximize the efficiency and usability of the system while maintaining convincing perceptual quality. In particular, the need for extensive preprocessing such as geometry modeling is avoided to improve usability. The physical performance of each algorithm is thoroughly evaluated with real samples, and each algorithm is experimentally verified to meet the physical performance requirements for convincing rendering quality. The final perceptual quality of stiffness rendering is assessed in a psychophysical experiment in which the difference in perceived stiffness between augmented and virtual objects is measured. The error is less than the human discriminability of stiffness, demonstrating that our system can provide accurate stiffness modulation with perceptually insignificant errors. The limitations of our AR system are also discussed along with a plan for future work.
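
The deformation-estimation step can be illustrated with a small sketch: the contact dynamics model identified in preprocessing maps deformation to force, so at run time the measured force is inverted to recover how much the real object has deformed. The power-law model and the bisection inversion below are illustrative assumptions, not the paper's exact formulation.

def force_model(depth, k=900.0, n=1.3):
    """Identified force-vs-deformation model (assumed power-law form)."""
    return k * depth**n

def estimate_deformation(measured_force, lo=0.0, hi=0.05, iters=40):
    """Invert force_model numerically (bisection) to get deformation depth in metres."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if force_model(mid) < measured_force:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: a 3 N reading from the force sensor implies roughly 12 mm of indentation
# under the assumed model parameters.
print(estimate_deformation(3.0))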


IEEE Haptics Symposium | 2010

Stiffness modulation for Haptic Augmented Reality: Extension to 3D interaction

Seokhee Jeon; Seungmoon Choi

Haptic Augmented Reality (AR) allows a user to touch a real environment augmented with synthetic haptic stimuli. For example, medical students can palpate a virtual tumor inside a real mannequin using a haptic AR system to practice cancer detection. To realize such functionality, we need to alter the haptic attributes of a real object by means of virtual haptic feedback. Previously, we presented a haptic AR system with stiffness as a goal modulation property, and demonstrated its competent physical and perceptual performances for 1D interaction. In this paper, we extend the system so that a user can interact with a real object in any 3D exploratory pattern while perceiving its augmented stiffness. A series of algorithms are developed for contact detection, deformation estimation, force rendering, and force control. Their performances are thoroughly evaluated with real samples. A particular focus has been on minimizing the amount of preprocessing such as geometry modeling. Our haptic AR system can provide convincing stiffness modulation for real objects of relatively homogeneous deformation properties. The limitations of our AR system are also discussed along with a plan for future work.


International Symposium on Mixed and Augmented Reality | 2010

Haptic simulation of breast cancer palpation: A case study of haptic augmented reality

Seokhee Jeon; Benjamin Knoerlein; Matthias Harders; Seungmoon Choi

Haptic augmented reality (AR) allows the haptic properties of a real object to be modulated by providing virtual haptic feedback. We previously developed a haptic AR system in which the stiffness of a real object can be augmented with the aid of a haptic interface. To demonstrate its potential, this paper presents a case study for medical training of breast cancer palpation. A real breast model made of soft silicone is augmented with a virtual tumor rendered inside it. Haptic stimuli for the virtual tumor are generated based on a contact dynamics model identified via real measurements, without the need for geometric information about the breast. A subjective evaluation confirmed the realism and fidelity of our palpation system.


International Conference on Haptics: Perception, Devices and Scenarios | 2008

Modulating Real Object Stiffness for Haptic Augmented Reality

Seokhee Jeon; Seungmoon Choi

In haptic augmented reality, a user can enjoy the sensations of real objects augmented with synthetic haptic stimuli created by a haptic interface. For example, a haptic augmented reality system may allow the user to feel a soft sponge as stiffer rubber. In this paper, we present a framework in which the stiffness of a real object can be modulated with additional virtual haptic feedback. For this, a commercial haptic interface is extended with a force sensor. Efficient and effective algorithms for contact detection and stiffness modulation are proposed for the closed-loop framework. Performance evaluation with real samples showed that the stiffness modulation performs well except for very rigid objects (e.g., a wooden plate), where unstable oscillations dominate the response. This work serves as an initial building block towards a general haptic augmented reality system.
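
Because the haptic interface is extended with a force sensor, the rendered force can be regulated in a closed loop. A minimal sketch of such a force controller is given below; the PI structure, the gains, and the 1 kHz loop rate are assumptions for illustration only, not the paper's controller.

class ForceController:
    """Track a desired augmentation force using the measured contact force."""

    def __init__(self, kp=0.6, ki=8.0):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def update(self, desired_force, measured_force, dt):
        error = desired_force - measured_force
        self.integral += error * dt
        # Command sent to the haptic device for this cycle: feedforward term
        # plus proportional-integral correction of the force error.
        return desired_force + self.kp * error + self.ki * self.integral

# A haptic loop typically runs at around 1 kHz (dt = 0.001 s).
ctrl = ForceController()
command = ctrl.update(desired_force=2.0, measured_force=1.7, dt=0.001)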


Sensors | 2014

Context representation and fusion: Advancements and opportunities

Asad Masood Khattak; Noman Akbar; Mohammad Aazam; Taqdir Ali; Adil Mehmood Khan; Seokhee Jeon; Myunggwon Hwang; Sungyoung Lee

The acceptance and usability of context-aware systems have given them an edge for wide use in various domains and have also attracted the attention of researchers in the area of context-aware computing. Making user context information available to such systems is the center of attention. However, very little emphasis is given to the processes of context representation and context fusion, which are integral parts of context-aware systems. Context representation and fusion help in recognizing the dependency/relationship of one data source on another to extract a better understanding of user context. The problem is more critical when data emerge from heterogeneous sources of diverse nature, such as sensors, user profiles, and social interactions, and at different timestamps. Both context representation and fusion are carried out in one way or another; however, they are rarely discussed explicitly in the realization of context-aware systems. In other words, most context-aware systems underestimate the importance of context representation and fusion. This research explicitly focuses on the importance of both processes and streamlines their place in the overall architecture of context-aware system design and development. Various applications of context representation and fusion in context-aware systems are highlighted, and a detailed review of both processes and their applications is provided. Future research directions (challenges) that need proper attention for realizing context-aware systems are also identified.
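
To make the idea of representing and fusing heterogeneous, differently timestamped sources concrete, here is a toy sketch of a uniform context representation with a simple fusion rule. The attribute names and the "latest value wins" policy are purely illustrative and not taken from the survey.

from dataclasses import dataclass

@dataclass
class ContextItem:
    source: str        # e.g. "accelerometer", "calendar", "social"
    attribute: str     # e.g. "activity", "location"
    value: str
    timestamp: float   # seconds since epoch

def fuse(items):
    """Fuse items into one context: per attribute, keep the most recent value."""
    context = {}
    for item in sorted(items, key=lambda i: i.timestamp):
        context[item.attribute] = (item.value, item.source)
    return context

readings = [
    ContextItem("gps", "location", "office", 100.0),
    ContextItem("calendar", "activity", "meeting", 101.0),
    ContextItem("accelerometer", "activity", "walking", 105.0),  # newer, overrides calendar
]
print(fuse(readings))  # {'location': ('office', 'gps'), 'activity': ('walking', 'accelerometer')}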


Methods | 2014

Codon-based encoding for DNA sequence analysis

Byeong-Soo Jeong; A.T.M. Golam Bari; Mst. Rokeya Reaz; Seokhee Jeon; Chae-Gyun Lim; Ho-Jin Choi

With the exponential growth of biological sequence data (DNA or protein sequences), DNA sequence analysis has become an essential task for biologists seeking to understand the features, functions, structures, and evolution of species. Encoding DNA sequences is an effective method to extract features from them; it is commonly used for visualizing DNA sequences and analyzing similarities/dissimilarities between different species or cells. Although many encoding approaches have been proposed for DNA sequence analysis, more refined approaches are required for higher accuracy. In this paper, we propose a novel encoding approach for measuring the degree of similarity/dissimilarity between different species. Our approach preserves the physicochemical properties, positional information, and codon usage bias of nucleotides. An extensive performance study shows that our approach provides higher accuracy than existing approaches in terms of the degree of similarity.
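
A simplified sketch in the spirit of codon-based comparison: each sequence is broken into non-overlapping codons, turned into a codon-usage frequency vector, and two sequences are compared by the distance between their vectors. The paper's actual encoding also preserves physicochemical and positional information, which this toy version omits.

from collections import Counter
from itertools import product
import math

CODONS = ["".join(c) for c in product("ACGT", repeat=3)]   # all 64 codons

def codon_vector(seq):
    """Codon-usage frequency vector of a DNA sequence."""
    seq = seq.upper()
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    counts = Counter(codons)
    total = sum(counts[c] for c in CODONS) or 1
    return [counts[c] / total for c in CODONS]

def dissimilarity(seq_a, seq_b):
    """Euclidean distance between codon-usage vectors (smaller = more similar)."""
    va, vb = codon_vector(seq_a), codon_vector(seq_b)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(va, vb)))

print(dissimilarity("ATGGCCATTGTA", "ATGGCCATTGTC"))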

Collaboration


Dive into Seokhee Jeon's collaborations.

Top Co-Authors

Seungmoon Choi
Pohang University of Science and Technology

Gerard Jounghyun Kim
Pohang University of Science and Technology

Sunghoon Yim
Pohang University of Science and Technology

Gabjong Han
Pohang University of Science and Technology