Publication


Featured research published by Sunjun Kim.


Australasian Computer-Human Interaction Conference | 2012

Interaction techniques for unreachable objects on the touchscreen

Sunjun Kim; Jihyun Yu; Geehyuk Lee

Large-screen mobile devices have recently been introduced. While they can display more information on the screen, they raise the issue of thumb reachability during one-handed use. To address this problem, we designed four factorial combinations of two triggering techniques (Edge and Large touch) and two selection techniques (Sliding screen and Extendible cursor). We implemented a prototype realizing the four interaction techniques and conducted a user study to examine the benefits and problems of using these techniques in both portrait and landscape orientations. The user study showed a significant advantage for Edge triggering combined with the Extendible cursor technique. We also collected meaningful comments from the user interviews.


Human Factors in Computing Systems | 2013

TapBoard: making a touch screen keyboard more touchable

Sunjun Kim; Jeongmin Son; Geehyuk Lee; Hwan Kim; Woohun Lee

We propose the TapBoard, a touch screen software keyboard that regards tapping actions as keystrokes and other touches as touched states. In a series of user studies, we validated the effectiveness of the TapBoard concept. First, we showed that tapping to type is in fact compatible with the existing typing skill of most touch screen keyboard users. Second, users quickly adapted to the TapBoard and learned to rest their fingers in a touched state. Finally, a controlled experiment confirmed that there is no difference in text entry performance between the TapBoard and a traditional touch screen software keyboard. In addition to these experimental results, we demonstrate a few new interaction techniques made possible by the TapBoard.
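
The core idea, distinguishing a quick tap (a keystroke) from a lingering contact (a resting finger), can be pictured with a minimal sketch. This is not the paper's implementation; the classifier below and its duration and movement thresholds are illustrative assumptions.

```python
# Minimal sketch of the TapBoard idea (illustrative, not the paper's code):
# a contact counts as a keystroke only if it is short and nearly stationary;
# anything longer or moving is treated as a "touched" (resting) state.

MAX_TAP_DURATION_S = 0.30   # assumed upper bound on tap duration
MAX_TAP_TRAVEL_PX = 10.0    # assumed upper bound on finger movement

def classify_contact(duration_s, travel_px):
    """Classify a completed touch contact as 'keystroke' or 'rest'."""
    if duration_s <= MAX_TAP_DURATION_S and travel_px <= MAX_TAP_TRAVEL_PX:
        return "keystroke"
    return "rest"

# A quick, stationary contact types a character; a long one merely rests.
print(classify_contact(0.12, 2.0))   # -> keystroke
print(classify_contact(1.50, 3.0))   # -> rest
```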


Human Factors in Computing Systems | 2013

LongPad: a touchpad using the entire area below the keyboard of a laptop computer

Jiseong Gu; Seongkook Heo; Jaehyun Han; Sunjun Kim; Geehyuk Lee

In this paper, we explore the possibility of a long touchpad that utilizes the entire area below the keyboard of a laptop computer. An essential prerequisite for such a touchpad is a robust palm rejection method, which we satisfy using a proximity-sensing touchpad. We developed LongPad, a proximity-sensing optical touchpad as wide as a laptop keyboard, and implemented a palm rejection algorithm that utilizes proximity images from LongPad. In a user study, we observed that LongPad rejected palm touches almost perfectly while participants performed repeated typing and pointing tasks. We also summarize the new design space enabled by LongPad and demonstrate a few of the interaction techniques it facilitates.
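
The paper's palm rejection operates on proximity images from the sensor; the exact algorithm is not reproduced here. As one simple heuristic under that setup, the sketch below labels connected blobs in a thresholded proximity image and discards any blob larger than a fingertip. The thresholds and function name are assumptions for illustration.

```python
# Illustrative palm-rejection heuristic on a proximity image (not the LongPad
# algorithm): blobs much larger than a fingertip are assumed to be palms.
import numpy as np
from scipy import ndimage

PROXIMITY_THRESHOLD = 0.3   # assumed normalized proximity cutoff
MAX_FINGER_AREA = 40        # assumed maximum fingertip blob area in pixels

def finger_blobs(proximity_image):
    """Return labels of blobs small enough to be fingertips."""
    mask = proximity_image > PROXIMITY_THRESHOLD
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    return [i + 1 for i, area in enumerate(areas) if area <= MAX_FINGER_AREA]

# Synthetic example: one fingertip-sized blob and one palm-sized blob.
img = np.zeros((20, 60))
img[5:8, 10:13] = 1.0        # small blob: kept as a finger
img[10:19, 30:55] = 1.0      # large blob: rejected as a palm
print(finger_blobs(img))      # only the small blob's label remains
```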


User Interface Software and Technology | 2011

ThickPad: a hover-tracking touchpad for a laptop

Sangwon Choi; Jaehyun Han; Sunjun Kim; Seongkook Heo; Geehyuk Lee

We explored the use of a hover-tracking touchpad in a laptop environment. To study the new experience, we implemented a prototype touchpad consisting of infrared LEDs and phototransistors, which can track fingers as far as 10 mm above the surface. We demonstrate three major interaction techniques that become possible when a hover-tracking touchpad meets a laptop.
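
One way to picture how hover sensing extends the touchpad input model is as a three-state view (away, hover, contact) driven by the sensed finger height. The paper does not specify such thresholds; the values and names below are assumptions for illustration.

```python
# Illustrative three-state model for a hover-tracking touchpad (assumption,
# not ThickPad's implementation): sensed height maps to AWAY, HOVER, or CONTACT.

HOVER_RANGE_MM = 10.0    # sensing range mentioned in the abstract
CONTACT_EPS_MM = 0.5     # assumed height below which the finger is touching

def touch_state(height_mm):
    """Map a sensed finger height (mm), or None if unseen, to an input state."""
    if height_mm is None or height_mm > HOVER_RANGE_MM:
        return "AWAY"
    if height_mm <= CONTACT_EPS_MM:
        return "CONTACT"
    return "HOVER"

print(touch_state(None))   # -> AWAY
print(touch_state(6.0))    # -> HOVER (e.g., preview a cursor without clicking)
print(touch_state(0.0))    # -> CONTACT
```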


Human Factors in Computing Systems | 2010

iLight: information flashlight on objects using handheld projector

Sunjun Kim; Jaewoo Chung; Alice H. Oh; Chris Schmandt; Ig-Jae Kim

Handheld projectors are novel display devices that have recently been developed. In this paper we present iLight (Information flashLight), which builds on the ongoing research project Guiding Light [9] using a handheld projector. With a tiny camera attached to the handheld projector, the system can recognize objects and augment information directly onto them. iLight also presents an interaction methodology for handheld projectors and enables novel real-time interactive experiences among users.


Human Factors in Computing Systems | 2018

Neuromechanics of a Button Press

Antti Oulasvirta; Sunjun Kim; Byungjoo Lee

To press a button, a finger must push down and pull up with the right force and timing. How the motor system succeeds in button-pressing, despite neural noise and without direct access to the button's mechanism, is poorly understood. This paper investigates a unifying account based on neuromechanics. Mechanics is used to model the muscles controlling the finger that contacts the button. Neurocognitive principles are used to model how the motor system learns appropriate muscle activations over repeated strokes while relying on degraded sensory feedback. Neuromechanical simulations yield a rich set of predictions for kinematics, dynamics, and user performance and may aid in understanding and improving input devices. We present a computational implementation and evaluate predictions for common button types.
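
The paper couples muscle mechanics with neurocognitive learning; its model is not reproduced here. As a much simpler illustration of the mechanical half only, the sketch below integrates a fingertip modeled as a mass-damper system driven by a brief activation pulse against a linear button spring. Every parameter is an invented, illustrative value.

```python
# Highly simplified mechanical sketch (not the paper's neuromechanical model):
# a fingertip mass, driven by a brief force pulse, presses a linear button
# spring. Forward Euler integration; all parameters are illustrative.

MASS_KG = 0.02            # assumed effective fingertip mass
DAMPING = 2.0             # assumed damping coefficient (N*s/m)
BUTTON_K = 400.0          # assumed button spring constant (N/m)
PULSE_FORCE_N = 1.5       # assumed peak downward drive force
PULSE_DURATION_S = 0.05   # assumed duration of the activation pulse
DT = 0.001                # integration time step (s)

def simulate(t_end=0.3):
    x, v = 0.0, 0.0                  # press depth (m) and velocity (m/s)
    trajectory = []
    t = 0.0
    while t < t_end:
        drive = PULSE_FORCE_N if t < PULSE_DURATION_S else 0.0
        button_force = BUTTON_K * max(x, 0.0)   # button resists only when pressed
        a = (drive - DAMPING * v - button_force) / MASS_KG
        v += a * DT
        x += v * DT
        trajectory.append((t, x))
        t += DT
    return trajectory

peak_depth = max(x for _, x in simulate())
print(f"peak press depth: {peak_depth * 1000:.2f} mm")
```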


Human Factors in Computing Systems | 2018

Moving Target Selection: A Cue Integration Model

Byungjoo Lee; Sunjun Kim; Antti Oulasvirta; Jong-In Lee; Eunji Park

This paper investigates a common task requiring temporal precision: the selection of a rapidly moving target on display by invoking an input event when it is within some selection window. Previous work has explored the relationship between accuracy and precision in this task, but the role of visual cues available to users has remained unexplained. To expand modeling of timing performance to multimodal settings, common in gaming and music, our model builds on the principle of probabilistic cue integration. Maximum likelihood estimation (MLE) is used to model how different types of cues are integrated into a reliable estimate of the temporal task. The model deals with temporal structure (repetition, rhythm) and the perceivable movement of the target on display. It accurately predicts error rate in a range of realistic tasks. Applications include the optimization of difficulty in game-level design.
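
The MLE cue-integration principle behind the model has a standard closed form: each cue's estimate is weighted by its reliability (inverse variance), and the combined estimate has lower variance than any single cue. The sketch below shows that computation; the cue values are illustrative, not data from the paper.

```python
# Standard reliability-weighted (MLE) cue integration: each cue is weighted
# by the inverse of its variance. The example numbers are illustrative only.

def integrate_cues(estimates, variances):
    """Return the MLE combined estimate and its (reduced) variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * e for w, e in zip(weights, estimates)) / total
    return combined, 1.0 / total

# Example: a visual-motion cue and a rhythmic (temporal-structure) cue about
# when the target enters the selection window, in ms from trial start.
estimate, variance = integrate_cues(estimates=[480.0, 510.0],
                                    variances=[900.0, 400.0])
print(estimate, variance)   # pulled toward the more reliable (rhythmic) cue
```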


Polymer Bulletin | 1996

Synthesis and properties of poly(dipropargyl-16-crown-5)

Hyun-Nam Cho; J. Y. Lee; Sunjun Kim; Sunkyu Choi; Chun-Ho Kim

We have synthesized a polyacetylene derivative, poly(dipropargyl-16-crown-5), through the cyclopolymerization of the corresponding monomer by metathesis catalysts. The polymer containing crown-ether units was characterized by spectroscopic and thermal techniques. The polymer structure is believed to be a cyclized form with both five- and six-membered rings. The polymer exhibits high cation-binding properties and ionochromic effects. The order of selectivity for alkali-metal cations, for both the monomer and the polymer, was found to be Na+ > K+ > Li+, and the polymer shows the largest red shift in λmax, approximately 40 nm, for Na+.


Human Factors in Computing Systems | 2016

TapBoard 2: Simple and Effective Touchpad-like Interaction on a Multi-Touch Surface Keyboard

Sunjun Kim; Geehyuk Lee

We introduce TapBoard 2, a touchpad-based keyboard that solves the problem of typing and pointing disambiguation. The pointing interaction design of TapBoard 2 is nearly identical to natural touchpad interaction, and its shared workspace naturally invites bimanual pointing interaction. To implement TapBoard 2, we developed a novel gesture representation scheme for systematic design and a gesture recognizer. A user evaluation showed that TapBoard 2 successfully supports collocated pointing and typing interaction. It was able to disambiguate typing and pointing actions with an accuracy greater than 95%. In addition, the typing and pointing performance of TapBoard 2 was comparable to that of a separate keyboard and mouse. In particular, the bimanual pointing operations of TapBoard 2 are highly efficient and were strongly favored by participants.
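
The disambiguation TapBoard 2 performs can be pictured as a small extension of the earlier TapBoard sketch: a completed contact is routed to typing, pointing, or resting depending on how long it lasted and how far it moved. The routing and thresholds below are assumptions for illustration, not the paper's gesture recognizer.

```python
# Illustrative typing/pointing routing (not TapBoard 2's recognizer): moving
# contacts point, short stationary contacts type, long stationary ones rest.

TAP_MAX_DURATION_S = 0.30   # assumed upper bound on tap duration
DRAG_MIN_TRAVEL_PX = 15.0   # assumed movement beyond which a contact points

def route_contact(duration_s, travel_px):
    if travel_px >= DRAG_MIN_TRAVEL_PX:
        return "POINTER_DRAG"
    if duration_s <= TAP_MAX_DURATION_S:
        return "KEYSTROKE"
    return "REST"

print(route_contact(0.10, 2.0))    # -> KEYSTROKE
print(route_contact(0.60, 80.0))   # -> POINTER_DRAG
print(route_contact(2.00, 3.0))    # -> REST
```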


User Interface Software and Technology | 2017

Reflector: Distance-Independent, Private Pointing on a Reflective Screen

Jong-In Lee; Sunjun Kim; Masaaki Fukumoto; Byungjoo Lee

Reflector is a novel direct pointing method that utilizes hidden design space on reflective screens. By aligning a part of the user's on-screen reflection with objects rendered on the screen, Reflector enables (1) distance-independent and (2) private pointing on commodity screens. Reflector can be implemented easily in both desktop and mobile conditions through a single camera installed at the edge of the screen. Reflector's pointing performance was compared to today's major direct input devices: eye trackers and touchscreens. We demonstrate that Reflector allows the user to point more reliably, regardless of distance from the screen, compared to an eye tracker. Further, due to the private nature of an on-screen reflection, Reflector shows a shoulder-surfing success rate 20 times lower than that of touchscreens for the task of entering a 4-digit PIN.
