

Publications


Featured research published by Hyoseok Yoon.


ubiquitous computing | 2012

Social itinerary recommendation from user-generated digital trails

Hyoseok Yoon; Yu Zheng; Xing Xie; Woontack Woo

Planning travel to unfamiliar regions is a difficult task for novice travelers. The burden can be eased if residents of the area offer to help. In this paper, we propose social itinerary recommendation that learns from multiple user-generated digital trails, such as the GPS trajectories of residents and travel experts. To recommend satisfying itineraries to users, we present an itinerary model whose attributes are extracted from user-generated GPS trajectories. On top of this model, we present a social itinerary recommendation framework that finds and ranks itinerary candidates. We evaluated the efficiency of our recommendation method against baseline algorithms on a large set of user-generated GPS trajectories collected in Beijing, China. First, systematically generated user queries are used to compare recommendation performance at the algorithmic level. Second, a user study involving current residents of Beijing compares user perception of and satisfaction with the recommended itineraries. Third, we compare a mobile-only approach with a mobile+cloud architecture for practical deployment of a mobile recommender. Lastly, we discuss personalization and adaptation factors in social itinerary recommendation throughout the paper.


ubiquitous intelligence and computing | 2010

Smart itinerary recommendation based on user-generated GPS trajectories

Hyoseok Yoon; Yu Zheng; Xing Xie; Woontack Woo

Traveling to unfamiliar regions requires significant effort from novice travelers to plan where to go within a limited duration. In this paper, we propose a smart recommendation of highly efficient and balanced itineraries based on multiple user-generated GPS trajectories. Users only need to provide a minimal query composed of a start point, an end point, and a travel duration to receive an itinerary recommendation. To differentiate good itinerary candidates from less fulfilling ones, we describe how we model and define an itinerary in terms of several characteristics mined from user-generated GPS trajectories. Further, we evaluated the efficiency of our method on 17,745 user-generated GPS trajectories contributed by 125 users in Beijing, China. We also performed a user study in which current residents of Beijing used our system to review and rate itineraries generated by our algorithm and by baseline algorithms for comparison.
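The minimal-query idea from this abstract can be illustrated with a small sketch. This is not the paper's algorithm: the `Itinerary` representation, the popularity score, and the `recommend` function are hypothetical stand-ins for itineraries mined from GPS trajectories and for the paper's ranking of candidates.

```python
# Hedged sketch of minimal-query itinerary recommendation.
# The itinerary representation and scoring below are illustrative only.
from dataclasses import dataclass


@dataclass
class Itinerary:
    start: str        # start region, e.g. a grid cell or landmark id
    end: str          # end region
    hours: float      # estimated travel duration
    popularity: int   # how many user trajectories support this itinerary


def recommend(itineraries, start, end, max_hours, k=3):
    """Return the top-k itineraries matching a (start, end, duration) query."""
    candidates = [it for it in itineraries
                  if it.start == start and it.end == end and it.hours <= max_hours]
    # Rank by trajectory support; a real system would combine more features.
    return sorted(candidates, key=lambda it: it.popularity, reverse=True)[:k]


mined = [
    Itinerary("A", "B", 5.0, 120),
    Itinerary("A", "B", 7.5, 300),   # popular, but exceeds a 6-hour budget
    Itinerary("A", "B", 4.0, 80),
    Itinerary("A", "C", 3.0, 200),
]
for it in recommend(mined, "A", "B", max_hours=6.0):
    print(it.hours, it.popularity)   # prints 5.0 120, then 4.0 80
```

The point of the sketch is the interface, not the scoring: the user supplies only a start, an end, and a duration budget, and everything else is mined offline from trajectories.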


international conference on human computer interaction | 2007

Context-aware mobile AR system for personalization, selective sharing, and interaction of contents in ubiquitous computing environments

Youngjung Suh; Young Min Park; Hyoseok Yoon; Yoonje Chang; Woontack Woo

With advances in tracking and increased computing power, mobile AR systems have become popular in daily life. Research on mobile AR technology has emphasized the technical challenges imposed by mobility, but has not considered context-aware services that annotate user-related information, even though in ubiquitous computing environments various contexts of both the user and the environment can be utilized easily and effectively. Moreover, it is difficult to access pervasive but invisible computing resources, and as smart appliances evolve with more features, their user interfaces tend to become harder to use. Thus, in this paper, we propose the Context-aware Mobile Augmented Reality (CaMAR) system. It lets users interact with their smart objects through personalized control interfaces on their mobile AR devices, and it enables content to be not only personalized but also shared selectively and interactively among user communities.


international symposium on ubiquitous virtual reality | 2009

CAMAR 2.0: Future Direction of Context-Aware Mobile Augmented Reality

Choonsung Shin; Wonwoo Lee; Youngjung Suh; Hyoseok Yoon; Youngho Lee; Woontack Woo

With the rapid spread of ubiquitous computing and mobile augmented reality, the interaction of mobile users in U-VR environments has been evolving. However, current interaction is limited to individuals' experience of given content and services. In this paper, we propose CAMAR 2.0 as a future direction of CAMAR, aiming at improving the perception and interaction of users in U-VR environments. We introduce three principles for future interaction and experience in U-VR environments, and discuss technical challenges and promising scenarios for realizing the vision of CAMAR 2.0.


ubiquitous computing | 2016

Lightful user interaction on smart wearables

Hyoseok Yoon; Se-Ho Park; Kyung-Taek Lee

Smart wearables are small body-worn devices whose compactness and wearability call for novel user interaction. The current UI/UX of smart wearables is rooted in smartphone-like UI/UX, which is often inadequate under such small form factors. To overcome these limitations, research efforts have been invested in augmenting wearable devices with various sensors and in improving the efficiency of existing input modalities through careful orchestration. In this paper, we propose a new concept called lightful user interaction, which exploits the readily available ambient light sensor as a novel, alternative user interface for smart wearables. We design and model lightful user interaction based on typical usage of representative smart wearables. We then demonstrate the proposed interaction through three implemented applications: PIN entry, Morse code, and a control indicator. Finally, we evaluate the concept and applications in terms of occluded display area, input expressivity, and lightweight implementation to make the case for a promising alternative UI for smart wearables.


2015 Fourth International Conference on Cyber Security, Cyber Warfare, and Digital Forensic (CyberSec) | 2015

Exploiting Ambient Light Sensor for Authentication on Wearable Devices

Hyoseok Yoon; Se-Ho Park; Kyung-Taek Lee

Wearable devices are computationally rich, portable, and wearable to suit users' various needs and situations. Still, users of wearable devices suffer from a lack of efficient input methods and proper feedback modalities. In this paper, we design and implement a novel input method that exploits the readily available ambient light sensor on wearable devices. We present the concept and implementation results of re-purposing the light sensor for user input and for processing UI events. Furthermore, a PIN entry application based on the proposed input method is implemented to demonstrate a device authentication scenario for wearables.
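The sensor re-purposing described here can be sketched in a few lines: a threshold on lux readings classifies each sample as "covered" or "uncovered", and counting cover-uncover taps within an entry window yields a PIN digit. This is a minimal illustration, not the authors' implementation; the threshold value, the sensor-reading interface, and the tap-count encoding are all assumptions.

```python
# Hedged sketch: decoding PIN digits from ambient-light "cover" gestures.
# The threshold, timing, and digit encoding below are hypothetical.

COVER_THRESHOLD_LUX = 10.0   # below this, the sensor is assumed covered


def to_cover_events(lux_samples):
    """Convert a stream of lux readings into cover/uncover transitions."""
    events = []
    covered = False
    for lux in lux_samples:
        now_covered = lux < COVER_THRESHOLD_LUX
        if now_covered != covered:
            events.append("cover" if now_covered else "uncover")
            covered = now_covered
    return events


def decode_digit(lux_samples):
    """Assume a digit is entered as that many cover taps in one entry window."""
    return sum(1 for e in to_cover_events(lux_samples) if e == "cover")


# Simulated readings: three taps (bright -> dark -> bright, three times).
samples = [300, 250, 5, 3, 280, 4, 2, 310, 6, 290]
print(decode_digit(samples))  # prints 3
```

On a real device the lux stream would come from the platform's light-sensor API rather than a list, and debouncing would be needed, but the thresholding idea is the same.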


international conference on human-computer interaction | 2011

Foundation of a new digital ecosystem for u-content: needs, definition, and design

Yoosoo Oh; Sébastien Duval; Sehwan Kim; Hyoseok Yoon; Taejin Ha; Woontack Woo

In this paper, we analyze and classify digital ecosystems to demonstrate the need for a new digital ecosystem oriented towards content for ubiquitous virtual reality (U-VR) and to identify appropriate designs. First, we survey existing digital ecosystems, explore their differences, identify unmet challenges, and consider their appropriateness for emerging services that tightly link real and virtual (i.e., digital) spaces. Second, we define a new type of content ecosystem (the u-Content ecosystem) and describe its necessary and desirable features. Finally, our analysis shows that the proposed ecosystem surpasses existing ecosystems for U-VR applications and content.


international symposium on ubiquitous virtual reality | 2009

CAMAR Mashup: Empowering End-user Participation in U-VR Environment

Hyoseok Yoon; Woontack Woo

In this paper, we propose the concept of Context-Aware Mobile AR (CAMAR) mashup as an enriched form of participatory user interaction in U-VR environments. We define CAMAR mashup and discuss how its characteristics differ from previous mashup activities in additional aspects such as context-awareness, mobility, and presentation. To elaborate on the proposed concept, an exemplar scenario is presented and foreseeable technical challenges are discussed.


international conference on universal access in human computer interaction | 2007

Personal companion: personalized user interface for u-service discovery, selection and interaction

Hyoseok Yoon; Hyejin Kim; Woontack Woo

In this paper, we propose a mobile user interface named personal companion, which enables the selection of and interaction with u-services based on the user's context. Personal companion selects u-services from a list of discovered services, supports camera-based selection with embedded markers, and personalizes the UI of the selected service in a ubiquitous computing environment. To verify its usefulness, we implemented personal companion on PDA and UMPC platforms and deployed it in a smart home testbed for selecting and interacting with u-services. The proposed personal companion is expected to play a vital role in ubiquitous computing environments by bridging users and u-services.


international conference on information and communication technology convergence | 2015

LuxUI: Repurposing ambient light sensor for contact-based explicit interaction on smartwatch

Hyoseok Yoon; Min-Sung Park; Se-Ho Park; Kyung-Taek Lee

In this paper, we propose LuxUI, a novel input method based on the ambient light sensor, as an alternative for smartwatches. We present the concept of LuxUI with design and implementation details of its interactions and events, followed by possible applications and discussion issues. The proposed LuxUI is simple in design yet effective, working well at a 200 ms interaction interval with modest battery usage. For contact-based explicit interaction on smartwatches, LuxUI can serve as an alternative UI for otherwise expressivity-limited smartwatches.

Collaboration


Dive into Hyoseok Yoon's collaborations.

Top Co-Authors

Kyung-Taek Lee (Carnegie Mellon University)
Choonsung Shin (Gwangju Institute of Science and Technology)
Hyejin Kim (Gwangju Institute of Science and Technology)
Yoosoo Oh (Gwangju Institute of Science and Technology)
Youngjung Suh (Gwangju Institute of Science and Technology)