Youngjung Suh
Gwangju Institute of Science and Technology
Publication
Featured research published by Youngjung Suh.
ubiquitous computing | 2011
Youngjung Suh; Choonsung Shin; Woontack Woo; Steven P. Dow; Blair MacIntyre
We designed and built a mobile phone-based guidance system to support shared group experiences by suggesting the use of an eavesdropping metaphor inspired by Sotto Voce that allows visitors to eavesdrop on each other’s audio. Going beyond Sotto Voce, we create a shared experience by synchronizing the audio controls of all people who are eavesdropping on each other. Our contribution is the design of a mobile phone guide for cultural tours that combines a linear tour with in-depth information exploration, GPS-based maps offering group awareness, simple content customization and suggestions, and fluid movement between individual and ad-hoc group touring. The most important contribution is the design of a simple sharing scheme that gives all users in an ad hoc group implicit control over the audio content of everyone currently linked together. We evaluated our approach using data collected from participants, and our results validated the effectiveness and usefulness of our sharing scheme and interface for group experiences. In addition, we gained an understanding of how sharing information during visits to cultural heritage sites by socially related people influences the visiting experience; differing mutual eavesdropping and content control behaviors emerged according to group types (family vs. friends). By enabling groups to share their experience on-site, our system should increase the appeal of mobile phones as electronic tour guides, providing adequate support for shared group experiences.
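To make the sharing scheme concrete, the following minimal Java sketch shows one way an ad-hoc group could give every linked member implicit control over everyone's audio: any member's play or stop command is propagated to the whole group. The class and method names (SharedAudioGroup, AudioControlListener) are illustrative assumptions, not the paper's implementation.

import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: an ad-hoc group in which any member's audio control
// action is applied to everyone currently linked, approximating the paper's
// implicit shared-control idea at a high level.
public class SharedAudioGroup {

    // One visitor's audio player on a mobile guide.
    public interface AudioControlListener {
        void onPlay(String clipId);
        void onStop();
    }

    private final List<AudioControlListener> members = new ArrayList<>();

    public void join(AudioControlListener member) {
        members.add(member);
    }

    public void leave(AudioControlListener member) {
        members.remove(member);
    }

    // Any member pressing "play" synchronizes the clip for the whole group.
    public void play(String clipId) {
        for (AudioControlListener m : members) {
            m.onPlay(clipId);
        }
    }

    // Stopping likewise propagates to all linked members.
    public void stop() {
        for (AudioControlListener m : members) {
            m.onStop();
        }
    }

    // Tiny usage example with console-backed "players".
    public static void main(String[] args) {
        SharedAudioGroup group = new SharedAudioGroup();
        group.join(new AudioControlListener() {
            public void onPlay(String clipId) { System.out.println("Visitor A plays " + clipId); }
            public void onStop() { System.out.println("Visitor A stops"); }
        });
        group.join(new AudioControlListener() {
            public void onPlay(String clipId) { System.out.println("Visitor B plays " + clipId); }
            public void onStop() { System.out.println("Visitor B stops"); }
        });
        group.play("exhibit-12-audio");
    }
}

The design point the sketch tries to capture is symmetry: no single owner holds the playback state; the group does, so whoever acts, everyone currently eavesdropping stays synchronized.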
international conference on human computer interaction | 2007
Youngjung Suh; Young Min Park; Hyoseok Yoon; Yoonje Chang; Woontack Woo
With advances in tracking and increased computing power, mobile AR systems have become part of daily life. Research on mobile AR technology has emphasized the technical challenges arising from the limitations that mobility imposes, but has paid little attention to context-aware services with user-related information annotation, even though in a ubiquitous computing environment the various contexts of both the user and the environment can be exploited easily and effectively. Moreover, pervasive but invisible computing resources are difficult to access, and as smart appliances evolve with ever more features, their user interfaces tend to become harder to use. In this paper, we therefore propose the Context-aware Mobile Augmented Reality (CaMAR) system, which lets users interact with their smart objects through personalized control interfaces on their mobile AR devices. It also enables content to be not only personalized but also shared selectively and interactively among user communities.
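As a rough illustration of what a personalized control interface could involve, the sketch below filters a smart object's advertised controls by a simple user-context attribute (an expertise level). All identifiers, the control lists, and the expertise heuristic are hypothetical and not taken from the CaMAR paper.

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: selecting a personalized subset of a smart object's
// controls based on user context, in the spirit of personalized control
// interfaces on a mobile AR device.
public class PersonalizedControlInterface {

    // A smart object advertises the controls it exposes (made-up data).
    static Map<String, List<String>> smartObjectControls = Map.of(
        "smart-tv", List.of("power", "volume", "channel", "picture-mode"),
        "smart-light", List.of("power", "brightness", "color"));

    // User context is modeled here as a flat attribute map.
    static List<String> selectControls(String objectId, Map<String, String> userContext) {
        List<String> all = smartObjectControls.getOrDefault(objectId, List.of());
        // Example policy: novice users see only basic controls; others see everything.
        if ("novice".equals(userContext.get("expertise"))) {
            List<String> basic = new ArrayList<>();
            for (String c : all) {
                if (c.equals("power") || c.equals("volume") || c.equals("brightness")) {
                    basic.add(c);
                }
            }
            return basic;
        }
        return all;
    }

    public static void main(String[] args) {
        Map<String, String> ctx = Map.of("expertise", "novice");
        System.out.println(selectControls("smart-tv", ctx)); // prints [power, volume]
    }
}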
IEEE Transactions on Consumer Electronics | 2009
Youngjung Suh; Choonsung Shin; Woontack Woo
We have built a mobile phone-based guide system that strengthens the user experience at a cultural heritage site by supporting spatial awareness, personalization, and social connectedness. The guide, implemented on a Java-enabled mobile phone, provides both audio and visual content that is tailored by tracking user movement with GPS, collecting user inputs and demographics, and allowing socially acceptable eavesdropping over wireless networking. We applied our system to a cemetery site and present the results of a user study, together with a performance evaluation that verifies the effectiveness of content filtering for personalization and content synchronization for social connectedness. Spatially, our results confirm that the system supports a satisfying physical exploration of the historic space. With respect to personalization, the content presented to visitors was well tailored to their real-time feedback over the course of the visit. Socially, the effectiveness of our sharing interface motivated visitors to synchronize content among themselves frequently. We hope our system will broaden the appeal of mobile phones as electronic tour guides that adequately support spatial awareness, personalization, and shared group experiences.
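The following sketch illustrates, under stated assumptions, the kind of content filtering such a guide performs: candidate points of interest are screened by GPS proximity and matched against a visitor's interest profile. The class, record, and field names are hypothetical, and the coordinates and themes are made up for illustration; this is not the paper's filtering algorithm.

import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Hypothetical sketch: keep only points of interest that are both nearby
// (by GPS position) and relevant to the visitor's interests.
public class ContentFilter {

    record Poi(String id, double lat, double lon, String theme) {}

    static double sq(double a) { return a * a; }

    // Squared equirectangular distance in degrees; adequate for ranking
    // points within a small site, not for true geodesic distance.
    static double distance2(double lat1, double lon1, double lat2, double lon2) {
        return sq(lat1 - lat2) + sq((lon1 - lon2) * Math.cos(Math.toRadians(lat1)));
    }

    // Return POIs within a radius whose theme matches the visitor's interests.
    static List<Poi> nearbyAndRelevant(List<Poi> pois, double lat, double lon,
                                       double radiusDeg, Set<String> interests) {
        List<Poi> result = new ArrayList<>();
        for (Poi p : pois) {
            boolean near = distance2(lat, lon, p.lat(), p.lon()) <= sq(radiusDeg);
            if (near && interests.contains(p.theme())) {
                result.add(p);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Made-up coordinates and themes, purely for the example.
        List<Poi> pois = List.of(
            new Poi("grave-01", 37.0001, 127.0001, "history"),
            new Poi("grave-02", 37.0050, 127.0050, "architecture"));
        System.out.println(nearbyAndRelevant(pois, 37.0000, 127.0000,
                0.001, Set.of("history")));
    }
}

A real guide would combine such spatial filtering with the demographic and real-time feedback signals the abstract mentions; the sketch shows only the spatial-plus-interest step.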
international symposium on ubiquitous virtual reality | 2009
Choonsung Shin; Wonwoo Lee; Youngjung Suh; Hyoseok Yoon; Youngho Lee; Woontack Woo
With the rapid spread of ubiquitous computing and mobile augmented reality, the ways mobile users interact in U-VR environments have been evolving. Current interaction, however, remains limited to individuals' experience of given content and services. In this paper, we propose CAMAR 2.0 as a future direction for CAMAR, aimed at improving users' perception of and interaction with U-VR environments. We introduce three principles for future interaction and experience in U-VR environments, and discuss the technical challenges and promising scenarios for realizing the vision of CAMAR 2.0.
embedded and ubiquitous computing | 2006
Dongpyo Hong; Youngjung Suh; Ahyoung Choi; Umar Rashid; Woontack Woo
In this paper, we propose wear-UCAM, a toolkit that supports mobile user interaction in smart environments by utilizing the user's context. With the rapid development of ubiquitous computing and related technologies, interest in context-aware applications for mobile and wearable computing has grown in both academia and industry. In such smart environments, it is crucial that a user be able to manage personal information (health, preferences, activities, etc.) for personalized services without explicit input. However, there are still few frameworks or toolkits for mobile and wearable computers that reflect the user's context in context-aware applications. In wear-UCAM, we therefore focus on a software framework for context-aware applications that addresses how to acquire contextual information about a user from sensors, how to integrate and manage it, and how to control its disclosure in smart environments.
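A schematic sketch of the three concerns the abstract names (acquiring context from sensors, integrating it, and controlling its disclosure) is given below. The class names and the whitelist-based disclosure policy are assumptions made for illustration, not wear-UCAM's actual API.

import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch: acquire user context from sensors, integrate it into
// one model, and let the user control which parts services may see.
public class ContextManagerSketch {

    interface ContextSensor {
        Map.Entry<String, String> read();  // e.g., ("heartRate", "72")
    }

    private final Map<String, String> context = new HashMap<>();
    private final Set<String> disclosable = new HashSet<>();  // user-approved keys

    // Acquisition and integration: pull readings from all sensors into one map.
    void acquire(List<ContextSensor> sensors) {
        for (ContextSensor s : sensors) {
            Map.Entry<String, String> e = s.read();
            context.put(e.getKey(), e.getValue());
        }
    }

    // Disclosure control: the user whitelists which context keys services may see.
    void allowDisclosure(String key) { disclosable.add(key); }

    Map<String, String> viewForService() {
        Map<String, String> visible = new HashMap<>();
        for (String key : disclosable) {
            if (context.containsKey(key)) visible.put(key, context.get(key));
        }
        return visible;
    }

    public static void main(String[] args) {
        ContextManagerSketch mgr = new ContextManagerSketch();
        List<ContextSensor> sensors = List.of(
            () -> Map.entry("heartRate", "72"),
            () -> Map.entry("location", "gallery-3"));
        mgr.acquire(sensors);
        mgr.allowDisclosure("location");            // user keeps heart rate private
        System.out.println(mgr.viewForService());   // prints {location=gallery-3}
    }
}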
IEICE Transactions on Information and Systems | 2007
Youngho Lee; Sejin Oh; Youngjung Suh; Seiie Jang; Woontack Woo
In this letter, we propose an enhanced framework for a Personalized User Interface (PUI) that allows users to access and customize virtual objects in virtual environments by sharing user-centric context with those objects. The framework is enhanced by integrating a unified context-aware application for virtual environments (vr-UCAM 1.5) into the virtual objects of the PUI framework, which allows a virtual object to receive context from both real and virtual environments, decide its responses based on context and if-then rules, and communicate with other objects individually. To demonstrate the effectiveness of the proposed framework, we applied it to a virtual heritage system. Experimental results show that the PUI enhances the accessibility and customizability of virtual objects. The proposed framework is expected to play an important role in VR applications such as education, entertainment, and storytelling.
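The sketch below illustrates, in purely hypothetical terms, how a virtual object might decide its response from received context and if-then rules; the rule representation and all names are assumptions for illustration, not the vr-UCAM 1.5 interface.

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical sketch: a virtual object that receives a context map and picks
// a response via an ordered list of if-then rules.
public class RuleBasedVirtualObject {

    record Rule(Predicate<Map<String, String>> condition, String response) {}

    private final List<Rule> rules = new ArrayList<>();

    void addRule(Predicate<Map<String, String>> condition, String response) {
        rules.add(new Rule(condition, response));
    }

    // First matching rule wins; otherwise the object stays idle.
    String react(Map<String, String> context) {
        for (Rule r : rules) {
            if (r.condition().test(context)) return r.response();
        }
        return "idle";
    }

    public static void main(String[] args) {
        RuleBasedVirtualObject relic = new RuleBasedVirtualObject();
        relic.addRule(ctx -> "child".equals(ctx.get("ageGroup")), "play animated story");
        relic.addRule(ctx -> "en".equals(ctx.get("language")), "show English description");
        // Prints "play animated story": the first matching rule decides the response.
        System.out.println(relic.react(Map.of("ageGroup", "child", "language", "ko")));
    }
}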
ubiPCMM | 2005
Youngjung Suh; Dongoh Kang; Woontack Woo
The 4th international symposium on ubiquitous VR | 2006
Sunhyuck Kim; Youngjung Suh; Yong-Gu Lee; Woontack Woo
international symposium on ubiquitous virtual reality | 2007
Youngjung Suh; Kiyoung Kim; JoungHyun Han; Woontack Woo
international symposium on ubiquitous virtual reality | 2007
Youngho Lee; Hedda Rahel Schmidtke; Youngjung Suh; Woontack Woo