Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Yongjoo Cho is active.

Publication


Featured research published by Yongjoo Cho.


Virtual Reality Software and Technology | 2000

CAVERNsoft G2: a toolkit for high performance tele-immersive collaboration

Kyoung Shin Park; Yongjoo Cho; Naveen K. Krishnaprasad; Chris Scharver; Michael J. Lewis; Jason Leigh; Andrew E. Johnson

This paper describes the design and implementation of CAVERNsoft G2, a toolkit for building collaborative virtual reality applications. G2's special emphasis is on providing the tools to support high-performance computing and data-intensive systems that are coupled to collaborative, immersive environments. This paper describes G2's broad range of services and demonstrates how they are currently being used in a collaborative volume visualization application.


Eurographics | 2001

Adaptive networking for tele-immersion

Jason Leigh; Oliver Yu; Dan Schonfeld; Rashid Ansari; Eric He; A. M. Nayak; Jinghua Ge; Naveen K. Krishnaprasad; Kyoung Shin Park; Yongjoo Cho; Liujia Hu; Ray Fang; Alan Verlo; Linda Winkler; Thomas A. DeFanti

Tele-immersive applications possess an unusually broad range of networking requirements. As high-speed and Quality of Service-enabled networks emerge, it will become more difficult for developers of tele-immersion applications, and networked applications in general, to take advantage of these enhanced services. This paper proposes an adaptive networking framework to ultimately allow applications to optimize their network utilization in pace with advances in networking services. In working toward this goal, this paper presents a number of networking techniques for improving performance in tele-immersive applications and examines whether the Differentiated Services mechanism for network Quality of Service is suitable for tele-immersion.


Japanese Journal of Applied Physics | 2006

Enhanced Image Mapping Algorithm for Computer-Generated Integral Imaging System

Sung-Wook Min; Kyoung Shin Park; Binara Lee; Yongjoo Cho; Minsoo Hahn

An enhanced image mapping algorithm is proposed for a real-time computer-generated (CG) integral imaging system. The proposed algorithm, viewpoint vector rendering, can be easily adapted to generate a set of elemental images from complex three-dimensional (3D) objects and is less affected by system factors and object image quality than previous methods. In addition, it can support all the display modes of the integral imaging system. The feasibility and efficiency of the proposed approach are verified and analyzed through rendering experiments. Using this technique, it is possible to realize an interactive CG integral imaging system which can be applied to virtual reality.


IEICE Transactions on Information and Systems | 2007

Viewpoint Vector Rendering for Efficient Elemental Image Generation

Kyoung Shin Park; Sung Wook Min; Yongjoo Cho

This paper presents a fast elemental image generation algorithm, called Viewpoint Vector Rendering (VVR), for the computer-generated integral imaging system. VVR produces a set of elemental images in real time by assembling the segmented areas of the directional scenes taken from a range of viewpoints. This algorithm is less affected by system factors such as the number of elemental lenses and the number of polygons. It also supports all display modes of the integral imaging system: real, virtual, and focused. This paper first describes the characteristics of the integral imaging system. It then discusses the design, implementation, and performance evaluation of the VVR algorithm, which can be easily adapted to render the integral images of complex 3D objects.
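The core pixel rearrangement behind an approach like VVR can be sketched as follows. The array shapes, the one-pixel-per-lens simplification, and the function name are illustrative assumptions for this sketch, not the paper's implementation (which assembles segmented image areas rather than single pixels):

```python
import numpy as np

def viewpoint_vector_assemble(views):
    """Assemble elemental images from directional views (a VVR-style sketch).

    `views` has shape (Vy, Vx, Ly, Lx): one directional scene per viewpoint,
    simplified here to a single pixel per elemental lens. Returns an array of
    shape (Ly, Lx, Vy, Vx): for each lens position (ly, lx), an elemental
    image whose pixel (vy, vx) comes from the view rendered at viewpoint
    (vy, vx).
    """
    return np.transpose(views, (2, 3, 0, 1))

# 2x2 viewpoints, 3x3 lens array of single-pixel "directional scenes"
views = np.arange(2 * 2 * 3 * 3).reshape(2, 2, 3, 3)
elemental = viewpoint_vector_assemble(views)
```

Because the assembly is a fixed rearrangement of already-rendered views, its cost depends on the number of viewpoints rather than on scene complexity, which matches the abstract's claim of insensitivity to polygon count.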


IEEE Transactions on Consumer Electronics | 2011

Interactive emotional content communications system using portable wireless biofeedback device

Dong Keun Kim; Jonghwa Kim; Eui Chul Lee; Mincheol Whang; Yongjoo Cho

In this paper, we implemented an interactive emotional content communication system using a portable wireless biofeedback device to support convenient emotion recognition and immersive emotional content representation for users. The newly designed system consists of the portable wireless biofeedback device and a novel emotional content rendering system. The former performs the acquisition and transmission of three different physiological signals (photoplethysmography, skin temperature, and galvanic skin response) to the remote emotional content rendering system via Bluetooth links in real time. The latter displays video content concurrently manipulated using the feedback of the user's emotional state. An evaluation of the system indicated that its response time was nearly instantaneous and that the changes in the emotional content corresponded to the emotional states derived from the physiological signals. Users' concentration increased when watching the visual stimuli rendered from the measured emotions. In the near future, the users of this proposed system will be able to create further substantial user-oriented content based on emotional changes.


Computers & Graphics | 2004

Learning science inquiry skills in a virtual field

Andrew E. Johnson; Thomas G. Moher; Yongjoo Cho; Daniel C. Edelson; Eric J. Russell

Sixth-grade students at Abraham Lincoln Elementary School explore a virtual field via an ImmersaDesk and collect data there using hand-held computers. Back in the classroom they integrate their data, visualize it to see the patterns that emerge, and then propose explanations for these patterns. The goal is to help the students learn science inquiry skills within an environment that encourages their formation.


International Conference on Entertainment Computing | 2011

Emotional intelligent contents: expressing user's own emotion within contents

Minyoung Kim; Kyoung Shin Park; Dongkeun Kim; Yongjoo Cho

This paper presents an Emotionally Intelligent Contents (EIC) framework. This framework helps to create content that changes its elements (such as textures, color, light, and sound) dynamically in response to a user's emotional state. This emotionally intelligent content also allows users to add their own emotion characters at run-time. This paper presents an overview of the EIC framework designed to adapt a game environment to a user's emotional state as measured physiologically or through an explicit rating of one's affective state. It then describes a couple of applications built with this framework.
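The kind of element adaptation the EIC framework describes can be illustrated with a minimal sketch. The emotion scales, parameter names, and linear mapping below are assumptions made for illustration, not the framework's actual API:

```python
from dataclasses import dataclass

@dataclass
class EmotionState:
    valence: float  # -1 (negative) .. +1 (positive); hypothetical scale
    arousal: float  # 0 (calm) .. 1 (excited); hypothetical scale

def adapt_lighting(state: EmotionState) -> dict:
    """Map an emotional state to scene lighting parameters.

    A sketch of dynamic element adaptation: positive valence warms the
    light, higher arousal brightens it. The mapping is illustrative.
    """
    return {
        "warmth": 0.5 + 0.5 * state.valence,      # warmer for positive valence
        "brightness": 0.3 + 0.7 * state.arousal,  # brighter for high arousal
    }

params = adapt_lighting(EmotionState(valence=1.0, arousal=0.5))
```

In a real system the `EmotionState` would come from physiological measurement or an explicit self-rating, and the adapted parameters would drive the rendering engine each frame.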


Proceedings of the 4th International Conference on Ubiquitous Information Technologies & Applications | 2009

Design and Development of a Distributed Tabletop System Using EBITA Framework

Minyoung Kim; Yongjoo Cho; Kyoung Shin Park

Over the past decade, tabletop systems have become increasingly popular. However, prior work on tabletop systems has mostly focused on supporting user interaction with digital content on a single tabletop display, which is not easily extendable. In this paper we present a new scalable distributed tabletop system, consisting of master/slave computers and tangible interfaces, to provide high-resolution interactive tabletop display surfaces. Our work develops a tabletop system constructed with LCD panels and a cluster of low-cost commodity PCs to support a large, high-resolution, scalable tiled display. It also employs a tangible user interface using infrared camera tracking and tangible blocks to support intuitive user interaction on the tabletop surface. In this research, the EBITA (Environment for Building Interactive Tangible Applications) framework was developed to provide the modules necessary for easy construction of interactive high-resolution applications that run on the distributed tabletop system. This paper briefly describes the design of the EBITA framework and the detailed implementation of the current prototype of the distributed tabletop system. It then demonstrates two applications developed using the EBITA framework: a 2D high-resolution image viewer and a 3D block-crash game.


Ksii Transactions on Internet and Information Systems | 2011

Effects of Depth Map Quantization for Computer-Generated Multiview Images using Depth Image-Based Rendering

Minyoung Kim; Yongjoo Cho; Hyon-Gon Choo; Jin Woong Kim; Kyoung Shin Park

This paper presents the effects of depth map quantization for multiview intermediate image generation using depth image-based rendering (DIBR). DIBR synthesizes multiple virtual views of a 3D scene from a 2D image and its associated depth map. However, it needs precise depth information in order to generate reliable and accurate intermediate view images for use in multiview 3D display systems. Previous work has extensively studied the pre-processing of the depth map, but little is known about depth map quantization. In this paper, we conduct an experiment to estimate the depth map quantization that affords acceptable image quality for generating DIBR-based multiview intermediate images. The experiment uses computer-generated 3D scenes, in which the multiview images captured directly from the scene are compared to the multiview intermediate images constructed by DIBR with a number of quantized depth maps. The results showed that quantizing the depth map from 16-bit down to 7-bit (more specifically, 96 levels) had no significant effect on DIBR. Hence, a depth map of at least 7 bits is needed to maintain sufficient image quality for a DIBR-based multiview 3D system.


Cooperative Design, Visualization and Engineering | 2005

Designing Virtual Reality Reconstruction of the Koguryo Mural

Yongjoo Cho; Kyoung Shin Park; Soyon Park; Hyungtae Moon

Digital Koguryo is a virtual reality reconstruction of the Koguryo mural tumulus called Anak No. 3. It was designed to help young children learn the cultural background and lifestyle of the ancient Koguryo. This virtual environment brings aspects of Koguryo life and culture to users through a rich, interactive learning environment. In this paper, we present the collaborative design process among the geographically distributed, multidisciplinary researchers who worked to construct the Digital Koguryo project. We also discuss design issues and lessons learned from this collaborative work.

Collaboration


Dive into Yongjoo Cho's collaborations.

Top Co-Authors

Andrew E. Johnson

University of Illinois at Chicago


Thomas G. Moher

University of Illinois at Chicago
