
Publication


Featured research published by Bum-Suk Choi.


Multimedia Signal Processing | 2010

4-D broadcasting with MPEG-V

Kyoungro Yoon; Bum-Suk Choi; Eun-Seo Lee; Tae-Beom Lim

Advances in media technologies have brought 3-D TV into the home and 4-D movies to your neighbourhood. We present a framework for 4-D broadcasting that brings 4-D entertainment home based on the MPEG-V standard. A complete framework for 4-D entertainment, from the authoring of sensory effects to environment description and the commanding of rendering devices for those effects, can be supported by MPEG-V and a couple of other standards. Part 2 of MPEG-V provides tools for describing the capabilities of sensory devices and sensors, Part 3 provides tools to describe sensory effects, and Part 5 provides tools to actually interact with the sensory devices and sensors.
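
The division of labor among the three MPEG-V parts can be sketched as a minimal model: effect descriptions (Part 3) are matched against device capability descriptions (Part 2) to produce concrete device commands (the role of Part 5). Real MPEG-V metadata is XML-based; all class and field names below are illustrative assumptions, not the standard's actual schema.

```python
from dataclasses import dataclass

@dataclass
class SensoryEffect:          # roughly the role of MPEG-V Part 3
    effect_type: str          # e.g. "wind", "light", "vibration"
    intensity: float          # normalized 0.0-1.0
    start_ms: int
    duration_ms: int

@dataclass
class DeviceCapability:       # roughly the role of MPEG-V Part 2
    effect_type: str
    max_intensity: float      # the device's maximum renderable intensity

def to_device_command(effect: SensoryEffect, cap: DeviceCapability) -> dict:
    """Map an authored effect onto a concrete device command,
    the kind of interaction MPEG-V Part 5 standardizes."""
    if effect.effect_type != cap.effect_type:
        raise ValueError("device cannot render this effect")
    # Clamp the authored intensity to what the device can actually do.
    level = min(effect.intensity, cap.max_intensity)
    return {"type": effect.effect_type, "level": level,
            "start_ms": effect.start_ms, "duration_ms": effect.duration_ms}

cmd = to_device_command(
    SensoryEffect("wind", 0.8, 1000, 500),
    DeviceCapability("wind", 0.6),
)
print(cmd)  # authored intensity 0.8 is clamped to the device's 0.6
```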


International Conference on Information Science and Applications | 2011

Streaming Media with Sensory Effect

Bum-Suk Choi; Eun-Seo Lee; Kyoungro Yoon

4-D is an irresistible new trend in the movie industry, and digital cinema technology is pervasive in most theatre systems, delivering movies to theatres simultaneously by streaming. We present a framework for a streaming service with sensory effects that brings 4-D entertainment home based on the MPEG-V standard. A complete framework for 4-D streaming, from the authoring of sensory effects to environment description and the commanding of rendering devices for those effects, can be supported by MPEG-V and a couple of other standards. The standard and the framework technology are now ready for streaming 4-D content to users at home.


International Conference on Advanced Communication Technology | 2008

A Metadata Schema Design on Representation of Sensory Effect Information for Sensible Media and its Service Framework using UPnP

Shinjee Pyo; Sanghyun Joo; Bum-Suk Choi; Munchurl Kim; Jae-Gon Kim

With the advent of various media services and the development of audio and video devices, we can enjoy media more effectively and realistically. Conventional media content is presented mostly via speakers, TVs, and LCD monitors. Beyond media rendering alone, if media content is interlinked with peripheral devices during playback, it is possible to create fascinating effects on audiovisual content. In this paper, we propose device-rendered sensible media, a metadata schema for representing the effect and control information, and a service framework for device-rendered sensible media based on the UPnP framework.
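
To make the idea concrete, an effect-information metadata document for device-rendered sensible media might look like the following sketch, where a UPnP control point translates each entry into an action invocation on the matching peripheral device. The element and attribute names are hypothetical, not the paper's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical effect/control metadata; names are illustrative only.
doc = """
<SensibleMedia>
  <Effect device="fan" action="on" level="70" begin="00:01:05" end="00:01:20"/>
  <Effect device="lamp" action="dim" level="30" begin="00:02:00" end="00:02:10"/>
</SensibleMedia>
"""

# A UPnP control point would map each entry to a device action invocation.
for effect in ET.fromstring(doc):
    print(f"invoke {effect.get('device')}.{effect.get('action')} "
          f"level={effect.get('level')} at {effect.get('begin')}")
```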


IEEE Transactions on Broadcasting | 2014

Design and Implementation for Interactive Augmented Broadcasting System

Junghak Kim; Jeounglak Ha; Bum-Suk Choi; Youngho Jeong; Jin-Woo Hong

In typical augmented reality (AR) applications, usually running on handheld devices, users can rotate, resize, or move AR objects, instruct them to perform animations, and interact with them in various ways for a more natural user experience. In regular TV, by contrast, an AR-applied TV program is a composite video of the original video and AR objects, already processed on the transmission side of the broadcasting system, so the viewer cannot influence the presentation of the program. Until now, therefore, no authentic AR service has been practically provided over TV broadcasting. In this paper, a new paradigm for AR services in TV broadcasting is introduced. To give the viewer an interactive AR experience, we first propose moving the point where the original video and the augmented objects are composited from the transmission side to the TV terminal, so that all augmented objects are graphically rendered on the TV terminal. Building on this prerequisite, we propose an augmented broadcasting service system based on a hybrid framework that combines conventional TV broadcasting with the broadband Internet. In the proposed system, the augmented objects are downloaded and rendered in compliance with instructions described in metadata. The metadata includes time information to synchronize the presentation of the original video images with the rendered augmented-object images, and it is delivered in compliance with the MPEG-2 system standard. The architecture of the MPEG-2 transport stream multiplexer must be slightly modified to insert the time information needed for synchronous compositing, and the metadata must be packetized in compliance with the MPEG-2 system, so we propose a packetized elementary stream (PES) format for the metadata.
Additionally, we propose an interaction mechanism using a second-screen device with touch-screen capability. With second-screen devices, TV viewers can manipulate the augmented objects displayed on the TV screen, and additional content synchronized or correlated with the TV program can automatically pop up on the second-screen device.
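
The terminal-side compositing step driven by the time information can be sketched as follows: each downloaded object carries a presentation time, and the terminal overlays an object once the video presentation timestamp (PTS) reaches it. The field names and 90 kHz clock values are illustrative assumptions, not the paper's actual metadata format.

```python
# Hypothetical augmentation metadata, downloaded over broadband.
augmentation_metadata = [
    {"object": "player_stats.obj", "show_at_pts": 90_000},   # 1 s on a 90 kHz clock
    {"object": "score_banner.obj", "show_at_pts": 270_000},  # 3 s
]

def objects_to_composite(current_pts: int) -> list:
    """Return the augmented objects whose scheduled presentation time has
    been reached, so the terminal can render them over the video frame."""
    return [m["object"] for m in augmentation_metadata
            if m["show_at_pts"] <= current_pts]

print(objects_to_composite(100_000))  # ['player_stats.obj']
```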


The Journal of Supercomputing | 2018

Deep feature learning for person re-identification in a large-scale crowdsourced environment

Seon Ho Oh; Seung-Wan Han; Bum-Suk Choi; Geon-Woo Kim; Kyung-Soo Lim

Finding the same individual across cameras in disjoint views at different locations and times, which is known as person re-identification (re-id), is an important but difficult task in intelligent visual surveillance. However, to build a practical re-id system for large-scale and crowdsourced environments, the existing approaches are largely unsuitable because of their high model complexity. In this paper, we present a deep feature learning framework for automated large-scale person re-id with low computational cost and memory usage. The experimental results show that the proposed framework is comparable to the state-of-the-art methods while having low model complexity.


Cluster Computing | 2017

A scheme of AR-based personalized interactive broadcasting service in terrestrial digital broadcasting system

Soonchoul Kim; Moon-Hyun Kim; Kuinam J. Kim; Bum-Suk Choi; Jinwook Chung

This paper proposes an AR-based personalized interactive broadcasting system delivered via broadcast and broadband networks in a terrestrial digital broadcasting (DTV) environment. Augmented reality (AR) is a kind of mixed reality in which 2-D/3-D graphics are integrated into the real world to enhance the user experience and enrich information. The goal of the AR-based interactive broadcasting system is to enable a broadcast program using AR technology to blend augmented content (2-D/3-D graphic objects) with broadcast content in real time on receiving terminals. The proposed service scheme enables a viewer-selectable augmented broadcasting service provided by two content providers with authorization from a broadcaster, and we show the implemented results in a DTV environment.


Archive | 2008

Sensory effect media generating and consuming method and apparatus thereof

Sanghyun Joo; Bum-Suk Choi; Hae-Ryong Lee; Kwang-Roh Park; Chaekyu Kim; Munchurl Kim; Jae-Gon Kim


Archive | 2010

Method and apparatus for representing sensory effects using sensory device capability metadata

Bum-Suk Choi; Sanghyun Joo; Jong-Hyun Jang; Kwang-Roh Park


Archive | 2009

Method and apparatus for representing sensory effects and computer readable recording medium storing user sensory preference metadata

Bum-Suk Choi; Sanghyun Joo; Hae-Ryong Lee; Seungsoon Park; Kwang-Roh Park


Etri Journal | 2013

A Metadata Design for Augmented Broadcasting and Testbed System Implementation

Bum-Suk Choi; Jeonghak Kim; Soonchoul Kim; Youngho Jeong; Jin Woo Hong; Won Don Lee

Collaboration


Dive into Bum-Suk Choi's collaboration.

Top Co-Authors

- Jin-Woo Hong, Electronics and Telecommunications Research Institute
- Sanghyun Joo, Electronics and Telecommunications Research Institute
- Youngho Jeong, Electronics and Telecommunications Research Institute
- Soonchoul Kim, Electronics and Telecommunications Research Institute
- Kwang-Roh Park, Electronics and Telecommunications Research Institute
- Hae-Ryong Lee, Electronics and Telecommunications Research Institute
- Jeounglak Ha, Electronics and Telecommunications Research Institute
- JeHo Nam, Electronics and Telecommunications Research Institute
- Junghak Kim, Electronics and Telecommunications Research Institute
- Seungsoon Park, Electronics and Telecommunications Research Institute