Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jae Joon Han is active.

Publication


Featured research published by Jae Joon Han.


MPEG-V: Bridging the Virtual and Real World | 2015

Applications of MPEG-V Standard

Kyoungro Yoon; Sang-Kyun Kim; Jae Joon Han; Seungju Han; Marius Preda

Virtual worlds such as Second Life and those used in 4D Internet/broadcasting services have become increasingly popular. Life-scale virtual-world presentation and intuitive interaction between users and virtual worlds can provide a more natural and immersive user experience. MPEG-V specifies the associated information representations to enable interoperability between virtual worlds, as well as between the real and virtual worlds. This chapter introduces the architectures and applications of MPEG-V, such as 4D broadcasting/theater, haptic interaction, avatar/object motion control, facial expression cloning, body gesture tracking, seamless interaction, and interoperable virtual worlds.


MPEG-V: Bridging the Virtual and Real World | 2015

Common Tools for MPEG-V and MPEG-V Reference SW with Conformance

Kyoungro Yoon; Sang-Kyun Kim; Jae Joon Han; Seungju Han; Marius Preda

Several tools and types are used in multiple parts of the MPEG-V standard. MPEG-V Part 6 provides definitions and examples for the use of these common tools and types. Most of the classification schemes defined for MPEG-V are also provided in MPEG-V Part 6. A classification scheme is a collection of terms or objects in a certain category, with a name and definition for each term or object, so that when a description references a term or object, its semantics can be understood without ambiguity. MPEG-V Part 7 provides reference software with conformance rules, so that users or implementers of the MPEG-V standard can easily start building an MPEG-V application without violating the guidelines and rules embedded in the specification.
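The idea of a classification scheme described above can be illustrated with a toy sketch: a named collection of terms, each bound to an identifier and a definition, so a description can reference a term unambiguously. The terms and URNs below are invented examples, not entries from an actual MPEG-V classification scheme.

```python
# Toy classification scheme: term identifiers mapped to definitions.
# The URNs and definitions are hypothetical, for illustration only.
scent_scheme = {
    "urn:example:scent:rose": "Floral scent resembling rose",
    "urn:example:scent:citrus": "Fresh scent resembling citrus fruit",
}

def resolve(term_urn: str) -> str:
    """Return the definition bound to a term identifier."""
    return scent_scheme[term_urn]

print(resolve("urn:example:scent:rose"))  # Floral scent resembling rose
```

Because each term carries its own identifier and definition, two systems that share the scheme agree on the meaning of a referenced term without further negotiation.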


MPEG-V: Bridging the Virtual and Real World | 2015

Adding Sensorial Effects to Media Content

Kyoungro Yoon; Sang-Kyun Kim; Jae Joon Han; Seungju Han; Marius Preda

The provision of sensory effects in addition to audiovisual media content has recently gained attention because greater sensorial stimulation deepens immersion in the user experience. For the successful industrial deployment of multiple sensorial media (MulSeMedia), it is important to provide an easy and efficient means of producing MulSeMedia content. In other words, standard descriptions of sensorial effects (SEs) are one of the key success factors for the MulSeMedia industry. This chapter introduces the standard syntax and semantics from MPEG-V Part 3 for describing such SEs, along with valid instances.
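A minimal sketch of what an XML description of a sensorial effect might look like, built with Python's standard library. The element and attribute names here are simplified illustrations, not the normative MPEG-V Part 3 syntax.

```python
# Build a simplified, hypothetical sensory-effect description as XML.
# Element/attribute names are illustrative, not the MPEG-V schema.
import xml.etree.ElementTree as ET

sem = ET.Element("SEM")  # root of the effect description
ET.SubElement(sem, "Effect", {
    "type": "WindType",    # kind of sensorial effect
    "intensity": "0.5",    # normalized strength
    "duration": "3000",    # milliseconds
})
print(ET.tostring(sem, encoding="unicode"))
```

In the real standard, such effect descriptions are synchronized with the audiovisual timeline so a rendering device can fire each effect at the right moment.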


MPEG-V: Bridging the Virtual and Real World | 2015

Interoperable Virtual World

Kyoungro Yoon; Sang-Kyun Kim; Jae Joon Han; Seungju Han; Marius Preda

The characteristics of objects in a virtual world are specified in ISO/IEC 23005-4. This standard describes the metadata for avatars and virtual objects rather than their associated resources, such as sound and animation files. The metadata can then be used to characterize a new avatar in another virtual world by reusing the characteristics it defines. The benefit of such standardization is that it makes isolated virtual worlds interoperable. Exposing these characteristics through standardized metadata also allows any standard-compatible real-world interface device to control virtual objects through the adaptation engine of a virtual world and to render the resulting effects in the real world.
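The separation described above can be sketched as follows: metadata carries an avatar's characteristics between worlds, while resources such as animation files are only referenced, not embedded. The field names are invented for illustration and are not taken from ISO/IEC 23005-4.

```python
# Hypothetical avatar metadata: characteristics travel with the avatar;
# resource files are referenced rather than embedded. Keys are illustrative.
avatar_metadata = {
    "name": "traveler-01",
    "appearance": {"height_cm": 175, "hair_color": "black"},
    "abilities": ["walk", "wave", "jump"],
    "animation_resources": ["walk.bvh"],  # referenced, not embedded
}

def migrate(metadata: dict, target_world: str) -> dict:
    """Re-create an avatar in another virtual world from the same metadata."""
    return {"world": target_world, **metadata}

clone = migrate(avatar_metadata, "world-B")
```

Because both worlds interpret the same metadata, the avatar keeps its identity and characteristics across the migration.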


MPEG-V: Bridging the Virtual and Real World | 2015

Introduction to MPEG-V Standards

Kyoungro Yoon; Sang-Kyun Kim; Jae Joon Han; Seungju Han; Marius Preda

The addition of sensory effects to audiovisual content has recently gained the attention of academia and industry because greater sensorial stimulation increases immersion and enriches the user experience. Moreover, digital virtual worlds provide experiences through virtual characters (called avatars) by simulating realistic fictional environments. In addition, recent advances in natural interaction interfaces enable users to control virtual worlds through their own motions, combined with surrounding contextual information obtained through sensors. Within this context, ISO/IEC has developed the MPEG-V standard to bridge the real and virtual worlds.


MPEG-V: Bridging the Virtual and Real World | 2015

Standard Interfacing Format for Actuators and Sensors

Kyoungro Yoon; Sang-Kyun Kim; Jae Joon Han; Seungju Han; Marius Preda

In an MPEG-V environment, there are two types of devices: actuators and sensors. Actuators generate or render sensorial effects in the real world. Sensors capture user input or environmental information, which is used to adapt the effects or as control commands for a virtual world. MPEG-V provides standardized interfaces for actuators and sensors by defining the Interaction Information Description Language (IIDL), together with the Device Command Vocabulary (DCV) and the Sensed Information Vocabulary (SIV), based on XML Schema. IIDL provides a framework for the interfacing format, DCV provides a command format for actuators, and SIV provides an information format for sensors.
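The two interface directions named above can be sketched as a pair of records: a device command sent to an actuator (the DCV side) and sensed information received from a sensor (the SIV side). Field names are hypothetical illustrations, not the standardized XML vocabularies.

```python
# Hypothetical sketch of the two MPEG-V interface directions.
# Field names are illustrative only, not the DCV/SIV schemas.
from dataclasses import dataclass

@dataclass
class DeviceCommand:       # adaptation engine -> actuator (DCV direction)
    device_id: str
    intensity: float       # normalized 0.0-1.0

@dataclass
class SensedInformation:   # sensor -> adaptation engine (SIV direction)
    sensor_id: str
    value: float
    unit: str

cmd = DeviceCommand("fan-01", 0.7)
obs = SensedInformation("thermo-01", 21.5, "celsius")
```

In the standard itself, both directions are serialized as XML instances under the IIDL framework so heterogeneous devices can interoperate.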


MPEG-V: Bridging the Virtual and Real World | 2015

Chapter 4 – Adapting Sensory Effects and Adapted Control of Devices

Kyoungro Yoon; Sang-Kyun Kim; Jae Joon Han; Seungju Han; Marius Preda

All actuators and sensors used in rendering sensorial effects have their own capabilities or sensing environments. When commanding an actuator to render an effect, or when obtaining information from a sensor to control a virtual world, the capabilities of the devices, such as the operating range of an actuator or the accuracy of a sensor, should be considered for fine-grained control. Another aspect of adaptation is user preference: users may prefer certain sensorial effects, or a particular adaptation of the sensed information used to control a virtual world. For a standardized description of such information, MPEG-V provides the Control Information Description Language as a framework for describing actuator capabilities, sensor capabilities, and user preferences regarding sensorial effects and the adaptation of sensor inputs.
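The adaptation step described above can be sketched in a few lines: a desired effect intensity is scaled by a user preference and then clamped to the actuator's operating range. The function name, units, and parameters are hypothetical, assumed for illustration rather than taken from the MPEG-V specification.

```python
# Illustrative adaptation: apply a user-preference scale, then respect
# the actuator's capability range. Names and units are hypothetical.
def adapt_intensity(desired: float, pref_scale: float,
                    cap_min: float, cap_max: float) -> float:
    """Map a desired effect intensity to a command the actuator can execute."""
    wanted = desired * pref_scale              # apply user preference
    return max(cap_min, min(cap_max, wanted))  # clamp to actuator capability

# A fan limited to 0-60% output, with a user who prefers half-strength wind:
command = adapt_intensity(desired=80.0, pref_scale=0.5, cap_min=0.0, cap_max=60.0)
print(command)  # 40.0
```

A real adaptation engine would read the capability range and preference from CIDL descriptions rather than hard-coded arguments, but the clamping logic is the same in spirit.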


Archive | 2012

Apparatus and method for controlling user interface using sound recognition

Jae Joon Han; Chang Kyu Choi; Byung In Yoo


Archive | 2010

Virtual world processing device and method

Hyun Jeong Lee; Jae Joon Han; Seung Ju Han; Joon Ah Park


Archive | 2010

Method and device of measuring location, and moving object

Hyong Euk Lee; Bho Ram Lee; Won-chul Bang; Jae Joon Han

Collaboration


Dive into Jae Joon Han's collaborations.
