
Publication


Featured research published by Jehee Lee.


International Conference on Computer Graphics and Interactive Techniques | 2002

Interactive control of avatars animated with human motion data

Jehee Lee; Jinxiang Chai; Paul S. A. Reitsma; Jessica K. Hodgins; Nancy S. Pollard

Real-time control of three-dimensional avatars is an important problem in the context of computer games and virtual environments. Avatar animation and control is difficult, however, because a large repertoire of avatar behaviors must be made available, and the user must be able to select from this set of behaviors, possibly with a low-dimensional input device. One appealing approach to obtaining a rich set of avatar behaviors is to collect an extended, unlabeled sequence of motion data appropriate to the application. In this paper, we show that such a motion database can be preprocessed for flexibility in behavior and efficient search and exploited for real-time avatar control. Flexibility is created by identifying plausible transitions between motion segments, and efficient search through the resulting graph structure is obtained through clustering. Three interface techniques are demonstrated for controlling avatar motion using this data structure: the user selects from a set of available choices, sketches a path through an environment, or acts out a desired motion in front of a video camera. We demonstrate the flexibility of the approach through four different applications and compare the avatar motion to directly recorded human motion.
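
To make the preprocessing idea concrete, here is a minimal sketch of a motion-graph build in Python: a transition is added wherever two pose vectors are closer than a threshold, and a plain k-means pass coarsens the frames for search. The flat pose representation, distance metric, threshold, and cluster count are illustrative assumptions, not the representation or metric used in the paper.

```python
import numpy as np

def build_motion_graph(frames, threshold=0.5):
    """Toy motion-graph construction: frames is an (N, D) array of pose
    vectors; an edge i -> j+1 is added when frame i and frame j are close
    enough that a transition between them is plausible."""
    n = len(frames)
    edges = {i: [i + 1] for i in range(n - 1)}       # consecutive frames always connect
    for i in range(n - 1):
        for j in range(n - 1):
            if abs(i - j) > 1 and np.linalg.norm(frames[i] - frames[j]) < threshold:
                edges[i].append(j + 1)               # plausible transition
    return edges

def cluster_frames(frames, k=8, iters=20, seed=0):
    """Plain k-means over poses, used only to coarsen the graph for search."""
    rng = np.random.default_rng(seed)
    centers = frames[rng.choice(len(frames), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(frames[:, None, :] - centers[None, :, :], axis=2), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = frames[labels == c].mean(axis=0)
    return labels

# Usage with random stand-in data in place of real motion capture frames
frames = np.random.default_rng(1).normal(size=(200, 30))
graph = build_motion_graph(frames, threshold=6.0)
labels = cluster_frames(frames)
```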


International Conference on Computer Graphics and Interactive Techniques | 1999

A hierarchical approach to interactive motion editing for human-like figures

Jehee Lee; Sung Yong Shin

This paper presents a technique for adapting existing motion of a human-like character to have the desired features that are specified by a set of constraints. This problem can be typically formulated as a spacetime constraint problem. Our approach combines a hierarchical curve fitting technique with a new inverse kinematics solver. Using the kinematics solver, we can adjust the configuration of an articulated figure to meet the constraints in each frame. Through the fitting technique, the motion displacement of every joint at each constrained frame is interpolated and thus smoothly propagated across frames. We are able to adaptively add motion details to satisfy the constraints within a specified tolerance by adopting a multilevel B-spline representation, which also provides a speedup for the interpolation. The performance of our system is further enhanced by the new inverse kinematics solver. We present a closed-form solution to compute the joint angles of a limb linkage. This analytical method greatly reduces the burden of a numerical optimization to find the solutions for full degrees of freedom of a human-like articulated figure. We demonstrate that the technique can be used for retargeting a motion to compensate for geometric variations caused by both characters and environments. Furthermore, we can also use this technique for directly manipulating a motion clip through a graphical interface. CR Categories: I.3.7 [Computer Graphics]: Three-dimensional Graphics—Animation; G.1.2 [Numerical Analysis]: Approximation—Spline and piecewise polynomial approximation
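
The closed-form limb solve the abstract mentions can be illustrated with a planar two-link arm: the law of cosines recovers shoulder and elbow angles for a reachable target. The link lengths and the planar setting are illustrative assumptions, not the paper's full 3D formulation.

```python
import numpy as np

def two_link_ik(target, l1=0.45, l2=0.42):
    """Closed-form planar two-link IK: returns (shoulder, elbow) angles in
    radians that place the wrist at `target` (a 2D point in the shoulder
    frame).  The target is clamped to the reachable annulus before solving."""
    d = np.linalg.norm(target)
    d = np.clip(d, abs(l1 - l2) + 1e-6, l1 + l2 - 1e-6)   # keep the target reachable
    # Law of cosines for the interior elbow angle; the joint rotation is its supplement
    cos_elbow = (l1**2 + l2**2 - d**2) / (2.0 * l1 * l2)
    elbow = np.pi - np.arccos(np.clip(cos_elbow, -1.0, 1.0))
    # Shoulder angle = direction to target minus the offset caused by the bent elbow
    cos_alpha = (l1**2 + d**2 - l2**2) / (2.0 * l1 * d)
    alpha = np.arccos(np.clip(cos_alpha, -1.0, 1.0))
    shoulder = np.arctan2(target[1], target[0]) - alpha
    return shoulder, elbow

# Usage: reach toward a point in front of and below the shoulder
print(two_link_ik(np.array([0.5, -0.3])))
```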


ACM Transactions on Graphics | 2001

Computer puppetry: An importance-based approach

Hyun Joon Shin; Jehee Lee; Sung Yong Shin; Michael Gleicher

Computer puppetry maps the movements of a performer to an animated character in real time. In this article, we provide a comprehensive solution to the problem of transferring the observations of the motion capture sensors to an animated character whose size and proportion may be different from the performer's. Our goal is to map as many of the important aspects of the motion to the target character as possible, while meeting the online, real-time demands of computer puppetry. We adopt a Kalman filter scheme that addresses motion capture noise issues in this setting. We provide the notion of dynamic importance of an end-effector that allows us to determine what aspects of the performance must be kept in the resulting motion. We introduce a novel inverse kinematics solver that realizes these important aspects within tight real-time constraints. Our approach is demonstrated by its application to broadcast television performances.
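
As a rough illustration of the noise-filtering step, here is a constant-velocity Kalman filter applied to a single noisy marker coordinate. The state model and the noise variances q and r are placeholder assumptions, not the scheme described in the article.

```python
import numpy as np

def kalman_smooth_1d(measurements, dt=1/60, q=1e-3, r=1e-2):
    """Constant-velocity Kalman filter over one noisy coordinate stream.
    State x = [position, velocity]; q and r are process/measurement variances."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
    H = np.array([[1.0, 0.0]])                   # we observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    R = np.array([[r]])
    x = np.array([measurements[0], 0.0])
    P = np.eye(2)
    out = []
    for z in measurements:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        y = z - H @ x                            # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# Usage on a synthetic noisy trajectory
t = np.linspace(0, 1, 60)
noisy = np.sin(2 * np.pi * t) + np.random.default_rng(0).normal(0, 0.1, t.size)
smoothed = kalman_smooth_1d(noisy)
```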


Symposium on Computer Animation | 2007

Group behavior from video: a data-driven approach to crowd simulation

Kang Hoon Lee; Myung Geol Choi; Qyoun Hong; Jehee Lee

Crowd simulation techniques have frequently been used to animate a large group of virtual humans in computer graphics applications. We present a data-driven method of simulating a crowd of virtual humans that exhibit behaviors imitating real human crowds. To do so, we record the motion of a human crowd from an aerial view using a camcorder, extract the two-dimensional moving trajectories of each individual in the crowd, and then learn an agent model from observed trajectories. The agent model decides each agent's actions based on features of the environment and the motion of nearby agents in the crowd. Once the agent model is learned, we can simulate a virtual crowd that behaves similarly to the real crowd in the video. The versatility and flexibility of our approach are demonstrated through examples in which various characteristics of group behaviors are captured and reproduced in simulated crowds.
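
A toy version of such an agent model can be written as a nearest-example lookup: the state features are the offsets to the few closest agents, and the action replayed is the velocity recorded in the most similar observed situation. The feature design, neighbor count, and the ExampleBasedAgent class below are illustrative assumptions, not the learned model from the paper.

```python
import numpy as np

def state_features(pos, others, k=3):
    """Feature vector: displacements to the k nearest other agents, ordered by distance."""
    d = others - pos
    order = np.argsort(np.linalg.norm(d, axis=1))[:k]
    feats = d[order]
    if len(feats) < k:                                   # pad when few neighbors exist
        feats = np.vstack([feats, np.zeros((k - len(feats), 2))])
    return feats.ravel()

class ExampleBasedAgent:
    """Stores (feature, velocity) examples from tracked trajectories and, at
    simulation time, replays the velocity of the most similar stored example."""
    def __init__(self):
        self.features, self.actions = [], []

    def observe(self, pos, others, velocity):
        self.features.append(state_features(pos, others))
        self.actions.append(np.asarray(velocity, dtype=float))

    def act(self, pos, others):
        f = state_features(pos, others)
        dists = [np.linalg.norm(f - g) for g in self.features]
        return self.actions[int(np.argmin(dists))]

# Usage with made-up tracked data: observed agents that drift to the right
rng = np.random.default_rng(0)
agent = ExampleBasedAgent()
for _ in range(200):
    pos, others = rng.uniform(0, 10, 2), rng.uniform(0, 10, (5, 2))
    agent.observe(pos, others, velocity=[1.0, 0.0])
print(agent.act(np.array([5.0, 5.0]), rng.uniform(0, 10, (5, 2))))
```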


ACM Transactions on Graphics | 2003

Planning biped locomotion using motion capture data and probabilistic roadmaps

Min Gyu Choi; Jehee Lee; Sung Yong Shin

Typical high-level directives for locomotion of human-like characters are useful for interactive games and simulations as well as for off-line production animation. In this paper, we present a new scheme for planning natural-looking locomotion of a biped figure to facilitate rapid motion prototyping and task-level motion generation. Given start and goal positions in a virtual environment, our scheme gives a sequence of motions to move from the start to the goal using a set of live-captured motion clips. Based on a novel combination of probabilistic path planning and hierarchical displacement mapping, our scheme consists of three parts: roadmap construction, roadmap search, and motion generation. We randomly sample a set of valid footholds of the biped figure from the environment to construct a directed graph, called a roadmap, that guides the locomotion of the figure. Every edge of the roadmap is associated with a live-captured motion clip. Augmenting the roadmap with a posture transition graph, we traverse it to obtain the sequence of input motion clips and that of target footprints. We finally adapt the motion sequence to the constraints specified by the footprint sequence to generate a desired locomotion.
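
The roadmap-construction and search steps can be sketched as a plain probabilistic roadmap over 2D footholds with Dijkstra search. The sampling area, step radius, and validity test below are placeholders, and the sketch omits the posture transition graph and the motion clips attached to roadmap edges described above.

```python
import heapq
import numpy as np

def build_roadmap(n=200, step_radius=1.5, is_valid=lambda p: True, seed=0):
    """Sample candidate footholds in a 10x10 area, keep the valid ones, and
    connect pairs that lie within one step of each other."""
    rng = np.random.default_rng(seed)
    nodes = [p for p in rng.uniform(0, 10, (n, 2)) if is_valid(p)]
    edges = {i: [] for i in range(len(nodes))}
    for i in range(len(nodes)):
        for j in range(len(nodes)):
            d = np.linalg.norm(nodes[i] - nodes[j])
            if i != j and d < step_radius:
                edges[i].append((j, d))
    return nodes, edges

def shortest_path(edges, start, goal):
    """Dijkstra over the roadmap; returns a list of node indices or None."""
    dist, prev = {start: 0.0}, {}
    queue = [(0.0, start)]
    while queue:
        d, u = heapq.heappop(queue)
        if u == goal:
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return path[::-1]
        if d > dist.get(u, np.inf):
            continue
        for v, w in edges[u]:
            if d + w < dist.get(v, np.inf):
                dist[v], prev[v] = d + w, u
                heapq.heappush(queue, (d + w, v))
    return None

# Usage: plan a foothold sequence between two sampled nodes
nodes, edges = build_roadmap()
print(shortest_path(edges, start=0, goal=len(nodes) - 1))
```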


Symposium on Computer Animation | 2004

Precomputing avatar behavior from human motion data

Jehee Lee; Kang Hoon Lee

Creating controllable, responsive avatars is an important problem in computer games and virtual environments. Recently, large collections of motion capture data have been exploited for increased realism in avatar animation and control. Large motion sets have the advantage of accommodating a broad variety of natural human motion. However, when a motion set is large, the time required to identify an appropriate sequence of motions is the bottleneck for achieving interactive avatar control. In this paper, we present a novel method of precomputing avatar behavior from unlabelled motion data in order to animate and control avatars at minimal runtime cost. Based on dynamic programming, our method finds a control policy that indicates how the avatar should act in any given situation. We demonstrate the effectiveness of our approach through examples that include avatars interacting with each other and with the user.
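
Since the precomputation is based on dynamic programming, a minimal tabular value-iteration sketch conveys the idea: every (state, action) pair has a known successor and reward, and the result is a lookup table giving the best action in any state. The random toy transition and reward tables below are stand-ins, not the avatar state space used in the paper.

```python
import numpy as np

def value_iteration(n_states, n_actions, transition, reward, gamma=0.9, iters=100):
    """Tabular value iteration.  transition[s][a] is the next state and
    reward[s][a] the immediate payoff; the returned policy maps every state
    to the action that maximizes discounted long-term reward."""
    V = np.zeros(n_states)
    for _ in range(iters):
        Q = np.array([[reward[s][a] + gamma * V[transition[s][a]]
                       for a in range(n_actions)] for s in range(n_states)])
        V = Q.max(axis=1)
    return Q.argmax(axis=1)        # precomputed policy: best action per state

# Usage with a random toy problem (states ~ motion clips x coarse situations)
rng = np.random.default_rng(0)
n_states, n_actions = 50, 4
transition = rng.integers(0, n_states, (n_states, n_actions))
reward = rng.uniform(0, 1, (n_states, n_actions))
policy = value_iteration(n_states, n_actions, transition, reward)
```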


International Conference on Computer Graphics and Interactive Techniques | 2007

Simulating biped behaviors from human motion data

Kwang Won Sok; Manmyung Kim; Jehee Lee

Physically based simulation of human motions is an important issue in the context of computer animation, robotics and biomechanics. We present a new technique for allowing our physically-simulated planar biped characters to imitate human behaviors. Our contribution is twofold. We developed an optimization method that transforms any (either motion-captured or kinematically synthesized) biped motion into a physically-feasible, balance-maintaining simulated motion. Our optimization method allows us to collect a rich set of training data that contains stylistic, personality-rich human behaviors. Our controller learning algorithm facilitates the creation and composition of robust dynamic controllers that are learned from training data. We demonstrate a planar articulated character that is dynamically simulated in real time, equipped with an integrated repertoire of motor skills, and controlled interactively to perform desired motions.
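
As a very loose illustration of turning a motion into a balance-respecting one, the sketch below clips a center-of-mass trajectory to a support interval frame by frame and then smooths the per-frame corrections, in the spirit of a displacement-map adjustment. The support interval, smoothing rule, and feasibility proxy are illustrative assumptions and are far simpler than the optimization described above.

```python
import numpy as np

def balance_adjust(com_x, support_min, support_max, smooth_iters=50):
    """Toy feasibility pass: per frame, compute the smallest horizontal offset
    that puts the projected center of mass inside the support interval, then
    smooth the offsets so the correction stays continuous.  The smoothing
    favors continuity over exact constraint satisfaction in this toy."""
    offset = np.clip(com_x, support_min, support_max) - com_x   # per-frame correction
    for _ in range(smooth_iters):                               # simple Laplacian smoothing
        offset[1:-1] = 0.5 * offset[1:-1] + 0.25 * (offset[:-2] + offset[2:])
    return com_x + offset

# Usage: a center-of-mass path that drifts outside a fixed support region
t = np.linspace(0, 1, 120)
com_x = 0.3 * np.sin(4 * np.pi * t)
adjusted = balance_adjust(com_x, support_min=-0.2, support_max=0.2)
```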


International Conference on Computer Graphics and Interactive Techniques | 2010

Data-driven biped control

Yoonsang Lee; Sung-Eun Kim; Jehee Lee

We present a dynamic controller to physically simulate under-actuated three-dimensional full-body biped locomotion. Our data-driven controller takes motion capture reference data to reproduce realistic human locomotion through real-time physically based simulation. The key idea is modulating the reference trajectory continuously and seamlessly such that even a simple dynamic tracking controller can follow the reference trajectory while maintaining its balance. In our framework, biped control can be facilitated by a large array of existing data-driven animation techniques because our controller can take a stream of reference data generated on-the-fly at runtime. We demonstrate the effectiveness of our approach through examples that allow bipeds to turn, spin, and walk while steering their direction interactively.
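
A schematic version of tracking with continuous reference modulation: the reference angle is shifted by a balance feedback term before a PD servo computes the tracking torque. The gains, the feedback rule, and the toy single-joint integrator are illustrative assumptions, not the controller in the paper.

```python
import numpy as np

def tracking_step(theta, theta_dot, theta_ref, com_vel_error,
                  kp=300.0, kd=30.0, balance_gain=0.2):
    """One control step: shift the reference by a balance feedback term, then
    compute a PD torque that tracks the shifted reference."""
    modulated_ref = theta_ref + balance_gain * com_vel_error   # continuous modulation
    torque = kp * (modulated_ref - theta) - kd * theta_dot     # simple PD tracking
    return torque

# Usage: track a sinusoidal hip reference while the COM drifts forward
dt, theta, theta_dot = 1 / 600, 0.0, 0.0
for i in range(600):
    theta_ref = 0.3 * np.sin(2 * np.pi * i * dt)
    torque = tracking_step(theta, theta_dot, theta_ref, com_vel_error=0.05)
    theta_dot += torque * dt          # unit inertia, no gravity: a toy integrator
    theta += theta_dot * dt
```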


International Conference on Computer Graphics and Interactive Techniques | 2010

Morphable crowds

Eunjung Ju; Myung Geol Choi; Minji Park; Jehee Lee; Kang Hoon Lee; Shigeo Takahashi

Crowd simulation has been an important research field due to its diverse range of applications that include film production, military simulation, and urban planning. A challenging problem is to provide simple yet effective control over captured and simulated crowds to synthesize intended group motions. We present a new method that blends existing crowd data to generate a new crowd animation. The new animation can include an arbitrary number of agents, extends for an arbitrary duration, and yields a natural-looking mixture of the input crowd data. The main benefit of this approach is to create new spatio-temporal crowd behavior in an intuitive and predictable manner. It is accomplished by introducing a morphable crowd model that allows us to encode the formations and individual trajectories in crowd data. Then, its original spatio-temporal behavior can be reconstructed and interpolated at an arbitrary scale using our morphable model.
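
A toy blend of two crowd examples can be written by pairing agents across the inputs and interpolating their trajectories. The greedy pairing by starting position and the equal agent counts and durations assumed below are simplifications; the sketch does not capture the morphable model's ability to change the crowd's size or duration.

```python
import numpy as np

def blend_crowds(traj_a, traj_b, weight=0.5):
    """traj_a, traj_b: (agents, frames, 2) arrays of 2D trajectories with the
    same agent count and duration.  Agents are paired by their starting
    position (greedy nearest match) and the paired trajectories are blended."""
    start_a, start_b = traj_a[:, 0, :], traj_b[:, 0, :]
    remaining = list(range(len(traj_b)))
    blended = np.empty_like(traj_a)
    for i, p in enumerate(start_a):
        dists = [np.linalg.norm(p - start_b[j]) for j in remaining]
        j = remaining.pop(int(np.argmin(dists)))
        blended[i] = (1 - weight) * traj_a[i] + weight * traj_b[j]
    return blended

# Usage: blend a line formation walking right with a circle formation standing still
frames = 100
line = np.stack([np.stack([np.linspace(0, 5, frames), np.full(frames, y)], axis=1)
                 for y in np.linspace(0, 4, 8)])
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
circle = np.stack([np.tile([2 + 2 * np.cos(a), 2 + 2 * np.sin(a)], (frames, 1))
                   for a in angles])
mixed = blend_crowds(line, circle, weight=0.3)
```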


IEEE Transactions on Visualization and Computer Graphics | 2002

General construction of time-domain filters for orientation data

Jehee Lee; Sung Yong Shin

Capturing live motion has gained considerable attention in computer animation as an important motion generation technique. Canned motion data consist of both position and orientation components. Although a great number of signal processing methods are available for manipulating position data, the majority of these methods cannot be generalized easily to orientation data due to the inherent nonlinearity of the orientation space. In this paper, we present a new scheme that enables us to apply a filter mask (or a convolution filter) to orientation data. The key idea is to transform the orientation data into their analogues in a vector space, to apply a filter mask on them, and then to transform the results back to the orientation space. This scheme gives time-domain filters for orientation data that are computationally efficient and satisfy such important properties as coordinate invariance, time invariance, and symmetry. Experimental results indicate that our scheme is useful for various purposes, including smoothing and sharpening.
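
The transform, filter, transform-back scheme can be sketched with unit quaternions: frame-to-frame displacements are mapped to rotation vectors with the quaternion log, a convolution mask is applied there, and the results are integrated back with the exponential map. This simplified version, together with the binomial mask in the usage example, is an assumption made for illustration; it does not reproduce all of the invariance properties established in the paper.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two (w, x, y, z) unit quaternions."""
    w1, x1, y1, z1 = a; w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_conj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def quat_log(q):
    """Unit quaternion -> rotation vector (axis * angle)."""
    w, v = q[0], q[1:]
    s = np.linalg.norm(v)
    return np.zeros(3) if s < 1e-12 else 2.0 * np.arctan2(s, w) * v / s

def quat_exp(r):
    """Rotation vector -> unit quaternion."""
    angle = np.linalg.norm(r)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    return np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * r / angle])

def filter_orientations(quats, mask):
    """Filter a unit-quaternion sequence by (1) mapping frame-to-frame
    displacements into rotation vectors, (2) applying the convolution mask
    there, and (3) integrating the filtered displacements back."""
    disp = np.array([quat_log(quat_mul(quat_conj(quats[i]), quats[i + 1]))
                     for i in range(len(quats) - 1)])
    filtered = np.array([np.convolve(disp[:, k], mask, mode='same')
                         for k in range(3)]).T
    out = [quats[0]]
    for r in filtered:
        out.append(quat_mul(out[-1], quat_exp(r)))
    return np.array(out)

# Usage: smooth a jittery rotation about the z-axis with a small binomial mask
angles = np.linspace(0, np.pi, 50) + np.random.default_rng(0).normal(0, 0.02, 50)
quats = np.array([[np.cos(a / 2), 0, 0, np.sin(a / 2)] for a in angles])
smoothed = filter_orientations(quats, mask=np.array([0.25, 0.5, 0.25]))
```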

Collaboration


Dive into Jehee Lee's collaborations.

Top Co-Authors

Manmyung Kim (Seoul National University)
Moon Seok Park (Seoul National University Bundang Hospital)
Myung Geol Choi (Catholic University of Korea)
Jongmin Kim (Seoul National University)
Eunjung Ju (Seoul National University)
Jungdam Won (Seoul National University)
Kyung Ho Lee (Seoul National University)