Publication


Featured research published by C. Karen Liu.


International Conference on Computer Graphics and Interactive Techniques | 2005

Learning physics-based motion style with nonlinear inverse optimization

C. Karen Liu; Aaron Hertzmann; Zoran Popović

This paper presents a novel physics-based representation of realistic character motion. The dynamical model incorporates several factors of locomotion derived from the biomechanical literature, including relative preferences for using some muscles more than others, elastic mechanisms at joints due to the mechanical properties of tendons, ligaments, and muscles, and variable stiffness at joints depending on the task. When used in a spacetime optimization framework, the parameters of this model define a wide range of styles of natural human movement. Due to the complexity of biological motion, these style parameters are too difficult to design by hand. To address this, we introduce Nonlinear Inverse Optimization, a novel algorithm for estimating optimization parameters from motion capture data. Our method can extract the physical parameters from a single short motion sequence. Once captured, this representation of style is extremely flexible: motions can be generated in the same style but performing different tasks, and styles may be edited to change the physical properties of the body.
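
As a rough illustration of the inverse-optimization idea, the sketch below recovers the cost weights of a simple trajectory optimizer so that its optimum reproduces a reference motion. The 1-D motion model, the cost terms, and the weight names are assumptions made for this example; they are not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

T, goal = 30, 1.0  # horizon length and target end position

def forward_opt(log_w):
    """Inner problem: optimize a 1-D trajectory under the weighted cost."""
    w_smooth, w_vel = np.exp(log_w)            # "style" weights, kept positive
    def cost(z):
        x = np.concatenate(([0.0], z))         # start position pinned at 0
        vel, acc = np.diff(x), np.diff(x, n=2)
        return (w_smooth * np.sum(acc ** 2)
                + w_vel * np.sum(vel ** 2)
                + (x[-1] - goal) ** 2)         # task term with fixed weight
    z0 = np.linspace(0.0, goal, T)[1:]
    return np.concatenate(([0.0], minimize(cost, z0, method="L-BFGS-B").x))

# Pretend this trajectory was captured from a performer with unknown style.
x_ref = forward_opt(np.log([0.5, 2.0]))

def style_loss(log_w):
    """Outer problem: mismatch between the optimizer's motion and the capture."""
    return np.sum((forward_opt(log_w) - x_ref) ** 2)

# Derivative-free search over the style weights (the "inverse" step).
recovered = minimize(style_loss, np.zeros(2), method="Nelder-Mead",
                     options={"maxiter": 100}).x
print("recovered style weights:", np.exp(recovered))
```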


International Conference on Computer Graphics and Interactive Techniques | 2002

Synthesis of complex dynamic character motion from simple animations

C. Karen Liu; Zoran Popović

In this paper we present a general method for rapid prototyping of realistic character motion. We solve for the natural motion from a simple animation provided by the animator. Our framework can be used to produce relatively complex realistic motion with little user effort. We describe a novel constraint detection method that automatically determines different constraints on the character by analyzing the input motion. We show that realistic motion can be achieved by enforcing a small set of linear and angular momentum constraints. This simplified approach helps us avoid the complexities of computing muscle forces. Simpler dynamic constraints also allow us to generate animations of models with greater complexity, performing more intricate motions. Finally, we show that by learning a small set of key parameters that describe a character pose we can help a non-skilled animator rapidly create realistic character motion.
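
For reference, the snippet below is a minimal sketch (not taken from the paper) of the aggregate quantities such momentum constraints act on: treating each body segment as a point mass, it computes the character's linear momentum and its angular momentum about the center of mass, ignoring segment rotational inertia for brevity.

```python
import numpy as np

def momenta(masses, positions, velocities):
    """masses: (n,); positions, velocities: (n, 3) per-segment center-of-mass data."""
    m = masses[:, None]
    linear = np.sum(m * velocities, axis=0)              # linear momentum
    com = np.sum(m * positions, axis=0) / masses.sum()   # whole-body center of mass
    angular = np.sum(np.cross(positions - com, m * velocities), axis=0)
    return linear, angular

# Example: two segments moving with equal and opposite sideways velocities give
# zero net linear momentum but a nonzero spin about the center of mass.
masses = np.array([3.0, 3.0])
pos = np.array([[0.0, 1.0, 0.0], [0.0, -1.0, 0.0]])
vel = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]])
print(momenta(masses, pos, vel))
```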


ACM Transactions on Graphics | 2014

Iterative Training of Dynamic Skills Inspired by Human Coaching Techniques

Sehoon Ha; C. Karen Liu

Inspired by how humans learn dynamic motor skills through a progressive process of coaching and practice, we introduce an intuitive and interactive framework for developing dynamic controllers. The user only needs to provide a primitive initial controller and high-level, human-readable instructions, as if they were coaching a human trainee, while the character has the ability to interpret the abstract instructions, accumulate knowledge from the coach, and improve its skill iteratively. We introduce “control rigs” as an intermediate layer of control modules to facilitate the mapping between high-level instructions and low-level control variables. Control rigs also utilize the human coach's knowledge to reduce the search space for control optimization. In addition, we develop a new sampling-based optimization method, Covariance Matrix Adaptation with Classification (CMA-C), to efficiently compute control rig parameters. Based on the observation that humans “learn from failure”, CMA-C utilizes failed simulation trials to approximate an infeasible region in the space of control rig parameters, resulting in faster convergence of the CMA optimization. We demonstrate the design process of complex dynamic controllers using our framework, including precision jumps, turnaround jumps, monkey vaults, drop-and-rolls, and wall-backflips.
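
The sketch below illustrates the "learn from failure" idea in a much simplified form: a basic evolution-strategy-style sampling loop (not the paper's CMA-C algorithm) records failed trials and skips new samples whose nearest past trial was a failure. The toy simulate function, its feasible region, and all parameters are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta):
    """Toy stand-in for a physics rollout: fails outside a feasible disk."""
    if np.linalg.norm(theta - np.array([1.0, 1.0])) > 2.0:
        return None                                    # failed trial
    return float(np.sum((theta - np.array([2.0, 0.5])) ** 2))

def predicted_feasible(theta, successes, failures):
    """Skip samples whose nearest recorded trial was a failure."""
    if not failures or not successes:
        return True
    d_ok = min(np.linalg.norm(theta - s) for s in successes)
    d_bad = min(np.linalg.norm(theta - f) for f in failures)
    return d_ok <= d_bad

mean, sigma = np.zeros(2), 1.0
successes, failures = [], []
for generation in range(30):
    samples, scores = [], []
    while len(samples) < 8:
        theta = mean + sigma * rng.standard_normal(2)
        if not predicted_feasible(theta, successes, failures):
            continue                                   # save a wasted rollout
        cost = simulate(theta)
        if cost is None:
            failures.append(theta)                     # remember the failure
            continue
        successes.append(theta)
        samples.append(theta)
        scores.append(cost)
    elite = np.array(samples)[np.argsort(scores)[:4]]
    mean, sigma = elite.mean(axis=0), 0.9 * sigma      # shrink the search
print("best parameters found:", mean)
```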


Nature | 2005

Dance reveals symmetry especially in young men

William Michael Brown; Lee Cronk; Keith Grochow; Amy Jacobson; C. Karen Liu; Zoran Popović; Robert Trivers

Dance is believed to be important in the courtship of a variety of species, including humans, but nothing is known about what dance reveals about the underlying phenotypic—or genotypic—quality of the dancer. One measure of quality in evolutionary studies is the degree of bodily symmetry (fluctuating asymmetry, FA), because it measures developmental stability. Does dance quality reveal FA to the observer and is the effect stronger for male dancers than female? To answer these questions, we chose a population that has been measured twice for FA since 1996 (ref. 9) in a society (Jamaican) in which dancing is important in the lives of both sexes. Motion-capture cameras created controlled stimuli (in the form of videos) that isolated dance movements from all other aspects of visual appearance (including FA), and the same population evaluated these videos for dancing ability. Here we report that there are strong positive associations between symmetry and dancing ability, and these associations were stronger in men than in women. In addition, women rate dances by symmetrical men relatively more positively than do men, and more-symmetrical men value symmetry in women dancers more than do less-symmetrical men. In summary, dance in Jamaica seems to show evidence of sexual selection and to reveal important information about the dancer.


ACM Transactions on Graphics | 2009

Optimization-based interactive motion synthesis

Sumit Jain; Yuting Ye; C. Karen Liu

We present a physics-based approach to synthesizing motion of a virtual character in a dynamically varying environment. Our approach views the motion of a responsive virtual character as a sequence of solutions to the constrained optimization problem formulated at every time step. This framework allows the programmer to specify active control strategies using intuitive kinematic goals, significantly reducing the engineering effort entailed in active body control. Our optimization framework can incorporate changes in the character's surroundings through a synthetic visual sensory system and create significantly different motions in response to varying environmental stimuli. Our results show that our approach is general enough to encompass a wide variety of highly interactive motions.
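
A minimal sketch of the per-frame viewpoint, using an illustrative stand-in rather than the paper's solver: at every time step the desired acceleration from a kinematic goal is projected onto an equality constraint (here, one held coordinate), and the result is integrated forward.

```python
import numpy as np

def step_accel(a_desired, J):
    """Project the desired acceleration onto the constraint J @ a = 0."""
    correction = J.T @ np.linalg.solve(J @ J.T, J @ a_desired)
    return a_desired - correction

# Three generalized coordinates; the constraint pins the third one (e.g. a
# supporting contact), while a PD-style goal pulls the state toward a target.
q, qd = np.zeros(3), np.zeros(3)
q_goal = np.array([0.5, -0.2, 0.0])
J = np.array([[0.0, 0.0, 1.0]])            # constraint Jacobian (held coordinate)
dt, kp, kd = 0.01, 50.0, 10.0

for _ in range(200):                       # a sequence of per-frame solves
    a_des = kp * (q_goal - q) - kd * qd    # intuitive kinematic goal
    a = step_accel(a_des, J)
    qd += dt * a
    q += dt * qd
print("final configuration:", q)
```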


Symposium on Computer Animation | 2004

Momentum-based parameterization of dynamic character motion

Yeuhi Abe; C. Karen Liu; Zoran Popović

This paper presents a system for rapid editing of highly dynamic motion capture data. At the heart of this system is an optimization algorithm that can transform the captured motion so that it satisfies high-level user constraints while enforcing that the linear and angular momentum of the motion remain physically plausible. Unlike most previous approaches to motion editing, our algorithm does not require pose specification or model reduction, and the user need only specify high-level changes to the input motion. To preserve the dynamic behavior of the input motion, we introduce a spline-based parameterization that matches the linear and angular momentum patterns of the motion capture data. Because our algorithm enables rapid convergence by providing a good initial state for the optimization, the user can efficiently generate a large number of realistic motions from a single input motion. The algorithm can then populate the dynamic space of motions by simple interpolation, effectively parameterizing the space of realistic motions. We show how this framework can be used to produce an effective interface for rapid creation of dynamic animations, as well as to drive the dynamic motion of a character in real-time.
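
As a small illustration of the representational idea (simplified relative to the paper), the sketch below summarizes a captured momentum curve with a low-dimensional spline; an edited motion could then be constrained to keep its momentum close to this compact pattern. The curve and knot count are made up for the example.

```python
import numpy as np
from scipy.interpolate import CubicSpline

t = np.linspace(0.0, 1.0, 200)
# Stand-in for a vertical linear-momentum curve from motion capture,
# e.g. the dip-and-peak pattern of a jump (made up for this example).
p_captured = np.sin(2 * np.pi * t) * np.exp(-3.0 * t)

# Parameterize the curve with a handful of knot values instead of 200 samples.
knots = np.linspace(0.0, 1.0, 8)
momentum_spline = CubicSpline(knots, np.interp(knots, t, p_captured))

error = np.max(np.abs(momentum_spline(t) - p_captured))
print(f"8-knot spline matches the captured momentum to within {error:.3f}")
# An edited motion could then be required to keep its momentum close to
# momentum_spline(t) while the user's high-level changes vary other terms.
```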


International Conference on Computer Graphics and Interactive Techniques | 2010

Optimal feedback control for character animation using an abstract model

Yuting Ye; C. Karen Liu

Real-time adaptation of a motion capture sequence to virtual environments with physical perturbations requires robust control strategies. This paper describes an optimal feedback controller for motion tracking that allows for on-the-fly re-planning of long-term goals and adjustments in the final completion time. We first solve an offline optimal trajectory problem for an abstract dynamic model that captures the essential relation between contact forces and momenta. A feedback control policy is then derived and used to simulate the abstract model online. Simulation results become dynamic constraints for online reconstruction of full-body motion from a reference. We applied our controller to a wide range of motions including walking, long stepping, and a squat exercise. Results show that our controllers are robust to large perturbations and changes in the environment.
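
The sketch below shows the general offline-then-online recipe with a generic discrete LQR on a one-dimensional point-mass stand-in for the abstract model (this is not the paper's momentum model): a feedback gain is computed offline from a Riccati equation and then used online to recover from a perturbation.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

dt, mass = 0.01, 60.0
A = np.array([[1.0, dt], [0.0, 1.0]])        # position/velocity dynamics
B = np.array([[0.0], [dt / mass]])           # control input = horizontal force
Q = np.diag([100.0, 1.0])                    # state-tracking cost
R = np.array([[1e-4]])                       # control-effort cost

# Offline: solve the discrete Riccati equation and build the feedback gain.
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Online: track a reference state and recover from an impulsive push.
x = np.array([0.0, 0.0])
x_ref = np.array([0.3, 0.0])
for step in range(300):
    if step == 100:
        x[1] += 0.5                          # external perturbation
    u = -K @ (x - x_ref)                     # feedback policy
    x = A @ x + B @ u
print("final state:", x)
```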


International Conference on Computer Graphics and Interactive Techniques | 2009

Dextrous manipulation from a grasping pose

C. Karen Liu

This paper introduces an optimization-based approach to synthesizing hand manipulations from a starting grasping pose. We describe an automatic method that takes as input an initial grasping pose and partial object trajectory, and produces as output physically plausible hand animation that effects the desired manipulation. In response to different dynamic situations during manipulation, our algorithm can generate a range of possible hand manipulations including changes in joint configurations, changes in contact points, and changes in the grasping force. Formulating hand manipulation as an optimization problem is key to our algorithm's ability to generate a large repertoire of hand motions from limited user input. We introduce an objective function that accentuates detailed hand motion and contact adjustments. Furthermore, we describe an optimization method that solves for hand motion and contacts efficiently while taking into account long-term planning of contact forces. Our algorithm does not require any tuning of parameters, nor does it require any prescribed hand motion sequences.
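
One generic ingredient of physically plausible contact, assumed here purely for illustration rather than taken from the paper, is that each contact force must stay inside its Coulomb friction cone; otherwise the grasp would slip.

```python
import numpy as np

def inside_friction_cone(force, normal, mu):
    """force, normal: 3-vectors (normal unit length); mu: friction coefficient."""
    normal_component = float(force @ normal)
    tangential = force - normal_component * normal
    return normal_component >= 0.0 and np.linalg.norm(tangential) <= mu * normal_component

n = np.array([0.0, 0.0, 1.0])
print(inside_friction_cone(np.array([0.2, 0.0, 1.0]), n, mu=0.5))  # True: force holds
print(inside_friction_cone(np.array([0.8, 0.0, 1.0]), n, mu=0.5))  # False: would slip
```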


Symposium on Computer Animation | 2006

Composition of complex optimal multi-character motions

C. Karen Liu; Aaron Hertzmann; Zoran Popović

This paper presents a physics-based method for creating complex multi-character motions from short single-character sequences. We represent multi-character motion synthesis as a spacetime optimization problem where constraints represent the desired character interactions. We extend standard spacetime optimization with a novel timewarp parameterization in order to jointly optimize the motion and the interaction constraints. In addition, we present an optimization algorithm based on block coordinate descent and continuations that can be used to solve the large problems that multiple characters usually generate. This framework allows us to synthesize multi-character motion drastically different from the input motion. Consequently, a small set of input motions is sufficient to express a wide variety of multi-character motions.
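
A toy sketch of block coordinate descent in this setting (illustrative only, not the paper's solver): alternately optimize each character's free variables while holding the other character fixed, so that each subproblem stays small even though the characters are coupled by an interaction term.

```python
import numpy as np
from scipy.optimize import minimize

T = 20
target_a, target_b = 1.0, -1.0     # individual end goals for characters A and B
meet = 10                          # frame at which the two characters interact

def joint_cost(xa, xb):
    smooth = np.sum(np.diff(xa) ** 2) + np.sum(np.diff(xb) ** 2)
    goals = (xa[-1] - target_a) ** 2 + (xb[-1] - target_b) ** 2
    interaction = 10.0 * (xa[meet] - xb[meet]) ** 2   # "touch" at the meeting frame
    return smooth + goals + interaction

xa, xb = np.zeros(T), np.zeros(T)
for sweep in range(10):            # alternate over the two blocks of variables
    xa = minimize(lambda z: joint_cost(z, xb), xa, method="L-BFGS-B").x
    xb = minimize(lambda z: joint_cost(xa, z), xb, method="L-BFGS-B").x
print("positions at the interaction frame:", xa[meet], xb[meet])
```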


International Conference on Computer Graphics and Interactive Techniques | 2009

Performance-based control interface for character animation

Satoru Ishigaki; Timothy White; Victor B. Zordan; C. Karen Liu

Most game interfaces today are largely symbolic, translating simplified input such as keystrokes into the choreography of full-body character movement. In this paper, we describe a system that directly uses human motion performance to provide a radically different, and much more expressive, interface for controlling virtual characters. Our system takes a data feed from a motion capture system as input, and in real-time translates the performance into corresponding actions in a virtual world. The difficulty with such an approach arises from the need to manage the discrepancy between the real and virtual world, leading to two important subproblems: 1) recognizing the user's intention, and 2) simulating the appropriate action based on the intention and virtual context. We solve this issue by first enabling the virtual world's designer to specify possible activities in terms of prominent features of the world along with associated motion clips depicting interactions. We then integrate the prerecorded motions with online performance and dynamic simulation to synthesize seamless interaction of the virtual character in a simulated virtual world. The result is a flexible interface through which a user can make freeform control choices while the resulting character motion maintains both physical realism and the user's personal style.

Collaboration


Dive into C. Karen Liu's collaborations.

Top Co-Authors

Greg Turk, Georgia Institute of Technology
Jie Tan, Georgia Institute of Technology
Wenhao Yu, Georgia Institute of Technology
Sumit Jain, Georgia Institute of Technology
Yuting Ye, Georgia Institute of Technology
Michael X. Grey, Georgia Institute of Technology
Aaron D. Ames, California Institute of Technology
Yunfei Bai, Georgia Institute of Technology
Zoran Popović, University of Washington