
Publication


Featured research published by Nick Taubert.


Scientific Reports | 2015

Prejudiced interactions: Implicit racial bias reduces predictive simulation during joint action with an out-group avatar

Lucia Maria Sacheli; Andrea Christensen; Martin A. Giese; Nick Taubert; Enea Francesco Pavone; Salvatore Maria Aglioti; Matteo Candidi

During social interactions people automatically apply stereotypes in order to rapidly categorize others. Racial differences are among the most powerful cues that drive these categorizations and modulate our emotional and cognitive reactivity to others. We investigated whether implicit racial bias may also shape hand kinematics during the execution of realistic joint actions with virtual in- and out-group partners. Caucasian participants were required to perform synchronous imitative or complementary reach-to-grasp movements with avatars that had different skin color (white and black) but showed identical action kinematics. Results demonstrate that stronger visuo-motor interference (indexed here as hand kinematics differences between complementary and imitative actions) emerged: i) when participants were required to predict the partner's action goal in order to adapt their own movements online; ii) during interactions with the in-group partner, indicating that the partner's racial membership modulates interactive behaviors. Importantly, the in-group/out-group effect positively correlated with the implicit racial bias of each participant. Thus, visuo-motor interference during joint action, likely reflecting predictive embodied simulation of the partner's movements, is affected by cultural inter-individual differences.
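
A rough sketch of the correlation analysis described above: compute a per-participant interference index as the kinematic difference between complementary and imitative trials, then correlate it with an implicit-bias score. The simulated data, the choice of peak grip aperture as the kinematic variable, and all names below are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch: per-participant visuo-motor interference index,
# correlated with an implicit-bias score. Data are simulated placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_participants = 20

# Simulated per-trial peak grip apertures (mm) for the two action types.
complementary = rng.normal(80.0, 5.0, size=(n_participants, 40))
imitative = rng.normal(78.0, 5.0, size=(n_participants, 40))

# Interference index: mean kinematic difference between complementary and
# imitative trials (the abstract indexes interference this way).
interference = complementary.mean(axis=1) - imitative.mean(axis=1)

# Simulated implicit-bias scores (e.g., IAT D-scores, assumed here).
bias = rng.normal(0.3, 0.4, size=n_participants)

r, p = pearsonr(bias, interference)
print(f"r = {r:.2f}, p = {p:.3f}")
```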


International Conference on Artificial Neural Networks | 2014

Coupling Gaussian Process Dynamical Models with Product-of-Experts Kernels

Dmytro Velychko; Dominik Endres; Nick Taubert; Martin A. Giese

We describe a new probabilistic model for learning of coupled dynamical systems in latent state spaces. The coupling is achieved by combining predictions from several Gaussian process dynamical models in a product-of-experts fashion. Our approach facilitates modulation of coupling strengths without the need for computationally expensive re-learning of the dynamical models. We demonstrate the effectiveness of the new coupling model on synthetic toy examples and on high-dimensional human walking motion capture data.
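
The coupling rule itself is compact enough to sketch: each expert emits a Gaussian prediction, and a weighted product of these Gaussians fuses them, with the weights acting as coupling strengths that can be changed at prediction time without re-learning. The sketch below stubs out the GPDM experts with fixed Gaussian predictions; all names and values are illustrative.

```python
# Minimal product-of-experts fusion of Gaussian predictions. Raising an
# expert N(mu_i, var_i) to the power w_i scales its precision by w_i, so
# the weighted product is again Gaussian with additive precisions.
import numpy as np

def product_of_experts(mus, variances, weights):
    """Fuse Gaussian expert predictions; returns (mean, variance)."""
    mus = np.asarray(mus, dtype=float)
    precisions = np.asarray(weights, dtype=float) / np.asarray(variances, dtype=float)
    var = 1.0 / precisions.sum()            # precisions add up
    mu = var * (precisions * mus).sum()     # precision-weighted mean
    return mu, var

# Two experts predict the next latent state; expert 0 is coupled strongly,
# expert 1 only weakly -- changing the weights needs no re-training.
mu, var = product_of_experts(mus=[0.8, 1.4], variances=[0.05, 0.20],
                             weights=[1.0, 0.3])
print(f"fused prediction: mean={mu:.3f}, variance={var:.4f}")
```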


Annual Conference on Artificial Intelligence | 2011

Shaking Hands in Latent Space

Nick Taubert; Dominik Endres; Andrea Christensen; Martin A. Giese

We present an approach for the generative modeling of human interactions with emotional style variations. We employ a hierarchical Gaussian process latent variable model (GP-LVM) to map motion capture data of handshakes into a space of low dimensionality. The dynamics of the handshakes in this low dimensional space are then learned by a standard hidden Markov model, which also encodes the emotional style variation. To assess the quality of generated and rendered handshakes, we asked human observers to rate them for realism and emotional content. We found that generated and natural handshakes are virtually indistinguishable, proving the accuracy of the learned generative model.
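
The generative pipeline can be sketched in two stages: embed the high-dimensional motion capture frames in a low-dimensional latent space, then learn the latent dynamics with an HMM and sample new trajectories from it. In this illustrative sketch, PCA stands in for the hierarchical GP-LVM purely to keep the example self-contained, the data are random placeholders, and hmmlearn's GaussianHMM plays the role of the standard hidden Markov model.

```python
# Two-stage sketch: dimensionality reduction, then latent dynamics via HMM.
import numpy as np
from sklearn.decomposition import PCA
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
frames = rng.normal(size=(500, 60))   # fake mocap: 500 frames, 60 DoF

# Stage 1: map frames to a low-dimensional latent space
# (PCA as a placeholder for the paper's hierarchical GP-LVM).
pca = PCA(n_components=3)
latent = pca.fit_transform(frames)

# Stage 2: learn latent dynamics with an HMM, whose discrete hidden states
# can also encode style variation such as emotion.
hmm = GaussianHMM(n_components=5, covariance_type="full", n_iter=50)
hmm.fit(latent)

# Generate a new latent trajectory and map it back to pose space.
latent_new, _ = hmm.sample(200)
frames_new = pca.inverse_transform(latent_new)
print(frames_new.shape)   # (200, 60)
```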


Geometric and Numerical Foundations of Movements | 2017

Modeling of Coordinated Human Body Motion by Learning of Structured Dynamic Representations

Albert Mukovskiy; Nick Taubert; Dominik Endres; Christian Vassallo; Maximilien Naveau; Olivier Stasse; Philippe Souères; Martin Giese

The modeling and online generation of human-like body motion is a central topic in computer graphics and robotics. The analysis of the coordination structure of complex body movements in humans helps to develop flexible technical algorithms for movement synthesis. This chapter summarizes work that uses learned structured representations for the synthesis of complex human-like body movements in real time. This work follows two different general approaches. The first one is to learn spatio-temporal movement primitives from human kinematic data, and to derive from them Dynamic Movement Primitives (DMPs), which are modeled by nonlinear dynamical systems. Such dynamical primitives are then coupled and embedded into networks that generate complex human-like behaviors online, as self-organized solutions of the underlying dynamics. The flexibility of this approach is demonstrated by synthesizing complex coordinated movements of single agents and crowds. We demonstrate that Contraction Theory provides an appropriate framework for the design of the stability properties of such complex composite systems. In addition, we demonstrate how such primitive-based movement representations can be embedded into a model-based predictive control architecture for the humanoid robot HRP-2. Using the primitive-based trajectory synthesis algorithm for fast online planning of full-body movements, we were able to realize flexibly adapting human-like multi-step sequences, which are coordinated with goal-directed reaching movements. The resulting architecture realizes fast online planning of multi-step sequences, at the same time ensuring dynamic balance during walking and the feasibility of the movements for the robot. The computation of such dynamically feasible multi-step sequences using state-of-the-art optimal control approaches would take hours, while our method works in real-time.

The second presented framework for the online synthesis of complex body motion is based on the learning of hierarchical probabilistic generative models, where we exploit Bayesian machine learning approaches for nonlinear dimensionality reduction and the modeling of dynamical systems. Combining Gaussian Process Latent Variable Models (GPLVMs) and Gaussian Process Dynamical Models (GPDMs), we learned models for the interactive movements of two humans. In order to build an online reactive agent with controlled emotional style, we replaced the state variables of one actor by measurements obtained by real-time motion capture from a user and determined the most probable state of the interaction partner using Bayesian model inversion. The proposed method results in highly believable human-like reactive body motion.
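
To make the first approach concrete, here is a minimal one-dimensional Dynamic Movement Primitive: a spring-damper system pulled toward a goal, driven by a learned forcing term that a canonical system phases out over time. The gains, the basis-function layout, and the zero forcing weights are illustrative choices, not values from the chapter.

```python
# Minimal 1-D Dynamic Movement Primitive (DMP) rollout:
#   tau * v' = K * (g - x) - D * v + (g - x0) * f(s)
#   tau * x' = v
#   tau * s' = -alpha_s * s        (canonical system, s: 1 -> 0)
import numpy as np

def rollout_dmp(x0, g, weights, tau=1.0, dt=0.01, steps=200,
                K=100.0, D=20.0, alpha_s=4.0):
    centers = np.exp(-alpha_s * np.linspace(0.0, 1.0, len(weights)))
    widths = 1.0 / np.diff(centers, append=centers[-1] * 0.5) ** 2
    x, v, s = x0, 0.0, 1.0
    traj = []
    for _ in range(steps):
        psi = np.exp(-widths * (s - centers) ** 2)      # RBF activations
        f = (psi @ weights) * s / (psi.sum() + 1e-12)   # learned forcing term
        v += dt / tau * (K * (g - x) - D * v + (g - x0) * f)
        x += dt / tau * v
        s += dt / tau * (-alpha_s * s)
        traj.append(x)
    return np.array(traj)

# With zero forcing weights the primitive reduces to a critically damped
# point attractor that converges to the goal g.
traj = rollout_dmp(x0=0.0, g=1.0, weights=np.zeros(10))
print(f"final position: {traj[-1]:.4f}")   # approaches 1.0
```

Learning the forcing weights from demonstrations, and coupling several such primitives into networks, is what the chapter builds on top of this basic unit.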


Journal of Vision | 2015

Dependence of the perception of emotional body movements on concurrent social motor behavior

Nick Taubert; Junru Li; Dominik Endres; Martin Giese

Embodiment theories hypothesize that the perception of emotions from body movements involves an activation of brain structures that are involved in motor execution during social interaction [1,2]. This predicts that, for identical visual stimulation, bodily emotions should be perceived as more expressive when the observers are involved in social motor behavior. We tested this hypothesis, exploiting advanced VR technology, requiring participants to judge the emotions of an avatar that reacted to their own motor behavior.

METHODS: Based on motion capture data from four human actors, we learned generative models for the body motion during emotional pair interactions, exploiting a framework based on Gaussian Process Latent Variable Models [3]. Using a head-mounted display, participants observed ten angry and ten fearful emotional reactions, each with eight repetitions, of a human-sized virtual agent, who turned towards the subject after being tapped on the shoulder. As a control condition, participants watched exactly the same stimuli without making any movements. The emotion information of the avatars was controlled using a motion morphing method based on the GP model, with five emotional strength levels that were carefully adjusted in a pre-experiment for each emotion and actor. Participants had to rate the emotional expressiveness of the stimuli on a Likert scale.

RESULTS: Initial data indicate that the emotional expressiveness of the stimuli was rated higher when the participants initiated the emotional reaction of the avatar in the VR setup by their own behavior, as compared to pure observation (F > 6.2 and p < 0.03). This confirms an involvement of representations for the execution of social interactive behaviors in the processing of emotional body expressions, consistent with embodiment theories. [1] Wolpert et al., Science 269, 1995. [2] Wicker et al., Neuropsychologia 41, 2003. [3] Lawrence, ND, Advances in Neural Information Processing Systems, 2004. Meeting abstract presented at VSS 2015.
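
The stimulus manipulation can be illustrated with a simplified morphing scheme in which a scalar strength level blends a neutral trajectory with a full-emotion trajectory. Linear interpolation stands in here for the GP-based motion morphing model of the abstract, and the data are random placeholders.

```python
# Illustrative morph between a neutral and an emotional motion trajectory.
import numpy as np

rng = np.random.default_rng(0)
neutral = rng.normal(size=(300, 60))                  # 300 frames x 60 DoF
full_emotion = neutral + rng.normal(size=(300, 60))   # exaggerated variant

def morph(neutral, emotional, level):
    """Blend two trajectories; level in [0, 1] sets emotional strength."""
    return (1.0 - level) * neutral + level * emotional

# Five strength levels, mirroring the experiment's stimulus design.
stimuli = [morph(neutral, full_emotion, w) for w in np.linspace(0.0, 1.0, 5)]
print(len(stimuli), stimuli[0].shape)   # 5 levels, each (300, 60)
```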


ACM Symposium on Applied Perception | 2012

Online simulation of emotional interactive behaviors with hierarchical Gaussian process dynamical models

Nick Taubert; Andrea Christensen; Dominik Endres; Martin A. Giese


ACM Symposium on Applied Perception | 2013

A virtual reality setup for controllable, stylized real-time interactions between humans and avatars with sparse Gaussian process dynamical models

Nick Taubert; Martin Löffler; Nicolas Ludolph; Andrea Christensen; Dominik Endres; Martin A. Giese


KI'11: Proceedings of the 34th Annual German Conference on Advances in Artificial Intelligence | 2011

Shaking hands in latent space: modeling emotional interactions with Gaussian process latent variable models

Nick Taubert; Dominik Endres; Andrea Christensen; Martin A. Giese


Archive | 2015

Perception of emotional body expressions depends on concurrent involvement in social interaction

Nick Taubert; Junru Li; Dominik Endres; Martin Giese


5th Joint Action Meeting (JAM-V) | 2013

Racial bias modulates joint-actions with ingroup vs outgroup avatars

Salvatore Maria Aglioti; Lucia Maria Sacheli; Andrea Christensen; Martin A. Giese; Nick Taubert; Enea Francesco Pavone

Collaboration


Dive into Nick Taubert's collaborations.

Top Co-Authors

Lucia Maria Sacheli

University of Milano-Bicocca
