
Publication


Featured research published by Christian Mandery.


International Conference on Advanced Robotics | 2015

The KIT whole-body human motion database

Christian Mandery; Ömer Terlemez; Martin Do; Nikolaus Vahrenkamp; Tamim Asfour

We present a large-scale whole-body human motion database consisting of captured raw motion data as well as the corresponding post-processed motions. This database serves as a key element for a wide variety of research questions related, e.g., to human motion analysis, imitation learning, action recognition, and motion generation in robotics. In contrast to previous approaches, the motion data in our database considers the motions of the observed human subject as well as the objects with which the subject is interacting. The information about human-object relations is crucial for the proper understanding of human actions and their goal-directed reproduction on a robot. To facilitate the creation and processing of human motion data, we propose procedures and techniques for motion capture, for the labeling and organization of the motion capture data based on a Motion Description Tree, and for the normalization of human motion to a unified representation based on a reference model of the human body. We provide software tools and interfaces to the database that allow access and efficient search using the proposed motion representation.
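To illustrate how a tag-based organization of recordings like the Motion Description Tree can be queried, the sketch below builds a small hierarchy of motion tags and retrieves recordings labeled with any tag in a subtree. All class names, tag labels, and recording identifiers are hypothetical and do not reflect the database's actual API.

```python
# Hypothetical sketch of a Motion Description Tree: recordings are labeled
# with tags organized hierarchically, and a query for a tag also matches
# recordings labeled with any of its descendant tags.

class TagNode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def subtree_names(self):
        """Return the names of this tag and all of its descendants."""
        names = {self.name}
        for child in self.children:
            names |= child.subtree_names()
        return names

    def find(self, name):
        """Locate a tag node by name anywhere in the subtree."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit is not None:
                return hit
        return None


def search(recordings, tree, tag):
    """Return all recordings labeled with `tag` or any of its sub-tags."""
    node = tree.find(tag)
    if node is None:
        return []
    wanted = node.subtree_names()
    return [rec for rec, tags in recordings.items() if tags & wanted]


if __name__ == "__main__":
    # Toy tag hierarchy (invented labels, not the database's real ones).
    tree = TagNode("motion", [
        TagNode("locomotion", [TagNode("walk"), TagNode("run")]),
        TagNode("manipulation", [TagNode("grasp"), TagNode("carry")]),
    ])
    recordings = {
        "subject01_take03": {"walk"},
        "subject02_take01": {"run", "carry"},
        "subject05_take02": {"grasp"},
    }
    # Matches both recordings that involve walking or running.
    print(search(recordings, tree, "locomotion"))
```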


IEEE-RAS International Conference on Humanoid Robots | 2014

Master Motor Map (MMM) — Framework and toolkit for capturing, representing, and reproducing human motion on humanoid robots

Ömer Terlemez; Stefan Ulbrich; Christian Mandery; Martin Do; Nikolaus Vahrenkamp; Tamim Asfour

We present an extended version of our work on the design and implementation of a reference model of the human body, the Master Motor Map (MMM), which serves as a unifying framework for capturing human motions, representing them in standard data structures and formats, and reproducing them on humanoid robots. The MMM combines a comprehensive kinematics and dynamics model of the human body with 104 DoF, including hands and feet, with procedures and tools for the unified capturing of human motions. We present online motion converters for the mapping of human and object motions to the MMM model, taking into account subject-specific anthropometric data, as well as for the mapping of MMM motion to a target robot kinematics. Experimental evaluation of the approach on VICON motion recordings demonstrates the benefits of the MMM as an important step towards standardized human motion representation and mapping to humanoid robots.
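A minimal sketch of the last step described above, mapping reference-model joint angles onto a target robot, might look as follows. The joint names, correspondence table, and limit values are invented for illustration and do not reflect the actual MMM converters.

```python
# Hypothetical sketch: retarget joint angles from a human reference model to
# a robot with a different kinematic structure by (1) renaming joints via a
# correspondence table and (2) clamping to the robot's joint limits.

# Mapping from reference-model joint names to robot joint names (invented).
JOINT_MAP = {
    "RightShoulderPitch": "arm_r_joint1",
    "RightElbow": "arm_r_joint4",
    "RightHipPitch": "leg_r_joint2",
}

# Robot joint limits in radians (invented values).
ROBOT_LIMITS = {
    "arm_r_joint1": (-2.0, 2.0),
    "arm_r_joint4": (0.0, 2.4),
    "leg_r_joint2": (-1.6, 1.6),
}


def retarget(mmm_frame):
    """Map one frame of reference-model joint angles to robot joint angles."""
    robot_frame = {}
    for mmm_joint, angle in mmm_frame.items():
        robot_joint = JOINT_MAP.get(mmm_joint)
        if robot_joint is None:
            continue  # the robot has no counterpart for this joint
        lo, hi = ROBOT_LIMITS[robot_joint]
        robot_frame[robot_joint] = min(max(angle, lo), hi)
    return robot_frame


if __name__ == "__main__":
    frame = {"RightShoulderPitch": 0.5, "RightElbow": 2.9, "RightHipPitch": -0.2}
    print(retarget(frame))  # the elbow angle is clamped to the robot limit of 2.4
```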


IEEE-RAS International Conference on Humanoid Robots | 2015

Analyzing whole-body pose transitions in multi-contact motions

Christian Mandery; Júlia Borràs; Mirjam Jöchner; Tamim Asfour

When executing whole-body motions, humans are able to use a large variety of support poses which not only utilize the feet, but also hands, knees and elbows to enhance stability. While there are many works analyzing the transitions involved in walking, very few analyze human motion where more complex supports occur. In this work, we analyze complex support pose transitions in human motion involving locomotion and manipulation tasks (loco-manipulation). We have applied a method for the detection of human support contacts from motion capture data to a large-scale dataset of loco-manipulation motions involving multi-contact supports, providing a semantic representation of them. Our results provide a statistical analysis of the support poses used, their transitions, and the time spent in each of them. In addition, our data partially validate our taxonomy of whole-body support poses presented in our previous work. We believe that this work extends our understanding of human motion for humanoids, with the long-term objective of developing methods for autonomous multi-contact motion planning.
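The contact detection step mentioned above can be pictured roughly as follows: a body segment counts as a support when it is close to a support surface and nearly stationary. The thresholds, segment names, and toy data below are illustrative assumptions, not the parameters or method actually used in the paper.

```python
import numpy as np

# Hypothetical sketch of support contact detection from motion capture data:
# a segment is considered "in contact" when it is close to the floor
# (z below a height threshold) and its velocity is small.

HEIGHT_THRESHOLD = 0.05   # meters above the floor (assumed value)
VELOCITY_THRESHOLD = 0.2  # meters per second (assumed value)


def detect_supports(positions, dt):
    """positions: dict mapping segment name -> (T, 3) array of positions.
    Returns, per frame, the set of segments considered to be in contact."""
    num_frames = next(iter(positions.values())).shape[0]
    contacts = [set() for _ in range(num_frames)]
    for name, traj in positions.items():
        velocity = np.gradient(traj, dt, axis=0)
        speed = np.linalg.norm(velocity, axis=1)
        in_contact = (traj[:, 2] < HEIGHT_THRESHOLD) & (speed < VELOCITY_THRESHOLD)
        for frame in np.flatnonzero(in_contact):
            contacts[frame].add(name)
    return contacts


if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 101)
    dt = t[1] - t[0]
    still = np.zeros_like(t)
    # Toy data: the left foot rests on the floor, the right foot steps down mid-way.
    left_foot = np.stack([still, still, np.full_like(t, 0.02)], axis=1)
    right_z = 0.05 + 0.05 * np.cos(2 * np.pi * t)
    right_foot = np.stack([still, still, right_z], axis=1)
    contacts = detect_supports({"LeftFoot": left_foot, "RightFoot": right_foot}, dt)
    print(contacts[0])   # {'LeftFoot'}
    print(contacts[50])  # {'LeftFoot', 'RightFoot'}
```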


IEEE Transactions on Robotics | 2016

Unifying Representations and Large-Scale Whole-Body Motion Databases for Studying Human Motion

Christian Mandery; Ömer Terlemez; Martin Do; Nikolaus Vahrenkamp; Tamim Asfour

Large-scale human motion databases are key for research questions ranging from human motion analysis and synthesis, biomechanics of human motion, data-driven learning of motion primitives, and rehabilitation robotics to the design of humanoid robots and wearable robots such as exoskeletons. In this paper, we present a large-scale database of whole-body human motion together with methods and tools that allow a unifying representation of captured human motion, efficient search in the database, and the transfer of subject-specific motions to robots with different embodiments. To this end, captured subject-specific motion is normalized with regard to the subject's height and weight by using a reference kinematics and dynamics model of the human body, the Master Motor Map (MMM). In contrast with previous approaches and human motion databases, the motion data in our database consider not only the motions of the human subject but also the position and motion of the objects with which the subject is interacting. In addition to the description of the MMM reference model, we present procedures and techniques for the systematic recording, labeling, and organization of human motion capture data, object motions, and the subject-object relations. To allow efficient search for certain motion types in the database, motion recordings are manually annotated with motion description tags organized in a tree structure. We demonstrate the transfer of human motion to humanoid robots and provide several examples of motion analysis using the database.
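In its simplest conceivable form, the height and weight normalization mentioned above could be pictured as a uniform rescaling of positions and masses toward the reference model; the sketch below shows only that simplified idea, and the reference values are assumptions rather than actual MMM parameters.

```python
import numpy as np

# Simplified, hypothetical sketch of normalizing subject-specific motion to a
# reference model: Cartesian positions are scaled by the ratio of reference
# height to subject height, segment masses by the weight ratio.

REFERENCE_HEIGHT = 1.80  # meters (assumed reference-model height)
REFERENCE_WEIGHT = 75.0  # kilograms (assumed reference-model weight)


def normalize_motion(positions, segment_masses, subject_height, subject_weight):
    """positions: (T, N, 3) array of positions recorded for one subject.
    segment_masses: dict of segment name -> mass for that subject.
    Returns positions and masses expressed at the reference model's scale."""
    height_ratio = REFERENCE_HEIGHT / subject_height
    weight_ratio = REFERENCE_WEIGHT / subject_weight
    normalized_positions = np.asarray(positions) * height_ratio
    normalized_masses = {name: m * weight_ratio for name, m in segment_masses.items()}
    return normalized_positions, normalized_masses


if __name__ == "__main__":
    positions = np.random.rand(10, 5, 3) * 1.65           # a short toy trajectory
    masses = {"torso": 30.0, "left_thigh": 7.5}            # invented segment masses
    pos_n, mass_n = normalize_motion(positions, masses,
                                     subject_height=1.65, subject_weight=60.0)
    print(pos_n.shape, mass_n)
```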


International Conference on Multisensor Fusion and Integration for Intelligent Systems | 2016

Real-time whole-body human motion tracking based on unlabeled markers

Jannik Steinbring; Christian Mandery; Florian Pfaff; Florian Faion; Tamim Asfour; Uwe D. Hanebeck

In this paper, we present a novel online approach for tracking whole-body human motion based on unlabeled measurements of markers attached to the body. For that purpose, we employ a given kinematic model of the human body including the locations of the attached markers. Based on the model, we apply a combination of constrained sample-based Kalman filtering and multi-target tracking techniques: 1) joint constraints imposed by the human body are satisfied by introducing a parameter transformation based on periodic functions, 2) a global nearest neighbor (GNN) algorithm computes the most likely one-to-one association between markers and measurements, and 3) multiple hypothesis tracking (MHT) allows for a robust initialization that only requires an upright standing user. Evaluations clearly demonstrate that the proposed tracking provides highly accurate pose estimates in real time, even for fast and complex motions. In addition, it is robust to partial occlusion of markers and also handles unavoidable clutter measurements.
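Two of the ingredients named above can be sketched compactly: the GNN association is a minimum-cost one-to-one assignment (solvable with the Hungarian algorithm), and a periodic transformation keeps a joint angle inside its limits regardless of the unconstrained filter state. The sketch below is a simplification under those assumptions, not the paper's implementation; the joint limits and toy measurements are invented.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Simplified sketch of two ideas described above (not the paper's code):
# 1) global nearest neighbor (GNN) association between predicted marker
#    positions and unlabeled measurements as a min-cost assignment, and
# 2) a periodic parameter transformation that keeps a joint angle inside its
#    limits for any value of the unconstrained filter state.

def gnn_associate(predicted, measured):
    """predicted: (M, 3) predicted marker positions; measured: (K, 3) points.
    Returns a list of (marker_index, measurement_index) pairs."""
    cost = np.linalg.norm(predicted[:, None, :] - measured[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)  # Hungarian algorithm
    return list(zip(rows.tolist(), cols.tolist()))


def periodic_to_joint(u, lower, upper):
    """Map an unconstrained parameter u to a joint angle in [lower, upper]
    using a periodic (sine) transformation, so no hard clamping is needed."""
    return lower + (upper - lower) * (np.sin(u) + 1.0) / 2.0


if __name__ == "__main__":
    predicted = np.array([[0.0, 0.0, 1.0], [0.2, 0.0, 1.4]])
    measured = np.array([[0.21, 0.01, 1.39], [0.01, -0.01, 1.01]])
    print(gnn_associate(predicted, measured))   # [(0, 1), (1, 0)]
    print(periodic_to_joint(5.0, -1.57, 1.57))  # always within the joint limits
```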


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2016

Using language models to generate whole-body multi-contact motions

Christian Mandery; Júlia Borràs; Mirjam Jöchner; Tamim Asfour

We present a novel approach for generating sequences of whole-body poses with multi-contacts for humanoid robots, which is inspired by techniques from natural language processing. To this end, we propose a probabilistic n-gram language model learned from observation of human locomotion tasks. Human motion data is automatically segmented according to detected contacts of the body with the environment to provide support, that is, support poses, which are further subdivided with regard to whole-body configuration. These poses are subsequently used to train a language model, whose words are the poses, and whose sentences represent sequences of poses. Then, we propose a planning algorithm that, given the constraints imposed by a task, finds the sequence of transitions with the highest probability according to our language model. We have applied our approach to 140 motion capture recordings of locomotion tasks that involve using one or both hands for support. The evaluation demonstrates that our approach is able to generate complex sets of pose transitions, and shows promising results regarding its application to more complex tasks.
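To make the idea of a pose-transition language model concrete, the sketch below trains a bigram model on sequences of support-pose labels and greedily extends a given start pose with the most probable successors. The pose labels are invented, and greedy decoding is a simplification of the task-constrained planning algorithm described in the abstract.

```python
from collections import Counter, defaultdict

# Hypothetical sketch of an n-gram (here: bigram) model over support poses.
# Training "sentences" are sequences of pose labels; generation greedily
# picks the most probable next pose.

def train_bigram(sequences):
    """Count pose-to-pose transitions and normalize them to probabilities."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    model = {}
    for prev, nxt_counts in counts.items():
        total = sum(nxt_counts.values())
        model[prev] = {nxt: c / total for nxt, c in nxt_counts.items()}
    return model


def generate(model, start, length):
    """Greedily extend `start` with the most probable successor poses."""
    sequence = [start]
    for _ in range(length - 1):
        successors = model.get(sequence[-1])
        if not successors:
            break
        sequence.append(max(successors, key=successors.get))
    return sequence


if __name__ == "__main__":
    # Toy training data: pose labels are invented for illustration.
    training = [
        ["stand", "step_left", "stand", "hand_rail", "step_right", "stand"],
        ["stand", "hand_rail", "step_right", "hand_rail", "step_left", "stand"],
    ]
    model = train_bigram(training)
    print(generate(model, "stand", length=5))
```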


IEEE-RAS International Conference on Humanoid Robots | 2017

A framework for evaluating motion segmentation algorithms

Christian R. G. Dreher; Nicklas Kulp; Christian Mandery; Mirko Wächter; Tamim Asfour

Many algorithms for segmenting human whole-body motion have been proposed in the literature. However, the wide range of use cases, datasets, and quality measures used for their evaluation renders the comparison of algorithms challenging. In this paper, we introduce a framework that puts motion segmentation algorithms on a unified testing ground and makes them directly comparable. The testing ground features both a set of quality measures known from the literature and a novel approach tailored to the evaluation of motion segmentation algorithms, termed the Integrated Kernel approach. Datasets of motion recordings with ground-truth annotations are included as well; they are labeled in a new way that organizes the ground truth hierarchically in order to cover the different use cases that segmentation algorithms may address. The framework and datasets are publicly available and are intended as a service to the community for the comparison and evaluation of existing and new motion segmentation algorithms.
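One of the classical quality measures such a framework typically includes is boundary precision and recall with a tolerance window; the sketch below illustrates that measure. The tolerance value and toy boundaries are arbitrary assumptions, and this is explicitly not the Integrated Kernel approach introduced in the paper.

```python
# Illustrative sketch of a standard segmentation quality measure: precision
# and recall of detected segment boundaries, where a detected boundary counts
# as correct if it lies within a tolerance window of a ground-truth boundary.

def boundary_precision_recall(detected, ground_truth, tolerance=0.1):
    """detected / ground_truth: sorted lists of boundary times in seconds."""
    matched_truth = set()
    true_positives = 0
    for d in detected:
        # Find an unmatched ground-truth boundary within the tolerance window.
        for i, g in enumerate(ground_truth):
            if i not in matched_truth and abs(d - g) <= tolerance:
                matched_truth.add(i)
                true_positives += 1
                break
    precision = true_positives / len(detected) if detected else 0.0
    recall = true_positives / len(ground_truth) if ground_truth else 0.0
    return precision, recall


if __name__ == "__main__":
    ground_truth = [1.0, 2.5, 4.0]
    detected = [0.95, 2.7, 3.98, 5.0]  # one boundary is late, one is spurious
    print(boundary_precision_recall(detected, ground_truth))  # (0.5, 0.666...)
```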


International Symposium on Robotics Research (ISRR) | 2018

On the Dualities Between Grasping and Whole-Body Loco-Manipulation Tasks

Tamim Asfour; Júlia Borràs; Christian Mandery; Peter Kaiser; Eren Erdal Aksoy; Markus Grotz

Exploiting interaction with the environment is a promising and powerful way to enhance the stability and robustness of humanoid robots while executing locomotion and manipulation tasks. This paper revisits several of our works that share a common theme: applying techniques commonly used in the context of robot grasping with multi-fingered hands to whole-body poses during the execution of loco-manipulation tasks. Exploiting the fact that the kinematic and dynamic structure of a hand holding an object is very similar to that of a body balancing on multiple contacts, we show how we have defined a taxonomy of whole-body poses that provide support to the body, how we have used motion data analysis to automatically extract detected support poses and the motion transitions between them, and how we apply the concept of grasp affordances to associate whole-body affordances with an unknown scene. This paper provides an overview of these works and proposes promising research directions that we expect to yield meaningful results in humanoid robotics in the future.


Science Robotics | 2017

A whole-body support pose taxonomy for multi-contact humanoid robot motions

Júlia Borràs; Christian Mandery; Tamim Asfour

A taxonomy of whole-body support poses promotes representation, recognition, and generation of multi-contact humanoid robot motions.


IEEE-RAS International Conference on Humanoid Robots | 2016

Is hugging a robot weird? Investigating the influence of robot appearance on users' perception of hugging

Gabriele Trovato; Martin Do; Ömer Terlemez; Christian Mandery; Hiroyuki Ishii; Nadia Bianchi-Berthouze; Tamim Asfour; Atsuo Takanishi

Humanoid robots are expected to be able to communicate with humans through physical interaction, including hugging, which is a common gesture of affection. In order to achieve that, their physical embodiment has to be carefully planned, as a user-friendly design will facilitate interaction and minimise repulsion. In this paper, we investigate the effect of manipulating the visual/tactile appearance of a robot, by covering wires and metallic parts with clothes, and the auditory effect obtained by enabling or disabling the connector of the hand. The experiment consists of a hugging interaction between the participants and the humanoid robot ARMAR-IIIb. Results from 24 participants confirm the positive effect of using clothes to modify the appearance and the negative effect of noise and vibration.

Collaboration


An overview of Christian Mandery's collaborations and top co-authors.

Top Co-Authors

Tamim Asfour (Karlsruhe Institute of Technology)
Martin Do (Karlsruhe Institute of Technology)
Nikolaus Vahrenkamp (Karlsruhe Institute of Technology)
Matthias Plappert (Karlsruhe Institute of Technology)
Peter Kaiser (Karlsruhe Institute of Technology)
Ömer Terlemez (Karlsruhe Institute of Technology)
Jannik Steinbring (Karlsruhe Institute of Technology)
Markus Grotz (Karlsruhe Institute of Technology)
Mirjam Jöchner (Karlsruhe Institute of Technology)