Network


Latest external collaboration at the country level.

Hotspot


Dive into the research topics where Martin Do is active.

Publication


Featured research published by Martin Do.


International Conference on Advanced Robotics | 2015

The KIT whole-body human motion database

Christian Mandery; Ömer Terlemez; Martin Do; Nikolaus Vahrenkamp; Tamim Asfour

We present a large-scale whole-body human motion database consisting of captured raw motion data as well as the corresponding post-processed motions. This database serves as a key element for a wide variety of research questions related, e.g., to human motion analysis, imitation learning, action recognition and motion generation in robotics. In contrast to previous approaches, the motion data in our database consider the motions of the observed human subject as well as the objects with which the subject is interacting. The information about human-object relations is crucial for the proper understanding of human actions and their goal-directed reproduction on a robot. To facilitate the creation and processing of human motion data, we propose procedures and techniques for motion capture, for the labeling and organization of motion capture data based on a Motion Description Tree, and for the normalization of human motion to a unified representation based on a reference model of the human body. We provide software tools and interfaces to the database that allow access and efficient search with the proposed motion representation.
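The Motion Description Tree mentioned above organizes recordings under hierarchical labels so that querying a label returns everything tagged at that label or below it. The following minimal sketch (hypothetical class and label names, not the database's actual schema) illustrates that idea:

```python
# Hypothetical sketch of a motion-label hierarchy in the spirit of the
# Motion Description Tree: querying a node returns all motion
# recordings tagged at that node or anywhere in its subtree.
class MotionNode:
    def __init__(self, label):
        self.label = label
        self.children = []
        self.motions = []  # recording IDs tagged with this label

    def add_child(self, label):
        child = MotionNode(label)
        self.children.append(child)
        return child

    def query(self, label):
        """Collect motions from the subtree rooted at `label`."""
        node = self._find(label)
        return self._collect(node) if node else []

    def _find(self, label):
        if self.label == label:
            return self
        for c in self.children:
            found = c._find(label)
            if found:
                return found
        return None

    def _collect(self, node):
        out = list(node.motions)
        for c in node.children:
            out.extend(self._collect(c))
        return out

root = MotionNode("motion")
loco = root.add_child("locomotion")
walk = loco.add_child("walk")
run = loco.add_child("run")
walk.motions.append("subj01_walk_straight")
run.motions.append("subj02_run")
# Querying a parent label also returns motions from all child labels.
print(root.query("locomotion"))
```

Searching by a coarse label ("locomotion") thus subsumes all finer labels ("walk", "run"), which is what makes a tree of description tags convenient for database search.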


IEEE-RAS International Conference on Humanoid Robots | 2008

Imitation of human motion on a humanoid robot using non-linear optimization

Martin Do; Pedram Azad; Tamim Asfour; Rüdiger Dillmann

In this paper, we present a system for the imitation of human motion on a humanoid robot which is capable of incorporating both vision-based markerless and marker-based human motion capture techniques. Based on the so-called Master Motor Map, an interface for transferring motor knowledge between embodiments with different kinematic structures, the system is able to map human movement to a human-like movement on the humanoid while preserving the goal-directed characteristics of the movement. To attain an exact and goal-directed imitation of an observed movement, we introduce a reproduction module that uses non-linear optimization to maximize the similarity between the demonstrated human movement and the imitation by the robot. Experimental results using markerless and marker-based human motion capture data are given.
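The core of the reproduction module described above is an optimization problem: choose robot joint angles, within the robot's joint limits, that maximize similarity to the observed movement. The sketch below (not the authors' implementation; link lengths, target and limits are invented, and a toy planar two-link arm stands in for the humanoid kinematics) shows the shape of such a problem:

```python
# A minimal sketch of goal-directed imitation as non-linear
# optimization: find joint angles that bring the robot hand as close
# as possible to a demonstrated hand position, subject to joint
# limits. All numbers here are assumed toy values.
import numpy as np
from scipy.optimize import minimize

L1, L2 = 0.3, 0.25  # link lengths of a toy planar 2-link arm

def forward_kinematics(q):
    """Hand position of the planar 2-link arm for joint angles q."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def dissimilarity(q, target):
    # Maximizing similarity == minimizing squared hand-position error.
    return np.sum((forward_kinematics(q) - target) ** 2)

target = np.array([0.35, 0.25])           # observed human hand position
bounds = [(-np.pi, np.pi), (0.0, np.pi)]  # robot joint limits
res = minimize(dissimilarity, x0=[0.1, 0.5], args=(target,), bounds=bounds)
print(forward_kinematics(res.x))
```

In the actual system the cost would compare whole trajectories rather than a single pose, but the structure (a similarity objective minimized under the robot's kinematic constraints) is the same.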


IEEE-RAS International Conference on Humanoid Robots | 2010

On-line periodic movement and force-profile learning for adaptation to new surfaces

Andrej Gams; Martin Do; Ales Ude; Tamim Asfour; Rüdiger Dillmann

To control the motion of a humanoid robot along a desired trajectory in contact with a rigid object, we need to take into account forces that arise from contact with the surface of the object. In this paper we propose a new method that enables the robot to adapt its motion to different surfaces. The initial trajectories are encoded by dynamic movement primitives, which can be learned from visual feedback using a two-layered imitation system. In our approach these initial trajectories are modified using regression methods. The data for learning is provided by force feedback. In this way new trajectories are learned that ensure that the robot can move along the object while maintaining contact and applying the desired force to the object. Active compliance can be used more effectively with such trajectories. We present the results for both movement imitation and force profile learning on two different surfaces. We applied the method to the ARMAR-IIIb humanoid robot, where we use the system for learning and imitating a periodic task of wiping a kitchen table.


IEEE-RAS International Conference on Humanoid Robots | 2014

Master Motor Map (MMM) — Framework and toolkit for capturing, representing, and reproducing human motion on humanoid robots

Ömer Terlemez; Stefan Ulbrich; Christian Mandery; Martin Do; Nikolaus Vahrenkamp; Tamim Asfour

We present an extended version of our work on the design and implementation of a reference model of the human body, the Master Motor Map (MMM), which should serve as a unifying framework for capturing human motions, their representation in standard data structures and formats, as well as their reproduction on humanoid robots. The MMM combines the definition of a comprehensive kinematics and dynamics model of the human body with 104 DoF, including hands and feet, with procedures and tools for the unified capture of human motions. We present online motion converters for the mapping of human and object motions to the MMM model while taking into account subject-specific anthropometric data, as well as for the mapping of MMM motion to a target robot kinematics. Experimental evaluation of the approach performed on VICON motion recordings demonstrates the benefits of the MMM as an important step towards standardized human motion representation and mapping to humanoid robots.


International Conference on Robotics and Automation | 2010

Integrated Grasp and motion planning

Nikolaus Vahrenkamp; Martin Do; Tamim Asfour; Rüdiger Dillmann

In this work, we present an integrated planner for collision-free single and dual arm grasping motions. The proposed Grasp-RRT planner combines the three main tasks needed for grasping an object: finding a feasible grasp, solving the inverse kinematics and searching for a collision-free trajectory that brings the hand to the grasping pose. Therefore, RRT-based algorithms are used to build a tree of reachable and collision-free configurations. During RRT generation, potential grasping positions are generated and approach movements toward them are computed. The quality of reachable grasping poses is scored with an online grasp quality measurement module which is based on the computation of applied forces in order to diminish the net torque. We also present an extension to a dual arm planner which generates bimanual grasps together with corresponding dual arm grasping motions. The algorithms are evaluated with different setups in simulation and on the humanoid robot ARMAR-III.
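The grasp quality measure mentioned above scores a grasp by how well the applied contact forces cancel the net torque on the object. The following rough sketch (an assumed formulation for illustration, not the paper's exact measure) maps a smaller residual torque about the object's center of mass to a higher score:

```python
# A rough sketch of torque-based grasp scoring: sum the torques that
# the contact forces exert about the object's center of mass, and map
# a smaller residual torque to a higher quality in (0, 1].
import numpy as np

def grasp_score(contact_points, forces, center):
    """Return a grasp quality in (0, 1]; 1 means zero net torque."""
    torque = np.zeros(3)
    for p, f in zip(contact_points, forces):
        torque += np.cross(np.asarray(p, float) - center, f)
    return 1.0 / (1.0 + np.linalg.norm(torque))

center = np.zeros(3)
# Two antipodal contacts squeezing the object: both force lines pass
# through the center, so there is no net torque.
good = grasp_score([[0.05, 0, 0], [-0.05, 0, 0]],
                   [[-1.0, 0, 0], [1.0, 0, 0]], center)
# An off-axis force produces a residual torque and scores lower.
bad = grasp_score([[0.05, 0, 0], [-0.05, 0, 0]],
                  [[0, 1.0, 0], [1.0, 0, 0]], center)
print(good, bad)
```

Scoring candidate grasps this way during tree growth lets the planner prefer RRT branches whose reachable grasping poses are also physically stable.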


IEEE-RAS International Conference on Humanoid Robots | 2012

Encoding of periodic and their transient motions by a single dynamic movement primitive

Johannes Ernesti; Ludovic Righetti; Martin Do; Tamim Asfour; Stefan Schaal

Present formulations of periodic dynamic movement primitives (DMPs) do not encode the transient behavior required to start the rhythmic motion, although these transient movements are an important part of the rhythmic movements (i.e. when walking, there is always a first step that is very different from the subsequent ones). An ad-hoc procedure is then necessary to get the robot into the periodic motion. In this contribution we present a novel representation for rhythmic Dynamic Movement Primitives (DMPs) that encodes both the rhythmic motion and its transient behaviors. As with previously proposed DMPs, we use a dynamical system approach where an asymptotically stable limit cycle represents the periodic pattern. Transients are then represented as trajectories converging towards the limit cycle, different trajectories representing varying transients from different initial conditions. Our approach thus constitutes a generalization of previously proposed rhythmic DMPs. Experiments conducted on the humanoid robot ARMAR-III demonstrate the applicability of the approach for movement generation.
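The key property used above is an asymptotically stable limit cycle: trajectories started anywhere converge to the periodic pattern, and the approach itself plays the role of the transient. The sketch below is not the paper's DMP formulation; it uses a Hopf-style oscillator (with assumed parameters) purely to show that limit-cycle behavior numerically:

```python
# A minimal numerical sketch of the limit-cycle idea behind rhythmic
# DMPs: a Hopf-style oscillator whose trajectories converge to a
# circle of radius sqrt(mu) from any initial condition, so the
# approach to the cycle acts as the transient motion.
import math

def hopf_step(x, y, mu=1.0, omega=2.0 * math.pi, dt=0.001):
    """One Euler step of the Hopf oscillator dynamics."""
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = (mu - r2) * y + omega * x
    return x + dt * dx, y + dt * dy

# Start far from the limit cycle; the transient decays toward radius 1.
x, y = 3.0, 0.0
for _ in range(20000):
    x, y = hopf_step(x, y)
radius = math.hypot(x, y)
print(radius)
```

Different initial conditions yield different converging trajectories, which is exactly the property the paper exploits: one dynamical system encodes both the rhythmic pattern (the cycle) and a family of transients (the trajectories flowing into it).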


Künstliche Intelligenz | 2010

Advances in Robot Programming by Demonstration

Rüdiger Dillmann; Tamim Asfour; Martin Do; Rainer Jäkel; Alexander Kasper; Pedram Azad; Ales Ude; Sven R. Schmidt-Rohr; Martin Lösch

Robot Programming by Demonstration (PbD) has been dealt with in the literature as a promising way to teach robots new skills in an intuitive way. In this paper we describe our current work in the field toward the implementation of a PbD system which allows robots to learn continuously from human observation, build generalized representations of human demonstrations and apply such representations to new situations.


International Conference on Robotics and Automation | 2014

Learn to wipe: A case study of structural bootstrapping from sensorimotor experience

Martin Do; Julian Schill; Johannes Ernesti; Tamim Asfour

In this paper, we address the question of generative knowledge construction from sensorimotor experience, which is acquired by exploration. We show how actions and their effects on objects, together with perceptual representations of the objects, are used to build generative models which then can be used in internal simulation to predict the outcome of actions. Specifically, the paper presents an experiential cycle for learning associations between object properties (softness and height) and action parameters for the wiping task, and for building generative models from the sensorimotor experience resulting from wiping experiments. Object and action are linked to the observed effect to generate training data for learning a non-parametric continuous model using Support Vector Regression. In subsequent iterations, this model is grounded and used to make predictions on the expected effects for novel objects, which can be used to constrain the parameter exploration. The cycle and skills have been implemented on the humanoid platform ARMAR-IIIb. Experiments with a set of wiping objects differing in softness and height demonstrate efficient learning and adaptation of the wiping action.
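The learning step above fits a continuous model from object properties to action parameters and then queries it for novel objects. The paper uses Support Vector Regression; as a dependency-free stand-in, the sketch below fits a kernel ridge regressor (another non-parametric continuous model) on invented training data, so all numbers and the property-to-parameter mapping here are assumptions:

```python
# Illustrative sketch: learn a continuous mapping from object
# properties (softness, height) to a wiping action parameter
# (e.g. applied pressure). Kernel ridge regression stands in for the
# paper's Support Vector Regression; the training data are invented.
import numpy as np

def rbf_kernel(A, B, gamma=5.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Invented training data: (softness, height) -> pressure parameter,
# with softer objects receiving more pressure.
X = np.array([[0.1, 0.05], [0.5, 0.05], [0.9, 0.05],
              [0.1, 0.20], [0.5, 0.20], [0.9, 0.20]])
y = np.array([0.9, 0.6, 0.3, 0.8, 0.5, 0.2])

K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)  # ridge term

def predict(x_new):
    """Predicted action parameter for a novel object's properties."""
    return rbf_kernel(np.atleast_2d(x_new), X) @ alpha

# A novel object between seen ones gets an interpolated parameter.
print(float(predict([0.7, 0.12])[0]))
```

Predictions for novel objects interpolate between the explored ones, which is what allows the model to constrain the parameter exploration in later iterations of the cycle.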


IEEE-RAS International Conference on Humanoid Robots | 2009

Grasp recognition and mapping on humanoid robots

Martin Do; Javier Romero; Hedvig Kjellström; Pedram Azad; Tamim Asfour; Danica Kragic; Rüdiger Dillmann

In this paper, we present a system for vision-based grasp recognition, mapping and execution on a humanoid robot to provide an intuitive and natural communication channel between humans and humanoids. This channel enables a human user to teach a robot how to grasp an object. The system comprises three components: a human upper-body motion capture system, which provides the approaching direction towards an object; a hand pose estimation and grasp recognition system, which provides the grasp type performed by the human; and a grasp mapping and execution system for grasp reproduction on a humanoid robot with five-fingered hands. All three components are real-time and markerless. Once an object is reached, the hand posture is estimated, including hand orientation and grasp type. For the execution on a robot, hand posture and approach movement are mapped and optimized according to the kinematic limitations of the robot. Experiments are performed on the humanoid robot ARMAR-IIIb.


IEEE Transactions on Robotics | 2016

Unifying Representations and Large-Scale Whole-Body Motion Databases for Studying Human Motion

Christian Mandery; Ömer Terlemez; Martin Do; Nikolaus Vahrenkamp; Tamim Asfour

Large-scale human motion databases are key for research questions ranging from human motion analysis and synthesis, biomechanics of human motion, data-driven learning of motion primitives, and rehabilitation robotics to the design of humanoid robots and wearable robots such as exoskeletons. In this paper we present a large-scale database of whole-body human motion with methods and tools which allow a unifying representation of captured human motion, efficient search in the database, and the transfer of subject-specific motions to robots with different embodiments. To this end, captured subject-specific motion is normalized regarding the subject's height and weight by using a reference kinematics and dynamics model of the human body, the Master Motor Map (MMM). In contrast with previous approaches and human motion databases, the motion data in our database consider not only the motions of the human subject but also the position and motion of objects with which the subject is interacting. In addition to the description of the MMM reference model, we present procedures and techniques for the systematic recording, labeling and organization of human motion capture data, object motions, and subject-object relations. To allow efficient search for certain motion types in the database, motion recordings are manually annotated with motion description tags organized in a tree structure. We demonstrate the transfer of human motion to humanoid robots and provide several examples of motion analysis using the database.

Collaboration


Dive into Martin Do's collaboration.

Top Co-Authors

- Tamim Asfour (Karlsruhe Institute of Technology)
- Rüdiger Dillmann (Center for Information Technology)
- Nikolaus Vahrenkamp (Karlsruhe Institute of Technology)
- Christian Mandery (Karlsruhe Institute of Technology)
- Ömer Terlemez (Karlsruhe Institute of Technology)
- Ales Ude (Karlsruhe Institute of Technology)
- Pedram Azad (Karlsruhe Institute of Technology)
- Kai Welke (Karlsruhe Institute of Technology)