Publication


Featured research published by Daniel Leidner.


International Conference on Robotics and Automation | 2012

Power grasp planning for anthropomorphic robot hands

Maximo A. Roa; Max J. Argus; Daniel Leidner; Christoph Borst; Gerd Hirzinger

This paper presents an approach for computing power grasps for hands with a kinematic structure similar to the human hand, which allows the implementation of strategies inspired by human grasping actions. The proposed method first samples the object surface to look for the best spots for creating an opposing grasp with two or three fingers, and then aligns the other fingers to match the local curvature of the object surface. Different grasp strategies are considered, depending on the relative size of the object with respect to the hand and on the location of potential obstacles in the environment. Several application examples are provided with two different hand models.
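As a rough illustration of the two-stage idea described above, the following Python sketch samples points on a stand-in object surface and scores point pairs by how well they oppose each other. The sampling routine, the scoring formula, and the 0.8 threshold are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): rank candidate contact pairs for an
# opposing "power grasp" by how well two surface points oppose each other.
import numpy as np

def sample_sphere_surface(n, radius=0.05, seed=0):
    """Sample n points (and outward normals) on a sphere as a stand-in object."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=(n, 3))
    normals = v / np.linalg.norm(v, axis=1, keepdims=True)
    return radius * normals, normals

def opposition_score(p1, n1, p2, n2):
    """Higher when the normals are anti-parallel and aligned with the connecting line."""
    d = p2 - p1
    d /= np.linalg.norm(d) + 1e-9
    return float((-n1 @ n2) * (n1 @ -d) * (n2 @ d))

points, normals = sample_sphere_surface(200)
pairs = []
for i in range(len(points)):
    for j in range(i + 1, len(points)):
        s = opposition_score(points[i], normals[i], points[j], normals[j])
        if s > 0.8:                      # keep only strongly opposing spots
            pairs.append((s, i, j))

best = max(pairs)                        # best candidate for the opposing finger pair
print("best opposition score:", round(best[0], 3))
```

The remaining fingers would then be wrapped around the object according to its local curvature, which is omitted here.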


IEEE-RAS International Conference on Humanoid Robots | 2012

Things are made for what they are: Solving manipulation tasks by using functional object classes

Daniel Leidner; Christoph Borst; Gerd Hirzinger

Solving arbitrary manipulation tasks is a key feature for humanoid service robots. However, especially when tasks involve handling complex mechanisms or using tools, a generic action description is hard to define. Different objects require different handling methods. Therefore, we try to solve manipulation tasks from the point of view of the object, rather than in the context of the robot. Action templates within the object context are introduced to resolve object-specific task constraints. As part of a centralized world representation, the action templates are integrated into the planning process. This results in an intuitive way of solving manipulation tasks. The underlying architecture as well as the mechanisms are discussed within this paper. The proposed methods are evaluated in two experiments.
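A minimal Python sketch of what such object-centric action templates could look like as a data structure follows; the class names, fields, and the door example are assumptions made for illustration, not the structures used in the paper.

```python
# Illustrative sketch (assumed structure, not the paper's implementation): objects
# carry their own action templates, so the planner queries the object rather than
# encoding object-specific handling in robot code.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class ActionTemplate:
    name: str
    symbolic_effects: List[str]                    # used by the symbolic planner
    resolve_constraints: Callable[[dict], dict]    # fills in geometric parameters

@dataclass
class FunctionalObject:
    name: str
    templates: Dict[str, ActionTemplate] = field(default_factory=dict)

    def plan_action(self, action: str, world_state: dict) -> dict:
        """Resolve object-specific constraints for the requested action."""
        return self.templates[action].resolve_constraints(world_state)

# A door "knows" how it is opened; the robot only asks for the resulting constraints.
door = FunctionalObject("door", {
    "open": ActionTemplate(
        name="open",
        symbolic_effects=["door_is_open"],
        resolve_constraints=lambda ws: {"grasp_frame": "handle",
                                        "motion": "arc_about_hinge",
                                        "hinge_pose": ws["door_hinge_pose"]},
    )
})

print(door.plan_action("open", {"door_hinge_pose": (1.0, 0.2, 0.0)}))
```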


Intelligent Robots and Systems | 2010

Exploiting structure in two-armed manipulation tasks for humanoid robots

Franziska Zacharias; Daniel Leidner; Florian Schmidt; Christoph Borst; Gerd Hirzinger

In autonomous bimanual operation of a robot, parallelized planning and execution of a task is essential. Elements of a task have different functional and spatial relationships: they may depend on each other and have to be executed in a specific order, or they may be independent and their order can be chosen freely. Consequently, individual actions can either be planned and executed in parallel or not. In a proof of concept, this paper shows that the structure of a task and its mapping onto subordinate planners can significantly influence planning speed and task execution. Independent tasks are planned using two parallel path planners, while dependent tasks are planned using one path planner for both arms. Using a simple yet expandable experimentation scenario, the resulting recommendations for parameterizing path planners are verified on a humanoid robot. For execution on the real robot, a violation of the rigid-body model used in the path planners had to be addressed.
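The dispatch decision described above can be sketched in a few lines of Python. The planner stubs and goal names are placeholders; only the structure (two parallel single-arm planners vs. one coupled two-arm planner) reflects the abstract.

```python
# Illustrative sketch (assumed interfaces): dispatch subtasks to two parallel
# single-arm planners when they are independent, or to one combined two-arm
# planner when they depend on each other.
from concurrent.futures import ThreadPoolExecutor

def plan_single_arm(arm, goal):
    return f"path for {arm} to {goal}"            # stand-in for a real path planner

def plan_both_arms(goal_left, goal_right):
    return f"coupled path to {goal_left} and {goal_right}"

def plan_task(goal_left, goal_right, independent: bool):
    if independent:
        # Independent subtasks: plan both arms in parallel.
        with ThreadPoolExecutor(max_workers=2) as pool:
            left = pool.submit(plan_single_arm, "left_arm", goal_left)
            right = pool.submit(plan_single_arm, "right_arm", goal_right)
            return left.result(), right.result()
    # Dependent subtasks: one planner considers both arms in a combined search space.
    return plan_both_arms(goal_left, goal_right)

print(plan_task("cup", "saucer", independent=True))
print(plan_task("tray_left_handle", "tray_right_handle", independent=False))
```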


Autonomous Robots | 2016

Knowledge-enabled parameterization of whole-body control strategies for compliant service robots

Daniel Leidner; Alexander Dietrich; Michael Beetz; Alin Albu-Schäffer

Compliant manipulation is one of the grand challenges for autonomous robots. Many household chores in human environments, such as cleaning the floor or wiping windows, rely on this principle. At the same time, these tasks often require whole-body motions to cover a larger workspace. The performance of the actual task is thereby dependent on a large number of parameters that have to be taken into account. To tackle this issue, we propose to utilize low-level compliant whole-body control strategies parameterized by high-level hybrid reasoning mechanisms. We categorize compliant wiping actions in order to determine relevant control parameters. According to these parameters, we set up process models for each identified wiping action and implement generalized control strategies based on human task knowledge. We evaluate our approach experimentally on three whole-body manipulation tasks, namely scrubbing a mug with a sponge, skimming a window with a window wiper, and bimanually collecting the shards of a broken mug with a broom.
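A small Python sketch of the parameterization idea follows: a high-level wiping category selects the parameters handed to a low-level compliant controller. The dictionary, the parameter names, and all numeric values are invented for illustration and are not the process models from the paper.

```python
# Illustrative sketch (invented parameter values): a high-level wiping category
# selects the stiffness and desired contact force that parameterize a low-level
# compliant whole-body controller.
WIPING_PROCESS_MODELS = {
    # category: (translational stiffness [N/m], desired normal force [N])
    "scrub":   (600.0, 15.0),    # firm contact, e.g. scrubbing a mug with a sponge
    "skim":    (200.0,  3.0),    # light contact, e.g. skimming a window wiper
    "collect": (400.0,  8.0),    # medium contact, e.g. sweeping shards with a broom
}

def parameterize_controller(action_category: str) -> dict:
    """Map symbolic task knowledge onto compliant-control parameters."""
    stiffness, force = WIPING_PROCESS_MODELS[action_category]
    return {
        "cartesian_stiffness": stiffness,
        "desired_contact_force": force,
        "use_whole_body_motion": True,   # extend the workspace via torso/base motion
    }

print(parameterize_controller("scrub"))
```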


Intelligent Robots and Systems | 2015

Classifying compliant manipulation tasks for automated planning in robotics

Daniel Leidner; Christoph Borst; Alexander Dietrich; Michael Beetz; Alin Albu-Schäffer

Many household chores and industrial manufacturing tasks require a certain compliant behavior to make deliberate physical contact with the environment. This compliant behavior can be implemented by modern robotic manipulators. However, in order to plan the task execution, a robot requires generic process models of these tasks which can be adapted to different domains and varying environmental conditions. In this work we propose a classification of compliant manipulation tasks meeting these requirements, to derive related actions for automated planning. We also present a classification for the sub-category of wiping tasks, which are most common and of great importance in service robotics. We categorize actions from an object-centric perspective to make them independent of any specific robot kinematics. The aim of the proposed taxonomy is to guide robot programmers in developing generic actions for any kind of robotic system in arbitrary domains.
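To make the object-centric classification idea concrete, here is a tiny Python sketch of a task taxonomy expressed as enumerations with a robot-independent behavior hint per class. The category names and hints are placeholders, not the taxonomy defined in the paper.

```python
# Illustrative sketch: an object-centric task taxonomy as nested enumerations.
# The category names are placeholders, not the paper's exact taxonomy.
from enum import Enum, auto

class CompliantTask(Enum):
    WIPING = auto()
    INSERTING = auto()
    LEVERING = auto()

class WipingSubtype(Enum):
    SCRUBBING = auto()    # apply force while moving along the surface
    SKIMMING = auto()     # maintain light contact, e.g. window wiping
    COLLECTING = auto()   # gather particles, e.g. sweeping with a broom

def required_behavior(subtype: WipingSubtype) -> str:
    """Derive a robot-independent, object-centric behavior hint from the class."""
    return {
        WipingSubtype.SCRUBBING: "high contact force, repeated strokes",
        WipingSubtype.SKIMMING: "low contact force, single long strokes",
        WipingSubtype.COLLECTING: "medium force, strokes converging to a goal region",
    }[subtype]

print(required_behavior(WipingSubtype.COLLECTING))
```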


Advanced Bimanual Manipulation | 2012

Observation and Execution

Christoph Borst; Franziska Zacharias; Florian Schmidt; Daniel Leidner; Maximo A. Roa; Katharina Hertkorn; Gerhard Grunwald; Pietro Falco; Ciro Natale; Emilio Maggio

Assistive robotic systems in household or industrial production environments are becoming increasingly capable of performing complex tasks that previously only humans were able to do. As robots are often equipped with two arms and hands, similar manipulations can be executed. However, the robust programming of such devices, which have a very large number of degrees of freedom (DOFs) compared with single industrial robot arms, is laborious if done joint-wise. Two major directions to overcome this problem have been proposed previously: the programming by demonstration (PbD) approach, where human arm and recently also hand motions are tracked, segmented and re-executed in an adaptive way on the robotic system, and the high-level planning approach, which tries to generate a task sequence on a logical level and attributes geometric information as necessary to generate artificial trajectories that solve the task. Here we propose to combine the best of both worlds. For the very complex motion generation of a robotic hand, a rather direct approach is taken, assigning manipulation actions from human demonstration to a human hand. For the combination of different basic manipulation actions, the task constraints are segmented from the demonstrated action and used to generate a task-oriented plan. This plan is validated against the robot's kinematic and geometric constraints, and a geometric motion planner then generates the necessary robot motions to fulfill the task execution on the system.
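The pipeline combining demonstration and planning can be sketched as a sequence of stages; all function names and the demo content below are assumptions chosen to mirror the abstract's description, not the chapter's actual software.

```python
# Illustrative pipeline sketch (assumed function names): combine programming by
# demonstration with high-level planning, then validate and plan motions.
def segment_demonstration(demo):
    """Split a recorded human demonstration into basic manipulation actions."""
    return [{"action": "grasp", "object": "bottle"},
            {"action": "pour", "object": "bottle", "target": "glass"}]

def generate_task_plan(segments):
    """Order the segmented actions on a logical level, keeping task constraints."""
    return segments                              # stand-in for a symbolic planner

def validate_against_robot(plan, robot_model):
    """Check kinematic and geometric feasibility of each step for this robot."""
    return all(step["action"] in robot_model["skills"] for step in plan)

def plan_motions(plan):
    """A geometric motion planner generates robot trajectories for each step."""
    return [f"trajectory for {step['action']}({step['object']})" for step in plan]

robot = {"skills": {"grasp", "pour"}}
plan = generate_task_plan(segment_demonstration("recorded_demo.bag"))
if validate_against_robot(plan, robot):
    print(plan_motions(plan))
```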


Human-Robot Interaction | 2017

EDAN: EMG-controlled Daily Assistant

Annette Hagengruber; Daniel Leidner; Jörn Vogel

Neuromuscular diseases, stroke, or trauma can lead to reduced neural function which severely inhibits limb functionality. If the disease is strongly advanced, people cannot manage their daily life independently and become reliant on 24-hour care. In this situation, assistive technology, like a robotic manipulator mounted on a wheelchair, can provide help and relief. However, control of such a device is usually achieved with a joystick, which requires remaining functionality in hand and finger movement. This prevents many people with tetraplegia from making efficient use of such assistive technology. An alternative to the joystick is given by brain-computer interfaces (BCIs). It has been shown that noninvasive interfaces like electroencephalography (EEG) can be used to achieve control over low-dimensional devices like power wheelchairs [3]. More complex tasks like control of assistive robotic devices have been demonstrated with invasive interfaces, like the BrainGate Neural Interface System [1]. We investigate the use of surface electromyography (EMG) as an interface for assistive robotic devices; it is a comparably cheap and easy-to-apply technology. We could show that people with tetraplegia due to severe spinal muscular atrophy (SMA) can still achieve control over a robotic manipulator (e.g. drinking from a bottle) by recording remaining muscular activity [4]. To investigate the use of EMG as an interface to assistive technology, we developed the research platform EDAN (EMG-controlled Daily Assistant). It consists of a robotic manipulator mounted on a state-of-the-art power wheelchair. We use a torque-controlled robotic arm (DLR-LWR 3), which is well suited for safe physical interaction with humans and the environment. The five-finger hand mounted on the robotic arm allows for stable grasping of a variety of everyday objects. The focus of our research is twofold. On the one hand, we investigate the use of EMG as a non-invasive interface to provide people with control over assistive systems. On the other hand, we develop methods to simplify the usage of such systems with the support of artificial intelligence. Manual control of robotic manipulators is rather slow and cumbersome, especially when controlled with a noisy interface like a BCI. Artificial intelligence can help to significantly improve the usability of such systems. A shared control approach ...
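As a rough idea of how a surface-EMG signal could drive such a manipulator, the following Python sketch maps a rectified, smoothed activation onto a Cartesian velocity command. The scaling constants, dead band, and maximum speed are invented for illustration and do not describe EDAN's actual signal processing.

```python
# Illustrative sketch (invented gains and thresholds): map a rectified, smoothed
# surface-EMG activation onto a Cartesian velocity command for the manipulator.
import numpy as np

def emg_activation(raw_window: np.ndarray) -> float:
    """Rectify and average a short window of raw EMG samples (scaled to 0..1)."""
    return float(np.clip(np.mean(np.abs(raw_window)) / 0.5e-3, 0.0, 1.0))

def to_velocity_command(activation: float, direction: np.ndarray,
                        dead_band: float = 0.1, max_speed: float = 0.05) -> np.ndarray:
    """Dead band suppresses noise; activation above it scales speed along `direction`."""
    if activation < dead_band:
        return np.zeros(3)
    scale = (activation - dead_band) / (1.0 - dead_band)
    return max_speed * scale * direction / np.linalg.norm(direction)

window = np.random.default_rng(0).normal(scale=0.4e-3, size=200)   # fake EMG samples
print(to_velocity_command(emg_activation(window), np.array([1.0, 0.0, 0.0])))
```

In a shared-control setting, such a low-dimensional command would be combined with autonomous behaviors rather than steering every degree of freedom directly.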


International Conference on Robotics and Automation | 2014

Object-Centered Hybrid Reasoning for Whole-Body Mobile Manipulation

Daniel Leidner; Alexander Dietrich; Florian Schmidt; Christoph Borst; Alin Albu-Schäffer


IEEE-RAS International Conference on Humanoid Robots | 2014

A knowledge-driven shared autonomy human-robot interface for tablet computers

Peter Birkenkampf; Daniel Leidner; Christoph Borst


Human-Robot Interaction | 2015

Command Robots from Orbit with Supervised Autonomy: An Introduction to the Meteron Supvis-Justin Experiment

Neal Y. Lii; Daniel Leidner; Andre Schiele; Peter Birkenkampf; Benedikt Pleintinger; Ralph Bayer

Collaboration


Dive into Daniel Leidner's collaborations.

Top Co-Authors

Neal Y. Lii

German Aerospace Center

Ralph Bayer

German Aerospace Center
