Publications


Featured research published by Markus Przybylski.


Simulation, Modeling, and Programming for Autonomous Robots | 2010

OpenGRASP: a toolkit for robot grasping simulation

Beatriz León; Stefan Ulbrich; Rosen Diankov; Gustavo Puche; Markus Przybylski; Antonio Morales; Tamim Asfour; Sami Moisio; Jeannette Bohg; James J. Kuffner; Rüdiger Dillmann

Simulation is essential for different robotic research fields such as mobile robotics, motion planning and grasp planning. For grasping in particular, no software simulation package provides a holistic environment that can deal with the variety of aspects associated with this problem. These aspects include the development and testing of new algorithms and the modeling of environments and robots, including actuators, sensors and contacts. In this paper, we present a new simulation toolkit for grasping and dexterous manipulation called OpenGRASP that addresses these aspects in addition to extensibility, interoperability and public availability. OpenGRASP is based on a modular architecture that supports the creation and addition of new functionality and the integration of existing and widely used technologies and standards. In addition, a dedicated editor has been created for the generation and migration of robot models. We demonstrate the current state of OpenGRASP's development and its application in a grasp evaluation environment.


Intelligent Robots and Systems | 2010

Unions of balls for shape approximation in robot grasping

Markus Przybylski; Tamim Asfour; Rüdiger Dillmann

Typical tasks of future service robots involve grasping and manipulating a large variety of objects differing in size and shape. Generating stable grasps on 3D objects is considered a hard problem, since many parameters such as hand kinematics, object geometry, material properties and forces have to be taken into account. This results in a high-dimensional space of possible grasps that cannot be searched exhaustively. We believe that the key to finding stable grasps efficiently is a special representation of the object geometry that can be easily analyzed. In this paper, we present a novel grasp planning method that evaluates local symmetry properties of objects to generate only candidate grasps that are likely to be of good quality. We achieve this by computing the medial axis, which represents a 3D object as a union of balls. We analyze the symmetry information contained in the medial axis and use a set of heuristics to generate geometrically and kinematically reasonable candidate grasps, which are then tested for force-closure. We present the algorithm and show experimental results on various object models using an anthropomorphic hand of a humanoid robot in simulation.
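
The core idea, representing the object as a union of medial balls and proposing grasps only where the geometry suits the hand, can be sketched as follows. This is an illustrative toy rather than the paper's algorithm: the hand-aperture filter and the nearest-ball stand-in for the local symmetry axis are simplifying assumptions.

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def candidate_grasps(medial_balls, max_hand_aperture):
    """Generate candidate grasps from a union-of-balls object model.

    medial_balls: list of (center, radius) pairs, center an (x, y, z)
    tuple. Each ball whose diameter fits the hand yields one candidate:
    the hand closes across the ball, oriented along the direction to the
    nearest other ball centre (a crude stand-in for the local
    medial-axis direction).
    """
    candidates = []
    for i, (center, radius) in enumerate(medial_balls):
        if 2.0 * radius > max_hand_aperture:
            continue  # object too thick here for this gripper
        others = [c for j, (c, _) in enumerate(medial_balls) if j != i]
        if not others:
            axis = (0.0, 0.0, 1.0)
        else:
            nearest = min(others, key=lambda c: dist(c, center))
            axis = normalize(sub(nearest, center))
        candidates.append({"position": center, "axis": axis,
                           "width": 2.0 * radius})
    return candidates
```

In the real planner these candidates would subsequently be checked for force-closure; here they are only filtered geometrically.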


Intelligent Robots and Systems | 2011

Planning grasps for robotic hands using a novel object representation based on the medial axis transform

Markus Przybylski; Tamim Asfour; Rüdiger Dillmann

Many supporting activities that future service robots might perform in people's homes depend on the capability to grasp and manipulate arbitrary objects. Easily accomplished by humans, but very difficult to achieve for robots, grasping involves dealing with a high-dimensional space of parameters which include hand kinematics, object geometry, material properties and forces. We believe that the way a robot grasps an object should be motivated by the object's geometry, and that the search space for stable grasps can be dramatically reduced if the underlying object representation reflects symmetry properties of the object that contain valuable information for grasp planning. In this paper, we introduce the grid of medial spheres, a volumetric object representation based on the medial axis transform. The grid of medial spheres represents arbitrarily shaped objects with arbitrary levels of detail and contains symmetry information that can be easily exploited by a grasp planning algorithm. We present the data structure as well as a grasp planning algorithm that exploits it, and provide experimental results on various object models using two robot hands in simulation.
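
A minimal sketch of what a "grid of medial spheres" might look like as a data structure, assuming a uniform 3-D grid keyed by cell coordinates so a planner can quickly retrieve the local object geometry around a query point. The class and method names are hypothetical, not the paper's implementation:

```python
import math
from collections import defaultdict

class MedialSphereGrid:
    """Medial spheres (centre, radius) binned into a uniform 3-D grid
    for fast local lookup. Illustrative sketch only."""

    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _key(self, p):
        # Map a 3-D point to integer grid-cell coordinates.
        return tuple(int(math.floor(c / self.cell_size)) for c in p)

    def insert(self, center, radius):
        self.cells[self._key(center)].append((center, radius))

    def spheres_near(self, point):
        """Return spheres stored in the query cell and its 26 neighbours."""
        kx, ky, kz = self._key(point)
        found = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    found.extend(self.cells.get((kx + dx, ky + dy, kz + dz), []))
        return found
```

A grasp planner could then inspect only the spheres near a tentative hand position instead of the whole model, which is the kind of locality such a grid buys.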


IEEE-RAS International Conference on Humanoid Robots | 2010

Representation of pre-grasp strategies for object manipulation

Daniel Kappler; Lillian Y. Chang; Markus Przybylski; Nancy S. Pollard; Tamim Asfour; Rüdiger Dillmann

In this paper, we present a method for representing and re-targeting manipulations for object adjustment before final grasping. Such pre-grasp manipulation actions bring objects into better configurations for grasping, e.g. by rotating or sliding them. For this purpose, we propose a scaling- and rotation-invariant representation of the hand poses, which is automatically adapted to the target object to perform the selected pre-grasp manipulations. We show that pre-grasp strategies such as sliding manipulations not only enable more robust object grasping but also significantly increase the grasping success rate.
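
The invariant-representation idea can be illustrated with a deliberately reduced sketch: contact points are normalized by object position and scale so the same pose can be retargeted to a differently sized and placed object. Rotation invariance, which the paper also provides, is omitted here for brevity, and the function names are hypothetical.

```python
def normalize_pose(contact_points, obj_center, obj_scale):
    """Express hand contact points relative to the object centre and
    divide by a characteristic object dimension, making the
    representation invariant to object position and scale."""
    return [tuple((p[i] - obj_center[i]) / obj_scale for i in range(3))
            for p in contact_points]

def retarget_pose(normalized, new_center, new_scale):
    """Adapt a normalized pose to a differently sized/placed object by
    inverting the normalization with the new object's parameters."""
    return [tuple(new_center[i] + q[i] * new_scale for i in range(3))
            for q in normalized]
```

Retargeting is then just normalizing against the source object and denormalizing against the target, which is the essence of transferring a pre-grasp strategy between objects.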


Intelligent Robots and Systems | 2011

The OpenGRASP benchmarking suite: An environment for the comparative analysis of grasping and dexterous manipulation

Stefan Ulbrich; Daniel Kappler; Tamim Asfour; Nikolaus Vahrenkamp; Alexander Bierbaum; Markus Przybylski; Rüdiger Dillmann

In this work, we present a new software environment for the comparative evaluation of algorithms for grasping and dexterous manipulation. The key aspect in its development is to provide a tool that allows the reproduction of well-defined experiments in real-life scenarios in any laboratory and, hence, provides benchmarks that pave the way for objective comparison and competition in the field of grasping. To achieve this, experiments are performed on a sound open-source software platform with an extendable structure, so that a wider range of benchmarks defined by robotics researchers can be included. The environment is integrated into the OpenGRASP toolkit, which is built upon the OpenRAVE project and includes grasp-specific extensions and a tool for the creation and integration of new robot models. Currently, benchmarks for grasp and motion planning are included as case studies, as well as a library of everyday domestic object models and a real-life scenario featuring a humanoid robot acting in a kitchen.


IEEE-RAS International Conference on Humanoid Robots | 2011

Bimanual grasp planning

Nikolaus Vahrenkamp; Markus Przybylski; Tamim Asfour; Rüdiger Dillmann

The ability to grasp large objects with both hands enables bimanual robot systems to fully employ their capabilities in human-centered environments. Hence, algorithms are needed to precompute bimanual grasping configurations that can be used online to efficiently create whole-body grasps. In this work we present a bimanual grasp planner that can be used to build a set of grasps together with manipulability information for a given object. For efficient grasp planning, precomputed reachability information and a beneficial object representation, based on medial axis descriptions, are used. Since bimanual grasps may suffer from low manipulability, caused by a closed kinematic chain, we show how the manipulability of a bimanual grasp can be used as a quality measure. To this end, manipulability clusters are introduced as an efficient way to approximately describe the manipulability of a given bimanual grasp. The proposed approach is evaluated with a reference implementation, based on Simox [1], for the humanoid robot ARMAR-III [2]. Since the presented algorithms are robot-independent, there are no limitations for using this planner on other robot systems.
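
A standard scalar quantity behind this kind of quality measure is Yoshikawa's manipulability, w = sqrt(det(J J^T)) for a task-space Jacobian J; values near zero indicate proximity to a singularity, where a grasp leaves the robot little room to move the object. The paper's manipulability clusters are a richer construction, so treat this only as a sketch of the underlying quantity, shown for a 2-row (planar) Jacobian so the determinant stays hand-computable:

```python
import math

def yoshikawa_manipulability(J):
    """Yoshikawa's measure w = sqrt(det(J J^T)) for a Jacobian J given
    as a list of two rows (planar task space)."""
    # M = J J^T is 2x2; compute its entries directly.
    m00 = sum(a * a for a in J[0])
    m01 = sum(a * b for a, b in zip(J[0], J[1]))
    m11 = sum(b * b for b in J[1])
    det = m00 * m11 - m01 * m01
    return math.sqrt(max(det, 0.0))  # clamp tiny negative round-off
```

When the two rows are linearly dependent (a singular configuration) the measure drops to zero, which is exactly the situation a bimanual planner wants to penalize.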


IFAC Proceedings Volumes | 2012

Task-based Grasp Adaptation on a Humanoid Robot

Jeannette Bohg; Kai Welke; Beatriz León; Martin Do; Dan Song; Walter Wohlkinger; Marianna Madry; Aitor Aldoma; Markus Przybylski; Tamim Asfour; Higinio Martí; Danica Kragic; Antonio Morales; Markus Vincze

In this paper, we present an approach towards autonomous grasping of objects according to their category and a given task. Recent advances in the field of object segmentation and categorization as well as task-based grasp inference have been leveraged by integrating them into one pipeline. This allows us to transfer task-specific grasp experience between objects of the same category. The effectiveness of the approach is demonstrated on the humanoid robot ARMAR-IIIa.


IEEE-RAS International Conference on Humanoid Robots | 2011

Human-inspired selection of grasp hypotheses for execution on a humanoid robot

Markus Przybylski; Tamim Asfour; Rüdiger Dillmann; Rene Gilster; Heiner Deubel

Future humanoid robots will need the capability to grasp and manipulate arbitrary objects in order to assist people in their homes and to interact with them and with the environment. In this work, we present an approach to grasping known objects. Our approach consists of an offline grasp planning step, a rating step that determines the human-likeness of the grasps, and an execution step in which the most suitable grasp is performed on a humanoid robot. We especially focus on the rating step, where we use human grasping data to rate pre-computed grasp hypotheses from our grasp planner in order to select the most human-like feasible grasp for execution on the real robot. We present the details of our method together with experiments on our ARMAR-III humanoid robot.
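
One simple way to rate grasp hypotheses against human data is nearest-neighbour distance in a grasp feature space. The sketch below uses only the grasp approach vector as the feature; the paper's actual rating criteria are richer, so this is purely an assumption-laden illustration:

```python
import math

def rate_human_likeness(planned_grasps, human_grasps):
    """Score each planned grasp by its distance to the nearest human
    demonstration (both given as approach-vector tuples) and return
    the hypotheses ordered most-human-like first."""
    def d(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    scored = [(min(d(g, h) for h in human_grasps), g) for g in planned_grasps]
    scored.sort(key=lambda s: s[0])
    return [g for _, g in scored]
```

The top-ranked hypothesis that also passes the robot's feasibility checks would then be the one sent for execution.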


IEEE-RAS International Conference on Humanoid Robots | 2012

A skeleton-based approach to grasp known objects with a humanoid robot

Markus Przybylski; Mirko Wächter; Tamim Asfour; Rüdiger Dillmann

This paper is about grasping known objects of arbitrary shape with a humanoid robot. We extend our previous work, where we presented a grasp planning method using an object representation based on the medial axis transform (MAT). The MAT describes an object's topological skeleton and contains information about local symmetry properties and thickness that is valuable for grasp planning. So far, our previous work had only been conducted in simulation. The contribution of this paper is the transfer of our grasp planning method to the real world. We present grasping experiments with challenging arbitrarily shaped objects, where we execute the grasps generated by our grasp planner on a real humanoid robot with a five-finger hand.


Archive | 2013

Grasp and Motion Planning for Humanoid Robots

Markus Przybylski; Nikolaus Vahrenkamp; Tamim Asfour; Rüdiger Dillmann

The capability of humanoid robots to grasp objects is a key competence for their successful application in human-centered environments. We present an approach for grasping daily objects consisting of offline and online phases for grasp and collision-free motion planning. The proposed method generates object-related sets of feasible grasping configurations in an offline phase that are then used for online planning of grasping motions on a humanoid robot. Generating force-closure (FC) grasps on 3D objects is considered a hard problem, since many parameters, such as hand kinematics, object geometry, material properties, and forces, have to be taken into account, making the space of possible candidate grasps too large to search exhaustively. We believe that the key to finding stable grasps efficiently is a special representation of the object geometry that can be easily analyzed. In this chapter, we present a novel grasp planning method that evaluates local symmetry properties of objects to generate only candidate grasps that are likely to be of good quality. We achieve this by computing the medial axis, which represents symmetry properties of 3D objects by inscribing spheres of maximum diameter into the original shape. Our grasp planner performs offline analysis of the object's medial axis and generates geometrically meaningful candidate grasps. These are then tested for FC in order to create sets of feasible grasps. The resulting grasp sets are used during the online phase for planning collision-free grasping motions with the IK-RRT approach. In contrast to classical motion planning algorithms related to Rapidly-exploring Random Trees (RRT), the IK-RRT planner does not rely on one specific goal configuration, but implicitly uses a goal region in configuration space induced by a set of potential grasping configurations in workspace. By using efficient IK solvers to sample potential goal configurations during planning, the IK-RRT planner can efficiently compute collision-free grasping motions for high-dimensional planning problems. Further, an extension to bimanual grasping problems is discussed, and evaluations on the humanoid robot ARMAR-III [3] are performed.
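
The distinguishing step of IK-RRT, sampling goal configurations via IK instead of fixing a single goal, can be sketched in isolation. The ik_solver and sample_random callbacks below are assumed interfaces for illustration, not the actual Simox/OpenRAVE API:

```python
import random

def ik_rrt_sample(grasp_set, ik_solver, sample_random, p_goal=0.2):
    """One sampling step of an IK-RRT-style planner: with probability
    p_goal, pick a random grasp from the precomputed set and try to
    turn it into a goal configuration via IK; otherwise (or if IK
    fails) return a uniform random configuration for ordinary tree
    extension.

    ik_solver: grasp -> configuration tuple, or None on failure.
    sample_random: () -> random configuration tuple.
    """
    if random.random() < p_goal:
        grasp = random.choice(grasp_set)
        q = ik_solver(grasp)
        if q is not None:
            return ("goal", q)
    return ("random", sample_random())
```

Because goal configurations keep being sampled throughout planning, the tree is biased toward the whole goal region implied by the grasp set rather than toward one fixed configuration.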

Collaboration


Top co-authors of Markus Przybylski:

Tamim Asfour (Karlsruhe Institute of Technology)
Rüdiger Dillmann (Center for Information Technology)
Nikolaus Vahrenkamp (Karlsruhe Institute of Technology)
Kai Welke (Karlsruhe Institute of Technology)
Daniel Kappler (Karlsruhe Institute of Technology)
Julian Schill (Karlsruhe Institute of Technology)
David Schiebener (Karlsruhe Institute of Technology)
Martin Do (Karlsruhe Institute of Technology)
Stefan Ulbrich (Karlsruhe Institute of Technology)