
Publication


Featured research published by Manuel Brucker.


Intelligent Robots and Systems | 2013

Combining object modeling and recognition for active scene exploration

Simon Kriegel; Manuel Brucker; Zoltan-Csaba Marton; Tim Bodenmüller; Michael Suppa

Active scene exploration incorporates object recognition methods for analyzing a scene of partially known objects and exploration approaches for autonomous modeling of unknown parts. In this work, recognition, exploration, and planning methods are extended and combined in a single scene exploration system, enabling advanced techniques such as multi-view recognition from planned view positions and iterative recognition by integration of new objects from a scene. Here, a geometry based approach is used for recognition, i.e. matching objects from a database. Unknown objects are autonomously modeled and added to the recognition database. Next-Best-View planning is performed both for recognition and modeling. Moreover, 3D measurements are merged in a Probabilistic Voxel Space, which is utilized for planning collision free paths, minimal occlusion views, and verifying the poses of the recognized objects against all previous information. Experiments on an industrial robot with attached 3D sensors are shown for scenes with household and industrial objects.
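The Probabilistic Voxel Space above merges 3D measurements into an occupancy belief. A common way to realize such a space is a per-voxel log-odds Bayes update; the sketch below illustrates that general technique under simplified assumptions (dictionary-backed grid, hand-picked hit/miss probabilities), not the authors' implementation:

```python
import math

class VoxelGrid:
    """Minimal probabilistic voxel grid using per-voxel log-odds occupancy.

    Illustrative sketch only: ray casting, sensor models, and memory
    layout of the system described in the paper are omitted.
    """

    def __init__(self, resolution=0.05):
        self.resolution = resolution  # voxel edge length in meters
        self.log_odds = {}            # voxel index -> accumulated log-odds

    def _index(self, point):
        # Quantize a 3D point (x, y, z) to an integer voxel index.
        return tuple(int(math.floor(c / self.resolution)) for c in point)

    def update(self, point, p_occupied):
        # Standard Bayesian log-odds update for one measurement.
        delta = math.log(p_occupied / (1.0 - p_occupied))
        idx = self._index(point)
        self.log_odds[idx] = self.log_odds.get(idx, 0.0) + delta

    def probability(self, point):
        # Convert accumulated log-odds back to an occupancy probability.
        l = self.log_odds.get(self._index(point), 0.0)
        return 1.0 - 1.0 / (1.0 + math.exp(l))

grid = VoxelGrid(resolution=0.05)
# Two "hit" measurements raise occupancy, one weak "miss" lowers it again.
grid.update((0.12, 0.03, 0.40), p_occupied=0.7)
grid.update((0.12, 0.03, 0.40), p_occupied=0.7)
grid.update((0.12, 0.03, 0.40), p_occupied=0.4)
```

Unobserved voxels stay at probability 0.5, which is what makes the space useful for planning minimal-occlusion views and collision-free paths: free, occupied, and unknown space are all distinguishable.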


Intelligent Robots and Systems | 2013

Virtual reality support for teleoperation using online grasp planning

Katharina Hertkorn; Maximo A. Roa; Manuel Brucker; Philipp Kremer; Christoph Borst

Classic telepresence approaches allow a human to interact with a remote or a virtual reality (VR) environment with force feedback. Coupling with a remote robot can be used to work in dangerous environments without the human being on-site, while coupling with a VR system can be used for training and verification of task sequences or robotic actions. We present an enhanced telepresence system that uses the advantages of VR to perform manipulation tasks in remote environments with multifingered hands. It provides the user with an intuitive interface that visualizes the robot's knowledge of its environment, and the combination of VR, telepresence, and shared autonomy facilitates object manipulation for the user.


International Conference on Robotics and Automation | 2012

Sequential scene parsing using range and intensity information

Manuel Brucker; Simon Leonard; Tim Bodenmüller; Gregory D. Hager

This paper describes an extension of the sequential scene analysis system presented by Hager and Wegbreit [12]. In contrast to the original system, which was limited to scenes consisting of geometric primitives, such as spheres, cuboids, and cylinders computed from range data, the extended system is capable of dealing with arbitrarily shaped objects computed from range and intensity images. An object model composed of a triangulated geometry and intensity-based SURF features is introduced. The integration of prior object models into the sequential scene parsing framework is described. The extended system is evaluated with respect to pose estimation and its ability to handle complex scene sequences. It is shown that the new object models enable accurate pose estimation and reliable recognition even in highly cluttered scenes.
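The intensity-based SURF features above require reliable model-to-scene correspondences before a pose can be estimated. A standard ingredient for this is Lowe's ratio test, sketched here with toy 3-D descriptors standing in for real 64-D SURF descriptors (an illustration of the general technique, not the paper's code):

```python
import math

def dist(a, b):
    # Euclidean distance between two descriptor vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_descriptors(model_desc, scene_desc, ratio=0.8):
    """Match model features to scene features with Lowe's ratio test.

    A match is kept only when the best scene candidate is clearly
    closer than the second best, which suppresses ambiguous
    correspondences in cluttered scenes.
    """
    matches = []
    for i, m in enumerate(model_desc):
        ranked = sorted(range(len(scene_desc)), key=lambda j: dist(m, scene_desc[j]))
        best, second = ranked[0], ranked[1]
        if dist(m, scene_desc[best]) < ratio * dist(m, scene_desc[second]):
            matches.append((i, best))
    return matches

# Toy 3-D descriptors; real SURF descriptors are 64-dimensional.
model = [[0.1, 0.9, 0.2], [0.8, 0.1, 0.1]]
scene = [[0.82, 0.12, 0.09], [0.1, 0.88, 0.22], [0.5, 0.5, 0.5]]
matches = match_descriptors(model, scene)
```

The resulting correspondences would then feed a robust pose estimation step; the ratio test matters because in clutter the nearest descriptor alone is frequently a false match.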


International Conference on Ubiquitous Robots and Ambient Intelligence | 2015

Autonomous pick and place operations in industrial production

Andreas Dömel; Simon Kriegel; Manuel Brucker; Michael Suppa

Pick and place applications are very common in industrial environments. Autonomous execution of this task is desirable, as it is simple and repetitive for a human worker. To this end, we employ a mobile robot equipped with a light-weight robot (LWR) arm, a pan-tilt unit (PTU), and a multitude of sensors (see Fig. 1).


International Conference on Advanced Robotics | 2017

Experience-based optimization of robotic perception

Maximilian Durner; Simon Kriegel; Sebastian Riedel; Manuel Brucker; Zoltan-Csaba Marton; Ferenc Balint-Benczedi; Rudolph Triebel

As the performance of key perception tasks heavily depends on their parametrization, deploying versatile robots to different application domains also requires a way for operators to tune them for changing scenarios. Since many of these parameter settings are found by trial and error, usually by experts, and the quality criteria change from application to application, we propose a Pipeline Optimization Framework that helps to overcome lengthy setup times by largely automating this process. Once deployed, fine-tuning optimizations as presented in this paper can be initiated on pre-recorded data, on dry runs, or automatically during operation. Here, we quantify the performance gains for two crucial modules based on ground-truth annotated data. We release our challenging THR dataset, including evaluation scenes for two application scenarios.
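The abstract does not state which optimization strategy the framework uses; as a generic illustration of automating parameter tuning against a quality criterion on pre-recorded data, a simple random search could look like this (the pipeline, its parameters, and the score model are all invented for the sketch):

```python
import random

def run_pipeline(params, sample):
    # Hypothetical perception pipeline: stands in for e.g. a segmentation
    # or recognition module whose quality depends on its parametrization.
    threshold, blur = params["threshold"], params["blur"]
    # Toy quality model: best near threshold=0.6 with blur=3.
    return 1.0 - abs(threshold - 0.6) - 0.1 * abs(blur - 3)

def random_search(search_space, data, trials=200, seed=42):
    """Pick the parametrization with the best mean score on recorded data."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(trials):
        params = {
            "threshold": rng.uniform(*search_space["threshold"]),
            "blur": rng.choice(search_space["blur"]),
        }
        score = sum(run_pipeline(params, s) for s in data) / len(data)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

space = {"threshold": (0.0, 1.0), "blur": [1, 3, 5, 7]}
best, score = random_search(space, data=[None] * 5)
```

In a real setting `run_pipeline` would replay recorded sensor data through the module and score it against the ground-truth annotations mentioned in the abstract; the search loop itself is deliberately strategy-agnostic and could be swapped for a more sample-efficient optimizer.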


Archive | 2018

Implicit 3D Orientation Learning for 6D Object Detection from RGB Images

Martin Sundermeyer; Zoltan-Csaba Marton; Maximilian Durner; Manuel Brucker; Rudolph Triebel

We propose a real-time RGB-based pipeline for object detection and 6D pose estimation. Our novel 3D orientation estimation is based on a variant of the Denoising Autoencoder that is trained on simulated views of a 3D model using Domain Randomization.
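The abstract leaves open how the learned latent orientation representation is turned into a concrete estimate at test time; one common approach for autoencoder-based methods is nearest-neighbor lookup in a codebook of latent codes computed from rendered views at known rotations. The sketch below uses made-up 4-D codes (a real encoder would produce high-dimensional codes from images):

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def lookup_orientation(latent_code, codebook):
    """Return the rotation whose stored code is most similar to the query.

    `codebook` maps a rotation label to the latent code the encoder
    produced for a synthetic view rendered at that rotation.
    """
    return max(codebook,
               key=lambda rot: cosine_similarity(latent_code, codebook[rot]))

# Hypothetical 4-D latent codes for three yaw angles (a trained encoder
# would produce e.g. 128-D codes from rendered views of the 3D model).
codebook = {
    "yaw_0":   [0.9, 0.1, 0.0, 0.1],
    "yaw_90":  [0.1, 0.9, 0.1, 0.0],
    "yaw_180": [0.0, 0.1, 0.9, 0.1],
}
query = [0.8, 0.2, 0.1, 0.1]   # encoder output for the observed crop
estimate = lookup_orientation(query, codebook)
```

Because the codebook is built entirely from simulated views, this style of lookup fits the Domain Randomization training regime described in the abstract: no real pose-annotated images are needed.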


Autonomous Robots | 2018

Improving object orientation estimates by considering multiple viewpoints

Zoltan-Csaba Marton; Serkan Türker; Christian Rink; Manuel Brucker; Simon Kriegel; Tim Bodenmüller; Sebastian Riedel

This article describes a probabilistic approach for improving the accuracy of general object pose estimation algorithms. We propose a histogram filter variant that uses the exploration capabilities of robots, and supports active perception through a next-best-view proposal algorithm. For the histogram-based fusion method we focus on the orientation of the 6 degrees of freedom (DoF) pose, since the position can be processed with common filtering techniques. The detected orientations of the object, estimated with a pose estimator, are used to update the hypothesis of its actual orientation. We discuss the design of experiments to estimate the error model of a detection method, and describe a suitable representation of the orientation histograms. This allows us to consider priors about likely object poses or symmetries, and use information gain measures for view selection. The method is validated and compared to alternatives, based on the outputs of different 6 DoF pose estimators, using real-world depth images acquired using different sensors, and on a large synthetic dataset.
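The histogram-filter fusion described above can be illustrated on a simplified 1-D yaw histogram (the article works with full 3-D orientations and symmetry-aware representations; the bins and error model here are invented):

```python
def normalize(h):
    s = sum(h)
    return [v / s for v in h]

def histogram_update(prior, detection_bin, error_model):
    """One Bayes update of an orientation histogram.

    `prior` holds one probability per discretized orientation bin;
    `error_model[d]` is the likelihood that the detector reports a bin
    `d` steps away from the true one (circular distance).
    """
    n = len(prior)
    likelihood = [error_model[min((detection_bin - i) % n, (i - detection_bin) % n)]
                  for i in range(n)]
    return normalize([p * l for p, l in zip(prior, likelihood)])

# 8 yaw bins of 45 degrees each, uniform prior over the orientation.
prior = [1.0 / 8] * 8
# Detector is usually right, sometimes one bin off, rarely further away.
error_model = {0: 0.6, 1: 0.15, 2: 0.04, 3: 0.01, 4: 0.01}
# Two independent detections of bin 2 sharpen the belief around bin 2.
belief = histogram_update(prior, 2, error_model)
belief = histogram_update(belief, 2, error_model)
```

The same machinery supports the next-best-view idea from the abstract: a candidate viewpoint can be scored by the expected information gain (entropy reduction) of the belief after a simulated update from that view.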


International Journal of Advanced Robotic Systems | 2017

Toward fully autonomous mobile manipulation for industrial environments

Andreas Dömel; Simon Kriegel; Michael Kaßecker; Manuel Brucker; Tim Bodenmüller; Michael Suppa

This work presents a concept for autonomous mobile manipulation in industrial environments. Autonomy enables an unskilled human worker to easily configure a complex robotic system in a setup phase before carrying out fetch-and-carry operations in the execution phase. In order to perform the given tasks in real industrial production sites, we propose a robotic system consisting of a mobile platform, a torque-controlled manipulator, and an additional sensor head. Multiple sensors are attached, allowing for perception of the environment and of the objects to be manipulated, which is essential for coping with uncertainties in real-world applications. In order to provide an easy-to-use and flexible system, we present a modular software concept that is handled and organized by a hierarchical flow control, depending on the given task and environmental requirements. The presented concept for autonomous mobile manipulation is implemented for exemplary industrial manipulation tasks and validated by real-world application in a water pump production site. Furthermore, the concept has also been applied to other robotic systems and domains, such as planetary exploration with a rover.
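The hierarchical flow control is described only at a high level; in that spirit, a minimal controller that sequences skill modules and retries on failure might be sketched as follows (the skill names and retry policy are invented for illustration):

```python
def navigate(ctx):
    # Stand-in for driving the mobile platform to a work station.
    ctx["at_station"] = True
    return "ok"

def perceive(ctx):
    # A real module would run object recognition on sensor-head data.
    if not ctx.get("at_station"):
        return "fail"
    ctx["object_pose"] = (0.4, 0.1, 0.9)
    return "ok"

def grasp(ctx):
    # Stand-in for the torque-controlled manipulator's grasp skill.
    return "ok" if "object_pose" in ctx else "fail"

def run_task(skills, retries=2):
    """Sequence skills; retry a failing skill, abort when retries run out."""
    ctx = {}
    for skill in skills:
        for _ in range(retries + 1):
            if skill(ctx) == "ok":
                break
        else:
            return "aborted", ctx
    return "done", ctx

status, ctx = run_task([navigate, perceive, grasp])
```

The shared context dictionary plays the role of the configuration produced in the setup phase: each skill reads what earlier skills established, which is what lets an unskilled operator recombine modules per task.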


International Symposium on Robotics | 2014

Integration and Assessment of Multiple Mobile Manipulators in a Real-World Industrial Production Facility

Simon Bøgh; Casper Schou; Thomas Rühr; Yevgen Kogan; Andreas Dömel; Manuel Brucker; Christof Eberst; Riccardo Tornese; Christoph Sprunk; Gian Diego Tipaldi; Trine Vestergaard Hennessy


International Conference on Robotics and Automation | 2018

Semantic Labeling of Indoor Environments from 3D RGB Maps

Manuel Brucker; Maximilian Durner; Rares Ambrus; Zoltan-Csaba Marton; Axel Wendt; Patric Jensfelt; Kai Oliver Arras; Rudolph Triebel

Collaboration


Dive into Manuel Brucker's collaborations.

Top Co-Authors

Sebastian Riedel

Technische Universität München
