Dennis Krupke
University of Hamburg
Publications
Featured research published by Dennis Krupke.
26th Conference on Modelling and Simulation | 2012
Dennis Krupke; Guoyuan Li; Jianwei Zhang; Houxiang Zhang; Hans Petter Hildre
In this paper, a novel GUI for a modular-robot simulation environment is introduced. The GUI is intended to be used both by inexperienced users taking part in an educational workshop and by experienced researchers who want to work on control algorithms for modular robots with the help of a framework. It offers two modes for these two kinds of users. Each mode makes it possible to configure everything needed through a graphical interface and stores configurations in XML files. Furthermore, the GUI not only supports importing the user's control algorithms, but also provides online modulation of these algorithms. Learning techniques such as genetic algorithms and reinforcement learning are also integrated into the GUI for locomotion optimization. Thus, it is easy to use, and its scalability makes it suitable for both research and education.
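As a rough illustration of the kind of locomotion optimization such a GUI can integrate, the sketch below runs a minimal genetic algorithm over one gait parameter. The fitness function is a toy stand-in (it assumes the simulated robot moves farthest when neighbouring modules are 90 degrees out of phase); a real system would score distance travelled in the physics simulation. All names here are our own, not from the paper.

```python
import random

def fitness(phase_offset):
    # Toy objective standing in for a simulation rollout: best
    # locomotion assumed at a 90-degree inter-module phase offset.
    return -abs(phase_offset - 90.0)

def evolve(generations=50, pop_size=20, mutation_sigma=5.0):
    # Truncation selection: keep the best half, mutate it to refill.
    population = [random.uniform(0.0, 180.0) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = [p + random.gauss(0.0, mutation_sigma) for p in parents]
        population = parents + children
    return max(population, key=fitness)

best = evolve()  # converges near the assumed optimum of 90 degrees
```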
virtual reality software and technology | 2016
Dennis Krupke; Lasse Einig; Eike Langbehn; Jianwei Zhang; Frank Steinicke
Current developments in the field of user interface (UI) technologies as well as robotic systems provide enormous potential to reshape the future of human-robot interaction (HRI) and collaboration. However, the design of reliable, intuitive and comfortable user interfaces is a challenging task. In this paper, we focus on one important aspect of such interfaces, i.e., teleoperation. We explain how to set up a heterogeneous, extendible and immersive system for controlling a distant robotic system via the network. To this end, we exploit current technologies from the area of virtual reality (VR) and the Unity3D game engine in order to provide natural user interfaces for teleoperation. Regarding robot control, we use the well-known robot operating system (ROS) and apply its freely available modular components. The contribution of this work lies in the implementation of a flexible immersive grasping control system using a network layer (ROSbridge) between Unity3D and ROS for arbitrary robotic hardware.
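The ROSbridge network layer mentioned here exchanges JSON messages over a websocket, where every operation carries an "op" field (e.g. advertise, publish, subscribe). The sketch below only builds such protocol envelopes; the topic name and gripper message are illustrative examples, not taken from the paper.

```python
import json

def advertise(topic, msg_type):
    # Announce a topic before publishing, per the rosbridge protocol.
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})

def publish(topic, msg):
    # Wrap a ROS message dict in a rosbridge publish operation.
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

# Hypothetical grasp command sent from the VR interface to the robot side:
adv = advertise("/gripper/command", "std_msgs/Float32")
cmd = publish("/gripper/command", {"data": 0.5})
```

In the paper's setup these strings would be sent over the websocket connection from Unity3D to the rosbridge server, which republishes them as native ROS messages.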
Industrial Robot-an International Journal | 2015
Dennis Krupke; Florens Wasserfall; Norman Hendrich; Jianwei Zhang
Purpose – This paper aims to present the design of a modular robot with 3D-printing technology. Design/methodology/approach – The robot consists of a number of autonomous modules coupled by magnetic interfaces. Each module combines 3D-printed mechanical parts with widely available standard electronic components, including a microcontroller and a single servo actuator. The mechanical and electrical connection is provided by a single set of magnets which apply the physical force between the modules and at the same time serve as wires for power and communication. Findings – The PMR is a full-featured robotic device, well integrated into a simulation framework, capable of executing common locomotion patterns but still extremely affordable (approximately 25/module). Furthermore, the design is easy to extend and replicate for other research and education groups. Originality/value – This paper explores a novel approach of connecting devices in a complex way by utilizing very simple magnetic parts. A second focus l...
international conference on multisensor fusion and integration for intelligent systems | 2012
Martin Noeske; Dennis Krupke; Norman Hendrich; Jianwei Zhang; Houxiang Zhang
This paper describes a method to integrate a hardware device into modular-robot control and simulation software and introduces sensor fusion to investigate the current set of control parameters. The Wiimote and the usage of its sensors are introduced as a high-level remote unit for the modulation of control algorithms. Sensor fusion of the Wiimote's sensors makes it possible to create a human-robot interaction module for controlling a simulation environment as well as real robots. Finally, its benefit for the evaluation of locomotion control algorithms is pointed out.
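The abstract does not specify which fusion scheme is used; one common way to fuse the Wiimote's two motion sensors into a single orientation estimate is a complementary filter, sketched below purely for illustration. It trusts the integrated gyroscope rate in the short term and the accelerometer's gravity-based tilt in the long term.

```python
import math

def accel_tilt(ax, az):
    # Tilt angle (degrees) recovered from the gravity vector
    # as measured by the accelerometer.
    return math.degrees(math.atan2(ax, az))

def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
    # Blend: short-term gyro integration, long-term accelerometer tilt.
    gyro_angle = angle + gyro_rate * dt
    return alpha * gyro_angle + (1.0 - alpha) * accel_tilt(ax, az)

# A stationary, level device: the estimate decays toward 0 degrees
# even if it starts from a wrong initial angle.
angle = 10.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0, ax=0.0, az=1.0, dt=0.01)
```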
IEEE Transactions on Visualization and Computer Graphics | 2018
Jingxin Zhang; Eike Langbehn; Dennis Krupke; Nicholas Katzakis; Frank Steinicke
Telepresence systems have the potential to overcome the limits and distance constraints of the real world by enabling people to remotely visit and interact with each other. However, current telepresence systems usually lack natural ways of supporting interaction and exploration of remote environments (REs). In particular, single webcams for capturing the RE provide only a limited illusion of spatial presence, and movement control of mobile platforms in today's telepresence systems is often restricted to simple interaction devices. One of the main challenges of telepresence systems is to allow users to explore a RE in an immersive, intuitive and natural way, e.g., by real walking in the user's local environment (LE), and thus controlling motions of the robot platform in the RE. However, the LE in which the user's motions are tracked usually provides a much smaller interaction space than the RE. In this context, redirected walking (RDW) is a very suitable approach to solve this problem. However, so far no previous work has explored if and how RDW can be used in video-based 360° telepresence systems. In this article, we conducted two psychophysical experiments in which we quantified how much humans can be unknowingly redirected on virtual paths in the RE which differ from the physical paths that they actually walk in the LE. Experiment 1 introduces a discrimination task between local and remote translations, and in Experiment 2 we analyzed the discrimination between local and remote rotations. In Experiment 1, participants performed straightforward translations in the LE that were mapped to straightforward translations in the RE shown as 360° videos, which were manipulated by different gains. Then, participants had to estimate if the remotely perceived translation was faster or slower than the actual physically performed translation. Similarly, in Experiment 2, participants performed rotations in the LE that were mapped to virtual rotations in a 360° video-based RE to which we applied different gains. Again, participants had to estimate whether the remotely perceived rotation was smaller or larger than the actual physically performed rotation. Our results show that participants are not able to reliably discriminate the difference between physical motion in the LE and the virtual motion from the 360° video RE when virtual translations are down-scaled by 5.8% and up-scaled by 9.7%, and virtual rotations are about 12.3% less or 9.2% more than the corresponding physical rotations in the LE.
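The reported thresholds can be read as a range of gains a telepresence system may apply without the user noticing. The sketch below encodes the abstract's numbers and applies a translation gain to a locally walked distance; the function names are ours, not the paper's.

```python
# Undetectable gain ranges taken from the abstract:
TRANS_GAIN_RANGE = (1.0 - 0.058, 1.0 + 0.097)  # down-scaled 5.8% .. up-scaled 9.7%
ROT_GAIN_RANGE = (1.0 - 0.123, 1.0 + 0.092)    # 12.3% less .. 9.2% more

def remote_translation(local_metres, gain):
    # Map a physically walked distance in the LE to the distance
    # shown in the 360-degree video RE.
    return local_metres * gain

def gain_is_detectable(gain, lo, hi):
    # Gains inside [lo, hi] went unnoticed by participants on average.
    return not (lo <= gain <= hi)

v = remote_translation(2.0, 1.05)  # walk 2 m locally, see 2.1 m remotely
```

A gain of 1.05 lies inside the translation range above, so this mapping would likely go unnoticed, while, e.g., a gain of 1.2 would exceed the detection threshold.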
ieee virtual reality conference | 2016
Dennis Krupke; Paul Lubos; Lena Demski; Jonas Brinkhoff; Gregor Weber; Fabian Willke; Frank Steinicke
This video presents the experimental setup of an immersive flight simulation system, which combines body tracking with a head-mounted display. Users hang freely in a climbing harness in order to improve the impression of flying. The base system was created by a group of students during one semester as a bachelor project. A study revealed that a high degree of presence is achievable by the combination of body-posture-based control and 3D visualization via a head-mounted display while hanging freely in a climbing harness. Furthermore, the results of a study comparing two different steering methods are presented. Both control methods appear interesting and intuitively controllable, but reveal individual preferences among the participants of the study. In our experiments, strong vection effects were very frequently reported when flying turns to one side. The setup very likely causes simulator sickness, especially if a participant had difficulty controlling their flight.
Multisensor Fusion and Information Integration for Intelligent Systems (MFI), 2014 International Conference on | 2014
Dennis Krupke; Martin Noeske; Florens Wasserfall; Jianwei Zhang
This paper describes a method to generate locomotion of a modular reconfigurable robot in chain-like configuration based on its topology and orientation. Combined with results from physics simulation, efficient locomotion patterns with optimized control parameters can be applied to the robot. The resulting locomotion depends highly on the properties of the current configuration of the robot's body and its orientation.
CLAWAR 2015: 18th International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines | 2015
Dennis Krupke; Norman Hendrich; Jianwei Zhang; Houxiang Zhang
intelligent robots and systems | 2017
Sebastian Starke; Norman Hendrich; Dennis Krupke; Jianwei Zhang
CLAWAR 2017: 20th International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines | 2017
Dennis Krupke; Sebastian Starke; Lasse Einig; Jianwei Zhang; Frank Steinicke