Publications

Featured research published by Johannes Hummel.


International Symposium on Visual Computing | 2012

An Evaluation of Open Source Physics Engines for Use in Virtual Reality Assembly Simulations

Johannes Hummel; Robin Wolff; Tobias Stein; Andreas Gerndt; Torsten W. Kuhlen

We present a comparison of five freely available physics engines with a specific focus on robotic assembly simulation in virtual reality (VR) environments. The aim was to evaluate the engines with generic settings and minimal parameter tweaking. Our benchmarks consider the minimum collision detection time for a large number of objects, restitution characteristics, as well as constraint reliability and body inter-penetration. A further benchmark tests the simulation of a screw-and-nut mechanism made of rigid bodies only, without any analytic approximation. Our results show large deviations across the tested engines and reveal benefits and disadvantages that help in selecting the appropriate physics engine for assembly simulations in VR.
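
Below is a minimal sketch of the kind of collision/stepping timing benchmark described in this abstract, assuming the Python bindings of the Bullet engine (pybullet); the object count, shapes, and timing method are illustrative assumptions, not the paper's actual benchmark setup or the APIs of the other four engines.

    # Rough timing of simulation steps for many rigid bodies with pybullet
    # (illustrative only; the paper benchmarks five engines with their native APIs).
    import time
    import pybullet as p

    p.connect(p.DIRECT)                      # headless physics server
    p.setGravity(0, 0, -9.81)

    # Ground plane and a pile of boxes (object count is an arbitrary example).
    plane = p.createCollisionShape(p.GEOM_PLANE)
    p.createMultiBody(baseMass=0, baseCollisionShapeIndex=plane)
    box = p.createCollisionShape(p.GEOM_BOX, halfExtents=[0.05, 0.05, 0.05])
    for i in range(200):
        p.createMultiBody(baseMass=1.0,
                          baseCollisionShapeIndex=box,
                          basePosition=[(i % 10) * 0.12,
                                        (i // 10 % 10) * 0.12,
                                        1.0 + (i // 100) * 0.12])

    # Time a fixed number of simulation steps as a crude collision/stepping benchmark.
    steps = 1000
    t0 = time.perf_counter()
    for _ in range(steps):
        p.stepSimulation()
    elapsed = time.perf_counter() - t0
    print(f"{elapsed / steps * 1000:.3f} ms per step for 200 bodies")
    p.disconnect()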


Symposium on 3D User Interfaces | 2013

An evaluation of two simple methods for representing heaviness in immersive virtual environments

Johannes Hummel; Janki Dodiya; Robin Wolff; Andreas Gerndt; Torsten W. Kuhlen

Weight perception in virtual environments can generally be achieved with haptic devices. However, most of these are hard to integrate into an immersive virtual environment (IVE) due to their technical complexity and the restriction of a user's movement within the IVE. We describe two simple methods that use only a wireless, lightweight finger-tracking device in combination with a physics-simulated hand model to create a feeling of heaviness of virtual objects when interacting with them in an IVE. The first method maps the varying distance between the tracked fingers and the thumb to the grasping force required for lifting a virtual object with a given weight. The second method maps the detected intensity of the finger pinch during grasping gestures to the lifting force. In an experiment described in this paper we investigated the potential of the proposed methods for the discrimination of heaviness of virtual objects by finding the just noticeable difference (JND) and calculating the Weber fraction. Furthermore, the workload that users experienced when using these methods was measured to gain more insight into their usefulness as an interaction technique. At a hit ratio of 0.75, the determined Weber fraction was 16.25% for the finger-distance-based method and 15.48% for the pinch-based method, which corresponds to values found in related work. There was no significant effect of method on the measured difference threshold or the experienced workload; however, user preference was higher for the pinch-based method. The results demonstrate the capability of the proposed methods for the perception of heaviness in IVEs and therefore represent a simple alternative to haptics-based methods.
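
As a rough illustration of how a just noticeable difference (JND) and Weber fraction can be derived from such discrimination data, the sketch below fits a logistic psychometric function and reads off the 0.75-hit-ratio threshold; the trial data, reference weight, and choice of fit are assumptions for illustration, not the study's actual analysis.

    # Fit a logistic psychometric function to (weight increment, hit rate) data
    # and read off the 0.75-hit-ratio threshold (JND); Weber fraction = JND / reference.
    import numpy as np
    from scipy.optimize import curve_fit

    reference_weight = 200.0                          # grams; illustrative reference stimulus
    increments = np.array([5, 10, 20, 30, 40, 60])    # tested weight differences (g), made up
    hit_rates = np.array([0.52, 0.58, 0.66, 0.74, 0.82, 0.93])  # correct-response rates, made up

    def psychometric(x, alpha, beta):
        # Logistic curve rising from 0.5 (chance level) to 1.0
        return 0.5 + 0.5 / (1.0 + np.exp(-(x - alpha) / beta))

    (alpha, beta), _ = curve_fit(psychometric, increments, hit_rates, p0=[20.0, 10.0])

    # psychometric(alpha) = 0.75, so the logistic midpoint alpha is the 0.75 threshold (JND).
    jnd = alpha
    weber_fraction = jnd / reference_weight
    print(f"JND = {jnd:.1f} g, Weber fraction = {weber_fraction * 100:.1f}%")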


IEEE Virtual Reality Conference | 2016

A lightweight electrotactile feedback device for grasp improvement in immersive virtual environments

Johannes Hummel; Janki Dodiya; Laura Eckardt; Robin Wolff; Andreas Gerndt; Torsten W. Kuhlen

An immersive virtual environment is the ideal platform for the planning and training of on-orbit servicing missions, as it provides a flexible and safe environment. In such virtual assembly simulations, grasping virtual objects is one of the most common and natural interactions. However, unlike grasping objects in the real world, it is a non-trivial task in virtual environments, where the primary feedback is visual only. Much research has investigated ways to provide haptic feedback, such as force, vibrational, and electrotactile feedback. Such devices, however, are usually uncomfortable and hard to integrate into projection-based immersive VR systems. In this paper, we present a novel, small, and lightweight electrotactile feedback device, specifically designed for immersive virtual environments. It consists of a small tactor with eight electrodes for each finger and a signal generator attached to the user's hand or arm. Our device can be easily integrated with an existing optical finger-tracking system. The study presented in this paper assesses the feasibility and usability of the interaction device. An experiment was conducted in a repeated-measures design using the electrotactile feedback modality as the independent variable. As a benchmark, we chose three typical assembly tasks of a VR simulation for satellite on-orbit servicing missions: pressing a button, switching a lever switch, and pulling a module from its slot. Results show that electrotactile feedback improved the user's grasping in our virtual on-orbit servicing scenario. The task completion time was significantly lower for all three tasks and the precision of the user's interaction was higher. The workload reported by the participants was significantly lower when using electrotactile feedback. Additionally, users were more confident in their performance while completing the tasks with electrotactile feedback. We describe the device, outline the user study and report the results.
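
The abstract does not specify how simulated contact translates into stimulation, so the sketch below only illustrates one plausible mapping: clamping a simulated fingertip contact force into a normalized stimulation level per finger. All names, thresholds, and ranges are hypothetical.

    # Hypothetical mapping from a simulated fingertip contact force to a
    # normalized electrotactile stimulation level (0 = off, 1 = maximum safe level).
    def stimulation_level(contact_force_n, f_min=0.1, f_max=5.0):
        # No stimulation below a small dead zone, saturation at f_max.
        if contact_force_n <= f_min:
            return 0.0
        return min(1.0, (contact_force_n - f_min) / (f_max - f_min))

    # Example: per-finger contact forces coming from the physics simulation (made up).
    for finger, force in {"thumb": 0.0, "index": 1.2, "middle": 6.3}.items():
        print(finger, round(stimulation_level(force), 2))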


Symposium on 3D User Interfaces | 2015

Cirque des bouteilles: The art of blowing on bottles

Daniel Zielasko; Dominik Rausch; Yuen C. Law; Thomas Knott; Sebastian Pick; Sven Porsche; Joachim Herber; Johannes Hummel; Torsten W. Kuhlen

Making music by blowing on bottles is fun but challenging. We introduce a novel 3D user interface to play songs on virtual bottles. For this purpose, the user blows into a microphone, and the stream of air is recreated in the virtual environment and redirected to the virtual bottles she is pointing at with her fingers. This is easy to learn and subsequently opens up opportunities for quickly switching between bottles and playing groups of them together to form complex melodies. Furthermore, our interface enables the customization of the virtual environment by moving bottles and changing their type or filling level.
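
A minimal sketch of the underlying idea, assuming a simple RMS threshold on the microphone signal for blow detection and a Helmholtz-resonator estimate of a bottle's pitch from its remaining air volume; all parameters are illustrative and not taken from the paper.

    # Estimate a bottle's resonance frequency from its air volume (Helmholtz resonator)
    # and detect blowing from the RMS amplitude of the current microphone frame.
    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s

    def bottle_pitch_hz(air_volume_m3, neck_area_m2=3e-4, neck_length_m=0.08):
        # f = c / (2*pi) * sqrt(A / (V * L)) for a Helmholtz resonator
        return SPEED_OF_SOUND / (2 * np.pi) * np.sqrt(neck_area_m2 / (air_volume_m3 * neck_length_m))

    def is_blowing(mic_samples, rms_threshold=0.05):
        # Very simple blow detector: RMS amplitude of the current audio frame
        return np.sqrt(np.mean(np.square(mic_samples))) > rms_threshold

    # Example: a 0.5 l bottle half filled leaves roughly 0.25 l of air.
    print(f"{bottle_pitch_hz(0.25e-3):.0f} Hz")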


IEEE Virtual Reality Conference | 2012

Comparing three interaction methods for manipulating thin deformable virtual objects

Johannes Hummel; Robin Wolff; Andreas Gerndt; Torsten W. Kuhlen

We present results of a user study in which we compared three interaction methods for manipulating deformable objects in immersive virtual environments. The task was to control a virtual robot hand removing a thin foil cover from a satellite in an on-orbit servicing training simulator. The lack of haptic feedback made it challenging for the user to apply the right force for grasping the foil without losing grip or damaging it. We compared the intuitiveness and effectiveness of using a tracked joystick, finger-distance measurement, and a novel prototype enabling direct force input through pinching.


IEEE Aerospace Conference | 2015

VR-OOS: The DLR's virtual reality simulator for telerobotic on-orbit servicing with haptic feedback

Mikel Sagardia; Katharina Hertkorn; Thomas Hulin; Simon Schätzle; Robin Wolff; Johannes Hummel; Janki Dodiya; Andreas Gerndt

The growth of space debris is becoming a severe issue that urgently requires mitigation measures based on maintenance, repair, and de-orbiting technologies. Such on-orbit servicing (OOS) missions, however, are delicate and expensive. Virtual Reality (VR) enables simulation and training in a flexible and safe environment, and hence has the potential to drastically reduce costs and time while increasing the success rate of future OOS missions. This paper presents a highly immersive VR system with which satellite maintenance procedures can be simulated interactively using visual and haptic feedback. The system can be used for verification and training purposes for human and robot systems interacting in space. Our framework combines realistic virtual reality simulation engines with advanced immersive interaction devices. The DLR bimanual haptic device HUG is used as the main user interface. The HUG is equipped with two lightweight robot arms and is able to provide realistic haptic feedback on both human arms. Additional devices provide vibrotactile and electrotactile feedback at the elbow and the fingertips. A particularity of the real-time simulation is the fusion of the Bullet physics engine with our haptic rendering algorithm, which is an enhanced version of the Voxmap-Pointshell Algorithm. Our haptic rendering engine supports multiple objects in the scene and is able to compute collisions for each of them within 1 ms, enabling realistic virtual manipulation tasks even for stiff collision configurations. The visualization engine ViSTA is used during the simulation to achieve photo-realistic effects, increasing the immersion. In order to provide a realistic experience at interactive frame rates, we developed a distributed system architecture, in which the load of computing the physics simulation, haptic feedback, and visualization of a complex scene is distributed across dedicated machines. The implementations are presented in detail and the performance of the overall system is validated. Additionally, a preliminary user study in which the virtual system is compared to a physical test bed shows the suitability of the VR-OOS framework.
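
As a rough sketch of the penalty-based force computation at the core of a voxmap-pointshell style algorithm, the code below looks up the penetration depth of each pointshell point in a precomputed voxel grid and accumulates a force along the point normals; this is a textbook-level illustration, not DLR's enhanced implementation.

    # Penalty force from a voxmap/pointshell pair: for every surface point of the
    # moving object, look up the penetration depth in the static object's voxel
    # grid and push back along the point's surface normal.
    import numpy as np

    def haptic_force(points, normals, voxel_depth, origin, voxel_size, stiffness=500.0):
        """points, normals: (N, 3) pointshell of the moving object (world frame).
        voxel_depth: 3D array with penetration depth per voxel (0 outside the object).
        Returns the total penalty force acting on the moving object."""
        force = np.zeros(3)
        idx = np.floor((points - origin) / voxel_size).astype(int)
        inside = np.all((idx >= 0) & (idx < voxel_depth.shape), axis=1)
        for i in np.nonzero(inside)[0]:
            depth = voxel_depth[tuple(idx[i])]
            if depth > 0.0:
                # Penalty force proportional to penetration, along the point normal.
                force += stiffness * depth * normals[i]
        return force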


Archive | 2013

Fallbeispiele für VR/AR

Ralf Dörner; Geert Matthys; Manfred Bogen; Stefan Rilling; Andreas Gerndt; Janki Dodiya; Katharina Hertkorn; Thomas Hulin; Johannes Hummel; Mikel Sagardia; Robin Wolff; Tom Kühnert; Guido Brunnett; Hagen Buchholz; Lisa Blum; Christoffer Menk; Christian Bade; Werner Schreiber; Matthias Greiner; Thomas Alexander; Michael Kleiber; Gerd Bruder; Frank Steinicke

This chapter contains a collection of selected successful case studies of VR/AR from research and practice.


ICAT/EGVE/EuroVR | 2012

Towards Interacting with Force-Sensitive Thin Deformable Virtual Objects

Johannes Hummel; Robin Wolff; Janki Dodiya; Andreas Gerndt; Torsten W. Kuhlen

The selection of the right input devices for 3D interaction methods is important for a successful VR system. While natural direct interaction is often preferred, research has shown that indirect interaction can be beneficial. This paper focuses on an immersive simulation and training environment, in which one sub-task is to carefully grasp and move a force-sensitive thin deformable foil without damaging it. In order to ensure transfer of training, it was necessary to make the user aware of how gently the foil had to be grasped and moved. We explore the potential of three simple and lightweight interaction methods that each map interaction to a virtual hand in a distinct way. We used a standard tracked joystick with an indirect mapping, a standard finger-tracking device with a direct mapping based on finger position, and a novel enhanced finger-tracking device that additionally allowed pinch-force input. The results of our summative user study show no significant difference in task performance among the three interaction methods. The simple position-based mapping using finger tracking was preferred most, although the enhanced finger-tracking device with direct force input offered the most natural interaction mapping. Our findings show that both direct and indirect input methods have potential for interacting with force-sensitive thin deformable objects, while the direct method is preferred.
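
The sketch below contrasts the three kinds of mapping described above (indirect joystick input, grasping force from finger-thumb distance, and direct pinch-force input); the thresholds and scaling factors are illustrative assumptions, not the study's parameters.

    # Three illustrative ways to derive a virtual grasp force for the hand model.
    import math

    def grasp_from_joystick(trigger_value, max_force=10.0):
        # Indirect mapping: analog trigger position in [0, 1] scaled to a force.
        return max_force * max(0.0, min(1.0, trigger_value))

    def grasp_from_finger_distance(thumb_pos, finger_pos,
                                   open_dist=0.10, closed_dist=0.02, max_force=10.0):
        # Direct position mapping: smaller thumb-finger distance -> larger grasp force.
        d = math.dist(thumb_pos, finger_pos)
        closure = (open_dist - d) / (open_dist - closed_dist)
        return max_force * max(0.0, min(1.0, closure))

    def grasp_from_pinch_sensor(measured_force_n):
        # Direct force input: the pinch-sensor reading is used as the grasp force itself.
        return max(0.0, measured_force_n)

    print(grasp_from_finger_distance((0, 0, 0), (0.04, 0, 0)))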


Archive | 2012

Short Paper: Towards Interacting with Force-Sensitive Thin Deformable Virtual Objects

Johannes Hummel; Robin Wolff; Janki Dodiya; Andreas Gerndt; Torsten W. Kuhlen


Archive | 2014

Fallbeispiel für VR/AR - Virtuelle Satellitenreparatur im Orbit

Andreas Gerndt; Janki Dodiya; Katharina Hertkorn; Thomas Hulin; Johannes Hummel; Mikel Sagardia; Robin Wolff

Collaboration

Top Co-Authors: Janki Dodiya (German Aerospace Center), Thomas Hulin (German Aerospace Center)