Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Eike Langbehn is active.

Publication


Featured research published by Eike Langbehn.


ACM Transactions on Applied Perception | 2017

Walking in Virtual Reality: Effects of Manipulated Visual Self-Motion on Walking Biomechanics

Omar Janeh; Eike Langbehn; Frank Steinicke; Gerd Bruder; Alessandro Gulberti; Monika Poetter-Nerger

Walking constitutes the predominant form of self-propelled movement from one geographic location to another in the real world. Likewise, walking in virtual environments (VEs) is an essential part of a user's experience in many application domains requiring a high degree of interactivity. However, researchers and practitioners often observe that basic implementations of virtual walking, in which head-tracked movements are mapped isometrically to a VE, are not perceived as entirely natural. Instead, users judge a virtual walking velocity as more natural when it is slightly increased compared to the user's physical body movement. In this article, we investigate the effects of such nonisometric mappings between physical movements and virtual motions in the VE on walking velocity and the biomechanics of the gait cycle. To this end, we performed an experiment in which we measured and analyzed parameters of the biomechanics of walking under conditions with isometric as well as nonisometric mappings. Our results show significant differences in most gait parameters when walking in the VE in the isometric mapping condition compared to the corresponding parameters in the real world. For nonisometric mappings, we found an increased divergence of gait parameters depending on the velocity of visual self-motion feedback. The results revealed a symmetrical effect of gait detriments for up- or down-scaled virtual velocities, which we discuss in the context of previous findings.
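The nonisometric mapping described above can be sketched as a simple per-frame scaling of the head-tracked displacement. This is an illustrative sketch, not the paper's implementation; the function name and gain value are assumptions.

```python
# Hypothetical sketch of a nonisometric walking mapping: the user's
# head-tracked per-frame displacement is scaled by a translation gain
# before being applied to the virtual camera.

def map_walking(physical_delta, gain=1.0):
    """Scale a per-frame physical displacement (metres) by a gain.

    gain == 1.0 is the isometric mapping; gain > 1.0 makes virtual
    self-motion slightly faster than the physical body movement,
    which users tend to judge as more natural.
    """
    return tuple(gain * d for d in physical_delta)

# Example: a gain of 1.2 speeds up virtual self-motion by 20%.
virtual_step = map_walking((0.05, 0.0, 0.70), gain=1.2)
```

With gain = 1.0 the mapping is isometric; the experiment compares gait parameters under such isometric and nonisometric conditions.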


IEEE Transactions on Visualization and Computer Graphics | 2017

Bending the Curve: Sensitivity to Bending of Curved Paths and Application in Room-Scale VR

Eike Langbehn; Paul Lubos; Gerd Bruder; Frank Steinicke

Redirected walking (RDW) promises to allow near-natural walking in an infinitely large virtual environment (VE) by subtle manipulations of the virtual camera. Previous experiments analyzed the human sensitivity to RDW manipulations by focusing on the worst-case scenario, in which users walk perfectly straight ahead in the VE, whereas they are redirected on a circular path in the real world. The results showed that a physical radius of at least 22 meters is required for undetectable RDW. However, users do not always walk exactly straight in a VE. So far, it has not been investigated how much a physical path can be bent in situations in which users walk a virtual curved path instead of a straight one. Such curved walking paths can often be observed, for example, when users walk on virtual trails, through bent corridors, or when circling around obstacles. In such situations the question is not whether the physical path can be bent, but how much the bending of the physical path may vary from the bending of the virtual path. In this article, we analyze this question and present redirection by means of bending gains that describe the discrepancy between the bending of curved paths in the real and virtual environment. Furthermore, we report psychophysical experiments in which we analyzed the human sensitivity to these gains. The results reveal encouragingly wider detection thresholds than for straightforward walking. Based on our findings, we discuss the potential of curved walking and present a first approach to leverage bent paths in a way that can provide undetectable RDW manipulations even in room-scale VR.
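A minimal sketch of the bending-gain idea, assuming "bending" is measured as path curvature (the inverse of the path radius); the helper name and the gain-to-curvature formulation are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch: treat "bending" as path curvature (1 / radius)
# and define a bending gain as the ratio of real to virtual curvature.
# A gain of 1 means the physical path bends exactly like the virtual one.

def physical_radius(virtual_radius, bending_gain):
    """Radius of the real-world arc the user actually walks when a
    virtual path of `virtual_radius` is redirected by `bending_gain`."""
    virtual_curvature = 1.0 / virtual_radius
    real_curvature = bending_gain * virtual_curvature
    return 1.0 / real_curvature

# Example: a virtual path with a 4 m radius and a gain of 2 maps onto
# a 2 m physical arc -- small enough for a room-scale tracking space.
r = physical_radius(4.0, 2.0)  # -> 2.0
```

The psychophysical experiments then estimate how far the gain can deviate from 1 before users detect the discrepancy.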


Virtual Reality Software and Technology | 2016

Visual blur in immersive virtual environments: does depth of field or motion blur affect distance and speed estimation?

Eike Langbehn; Tino Raupp; Gerd Bruder; Frank Steinicke; Benjamin Bolte; Markus Lappe

It has been known for decades that users tend to significantly underestimate or overestimate distances or speeds in immersive virtual environments (IVEs) compared to corresponding judgments in the real world. Although several factors have been identified in the past that could explain small portions of this effect, the main causes of these perceptual discrepancies still remain elusive. One of the factors that has received less attention in the literature is the amount of blur present in the visual imagery, for example, when using a head-mounted display (HMD). In this paper, we analyze the impact of the visual blur effects depth-of-field and motion blur in terms of their effects on distance and speed estimation in IVEs. We conducted three psychophysical experiments in which we compared distance or speed estimation between the real world and IVEs with different levels of depth-of-field or motion blur. Our results indicate that the amount of blur added to the visual stimuli had no noticeable influence on distance and speed estimation even when high magnitudes of blur were shown. Our findings suggest that the human perceptual system is highly capable of extracting depth and motion information regardless of blur, and imply that blur can likely be ruled out as the main cause of these misperception effects in IVEs.


Symposium on Spatial User Interaction | 2016

Subliminal Reorientation and Repositioning in Virtual Reality During Eye Blinks

Eike Langbehn; Gerd Bruder; Frank Steinicke

Locomotion is one of the most basic interactions in Immersive Virtual Environments (IVEs), and real walking is the most natural user interface for it. Obviously, this technique is limited by the available physical space. Redirected Walking (RDW) aims to overcome this issue by subliminally redirecting the user inside the physical space. Traditional RDW algorithms need a circle with a radius of 22 m to allow the user to explore an infinite virtual world. Because this is still too large to fit into a room-scale setup, detection thresholds and algorithms have to be optimized. Bolte et al. already examined reorientation and repositioning during saccades and showed that subtle manipulation is possible. In this poster, we describe how we investigated reorientation and repositioning of the user in the virtual world during eye blinks. Furthermore, we present an experimental setup for evaluating detection thresholds of reorientation and repositioning during eye blinks, and we report first impressions of perception and usability.


IEEE Virtual Reality Conference | 2017

Application of redirected walking in room-scale VR

Eike Langbehn; Paul Lubos; Gerd Bruder; Frank Steinicke

Redirected walking (RDW) promises to allow near-natural walking in an infinitely large virtual environment (VE) by subtle manipulations of the virtual camera. Previous experiments showed that a physical radius of at least 22 meters is required for undetectable RDW. However, we found that it is possible to decrease this radius and to apply RDW to room-scale VR, i.e., up to approximately 5m × 5m. This is done by using curved paths in the VE instead of straight paths, and by coupling them together in a way that enables continuous walking. Furthermore, the corresponding paths in the real world are laid out in a way that fits perfectly into room-scale VR. In this research demo, users can experience RDW in a room-scale head-mounted display VR setup and explore a VE of approximately 25m × 25m.


Virtual Reality Software and Technology | 2016

Immersive remote grasping: realtime gripper control by a heterogenous robot control system

Dennis Krupke; Lasse Einig; Eike Langbehn; Jianwei Zhang; Frank Steinicke

Current developments in the field of user interface (UI) technologies as well as robotic systems provide enormous potential to reshape the future of human-robot interaction (HRI) and collaboration. However, the design of reliable, intuitive and comfortable user interfaces is a challenging task. In this paper, we focus on one important aspect of such interfaces, i.e., teleoperation. We explain how to set up a heterogeneous, extensible and immersive system for controlling a distant robotic system via the network. To this end, we exploit current technologies from the area of virtual reality (VR) and the Unity3D game engine in order to provide natural user interfaces for teleoperation. Regarding robot control, we use the well-known robot operating system (ROS) and apply its freely available modular components. The contribution of this work lies in the implementation of a flexible immersive grasping control system using a network layer (ROSbridge) between Unity3D and ROS for arbitrary robotic hardware.
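The ROSbridge network layer mentioned above exchanges JSON messages over a WebSocket. The following sketch builds such a "publish" message; the topic name and the gripper message payload are hypothetical placeholders, not taken from the paper's system.

```python
import json

# Minimal sketch of the rosbridge wire format used to couple a client
# such as Unity3D with ROS: rosbridge accepts JSON operations over a
# WebSocket. The topic name and payload below are illustrative only.

def make_gripper_command(opening, topic="/gripper/command"):
    """Build a rosbridge 'publish' message carrying a gripper opening
    value (0.0 = closed, 1.0 = fully open)."""
    return json.dumps({
        "op": "publish",         # rosbridge operation
        "topic": topic,          # ROS topic to publish on
        "msg": {"data": opening} # message payload forwarded to ROS
    })

# The resulting string would be sent over the WebSocket connection
# to the rosbridge server running alongside ROS.
payload = make_gripper_command(0.5)
```

In the actual system, a ROS node subscribed to the topic would translate such messages into commands for the concrete gripper hardware.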


International Conference on Computer Graphics and Interactive Techniques | 2018

In the Blink of an Eye: Leveraging Blink-Induced Suppression for Imperceptible Position and Orientation Redirection in Virtual Reality

Eike Langbehn; Frank Steinicke; Markus Lappe; Gregory F. Welch; Gerd Bruder

Immersive computer-generated environments (aka virtual reality, VR) are limited by the physical space around them, e.g., enabling natural walking in VR is only possible by perceptually-inspired locomotion techniques such as redirected walking (RDW). We introduce a completely new approach to imperceptible position and orientation redirection that takes advantage of the fact that even healthy humans are functionally blind for circa ten percent of the time under normal circumstances due to motor processes preventing light from reaching the retina (such as eye blinks) or perceptual processes suppressing degraded visual information (such as blink-induced suppression). During such periods of missing visual input, change blindness occurs, which denotes the inability to perceive a visual change such as the motion of an object or self-motion of the observer. We show that this phenomenon can be exploited in VR by synchronizing the computer graphics rendering system with the human visual processes for imperceptible camera movements, in particular to implement position and orientation redirection. We analyzed human sensitivity to such visual changes with detection thresholds, which revealed that commercial off-the-shelf eye trackers and head-mounted displays suffice to translate a user by circa 4–9 cm and rotate the user by circa 2–5 degrees in any direction, which could be accumulated each time the user blinks. Moreover, we show the potential for RDW, whose performance could be improved by approximately 50% when using our technique.
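The per-blink budgets reported above can be applied as a simple clamp on the desired camera offset. This is a hedged sketch using the conservative lower bounds of the reported thresholds; the function names and the clamping strategy are assumptions, not the paper's implementation.

```python
# Hypothetical sketch of redirection during an eye blink: clamp the
# desired camera offset to conservative per-blink budgets derived from
# the reported detection thresholds (circa 4-9 cm translation and
# 2-5 degrees rotation per blink).

MAX_TRANSLATION_M = 0.04  # conservative per-blink translation budget
MAX_ROTATION_DEG = 2.0    # conservative per-blink rotation budget

def clamp(value, limit):
    return max(-limit, min(limit, value))

def on_blink(desired_offset_m, desired_rotation_deg):
    """Return the (translation, rotation) actually applied this blink."""
    return (clamp(desired_offset_m, MAX_TRANSLATION_M),
            clamp(desired_rotation_deg, MAX_ROTATION_DEG))

# A request for 10 cm / 6 degrees is capped; the remainder would be
# accumulated over subsequent blinks.
applied = on_blink(0.10, 6.0)  # -> (0.04, 2.0)
```

Because blinks occur frequently, repeatedly applying such capped offsets accumulates substantial redirection over time, which is what improves RDW performance.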


IEEE Transactions on Visualization and Computer Graphics | 2018

Detection Thresholds for Rotation and Translation Gains in 360° Video-Based Telepresence Systems

Jingxin Zhang; Eike Langbehn; Dennis Krupke; Nicholas Katzakis; Frank Steinicke

Telepresence systems have the potential to overcome limits and distance constraints of the real world by enabling people to remotely visit and interact with each other. However, current telepresence systems usually lack natural ways of supporting interaction and exploration of remote environments (REs). In particular, single webcams for capturing the RE provide only a limited illusion of spatial presence, and movement control of mobile platforms in today's telepresence systems is often restricted to simple interaction devices. One of the main challenges of telepresence systems is to allow users to explore a RE in an immersive, intuitive and natural way, e.g., by real walking in the user's local environment (LE), and thus controlling motions of the robot platform in the RE. However, the LE in which the user's motions are tracked usually provides a much smaller interaction space than the RE. In this context, redirected walking (RDW) is a very suitable approach to solve this problem. However, so far no previous work has explored whether and how RDW can be used in video-based 360° telepresence systems. In this article, we conducted two psychophysical experiments in which we quantified how much humans can be unknowingly redirected on virtual paths in the RE that differ from the physical paths they actually walk in the LE. Experiment 1 introduces a discrimination task between local and remote translations, and in Experiment 2 we analyzed the discrimination between local and remote rotations. In Experiment 1, participants performed straightforward translations in the LE that were mapped to straightforward translations in the RE shown as 360° videos, which were manipulated by different gains. Then, participants had to estimate if the remotely perceived translation was faster or slower than the actual physically performed translation. Similarly, in Experiment 2, participants performed rotations in the LE that were mapped to the virtual rotations in a 360° video-based RE to which we applied different gains. Again, participants had to estimate whether the remotely perceived rotation was smaller or larger than the actual physically performed rotation. Our results show that participants are not able to reliably discriminate the difference between physical motion in the LE and the virtual motion from the 360° video RE when virtual translations are down-scaled by 5.8% and up-scaled by 9.7%, and virtual rotations are about 12.3% less or 9.2% more than the corresponding physical rotations in the LE.
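The reported thresholds define the range of gains that a telepresence system can apply without users noticing. A small sketch of such a check follows; the helper name and interval representation are illustrative assumptions.

```python
# Sketch of using the reported detection thresholds to pick unnoticeable
# gains in a 360-degree video telepresence setup: translations can be
# scaled between roughly -5.8% and +9.7%, rotations between roughly
# -12.3% and +9.2%, before users reliably notice the discrepancy.

TRANSLATION_RANGE = (1.0 - 0.058, 1.0 + 0.097)  # (0.942, 1.097)
ROTATION_RANGE = (1.0 - 0.123, 1.0 + 0.092)     # (0.877, 1.092)

def is_undetectable(gain, kind="translation"):
    """True if the gain stays inside the reported detection thresholds."""
    lo, hi = TRANSLATION_RANGE if kind == "translation" else ROTATION_RANGE
    return lo <= gain <= hi

# A 5% translation speed-up stays within the undetectable range,
# whereas a 10% slow-down would likely be noticed.
ok = is_undetectable(1.05)
```

A gain of 1.0 is the unmanipulated mapping; values inside these intervals leave headroom for redirecting the robot platform in the RE.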


IEEE Virtual Reality Conference | 2017

Biomechanical analysis of (non-)isometric virtual walking of older adults

Omar Janeh; Eike Langbehn; Frank Steinicke; Gerd Bruder; Alessandro Gulberti; Monika Poetter-Nerger

Our study investigates the effects of (non-)isometric mappings between physical movements and virtual motions in the virtual environment (VE) on the walking biomechanics of older adults. Three primary domains (pace, base of support and phase) of spatio-temporal and temporo-phasic parameters were used to evaluate gait performance. Our results show comparable values in the pace and phasic domains when older adults walk in the VE in the isometric mapping condition compared to the corresponding parameters in the real world. We found significant differences in base of support for our user group between walking in the VE and the real world. For non-isometric mappings, we found an increased divergence of gait parameters in all domains correlating with the up- or down-scaled velocity of visual self-motion feedback.


Symposium on 3D User Interfaces | 2017

Influence of avatar appearance on presence in social VR

Paul Heidicker; Eike Langbehn; Frank Steinicke

Collaboration


Dive into Eike Langbehn's collaborations.

Top Co-Authors


Gerd Bruder

University of Central Florida
