Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Frank Steinicke is active.

Publication


Featured research published by Frank Steinicke.


IEEE Transactions on Visualization and Computer Graphics | 2010

Estimation of Detection Thresholds for Redirected Walking Techniques

Frank Steinicke; Gerd Bruder; Jason Jerald; Harald Frenz; Markus Lappe

In immersive virtual environments (IVEs), users can control their virtual viewpoint by moving their tracked head and walking through the real world. Usually, movements in the real world are mapped one-to-one to virtual camera motions. With redirection techniques, the virtual camera is manipulated by applying gains to user motion so that the virtual world moves differently than the real world. Thus, users can walk through large-scale IVEs while physically remaining in a reasonably small workspace. In psychophysical experiments with a two-alternative forced-choice task, we have quantified how much humans can unknowingly be redirected on physical paths that are different from the visually perceived paths. We tested 12 subjects in three different experiments: (E1) discrimination between virtual and physical rotations, (E2) discrimination between virtual and physical straightforward movements, and (E3) discrimination of path curvature. In experiment E1, subjects performed rotations with different gains, and then had to choose whether the visually perceived rotation was smaller or greater than the physical rotation. In experiment E2, subjects chose whether the physical walk was shorter or longer than the visually perceived scaled travel distance. In experiment E3, subjects estimated the path curvature when walking a curved path in the real world while the visual display showed a straight path in the virtual world. Our results show that users can be turned physically about 49 percent more or 20 percent less than the perceived virtual rotation, distances can be downscaled by 14 percent and upscaled by 26 percent, and users can be redirected on a circular arc with a radius greater than 22 m while they believe that they are walking straight.
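
The gains described in this abstract can be illustrated with a short sketch. The following Python snippet is a minimal, hypothetical illustration (not the authors' implementation; all function and parameter names are made up) of how rotation, translation, and curvature gains map one frame of tracked real-world motion onto the virtual camera. The thresholds quoted in comments are the ranges the abstract reports as going unnoticed.

```python
import math

def redirect_step(virtual_pos, virtual_yaw,
                  d_real_forward, d_real_yaw,
                  g_t=1.0, g_r=1.0, curvature_radius=None):
    """Apply one frame of redirected walking gains (illustrative sketch).

    virtual_pos      : (x, y) virtual camera position in metres
    virtual_yaw      : virtual camera heading in radians
    d_real_forward   : metres walked in the real world this frame
    d_real_yaw       : radians the head turned in the real world this frame
    g_t, g_r         : translation and rotation gains (1.0 = one-to-one)
    curvature_radius : if set, radius (m) of the real-world arc corresponding
                       to a visually straight virtual path
    """
    # Rotation gain: the virtual yaw change is the real yaw change scaled by g_r.
    virtual_yaw += g_r * d_real_yaw

    # Curvature gain: inject extra yaw proportional to distance walked, so the
    # user subconsciously compensates by walking a real arc (the abstract reports
    # radii above ~22 m as unnoticeable).
    if curvature_radius is not None:
        virtual_yaw += d_real_forward / curvature_radius

    # Translation gain: scale the walked distance along the current virtual heading
    # (the abstract reports roughly 14% down- to 26% up-scaling as unnoticeable).
    x, y = virtual_pos
    x += g_t * d_real_forward * math.cos(virtual_yaw)
    y += g_t * d_real_forward * math.sin(virtual_yaw)
    return (x, y), virtual_yaw
```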


Virtual Reality Software and Technology | 2008

Analyses of human sensitivity to redirected walking

Frank Steinicke; Gerd Bruder; Jason Jerald; Harald Frenz; Markus Lappe

Redirected walking allows users to walk through large-scale immersive virtual environments (IVEs) while physically remaining in a reasonably small workspace by intentionally injecting scene motion into the IVE. In a constant-stimuli experiment with a two-alternative forced-choice task, we quantified how much humans can unknowingly be redirected on virtual paths that differ from the paths they actually walk. Eighteen subjects were tested in four different experiments: (E1a) discrimination between virtual and physical rotation, (E1b) discrimination between two successive rotations, (E2) discrimination between virtual and physical translation, and discrimination of walking direction (E3a) without and (E3b) with start-up. In experiment E1a, subjects performed rotations to which different gains had been applied and then had to choose whether or not the visually perceived rotation was greater than the physical rotation. In experiment E1b, subjects discriminated between two successive rotations in which different gains had been applied to the physical rotation. In experiment E2, subjects chose whether they thought the physical walk was longer than the visually perceived scaled travel distance. In experiments E3a and E3b, subjects walked a straight path in the IVE that was physically bent to the left or to the right and estimated the direction of the curvature; in E3a the gain was applied immediately, whereas in E3b it was applied after a start-up of two meters. Our results show that users can be turned physically about 68% more or 10% less than the perceived virtual rotation, distances can be up- or down-scaled by 22%, and users can be redirected on a circular arc with a radius greater than 24 meters while they believe they are walking straight.
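
As a rough illustration of how detection thresholds can be read off two-alternative forced-choice data like that described above, the sketch below fits a logistic psychometric function to the proportion of "virtual was greater" answers per gain and returns the gains at the 25% and 75% response levels, a common threshold criterion in this literature. The data and function names are made up for the example; this is not the paper's analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(g, g0, k):
    """Probability of answering 'virtual was greater' at gain g."""
    return 1.0 / (1.0 + np.exp(-k * (g - g0)))

def detection_thresholds(gains, p_greater):
    """Fit a psychometric curve and return the gains at which the fitted
    response rate crosses 25% and 75%."""
    (g0, k), _ = curve_fit(logistic, gains, p_greater, p0=(1.0, 10.0))
    inv = lambda p: g0 - np.log(1.0 / p - 1.0) / k
    return inv(0.25), inv(0.75)

# Made-up example data: tested gains and observed 'greater' response rates.
gains = np.array([0.6, 0.8, 1.0, 1.2, 1.4])
p_greater = np.array([0.05, 0.20, 0.55, 0.80, 0.95])
print(detection_thresholds(gains, p_greater))
```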


Symposium on 3D User Interfaces | 2009

Arch-Explore: A natural user interface for immersive architectural walkthroughs

Gerd Bruder; Frank Steinicke; Klaus H. Hinrichs

In this paper we propose the Arch-Explore user interface, which supports natural exploration of architectural 3D models at different scales in real-walking virtual reality (VR) environments such as head-mounted display (HMD) or CAVE setups. We discuss in detail how user movements can be transferred to the virtual world to enable walking through virtual indoor environments. To overcome the limited interaction space in small VR laboratory setups, we have implemented redirected walking techniques to support natural exploration of comparatively large-scale virtual models. Furthermore, the concept of virtual portals provides a means to cover long distances intuitively within architectural models. We describe the software and hardware setup and discuss the benefits of Arch-Explore.


Archive | 2013

Human Walking in Virtual Environments: Perception, Technology, and Applications

Frank Steinicke; Yon Visell; Jennifer L. Campos; Anatole Lécuyer

This book presents a survey of past and recent developments on human walking in virtual environments with an emphasis on human self-motion perception, the multisensory nature of experiences of walking, conceptual design approaches, current technologies, and applications. The use of Virtual Reality and movement simulation systems is becoming increasingly popular and more accessible to a wide variety of research fields and applications. While, in the past, simulation technologies have focused on developing realistic, interactive visual environments, it is becoming increasingly obvious that our everyday interactions are highly multisensory. Therefore, investigators are beginning to understand the critical importance of developing and validating locomotor interfaces that can allow for realistic, natural behaviours. The book aims to present an overview of what is currently understood about human perception and performance when moving in virtual environments and to situate it relative to the broader scientific and engineering literature on human locomotion and locomotion interfaces. The contents include scientific background and recent empirical findings related to biomechanics, self-motion perception, and physical interactions. The book also discusses conceptual approaches to multimodal sensing, display systems, and interaction for walking in real and virtual environments. Finally, it presents current and emerging applications in areas such as gait and posture rehabilitation, gaming, sports, and architectural design.


IEEE Virtual Reality Conference | 2012

A taxonomy for deploying redirection techniques in immersive virtual environments

Evan A. Suma; Gerd Bruder; Frank Steinicke; David M. Krum; Mark T. Bolas

Natural walking can provide a compelling experience in immersive virtual environments, but it remains an implementation challenge due to the physical space constraints imposed on the size of the virtual world. The use of redirection techniques is a promising approach that relaxes the space requirements of natural walking by manipulating the user's route in the virtual environment, causing the real-world path to remain within the boundaries of the physical workspace. In this paper, we present and apply a novel taxonomy that separates redirection techniques according to their geometric flexibility versus the likelihood that they will be noticed by users. Additionally, we conducted a user study of three reorientation techniques, which confirmed that participants were less likely to experience a break in presence when reoriented using the techniques classified as subtle in our taxonomy. Our results also suggest that reorientation with change blindness illusions may give the impression of exploring a more expansive environment than continuous rotation techniques, but at the cost of negatively impacting spatial knowledge acquisition.
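
To make the two axes mentioned in the abstract concrete, here is a minimal sketch (hypothetical names and example entries, not the paper's formal classification) that tags redirection techniques by a coarse geometric-flexibility rating and by whether they are subtle or overt to the user.

```python
from dataclasses import dataclass
from enum import Enum

class Noticeability(Enum):
    SUBTLE = "subtle"   # unlikely to be noticed; less likely to break presence
    OVERT = "overt"     # clearly perceptible to the user

@dataclass
class RedirectionTechnique:
    name: str
    geometric_flexibility: int      # coarse 1-5 rating, for illustration only
    noticeability: Noticeability

# Hypothetical example entries, not the paper's classification table.
techniques = [
    RedirectionTechnique("curvature gain", 2, Noticeability.SUBTLE),
    RedirectionTechnique("change blindness reorientation", 4, Noticeability.SUBTLE),
    RedirectionTechnique("continuous rotation reorientation", 5, Noticeability.OVERT),
]

subtle = [t.name for t in techniques if t.noticeability is Noticeability.SUBTLE]
print(subtle)
```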


IEEE Transactions on Visualization and Computer Graphics | 2012

Geometric Calibration of Head-Mounted Displays and its Effects on Distance Estimation

Falko Kellner; Benjamin Bolte; Gerd Bruder; Ulrich Rautenberg; Frank Steinicke; Markus Lappe; Reinhard Koch

Head-mounted displays (HMDs) allow users to observe virtual environments (VEs) from an egocentric perspective. However, several experiments have provided evidence that egocentric distances are perceived as compressed in VEs relative to the real world. Recent experiments suggest that the virtual view frustum set for rendering the VE has an essential impact on the user's estimation of distances. In this article we analyze whether distance estimation can be improved by calibrating the view frustum for a given HMD and user. Unfortunately, in an immersive virtual reality (VR) environment, a full per-user calibration is not trivial, and manual per-user adjustment often leads to minification or magnification of the scene. Therefore, we propose a novel per-user calibration approach with optical see-through displays commonly used in augmented reality (AR). This calibration takes advantage of a geometric scheme based on 2D point-3D line correspondences, which can be used intuitively by inexperienced users and requires less than a minute to complete. The required user interaction is based on taking aim at a distant target marker with a close marker, which ensures non-planar measurements covering a large area of the interaction space while also reducing the number of required measurements to five. We found a tendency for a calibrated view frustum to reduce the average distance underestimation of users in an immersive VR environment, but even the correctly calibrated view frustum could not entirely compensate for the distance underestimation effects.
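
Since the outcome of the calibration described above is a per-user view frustum, the sketch below shows how a (possibly asymmetric) frustum is commonly turned into an OpenGL-style off-axis projection matrix. The extents used in the example are placeholders, not results from the paper.

```python
import numpy as np

def frustum_matrix(left, right, bottom, top, near, far):
    """Build an OpenGL-style off-axis projection matrix from frustum extents
    measured on the near plane (all values in metres)."""
    return np.array([
        [2*near/(right-left), 0.0,                 (right+left)/(right-left),  0.0],
        [0.0,                 2*near/(top-bottom), (top+bottom)/(top-bottom),  0.0],
        [0.0,                 0.0,                 -(far+near)/(far-near),     -2*far*near/(far-near)],
        [0.0,                 0.0,                 -1.0,                        0.0],
    ])

# Placeholder frustum for one eye of an HMD; a per-user calibration would
# replace these extents with measured values.
P = frustum_matrix(left=-0.08, right=0.10, bottom=-0.09, top=0.09,
                   near=0.1, far=100.0)
print(P)
```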


Applied Perception in Graphics and Visualization | 2009

Transitional environments enhance distance perception in immersive virtual reality systems

Frank Steinicke; Gerd Bruder; Klaus H. Hinrichs; Markus Lappe; Brian Ries; Victoria Interrante

Several experiments have provided evidence that egocentric distances are perceived as compressed in immersive virtual environments relative to the real world. The principal factors responsible for this phenomenon have remained largely unknown. However, recent experiments suggest that when the virtual environment (VE) is an exact replica of a user's real physical surroundings, the person's distance perception improves. Furthermore, it has been shown that when users start their virtual reality (VR) experience in such a virtual replica and then gradually transition to a different VE, their sense of presence in the actual virtual world increases significantly. In this case the virtual replica serves as a transitional environment between the real and virtual world. In this paper we examine whether a person's distance estimation skills can be transferred from a transitional environment to a different VE. We have conducted blind walking experiments to analyze whether starting the VR experience in a transitional environment can improve a person's ability to estimate distances in an immersive VR system. We found that users significantly improve their distance estimation skills when they enter the virtual world via a transitional environment.


Applied Perception in Graphics and Visualization | 2008

Sensitivity to scene motion for phases of head yaws

Jason Jerald; Tabitha C. Peck; Frank Steinicke

In order to better understand how scene motion is perceived in immersive virtual environments and to provide guidelines for designing more usable systems, we measured sensitivity to scene motion for different phases of quasi-sinusoidal head yaw motions. We measured and compared scene-velocity thresholds for nine subjects across three conditions: visible With head rotation (W), where the scene is presented during the center part of sinusoidal head yaws and moves in the same direction the head is rotating; visible Against head rotation (A), where the scene is presented during the center part of sinusoidal head yaws and moves in the opposite direction the head is rotating; and visible at the Edge of head rotation (E), where the scene is presented at the extreme of sinusoidal head yaws and moves during the time that head direction changes. The W condition had a significantly higher threshold (decreased sensitivity) than both the E and A conditions. The median threshold for the W condition was 2.1 times the A condition and 1.5 times the E condition. We did not find a significant difference between the E and A conditions, although there was a trend for the A thresholds to be less than the E thresholds. An equivalence test showed the A and E thresholds to be statistically equivalent. Our results suggest the phase of a user's head yaw should be taken into account when inserting additional scene motion into immersive virtual environments if one does not want users to perceive that motion. In particular, there is much more latitude for artificially and imperceptibly rotating a scene, as in Razzaque's redirected walking technique, in the same direction as head yaw than against the direction of yaw. The implication for maximum end-to-end latency in a head-mounted display is that users are less likely to notice latency when beginning a head yaw (when the scene moves with the head) than when slowing down a head yaw (when the scene moves against the head) or when changing head direction (when the head is nearly still and scene motion due to latency is maximized).
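
The guideline in the final sentences can be sketched as a simple per-frame gate: allow a larger injected scene rotation when it moves with the ongoing head yaw than when it moves against it. The constants below are placeholders chosen only to reflect the reported roughly 2:1 asymmetry; they are not values from the paper and the names are hypothetical.

```python
# Placeholder budgets (deg/s) for imperceptible injected scene rotation.
WITH_HEAD_DEG_PER_SEC = 2.0      # more latitude when moving with the head yaw
AGAINST_HEAD_DEG_PER_SEC = 1.0   # less latitude when moving against the head yaw

def injectable_rotation(head_yaw_velocity, desired_injection, dt):
    """Clamp the desired injected scene rotation (deg) for a frame of length dt (s),
    depending on whether it moves with or against the current head yaw."""
    same_direction = head_yaw_velocity * desired_injection > 0.0
    rate = WITH_HEAD_DEG_PER_SEC if same_direction else AGAINST_HEAD_DEG_PER_SEC
    budget = rate * dt
    return max(-budget, min(budget, desired_injection))

# Example: the head yaws at +30 deg/s and we would like to inject +0.1 deg this frame.
print(injectable_rotation(head_yaw_velocity=30.0, desired_injection=0.1, dt=1/90))
```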


Eurographics | 2008

Stroke-based transfer function design

Timo Ropinski; Jörg-Stefan Praßni; Frank Steinicke; Klaus H. Hinrichs

In this paper we propose a user interface for the design of 1D transfer functions. The user can select a feature of interest by drawing one or more strokes directly onto the volume rendering near its silhouette. Based on the stroke(s), our algorithm performs a histogram analysis in order to identify the desired feature in histogram space. Once the feature of interest has been identified, we automatically generate a component transfer function, which associates optical properties with the previously determined intervals in the domain of the data values. By supporting direct interaction techniques, which are performed in the image domain, the transfer function design becomes more intuitive compared to the specification performed in the histogram domain. To be able to modify and combine the previously generated component transfer functions conveniently, we propose a user interface, which has been inspired by the layer mechanism commonly found in image processing software. With this user interface, the optical properties assigned through a component function can be altered, and the component functions to be combined into a final transfer function can be selected.
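
As a rough illustration of the component transfer functions mentioned above, the sketch below builds a 1D trapezoidal component over an intensity interval, as such an interval might be identified by the stroke-based histogram analysis, and assigns it a colour and opacity. All names and values are hypothetical and this is not the paper's implementation.

```python
def component_tf(lo, hi, color, max_opacity, ramp=0.1):
    """Return a 1D transfer function for the intensity interval [lo, hi]:
    opacity ramps up over the first `ramp` fraction of the interval, holds at
    max_opacity, and ramps back down; intensities outside are transparent."""
    width = hi - lo
    def tf(intensity):
        if intensity < lo or intensity > hi:
            return color, 0.0
        t = (intensity - lo) / width
        if t < ramp:
            alpha = max_opacity * (t / ramp)
        elif t > 1.0 - ramp:
            alpha = max_opacity * ((1.0 - t) / ramp)
        else:
            alpha = max_opacity
        return color, alpha
    return tf

# Hypothetical component for a feature found between normalized intensities 0.35 and 0.55.
vessel = component_tf(0.35, 0.55, color=(1.0, 0.2, 0.2), max_opacity=0.8)
print(vessel(0.45))   # -> ((1.0, 0.2, 0.2), 0.8)
```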


Smart Graphics | 2006

Visually Supporting Depth Perception in Angiography Imaging

Timo Ropinski; Frank Steinicke; Klaus H. Hinrichs

In this paper we propose interactive visualization techniques which support the spatial comprehension of angiogram images by emphasizing depth information and introducing combined depth cues. In particular, we propose a depth-based color encoding, two variations of edge enhancement, and the application of a modified depth-of-field effect in order to enhance depth perception of complex blood vessel systems. All proposed techniques have been developed to improve human depth perception and have been adapted with special consideration of the spatial comprehension of blood vessel structures. To evaluate the presented techniques, we have conducted a user study in which users had to accomplish certain depth perception tasks.
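
The depth-based colour encoding can be illustrated with a minimal sketch (hypothetical colours and names, not the paper's exact mapping): each vessel fragment's normalized depth interpolates between a "near" and a "far" colour so that relative depth becomes visible at a glance.

```python
def depth_color(depth, near, far,
                near_color=(1.0, 0.1, 0.1),   # hypothetical: warm for close vessels
                far_color=(0.1, 0.1, 1.0)):   # hypothetical: cool for distant vessels
    """Linearly interpolate a colour from the fragment's depth within [near, far]."""
    t = min(max((depth - near) / (far - near), 0.0), 1.0)
    return tuple((1.0 - t) * n + t * f for n, f in zip(near_color, far_color))

# Example: a vessel point halfway through the depth range gets a blended colour,
# roughly (0.55, 0.1, 0.55).
print(depth_color(0.5, near=0.0, far=1.0))
```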

Collaboration


Dive into Frank Steinicke's collaboration.

Top Co-Authors

Gerd Bruder

University of Central Florida