
Publication


Featured research published by Harald Frenz.


Virtual Reality Software and Technology | 2008

Analyses of human sensitivity to redirected walking

Frank Steinicke; Gerd Bruder; Jason Jerald; Harald Frenz; Markus Lappe

Redirected walking allows users to walk through large-scale immersive virtual environments (IVEs) while physically remaining in a reasonably small workspace, by intentionally injecting scene motion into the IVE. In a constant-stimuli experiment with a two-alternative forced-choice task we quantified how much humans can unknowingly be redirected onto virtual paths that differ from the paths they actually walk. 18 subjects were tested in four different experiments: (E1a) discrimination between virtual and physical rotation, (E1b) discrimination between two successive rotations, (E2) discrimination between virtual and physical translation, and discrimination of walking direction (E3a) without and (E3b) with start-up. In experiment E1a subjects performed rotations to which different gains were applied, and then had to choose whether the visually perceived rotation was greater than the physical rotation. In experiment E1b subjects discriminated between two successive rotations to which different gains were applied. In experiment E2 subjects judged whether the physical walk was longer than the visually perceived, scaled travel distance. In experiment E3a subjects walked a straight path in the IVE which was physically bent to the left or to the right, and estimated the direction of the curvature. In experiment E3a the gain was applied immediately, whereas in experiment E3b it was applied after a start-up of two meters. Our results show that users can be turned physically about 68% more or 10% less than the perceived virtual rotation, that distances can be up- or down-scaled by 22%, and that users can be redirected on a circular arc with a radius greater than 24 meters while they believe they are walking straight.
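The detection thresholds above translate directly into gain limits for a redirected-walking controller. A minimal illustrative sketch (not the authors' implementation; the function name and the per-step formulation are assumptions) of mapping one physical step onto a virtual step while staying within those thresholds:

```python
# Detection thresholds reported in the paper (approximate):
ROT_GAIN_MAX = 1.68      # physical rotation may be ~68% larger than virtual
ROT_GAIN_MIN = 0.90      # ...or ~10% smaller
TRANS_GAIN_RANGE = 0.22  # travel distance may be scaled up/down by ~22%
MIN_CURVE_RADIUS = 24.0  # meters; arcs with larger radii go unnoticed

def redirect_step(step_length, trans_gain=1.0 + TRANS_GAIN_RANGE):
    """Map one physical step to a virtual step plus an imperceptible
    physical heading change, bending the walked path on an arc whose
    radius is at least MIN_CURVE_RADIUS."""
    curvature = 1.0 / MIN_CURVE_RADIUS        # radians per meter walked
    physical_turn = curvature * step_length   # injected heading change
    virtual_step = trans_gain * step_length   # scaled translation
    return virtual_step, physical_turn

virtual_step, physical_turn = redirect_step(1.0)  # one 1 m physical step
```

Any rotation gain chosen between ROT_GAIN_MIN and ROT_GAIN_MAX would, by the study's thresholds, likewise remain below the user's detection level.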


Vision Research | 2005

Absolute travel distance from optic flow

Harald Frenz; Markus Lappe

Optic flow fields provide rich information about the observer's self-motion. Besides estimating the direction of self-motion, human observers are also able to discriminate the travel distances of two self-motion simulations. Recent studies have shown that observers estimate the simulated ego velocity of the self-motion simulation and integrate it over time; observers thus use a 3-D percept of the ego motion through the environment. In the present work we ask whether human observers can use this 3-D percept of the motion simulation to build up an internal representation of travel distance and indicate it in a static scene. We visually simulated self-motion in different virtual environments and asked subjects to indicate the perceived distances in terms of static virtual intervals on the ground. The results show that human observers possess a static distance gauge, but that they undershoot the travel distances for short motion simulations. In further experiments we changed the modality of the distance indication, but the undershoot in distance estimation remained. This suggests that the undershoot is linked to the perception of the optic flow field.


ACM Transactions on Applied Perception | 2007

Estimation of travel distance from visual motion in virtual environments

Harald Frenz; Markus Lappe; Marina Kolesnik; Thomas Bührmann

Distance estimation of visually simulated self-motion is difficult, because one has to know or make assumptions about scene layout to judge ego speed. Discrimination of the travel distances of two sequentially simulated self-motions in the same scene can be performed quite accurately (Bremmer & Lappe, 1999; Frenz et al., 2003). However, indicating the perceived distance of a single movement in terms of a spatial interval results in a depth-scaling error: intervals are correlated with the true travel distance, but underestimate it by about 25% (Frenz & Lappe, 2005). Here we investigated whether the inclusion of further depth cues (disparity, motion parallax, figural cues) in the virtual environment allows more veridical interval adjustment. Experiments were conducted on a large single projection screen and in a fully immersive computer-animated virtual environment (CAVE). Forward movements in simple virtual environments were simulated with distances between 1.5 and 13 m at varying speeds. Subjects indicated the perceived distance of each movement in terms of a depth interval on the virtual ground plane. We found good correlation between simulated and indicated distances, indicative of an internal representation of the perceived distance. The slopes of the fitted regression lines revealed an underestimation of distance by about 25% under all conditions. We conclude that estimation of travel distance from optic flow is subject to scaling when compared to static intervals in the environment, irrespective of additional depth cues.


Vision Research | 2003

Discrimination of travel distances from situated optic flow

Harald Frenz; Frank Bremmer; Markus Lappe

Effective navigation requires knowledge of the direction of motion and of the distance traveled. Humans can use visual motion cues from optic flow to estimate the direction of self-motion. Can they also estimate travel distance from visual motion? Optic flow is ambiguous with regard to travel distance. But when the depth structure of the environment is known or can be inferred, i.e., when the flow can be calibrated to the environmental situation, distance estimation may become possible. Previous work had shown that humans can discriminate and reproduce travel distances of two visually simulated self-motions under the assumption that the environmental situation and the depth structure of the scene are the same in both motions. Here we ask which visual cues are used for distance estimation when this assumption is fulfilled. Observers discriminated distances of visually simulated self-motions in four different environments with various depth cues. Discrimination was possible in all cases, even when motion parallax was the only depth cue available. In further experiments we asked whether distance estimation is based directly on image velocity or on an estimate of observer velocity derived from image velocity and the structure of the environment. By varying the simulated height above ground, the visibility range, or the simulated gaze angle we modified visual information about the structure of the environment and altered the image velocity distribution in the optic flow. Discrimination ability remained good. We conclude that the judgment of travel distance is based on an estimate of observer speed within the simulated environment.


Experimental Brain Research | 2009

Visual estimation of travel distance during walking

Markus Lappe; Harald Frenz

The optic flow generated in the eyes during self-motion provides an important control signal for direction and speed of self-motion, and can be used to track the distance that has been traveled. The use of vision for these behavioral tasks can be studied in isolation in virtual reality setups, in which self-motion is merely simulated, and in which the visual motion can be controlled independently of other sensory cues. In such experiments it was found that the estimation of the travel distance of a simulated movement shows characteristic errors, sometimes overestimating and sometimes underestimating the true travel distance. These errors can be explained by a leaky path integration model. To test whether this model also holds for actual self-motion in the real world, we studied walking distance perception in an open field with tasks similar to those previously used in virtual environments. We show that similar errors occur in the estimation of travel distance in the real world as in virtual environments, and that they are consistent with the leaky integration model.
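The leaky-integration idea can be made concrete in a few lines. This is a hedged numerical sketch assuming the simple form dD/dt = k·v − α·D (sensory gain k, leak rate α); the paper's exact parameterisation may differ:

```python
def leaky_integrate(speeds, dt=0.01, k=1.0, alpha=0.1):
    """Accumulate instantaneous speed into perceived travel distance;
    the leak term makes long distances increasingly underestimated."""
    distance = 0.0
    for v in speeds:
        # Euler step of dD/dt = k*v - alpha*D
        distance += (k * v - alpha * distance) * dt
    return distance

# Constant 1 m/s walk for 10 s: true distance is 10 m, but the leaky
# estimate saturates below it (toward the fixed point k*v/alpha).
perceived = leaky_integrate([1.0] * 1000)
```

With these illustrative parameters the model underestimates the 10 m walk by roughly a third, qualitatively matching the distance-dependent errors described above.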


Spanish Journal of Psychology | 2006

Visual distance estimation in static compared to moving virtual scenes

Harald Frenz; Markus Lappe

Visual motion is used to control direction and speed of self-motion and time-to-contact with an obstacle. In earlier work, we found that human subjects can discriminate between the distances of different visually simulated self-motions in a virtual scene. Distance indication in terms of an exocentric interval adjustment task, however, revealed a linear correlation between perceived and indicated distances, but with a profound distance underestimation. One possible explanation for this underestimation is the perception of visual space in virtual environments. Humans perceive visual space in natural scenes as curved, and distances are increasingly underestimated with increasing distance from the observer. Such spatial compression may also exist in our virtual environment. We therefore surveyed perceived visual space in a static virtual scene. We asked observers to compare two horizontal depth intervals, similar to experiments performed in natural space. Subjects had to indicate the size of one depth interval relative to a second interval. Our observers perceived visual space in the virtual environment as compressed, similar to the perception found in natural scenes. However, the nonlinear depth function we found cannot explain the observed distance underestimation of visually simulated self-motions in the same environment.


Human Vision and Electronic Imaging | 2005

Virtual odometry from visual flow

Markus Lappe; Harald Frenz; Thomas Buehrmann; Marina Kolesnik

We investigate how visual motion registered during one's own movement through a structured world can be used to gauge travel distance. Estimating absolute travel distance from the visual flow induced in the optic array of a moving observer is problematic, because optic flow speeds co-vary with the dimensions of the environment and are thus subject to an environment-specific scale factor. Discrimination of the distances of two simulated self-motions of different speed and duration is reliably possible from optic flow, however, if the visual environment is the same for both motions, because the scale factors cancel in this case. Here, we ask whether a distance estimate obtained from optic flow can be transformed into a spatial interval in the same visual environment. Subjects viewed a simulated self-motion sequence on a large (90 by 90 deg) projection screen or in a computer-animated virtual environment (CAVE) with completely immersive, stereographic, head-yoked projection that extended 180 deg horizontally and included the floor space in front of the observer. The sequence depicted self-motion over a ground plane covered with random dots. Simulated distances ranged from 1.5 to 13 meters with variable speed and duration of the movement. After the movement stopped, the screen depicted a stationary view of the scene and two horizontal lines appeared on the ground in front of the observer. The subject had to adjust one of these lines such that the spatial interval between the lines matched the distance traveled during the movement simulation. Adjusted interval size was linearly related to simulated travel distance, suggesting that observers could obtain a measure of distance from the optic flow. The slope of the regression was 0.7; thus, subjects underestimated distance by 30%. This result was similar for stereoscopic and monoscopic conditions.
We conclude that optic flow can be used to derive an estimate of travel distance, but this estimate is subject to scaling when compared to static intervals in the environment, irrespective of stereoscopic depth cues.
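The 0.7 regression slope corresponds to an ordinary least-squares fit through the origin of indicated against simulated distance. A small sketch with made-up data (the study's actual measurements are not reproduced here):

```python
# Synthetic data only: indicated distances at 70% of the simulated ones,
# mimicking the ~30% underestimation reported above.
simulated = [1.5, 4.0, 7.0, 10.0, 13.0]   # meters
indicated = [0.7 * s for s in simulated]  # illustrative responses

# OLS slope through the origin: sum(x*y) / sum(x*x)
slope = (sum(s * i for s, i in zip(simulated, indicated))
         / sum(s * s for s in simulated))
```

For this synthetic data the fit recovers the slope exactly; with real, noisy responses the same estimator yields the ~0.7 value reported in the paper.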


IEEE Virtual Reality | 2008

A Universal Virtual Locomotion System: Supporting Generic Redirected Walking and Dynamic Passive Haptics within Legacy 3D Graphics Applications

Frank Steinicke; Timo Ropinski; Gerd Bruder; Klaus H. Hinrichs; Harald Frenz; Markus Lappe

In this paper we introduce a virtual locomotion system that allows navigation within any large-scale virtual environment (VE) by real walking. In contrast to the work of Razzaque et al. (2001), we have developed generic redirected walking concepts by combining motion compression, i.e., scaling the real distance users walk, rotation gains, which make the real turns smaller or larger, and curvature gains, which bend the user's walking direction such that s/he walks on a curve. Furthermore, we introduce the new concept of dynamic passive haptics, which extends passive haptics (Insko et al., 2001; Kohli et al., 2005) in such a way that any number of virtual objects can be sensed by means of real proxy objects having similar haptic capabilities, i.e., size, shape and surface structure. We have evaluated these concepts and explain technical details regarding their integration into legacy 3D graphics applications.


JVRB - Journal of Virtual Reality and Broadcasting | 2009

Real Walking through Virtual Environments by Redirection Techniques

Frank Steinicke; Gerd Bruder; Klaus H. Hinrichs; Jason Jerald; Harald Frenz; Markus Lappe


Journal of Vision | 2010

Travel distance estimation from optic flow

Harald Frenz; Markus Lappe

Collaboration


Dive into Harald Frenz's collaborations.

Top Co-Authors

Gerd Bruder
University of Central Florida

Jason Jerald
University of North Carolina at Chapel Hill