
Publications


Featured research published by Daniel C. Zikovitz.


Experimental Brain Research | 2000

Visual and non-visual cues in the perception of linear self-motion.

Laurence R. Harris; Michael Jenkin; Daniel C. Zikovitz

Surprisingly little is known of the perceptual consequences of visual or vestibular stimulation in updating our perceived position in space as we move around. We assessed the roles of visual and vestibular cues in determining the perceived distance of passive, linear self-motion. Subjects were given cues to constant-acceleration motion: optic flow presented in a virtual reality display, physical motion in the dark, or combinations of visual and physical motion. Subjects indicated when they perceived they had traversed a distance that had previously been given to them either visually or physically. The perceived distance of motion evoked by optic flow was accurate relative to a previously presented visual target but was perceptually equivalent to about half the physical motion. The perceived distance of physical motion in the dark was accurate relative to a previously presented physical motion but was perceptually equivalent to a much longer visually presented distance. When both visual and physical cues were present, the perceived distance of self-motion was perceptually closer to the physical motion experienced than to the simultaneous visual motion, even when the target was presented visually. We discuss this dominance of the physical cues in determining the perceived distance of self-motion in terms of capture by non-visual cues. These findings are related to emerging studies that show the importance of vestibular input to neural mechanisms that process self-motion.
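
As a rough illustration of these cross-modal gains, the sketch below models perceived distance as a simple linear gain per modality. The gain values and names are illustrative assumptions, not figures fitted by the paper; the abstract says only that optic flow was perceptually equivalent to about half the physical motion.

    # Toy model of the cross-modal gains described above. The gain values
    # are assumptions for illustration, not fitted values from the paper.
    def perceived_distance(actual_m: float, gain: float) -> float:
        """Perceived travel distance as a simple gain on actual distance."""
        return gain * actual_m

    VISUAL_GAIN = 0.5    # assumed: optic flow 'worth' about half the physical motion
    PHYSICAL_GAIN = 1.0  # physical motion taken as the reference modality

    # Visual distance a subject would need to see to feel equivalent to
    # 10 m of previously presented physical motion:
    target_m = 10.0
    print(target_m * PHYSICAL_GAIN / VISUAL_GAIN)  # -> 20.0 m of optic flow

On this reading, a visually specified distance must be roughly doubled before it feels equivalent to a physically travelled one, consistent with the visual-target trials reported above.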


Virtual Reality | 2002

Simulating Self-Motion I: Cues for the Perception of Motion

Laurence R. Harris; Michael Jenkin; Daniel C. Zikovitz; Fara Redlick; Philip Jaekl; Urszula Jasiobedzka; Heather L. Jenkin; Robert S. Allison

When people move, there are many visual and non-visual cues that can inform them about their movement. Simulating self-motion in a virtual reality environment thus needs to take these non-visual cues into account in addition to the normal high-quality visual display. Here we examine the contribution of visual and non-visual cues to our perception of self-motion. The perceived distance of self-motion can be estimated from the visual flow field, physical forces, or the act of moving. On its own, passive visual motion is a very effective cue to self-motion, and evokes a perception of self-motion that is related to the actual motion in a way that varies with acceleration. Passive physical motion turns out to be a particularly potent self-motion cue: not only does it evoke an exaggerated sensation of motion, but it also tends to dominate other cues.


IEEE Virtual Reality Conference | 2002

Perceptual stability during head movement in virtual reality

Philip Jaekl; Robert S. Allison; Laurence R. Harris; Urszula Jasiobedzka; Heather L. Jenkin; Michael Jenkin; James E. Zacher; Daniel C. Zikovitz

Virtual reality displays introduce spatial distortions that are very hard to correct because of the difficulty of precisely modelling the camera from the nodal point of each eye. How significant are these distortions for spatial perception in virtual reality? In this study, we used a helmet-mounted display and a mechanical head tracker to investigate the tolerance to errors between head motions and the resulting visual display. The relationship between the head movement and the associated updating of the visual display was adjusted by subjects until the image was judged as stable relative to the world. Both rotational and translational movements were tested, and the relationship between the movements and the direction of gravity was varied systematically. Typically, for the display to be judged as stable, subjects needed the visual world to be moved in the opposite direction to the head movement by an amount greater than the head movement itself, during both rotational and translational head movements, although a large range of movement was tolerated and judged as appearing stable. These results suggest that it is not necessary to model the visual geometry accurately, and they suggest circumstances in which tracker drift can be corrected by jumps in the display that will pass unnoticed by the user.
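
The manipulation can be pictured as an adjustable gain between tracked head motion and the counter-motion applied to the displayed scene. The sketch below is a minimal rendition of that idea under my own naming, not the authors' code; a gain of 1.0 is the geometrically correct update, and the finding is that gains somewhat above 1.0 were typically judged stable.

    # Minimal sketch of a gain between head motion and the compensating
    # motion of the virtual world (names and structure are assumptions).
    def scene_update(head_motion: float, gain: float = 1.0) -> float:
        """Counter-motion applied to the virtual world so that it appears
        stationary while the head moves by `head_motion` (degrees or metres)."""
        return -gain * head_motion

    # Example: with a gain of 1.2, a 10 degree head rotation counter-rotates
    # the displayed world by 12 degrees in the opposite direction.
    print(scene_update(10.0, gain=1.2))  # -> -12.0

The practical point of the result is the tolerance band: because a range of gains around the preferred value still looks stable, small corrective jumps, for instance to cancel tracker drift, can be folded into head movements without the user noticing.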


IEEE Virtual Reality Conference | 1999

Vestibular cues and virtual environments: choosing the magnitude of the vestibular cue

Laurence R. Harris; Michael Jenkin; Daniel C. Zikovitz

The design of virtual environments usually concentrates on constructing a realistic visual simulation and ignores the non-visual cues normally associated with moving through an environment. The lack of the normal complement of cues may contribute to cybersickness and may affect operator performance. Previously (1998) we described the effect of adding vestibular cues during passive linear motion and showed an unexpected dominance of the vestibular cue in determining the magnitude of the perceived motion. Here we vary the relative magnitude of the visual and vestibular cues and describe a simple linear summation model that predicts the resulting perceived magnitude of motion. The model suggests that designers of virtual reality displays should add vestibular information in a ratio of one to four with the visual motion to obtain convincing and accurate performance.
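
The linear summation model can be written as a weighted sum of the two cues. The sketch below is one way to express it; the gain values are illustrative assumptions chosen so that the arithmetic matches the suggested one-to-four ratio, not parameters reported in the paper.

    # Illustrative linear summation of visual and vestibular motion cues.
    # The gains below are assumptions: visual motion is under-perceived
    # (gain < 1) and vestibular motion is over-perceived (gain > 1).
    def perceived_motion(visual_m: float, vestibular_m: float,
                         g_visual: float = 0.5, g_vestibular: float = 2.0) -> float:
        """Perceived magnitude of self-motion as a weighted linear sum."""
        return g_visual * visual_m + g_vestibular * vestibular_m

    # The suggested 1:4 vestibular-to-visual ratio: pairing 4 m of visual
    # motion with 1 m of physical motion yields an accurate 4 m percept.
    print(perceived_motion(visual_m=4.0, vestibular_m=1.0))  # -> 4.0

With these assumed gains, dropping the physical cue leaves only 2 m of perceived motion for a 4 m visual motion, which is why the model calls for topping up the display with a small physical movement.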


IEEE Virtual Reality Conference | 1998

Vestibular cues and virtual environments

Laurence R. Harris; Michael Jenkin; Daniel C. Zikovitz

The vast majority of virtual environments concentrate on constructing a realistic visual simulation while ignoring non-visual environmental cues. Although these missing cues can to some extent be ignored by an operator, the lack of appropriate cues may contribute to cybersickness and may affect operator performance. We examine the role of vestibular cues to self-motion in an operator's sense of self-motion within a virtual environment. We show that the presence of vestibular cues has a very significant effect on an operator's estimate of self-motion. The addition of vestibular cues, however, is not always beneficial.


IEEE International Workshop on Haptic Audio Visual Environments and Games | 2007

Auditory Motion Perception Threshold

Bill Kapralos; Daniel C. Zikovitz; Saad Khattak

Despite the fact that we are capable of localizing dynamic sound sources, the mechanisms responsible for this are not completely understood, since the majority of sound localization research has focused on static environments where the sound source and listener are both stationary. Although various auditory cues can provide motion information, intensity changes appear to be the dominant cue. Previous studies indicate that our perception of auditory motion is greatly overestimated when intensity cues are used alone. In this paper we describe an experiment that examines our perception of auditory motion in the presence of a stationary sound source whose intensity decreases following one of five rates of acceleration. Preliminary results indicate that, in order to detect a change in sound source intensity, this change must occur over a constant amount of time irrespective of the rate of acceleration that the decreasing intensity follows.
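
One plausible way to generate such stimuli, and this is my reading rather than the paper's stated method, is to attenuate a stationary source according to the inverse-square law for a listener receding at constant acceleration. The sketch below uses invented starting distances and accelerations purely to show the computation.

    import math

    # Level change over time for a (simulated) listener receding from a
    # source at constant acceleration, under the inverse-square law.
    # d0 and accel are invented values, not those used in the study.
    def level_change_db(t: float, d0: float = 1.0, accel: float = 0.5) -> float:
        """Level in dB relative to t = 0 for a listener starting d0 metres
        from the source and receding at `accel` m/s^2."""
        d = d0 + 0.5 * accel * t * t       # distance after t seconds
        return -20.0 * math.log10(d / d0)  # inverse-square attenuation

    for t in (0.0, 1.0, 2.0, 3.0):
        print(f"t={t:.0f} s  level={level_change_db(t):+.1f} dB")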


Archive | 2010

Simulated Day and Night Effects on Perceived Motion in an Aircraft Taxiing Simulator

Daniel C. Zikovitz; Keith K. Niall; Laurence R. Harris; Michael Jenkin

Flight simulation often depends on both visual simulation and movement of a motion base. The present study asks whether tilt is perceived as equivalent to linear translation and, if so, whether other information such as simulated day or night conditions affects accuracy. We examine the perception of self-motion in non-pilots during passive simulated airplane taxiing along a straight runway. Physical motion cueing was provided by a motion platform, and simulation scenarios were presented at a constant physical or visual acceleration of either 0.4 m/s² or 1.6 m/s² (simulated using tilt). Nine subjects indicated the moment when they perceived that they had travelled through distances of between 10 and 90 m under either day or night-time display conditions. Their estimates were made either with or without tilt (to simulate linear acceleration). We present results as the ratio of response distance to stimulus distance. Subjects' motion estimates under tilt conditions did not differ significantly from those under vision-only conditions. We found an interaction of tilt and illumination conditions, particularly for targets greater than 30 m. The ratio of response distance to stimulus distance increased significantly in the dark (1.1 vs. 0.85), at higher accelerations (1.01 for 1.6 m/s² vs. 0.95 for 0.4 m/s²) and, during daytime illumination, in the presence of a physical-motion cue (0.92 vs. 0.78). Conditions affecting the magnitude of perceived self-motion therefore include:

• illumination
• the magnitude of the simulated acceleration
• the presence of physical tilt during daytime illumination

This study shows that passive humans can be expected to make significant, predictable errors in judging taxiing distances under specific simulation conditions. Questions for further research include whether similar effects occur in pilots as in non-pilots, and whether such effects also occur in real taxiing scenarios. The results obtained here may help to counter perceptual errors, as they become part of the knowledge on which appropriate cueing schemes can be based.
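
The response measure reduces to a per-trial ratio averaged within condition. The sketch below shows the computation on invented data; the trial values are fabricated solely so the illustrative means match the night/day ratios quoted above.

    # Response-to-stimulus distance ratios per condition. The trial data
    # are invented for illustration; only the computation reflects the
    # measure described in the abstract.
    trials = [
        # (condition, stimulus_m, response_m)
        ("night", 30.0, 33.0),
        ("night", 60.0, 66.0),
        ("day",   30.0, 25.5),
        ("day",   60.0, 51.0),
    ]

    ratios: dict[str, list[float]] = {}
    for cond, stim, resp in trials:
        ratios.setdefault(cond, []).append(resp / stim)

    for cond, rs in ratios.items():
        print(cond, round(sum(rs) / len(rs), 2))  # -> night 1.1, day 0.85

A ratio above 1 means the subject travelled past the target distance before indicating arrival, i.e. the motion was perceptually underestimated.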


Ergonomics | 1999

Head tilt during driving

Daniel C. Zikovitz; Laurence R. Harris


Journal of The Audio Engineering Society | 2004

Auditory Cues in the Perception of Self Motion

Bill Kapralos; Daniel C. Zikovitz; Michael Jenkin; Laurence R. Harris


Acta Astronautica | 2005

The relative role of visual and non-visual cues in determining the perceived direction of "up": experiments in parabolic flight.

Heather L. Jenkin; Richard T. Dyde; Jim E. Zacher; Daniel C. Zikovitz; Michael Jenkin; Robert S. Allison; Ian P. Howard; Laurence R. Harris

Collaboration


Daniel C. Zikovitz's top co-authors include:


Bill Kapralos (University of Ontario Institute of Technology)
