Publication


Featured research published by William B. Thompson.


Presence: Teleoperators & Virtual Environments | 2004

Does the quality of the computer graphics matter when judging distances in visually immersive environments?

William B. Thompson; Peter Willemsen; Amy Ashurst Gooch; Sarah H. Creem-Regehr; Jack M. Loomis; Andrew C. Beall

In the real world, people are quite accurate in judging distances to locations in the environment, at least for targets resting on the ground plane and distances out to about 20 m. Distance judgments in visually immersive environments are much less accurate. Several studies have now shown that in visually immersive environments, the world appears significantly smaller than intended. This study investigates whether or not the compression in apparent distances is the result of the low-quality computer graphics utilized in previous investigations. Visually directed triangulated walking was used to assess distance judgments in the real world and in three virtual environments with graphical renderings of varying quality.
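
The triangulated-walking response used here reduces to plane geometry: the observer walks blind along an oblique path and then turns toward the remembered target, and the indicated distance is where that final heading intersects the original line of sight. The sketch below is a hypothetical illustration of that calculation (made-up angles and path length, not the paper's data), written in Python:

    # Hypothetical illustration, not the authors' code: distance indicated by a
    # visually directed triangulated-walking response.
    import math

    def indicated_distance(walk_angle_deg, walk_length_m, final_heading_deg):
        """Angles are measured from the initial gaze direction (+Y), positive to
        the observer's right. The final heading must point back toward the
        original line of sight (opposite sign to the walk angle)."""
        a = math.radians(walk_angle_deg)
        h = math.radians(final_heading_deg)
        # Position reached after the oblique walk.
        x = walk_length_m * math.sin(a)
        y = walk_length_m * math.cos(a)
        # Intersect the ray from (x, y) with heading h against the gaze line x = 0.
        t = -x / math.sin(h)
        return y + t * math.cos(h)

    # Walk 5 m at 30 degrees to the right, then turn to a heading of -20 degrees.
    print(round(indicated_distance(30.0, 5.0, -20.0), 2))   # about 11.2 m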


Perception | 2005

The Influence of Restricted Viewing Conditions on Egocentric Distance Perception: Implications for Real and Virtual Indoor Environments

Sarah H. Creem-Regehr; Peter Willemsen; Amy Ashurst Gooch; William B. Thompson

We carried out three experiments to examine the influence of field of view and binocular viewing restrictions on absolute distance perception in real-world indoor environments. Few of the classical visual cues provide direct information for accurate absolute distance judgments to points in the environment beyond about 2 m from the viewer. Nevertheless, previous work has found that visually directed walking tasks reveal accurate distance estimations in full-cue real-world environments at distances up to 20 m. In contrast, the same tasks in virtual environments produced with head-mounted displays (HMDs) show large compression of distance. Field of view and binocular viewing are common limitations in research with HMDs, and they have rarely been studied under full pictorial-cue conditions in the context of distance perception in the real world. Experiment 1 showed that a view of one's body and feet on the floor was not necessary for accurate distance perception. In experiment 2 we manipulated the horizontal and vertical field of view along with head rotation and found that a restricted field of view did not affect the accuracy of distance estimations when head movement was allowed. Experiment 3 showed that performance with monocular viewing was equal to that with binocular viewing. These results have implications for the information needed to scale egocentric distance in the real world and weaken the hypothesis that a limited field of view or imperfections in binocular image presentation cause the underestimation seen with HMDs.
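
One geometric consequence of restricting the vertical field of view is that, with the head held level, the nearby ground plane (including one's own feet) falls outside the display. A minimal sketch of that relationship, assuming a level line of sight and hypothetical values:

    # Hypothetical sketch: with eye height h and only fov_below_deg of vertical
    # field of view available below a level line of sight, the nearest visible
    # ground point lies at h / tan(fov_below_deg).
    import math

    def nearest_visible_ground(eye_height_m, fov_below_deg):
        return eye_height_m / math.tan(math.radians(fov_below_deg))

    # A 1.6 m eye height with 21 degrees of downward field of view (made-up values).
    print(round(nearest_visible_ground(1.6, 21.0), 2))   # about 4.17 m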


International Conference on Robotics and Automation | 1999

Feature-based reverse engineering of mechanical parts

William B. Thompson; Jonathan C. Owen; H.J. de St. Germain; S.R. Stark; Thomas C. Henderson

Reverse engineering of mechanical parts requires extracting enough information about an instance of a particular part to replicate it using appropriate manufacturing techniques. This is important in a wide variety of situations, since functional CAD models are often unavailable or unusable for parts that must be duplicated or modified. Computer vision techniques applied to three-dimensional (3-D) data acquired using noncontact 3-D position digitizers have the potential to significantly aid the process. Serious challenges must be overcome, however, if sufficient accuracy is to be obtained and if models produced from sensed data are to be truly useful for manufacturing operations. The paper describes a prototype of a reverse engineering system that uses manufacturing features as geometric primitives. This approach has two advantages over current practice. The resulting models can be directly imported into feature-based CAD systems without loss of the semantic and topological information inherent in feature-based representations. In addition, the feature-based approach facilitates methods capable of producing highly accurate models, even when the original 3-D sensor data have substantial errors.
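
The system described in the paper fits manufacturing features to noisy 3-D digitizer data. The sketch below is not that system; it is only a minimal illustration of one underlying primitive, a least-squares fit of a planar face to a point cloud, assuming NumPy is available:

    # Hypothetical illustration: least-squares plane fit to noisy 3-D points
    # (centroid plus the singular vector of least variance as the normal).
    import numpy as np

    def fit_plane(points):
        """points: (N, 3) array. Returns (centroid, unit normal)."""
        centroid = points.mean(axis=0)
        # The last row of Vt is the direction of least variance, i.e. the normal.
        _, _, vt = np.linalg.svd(points - centroid)
        return centroid, vt[-1]

    # Noisy samples from the plane z = 0.01 (made-up data).
    rng = np.random.default_rng(0)
    pts = np.column_stack([rng.uniform(-1, 1, 200),
                           rng.uniform(-1, 1, 200),
                           0.01 + rng.normal(0, 0.002, 200)])
    c, n = fit_plane(pts)
    print(c.round(3), n.round(3))   # normal should be close to (0, 0, +/-1)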


Experimental Brain Research | 2007

Visual flow influences gait transition speed and preferred walking speed

Betty J. Mohler; William B. Thompson; Sarah H. Creem-Regehr; Herbert L. Pick; William H. Warren

It is typically assumed that basic features of human gait are determined by purely biomechanical factors. In two experiments, we test whether gait transition speed and preferred walking speed are also influenced by visual information about the speed of self-motion. The visual flow during treadmill locomotion was manipulated to be slower than, matched to, or faster than the physical gait speed (visual gains of 0.5, 1.0, 2.0). Higher flow rates elicit significantly lower transition speeds for both the Walk–Run and Run–Walk transitions, as expected. Similarly, higher flow rates elicit significantly lower preferred walking speeds. These results suggest that visual information becomes calibrated to mechanical or energetic aspects of gait and contributes to the control of locomotor behavior.
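
In the gain manipulation, the speed of the rendered visual flow is simply the physical belt speed scaled by the gain. A tiny sketch of that mapping (the 1.4 m/s walking speed is a made-up example; the gains are those reported above):

    # Hypothetical sketch of the visual-gain manipulation: the rendered scene
    # moves past the walker at gain * physical treadmill speed.
    def visual_flow_speed(treadmill_speed_mps, gain):
        return gain * treadmill_speed_mps

    for gain in (0.5, 1.0, 2.0):
        print(gain, visual_flow_speed(1.4, gain))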


Presence: Teleoperators & Virtual Environments | 2008

Effects of stereo viewing conditions on distance perception in virtual environments

Peter Willemsen; Amy Ashurst Gooch; William B. Thompson; Sarah H. Creem-Regehr

Several studies from different research groups investigating perception of absolute, egocentric distances in virtual environments have reported a compression of the intended size of the virtual space. One potential explanation for the compression is that inaccuracies and cue conflicts involving stereo viewing conditions in head-mounted displays result in an inaccurate absolute scaling of the virtual world. We manipulate stereo viewing conditions in a head-mounted display and show the effects of using both measured and fixed inter-pupillary distances, as well as bi-ocular and monocular viewing of graphics, on absolute distance judgments. Our results indicate that the amount of compression of distance judgments is unaffected by these manipulations. The equivalent performance with stereo, bi-ocular, and monocular viewing suggests that the limitations on the presentation of stereo imagery that are inherent in head-mounted displays are likely not the source of the distance compression reported in previous virtual environment studies.
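
The viewing conditions in a study like this amount to different choices of per-eye camera placement. The sketch below is a hypothetical rendering-side illustration of those choices, not the authors' apparatus code:

    # Hypothetical sketch of the viewing conditions as camera setups.
    # head_pos: cyclopean eye position; right_dir: unit vector toward the
    # observer's right; ipd_m: inter-pupillary distance in meters.
    def eye_positions(head_pos, right_dir, ipd_m, condition):
        hx, hy, hz = head_pos
        rx, ry, rz = right_dir
        d = ipd_m / 2.0
        left = (hx - d * rx, hy - d * ry, hz - d * rz)
        right = (hx + d * rx, hy + d * ry, hz + d * rz)
        if condition == "stereo":      # distinct left/right viewpoints
            return left, right
        if condition == "bi-ocular":   # the same cyclopean image to both eyes
            return head_pos, head_pos
        if condition == "monocular":   # one eye only; the other is blanked
            return right, None
        raise ValueError(condition)

    print(eye_positions((0.0, 1.6, 0.0), (1.0, 0.0, 0.0), 0.064, "stereo"))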


ACM Transactions on Applied Perception | 2009

The effects of head-mounted display mechanical properties and field of view on distance judgments in virtual environments

Peter Willemsen; Mark B. Colton; Sarah H. Creem-Regehr; William B. Thompson

Research has shown that people are able to judge distances accurately in full-cue, real-world environments using visually directed actions. However, in virtual environments viewed with head-mounted display (HMD) systems, there is evidence that people act as though the virtual space is smaller than intended. This is a surprising result given how well people act in real environments. The behavior in the virtual setting may be linked to distortions in the available visual cues or to a person's ability to locomote without vision. Either could result from issues related to the added mass, moments of inertia, and restricted field of view of HMDs. This article describes an experiment in which distance judgments based on normal real-world and HMD viewing are compared with judgments based on real-world viewing while wearing two specialized devices. One is a mock HMD, which replicated the mass, moments of inertia, and field of view of the HMD; the other is an inertial headband designed to replicate the mass and moments of inertia of the HMD, but constructed so as not to restrict the observer's field of view or otherwise feel like wearing a helmet. Distance judgments using the mock HMD showed a statistically significant underestimation relative to the no-restriction condition, but not of a magnitude sufficient to account for all the distance compression seen in the HMD. Indicated distances with the inertial headband were not significantly smaller than those made with no restrictions.
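
Matching the headband's moment of inertia to that of the HMD comes down to placing mass at a distance from the head's rotation axis. A back-of-the-envelope sketch with made-up numbers (the article's actual masses and radii are not reproduced here):

    # Hypothetical sketch: moment of inertia contributed by point masses on a
    # headband, I = sum(m_i * r_i**2), about an axis through the head's center
    # of rotation.
    def headband_inertia(masses_kg, radii_m):
        return sum(m * r ** 2 for m, r in zip(masses_kg, radii_m))

    # Two 0.3 kg weights placed 0.12 m from the axis (made-up values).
    print(headband_inertia([0.3, 0.3], [0.12, 0.12]))   # kg*m^2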


Presence: Teleoperators & Virtual Environments | 2010

The effect of viewing a self-avatar on distance judgments in an HMD-based virtual environment

Betty J. Mohler; Sarah H. Creem-Regehr; William B. Thompson; Heinrich H. Bülthoff

Few HMD-based virtual environment systems display a rendering of the user's own body. Subjectively, this often leads to a sense of disembodiment in the virtual world. We explore the effect of being able to see one's own body in such systems on an objective measure of the accuracy of one form of space perception. Using an action-based response measure, we found that participants who explored near space while seeing a fully articulated and tracked visual representation of themselves subsequently made more accurate judgments of absolute egocentric distance to locations ranging from 4 m to 6 m away from where they were standing than did participants who saw no avatar. A non-animated avatar also improved distance judgments, but by a lesser amount. Participants who viewed either animated or static avatars positioned 3 m in front of their own position made subsequent distance judgments with similar accuracy to participants who viewed the equivalent animated or static avatar positioned at their own location. We discuss the implications of these results for theories of embodied perception in virtual environments.


Symposium on Applied Perception in Graphics and Visualization | 2004

The effects of head-mounted display mechanics on distance judgments in virtual environments

Peter Willemsen; Mark B. Colton; Sarah H. Creem-Regehr; William B. Thompson

In virtual environments that use head-mounted displays (HMDs), distance judgments to targets on the ground are compressed, at least when indicated through visually directed walking tasks. The same tasks performed in the real world yield veridical results over distances ranging from 2 m to 25 m. This paper describes experiments aimed at determining whether mechanical aspects of HMDs such as mass and moments of inertia are responsible for the apparent distortion of distance. Our results indicate that the mechanical aspects of HMDs cannot explain the full magnitude of the distance underestimation seen in HMD-based virtual environments, though they may account for a portion of the effect.


ACM Transactions on Applied Perception | 2005

Throwing versus walking as indicators of distance perception in similar real and virtual environments

Cynthia S. Sahm; Sarah H. Creem-Regehr; William B. Thompson; Peter Willemsen

For humans to interact effectively with their environment, it is important for the visual system to determine the absolute size and distance of objects. Previous experiments performed in full-cue, real-world environments have demonstrated that blind walking to targets serves as an accurate indication of distance perception, up to about 25 m. In contrast, the same task performed in virtual environments (VEs) using head-mounted displays shows significant underestimation in walking. To date, blind walking is the only visually directed action task that has been used to evaluate distance perception in VEs beyond reaching distances. The possible influence of the response measure itself on absolute distance perception in virtual environments is currently an open question. Blind walking involves locomotion and the egocentric updating of the environment with one's own movement. We compared this measure to blind throwing, a task that involves the initiation of a movement directed by vision, but no further interaction within the environment. Both throwing and walking were compressed in the VE but accurate in the real world. We suggest that the distance compression found in VEs may have a general perceptual origin rather than being specific to the response measure.
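
Accuracy in studies like this one is commonly summarized as the ratio of indicated to actual distance for each response measure. A minimal analysis sketch with made-up responses (not the paper's data):

    # Hypothetical sketch: mean indicated/actual distance ratio for blind
    # walking and blind throwing. All numbers are invented for illustration.
    def mean_ratio(indicated_m, actual_m):
        return sum(i / a for i, a in zip(indicated_m, actual_m)) / len(actual_m)

    actual = [3.0, 4.5, 6.0]
    walked = [2.2, 3.4, 4.4]   # compressed, as reported for virtual environments
    thrown = [2.3, 3.3, 4.5]
    print(round(mean_ratio(walked, actual), 2), round(mean_ratio(thrown, actual), 2))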


ACM Transactions on Applied Perception | 2009

HMD calibration and its effects on distance judgments

Scott A. Kuhl; William B. Thompson; Sarah H. Creem-Regehr

Most head-mounted displays (HMDs) suffer from substantial optical distortion, and vendor-supplied specifications for field of view are often at variance with reality. Unless corrected, such displays do not present perspective-related visual cues in a geometrically correct manner. Distorted geometry has the potential to affect applications of HMDs that depend on precise spatial perception. This article provides empirical evidence for the degree to which common geometric distortions affect one type of spatial judgment in virtual environments. We show that minification or magnification of the sort that would result from a misstated HMD field of view causes significant changes in distance judgments. Incorrectly calibrated pitch and pincushion distortion, however, do not cause statistically significant changes in distance judgments for the degree of distortion examined. While the means for determining the optical distortion of display systems are well known, they are often not applied to non-see-through HMDs because of problems in measuring and correcting for distortion. As a result, we also provide practical guidelines for creating geometrically calibrated systems.
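
One way to quantify the minification or magnification caused by a misstated field of view is the ratio of half-angle tangents between the field of view used for rendering and the field of view the display physically subtends. The sketch below is standard projection geometry rather than code from the article, and the numbers are made up:

    # Hypothetical sketch: if graphics are rendered with geometric field of view
    # rendered_fov_deg but the HMD physically subtends display_fov_deg, angular
    # sizes on screen are scaled by roughly tan(dfov/2) / tan(gfov/2); values
    # below 1 correspond to minification.
    import math

    def angular_scale(rendered_fov_deg, display_fov_deg):
        return (math.tan(math.radians(display_fov_deg) / 2)
                / math.tan(math.radians(rendered_fov_deg) / 2))

    # Rendering with a 60-degree field of view on a display that subtends 48 degrees.
    print(round(angular_scale(60.0, 48.0), 3))   # about 0.771, i.e. minified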
