
Publication


Featured research published by J. Edward Swan.


Applied Perception in Graphics and Visualization | 2008

The effects of virtual reality, augmented reality, and motion parallax on egocentric depth perception

J. Adam Jones; J. Edward Swan; Gurjot Singh; Eric Kolstad; Stephen R. Ellis

A large number of previous studies have shown that egocentric depth perception tends to be underestimated in virtual reality (VR): objects appear smaller and farther away than they should. Various theories as to why this might occur have been investigated, but to date the cause is not fully understood. A much smaller number of studies have investigated how depth perception operates in augmented reality (AR), and some of these studies have also indicated a similar underestimation effect. In this paper we report an experiment that further investigates these effects. The experiment compared VR and AR conditions to two real-world control conditions, and studied the effect of motion parallax across all conditions. Our combined VR and AR head-mounted display (HMD) allowed us to develop very careful calibration procedures based on real-world calibration widgets, which cannot be replicated with VR-only HMDs. To our knowledge, this is the first study to directly compare VR and AR conditions as part of the same experiment.


IEEE Transactions on Visualization and Computer Graphics | 2007

Egocentric depth judgments in optical, see-through augmented reality

J. Edward Swan; Adam Thomas Jones; Eric Kolstad; Mark A. Livingston; Harvey S. Smallman

A fundamental problem in optical, see-through augmented reality (AR) is characterizing how it affects the perception of spatial layout and depth. This problem is important because AR system developers need to both place graphics in arbitrary spatial relationships with real-world objects, and to know that users will perceive them in the same relationships. Furthermore, AR makes possible enhanced perceptual techniques that have no real-world equivalent, such as x-ray vision, where AR users are supposed to perceive graphics as being located behind opaque surfaces. This paper reviews and discusses protocols for measuring egocentric depth judgments in both virtual and augmented environments, and discusses the well-known problem of depth underestimation in virtual environments. It then describes two experiments that measured egocentric depth judgments in AR. Experiment I used a perceptual matching protocol to measure AR depth judgments at medium and far-field distances of 5 to 45 meters. The experiment studied the effects of upper versus lower visual field location, the x-ray vision condition, and practice on the task. The experimental findings include evidence for a switch in bias, from underestimating to overestimating the distance of AR-presented graphics, at ~23 meters, as well as a quantification of how much more difficult the x-ray vision condition makes the task. Experiment II used blind walking and verbal report protocols to measure AR depth judgments at distances of 3 to 7 meters. The experiment examined real-world objects, real-world objects seen through the AR display, virtual objects, and combined real and virtual objects. The results give evidence that the egocentric depth of AR objects is underestimated at these distances, but to a lesser degree than has previously been found for most virtual reality environments. The results are consistent with previous studies that have implicated a restricted field-of-view, combined with an inability for observers to scan the ground plane in a near-to-far direction, as explanations for the observed depth underestimation.
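The under- and overestimation discussed in this abstract is conventionally quantified as a signed error normalized by the true distance. A minimal illustrative helper (the function name and values are ours, not taken from the paper):

```python
def depth_judgment_error(judged_m, actual_m):
    """Signed, normalized error of an egocentric depth judgment.
    Negative values mean underestimation; positive, overestimation.
    (Illustrative helper, not code from the paper.)"""
    return (judged_m - actual_m) / actual_m

# A judgment of 20 m for a target at 25 m is 20% underestimation:
err = depth_judgment_error(20.0, 25.0)   # -0.2
```

Under this convention, a protocol showing a bias switch at some distance would produce errors that cross zero as the true distance increases.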


Presence: Teleoperators & Virtual Environments | 2006

The effects of text drawing styles, background textures, and natural lighting on text legibility in outdoor augmented reality

Joseph L. Gabbard; J. Edward Swan; Deborah Hix

A challenge in presenting augmenting information in outdoor augmented reality (AR) settings lies in the broad range of uncontrollable environmental conditions that may be present, specifically large-scale fluctuations in natural lighting and wide variations in likely backgrounds or objects in the scene. In this paper, we motivate the need for research on the effects of text drawing styles, outdoor background textures, and natural lighting on user performance in outdoor AR. We present a pilot study and a follow-on user-based study that examined the effects on user performance of outdoor background textures, changing outdoor illuminance values, and text drawing styles in a text identification task using an optical, see-through AR system. We report significant effects for all these variables, and discuss user interface design guidelines and ideas for future work.


IEEE Visualization | 1997

An anti-aliasing technique for splatting

J. Edward Swan; Klaus Mueller; Torsten Möller; Naeem Shareef; Roger Crawfis; Roni Yagel

Splatting is a popular direct volume rendering algorithm. However, the algorithm does not correctly render cases where the volume sampling rate is higher than the image sampling rate (e.g. more than one voxel maps into a pixel). This situation arises with orthographic projections of high-resolution volumes, as well as with perspective projections of volumes of any resolution. The result is potentially severe spatial and temporal aliasing artifacts. Some volume ray-casting algorithms avoid these artifacts by employing reconstruction kernels which vary in width as the rays diverge. Unlike ray-casting algorithms, existing splatting algorithms do not have an equivalent mechanism for avoiding these artifacts. The authors propose such a mechanism, which delivers high-quality splatted images and has the potential for a very efficient hardware implementation.
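The varying-width kernel idea the abstract describes can be sketched abstractly: under perspective projection a voxel's screen-space footprint shrinks with depth, and the anti-aliasing rule is to never let the reconstruction kernel fall below roughly one pixel. This is an illustrative toy under those assumptions, not the authors' algorithm, and all names are invented:

```python
def splat_footprint_width(voxel_depth, near_plane_depth, base_width):
    """Depth-dependent splat footprint, in pixels (illustrative only).
    Distant voxels, which would project smaller than a pixel and alias,
    get a widened (low-pass) kernel instead -- mirroring ray casters
    that widen their reconstruction kernels as rays diverge."""
    # Perspective magnification relative to the near plane is
    # near_plane_depth / voxel_depth.
    projected = base_width * near_plane_depth / voxel_depth
    # Clamp: the kernel never shrinks below one pixel.
    return max(projected, 1.0)
```

For example, a 2-pixel splat at 10x the near-plane depth would project to 0.2 pixels and is clamped to 1 pixel, while the same splat closer than the near plane is simply magnified.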


IEEE Visualization | 1998

Battlefield visualization on the responsive workbench

Jim Durbin; J. Edward Swan; Brad Colbert; John Crowe; Rob King; Tony King; Christopher Scannell; Zachary Wartell; Terry Welsh

In this paper we describe a battlefield visualization system, called Dragon, which we have implemented on a virtual reality responsive workbench. The Dragon system has been successfully deployed as part of two large military exercises: the Hunter Warrior advanced warfighting experiment, in March 1997, and the Joint Counter Mine advanced concept tactical demonstration, in August and September 1997. We describe battlefield visualization, the Dragon system, and the workbench, and we describe our experiences as part of these two real-world deployments, with an emphasis on lessons learned and needed future work.


IEEE Virtual Reality Conference | 2009

Measurement Protocols for Medium-Field Distance Perception in Large-Screen Immersive Displays

Eric Klein; J. Edward Swan; Gregory S. Schmidt; Mark A. Livingston; Oliver G. Staadt

How do users of virtual environments perceive virtual space? Many experiments have explored this question, but most of these have used head-mounted immersive displays. This paper reports an experiment that studied large-screen immersive displays at medium-field distances of 2 to 15 meters. The experiment measured egocentric depth judgments in a CAVE, a tiled display wall, and a real-world outdoor field as a control condition. We carefully modeled the outdoor field to make the three environments as similar as possible. Measuring egocentric depth judgments in large-screen immersive displays requires adapting new measurement protocols; the experiment used timed imagined walking, verbal estimation, and triangulated blind walking. We found that depth judgments from timed imagined walking and verbal estimation were very similar in all three environments. However, triangulated blind walking was accurate only in the outdoor field; in the large-screen immersive displays it showed underestimation effects that were likely caused by insufficient physical space to perform the technique. These results suggest using timed imagined walking as a primary protocol for assessing depth perception in large-screen immersive displays. We also found that depth judgments in the CAVE were more accurate than in the tiled display wall, which suggests that the peripheral scenery offered by the CAVE is helpful when perceiving virtual space.


IEEE Transactions on Visualization and Computer Graphics | 2013

Peripheral Stimulation and its Effect on Perceived Spatial Scale in Virtual Environments

J. A. Jones; J. Edward Swan; Mark T. Bolas

The following series of experiments explores the effect of static peripheral stimulation on the perception of distance and spatial scale in a typical head-mounted virtual environment. It was found that applying constant white light in an observer's far periphery enabled the observer to more accurately judge distances using blind walking. An effect of similar magnitude was also found when observers estimated the size of a virtual space using a visual scale task. The presence of the effect across multiple psychophysical tasks provided confidence that a perceptual change was, in fact, being invoked by the addition of the peripheral stimulation. These results were also compared to observer performance in a very large field of view virtual environment and in the real world. The subsequent findings raise the possibility that distance judgments in virtual environments might be considerably more similar to those in the real world than previous work has suggested.


Applied Perception in Graphics and Visualization | 2010

Depth judgment measures and occluding surfaces in near-field augmented reality

Gurjot Singh; J. Edward Swan; J. Adam Jones; Stephen R. Ellis

In this paper we describe an apparatus and experiment that measured depth judgments in augmented reality at near-field distances of 34 to 50 centimeters. The experiment compared perceptual matching, a closed-loop task for measuring depth judgments, with blind reaching, a visually open-loop task for measuring depth judgments. The experiment also studied the effect of a highly salient occluding surface appearing behind, coincident with, and in front of a virtual object. The apparatus and closed-loop matching task were based on previous work by Ellis and Menges. The experiment found maximum average depth judgment errors of 5.5 cm, and found that the blind reaching judgments were less accurate than the perceptual matching judgments. The experiment found that the presence of a highly salient occluding surface has a complicated effect on depth judgments, but does not lead to systematically larger or smaller errors.


Applied Perception in Graphics and Visualization | 2011

Peripheral visual information and its effect on distance judgments in virtual and augmented environments

J. Adam Jones; J. Edward Swan; Gurjot Singh; Stephen R. Ellis

A frequently observed problem in medium-field virtual environments is the underestimation of egocentric depth. This problem has been described numerous times and with widely varying degrees of severity, and although there has been considerable progress made in modifying observer behavior to compensate for these misperceptions, the question of why these errors exist is still an open issue. This paper presents the findings of a series of experiments, comprising 103 participants, that attempts to identify and quantify the source of a pattern of adaptation and improved depth judgment accuracy over time scales of less than one hour. Taken together, these experiments suggest that peripheral visual information is an important source of information for the calibration of movement within medium-field virtual environments.


Visual Analytics Science and Technology | 2009

Guided analysis of hurricane trends using statistical processes integrated with interactive parallel coordinates

Chad A. Steed; J. Edward Swan; T. J. Jankun-Kelly; Patrick J. Fitzpatrick

This paper demonstrates the promise of augmenting interactive multivariate representations with information from statistical processes in the domain of weather data analysis. Statistical regression, correlation analysis, and descriptive statistical calculations are integrated via graphical indicators into an enhanced parallel coordinates system, called the Multidimensional Data eXplorer (MDX). These statistical indicators, which highlight significant associations in the data, are complemented with interactive visual analysis capabilities. The resulting system allows a smooth, interactive, and highly visual workflow. The system's utility is demonstrated with an extensive hurricane climate study that was conducted by a hurricane expert. In the study, the expert used a new data set of environmental weather data, composed of 28 independent variables, to predict annual hurricane activity. MDX shows that the Atlantic Meridional Mode increases the explained variance of hurricane seasonal activity by 7–15% and removes less significant variables used in earlier studies. The findings and feedback from the expert (1) validate the utility of the data set for hurricane prediction, and (2) indicate that the integration of statistical processes with interactive parallel coordinates, as implemented in MDX, addresses both deficiencies in traditional weather data analysis and exhibits some of the expected benefits of visual data analysis.

Collaboration

Top co-authors of J. Edward Swan:

Kenneth R. Moser, Mississippi State University
Mark A. Livingston, United States Naval Research Laboratory
T. J. Jankun-Kelly, Association for Computing Machinery
Gurjot Singh, Mississippi State University
J. Adam Jones, Mississippi State University
Chad A. Steed, Oak Ridge National Laboratory
Dennis G. Brown, United States Naval Research Laboratory