Publications


Featured research published by J. E. Swan.


IEEE Computer Graphics and Applications | 1999

User-centered design and evaluation of virtual environments

Joseph L. Gabbard; Deborah Hix; J. E. Swan

We present a structured, iterative methodology for user-centered design and evaluation of VE user interaction. We recommend performing (1) user task analysis followed by (2) expert guidelines-based evaluation, (3) formative user-centered evaluation, and finally (4) comparative evaluation. In this article we first give the motivation and background for our methodology, then we describe each technique in some detail. We applied these techniques to a real-world battlefield visualization VE. Finally, we evaluate why this approach provides a cost-effective strategy for assessing and iteratively improving user interaction in VEs.


IEEE Virtual Reality Conference | 1999

User-centered design and evaluation of a real-time battlefield visualization virtual environment

Deborah Hix; J. E. Swan; Joseph L. Gabbard; Mike McGee; Jim Durbin; Tony King

The ever-increasing power of computers and hardware rendering systems has, to date, primarily motivated the creation of visually rich and perceptually realistic virtual environment (VE) applications. Comparatively very little effort has been expended on the user interaction components of VEs. As a result, VE user interfaces are often poorly designed and are rarely evaluated with users. Although usability engineering is a newly emerging facet of VE development, user-centered design and usability evaluation in VEs as a practice still lags far behind what is needed. This paper presents a structured, iterative approach for the user-centered design and evaluation of VE user interaction. This approach consists of the iterative use of expert heuristic evaluation, followed by formative usability evaluation, followed by summative evaluation. We describe our application of this approach to a real-world VE for battlefield visualization, describe the resulting series of design iterations, and present evidence that this approach provides a cost-effective strategy for assessing and iteratively improving user interaction design in VEs. This paper is among the first to report applying an iterative, structured, user-centered design and evaluation approach to VE user interaction design.


IEEE Virtual Reality Conference | 2003

A comparative study of user performance in a map-based virtual environment

J. E. Swan; Joseph L. Gabbard; Deborah Hix; Robert S. Schulman; Keun P. Kim

We present a comparative study of user performance with tasks involving navigation, visual search, and geometric manipulation, in a map-based battlefield visualization virtual environment (VE). Specifically, our experiment compared user performance of the same task across four different VE platforms: desktop, cave, workbench, and wall. Independent variables were platform type, stereopsis (stereo, mono), movement control mode (rate, position), and frame of reference (egocentric, exocentric). Overall results showed that users performed tasks fastest using the desktop and slowest using the workbench. Other results are detailed in the article. Notable is that we designed our task in an application context, with tasking much closer to how users would actually use a real-world battlefield visualization system. This is very uncommon for comparative studies, which are usually designed with abstract tasks to minimize variance. This is, we believe, one of the first and most complex studies to comparatively examine, in an application context, this many key variables affecting VE user interface design.
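
As a rough, hypothetical illustration of the design space this abstract describes (not material from the paper itself), the Python sketch below enumerates a full crossing of the four independent variables; whether the published experiment crossed every factor fully is an assumption here, and the condition labels simply echo the abstract.

    # Hypothetical sketch: a full factorial crossing of the independent
    # variables named in the abstract. The actual experimental design
    # reported in the paper may have differed.
    from itertools import product

    platforms = ["desktop", "cave", "workbench", "wall"]
    stereopsis = ["stereo", "mono"]
    movement_control = ["rate", "position"]
    frame_of_reference = ["egocentric", "exocentric"]

    # Full crossing: 4 x 2 x 2 x 2 = 32 candidate conditions.
    conditions = list(product(platforms, stereopsis,
                              movement_control, frame_of_reference))

    for i, (platform, stereo, movement, frame) in enumerate(conditions, start=1):
        print(f"{i:2d}: platform={platform}, stereopsis={stereo}, "
              f"movement={movement}, frame={frame}")

    print(f"total conditions: {len(conditions)}")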


Electronic Imaging | 2002

Usability Engineering: Domain Analysis Activities for Augmented Reality Systems

Joseph L. Gabbard; J. E. Swan; Deborah Hix; Marco Lanzagorta; Mark A. Livingston; D. B. Brown; Simon J. Julier

This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.


Hawaii International Conference on System Sciences | 2004

A cost-effective usability evaluation progression for novel interactive systems

Deborah Hix; Joseph L. Gabbard; J. E. Swan; Mark A. Livingston; Tobias Höllerer; Simon J. Julier; Yohan Baillot; D. B. Brown

This paper reports on user interface design and evaluation for a mobile, outdoor, augmented reality (AR) application. This novel system, called the battlefield augmented reality system (BARS), supports information presentation and entry for situation awareness in an urban war fighting setting. To our knowledge, this is the first time extensive use of usability engineering has been systematically applied to development of a real-world AR system. Our BARS team has applied a cost-effective progression of usability engineering activities from the very beginning of BARS development. We discuss how we first applied cycles of structured expert evaluations to BARS user interface development, employing user interface mockups representing occluded (non-visible) objects. Then we discuss how results of these evaluations informed our subsequent user-based statistical evaluations and formative evaluations, and present these evaluations and their outcomes. Finally, we discuss how and why this sequence of types of evaluation is cost-effective.


IEEE Virtual Reality Conference | 2003

Evaluation of the ShapeTape tracker for wearable, mobile interaction

Yohan Baillot; Joshua J. Eliason; Greg S. Schmidt; J. E. Swan; Dennis G. Brown; Simon J. Julier; Mark A. Livingston; Lawrence J. Rosenblum

We describe two engineering experiments designed to evaluate the effectiveness of Measurand's ShapeTape for wearable, mobile interaction. Our initial results suggest that the ShapeTape is not appropriate for interactions that require a high degree of accuracy. However, ShapeTape is capable of reproducing the qualitative motion the user is performing and thus could be used to support 3D gesture-based interaction.
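
As a purely hypothetical sketch of what an accuracy evaluation of this kind might compute (the paper's actual apparatus and protocol are not reproduced here), the Python snippet below takes invented pairs of tracked and reference 3D positions and reports the root-mean-square positional error.

    import math

    # Invented sample data: (tracked_position, reference_position) pairs in metres.
    samples = [
        ((0.00, 0.00, 0.00), (0.00, 0.00, 0.00)),
        ((0.10, 0.02, 0.00), (0.12, 0.00, 0.00)),
        ((0.21, 0.05, 0.01), (0.24, 0.01, 0.00)),
    ]

    def rmse(pairs):
        # Root-mean-square Euclidean error between tracked and reference points.
        squared_errors = [
            sum((t - r) ** 2 for t, r in zip(tracked, reference))
            for tracked, reference in pairs
        ]
        return math.sqrt(sum(squared_errors) / len(squared_errors))

    print(f"positional RMSE: {rmse(samples):.3f} m")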


IEEE Virtual Reality Conference | 2004

Tutorial: Conducting human-subject experiments with virtual and augmented reality

J. E. Swan; Joseph L. Gabbard; Deborah Hix; Stephen R. Ellis; Bernard D. Adelstein

Provides an abstract of the tutorial presentation and a brief professional biography of the presenter. The complete presentation was not made available for publication as part of the conference proceedings.


National Training and Simulation Association, Arlington, US, pp. 868-875 | 2002

An augmented reality system for military operations in urban terrain

Mark A. Livingston; Lawrence J. Rosenblum; Simon J. Julier; Dennis G. Brown; Yohan Baillot; J. E. Swan; Joseph L. Gabbard; Deborah Hix


IEEE Transactions on Visualization and Computer Graphics | 2008

Usability Engineering for Augmented Reality: Employing User-Based Studies to Inform Design

Joseph L. Gabbard; J. E. Swan


IEEE Virtual Reality Conference | 2005

Objective measures for the effectiveness of augmented reality

Mark A. Livingston; Catherine Zanbaka; J. E. Swan; Harvey S. Smallman

Collaboration


Dive into J. E. Swan's collaborations.

Top Co-Authors

Mark A. Livingston
United States Naval Research Laboratory

Yohan Baillot
United States Naval Research Laboratory

Simon J. Julier
University College London

Dennis G. Brown
United States Naval Research Laboratory

Greg S. Schmidt
United States Naval Research Laboratory

Lawrence J. Rosenblum
United States Naval Research Laboratory

D. B. Brown
United States Naval Research Laboratory