John T. Kelso
Virginia Tech
Publications
Featured research published by John T. Kelso.
Human Factors in Computing Systems | 1996
H. Rex Hartson; José C. Castillo; John T. Kelso; Wayne C. Neale
Traditional user interface evaluation is usually conducted in a laboratory where users are observed directly by evaluators. However, the remote and distributed location of users on the network precludes the opportunity for direct observation in usability testing. Further, the network itself and the remote work setting have become intrinsic parts of usage patterns that are difficult to reproduce in a laboratory setting, and developers often have limited access to representative users for usability testing in the laboratory. In all of these cases, the cost of transporting users or developers to remote locations can be prohibitive. These barriers have led us to consider methods for remote usability evaluation, wherein the evaluator, performing observation and analysis, is separated in space and/or time from the user. The network itself serves as a bridge to take interface evaluation to a broad range of networked users in their natural work settings. Several types of remote evaluation are defined and described in terms of their advantages and disadvantages for usability testing. The initial results of two case studies show potential for remote evaluation. Remote evaluation using video teleconferencing uses the network as a mechanism to transport video data in real time, so that the observer can evaluate user interfaces in remote locations as they are being used. Semi-instrumented remote evaluation is based on critical incident gathering by the user within the normal work context. Additionally, both methods can take advantage of automating data collection through questionnaires and instrumented applications.
IEEE Virtual Reality Conference | 2002
John T. Kelso; Lance Arsenault; Steven G. Satterfield; Ronald D. Kriz
We present DIVERSE, a highly modular collection of complementary software packages designed to facilitate the creation of device-independent virtual environments. DIVERSE is free/open source software, containing both end-user programs and C++ APIs (Application Programming Interfaces). DgiPf is the DIVERSE graphics interface to OpenGL Performer. A program using DgiPf can run without modification on platforms ranging from fully immersive systems such as CAVEs to generic desktop workstations. We describe DgiPf's design and present a specific example of how it is being used to aid researchers.
IEEE Virtual Reality Conference | 2003
John T. Kelso; Steven G. Satterfield; Lance Arsenault; Peter M. Ketchan; Ronald D. Kriz
We present DIVERSE, a highly modular collection of complementary software packages designed to facilitate the creation of device-independent virtual environments and distributed asynchronous simulations. DIVERSE is free/open source software, containing both end-user programs and C++ application programming interfaces (APIs). DPF is the DIVERSE graphics interface to OpenGL Performer. A program using the DPF API can run without modification on platforms ranging from fully immersive systems such as CAVEs to generic desktop workstations. The DIVERSE toolkit (DTK) contains all the nongraphical components of DIVERSE, such as networking utilities, hardware device access, and navigational techniques. It introduces a software implementation of networks of replicated noncoherent shared memory. It also introduces a method that seamlessly extends hardware drivers into interprocess and Internet hardware services. We describe the design of DIVERSE and present a specific example of how it is being used to aid researchers.
Journal of Research of the National Institute of Standards and Technology | 2007
John G. Hagedorn; Joy P. Dunkers; Steven G. Satterfield; Adele P. Peskin; John T. Kelso; Judith E. Terrill
This paper describes a set of tools for performing measurements of objects in a virtual-reality-based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue-engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
Book chapter in Trends in Interactive Visualization | 2009
Judith E. Terrill; William L. George; Terence J. Griffin; John G. Hagedorn; John T. Kelso; Marc Olano; Adele P. Peskin; Steven G. Satterfield; James S. Sims; Jeffrey W. Bullard; Joy P. Dunkers; Nicos Martys; Agnes O’Gallagher; Gillian Haemer
We describe three classes of tools to turn visualizations into a visual laboratory to interactively measure and analyze scientific data. We move the normal activities that scientists perform to understand their data into the visualization environment, which becomes our virtual laboratory, combining the qualitative with the quantitative. We use representation, interactive selection, quantification, and display to add quantitative measurement methods, input tools, and output tools. These allow us to obtain numerical information from each visualization. The exact form that the tools take within each of our three categories depends on features present in the data, hence each is manifested differently in different situations. We illustrate the three approaches with a variety of case studies from immersive to desktop environments that demonstrate the methods used to obtain quantitative knowledge interactively from visual objects.
Journal of Research of the National Institute of Standards and Technology | 2008
James S. Sims; William L. George; Terence J. Griffin; John G. Hagedorn; Howard Hung; John T. Kelso; Marc Olano; Adele P. Peskin; Steven G. Satterfield; Judith Devaney Terrill; Garnett W. Bryant; Jose G. Diaz
This is the third in a series of articles that describe, through examples, how the Scientific Applications and Visualization Group (SAVG) at NIST has utilized high performance parallel computing, visualization, and machine learning to accelerate scientific discovery. In this article we focus on the use of high performance computing and visualization for simulations of nanotechnology.
International Conference on Human-Computer Interaction | 1999
John M. Carroll; Mary Beth Rosson; Christina Van Metre; Rekha-Rukmini Kengeri; John T. Kelso; Mridu Darshani
Archive | 2001
Lance Arsenault; John T. Kelso; Ron Kriz; Fernando Das-Neves
Archive | 2002
Brian Melanson; John T. Kelso; Doug A. Bowman
Archive | 2008
Andrew Ray; Doug A. Bowman; Shawn A. Bohner; Denis Gracanin; John T. Kelso; Chris North