Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Deborah Hix is active.

Publication


Featured research published by Deborah Hix.


Nucleic Acids Research | 2015

The immune epitope database (IEDB) 3.0

Randi Vita; James A. Overton; Jason Greenbaum; Julia V. Ponomarenko; Jason D. Clark; Jason R. Cantrell; Daniel K. Wheeler; Joseph L. Gabbard; Deborah Hix; Alessandro Sette; Bjoern Peters

The IEDB, www.iedb.org, contains information on immune epitopes—the molecular targets of adaptive immune responses—curated from the published literature and submitted by National Institutes of Health funded epitope discovery efforts. From 2004 to 2012 the IEDB curation of journal articles published since 1960 has caught up to the present day, with >95% of relevant published literature manually curated amounting to more than 15 000 journal articles and more than 704 000 experiments to date. The revised curation target since 2012 has been to make recent research findings quickly available in the IEDB and thereby ensure that it continues to be an up-to-date resource. Having gathered a comprehensive dataset in the IEDB, a complete redesign of the query and reporting interface has been performed in the IEDB 3.0 release to improve how end users can access this information in an intuitive and biologically accurate manner. We here present this most recent release of the IEDB and describe the user testing procedures as well as the use of external ontologies that have enabled it.


Presence: Teleoperators and Virtual Environments | 2002

A survey of usability evaluation in virtual environments: classification and comparison of methods

Doug A. Bowman; Joseph L. Gabbard; Deborah Hix

Virtual environments (VEs) are a relatively new type of human-computer interface in which users perceive and act in a three-dimensional world. The designers of such systems cannot rely solely on design guidelines for traditional two-dimensional interfaces, so usability evaluation is crucial for VEs. This paper presents an overview of VE usability evaluation to organize and critically analyze diverse work from this field. First, we discuss some of the issues that differentiate VE usability evaluation from evaluation of traditional user interfaces such as GUIs. We also present a review of some VE evaluation methods currently in use, and discuss a simple classification space for VE usability evaluation methods. This classification space provides a structured means for comparing evaluation methods according to three key characteristics: involvement of representative users, context of evaluation, and types of results produced. Finally, to illustrate these concepts, we compare two existing evaluation approaches: testbed evaluation (Bowman, Johnson, & Hodges, 1999) and sequential evaluation (Gabbard, Hix, & Swan, 1999).
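The abstract's three-axis classification space can be sketched as a small data structure. This is a minimal illustration only: the three axes (user involvement, evaluation context, result type) come from the abstract, but the axis values and the example entries below are assumptions, not the paper's actual taxonomy.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EvaluationMethod:
    """One usability evaluation method, placed in the three-axis space."""
    name: str
    involves_users: bool  # do representative users participate?
    context: str          # assumed values: "generic" or "application-specific"
    results: str          # assumed values: "quantitative" or "qualitative"

# Illustrative placements; the classifications here are assumptions.
methods = [
    EvaluationMethod("testbed evaluation", True, "generic", "quantitative"),
    EvaluationMethod("sequential evaluation", True, "application-specific", "qualitative"),
    EvaluationMethod("heuristic evaluation", False, "application-specific", "qualitative"),
]

def compare(a: EvaluationMethod, b: EvaluationMethod) -> dict:
    """Compare two methods axis by axis, as the paper's space enables."""
    return {
        "involves_users": (a.involves_users, b.involves_users),
        "context": (a.context, b.context),
        "results": (a.results, b.results),
    }
```

Comparing testbed and sequential evaluation this way surfaces the axis on which they differ (generic versus application-specific context), which mirrors how the paper uses the space to contrast the two approaches.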


IEEE Computer Graphics and Applications | 1999

User-centered design and evaluation of virtual environments

Joseph L. Gabbard; Deborah Hix; J. E. Swan

We present a structured, iterative methodology for user-centered design and evaluation of VE user interaction. We recommend performing (1) user task analysis followed by (2) expert guidelines-based evaluation, (3) formative user-centered evaluation, and finally (4) comparative evaluation. In this article we first give the motivation and background for our methodology, then we describe each technique in some detail. We applied these techniques to a real-world battlefield visualization VE. Finally, we evaluate why this approach provides a cost-effective strategy for assessing and iteratively improving user interaction in VEs.


ACM Transactions on Information Systems | 1990

The UAN: a user-oriented representation for direct manipulation interface designs

H. Rex Hartson; Antonio C. Siochi; Deborah Hix

Many existing interface representation techniques, especially those associated with UIMS, are constructional and focused on interface implementation, and therefore do not adequately support a user-centered focus. But it is in the behavioral domain of the user that interface designers and evaluators do their work. We are seeking to complement constructional methods by providing a tool-supported technique capable of specifying the behavioral aspects of an interactive system: the tasks and the actions a user performs to accomplish those tasks. In particular, this paper is a practical introduction to use of the User Action Notation (UAN), a task- and user-oriented notation for behavioral representation of asynchronous, direct manipulation interface designs. Interfaces are specified in UAN as a quasi-hierarchy of asynchronous tasks. At the lower levels, user actions are associated with feedback and system state changes. The notation makes use of visually onomatopoeic symbols and is simple enough to read with little instruction. UAN is being used by growing numbers of interface developers and researchers. In addition to its design role, current research is investigating how UAN can support production and maintenance of code and documentation.
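The core structure the abstract describes, a quasi-hierarchy of tasks whose lowest-level user actions are paired with feedback and system state changes, can be sketched as nested records. The task names, action strings, and symbols below are illustrative assumptions; real UAN uses its own visually onomatopoeic notation rather than these Python objects.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UserAction:
    """A low-level user action paired with interface feedback and state change."""
    action: str        # e.g. a cursor movement to an icon (symbol is an assumption)
    feedback: str      # interface feedback, e.g. the icon highlighting
    state_change: str  # resulting system state change

@dataclass
class Task:
    """A task node; subtasks and actions form the quasi-hierarchy."""
    name: str
    subtasks: List["Task"] = field(default_factory=list)
    actions: List[UserAction] = field(default_factory=list)

# Hypothetical example: "delete file" decomposes into "select file",
# whose actions carry the feedback/state-change columns of a UAN table.
select_file = Task("select file", actions=[
    UserAction("move cursor to file icon", "icon highlights", "file becomes current object"),
    UserAction("press button", "icon highlight confirmed", "selection recorded"),
])
delete_file = Task("delete file", subtasks=[select_file])
```

The point of the sketch is the shape, not the notation: tasks nest, and only the leaves bind concrete actions to feedback and state, which is what lets the notation stay readable at every level.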


International ACM SIGIR Conference on Research and Development in Information Retrieval | 1996

Visualizing search results: some alternatives to query-document similarity

Lucy Terry Nowell; Deborah Hix; Lenwood S. Heath; Edward A. Fox

A digital library of computer science literature, Envision provides powerful information visualization by displaying search results as a matrix of icons, with layout semantics under user control. Envision’s Graphic View interacts with an Item Summary Window giving users access to bibliographic information, and XMosaic provides access to complete bibliographic information, abstracts, and full content. While many visualization interfaces for information retrieval systems depict ranked query-document similarity, Envision graphically presents a variety of document characteristics and supports an extensive range of user tasks. Formative usability evaluation results show great user satisfaction with Envision’s style of presentation and the document characteristics visualized.


Journal of the Association for Information Science and Technology | 1993

Users, user interfaces, and objects: Envision, a digital library

Edward A. Fox; Deborah Hix; Lucy Terry Nowell; Dennis J. Brueni; Durgesh Rao; William C. Wake; Lenwood S. Heath

Project Envision aims to build a “user-centered database from the computer science literature,” initially using the publications of the Association for Computing Machinery (ACM). Accordingly, we have interviewed potential users, as well as experts in library, information, and computer science—to understand their needs, to become aware of their perception of existing information systems, and to collect their recommendations. Design and formative usability evaluation of our interface have been based on those interviews, leading to innovative query formulation and search results screens that work well according to our usability testing. Our development of the Envision database, system software, and protocol for client-server communication builds upon work to identify and represent “objects” that will facilitate reuse and high-level communication of information from author to reader (user). All these efforts are leading not only to a usable prototype digital library but also to a set of nine principles for digital libraries, which we have tried to follow, covering issues of representation, architecture, and interfacing.


IEEE Virtual Reality Conference | 1999

User-centered design and evaluation of a real-time battlefield visualization virtual environment

Deborah Hix; J. E. Swan; Joseph L. Gabbard; Mike McGee; Jim Durbin; Tony King

The ever-increasing power of computers and hardware rendering systems has, to date, primarily motivated the creation of visually rich and perceptually realistic virtual environment (VE) applications. Comparatively very little effort has been expended on the user interaction components of VEs. As a result, VE user interfaces are often poorly designed and are rarely evaluated with users. Although usability engineering is a newly emerging facet of VE development, user-centered design and usability evaluation in VEs as a practice still lags far behind what is needed. This paper presents a structured, iterative approach for the user-centered design and evaluation of VE user interaction. This approach consists of the iterative use of expert heuristic evaluation, followed by formative usability evaluation, followed by summative evaluation. We describe our application of this approach to a real-world VE for battlefield visualization, describe the resulting series of design iterations, and present evidence that this approach provides a cost-effective strategy for assessing and iteratively improving user interaction design in VEs. This paper is among the first to report applying an iterative, structured, user-centered design and evaluation approach to VE user interaction design.


1990

Effect of Touch Screen Target Location on User Accuracy

Michael Leahy; Deborah Hix

Users can be frustrated by touch screen applications that inaccurately record their touches. Enlarging touch sensitive regions can improve touch accuracy, but few specific quantitative guidelines are available. This paper reports on a controlled experiment that investigated the effect of target location and horizontal viewing location on user accuracy. Measurements showed that persons tended to touch below the target, with touch distance increasing as the target location moved down the screen. In addition, they tended to touch toward the sides of the screen. Using collected data for each of nine screen sectors, graphs were prepared that show the relationship between touch target size and expected accuracy. For example, a 36 mm² target in the top left sector would be expected to accurately record 99% of its touches. The empirically-derived, quantitative guidelines will help designers create screens that decrease user errors and frustration.
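The nine-sector analysis from the abstract lends itself to a small lookup sketch: map a touch point to one of nine sectors of a 3×3 grid, then consult a per-sector accuracy estimate for a given target size. Only the 99% figure for a 36 mm² target in the top-left sector comes from the abstract; every other value in the table, and the screen dimensions, are placeholder assumptions.

```python
def sector(x_mm: float, y_mm: float, width_mm: float, height_mm: float) -> tuple:
    """Return (row, col) in a 3x3 grid of screen sectors; row 0 = top, col 0 = left."""
    col = min(int(3 * x_mm / width_mm), 2)   # clamp the right/bottom edges into sector 2
    row = min(int(3 * y_mm / height_mm), 2)
    return row, col

# Expected touch accuracy for a 36 mm² target, keyed by (row, col).
# All entries are placeholders except (0, 0), which is the abstract's 99% figure.
ACCURACY_36MM2 = {(r, c): 0.95 for r in range(3) for c in range(3)}
ACCURACY_36MM2[(0, 0)] = 0.99  # top-left sector, from the abstract

print(sector(10, 10, 300, 200))                   # → (0, 0): near the top-left corner
print(ACCURACY_36MM2[sector(10, 10, 300, 200)])   # → 0.99
```

A designer could invert such a table, picking the smallest target size whose expected accuracy in a given sector clears a threshold, which is how the paper's graphs are meant to be used.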


Human Factors in Computing Systems | 1998

Remote usability evaluation: can users report their own critical incidents?

José C. Castillo; H. Rex Hartson; Deborah Hix

In this paper, we briefly introduce the user-reported critical incident method (originally called semi-instrumented critical incident gathering [3]) for remote usability evaluation, and describe results and lessons learned in its development and use. Our findings indicate that users can, in fact, identify and report their own critical incidents.


Presence: Teleoperators & Virtual Environments | 2006

The effects of text drawing styles, background textures, and natural lighting on text legibility in outdoor augmented reality

Joseph L. Gabbard; J. Edward Swan; Deborah Hix

A challenge in presenting augmenting information in outdoor augmented reality (AR) settings lies in the broad range of uncontrollable environmental conditions that may be present, specifically large-scale fluctuations in natural lighting and wide variations in likely backgrounds or objects in the scene. In this paper, we motivate the need for research on the effects of text drawing styles, outdoor background textures, and natural lighting on user performance in outdoor AR. We present a pilot study and a follow-on user-based study that examined the effects on user performance of outdoor background textures, changing outdoor illuminance values, and text drawing styles in a text identification task using an optical, see-through AR system. We report significant effects for all these variables, and discuss user interface design guidelines and ideas for future work.

Collaboration


Dive into Deborah Hix's collaborations.

Top Co-Authors

J. E. Swan
United States Naval Research Laboratory

Mark A. Livingston
United States Naval Research Laboratory

Antonio C. Siochi
Christopher Newport University

Yohan Baillot
United States Naval Research Laboratory

Simon J. Julier
University College London