Jacob P. Somervell
Virginia Tech
Publications
Featured research published by Jacob P. Somervell.
ACM Transactions on Computer-Human Interaction | 2003
D. Scott McCrickard; Christa M. Chewar; Jacob P. Somervell; Ali Ndiwalana
Addressing the need to tailor usability evaluation methods (UEMs) and promote effective reuse of HCI knowledge for computing activities undertaken in divided-attention situations, we present the foundations of a unifying model that can guide evaluation efforts for notification systems. Often implemented as ubiquitous systems or within a small portion of the traditional desktop, notification systems typically deliver information of interest in a parallel, multitasking approach, extraneous or supplemental to a user's attention priority. Such systems represent a difficult challenge to evaluate meaningfully. We introduce a design model of user goals based on blends of three critical parameters---interruption, reaction, and comprehension. Categorization possibilities form a logical, descriptive design space for notification systems, rooted in human information processing theory. This model allows conceptualization of distinct action models for at least eight classes of notification systems, which we describe and analyze with a human information processing model. System classification regions immediately suggest useful empirical and analytical evaluation metrics from related literature. We present a case study that demonstrates how these techniques can assist an evaluator in adapting traditional UEMs for notification and other multitasking systems. We explain why using the design model categorization scheme enabled us to generate evaluation results that are more relevant for the system redesign than the results of the original exploration done by the system's designers.
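To make the categorization concrete: treating each of the three critical parameters as simply low or high yields the eight regions the abstract refers to. The sketch below is illustrative only; the parameter names come from the abstract, while the binary low/high treatment is an assumption made here for clarity, not a claim about the paper's exact formulation.

```python
from itertools import product

# The three critical parameters of the IRC design model named in the abstract.
PARAMETERS = ("interruption", "reaction", "comprehension")

# Treating each parameter as simply "low" or "high" gives 2**3 = 8
# combinations, which is where "at least eight classes" of
# notification systems comes from.
design_space = list(product(("low", "high"), repeat=3))

for levels in design_space:
    print(dict(zip(PARAMETERS, levels)))

print(len(design_space), "classes")  # -> 8 classes
```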
User Interface Software and Technology | 2003
Craig H. Ganoe; Jacob P. Somervell; Dennis C. Neale; Philip L. Isenhour; John M. Carroll; Mary Beth Rosson; D. Scott McCrickard
Classroom BRIDGE supports activity awareness by facilitating planning and goal revision in collaborative, project-based middle school science. It integrates large-screen and desktop views of project timelines to support incidental creation of awareness information through routine document transactions, integrated presentation of awareness information as part of workspace views, and public access to subgroup activity. It demonstrates and develops an object replication approach to integrating synchronous and asynchronous distributed work for a platform incorporating both desktop and large-screen devices. This paper describes an implementation of these concepts with preliminary evaluation data, using timeline-based user interfaces.
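The general object-replication pattern the abstract mentions can be sketched as follows: each device keeps a local copy of a shared object, applies edits immediately, and merges state from other replicas when they connect. This is a minimal, generic illustration; the class name, merge policy, and example data are assumptions for this sketch and are not taken from the BRIDGE implementation.

```python
class ReplicatedTimeline:
    """Minimal sketch of a replicated shared object: local edits apply
    right away (synchronous use), and replicas merge state when they
    reconnect (asynchronous use). Illustrative only."""

    def __init__(self):
        self.events = {}   # event_id -> description
        self.version = 0

    def add_event(self, event_id, description):
        # Local edit: applied immediately on this replica.
        self.events[event_id] = description
        self.version += 1

    def sync_from(self, other):
        # Pull state from another replica; keep existing local entries.
        for event_id, description in other.events.items():
            self.events.setdefault(event_id, description)
        self.version = max(self.version, other.version)


# One replica behind a large public display, one on a student desktop.
public, desktop = ReplicatedTimeline(), ReplicatedTimeline()
desktop.add_event("m1", "Collect stream water samples")
public.sync_from(desktop)   # the public timeline now shows the new milestone
print(public.events)        # {'m1': 'Collect stream water samples'}
```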
Technical Symposium on Computer Science Education | 2004
D. Scott McCrickard; Christa M. Chewar; Jacob P. Somervell
Reacting to challenges observed in human-computer interaction (HCI) education, as well as to the field's multidisciplinary design, science, and engineering underpinnings, we investigate a pedagogical approach based on case methods. Our study of various case method techniques in an undergraduate HCI class provides insights into the challenges that can be expected when employing case methods, student learning outcomes, and considerations for HCI curriculum planning. In general, case methods show great promise across a wide variety of topics, and we present broad recommendations for future work that will improve the integration of HCI professional practice, research, and education.
Interacting with Computers | 2005
Jacob P. Somervell; D. Scott McCrickard
This paper describes a heuristic creation process based on the notion of critical parameters, and a comparison experiment that demonstrates the utility of heuristics created for a specific system class. We focus on two examples of using the newly created heuristics to illustrate the utility of the usability evaluation method, as well as to provide support for the creation process, and we report on successes and frustrations of two classes of users, novice evaluators and domain experts, who identified usability problems with the new heuristics. We argue that establishing critical parameters for other domains will support efforts in creating tailored evaluation tools.
Human Factors in Computing Systems | 2003
Ali Ndiwalana; Christa M. Chewar; Jacob P. Somervell; D. Scott McCrickard
One of the challenges in building and evaluating ubiquitous computing systems stems from the fact that they have generally been built to showcase technological innovation, with little attention to predicting whether and how people will eventually accept them in their lives. In this study, participants are introduced to the notion of ubiquitous computing via a scenario-centric presentation featuring basic everyday objects imbued with computational power to convey information. Through a detailed survey, participants provide feedback on their impressions, rating the performance of each interface on a number of metrics and comparing the ubiquitous and desktop interfaces. We encourage them to think of new ways to use existing ubiquitous interfaces to support their current and potential information needs, as well as of better interfaces that could convey this information.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2004
Jacob P. Somervell; D. Scott McCrickard
Comparing heuristic evaluation methods is important when developing new heuristic sets, to ensure their effectiveness and utility. However, comparing different sets of heuristics requires a common baseline, usually some set of usability problems from a particular interface. This is often obtained by having evaluators evaluate a system to produce a set of usability problems for each method in question. A problem arises in that different methods produce different sets of problems, introducing validity concerns and ambiguity in reconciling the disparate problem sets. We address this problem by illustrating a new comparison technique in which predetermined usability issues are presented to the evaluators up front, followed by assessment of thoroughness, reliability, and cost for the target methods. Comparison of method effectiveness is simplified, and validity concerns are ameliorated.
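To ground two of the measures the abstract names, a common way to compute thoroughness and reliability against a predetermined problem set is sketched below. This is a generic illustration using standard UEM-comparison definitions from the literature; the evaluator data, function names, and the Jaccard-based reliability measure are assumptions made here, not details taken from the paper.

```python
from itertools import combinations

def thoroughness(found: set, real: set) -> float:
    """Share of the predetermined usability problems that a method uncovers."""
    return len(found & real) / len(real)

def reliability(reports: list) -> float:
    """Average pairwise Jaccard agreement across evaluators' problem sets;
    one common way to operationalize reliability."""
    pairs = list(combinations(reports, 2))
    if not pairs:
        return 1.0
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

# Hypothetical data: ten predetermined problems, two evaluators using one method.
real_problems = set(range(1, 11))
evaluator_reports = [{1, 2, 3, 5, 8}, {1, 2, 4, 5, 9}]
found_by_method = set().union(*evaluator_reports)

print(f"thoroughness = {thoroughness(found_by_method, real_problems):.2f}")  # 0.70
print(f"reliability  = {reliability(evaluator_reports):.2f}")                # 0.43
```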
Frontiers in Education | 2004
Jacob P. Somervell; Christa M. Chewar; D. Scott McCrickard
This paper investigates case-based methods for bridging the conflicting goals of providing both topic coverage and practical experience in teaching human-computer interaction (HCI). We evaluate the benefits and limitations of five types of case materials - contemporary articles, professionally prepared cases, familiar interfaces, ongoing development projects, and incomplete information (jigsaw) - to probe how they should be structured and approached by an HCI instructor. Through an experience that assessed case-based activities in an undergraduate HCI course, we determined tradeoffs relating to student participation, preparation characteristics, and short- and long-term learning outcomes. Based on our results, we draw several conclusions that should influence the selection and development of materials for case-based pedagogy, and we illustrate the need for structured case creation processes that can be carried out alongside system development efforts.
AICPS | 2002
Jacob P. Somervell; D. Scott McCrickard; Chris North; Maulik Shukla
International Conference on Human-Computer Interaction | 2003
Jacob P. Somervell; Shahtab Wahid
SE | 2003
Jacob P. Somervell; Christa M. Chewar; D. Scott McCrickard; Ali Ndiwalana