Laura A. McNamara
Sandia National Laboratories
Publication
Featured research published by Laura A. McNamara.
Reliability Engineering & System Safety | 2007
Alyson G. Wilson; Laura A. McNamara; Gregory D. Wilson
This paper develops a framework to determine the performance or reliability of a complex system. We consider a case study in missile reliability that focuses on the assessment of a high fidelity launch vehicle intended to emulate a ballistic missile threat. In particular, we address the case of how to make a system assessment when there are limited full-system tests. We address the development of a system model and the integration of a variety of data using a Bayesian network.
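The abstract does not include the model itself, but the core idea of combining component-level test data in a Bayesian framework can be illustrated with a minimal sketch. The example below is not the authors' missile-reliability model: the component names, priors, and test counts are hypothetical, and a simple series-system product stands in for the full Bayesian network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical component-level test data: (prior_alpha, prior_beta, successes, failures)
components = {
    "propulsion": (1, 1, 48, 2),
    "guidance":   (1, 1, 29, 1),
    "separation": (1, 1, 19, 0),
}

n_draws = 100_000
system = np.ones(n_draws)
for name, (a, b, s, f) in components.items():
    # Conjugate Beta-Binomial update, then posterior samples for this component;
    # a series system's reliability is the product of component reliabilities.
    system *= rng.beta(a + s, b + f, n_draws)

lo, med, hi = np.percentile(system, [5, 50, 95])
print(f"System reliability ~ {med:.3f} (90% interval {lo:.3f} to {hi:.3f})")
```

The paper's framework goes further, using a Bayesian network to integrate heterogeneous data, including the limited full-system tests the abstract mentions; the sketch shows only the component-data side.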
International Conference on Augmented Cognition | 2015
Laura E. Matzen; Michael Joseph Haass; Laura A. McNamara; Susan Marie Stevens-Adams; Stephanie N. McMichael
Vision is one of the dominant human senses and most human-computer interfaces rely heavily on the capabilities of the human visual system. An enormous amount of effort is devoted to finding ways to visualize information so that humans can understand and make sense of it. By studying how professionals engage in complex visual search tasks, we can develop insights into their cognitive processes and the influence of experience on those processes. This can advance our understanding of visual cognition in addition to providing information that can be applied to designing improved data visualizations or training new analysts.
International Conference on Augmented Cognition | 2015
Laura A. McNamara; Kerstan Suzanne Cole; Michael Joseph Haass; Laura E. Matzen; J. Daniel Morrow; Susan Marie Stevens-Adams; Stephanie N. McMichael
Researchers at Sandia National Laboratories are integrating qualitative and quantitative methods from anthropology, human factors and cognitive psychology in the study of military and civilian intelligence analyst workflows in the United States’ national security community. Researchers who study human work processes often use qualitative theory and methods, including grounded theory, cognitive work analysis, and ethnography, to generate rich descriptive models of human behavior in context. In contrast, experimental psychologists typically do not receive training in qualitative induction, nor are they likely to practice ethnographic methods in their work, since experimental psychology tends to emphasize generalizability and quantitative hypothesis testing over qualitative description. However, qualitative frameworks and methods from anthropology, sociology, and human factors can play an important role in enhancing the ecological validity of experimental research designs.
Visual Analytics Science and Technology | 2009
Courtney C. Dornburg; Laura E. Matzen; Travis L. Bauer; Laura A. McNamara
The current visual analytics literature highlights design and evaluation processes that are highly variable and situation dependent, which raises at least two broad challenges. First, the lack of a standardized evaluation criterion leads to costly re-designs for each task and specific user community. Second, this inadequacy in criterion validation raises significant uncertainty regarding visualization outputs and their related decisions, which may be especially troubling in high-consequence environments like those of the Intelligence Community. As an attempt to standardize the “apples and oranges” of the extant situation, we propose the creation of standardized evaluation tools using general principles of human cognition. Theoretically, visual analytics enables the user to see information in a way that should attenuate the user’s memory load and increase the user’s task-available cognitive resources. By using general cognitive abilities like available working memory resources as our dependent measures, we propose to develop standardized evaluative capabilities that can be generalized across contexts, tasks, and user communities.
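As a purely illustrative example of treating working memory as a dependent measure, one common operationalization is a dual-task design: the slowdown on a secondary memory task performed alongside the visualization, relative to a single-task baseline, indexes how much spare cognitive capacity the visualization leaves the user. The column names and numbers below are hypothetical and are not drawn from the paper.

```python
import pandas as pd

# Hypothetical dual-task data: secondary-task response times measured alone
# ("baseline") and while the participant also works with the visualization.
df = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3],
    "condition": ["baseline", "with_vis"] * 3,
    "secondary_rt_ms": [610, 780, 585, 702, 640, 905],
})

wide = df.pivot(index="participant", columns="condition", values="secondary_rt_ms")
# A larger slowdown on the secondary task suggests the visualization consumes
# more working memory, leaving fewer task-available cognitive resources.
wide["load_index_ms"] = wide["with_vis"] - wide["baseline"]
print(wide)
```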
International Conference on Human-Computer Interaction | 2011
Alisa Bandlow; Laura E. Matzen; Kerstan Suzanne Cole; Courtney C. Dornburg; Charles J. Geiseler; John A. Greenfield; Laura A. McNamara; Susan Marie Stevens-Adams
Information visualization tools are being promoted to support decision making. These tools assist in the analysis and comprehension of ambiguous and conflicting data sets. Formal evaluations are necessary to demonstrate the effectiveness of visualization tools, yet conducting these studies is difficult. Objective metrics that allow designers to compare the amount of work required for users to operate a particular interface are lacking. This in turn makes it difficult to compare workload across different interfaces, which is problematic for complicated information visualization and visual analytics packages. We believe that measures of working memory load can provide a more objective and consistent way of assessing visualizations and user interfaces across a range of applications. We present initial findings from a study using measures of working memory load to compare the usability of two graph representations.
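The abstract mentions comparing two graph representations with working memory load measures; a hedged sketch of the kind of within-subjects analysis such a comparison might use is shown below. The load scores and representation names are fabricated for illustration, and the paper's actual measures and statistics may differ.

```python
import numpy as np
from scipy.stats import ttest_rel

# Fabricated per-participant load scores (e.g., secondary-task RT in ms)
# under each of two hypothetical graph representations.
node_link = np.array([712, 805, 690, 744, 820, 701, 765, 733])
matrix_view = np.array([668, 770, 655, 720, 790, 660, 742, 701])

# Paired (within-subjects) comparison of the two representations
t_stat, p_value = ttest_rel(node_link, matrix_view)
print(f"mean difference = {np.mean(node_link - matrix_view):.1f} ms, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")
```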
Ground/Air Multisensor Interoperability, Integration, and Networking for Persistent ISR IX | 2018
Laura A. McNamara; Kristin M. Divis; J. Daniel Morrow
Many companies rely on user experience metrics, such as Net Promoter scores, to monitor changes in customer attitudes toward their products. This paper suggests that similar metrics can be used to assess the user experience of the pilots and sensor operators who are tasked with using our radar, EO/IR, and other remote sensing technologies. As we have previously discussed, the problem of making our national security remote sensing systems useful, usable, and adoptable is a human-system integration problem that does not get the sustained attention it deserves, particularly given the high-throughput, information-dense task environments common to military operations. In previous papers, we have demonstrated how engineering teams can adopt well-established human-computer interaction principles to fix significant usability problems in radar operational interfaces. In this paper, we describe how we are using a combination of Situation Awareness design methods, along with techniques from the consumer sector, to identify opportunities for improving human-system interactions. We explain why we believe that all stakeholders in remote sensing – including program managers, engineers, and operational users – can benefit from systematically incorporating some of these measures into the evaluation of our national security sensor systems. We also provide examples of our own experience adapting consumer user experience metrics in operator-focused evaluation of currently deployed radar interfaces.
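For readers unfamiliar with the consumer metric the abstract cites, the Net Promoter Score is simply the percentage of promoters (ratings of 9-10 on a 0-10 recommendation question) minus the percentage of detractors (ratings of 0-6). The sketch below applies that standard formula to hypothetical operator ratings; it is not the survey instrument used in the paper.

```python
def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6), on a 0-10 scale."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical responses to "How likely are you to recommend this interface
# to another operator?" (0-10)
operator_ratings = [9, 10, 7, 6, 8, 9, 4, 10, 8, 9]
print(f"NPS = {net_promoter_score(operator_ratings):+.0f}")
```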
International Conference on Augmented Cognition | 2017
Laura A. McNamara; Kristin M. Divis; J. Daniel Morrow; David Nikolaus Perkins
Researchers at Sandia National Laboratories in Albuquerque, New Mexico, are engaged in the empirical study of human-information interaction in high-consequence national security environments. This focus emerged from our longstanding interactions with military and civilian intelligence analysts working across a broad array of domains, from signals intelligence to cybersecurity to geospatial imagery analysis. In this paper, we discuss how several years of work with Synthetic Aperture Radar (SAR) imagery analysts revealed the limitations of eye tracking systems for capturing gaze events in the dynamic, user-driven problem-solving strategies characteristic of geospatial analytic workflows, and why inductive study of those workflows requires eye tracking systems designed to support it. We then discuss an ongoing project in which we are leveraging some of the unique properties of SAR image products to develop a prototype eye tracking data collection and analysis system that will support inductive studies of visual workflows in SAR image analysis environments.
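The gaze events the abstract refers to are typically recovered from raw eye tracking samples with a fixation detection algorithm. The sketch below implements the generic dispersion-threshold (I-DT) approach; it is not the prototype system described in the paper, and the thresholds and sample format are assumptions.

```python
def _dispersion(window):
    """Horizontal plus vertical spread of a window of (x, y) gaze samples."""
    xs, ys = zip(*window)
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(samples, max_dispersion=35.0, min_samples=6):
    """samples: list of (x, y) gaze points at a fixed sampling rate.
    Returns (centroid_x, centroid_y, n_samples) tuples, one per detected fixation."""
    fixations, i = [], 0
    while i + min_samples <= len(samples):
        j = i + min_samples
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window until adding the next sample exceeds the threshold
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            xs, ys = zip(*samples[i:j])
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys), j - i))
            i = j            # skip past the fixation window
        else:
            i += 1           # slide the window forward by one sample
    return fixations

# Hypothetical gaze trace: two stable clusters of samples -> two fixations
samples = [(100.0, 100.0)] * 10 + [(400.0, 300.0)] * 8
print(idt_fixations(samples))
```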
Proceedings of SPIE | 2017
Laura M. Klein; Laura A. McNamara
In this paper, we address the components needed to create usable engineering and operational user interfaces (UIs) for airborne Synthetic Aperture Radar (SAR) systems. As airborne SAR technology gains wider acceptance in the remote sensing and Intelligence, Surveillance, and Reconnaissance (ISR) communities, the need for effective and appropriate UIs to command and control these sensors has also increased. However, despite the growing demand for SAR in operational environments, the technology still faces an adoption roadblock, in large part due to the lack of effective UIs. It is common to find operational interfaces that have barely grown beyond the disparate tools engineers and technologists developed to demonstrate an initial concept or system. While sensor usability and utility are requirements common to both engineers and operators, their objectives for interacting with the sensor differ. As such, the amount and type of information presented ought to be tailored to the specific application.
Proceedings of SPIE | 2017
Laura A. McNamara; Leif Berg; Karin Butler; Laura M. Klein
Even as remote sensing technology has advanced in leaps and bounds over the past decade, the remote sensing community lacks interfaces and interaction models that facilitate effective human operation of our sensor platforms. Interfaces that make great sense to electrical engineers and flight test crews can be anxiety-inducing to operational users who lack professional experience in the design and testing of sophisticated remote sensing platforms. In this paper, we reflect on an 18-month collaboration in which our Sandia National Laboratories research team partnered with an industry software team to identify and fix critical issues in a widely-used sensor interface. Drawing on basic principles from cognitive and perceptual psychology and interaction design, we provide simple, easily learned guidance for minimizing common barriers to system learnability, memorability, and user engagement.
Security Informatics | 2012
Laura A. McNamara
Biological metaphors abound in computational modeling and simulation, inspiring creative and novel approaches to conceptualizing, representing, simulating and analyzing a wide range of phenomena. Proponents of this research suggest that biologically-inspired informatics have practical national security importance, because they represent a new way to analyze sociopolitical dynamics and trends, from terrorist recruitment to cyber warfare. However, translating innovative basic research into useful, usable, adoptable, and trustworthy tools that benefit the daily work of national security experts is challenging. Drawing on several years’ worth of ethnographic fieldwork among national security experts, this paper suggests that information ecology, activity theory, and participatory modeling provide theoretical frameworks and practical suggestions to support design and development of useful, usable, and adoptable modeling and simulation approaches for complex national security challenges.