
Publication


Featured research published by Ryan M. Kilgore.


AIAA Infotech@Aerospace 2007 Conference and Exhibit | 2007

Mission Planning and Monitoring for Heterogeneous Unmanned Vehicle Teams: A Human-Centered Perspective

Ryan M. Kilgore; Karen Harper; Carl E. Nehme; Mary L. Cummings

Future unmanned systems in the military will be highly heterogeneous in nature, with vehicles from multiple domains (aerial, underwater, and land) working in collaborative teams to complete a variety of missions. The complexity of supervising these teams will be enormous and will rely on human creativity, judgment, and experience. Therefore, the design and development of mission planning and monitoring technologies must be rooted in a deep understanding of the human operator's role as mission manager, and must effectively address the reasoning skills and limitations of both the human and the autonomous intelligent system. In this paper we present our work to approach these supervisory issues from a human-centered perspective. We first review the findings of a cognitive task analysis, through which we defined critical informational requirements and developed display interfaces for human operators developing and executing mission plans for a small team of underwater and aerial unmanned vehicles. These findings raise several operational issues for unmanned vehicle management, namely (1) vehicle and task heterogeneity and (2) the coordination of command and control across a vehicle team. We discuss the impact of both of these design requirements on the human-centered development of mission planning tools for unmanned systems. Finally, we introduce an investigative approach to support the rapid evaluation of interfaces that flexibly accommodate alternative command and control philosophies for heterogeneous automated systems using a combination of modeling and human-in-the-loop evaluation processes.


International Conference on Virtual, Augmented and Mixed Reality | 2014

Increasing the Transparency of Unmanned Systems: Applications of Ecological Interface Design

Ryan M. Kilgore; Martin Voshell

This paper describes ongoing efforts to address the challenges of supervising teams of heterogeneous unmanned vehicles through the use of demonstrated Ecological Interface Design (EID) principles. We first review the EID framework and discuss how we have applied it to the unmanned systems domain. Then, drawing from specific interface examples, we present several generalizable design strategies for improved supervisory control displays. We discuss how ecological display techniques can be used to increase the transparency and observability of highly automated unmanned systems by enabling operators to efficiently perceive and reason about automated support outcomes and purposefully direct system behavior.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2008

Predicting the Impact of Heterogeneity on Unmanned-Vehicle Team Performance

Carl E. Nehme; Ryan M. Kilgore; Mary L. Cummings

Several recent studies have addressed the possible impact of using highly autonomous platforms to invert today's multiple-operators-per-single-unmanned-vehicle control paradigm. These studies, however, have generally focused on homogeneous vehicle teams and have not addressed the potential effects of vehicle, capability, or mission type heterogeneity on operator control capacity. Important implications of heterogeneous unmanned teams include increases in the diversity of potential team configurations, as well as the diversity of possible attention allocation strategies that may be utilized by operators in managing a given vehicle team. This paper presents preliminary findings from a modeling and simulation effort exploring the impact of heterogeneity on the supervisory control of unmanned vehicle teams. Results from a discrete event simulation study suggest that performance costs of team heterogeneity are highly dependent on resultant changes in operator utilization. Heterogeneous teams that result in lower overall operator utilization may lead to improved performance under certain operator control strategies.
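The abstract does not include the underlying model itself, so purely as an illustrative sketch (with invented vehicle types, arrival rates, and service times, not the paper's parameters), the following Python snippet shows the general shape of a discrete event simulation in which a single operator services interaction requests from a mixed vehicle team, with utilization measured as the fraction of time spent on those requests.

```python
import random

# Illustrative sketch only (not the paper's model): one operator services
# interaction requests generated by a heterogeneous vehicle team.
# Vehicle types and timing parameters below are invented placeholders.
VEHICLE_TYPES = {
    "UAV": {"mean_interarrival": 60.0, "mean_service": 10.0},
    "UUV": {"mean_interarrival": 90.0, "mean_service": 20.0},
}

def simulate(team, sim_time=8 * 3600, seed=0):
    """Return operator utilization and mean task wait time for a team,
    given as a list of vehicle type names, e.g. ["UAV", "UAV", "UUV"]."""
    rng = random.Random(seed)
    # Generate interaction requests for each vehicle independently.
    requests = []
    for vehicle in team:
        params = VEHICLE_TYPES[vehicle]
        t = 0.0
        while True:
            t += rng.expovariate(1.0 / params["mean_interarrival"])
            if t >= sim_time:
                break
            service = rng.expovariate(1.0 / params["mean_service"])
            requests.append((t, service))
    requests.sort()
    # Serve requests first-come-first-served by a single operator.
    operator_free_at, busy_time, total_wait = 0.0, 0.0, 0.0
    for arrival, service in requests:
        start = max(arrival, operator_free_at)
        total_wait += start - arrival
        operator_free_at = start + service
        busy_time += service
    utilization = busy_time / sim_time
    mean_wait = total_wait / len(requests) if requests else 0.0
    return utilization, mean_wait

# Compare a homogeneous team with a heterogeneous team of the same size.
for team in (["UAV"] * 4, ["UAV", "UAV", "UUV", "UUV"]):
    util, wait = simulate(team)
    print(team, f"utilization={util:.2f}", f"mean wait={wait:.1f}s")
```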


IEEE International Conference on Technologies for Homeland Security | 2013

A Precision Information Environment (PIE) for emergency responders: Providing collaborative manipulation, role-tailored visualization, and integrated access to heterogeneous data

Ryan M. Kilgore; Alex Godwin; Amanda Davis; Chris Hogan

During a crisis, emergency responders must rapidly integrate information from many separate sources to satisfy their role-specific needs and to make time-sensitive decisions. Responders currently receive this information through numerous software applications and face the challenge of integrating this heterogeneous data into an all-encompassing picture. Individual responders with distinct roles, such as police, fire, and EMS, often have very different information needs, but existing tools do not provide individual tailoring of workspaces to support this need. As responders communicate using text, voice, or other multimodal collaboration systems, important details can also be lost or become stale over time. This paper describes our approach to developing a Precision Information Environment (PIE) that: (1) streamlines access to multiple information resources by fusing heterogeneous information for presentation through a single access point; (2) supports role-tailorable understanding of the unified data sources through a flexible workspace; and (3) supports collaboration between teams of local and distributed responders by providing a work environment that allows teams to share and manipulate dynamic data sources in real time. We also describe our initial results from a usability evaluation of the system with subject matter experts.
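The PIE itself is not specified at the code level in this abstract; the sketch below is only a rough illustration of the fuse-then-tailor idea it describes, with hypothetical feed names, record fields, and role mappings invented for the example.

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch only: fuse records from separate (hypothetical) feeds
# into a single unified picture, then tailor the view by responder role.
@dataclass
class Record:
    source: str      # originating feed, e.g. "traffic_cams" (invented)
    category: str    # e.g. "road_closure", "fire", "casualty"
    location: str
    summary: str

# Which categories each role cares about (invented mapping, not from the paper).
ROLE_FILTERS = {
    "police": {"road_closure", "crowd"},
    "fire": {"fire", "hazmat"},
    "ems": {"casualty"},
}

def fuse(*feeds: List[Record]) -> List[Record]:
    """Single access point: merge heterogeneous feeds into one list."""
    unified = [record for feed in feeds for record in feed]
    unified.sort(key=lambda r: r.location)  # any shared ordering will do
    return unified

def role_view(unified: List[Record], role: str) -> List[Record]:
    """Role-tailored slice of the unified picture."""
    wanted = ROLE_FILTERS.get(role, set())
    return [r for r in unified if r.category in wanted]

feed_a = [Record("traffic_cams", "road_closure", "5th & Main", "Debris on road")]
feed_b = [Record("911_calls", "casualty", "Civic Center", "Two injured")]
unified = fuse(feed_a, feed_b)
print([r.summary for r in role_view(unified, "ems")])
```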


Visual Analytics Science and Technology | 2010

Visualization of temporal relationships within coordinated views

Stephanie Dudzic; J. Alex Godwin; Ryan M. Kilgore

In command and control (C2) environments, decision makers must rapidly understand and address key temporal relationships that exist between critical tasks as conditions fluctuate. However, traditional temporal displays, such as mission timelines, fail to support user understanding of and reasoning about critical relationships. We have developed visualization methods to compactly and effectively convey key temporal constraints. In this paper, we present examples of our visualization approach and describe how we are exploring interaction methods within an integrated visualization workspace to support user awareness of temporal constraints.


Proceedings of SPIE | 2010

Visual strategies for enhancing user perception of task relationships in emergency operations centers

Stephanie Dudzic; Alex Godwin; Ryan M. Kilgore

In time-sensitive environments, such as DHS emergency operations centers (EOCs), it is imperative for decision makers to rapidly understand and address key logical relationships that exist between tasks, entities, and events, even as conditions fluctuate. These relationships often have important temporal characteristics, such as tasks that must be completed before others can be started (e.g., buses must be transported to an area before an evacuation process can begin). Unfortunately, traditional temporal display methods, such as mission timelines, typically reveal only rudimentary event details and fail to support user understanding of and reasoning about critical temporal constraints and interrelationships across multiple mission components. To address these shortcomings, we developed a visual language to enhance temporal data displays by explicitly and intuitively conveying these constraints and relationships to decision makers. In this paper, we detail these design strategies and describe ongoing evaluation efforts to assess their usability and effectiveness to support decision-making tasks in complex, time-sensitive environments. We present a case study in which we applied our visual enhancements to a timeline display, improving the perception of logical relationships among events in a Master Scenario Event List (MSEL). These methods reduce the cognitive workload of decision makers and improve the efficacy with which they identify critical event relationships.
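The visual language itself is graphical and is not reproduced here; as a minimal sketch of the underlying precedence data such a display would convey (event names and times are invented, borrowing the buses-before-evacuation example from the abstract), the snippet below records before/after constraints between scenario events and flags any whose scheduled times violate them.

```python
# Minimal illustrative sketch (not from the paper): precedence constraints
# between scenario events, checked against their scheduled times.
events = {
    "stage_buses": 10,        # scheduled start times, in minutes (invented)
    "open_shelter": 50,
    "begin_evacuation": 40,
}

# (earlier, later) pairs: the first event must occur before the second.
constraints = [
    ("stage_buses", "begin_evacuation"),   # example taken from the abstract
    ("open_shelter", "begin_evacuation"),  # invented second constraint
]

def violated(events, constraints):
    """Return the constraints whose scheduled times are out of order."""
    return [(a, b) for a, b in constraints if events[a] >= events[b]]

for a, b in violated(events, constraints):
    print(f"Constraint violated: '{a}' must precede '{b}'")
```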


Human Factors | 2009

Simple Displays of Talker Location Improve Voice Identification Performance in Multitalker, Spatialized Audio Environments

Ryan M. Kilgore

Objective: The aim of this study was to assess the voice identification benefits of visual depictions of the relative locations of spatialized talkers in a serial listening task. Background: Although spatialized audio is known to improve speech intelligibility and voice identification accuracy within multitalker environments, prior studies have not found any additional benefit for augmenting spatialized audio with visual depictions of relative voice locations. These studies, however, were restricted to small audio environments (four voices), potentially limiting the ability of simple talker location displays to provide additional identification benefit. Method: In the first experiment, 18 participants performed a voice identification task for four- and eight-voice environments under three display conditions: (a) nonspatialized voices with an audio-only display, (b) spatialized voices with an audio-only display, and (c) spatialized voices augmented by a visual display of relative talker locations. In the second experiment, 32 participants performed the same voice identification task within a spatialized eight-voice environment but with audio and visual displays of differing angular scale. Results: Visually depicting relative talker locations improved voice identification performance in terms of both accuracy and response time, particularly for more populous auditory spaces. Both auditory and visual display scale affected these benefits, with large-angle displays performing the best for both modalities. Conclusion: Results indicate that simple visual representations of spatialized audio environments help listeners identify voices and that these representations are more effective when the angular spacing (auditory and visual) between talker locations is increased. Application: These results have important implications for the design and implementation of collaborative audio environments for shared, desktop, and portable communication devices.


International Conference on Supporting Group Work | 2016

Towards Card-based User Interface Workspaces for Group Mission Planning

Stephanie Kane; Erika von Kelsch; Martin Voshell; Ryan M. Kilgore

Current mission planning interfaces are difficult to understand, cumbersome to use, and do not support the collaborative aspects of group mission planning. To address this critical shortfall, this paper describes a designed and demonstrated set of card-based user interfaces (card UIs) intended to increase the effectiveness of group mission planning workflows. These interfaces provide consistent visual structures for a diverse set of tasks across team members, enable team members to understand progress across distributed tasks, and facilitate situational awareness of the overall evolving mission plan. This paper describes key considerations for group mission planning activities and presents examples of our card UIs supporting group mission planning tasks.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2013

Cognitive Engineering Across Domains: What the Wide-angle View can Provide

Emilie M. Roth; Ryan M. Kilgore; Catherine M. Burns; Robert L. Wears; John D. Lee; Greg A. Jamieson; Ann M. Bisantz

A strength of the field of cognitive engineering and decision making (CEDM) lies in its wide applicability across the complex socio-technical systems that are ubiquitous in modern society. Methods and theoretical advances in CEDM have been both developed through, and adapted across, domains as diverse as nuclear power, health systems, and aviation. While all of these domains clearly differ in terms of their surface characteristics, cognitive engineers are able to make fundamental connections across domains. These connections are supported by the types of methodological tools deployed within CEDM and allow problem solutions to be extended and adapted across domains. This panel brings together researchers and practitioners who have worked in a wide variety of domains to discuss the design and methodological challenges they have faced and continue to face. The panel will focus on synthesizing these challenges across domains, drawing on both the panelists and members of the audience, with the goal of providing guidance and direction for future research.


International Conference on Virtual, Augmented and Mixed Reality | 2018

Helmet-Mounted Displays to Support Off-Axis Pilot Spatial Orientation

Stephanie Kane; Ryan M. Kilgore

Aerial refueling is one of the most demanding and dangerous activities faced by pilots. To monitor refueling, pilots must focus for long periods of time while looking up and outside the aircraft (“off-axis”), a more difficult task than focusing forward in the direction of flight (“on-axis”). To address these challenges, we designed a set of augmented reality display strategies for head-mounted displays (HMDs) that support pilot spatial orientation during off-axis activities, such as refueling. These display strategies include extending traditional on-axis displays (e.g., pitch ladders) for the off-axis context and designing new displays that convey critical information specifically tailored for the off-axis context. In this paper, we present our overall approach and a subset of concepts to address these needs. We also describe plans for formal evaluations.

Collaboration


Dive into Ryan M. Kilgore's collaborations.

Top Co-Authors

Chris Hogan (Charles River Laboratories)

Martin Voshell (Charles River Laboratories)

Carl E. Nehme (Massachusetts Institute of Technology)

J. Alex Godwin (Charles River Laboratories)

Stephanie Kane (Charles River Laboratories)

Alex Godwin (Charles River Laboratories)

Michael Jenkins (Charles River Laboratories)

Stephanie Dudzic (Charles River Laboratories)