Summer L. Brandt
San Jose State University
Publication
Featured research published by Summer L. Brandt.
Symposium on Human Interface and Management of Information | 2009
Arik-Quang V. Dao; Summer L. Brandt; Vernol Battiste; Kim-Phuong L. Vu; Thomas Z. Strybel; Walter W. Johnson
This study compared situation awareness across three flight deck decision-aiding modes. Pilots resolved air traffic conflicts using a click-and-drag software tool. In the automated aiding condition, pilots executed all resolutions generated by the automation. In the interactive condition, automation suggested a maneuver, but pilots could accept or modify the provided resolution. In the manual condition, pilots generated resolutions independently. A technique combining the Situation Awareness Global Assessment Technique (SAGAT) and the Situation Present Assessment Method (SPAM) was used to assess situation awareness. Results showed that situation awareness was better in the manual and interactive conditions than in the automated condition. This finding suggests that pilots maintain greater situation awareness when they are actively engaged in the conflict resolution process.
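As an informal illustration only (not the scoring procedure reported in the study), a combined SAGAT/SPAM-style measure can be thought of as pairing probe accuracy with response latency; the sketch below assumes hypothetical probe data and field names:

```python
# Illustrative sketch only -- not the scoring method from the paper.
# Assumes each situation-awareness probe yields a correctness flag and a
# response time, roughly in the spirit of combining SAGAT (accuracy)
# with SPAM (latency on correct responses).

from dataclasses import dataclass
from statistics import mean

@dataclass
class ProbeResponse:
    correct: bool           # SAGAT-style: was the answer accurate?
    response_time_s: float  # SPAM-style: how long did the pilot take?

def sa_summary(responses: list[ProbeResponse]) -> dict:
    """Summarize situation-awareness probes for one experimental condition."""
    accuracy = mean(1.0 if r.correct else 0.0 for r in responses)
    # SPAM conventionally scores latency only on correct responses.
    correct_times = [r.response_time_s for r in responses if r.correct]
    mean_rt = mean(correct_times) if correct_times else float("nan")
    return {"accuracy": accuracy, "mean_correct_rt_s": mean_rt}

# Example: compare a manual condition against an automated condition.
manual = [ProbeResponse(True, 4.2), ProbeResponse(True, 5.1), ProbeResponse(False, 7.9)]
automated = [ProbeResponse(False, 6.5), ProbeResponse(True, 8.0), ProbeResponse(False, 9.3)]
print(sa_summary(manual), sa_summary(automated))
```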
International Conference on Human-Computer Interaction | 2014
Joel Lachter; Summer L. Brandt; Vernol Battiste; Sarah V. Ligda; Michael Matessa; Walter W. Johnson
This document describes the second human-in-the-loop study in a series examining the role of a ground operator in enabling single pilot operations (SPO). The focus of this study was decision-making and communication within a distributed crew (an airborne pilot and a ground operator). A prototype ground station and tools designed to enhance collaboration were also assessed for further development. Eighteen crews flew challenging, off-nominal scenarios in three configurations: Baseline (current two-pilot operations) and SPO with and without Collaboration Tools. Subjective ratings were largely favorable to SPO; however, participants preferred the Baseline configuration. Crew comments suggest improvements to increase the usability of the collaboration tools.
International Journal of Human-Computer Interaction | 2012
Kim-Phuong L. Vu; Thomas Z. Strybel; Vernol Battiste; Joel Lachter; Arik-Quang V. Dao; Summer L. Brandt; Sarah V. Ligda; Walter W. Johnson
The Next Generation Air Transportation System (NextGen) will revolutionize the air traffic management system in the United States. NextGen will involve human operators interacting with new technologies in a complex system, making human factors and human–computer interaction considerations a major concern. The present study reports data from a human-in-the-loop simulation that evaluated pilot performance, workload, and situation awareness under one of three plausible NextGen concepts of operation. The concepts of operation differed with respect to the allocation of separation responsibility across human pilots and air traffic controllers (ATCs), and automation. Pilots were asked to employ trajectory-based operations to perform weather avoidance maneuvers, an interval management task, and a continuous descent approach. Depending on the concept being tested, they were also given the responsibility of separation assurance (Concept 1) or received conflict resolutions from an ATC (Concept 2) or automated system (Concept 3). Overall, pilot performance on the various flight tasks was worse in Concept 3 than in Concepts 1 and 2. Although pilot workload did not differ across the three concepts, pilot situation awareness was highest in Concept 1, in which the pilots were given the most responsibilities. These findings suggest that keeping pilots engaged in separation assurance tasks may be preferable to having them rely on automation alone.
International Conference on Applied Human Factors and Ergonomics | 2017
Summer L. Brandt; Joel Lachter; Ricky Russell; Robert J. Shively
Human involvement with increasingly autonomous systems must adjust to allow for a more dynamic relationship involving cooperation and teamwork. As part of an ongoing project to develop a framework for human-autonomy teaming (HAT) in aviation, a study was conducted to evaluate proposed tenets of HAT. Participants performed a flight-following task at a ground station both with and without HAT features enabled. Overall, participants preferred the ground station with HAT features enabled over the station without the HAT features. Participants reported that the HAT displays and automation were preferred for keeping up with operationally important issues. Additionally, participants reported that the HAT displays and automation provided enough situation awareness to complete the task, reduced the necessary workload and were efficient. Overall, there was general agreement that HAT features supported teaming with the automation. These results will be used to refine and expand our proposed framework for human-autonomy teaming.
International Conference on Applied Human Factors and Ergonomics | 2017
R. Jay Shively; Joel Lachter; Summer L. Brandt; Michael Matessa; Vernol Battiste; Walter W. Johnson
Automation has entered nearly every aspect of our lives, but it often remains hard to understand. Why is this? Automation is often brittle, requiring constant human oversight to assure it operates as intended. This oversight has become harder as automation has become more complicated. To resolve this problem, Human-Autonomy Teaming (HAT) has been proposed. HAT is based on advances in providing automation transparency, a method for giving insight into the reasoning behind automated recommendations and actions, along with advances in human-automation communication (e.g., voice). These, in turn, permit more trust in the automation when appropriate, and less when not, allowing more targeted supervision of automated functions. This paper proposes a framework for HAT incorporating three key tenets: transparency, bi-directional communication, and operator-directed authority. These tenets, along with more capable automation, represent a shift in human-automation relations.
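As a rough, hypothetical sketch of how the three tenets could map onto an automation interface (the class and method names below are illustrative assumptions, not from the paper):

```python
# Hypothetical sketch of the three HAT tenets as an automation-agent interface.
# Names and structure are assumptions for illustration, not from the paper.

from dataclasses import dataclass, field

@dataclass
class Recommendation:
    action: str
    rationale: list[str]  # transparency: reasoning behind the recommendation
    confidence: float     # transparency: how sure the automation is

@dataclass
class HATAgent:
    pending: list[Recommendation] = field(default_factory=list)

    def recommend(self, action: str, rationale: list[str], confidence: float) -> Recommendation:
        """Automation proposes an action and exposes its reasoning (transparency)."""
        rec = Recommendation(action, rationale, confidence)
        self.pending.append(rec)
        return rec

    def receive_operator_input(self, message: str) -> None:
        """Bi-directional communication: the operator can query or redirect the automation."""
        print(f"Operator -> automation: {message}")

    def execute(self, rec: Recommendation, operator_approved: bool) -> bool:
        """Operator-directed authority: nothing executes without the human's direction."""
        if not operator_approved:
            return False
        print(f"Executing: {rec.action}")
        return True
```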
Cognition, Technology & Work | 2017
Joel Lachter; Summer L. Brandt; Vernol Battiste; Michael Matessa; Walter W. Johnson
From the 1950s through the 1980s, aircraft design was marked by an increase in reliability and automation and, correspondingly, a decrease in the crew complement required to fly, resulting in the two-pilot operations seen today. However, while technological progress has continued, there have been no further reductions in crew complement, largely because the two pilots mitigate each other's failures (both mistakes and incapacitation). We present a conceptual framework under which we believe crew complement could be reduced while maintaining current levels of safety. Under this framework, much of the monitoring and verification would fall upon automation. Ground personnel performing an enhanced flight-following role would aid the remaining pilot in assessing any off-nominal event. Additionally, in particularly high-workload or risky situations, a ground pilot could step into the role of first officer. We then discuss four human-in-the-loop simulations conducted at NASA Ames Research Center that illustrate key aspects of this conceptual framework and informed its development.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2010
Arik-Quang V. Dao; Joel Lachter; Vernol Battiste; Summer L. Brandt; Kim-Phuong L. Vu; Thomas Z. Strybel; Nhut Ho; Patrick Martin; Walter W. Johnson
In this study, pilots were asked to achieve a specific time in trail while flying an arrival into Louisville International Airport. Weather shortly before the start of the descent added variability to the initial intervals. A spacing tool calculated airspeeds intended to achieve the desired time in trail at the final approach fix. Pilots were exposed to four experimental conditions that varied how strictly they were told to follow these speeds and whether the speeds had to be entered into the autopilot manually. Giving the pilots more discretion had little effect on the final spacing interval. However, pilots required to enter speeds into the autopilot manually did not manage their airplane's energy effectively, resulting in less accurate performance. While these results may not always generalize to alternative spacing implementations, one should not assume that pilots manually closing the loop on automated commands can perform as well as a fully automated system.
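For illustration only, the kind of computation such a spacing tool performs can be sketched as a simple proportional speed correction toward a target time in trail; the gain, speed limits, and values below are assumptions for the example, not the algorithm used in the study:

```python
# Illustrative sketch, not the study's spacing algorithm: a proportional
# speed correction toward a desired time in trail. All constants are assumed.

def commanded_airspeed(current_spacing_s: float,
                       target_spacing_s: float,
                       nominal_speed_kts: float,
                       gain_kts_per_s: float = 0.5,
                       speed_limits_kts: tuple[float, float] = (210.0, 280.0)) -> float:
    """Return a speed command that nudges the aircraft toward the target time in trail.

    If the aircraft is too far behind the target interval, speed up;
    if it is too close, slow down; clamp to operational speed limits.
    """
    error_s = current_spacing_s - target_spacing_s  # positive = too far behind
    speed = nominal_speed_kts + gain_kts_per_s * error_s
    lo, hi = speed_limits_kts
    return max(lo, min(hi, speed))

# Example: 12 s behind the target interval at a 240 kt nominal speed.
print(commanded_airspeed(current_spacing_s=102.0, target_spacing_s=90.0,
                         nominal_speed_kts=240.0))  # -> 246.0 kt
```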
International Conference on Applied Human Factors and Ergonomics | 2017
Joel Lachter; Summer L. Brandt; Garrett G. Sadler; R. Jay Shively
Elsewhere we have discussed a number of problems typical of highly automated systems and proposed tenets for addressing these problems based on Human-Autonomy Teaming (HAT) [1]. We have examined these principles in the context of aviation [2, 3]. Here we discuss the generality of these tenets by examining how they might be applied to photography and automotive navigation. While these domains are very different, we find application of our HAT tenets provides a number of opportunities for improving interaction between human operators and automation. We then illustrate how the generalities found across aviation, photography and navigation can be captured in a design pattern.
International Conference on Human Interface and Management of Information | 2018
Vernol Battiste; Joel Lachter; Summer L. Brandt; Armando Alvarez; Thomas Z. Strybel; Kim-Phuong L. Vu
Full autonomy seems to be the goal for system developers in almost every area of the economy. However, as we move from automated systems to autonomous systems, designers have needed to insert humans to oversee automation that has traditionally been brittle or incomplete. This creates its own problems, as the operator is usually out of the loop when the automation hands over problems it cannot handle. To better handle these situations, it has been proposed that we develop human-automation teams that have shared goals and objectives to support task performance. This paper first summarizes a body of research to develop ground station automation support for single pilot transport operations. The paper then describes an initial model of Human-Automation Teaming (HAT) with three elements: transparency, bi-directional communication, and human-directed execution. Transparency in our model is a method for giving insight into the reasoning behind automated recommendations and actions, bi-directional communication allows the operator to communicate directly with the automation, and human-directed execution means the automation defers execution to the human. The model was implemented through a number of features on an electronic flight bag (EFB), which are described in the paper. The EFB was installed in a mid-fidelity flight simulator and used by 12 airline pilots to support diversion decisions during off-nominal flight scenarios. Pilots reported that working with the HAT automation made diversion decisions easier and reduced their workload. They also reported that the information provided about diversion airports was similar to what they would receive from ground dispatch, making coordination with dispatch easier and less time consuming. These HAT features engender more trust in the automation when appropriate, and less when not, allowing improved supervision of automated functions by flight crews.
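As a purely illustrative sketch of a transparent, human-directed diversion ranking in the spirit of the EFB features described above (the factors, weights, and airports are invented for the example and are not from the study):

```python
# Hypothetical illustration of a transparent diversion recommendation.
# Factors, weights, and airport data are invented for the example.

from dataclasses import dataclass

@dataclass
class DiversionOption:
    airport: str
    factor_scores: dict[str, float]  # e.g. weather, runway suitability, fuel margin

def rank_diversions(options: list[DiversionOption],
                    weights: dict[str, float]) -> list[tuple[str, float, dict[str, float]]]:
    """Score each diversion airport and expose the per-factor breakdown (transparency)."""
    ranked = []
    for opt in options:
        contributions = {f: weights.get(f, 0.0) * s for f, s in opt.factor_scores.items()}
        ranked.append((opt.airport, sum(contributions.values()), contributions))
    # Highest total score first; the pilot, not the automation, chooses and executes
    # the diversion (human-directed execution).
    return sorted(ranked, key=lambda item: item[1], reverse=True)

options = [
    DiversionOption("KSFO", {"weather": 0.9, "runway": 1.0, "fuel": 0.8}),
    DiversionOption("KSJC", {"weather": 0.7, "runway": 0.8, "fuel": 0.9}),
]
weights = {"weather": 0.5, "runway": 0.3, "fuel": 0.2}
for airport, score, breakdown in rank_diversions(options, weights):
    print(airport, round(score, 2), breakdown)
```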
International Conference on Human Interface and Management of Information | 2011
Arik-Quang V. Dao; Summer L. Brandt; L. Paige Bacon; Joshua M. Kraut; Jimmy H. Nguyen; Katsumi Minakata; Hamzah Raza; Walter W. Johnson
This study compared pilot situation awareness across three traffic management concepts that varied the allocation of traffic separation responsibility among pilots, air traffic controllers, and an automation system. In Concept 1, the flight deck was equipped with conflict resolution tools that enabled pilots to perform weather avoidance and self-separation from surrounding traffic. In Concept 2, air traffic controllers were responsible for traffic separation, but pilots were provided tools for weather and traffic avoidance. In Concept 3, ground-based automation was used for conflict detection and resolution, and the flight deck tools allowed pilots to deviate for weather but not to detect conflicts. Results showed that pilot situation awareness was highest in Concept 1, where the pilots were most engaged, and lowest in Concept 3, where automation was heavily used. These findings suggest that pilot situation awareness on conflict resolution tasks can be improved by keeping pilots in the decision-making loop.