Publication


Featured research published by Heath A. Ruff.


Presence: Teleoperators and Virtual Environments | 2002

Human interaction with levels of automation and decision-aid fidelity in the supervisory control of multiple simulated unmanned air vehicles

Heath A. Ruff; S. Narayanan; Mark H. Draper

Remotely operated vehicles (ROVs) are vehicular robotic systems that are teleoperated by a geographically separated user. Advances in computing technology have enabled ROV operators to manage multiple ROVs by means of supervisory control techniques. The challenge of incorporating telepresence in any one vehicle is replaced by the need to keep the human in the loop of the activities of all vehicles. An evaluation was conducted to compare the effects of automation level and decision-aid fidelity on the number of simulated remotely operated vehicles that could be successfully controlled by a single operator during a target acquisition task. The specific ROVs instantiated for the study were unmanned air vehicles (UAVs). Levels of automation (LOAs) included manual control, management-by-consent, and management-by-exception. Levels of decision-aid fidelity (100% correct and 95% correct) were achieved by intentionally injecting error into the decision-aiding capabilities of the simulation. Additionally, the number of UAVs to be controlled varied (one, two, and four vehicles). Twelve participants acted as UAV operators. A mixed-subject design was utilized (with decision-aid fidelity as the between-subjects factor), and participants were not informed of decision-aid fidelity prior to data collection. Dependent variables included mission efficiency, percentage correct detection of incorrect decision aids, workload and situation awareness ratings, and trust in automation ratings. Results indicate that an automation level incorporating management-by-consent had some clear performance advantages over the more autonomous (management-by-exception) and less autonomous (manual control) levels of automation. However, automation level interacted with the other factors for subjective measures of workload, situation awareness, and trust. Additionally, although a 3D perspective view of the mission scene was always available, it was used only during low-workload periods and did not appear to improve the operators' sense of presence. The implications for ROV interface design are discussed, and future research directions are proposed.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2003

Manual versus Speech Input for Unmanned Aerial Vehicle Control Station Operations

Mark H. Draper; Gloria L. Calhoun; Heath A. Ruff; David T. Williamson; Timothy P. Barry

Unmanned aerial vehicle (UAV) control stations feature multiple menu pages with systems accessed by keyboard presses. Use of speech-based input may enable operators to navigate through menus and select options more quickly. This experiment examined the utility of conventional manual input versus speech input for tasks performed by operators of a UAV control station simulator at two levels of mission difficulty. Pilots performed a continuous flight/navigation control task while completing eight different data entry task types with each input modality. Results showed that speech input was significantly better than manual input in terms of task completion time, task accuracy, flight/navigation measures, and pilot ratings. Across tasks, data entry time was reduced by approximately 40% with speech input. Additional research is warranted to confirm that this head-up, hands-free control is still beneficial in operational UAV control station auditory environments and does not conflict with intercom operations and intra-crew communications.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2004

Tactile versus Aural Redundant Alert Cues for UAV Control Applications

Gloria L. Calhoun; John V. Fontejon; Mark H. Draper; Heath A. Ruff; Brian J. Guilfoos

In complex UAV control stations, it is important to alert operators to actionable information in a timely manner. Tactile displays may alleviate visual workload by transmitting information through the skin, cueing operators to high priority, unexpected events. The utility of tactile alerts (vibration on wrists) in substitution for aural alerts, as a redundant cue to visual alerts, was examined. Participants responded to critical events alerted with aural or tactile redundant cues, while performing multiple tasks in a simulated UAV control station. Results showed that there were no significant performance differences between the conditions employing unique aural and tactile cues. These data suggest that the non-visual alerts may be equally compelling and the tactile alerts can substitute for aural alerts as a redundant cue to visual cues. Also, there was not a strong indication that tactile alerts were advantageous in a high noise environment. However, subjective comments and trends in the data suggest that tactile alerts may be especially advantageous in noisy task environments requiring long periods of vigilance.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2012

Adaptable and Adaptive Automation for Supervisory Control of Multiple Autonomous Vehicles

Brian Kidwell; Gloria L. Calhoun; Heath A. Ruff; Raja Parasuraman

Supervisory control of multiple autonomous vehicles raises many issues concerning the balance of system autonomy with human interaction for optimal operator situation awareness and system performance. An unmanned vehicle simulation designed to manipulate the application of automation was used to evaluate participants’ performance on image analysis tasks under two automation control schemes: adaptable (level of automation directly manipulated by participant throughout trials) and adaptive (level of automation adapted as a function of participants’ performance on four types of tasks). The results showed that while adaptable automation increased workload, it also improved change detection, as well as operator confidence in task-related decision-making.
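To make the two control schemes concrete, the following is a minimal, hypothetical Python sketch: the LOA labels, task names, thresholds, and update rule are assumptions for illustration only, not the study's actual implementation. It contrasts an adaptable scheme, in which the operator sets the level of automation (LOA) directly, with an adaptive scheme, in which the LOA is adjusted as a function of recent task performance.

```python
# Hypothetical sketch of adaptable vs. adaptive level-of-automation (LOA) control.
# LOA labels, task names, thresholds, and the update rule are illustrative assumptions.

LOA_LEVELS = ["manual", "management-by-consent", "management-by-exception"]

class AdaptableAutomation:
    """Adaptable scheme: the operator sets the LOA directly during the trial."""
    def __init__(self, start_level=0):
        self.level = start_level

    def operator_set_level(self, level: int) -> str:
        self.level = max(0, min(level, len(LOA_LEVELS) - 1))
        return LOA_LEVELS[self.level]

class AdaptiveAutomation:
    """Adaptive scheme: the LOA is adjusted from recent performance on several tasks."""
    def __init__(self, start_level=0, low=0.6, high=0.85):
        self.level = start_level
        self.low, self.high = low, high  # assumed performance thresholds

    def update(self, task_scores: dict) -> str:
        mean_score = sum(task_scores.values()) / len(task_scores)
        if mean_score < self.low and self.level < len(LOA_LEVELS) - 1:
            self.level += 1   # performance degraded: hand more work to automation
        elif mean_score > self.high and self.level > 0:
            self.level -= 1   # performance recovered: return authority to the operator
        return LOA_LEVELS[self.level]

# Example: the adaptive LOA reacts to scores on four (hypothetical) task types.
adaptive = AdaptiveAutomation()
print(adaptive.update({"image_analysis": 0.5, "route_edit": 0.6,
                       "chat_response": 0.55, "system_monitor": 0.7}))
# -> "management-by-consent" (automation level raised after poor performance)
```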


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2003

Evaluation of Tactile Alerts for Control Station Operation

Gloria L. Calhoun; Mark H. Draper; Heath A. Ruff; John V. Fontejon; Brian J. Guilfoos

In complex systems, it is difficult to efficiently provide operators with timely and meaningful information. Tactile displays may alleviate visual workload by transmitting information through the skin. This study examined the utility of active tactile alerts versus salient visual and/or auditory alerts in an unmanned aerial vehicle ground control station simulation. Tactor location (right or left arm) and number of tactors vibrating (one or two) were used to code three types of alerts. Results showed that tactile stimulation, when presented in concert with visual and auditory alerts, did not aid (or degrade) performance, suggesting that tactile alerts could substitute for auditory alerts when the aural channel is overloaded. Results also indicated that tactile alerts used as the sole cue, compared to a visual alert, improved reaction time, which suggests that vibratory stimulation may be an effective non-redundant cue. Further research is warranted to determine how best to apply tactile alerts in control stations to reduce visual and auditory workload.


Digital Avionics Systems Conference | 2003

A human factors testbed for command and control of unmanned air vehicles

Kam S. Tso; Gregory K. Tharp; Ann T. Tai; Mark H. Draper; Gloria L. Calhoun; Heath A. Ruff

This paper presents a testbed built upon the Multi-Modal Immersive Intelligent Interface for Remote Operation (MIIIRO) to support UAV control. The testbed implements a client/server architecture in which UAV operations are simulated on a server that maintains the states of the UAVs. The testbed supports both route planning and the execution of human factors experiments.
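As a rough illustration of the client/server pattern described in the abstract, the sketch below shows a server object holding the state of each simulated UAV and answering route-planning and state-query requests from clients. Only the architectural split is drawn from the abstract; every class name, field, and call below is an assumption for illustration.

```python
# Hypothetical sketch of a client/server UAV simulation testbed.
# Only the client/server split and server-side UAV state come from the abstract;
# all names, fields, and update logic are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class UAVState:
    uav_id: str
    position: tuple = (0.0, 0.0)   # (x, y) in arbitrary simulation units
    heading_deg: float = 0.0
    route: list = field(default_factory=list)

class SimulationServer:
    """Maintains the state of every simulated UAV and serves client requests."""
    def __init__(self, uav_ids):
        self.uavs = {uid: UAVState(uid) for uid in uav_ids}

    def set_route(self, uav_id, waypoints):
        # A route-planning client uploads a new flight plan for one vehicle.
        self.uavs[uav_id].route = list(waypoints)

    def get_state(self, uav_id) -> UAVState:
        # An operator-interface client polls the current vehicle state.
        return self.uavs[uav_id]

# Example client interaction (in a real testbed these calls would cross a network).
server = SimulationServer(["UAV-1", "UAV-2"])
server.set_route("UAV-1", [(10.0, 5.0), (20.0, 12.0)])
print(server.get_state("UAV-1").route)   # -> [(10.0, 5.0), (20.0, 12.0)]
```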


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2002

Utility of a Tactile Display for Cueing Faults

Gloria L. Calhoun; Mark H. Draper; Heath A. Ruff; John V. Fontejon

Tactile displays have been proposed as a multisensory interface technology that can relieve the typically overburdened visual channel of operators. This study compared the ability of operators, while simultaneously completing a tracking task, to detect and identify system faults in a monitoring task with three types of alert cues: tactile, visual, and redundant tactile and visual. For the tactile display, the location and vibration pulse speed of two tactors were mapped to four system faults. Response time was significantly faster with the tactile cue. Also, the tactile cue resulted in less interference with the concurrent tracking task, while not degrading vigilance to an additional concurrent visual monitoring task. These results suggest that further tactile cue research is warranted to examine potential applications in complex systems, such as control stations for unmanned aerial vehicles.


Journal of Cognitive Engineering and Decision Making | 2011

Automation-Level Transference Effects in Simulated Multiple Unmanned Aerial Vehicle Control

Gloria L. Calhoun; Heath A. Ruff; Mark H. Draper; Evan J. Wright

Supervisory control of multiple unmanned aerial vehicles (UAVs) raises many questions concerning the balance of system autonomy with human interaction for effective operator situation awareness and system performance. The reported experiment used a UAV simulation environment to evaluate two applications of autonomy levels across two primary control tasks: allocation (assignment of sensor tasks to vehicles) and router (determining vehicles’ flight plans). In one application, the autonomy level was the same across these two tasks. In the other, the autonomy levels differed, one of the two tasks being more automated than the other. Trials also involved completion of other mission-related secondary tasks as participants supervised three UAVs. The results showed that performance on both the primary tasks and many secondary tasks was better when the level of automation was the same across the two sequential primary tasks. These findings suggest that having the level of automation similar across closely coupled tasks reduces mode awareness problems, which can negate the intended benefits of a fine-grained application of automation. Several research issues are identified to further explore the impact of automation-level transference in supervisory control applications involving the application of automation across numerous tasks.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2011

Performance-based Adaptive Automation for Supervisory Control

Gloria L. Calhoun; Victoria B.R. Ward; Heath A. Ruff

Supervisory control of multiple autonomous vehicles raises many issues concerning the balance of system autonomy with human interaction for optimal operator situation awareness and system performance. An unmanned vehicle simulation designed to manipulate the application of automation was used to evaluate participants’ performance on image analysis tasks under two automation conditions: static (level of automation remained constant throughout trials) and adaptive (level of automation adapted as a function of performance on five types of tasks). The results showed that performance-based adaptation of the image task autonomy level improved performance on this task, as well as other tasks. Additionally, participants preferred the adaptive automation condition and felt that it reduced their cognitive workload and aided performance. Research issues are identified to further evaluate performance-based adaptation for supervisory control.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2005

Tactile and aural alerts in high auditory load UAV control environments

Gloria L. Calhoun; Mark H. Draper; Brian J. Guilfoos; Heath A. Ruff

Tactile displays may alleviate visual workload in complex UAV control stations, cueing operators to high priority events via the haptic channel. Previous results suggest that tactile alerts (vibration on wrists) can substitute for aural alerts, as a redundant cue to visual alerts in relatively short test sessions. The present experiment investigated whether tactile alerts are advantageous in high auditory loads during longer periods of vigilance. Participants responded to events alerted via aural or tactile redundant cues, while performing multiple tasks in a simulated UAV control station. Results did not show an advantage of tactile over aural alerts in high auditory loads over 30-minute periods. Despite the lack of performance advantage of tactile alerts over aural alerts, research participants favored the tactile alerts, rating them as more salient and faster in attracting their attention.

Collaboration


Dive into Heath A. Ruff's collaborations.

Top Co-Authors

Gloria L. Calhoun (Wright-Patterson Air Force Base)
Mark H. Draper (Wright-Patterson Air Force Base)
Sarah Spriggs (Air Force Research Laboratory)
Gerald Matthews (University of Central Florida)
Gregory J. Funke (Air Force Research Laboratory)
Jinchao Lin (University of Central Florida)
Ryan Wohleber (University of Central Florida)
Jessica Bartik (Air Force Research Laboratory)
Austen T. Lefebvre (Air Force Research Laboratory)