Publications


Featured research published by Michael W. Haas.


Human Factors | 2000

Postural Instability and Motion Sickness in a Fixed-Base Flight Simulator

Thomas A. Stoffregen; Lawrence J. Hettinger; Michael W. Haas; Merry M. Roe; L. James Smart

We evaluated the prediction that postural instability would precede the subjective symptoms of motion sickness in a fixed-base flight simulator. Participants sat in a cockpit in a video projection dome and were exposed to optical flow that oscillated in the roll axis with exposure durations typical of flight simulation. The frequencies of oscillation were those that characterize spontaneous postural sway during stance. Head motion was measured prior to and during exposure to imposed optical flow. Of 14 participants, 6 were classified as motion sick, either during or after exposure to the optical oscillation. Prior to the onset of subjective symptoms, head motion among participants who later became sick was significantly greater than among participants who did not become motion sick. We argue that the results support the postural instability theory of motion sickness. Actual or potential applications include the prevention or mitigation of motion sickness in virtual environments.
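
One conventional way to quantify head or postural motion of the kind measured in this study is the root-mean-square (RMS) displacement of head position about its mean. The Python sketch below, using synthetic numbers, is only an assumed illustration of such a metric, not the specific measures reported in the paper.

```python
# Hedged sketch: RMS head displacement as a simple postural-sway metric.
# The data, sampling, and units here are illustrative assumptions.
import numpy as np

def rms_sway(head_positions: np.ndarray) -> float:
    """RMS displacement of head position samples (N x 2 array, e.g. AP/ML in cm)."""
    centered = head_positions - head_positions.mean(axis=0)
    return float(np.sqrt((centered ** 2).sum(axis=1).mean()))

# Synthetic example: larger random excursions produce a larger sway score.
rng = np.random.default_rng(1)
stable = rng.normal(0.0, 0.2, size=(600, 2))     # small excursions (cm)
unstable = rng.normal(0.0, 0.8, size=(600, 2))   # larger excursions (cm)
print(f"stable RMS sway:   {rms_sway(stable):.2f} cm")
print(f"unstable RMS sway: {rms_sway(unstable):.2f} cm")
```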


Human Factors | 1998

Effects of Localized Auditory Information on Visual Target Detection Performance Using a Helmet-Mounted Display

W. Todd Nelson; Lawrence J. Hettinger; James A. Cunningham; Bart J. Brickman; Michael W. Haas; Richard L. McKinley

An experiment was conducted to evaluate the effects of localized auditory information on visual target detection performance. Visual targets were presented on either a wide field-of-view dome display or a helmet-mounted display and were accompanied by either localized, nonlocalized, or no auditory information. The addition of localized auditory information resulted in significant increases in target detection performance and significant reductions in workload ratings as compared with conditions in which auditory information was either nonlocalized or absent. Qualitative and quantitative analyses of participants' head motions revealed that the addition of localized auditory information resulted in extremely efficient and consistent search strategies. Implications for the development and design of multisensory virtual environments are discussed. Actual or potential applications of this research include the use of spatial auditory displays to augment visual information presented in helmet-mounted displays, thereby leading to increases in performance efficiency, reductions in physical and mental workload, and enhanced spatial awareness of objects in the environment.


Archive | 2003

Virtual and Adaptive Environments: Applications, Implications, and Human Performance Issues

Lawrence J. Hettinger; Michael W. Haas

Preface
Introduction, L. Hettinger and M. Haas

GENERAL ISSUES IN THE DESIGN AND USE OF VIRTUAL AND ADAPTIVE ENVIRONMENTS
Visual Perception of Egocentric Distance in Real and Virtual Environments, J. Loomis and J. Knapp
A Unified Approach to Presence and Motion Sickness, J. Prothero and D. Parker
Transfer of Training in Virtual Environments: Issues for Human Performance, M. Sebrechts, C. Lathan, D. Clawson, M. Miller, and C. Trepagnier
Beyond the Limits of Real-Time Realism: Moving From Stimulation Correspondence to Information Correspondence, P.J. Stappers, W. Gaver, and K. Overbeeke
On the Nature and Evaluation of Fidelity in Virtual Environments, T. Stoffregen, B. Bardy, L.J. Smart, and R. Pagulayan
Adapting to Telesystems, R. Welch

VIRTUAL ENVIRONMENTS
A Tongue-Based Tactile Display for Portrayal of Environmental Characteristics, P. Bach-y-Rita, K. Kaczmarek, and M. Tyler
Spatial Audio Displays for Target Acquisition and Speech Communications, R. Bolia and W.T. Nelson
Learning Action Plans in a Virtual Environment, S. Goss and A. Pearce
Fidelity of Disparity-Based Stereopsis, I. Howard
Configural Scoring of Simulator Sickness, Cybersickness, and Space Adaptation Syndrome: Similarities and Differences, R.S. Kennedy, J.M. Drexler, D.E. Compton, K.M. Stanney, D.S. Lanham, and D.L. Harm
A Cybernetic Analysis of the Tunnel-in-the-Sky Display, M. Mulder, H. Stassen, and J.A. Mulder
Alternative Control Technology for Uninhabited Aerial Vehicles: Human Factors Considerations, W.T. Nelson, T.R. Anderson, and G.R. McMillan
Medical Applications of Virtual Reality, R. Satava and S. Jones
Face-to-Face Communication, N.M. Thalmann, P. Kalra, and M. Escher
Integration of Human Factors Aspects in the Design of Spatial Navigation Displays, E. Theunissen
Implementing Perception-Action Coupling for Laparoscopy, F. Voorhorst, K. Overbeeke, and G. Smets
Psychological and Physiological Issues in the Medical Use of Virtual Reality, T. Yamaguchi

ADAPTIVE ENVIRONMENTS
Supporting the Adaptive Human Expert: A Critical Element in the Design of Meaning Processing Systems, J. Flach and C. Dominguez
A Human Factors Approach to Adaptive Aids, S. Hourlier, J-Y. Grau, and C. Valot
Adaptive Pilot/Vehicle Interfaces for the Tactical Air Environment, S.S. Mulgund, G. Rinkus, and G. Zacharias
The Implementation of Psycho-Electrophysiological Interface Technologies for the Control and Feedback of Interactive Virtual Environments, A. Sagi-Dolev


The International Journal of Aviation Psychology | 2001

A Theoretical Analysis and Preliminary Investigation of Dynamically Adaptive Interfaces

Kevin B. Bennett; Jeffrey D. Cress; Lawrence J. Hettinger; Dean Stautberg; Michael W. Haas

A dynamically adaptive interface (DAI) is a computer interface that changes the display or control characteristics of a system (or both) in real time. The goal of DAIs is to anticipate informational needs or desires of the user and provide that information without the requirement of an explicit control input by the user. DAIs have the potential to improve overall human-machine system performance if properly designed; they also have a very real potential to degrade performance if they are not properly designed. This article explores both theoretical and practical issues in the design of DAIs. The relation of the DAI concept to decision aiding and automation is discussed, and a theoretical framework for design is outlined. A preliminary investigation of the DAI design concept was conducted in the domain of aviation (precision, low-level navigation). Nontraditional controls (a force reflecting stick) and displays (a configural flight director) were developed to support performance at the task. A standard interface (conventional controls and displays), a candidate interface (alternative controls and displays), and an adaptive interface (dynamically alternating between the standard and candidate displays) were evaluated. The results indicate that significant performance advantages in the quality of route navigation were obtained with the candidate and adaptive interfaces relative to the standard interface; no significant differences between the candidate and adaptive interfaces were obtained. The implications of these results are discussed, with special emphasis on their relation to fundamental challenges that must be met for the DAI concept to be a viable design alternative.
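
The DAI concept described above lends itself to a small illustration. The following Python sketch is hypothetical and is not the interface evaluated in the article: it merely switches between a standard and a candidate display/control configuration based on simple task-state thresholds, where route error and turbulence level are assumed stand-ins for the real-time state estimation the article describes.

```python
# Hypothetical sketch of a dynamically adaptive interface (DAI):
# the configuration changes in real time, without an explicit pilot input.
# Configuration names and thresholds below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class InterfaceConfig:
    display: str   # e.g. a conventional vs. a configural flight director
    control: str   # e.g. a conventional vs. a force-reflecting stick

STANDARD = InterfaceConfig("conventional flight director", "conventional stick")
CANDIDATE = InterfaceConfig("configural flight director", "force-reflecting stick")

def select_interface(route_error: float, turbulence: float,
                     error_limit: float = 50.0, turb_limit: float = 0.7) -> InterfaceConfig:
    """Pick a configuration from simple, illustrative task-state thresholds."""
    if route_error > error_limit or turbulence > turb_limit:
        return CANDIDATE   # high task demands: present the alternative displays/controls
    return STANDARD        # nominal conditions: keep the familiar interface

# Example: as simulated turbulence rises, the interface adapts on its own.
for turb in (0.2, 0.5, 0.9):
    cfg = select_interface(route_error=30.0, turbulence=turb)
    print(f"turbulence {turb:.1f} -> {cfg.display} / {cfg.control}")
```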


Systems, Man, and Cybernetics | 2005

Human-machine haptic interface design using stochastic resonance methods

Daniel W. Repperger; Chandler A. Phillips; James E. Berlin; Amy T. Neidhard-Doll; Michael W. Haas

A study was conducted on haptic interfaces (interfaces that produce forces on a human via mechanical systems) and how they may interact with an operator to improve tracking performance. What differs from traditional approaches is that a certain amount of random noise (in a haptic sense) was inserted into the human-machine loop. It was observed that the operator's tracking performance benefited from this interaction.
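
As a rough illustration of the stochastic resonance principle named in the title, and not of the authors' haptic implementation, the Python sketch below shows how a moderate amount of added noise lets a weak, sub-threshold signal drive a hard threshold detector, while too little or too much noise degrades detection. All signal parameters are assumptions.

```python
# Minimal stochastic resonance illustration (not the paper's method):
# a weak periodic signal never crosses the detection threshold on its own;
# moderate noise makes crossings correlate with the signal, heavy noise washes it out.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 5000)
signal = 0.4 * np.sin(2.0 * np.pi * 1.0 * t)   # weak signal, below threshold
threshold = 1.0                                 # hard detection threshold

def detection_correlation(noise_std: float) -> float:
    """Correlate threshold crossings with the underlying signal."""
    noisy = signal + rng.normal(0.0, noise_std, size=t.shape)
    detected = (noisy > threshold).astype(float)  # 1 when the threshold is crossed
    if detected.std() == 0.0:                     # no crossings at all
        return 0.0
    return float(np.corrcoef(detected, signal)[0, 1])

for noise_std in (0.05, 0.3, 0.6, 1.5, 3.0):
    print(f"noise std {noise_std:4.2f} -> signal/detection correlation "
          f"{detection_correlation(noise_std):+.3f}")
```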


The International Journal of Aviation Psychology | 2000

Multisensory Interface Design for Complex Task Domains: Replacing Information Overload With Meaning in Tactical Crew Stations

Bart J. Brickman; Lawrence J. Hettinger; Michael W. Haas

Modern aerial combat represents a highly complex task domain that imposes many significant challenges on aviators. In modern cockpits there are more sources of dynamic information than any single pilot has the ability to attend to, let alone comprehend, all at once. As technological developments lead to the deployment of new weapons systems, the future aerial battlefield will inevitably yield increased levels of complexity. A significant challenge for cockpit designers is to devise pilot-vehicle interfaces that take full advantage of the parallel information extraction capabilities of humans through the use of integrated multisensory displays. This article describes an approach to multisensory display development that uses human performance evaluations as a design tool, and concludes with a description of a multisensory crew station designed by an interdisciplinary team.


The International Journal of Aviation Psychology | 2001

Applying Adaptive Control and Display Characteristics to Future Air Force Crew Stations

Michael W. Haas; W. Todd Nelson; Daniel W. Repperger; Robert S. Bolia; Greg Zacharias

The Human Effectiveness Directorate of the Air Force Research Laboratory is developing and evaluating human-machine interface concepts to enhance overall weapon system performance by embedding knowledge of the operator's state inside the interface, enabling the interface to make informed, automated decisions regarding many of its information management and display characteristics. Some of these characteristics include information modality, spatial arrangement, and temporal organization. By increasing the ability of the interface to respond, or adapt, to the changing requirements of the human operator in real time (in essence, closing the loop), the interface provides intuitive information management to the operator and delivers real-time human engineering.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1995

An Initial Study of the Effects of 3-Dimensional Auditory Cueing on Visual Target Detection

Richard L. McKinley; William R. D'Angelo; Michael W. Haas; David R. Perrot; W. Todd Nelson; Lawrence J. Hettinger; Bart J. Brickman

Developments in virtual environment technology are enabling the rapid generation of systems that provide synthetic visual and auditory displays. The successful use of this technology in education, training, entertainment, and various other applications relies to a great extent on the effective combination of visual and auditory information. Little is known about the basic interactions between the auditory system and the visual system in real environments or virtual environments. Therefore, the purpose of the current study was to begin to assess the effectiveness of various combinations of visual-auditory information in supporting the performance of a common task (detecting targets) in a virtual environment.


Perceptual and Motor Skills | 1997

Design of a Haptic Stick Interface as a Pilot's Assistant in a High Turbulence Task Environment

Daniel W. Repperger; Michael W. Haas; B. J. Brickman; L. J. Hettinger; L. Lu; M. M. Roe

A study involving 8 Air Force pilots was conducted to examine the efficacy of a force-reflecting joystick for improving performance during a simulated landing task in wind turbulence. By adding certain force characteristics to a joystick, it was of interest whether performance, control effort, and workload measures would change depending on the joystick used. The main results show that certain performance measures improved significantly when force reflection was enabled. The implications of this study are that in certain types of precision tracking tasks subject to external disturbances, the addition of force characteristics to the joystick can significantly improve performance, reduce control effort, and lower subjective workload.


Perceptual and Motor Skills | 2003

Effects of Haptic Feedback and Turbulence on Landing Performance Using an Immersive Cave Automatic Virtual Environment (CAVE)

Daniel W. Repperger; R. H. Gilkey; R. Green; T. Lafleur; Michael W. Haas

An investigation was conducted in which subjects landed a simulated F-16 aircraft in a CAVE (Cave Automatic Virtual Environment) facility, a three-dimensional virtual setting consisting of multiple mirrors and 3-D video-projected displays, using a haptic joystick under highly stressful conditions. Six subjects learned a task that required landing in wind turbulence with a reduced visual scene. Analyses indicated that during landing, performance error variables in the same direction as the haptic forces were significantly reduced, especially when the visual scene was occluded and greater reliance on proprioceptive information was beneficial.

Collaboration


Michael W. Haas's top co-authors and their affiliations:

Daniel W. Repperger (Air Force Research Laboratory)
W. Todd Nelson (Air Force Research Laboratory)
Merry M. Roe (University of Cincinnati)
Michael R. Grimaila (Air Force Institute of Technology)
Robert F. Mills (Air Force Institute of Technology)
Joel S. Warm (Air Force Research Laboratory)
Eric E. Geiselman (Wright-Patterson Air Force Base)