Publications


Featured research published by Earl L. Wiener.


Ergonomics | 1980

Flight-deck automation: promises and problems

Earl L. Wiener; Renwick E. Curry

Modern microprocessor technology and display systems make it entirely feasible to automate many of the flight-deck functions previously performed manually. There are many benefits to be derived from automation; the question today is not whether a function can be automated, but whether it should be, due to various human factors issues. It is highly questionable whether total system safety is always enhanced by allocating functions to automatic devices rather than human operators, and there is some reason to believe that flight-deck automation may have already passed its optimum point. This is an age-old question in the human factors profession, and there are few guidelines available to the system designer. This paper presents the state of the art in human factors in flight-deck automation, identifies a number of critical problem areas, and offers broad design guidelines. Some automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.


Human Factors | 1985

Beyond the sterile cockpit

Earl L. Wiener

The rapid advance of cockpit automation, enabled by microprocessor technology and motivated by the quest for safer and more efficient flight, has both its supporters and its detractors. Even the supporters tend to view the march toward computer-directed flight as a mixed blessing. Certain dramatic accidents and incidents in recent years, as well as the destruction of Korean Air Lines Flight 007, have been interpreted by many as automation-induced. Many critics outside the aviation community, along with journalists and the general public, have harped on the negative side of flight-deck automation without recognizing its positive aspects. The author advances the view that the time-honored recommendation that humans should serve as monitors of automatic devices must be reconsidered, and that the human must be brought back into a more active role in the control loop, aided by decision support systems.


Human Factors | 1977

Controlled Flight into Terrain Accidents: System-Induced Errors

Earl L. Wiener

Controlled flight into terrain accidents are those in which an aircraft, under the control of the crew, is flown into terrain (or water) with no prior awareness on the part of the crew of the impending disaster. This paper examines recent experience with these accidents, seeing them as the result of errors generated by a complex air traffic control system with ample opportunities for system-induced errors. Such problem areas as pilot-controller communication, flight-deck workload, noise-abatement procedures, government regulation, visual illusions, and cockpit- and ground-radar warning devices are discussed, with numerous examples of recent accident cases. The failure of the human factors profession to play a more significant role in the air traffic complex is also considered.


Human Factors | 1987

Application of vigilance research: rare, medium, or well done?

Earl L. Wiener

Despite a history of over 40 years and a vast amount of experimentation, vigilance research has not had a palpable impact on real-world systems. The problem has been blamed in part on the types of studies and the range of variables that have been chosen, but this is only part of the picture. The failure of implementation can also be attributed to insufficient interest in bridging the gap between the laboratory and the work world, and to a lack of data from the work world rather than the laboratory. The situation will probably remain unremedied until more effort is made to understand the nature of complex systems and their dependence on human monitors, and until the myth that automation of functions eliminates or even diminishes the need for human vigilance is abandoned.


Human Factors | 1970

Diver performance in cold water

Paul R. Stang; Earl L. Wiener

Twelve experienced divers repeatedly performed several representative underwater work tasks for 90-min. sessions at water temperatures of 50°, 60°, and 70° F. Time to complete the task was the primary performance measure; choice reaction time, with mental arithmetic as loading task, and four physiological measurements were also recorded. The subjects worked in 6 1/2 ft. of water wearing full 3/16-in.-thick wet suits and SCUBA equipment. Performance on all tasks except mental arithmetic tended to decrease as water temperature decreased. Most performance measures also showed a significant decrement over time and a significant time-by-temperature interaction. The general trend in performance measures was also reflected in several of the physiological measurements.


Ergonomics | 1973

Adaptive Measurement of Vigilance Decrement

Earl L. Wiener

This paper describes a computer-based monitoring task which is adaptive, or self-adjusting, with the size of the signal stimulus (compared to a fixed non-signal stimulus) being mediated by the detection score of the subject, so as to maintain a constant detection rate. Data are presented which indicate that, in order to maintain a fixed detection criterion over a 48-min vigil, the adaptive variable (separation distance of a pair of dots presented simultaneously) behaved in a manner consistent with the usual measures of vigilance decrement. Several adaptive strategies are discussed.
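A minimal sketch of how such a self-adjusting task might work, written as a one-up/one-down staircase (an assumption; the abstract does not specify the paper's adjustment rule). The observer model, step size, and all names and numbers below are illustrative, not values from the study:

```python
import random

def run_adaptive_vigil(n_trials=200, start_sep=6.0, step=0.5, min_sep=0.1):
    """One-up/one-down staircase sketch of an adaptive monitoring task:
    the separation between a pair of dots shrinks after each detection
    and grows after each miss, so it tracks the separation the observer
    detects about 50% of the time. A rising track over the vigil is
    consistent with vigilance decrement. Hypothetical sketch only."""
    sep = start_sep
    track = []
    for t in range(n_trials):
        # Assumed observer model: detection probability grows with the
        # dot separation, and sensitivity slowly degrades over the vigil.
        p_detect = min(1.0, sep / (6.0 + 0.04 * t))
        detected = random.random() < p_detect
        # Mediate the stimulus by the detection outcome: harder after a
        # hit, easier after a miss, holding the detection rate constant.
        sep = max(min_sep, sep - step) if detected else sep + step
        track.append(sep)
    return track

track = run_adaptive_vigil()
# Compare the adaptive variable early vs. late in the vigil; an
# increase mirrors the usual vigilance-decrement measures.
print(f"mean separation, first 50 trials: {sum(track[:50]) / 50:.2f}")
print(f"mean separation, last 50 trials:  {sum(track[-50:]) / 50:.2f}")
```

The appeal of the adaptive design, as the abstract suggests, is that the stimulus itself becomes the dependent measure: instead of watching detection rates fall, one holds the detection rate fixed and watches the required stimulus grow.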


Ergonomics | 1983

Human factors of flight-deck automation: Report on a NASA-industry workshop

D. A. Boehm-Davis; R. E. Curry; Earl L. Wiener; R. L. Harrison

With the advent of microprocessor technology, it has become possible to automate many of the functions on the flight deck of commercial aircraft that were previously performed manually. However, it is not clear whether these functions should be automated, taking into consideration various human factors issues. A NASA-industry workshop was held to identify the human factors issues related to flight-deck automation which would require research for resolution. The scope of automation, the benefits of automation, and automation-induced problems were discussed, and a list of potential research topics was generated by the participants. This report summarizes the workshop discussions and presents the questions developed at that time. While the workshop was specifically directed towards flight-deck automation, the issues raised and the research questions generated are more generally applicable to most complex interactive systems.


Ergonomics | 1968

Training for Vigilance: Repeated Sessions with Knowledge of Results

Earl L. Wiener

Two groups of subjects were run in a visual monitoring test, one with knowledge of results (KR) providing instant feedback of correct responses, commissive errors, and missed signals, and the other with no knowledge of results (NKR). The groups were run for five 48-minute sessions on consecutive days, with a follow-up transfer session five weeks later. Results showed significant differences in detection rates between the groups on all five training sessions, but not on the transfer session. Detection rates increased significantly during the five training sessions for both groups, and at approximately the same rate. Commissive errors were significantly different only in the first two training sessions, with the KR subjects showing more false alarms. Commissive errors did not increase or decrease over time within sessions, or over the five training sessions.


Proc. of the NATO Advanced Research Workshop on Information systems: failure analysis | 1987

Fallible humans and vulnerable systems: lessons learned from aviation

Earl L. Wiener

In 1959, the commercial jet transport era in the United States began with the Boeing 707. Soon after, a rather serious incident occurred over Newfoundland. A 707, flying at 35,000 feet, experienced an autopilot disconnect and began a downward spiral. The captain was in the cabin, and the copilot and flight engineer did not detect the loss of control until the aircraft was descending rapidly; by then they were overcome by g forces and unable to make a recovery. Somehow the captain made his way back to his seat and recovered the plane at about 6,000 feet above the Atlantic (Civil Aeronautics Board, 1959). For all its electronic sophistication, the autopilot was capable of uncoupling, and perhaps more important, the crew was seemingly incapable of detecting and remedying the situation.


Perceptual and Motor Skills | 1969

Money and the Monitor

Earl L. Wiener

Four groups of subjects were run in a visual monitoring task under a 2 × 2 combination of financial incentive (present or absent) and knowledge of results (present or absent). Knowledge of results was provided automatically by means of colored lights, and the financial incentive was 5 cents for each detection, minus 5 cents for each commissive error (false alarm). Knowledge of results led to significantly higher detection rates and earnings (p < .001), but the financial incentive did not affect performance on any measure. Commissive errors were not influenced by either experimental treatment. The problems of financial incentives and software design for monitoring systems are discussed.
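As a worked illustration of that symmetric payoff scheme (the helper function and the example counts below are hypothetical, not data from the study):

```python
def session_earnings(hits: int, false_alarms: int, payoff: float = 0.05) -> float:
    """Symmetric payoff described in the abstract: +5 cents per correct
    detection, -5 cents per commissive error (false alarm).
    Hypothetical helper, for illustration only."""
    return payoff * (hits - false_alarms)

# Example: 42 detections and 7 false alarms earn 35 * $0.05 = $1.75.
print(f"${session_earnings(42, 7):.2f}")
```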

Collaboration


Dive into Earl L. Wiener's collaborations.

Top Co-Authors


Asaf Degani

San Jose State University


Robert L. Helmreich

University of Texas at Austin
