Jens Rasmussen
University of Copenhagen
Publications
Featured research published by Jens Rasmussen.
IEEE Transactions on Systems, Man, and Cybernetics | 1983
Jens Rasmussen
The introduction of information technology based on digital computers for the design of man-machine interface systems has led to a requirement for consistent models of human performance in routine task environments and during unfamiliar task conditions. A discussion is presented of the requirement for different types of models for representing performance at the skill-, rule-, and knowledge-based levels, together with a review of the different levels in terms of signals, signs, and symbols. Particular attention is paid to the different possible ways of representing system properties which underlie knowledge-based performance and which can be characterised at several levels of abstraction: from the representation of physical form, through functional representation, to representation in terms of intention or purpose. Furthermore, the role of qualitative and quantitative models in the design and evaluation of interface systems is mentioned, and the need to consider such distinctions carefully is discussed.
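To make the skill-rule-knowledge distinction and its pairing with signals, signs, and symbols concrete, here is a minimal Python sketch. The enum, lookup table, and heuristic function are illustrative inventions, not an API or formalism from the paper.

```python
# Illustrative sketch of the skill-, rule-, and knowledge-based (SRK) levels
# and the information types associated with each (signals, signs, symbols).
# All names here are hypothetical conveniences, not an API from the paper.
from enum import Enum

class ControlLevel(Enum):
    SKILL = "skill-based"          # sensorimotor patterns in routine tasks
    RULE = "rule-based"            # stored rules for familiar situations
    KNOWLEDGE = "knowledge-based"  # explicit reasoning in unfamiliar situations

INFORMATION_TYPE = {
    ControlLevel.SKILL: "signal",      # continuous sensory data acted on directly
    ControlLevel.RULE: "sign",         # cues that activate stored rules
    ControlLevel.KNOWLEDGE: "symbol",  # representations reasoned about explicitly
}

def expected_level(task_is_routine: bool, rule_available: bool) -> ControlLevel:
    """Crude heuristic: routine tasks run at the skill level, familiar but
    non-routine tasks at the rule level, unfamiliar tasks at the knowledge level."""
    if task_is_routine:
        return ControlLevel.SKILL
    return ControlLevel.RULE if rule_available else ControlLevel.KNOWLEDGE

level = expected_level(task_is_routine=False, rule_available=False)
print(level.value, "performance works on", INFORMATION_TYPE[level] + "s")
```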
Safety Science | 1997
Jens Rasmussen
In spite of all efforts to design safer systems, we still witness severe, large-scale accidents. A basic question is: do we actually have adequate models of accident causation in the present dynamic society? The socio-technical system involved in risk management includes several levels, ranging from legislators through managers and work planners to system operators. This system is presently stressed by a fast pace of technological change, by an increasingly aggressive, competitive environment, and by changing regulatory practices and public pressure. Traditionally, each level of this hierarchy is studied separately by a particular academic discipline, and modelling is done by generalising across systems and their particular hazard sources. It is argued that risk management must be modelled by cross-disciplinary studies, considering risk management to be a control problem and serving to represent the control structure involving all levels of society for each particular hazard category. Furthermore, it is argued that this requires a system-oriented approach based on functional abstraction rather than structural decomposition. Therefore, task analysis focused on action sequences and occasional deviations in terms of human errors should be replaced by a model of the behaviour-shaping mechanisms in terms of work-system constraints, boundaries of acceptable performance, and subjective criteria guiding adaptation to change. It is found that at present a convergence of research paradigms in the human sciences, guided by cognitive science concepts, supports this approach. A review of this convergence within decision theory and management research is presented in comparison with the evolution of paradigms within safety research.
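The paper's picture of work practice migrating toward the boundary of acceptable performance under competitive pressure can be caricatured in a few lines of Python. Every number below (step sizes, boundary, pushback threshold) is invented purely for illustration and is not taken from the paper.

```python
# Toy simulation: an operating point drifts toward the boundary of acceptable
# performance under cost pressure, with safety pushback felt only near the edge.
import random

random.seed(1)

boundary = 1.0          # boundary of acceptable (safe) performance
position = 0.2          # current operating point (0 = very conservative)
cost_gradient = 0.03    # steady pressure toward efficiency, i.e. toward the boundary
safety_pushback = 0.05  # counter-pressure, felt only once the boundary is close

for step in range(50):
    drift = cost_gradient + random.gauss(0, 0.02)  # pressure plus adaptation noise
    if position > 0.8 * boundary:                  # pushback only when risk is visible
        drift -= safety_pushback
    position = min(max(position + drift, 0.0), boundary)
    if position >= boundary:
        print(f"step {step}: operating point reached the boundary")
        break
else:
    print(f"settled near the boundary at {position:.2f}")
```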
IEEE Transactions on Systems, Man, and Cybernetics | 1992
Kim J. Vicente; Jens Rasmussen
A theoretical framework for designing interfaces for complex human-machine systems is proposed. The framework, called ecological interface design (EID), is based on the skills, rules, and knowledge taxonomy of cognitive control. The basic goals of EID are not to force processing to a higher level than the demands of the task require, and to support each of the three levels of cognitive control. Thus, an EID interface should not contribute to the difficulty of the task, and at the same time, it should support the entire range of activities that operators will be faced with. Three prescriptive design principles are suggested to achieve this objective, each directed at supporting a particular level of cognitive control. Particular attention is paid to presenting a coherent deductive argument justifying the principles of EID. Support for the EID framework is discussed. Some issues for future research are outlined.
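A compact way to see the structure of EID is one design principle per level of cognitive control. The lookup table below paraphrases the three principles in Python; the wording is a loose paraphrase for illustration, not the paper's exact formulation.

```python
# The three EID principles, one per level of cognitive control, as a lookup
# table. Paraphrased wording, not the paper's exact text.
EID_PRINCIPLES = {
    "skill-based": "let operators act directly on the display, keeping "
                   "interaction at the level of time-space signals",
    "rule-based": "maintain a consistent mapping between work-domain "
                  "constraints and the perceptual cues of the interface",
    "knowledge-based": "represent the work domain as an abstraction "
                       "hierarchy, an externalized mental model for problem solving",
}

for level, principle in EID_PRINCIPLES.items():
    print(f"{level:>15}: {principle}")
```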
Journal of Occupational Accidents | 1982
Jens Rasmussen
This paper describes the definition and characteristics of human errors. Different types of human behavior are classified, and their relation to different error mechanisms is analyzed. The effect of conditioning factors related to affective and motivational aspects of the work situation, as well as physiological factors, is also taken into consideration. A taxonomy for event analysis, including human malfunction, is presented. Possibilities for the prediction of human error are discussed, and the need for careful studies in actual work situations is expressed. Such studies could provide a better understanding of the complexity of human error situations as well as the data needed to characterize these situations.
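As a sketch of what a multi-faceted event-analysis record of this kind might hold (behaviour type, error mechanism, external failure mode, conditioning factors), here is a hypothetical Python dataclass; every field name and value is invented for illustration, not drawn from the paper's taxonomy.

```python
# Hypothetical record structure for multi-faceted event analysis: what
# happened, the behaviour type and error mechanism involved, and the
# conditioning factors that shaped performance. Field names are invented.
from dataclasses import dataclass, field

@dataclass
class HumanMalfunctionEvent:
    task: str                     # external task being performed
    behavior_type: str            # "skill", "rule", or "knowledge" based
    error_mechanism: str          # e.g. a familiar association overriding a procedure
    external_mode: str            # observable failure mode, e.g. "act omitted"
    conditioning_factors: list[str] = field(default_factory=list)

event = HumanMalfunctionEvent(
    task="valve line-up before start-up",
    behavior_type="rule",
    error_mechanism="familiar shortcut overriding the written procedure",
    external_mode="act omitted",
    conditioning_factors=["time pressure", "fatigue"],  # affective/physiological context
)
print(event.behavior_type, "error via", event.error_mechanism)
```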
American Journal of Psychology | 1989
Abigail Sellen; Donald A. Norman; Jens Rasmussen; Keith Duncan; Jacques Leplat
This book is about the nature of human error and its implications for the design of modern industrial installations. It is the first book to discuss the topic from the points of view of cognitive psychology, social psychology, and safety engineering. Advanced students, researchers, and professional psychologists in industrial psychology/human factors, as well as engineers and systems designers concerned with man-machine systems, will find this book essential reading.
IEEE Transactions on Systems, Man, and Cybernetics | 1985
Jens Rasmussen
The knowledge representation of a decision-maker in control of a complex system can be structured in several levels of abstraction in a functional hierarchy. The role of such an abstraction hierarchy in supervisory systems control is reviewed, and the difference between causal and intentional systems and formal games is discussed in terms of the role of an abstraction hierarchy in the related decision strategies. This relationship is then discussed with reference to the classical psychological problem-solving research of O. Selz (1922) and others. Finally, the implications for the design of decision-support systems are discussed. It is argued that an explicit description of the functional properties of the system to be controlled in terms of an abstraction hierarchy is necessary for a consistent design of databases and display formats for decision-support systems. Also, it is necessary to consider the role of the abstraction hierarchy in reasoning when planning experiments on human decision-making.
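To make the idea of means-ends navigation in an abstraction hierarchy concrete, here is a minimal Python sketch. The five level names follow Rasmussen's usual formulation of the hierarchy; the why/how helpers are invented conveniences, not anything defined in the paper.

```python
# Minimal sketch of an abstraction hierarchy with means-ends navigation:
# moving up answers "why does this exist?", moving down answers "how is
# it achieved?".
LEVELS = [
    "functional purpose",    # intentions; why the system exists
    "abstract function",     # flows of mass, energy, information, value
    "generalized function",  # standard processes such as feedback loops
    "physical function",     # capabilities of specific components
    "physical form",         # appearance and location of components
]

def why(level: str) -> str | None:
    """One level up: the end the current level serves."""
    i = LEVELS.index(level)
    return LEVELS[i - 1] if i > 0 else None

def how(level: str) -> str | None:
    """One level down: the means by which the current level is realised."""
    i = LEVELS.index(level)
    return LEVELS[i + 1] if i < len(LEVELS) - 1 else None

print(why("generalized function"))  # abstract function
print(how("generalized function"))  # physical function
```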
International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 1989
Jens Rasmussen; Kim J. Vicente
Research during recent years has revealed that human errors are not stochastic events which can be removed through improved training programs or optimal interface design. Rather, errors tend to reflect either systematic interference between various models, rules, and schemata, or the effects of the adaptive mechanisms involved in learning. In terms of design implications, these findings suggest that reliable human-system interaction will be achieved by designing interfaces which tend to minimise the potential for control interference and support recovery from errors. In other words, the focus should be on control of the effects of errors rather than on the elimination of errors per se. In this paper, we propose a theoretical framework for interface design that attempts to satisfy these objectives. The goal of our framework, called ecological interface design, is to develop a meaningful representation of the process which is not only optimised for one particular level of cognitive control but supports all three levels simultaneously. The paper discusses the necessary requirements for a mapping between the process and the combined action/observation surface, and analyses the resulting influence both on the interferences causing error and on the opportunity for error recovery left to the operator.

There has been a rapidly growing interest in the analysis of human error caused by technological development. The growing complexity of technical installations makes it increasingly difficult for operators to understand the system's internal functions. At the same time, the large scale of operations necessary for competitive production makes the effects of human errors increasingly unacceptable. Naturally enough, human error analysis has become an essential part of systems design. In order to conduct such an analysis, a taxonomy suited to describing human errors is essential. The structure and dimensions of the error taxonomy, however, will depend on the aim of the analysis. Therefore, different categorisations of human errors are useful during the various stages of systems design. At least two different perspectives can be identified, each with its own set of requirements. One point of view is useful for predicting the effects of human error on system performance, i.e. a failure-mode-and-effect analysis. For this purpose, a taxonomy based on a model of human error mechanisms should be adopted. A second perspective is required for identifying possible improvements in system design. In order to meet the requirements of such an analysis, an error taxonomy based on cognitive control mechanisms (Rasmussen, 1983) is more appropriate. Both types of analysis are essential to system design. The failure-mode-and-effect analysis allows the designer to identify plausible human errors.
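The first perspective, failure-mode-and-effect analysis, amounts to systematically enumerating error modes over a task sequence. A skeletal Python sketch of that enumeration follows; the task steps and failure modes are invented placeholders, not examples from the paper.

```python
# Skeletal failure-mode-and-effect style enumeration over a task sequence,
# the first of the two analysis perspectives distinguished above.
TASK_STEPS = ["open bypass valve", "start pump", "close bypass valve"]
FAILURE_MODES = ["omitted", "wrong object", "too early", "too late"]

for step in TASK_STEPS:
    for mode in FAILURE_MODES:
        # in a real analysis, each pairing would be screened for its
        # system-level effect and for the operator's chances of recovery
        print(f"step '{step}', mode '{mode}': assess effect and recovery")
```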
Safety Science | 2002
Inge Svedung; Jens Rasmussen
Graphic representation of accident scenarios: mapping system structure and the causation of accidents
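The title points at graphic cause maps that span the levels of the socio-technical system. Below is a toy directed-graph sketch in that spirit, assuming a simple cause-to-consequence edge list; all node names are invented and the traversal helper is a hypothetical convenience, not a method from the paper.

```python
# Toy directed cause map: invented nodes, tagged informally by
# socio-technical level, with edges pointing from cause to consequence.
from collections import defaultdict

edges = defaultdict(list)  # cause -> [consequences]

def link(cause: str, consequence: str) -> None:
    edges[cause].append(consequence)

# government level -> company level -> staff level -> outcome
link("relaxed inspection regime", "maintenance budget cut")
link("maintenance budget cut", "postponed valve overhaul")
link("postponed valve overhaul", "valve stuck during process upset")
link("valve stuck during process upset", "release of material")

def downstream(node: str) -> list[str]:
    """Depth-first walk from one contributing factor to its consequences."""
    found, stack = [], [node]
    while stack:
        for nxt in edges[stack.pop()]:
            found.append(nxt)
            stack.append(nxt)
    return found

print(downstream("relaxed inspection regime"))
```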
Ergonomics | 1990
Jens Rasmussen
During recent years, the significance of the concept of human error has changed considerably. The reason for this is partly an increasing interest of psychological research in the analysis of complex real-life phenomena, and partly the changes in modern work conditions brought about by advanced information technology. Consequently, the topic of the present contribution is not a definition of the concept or a proper taxonomy. Instead, a review is given of two professional contexts for which the concept of error is important. Three cases of analysis of human-system interaction are reviewed: (1) traditional task analysis and human reliability estimation; (2) causal analysis of accidents after the fact; and, finally, (3) design of reliable work conditions in modern socio-technical systems. It is concluded that 'errors' cannot be studied as a separate category of behaviour fragments; the object of study should be the cognitive control of behaviour in complex environments.
Quality & Safety in Health Care | 2005
R. Cook; Jens Rasmussen
Rather than being a static property of hospitals and other healthcare facilities, safety is dynamic, often changing on short time scales. In the past, most healthcare delivery systems were loosely coupled, that is, activities and conditions in one part of the system had only limited effect on those elsewhere. Loose coupling allowed the system to buffer many conditions, such as short-term surges in demand. Modern management techniques and information systems have allowed facilities to reduce inefficiencies in operation. One side effect is the loss of the buffers that previously accommodated demand surges. As a result, situations occur in which activities in one area of the hospital become critically dependent on seemingly insignificant events in seemingly distant areas. This tight-coupling condition is called "going solid". Rasmussen's dynamic model of risk and safety can be used to formulate a model of patient safety dynamics that includes "going solid" and its consequences. Because the model addresses the dynamic aspects of safety, it is particularly suited to understanding current conditions in modern healthcare delivery and the way these conditions may lead to accidents.
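A toy numerical illustration of "going solid": with spare buffer capacity a local demand surge stays local, while a fully utilised system propagates the surge to neighbouring units. The capacities and loads in this Python sketch are invented for illustration only, not data from the paper.

```python
# Toy illustration of "going solid": with spare buffer capacity a demand
# surge stays local; with buffers stripped out, it propagates to the next
# unit.
def run_day(buffer_beds: int, surge: int) -> str:
    ed_capacity = 8
    ed_load = 5 + surge                     # baseline load plus the surge
    overflow = max(ed_load - ed_capacity, 0)
    absorbed = min(overflow, buffer_beds)   # loose coupling: buffers soak it up
    propagated = overflow - absorbed        # boarding patients block other units
    if propagated == 0:
        return "surge absorbed locally (loose coupling)"
    return f"{propagated} patients spill into adjacent units (gone solid)"

print(run_day(buffer_beds=4, surge=6))  # buffered system absorbs the surge
print(run_day(buffer_beds=0, surge=6))  # tightly coupled system propagates it
```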