Neelke Doorn
Delft University of Technology
Publications
Featured research published by Neelke Doorn.
Science and Engineering Ethics | 2012
Ibo van de Poel; Jessica Nihlén Fahlquist; Neelke Doorn; Sjoerd D. Zwart; Lambèr M. M. Royakkers
In some situations in which undesirable collective effects occur, it is very hard, if not impossible, to hold any individual reasonably responsible. Such a situation may be referred to as the problem of many hands. In this paper we investigate how the problem of many hands can best be understood and why, and when, exactly it constitutes a problem. After analyzing climate change as an example, we propose to define the problem of many hands as the occurrence of a gap in the distribution of responsibility that may be considered morally problematic. Whether a gap is morally problematic, we suggest, depends on the reasons why responsibility is distributed. This, in turn, depends, at least in part, on the sense of responsibility employed, a main distinction being that between backward-looking and forward-looking responsibility.
Science and Engineering Ethics | 2012
Neelke Doorn
In recent decades, increasing attention has been paid to the topic of responsibility in technology development and engineering. The discussion of this topic is often guided by questions related to liability and blameworthiness. Recent discussions in engineering ethics call for a reconsideration of the traditional quest for responsibility: rather than focusing on alleged wrongdoing and blame, some authors argue, attention should shift to more socially responsible engineering. The present paper explores the different approaches to responsibility in order to see which one is most appropriate to apply to engineering and technology development. Using the example of the development of a new sewage water treatment technology, the paper shows how different approaches to ascribing responsibilities have different implications for engineering practice in general, and for R&D or technological design in particular. A tension was found between the demands that follow from these different approaches, most notably between efficacy and fairness. Although the consequentialist approach, with its efficacy criterion, turned out to be the most powerful, it was also shown that the fairness of responsibility ascriptions should somehow be taken into account. It is proposed to look for alternative, more procedural ways to approach the fairness of responsibility ascriptions.
Science, Technology, & Human Values | 2012
Neelke Doorn
The present article explores the rationales of scientists and engineers for distributing moral responsibilities related to technology development. On the basis of a qualitative case study, it was investigated how the actors within a research network distribute responsibilities for ethical issues. Rawls’ Wide Reflective Equilibrium model was used as a descriptive framework. This study indicates that there is a correlation between the actors’ ethics positions and their responsibility rationales. When discussing how to address ethical issues or how to distribute the responsibility for addressing them, actors with similar normative background theories referred to the same type of normative arguments. It was found that these deliberative processes can best be interpreted in terms of an interplay between different layers of morality. The case suggests that people seek coherence between these layers rather than work through them one-directionally. By distinguishing between the rationales for distributing responsibilities and the actual distributions, possible sources of misunderstanding can be identified. The benefit of acknowledging these different rationales is that it enables actors to recognize the legitimacy of other people’s opinions, ultimately contributing to a responsibility distribution that is both complete and accepted by all as justified.
Bulletin of Science, Technology & Society | 2010
Neelke Doorn; Jessica Nihlén Fahlquist
Traditionally, the management of technology has focused on the stages before or after the development of a technology. In this approach, the technology itself is conceived as the result of a deterministic enterprise: a result that is to be either rejected or embraced. However, recent insights from Science and Technology Studies (STS) have shown that there is ample room to modulate technology during its development. This requires technology managers and engineering ethicists to become more involved in technological research rather than assessing it from an outsider perspective. Instead of focusing on the question of whether to authorize, approve, or adopt a certain technology, or on the question of who is to blame for potential mistakes, the guiding question in this new approach is how research is to be carried out.
Science and Engineering Ethics | 2012
Neelke Doorn; Ibo van de Poel
From its earliest issue back in 1995, responsibility has been a major theme in the journal Science and Engineering Ethics. Although the focus of attention and the conceptions of responsibility vary considerably across the different articles, the prevalence of responsibility as a topic is remarkable. On a superficial reading, we could see it as the logical corollary of the importance of responsibility for the broader field of applied ethics. However, this can only partly explain why responsibility is such a widely discussed topic in Science and Engineering Ethics. If we take, for example, the academic journals for medical ethics, responsibility is not half as frequently mentioned in the title or keywords. Apparently, there is something special about the relation between responsibility and technology and engineering. To see what is special about responsibility in technology and engineering, one should perhaps start with the kind of paradigmatic situation that is usually the focus of attention in philosophical ethics, and even in most of the applied ethics literature. In this paradigmatic situation, there is usually an individual who is confronted with a difficult ethical choice. Although the choice at hand is morally complex, it is usually assumed that what raises ethical concerns are the possible actions of the individual and the direct consequences of these actions. Moreover, it is usually assumed that such consequences are more or less certain. The ethical literature thus often assumes: (1) that it is individuals who act, (2) that the consequences of their actions are directly causally traceable, and (3) that these consequences are certain. None of these assumptions seems to apply to many of the ethical issues raised by modern technology and engineering. First, engineering and technology development typically take place in collective settings, in which many different agents, apart from the engineers involved, eventually shape the technology developed and its social consequences. Second, engineering and technology development are complex processes, characterized by long causal chains between the actions of engineers and scientists and the eventual effects that raise ethical concern. Third, the social consequences of technology are often hard to predict beforehand. Jonas (1984) has suggested that the three characteristics of technological action just mentioned (collectivity, indirect causation, uncertainty) require an ethics of responsibility rather than an ethics based on traditional ethical notions such as consequences or duties. At the same time, it is clear that much of the traditional philosophical literature on responsibility still ignores these characteristics. Although philosophers have paid attention to the fact that consequences can be realized in a multiplicity of ways (e.g. Fischer and Ravizza 1998), their focus is often on rather direct consequences. Moreover, they tend to focus on responsibility for certain outcomes rather than on responsibility for risks. What has been recognized without doubt is the collective nature of action, as witnessed by the vastly growing literature on collective responsibility (e.g. Smiley 2008; May and Hoffman 1991; French 1984). There is yet another reason why responsibility is an important theme in engineering and technology. Technology increasingly influences the context in which humans have to act and so seems to co-shape their responsibility. An example of this mediating role is the use of automatic pilots in planes.
Who is responsible if a plane with an automatic pilot crashes? Does it make sense to attribute responsibility in such cases to technology, or is it possible to understand these complex situations in such a way that all responsibility can in the end be attributed to humans? Examples like the automatic pilot also raise interesting questions about the responsibility of designers. It may be argued that they design the technological environment in which others have to act and to exercise certain responsibilities. Can, and should, designers design technological systems so that they enhance rather than limit the responsibility of the users of these systems, and what does such an obligation exactly entail? The articles collected in this issue were presented at the conference “Moral Responsibility: Neuroscience, Organization & Engineering,” held in Delft, August 25–27, 2009. The central aim of this conference was to improve our understanding of responsibility in our current society. By drawing on philosophical and scientific expertise and insight, the discussion moved beyond the traditional discussion of free will and determinism, which has dominated the philosophical responsibility debate. One of the conclusions of the conference was that there is no single, agreed-upon definition of responsibility; rather, many different meanings are attached to the notion, some merely descriptive, others with an explicitly normative content. By reflecting on these diverse meanings, this special issue of Science and Engineering Ethics provides a topical overview of how the theme of responsibility in technology and engineering is currently approached by different scholars in the field. Given the different meanings attached to the notion of responsibility, it is important to start with some conceptual clarifications. In fact, this is exactly what is done in the contributions by Davis (forthcoming), Kermisch (forthcoming), Van de Poel et al. (forthcoming), and Doorn (forthcoming). Davis provides the most fine-grained distinctions. Without revealing the full taxonomy here, we could say that the different senses of responsibility can be classified along two main dimensions. The first dimension is temporal, referring to the distinction between forward-looking responsibilities (that is, senses of responsibility that refer to things that have not yet occurred) and backward-looking responsibilities (that is, senses of responsibility that refer to things that have happened in the past). It should be noted that some senses of responsibility contain both backward-looking and forward-looking elements. For example, responsibility-as-liability is typically attributed on the basis of past (causal) contributions that agents have made, but it also implies a duty to rectify, or at least compensate for, damage or loss. A further distinction can be made between (more or less) descriptive notions of responsibility and normative notions of responsibility that imply an evaluation (e.g. “he is an irresponsible person”) or a prescription (e.g. “you should pay because you are responsible for what happened”). The prime focus of most contributions in this special issue is on such normative notions of responsibility. The contributions to this special issue touch upon a number of themes and issues in relation to responsibility in technology and engineering. For the purpose of this introduction, we have consolidated them into three categories.
The first category is what is also referred to as the problem of many hands, that is, the problem of attributing responsibility in complex collective settings. This problem is, as we will see, due not only to the collective nature of action in technology and engineering but also to the other two characteristics of technological action discussed above, that is, long causal chains and uncertainty. The second theme we distinguish is responsibility for risks. Although a recent special issue of this journal (Volume 16, No. 3) was devoted to this theme, it is still an issue that has not been widely discussed (cf. Van de Poel and Nihlen-Fahlquist forthcoming). The third theme is the way technologies shape the contexts of human action, and the consequences of this “mediating role,” especially for the responsibility of the designers of new technologies.
Science and Engineering Ethics | 2013
Neelke Doorn; J. Otto Kroesen
In this paper, we discuss the use of role plays in ethics education for engineering students. After presenting a rough taxonomy of different objectives, we illustrate how role plays can be used to broaden students’ perspectives. We do this on the basis of our experiences with a newly developed role play about a Dutch political controversy concerning pig transport. The role play is special in that the discussion is about setting up an institutional framework for responsible action that goes beyond individual action. In that sense, the role play serves a double purpose: it not only aims to make students aware of the different dimensions of decision making, it also encourages them to think about what such an institutional framework for responsible action might look like.
Science and Engineering Ethics | 2016
Neelke Doorn
The management of water is a topic of great concern. Inadequate management may lead to water scarcity and ecological destruction, but also to an increase in catastrophic floods. With climate change, both water scarcity and the risk of flooding are likely to increase further in the coming decades. This makes water management a highly dynamic field, in which experiments are being made with new forms of policy making. In the current paper, a case study is presented in which different interest groups were invited to help develop new water policy. The case was innovative in that stakeholders were asked to identify and frame the most urgent water issues, rather than to reflect on possible solutions developed by the water authority itself. The case suggests that stakeholders can participate more effectively if their contribution is focused on underlying competing values rather than conflicting interests.
Environmental Toxicology and Chemistry | 2017
Henriette Selck; Peter B. Adamsen; Thomas Backhaus; Gary Thomas Banta; Peter K.H. Bruce; G. Allen Burton; Michael Butts; Eva Boegh; John J. Clague; Khuong Van Dinh; Neelke Doorn; Jonas S. Gunnarsson; Henrik Hauggaard-Nielsen; Charles Hazlerigg; Agnieszka Hunka; John Jensen; Yan Lin; Susana Loureiro; Simona Miraglia; Wayne R. Munns; Farrokh Nadim; Annemette Palmqvist; Robert A. Rämö; Lauren Paige Seaby; Kristian Syberg; Stine Rosendal Tangaa; Amalie Thit; Ronja Windfeld; Maciej Zalewski; Peter M. Chapman
Roskilde University (Denmark) hosted a November 2015 workshop, Environmental Risk: Assessing and Managing Multiple Risks in a Changing World. This Focus article presents the consensus recommendations of 30 attendees from 9 countries regarding implementation of a common currency (ecosystem services) for holistic environmental risk assessment and management; improvements to risk assessment and management in a complex, human-modified, and changing world; appropriate development of protection goals in a 2-stage process; dealing with societal issues; risk-management information needs; conducting risk assessment of risk management; and development of adaptive and flexible regulatory systems. The authors encourage both cross-disciplinary and interdisciplinary approaches to address their 10 recommendations: 1) adopt ecosystem services as a common currency for risk assessment and management; 2) consider cumulative stressors (chemical and nonchemical) and determine which dominate to best manage and restore ecosystem services; 3) fully integrate risk managers and communities of interest into the risk-assessment process; 4) fully integrate risk assessors and communities of interest into the risk-management process; 5) consider socioeconomics and increased transparency in both risk assessment and risk management; 6) recognize the ethical rights of humans and ecosystems to an adequate level of protection; 7) determine relevant reference conditions and the proper ecological context for assessments in human-modified systems; 8) assess risks and benefits to humans and the ecosystem and consider unintended consequences of management actions; 9) avoid excessive conservatism or possible underprotection resulting from sole reliance on binary, numerical benchmarks; and 10) develop adaptive risk-management and regulatory goals based on ranges of uncertainty. Environ Toxicol Chem 2017;36:7-16.
Risk Analysis | 2015
Neelke Doorn
Many risk scholars recognize the importance of including ethical considerations in risk management. Risk ethics can provide in-depth ethical analysis so that ethical considerations become part of risk-related decisions rather than an afterthought to those decisions. In this article, I present a brief sketch of the field of risk ethics. I argue that risk ethics has a bias toward technological hazards, thereby overlooking the risks that stem from natural and semi-natural hazards. In order to contribute to the field of risk research, risk ethics should broaden its scope to include natural and semi-natural hazards and develop normative distribution criteria that can support decision making on such hazards.
Science and Engineering Ethics | 2010
Neelke Doorn
Due to their non-hierarchical structure, socio-technical networks are prone to the problem of many hands. In the present paper, an approach is introduced in which people’s opinions on responsibility are empirically traced. The approach is based on the Rawlsian concept of Wide Reflective Equilibrium (WRE), in which people’s considered judgments on a case are reflectively weighed against moral principles and background theories, ideally leading to a state of equilibrium. Application of the method to a hypothetical case with an artificially constructed network showed that it is possible to uncover the data needed to assess a consensus among people in terms of their individual WRE. It appeared that the moral background theories people endorse are not predictive of how they actually distribute responsibilities, but that they do indicate ways of reasoning and of justifying outcomes. Two ways of ascribing responsibilities were discerned, corresponding to two requirements of a desirable responsibility distribution: fairness and completeness. Applying the method triggered learning effects, both with regard to conceptual clarification and moral considerations, and in the sense that it led to some convergence of opinions. It is recommended that the method be applied to a real engineering case in order to see whether this approach leads to an overlapping consensus on a responsibility distribution that is justifiable to all and in which no responsibilities are left unfulfilled, thereby contributing to a solution of the problem of many hands.