Publication


Featured research published by Edward W. Rogers.


Journal of Management | 2016

Organizational Correctives for Improving Recognition of Near-Miss Events

Robin L. Dillon; Catherine H. Tinsley; Peter Madsen; Edward W. Rogers

Despite decades of research on organizational disasters, such events remain too common. Scholars across a wide range of disciplines agree that one of the most viable approaches to preventing such catastrophes is to observe near-misses and use them to identify and eliminate problems before they produce large failures. Unfortunately, these important warning signals are too often ignored because they are perceived as successes rather than near-misses (or near-failures). In this article, we explore the effect of a climate for safety on improving near-miss recognition by observers, hypothesizing that safety climate increases the level of attention that observers pay to the underlying processes that generate an apparently successful outcome. Using a database of anomaly reports for unmanned NASA missions, we show that organizational safety climate and project stakes increase reporting rates of near-misses, both independently and interactively. In follow-up laboratory experiments, we confirm that these effects independently improve the likelihood that people differentiate near-miss outcomes from successes. Results suggest that organizations can increase the recognition of near-misses with organizational messages that emphasize a positive safety climate.


IEEE Aerospace Conference | 2006

The near-miss bias in decision making

Edward W. Rogers; Catherine H. Tinsley

The Columbia Accident Investigation Board (CAIB) report states that NASA needs to develop an organizational culture that reflects the best characteristics of a learning organization and that NASA historically has not demonstrated such characteristics. When there is a technical failure, most organizations are good at identifying the technical cause and learning not to repeat that same mistake. However, it is more difficult to learn from near-misses and lucky successes (i.e., situations where a technical failure does not occur but nearly did) (MacCormack, 2004). This research shows that managers whose decisions ended in a failure were perceived as significantly less competent, as having made poorer decisions, and as less deserving of promotions than managers who made the same decisions but whose project outcomes were either a success or a near-miss. Moreover, there were no significant differences in the perceptions of a manager's competence or promotability when that manager's decisions resulted in a near-miss or a complete success. Therefore, even when a problem occurs that is linked to prior managerial decisions, if the project outcome is successful, that manager may not be held as accountable for any faulty decision making compared to managers of projects that fail. These results indicate a potential mechanism to explain why organizations fail to learn from their successes.


IEEE Aerospace Conference | 2005

Linking acquisition strategy to contract performance over the product life cycle

Edward W. Rogers; David Berkowitz; Tushar Singh; Charlotte Linde

This paper draws attention to challenges facing NASA in the new environment of performance-based contracting. Unless NASA fully understands the marketplace, industrial policy, and competitive game structure of this new environment, it may be open to accepting unintended consequences of contract choices. Creating bilateral relationships between NASA and its contractors is essential to safeguarding scarce resources and ensuring their efficient and effective use on behalf of the American taxpayer. The paper compares acquisition and sustainment models from the NASA Space Shuttle program, the DoD, and the commercial product world.


IEEE Aerospace Conference | 2016

A different kind of organizational silence: When individuals fail to recognize a problem exists

Robin L. Dillon; Edward W. Rogers; David J. Oberhettinger; Catherine H. Tinsley

After major disasters, significant contributing factors are commonly identified, but too often only with hindsight. For individuals to report potential problems before the disaster, three steps need to occur: (1) he or she needs to recognize an event as a risk, problem, or possible wrongdoing; (2) he or she needs to choose to speak up, neglect the problem, or leave the organization, based on an assessment of the benefits and costs of each alternative; and (3) he or she must take action if speaking up is the chosen response. The organizational silence literature focuses mostly on step 2, where the culture of the organization causes individuals to choose not to speak up even when a problem is recognized. In this paper, we focus on step 1, where characteristics of the organization or of the particular problem cause individuals to fail to recognize that a problem exists. We first examine the 1998 incident at Wallops Flight Facility, where an aircraft crashed during an engine water ingestion test. We then describe a series of behavioral lab experiments conducted to demonstrate how different conditions in the situation can influence participants' ability to recognize increasing risk in a task.


IEEE Aerospace Conference | 2014

Using organizational messages to improve the recognition of near-miss events on projects

Robin L. Dillon; Catherine H. Tinsley; Edward W. Rogers

Although organizations may extract valuable lessons from visible failures, they too often overlook near-miss events (those that occur before a catastrophe) and the early learning opportunities these events can provide. Near-misses are situations where a failure could have occurred except for the intervention of good fortune, and they are often harbingers of future failure. Prior research has demonstrated a natural propensity for individuals and organizations to ignore these warning signals because they perceive the near-misses as successes. We show that people can be made more cognizant of near-misses by using organizational messages that help them recognize the difference between a near-miss and a success. In three studies, subtle primes that promoted a sense of accountability, project significance, and risk aversion led both MBA students and NASA managers and contractors to examine near-miss events more critically.


IEEE Aerospace Conference | 2013

Improving the recognition of near-miss events on NASA missions

Robin L. Dillon; Peter Madsen; Edward W. Rogers; Catherine H. Tinsley

Organizations that ignore near-miss data may be inappropriately rewarding risky behavior. If managers engage in risky behavior and succeed, research shows that these managers are likely to be promoted without close scrutiny of their risky decisions, even if the success is due to good fortune. Over time, such risk taking compounds as similar near-misses are repeatedly observed and the ability to recognize anomalies and document the events decreases (i.e., normalization of deviance). History from the shuttle program shows that only the occasional large failure increases attention to anomalies again. This research demonstrates the presence of normalization of deviance in NASA missions and also examines a factor (the significance of the project) that may increase people's awareness of near-misses to counter this trend. Increasing awareness of chance success should increase the likelihood that significant learning can occur from the mission regardless of outcome. We conclude with prescriptions for project managers based on several ongoing activities at NASA Goddard Space Flight Center (GSFC) to improve organizational learning. We discuss how these efforts can contribute to reducing near-miss bias and the normalization of deviance. This research should help organizations design learning processes that draw lessons from near-misses.


IEEE Aerospace Conference | 2005

Pausing for learning: applying the After Action Review process at the NASA Goddard Space Flight Center

Edward W. Rogers; J. Milam


IEEE Aerospace Conference | 2007

Avoiding Common Pitfalls in Lessons Learned Processes that Support Decisions with Significant Risks

Edward W. Rogers; Robin L. Dillon; Catherine H. Tinsley


IEEE Aerospace Conference | 2018

Improving the use of risk matrices at NASA

Robin L. Dillon; Gerald A. Klein; Edward W. Rogers; Christopher J. Scolese


System Health Management: With Aerospace Applications | 2011

Chapter 4: Knowledge Management

Edward W. Rogers

Collaboration


Dive into Edward W. Rogers's collaborations with his top co-authors.

Peter Madsen

Brigham Young University

David J. Oberhettinger

California Institute of Technology

Gerald A. Klein

Goddard Space Flight Center
