Publication


Featured research published by Cameron A. MacKenzie.


Risk Analysis | 2014

Summarizing Risk Using Risk Measures and Risk Indices

Cameron A. MacKenzie

Our society is fascinated with risk in many different areas and disciplines. One of the main ways to describe and communicate the level of risk is through risk indices, which summarize risk using numbers or categories such as words, letters, or colors. These indices are used to communicate risks to the public, understand how risk is changing over time, compare among different risks, and support decision making. Given the different methods to construct risk indices, including flawed methods such as risk matrices, this article develops specific steps that analysts can follow to create a risk index. This article emphasizes the importance of describing risk with a probability distribution, developing a numerical risk measure that summarizes the probability distribution, and finally translating the risk measure to an index. Measuring the risk is the most difficult part and requires the analyst to summarize a probability distribution into one or possibly a few numbers. The risk measure can then be transformed to a numerical or categorical index. I apply the method outlined in this article to construct a risk index that compares the risk of fatalities in aviation and highway transportation.
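
As an illustrative sketch of the workflow the abstract describes (probability distribution, then a numerical risk measure, then a categorical index), the snippet below summarizes a hypothetical loss distribution with an expected value and a 95th-percentile measure and maps each onto a labeled index. The distribution and thresholds are invented for this example and are not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical probability distribution of annual losses (illustrative only)
losses = rng.lognormal(mean=3.0, sigma=0.8, size=100_000)

# Step 1: summarize the distribution with numerical risk measures
expected_loss = losses.mean()
p95_loss = np.percentile(losses, 95)   # tail-oriented measure

# Step 2: translate the risk measure to a categorical index
# (threshold values are assumptions chosen for this example)
def risk_index(measure, thresholds=(10, 30, 60)):
    labels = ("low", "moderate", "high", "severe")
    for label, cutoff in zip(labels, thresholds):
        if measure <= cutoff:
            return label
    return labels[-1]

print(f"expected loss {expected_loss:.1f} -> index '{risk_index(expected_loss)}'")
print(f"95th percentile {p95_loss:.1f} -> index '{risk_index(p95_loss)}'")
```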


IIE Transactions | 2014

Modeling a severe supply chain disruption and post-disaster decision making with application to the Japanese earthquake and tsunami

Cameron A. MacKenzie; Kash Barker; Joost R. Santos

Modern supply chains are increasingly vulnerable to disruptions, and a disruption in one part of the world can cause supply difficulties for companies around the globe. This article develops a model of severe supply chain disruptions in which several suppliers suffer from disabled production facilities and firms that purchase goods from those suppliers may consequently suffer a supply shortage. Suppliers and firms can choose disruption management strategies to maintain operations. A supplier with a disabled facility may choose to move production to an alternate facility, and a firm encountering a supply shortage may be able to use inventory or buy supplies from an alternate supplier. The supplier’s and firm’s optimal decisions are expressed in terms of model parameters such as the cost of each strategy, the chances of losing business, and the probability of facilities reopening. The model is applied to a simulation based on the 2011 Japanese earthquake and tsunami, which closed several facilities of key suppliers in the automobile industry and caused supply difficulties for both Japanese and U.S. automakers.
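
A minimal sketch of the kind of expected-cost comparison the model formalizes: a firm facing a supply shortage weighs using inventory against buying from an alternate supplier, given assumed costs, the chance of losing business, and the probability that the disrupted facility reopens. The numbers and the simple one-shot structure below are assumptions for illustration, not the article's actual formulation, which covers multiple suppliers and supplier-side decisions as well.

```python
# Illustrative comparison of disruption-management strategies for a single firm.
# Parameters are invented for this example.

p_reopen = 0.6          # probability the disrupted supplier facility reopens soon
inventory_cost = 50     # cost of drawing down inventory to keep producing
alt_supplier_cost = 80  # premium paid to an alternate supplier
lost_business = 200     # expected cost of losing customers if production stops

def expected_cost(strategy):
    if strategy == "do_nothing":
        return (1 - p_reopen) * lost_business   # lose business only if facility stays closed
    if strategy == "use_inventory":
        return inventory_cost
    if strategy == "alternate_supplier":
        return alt_supplier_cost
    raise ValueError(strategy)

best = min(("do_nothing", "use_inventory", "alternate_supplier"), key=expected_cost)
print(best, expected_cost(best))
```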


Journal of Infrastructure Systems | 2013

Empirical data and regression analysis for estimation of infrastructure resilience, with application to electric power outages

Cameron A. MacKenzie; Kash Barker

Recent natural disasters have highlighted the need for increased planning for disruptive events. Forecasting damage and time that a system will be inoperable is important for disruption planning. The resilience of critical infrastructure systems, or their ability to recover quickly from a disruption, can mitigate adverse consequences of the disruption. This paper quantifies the resilience of a critical infrastructure sector through the dynamic inoperability input-output model (DIIM). The DIIM, which describes how inoperability propagates through a set of interdependent industry and infrastructure sectors following a disruptive event, includes a resilience parameter that has not yet been adequately assessed. This paper provides a data-driven approach to derive the resilience parameter through regression models. Data may contain different disruption scenarios, and regression models can incorporate these scenarios through the use of categorical or dummy variables. A mixed-effects model offers an alternate approach of accounting for these scenarios, and these models estimate parameters based on the combination of all scenarios (fixed effects) and an individual scenario (random effects). These regression models are illustrated with electric power outage data and a regional disruption that uses the DIIM to model production losses in Oklahoma following an electric power outage.
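
The regression idea can be sketched with ordinary least squares and dummy variables that let the recovery rate differ by disruption scenario. The synthetic outage data and the exponential-recovery form below are assumptions chosen for illustration; they are not the paper's data or the DIIM equations themselves.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: inoperability (fraction of customers without power) over time,
# for two disruption scenarios
t = np.tile(np.arange(1, 11), 2)                 # days since the outage began
scenario = np.repeat([0, 1], 10)                 # dummy variable: 0 = ice storm, 1 = hurricane
k_true = np.where(scenario == 0, 0.35, 0.20)     # faster recovery after the ice storm
inoperability = np.exp(-k_true * t) * np.exp(rng.normal(0, 0.05, size=20))

# Linear model on the log scale: ln q = b0 + b1*t + b2*(t*scenario),
# so the recovery-rate parameter shifts with the scenario dummy variable.
X = np.column_stack([np.ones_like(t), t, t * scenario])
coef, *_ = np.linalg.lstsq(X, np.log(inoperability), rcond=None)
print("estimated recovery rate, scenario 0:", -coef[1])
print("estimated recovery rate, scenario 1:", -(coef[1] + coef[2]))
```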


Annals of Operations Research | 2016

Static and dynamic resource allocation models for recovery of interdependent systems: application to the Deepwater Horizon oil spill

Cameron A. MacKenzie; Hiba Baroud; Kash Barker

Determining where and when to invest resources during and after a disruption can challenge policy makers and homeland security officials. Two decision models, one static and one dynamic, are proposed to determine the optimal resource allocation to facilitate the recovery of impacted industries after a disruption where the objective is to minimize the production losses due to the disruption. The paper presents necessary conditions for optimality for the static model and develops an algorithm that finds every possible solution that satisfies those necessary conditions. A deterministic branch-and-bound algorithm solves the dynamic model and relies on a convex relaxation of the dynamic optimization problem. Both models are applied to the Deepwater Horizon oil spill, which adversely impacted several industries in the Gulf region, such as fishing, tourism, real estate, and oil and gas. Results demonstrate the importance of allocating enough resources to stop the oil spill and clean up the oil, which reduces the economic loss across all industries. These models can be applied to different homeland security and disaster response situations to help governments and organizations decide among different resource allocation strategies during and after a disruption.
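
To make the allocation idea concrete, here is a small sketch in which each industry's production loss decreases with the resources it receives and a fixed budget is spread greedily by marginal benefit. The loss functions, parameters, and greedy procedure are illustrative assumptions and stand in for the paper's static and dynamic optimization models.

```python
import numpy as np

# Hypothetical industries impacted by a disruption (numbers are illustrative only)
base_loss = np.array([500.0, 300.0, 200.0])   # production losses with no assistance
effect = np.array([0.04, 0.03, 0.02])         # effectiveness of each unit of aid
budget, step = 100.0, 0.5
alloc = np.zeros(3)

def losses(z):
    return base_loss * np.exp(-effect * z)

# Greedy marginal allocation: each increment of the budget goes to the industry
# where it reduces losses the most (optimal here because losses are convex and separable).
for _ in range(int(budget / step)):
    marginal = losses(alloc) - losses(alloc + np.eye(3) * step).diagonal()
    alloc[np.argmax(marginal)] += step

print("allocation:", alloc, "remaining loss:", losses(alloc).sum())
```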


ASME 2011 30th International Conference on Ocean, Offshore and Arctic Engineering | 2011

Extremes of Nonlinear Vibration: Models Based on Moments, L-Moments, and Maximum Entropy

Steven R. Winterstein; Cameron A. MacKenzie

Nonlinear effects beset virtually all aspects of offshore structural loading and response. These nonlinearities cause non-Gaussian statistical effects, which are often most consequential in the extreme events—e.g., 100- to 10,000-year conditions—that govern structural reliability. Thus there is engineering interest in forming accurate non-Gaussian models of time-varying loads and responses, and calibrating them from the limited data at hand. We compare here a variety of non-Gaussian models. We first survey moment-based models; in particular, the 4-moment “Hermite” model, a cubic transformation often used in wind and wave applications. We then derive an “L-Hermite” model, an alternative cubic transformation calibrated by the response “L-moments” rather than its ordinary statistical moments. These L-moments have recently found increasing use, in part because they show less sensitivity to distribution tails than ordinary moments. We find here, however, that these L-moments may not convey sufficient information to accurately estimate extreme response statistics. Finally, we show that 4-moment maximum entropy models, also applied in the literature, may be inappropriate to model broader-than-Gaussian cases (e.g., responses to wind and wave loads).
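
For readers unfamiliar with the moment-based Hermite model, the sketch below applies the cubic transformation using a commonly cited first-order approximation of its coefficients from skewness and kurtosis (softening case, kurtosis above 3). These coefficient formulas are a textbook approximation and may differ from the exact calibration used in the paper; the example numbers are invented.

```python
import numpy as np

def hermite_transform(u, skew, kurt):
    """Map a standard Gaussian value u to a non-Gaussian response with target
    skewness and kurtosis via the cubic Hermite model (softening case).
    Coefficients use a common first-order approximation."""
    h4 = (np.sqrt(1.0 + 1.5 * (kurt - 3.0)) - 1.0) / 18.0
    h3 = skew / (4.0 + 2.0 * np.sqrt(1.0 + 1.5 * (kurt - 3.0)))
    kappa = 1.0 / np.sqrt(1.0 + 2.0 * h3**2 + 6.0 * h4**2)
    return kappa * (u + h3 * (u**2 - 1.0) + h4 * (u**3 - 3.0 * u))

# Example: transform a Gaussian extreme fractile into a non-Gaussian extreme
u_extreme = 3.7                     # illustrative Gaussian fractile
x_extreme = hermite_transform(u_extreme, skew=0.3, kurt=4.0)
print(x_extreme)
```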


Fuzzy Sets and Systems | 2014

A New Fuzzy Logic Approach to Capacitated Dynamic Dial-a-Ride Problem

Maher Maalouf; Cameron A. MacKenzie; Sridhar Radhakrishnan; Mary C. Court

Almost all Dial-a-Ride problems (DARP) described in the literature pertain to the design of optimal routes and schedules for n customers who specify pick-up and drop-off times. In this article we assume that the customer is mainly concerned with the drop-off time, since that is what matters most to the customer. Based on the drop-off time specified by the customer and the customer's location, a pick-up time is calculated and given to the customer by the dispatching office. We base our formulation on a dynamic fuzzy logic approach in which a new request is assigned to a vehicle. The fuzzy logic algorithm chooses the vehicle to transport the customer by seeking to satisfy two objectives. The first reflects the customer's preference and minimizes the time a customer spends in the vehicle, and the second reflects the company's preference and minimizes the distance a vehicle needs to travel to transport the customer. The proposed heuristic algorithm is relatively simple and computationally efficient in comparison with most deterministic algorithms for solving both small and large-sized problems.
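
A highly simplified sketch of the dispatching idea: each candidate vehicle is scored on the two objectives (customer ride time and extra travel distance) through membership-style functions, and the new request goes to the best-scoring vehicle. The membership shapes, weights, and numbers are invented; the paper's actual fuzzy rule base is more elaborate than this weighted-sum stand-in.

```python
def satisfaction(value, best, worst):
    """Membership-style score: 1 at `best`, falling linearly to 0 at `worst`."""
    return max(0.0, min(1.0, (worst - value) / (worst - best)))

# Candidate vehicles for a new request: (ride time in minutes, extra distance in km).
# Numbers are illustrative only.
vehicles = {"bus_1": (18, 4.0), "bus_2": (25, 1.5), "bus_3": (35, 0.5)}

def score(ride_time, extra_km, w_customer=0.6, w_company=0.4):
    customer = satisfaction(ride_time, best=10, worst=45)   # customer objective
    company = satisfaction(extra_km, best=0, worst=8)        # company objective
    return w_customer * customer + w_company * company

best = max(vehicles, key=lambda v: score(*vehicles[v]))
print("assign request to", best)
```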


Risk Analysis | 2016

Allocating Resources to Enhance Resilience, with Application to Superstorm Sandy and an Electric Utility

Cameron A. MacKenzie; Christopher W. Zobel

This article constructs a framework to help a decision-maker allocate resources to increase his or her organization's resilience to a system disruption, where resilience is measured as a function of the average loss per unit time and the time needed to recover full functionality. Enhancing resilience prior to a disruption involves allocating resources from a fixed budget to reduce the value of one or both of these characteristics. We first characterize the optimal resource allocations associated with several standard allocation functions. Because the resources are allocated before the disruption, however, the initial loss and recovery time may not be known with certainty. We thus also apply the optimal resource allocation model for resilience to three models of uncertain disruptions: (1) independent probabilities, (2) dependent probabilities, and (3) unknown probabilities. The optimization model is applied to an example of increasing the resilience of an electric power network following Superstorm Sandy.
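
The budget trade-off the abstract describes can be sketched as follows: part of the budget reduces the average loss per unit time, the rest shortens the recovery time, and the product of the two serves as a proxy for total disruption loss. The exponential functional forms and all parameters are assumptions for illustration, not those of the article.

```python
import numpy as np

# Illustrative resilience investment split between two levers
L0, T0 = 100.0, 30.0          # loss per day and recovery time with no investment
a, b = 0.08, 0.05             # effectiveness of each investment dollar (assumed)
budget = 40.0

z = np.linspace(0.0, budget, 401)          # amount spent on reducing the loss rate
loss_rate = L0 * np.exp(-a * z)
recovery_time = T0 * np.exp(-b * (budget - z))
total_impact = loss_rate * recovery_time   # proxy for cumulative disruption loss

best = z[np.argmin(total_impact)]
print(f"spend {best:.1f} on loss reduction, {budget - best:.1f} on faster recovery")
```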


The International Journal of Logistics Management | 2017

Modeling disruption in a fresh produce supply chain

Cameron A. MacKenzie; Aruna Apte

Purpose: The purpose of this paper is to quantify elements that make fresh produce supply chains (FPSCs) vulnerable to disruptions and to quantify the benefits of different disruption-management strategies. Design/methodology/approach: This paper develops a mathematical model of a disruption in a FPSC and analyzes the relationships among variables. Findings: The model determines the optimal safety stock as a function of the perishability of the produce, the length of time it takes to find the contamination, the level of demand during the disruption, and the amount of produce that can be rerouted. Applying the model to the 2006 E. coli spinach contamination reveals that the drop in customer demand for fresh spinach plays the largest role in Dole losing sales. Research limitations/implications: The model includes several parameters that may be difficult to estimate. Future models can incorporate uncertainty that is inherent in supply chain disruptions. Practical implications: The model in this paper can help a supply chain (SC) manager explore the trade-offs of different disruption-management strategies. For example, a SC manager can determine the value of holding additional safety stock vs. trying to improve traceability in the SC. Originality/value: This paper quantifies and models insights delivered in the qualitative analyses of FPSC disruptions. The theoretical contributions include an analysis of the interaction among safety stock, levels of demand, communication, and traceability parameters in order to help SC managers evaluate different strategies to mitigate the effects of contaminated produce.
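
A toy version of the safety-stock trade-off the findings describe: stock held before a contamination event spoils at a perishability rate while the contamination is traced, and the usable remainder covers whatever demand persists during the disruption. All parameters and the simple profit criterion are invented for illustration and stand in for the paper's model.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper)
demand_during_disruption = 40      # units/day customers still want after the demand drop
days_to_trace = 5                  # time needed to find the contamination source
perish_rate = 0.15                 # fraction of stored produce lost per day
holding_cost, margin = 1.0, 4.0    # cost to hold a unit vs. profit from selling it

stock_levels = np.arange(0, 400, 10)
usable = stock_levels * (1 - perish_rate) ** days_to_trace
sales = np.minimum(usable, demand_during_disruption * days_to_trace)
profit = margin * sales - holding_cost * stock_levels

print("safety stock maximizing profit:", stock_levels[np.argmax(profit)])
```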


Military Operations Research Society Journal | 2016

Improving Risk Assessment Communication

Mark A. Gallagher; Cameron A. MacKenzie; David M. Blum; Douglas A. Boerman

Assessors often reduce risk communication to a single category or color without providing the full context of the evaluation, basis, and assumptions behind the risk assessment. We attempt to remedy that by presenting an approach to communicate risk assessments more completely and with a clearer understanding of these issues. First, we specify the information an assessor should present as part of a standard risk assessment statement. This information falls into four groups: 1) the activity or collection of activities being assessed, 2) the context of the assessment (who made it, when, with what scope, and how rigorously), 3) the setting of the assessment (scenario, assumed conditions, timeframe, assumed choices, and mitigation measures), and 4) the resulting assessment. Second, we propose an approach to standardize the presentation of the actual assessment by applying the principles of simplicity, scalability, and consistency. The assessor needs to develop outcome-centric measures for key activities to provide a basis to assess the potential consequences, determine the success and failure points of the activity, and present the expected outcome for each scenario setting. We standardize the presentation of the risk assessments as categorical risks, such as colored ranges, by apportioning the expected consequences on the metric scales. We discuss combining assessments for a single activity and for an aggregate activity. The United States Air Force has implemented both our standard risk statement and our presentation approach.
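
As a small hedged illustration of apportioning an outcome-centric metric onto categorical colors between its failure and success points, the function below maps an expected outcome to a color band. The metric, cut points, and color labels are assumptions for this example, not the article's actual scales.

```python
# Sketch of mapping an expected outcome measure onto categorical risk colors.
# The metric, thresholds, and colors are illustrative assumptions.

def risk_color(expected_outcome, failure_point=20.0, success_point=80.0):
    """Map the expected value of an outcome measure (e.g., percent of sorties
    completed) onto color categories between its failure and success points."""
    span = success_point - failure_point
    if expected_outcome >= success_point:
        return "green"
    if expected_outcome >= failure_point + 0.5 * span:
        return "yellow"
    if expected_outcome >= failure_point:
        return "orange"
    return "red"

print(risk_color(65.0))   # -> 'yellow'
```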


Statistical Analysis and Data Mining | 2014

A Bayesian beta kernel model for binary classification and online learning problems

Cameron A. MacKenzie; Theodore B. Trafalis; Kash Barker

Recent advances in data mining have integrated kernel functions with Bayesian probabilistic analysis of Gaussian distributions. These machine learning approaches can incorporate prior information with new data to calculate probabilistic rather than deterministic values for unknown parameters. This paper extensively analyzes a specific Bayesian kernel model that uses a kernel function to calculate a posterior beta distribution that is conjugate to the prior beta distribution. Numerical testing of the beta kernel model on several benchmark data sets reveals that this model’s accuracy is comparable with those of the support vector machine, relevance vector machine, naive Bayes, and logistic regression, and the model runs more quickly than other algorithms. When one class occurs much more frequently than the other class, the beta kernel model often outperforms other strategies to handle imbalanced data sets, including undersampling, oversampling, and the Synthetic Minority Over-Sampling Technique. If data arrive sequentially over time, the beta kernel model easily and quickly updates the probability distribution, and this model is more accurate than an incremental support vector machine algorithm for online learning.
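
A minimal sketch of the kernel-weighted conjugate update the abstract describes, as I read it: for a test point, each training label contributes to the beta posterior in proportion to its kernel similarity, and the posterior mean gives the class probability. The RBF kernel, the uniform prior, and the toy data are assumptions for illustration and may differ from the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy binary-classification data (illustrative only)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 200) > 0).astype(float)

def rbf(x, X, gamma=1.0):
    return np.exp(-gamma * np.sum((X - x) ** 2, axis=1))

def beta_kernel_predict(x, X, y, alpha0=1.0, beta0=1.0):
    """Kernel-weighted conjugate beta update: similar positive (negative) examples
    add weight to alpha (beta); returns the posterior mean P(y = 1 | x)."""
    w = rbf(x, X)
    alpha = alpha0 + np.sum(w * y)
    beta = beta0 + np.sum(w * (1.0 - y))
    return alpha / (alpha + beta)

print(beta_kernel_predict(np.array([1.0, 0.0]), X, y))

# Online learning: a new labeled point simply adds one more kernel-weighted term,
# so the posterior updates without refitting from scratch.
```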

Collaboration


Dive into Cameron A. MacKenzie's collaborations.

Top Co-Authors

Kash Barker, University of Oklahoma
Chao Hu, Iowa State University
Eva Regnier, Naval Postgraduate School
Joost R. Santos, George Washington University
Meng Li, Iowa State University
Soobum Lee, University of Maryland