Scott L. Rosen
Mitre Corporation
Publications
Featured research published by Scott L. Rosen.
Systems Engineering | 2015
Scott L. Rosen; Christopher P. Saunders; Samar K. Guharay
With the increasing complexity of real-world systems, especially for continuously evolving scenarios, systems analysts face a major challenge with modeling techniques that capture the detailed system characteristics defining input-output relationships: the models become very complex and require long execution times. In this situation, techniques that construct approximations of the simulation model through metamodeling alleviate long run times and the need for large computational resources; metamodeling also provides a means to aggregate a simulation's multiple outputs of interest and derive a single decision-making metric. The method described here leverages simulation metamodeling to map the three basic SE metrics, namely, measures of performance to measures of effectiveness to a single figure of merit. Using metamodels to map multilevel system measures in this way supports rapid decision making. The results from a case study demonstrate the merit of the method. Several metamodeling techniques are compared, and bootstrap error analysis and the predicted residual sum of squares (PRESS) statistic are discussed as means to evaluate the standard error and the error due to bias.
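To make the validation metrics mentioned above concrete, the following is a minimal Python sketch of a PRESS calculation and a bootstrap error estimate for a simple linear-in-parameters metamodel. The synthetic data, the quadratic model form, and the number of resamples are illustrative assumptions, not the setup used in the paper.

```python
import numpy as np

# Synthetic stand-in for simulation data: inputs X (MOP settings) and a
# scalar output y (an MOE); the actual study uses validated simulation runs.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(60, 3))
y = 2.0 * X[:, 0] - X[:, 1] ** 2 + 0.5 * X[:, 2] + rng.normal(0.0, 0.1, 60)

# Linear-in-parameters metamodel: intercept + main effects + squared terms.
A = np.column_stack([np.ones(len(X)), X, X ** 2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ beta

# PRESS via the hat-matrix identity: each leave-one-out residual equals
# e_i / (1 - h_ii), so no refitting loop is needed for a linear model.
H = A @ np.linalg.pinv(A.T @ A) @ A.T
press = np.sum((residuals / (1.0 - np.diag(H))) ** 2)

# Bootstrap error analysis: refit on each resample and score on the
# original data to get a rough estimate of error due to bias.
boot = []
for _ in range(500):
    idx = rng.integers(0, len(y), len(y))
    b, *_ = np.linalg.lstsq(A[idx], y[idx], rcond=None)
    boot.append(np.mean((y - A @ b) ** 2))
print(f"PRESS = {press:.4f}, bootstrap SE of MSE = {np.std(boot):.4f}")
```

For a linear model the hat-matrix identity avoids refitting for each leave-one-out residual; a nonlinear metamodel would need an explicit cross-validation loop.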
International Conference on System of Systems Engineering | 2013
Scott L. Rosen; Christopher P. Saunders; Michael Tierney; Samar K. Guharay
This paper presents a model-based systems engineering approach developed for rapid analysis of complex systems without requiring high computational resources. The basis of the approach is the mapping of three basic systems engineering metrics, namely, Measures of Performance (MOP) to Measures of Effectiveness (MOE) to a single Figure of Merit (FOM), through metamodeling. Through this approach, analysts can leverage validated metamodels to map system measures, from component-level MOPs to the overall system FOM, in real time to support decisions under constrained time frames. Through metamodeling we obtain approximations of the simulation model in mathematical form, which alleviates long run times and the need for large computational resources. The metamodels also provide an effective means to aggregate a simulation's multiple outputs of interest via a preference function. These two approaches together form the foundation of this rapid, model-based systems engineering approach. The effectiveness of the approach is demonstrated by configuring a standoff detection system.
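The preference-function aggregation step can be pictured with a small sketch: hypothetical MOEs for a standoff-detection configuration are mapped onto desirability scores and combined into a single FOM by a weighted geometric mean. The metric names, bounds, and weights below are invented for illustration and are not taken from the paper.

```python
import numpy as np

def desirability(value, low, high, maximize=True):
    """Map an MOE onto [0, 1]; linear ramp between stated low/high bounds."""
    d = np.clip((value - low) / (high - low), 0.0, 1.0)
    return d if maximize else 1.0 - d

def figure_of_merit(moes, specs, weights):
    """Aggregate several MOEs into one FOM via a weighted geometric mean."""
    ds = np.array([desirability(moes[k], *specs[k]) for k in specs])
    w = np.array([weights[k] for k in specs], dtype=float)
    w = w / w.sum()
    return float(np.prod(ds ** w))

# Hypothetical MOEs for a standoff-detection configuration.
moes = {"p_detect": 0.92, "false_alarm_rate": 0.08, "latency_s": 3.5}
specs = {"p_detect": (0.5, 1.0, True),
         "false_alarm_rate": (0.0, 0.2, False),
         "latency_s": (1.0, 10.0, False)}
weights = {"p_detect": 0.5, "false_alarm_rate": 0.3, "latency_s": 0.2}
print(f"FOM = {figure_of_merit(moes, specs, weights):.3f}")
```

Swapping the preference function (for example, a weighted sum or a minimum over desirabilities) changes how strongly a single poor MOE penalizes the overall FOM.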
Winter Simulation Conference | 2006
Scott L. Rosen; John A. Stine; William J. Weiland
In this paper we describe a new simulation tool used to study the creation and optimization of propagation maps for node-state routing protocols within wireless and mobile ad hoc networks. The simulation is developed in MATLAB and interfaces with DLL libraries for network data management support. These DLL libraries also have applications within OPNET for other node-state routing studies.
Winter Simulation Conference | 2015
Scott L. Rosen; David Slater; Emmet Beeker; Samar K. Guharay; Garry M. Jacyna
This paper presents an application of simulation metamodeling to improve the analysis capabilities within a decision support tool for Critical Infrastructure network evaluation. Simulation metamodeling enables timeliness of analysis, which was not achievable with the original large-scale network simulation because of long set-up times and slow run times. We show through a case study that the behavior of a large-scale simulation for Critical Infrastructure analysis can be effectively captured by Neural Network metamodels and Stochastic Kriging metamodels. Within the case study, metamodeling is integrated into the second step of a two-step analysis process for vulnerability assessment of the network. The first step is an algorithmic exploration of a power grid network to locate the links most susceptible to triggering cascading failures. These represent the riskiest links in the network, and the metamodels are used to visualize how their failure probabilities affect global network performance measures.
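As a rough illustration of how such a metamodel supports interactive analysis, here is a sketch using scikit-learn's Gaussian process regressor with a white-noise kernel as a stand-in for a Stochastic Kriging metamodel. The two-link toy "simulation", the design size, and the kernel settings are assumptions made purely for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Placeholder for the expensive simulation: a global performance measure as
# a function of two link failure probabilities, with run-to-run noise.
def simulate(p):
    return 1.0 - 0.8 * p[0] - 0.5 * p[1] + 0.6 * p[0] * p[1] + rng.normal(0, 0.02)

# Design points over the failure-probability space (training runs).
P = rng.uniform(0.0, 0.5, size=(40, 2))
y = np.array([simulate(p) for p in P])

# A GP with a white-noise kernel plays the role of a stochastic kriging
# metamodel: it smooths over simulation noise rather than interpolating it.
kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(P, y)

# Cheap predictions let an analyst sweep failure probabilities interactively.
grid = np.array([[a, b] for a in np.linspace(0, 0.5, 6)
                        for b in np.linspace(0, 0.5, 6)])
mean, std = gp.predict(grid, return_std=True)
print(mean[:5], std[:5])
```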
Winter Simulation Conference | 2013
Scott L. Rosen; Samar K. Guharay
Metamodeling of large-scale simulations with a large number of input parameters can be very challenging. Neural Networks have shown great promise in fitting these large-scale simulations even without factor screening. However, factor screening is an effective method for logically reducing the dimensionality of an input space and thus enabling more feasible metamodel calibration. Applying factor screening methods before calibrating Neural Network metamodels, or any metamodel, can have both positive and negative effects. The critical assumption for factor screening under investigation involves the prevalence of two-way interactions that contain a variable without a significant main effect of its own. In a simulation with a large parameter space, the prevalence of two-way interactions and their contribution to the total variability in the model output is far from transparent. Important questions therefore arise regarding factor screening and Neural Network metamodels: (a) is this a process worth doing with today's more powerful computing processors, which provide a larger library of runs for metamodeling, and (b) does erroneously screening out these buried interaction terms critically impact the level of metamodel fidelity that one can achieve? In this paper we examine these questions through a case study on a large-scale simulation that projects regional homelessness levels per county of interest based on a large array of budget decisions and resource allocations expanding out to hundreds of input parameters.
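The screening risk discussed here can be reproduced on synthetic data: the sketch below builds a response with a strong two-way interaction whose factors have weak main effects, screens on main effects only, and compares Neural Network metamodels on the full and the screened input spaces. The data-generating model, the number of retained factors, and the network sizes are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n, d = 400, 20
X = rng.uniform(-1, 1, size=(n, d))
# True response: a few main effects plus an interaction whose factors (7, 8)
# have weak main effects, i.e. the term factor screening risks discarding.
y = 3 * X[:, 0] + 2 * X[:, 1] + 4 * X[:, 7] * X[:, 8] + rng.normal(0, 0.2, n)

# Main-effects screening: keep the k factors with the largest |coefficient|.
coef = np.abs(LinearRegression().fit(X, y).coef_)
keep = np.argsort(coef)[-5:]
print("retained factors:", sorted(keep.tolist()))

# Neural network metamodels on the full and the screened input spaces
# (training R^2 only, which is enough to show the lost interaction term).
full = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                    random_state=0).fit(X, y)
screened = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                        random_state=0).fit(X[:, keep], y)
print("R^2 full:", round(full.score(X, y), 3),
      "R^2 screened:", round(screened.score(X[:, keep], y), 3))
```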
Winter Simulation Conference | 2012
Scott L. Rosen; Christopher P. Saunders; Samar K. Guharay
Long run times of a simulation can be a hindrance when an analyst is attempting to use the model for timely system analysis and optimization. In this situation, techniques such as simulation metamodeling should be considered to expedite the end user's intended analysis procedure. A difficult problem arises in the application of metamodeling when the simulation inputs and outputs are not single values but constitute time series, a situation that arises repeatedly in financial simulations and many naturally occurring processes. This paper provides a method to develop a mapping between multiple time-series inputs of a simulation and a single Figure of Merit (FoM) of the system across a given time period of interest. In addition, this paper discusses a means for an end user to define a tailored FoM with respect to their own specific system beliefs and objectives when the simulation has multiple outputs.
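One simple way to realize such a mapping, sketched below, is to reduce each input time series to a handful of summary features and fit a regression metamodel from those features to the FoM over the period of interest. The feature set, the ridge regression, the hypothetical FoM, and the synthetic runs are assumptions for illustration rather than the construction used in the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)

def features(series):
    """Reduce one input time series to a few summary features."""
    t = np.arange(len(series))
    slope = np.polyfit(t, series, 1)[0]
    return [series.mean(), series.std(), slope]

# Synthetic stand-in: 100 simulation runs, each with two input time series
# of length 50 and a scalar FoM computed over the period of interest.
runs, length = 100, 50
fom, feats = [], []
for _ in range(runs):
    s1 = np.cumsum(rng.normal(0.0, 1.0, length))
    s2 = np.cumsum(rng.normal(0.0, 1.0, length))
    # Hypothetical FoM: reward growth over the window, penalize volatility.
    fom.append(s1[-1] - s1[0] - 0.5 * s2.std())
    feats.append(features(s1) + features(s2))

model = Ridge(alpha=1.0).fit(np.array(feats), np.array(fom))
print("metamodel R^2 on training runs:", round(model.score(feats, fom), 3))
```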
Winter Simulation Conference | 2016
Scott L. Rosen; Peter Salemi; Brian Wickham; Ashley Williams; Christine Harvey; Erin Catlett; Sajjad Taghiyeh; Jie Xu
Real-life simulation optimization applications often deal with large-scale simulation models that are time-consuming to execute. Parallel computing environments, such as high-performance computing clusters and cloud computing services, provide the computing power needed to scale to such applications. In this paper, we show how the Empirical Stochastic Branch and Bound algorithm, an effective globally convergent random search algorithm for discrete optimization via simulation, can be adapted to a high-performance computing environment to effectively utilize the power of parallelism. We propose a master-worker structure driven by MITRE's Goal-Directed Grid-Enabled Simulation Experimentation Environment. Numerical experiments with the popular Ackley benchmark test function and a real-world simulation called runwaySimulator demonstrate the number of cores needed to achieve good scaled efficiency of parallel Empirical Stochastic Branch and Bound for increasing levels of simulation run time.
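The master-worker pattern itself can be sketched with Python's multiprocessing module: a master process samples candidate solutions and farms noisy Ackley evaluations out to a worker pool. This only illustrates the parallel-evaluation plumbing; the branch-and-bound partitioning logic and MITRE's experimentation environment are not represented, and the candidate sampling, replication count, and pool size are arbitrary.

```python
import numpy as np
from multiprocessing import Pool

def ackley(x):
    """Ackley test function (deterministic part)."""
    x = np.asarray(x, dtype=float)
    a, b, c, d = 20.0, 0.2, 2.0 * np.pi, len(x)
    return (-a * np.exp(-b * np.sqrt(np.sum(x ** 2) / d))
            - np.exp(np.sum(np.cos(c * x)) / d) + a + np.e)

def noisy_eval(candidate):
    """Worker task: average several noisy replications of one candidate."""
    rng = np.random.default_rng()
    reps = [ackley(candidate) + rng.normal(0.0, 0.3) for _ in range(10)]
    return candidate, float(np.mean(reps))

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    # Master samples candidate solutions (here: random integer points) and
    # distributes the expensive evaluations across a pool of workers.
    candidates = [tuple(rng.integers(-5, 6, size=4)) for _ in range(64)]
    with Pool(processes=8) as pool:
        results = pool.map(noisy_eval, candidates)
    best = min(results, key=lambda r: r[1])
    print("best candidate:", best[0], "estimated value:", round(best[1], 3))
```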
Journal of Statistical Computation and Simulation | 2017
Christine Harvey; Scott L. Rosen; Jim Ramsey; Christopher P. Saunders; Samar K. Guharay
A significant challenge in fitting metamodels of large-scale simulations with sufficient accuracy is the computational time required for rigorous statistical validation. This paper addresses the statistical computation issues associated with the Bootstrap and the modified PRESS statistic, which yield key metrics for error measurement in metamodel validation. Experimentation is performed with different programming languages, namely MATLAB, R, and Python, and on different computing architectures, including traditional multicore personal computers and high-performance clusters with parallel computing capabilities. This study yields insight into the effect that programming language and computing architecture have on the computational time for simulation metamodel validation. The experimentation is performed across two scenarios of varying complexity.
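A Python-flavoured sketch of the kind of timing comparison involved is given below: the same bootstrap validation loop is run serially and through a process pool, and the wall-clock times are compared. The synthetic data, the least-squares metamodel, and the resample count are assumptions standing in for the large-scale metamodels and architectures benchmarked in the paper.

```python
import time
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, size=(2000, 10))
y = X @ rng.normal(0, 1, 10) + rng.normal(0, 0.1, 2000)

def one_bootstrap(seed):
    """Refit the metamodel on one resample and score it on the full data."""
    r = np.random.default_rng(seed)
    idx = r.integers(0, len(y), len(y))
    beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    return float(np.mean((y - X @ beta) ** 2))

if __name__ == "__main__":
    seeds = list(range(500))

    t0 = time.perf_counter()
    serial = [one_bootstrap(s) for s in seeds]
    t_serial = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool(processes=4) as pool:
        parallel = pool.map(one_bootstrap, seeds)
    t_parallel = time.perf_counter() - t0

    print(f"serial {t_serial:.2f}s, parallel {t_parallel:.2f}s, "
          f"bootstrap SE = {np.std(serial):.4f}")
```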
Applied Economics | 2016
Samar K. Guharay; Gaurav Thakur; Fred J. Goodman; Scott L. Rosen; Daniel Houser
With the objective of identifying instability signatures of the financial system, this article integrates two classes of data-driven techniques. The first class of techniques is used to investigate macroeconomic behaviour by aggregating an ensemble of heterogeneous nonstationary time-series data, and the second class examines the local dynamics of the microstructures in each time series. Moving-window principal component analysis (PCA) and functional PCA (fPCA) are shown to extract collective signatures of the financial system for understanding macroeconomic behaviour, while the Synchrosqueezing and Markov switching techniques are used to study local dynamics within each individual time series. The integrated data analytics successfully identifies the diverse events from 1986 to 2012. All events, both major and minor, are identified by fPCA. The major economic events, especially the 2008 Great Recession, along with several minor events, show a strong leading indicator in the density index derived from Synchrosqueezing. The capability of this integrated analytics suite is demonstrated in this article, and it motivates further studies encompassing data sets from broader sectors. As a complement to existing model-driven approaches, this would lead to a robust and reliable method that can help in taking measures to avoid catastrophic collapse in the constantly evolving financial system.
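The moving-window PCA idea can be sketched in a few lines: over a sliding window, compute the share of total variance captured by the first principal component of an ensemble of series; a rising share signals increasingly collective motion. The synthetic indicator ensemble, window length, and regime change below are invented for illustration, and fPCA, Synchrosqueezing, and Markov switching are not included.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic ensemble: 8 indicators over 600 periods that become more
# strongly driven by a common factor in the second half (a stress regime).
T, k = 600, 8
common = np.cumsum(rng.normal(0, 1, T))
weight = np.where(np.arange(T) < T // 2, 0.2, 0.8)
series = np.column_stack([weight * common + rng.normal(0, 1, T)
                          for _ in range(k)])

def pca_index(window):
    """Share of variance explained by the first principal component."""
    z = (window - window.mean(axis=0)) / window.std(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(z, rowvar=False))
    return eigvals[-1] / eigvals.sum()

# Moving-window index: a rising value flags increasingly collective motion.
win = 100
index = [pca_index(series[t - win:t]) for t in range(win, T)]
print("early mean:", round(np.mean(index[:100]), 3),
      "late mean:", round(np.mean(index[-100:]), 3))
```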
Economics Letters | 2013
Samar K. Guharay; Gaurav Thakur; Fred J. Goodman; Scott L. Rosen; Daniel Houser