
Publications


Featured research published by Ross Gore.


Winter Simulation Conference | 2008

Applying causal inference to understand emergent behavior

Ross Gore; Paul F. Reynolds

Emergent behaviors in simulations require explanation, so that valid behaviors can be separated from design or coding errors. Validation of emergent behavior requires accumulation of insight into the behavior and the conditions under which it arises. Previously, we have introduced an approach, Explanation Exploration (EE), to gather insight into emergent behaviors using semi-automatic model adaptation. We improve our previous work by iteratively applying causal inference procedures to samples gathered from the semi-automatic model adaptation. Iterative application of causal inference procedures reveals the interactions of identified abstractions within the model that cause the emergent behavior. Uncovering these interactions gives the subject matter expert new insight into the emergent behavior and facilitates the validation process.


International Conference on Software Engineering | 2012

Reducing confounding bias in predicate-level statistical debugging metrics

Ross Gore; Paul F. Reynolds

Statistical debuggers use data collected during test case execution to automatically identify the location of faults within software. Recent work has applied causal inference to eliminate or reduce control and data flow dependence confounding bias in statement-level statistical debuggers. The result is improved effectiveness. This is encouraging but motivates two novel questions: (1) how can causal inference be applied in predicate-level statistical debuggers and (2) what other biases can be eliminated or reduced. Here we address both questions by providing a model that eliminates or reduces control flow dependence and failure flow confounding bias within predicate-level statistical debuggers. We present empirical results demonstrating that our model significantly improves the effectiveness of a variety of predicate-level statistical debuggers, including those that eliminate or reduce only a single source of confounding bias.
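The predicate-level scoring this paper refines can be illustrated with the baseline "Increase" score from Cooperative Bug Isolation, which ranks a predicate by how much more often runs fail when it is true than when its site is merely reached. This is a minimal sketch of that baseline only, not the paper's bias-reducing model, and the run counts below are hypothetical.

```python
def importance(fail_true, succ_true, fail_obs, succ_obs):
    """CBI-style 'Increase' score for one instrumented predicate:
    P(fail | predicate true) minus P(fail | predicate observed)."""
    failure = fail_true / (fail_true + succ_true)
    context = fail_obs / (fail_obs + succ_obs)
    return failure - context

# Hypothetical counts for a predicate such as `x > limit`:
# true in 9 failing and 1 passing run, and reached (observed)
# in 10 failing and 40 passing runs overall.
print(round(importance(9, 1, 10, 40), 2))  # → 0.7
```

A score near zero means the predicate fires no more often in failing runs than chance predicts; scores near one point at strong fault candidates.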


PLOS ONE | 2015

You Are What You Tweet: Connecting the Geographic Variation in America’s Obesity Rate to Twitter Content

Ross Gore; Saikou Y. Diallo; Jose J. Padilla

We conduct a detailed investigation of the relationship between the obesity rate of urban areas and expressions of happiness, diet and physical activity on social media. We do so by analyzing a massive, geo-tagged data set comprising over 200 million words generated over the course of 2012 and 2013 on the social network service Twitter. Among many results, we show that areas with lower obesity rates: (1) have happier tweets and frequently discuss (2) food, particularly fruits and vegetables, and (3) physical activities of any intensity. Additionally, we provide evidence that each of these results offers different and unique insight into the variation of the obesity rate in urban areas within the United States. Our work shows how the contents of social media may be used to estimate real-time, population-scale measures of factors related to obesity.


Winter Simulation Conference | 2007

An exploration-based taxonomy for emergent behavior analysis in simulations

Ross Gore; Paul F. Reynolds

Emergent behaviors in simulations require explanation, so that valid behaviors can be separated from design or coding errors. We present a taxonomy to be applied to emergent behaviors of unknown validity. Our goal is to facilitate the explanation process. Once a user identifies an emergent behavior as a certain type within our taxonomy, exploration can commence in a manner befitting that type. Exploration based on type supports narrowing of possibilities and suggests exploration methods, thus facilitating the exploration process. Ideally, a taxonomy would be robust, allowing reasonable variation in behavior type assignment without penalty in cost or correctness during the exploration process. The taxonomy we present is robust, comprehensive and suitable for use with our established emergent behavior exploration methods. In addition to the taxonomy, we present our design rationale and a summary of results from a test application of our taxonomy.


Automated Software Engineering | 2011

Statistical debugging with elastic predicates

Ross Gore; Paul F. Reynolds; David Kamensky

Traditional debugging and fault localization methods have addressed localization of sources of software failures. While these methods are effective in general, they are not tailored to an important class of software, including simulations and computational models, which employ floating-point computations and continuous stochastic distributions to represent, or support evaluation of, an underlying model. To address this shortcoming, we introduce elastic predicates, a novel approach to predicate-based statistical debugging. Elastic predicates introduce profiling of values assigned to variables within a failing program. These elastic predicates are better predictors of software failure than the static and uniform predicates used in existing techniques such as Cooperative Bug Isolation (CBI). We present experimental results for established fault localization benchmarks and widely used simulations that show improved effectiveness.
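The core idea of an elastic predicate, that thresholds adapt to the values a variable is profiled to take rather than fixed cutoffs like `x > 0`, can be sketched briefly. This is an illustrative simplification under assumed profiling data, not the authors' implementation.

```python
import statistics

def elastic_predicate(profiled_values):
    """Build a predicate whose threshold adapts to the values the
    variable actually took during profiled (passing) runs, instead
    of a static, uniform cutoff."""
    mean = statistics.mean(profiled_values)
    stdev = statistics.pstdev(profiled_values)
    # Fire when a value falls unusually far from the profiled mean.
    return lambda x: abs(x - mean) > stdev

# Hypothetical profile: a floating-point variable across passing runs,
# plus one value observed in a failing run.
passing = [0.98, 1.02, 1.01, 0.99, 1.00]
failing_value = 3.7

pred = elastic_predicate(passing)
print(pred(failing_value))  # the anomalous failing value is flagged
print(pred(1.01))           # typical values are not
```

Because the threshold is derived from observed distributions, the same construction works for stochastic, continuous-valued programs where any fixed cutoff would be arbitrary.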


Workshop on Parallel and Distributed Simulation | 2007

Explanation Exploration: Exploring Emergent Behavior

Ross Gore; Paul F. Reynolds; Lingjia Tang; David C. Brogan

Understanding emergent behaviors exhibited in simulations poses an interesting challenge. Emergence can represent a valid behavior arising from seemingly unrelated phenomena, or it can reflect an error in a model or its implementation. We propose a new method for gathering insight into emergent behavior in simulations using the model adaptation technique COERCE. COERCE allows a user to efficiently adapt a model to meet new requirements and can be employed to explore emergent behavior exhibited in a simulation. A subject matter expert (SME) can coerce a simulation to gather insight into characteristics of the emergent behavior as the simulated phenomenon is driven toward conditions of interest.


Complexity | 2014

Toward a formalism of modeling and simulation using model theory

Saikou Y. Diallo; Jose J. Padilla; Ross Gore; Heber Herencia-Zapana; Andreas Tolk

This article proposes a Modeling and Simulation (M&S) formalism using Model Theory. The article departs from the premise that M&S is the science that studies the nature of truth using models and simulations. Truth in models and simulations is relative as they seek to answer specific modeling questions. Consequently, truth in M&S is relative because every model is a purposeful abstraction of reality. We use Model Theory to express the proposed formalism because it is built from the premise that truth is relative. The proposed formalism allows us to: (1) deduce formal definitions and explanations of areas of study in M&S, including conceptual modeling, validity, and interoperability, and (2) gain insight into which tools can be used to semi-automate validation and interoperation processes.


ACM Transactions on Modeling and Computer Simulation | 2015

Statistical Debugging for Simulations

Ross Gore; Paul F. Reynolds; David Kamensky; Saikou Y. Diallo; Jose J. Padilla

Predictions from simulations have entered the mainstream of public policy and decision-making practices. Unfortunately, methods for gaining insight into faulty simulation outputs have not kept pace. Ideally, an insight-gathering method would automatically identify the cause of a faulty output and explain to the simulation developer how to correct it. In the field of software engineering, this challenge has been addressed for general-purpose software through statistical debuggers. We present two research contributions, elastic predicates and many-valued labeling functions, that enable debuggers designed for general-purpose software to become more effective for simulations employing random variates and continuous numbers. Elastic predicates address deficiencies of existing debuggers related to continuous numbers, whereas many-valued labeling functions support the use of random variates. When used in combination, these contributions allow a simulation developer tasked with localizing the program statement causing the faulty simulation output to examine 40% fewer statements than the leading alternatives. Our evaluation shows that elastic predicates and many-valued labeling functions maintain their ability to reduce the number of program statements that need to be examined under the imperfect conditions that developers experience in practice.


Scientometrics | 2016

Identifying key papers within a journal via network centrality measures

Saikou Y. Diallo; Christopher J. Lynch; Ross Gore; Jose J. Padilla

This article examines the extent to which existing network centrality measures can be used (1) as filters to identify a set of papers to start reading within a journal and (2) as article-level metrics to identify the relative importance of a paper within a journal. We represent a dataset of published papers in the Public Library of Science (PLOS) via a co-citation network and compute three established centrality metrics for each paper in the network: closeness, betweenness, and eigenvector. Our results show that the network of papers in a journal is scale-free and that eigenvector centrality (1) is an effective filter and article-level metric and (2) correlates well with citation counts within a given journal. However, closeness centrality is a poor filter because articles fit within a small range of citations. We also show that betweenness centrality is a poor filter for journals with a narrow focus and a good filter for multidisciplinary journals where communities of papers can be identified.
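Eigenvector centrality, the measure this study finds most effective, scores a paper highly when it is linked to other highly scored papers. A minimal sketch via power iteration on a toy four-paper co-citation network follows; the adjacency matrix is invented for illustration and is unrelated to the PLOS dataset.

```python
def eigenvector_centrality(adj, iterations=100):
    """Power iteration on an adjacency matrix (list of lists).
    Returns one centrality score per node, normalized to sum to 1."""
    n = len(adj)
    scores = [1.0 / n] * n
    for _ in range(iterations):
        # Each node's new score is the sum of its neighbors' scores.
        new = [sum(adj[i][j] * scores[j] for j in range(n)) for i in range(n)]
        total = sum(new)
        scores = [s / total for s in new]
    return scores

# Toy undirected co-citation network: papers 0, 1, 2 are mutually
# co-cited; paper 3 is co-cited only with paper 2.
adj = [
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
]
scores = eigenvector_centrality(adj)
print(scores.index(max(scores)))  # paper 2, the best-connected node
```

Paper 2 ranks highest not only because it has the most links but because its neighbors are themselves well connected, which is exactly the property that makes the measure useful as an article-level filter.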


Winter Simulation Conference | 2013

The need for usable formal methods in verification and validation

Ross Gore; Saikou Y. Diallo

The process of developing, verifying and validating models and simulations should be straightforward. Unfortunately, following conventional development approaches can render a model design that appeared complete and robust into an incomplete, incoherent and invalid simulation during implementation. An alternative approach is for subject matter experts (SMEs) to employ formal methods to describe their models. However, formal methods are rarely used in practice due to their intimidating syntax and semantics rooted in mathematics. In this paper we argue for a new approach to verification and validation that leverages two techniques from computer science: (1) model checking and (2) automated debugging. The proposed vision offers an initial path to replace conventional simulation verification and validation methods with new automated analyses that eventually will be able to yield feedback to SMEs in a familiar language.

Collaboration


Dive into Ross Gore's collaborations.

Top Co-Authors
Hamdi Kavak

Old Dominion University
