
Publications


Featured research published by Gregory L. Boylan.


Winter Simulation Conference | 2004

Simulation modeling requirements for determining soldier tactical mission system effectiveness

Eric S. Tollefson; Michael J. Kwinn; Phillip Martin; Gregory L. Boylan; Bobbie L. Foote

In order to maintain an edge during this time of unprecedented technological growth, the Army must field infantry soldier systems quickly; however, the cost of doing so without some assessment of utility is quite high. Therefore, the acquisition community must estimate the operational impact of proposed systems with an increasing degree of accuracy. For this, the Army has turned to combat simulations. However, the focus in the past has been on larger battlefield systems and unit-level analyses. Additionally, infantry soldier models require unprecedented fidelity in terms of the soldier entity and his environment. As a result, the simulation representation of the individual soldier on the battlefield has not kept pace with other representations. In this paper, we discuss our identification of the unique simulation requirements for modeling the infantry soldier as a system of systems in support of acquisition decision making.


International Journal of Production Research | 2013

Robust parameter design in embedded high-variability production processes: an alternative approach to mitigating sources of variability

Gregory L. Boylan; Byung Rae Cho

Many of today’s industrial firms seek the continuous and systematic reduction of variability as a primary engineering goal across key production dimensions. Robust parameter design (RPD) is often considered among the most important methods for achieving these ends. Focused on statistical modelling and numerical optimisation strategies, most researchers typically assume processes possess moderate to low variability, which facilitates the use of ordinary least squares (OLS) regression. Realistically, however, industrial processes often exhibit high variability. In such cases, many of the modelling assumptions underpinning OLS methods do not hold. Consequently, the results and recommendations provided to decision-makers on the basis of these methods could lead to suboptimal modifications to processes and products. This paper proposes an alternative method for dealing with high process variability. Specifically, using the coefficient of variation to identify influential sources of variability between design points, the proposed method advocates removing these sources and then applying optimal design theory to rebalance the experimental framework. Thereafter, RPD optimisation schemes may be applied to obtain more precise optimal operating conditions with less variability and bias. A numerical example combined with Monte Carlo simulation is used to illustrate the proposed procedure.
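The coefficient-of-variation screening step described in the abstract can be sketched roughly as follows; the replicate data and the flagging threshold are invented for illustration, and the paper's subsequent optimal-design rebalancing is not shown:

```python
import statistics

def coefficient_of_variation(replicates):
    """CV = sample standard deviation divided by the sample mean."""
    return statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicated responses at four design points.
design_points = {
    "dp1": [50.1, 49.8, 50.3, 50.0],
    "dp2": [48.0, 62.5, 41.2, 55.9],  # a high-variability design point
    "dp3": [50.5, 50.9, 50.2, 50.6],
    "dp4": [49.7, 50.1, 49.9, 50.4],
}

# Flag design points whose CV exceeds an assumed screening threshold;
# these would be removed before rebalancing the design.
THRESHOLD = 0.10
cv_by_point = {dp: coefficient_of_variation(obs)
               for dp, obs in design_points.items()}
flagged = [dp for dp, cv in cv_by_point.items() if cv > THRESHOLD]
print(flagged)
```

The CV is used here rather than the raw standard deviation because it puts design points with different mean responses on a common, scale-free footing.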


Quality and Reliability Engineering International | 2012

The Normal Probability Plot as a Tool for Understanding Data: A Shape Analysis from the Perspective of Skewness, Kurtosis, and Variability

Gregory L. Boylan; Byung Rae Cho

Continuous quality improvement is an effort to improve the quality of products, processes, or services. A program intended to effectively implement such efforts begins with the collection and analysis of data. The primary purpose of the normal probability plot, one of the graphical tools most frequently used by quality practitioners and researchers, is normality testing; however, the plot offers other valuable insights into data analysis that have rarely been addressed in the research community. This article provides an overview of distributional characteristics in the context of the four sample moments and investigates how variations in these moments affect the normal probability plot, focusing primarily on the presence of skewness and kurtosis and the effects of variability. This article then lays out a comprehensive analysis of how various statistical characteristics within a data set can influence the shape and corresponding properties of a normal probability plot, demonstrating how variations in the characteristics of the data can reveal or mask the degree of concavity, convexity, or the S shape in the plot, as well as the spread of the data about the mean and in the tails. This can provide engineers with a better understanding of the ways in which data “communicate” through the plot, thereby providing a better basis for initial assumptions, as well as facilitating more accurate model estimation and optimization results thereafter.
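As a rough illustration of how such a plot is constructed, the sketch below pairs ordered observations with standard-normal quantiles using the common plotting position (i - 0.5)/n; the sample data and the choice of plotting position are assumptions for illustration, not taken from the article:

```python
import statistics

def normal_probability_plot_points(data):
    """Pair each ordered observation with a standard-normal quantile.

    Uses the plotting position p_i = (i - 0.5) / n.  Points that fall
    on a straight line suggest normality; curvature toward the tails
    signals skewness or non-normal kurtosis.
    """
    n = len(data)
    ordered = sorted(data)
    std_normal = statistics.NormalDist()
    quantiles = [std_normal.inv_cdf((i - 0.5) / n) for i in range(1, n + 1)]
    return list(zip(quantiles, ordered))

# A right-skewed sample: the upper tail bends away from the reference line.
sample = [1.0, 1.1, 1.2, 1.3, 1.5, 1.8, 2.4, 4.0]
for q, x in normal_probability_plot_points(sample):
    print(f"{q:+.3f}  {x:.2f}")
```

Plotting the returned (quantile, observation) pairs and comparing them to a straight reference line is exactly the shape assessment the article analyses.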


Computers & Industrial Engineering | 2013

Comparative studies on the high-variability embedded robust parameter design from the perspective of estimators

Gregory L. Boylan; Byung Rae Cho

Engineers and scientists often identify robust parameter design (RPD) as one of the most important process and quality improvement methods. Focused on determining the optimum operating conditions that facilitate target attainment with minimum variability, typical approaches to RPD use ordinary least squares methods to obtain response functions for the mean and variance by assuming that process data are normally distributed and exhibit reasonably low variability. Consequently, the sample mean and standard deviation are the most common estimators used in the initial tier of estimation, as they perform best when these assumptions hold. Realistically, however, industrial processes often exhibit high variability, particularly in mass production lines. If ignored, such conditions can cause the quality of the estimates obtained using the sample mean and standard deviation to deteriorate. This paper examines several alternatives to the sample mean and standard deviation, incorporating them into RPD modeling and optimization approaches to ascertain which tend to yield better solutions when highly variable conditions prevail. Monte Carlo simulation and numerical studies are used to compare the performances of the proposed methods with the traditional approach.
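The kind of comparison the paper describes can be sketched as a small Monte Carlo study. The contamination model, sample sizes, and the use of a MAD-based scale estimator as the alternative are illustrative assumptions, not necessarily the estimators the paper examines:

```python
import random
import statistics

def mad_scale(xs):
    """Median absolute deviation, scaled by 1.4826 so that it estimates
    the standard deviation when the bulk of the data is normal."""
    med = statistics.median(xs)
    return 1.4826 * statistics.median([abs(x - med) for x in xs])

def contaminated_sample(n, p=0.1, sigma_out=5.0):
    """Mostly N(0, 1) draws with occasional high-variability contamination."""
    return [random.gauss(0.0, sigma_out if random.random() < p else 1.0)
            for _ in range(n)]

random.seed(42)
TRIALS, N = 2000, 30
sd_estimates, mad_estimates = [], []
for _ in range(TRIALS):
    xs = contaminated_sample(N)
    sd_estimates.append(statistics.stdev(xs))
    mad_estimates.append(mad_scale(xs))

# The sample SD is inflated by the contamination, while the MAD-based
# estimate stays near the clean-process sigma of 1.
print("avg sample SD :", round(statistics.mean(sd_estimates), 3))
print("avg MAD scale :", round(statistics.mean(mad_estimates), 3))
```

Feeding the more stable first-tier estimates into the second-tier response models for mean and variance is what would then change the RPD solution.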


Quality and Reliability Engineering International | 2013

Studies on the Effects of Estimator Selection in Robust Parameter Design under Asymmetric Conditions

Gregory L. Boylan; Byung Rae Cho

The primary goal of robust parameter design (RPD) is to determine the optimum operating conditions that achieve process performance targets while minimizing variability in the results. To achieve this goal, typical approaches to RPD problems use ordinary least squares methods to obtain response functions for the mean and variance by assuming that the experimental data follow a normal distribution and are relatively free of contaminants or outliers. Consequently, the most common estimators used in the initial tier of estimation are the sample mean and sample variance, as they are very good estimators when these assumptions hold. However, it is often the case that such assumed conditions do not exist in practice; notably, that inherent asymmetry pervades system outputs. If unaccounted for, such conditions can affect results tremendously by causing the quality of the estimates obtained using the sample mean and standard deviation to deteriorate. Focusing on asymmetric conditions, this paper examines several highly efficient estimators as alternatives to the sample mean and standard deviation. We then incorporate these estimators into RPD modeling and optimization approaches to ascertain which estimators tend to yield better solutions when skewness exists. Monte Carlo simulation and numerical studies are used to substantiate and compare the performance of the proposed methods with the traditional approach.
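The effect of asymmetry on the first-tier estimators can be sketched with a small Monte Carlo study under an assumed lognormal (right-skewed) process; the distribution and the sample median as the alternative location estimator are illustrative choices, not necessarily those the paper evaluates:

```python
import random
import statistics

random.seed(7)

def skewed_sample(n):
    """Lognormal(0, 1) draws: a strongly right-skewed output whose
    population median is 1.0 and population mean is e**0.5 (about 1.65)."""
    return [random.lognormvariate(0.0, 1.0) for _ in range(n)]

TRIALS, N = 2000, 25
means, medians = [], []
for _ in range(TRIALS):
    xs = skewed_sample(N)
    means.append(statistics.mean(xs))
    medians.append(statistics.median(xs))

# The long right tail drags the sample mean upward and makes it noisier;
# the sample median stays close to the distribution's centre.
print("avg sample mean  :", round(statistics.mean(means), 3))
print("avg sample median:", round(statistics.mean(medians), 3))
print("sd of the means  :", round(statistics.stdev(means), 3))
print("sd of the medians:", round(statistics.stdev(medians), 3))
```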


International Journal of Experimental Design and Process Optimisation | 2011

Achieving cost robustness in processes with mixed multiple quality characteristics and dynamic variability

Gregory L. Boylan; Paul L. Goethals

The tenuous economic conditions of the past several years continue to threaten the economic existence of many companies, forcing them to re-examine ways for reducing costs without surrendering quality. A common technique for achieving high quality at minimal cost focuses on identifying the ideal process mean setting among various quality characteristics. In the design of this approach, referred to as the ‘optimal process mean problem’, comparatively little research has addressed the realities of mixed multiple quality characteristics and the dynamic nature of process variability. To address this gap, this paper examines situations involving mixed multiple quality characteristics, proposing a methodology to accurately predict the location of the optimal process mean vector as it shifts in response to changing process variability. This knowledge of a feasible region for the optimal vector can help to achieve robustness in cost by providing a mechanism for maintaining the most profitable process target settings.
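A toy version of the optimal process mean problem can make the shifting-optimum idea concrete. The sketch below handles a single characteristic with a lower specification limit; all prices, costs, and limits are hypothetical, and the paper's mixed-characteristic, dynamic-variability treatment is not reproduced:

```python
import statistics

STD_NORMAL = statistics.NormalDist()

def expected_profit(mu, sigma, lsl=10.0, price=5.0, scrap_cost=2.0,
                    material_cost=0.1):
    """Expected per-unit profit for one characteristic with a lower spec
    limit: items below the limit are scrapped, and material cost grows
    with the process mean.  All figures are hypothetical."""
    p_conform = 1.0 - STD_NORMAL.cdf((lsl - mu) / sigma)
    return (p_conform * price
            - (1.0 - p_conform) * scrap_cost
            - material_cost * mu)

def optimal_mean(sigma, lo=10.0, hi=14.0, steps=400):
    """Grid search for the profit-maximising process mean."""
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return max(grid, key=lambda mu: expected_profit(mu, sigma))

# As process variability grows, the optimal mean shifts further from the
# specification limit, tracing out the kind of feasible region the paper
# characterises.
for sigma in (0.3, 0.6, 0.9):
    print(sigma, round(optimal_mean(sigma), 2))
```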


International Journal of Quality Engineering and Technology | 2011

Analysing the effects of variability measure selection on process and product optimisation

Paul L. Goethals; Gregory L. Boylan

Since the integration of response surface methods into process robustness studies, many researchers have suggested numerous approaches to further enhance product development. Generally, these robust design methods seek the factor settings that minimise variability and the deviation of the mean from the desired target value. In the absence of a uniform approach to modelling process variability, researchers have typically chosen the standard deviation, variance, or logarithm of the standard deviation. Each measure, however, can produce a different set of optimal factor settings, thus complicating comparison studies. The purpose of this paper is to examine the effects of variability measure selection on solutions and suggest a uniform approach.
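The core observation, that modelling the standard deviation, the variance, or the log standard deviation can each yield a different set of optimal factor settings, can be sketched with one control factor and invented dispersion data; the design points and values below are assumptions for illustration only:

```python
import math

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x**2 via the normal equations."""
    n = len(xs)
    sx = sum(xs); sx2 = sum(x ** 2 for x in xs)
    sx3 = sum(x ** 3 for x in xs); sx4 = sum(x ** 4 for x in xs)
    sy = sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sx2y = sum(x * x * y for x, y in zip(xs, ys))
    A = [[n, sx, sx2], [sx, sx2, sx3], [sx2, sx3, sx4]]
    return solve3(A, [sy, sxy, sx2y])

# Assumed per-design-point standard deviations for one control factor x.
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
sds = [3.0, 1.8, 1.2, 1.5, 2.6]

measures = {
    "std dev": sds,
    "variance": [s * s for s in sds],
    "log std dev": [math.log(s) for s in sds],
}

# Vertex -b / (2c) of each fitted convex quadratic: the "optimal" setting
# implied by that variability measure.
optima = {}
for name, ys in measures.items():
    a, b, c = fit_quadratic(xs, ys)
    optima[name] = -b / (2.0 * c)
    print(f"{name:12s} optimum x = {optima[name]:+.3f}")
```

Because the three transformations are nonlinear in each other, least-squares fits of each one produce slightly different fitted surfaces, and hence different minimisers, which is exactly why the choice of measure complicates comparison studies.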


Systems and Information Engineering Design Symposium | 2007

Identification of Critical Factor Sets for Close-Range Engagements in Urban Operations

Zach Griffiths; Seth Sanert; Mike Snodgrass; Sean Snook; Gregory L. Boylan

The dynamic and asymmetric environment of today's battlefield and the proliferation of technologically advanced equipment at the soldier level have increased the need for more accurate and higher-fidelity modeling of the individual soldier in combat. Traditionally, the focus of the combat modeling community has been on large-scale battlefield platforms and unit-level analyses. Consequently, the representation of the individual soldier on the battlefield has not kept pace with other representations. Infantry soldier models require unprecedented fidelity in terms of the Infantry soldier entity, his behavioral and decision processes, and his environment. To enable the modeling community to represent the individual soldier in combat simulations more accurately, in a timely and cost-effective manner, it is important to first identify and develop the critical factors and functions that have the greatest impact on soldier lethality, survivability, and combat effectiveness in close-range (0-50 m), team-sized engagements. We used the systems decision process (SDP) to identify these critical factors and to recommend to our client the order in which the modeling community should address them. Our application of standard systems engineering tools led to a unique characterization of the factors considered most critical for the accurate representation of individual soldiers in close-range engagements. The resulting ordered set of factors will help focus the efforts of the modeling community and ensure the more timely and cost-effective integration of soldier behavioral and decision-making aspects into combat simulations. This paper discusses the problem background and the methodology we applied to identify these critical factors.


Archive | 2004

Simulation Roadmap for Program Executive Office (PEO) Soldier

Eric S. Tollefson; Gregory L. Boylan; Michael J. Kwinn; Bobbie L. Foote; Paul D. West


Applied Mathematical Modelling | 2013

Robust parameter design in resource-constrained environments: An investigation of trade-offs between costs and precision within variable processes

Gregory L. Boylan; Paul L. Goethals; Byung Rae Cho

Collaboration


Dive into Gregory L. Boylan's collaborations.

Top Co-Authors

Bobbie L. Foote, United States Military Academy
Eric S. Tollefson, United States Military Academy
Michael J. Kwinn, United States Military Academy
Mike Snodgrass, United States Military Academy
Paul D. West, United States Military Academy
Phillip Martin, United States Military Academy
Sean Snook, United States Military Academy
Seth Sanert, United States Military Academy