
Publications


Featured research published by Gary C. McDonald.


Journal of the American Statistical Association | 1975

A Monte Carlo Evaluation of Some Ridge-Type Estimators

Gary C. McDonald; Diane I. Galarneau

Consider the standard linear model y = Xβ + ε. Ridge regression, as viewed here, defines a class of estimators of β indexed by a scalar parameter k. Two analytic methods of specifying k are proposed and evaluated in terms of mean square error by Monte Carlo simulations. With three explanatory variables and β oriented along the eigenvector corresponding to the largest eigenvalue of the correlation matrix, least squares is dominated by these estimators in all cases investigated; however, mixed results are obtained with β oriented along the eigenvector corresponding to the smallest eigenvalue. These estimators compare favorably with other ridge-type estimators evaluated elsewhere for two explanatory variables.
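
The ridge estimator indexed by k has the closed form (X'X + kI)^{-1} X'y, with k = 0 recovering least squares. The following sketch runs a small Monte Carlo comparison in the spirit of the abstract; the collinear design, the fixed k = 0.5, and the simulation sizes are illustrative assumptions, not the authors' k-selection rules.

```python
# Illustrative Monte Carlo comparison of ridge vs. least squares.
# The design matrix, true beta, and fixed k are made-up assumptions.
import numpy as np

rng = np.random.default_rng(0)

def ridge(X, y, k):
    """Ridge estimator (X'X + kI)^{-1} X'y; k = 0 gives least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

beta = np.array([1.0, 2.0, 3.0])
mse = {0.0: 0.0, 0.5: 0.0}                    # k = 0 is ordinary least squares
reps = 200
for _ in range(reps):
    z = rng.normal(size=(30, 1))
    X = z + 0.05 * rng.normal(size=(30, 3))   # highly collinear columns
    y = X @ beta + rng.normal(size=30)
    for k in mse:
        b = ridge(X, y, k)
        mse[k] += np.sum((b - beta) ** 2) / reps

print({k: round(v, 2) for k, v in mse.items()})
```

With this degree of collinearity the shrinkage from even a modest k typically cuts the summed squared estimation error by an order of magnitude.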


Technometrics | 1973

Instabilities of Regression Estimates Relating Air Pollution to Mortality

Gary C. McDonald; Richard C. Schwing

The instability of ordinary least squares estimates of linear regression coefficients is demonstrated for mortality rates regressed on various socioeconomic, weather, and pollution variables. A ridge regression technique presented by Hoerl and Kennard (Technometrics 12 (1970) 69–82) is employed to arrive at "stable" regression coefficients which, in some instances, differ considerably from the ordinary least squares estimates. In addition, two methods of variable elimination are compared: one based on total squared error and the other on a ridge trace analysis.


Naval Research Logistics | 2001

Balancing and optimizing a portfolio of R&D projects

George Beaujon; Samuel P. Marin; Gary C. McDonald

A mathematical formulation of an optimization model designed to select projects for inclusion in an R&D portfolio, subject to a wide variety of constraints (e.g., capital, headcount, strategic intent), is presented. The model is similar to others that have previously appeared in the literature and is in the form of a mixed integer programming (MIP) problem known as the multidimensional knapsack problem. Exact solution of such problems is generally difficult but can be accomplished in reasonable time using specialized algorithms. The main contribution of this paper is an examination of two important issues related to the formulation of project selection models such as the one presented here. If partial funding and implementation of projects are allowed, the resulting formulation is a linear programming (LP) problem which can be solved quite easily. Several plausible assumptions about how partial funding affects project value are presented. In general, our examples suggest that the problem might best be formulated as a nonlinear programming (NLP) problem, but there is a need for further research to determine an appropriate expression for the value of a partially funded project. In light of that gap in the current body of knowledge, and for practical reasons, the LP relaxation of this model is preferred. The LP relaxation can be implemented in a spreadsheet (even for relatively large problems) and gives reasonable results when applied to a test problem based on GM's R&D project selection process. There has been much discussion in the literature on the topic of assigning a quantitative measure of value to each project. Although many alternatives are suggested, no single method is universally accepted as preferred. There does seem to be general agreement that all of the proposed methods are subject to considerable uncertainty. A systematic way to examine the sensitivity of project selection decisions to variations in the measure of value is developed.
It is shown that the solution for the illustrative problem is reasonably robust to rather large variations in the measure of value. We cannot, however, conclude that this would be the case in general.
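
The LP relaxation the abstract prefers replaces the 0-1 selection variables with funding fractions in [0, 1]. A minimal sketch with scipy's `linprog`, using made-up project values and budgets rather than anything from the paper's GM test problem:

```python
# LP relaxation of a project selection problem: choose funding fractions
# x_i in [0, 1] to maximize total value subject to capital and headcount
# budgets.  All numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

value   = np.array([10.0, 6.0, 8.0, 4.0])    # value of each project
capital = np.array([ 5.0, 3.0, 6.0, 2.0])    # capital required
heads   = np.array([ 4.0, 2.0, 3.0, 1.0])    # headcount required

res = linprog(
    c=-value,                                # linprog minimizes, so negate
    A_ub=np.vstack([capital, heads]),
    b_ub=[10.0, 6.0],                        # capital and headcount budgets
    bounds=[(0, 1)] * 4,                     # partial funding allowed
    method="highs",
)
print(np.round(res.x, 3), -res.fun)
```

The fractional components of the optimum are exactly the "partially funded" projects whose valuation the abstract flags as an open question.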


Science of The Total Environment | 1976

Measures of association of some air pollutants, natural ionizing radiation and cigarette smoking with mortality rates.

Richard C. Schwing; Gary C. McDonald

Two methods are employed to estimate the association of hydrocarbons, sulfur compounds, nitrogen compounds, natural ionizing radiation, and cigarette smoking with some age-stratified and disease-specific United States mortality rates for white males. The first method is based on a ridge regression technique and the second on a sign-constrained least squares analysis. The measures of association between these environmental factors and mortality are quantified as elasticities, i.e., the indicated percentage change in the average mortality rate corresponding to a 1% change in the average level of an environmental factor. Elasticities are estimated for age-specific and disease-specific mortality rates, and these values are then aggregated and compared to estimates corresponding to total mortality rates. Overall, consistent results are obtained using the above methods for sulfur compounds and cigarette smoking. Many of these results differ considerably from corresponding results obtained from ordinary least squares regression analysis, highlighting the need for applying the appropriate estimation methods. In addition to the variables already specified, these analyses take into consideration the following groups of explanatory variables: climate (precipitation, January temperature, July temperature, humidity, and solar radiation) and socioeconomic (age, education, sound housing, population per household, population density, % non-white, % white-collar, income, and city size).
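
For a linear fit y ≈ Xb, the elasticity of factor j in the sense described above is b_j · mean(x_j) / mean(y). A minimal sketch, using simulated data and a plain least-squares fit as illustrative stand-ins for the paper's ridge and sign-constrained fits:

```python
# Elasticities from a linear regression: percent change in average y per
# 1% change in the average level of each factor.  Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(1, 10, size=(50, 3))          # three environmental factors
y = X @ np.array([2.0, 0.5, 1.0]) + 100 + rng.normal(size=50)

A = np.column_stack([np.ones(50), X])         # intercept + factors
b = np.linalg.lstsq(A, y, rcond=None)[0]

elasticity = b[1:] * X.mean(axis=0) / y.mean()
print(np.round(elasticity, 3))
```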


Graphical Representation of Multivariate Data | 1978

SOME APPLICATIONS OF THE “CHERNOFF FACES”: A TECHNIQUE FOR GRAPHICALLY REPRESENTING MULTIVARIATE DATA

Gary C. McDonald; James A. Ayers

This paper presents a brief description and several applications of a relatively new method of graphical representation of multivariate data. The technique has been developed by H. Chernoff, and consists of mapping a vector-valued data point (presently limited to 18 or fewer components) into a geometrically constructed face. To provide an example of this technique, the data analyzed in a recent mortality and pollution study have been mapped into faces. Each of the sixty faces represents a portrait of a particular Standard Metropolitan Statistical Area (SMSA). These faces are then employed to initialize a cluster analysis algorithm, and to examine certain trends in least squares residuals.
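
A toy version of the mapping the abstract describes: each row of data drives a few facial features. Real Chernoff faces control up to 18 features; this sketch controls three (head width, eye size, mouth curvature), with made-up data standing in for the SMSA variables.

```python
# Minimal Chernoff-face sketch: map each data row to three facial features.
import matplotlib
matplotlib.use("Agg")                          # render off-screen
import matplotlib.pyplot as plt
import numpy as np
from matplotlib.patches import Circle, Ellipse

data = np.array([[0.2, 0.8, 0.5],
                 [0.9, 0.3, 0.1]])             # two "cities", three variables

fig, axes = plt.subplots(1, len(data), figsize=(4, 2))
for ax, (width, eye, smile) in zip(axes, data):
    ax.add_patch(Ellipse((0, 0), 1 + width, 1.6, fill=False))       # head
    for x in (-0.25, 0.25):                                         # eyes
        ax.add_patch(Circle((x, 0.25), 0.05 + 0.1 * eye, fill=False))
    t = np.linspace(-0.3, 0.3, 50)                                  # mouth
    ax.plot(t, -0.4 + (smile - 0.5) * 4 * t ** 2)
    ax.set_xlim(-1, 1); ax.set_ylim(-1, 1); ax.axis("off")
fig.savefig("faces.png")
```

Glancing across the rendered faces, similar rows produce visually similar portraits, which is what makes the technique usable for eyeballing clusters.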


Communications in Statistics-theory and Methods | 1981

Selecting logistic populations using the sample medians

Thomas J. Lorenzen; Gary C. McDonald

This paper is concerned primarily with subset selection procedures based on the sample medians of logistic populations. A procedure is given which chooses a nonempty subset from among k independent logistic populations, having a common known variance, so that the population with the largest location parameter is contained in the subset with a pre-specified probability. The constants required to apply the median procedure with small sample sizes (≤ 19) are tabulated and can also be used to construct simultaneous confidence intervals. Asymptotic formulae are provided for application with larger sample sizes. It is shown that, under certain situations, rules based on the median are substantially more efficient than analogous procedures based either on sample means or on the sum of joint ranks.
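
A sketch of a median-based subset rule of the kind described: keep every population whose sample median is within a constant d of the largest sample median. The constant below is a placeholder; the paper tabulates the exact values that achieve the pre-specified probability.

```python
# Median-based subset selection from k logistic populations.
# The selection constant d is a placeholder, not a tabulated value.
import numpy as np

rng = np.random.default_rng(2)
k, n = 4, 15
locations = [0.0, 0.2, 0.5, 2.0]              # true location parameters
samples = [rng.logistic(loc, 1.0, size=n) for loc in locations]

d = 1.0                                       # placeholder selection constant
medians = np.array([np.median(s) for s in samples])
selected = np.flatnonzero(medians >= medians.max() - d)
print(medians.round(2), selected)
```

By construction the subset is nonempty (the population with the largest sample median is always retained), matching the rule's stated form.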


Technometrics | 1979

Nonparametric Selection Procedures Applied to State Traffic Fatality Rates

Gary C. McDonald

This article reviews the practical aspects of several nonparametric subset selection rules useful in block design problems, and discusses advantages and disadvantages of these methods. The populations are assumed stochastically ordered by the parameter of interest. Rules based on ranked observations are given for selecting a subset of populations which contains, with a specified confidence level, the population characterized by the smallest (or largest) parameter value. These procedures are applied to state traffic fatality rates recorded yearly (1960-76). New England states and Middle Atlantic states comprise most of the subset asserted, with a 90% confidence level, to contain the state with the smallest fatality rate; whereas Southern states, Southwestern states, and Rocky Mountain states generally comprise the subset for the state with the largest fatality rate.
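
A sketch of a rank-based rule for a block design: within each year (block) the states are ranked, rank sums are accumulated across years, and every state whose rank sum is within a constant of the smallest rank sum is retained. The data and the constant are illustrative assumptions, not the 1960-76 fatality rates or a tabulated value.

```python
# Rank-sum subset selection in a block design (years are blocks).
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(3)
states, years = 5, 17
base = np.array([2.0, 3.0, 3.2, 4.5, 5.0])    # "typical" fatality rates
rates = base + rng.normal(scale=0.5, size=(years, states))

ranks = np.apply_along_axis(rankdata, 1, rates)   # rank within each year
rank_sums = ranks.sum(axis=0)
c = 15.0                                          # placeholder constant
subset = np.flatnonzero(rank_sums <= rank_sums.min() + c)
print(rank_sums, subset)
```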


Journal of Quality Technology | 1986

A statistical selection approach to binomial models

Shanti S. Gupta; Gary C. McDonald

Operating characteristics are studied for slippage and equi-spaced parametric configurations. Tables and graphs relating to selection probabilities and expected subset size are presented as well as examples for illustrating their use. Also, a new rule i..
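
The selection probabilities and expected subset size mentioned above can be estimated by simulation for a Gupta-type rule: keep every binomial population whose success count is within D of the largest count. The slippage configuration, D, and sample sizes below are illustrative assumptions, not the paper's tabulated constants.

```python
# Monte Carlo operating characteristics of a Gupta-type binomial
# selection rule under a slippage configuration (one p shifted up).
import numpy as np

rng = np.random.default_rng(4)
n, D, reps = 50, 5, 2000
p = np.array([0.3, 0.3, 0.3, 0.5])            # slippage configuration

correct, size = 0, 0.0
for _ in range(reps):
    x = rng.binomial(n, p)
    subset = np.flatnonzero(x >= x.max() - D)
    correct += 3 in subset                    # best population selected?
    size += len(subset) / reps

print(round(correct / reps, 3), round(size, 2))
```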


Technometrics | 1984

A Rational Interpretation of the Ridge Trace

Diane I. Gibbons; Gary C. McDonald

The ridge regression estimator may be written as a linear combination of the least squares estimators derived from all possible subset regressions. This article delineates the relationship between the ridge estimator and the subset regression estimators and highlights the implications of this relationship for ridge trace interpretation. Ridge-regression examples are provided, illustrating how the interpretation of a ridge trace is enhanced.
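
A minimal ridge-trace sketch: the coefficient paths over a grid of k values, which is the object whose interpretation the article analyzes. The collinear example data are assumptions for illustration, not from the article.

```python
# Ridge trace: coefficient estimates over a grid of shrinkage values k.
import numpy as np

rng = np.random.default_rng(5)
z = rng.normal(size=(40, 1))
X = z + 0.1 * rng.normal(size=(40, 3))        # collinear predictors
y = X @ np.array([1.0, -1.0, 2.0]) + rng.normal(size=40)

ks = np.linspace(0.0, 2.0, 21)
trace = np.array([
    np.linalg.solve(X.T @ X + k * np.eye(3), X.T @ y) for k in ks
])
print(trace[0].round(2), trace[-1].round(2))  # k = 0 (OLS) vs heavy shrinkage
```

Plotting each column of `trace` against `ks` gives the usual picture: wild coefficients at k = 0 that settle as k grows, with the overall coefficient norm shrinking monotonically.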


Technometrics | 1979

A Class of Multiple Run Sampling Plans

Lonnie C. Vance; Gary C. McDonald

This article presents a class of attribute sampling plans based on the theory of runs. These plans call for the acceptance of a process if a run of specified length of nondefective items occurs before a fixed number of defectives have been observed. Recursion formulas for the probabilities of rejection and acceptance at each stage of sampling are derived. The operating characteristic (OC) curves, the average sample number (ASN) curves, and the variance of sample number (VSN) curves are given for representative plans within the class. This new class of plans is compared with a sequential plan and with a related sampling plan, also based on the theory of runs, which accepts a process if a run of specified length of nondefective items occurs in a fixed number of trials.
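
The acceptance probability for a plan of this type admits a simple recursion (a sketch under stated assumptions, not necessarily the article's formulas): with s the probability an item is nondefective and A(j, f) the acceptance probability given a current run of j nondefectives and f defectives so far, A(r, f) = 1, A(j, m) = 0, and A(j, f) = s·A(j+1, f) + (1-s)·A(0, f+1).

```python
# Acceptance probability: a run of r nondefectives before the m-th defective.
from functools import lru_cache

def acceptance_prob(s, r, m):
    """P(run of r nondefectives occurs before m defectives); s = P(nondefective)."""
    @lru_cache(maxsize=None)
    def A(j, f):
        if j == r:                  # run completed: accept
            return 1.0
        if f == m:                  # m-th defective seen: reject
            return 0.0
        return s * A(j + 1, f) + (1 - s) * A(0, f + 1)
    return A(0, 0)

print(acceptance_prob(0.95, 5, 3))
```

The same recursion evaluated stage by stage yields the OC curve as a function of s.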

Collaboration


Dive into Gary C. McDonald's collaborations.

Top Co-Authors

Richard F. Gunst

Southern Methodist University
