
Publication


Featured research published by James E. Campbell.


Journal of Quality Technology | 1981

An Approach to Sensitivity Analysis of Computer Models: Part I - Introduction, Input Variable Selection and Preliminary Variable Assessment

Ronald L. Iman; Jon C. Helton; James E. Campbell

This is the first part of a two-part article presenting a statistical approach to the sensitivity analysis of computer models. Part I defines the objectives of sensitivity analysis and presents a computer model that is used for purposes of illustration...


IEEE Transactions on Reliability | 1995

Genetic algorithms in optimization of system reliability

Laura Painton; James E. Campbell

After initial production, improvements are often made to components of a system, to upgrade system performance; for example, when designing a later version or release. This paper presents an optimization model that identifies the types of component improvements and the level of effort spent on those improvements to maximize one or more performance measures (e.g., system reliability or availability) subject to constraints (e.g., cost) in the presence of uncertainty about the component failure rates. For each component failure mode, some possible improvements are identified along with their cost and the resulting improvement in failure rates for that failure mode. The objective function is defined as a stochastic function of the performance measure of interest, in this case the 5th percentile of the mean time-between-failure distribution. The problem formulation is combinatorial and stochastic. Genetic algorithms are used as the solution method. Our approach is demonstrated on a case study of a personal computer system. Results and comparison with enumeration of the configuration space show that genetic algorithms perform very favorably in the face of noise in the output: they are able to find the optimum over a complicated, high-dimensional, nonlinear space in a tiny fraction of the time required for enumeration. The integration of genetic algorithm optimization capabilities with reliability analysis can provide a robust, powerful design-for-reliability tool.
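
The formulation above lends itself to a compact illustration. Below is a minimal sketch, not the authors' code, of a genetic algorithm that picks one improvement level per failure mode and scores each candidate by a low percentile of a Monte Carlo MTBF distribution with a budget penalty; the component data, uncertainty ranges, and GA settings are all invented for illustration.

import random

# Candidate improvements per failure mode: (cost, mean failure rate).
# Numbers are purely illustrative, not taken from the paper's case study.
OPTIONS = [
    [(0.0, 0.020), (3.0, 0.012), (7.0, 0.006)],   # failure mode 1
    [(0.0, 0.015), (2.0, 0.010), (5.0, 0.004)],   # failure mode 2
    [(0.0, 0.030), (4.0, 0.018), (9.0, 0.009)],   # failure mode 3
]
BUDGET = 12.0      # total improvement budget (assumed)
N_SAMPLES = 200    # Monte Carlo samples of the uncertain failure rates

def fitness(chromosome):
    """5th percentile of sampled MTBF, heavily penalized if over budget."""
    cost = sum(OPTIONS[i][g][0] for i, g in enumerate(chromosome))
    mtbfs = []
    for _ in range(N_SAMPLES):
        # each failure rate drawn with an assumed +/-20% uncertainty band
        total_rate = sum(random.uniform(0.8, 1.2) * OPTIONS[i][g][1]
                         for i, g in enumerate(chromosome))
        mtbfs.append(1.0 / total_rate)   # series system: failure rates add
    mtbfs.sort()
    p5 = mtbfs[int(0.05 * len(mtbfs))]
    return p5 - (1000.0 if cost > BUDGET else 0.0)

def evolve(pop_size=30, generations=40):
    pop = [[random.randrange(len(opts)) for opts in OPTIONS] for _ in range(pop_size)]
    for _ in range(generations):
        survivors = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(OPTIONS))       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                     # mutation
                i = random.randrange(len(OPTIONS))
                child[i] = random.randrange(len(OPTIONS[i]))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("best improvement levels:", best, "fitness:", round(fitness(best), 1))

Because the fitness is itself a Monte Carlo estimate, repeated evaluations of the same chromosome give slightly different values; tolerating that noise is exactly the behavior the paper highlights as a strength of genetic algorithms.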


Journal of Quality Technology | 1981

An Approach to Sensitivity Analysis of Computer Models: Part II - Ranking of Input Variables, Response Surface Validation, Distribution Effect and Technique Synopsis

Ronald L. Iman; Jon C. Helton; James E. Campbell

This is the second part of a two-part paper presenting a statistical approach to the sensitivity analysis of computer models. In this part, consideration is given to response surface construction techniques, including identification of possible overfit and...
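
As a rough illustration of the sampling-and-regression style of analysis described in these two papers (a sketch, not a reproduction of the authors' procedure), the snippet below samples three inputs of a stand-in model, rank-transforms inputs and output, and ranks the variables by the magnitude of their standardized rank regression coefficients; the model, input ranges, and sample size are all assumed.

import numpy as np

rng = np.random.default_rng(0)
n = 100   # number of model runs (assumed)

# three input variables with assumed uniform ranges
x1 = rng.uniform(0.0, 1.0, n)
x2 = rng.uniform(0.0, 10.0, n)
x3 = rng.uniform(-1.0, 1.0, n)

def model(x1, x2, x3):
    """Stand-in for the computer model under study."""
    return 5.0 * x1 + 0.3 * x2 + 0.1 * x3 ** 2 + rng.normal(0.0, 0.1, len(x1))

y = model(x1, x2, x3)

def ranks(a):
    """Rank transform: smallest value gets rank 0, largest rank n-1."""
    return np.argsort(np.argsort(a)).astype(float)

# rank-transform, standardize, and regress output ranks on input ranks
X = np.column_stack([ranks(x) for x in (x1, x2, x3)])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (ranks(y) - ranks(y).mean()) / ranks(y).std()
coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

for name, c in sorted(zip(["x1", "x2", "x3"], coef), key=lambda t: -abs(t[1])):
    print(f"{name}: standardized rank regression coefficient = {c:+.2f}")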


6th American Institute of Aeronautics and Astronautics (AIAA)/USAF/NASA/ISSMO Symposium on Multidisciplinary Analysis and Optimization, Bellevue, WA (United States), 4-6 Sep 1996 | 1996

Optimization of reliability allocation strategies through use of genetic algorithms

James E. Campbell; Laura Painton

This paper examines a novel optimization technique called genetic algorithms and its application to the optimization of reliability allocation strategies. Reliability allocation should occur in the initial stages of design, when the objective is to determine an optimal breakdown or allocation of reliability to certain components or subassemblies in order to meet system specifications. The reliability allocation optimization is applied to the design of a cluster tool, a highly complex piece of equipment used in semiconductor manufacturing. The problem formulation is presented, including decision variables, performance measures and constraints, and genetic algorithm parameters. Piecewise "effort curves" specifying the amount of effort required to achieve a certain level of reliability for each component or subassembly are defined. The genetic algorithm evolves or picks those combinations of "effort" or reliability levels for each component which optimize the objective of maximizing mean time between failures (MTBF) while staying within a budget. The results show that the genetic algorithm is very efficient at finding a set of robust solutions. A time history of the optimization is presented, along with histograms of the solution-space fitness, MTBF, and cost for comparative purposes.
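
To make the "effort curve" idea concrete, here is a toy sketch (invented numbers, not the paper's cluster-tool data) that tabulates discrete effort levels per subassembly and enumerates every combination to find the best MTBF within a budget; an exhaustive picture of the solution space like this is what the genetic algorithm's results are compared against.

import itertools

# Piecewise "effort curve" per subassembly: (effort cost, failure rate) at each level.
EFFORT_CURVES = {
    "loadlock": [(0, 0.0050), (2, 0.0035), (5, 0.0020)],
    "robot":    [(0, 0.0080), (3, 0.0050), (7, 0.0025)],
    "chamber":  [(0, 0.0060), (2, 0.0045), (6, 0.0020)],
}
BUDGET = 10  # total effort units available (assumed)

def evaluate(choice):
    """System MTBF and total cost for one effort level per subassembly (series system)."""
    cost = sum(EFFORT_CURVES[name][level][0] for name, level in choice)
    rate = sum(EFFORT_CURVES[name][level][1] for name, level in choice)
    return 1.0 / rate, cost

names = list(EFFORT_CURVES)
best = None
for levels in itertools.product(*(range(len(EFFORT_CURVES[n])) for n in names)):
    choice = list(zip(names, levels))
    mtbf, cost = evaluate(choice)
    if cost <= BUDGET and (best is None or mtbf > best[0]):
        best = (mtbf, cost, choice)

print("best MTBF %.1f at cost %d:" % (best[0], best[1]), best[2])

Enumeration is feasible only for a tiny example like this; making the same search practical for the full cluster-tool problem is what the genetic algorithm is for.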


Advances in Water Resources | 1981

Distributed velocity method of solving the convective-dispersion equation: 1. Introduction, mathematical theory, and numerical implementation

James E. Campbell; Dennis E. Longsine; Mark Reeves

This is the first part of a two-part paper presenting a new method for treating convective-dispersive transport. The motivation for developing this technique arises from the demands of performing a risk assessment for a nuclear waste repository. These demands include computational efficiency over a relatively large range of Peclet numbers, the ability to handle chains of decaying radionuclides with rather extreme contrasts in both solution velocities and half lives, and the ability to treat leach- or solubility-limited sources. To the extent it has been tested to date, the distributed velocity method (DVM) appears to satisfy these demands. This part contains an overall introduction and presents the mathematical theory, numerical implementation, and example results.
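
For context, the class of problem such a method must solve can be written, in a standard one-dimensional form (not necessarily the exact formulation used in the paper), as the convective-dispersion equation for member $i$ of a decay chain:

\[
R_i \frac{\partial C_i}{\partial t}
  = D \frac{\partial^2 C_i}{\partial x^2}
  - v \frac{\partial C_i}{\partial x}
  - \lambda_i R_i C_i
  + \lambda_{i-1} R_{i-1} C_{i-1},
\]

where $C_i$ is the concentration, $v$ the pore velocity, $D$ the dispersion coefficient, $R_i$ a retardation factor, and $\lambda_i$ the decay constant of chain member $i$. The Peclet number $\mathrm{Pe} = vL/D$ mentioned in the abstract measures the relative strength of convection and dispersion over a length scale $L$.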


Archive | 1991

A self-teaching curriculum for the NRC/SNL (Nuclear Regulatory Commission/Sandia National Laboratory) low-level waste performance assessment methodology

M.S.Y. Chu; M.W. Kozak; James E. Campbell; B.H. Thompson

A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This report provides detailed guidance on input and output procedures for the computer codes recommended for use in the methodology. Seven sample problems are provided for various aspects of a performance assessment analysis of a simple hypothetical conceptual model. When combined, these sample problems demonstrate how the methodology is used to produce a dose history for the site under normal conditions, and to demonstrate an analysis of an intruder scenario. 20 refs., 26 figs., 4 tabs.


Reliability Engineering & System Safety | 1990

Application of generic risk assessment software to radioactive waste disposal

James E. Campbell; Dennis E. Longsine

Monte Carlo methods are used in a variety of applications such as risk assessment, probabilistic safety assessment and reliability analysis. While Monte Carlo methods are simple to use, their application can be laborious. A new microcomputer software package has been developed that substantially reduces the effort required to conduct Monte Carlo analyses. The Sensitivity and Uncertainty Analysis Shell (SUNS) is a software shell in the sense that a wide variety of application models can be incorporated into it. SUNS offers several useful features including a menu-driven environment, a flexible input editor, both Monte Carlo and Latin Hypercube sampling, the ability to perform both repeated trials and parametric studies in a single run, and both statistical and graphical output. SUNS also performs all required file management functions.
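
As an illustration of the Latin hypercube option such a shell provides (a minimal sketch, not SUNS itself; the model and input ranges below are placeholders), each input range is split into equal-probability strata, one point is drawn per stratum, and the strata are shuffled independently per variable before the application model is run at every sample point.

import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Latin hypercube sample; bounds is a list of (low, high) pairs, one per input."""
    d = len(bounds)
    # one uniform draw inside each of n equal-probability strata, per variable
    u = (rng.random((n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(d):
        u[:, j] = u[rng.permutation(n_samples), j]   # shuffle strata independently
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    return lows + u * (highs - lows)

def application_model(x):
    """Placeholder for whatever model would be plugged into the shell."""
    return x[0] ** 2 + 3.0 * x[1]

rng = np.random.default_rng(1)
samples = latin_hypercube(50, [(0.0, 1.0), (10.0, 20.0)], rng)
results = np.array([application_model(row) for row in samples])
print("mean =", results.mean(), " 95th percentile =", np.percentile(results, 95))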


Advances in Water Resources | 1981

Distributed velocity method of solving the convective-dispersion equation: 2. Error analysis and comparison with other methods

James E. Campbell; Dennis E. Longsine; Mark Reeves

This is the second part of a two-part paper presenting a new method for treating convective-dispersive transport. The motivation for developing this technique arises from the demands of performing a risk assessment for a nuclear waste repository. These demands include computational efficiency over a relatively large range of Peclet numbers, the ability to handle chains of decaying radionuclides with rather extreme contrasts in both solution velocities and half lives, and the ability to treat leach- or solubility-limited sources. To the extent it has been tested to date, the distributed velocity method (DVM) appears to satisfy these demands. This part presents an error analysis employing statistical sampling and regression analysis techniques, and comparisons of DVM with other methods for convective-dispersive transport.


Archive | 2005

Human performance modeling for system of systems analytics: soldier fatigue

Craig R. Lawton; James E. Campbell; Dwight Peter Miller

The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives, as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P, 1995). To this end, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g., cockpit simulations), and the Navy's Human Performance Center (HPC), established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex, network-centric systems of systems (SoS). This report presents research and development in the area of HPM in an SoS context. Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.


MRS Proceedings | 1982

An Analysis of Waste Package Behavior for High-Level Waste

Margaret S. Chu; James E. Campbell; Stephen E. Stuckwisch; Krishan K. Wahi; Malcolm Dean Siegel

A sensitivity analysis was performed to determine the impact of the containment time criterion on compliance with the Environmental Protection Agency Draft Standard when using temperature-dependent leach rates. The results of this analysis indicate that canister design life can be shown to be important when the effects of temperature on leach rates are taken into account.

Collaboration


Dive into James E. Campbell's collaborations.

Top Co-Authors

Dennis E. Longsine | Sandia National Laboratories
Laura Painton | Sandia National Laboratories
Jon C. Helton | Sandia National Laboratories
Ronald L. Iman | Sandia National Laboratories
Brenda S. Langkopf | Sandia National Laboratories
Bruce Thompson | Sandia National Laboratories
Daniel Briand | Sandia National Laboratories
Krishan K. Wahi | Sandia National Laboratories
Malcolm Dean Siegel | Sandia National Laboratories
Margaret S. Chu | Sandia National Laboratories