Publication


Featured research published by Katherine Campbell.


Ecological Applications | 1997

OVERSTORY‐IMPOSED HETEROGENEITY IN SOLAR RADIATION AND SOIL MOISTURE IN A SEMIARID WOODLAND

David D. Breshears; Paul M. Rich; Fairley J. Barnes; Katherine Campbell

We thank Clif Meyer, Susan Johnson, Katherine Dayem, Laura Campbell, Robert Lucero, Teng-Chiu Lin, Diana A. Heisler, Kathy E. Lee, and Chris Heil for research assistance; John W. Nyhan for precipitation data; Mary Lu Breshears for editorial assistance; Shawki A. Ibrahim, William K. Lauenroth, and F. Ward Whicker for guidance; and Craig D. Allen, Christopher Field, Geoffrey M. Henebry, Bruce T. Milne, Bradford P.Wilcox, and an anonymous reviewer for their very helpful reviews. This work was supported by the Los Alamos National Environmental Research Park, Los Alamos National Laboratory Environmental Restoration Project, Kansas Applied Remote Sensing Program, the Kansas Biological Survey, and the University of Kansas Research Development and General Research Funds.


Reliability Engineering & System Safety | 2006

Sensitivity analysis when model outputs are functions

Katherine Campbell; Michael D. McKay; Brian J. Williams

When outputs of computational models are time series or functions of other continuous variables such as distance or angle, primary interest may lie in the general pattern or structure of the curve. In these cases, model sensitivity and uncertainty analysis focuses on the effect of model input choices and uncertainties on the overall shapes of such curves. We explore methods for characterizing a set of functions generated by a series of model runs for the purpose of exploring relationships between these functions and the model inputs.
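One way to characterize a set of curves from a series of model runs and relate them back to the inputs can be sketched as follows. The toy simulator, the input ranges, and the use of principal components to summarize curve shapes are all illustrative assumptions here, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "simulator": each run maps two scalar inputs to a curve y(t).
t = np.linspace(0, 1, 50)
def simulate(a, b):
    return a * np.sin(2 * np.pi * t) + b * t

# A series of model runs with sampled inputs.
inputs = rng.uniform(0.5, 2.0, size=(200, 2))
curves = np.array([simulate(a, b) for a, b in inputs])

# Characterize the set of curves by their leading principal components,
# then relate each run's component scores back to the inputs.
centered = curves - curves.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T          # each curve reduced to 2 numbers

# Correlation between input i and score j indicates which input drives
# which mode of variation in the curve shapes.
corr = np.corrcoef(np.hstack([inputs, scores]).T)[:2, 2:]
print(np.round(corr, 2))
```

Reducing each curve to a few scores turns functional sensitivity analysis into an ordinary multivariate problem, which is the spirit of the question the abstract raises.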


Reliability Engineering & System Safety | 2006

Statistical calibration of computer simulations

Katherine Campbell

This paper surveys issues associated with the statistical calibration of physics-based computer simulators. Even in solidly physics-based models there are usually a number of parameters that are suitable targets for calibration. Statistical calibration means refining the prior distributions of such uncertain parameters based on matching some simulation outputs with data, as opposed to the practice of “tuning” or point estimation that is commonly called calibration in non-statistical contexts. Older methods for statistical calibration are reviewed before turning to recent work in which the calibration problem is embedded in a Gaussian process model. In procedures of this type, parameter estimation is carried out simultaneously with the estimation of the relationship between the calibrated simulator and truth.
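The distinction the abstract draws, refining a prior distribution rather than tuning a point estimate, can be illustrated with a minimal grid-based Bayes update. The simulator, the prior, the data, and the noise level below are all invented for illustration; this is far simpler than the Gaussian process machinery the paper surveys:

```python
import numpy as np

# Uncertain physics parameter theta with a prior; the "simulator" maps
# theta to a predicted observable. All numbers are illustrative.
def simulator(theta):
    return theta ** 2 + 1.0

theta_grid = np.linspace(0.0, 3.0, 301)
prior = np.exp(-0.5 * ((theta_grid - 1.5) / 1.0) ** 2)   # N(1.5, 1) prior
prior /= prior.sum()

# Field data: noisy observations of the real observable.
data = np.array([3.1, 2.8, 3.3])
sigma = 0.3                                              # known noise sd

# Statistical calibration: refine the prior into a posterior by matching
# simulator output to the data, rather than picking one "tuned" value.
preds = simulator(theta_grid)
loglik = -0.5 * ((data[:, None] - preds) / sigma) ** 2
posterior = prior * np.exp(loglik.sum(axis=0))
posterior /= posterior.sum()

post_mean = (theta_grid * posterior).sum()
print(f"posterior mean of theta ~ {post_mean:.2f}")
```

The output of calibration here is a whole distribution for theta, so downstream predictions can carry the remaining parameter uncertainty instead of a single tuned number.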


Technometrics | 1998

Setting Environmental Standards: The Statistical Approach to Handling Uncertainty and Variation

Katherine Campbell



Human and Ecological Risk Assessment | 2000

Separating Variability and Uncertainty in Environmental Risk Assessment—Making Choices

Elizabeth J. Kelly; Katherine Campbell

This article reviews some of the current guidance concerning the separation of variability and uncertainty in presenting the results of human health and ecological risk assessments. Such guidance and some of the published examples of its implementation using two-stage Monte Carlo simulation methods have not emphasized the fact that there is considerable judgment involved in determining which input parameters can be modeled as purely variable or purely uncertain, and which require explicit treatment in both dimensions. Failure to discuss these choices leads to confusion and misunderstanding of the proposed methods. We conclude with an example illustrating some of the reasoning and statistical calculations that might be used to inform such choices.
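The two-stage Monte Carlo structure the article discusses can be sketched as a nested simulation: the outer loop draws *uncertain* (epistemic) parameters and the inner loop draws *variability* (aleatory) across individuals. The exposure distributions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

n_outer, n_inner = 200, 1000
p95s = []
for _ in range(n_outer):
    # Uncertainty: the mean log-exposure is not known precisely.
    mu = rng.normal(loc=1.0, scale=0.2)
    # Variability: individual exposures vary around that mean.
    exposures = rng.lognormal(mean=mu, sigma=0.5, size=n_inner)
    # Summarize the variability distribution for this uncertainty draw.
    p95s.append(np.percentile(exposures, 95))

# The spread of the 95th percentile across outer draws expresses
# uncertainty about a quantity that itself describes variability.
lo, hi = np.percentile(p95s, [5, 95])
print(f"95th-percentile exposure: 90% uncertainty interval ({lo:.1f}, {hi:.1f})")
```

The judgment call the article emphasizes lives in the modeling choice above: deciding which inputs belong in the outer (uncertain) loop, which in the inner (variable) loop, and which need both dimensions.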


Journal of Contaminant Hydrology | 2003

Chlorine-36 data at Yucca Mountain: statistical tests of conceptual models for unsaturated-zone flow.

Katherine Campbell; Andrew V. Wolfsberg; June Fabryka‐Martin; Donald S. Sweetkind

An extensive set of chlorine-36 (36Cl) data has been collected in the Exploratory Studies Facility (ESF), an 8-km-long tunnel at Yucca Mountain, Nevada, for the purpose of developing and testing conceptual models of flow and transport in the unsaturated zone (UZ) at this site. At several locations, the measured values of 36Cl/Cl ratios for salts leached from rock samples are high enough to provide strong evidence that at least a small component of bomb-pulse 36Cl, fallout from atmospheric testing of nuclear devices in the 1950s and 1960s, was measured, implying that some fraction of the water traveled from the ground surface through 200-300 m of unsaturated rock to the level of the ESF during the last 50 years. These data are analyzed here using a formal statistical approach based on log-linear models to evaluate alternative conceptual models for the distribution of such fast flow paths. The most significant determinant of the presence of bomb-pulse 36Cl in a sample from the welded Topopah Spring unit (TSw) is the structural setting from which the sample was collected. Our analysis generally supports the conceptual model that a fault that cuts through the nonwelded Paintbrush tuff unit (PTn) that overlies the TSw is required in order for bomb-pulse 36Cl to be transmitted to the sample depth in less than 50 years. Away from PTn-cutting faults, the ages of water samples at the ESF appear to be a strong function of the thickness of the nonwelded tuff between the ground surface and the ESF, due to slow matrix flow in that unit.
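The kind of formal test that underlies log-linear modeling of count data can be sketched on a contingency table. The counts below are invented, not the Yucca Mountain data; they only illustrate testing whether bomb-pulse detection is associated with structural setting:

```python
import numpy as np

# Illustrative 2x2 table: rows = structural setting, columns = 36Cl result.
#                 bomb-pulse   no bomb-pulse
table = np.array([[12,  8],    # sampled near a PTn-cutting fault
                  [ 3, 47]])   # sampled away from such faults

# Likelihood-ratio (G) statistic for independence, the building block of
# log-linear models for count data.
total = table.sum()
expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / total
g = 2.0 * (table * np.log(table / expected)).sum()

# Compare g to a chi-square with 1 degree of freedom; g >> 3.84 means
# setting and bomb-pulse presence are strongly associated.
print(f"G = {g:.1f} (chi-square 5% critical value, 1 df: 3.84)")
```

A log-linear analysis of the real data works with the same likelihood-ratio statistics, extended to more factors (structural setting, unit thickness, and so on) at once.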


IEEE Computer | 1974

Digital image processing at EG&G

George W. Wecksung; Katherine Campbell

EG&G, Inc., a leader in the field of photographic data acquisition and analysis for over 25 years, has for the past five years been placing increased emphasis on digital image processing in support of the Field Testing Division of the Los Alamos Scientific Laboratory under AEC Contract No. AT(29–1)1183. During that time, capabilities that originally involved photogrammetric and radiometric analysis have been augmented to include two-dimensional Fourier frequency analysis and spatial filtering of images. This article describes some of the image processing equipment at EG&G, along with the results obtained using this equipment, and discusses digital holography in greater detail.
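The two-dimensional Fourier filtering the article mentions can be sketched in a few lines: transform an image, zero out frequencies above a cutoff, and invert. The random image and cutoff radius are stand-ins, not anything from EG&G's equipment:

```python
import numpy as np

rng = np.random.default_rng(2)
image = rng.normal(size=(64, 64))   # stand-in for a digitized photograph

# Forward 2-D Fourier transform of the image.
spectrum = np.fft.fft2(image)

# Low-pass mask: keep only spatial frequencies below a cutoff radius.
fy = np.fft.fftfreq(64)[:, None]
fx = np.fft.fftfreq(64)[None, :]
mask = np.hypot(fx, fy) < 0.1

# Apply the filter in the frequency domain and transform back.
filtered = np.fft.ifft2(spectrum * mask).real

# Suppressing high frequencies smooths the image, reducing its variance.
print(image.std(), filtered.std())
```

The same pattern with a high-pass or band-pass mask gives edge enhancement or noise suppression, the classic uses of frequency-domain spatial filtering.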


Reliability Engineering & System Safety | 2006

Combined array experiment design

Leslie M. Moore; Michael D. McKay; Katherine Campbell

Experiment plans formed by combining two or more designs, such as orthogonal arrays primarily with 2- and 3-level factors, creating multi-level arrays with subsets of different strength, are proposed for computer experiments that conduct sensitivity analysis. Specific illustrations are designs for 5-level factors with fewer runs than generally required for 5-level orthogonal arrays of strength 2 or more. At least 5 levels for each input are desired to allow for runs at a nominal value, 2 values either side of nominal but within a normal, anticipated range, and two more extreme values either side of nominal. This number of levels allows for a broader range of input combinations to test where a simulation code operates. Five-level factors also allow the possibility of up to fourth-order polynomial models for fitting simulation results, at least in one dimension. By having subsets of runs with more than strength 2, interaction effects may also be considered. The resulting designs have a “checker-board” pattern in lower-dimensional projections, in contrast to the grid projection that occurs with orthogonal arrays. Space-filling properties are also considered as a basis for experiment design assessment.
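A toy version of the combining idea, stacking a 2-level orthogonal array with a 3-level one so each factor visits five levels, can be sketched as follows. The specific arrays and level coding are illustrative assumptions, not the paper's exact plans:

```python
import itertools
import numpy as np

# A 2-level strength-2 orthogonal array (L4) carries the extreme
# settings; a 3-level strength-2 array (L9-style) carries the nominal
# and within-range settings. Stacking them gives each of 3 factors five
# distinct levels in 4 + 9 = 13 runs.

# L4: 3 factors at 2 levels, every pair of columns balanced.
oa2 = np.array([[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0]])

# 3 factors at 3 levels: columns a, b, (a + b) mod 3, also strength 2.
oa3 = np.array([[a, b, (a + b) % 3]
                for a, b in itertools.product(range(3), repeat=2)])

# Code levels as nominal (0), within-range (+/-1), extreme (+/-2).
design = np.vstack([
    np.where(oa2 == 0, -2, 2),   # extremes from the 2-level array
    oa3 - 1,                     # -1, 0, +1 from the 3-level array
])

for col in design.T:
    assert len(set(col)) == 5    # every factor visits all five levels
print(design.shape)              # 13 runs, 3 factors
```

Thirteen runs is far fewer than the 25 a strength-2 orthogonal array with 5-level factors would need, which is the economy the abstract points to.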


Journal of Geochemical Exploration | 1983

R-Mode Factor Analysis Applied to Uranium Exploration in the Montrose Quadrangle, Colorado

Stephen L. Bolivar; Katherine Campbell; George W. Wecksung

R-mode factor analysis is used to describe the relationships among 18 remotely sensed and geochemical data sets (variables) for the Montrose 1° × 2° quadrangle, Colorado, a region that covers 19 000 km². The data sets contain reconnaissance-scale information and include Landsat imagery, airborne geophysical information (eU, eTh, K40, aeromagnetics), elevation, and hydrogeochemical and stream sediment analyses. The elements U, K, Dy, Hf, V, Th, Ca and Ba in sediments and U in waters were selected. The results of the factor analysis for the entire quadrangle are compared to the results for a 50 km × 50 km test area containing several known uranium occurrences. Four factors account for 70.0% of the total variance in the data. These are interpreted as a felsic factor, Landsat factor, economic or mineralization factor (in terms of uranium mineralization and potential mineralized areas), and a volcanic factor. Graphical representations (maps) of the raw data, factor approximations, residuals for each data set, and the four-factor model greatly aid interpretation of the analytic results. We find that data integration techniques and R-mode factor analysis can be used with some success in uranium resource appraisal.
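The core R-mode computation, extracting factors from the correlation matrix of variables and reporting the variance they explain, can be sketched on synthetic data. The matrix below stands in for the 18-variable Montrose data set and is entirely invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic samples x variables matrix with four underlying factors,
# a stand-in for the 18 remotely sensed and geochemical variables.
latent = rng.normal(size=(500, 4))
loadings_true = rng.normal(size=(4, 18))
data = latent @ loadings_true + 0.5 * rng.normal(size=(500, 18))

# R-mode analysis works on the correlation matrix among variables.
corr = np.corrcoef(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Principal-factor loadings and the share of total variance that the
# four leading factors account for (the study reported 70.0%).
loadings = eigvecs[:, :4] * np.sqrt(eigvals[:4])
explained = eigvals[:4].sum() / eigvals.sum()
print(f"four factors explain {100 * explained:.1f}% of the variance")
```

Inspecting which variables load heavily on each factor is what supports interpretations like the study's felsic, Landsat, mineralization, and volcanic factors.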


The Journal of Geology | 1980

Principal Components Analysis as a Tool for Interpreting NURE Aerial Radiometric Survey Data

Fredric L. Pirkle; Katherine Campbell; George W. Wecksung

Since 1974 the Grand Junction, Colorado, Office, U.S. Department of Energy, through its National Uranium Resource Evaluation (NURE) program, has been conducting aerial surveys over various portions of the United States. The multivariate structure of the data suggests that statistical techniques of multivariate analysis are appropriate. Principal components analysis provides immediate insight into the structure of multivariate data and was applied to aerial data collected in the Lubbock and Plainview NTMS quadrangles. The components appear to lend themselves to an interpretation which sheds light on the geological history of the study area and aids in the identification of areas favorable to uranium deposition. The use of this technique may result in different conclusions in various geologic environments; however, it will always allow insight into the structure of the aerial radiometric data.
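Principal components analysis of multichannel radiometric records can be sketched as centering followed by a singular value decomposition. The synthetic channels below (standing in for quantities like eU, eTh, and K-40 counts) are invented and share one dominant signal by construction:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic multichannel aerial radiometric records; a shared "geologic"
# signal drives all four channels at different strengths, plus noise.
n = 1000
base = rng.gamma(shape=5.0, scale=1.0, size=n)
channels = np.column_stack([
    1.0 * base + rng.normal(0, 0.5, n),
    2.0 * base + rng.normal(0, 0.5, n),
    0.5 * base + rng.normal(0, 0.5, n),
    3.5 * base + rng.normal(0, 0.5, n),
])

# PCA: center the data, then take the SVD; the first component captures
# the dominant shared variation among channels.
centered = channels - channels.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
scores = centered @ vt.T    # per-record coordinates on each component

print(np.round(explained, 3))
```

Mapping the per-record component scores back to survey locations is what turns this decomposition into an interpretive tool for geology, as the abstract describes.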

Collaboration


Dive into Katherine Campbell's collaboration.

Top Co-Authors

George W. Wecksung
Los Alamos National Laboratory

Andrew V. Wolfsberg
Los Alamos National Laboratory

Elizabeth J. Kelly
Los Alamos National Laboratory

June Fabryka‐Martin
Los Alamos National Laboratory

Michael D. McKay
Los Alamos National Laboratory

Brian J. Williams
Los Alamos National Laboratory

Donald S. Sweetkind
United States Geological Survey

Fairley J. Barnes
Los Alamos National Laboratory