George T. Flatman
United States Environmental Protection Agency
Publications
Featured research published by George T. Flatman.
Mathematical Geosciences | 1987
Evangelos A. Yfantis; George T. Flatman; Joseph V. Behar
Although several researchers have pointed out advantages and disadvantages of various soil sampling designs in the presence of spatial autocorrelation, a more detailed study is presented herein which examines the geometrical relationships of three sampling designs, namely the square, the equilateral triangle, and the regular hexagon. Each design has advantages and disadvantages with respect to estimation of the semivariogram and with respect to its effect on the mean square error, or variance of error. This research could be used to design optimal sampling strategies; it is based on the theory of regionalized variables, in which the intrinsic hypothesis is satisfied. Among the alternative designs, the equilateral-triangle design gives the most reliable estimate of the semivariogram. It also gives a smaller maximum mean square error of point estimation of the concentration than the other two designs for the same number of measurements when the nugget effect is small relative to the variance. If the nugget effect is large (0.90 σ² or more) and the linear sampling density is greater than 0.85r, where r is the range, the hexagonal design is best. This study computes and compares the maximum mean square error for each of these designs.
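The semivariogram estimation referred to above is, in practice, done from squared differences of paired observations. A minimal sketch of the classical (Matheron) estimator, written here in Python with NumPy rather than taken from the paper, illustrates the quantity whose reliability the three designs are compared on:

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Classical (Matheron) estimator: for each distance bin, average
    0.5 * (z_i - z_j)**2 over all pairs of observations in that bin."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sqdiff = (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)          # each pair counted once
    dists, sqdiff = dists[i, j], sqdiff[i, j]
    lags, gammas = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (dists >= lo) & (dists < hi)
        if in_bin.any():
            lags.append(dists[in_bin].mean())
            gammas.append(0.5 * sqdiff[in_bin].mean())
    return np.array(lags), np.array(gammas)
```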
Environmental Monitoring and Assessment | 1984
George T. Flatman; Angelo Yfantis
A soil sampling strategy for spatially correlated variables is developed using the tools of geostatistical analysis. With a minimum of equations, the logic of geostatistical analysis is traced from the modeling of a semivariogram to the output isomaps of pollution estimates and their standard deviations. These algorithms provide a method to balance precision, accuracy, and costs. Their axiomatic assumptions dictate a two-stage sampling strategy. The first stage is a sampling survey, using a radial grid, to collect enough data to define, by a semivariogram, the ranges of influence and the orientation of the correlation structure of the pollutant plume. The second stage is a census of the suspected area with grid shape, size, and orientation dictated by the semivariogram. The subsequent kriging analysis of these data gives isopleth maps of the pollution field and the standard-error isomap of this contouring. These outputs make the monitoring data understandable for the decision maker.
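The kriging step that turns the second-stage grid data into estimate and standard-deviation isomaps can be illustrated with a generic ordinary-kriging sketch; this is a textbook formulation in Python, not the authors' software, and the `gamma` argument stands for whatever semivariogram model was fitted in the first stage:

```python
import numpy as np

def ordinary_kriging(coords, values, target, gamma):
    """Ordinary kriging at a single target location.
    gamma: a fitted semivariogram model, callable on an array of distances."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(values)
    # Kriging system in semivariogram form, with a Lagrange multiplier for
    # the unbiasedness constraint (weights sum to one).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(coords - target, axis=-1))
    w = np.linalg.solve(A, b)                 # n weights plus the multiplier
    estimate = w[:n] @ values
    variance = w @ b                          # ordinary-kriging error variance
    return estimate, np.sqrt(max(variance, 0.0))
```

Evaluating this at the nodes of a fine grid of target locations yields the estimate and standard-deviation isomaps described above.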
Mathematical Geosciences | 1994
Anita Singh; Ashok K. Singh; George T. Flatman
Samples from hazardous waste site investigations frequently come from two or more statistical populations. Assessment of “background” levels of contaminants can be a significant problem. This problem is being investigated at the U.S. Environmental Protection Agency's Environmental Monitoring Systems Laboratory in Las Vegas. This paper describes a statistical approach for assessing background levels from a dataset. The elevated values that may be associated with a plume or contaminated area of the site are separated from lower values that are assumed to represent background levels. It would be desirable to separate the two populations either spatially, by kriging the data, or chronologically, by a time series analysis, provided an adequate number of samples were properly collected in space and/or time. Unfortunately, quite often the data are too few in number or too poorly collected to support either spatial or time series analysis. Regulations typically call for nothing more than the mean and standard deviation of the background distribution. This paper provides a robust probabilistic approach for gaining this information from poorly collected data that are not suitable for the above-mentioned alternative approaches. We assume that the site has some areas unaffected by the industrial activity, and that a subset of the given sample is from this clean part of the site. We can think of this multivariate data set as coming from two or more populations: the background population and the contaminated populations (with varying degrees of contamination). Using robust M-estimators, we develop a procedure to classify the sample into component populations. We derive robust simultaneous confidence ellipsoids to establish background contamination levels. Some simulated as well as real examples from Superfund site investigations are included to illustrate these procedures. The method presented here is quite general and is suitable for many geological and biological applications.
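As an illustration of the kind of classification described, the sketch below uses a simple Huber-type reweighting of the mean and covariance followed by a chi-square cutoff on robust Mahalanobis distances; it is a hypothetical stand-in for the authors' M-estimation procedure, not their algorithm:

```python
import numpy as np
from scipy import stats

def robust_background_split(X, alpha=0.05, iters=20):
    """Iteratively reweighted location/scatter estimate (a simple Huber-type
    M-estimate) followed by a chi-square cutoff on robust Mahalanobis
    distances; observations below the cutoff are treated as background."""
    X = np.asarray(X, float)
    p = X.shape[1]
    mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
    cutoff = stats.chi2.ppf(1.0 - alpha, df=p)
    for _ in range(iters):
        d2 = np.einsum('ij,jk,ik->i', X - mu, np.linalg.inv(cov), X - mu)
        w = np.where(d2 <= cutoff, 1.0, cutoff / np.maximum(d2, 1e-12))
        mu = (w[:, None] * X).sum(axis=0) / w.sum()
        cov = ((w[:, None] * (X - mu)).T @ (X - mu)) / w.sum()
    d2 = np.einsum('ij,jk,ik->i', X - mu, np.linalg.inv(cov), X - mu)
    return d2 <= cutoff, mu, cov   # background mask, robust mean, robust scatter
```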
Mathematical Geosciences | 1988
Evangelos A. Yfantis; George T. Flatman; Evan J. Englund
Methods suggested in the past for simulating ore concentration or pollution concentration over an area of interest, subject to the condition that the simulated surface pass through specified points, are based on the assumption of normality. A new method is introduced here which is a generalization of the subdivision method used in fractals. This method is based on the construction of a fractal plane-to-line function f(x, y, R, e, u), where (x, y) is in [a, b] × [c, d], R is the autocorrelation function, e is the resolution limit, and u is a random real function on [−1, 1]. The simulation using fractals avoids any distributional assumptions about the data. The given network of points is connected to form quadrilaterals; each quadrilateral is split in ways that are extensions of the well-known subdivision method. The quadrilaterals continue to split and grow until the resolution obtained in both the x and y directions is smaller than a prespecified resolution. If the x coordinate of the ith quadrilateral is in [a_i, b_i] and the y coordinate is in [c_i, d_i], the growth of this quadrilateral is a function of (b_i − a_i) and (d_i − c_i); the quadrilateral can grow toward the positive or negative z axis with equal probability, forming four new quadrilaterals having a common vertex.
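The subdivision idea can be illustrated by the classical midpoint-displacement construction on a square grid, which likewise refines a surface through fixed conditioning points by adding random perturbations that shrink with cell size; this Python sketch is a standard analogue chosen for illustration, not the authors' quadrilateral scheme or their f(x, y, R, e, u) construction:

```python
import numpy as np

def midpoint_displacement(z0, levels, scale=1.0, h=0.5, seed=None):
    """Refine a coarse square grid of conditioning values z0 by repeated
    subdivision: each level inserts edge and cell-centre midpoints and adds a
    uniform random displacement whose amplitude shrinks by 2**(-h) per level.
    Existing grid points are never perturbed, so the surface passes through
    the original conditioning points."""
    rng = np.random.default_rng(seed)
    z = np.asarray(z0, float)
    for level in range(1, levels + 1):
        n = z.shape[0]
        new = np.empty((2 * n - 1, 2 * n - 1))
        new[::2, ::2] = z                                    # keep existing points
        new[1::2, ::2] = 0.5 * (z[:-1, :] + z[1:, :])        # row-edge midpoints
        new[::2, 1::2] = 0.5 * (z[:, :-1] + z[:, 1:])        # column-edge midpoints
        new[1::2, 1::2] = 0.25 * (z[:-1, :-1] + z[1:, :-1]
                                  + z[:-1, 1:] + z[1:, 1:])  # cell centres
        noise = rng.uniform(-1.0, 1.0, new.shape) * scale * 2.0 ** (-h * level)
        noise[::2, ::2] = 0.0                                # fix existing points
        z = new + noise
    return z
```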
Communications in Statistics - Simulation and Computation | 1991
Martin A. Stapanian; Forest C. Garner; Kirk E. Fitzgerald; George T. Flatman; Evan J. Englund
Mardia's multivariate kurtosis and the generalized distance have desirable properties as multivariate outlier tests. However, extensive critical values have not been published heretofore. A published approximation formula for critical values of the kurtosis is shown to inadequately control the type I error rate, with observed error rates often differing from their intended values by a factor of two or more. Critical values derived from simulations for both tests, for up to 25 dimensions and 500 observations, are presented. The power curves of both tests are discussed. The generalized distance is the more powerful test when exactly one outlier is present and the contaminant is substantially mean-shifted; however, as the number of outliers increases, the kurtosis becomes the more powerful test. The two tests are compared with respect to power and vulnerability to masking. Recommendations for the use of these tests and interpretation of results are given.
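Both test statistics are simple functions of squared Mahalanobis distances; the following Python/NumPy sketch (assumed here, not taken from the paper) shows the sample versions whose critical values the simulations tabulate:

```python
import numpy as np

def mardia_kurtosis(X):
    """Mardia's multivariate kurtosis b_{2,p}: the mean of the fourth powers
    of the Mahalanobis distances from the sample mean (biased covariance)."""
    X = np.asarray(X, float)
    diff = X - X.mean(axis=0)
    s_inv = np.linalg.inv(np.cov(X, rowvar=False, bias=True))
    d2 = np.einsum('ij,jk,ik->i', diff, s_inv, diff)
    return np.mean(d2 ** 2)

def max_generalized_distance(X):
    """Largest squared Mahalanobis (generalized) distance from the sample
    mean: the classical single-outlier test statistic."""
    X = np.asarray(X, float)
    diff = X - X.mean(axis=0)
    s_inv = np.linalg.inv(np.cov(X, rowvar=False))
    return np.einsum('ij,jk,ik->i', diff, s_inv, diff).max()
```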
Environmental Monitoring and Assessment | 1991
Thomas H. Starks; George T. Flatman
The problems of developing and comparing statistical procedures appropriate to the monitoring of ground water at hazardous waste sites are discussed. It is suggested that these decision procedures should be viewed as quality control schemes and compared in the same way that industrial quality control schemes are compared. The results of a Monte Carlo simulation study of the run-length distribution of a combined Shewhart-CUSUM quality control scheme are reported.
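A run-length simulation of the kind reported can be sketched as follows; the reference value k, decision interval h, and Shewhart limit used below are illustrative placeholders, not the values studied in the paper:

```python
import numpy as np

def shewhart_cusum_run_lengths(mean_shift=0.0, k=0.5, h=5.0, shewhart_limit=4.0,
                               n_sim=10_000, max_n=100_000, seed=None):
    """Simulate run lengths of a combined upper-sided Shewhart-CUSUM scheme on
    standardized observations: signal when the CUSUM exceeds h or a single
    observation exceeds the Shewhart limit."""
    rng = np.random.default_rng(seed)
    run_lengths = np.empty(n_sim, dtype=int)
    for s in range(n_sim):
        cusum, t = 0.0, 0
        while t < max_n:
            t += 1
            x = rng.normal(mean_shift, 1.0)
            cusum = max(0.0, cusum + x - k)
            if cusum > h or x > shewhart_limit:
                break
        run_lengths[s] = t
    return run_lengths      # run_lengths.mean() estimates the average run length
```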
Computers & Geosciences | 1988
Evangelos A. Yfantis; George T. Flatman
Sampling dependent random variables that constitute a nonstationary spatial two-dimensional random process, the nonstationarity being due to the presence of drift, poses a difficult problem: determining the optimum sampling design and sampling density (number of samples) needed to attain a desired precision, as expressed by the mean square error (MSE). The complexity of the problem prevents the development of a closed-form solution expressing the number of samples needed to attain a given precision. Thus a user-friendly FORTRAN program is given that solves this problem for a spherical semivariogram; the equilateral-triangle, square, and hexagonal designs; and no drift, linear drift, and quadratic drift.
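The quantity the program tabulates can be approximated, for the no-drift square-grid case, by the kriging estimation variance at the centre of a grid cell; the self-contained Python sketch below uses simple kriging on the four surrounding nodes and a spherical semivariogram, as an illustration rather than a translation of the FORTRAN code:

```python
import numpy as np

def cell_centre_mse(spacing, nugget, sill, range_):
    """Simple-kriging estimation variance at the centre of a square grid cell,
    using only the four surrounding grid nodes and a spherical semivariogram
    with the given nugget, sill, and range; an illustrative approximation of
    the worst-case MSE for the square design."""
    def gamma(h):
        h = np.asarray(h, float)
        g = nugget + (sill - nugget) * (1.5 * h / range_ - 0.5 * (h / range_) ** 3)
        return np.where(h >= range_, sill, np.where(h == 0.0, 0.0, g))
    cov = lambda h: sill - gamma(h)           # covariance from the semivariogram
    corners = spacing * np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)
    centre = np.array([spacing / 2.0, spacing / 2.0])
    C = cov(np.linalg.norm(corners[:, None, :] - corners[None, :, :], axis=-1))
    c0 = cov(np.linalg.norm(corners - centre, axis=-1))
    return sill - c0 @ np.linalg.solve(C, c0)
```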
Chemometrics and Intelligent Laboratory Systems | 1988
Evangelos A. Yfantis; George T. Flatman; Forest C. Garner
This paper develops an algorithm to compute the optimal frequency of calibration monitoring so as to minimize the total cost of analyzing a set of samples and the required calibration standards. Optimum calibration monitoring is needed because of the high cost and calibration drift of the analyzing equipment. Gas chromatographs with mass spectroscopy equipment give trace analyses with previously unattainable precision, accuracy, and turnaround speed when properly calibrated and systematically monitored. Calibration monitoring is done by including a known standard after every fixed number of samples. Currently, this fixed number, or calibration monitoring frequency, is chosen arbitrarily; this paper gives a method to compute the optimal calibration monitoring frequency so that the total analysis costs for both samples and standards are minimized.
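The optimization can be sketched as a brute-force search over candidate intervals; the quadratic drift-penalty term below is an assumed illustrative cost model, not the cost function derived in the paper:

```python
import math

def optimal_calibration_interval(n_samples, standard_cost, drift_penalty,
                                 max_interval=100):
    """Brute-force search for the calibration interval m (samples between
    standards) that minimizes a simple total-cost model: cost of the standards
    plus a drift penalty assumed to grow quadratically with the interval.
    The quadratic penalty is an illustrative assumption only."""
    best_m, best_cost = None, math.inf
    for m in range(1, max_interval + 1):
        n_standards = math.ceil(n_samples / m)
        cost = n_standards * standard_cost + n_samples * drift_penalty * m ** 2
        if cost < best_cost:
            best_m, best_cost = m, cost
    return best_m, best_cost
```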
Analytica Chimica Acta | 1993
Malwane M. A. Ananda; Ashok K. Singh; George T. Flatman
The problem of estimating the product of the means of three independent normal distributions is considered. Estimation problems of this type arise in many environmental applications, such as exposure assessment and risk modeling. Classical confidence interval estimates are available in the literature; we consider the problem from the Bayesian approach, using two different proper prior distributions and one non-informative prior distribution. Assuming quadratic loss, Bayesian estimates and Bayesian confidence intervals are given. Numerical integration or simulation is necessary to evaluate these confidence intervals. Computer programs written in Fortran are given to calculate the intervals. Examples are provided.
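For the non-informative-prior case, the posterior of each mean is a scaled Student-t, so the Bayes estimate under quadratic loss and an equal-tailed interval can be obtained by simulation; the sketch below is a generic Monte Carlo illustration in Python (the paper's own programs are in Fortran):

```python
import numpy as np

def product_of_means_posterior(samples, n_draws=100_000, cred=0.95, seed=None):
    """Monte Carlo posterior for the product of the means of three independent
    normal samples under the usual non-informative prior, for which each mean
    has a scaled Student-t posterior with n_i - 1 degrees of freedom. Returns
    the posterior mean (the Bayes estimate under quadratic loss) and an
    equal-tailed credible interval."""
    rng = np.random.default_rng(seed)
    draws = np.ones(n_draws)
    for x in samples:                         # three independent samples
        x = np.asarray(x, float)
        n, mean, sd = len(x), x.mean(), x.std(ddof=1)
        draws *= mean + sd / np.sqrt(n) * rng.standard_t(n - 1, n_draws)
    lo, hi = np.quantile(draws, [(1 - cred) / 2, (1 + cred) / 2])
    return draws.mean(), (lo, hi)
```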
Archive | 1993
Evangelos A. Yfantis; George T. Flatman; F. Miller
A parametric surface estimation algorithm is examined. The algorithm is a perfect interpolator. The points surrounding the point to be estimated are weighted according to the length of their paths from the point to be estimated, not their Euclidean distance from that point. The algorithm is capable of estimating surfaces that are not functions and that twist, turn, and fold into three-dimensional space in any direction. The big advantage of this family of algorithms is that they do not require the process from which the data came to satisfy the intrinsic hypothesis or to be second-order stationary. Furthermore, they do not require equal distances between sampling points or continuity of the first or second derivatives. From the computational point of view, they do not require matrix inversion. This family of methods is therefore robust. Given any set of points in three-dimensional space, we show that this family of interpolators converges and always produces a surface. The disadvantage of this method is that, owing to the lack of strict assumptions, it is difficult to calculate the error of estimation. Under the assumption of stationarity we calculate the error of the estimate produced by interpolating with our method; thus, under the assumption of stationarity, our method can be compared with kriging.
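The path-length weighting can be illustrated with a small sketch that measures distance along a network of edges (via shortest paths) rather than straight-line distance; the edge list, the inverse-power weights, and treating the target as a network node are assumptions made for this illustration, not the paper's algorithm:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def path_weighted_estimate(points, values, edges, target, power=2.0):
    """Estimate the value at node 'target' by weighting the other nodes with
    the inverse of their shortest-path length through the network of 'edges'
    (pairs of node indices), rather than their Euclidean distance."""
    points = np.asarray(points, float)
    values = np.asarray(values, float)        # the entry at 'target' is ignored
    n = len(points)
    i, j = np.asarray(edges, dtype=int).T
    lengths = np.linalg.norm(points[i] - points[j], axis=1)
    graph = csr_matrix((np.r_[lengths, lengths], (np.r_[i, j], np.r_[j, i])),
                       shape=(n, n))
    dist = dijkstra(graph, indices=target)    # path lengths to every node
    known = np.array([k for k in range(n)
                      if k != target and np.isfinite(dist[k]) and dist[k] > 0])
    weights = 1.0 / dist[known] ** power
    return weights @ values[known] / weights.sum()
```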