Publication


Featured research published by Clayton V. Deutsch.


Technometrics | 1998

Geostatistical Software Library and User's Guide

Eric R. Ziegel; Clayton V. Deutsch; Andre G. Journel

This book will be an important text for most geostatisticians, including graduate students and experts in the field of practical geostatistics. The guts of this volume are the two high-density IBM disks that come with it and contain 37 programs which can be run in both UNIX and DOS environments and are not machine specific. The programs are aimed at three major areas of geostatistics: quantifying spatial variability (variograms), generalized linear regression techniques (kriging), and stochastic simulation. In all, some 80 source files are included with the distribution diskettes. The programs are not distributed as executables and must be compiled before they can be run; a machine with a Fortran compiler is therefore required. The intent of the authors is to make this suite of programs accessible to anyone who wants to use them. The source code of these programs has been assembled, developed, tested, and tried at Stanford University over a period of some 12 years. Though this library of programs is not intended as a commercial product, it represents a gold mine for those who need a jump start into the field of geostatistics.
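As a flavor of the first of those three areas, here is a minimal Python sketch of an experimental semivariogram for regularly spaced one-dimensional data; it only illustrates the quantity GSLIB's variogram programs compute, and is not a port of the Fortran code.

# Minimal sketch: experimental semivariogram for 1-D regularly spaced data,
# the kind of spatial-variability summary GSLIB's variogram programs compute.
import numpy as np

def experimental_variogram(values, max_lag):
    """Return gamma(h) = 0.5 * mean((z(x+h) - z(x))**2) for h = 1..max_lag."""
    values = np.asarray(values, dtype=float)
    gammas = []
    for h in range(1, max_lag + 1):
        diffs = values[h:] - values[:-h]
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(gammas)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z = np.cumsum(rng.normal(size=200))      # a correlated 1-D series
    print(experimental_variogram(z, max_lag=10))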


Mathematical Geosciences | 2001

Teacher's Aide: Variogram Interpretation and Modeling

Emmanuel Gringarten; Clayton V. Deutsch

The variogram is a critical input to geostatistical studies: (1) it is a tool to investigate and quantify the spatial variability of the phenomenon under study, and (2) most geostatistical estimation or simulation algorithms require an analytical variogram model, which they will reproduce with statistical fluctuations. In the construction of numerical models, the variogram reflects some of our understanding of the geometry and continuity of the variable, and can have a very important impact on predictions from such numerical models. The principles of variogram modeling are developed and illustrated with a number of practical examples. A three-dimensional interpretation of the variogram is necessary to fully describe geologic continuity. Directional continuity must be described simultaneously to be consistent with principles of geological deposition and for a legitimate measure of spatial variability for geostatistical modeling algorithms. Interpretation principles are discussed in detail. Variograms are modeled with particular functions for reasons of mathematical consistency. Used correctly, such variogram models account for the experimental data, geological interpretation, and analogue information. The steps in this essential data integration exercise are described in detail through the introduction of a rigorous methodology.
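For concreteness, one of the licit analytical functions referred to above is the nugget-plus-spherical variogram model; the Python sketch below is illustrative, with placeholder parameter values.

# Sketch of an analytical variogram model: nugget plus a spherical structure.
# Licit models such as this one guarantee a positive-definite covariance.
import numpy as np

def spherical_variogram(h, nugget=0.1, sill=1.0, a_range=50.0):
    """gamma(h) for a nugget + spherical model; h may be a scalar or array."""
    h = np.asarray(h, dtype=float)
    hr = np.minimum(h / a_range, 1.0)
    gamma = nugget + (sill - nugget) * (1.5 * hr - 0.5 * hr ** 3)
    return np.where(h == 0.0, 0.0, gamma)    # gamma(0) = 0 by definition

print(spherical_variogram([0, 10, 25, 50, 100]))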


Mathematical Geosciences | 1996

Hierarchical object-based stochastic modeling of fluvial reservoirs

Clayton V. Deutsch; Libing Wang

This paper describes a novel approach to modeling braided stream fluvial reservoirs. The approach is based on a hierarchical set of coordinate transformations involving relative stratigraphic coordinates, translations, rotations, and straightening functions. The emphasis is placed on geologically sound geometric concepts and realistically attainable conditioning statistics, including areal and vertical facies proportions. Modeling proceeds in a hierarchical fashion, that is, (1) a stratigraphic coordinate system is established for each reservoir layer, (2) a number of channel complexes are positioned within each layer, and then (3) channels are positioned within each channel complex. The geometric specification of each sand-filled channel within the background of floodplain shales is a marked point process. Each channel is marked with a starting location, size parameters, and sinuosity parameters. We present the hierarchy of eight coordinate transformations, introduce an analytical expression for the channel cross-section shape, describe the simulation algorithm, and demonstrate how the realizations are made to honor local conditioning data from wells and global conditioning data such as areal and vertical proportions.
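A minimal sketch of the marked-point-process idea follows: each channel object is a point (its starting location) carrying marks for size and sinuosity. The parameter distributions are placeholders chosen for illustration, not the calibrated inputs used in the paper.

# Illustrative sketch of the marked-point-process idea: each channel object is
# a "point" (starting location) carrying marks (size and sinuosity parameters).
# The distributions below are placeholders, not the paper's calibrated inputs.
import random

def draw_channels(n_channels, layer_width, layer_thickness, seed=0):
    rng = random.Random(seed)
    channels = []
    for _ in range(n_channels):
        channels.append({
            "start_x": rng.uniform(0.0, layer_width),        # starting location
            "top_z": rng.uniform(0.0, layer_thickness),
            "width": rng.triangular(50.0, 300.0, 150.0),      # size marks
            "thickness": rng.triangular(2.0, 10.0, 5.0),
            "sinuosity_amp": rng.uniform(0.0, 100.0),         # sinuosity marks
            "sinuosity_wavelength": rng.uniform(200.0, 1000.0),
        })
    return channels

for ch in draw_channels(3, layer_width=2000.0, layer_thickness=30.0):
    print(ch)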


Computers & Geosciences | 2002

FLUVSIM: a program for object-based stochastic modeling of fluvial depositional systems

Clayton V. Deutsch; Thomas T. Tran

This paper presents a FORTRAN program for hierarchical object-based modeling of complex fluvial facies. Unique features of this program include (1) a simple approach to place channel, levee, and crevasse sands within a matrix of floodplain shales, (2) templates for fast rastering of fluvial facies objects, leading to fast CPU times, and (3) the use of simulated annealing and non-random perturbation rules for conditioning to extensive soft facies-proportion data and local well data. Object-based modeling techniques are widely applicable to modeling fluvial depositional systems. Public domain software for such modeling is rare and inflexible with respect to the variety of conditioning data that can be handled. Commercial software is costly and also of limited flexibility. The fluvsim program overcomes many of these limitations with an accessible research code.
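To illustrate the rastering step in the simplest possible terms, the sketch below paints a single channel cross-section onto a regular (x, z) facies grid; the parabolic thickness profile is an assumed shape for illustration, not necessarily the fluvsim template.

# Sketch of rastering one channel cross-section onto a 2-D (x, z) facies grid.
# A parabolic thickness profile is assumed here purely for illustration; the
# fluvsim templates are more elaborate.
import numpy as np

def raster_channel(grid, center_x, top_z, width, thickness, dx, dz, code=1):
    nx, nz = grid.shape
    for ix in range(nx):
        x = (ix + 0.5) * dx
        rel = 2.0 * (x - center_x) / width           # -1 .. 1 across the channel
        if abs(rel) > 1.0:
            continue
        local_t = thickness * (1.0 - rel ** 2)        # parabolic cross-section
        iz_top = int(top_z / dz)
        iz_bot = int((top_z + local_t) / dz)
        grid[ix, iz_top:iz_bot + 1] = code            # overwrite floodplain shale
    return grid

facies = np.zeros((100, 40), dtype=int)               # 0 = floodplain shale
raster_channel(facies, center_x=50.0, top_z=10.0, width=60.0,
               thickness=8.0, dx=1.0, dz=0.5)
print(facies.sum(), "cells flagged as channel sand")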


Mathematical Geosciences | 1993

Entropy and spatial disorder

Andre G. Journel; Clayton V. Deutsch

The majority of geostatistical estimation and simulation algorithms rely on a covariance model as the sole characteristic of the spatial distribution of the attribute under study. The limitation to a single covariance implicitly calls for a multivariate Gaussian model for either the attribute itself or for its normal scores transform. The Gaussian model could be justified on the basis that it is both analytically simple and a maximum-entropy model, i.e., a model that minimizes unwarranted structural properties. As a consequence, the Gaussian model also maximizes spatial disorder (beyond the imposed covariance), which can cause flow simulation results performed on multiple stochastic images to be very similar; thus, the space of response uncertainty could be too narrow, entailing a misleading sense of safety. The ability of the sole covariance to adequately describe spatial distributions for flow studies, and the assumption that maximum spatial disorder amounts to either no additional information or a safe prior hypothesis, are questioned. This paper attempts to clarify the link between entropy and spatial disorder and to provide, through a detailed case study, an appreciation for the impact of the entropy of prior random function models on the resulting response distributions.
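For reference, the standard result underlying the maximum-entropy argument (a textbook fact, not specific to this paper): for an N-dimensional Gaussian random vector with covariance matrix \Sigma, the differential entropy is

H = \tfrac{1}{2} \ln\!\big( (2\pi e)^{N} \, \lvert \Sigma \rvert \big),

and among all distributions sharing the covariance \Sigma, the multivariate Gaussian attains the largest H. Adopting the Gaussian model therefore adds no spatial structure beyond the imposed covariance, which is precisely the "maximum spatial disorder" questioned above.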


SPE Formation Evaluation | 1989

Calculating Effective Absolute Permeability in Sandstone/Shale Sequences

Clayton V. Deutsch

In this paper two averaging algorithms are proposed for determining block effective absolute permeability. The experimental relationship between the effective permeability, the volume fraction of shale, and the anisotropy of the shales is first observed through repeated flow simulations. A power-averaging model and a percolation model are proposed to fit the experimentally observed relationship. The power-averaging model provides a surprisingly easy and efficient way to calculate block effective absolute permeability. A simple graph is given to determine the averaging power from the geometric anisotropy (aspect ratio) of the shales for both vertical and horizontal steady-state flow. A correction for large shales relative to small gridblocks is also proposed.
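As a reference point for the power-averaging model, the sketch below computes an omega-power average of cell permeabilities; omega = 1 gives the arithmetic mean, omega = -1 the harmonic mean, and omega -> 0 the geometric mean. The permeability values and shale fraction are illustrative, not results from the paper.

# Sketch of the power-averaging idea: an effective permeability obtained as a
# power (omega) average of cell permeabilities. The omega values below are
# illustrative; the paper relates omega to shale anisotropy and flow direction.
import numpy as np

def power_average(perms, omega):
    perms = np.asarray(perms, dtype=float)
    if abs(omega) < 1e-6:                        # limiting case: geometric mean
        return float(np.exp(np.mean(np.log(perms))))
    return float(np.mean(perms ** omega) ** (1.0 / omega))

rng = np.random.default_rng(1)
k = np.where(rng.random(10000) < 0.2, 0.01, 500.0)   # 20 % shale, 80 % sand (md)
for omega in (1.0, 0.5, 0.0, -0.5, -1.0):
    print(f"omega = {omega:+.1f}  k_eff = {power_average(k, omega):10.3f} md")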


Mathematical Geosciences | 1994

Practical considerations in the application of simulated annealing to stochastic simulation

Clayton V. Deutsch; Perry W. Cockerham

Realizations generated by conditional simulation techniques must honor as much data as possible to be reliable numerical models of the attribute under study. The application of optimization methods such as simulated annealing to stochastic simulation has the potential to honor more data than conventional geostatistical simulation techniques. The essential feature of this approach is the formulation of stochastic imaging as an optimization problem with some specified objective function. The data to be honored by the stochastic images are coded as components in a global objective function. This paper describes the basic algorithm and then addresses a number of practical questions: (1) what are the criteria for adding a component to the global objective function? (2) what perturbation mechanism should be employed in the annealing simulation? (3) when should the temperature be lowered in the annealing procedure? (4) how are edge/border nodes handled? (5) how are local conditioning data handled? and (6) how are multiple components weighted in the global objective function?
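The sketch below shows the bare bones of such an annealing simulation on a one-dimensional image: the objective is the mismatch to a single target variogram lag, the perturbation is a swap of two node values (which preserves the histogram), and the acceptance rule is the usual Metropolis criterion with a geometric cooling schedule. All numerical settings are illustrative.

# Minimal simulated-annealing sketch for stochastic imaging.
import numpy as np

rng = np.random.default_rng(2)
image = rng.permutation(np.linspace(0.0, 1.0, 200))   # fixed histogram
target_gamma1 = 0.02                                   # target semivariance at lag 1

def objective(z):
    d = z[1:] - z[:-1]
    return (0.5 * np.mean(d ** 2) - target_gamma1) ** 2

temp, cooling = 1e-3, 0.95
obj = objective(image)
for step in range(20000):
    i, j = rng.integers(0, image.size, size=2)
    image[i], image[j] = image[j], image[i]            # perturbation: swap
    new_obj = objective(image)
    if new_obj < obj or rng.random() < np.exp(-(new_obj - obj) / temp):
        obj = new_obj                                   # accept the perturbation
    else:
        image[i], image[j] = image[j], image[i]         # reject: undo the swap
    if step % 500 == 0:
        temp *= cooling                                  # lower the temperature
print("final objective:", obj)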


Computers & Geosciences | 1996

Correcting for negative weights in ordinary kriging

Clayton V. Deutsch

Negative weights in ordinary kriging (OK) arise when data close to the location being estimated screen outlying data. Depending on the variogram and the amount of screening, the negative weights can be significant; nothing in the OK algorithm constrains the weights to be non-negative. Negative weights, when interpreted as probabilities for constructing a local conditional distribution, are nonphysical. Also, negative weights applied to high data values may lead to negative and nonphysical estimates. In these situations the negative weights in ordinary kriging must be corrected. An algorithm is presented to reset negative kriging weights and compensating positive weights to zero. The sum of the remaining nonzero weights is restandardized to 1.0 to ensure unbiasedness. The situations in which this correction is appropriate are described and a number of examples are given.
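A simplified sketch of such a correction is given below: negative weights are zeroed, positive weights smaller than the mean magnitude of the negative weights are also zeroed (a simplifying assumption standing in for the paper's compensation rule), and the remaining weights are restandardized to sum to 1.

# Simplified sketch of the negative-weight correction; the threshold used for
# the compensating positive weights is an assumption, not the paper's exact rule.
import numpy as np

def correct_ok_weights(weights):
    w = np.asarray(weights, dtype=float)
    neg = w < 0.0
    if not neg.any():
        return w
    threshold = np.mean(-w[neg])                  # mean magnitude of negatives
    w = np.where(neg | (w < threshold), 0.0, w)   # zero negatives + small positives
    return w / w.sum()                            # restandardize for unbiasedness

print(correct_ok_weights([0.55, 0.35, 0.02, -0.08, 0.16]))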


Computers & Geosciences | 1989

DECLUS: a FORTRAN 77 program for determining optimum spatial declustering weights

Clayton V. Deutsch

Most data collected in the Earth Sciences are clustered preferentially. The clustering may be in high or low “grade” zones, or the data may be clustered in areas that are easily accessible to sampling. Because all statistical and geostatistical analysis requires a distribution that is representative of the entire area of interest, a declustering procedure is necessary. This paper presents a FORTRAN 77 program to compute declustering weights by a modified cell declustering procedure. An example is given and the results are compared to polygonal declustering and global kriging.
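The core of cell declustering can be sketched in a few lines: overlay a grid of cells, weight each datum by the inverse of the number of data sharing its cell, and normalize the weights to average one. DECLUS additionally scans a range of cell sizes and origin offsets; the single fixed cell size below is a simplification.

# Sketch of cell declustering for 2-D data with one fixed cell size.
import numpy as np

def cell_decluster(x, y, cell_size):
    ix = np.floor(np.asarray(x) / cell_size).astype(int)
    iy = np.floor(np.asarray(y) / cell_size).astype(int)
    cells = list(zip(ix, iy))
    counts = {c: cells.count(c) for c in set(cells)}
    w = np.array([1.0 / counts[c] for c in cells])    # inverse of cell occupancy
    return w * len(w) / w.sum()                       # weights average to one

# Clustered example: many samples in one corner, a few elsewhere.
x = np.r_[np.random.default_rng(3).uniform(0, 10, 50), [50.0, 80.0, 90.0]]
y = np.r_[np.random.default_rng(4).uniform(0, 10, 50), [60.0, 20.0, 90.0]]
print(cell_decluster(x, y, cell_size=20.0)[:5], "...")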


Computers & Geosciences | 2006

A sequential indicator simulation program for categorical variables with point and block data: BlockSIS

Clayton V. Deutsch

Stochastic simulation of facies or geologic units is important before the assignment of continuous rock properties. Sequential indicator simulation (SIS) remains a reasonable approach when there are no clear genetic shapes that could be put into object-based modeling. Constraining SIS to soft secondary data coming from geological interpretation or geophysical measurements is important. There are a number of techniques, including indicator kriging (IK) with a local mean, collocated cokriging, Bayesian updating, permanence of ratios, block kriging, and block cokriging. BlockSIS implements all of these and more (nine altogether). The images may also be cleaned using maximum a posteriori selection.
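As an illustration of the cleaning step only, the sketch below applies a maximum a posteriori selection in a moving window: each cell is replaced by the most frequent category among its neighbors. The window size and uniform weighting are illustrative choices, not BlockSIS's exact implementation.

# Sketch of maximum a posteriori selection (MAPS) cleaning for a categorical
# image: each cell takes the most frequent category within a small moving
# window, which removes isolated "salt-and-pepper" cells.
import numpy as np
from collections import Counter

def maps_clean(facies, half_window=1):
    nx, ny = facies.shape
    cleaned = facies.copy()
    for i in range(nx):
        for j in range(ny):
            window = facies[max(0, i - half_window): i + half_window + 1,
                            max(0, j - half_window): j + half_window + 1]
            cleaned[i, j] = Counter(window.ravel().tolist()).most_common(1)[0][0]
    return cleaned

rng = np.random.default_rng(5)
noisy = (rng.random((8, 8)) < 0.3).astype(int)   # a noisy two-category image
print(maps_clean(noisy))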

Collaboration


Dive into Clayton V. Deutsch's collaboration.

Top Co-Authors

W. Ren

University of Alberta
