Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where James Gunning is active.

Publication


Featured research published by James Gunning.


Geophysics | 2007

Detection of reservoir quality using Bayesian seismic inversion

James Gunning; Michael E. Glinsky

Sorting is a useful predictor for permeability. We show how to invert seismic data for a permeable rock sorting parameter by incorporating a probabilistic rock-physics model with floating grains into a Bayesian seismic inversion code that operates directly on rock-physics variables. The Bayesian prior embeds the coupling between elastic properties, porosity, and the floating-grain sorting parameter. The inversion uses likelihoods based on seismic amplitudes and a forward convolutional model to generate a posterior distribution containing refined estimates of the floating-grain parameter and its uncertainty. The posterior distribution is computed using Markov chain Monte Carlo methods. The test cases we examine show that significant information about both sorting characteristics and porosity is available from this inversion, even in difficult cases where the contrasts with the bounding lithologies are not strong, provided the signal-to-noise ratio (S/N) of the data is favorable. These test cases show about 25% and 15% improvements in estimated standard deviations for porosity and floating-grain fraction, respectively, for a peak S/N of 6:1. The full posterior distribution of floating-grain content is more informative, and shows enhanced separation into two clusters of clean and poorly sorted rocks. This holds true even in the more difficult test case we examine, where, notably, the laminated reservoir net-to-gross is not significantly improved by the inversion process.
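
A minimal sketch of the kind of MCMC posterior sampling the abstract describes, on a toy one-parameter problem. The forward model, prior, and all numbers here are hypothetical illustrations, not the authors' inversion code:

```python
import numpy as np

# Toy Metropolis sampler for a single "floating-grain fraction" f,
# assuming a hypothetical linear forward model d = g(f) + noise.
rng = np.random.default_rng(0)

def forward(f):
    # Hypothetical rock-physics forward map from floating-grain
    # fraction to a synthetic seismic amplitude.
    return 1.0 - 0.8 * f

d_obs = 0.75                       # observed amplitude (synthetic)
sigma = 0.05                       # noise standard deviation (sets the S/N)
f_prior_mu, f_prior_sd = 0.2, 0.1  # Gaussian prior on f

def log_post(f):
    if not 0.0 <= f <= 1.0:
        return -np.inf
    log_like = -0.5 * ((d_obs - forward(f)) / sigma) ** 2
    log_prior = -0.5 * ((f - f_prior_mu) / f_prior_sd) ** 2
    return log_like + log_prior

f, samples = f_prior_mu, []
for _ in range(20000):
    f_new = f + 0.05 * rng.standard_normal()   # random-walk proposal
    if np.log(rng.uniform()) < log_post(f_new) - log_post(f):
        f = f_new
    samples.append(f)

post = np.array(samples[5000:])    # discard burn-in
print(f"posterior mean {post.mean():.3f}, sd {post.std():.3f}")
```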


Geophysics | 2010

Resolution and uncertainty in 1D CSEM inversion: A Bayesian approach and open-source implementation

James Gunning; Michael Glinsky; John Norman Hedditch

We show that resolution and uncertainty in CSEM inversion is most naturally approached using a Bayesian framework. Resolution can be inferred by either hierarchical models with free parameters for effective correlation lengths (“Bayesian smoothing”), or model-choice frameworks applied to variable-resolution spatial models (“Bayesian splitting/merging”). We find that typical 1D CSEM data can be modelled with quite parsimonious models, typically O(10) parameters per common midpoint. Efficient optimizations for the CSEM problem must address the challenges of poor scaling, strong nonlinearity, multimodality, and the necessity of bound constraints. The posterior parameter uncertainties are frequently controlled by the nonlinearity, and linearized approaches to uncertainty are usually very poor. In Markov chain Monte Carlo (MCMC) approaches, the nonlinearity and poor scaling make good mixing hard to achieve. A novel, approximate frequentist method we call the Bayesianized parametric bootstrap (sometimes called randomized maximum likelihood) is much more efficient than MCMC in this problem and considerably better than linearized analysis, but tends to modestly overstate uncertainties. The software that implements these ideas for the 1D CSEM problem is made available under an open-source license agreement.
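
A sketch of the randomized-maximum-likelihood idea on a toy nonlinear problem: each approximate posterior sample is the bound-constrained optimizer of a misfit built from noise-perturbed data and a prior-perturbed mean. The forward map and all settings are illustrative stand-ins, not 1D CSEM modelling:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def forward(m):
    # Hypothetical nonlinear forward map (stands in for 1D CSEM modelling).
    return np.array([np.exp(-m[0]), m[0] * m[1], np.exp(-m[1])])

m_true = np.array([1.0, 2.0])
sigma_d = 0.02
d_obs = forward(m_true) + sigma_d * rng.standard_normal(3)
m_prior, sigma_m = np.array([0.8, 1.8]), 0.5

def objective(m, d_pert, m_pert):
    r_d = (forward(m) - d_pert) / sigma_d    # data misfit
    r_m = (m - m_pert) / sigma_m             # prior misfit
    return 0.5 * (r_d @ r_d + r_m @ r_m)

samples = []
for _ in range(200):
    d_pert = d_obs + sigma_d * rng.standard_normal(3)     # perturb data
    m_pert = m_prior + sigma_m * rng.standard_normal(2)   # perturb prior mean
    res = minimize(objective, m_prior, args=(d_pert, m_pert),
                   method="L-BFGS-B",
                   bounds=[(0.01, 10.0)] * 2)  # bound constraints, as the abstract requires
    samples.append(res.x)

samples = np.array(samples)
print("RML posterior mean:", samples.mean(axis=0))
print("RML posterior sd:  ", samples.std(axis=0))
```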


Geophysical Prospecting | 2017

Well tie for broadband seismic data

Ehsan Zabihi Naeini; James Gunning; Roy White

The seismic industry is increasingly acquiring broadband data in order to reap the benefits of extra low- and high-frequency content. At the low end, as the sharp low-cut decay gets closer to zero frequency, it becomes harder for a well tie to estimate the low-frequency response correctly. The fundamental difficulty is that well logs are too short to allow accurate estimation of the long-period content of the data. Three distinct techniques, namely parametric constant phase, frequency-domain least squares with multi-tapering, and Bayesian time domain with broadband priors, are introduced in this paper to provide a robust solution to the wavelet estimation problem for broadband seismic data. Each of these techniques has a different mathematical foundation, enabling one to explore a wide range of solutions on a case-by-case basis depending on the problem at hand. A case study from the North West Shelf, Australia, is used to analyse the performance of the proposed techniques. Cross-validation is proposed as a robust quality-control measure for evaluating well-tie applications. It is observed that when the seismic data are carefully processed, the constant-phase approach is likely to offer a good solution. The frequency-domain method does not assume a constant phase; this flexibility makes it prone to over-fitting when the phase is approximately constant. Broadband priors for the time-domain least-squares method are found to perform well in defining low-frequency side lobes to the wavelet.
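
A minimal sketch of a parametric constant-phase scan of the kind the abstract names: rotate a zero-phase synthetic through candidate phases and keep the one best correlated with the seismic trace. The wavelet, reflectivity, and noise level are all synthetic illustrations, far simpler than the paper's estimators:

```python
import numpy as np
from scipy.signal import hilbert, fftconvolve

rng = np.random.default_rng(2)

def ricker(f=25.0, dt=0.002, n=101):
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

# Sparse synthetic reflectivity and a 40-degree constant-phase wavelet.
refl = rng.standard_normal(500) * (rng.uniform(size=500) < 0.05)
w = ricker()
true_phase = np.deg2rad(40.0)
w_rot = np.cos(true_phase) * w - np.sin(true_phase) * np.imag(hilbert(w))
seis = fftconvolve(refl, w_rot, mode="same") + 0.05 * rng.standard_normal(500)

synth0 = fftconvolve(refl, w, mode="same")   # zero-phase synthetic
synth_q = np.imag(hilbert(synth0))           # its quadrature (Hilbert) trace

# Scan constant phase, maximizing correlation with the seismic.
best = max(
    (np.corrcoef(np.cos(p) * synth0 - np.sin(p) * synth_q, seis)[0, 1], p)
    for p in np.deg2rad(np.arange(-180, 180, 1.0))
)
print(f"estimated constant phase: {np.rad2deg(best[1]):.0f} deg "
      f"(correlation {best[0]:.3f})")
```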


Spe Journal | 2009

Downscaling Multiple Seismic Inversion Constraints to Fine-Scale Flow Models

Subhash Kalla; Christopher D. White; James Gunning; Michael E. Glinsky

Well data reveal reservoir layering with high vertical resolution but are areally sparse, whereas seismic data have low vertical resolution but are areally dense. Improved reservoir models can be constructed by integrating these data. The proposed method combines stochastic seismic inversion results, finer-scale well data, and geologic continuity models to build ensembles of flow models. Stochastic seismic inversions operating at the mesoscale generate rock property estimates, such as porosity, that are consistent with regional rock physics and true-amplitude imaged seismic data. These can be used in a cascading workflow to generate ensembles of fine-scale reservoir models wherein each realization from the Bayesian seismic inversion is treated as an exact constraint for a subensemble of fine-scale models. Exact constraints ensure that relevant interproperty and interzone correlations implied by rock physics and seismic data are preserved in the downscaled models; inexact constraints generally do not preserve these correlations. Uncertainty in the rock physics and seismic response is included by using multiple stochastic inversions in a cascading workflow. We use two-point covariance at the fine scale to provide prior thickness and porosity distributions for multiple facies. A Bayesian formulation uses the kriged data as the prior with the coarse constraints as the likelihood, and this posterior is sampled using a Markov chain Monte Carlo (MCMC) method in a sequential simulation framework. These methods generate rich pinchout behavior and flexible spatial connectivities in the fine-scale model, and the resulting flow models are easily represented on a cornerpoint grid. 2D examples illustrate the interactions of prior and constraint data, and 3D examples demonstrate algorithm performance and the effects of stratigraphic variability on flow behavior.
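
A minimal sketch of the "exact constraint" idea: draw fine-scale layer thicknesses from a correlated Gaussian prior, then project each draw onto the hyperplane where they sum exactly to a mesoscale thickness realization (conditioning by kriging). The covariance, sizes, and numbers are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
K = 8                                          # fine-scale layers in a column
mu = np.full(K, 2.0)                           # prior mean thickness (m)
i, j = np.meshgrid(np.arange(K), np.arange(K))
C = 0.5 ** 2 * np.exp(-np.abs(i - j) / 2.0)    # exponential vertical covariance

A = np.ones((1, K))    # exact linear constraint: thicknesses sum to H
H = np.array([18.0])   # one mesoscale total-thickness realization

def sample_constrained(mu, C, A, b):
    # Conditioning by kriging: unconstrained draw, then exact projection
    # onto A x = b (zero constraint noise).
    x = rng.multivariate_normal(mu, C)
    W = C @ A.T @ np.linalg.inv(A @ C @ A.T)
    return x + W @ (b - A @ x)

xs = np.array([sample_constrained(mu, C, A, H) for _ in range(1000)])
print("max |sum - H| over samples:", np.abs(xs.sum(axis=1) - H[0]).max())
```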


SPE Annual Technical Conference and Exhibition | 2007

Imposing Multiple Seismic-Inversion Constraints on Reservoir Simulation Models

Subhash Kalla; Christopher D. White; James Gunning; Michael E. Glinsky

Well data reveal reservoir layering with relatively high vertical resolution but are areally sparse, whereas seismic data have low vertical resolution but are areally dense. Improved reservoir models can be constructed by integrating these data. The proposed method combines stochastic seismic inversion, finer-scale well data, and geologic continuity models to build ensembles of geomodels. Stochastic seismic inversions operating at the mesoscale (≈10 m) generate rock property estimates that are consistent with regional rock physics and true-amplitude imaged seismic data. These can be used in a cascading workflow to generate ensembles of fine-scale reservoir models, wherein each realization from the Bayesian seismic inversion is treated as an exact constraint for an associated finer-scale stochastic model. We use two-point statistical models for the fine-scale model, modeling thickness and porosity of multiple facies directly. The update of these fine-scale models by the seismic constraints yields highly correlated truncated Gaussian distributions. These generate potentially rich pinchout behavior and flexible spatial connectivities in the fine-scale model. The seismic constraints confine the fine-scale models to a posterior subspace corresponding to the constraint hypersurface. A Markov chain Monte Carlo method samples the posterior distribution in this subspace using projection methods that exploit the reduced dimensionality that comes with the exact constraints. These methods are demonstrated in three-dimensional flow simulations on a cornerpoint grid, illustrating the effects of stratigraphic variability on flow behavior.

Introduction

Reservoirs are sparsely sampled by well penetrations, but seismic survey results provide controls for reservoir stratigraphy and properties such as average porosity. However, beds thinner than about 1/8 to 1/4 of the dominant seismic wavelength cannot be resolved in these surveys [1, 2]. At a depth of 3000 m, the maximum frequency in the signal is typically about 40 Hz and average velocities are circa 2,000 m/s; this translates to best resolutions of about 10 m. The resolution limits and errors inherent in seismic-derived estimates complicate the use of seismic inversion data [3]. Mesoscale (≈10 m) reservoir models obtained by seismic inversion using rock-physics concepts and effective-media ideas are a manageable basis for Bayesian seismic integration, because seismic is usefully informative at this scale, as explained above. An attractive route to typical geocellular-scale (≈1 m) models is downscaling mesoscale models to meter-scale models using constraint equations embodying the effective-media laws. In particular, downscaling specific realizations drawn from the posterior of a stochastic mesoscale inversion produces sum or average constraint equations for fine-scale models. We use probabilistic depth and thickness information originating from the layer-based seismic inversion code DELIVERY [4] as input to a downscaling algorithm operating on a cornerpoint grid. Seismic constraints and priors are modeled on the quasivertical block edges, analogous to seismic traces. Simulation at the edges preserves the geometric detail required for cornerpoint reservoir models used in many commercial reservoir simulators (e.g., ECLIPSE [5]). Block-center properties such as porosity are obtained by averaging the edge properties.

Realization ensembles from the inversion code DELIVERY [4] carry rich information about interproperty and vertical interzone correlations induced by the seismic information (Fig. 1). These ensembles are generated assuming no trace-to-trace correlation, and the traces generally do not coincide with cornerpoint edges in the flow grid. The DELIVERYMASSAGER code [6, 7] augments the interproperty and interzone correlations with the mesoscale lateral correlation structures required for geological continuity, and constructs models or model samples at the quasivertical cornerpoint edges of the flow grid. Each realization from DELIVERYMASSAGER thus captures vertical, horizontal, and interproperty correlations at the mesoscale (Fig. 2). These realizations are natural inputs to the downscaling problem we describe. They contain the requisite coupling between geometry and rock properties that seismic inversion induces, plus the necessary spatial correlations required for geological smoothness. These mesoscale models provide explicit sum and average constraints on the corresponding subseismic layers. Such constraints are nontrivial to respect using conventional geostatistical algorithms for fine-scale heterogeneity. Specifically, we consider a fine-scale model of k ∈ {1 . . . K} layers, each layer k with thickness h_k and porosity φ_k. We use t as an untruncated surrogate for layer thickness, so h_k = max(0, t_k): the proxy t may take on negative values, whereas h is truncated at zero. If one wishes to ensure consistency of both thickness and average porosity in a downscaling problem, the following constraints must be imposed at each column of gridblock corners:
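
The constraint equations themselves are truncated in this extract. From the surrounding definitions (fine-scale thicknesses h_k = max(0, t_k) and porosities φ_k, constrained by a mesoscale realization with total thickness H and average porosity φ̄), they plausibly take the following form; this is a hedged reconstruction, not the paper's verbatim equations:

```latex
\sum_{k=1}^{K} h_k = H,
\qquad
\frac{\sum_{k=1}^{K} \varphi_k \, h_k}{\sum_{k=1}^{K} h_k} = \bar{\varphi},
\qquad
h_k = \max(0, t_k).
```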


Geophysical Prospecting | 2018

Joint facies and rock properties Bayesian amplitude-versus-offset inversion using Markov random fields

James Gunning; Mark Sams

Seismic reflection pre-stack angle gathers can be simultaneously inverted within a joint facies and elastic inversion framework using a hierarchical Bayesian model of elastic properties and categorical classes of rock and fluid properties. The Bayesian prior implicitly supplies low-frequency information via a set of multivariate compaction trends for each rock and fluid type, combined with a Markov random field model of lithotypes, which carries abundance and continuity preferences. For the likelihood, we use a simultaneous, multi-angle, convolutional model, which quantifies the data misfit probability using wavelets and noise levels inferred from well ties. Under Gaussian likelihood and facies-conditional prior models, the posterior has a simple analytic form, and the maximum a posteriori inversion problem boils down to a joint categorical/continuous non-convex optimisation problem. To solve this, a set of alternative, increasingly comprehensive optimisation strategies is described: (i) an expectation-maximisation algorithm using belief propagation, (ii) a globalisation of method (i) using homotopy, and (iii) a discrete-space approach using simulated annealing. We find that good-quality inversion results depend on sensible, elastically separable facies definitions; modest resolution ambitions; reasonably firm abundance and continuity parameters in the Markov random field; and a suitable choice of algorithm. We suggest using two to three, perhaps four, unknown facies per sample, and the more expensive methods (homotopy or annealing) when the rock types are not strongly distinguished in acoustic impedance. Demonstrations of the technique on pre-stack depth-migrated field data from the Exmouth basin show promising agreement with lithological well data, including prediction accuracy improvements of 24% in vp/vs and twofold in density, in comparison to a standard simultaneous inversion. Much clearer and more extensive recovery of the thin Pyxis gas field was evident using stronger coupling in the Markov random field model and the homotopy or annealing algorithms.
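
A toy sketch of the discrete-space approach (iii): simulated annealing over a 1D column of facies labels, with a Potts-style continuity prior standing in for the Markov random field and a Gaussian impedance likelihood. The facies means, coupling strength, and cooling schedule are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
F = 3                                   # number of facies
mu_f = np.array([5.0, 6.5, 8.0])        # facies impedance means (illustrative)
sd = 0.6                                # impedance noise level
beta = 1.0                              # continuity strength (Potts coupling)

truth = np.repeat([0, 2, 1], [30, 20, 30])
z_obs = mu_f[truth] + sd * rng.standard_normal(truth.size)

def energy(labels):
    # Negative log-posterior: Gaussian misfit + penalty per facies boundary.
    misfit = 0.5 * np.sum(((z_obs - mu_f[labels]) / sd) ** 2)
    discont = np.sum(labels[1:] != labels[:-1])
    return misfit + beta * discont

labels = rng.integers(0, F, truth.size)
T = 2.0
for sweep in range(300):
    for i in range(truth.size):
        prop = labels.copy()
        prop[i] = rng.integers(0, F)    # single-site facies proposal
        dE = energy(prop) - energy(labels)
        if dE < 0 or rng.uniform() < np.exp(-dE / T):
            labels = prop
    T *= 0.98                           # geometric cooling

print("label accuracy:", np.mean(labels == truth))
```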


73rd EAGE Conference and Exhibition - Workshops 2011 | 2011

Some Newer Algorithms in Joint Categorical and Continuous Inversion Problems around Seismic Data

James Gunning; Michel Kemper

Conventional geophysical inversion tools often use purely continuous optimization techniques that model rock properties as if they come from some common population, even though geological formations usually have a strong mixture character. Such pooling imposes a strong prior-model footprint on inversion results. Newer hierarchical Bayesian approaches that embed a categorical/facies aspect via discrete Markov random fields, coupled with conditional prior distributions that embed rock-physics relationships, are a tractable way to represent the categorical aspects of geology. We show that maximum a posteriori model inference in joint lithology-fluid/rock-properties problems using seismic data is possible using some newer algorithms from computer vision. The optimization is cast as an EM algorithm, using Bayesian belief propagation as the “E” step and conventional large-scale least squares as the “M” step. Very fast approximate alternatives to the “E” step are available using graph-cutting algorithms.
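
A sketch of that EM structure on a 1D chain, where sum-product belief propagation reduces to exact forward-backward recursions: the "E" step computes facies marginals, and the "M" step refits the facies means by weighted least squares. The transition prior, means, and data are illustrative, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(5)
F, N = 2, 200
truth = (np.arange(N) // 50) % 2
mu_true = np.array([4.0, 7.0])
z = mu_true[truth] + 0.8 * rng.standard_normal(N)

P = np.array([[0.95, 0.05], [0.05, 0.95]])  # facies transition prior
mu = np.array([3.0, 8.0])                   # initial facies means
sd = 0.8

for it in range(20):
    like = np.exp(-0.5 * ((z[:, None] - mu[None, :]) / sd) ** 2)
    # E step: forward-backward (exact belief propagation on a chain).
    alpha = np.zeros((N, F)); beta = np.ones((N, F))
    alpha[0] = like[0] / F
    alpha[0] /= alpha[0].sum()
    for i in range(1, N):
        alpha[i] = like[i] * (alpha[i - 1] @ P)
        alpha[i] /= alpha[i].sum()
    for i in range(N - 2, -1, -1):
        beta[i] = P @ (like[i + 1] * beta[i + 1])
        beta[i] /= beta[i].sum()
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)   # facies marginals
    # M step: weighted least-squares update of the facies means.
    mu = (gamma * z[:, None]).sum(axis=0) / gamma.sum(axis=0)

print("estimated facies means:", np.round(mu, 2))
```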


Spe Journal | 2008

Consistent Downscaling of Seismic Inversion Thicknesses to Cornerpoint Flow Models

Subhash Kalla; Christopher D. White; James Gunning; Michael E. Glinsky

Accurate reservoir simulation requires data-rich geomodels. In this paper, geomodels integrate stochastic seismic inversion results (for means and variances of packages of meter-scale beds), geologic modeling (for a framework and priors), rock physics (to relate seismic to flow properties), and geostatistics (for spatially correlated variability). These elements are combined in a Bayesian framework. The proposed workflow produces models with plausible bedding geometries, where each geomodel agrees with seismic data to the level consistent with the signal-to-noise ratio of the inversion. An ensemble of subseismic models estimates the means and variances of properties throughout the flow simulation grid. Grid geometries with possible pinchouts can be simulated using auxiliary variables in a Markov chain Monte Carlo (MCMC) method. Efficient implementations of this method require a posterior covariance matrix for layer thicknesses. Under assumptions that are not too restrictive, the inverse of the posterior covariance matrix can be approximated as a Toeplitz matrix, which makes the MCMC calculations efficient. The proposed method is examined using two-layer examples, and convergence is then demonstrated for a synthetic 3D cornerpoint model with 10,000 traces and 10 layers; performance is acceptable. The Bayesian framework introduces plausible subseismic features into flow models, whilst avoiding overconstraining to seismic data, well data, or the conceptual geologic model. The methods outlined in this paper for honoring probabilistic constraints on total thickness are general and need not be confined to thickness data obtained from seismic inversion: any spatially dense estimates of total thickness and its variance can be used, or the truncated geostatistical model could be used without any dense constraints.
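
A small illustration of why a Toeplitz structure pays off: Toeplitz systems can be solved by Levinson recursion in O(K²) rather than the generic O(K³). The sketch below uses SciPy's Toeplitz solver on an illustrative symmetric matrix, not the paper's actual posterior precision:

```python
import numpy as np
from scipy.linalg import solve_toeplitz, toeplitz

K = 500
c = 0.5 ** np.arange(K)   # first column of a symmetric Toeplitz matrix
b = np.ones(K)

x_fast = solve_toeplitz(c, b)             # Levinson-Durbin solver, O(K^2)
x_ref = np.linalg.solve(toeplitz(c), b)   # dense reference solve, O(K^3)
print("max abs difference:", np.abs(x_fast - x_ref).max())
```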


Geophysics | 2007

The value of using relative amplitude changes

Michael E. Glinsky; Robert Pascoe; Bruce Asher; Guy Duncan; James Gunning

There is a rich history of using differential measurements to improve the signal-to-noise ratio (S/N) when there is correlated signal. With respect to seismic amplitude measurements, this is done in a practical sense by comparing the amplitude of reflections on structure to those off structure, or by comparing anomalous amplitudes to an average background. In fact, "anomalous" is defined in reference to the background. While this is intuitively the right thing to do, the value of this methodology needs to be assessed through its quantitative effect on risk and uncertainty, and through the interaction of this effect with business decisions. It is only when the information influences business decisions that value is realized.


78th EAGE Conference and Exhibition 2016 | 2016

Wavelet Estimation for Broadband Seismic Data

E. Zabihi Naeini; James Gunning; Roy White; P. Spaans

The volumes of broadband seismic data acquired and processed by the industry have grown rapidly, and there is an increasing emphasis on the benefits of broadband seismic for quantitative interpretation. The bottleneck for achieving a satisfactory quantitative interpretation, and subsequently reservoir parameter estimation, is the well tie, the process through which the seismic wavelet is estimated. However, broadband seismic data pose a challenge for well ties, as the duration of the well log is often inadequate to estimate the low-frequency decay towards zero frequency. Three distinct techniques, namely parametric constant phase, frequency-domain least squares with multi-tapering, and Bayesian time domain with broadband priors, are introduced in this paper to provide a robust solution to the wavelet estimation problem for broadband seismic data. A case study from the North West Shelf, Australia, is used to analyse the performance of the proposed techniques. Generally, when the seismic data are carefully processed, the constant-phase approach is likely to offer a good solution. Broadband priors for the time-domain least-squares method are found to perform well in defining low-frequency side lobes to the wavelet.
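
A minimal sketch of the time-domain least-squares family with a Gaussian prior: the wavelet solves a ridge-type normal equation built from the well-log reflectivity, with damping that suppresses long side lobes. The reflectivity, wavelet, and prior weights are illustrative stand-ins for the paper's broadband priors:

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(6)
N, M = 400, 61   # trace length, wavelet length
refl = rng.standard_normal(N) * (rng.uniform(size=N) < 0.08)  # sparse reflectivity

# Convolution matrix: column j holds refl delayed by j samples.
col = np.concatenate([refl, np.zeros(M - 1)])
R = toeplitz(col, np.zeros(M))

w_true = np.sin(np.linspace(0, np.pi, M)) * np.cos(np.linspace(-8, 8, M))
seis = R @ w_true + 0.1 * rng.standard_normal(R.shape[0])

# Prior precision: damping grows away from the wavelet centre,
# a crude stand-in for priors that control low-frequency side lobes.
d = 1e-2 * (1.0 + (np.abs(np.arange(M) - M // 2) / (M // 4)) ** 2)
w_est = np.linalg.solve(R.T @ R + np.diag(d), R.T @ seis)

print("relative error:",
      np.linalg.norm(w_est - w_true) / np.linalg.norm(w_true))
```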

Collaboration


Dive into James Gunning's collaborations.

Top Co-Authors

Roman Pevzner, Cooperative Research Centre
David Annetts, Commonwealth Scientific and Industrial Research Organisation
Juerg Hauser, Commonwealth Scientific and Industrial Research Organisation
Subhash Kalla, Louisiana State University
Jim Patel, Commonwealth Scientific and Industrial Research Organisation
Lincoln Paterson, Commonwealth Scientific and Industrial Research Organisation