Karl Dunbar Stephen
Heriot-Watt University
Publications
Featured research published by Karl Dunbar Stephen.
Petroleum Geoscience | 2008
T. Manzocchi; Jonathan N. Carter; Arne Skorstad; Bjørn Fjellvoll; Karl Dunbar Stephen; John A. Howell; John D. Matthews; John J. Walsh; M. Nepveu; C. Bos; Jonathan O. Cole; P. Egberts; Stephen S. Flint; C. Hern; Lars Holden; H. Hovland; H. Jackson; Odd Kolbjørnsen; Angus Smith Macdonald; P.A.R. Nell; K. Onyeagoro; J. Strand; A. R. Syversveen; A. Tchistiakov; Canghu Yang; Graham Yielding; Robert W. Zimmerman
Estimates of recovery from oil fields are often found to be significantly in error, and the multidisciplinary SAIGUP modelling project has focused on the problem by assessing the influence of geological factors on production in a large suite of synthetic shallow-marine reservoir models. Over 400 progradational shallow-marine reservoirs, ranging from comparatively simple, parallel, wave-dominated shorelines through to laterally heterogeneous, lobate, river-dominated systems with abundant low-angle clinoforms, were generated as a function of sedimentological input conditioned to natural data. These sedimentological models were combined with structural models sharing a common overall form but consisting of three different fault systems with variable fault density and fault permeability characteristics and a common unfaulted end-member. Different sets of relative permeability functions applied on a facies-by-facies basis were calculated as a function of different lamina-scale properties and upscaling algorithms to establish the uncertainty in production introduced through the upscaling process. Different fault-related upscaling assumptions were also included in some models. A waterflood production mechanism was simulated using up to five different sets of well locations, resulting in simulated production behaviour for over 35 000 full-field reservoir models. The model reservoirs are typical of many North Sea examples, with total production ranging from c. 15×10⁶ m³ to 35×10⁶ m³, and recovery factors of between 30% and 55%. A variety of analytical methods were applied. Formal statistical methods quantified the relative influences of individual input parameters and parameter combinations on production measures. Various measures of reservoir heterogeneity were tested for their ability to discriminate reservoir performance. This paper gives a summary of the modelling and analyses described in more detail in the remainder of this thematic set of papers.
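As a rough illustration of the arithmetic behind this ensemble, the short Python sketch below shows how a multiplicative combination of model factors reaches the quoted order of magnitude and how a recovery factor relates to produced and in-place volumes; the structural, relative permeability and in-place figures marked as assumed are illustrative choices, not the project's actual breakdown.

```python
# Illustrative sketch only, not the SAIGUP workflow: how the full-field ensemble
# grows multiplicatively and how a recovery factor is defined. Counts marked
# "assumed" are illustrative; the abstract states only the quoted figures.

n_sedimentological = 400   # "over 400 progradational shallow-marine reservoirs"
n_structural = 4           # assumed: three fault systems plus the unfaulted end-member
n_relperm_sets = 5         # assumed number of upscaled relative permeability sets
n_well_patterns = 5        # "up to five different sets of well locations"

n_runs = n_sedimentological * n_structural * n_relperm_sets * n_well_patterns
print(f"full-field simulations: {n_runs:,}")        # same order as the >35 000 quoted

# Recovery factor = cumulative production / volume initially in place.
cum_production_m3 = 25e6                            # within the quoted 15-35 x 10^6 m^3
in_place_m3 = 60e6                                  # assumed in-place volume
print(f"recovery factor: {cum_production_m3 / in_place_m3:.0%}")   # cf. the 30-55% quoted
```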
Petroleum Geoscience | 2000
Gillian Elizabeth Pickup; Karl Dunbar Stephen
The calculation of pseudo-relative permeabilities can be speeded up considerably by using steady-state methods. The capillary equilibrium limit may be assumed at small scales (30 cm or less), when the flood rate is low. At high flow rates and larger distance scales, we may use a viscous-dominated steady-state method which assumes constant fractional flow. Steady-state pseudos may also be calculated at intermediate flow rates using fine-scale simulations and allowing the flood to come into equilibrium at different fractional flow levels. The aim of this paper is to assess the accuracy of steady-state scale-up for small-scale sedimentary structures. We have tested steady-state scale-up methods using a variety of small-scale geological models. The success of steady-state scale-up depends not only on the flow rate, but also on the nature of the heterogeneity. If high permeability zones are surrounded by low permeability ones (e.g. low permeability laminae or bed boundaries), oil trapping may occur in a water-wet system. In this case, pseudo oil relative permeabilities are very sensitive to flow rate, and care must be taken to upscale using the correct viscous/capillary ratio. However, in permeability models where phase trapping does not occur (e.g. unconnected low permeability regions), the pseudos are similar whatever the viscous/capillary ratio. The disadvantage of steady-state scale-up is that it cannot take account of numerical dispersion in the way that dynamic methods can. However, we show examples of coarse-scale simulations with viscous-dominated steady-state pseudos which compare favourably with fine-scale simulations. Provided there are sufficient grid blocks in the coarse-scale model, the smearing of the flood front due to numerical effects is not serious.
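A minimal numerical sketch of the viscous-dominated (constant fractional flow) steady-state idea described above, written in Python: the fractional flow is inverted for the saturation in each fine layer, then the phase permeabilities are averaged as in single-phase upscaling. The viscosities, layer permeabilities and Corey-type curves are assumptions, and the arithmetic averaging corresponds to layers lying parallel to the flow direction.

```python
import numpy as np

# Viscous-limit (constant fractional flow) steady-state upscaling sketch.
# Layer permeabilities, viscosities and Corey-type curves are assumed values;
# the arithmetic averaging below corresponds to layers parallel to the flow.

mu_w, mu_o = 0.5e-3, 2.0e-3           # phase viscosities (Pa.s), assumed
layers = [                            # (k_abs mD, water exponent, oil exponent, end-point krw)
    (500.0, 3.0, 2.0, 0.6),
    ( 20.0, 4.0, 2.5, 0.3),
    (800.0, 3.0, 2.0, 0.6),
    ( 50.0, 4.0, 2.5, 0.3),
]

def krw(sw, nw, krw_max): return krw_max * sw**nw
def kro(sw, no):          return (1.0 - sw)**no

def frac_flow(sw, nw, no, krw_max):
    lam_w = krw(sw, nw, krw_max) / mu_w
    lam_o = kro(sw, no) / mu_o
    return lam_w / (lam_w + lam_o)

def invert_fw(fw_target, nw, no, krw_max, tol=1e-10):
    """Bisection for the saturation giving the imposed fractional flow (fw is monotonic in Sw)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if frac_flow(mid, nw, no, krw_max) < fw_target else (lo, mid)
    return 0.5 * (lo + hi)

k_abs = np.array([l[0] for l in layers])
k_eff = k_abs.mean()                  # single-phase upscaling, layers parallel to flow

for fw in (0.1, 0.5, 0.9):
    sw = np.array([invert_fw(fw, nw, no, kmx) for _, nw, no, kmx in layers])
    krw_cells = np.array([krw(s, nw, kmx) for s, (_, nw, no, kmx) in zip(sw, layers)])
    kro_cells = np.array([kro(s, no) for s, (_, nw, no, kmx) in zip(sw, layers)])
    krw_pseudo = np.mean(k_abs * krw_cells) / k_eff   # pseudo water relative permeability
    kro_pseudo = np.mean(k_abs * kro_cells) / k_eff   # pseudo oil relative permeability
    sw_coarse = sw.mean()                             # equal pore volumes assumed
    print(f"fw={fw:.1f}  <Sw>={sw_coarse:.3f}  krw*={krw_pseudo:.3f}  kro*={kro_pseudo:.3f}")
```

Repeating the loop over a range of fractional flow values traces out one pseudo curve per coarse block; the capillary equilibrium limit would instead fix a common capillary pressure across the fine cells before averaging.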
Petroleum Geoscience | 2001
Karl Dunbar Stephen; Julian David Clark; Andrew Richard Gardiner
In sandstone-dominated successions of sheet-like turbidites, erosion of thin shale horizons during deposition of the overlying turbidite may lead locally to vertical amalgamation of sandstone beds, resulting in discontinuous thin fine-grained beds or, in the extreme case, thoroughly amalgamated sandstone. Measurements of discontinuous shale lengths from very well exposed turbidite successions have enabled the development of a mixed rule-based/stochastic model for the erosion of shales. Monte Carlo realizations of 2D cross-sections were used to examine the effects of shale discontinuities on both single- and two-phase flow, at the genetic sedimentary unit scale. Results demonstrate that the flow is strongly dependent on the balance of viscous, capillary and gravity forces, which can vary according to the distribution of amalgamation surfaces. The single-phase upscaled ratio of horizontal to vertical permeability and the fraction of mobile oil recovered can be related to the fraction of shale removed (amalgamation ratio) by log-linear and linear relationships, respectively.
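A toy version of the rule-based/stochastic shale-erosion idea, sketched in Python; the grid dimensions, permeabilities and erosion parameters are assumptions rather than the published rules measured from outcrop, and the kv/kh estimate uses simple averaging rather than flow-based upscaling.

```python
import numpy as np

# Toy stochastic shale-erosion sketch (assumed parameters, not the published rule set):
# stack sandstone beds separated by thin shales, erode random segments of each shale,
# then relate the amalgamation ratio to a crude kv/kh estimate.

rng = np.random.default_rng(1)
nx, n_beds = 200, 20
k_sand, k_shale = 1000.0, 0.01          # permeabilities in mD, assumed
p_erode, mean_gap = 0.4, 15             # target eroded fraction and mean gap length (cells)

def realization():
    """Return a (2*n_beds, nx) permeability grid of alternating sand/shale rows and its amalgamation ratio."""
    grid = np.full((2 * n_beds, nx), k_sand)
    removed = 0
    for row in range(1, 2 * n_beds, 2):                        # shale rows between sandstone beds
        shale = np.full(nx, True)
        for _ in range(rng.poisson(p_erode * nx / mean_gap)):  # random erosional windows
            start = rng.integers(0, nx)
            shale[start:start + rng.integers(1, 2 * mean_gap)] = False
        grid[row, :] = np.where(shale, k_shale, k_sand)
        removed += np.count_nonzero(~shale)
    return grid, removed / (n_beds * nx)

for _ in range(5):
    grid, amalgamation_ratio = realization()
    kh = grid.mean()                                           # arithmetic mean as a crude kh proxy
    kv = np.mean(grid.shape[0] / np.sum(1.0 / grid, axis=0))   # harmonic per column, arithmetic across
    print(f"amalgamation ratio {amalgamation_ratio:.2f}:  kv/kh = {kv / kh:.2e}")
```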
Petroleum Geoscience | 2008
John D. Matthews; Jonathan N. Carter; Karl Dunbar Stephen; Robert W. Zimmerman; Arne Skorstad; T. Manzocchi; John A. Howell
Reservoir management is a balancing act between making timely operational decisions and obtaining the data on which such decisions can be made. There is a further problem: estimates of recovery for prospective development plans are subject to uncertainty because of uncertainty in the geological description within the simulation model. The SAIGUP project was designed to analyse the sensitivity of recovery estimates to geological uncertainty in a suite of shallow-marine reservoir models. However, although it was generic, it had the hallmarks of active reservoir management, because those members of the team responsible for deriving the notional development plans for individual models via reservoir simulation, and computing the recoveries, had to work in parallel with others under time and budget constraints. This paper describes the way the reservoir engineering was carried out to achieve these objectives, the assumptions made, the reasoning behind them, and how the principles could be used in other studies. Sample results are also presented, although the bulk of the results appear in other papers in the project series. One surprising result was that faults that impede flow can improve recovery. The underlying physical explanation for this behaviour is provided.
Petroleum Geoscience | 2008
Karl Dunbar Stephen; Canghu Yang; Jonathan N. Carter; John A. Howell; T. Manzocchi; Arne Skorstad
Geological models are often created at a scale finer than is suitable for flow simulation and also ignore the effects of sub-cellular heterogeneities. Upscaling of static and dynamic reservoir properties is an important process that captures the impact of smaller scales, ensuring that both heterogeneity and the flow physics are represented more accurately. A Geopseudo upscaling approach for shallow-marine reservoirs is presented, which captures the essential flow characteristics across a range of scales from laminae to the simulation grid. Starting with a base-case set of minimum assumptions enables generation of one set of pseudo-relative permeability and capillary pressure curves per facies. This is then expanded to investigate the limitations of these assumptions and compare their impact against variations in large-scale geological and structural parameters. For the analysis, a two-level full factorial experimental design is used to determine important parameters. A comparison of upscaling effects is also performed. The most important upscaling and fine-scale parameters identified by the analysis are the shape of the capillary pressure curve, lamina-scale permeability variation and upscaling flow speed. Of similar importance are the sedimentological parameters for shoreline aggradation angle and curvature. Fault direction (perpendicular and parallel to the shoreline) and the fine-scale upscaling method are of moderate to low importance. The shallow-marine parameter for clinoform barrier strength and the direction of flow considered when upscaling are unimportant. Analysis of upscaling effects suggests that the algorithm used at the intermediate scale is not important, while the assumed flow speed is very important, typically resulting in variations of up to 10% in cumulative recovery. Fine-scale properties and upscaling methods affect recovery mostly due to increased initial water saturations but also because of early breakthrough.
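The kind of two-level full factorial screening mentioned above can be sketched as follows in Python; the factor names echo the abstract, but the ±1 coding, the stand-in response function and the noise level are assumptions in place of real simulation output.

```python
from itertools import product
import numpy as np

# Two-level full factorial screening sketch. Each factor is coded -1/+1 and the
# "response" is a stand-in for cumulative recovery from a simulation run.

factors = ["Pc_curve_shape", "lamina_perm_variation", "upscaling_flow_speed",
           "aggradation_angle", "fault_direction"]

rng = np.random.default_rng(0)
weights = rng.normal(size=len(factors))          # hidden sensitivities, assumed

def response(levels):
    """Stand-in for a reservoir simulation: recovery as a noisy linear function of the factors."""
    return 0.40 + 0.02 * np.dot(weights, levels) + rng.normal(scale=0.002)

designs = list(product([-1, +1], repeat=len(factors)))        # 2^5 = 32 runs
results = np.array([response(np.array(d)) for d in designs])

# Main effect of each factor: mean response at its high level minus mean response at its low level.
for i, name in enumerate(factors):
    high = results[[d[i] == +1 for d in designs]].mean()
    low = results[[d[i] == -1 for d in designs]].mean()
    print(f"{name:22s} main effect = {high - low:+.4f}")
```

Ranking the absolute main effects reproduces the kind of importance ordering reported in the abstract, with interaction effects available from the same runs if needed.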
AAPG Bulletin | 2002
Karl Dunbar Stephen; Mark Dalrymple
The principal aim of this outcrop-based reservoir simulation study is to quantify the impact that lithological heterogeneity on a scale of one to hundreds of meters has on the production of hydrocarbons from incised valley fill reservoirs. Excellent exposure of an incised valley fill unit in the Kaiparowits Plateau region of southern Utah has enabled high-resolution interpretations of the lithofacies distributions to be adapted for use in two-dimensional flow simulations. The outcrop section through incised valley fill strata is oriented approximately perpendicular to paleoflow and is above the A sandstone sequence boundary within the Cretaceous Straight Cliffs Formation. The lithofacies, identified as shale, heterolithics, and sand bodies with bounding surfaces, give rise to heterogeneity, predominantly in the vertical direction. The direction of least variability is horizontal and parallel to the paleocurrent. Petrophysical properties of the lithofacies have been varied by altering the flow properties, thus generating different scenarios and realizations for comparison. This allows the impact of each rock type on the fluid-flow simulation to be quantified. Our simulation results indicate that for linear drive, where horizontal flow is induced by an injector-producer pair, the distributions of zero- and low-permeability shale and heterolithic bodies only affect flow significantly if sand body properties vary significantly. For vertical flow, however, these lithological units strongly affect the flow because of their effects on flow-path tortuosity. Our simulations show that horizontal well placement, parallel to the paleocurrent (i.e., in the direction of least variability), offers the best sweep efficiency, although the well location must be optimized.
Transport in Porous Media | 2001
Karl Dunbar Stephen; Gillian Elizabeth Pickup; Kenneth Stuart Sorbie
The balance of viscous, capillary and gravity forces strongly affects two-phase flow through porous media and can therefore influence the choice of appropriate methods for numerical simulation and upscaling. A strict separation of the effects of these various forces is not possible due to the nature of the nonlinear coupling between the various terms in the transport equations. However, approximate prediction of this force balance is often made by calculation of dimensionless quantities such as capillary and gravity numbers. We present an improved method for the numerical analysis of simulations which recognises the changing balance of forces – in both space and time – in a given domain. The classical two-phase transport equations for immiscible incompressible flow are expressed in two forms: (i) the convection–diffusion–gravity (CDG) formulation, where convection and diffusion represent viscous and capillary effects, respectively; and (ii) the oil pressure formulation, where the viscous effects are attributed to the product of mobility difference and the oil pressure gradient. Each formulation provides a different perspective on the balance of forces although the two forms are equivalent. By discretising the different formulations, the effect of each force on the rate of change of water saturation can be calculated for each cell, and this can be analysed visually using a ternary force diagram. The methods have been applied to several simple models, and the results are presented here. When model parameters are varied to determine sensitivity of the estimators for the balance of forces, the CDG formulation agrees qualitatively with what is expected from physical intuition. However, the oil pressure formulation is dominated by the steady-state solution and cannot be used accurately. In addition to providing a physical method of visualising the relative magnitudes of the viscous, gravity and capillary forces, the local force balance may be used to guide our choice of upscaling method.
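For reference, a standard way to write the CDG form of the water transport equation referred to above, under the usual incompressible immiscible assumptions (sign and notation conventions may differ slightly from the paper's), is:

```latex
% Convection-diffusion-gravity (CDG) form of the water transport equation
% (standard fractional-flow derivation; conventions may differ from the paper's).
\phi \frac{\partial S_w}{\partial t}
  + \nabla \cdot \big( f_w \, \mathbf{v}_t \big)                                        % convection (viscous)
  = \nabla \cdot \big( D(S_w) \, \nabla S_w \big)                                       % capillary diffusion
  - \nabla \cdot \big( k \, f_w \lambda_o \, (\rho_w - \rho_o) \, g \, \nabla z \big),  % gravity
\qquad
D(S_w) = -\,k \, f_w \lambda_o \, \frac{\mathrm{d}P_c}{\mathrm{d}S_w},
\quad
f_w = \frac{\lambda_w}{\lambda_w + \lambda_o},
\quad
\lambda_\alpha = \frac{k_{r\alpha}}{\mu_\alpha}.
```

Discretising the three divergence terms separately gives, per cell and per time step, the viscous, capillary and gravity contributions to the rate of change of water saturation that are compared on the ternary force diagram.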
Petroleum Geoscience | 2008
T. Manzocchi; John D. Matthews; J. Strand; Jonathan N. Carter; Arne Skorstad; John A. Howell; Karl Dunbar Stephen; John J. Walsh
The differences in oil production are examined for a simulated waterflood of faulted and unfaulted versions of synthetic shallow-marine reservoir models with a range of structural and sedimentological characteristics. Fault juxtaposition can reduce the economic value of the reservoirs by up to 30%, with the greatest losses observed in models with lower sedimentological aggradation angles and faults striking parallel to waterflood direction. Fault rock has a greater effect than fault juxtaposition on lowering the economic value of the reservoir models in the compartmentalized cases only – and only when the fault rock permeability model is based on the least permeable published laboratory data. Moderately sealing faults can increase the economic value of reservoirs except when the main flow direction is parallel to the faults. These results arise from the dependence of economic value on both sweep efficiency and production rate. Simple predictors of fault juxtaposition and fault-rock heterogeneity have been established and combined with two-dimensional considerations from streamline theory in an attempt to capture quantitatively the change in economic reservoir value arising from faults. Despite limitations associated with the three-dimensional role of juxtaposition, the results are encouraging and represent a step towards establishing a rapid transportable predictor of the effects of faults on production.
First Break | 2006
Karl Dunbar Stephen; Colin MacBeth
Reservoir managers would like to know the current state of their field and be able to see into the future to know how it will change. The former requires information about current fluid sweep and pressure change while the latter requires accurate reservoir description and a predictive tool such as a simulation model. Important decisions can then be made regarding facility maintenance and well optimization, but more importantly, unswept areas can be identified and new wells drilled. Conventionally, simulation models have been used to determine the possible reservoir state and predict its behaviour. The modelling commonly begins with the geologist who creates a number of static geomodels, often constrained to the core data and the petrophysicist’s well log data in addition to the geophysicist’s pre-production 2D or 3D seismic. The upscaled models are then modified by an engineer so that they match static and dynamic well data, including fluid production rates and local pressures. Because the wells are widely spaced, many possible solutions exist where the well data will match. Time-lapse (4D) seismic can reduce the non-uniqueness by identifying changes in fluid saturation and/or pressures. This information is now available in qualitative form almost routinely in a number of North Sea and Gulf of Mexico fields, but the goal is to integrate this data quantitatively with the modelling process together with other available data. To achieve this goal, we have developed an automated history matching method to include as much reservoir data as is necessary and sufficient, including core and well logs, seismic, production data, SCAL, etc. Here, we apply our method to the Schiehallion UKCS reservoir where we update the operator’s model using geostatistical approaches and obtain an improved match to seismic and a good match to production data. Finally, the uncertainty of the parameters and predicted behaviour is analysed.
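A minimal sketch of the kind of combined objective function that automated history matching of this sort minimises, written in Python; the weights, error levels and dummy data are illustrative assumptions and not the formulation actually used in the Schiehallion study.

```python
import numpy as np

# Illustrative combined misfit for automated history matching: a weighted sum of
# squared, error-normalized residuals over production data and 4D seismic maps.
# Weights, error levels and data shapes are assumptions, not the Schiehallion setup.

def combined_misfit(obs_prod, sim_prod, obs_seis, sim_seis,
                    sigma_prod=50.0, sigma_seis=0.05, w_prod=1.0, w_seis=1.0):
    """Least-squares misfit; lower is better. Seismic cells set to NaN are treated as masked."""
    m_prod = np.sum(((obs_prod - sim_prod) / sigma_prod) ** 2)
    m_seis = np.nansum(((obs_seis - sim_seis) / sigma_seis) ** 2)
    return w_prod * m_prod + w_seis * m_seis

# Dummy usage: each candidate model (e.g. a geostatistical update of the prior model)
# is ranked by this single number during the automated search.
rng = np.random.default_rng(2)
obs_p = rng.normal(1000.0, 50.0, size=36)            # e.g. monthly production rates
sim_p = obs_p + rng.normal(0.0, 40.0, size=36)
obs_s = rng.normal(0.0, 0.05, size=(50, 50))         # e.g. a 4D attribute map
sim_s = obs_s + rng.normal(0.0, 0.03, size=(50, 50))
print(f"combined misfit: {combined_misfit(obs_p, sim_p, obs_s, sim_s):.1f}")
```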
Geophysical Prospecting | 2014
Karl Dunbar Stephen; Alireza Kazemi
Updating of reservoir models by history matching of 4D seismic data along with production data gives us a better understanding of changes to the reservoir, reduces risk in forecasting and leads to better management decisions. This process of seismic history matching requires an accurate representation of predicted and observed data so that they can be compared quantitatively when using automated inversion. However, observed seismic data are often obtained only as a relative measure of the reservoir state or its change. The data, usually attribute maps, need to be calibrated before they can be compared to predictions. In this paper we describe an alternative approach where we normalize the data by scaling to the model data in regions where predictions are good. To remove measurements of high uncertainty and make normalization more effective, we use a measure of repeatability of the monitor surveys to filter the observed time-lapse data. We apply this approach to the Nelson field. We normalize the 4D signature by deriving a least-squares regression equation between the observed and synthetic data, which consist of attributes representing measured acoustic impedances and predictions from the model. Two regression equations are derived as part of the analysis. For one, the whole 4D signature map of the reservoir is used, while in the second, 4D seismic data are used from the vicinity of wells with a good production match. The repeatability of the time-lapse seismic data is assessed using the normalized root mean square (NRMS) of measurements outside of the reservoir. Where the NRMS is high, observations and predictions are ignored. Net-to-gross and permeability are modified to improve the match. The best results are obtained by using the NRMS-filtered maps of the 4D signature, which better constrain the normalization. The misfit of the first six years of history data is reduced by 55%, while the misfit of the forecast for the following three years is reduced by 29%. The well-based normalization uses fewer data when repeatability is used as a filter, and the result is poorer. The value of the seismic data is demonstrated by matching to production data only, where the history and forecast misfit reductions are 45% and 20% respectively while the seismic misfit increases by 5%; in the best case using seismic data, the seismic misfit drops by 6%. We conclude that normalization with repeatability-based filtering is a useful approach in the absence of full calibration and improves the reliability of the seismic data.
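The two steps described above (repeatability filtering via normalized root mean square and least-squares normalization of the observed 4D attribute against the synthetic) can be sketched in Python as follows; the array shapes, the 60% NRMS threshold and the choice of window outside the reservoir are assumptions for illustration.

```python
import numpy as np

# Sketch of NRMS-based filtering and least-squares normalization of a 4D attribute map.
# All arrays, the overburden window and the 60% threshold are illustrative assumptions.

def nrms_map(base, monitor):
    """NRMS (%) per map location from traces outside the reservoir (last axis = time samples)."""
    rms = lambda x: np.sqrt(np.mean(x ** 2, axis=-1))
    return 200.0 * rms(monitor - base) / (rms(monitor) + rms(base))

def normalize_attribute(obs_4d, syn_4d, nrms, nrms_max=60.0):
    """Scale/shift the observed 4D attribute onto the synthetic by least squares,
    using only map cells whose repeatability (NRMS) is acceptable."""
    good = nrms < nrms_max
    a, b = np.polyfit(obs_4d[good], syn_4d[good], deg=1)     # syn ~ a*obs + b
    calibrated = a * obs_4d + b
    calibrated[~good] = np.nan                               # ignore unrepeatable cells
    return calibrated

# Dummy usage: 100x100 map grid, 40 time samples in an overburden window.
rng = np.random.default_rng(3)
base = rng.normal(size=(100, 100, 40))
monitor = base + rng.normal(scale=0.3, size=base.shape)      # non-repeatable noise
syn = rng.normal(size=(100, 100))                            # model-predicted attribute
obs = 2.0 * syn + 5.0 + rng.normal(scale=0.2, size=syn.shape)   # relative-scale observation
nrms = nrms_map(base, monitor)
cal = normalize_attribute(obs, syn, nrms)
print(f"median NRMS: {np.median(nrms):.0f}%  "
      f"calibrated-vs-synthetic RMS: {np.sqrt(np.nanmean((cal - syn) ** 2)):.3f}")
```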