Xuri Huang
Western Geophysical
Publication
Featured research published by Xuri Huang.
Software - Practice and Experience | 1997
Xuri Huang; Laurent J. Meister; Rick Workman
Today, geostatistical reservoir characterization from 3-D seismic volumes provides most static descriptions for reservoir models. These models can be improved by integrating dynamic data into the reservoir description process. 3-D time-lapse seismic surveys have been proposed to relate time-dependent changes in seismic attributes to the flow processes in the reservoir. This paper presents a new approach to reservoir characterization that integrates time-lapse seismic and production data, and examines the issues involved in the integration. A case study was conducted over a turbidite sheet-sand reservoir in the Gulf of Mexico. Seismic data from the base survey were combined with log and production data to build an initial reservoir model, which was run forward to the time of a second (monitor) seismic survey. Dynamic history matching by a simulated-annealing type of optimization further improved the model. The output from this simulation was then converted to a synthetic monitor seismic survey using Gassmann's equations and a simple convolutional approach. A quantitative combined seismic and production history-matching methodology was then tested: it constrains the modeling process to match the production history while simultaneously minimizing the differences between the synthetic and real 3-D time-lapse seismic data. This systematic approach provides a quantitative time-lapse seismic analysis and reservoir characterization tool with the potential to improve reservoir management.
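The simulated-annealing style of history matching described above can be sketched with toy stand-ins for the forward models (the two forward functions, the two-component parameter vector, the misfit weights, and all numerical values here are hypothetical; in the paper the forward steps are a flow simulator and Gassmann-based seismic modeling):

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical "true" reservoir parameters (e.g. permeability multipliers)
m_true = np.array([1.8, 0.6])

def forward_production(m):
    # toy stand-in for the flow simulator
    return np.array([m[0] * 10.0, m[0] * m[1] * 5.0])

def forward_seismic(m):
    # toy stand-in for Gassmann + convolutional seismic modeling
    return np.array([m[1] * 2.0, m[0] + m[1]])

d_prod = forward_production(m_true)
d_seis = forward_seismic(m_true)

def objective(m, w_prod=1.0, w_seis=1.0):
    # combined production and time-lapse seismic misfit
    return (w_prod * np.sum((forward_production(m) - d_prod) ** 2)
            + w_seis * np.sum((forward_seismic(m) - d_seis) ** 2))

def simulated_annealing(m0, n_iter=5000, t0=1.0, cooling=0.999):
    m, e = m0.copy(), objective(m0)
    t = t0
    for _ in range(n_iter):
        cand = m + rng.normal(scale=0.05, size=m.size)
        e_cand = objective(cand)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if e_cand < e or rng.random() < np.exp((e - e_cand) / t):
            m, e = cand, e_cand
        t *= cooling
    return m, e

m_est, misfit = simulated_annealing(np.array([1.0, 1.0]))
```

Because both data types enter one objective, the accepted model is constrained to honor the production history and the time-lapse difference simultaneously.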
Geophysics | 1998
Xuri Huang; Laurent J. Meister; Rick Workman
This paper presents a new approach to reservoir management by integrating time-lapse seismic data with production data. The basic steps involve combining the seismic data from the base survey with log and production data to build an initial reservoir model, which is run forward to the time of the repeat seismic survey. The output from this simulation is then converted to a synthetic monitor survey using Gassmann's equations and a simple convolutional approach. Finally, the differences between the synthetic and real time-lapse seismic data are minimized using an optimization algorithm.
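The synthetic-monitor step, fluid substitution with Gassmann's equation followed by convolutional modeling, can be sketched for a single shale-over-sand interface (all rock and fluid properties, the shale impedance, and the wavelet parameters below are invented for illustration; the paper's workflow operates on a full simulation model):

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus from Gassmann's equation (all moduli in GPa)."""
    b = 1.0 - k_dry / k_min
    return k_dry + b**2 / (phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2)

def ricker(f0, dt, n):
    """Zero-phase Ricker wavelet with peak frequency f0 (Hz)."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# hypothetical sand properties: dry-rock/mineral moduli and porosity
k_dry, k_min, mu, phi = 3.0, 37.0, 3.5, 0.25          # GPa, GPa, GPa, fraction
rho_min, rho_brine, rho_gas = 2650.0, 1050.0, 200.0   # kg/m3
k_brine, k_gas = 2.8, 0.05                            # GPa

def impedance(k_fl, rho_fl):
    """P-impedance of the saturated sand for a given pore fluid."""
    k_sat = gassmann_ksat(k_dry, k_min, k_fl, phi) * 1e9
    rho = (1.0 - phi) * rho_min + phi * rho_fl
    vp = np.sqrt((k_sat + 4.0 / 3.0 * mu * 1e9) / rho)
    return rho * vp

z_shale = 6.5e6                                       # overlying shale (assumed)
z_base = impedance(k_brine, rho_brine)                # brine sand at base survey
z_monitor = impedance(k_gas, rho_gas)                 # gas-bearing sand at monitor

def synthetic(z_sand):
    """Convolutional synthetic for the single interface."""
    r = np.zeros(100)
    r[50] = (z_sand - z_shale) / (z_sand + z_shale)   # reflection coefficient
    return np.convolve(r, ricker(25.0, 0.004, 65), mode="same")

diff_trace = synthetic(z_monitor) - synthetic(z_base)
```

Gas lowers the saturated bulk modulus and density, so the monitor impedance drops and the difference trace is nonzero at the interface, which is the signal the optimization matches.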
SEG Technical Program Expanded Abstracts | 1995
Xuri Huang; Mohan Kelkar; A.K. Chopra; C.T. Yang
Wavelet determination is important for seismic inversion. This study investigates wavelet sensitivity for inversion using heuristic combinatorial algorithms in the time domain. The results show that for inversion using heuristic combinatorial algorithms, precise knowledge of the shape of the wavelet (i.e., frequency, phase, and time interval) is not necessary, even though such knowledge does help the inversion. Moreover, the frequency band of the wavelet for the new inversion strategy needs to be wider than the frequency band of the true wavelet. The phase and the length have less impact on the inversion results than the frequency, and mainly lead to a time shift of the inverted results.
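The claim that wavelet timing errors mainly shift the result in time can be illustrated with a zero-phase Ricker wavelet and a delayed copy of it (the wavelet frequency, sample rate, and shift below are arbitrary choices, not values from the study):

```python
import numpy as np

def ricker(f0, dt, n):
    """Zero-phase Ricker wavelet with peak frequency f0 (Hz)."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt, n = 0.004, 65
w = ricker(25.0, dt, n)      # "true" zero-phase wavelet
shift = 3                    # samples of wavelet timing error
w_late = np.roll(w, shift)   # same amplitude spectrum, delayed in time

r = np.zeros(200)
r[100] = 1.0                 # single reflector

trace = np.convolve(r, w, mode="same")
trace_late = np.convolve(r, w_late, mode="same")

# the timing error in the wavelet maps to a pure time shift of the output
lag = np.argmax(trace_late) - np.argmax(trace)
```

The delayed wavelet reproduces the event with unchanged amplitude but a constant lag, consistent with the observation that phase and length errors mainly time-shift the inverted results.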
Software - Practice and Experience | 1995
Chung-Tien Yang; A.K. Chopra; J. Chu; Xuri Huang; Mohan Kelkar
Seismic data are routinely and effectively used to delineate the structure of a reservoir. However, the use of seismic data in reservoir modeling has been limited. This paper introduces a new approach to incorporating 3-D seismic data in reservoir porosity modeling. The approach employs a stochastic seismic inversion technique, based on a modified stochastic hillclimbing method, to generate seismic impedance. Correlation between porosity and the inverted impedance is established at well locations, and the resulting relationship is used to generate 3-D porosity models. The generation of these models involves a stochastic co-simulation of inverted seismic impedance and log-derived porosity. The modeling technique is applied to the Yacheng 13-1 Gas Field. The results are compared with porosity models generated using well-log data only, as well as with models generated using seismic amplitude and well-log data, since a good correlation between seismic amplitude and well-log data is also observed after transforming the data to similar scales. The results demonstrate a protocol for early integration of geological and geophysical data in a gas reservoir. This approach allows easy revision and refinement of the description with additional data, such as new well data or new interpretations of the existing data.
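The calibration step at the heart of this workflow, fitting a porosity-impedance relation at the wells and carrying it into the inverted impedance volume, can be sketched as follows (the well values, units, and linear form are hypothetical; the paper's co-simulation adds a stochastic component this deterministic sketch omits):

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical well control: inverted impedance and log-derived porosity
z_well = rng.uniform(4.0e6, 8.0e6, size=20)                # impedance at wells
phi_well = 0.40 - 4.0e-8 * z_well + rng.normal(0, 0.005, 20)  # porosity logs

# establish the porosity-impedance relationship at the well locations
slope, intercept = np.polyfit(z_well, phi_well, 1)

# apply the relationship away from wells to map inverted impedance to porosity
z_volume = np.array([4.5e6, 5.5e6, 7.0e6])                 # sample trace values
phi_pred = slope * z_volume + intercept
```

Because the relation is calibrated only where logs exist, its reliability away from wells depends on the quality of the inversion, which is why the paper co-simulates impedance and porosity rather than applying the regression deterministically.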
Software - Practice and Experience | 1996
Xuri Huang; Mohan Kelkar
In this paper, a new approach is proposed to integrate static data with dynamic information in the frequency domain. The spatial relationship, the variogram, is represented by power and self-correlation spectra in the frequency domain. It is assumed that large-scale information can be obtained from static data such as geological and seismic information; in the frequency domain, this is represented by low frequencies. The dynamic data (rate versus time) at early times are assumed to be influenced by near-wellbore permeability values; in the frequency domain, these are represented by high frequencies. By selectively perturbing the high-frequency values, the dynamic data can be matched without affecting the large-scale features represented by the low-frequency information. The final reservoir description honors both the static data and the dynamic information. A case study is presented to validate the proposed method.
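The selective-perturbation idea can be sketched on a 1-D permeability description: transform it to the frequency domain, perturb only the coefficients above a cutoff, and transform back (grid size, cutoff, and perturbation magnitude below are invented; in practice the perturbations would be driven by the dynamic-data mismatch):

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D log-permeability description on a 64-cell grid
n = 64
k = np.sin(2 * np.pi * np.arange(n) / n) + 0.1 * rng.normal(size=n)

spec = np.fft.rfft(k)
cutoff = 8  # indices below this carry the "static" large-scale trend

# perturb only the high frequencies, as when matching early-time rate data
perturbed = spec.copy()
perturbed[cutoff:] *= 1.0 + 0.5 * rng.normal(size=spec.size - cutoff)
k_new = np.fft.irfft(perturbed, n)

# the large-scale (low-frequency) content is untouched by construction
low_before = np.fft.rfft(k)[:cutoff]
low_after = np.fft.rfft(k_new)[:cutoff]
```

The field changes near the wells (high-frequency content) while its low-frequency spectrum, and hence the geologically derived trend, is preserved exactly.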
SEG Technical Program Expanded Abstracts | 1995
Xuri Huang; Mohan Kelkar; A.K. Chopra; C.T. Yang
Heuristic combinatorial inversion has several advantages compared to conventional methods. However, most of the algorithms, such as GA and SA, are highly computationally demanding. Even though the modified stochastic hillclimbing algorithm is fast, it still needs greater computational efficiency for wider practical application. In this work, two improvements are introduced. (1) Better initialization: if a well is close to the starting location, the well impedance is used to initialize the inversion there. For the other traces, instead of drawing from the a priori defined probability distribution function, the previously inverted impedance in the closest trace is used to start the current trace inversion, and the process continues until all traces are inverted. If no well impedance or pseudo-velocity is available at the expected starting location, the first trace is inverted using a random initialization, and the process then continues as described above. The entire process can be arranged so that only at well locations is the initialization replaced by the well impedance or pseudo-velocity. (2) Fixed (unchanged) points or boundary conditions: generally, a certain layer of stable deposition is known, and its impedance is known with a certain precision.
To assist the inversion with this information, the points corresponding to the known layers are fixed, i.e., the impedance at these points is not perturbed. This reduces the number of perturbations and speeds up the final determination of the remaining points. However, the algorithm must check whether each point is a fixed point or a boundary, which costs some computational time and slows the process down. All of these changes improve both the results and the computational efficiency. The two schemes are tested using a field data set and a synthetic data set.
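The fixed-point scheme and well-based initialization can be sketched with a toy stochastic hillclimbing inversion of a single trace (the forward model here is just a difference operator standing in for convolutional modeling; the grid size, step size, iteration count, and fixed-point locations are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

n = 30
z_true = np.cumsum(rng.uniform(-0.2, 0.2, n)) + 5.0   # toy impedance trace
d_obs = np.diff(z_true)                                # toy "seismic" data

fixed = np.zeros(n, dtype=bool)
fixed[[0, 15]] = True                                  # known stable layers

def misfit(z):
    return np.sum((np.diff(z) - d_obs) ** 2)

def hillclimb(z0, n_iter=20000, step=0.05):
    z, e = z0.copy(), misfit(z0)
    for _ in range(n_iter):
        i = rng.integers(n)
        if fixed[i]:
            continue                 # fixed points are never perturbed
        cand = z.copy()
        cand[i] += rng.normal(scale=step)
        e_cand = misfit(cand)
        if e_cand < e:               # stochastic hillclimbing: keep improvements
            z, e = cand, e_cand
    return z, e

# initialize the fixed points from well impedance, elsewhere from a prior guess
z0 = np.full(n, 5.0)
z0[fixed] = z_true[fixed]
z_inv, e_final = hillclimb(z0)
```

Because a difference-based forward model only determines impedance up to a constant, the fixed points also pin down the absolute level, illustrating why known stable layers help the inversion beyond just saving perturbations.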
SEG Technical Program Expanded Abstracts | 1995
Xuri Huang; Mohan Kelkar
Heuristic combinatorial algorithms search for optimum solutions in a large or global solution domain in an intelligent way. The algorithms discussed in this work are the genetic algorithm, simulated annealing, the greedy algorithm, and the modified stochastic hillclimbing. For the seismic inversion process, the results show that all the algorithms obtain a reasonable inverted pseudo-velocity log; however, their computational efficiencies differ significantly. The study shows that the modified stochastic hillclimbing method has the ability to adjust the a priori pseudo-velocity or impedance probability distribution function. Because these algorithms only require the probability distribution function to initialize and constrain the solution, they do not require running the inversion starting from the surface or from some stable layer.
SEG Technical Program Expanded Abstracts | 2000
Xuri Huang; Robert Will
In general, integration of time-lapse seismic and production data requires extensive work to build a “common” earth model (Huang et al., 1998) honoring both types of data. This typically leads to longer turnaround times for time-lapse seismic projects. This paper demonstrates a method to shorten this time by reconciling production data with time-lapse seismic data in the data domain rather than in the earth-model domain. To obtain a meaningful difference from time-lapse seismic data, repeatability is the geophysical criterion that controls the differencing quality. In this paper, we propose another criterion to constrain the seismic differencing process: using the production data and material balance, the differencing can be tuned to represent the production change. The relation between the difference volume and the amplitude threshold is characterized and, combined with the production data, serves as an important tool to constrain seismic differencing. Using the material-balance relation of the production data along with the base and monitor survey data, greater constraints may be imposed on estimates of residual reserves. The methodology is demonstrated on a gas reservoir from the Gulf of Mexico.
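The difference-volume-versus-threshold characterization can be sketched on a toy amplitude cube (the cube sizes, cell dimensions, noise levels, and material-balance volume below are all invented for illustration; the paper works with real base and monitor surveys):

```python
import numpy as np

rng = np.random.default_rng(4)

# toy base and monitor amplitude cubes; a drained zone changes amplitude
base = rng.normal(0, 0.1, size=(20, 20, 10))
monitor = base.copy()
monitor[5:10, 5:10, 3:6] += 0.5                 # production-induced change
monitor += rng.normal(0, 0.02, monitor.shape)   # non-repeatability noise

diff = np.abs(monitor - base)
cell_volume = 25.0 * 25.0 * 4.0                 # m3 per cell (hypothetical)

def difference_volume(threshold):
    # total volume of cells whose amplitude change exceeds the threshold
    return np.count_nonzero(diff > threshold) * cell_volume

# characterize the difference-volume-vs-threshold curve
thresholds = np.linspace(0.05, 0.45, 9)
volumes = np.array([difference_volume(t) for t in thresholds])

# choose the threshold whose difference volume best matches the swept
# volume implied by material balance on the production data (assumed here)
v_material_balance = 5 * 5 * 3 * cell_volume
best = thresholds[np.argmin(np.abs(volumes - v_material_balance))]
```

Low thresholds admit non-repeatability noise and inflate the volume; the production-derived volume selects a threshold at which the difference cube represents the production change rather than the noise.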
SEG Technical Program Expanded Abstracts | 1999
Xuri Huang; Thaddeus Charles Jones; Albert Berni
Time-lapse seismic data have recently been used for the detection of bypassed oil as well as for the general problem of reservoir management. In many cases, legacy data have been used because this is the least expensive approach. However, the poor repeatability of legacy data is a significant limitation (Beasley et al., 1996). Using two 3-D seismic data sets acquired over a field in the Gulf of Mexico, this paper demonstrates how data reprocessing affects reservoir modeling, especially dynamic performance prediction. Based on the “off-the-shelf” data, a reasonable difference map was obtained, and an initial reservoir model was characterized using the base survey. The model was further updated using seismic history matching. Dynamic performance predicted for the time frame after the monitor survey underestimated actual production. Using the newly reprocessed data, the model was updated and the performance prediction improved.
SEG Technical Program Expanded Abstracts | 1999
Xuri Huang; Robert Will; Laurent J. Meister; Rick Workman
Methods for applying time-lapse seismic data to reservoir management range from purely qualitative evaluation to its use as a constraint in rigorous numerical dynamic-model optimization. The advantage of numerical optimization over qualitative analysis is that the former provides the quantitative reservoir model needed for practical reservoir management applications. However, the large gap between these qualitative and quantitative methods not only inhibits wider use of time-lapse seismic data as a reservoir monitoring tool, but also delays the integration of seismic and production data until very advanced stages of reservoir model building. Recent work by Huang et al. (1999a) led to a method for early integration of time-lapse seismic and production data in a Gulf of Mexico gas-field 4-D study. Such early integration helps reduce ambiguity in time-lapse seismic data processing and provides useful semi-quantitative results that may be used for some engineering applications. The numerical optimization process benefits from this preliminary step through an improved initial estimate and more efficient convergence. The overall results of this two-stage integration approach are useful intermediate products, improved turnaround time, and increased reliability in model-based time-lapse analysis.