Guilherme Daniel Avansi
State University of Campinas
Publication
Featured research published by Guilherme Daniel Avansi.
Offshore Technology Conference | 2015
Guilherme Daniel Avansi; Denis José Schiozer
Reservoir characterization is essential to the success of history matching and production forecasting. Numerical simulation is thus a powerful tool for the reservoir engineer in quantifying the impact of uncertainties on field development and management planning, calibrating a model with history data and forecasting field production, resulting in a reliable numerical model. History matching has been integrated with several areas, such as geology (geological characterization and petrophysical attributes), geophysics (4D seismic data), statistical approaches (Bayesian theory and Markov fields), and computer science (evolutionary algorithms). Although most integrated history-matching studies use a single objective function (OF), this is not enough. History matching by simultaneous calibration of different OFs is necessary because all wells must have their OF within the acceptance range while the consistency of the geological models generated during reservoir characterization is maintained. The main goal of this work is to integrate history matching and reservoir characterization, applying a simultaneous calibration of different OFs in the history-matching procedure and keeping geological consistency in the adjustment approach to reliably forecast production. We also integrate virtual wells and geostatistical methods into the reservoir characterization to ensure realistic geomodels without creating geological discontinuities when matching the reservoir numerical model.
The proposed integrated calibration methodology consists of using a geostatistical method to model the spatial distribution of reservoir properties based on well log data; running a numerical simulator; adjusting conditional realizations (models) based on geological modeling (variogram model, vertical proportion curve and regularized well log data) and reservoir uncertainties; using a simultaneous adjustment of different OFs to evaluate the history-matching process; and using virtual wells to perturb geological continuities such as channels and barriers. In conclusion, we present an effective methodology to preserve the consistency of geological models during the history-matching process. In addition, we simultaneously combine different OFs to calibrate and validate the models with well production data. Reliable numerical and geological models are then used in production forecasting under uncertainties to validate the integrated procedure.
Introduction
Reservoir modeling is an important step in reservoir prediction, as it is directly involved in reservoir management during the different stages of the field production life. Besides its use in the construction of the numerical model, it helps create a consistent geological model, integrating production data to ensure a reliable forecast of the field's future behavior. Reservoir characterization is thus an important tool for reservoir engineering to improve the integration of geological and numerical modelling (Gosselin et al., 2003; Mezghani et al., 2004), estimating reservoir models consistent with multiple data types over the calibration and prediction periods. Updating the reservoir model to match current production is an important step, mainly in the initial stage of field management, because of the high level of uncertainty. It is also important to appropriately include uncertainties during the reservoir characterization phase.
A probabilistic approach is then used to introduce and analyze the uncertainties in integrated reservoir characterization and history-matching studies. This process aims to integrate different data types and generate multiple reservoir scenarios (Behrens and Tran, 1998; Kazemi and Stephen, 2012; Skorstad et al., 2006; Suzuki and Caers, 2006). The acquired data are spatially dependent and comprise static data, such as well logs and core analyses, as well as dynamic data, such as production rate (Q) and bottomhole pressure (BHP). This study integrates the dynamic data of production wells into the history-matching process. We define a way to quantify the quality of calibration, represented by an objective function (OF), which is minimized through an automatic, assisted or
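The per-well OF idea described in this abstract can be sketched in a few lines. This is an illustrative sketch, not the authors' code: the quadratic form of the OF and the tolerance-based acceptance limit are assumptions, chosen only to show why every well, not just the global sum, must fall inside its acceptance range.

```python
# Hypothetical sketch: a per-well quadratic objective function (OF)
# for history matching, with acceptance checked well by well.
def quadratic_of(observed, simulated):
    """Sum of squared deviations between observed and simulated data."""
    return sum((o - s) ** 2 for o, s in zip(observed, simulated))

def all_wells_accepted(obs_by_well, sim_by_well, tolerance):
    """True only if every well's OF is within its acceptance range."""
    for well, obs in obs_by_well.items():
        of = quadratic_of(obs, sim_by_well[well])
        # Acceptance limit scaled by the magnitude of the observed signal
        limit = tolerance * sum(o ** 2 for o in obs)
        if of > limit:
            return False
    return True

obs = {"W1": [100.0, 98.0, 95.0], "W2": [50.0, 52.0, 55.0]}
sim = {"W1": [101.0, 97.5, 94.0], "W2": [49.0, 53.0, 54.0]}
print(all_wells_accepted(obs, sim, tolerance=0.01))  # True for this small mismatch
```

A single global OF could hide one badly matched well behind several well-matched ones; checking each well separately, as above, is what "simultaneous calibration of different OFs" guards against.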
Eurosurveillance | 2017
Guilherme A. Polizel; Guilherme Daniel Avansi; Denis José Schiozer
Risk assessment is a crucial process in the development of an oil field. Many tools and methods have been developed and applied in this area, such as reservoir simulation, response surface methodology and emulators. The choice of tool and methodology should be linked to the level of uncertainty and the type of uncertain parameters (continuous, discrete and stochastic attributes), aiming at reliable results with reduced computational time and cost. The main goal of this work is to apply the Response Surface Methodology (RSM) to build proxy models that substitute for the medium-fidelity reservoir simulation model, speeding up the parts of the risk analysis process that demand high computational effort. The proxy model passes through statistical and cross-validation steps before being used in production prediction; these validations ensure that the model output resides within a previously established confidence interval. Uncertainty analysis is evaluated by building risk curves, which are then compared with the simulation output to evaluate the proxy model's efficiency based on the Normalized Quadratic Deviation with Signal (NQDS). The proxy study involves the following steps: (1) statistical design to perform a sensitivity analysis of the objective functions; (2) statistical design to obtain the response surface; (3) statistical validation through analysis of variance (ANOVA); (4) consistency check of proxies against the simulator outputs; (5) Latin Hypercube Sampling (LHS) to generate equiprobable scenarios during a cross-validation procedure; (6) application, building the risk curves through the generated proxy models to assist the decision analysis; and (7) comparison with the curves generated by the simulator using 500 scenarios generated by the LHS. From the LHS tool, 500 equiprobable scenarios were generated to build the risk curves using the proxy model, which were then compared with those generated by the reservoir simulator.
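The sampling step in this workflow can be sketched compactly. The following is a minimal, hypothetical illustration of Latin Hypercube Sampling (step 5 above) in pure Python: each uncertain attribute's range is split into as many equal intervals as there are scenarios, and exactly one sample is drawn per interval; the attribute count and scenario count are toy choices, not values from the paper.

```python
import random

# Minimal LHS sketch (illustrative, not the authors' implementation):
# stratify each dimension into n_samples equal cells, draw one point
# per cell, and shuffle cell order so dimensions are paired randomly.
def latin_hypercube(n_samples, n_dims, rng):
    """Return n_samples points in [0, 1)^n_dims, stratified per dimension."""
    columns = []
    for _ in range(n_dims):
        cells = list(range(n_samples))
        rng.shuffle(cells)
        # One uniform draw inside each cell [c/n, (c+1)/n)
        columns.append([(c + rng.random()) / n_samples for c in cells])
    # Transpose column-per-dimension into a list of points
    return list(zip(*columns))

rng = random.Random(0)
pts = latin_hypercube(500, 2, rng)
# Stratification check: exactly one sample falls in each 1/500 interval
xs = sorted(p[0] for p in pts)
print(len(pts), all(i / 500 <= x < (i + 1) / 500 for i, x in enumerate(xs)))
```

Compared with plain Monte Carlo, this stratification is what lets 500 scenarios cover the uncertain attribute ranges evenly, which is why the abstract can compare risk curves at that sample size.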
The NQDS indicator measures the distance between curves. If the value lies within the interval of -1 to +1, the proxy is efficiently reproducing the simulator response within a previously defined tolerance (5% in this case).
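The abstract does not give the NQDS formula, so the sketch below uses one common formulation from the history-matching literature: a signed quadratic deviation normalized by a tolerance-based acceptable deviation. Treat the exact normalization as an assumption, not the paper's definition; the point is only how the -1 to +1 acceptance interval works.

```python
# Hedged NQDS sketch: signed, normalized quadratic deviation.
# The normalization term (tolerance applied pointwise to the observed
# data) is an assumed formulation, chosen for illustration.
def nqds(observed, simulated, tol=0.05, eps=1e-12):
    dev = [s - o for o, s in zip(observed, simulated)]
    signal = 1.0 if sum(dev) >= 0 else -1.0   # sign of the net deviation
    qd = sum(d ** 2 for d in dev)             # quadratic deviation
    # Acceptable deviation built from the tolerance (5% default)
    qd_accept = sum((tol * o) ** 2 for o in observed) + eps
    return signal * qd / qd_accept

obs = [100.0, 102.0, 105.0]
sim_good = [101.0, 101.0, 104.0]   # within about 1% of history
sim_bad = [120.0, 125.0, 130.0]    # systematically too high
print(-1.0 <= nqds(obs, sim_good) <= 1.0, nqds(obs, sim_bad) > 1.0)
```

The sign carries useful diagnostic information: a proxy that is consistently above the simulator response yields a positive NQDS, one consistently below yields a negative value, while the magnitude says whether it is inside the tolerance band.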
Eurosurveillance | 2010
Guilherme Daniel Avansi; Rafael Souza; Denis José Schiozer
Time-lapse seismic has been providing valuable information for identifying fluid movements, locating bypassed oil and optimizing well placement, reducing uncertainties in reservoir development and production management. Nevertheless, most of these analyses are made through a qualitative approach, which limits seismic data integration in reservoir simulation studies due to the different scales involved. To overcome this and develop new data integration techniques, many studies have assumed that both frameworks are at the same scale. Nonetheless, the information lost in the scaling procedures needs to be quantified, and quantitative integration of seismic data into the simulator should mitigate these problems by improving knowledge about the representativeness of the scaled data. In this context, we present a technique addressing the scale issues relating time-lapse seismic data, geological models and simulation models, aiming to quantify the information lost due to scaling. This paper describes a methodology involving seismic attributes and reservoir simulation. Saturation and pressure trends are derived from the acoustic impedance behavior obtained from seismic data. The information lost due to scaling procedures is evaluated by comparing flow-model properties and seismic attributes, in each respective framework, to estimate the best step at which to perform the scaling. This technique improved the reliability of the reservoir parameters derived from seismic attributes with respect to the information lost in scaling procedures. It has been shown that it is possible to improve model and results reliability through a quantitative analysis of scaling procedures in integrated studies of 4D seismic data and reservoir simulation.
The main contributions of this technique are the integration procedures addressing scaling issues between reservoir simulation models and time-lapse seismic information.
Introduction
Time-lapse, or 4D, seismic is the process of repeating 3D seismic surveys over a producing reservoir. It has a potentially large impact on reservoir engineering due to its capability to estimate dynamic reservoir properties such as fluid movement, pressure build-up, and heat flow in reservoirs. Seismic images are sensitive to spatial contrasts in two distinct types of reservoir properties: (1) non-time-varying static geologic properties such as lithology, porosity and shale content; and (2) time-varying dynamic fluid-flow properties such as fluid saturation, pore pressure and temperature. Given a single 3D seismic survey, representing a single snapshot in time of the reservoir, the static geologic and dynamic fluid-flow contributions to the seismic image are non-uniquely coupled and therefore difficult to separate. Time-lapse studies, however, allow the static geologic contributions to cancel, resulting in a direct image of the time-varying changes caused by reservoir fluid flow. Aiming at integrating time-lapse data into reservoir simulation, 4D seismic technology emerged to reduce uncertainties from qualitative analysis (Lumley and Behrens, 1997; Pagano et al., 2000; Elde et al., 2000; O'Donovan et al., 2000; Fagervik et al., 2001; Aggio and Burns, 2000; Hatchell et al., 2002; and Pannett et al., 2004), identifying the water saturation front and greatly improving reservoir characterization and, consequently, history matching. In spite of being very useful, such qualitative analyses can lead to erroneous conclusions, making an additional procedure indispensable to better use the information from this technology.
Therefore, some authors have been evaluating methodologies in which seismic information is considered in reservoir simulation procedures with a quantitative approach; Mezghani et al. (2004), Falcone et al. (2004), Ida (2009) and Souza et al. (2010) are examples of quantitative use of 4D seismic data in history matching. In traditional history matching, the inversion process converts pressure and saturation into impedance (Ida, 2009); according to Risso (2007), the reverse process is necessary, yet the transition from impedance to pressure and saturation can have multiple solutions because many variable combinations could produce the same answer. These approaches have assumed the same scale
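The "information lost due to scaling" idea above can be illustrated with a toy upscaling step. The sketch below is an assumption-laden simplification: a 1-D fine-grid saturation field is block-averaged to a coarser simulation scale, and the lost information is measured as the variance removed by averaging; the field values and block size are made up for illustration.

```python
# Illustrative upscaling sketch (not the paper's procedure): block-
# average a fine-grid property and quantify the variability lost.
def block_average(fine, block):
    """Average non-overlapping blocks of a 1-D fine-grid property."""
    return [sum(fine[i:i + block]) / block for i in range(0, len(fine), block)]

def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

fine = [0.2, 0.8, 0.3, 0.7, 0.9, 0.1, 0.4, 0.6]   # toy fine-scale saturations
coarse = block_average(fine, block=2)
lost = variance(fine) - variance(coarse)           # variability removed by upscaling
print([round(v, 3) for v in coarse], round(lost, 4))
```

In this deliberately extreme example the coarse grid is perfectly uniform, so all spatial contrast, exactly the contrast a 4D seismic image responds to, has been averaged away; comparing such statistics at each scale is one way to choose the best step at which to perform the scaling.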
Eurosurveillance | 2017
S. Toledo; Guilherme Daniel Avansi; Denis José Schiozer
The high level of uncertainty during the early phases of oilfield projects makes economic decisions challenging. One effective way to reduce uncertainties is gathering reservoir information from dynamic sources, such as well tests and production logging. These data must be incorporated into integrated reservoir characterization studies to generate probabilistic simulation models (scenarios). The objective of this work is to develop a procedure to update reservoir simulation models during reservoir characterization, including well tests and production logs, aiming at better production forecasts. The proposed methodology consists of generating phi vs. log(k) equations derived from well test and production logging interpretation to update the permeability distribution in probabilistic scenarios without losing geological consistency. We also establish criteria for selecting wells to be tested based on openhole log data. We evaluate the result of this data incorporation in a synthetic field based on data from a real reservoir located in the Campos Basin, offshore Brazil. We then compare the results with the same reservoir model using classical phi-log(k) equations from laboratory experiments. Results showed that incorporating well tests and production logging can improve the performance of the field, measured by the risk curve of the net present value, compared to the case without tests. We also showed how history matching of the pressure derivatives can reduce the variability of the prediction of future reservoir behavior. The main contribution of this work is to evaluate how new information derived from well tests and production logging improves the consistency of geomodels in the early development of petroleum fields. Beyond this improvement, the results also indicate that this methodology is an efficient way of reducing variability in the production forecast when integrated with other techniques such as history matching.
The use of a benchmark case allowed us to show the influence of the quality of the simulation model on the process.
Introduction
One of the main challenges in the early phases of petroleum field development is making decisions under uncertainties, which arise mainly from the lack of information about reservoir properties, the economic scenario and operational constraints. Long-term decisions are made at this stage, such as the size and number of production units and the number and position of wells. Risk can be mitigated by conceiving flexible projects, increasing robustness or gathering more information. Among all reservoir information, permeability is one of the most important attributes due to its huge impact on fluid displacement. Hence, we must ensure that the permeability estimation is as reliable as possible.
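A phi vs. log(k) equation of the kind this abstract relies on is typically a straight line fitted to paired porosity-permeability observations. The sketch below is a hypothetical illustration only: the (phi, k) pairs are invented, and a least-squares line is one simple way to build such an equation, not necessarily the paper's.

```python
import math

# Hypothetical phi vs. log(k) sketch: fit log10(k) = a*phi + b by
# least squares from (porosity, permeability) pairs, then use the
# line to re-estimate permeability at an untested porosity.
def fit_phi_logk(phis, perms):
    """Return slope a and intercept b of log10(k) = a*phi + b."""
    ys = [math.log10(k) for k in perms]
    n = len(phis)
    mx, my = sum(phis) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(phis, ys)) / \
        sum((x - mx) ** 2 for x in phis)
    return a, my - a * mx

phis = [0.10, 0.15, 0.20, 0.25]         # porosity (fraction), toy data
perms = [10.0, 100.0, 1000.0, 10000.0]  # permeability (mD), toy data
a, b = fit_phi_logk(phis, perms)
predict_k = lambda phi: 10 ** (a * phi + b)
print(round(a, 2), round(predict_k(0.18)))
```

Fitting in log(k) rather than k reflects the roughly log-linear porosity-permeability trend seen in many reservoirs; swapping the laboratory-derived line for one derived from well tests, as the abstract describes, means refitting a and b from the dynamic data.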
Journal of The Brazilian Society of Mechanical Sciences and Engineering | 2017
Denis José Schiozer; Guilherme Daniel Avansi; Antonio Alberto de Souza dos Santos
International Journal of Modeling and Simulation for the Petroleum Industry | 2015
Ana T.F.S. Gaspar; Guilherme Daniel Avansi; Antonio Alberto de Souza dos Santos; João Carlos von Hohendorff Filho; Denis José Schiozer
Spe Reservoir Evaluation & Engineering | 2016
Guilherme Daniel Avansi; Célio Maschio; Denis José Schiozer
Eurosurveillance | 2011
Guilherme Daniel Avansi; Denis José Schiozer
SPE Annual Technical Conference and Exhibition | 2008
Gabriel Alves da Costa Lima; Saul B. Suslick; Guilherme Daniel Avansi
SPE Trinidad and Tobago Section Energy Resources Conference | 2016
Ana T.F.S. Gaspar; Guilherme Daniel Avansi; Célio Maschio; Antonio Alberto de Souza dos Santos; Denis José Schiozer