Alessandra Davolio
State University of Campinas
Publications
Featured research published by Alessandra Davolio.
Journal of Geophysics and Engineering | 2012
Alessandra Davolio; Célio Maschio; Denis José Schiozer
This work represents the first step of a study to integrate time-lapse seismic and reservoir engineering data, in which a petro-elastic inversion from seismic data to pressure and saturation is presented. The inversion is carried out through an optimization procedure. To better understand and validate this initial step of the methodology, synthetic data (initially free of noise and errors) were used. With this ideal data set, it was possible to show that pressure and saturation can be extracted from P and S impedances using only one seismic survey (3D inversion). It is also shown that this 3D approach is not robust when errors are assumed in the reservoir data, and that it fails when, for instance, the porosity data are uncertain. Thus, an improvement is made and the algorithm is rewritten in terms of 4D differences, which reduces the effect of erroneous reservoir data. For both algorithms (3D and 4D), we present a discussion of the objective function behaviour concerning the simultaneous use of P and S impedances, the initial guess and the solution space. A sensitivity analysis discussing the influence of porosity and the dynamic properties on P and S impedances for the 3D and 4D approaches is also presented. After examining the behaviour of the inversion process for an ideal data set, the last subsections analyse its results for different combinations of pressure and saturation variations and with errors included in the data set.
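As a rough illustration of the inversion scheme described in this abstract (not the authors' implementation), the sketch below estimates pressure and water saturation for a single cell by minimizing a misfit on 4D impedance differences; the petro-elastic relation, its coefficients and the data values are hypothetical placeholders.

```python
# Minimal sketch of a 4D petro-elastic inversion for one cell, assuming a
# simple linear petro-elastic model. Coefficients and data are illustrative.
from scipy.optimize import minimize

def petro_elastic_model(p, sw, porosity):
    """Hypothetical petro-elastic model: maps pressure (MPa) and water
    saturation to P- and S-impedance. Coefficients are illustrative only."""
    ip = 7000.0 - 1500.0 * porosity - 800.0 * sw + 5.0 * p
    is_ = 4200.0 - 1200.0 * porosity - 100.0 * sw + 8.0 * p
    return ip, is_

def objective_4d(x, dIp_obs, dIs_obs, base_state, porosity):
    """4D objective: misfit on time-lapse impedance differences, which
    reduces the influence of errors in static properties such as porosity."""
    p1, sw1 = x
    p0, sw0 = base_state
    ip0, is0 = petro_elastic_model(p0, sw0, porosity)
    ip1, is1 = petro_elastic_model(p1, sw1, porosity)
    return (ip1 - ip0 - dIp_obs) ** 2 + (is1 - is0 - dIs_obs) ** 2

# Example: invert one cell for (pressure, saturation) at the monitor time.
base_state = (30.0, 0.2)            # known base pressure and saturation
dIp_obs, dIs_obs = -120.0, 60.0     # observed 4D impedance differences
res = minimize(objective_4d, x0=[30.0, 0.5],
               args=(dIp_obs, dIs_obs, base_state, 0.25),
               bounds=[(10.0, 60.0), (0.0, 1.0)], method="L-BFGS-B")
print("estimated pressure, saturation:", res.x)
```

Working on the impedance differences, rather than on the absolute impedances, is what makes the 4D variant less sensitive to a biased porosity value, as discussed in the abstract.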
Journal of Geophysics and Engineering | 2015
Bruno Pazetti; Alessandra Davolio; Denis José Schiozer
The integration of 4D seismic (4DS) attributes and reservoir simulation is used to reduce risks in the management of petroleum fields. One possible alternative is to work in the saturation and pressure domain: estimates of saturation and pressure changes from 4D seismic data are used as input in history matching processes to yield more reliable production predictions from simulation models. The estimation of dynamic changes from 4DS depends on knowledge of reservoir rock and fluid properties, which are themselves uncertain in the estimation process. This paper presents a study of the impact of rock and fluid uncertainties on the estimation of saturation and pressure changes obtained through a 4D petro-elastic inversion; impact here means the degree to which the saturation and pressure estimates can be perturbed by rock and fluid uncertainties. The motivation for this study comes from the need to estimate uncertainties in saturation and pressure variation so that they can be incorporated into history matching procedures, avoiding the use of deterministic values from 4DS, which may not be reliable. The study is performed using a synthetic case with a known response, from which it is possible to show that the errors in the estimated saturation and pressure depend on the magnitude of the rock and fluid uncertainties jointly with the reservoir dynamic changes. The main contribution of this paper is to show how uncertain reservoir properties can affect the reliability of pressure and saturation estimates from 4DS, and how this depends on the reservoir changes induced by production. This information can be used in future projects that use quantitative inversion to integrate reservoir simulation and 4D seismic data.
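The following self-contained sketch illustrates, under simplified assumptions, the kind of sensitivity study described above: a hypothetical linearized petro-elastic relation maps (Δp, ΔSw) to impedance changes, its coefficients are perturbed to mimic rock/fluid uncertainty, and the spread of the inverted estimates is examined. All coefficients and magnitudes are illustrative only.

```python
# Propagate rock/fluid uncertainty through a toy linearized 4D inversion.
import numpy as np

rng = np.random.default_rng(1)
true_dp, true_dsw = -5.0, 0.3            # "true" reservoir changes
A_true = np.array([[5.0, -800.0],        # dIp = 5*dp - 800*dSw (hypothetical)
                   [8.0, -100.0]])       # dIs = 8*dp - 100*dSw (hypothetical)
d_obs = A_true @ np.array([true_dp, true_dsw])   # noise-free 4D observations

estimates = []
for _ in range(500):
    # perturb the petro-elastic coefficients used in the inversion (+/- 10%)
    A_pert = A_true * rng.normal(1.0, 0.10, size=A_true.shape)
    estimates.append(np.linalg.solve(A_pert, d_obs))

estimates = np.array(estimates)
print("dp  estimate: mean %+.2f, std %.2f" % (estimates[:, 0].mean(),
                                              estimates[:, 0].std()))
print("dSw estimate: mean %+.3f, std %.3f" % (estimates[:, 1].mean(),
                                              estimates[:, 1].std()))
```

The spread of the recovered (Δp, ΔSw) grows with both the coefficient uncertainty and the magnitude of the true changes, which is the qualitative behaviour the abstract reports.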
SPE Annual Technical Conference and Exhibition | 2014
Forlan La Rosa Almeida; Alessandra Davolio; Denis José Schiozer
Assisted history matching procedures are usually implemented within an optimization framework whose goal is to minimize an objective function written in terms of the error between observed and simulated data. One problem with this approach is that a single-valued objective function is not sufficient to express the complexity of the problem: the errors measured for each well, for all production rates and for pressure are collapsed into a single value, usually the norm of all error vectors. If the problem were well behaved and the objective function converged quickly within a desired tolerance, this would not matter, but that is usually not the case for complex history matching problems, since it is very difficult to find a model (or a set of them) that matches the production profiles of every well within a reasonable tolerance. This work proposes a new approach to deal simultaneously with several objective functions (for instance, well rates and pressure). The methodology follows a probabilistic approach in which several simulation models are handled throughout the entire procedure. The initial step is to generate several simulation models by combining the most important reservoir uncertainties. Then an iterative procedure changes the reservoir attributes and filters the models that are closer to the history data. This procedure encompasses a re-characterization step, in which the probabilities of the attributes are updated, global multipliers are applied and local changes are made around problematic wells, in order to provide a set of models whose production and pressure curves show better dispersion around the history data for every well. The key point of the methodology is that, at every iteration, the errors of each model, well and objective function can be visualized in a single concise plot based on the normalized quadratic error of each curve. The plot clearly shows global and local problems of the set of simulation models, so it is a good indicator of the changes to be made in the next iteration. Another point to highlight is that the same type of iterative procedure is used to integrate 4D seismic into the process: after selecting a set of models that represent the well history data well, the iterative process is repeated to generate a new set of models that also match the 4D seismic data. Thus, the proposed methodology integrates 4D seismic and well history data to reduce uncertainties with a probabilistic approach. To validate the methodology, an application is shown in which the gradual improvement of the models along the iterations can be seen.
Introduction
History matching is a well-known procedure used to calibrate reservoir simulation models with dynamic data in order to improve the reliability of production predictions. The process is normally very complex, mainly due to three aspects: (1) the number of uncertain attributes to be changed in the process, (2) the existence of multiple possible solutions of similar quality (considering a deviation from the observed data) and (3) the number of functions to be calibrated. To make the process viable, the number of uncertainties is reduced based on knowledge of the problem and on the results of sensitivity analyses that indicate the most relevant attributes.
In this work, the most relevant uncertainties are the attributes that represent the spatial variation of reservoir properties (facies, porosity, permeability, net-to-gross ratio). These properties are combined with other attributes using a Discretized Latin Hypercube technique (called DLHG by Schiozer et al., 2014). The multiplicity of solutions is handled by combining an uncertainty analysis with the history matching process. Instead of using an optimization method to find the best solution, as is typically done in history matching procedures (as illustrated by Maschio & Schiozer, 2008; Bertolini & Schiozer, 2011; Becerra et al., 2011), the process in this work is treated differently: it starts with multiple possible scenarios (probabilistic approach) generated using the DLHG technique, and the set of scenarios is then reduced based on the quality of the solutions, measured by their deviation from the observed data.
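A minimal sketch of a discretized Latin Hypercube draw, written in the spirit of the DLHG technique cited above (it is not the implementation of Schiozer et al., 2014): Latin Hypercube strata are mapped onto discrete attribute levels through their cumulative probabilities, one level per attribute per model. Attribute names, levels and probabilities are invented for illustration.

```python
# Discretized Latin Hypercube sampling over discrete uncertainty levels.
import numpy as np

def discretized_lhs(attributes, n_models, rng):
    """attributes: dict name -> (levels, probabilities). Returns n_models
    dictionaries, each with one sampled level per attribute."""
    samples = [{} for _ in range(n_models)]
    for name, (levels, probs) in attributes.items():
        cdf = np.cumsum(probs)
        # one stratified uniform sample per model, shuffled across models
        u = (np.arange(n_models) + rng.random(n_models)) / n_models
        rng.shuffle(u)
        for m, ui in enumerate(u):
            idx = min(np.searchsorted(cdf, ui), len(levels) - 1)
            samples[m][name] = levels[idx]
    return samples

rng = np.random.default_rng(3)
attributes = {
    "facies_realization": (list(range(50)), [1 / 50] * 50),   # hypothetical
    "kr_water_exponent": ([1.5, 2.0, 2.5], [0.3, 0.4, 0.3]),  # hypothetical
    "owc_depth_m": ([3169, 3174, 3179], [0.25, 0.5, 0.25]),   # hypothetical
}
for model in discretized_lhs(attributes, n_models=5, rng=rng):
    print(model)
```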
Eurosurveillance | 2015
F.B. Mesquita; Alessandra Davolio; Denis José Schiozer
The petroleum industry has a high demand for production forecasting under uncertainty, which is performed through probabilistic approaches. Although these approaches have been used frequently in recent years, one of the challenges to their application is the development of a methodology that produces multiple scenarios honouring the available dynamic data (history matching procedures). Many works proposing different probabilistic history matching approaches have been published in the literature, but they are based on methods that demand a very high number of reservoir simulation runs. To make this type of application feasible, these methods rely on proxy models, which, however, are not able to capture the full physical behaviour of the model. Unlike those methods, this work proposes a methodology for uncertainty reduction of reservoir properties integrated with an assisted history matching procedure, in a multi-objective approach that does not employ proxies. The proposed methodology is an extension of a procedure previously published by our group, which has been shown to be an efficient way to perform probabilistic history matching. At each iteration of the process, the model uncertainties are combined through a robust sampling technique, which generates a set of model realizations that are then simulated. The results are evaluated for all objective functions (OF), namely the quadratic deviations of all well rates and pressures. This work proposes the use of quantitative tools to perform the matching: (1) indexes that quantify the matching quality for each OF and allow prioritizing the worst-matched OF and (2) a bi-dimensional matrix that supports the identification of the sources of the highest deviations and the gradual updating of model properties. At the end of the procedure, good models (those presenting a satisfactory match for all OF) are filtered and further employed in production forecasting under uncertainty. The methodology is applied to a synthetic model (UNISIM-I-H) based on the Namorado field in Brazil. The results show not only the gradual improvement of all OF during the matching procedure but also, and more importantly, how the reservoir uncertainties are updated based on the tools proposed here. The proposed indicators make the application systematic and can be used to automate the process. The methodology is able to deal with complex cases, especially those involving several wells and uncertainties with non-linear interactions among them. These developments provide robustness to the application, resulting in more reliable production forecasts and field management.
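A hedged sketch of the filtering idea described in this abstract: each model receives one quality index per objective function (a signed, normalized quadratic deviation is assumed here as the index form), and only models whose indexes fall within a cutoff for every OF are kept for forecasting. All curves and noise levels are synthetic.

```python
# Filter an ensemble of models using per-OF normalized quadratic deviations.
import numpy as np

def signed_nqd(sim, obs, tol=0.1):
    """Quadratic deviation normalized by an acceptable tolerance, carrying
    the sign of the mean error so systematic bias is visible."""
    nqd = np.sum((sim - obs) ** 2) / (np.sum((tol * obs) ** 2) + 1e-12)
    return np.sign(np.mean(sim - obs)) * nqd

rng = np.random.default_rng(4)
n_models, n_ofs, cutoff = 200, 12, 1.0
obs_curves = [rng.uniform(20, 200, size=100) for _ in range(n_ofs)]  # fake history

good_models = []
for m in range(n_models):
    noise = rng.uniform(0.02, 0.15)          # fake "model quality"
    indexes = []
    for obs in obs_curves:
        sim = obs * rng.normal(1.0, noise, size=obs.size)  # fake simulated curve
        indexes.append(signed_nqd(sim, obs))
    if np.all(np.abs(indexes) <= cutoff):    # must match every OF
        good_models.append(m)

print(f"{len(good_models)} of {n_models} models accepted for forecasting")
```

Arranging the same indexes in a models-by-OF matrix gives the bi-dimensional diagnostic the abstract mentions, pointing to whether a mismatch is global (a whole column) or local to one well (a single cell).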
Eurosurveillance | 2013
Alessandra Davolio; Célio Maschio; Denis José Schiozer
Nowadays there are several methodologies to incorporate 4D seismic data quantitatively into the history matching of reservoir simulation models. Most of them add a map attribute derived from 4D seismic (impedance maps, saturations, etc.) to the objective function of the optimization process. In these cases, the goal is to match the dynamic information provided by 4D seismic with the simulation response over the whole reservoir area, or over the whole area covered by seismic data. In this work, a history matching methodology is proposed that uses 4D seismic information locally for each injector (and its associated producers) individually, trying to match the water-front movement related to each injector with the observed dynamic changes. In order to guarantee consistency with geology, the matching is done by combining static properties that come from different simulation models. These models are generated through a combination of the most important uncertain parameters in a geologically consistent uncertainty analysis procedure, which should be run beforehand. The matching procedure is then based on the computation of water saturation errors in sub-regions defined around each injector for multiple scenarios; the error is computed between the water saturation derived from 4D seismic and each model's simulation result. The models that present the smallest error for each sub-region are selected, and a new simulation model is built by combining the static properties extracted from the selected models at each sub-region. Before calculating the saturation errors, the volume of water computed from 4D seismic is calibrated against the volume of injected water, providing a more robust calibration. The methodology was applied to a synthetic case where porosity and permeability were the main uncertainties updated by the proposed methodology. Since these properties were generated through geostatistical realizations, the adjusted model kept its geological features. The promising results observed in the synthetic case show that the methodology can be a good alternative for history matching, since it is an easy-to-implement procedure and does not require sophisticated optimization algorithms to incorporate 4D seismic into the process.
Introduction
Reservoir simulation is one of the main tools used for reservoir management. Thus, one of the main tasks of geologists and reservoir engineers is to build reliable models, which is not easy due to the complexity of this kind of problem. The complexity comes mainly from the lack of geological information needed to build such models. This issue can be mitigated by matching dynamic data (well production, 4D seismic, etc.) in a history matching procedure, which is an ill-posed inverse problem with several possible solutions. As it is a difficult and important task, several works in the literature approach the problem with different optimization algorithms and discuss how best to use the available dynamic data. Two relevant issues have been the subject of active research in the last ten years: the quantitative incorporation of 4D seismic data into the history matching procedure, and the development of history matching methodologies that generate geologically consistent models. The works of Caers (2003) and Hoffman and Caers (2005) are good examples of geologically consistent history matching methodologies.
In Caers (2003), the multiple-point geostatistics method is proposed, in which geological information is integrated jointly with well production data through a geological training image. An extension of this approach is presented in Hoffman and Caers (2005), where the authors propose a methodology that uses the same kind of probability perturbation method (using the training images) but performs regional perturbations. Consequently, the reservoir can be split into regions and the method allows different amounts of property change in different parts of the model, without discontinuities forming along the region borders. The two methodologies mentioned above originally consider only well data to perform the matching; however, applications of the probability perturbation method proposed by Caers (2003) that also include 4D seismic data quantitatively can be seen in Castro et al. (2009), for the Oseberg field, and Tolstukhin et al. (2012), for the Ekofisk field. Landa and Kumar (2011) and Stephen et al. (2005) are other examples of history matching processes that consider well production and 4D seismic data jointly to perturb the model properties; their methodologies also keep the integration with geological information. Both perform the history matching with a probabilistic approach but with different algorithms: Landa and Kumar (2011) use a Monte Carlo type sampling algorithm and Stephen et al. (2005) use the Neighbourhood Algorithm (NA). This work proposes a new methodology to run a local history matching using 4D seismic data quantitatively. Multiple models are used to perform the local matching in a simple process of comparing the errors between 4D seismic data and the reservoir model realizations. Unlike the methods usually found in the literature, such as those mentioned before, no optimization process is performed in the methodology proposed here. Another point is that it assumes an uncertainty analysis has been run beforehand, generating several model realizations. These realizations are used to update the reservoir properties, so the model obtained after the matching keeps some geological consistency. In order to validate the methodology, an application to a synthetic dataset is presented. The main contribution of this work lies in the simple approach proposed, which can be an alternative to the sophisticated algorithms mentioned before.
Methodology
It is common nowadays to run statistical analyses to better understand the effect of uncertainties on reservoir behaviour; as a result of this kind of procedure, several model realizations are generated by combining the most important uncertainties. The goal of the methodology proposed in this work is to harness these models, together with 4D seismic, to run a local history matching procedure that updates the reservoir properties without losing their geological character. The proposed approach uses the data provided by 4D seismic locally, performing a local matching for each injector region and trying to adjust the movement of the water front. The 4D seismic data considered in this work consist of a water saturation difference map (Δ4DSwseis), generated by an inversion process such as the one described in Davolio et al. (2012a). Thus, the local matching is done by dividing the reservoir into regions according to the location of the injector wells and the corresponding observed water saturation error map.
Then, local properties such as porosity and permeability are modified in order to minimize the water saturation errors between the data provided by 4D seismic and the simulation model. The local modification of the static properties is done using several model realizations. A more detailed description of the whole matching process is presented below. Note that the operator Δ4D stands for the time-lapse difference: property at time 1 minus property at time 0 (times 0 and 1 correspond to the two seismic surveys).
1. Simulate the n models generated by a previously run uncertainty analysis procedure and calculate the water saturation time-lapse difference for each model ($\Delta_{4D}Sw_{sim\_i}$, for i = 1:n);
2. Simulate the base model and calculate its water saturation time-lapse difference ($\Delta_{4D}Sw_{base}$);
3. Compute the water saturation error map of the base model:
$E_{base} = \Delta_{4D}Sw_{seis} - \Delta_{4D}Sw_{base}$; (1)
4. Select an injector well and define regions around it according to the error map computed in step 3;
5. For each region and each of the n model realizations, calculate the quadratic error
$E_{region,i} = \sum_{(j,k,l)\in region} \left( \Delta_{4D}Sw_{seis}(j,k,l) - \Delta_{4D}Sw_{sim\_i}(j,k,l) \right)^2$, (2)
where (j, k, l) are the block coordinates inside the region;
6. For each region, select the model that presents the smallest quadratic error computed in step 5;
7. For the models selected in step 6, cut the local properties inside each region;
8. Replace the base model properties inside each region with the properties extracted in step 7;
9. Simulate the new base model and analyse the error map (equation 1); if the errors are acceptable, select another injector and go to step 2; if not, redefine the regions and go back to step 5.
In order to use the water saturation maps provided by 4D seismic data quantitatively, as described above, it is important to calibrate this information with engineering data. Hence, an additional procedure is run before step 3. It consists of computing the amount of injected water according to the water saturation map provided by 4D seismic and dividing this value by the known volume of injected water. The result is a correction factor that is applied to the 4D seismic data by dividing the water saturation map by this number. Another important point is that the methodology described above should be used for local properties, so if there are global properties, such as relative permeability, that play an important role in the reservoir response, a global matching should be applied before starting the local matching. Since performing a complex global match is not the focus of this work, a simple manual matching procedure based only on well data is proposed. The procedure treats one set of global properties at a time and consists of building hundreds of models through the derivative tree technique (Schiozer et al., 2004), in which all the levels of one set of parameters are combined. Then, the quadratic error between each well production curve of each model and the history data is computed according to
$E = \sum_{t=1}^{m} \left( sim_t - obs_t \right)^2$, (3)
where m is the number of production data points, sim is the simulated data and obs is the observed history data.
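A compact sketch of steps 3 to 8 of the local-matching loop above, on a toy grid. The seismic-volume calibration and the two rectangular sub-regions are simplified stand-ins for the injector-based regions described in the text, and all property values are synthetic.

```python
# Region-wise selection of the best realization against seismic-derived dSw.
import numpy as np

rng = np.random.default_rng(5)
nx, ny, n_models = 20, 20, 30

dsw_seis = rng.uniform(0.0, 0.5, size=(nx, ny))            # dSw from 4D seismic
dsw_sims = rng.uniform(0.0, 0.5, size=(n_models, nx, ny))  # dSw per realization
poro_sims = rng.uniform(0.15, 0.30, size=(n_models, nx, ny))

# calibration: scale the seismic dSw so its implied water volume matches the
# known injected volume (both "volumes" are simple proxies here)
injected_volume = 40.0
dsw_seis = dsw_seis / (dsw_seis.sum() / injected_volume)

# two hypothetical sub-regions around one injector (left / right halves)
regions = [np.s_[:, :10], np.s_[:, 10:]]

base_poro = np.full((nx, ny), 0.22)        # base model porosity (toy values)
for r, reg in enumerate(regions):
    errors = [np.sum((dsw_seis[reg] - dsw_sims[i][reg]) ** 2)
              for i in range(n_models)]    # quadratic error per realization
    best = int(np.argmin(errors))          # step 6: smallest error
    base_poro[reg] = poro_sims[best][reg]  # steps 7-8: cut and paste properties
    print(f"region {r}: best realization {best}, error {errors[best]:.3f}")
```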
SPE Annual Technical Conference and Exhibition | 2016
Germano S. C. Assunção; Alessandra Davolio; Denis José Schiozer
Traditionally, the integration between 4D seismic (4DS) and simulation data has been performed by treating the 4DS data deterministically. However, there are uncertainties in the seismic response. The goal of the methodology presented in this work is to compare the changes in dynamic properties estimated from 4DS and from simulation models considering the uncertainties inherent to both data sets. The relevant reservoir uncertainties can be combined to generate multiple simulation models, which provide maps of dynamic changes such as pressure change (Δp) and water saturation variation (ΔSw). The available 4DS can also be used to map dynamic changes: through a stochastic seismic inversion, multiple ΔSw and Δp maps can be obtained from 4DS. After selecting a proper scale (scale transference), we compare the dynamic maps from seismic and simulation data using probability density functions (PDFs), establishing levels of agreement/disagreement between 4DS and simulation data. To validate the methodology we use a synthetic dataset of moderate complexity with seven mapped uncertainties, such as fault transmissibility, porosity, facies and permeability. 500 maps of ΔSw and Δp from 4D seismic were generated from a prior probabilistic seismic inversion, and 500 simulation models previously calibrated using well production data generated the corresponding set of 500 ΔSw and Δp maps from simulation. Applying the methodology, we identify four types of region: (1) reservoir locations where both estimates (seismic and simulation) are similar, indicating properly calibrated regions; (2) locations where the simulation estimates are more precise than 4D seismic; (3) locations where the two data sets give divergent estimates; and (4) locations where the 4DS estimates are more precise than the simulation. This information can be very useful to guide data integration. As an example, we show that region (4) can be used to select the simulation models that reproduce the ΔSw or Δp behaviour from 4DS, since the 4D seismic data are more precise than the simulation estimates in this region. Another useful outcome of the proposed methodology is that the reservoir zones identified as region (2) can be used as a constraint to reinterpret the 4D seismic data, as the simulation estimates are more precise there. The methodology is a new way to evaluate the information from 4D seismic and simulation data considering uncertainties. The identification of these four regions can be useful in the parametrization phase of the history matching procedure (a complex process), as an additional tool to understand the properties involved. The methodology also indicates possible locations where reservoir engineering constraints can be used to improve seismic interpretation, in regions where the estimates from simulation are more precise than 4D seismic.
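A hedged, per-cell sketch of the comparison described above. The paper works with probability density functions built from the two ensembles; here a simplified rule based on interval overlap and ensemble spread is assumed in place of the actual agreement criteria, and all numbers are synthetic.

```python
# Classify cells by comparing seismic and simulation ensembles of dSw.
import numpy as np

rng = np.random.default_rng(6)
n_maps, n_cells = 500, 1000

# per-cell "true" change, plus spreads and a bias that vary across the field
true_dsw = rng.uniform(0.0, 0.4, size=n_cells)
seis_std = rng.uniform(0.01, 0.10, size=n_cells)
sim_std = rng.uniform(0.01, 0.10, size=n_cells)
sim_bias = rng.normal(0.0, 0.08, size=n_cells)       # some cells poorly matched
dsw_seismic = rng.normal(true_dsw, seis_std, size=(n_maps, n_cells))
dsw_simulation = rng.normal(true_dsw + sim_bias, sim_std, size=(n_maps, n_cells))

def classify(seis, sim, k=1.5):
    """Return one of four labels per cell from the +/- k*std intervals."""
    s_lo, s_hi = seis.mean() - k * seis.std(), seis.mean() + k * seis.std()
    m_lo, m_hi = sim.mean() - k * sim.std(), sim.mean() + k * sim.std()
    if min(s_hi, m_hi) - max(s_lo, m_lo) <= 0:
        return "divergent"                        # region (3)
    if sim.std() < 0.5 * seis.std():
        return "simulation more precise"          # region (2)
    if seis.std() < 0.5 * sim.std():
        return "seismic more precise"             # region (4)
    return "consistent"                           # region (1)

labels = [classify(dsw_seismic[:, c], dsw_simulation[:, c])
          for c in range(n_cells)]
for name in sorted(set(labels)):
    print(f"{name}: {labels.count(name)} cells")
```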
Eurosurveillance | 2011
Alessandra Davolio; Célio Maschio; Denis José Schiozer
Time-lapse seismic attributes have been shown to be a promising tool for use in the history matching process. The most common seismic attribute used to integrate these data is P-impedance: the usual procedure is to minimize the differences between the seismic P-impedance and a synthetic P-impedance computed from the flow simulation data through forward modelling. In this paper, we propose a methodology that goes in the opposite direction: we invert the seismic data to pressure and saturation and then incorporate these results into the history matching procedure, integrating the inversion process with the flow simulator and using the results of the reservoir simulation to guide the process. Another feature is that, instead of using only P-impedance in the inversion procedure, we also take into account a second seismic attribute, S-impedance, which is more sensitive to pressure than to saturation variations. A discussion is presented of the results obtained using a model with characteristics similar to Brazilian offshore fields and synthetic seismic data. The main contribution of this work is the integration of the seismic attribute inversion and reservoir simulation processes. We also show the advantages of the inversion procedure for obtaining pressure and saturation when predicting petroleum production, and the details of the methodology used to avoid multiple combinations of saturation and pressure in the inversion process.
Petroleum Geoscience | 2017
Masoud Maleki; Alessandra Davolio; Denis José Schiozer
The Norne Field started production in 1997 and up to 2006 the field experienced intense production activity, making the Norne benchmark case an ideal candidate to explore the challenges in interpreting complex time-lapse seismic data. Seismic amplitude changes and time-shifts are used as the first level approach to interpret the time-lapse differences and to update reservoir models. A common alternative is to invert the seismic data and obtain acoustic impedance variations caused by production activity, and to evaluate their possible interpretations. For this case study, we use a 4D inversion approach to invert the base (2001) and monitor (2006) seismic surveys in order to provide field-wide insights for the Norne benchmark case. We extensively interpret the observed 4D inversion anomalies and decouple, as much as possible, the effects of fluid and pressure variations, supported by production and reservoir engineering data. Moreover, we compare the inversion results with the simulation model from the Norne benchmark case to suggest areas of future modification to the simulation model. This research is intended as a resource to improve the quality of history matching or other 4D inversion methods applied to the Norne benchmark case, and to demonstrate a detailed time-lapse seismic interpretation within the reservoir segments of the Norne Field. Supplementary material: Well-history data of six producer and injector wells is available at https://doi.org/10.6084/m9.figshare.c.3890251
OTC Brasil | 2017
Carla J. Ferreira; Alessandra Davolio; Denis José Schiozer
This work describes a methodology to evaluate the Discrete Latin Hypercube with Geostatistical Realizations (DLHG) sample size for complex models in history matching under uncertainty, with an application to the Norne benchmark case. The sample size affects both the time demanded and the accuracy of the results in a history matching process: a small sample size can yield inaccurate risk quantification, while a large sample size can demand excessive time to reach good results. Both factors should be evaluated in order to improve the project's efficiency and to obtain reliable results. Such an evaluation is even more important for complex reservoir models, because the number of tests needed to determine the reservoir scenarios that match the dynamic data can be high due to the level of complexity. The methodology presented in this work is divided into three steps. First, we evaluate the ability of DLHG to produce output cumulative distribution functions (CDFs) that replicate a more exhaustive sampling technique (Monte Carlo), using the Kolmogorov-Smirnov test; the output is the misfit between observed and simulated production rates. Then, we compare the influence and correlation matrices obtained with the DLHG and Monte Carlo samples: the influence matrix shows the impact of the uncertainty variation on the outputs, and the correlation matrix measures the strength of the dependence between the uncertain attributes and the outputs. Finally, we perform a stability test. The methodology was applied to the Norne benchmark case, a field located in the Norwegian Sea. The main characteristics of the methodology are: (1) it uses a statistical technique to compare the output CDFs from the reference and DLHG samples and (2) it evaluates the ability of the DLHG sample to identify the reservoir attributes that affect the history matching results. We evaluated DLHG sample sizes of 20, 50, 100 and 200, and considered an MC sample size of 5,000 for the Norne benchmark case. The DLHG CDFs for the sample size of 100 accurately replicated the corresponding MC CDFs, but did not replicate the behavior of the influence and correlation matrices. The DLHG sample size of 200 was able to reproduce the output CDFs as well as the influence and correlation matrices, and it was considered stable. The study showed that even if a sample size is able to represent the output CDFs from a reference solution, the influence and correlation matrices should also be evaluated. The methodology presented can be incorporated into usual history matching routines.
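A small sketch of the first step of the methodology above: a two-sample Kolmogorov-Smirnov test checks whether the misfit distribution from a small sample reproduces the distribution from a large Monte Carlo reference. Here a random subsample stands in for the actual DLHG draw, and the misfit values are synthetic.

```python
# Compare small-sample misfit CDFs against a Monte Carlo reference with KS.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(7)
misfit_mc = rng.lognormal(mean=1.0, sigma=0.5, size=5000)    # MC reference

for n in (20, 50, 100, 200):
    misfit_small = rng.choice(misfit_mc, size=n, replace=False)  # proxy for DLHG
    stat, p_value = ks_2samp(misfit_small, misfit_mc)
    verdict = "compatible" if p_value > 0.05 else "distinguishable"
    print(f"n={n:4d}  KS statistic={stat:.3f}  p={p_value:.3f}  -> {verdict}")
```

As the abstract stresses, passing this CDF check is necessary but not sufficient: the influence and correlation matrices built from the same samples must also be compared before a sample size is accepted.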
Eurosurveillance | 2016
Gil G. Correia; Alessandra Davolio; Denis José Schiozer
Some difficulties are frequently associated with the integration of seismic and flow simulation datasets, especially the low vertical resolution of the seismic data and the uncertain estimates in the areas between the wells. One challenge is therefore the integration of both datasets at different scales, in order to take advantage of the characteristics of each. The present study proposes a redistribution of the reservoir saturation estimated with 4D seismic inversion methods, combined with the information provided by the flow simulation, in order to improve the quality and resolution of the estimates. The methodology comprises a saturation redistribution algorithm that is applied to each reservoir block, combining the information of two saturation maps: one derived from the 4D seismic data and the other estimated by the flow simulator. The final saturation estimate follows the vertical distribution given by the simulation data but keeps the average behavior observed in the 4D seismic. In order to have better control of the results, the methodology is applied to a synthetic dataset that includes a reservoir model at different grid resolutions: geomodel, simulation model and seismic model. Two case studies present the main results of the proposed methodology for improving the saturation predictions. The first case study represents an ideal situation in which the base model (simulation model) is assumed to be very similar to the reference model (the geomodel that represents the true answer), with only a few differences remaining due to the different scales. In the second case study, the base model is selected from multiple realizations during the uncertainty reduction process. The results show that it is possible to improve the resolution of the saturation variation maps computed from 4D seismic data, allowing the identification of new fine-scale heterogeneities and providing better estimates of the saturation changes due to production. This methodology can also give additional clues in future history matching procedures through the identification of critical regions. With the continuous calibration of the models during a history matching process, the results obtained with the redistribution method tend to improve, approaching the results obtained in the first case study. The proposed redistribution method combines the best characteristics of the seismic and simulation data: it includes the higher sensitivity of the seismic data for identifying the areal distribution of the main anomalies and adds the higher sensitivity of the simulation data for identifying vertical water flow trends due to gravitational effects. Thus, the procedure introduces, into the maps provided by 4D seismic, new information regarding the injection/production patterns that becomes more and more reliable as we approach the wells.
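A minimal sketch of the redistribution idea for a single reservoir column, under the assumption that scaling the simulated vertical profile to the seismic-derived column average is acceptable; the paper's actual algorithm may weight blocks differently, and the numbers below are illustrative.

```python
# Redistribute a coarse seismic dSw vertically using the simulated profile.
import numpy as np

def redistribute_column(dsw_seismic_avg, dsw_sim_profile, eps=1e-9):
    """Scale the simulated vertical profile so its average equals the
    seismic-derived column average, keeping saturations physical."""
    sim_avg = dsw_sim_profile.mean()
    if sim_avg < eps:                  # no simulated change: spread evenly
        return np.full_like(dsw_sim_profile, dsw_seismic_avg)
    scaled = dsw_sim_profile * (dsw_seismic_avg / sim_avg)
    return np.clip(scaled, 0.0, 1.0)

dsw_sim_profile = np.array([0.00, 0.02, 0.10, 0.35, 0.45])  # fine-scale layers
dsw_seismic_avg = 0.15                                      # coarse 4D value
print(redistribute_column(dsw_seismic_avg, dsw_sim_profile))
```

The result keeps the column average seen by the seismic (0.15 here) while following the vertical shape of the simulated water advance, which is the combination of strengths described in the abstract.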