Alexandre Castellini
Chevron Corporation
Publication
Featured research published by Alexandre Castellini.
Annual Simulation Symposium | 2005
Burak Yeten; Alexandre Castellini; Baris Guyaguler; Wen H. Chen
The experimental design method is an alternative to traditional sensitivity analysis. The basic idea behind this methodology is to vary multiple parameters at the same time so that maximum inference can be attained at minimum cost. Once the appropriate design is established and the corresponding experiments (simulations) are performed, the results can be investigated by fitting them to a response surface. This surface is usually an analytical or a simple numerical function that is cheap to sample. It can therefore be used as a proxy for reservoir simulation to quantify the uncertainties. Designing an efficient sensitivity study poses two main issues:
• Designing a parameter-space sampling strategy and carrying out the experiments.
• Analyzing the results of the experiments (response surface generation).
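The workflow described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' code: a three-level full factorial design over two coded parameters, a quadratic response surface fitted by least squares, and the fitted surface used as a cheap proxy for the simulator. The `simulator` function here is a stand-in for an expensive reservoir simulation run.

```python
import itertools
import numpy as np

def quadratic_features(x1, x2):
    # Terms of a full quadratic response surface in two variables:
    # 1, x1, x2, x1*x2, x1^2, x2^2.
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Design: each parameter sampled at low/mid/high levels (coded -1, 0, +1).
levels = [-1.0, 0.0, 1.0]
design = np.array(list(itertools.product(levels, levels)))

def simulator(x1, x2):
    # Stand-in for the expensive flow simulation.
    return 3.0 + 2.0 * x1 - x2 + 0.5 * x1 * x2

# Run the "experiments" at the design points.
y = np.array([simulator(a, b) for a, b in design])

# Fit the response surface by least squares.
X = quadratic_features(design[:, 0], design[:, 1])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

def proxy(x1, x2):
    # Cheap analytical proxy: evaluate the fitted surface anywhere.
    return quadratic_features(np.atleast_1d(x1), np.atleast_1d(x2)) @ coeffs
```

Once fitted, the proxy can be sampled thousands of times (e.g. in a Monte Carlo loop) at negligible cost compared to the simulator it replaces.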
Information Processing and Trusted Computing | 2005
Alexandre Castellini; Irene Gullapalli; Viet Thai Hoang; Patrick Condon
Understanding the effects of subsurface uncertainties on production responses is an integral part of the decision-making process. A more accurate quantification of the uncertainty of production forecasts contributes to better business decisions. When a field has produced for several years, models must be conditioned to available production data to obtain meaningful predictions. Data integration and uncertainty assessment of future performance of the reservoir are indivisible processes that cannot be addressed generally by simple techniques. A method is presented to solve complex inverse problems that involve highly nonlinear responses.
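Conditioning a model to production data amounts to a nonlinear inverse problem: find parameter values whose simulated response matches the observed history. As a hypothetical illustration (not the paper's method), the sketch below calibrates a toy exponential-decline production model to observed rates by minimizing a least-squares misfit over a coarse parameter grid.

```python
import numpy as np

# Synthetic production "history" generated from a known truth
# (q0 = 100, decline rate d = 0.3), standing in for field data.
t = np.arange(0.0, 5.0)
observed = 100.0 * np.exp(-0.3 * t)

def misfit(q0, d):
    # Sum-of-squares mismatch between modeled and observed rates.
    predicted = q0 * np.exp(-d * t)
    return float(np.sum((predicted - observed) ** 2))

# Crude grid search over the parameter space; real workflows use
# far more efficient samplers or optimizers, but the principle is
# the same: select parameters that minimize the data mismatch.
q0_grid = np.linspace(50.0, 150.0, 101)
d_grid = np.linspace(0.1, 0.5, 41)
best = min((misfit(q0, d), q0, d) for q0 in q0_grid for d in d_grid)
_, q0_best, d_best = best
```

For highly nonlinear responses the misfit surface is typically multi-modal, which is why simple techniques break down and more sophisticated sampling or optimization strategies are needed.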
Annual Simulation Symposium | 2003
Alexandre Castellini; Adwait Chawathe; David Larue; J.L. Landa; F.X. Jian; John Toldi; M.C. Chien
These days "estimating uncertainty" is the mantra. As we do this, we ask ourselves which is better: an array of geologically simple, rapidly history-matched models, or a single geologically comprehensive, carefully history-matched model? After all, uncertainty, which is normally characterized by a range of forecasts from techniques such as Experimental Design, is difficult to quantify using just one model, however comprehensive it may be. Yet if forecasts are obtained from a series of simple models, how good are they? Choosing one over the other also has significant implications for modeling time and for reservoir management. Specific questions, which directly affect the cost of modeling, come to mind:
• What is the optimal level of geological detail, especially if the uncertainty management plan includes history-matching and simulating a series of models?
• Can the oil trapped behind the flood front be estimated with a series of simple lateral barriers/baffles (shales), or do we always need an extensive sequence stratigraphy framework? A detailed model may not be as amenable to uncertainty estimation by virtue of its size.
• Is field-scale history matching adequate, or is well-by-well history matching a must? Perhaps the brute-force approach of probabilistic forward modeling provides the panacea; after all, the proof of the pudding lies only in the model's ability to accurately predict field performance.
• Should the modeling strategy, i.e., comprehensive vs. simple, depend on the response variable of interest, e.g., ultimate recovery factor vs. infill drilling locations? If ultimate recovery is the objective, a simple model may suffice.
To answer questions like these, we revisit current reservoir modeling paradigms. As a datum, we use a comprehensively modeled waterflood from Western Africa. This reservoir was modeled using extensive sequence stratigraphic techniques.
The model was scaled up from about 14 million cells to about 280,000 cells using a flow-based scale-up algorithm, carefully preserving all the mappable mudstones above flooding surfaces. History matching for a 30-year period was systematically conducted with a team of field engineers and simulation specialists. The whole process took about a year to complete. Against this datum, we compare a series of rapidly built geological models that still honor all the data and the overall depositional architecture, yet differ significantly from the datum geological model by virtue of the modeling strategies implemented. The different modeling strategies vary in complexity from changes in variogram lengths and directions to simple tank models with stochastic sandstones and mudstones conditioned to well data. Various geostatistical algorithms were also investigated for facies modeling and for populating petrophysical properties within facies. The new models were history-matched using the conventional manual method and two separate assisted history-matching methods that use sensitivity coefficients. The question being addressed was: does constraining the geological models to the same dynamic data always create an imprint over the underlying geological variation and result in similar predictions? Preliminary results indicate that the history-matching overprint tends to mask some of the dramatic geological variations. This can have significant ramifications for modeling strategies, especially when assessing uncertainty in the presence of a substantial production history.
Archive | 2007
Tina Yu; Dave Wilkinson; Alexandre Castellini
History matching is the process of updating a petroleum reservoir model using production data. It is a required step before a reservoir model is accepted for forecasting production. The process is normally carried out by flow simulation, which is very time-consuming. As a result, only a small number of simulation runs are conducted and the history matching results are normally unsatisfactory.
ACM SIGEVOlution | 2007
Alexandre Castellini; Charles F. Guthrie; Burak Yeten; Tina Yu
The GECCO conference is the largest annual conference in the field of Genetic and Evolutionary Computation. Workshops on specific topics are organized every year. In July 2007, Chevron and Tina Yu from Memorial University of Newfoundland organized the first workshop on Petroleum Applications. Speakers from industry and academia shared their recent work on a variety of subjects, and a panel discussion addressed the gaps and opportunities for oil and gas companies.
Archive | 2006
David A. Wilkinson; Tina Yu; Alexandre Castellini
Archive | 2007
Burak Yeten; Alexandre Castellini
Journal of Artificial Evolution and Applications | 2008
Tina Yu; Dave Wilkinson; Alexandre Castellini
SPE Annual Technical Conference and Exhibition | 2002
F.X. Jian; David Larue; Alexandre Castellini; John Toldi
Archive | 2011
Alexandre Castellini; Herve Gross; Yifan Zhou; Jincong He; Wen Hsiung Chen