Publication


Featured research published by Pallav Sarma.


SPE Reservoir Evaluation & Engineering | 2010

Results of the Brugge Benchmark Study for Flooding Optimization and History Matching

E. Peters; R.J. Arts; G.K. Brouwer; C.R. Geel; S. Cullick; R.J. Lorentzen; Yan Chen; K.N.B. Dunlop; F.C. Vossepoel; R. Xu; Pallav Sarma; A.H. Alhutali; Albert C. Reynolds

In preparation for the SPE Applied Technology Workshop (ATW) held in Brugge in June 2008, a unique benchmark project was organized to test the combined use of waterflooding-optimization and history-matching methods in a closed-loop workflow. The benchmark was organized in the form of an interactive competition during the months preceding the ATW. The goal set for the exercise was to create a set of history-matched reservoir models and then to find an optimal waterflooding strategy for an oil field containing 20 producers and 10 injectors that can each be controlled by three inflow-control valves (ICVs). A synthetic data set was made available to the participants by TNO, consisting of well-log data, the structure of the reservoir, 10 years of production data, inverted time-lapse seismic data, and other information necessary for the exercise. The parameters to be estimated during the history match were permeability, porosity, and net-to-gross (NTG) thickness ratio. The optimized production strategy was tested on a synthetic truth model developed by TNO, which was also used to generate the production data and inverted time-lapse seismic. Because of time and practical constraints, a full closed-loop exercise was not possible; however, the participants could obtain the response to their production strategy after 10 years, update their models, and resubmit a revised production strategy for the final 10 years of production. In total, nine groups participated in the exercise. The spread of the net present value (NPV) obtained by the different participants is on the order of 10%. The highest result that was obtained is only 3% below the optimized case determined for the known truth field. Although not an objective of this exercise, it was shown that the increase in NPV as a result of having three control intervals per well instead of one was considerable (approximately 20%). The results also showed that the NPV achieved with the flooding strategy that was updated after additional production data became available was consistently higher than before the data became available.
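
The objective compared across participants is the discounted net present value of the waterflood cash flow. A generic form is sketched below for orientation; the symbols (oil price r_o, produced-water and injection costs c_w and c_i, discount rate b) are placeholders, and the actual economic parameters of the benchmark are not given in this summary.

\mathrm{NPV} \;=\; \sum_{k=1}^{K} \frac{\bigl( r_o\, q_{o,k} \;-\; c_w\, q_{wp,k} \;-\; c_i\, q_{wi,k} \bigr)\,\Delta t_k}{(1 + b)^{\,t_k/\tau}}

Here q_{o,k}, q_{wp,k}, and q_{wi,k} are the field oil-production, water-production, and water-injection rates over time step k of length \Delta t_k, and \tau is the reference period for discounting. The submitted waterflooding strategies (ICV settings over time) are compared through this single scalar.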


SPE Reservoir Simulation Symposium | 2005

Implementation of Adjoint Solution for Optimal Control of Smart Wells

Pallav Sarma; Khalid Aziz; Louis J. Durlofsky

Practical production optimization problems typically involve large, highly complex reservoir models, thousands of unknowns and many nonlinear constraints, which makes the numerical calculation of gradients for the optimization process impractical. This work explores a new algorithm for production optimization using optimal control theory. The approach is to use the underlying simulator as the forward model and its adjoint for the calculation of gradients. Direct coding of the adjoint model is, however, complex and time consuming, and the code is dependent on the forward model in the sense that it must be updated whenever the forward model is modified. We investigate an adjoint procedure that avoids these limitations. For a fully implicit forward model and specific forms of the cost function and nonlinear constraints, all information necessary for the adjoint run is calculated and stored during the forward run itself. The adjoint run then requires only the appropriate assembling of this information to calculate the gradients. This makes the adjoint code essentially independent of the forward model and also leads to enhanced efficiency, as no calculations are repeated. Further, we present an efficient approach for handling nonlinear constraints that also allows us to readily apply commercial constrained optimization packages. The forward model used in this work is the General Purpose Research Simulator (GPRS), a highly flexible compositional/black oil research simulator developed at Stanford University. Through two examples, we demonstrate that the linkage proposed here provides a practical strategy for optimal control within a general-purpose reservoir simulator. These examples illustrate production optimization with conventional wells as well as with smart wells, in which well segments can be controlled individually.
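
To make the adjoint construction concrete, the sketch below uses generic notation (assumed here, not quoted from the paper): x^n are the reservoir states at time step n, u^n the well controls, g^n(x^n, x^{n-1}, u^n) = 0 the fully implicit residual equations assembled by the simulator, and J = \sum_n L^n(x^n, u^n) the cost function (e.g., NPV).

\lambda^{n\,T}\,\frac{\partial g^{n}}{\partial x^{n}}
  \;=\; -\,\frac{\partial L^{n}}{\partial x^{n}}
        \;-\; \lambda^{(n+1)\,T}\,\frac{\partial g^{n+1}}{\partial x^{n}},
\qquad
\frac{dJ}{du^{n}} \;=\; \frac{\partial L^{n}}{\partial u^{n}}
  \;+\; \lambda^{n\,T}\,\frac{\partial g^{n}}{\partial u^{n}}.

The adjoint variables \lambda^n are obtained by one backward sweep from the final time step (with \lambda^{N+1} = 0), after which the gradient with respect to every control follows at negligible extra cost. Because the Jacobians \partial g^n/\partial x^n and \partial g^n/\partial x^{n-1} are exactly the matrices a fully implicit simulator already assembles, they can be stored during the forward run, which is what makes the adjoint code largely independent of the forward model.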


SPE Journal | 2011

A Comparative Study of the Probabilistic-Collocation and Experimental-Design Methods for Petroleum-Reservoir Uncertainty Quantification

Heng Li; Pallav Sarma; Dongxiao Zhang

Summary

Reservoir modeling and simulation are subject to significant uncertainty, which usually arises from heterogeneity of the geological formation and deficiency of measured data. Uncertainty quantification thus plays an important role in reservoir simulation. In order to perform accurate uncertainty analysis, a large number of simulations are often required. However, it is usually prohibitive to do so because even a single simulation of practical large-scale simulation models may be quite time consuming. Therefore, efficient approaches for uncertainty quantification are a necessity. The experimental-design (ED) method is applied widely in the petroleum industry for assessing uncertainties in reservoir production and economic appraisal. However, a key disadvantage of this approach is that it does not take into account the full probability-density functions (PDFs) of the input random parameters consistently; that is, the full PDFs are not used for sampling and design but only during post-processing, and there is an inherent assumption that the distributions of these parameters are uniform (during sampling), which is rarely the case in reality. In this paper, we propose an approach to deal with arbitrary input probability distributions using the probabilistic-collocation method (PCM). Orthogonal polynomials for arbitrary distributions are first constructed numerically, and then PCM is used for uncertainty propagation. As a result, PCM can be applied efficiently for any arbitrary numerical or analytical distribution of the input parameters. It can be shown that PCM provides optimal convergence rates for linear models, whereas no such guarantees are provided by ED. The approach is also applicable to discrete distributions. PCM and ED are compared on a few synthetic and realistic reservoir models. Different types of PDFs are considered for a number of reservoir parameters. Results indicate that, while the computational effort is greatly reduced compared to Monte Carlo (MC) simulation, PCM is able to accurately quantify the uncertainty of various reservoir performance parameters. Results also reveal that PCM is more robust, more accurate, and more efficient than ED for uncertainty analysis.

Introduction

Reservoir simulations are widely used for reservoir-performance forecasting in the petroleum industry. On the other hand, it is recognized that a simulation model is usually nonunique because a variety of input parameters in the reservoir model may be uncertain. The uncertainty can stem from heterogeneity of the geological formation and deficiency of measured data. As a result, model outputs such as hydrocarbon production may have significant uncertainty, the quantification of which is usually accomplished using statistical properties such as the mean, standard deviation, and the probability density. As such, uncertainty quantification plays an important role in reservoir simulation. MC simulation is extensively used for uncertainty quantification [see, for example, Zhang (2002)]. In the MC method, a large number of realizations of the random inputs are generated and solved to obtain a set of model outputs, which can be further analyzed statistically. The direct-sampling MC method is conceptually straightforward and easy to implement. However, its main disadvantage is the requirement of large computational effort because of the large number of model simulations needed to obtain statistically accurate results.
This makes MC simulation prohibitive in most real applications of reservoir simulation, especially for large-scale problems. The ED method associated with different response-surface methodologies is an alternative that is used widely in the petroleum industry for assessing uncertainties in reservoir production and economic appraisal (van Elk et al. 2000; Venkataraman 2000; Friedmann et al. 2003; Yeten et al. 2005; Tipping et al. 2008). Despite its wide application in the petroleum industry, the ED method has seldom been compared with MC simulations for uncertainty quantification. ED provides a proxy of reservoir models, and it has been used to speed up the history-matching process (Amudo et al. 2008; Schaaf et al. 2008). The ED method is more efficient than the direct MC method. Most ED methods use the two-level design (with low and high values) or the three-level design (including the middle value). Multilevel designs increase the number of simulations significantly, and some mixed designs are more efficient (Kalla and White 2007). However, a key disadvantage of the ED approach is that it does not take into account the full probability distributions of the parameters consistently while creating the response surface; that is, the full PDFs are not used for sampling and design but only during post-processing. Further, because all samples are equally weighted for response-surface generation, there is an inherent assumption that the distributions of these parameters are uniform. As a result, the ED method may not be appropriate when parameter distributions are arbitrary, which is common in real-world applications. However, it is still used broadly in the industry, and these limitations are usually disregarded. Recently, Lawal (2009) discussed some limitations of ED methods for modeling uncertainty in the oil industry, such as the inconsistency and nonuniqueness of proxy models and the effect of input distributions on the output. The PCM is another efficient stochastic approach. It has been applied for uncertainty quantification in the context of optimization of petroleum-reservoir production (Sarma et al. 2005) and for quantification of uncertainty for flow in porous media in hydrogeology and petroleum engineering (Li and Zhang 2007, 2009). In the PCM, the dependent random variables are represented by employing orthogonal polynomial functions (polynomial chaos expansions) as the bases of the random space. The polynomial chaos expansions are orthogonal with respect to the specific PDFs of the input random variables. They are capable of encapsulating the possibly nonlinear relationships between input and output random variables by generating accurate “polynomial chaos proxies” with few model evaluations. In addition to Hermite polynomial chaos expansion specified for Gaussian random variables, generalized polynomial chaos expansions have been
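
As a minimal illustration of probabilistic collocation, the sketch below propagates a single standard-normal input through a toy scalar model using a Hermite polynomial chaos expansion; the collocation points are roots of the next-higher-order polynomial and the coefficients follow from a small linear solve. The model function, chaos order, and moment formulas are illustrative assumptions; the paper's numerical construction of orthogonal polynomials for arbitrary input distributions is not reproduced here.

import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Toy stand-in for one reservoir-simulation output as a function of a single
# standard-normal input parameter xi (e.g. a scaled permeability multiplier).
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi**2

p = 3                                        # polynomial-chaos order
# Collocation points: roots of the (p+1)-th probabilists' Hermite polynomial.
pts = He.hermeroots([0.0] * (p + 1) + [1.0])
A = He.hermevander(pts, p)                   # He_0..He_p evaluated at the points
y = model(pts)                               # one "simulation" per collocation point
coeff = np.linalg.solve(A, y)                # chaos coefficients a_0..a_p

# Orthogonality of He_i under N(0,1), E[He_i He_j] = i! * delta_ij, gives moments:
mean = coeff[0]
var = sum(coeff[i] ** 2 * math.factorial(i) for i in range(1, p + 1))
print(f"PCM mean ~ {mean:.4f}, variance ~ {var:.4f}")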


Petroleum Science and Technology | 2008

Computational Techniques for Closed-Loop Reservoir Modeling with Application to a Realistic Reservoir

Pallav Sarma; Louis J. Durlofsky; Khalid Aziz

Abstract Real-time model-based reservoir management requires efficient computational techniques for optimizing reservoir performance under uncertainty. A variety of algorithms addressing various aspects of this “closed-loop” methodology have been presented by various investigators, but substantial effort is still needed to make the entire process robust and efficient. In our recent work, we introduced an approximate feasible direction optimization algorithm for treating nonlinear path constraints (which are constraints such as maximum liquid production rate, which must be satisfied at every time step) and a new parameterization based on kernel principal component analysis (KPCA) for multipoint geostatistical models. The KPCA representation allows for the use of a gradient-based history-matching procedure that is able to maintain a higher degree of geological realism in the history-matched model. In this article, we combine these procedures with our general adjoint-based optimization technique to provide a full closed-loop capability. This integrated set of algorithms is then applied to a realistic field case. Specifically, we describe the key computational procedures and highlight the linkages required to provide the closed-loop capability. The example case considered is based on a Gulf of Mexico reservoir and involves three injection wells and four production wells operating under bottom hole pressure, total injection rate, and maximum water cut constraints. For this case, it is demonstrated that application of the closed-loop methodology provides a 25% increase in the net present value (NPV) over predictions for a realistic base case. This improvement is almost the same as that achieved using an open-loop approach, which is an idealized formulation in which the geological model is assumed to be known. These results demonstrate that the overall closed-loop procedure will indeed be applicable for practical cases with uncertain geology.
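
A rough sketch of the kernel-PCA parameterization idea is given below, using scikit-learn's KernelPCA as a stand-in; the realizations, kernel choice, number of retained components, and the pre-image (inverse mapping) used here are illustrative assumptions, and the paper's own treatment of the pre-image problem and its coupling to the adjoint-based optimizer is not reproduced.

import numpy as np
from sklearn.decomposition import KernelPCA

# Hypothetical ensemble of prior permeability realizations on a 50 x 50 grid,
# each flattened to a vector; random placeholders stand in for geostatistical
# realizations drawn from a multipoint training image.
n_real, nx, ny = 200, 50, 50
realizations = np.random.lognormal(mean=3.0, sigma=1.0, size=(n_real, nx * ny))

# Kernel PCA gives a low-dimensional, differentiable parameterization of the
# (possibly multipoint) geostatistical model.
kpca = KernelPCA(n_components=30, kernel="poly", degree=3,
                 fit_inverse_transform=True)
xi = kpca.fit_transform(realizations)        # reduced coordinates, one row per realization

# A gradient-based history match would iterate on the reduced coordinates and
# map each update back to a full permeability field via the pre-image.
xi_new = xi[0] + 0.1 * np.random.randn(30)   # hypothetical update of one realization
perm_new = kpca.inverse_transform(xi_new.reshape(1, -1)).reshape(nx, ny)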


Computers & Geosciences | 2013

Reduced-order flow modeling and geological parameterization for ensemble-based data assimilation

Jincong He; Pallav Sarma; Louis J. Durlofsky

Reduced-order modeling represents an attractive approach for accelerating computationally expensive reservoir simulation applications. In this paper, we introduce and apply such a methodology for data assimilation problems. The technique applied to provide flow simulation results, trajectory piecewise linearization (TPWL), has been used previously for production optimization problems, where it has provided large computational speedups. The TPWL model developed here represents simulation results for new geological realizations in terms of a linearization around previously simulated (training) cases. The high-dimensional representation of the states is projected into a low-dimensional subspace using proper orthogonal decomposition. The geological models are also represented in reduced terms using a Karhunen-Loeve expansion of the log-transmissibility field. Thus, both the reservoir states and geological parameters are described very concisely. The reduced-order representation of flow and geology is appropriate for use with ensemble-based data assimilation procedures, and here it is incorporated into an ensemble Kalman filter (EnKF) framework to enrich the ensemble at a low cost. The method is able to reconstruct full-order states, which are required by EnKF, whenever necessary. The combined technique enables EnKF to be applied using many fewer high-fidelity reservoir simulations than would otherwise be required to avoid ensemble collapse. For two- and three-dimensional example cases, it is demonstrated that EnKF results using 50 high-fidelity simulations along with 150 TPWL simulations are much better than those using only 50 high-fidelity simulations (for which ensemble collapse is observed) and are, in fact, comparable to the results achieved using 200 high-fidelity simulations.
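
The projection step can be sketched as follows; this is a minimal example with random placeholder snapshots, in which the snapshot selection, centering, and energy criterion are assumptions, and the TPWL linearization built on top of the basis is not shown.

import numpy as np

# Snapshot matrix: full-order states (e.g. cell pressures and saturations) saved
# at selected time steps of a few training runs; random placeholders here.
n_cells, n_snap = 10_000, 120
X = np.random.rand(n_cells, n_snap)

# Proper orthogonal decomposition: left singular vectors of the centered
# snapshot matrix form an energy-ranked orthonormal basis for the states.
x_mean = X.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(X - x_mean, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1   # retain ~99.9% of the energy
Phi = U[:, :r]                                # POD basis, n_cells x r

# Reduce a new full-order state x to r coordinates, and lift it back whenever
# a full-order state is needed (e.g. by the EnKF update):
x = np.random.rand(n_cells, 1)
z = Phi.T @ (x - x_mean)                      # reduced state
x_rec = x_mean + Phi @ z                      # reconstructed full-order state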


SPE Reservoir Simulation Symposium | 2011

Use of Reduced-order Models for Improved Data Assimilation within an EnKF Context

Jincong He; Pallav Sarma; Louis J. Durlofsky

Reduced-order modeling represents an attractive framework for accelerating computationally expensive reservoir simulation applications. In this paper we introduce and apply a reduced-order modeling approach for history matching. The method considered, trajectory piecewise linearization (TPWL), has been used previously for production optimization problems, where it has provided large computational speedups. The TPWL model developed here represents simulation results for new geological models in terms of a linearization around previously simulated (training) cases. The high-dimensional state space is projected into a low-dimensional subspace using proper orthogonal decomposition (POD). The geological model is represented in terms of a Karhunen-Loeve expansion of the log-transmissibility field, so both the reservoir states and geological parameters are described in a very concise way. The method is incorporated into an Ensemble Kalman Filter (EnKF) history-matching procedure. The combined technique enables EnKF to be applied using many fewer (high-fidelity) reservoir simulations than would otherwise be required to avoid ensemble collapse. More specifically, it is demonstrated that EnKF results using 50 high-fidelity simulations along with 150 TPWL simulations are much better than those using only 50 high-fidelity simulations and are, in fact, comparable to the results achieved using 200 high-fidelity simulations.

Introduction

History matching is an essential component of reservoir modeling and management. It entails the updating of the reservoir model using dynamic data, e.g., production data or time-lapse seismic data. History matching can be viewed as an optimization problem in which the mismatch between observed and simulated data is minimized by modifying the parameters associated with the geological model. It typically requires large numbers of simulations, which can be extremely time consuming if high-resolution models are used. Thus, there is a significant need for efficient (proxy or surrogate) models that can predict simulation results with reasonable accuracy. Reduced-order modeling procedures, which have been applied in many application areas including reservoir simulation, represent a means for accelerating flow simulations. Many of these techniques entail the projection of the full-order (high-fidelity) numerical description into a low-dimensional subspace, which reduces the number of unknowns that must be computed at each time step. Existing approaches, applied within the context of reservoir simulation, include procedures based on proper orthogonal decomposition (Cardoso et al., 2009; Markovinović and Jansen, 2006; van Doren et al., 2006) and techniques based on trajectory piecewise linearization, TPWL (Cardoso and Durlofsky, 2010a,b; He, 2010). The target application in these studies was production optimization, and the reduced-order model was used to provide results for varying well control parameters (bottomhole pressure or flow rates).
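
For readers unfamiliar with the assimilation step, a textbook perturbed-observation EnKF analysis is sketched below; the variable names, the linear observation operator, and the perturbed-observation variant are assumptions for illustration, not details taken from the paper.

import numpy as np

def enkf_update(ensemble, H, d_obs, R):
    """One EnKF analysis step in perturbed-observation form.

    ensemble : (n_state, n_ens) augmented state/parameter ensemble (forecast)
    H        : (n_obs, n_state) linear observation operator
    d_obs    : (n_obs,) observed data
    R        : (n_obs, n_obs) observation-error covariance
    """
    n_state, n_ens = ensemble.shape
    A = ensemble - ensemble.mean(axis=1, keepdims=True)     # ensemble anomalies
    C = A @ A.T / (n_ens - 1)                                # sample covariance
    K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)             # Kalman gain
    # One noisy copy of the data per ensemble member (perturbed observations).
    D = d_obs[:, None] + np.random.multivariate_normal(
        np.zeros(len(d_obs)), R, size=n_ens).T
    return ensemble + K @ (D - H @ ensemble)                 # analysis ensemble

# Tiny example with random placeholders: 100 members, 5 states, 2 observations.
ens = np.random.randn(5, 100)
H = np.eye(2, 5)
ens_a = enkf_update(ens, H, d_obs=np.array([0.5, -0.2]), R=0.01 * np.eye(2))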


ECMOR X - 10th European Conference on the Mathematics of Oil Recovery | 2006

Computational Techniques for Closed-Loop Reservoir Modeling with Application to a Realistic Reservoir

Pallav Sarma; Louis J. Durlofsky; Khalid Aziz

This paper extends and applies novel computational procedures for the efficient closed-loop optimal control of petroleum reservoirs under uncertainty. It addresses two important issues that were present in our earlier implementation [2] that limited the application of the procedure to practical problems. Specifically, the previous approach encountered difficulties in handling nonlinear path constraints (constraints that must be satisfied at every time-step of the forward model) during optimization. Such constraints (e.g., maximum liquid production rate) are frequently present in practical problems. To address this issue, an approximate feasible direction optimization algorithm was proposed. The algorithm uses the objective function gradient and a combined gradient of the active constraints [3], both of which can be obtained efficiently with adjoint models. The second limitation of the implementation in [2] was the use of the standard Karhunen-Loeve (K-L) expansion for parameterization of the input random fields of the simulation model. This parameterization is computationally expensive and preserves only two-point statistics of the random field. It is thus not suitable for large simulation models or for complex geological scenarios, such as channelized systems. In another paper [4], a nonlinear form of the K-L expansion, referred to as kernel PCA, is applied for parameterizing arbitrary random fields. Kernel PCA successfully addresses the limitations of the K-L expansion, and is differentiable, meaning that gradient-based methods can be utilized in conjunction with this parameterization within the closed-loop. An example based on a Gulf of Mexico reservoir model is considered. For this case it is demonstrated that the proposed algorithms indeed provide a viable real-time closed-loop optimization framework. Application of the closed-loop methodology is shown to result in a 25% increase in NPV over the base case. This is almost the same improvement achieved using an open-loop approach, which is an idealized formulation in which the geological model is assumed to be known.
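
For context, the standard K-L expansion discussed above represents a random field m(x) (for example, log-permeability) through the eigenpairs (\lambda_i, \phi_i) of its covariance function; the notation below is assumed for illustration.

m(\mathbf{x}) \;\approx\; \bar{m}(\mathbf{x}) \;+\; \sum_{i=1}^{N_\xi} \sqrt{\lambda_i}\,\phi_i(\mathbf{x})\,\xi_i,
\qquad \xi_i \sim \mathcal{N}(0,1)\ \text{i.i.d.},

so the field is controlled by the low-dimensional vector \xi. Because the expansion is built only from the covariance, it honors two-point statistics; kernel PCA performs an analogous eigen-decomposition in a high-dimensional feature space of the realizations, which is how higher-order (multipoint) statistics can be retained while keeping the parameterization differentiable in \xi.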


SPE Reservoir Simulation Symposium | 2015

Massively Distributed Simulation and Optimization on Commercial Compute Clouds

Pallav Sarma; Jim Owens; Wen Chen; Xian-Huan Wen

Many of the petroleum industry’s most widely used uncertainty quantification and optimization workflows, such as experimental design, ideally require significant computational resources (hundreds if not thousands of model evaluations) at a scale not currently available within many oil companies. This study demonstrates the efficiency of using massively distributed computing for reservoir simulation, uncertainty quantification, and optimization on a commercially available compute cloud. There are certain benefits to commercial compute clouds, including: 1) virtually unlimited scalability; that is, at a moment's notice, thousands of virtual servers can be launched to meet the computational workload; 2) an on-demand nature; one pays only for what is used; and 3) a service-oriented architecture; the cloud is maintained and managed by the service provider, possibly resulting in reduced maintenance costs. The end result is that cloud solutions can be more economically viable than provisioning and maintaining private clouds. In this study, we launched a 3000 ECU Linux cluster on the Amazon EC2™ web service (“EC2”) with the MIT StarCluster™ software. The 3000 ECU cluster on the EC2 web service was built from 150 RedHat Linux c1.xlarge instances, each consisting of 8 vCPUs (virtual cores), enabling 1200 simultaneous distributed reservoir simulations. An Amazon Machine Image (AMI) was created with a reservoir simulator installed on it and was then used to launch these 150 c1.xlarge instances, enabling the simulator to run on them. In-house optimization software residing on a workstation within Chevron's intranet was connected to a head node on the EC2 web service via a pinhole. The head node was further connected to a master node and the 150 worker nodes via the StarCluster software. The EC2 cluster was then used for ensemble-based optimization of a realistic field model to maximize NPV by controlling well BHPs. Because the ensemble gradient can be calculated much more accurately with the large ensemble enabled by a large cluster, an order-of-magnitude speedup was achieved in the total time required for optimization using this large distributed compute capacity, compared with our internal clusters, where the number of cores available to a given user for distributed computing is limited.
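
As a rough local stand-in for the distributed workflow described above, the sketch below evaluates an ensemble of perturbed control vectors in parallel and forms an EnOpt-style stochastic gradient of NPV; the process pool replaces the EC2/StarCluster worker nodes, and the toy NPV function, ensemble size, step sizes, and optimizer loop are all illustrative assumptions rather than the study's actual setup.

import numpy as np
from concurrent.futures import ProcessPoolExecutor

def run_simulation(controls):
    # Placeholder for one full reservoir-simulation run returning NPV; in the
    # study each evaluation is a simulator run on a cloud worker node.
    return -np.sum((controls - 2.0) ** 2) + np.random.normal(scale=0.1)

def ensemble_gradient(u, n_ens=200, sigma=0.05, max_workers=32):
    """EnOpt-style gradient estimate: perturb the controls, evaluate all
    perturbations in parallel, and regress NPV on the perturbations."""
    U = u + sigma * np.random.randn(n_ens, u.size)           # perturbed controls
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        J = np.array(list(pool.map(run_simulation, U)))      # parallel NPV values
    dU = U - U.mean(axis=0)
    dJ = J - J.mean()
    return dU.T @ dJ / (n_ens * sigma**2)                    # stochastic gradient

if __name__ == "__main__":
    u = np.zeros(10)                    # e.g. BHP controls for 10 well segments
    for _ in range(50):                 # simple gradient ascent on NPV
        u = u + 0.5 * ensemble_gradient(u)
    print("optimized controls:", np.round(u, 3))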


Computational Geosciences | 2006

Efficient real-time reservoir management using adjoint-based optimal control and model updating

Pallav Sarma; Louis J. Durlofsky; Khalid Aziz; Wen H. Chen


Mathematical Geosciences | 2008

Kernel Principal Component Analysis for Efficient, Differentiable Parameterization of Multipoint Geostatistics

Pallav Sarma; Louis J. Durlofsky; Khalid Aziz
