Juliana Y. Leung
University of Alberta
Publication
Featured research published by Juliana Y. Leung.
Expert Systems With Applications | 2015
Ehsan Amirian; Juliana Y. Leung; Stefan Zanon; Peter John Dzurman
Data-driven modeling provides an attractive alternative for predicting SAGD recovery. The modeling approach is applied successfully to heterogeneous reservoirs. Arps parameters are proposed to parameterize production time-series data. A normalized shale indicator is used as a pertinent input attribute. Accuracy of the prediction is greatly enhanced when cluster analyses are performed. Evaluation of steam-assisted gravity drainage (SAGD) performance that involves detailed compositional simulations is usually deterministic, cumbersome, expensive (manpower- and time-consuming), and not quite suitable for practical decision making and forecasting, particularly when dealing with a high-dimensional data space consisting of a large number of operational and geological parameters. Data-driven modeling techniques, which entail comprehensive data analysis and implementation of machine learning methods for system forecast, provide an attractive alternative. In this paper, an artificial neural network (ANN) is employed to predict SAGD production in heterogeneous reservoirs, an important application that is lacking in the existing literature. Numerical flow simulations are performed to construct a training data set consisting of various attributes describing characteristics associated with reservoir heterogeneities and other relevant operating parameters. Empirical Arps decline parameters are tested successfully for parameterization of the cumulative production profile and are considered as outputs of the ANN models. Sensitivity studies on network configurations are also conducted. Principal components analysis (PCA) is performed to reduce the dimensionality of the input vector, improve prediction quality, and limit over-fitting. In a case study, reservoirs with distinct heterogeneity distributions are fed to the model. It is shown that the robustness and accuracy of the prediction capability are greatly enhanced when cluster analyses are performed to identify internal data structures and groupings prior to ANN modeling. Both deterministic and fuzzy-based clustering techniques are compared, and a separate ANN model is constructed for each cluster. The model is then tested using a validation data set (cases that have not been used during the training stage). The proposed approach can be integrated directly into most existing reservoir management routines. In addition, incorporating techniques for dimensionality reduction and clustering with ANN demonstrates the viability of this approach for analyzing large field data sets. Given that quantitative ranking of operating areas, robust forecasting, and optimization of heavy oil recovery processes are major challenges faced by the industry, the proposed research highlights the significant potential of applying effective data-driven modeling approaches in analyzing other solvent-additive steam injection projects.
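A minimal sketch of the clustering-then-ANN workflow described above, assuming scikit-learn and purely synthetic stand-in data; the attribute dimensions, cluster count, and network size are illustrative choices rather than the paper's settings.

```python
# Illustrative sketch: PCA for dimensionality reduction, clustering to find internal
# groupings, then a separate ANN per cluster (synthetic placeholder data throughout).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))   # reservoir/operational attributes (synthetic)
y = rng.normal(size=(500, 3))    # Arps decline parameters as targets (synthetic)

# Reduce input dimensionality before network training.
pca = PCA(n_components=5).fit(X)
X_pca = pca.transform(X)

# Identify groupings, then train one ANN per cluster.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_pca)
labels = kmeans.labels_
models = {}
for c in np.unique(labels):
    idx = labels == c
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                                     random_state=0))
    models[c] = net.fit(X_pca[idx], y[idx])
```

For a new case, the cluster label would be obtained from the fitted clustering model and the corresponding network used for prediction.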
Expert Systems With Applications | 2015
Zhiwei Ma; Juliana Y. Leung; Stefan Zanon; Peter John Dzurman
Input attributes descriptive of SAGD reservoir heterogeneities are formulated. Neural network models are trained using a comprehensive field dataset. Uncertainty analysis is performed involving Monte Carlo and bootstrapping methods. Sensitivity of the model architecture is explored. Results demonstrate important potential in facilitating SAGD production analysis. Quantitative appraisal of different operating areas and assessment of uncertainty due to reservoir heterogeneities are crucial elements in the optimization of production and development strategies in oil sands operations. Although detailed compositional simulators are available for recovery performance evaluation of steam-assisted gravity drainage (SAGD), the simulation process is usually deterministic and computationally demanding, and it is not quite practical for real-time decision-making and forecasting. Data mining and machine learning algorithms provide efficient modeling alternatives, particularly when the underlying physical relationships between system variables are highly complex, non-linear, and possibly uncertain. In this study, a comprehensive training set encompassing SAGD field data compiled from numerous publicly available sources is analyzed. Exploratory data analysis (EDA) is carried out to interpret and extract relevant attributes describing characteristics associated with reservoir heterogeneities and operating constraints. An extensive dataset consisting of over 70 records is assembled. Because of their ease of implementation and computational efficiency, knowledge-based techniques including artificial neural networks (ANN) are employed to facilitate SAGD production performance prediction. The principal components analysis (PCA) technique is implemented to reduce the dimensionality of the input vector, alleviate the effects of over-fitting, and improve forecast quality. Statistical analysis is performed to analyze the uncertainties related to the ANN model parameters and the dataset. Predictions from the proposed approaches are both successful and reliable. It is demonstrated that model predictability is highly influenced by model parameter uncertainty. This work illustrates that data-driven models are capable of predicting SAGD recovery performance from log-derived and operational variables. The modeling approach can be updated when new information becomes available. The analysis presents important potential for direct integration into existing reservoir management and decision-making routines.
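As a hedged illustration of the bootstrap-style uncertainty analysis mentioned above (assuming scikit-learn; the 75-record dataset below is a random placeholder, not the field data), resampling the training records and refitting the network yields an empirical forecast interval:

```python
# Bootstrap sketch for quantifying the effect of data/parameter uncertainty on an
# ANN forecast; all inputs are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.utils import resample

rng = np.random.default_rng(1)
X = rng.normal(size=(75, 6))     # ~70+ records of log-derived/operational inputs
y = rng.normal(size=75)          # SAGD performance metric (placeholder)

n_boot, preds = 200, []
x_new = rng.normal(size=(1, 6))  # a new case to forecast
for b in range(n_boot):
    Xb, yb = resample(X, y, random_state=b)   # bootstrap the training records
    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=b)
    preds.append(net.fit(Xb, yb).predict(x_new)[0])

p5, p50, p95 = np.percentile(preds, [5, 50, 95])  # empirical forecast interval
```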
Mathematical Geosciences | 2015
Siavash Nejadi; Juliana Y. Leung
The Ensemble Kalman Filter (EnKF) is a Monte Carlo-based technique for assisted history matching and real-time updating of reservoir models. However, it often fails to detect the precise locations of distinct facies boundaries and their proportions, as the facies distributions are non-Gaussian, while geologic data for reservoir modeling is usually insufficient. In this paper, a new re-sampling step is introduced into the conventional EnKF formulation; after a certain number of assimilation steps, the updated ensemble is used to generate a new ensemble with a novel probability-weighted re-sampling scheme. The new ensemble samples from a probability density function that is conditional to both the geological information and the early production data. After the re-sampling step, the forecast model is applied to the new ensemble from the beginning up to the last update step (without any intermediate Kalman updates). The full EnKF is again applied to the ensemble members to assimilate the remaining production history. Combining the EnKF with the regeneration of new members using the re-sampling method demonstrates a reasonable improvement and reduction of uncertainty in the history matching of reservoir models with multiple facies. The histogram and the experimental variogram of the updated ensemble members are more consistent with the static geologic information. Moreover, the technique helps maintain the ensemble variance, which is essential for uncertainty estimation in the posterior probability distribution of facies proportions.
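A schematic NumPy sketch of one EnKF analysis step followed by the probability-weighted re-sampling idea; the linearized observation operator, noise levels, and ensemble size are stand-ins for illustration and not the paper's configuration.

```python
# One EnKF analysis step, then likelihood-weighted re-sampling of the ensemble.
import numpy as np

rng = np.random.default_rng(2)
Ne, Nm, Nd = 100, 50, 10                # ensemble size, model parameters, observations
X = rng.normal(size=(Nm, Ne))           # prior ensemble (e.g., facies-related params)
H = rng.normal(size=(Nd, Nm)) / Nm      # linearized observation operator (stand-in)
R = 0.1 * np.eye(Nd)                    # observation-error covariance
d_obs = rng.normal(size=Nd)             # early production data (placeholder)

# EnKF analysis: K = Pf H' (H Pf H' + R)^-1
A = X - X.mean(axis=1, keepdims=True)
Pf = A @ A.T / (Ne - 1)
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
D = d_obs[:, None] + rng.multivariate_normal(np.zeros(Nd), R, Ne).T  # perturbed obs
Xa = X + K @ (D - H @ X)

# Probability-weighted re-sampling: weight each member by its data likelihood,
# then draw a new ensemble from which the forecast is rerun from time zero.
misfit = d_obs[:, None] - H @ Xa
logw = -0.5 * np.sum(misfit * np.linalg.solve(R, misfit), axis=0)
w = np.exp(logw - logw.max())
w /= w.sum()
X_new = Xa[:, rng.choice(Ne, size=Ne, p=w)]
```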
Journal of Contaminant Hydrology | 2016
Juliana Y. Leung; Sanjay Srinivasan
Modeling transport processes at large scales requires proper scale-up of subsurface heterogeneity and an understanding of its interaction with the underlying transport mechanisms. A technique based on volume averaging is applied to quantitatively assess the scaling characteristics of the effective mass transfer coefficient in heterogeneous reservoir models. The effective mass transfer coefficient represents the combined contribution from diffusion and dispersion to the transport of non-reactive solute particles within a fluid phase. Although treatment of transport problems with the volume averaging technique has been published in the past, application to geological systems exhibiting realistic spatial variability remains a challenge. Previously, the authors developed a new procedure in which results from a fine-scale numerical flow simulation, reflecting the full physics of the transport process albeit over a sub-volume of the reservoir, are integrated with the volume averaging technique to provide an effective description of transport properties. The procedure is extended such that spatial averaging is performed at the local-heterogeneity scale. In this paper, the transport of a passive (non-reactive) solute is simulated on multiple reservoir models exhibiting different patterns of heterogeneities, and the scaling behavior of the effective mass transfer coefficient (Keff) is examined and compared. One such set of models exhibits power-law (fractal) characteristics, and the variability of dispersion and Keff with scale is in good agreement with analytical expressions described in the literature. This work offers insight into the impacts of heterogeneity on the scaling of effective transport parameters. A key finding is that spatial heterogeneity models with similar univariate and bivariate statistics may exhibit different scaling characteristics because of the influence of higher-order statistics. More mixing is observed in the channelized models with higher-order continuity. This reinforces the notion that the flow response is influenced by the higher-order statistical description of heterogeneity. An important implication is that when scaling up transport response from lab-scale results to the field scale, it is necessary to account for the scale-up of heterogeneity. Since the characteristics of higher-order multivariate distributions and large-scale heterogeneity are typically not captured in small-scale experiments, a reservoir modeling framework that captures the uncertainty in heterogeneity description should be adopted.
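The paper's treatment rests on volume averaging; as a simpler, hedged illustration of how an effective dispersion-type transport coefficient can be extracted from a simulated non-reactive plume, the method-of-moments estimate D_eff(t) = (1/2) d(sigma^2)/dt can be computed from particle positions. All inputs below are assumed values, not the study's models.

```python
# Method-of-moments sketch: effective coefficient from the growth rate of plume variance.
import numpy as np

rng = np.random.default_rng(3)
n, dt, nsteps = 5000, 1.0, 200
v, D_local = 1.0e-2, 1.0e-4              # mean velocity and local coefficient (assumed)
x = np.zeros(n)
var = []
for _ in range(nsteps):
    x += v * dt + np.sqrt(2.0 * D_local * dt) * rng.standard_normal(n)
    var.append(x.var())

t = dt * np.arange(1, nsteps + 1)
var = np.asarray(var)
D_eff = 0.5 * np.gradient(var, t)        # D_eff(t) = 0.5 d(sigma^2)/dt
```

In a heterogeneous velocity field the same estimate would grow with averaging scale, which is the scaling behavior the paper examines.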
Archive | 2017
Vikrant Vishal; Juliana Y. Leung
Numerical methods are often used to simulate and analyze flow and transport in heterogeneous reservoirs. However, they are limited by computational restrictions, including small time steps and fine grid sizes to avoid numerical dispersion. The ability to perform efficient coarse-scale simulations that capture the uncertainties in reservoir attributes and transport parameters introduced by scale-up remains challenging. A novel method is formulated to properly represent sub-grid variability in coarse-scale models. First, multiple sub-grid realizations depicting detailed fine-scale heterogeneities, and of the same physical size as the transport modeling grid block, are subjected to random walk particle tracking (RWPT) simulation, which is not prone to numerical dispersion. To capture additional unresolved heterogeneities occurring below even the fine scale, the transition time is sampled stochastically in a fashion similar to the continuous time random walk (CTRW) formulation. Coarse-scale effective dispersivities and transition times are estimated by matching the corresponding effluent history for each realization with an equivalent medium consisting of averaged homogeneous rock properties. Probability distributions of scaled-up effective parameters conditional to particular averaged rock properties are established by aggregating results from all realizations. Next, to scale up porosity and permeability, the volume variance at the transport modeling scale is computed corresponding to a given spatial correlation model; numerous sets of "conditioning data" are sampled from probability distributions whose mean is the block average of the actual measured values and whose variance is the variance of the block mean. Multiple realizations at the transport modeling scale are subsequently constructed via stochastic simulations. The method is applied to model the tracer injection process. Results obtained from coarse-scale models, where properties are populated with the proposed approach, are in good agreement with those obtained from detailed fine-scale models. With the advances in nanoparticle technology and its increasing application in unconventional reservoirs, the method presented in this study has significant potential in analyzing tracer tests for the characterization of complex reservoirs and reliable assessment of fluid distribution. The approach can also be employed to study scale-dependent dispersivity and its impacts in miscible displacement processes.
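A hedged sketch of a single RWPT realization with stochastically sampled transition times in the spirit of CTRW; the Pareto-type waiting-time law, velocity, dispersivity, and block length below are illustrative assumptions, not the study's calibrated values.

```python
# RWPT with random transition times; the breakthrough (effluent) history at the block
# outlet is what would be matched against an equivalent homogeneous medium.
import numpy as np

rng = np.random.default_rng(4)
n_particles, n_steps = 2000, 500
x = np.zeros(n_particles)               # particle positions along the block
t = np.zeros(n_particles)               # particle clocks
v, alpha_L = 2.0e-2, 0.1                # velocity and longitudinal dispersivity (assumed)
beta, tau0 = 1.5, 1.0                   # waiting-time exponent / time scale (assumed)
L = 10.0                                # block length (assumed)
arrived = np.full(n_particles, np.inf)  # first-passage (breakthrough) times

for _ in range(n_steps):
    # Pareto-type transition time stands in for unresolved sub-grid heterogeneity.
    dt = tau0 * (rng.pareto(beta, n_particles) + 1.0)
    x += v * dt + np.sqrt(2.0 * alpha_L * v * dt) * rng.standard_normal(n_particles)
    t += dt
    newly = (x >= L) & np.isinf(arrived)
    arrived[newly] = t[newly]

bt = arrived[np.isfinite(arrived)]
hist, edges = np.histogram(bt, bins=100)  # effluent history for calibration
```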
Spe Reservoir Evaluation & Engineering | 2016
Mingyuan Wang; Juliana Y. Leung
Less than half of the fracturing fluid is typically recovered during flow-back operations. This study models the effects of capillarity and geomechanics on water loss in the fracture-matrix system and investigates the circumstances under which this phenomenon might be beneficial or detrimental to subsequent tight oil production. During the shut-in (soaking) and flow-back periods, fracture conductivity decreases as the effective stress increases, a change driven by imbibition. Previous works have studied the geomechanical influence assuming single-phase flow, but the coupled effect of imbibition due to multiphase flow and stress-dependent fracture properties is less understood. A series of mechanistic simulation models are constructed to simulate multiphase flow and fluid distribution during shut-in and flow-back. Three systems, the matrix, the hydraulic fracture, and the micro-fractures, are explicitly represented in the computational domain. Sensitivities to wettability and multiphase flow functions (relative permeability and capillary pressure relationships) are investigated. As wettability to water increases, matrix imbibition increases. Imbibed water would help to displace hydrocarbons into nearby micro- and hydraulic fractures, enhancing the initial oil rate. However, the fracture compacts as the fluid pressure decreases. The increase in short-term oil production as a result of imbibition could be counteracted by the reduction in flow capability due to fracture closure. Therefore, the coupling of stress-dependent fracture conductivity and imbibition is studied next. Our results indicate that fracture compaction can enhance imbibition and water loss, which in turn leads to further reduction in fracture pressure and conductivity. Spatial variability in micro-fracture properties is also modeled probabilistically to investigate whether it is possible for fracturing fluid to be trapped in the micro-fractures or, conversely, whether the micro-fractures could provide alternate pathways for fluids to access the hydraulic fracture system. This work presents a quantitative study of the controlling factors of water loss due to fluid-rock properties and geomechanics. The results highlight the crucial interplay between imbibition and geomechanics in short- and long-term production performance. The results of this study would have considerable impacts on understanding and improving current industry practice in fracturing design and assessment of the stimulated reservoir volume.
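For illustration only, a stress-dependent fracture-conductivity relation of the kind such coupled studies rely on can be sketched as below; the exponential form and all coefficients are common modeling assumptions, not values from this paper.

```python
# Fracture conductivity declining as effective stress rises with pressure depletion
# during shut-in and flow-back (illustrative exponential law and coefficients).
import numpy as np

def fracture_conductivity(p_frac, sigma_total=30.0, k0_md_ft=500.0, c=0.12):
    """Fracture conductivity (md-ft) as a function of fracture fluid pressure (MPa)."""
    sigma_eff = sigma_total - p_frac          # effective normal stress on the fracture
    return k0_md_ft * np.exp(-c * sigma_eff)  # conductivity drops as stress increases

# As fracture pressure declines from 28 to 15 MPa, conductivity falls accordingly:
p = np.linspace(28.0, 15.0, 6)
print(np.round(fracture_conductivity(p), 1))
```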
IFAC Proceedings Volumes | 2012
Siavash Nejadi; Juliana Y. Leung
The Ensemble Kalman Filter (EnKF) is a Monte Carlo-based technique for assisted history matching and real-time updating of reservoir models. However, it often fails to detect facies boundaries and proportions, as the facies distributions are non-Gaussian, while geologic data for reservoir modeling is usually insufficient. It is convenient to represent distinct facies with non-Gaussian categorical indicators; we implemented the discrete cosine transform (DCT) to parameterize the facies indicators into coefficients of the retained cosine basis functions, which are Gaussian. For highly complex and heterogeneous models, although the observed data were matched, this approach failed to reproduce a realistic facies distribution corresponding to the reference variogram and facies proportions. In this paper, we propose a new ensemble filtering method that lies between the EnKF and the particle filter (PF): the EnKF acts as the predictor, retaining its advantage of accurate, large updates with small ensembles; a corrector step handles the non-Gaussian distributions; and the EnKF is applied again for the analysis step. Correction is performed by regenerating new realizations using a new pilot point method. The ensemble members that are more consistent with the early production history and the available geological information are treated as high-weight particles and used in the regeneration step. The combination of DCT-EnKF and the regeneration of new realizations using the new pilot point method demonstrates a reasonable improvement and reduction of uncertainty in facies detection. Incorporating the new step in the procedure assists the filter in honoring the reference distribution and experimental variogram during the history matching process and presents important potential for improved characterization of complex reservoirs.
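A minimal sketch of the DCT parameterization step, assuming SciPy: a binary facies indicator field is transformed, only low-frequency coefficients are retained as the (approximately Gaussian) state variables updated by the filter, and the field is recovered by inverse DCT plus thresholding. The field size, indicator field, and truncation level are illustrative.

```python
# DCT parameterization of a categorical facies indicator field.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(5)
facies = (rng.normal(size=(64, 64)) > 0.5).astype(float)   # 0/1 indicator (stand-in)

coeffs = dctn(facies, norm="ortho")
k = 16                                    # number of retained basis functions per axis
retained = np.zeros_like(coeffs)
retained[:k, :k] = coeffs[:k, :k]         # low-frequency coefficients -> filter state

reconstructed = idctn(retained, norm="ortho")
facies_back = (reconstructed > 0.5).astype(int)   # map back to discrete facies
```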
Stochastic Environmental Research and Risk Assessment | 2018
Vikrant Vishal; Juliana Y. Leung
Numerical techniques for subsurface flow and transport modeling are often limited by computational constraints, including fine meshes and small time steps to control artificial dispersion. Particle-tracking simulation offers a robust alternative for modeling solute transport in subsurface formations. However, the modeling scale usually differs substantially from the rock measurement scale, and the scale-up of measurements has to be made accounting for the pattern of spatial heterogeneity exhibited at different scales. Therefore, it is important to construct accurate coarse-scale simulations that are capable of capturing the uncertainties in reservoir and transport attributes due to scale-up. A statistical scale-up procedure developed in our previous work is extended by considering the effects of unresolved (residual) heterogeneity below the resolution of the finest modeling scale in 3D. First, a scale-up procedure based on the concept of volume variance is employed to construct realizations of permeability and porosity at the (coarse) transport modeling scale, at which flow or transport simulation is performed. Next, to compute various effective transport parameters, a series of realizations exhibiting detailed heterogeneities at the fine scale, whose domain size is the same as the transport modeling scale, are generated. These realizations are subjected to a hybrid particle-tracking simulation. A probabilistic transition time is considered, borrowing the idea from the continuous time random walk (CTRW) technique to account for any sub-scale heterogeneity at the fine-scale level. The approach is validated against analytical solutions and the general CTRW formulation. Finally, coarse-scale transport variables (i.e., dispersivities and the parameterization of the transition time distribution) are calibrated by minimizing the mismatch in effluent history with the equivalent averaged models. Construction of conditional probability distributions of effective parameters is facilitated by integrating the results over the entire suite of realizations. The proposed method is flexible, as it does not invoke any explicit assumption regarding the multivariate distribution of the heterogeneity. In contrast to other hierarchical CTRW formulations for modeling multi-scale heterogeneities, the proposed approach does not impose any length-scale requirement regarding sub-grid heterogeneities. In fact, it aims to capture the uncertainty in effective reservoir and transport properties due to the presence of heterogeneity at the intermediate scale, which is larger than the finest resolution of heterogeneity but smaller than the representative elementary volume, and often comparable to the transport modeling scale.
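A schematic NumPy/SciPy sketch of the volume-variance step described above: block-average a fine-scale property, estimate the variance of the block means, and sample coarse-scale "conditioning data" around the block averages. The Gaussian-filtered random field is only a stand-in for a geostatistical realization, and the block size is illustrative.

```python
# Volume-variance scale-up sketch: block means, variance of block means, and sampled
# coarse-scale conditioning data (2D stand-in for the 3D workflow).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(6)
fine = gaussian_filter(rng.normal(size=(128, 128)), sigma=4)   # correlated fine field
block = 16                                                     # coarse block size (cells)

# Block means and the variance of the block means (proxy for the dispersion variance).
blocks = fine.reshape(128 // block, block, 128 // block, block).mean(axis=(1, 3))
var_block_mean = blocks.var()

# One set of coarse-scale conditioning data sampled around the block averages.
conditioning = rng.normal(loc=blocks, scale=np.sqrt(var_block_mean))
```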
Transport in Porous Media | 2018
Vikrant Vishal; Juliana Y. Leung
Coarse-scale models are generally preferred in the numerical simulation of multi-phase flow due to computational constraints. However, capturing the effects of fine-scale heterogeneity on flow and isolating the impacts of numerical (artificial) dispersion, which increases with scale, are not trivial. In this paper, a particle-tracking method is devised and integrated into a scale-up workflow to estimate the conditional probability distributions of multi-phase flow functions, which can be considered as inputs in coarse-scale simulations with existing commercial packages. First, a novel particle-tracking method is developed to solve the saturation transport equation. The transport calculation is coupled with a velocity update, following the implicit-pressure, explicit-saturation (IMPES) framework, to solve the governing equations of two-phase immiscible flow. Each phase particle is advanced in a deterministic convection step according to the phase velocity, as well as in a stochastic dispersion step based on random Brownian motion. A kernel-based formulation is proposed for the computation of fluid saturation in accordance with the phase particle distribution. A novel aspect is that this method employs the kernel approach to construct saturation from the phase particle distribution, which is an important improvement over the conventional box method that necessitates a large number of particles per grid cell for consistent saturation interpolation. The model is validated against various analytical solutions. Finally, the validated model is integrated in a statistical scale-up procedure to calibrate effective, or "pseudo," multi-phase flow functions (e.g., relative permeability functions). The proposed scale-up framework does not impose any length-scale requirement regarding the distribution of sub-grid heterogeneities.
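A hedged 1D sketch of the kernel-based saturation construction; the particle counts, kernel bandwidth, and phase distributions are illustrative, equal fluid volume per particle is assumed, and the coupled pressure solve is omitted.

```python
# Gaussian-kernel estimate of water saturation from phase-particle positions,
# in place of box (histogram) counting per grid cell.
import numpy as np

rng = np.random.default_rng(7)
x_wet = rng.uniform(0.0, 0.4, size=2000)     # "water" phase particle positions
x_oil = rng.uniform(0.0, 1.0, size=2000)     # "oil" phase particle positions
grid = np.linspace(0.0, 1.0, 51)             # saturation evaluation points
h = 0.03                                     # kernel bandwidth (assumed)

def kernel_density(xp, centers, h):
    """Gaussian-kernel particle density evaluated at the grid centers."""
    u = (centers[:, None] - xp[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (h * np.sqrt(2 * np.pi))

rho_w = kernel_density(x_wet, grid, h)
rho_o = kernel_density(x_oil, grid, h)
Sw = rho_w / (rho_w + rho_o + 1e-12)         # water saturation from phase densities
```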
Neural Computing and Applications | 2018
Jingwen Zheng; Juliana Y. Leung; Ronald P. Sawatzky; Jose Alvarez
An artificial intelligence (AI)-based workflow is deployed to develop and test procedures for estimating shale barrier configurations from SAGD production profiles. The data employed in this project are derived from a set of synthetic SAGD reservoir simulations based on petrophysical properties and operational constraints representative of Athabasca oil sands reservoirs. Initially, a two-dimensional reservoir simulation model is employed. The underlying model is homogeneous. Its petrophysical properties, such as the porosity, permeability, initial oil saturation and net pay thickness, have been taken from average values for several pads in Suncor’s Firebag project. Reservoir heterogeneities are simulated by superimposing sets of idealized shale barrier configurations on the homogeneous model. The location and geometry of each shale barrier are parameterized by a unique set of indices. The resulting heterogeneous model is subjected to flow simulation to simulate SAGD production. Next, a two-step workflow is followed: (1) a network model based on AI tools is constructed to match the output of the reservoir simulation (shale indices are inputs, while production rate is the output) for a known training set of shale barrier configurations; (2) for a new SAGD production history generated via reservoir simulation with a shale barrier configuration that is unknown to the AI model generated in Step 1, an optimization scheme based on a genetic algorithm approach is adopted to perturb the shale indices until the difference between the target production history and the production history predicted from the AI model is minimized. A number of cases have been tested. The results show good agreement between the shale barrier configurations predicted by the AI model and the configurations used to generate production histories in the reservoir simulation model (i.e., the “true” model). Thus, this optimization workflow offers potential to become an alternative tool for indirect inference of the uncertain distribution of shale barriers in SAGD reservoirs from data capturing field performance. This work highlights the potential of an AI-based workflow to infer the presence and distribution of heterogeneous shale barriers from field SAGD production time-series data. It presents an innovative parameterization scheme suitable for representing heterogeneous characteristics of shale barriers. If this approach proves to be successful, it could allow the distribution of shale barriers to be inferred together with the impact of these barriers on SAGD performance. This would provide a basis for developing operating strategies to reduce the impact of the barriers.
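A compact genetic-algorithm sketch of Step 2, in which a vector of shale indices is perturbed until a proxy model reproduces a target production history; the `proxy` function is a hypothetical stand-in for the trained network from Step 1, and the population size, crossover, and mutation settings are illustrative.

```python
# Genetic-algorithm inversion of shale-barrier indices against a proxy production model.
import numpy as np

rng = np.random.default_rng(8)
n_idx, n_t = 6, 50                                  # shale indices, time steps
true_w = rng.normal(size=(n_idx, n_t))              # internals of the stand-in proxy

def proxy(indices):
    """Placeholder for the trained AI model from Step 1 (hypothetical)."""
    return indices @ true_w

target = proxy(rng.uniform(0, 1, n_idx))            # "observed" production history

def fitness(pop):
    return -np.sum((proxy(pop) - target) ** 2, axis=1)   # negative history mismatch

pop = rng.uniform(0, 1, size=(40, n_idx))
for gen in range(100):
    f = fitness(pop)
    parents = pop[np.argsort(f)[-20:]]              # selection: keep the best half
    mates = parents[rng.integers(0, 20, size=(20, 2))]
    cross = rng.random((20, n_idx)) < 0.5           # uniform crossover
    children = np.where(cross, mates[:, 0, :], mates[:, 1, :])
    children += 0.05 * rng.standard_normal(children.shape)   # mutation
    pop = np.vstack([parents, np.clip(children, 0, 1)])

best = pop[np.argmax(fitness(pop))]                 # recovered shale-index vector
```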