Publication


Featured research published by Alexandre Boucher.


IEEE Transactions on Geoscience and Remote Sensing | 2008

Geostatistical Solutions for Super-Resolution Land Cover Mapping

Alexandre Boucher; Phaedon C. Kyriakidis; Collin Cronkite-Ratcliff

Super-resolution land cover mapping aims at producing fine spatial resolution maps of land cover classes from a set of coarse-resolution class fractions derived from satellite information via, for example, spectral unmixing procedures. Based on a prior model of spatial structure or texture that encodes the expected patterns of classes at the fine (target) resolution, this paper presents a sequential simulation framework for generating alternative super-resolution maps of class labels that are consistent with the coarse class fractions. Two modes of encapsulating the prior structural information are investigated: one uses a set of indicator variogram models, and the other uses training images. A case study illustrates that both approaches lead to super-resolution class maps that exhibit a variety of spatial patterns ranging from simple to complex. Using four different examples, it is demonstrated that the structural model controls the patterns seen on the super-resolution maps, even for cases where the coarse fraction data are highly constraining.
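
To make the sequential simulation idea concrete, here is a minimal, hypothetical downscaling sketch (not the authors' implementation): fine pixels inside one coarse block are visited along a random path, and each class label is drawn from a probability that blends the coarse class fractions with a simple frequency term from already-simulated neighbors, standing in for the indicator-variogram or training-image structure model. The function name, block size, and the beta weight are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def downscale_block(fractions, size=8, beta=2.0):
        """Toy sequential downscaling of one coarse block.

        fractions : coarse class fractions for this block (sums to 1)
        size      : the block is split into size x size fine pixels
        beta      : weight of the neighborhood term (spatial-prior stand-in)
        """
        k = len(fractions)
        fine = np.full((size, size), -1, dtype=int)   # -1 = not yet simulated
        for p in rng.permutation(size * size):        # random visiting path
            i, j = divmod(p, size)
            neigh = [fine[a, b] for a, b in
                     ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                     if 0 <= a < size and 0 <= b < size and fine[a, b] >= 0]
            local = np.bincount(neigh, minlength=k) if neigh else np.zeros(k)
            prob = fractions + beta * local           # blend fractions and continuity
            fine[i, j] = rng.choice(k, p=prob / prob.sum())
        return fine

    # one coarse pixel with 70% of class 0 and 30% of class 1
    block = downscale_block(np.array([0.7, 0.3]))
    print(block)
    print("simulated fractions:", np.bincount(block.ravel(), minlength=2) / block.size)

Unlike this toy, which only softly encourages the coarse fractions, the framework in the paper generates fine-scale maps that are strictly consistent with them.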


Computers & Geosciences | 2008

A SGeMS code for pattern simulation of continuous and categorical variables: FILTERSIM

Jianbing Wu; Alexandre Boucher; Tuanfeng Zhang

The new multiple-point geostatistical algorithm (FILTERSIM), which can handle both categorical and continuous-variable training images, is implemented in the SGeMS software. The spatial patterns depicted by the training image are first summarized into a few filter scores and then classified into pattern groups in the filter-score space. The sequential simulation approach proceeds by associating each conditioning data event with the closest pattern group using a distance function. A training pattern is then sampled from that group and pasted back onto the simulation grid. Local multiple-point statistics carried by the patterns are captured from the training image and reproduced in the simulation realizations. Hence, complex multiple-scale geological structures can be reconstructed in the simulation grid, conditional to a variety of subsurface data such as well data and seismic surveys.
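
A rough sketch of the filter-score idea, under simplifying assumptions (three hand-picked filters, k-means instead of the FILTERSIM classification, and no paste-back bookkeeping on a simulation grid):

    import numpy as np
    from scipy.cluster.vq import kmeans2

    rng = np.random.default_rng(1)

    def extract_patterns(ti, w=5):
        """All w x w patterns of a 2-D training image, flattened to rows."""
        H, W = ti.shape
        return np.array([ti[i:i + w, j:j + w].ravel()
                         for i in range(H - w + 1) for j in range(W - w + 1)],
                        dtype=float)

    def filter_scores(pats, w=5):
        """Three simple pattern-sensitive filters (mean, x-gradient, y-gradient);
        stand-ins for the richer FILTERSIM filter set."""
        grid = pats.reshape(-1, w, w)
        x = np.linspace(-1, 1, w)
        return np.column_stack([grid.mean(axis=(1, 2)),
                                (grid * x[None, None, :]).mean(axis=(1, 2)),
                                (grid * x[None, :, None]).mean(axis=(1, 2))])

    # toy binary training image with two horizontal "channels"
    ti = np.zeros((60, 60))
    ti[15:20, :] = 1
    ti[40:44, :] = 1

    pats = extract_patterns(ti)
    scores = filter_scores(pats)
    centroids, labels = kmeans2(scores, 8, minit='++')   # pattern groups

    def simulate_node(event_scores, w=5):
        """Closest pattern group in filter-score space, then sample a pattern."""
        g = np.argmin(np.linalg.norm(centroids - event_scores, axis=1))
        members = np.where(labels == g)[0]
        return pats[rng.choice(members)].reshape(w, w)

    print(simulate_node(scores[100]))   # pasting back onto a simulation grid is omitted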


Water Resources Research | 2010

Combining geologic-process models and geostatistics for conditional simulation of 3-D subsurface heterogeneity

Holly A. Michael; Hongmei Li; Alexandre Boucher; Tao Sun; Jef Caers; Steven M. Gorelick

The goal of simulation of aquifer heterogeneity is to produce a spatial model of the subsurface that represents a system such that it can be used to understand or predict flow and transport processes. Spatial simulation requires incorporation of data and geologic knowledge, as well as representation of uncertainty. Classical geostatistical techniques allow for the conditioning of data and uncertainty assessment, but models often lack geologic realism. Simulation of physical geologic processes of sedimentary deposition and erosion (process-based modeling) produces detailed, geologically realistic models, but conditioning to local data is limited at best. We present an aquifer modeling methodology that combines geologic-process models with object-based, multiple-point, and variogram-based geostatistics to produce geologically realistic realizations that incorporate geostatistical uncertainty and can be conditioned to data. First, the geologic features of grain size, or facies, distributions simulated by a process-based model are analyzed, and the statistics of feature geometry are extracted. Second, the statistics are used to generate multiple realizations of reduced-dimensional features using an object-based technique. Third, these realizations are used as multiple alternative training images in multiple-point geostatistical simulation, a step that can incorporate local data. Last, a variogram-based geostatistical technique is used to produce conditioned maps of depositional thickness and erosion. Successive realizations of individual strata are generated in depositional order, each dependent on previously simulated geometry, and stacked to produce a fully conditioned three-dimensional facies model that mimics the architecture of the process-based model. We demonstrate the approach for a typical subsea depositional complex.
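
As an illustration of the first step only (feature-geometry extraction), the sketch below assumes the process-based output can be reduced to a 2-D binary facies array; connected sand bodies are labeled and their bounding-box length and thickness summarized. The paper works in 3-D and feeds such statistics to an object-based simulator, which is not shown here.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(3)

    # stand-in for a process-based model output: 1 = channel sand, 0 = mud
    facies = np.zeros((100, 200), dtype=int)
    for _ in range(8):                                   # a few elongated sand bodies
        r, c = rng.integers(5, 90), rng.integers(0, 120)
        facies[r:r + rng.integers(3, 8), c:c + rng.integers(40, 80)] = 1

    # label connected sand bodies and measure their bounding-box geometry
    labels, n = ndimage.label(facies)
    boxes = ndimage.find_objects(labels)
    thickness = np.array([b[0].stop - b[0].start for b in boxes])
    length = np.array([b[1].stop - b[1].start for b in boxes])

    print(n, "sand bodies")
    print("mean thickness:", thickness.mean(), " mean length:", length.mean())
    print("mean length/thickness ratio:", (length / thickness).mean())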


IEEE Transactions on Geoscience and Remote Sensing | 2006

A Novel Method for Mapping Land Cover Changes: Incorporating Time and Space With Geostatistics

Alexandre Boucher; Karen C. Seto; Andre G. Journel

Landsat data are now available for more than 30 years, providing the longest high-resolution record of Earth monitoring. This unprecedented time series of satellite imagery allows for extensive temporal observation of terrestrial processes such as land cover and land use change. However, despite this unique opportunity, most existing change detection techniques do not fully capitalize on this long time series. In this paper, a method that exploits both the temporal and spatial domains of time series satellite data to map land cover changes is presented. The time series of each pixel in the image is modeled with a combination of 1) pixel-specific remotely sensed data, 2) neighboring pixels derived from ground observation data, and 3) time series transition probabilities. The spatial information is modeled with variograms and integrated using indicator kriging; time series transition probabilities are combined using an information-based cascade approach. This results in a map that is significantly more accurate in identifying when, where, and what land cover changes occurred. For the six images used in this paper, the prediction accuracy improves significantly, increasing from 31% with maximum likelihood classification to 61% when both space and time are considered. The consideration of spatial continuity also reduced unwanted speckles in the classified images, removing the need for any postprocessing. These results indicate that combining the space and time domains significantly improves the accuracy of temporal change detection analyses and can produce high-quality time series land cover maps.
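
One widely used way to combine a kriging-derived spatial probability with a temporal transition probability relative to a common prior is the tau model (Journel, 2002); the sketch below uses it as a stand-in for the information-based cascade mentioned above, with made-up probabilities.

    import numpy as np

    def tau_combine(prior, probs, taus=None):
        """Combine conditional probabilities of one event from several sources
        with the tau model: x = (1 - P) / P, and x / x0 = prod_i (x_i / x0)^tau_i."""
        probs = np.asarray(probs, dtype=float)
        taus = np.ones_like(probs) if taus is None else np.asarray(taus, dtype=float)
        x0 = (1 - prior) / prior                  # prior "distance" to the event
        xi = (1 - probs) / probs                  # per-source distances
        x = x0 * np.prod((xi / x0) ** taus)       # combined distance
        return 1.0 / (1.0 + x)

    # class "urban" at a pixel: 20% marginal frequency, 60% from indicator
    # kriging of neighboring labels, 45% from the temporal transition probability
    print(round(tau_combine(prior=0.2, probs=[0.6, 0.45]), 3))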


Computers & Geosciences | 2009

Considering complex training images with search tree partitioning

Alexandre Boucher

Using a complex training image (TI) for the single normal equation simulation (SNESIM) algorithm results in poor simulated realizations since that image contains trends and location-specific patterns. By pooling all the TI patterns in a single search tree and not recording the relative locations of those patterns, some critical features of these complex TIs are lost. The search tree partitioning approach subdivides a large TI into imbricated, homogeneous, smaller images, called partition classes. Each of these partition classes has a corresponding search tree that can be utilized by the SNESIM algorithm. These partition classes are obtained by processing the TIs with spatial filters that are pattern sensitive. The resulting filter scores are then clustered into partition classes. All patterns within a partition class are recorded by a search tree; there is one tree per partition class. At each pixel along the simulation path, the partition class is retrieved first and used to select the appropriate search tree. That search tree contains the patterns relevant to that partition class. In practice, the partitioning approach adds flexibility in choosing a TI. TIs that were easier to obtain but traditionally too complex for simulation can now be considered as input to SNESIM. In many cases, it also significantly increases the simulation speed by searching a vector of smaller trees instead of a single large one. A plugin for the SGeMS software is provided.
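
A compact sketch of the partitioning mechanics, with plain dictionaries standing in for SNESIM search trees and a single local-proportion filter standing in for the pattern-sensitive filter set (both simplifying assumptions):

    import numpy as np
    from collections import defaultdict
    from scipy.cluster.vq import kmeans2
    from scipy.ndimage import uniform_filter

    # toy training image whose bedding style changes from left to right;
    # pooling all of its patterns in one search tree would mix the two styles
    ti = np.zeros((80, 80), dtype=int)
    ti[::8, :40] = 1                      # left half: thin beds
    for r in range(0, 80, 16):
        ti[r:r + 8, 40:] = 1              # right half: thick beds

    # 1. pattern-sensitive filter: local sand proportion over a moving window
    score = uniform_filter(ti.astype(float), size=9)

    # 2. cluster the filter scores into partition classes
    _, classes = kmeans2(score.ravel()[:, None], 2, minit='++')
    classes = classes.reshape(ti.shape)

    # 3. one "search tree" per partition class (a dict of data-event -> counts
    #    of the central value, built on a 4-node cross template)
    template = ((-1, 0), (1, 0), (0, -1), (0, 1))
    trees = defaultdict(lambda: defaultdict(lambda: np.zeros(2)))
    for i in range(1, 79):
        for j in range(1, 79):
            event = tuple(ti[i + di, j + dj] for di, dj in template)
            trees[classes[i, j]][event][ti[i, j]] += 1

    # 4. at a simulation node, retrieve the partition class first, then its tree
    def cpdf(i, j, event):
        counts = trees[classes[i, j]][tuple(event)]
        return counts / counts.sum() if counts.sum() else None

    print(cpdf(10, 10, (0, 0, 0, 0)))     # queried in a low-proportion partition class
    print(cpdf(4, 60, (1, 1, 1, 1)))      # queried in a high-proportion partition class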


Computational Geosciences | 2013

History matching and uncertainty quantification of facies models with multiple geological interpretations

Hyucksoo Park; Céline Scheidt; Darryl Fenwick; Alexandre Boucher; Jef Caers

Uncertainty quantification is currently one of the leading challenges in the geosciences, in particular in reservoir modeling. A wealth of subsurface data, as well as expert knowledge, is available to quantify uncertainty and make predictions of reservoir performance or reserves. The geosciences component within this larger modeling framework is partially an interpretive science. Geologists and geophysicists interpret data to postulate on the nature of the depositional environment, for example on the type of fracture system, the nature of faulting, and the type of rock physics model. Often, several alternative scenarios or interpretations are offered, each with an associated degree of belief quantified with probabilities. In the context of facies modeling, this could result in various interpretations of facies architecture, associations, geometries, and the way they are distributed in space. A quantitative approach to specifying this uncertainty is to provide a set of alternative 3D training images from which several geostatistical models can be generated. In this paper, we consider quantifying uncertainty on facies models in the early development stage of a reservoir, when there is still considerable uncertainty on the nature of the spatial distribution of the facies. At this stage, production data are available to further constrain uncertainty. We develop a workflow that consists of two steps: (1) determining which training images are no longer consistent with production data and should be rejected, and (2) history matching with a given fixed training image. We illustrate our ideas and methodology on a test case derived from a real field case of predicting flow in a newly planned well in a turbidite reservoir off the West African coast.
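
Step (1), screening candidate training images against production data, can be sketched as follows; the forward model, misfit measure, and threshold below are placeholders rather than the distance and sampling scheme used in the paper.

    import numpy as np

    rng = np.random.default_rng(5)

    def flow_response(ti_id):
        """Placeholder forward model: a synthetic production-decline curve for one
        realization generated from training image `ti_id`. In practice this would
        be a (proxy) flow simulation of a geostatistical realization."""
        t = np.linspace(0, 1, 20)
        decline = {0: 1.5, 1: 3.0, 2: 6.0}[ti_id]          # each TI implies a flow style
        return np.exp(-decline * t) * (1 + 0.05 * rng.standard_normal(t.size))

    observed = np.exp(-3.2 * np.linspace(0, 1, 20))        # observed production data

    def rmse(a, b):
        return float(np.sqrt(np.mean((a - b) ** 2)))

    # a handful of realizations per candidate TI; keep only the TIs for which
    # at least one realization comes close to the observed response
    threshold = 0.08
    consistent = []
    for ti_id in (0, 1, 2):
        misfits = [rmse(flow_response(ti_id), observed) for _ in range(10)]
        print(f"TI {ti_id}: best misfit {min(misfits):.3f}")
        if min(misfits) < threshold:
            consistent.append(ti_id)

    print("training images retained for history matching:", consistent)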


Photogrammetric Engineering and Remote Sensing | 2007

Integrating Fine Scale Information in Super-resolution Land-cover Mapping

Alexandre Boucher; Phaedon C. Kyriakidis

Super-resolution or sub-pixel class mapping is the task of providing fine spatial resolution maps of, for example, land-cover classes, from satellite sensor measurements obtained at a coarser spatial resolution. Often, the only information available consists of coarse class fraction data, typically obtained through spectral unmixing. This paper shows how to integrate, in addition to such coarse fractions, class labels at a set of fine pixels obtained independently of the satellite sensor measurements. The integration of such fine spatial resolution information is achieved within the indicator kriging formalism in either a prediction or a simulation mode. The spatial dissimilarity or texture of class labels at the fine (target) resolution is quantified in a non-parametric way from an analog scene using a set of experimental indicator semivariogram maps. The output of the proposed procedure consists of maps of probabilities of class occurrence, or of a series of simulated class maps characterizing the inherent spatial uncertainty in the super-resolution mapping process.
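
The indicator kriging step itself can be written in a few lines: the probability of a class at an unsampled fine pixel is a simple-kriging estimate of the class indicator at nearby informed fine pixels, with weights derived from an indicator covariance model. The covariance model and data below are assumed for illustration, and the conditioning to coarse class fractions described in the paper is omitted.

    import numpy as np

    def exp_cov(h, sill=0.25, a=10.0):
        """Exponential indicator covariance model (assumed, not fitted); a = practical range."""
        return sill * np.exp(-3.0 * h / a)

    def ik_probability(x0, xs, indicators, prior):
        """Simple-kriging probability of a class at location x0.

        xs         : (n, 2) coordinates of informed fine pixels
        indicators : n values, 1 if the pixel carries the class, else 0
        prior      : marginal class probability (the simple-kriging mean)
        """
        C = exp_cov(np.linalg.norm(xs[:, None, :] - xs[None, :, :], axis=-1))
        c0 = exp_cov(np.linalg.norm(xs - x0, axis=1))
        w = np.linalg.solve(C, c0)                       # kriging weights
        p = prior + w @ (np.asarray(indicators) - prior)
        return float(np.clip(p, 0.0, 1.0))               # order-relation correction

    # three informed fine pixels around the target fine pixel at (5, 5)
    xs = np.array([[3.0, 5.0], [6.0, 7.0], [9.0, 1.0]])
    print(ik_probability(np.array([5.0, 5.0]), xs, indicators=[1, 1, 0], prior=0.4))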


Mathematical Geosciences | 2012

Multivariate Block-Support Simulation of the Yandi Iron Ore Deposit, Western Australia

Alexandre Boucher; Roussos Dimitrakopoulos

Mineral deposits frequently contain several elements of interest that are spatially correlated and require the use of joint geostatistical simulation techniques in order to generate models preserving their spatial relationships. Although joint-simulation methods have long been available, they are impractical when it comes to more than three variables and mid to large size deposits. This paper presents the application of block-support simulation of a multi-element mineral deposit using minimum/maximum autocorrelation factors to facilitate the computationally efficient joint simulation of large, multivariable deposits. The algorithm utilized, termed dbmafsim, transforms point-scale spatial attributes of a mineral deposit into uncorrelated service variables leading to the generation of simulated realizations of block-scale models of the attributes of interest of a deposit. The dbmafsim algorithm is utilized at the Yandi iron ore deposit in Western Australia to simulate five cross-correlated elements, namely Fe, SiO2, Al2O3, P and LOI, that are all critical in defining the quality of iron ore being produced. The block-scale simulations reproduce the direct- and cross-variograms of the elements even though only the direct variograms of the service variables have to be modeled. The application shows the efficiency, excellent performance and practical contribution of the dbmafsim algorithm in simulating large multi-element deposits.
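
The core of the approach is the minimum/maximum autocorrelation factors (MAF) transform, which removes cross-correlation between the variables at lag zero and (in its symmetrized form) at a chosen lag h, so each factor can be simulated independently and back-transformed. Below is a compact, generic two-stage sketch (PCA sphering followed by an eigen-decomposition of the lag-h increment covariance) on synthetic data; it is not the dbmafsim implementation and does not use the Yandi variables.

    import numpy as np

    def maf_transform(Z, lag=1):
        """Minimum/maximum autocorrelation factors of a multivariate series Z (n x p),
        assuming the rows are ordered so that `lag` rows apart corresponds to lag h.
        Returns the factors and the matrix that back-transforms them (add the mean
        back to fully recover Z)."""
        Zc = Z - Z.mean(axis=0)
        # stage 1: PCA sphering -> identity covariance at lag 0
        vals, vecs = np.linalg.eigh(np.cov(Zc, rowvar=False))
        W1 = vecs / np.sqrt(vals)
        Y = Zc @ W1
        # stage 2: diagonalize the covariance of the lag-h increments
        D = Y[lag:] - Y[:-lag]
        _, W2 = np.linalg.eigh(np.cov(D, rowvar=False))
        A = W1 @ W2
        return Zc @ A, np.linalg.inv(A)

    # two synthetic cross-correlated "grades" along a drillhole
    rng = np.random.default_rng(6)
    base = np.cumsum(rng.standard_normal(500))
    Z = np.column_stack([base + 0.3 * rng.standard_normal(500),
                         -0.8 * base + 0.3 * rng.standard_normal(500)])

    F, B = maf_transform(Z)
    print("lag-0 factor correlations (off-diagonals ~ 0):")
    print(np.corrcoef(F, rowvar=False).round(3))
    C1 = np.corrcoef(F[1:].T, F[:-1].T)[:2, 2:]
    print("symmetrized lag-1 cross-correlation between factors (~ 0):",
          round((C1[0, 1] + C1[1, 0]) / 2, 3))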


Geophysics | 2009

Revisiting the use of seismic attributes as soft data for subseismic facies prediction: Proportions versus probabilities

Lisa Stright; Anne Bernhardt; Alexandre Boucher; Tapan Mukerji; Richard Derksen

Geostatistical modeling originated within the mining industry to estimate average minable ore grade from large support volumes given samples measured on small volume support. In petroleum geostatistics, the goal is more equivocal due to several different scales of support of input data, which are often incongruent with the desired prediction scale. More specifically, the goal is to utilize indirect measurements (e.g., seismic data) from a scale larger than the prediction scale for fine-scale spatial distributions of facies and petrophysical properties grounded by undersampled point data (e.g., well-log data). (Note: volume support is a geostatistical term that describes the size or resolution of the sample or measurement.)


Mathematical Geosciences | 2014

Simulation of Geological Contacts from Interpreted Geological Model Using Multiple-Point Statistics

Alexandre Boucher; Joao Felipe Coimbra Leite Costa; Luis Gustavo Rasera; Eduardo Motta

Applications of multiple-point statistics (MPS) algorithms to large non-repetitive geological objects such as those found in mining deposits are difficult because most MPS algorithms rely on pattern repetition for simulation. In many cases, an interpreted geological model built from a computer-aided design system is readily available but suffers as a training image due to the lack of pattern repetitiveness. Porphyry copper deposits and iron ore formations are good examples of such mining deposits with non-repetitive patterns. This paper presents an algorithm called contactsim that focuses on reproducing the patterns of the contacts between geological types. The algorithm learns the shapes of the lithotype contacts as interpreted by the geologist, and simulates their patterns at a later stage. Defining a zone of uncertainty around the lithological contact is a critical step in contactsim, because it defines both the zones where the simulation is performed and where the algorithm should focus to learn the transitional patterns between lithotypes. A larger zone of uncertainty results in greater variation between realizations. The definition of the uncertainty zone must take into consideration the geological understanding of the deposit and the reliability of the contact zones. The contactsim algorithm is demonstrated on an iron ore formation.
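
The zone-of-uncertainty step can be made concrete: given an interpreted lithotype model on a grid, the contact is the set of cells with a differently coded neighbor, and the zone is every cell within a chosen distance of that contact. The model and buffer widths below are arbitrary illustrations; widening the buffer enlarges the region where contactsim-style simulation would operate and hence the variation between realizations.

    import numpy as np
    from scipy import ndimage

    # interpreted geological model: two lithotypes separated by a wavy contact
    rows, cols = np.mgrid[0:100, 0:100]
    litho = (rows > 50 + 8 * np.sin(cols / 10.0)).astype(int)

    def uncertainty_zone(model, width):
        """Cells lying within `width` cells of a lithotype contact."""
        contact = np.zeros_like(model, dtype=bool)
        contact[:-1, :] |= model[:-1, :] != model[1:, :]   # vertical neighbors differ
        contact[1:, :]  |= model[1:, :] != model[:-1, :]
        contact[:, :-1] |= model[:, :-1] != model[:, 1:]   # horizontal neighbors differ
        contact[:, 1:]  |= model[:, 1:] != model[:, :-1]
        dist = ndimage.distance_transform_edt(~contact)    # distance to nearest contact cell
        return dist <= width

    narrow = uncertainty_zone(litho, width=3)
    wide = uncertainty_zone(litho, width=10)
    print("cells to simulate, narrow zone:", int(narrow.sum()))
    print("cells to simulate, wide zone:  ", int(wide.sum()))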

Collaboration


Dive into Alexandre Boucher's collaborations.

Top Co-Authors

Lisa Stright

Colorado State University


Denis Marcotte

École Polytechnique de Montréal
