Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Patrick Bogaert is active.

Publication


Featured research published by Patrick Bogaert.


Water Resources Research | 1999

Optimal spatial sampling design for the estimation of the variogram based on a least squares approach

Patrick Bogaert; David Russo

A methodology for the choice of optimal spatial sampling designs is proposed. The problem is to find an optimal finite set of space locations where a random field has to be sampled, in order to minimize the variability of the parametric variogram estimator. Under the hypothesis that the random field is Gaussian and second-order stationary and using a first-order approximation for the linearization of the parametric variogram with respect to its parameters, we are able to express the covariance matrix of the parameter estimators as a function of the sampling design. An optimization algorithm based on a generalized least squares approach is proposed in order to reduce the variability of these estimators through the minimization of the determinant of their covariance matrix. In order to validate this approach, a practical case study is conducted for two different variogram models. It shows that compared to random sampling designs, the benefit of the optimization procedure is somewhat limited for a variogram model without nugget effect. For a variogram model with nugget effect, random sampling designs are associated with much higher variability on the parameter estimators than optimized designs because of the typical lack of information offered by random sampling designs for small distances between locations. The performance of random sampling designs compared to optimized or alternative regular designs is shown to be poor with respect to parameter estimation, especially when a nugget effect is included in the variogram model. The parameters that benefit the most from the optimization procedure are the variance and the nugget effect, whereas the improvement for the range parameter estimation is limited. The optimization algorithm provides yardstick results, yielding reference values for the selection of an alternative regular design. The applicability of the algorithm is very wide, and it can greatly help the user to understand the way the parameters are influenced by the choice of one set of sampling locations rather than another.
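
The sketch below illustrates the design-optimization idea in a deliberately simplified form, not the authors' exact algorithm: it assumes an exponential variogram without nugget, treats the squared increments of the pairs as independent (a crude stand-in for the full Gaussian covariance used in the paper), and searches candidate designs at random rather than with a dedicated optimizer. All parameter values are illustrative assumptions.

```python
import numpy as np

def exp_variogram(h, sill, rng):
    """Exponential variogram without nugget (an assumed model for this sketch)."""
    return sill * (1.0 - np.exp(-h / rng))

def param_cov_det(locs, sill=1.0, rng=0.3):
    """Determinant of an approximate covariance matrix of the GLS
    variogram-parameter estimators for a given sampling design."""
    d = np.linalg.norm(locs[:, None, :] - locs[None, :, :], axis=-1)
    h = d[np.triu_indices(len(locs), k=1)]          # pairwise distances
    g = exp_variogram(h, sill, rng)
    # Jacobian of the variogram with respect to (sill, range) at the assumed parameters
    J = np.column_stack([1.0 - np.exp(-h / rng),
                         -sill * h / rng**2 * np.exp(-h / rng)])
    # Simplification: variance of each squared-increment "observation" taken as 2 * gamma^2
    w = 1.0 / (2.0 * np.maximum(g, 1e-12)**2)
    info = J.T @ (w[:, None] * J)                   # GLS information matrix
    return np.linalg.det(np.linalg.inv(info))

def random_search(n_points=20, n_trials=500, seed=0):
    """Keep the candidate design with the smallest determinant criterion."""
    rs = np.random.default_rng(seed)
    best, best_val = None, np.inf
    for _ in range(n_trials):
        cand = rs.random((n_points, 2))             # candidate design in the unit square
        val = param_cov_det(cand)
        if val < best_val:
            best, best_val = cand, val
    return best, best_val

if __name__ == "__main__":
    design, crit = random_search()
    print("best determinant criterion:", crit)
```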


Computers, Environment and Urban Systems | 2007

Spatial analysis and modelling of land use distributions in Belgium

Nicolas Dendoncker; Mark Rounsevell; Patrick Bogaert

When statistical analyses of land use drivers are performed, they rarely deal explicitly with spatial autocorrelation. Most studies are undertaken on autocorrelation-free data samples. By doing this, a great deal of information that is present in the dataset is lost. This paper presents a spatially explicit, cross-sectional analysis of land use drivers in Belgium. It is shown that purely regressive logistic models only identify trends or global relationships between socio-economic or physico-climatic drivers and the precise location of each land use type. However, when the goal of a study is to obtain the best statistical model fit of land use distribution, a purely autoregressive model is appropriate. It is shown that this type of model deals appropriately with spatial autocorrelation as measured by the lack of autocorrelation in the deviance residuals of the model. More specifically, three types of autoregressive models are compared: (1) a set of binomial logistic regression models (one for each modelled land use) accounting only for the proportion of the modelled land use within the neighbourhood of a cell; (2) a multinomial autologistic regression that accounts for the composition of a cell's neighbourhood; and (3) a state-of-the-art Bayesian Maximum Entropy (BME) based model that accounts fully for the spatial organization of the land uses within the neighbourhood of a cell. The comparative analysis shows that the BME approach has no advantages over the other methods, for our specific application, but that accounting for the composition of a cell's neighbourhood is essential in obtaining an optimal fit.
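
As a rough illustration of the type-(1) model above, the sketch below fits a binomial logistic regression to a synthetic binary land-use grid, using a (made-up) driver plus the proportion of the same land use in the 3x3 neighbourhood of each cell as an autocovariate. The grid, driver and neighbourhood size are illustrative assumptions, not the Belgian dataset.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.linear_model import LogisticRegression

rs = np.random.default_rng(1)
n = 80
driver = uniform_filter(rs.normal(size=(n, n)), size=9)      # smooth synthetic covariate
landuse = (driver + 0.3 * rs.normal(size=(n, n)) > 0).astype(int)

# Autocovariate: proportion of the modelled land use in the 3x3 neighbourhood,
# excluding the central cell itself.
neigh_sum = uniform_filter(landuse.astype(float), size=3) * 9.0
autocov = (neigh_sum - landuse) / 8.0

X = np.column_stack([driver.ravel(), autocov.ravel()])
y = landuse.ravel()
model = LogisticRegression().fit(X, y)
print("coefficients (driver, neighbourhood proportion):", model.coef_[0])
```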


International Journal of Geographical Information Science | 2011

Thematic accuracy assessment of geographic object-based image classification

Julien Radoux; Patrick Bogaert; Dominique Fasbender; Pierre Defourny

Geographic object-based image analysis is an image-processing method where groups of spatially adjacent pixels are classified as if they were behaving as a whole unit. This approach raises concerns about the way subsequent validation studies must be conducted. Indeed, classical point-based sampling strategies based on the spatial distribution of sample points (using systematic, probabilistic or stratified probabilistic sampling) do not rely on the same concept of objects and may prove to be less appropriate than the methods explicitly built on the concept of objects used for the classification step. In this study, an original object-based sampling strategy is compared with other approaches used in the literature for the thematic accuracy assessment of object-based classifications. The new sampling scheme and sample analysis are founded on a sound theoretical framework based on a few working hypotheses. The performance of the sampling strategies is quantified using simulated object-based classification results of Quickbird imagery. The bias and the variance of the overall accuracy estimates were used as indicators of the methods' benefits. The main advantage of the object-based predictor of the overall accuracy is its performance: for a given confidence interval, it requires fewer sampling units than the other methods. In many cases, this can help to noticeably reduce the sampling effort. Beyond efficiency, more conceptual differences between point-based and object-based sampling are discussed. First, geolocation errors do not influence the object-based thematic accuracy as they do for point-based accuracy. These errors need to be addressed independently to provide the geolocation precision. Second, the response design is more complex in object-based accuracy assessment. This is interesting for complex classes but might be an issue in the case of large segmentation errors. Finally, the minimum sample size for each class is more likely to be reached with object-based sampling than with point-based sampling. Further work is necessary to reach the same suitability as point-based sampling for pixel-based classification, but this pioneering study shows that object-based sampling could be implemented within a statistically sound framework.
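
The sketch below contrasts a point-based and an area-weighted object-based estimator of overall accuracy on simulated objects. The object areas, error rate and sample sizes are arbitrary illustrative assumptions and do not reproduce the paper's simulation set-up; the code only shows how the two estimators differ mechanically.

```python
import numpy as np

rs = np.random.default_rng(7)
n_obj = 5000
area = rs.lognormal(mean=0.0, sigma=1.0, size=n_obj)          # object areas (arbitrary units)
correct = rs.random(n_obj) < 0.9                              # 90% of objects correctly classified
true_oa = np.sum(area * correct) / np.sum(area)               # area-weighted overall accuracy

def object_based_estimate(sample_size):
    """Simple random sample of objects, area-weighted ratio estimator."""
    idx = rs.choice(n_obj, size=sample_size, replace=False)
    return np.sum(area[idx] * correct[idx]) / np.sum(area[idx])

def point_based_estimate(sample_size):
    """Random points: a point falls in an object with probability proportional to its area."""
    idx = rs.choice(n_obj, size=sample_size, p=area / area.sum())
    return correct[idx].mean()

reps = 2000
obj = [object_based_estimate(100) for _ in range(reps)]
pts = [point_based_estimate(100) for _ in range(reps)]
print("true OA:", round(true_oa, 3))
print("object-based  bias / sd:", round(np.mean(obj) - true_oa, 4), round(np.std(obj), 4))
print("point-based   bias / sd:", round(np.mean(pts) - true_oa, 4), round(np.std(pts), 4))
```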


International Journal of Remote Sensing | 2007

Mean Compositing, an alternative strategy for producing temporal syntheses. Concepts and performance assessment for SPOT VEGETATION time series

Christelle Vancutsem; Jean-François Pekel; Patrick Bogaert; Pierre Defourny

Various compositing criteria have been proposed to produce cloud-free images from optical time series. However, they often favour specific atmospheric and geometric conditions, which may cause serious inconsistencies in the syntheses. Algorithms including BRDF normalization minimize variations induced by the anisotropy of the target. However, their operational implementation faces some issues. This study proposes to avoid these issues by using a new strategy based on a statistical approach, i.e. Mean Compositing, and by comparing it with three existing techniques. A quantitative evaluation methodology with statistical tests on reflectance and texture values as well as visual comparisons was applied to numerous SPOT VEGETATION time series. The performance criterion was to best mimic the information content of a single cloud-free near-nadir view image. Moreover, a quantitative approach was used to assess the temporal consistency of the syntheses. The results showed that the proposed strategy combined with an efficient quality control produces images with greater spatial consistency than currently available VEGETATION products but produces slightly more uneven time series than the most advanced compositing algorithm.
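
A minimal sketch of the mean-compositing idea: for each pixel, the composite value is the mean of the observations flagged as valid (cloud-free) over the compositing period, instead of the single observation picked by a criterion such as maximum NDVI. The time series and cloud mask below are synthetic assumptions.

```python
import numpy as np

rs = np.random.default_rng(3)
t, h, w = 10, 50, 50                                 # 10 dates, 50 x 50 pixels (synthetic)
reflectance = 0.2 + 0.05 * rs.normal(size=(t, h, w))
valid = rs.random((t, h, w)) > 0.4                   # True where the observation is cloud-free

# Per-pixel mean of the valid observations only
sums = np.where(valid, reflectance, 0.0).sum(axis=0)
n_valid = valid.sum(axis=0)
composite = np.where(n_valid > 0, sums / np.maximum(n_valid, 1), np.nan)

print("pixels with no cloud-free observation:", int((n_valid == 0).sum()))
```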


Mathematical Geosciences | 1996

Comparison of kriging techniques in a space-time context

Patrick Bogaert

Space-time processes constitute a particular class, requiring suitable tools in order to predict values in time and space, such as a space-time variogram or covariance function. The space-time covariance function is defined and linked to the Linear Model of Coregionalization under second-order space-time stationarity. Simple and ordinary space-time kriging systems are compared to simple and ordinary cokriging, and their differences in unbiasedness conditions are underlined. Ordinary space-time kriging is then applied to simulated data. Prediction variances and prediction errors are compared with those for ordinary kriging and cokriging under different unbiasedness conditions using cross-validation. The results show that space-time kriging tends to produce lower prediction variances and prediction errors than kriging and cokriging.
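
The sketch below sets up an ordinary space-time kriging system with a single unbiasedness constraint, using a separable exponential covariance (product of a spatial and a temporal term). The covariance model, its parameters and the synthetic data are illustrative assumptions; the cokriging formulations compared in the paper are not reproduced.

```python
import numpy as np

def st_cov(ds, dt, sill=1.0, rng_s=0.5, rng_t=5.0):
    """Assumed separable covariance: C(ds, dt) = sill * exp(-ds/rng_s) * exp(-dt/rng_t)."""
    return sill * np.exp(-ds / rng_s) * np.exp(-dt / rng_t)

def ordinary_st_kriging(coords, times, values, coord0, time0):
    n = len(values)
    ds = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    dt = np.abs(times[:, None] - times[None, :])
    # Ordinary kriging system: weights sum to one (single unbiasedness condition)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = st_cov(ds, dt)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = st_cov(np.linalg.norm(coords - coord0, axis=1), np.abs(times - time0))
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    pred = w @ values
    pred_var = st_cov(0.0, 0.0) - w @ b[:n] - mu     # ordinary kriging variance
    return pred, pred_var

if __name__ == "__main__":
    rs = np.random.default_rng(5)
    coords = rs.random((30, 2))                      # 30 space-time observations (synthetic)
    times = rs.integers(0, 10, size=30).astype(float)
    values = np.sin(3 * coords[:, 0]) + 0.1 * rs.normal(size=30)
    print(ordinary_st_kriging(coords, times, values, np.array([0.5, 0.5]), 5.0))
```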


Measurement Science and Technology | 2005

Assessing the error of polygonal area measurements: a general formulation with applications to agriculture

Patrick Bogaert; J. Delincé; S. Kay

This paper proposes a simple and general theoretical framework for the error assessment of area measurements of planar polygonal surfaces. The general formulation is first developed, both for the case of correlated and for the case of independent measurements, where a compact formulation can be obtained for the latter. These results are then used in the context of agriculture, with the aim of assessing field area measurement errors when using a global positioning system (GPS) device, enhanced using a simulated EGNOS (European geostationary navigation overlay service) time series. They show that for GPS/EGNOS measurements made by an operator moving along the border of a field, area measurement error is linked both to the operator speed and to the acquisition rate of the GPS device. For typical field sizes found in the European Union, ranging from 0.5 ha to 5 ha, the coefficient of variation (CV) for area measurement errors is about 1% to 5%. These results depend on the field area, but they can be considered to be insensitive with respect to the field shape. They also show that field area measurement errors can be limited if an appropriate combination of operator speed and GPS acquisition rate is selected. Though the practical case study presented here is focused on polygonal agricultural fields, it is expected that various other fields (medical and remotely sensed imagery, geographical information system data, computer vision analysis, etc.) should also benefit from the theoretical results hereby obtained.
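
The sketch below shows first-order error propagation for the area of a polygon whose vertex coordinates are measured with independent, equal-variance errors in x and y (the simplest case; correlated GPS track errors treated in the paper are not covered). The example field and the 1 m standard deviation are illustrative assumptions.

```python
import numpy as np

def area_and_error(xy, sigma):
    """Shoelace area and first-order standard deviation of the area estimate,
    assuming independent N(0, sigma^2) errors on every vertex coordinate."""
    x, y = xy[:, 0], xy[:, 1]
    xn, yn = np.roll(x, -1), np.roll(y, -1)          # next vertex
    xp, yp = np.roll(x, 1), np.roll(y, 1)            # previous vertex
    area = 0.5 * abs(np.sum(x * yn - xn * y))
    # dA/dx_i = (y_{i+1} - y_{i-1}) / 2 ; dA/dy_i = (x_{i-1} - x_{i+1}) / 2
    var = (sigma**2 / 4.0) * np.sum((yn - yp)**2 + (xp - xn)**2)
    return area, np.sqrt(var)

if __name__ == "__main__":
    # Roughly rectangular 1 ha field (100 m x 100 m), vertex standard deviation of 1 m
    field = np.array([[0.0, 0.0], [100.0, 0.0], [100.0, 100.0], [0.0, 100.0]])
    a, sd = area_and_error(field, sigma=1.0)
    print(f"area = {a:.0f} m^2, std = {sd:.1f} m^2, CV = {100 * sd / a:.2f} %")
```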


IEEE Geoscience and Remote Sensing Letters | 2008

Support-Based Implementation of Bayesian Data Fusion for Spatial Enhancement: Applications to ASTER Thermal Images

Dominique Fasbender; Devis Tuia; Patrick Bogaert; Mikhail Kanevski

In this letter, a general Bayesian data fusion (BDF) approach is proposed and applied to the spatial enhancement of ASTER thermal images. This method fuses information coming from the visible or near-infrared bands (15 m × 15 m pixels) with the thermal infrared bands (90 m × 90 m pixels) by explicitly accounting for the change of support. By relying on linear multivariate regression assumptions, differences of support size for input images can be explicitly accounted for. Due to the use of locally varying variances, it also avoids producing artifacts on the fused images. Based on a set of ASTER images over the region of Lausanne, Switzerland, the advantages of this support-based approach are assessed and compared to the downscaling cokriging approach recently proposed in the literature. Results show that improvements are substantial with respect to both visual and quantitative criteria. Although the method is illustrated here with a specific case study, it is versatile enough to be applied to the spatial enhancement problem in general. It thus opens new avenues in the context of remotely sensed images.
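
The following is a deliberately simplified regression-based downscaling sketch in the spirit of the support-aware fusion described above: regress the coarse thermal band on covariates aggregated to the coarse support, predict at fine resolution, and add back the coarse-scale residuals. It is not the authors' BDF formulation; the data, the 6x scale factor (mimicking the 15 m / 90 m ASTER ratio) and the regression model are synthetic assumptions.

```python
import numpy as np

rs = np.random.default_rng(11)
scale = 6                                            # assumed 90 m / 15 m ratio
nf = 120                                             # fine grid size (multiple of scale)
vnir = rs.normal(size=(nf, nf))                      # synthetic fine-resolution covariate
thermal_fine_truth = 300.0 + 2.0 * vnir + rs.normal(scale=0.5, size=(nf, nf))

def aggregate(img, s):
    """Block-average an image to the coarse support."""
    return img.reshape(img.shape[0] // s, s, img.shape[1] // s, s).mean(axis=(1, 3))

thermal_coarse = aggregate(thermal_fine_truth, scale)    # "observed" coarse thermal band
vnir_coarse = aggregate(vnir, scale)

# Linear regression fitted on the coarse support
X = np.column_stack([np.ones(vnir_coarse.size), vnir_coarse.ravel()])
beta, *_ = np.linalg.lstsq(X, thermal_coarse.ravel(), rcond=None)

# Apply the regression at fine resolution, then restore the coarse-scale residuals
pred_fine = beta[0] + beta[1] * vnir
resid_coarse = thermal_coarse - aggregate(pred_fine, scale)
fused = pred_fine + np.kron(resid_coarse, np.ones((scale, scale)))

rmse = np.sqrt(np.mean((fused - thermal_fine_truth) ** 2))
print("RMSE of the downscaled thermal image:", round(rmse, 3))
```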


Environmental Pollution | 2012

Pollutant plume delineation from tree core sampling using standardized ranks

Agung Wahyudi; Patrick Bogaert; Stefan Trapp; Jirina Machackova

There are currently contradictory results in the literature about the way chloroethene (CE) concentrations from tree core sampling correlate with those from groundwater measurements. This paper addresses this issue by focusing on groundwater and tree core datasets from a CE-contaminated site in the Czech Republic. Preliminary analyses revealed strongly and positively skewed distributions for the tree core dataset, with intra-tree variability accounting for more than 80% of the total variability, while spatial analyses based on variograms indicated no obvious spatial pattern for CE concentration. Using rank transformation, it is shown how the results were improved by revealing the initially hidden spatial structure for both variables when they are handled separately. However, bivariate analyses based on cross-covariance functions still failed to indicate a clear spatial correlation between groundwater and tree core measurements. Nonetheless, tree core sampling and analysis proved to be a quick and inexpensive semi-quantitative method and a useful tool.
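
A minimal sketch of the rank-standardization step described above: skewed concentrations are replaced by ranks rescaled to (0, 1), intra-tree replicates are averaged to one value per tree, and the tree values are compared with groundwater values through a Spearman correlation. The data below, including the pairing of one groundwater value per tree, are synthetic stand-ins.

```python
import numpy as np
import pandas as pd
from scipy.stats import rankdata, spearmanr

rs = np.random.default_rng(2)
cores = pd.DataFrame({
    "tree": np.repeat(np.arange(30), 3),                      # 3 cores per tree (assumed)
    "ce": rs.lognormal(mean=1.0, sigma=1.5, size=90),          # skewed CE concentrations
})
groundwater = rs.lognormal(mean=2.0, sigma=1.0, size=30)       # one value per tree (synthetic pairing)

# Standardized ranks: rank / (n + 1), mapping the skewed values into (0, 1)
cores["std_rank"] = rankdata(cores["ce"]) / (len(cores) + 1)
tree_rank = cores.groupby("tree")["std_rank"].mean()           # average over intra-tree cores

rho, pval = spearmanr(tree_rank.values, groundwater)
print(f"Spearman correlation tree vs groundwater: rho={rho:.2f}, p={pval:.3f}")
```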


Geoderma | 2003

Continuous-valued map reconstruction with the Bayesian Maximum Entropy

Dimitri D'Or; Patrick Bogaert

Thematic maps are one of the most common tools for representing the spatial variation of a variable. They are easy to interpret thanks to the simplicity of presentation: clear boundaries define homogeneous areas. However, especially when the variable is continuous, abrupt changes between cartographic units are often unrealistic and the intra-unit variation is hidden behind a single representative value. In many applications, such unnatural transitions are not satisfactory, nor is the poor precision of such maps. As additional samples are often cost-prohibitive, one should try to use the information in the available map to evaluate the spatial variation of the variable under study. This paper shows how the Bayesian Maximum Entropy (BME) approach can be used to achieve such a goal using only the vague (soft) information in the map. BME is compared to a method frequently used in soil sciences: the legend quantification method. A simulated case study first illustrates that BME noticeably increases the precision of the estimates. The resulting BME maps have smooth transitions between mapping units, which conforms to the expected behavior of continuous variables. These observations are then corroborated in a real case study where the sand, silt and clay contents in soils have to be estimated from a soil map.
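
The 1-D sketch below is only a simplified illustration of the contrast above, not the full BME machinery of the paper: the legend-quantification estimate is taken as a single representative value per map unit (here its mid-value), while the "soft" estimate truncates a simple-kriging Gaussian prior to the interval given by the map unit. The covariance model, data and interval are illustrative assumptions.

```python
import numpy as np
from scipy.stats import truncnorm

def sk_prior(x0, x, z, mean, sill=1.0, rng=10.0):
    """Simple-kriging mean and variance at x0 from hard data (x, z), assumed exponential covariance."""
    c = sill * np.exp(-np.abs(x[:, None] - x[None, :]) / rng)
    c0 = sill * np.exp(-np.abs(x - x0) / rng)
    w = np.linalg.solve(c, c0)
    return mean + w @ (z - mean), sill - w @ c0

# Hard data (e.g. measured sand content, %) and a map unit stating 20-40 % at x0 (all assumed)
x_hard = np.array([2.0, 8.0, 25.0])
z_hard = np.array([34.0, 30.0, 22.0])
unit_low, unit_high = 20.0, 40.0

mu, var = sk_prior(x0=15.0, x=x_hard, z=z_hard, mean=30.0)
sd = np.sqrt(var)
a, b = (unit_low - mu) / sd, (unit_high - mu) / sd
soft_estimate = truncnorm.mean(a, b, loc=mu, scale=sd)    # kriging prior truncated to the map interval
legend_estimate = 0.5 * (unit_low + unit_high)            # one simple form of legend quantification

print(f"kriging prior mean: {mu:.1f}, soft-data estimate: {soft_estimate:.1f}, "
      f"legend quantification: {legend_estimate:.1f}")
```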


IEEE Transactions on Geoscience and Remote Sensing | 1996

Spatiotemporal analysis of spring water ion processes derived from measurements at the Dyle basin in Belgium

George Christakos; Patrick Bogaert

This paper deals with the study of natural variations and mapping of spatiotemporal spring water ion processes by means of stochastic analysis. Natural variations in space/time are the result of the combined effects of the physical, chemical, and topographical laws as well as the uncertainties and heterogeneities underlying the phenomenon under consideration. Maps of the space/time distribution of natural processes constitute a fundamental element of physical explanation and prediction, and are extremely valuable tools for numerous applications in environmental sciences including, e.g., water quality management, solute transport characterization, and human exposure to pollutants and hazardous substances. The spatiotemporal random field theory is applied to spring water solute contents (calcium, nitrate, and chloride ions) which are irregularly distributed in space/time over the Dyle river catchment area in Belgium. The integration of the spatial and temporal components in a space/time continuum has considerable advantages as regards the analytical investigation of solute content processes. It provides a rigorous characterization of the ion concentration data set, which exhibits a spatially nonhomogeneous and temporally nonstationary variability, in general. The physics of the situation can be expressed in terms of differential equations that emphasize the importance of space/time continuity. The characterization of the latter involves certain random field parameters. A rich class of covariance models is determined from the properties of these parameters that includes, as special cases, separable generalized covariance models. In practice, the results of the space/time analysis may depend on the scale under consideration and, thus, a scale level must be specified that reveals important features of the spatiotemporal solute content variability. The analysis leads to maps of continuity orders and covariance coefficients that provide information about space/time solute content correlations and trends. Solute content estimations and the associated estimation errors are calculated at unmeasured locations/instants over the Dyle region using a space/time estimation algorithm. The analysis is general and can be applied to various data sets from environmental, hydrogeologic, atmospheric, and meteorologic sciences.
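
As a small illustration of one step of the space/time analysis described above, the sketch below estimates an empirical space-time variogram surface from observations that are irregular in space and time, by binning point pairs into spatial and temporal lag classes. The synthetic ion-concentration data and the bin sizes are illustrative assumptions.

```python
import numpy as np

rs = np.random.default_rng(4)
n = 300
coords = rs.random((n, 2)) * 10.0                    # km, irregular monitoring points (synthetic)
times = rs.random(n) * 24.0                          # months
conc = np.sin(coords[:, 0]) + 0.1 * times / 24.0 + 0.2 * rs.normal(size=n)

# All pairwise spatial / temporal separations and half squared increments
i, j = np.triu_indices(n, k=1)
hs = np.linalg.norm(coords[i] - coords[j], axis=1)
ht = np.abs(times[i] - times[j])
sq = 0.5 * (conc[i] - conc[j]) ** 2                  # semivariogram cloud

s_edges = np.arange(0.0, 10.1, 1.0)                  # 1 km spatial lag classes
t_edges = np.arange(0.0, 25.0, 6.0)                  # 6 month temporal lag classes
gamma = np.full((len(s_edges) - 1, len(t_edges) - 1), np.nan)
for a in range(len(s_edges) - 1):
    for b in range(len(t_edges) - 1):
        m = (hs >= s_edges[a]) & (hs < s_edges[a + 1]) & \
            (ht >= t_edges[b]) & (ht < t_edges[b + 1])
        if m.any():
            gamma[a, b] = sq[m].mean()               # average semivariance in the bin

print("empirical space-time variogram surface (rows: space, cols: time):")
print(np.round(gamma, 3))
```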

Collaboration


Dive into Patrick Bogaert's collaborations.

Top Co-Authors

Marnik Vanclooster (Université catholique de Louvain)
Pierre Defourny (Université catholique de Louvain)
Dominique Fasbender (Université catholique de Louvain)
Bas van Wesemael (Université catholique de Louvain)
François Stevens (Université catholique de Louvain)
Sébastien Lambot (Université catholique de Louvain)
Marc L. Serre (University of North Carolina at Chapel Hill)
Agung Wahyudi (Université catholique de Louvain)