
Publication


Featured research published by Boris Kargoll.


Archive | 2010

GOCE Data Analysis: From Calibrated Measurements to the Global Earth Gravity Field

Jan Martin Brockmann; Boris Kargoll; I. Krasbutter; Wolf-Dieter Schuh; Martin Wermuth

The goal of this chapter is to describe an in-situ approach to determining a global Earth gravity model and its variance/covariance information on the basis of calibrated measurements from the GOCE mission. The main characteristics of this procedure are that the GOCE data are processed sequentially on a parallel computer system, iteratively via the method of preconditioned conjugate gradient multiple adjustment (PCGMA), and in situ via development of the functionals at the actual location and orientation of the gradiometer. We further explain the adaptation of the unknown stochastic model, determined by estimating decorrelation filters and variance components with respect to the GOCE observation types (i.e. SST, SGG, and regularizing prior information).
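The iterative core of such a solver can be illustrated with a generic preconditioned conjugate gradient sketch for small normal equations; the Jacobi preconditioner and toy data below are illustrative stand-ins, not the actual PCGMA implementation or its GOCE-specific preconditioning.

```python
import numpy as np

def pcg(N, b, M_inv, tol=1e-10, max_iter=100):
    """Preconditioned conjugate gradients for the normal equations N x = b.
    M_inv applies the preconditioner's inverse (a simple stand-in here)."""
    x = np.zeros_like(b)
    r = b - N @ x
    z = M_inv(r)
    p = z.copy()
    for _ in range(max_iter):
        Np = N @ p
        alpha = (r @ z) / (p @ Np)
        x = x + alpha * p
        r_new = r - alpha * Np
        if np.linalg.norm(r_new) < tol:
            break
        z_new = M_inv(r_new)
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

# Toy normal equations N x = b from a small least-squares problem
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
y = rng.standard_normal(50)
N, b = A.T @ A, A.T @ y
x = pcg(N, b, M_inv=lambda r: r / np.diag(N))  # Jacobi preconditioner
print(np.allclose(x, np.linalg.solve(N, b)))   # agrees with the direct solve
```

The appeal of the scheme is that `N` never needs to be factorized; only matrix-vector products are required, which is what makes sequential, parallel processing of large observation sets feasible.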


Archive | 2005

Comparison of some robust parameter estimation techniques for outlier analysis applied to simulated GOCE mission data

Boris Kargoll

Until now, methods of gravity field determination using satellite data have virtually excluded robust estimators despite the potentially disastrous effect of outliers. This paper presents computationally feasible algorithms for Huber’s M-estimator (a classic robust estimator) as well as for the class of R-estimators, which have not traditionally been considered for geodetic applications. It is shown that the computational time required for the proposed algorithms is comparable to the direct method of least squares. Furthermore, a study with simulated GOCE satellite gradiometry data demonstrates that the robust gravity field solution remains almost unaffected by additive outliers. In addition, using robustly estimated residuals proves to be more efficient at detecting outliers than using residuals resulting from least squares estimation. Finally, the non-parametric R-estimators make fewer assumptions about the measurement errors and produce results similar to Huber’s M-estimator, making that class a viable robust alternative.
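Huber's M-estimator is commonly computed by iteratively reweighted least squares. The following is a minimal generic sketch (not the paper's GOCE-specific algorithm); the tuning constant k = 1.345 and the toy regression data are standard illustrative choices.

```python
import numpy as np

def huber_irls(A, y, k=1.345, n_iter=50):
    """Huber M-estimation via iteratively reweighted least squares;
    k = 1.345 gives roughly 95% efficiency under normal errors."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]              # LS start
    for _ in range(n_iter):
        r = y - A @ x
        s = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust scale (MAD)
        s = max(s, 1e-12)
        u = np.abs(r) / s
        w = np.where(u <= k, 1.0, k / u)                  # Huber weights
        sw = np.sqrt(w)
        x = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)[0]
    return x

A = np.column_stack([np.ones(100), np.linspace(0, 1, 100)])
rng = np.random.default_rng(1)
y = A @ np.array([2.0, -1.0]) + 0.05 * rng.standard_normal(100)
y[::10] += 5.0                                            # additive outliers
print(huber_irls(A, y))                                   # close to [2, -1]
```

Each sweep costs one weighted least-squares solve, which is why the total effort stays comparable to a direct least-squares adjustment.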


Journal of Applied Geodesy | 2013

Expectation maximization algorithm for the variance-inflation model by applying the t-distribution

Karl-Rudolf Koch; Boris Kargoll

Abstract An adaptive robust estimation for the linear model using the t-distribution is available. Unknown weights for the observations are introduced to identify outliers, i.e. the variance-inflation model is applied. The EM (expectation maximization) algorithm is used to estimate the unknown parameters and results in an iteratively reweighted least squares adjustment; small weights indicate the outliers. However, it is found here that the weights increase continuously for outliers with small absolute values, without a clear indication of where the outliers stop and the regular observations begin. The suspected outliers are therefore introduced into the EM algorithm for a robust estimation based on the mean-shift model. It starts with zero weights for the outliers and shows, at the end of the iterations, a clear separation between the weights for the outliers and those for the regular observations. Thus, the EM algorithms for the variance-inflation and the mean-shift model complement each other: the first is very sensitive to outliers because of its adaptive estimation, and the second provides the distinction between outliers and regular observations.
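The E-step weights follow directly from the t-distribution: w_i = (nu + 1) / (nu + (r_i / sigma)^2). The sketch below holds the degrees of freedom nu fixed and uses hypothetical data, whereas Koch and Kargoll estimate the parameters adaptively; it is meant only to show how small weights flag outliers.

```python
import numpy as np

def t_em_weights(r, sigma, nu):
    """E-step weights of the variance-inflation model with t_nu errors:
    w_i = (nu + 1) / (nu + (r_i / sigma)^2); small w_i flags an outlier."""
    return (nu + 1.0) / (nu + (r / sigma) ** 2)

def em_t_regression(A, y, nu=4.0, n_iter=30):
    # Simplified EM sketch: nu is held fixed here, whereas the paper
    # also estimates the degrees of freedom.
    x = np.linalg.lstsq(A, y, rcond=None)[0]
    sigma2 = np.mean((y - A @ x) ** 2)
    for _ in range(n_iter):
        r = y - A @ x
        w = t_em_weights(r, np.sqrt(sigma2), nu)      # E-step
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ y)       # M-step: reweighted LS
        sigma2 = np.mean(w * (y - A @ x) ** 2)
    return x, w

rng = np.random.default_rng(1)
A = np.column_stack([np.ones(100), np.linspace(0, 1, 100)])
y = A @ np.array([1.0, 2.0]) + 0.1 * rng.standard_normal(100)
y[:5] += 3.0                          # five gross errors
x, w = em_t_regression(A, y)
print(x, w[:5].round(3))              # outlier weights near zero
```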


Journal of Applied Geodesy | 2015

Outlier detection by the EM algorithm for laser scanning in rectangular and polar coordinate systems

Karl-Rudolf Koch; Boris Kargoll

Abstract To visualize the surface of an object, laser scanners determine the rectangular coordinates of points of a grid on the surface of the object in a local coordinate system. Vertical angles, horizontal angles and distances of a polar coordinate system are measured during scanning. Outliers generally occur as gross errors in the distances. It is therefore investigated here whether rectangular or polar coordinates are better suited for the detection of outliers. The parameters of a surface represented by a polynomial are estimated in the nonlinear Gauss-Helmert (GH) model and in a linear model. Rectangular and polar coordinates are used, and it is shown that the results for both coordinate systems are identical. It turns out that the linear model is sufficient to estimate the parameters of the polynomial surface. Outliers are therefore identified in the linear model by the expectation maximization (EM) algorithm for the variance-inflation model and are confirmed by the EM algorithm for the mean-shift model. Again, rectangular and polar coordinates are used. The same outliers are identified in both coordinate systems.
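The two coordinate systems describe the same point, which is why the comparison is meaningful. A round-trip sketch, assuming one common scanner convention (zenith angle measured from the vertical axis; actual instruments differ):

```python
import numpy as np

def polar_to_rect(d, az, ze):
    """Distance d, horizontal angle az, zenith angle ze -> (x, y, z)."""
    return np.array([d * np.sin(ze) * np.cos(az),
                     d * np.sin(ze) * np.sin(az),
                     d * np.cos(ze)])

def rect_to_polar(p):
    d = np.linalg.norm(p)
    return d, np.arctan2(p[1], p[0]), np.arccos(p[2] / d)

d, az, ze = 12.3, 0.7, 1.2
p = polar_to_rect(d, az, ze)
print(rect_to_polar(p))   # recovers (12.3, 0.7, 1.2)
```

Note that a gross error in the measured distance d perturbs all three rectangular coordinates at once, which is the motivation for asking in which system outliers are more easily detected.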


Archive | 2004

The numerical treatment of the downward continuation problem for the gravity potential

Wolf-Dieter Schuh; Boris Kargoll

This paper discusses numerical and statistical techniques used to recover the gravity potential from GOCE mission data. In particular, it is shown in a closed-loop simulation that two completely different and independent solution strategies, i.e. the direct method and the semi-analytic approach, lead to essentially identical results. Both methods can give only a finite representation of the gravity potential. The truncation of the infinite series leads to a special type of regularization, denoted as spectral leakage; the size of this effect is estimated in a closed-loop simulation. To verify the quality of the gravity field recovery, a detailed analysis of the whole adjustment procedure is necessary. Therefore, a step-by-step statistical test strategy is introduced to validate the deterministic model as well as the stochastic assumptions by analyzing the residuals of the decorrelated adjustment problem. First, a test of randomness is applied to check the assumption of uncorrelated residuals. The assumption of stationarity is checked by variance analysis of different observation groups. Finally, the autocorrelation function and the periodogram of the transformed residuals are tested for significant correlations. To eliminate the remaining correlations, the filter model and thus the stochastic model is improved.
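The first of these checks, the randomness test, can be sketched with a Wald-Wolfowitz runs test on the residual signs; this is a generic version, and the paper's exact test statistics may differ.

```python
import numpy as np

def runs_test_z(res):
    """Runs test on the signs of the residuals about their median:
    z-statistic for the null hypothesis of randomness.
    |z| > 1.96 rejects at the 5% level."""
    s = res > np.median(res)
    n1, n2 = int(s.sum()), int((~s).sum())
    runs = 1 + int(np.count_nonzero(s[1:] != s[:-1]))
    mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (mu - 1.0) * (mu - 2.0) / (n1 + n2 - 1.0)
    return (runs - mu) / np.sqrt(var)

rng = np.random.default_rng(2)
white = rng.standard_normal(500)          # behaves like decorrelated residuals
print(abs(runs_test_z(white)) < 1.96)     # randomness usually not rejected
print(runs_test_z(np.cumsum(white)))      # strongly negative: serial correlation
```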


Journal of Geodesy | 2018

An iteratively reweighted least-squares approach to adaptive robust adjustment of parameters in linear regression models with autoregressive and t-distributed deviations

Boris Kargoll; Mohammad Omidalizarandi; Ina Loth; Jens-André Paffenholz; Hamza Alkhatib

In this paper, we investigate a linear regression time series model with possibly outlier-afflicted observations and autocorrelated random deviations. This colored noise is represented by a covariance-stationary autoregressive (AR) process in which the independent error components follow a scaled (Student’s) t-distribution. This error model allows for the stochastic modeling of multiple outliers and for an adaptive, robust maximum likelihood (ML) estimation of the unknown regression and AR coefficients, the scale parameter, and the degree of freedom of the t-distribution. This approach extends known estimators, which tend to focus only on the regression model, on the AR error model, or on normally distributed errors. For the purpose of ML estimation, we derive an expectation conditional maximization either (ECME) algorithm, which leads to an easy-to-implement version of iteratively reweighted least squares. The estimation performance of the algorithm is evaluated via Monte Carlo simulations for a Fourier as well as a spline model, in connection with AR colored-noise models of different orders and with three different sampling distributions generating the white-noise components. We apply the algorithm to a vibration dataset recorded by a high-accuracy, single-axis accelerometer, focusing on the evaluation of the estimated AR colored-noise model.
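A minimal sketch of this error model: AR(1) colored noise driven by t-distributed white noise, with the AR coefficient and degrees of freedom fixed at illustrative values (the ECME algorithm of the paper estimates them adaptively). Applying the AR filter with the true coefficient whitens the process.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, nu, n = 0.8, 5.0, 20000     # illustrative values, not from the paper

# Colored noise: AR(1) process driven by t-distributed white noise
e = rng.standard_t(nu, size=n)
u = np.empty(n)
u[0] = e[0]
for t in range(1, n):
    u[t] = alpha * u[t - 1] + e[t]

def lag1_corr(x):
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x @ x)

# Decorrelation filter e_t = u_t - alpha * u_{t-1} whitens the process
d = u[1:] - alpha * u[:-1]
print(round(lag1_corr(u), 2), round(lag1_corr(d), 2))  # ~0.8 vs ~0.0
```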


Archive | 2015

Magic Square of Real Spectral and Time Series Analysis with an Application to Moving Average Processes

I. Krasbutter; Boris Kargoll; Wolf-Dieter Schuh

This paper is concerned with the spectral analysis of stochastic processes that are real-valued, one-dimensional, discrete-time, covariance-stationary, and which have a representation as a moving average (MA) process. In particular, we will review the meaning and interrelations of four fundamental quantities in the time and frequency domain, (1) the stochastic process itself (which includes filtered stochastic processes), (2) its autocovariance function, (3) the spectral representation of the stochastic process, and (4) the corresponding spectral distribution function, or if it exists, the spectral density function. These quantities will be viewed as forming the corners of a square (the “magic square of spectral and time series analysis”) with various connecting lines, which represent certain mathematical operations between them. To demonstrate the evaluation of these operations, we will discuss the example of a q-th order MA process.
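One connecting line of the square, the relation between the autocovariance function (corner 2) and the spectral density (corner 4), can be checked numerically for a toy MA(2) process; the coefficients below are chosen arbitrarily for illustration.

```python
import numpy as np

# Toy MA(2) process u_t = e_t + 0.6 e_{t-1} - 0.3 e_{t-2}, Var(e_t) = sigma2
theta = np.array([1.0, 0.6, -0.3])
sigma2 = 2.0

def acov(h):
    """Corner (2): gamma(h) = sigma2 * sum_j theta_j * theta_{j+h}."""
    return sigma2 * sum(theta[j] * theta[j + h] for j in range(len(theta) - h))

def sdens(lam):
    """Corner (4): f(lam) = sigma2 / (2 pi) * |sum_j theta_j e^{-i j lam}|^2."""
    z = np.exp(-1j * np.outer(lam, np.arange(len(theta)))) @ theta
    return sigma2 / (2 * np.pi) * np.abs(z) ** 2

# gamma(0) equals the integral of f over (-pi, pi]; the rectangle rule
# is exact for trigonometric polynomials.
m = 4096
lam = -np.pi + 2 * np.pi * np.arange(m) / m
print(acov(0), sdens(lam).sum() * 2 * np.pi / m)   # both 2.9
```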


Archive | 2014

Adjustment of Digital Filters for Decorrelation of GOCE SGG Data

I. Krasbutter; Jan Martin Brockmann; Boris Kargoll; Wolf-Dieter Schuh

GOCE satellite gravity gradiometry (SGG) data are strongly autocorrelated within the various tensor components. These correlations can be taken into account in the least-squares adjustment for gravity field determination by means of digital decorrelation filters. Due to the complexity of the correlation pattern, the decorrelation filters used consist of a cascade of individual filters. In this contribution, some of the properties of these filters and their application to GOCE SGG data decorrelation are presented.
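The cascade idea can be illustrated with two made-up FIR stages; the actual GOCE decorrelation filters are ARMA cascades, so this shows only the FIR analogue.

```python
import numpy as np

f1 = np.array([1.0, -0.9])           # differencing-like first stage
f2 = np.array([1.0, 0.5, 0.25])      # second stage (coefficients made up)
x = np.random.default_rng(4).standard_normal(1000)

# Applying the stages in sequence equals applying one combined filter
# whose coefficients are the convolution of the individual ones.
cascade = np.convolve(np.convolve(x, f1), f2)
combined = np.convolve(x, np.convolve(f1, f2))
print(np.allclose(cascade, combined))   # True: convolution is associative
```

This associativity is what makes a cascade of simple, individually interpretable filters equivalent to one complex decorrelation filter.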


Journal of Applied Geodesy | 2017

Statistical evaluation of the influence of the uncertainty budget on B-spline curve approximation

Xin Zhao; Hamza Alkhatib; Boris Kargoll; Ingo Neumann

Abstract In the field of engineering geodesy, terrestrial laser scanning (TLS) has become a popular method for detecting deformations. This paper analyzes the influence of the uncertainty budget on free-form curves modeled by B-splines. Usually, free-form estimation is based on scanning points assumed to have equal accuracies, which is not realistic. Previous findings demonstrate that the residuals still contain random and systematic uncertainties caused by instrumental, object-related and atmospheric influences. In order to guarantee the quality of the derived estimates, it is essential to be aware of all uncertainties and their impact on the estimation. In this paper, a more detailed uncertainty budget is considered in the context of the “Guide to the Expression of Uncertainty in Measurement” (GUM), which leads to a refined, heteroskedastic variance-covariance matrix (VCM) of the TLS measurements. Furthermore, the control points of B-spline curves approximating a measured bridge are estimated. Comparisons are made between the B-spline curves estimated using a homoskedastic VCM on the one hand and the refined VCM on the other. To assess the statistical significance of the differences displayed by the estimates for the two stochastic models, a nested model misspecification test and a non-nested model selection test are described and applied. The test decisions indicate that the homoskedastic VCM should be replaced by a heteroskedastic VCM along the lines of the suggested one. However, the tests also indicate that the considered VCM is still inadequate for the given data set and should therefore be improved.
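The role of the VCM in the estimation can be sketched with generic generalized least squares on hypothetical data (a straight line instead of the paper's B-spline machinery): the refined, heteroskedastic VCM changes how strongly each scanning point influences the estimate.

```python
import numpy as np

def gls(A, y, Sigma):
    """Generalized least squares with variance-covariance matrix Sigma:
    x = (A^T Sigma^-1 A)^-1 A^T Sigma^-1 y."""
    Si = np.linalg.inv(Sigma)
    return np.linalg.solve(A.T @ Si @ A, A.T @ Si @ y)

rng = np.random.default_rng(5)
n = 200
A = np.column_stack([np.ones(n), np.linspace(0.0, 1.0, n)])
sig = np.linspace(0.1, 2.0, n)            # hypothetical per-point noise levels
y = A @ np.array([1.0, 3.0]) + sig * rng.standard_normal(n)

x_hom = gls(A, y, np.eye(n))              # homoskedastic VCM (equal accuracies)
x_het = gls(A, y, np.diag(sig ** 2))      # refined, heteroskedastic VCM
print(x_hom.round(2), x_het.round(2))     # both near [1, 3]
```

Both estimators are unbiased, but the heteroskedastic VCM yields smaller variances by downweighting the noisy points; the paper's tests ask whether the resulting differences in the estimates are statistically significant.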


Remote Sensing | 2018

Model Selection for Parametric Surfaces Approximating 3D Point Clouds for Deformation Analysis

Xin Zhao; Boris Kargoll; Mohammad Omidalizarandi; Xiangyang Xu; Hamza Alkhatib

Deformation monitoring of structures is a common application and one of the major tasks of engineering surveying. Terrestrial laser scanning (TLS) has become a popular method for detecting deformations due to its high precision and spatial resolution in capturing three-dimensional point clouds. Surface-based methodology plays a prominent role in rigorous deformation analysis. Consequently, it is of great importance to select an appropriate regression model that reflects the geometrical features of each state or epoch. This paper aims at providing the practitioner with some guidance in this regard. In contrast to standard model selection procedures for surface models based on information criteria, we adopted the hypothesis tests of D.R. Cox and Q.H. Vuong to discriminate statistically between parametric models. The methodology was instantiated in two numerical examples by discriminating between widely used polynomial and B-spline surfaces as models of given TLS point clouds. According to the test decisions, the B-spline surface model showed a slight advantage when both surface types had few parameters in the first example, while it performed significantly better for larger numbers of parameters. Within the B-spline surface models, the optimal one for the specific segment was selected by Vuong’s test, whose result was quite consistent with the judgment of the widely used Bayesian information criterion (BIC). The numerical instabilities of B-spline models due to data gaps were clearly reflected by the model selection tests, which rejected inadequate B-spline models in the second numerical example.
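The information-criterion baseline that the Cox and Vuong tests are compared against can be sketched as follows; to keep the sketch short it uses hypothetical data and polynomial curves of one variable rather than the paper's surfaces.

```python
import numpy as np

def bic(y, yhat, k):
    """Gaussian BIC: n*log(RSS/n) + k*log(n); lower is better."""
    n = len(y)
    return n * np.log(np.sum((y - yhat) ** 2) / n) + k * np.log(n)

# Hypothetical data generated by a quadratic curve plus noise
rng = np.random.default_rng(6)
t = np.linspace(-1.0, 1.0, 300)
y = 1.0 + 2.0 * t - 3.0 * t ** 2 + 0.1 * rng.standard_normal(300)

scores = [bic(y, np.polyval(np.polyfit(t, y, deg), t), deg + 1)
          for deg in range(1, 7)]
print(int(np.argmin(scores)) + 1)   # lowest BIC is typically the true degree, 2
```

The penalty term k*log(n) is what keeps the criterion from always preferring the model with more parameters, mirroring the role of the hypothesis tests in rejecting over-parameterized B-spline surfaces.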

Collaboration


Dive into Boris Kargoll's collaboration.

Top Co-Authors

- Vladik Kreinovich (University of Texas at El Paso)
- Hao Yang (University of Science and Technology)
- Olga Kosheleva (University of Texas at El Paso)