Sonja Kuhnt
Dortmund University of Applied Sciences and Arts
Publications
Featured research published by Sonja Kuhnt.
Statistics and Computing | 2012
Thomas Muehlenstaedt; Olivier Roustant; Laurent Carraro; Sonja Kuhnt
Kriging models have been widely used in computer experiments for the analysis of time-consuming computer codes. Based on kernels, they are flexible and can be tuned to many situations. In this paper, we construct kernels that reproduce the computer code's complexity by mimicking its interaction structure. While the standard tensor-product kernel implicitly assumes that all interactions are active, the new kernels are suited to a general interaction structure and take advantage of the absence of interaction between some inputs. The methodology is twofold. First, the interaction structure is estimated from the data using an initial standard Kriging model and is represented by a so-called FANOVA graph; new FANOVA-based sensitivity indices are introduced to detect active interactions. Then this graph is used to derive the form of the kernel, and the corresponding Kriging model is estimated by maximum likelihood. The performance of the overall procedure is illustrated by several 3-dimensional and 6-dimensional simulated and real examples. A substantial improvement is observed when the computer code has a relatively high level of complexity.
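As a rough illustration of the kernel construction, the sketch below builds an additive Kriging predictor whose kernel follows an assumed FANOVA graph with cliques {x1, x2} and {x3}, i.e. no interaction between x3 and the other inputs. The toy function, the fixed lengthscales and the hard-coded cliques are illustrative assumptions; the paper estimates the graph from the data and fits the kernel by maximum likelihood.

```python
# Minimal sketch (not the authors' code): Kriging with a FANOVA-graph kernel.
import numpy as np

rng = np.random.default_rng(0)

def toy_code(X):
    # interaction only between x1 and x2; x3 enters additively
    return np.sin(X[:, 0] * X[:, 1]) + 0.5 * X[:, 2] ** 2

def clique_kernel(A, B, idx, ls):
    # tensor product of 1-D Gaussian kernels over the clique variables `idx`
    Ai = A[:, idx][:, None, :]
    Bi = B[:, idx][None, :, :]
    d2 = ((Ai - Bi) ** 2).sum(axis=2) / ls ** 2
    return np.exp(-0.5 * d2)

def fanova_kernel(A, B):
    # additive combination over the cliques of the assumed FANOVA graph
    return clique_kernel(A, B, [0, 1], 0.7) + clique_kernel(A, B, [2], 0.7)

# training design and the usual Kriging (GP regression) predictor
X = rng.uniform(-1, 1, size=(40, 3))
y = toy_code(X)
K = fanova_kernel(X, X) + 1e-6 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

X_new = rng.uniform(-1, 1, size=(5, 3))
y_pred = fanova_kernel(X_new, X) @ alpha
print(np.c_[y_pred, toy_code(X_new)])  # predicted vs. true values
```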
Journal of Simulation | 2010
Sonja Kuhnt; Sigrid Wenzel
Design, organisation and management of Large Logistics Networks (LLN) usually involve model-based analyses of the networks. The usefulness of such an analysis depends heavily on the quality of the input data, which should capture the real circumstances as closely as possible. In this paper, an advanced procedure model for structured, goal- and task-oriented information and data acquisition for the model-based analysis of LLN is proposed. This procedure model differs from other approaches by focussing on information acquisition rather than solely on data acquisition, and by employing a consistent verification and validation concept. All steps of the procedure model (Goal Setting, Information Identification, Preparation of Information and Data Collection, Information and Data Collection, Data Recording, Data Structuring, Statistical Data Analysis, Data Usability Test) are described and exemplified for an air freight flow network.
Technical reports | 2003
Sonja Kuhnt; Jörg Pawlitschko
Observations which seem to deviate strongly from the main part of the data may occur in every statistical analysis. These observations, usually labelled as outliers, may cause completely misleading results when standard methods are used, yet they may also contain information about special events or dependencies. We discuss outliers in situations where a generalized linear model is assumed as the null model for the regular data and introduce rules for their identification. For the special cases of a loglinear Poisson model and a logistic regression model, one-step identifiers based on robust and non-robust estimators are proposed and compared.
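To make the idea of a one-step identification rule concrete, the sketch below fits a loglinear Poisson model and flags observations with large deviance residuals. The simulated data, the cutoff of 3 and the non-robust fit are illustrative assumptions; the identifiers studied in the report are built on robust estimators and differ in detail.

```python
# Sketch of a simple outlier identification rule for a loglinear Poisson model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.uniform(0, 2, size=100)
mu = np.exp(0.5 + 1.2 * x)
y = rng.poisson(mu)
y[::25] += 30                     # plant a few gross outliers

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()

# flag observations whose deviance residual exceeds a fixed (hypothetical) cutoff
resid = fit.resid_deviance
cutoff = 3.0
print(np.where(np.abs(resid) > cutoff)[0])
```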
Technical reports | 2002
Sonja Kuhnt; Claudia Becker
Graphical modeling, as a form of multivariate analysis, has turned out to be a capable tool for the detection and modeling of complex dependency structures. Statistical models are related to graphs in which variables are represented by points and associations between pairs of variables by lines. The usefulness of graphical modeling of course depends on finding a graphical model that fits the data appropriately. We investigate how existing model building strategies and estimation methods can be affected by model disturbances or outlying observations. The focus of our sensitivity analysis lies on mixed graphical models, in which both discrete and continuous variables are considered.
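As a small illustration of how sensitive an estimated dependency graph can be to a single outlying observation, the sketch below estimates an undirected graph with the graphical lasso before and after contaminating one observation. It uses continuous (Gaussian) variables and a generic estimator only; the report itself concerns mixed graphical models and other model building strategies.

```python
# Continuous-variable illustration of outlier sensitivity in graph estimation.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)
n, p = 200, 4
X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)
X[:, 1] += 0.8 * X[:, 0]          # true association between variables 0 and 1

def edges(data, alpha=0.1):
    # off-diagonal non-zeros of the estimated precision matrix define the graph
    prec = GraphicalLasso(alpha=alpha, max_iter=200).fit(data).precision_
    return (np.abs(prec) > 1e-6) & ~np.eye(p, dtype=bool)

clean = edges(X)
X_contaminated = X.copy()
X_contaminated[0] = 25.0          # one gross outlier in all variables
dirty = edges(X_contaminated)
print("edges changed by a single outlier:", (clean != dirty).any())
```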
Statistical Modelling | 2004
Sonja Kuhnt; Martina Erdbrügge
In this article, we provide a strategy for the simultaneous optimization of multiple responses. Cases are covered where a set of response variables has finite target values and depends on easy-to-control as well as hard-to-control variables. Our approach is based on loss functions, without the need for a predefined cost matrix. For each element of a sequence of possible weights assigned to the individual responses, settings of the easy-to-control parameters are determined which minimize the estimated mean of a multivariate loss function. The estimation is based on statistical models which depend only on the easy-to-control variables. The loss function itself takes the value zero if all responses are on target with zero variances. Each derived parameter setting corresponds to a specific compromise between the responses, which is displayed graphically to the engineer in so-called joint optimization plots. The expert can thereby gain valuable insight into the production process and then decide on the most sensible parameter setting. The proposed strategy is illustrated with a data set from the literature and new data from a current application.
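The weight-sweep idea can be sketched as follows, using hypothetical quadratic response models, variances and targets in place of the fitted statistical models of the paper: for each weight in a sequence, the controllable setting that minimizes the estimated mean loss (squared bias plus variance) is recorded, giving the points a joint optimization plot would display.

```python
# Sketch of the weight sweep behind joint optimization plots (toy models only).
import numpy as np
from scipy.optimize import minimize_scalar

targets = np.array([5.0, 2.0])

def mean(x):                       # predicted means of the two responses
    return np.array([4.0 + 0.8 * x, 3.5 - 0.6 * x])

def var(x):                        # predicted variances of the two responses
    return np.array([0.2 + 0.05 * x ** 2, 0.4])

def expected_loss(x, w):
    # estimated mean of the quadratic loss: weighted squared bias plus variance;
    # zero only if all responses are on target with zero variance
    return np.sum(w * ((mean(x) - targets) ** 2 + var(x)))

for w1 in np.linspace(0.1, 0.9, 5):          # sequence of weights
    w = np.array([w1, 1.0 - w1])
    res = minimize_scalar(lambda x: expected_loss(x, w),
                          bounds=(0, 5), method="bounded")
    print(f"w1={w1:.1f}  x*={res.x:.2f}  means={mean(res.x).round(2)}")
```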
Quality and Reliability Engineering International | 2011
Martina Erdbrügge; Sonja Kuhnt; Nikolaus Rudak
Most of the existing methods for the analysis and optimization of multiple responses require some kind of weighting of these responses, for instance in terms of cost or desirability. Particularly at the design stage, such information is hardly available or is rather subjective. An alternative strategy uses loss functions and a penalty matrix that can be decomposed into a standardizing (data-driven) matrix and a weight matrix. The effect of different weight matrices is displayed in joint optimization plots in terms of predicted means and variances of the response variables. In this article, we propose how to choose weight matrices for two and more responses. Furthermore, we prove the Pareto optimality of every point that minimizes the conditional mean of the loss function.
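A minimal numerical sketch of evaluating the conditional mean of such a quadratic loss for different weight matrices is given below. It uses the standard identity that the conditional mean of (Y - tau)'C(Y - tau) equals (mu - tau)'C(mu - tau) + trace(C Sigma); the specific numbers and the simple multiplicative form chosen for the penalty matrix are assumptions for illustration, not the decomposition defined in the paper.

```python
# Sketch: expected quadratic loss for different weight matrices (toy numbers).
import numpy as np

tau = np.array([5.0, 2.0])                 # targets
mu = np.array([4.6, 2.3])                  # predicted means at some setting
Sigma = np.diag([0.3, 0.1])                # predicted covariance

S = np.diag(1.0 / np.diag(Sigma))          # data-driven standardizing matrix
for w1 in (0.2, 0.5, 0.8):
    W = np.diag([w1, 1.0 - w1])            # weight matrix
    C = S @ W                              # penalty matrix (assumed form)
    risk = (mu - tau) @ C @ (mu - tau) + np.trace(C @ Sigma)
    print(f"w1={w1:.1f}  expected loss={risk:.3f}")
```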
Production Engineering | 2011
M. Gösling; H. Kracker; Alexander Brosius; Sonja Kuhnt; A. E. Tekkaya
In this article, strategies that compensate for geometrical deviations caused by springback are discussed using finite element simulations and statistical modelling techniques. First, the ability to predict springback with a finite element simulation model is analysed; for that purpose, numerical predictions and experiments are compared with respect to the amount of springback. Next, different strategies for compensating springback, such as modifications of the stress state, the component stiffness and the tool geometry, are introduced. On the basis of finite element simulations, these compensation strategies are illustrated for a stretch-bending process and checked experimentally for one example. Finally, springback simulations are compared regarding their robustness against noise variables such as friction and material properties. A method based on statistical prediction models is introduced which approximates the springback distribution accurately with less numerical effort than a classical Monte Carlo approach.
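The metamodel-based alternative to a full Monte Carlo study can be sketched as follows: a Gaussian process surrogate is trained on a small design of (here hypothetical) simulation runs, and the noise-variable distribution is then propagated through the cheap surrogate. The stand-in "simulation", the variable ranges and the noise distributions are illustrative assumptions, not values from the paper.

```python
# Sketch: approximate a springback distribution via a cheap statistical model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

def fe_simulation(friction, yield_stress):
    # hypothetical springback angle as a function of two noise variables
    return 2.0 + 8.0 * friction - 0.004 * yield_stress

# a small design of "expensive" runs is enough to train the surrogate
X_train = rng.uniform([0.05, 200.0], [0.20, 400.0], size=(20, 2))
y_train = fe_simulation(X_train[:, 0], X_train[:, 1])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=[0.1, 100.0]),
                              normalize_y=True).fit(X_train, y_train)

# propagate the noise-variable distribution through the cheap surrogate
noise = np.column_stack([rng.normal(0.12, 0.02, 10_000),
                         rng.normal(300.0, 25.0, 10_000)])
springback = gp.predict(noise)
print(f"mean={springback.mean():.2f}, std={springback.std():.2f}")
```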
Journal of Multivariate Analysis | 2016
Sonja Kuhnt; André Rehage
A measure especially designed for detecting shape outliers in functional data is presented. It is based on the tangential angles at the intersections of the centred data and can be interpreted like a data depth. Owing to its theoretical properties, we call it the functional tangential angle (FUNTA) pseudo-depth; we also introduce a robustified version (rFUNTA). The existence of intersection angles is ensured through the centring. Assuming that shape outliers in functional data follow a different pattern, their distribution of intersection angles differs from that of the regular curves. We furthermore formulate a population version of FUNTA in the context of Gaussian processes, determine sample breakdown points of FUNTA and compare its performance with respect to outlier detection in simulation studies and a real data example.
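The intersection-angle idea can be illustrated roughly as below: discretised curves are centred, pairwise crossings are located, and the tangent angles at those crossings are aggregated per curve. This is a simplified, hypothetical scoring, not the exact FUNTA or rFUNTA definition from the paper.

```python
# Simplified illustration of the intersection-angle idea behind FUNTA.
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 101)
curves = np.array([np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size)
                   for _ in range(20)])
curves[0] = np.sin(4 * np.pi * t)              # one shape outlier

centred = curves - np.median(curves, axis=0)   # centring ensures crossings
slopes = np.gradient(centred, t, axis=1)

def crossing_angles(i, j):
    # tangent-angle differences at the crossings of curves i and j
    diff = centred[i] - centred[j]
    cross = np.where(np.sign(diff[:-1]) * np.sign(diff[1:]) < 0)[0]
    return np.abs(np.arctan(slopes[i][cross]) - np.arctan(slopes[j][cross]))

n = len(curves)
score = np.array([np.mean(np.concatenate(
    [crossing_angles(i, j) for j in range(n) if j != i]))
    for i in range(n)])
print("most outlying curve:", np.argmax(score))  # large angles in this toy
```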
Reliability Engineering & System Safety | 2015
Jana Fruth; Olivier Roustant; Sonja Kuhnt
Computer experiments are nowadays commonly used to analyze industrial processes aimed at achieving a desired outcome. Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on the response variable. In this work we focus on sensitivity analysis of a scalar-valued output of a time-consuming computer code depending on scalar and functional input parameters. We investigate a sequential methodology, based on piecewise constant functions and sequential bifurcation, which is both economical and fully interpretable. The new approach is applied to a sheet metal forming problem in three sequential steps, resulting in new insights into the behavior of the forming process over time.
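A much simplified sketch of screening a functional input through piecewise-constant levels and recursive splitting is given below. The toy "simulation", the isolated group effects and the fixed splitting depth are assumptions made for illustration; the sequential bifurcation procedure of the paper differs in how group effects are measured.

```python
# Simplified group screening of a functional input via recursive splitting.
import numpy as np

t = np.linspace(0, 1, 200)
dt = t[1] - t[0]
weight = np.where(t > 0.75, 1.0, 0.0)        # only the late part of x(t) matters

def code(x_curve):
    # scalar output of the stand-in "simulation": a weighted integral of x(t)
    return np.sum(weight * x_curve) * dt

def curve(a, b, level):
    # piecewise-constant functional input: `level` on [a, b), zero elsewhere
    x = np.zeros_like(t)
    x[(t >= a) & (t < b)] = level
    return x

def bifurcate(a, b, tol=1e-3, depth=0):
    # effect of switching the interval [a, b) from its low to its high level
    effect = code(curve(a, b, 1.0)) - code(curve(a, b, 0.0))
    if abs(effect) < tol:
        return []                            # interval judged inactive
    if depth == 3:                           # stop splitting at this resolution
        return [(round(a, 3), round(b, 3), round(effect, 3))]
    m = (a + b) / 2
    return bifurcate(a, m, tol, depth + 1) + bifurcate(m, b, tol, depth + 1)

print(bifurcate(0.0, 1.0))                   # influential sub-intervals of x(t)
```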
Acta Acustica united with Acustica | 2008
Sonja Kuhnt; Christoph Schürmann; Martin Schütte; Edna Wenning; Barbara Griefahn; Matthias Vormann; Jürgen Hellbrück
Annoyance is one of the most studied reactions to noise. Nevertheless, little is known about the effect of the simultaneous occurrence of noise from different sources. Existing models that predict annoyance resulting from combined noise sources are derived from results for single sources and have not yet been validated. The present study empirically investigates the annoyance caused by different combinations of road and rail traffic noise in a laboratory experiment. Seventy-two volunteers were exposed to different noise scenarios consisting of combinations of road and rail traffic noise. During noise presentation, the participants had to carry out a task on a personal computer; after each noise scenario, they rated their subjective annoyance. A statistical model describing the relationship between noise exposure, task difficulty and annoyance is derived from the resulting data set.
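Purely as an illustration of fitting such a relationship, the sketch below regresses simulated annoyance ratings on road level, rail level and task difficulty. The data, the variable names and the plain linear model are hypothetical; the study's actual model is derived from its experimental data set.

```python
# Hypothetical sketch: annoyance as a function of noise levels and task difficulty.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 300
df = pd.DataFrame({
    "road_db": rng.uniform(40, 70, n),
    "rail_db": rng.uniform(40, 70, n),
    "task": rng.choice(["easy", "hard"], n),
})
# simulated ratings with additive effects of both sources and task difficulty
df["annoyance"] = (0.05 * df.road_db + 0.03 * df.rail_db
                   + 0.8 * (df.task == "hard") + rng.normal(0, 0.5, n))

model = smf.ols("annoyance ~ road_db + rail_db + task", data=df).fit()
print(model.summary().tables[1])
```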