Steffen Unkel
Open University
Publications
Featured research published by Steffen Unkel.
Journal of Climate | 2009
Abdel Hannachi; Steffen Unkel; Nickolay T. Trendafilov; Ian T. Jolliffe
The complexity inherent in climate data makes it necessary to introduce more than one statistical tool to the researcher to gain insight into the climate system. Empirical orthogonal function (EOF) analysis is one of the most widely used methods to analyze weather/climate modes of variability and to reduce the dimensionality of the system. Simple structure rotation of EOFs can enhance interpretability of the obtained patterns but cannot provide anything more than temporal uncorrelatedness. In this paper, an alternative rotation method based on independent component analysis (ICA) is considered. ICA is viewed here as a method of EOF rotation. Starting from an initial EOF solution, rather than rotating the loadings toward simplicity, ICA seeks a rotation matrix that maximizes the independence between the components in the time domain. If the underlying climate signals have an independent forcing, one can expect to find loadings with interpretable patterns whose time coefficients have properties that go beyond the simple noncorrelation observed in EOFs. The methodology is presented and an application to the monthly mean sea level pressure (SLP) field is discussed. Among the rotated (to independence) EOFs, the North Atlantic Oscillation (NAO) pattern, an Arctic Oscillation–like pattern, and a Scandinavian-like pattern have been identified. There is the suggestion that the NAO is an intrinsic mode of variability independent of the Pacific.
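The rotation step can be sketched in a few lines: EOFs are obtained from an SVD of the anomaly field, and the retained principal component time series are then rotated towards temporal independence with ICA. The arrays below are hypothetical stand-ins for an SLP anomaly field, and the FastICA-based procedure is only an illustration of the general idea, not the authors' exact implementation.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 200))      # hypothetical anomaly field (months x grid points)

# EOF analysis via SVD of the centered field: U*S are the PC time series, rows of Vt the EOFs
U, S, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
k = 5
pc_scores = U[:, :k] * S[:k]             # leading principal component time series
eofs = Vt[:k]                            # leading spatial patterns

# Rotate the retained subspace so that the time coefficients are as independent as possible
ica = FastICA(n_components=k, whiten="unit-variance", random_state=0)
ic_scores = ica.fit_transform(pc_scores)         # temporally independent time coefficients
rotated_patterns = ica.mixing_.T @ eofs          # corresponding rotated spatial patterns
```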
Journal of Computational and Graphical Statistics | 2011
Nickolay T. Trendafilov; Steffen Unkel
A new approach for exploratory factor analysis (EFA) of data matrices with more variables p than observations n is presented. First, the classic EFA model (n > p) is considered as a specific data matrix decomposition with fixed unknown matrix parameters. Then, it is generalized to a new model, called for short GEFA, which covers both cases of data, with either n > p or p ≥ n. An alternating least squares algorithm GEFALS is proposed for simultaneous estimation of all GEFA model parameters. Like principal component analysis (PCA), GEFALS is based on singular value decomposition, which makes GEFA an attractive alternative to PCA for descriptive data analysis and dimensionality reduction. The existence and uniqueness of the GEFA parameter estimation is studied and the convergence properties of GEFALS are established. Finally, the new approach is applied to both artificial (Thurstone’s 26-variable box data) and real high-dimensional data, while the performance of GEFALS is illustrated with a simulation experiment. Some codes and data are available online as supplemental materials.
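The EFA-as-matrix-decomposition view lends itself to a compact alternating least squares sketch: X ≈ FΛ' + UΨ with [F U] column-orthonormal and Ψ diagonal. The code below is a simplified illustration assuming the classic case n ≥ p + r (and a centered data matrix); it is not the published GEFALS algorithm, which also covers p ≥ n.

```python
import numpy as np

def efa_als(X, r, n_iter=200, seed=0):
    """Simplified alternating least squares fit of X ~ F @ Lam.T + U @ Psi."""
    n, p = X.shape                             # assumes n >= p + r and X centered
    rng = np.random.default_rng(seed)
    Lam = rng.standard_normal((p, r))          # common-factor loadings
    Psi = np.eye(p)                            # diagonal unique-factor loadings
    for _ in range(n_iter):
        # Update B = [F U] by orthonormal Procrustes: SVD of X @ [Lam Psi]
        A = np.hstack([Lam, Psi])              # p x (r + p)
        P, _, Qt = np.linalg.svd(X @ A, full_matrices=False)
        B = P @ Qt                             # n x (r + p), with B.T @ B = I
        F, U = B[:, :r], B[:, r:]
        # Update loadings and unique variances by least squares
        Lam = X.T @ F
        Psi = np.diag(np.diag(X.T @ U))        # keep Psi diagonal
    return F, Lam, Psi
```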
Computational Statistics & Data Analysis | 2010
Steffen Unkel; Nickolay T. Trendafilov
A new approach for fitting the exploratory factor analysis (EFA) model is considered. The EFA model is fitted directly to the data matrix by minimizing a weighted least squares (WLS) goodness-of-fit measure. The WLS fitting problem is solved by iteratively performing unweighted least squares fitting of the same model. A convergent reweighted least squares algorithm based on iterative majorization is developed. The influence of large residuals in the loss function is curbed using Huber's criterion. This procedure leads to robust EFA that can resist the effect of outliers in the data. Applications to real and simulated data illustrate the performance of the proposed approach.
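The reweighting idea can be sketched as follows: Huber weights are computed from the current residuals, and the weighted problem is replaced by an unweighted fit of a working data matrix (a Kiers-style majorization step). The helper fit_efa() is a hypothetical placeholder for any unweighted least squares EFA fit, and in practice the residuals would be scaled by a robust scale estimate before weighting.

```python
import numpy as np

def huber_weights(residuals, c=1.345):
    """Huber weights: 1 for small residuals, c/|r| for large ones (all in [0, 1])."""
    a = np.abs(residuals)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def robust_efa(X, fit_efa, n_iter=50):
    """Iteratively reweighted fit; fit_efa(X_work) returns the fitted matrix F @ Lam.T + U @ Psi."""
    M = np.zeros_like(X)                     # current fitted matrix
    for _ in range(n_iter):
        W = huber_weights(X - M)             # downweight cells with large residuals
        X_work = W * X + (1.0 - W) * M       # working data for the unweighted fit
        M = fit_efa(X_work)                  # unweighted least squares fit of the EFA model
    return M
```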
Cancer Letters | 2015
Anne Ernst; Heike Anders; Heidi Kapfhammer; Michael Orth; Roman Hennel; Karin Seidl; Nicolas Winssinger; Claus Belka; Steffen Unkel; Kirsten Lauber
Radiotherapy is an essential part of multi-modal treatment for soft tissue sarcomas. Treatment failure is commonly attributed to radioresistance, but comprehensive analyses of radiosensitivity are not available, and suitable biomarkers or candidates for targeted radiosensitization are scarce. Here, we systematically analyzed the intrinsic radioresistance of a panel of soft tissue sarcoma cell lines, and extracted scores of radioresistance by principal component analysis (PCA). To identify molecular markers of radioresistance, transcriptomic profiling of DNA damage response regulators was performed. The expression levels of HSP90 and its clients ATR, ATM, and NBS1 revealed strong, positive correlations with the PCA-derived radioresistance scores. Their functional involvement was addressed by HSP90 inhibition, which preferentially sensitized radioresistant sarcoma cells and was accompanied by delayed γ-H2AX foci clearance and HSP90 client protein degradation. The induction of apoptosis and necrosis was not significantly enhanced, but increased levels of basal and irradiation-induced senescence upon HSP90 inhibition were detected. Finally, evaluation of our findings in the TCGA soft tissue sarcoma cohort revealed elevated expression levels of HSP90, ATR, ATM, and NBS1 in a relevant subset of cases with particularly poor prognosis, which might preferentially benefit from HSP90 inhibition in combination with radiotherapy in the future.
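The score-extraction step can be illustrated with a short sketch: the first principal component of the (log) clonogenic survival profiles serves as a radioresistance score, which is then correlated with expression levels. All inputs below are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
survival = rng.uniform(0.01, 1.0, size=(10, 6))   # hypothetical surviving fractions (cell lines x doses)
hsp90_expr = rng.normal(size=10)                  # hypothetical expression levels per cell line

# First principal component of the centered log-survival profiles as a resistance score
Xc = np.log(survival) - np.log(survival).mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
score = Xc @ Vt[0]

r, p_value = pearsonr(score, hsp90_expr)          # correlation of score with expression
```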
American Journal of Epidemiology | 2013
C. Paddy Farrington; Heather J. Whitaker; Steffen Unkel; Richard Pebody
In this paper, we propose new methods for investigating the extent of heterogeneity in effective contact rates relevant to the transmission of infections. These methods exploit the correlations between ages at infection for different infections within individuals. The methods are developed for serological surveys, which provide accessible individual data on several infections, and are applied to a wide range of infections. We find that childhood infections are often highly correlated within individuals in early childhood, with the correlations persisting into adulthood only for infections sharing a transmission route. We discuss 2 applications of the methods: 1) to making inferences about routes of transmission when these are unknown or uncertain and 2) to estimating epidemiologic parameters such as the basic reproduction number and the critical immunization threshold. Two examples of such applications are presented: elucidating the transmission route of polyomaviruses BK and JC and estimating the basic reproduction number and critical immunization coverage of varicella-zoster infection in Belgium, Italy, Poland, and England and Wales. We speculate that childhood correlations stem from confounding of different transmission routes and represent heterogeneity in childhood circumstances, notably nursery-school attendance. In contrast, it is suggested that correlations in adulthood are route-specific.
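For the second application, an estimated basic reproduction number translates into a critical immunization coverage through the standard relation p_c = 1 − 1/R0; the value below is purely hypothetical, not an estimate from the paper.

```python
R0 = 8.0                      # hypothetical basic reproduction number
p_c = 1.0 - 1.0 / R0          # critical immunization coverage, here 0.875
print(f"critical immunization coverage: {p_c:.1%}")
```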
Orphanet Journal of Rare Diseases | 2016
Steffen Unkel; Christian Röver; Nigel Stallard; Norbert Benda; Martin Posch; Sarah Zohar; Tim Friede
Background: Randomized controlled trials (RCTs) are the gold standard design of clinical research to assess interventions. However, RCTs cannot always be applied for practical or ethical reasons. To investigate the current practices in rare diseases, we review evaluations of therapeutic interventions in paediatric multiple sclerosis (MS) and Creutzfeldt-Jakob disease (CJD). In particular, we shed light on the endpoints used, the study designs implemented and the statistical methodologies applied. Methods: We conducted literature searches to identify relevant primary studies. Data on study design, objectives, endpoints, patient characteristics, randomization and masking, type of intervention, control, withdrawals and statistical methodology were extracted from the selected studies. The risk of bias and the quality of the studies were assessed. Results: Twelve (seven) primary studies on paediatric MS (CJD) were included in the qualitative synthesis. No double-blind, randomized placebo-controlled trial for evaluating interventions in paediatric MS has been published yet. Evidence from one open-label RCT is available. The observational studies are before-after studies or controlled studies. Three of the seven selected studies on CJD are RCTs, of which two received the maximum mark on the Oxford Quality Scale. Four trials are controlled observational studies. Conclusions: Evidence from double-blind RCTs on the efficacy of treatments appears to be variable between rare diseases. With regard to paediatric conditions, it remains to be seen what impact regulators will have through, e.g., paediatric investigation plans. Overall, there is room for improvement by using innovative trial designs and data analysis techniques.
Biostatistics | 2012
Steffen Unkel; C. Paddy Farrington
In this paper, a new measure for assessing the temporal variation in the strength of association in bivariate current status data is proposed. This novel measure is relevant for shared frailty models. We show that this measure is particularly convenient, owing to its connection with the relative frailty variance and its interpretability in suggesting appropriate frailty models. We introduce a method of estimation and standard errors for this measure. We discuss its properties and compare it to an existing measure of association applicable to current status data. Small sample performance of the measure in realistic scenarios is investigated using simulations. The methods are illustrated with bivariate serological survey data on a pair of infections, where the time-varying association is likely to represent heterogeneities in activity levels and/or susceptibility to infection.
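The quantity the measure connects to, the relative frailty variance RFV(t) = Var(Z | T > t) / E(Z | T > t)^2 under a proportional hazards frailty model S(t | Z) = exp(−Z Λ0(t)), can be illustrated with a small Monte Carlo sketch; for a gamma frailty the RFV is constant over time. This only illustrates the quantity, not the estimation method proposed in the paper, and the baseline hazard is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.gamma(shape=2.0, scale=0.5, size=200_000)    # gamma frailties with mean 1

def rfv(t, cum_hazard=lambda t: 0.1 * t):
    """Relative frailty variance among those still event-free at time t."""
    w = np.exp(-Z * cum_hazard(t))                   # P(T > t | Z), used as weights
    m1 = np.average(Z, weights=w)
    m2 = np.average(Z**2, weights=w)
    return (m2 - m1**2) / m1**2

print([round(rfv(t), 3) for t in (1, 10, 50)])       # roughly constant for gamma frailty
```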
Radiation Oncology | 2016
Steffen Unkel; Claus Belka; Kirsten Lauber
Background: The most frequently used method to quantitatively describe the response to ionizing irradiation in terms of clonogenic survival is the linear-quadratic (LQ) model. In the LQ model, the logarithm of the surviving fraction is regressed linearly on the radiation dose by means of a second-degree polynomial. The ratio of the estimated parameters for the linear and quadratic term, respectively, represents the dose at which both terms have the same weight in the abrogation of clonogenic survival. This ratio is known as the α/β ratio. However, there are plausible scenarios in which the α/β ratio fails to sufficiently reflect differences between dose-response curves, for example when curves with similar α/β ratio but different overall steepness are being compared. In such situations, the interpretation of the LQ model is severely limited. Methods: Colony formation assays were performed in order to measure the clonogenic survival of nine human pancreatic cancer cell lines and immortalized human pancreatic ductal epithelial cells upon irradiation at 0-10 Gy. The resulting dataset was subjected to LQ regression and non-linear log-logistic regression. Dimensionality reduction of the data was performed by cluster analysis and principal component analysis. Results: Both the LQ model and the non-linear log-logistic regression model resulted in accurate approximations of the observed dose-response relationships in the dataset of clonogenic survival. However, in contrast to the LQ model, the non-linear regression model allowed the discrimination of curves with different overall steepness but similar α/β ratio and revealed an improved goodness-of-fit. Additionally, the estimated parameters in the non-linear model exhibit a more direct interpretation than the α/β ratio. Dimensionality reduction of clonogenic survival data by means of cluster analysis was shown to be a useful tool for classifying radioresistant and sensitive cell lines. More quantitatively, principal component analysis allowed the extraction of scores of radioresistance, which displayed significant correlations with the estimated parameters of the regression models. Conclusions: Undoubtedly, LQ regression is a robust method for the analysis of clonogenic survival data. Nevertheless, alternative approaches including non-linear regression and multivariate techniques such as cluster analysis and principal component analysis represent versatile tools for the extraction of parameters and/or scores of the cellular response towards ionizing irradiation with a more intuitive biological interpretation. The latter are highly informative for correlation analyses with other types of data, including functional genomics data that are increasingly being generated.
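LQ regression itself amounts to fitting ln SF = −(αD + βD²) to the log surviving fractions and reading off the α/β ratio. A minimal sketch with hypothetical dose-response values:

```python
import numpy as np
from scipy.optimize import curve_fit

dose = np.array([0, 1, 2, 4, 6, 8, 10], dtype=float)          # dose in Gy
sf = np.array([1.0, 0.8, 0.55, 0.25, 0.08, 0.02, 0.005])      # hypothetical surviving fractions

def lq(d, alpha, beta):
    """Linear-quadratic model for the log surviving fraction."""
    return -(alpha * d + beta * d**2)

(alpha, beta), _ = curve_fit(lq, dose, np.log(sf), p0=(0.1, 0.01))
alpha_beta_ratio = alpha / beta    # dose at which linear and quadratic terms contribute equally
```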
Statistics and Computing | 2013
Nickolay T. Trendafilov; Steffen Unkel; Wojtek J. Krzanowski
Exploratory Factor Analysis (EFA) and Principal Component Analysis (PCA) are popular techniques for simplifying the presentation of, and investigating the structure of, an (n×p) data matrix. However, these fundamentally different techniques are frequently confused, and the differences between them are obscured, because they give similar results in some practical cases. We therefore investigate conditions under which they are expected to be close to each other, by considering EFA as a matrix decomposition so that it can be directly compared with the data matrix decomposition underlying PCA. Correspondingly, we propose an extended version of PCA, called the EFA-like PCA, which mimics the EFA matrix decomposition in the sense that they contain the same unknowns. We provide iterative algorithms for estimating the EFA-like PCA parameters, and derive conditions that have to be satisfied for the two techniques to give similar results. Throughout, we consider separately the cases n>p and p≥n. All derived algorithms and matrix conditions are illustrated on two data sets, one for each of these two cases.
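The PCA side of the comparison is a plain truncated SVD of the centered data matrix, X ≈ FA', whereas the EFA decomposition adds a unique-factor term UΨ with diagonal Ψ. A minimal sketch of the PCA part, with a hypothetical data matrix for the n > p case:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))            # hypothetical n x p data matrix, n > p case
Xc = X - X.mean(axis=0)

r = 3
U_, S_, Vt_ = np.linalg.svd(Xc, full_matrices=False)
F = U_[:, :r] * S_[:r]                       # component scores
A = Vt_[:r].T                                # component loadings (p x r)
pca_fit = F @ A.T                            # rank-r PCA reconstruction of the centered data
```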
Journal of The American Academy of Audiology | 2015
Jürgen Kiessling; Melanie Leifholz; Steffen Unkel; Jörn Pons-Kühnemann; Charlotte Thunberg Jespersen; Jenny Nesgaard Pedersen
BACKGROUND In-situ audiometry is a hearing aid feature that enables the measurement of hearing threshold levels through the hearing instrument using the built-in sound generator and the hearing aid receiver. This feature can be used in hearing aid fittings instead of conventional pure-tone audiometry (PTA), particularly in places where no standard audiometric equipment is available. Differences between conventional and in-situ thresholds are described and discussed for some particular hearing aids. No previous investigation has measured and compared these differences for a number of current hearing aid models by various manufacturers across a wide range of hearing losses. PURPOSE The purpose of this study was to perform a model-based comparison of conventionally and in-situ measured hearing thresholds. Data were collected for a range of hearing aid devices to study and generalize the effects that may occur under clinical conditions. RESEARCH DESIGN Research design was an experimental and regression study. STUDY SAMPLE A total of 30 adults with sensorineural hearing loss served as test persons. They were assigned to three subgroups of 10 subjects with mild (M), moderate to severe (MS), and severe (S) sensorineural hearing loss. INTERVENTION All 30 test persons underwent both conventional PTA and in-situ audiometry with four hearing aid models by various manufacturers. DATA COLLECTION AND ANALYSIS The differences between conventionally and in-situ measured hearing threshold levels were calculated and evaluated by an exploratory data analysis followed by a sophisticated statistical modeling process. RESULTS At 500 and 1500 Hz, almost all threshold differences (conventional PTA minus in-situ data) were negative, i.e., in the low to mid frequencies, hearing loss was overestimated by most devices relative to PTA. At 4000 Hz, the majority of differences (7 of 12) were positive, i.e., in the frequency range above 1500 Hz, hearing loss was frequently underestimated. As hearing loss increased (M→MS→S), the effect of the underestimation decreased. At 500 and 1500 Hz, Resound devices showed the smallest threshold deviations, followed by Phonak, Starkey, and Oticon instruments. At 4000 Hz, this observed pattern partly disappeared and Starkey and Oticon devices showed a reversed effect with increasing hearing loss (M→MS→S). Because of high standard errors for the estimates, only a few explicit rankings of the devices could be established based on significant threshold differences (5% level). CONCLUSIONS Differences between conventional PTA and in-situ threshold levels may be attributed to (1) frequency, (2) device/hearing loss, and (3) calibration/manufacturer effects. Frequency effects primarily resulting in an overestimation of hearing loss by in-situ audiometry in the low and mid frequencies are mainly due to sound drain-off through vents and leaks. Device/hearing loss effects may be due to leakage as well as boundary effects because in-situ audiometry is confined to a limited measurement range. Finally, different calibration approaches may result in different offset levels between PTA and in-situ audiometry calibration. In some cases, the observed threshold differences of up to 10-15 dB may translate to varied hearing aid fittings for the same user depending on how hearing threshold levels were measured.
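The basic quantity analysed, the per-measurement difference between conventional PTA and in-situ thresholds, can be tabulated in a few lines; all column names and values below are hypothetical.

```python
import pandas as pd

df = pd.DataFrame({
    "subject": [1, 1, 2, 2],
    "device": ["A", "A", "B", "B"],
    "freq_hz": [500, 4000, 500, 4000],
    "pta_db": [45, 60, 70, 75],              # hypothetical conventional thresholds (dB HL)
    "insitu_db": [50, 55, 75, 70],           # hypothetical in-situ thresholds (dB HL)
})
df["diff_db"] = df["pta_db"] - df["insitu_db"]   # negative: hearing loss overestimated in situ
summary = df.groupby(["freq_hz", "device"])["diff_db"].agg(["mean", "std"])
```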