Publications


Featured research published by Tatjana Pavlenko.


Journal of Statistical Planning and Inference | 2003

On feature selection, curse-of-dimensionality and error probability in discriminant analysis

Tatjana Pavlenko

Discrimination performance, measured by the limiting error probability, is considered from the point of view of feature discriminating power. For assessing the latter, a concept of feature informativeness is introduced. A threshold feature selection technique is considered: selection is incorporated into the discriminant function by means of an inclusion-exclusion factor which eliminates the sets of features whose informativeness does not exceed a given threshold. An issue is how this selection procedure affects the error rate when sample-based estimates are used in the discriminant function. This effect is evaluated in a growing-dimension asymptotic framework. In particular, the increase of the moments of the discriminant function induced by the curse of dimensionality is shown, together with the effect of the threshold-based feature selection. The asymptotic normality of the discriminant function is established, which makes it possible to express the overall error probability in closed form and view it as a function of a given selection threshold.
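A schematic form of such a thresholded discriminant (an illustration of the idea only, not the paper's exact construction; the sub-discriminants and the informativeness measure below are placeholders) is

    \hat g_c(x) = \sum_{k=1}^{K} \mathbf{1}\{\hat\gamma_k > c\}\, \hat g_k(x_{(k)}),

where \hat g_k is the sample-based sub-discriminant built from the k-th set of features x_{(k)}, \hat\gamma_k is its estimated informativeness, and the indicator \mathbf{1}\{\hat\gamma_k > c\} acts as the inclusion-exclusion factor for the threshold c.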


Statistics | 2001

Effect of dimensionality on discrimination

Tatjana Pavlenko; Dietrich von Rosen

Discrimination problems in a high-dimensional setting are considered. New results concern the role of the dimensionality in the performance of the discrimination procedure. Assuming that the data have a block structure, two different asymptotic approaches are presented. These approaches are characterized by different types of relations between the dimensionality and the size of the training samples. Asymptotic expressions for the error probabilities are obtained and a consistent approximation of the discriminant function is proposed. Throughout the paper the importance of the dimensionality in the asymptotic analysis is stressed.
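A toy Monte Carlo experiment (not taken from the paper; identity covariance and a fixed class separation are assumed purely for illustration) shows the kind of dimensionality effect studied here: the plug-in discriminant degrades as the dimension grows while the training sample size stays fixed.

    # Toy illustration: plug-in linear discriminant with known identity covariance;
    # the misclassification rate grows with p/n even though the Bayes error is fixed.
    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(0)
    n, delta = 50, 2.0                                     # training size per class, mean separation
    bayes_error = 0.5 * (1 + erf(-delta / 2 / sqrt(2)))    # Phi(-delta/2), about 0.159

    for p in (10, 50, 100, 200):
        mu = np.zeros(p); mu[0] = delta                    # all separation in the first coordinate
        errs = []
        for _ in range(200):
            X0 = rng.standard_normal((n, p))               # class 0 training sample
            X1 = rng.standard_normal((n, p)) + mu          # class 1 training sample
            w = X1.mean(0) - X0.mean(0)                    # estimated discriminant direction
            mid = (X0.mean(0) + X1.mean(0)) / 2
            T0 = rng.standard_normal((500, p))             # fresh test data, class 0
            T1 = rng.standard_normal((500, p)) + mu        # fresh test data, class 1
            errs.append(0.5 * (np.mean((T0 - mid) @ w > 0) + np.mean((T1 - mid) @ w <= 0)))
        print(f"p={p:4d}  estimated error {np.mean(errs):.3f}  (Bayes error {bayes_error:.3f})")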


Journal of Applied Statistics | 2012

Covariance structure approximation via gLasso in high-dimensional supervised classification

Tatjana Pavlenko; Anders Björkström; Annika Tillander

Recent work has shown that Lasso-based regularization is very useful for estimating the high-dimensional inverse covariance matrix. A particularly useful scheme is based on penalizing the ℓ1 norm of the off-diagonal elements to encourage sparsity. We embed this type of regularization into high-dimensional classification. A two-stage estimation procedure is proposed which first recovers structural zeros of the inverse covariance matrix and then enforces block sparsity by moving non-zeros closer to the main diagonal. We show that the block-diagonal approximation of the inverse covariance matrix leads to an additive classifier, and demonstrate that accounting for the structure can yield better performance accuracy. The effect of the block size on classification is explored, and a class of asymptotically equivalent structure approximations in a high-dimensional setting is specified. We suggest variable selection at the block level and investigate properties of this procedure in growing-dimension asymptotics. We present a consistency result for the feature selection procedure, establish asymptotic lower and upper bounds for the fraction of separative blocks, and specify constraints under which reliable classification with block-wise feature selection can be performed. The relevance and benefits of the proposed approach are illustrated on both simulated and real data.
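A minimal sketch of the first ingredient, assuming scikit-learn's GraphicalLasso as the ℓ1-penalized inverse covariance estimator (the paper's two-stage block-diagonal approximation and block-wise feature selection are not reproduced here):

    # Sketch: sparse pooled precision matrix via graphical Lasso, plugged into an
    # LDA-type Gaussian rule; two classes with a shared covariance are assumed.
    import numpy as np
    from sklearn.covariance import GraphicalLasso

    def fit_glasso_lda(X0, X1, alpha=0.1):
        """Estimate a sparse pooled inverse covariance and return a linear rule."""
        m0, m1 = X0.mean(0), X1.mean(0)
        pooled = np.vstack([X0 - m0, X1 - m1])                       # centred, pooled sample
        omega = GraphicalLasso(alpha=alpha).fit(pooled).precision_   # sparse inverse covariance
        w = omega @ (m1 - m0)
        b = -0.5 * (m1 + m0) @ w
        return lambda X: (X @ w + b > 0).astype(int)                 # 1 means "class of X1"

    # usage on synthetic data (first five features carry the class difference)
    rng = np.random.default_rng(1)
    p = 40
    X0 = rng.standard_normal((100, p))
    X1 = rng.standard_normal((100, p)); X1[:, :5] += 1.0
    rule = fit_glasso_lda(X0, X1)
    Xt = rng.standard_normal((200, p)); Xt[:, :5] += 1.0
    print("accuracy on class-1 test data:", rule(Xt).mean())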


Journal of Statistical Theory and Practice | 2014

Modified Jarque–Bera Type Tests for Multivariate Normality in a High-Dimensional Framework

Kazuyuki Koizumi; Masashi Hyodo; Tatjana Pavlenko

In this article, we introduce two types of new omnibus procedures for testing multivariate normality based on sample measures of multivariate skewness and kurtosis. These characteristics, initially introduced by, for example, Mardia (1970) and Srivastava (1984), were then extended by Koizumi, Okamoto, and Seo (2009), who proposed the multivariate Jarque-Bera type test (MJB1) based on the Srivastava (1984) principal-component-based measures of skewness and kurtosis. We suggest an improved MJB test (MJB2) that is based on the Wilson-Hilferty transform, and a modified MJB test (mMJB) that is based on an F-approximation. Asymptotic properties of both tests are examined, assuming that both the dimensionality and the sample size go to infinity at the same rate. Our simulation study shows that the suggested mMJB test outperforms both MJB1 and MJB2 in a number of high-dimensional scenarios. The mMJB test is then used for testing multivariate normality of a real data set of digitized character images.
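For orientation, here is a sketch of the classical Jarque-Bera type construction from Mardia's (1970) multivariate skewness and kurtosis; this is the standard large-sample version, not the Srivastava-based MJB1/MJB2/mMJB statistics or their high-dimensional corrections studied in the paper.

    # Classical Mardia-based Jarque-Bera type omnibus statistic with its
    # asymptotic chi-square p-value (large n, fixed p).
    import numpy as np
    from scipy.stats import chi2

    def mardia_jb(X):
        n, p = X.shape
        Xc = X - X.mean(0)
        S_inv = np.linalg.inv(np.cov(X, rowvar=False, bias=True))
        D = Xc @ S_inv @ Xc.T                         # generalized inner products
        b1 = (D ** 3).sum() / n**2                    # multivariate skewness
        b2 = (np.diag(D) ** 2).sum() / n              # multivariate kurtosis
        jb = n * b1 / 6 + (b2 - p * (p + 2)) ** 2 / (8 * p * (p + 2) / n)
        df = p * (p + 1) * (p + 2) / 6 + 1
        return jb, chi2.sf(jb, df)                    # statistic and asymptotic p-value

    rng = np.random.default_rng(2)
    print(mardia_jb(rng.standard_normal((500, 5))))   # Gaussian data: large p-value expected
    print(mardia_jb(rng.exponential(size=(500, 5))))  # skewed data: tiny p-value expected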


Soft Methods in Probability and Statistics | 2013

Bayesian Block-Diagonal Predictive Classifier for Gaussian Data

Jukka Corander; Timo Koski; Tatjana Pavlenko; Annika Tillander

The paper presents a method for constructing a Bayesian predictive classifier in a high-dimensional setting. Given that classes are represented by Gaussian distributions with a block-structured covariance matrix, a closed-form expression for the posterior predictive distribution of the data is established. Due to the factorization of this distribution, the resulting Bayesian predictive and marginal classifier provides an efficient solution to the high-dimensional problem by splitting it into smaller tractable problems. In a simulation study we show that the suggested classifier outperforms several alternative algorithms, such as linear discriminant analysis based on block-wise inverse covariance estimators and the shrunken centroids regularized discriminant analysis.
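A minimal sketch of a block-factorized Gaussian predictive classifier, assuming a known block partition and a conjugate Normal-inverse-Wishart prior per block; the hyperparameters are illustrative and the construction is not claimed to match the paper's.

    # Each block contributes a closed-form Student-t posterior predictive term;
    # the class score is the sum of per-block log predictive densities.
    import numpy as np
    from scipy.stats import multivariate_t

    def block_log_predictive(Xtrain_b, x_b):
        """Log posterior predictive of one block of a new point, weak NIW prior."""
        n, d = Xtrain_b.shape
        mu0, k0, nu0, Psi0 = np.zeros(d), 1e-3, d + 2, np.eye(d)   # illustrative prior
        xbar = Xtrain_b.mean(0)
        S = (Xtrain_b - xbar).T @ (Xtrain_b - xbar)
        kn, nun = k0 + n, nu0 + n
        mun = (k0 * mu0 + n * xbar) / kn
        Psin = Psi0 + S + (k0 * n / kn) * np.outer(xbar - mu0, xbar - mu0)
        df = nun - d + 1
        shape = Psin * (kn + 1) / (kn * df)
        return multivariate_t(loc=mun, shape=shape, df=df).logpdf(x_b)

    def classify(x, class_data, blocks):
        """Assign x to the class with the largest sum of per-block log predictives."""
        scores = [sum(block_log_predictive(Xc[:, b], x[b]) for b in blocks)
                  for Xc in class_data]
        return int(np.argmax(scores))

    # usage: two classes, dimension 6 split into two blocks of size 3
    rng = np.random.default_rng(3)
    blocks = [slice(0, 3), slice(3, 6)]
    X0 = rng.standard_normal((40, 6))
    X1 = rng.standard_normal((40, 6)) + 1.0
    print(classify(np.ones(6), [X0, X1], blocks))   # expected: 1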


Journal of Multivariate Analysis | 2015

Asymptotic properties of the misclassification rates for Euclidean Distance Discriminant rule in high-dimensional data

Hiroki Watanabe; Masashi Hyodo; Takashi Seo; Tatjana Pavlenko

Performance accuracy of the Euclidean Distance Discriminant rule (EDDR) is studied in a high-dimensional asymptotic framework which allows the dimensionality to exceed the sample size. Under mild assumptions on the traces of the covariance matrix, our new results provide the asymptotic distribution of the conditional misclassification error and an explicit expression for the consistent and asymptotically unbiased estimator of the expected misclassification error. To obtain these properties, new results on the asymptotic normality of quadratic forms and traces of higher powers of the Wishart matrix are established. Using our asymptotic results, we further develop two generic methods of determining a cut-off point for EDDR to adjust the misclassification errors. Finally, we numerically justify the high accuracy of our asymptotic findings, along with the cut-off determination methods, in finite sample applications, including both large-sample and high-dimensional scenarios.
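The basic rule itself is simple to state in code (a minimal sketch; the paper's asymptotic analysis and cut-off adjustment are not reproduced here):

    # Euclidean Distance Discriminant rule: assign a new observation to the class
    # whose training mean is nearest in Euclidean distance; works even when p > n.
    import numpy as np

    def eddr(x, means):
        """Return the index of the nearest class mean."""
        d2 = [np.sum((x - m) ** 2) for m in means]
        return int(np.argmin(d2))

    rng = np.random.default_rng(4)
    p = 500                                        # dimension exceeds the sample size
    X0 = rng.standard_normal((20, p))
    X1 = rng.standard_normal((20, p)); X1[:, :50] += 0.5
    means = [X0.mean(0), X1.mean(0)]
    test = rng.standard_normal(p); test[:50] += 0.5
    print(eddr(test, means))                       # expected: 1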


The Open Pediatric Medicine Journal | 2011

Clinical course of steroid sensitive nephrotic syndrome in children: outcome and outlook

Svitlana Fomina; Tatjana Pavlenko; Erling Englund; Ingretta Bagdasarova

Introduction: The aim of our study was to investigate the relative efficiency and adverse effects of various treatments of steroid-sensitive nephrotic syndrome (SSNS) in children, and to determine factors associated with relapse risk in these patients. Materials and Methods: We retrospectively studied data from 690 children with SSNS treated in a referral center over 25 years. The analyzed treatment protocols were: Prednisolone (PRED; eight weeks at a dose of 1.5-2.0 mg/kg, then tapered and given for 9-12 months), Chlorambucil (CHL; cumulative dose 28.5-30 mg/kg), Cyclophosphamide intravenously (CYC I.V.; cumulative dose of 30-36 mg/kg, followed by a supporting dose of CHL, cumulative dose of 20-25 mg/kg) and intramuscularly (CYC I.M.; cumulative dose of 120-150 mg/kg). The alkylating agents were used after remission induction by PRED and under its protection. Results: Cumulative relapse-free survival was 81.9%, 69.0% and 64.5% after 12, 36 and 60 months, respectively. In multivariate analyses, relapse risk was associated with age at treatment (<6 years) and with both PRED and CYC I.V. The only predictive factor for early relapse was PRED, whereas in the group with two or more relapses PRED and CYC I.V., as well as age from 3 to 6 years, were highly prognostic. A high probability of sustained remission combined with relatively mild adverse effects was observed for PRED used at the first episode and CHL used at relapse. Conclusion: Our protocols, characterized by prolonged PRED and CHL, demonstrated promising results and should be considered an efficient alternative strategy in SSNS management.


Soft Methods in Probability and Statistics | 2006

Scoring Feature Subsets for Separation Power in Supervised Bayes Classification

Tatjana Pavlenko; Hakan Fridén

We present a method for evaluating the discriminative power of compact feature combinations (blocks) using a distance-based scoring measure, yielding an algorithm for selecting feature blocks that ...
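As a rough illustration of a distance-based block score (an assumption on our part, since the abstract is truncated: the score below is the squared Mahalanobis distance between class means within a block, one natural choice but not necessarily the paper's measure):

    # Score each feature block by the within-block separation of the two class means;
    # larger scores indicate blocks with more discriminative power.
    import numpy as np

    def block_score(X0, X1, block):
        """(m1 - m0)' S^{-1} (m1 - m0) computed within one feature block."""
        A, B = X0[:, block], X1[:, block]
        d = A.mean(0) - B.mean(0)
        S = (np.cov(A, rowvar=False) + np.cov(B, rowvar=False)) / 2   # pooled within-class covariance
        return float(d @ np.linalg.solve(S, d))

    rng = np.random.default_rng(5)
    X0 = rng.standard_normal((60, 10))
    X1 = rng.standard_normal((60, 10)); X1[:, :2] += 1.0              # only the first block is informative
    blocks = [slice(0, 2), slice(2, 4), slice(4, 6)]
    print([round(block_score(X0, X1, b), 2) for b in blocks])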


Soft Methods in Probability and Statistics | 2010

Exploiting Sparse Dependence Structure in Model Based Classification

Tatjana Pavlenko; Anders Björkström

Sparsity patterns discovered in the data dependence structure were used to reduce the dimensionality and improve the performance accuracy of a model-based classifier in a high-dimensional framework.


Clinical Physiology and Functional Imaging | 2010

Lung aeration during sleep in patients with obstructive sleep apnoea

Jonas Appelberg; Christer Janson; Eva Lindberg; Tatjana Pavlenko; Göran Hedenstierna

Background: Previous studies have indicated that patients with obstructive sleep apnoea (OSA) have altered ventilation and lung volumes while awake, and the results suggest that this may be a determinant of the severity of desaturations during sleep. However, little is known about regional lung aeration during sleep in patients with OSA.

Collaboration


Dive into Tatjana Pavlenko's collaborations.

Top Co-Authors

Dietrich von Rosen, Swedish University of Agricultural Sciences
Masashi Hyodo, Osaka Prefecture University
Takashi Seo, Osaka Prefecture University
Zhanna Andrushchenko, Swedish University of Agricultural Sciences