Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Oleg Sysoev is active.

Publication


Featured research published by Oleg Sysoev.


40th Workshop on Large Scale Nonlinear Optimization, Erice, Italy, June 22 – July 1, 2004 | 2006

An O(n²) algorithm for isotonic regression

Oleg Burdakov; Oleg Sysoev; Anders Grimvall; Mohamed Hussian

We consider the problem of minimizing the distance from a given n-dimensional vector to a set defined by constraints of the form x_i ≤ x_j. Such constraints induce a partial order of the components x_i, which can be illustrated by an acyclic directed graph. This problem is also known as the isotonic regression (IR) problem. IR has important applications in statistics, operations research and signal processing, with most of them characterized by a very large value of n. For such large-scale problems, it is of great practical importance to develop algorithms whose complexity does not rise with n too rapidly. The existing optimization-based algorithms and statistical IR algorithms have either too high computational complexity or too low accuracy of the approximation to the optimal solution they generate. We introduce a new IR algorithm, which can be viewed as a generalization of the Pool-Adjacent-Violators (PAV) algorithm from completely to partially ordered data. Our algorithm combines both low computational complexity O(n²) and high accuracy. This allows us to obtain sufficiently accurate solutions to IR problems with thousands of observations.
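For the completely ordered special case that the paper generalizes, the classical PAV pooling step can be sketched in a few lines of Python. This is an illustrative toy implementation (function name `pav` is hypothetical), not the paper's partial-order algorithm:

```python
def pav(y, w=None):
    """Pool-Adjacent-Violators on a totally ordered sequence.

    Minimizes sum_i w_i * (x_i - y_i)^2 subject to x_1 <= ... <= x_n.
    """
    if w is None:
        w = [1.0] * len(y)
    # Each block stores [weighted mean, total weight, block length].
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Pool adjacent blocks while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / wt, wt, c1 + c2])
    # Expand blocks back to one fitted value per observation.
    fit = []
    for m, _, c in blocks:
        fit.extend([m] * c)
    return fit
```

For the partially ordered data treated in the paper, the components form an acyclic directed graph rather than a chain, and generalizing the pooling step to that setting is what drives the O(n²) analysis.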


Computational Statistics & Data Analysis | 2011

A segmentation-based algorithm for large-scale partially ordered monotonic regression

Oleg Sysoev; Oleg Burdakov; Anders Grimvall

Monotonic regression (MR) is an efficient tool for estimating functions that are monotonic with respect to input variables. A fast and highly accurate approximate algorithm called the GPAV was recently developed for efficiently solving large-scale multivariate MR problems. When such problems are too large, the GPAV becomes too demanding in terms of computational time and memory. An approach that extends the application area of the GPAV to encompass much larger MR problems is presented. It is based on segmentation of a large-scale MR problem into a set of moderate-scale MR problems, each solved by the GPAV. The major contribution is the development of a computationally efficient strategy that produces a monotonic response using the local solutions. A theoretically motivated trend-following technique is introduced to ensure higher accuracy of the solution. The presented results of extensive simulations on very large data sets demonstrate the high efficiency of the new algorithm.
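The segmentation idea can be mimicked on a totally ordered toy problem: solve each segment locally, then repair monotonicity across segment borders. The sketch below uses a basic PAV solver per segment and a crude running maximum as a stand-in for the paper's trend-following merge strategy; all names (`pav`, `segmented_fit`) are illustrative, and this is not the GPAV-based algorithm itself:

```python
def pav(y):
    """Least-squares isotonic fit on a totally ordered sequence."""
    blocks = []  # each block: [mean, weight, count]
    for v in y:
        blocks.append([v, 1.0, 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            w = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / w, w, c1 + c2])
    out = []
    for m, _, c in blocks:
        out += [m] * c
    return out

def segmented_fit(y, seg_len):
    """Fit each segment independently, then enforce global monotonicity.

    The running maximum is a crude repair step: it guarantees a monotone
    response built from the local solutions, at some cost in accuracy.
    """
    fit = []
    for start in range(0, len(y), seg_len):
        fit += pav(y[start:start + seg_len])
    out, cur = [], float("-inf")
    for v in fit:
        cur = max(cur, v)
        out.append(cur)
    return out
```

Because each segment is only of moderate size, the per-segment solves stay cheap; the interesting part, which the paper addresses properly, is combining the local fits without losing accuracy near segment borders.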


Knowledge and Information Systems | 2018

A Smoothed Monotonic Regression via L2 Regularization

Oleg Sysoev; Oleg Burdakov

Monotonic regression is a standard method for extracting a monotone function from non-monotonic data, and it is used in many applications. However, a known drawback of this method is that its fitted response is a piecewise constant function, while practical response functions are often required to be continuous. The method proposed in this paper achieves monotonicity and smoothness of the regression by introducing an L2 regularization term. In order to achieve a low computational complexity and at the same time to provide a high predictive power of the method, we introduce a probabilistically motivated approach for selecting the regularization parameters. In addition, we present a technique for correcting inconsistencies on the boundary. We show that the complexity of the proposed method is O(n²). Our simulations demonstrate that when the data are large and the expected response is a complicated function (which is typical in machine learning applications) or when there is a change point in the response, the proposed method has a higher predictive power than many of the existing methods.


Communications in Statistics - Simulation and Computation | 2016

Bootstrap confidence intervals for large-scale multivariate monotonic regression problems

Oleg Sysoev; Anders Grimvall; Oleg Burdakov

Recently, the methods used to estimate monotonic regression (MR) models have been substantially improved, and some algorithms can now produce high-accuracy monotonic fits to multivariate datasets containing over a million observations. Nevertheless, the computational burden can be prohibitively large for resampling techniques in which numerous datasets are processed independently of each other. Here, we present efficient algorithms for estimation of confidence limits in large-scale settings that take into account the similarity of the bootstrap or jackknifed datasets to which MR models are fitted. In addition, we introduce modifications that substantially improve the accuracy of MR solutions for binary response variables. The performance of our algorithms is illustrated using data on death in coronary heart disease for a large population. This example also illustrates that MR can be a valuable complement to logistic regression.


Journal of Computational Mathematics | 2006

Data preordering in generalized PAV algorithm for monotonic regression

Oleg Burdakov; Anders Grimvall; Oleg Sysoev


The 4th European Congress of Computational Methods in Applied Science and Engineering "ECCOMAS 2004" | 2006

An O(n²) algorithm for isotonic regression problems

Oleg Burdakov; Oleg Sysoev; Anders Grimvall; Mohamed Hussian


Journal of Mathematical Psychology | 2016

A statistical test of the equality of latent orders

Michael L. Kalish; John C. Dunn; Oleg Burdakov; Oleg Sysoev


Match | 2005

Monotonic regression for the detection of temporal trends in environmental quality data

Mohamed Hussian; Anders Grimvall; Oleg Burdakov; Oleg Sysoev


Journal of Statistical Computation and Simulation | 2013

Bootstrap estimation of the variance of the error term in monotonic regression models

Oleg Sysoev; Anders Grimvall; Oleg Burdakov

The variance of the error term in ordinary regression models and linear smoothers is usually estimated by adjusting the average squared residual for the trace of the smoothing matrix (the degrees of freedom of the predicted response). However, other types of variance estimators are needed when using monotonic regression (MR) models, which are particularly suitable for estimating response functions with pronounced thresholds. Here, we propose a simple bootstrap estimator to compensate for the over-fitting that occurs when MR models are estimated from empirical data. Furthermore, we show that, in the case of one or two predictors, the performance of this estimator can be enhanced by introducing adjustment factors that take into account the slope of the response function and characteristics of the distribution of the explanatory variables. Extensive simulations show that our estimators perform satisfactorily for a great variety of monotonic functions and error distributions.
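A toy version of such a bootstrap correction for the variance of the error term can be sketched as follows. This is a self-contained illustration in the spirit of the abstract, not the paper's estimator: it fits a basic PAV model on a total order, then resamples residuals to estimate how much the monotone fit shrinks the apparent residual variance (all names, `pav` and `bootstrap_sigma2`, are hypothetical):

```python
import random

def pav(y):
    """Least-squares isotonic fit on a totally ordered sequence."""
    blocks = []  # each block: [mean, weight, count]
    for v in y:
        blocks.append([v, 1.0, 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            w = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / w, w, c1 + c2])
    out = []
    for m, _, c in blocks:
        out += [m] * c
    return out

def bootstrap_sigma2(y, n_boot=200, seed=0):
    """Naive residual variance plus a bootstrap estimate of the
    downward bias caused by over-fitting the monotone model."""
    rng = random.Random(seed)
    fit = pav(y)
    resid = [yi - fi for yi, fi in zip(y, fit)]
    naive = sum(r * r for r in resid) / len(y)
    bias = 0.0
    for _ in range(n_boot):
        # Resample residuals, rebuild a pseudo-response, refit.
        e = [rng.choice(resid) for _ in resid]
        star = [fi + ei for fi, ei in zip(fit, e)]
        sres = [si - sf for si, sf in zip(star, pav(star))]
        # Refitting shrinks the apparent residual variance; the average
        # shrinkage estimates the bias of the naive estimator.
        bias += (sum(v * v for v in e) - sum(v * v for v in sres)) / len(y)
    return naive + bias / n_boot
```

Since the refit minimizes the sum of squared residuals over all monotone vectors, the shrinkage term is always nonnegative, so the corrected estimate is never smaller than the naive one.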


European Conference on Machine Learning | 2009

Generalized PAV algorithm with block refinement for partially ordered monotonic regression

Oleg Burdakov; Anders Grimvall; Oleg Sysoev

Collaboration


Dive into Oleg Sysoev's collaborations.

Top Co-Authors

Michael L. Kalish

University of Louisiana at Lafayette


Ivan Kapyrin

Russian Academy of Sciences


Yuri V. Vassilevski

Moscow Institute of Physics and Technology
