Yu. Yu. Linke
Novosibirsk State University
Publications
Featured research published by Yu. Yu. Linke.
Siberian Mathematical Journal | 2000
Yu. Yu. Linke; A. I. Sakhanenko
with ξ1, ..., ξN a sequence of independent identically distributed random variables satisfying the conditions Eξi = 0, Dξi = 1 (1.2). Moreover, the values ai > 0 and bi > 0 are assumed known, while the value of the parameter θ and the variances DXi ≡ σi² are unknown. The values of the random variables ξ1, ..., ξN are also assumed unknown. In this article, we study the problem of estimating the unknown parameter θ > 0 from the observations X1, ..., XN. This problem is a particular instance of the nonlinear regression problem, which is usually solved by the method of least squares or its modifications. To find an estimator approximately, one often uses linearization methods, the steepest descent method, etc. (see, for instance, [1]), whose implementation requires computers in view of the huge number of iterations. However, it turns out that for a linear-fractional regression problem of the form (1.1) the simple estimator
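As a hedged illustration of such an explicit, non-iterative estimator: assume, purely for the sake of example (this form is not quoted from the paper), that the regression function in (1.1) is the fractional-linear mean EXi = ai θ / (1 + bi θ). Multiplying through by the denominator makes the estimating equation linear in θ, so θ can be written in closed form. The Python sketch below implements this; the function name, the unit weights, and the simulated data are illustrative only.

import numpy as np

def fractional_linear_estimate(x, a, b, c=None):
    # Illustrative closed-form estimator, assuming (for this sketch only)
    # the fractional-linear mean E X_i = a_i * theta / (1 + b_i * theta).
    # Multiplying by the denominator gives the unbiased estimating equation
    #     sum_i c_i * (X_i * (1 + b_i * theta) - a_i * theta) = 0,
    # which is linear in theta, so no iterations are needed.
    x, a, b = np.asarray(x, float), np.asarray(a, float), np.asarray(b, float)
    c = np.ones_like(x) if c is None else np.asarray(c, float)
    return np.sum(c * x) / np.sum(c * (a - b * x))

# Simulated check under the assumed model (true theta = 2):
rng = np.random.default_rng(0)
N = 1000
a = rng.uniform(1.0, 3.0, N)
b = rng.uniform(0.5, 1.5, N)
x = a * 2.0 / (1.0 + b * 2.0) + 0.1 * rng.standard_normal(N)
print(fractional_linear_estimate(x, a, b))  # prints a value close to 2.0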
Siberian Mathematical Journal | 2001
Yu. Yu. Linke; A. I. Sakhanenko
are known numbers. The random variables ξi, i = 1, ..., N, in (1.1) are nonobservable measurement errors. Below we impose some constraints on the limit behavior of the distributions of certain linear combinations of these random variables. In this article we consider the problem of estimating the unknown vector θ with coordinates θj > 0, j = 1, ..., m, from the random variables Z1, ..., ZN. We propose a rather simple method for obtaining asymptotically normal estimators of the unknown parameters in the linear-fractional regression model (1.1)–(1.3). Unlike the method of least squares, which is usually used for solving such nonlinear regression problems, the proposed method does not require iterative procedures, which create difficulties with choosing an initial approximation, with the convergence rate of the process, etc., and which necessitate computers due to the huge number of iterations. The main goal of this article is to describe the method for constructing estimators in its general form, together with a scheme for studying these estimators, and to demonstrate some ideas that can be used in their analysis. A mathematically rigorous and complete justification of the method was given in [1] in the simplest, one-dimensional case of the linear-fractional regression problem. In a forthcoming article, the authors thoroughly study the case
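In the same hedged spirit, assume purely for illustration (this form is not quoted from the paper) that EZi = (a_{i1}θ1 + ... + a_{im}θm) / (1 + b_{i1}θ1 + ... + b_{im}θm). Multiplying through by the denominator and averaging against m sequences of weights c_{ki} gives estimating equations that are linear in θ:
\[
\sum_{i=1}^{N} c_{ki}\Bigl(Z_i\Bigl(1+\sum_{j=1}^{m} b_{ij}\theta_j\Bigr)-\sum_{j=1}^{m} a_{ij}\theta_j\Bigr)=0,
\qquad k=1,\dots,m,
\]
\[
\text{i.e.}\quad A\widehat{\theta}=d,\qquad
A_{kj}=\sum_{i=1}^{N} c_{ki}\bigl(a_{ij}-b_{ij}Z_i\bigr),\qquad
d_k=\sum_{i=1}^{N} c_{ki}Z_i,
\]
so the estimator is obtained by solving a single m × m linear system, with no iterations.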
Siberian Mathematical Journal | 2011
A. I. Sakhanenko; Yu. Yu. Linke
Under consideration is the problem of estimating the linear regression parameter in the case when the variances of observations depend on the unknown parameter of the model, while the coefficients (independent variables) are measured with random errors. We propose a new two-step procedure for constructing estimators which guarantees their consistency, find general necessary and sufficient conditions for the asymptotic normality of these estimators, and discuss the case in which these estimators have the minimal asymptotic variance.
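The sketch below conveys the general flavor of a two-step construction. It is a generic feasible weighted least-squares scheme for the simple model y_i = θ x_i + ε_i with Var(ε_i) = v(x_i, θ); the variance function and all names are illustrative and are not taken from the paper. A preliminary consistent estimate is computed first and then plugged into the variance function to form the weights for the second step.

import numpy as np

def two_step_estimate(y, x, v):
    # Generic two-step scheme (illustration only) for y_i = theta * x_i + eps_i
    # with Var(eps_i) = v(x_i, theta) depending on the unknown parameter.
    y, x = np.asarray(y, float), np.asarray(x, float)
    # Step 1: preliminary unweighted least-squares estimate (consistent,
    # but not efficient under heteroscedasticity).
    theta1 = np.sum(x * y) / np.sum(x * x)
    # Step 2: plug the preliminary estimate into the variance function and
    # re-estimate with the weights 1 / v(x_i, theta1).
    w = 1.0 / v(x, theta1)
    return np.sum(w * x * y) / np.sum(w * x * x)

# Example with the illustrative variance function v(x, t) = 1 + (t * x)**2:
rng = np.random.default_rng(1)
theta, N = 0.5, 2000
x = rng.uniform(1.0, 2.0, N)
y = theta * x + np.sqrt(1.0 + (theta * x) ** 2) * rng.standard_normal(N)
print(two_step_estimate(y, x, lambda x, t: 1.0 + (t * x) ** 2))  # close to 0.5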
Siberian Mathematical Journal | 2001
Yu. Yu. Linke; A. I. Sakhanenko
Under consideration is the problem of estimating the unknown parameters in the Michaelis–Menten equation, which frequently arises in the natural sciences. The authors propose and study explicit, asymptotically normal estimators of the unknown parameters which often have a minimal covariance matrix.
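For reference, the Michaelis–Menten equation expresses a reaction rate v in terms of a substrate concentration s through two parameters, the maximal rate V_max and the Michaelis constant K_m:
\[
v = \frac{V_{\max}\, s}{K_m + s}.
\]
When pairs (s_i, v_i) are observed with noise, estimating (V_max, K_m) is a linear-fractional regression problem of the kind discussed above.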
Siberian Mathematical Journal | 2011
Yu. Yu. Linke
We study two-step statistical estimates that admit representations of a sufficiently general form. Such constructions arise in various statistical models, for instance in regression problems. Under rather weak restrictions we find necessary and sufficient conditions for the normalized difference between a two-step estimate and the unknown parameter to converge weakly to an arbitrary distribution.
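One familiar construction of this kind, given here only as an illustration of the shape such two-step estimates can take (it is not claimed to be the representation studied in the paper), is the one-step Newton-type correction of a preliminary estimate θ*: for an estimating function S_N whose root is the target parameter,
\[
\theta^{**} = \theta^{*} - \bigl[S_N'(\theta^{*})\bigr]^{-1} S_N(\theta^{*}),
\]
and the limit law of the normalized difference θ** − θ is then governed jointly by the behavior of S_N and by that of the first-step estimate θ*.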
Siberian Mathematical Journal | 2011
A. I. Sakhanenko; Yu. Yu. Linke
We consider the linear regression model in the case when the independent variables are measured with errors, while the variances of the main observations depend on an unknown parameter. In the case of normally distributed replicated regressors we propose and study new classes of two-step estimates for the main unknown parameter. We find consistency and asymptotic normality conditions for first-step estimates and an asymptotic normality condition for second-step estimates. We discuss conditions under which these estimates have the minimal asymptotic variance.
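A minimal sketch of why replication of the regressors helps (a generic errors-in-variables effect, not the estimators of the paper; all names and numbers are illustrative): least squares computed from noisy regressors is attenuated toward zero, and averaging replicated measurements of each regressor reduces the measurement-error variance, and hence the attenuation, by the number of replicates.

import numpy as np

rng = np.random.default_rng(2)
theta, N, r = 1.0, 5000, 4
x_true = rng.normal(0.0, 1.0, N)                # normally distributed regressors
y = theta * x_true + 0.3 * rng.standard_normal(N)
x_obs = x_true[:, None] + 0.8 * rng.standard_normal((N, r))   # r replicates each

def ls(x):
    # least squares through the origin
    return np.sum(x * y) / np.sum(x * x)

print(ls(x_obs[:, 0]))         # one noisy copy: about 1 / (1 + 0.8**2) = 0.61
print(ls(x_obs.mean(axis=1)))  # averaged replicates: about 1 / (1 + 0.8**2 / 4) = 0.86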
Siberian Advances in Mathematics | 2014
Yu. Yu. Linke; A. I. Sakhanenko
We study the accuracy of estimation of unknown parameters in the case of two-step statistical estimates admitting special representations. An approach to the study of such problems previously proposed by the authors is extended to the case of the estimation of a multidimensional parameter. As a result, we obtain necessary and sufficient conditions for the weak convergence of the normalized estimation error to a multidimensional normal distribution.
Siberian Advances in Mathematics | 2012
Yu. Yu. Linke; A. I. Sakhanenko
In this article, we consider the problem of finding a solution to a functional equation that depends in a special way on the distribution of a random variable. Such equations naturally arise in the construction of consistent estimates in regression problems when the variances of the main observations depend on the underlying unknown parameter and the regression coefficients are determined with random errors. A simple example of a regression problem in which the equation under consideration occurs is presented.
Siberian Mathematical Journal | 2009
Yu. Yu. Linke; A. I. Sakhanenko
Siberian Mathematical Journal | 2006
A. I. Sakhanenko; Yu. Yu. Linke