Alexander Samarov
Massachusetts Institute of Technology
Publications
Featured research published by Alexander Samarov.
Journal of Economic Dynamics and Control | 2004
Dimitris Bertsimas; Geoffrey Lauprete; Alexander Samarov
Motivated by second-order stochastic dominance, we introduce a risk measure that we call shortfall. We examine shortfall's properties and discuss its relation to such commonly used risk measures as standard deviation, VaR, lower partial moments, and coherent risk measures. We show that the mean-shortfall optimization problem, unlike mean-VaR, can be solved efficiently as a convex optimization problem, while the sample mean-shortfall portfolio optimization problem can be solved very efficiently as a linear optimization problem. We provide empirical evidence, (a) in asset allocation and (b) in a problem of tracking an index using only a limited number of assets, that the mean-shortfall approach may have advantages over mean-variance.
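As an illustration of the linear-programming structure mentioned in the abstract, the sketch below sets up a sample tail-risk portfolio problem via the standard Rockafellar–Uryasev linearization, which shares the LP structure of the sample mean-shortfall problem (shortfall differs from the conditional tail expectation by a shift involving the mean). The return matrix R, the tail level alpha, and the target mean r_min are illustrative placeholders, not the authors' data or notation.

# Hedged sketch (not the authors' code): sample tail-risk portfolio
# optimization written as a linear program with scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

def mean_shortfall_portfolio(R, alpha=0.05, r_min=0.0):
    """R: (n_scenarios, n_assets) array of sample returns (illustrative)."""
    n, p = R.shape
    mu = R.mean(axis=0)
    # Decision variables: x = [w (p weights), t (1 threshold), u (n tail excesses)]
    # Objective: minimize t + (1/(alpha*n)) * sum(u)
    c = np.concatenate([np.zeros(p), [1.0], np.full(n, 1.0 / (alpha * n))])
    # Tail constraints: u_i >= -R_i.w - t, i.e. -R_i.w - t - u_i <= 0
    A_ub = np.hstack([-R, -np.ones((n, 1)), -np.eye(n)])
    b_ub = np.zeros(n)
    # Mean-return constraint: mu.w >= r_min  ->  -mu.w <= -r_min
    A_ub = np.vstack([A_ub, np.concatenate([-mu, [0.0], np.zeros(n)])])
    b_ub = np.append(b_ub, -r_min)
    # Budget constraint: weights sum to one
    A_eq = np.concatenate([np.ones(p), [0.0], np.zeros(n)]).reshape(1, -1)
    b_eq = np.array([1.0])
    # No short selling (a modeling choice for this sketch); t is free, u >= 0
    bounds = [(0, None)] * p + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:p]

Relaxing the no-short-selling assumption is a one-line change to the bounds; the point of the sketch is only that the sample problem reduces to a linear program.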
Journal of Time Series Analysis | 1997
Liudas Giraitis; Peter Robinson; Alexander Samarov
There exist several estimators of the memory parameter in long-memory time series models with the spectrum specified only locally near zero frequency. In this paper we give an asymptotic lower bound for the minimax risk of any estimator of the memory parameter as a function of the degree of local smoothness of the spectral density at zero. The lower bound allows one to evaluate and compare different estimators by their asymptotic behaviour, and to claim rate optimality for any estimator attaining the bound. A log-periodogram regression estimator, analysed by Robinson (Log-periodogram regression of time series with long range dependence. Ann. Stat. 23 (1995), 1048-72), is then shown to attain the lower bound, and is thus rate optimal.
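For a concrete reference point, here is a minimal sketch of a log-periodogram regression estimate of the memory parameter d in the spirit of the Robinson (1995) estimator discussed above; the bandwidth choice m ≈ √n is a placeholder for illustration, not the tuning analysed in the paper.

# Hedged sketch: under a local spectral model f(lambda) ~ G * lambda^(-2d),
# log I(lambda_j) is roughly linear in -2*log(lambda_j) with slope d, so a
# least-squares regression over the first m Fourier frequencies estimates d.
import numpy as np

def log_periodogram_d(x, m=None):
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                 # crude bandwidth, illustrative only
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / n               # Fourier frequencies lambda_j
    dft = np.fft.fft(x)[1:m + 1]
    I = np.abs(dft) ** 2 / (2 * np.pi * n)  # periodogram ordinates
    y = np.log(I)
    z = -2 * np.log(lam)
    z_c = z - z.mean()
    d_hat = np.sum(z_c * (y - y.mean())) / np.sum(z_c ** 2)
    return d_hat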
Electronic Journal of Statistics | 2010
Umberto Amato; Anestis Antoniadis; Alexander Samarov; Alexandre B. Tsybakov
We consider the problem of multivariate density estimation when the unknown density is assumed to follow a particular form of dimensionality reduction, a noisy independent factor analysis (IFA) model. In this model the data are generated by a number of latent independent components having unknown distributions and are observed in Gaussian noise. We do not assume that either the number of components or the mixing matrix is known. We show that densities of this form can be estimated at a fast rate. Using the mirror averaging aggregation algorithm, we construct a density estimator which achieves a nearly parametric rate (log^{1/4} n)/√n, independent of the dimensionality of the data, as the sample size n tends to infinity. This estimator is adaptive to the number of components, their distributions and the mixing matrix. We then apply this density estimator to construct nonparametric plug-in classifiers and show that they achieve the best obtainable rate of the excess Bayes risk, to within a logarithmic factor independent of the dimension of the data. Applications of this classifier to simulated data sets and to real data from a remote sensing experiment show promising results.
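The mirror averaging step can be sketched generically as exponential weighting of candidate density estimators by their cumulative negative log-likelihood on sequentially revealed observations. The sketch below shows that general scheme only; the temperature beta, the hold-out split, and the candidate list are illustrative choices, not the paper's construction.

# Hedged sketch of mirror-averaging-style aggregation of density estimators.
import numpy as np

def mirror_average(candidate_densities, X_holdout, beta=1.0):
    """candidate_densities: list of callables p_k(x) -> density value.
    Returns averaged aggregation weights for the K candidates."""
    K = len(candidate_densities)
    cum_loss = np.zeros(K)
    weight_sum = np.zeros(K)
    for x in X_holdout:
        # exponential weights based on losses accumulated so far
        w = np.exp(-beta * cum_loss)
        w /= w.sum()
        weight_sum += w                    # average the weights over steps
        # loss of each candidate on the newly revealed point
        cum_loss += np.array([-np.log(max(p(x), 1e-300))
                              for p in candidate_densities])
    return weight_sum / len(X_holdout)

The aggregated density is then the corresponding convex combination of the candidate estimators.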
Archive | 2015
Alexander Samarov
In this paper we consider the problem of multivariate density estimation assuming that the density allows some form of dimensionality reduction. Estimation of high-dimensional densities and dimensionality reduction models are important topics in nonparametric and semi-parametric econometrics. We start with the Independent Component Analysis (ICA) model, which can be considered as a form of dimensionality reduction of a multivariate density. We then consider the multiple index model, describing situations where high-dimensional data have a low-dimensional non-Gaussian component while in all other directions the data are Gaussian, and the independent factor analysis (IFA) model, which generalizes ordinary factor analysis, principal component analysis, and ICA. For each of these models, we review recent results, obtained in our joint work with Tsybakov, Amato, and Antoniadis, on the accuracy of the corresponding density estimators, which combine model selection with estimation. One of the main applications of multivariate density estimators is in classification, where they can be used to construct plug-in classifiers by estimating the densities of each labeled class. We give a bound on the excess risk of nonparametric plug-in classifiers in terms of the MISE of the density estimators of each class. Combining this bound with the above results on the accuracy of density estimation, we show that the rate of the excess Bayes risk of the corresponding plug-in classifiers does not depend on the dimensionality of the data.
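A hedged sketch of the plug-in classification idea described here: estimate each class-conditional density (a generic kernel density estimator stands in below for the dimensionality-reduced estimators of the paper) and assign each point to the class maximizing prior times estimated density.

# Hedged sketch of a nonparametric plug-in classifier; the kernel density
# estimator and bandwidth are illustrative stand-ins, not the paper's method.
import numpy as np
from sklearn.neighbors import KernelDensity

class PlugInClassifier:
    def __init__(self, bandwidth=0.5):
        self.bandwidth = bandwidth

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.kdes_, self.log_priors_ = [], []
        for c in self.classes_:
            Xc = X[y == c]
            # class-conditional density estimate and empirical class prior
            self.kdes_.append(KernelDensity(bandwidth=self.bandwidth).fit(Xc))
            self.log_priors_.append(np.log(len(Xc) / len(X)))
        return self

    def predict(self, X):
        # log prior + log estimated class-conditional density, maximized over classes
        scores = np.column_stack([lp + kde.score_samples(X)
                                  for kde, lp in zip(self.kdes_, self.log_priors_)])
        return self.classes_[np.argmax(scores, axis=1)]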
Journal of Multivariate Analysis | 2000
Liudas Giraitis; Peter Robinson; Alexander Samarov
Bernoulli | 2004
Alexander Samarov; Alexandre B. Tsybakov
Archive | 2005
Alexander Samarov; Alexandre B. Tsybakov
Journal of the American Statistical Association | 1985
Alexander Samarov
LSE Research Online Documents on Economics | 2000
Liudas Giraitis; Peter Robinson; Alexander Samarov