Journal of Statistical Planning and Inference | 2021

A brief review of linear sufficient dimension reduction through optimization


Abstract


In this paper, we review three families of methods in linear sufficient dimension reduction through optimization. Through minimization of general loss functions, we cast classical methods, such as ordinary least squares and sliced inverse regression, and modern methods, such as principal support vector machines and principal quantile regression, under a unified framework. Then we review sufficient dimension reduction methods through maximizing dependence measures, which include the distance covariance, the Hilbert–Schmidt independence criterion, the martingale difference divergence, and the expected conditional difference. Last but not least, we provide an information-theoretic perspective for the third family of sufficient dimension reduction methods.
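To make the first family concrete, the following is a minimal sketch of sliced inverse regression (one of the classical methods the abstract names), not the paper's unified-loss formulation: the response is sliced, the standardized predictors are averaged within each slice, and the leading eigenvectors of the between-slice covariance estimate the reduction directions. The function name, slice count, and toy data are illustrative assumptions, not drawn from the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Illustrative sliced inverse regression (SIR) estimator:
    slice y, average the standardized predictors within each slice,
    and take top eigenvectors of the between-slice covariance."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(Sigma)          # Sigma assumed positive definite
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Xc @ inv_sqrt
    # Partition observations into slices by the order of y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted between-slice covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    _, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Toy check: y depends on X only through its first coordinate,
# so the estimated direction should align with e_1.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X[:, 0] + 0.1 * rng.normal(size=500)
b = sir_directions(X, y, n_slices=10)
```

Under these assumptions the column of `b` is a unit vector whose first entry dominates, recovering the single sufficient direction of the toy model.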

Volume 211
Pages 154-161
DOI 10.1016/j.jspi.2020.06.006
Language English
Journal Journal of Statistical Planning and Inference
