Mingqiu Wang
Qufu Normal University
Publications
Featured research published by Mingqiu Wang.
Communications in Statistics - Simulation and Computation | 2016
Mingqiu Wang; Lixin Song; Xiaoguang Wang
High-dimensional data arise in diverse fields of science, engineering, and the humanities, and variable selection plays an important role in high-dimensional statistical modelling. In this article, we study variable selection by quadratic approximation with the smoothly clipped absolute deviation (SCAD) penalty when the number of parameters diverges. We provide a unified method to select variables and estimate parameters for a variety of high-dimensional models. Under appropriate conditions and with a properly chosen regularization parameter, we show that the estimator is consistent and sparse, and that the estimators of the nonzero coefficients are asymptotically normal, as they would be if the zero coefficients were known in advance. In addition, under some mild conditions, we obtain the global solution of the penalized objective function with the SCAD penalty. Numerical studies and a real data analysis confirm the performance of the proposed method.
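For reference, the SCAD penalty used throughout this line of work is the one introduced by Fan and Li (2001), defined through its derivative, and the penalized criterion studied here has the generic form below; the paper's exact loss function and notation may differ, and a is a fixed constant (commonly a = 3.7).

```latex
% SCAD penalty, defined via its derivative (Fan & Li, 2001), with a > 2:
p_\lambda'(t) = \lambda\left\{ I(t \le \lambda) + \frac{(a\lambda - t)_+}{(a-1)\lambda}\, I(t > \lambda) \right\}, \qquad t > 0 .

% Generic penalized criterion with loss \ell_n and a diverging number p_n of parameters:
Q_n(\beta) = \ell_n(\beta) + n \sum_{j=1}^{p_n} p_\lambda\bigl(|\beta_j|\bigr).
```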
Journal of Computational and Applied Mathematics | 2014
Mingqiu Wang; Xiuli Wang; Xiaoguang Wang
The one-step estimator, which covers various penalty functions, enjoys the oracle property given a good initial estimator. In low-dimensional settings, the initial estimator can be the least squares or maximum likelihood estimator; in ultrahigh dimensions, however, such an estimator is not available. In this paper, we study the one-step estimator whose initial estimator consists of the marginal ordinary least squares estimates in the ultrahigh-dimensional linear model. Under appropriate conditions, we show that the one-step estimator is selection consistent. The finite-sample performance of the proposed procedure is assessed by Monte Carlo simulation studies.
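As a point of reference, a one-step (local linear approximation) estimator built on marginal ordinary least squares initial values, as the abstract describes, can be sketched as follows; the exact construction and scaling used in the paper may differ.

```latex
% Marginal OLS initial estimates, computed coordinate by coordinate:
\tilde{\beta}_j = \frac{x_j^\top y}{x_j^\top x_j}, \qquad j = 1, \dots, p_n .

% One-step estimator: a weighted L1 problem with weights from the penalty derivative p_\lambda':
\hat{\beta} = \arg\min_{\beta}\; \frac{1}{2n}\,\|y - X\beta\|_2^2
  + \sum_{j=1}^{p_n} p_\lambda'\bigl(|\tilde{\beta}_j|\bigr)\,|\beta_j| .
```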
Journal of Applied Statistics | 2014
Guo-Liang Tian; Mingqiu Wang; Lixin Song
In survival studies, current status data are frequently encountered when each individual is observed only once and it is known only whether the event of interest has occurred by that observation time. This paper considers simultaneous variable selection and parameter estimation in the high-dimensional continuous generalized linear model with current status data. We apply the penalized likelihood procedure with the smoothly clipped absolute deviation (SCAD) penalty to select significant variables and estimate the corresponding regression coefficients. With a proper choice of tuning parameters, the resulting estimator is shown to be √(n/p_n)-consistent under some mild conditions. In addition, we show that it has the same asymptotic distribution as the estimator obtained when the true model is known. The finite-sample behavior of the proposed estimator is evaluated through simulation studies and a real example.
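For orientation, with current status data each subject i contributes a monitoring time C_i and an indicator δ_i of whether the event time T_i satisfies T_i ≤ C_i; a penalized log-likelihood of the generic form below captures the type of criterion described, although the paper's exact model specification may differ.

```latex
% Current-status log-likelihood with event-time distribution F(. | x; \beta):
\ell_n(\beta) = \sum_{i=1}^n \Bigl\{ \delta_i \log F(C_i \mid x_i;\beta)
  + (1-\delta_i)\,\log\bigl(1 - F(C_i \mid x_i;\beta)\bigr) \Bigr\} .

% SCAD-penalized estimator:
\hat{\beta} = \arg\max_{\beta}\; \ell_n(\beta) - n \sum_{j=1}^{p_n} p_\lambda\bigl(|\beta_j|\bigr).
```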
Journal of Nonparametric Statistics | 2016
Mingqiu Wang; Guo-Liang Tian
High-dimensional data with a grouped structure among the variables arise in many contemporary statistical modelling problems, and heavy-tailed errors or outliers in the response are often present in such data. We consider robust group selection for partially linear models when the number of covariates can exceed the sample size. A non-convex penalty function is applied to achieve variable selection and estimation in the linear part simultaneously, and polynomial splines are used to estimate the nonparametric component. Under regularity conditions, we show that the robust estimator enjoys the oracle property. Simulation studies demonstrate the performance of the proposed method for samples of moderate size, and the analysis of a real example illustrates that the method works well.
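A schematic criterion for robust group selection in a partially linear model looks as follows, with a robust loss ρ (the abstract does not specify which one), a spline basis B(·) for the nonparametric component, and a group-wise non-convex penalty; this is an illustrative sketch rather than the paper's exact formulation.

```latex
% beta_(j) collects the coefficients of the j-th group of covariates, j = 1, ..., J_n:
\min_{\beta,\;\gamma}\; \sum_{i=1}^n \rho\bigl(y_i - x_i^\top\beta - B(t_i)^\top\gamma\bigr)
  + n \sum_{j=1}^{J_n} p_\lambda\bigl(\|\beta_{(j)}\|_2\bigr).
```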
Communications in Statistics - Theory and Methods | 2015
Mingqiu Wang; Lixin Song; Guo-Liang Tian
When outliers and/or heavy-tailed errors are present in linear models, least absolute deviation (LAD) regression is a robust alternative to ordinary least squares regression. Existing variable-selection methods based on LAD regression either consider only a finite number of predictors or lack the oracle property. In this article, we focus on variable selection via LAD regression with a diverging number of parameters. The rate of convergence of the LAD estimator with the smoothly clipped absolute deviation (SCAD) penalty is established. Furthermore, we demonstrate that, under certain regularity conditions, the penalized estimator with a properly selected tuning parameter enjoys the oracle property. In addition, the rank correlation screening method of Li et al. (2011) is applied to handle ultrahigh-dimensional data. Simulation studies reveal the finite-sample performance of the estimator, and we further illustrate the proposed methodology with a real example.
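The SCAD-penalized LAD criterion referred to here has the generic form shown below; the precise scaling of the penalty term in the paper may differ.

```latex
\hat{\beta} = \arg\min_{\beta}\; \sum_{i=1}^n \bigl|y_i - x_i^\top\beta\bigr|
  + n \sum_{j=1}^{p_n} p_\lambda\bigl(|\beta_j|\bigr).
```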
Communications in Statistics - Theory and Methods | 2011
Mingqiu Wang; Lixin Song; Xiaoguang Wang
This article studies variable selection and parameter estimation in the partially linear model when the number of covariates in the linear part increases to infinity. Using the bridge penalty, we select the important covariates of the linear part, and under regularity conditions we show that the bridge-penalized estimator of the parametric part enjoys the oracle property. We also obtain the convergence rate of the estimator of the nonparametric part. Simulation studies show that the bridge estimator performs as well as the oracle estimator for the partially linear model, and an application is analyzed to illustrate the bridge procedure.
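The bridge penalty is the standard λ Σ |β_j|^γ with 0 < γ < 1, applied to the linear part while the nonparametric component is approximated by a spline basis; a sketch of such a criterion, whose details may differ from the paper, is:

```latex
% B(.) is a spline basis for the nonparametric component, 0 < gamma < 1:
\min_{\beta,\;\theta}\; \sum_{i=1}^n \bigl(y_i - x_i^\top\beta - B(t_i)^\top\theta\bigr)^2
  + \lambda \sum_{j=1}^{p_n} |\beta_j|^{\gamma}.
```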
Journal of Applied Statistics | 2014
Ying Dong; Lixin Song; Mingqiu Wang; Ying Xu
In areas such as economics and biological gene expression studies, a large number of variables are involved, and even when the predictors are independent, the maximum sample correlation can be large once the dimension is high. Variable selection is a fundamental tool for such models. Ridge regression performs well when the predictors are highly correlated, and some nonconcave penalized thresholding estimators enjoy the oracle property. To provide a satisfactory solution to the collinearity problem, in this paper we propose a combined-penalization (CP) estimator that mixes a nonconcave penalty with a ridge penalty, allowing a diverging number of parameters. The CP estimator can correctly select the covariates with nonzero coefficients and estimate the parameters simultaneously in the presence of multicollinearity. Simulation studies and a real data example demonstrate the good performance of the proposed method.
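A combined-penalization criterion of the kind described, pairing a nonconcave penalty (for sparsity) with a ridge term (for stability under collinearity), can be sketched as follows; the paper's exact form of the penalty and its scaling may differ.

```latex
\min_{\beta}\; \tfrac{1}{2}\,\|y - X\beta\|_2^2
  + n \sum_{j=1}^{p_n} p_{\lambda_1}\bigl(|\beta_j|\bigr)
  + \lambda_2\, \|\beta\|_2^2 .
```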
Journal of Inequalities and Applications | 2017
Xiuli Wang; Mingqiu Wang
This paper studies group selection for the partially linear model with a diverging number of parameters. We propose an adaptive group bridge method and study the consistency, convergence rate, and asymptotic distribution of the global adaptive group bridge estimator under regularity conditions. Simulation studies and a real example illustrate the finite-sample performance of our method.
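The group bridge penalty applies a bridge-type power to a group-level norm, and its adaptive version attaches a weight to each group; a sketch of such a criterion for the partially linear model is given below, where the exact weights and group norm used in the paper may differ.

```latex
% w_j are data-driven group weights, beta_(j) the coefficients of group j, 0 < gamma < 1:
\min_{\beta,\;\theta}\; \sum_{i=1}^n \bigl(y_i - x_i^\top\beta - B(t_i)^\top\theta\bigr)^2
  + \lambda \sum_{j=1}^{J_n} w_j\, \|\beta_{(j)}\|_1^{\gamma}.
```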
Journal of Applied Statistics | 2015
Xiuli Wang; Mingqiu Wang
High-dimensional data arise frequently in modern applications such as biology, chemometrics, economics, neuroscience, and other scientific fields. Common features of high-dimensional data are that many of the predictors may not be significant and that the predictors can be highly correlated. Generalized linear models, as a generalization of linear models, also suffer from this collinearity problem. In this paper, combining a nonconvex penalty with ridge regression, we propose the weighted elastic-net for variable selection in high-dimensional generalized linear models and establish the theoretical properties of the proposed method with a diverging number of parameters. The finite-sample behavior of the proposed method is illustrated with simulation studies and a real data example.
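A weighted elastic-net criterion for a generalized linear model with log-likelihood ℓ_n combines a weighted ℓ1 penalty with a ridge term; in the paper the weights are derived from a nonconvex penalty, so the sketch below is only a generic form whose exact construction may differ.

```latex
\hat{\beta} = \arg\min_{\beta}\; -\ell_n(\beta)
  + \lambda_1 \sum_{j=1}^{p_n} w_j\, |\beta_j|
  + \lambda_2 \sum_{j=1}^{p_n} \beta_j^2 .
```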
Statistics & Probability Letters | 2010
Mingqiu Wang; Lixin Song; Xiaoguang Wang