Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Junlong Zhao is active.

Publication


Featured research published by Junlong Zhao.


Annals of Statistics | 2013

High Dimensional Influence Measure

Junlong Zhao; Chenlei Leng; Lexin Li; Hansheng Wang

Influence diagnosis is important since the presence of influential observations could lead to distorted analysis and misleading interpretations. This is particularly so for high dimensional data, as the increased dimensionality and complexity may amplify both the chance of an observation being influential and its potential impact on the analysis. In this article, we propose a novel high dimensional influence measure for regressions with the number of predictors far exceeding the sample size. Our proposal can be viewed as a high dimensional counterpart to the classical Cook's distance. However, whereas Cook's distance quantifies an individual observation's influence on the least squares regression coefficient estimate, our new diagnostic measure captures the influence on the marginal correlations, which in turn exert serious influence on downstream analysis including coefficient estimation, variable selection and screening. Moreover, we establish the asymptotic distribution of the proposed influence measure by letting the predictor dimension go to infinity. Availability of this asymptotic distribution leads to a principled rule for determining the critical value for influential observation detection. Both simulations and real data analysis demonstrate the usefulness of the new influence diagnosis measure.
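
As a rough illustration of the idea only (the paper's exact statistic, scaling, and critical value come from the asymptotic theory), one can score each observation by how much its deletion shifts the estimated marginal correlations; the sketch below is a hypothetical simplification:

```python
import numpy as np

def him_style_score(X, y):
    """Toy influence score: mean squared change in the marginal
    correlations when observation i is deleted (cf. HIM)."""
    n, p = X.shape

    def marg_corr(Xs, ys):
        Xc = (Xs - Xs.mean(0)) / Xs.std(0)
        yc = (ys - ys.mean()) / ys.std()
        return Xc.T @ yc / len(ys)

    rho_full = marg_corr(X, y)
    scores = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        scores[i] = np.mean((rho_full - marg_corr(X[keep], y[keep])) ** 2)
    return scores

# p >> n with one contaminated observation
rng = np.random.default_rng(0)
n, p = 50, 500
X = rng.standard_normal((n, p))
y = X[:, 0] + rng.standard_normal(n)
X[0] += 10.0                              # corrupt observation 0
print(np.argmax(him_style_score(X, y)))   # typically 0
```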


Computational Statistics & Data Analysis | 2016

Robust shrinkage estimation and selection for functional multiple linear model through LAD loss

Lele Huang; Junlong Zhao; Huiwen Wang; Siyang Wang

In functional data analysis (FDA), variable selection in regression models is an important issue when there are multiple functional predictors. Most existing methods are based on least squares loss and are consequently sensitive to outliers in the error; a robust variable selection procedure is therefore desirable. When functional predictors are considered, both non-data-driven bases (e.g. B-splines) and data-driven bases (e.g. functional principal components (FPC)) are commonly used. A data-driven basis is flexible and adaptive, but it raises some difficulties, since the basis must be estimated from the data. Since least absolute deviation (LAD) loss has been proven robust to outliers in the error, we propose in this paper a robust variable selection method based on the data-driven FPC basis and the LAD loss function. Asymptotic results are established for both fixed and diverging p, and include the existing results as special cases. Simulation results and a real data example confirm the effectiveness of the proposed method.
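
A minimal sketch of the two ingredients, using plain PCA on densely observed curves as a stand-in for FPCA and median regression (LAD loss) via statsmodels; the paper's actual method adds a shrinkage penalty on groups of FPC scores for selection, which is omitted here:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, m = 200, 100                        # n curves observed on an m-point grid
t = np.linspace(0, 1, m)
basis = np.vstack([np.sin(np.pi * t), np.cos(np.pi * t), t])
curves = rng.standard_normal((n, 3)) @ basis   # toy functional predictor X(t)

# Stage 1: data-driven basis via PCA of the centered curves ("FPCA")
U, s, _ = np.linalg.svd(curves - curves.mean(0), full_matrices=False)
K = 3
fpc_scores = U[:, :K] * s[:K]          # first K FPC scores

# Stage 2: LAD fit of y on the scores; median regression stays
# stable under the heavy-tailed (Cauchy) errors used here
y = fpc_scores[:, 0] + rng.standard_t(df=1, size=n)
fit = sm.QuantReg(y, sm.add_constant(fpc_scores)).fit(q=0.5)
print(fit.params)
```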


Communications in Statistics-theory and Methods | 2007

Extending SAVE and PHD

Junlong Zhao; Xingzhong Xu; Jianjun Ma

SAVE and PHD are effective methods for dimension reduction problems. Both methods are based on two assumptions: the linearity condition and the constant covariance condition. But when the constant covariance condition fails, even if the linearity condition holds, SAVE and PHD often pick directions that lie outside the central subspace (CS) or central mean subspace (CMS). In this article, we generalize SAVE and PHD under weaker conditions. This generalization makes it possible to obtain correct estimates of the central subspace (CS) and central mean subspace (CMS).
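
For reference, a bare-bones version of classical SAVE on standardized predictors; the article's generalization modifies this construction when the constant covariance condition fails, which the sketch below does not attempt:

```python
import numpy as np

def save_directions(X, y, n_slices=5, d=1):
    """Classical SAVE: top eigenvectors of sum_h f_h (I - Cov(Z | slice h))^2,
    computed on the standardized predictors Z and mapped back."""
    n, p = X.shape
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    Sinv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - X.mean(0)) @ Sinv_half
    edges = np.quantile(y, np.linspace(0, 1, n_slices + 1))
    idx = np.digitize(y, edges[1:-1])            # slice label per observation
    M = np.zeros((p, p))
    for h in range(n_slices):
        Zh = Z[idx == h]
        if len(Zh) < 2:
            continue
        A = np.eye(p) - np.cov(Zh, rowvar=False)
        M += (len(Zh) / n) * A @ A
    _, V = np.linalg.eigh(M)
    return Sinv_half @ V[:, -d:]                 # directions, original scale

rng = np.random.default_rng(2)
X = rng.standard_normal((500, 5))
y = X[:, 0] ** 2 + 0.1 * rng.standard_normal(500)  # symmetric link suits SAVE
print(save_directions(X, y).ravel())               # roughly proportional to e_1
```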


International Conference on Intelligent Computing | 2017

Adaptive Kendall’s τ Correlation in Bipartite Network for Recommendation

Xihan Shan; Junlong Zhao

The algorithms commonly used in recommender systems tend to recommend popular items. The recently proposed algorithm denoted G-CosRA shows good performance in handling this problem, with two parameters controlling the popularity of items and the activeness of users. In this paper, we refine this algorithm and propose a new recommendation algorithm based on adaptive Kendall's τ correlation, where only one tuning parameter is involved. The proposal performs better in accuracy, popularity and diversity than G-CosRA and other existing algorithms. A parameter-free version, named weighted Kendall, is also proposed for better computational efficiency.
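
G-CosRA itself is not reproduced here; as an illustrative stand-in for the Kendall's τ ingredient, the hypothetical sketch below scores a user's unseen items by τ-weighted votes from similar users in the user-item rating matrix:

```python
import numpy as np
from scipy.stats import kendalltau

def recommend(R, user, top_k=2):
    """Toy recommender: weight other users' ratings of unseen items
    by Kendall's tau computed on co-rated items (illustrative only)."""
    n_users, n_items = R.shape
    scores = np.zeros(n_items)
    for v in range(n_users):
        if v == user:
            continue
        both = (R[user] > 0) & (R[v] > 0)        # co-rated items
        if both.sum() < 2:
            continue
        tau, _ = kendalltau(R[user, both], R[v, both])
        if np.isnan(tau) or tau <= 0:
            continue
        unseen = (R[user] == 0) & (R[v] > 0)
        scores[unseen] += tau * R[v, unseen]
    return np.argsort(scores)[::-1][:top_k]

R = np.array([[5, 3, 0, 1, 0],     # 0 = unrated
              [4, 2, 4, 1, 0],
              [1, 1, 0, 5, 4],
              [5, 4, 5, 0, 0]])
print(recommend(R, user=0))
```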


Computational Statistics & Data Analysis | 2017

Inference for biased transformation models

Xuehu Zhu; Tao Wang; Junlong Zhao; Lixing Zhu

Working regression models are often parsimonious for practical use, but they may be biased: either some strong signals to the response are not included in the working model, or too many weak signals are excluded at the modeling stage, and the bias accumulates. Consistently estimating the parameters of interest in biased working models is then a challenge. This paper investigates the estimation problem for linear transformation models with three aims. First, to identify strong signals in the original full models, a sufficient dimension reduction approach is applied to transfer linear transformation models to pro forma linear models; this efficiently avoids high-dimensional nonparametric estimation of the unknown model transformation. Second, after identifying the strong signals, a semiparametric re-modeling with some artificially constructed predictors is performed to correct the model bias. The construction procedure is introduced, and a ridge ratio estimation is proposed to determine the number of these predictors. Third, root-n consistent estimators of the parameters in the working models are defined and their asymptotic normality is proved. The performance of the new method is illustrated through simulation studies and a real data analysis.
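
Of the three steps, the ridge-ratio rule translates most directly to code: adding a small ridge to consecutive eigenvalue ratios keeps near-zero eigenvalues from producing spurious minima, so the argmin identifies the number of constructed predictors. A generic sketch of this idea, not the paper's exact criterion or ridge choice:

```python
import numpy as np

def ridge_ratio_rank(eigvals, c=None):
    """Rank estimate: argmin over i of (lam[i+1] + c) / (lam[i] + c)
    for descending eigenvalues lam; c > 0 is the stabilizing ridge."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    if c is None:
        c = lam[0] / len(lam)       # ad hoc default, an assumption here
    ratios = (lam[1:] + c) / (lam[:-1] + c)
    return int(np.argmin(ratios)) + 1

lam = [9.1, 4.2, 0.03, 0.02, 0.01]  # two strong directions, three noise ones
print(ridge_ratio_rank(lam))        # 2
```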


Computational Statistics & Data Analysis | 2017

Trace regression model with simultaneously low rank and row(column) sparse parameter

Junlong Zhao; Lu Niu; Shushi Zhan

In this paper, we consider the trace regression model with matrix covariates, where the parameter matrix is simultaneously low rank and row (column) sparse. To estimate the parameter, we formulate a convex optimization problem with nuclear norm and group Lasso penalties, and propose an alternating direction method of multipliers (ADMM) algorithm. The asymptotic properties of the estimator are established. Simulation results confirm the effectiveness of our method.
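
The two penalties correspond to standard proximal operators that an ADMM iteration can alternate between: singular-value soft-thresholding for the nuclear norm, and row-wise group soft-thresholding for the group Lasso. Sketches of both are below; the full algorithm, splitting, and step sizes follow the paper and are not reproduced:

```python
import numpy as np

def prox_nuclear(B, tau):
    """Prox of tau * ||B||_* : soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def prox_row_group(B, tau):
    """Prox of tau * sum_i ||B[i, :]||_2 : shrink whole rows toward zero."""
    norms = np.linalg.norm(B, axis=1, keepdims=True)
    return B * np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)

B = np.outer([3.0, 0.1, 2.0], [1.0, -1.0, 0.5])   # rank one, one weak row
print(np.round(prox_row_group(prox_nuclear(B, 0.5), 0.5), 3))
```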


Communications in Statistics-theory and Methods | 2017

Dimension reduction boosting

Junlong Zhao; Xiuli Zhao

L2Boosting is an effective method for model construction. For the high-dimensional setting, Bühlmann and Yu (2003) proposed componentwise L2Boosting, but it can fit only a special, limited class of models. In this paper, by combining boosting with a sufficient dimension reduction method, e.g. sliced inverse regression (SIR), we propose a new regression method called dimension reduction boosting (DRBoosting). Compared with L2Boosting, DRBoosting is computationally less intensive and predicts better, especially for high-dimensional data. Simulations confirm the advantages of the new method.
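
For context, componentwise L2Boosting fits the current residuals with the single best covariate at each step and takes a small step toward it; DRBoosting replaces these one-variable base learners with directions found by sufficient dimension reduction. A minimal sketch of the componentwise baseline:

```python
import numpy as np

def componentwise_l2boost(X, y, steps=200, nu=0.1):
    """Componentwise L2Boosting: greedy coordinate-wise residual
    fitting with shrinkage factor nu."""
    n, p = X.shape
    beta, resid = np.zeros(p), y - y.mean()
    for _ in range(steps):
        slopes = X.T @ resid / (X ** 2).sum(0)        # per-covariate LS slope
        sse = ((resid[:, None] - X * slopes) ** 2).sum(0)
        j = np.argmin(sse)                            # best single covariate
        beta[j] += nu * slopes[j]
        resid -= nu * slopes[j] * X[:, j]
    return y.mean(), beta

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 50))
y = 2 * X[:, 0] - X[:, 3] + 0.5 * rng.standard_normal(100)
_, b = componentwise_l2boost(X, y)
print(np.round(b[:5], 2))    # weight concentrates on columns 0 and 3
```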


International Conference on Intelligent Computing | 2015

Detecting Multiple Influential Observations in High Dimensional Linear Regression

Junlong Zhao; Ying Zhang; Lu Niu

In this paper, we consider the detection of multiple influential observations in high dimensional regression, where the number of covariates p is much larger than the sample size n. Detecting influential observations in high dimensional regression is challenging. For the case of a single influential observation, Zhao et al. (2013) developed a method called the High dimensional Influence Measure (HIM). However, HIM does not apply to the case of multiple influential observations, where detection is much more complicated than in the single-observation case. We propose in this paper a new method based on multiple deletion to detect multiple influential observations.
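
A toy version of the multiple-deletion idea, reusing the marginal-correlation score sketched after the HIM abstract above but deleting candidate subsets jointly; exhaustive search over subsets is combinatorial, which is precisely why a practical procedure such as the paper's must be more clever than this sketch:

```python
import numpy as np
from itertools import combinations

def group_deletion_score(X, y, group_size=2):
    """Score every size-`group_size` subset by the shift in marginal
    correlations when the whole subset is deleted (toy, exhaustive)."""
    n = len(y)

    def marg_corr(Xs, ys):
        Xc = (Xs - Xs.mean(0)) / Xs.std(0)
        yc = (ys - ys.mean()) / ys.std()
        return Xc.T @ yc / len(ys)

    rho_full = marg_corr(X, y)
    best, best_score = None, -np.inf
    for subset in combinations(range(n), group_size):
        keep = np.ones(n, dtype=bool)
        keep[list(subset)] = False
        score = np.mean((rho_full - marg_corr(X[keep], y[keep])) ** 2)
        if score > best_score:
            best, best_score = subset, score
    return best

rng = np.random.default_rng(4)
n, p = 30, 200
X = rng.standard_normal((n, p))
y = X[:, 0] + rng.standard_normal(n)
X[[0, 1]] += 8.0                    # two jointly influential observations
print(group_deletion_score(X, y))   # typically (0, 1)
```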


The Scientific World Journal | 2013

Testing normal means: the reconcilability of the P value and the Bayesian evidence.

Yuliang Yin; Junlong Zhao

The problem of reconciling the frequentist and Bayesian evidence in testing statistical hypotheses has been extensively studied in the literature. Most of the existing work considers cases without nuisance parameters, which is not the frequently encountered situation, since the presence of nuisance parameters is very common in practice. In this paper, we consider the reconcilability of the Bayesian evidence against the null hypothesis H0, in terms of the posterior probability of H0 being true, and the frequentist evidence against H0, in terms of the P value, in testing normal means where nuisance parameters are present. The reconcilability of evidence is obtained both for testing a normal mean and for the Behrens-Fisher problem.
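
A worked example of the two kinds of evidence in the simplest setting without nuisance parameters (a Berger-Sellke style prior: point mass pi0 on H0: mu = 0 plus a N(0, tau^2) spread under the alternative; the paper's setting with nuisance parameters is harder):

```python
import numpy as np
from scipy.stats import norm

def evidence(xbar, n, sigma=1.0, tau=1.0, pi0=0.5):
    """Two-sided P value vs posterior P(H0 | data) for H0: mu = 0,
    prior pi0 * delta_0 + (1 - pi0) * N(0, tau^2); sigma known."""
    se = sigma / np.sqrt(n)
    pval = 2 * norm.sf(abs(xbar) / se)
    m0 = norm.pdf(xbar, 0.0, se)                        # marginal under H0
    m1 = norm.pdf(xbar, 0.0, np.sqrt(tau**2 + se**2))   # marginal under H1
    return pval, pi0 * m0 / (pi0 * m0 + (1 - pi0) * m1)

for z in (1.96, 2.58):
    print(evidence(z / np.sqrt(100), n=100))
# the posterior P(H0) stays far above the P value: the classic discrepancy
```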


International Conference on Intelligent Computing | 2012

Modeling by Combining Dimension Reduction and L2Boosting

Junlong Zhao

Dimension reduction techniques are widely used in high dimensional modeling. The two-stage approach, first performing dimension reduction and then applying an existing regression or classification method, is common in practice. An important issue, however, is when the two-stage approach leads to a consistent estimate. In this paper, we focus on L2Boosting and discuss the consistency of the two-stage method of dimension reduction based L2Boosting (briefly, DRL2B). We establish conditions under which the DRL2B method yields a consistent estimate. This theoretical finding provides useful guidelines for practical applications. In addition, we propose an iterative DRL2B approach and carry out a simulation study. Simulation results show that the iterative DRL2B method performs well.
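
A bare-bones sliced inverse regression (SIR) step of the kind such a two-stage method builds on: estimate a few directions from the slice means of the standardized predictors, then hand the reduced predictors X @ B to any second-stage learner, such as the boosting sketch shown earlier. Standard SIR only; not the paper's full DRL2B procedure:

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=2):
    """SIR: principal directions of the between-slice covariance of
    the standardized predictors, mapped back to the original scale."""
    n, p = X.shape
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    Sinv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - X.mean(0)) @ Sinv_half
    M = np.zeros((p, p))
    for chunk in np.array_split(np.argsort(y), n_slices):
        zbar = Z[chunk].mean(0)
        M += (len(chunk) / n) * np.outer(zbar, zbar)
    _, V = np.linalg.eigh(M)
    return Sinv_half @ V[:, -d:]

rng = np.random.default_rng(5)
X = rng.standard_normal((400, 6))
y = X[:, 0] + X[:, 1] + 0.1 * rng.standard_normal(400)
print(sir_directions(X, y, d=1).ravel())   # roughly proportional to (1,1,0,0,0,0)
```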

Collaboration


Dive into Junlong Zhao's collaborations.

Top Co-Authors

Xingzhong Xu, Beijing Institute of Technology
Jianjun Ma, Beijing Institute of Technology
Na Li, Chinese Academy of Sciences
Siyang Wang, Central University of Finance and Economics
Tao Wang, Shanghai Jiao Tong University
Wei Lan, Southwestern University of Finance and Economics