Publication


Featured research published by Deguang Kong.


Conference on Information and Knowledge Management | 2011

Robust nonnegative matrix factorization using L21-norm

Deguang Kong; Chris H. Q. Ding; Heng Huang

Nonnegative matrix factorization (NMF) is widely used in data mining and machine learning. However, real-world data often contain noise and outliers, so a robust version of NMF is needed. In this paper, we propose a robust formulation of NMF using an L21-norm loss function, and we derive a computational algorithm with a rigorous convergence analysis. Our robust NMF approach (1) handles noise and outliers; (2) provides efficient and elegant updating rules; and (3) incurs almost the same computational cost as standard NMF, making it practical for real-world applications. Experiments on 10 datasets show that robust NMF provides more faithful basis factors and consistently better clustering results than standard NMF.
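As a sketch of the objective described above (assuming data points are stored as the columns of X, which the abstract does not state), the L21 (i.e., L2,1) norm sums the unsquared L2 errors of the individual data points, so a single outlier cannot dominate the fit:

\min_{F \ge 0,\, G \ge 0} \; \|X - FG\|_{2,1}, \qquad \|A\|_{2,1} = \sum_{j} \sqrt{\sum_{i} A_{ij}^2},

whereas standard NMF minimizes \|X - FG\|_F^2 = \sum_{j}\sum_{i}(X - FG)_{ij}^2, in which an outlying column enters with its error squared.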


Computer Vision and Pattern Recognition | 2012

Multi-label ReliefF and F-statistic feature selections for image annotation

Deguang Kong; Chris H. Q. Ding; Heng Huang; Haifeng Zhao

The classical ReliefF and F-statistic feature selection methods cannot be directly applied to multi-label problems because a data point may belong to multiple classes simultaneously, which makes its class membership ambiguous. In this paper, we present the MReliefF and MF-statistic algorithms for multi-label feature selection. Discriminative features are selected to boost multi-label classification accuracy, and the proposed MReliefF and MF-statistic can be used in image categorization and annotation problems. Extensive experiments on image annotation tasks show the good performance of our approach. To our knowledge, this is the first work to generalize the ReliefF and F-statistic feature selection algorithms to multi-label image annotation tasks.
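For context, the classical single-label ReliefF update (roughly, following Kononenko's formulation) weights a feature f by contrasting a sampled instance x with its nearest hit H from the same class and nearest miss M_c from each other class c:

W[f] \leftarrow W[f] - \frac{\mathrm{diff}(f, x, H)}{m} + \sum_{c \neq \mathrm{class}(x)} \frac{P(c)}{1 - P(\mathrm{class}(x))}\,\frac{\mathrm{diff}(f, x, M_c)}{m}.

The update presupposes a single class(x) to define hits and misses, which is precisely what becomes ambiguous when x carries several labels and what MReliefF must redefine.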


International Conference on Data Mining | 2013

Efficient Algorithms for Selecting Features with Arbitrary Group Constraints via Group Lasso

Deguang Kong; Chris H. Q. Ding

Feature structure information plays an important role in regression and classification tasks. We consider a generic problem, the group lasso problem, in which structure over the feature space is represented by combinations of features in groups. These groups can be overlapping or non-overlapping and can encode different structures, e.g., structures over a line, a tree, a graph, or even a forest. We propose a new approach to this generic group lasso problem, in which features are selected in groups and an arbitrary family of subsets is allowed. We employ an accelerated proximal gradient method, whose key step is solving the associated proximal operator. We propose a fast method to compute this proximal operator and rigorously prove its convergence. Experimental results on different structures (e.g., group, tree, and graph structures) demonstrate the efficiency and effectiveness of the proposed algorithm.
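A sketch of the general (possibly overlapping) group lasso objective and the proximal operator mentioned above, with \mathcal{G} an arbitrary family of feature subsets, w_g the restriction of w to group g, and d_g a group weight (notation assumed here for illustration):

\min_{w} \; \ell(w) + \lambda \sum_{g \in \mathcal{G}} d_g \|w_g\|_2.

Each accelerated proximal gradient step must evaluate

\mathrm{prox}_{\lambda\Omega}(v) = \arg\min_{w} \; \tfrac{1}{2}\|w - v\|_2^2 + \lambda \sum_{g \in \mathcal{G}} d_g \|w_g\|_2,

which reduces to block-wise soft-thresholding when the groups are disjoint but becomes the computational bottleneck when they overlap arbitrarily; the fast proximal computation is the step the paper addresses.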


European Conference on Machine Learning | 2013

Minimal Shrinkage for Noisy Data Recovery Using Schatten-p Norm Objective

Deguang Kong; Miao Zhang; Chris H. Q. Ding

Noisy data recovery is an important problem in machine learning, with wide applications in collaborative prediction, recommender systems, and related areas. A popular approach uses the trace norm, but it overlooks the fact that the reconstructed data can be shrunk, i.e., the singular values can be greatly suppressed. In this paper, we present novel noisy data recovery models that replace the standard rank surrogate, the trace norm, with the Schatten-p norm. The proposed model is attractive because it suppresses the shrinkage of singular values for smaller values of the parameter p. We analyze the optimal solutions of the proposed models and characterize their rank. Efficient algorithms are presented, and their convergence is rigorously proved. Extensive experimental results on six noisy datasets demonstrate the good performance of the proposed minimal shrinkage models.
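For reference, with \sigma_i(X) the singular values of X, the Schatten-p norm is

\|X\|_{S_p} = \Big(\sum_i \sigma_i(X)^p\Big)^{1/p}.

Setting p = 1 recovers the trace (nuclear) norm, which penalizes every singular value linearly and hence shrinks them all; as p decreases toward 0 the penalty approaches the rank function and large singular values are penalized relatively less, which is the reduced shrinkage the abstract refers to.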


International Conference on Acoustics, Speech, and Signal Processing | 2012

Nonnegative matrix factorization using a robust error function

Chris H. Q. Ding; Deguang Kong

Nonnegative matrix factorization (NMF) is widely used in image analysis. However, most images contain noise and outliers, so a robust version of NMF is needed. We propose a novel NMF based on a robust error function that smoothly interpolates between least squares for small errors and the L1 norm for large errors. An efficient computational algorithm is derived with a rigorous convergence analysis. Extensive experiments on six image datasets show the effectiveness of the proposed approach: the robust NMF consistently provides better reconstructed images and better clustering results than standard NMF.
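The abstract does not reproduce the exact error function; as an illustrative example of such an interpolation (not necessarily the function used in the paper), the Huber loss is quadratic for small residuals and grows linearly, L1-like, for large ones:

h_\delta(e) = \begin{cases} \tfrac{1}{2}e^2 & |e| \le \delta \\ \delta\,(|e| - \tfrac{1}{2}\delta) & |e| > \delta \end{cases}

applied element-wise to the residual X - FG in place of the squared error of standard NMF.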


International Conference on Data Mining | 2012

A Semi-definite Positive Linear Discriminant Analysis and Its Applications

Deguang Kong; Chris H. Q. Ding

Linear Discriminant Analysis (LDA) is widely used for dimension reduction in classification tasks. However, the standard LDA formulation is not semi-definite positive (s.d.p.), so it is difficult to obtain a globally optimal solution when LDA is combined with other loss functions or with graph embedding. In this paper, we present an alternative approach to LDA: we rewrite the LDA criterion as a convex formulation (semi-definite positive LDA, i.e., sdpLDA) using the largest eigenvalue of the generalized eigenvalue problem of standard LDA. As applications, we incorporate sdpLDA as a regularization term into discriminant regression analysis, and we incorporate sdpLDA into standard Laplacian embedding, where the supervised information improves the embedding performance. The proposed sdpLDA formulation can be used for multi-class classification tasks. Extensive experimental results on 10 multi-class datasets indicate promising results for the proposed method.
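One way to read the construction sketched in the abstract (this reading is an assumption; the exact formulation is in the paper): standard LDA solves the generalized eigenvalue problem S_b w = \lambda S_w w, and its trace-ratio criterion is not an s.d.p. program. If \lambda_1 denotes the largest generalized eigenvalue, then \lambda_1 S_w - S_b is positive semi-definite, so a term such as \mathrm{tr}\big(W^\top (\lambda_1 S_w - S_b) W\big) is convex in W and can be added as a regularizer to regression or Laplacian embedding objectives while retaining a globally optimal solution.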


International Conference on Machine Learning | 2012

An Iterative Locally Linear Embedding Algorithm

Deguang Kong; Chris H. Q. Ding; Heng Huang; Feiping Nie


Neural Information Processing Systems | 2014

Exclusive Feature Learning on Arbitrary Structures via ℓ1,2-norm

Deguang Kong; Ryohei Fujimaki; Ji Liu; Feiping Nie; Chris H. Q. Ding


National Conference on Artificial Intelligence | 2014

Robust non-negative dictionary learning

Qihe Pan; Deguang Kong; Chris H. Q. Ding; Bin Luo


European Conference on Machine Learning | 2012

Maximum consistency preferential random walks

Deguang Kong; Chris H. Q. Ding

Collaboration


Dive into Deguang Kong's collaborations.

Top Co-Authors

Chris H. Q. Ding, University of Texas at Arlington
Heng Huang, University of Texas at Arlington
Feiping Nie, Northwestern Polytechnical University
Ji Liu, University of Rochester
Miao Zhang, University of Texas at Arlington