
Publication


Featured research published by Qingtao Tang.


Journal of Visual Communication and Image Representation | 2017

A generic denoising framework via guided principal component analysis

Tao Dai; Zhiya Xu; Haoyi Liang; Ke Gu; Qingtao Tang; Yisen Wang; Weizhi Lu; Shu-Tao Xia

Though existing state-of-the-art denoising algorithms, such as BM3D, LPG-PCA and DDF, obtain remarkable results, these methods are not good at preserving details at high noise levels and sometimes even introduce non-existent artifacts. To improve the performance of these denoising methods at high noise levels, a generic denoising framework based on guided principal component analysis (GPCA) is proposed in this paper. The proposed framework consists of two stages. First, we use a statistical test to generate an initial denoised image through back projection, where the test detects significantly relevant information between the denoised image and the corresponding residual image. Second, similar image patches are collected to form patch groups, and local bases are learned from each patch group by principal component analysis. Experimental results on natural images contaminated with Gaussian and non-Gaussian noise verify the effectiveness of the proposed framework.
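For readers who want a concrete picture of the patch-grouping stage, the sketch below groups similar patches and denoises each group with a locally learned PCA basis. It is only a minimal illustration under assumed defaults (patch size, group size, hard thresholding of noise-dominated components); the function and parameter names are illustrative, not the authors' implementation, and the back-projection stage is omitted.

```python
# Minimal sketch of patch-group PCA denoising (second stage described above).
# Assumed, illustrative defaults: 8x8 patches, 40 similar patches per group,
# hard thresholding of eigen-directions whose variance is below noise_sigma^2.
import numpy as np

def extract_patches(img, patch_size=8, stride=4):
    """Collect overlapping patches as flattened row vectors."""
    h, w = img.shape
    patches, coords = [], []
    for i in range(0, h - patch_size + 1, stride):
        for j in range(0, w - patch_size + 1, stride):
            patches.append(img[i:i + patch_size, j:j + patch_size].ravel())
            coords.append((i, j))
    return np.array(patches), coords

def pca_denoise_group(group, noise_sigma):
    """Project a patch group onto its principal components and zero out
    coefficients along directions dominated by noise."""
    mean = group.mean(axis=0)
    centered = group - mean
    cov = centered.T @ centered / max(len(group) - 1, 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    keep = eigvals > noise_sigma ** 2          # keep signal-dominated axes
    coeffs = centered @ eigvecs
    coeffs[:, ~keep] = 0.0
    return coeffs @ eigvecs.T + mean

def denoise(img, noise_sigma=20.0, patch_size=8, n_similar=40):
    patches, coords = extract_patches(img, patch_size)
    out = np.zeros_like(img, dtype=float)
    weight = np.zeros_like(img, dtype=float)
    for idx, ref in enumerate(patches):
        # group the patches most similar to the reference patch
        dists = np.sum((patches - ref) ** 2, axis=1)
        group_idx = np.argsort(dists)[:n_similar]
        cleaned = pca_denoise_group(patches[group_idx], noise_sigma)
        # write back only the denoised reference patch (simple aggregation);
        # border pixels never covered by a patch stay zero in this sketch
        i, j = coords[idx]
        out[i:i + patch_size, j:j + patch_size] += cleaned[0].reshape(patch_size, patch_size)
        weight[i:i + patch_size, j:j + patch_size] += 1.0
    return out / np.maximum(weight, 1e-8)
```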


IEEE Transactions on Neural Networks | 2018

A Novel Consistent Random Forest Framework: Bernoulli Random Forests

Yisen Wang; Shu-Tao Xia; Qingtao Tang; Jia Wu; Xingquan Zhu

Random forests (RFs) are recognized as one type of ensemble learning method and are effective for most classification and regression tasks. Despite their impressive empirical performance, the theory of RFs has not yet been fully established. Several theoretically guaranteed RF variants have been presented, but they have been criticized for poor practical performance. In this paper, a novel RF framework named Bernoulli RFs (BRFs) is proposed, with the aim of resolving the RF dilemma between theoretical consistency and empirical performance. In contrast to the RFs proposed by Breiman, BRF uses two independent Bernoulli distributions to simplify tree construction: they separately control the splitting-feature and splitting-point selection processes. Consequently, theoretical consistency is ensured in BRF, i.e., convergence of its learning performance to the optimum is guaranteed as the amount of data grows to infinity. Importantly, our proposed BRF is consistent for both classification and regression. Compared with state-of-the-art theoretical/consistent RFs, BRF achieves the best empirical performance. The theoretical and experimental studies in this paper verify this advance toward closing the gap between theory and practice in RF research.
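As a rough illustration of the splitting rule described above, the sketch below uses two independent Bernoulli draws: one decides whether the splitting feature is picked at random or by an impurity criterion, the other whether the splitting point is picked at random or by searching candidate thresholds. The probabilities, the Gini criterion, and the median-based candidate in the feature search are assumptions for the sketch, not the paper's exact construction.

```python
# Illustrative sketch of Bernoulli-controlled split selection.
import numpy as np

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_impurity(x_col, y, threshold):
    """Weighted Gini impurity of the split x_col <= threshold."""
    left, right = y[x_col <= threshold], y[x_col > threshold]
    if len(left) == 0 or len(right) == 0:
        return np.inf                       # degenerate split
    n = len(y)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

def choose_split(X, y, p_feature=0.05, p_point=0.05, rng=None):
    """Pick (feature, threshold) with two independent Bernoulli decisions."""
    rng = rng if rng is not None else np.random.default_rng()
    n_features = X.shape[1]
    # Bernoulli draw 1: random feature vs. feature with the lowest impurity
    # (impurity evaluated at the median threshold, a simplification here)
    if rng.random() < p_feature:
        feature = int(rng.integers(n_features))
    else:
        feature = min(range(n_features),
                      key=lambda f: split_impurity(X[:, f], y, np.median(X[:, f])))
    # Bernoulli draw 2: random threshold vs. exhaustive search on that feature
    values = np.unique(X[:, feature])
    if rng.random() < p_point:
        threshold = float(rng.choice(values))
    else:
        threshold = min(values, key=lambda t: split_impurity(X[:, feature], y, t))
    return feature, threshold
```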


International Joint Conference on Artificial Intelligence | 2017

Robust Survey Aggregation with Student-t Distribution and Sparse Representation

Qingtao Tang; Tao Dai; Li Niu; Yisen Wang; Shu-Tao Xia; Jianfei Cai

Most existing survey aggregation methods assume that the sample data follow a Gaussian distribution. However, these methods are sensitive to outliers, due to the thin-tailed property of the Gaussian distribution. To address this issue, we propose a robust survey aggregation method based on the Student-t distribution and sparse representation. Specifically, we assume that the samples follow a Student-t distribution instead of the common Gaussian distribution. Owing to the Student-t distribution, our method is robust to outliers, which can be explained from both Bayesian and non-Bayesian points of view. In addition, inspired by the James-Stein estimator (JS) and Compressive Averaging (CAvg), we propose to sparsely represent the global mean vector by an adaptive basis comprising both a data-specific basis and a combined generic basis. Theoretically, we prove that JS and CAvg are special cases of our method. Extensive experiments demonstrate that the proposed method achieves significant improvements over state-of-the-art methods on both synthetic and real datasets.
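To make the robustness argument concrete, here is a minimal sketch: writing the Student-t distribution as a Gaussian scale mixture leads to an EM-style loop that downweights samples with large residuals when estimating a mean. This only illustrates why the heavy-tailed model resists outliers; the paper's sparse representation over a data-specific plus generic basis is not reproduced, and the degrees of freedom and iteration count below are arbitrary choices.

```python
# EM-style robust location estimate under a 1-D Student-t model
# (Gaussian scale mixture view); illustrative only.
import numpy as np

def student_t_mean(x, nu=3.0, n_iter=50):
    """Robust location estimate of 1-D samples x under a Student-t model."""
    mu, sigma2 = np.median(x), np.var(x) + 1e-8
    for _ in range(n_iter):
        # E-step: expected precision weight per sample
        # (outliers receive small weights because their residuals are large)
        w = (nu + 1.0) / (nu + (x - mu) ** 2 / sigma2)
        # M-step: weighted updates of location and scale
        mu = np.sum(w * x) / np.sum(w)
        sigma2 = np.sum(w * (x - mu) ** 2) / len(x) + 1e-8
    return mu

# One gross outlier barely moves the Student-t estimate,
# while the ordinary (Gaussian) mean is pulled toward it.
samples = np.array([4.9, 5.1, 5.0, 4.8, 5.2, 50.0])
print(np.mean(samples))          # 12.5
print(student_t_mean(samples))   # close to 5.0
```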


International Joint Conference on Artificial Intelligence | 2017

Student-t Process Regression with Student-t Likelihood

Qingtao Tang; Li Niu; Yisen Wang; Tao Dai; Wangpeng An; Jianfei Cai; Shu-Tao Xia

Gaussian Process Regression (GPR) is a powerful Bayesian method. However, the performance of GPR can be significantly degraded when the training data are contaminated by outliers, including target outliers and input outliers. Although there are some variants of GPR (e.g., GPR with Student-t likelihood (GPRT)) that aim to handle outliers, most of them focus on target outliers, while little effort has been made to deal with input outliers. In this work, we aim to handle both target outliers and input outliers at the same time. Specifically, we replace the Gaussian noise in GPR with independent Student-t noise to cope with target outliers. Moreover, to enhance robustness w.r.t. input outliers, we use a Student-t Process prior instead of the common Gaussian Process prior, leading to Student-t Process Regression with Student-t Likelihood (TPRT). We theoretically show that TPRT is more robust to both input and target outliers than GPR and GPRT, and prove that both GPR and GPRT are special cases of TPRT. Various experiments demonstrate that TPRT outperforms GPR and its variants on both synthetic and real datasets.
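The sketch below contrasts the familiar GP predictive equations with the predictive of a zero-mean Student-t process, using the standard multivariate Student-t conditioning rule (the GP predictive mean is unchanged, while the covariance is rescaled by a data-dependent factor). It illustrates the heavier-tailed process prior only; TPRT's independent Student-t likelihood and its actual inference procedure are not reproduced, and the RBF kernel, its hyperparameters, and the noise term folded into the kernel diagonal are assumptions of the sketch.

```python
# Illustrative Student-t process predictive vs. the GP form.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between row-wise inputs A and B."""
    d2 = np.sum(A ** 2, 1)[:, None] + np.sum(B ** 2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def tp_predict(X_train, y_train, X_test, nu=5.0, noise=1e-2):
    """Posterior predictive mean/variance of a zero-mean Student-t process."""
    n = len(X_train)
    K = rbf_kernel(X_train, X_train) + noise * np.eye(n)
    Ks = rbf_kernel(X_train, X_test)
    Kss = rbf_kernel(X_test, X_test) + noise * np.eye(len(X_test))
    K_inv_y = np.linalg.solve(K, y_train)
    mean = Ks.T @ K_inv_y                       # same form as the GP mean
    beta = float(y_train @ K_inv_y)             # data-dependent scale term
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)   # GP predictive covariance
    # Student-t conditioning rescales the covariance by (nu+beta-2)/(nu+n-2)
    cov *= (nu + beta - 2.0) / (nu + n - 2.0)
    return mean, np.diag(cov)

# Toy usage on 1-D data with one corrupted target value.
rng = np.random.default_rng(0)
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(20)
y[7] += 3.0                                     # target outlier
mu, var = tp_predict(X, y, X)
```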


International Joint Conference on Artificial Intelligence | 2016

Bernoulli random forests: closing the gap between theoretical consistency and empirical soundness

Yisen Wang; Qingtao Tang; Shu-Tao Xia; Jia Wu; Xingquan Zhu


European Conference on Artificial Intelligence | 2016

Student-t Process Regression with Dependent Student-t Noise

Qingtao Tang; Yisen Wang; Shu-Tao Xia


International Conference on Image Processing | 2018

Portrait-Aware Artistic Style Transfer

Yeli Xing; Jiawei Li; Tao Dai; Qingtao Tang; Li Niu; Shu-Tao Xia


International Conference on Image Processing | 2018

Cyclic Annealing Training Convolutional Neural Networks for Image Classification with Noisy Labels

Jiawei Li; Tao Dai; Qingtao Tang; Yeli Xing; Shu-Tao Xia


International Conference on Acoustics, Speech, and Signal Processing | 2018

SURE-Based Dual Domain Image Denoising

Zhiya Xu; Tao Dai; Li Niu; Jiawei Li; Qingtao Tang; Shu-Tao Xia


International Conference on Acoustics, Speech, and Signal Processing | 2018

Self-paced mixture of t distribution model

Yang Zhang; Qingtao Tang; Li Niu; Tao Dai; Xi Xiao; Shu-Tao Xia

Collaboration


Dive into Qingtao Tang's collaborations.

Top Co-Authors

Ke Gu
Beijing University of Technology

Jia Wu
Macquarie University

Haoyi Liang
University of Virginia

Xingquan Zhu
Florida Atlantic University