Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Zhengdong Lu is active.

Publication


Featured research published by Zhengdong Lu.


Computer Vision and Pattern Recognition | 2008

Constrained spectral clustering through affinity propagation

Zhengdong Lu; Miguel Á. Carreira-Perpiñán

Pairwise constraints specify whether or not two samples should be in one cluster. Although such constraints have been successfully incorporated into traditional clustering methods, such as K-means, little progress has been made in combining them with spectral clustering. The major challenge in designing an effective constrained spectral clustering method is a sensible combination of the scarce pairwise constraints with the original affinity matrix. We propose to combine the two sources of affinity by propagating the pairwise constraint information over the original affinity matrix. Our method has a Gaussian process interpretation and results in a closed-form expression for the new affinity matrix. Experiments show that it outperforms state-of-the-art constrained clustering methods, producing good clusterings with fewer constraints and good image segmentations from user-specified pairwise constraints.
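
As a rough illustration of the idea above, the sketch below folds must-link and cannot-link constraints into a Gaussian affinity matrix and hands the result to an off-the-shelf spectral clustering routine. It is not the paper's closed-form Gaussian-process propagation; the data, constraint pairs and kernel width are all hypothetical.

```python
# Illustrative sketch only: constraints override affinity entries directly,
# rather than being propagated in closed form as in the paper.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics import pairwise_distances

def constrained_affinity(X, must_link, cannot_link, sigma=1.0):
    """Gaussian affinity with constrained entries overridden."""
    A = np.exp(-pairwise_distances(X, metric="sqeuclidean") / (2.0 * sigma ** 2))
    for i, j in must_link:       # force high similarity
        A[i, j] = A[j, i] = 1.0
    for i, j in cannot_link:     # force low similarity
        A[i, j] = A[j, i] = 0.0
    return A

# Toy usage with hypothetical data and constraints
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(5, 1, (20, 2))])
A = constrained_affinity(X, must_link=[(0, 1)], cannot_link=[(0, 30)])
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)
```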


Neural Computation | 2007

Penalized Probabilistic Clustering

Zhengdong Lu; Todd K. Leen

While clustering is usually an unsupervised operation, there are circumstances in which we believe (with varying degrees of certainty) that items A and B should be assigned to the same cluster, while items A and C should not. We would like such pairwise relations to influence cluster assignments of out-of-sample data in a manner consistent with the prior knowledge expressed in the training set. Our starting point is probabilistic clustering based on Gaussian mixture models (GMMs) of the data distribution. We express clustering preferences in a prior distribution over assignments of data points to clusters. This prior penalizes cluster assignments according to the degree to which they violate the preferences. The model parameters are fit with the expectation-maximization (EM) algorithm. Our model provides a flexible framework that encompasses several other semi-supervised clustering models as special cases. Experiments on artificial and real-world problems show that our model consistently improves clustering results when pairwise relations are incorporated, and that it handles noisy pairwise relations better than other semi-supervised clustering methods.
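
A minimal sketch of the penalized-EM idea follows, assuming a mean-field-style reweighting of GMM responsibilities rather than the paper's exact prior over joint assignments; the penalty weight and its additive form are assumptions.

```python
# Rough sketch: the E-step responsibilities are reweighted by agreement with
# must-link / cannot-link partners (an approximation, not the paper's model).
import numpy as np
from scipy.stats import multivariate_normal

def penalized_em(X, K, must_link, cannot_link, lam=2.0, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)
    mu = X[rng.choice(n, K, replace=False)]
    cov = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
    R = np.full((n, K), 1.0 / K)
    for _ in range(n_iter):
        # E-step: standard log responsibilities ...
        logp = np.stack([np.log(pi[k]) +
                         multivariate_normal.logpdf(X, mu[k], cov[k])
                         for k in range(K)], axis=1)
        # ... plus a penalty rewarding agreement with must-link partners
        # and disagreement with cannot-link partners (illustrative form).
        pen = np.zeros_like(logp)
        for i, j in must_link:
            pen[i] += lam * R[j]
            pen[j] += lam * R[i]
        for i, j in cannot_link:
            pen[i] -= lam * R[j]
            pen[j] -= lam * R[i]
        logp += pen
        logp -= logp.max(axis=1, keepdims=True)
        R = np.exp(logp)
        R /= R.sum(axis=1, keepdims=True)
        # M-step: usual GMM parameter updates
        Nk = R.sum(axis=0)
        pi = Nk / n
        mu = (R.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            cov[k] = (R[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)
    return R.argmax(axis=1)

# Toy usage with hypothetical constraints
X = np.vstack([np.random.default_rng(1).normal(0, 1, (30, 2)),
               np.random.default_rng(2).normal(4, 1, (30, 2))])
labels = penalized_em(X, K=2, must_link=[(0, 1)], cannot_link=[(0, 40)])
```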


Neural Networks | 2007

2007 Special Issue: Fast neural network surrogates for very high dimensional physics-based models in computational oceanography

Rudolph van der Merwe; Todd K. Leen; Zhengdong Lu; Sergey Frolov; António M. Baptista

We present neural network surrogates that provide extremely fast and accurate emulation of a large-scale circulation model for the coupled Columbia River, its estuary and near-ocean regions. The circulation model has O(10^7) degrees of freedom, is highly nonlinear and is driven by ocean, atmospheric and river influences at its boundaries. The surrogates provide accurate emulation of the full circulation code and run over 1000 times faster. Such fast dynamic surrogates will enable significant advances in ensemble forecasts in oceanography and weather.
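
The abstract does not specify the surrogate architecture, so the sketch below only illustrates the generic emulation idea under stated assumptions: a PCA reduction of the high-dimensional model state and a small MLP mapping boundary forcings to the reduced state. All array shapes and data here are stand-ins.

```python
# Generic surrogate sketch (assumptions: PCA state reduction and an MLP
# regressor; neither is taken from the abstract).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
forcings = rng.normal(size=(500, 8))      # hypothetical ocean/atmosphere/river forcings
states = rng.normal(size=(500, 10_000))   # stand-in for high-dimensional model output

pca = PCA(n_components=20).fit(states)    # compress the model state
Z = pca.transform(states)

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                         random_state=0).fit(forcings, Z)

# Fast emulation: predict the reduced state, then reconstruct the full field
Z_new = surrogate.predict(rng.normal(size=(1, 8)))
state_new = pca.inverse_transform(Z_new)
```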


Computer Vision and Pattern Recognition | 2008

Dimensionality reduction by unsupervised regression

Miguel Á. Carreira-Perpiñán; Zhengdong Lu

We consider the problem of dimensionality reduction, where given high-dimensional data we want to estimate two mappings: from high to low dimension (dimensionality reduction) and from low to high dimension (reconstruction). We adopt an unsupervised regression point of view by introducing the unknown low-dimensional coordinates of the data as parameters, and formulate a regularised objective functional of the mappings and low-dimensional coordinates. Alternating minimisation of this functional is straightforward: for fixed low-dimensional coordinates, the mappings have a unique solution; and for fixed mappings, the coordinates can be obtained by finite-dimensional non-linear minimisation. In addition, the coordinates can be initialised to the output of a spectral method such as Laplacian eigenmaps. The model generalises PCA and several recent methods that learn one of the two mappings but not both; and, unlike spectral methods, our model provides out-of-sample mappings by construction. Experiments with toy and real-world problems show that the model is able to learn mappings for convoluted manifolds, avoiding bad local optima that plague other methods.
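
A minimal sketch of the alternating scheme described above follows, with two assumptions for brevity: both mappings are plain ridge regressions (so each has a unique solution given the coordinates), and the spectral initialisation is replaced by PCA; the latent coordinates are then moved by simple gradient steps rather than a full non-linear minimisation.

```python
# Illustrative alternation between fitting the two mappings and updating the
# latent coordinates (linear/ridge mappings and PCA init are assumptions).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

def drur_sketch(X, latent_dim=2, n_iter=20, step=0.1):
    Z = PCA(n_components=latent_dim).fit_transform(X)   # stand-in for a spectral init
    for _ in range(n_iter):
        # Step 1: for fixed coordinates Z, fit both mappings (unique ridge solutions)
        f = Ridge(alpha=1e-2).fit(Z, X)   # reconstruction: low -> high dimension
        F = Ridge(alpha=1e-2).fit(X, Z)   # dimensionality reduction: high -> low dimension
        # Step 2: for fixed mappings, move Z downhill on the combined reconstruction error
        grad = (f.predict(Z) - X) @ f.coef_ + (Z - F.predict(X))
        Z -= step * grad
    return Z, f, F

# Toy usage on stand-in data
X = np.random.default_rng(0).normal(size=(200, 5))
Z, f, F = drur_sketch(X)
```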


Computer Vision and Pattern Recognition | 2010

Parametric dimensionality reduction by unsupervised regression

Miguel Á. Carreira-Perpiñán; Zhengdong Lu

We introduce a parametric version (pDRUR) of the recently proposed Dimensionality Reduction by Unsupervised Regression algorithm. pDRUR alternately minimizes reconstruction error by fitting parametric functions given latent coordinates and data, and by updating latent coordinates given functions (with a Gauss-Newton method decoupled over coordinates). Both the fit and the update become much faster while attaining results of similar quality, and afford dealing with far larger datasets (10^5 points). We show in a number of benchmarks how the algorithm efficiently learns good latent coordinates and bidirectional mappings between the data and latent space, even with very noisy or low-quality initializations, often drastically improving the result of spectral and other methods.
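
As an illustration of the decoupled coordinate update, the sketch below applies one damped Gauss-Newton step per data point for a generic differentiable mapping from latent to data space; the toy mapping and its Jacobian are assumptions, not pDRUR's parametric functions.

```python
# One decoupled Gauss-Newton latent update per point, for a generic mapping
# f: z -> x with Jacobian J(z); the mapping below is a toy assumption.
import numpy as np

def gauss_newton_latent_step(x, z, f, jac, damping=1e-6):
    """One damped Gauss-Newton step on ||x - f(z)||^2 for a single point."""
    r = x - f(z)                             # residual in data space
    J = jac(z)                               # Jacobian of f at z, shape (dim_x, dim_z)
    H = J.T @ J + damping * np.eye(z.size)   # damped Gauss-Newton normal matrix
    return z + np.linalg.solve(H, J.T @ r)

# Toy usage: f maps a 1-D latent coordinate onto a 2-D curve
f = lambda z: np.array([z[0], z[0] ** 2])
jac = lambda z: np.array([[1.0], [2.0 * z[0]]])
z = np.array([0.5])
for _ in range(10):                          # each point's update is independent
    z = gauss_newton_latent_step(np.array([1.0, 1.0]), z, f, jac)
```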


Neural Information Processing Systems | 2004

Semi-supervised Learning with Penalized Probabilistic Clustering

Zhengdong Lu; Todd K. Leen


International Conference on Artificial Intelligence and Statistics | 2007

The Laplacian Eigenmaps Latent Variable Model

Miguel Á. Carreira-Perpiñán; Zhengdong Lu


Neural Information Processing Systems | 2007

People Tracking with the Laplacian Eigenmaps Latent Variable Model

Zhengdong Lu; Cristian Sminchisescu; Miguel Á. Carreira-Perpiñán


International Conference on Artificial Intelligence and Statistics | 2007

Semi-supervised Clustering with Pairwise Constraints: A Discriminative Approach

Zhengdong Lu


Neural Information Processing Systems | 2011

A Denoising View of Matrix Completion

Weiran Wang; Miguel Á. Carreira-Perpiñán; Zhengdong Lu

Collaboration


Dive into Zhengdong Lu's collaborations.

Top Co-Authors

Weiran Wang

Toyota Technological Institute at Chicago
