Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Nizar Bouguila is active.

Publication


Featured research published by Nizar Bouguila.


IEEE Transactions on Image Processing | 2004

Unsupervised learning of a finite mixture model based on the Dirichlet distribution and its application

Nizar Bouguila; Djemel Ziou; Jean Vaillancourt

This paper presents an unsupervised algorithm for learning a finite mixture model from multivariate data. This mixture model is based on the Dirichlet distribution, which offers high flexibility for modeling data. The proposed approach for estimating the parameters of a Dirichlet mixture is based on the maximum likelihood (ML) and Fisher scoring methods. Experimental results are presented for the following applications: estimation of artificial histograms, summarization of image databases for efficient retrieval, and human skin color modeling and its application to skin detection in multimedia databases.
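The Dirichlet building block behind this mixture is easy to sketch. The snippet below is a minimal illustration, not the paper's Fisher-scoring ML procedure: it implements the Dirichlet log-density and a method-of-moments parameter estimate of the kind often used to initialize ML fitting. All function names here are invented for the example.

```python
import math
import random

def dirichlet_logpdf(x, alpha):
    """Log-density of a Dirichlet distribution at a point x on the simplex."""
    return (math.lgamma(sum(alpha))
            - sum(math.lgamma(a) for a in alpha)
            + sum((a - 1.0) * math.log(xi) for a, xi in zip(alpha, x)))

def dirichlet_moment_fit(samples):
    """Method-of-moments estimate of Dirichlet parameters.

    Matches the sample means and the variance of the first coordinate,
    using Var(x_0) = m_0 (1 - m_0) / (alpha0 + 1).  A common initializer
    for ML fitting; the Fisher-scoring refinement is omitted here.
    """
    d, n = len(samples[0]), len(samples)
    means = [sum(s[i] for s in samples) / n for i in range(d)]
    var0 = sum((s[0] - means[0]) ** 2 for s in samples) / n
    alpha0 = means[0] * (1.0 - means[0]) / var0 - 1.0
    return [m * alpha0 for m in means]

# synthetic simplex data from a known Dirichlet via normalized Gamma draws
random.seed(0)
true_alpha = [4.0, 2.0, 1.0]
data = []
for _ in range(5000):
    g = [random.gammavariate(a, 1.0) for a in true_alpha]
    s = sum(g)
    data.append([gi / s for gi in g])

est = dirichlet_moment_fit(data)
```

The uniform case is a quick sanity check: with alpha = (1, 1, 1) the density is constant at 2 over the 2-simplex, so the log-density is log 2 everywhere.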


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2007

High-Dimensional Unsupervised Selection and Estimation of a Finite Generalized Dirichlet Mixture Model Based on Minimum Message Length

Nizar Bouguila; Djemel Ziou

We consider the problem of determining the structure of high-dimensional data without prior knowledge of the number of clusters. Data are represented by a finite mixture model based on the generalized Dirichlet distribution. The generalized Dirichlet distribution has a more general covariance structure than the Dirichlet distribution and offers high flexibility and ease of use for the approximation of both symmetric and asymmetric distributions. This makes the generalized Dirichlet distribution more practical and useful. An important problem in mixture modeling is the determination of the number of clusters. Indeed, a mixture with too many or too few components may not be appropriate to approximate the true model. Here, we consider the application of the minimum message length (MML) principle to determine the number of clusters. The MML is derived so as to choose the number of clusters in the mixture model that best describes the data. A comparison with other selection criteria is performed. The validation involves synthetic data, real data clustering, and two interesting real applications: classification of Web pages, and texture database summarization for efficient retrieval.
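The selection loop behind such criteria has a simple shape: fit the mixture for each candidate number of clusters and keep the candidate with the lowest message cost. A minimal sketch, substituting a 1-D Gaussian mixture and a BIC-style penalty for the paper's generalized Dirichlet mixture and exact MML expression; every name below is illustrative.

```python
import math
import random

def gmm_em_1d(xs, k, iters=60):
    """Tiny 1-D Gaussian-mixture EM; returns the final log-likelihood."""
    xs = sorted(xs)
    n = len(xs)
    mus = [xs[(2 * j + 1) * n // (2 * k)] for j in range(k)]  # percentile init
    sigmas = [max((xs[-1] - xs[0]) / (2.0 * k), 0.2)] * k
    pis = [1.0 / k] * k
    loglik = 0.0
    for _ in range(iters):
        # E-step: responsibilities and current log-likelihood
        resp, loglik = [], 0.0
        for x in xs:
            ps = [pis[j] / (sigmas[j] * math.sqrt(2 * math.pi))
                  * math.exp(-0.5 * ((x - mus[j]) / sigmas[j]) ** 2)
                  for j in range(k)]
            tot = sum(ps) + 1e-300
            loglik += math.log(tot)
            resp.append([p / tot for p in ps])
        # M-step: weighted means, variances, and weights
        for j in range(k):
            nj = sum(r[j] for r in resp) + 1e-12
            mus[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, xs)) / nj
            sigmas[j] = max(math.sqrt(var), 0.2)  # floor avoids collapse
            pis[j] = nj / n
    return loglik

def select_k(xs, kmax=4):
    """Choose the number of components by a penalized-likelihood score."""
    n = len(xs)
    best_k, best_score = 1, float("inf")
    for k in range(1, kmax + 1):
        ll = gmm_em_1d(xs, k)
        n_params = 3 * k - 1                         # weights, means, variances
        score = -2.0 * ll + n_params * math.log(n)   # BIC-style message cost
        if score < best_score:
            best_k, best_score = k, score
    return best_k

random.seed(1)
sample = ([random.gauss(0.0, 1.0) for _ in range(300)]
          + [random.gauss(8.0, 1.0) for _ in range(300)])
best = select_k(sample)
```

On the two-cluster sample above, the penalized score is minimized at k = 2: adding components beyond the true number buys too little likelihood to pay for the extra parameters.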


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2009

A Hybrid Feature Extraction Selection Approach for High-Dimensional Non-Gaussian Data Clustering

Sabri Boutemedjet; Nizar Bouguila; Djemel Ziou

This paper presents an unsupervised approach for feature selection and extraction in mixtures of generalized Dirichlet (GD) distributions. Our method defines a new mixture model that is able to extract independent and non-Gaussian features without loss of accuracy. The proposed model is learned using the expectation-maximization algorithm by minimizing the message length of the data set. Experimental results show the merits of the proposed methodology in the categorization of object images.


IEEE Transactions on Knowledge and Data Engineering | 2006

Unsupervised selection of a finite Dirichlet mixture model: an MML-based approach

Nizar Bouguila; Djemel Ziou

This paper proposes an unsupervised algorithm for learning a finite Dirichlet mixture model. An important part of the unsupervised learning problem is determining the number of clusters which best describes the data. We extend the minimum message length (MML) principle to determine the number of clusters in the case of Dirichlet mixtures. Parameter estimation is done by the expectation-maximization algorithm. The resulting method is validated for one-dimensional and multidimensional data. For the one-dimensional data, the experiments concern artificial and real SAR image histograms. The validation for multidimensional data involves synthetic data and two real applications: shadow detection in images and summarization of texture image databases for efficient retrieval. A comparison with results obtained for other selection criteria is provided.


Journal of Electronic Imaging | 2008

Finite general Gaussian mixture modeling and application to image and video foreground segmentation

Mohand Said Allili; Nizar Bouguila; Djemel Ziou

In this paper, we propose a finite mixture model of generalized Gaussian distributions (GGD) for robust segmentation and data modeling in the presence of noise and outliers. The model has more flexibility to adapt to the shape of the data and lower sensitivity to over-fitting the number of classes than the Gaussian mixture. In the first part of this work, we derive the maximum-likelihood estimation of the parameters of the new mixture model and propose an information-theoretic approach for selecting the number of classes. In the second part, we present applications to image, motion, and foreground segmentation that measure the performance of the new model in image data modeling, with comparison to the Gaussian mixture.
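The generalized Gaussian density underlying the mixture has a shape parameter that interpolates between peaked, heavy-tailed forms and the ordinary Gaussian. A small sketch, using one common parameterization (not necessarily the paper's):

```python
import math

def ggd_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    """Generalized Gaussian density: beta = 2 gives a Gaussian, beta = 1 a
    Laplacian; smaller beta means heavier tails and a sharper peak."""
    coef = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return coef * math.exp(-(abs(x - mu) / alpha) ** beta)

# beta = 2 with alpha = sigma * sqrt(2) recovers the normal density
sigma = 1.5
gauss = math.exp(-0.5 * (0.7 / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
ggd = ggd_pdf(0.7, alpha=sigma * math.sqrt(2.0), beta=2.0)
```

With beta = 2 and alpha = sigma * sqrt(2) the density coincides with the normal density, while beta = 1 with alpha = 1 gives the standard Laplacian, whose density at the mode is 1/2.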


Statistics and Computing | 2006

Practical Bayesian estimation of a finite beta mixture through Gibbs sampling and its applications

Nizar Bouguila; Djemel Ziou; Ernest Monga

This paper deals with a Bayesian analysis of a finite Beta mixture model. We present an approximation method that evaluates the posterior distribution and Bayes estimators by Gibbs sampling, relying on the missing-data structure of the mixture model. Experimental results concern contextual and non-contextual evaluations. The non-contextual evaluation is based on synthetic histograms, while the contextual one models the class-conditional densities of pattern-recognition data sets. The Beta mixture is also applied to estimate the parameters of SAR image histograms.
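The missing-data structure the sampler relies on can be illustrated on a toy two-component Beta mixture. For brevity this sketch holds the component parameters fixed and Gibbs-samples only the hidden labels and the mixing weights; the paper's sampler also updates the Beta parameters, and all numbers below are invented for the example.

```python
import math
import random

def beta_pdf(x, a, b):
    """Beta density via log-gamma for numerical stability."""
    logc = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(logc + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

random.seed(2)
# two-component Beta mixture with true mixing weight 0.7; component
# parameters are held fixed to isolate the data-augmentation structure
comps = [(2.0, 8.0), (8.0, 2.0)]
data = ([random.betavariate(*comps[0]) for _ in range(700)]
        + [random.betavariate(*comps[1]) for _ in range(300)])

pi = [0.5, 0.5]
weights_trace = []
for sweep in range(200):
    # 1) sample the missing labels z_i given the current weights
    counts = [0, 0]
    for x in data:
        p0 = pi[0] * beta_pdf(x, *comps[0])
        p1 = pi[1] * beta_pdf(x, *comps[1])
        z = 0 if random.random() < p0 / (p0 + p1) else 1
        counts[z] += 1
    # 2) sample the weights from their Dirichlet posterior via Gamma draws
    g = [random.gammavariate(1.0 + c, 1.0) for c in counts]
    pi = [gi / sum(g) for gi in g]
    if sweep >= 100:                      # discard burn-in
        weights_trace.append(pi[0])

post_mean = sum(weights_trace) / len(weights_trace)
```

The posterior mean of the first mixing weight settles near the true value 0.7, showing how alternating between labels and weights recovers the mixture structure.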


IEEE Transactions on Neural Networks | 2012

Variational Learning for Finite Dirichlet Mixture Models and Applications

Wentao Fan; Nizar Bouguila; Djemel Ziou

In this paper, we focus on the variational learning of finite Dirichlet mixture models. Compared to other algorithms commonly used for mixture models (such as expectation-maximization), our approach has several advantages: first, the problem of over-fitting is prevented; furthermore, the complexity of the mixture model (i.e., the number of components) can be determined automatically and simultaneously with parameter estimation as part of the Bayesian inference procedure; finally, since the whole inference process is analytically tractable with closed-form solutions, it may scale well to large applications. Experiments on both synthetic data and real data from challenging real-life applications, namely image database categorization and anomaly intrusion detection, verify the effectiveness of the proposed approach.


canadian conference on computer and robot vision | 2007

A Robust Video Foreground Segmentation by Using Generalized Gaussian Mixture Modeling

Mohand Said Allili; Nizar Bouguila; Djemel Ziou

In this paper, we propose robust video foreground modeling using a finite mixture model of generalized Gaussian distributions (GGD). The model has the flexibility to model the video background in the presence of sudden illumination changes and shadows, allowing for efficient foreground segmentation. In the first part of this work, we derive the online estimation of the parameters of the GGD mixture and propose a Bayesian approach for the selection of the number of classes. In the second part, we present video foreground segmentation experiments demonstrating the performance of the proposed model.


IEEE Transactions on Neural Networks | 2010

A Dirichlet Process Mixture of Generalized Dirichlet Distributions for Proportional Data Modeling

Nizar Bouguila; Djemel Ziou

In this paper, we propose a clustering algorithm based on both the Dirichlet process and the generalized Dirichlet distribution, which has been shown to be very flexible for proportional data modeling. Our approach can be viewed as an extension of the finite generalized Dirichlet mixture model to the infinite case. The extension is based on nonparametric Bayesian analysis. This clustering algorithm does not require the number of mixture components to be specified in advance and estimates it in a principled manner. Our approach is Bayesian and relies on estimating the posterior distribution of clusterings using a Gibbs sampler. Through applications involving real-data classification and image database categorization using visual words, we show that clustering via infinite mixture models offers more powerful and robust performance than classic finite mixtures.
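The Dirichlet process prior over clusterings can be simulated directly through its Chinese restaurant process representation, which is also what lets a Gibbs sampler create and delete clusters. A self-contained sketch of the prior alone, not the paper's full sampler:

```python
import random

def crp_partition(n, alpha=1.0, seed=3):
    """Draw a partition of n items from the Chinese restaurant process:
    item i joins an existing cluster with probability proportional to the
    cluster's size, or opens a new cluster with probability proportional
    to the concentration parameter alpha."""
    rng = random.Random(seed)
    sizes = []          # current cluster sizes
    labels = []         # cluster label of each item
    for i in range(n):
        r = rng.random() * (i + alpha)
        acc = 0.0
        for k, s in enumerate(sizes):
            acc += s
            if r < acc:
                sizes[k] += 1
                labels.append(k)
                break
        else:
            labels.append(len(sizes))   # open a new cluster
            sizes.append(1)
    return labels, sizes

labels, sizes = crp_partition(1000, alpha=1.0)
```

The expected number of clusters grows only logarithmically with n (about alpha * log n), which is why the number of components need not be fixed in advance.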


IEEE Transactions on Knowledge and Data Engineering | 2008

Clustering of Count Data Using Generalized Dirichlet Multinomial Distributions

Nizar Bouguila

In this paper, we examine the problem of count data clustering. We analyze this problem using finite mixtures of distributions. The multinomial distribution and the multinomial Dirichlet distribution (MDD) are widely accepted models for count data. We show that these two distributions cannot be the best choice in all applications, and we propose another model, the multinomial generalized Dirichlet distribution (MGDD), which is the composition of the generalized Dirichlet distribution and the multinomial, in the same way that the MDD is the composition of the Dirichlet and the multinomial. The estimation of the parameters and the determination of the number of components in our model are based on the deterministic annealing expectation-maximization (DAEM) approach and the minimum description length (MDL) criterion, respectively. We compare our method to standard approaches such as multinomial and multinomial Dirichlet mixtures to show its merits. The comparison involves different applications such as spatial color image database indexing, handwritten digit recognition, and text document clustering.
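The composition idea is easiest to see in the Dirichlet case that the paper generalizes: integrating a multinomial likelihood against a Dirichlet prior yields the Dirichlet-multinomial (MDD) in closed form. A minimal sketch of its log-pmf; the MGDD replaces the Dirichlet terms with generalized Dirichlet ones.

```python
import math

def dm_logpmf(counts, alpha):
    """Log-pmf of the Dirichlet-multinomial: a multinomial compounded with
    a Dirichlet prior on its probability vector, integrated in closed form."""
    n = sum(counts)
    a = sum(alpha)
    out = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    out += math.lgamma(a) - math.lgamma(n + a)
    out += sum(math.lgamma(c + al) - math.lgamma(al)
               for c, al in zip(counts, alpha))
    return out

# with a uniform Dirichlet prior (alpha = 1, 1), every split of n = 5
# counts between two categories is equally likely
u = [dm_logpmf([k, 5 - k], [1.0, 1.0]) for k in range(6)]
```

With alpha = (1, 1), the distribution over k successes out of n is uniform, i.e., each outcome has probability 1/(n+1), a classic sanity check for the compound form.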

Collaboration


Dive into Nizar Bouguila's collaborations.

Top Co-Authors


Djemel Ziou

Université de Sherbrooke
