
Publication


Featured research published by Thierry Denoeux.


Systems, Man and Cybernetics | 1995

A k-nearest neighbor classification rule based on Dempster-Shafer theory

Thierry Denoeux

In this paper, the problem of classifying an unseen pattern on the basis of its nearest neighbors in a recorded data set is addressed from the point of view of Dempster-Shafer theory. Each neighbor of a sample to be classified is considered as an item of evidence that supports certain hypotheses regarding the class membership of that pattern. The degree of support is defined as a function of the distance between the two vectors. The evidence of the k nearest neighbors is then pooled by means of Dempster's rule of combination. This approach provides a global treatment of such issues as ambiguity and distance rejection, and imperfect knowledge regarding the class membership of training patterns. The effectiveness of this classification scheme as compared to the voting and distance-weighted k-NN procedures is demonstrated using several sets of simulated and real-world data.
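
To make the pooling step concrete, here is a minimal Python sketch of an evidential k-NN rule of this kind. The function name, the exponential support function and the parameter values (alpha, gamma, k) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def evidential_knn(x, X_train, y_train, n_classes, k=5, alpha=0.95, gamma=1.0):
    """Sketch of an evidential k-NN rule: each neighbor yields a simple mass function
    m({class}) = alpha * exp(-gamma * d**2), m(Omega) = 1 - m({class}), and the k
    mass functions are pooled with Dempster's rule."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]

    m = np.zeros(n_classes)   # combined mass on each singleton {q}
    m_omega = 1.0             # mass on the whole frame Omega ("don't know")
    m_empty = 0.0             # conflict accumulated by the conjunctive combination
    for i in nearest:
        q = y_train[i]
        s = alpha * np.exp(-gamma * d[i] ** 2)
        conflict = s * (m.sum() - m[q])        # evidence for q meets mass on other classes
        new_m = m * (1.0 - s)
        new_m[q] = m[q] + m_omega * s
        m, m_omega, m_empty = new_m, m_omega * (1.0 - s), m_empty + conflict

    # Dempster's rule: normalize the conflict away.
    return m / (1.0 - m_empty), m_omega / (1.0 - m_empty)
```

The mass left on the whole frame quantifies ambiguity; a pattern lying far from all of its neighbors keeps most of its mass on the frame, which is what supports distance rejection.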


Systems, Man and Cybernetics | 2000

A neural network classifier based on Dempster-Shafer theory

Thierry Denoeux

A new adaptive pattern classifier based on the Dempster-Shafer theory of evidence is presented. This method uses reference patterns as items of evidence regarding the class membership of each input pattern under consideration. This evidence is represented by basic belief assignments (BBA) and pooled using Dempster's rule of combination. This procedure can be implemented in a multilayer neural network with a specific architecture consisting of one input layer, two hidden layers and one output layer. The weight vector, the receptive field and the class membership of each prototype are determined by minimizing the mean squared differences between the classifier outputs and target values. After training, the classifier computes for each input vector a BBA that provides a description of the uncertainty pertaining to the class of the current pattern, given the available evidence. This information may be used to implement various decision rules allowing for ambiguous pattern rejection and novelty detection. The outputs of several classifiers may also be combined in a sensor fusion context, yielding decision procedures which are very robust to sensor failures or changes in the system environment. Experiments with simulated and real data demonstrate the excellent performance of this classification scheme as compared to existing statistical and neural network techniques.


Systems, Man and Cybernetics | 1998

An evidence-theoretic k-NN rule with parameter optimization

Lalla Merieme Zouhal; Thierry Denoeux

The paper presents a learning procedure for optimizing the parameters in the evidence-theoretic k-nearest neighbor rule, a pattern classification method based on the Dempster-Shafer theory of belief functions. In this approach, each neighbor of a pattern to be classified is considered as an item of evidence supporting certain hypotheses concerning the class membership of that pattern. Based on this evidence, basic belief masses are assigned to each subset of the set of classes. Such masses are obtained for each of the k nearest neighbors of the pattern under consideration and aggregated using Dempster's rule of combination. In many situations, this method was found experimentally to yield lower error rates than other methods using the same information. However, the problem of tuning the parameters of the classification rule had so far remained unresolved. The authors determine optimal or near-optimal parameter values from the data by minimizing an error function. This refinement of the original method is shown experimentally to result in a substantial improvement of classification accuracy.
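
A rough sketch of the tuning idea, reusing the evidential_knn sketch above together with a generic optimizer; the leave-one-out squared-error criterion and the log-parameterization of the scale parameter are simplifications assumed here, not the paper's exact procedure.

```python
import numpy as np
from scipy.optimize import minimize

def tuning_error(params, X, y, n_classes, k=5):
    """Mean squared difference between the classifier output and 0/1 class targets,
    accumulated over a leave-one-out pass (a simplified error function)."""
    gamma = np.exp(params[0])                  # keep the scale parameter positive
    idx = np.arange(len(X))
    err = 0.0
    for i in idx:
        mask = idx != i
        m, m_omega = evidential_knn(X[i], X[mask], y[mask], n_classes, k=k, gamma=gamma)
        output = m + m_omega / n_classes       # share the "don't know" mass equally
        target = np.eye(n_classes)[y[i]]
        err += np.sum((output - target) ** 2)
    return err / len(X)

# Usage sketch: res = minimize(tuning_error, x0=[0.0], args=(X, y, 2), method="Nelder-Mead")
# gamma_opt = np.exp(res.x[0])
```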


Systems, Man and Cybernetics | 2004

EVCLUS: evidential clustering of proximity data

Thierry Denoeux; Marie-Hélène Masson

A new relational clustering method is introduced, based on the Dempster-Shafer theory of belief functions (or evidence theory). Given a matrix of dissimilarities between n objects, this method, referred to as evidential clustering (EVCLUS), assigns a basic belief assignment (or mass function) to each object in such a way that the degree of conflict between the masses given to any two objects reflects their dissimilarity. A notion of credal partition is introduced, which subsumes those of hard, fuzzy, and possibilistic partitions and allows deeper insight into the structure of the data. Experiments with several sets of real data demonstrate the good performance of the proposed method as compared with several state-of-the-art relational clustering techniques.
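
The central idea, making pairwise conflict track dissimilarity, can be sketched as a stress minimization. The focal-set choice, the softmax parameterization and the quadratic stress below are illustrative assumptions, not the published algorithm.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

def conflict(m1, m2, focal_sets):
    """Degree of conflict between two mass functions: total mass assigned to pairs
    of focal sets with empty intersection."""
    k = 0.0
    for (A, a), (B, b) in itertools.product(zip(focal_sets, m1), zip(focal_sets, m2)):
        if not (A & B):
            k += a * b
    return k

def evclus_stress(theta, D, focal_sets, n):
    """EVCLUS-style stress: squared mismatch between pairwise conflicts and the
    entries of a dissimilarity matrix D rescaled to [0, 1]."""
    M = np.exp(theta.reshape(n, len(focal_sets)))
    M /= M.sum(axis=1, keepdims=True)          # softmax -> valid mass functions
    return sum((conflict(M[i], M[j], focal_sets) - D[i, j]) ** 2
               for i in range(n) for j in range(i + 1, n))

# Frame with two clusters; focal sets are {1}, {2} and the whole frame {1, 2}.
focal_sets = [frozenset({1}), frozenset({2}), frozenset({1, 2})]
# Usage sketch, for an n x n dissimilarity matrix D:
# res = minimize(evclus_stress, np.zeros(n * len(focal_sets)), args=(D, focal_sets, n))
# M = np.exp(res.x.reshape(n, -1)); M /= M.sum(axis=1, keepdims=True)   # credal partition
```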


Pattern Recognition | 1997

Analysis of evidence-theoretic decision rules for pattern classification

Thierry Denoeux

The Dempster-Shafer theory provides a convenient framework for decision making based on very limited or weak information. Such situations typically arise in pattern recognition problems when patterns have to be classified based on a small number of training vectors, or when the training set does not contain samples from all classes. This paper examines different strategies that can be applied in this context to reach a decision (e.g. assignment to a class or rejection), provided the possible consequences of each action can be quantified. The corresponding decision rules are analysed under different assumptions concerning the completeness of the training set. These approaches are then demonstrated using real data.
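
For a mass function whose focal sets are the singletons and the whole frame, the standard evidence-theoretic decision criteria are easy to write down. The sketch below uses made-up costs, with rejection modeled as an extra action of constant cost; it illustrates the general idea rather than the paper's specific rules.

```python
import numpy as np

def decision_costs(m_singletons, m_omega, costs):
    """Lower, upper and pignistic expected costs of each action, for a mass function
    with focal sets {q} (singletons) and Omega. costs[a, q] is the cost of action a
    when the true class is q."""
    lower = costs @ m_singletons + m_omega * costs.min(axis=1)   # optimistic
    upper = costs @ m_singletons + m_omega * costs.max(axis=1)   # pessimistic
    pignistic = costs @ (m_singletons + m_omega / len(m_singletons))
    return lower, upper, pignistic

# Illustrative numbers: two classes, 0/1 misclassification costs, rejection cost 0.3.
costs = np.array([[0.0, 1.0],    # assign to class 0
                  [1.0, 0.0],    # assign to class 1
                  [0.3, 0.3]])   # reject
lower, upper, pignistic = decision_costs(np.array([0.5, 0.2]), 0.3, costs)
print(pignistic.argmin())        # action minimizing the pignistic expected cost
```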


Systems, Man and Cybernetics | 2006

Classification Using Belief Functions: Relationship Between Case-Based and Model-Based Approaches

Thierry Denoeux; Philippe Smets

The transferable belief model (TBM) is a model to represent quantified uncertainties based on belief functions, unrelated to any underlying probability model. In this framework, two main approaches to pattern classification have been developed: the TBM model-based classifier, relying on the general Bayesian theorem (GBT), and the TBM case-based classifier, built on the concept of similarity of a pattern to be classified with training patterns. Until now, these two methods seemed unrelated, and their connection with standard classification methods was unclear. This paper shows that both methods actually proceed from the same underlying principle, i.e., the GBT, and that they essentially differ by the nature of the assumed available information. This paper also shows that both methods collapse to a kernel rule in the case of precise and categorical learning data and for certain initial assumptions, and a simple relationship between basic belief assignments produced by the two methods is exhibited in a special case. These results shed new light on the issues of classification and supervised learning in the TBM. They also suggest new research directions and may help users in selecting the most appropriate method for each particular application, depending on the nature of the information at hand.


IEEE Transactions on Fuzzy Systems | 2004

Principal component analysis of fuzzy data using autoassociative neural networks

Thierry Denoeux; Marie-Hélène Masson

This paper describes an extension of principal component analysis (PCA) allowing the extraction of a limited number of relevant features from high-dimensional fuzzy data. Our approach exploits the ability of linear autoassociative neural networks to perform information compression in just the same way as PCA, without explicit matrix diagonalization. Fuzzy input values are propagated through the network using fuzzy arithmetic, and the weights are adjusted to minimize a suitable error criterion, the inputs being taken as target outputs. The concept of correlation coefficient is extended to fuzzy numbers, allowing the interpretation of the new features in terms of the original variables. Experiments with artificial and real sensory evaluation data demonstrate the ability of our method to provide concise representations of complex fuzzy data.
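
The propagation step can be illustrated with plain interval arithmetic, treating one alpha-cut of each fuzzy input as an interval pushed through a linear autoassociative (bottleneck) network. The random weights below are placeholders; in the method they would instead be adjusted to minimize a reconstruction error between the network outputs and the fuzzy inputs.

```python
import numpy as np

def interval_linear(lo, hi, W, b):
    """Propagate interval inputs [lo, hi] through a linear layer y = W x + b using
    interval arithmetic (this handles a single alpha-cut of a fuzzy number)."""
    Wp, Wn = np.clip(W, 0, None), np.clip(W, None, 0)
    return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

# Linear autoassociative net: encode 5 fuzzy variables into 2 features, then decode.
rng = np.random.default_rng(0)
W_enc, W_dec = rng.normal(size=(2, 5)), rng.normal(size=(5, 2))
lo = np.array([0.9, 1.0, 0.4, 0.1, 0.2])
hi = np.array([1.1, 1.2, 0.6, 0.3, 0.4])
z_lo, z_hi = interval_linear(lo, hi, W_enc, np.zeros(2))      # compressed (fuzzy) features
x_lo, x_hi = interval_linear(z_lo, z_hi, W_dec, np.zeros(5))  # interval reconstruction
```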


Systems, Man and Cybernetics | 2006

Risk assessment based on weak information using belief functions: a case study in water treatment

Sabrina Démotier; Walter Schön; Thierry Denoeux

Whereas probability theory has been very successful as a conceptual framework for risk analysis in many areas where a lot of experimental data and expert knowledge are available, it presents certain limitations in applications where only weak information can be obtained. One such application investigated in this paper is water treatment, a domain in which key information such as input water characteristics and failure rates of various chemical processes is often lacking. An approach to handle such problems is proposed, based on the Dempster-Shafer theory of belief functions. Belief functions are used to describe expert knowledge of treatment process efficiency, failure rates, and latency times, as well as statistical data regarding input water quality. Evidential reasoning provides mechanisms to combine this information and assess the plausibility of various noncompliance scenarios. This methodology is shown to boil down to the probabilistic one where data of sufficient quality are available. This case study shows that belief function theory may be considered as a valuable framework for risk analysis studies in ill-structured or poorly informed application domains.


International Journal of Approximate Reasoning | 2011

Ensemble clustering in the belief functions framework

Marie-Hélène Masson; Thierry Denoeux

In this paper, belief functions, defined on the lattice of intervals of partitions of a set of objects, are investigated as a suitable framework for combining multiple clusterings. We first show how to represent clustering results as masses of evidence allocated to sets of partitions. Then a consensus belief function is obtained using a suitable combination rule. Tools for synthesizing the results are also proposed. The approach is illustrated using synthetic and real data sets.


Knowledge-Based Systems | 2015

EK-NNclus

Thierry Denoeux; Orakanya Kanjanatarakul; Songsak Sriboonchitta

We propose a new clustering algorithm based on the evidential K nearest-neighbor (EK-NN) rule. Starting from an initial partition, the algorithm, called EK-NNclus, iteratively reassigns objects to clusters using the EK-NN rule, until a stable partition is obtained. After convergence, the cluster membership of each object is described by a Dempster-Shafer mass function assigning a mass to each cluster and to the whole set of clusters. The mass assigned to the set of clusters can be used to identify outliers. The method can be implemented in a competitive Hopfield neural network, whose energy function is related to the plausibility of the partition. The procedure can thus be seen as searching for the most plausible partition of the data. The EK-NNclus algorithm can be set up to depend on two parameters, the number K of neighbors and a scale parameter, which can be fixed using simple heuristics. The number of clusters does not need to be determined in advance. Numerical experiments with a variety of datasets show that the method generally performs better than density-based and model-based procedures for finding a partition with an unknown number of clusters.
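
A much-simplified sketch of the iteration: hard relabelling driven by distance-decayed neighbor votes rather than the full evidential and Hopfield-network formulation, but it shows how the number of clusters emerges from the data. The function name, parameters and the heuristic for the scale parameter are assumptions for illustration.

```python
import numpy as np

def ek_nnclus_sketch(X, k=10, gamma=None, n_iter=50, seed=0):
    """Start from one cluster per object and repeatedly reassign each object to the
    label best supported by its k nearest neighbors (weights decaying with squared
    distance) until the partition no longer changes."""
    rng = np.random.default_rng(seed)
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]
    d_nn = np.take_along_axis(d, nn, axis=1)
    if gamma is None:
        gamma = 1.0 / np.median(d_nn) ** 2     # simple scale heuristic
    w = np.exp(-gamma * d_nn ** 2)

    labels = np.arange(n)                      # initial partition: all singletons
    for _ in range(n_iter):
        changed = False
        for i in rng.permutation(n):
            votes = {}
            for j, wj in zip(nn[i], w[i]):
                votes[labels[j]] = votes.get(labels[j], 0.0) + wj
            best = max(votes, key=votes.get)
            if best != labels[i]:
                labels[i], changed = best, True
        if not changed:
            break
    return np.unique(labels, return_inverse=True)[1]   # consecutive cluster labels
```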

Collaboration


Dive into Thierry Denoeux's collaborations.

Top Co-Authors

Marie-Hélène Masson
Centre national de la recherche scientifique

Benjamin Quost
Centre national de la recherche scientifique

Franck Davoine
Centre national de la recherche scientifique

Fahed Abdallah
Centre national de la recherche scientifique

Véronique Cherfaoui
Centre national de la recherche scientifique