
Publications


Featured research published by Eric Lefevre.


Decision Support Systems | 2013

How to preserve the conflict as an alarm in the combination of belief functions

Eric Lefevre; Zied Elouedi

In the belief function framework, a combination rule induces a unique function that synthesizes all the knowledge carried by the initial belief functions. When the information sources are reliable and independent, the conjunctive rule of combination proposed by Smets may be used. This rule is equivalent to Dempster's rule without the normalization step. The conjunctive combination has interesting properties, such as commutativity and associativity. However, the empty set, also called the conflict, is an absorbing element for it. Consequently, after a significant number of conjunctive combinations, the mass assigned to the conflict tends to 1, which makes it impossible to distinguish a genuine problem arising during the fusion from the mere absorption power of the empty set. The objective of this paper is therefore to define a formalism that preserves the initial role of the conflict as an alarm signal announcing a disagreement between sources. More precisely, the formalism preserves, after the fusion, only the part of the conflict that reflects the opposition between the belief functions. This approach is based on dissimilarity measures and on a normalization process between belief functions. The proposed formalism is tested and compared with the conjunctive rule of combination on synthetic belief functions.

Highlights:
- In belief function theory, one of the main combination rules is the conjunctive rule.
- With this rule, a series of combinations results in a mass equal to 1 on the conflict.
- In this case, it is impossible to identify a potential problem in the fusion process.
- The proposed method allows us to keep the real opposition between belief functions.
- Using this approach, the conflict regains its initial role of alarm.
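To make the absorption effect concrete, the sketch below (illustrative Python, not the authors' code) combines a mass function repeatedly with a mildly conflicting one under Smets' unnormalized conjunctive rule and prints the mass accumulating on the empty set.

```python
from itertools import product

def conjunctive(m1, m2):
    """Smets' unnormalized conjunctive rule; the empty set keeps conflicting mass."""
    out = {}
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        C = A & B  # intersection of focal sets; empty when the sources disagree
        out[C] = out.get(C, 0.0) + v1 * v2
    return out

# Two mildly conflicting sources on the frame {a, b}.
m_a = {frozenset('a'): 0.6, frozenset('ab'): 0.4}
m_b = {frozenset('b'): 0.6, frozenset('ab'): 0.4}

m = m_a
for _ in range(10):  # repeated combination with the same piece of evidence
    m = conjunctive(m, m_b)
    print(round(m.get(frozenset(), 0.0), 4))  # mass on the conflict grows toward 1
```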


International Journal of Approximate Reasoning | 2012

Belief functions contextual discounting and canonical decompositions

David Mercier; Eric Lefevre; François Delmotte

In this article, the contextual discounting of a belief function, a generalization of classical discounting, is extended, and its particular link with the canonical disjunctive decomposition is highlighted. A general family of correction mechanisms that allows one to weaken the information provided by a source is then introduced, as well as the dual family that allows one to strengthen a belief function.
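For readers unfamiliar with the operation being generalized, here is a minimal sketch of classical (non-contextual) discounting; beta, the assumed source reliability, and all names are illustrative, and the paper's contextual extension and canonical decompositions are not shown.

```python
def discount(m, frame, beta):
    """Classical discounting: move (1 - beta) of every mass to the whole frame."""
    out = {A: beta * v for A, v in m.items()}
    omega = frozenset(frame)
    out[omega] = out.get(omega, 0.0) + (1.0 - beta)
    return out

# A source judged 80% reliable is weakened toward total ignorance.
m = {frozenset('a'): 0.7, frozenset('ab'): 0.3}
print(discount(m, 'ab', beta=0.8))  # {'a'}: 0.56, {'a','b'}: 0.44
```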


IEEE Transactions on Dielectrics and Electrical Insulation | 2009

Robust diagnostics of stator insulation based on high frequency resonances measurements

F. Perisse; David Mercier; Eric Lefevre; D. Roger

Stator insulation breakdown is a major cause of AC machine failures. Ground insulation faults are easily detected by classical systems based on leakage current measurements; turn-to-turn insulation degradation, however, is more difficult to detect. For large machines, on-line methods based on partial discharge detection and analysis give good results, but they cannot be used for low-voltage machines fed by adjustable speed drives (ASD). Some of the authors have previously shown that it is possible to estimate the aging of an AC machine winding from HF measurements of current or magnetic field. In this paper, it is proposed to exploit all these different estimations jointly to obtain a more robust and reliable diagnosis. The merging of the different estimations is realized within the belief function framework, and the approach is tested on real measurements.
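As a toy illustration of merging several estimations in the belief function framework (a generic Dempster-combination demo, not the authors' pipeline), consider three aging estimations over the states {low, high}:

```python
from itertools import product
from functools import reduce

def dempster(m1, m2):
    """Normalized (Dempster) combination of two mass functions."""
    out, conflict = {}, 0.0
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            out[C] = out.get(C, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    k = 1.0 - conflict  # renormalize by the non-conflicting mass
    return {A: v / k for A, v in out.items()}

low, high, both = frozenset({'low'}), frozenset({'high'}), frozenset({'low', 'high'})
estimations = [
    {high: 0.5, both: 0.5},           # e.g. an HF current estimation
    {high: 0.6, both: 0.4},           # e.g. a magnetic-field estimation
    {low: 0.2, high: 0.4, both: 0.4}, # a weaker, partly dissenting estimation
]
fused = reduce(dempster, estimations)
print(max(fused, key=fused.get))  # the jointly most supported aging hypothesis
```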


International Conference on Tools with Artificial Intelligence | 2010

Discountings of a Belief Function Using a Confusion Matrix

Zied Elouedi; Eric Lefevre; David Mercier

In this paper, we present an analysis of different approaches to the correction of belief functions based on the results given by a confusion matrix. Three mechanisms based on discounting are detailed. These methods aim to assess the discounting rates to be assigned to a source of information. The discounting rates allow raw data to be corrected on the basis of the learnt decisions given by the confusion matrix. The corrections differ according to whether classical, contextual, or distance-based discounting is used. An illustrative example is presented to highlight the interest of these adjustments and to show the differences between them. We carry out experiments on real databases to analyze and interpret these adjustment approaches.
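One simple way to instantiate such rates, shown below as an assumption rather than the paper's exact formulas, is to read a classical rate off the overall accuracy of the confusion matrix and contextual rates off the per-class recalls.

```python
import numpy as np

cm = np.array([[50,  5,  5],
               [10, 40, 10],
               [ 0,  5, 55]])  # rows: true class, columns: decisions of the source

classical_reliability = np.trace(cm) / cm.sum()           # overall accuracy
contextual_reliability = cm.diagonal() / cm.sum(axis=1)   # recall per true class

print(classical_reliability)   # 0.8055...
print(contextual_reliability)  # [0.833..., 0.666..., 0.9166...]
```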


Information Fusion | 2014

Object tracking and credal classification with kinematic data in a multi-target context

Samir Hachour; François Delmotte; David Mercier; Eric Lefevre

This article proposes a method to classify multiple maneuvering targets at the same time. This task is much harder than classifying a single target, as sensors do not know how to assign the captured observations to the known targets. This article extends previous results scattered in the literature and unifies them in a single global framework based on belief functions. Two examples show that the full algorithm using belief functions improves on results obtained with standard Bayesian classifiers and that it can be applied to a large variety of applications.
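The assignment difficulty mentioned above is often handled, as a baseline outside the credal framework of the paper, by a global nearest-neighbour assignment on a kinematic cost matrix; a minimal sketch:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

tracks = np.array([[0.0, 0.0], [10.0, 10.0]])        # predicted target positions
observations = np.array([[9.5, 10.2], [0.3, -0.1]])  # sensor detections

# Pairwise Euclidean distances between every track and every observation.
cost = np.linalg.norm(tracks[:, None, :] - observations[None, :, :], axis=2)
rows, cols = linear_sum_assignment(cost)  # minimal total-distance assignment
for t, o in zip(rows, cols):
    print(f"target {t} <- observation {o}")
```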


Expert Systems With Applications | 2012

Evidential calibration process of multi-agent based system

Alexandre Veremme; Eric Lefevre; Gildas Morvan; Daniel Dupont; Daniel Jolly

Highlights:
- We present a problem of calibration and validation of multi-agent-based simulations.
- Simulation validation consists in measuring how close the simulation is to reality.
- Before validation, calibration is the process of defining the model parameters.
- We employ belief function theory to deal with the imperfect data used in validation and calibration.
- The global process is illustrated with an application to forensic entomology.

Forensic entomology consists, during a criminal investigation, in studying the insects found on a cadaver to estimate the time of death. It is the only technique that can be used for a large post-mortem interval. However, because of the high complexity of the system, the results given by the expert are imperfect. In this paper, a decision support system (DSS) is developed that takes into account all the ecosystemic parameters and a significant number of biological models. The proposed DSS is based on belief function theory to validate and calibrate agent-based simulations. First results of this architecture are presented within the framework of a real forensic examination.
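A toy view of the calibrate-then-validate loop described in the highlights (purely illustrative: the simulate function is hypothetical, and the crisp squared-error score stands in for the paper's belief-function treatment of imperfect data):

```python
def simulate(growth_rate, days=10):
    """Hypothetical stand-in for an agent-based insect-development model."""
    size = 1.0
    for _ in range(days):
        size *= growth_rate
    return size

observed = 6.0                          # a measurement from the real system
candidates = [1.10, 1.15, 1.20, 1.25]   # parameter values to calibrate over
best = min(candidates, key=lambda r: (simulate(r) - observed) ** 2)
print(best)  # the calibrated parameter, later checked in a validation step
```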


International Conference on Artificial Intelligence | 2011

Handling partial preferences in the belief AHP method: application to life cycle assessment

Amel Ennaceur; Zied Elouedi; Eric Lefevre

This paper proposes a novel multi-criteria decision-making method under uncertainty that combines the Analytic Hierarchy Process (AHP) with belief function theory. Our method, named belief AHP, allows the expert to express incomplete and imprecise information about groups of alternatives instead of single ones. To judge the importance of criteria, the expert can likewise express opinions on groups of criteria. The resulting uncertainty is then taken into account in the final decision. Finally, the paper also addresses a real application problem dealing with the life cycle assessment of PVC.
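For context, the sketch below computes classical crisp AHP priorities from a pairwise comparison matrix with the common geometric-mean method; the belief AHP of the paper extends this baseline to groups of criteria and alternatives, which is not shown here.

```python
import numpy as np

# Saaty-style pairwise comparisons of three criteria (illustrative values).
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

gm = P.prod(axis=1) ** (1.0 / P.shape[0])  # geometric mean of each row
weights = gm / gm.sum()                    # normalized priority vector
print(weights.round(3))                    # roughly [0.648, 0.230, 0.122]
```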


Information Fusion | 2008

Improvement of an association algorithm for obstacle tracking

Yann Lemeret; Eric Lefevre; Daniel Jolly

This article describes a modification of an association algorithm for object tracking based on evidence theory. The association algorithm was first developed by Rombaut and subsequently improved in a general way by Gruyer. It is modified here to obtain better results when data reliability is poor. The article presents the basic concepts of evidence theory; then the association algorithm developed by Rombaut is explained, and examples are given showing that it fails to make the proper decision when data reliability decreases. Finally, the new algorithm is presented and the two algorithms are compared on synthetic data. To test their robustness, both algorithms were also run on real data from a CCD camera; these data are very noisy, with a reliability ranging from good to very bad.


Knowledge and Systems Engineering | 2014

Mining Frequent Itemsets in Evidential Database

Ahmed Samet; Eric Lefevre; Sadok Ben Yahia

Mining frequent patterns is widely used to discover knowledge in a database. It was originally applied to the Market Basket Analysis (MBA) problem, which is represented by Boolean databases: in those databases, only the existence of an article (item) in a transaction is recorded. In real-world applications, however, the gathered information generally suffers from imperfections. A piece of information may in fact carry two types of imperfection: imprecision and uncertainty. Recently, a new kind of database representing and integrating these two types of imperfection was introduced: the evidential database. Only a few works have tackled such databases from a data mining point of view. In this work, we discuss the support of evidential itemsets. We improve the complexity of state-of-the-art methods for support estimation, and we introduce a new support measure that combines speed and precision. The proposed methods are tested on several constructed evidential databases and show performance improvements.
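As a sketch of what an itemset support could look like in an evidential database, the example below averages, over transactions, the pignistic probability that an item is present; this particular definition is an assumption for illustration, as the paper discusses and improves several support measures.

```python
def betp(m, item):
    """Pignistic probability of a single item under mass function m."""
    return sum(v / len(A) for A, v in m.items() if item in A)

# Two transactions over the items {a, b}; each is a mass function over itemsets.
db = [
    {frozenset('a'): 0.8, frozenset('ab'): 0.2},
    {frozenset('b'): 0.5, frozenset('ab'): 0.5},
]

support_a = sum(betp(m, 'a') for m in db) / len(db)
print(support_a)  # (0.9 + 0.25) / 2 = 0.575
```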


International Conference on Information Processing | 2014

Classification with Evidential Associative Rules

Ahmed Samet; Eric Lefevre; Sadok Ben Yahia

Mining databases provides valuable information such as frequent patterns and, especially, associative rules. Associative rules have various applications and assets, chiefly data classification. The appearance of new and complex data supports, such as evidential databases, has led to the redefinition of methods to extract pertinent rules. In this paper, we propose a new approach for pertinent rule extraction based on a redefinition of the confidence measure. The confidence measure is grounded in conditional probability and is consistent with previous works. We also propose a classification approach that combines evidential associative rules within an information fusion system. The proposed methods are thoroughly evaluated on several constructed evidential databases and show performance improvements.
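For reference, the confidence of an associative rule X -> Y is the conditional support supp(X and Y) / supp(X); the toy below computes it on a crisp database, whereas the paper redefines it on evidential supports.

```python
# A crisp toy database of four transactions.
transactions = [
    {'a', 'b'}, {'a', 'b', 'c'}, {'a', 'c'}, {'b', 'c'},
]

def supp(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

conf = supp({'a', 'b'}) / supp({'a'})  # confidence of the rule a -> b
print(conf)  # 2/3: among transactions containing a, two thirds also contain b
```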

Collaboration


Dive into Eric Lefevre's collaboration.

Top Co-Authors

Zied Elouedi

Institut Supérieur de Gestion

Amel Ennaceur

Institut Supérieur de Gestion

Asma Trabelsi

Institut Supérieur de Gestion
