Dominic Grenier
Laval University
Publications
Featured research published by Dominic Grenier.
Information Fusion | 2001
Anne-Laure Jousselme; Dominic Grenier; Eloi Bosse
We present a measure of performance (MOP) for identification algorithms based on the Dempster–Shafer theory of evidence. As an MOP, we introduce a principled distance between two basic probability assignments (BPAs) (or two bodies of evidence) based on a quantification of the similarity between sets. We give a geometrical interpretation of BPAs and show that the proposed distance satisfies all the requirements for a metric. We also show the link with the quantification of Dempster's weight of conflict proposed by George and Pal. We compare this MOP to the one described by Fixsen and Mahler and illustrate the behaviors of the two MOPs with numerical examples.
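The distance introduced in this paper is commonly written as d(m1, m2) = sqrt(0.5 (m1 − m2)ᵀ D (m1 − m2)), where D(A, B) = |A ∩ B| / |A ∪ B| is the Jaccard similarity between focal sets. Below is a minimal Python sketch of that computation; the bitmask encoding of focal sets and the example BPAs are illustrative choices of mine, not taken from the paper.

```python
import numpy as np

def jousselme_distance(m1, m2):
    """d(m1, m2) = sqrt(0.5 * (v1 - v2)^T D (v1 - v2)), where v1, v2 are
    the mass vectors over all focal sets involved and D(A, B) is the
    Jaccard similarity |A & B| / |A | B|.  Focal sets are encoded as
    integer bitmasks over the frame, an encoding chosen here for brevity."""
    sets = sorted(set(m1) | set(m2))
    v1 = np.array([m1.get(s, 0.0) for s in sets])
    v2 = np.array([m2.get(s, 0.0) for s in sets])
    D = np.array([[bin(a & b).count("1") / bin(a | b).count("1")
                   for b in sets] for a in sets])
    diff = v1 - v2
    return float(np.sqrt(0.5 * diff @ D @ diff))

# frame {x, y}: 0b01 = {x}, 0b10 = {y}, 0b11 = {x, y}
mA = {0b01: 0.8, 0b11: 0.2}   # mostly committed to {x}
mB = {0b10: 0.8, 0b11: 0.2}   # mostly committed to {y}
print(jousselme_distance(mA, mB))   # 0.8
```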
IEEE Transactions on Systems, Man, and Cybernetics | 2006
Anne-Laure Jousselme; Chunsheng Liu; Dominic Grenier; Eloi Bosse
In the framework of evidence theory, ambiguity is a general term proposed by Klir and Yuan in 1995 to gather the two types of uncertainty coexisting in this theory: discord and nonspecificity. Several ways of quantifying the total uncertainty, i.e., the ambiguity of a belief function, have been proposed, each respecting the five requirements for total measures of uncertainty in evidence theory. Among them is a measure of aggregate uncertainty, called AU, that captures both types of uncertainty in an aggregate fashion. But some shortcomings of AU have been identified: 1) it is complicated to compute; 2) it is highly insensitive to changes in evidence; and 3) it hides the distinction between the two types of uncertainty that coexist in every theory of imprecise probabilities. To overcome these shortcomings, Klir and Smith defined the TU1 measure, a linear combination of the AU measure and the nonspecificity measure N. But the TU1 measure does not resolve the computational complexity problem, and it introduces a new problem: the choice of the linear parameter δ. In this paper, an alternative to AU for quantifying the ambiguity of belief functions is proposed. This measure, called the Ambiguity Measure (AM), satisfies all the requirements for general measures and also overcomes some of the shortcomings of the AU measure. Indeed, AM overcomes the limitations of AU by: 1) reducing computational complexity; 2) being sensitive to changes in evidence; and 3) better distinguishing discord from nonspecificity. Moreover, AM is a special case of TU1 that does not need the parameter δ.
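AM is defined as the Shannon entropy of the pignistic probability induced by the belief function: AM(m) = −Σ BetP(θ) log2 BetP(θ), with BetP(θ) = Σ over focal sets A containing θ of m(A)/|A|. A short Python sketch follows; the frame and BPA in the example are illustrative choices, not from the paper.

```python
import math

def pignistic(m, frame):
    """BetP(x) = sum over focal sets A containing x of m(A) / |A|.
    Focal sets are frozensets of elements of the frame."""
    bet = {x: 0.0 for x in frame}
    for A, mass in m.items():
        for x in A:
            bet[x] += mass / len(A)
    return bet

def ambiguity_measure(m, frame):
    """AM(m): Shannon entropy (in bits) of the pignistic distribution."""
    bet = pignistic(m, frame)
    return -sum(p * math.log2(p) for p in bet.values() if p > 0)

# illustrative frame and BPA (not from the paper)
frame = {"a", "b", "c"}
m = {frozenset({"a"}): 0.5, frozenset({"a", "b", "c"}): 0.5}
print(ambiguity_measure(m, frame))   # entropy of BetP = (2/3, 1/6, 1/6)
```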
Information Fusion | 2009
Mihai Cristian Florea; Anne-Laure Jousselme; Éloi Bossé; Dominic Grenier
Dempster's rule of combination in evidence theory is a powerful tool for reasoning under uncertainty. Since Zadeh highlighted the counter-intuitive behaviour of Dempster's rule, a plethora of alternative combination rules have been proposed. In this paper, we propose a general formulation for combination rules in evidence theory as a weighted sum of the conjunctive and disjunctive rules. Moreover, with the aim of automatically accounting for the reliability of sources of information, we propose a class of robust combination rules (RCR) in which the weights are a function of the conflict between two pieces of information. The weight of conflict between two BPAs is interpreted as an indicator of the relative reliability of the sources: if the conflict is low, then both sources are reliable, and if the conflict is high, then at least one source is unreliable. We show some interesting properties satisfied by the RCRs, such as positive belief reinforcement and the neutral impact of vacuous belief, and establish links with other classes of rules. The behaviour of the RCRs over non-exhaustive frames of discernment is also studied, as the RCRs implicitly perform a kind of automatic deconditioning through the simple use of the disjunctive operator. We focus our study on two special cases: (1) RCR-S, a rule with symmetric coefficients that is proved to be unique, and (2) RCR-L, a rule with asymmetric coefficients based on a logarithmic function. Their behaviours are then compared to some classical combination rules proposed thus far in the literature, on a few examples and in Monte Carlo simulations.
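A minimal sketch of the weighted-sum idea: compute the conjunctive rule (intersections of focal sets), the disjunctive rule (unions), and the conflict k, then mix the two with conflict-dependent weights. The coefficients used below, α(k) = k/(1 − k + k²) on the disjunctive part and β(k) = (1 − k)/(1 − k + k²) on the conjunctive part, are my reading of the symmetric case and should be treated as an assumption, though they do normalize the result.

```python
from itertools import product

def conj_disj(m1, m2):
    """Conjunctive and disjunctive combinations of two BPAs, plus their
    conflict k (the conjunctive mass falling on the empty set).
    Focal sets are frozensets; each BPA's masses sum to 1."""
    conj, disj, k = {}, {}, 0.0
    for (B, mB), (C, mC) in product(m1.items(), m2.items()):
        w = mB * mC
        if B & C:
            conj[B & C] = conj.get(B & C, 0.0) + w
        else:
            k += w
        disj[B | C] = disj.get(B | C, 0.0) + w
    return conj, disj, k

def rcr_s(m1, m2):
    """Conflict-weighted mixture of the two rules.  The coefficient choice
    below is an assumption (see lead-in); k = 0 recovers the conjunctive
    rule and k = 1 gives a pure disjunctive combination."""
    conj, disj, k = conj_disj(m1, m2)
    alpha, beta = k / (1 - k + k**2), (1 - k) / (1 - k + k**2)
    out = {}
    for A, v in conj.items():
        out[A] = out.get(A, 0.0) + beta * v
    for A, v in disj.items():
        out[A] = out.get(A, 0.0) + alpha * v
    return out

# two highly conflicting sources: most of the mass moves to the union
m1 = {frozenset({"x"}): 0.9, frozenset({"x", "y"}): 0.1}
m2 = {frozenset({"y"}): 0.9, frozenset({"x", "y"}): 0.1}
print(rcr_s(m1, m2))
```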
IEEE Transactions on Image Processing | 1995
Haiqing Wu; Dominic Grenier; Gilles Y. Delisle; Da-Gang Fang
In inverse synthetic aperture radar (ISAR) imaging, the target rotational motion with respect to the radar line of sight contributes to the imaging ability, whereas the translational motion must be compensated out. This paper presents a novel two-step approach to translational motion compensation using an adaptive range tracking method for range bin alignment and a recursive multiple-scatterer algorithm (RMSA) for signal phase compensation. The initial step of RMSA is equivalent to the dominant-scatterer algorithm (DSA). An error-compensating point source is then recursively synthesized from the selected range bins, where each contains a prominent scatterer. Since the clutter-induced phase errors are reduced by phase averaging, the image speckle noise can be reduced significantly. Experimental data processing for a commercial aircraft and computer simulations confirm the validity of the approach.
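As a rough illustration of the phase-compensation step: dominant-scatterer-style algorithms select range bins whose amplitude history is unusually stable, treat their common phase history as the translational phase error, and remove it from all bins. The sketch below is a simplified single-pass version under my own assumptions (range-aligned complex data, a normalized-variance prominence criterion); the paper's RMSA recursively synthesizes the reference point source rather than simply averaging.

```python
import numpy as np

def prominent_scatterer_compensation(data, n_bins=5):
    """data: complex array of shape (range_bins, pulses), after range-bin
    alignment.  Selects the n_bins range bins with the lowest normalized
    amplitude variance (suggesting a single prominent scatterer), averages
    their unit-magnitude phase histories into a reference, and removes
    that common phase from every range bin."""
    amp = np.abs(data)
    nvar = amp.var(axis=1) / (amp.mean(axis=1) ** 2 + 1e-12)
    best = np.argsort(nvar)[:n_bins]
    ref = (data[best] / (np.abs(data[best]) + 1e-12)).mean(axis=0)
    ref /= np.abs(ref) + 1e-12          # renormalize to unit magnitude
    return data * np.conj(ref)[None, :]  # cancel the common phase error
```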
Fuzzy Sets and Systems | 2008
Mihai Cristian Florea; Anne-Laure Jousselme; Dominic Grenier; Eloi Bosse
With the recent rise of numerous theories for dealing with uncertain pieces of information, the problem of connecting different frameworks has become an issue. In particular, questions such as how to combine fuzzy sets with belief functions or probability measures often emerge. The alternative is either to define transformations between theories, or to use a general or unified framework in which all these theories can be framed. Random set theory has been proposed as such a unified framework, in which at least probability theory, evidence theory, possibility theory, and fuzzy set theory can be represented. Whereas the transformations of belief functions or probability distributions into random sets are trivial, the transformations of fuzzy sets or possibility distributions into random sets raise some issues. This paper is concerned with the transformation of fuzzy membership functions into random sets. In practice, this transformation involves the creation of a large number of focal elements (subsets with non-null probability) based on the α-cuts of the fuzzy membership functions. In order to keep the fusion process computationally tractable, the large number of focal elements needs to be reduced by approximation techniques. In this paper, we propose three approximation techniques and compare them to classical approximation techniques used in evidence theory. The quality of the approximations is quantified using a distance between two random sets.
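The underlying α-cut transformation is standard: each distinct membership level α yields a focal element A_α = {x : μ(x) ≥ α}, weighted by the gap to the previous level, giving a nested (consonant) random set. A Python sketch of this base transformation, with an illustrative membership function; the paper's contribution, the approximation of the resulting focal elements, is not shown here.

```python
def fuzzy_to_random_set(mu):
    """Transform a discrete fuzzy membership function into a consonant
    random set via its alpha-cuts.

    mu: dict mapping each element to its membership degree; assumed
    normalized (max membership = 1) so the masses sum to 1.
    Each focal element is an alpha-cut A_alpha = {x : mu(x) >= alpha},
    with mass equal to the gap to the previous distinct level."""
    focal, prev = {}, 0.0
    for alpha in sorted(set(mu.values()) - {0.0}):   # ascending levels
        cut = frozenset(x for x, v in mu.items() if v >= alpha)
        focal[cut] = alpha - prev
        prev = alpha
    return focal

mu = {"small": 1.0, "medium": 0.7, "large": 0.3}
print(fuzzy_to_random_set(mu))
# 0.3 on {small, medium, large}, 0.4 on {small, medium}, 0.3 on {small}
```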
IEEE Transactions on Systems, Man, and Cybernetics | 2007
Chunsheng Liu; Dominic Grenier; Anne-Laure Jousselme; Eloi Bosse
In the theory of evidence, two kinds of uncertainty coexist: nonspecificity and discord. An aggregate uncertainty (AU) measure has been defined to include these two kinds of uncertainty in an aggregate fashion. Meyerowitz et al. proposed an algorithm for calculating AU and validated its practical usage. Although this algorithm was proven correct by Klir and Wierman, in some cases it remains too complex; in fact, when the cardinality of the frame of discernment is very large, it can be impossible to calculate AU. Therefore, based on Klir and Harmanec's seminal work, we give some justifications for restricting the computation of AU(Bel) to the core of the corresponding belief function, and we propose an algorithm to calculate AU(Bel), the F-algorithm, which reduces the computational complexity of the original algorithm of Meyerowitz et al. We prove that this algorithm gives the same results as the algorithm of Meyerowitz et al., and we outline conditions under which it reduces the computational complexity significantly. Moreover, we illustrate the use of the F-algorithm in computing AU in a practical scenario of target identification.
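For context, the algorithm of Meyerowitz et al. computes AU(Bel), the maximum Shannon entropy over all probability distributions consistent with Bel, greedily: repeatedly pick the nonempty subset A maximizing Bel(A)/|A| (largest A on ties), spread that belief uniformly over A, and condition on the remainder. A Python sketch of that greedy scheme under my own encoding choices; note the subset search is exponential in the frame size, which is exactly the cost the paper attacks by restricting attention to the core.

```python
import math
from itertools import combinations

def aggregate_uncertainty(masses, frame):
    """Greedy AU computation in the style of Meyerowitz et al.

    masses: dict frozenset -> mass (the BPA); frame: iterable of elements.
    Restricting the candidate subsets to the core (union of the focal
    elements) is the economy the F-algorithm justifies; for clarity this
    sketch searches the whole frame."""
    def bel(S):
        return sum(m for B, m in masses.items() if B <= S)

    remaining, removed, entropy = frozenset(frame), frozenset(), 0.0
    while remaining:
        elems = list(remaining)
        cands = [frozenset(c) for r in range(1, len(elems) + 1)
                 for c in combinations(elems, r)]
        # conditional belief ratio given what has already been removed
        ratio = lambda S: (bel(S | removed) - bel(removed)) / len(S)
        A = max(cands, key=lambda S: (ratio(S), len(S)))
        p = ratio(A)
        if p > 0:
            entropy -= len(A) * p * math.log2(p)
        removed |= A
        remaining -= A
    return entropy

m = {frozenset({"a", "b"}): 0.6, frozenset({"c"}): 0.4}
print(aggregate_uncertainty(m, {"a", "b", "c"}))  # entropy of (0.3, 0.3, 0.4)
```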
Information Fusion | 2006
Éloi Bossé; Pierre Valin; Anne-Claire Boury-Brisset; Dominic Grenier
The Information Fusion (IF) process is becoming increasingly more sophisticated, particularly through the incorporation of methods for high-level reasoning when applied to the situation analysis domain. A fundamental component of the IF process is a database (or databases) containing a priori knowledge that lists expected objects, behaviors of objects, and relationships between objects as well as all the possible attributes that can be inferred from measurements coming from a given sensor suite. We first present the basic concept of an existing support database (consisting of more than 2200 platforms) for Identity information fusion, and discuss its extension for higher-level fusion (e.g. situation and threat assessment). The database contains all the salient features needed for refining the identity of any target by the fusion of sensor information, and for addressing the situation and threat posed by groups of objects. The database is especially well suited for use in a Dempster-Shafer evidential reasoning scheme although it can also be used with Bayesian reasoning, if a priori probability distributions are known. Convincing results on several realistic scenarios of Maritime Air Area Operations and Direct Fleet Support are presented. This paper then develops the advanced concept of a Knowledge Management and Exploitation Server (KNOWMES) to support the IF process, through the use of ontologies and heterogeneous knowledge sources, which are necessary for higher level fusion.
International Conference on Information Fusion | 2007
Pascal Djiknavorian; Dominic Grenier; Pierre Valin
In the context of electronic support measures, the Dempster-Shafer theory is not flexible enough to yield a clear evaluation of the allegiance of a detected target. With the Dezert-Smarandache theory, the newer theory of plausible, paradoxical, and neutrosophic reasoning, we are able to obtain a clearer assessment. This paper presents our research on these cases.
Canadian Conference on Electrical and Computer Engineering | 1996
Eloi Bosse; J. Roy; Dominic Grenier
This paper presents a discussion of the feasibility and usefulness of data fusion applied to a suite of dissimilar sensors. This suite comprises surveillance radar, a forward-looking infrared (FLIR) sensor, electronic support measures (ESM), an identification friend or foe (IFF) system, a synthetic aperture radar (SAR), acoustic sensors, and a data link (LINK 11). An analysis of applicable sensor fusion processes is presented, followed by a discussion of the expected performance improvements. Finally, a three-step incremental approach with recoverable steps is proposed, in which different levels of fusion sophistication can be implemented based on the availability of the technology and the actual status of the sensors.
Vehicular Technology Conference | 1993
Michel Lecours; Dominic Grenier; M. Baqarhi; S. Cherkaoui
The authors discuss CW measurements and simulations of received signals in a room, in corridors, and around corners at 900 MHz and at three frequencies in the 20–60 GHz band, namely 21.6, 37.2, and 59.6 GHz. Reported measurements include signal transmission with directional antennas in corridors and rooms and around corners, with the transmitting station stationary and the receiving station moving along a fixed path. Ray tracing is used with good success to compare the measurements in corridors with the results of simulations taking into account the main reflections from floors, ceilings, and walls. Knife-edge diffraction and the geometric theory of diffraction are used to explain the results of diffraction by corners. Results pertaining to the simulation of signal transmissions at 900 MHz in a single room are also presented. The results show, for a sample of transmitter-receiver locations, the effect of the different multipath rays, from which certain general conclusions can be drawn for improved system design.
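To make the knife-edge mechanism concrete, here is a sketch of the standard single knife-edge diffraction loss approximation (the ITU-R P.526 form, not a formula taken from this paper), using hypothetical corner geometry. It shows why the millimetre-wave frequencies studied suffer much more around corners than 900 MHz: the Fresnel parameter v grows with the square root of frequency.

```python
import math

def knife_edge_loss_db(h, d1, d2, freq_hz):
    """Single knife-edge diffraction loss (ITU-R P.526 approximation).

    h:  edge height above the transmitter-receiver line (m; negative if
        the edge is below the line of sight).
    d1, d2: distances from each terminal to the edge (m).
    Returns the loss J(v) in dB; the approximation holds for v > -0.78.
    """
    lam = 3e8 / freq_hz                       # wavelength (m)
    v = h * math.sqrt(2 * (d1 + d2) / (lam * d1 * d2))
    if v <= -0.78:
        return 0.0                            # edge well clear of the path
    return 6.9 + 20 * math.log10(math.sqrt((v - 0.1) ** 2 + 1) + v - 0.1)

# hypothetical corner 10 m from each terminal, edge 0.5 m into the path:
print(knife_edge_loss_db(0.5, 10, 10, 900e6))   # ~11 dB at 900 MHz
print(knife_edge_loss_db(0.5, 10, 10, 59.6e9))  # ~26 dB at 59.6 GHz
```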