Lionel Fillatre
Centre national de la recherche scientifique
Publications
Featured research published by Lionel Fillatre.
IEEE Transactions on Signal Processing | 2012
Lionel Fillatre
This paper deals with the detection of hidden bits in the Least Significant Bit (LSB) plane of a natural image. The mean level and the covariance matrix of the image, considered as a quantized Gaussian random matrix, are unknown. An adaptive statistical test is designed such that its probability distribution is always independent of the unknown image parameters, while ensuring a high probability of detecting hidden bits. This test is based on the likelihood ratio test except that the unknown parameters are replaced by estimates based on a local linear regression model. It is shown that this test maximizes the probability of detection as the image size becomes arbitrarily large and the quantization step vanishes. This provides an asymptotic upper bound for the detection of hidden bits based on the LSB replacement mechanism. Numerical results on real natural images show the relevance of the method and the sharpness of the asymptotic expression for the probability of detection.
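The flavor of such a residual-based LSB test can be sketched with a toy example. Everything below is illustrative: a two-neighbour average stands in for the paper's local linear regression model, and the statistic is a weighted-stego-style correlation rather than the paper's exact adaptive test.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_lsb(x, rate):
    # LSB replacement: overwrite the least significant bit of a random
    # fraction `rate` of the pixels with an independent random bit.
    y = x.copy()
    idx = rng.random(x.size) < rate
    y[idx] = (y[idx] & ~1) | rng.integers(0, 2, idx.sum())
    return y

def embedding_rate_stat(x):
    # Correlate the LSB sign (x - x_flip) with the residual of a local
    # linear prediction of the cover content; a simple two-neighbour
    # average plays the role of the local regression model here.
    x = x.astype(float)
    pred = (x[:-2] + x[2:]) / 2          # predicts the interior samples
    mid = x[1:-1]
    flip = mid + 1 - 2 * (mid % 2)       # value with the LSB flipped
    return 2 * np.mean((mid - flip) * (mid - pred))

# Smooth synthetic scan line plus sensor noise, quantized to integers.
cover = np.round(100 + 20 * np.sin(np.linspace(0, 6, 5000))
                 + rng.normal(0, 2, 5000)).astype(int)
stego = embed_lsb(cover, rate=0.8)

s_cover = embedding_rate_stat(cover)   # stays near 0 for a clean image
s_stego = embedding_rate_stat(stego)   # grows with the embedding rate
```

On a clean signal the statistic stays near zero; after LSB replacement it grows roughly in proportion to the embedding rate.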
information hiding | 2011
Rémi Cogranne; Cathel Zitzmann; Lionel Fillatre; Florent Retraint; Igor Nikiforov; Philippe Cornu
This paper investigates reliable steganalysis of natural cover images using a local non-linear parametric model. Within the framework of hypothesis testing theory, this model makes it possible to guarantee predictable results under a false-alarm constraint.
information hiding | 2011
Cathel Zitzmann; Rémi Cogranne; Florent Retraint; Igor Nikiforov; Lionel Fillatre; Philippe Cornu
The goal of this paper is to show how statistical decision theory based on a parametric statistical model of the cover media can be useful in the theory and practice of hidden-information detection.
Computer Networks | 2010
Pedro Casas; Sandrine Vaton; Lionel Fillatre; Igor Nikiforov
Recent studies from major network technology vendors forecast the advent of the Exabyte era, a massive increase in network traffic driven by high-definition video and high-speed access technology penetration. One of the most formidable difficulties that this forthcoming scenario poses for the Internet is congestion due to traffic volume anomalies at the core network. In the light of this challenging near future, we develop in this work different network-wide anomaly detection and isolation algorithms to deal with volume anomalies in large-scale network traffic flows, using coarse-grained measurements as a practical constraint. These algorithms present well-established optimality properties in terms of false alarm and missed-detection rate, or in terms of detection/isolation delay and false detection/isolation rate, a feature absent in previous works. This represents a paramount advantage with respect to current in-house methods, as it allows results to be generalized independently of particular evaluations. The detection and isolation algorithms are based on a novel linear, parsimonious, and non-data-driven spatial model for a large-scale network traffic matrix. This model makes it possible to detect and isolate anomalies in the Origin-Destination traffic flows from aggregated measurements, reducing the overhead and avoiding the challenges of direct flow measurement. Our proposals are analyzed and validated using real traffic and network topologies from three different large-scale IP backbone networks.
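The core idea, a linear spatial model whose ambient part is confined to a small known subspace, can be sketched as follows. The routing matrix, the ambient basis, and all numbers are hypothetical; the paper's actual algorithms and optimality analysis are far more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy backbone: 5 links, 4 origin-destination (OD) flows.
# A[i, j] = 1 if OD flow j crosses link i (purely illustrative routing).
A = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 1],
              [1, 0, 0, 1]], float)

# Parsimonious spatial model: the unknown ambient OD traffic is assumed
# to lie in the span of a small known basis H (a nuisance subspace).
H = np.array([[1, 0], [1, 1], [0, 1], [1, 1]], float)

# Reject everything the ambient model can explain by projecting the link
# loads onto the orthogonal complement of span(A @ H).
U, s, _ = np.linalg.svd(A @ H, full_matrices=True)
W = U[:, np.sum(s > 1e-10):]

def volume_anomaly_stat(y, sigma=1.0):
    z = W.T @ y                       # residual invariant to ambient traffic
    return float(z @ z) / sigma**2    # ~ chi-square when no anomaly is present

x_ambient = H @ np.array([10.0, 5.0])       # unknown ambient OD traffic
y_normal = A @ x_ambient + rng.normal(0, 1.0, 5)

x_anom = x_ambient.copy()
x_anom[2] += 8.0                            # volume anomaly on OD flow 2
y_anom = A @ x_anom + rng.normal(0, 1.0, 5)

t_normal = volume_anomaly_stat(y_normal)
t_anom = volume_anomaly_stat(y_anom)
```

Only the aggregated link loads `y` are observed, yet the anomaly on a single OD flow is visible because its signature leaves the ambient subspace.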
Annual Reviews in Control | 2014
Fouzi Harrou; Lionel Fillatre; Igor Nikiforov
Anomaly detection is addressed within a statistical framework. The statistical model is often composed of two types of parameters: the informative parameters and the nuisance ones. The nuisance parameters are of no interest for detection but are necessary to complete the model. In the case of unknown, non-random, and non-bounded nuisance parameters, their elimination is unavoidable. Some approaches, based on the assumption that the nuisance parameters belong to a subspace and interfere linearly with the informative ones, use the theory of invariance to reject the nuisance. Unfortunately, this can lead to a serious degradation of the detector's capacity because some anomalies are masked by the nuisance parameters. Nevertheless, in many cases the physical nature of the nuisance parameters is (partially) known, and this a priori knowledge makes it possible to define lower and upper bounds for them. The goal of this paper is to study the statistical performance of the constrained generalized likelihood ratio test used to detect an additive anomaly in the case of bounded nuisance parameters. An example concerning the integrity monitoring of GNSS train positioning illustrates the relevance of the proposed method.
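A minimal sketch of the effect of bounding the nuisance, under strong simplifying assumptions (scalar nuisance, linear Gaussian model): the constrained ML estimate is the least-squares value clipped to its bounds, so an anomaly that pushes the data beyond the admissible nuisance range stays visible in the residual, whereas an invariance-based test projecting out the nuisance direction would mask it.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 50
h = np.ones(n)            # nuisance direction (e.g. a common additive bias)
lo, hi = 0.0, 5.0         # a priori physical bounds on the nuisance

def constrained_residual(y):
    # Constrained ML estimate of the nuisance: the least-squares value
    # clipped to [lo, hi]; the residual sum of squares then plays the
    # role of the constrained GLR statistic under the no-anomaly model.
    nu_hat = np.clip(h @ y / (h @ h), lo, hi)
    r = y - nu_hat * h
    return float(r @ r)

# Nominal data: nuisance inside its bounds.
y_h0 = 3.0 * h + rng.normal(0, 1.0, n)
# Anomalous data: an additive anomaly acting along h, which a purely
# invariance-based test would project out and therefore miss.
y_h1 = (3.0 + 6.0) * h + rng.normal(0, 1.0, n)

t0 = constrained_residual(y_h0)
t1 = constrained_residual(y_h1)
```

Because the estimate saturates at the upper bound, the anomalous component beyond the bound inflates the residual `t1` far above the nominal `t0`.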
Sequential Analysis | 2012
Blaise Kevin Guepie; Lionel Fillatre; Igor Nikiforov
This article addresses the transient change detection problem. It is assumed that a change occurs at an unknown (but nonrandom) change-point and that the duration of the post-change period is finite and known. A latent detection, that is, a detection that occurs after the signal has disappeared, is considered a missed detection. A new optimality criterion adapted to the detection of transient changes involves minimizing the worst-case probability of missed detection under a constraint on the false alarm rate over a given period. A suboptimal sequential transient change detection algorithm is proposed, based on a window-limited cumulative sum (CUSUM) test. An upper bound for the worst-case probability of missed detection, together with lower and upper bounds for the false alarm rate, is derived. Based on these bounds, the window-limited CUSUM test is optimized with respect to the proposed criterion. The developed algorithm and theoretical findings are applied to the monitoring of a drinking water distribution network.
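The window-limited CUSUM idea can be sketched for a Gaussian mean change, with the window length set to the known transient duration. This is an illustrative toy with arbitrary parameters, not the optimized test of the article.

```python
import numpy as np

rng = np.random.default_rng(3)

def wl_cusum_alarms(x, mu0, mu1, sigma, L, h):
    # Window-limited CUSUM: at each time t, take the best partial sum of
    # log-likelihood ratios starting within the last L samples; raise an
    # alarm whenever it reaches the threshold h.
    llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2)
    alarms = []
    for t in range(len(x)):
        k0 = max(0, t - L + 1)
        best = max(llr[k:t + 1].sum() for k in range(k0, t + 1))
        if best >= h:
            alarms.append(t)
    return alarms

L = 20                              # known duration of the transient
x = rng.normal(0, 1, 300)
x[150:150 + L] += 2.0               # transient change at an unknown instant
alarms = wl_cusum_alarms(x, mu0=0.0, mu1=2.0, sigma=1.0, L=L, h=15.0)
```

Limiting the window to `L` reflects the criterion in the article: an alarm is only useful while the transient is still present, so evidence older than the transient duration should not accumulate.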
international conference on acoustics, speech, and signal processing | 2012
Cathel Zitzmann; Rémi Cogranne; Lionel Fillatre; Igor Nikiforov; Florent Retraint; Philippe Cornu
The goal of this paper is to propose an optimal statistical test based on modeling the discrete cosine transform (DCT) coefficients with a quantized Laplacian distribution. The paper focuses on the detection of hidden information embedded in the bits of the DCT coefficients of a JPEG image. This problem is difficult, in terms of statistical decision, for two main reasons: first, the number of DCT coefficients used to conceal the hidden bits is random; second, JPEG compression induces a strong quantization of the DCT coefficients. The proposed test explicitly takes into account the randomness of the number of DCT coefficients used. It maximizes the probability of detecting hidden information while ensuring a prescribed false alarm level.
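A toy version of the underlying likelihood ratio can be written down by modeling quantized coefficients with a two-sided geometric law (a common stand-in for an integer-quantized Laplacian); the stego hypothesis mixes each value with its LSB partner. All parameters are illustrative, and the randomness of the number of used coefficients is ignored here.

```python
import numpy as np

rng = np.random.default_rng(4)

b = 3.0  # scale of the (assumed known) Laplacian-like model

def pmf(v):
    # Two-sided geometric mass function on the integers, a simple stand-in
    # for an integer-quantized Laplacian distribution of DCT coefficients.
    c = (1 - np.exp(-1 / b)) / (1 + np.exp(-1 / b))
    return c * np.exp(-np.abs(v) / b)

def flip_lsb(v):
    return v + 1 - 2 * (v % 2)       # partner value inside the LSB pair

def llr_stat(v, rate):
    # Mean log-likelihood ratio between the stego model (a fraction `rate`
    # of coefficients carries a random LSB) and the cover model.
    q = (1 - rate / 2) * pmf(v) + (rate / 2) * pmf(flip_lsb(v))
    return float(np.mean(np.log(q / pmf(v))))

# Sample quantized coefficients from the cover model.
support = np.arange(-60, 61)
p = pmf(support)
p /= p.sum()
cover = rng.choice(support, size=20000, p=p)

# LSB replacement at full rate.
stego = (cover & ~1) | rng.integers(0, 2, cover.size)

t_cover = llr_stat(cover, rate=1.0)   # negative on average (a KL divergence)
t_stego = llr_stat(stego, rate=1.0)   # positive on average
```

The asymmetry of the quantized law within each LSB pair is exactly what makes the two hypotheses statistically distinguishable.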
international symposium on information theory | 2011
Rémi Cogranne; Cathel Zitzmann; Lionel Fillatre; Florent Retraint; Igor Nikiforov; Philippe Cornu
In the last two decades, substantial progress has been made in detecting hidden information or hidden communication channels in media files and streams. Typically, one must reliably determine, within a huge set of files (image, audio, and video), which of them contain hidden information. The goal of this paper is to study the problem of hypothesis testing based on quantized observations using a parametric statistical model with nuisance parameters, and to apply the resulting tests to hidden information detection.
international conference on control applications | 2014
Van Long Do; Lionel Fillatre; Igor Nikiforov
This paper addresses the problem of detecting cyber/physical attacks on Supervisory Control And Data Acquisition (SCADA) systems. The detection of such attacks is formulated as the problem of detecting transient changes in stochastic-dynamical systems in the presence of unknown system states (often regarded as nuisance parameters). The Variable Threshold Window Limited CUmulative SUM (VTWL CUSUM) test is adapted to the detection of transient changes of known profiles in the presence of the nuisance parameter. Taking into account the performance criterion of the transient change detection problem, which minimizes the worst-case probability of missed detection for a given value of the worst-case probability of false alarm, the thresholds are tuned to optimize the VTWL CUSUM algorithm. The optimal choice of thresholds leads to the simple Finite Moving Average (FMA) algorithm. The proposed algorithms are applied to the detection of a covert attack on a simple water distribution system, aimed at stealing water from the reservoir without being detected.
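The FMA detector itself is simple to state: sum the last L residuals and compare to a threshold (a single fixed window, in contrast to the maximum over window starts used by VTWL CUSUM). A hypothetical sketch on synthetic sensor residuals:

```python
import numpy as np

rng = np.random.default_rng(5)

def fma_alarms(residuals, L, h):
    # Finite Moving Average test: raise an alarm as soon as the sum of
    # the last L residuals reaches the threshold h.
    s = np.convolve(residuals, np.ones(L))[:len(residuals)]
    return np.flatnonzero(s >= h)

L = 15                             # assumed known attack duration
r = rng.normal(0, 1, 300)          # sensor residuals of the nominal system
r[100:100 + L] += 2.0              # covert attack shifts the residual mean
alarms = fma_alarms(r, L=L, h=15.0)
```

With a fixed window, the statistic is a simple moving sum, which is why tuning the VTWL CUSUM thresholds optimally collapses it to this computationally cheap form.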
IEEE Transactions on Signal Processing | 2012
Lionel Fillatre; Igor Nikiforov
This paper addresses the problem of multiple hypothesis testing (detection and isolation of mean vectors) in the case of a Gaussian linear model with nuisance parameters. An invariant constrained asymptotically uniformly minimax test is proposed to solve this problem. The invariance of the test with respect to the nuisance parameters is obtained by projecting the measurement vector onto a subspace of invariant statistics. The proposed test minimizes the maximum probability of false isolation uniformly with respect to the lower-bounded projections of the vectors defining the alternative hypotheses. This minimization is achieved provided that the signal-to-noise ratio (SNR) becomes arbitrarily large. The asymptotic probabilities of false alarm and false isolation, together with their nonasymptotic bounds, are analytically established. To illustrate the practical relevance of the proposed test, it is applied to the problem of network monitoring: detecting and isolating volume anomalies in network origin-destination (OD) traffic demands from simple link load measurements. The ambient traffic, i.e., the OD traffic matrix corresponding to the non-anomalous network state, is unknown and treated as a nuisance parameter. An original linear parsimonious model of the ambient traffic, indispensable for the proposed asymptotically optimal test, is designed. The statistical performance of this approach in detecting and isolating anomalies is evaluated using real data from the Abilene network.
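The isolation step can be sketched as a matched-filter decision on the invariant residual: after projecting out the ambient subspace, each OD flow leaves a signature, and the flow with the highest normalized score is declared anomalous. The network, basis, and noise level below are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy network: 6 link-load measurements, 4 OD flows; the ambient traffic
# is only known to lie in the span of H (nuisance). Matrices illustrative.
A = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], float)
H = np.array([[1, 0], [1, 1], [0, 1], [1, 1]], float)

# Invariance: project onto the orthogonal complement of span(A @ H).
U, s, _ = np.linalg.svd(A @ H, full_matrices=True)
W = U[:, np.sum(s > 1e-10):]

def isolate(y):
    # Decide which OD flow is anomalous via the matched-filter score of
    # the invariant residual against each flow's projected signature.
    z = W.T @ y
    scores = []
    for j in range(A.shape[1]):
        g = W.T @ A[:, j]                 # projected signature of flow j
        scores.append((z @ g) ** 2 / (g @ g))
    return int(np.argmax(scores))

x = H @ np.array([8.0, 4.0])       # unknown ambient OD traffic
x[1] += 10.0                       # volume anomaly on OD flow 1
y = A @ x + rng.normal(0, 0.5, 6)
decided = isolate(y)
```

Isolation only works when the projected signatures of different flows remain distinguishable after the nuisance rejection, which is the identifiability condition behind the paper's false-isolation analysis.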