Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Mounir Kaaniche is active.

Publication


Featured research published by Mounir Kaaniche.


IEEE Transactions on Image Processing | 2009

Vector Lifting Schemes for Stereo Image Coding

Mounir Kaaniche; Amel Benazza-Benyahia; Béatrice Pesquet-Popescu; Jean-Christophe Pesquet

Many research efforts have been devoted to the improvement of stereo image coding techniques for storage or transmission. In this paper, we are mainly interested in lossy-to-lossless coding schemes for stereo images allowing progressive reconstruction. The most commonly used approaches for stereo compression are based on disparity compensation techniques. The basic principle involved in this technique first consists of estimating the disparity map. Then, one image is considered as a reference and the other is predicted in order to generate a residual image. In this paper, we propose a novel approach, based on vector lifting schemes (VLS), which offers the advantage of generating two compact multiresolution representations of the left and the right views. We present two versions of this new scheme. A theoretical analysis of the performance of the considered VLS is also conducted. Experimental results indicate a significant improvement using the proposed structures compared with conventional methods.
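The lossy-to-lossless property of lifting-based coders comes from the exact invertibility of integer predict/update steps. As a rough illustration of that mechanism, here is the classic 1-D LeGall 5/3 integer lifting pair (a textbook example, not the paper's vector lifting scheme):

```python
def lifting_53_forward(x):
    """One level of the LeGall 5/3 integer lifting transform.
    Split into even/odd samples, predict each odd sample from its
    even neighbours, then update the even samples."""
    even, odd = list(x[0::2]), list(x[1::2])
    for i in range(len(odd)):  # predict step -> detail signal
        right = even[min(i + 1, len(even) - 1)]
        odd[i] -= (even[i] + right) // 2
    for i in range(len(even)):  # update step -> approximation signal
        left = odd[max(i - 1, 0)]
        right = odd[min(i, len(odd) - 1)]
        even[i] += (left + right + 2) // 4
    return even, odd

def lifting_53_inverse(even, odd):
    """Exact inverse: undo the lifting steps in reverse order, which is
    what makes progressive (lossy-to-lossless) reconstruction possible."""
    even, odd = list(even), list(odd)
    for i in range(len(even)):  # undo update
        left = odd[max(i - 1, 0)]
        right = odd[min(i, len(odd) - 1)]
        even[i] -= (left + right + 2) // 4
    for i in range(len(odd)):  # undo predict
        right = even[min(i + 1, len(even) - 1)]
        odd[i] += (even[i] + right) // 2
    x = [0] * (len(even) + len(odd))
    x[0::2], x[1::2] = even, odd
    return x
```

Because every step uses only integer arithmetic and is undone exactly, the round trip is lossless; the vector lifting idea extends this by letting the two views of a stereo pair contribute jointly to each other's predict/update operators.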


International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2010

Two-dimensional non separable adaptive lifting scheme for still and stereo image coding

Mounir Kaaniche; Jean-Christophe Pesquet; Amel Benazza-Benyahia; Béatrice Pesquet-Popescu

Many existing works related to lossy-to-lossless image compression are based on the lifting concept. However, it has been observed that the separable lifting scheme structure presents some limitations because of the separable processing performed along the image lines and columns. In this paper, we propose to use a 2D non separable lifting scheme decomposition that enables progressive reconstruction and exact decoding of images. More precisely, we focus on the optimization of all the involved decomposition operators. In this respect, we design the prediction filters by minimizing the variance of the detail signals. Concerning the update filters, we propose a new optimization criterion which aims at reducing the inherent aliasing artefacts. Simulations carried out on still and stereo images show the benefits which can be drawn from the proposed optimization of the lifting operators.
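The "minimize the variance of the detail signals" criterion amounts to a least-squares design of the prediction weights. A hypothetical 1-D, 2-tap sketch of that design step (the paper itself works with 2D non separable operators):

```python
import numpy as np

def design_prediction_filter(even, odd):
    """Least-squares design of a 2-tap predictor: the weights are chosen
    so that the detail signal (prediction residual) has minimal energy,
    i.e. minimal variance for a zero-mean residual."""
    n = min(len(even) - 1, len(odd))
    # Context matrix: left and right even neighbours of each odd sample.
    A = np.column_stack([even[:n], even[1:n + 1]])
    w, *_ = np.linalg.lstsq(A, odd[:n], rcond=None)
    detail = odd[:n] - A @ w
    return w, detail
```

On a linear ramp the optimal weights come out as (0.5, 0.5) and the detail signal vanishes, matching the intuition that an adaptive predictor should annihilate locally smooth content.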


Optical Engineering | 2014

Vector lifting scheme for phase-shifting holographic data compression

Yafei Xing; Mounir Kaaniche; Béatrice Pesquet-Popescu; Frederic Dufaux

With the increasing interest in holography in three-dimensional imaging applications, the use of hologram compression techniques is mandatory for storage and transmission purposes. The state-of-the-art approach aims at encoding separately each interference pattern by resorting to common still-image compression techniques. Contrary to such an independent scheme, a joint hologram coding scheme is investigated in this paper. More precisely, instead of encoding all the interference patterns, it is proposed that only two sets of data be compressed by taking into account the redundancies existing among them. The resulting data are encoded by applying a joint multiscale decomposition based on the vector lifting concept. Experimental results show the benefits that can be drawn from the proposed hologram compression approach.


EURASIP Journal on Advances in Signal Processing | 2012

Adaptive lifting scheme with sparse criteria for image coding

Mounir Kaaniche; Béatrice Pesquet-Popescu; Amel Benazza-Benyahia; Jean-Christophe Pesquet

Lifting schemes (LS) were found to be efficient tools for image coding purposes. Since LS-based decompositions depend on the choice of the prediction/update operators, many research efforts have been devoted to the design of adaptive structures. The most commonly used approaches optimize the prediction filters by minimizing the variance of the detail coefficients. In this article, we investigate techniques for optimizing sparsity criteria by focusing on the use of an ℓ1 criterion instead of an ℓ2 one. Since the output of a prediction filter may be used as an input for the other prediction filters, we then propose to optimize such a filter by minimizing a weighted ℓ1 criterion related to the global rate-distortion performance. More specifically, it will be shown that the optimization of the diagonal prediction filter depends on the optimization of the other prediction filters and vice-versa. Related to this fact, we propose to jointly optimize the prediction filters by using an algorithm that alternates between the optimization of the filters and the computation of the weights. Experimental results show the benefits which can be drawn from the proposed optimization of the lifting operators.
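The alternation the article describes, between optimizing the filters and recomputing the weights, is in spirit iteratively reweighted least squares (IRLS): an ℓ1 cost is minimized by solving a sequence of weighted ℓ2 problems. A generic sketch of that idea (not the article's exact weighted criterion or filter structure):

```python
import numpy as np

def l1_prediction_filter(A, d, n_iter=50, eps=1e-6):
    """IRLS minimisation of ||d - A w||_1: alternate between solving a
    weighted least-squares problem and recomputing the weights from the
    current residual (small residuals get large weights, which promotes
    sparse detail coefficients)."""
    w = np.linalg.lstsq(A, d, rcond=None)[0]  # plain l2 initialisation
    for _ in range(n_iter):
        r = d - A @ w
        weights = 1.0 / np.maximum(np.abs(r), eps)
        # Solve the weighted normal equations (A^T W A) w = A^T W d.
        AtWA = A.T @ (A * weights[:, None])
        w = np.linalg.solve(AtWA, A.T @ (weights * d))
    return w
```

A quick way to see the difference from the ℓ2 design: with a single large outlier in the data, the ℓ1 fit moves toward the median while the ℓ2 fit is pulled toward the mean, so the ℓ1 residual stays sparse.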


Signal Processing | 2011

Non-separable lifting scheme with adaptive update step for still and stereo image coding

Mounir Kaaniche; Amel Benazza-Benyahia; Béatrice Pesquet-Popescu; Jean-Christophe Pesquet

Many existing works related to lossy-to-lossless multiresolution image compression are based on the lifting concept. It is worth noting that a separable lifting scheme may not appear very efficient to cope with the 2D characteristics of edges which are neither horizontal nor vertical. In this paper, we propose to use 2D non-separable lifting schemes that still enable progressive reconstruction and exact decoding of images. Their relevant advantage is to yield a tractable optimization of all the involved decomposition operators. More precisely, we design the prediction operators by minimizing the variance of the detail coefficients. Concerning the update filters, we propose a new optimization criterion which aims at reducing the inherent aliasing artifacts. A theoretical analysis of the proposed method is conducted in terms of the adaptation criterion considered in the optimization of the update filter. Simulations carried out on still images and residual ones generated from stereo pairs show the benefits which can be drawn from the proposed optimization of the lifting operators.


Applied Optics | 2015

Adaptive nonseparable vector lifting scheme for digital holographic data compression

Yafei Xing; Mounir Kaaniche; Béatrice Pesquet-Popescu; Frederic Dufaux

Holographic data play a crucial role in recent three-dimensional imaging as well as microscopic applications. As a result, huge amounts of storage capacity will be involved for this kind of data. Therefore, it becomes necessary to develop efficient hologram compression schemes for storage and transmission purposes. In this paper, we focus on the shifted distance information, obtained by the phase-shifting algorithm, where two sets of difference data need to be encoded. More precisely, a nonseparable vector lifting scheme is investigated in order to exploit the two-dimensional characteristics of the holographic contents. Simulations performed on different digital holograms have shown the effectiveness of the proposed method in terms of bitrate saving and quality of object reconstruction.


IEEE International Workshop on Multimedia Signal Processing (MMSP) | 2009

Dense disparity estimation in multiview video coding

Ismaël Daribo; Mounir Kaaniche; Wided Miled; Marco Cagnazzo; Béatrice Pesquet-Popescu

Multiview video coding is an emerging application where, in addition to classical temporal prediction, an efficient disparity prediction should be performed in order to achieve the best compression performance. A popular coder is the multiview video coding (MVC) extension of H.264/AVC, which uses a block-based disparity estimation (just like temporal prediction in H.264/AVC). In this paper, we propose to improve the MVC extension by using a dense estimation method that generates a smooth disparity map with ideally infinite precision. The obtained disparity is then segmented and efficiently encoded by using a rate-distortion optimization technique. Experimental results show that significant gains can be obtained compared to the block-based disparity estimation technique used in the MVC extension.
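For contrast with the dense approach, the block-based disparity estimation used in the MVC extension can be sketched as a sum-of-absolute-differences (SAD) search along the epipolar line. This is a toy illustration of block matching, not the H.264/AVC implementation:

```python
import numpy as np

def block_disparity(left, right, block=8, max_disp=16):
    """Minimal block-based disparity estimation: for each block of the
    left view, find the horizontal shift of the right view that minimises
    the SAD. Inputs are 2-D grayscale arrays from a rectified stereo pair;
    the result is the coarse, blocky map a dense method would smooth."""
    h, w = left.shape
    disp = np.zeros((h // block, w // block), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = left[y:y + block, x:x + block].astype(float)
            best, best_d = np.inf, 0
            # Search along the epipolar line over candidate disparities.
            for d in range(0, min(max_disp, x) + 1):
                cand = right[y:y + block, x - d:x - d + block].astype(float)
                sad = np.abs(ref - cand).sum()
                if sad < best:
                    best, best_d = sad, d
            disp[by, bx] = best_d
    return disp
```

The per-block integer output is exactly the limitation the paper targets: a dense, sub-pixel map is smoother and, once segmented and rate-distortion coded, can beat this block-wise prediction.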


Multimedia Tools and Applications | 2018

Efficient transform-based texture image retrieval techniques under quantization effects

Amani Chaker; Mounir Kaaniche; Amel Benazza-Benyahia; Marc Antonini

With the great demand for storing and transmitting images as well as managing them, the retrieval of compressed images is a field of intensive research. While most of the works have been devoted to the case of losslessly encoded images (by extracting features from the unquantized transform coefficients), new studies have shown that lossy compression has a negative impact on the performance of conventional retrieval systems. In this work, we investigate three different quantization schemes and propose for each one an efficient retrieval approach. More precisely, the uniform quantizer, the moment preserving quantizer and the distribution preserving quantizer are considered. The inherent properties of each quantizer are then exploited to design an efficient retrieval strategy, and hence, to reduce the drop in retrieval performance resulting from the quantization effect. Experimental results, carried out on three standard texture databases and a color dataset, show the benefits which can be drawn from the proposed retrieval approaches.


European Signal Processing Conference (EUSIPCO) | 2015

An efficient statistical-based retrieval approach for JPEG2000 compressed images

Amani Chaker; Mounir Kaaniche; Amel Benazza-Benyahia; Marc Antonini

This paper deals with the problem of image retrieval when the database is stored in a compressed form, typically using the JPEG2000 encoding scheme based on a wavelet transform followed by a uniform scalar quantization. The state-of-the-art method applies a preprocessing step before feature extraction to reduce the difference in compression quality between the images. Our contribution consists in extracting robust features directly from the quantized coefficients. More precisely, assuming that the unquantized coefficients within a subband have a Laplacian distribution, we propose to estimate the distribution parameter from the quantized coefficients. Then, the estimated parameters of the whole set of subbands are used to build a salient feature for the indexing process. Experimental results show that the proposed retrieval approach significantly improves on the state-of-the-art one.
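One generic way to estimate a Laplacian parameter from quantized coefficients (illustrative only, not necessarily the paper's estimator) is binned-data maximum likelihood: each quantization index defines a bin of the Laplacian, and the scale maximizing the log-likelihood of the observed bin counts is retained. A grid-search sketch:

```python
import numpy as np

def laplace_cdf(x, b):
    # CDF of a zero-mean Laplacian with scale b (numerically safe form).
    return np.where(x < 0, 0.5 * np.exp(np.minimum(x, 0.0) / b),
                    1.0 - 0.5 * np.exp(-np.maximum(x, 0.0) / b))

def estimate_scale_from_quantized(q, step, b_grid=None):
    """Grid-search maximum-likelihood estimate of the Laplacian scale b
    from midtread uniformly quantized coefficients q (integer bin indices
    with reconstruction q*step). Each bin's probability mass is computed
    from the candidate CDF, and the multinomial log-likelihood of the
    observed bin counts is maximised over the grid."""
    if b_grid is None:
        b_grid = np.linspace(0.05, 20.0, 400)
    bins, counts = np.unique(q, return_counts=True)
    lo = (bins - 0.5) * step  # lower edges of the quantization bins
    hi = (bins + 0.5) * step  # upper edges
    best_b, best_ll = None, -np.inf
    for b in b_grid:
        p = laplace_cdf(hi, b) - laplace_cdf(lo, b)
        ll = np.sum(counts * np.log(np.maximum(p, 1e-300)))
        if ll > best_ll:
            best_b, best_ll = b, ll
    return best_b
```

The point of working with bin probabilities rather than dequantized values is precisely the paper's motivation: a naive estimate from reconstructed coefficients is biased by the quantizer, whereas the binned likelihood accounts for it.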


IEEE Transactions on Image Processing | 2014

A Bit Allocation Method for Sparse Source Coding

Mounir Kaaniche; Aurélia Fraysse; Béatrice Pesquet-Popescu; Jean-Christophe Pesquet

In this paper, we develop an efficient bit allocation strategy for subband-based image coding systems. More specifically, our objective is to design a new optimization algorithm based on a rate-distortion optimality criterion. To this end, we consider the uniform scalar quantization of a class of mixed distributed sources following a Bernoulli-generalized Gaussian distribution. This model appears to be particularly well-adapted for image data, which have a sparse representation in a wavelet basis. In this paper, we propose new approximations of the entropy and the distortion functions using piecewise affine and exponential forms, respectively. Because of these approximations, bit allocation is reformulated as a convex optimization problem. Solving the resulting problem allows us to derive the optimal quantization step for each subband. Experimental results show the benefits that can be drawn from the proposed bit allocation method in a typical transform-based coding application.
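For orientation, the classical high-rate bit allocation for Gaussian subbands is a useful baseline: the rate budget is spread according to the logarithm of the subband variances. This textbook formula is far simpler than the paper's convex Bernoulli-generalized Gaussian formulation, but it shows what a bit allocation routine computes:

```python
import numpy as np

def highrate_bit_allocation(variances, total_rate):
    """Classic high-rate allocation across subbands:
        R_i = R_avg + 0.5 * log2(var_i / geometric_mean(variances)).
    Higher-variance subbands receive proportionally more bits while the
    total budget (sum of the R_i) is preserved exactly."""
    v = np.asarray(variances, dtype=float)
    gm = np.exp(np.mean(np.log(v)))  # geometric mean of the variances
    r_avg = total_rate / len(v)
    return r_avg + 0.5 * np.log2(v / gm)
```

Note that this closed form can assign negative rates to very low-variance subbands, which is one reason practical schemes (including the paper's) recast allocation as a constrained optimization problem instead.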

Collaboration


Dive into Mounir Kaaniche's collaborations.

Top Co-Authors


Yafei Xing

Institut Mines-Télécom


Faouzi Alaya Cheikh

Norwegian University of Science and Technology
