Gabriel Dauphin
Institut Galilée
Publication
Featured research published by Gabriel Dauphin.
Signal Processing: Image Communication | 2015
Aysha Kadaikar; Gabriel Dauphin; Anissa Mokraoui
This paper deals with the problem of block-based disparity map estimation for stereoscopic image coding, where the estimated map is transmitted to the decoder in order to predict one view from the other. The estimation of the disparity map is thus a trade-off between the prediction quality and the binary cost of the disparities to be stored or transmitted. This trade-off is modeled as a joint entropy-distortion metric, assuming that the disparity map is encoded with an entropy coder and that one of the two views is fully predicted by applying this map to the other view. However, minimizing this joint metric is a complex combinatorial optimization problem in which the choices of disparities are all interrelated. A sub-optimal optimization solution is then proposed. It is based on a tree structure which is constructed sequentially whenever a block is matched. The developed algorithm, called the Modified M-Algorithm (MMA), processes the reference view in raster-scan order and assumes that the disparities to be selected in the unprocessed area are likely to follow a chosen disparity distribution. At each step, this algorithm not only retains the M best paths of the tree in terms of entropy-distortion cost but also explores all possible extensions of each of these M paths until reaching the last block of the view. Simulations conducted on stereoscopic images extracted from the Middlebury and Deimos datasets show the advantage of our MMA compared to the conventional Block Matching Algorithm (BMA), with and without regularization, both in terms of reducing bitrate and distortion. Highlights: A block-based disparity map estimation algorithm for stereo image coding is proposed. The optimization algorithm relies on a joint entropy-distortion metric. A sub-optimal disparity map is built sequentially using a simplified tree structure. Comparisons are done with block-matching algorithms with and without regularization. Simulation results show significant gains in terms of rate-distortion.
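As a rough illustration of the beam-search idea described in this abstract, the sketch below keeps the M best partial disparity maps while scanning blocks in raster order and scores each extension with a joint entropy-distortion cost. The 8x8 blocks, the SSD distortion, the plug-in entropy estimate and the weight lam are simplifying assumptions, not the paper's exact formulation.

```python
import numpy as np

def entropy_bits(hist):
    """Plug-in entropy (bits per symbol) of a disparity histogram."""
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def mma_sketch(left, right, block=8, disparities=tuple(range(8)), M=4, lam=50.0):
    """Beam search: keep the M best partial disparity maps after each block."""
    h, w = left.shape
    # Each path = (cumulative distortion, histogram of chosen disparities, choices)
    paths = [(0.0, np.zeros(len(disparities)), [])]
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            L = left[by:by + block, bx:bx + block].astype(float)
            candidates = []
            for dist_acc, hist, chosen in paths:
                for di, d in enumerate(disparities):
                    x0 = bx - d
                    if x0 < 0:
                        continue
                    R = right[by:by + block, x0:x0 + block].astype(float)
                    ssd = ((L - R) ** 2).mean()
                    h2 = hist.copy()
                    h2[di] += 1
                    # Joint cost: cumulative distortion + weighted rate estimate.
                    cost = dist_acc + ssd + lam * entropy_bits(h2) * h2.sum()
                    candidates.append((cost, dist_acc + ssd, h2, chosen + [d]))
            candidates.sort(key=lambda c: c[0])
            paths = [c[1:] for c in candidates[:M]]
    return min(paths, key=lambda p: p[0] + lam * entropy_bits(p[1]) * p[1].sum())[2]

# Tiny example: a 3-pixel horizontal shift is mostly recovered as disparity 3.
rng = np.random.default_rng(0)
right = rng.integers(0, 256, (16, 32)).astype(np.uint8)
left = np.roll(right, 3, axis=1)
print(mma_sketch(left, right))
```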
Information Sciences, Signal Processing and their Applications | 2003
Gabriel Dauphin; Azeddine Beghdadi; P.V. de Lesegno
This article proposes a new definition of local contrast inspired by findings on human visual system mechanisms. This measure exploits the multichannel nature of the human visual system. The new definition explicitly takes into account the frequency and directional selectivity of the cortex, as demonstrated by many neurophysiological and psychovisual experiments. The visual information conveyed by this new contrast, called the local directional bandlimited contrast, is analysed on real and synthetic images. The consistency of this local contrast with subjective perception and its computational complexity are compared to other similar contrasts.
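As a toy illustration of a band-limited, orientation-selective contrast measure in the spirit of the one described above, the sketch below isolates one radial/angular frequency channel with an FFT mask and normalises the band-pass response by the local low-pass luminance. The band edges, the wedge width and the normalisation are illustrative choices, not the paper's definition.

```python
import numpy as np

def directional_bandlimited_contrast(img, f_lo=0.05, f_hi=0.20,
                                     theta=0.0, dtheta=np.pi / 8):
    """Toy directional band-limited contrast: band-pass response of one
    orientation/frequency channel divided by the low-pass (mean) luminance."""
    img = img.astype(float)
    h, w = img.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radius = np.hypot(fx, fy)
    angle = np.arctan2(fy, fx)
    band = (radius >= f_lo) & (radius < f_hi)               # radial frequency band
    # Orientation difference folded to (-pi/2, pi/2] so both half-planes are kept.
    dphi = np.angle(np.exp(2j * (angle - theta))) / 2
    wedge = np.abs(dphi) <= dtheta
    F = np.fft.fft2(img)
    bandpass = np.real(np.fft.ifft2(F * (band & wedge)))
    lowpass = np.real(np.fft.ifft2(F * (radius < f_lo)))    # local mean luminance
    return bandpass / np.maximum(lowpass, 1e-6)

# A vertical grating shows up strongly in the horizontal-frequency channel.
x = np.arange(64)
grating = 128.0 + 64.0 * np.sin(2 * np.pi * 0.1 * x)[None, :] * np.ones((64, 1))
print(float(np.abs(directional_bandlimited_contrast(grating)).max()))
```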
International Symposium on Signal Processing and Information Technology | 2015
Aysha Kadaikar; Gabriel Dauphin; Anissa Mokraoui
This paper deals with the disparity map estimation problem for encoding non-rectified stereoscopic images. This encoding issue is considered as a trade-off between the quality of the predicted view and the bitrate required to encode the estimated disparity map. A sub-optimal optimization algorithm, known as the Modified M-Algorithm and based on an entropy-distortion metric, has already been proposed specifically for rectified stereoscopic images: the selected disparities reduce not only the distortion of the predicted image but also the entropy of the estimated disparity map, at a low computational complexity. An extension of this strategy to non-rectified stereoscopic images is proposed. Simulation results confirm that our extended algorithm still achieves better rate-distortion performance than the traditional block-matching algorithm. Moreover, an improvement in rate-distortion performance is also observed even in the case of rectified stereoscopic images.
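For a non-rectified pair, the disparity of a block becomes a two-dimensional offset rather than a purely horizontal shift. The greedy sketch below illustrates only that change: it searches a small 2-D window and adds a Laplace-smoothed code-length term that favours offsets already used, as a crude stand-in for the entropy cost. The window size, the SSD measure and the rate approximation are assumptions, not the paper's algorithm.

```python
import numpy as np
from collections import Counter

def greedy_2d_disparity(left, right, block=8, search=2, lam=5.0):
    """Greedy per-block selection of 2-D offsets for a non-rectified pair,
    with a frequency-based code-length term as a crude rate proxy."""
    h, w = left.shape
    alphabet = (2 * search + 1) ** 2
    used, dmap = Counter(), {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            L = left[by:by + block, bx:bx + block].astype(float)
            best = None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y0, x0 = by + dy, bx + dx
                    if not (0 <= y0 <= h - block and 0 <= x0 <= w - block):
                        continue
                    R = right[y0:y0 + block, x0:x0 + block].astype(float)
                    ssd = ((L - R) ** 2).mean()
                    # Laplace-smoothed code length: reused offsets are cheaper.
                    rate = -np.log2((used[(dy, dx)] + 1) /
                                    (sum(used.values()) + alphabet))
                    cost = ssd + lam * rate
                    if best is None or cost < best[0]:
                        best = (cost, (dy, dx))
            dmap[(by, bx)] = best[1]
            used[best[1]] += 1
    return dmap

# Example: a pure 1-pixel vertical shift is recovered as the offset (-1, 0)
# for interior blocks.
rng = np.random.default_rng(1)
right = rng.integers(0, 256, (24, 24)).astype(np.uint8)
left = np.roll(right, 1, axis=0)
print(greedy_2d_disparity(left, right)[(8, 8)])
```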
International Symposium on Signal Processing and Information Technology | 2015
Aysha Kadaikar; Gabriel Dauphin; Anissa Mokraoui
This paper deals with the block-based disparity map estimation of a stereoscopic image. While most existing algorithms estimate this map by minimizing a dissimilarity metric, the proposed optimization algorithm aims at minimizing the rate-distortion compromise, using the disparity map yielded by the traditional block matching algorithm as an initial reference map. The developed algorithm analyzes the performance impact of replacing each disparity of the reference map with every possible candidate. The retained disparity is the one that improves the joint rate-distortion metric. This process is repeated as long as improvements are observed. Moreover, particular attention is paid to the updating process of the joint metric so that the algorithm's computational cost is not affected. Simulation results clearly show that our approach achieves better performance than the traditional block matching algorithm in terms of the rate-distortion compromise.
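The refinement loop described above can be pictured as follows: starting from the block-matching map, each block's disparity is tentatively replaced by every candidate, a swap is kept when the joint rate-distortion cost decreases, and the histogram behind the entropy term is updated incrementally so each trial stays cheap. The precomputed per-block distortions, the plug-in entropy and the weight lam below are simplifying assumptions, not the paper's exact metric.

```python
import numpy as np

def entropy_bits(hist):
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def refine_disparity_map(distortions, d_init, lam=10.0, max_sweeps=20):
    """Iteratively swap one block's disparity at a time when it lowers the joint
    rate-distortion cost. distortions[b, d] is the precomputed distortion of
    block b under candidate disparity d; d_init comes from plain block matching."""
    n_blocks, n_disp = distortions.shape
    d = d_init.copy()
    hist = np.bincount(d, minlength=n_disp).astype(float)

    def cost(h, D):
        return D + lam * n_blocks * entropy_bits(h)

    D = distortions[np.arange(n_blocks), d].sum()
    best = cost(hist, D)
    for _ in range(max_sweeps):
        improved = False
        for b in range(n_blocks):
            for cand in range(n_disp):
                if cand == d[b]:
                    continue
                # Incremental histogram update: only two bins change per trial.
                hist[d[b]] -= 1; hist[cand] += 1
                new_D = D - distortions[b, d[b]] + distortions[b, cand]
                c = cost(hist, new_D)
                if c < best:
                    best, D, d[b], improved = c, new_D, cand, True
                else:
                    hist[cand] -= 1; hist[d[b]] += 1   # revert the trial swap
        if not improved:
            break
    return d

# Example with random per-block distortions and a block-matching initialisation:
rng = np.random.default_rng(1)
distortions = rng.random((50, 8))
d_bma = distortions.argmin(axis=1)          # pure distortion-minimising map
print(refine_disparity_map(distortions, d_bma))
```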
Information Sciences, Signal Processing and their Applications | 2003
Patrick Bonnin; Olivier Stasse; Vincent Hugel; Pierre Blazevic; Gabriel Dauphin
Taking into account the constraints of mobile and autonomous robotics, and the tasks devoted to the vision system embedded on different robots for various applications, we investigate fast pixel-gathering mechanisms for connected component extraction and color region segmentation. In this work, we study three different mechanisms and propose a new method to compare and evaluate the algorithms according to two criteria: processing speed and the quality of the results for robotic applications. This method makes it possible to choose the most appropriate algorithm for a given robotic application, and to create new algorithms by hybridizing existing ones while keeping their interesting properties.
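As a minimal illustration of one classic pixel-gathering mechanism of the kind compared in this work, the sketch below labels 4-connected colour regions by flood fill after a coarse colour quantisation. The quantisation step, the connectivity and the queue-based traversal are illustrative choices, not one of the three mechanisms actually studied in the paper.

```python
import numpy as np
from collections import deque

def label_color_regions(img, levels=4):
    """4-connected labelling of colour regions after coarse quantisation.
    A minimal flood-fill illustration of pixel-gathering for segmentation."""
    q = (img // (256 // levels)).astype(np.int32)       # coarse colour classes
    h, w = q.shape[:2]
    labels = -np.ones((h, w), dtype=np.int32)
    current = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue
            queue = deque([(sy, sx)])
            labels[sy, sx] = current
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and np.array_equal(q[ny, nx], q[y, x])):
                        labels[ny, nx] = current
                        queue.append((ny, nx))
            current += 1
    return labels, current

# Example on a small synthetic two-colour image:
img = np.zeros((20, 20, 3), dtype=np.uint8)
img[5:15, 5:15] = 200
labels, n = label_color_regions(img)
print(n)   # 2 regions: the background and the bright square
```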
Signal Processing: Image Communication | 2018
Aysha Kadaikar; Gabriel Dauphin; Anissa Mokraoui
This paper addresses the disparity map estimation problem in the context of stereoscopic image coding. Using variable-size blocks makes it possible to describe the predicted view more precisely, but at the expense of a high bitrate if the estimation algorithm does not take this into account: more information related to the block layout, considered here as a block-length map, is required at the prediction step. This paper presents an algorithm which jointly optimizes the block-length map and the disparity map so as to ensure a good reconstruction of the predicted view while minimizing the bitrate. This is done thanks to a joint metric taking into account both the quality of the reconstruction and the bitrate needed to encode the maps. Moreover, the developed algorithm iteratively improves its performance by refining the estimated maps. Simulation results conducted on several stereoscopic images from the CMU-VASC and Middlebury datasets confirm the benefits of this approach compared to competitive block matching algorithms.
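A quadtree-style split decision captures the core trade-off: a larger block needs fewer disparity symbols but may predict worse, while four smaller blocks predict better at a higher rate. The sketch below decides between one 16x16 block and four 8x8 blocks using distortion plus a weighted rate term; the fixed bits-per-disparity cost, the single split-flag bit and the weight lam are assumptions, not the paper's joint block-length/disparity optimisation.

```python
import numpy as np

def best_disparity(L_blk, right, by, bx, disparities):
    """Best disparity and its SSD for one block (exhaustive search)."""
    size = L_blk.shape[0]
    best = (np.inf, 0)
    for d in disparities:
        x0 = bx - d
        if x0 < 0:
            continue
        R = right[by:by + size, x0:x0 + size].astype(float)
        ssd = ((L_blk - R) ** 2).sum()
        if ssd < best[0]:
            best = (ssd, d)
    return best

def split_or_merge(left, right, by, bx, size=16, disparities=range(8),
                   bits_per_disparity=3.0, lam=40.0):
    """Quadtree-style decision: keep one large block or split into four,
    scoring each option by distortion + lam * (disparity bits + 1 split-flag bit)."""
    L = left[by:by + size, bx:bx + size].astype(float)
    ssd_big, d_big = best_disparity(L, right, by, bx, disparities)
    cost_big = ssd_big + lam * (bits_per_disparity + 1.0)
    half = size // 2
    cost_split, d_small = lam * 1.0, []
    for oy in (0, half):
        for ox in (0, half):
            Ls = left[by + oy:by + oy + half, bx + ox:bx + ox + half].astype(float)
            ssd, d = best_disparity(Ls, right, by + oy, bx + ox, disparities)
            cost_split += ssd + lam * bits_per_disparity
            d_small.append(d)
    return ('merge', d_big) if cost_big <= cost_split else ('split', d_small)

# Example: a uniform 2-pixel shift is best described by one merged block.
rng = np.random.default_rng(2)
right = rng.integers(0, 256, (32, 48)).astype(float)
left = np.roll(right, 2, axis=1)
print(split_or_merge(left, right, by=8, bx=16))
```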
European Workshop on Visual Information Processing | 2016
Aysha Kadaikar; Gabriel Dauphin; Anissa Mokraoui
This paper deals with the blockwise disparity map estimation problem for stereoscopic image coding. Generally, disparities are selected within a search area by minimizing a local distortion: the larger the search area, the more often a better disparity can be chosen and the lower the global distortion. However, the resulting disparity map then contains a larger number of different disparities and is encoded at a higher bitrate. This paper proposes two approaches to take advantage of large search areas while reducing not only the bitrate of the estimated disparity map but also the computational complexity of the optimal solution. The developed sub-optimal algorithms rely on the initial set of disparities selected by the traditional Block-Matching Algorithm (BMA) to compute new sets minimizing the distortion of the predicted view under a bitrate constraint. Simulation results confirm the benefits of our algorithms compared to the BMA in terms of bitrate-distortion.
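One simple way to exploit a large search window without paying its full rate, loosely in the spirit of the reduced-set algorithms above, is to run plain block matching first, keep only the few most frequent disparities, and re-match every block within that reduced alphabet. The block size, search range and keep parameter below are illustrative, and the frequency-based reduction is a simplification of the bitrate constraint used in the paper.

```python
import numpy as np
from collections import Counter

def block_ssd(left, right, by, bx, d, block):
    x0 = bx - d
    if x0 < 0 or x0 + block > right.shape[1]:
        return np.inf
    L = left[by:by + block, bx:bx + block].astype(float)
    R = right[by:by + block, x0:x0 + block].astype(float)
    return ((L - R) ** 2).mean()

def bma_then_reduce(left, right, block=8, search=16, keep=4):
    """Run plain BMA over a large search area, keep only the `keep` most
    frequent disparities, then re-match every block within that reduced set."""
    h, w = left.shape
    coords = [(by, bx) for by in range(0, h - block + 1, block)
                       for bx in range(0, w - block + 1, block)]
    bma = {c: min(range(search), key=lambda d: block_ssd(left, right, *c, d, block))
           for c in coords}
    reduced = [d for d, _ in Counter(bma.values()).most_common(keep)]
    return {c: min(reduced, key=lambda d: block_ssd(left, right, *c, d, block))
            for c in coords}

# Example: a 5-pixel shift yields a small disparity alphabet dominated by 5.
rng = np.random.default_rng(3)
right = rng.integers(0, 256, (32, 64)).astype(np.uint8)
left = np.roll(right, 5, axis=1)
print(sorted(set(bma_then_reduce(left, right).values())))
```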
International Conference on Image Processing | 2015
Gabriel Dauphin; Mounir Kaaniche; Anissa Zergaïnoh-Mokraoui
With the recent advances in stereoscopic display technologies, there is a growing demand for efficient stereo image compression techniques. For this reason, great attention should be paid to the disparity estimation process used to generate the residual image. In this paper, we propose to improve the disparity compensation process in a typical closed-loop stereo image coding scheme. A new formulation of this process, based on a block-dependent dictionary, is developed. More specifically, the main idea is to link together disparities yielding similar compensations and to assign a common disparity candidate to each such subset of disparities. Experimental results have shown the interest of the proposed method in terms of bitrate saving and reconstruction quality.
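The grouping idea can be sketched as follows: for a given block position, disparities whose compensated blocks are nearly identical are merged into one subset represented by a single candidate. The greedy grouping, the MSE tolerance and the choice of the first member as representative are illustrative assumptions, not the dictionary construction used in the paper.

```python
import numpy as np

def block_dependent_dictionary(right, by, bx, size=8, disparities=range(8), tol=50.0):
    """Greedily group disparities whose compensated blocks are nearly identical
    and keep the first member of each group as its representative candidate."""
    comp = {}
    for d in disparities:
        x0 = bx - d
        if 0 <= x0 <= right.shape[1] - size:
            comp[d] = right[by:by + size, x0:x0 + size].astype(float)
    groups = []                              # list of (representative, members)
    for d, blk in comp.items():
        for rep, members in groups:
            if ((blk - comp[rep]) ** 2).mean() < tol:
                members.append(d)
                break
        else:
            groups.append((d, [d]))
    return groups

# On a flat area every disparity yields the same compensation: a single group.
flat = np.full((16, 64), 120, dtype=np.uint8)
print(block_dependent_dictionary(flat, by=4, bx=32))
```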
European Workshop on Visual Information Processing | 2011
Gabriel Dauphin; Sami Khanfir
As the aging population grows, new challenges arise in providing a safe living environment with remote medical monitoring that allows elderly people to stay at home. This paper is concerned with the monitoring of medication intake. A new background suppression technique is proposed for indoor monitoring with a given video capture device, including low-cost commercially available cameras or webcams with low capture resolution. The true background image is assumed to be available in the test video sequence, which is considered realistic in this application. The background suppression process can then be thought of as a quality measure with reference, the reference being the background image. Instead of relying on findings about the human visual system (HVS), the proposed technique is based on measurements of the noise output of the video capture device. Experimental results are presented, comparing foreground detection by the proposed technique, two published background suppression algorithms, and three well-known quality measures.
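A minimal version of noise-driven background suppression, in the spirit of the approach above, measures the per-pixel noise standard deviation of the capture device on a short static clip and flags pixels whose deviation from the background exceeds a multiple of that noise level. The factor k and the way the noise is estimated here are assumptions, not the paper's measurement protocol.

```python
import numpy as np

def noise_adaptive_foreground(frames, background, k=3.0):
    """Per-pixel threshold = k * camera noise standard deviation, measured on a
    static clip; pixels deviating more than that from the background are foreground."""
    frames = frames.astype(float)
    sigma = frames.std(axis=0)                      # per-pixel sensor noise estimate
    sigma = np.maximum(sigma, 1.0)                  # guard against zero-noise pixels
    def detect(frame):
        return np.abs(frame.astype(float) - background) > k * sigma
    return detect

# Example: a static clip with Gaussian noise, then a frame with a bright object.
rng = np.random.default_rng(4)
background = np.full((40, 40), 100.0)
clip = background + rng.normal(0, 2.0, (30, 40, 40))
detect = noise_adaptive_foreground(clip, background)
frame = background.copy()
frame[10:20, 10:20] += 40                           # synthetic "object"
print(int(detect(frame).sum()))                     # 100 foreground pixels
```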
European Workshop on Visual Information Processing | 2010
Gabriel Dauphin; P. Viaris de Lesegno
Image quality assessment is increasingly used in many applications. In most existing image quality assessment approaches, the main objective is to develop measures that are consistent with subjective evaluation. Therefore, the performance of a given image quality metric is evaluated against the Mean Opinion Scores (MOS) determined from a series of subjective tests performed on a database. A plethora of image quality metrics has been developed; however, only a few studies have analysed and compared these metrics. This study attempts to provide a new framework for analysing and comparing some of the most common image quality metrics. Three representative metrics of the best-known approaches have been chosen for this study: the Peak Signal to Noise Ratio (PSNR), the Visible Differences Predictor (VDP) and the Mean Structural SIMilarity index (SSIM). The latter two are found to be consistent with Weber's law. However, subjective testing reported in the literature and computations derived from uniform color spaces such as CIE-L*a*b* suggest a different photometric invariance law. In this paper, we establish this photometric invariance law and show through numerical simulations how to check whether a given quality metric is compliant with it.
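The compliance check described above can be sketched as testing whether a metric's value stays constant when both images undergo the same photometric transform. The snippet below does this for a simple multiplicative luminance rescaling (a Weber-type invariance); the two example metrics, PSNR and a mean-normalised MSE, are illustrative, and the transform would have to be replaced by the invariance law established in the paper.

```python
import numpy as np

def psnr(ref, img):
    mse = ((ref.astype(float) - img.astype(float)) ** 2).mean()
    return 10 * np.log10(255.0 ** 2 / mse)

def weber_mse(ref, img):
    """MSE normalised by the squared mean luminance: invariant to I -> c*I."""
    ref, img = ref.astype(float), img.astype(float)
    return ((ref - img) ** 2).mean() / ref.mean() ** 2

def invariance_check(metric, ref, img, scales=(0.5, 1.0, 2.0)):
    """A compliant metric returns (nearly) the same value when both images
    are rescaled by the same luminance factor."""
    return [metric(scale * ref, scale * img) for scale in scales]

rng = np.random.default_rng(5)
ref = rng.uniform(50, 200, (64, 64))
img = ref + rng.normal(0, 5, ref.shape)
print(invariance_check(psnr, ref, img))        # varies with the luminance scale
print(invariance_check(weber_mse, ref, img))   # constant under rescaling
```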