Olivier Egger
École Polytechnique Fédérale de Lausanne
Publications
Featured research published by Olivier Egger.
Proceedings of the IEEE | 1995
Olivier Egger; Wei Li; Murat Kunt
A morphological subband decomposition with perfect reconstruction is proposed. Critical subsampling is achieved. The reconstructed images using this decomposition do not suffer from any ringing effect. In order to avoid poor texture representation by the morphological filters, an adaptive subband decomposition is introduced. It chooses linear filters on textured regions and morphological filters otherwise. A simple and efficient texture detection criterion is proposed and applied to the adaptive decomposition. Comparisons to other coding techniques such as JPEG and linear subband coding show that the proposed scheme performs significantly better in terms of both PSNR and visual quality.
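The adaptive selection between filter families can be illustrated with a short sketch. Below, local variance serves as a stand-in texture criterion and a median filter plays the role of the morphological-type lowpass; the window size, threshold, and filter choices are illustrative assumptions, and the paper's perfect-reconstruction subband structure is not reproduced.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def adaptive_lowpass(image, win=5, texture_thresh=100.0):
    """Choose a linear or a rank-order (morphological-type) lowpass per pixel.

    Sketch of the adaptive idea only: the paper's actual texture criterion,
    filters, and subband structure are not reproduced; `win` and
    `texture_thresh` are illustrative assumptions.
    """
    img = image.astype(np.float64)
    # Local variance as a simple texture-activity measure.
    mean = uniform_filter(img, size=win)
    var = uniform_filter(img * img, size=win) - mean * mean
    textured = var > texture_thresh
    linear = uniform_filter(img, size=win)   # linear lowpass
    morpho = median_filter(img, size=win)    # rank-order lowpass
    # Linear filtering on textured regions, morphological-type elsewhere.
    return np.where(textured, linear, morpho)
```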
Proceedings of the IEEE | 1999
Olivier Egger; Pascal Fleury; Touradj Ebrahimi; Murat Kunt
Digital images have become an important source of information in the modern world of communication systems. In their raw form, digital images require a tremendous amount of memory. Many research efforts have been devoted to the problem of image compression in the last two decades. Two different compression categories must be distinguished: lossless and lossy. Lossless compression is achieved if no distortion is introduced in the coded image. Applications requiring this type of compression include medical imaging and satellite photography. For applications such as video telephony or multimedia applications, some loss of information is usually tolerated in exchange for a high compression ratio. In this two-part paper, the major building blocks of image coding schemes are overviewed. Part I covers still image coding, and Part II covers motion picture sequences. In this first part, still image coding schemes are classified into predictive, block transform, and multiresolution approaches. Predictive methods are suited to lossless and low-compression applications. Transform-based coding schemes achieve higher compression ratios for lossy compression but suffer from blocking artifacts at high compression ratios. Multiresolution approaches are suited for lossy as well as for lossless compression. At high lossy compression ratios, the typical artifact visible in the reconstructed images is the ringing effect. New applications in a multimedia environment have driven the need for new functionalities in image coding schemes. For that purpose, second-generation coding techniques segment the image into semantically meaningful parts, and parts of these methods have been adapted to work for arbitrarily shaped regions. In order to add another functionality, such as progressive transmission of the information, specific quantization algorithms must be defined. A final step in the compression scheme is achieved by the codeword assignment. Finally, coding results are presented which compare state-of-the-art techniques for lossy and lossless compression. The different artifacts of each technique are highlighted and discussed, and the possibility of progressive transmission is illustrated.
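As a concrete illustration of the block-transform category discussed above, here is a minimal JPEG-style sketch: an 8×8 orthonormal DCT, uniform quantization with a single illustrative step size (not a JPEG quantization table), and the inverse transform. Coarse quantization makes the blocking artifacts mentioned in the abstract visible at block boundaries.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix (rows = frequencies)."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] *= 1 / np.sqrt(2)
    return C * np.sqrt(2 / n)

def block_transform_code(image, q=20.0, n=8):
    """JPEG-style sketch: 8x8 DCT, uniform quantization, inverse DCT.

    `q` is one illustrative step size, not a JPEG quantization table.
    A coarse `q` produces the classic blocking artifacts at block
    boundaries that the survey attributes to block transforms.
    """
    C = dct_matrix(n)
    out = image.astype(np.float64).copy()  # un-coded borders pass through
    h, w = image.shape
    for i in range(0, h - h % n, n):
        for j in range(0, w - w % n, n):
            blk = out[i:i+n, j:j+n]
            coeffs = C @ blk @ C.T                # forward 2-D DCT
            coeffs = q * np.round(coeffs / q)     # uniform quantization
            out[i:i+n, j:j+n] = C.T @ coeffs @ C  # inverse 2-D DCT
    return out
```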
IEEE Transactions on Image Processing | 1995
Olivier Egger; Wei Li
We address the problem of the choice of subband filters in the context of image coding. The ringing effects that occur in subband-based compression schemes are the major unpleasant distortions. A new set of two-band filter banks suitable for image coding applications is presented. The basic properties of these filters are linear phase, perfect reconstruction, asymmetric length, and maximum regularity. Their better overall performance compared to classical QMF subband filters is explained. The asymmetry of the filter lengths results in a better compaction of the energy, especially in the highpass subbands. Moreover, the quantization error is reduced due to the short lowpass synthesis filter. The undesirable ringing effect is considerably reduced due to the good step response of the synthesis lowpass filter. The proposed design takes into account the statistics of natural images and the effect of quantization errors in the reconstructed images, which explains the better coding performance.
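The stated filter properties can be demonstrated with a small example. The paper's specific filter pairs are not reproduced here; the well-known LeGall 5/3 pair, implemented via lifting, serves as a stand-in since it shares the key properties of linear phase, perfect reconstruction, and unequal analysis filter lengths.

```python
import numpy as np

def lgt53_forward(x):
    """One level of the LeGall 5/3 wavelet via lifting (float version).

    Stand-in for the paper's asymmetric-length linear-phase PR filter
    banks (the actual Egger-Li coefficients are not reproduced here).
    Assumes an even-length input with periodic extension for simplicity.
    """
    even, odd = x[0::2].astype(np.float64), x[1::2].astype(np.float64)
    # Predict step: highpass = odd samples minus the average of neighbors.
    d = odd - 0.5 * (even + np.roll(even, -1))
    # Update step: lowpass = even samples plus a detail correction.
    s = even + 0.25 * (d + np.roll(d, 1))
    return s, d

def lgt53_inverse(s, d):
    """Invert the lifting steps exactly, giving perfect reconstruction."""
    even = s - 0.25 * (d + np.roll(d, 1))
    odd = d + 0.5 * (even + np.roll(even, -1))
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

x = np.random.rand(64)
s, d = lgt53_forward(x)
assert np.allclose(lgt53_inverse(s, d), x)  # perfect reconstruction
```

Because the inverse simply undoes each lifting step in reverse order, reconstruction is exact regardless of the boundary handling, which is the structural reason lifting implementations guarantee perfect reconstruction.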
International Conference on Image Processing | 1994
Olivier Egger; Wei Li
A morphological subband decomposition with perfect reconstruction is proposed. Critical subsampling is achieved. Reconstructed images using this decomposition do not suffer from any ringing effect. In order to avoid poor texture representation by morphological filters, an adaptive subband decomposition is introduced. It chooses linear filters on textured regions and morphological filters otherwise. Comparisons to other coding techniques such as JPEG and linear subband coding show that the proposed scheme performs significantly better in terms of both PSNR and visual quality.
International Conference on Image Processing | 1995
Olivier Egger; Murat Kunt
In this paper, the problem of progressive lossless image coding is addressed. Many applications require lossless compression of the image data. The possibility of progressively decoding the bitstream adds a new functionality for applications involving data browsing. In practice, the proposed scheme is of particular use when accessing large databases of images requiring lossless compression (especially in medical applications). The international standard JPEG offers a lossless mode, based on entropy reduction of the data using various kinds of estimators followed by source coding. The proposed algorithm works with a completely different philosophy, summarized in the following four key points: 1) a perfect reconstruction hierarchical morphological subband decomposition yielding only integer coefficients, 2) prediction of the absence of significant information across scales using zerotrees of wavelet coefficients, 3) entropy-coded successive-approximation quantization, and 4) lossless data compression via adaptive arithmetic coding. This approach produces a completely embedded bitstream, so it is possible to decode the bitstream only partially and still reconstruct an approximation of the original image.
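The successive-approximation idea behind the embedded bitstream can be sketched in a few lines: integer coefficients are transmitted one bit-plane at a time, most significant first, so truncating the stream at any plane yields a coarser but valid reconstruction. The zerotree prediction and adaptive arithmetic coding stages of the actual algorithm are omitted here.

```python
import numpy as np

def bitplane_encode(coeffs, n_planes=8):
    """Emit signs and magnitude bit-planes, most significant first.

    Minimal sketch of successive-approximation quantization only; the
    paper's zerotree prediction and adaptive arithmetic coding are
    omitted, so this stream is embedded but not entropy-coded.
    Assumes |coefficient| < 2**n_planes.
    """
    mag = np.abs(coeffs).astype(np.int64)
    planes = [((mag >> p) & 1) for p in range(n_planes - 1, -1, -1)]
    return np.sign(coeffs), planes

def bitplane_decode(signs, planes, n_planes=8):
    """Reconstruct from however many planes were received (embedding)."""
    mag = np.zeros(signs.shape, dtype=np.int64)
    for i, plane in enumerate(planes):
        mag |= plane.astype(np.int64) << (n_planes - 1 - i)
    return signs * mag

c = np.array([37, -5, 120, 0, -64])
signs, planes = bitplane_encode(c)
print(bitplane_decode(signs, planes[:3]))  # coarse: top 3 planes only
print(bitplane_decode(signs, planes))      # exact: all planes received
```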
International Conference on Acoustics, Speech, and Signal Processing | 1996
Olivier Egger; Touradj Ebrahimi; Murat Kunt
In order to satisfy the needs of new multimedia applications, the problem of content-based video coding has to be addressed. A new approach to object interior coding is proposed. It is based on an arbitrarily shaped subband transform followed by a generalized embedded zerotree wavelet algorithm. It is shown that the proposed technique achieves good compression results and has additional properties: it is computationally efficient, keeps the same dimensionality in the transformed domain, achieves perfect reconstruction, and allows exact rate control. In addition, a lossless mode can be defined by using an appropriate filter bank.
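A sketch of the dimensionality-preserving idea follows: for each row, the pixels inside an arbitrary mask are gathered into one contiguous signal and transformed, so exactly as many coefficients are produced as there are object pixels. A one-level Haar split stands in for the filter bank actually used in the paper.

```python
import numpy as np

def shape_adaptive_rows(image, mask):
    """Row-wise sketch of an arbitrarily shaped subband transform.

    For every row, the pixels inside `mask` are treated as one
    contiguous signal and given a one-level Haar split (a stand-in
    for the paper's filter bank). The coefficient count equals the
    object pixel count, illustrating the preserved dimensionality.
    """
    lows, highs = [], []
    for row, sel in zip(image.astype(np.float64), mask.astype(bool)):
        seg = row[sel]
        n = seg.size - seg.size % 2              # drop odd tail for simplicity
        even, odd = seg[0:n:2], seg[1:n:2]
        lows.append((even + odd) / np.sqrt(2))   # Haar lowpass
        highs.append((even - odd) / np.sqrt(2))  # Haar highpass
        if seg.size % 2:                         # leftover sample kept verbatim
            lows[-1] = np.append(lows[-1], seg[-1])
    return lows, highs
```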
International Conference on Acoustics, Speech, and Signal Processing | 1997
Pascal Fleury; Olivier Egger
Developments in digital image coding tend to involve more and more complex algorithms and therefore require an increasing amount of computation. To improve overall system performance, some schemes apply different coding algorithms to separate parts of an image according to the content of each part. Such schemes are referred to as dynamic coding schemes. Applying the best-suited coding algorithm to each part of an image leads to improved coding quality, but implies an algorithm selection phase. Current selection methods require computing the reconstructed image after coding and decoding with all candidate algorithms in order to choose the best method. Other schemes prune the search in the algorithm space. Both approaches suffer from a heavy computational load. Furthermore, the computational complexity increases even more if the parameters of a given algorithm have to be adjusted during the search. This paper describes a way to predict the coding quality of a region of the input image for any given coding method. The system can then select the best-suited coding algorithm for each region according to the predicted quality. This prediction scheme has low complexity and also enables the adjustment of algorithm-specific parameters during the search.
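The selection mechanism can be sketched as follows: cheap content features are extracted from each region and fed to a per-codec quality model, and the codec with the best predicted quality wins, avoiding the expensive code-and-decode trial of exhaustive selection. The features, codec names, and linear model weights below are invented placeholders, not the paper's trained predictors.

```python
import numpy as np

def region_features(region):
    """Simple content descriptors (variance, mean gradient magnitude,
    bias term); the paper's actual features differ."""
    g = region.astype(np.float64)
    gy, gx = np.gradient(g)
    return np.array([g.var(), np.abs(gx).mean() + np.abs(gy).mean(), 1.0])

# Hypothetical linear quality models, one weight vector per codec,
# standing in for the paper's trained predictors (values invented).
QUALITY_MODELS = {
    "dct":     np.array([-0.002, -0.50, 38.0]),
    "subband": np.array([-0.001, -0.30, 36.0]),
    "fractal": np.array([-0.003, -0.10, 35.0]),
}

def select_codec(region):
    """Predict quality (a PSNR-like score) per codec from features and
    pick the best, with no actual coding/decoding performed."""
    f = region_features(region)
    scores = {name: w @ f for name, w in QUALITY_MODELS.items()}
    return max(scores, key=scores.get), scores
```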
International Conference on Acoustics, Speech, and Signal Processing | 1995
Wei Li; Olivier Egger; Murat Kunt
This paper addresses the problem of quantization noise reduction in subband image coding schemes. Two major artifacts occur in such coding schemes at high compression factors: the ringing effect around high-contrast contours and blurred false contours in large smooth regions. The first distortion can be considerably reduced by an appropriate design of the subband filters. The second can be eliminated by the noise reduction technique proposed in this paper, which consists of applying a noise reduction filter to the DC subband. The advantages of this approach are as follows: first, it can be applied to any kind of subband decomposition; second, it removes the quantization noise to which the eye is most sensitive; and third, it is computationally very efficient due to the small size (typically 64×64) of the DC subband. The colored quantization noise in the DC subband is rendered white by using the Roberts pseudonoise technique. The proposed noise reduction filter is a Wiener-type filter with adaptive directional support. It has the advantage of reducing the noise without blurring the reconstructed image. It is shown that the proposed noise reduction filter improves both the visual quality and the PSNR value of the reconstructed image.
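A minimal version of the DC-subband filtering step can be sketched with the classical local-statistics Wiener (Lee) form; the adaptive directional support described in the paper is not reproduced, and a fixed square window is assumed instead.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def wiener_dc(dc_band, noise_var, win=3):
    """Local-statistics Wiener filter for the DC subband.

    Classic Lee/Wiener form with a fixed square window; the paper's
    adaptive *directional* support is not reproduced here. `noise_var`
    is the quantization-noise variance, assumed white (e.g. after the
    Roberts pseudonoise technique mentioned in the abstract).
    """
    x = dc_band.astype(np.float64)
    mean = uniform_filter(x, size=win)
    var = uniform_filter(x * x, size=win) - mean * mean
    # Shrink toward the local mean in flat areas, keep detail elsewhere.
    gain = np.maximum(var - noise_var, 0.0) / np.maximum(var, noise_var)
    return mean + gain * (x - mean)
```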
Medical Imaging 1996: Image Display | 1996
Roberto Castagno; Rosa C. Lancini; Olivier Egger
In this paper, different methods for the quantization of wavelet transform coefficients are compared in view of medical imaging applications. The goal is to provide users with a comprehensive and application-oriented review of these techniques. The performance of four quantization methods (namely standard scalar quantization, embedded zerotree, variable-dimension vector quantization, and pyramid vector quantization) is compared with regard to application in the field of medical imaging. In addition to the standard rate-distortion criterion, we took into account the possibility of bitrate control, the feasibility of real-time implementation, and the genericity (for use in non-dedicated multimedia environments) of each approach. The diagnostic reliability of the decompressed images was also assessed during a viewing session with the help of a specialist. Classical scalar quantization methods are briefly reviewed. As a result, it is shown that despite the relatively simple design of the optimum quantizers, their performance in terms of the rate-distortion tradeoff is quite poor. For high-quality subband coding, it is of major importance to exploit the existing zero-correlation across subbands, as proposed with the embedded zerotree wavelet (EZW) algorithm. In this paper, an improved EZW algorithm is used, termed the embedded zerotree lossless (EZL) algorithm due to the importance of lossless compression in medical imaging applications, which has the additional possibility of producing an embedded lossless bitstream. VQ-based methods take advantage of the statistical properties of a block or vector of data values, yielding good-quality reconstructed images at the same bitrates. We take into account two classes of VQ methods: random quantizers (VQ) and geometric quantizers (PVQ). Algorithms belonging to the first group (the most widely known being that developed by Linde, Buzo, and Gray) suffer from the common drawback of requiring a computationally demanding training procedure to produce a codebook. The second group represents an interesting alternative based on the multidimensional properties of the distribution of the source to code. In particular, pyramid vector quantization has been taken into account. Despite being based on the implicit geometry of independent and identically distributed (i.i.d.) Laplacian sources, this method proved to achieve good results with other distributions. Tests show that the zerotree approach yields the most promising results in the rate-distortion sense. Moreover, this approach allows exact rate control and offers a progressive bitstream that can be used for data browsing or decoded up to a lossless representation of the input image.
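As a reference point for the comparison, here is a sketch of the classical scalar baseline: a uniform quantizer with a deadzone around zero, its mid-bin dequantizer, and the PSNR measure used for the rate-distortion evaluation. The step size is illustrative, not a value from the paper.

```python
import numpy as np

def deadzone_quantize(coeffs, step):
    """Uniform scalar quantizer with a deadzone around zero, the
    classical baseline the comparison starts from (the step size is
    illustrative, not a value from the paper)."""
    q = np.sign(coeffs) * np.floor(np.abs(coeffs) / step)
    return q.astype(np.int64)

def dequantize(q, step):
    """Reconstruct at the center of each nonzero quantization bin."""
    return np.sign(q) * (np.abs(q) + 0.5) * step

def psnr(original, reconstructed, peak=255.0):
    """Standard peak signal-to-noise ratio in dB."""
    mse = np.mean((original.astype(np.float64) - reconstructed) ** 2)
    return 10 * np.log10(peak * peak / mse)
```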
International Conference on Acoustics, Speech, and Signal Processing | 1998
Olivier Egger; Reto Grüter; Jean-Marc Vesin; Murat Kunt
A novel decomposition scheme for image compression is presented. It is capable of applying any nonlinear model to compress images in a lossless way. Here, a very efficient polynomial model that considers spatial information as well as order-statistic information is introduced. This new rank order polynomial decomposition (ROPD), which also allows for a progressive bitstream, is applied to various images of different nature and compared to the morphological subband decomposition (MSD) and to the best lossless prediction mode of the international standard JPEG. For all compressed images, ROPD provides better compression results than MSD and clearly outperforms the lossless mode of JPEG.
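The rank-order flavor of the prediction can be sketched as follows: each pixel is predicted from the sorted values of its causal neighbors, with weights fitted per image by least squares, and the residual is what an entropy coder would then compress. The neighborhood, polynomial order (linear here), and fitting procedure are illustrative assumptions; the paper's full ROPD structure and progressive bitstream are not reproduced.

```python
import numpy as np

def rank_order_predict(image):
    """Least-squares linear prediction from *sorted* causal neighbors.

    Sketch of the rank-order idea only: order statistics of the causal
    neighborhood (W, N, NW, NE here) are combined with fitted weights;
    the paper's full ROPD (subband structure, higher-order polynomial
    terms, progressive bitstream) is not reproduced.
    """
    img = image.astype(np.float64)
    # Target pixels: all rows but the first, all columns but the edges.
    target = img[1:, 1:-1]
    nb = np.stack([img[1:, :-2],     # W  neighbor
                   img[:-1, 1:-1],   # N  neighbor
                   img[:-1, :-2],    # NW neighbor
                   img[:-1, 2:]],    # NE neighbor
                  axis=-1)
    order = np.sort(nb, axis=-1)     # rank-order the neighborhood
    X = order.reshape(-1, 4)
    y = target.ravel()
    weights, *_ = np.linalg.lstsq(X, y, rcond=None)  # fit per image
    residual = y - X @ weights       # entropy-code this in practice
    return weights, residual.reshape(target.shape)
```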