Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Christos Chrysafis is active.

Publication


Featured research published by Christos Chrysafis.


Data Compression Conference | 1998

Line-based reduced memory wavelet image compression

Christos Chrysafis; Antonio Ortega

In this work we propose a novel algorithm for wavelet-based image compression with very low memory requirements. The wavelet transform is performed progressively, and we only require that a reduced number of lines from the original image be stored at any given time. The result of the wavelet transform is the same as if we were operating on the whole image, the only difference being that the coefficients of different subbands are generated in an interleaved fashion. We begin encoding the (interleaved) wavelet coefficients as soon as they become available. We classify each new coefficient into one of several classes, each corresponding to a different probability model, with the models being adapted on the fly for each image. Our scheme is fully backward adaptive, relying only on coefficients that have already been transmitted. Our experiments demonstrate that our coder remains very competitive with similar state-of-the-art coders. Note that schemes based on zero trees or bit-plane encoding basically require the whole image to be transformed (or else have to be implemented using tiling). These features make the algorithm well suited for low-memory coding within the emerging JPEG2000 standard.
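The line-by-line buffering idea can be illustrated with a minimal sketch, using a one-level Haar filter for brevity (the paper's actual filter banks and multi-level transform differ): only two input rows are ever held in memory, and the low-pass and high-pass output rows come out interleaved.

```python
# Minimal sketch (not the paper's exact filter bank): a streaming one-level
# vertical Haar transform. Only two image rows are buffered at any time;
# subband rows are emitted interleaved, so the full image never resides
# in memory.

def haar_rows_streaming(row_source):
    """row_source yields image rows (lists of numbers), one at a time."""
    buffer = []
    for row in row_source:
        buffer.append(row)
        if len(buffer) == 2:          # enough lines for one vertical filter step
            a, b = buffer
            low  = [(x + y) / 2 for x, y in zip(a, b)]   # vertical average
            high = [(x - y) / 2 for x, y in zip(a, b)]   # vertical detail
            yield ("L", low)          # subband rows come out interleaved
            yield ("H", high)
            buffer.clear()

rows = iter([[10, 12], [14, 8], [3, 5], [1, 7]])
out = list(haar_rows_streaming(rows))
# out[0] == ("L", [12.0, 10.0]); out[1] == ("H", [-2.0, 2.0])
```

A real coder would chain such generators per decomposition level, which is what keeps the memory footprint proportional to a few image lines rather than the whole frame.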


Data Compression Conference | 1997

Efficient context-based entropy coding for lossy wavelet image compression

Christos Chrysafis; Antonio Ortega

We present an adaptive image coding algorithm based on novel backward-adaptive quantization/classification techniques. We use a simple uniform scalar quantizer to quantize the image subbands. Our algorithm puts each coefficient into one of several classes depending on the values of neighboring, previously quantized coefficients. These previously quantized coefficients form contexts that are used to characterize the subband data. Each context type corresponds to a different probability model, so each subband coefficient is compressed with an arithmetic coder using the model appropriate to that coefficient's neighborhood. We show how the context selection can be driven by rate-distortion criteria, choosing the contexts so that the total distortion for a given bit rate is minimized. Moreover, the probability models for each context are initialized/updated very efficiently, so that practically no overhead information has to be sent to the decoder. Our results are comparable to, and in some cases better than, the recent state of the art, with our algorithm being simpler than most published algorithms of comparable performance.
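A toy illustration of the backward-adaptive context idea (the classifier, thresholds, and counts below are invented for the example, not the paper's): each coefficient's context is derived only from already-coded neighbors, so the decoder can rebuild the same models with no side information.

```python
# Illustrative sketch of backward-adaptive context modeling: each quantized
# coefficient is assigned a context from its already-coded causal neighbors,
# and each context keeps its own adaptive symbol counts (a stand-in for an
# arithmetic coder's probability model). The decoder can mirror this exactly
# because only previously decoded values are used.
from collections import defaultdict

def context_of(left, above):
    s = abs(left) + abs(above)          # activity of the causal neighborhood
    if s == 0:
        return 0                        # smooth region
    return 1 if s <= 2 else 2           # low / high activity

def code_subband(q):
    """q: 2-D list of quantized coefficients. Returns per-context counts."""
    models = defaultdict(lambda: defaultdict(int))   # context -> symbol counts
    for i, row in enumerate(q):
        for j, c in enumerate(row):
            left  = row[j - 1] if j > 0 else 0
            above = q[i - 1][j] if i > 0 else 0
            ctx = context_of(left, above)
            models[ctx][c] += 1          # adapt this context's model on the fly
    return models

m = code_subband([[0, 0, 3], [0, 1, 2]])
```

In a full coder the counts would drive an arithmetic coder's interval subdivision; the rate-distortion-driven context selection in the paper would replace the hand-picked thresholds above.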


International Conference on Acoustics, Speech, and Signal Processing | 2000

SBHP: a low-complexity wavelet coder

Christos Chrysafis; Amir Said; Alexander I. Drukarev; Asad Islam; William A. Pearlman

We present a low-complexity entropy coder originally designed to work within the JPEG2000 image compression standard framework. The algorithm is meant for embedded and non-embedded coding of wavelet coefficients inside a subband, and is called subband-block hierarchical partitioning (SBHP). It was extensively tested following the standard experimental procedures and was shown to yield a significant reduction in the complexity of entropy coding, with only a small loss in compression performance. Furthermore, it is able to seamlessly support all JPEG2000 features. We present a description of the algorithm, an analysis of its complexity, and a summary of the results obtained after its integration into the verification model (VM).
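The block-significance recursion at the heart of set-partitioning coders of this family can be sketched as follows (a generic illustration, not SBHP's exact partitioning rules or coding order): a block whose coefficients are all below the current threshold costs a single bit, which is where the low complexity and compression come from.

```python
# Generic sketch of hierarchical set partitioning: a block that is entirely
# insignificant against the threshold is coded with one 0 bit; otherwise a
# 1 bit is sent and the block splits into quadrants, recursing down to
# single coefficients.

def significance_pass(block, x0, y0, w, h, threshold, bits):
    sig = any(abs(block[y][x]) >= threshold
              for y in range(y0, y0 + h) for x in range(x0, x0 + w))
    bits.append(1 if sig else 0)
    if not sig or (w == 1 and h == 1):
        return                          # one bit covers the whole set
    hw, hh = max(w // 2, 1), max(h // 2, 1)
    for dy in ((0, hh) if h > 1 else (0,)):
        for dx in ((0, hw) if w > 1 else (0,)):
            significance_pass(block, x0 + dx, y0 + dy,
                              min(hw, w - dx), min(hh, h - dy), threshold, bits)

block = [[0, 0, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 9, 0],
         [0, 0, 0, 0]]
bits = []
significance_pass(block, 0, 0, 4, 4, 8, bits)
# 9 bits locate the single significant coefficient in a 16-sample block
```

A bit-plane coder would repeat this pass with halving thresholds to produce an embedded bitstream.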


International Conference on Image Processing | 2001

JPEG2000-matched MRC compression of compound documents

Debargha Mukherjee; Christos Chrysafis; Amir Said

The mixed raster content (MRC) ITU document compression standard (T.44) specifies a multilayer model that decomposes compound documents into two contone image layers and a binary mask layer for independent compression. While T.44 does not recommend any procedure for the decomposition, it does specify a set of allowable layer codecs to be used afterwards. T.44 only allows older standardized codecs such as JPEG/JBIG/G3/G4, but higher compression could be achieved if newer contone and bi-level compression standards such as JPEG2000/JBIG2 were used instead. We present an MRC compound document codec that uses JPEG2000 as the image layer codec together with a layer decomposition scheme matched to JPEG2000 for efficient compression; the mask is still coded with JBIG. Noise-removal routines enable efficient coding of scanned documents along with electronic ones. Resolution-scalable decoding features are also implemented. The segmentation mask, obtained from the layer decomposition, serves to separate text from other features.
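A hypothetical, stripped-down version of the three-layer MRC model (the paper's JPEG2000-matched decomposition is far more sophisticated than this global threshold): a binary mask selects, per pixel, between a foreground and a background layer, and each layer would then be handed to its own codec.

```python
# Toy MRC decomposition for a grayscale page: dark pixels go to the
# foreground (text) layer via a global threshold; everything else goes to
# the background layer. In the standard, the mask would be coded with a
# bi-level codec and the two contone layers with an image codec.

def mrc_decompose(image, threshold=128):
    mask, fg, bg = [], [], []
    for row in image:
        mask.append([1 if p < threshold else 0 for p in row])   # dark = text
        fg.append([p if p < threshold else 0 for p in row])     # text colors
        bg.append([p if p >= threshold else 255 for p in row])  # page content
    return mask, fg, bg

def mrc_recompose(mask, fg, bg):
    # The mask switches per pixel between the two contone layers.
    return [[f if m else b for m, f, b in zip(mr, fr, br)]
            for mr, fr, br in zip(mask, fg, bg)]

img = [[20, 200], [250, 10]]
mask, fg, bg = mrc_decompose(img)
assert mrc_recompose(mask, fg, bg) == img   # lossless for this toy example
```

The compression gain comes from the fact that the masked-out regions of each contone layer can be filled with whatever values the layer codec compresses best, since the mask hides them at reconstruction.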


International Conference on Image Processing | 2002

Low complexity guaranteed fit compound document compression

Debargha Mukherjee; Christos Chrysafis; Amir Said

We propose a new, very low complexity, single-pass algorithm for compression of continuous-tone compound documents, called GRAFIT (GuaRAnteed FIT), that can guarantee a minimum compression ratio of 12:1 or more for all images in a single pass, while maintaining visually lossless quality when reproduced at a resolution of 300 dpi or higher. The compression ratio is guaranteed irrespective of the image being compressed. The complexity of the proposed encoder and decoder is orders of magnitude smaller than that of other image compression algorithms known today. For electronic compound documents, text is always compressed losslessly, and depending on the type of image, the actual compression ratio achieved may be as high as 200:1 or more. For photographic images the compression performance is inferior to DCT or wavelet coders, but for documents at 300 dpi and above the quality is still visually lossless. Overall, GRAFIT is highly competitive with more expensive algorithms such as JPEG2000, JPEG, or JPEG-LS, and this performance is achieved in a single pass at much lower cost in both software and hardware.


Asilomar Conference on Signals, Systems and Computers | 1996

Context-based adaptive image coding

Christos Chrysafis; Antonio Ortega

In this paper we present an adaptive image coding algorithm based on novel backward-adaptive quantization/classification techniques. We use a simple scalar quantizer to quantize the image subbands. Our algorithm uses several contexts to characterize the subband data, and different arithmetic coder parameters are matched to each context. We show how the context selection can be driven by rate-distortion criteria and how the performance can be improved by replacing the scalar quantization strategy with an entropy-constrained approach. Our results are comparable to or better than the recent state of the art, with our algorithm also having advantages in terms of simplicity.


SPIE's International Symposium on Optical Science, Engineering, and Instrumentation | 1999

Implementations of the discrete wavelet transform: complexity, memory, and parallelization issues

Antonio Ortega; Wenqing Jiang; Paul Fernandez; Christos Chrysafis

The discrete wavelet transform (DWT) has been touted as a very effective tool in many signal processing applications, including compression, denoising, and modulation. For example, the forthcoming JPEG2000 image compression standard will be based on the DWT. However, in order for the DWT to achieve the popularity of other more established techniques (e.g., the DCT in compression), a substantial effort is necessary to solve some of the related implementation issues. Specific issues of interest include memory utilization, computational complexity, and scalability. In this paper we concentrate on wavelet-based image compression and provide examples, based on our recent work, of how these implementation issues can be addressed in three different environments, namely memory-constrained applications, software-only encoding/decoding, and parallel computing engines. Specifically, we discuss (1) a low-memory image coding algorithm that employs a line-based transform, (2) a technique to exploit the sparseness of non-zero wavelet coefficients in a software-only image decoder, and (3) parallel implementation techniques that take full advantage of lifting filterbank factorizations.


Proceedings of SPIE, the International Society for Optical Engineering | 2000

Minimum memory implementations of the lifting scheme

Christos Chrysafis; Antonio Ortega

All publications on the lifting scheme up to now consider non-causal systems, where the assumption is that the whole input signal is buffered. This is problematic if we want to use lifting in a low-memory scenario. In this paper we present an analysis for making a lifting implementation of a filter bank causal, while at the same time reducing the amount of delay needed for the whole system. The amount of memory needed for the lifting implementation of any filter bank can be shown to be always smaller than that of the corresponding convolution implementation. The memory savings are filter-bank dependent, ranging from no savings for the Haar transform to 40 percent for a 2-10 filter bank; they depend on the number of lifting steps as well as the length of the lifting steps used. We also elaborate on the use of boundary extensions on each lifting step instead of on the whole signal, which leads to lower memory requirements as well as simpler implementations.
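For reference, the in-place lifting steps of the LeGall 5/3 filter bank, with symmetric extension applied per lifting step, illustrate why lifting needs no separate output buffer (a generic sketch; the paper's causality and delay analysis goes well beyond this).

```python
# In-place LeGall 5/3 lifting (integer-to-integer variant). Each step
# overwrites its own inputs, so memory equals the signal buffer itself;
# a convolution implementation would need a separate output array.
# Symmetric extension is applied per lifting step at the boundaries.

def lift_53(x):
    n = len(x)
    # Predict step: odd samples become high-pass (detail) values.
    for i in range(1, n, 2):
        left  = x[i - 1]
        right = x[i + 1] if i + 1 < n else x[i - 1]   # symmetric extension
        x[i] -= (left + right) // 2
    # Update step: even samples become low-pass values.
    for i in range(0, n, 2):
        left  = x[i - 1] if i - 1 >= 0 else x[i + 1]  # symmetric extension
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] += (left + right + 2) // 4
    return x

def unlift_53(x):
    # Inverse: undo the steps in reverse order with subtraction/addition
    # swapped; exact reconstruction holds despite the integer rounding.
    n = len(x)
    for i in range(0, n, 2):
        left  = x[i - 1] if i - 1 >= 0 else x[i + 1]
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] -= (left + right + 2) // 4
    for i in range(1, n, 2):
        left  = x[i - 1]
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] += (left + right) // 2
    return x

assert unlift_53(lift_53([5, 7, 2, 9, 4, 4])) == [5, 7, 2, 9, 4, 4]
```

Making this causal for streaming, as the paper analyzes, amounts to delaying each lifting step just enough that it only ever touches samples that have already arrived.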


IEEE Transactions on Image Processing | 2000

Line-based, reduced memory, wavelet image compression

Christos Chrysafis; Antonio Ortega


Archive | 1999

Efficient wavelet-based compression of large images

David Taubman; Christos Chrysafis; Erik Ordentlich

Collaboration


Dive into Christos Chrysafis's collaborations.

Top Co-Authors

Antonio Ortega (University of Southern California)
David Taubman (University of New South Wales)
Asad Islam (Rensselaer Polytechnic Institute)
Paul Fernandez (University of Southern California)
Wenqing Jiang (University of Southern California)
William A. Pearlman (Rensselaer Polytechnic Institute)