Koen Denecker
Ghent University
Publication
Featured research published by Koen Denecker.
data compression conference | 1997
Koen Denecker; J. Van Overloop; Ignace Lemahieu
Summary form only given. The output of medical imaging devices is increasingly digital and both storage space and transmission time of the images profit from compression. The introduction of PACS systems into the hospital environment fortifies this need. Since any loss of diagnostic information is to be avoided, lossless compression techniques are preferable. We present an experimental comparison of several lossless coders and investigate their compression efficiency and speed for different types of medical images. The coders are: five image coders (LJPEG, BTPC, FELICS, S+P, CALIC), and two general-purpose coders (GnuZIP, STAT). The medical imaging techniques are: CT, MRI, X-ray, angiography, mammography, PET and echography. Lossless JPEG (LJPEG), the current lossless compression standard, combines simple linear prediction with Huffman coding. Binary tree predictive coding (BTPC) is a multi-resolution technique which decomposes the image into a binary tree. The fast and efficient lossless image compression system (FELICS) conditions the pixel data on the values of the two nearest neighbours. Compression with reversible embedded wavelets (S+P) uses a lossless wavelet transform. The context-based, adaptive, lossless/nearly-lossless coding scheme for continuous-tone images (CALIC) combines non-linear prediction with advanced statistical error modelling techniques. GnuZIP uses LZ77, a form of sliding window compression. STAT is a PPM-like general-purpose compression technique. We give combined compression ratio vs. speed results for the different compression methods as an average over the different image types.
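As a sketch of the prediction-plus-entropy-coding idea behind LJPEG described above: one of its predictors estimates each pixel as the average of its left and upper neighbours, and the resulting residuals have much lower first-order entropy, which is the rate a Huffman coder approaches. The function names and the toy ramp image below are ours, not from the paper.

```python
import math

def ljpeg_predict(img):
    """LJPEG-style predictor P = (A + B) // 2, where A is the left
    neighbour and B the neighbour above; returns the residual image."""
    h, w = len(img), len(img[0])
    res = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            a = img[y][x - 1] if x > 0 else 0
            b = img[y - 1][x] if y > 0 else 0
            res[y][x] = img[y][x] - (a + b) // 2
    return res

def entropy(values):
    """First-order entropy in bits/symbol, a proxy for the Huffman rate."""
    counts = {}
    for v in values:
        counts[v] = counts.get(v, 0) + 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A smooth ramp image: prediction residuals are near-constant, so their
# entropy drops well below that of the raw pixel values.
img = [[x + y for x in range(16)] for y in range(16)]
flat = [v for row in img for v in row]
res = [v for row in ljpeg_predict(img) for v in row]
print(entropy(flat) > entropy(res))  # residuals cost fewer bits per pixel
```

The same decorrelate-then-entropy-code structure underlies most of the coders compared in the paper; they differ in how sophisticated the prediction and error modelling are.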
Computerized Medical Imaging and Graphics | 2001
W. Philips; S. Van Assche; D. De Rycke; Koen Denecker
This paper explains the basic principles of lossless two-dimensional (2D) and three-dimensional (3D) image coding at a high level of abstraction. It discusses also a new inter-frame technique for lossless video coding based on intra-frame prediction and inter-frame context modelling. The performance of this technique is compared to that of state-of-the-art 2D coders on CT and MRI data sets from the Visual Human Project. The results show that the inter-frame technique outperforms state-of-the-art intra-frame coders, i.e. Calic and JPEG-LS. The improvement in compression ratio is significant in the case of CT data but is rather small in the case of MRI data.
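A minimal sketch of the inter-frame idea described above, under our own simplifying assumptions: each pixel is predicted purely intra-frame, but its residual is assigned to a coding context derived from the co-located residual in the previous slice; an arithmetic coder would then keep separate statistics per context. All names and the bucketing rule are illustrative, not the paper's algorithm.

```python
def intra_predict(slice_, y, x):
    # Intra-frame prediction: average of left and upper neighbours.
    a = slice_[y][x - 1] if x > 0 else 0
    b = slice_[y - 1][x] if y > 0 else 0
    return (a + b) // 2

def encode_volume(volume, n_buckets=4):
    """Group residuals by a context derived from the PREVIOUS slice's
    residual at the same position (inter-frame context modelling).
    Returns one residual list per context bucket; a real coder would
    drive per-bucket probability models instead."""
    buckets = [[] for _ in range(n_buckets)]
    prev_res = None
    for slice_ in volume:
        h, w = len(slice_), len(slice_[0])
        cur_res = [[0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                cur_res[y][x] = slice_[y][x] - intra_predict(slice_, y, x)
                # Context: magnitude of the co-located residual one slice back.
                m = abs(prev_res[y][x]) if prev_res is not None else 0
                buckets[min(m, n_buckets - 1)].append(cur_res[y][x])
        prev_res = cur_res
    return buckets
```

The intuition matches the reported results: in CT volumes adjacent slices are strongly correlated, so the previous-slice residual is an informative context; in MRI the inter-slice correlation is weaker and the gain shrinks.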
Inverse Problems | 1998
Koen Denecker; Jeroen Van Overloop; Frank Sommen
The general quadratic Radon transform in two dimensions is investigated. Whereas the classical Radon transform of a smooth function represents the integration over all lines, the general quadratic Radon transform integrates over all conic sections. First, the parabolic isofocal Radon transform, i.e. the restriction of the general quadratic Radon transform to all parabolae with focus in the origin, is defined and illustrated. We show its close relation to the classical Radon transform, deduce a support theorem, formulate an extension of the support theorem and derive an inversion formula. The natural extension to a more general class of isofocal quadratic Radon transforms is outlined. We show how the general quadratic Radon transform can be derived from the integrals over all parabolae by solving the related Cauchy problem. Finally, we introduce an entirely geometrical definition of a generalized Radon transform, the oriented generalized Radon transform.
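For orientation, the two transforms contrasted above can be written down explicitly; the notation below is ours, not necessarily the paper's. The classical Radon transform integrates f over every line, while the parabolic isofocal transform integrates over parabolae with focus at the origin, which in polar coordinates (r, φ) are the curves r(1 + cos(φ − ψ)) = p with axis direction ψ and semi-latus rectum p:

```latex
% Classical Radon transform: integration over the line x . omega = s
(\mathcal{R}f)(\omega, s)
  = \int_{\{x \,:\, x \cdot \omega = s\}} f(x)\, \mathrm{d}\ell(x)

% Parabolic isofocal Radon transform: integration over the parabola
% with focus at the origin, axis direction psi, semi-latus rectum p
(\mathcal{P}f)(\psi, p)
  = \int_{\{(r,\varphi) \,:\, r\,(1 + \cos(\varphi - \psi)) = p\}} f\, \mathrm{d}\ell
```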
Signal Processing-image Communication | 2002
Koen Denecker; Dimitri Van De Ville; Frederik Habils; Wim Meeus; Marnik Brunfaut; Ignace Lemahieu
The popularity of high-resolution digital printers and the growing computational requirements of new applications such as printing-on-demand and personalized printing have increased the need for fast and efficient lossless halftone image compression. In a previous paper, we have shown that the compression performance can be improved significantly by adapting the context template to the halftone parameters. Unfortunately, this variability in data dependency makes the modeling stage more complex and slows down the overall compression scheme. In this paper, we describe the design of an improved block-based software and hardware implementation. The software implementation uses complementary line-shifting to by-pass the adaptivity of the template. The hardware implementation is based on the automated construction of a microcoded program from a given template. Experimental results show that our improved implementation achieves approximately the same processing speed as when the fixed context template is applied. The proposed implementation is also of importance for the emerging JBIG2 standard which uses up to four adaptive template pixels.
Journal of Electronic Imaging | 1999
Steven Van Assche; Koen Denecker; Peter De Neve; Wilfried Philips; Ignace Lemahieu
In the prepress industry, color images have both a high spatial and a high color resolution. Such images require a considerable amount of storage space and impose long transmission times. Data compression is desired to reduce these storage and transmission problems. Because of the high quality requirements in the prepress industry, mostly only lossless compression is acceptable. Most existing lossless compression schemes operate on gray-scale images. In this case the color components of color images are compressed independently. However, higher compression ratios can be achieved by exploiting intercolor redundancies. In this paper we present a comparison of three state-of-the-art lossless compression techniques which exploit such color redundancies: intercolor error prediction and a Karhunen-Loeve transform-based technique, which are both linear color decorrelation techniques, and interframe CALIC, which uses a nonlinear approach to color decorrelation. It is shown that these techniques are able to exploit color redundancies and that color decorrelation can be done effectively and efficiently. The linear color decorrelators provide a considerable coding gain (about 2 bpp) on some typical prepress images. Surprisingly, the nonlinear interframe CALIC predictor does not yield better results.
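A minimal sketch of the intercolor error prediction idea, assuming a simple left-neighbour spatial predictor (the predictors in the compared coders are more elaborate): instead of coding each channel's residual directly, the target channel codes the difference between its own residual and the co-located residual of a reference channel, so that errors the channels make in the same places cancel. Names and the toy channels are ours.

```python
def residuals(channel):
    """Left-neighbour prediction residuals for one colour channel."""
    return [[row[x] - (row[x - 1] if x > 0 else 0) for x in range(len(row))]
            for row in channel]

def iep(reference, target):
    """Intercolor error prediction (sketch): code the DIFFERENCE between
    the target channel's residual and the co-located reference residual,
    exploiting the correlation between colour channels."""
    r_ref, r_tgt = residuals(reference), residuals(target)
    return [[r_tgt[y][x] - r_ref[y][x] for x in range(len(r_ref[0]))]
            for y in range(len(r_ref))]

# Strongly correlated channels: G = R + constant offset, so the
# intercolor residuals vanish everywhere except the first column.
R = [[10 * x + y for x in range(8)] for y in range(8)]
G = [[v + 5 for v in row] for row in R]
out = iep(R, G)
print(all(v == 0 for row in out for v in row[1:]))
```

When the channels are highly correlated, as in natural and prepress images, the intercolor differences concentrate near zero and entropy-code much more cheaply than the per-channel residuals, which is the roughly 2 bpp gain the paper reports for the linear decorrelators.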
Computer Graphics Forum | 2002
Koen Denecker; P. De Neve; S. Van Assche; R. Van de Walle; Ignace Lemahieu; W. Philips
In the digital prepress workflow, images are represented in the CMYK colour space. Lossy image compression alleviates the need for high storage and bandwidth capacities, resulting from the high spatial and tonal resolution. After the image has been printed on paper, the introduced visual quality loss should not be noticeable to a human observer. Since visual image quality depends on the compression algorithm both quantitatively and qualitatively, and since no visual image quality models incorporating the end‐to‐end image reproduction process are satisfactory, an experimental comparison is the only viable way to quantify subjective image quality. This paper presents the results from an intensive psychovisual study based on a two‐alternative forced‐choice approach involving 164 people, with expert and non‐expert observers distinguished. The primary goal is to evaluate two previously published adaptations of JPEG to CMYK images, and to determine a visually lossless compression ratio threshold for typical printing applications. The improvements are based on tonal decorrelation and overlapping block transforms. Results on three typical prepress test images indicate that the proposed adaptations are useful and that for the investigated printing configuration, compression ratios up to 20 can be used safely.
Journal of Electronic Imaging | 2000
Dimitri Van De Ville; Koen Denecker; Wilfried Philips; Ignace Lemahieu
Printing applications using classical halftoning need to resample the original image to a screen lattice. This resampling can cause undesirable moiré artifacts in the screened image. Some printing techniques, e.g., gravure printing, are highly susceptible to moiré, not only because of the low resolution screen lattices they employ but also because the degree of freedom in constructing halftone dots is limited by the physical constraints of the engraving mechanism. Current resampling methods compute new samples by simple interpolation techniques that cannot prevent sampling moiré very well. Therefore precautions against moiré have to be made in the prepress phase, which is not practical and sometimes not feasible. A novel technique is presented to adaptively resample an image on the screen lattice using a local estimate of the risk of aliasing. The purpose is to suppress moiré while maintaining the sharpness of the image. Experimental results demonstrate the feasibility of the proposed approach.
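The adaptive-resampling idea can be caricatured in a few lines, with our own crude stand-in for the aliasing-risk estimate (local deviation from the neighbourhood mean; the paper's estimator is more principled): where the estimated risk is low the point sample is kept sharp, where it is high the sample is blended toward a local average that suppresses the high frequencies responsible for moiré.

```python
def local_mean(img, y, x, r=1):
    """Mean over the (2r+1)x(2r+1) neighbourhood, clipped at the borders."""
    h, w = len(img), len(img[0])
    vals = [img[yy][xx]
            for yy in range(max(0, y - r), min(h, y + r + 1))
            for xx in range(max(0, x - r), min(w, x + r + 1))]
    return sum(vals) / len(vals)

def aliasing_risk(img, y, x):
    """Crude high-frequency estimate in [0, 1]: deviation of the pixel
    from its neighbourhood mean (8-bit range assumed)."""
    return min(abs(img[y][x] - local_mean(img, y, x)) / 128.0, 1.0)

def adaptive_resample(img, step):
    """Take every `step`-th sample; where the aliasing risk is high,
    blend the point sample toward the neighbourhood mean."""
    out = []
    for y in range(0, len(img), step):
        row = []
        for x in range(0, len(img[0]), step):
            r = aliasing_risk(img, y, x)
            row.append((1 - r) * img[y][x] + r * local_mean(img, y, x))
        out.append(row)
    return out
```

On smooth regions the risk estimate is near zero and the scheme degenerates to plain point sampling, preserving sharpness; only near high-frequency detail does the anti-aliasing blend engage.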
Computer Graphics Forum | 2000
P. De Neve; Koen Denecker; Wilfried Philips; Ignace Lemahieu
CMYK color images are used extensively in prepress applications. When compressing those color images one has to deal with four different color channels. Usually compression algorithms only take into account the spatial redundancy that is present in the image data. This approach does not yield an optimal data reduction since there also exists a high correlation between the different colors in natural images.
international conference on acoustics, speech, and signal processing | 1997
Koen Denecker; P. De Neve
The huge sizes of screened colour-separated photographic images make lossless compression very beneficial for both storage and transmission. Because of the special structure induced by the half-tone dots, the compression results obtained on the CCITT test images might not apply to high-resolution screened images and the default parameters of existing compression algorithms may not be optimal. In this paper we compare the performance of different classes of lossless coders: general-purpose one-dimensional coders, non-adaptive two-dimensional black-and-white coders and adaptive two-dimensional coders. Firstly, experiments on a set of test images screened under different conditions showed that MGBILEVEL and JBIG perform best with respect to compression efficiency; the difference with the other coders is significant. Secondly, we investigated the influence of the screening method (stochastic or classical screening) and screening resolution on the compression ratio for these techniques.
visual communications and image processing | 1998
Steven Van Assche; Koen Denecker; Wilfried Philips; Ignace Lemahieu
In the pre-press industry color images have both a high spatial and a high color resolution. Such images require a considerable amount of storage space and impose long transmission times. Data compression is desired to reduce these storage and transmission problems. Because of the high quality requirements in the pre-press industry, only lossless compression is acceptable. Most existing lossless compression schemes operate on gray-scale images. In this case the color components of color images must be compressed independently. However, higher compression ratios can be achieved by exploiting inter-color redundancies. In this paper we present a comparison of three state-of-the-art lossless compression techniques which exploit such color redundancies: IEP (Inter-color Error Prediction) and a KLT-based technique, which are both linear color decorrelation techniques, and Interframe CALIC, which uses a non-linear approach to color decorrelation. It is shown that these techniques are able to exploit color redundancies and that color decorrelation can be done effectively and efficiently. The linear color decorrelators provide a considerable coding gain (about 2 bpp) on some typical prepress images. The non-linear interframe CALIC predictor does not yield better results, but the full interframe CALIC technique does.