Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jordi Portell is active.

Publication


Featured research published by Jordi Portell.


Data Compression, Communications and Processing | 2009

A resilient and quick data compression method of prediction errors for space missions

Jordi Portell; Alberto G. Villafranca; Enrique García-Berro

More than a decade has passed since the Consultative Committee for Space Data Systems (CCSDS) made its recommendation for lossless data compression. The CCSDS standard is commonly used in scientific missions because it is a general-purpose lossless compression technique with a low computational cost that yields acceptable compression ratios. At the core of this compression algorithm is the Rice coding method. Its performance rapidly degrades in the presence of noise and outliers, as the Rice coder is conceived for noiseless data following geometric distributions. To overcome this problem we present here a new coder, the so-called Prediction Error Coder (PEC), as well as its fully adaptive version (FAPEC), which we show to be a reliable alternative to the CCSDS standard. We show that PEC and FAPEC achieve large compression ratios even when high levels of noise are present in the data. This is done by testing our compressors with synthetic and real data, and comparing the compression ratios and processor requirements with those obtained using the CCSDS standard.
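
For intuition, here is a minimal Python sketch of the Rice coding idea at the heart of the CCSDS recommendation (PEC and FAPEC internals are not reproduced, and the parameter choice is illustrative): signed prediction errors are folded onto non-negative integers and coded as a unary quotient plus k remainder bits, so codeword length, and hence the damage done by a single outlier, grows linearly with the value.

```python
def zigzag(e: int) -> int:
    # Fold signed prediction errors onto non-negative integers:
    # 0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...
    return (e << 1) if e >= 0 else (-e << 1) - 1

def rice_code(n: int, k: int) -> str:
    # Rice code of order k: unary quotient, stop bit, k remainder bits.
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

# A single outlier inflates its codeword dramatically:
for e in (0, -3, 7, 500):
    print(e, len(rice_code(zigzag(e), k=2)))   # lengths 3, 4, 10, 253
```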


Data Compression, Communications and Processing | 2010

Optimizing GPS data transmission using entropy coding compression

Alberto G. Villafranca; Iu Mora; Patrícia Ruiz-Rodríguez; Jordi Portell; Enrique García-Berro

The Global Positioning System (GPS) has long been used as a scientific tool, and it has become a very powerful technique in domains like geophysics, where it is commonly used to study the dynamics of a large variety of systems, such as glaciers and tectonic plates. In these cases, the large distances between receivers, as well as their remote locations, usually pose a challenge for data transmission. The standard format for scientific applications is a compressed RINEX file, a raw data format which allows post-processing. Its associated compression algorithm is based on a pre-processing stage followed by a commercial data compressor. In this paper we present a new compression method which achieves better compression ratios with faster operation. We have improved the pre-compression stage, split the resulting file into two, and applied the most appropriate compressor to each file. FAPEC, a highly resilient entropy coder, is applied to the observables file. The results obtained so far demonstrate that it is possible to obtain average compression gains of about 35% with respect to the original compressor.
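
As a rough illustration of the two ideas above, improved pre-processing and splitting the stream so each part gets the most suitable compressor, here is a hedged Python sketch; the field names are hypothetical and do not reflect the actual RINEX layout:

```python
def delta_encode(samples):
    # Replace each value by its difference from the previous one; smooth
    # carrier-phase or pseudorange series then cluster near zero, which
    # is where an entropy coder such as FAPEC pays off.
    prev, out = 0, []
    for s in samples:
        out.append(s - prev)
        prev = s
    return out

def split_records(records):
    # Separate numeric observables from textual metadata so a resilient
    # entropy coder handles the former and a generic dictionary
    # compressor the latter (field names here are invented).
    observables = [r["obs"] for r in records]
    metadata = [r["meta"] for r in records]
    return observables, metadata

print(delta_encode([100, 101, 103, 102]))   # -> [100, 1, 2, -1]
```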


IEEE Transactions on Aerospace and Electronic Systems | 2006

High-performance payload data handling system for Gaia

Jordi Portell; X. Luri; Enrique García-Berro

Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements, and we illustrate a data flow proposal.
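
As a toy illustration of what a concurrent, pipelined data system means here, the following Python sketch chains two stages through queues; the stages and data are invented for illustration and do not represent the actual Gaia design:

```python
import queue
import threading

def stage(inbox, outbox, work):
    # One pipeline stage: consume items, process them, forward results.
    while True:
        item = inbox.get()
        if item is None:               # end-of-data marker: propagate, stop
            if outbox is not None:
                outbox.put(None)
            return
        result = work(item)
        if outbox is not None:
            outbox.put(result)

acq_q, proc_q = queue.Queue(), queue.Queue()
stages = [
    threading.Thread(target=stage, args=(acq_q, proc_q, lambda x: x * 2)),
    threading.Thread(target=stage, args=(proc_q, None, print)),
]
for t in stages:
    t.start()
for sample in (1, 2, 3):              # toy "front-end" samples
    acq_q.put(sample)
acq_q.put(None)
for t in stages:
    t.join()
```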


Journal of Applied Remote Sensing | 2015

Image data compression with hierarchical pixel averaging and fully adaptive prediction error coder

Riccardo Iudica; Gabriel Artigues; Jordi Portell; Enrique García-Berro

Abstract. The fully adaptive prediction error coder (FAPEC) is an entropy coder that typically offers better results than the adaptive Rice compressor. It uses basic preprocessing stages, such as delta preprocessing, but it can also be combined with a discrete wavelet transform. We describe a new algorithm called hierarchical pixel averaging (HPA). It divides an image into blocks of 16×16 pixels, which are subsequently divided into smaller blocks, down to the basic level where one block corresponds to one pixel. Average pixel values are determined for each level, from which differential coefficients are extracted. HPA allows the introduction of controlled losses with several quality levels, also allowing a given image to be progressively decompressed from lower to higher quality. It achieves better resolution at sharp image edges when compared to other lossy algorithms. HPA is based on simple arithmetic operations, allowing a very simple (and thus quick) implementation. It does not use any floating-point operations, which is an interesting feature for satellite or embedded data compression. We present a first implementation of HPA and the results obtained on a variety of images, both for the lossless and lossy cases with different quality levels. Our results indicate that HPA + FAPEC offers performance comparable to that of CCSDS 122.0.
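
The block hierarchy described above translates almost directly into code. The following Python sketch, a loose interpretation rather than the paper's implementation, builds the per-level averages with integer arithmetic only and extracts child-minus-parent differential coefficients:

```python
def hpa_coefficients(block):
    # block: a 2**n x 2**n grid of ints (16x16 in the paper).
    levels = [block]
    while len(levels[-1]) > 1:        # build coarser levels by 2x2 averaging
        prev, half = levels[-1], len(levels[-1]) // 2
        levels.append([[(prev[2*i][2*j] + prev[2*i][2*j + 1]
                         + prev[2*i + 1][2*j] + prev[2*i + 1][2*j + 1]) // 4
                        for j in range(half)] for i in range(half)])
    root = levels[-1][0][0]           # single top-level average
    diffs = []                        # child-minus-parent, coarse to fine
    for fine, coarse in zip(reversed(levels[:-1]), reversed(levels[1:])):
        diffs.append([fine[i][j] - coarse[i // 2][j // 2]
                      for i in range(len(fine)) for j in range(len(fine))])
    return root, diffs                # small values, well suited to FAPEC
```

Decoding simply reverses the walk, adding each coefficient back to its parent average, which is what makes progressive, quality-by-quality decompression natural in this scheme.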


Journal of Applied Remote Sensing | 2013

Discrete wavelet transform fully adaptive prediction error coder: image data compression based on CCSDS 122.0 and fully adaptive prediction error coder

Gabriel Artigues; Jordi Portell; Alberto Gonzalez Villafranca; Hamed Ahmadloo; Enrique García-Berro

Abstract. The Consultative Committee for Space Data Systems (CCSDS) 122.0 recommendation defines a particular image data compression algorithm for space. It is based on a discrete wavelet transform (DWT) algorithm followed by a bit-plane encoder (BPE) stage based on Rice codes. The low complexity and memory efficiency of the algorithm make it suitable for use onboard a spacecraft. On the other hand, the fully adaptive prediction error coder (FAPEC) is a quick entropy coder aimed at achieving excellent compression ratios under almost any situation, including large fractions of outliers in the data and large sample sizes. A new image compression solution based on the DWT stage of the CCSDS 122.0 recommendation is presented, followed by our FAPEC entropy coder, thus removing the BPE stage. The purpose is to obtain a low-complexity image compression algorithm able to provide similar or even better compression ratios than those obtained with the CCSDS 122.0 recommendation. A prototype of DWTFAPEC, the combination of the DWT stage with FAPEC, is presented here. We test its lossless operation, as well as a first lossy option with selectable quality, using a wide variety of images, including the official CCSDS 122.0 image corpus as well as several astronomical and ground images. The results are satisfactory, achieving significantly better compression times. The lossless ratios are very close to those of the standard, while the lossy ratios are higher (for the same level of quality loss). Thus, DWTFAPEC can be used as an alternative to the CCSDS 122.0 standard for space missions.
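
CCSDS 122.0 specifies a 9/7 wavelet; as a structural stand-in only, the Python sketch below uses a one-level integer Haar transform to show the kind of subband decomposition whose coefficients DWTFAPEC hands to the entropy coder instead of the BPE:

```python
def haar_level(img):
    # One 2D Haar analysis level on an even-sized integer image:
    # returns the coarse average subband and three detail subbands.
    ll, lh, hl, hh = [], [], [], []
    for i in range(0, len(img), 2):
        rll, rlh, rhl, rhh = [], [], [], []
        for j in range(0, len(img[0]), 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            rll.append((a + b + c + d) // 4)   # coarse average
            rlh.append((a - b + c - d) // 4)   # column detail
            rhl.append((a + b - c - d) // 4)   # row detail
            rhh.append((a - b - c + d) // 4)   # diagonal detail
        ll.append(rll); lh.append(rlh); hl.append(rhl); hh.append(rhh)
    return ll, lh, hl, hh    # detail coefficients go to the entropy coder
```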


Archive | 2012

Data Management at Gaia Data Processing Centers

Pilar de Teodoro; Alexander Hutton; Benoit Frezouls; Alain Montmory; Jordi Portell; Rosario Messineo; M. Riello; K. Nienartowicz

Gaia is a European Space Agency mission that will deal with large volumes of data that have to be processed at, and transferred between, different data processing centers (DPCs) in Europe. Managing the data and the associated databases will be a significant challenge. This paper presents the different data management configurations that have been evaluated at the Gaia DPCs in order to cope with the requirements of Gaia’s complex data handling.


Data Compression, Communications and Processing | 2010

Simple resiliency improvement of the CCSDS standard for lossless data compression

Marcial Clotet; Jordi Portell; Alberto G. Villafranca; Enrique García-Berro

The Consultative Committee for Space Data Systems (CCSDS) recommends the use of a two-stage strategy for lossless data compression in space. At the core of the second stage is the Rice coding method. The Rice compression ratio rapidly decreases in the presence of noise and outliers, since this coder is specifically conceived for noiseless data following geometric distributions. This, in turn, makes the CCSDS recommendation too sensitive to outliers in the data, leading to non-optimal ratios in realistic scenarios. In this paper we propose to replace the Rice coder of the CCSDS recommendation with a subexponential coder. We show that this solution offers high compression ratios even when large amounts of noise are present in the data. This is done by testing both compressors with synthetic and real data. The performance is actually similar to that obtained with the FAPEC coder, although with slightly higher processing requirements. Therefore, this solution appears to be a simple improvement that can be made to the current CCSDS standard with an excellent return.
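
For concreteness, here is a hedged Python sketch of a subexponential code in the style of Howard and Vitter (the exact variant used in the paper is not reproduced here). Its codeword length grows logarithmically with the value rather than linearly as with Rice codes, which is precisely what buys the outlier resilience:

```python
def subexp_code(n: int, k: int) -> str:
    # Subexponential code of n >= 0 with parameter k.
    if n < (1 << k):
        b, u = k, 0
    else:
        b = n.bit_length() - 1          # floor(log2 n)
        u = b - k + 1
    # u-bit unary prefix, stop bit, then the b low-order bits of n
    return "1" * u + "0" + format(n & ((1 << b) - 1), f"0{b}b")

for n in (3, 70, 5000):
    rice_len = (n >> 2) + 1 + 2         # Rice, k=2: quotient + stop + k bits
    print(n, len(subexp_code(n, 2)), rice_len)   # 3/3, 12/20, 24/1253
```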


Journal of Applied Remote Sensing | 2013

Prediction Error Coder: a fast lossless compression method for satellite noisy data

Alberto Gonzalez Villafranca; Jordi Portell; Enrique García-Berro

Abstract. Lossless compression is often required for downloading data of scientific payloads in space missions. The Consultative Committee for Space Data Systems (CCSDS) 121.0 recommendation on lossless data compression is a de facto standard, and it has been used in several missions so far owing to the reasonable compression ratios it achieves with low processing requirements. Although the Rice coder used by this standard is optimal when dealing with noiseless Laplacian-distributed data, its performance rapidly degrades when noisy data are compressed or when there is a significant fraction of outliers in the input data. An alternative to this is PEC, the Prediction Error Coder, which is the core of the FAPEC adaptive entropy coder. We describe, analyze and test PEC on real and simulated data, revealing its key role in the excellent outlier resiliency of FAPEC. PEC is a fast and noise-resilient semi-adaptive entropy coder that can achieve better performance than the CCSDS standard in the presence of noise or when the input data contain a sizable fraction of outliers, while requiring very low processing resources.
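
PEC's internal code structure is not reproduced here, but the following Python sketch shows what "semi-adaptive" means in practice: a first pass over a data block gathers statistics to fix the coding parameter, and a second pass encodes with it. The parameter heuristic and the use of a Rice code as the stand-in coder are assumptions for illustration:

```python
zigzag = lambda e: (e << 1) if e >= 0 else (-e << 1) - 1   # fold sign

def choose_k(block):
    # Pass 1: gather statistics; k ~ log2(mean magnitude) is a common
    # heuristic for Laplacian-like prediction errors (assumed here).
    mean = sum(abs(e) for e in block) // max(len(block), 1)
    return max(mean.bit_length() - 1, 0)

def rice(n, k):
    rem = format(n & ((1 << k) - 1), f"0{k}b") if k else ""
    return "1" * (n >> k) + "0" + rem

def encode_block(block):
    k = choose_k(block)                 # parameter fixed for the whole block
    return k, [rice(zigzag(e), k) for e in block]   # pass 2: actual coding

print(encode_block([1, -2, 0, 3, -1, 5]))
```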


Proceedings of SPIE | 2012

The on-board data handling concept for the LOFT large area detector

Slawomir Suchy; P. Uter; C. Tenzer; A. Santangelo; A. Argan; M. Feroci; T. Kennedy; P. J. Smith; D. Walton; S. Zane; Jordi Portell; E. García-Berro

The Large Observatory for X-ray Timing (LOFT) is one of the four candidate ESA M3 missions considered for launch in the 2022 timeframe. It is specifically designed to perform fast X-ray timing and probe the state of matter near black holes and neutron stars. The LOFT scientific payload consists of a Large Area Detector (LAD) and a Wide Field Monitor (WFM). The LAD is a 10 m²-class pointed instrument with high spectral (200 eV @ 6 keV) and timing (< 10 μs) resolution over the 2-80 keV range. It is designed to observe persistent and transient X-ray sources with a very large dynamic range, from a few mCrab up to an intensity of 15 Crab. An unprecedentedly large throughput (~280,000 cts/s from the Crab) is achieved with a segmented detector, making pile-up and dead-time, which often worry or limit focused experiments, secondary issues. We present the on-board data handling concept that follows the highly segmented and hierarchical structure of the instrument, from the front-end electronics to the on-board software. The system features customizable observation modes, ranging from event-by-event data for sources below 0.5 Crab to individually adjustable time-resolved spectra for brighter sources. On-board lossless data compression will be applied before transmitting the data to ground.
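
As a toy illustration of the mode selection described above (only the 0.5 Crab threshold comes from the text; the mode names are invented):

```python
def select_mode(intensity_crab: float) -> str:
    # Below the 0.5 Crab threshold, every photon can be telemetered
    # individually; brighter sources fall back to binned products.
    if intensity_crab < 0.5:
        return "event-by-event"
    return "time-resolved-spectra"
```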


Archive | 2012

Outlier-Resilient Entropy Coding

Jordi Portell; Alberto G. Villafranca; Enrique García-Berro

Many data compression systems rely on a final stage based on an entropy coder, generating short codes for the most probable symbols. Images, multispectroscopy and hyperspectroscopy are just some examples, but the space mission concept covers many other fields. In some cases, especially when the available on-board processing power is very limited, a generic data compression system with a very simple pre-processing stage can suffice. The Consultative Committee for Space Data Systems made a recommendation on lossless data compression in the early 1990s, which has been successfully used in several missions so far owing to its low computational cost and acceptable compression ratios. Nevertheless, its simple entropy coder cannot perform optimally when large numbers of outliers appear in the data, which can be caused by noise, prompt particle events, or artifacts in the data or in the pre-processing stage. Here we discuss the effect of outliers on the compression ratio and we present efficient solutions to this problem. These solutions are not only alternatives to the CCSDS recommendation, but can also be used as the entropy coding stage of more complex systems, such as image or spectroscopy compression.
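
The effect of outliers can be quantified in a few lines of Python: contaminating synthetic data with 1% of large outliers barely changes its zeroth-order entropy, yet it inflates the average Rice codeword length dramatically, which is the gap a resilient entropy coder aims to close. The data and parameters below are invented for illustration:

```python
import math
import random

random.seed(0)
clean = [int(random.expovariate(0.5)) for _ in range(10_000)]
noisy = list(clean)
for i in random.sample(range(len(noisy)), k=100):   # 1% outliers
    noisy[i] += 10_000

def entropy_bits(xs):
    # Ideal average code length (zeroth-order entropy) in bits/sample.
    n, counts = len(xs), {}
    for x in xs:
        counts[x] = counts.get(x, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def rice_bits(xs, k=1):
    # Average Rice(k) codeword length: unary quotient + stop bit + k bits.
    return sum((x >> k) + 1 + k for x in xs) / len(xs)

# Entropy barely moves, but the Rice cost explodes with the outliers:
for name, xs in (("clean", clean), ("noisy", noisy)):
    print(name, round(entropy_bits(xs), 2), round(rice_bits(xs), 2))
```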

Collaboration


Dive into Jordi Portell's collaborations.

Top Co-Authors

Enrique García-Berro
Polytechnic University of Catalonia

Alberto G. Villafranca
Polytechnic University of Catalonia

X. Luri
University of Barcelona

Gabriel Artigues
Institut de Ciències de l'Espai

E. Masana
University of Barcelona

J. Torra
University of Barcelona

Riccardo Iudica
Polytechnic University of Catalonia

D. Walton
University College London