George Tzagkarakis
University of Crete
Publications
Featured research published by George Tzagkarakis.
IEEE Transactions on Image Processing | 2006
George Tzagkarakis; Baltasar Beferull-Lozano; Panagiotis Tsakalides
This paper presents a novel rotation-invariant image retrieval scheme based on a transformation of the texture information via a steerable pyramid. First, we fit the distribution of the subband coefficients using a joint alpha-stable sub-Gaussian model to capture their non-Gaussian behavior. Then, we apply a normalization process in order to Gaussianize the coefficients. As a result, the feature extraction step consists of estimating the covariances between the normalized pyramid coefficients. The similarity between two distinct texture images is measured by minimizing a rotation-invariant version of the Kullback-Leibler Divergence between their corresponding multivariate Gaussian distributions, where the minimization is performed over a set of rotation angles.
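For reference, the similarity measure described above admits a closed form. The sketch below writes the standard Kullback-Leibler Divergence between zero-mean multivariate Gaussians together with the rotation-invariant minimisation over a candidate angle set Θ; the notation and the zero-mean assumption are ours, not taken from the paper.

```latex
% KLD between zero-mean multivariate Gaussians fitted to the normalized
% steerable-pyramid coefficients (standard closed form), followed by the
% rotation-invariant distance obtained by minimising over a set of angles.
\[
  D\big(\mathcal{N}(0,\Sigma_1)\,\|\,\mathcal{N}(0,\Sigma_2)\big)
  = \tfrac{1}{2}\Big[\operatorname{tr}\big(\Sigma_2^{-1}\Sigma_1\big)
    - d + \ln\frac{\det\Sigma_2}{\det\Sigma_1}\Big],
\]
\[
  D_{\mathrm{RI}}(I_q, I_c)
  = \min_{\theta \in \Theta}
    D\big(\mathcal{N}(0,\Sigma_q)\,\|\,\mathcal{N}(0,\Sigma_c^{(\theta)})\big),
\]
% where \Sigma_c^{(\theta)} denotes the candidate texture's covariance with its
% orientation subbands steered by the angle \theta.
```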
Ad Hoc Networks | 2014
Dimitrios Milioris; George Tzagkarakis; Artemis Papakonstantinou; Maria Papadopouli; Panagiotis Tsakalides
Accurate location awareness is of paramount importance in most ubiquitous and pervasive computing applications. Numerous solutions for indoor localization based on IEEE 802.11, Bluetooth, ultrasonic and vision technologies have been proposed. This paper introduces a suite of novel indoor positioning techniques utilizing signal-strength (SS) fingerprints collected from access points (APs). Our first approach employs a statistical representation of the received SS measurements by means of a multivariate Gaussian model, considering a discretized grid-like form of the indoor environment and computing probability distribution signatures at each cell of the grid. At run time, the system compares the signature at the unknown position with the signature of each cell using the Kullback-Leibler Divergence (KLD) between their corresponding probability densities. Our second approach applies compressive sensing (CS) to perform sparsity-based accurate indoor localization, while significantly reducing the amount of information transmitted from a wireless device, possessing limited power, storage, and processing capabilities, to a central server. The performance evaluation, which was conducted at the premises of a research laboratory and an aquarium under real-life conditions, reveals that the proposed statistical fingerprinting and CS-based localization techniques achieve substantial localization accuracy.
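As an illustration of the first approach, the sketch below fits a multivariate Gaussian signature per grid cell from RSS samples and matches a run-time signature by minimising the Gaussian KLD. Function names, the regularisation term and the data layout are our own assumptions, not the paper's code.

```python
# Illustrative sketch of statistical fingerprinting: per-cell multivariate Gaussian
# signatures over RSS vectors from the visible APs, matched at run time by
# minimising the Kullback-Leibler Divergence between Gaussians.
import numpy as np

def fit_signature(rss_samples):
    """rss_samples: (n_samples, n_aps) RSS readings collected at one grid cell."""
    mean = rss_samples.mean(axis=0)
    # small diagonal loading keeps the covariance invertible
    cov = np.cov(rss_samples, rowvar=False) + 1e-6 * np.eye(rss_samples.shape[1])
    return mean, cov

def gaussian_kld(mu_p, cov_p, mu_q, cov_q):
    """Closed-form KLD( N(mu_p, cov_p) || N(mu_q, cov_q) )."""
    k = mu_p.size
    inv_q = np.linalg.inv(cov_q)
    diff = mu_q - mu_p
    _, logdet_p = np.linalg.slogdet(cov_p)
    _, logdet_q = np.linalg.slogdet(cov_q)
    return 0.5 * (np.trace(inv_q @ cov_p) + diff @ inv_q @ diff - k + logdet_q - logdet_p)

def localize(runtime_samples, cell_signatures):
    """Return the grid cell whose stored signature is closest in KLD to the run-time one."""
    mu_r, cov_r = fit_signature(runtime_samples)
    return min(cell_signatures,
               key=lambda cell: gaussian_kld(mu_r, cov_r, *cell_signatures[cell]))
```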
International Conference of the IEEE Engineering in Medicine and Biology Society | 2010
Alin Achim; Benjamin Buxton; George Tzagkarakis; Panagiotis Tsakalides
This paper introduces a novel framework for compressive sensing of biomedical ultrasonic signals based on modelling data with stable distributions. We propose an approach to ℓp norm minimisation that employs the iteratively reweighted least squares (IRLS) algorithm but in which the parameter p is judiciously chosen by relating it to the characteristic exponent of the underlying alpha-stable distributed data. Our results show that the proposed algorithm, which we prefer to call SαS-IRLS, outperforms previously proposed ℓ1 minimisation algorithms, such as basis pursuit or orthogonal matching pursuit, both visually and in terms of PSNR.
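A minimal sketch of the ℓp-by-IRLS idea, assuming the exponent p has already been tied to an estimate of the data's characteristic exponent α (for instance p = min(α̂, 1)); the smoothing parameter ε and its schedule are illustrative choices rather than the authors' implementation.

```python
# Hedged sketch: l_p-norm minimisation by iteratively reweighted least squares
# for an underdetermined CS system y = Phi @ x, with p chosen from the data's
# alpha-stable behaviour by the caller.
import numpy as np

def irls_lp(Phi, y, p, n_iter=50, eps=1.0):
    """Approximately solve min ||x||_p^p subject to y = Phi @ x (0 < p <= 2)."""
    x = Phi.T @ np.linalg.solve(Phi @ Phi.T, y)        # least-squares initialisation
    for _ in range(n_iter):
        w = (x**2 + eps) ** (1.0 - p / 2.0)            # smoothed weights ~ |x_i|^(2-p)
        PhiW = Phi * w                                  # equals Phi @ diag(w)
        x = w * (Phi.T @ np.linalg.solve(PhiW @ Phi.T, y))
        eps = max(eps / 10.0, 1e-8)                     # gradually sharpen the weights
    return x

# Example choice of p from an alpha-stable fit (alpha_hat from any stable estimator):
# p = min(alpha_hat, 1.0); x_hat = irls_lp(Phi, y, p)
```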
International Conference on Acoustics, Speech, and Signal Processing | 2010
George Tzagkarakis; Dimitrios Milioris; Panagiotis Tsakalides
Traditional bearing estimation techniques perform Nyquist-rate sampling of the received sensor array signals and, as a result, require high storage and transmission bandwidth resources. Compressed sensing (CS) theory provides a new paradigm for simultaneously sensing and compressing a signal using a small subset of random incoherent projection coefficients, enabling a potentially significant reduction in the sampling and computation costs. In this paper, we develop a Bayesian CS (BCS) approach for estimating target bearings based on multiple noisy CS measurement vectors, where each vector results from projecting the received source signal onto a distinct over-complete dictionary. In addition, the prior belief that the vector of projection coefficients should be sparse is enforced by fitting its prior probability distribution directly with a Gaussian Scale Mixture (GSM) model. The experimental results show that our proposed method, when compared with norm-based constrained optimization CS algorithms, as well as with single-measurement BCS methods, improves the reconstruction performance in terms of the detection error, while resulting in increased sparsity.
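The sketch below only sets up the kind of measurement model described above, with a hypothetical uniform linear array, a grid of candidate bearings and a random projection matrix; the paper's GSM-based Bayesian solver is not reproduced, and any sparse-recovery routine applied to (Φ·A, y) plays the same structural role.

```python
# Illustrative CS measurement model for bearing estimation (names and sizes are
# assumptions): a sparse coefficient vector over a grid of candidate bearings,
# an over-complete steering dictionary, and a random projection of the array signal.
import numpy as np

def steering_dictionary(n_sensors, angles_deg, spacing=0.5):
    """Over-complete dictionary of ULA steering vectors, half-wavelength spacing."""
    angles = np.deg2rad(angles_deg)
    k = np.arange(n_sensors)[:, None]
    return np.exp(-2j * np.pi * spacing * k * np.sin(angles)[None, :])

rng = np.random.default_rng(0)
n_sensors, n_grid, n_meas = 16, 181, 8
A = steering_dictionary(n_sensors, np.linspace(-90, 90, n_grid))
true = np.zeros(n_grid)
true[[60, 120]] = 1.0                                    # two sources on the bearing grid
Phi = rng.standard_normal((n_meas, n_sensors))           # random incoherent projections
noise = 0.01 * (rng.standard_normal(n_meas) + 1j * rng.standard_normal(n_meas))
y = Phi @ (A @ true) + noise                             # one noisy CS measurement vector
# A Bayesian solver with a Gaussian Scale Mixture prior would now be fitted to y;
# structurally, it recovers a sparse vector from the effective dictionary Phi @ A.
```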
International Conference on Acoustics, Speech, and Signal Processing | 2005
George Tzagkarakis; Baltasar Beferull-Lozano; Panagiotis Tsakalides
This paper presents a novel rotation-invariant image retrieval scheme based on a transformation of the texture information via a steerable pyramid. First, we fit the distribution of the subband coefficients using a joint alpha-stable sub-Gaussian model to capture their non-Gaussian behavior. Then, we apply a normalization process in order to Gaussianize the coefficients. As a result, the feature extraction step consists of estimating the covariances between the normalized pyramid coefficients. The similarity between two distinct texture images is measured by minimizing a rotation-invariant version of the Kullback-Leibler Divergence between their corresponding multivariate Gaussian distributions, where the minimization is performed over a set of rotation angles.
Modeling, Analysis and Simulation of Wireless and Mobile Systems | 2010
Dimitrios Milioris; Lito Kriara; Artemis Papakonstantinou; George Tzagkarakis; Panagiotis Tsakalides; Maria Papadopouli
This paper proposes a novel localization technique based on a multivariate Gaussian modeling of the signal strength measurements collected from several access points (APs) at different locations. It considers a discretized grid-like form of the environment and computes a signature at each cell of the grid. At run time, the system compares the signature at the unknown position with the signature of each cell using the Kullback-Leibler Divergence (KLD) between their corresponding probability densities. The paper evaluates the performance of the proposed technique and compares it with other statistical fingerprint-based localization systems. The performance analysis studies were conducted at the premises of a research laboratory and an aquarium under various conditions. Furthermore, the paper evaluates the impact of the number of APs and the size of the measurement datasets.
Performance Evaluation | 2009
George Tzagkarakis; Maria Papadopouli; Panagiotis Tsakalides
Network traffic load in an IEEE 802.11 infrastructure arises from the superposition of traffic accessed by wireless clients associated with access points (APs). An accurate load characterization can be beneficial in modeling network traffic and addressing a variety of problems including coverage planning, resource reservation and network monitoring for anomaly detection. This study focuses on the statistical analysis of the traffic load measured in a campus-wide IEEE 802.11 infrastructure at each AP. Using the Singular Spectrum Analysis approach, we found that the time-series of traffic load at a given AP has a small intrinsic dimension. In particular, these time-series can be accurately modeled using a small number of leading (principal) components. This proved to be critical for understanding the main features of the components forming the network traffic. Statistical analysis of the leading components demonstrated that even the first few components capture the main part of the information. The residual components capture the small irregular variations, which do not fit in the basic part of the network traffic and can be interpreted as stochastic noise. Based on these properties, we also studied the contributions of the various components to the overall structure of the traffic load of an AP and its variation over time. Finally, we designed and evaluated the performance of a traffic predictor for the trend component, obtained by projecting the original time-series onto the set of leading components.
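A generic Singular Spectrum Analysis sketch, assuming the per-AP traffic load is available as a 1-D array; the window length and the number of retained components are free parameters, and this is textbook SSA rather than the authors' implementation.

```python
# Minimal SSA sketch: embed the per-AP traffic-load series in a Hankel trajectory
# matrix, take its SVD, and reconstruct the series from the leading components
# by diagonal averaging (Hankelisation).
import numpy as np

def ssa_reconstruct(series, window, n_components):
    n = len(series)
    k = n - window + 1
    # trajectory (Hankel) matrix: column j holds series[j : j + window]
    X = np.column_stack([series[j:j + window] for j in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_lead = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # diagonal averaging back to a 1-D series
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += X_lead[:, j]
        counts[j:j + window] += 1
    return recon / counts
```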
Modeling, Analysis and Simulation of Wireless and Mobile Systems | 2007
George Tzagkarakis; Maria Papadopouli; Panagiotis Tsakalides
Network traffic load in an IEEE 802.11 infrastructure arises from the superposition of traffic accessed by wireless clients associated with access points (APs). An accurate characterization of these data can be beneficial in modelling network traffic and addressing a variety of problems including coverage planning, resource reservation and network monitoring for anomaly detection. This study focuses on the statistical analysis of the traffic load measured in a campus-wide IEEE 802.11 infrastructure at each AP. Using the Singular Spectrum Analysis approach, we found that the time-series of traffic load at a given AP has a small intrinsic dimension. In particular, these time-series can be accurately modelled using a small number of leading (principal) components. This proved to be critical for understanding the main features of the components forming the network traffic. The statistical analysis of the leading components demonstrated that even the first few components capture the main part of the information. The residual components capture the small irregular variations, which do not fit in the basic part of the network traffic and can be interpreted as stochastic noise. Based on these properties, we also studied the contributions of the various components to the overall structure of the traffic load of an AP and its variation over time.
International Conference on Acoustics, Speech, and Signal Processing | 2010
George Tzagkarakis; Panagiotis Tsakalides
Image storage and transmission in modern applications would be infeasible without compression, which converts high-resolution images into a relatively small set of significant transform coefficients. Owing to their specific content, many real-world images are highly sparse in an appropriate orthonormal basis. Compressed sensing (CS) theory acts simultaneously as a sensing and compression protocol, using a small subset of random incoherent projection coefficients, and thus enables a potentially significant reduction in the sampling and computation costs of images, favoring its use in real-time applications that do not require excellent reconstruction performance. In this paper, we develop a Bayesian CS (BCS) approach for obtaining highly sparse representations of images based on a set of noisy CS measurements, where the prior belief that the vector of projection coefficients should be sparse is enforced by fitting its prior probability distribution directly with a Gaussian Scale Mixture (GSM). The experimental results show that our proposed method, when compared with norm-based constrained optimization algorithms, maintains the reconstruction performance, in terms of the reconstruction error and the PSNR, while achieving increased sparsity using far fewer basis functions.
International Conference on Acoustics, Speech, and Signal Processing | 2008
Tao Wan; George Tzagkarakis; Panagiotis Tsakalides; Nishan Canagarajah; Alin Achim
A novel context enhancement technique is presented to automatically combine images of the same scene captured at different times or seasons. A unique characteristic of the algorithm is its ability to extract and maintain the meaningful information in the enhanced image while recovering the surrounding scene information by fusing the background image. The input images are first decomposed into multiresolution representations using the Dual-Tree Complex Wavelet Transform (DT-CWT), with the subband coefficients modelled as Cauchy random variables. Then, the convolution of Cauchy distributions is applied as a probabilistic prior to model the fused coefficients, and the weights used to combine the source images are optimised via Maximum Likelihood (ML) estimation. Finally, an importance map is produced to construct the composite approximation image. Experiments show that the new model significantly improves the reliability of the feature selection and enhances the fusion process.
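A hedged sketch of the weight-selection step, assuming a DT-CWT decomposition is already available and that the fused coefficients of a subband are a linear combination w·c1 + (1 − w)·c2; the Cauchy-scale estimator and the exact likelihood below are our assumptions about a plausible instantiation, not the paper's formulation.

```python
# Sketch: choose a per-subband fusion weight by maximum likelihood under a Cauchy
# convolution prior. For independent zero-location Cauchy variables, the scale of
# w*c1 + (1-w)*c2 is w*g1 + (1-w)*g2, which gives the likelihood used below.
import numpy as np
from scipy.stats import cauchy
from scipy.optimize import minimize_scalar

def cauchy_scale(coeffs):
    """ML estimate of the Cauchy scale, location fixed at zero."""
    _, gamma = cauchy.fit(coeffs, floc=0.0)
    return gamma

def ml_fusion_weight(c1, c2):
    """c1, c2: flattened DT-CWT subband coefficients of the two source images."""
    g1, g2 = cauchy_scale(c1), cauchy_scale(c2)
    def neg_loglik(w):
        fused = w * c1 + (1.0 - w) * c2
        scale = w * g1 + (1.0 - w) * g2          # scale of the convolved Cauchy prior
        return -np.sum(cauchy.logpdf(fused, loc=0.0, scale=scale))
    return minimize_scalar(neg_loglik, bounds=(0.0, 1.0), method="bounded").x
```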