Jorge Igual
Polytechnic University of Valencia
Publications
Featured research published by Jorge Igual.
Pattern Recognition | 2010
Addisson Salazar; Luis Vergara; Arturo Serrano; Jorge Igual
This paper presents a new procedure for learning mixtures of independent component analyzers. The procedure includes non-parametric estimation of the source densities, supervised-unsupervised learning of the model parameters, incorporation of any independent component analysis (ICA) algorithm into the learning of the ICA mixtures, and estimation of residual dependencies after training, used to correct the posterior probability of each class given the test observation vector. We demonstrate the performance of the procedure in the classification of ICA mixtures of two, three, and four classes of synthetic data, and in the classification of defective materials, consisting of 3D finite element models and lab specimens, in non-destructive testing using the impact-echo technique. The proposed posterior probability correction yields an improvement in classification accuracy. Semi-supervised learning shows that unlabeled data can degrade the performance of the classifier when they do not fit the generative model. Comparative results of the proposed method and standard ICA algorithms for blind source separation in single and multiple ICA data mixtures show the suitability of the non-parametric ICA mixture-based method for data modeling.
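As a rough illustration of the classification step described in this abstract, the sketch below evaluates class posteriors for an ICA mixture whose source densities are modeled non-parametrically with a kernel density estimate. All function names, the bandwidth, and the data layout are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def kde_logpdf(x, samples, bw=0.3):
    # Gaussian kernel density estimate of a 1-D source density
    diffs = (x[:, None] - samples[None, :]) / bw
    k = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return np.log(k.mean(axis=1) / bw + 1e-300)

def ica_mixture_posteriors(X, unmixing, source_samples, priors):
    """Posterior P(class k | x) under an ICA mixture.

    X: (N, d) observations; unmixing[k]: (d, d) de-mixing matrix W_k;
    source_samples[k]: list of d 1-D arrays of training source samples;
    priors: (K,) class priors. All names are illustrative.
    """
    N, K = X.shape[0], len(unmixing)
    loglik = np.zeros((N, K))
    for k, W in enumerate(unmixing):
        S = X @ W.T                      # recovered sources s = W x
        loglik[:, k] = np.log(abs(np.linalg.det(W)))
        for i in range(S.shape[1]):
            loglik[:, k] += kde_logpdf(S[:, i], source_samples[k][i])
    loglik += np.log(priors)
    loglik -= loglik.max(axis=1, keepdims=True)  # stable softmax
    post = np.exp(loglik)
    return post / post.sum(axis=1, keepdims=True)
```

The posterior-correction step of the paper would then adjust these probabilities for residual dependencies estimated after training; that refinement is omitted here.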
international conference on acoustics, speech, and signal processing | 2003
Francisco Castells; Jorge Igual; José Joaquín Rieta; César Sánchez; José Millet
The analysis and characterization of atrial fibrillation requires the prior extraction of the atrial activity from the electrocardiogram, in which the independent atrial and ventricular activities are combined along with noise. An independent component analysis (ICA) method is proposed in which additional knowledge about the temporal and statistical structure of the sources is incorporated. Finally, a combined method based on maximum likelihood and second-order blind identification is obtained and validated, with results that improve on those obtained with traditional ICA algorithms.
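A minimal sketch of second-order separation in this spirit is the AMUSE algorithm, a simpler relative of the second-order blind identification (SOBI) family mentioned above: whiten the mixtures, then diagonalize a symmetrized time-lagged covariance. The function name and lag choice are assumptions for illustration:

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE-style second-order blind separation.

    X: (n_sources, n_samples) mixtures of temporally structured
    sources. Returns estimated sources and the de-mixing matrix.
    Illustrative sketch only, not the paper's combined ML/SOBI method.
    """
    X = X - X.mean(axis=1, keepdims=True)
    # Whitening via the zero-lag covariance
    C0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(C0)
    V = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    Z = V @ X
    # Lagged covariance of the whitened data, symmetrized
    C1 = Z[:, :-lag] @ Z[:, lag:].T / (Z.shape[1] - lag)
    C1 = 0.5 * (C1 + C1.T)
    _, U = np.linalg.eigh(C1)
    W = U.T @ V
    return W @ X, W
```

Sources are identifiable here only when their lagged autocorrelations differ, which is precisely the kind of temporal prior the abstract refers to.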
Digital Signal Processing | 2004
Pablo Bernabeu; Luis Vergara; I. Bosh; Jorge Igual
We propose a prediction/detection scheme for automatic forest fire surveillance by means of passive infrared sensors. Prediction takes advantage of the highly correlated environment in the infrared band to improve the signal-to-noise ratio. We have observed that, in general, the data are non-Gaussian; hence nonlinear prediction improves predictor performance. In particular, we consider the nonlinear Wiener system. In addition, the prediction step allows Gaussianity to be assumed in the detector design. A specific problem in the detection step is distinguishing uncontrolled fire from what we call occasional effects. This situation justifies basing the detection on a vector signature. We exploit the expected characteristics of fire signatures by means of two different detectors: a matched subspace detector and a detector that exploits the presence of increasing trends in the signature (increase detector). The fusion of the two decisions is also considered. Experiments on real data validate the merit of the proposed scheme.
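The two detectors named in the abstract can be sketched in a few lines; the following is a minimal illustration under a white-Gaussian-noise assumption, with all names and thresholds chosen for the example rather than taken from the paper:

```python
import numpy as np

def matched_subspace_stat(x, H, sigma2=1.0):
    """GLRT-style statistic for a signal lying in the subspace spanned
    by the columns of H, in white Gaussian noise of variance sigma2.
    The noise model and parameter names are illustrative assumptions.
    """
    # Orthogonal projector onto the signature subspace
    P = H @ np.linalg.pinv(H)
    return float(x @ P @ x) / sigma2

def increase_stat(x):
    """Crude 'increase detector': fraction of positive first
    differences, capturing the rising trend expected of a fire
    signature versus an occasional effect."""
    d = np.diff(x)
    return float((d > 0).mean())
```

Each statistic would be compared against a threshold set for a desired false-alarm rate, and the two decisions fused, as the abstract indicates.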
Digital Signal Processing | 2011
Raul Llinares; Jorge Igual; Addisson Salazar; Andres Camacho
Atrial fibrillation is the most common human arrhythmia. During atrial fibrillation episodes, the surface electrocardiogram contains the linear superposition of the atrial and ventricular rhythms in addition to other non-cardiac artifacts. Since these signals can be considered statistically independent, a Blind Source Separation (BSS) approach fits the problem properly. The signal that contains useful clinical information is the atrial one. We present a solution that focuses on the extraction of the atrial activity, simultaneously enforcing the statistical and temporal properties of the atrial signal. In addition, we propose the use of kurtosis as a parameter to measure the quality of the extraction. The algorithm is applied successfully to synthetic and real data. It improves the extraction of the atrial signal in comparison to other BSS methods, recovers only the atrial rhythm of interest using the information contained in all the leads, and reduces the computational cost. The results obtained are highly satisfactory, with an average spectral concentration of 53.9%, a kurtosis value of -0.04, a ventricular residue of 2.98 and 4.77% of significant QRS residua over a database of thirty patients.
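The two quality measures quoted in the abstract, kurtosis and spectral concentration, are straightforward to compute; the sketch below shows standard definitions, with parameter names and the band width chosen for illustration rather than taken from the paper:

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis: near 0 for a Gaussian signal, markedly
    positive for ventricular (QRS) residua, so a value close to the
    atrial signal's typical kurtosis indicates a clean extraction."""
    x = (x - x.mean()) / x.std()
    return float((x**4).mean() - 3.0)

def spectral_concentration(x, fs, f0, band=1.0):
    """Fraction of spectral power inside [f0 - band, f0 + band] Hz
    around the dominant atrial frequency f0 (names are illustrative)."""
    f = np.fft.rfftfreq(len(x), 1.0 / fs)
    p = np.abs(np.fft.rfft(x))**2
    mask = (f >= f0 - band) & (f <= f0 + band)
    return float(p[mask].sum() / p.sum())
```

A high spectral concentration around the fibrillatory frequency together with a near-zero kurtosis is the pattern the abstract reports for a well-extracted atrial signal.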
Artificial Intelligence in Medicine | 2009
Raul Llinares; Jorge Igual
OBJECTIVES The extraction of the atrial activity during atrial fibrillation episodes is essential for clinical purposes. During atrial fibrillation, the independent atrial and ventricular signals are superposed in the electrocardiogram, fulfilling the independent component analysis (ICA) model. We propose three new algorithms that constrain the classical ICA solution to fit the spectral content of the atrial component. This constraint allows the problem to be stated in terms of semi-blind source extraction instead of blind source separation (BSS), in the sense that we recover only one source and exploit prior information about the sources in the extraction process. METHODS AND MATERIALS The methods are extensions of classical BSS methods based on second- and higher-order statistics. We exploit the prior assumptions about the sources to obtain source extraction algorithms focused on the atrial component. The material corresponds to 10 synthetic recordings, used to measure and compare the quality of the different algorithms, and 66 real recordings from two databases: one public database from Physionet and one database from the Clinical University Hospital, Valencia, Spain. RESULTS We have analyzed the performance of the three new algorithms and compared it with that of traditional ICA algorithms. For the synthetic data the mean square error can be computed, so the comparison is straightforward. The new methods outperform the non-constrained versions while also simplifying the solution, since they do not need to recover all the components in order to estimate the atrial activity; i.e., the new methods focus on the atrial activity, so the extraction is stopped once the atrial signal is recovered.
CONCLUSIONS We have shown that the ICA-only versions of the algorithms can be improved and adapted to incorporate the prior information about the characteristics of the atrial activity. This modification yields new algorithms with the following advantages over ICA-only solutions: they exploit prior information during the extraction rather than in the post-processing identification of the atrial signal; they extract only the clinically interesting signal instead of all the components; and they outperform the ICA-only versions, improving the estimation of the atrial signal.
Neurocomputing | 2003
Jorge Igual; Luis Vergara; Andres Camacho; Ramón Miralles
In the Independent Component Analysis (ICA) problem, a linear transformation of the original statistically independent sources is observed. ICA algorithms usually do not include any prior information about the mixing matrix that models the linear transformation. In this paper we investigate, in a general framework, how the criterion functions can be modified when prior information about the entries of the mixing matrix is available. We find that the prior can be introduced naturally into the ICA formulation, so a direct modification of traditional algorithms can be carried out. Including prior information in the learning rule not only improves convergence properties but also extends the application of ICA techniques to data that do not exactly satisfy the ICA assumptions.
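One simple way to modify a learning rule along these lines is to add a penalty pulling the de-mixing matrix toward a prior guess. The sketch below adds a quadratic (Gaussian-prior) penalty to a natural-gradient ICA update with a tanh score; the penalty form, parameter names, and step sizes are assumptions for illustration, not the paper's derivation:

```python
import numpy as np

def ica_with_prior(X, W0, lam=0.1, lr=0.01, n_iter=500):
    """Natural-gradient ICA (tanh score, suited to super-Gaussian
    sources) with a Gaussian prior on the de-mixing matrix: the update
    adds a term -lam * (W - W0) pulling W toward the prior guess W0.
    Minimal sketch under stated assumptions.
    """
    n, T = X.shape
    W = W0.copy()
    for _ in range(n_iter):
        Y = W @ X
        g = np.tanh(Y)
        # natural gradient of the ICA log-likelihood
        dW = (np.eye(n) - g @ Y.T / T) @ W
        W += lr * (dW - lam * (W - W0))
    return W
```

The trade-off parameter `lam` plays the role of the prior's confidence: a large value keeps the solution near `W0`, while `lam = 0` recovers the unconstrained algorithm.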
Sensors | 2015
Jorge Igual; Addisson Salazar; Gonzalo Safont; Luis Vergara
The detection and identification of internal defects in a material require the use of some technology that translates the hidden interior damage into observable signals with distinct signature-defect correspondences. We apply impact-echo techniques for this purpose. The materials are classified according to their defective status (homogeneous, one defect or multiple defects) and kind of defect (hole or crack, passing through or not). Every specimen is impacted by a hammer, and the spectrum of the propagated wave is recorded. This spectrum is the input to a Bayesian classifier based on modeling the class-conditional probabilities with a mixture of Gaussians. The parameters of the Gaussian mixtures and the class probabilities are estimated using an extended expectation-maximization algorithm. The advantage of our proposal is its flexibility: it obtains good results for a wide range of models even under little supervision, e.g., a harmonic mean of precision and recall of 92.38% given only a 10% supervision ratio. We test the method on real specimens made of aluminum alloy, and the results show that the algorithm performs very well. This technique could be applied to many industrial problems, such as the optimization of the marble cutting process.
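The semi-supervised EM idea can be sketched compactly: labeled spectra keep fixed one-hot responsibilities while unlabeled ones are updated in the E-step. The simplified version below uses one Gaussian per class rather than a full mixture, and all names, the label convention (`-1` for unlabeled), and the regularization are assumptions for the example:

```python
import numpy as np

def semi_supervised_gmm(X, y, n_classes, n_iter=50):
    """EM for a per-class Gaussian model where y >= 0 marks a known
    label and y == -1 an unlabeled point. A simplified sketch of the
    extended EM described above, not the paper's implementation."""
    n, d = X.shape
    # responsibilities: fixed one-hot for labeled points, else uniform
    R = np.full((n, n_classes), 1.0 / n_classes)
    labeled = y >= 0
    R[labeled] = np.eye(n_classes)[y[labeled]]
    for _ in range(n_iter):
        # M-step: weighted class priors, means, covariances
        Nk = R.sum(axis=0)
        pi = Nk / n
        mu = (R.T @ X) / Nk[:, None]
        logp = np.zeros((n, n_classes))
        for k in range(n_classes):
            Xc = X - mu[k]
            cov = (R[:, k, None] * Xc).T @ Xc / Nk[k] + 1e-6 * np.eye(d)
            sign, logdet = np.linalg.slogdet(cov)
            maha = np.einsum('ij,jk,ik->i', Xc, np.linalg.inv(cov), Xc)
            logp[:, k] = np.log(pi[k]) - 0.5 * (logdet + maha)
        # E-step: update only the unlabeled responsibilities
        P = np.exp(logp - logp.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)
        R[~labeled] = P[~labeled]
    return mu, R
```

With only a small labeled fraction anchoring each class, the unlabeled points still shape the class densities, which is what makes the low-supervision regime reported in the abstract workable.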
international conference on independent component analysis and signal separation | 2004
Addisson Salazar; Luis Vergara; Jorge Igual; Jorge Gosálbez; Ramón Miralles
This article presents an ICA model for application in Non-Destructive Testing by Impact-Echo. The approach consists of considering flaws inside the material as sources for blind separation using ICA. The material is excited by a hammer impact, and a convolutive mixture is sensed by a multichannel system. The obtained information is used to classify the material as defective or non-defective. Results based on simulation by the finite element method are presented, including different defect geometries and locations.
international symposium on neural networks | 2007
Addisson Salazar; Jorge Igual; Luis Vergara; Arturo Serrano
This paper presents a novel procedure to classify data from mixtures of independent component analyzers. The procedure includes two stages: learning the parameters of the mixtures (basis vectors and bias terms) and clustering the ICA mixtures following a bottom-up agglomerative scheme to construct a hierarchy for classification. The approach for the estimation of the source probability density function is non-parametric, and the minimum Kullback-Leibler distance is used as a criterion for merging clusters at each level of the hierarchy. Validation of the proposed method is presented through several simulations, including ICA mixtures with uniform and Laplacian source distributions, and by processing real data from impact-echo testing experiments.
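As a concrete illustration of a KL-based merging criterion, the symmetric Kullback-Leibler divergence between two multivariate Gaussians has a closed form; the sketch below uses it as a stand-in for the distance minimized at each agglomeration level (a simplification, since the paper's source densities are non-parametric):

```python
import numpy as np

def sym_kl_gaussians(mu0, S0, mu1, S1):
    """Symmetric Kullback-Leibler divergence between two multivariate
    Gaussians N(mu0, S0) and N(mu1, S1); the closest pair of clusters
    under this distance would be merged first."""
    def kl(m0, C0, m1, C1):
        d = len(m0)
        C1inv = np.linalg.inv(C1)
        dm = m1 - m0
        return 0.5 * (np.trace(C1inv @ C0) + dm @ C1inv @ dm - d
                      + np.log(np.linalg.det(C1) / np.linalg.det(C0)))
    return kl(mu0, S0, mu1, S1) + kl(mu1, S1, mu0, S0)
```

Symmetrizing matters here because plain KL divergence is asymmetric, which would make the merge order depend on which cluster is taken as the reference.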
signal processing systems | 2004
Jorge Igual; Andres Camacho; Luis Vergara
Sinusoidal interferences are found in ultrasonic signals when we try to characterize a material, for example interferences coming from PC cards. We are interested in a robust method that cancels these interferences while preserving the waveform of the signal. A Blind Source Separation method to extract these sinusoids is presented in this paper. We obtain as many linear mixtures of the material's backscattering echo and the sinusoids as needed from different pulse responses of the material.