Publication


Featured research published by Zulima Fernández-Muñiz.


Soft Methods in Probability and Statistics | 2010

Inverse Problems and Model Reduction Techniques

Juan Luis Fernández-Martínez; Michael J. Tompkins; Zulima Fernández-Muñiz; Tapan Mukerji

Real problems arise in engineering, industry, science, and technology. Inverse problems in real applications usually involve a large number of parameters to be reconstructed, owing to the accuracy required of the data predictions. This feature makes these problems highly underdetermined and ill-posed. Good prior information and regularization techniques are needed when using local optimization methods, but only a linear appraisal of the model uncertainty around the solution can be performed. The large number of parameters also precludes the use of global sampling methods for inverse problem solution and appraisal. In this paper we show how to construct different kinds of reduced bases using Principal Component Analysis (PCA), Singular Value Decomposition (SVD), the Discrete Cosine Transform (DCT), and the Discrete Wavelet Transform (DWT). The use of a reduced basis helps us to regularize the inverse problem and to find the set of equivalent models that fit the data within a prescribed tolerance and are compatible with the model prior.
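The reduced-basis idea can be illustrated with a minimal Python sketch (not the authors' code; the forward operator, prior ensemble, and dimensions below are synthetic placeholders): the principal components of a prior model ensemble define a low-dimensional basis, and the inverse problem is solved for the few basis coefficients instead of the full parameter vector.

```python
# Minimal sketch of solving an inverse problem in a PCA-reduced basis.
# Assumes a linear forward operator G and a prior ensemble of plausible models.
import numpy as np

rng = np.random.default_rng(0)
n_params, n_data, n_prior = 200, 50, 100

G = rng.normal(size=(n_data, n_params))              # toy forward operator
prior_models = rng.normal(size=(n_prior, n_params))  # prior model ensemble

# Build the reduced basis: principal components of the prior ensemble.
mean = prior_models.mean(axis=0)
_, _, Vt = np.linalg.svd(prior_models - mean, full_matrices=False)
k = 10                                               # retained components
B = Vt[:k].T                                         # (n_params, k) basis

d_obs = G @ rng.normal(size=n_params)                # synthetic observed data

# Solve the (now much smaller) problem for the k coefficients only:
# d_obs ≈ G (mean + B a)  =>  least squares for a.
a, *_ = np.linalg.lstsq(G @ B, d_obs - G @ mean, rcond=None)
m_est = mean + B @ a
print("data misfit:", np.linalg.norm(G @ m_est - d_obs))
```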


Materials | 2016

Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers

Esperanza García-Gonzalo; Zulima Fernández-Muñiz; Paulino José García Nieto; Antonio Bernardo Sánchez; Marta Fernández

The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut-and-fill mining in Canada. This empirical span design chart plots the critical span against the rock mass rating for the observed case histories and has been adopted by many mining operations for the initial span design of cut-and-fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable, and unstable groups. The main purpose of this paper is to present a new method for defining the rock stability areas of the critical span graph that applies machine learning classifiers (the support vector machine and the extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving the prediction output with data that reflect the local conditions of each mine.
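As a rough illustration of the classification step (not the paper's implementation; the case data, labels, and SVM hyperparameters below are invented placeholders), a support vector machine can be trained on (critical span, rock mass rating) pairs and used to predict the stability class of a new design:

```python
# Illustrative sketch of classifying (span, RMR) cases into stability classes
# with a support vector machine. The case histories here are synthetic.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Hypothetical case histories: columns are [critical span (m), rock mass rating].
X = np.column_stack([rng.uniform(2, 30, 120), rng.uniform(20, 90, 120)])
# Toy labels: 0 = stable, 1 = potentially unstable, 2 = unstable.
y = np.digitize(X[:, 0] - 0.3 * X[:, 1], bins=[-10, 0])

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)
print(clf.predict([[15.0, 60.0]]))  # predicted stability class for a new case
```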


International Conference on Bioinformatics and Biomedical Engineering | 2018

Sampling Defective Pathways in Phenotype Prediction Problems via the Holdout Sampler

Juan Luis Fernández-Martínez; Ana Cernea; Enrique J. deAndrés-Galiana; Francisco Javier Fernández-Ovies; Zulima Fernández-Muñiz; Oscar Alvarez-Machancoses; Leorey N. Saligan; Stephen T. Sonis

In this paper, we introduce the holdout sampler to find the defective pathways in highly underdetermined phenotype prediction problems. This sampling algorithm is inspired by the bootstrapping procedure used in regression analysis to establish confidence bounds. We show that working with partial information (data bags) serves to sample the linear uncertainty region in a simple regression problem, mainly along the axis of greatest uncertainty, which corresponds to the smallest singular value of the system matrix. Applied to a phenotype prediction problem, considered as a generalized prediction problem between the set of genetic signatures and the set of classes into which the phenotype is divided, this procedure serves to unravel the ensemble of altered pathways in the transcriptome that are involved in the development of the disease. The algorithm looks for the minimum-scale genetic signature in each random holdout, and the likelihood (predictive accuracy) is established on the validation dataset via a nearest-neighbor classifier. The posterior analysis identifies the header genes that appear most frequently across the different holdouts and are therefore robust to a partial lack of samples. These genes are used to establish the genetic pathways and the biological processes involved in the progression of the disease. This algorithm is much faster, more robust, and simpler than Bayesian Networks. We show its application to a microarray dataset concerning a type of breast cancer with poor prognosis (triple-negative breast cancer, TNBC).
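A minimal sketch of the holdout sampler's loop is given below, with several assumptions: the expression data are synthetic, the minimum-scale signature search is replaced by a simple mean-difference ranking, and the acceptance tolerance is arbitrary. Only the overall structure (random data bags, nearest-neighbor likelihood on the validation split, posterior frequency count) follows the description above.

```python
# Sketch of the holdout sampler: random data bags, a small signature per bag,
# likelihood via a nearest-neighbor classifier, and a posterior frequency
# count of the header genes.
import numpy as np
from collections import Counter
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 500))        # toy expression matrix (samples x genes)
y = rng.integers(0, 2, size=60)       # toy phenotype classes

counts, n_holdouts, sig_len = Counter(), 200, 10
for _ in range(n_holdouts):
    idx = rng.permutation(len(y))
    train, val = idx[:45], idx[45:]   # 75/25 random holdout (data bag)
    # Score genes by absolute mean difference between classes (a stand-in
    # for the minimum-scale signature search of the paper).
    diff = np.abs(X[train][y[train] == 0].mean(0) - X[train][y[train] == 1].mean(0))
    sig = np.argsort(diff)[-sig_len:]
    acc = KNeighborsClassifier(1).fit(X[np.ix_(train, sig)], y[train]) \
                                 .score(X[np.ix_(val, sig)], y[val])
    if acc >= 0.5:                    # keep signatures that fit the data
        counts.update(sig.tolist())

print(counts.most_common(5))          # most frequently sampled "header genes"
```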


Geophysical Prospecting | 2017

Uncertainty analysis and probabilistic segmentation of electrical resistivity images: the 2D inverse problem

Juan Luis Fernández-Martínez; Shan Xu; Colette Sirieix; Zulima Fernández-Muñiz; Joëlle Riss

In this paper, we present the uncertainty analysis of the 2D electrical tomography inverse problem using model reduction and performing the sampling via an explorative member of the Particle Swarm Optimization (PSO) family, called the Regressive-Regressive PSO (RR-PSO). The procedure begins with a local inversion to find a good resistivity model located in the nonlinear equivalence region of the set of plausible solutions. The dimension of this geophysical model is then reduced using spectral decomposition, and the uncertainty space is explored via PSO. Using this approach, we show that it is possible to sample the uncertainty space of the electrical tomography inverse problem. We illustrate the methodology on a synthetic dataset and on a real dataset from a karstic geological setting. Computing the uncertainty of the inverse solution makes it possible to segment the resistivity images issued from the inversion. This segmentation is based on the set of equivalent models that have been sampled, and makes it possible to answer geophysical questions in a probabilistic way and to perform risk analysis.
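The sampling stage can be sketched as follows, with the caveat that a standard PSO is shown rather than the RR-PSO variant used in the paper, and a toy quadratic misfit stands in for the resistivity forward problem; every model whose misfit falls below a prescribed tolerance is retained as a member of the equivalence region.

```python
# Simplified sketch: a PSO swarm explores a reduced parameter space and the
# models with misfit below a tolerance are collected as equivalent models.
import numpy as np

rng = np.random.default_rng(3)

def misfit(m):                        # toy stand-in for the forward problem
    return np.sum((m - 1.0) ** 2, axis=-1)

n_particles, n_dim, tol = 30, 5, 0.5
x = rng.uniform(-3, 3, (n_particles, n_dim))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), misfit(x)
equivalent_models = []

for _ in range(100):
    gbest = pbest[np.argmin(pbest_f)]
    r1, r2 = rng.random((2, n_particles, 1))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    f = misfit(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    equivalent_models.extend(x[f < tol])   # sample the equivalence region

print(len(equivalent_models), "equivalent models sampled")
```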


International Conference on Bioinformatics and Biomedical Engineering | 2018

Sampling Defective Pathways in Phenotype Prediction Problems via the Fisher’s Ratio Sampler

Ana Cernea; Juan Luis Fernández-Martínez; Enrique J. deAndrés-Galiana; Francisco Javier Fernández-Ovies; Zulima Fernández-Muñiz; Oscar Alvarez-Machancoses; Leorey N. Saligan; Stephen T. Sonis

In this paper, we introduce the Fisher's ratio sampler, which serves to unravel the defective pathways in highly underdetermined phenotype prediction problems. This sampling algorithm first selects the most discriminatory genes, which are at the same time differentially expressed, and samples the highly discriminatory genetic networks with a prior probability that is proportional to their individual Fisher's ratio. The number of genes in the different networks is established randomly, taking into account the length of the minimum-scale signature of the phenotype prediction problem, which is the one that contains the most discriminatory genes with the maximum predictive power. The likelihood of the different networks is established via leave-one-out cross-validation. Finally, the posterior analysis of the most frequently sampled genes serves to establish the defective biological pathways. This novel sampling algorithm is much faster and simpler than Bayesian Networks. We show its application to a microarray dataset concerning a type of breast cancer with very poor prognosis (triple-negative breast cancer, TNBC). In this kind of cancer, the breast cancer cells have tested negative for human epidermal growth factor receptor 2 (HER-2), estrogen receptors (ER), and progesterone receptors (PR), so common treatments such as hormone therapy and drugs that target estrogen, progesterone, and HER-2 are ineffective. We believe that the genetic pathways identified via the Fisher's ratio sampler, which are mainly related to signaling pathways, provide new insights into the molecular mechanisms involved in this complex disease. The Fisher's ratio sampler can also be applied to the genetic analysis of other complex diseases.
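A hedged sketch of the sampler's core loop (synthetic data; the network-size rule and classifier settings are assumptions) shows the three ingredients described above: per-gene Fisher's ratios as a sampling prior, random network sizes, and LOOCV accuracy as the likelihood.

```python
# Sketch of the Fisher's ratio sampler: genes are drawn with probability
# proportional to their Fisher's ratio, and each sampled network is scored
# by leave-one-out cross-validation accuracy.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(40, 300))        # toy expression matrix
y = rng.integers(0, 2, 40)            # toy phenotype classes

m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
v0, v1 = X[y == 0].var(0), X[y == 1].var(0)
fisher = (m0 - m1) ** 2 / (v0 + v1 + 1e-12)   # Fisher's ratio per gene
p = fisher / fisher.sum()                      # sampling prior

for _ in range(5):                             # a few sampled networks
    size = rng.integers(5, 15)                 # random network size
    net = rng.choice(X.shape[1], size=size, replace=False, p=p)
    loocv = cross_val_score(KNeighborsClassifier(1), X[:, net], y,
                            cv=LeaveOneOut()).mean()
    print(f"network of {size} genes, LOOCV accuracy = {loocv:.2f}")
```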


International Conference on Bioinformatics and Biomedical Engineering | 2018

Comparison of Different Sampling Algorithms for Phenotype Prediction

Ana Cernea; Juan Luis Fernández-Martínez; Enrique J. deAndrés-Galiana; Francisco Javier Fernández-Ovies; Zulima Fernández-Muñiz; Oscar Alvarez-Machancoses; Leorey N. Saligan; Stephen T. Sonis

In this paper, we compare different sampling algorithms used for identifying the defective pathways in highly underdetermined phenotype prediction problems. The first algorithm (the Fisher's ratio sampler) selects the most discriminatory genes and samples the highly discriminatory genetic networks according to a prior probability that is proportional to their individual Fisher's ratio. The second (the holdout sampler) is inspired by the bootstrapping procedure used in regression analysis and uses the minimum-scale signatures found in different random holdouts to establish the most frequently sampled genes. The third is a pure random sampler, which randomly builds networks of differentially expressed genes. In all these algorithms, the likelihood of the different networks is established via leave-one-out cross-validation (LOOCV), and the posterior analysis of the most frequently sampled genes serves to establish the altered biological pathways. These algorithms are compared to the results obtained via Bayesian Networks (BNs). We show the application of these algorithms to a microarray dataset concerning Triple Negative Breast Cancer. This comparison shows that the Random, Fisher's ratio, and Holdout samplers are more effective than BNs, and all provide similar insights about the genetic mechanisms involved in this disease. Therefore, it can be concluded that all these samplers are good alternatives to Bayesian Networks, with much lower computational demands. Moreover, this analysis confirms the insight that the altered pathways should be independent of the sampling methodology and of the classifier used to infer them.
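For completeness, the simplest of the three methods, the pure random sampler, can be sketched as follows (synthetic data; the differential-expression rule is a stand-in): networks are drawn uniformly from the differentially expressed genes and scored by LOOCV.

```python
# Sketch of the pure random sampler: uniform draws from the differentially
# expressed genes, scored by LOOCV, so any enrichment of a gene across
# high-likelihood networks is data-driven.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(40, 300))
y = rng.integers(0, 2, 40)            # toy data

# "Differentially expressed" genes: largest absolute mean difference (toy rule).
de_genes = np.argsort(np.abs(X[y == 0].mean(0) - X[y == 1].mean(0)))[-50:]

for _ in range(5):
    net = rng.choice(de_genes, size=10, replace=False)   # uniform draw
    acc = cross_val_score(KNeighborsClassifier(1), X[:, net], y,
                          cv=LeaveOneOut()).mean()
    print(f"random network LOOCV accuracy = {acc:.2f}")
```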


International Conference on Artificial Intelligence and Soft Computing | 2018

On the Use of Principal Component Analysis and Particle Swarm Optimization in Protein Tertiary Structure Prediction

Óscar Álvarez; Juan Luis Fernández-Martínez; Celia Fernández-Brillet; Ana Cernea; Zulima Fernández-Muñiz; Andrzej Kloczkowski

We discuss the applicability of Principal Component Analysis and Particle Swarm Optimization in protein tertiary structure prediction. The proposed algorithm is based on establishing a low-dimensional space where the sampling (and optimization) is carried out via a Particle Swarm Optimizer (PSO). The reduced space is found via Principal Component Analysis (PCA) performed on a set of previously found low-energy protein models. A high frequency term is added to this expansion by projecting the best decoy onto the PCA basis set and calculating the residual model. Our results show that PSO improves the energy of the best decoy used in the PCA when an adequate number of PCA terms is considered.
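The model expansion described above can be sketched numerically (toy coordinates; the basis size is an assumption): a PCA basis is built from the decoy ensemble, the best decoy is projected onto it, and the residual provides the high frequency term.

```python
# Sketch of the expansion: PCA basis from low-energy decoys plus a residual
# "high frequency" term from projecting the best decoy onto that basis.
import numpy as np

rng = np.random.default_rng(6)
decoys = rng.normal(size=(50, 3 * 120))  # 50 decoys, 120 residues (flattened xyz)
best = decoys[0]                         # assume the lowest-energy decoy

mean = decoys.mean(0)
_, _, Vt = np.linalg.svd(decoys - mean, full_matrices=False)
B = Vt[:8].T                             # reduced PCA basis (8 terms)

coef = B.T @ (best - mean)               # projection of the best decoy
residual = best - (mean + B @ coef)      # high frequency term of the expansion

# Any model explored by PSO is then: mean + B @ a + t * residual,
# with the coefficients (a, t) as the reduced search space.
print("residual norm:", np.linalg.norm(residual))
```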


Archive | 2014

The Effect of the Noise and the Regularization in Inverse Problems: Geophysical Implications

J. L. G. Pallero; Juan Luis Fernández-Martínez; Zulima Fernández-Muñiz; L.M. Pedruelo-González

The solution of geophysical inverse problems has an intrinsic uncertainty that is mainly caused by noise in the data, incomplete data sampling, and simplified physics. This paper analyzes the roles of noise in the data and of regularization in nonlinear inverse problems. We prove that noise deforms the topography of the cost function non-homogeneously, generally shrinking the regions of low misfit. As a result of this deformation, finding the global optimum by direct search methods becomes more difficult. Nevertheless, noise acts similarly to a regularization when local optimization methods are used. Tikhonov's regularization transforms the linearized hyper-quadric of equivalence from an elliptical cylinder into a very oblong ellipsoid in the directions that originally spanned the kernel of the linearized forward operator in the absence of regularization, and also deforms the regions of equivalence anisotropically. Prior models in the regularization term serve to inform the components of the solution that locally belong to the kernel of the Jacobian. Unfortunately, regularization does not make the nonlinear equivalent models disappear, so a full nonlinear uncertainty analysis is still needed.
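A toy numeric example makes both effects concrete (a hypothetical rank-deficient operator; the noise level and damping are arbitrary): without regularization the misfit is flat along the kernel direction, so equivalent models form an unbounded valley, while the Tikhonov term bounds it.

```python
# Toy sketch: noisy data shift the misfit landscape, and Tikhonov's term
# turns a flat valley of equivalent models (kernel direction) into a bounded one.
import numpy as np

rng = np.random.default_rng(7)
G = np.array([[1.0, 1.0]])                 # 1 datum, 2 parameters: rank-deficient
d = np.array([2.0])
noise = 0.1 * rng.normal(size=1)

def cost(m, lam):                          # Tikhonov-regularized misfit
    return np.sum((G @ m - (d + noise)) ** 2) + lam * np.sum(m ** 2)

# Along the kernel direction (1, -1) the unregularized misfit is constant:
for t in (0.0, 1.0, 2.0):
    m = np.array([1.0 + t, 1.0 - t])
    print(f"t={t}: no reg -> {cost(m, 0.0):.3f}, with reg -> {cost(m, 0.1):.3f}")
```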


Journal of Applied Geophysics | 2013

From Bayes to Tarantola: New insights to understand uncertainty in inverse problems

Juan Luis Fernández-Martínez; Zulima Fernández-Muñiz; J. L. G. Pallero; L.M. Pedruelo-González


Journal of Applied Geophysics | 2014

The effect of noise and Tikhonov's regularization in inverse problems. Part I: The linear case

Juan Luis Fernández-Martínez; J. L. G. Pallero; Zulima Fernández-Muñiz; L.M. Pedruelo-González

Collaboration


Dive into Zulima Fernández-Muñiz's collaborations.

Top Co-Authors

J. L. G. Pallero

Technical University of Madrid

Leorey N. Saligan

National Institutes of Health
