Paloma Main
Complutense University of Madrid
Publications
Featured research published by Paloma Main.
Communications in Statistics-theory and Methods | 2007
Miguel A. Gómez-Villegas; Paloma Main; Rosario Susi
This article develops a method for performing sensitivity analysis in a Gaussian Bayesian network. The measure presented is based on the Kullback–Leibler divergence and is useful for evaluating the impact of prior changes on the posterior marginal density of the target variable in the network. We find that some changes do not disturb the posterior marginal density of interest. Finally, we describe a method for comparing the different sensitivity measures obtained depending on where the inaccuracy lies. An example is used to illustrate the concepts and methods presented.
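To make the idea concrete, here is a minimal sketch, not taken from the paper: when both the original and the perturbed posterior marginals of the target node are univariate normal, the Kullback–Leibler divergence between them has a closed form. All numbers below are hypothetical.

```python
# Minimal sketch (not the authors' code): KL divergence between the posterior
# marginal of a target node under the original prior and under a perturbed prior,
# both univariate normal as in a Gaussian Bayesian network.
import numpy as np

def kl_normal(mu0, var0, mu1, var1):
    """KL( N(mu0, var0) || N(mu1, var1) ) for univariate normals."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

# Hypothetical posterior marginals of the target variable before/after a prior change.
mu_orig, var_orig = 1.20, 0.50
mu_pert, var_pert = 1.35, 0.65

print(kl_normal(mu_orig, var_orig, mu_pert, var_pert))
```

A divergence close to zero would indicate that the prior change barely disturbs the posterior marginal of interest.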
Cancer Research | 2015
Angelo Gámez-Pozo; Julia Berges-Soria; Jorge M. Arevalillo; Paolo Nanni; Rocío López-Vacas; Hilario Navarro; Jonas Grossmann; Carlos A. Castaneda; Paloma Main; Mariana Díaz-Almirón; Enrique Espinosa; Eva Ciruelos; Juan Ángel Fresno Vara
Better knowledge of the biology of breast cancer has allowed the use of new targeted therapies, leading to improved outcomes. High-throughput technologies allow deeper insight into the molecular architecture of breast cancer by integrating different levels of information, which is valuable when it informs clinical decisions. microRNA (miRNA) and protein expression profiles were obtained from 71 estrogen receptor-positive (ER(+)) and 25 triple-negative breast cancer (TNBC) samples. RNA and proteins obtained from formalin-fixed, paraffin-embedded tumors were analyzed by RT-qPCR and LC/MS-MS, respectively. We applied probabilistic graphical models representing complex biologic systems as networks, confirming that ER(+) and TNBC subtypes are distinct biologic entities. The integration of miRNA and protein expression data unravels molecular processes that can be related to differences in the genesis and clinical evolution of these types of breast cancer. Our results confirm that TNBC has a unique metabolic profile that may be exploited for therapeutic intervention.
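The abstract does not spell out the exact graphical-model procedure used; as a hedged illustration only, the sketch below fits a sparse Gaussian graphical model (graphical lasso) to a simulated expression matrix standing in for the integrated miRNA/protein data. All sizes, values, and the choice of graphical lasso are assumptions.

```python
# Illustrative sketch only: fit a sparse Gaussian graphical model to a simulated
# expression matrix (samples x features) and count the recovered edges.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
n_samples, n_features = 96, 40            # e.g. tumors x (miRNAs + proteins), made up
X = rng.normal(size=(n_samples, n_features))

model = GraphicalLasso(alpha=0.2).fit(X)
precision = model.precision_              # sparse precision matrix
edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))
print(f"{len(edges)} edges recovered among {n_features} nodes")
```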
Statistics | 2009
Miguel Angel Gómez Villegas; Paloma Main; Luis Sanz
A Bayesian test for the point null testing problem in the multivariate case is developed. A procedure for obtaining the mixed distribution from the prior density is suggested. For comparisons between the Bayesian and classical approaches, lower bounds on posterior probabilities of the null hypothesis, over some reasonable classes of prior distributions, are computed and compared with the p-value of the classical test. With our procedure, a better approximation is obtained because the p-value falls within the range of the Bayesian measures of evidence.
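A rough sketch of the kind of comparison the abstract describes, under simplifying assumptions (known sampling covariance, a conjugate normal prior under the alternative, and a point mass pi0 on the null); it is not the authors' procedure, only an illustration of contrasting the classical p-value with a posterior probability of H0.

```python
# Sketch under stated assumptions: posterior probability of a multivariate point
# null H0: theta = theta0 for a normal mean, compared with the chi-square p-value.
import numpy as np
from scipy.stats import chi2, multivariate_normal

rng = np.random.default_rng(1)
p, n = 3, 30
theta0 = np.zeros(p)
Sigma = np.eye(p)                                   # known sampling covariance
x = rng.multivariate_normal(theta0 + 0.4, Sigma, size=n)
xbar = x.mean(axis=0)

# Classical side: chi-square test of H0: theta = theta0.
d = xbar - theta0
stat = n * d @ np.linalg.solve(Sigma, d)
p_value = chi2.sf(stat, df=p)

# Bayesian side: point mass pi0 on H0 and a N(theta0, tau2 * I) prior under H1;
# the posterior probability of H0 follows from the two marginal densities of xbar.
pi0, tau2 = 0.5, 1.0
m0 = multivariate_normal(theta0, Sigma / n).pdf(xbar)
m1 = multivariate_normal(theta0, Sigma / n + tau2 * np.eye(p)).pdf(xbar)
post_H0 = 1.0 / (1.0 + (1.0 - pi0) / pi0 * m1 / m0)
print(f"p-value = {p_value:.4f}   P(H0 | data) = {post_H0:.4f}")
```

Varying tau2 over a class of priors and taking the infimum of post_H0 would give a lower bound of the type the paper compares with the p-value.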
Reliability Engineering & System Safety | 2009
Paloma Main; Hilario Navarro
Gaussian Bayesian networks are graphical models that represent the dependence structure of a multivariate normal random variable with a directed acyclic graph (DAG). In Gaussian Bayesian networks the output is usually the conditional distribution of some unknown variables of interest given a set of evidential nodes whose values are known. Uncertainty about the assumption of normality is very common in applications, so a sensitivity analysis of the effect of non-normality on the conclusions may be necessary. The aspect of non-normality considered here is tail behavior. In this respect, the multivariate exponential power distribution is a family depending on a kurtosis parameter that ranges from leptokurtic to platykurtic forms, with the normal as the mesokurtic case. A more general model can therefore be obtained by using the multivariate exponential power distribution to describe the joint distribution of a Bayesian network, with the kurtosis parameter reflecting deviations from normality. The sensitivity of the conclusions to this perturbation is analyzed using the Kullback–Leibler divergence measure, which provides a convenient formula for evaluating the effect.
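As a hedged illustration of the non-normality effect, the sketch below uses one common parametrization of the multivariate exponential power family (beta = 1 recovers the normal) and estimates KL(EP_beta || Normal) by Monte Carlo; the parametrization, constants, and dimensions are assumptions for illustration, not the paper's exact formulas.

```python
# Sketch: multivariate exponential power density, a radial sampler, and a Monte
# Carlo estimate of the KL divergence from the matching normal distribution.
import numpy as np
from scipy.special import gammaln
from scipy.stats import multivariate_normal

def ep_logpdf(x, mu, Sigma, beta):
    """Log-density f(x) = k(n, beta) |Sigma|^{-1/2} exp{-0.5 [(x-mu)' Sigma^{-1} (x-mu)]^beta}."""
    n = len(mu)
    diff = np.atleast_2d(x) - mu
    Q = np.einsum('ki,ij,kj->k', diff, np.linalg.inv(Sigma), diff)
    log_k = (np.log(n) + gammaln(n / 2) - (n / 2) * np.log(np.pi)
             - gammaln(1 + n / (2 * beta)) - (1 + n / (2 * beta)) * np.log(2))
    _, logdet = np.linalg.slogdet(Sigma)
    return log_k - 0.5 * logdet - 0.5 * Q ** beta

def ep_sample(size, mu, Sigma, beta, rng):
    """Radial representation: uniform direction times a radius R with
    0.5 * R^(2*beta) distributed as Gamma(n / (2*beta), 1)."""
    n = len(mu)
    L = np.linalg.cholesky(Sigma)
    u = rng.normal(size=(size, n))
    u /= np.linalg.norm(u, axis=1, keepdims=True)          # uniform on the sphere
    r = (2.0 * rng.gamma(n / (2.0 * beta), 1.0, size=size)) ** (1.0 / (2.0 * beta))
    return mu + r[:, None] * (u @ L.T)

rng = np.random.default_rng(2)
mu, Sigma, beta = np.zeros(3), np.eye(3), 1.5               # beta != 1 departs from normality
X = ep_sample(20000, mu, Sigma, beta, rng)
kl_mc = np.mean(ep_logpdf(X, mu, Sigma, beta) - multivariate_normal(mu, Sigma).logpdf(X))
print(f"Monte Carlo estimate of KL(EP_beta || Normal): {kl_mc:.4f}")
```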
Information Sciences | 2013
Miguel A. Gómez-Villegas; Paloma Main; Rosario Susi
In this work we study the effects of model inaccuracies on the description of a Gaussian Bayesian network with a set of variables of interest and a set of evidential variables. Using the Kullback-Leibler divergence measure, we compare the output of two different networks after evidence propagation: the original network, and a network with perturbations representing uncertainties in the quantitative parameters. We describe two methods for analyzing the sensitivity and robustness of a Gaussian Bayesian network on this basis. In the sensitivity analysis, different expressions are obtained depending on which set of parameters is considered inaccurate. This fact makes it possible to determine the set of parameters that most strongly disturbs the network output. If all of the divergences are small, we can conclude that the network output is insensitive to the proposed perturbations. The robustness analysis is similar, but considers all potential uncertainties jointly. It thus yields only one divergence, which can be used to confirm the overall sensitivity of the network. Some practical examples of this method are provided, including a complex, real-world problem.
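A minimal sketch of the workflow this abstract describes, with hypothetical numbers: condition the joint normal on the evidential variables (evidence propagation) in both the original and a perturbed network, then compare the two outputs with the Kullback–Leibler divergence between multivariate normals.

```python
# Sketch: evidence propagation by Gaussian conditioning, then KL divergence
# between the original and perturbed conditional distributions.
import numpy as np

def condition_gaussian(mu, Sigma, idx_e, x_e):
    """Distribution of the remaining variables given evidence X[idx_e] = x_e."""
    idx_i = [k for k in range(len(mu)) if k not in idx_e]
    S_ii = Sigma[np.ix_(idx_i, idx_i)]
    S_ie = Sigma[np.ix_(idx_i, idx_e)]
    S_ee = Sigma[np.ix_(idx_e, idx_e)]
    W = S_ie @ np.linalg.inv(S_ee)
    mu_c = mu[idx_i] + W @ (x_e - mu[idx_e])
    Sigma_c = S_ii - W @ S_ie.T
    return mu_c, Sigma_c

def kl_gaussians(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) for multivariate normals."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Hypothetical 3-node network: joint mean/covariance before and after perturbation.
mu = np.array([0.0, 1.0, 2.0])
Sigma = np.array([[1.0, 0.5, 0.3],
                  [0.5, 2.0, 0.8],
                  [0.3, 0.8, 1.5]])
Sigma_pert = Sigma + np.diag([0.0, 0.3, 0.0])       # inaccuracy in one variable's variance

evidence_idx, evidence_val = [2], np.array([2.5])   # observe the third variable
m0, S0 = condition_gaussian(mu, Sigma, evidence_idx, evidence_val)
m1, S1 = condition_gaussian(mu, Sigma_pert, evidence_idx, evidence_val)
print("KL after evidence propagation:", kl_gaussians(m0, S0, m1, S1))
```

Repeating the calculation with each group of parameters perturbed in turn gives the per-parameter sensitivities; perturbing all of them jointly gives the single robustness divergence described above.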
Information Sciences | 2014
Miguel A. Gómez-Villegas; Paloma Main; Paola Viviani
We introduce a methodology for sensitivity analysis of evidence variables in Gaussian Bayesian networks. Knowledge of the posterior probability distribution of the target variable in a Bayesian network, given a set of evidence, is desirable. However, this evidence is not always determined; in fact, additional information might be requested to improve the solution in terms of reducing uncertainty. In this study we develop a procedure, based on Shannon entropy and information theory measures, that allows us to prioritize information according to its utility in yielding a better result. Some examples illustrate the concepts and methods introduced.
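A small illustrative sketch, not the paper's formulation: for Gaussian networks the entropy reduction of the target obtained by observing a candidate evidence variable has a simple closed form, which can be used to rank which piece of evidence to request next. The covariance matrix below is hypothetical.

```python
# Sketch: rank candidate evidence variables by the Shannon-entropy reduction of
# the target's posterior in a Gaussian Bayesian network.
import numpy as np

def entropy_gain(Sigma, target, candidate):
    """Entropy reduction of the target when the candidate variable is observed.

    For Gaussians this equals -0.5 * log(1 - rho^2), where rho is the correlation
    between target and candidate, so it does not depend on the observed value."""
    rho2 = Sigma[target, candidate] ** 2 / (Sigma[target, target] * Sigma[candidate, candidate])
    return -0.5 * np.log(1.0 - rho2)

Sigma = np.array([[1.0, 0.5, 0.3, 0.1],
                  [0.5, 2.0, 0.8, 0.2],
                  [0.3, 0.8, 1.5, 0.4],
                  [0.1, 0.2, 0.4, 1.0]])
target = 0
gains = {j: entropy_gain(Sigma, target, j) for j in range(1, 4)}
best = max(gains, key=gains.get)
print("entropy reduction per candidate:", gains, "-> request variable", best)
```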
Expert Systems With Applications | 2011
Miguel A. Gómez-Villegas; Paloma Main; Hilario Navarro; Rosario Susi
In this work, we evaluate the sensitivity of Gaussian Bayesian networks to perturbations or uncertainties in the regression coefficients of the network arcs and the conditional distributions of the variables. The Kullback-Leibler divergence measure is used to compare the original network to its perturbation. By setting the regression coefficients to zero or non-zero values, the proposed method can remove or add arcs, making it possible to compare different network structures. The methodology is implemented with some case studies.
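A hedged sketch of the arc-perturbation idea on a hypothetical three-node chain: rebuild the joint covariance from the arc regression coefficients and conditional variances, set one coefficient to zero (removing the arc), and measure the effect with the Kullback–Leibler divergence between the two joint normals.

```python
# Sketch: joint covariance of a Gaussian Bayesian network from its arc
# coefficients, and the KL effect of removing one arc.
import numpy as np

def joint_covariance(B, cond_vars):
    """Joint covariance from B[j, i] = coefficient of parent j in child i's regression."""
    n = B.shape[0]
    A = np.linalg.inv(np.eye(n) - B.T)
    return A @ np.diag(cond_vars) @ A.T

def kl_gaussians(mu0, S0, mu1, S1):
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

mu = np.zeros(3)
cond_vars = np.array([1.0, 0.5, 0.8])       # conditional variances of X1, X2, X3
B = np.array([[0.0, 2.0, 0.0],              # arc X1 -> X2 with coefficient 2
              [0.0, 0.0, 1.5],              # arc X2 -> X3 with coefficient 1.5
              [0.0, 0.0, 0.0]])

B_pert = B.copy()
B_pert[1, 2] = 0.0                           # setting the coefficient to zero removes X2 -> X3
S = joint_covariance(B, cond_vars)
S_pert = joint_covariance(B_pert, cond_vars)
print("KL(original || perturbed):", kl_gaussians(mu, S, mu, S_pert))
```

Setting a previously zero coefficient to a non-zero value adds an arc instead, so the same divergence can be used to compare alternative network structures.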
Applied Mathematics and Computation | 2013
Miguel A. Gómez-Villegas; Paloma Main; Hilario Navarro; Rosario Susi
The multivariate exponential power family is considered for n-dimensional random variables, Z, with a known partition Z=(Y,X) of dimensions p and n-p, respectively, with interest focusing on the conditional distribution Y|X. An infinitesimal variation of any parameter of the joint distribution produces perturbations in both the conditional and marginal distributions. The aim of the study was to determine the local effect of kurtosis deviations using the Kullback-Leibler divergence measure between probability distributions. The additive decomposition of this measure in terms of the conditional and marginal distributions, Y|X and X, is used to define a relative sensitivity measure of the conditional distribution family {Y|X=x}. Finally, simulated results suggest that for large dimensions the measure is approximately equal to the ratio p/n, so that the effect of non-normality with respect to kurtosis depends only on the relative sizes of the variables in the partition of the random vector.
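The additive decomposition used here is the chain rule for relative entropy; the sketch below checks it numerically in the Gaussian (mesokurtic) special case of the family rather than for the exponential power distribution itself, with made-up matrices.

```python
# Sketch: verify KL(f || g) = KL(f_X || g_X) + E_{f_X}[ KL(f_{Y|X} || g_{Y|X}) ]
# by Monte Carlo for a partition Z = (Y, X) of two multivariate normals.
import numpy as np

def kl_gaussians(mu0, S0, mu1, S1):
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def conditional(mu, S, p, x):
    """Mean and covariance of Y (first p coordinates) given X = x (the rest)."""
    Syy, Syx, Sxx = S[:p, :p], S[:p, p:], S[p:, p:]
    W = Syx @ np.linalg.inv(Sxx)
    return mu[:p] + W @ (x - mu[p:]), Syy - W @ Syx.T

rng = np.random.default_rng(3)
p, n = 1, 3                                  # Y is 1-dimensional, X is 2-dimensional
mu_f, mu_g = np.zeros(n), np.array([0.2, 0.0, -0.1])
A = rng.normal(size=(n, n))
S_f = A @ A.T + n * np.eye(n)
S_g = S_f + 0.4 * np.eye(n)

kl_joint = kl_gaussians(mu_f, S_f, mu_g, S_g)
kl_marg_x = kl_gaussians(mu_f[p:], S_f[p:, p:], mu_g[p:], S_g[p:, p:])

xs = rng.multivariate_normal(mu_f[p:], S_f[p:, p:], size=5000)
kl_cond = np.mean([kl_gaussians(*conditional(mu_f, S_f, p, x),
                                *conditional(mu_g, S_g, p, x)) for x in xs])
print(f"joint {kl_joint:.4f}  vs  marginal + conditional {kl_marg_x + kl_cond:.4f}")
```

The conditional term relative to the joint divergence plays the role of the relative sensitivity measure described in the abstract.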
Archive | 2011
Miguel A. Gómez-Villegas; Eusebio Gómez-Sánchez-Manzano; Paloma Main; Hilario Navarro
As an alternative to the multivariate normal distribution, we consider a wider class of distributions that includes the normal but allows slightly different tail behavior. This is the multivariate exponential power family, whose kurtosis parameter governs the possible forms of the distribution. To measure deviations between distributions, the Kullback-Leibler divergence is used as an asymmetric, information-theoretic dissimilarity measure. Thus, a local quantitative description of non-normality can be established for joint distributions in this family, as well as of the impact this perturbation has on the marginal and conditional distributions.
Scientific Reports | 2017
Angelo Gámez-Pozo; Lucia Trilla-Fuertes; Julia Berges-Soria; Nathalie Selevsek; Rocío López-Vacas; Mariana Díaz-Almirón; Paolo Nanni; Jorge M. Arevalillo; Hilario Navarro; Jonas Grossmann; Francisco Gayá Moreno; Rubén Gómez Rioja; Guillermo Prado-Vazquez; Andrea Zapater-Moros; Paloma Main; Jaime Feliu; Purificación Martínez del Prado; Pilar Zamora; Eva Ciruelos; Enrique Espinosa; Juan Ángel Fresno Vara
Breast cancer is a heterogeneous disease comprising a variety of entities with various genetic backgrounds. Estrogen receptor-positive, human epidermal growth factor receptor 2-negative tumors typically have a favorable outcome; however, some patients eventually relapse, which suggests some heterogeneity within this category. In the present study, we used proteomics and miRNA profiling techniques to characterize a set of 102 either estrogen receptor-positive (ER+)/progesterone receptor-positive (PR+) or triple-negative formalin-fixed, paraffin-embedded breast tumors. Protein expression-based probabilistic graphical models and flux balance analyses revealed that some ER+/PR+ samples had a protein expression profile similar to that of triple-negative samples and had a clinical outcome similar to those with triple-negative disease. This probabilistic graphical model-based classification had prognostic value in patients with luminal A breast cancer. This prognostic information was independent of that provided by standard genomic tests for breast cancer, such as MammaPrint, OncoType Dx and the 8-gene Score.