Publication


Featured research published by Darío Ramos-López.


Knowledge-Based Systems | 2018

AMIDST: A Java toolbox for scalable probabilistic machine learning

Andrés R. Masegosa; Ana M. Martinez; Darío Ramos-López; Rafael Cabañas; Antonio Salmerón; Helge Langseth; Thomas Dyhre Nielsen; Anders L. Madsen

The AMIDST Toolbox is open-source Java software for scalable probabilistic machine learning, with a special focus on (massive) streaming data. The toolbox supports a flexible modelling language based on probabilistic graphical models with latent variables. AMIDST provides parallel and distributed implementations of scalable algorithms for probabilistic inference and Bayesian parameter learning in the specified models. These algorithms are based on a flexible variational message passing scheme that supports discrete and continuous variables from a wide range of probability distributions.
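The snippet below is a minimal, illustrative sketch of how a model might be specified and learned with the toolbox: a small hybrid network with a latent discrete variable is defined, and its parameters are learned from a data stream with streaming variational Bayes. Class and method names (Variables, DAG, SVB, DataStreamLoader, etc.) follow the AMIDST documentation as recalled here and should be checked against the current API.

```java
// Illustrative sketch only: class and method names mirror the AMIDST
// documentation as recalled here and may differ from the current API.
import eu.amidst.core.datastream.DataInstance;
import eu.amidst.core.datastream.DataStream;
import eu.amidst.core.io.DataStreamLoader;
import eu.amidst.core.learning.parametric.bayesian.SVB;
import eu.amidst.core.models.BayesianNetwork;
import eu.amidst.core.models.DAG;
import eu.amidst.core.variables.Variable;
import eu.amidst.core.variables.Variables;

public class AmidstSketch {
    public static void main(String[] args) {
        // Load a (possibly unbounded) data stream from an ARFF file.
        DataStream<DataInstance> data = DataStreamLoader.open("data.arff");

        // Declare the variables of a small hybrid model: one discrete
        // latent variable H and one observed attribute X from the stream.
        Variables variables = new Variables(data.getAttributes());
        Variable hidden = variables.newMultinomialVariable("H", 2);
        Variable x = variables.getVariableByName("X");

        // Build the graph: X depends on the latent H.
        DAG dag = new DAG(variables);
        dag.getParentSet(x).addParent(hidden);

        // Streaming variational Bayes (a variational message passing scheme)
        // learns the parameters from the stream.
        SVB svb = new SVB();
        svb.setDAG(dag);
        svb.setDataStream(data);
        svb.runLearning();

        BayesianNetwork model = svb.getLearntBayesianNetwork();
        System.out.println(model);
    }
}
```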


International Conference on Data Mining | 2016

Financial Data Analysis with PGMs Using AMIDST

Rafael Cabañas; Ana M. Martinez; Andrés R. Masegosa; Darío Ramos-López; Antonio Salmerón; Thomas Dyhre Nielsen; Helge Langseth; Anders L. Madsen

The AMIDST Toolbox is an open-source Java 8 library for scalable learning of probabilistic graphical models (PGMs) from both batch and streaming data. An important application domain with streaming-data characteristics is the banking sector, where we may want to monitor individual customers (based on their financial situation and behavior) as well as the general economic climate. Using a real financial data set from a Spanish bank, we previously proposed and demonstrated a novel PGM framework for performing this type of data analysis, with particular focus on concept drift. The framework is implemented in the AMIDST Toolbox, which was also used to conduct the reported analyses. In this paper, we provide an overview of the toolbox and illustrate with code examples how it can be used to set up and perform analyses of this type.
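As a rough illustration of the streaming workflow described above, the fragment below (continuing the hypothetical svb and data objects from the earlier sketch) processes the stream in fixed-size batches, one variational update per batch; iterableOverBatches and updateModel are method names recalled from the AMIDST documentation and are assumptions, not code taken from the paper.

```java
// Illustrative continuation of the earlier sketch (assumed AMIDST-style API):
// process the financial stream batch by batch and track the model's fit,
// e.g. to flag periods where concept drift degrades the log-probability.
svb.initLearning();
for (DataOnMemory<DataInstance> batch : data.iterableOverBatches(1000)) {
    double batchLogProb = svb.updateModel(batch);  // one variational update per batch
    System.out.println("Batch log-probability: " + batchLogProb);
}
```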


International Journal of Approximate Reasoning | 2017

Scaling up Bayesian variational inference using distributed computing clusters

Andrés R. Masegosa; Ana M. Martinez; Helge Langseth; Thomas Dyhre Nielsen; Antonio Salmerón; Darío Ramos-López; Anders L. Madsen

Abstract In this paper we present an approach for scaling up Bayesian learning using variational methods by exploiting distributed computing clusters managed by modern big data processing tools like Apache Spark or Apache Flink, which efficiently support iterative map-reduce operations. Our approach is defined as a distributed projected natural gradient ascent algorithm, has excellent convergence properties, and covers a wide range of conjugate exponential family models. We evaluate the proposed algorithm on three real-world datasets from different domains (the Pubmed abstracts dataset, a GPS trajectory dataset, and a financial dataset) and using several models (LDA, factor analysis, mixture of Gaussians and linear regression models). Our approach compares favorably to stochastic variational inference and streaming variational Bayes, two of the main current proposals for scaling up variational methods. For the scalability analysis, we evaluate our approach over a network with more than one billion nodes and approx. 75 % latent variables using a computer cluster with 128 processing units (AWS). The proposed methods are released as part of an open-source toolbox for scalable probabilistic machine learning ( http://www.amidsttoolbox.com ) Masegosa et al. (2017) [29] .
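As a hedged sketch (written in standard conjugate-exponential notation, not taken verbatim from the paper), the natural gradient of the variational objective with respect to a global parameter decomposes into a sum of expected sufficient statistics over data partitions. This decomposition is what makes an iterative map-reduce implementation natural: each worker computes its partial sum (map), and the driver aggregates the partial sums and takes a projected gradient step (reduce).

```latex
% Hedged sketch in standard conjugate-exponential notation; symbols are not
% taken verbatim from the paper. \lambda: global natural parameter,
% \alpha: prior, s(x_n, z_n): local sufficient statistics, B_k: data partition
% held by worker k, \Pi: projection onto the feasible parameter domain.
\tilde{\nabla}_{\lambda}\mathcal{L}(\lambda)
  = \alpha
  + \sum_{k=1}^{K}\;\underbrace{\sum_{n \in B_k}
      \mathbb{E}_{q(z_n)}\!\bigl[s(x_n, z_n)\bigr]}_{\text{computed by worker }k\ \text{(map)}}
  - \lambda,
\qquad
\lambda^{(t+1)} = \Pi\!\Bigl(\lambda^{(t)} + \rho_t\,
  \tilde{\nabla}_{\lambda}\mathcal{L}\bigl(\lambda^{(t)}\bigr)\Bigr).
```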


Progress in Artificial Intelligence | 2017

MAP inference in dynamic hybrid Bayesian networks

Darío Ramos-López; Andrés R. Masegosa; Ana M. Martinez; Antonio Salmerón; Thomas Dyhre Nielsen; Helge Langseth; Anders L. Madsen

In this paper, we study the maximum a posteriori (MAP) problem in dynamic hybrid Bayesian networks. We are interested in finding the sequence of values of a class variable that maximizes the posterior probability given evidence. We propose an approximate solution based on transforming the MAP problem into a simpler belief update problem. The proposed solution constructs a set of auxiliary networks by grouping consecutive instantiations of the variable of interest, thus capturing some of the potential temporal dependences between these variables while ignoring others. Belief update is carried out independently in the auxiliary models, after which the results are combined, producing a configuration of values for the class variable along the entire time sequence. Experiments have been carried out to analyze the behavior of the approach. The algorithm has been implemented using Java 8 streams, and its scalability has been evaluated.
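The following is a minimal, hypothetical sketch of the overall flow using Java 8 streams, under the assumption that each auxiliary network exposes a belief-update step returning the posterior mode of the class variable for the time slices it groups; AuxiliaryModel, beliefUpdate, and the simple combination rule are illustrative placeholders, not the authors' actual code.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class MapInferenceSketch {

    /** Placeholder for an auxiliary network built by grouping consecutive time slices. */
    interface AuxiliaryModel {
        // Returns, for each time index covered by this auxiliary model,
        // the posterior mode of the class variable after belief update.
        Map<Integer, Integer> beliefUpdate();
    }

    static int[] approximateMap(List<AuxiliaryModel> auxiliaryModels, int sequenceLength) {
        // Belief update runs independently in each auxiliary model,
        // so it parallelizes naturally with Java 8 streams.
        List<Map<Integer, Integer>> partialResults = auxiliaryModels.parallelStream()
                .map(AuxiliaryModel::beliefUpdate)
                .collect(Collectors.toList());

        // Combine the per-model results into one configuration of the class
        // variable over the whole time sequence (last writer wins here,
        // standing in for the paper's actual combination rule).
        int[] mapSequence = new int[sequenceLength];
        partialResults.forEach(result ->
                result.forEach((t, value) -> mapSequence[t] = value));
        return mapSequence;
    }
}
```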


European Conference on Artificial Intelligence | 2016

Parallel Filter-Based Feature Selection Based on Balanced Incomplete Block Designs

Antonio Salmerón; Anders L. Madsen; Frank Jensen; Helge Langseth; Thomas Dyhre Nielsen; Darío Ramos-López; Ana M. Martinez; Andrés R. Masegosa

In this paper we propose a method for scaling up filter-based feature selection in classification problems. We use the conditional mutual information as the filter measure and show how the required statistics can be computed in parallel while avoiding unnecessary calculations. The distribution of the calculations among the available computing units is determined based on balanced incomplete block designs, a strategy first developed within the area of statistical design of experiments. We show the scalability of our method through a series of experiments on synthetic and real-world datasets.
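For reference, one common form of the conditional mutual information used as a filter measure is shown below (the exact conditioning scheme in the paper may differ). Estimating it requires joint counts over pairs of features and the class, and it is the computation of these pairwise statistics that the balanced incomplete block design distributes so that each pair is handled by exactly one computing unit.

```latex
% Conditional mutual information between feature X_i and the class C given
% feature X_j (standard definition; the paper's exact conditioning may differ).
I(X_i ; C \mid X_j)
  = \sum_{x_i}\sum_{c}\sum_{x_j} p(x_i, c, x_j)\,
    \log\frac{p(x_i, c \mid x_j)}{p(x_i \mid x_j)\,p(c \mid x_j)}.
```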


International Journal of Approximate Reasoning | 2018

Scalable importance sampling estimation of Gaussian mixture posteriors in Bayesian networks

Darío Ramos-López; Andrés R. Masegosa; Antonio Salmerón; Rafael Rumí; Helge Langseth; Thomas Dyhre Nielsen; Anders L. Madsen

In this paper we propose a scalable importance sampling algorithm for computing Gaussian mixture posteriors in conditional linear Gaussian Bayesian networks. Our contribution is based on using a stochastic gradient ascent procedure that takes as input a stream of importance sampling weights, so that a mixture of Gaussians is dynamically updated with no need to store the full sample. The algorithm has been designed following a Map/Reduce approach and is therefore scalable with respect to computing resources. The implementation of the proposed algorithm is available as part of the AMIDST open-source toolbox for scalable probabilistic machine learning (http://www.amidsttoolbox.com).
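The abstract suggests the following basic quantities, written here in standard importance-sampling notation as a hedged sketch (the authors' exact update may differ): samples are drawn from a proposal q, each sample contributes a weight, and the Gaussian mixture being fitted is refined with one weighted stochastic-gradient step per incoming weight, so the full sample never has to be stored.

```latex
% Hedged sketch in standard importance-sampling notation; not verbatim from the paper.
% q: proposal distribution, e: evidence, f_\theta: Gaussian mixture being fitted,
% \rho_i: step size of the stochastic gradient ascent.
w_i = \frac{p(x_i, e)}{q(x_i)}, \qquad x_i \sim q,
\qquad
\theta^{(i+1)} = \theta^{(i)} + \rho_i\, w_i\,
  \nabla_{\theta}\log f_{\theta}(x_i).
```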


Probabilistic Graphical Models | 2016

d-VMP: Distributed Variational Message Passing

Andrés R. Masegosa; Ana M. Martinez; Helge Langseth; Thomas Dyhre Nielsen; Antonio Salmerón; Darío Ramos-López; Anders L. Madsen


Benelux Conference on Artificial Intelligence | 2015

AMIDST: Analysis of MassIve Data STreams

Andrés R. Masegosa; Ana M. Martinez; Hanen Borchani; Darío Ramos-López; Thomas Dyhre Nielsen; Helge Langseth; Antonio Salmerón; Anders L. Madsen


Probabilistic Graphical Models | 2016

Scalable MAP inference in Bayesian networks based on a Map-Reduce approach

Darío Ramos-López; Antonio Salmerón; Rafael Rumí; Ana M. Martinez; Thomas Dyhre Nielsen; Andrés R. Masegosa; Helge Langseth; Anders L. Madsen


International Conference on Machine Learning | 2017

Bayesian Models of Data Streams with Hierarchical Power Priors

Andrés R. Masegosa; Thomas Dyhre Nielsen; Helge Langseth; Darío Ramos-López; Antonio Salmerón; Anders L. Madsen

Collaboration


Dive into Darío Ramos-López's collaborations.

Top Co-Authors

Helge Langseth

Norwegian University of Science and Technology


Andrés R. Masegosa

Norwegian University of Science and Technology
