Publications


Featured research published by Guido Montúfar.


Neural Computation | 2011

Refinements of universal approximation results for deep belief networks and restricted boltzmann machines

Guido Montúfar; Nihat Ay

We improve recently published results about resources of restricted Boltzmann machines (RBM) and deep belief networks (DBN) required to make them universal approximators. We show that any distribution p on the set {0,1}^n of binary vectors of length n can be arbitrarily well approximated by an RBM with k-1 hidden units, where k is the minimal number of pairs of binary vectors differing in only one entry such that their union contains the support set of p. In important cases this number is half the cardinality of the support set of p (given in Le Roux & Bengio, 2008). We construct a DBN with 2^n/(2(n-b)), b ~ log(n), hidden layers of width b that is capable of approximating any distribution on {0,1}^n arbitrarily well. This confirms a conjecture presented in Le Roux and Bengio (2010).
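As an illustrative sketch (the function names and the greedy strategy below are mine, not the paper's construction), the pairing quantity k can be estimated for a concrete support set: greedily pair up support vectors that differ in exactly one entry. The greedy count only upper-bounds the minimal k, since finding the true minimum is a covering problem.

```python
def hamming(u, v):
    """Number of entries in which two equal-length binary tuples differ."""
    return sum(a != b for a, b in zip(u, v))

def greedy_pair_cover(support):
    """Greedily cover a support set of binary vectors by pairs differing
    in exactly one entry (leftover singletons count as their own block).
    Returns an upper bound on the minimal cover size k from the abstract."""
    remaining = set(support)
    k = 0
    while remaining:
        u = remaining.pop()
        partner = next((v for v in remaining if hamming(u, v) == 1), None)
        if partner is not None:
            remaining.remove(partner)
        k += 1
    return k

# Example: the support {000, 001, 110, 111} splits into 2 such pairs,
# so by the paper's result an RBM with k - 1 = 1 hidden unit suffices
# for distributions supported on this set.
support = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1)]
print(greedy_pair_cover(support))
```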


SIAM Journal on Discrete Mathematics | 2015

When Does a Mixture of Products Contain a Product of Mixtures?

Guido Montúfar; Jason Morton

We derive relations between theoretical properties of restricted Boltzmann machines (RBMs), popular machine learning models which form the building blocks of deep learning models, and several natural notions from discrete mathematics and convex geometry. We give implications and equivalences relating RBM-representable probability distributions, perfectly reconstructible inputs, Hamming modes, zonotopes and zonosets, point configurations in hyperplane arrangements, linear threshold codes, and multicovering numbers of hypercubes. As a motivating application, we prove results on the relative representational power of mixtures of product distributions and products of mixtures of pairs of product distributions (RBMs) that formally justify widely held intuitions about distributed representations. In particular, we show that representing the probability distributions obtainable as products of mixtures can require a mixture of products with an exponentially larger number of parameters.
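The parameter gap referred to in the last sentence can be made concrete by counting free parameters under the standard parameterizations (these helper functions are illustrative, not from the paper): a mixture of m product distributions over n binary variables has m*n + (m - 1) free parameters, while an RBM with n visible and h hidden units has n*h + n + h.

```python
def mixture_of_products_params(m, n):
    """Free parameters of a mixture of m product distributions over
    n binary variables: m*n Bernoulli means plus m - 1 mixture weights."""
    return m * n + (m - 1)

def rbm_params(n, h):
    """Free parameters of an RBM with n visible and h hidden units:
    n*h weights, n visible biases, h hidden biases."""
    return n * h + n + h

# An RBM with h hidden units is a product of h mixtures of two product
# distributions; writing some of these as a plain mixture of products
# can require exponentially many mixture components.
n, h = 20, 10
print(rbm_params(n, h))                       # 230
print(mixture_of_products_params(2 ** h, n))  # 21503
```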


Archive | 2013

Selection Criteria for Neuromanifolds of Stochastic Dynamics

Nihat Ay; Guido Montúfar; Johannes Rauh

We present ways of defining neuromanifolds – models of stochastic matrices – that are compatible with the maximization of an objective function such as the expected reward in reinforcement learning theory. Our approach is based on information geometry and aims to reduce the number of model parameters with the hope of improving gradient learning processes.


Neural Computation | 2014

Universal approximation depth and errors of narrow belief networks with discrete units

Guido Montúfar

We generalize recent theoretical work on the minimal number of layers of narrow deep belief networks that can approximate any probability distribution on the states of their visible units arbitrarily well. We relax the setting of binary units (Sutskever & Hinton, 2008; Le Roux & Bengio, 2008, 2010; Montúfar & Ay, 2011) to units with arbitrary finite state spaces, and the vanishing approximation error to an arbitrary approximation error tolerance. For example, we bound the number of layers of width n that a q-ary deep belief network needs in order to approximate any probability distribution on {0, 1, ..., q-1}^n without exceeding a given Kullback-Leibler divergence. Our analysis covers discrete restricted Boltzmann machines and naive Bayes models as special cases.
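For reference, the Kullback-Leibler divergence used here as the approximation-error measure is D(p || q) = sum_x p(x) log(p(x) / q(x)) over a finite state space. A minimal sketch (function name and example values are mine):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats for two distributions
    given as aligned lists over the same finite state space.
    Terms with p(x) = 0 contribute 0; q(x) = 0 with p(x) > 0 gives inf."""
    total = 0.0
    for px, qx in zip(p, q):
        if px > 0:
            total += math.inf if qx == 0 else px * math.log(px / qx)
    return total

# The divergence of a distribution from itself is zero; against the
# uniform distribution it equals log(N) minus the entropy of p.
p = [0.5, 0.25, 0.25, 0.0]
uniform = [0.25] * 4
print(kl_divergence(p, p))        # 0.0
print(kl_divergence(p, uniform))  # 0.5 * log(2), about 0.3466
```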


PLOS Computational Biology | 2015

A Theory of Cheap Control in Embodied Systems

Guido Montúfar; Keyan Ghazi-Zahedi; Nihat Ay

We present a framework for designing cheap control architectures of embodied agents. Our derivation is guided by the classical problem of universal approximation, whereby we explore the possibility of exploiting the agent’s embodiment for a new and more efficient universal approximation of behaviors generated by sensorimotor control. This embodied universal approximation is compared with the classical non-embodied universal approximation. To exemplify our approach, we present a detailed quantitative case study for policy models defined in terms of conditional restricted Boltzmann machines. In contrast to non-embodied universal approximation, which requires an exponential number of parameters, in the embodied setting we are able to generate all possible behaviors with a drastically smaller model, thus obtaining cheap universal approximation. We test and corroborate the theory experimentally with a six-legged walking machine. The experiments indicate that the controller complexity predicted by our theory is close to the minimal sufficient value, which means that the theory has direct practical implications.


International Journal of Approximate Reasoning | 2017

Hierarchical models as marginals of hierarchical models

Guido Montúfar; Johannes Rauh

We investigate the representation of hierarchical models in terms of marginals of other hierarchical models with smaller interactions. We focus on binary variables and marginals of pairwise interaction models whose hidden variables are conditionally independent given the visible variables. In this case the problem is equivalent to the representation of linear subspaces of polynomials by feedforward neural networks with soft-plus computational units. We show that every hidden variable can freely model multiple interactions among the visible variables, which allows us to generalize and improve previous results. In particular, we show that a restricted Boltzmann machine with fewer than [2(log(v)+1)/(v+1)] 2^(v-1) hidden binary variables can approximate every distribution of v visible binary variables arbitrarily well, compared to the previously known bound of 2^(v-1) - 1.
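A quick tabulation contrasts the bound of roughly [2(log(v)+1)/(v+1)] 2^(v-1) hidden units with the earlier universal-approximation bound of 2^(v-1) - 1. Sketch assumptions: the natural logarithm is used, since the formula as scraped does not specify a base, and the function names are mine.

```python
import math

def rbm_universal_bound(v):
    """Hidden-unit bound [2(log(v)+1)/(v+1)] * 2^(v-1) from the abstract.
    Assumption: natural log (the base is not stated in the scraped text)."""
    return 2 * (math.log(v) + 1) / (v + 1) * 2 ** (v - 1)

def previous_bound(v):
    """Earlier bound of 2^(v-1) - 1 hidden units for a universal
    approximator of distributions on v visible binary variables."""
    return 2 ** (v - 1) - 1

# The improvement kicks in as v grows: the ratio of the new bound to
# the old one shrinks like 2*log(v)/v.
for v in (4, 8, 16):
    print(v, round(rbm_universal_bound(v)), previous_bound(v))
```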


Frontiers in Robotics and AI | 2016

Evaluating Morphological Computation in Muscle and DC-Motor Driven Models of Hopping Movements

Keyan Ghazi-Zahedi; Daniel F. B. Haeufle; Guido Montúfar; Syn Schmitt; Nihat Ay



Entropy | 2014

On the Fisher Metric of Conditional Probability Polytopes

Guido Montúfar; Johannes Rauh; Nihat Ay



arXiv: Statistics Theory | 2013

Maximal Information Divergence from Statistical Models defined by Neural Networks

Guido Montúfar; Johannes Rauh; Nihat Ay



arXiv: Machine Learning | 2017

Dimension of Marginals of Kronecker Product Models

Guido Montúfar; Jason Morton


Collaboration


Guido Montúfar's top co-authors and their affiliations.

Jason Morton, Pennsylvania State University
Ruedi Seiler, Technical University of Berlin
Tyll Krüger, Technical University of Berlin
Yoshua Bengio, Université de Montréal
A. Knorr, Technical University of Berlin