Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Toni Verbeiren is active.

Publication


Featured research published by Toni Verbeiren.


Journal of Physics A | 2004

The signal-to-noise analysis of the Little-Hopfield model revisited

Désiré Bollé; J. Busquets Blanco; Toni Verbeiren

Using the generating functional analysis, an exact recursion relation is derived for the time evolution of the effective local field of the fully connected Little–Hopfield model. It is shown that, by leaving out the feedback correlations arising from earlier times in this effective dynamics, one precisely finds the recursion relations usually employed in the signal-to-noise approach. The consequences of this approximation, as well as the physics behind it, are discussed. In particular, it is pointed out why the effects are hard to notice, especially for model parameters corresponding to retrieval. Numerical simulations confirm these findings. The signal-to-noise analysis is then extended to include all correlations, making it a full theory for the dynamics at the level of the generating functional analysis. The results are applied to the frequently employed extremely diluted (a)symmetric architectures and to sequence-processing networks.
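For readers who want to experiment with the model itself, the following is a minimal simulation sketch of the fully connected Little–Hopfield model (parallel Hebbian dynamics) that tracks the retrieval overlap. It only simulates the network discussed above, not the generating-functional or signal-to-noise recursions, and the sizes, load and noise level are arbitrary toy choices.

```python
# Minimal sketch: zero-temperature Little (parallel-update) Hopfield model
# with Hebbian couplings, tracking the overlap with one stored pattern.
# This simulates the model only; it does not implement the
# generating-functional or signal-to-noise recursions of the paper.
import numpy as np

rng = np.random.default_rng(0)
N, P, steps = 500, 25, 10                 # neurons, patterns, parallel updates

xi = rng.choice([-1, 1], size=(P, N))     # random binary patterns
J = (xi.T @ xi) / N                       # Hebbian couplings
np.fill_diagonal(J, 0.0)                  # no self-coupling

s = xi[0].copy()
flip = rng.random(N) < 0.15               # corrupt 15% of the first pattern
s[flip] *= -1

for t in range(steps):
    h = J @ s                             # local fields
    s = np.where(h >= 0, 1, -1)           # synchronous (parallel) update
    m = (xi[0] @ s) / N                   # retrieval overlap
    print(f"t={t+1}  m={m:.3f}")
```

At loads well inside the retrieval regime, as chosen here, the overlap typically climbs towards 1 within a few parallel steps.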


Physics Letters A | 2002

An optimal Q-state neural network using mutual information

Désiré Bollé; Toni Verbeiren

Starting from the mutual information, we present a method to find a Hamiltonian for a fully connected neural network model with an arbitrary, finite number of neuron states, Q. For small initial correlations between the neurons and the patterns it leads to optimal retrieval performance. For binary neurons, Q = 2, and biased patterns we recover the Hopfield model. For three-state neurons, Q = 3, we recover the recently introduced Blume–Emery–Griffiths network Hamiltonian. We derive its phase diagram and compare it with those of related three-state models, finding that its retrieval region is the largest.
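For reference, the mutual information between a neuron state and the corresponding pattern component, the quantity the construction above starts from, has the standard form below; the notation is generic and not necessarily the paper's.

```latex
% Mutual information between a Q-valued neuron state \sigma and the
% corresponding pattern component \xi: the entropy of the neuron minus
% its conditional entropy given the pattern.
I(\sigma;\xi) \;=\; \sum_{\sigma,\xi} p(\sigma,\xi)\,
  \ln\frac{p(\sigma,\xi)}{p(\sigma)\,p(\xi)}
\;=\; S(\sigma) - S(\sigma\mid\xi),
\qquad
S(\sigma) = -\sum_{\sigma} p(\sigma)\ln p(\sigma).
```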


Journal of Physics A | 2003

Thermodynamics of fully connected Blume-Emery-Griffiths neural networks

Désiré Bollé; Toni Verbeiren

The thermodynamic and retrieval properties of fully connected Blume–Emery–Griffiths networks are studied using replica mean-field theory. These networks can be considered as generalizations of the Hopfield model to the storage of ternary patterns. Capacity–temperature phase diagrams are derived for several values of the pattern activity. It is found that the retrieval phase is the largest in comparison with other three-state neuron models. Furthermore, the meaning and stability of the so-called quadrupolar phase is discussed as a function of both the temperature and the pattern activity. Where appropriate, the results are compared with the diluted version of the model.
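Schematically, the networks in question have the generic Blume–Emery–Griffiths structure for ternary neurons shown below; the bilinear and biquadratic couplings are built from the stored ternary patterns, and their precise Hebbian-type form (omitted in this sketch) follows the paper.

```latex
% Generic BEG-type Hamiltonian for ternary neurons \sigma_i \in \{-1,0,+1\}.
% J_{ij} and K_{ij} are the bilinear and biquadratic couplings; their
% pattern-dependent (Hebbian-type) definitions are not reproduced here.
H \;=\; -\sum_{i<j}\Bigl( J_{ij}\,\sigma_i\sigma_j
        \;+\; K_{ij}\,\sigma_i^{2}\sigma_j^{2} \Bigr),
\qquad \sigma_i \in \{-1,0,+1\}.
```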


F1000Research | 2014

dendsort: modular leaf ordering methods for dendrogram representations in R

Ryo Sakai; Raf Winand; Toni Verbeiren; Andrew Vande Moere; Jan Aerts

Dendrograms are graphical representations of binary tree structures resulting from agglomerative hierarchical clustering. In the life sciences, a cluster heat map is a widely accepted visualization technique that utilizes the leaf order of a dendrogram to reorder the rows and columns of the data table. The derived linear order is more meaningful than a random order, because it groups similar items together. However, two consecutive items can be quite dissimilar despite proximity in the order. In addition, there are 2^(n−1) possible orderings given n input elements, as the orientation of clusters at each merge can be flipped without affecting the hierarchical structure. We present two modular leaf ordering methods to encode both the monotonic order in which clusters are merged and the nested cluster relationships more faithfully in the resulting dendrogram structure. We compare dendrogram and cluster heat map visualizations created using our heuristics to the default heuristic in R and to seriation-based leaf ordering methods. We find that our methods lead to a dendrogram structure with global patterns that are easier to interpret, more legible given a limited display space, and more insightful in some cases. The implementation of the methods is available as an R package named "dendsort" from the CRAN package repository. Further examples, documentation, and the source code are available at https://bitbucket.org/biovizleuven/dendsort/.
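As a rough illustration of the underlying idea (not the dendsort R package itself), the sketch below reorders a SciPy linkage tree so that, at every merge, the subtree containing the tightest (smallest-height) merge is visited first. The function names and the "min distance" criterion are assumptions chosen for this example.

```python
# Rough sketch of a "min distance" leaf-ordering heuristic on a SciPy
# linkage tree: at every merge, visit the subtree containing the tightest
# (smallest-height) merge first. This illustrates the idea behind modular
# leaf ordering; it is NOT the dendsort R package or its exact heuristics.
import numpy as np
from scipy.cluster.hierarchy import linkage, to_tree

def min_dist(node):
    """Smallest merge height anywhere in this subtree (inf for a leaf)."""
    if node.is_leaf():
        return float("inf")
    return min(node.dist, min_dist(node.left), min_dist(node.right))

def ordered_leaves(node):
    """Leaf ids, visiting the subtree with the tightest merge first."""
    if node.is_leaf():
        return [node.id]
    left, right = node.left, node.right
    if min_dist(left) > min_dist(right):
        left, right = right, left
    return ordered_leaves(left) + ordered_leaves(right)

rng = np.random.default_rng(0)
X = rng.random((20, 5))                  # toy data: 20 items, 5 features
Z = linkage(X, method="average")         # agglomerative clustering
root = to_tree(Z)
print(ordered_leaves(root))              # leaf order for heat-map rows
```

In R, the published implementation can be used directly; the package's dendsort() function is typically applied to a dendrogram or hclust object.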


Journal of Physics A | 2003

A spherical Hopfield model

Désiré Bollé; Th. M. Nieuwenhuizen; I Pérez Castillo; Toni Verbeiren

A spherical Hopfield-type neural network is introduced, involving neurons and patterns that are continuous variables. Both the thermodynamics and dynamics of this model are studied. In order to have a retrieval phase a quartic term is added to the Hamiltonian. The thermodynamics of the model is exactly solvable and the results are replica symmetric. A Langevin dynamics leads to a closed set of equations for the order parameters and effective correlation and response function typical for neural networks. The stationary limit corresponds to the thermodynamic results. Numerical calculations illustrate these findings.


Physica A-statistical Mechanics and Its Applications | 2004

The Blume–Emery–Griffiths neural network: dynamics for arbitrary temperature

Désiré Bollé; J. Busquets Blanco; G.M. Shim; Toni Verbeiren

The parallel dynamics of the fully connected Blume–Emery–Griffiths neural network model is studied for arbitrary temperature. By employing a probabilistic signal-to-noise approach, a recursive scheme is found that determines the time evolution of the distribution of the local fields and, hence, the evolution of the order parameters. This approach is compared with the generating functional method, which allows one to calculate any physically relevant quantity as a function of time. Explicit analytic formulae are given in both methods for the first few time steps of the dynamics. Up to the third time step the results are identical. Some arguments are presented as to why, beyond the third time step, the results differ for certain values of the model parameters. Furthermore, fixed-point equations are derived in the stationary limit. Numerical simulations confirm our theoretical findings.


Physica A-statistical Mechanics and Its Applications | 2006

Synchronous versus sequential updating in the three-state Ising neural network with variable dilution

Désiré Bollé; R. Erichsen; Toni Verbeiren

The Q=3-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. The appearance and properties of two-cycles are discussed. Capacity–temperature phase diagrams are derived for several values of the pattern activity and different gradations of dilution, and the information content is calculated. It is found that the asymptotic behaviour is rather similar to that for sequential updating. The retrieval region is enhanced marginally, but the spin-glass region is visibly enlarged. Only the presence of self-coupling can enlarge the retrieval region substantially. The dynamics of the network is studied for general Q and both synchronous and sequential updating using an extension of the generating function technique. The differences with the signal-to-noise approach are outlined. Typical flow diagrams for the Q=3 overlap order parameter are presented.
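The toy sketch below illustrates the difference between the two updating modes for a three-state network. The zero-temperature Q-Ising-style rule (each neuron picks the state maximizing h·s − b·s²) and the random Gaussian couplings are illustrative assumptions, not the Hebbian, diluted setup analysed above.

```python
# Toy contrast between synchronous and sequential updating for a
# three-state (Q = 3, spin-1) network, sigma_i in {-1, 0, +1}.
# Zero-temperature Q-Ising-style rule: each neuron picks the state s
# maximizing h_i*s - b*s**2 (b is a gain parameter). Couplings and
# parameters are toy choices, not the ones analysed in the paper.
import numpy as np

rng = np.random.default_rng(1)
N, b = 200, 0.4
states = np.array([-1, 0, 1])

J = rng.normal(0, 1 / np.sqrt(N), size=(N, N))
J = (J + J.T) / 2                              # symmetric couplings
np.fill_diagonal(J, 0.0)                       # no self-coupling

def best_state(h):
    """State s in {-1, 0, +1} maximizing h*s - b*s**2."""
    return states[np.argmax(h * states - b * states**2)]

def synchronous_step(sigma):
    """All neurons updated at once from the same configuration."""
    h = J @ sigma
    return np.array([best_state(hi) for hi in h])

def sequential_sweep(sigma):
    """Neurons updated one at a time, in random order."""
    sigma = sigma.copy()
    for i in rng.permutation(N):
        sigma[i] = best_state(J[i] @ sigma)
    return sigma

sigma0 = rng.choice(states, size=N)
print("synchronous, fraction changed:", np.mean(synchronous_step(sigma0) != sigma0))
print("sequential,  fraction changed:", np.mean(sequential_sweep(sigma0) != sigma0))
```

With symmetric couplings and no self-coupling, sequential updating relaxes to fixed points, whereas synchronous updating can settle into the two-cycles discussed above.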


arXiv: Disordered Systems and Neural Networks | 2004

Multiplicative versus additive noise in multistate neural networks

Désiré Bollé; Jordi Busquets Blanco; Toni Verbeiren

The effects of a variable amount of random dilution of the synaptic couplings in Q-Ising multi-state neural networks with Hebbian learning are examined. A fraction of the couplings is explicitly allowed to be anti-Hebbian. Random dilution represents the dying or pruning of synapses and, hence, a static disruption of the learning process, which can be considered as a form of multiplicative noise in the learning rule. Both parallel and sequential updating of the neurons can be treated. Symmetric dilution in the statics of the network is studied using the mean-field theory approach of statistical mechanics. General dilution, including asymmetric pruning of the couplings, is examined using the generating functional (path-integral) approach of disordered systems. It is shown that random dilution acts as additive Gaussian noise in the Hebbian learning rule, with zero mean and a variance depending on the connectivity of the network and on the symmetry. Furthermore, a scaling factor appears that essentially measures the average amount of anti-Hebbian couplings.
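The statement that dilution acts like additive noise on the Hebbian couplings can be checked numerically with the toy sketch below. It uses independent (asymmetric) dilution and a 1/c rescaling of the kept couplings, both of which are choices made for the illustration rather than taken from the paper's analysis.

```python
# Numerical illustration of the claim above: randomly diluting Hebbian
# couplings looks, on average, like keeping the Hebbian part plus
# zero-mean noise whose spread depends on the connectivity c.
# This is a toy check, not the generating-functional derivation.
import numpy as np

rng = np.random.default_rng(2)
N, P, c = 1000, 50, 0.3                      # neurons, patterns, connectivity

xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi) / N                          # fully connected Hebbian couplings
np.fill_diagonal(J, 0.0)

mask = rng.random((N, N)) < c                # keep each coupling with prob. c
J_dil = np.where(mask, J, 0.0) / c           # diluted, rescaled couplings

residual = J_dil - J                         # deviation from the Hebbian rule
print("mean of residual:", residual.mean())  # close to zero
print("std  of residual:", residual.std())   # grows as c decreases
```

Lowering c leaves the mean of the residual near zero while its spread grows, in line with the connectivity dependence described above.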


Physical Review E | 2000

Correlated patterns in nonmonotonic graded-response perceptrons

Désiré Bollé; Toni Verbeiren

The optimal capacity of graded-response perceptrons storing biased and spatially correlated patterns with non-monotonic input-output relations is studied. It is shown that only the structure of the output patterns is important for the overall performance of the perceptrons.


IEEE VIS 2014 | 2014

A Pragmatic Approach to Biases in Visual Data Analysis

Toni Verbeiren; Ryo Sakai; Jan Aerts

Collaboration


Dive into Toni Verbeiren's collaborations.

Top Co-Authors

Désiré Bollé (Katholieke Universiteit Leuven)
Jan Aerts (Katholieke Universiteit Leuven)
Ryo Sakai (Katholieke Universiteit Leuven)
J. Busquets Blanco (Katholieke Universiteit Leuven)
Andrew Vande Moere (Katholieke Universiteit Leuven)
Raf Winand (Katholieke Universiteit Leuven)
I Pérez Castillo (Katholieke Universiteit Leuven)
Jordi Busquets Blanco (Katholieke Universiteit Leuven)
G.M. Shim (Chungnam National University)