Bart Bakker
Radboud University Nijmegen
Publications
Featured research published by Bart Bakker.
Neural Computing and Applications | 2003
Tom Heskes; Jan-Joost Spanjers; Bart Bakker; Wim Wiegerinck
We describe a software system, called just enough delivery (JED), for optimising single-copy newspaper sales, based on a combination of neural and Bayesian technology. The prediction model is a huge feedforward neural network, in which each output corresponds to the sales prediction for a single outlet. Input-to-hidden weights are shared between outlets. The hidden-to-output weights are specific to each outlet, but linked through the introduction of priors. All weights and hyperparameters can be inferred using (empirical) Bayesian inference. The system has been tested on data for several different newspapers and magazines. Consistent performance improvements of 1 to 3% more sales with the same total number of deliveries have been obtained.
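The architecture described above (shared input-to-hidden weights, outlet-specific output heads) can be sketched compactly. The following is a minimal PyTorch illustration, not the authors' implementation: the hierarchical priors linking the hidden-to-output weights and the empirical-Bayes inference are omitted, and all layer sizes, outlet counts, and data are invented.

```python
# Minimal sketch of a multi-outlet sales network: one shared input-to-hidden
# layer, one small output head per outlet. Sizes and data are hypothetical.
import torch
import torch.nn as nn

class SharedHiddenSalesModel(nn.Module):
    def __init__(self, n_inputs: int, n_hidden: int, n_outlets: int):
        super().__init__()
        # Input-to-hidden weights shared between all outlets.
        self.shared = nn.Sequential(nn.Linear(n_inputs, n_hidden), nn.Tanh())
        # Hidden-to-output weights specific to each outlet (one scalar output each).
        self.heads = nn.ModuleList([nn.Linear(n_hidden, 1) for _ in range(n_outlets)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.shared(x)
        # Stack per-outlet sales predictions into one output vector.
        return torch.cat([head(h) for head in self.heads], dim=1)

# Hypothetical usage: 10 input features, 8 hidden units, 50 outlets.
model = SharedHiddenSalesModel(n_inputs=10, n_hidden=8, n_outlets=50)
x = torch.randn(32, 10)      # a batch of 32 feature vectors
pred = model(x)              # shape (32, 50): one prediction per outlet
```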
Proceedings ANNIMAB (H. Malmgren and M. Borga, eds.) | 2000
Bart Bakker; Hilbert J. Kappen; Tom Heskes
In this article we show that traditional Cox survival analysis can be improved upon when written in terms of a multi-layered perceptron and analyzed in the context of the Bayesian evidence framework. The obtained posterior distribution of network parameters is approximated both by Hybrid Markov Chain Monte Carlo sampling and by variational methods. We discuss the merits of both approaches. We argue that the neural-Bayesian approach circumvents the shortcomings of the original Cox analysis, and therefore yields better predictive results. As a bonus, we apply the Bayesian posterior (the probability distribution of the network parameters given the data) to estimate p-values on the inputs.
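As a rough illustration of the modelling idea (not the paper's code), the sketch below plugs a multi-layer perceptron risk score into the Cox partial likelihood. The Bayesian evidence framework, hybrid Monte Carlo sampling, and variational approximations discussed in the abstract are left out; the network sizes and data are assumptions.

```python
# Minimal sketch: an MLP risk score trained via the Cox negative log partial
# likelihood. Ties in event times are handled naively.
import torch
import torch.nn as nn

risk_net = nn.Sequential(nn.Linear(5, 16), nn.Tanh(), nn.Linear(16, 1))

def neg_log_partial_likelihood(risk: torch.Tensor,
                               time: torch.Tensor,
                               event: torch.Tensor) -> torch.Tensor:
    """Cox negative log partial likelihood.

    risk  : (n,) network outputs
    time  : (n,) observed survival or censoring times
    event : (n,) 1.0 if the event occurred, 0.0 if censored
    """
    order = torch.argsort(time, descending=True)   # latest times first
    risk, event = risk[order], event[order]
    # Cumulative log-sum-exp gives the risk set {j : time_j >= time_i}.
    log_risk_set = torch.logcumsumexp(risk, dim=0)
    return -((risk - log_risk_set) * event).sum()

# Hypothetical data: 100 subjects with 5 covariates each.
x = torch.randn(100, 5)
time = torch.rand(100)
event = (torch.rand(100) < 0.7).float()
loss = neg_log_partial_likelihood(risk_net(x).squeeze(-1), time, event)
loss.backward()
```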
Theoretical Computer Science | 2002
Tom Heskes; Bart Bakker; Bert Kappen
We describe two specific examples of neural-Bayesian approaches for complex modeling tasks: survival analysis and multitask learning. In both cases, we can come up with reasonable priors on the parameters of the neural network. As a result, the Bayesian approaches improve dramatically upon their (maximum likelihood) frequentist counterparts. By illustrating their application on the models under study, we review and compare algorithms that can be used for Bayesian inference: Laplace approximation, variational algorithms, Monte Carlo sampling, and empirical Bayes.
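Of the inference algorithms listed above, the Laplace approximation is the simplest to sketch. The example below applies it to a toy logistic-regression model rather than to the survival or multitask models in the paper; the data, prior, and optimiser settings are illustrative assumptions.

```python
# Laplace approximation sketch: find the MAP estimate, then approximate the
# posterior by a Gaussian whose covariance is the inverse Hessian of the
# negative log posterior at the MAP. Toy data and unit-variance prior assumed.
import torch

torch.manual_seed(0)
x = torch.randn(50, 2)                          # hypothetical inputs
y = (x[:, 0] + 0.5 * x[:, 1] > 0).float()       # hypothetical labels

def neg_log_posterior(w: torch.Tensor) -> torch.Tensor:
    # Negative log posterior = cross-entropy loss + Gaussian prior term.
    logits = x @ w
    nll = torch.nn.functional.binary_cross_entropy_with_logits(
        logits, y, reduction="sum")
    prior = 0.5 * (w @ w)                       # unit-variance Gaussian prior
    return nll + prior

# 1) Find the MAP estimate.
w = torch.zeros(2, requires_grad=True)
opt = torch.optim.LBFGS([w], max_iter=100)

def closure():
    opt.zero_grad()
    loss = neg_log_posterior(w)
    loss.backward()
    return loss

opt.step(closure)

# 2) Gaussian approximation: covariance = inverse Hessian at the MAP.
hessian = torch.autograd.functional.hessian(neg_log_posterior, w.detach())
posterior_cov = torch.linalg.inv(hessian)
print("MAP weights:", w.detach())
print("Laplace posterior covariance:\n", posterior_cov)
```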
Journal of Machine Learning Research | 2003
Bart Bakker; Tom Heskes
Neural Networks | 2003
Bart Bakker; Tom Heskes
Statistics in Medicine | 2004
Bart Bakker; Tom Heskes; Jan Neijt; Bert Kappen
International Conference on Artificial Neural Networks | 1999
Bart Bakker; Tom Heskes
European Symposium on Artificial Neural Networks | 1999
Bart Bakker; Tom Heskes
Computational Statistics & Data Analysis | 2007
Bart Bakker; Tom Heskes
International Conference on Artificial Neural Networks | 2002
Bart Bakker; Tom Heskes