
Publication


Featured research published by Janneke H. Bolt.


information processing and management of uncertainty | 2010

Modelling patterns of evidence in Bayesian networks: a case-study in classical swine fever

Linda C. van der Gaag; Janneke H. Bolt; W.L.A. Loeffen; A.R.W. Elbers

Upon engineering a Bayesian network for the early detection of Classical Swine Fever in pigs, we found that the commonly used approach of separately modelling the relevant observable variables would not suffice to arrive at satisfactory performance of the network: explicit modelling of combinations of observations was required to allow identifying and reasoning about patterns of evidence. In this paper, we outline a general approach to modelling relevant patterns of evidence in a Bayesian network. We demonstrate its application for our problem domain and show that it served to significantly improve our network's performance.
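
To make the idea of explicitly modelling a combination of observations concrete, here is a minimal sketch with a hypothetical disease node, two findings, and a deterministic pattern node; the structure and all probabilities are invented for illustration and are not taken from the CSF network described in the paper.

```python
# A hypothetical two-finding fragment (not the authors' CSF network): disease D,
# findings F1 and F2, and an explicit pattern node P that is a deterministic
# AND of the findings; all probabilities are invented for illustration.
import itertools

p_d = {1: 0.05, 0: 0.95}                     # prior P(D)
p_f_given_d = {"F1": {1: 0.8, 0: 0.1},       # P(F=1 | D) per finding
               "F2": {1: 0.7, 0: 0.2}}

def joint(d, f1, f2, p):
    """Joint probability of one configuration; P is fixed to AND(F1, F2)."""
    if p != int(f1 == 1 and f2 == 1):
        return 0.0
    pf1 = p_f_given_d["F1"][d] if f1 else 1 - p_f_given_d["F1"][d]
    pf2 = p_f_given_d["F2"][d] if f2 else 1 - p_f_given_d["F2"][d]
    return p_d[d] * pf1 * pf2

def posterior_disease(evidence):
    """P(D=1 | evidence) by brute-force enumeration."""
    num = den = 0.0
    for d, f1, f2, p in itertools.product([0, 1], repeat=4):
        cfg = {"F1": f1, "F2": f2, "P": p}
        if any(cfg[k] != v for k, v in evidence.items()):
            continue
        w = joint(d, f1, f2, p)
        den += w
        num += w if d == 1 else 0.0
    return num / den

# With a deterministic CPT, observing the pattern equals observing both findings;
# the point of an explicit pattern node is that its CPT can instead be elicited
# directly when the combination is more (or less) telling than its parts.
print(posterior_disease({"P": 1}), posterior_disease({"F1": 1, "F2": 1}))
```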


international conference on information processing | 2010

An Empirical Study of the Use of the Noisy-Or Model in a Real-Life Bayesian Network

Janneke H. Bolt; Linda C. van der Gaag

The use of the noisy-OR model is advocated throughout the literature as an approach to lightening the task of obtaining all probabilities required for a Bayesian network. Little evidence is available, however, as to the effects of using the model on a network’s performance. In this paper, we construct a noisy-OR version of a real-life hand-built Bayesian network of moderate size, and compare the performance of the original network with that of the constructed noisy-OR version. Empirical results from using the two networks on real-life data show that the performance of the original network does not degrade by using the noisy-OR model.
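
As a reminder of what the noisy-OR model buys, the sketch below computes a noisy-OR conditional probability from per-cause link probabilities and a leak term; the parameters are invented and unrelated to the network studied in the paper.

```python
# Minimal noisy-OR sketch (illustrative parameters only). Each present cause i
# fails to trigger the effect with probability (1 - p_i), independently of the
# other causes; a leak term covers causes outside the model.
def noisy_or(link_probs, present, leak=0.0):
    """P(effect = true | states of the causes) under the noisy-OR model.

    link_probs : list of p_i = P(effect | only cause i present)
    present    : list of booleans, which causes are active
    leak       : P(effect | no modelled cause present)
    """
    q = 1.0 - leak
    for p_i, active in zip(link_probs, present):
        if active:
            q *= 1.0 - p_i
    return 1.0 - q

# Three causes with link probabilities 0.8, 0.6, 0.3 and a small leak.
# Four numbers specify the whole CPT, instead of the 8 conditional
# probabilities a fully specified table over three binary causes would need.
print(noisy_or([0.8, 0.6, 0.3], [True, False, True], leak=0.05))
print(noisy_or([0.8, 0.6, 0.3], [False, False, False], leak=0.05))  # just the leak
```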


european conference on symbolic and quantitative approaches to reasoning and uncertainty | 2003

Introducing Situational Influences in QPNs

Janneke H. Bolt; Linda C. van der Gaag; Silja Renooij

A qualitative probabilistic network models the probabilistic relationships between its variables by means of signs. Non-monotonic influences are modelled by the ambiguous sign ‘?’, which indicates that the actual sign of the influence depends on the current state of the network. The presence of influences with such ambiguous signs tends to lead to ambiguous results upon inference. In this paper we introduce the concept of situational influence into qualitative networks. A situational influence is a non-monotonic influence supplemented with a sign that indicates its effect in the current state of the network. We show that reasoning with such situational influences may forestall ambiguous results upon inference; we further show how these influences change as the current state of the network changes.
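
For context, the standard QPN sign algebra (sign product along a chain of influences, sign addition across parallel chains) shows how the ambiguous sign '?' tends to swamp inference results. The sketch below is a generic illustration of that algebra, not the paper's algorithm; the situational refinement is only hinted at in the comments.

```python
# Standard QPN sign algebra, sketched for illustration.
# Signs: '+', '-', '0', and the ambiguous '?'. The paper's situational
# influences refine '?' with the sign it has in the current network state.

def sign_product(a, b):
    """Combine signs along a chain of influences."""
    if a == '0' or b == '0':
        return '0'
    if a == '?' or b == '?':
        return '?'
    return '+' if a == b else '-'

def sign_sum(a, b):
    """Combine signs of parallel influence chains."""
    if a == '0':
        return b
    if b == '0':
        return a
    if a == '?' or b == '?' or a != b:
        return '?'
    return a

# A '?' on any chain tends to swamp the result with ambiguity:
print(sign_product('+', '?'))          # '?'
print(sign_sum('+', '-'))              # '?'  (conflicting parallel chains)
# Replacing '?' by its situational sign (say '+') restores an informative result:
print(sign_product('+', '+'))          # '+'
```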


european conference on artificial intelligence | 2016

Exploiting Bayesian network sensitivity functions for inference in credal networks

Janneke H. Bolt; Jasper De Bock; Silja Renooij

A Bayesian network is a concise representation of a joint probability distribution, which can be used to compute any probability of interest for the represented distribution. Credal networks were introduced to cope with the inevitable inaccuracies in the parametrisation of such a network. Where a Bayesian network is parametrised by defining unique local distributions, in a credal network sets of local distributions are given. From a credal network, lower and upper probabilities can be inferred. Such inference, however, is often problematic since it may require a number of Bayesian network computations exponential in the number of credal sets. In this paper we propose a preprocessing step that is able to reduce this complexity. We use sensitivity functions to show that for some classes of parameter in Bayesian networks the qualitative effect of a parameter change on an outcome probability of interest is independent of the exact numerical specification. We then argue that credal sets associated with such parameters can be replaced by a single distribution.
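
One-way sensitivity functions of Bayesian network outputs are known to have a fractional linear form, and their monotonicity is a qualitative property that holds over the parameter's entire range. The sketch below, with invented coefficients, illustrates why such knowledge lets a credal set be collapsed to a single extreme distribution before inference.

```python
# One-way sensitivity functions in Bayesian networks take the fractional linear
# form f(t) = (a*t + b) / (c*t + d), with coefficients determined by the rest of
# the network. The derivative, (a*d - b*c) / (c*t + d)**2, keeps a fixed sign on
# the parameter's range, so the *direction* of the effect is a qualitative
# property. The coefficients below are made up for illustration.

def sensitivity(a, b, c, d):
    f = lambda t: (a * t + b) / (c * t + d)
    det = a * d - b * c
    direction = '+' if det > 0 else ('-' if det < 0 else '0')
    return f, direction

f, direction = sensitivity(a=0.4, b=0.1, c=-0.2, d=0.9)
print(direction)                       # '+': the output only increases with t
print(f(0.0), f(0.5), f(1.0))          # monotone over [0, 1]

# If the output of interest is known to be monotone in a parameter, its lower
# (upper) bound over that parameter's credal set is attained at an endpoint,
# so the set can be replaced by a single extreme distribution beforehand.
```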


european conference on symbolic and quantitative approaches to reasoning and uncertainty | 2015

Balanced Tuning of Multi-dimensional Bayesian Network Classifiers

Janneke H. Bolt; Linda C. van der Gaag

Multi-dimensional classifiers are Bayesian networks of restricted topological structure, for classifying data instances into multiple classes. We show that upon varying their parameter probabilities, the graphical properties of these classifiers induce higher-order sensitivity functions of restricted functional form. To allow ready interpretation of these functions, we introduce the concept of balanced sensitivity function in which parameter probabilities are related by the odds ratios of their original and new values. We demonstrate that these balanced functions provide a suitable heuristic for tuning multi-dimensional Bayesian network classifiers, with guaranteed bounds on the changes of all output probabilities.
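
One way to read the odds-ratio tying of parameters is that every parameter in the tied set is shifted by one common odds ratio. The sketch below, with invented values, shows the resulting co-variation; it is an illustrative reading, not the paper's exact construction.

```python
# Hedged sketch of odds-ratio-tied parameter shifts: each parameter in the tied
# set is moved so that its odds are multiplied by the same ratio r, i.e.
# odds(t_new) = r * odds(t), giving t_new = r*t / (1 - t + r*t). Values are
# illustrative only.

def shift_by_odds_ratio(theta, r):
    """Move probability theta so that its odds are multiplied by r."""
    return r * theta / (1.0 - theta + r * theta)

params = [0.10, 0.40, 0.75]             # original parameter probabilities
r = 2.0                                  # one common odds ratio for the set
print([shift_by_odds_ratio(t, r) for t in params])
# Small and large probabilities stay inside (0, 1) and move proportionally on
# the odds scale, which keeps the tied shift easy to interpret and to bound.
```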


International Journal of Approximate Reasoning | 2017

Balanced sensitivity functions for tuning multi-dimensional Bayesian network classifiers

Janneke H. Bolt; Linda C. van der Gaag

Multi-dimensional Bayesian network classifiers are Bayesian networks of restricted topological structure, which are tailored to classifying data instances into multiple dimensions. Like more traditional classifiers, multi-dimensional classifiers are typically learned from data and may include inaccuracies in their parameter probabilities. We will show that the graphical properties and dedicated use of these classifiers induce higher-order sensitivity functions of a highly constrained functional form in these parameters. We then introduce the concept of balanced sensitivity function in which multiple parameters are functionally related by the odds ratios of their original and new values, and argue that these functions provide for a suitable heuristic for tuning multi-dimensional classifiers with guaranteed bounds on the effects on their output probabilities. We demonstrate the practicability of our heuristic by means of a classifier for a real-world application in the veterinary field.

Highlights: n-way sensitivity functions that can be established efficiently are given for multi-dimensional classifiers; balanced functions, capturing a network's output in multiple tied parameters, are introduced; balanced tuning of multi-dimensional Bayesian classifiers is formalised.


probabilistic graphical models | 2014

Local Sensitivity of Bayesian Networks to Multiple Simultaneous Parameter Shifts

Janneke H. Bolt; Silja Renooij

The robustness of the performance of a Bayesian network to shifts in its parameters can be studied with a sensitivity analysis. For reasons of computational efficiency such an analysis is often limited to studying shifts in only one or two parameters at a time. The concept of sensitivity value, an important notion in sensitivity analysis, captures the effect of local changes in a single parameter. In this paper we generalise this concept to an n-way sensitivity value in order to capture the local effect of multiple simultaneous parameter changes. Moreover, we demonstrate that an n-way sensitivity value can be computed efficiently, even for large n. An n-way sensitivity value is direction dependent and its maximum, minimum, and direction of maximal change can be easily determined. The direction of maximal change can, for example, be exploited in network tuning. To this end, we introduce the concept of sliced sensitivity function for an n-way sensitivity function restricted to parameter shifts in a fixed direction. We moreover argue that such a function can be computed efficiently.
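
A generic illustration of the idea: for a chosen unit direction u, the local effect of moving all n parameters simultaneously is the directional derivative of the output in that direction, and the maximal local change is attained in the gradient's direction. The output function below is an arbitrary stand-in, not a real network's sensitivity function, and all numbers are invented.

```python
# Sketch of an n-way sensitivity value via the local gradient: the directional
# derivative grad(f) . u gives the local effect of a simultaneous shift in
# direction u; its maximum over unit directions is the gradient norm.
import numpy as np

def output_probability(theta):
    """Stand-in for a network output probability as a function of n parameters."""
    return (0.3 * theta[0] + 0.5 * theta[1] * theta[2] + 0.1) / (0.2 * theta[0] + 1.0)

def gradient(f, theta, eps=1e-6):
    """Central-difference estimate of the gradient of f at theta."""
    theta = np.asarray(theta, dtype=float)
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        step = np.zeros_like(theta)
        step[i] = eps
        g[i] = (f(theta + step) - f(theta - step)) / (2 * eps)
    return g

theta0 = np.array([0.4, 0.6, 0.3])
g = gradient(output_probability, theta0)
print(g)                                # per-parameter (1-way) sensitivity values
print(np.linalg.norm(g))                # maximal local n-way sensitivity value
print(g / np.linalg.norm(g))            # direction of maximal change
u = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
print(g @ u)                            # value for one fixed direction (a "slice")
```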


european conference on symbolic and quantitative approaches to reasoning and uncertainty | 2017

Structure-Based Categorisation of Bayesian Network Parameters

Janneke H. Bolt; Silja Renooij

Bayesian networks typically require thousands of probability parameters for their specification, many of which are bound to be inaccurate. Knowledge of the direction of change in an output probability of a network occasioned by changes in one or more of its parameters, i.e. the qualitative effect of parameter changes, has been shown to be useful both for parameter tuning and in preprocessing for inference in credal networks. In this paper we identify classes of parameter for which the qualitative effect on a given output of interest can be identified based upon graphical considerations.


information processing and management of uncertainty | 2014

The General Expression of the Prior Convergence Error: A Proof

Janneke H. Bolt

In [2], we introduced the notion of the parental synergy. In the same paper, moreover, an expression was advanced for the prior convergence error (the error found in the marginal probabilities computed for a node when the parents of this node are wrongly assumed to be independent), in which the parental synergy has a key position as weighting factor. This key position suggests that the parental synergy captures a fundamental feature of a Bayesian network. In this paper a proof is provided for the correctness of the conjectured expression of the prior convergence error.
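
The prior convergence error itself can be illustrated without the parental-synergy expression from [2], which is not reproduced here: compare the exact prior marginal of a child with the marginal obtained when its dependent parents are wrongly treated as independent. All numbers below are invented.

```python
# Illustration of the prior convergence error: the difference between the exact
# prior marginal of C and the marginal computed as if its (dependent) parents
# A and B were independent. Probabilities are made up for illustration.
import itertools

p_a = 0.3                               # P(A=1)
p_b_given_a = {1: 0.9, 0: 0.2}          # P(B=1 | A), so A and B are dependent
p_c_given_ab = {(1, 1): 0.95, (1, 0): 0.5, (0, 1): 0.4, (0, 0): 0.05}

def p_ab(a, b):                         # exact joint over the parents
    pa = p_a if a else 1 - p_a
    pb = p_b_given_a[a] if b else 1 - p_b_given_a[a]
    return pa * pb

p_b = sum(p_ab(a, 1) for a in (0, 1))   # marginal P(B=1)

exact = sum(p_c_given_ab[(a, b)] * p_ab(a, b)
            for a, b in itertools.product((0, 1), repeat=2))
independent = sum(p_c_given_ab[(a, b)]
                  * (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
                  for a, b in itertools.product((0, 1), repeat=2))

print(exact, independent, independent - exact)   # the last term is the error
```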


european conference on artificial intelligence | 2014

Sensitivity of multi-dimensional Bayesian classifiers

Janneke H. Bolt; Silja Renooij

One-dimensional Bayesian network classifiers (OBCs) are popular tools for classification [2]. An OBC is a Bayesian network [4] consisting of just a single class variable and several feature variables. Multi-dimensional Bayesian network classifiers (MBCs) were introduced to generalise OBCs to multiple class variables [1, 6]. Classification performance of OBCs is known to be rather good. Experimental results that support this observation were substantiated by a study of the sensitivity properties of naive OBCs [5]. In this paper we investigate the sensitivity of MBCs. We present sensitivity functions for the outcome probabilities of interest of an MBC and use these functions to study the sensitivity value. This value captures the sensitivity of an output probability to small changes in a parameter. We compare MBCs to OBCs in this respect and conclude that an MBC will on average be even more robust to parameter changes than an OBC.
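
To illustrate the sensitivity value for the one-dimensional case, the toy sketch below perturbs a single parameter of a naive Bayesian classifier and measures the local change in the class posterior; the classifier and all numbers are invented for illustration.

```python
# Toy one-way sensitivity values for a naive (one-dimensional) Bayesian
# classifier: perturb one feature parameter and measure the local change of
# the class posterior. Numbers are illustrative only.
import numpy as np

prior = 0.4                              # P(class = 1)
p_feat = np.array([[0.7, 0.2],           # P(feature_i = observed value | class)
                   [0.6, 0.5],           # rows: features; columns: class 1 / class 0
                   [0.3, 0.4]])

def posterior(p_feat):
    """P(class = 1 | observed feature values) for the naive classifier."""
    lik1 = prior * np.prod(p_feat[:, 0])
    lik0 = (1 - prior) * np.prod(p_feat[:, 1])
    return lik1 / (lik1 + lik0)

def sensitivity_value(i, eps=1e-6):
    """|d posterior / d parameter| for parameter P(feature_i | class = 1)."""
    up, down = p_feat.copy(), p_feat.copy()
    up[i, 0] += eps
    down[i, 0] -= eps
    return abs(posterior(up) - posterior(down)) / (2 * eps)

print(posterior(p_feat))
print([sensitivity_value(i) for i in range(3)])   # small values: a robust output
```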

Collaboration


Dive into Janneke H. Bolt's collaborations.

Top Co-Authors

A.R.W. Elbers

Wageningen University and Research Centre


W.L.A. Loeffen

Wageningen University and Research Centre
