Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Daan Fierens is active.

Publication


Featured research published by Daan Fierens.


Theory and Practice of Logic Programming | 2015

Inference and Learning in Probabilistic Logic Programs using Weighted Boolean Formulas

Daan Fierens; Guy Van den Broeck; Joris Renkens; Dimitar Sht. Shterionov; Bernd Gutmann; Ingo Thon; Gerda Janssens; Luc De Raedt

Probabilistic logic programs are logic programs in which some of the facts are annotated with probabilities. This paper investigates how classical inference and learning tasks known from the graphical model community can be tackled for probabilistic logic programs. Several such tasks, such as computing the marginals given evidence and learning from (partial) interpretations, have not really been addressed for probabilistic logic programs before. The first contribution of this paper is a suite of efficient algorithms for various inference tasks. It is based on a conversion of the program and the queries and evidence to a weighted Boolean formula. This allows us to reduce the inference tasks to well-studied tasks such as weighted model counting, which can be solved using state-of-the-art methods known from the graphical model and knowledge compilation literature. The second contribution is an algorithm for parameter estimation in the learning from interpretations setting. The algorithm employs Expectation Maximization, and is built on top of the developed inference algorithms. The proposed approach is experimentally evaluated. The results show that the inference algorithms improve upon the state-of-the-art in probabilistic logic programming and that it is indeed possible to learn the parameters of a probabilistic logic program from interpretations.
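The reduction described in the abstract can be illustrated with a minimal sketch, not the paper's actual ProbLog implementation: a toy program with two probabilistic facts is reduced to weighted model counting (WMC) by summing literal weights over satisfying assignments, and a conditional probability is then a ratio of two counts. The `burglary`/`earthquake` program and all names here are illustrative assumptions.

```python
from itertools import product

def wmc(variables, weights, constraint):
    """Weighted model count: sum, over all assignments satisfying
    `constraint`, of the product of per-literal weights."""
    total = 0.0
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if constraint(assignment):
            w = 1.0
            for var, val in assignment.items():
                w *= weights[var] if val else 1.0 - weights[var]
            total += w
    return total

# Toy probabilistic program: two independent probabilistic facts.
variables = ["burglary", "earthquake"]
weights = {"burglary": 0.1, "earthquake": 0.2}  # P(fact is true)

# Derived atom: alarm :- burglary.  alarm :- earthquake.
def alarm(a):
    return a["burglary"] or a["earthquake"]

# Marginal P(alarm) = WMC(alarm) / WMC(true).
p_alarm = wmc(variables, weights, alarm) / wmc(variables, weights, lambda a: True)

# Conditional P(burglary | alarm) as a ratio of two weighted model counts.
p_b_and_alarm = wmc(variables, weights, lambda a: a["burglary"] and alarm(a))
p_b_given_alarm = p_b_and_alarm / wmc(variables, weights, alarm)
print(p_alarm, p_b_given_alarm)
```

Real systems avoid this exponential enumeration by compiling the formula (e.g. to d-DNNF or SDDs), but the probabilities computed are the same.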


Advanced Engineering Informatics | 2007

Mining data from intensive care patients

Jan Ramon; Daan Fierens; Fabian Güiza; Geert Meyfroidt; Hendrik Blockeel; Maurice Bruynooghe; Greet Van den Berghe

In this paper we describe the application of data mining methods for predicting the evolution of patients in an intensive care unit. We discuss the importance of such methods for health care and other application domains of engineering. We argue that this problem is an important but challenging one for the current state of the art data mining methods and explain what improvements on current methods would be useful. We present a promising study on a preliminary data set that demonstrates some of the possibilities in this area.


Inductive Logic Programming | 2005

Logical Bayesian networks and their relation to other probabilistic logical models

Daan Fierens; Hendrik Blockeel; Maurice Bruynooghe; Jan Ramon

Logical Bayesian Networks (LBNs) have recently been introduced as another language for knowledge-based model construction of Bayesian networks, besides existing languages such as Probabilistic Relational Models (PRMs) and Bayesian Logic Programs (BLPs). The original description of LBNs introduces them as a variant of BLPs and discusses the differences with BLPs, but still leaves room for a deeper discussion of the relationship between LBNs and BLPs. The relationship to PRMs was also not treated in much detail. In this paper, we first give a more compact and clear definition of LBNs. Next, we describe in more detail how PRMs and BLPs relate to LBNs. In this way we not only see what the advantages and disadvantages of LBNs are with respect to PRMs and BLPs, but also gain more insight into the relationships between PRMs and BLPs.


Inductive Logic Programming | 2007

Generalized ordering-search for learning directed probabilistic logical models

Jan Ramon; Tom Croonenborghs; Daan Fierens; Hendrik Blockeel; Maurice Bruynooghe

Recently, there has been an increasing interest in directed probabilistic logical models and a variety of languages for describing such models has been proposed. Although many authors provide high-level arguments to show that in principle models in their language can be learned from data, most of the proposed learning algorithms have not yet been studied in detail. We introduce an algorithm, generalized ordering-search, to learn both structure and conditional probability distributions (CPDs) of directed probabilistic logical models. The algorithm upgrades the ordering-search algorithm for Bayesian networks. We use relational probability trees as a representation for the CPDs. We present experiments on blocks world domains, a gene domain and the Cora dataset.


European Conference on Machine Learning | 2005

A comparison of approaches for learning probability trees

Daan Fierens; Jan Ramon; Hendrik Blockeel; Maurice Bruynooghe

Probability trees (or Probability Estimation Trees, PETs) are decision trees with probability distributions in the leaves. Several alternative approaches for learning probability trees have been proposed but no thorough comparison of these approaches exists. In this paper we experimentally compare the main approaches using the relational decision tree learner Tilde (both on non-relational and on relational datasets). Next to the main existing approaches, we also consider a novel variant of an existing approach based on the Bayesian Information Criterion (BIC). Our main conclusion is that overall trees built using the C4.5-approach or the C4.4-approach (C4.5 without post-pruning) have the best predictive performance. If the number of classes is low, however, BIC performs equally well. An additional advantage of BIC is that its trees are considerably smaller than trees for the C4.5- or C4.4-approach.
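The key ingredient the abstract compares is how class probabilities are estimated in a leaf. A minimal sketch (my own illustration, not code from the paper) contrasts raw frequencies with the Laplace-corrected estimates commonly used in probability estimation trees, which avoid extreme 0/1 probabilities in small or pure leaves.

```python
def leaf_distribution(class_counts, laplace=True):
    """Class probability estimates in a probability-tree leaf.
    With Laplace correction: (n_c + 1) / (N + k), where k is the
    number of classes; without it: n_c / N (raw frequencies)."""
    k = len(class_counts)
    total = sum(class_counts.values())
    if laplace:
        return {c: (n + 1) / (total + k) for c, n in class_counts.items()}
    return {c: n / total for c, n in class_counts.items()}

# A pure leaf that saw 8 positive and 0 negative training examples:
counts = {"pos": 8, "neg": 0}
print(leaf_distribution(counts, laplace=False))  # raw: pos=1.0, neg=0.0
print(leaf_distribution(counts, laplace=True))   # smoothed: pos=0.9, neg=0.1
```

The smoothed leaf never assigns probability zero, which typically improves ranking measures such as AUC even when classification accuracy is unchanged.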


Annals of Mathematics and Artificial Intelligence | 2008

Learning directed probabilistic logical models: ordering-search versus structure-search

Daan Fierens; Jan Ramon; Maurice Bruynooghe; Hendrik Blockeel

We discuss how to learn non-recursive directed probabilistic logical models from relational data. This problem has been tackled before by upgrading the structure-search algorithm initially proposed for Bayesian networks. In this paper we show how to upgrade another algorithm for learning Bayesian networks, namely ordering-search. For Bayesian networks, ordering-search was found to work better than structure-search. It is non-obvious that these results carry over to the relational case, however, since there ordering-search needs to be implemented quite differently. Hence, we perform an experimental comparison of these upgraded algorithms on four relational domains. We conclude that also in the relational case ordering-search is competitive with structure-search in terms of quality of the learned models, while ordering-search is significantly faster.
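The core idea of ordering-search can be sketched in a propositional setting (the paper's contribution is the harder relational upgrade, which this deliberately does not attempt): given a variable ordering, each variable independently picks its best-scoring parent set among its predecessors, so no acyclicity check is ever needed. The BIC-style score and the toy data below are illustrative assumptions.

```python
from itertools import combinations
from math import log
from collections import Counter

def bic_score(data, child, parents):
    """BIC score of `child` given `parents` for fully observed binary
    data; `data` is a list of dicts mapping variable name -> 0/1."""
    n = len(data)
    counts = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    config_totals = Counter(tuple(row[p] for p in parents) for row in data)
    # Maximum-likelihood log-likelihood of the child's CPD.
    ll = sum(c * log(c / config_totals[cfg]) for (cfg, _), c in counts.items())
    n_params = 2 ** len(parents)  # one Bernoulli parameter per parent config
    return ll - 0.5 * n_params * log(n)

def parents_for_ordering(data, ordering, max_parents=2):
    """Ordering-search core step: for a fixed ordering, each variable
    greedily takes the best-scoring parent set among its predecessors."""
    structure = {}
    for i, child in enumerate(ordering):
        candidates = ordering[:i]
        best = max(
            (ps for k in range(min(len(candidates), max_parents) + 1)
             for ps in combinations(candidates, k)),
            key=lambda ps: bic_score(data, child, ps))
        structure[child] = best
    return structure

# Toy data in which b almost always copies a.
data = [{"a": v, "b": v} for v in (0, 1)] * 20 + [{"a": 0, "b": 1}, {"a": 1, "b": 0}]
print(parents_for_ordering(data, ["a", "b"]))  # b should select a as parent
```

A full ordering-search additionally searches over orderings (e.g. by swapping adjacent variables); because the score decomposes per child, only the swapped variables need rescoring.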


Proceedings of the Workshop on Context-Aware Movie Recommendation | 2010

Three complementary approaches to context aware movie recommendation

Hossein Rahmani; Beau Piccart; Daan Fierens; Hendrik Blockeel

We describe three different approaches to the Context Aware Movie Recommendation (CAMRa) challenge. Each approach is based on different machine learning techniques: two are nearest neighbor approaches, one is based on inductive logic programming. The results obtained with the three techniques are compared.


European Conference on Machine Learning | 2007

Learning Directed Probabilistic Logical Models: Ordering-Search Versus Structure-Search

Daan Fierens; Jan Ramon; Maurice Bruynooghe; Hendrik Blockeel

We discuss how to learn non-recursive directed probabilistic logical models from relational data. This problem has been tackled before by upgrading the structure-search algorithm initially proposed for Bayesian networks. In this paper we propose to upgrade another algorithm, namely ordering-search, since for Bayesian networks this was found to work better than structure-search. We experimentally compare the two upgraded algorithms on two relational domains. We conclude that there is no significant difference between the two algorithms in terms of quality of the learnt models while ordering-search is significantly faster.


Inductive Logic Programming | 2009

On the relationship between logical Bayesian networks and probabilistic logic programming based on the distribution semantics

Daan Fierens

A significant part of current research on (inductive) logic programming deals with probabilistic logical models. Over the last decade many logics or languages for representing such models have been introduced, and there is currently a great need for insight into the relationships between all these languages. One kind of language extends probabilistic models with elements of logic, as in the language of Logical Bayesian Networks (LBNs). Other languages follow the converse strategy of extending logic programs with a probabilistic semantics, often in a way similar to Sato's distribution semantics. In this paper we study the relationship between the language of LBNs and languages based on the distribution semantics. Concretely, we define a mapping from LBNs to theories in the Independent Choice Logic (ICL). We also show how this mapping can be used to learn ICL theories from data.


AI Communications | 2008

Learning directed probabilistic logical models from relational data

Daan Fierens

Data that has a complex relational structure and in which observations are noisy or partially missing poses several challenges to traditional machine learning algorithms. One solution to this problem is the use of so-called probabilistic logical models (models that combine elements of first-order logic with probabilities) and corresponding learning algorithms. In this thesis we focus on directed probabilistic logical models. We show how to represent such models and develop several algorithms to learn such models from data.

Collaboration


Dive into Daan Fierens's collaborations.

Top Co-Authors

Hendrik Blockeel (Katholieke Universiteit Leuven)

Jan Ramon (Katholieke Universiteit Leuven)

Maurice Bruynooghe (Katholieke Universiteit Leuven)

Nima Taghipour (Katholieke Universiteit Leuven)

Jesse Davis (Katholieke Universiteit Leuven)

Luc De Raedt (Katholieke Universiteit Leuven)

Wannes Meert (Katholieke Universiteit Leuven)

Bernd Gutmann (Katholieke Universiteit Leuven)

Geert Meyfroidt (Katholieke Universiteit Leuven)