Publication


Featured research published by Elena Bellodi.


Semantic Web | 2015

Probabilistic Description Logics under the Distribution Semantics

Fabrizio Riguzzi; Elena Bellodi; Evelina Lamma; Riccardo Zese

Representing uncertain information is crucial for modeling real world domains. In this paper we present a technique for the integration of probabilistic information in Description Logics (DLs) that is based on the distribution semantics for probabilistic logic programs. In the resulting approach, which we call DISPONTE, the axioms of a probabilistic knowledge base (KB) can be annotated with a real number between 0 and 1. A probabilistic knowledge base then defines a probability distribution over regular KBs, called worlds, and the probability of a given query can be obtained from the joint distribution of the worlds and the query by marginalization. We present the algorithm BUNDLE for computing the probability of queries from DISPONTE knowledge bases. The algorithm exploits an underlying DL reasoner, such as Pellet, that is able to return explanations for queries. The explanations are encoded in a Binary Decision Diagram from which the probability of the query is computed. The experimentation of BUNDLE on probabilistic knowledge bases shows that it can handle knowledge bases of realistic size.
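As a rough illustration of the distribution semantics sketched in this abstract, the Python toy below enumerates the worlds of a two-axiom probabilistic KB and sums the probabilities of the worlds that entail a query. The axiom strings, probabilities, and the entails check are made-up placeholders; a real system such as BUNDLE delegates entailment to a DL reasoner and avoids explicit world enumeration by compiling explanations into a BDD.

```python
from itertools import product

# Hypothetical two-axiom probabilistic KB (placeholder strings, not real DL axioms).
PROB_AXIOMS = {
    "Cat SubClassOf Pet": 0.6,
    "fluffy Type Cat": 0.9,
}

def entails(world, query):
    # Placeholder entailment check; a real system would call a DL reasoner here.
    # In this toy KB the query follows only when both axioms are present.
    return query == "fluffy Type Pet" and all(a in world for a in PROB_AXIOMS)

def query_probability(query):
    """Marginalize over worlds: sum P(world) for every world that entails the query."""
    axioms = list(PROB_AXIOMS)
    total = 0.0
    for choices in product([True, False], repeat=len(axioms)):
        world = {a for a, keep in zip(axioms, choices) if keep}
        p_world = 1.0
        for a, keep in zip(axioms, choices):
            p_world *= PROB_AXIOMS[a] if keep else 1.0 - PROB_AXIOMS[a]
        if entails(world, query):
            total += p_world
    return total

print(query_probability("fluffy Type Pet"))  # 0.9 * 0.6 = 0.54
```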


Theory and Practice of Logic Programming | 2015

Structure learning of probabilistic logic programs by searching the clause space

Elena Bellodi; Fabrizio Riguzzi

Learning probabilistic logic programming languages is receiving increasing attention, and systems are available for learning the parameters (PRISM, LeProbLog, LFI-ProbLog and EMBLEM) or both the structure and the parameters (SEM-CP-logic and SLIPCASE) of these languages. In this paper we present the algorithm SLIPCOVER for “Structure LearnIng of Probabilistic logic programs by searChing OVER the clause space.” It performs a beam search in the space of probabilistic clauses and a greedy search in the space of theories, using the log likelihood of the data as the guiding heuristic. To estimate the log likelihood, SLIPCOVER performs Expectation Maximization with EMBLEM. The algorithm has been tested on five real world datasets and compared with SLIPCASE, SEM-CP-logic, Aleph and two algorithms for learning Markov Logic Networks (Learning using Structural Motifs (LSM) and ALEPH++ExactL1). SLIPCOVER achieves higher areas under the precision-recall and receiver operating characteristic curves in most cases.
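As a hedged sketch of the clause-space search the abstract describes, the skeleton below performs a beam search over clause refinements guided by a score such as the log likelihood. The refine and score callbacks, beam width, and stopping rule are placeholders, not SLIPCOVER's actual procedure.

```python
import heapq

def clause_beam_search(initial_clause, refine, score, beam_width=5, max_iterations=10):
    """Generic beam search over clause refinements guided by a score function
    (for instance, the log likelihood of the data after parameter learning).
    `refine(clause)` returns candidate refinements; `score(clause)` returns a
    number where higher is better. Both are problem-specific placeholders."""
    beam = [(score(initial_clause), initial_clause)]
    best_score, best_clause = beam[0]
    for _ in range(max_iterations):
        candidates = [(score(r), r) for _, c in beam for r in refine(c)]
        if not candidates:
            break
        # Keep only the top-scoring refinements for the next round.
        beam = heapq.nlargest(beam_width, candidates, key=lambda pair: pair[0])
        if beam[0][0] > best_score:
            best_score, best_clause = beam[0]
    return best_clause, best_score
```

A caller would supply a refinement operator that, for example, adds literals to the clause body, and a scoring function that runs parameter learning and returns the resulting log likelihood.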


Web Reasoning and Rule Systems | 2013

BUNDLE: a reasoner for probabilistic ontologies

Fabrizio Riguzzi; Elena Bellodi; Evelina Lamma; Riccardo Zese

Representing uncertain information is very important for modeling real world domains. Recently, the DISPONTE semantics has been proposed for probabilistic description logics. In DISPONTE, the axioms of a knowledge base can be annotated with a set of variables and a real number between 0 and 1. This real number represents the probability of each version of the axiom in which the specified variables are instantiated. In this paper we present the algorithm BUNDLE for computing the probability of queries from DISPONTE knowledge bases that follow the $\mathcal{ALC}$ semantics. BUNDLE exploits an underlying DL reasoner, such as Pellet, that is able to return explanations for queries. The explanations are encoded in a Binary Decision Diagram from which the probability of the query is computed. The experiments performed by applying BUNDLE to probabilistic knowledge bases show that it can handle ontologies of realistic size and is competitive with the system PRONTO for the probabilistic description logic P-$\mathcal{SHIQ}$(D).
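The probability computation from a Binary Decision Diagram mentioned above can be illustrated with the standard bottom-up recursion, shown here for a toy diagram over two independent Boolean random variables. The Node class and the example probabilities are assumptions made for illustration, not BUNDLE's internal data structures (which also share subdiagrams and cache intermediate results).

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Node:
    """Decision node for an independent Boolean random variable."""
    prob: float                   # probability that the variable is true
    high: Union["Node", bool]     # branch taken when the variable is true
    low: Union["Node", bool]      # branch taken when the variable is false

def bdd_probability(node):
    """Bottom-up: P(True) = 1, P(False) = 0,
    P(node) = p * P(high) + (1 - p) * P(low)."""
    if isinstance(node, bool):
        return 1.0 if node else 0.0
    return (node.prob * bdd_probability(node.high)
            + (1.0 - node.prob) * bdd_probability(node.low))

# Toy diagram encoding the explanations "X1" and "not X1, X2" (i.e. X1 or X2),
# with P(X1) = 0.4 and P(X2) = 0.3.
x2 = Node(prob=0.3, high=True, low=False)
x1 = Node(prob=0.4, high=True, low=x2)
print(bdd_probability(x1))  # 1 - 0.6 * 0.7 = 0.58
```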


Intelligenza Artificiale | 2012

Experimentation of an expectation maximization algorithm for probabilistic logic programs

Elena Bellodi; Fabrizio Riguzzi

Statistical Relational Learning and Probabilistic Inductive Logic Programming are two emerging fields that use representation languages able to combine logic and probability. In the field of Logic Programming, the distribution semantics is one of the prominent approaches for representing uncertainty and underlies many languages such as ICL, PRISM, ProbLog and LPADs. Learning the parameters for such languages requires an Expectation Maximization algorithm since their equivalent Bayesian networks contain hidden variables. EMBLEM (EM over BDDs for probabilistic Logic programs Efficient Mining) is an EM algorithm for languages following the distribution semantics that computes expectations directly on the Binary Decision Diagrams that are built for inference. In this paper we present experiments comparing EMBLEM with LeProbLog, Alchemy, CEM, RIB and LFI-ProbLog on six real world datasets. The results show that EMBLEM is able to solve problems on which the other systems fail and it often achieves significantly higher areas under the Precision Recall and the ROC curves in a similar time.
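To make the role of Expectation Maximization concrete, here is a minimal toy: a program with two hidden independent probabilistic facts a and b and an observed atom q that holds iff a or b holds, with expected counts obtained by direct enumeration. This is only an illustration of EM with hidden truth values under the distribution semantics; EMBLEM itself computes the expectations on the BDDs built for inference.

```python
def em_two_facts(observations, pa=0.2, pb=0.6, iterations=100):
    """Toy EM for two hidden independent probabilistic facts a and b with an
    observed atom q that is true iff a or b is true.
    E-step: expected truth values of a and b given each observation of q.
    M-step: new parameters are the relative frequencies of those expectations."""
    n = len(observations)
    for _ in range(iterations):
        expected_a = expected_b = 0.0
        for q_true in observations:
            if q_true:
                p_q = 1.0 - (1.0 - pa) * (1.0 - pb)   # P(q = true)
                expected_a += pa / p_q                 # P(a = true | q = true)
                expected_b += pb / p_q                 # P(b = true | q = true)
            # if q is false, a and b are certainly false: nothing to add
        pa, pb = expected_a / n, expected_b / n
    return pa, pb

# Example: q observed true in 3 of 4 cases.
print(em_two_facts([True, True, True, False]))
```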


Theory and Practice of Logic Programming | 2014

Lifted Variable Elimination for Probabilistic Logic Programming

Elena Bellodi; Evelina Lamma; Fabrizio Riguzzi; Vítor Santos Costa; Riccardo Zese

Lifted inference has been proposed for various probabilistic logical frameworks in order to compute the probability of queries in a time that depends on the size of the domains of the random variables rather than the number of instances. Even if various authors have underlined its importance for probabilistic logic programming (PLP), lifted inference has been applied up to now only to relational languages outside of logic programming. In this paper we adapt Generalized Counting First Order Variable Elimination (GC-FOVE) to the problem of computing the probability of queries to probabilistic logic programs under the distribution semantics. In particular, we extend the Prolog Factor Language (PFL) to include two new types of factors that are needed for representing ProbLog programs. These factors take into account the existing causal independence relationships among random variables and are managed by the extension to variable elimination proposed by Zhang and Poole for dealing with convergent variables and heterogeneous factors. Two new operators are added to GC-FOVE for treating heterogeneous factors. The resulting algorithm is called LP$^2$.
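A toy illustration of the speed-up that lifted inference targets: for n interchangeable independent causes with the same probability, a counted (noisy-OR style) factor gives the query probability with a single power operation, whereas the grounded computation loops over every instance. The functions below are illustrative assumptions only and are not GC-FOVE, PFL, or LP$^2$.

```python
def grounded_or_probability(cause_probs):
    """Ground-level computation: visit every ground instance of the causes."""
    p_all_false = 1.0
    for p in cause_probs:
        p_all_false *= 1.0 - p
    return 1.0 - p_all_false

def lifted_or_probability(p, n):
    """Lifted computation for n interchangeable causes with identical
    probability p: one counted factor instead of n ground factors."""
    return 1.0 - (1.0 - p) ** n

n = 1000
print(grounded_or_probability([0.01] * n))  # loops over 1000 instances
print(lifted_or_probability(0.01, n))       # same value, computed without grounding
```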


Web Reasoning and Rule Systems | 2013

Parameter learning for probabilistic ontologies

Fabrizio Riguzzi; Elena Bellodi; Evelina Lamma; Riccardo Zese



Uncertainty Reasoning for the Semantic Web | 2013

Semantics and Inference for Probabilistic Description Logics

Riccardo Zese; Elena Bellodi; Evelina Lamma; Fabrizio Riguzzi; Fabiano Aguiari



Intelligenza Artificiale | 2017

cplint on SWISH: Probabilistic Logical Inference with a Web Browser

Marco Alberti; Elena Bellodi; Giuseppe Cota; Fabrizio Riguzzi; Riccardo Zese



Machine Learning | 2015

Bandit-based Monte-Carlo structure learning of probabilistic logic programs

Nicola Di Mauro; Elena Bellodi; Fabrizio Riguzzi



Uncertainty Reasoning for the Semantic Web | 2013

Learning Probabilistic Description Logics

Fabrizio Riguzzi; Elena Bellodi; Evelina Lamma; Riccardo Zese; Giuseppe Cota

