Nima Taghipour
Katholieke Universiteit Leuven
Publications
Featured research published by Nima Taghipour.
International Joint Conference on Artificial Intelligence | 2011
Guy Van den Broeck; Nima Taghipour; Wannes Meert; Jesse Davis; Luc De Raedt
Probabilistic logical languages provide powerful formalisms for knowledge representation and learning. Yet performing inference in these languages is extremely costly, especially if it is done at the propositional level. Lifted inference algorithms, which avoid repeated computation by treating indistinguishable groups of objects as one, help mitigate this cost. Seeking inspiration from logical inference, where lifted inference (e.g., resolution) is commonly performed, we develop a model-theoretic approach to probabilistic lifted inference. Our algorithm compiles a first-order probabilistic theory into a first-order deterministic decomposable negation normal form (d-DNNF) circuit. Compilation offers the advantage that inference is polynomial in the size of the circuit. Furthermore, by borrowing techniques from the knowledge compilation literature, our algorithm effectively exploits the logical structure (e.g., context-specific independencies) within the first-order model, which allows more computation to be done at the lifted level. An empirical comparison demonstrates the utility of the proposed approach.
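To make the circuit-evaluation step concrete, here is a minimal propositional sketch (my own illustration, not the authors' first-order compiler): once a theory is in d-DNNF, a weighted model count falls out of a single bottom-up pass, multiplying at decomposable AND nodes and adding at deterministic OR nodes, so its cost is linear (hence polynomial) in the circuit size. The `Lit`/`And`/`Or` node types and the `wmc` function are hypothetical names used only for this sketch.

```python
# A minimal sketch of weighted model counting on a d-DNNF circuit:
# one bottom-up pass, so cost is linear in the circuit size.

from dataclasses import dataclass
from typing import Tuple, Union

Node = Union["Lit", "And", "Or"]

@dataclass(frozen=True)
class Lit:            # a (possibly negated) propositional variable
    name: str
    positive: bool = True

@dataclass(frozen=True)
class And:            # decomposable: children share no variables
    children: Tuple[Node, ...]

@dataclass(frozen=True)
class Or:             # deterministic: children are mutually exclusive
    children: Tuple[Node, ...]

def wmc(node: Node, weights: dict) -> float:
    """Weighted model count: products at AND nodes, sums at OR nodes."""
    if isinstance(node, Lit):
        w_pos, w_neg = weights[node.name]
        return w_pos if node.positive else w_neg
    if isinstance(node, And):
        result = 1.0
        for child in node.children:
            result *= wmc(child, weights)
        return result
    # Or node: determinism makes the children's counts simply add up.
    return sum(wmc(child, weights) for child in node.children)

# Toy circuit for the theory (a -> b), i.e. (~a) v (a ^ b), already in
# d-DNNF form; the weights encode P(a)=0.3 and P(b)=0.6.
circuit = Or((
    And((Lit("a", False), Or((Lit("b"), Lit("b", False))))),
    And((Lit("a"), Lit("b"))),
))
weights = {"a": (0.3, 0.7), "b": (0.6, 0.4)}
print(wmc(circuit, weights))  # P(a -> b) = 0.7 + 0.3*0.6 = 0.88
```

The payoff of compilation is that this cheap evaluation can be repeated for many queries and evidence settings against the same circuit.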
Inductive Logic Programming | 2013
Nima Taghipour; Jesse Davis; Hendrik Blockeel
Lifted probabilistic inference methods exploit symmetries in the structure of probabilistic models to perform inference more efficiently. In lifted variable elimination, the symmetry among a group of interchangeable random variables is captured by counting formulas and exploited by operations that handle such formulas. In this paper, we generalize the structure of counting formulas and present a set of inference operators that introduce and eliminate these formulas from the model. This generalization expands the range of problems that can be solved in a lifted way. Our work is closely related to the recently introduced method of joint conversion. Owing to its more fine-grained formulation, however, our approach can provide more efficient solutions than joint conversion.
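As a concrete illustration of what counting formulas buy (a toy of my own, not the paper's generalized formulas or operators): when a factor over n interchangeable Boolean variables depends on an assignment only through the number of true variables, the sum over 2^n ground states collapses into n+1 count terms weighted by binomial coefficients. The names `phi`, `partition_ground`, and `partition_lifted` are assumptions of this sketch.

```python
# Summing over counts instead of assignments: the core idea that
# counting formulas exploit for interchangeable random variables.

from itertools import product
from math import comb

def phi(k: int, n: int) -> float:
    """A symmetric factor: its value depends only on the count k of
    true variables, e.g. a soft 'most atoms are false' preference."""
    return 2.0 ** (n - k)

def partition_ground(n: int) -> float:
    """Ground (propositional) computation: enumerate all 2^n states."""
    return sum(phi(sum(state), n) for state in product([0, 1], repeat=n))

def partition_lifted(n: int) -> float:
    """Lifted computation via a counting formula: n+1 terms, each
    weighted by the number of assignments with that count."""
    return sum(comb(n, k) * phi(k, n) for k in range(n + 1))

n = 12
print(partition_ground(n))   # 4096 terms
print(partition_lifted(n))   # 13 terms, same value: (1+2)^n = 531441.0
```

The exponential-to-linear collapse is exact here because the factor is fully symmetric; the paper's contribution is a more general class of counting formulas and the operators that introduce and eliminate them.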
European Conference on Machine Learning | 2010
Wannes Meert; Nima Taghipour; Hendrik Blockeel
Efficient probabilistic inference is key to the success of statistical relational learning. One issue that increases the cost of inference is the presence of irrelevant random variables. The Bayes-ball algorithm can identify the requisite variables in a propositional Bayesian network and thus ignore irrelevant ones. This paper presents a lifted version of Bayes-ball that works directly at the first-order level, and shows how this algorithm applies to (lifted) inference in directed first-order probabilistic models.
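For reference, here is a minimal sketch of the propositional Bayes-ball procedure that the paper lifts (following Shachter's 1998 visiting rules; the function name `requisite_nodes` and the toy network are mine): a ball starting at the query bounces through the DAG, and only nodes it marks "on top" have requisite CPDs, so everything else can be pruned before inference.

```python
# Propositional Bayes-ball: find the nodes whose CPDs are needed
# to answer P(query | evidence) in a Bayesian network.

from collections import deque

def requisite_nodes(parents: dict, query: str, evidence: set) -> set:
    """parents maps each node to the list of its parents in the DAG."""
    children = {v: [] for v in parents}
    for v, ps in parents.items():
        for p in ps:
            children[p].append(v)

    top, bottom = set(), set()            # Bayes-ball visit marks
    queue = deque([(query, "child")])     # start as if visited from a child
    while queue:
        node, came_from = queue.popleft()
        if came_from == "child" and node not in evidence:
            if node not in top:           # pass the ball up to the parents
                top.add(node)
                queue.extend((p, "child") for p in parents[node])
            if node not in bottom:        # and down to the children
                bottom.add(node)
                queue.extend((c, "parent") for c in children[node])
        elif came_from == "parent":
            if node in evidence:          # observed: bounce back to parents
                if node not in top:
                    top.add(node)
                    queue.extend((p, "child") for p in parents[node])
            elif node not in bottom:      # unobserved: pass on to children
                bottom.add(node)
                queue.extend((c, "parent") for c in children[node])
    return top                            # nodes marked on top are requisite

# Toy network: a -> b -> c, with a branch b -> d that is irrelevant
# to the query on c.
parents = {"a": [], "b": ["a"], "c": ["b"], "d": ["b"]}
print(requisite_nodes(parents, query="c", evidence=set()))
# prints {'a', 'b', 'c'} (in some order); d is not requisite
```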
International Conference on Artificial Intelligence and Statistics | 2012
Nima Taghipour; Daan Fierens; Jesse Davis; Hendrik Blockeel
Journal of Artificial Intelligence Research | 2013
Nima Taghipour; Daan Fierens; Jesse Davis; Hendrik Blockeel
International Conference on Artificial Intelligence and Statistics | 2013
Nima Taghipour; Daan Fierens; Guy Van den Broeck; Jesse Davis; Hendrik Blockeel
Neural Information Processing Systems | 2013
Nima Taghipour; Jesse Davis; Hendrik Blockeel
International Workshop on Statistical Relational Learning | 2009
Nima Taghipour; Wannes Meert; Jan Struyf; Hendrik Blockeel
Neural Information Processing Systems | 2012
Wannes Meert; Guy Van den Broeck; Nima Taghipour; Daan Fierens; Hendrik Blockeel; Jesse Davis; Luc De Raedt
arXiv: Artificial Intelligence | 2012
Nima Taghipour; Daan Fierens; Guy Van den Broeck; Jesse Davis; Hendrik Blockeel