Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Steffen Hölldobler is active.

Publication


Featured research published by Steffen Hölldobler.


Journal of Applied Logic | 2004

Logic Programs and Connectionist Networks

Pascal Hitzler; Steffen Hölldobler; Anthony Karel Seda

One facet of the question of integration of Logic and Connectionist Systems, and how these can complement each other, concerns the points of contact, in terms of semantics, between neural networks and logic programs. In this paper, we show that certain semantic operators for propositional logic programs can be computed by feedforward connectionist networks, and that the same semantic operators for first-order normal logic programs can be approximated by feedforward connectionist networks. Turning the networks into recurrent ones also allows one to approximate the models associated with the semantic operators. Our methods depend on a well-known theorem of Funahashi, and necessitate the study of when Funahashi's theorem can be applied, and also of what means of approximation are appropriate and significant.
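
The propositional part of this result can be made concrete with a small sketch: the immediate consequence operator of a propositional program is computed by a single forward pass through a two-layer network of threshold units, and feeding the output back in plays the role of the recurrent network. The program, weights, and thresholds below are illustrative choices, not the paper's exact construction.

```python
# Sketch: computing the immediate consequence operator T_P of a propositional
# logic program by one forward pass of threshold units. The clause encoding,
# weights and thresholds are illustrative assumptions, not the paper's
# construction.
import numpy as np

# A clause is (head, positive_body, negative_body), e.g. d <- b, not e.
program = [
    ("a", ["b", "c"], []),
    ("b", [], []),            # fact: b <-
    ("c", ["b"], []),
    ("d", ["b"], ["e"]),
]
atoms = sorted({h for h, _, _ in program} |
               {x for _, p, n in program for x in p + n})
idx = {a: i for i, a in enumerate(atoms)}

# Hidden layer: one unit per clause, firing iff the clause body holds in the
# current interpretation. Output layer: one unit per atom, firing iff some
# clause with that head fired.
W_in = np.zeros((len(program), len(atoms)))
theta = np.zeros(len(program))
W_out = np.zeros((len(atoms), len(program)))
for j, (head, pos, neg) in enumerate(program):
    for p in pos:
        W_in[j, idx[p]] = 1.0
    for n in neg:
        W_in[j, idx[n]] = -1.0
    theta[j] = len(pos) - 0.5     # all positive body atoms true, no negative one
    W_out[idx[head], j] = 1.0

def tp_pass(interp):
    """One application of T_P as a forward pass (interp is a 0/1 vector)."""
    hidden = (W_in @ interp > theta).astype(float)
    return (W_out @ hidden > 0.5).astype(float)

# Feeding the output back in mimics the recurrent network and, for this
# program, reaches a fixed point (the intended model).
I = np.zeros(len(atoms))
for _ in range(len(atoms) + 1):
    I = tp_pass(I)
print({a: bool(I[idx[a]]) for a in atoms})   # a, b, c, d true; e false
```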


Applied Intelligence | 1999

Approximating the Semantics of Logic Programs by Recurrent Neural Networks

Steffen Hölldobler; Yvonne Kalinke; Hans-Peter Störr

In [1] we have shown how to construct a 3-layered recurrent neural network that computes the fixed point of the meaning function T_P of a given propositional logic program P, which corresponds to the computation of the semantics of P. In this article we consider the first-order case. We define a notion of approximation for interpretations and prove that there exists a 3-layered feed-forward neural network that approximates the calculation of T_P for a given first-order acyclic logic program P with an injective level mapping arbitrarily well. Extending the feed-forward network by recurrent connections, we obtain a recurrent neural network whose iteration approximates the fixed point of T_P. This result is proven by taking advantage of the fact that for acyclic logic programs the function T_P is a contraction mapping on a complete metric space defined by the interpretations of the program. Mapping this space to the metric space R with Euclidean distance, a real-valued function f_P can be defined which corresponds to T_P and is both continuous and a contraction. Consequently, it can be approximated by an appropriately chosen class of feed-forward neural networks.
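
The contraction argument can be illustrated on a tiny ground program: with an injective level mapping and the distance d(I, J) = 2^(-l), where l is the least level on which two interpretations differ, one application of the consequence operator at least halves the distance. The program, levels, and metric below are illustrative assumptions, not the construction from the article.

```python
# Sketch: T_P as a contraction on interpretations of an acyclic ground program
# with an injective level mapping. Program, levels and metric are illustrative
# assumptions chosen to make the contraction property visible.
import random

N = 8  # ground atoms e_0 .. e_7, with level(e_k) = k + 1

def t_p(I):
    """Immediate consequences of I for:  e_0 <- .   e_k <- not e_(k-1)."""
    return {0} | {k for k in range(1, N) if (k - 1) not in I}

def dist(I, J):
    """d(I, J) = 2^(-l), l the least level on which I and J differ (0 if none)."""
    diffs = [k + 1 for k in range(N) if (k in I) != (k in J)]
    return 0.0 if not diffs else 2.0 ** (-min(diffs))

random.seed(0)
for _ in range(5):
    I = {k for k in range(N) if random.random() < 0.5}
    J = {k for k in range(N) if random.random() < 0.5}
    # membership of e_k in T_P(I) depends only on e_(k-1), which has a lower
    # level, so the first disagreement moves up by at least one level:
    print(f"{dist(t_p(I), t_p(J)):.4f} <= {0.5 * dist(I, J):.4f}")
```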


Journal of Advanced Computational Intelligence and Intelligent Informatics | 2003

The Fuzzy Description Logic ALCFH with Hedge Algebras as Concept Modifiers

Steffen Hölldobler; Hans-Peter Störr; Tran Dinh Khang

In this paper we present the fuzzy description logic ALCFH, in which primitive concepts are modified by means of hedges taken from hedge algebras. ALCFH is strictly more expressive than Fuzzy-ALC defined in [11]. We show that, given a linearly ordered set of hedges, primitive concepts can be modified to any desired degree by prefixing them with appropriate chains of hedges. Furthermore, we define a decision procedure for the unsatisfiability problem in ALCFH, and discuss knowledge base expansion when using terminologies, truth bounds, as well as expressivity and complexity issues. We extend [8] by allowing modifiers on non-primitive concepts and by extending the satisfiability procedure to handle concept definitions.
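
For intuition only, the following toy sketch shows how prefixing a fuzzy concept with chains of modifiers moves its membership degree. It uses Zadeh-style exponent hedges rather than the hedge algebras of the paper, and the concept, degrees, and hedge names are invented.

```python
# Toy illustration of modifying a fuzzy concept with a chain of hedges.
# Note: Zadeh-style exponent hedges are used purely for intuition; the paper
# works with hedge algebras, an ordered algebraic structure, not exponents.
HEDGES = {"very": lambda d: d ** 2, "slightly": lambda d: d ** 0.5}

def modify(degree, chain):
    """Apply a chain of hedges to a membership degree, innermost hedge first."""
    for h in reversed(chain):
        degree = HEDGES[h](degree)
    return degree

tall = 0.7  # assumed membership degree of some individual in the concept Tall
print(modify(tall, ["very"]))           # 0.49
print(modify(tall, ["very", "very"]))   # ~0.24
print(modify(tall, ["slightly"]))       # ~0.84
```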


International Conference on Logic Programming | 2009

Logic Programs under Three-Valued Łukasiewicz Semantics

Steffen Hölldobler; Carroline Dewi Puspa Kencana Ramli

If logic programs are interpreted over a three-valued logic, then often Kleene's strong three-valued logic with complete equivalence and Fitting's associated immediate consequence operator is used. However, in such a logic the least fixed point of the Fitting operator is not necessarily a model for the program under consideration. Moreover, the model intersection property does not hold. In this paper, we consider the three-valued Łukasiewicz semantics and show that fixed points of the Fitting operator are also models for the program under consideration and that the model intersection property holds. Moreover, we review a slightly different immediate consequence operator first introduced by Stenning and van Lambalgen and relate it to the Fitting operator under Łukasiewicz semantics. Some examples are discussed to support the claim that Łukasiewicz semantics and the Stenning and van Lambalgen operator are better suited to model commonsense and human reasoning.
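
The semantic difference is small and easy to tabulate: strong Kleene and three-valued Łukasiewicz implication agree everywhere except when both arguments are undefined, where Łukasiewicz makes the implication true. The sketch below prints both tables; the encoding of truth values is ours.

```python
# Sketch: strong Kleene vs. three-valued Łukasiewicz implication. They differ
# only on U -> U, which Łukasiewicz evaluates to true. Truth values are T, F,
# U (undefined); the encoding below is ours.
T, F, U = "T", "F", "U"
ORDER = {F: 0, U: 1, T: 2}
NEG = {T: F, F: T, U: U}

def impl_kleene(a, b):        # a -> b  is  not a or b  under strong Kleene
    return max(NEG[a], b, key=ORDER.get)

def impl_lukasiewicz(a, b):   # identical except that U -> U is T
    return T if (a, b) == (U, U) else impl_kleene(a, b)

for a in (T, U, F):
    for b in (T, U, F):
        print(f"{a} -> {b}:  Kleene {impl_kleene(a, b)}   "
              f"Lukasiewicz {impl_lukasiewicz(a, b)}")
```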


Annals of Mathematics and Artificial Intelligence | 1995

Computing change and specificity with equational logic programs

Steffen Hölldobler; Michael Thielscher

Recent deductive approaches to reasoning about action and change allow us to model objects and methods in a deductive framework. In these approaches, inheritance of methods comes for free, whereas overriding of methods is unsupported. In this paper, we present an equational logic framework for objects, methods, inheritance, and overriding of methods. Overriding is achieved via the concept of specificity, which states that more specific methods are preferred to less specific ones. Specificity is computed with the help of negation as failure. We specify equational logic programs and show that their completed versions behave as intended. Furthermore, we prove that SLDENF-resolution is complete if the equational theory is finitary, the completed programs are consistent, and no derivation flounders or is infinite. Moreover, we give syntactic conditions which guarantee that no derivation flounders or is infinite. Finally, we discuss how the approach can be extended to reasoning about the past in the context of incompletely specified objects or situations. It will turn out that constructive negation is needed to solve these problems.
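
The overriding-by-specificity idea can be illustrated informally: a method defined for a class applies unless a more specific class on the object's is-a chain also defines it, and checking that no such more specific definition exists is exactly where negation as failure enters the logic-programming formulation. The hierarchy and method names below are invented for illustration; this is not the paper's equational encoding.

```python
# Toy illustration of overriding via specificity: a method attached to a class
# applies unless a more specific class on the is-a chain also defines it. The
# "no more specific definition exists" check plays the role of negation as
# failure. Classes and methods are invented.
parents = {"penguin": "bird", "bird": "animal"}          # is-a hierarchy
methods = {("animal", "move"): "walk",
           ("bird", "move"): "fly",
           ("penguin", "move"): "swim"}

def ancestors(cls):
    while cls is not None:
        yield cls
        cls = parents.get(cls)

def call(cls, name):
    """Pick the most specific definition of `name` along the is-a chain."""
    for c in ancestors(cls):              # from most to least specific
        if (c, name) in methods:          # a more specific match blocks the rest
            return methods[(c, name)]
    raise LookupError(name)

print(call("penguin", "move"))  # swim  (overrides bird's fly and animal's walk)
print(call("bird", "move"))     # fly
```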


Neurocomputing | 2008

Connectionist model generation: A first-order approach

Sebastian Bader; Pascal Hitzler; Steffen Hölldobler

Knowledge-based artificial neural networks have been applied quite successfully to propositional knowledge representation and reasoning tasks. However, as soon as these tasks are extended to structured objects and structure-sensitive processes as expressed, e.g., by means of first-order predicate logic, it is not obvious at all what neural-symbolic systems would look like such that they are truly connectionist, are able to learn, and allow for a declarative reading and logical reasoning at the same time. The core method aims at such an integration. It is a method for connectionist model generation using recurrent networks with a feed-forward core. We show in this paper how the core method can be used to learn first-order logic programs in a connectionist fashion, such that the trained network is able to reason over the acquired knowledge. We also report on experimental evaluations which show the feasibility of our approach.
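
A minimal propositional reading of the learn-then-iterate idea, under our own assumptions about the architecture and training rule: train a feed-forward core to mimic the consequence operator of a target program, then add recurrent connections by feeding the core's output back as its input to generate a model.

```python
# Minimal sketch of the learn-then-iterate idea in a propositional setting:
# a feed-forward "core" is trained to mimic T_P and then iterated through a
# feedback loop. Target program, perceptron rule and sizes are illustrative
# assumptions, not the paper's architecture.
import itertools
import numpy as np

atoms = ["a", "b", "c"]
def t_p(v):                       # target:  a <- b, c.   b <- .   c <- b.
    return np.array([v[1] * v[2], 1.0, v[1]])

# Training data: T_P applied to every interpretation over the three atoms.
X = np.array(list(itertools.product([0.0, 1.0], repeat=3)))
Y = np.array([t_p(x) for x in X])

# One perceptron per atom; each atom has a single defining clause here, so its
# consequence is linearly separable and perceptron learning converges.
W = np.zeros((3, 3))
bias = np.zeros(3)
for _ in range(50):
    for x, y in zip(X, Y):
        pred = (W @ x + bias > 0).astype(float)
        W += np.outer(y - pred, x)
        bias += y - pred

def core(v):
    return (W @ v + bias > 0).astype(float)

# Recurrent use of the trained core: iterate from the empty interpretation.
v = np.zeros(3)
for _ in range(4):
    v = core(v)
print(dict(zip(atoms, v)))        # least model: a, b, c all true
```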


Journal of Logic and Computation | 1996

Linear Deductive Planning

Gerd Grosse; Steffen Hölldobler; Josef Schneeberger

Recently, three approaches to deductive planning were developed, which solve the technical frame problem without the need to state frame axioms explicitly. These approaches are based on the linear connection method, an equational Horn logic, and linear logic. At first glance these approaches seem to be very different. In the linear connection method a syntactical condition, namely that each literal is connected at most once, is imposed on proofs. In the equational logic approach situations and plans are represented as terms and SLDE-resolution is applied as an inference rule. The linear logic approach is a Gentzen-style proof system without weakening and contraction rules. On second glance, however, and as a consequence of the results rigorously proved in this paper, it will turn out that the three approaches are equivalent. They are based on the very same idea that facts about a situation are taken as resources which can be consumed and produced.
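
To make the shared idea concrete, here is a toy, non-authoritative sketch of planning with facts as consumable and producible resources: the state is a multiset of facts, unaffected facts simply persist, and no frame axioms are needed. The actions and fact names are invented.

```python
# Toy sketch of "facts as resources": an action consumes some facts and
# produces others; everything else carries over untouched, so no frame axioms
# are required. Actions and facts are invented for illustration.
from collections import Counter

actions = {
    "make_dough": ({"flour": 1, "water": 1}, {"dough": 1}),
    "bake":       ({"dough": 1, "oven_free": 1}, {"bread": 1, "oven_free": 1}),
}

def apply_action(state, consumes, produces):
    """Fire an action if its consumed resources are available, else None."""
    if any(state[f] < n for f, n in consumes.items()):
        return None
    return state - Counter(consumes) + Counter(produces)

def plan(state, goal, depth=5):
    """Depth-bounded search for an action sequence establishing the goal facts."""
    if all(state[f] >= n for f, n in goal.items()):
        return []
    if depth == 0:
        return None
    for name, (consumes, produces) in actions.items():
        nxt = apply_action(state, consumes, produces)
        if nxt is not None:
            rest = plan(nxt, goal, depth - 1)
            if rest is not None:
                return [name] + rest
    return None

start = Counter({"flour": 1, "water": 1, "oven_free": 1})
print(plan(start, {"bread": 1}))    # ['make_dough', 'bake']
```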


Lecture Notes in Computer Science | 2002

Incremental Fuzzy Decision Trees

Marina Guetova; Steffen Hölldobler; Hans-Peter Störr

We present a new classification algorithm that combines three properties: it generates decision trees, which have proved to be a valuable and intelligible tool for classification and generalization of data; it utilizes fuzzy logic, which provides for a fine-grained description of classified items adequate for human reasoning; and it is incremental, allowing rapid alternation of classification and learning of new data. The algorithm generalizes known non-incremental algorithms for top-down induction of fuzzy decision trees, as well as known incremental algorithms for induction of decision trees in classical logic. The algorithm is shown to terminate and to yield results equivalent to the non-incremental version.
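
As a small, assumption-laden sketch of the non-incremental core that the algorithm generalizes: in top-down induction of fuzzy decision trees, membership degrees act as fractional example counts when scoring a split. The attribute, its fuzzy terms, and the data below are invented; the incremental bookkeeping is not shown.

```python
# Sketch: fuzzy-weighted information gain, the split criterion behind top-down
# induction of fuzzy decision trees. Membership degrees act as fractional
# example counts. Attribute terms and data are invented; this is intuition,
# not the paper's exact (incremental) algorithm.
import math

def entropy(weights_by_class):
    total = sum(weights_by_class.values())
    if total == 0:
        return 0.0
    return -sum(w / total * math.log2(w / total)
                for w in weights_by_class.values() if w > 0)

# Each example: (memberships in the fuzzy terms of "temperature", class label)
examples = [({"cool": 0.8, "warm": 0.2}, "stay_in"),
            ({"cool": 0.3, "warm": 0.7}, "go_out"),
            ({"cool": 0.1, "warm": 0.9}, "go_out")]

root, branch = {}, {"cool": {}, "warm": {}}
for memberships, cls in examples:
    root[cls] = root.get(cls, 0.0) + 1.0
    for term, mu in memberships.items():
        branch[term][cls] = branch[term].get(cls, 0.0) + mu

total = sum(root.values())
gain = entropy(root) - sum(sum(branch[t].values()) / total * entropy(branch[t])
                           for t in branch)
print(f"fuzzy information gain of splitting on 'temperature': {gain:.3f}")
```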


International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems | 2012

Solving periodic event scheduling problems with SAT

Peter Großmann; Steffen Hölldobler; Norbert Manthey; Karl Nachtigall; Jens Opitz; Peter Steinke

In this paper, periodic event scheduling problems (PESP) are encoded as satisfiability problems (SAT) and solved by a state-of-the-art SAT solver. Two encodings, based on direct and order-encoded domains, are presented. An experimental evaluation suggests that the SAT-based approach using the order encoding outperforms constraint-based PESP solvers, which until now were considered to be the best solvers for PESP. This opens the possibility to model significantly larger real-world problems.
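
To illustrate the order encoding that the evaluation favours, here is a small sketch of encoding an integer domain into CNF: a Boolean variable q_i stands for x <= i, consecutive variables are chained, and bounds become unit clauses. The variable numbering and the DIMACS-style printing are our own; the actual PESP constraints are not encoded here.

```python
# Sketch of the order encoding: q[i] means "x <= i"; chaining clauses make the
# q[i] monotone, and bounds become unit clauses. Variable numbering and the
# DIMACS-style output are illustrative; the PESP constraints themselves are
# not shown.

def order_encode(first_var, d):
    """Encode x with domain {0,...,d}; returns (clauses, q) with q[i] <-> x <= i."""
    q = [first_var + i for i in range(d)]                  # x <= d is always true
    clauses = [[-q[i], q[i + 1]] for i in range(d - 1)]    # x <= i implies x <= i+1
    return clauses, q

def at_most(q, c):      # x <= c, for 0 <= c < len(q)
    return [q[c]]

def at_least(q, c):     # x >= c, i.e. not (x <= c-1), for 1 <= c <= len(q)
    return [-q[c - 1]]

clauses, q = order_encode(1, d=5)               # x in {0,...,5}, Boolean vars 1..5
clauses += [at_most(q, 3), at_least(q, 2)]      # constrain 2 <= x <= 3
for clause in clauses:
    print(" ".join(map(str, clause)) + " 0")    # DIMACS clause lines
```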


Proceedings of the International Workshop on Parallelization in Inference Systems | 1990

CHCL - A Connectionist Inference System

Steffen Hölldobler; Franz J. Kurfess

CHCL is a connectionist inference system for Horn logic which is based on the connection method and uses limited resources. This paper gives an overview of the system and its implementation.

Collaboration


Dive into Steffen Hölldobler's collaboration.

Top Co-Authors

Sebastian Bader (Dresden University of Technology)

Hans-Peter Störr (Dresden University of Technology)

Michael Thielscher (University of New South Wales)

Emmanuelle-Anna Dietz (Dresden University of Technology)

Marco Ragni (University of Freiburg)

Norbert Manthey (Dresden University of Technology)

Peter Steinke (Dresden University of Technology)