Rainer Manthey
University of Bonn
Publications
Featured research published by Rainer Manthey.
Conference on Automated Deduction | 1988
Rainer Manthey; François Bry
SATCHMO is a theorem prover consisting of just a few short and simple Prolog programs; Prolog may be used for representing problem clauses as well. SATCHMO is based on a model-generation paradigm and is refutation-complete if used in a level-saturation manner. The paper provides a thorough report on experiences with SATCHMO: a considerable number of problems could be solved with surprising efficiency.
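A minimal Prolog sketch of the model-generation idea may help; it is an illustration only, not the published SATCHMO code, and the clause notation (the ---> operator) as well as the example predicates p/1, q/1, r/1 are assumptions made for this sketch. Problem clauses are written Body ---> Head; a clause whose body holds but whose head does not is violated and is repaired by asserting one of the disjuncts of its head, while a violated denial (head false) forces backtracking.

:- op(1150, xfx, '--->').
:- dynamic p/1, q/1, r/1.              % example predicates (assumed)

% Example problem clauses (hypothetical):
true  ---> (p(a) ; q(a)).              % a disjunctive fact
p(X)  ---> r(X).                       % a deduction rule
q(_)  ---> false.                      % a denial: q must stay empty

satisfiable :-
    (Body ---> Head), Body, \+ Head, !,    % pick a violated clause
    satisfy(Head),                         % repair it ...
    satisfiable.                           % ... and continue
satisfiable.                               % no violation left: a model has been built

satisfy(false)   :- !, fail.               % violated denial: backtrack
satisfy((A ; _)) :- satisfy(A).
satisfy((_ ; B)) :- satisfy(B).
satisfy(A)       :- A \= (_ ; _), assume(A).

assume(A) :- asserta(A).
assume(A) :- retract(A), fail.             % undo the assumption on backtracking

Querying satisfiable with these clauses asserts the model facts p(a) and r(a); level saturation and fairness, on which the completeness claim depends, are not modelled in this sketch.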
Extending Database Technology | 1988
François Bry; Hendrik Decker; Rainer Manthey
Integrity maintenance methods have been defined for preventing updates from violating integrity constraints. Depending on the update, the full check for constraint satisfaction is reduced to checking certain instances of some relevant constraints only. In the first part of the paper, new ideas are proposed for enhancing the efficiency of such a method. The second part is devoted to checking constraint satisfiability, i.e., whether a database exists in which all constraints are simultaneously satisfied. A satisfiability checking method is presented that employs integrity maintenance techniques. Simple Prolog programs are given that serve both as specifications and as a basis for an efficient implementation.
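The simplification idea can be sketched in a few lines of Prolog; the referential constraint, the relation names and the insert_checked helper below are hypothetical and only illustrate the principle of checking update-specific constraint instances instead of the full constraint.

:- dynamic emp/2, dept/1.

% Full constraint, as a denial: every employee's department must exist.
violated :- emp(_, D), \+ dept(D).

% Simplified, update-specific checks: only the instances that the inserted
% fact can possibly violate are evaluated.
relevant_check(emp(_, D)) :- dept(D).   % inserting emp/2: check its department only
relevant_check(dept(_)).                % inserting dept/1: cannot violate the constraint

insert_checked(Fact) :-
    assertz(Fact),
    (   relevant_check(Fact)
    ->  true
    ;   retract(Fact), fail             % reject the violating update
    ).

Here ?- insert_checked(dept(d1)), insert_checked(emp(anna, d1)). succeeds, while ?- insert_checked(emp(bob, d9)). is rejected without ever evaluating the full denial over all employees.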
Conference on Logic Programming | 1990
François Bry; Rainer Manthey; Bernd Martens
In order to faithfully describe real-life applications, knowledge bases have to manage general integrity constraints. In this article, we analyse methods for the efficient verification of integrity constraints in updated knowledge bases. These methods rely on the satisfaction of the integrity constraints before the update in order to simplify their evaluation in the updated knowledge base. During the last few years, an increasing number of publications have been devoted to various aspects of this problem. Since they use distinct formalisms and different terminologies, they are difficult to compare, and it is often hard to recognize commonalities and to find out whether techniques described in different articles are different in principle. The first part of this report gives a comprehensive state of the art in integrity verification, describing integrity constraint verification techniques in a common formalism. The second part is devoted to comparing several proposals and investigating the differences and similarities between the various methods.
East/West Database Workshop | 1995
Stefano Ceri; Rainer Manthey
Chimera is a novel database model and language which has been designed as the joint conceptual interface of the IDEA project, a major European cooperation initiative aiming at the integration of object-oriented, active and deductive database technology. In this paper, we present a view of the main features of Chimera and discuss the design choices made. The most remarkable characteristic of Chimera is that fully developed rule languages for both active and deductive rules have been integrated in an object-oriented context. Rules are at the center of interest of the IDEA project, which aims at developing prototypical components of a future “intelligent” DBMS.
Advances in Databases and Information Systems | 2004
Andreas Behrend; Rainer Manthey
Update propagation in deductive databases can be implemented by combining rule rewriting and fixpoint computation, analogously to the way query answering is performed via Magic Sets. For efficiency reasons, bottom-up propagation rules have to be subjected to Magic rewriting, thus possibly losing stratifiability. We propose to use the soft stratification approach for computing the well-founded model of the Magic propagation rules (which is guaranteed to be two-valued), because of the simplicity and efficiency of this technique.
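A hypothetical Datalog-style example (written as Prolog clauses, with relation names assumed for illustration) shows what a propagation rule looks like before Magic rewriting: the induced change of a view is derived from the delta of a base relation together with the old database state. The paper's contribution concerns the step after this, when such rules are Magic-rewritten for efficiency, stratifiability may be lost, and the soft stratification approach is used for their bottom-up fixpoint evaluation; that step is not shown here.

:- dynamic emp/2, dept/1, ins_emp/2.

% View in the old state: employees working in an existing department.
works(X) :- emp(X, D), dept(D).

% New state of the changed base relation (old facts plus insertions).
emp_new(X, D) :- emp(X, D).
emp_new(X, D) :- ins_emp(X, D).

% Propagation ("delta") rule: an inserted emp tuple induces an insertion
% into the view iff the view fact was not already derivable before.
ins_works(X) :- ins_emp(X, D), dept(D), \+ works(X).

For recursive views, such rules are evaluated bottom-up to a fixpoint rather than top-down as plain Prolog would do.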
German Workshop on Artificial Intelligence | 1987
Rainer Manthey; François Bry
Our work on automated deduction has been motivated by database problems. The set of deduction rules and integrity constraints of a logic database can be considered the axioms of a first-order theory, while the actual sets of facts constitute (finite) models of this theory. Satisfiability of the underlying axioms is a prerequisite for any logic database. The procedure described in this paper is the basis of a program called SATCHMO (SATisfiability CHecking by MOdel generation) that has been implemented at ECRC as part of a prototype schema design system for logic databases.
Computer Science Logic | 1987
François Bry; Rainer Manthey
It is shown how certain refutation methods can be extended into semi-decision procedures that are complete for both unsatisfiability and finite satisfiability. The proposed extension is justified by a new characterization of finite satisfiability. This research was motivated by a database design problem: Deduction rules and integrity constraints in definite databases have to be finitely satisfiable.
International Database Engineering and Applications Symposium | 2008
Andreas Behrend; Christian Dorau; Rainer Manthey; Gereon Schueller
In this paper, we show the usefulness and feasibility of applying conventional SQL queries for analyzing a wide spectrum of data streams. As application area, we have chosen the analysis of stock market data, mainly because this kind of application exhibits many of the characteristics that make relational query technology a valuable instrument in a stream context. The resulting TInTo system is a tool for computing so-called technical indicators: numerical values calculated from stock market data that characterize the development of stock prices over a given time period. Update propagation is used for the incremental recomputation of indicator views defined over a stream of continuously changing price data.
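As a concrete example of a technical indicator, a simple moving average over the last N ticks can be defined as a derived view; the sketch below is plain Prolog over an assumed price/3 relation and only illustrates the notion of an indicator view, whereas TInTo itself expresses such indicators as SQL views and keeps them up to date incrementally via update propagation.

:- dynamic price/3.                    % price(Symbol, Tick, Value), assumed schema

% sma(Symbol, Tick, N, Avg): average price of Symbol over ticks Tick-N+1 .. Tick.
sma(Symbol, Tick, N, Avg) :-
    Low is Tick - N + 1,
    findall(V, (between(Low, Tick, T), price(Symbol, T, V)), Vs),
    length(Vs, N),                     % defined only once the window is full
    sum_list(Vs, Sum),
    Avg is Sum / N.

After asserting price facts for the last five ticks, ?- sma(ibm, 10, 5, Avg). yields the five-tick moving average at tick 10.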
Foundations of Information and Knowledge Systems | 2008
Andreas Behrend; Rainer Manthey
In this paper, we present a new rule-based approach to consistency-preserving view updating in deductive databases. Based on rule transformations performed during schema design, fixpoint evaluations of these rules at run time compute consistent realizations of view update requests. Alternative realizations are internally expressed using disjunctive Datalog. The approach extends and integrates standard techniques for efficient query answering and integrity checking (which are likewise based on transformation techniques and fixpoint computation). Views may be stratifiably recursive. The set-orientedness of the approach makes it potentially useful in the context of (commercial) SQL systems as well.
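A small Prolog sketch may clarify what alternative realizations of a view update request are; the rule/2 and base/1 reification and the realize/2 enumerator below are assumptions made for this sketch, whereas the paper derives such alternatives through rule transformations at schema design time and a bottom-up fixpoint over disjunctive Datalog, with integrity constraints pruning inconsistent realizations (omitted here).

:- dynamic a/1, b/1.

% View definition, reified for the sketch: p is the union of a and b.
rule(p(X), [a(X)]).
rule(p(X), [b(X)]).

base(a(_)).
base(b(_)).

% realize(+Goal, -Insertions): base facts whose insertion makes Goal derivable.
realize(Goal, [Goal]) :- base(Goal), \+ Goal.   % missing base fact: insert it
realize(Goal, [])     :- base(Goal), Goal.      % already present: nothing to do
realize(Goal, Ins)    :- rule(Goal, Body), realize_all(Body, Ins).

realize_all([], []).
realize_all([G | Gs], Ins) :-
    realize(G, I1),
    realize_all(Gs, I2),
    append(I1, I2, Ins).

The request to insert p(c) then yields the two alternative realizations [a(c)] and [b(c)] on backtracking.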
International Conference on Deductive and Object-Oriented Databases | 1993
Rainer Manthey
The main conjecture of this contribution is that forthcoming intelligent database systems — in particular future DOOD systems — should be designed in such a way that a major part of the services they provide is implemented using these same services in a bootstrapping-like manner. We call such an approach “reflective”, as is often done by researchers in AI and programming languages. Data dictionaries, being part of any reasonable database system today, exhibit the reflective principle in a nutshell if they are implemented by means of the same data structures that hold application data. However, even for data dictionaries the reflective implementation is often abandoned for performance reasons. Applying reflection for more advanced and ambitious purposes, up to integrity control or query optimization, is viewed even more skeptically by many, despite the conceptual elegance of the approach. On the other hand, there are a few successful approaches around today that can be interpreted as exhibiting a reflective nature. It is the purpose of this paper to identify such examples and to encourage researchers to invest more in the reflective style and to look for new solutions to the obstacles still ahead.