Manfred Schmidt-Schauss
Goethe University Frankfurt
Publications
Featured research published by Manfred Schmidt-Schauss.
Mathematical Structures in Computer Science | 2008
David Sabel; Manfred Schmidt-Schauss
We present a higher-order call-by-need lambda calculus enriched with constructors, case expressions, recursive letrec expressions, a seq operator for sequential evaluation, and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (non-deterministic) normal-order reduction. This strategy can be made fair by adding resources for book-keeping. As equational theory, we use contextual equivalence (that is, terms are equal if, when plugged into any program context, their termination behaviour is the same), based on a combination of may- and must-convergence, which is appropriate for non-deterministic computations. We show that the fairness condition can be dropped for equational reasoning, since the valid equations with respect to normal-order reduction are the same as for fair normal-order reduction. We develop a number of proof tools for proving correctness of program transformations. In particular, we prove a context lemma for both may- and must-convergence that restricts the number of contexts that need to be examined for proving contextual equivalence. Combining this with so-called complete sets of commuting and forking diagrams, we show that all the deterministic reduction rules and some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal-order reduction. The structure of the ordering ≤c is also analysed: we show that Ω is not a least element and that ≤c already implies contextual equivalence with respect to may-convergence.
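The difference between may- and must-convergence can be sketched with a toy term language in Python. The erratic choice operator modelled here is weaker than the paper's locally bottom-avoiding amb; the names and encoding are our own illustrative assumptions, not the paper's calculus:

```python
# Toy terms: ('val', n) is a value, ('bot',) diverges, and
# ('choice', s, t) is an *erratic* binary choice (either branch may be taken).
# Note: the paper's amb is locally bottom-avoiding, which is stronger than
# the erratic choice modelled here; this sketch only shows why the two
# convergence predicates differ.

def may_converge(t):
    """Some reduction sequence reaches a value."""
    tag = t[0]
    if tag == 'val':
        return True
    if tag == 'bot':
        return False
    # erratic choice: one converging branch suffices
    return may_converge(t[1]) or may_converge(t[2])

def must_converge(t):
    """Every reduction sequence reaches a value."""
    tag = t[0]
    if tag == 'val':
        return True
    if tag == 'bot':
        return False
    # erratic choice: both branches must converge
    return must_converge(t[1]) and must_converge(t[2])

t = ('choice', ('val', 1), ('bot',))
print(may_converge(t), must_converge(t))   # True False
```

The term above may-converges (one branch yields a value) but does not must-converge (the other branch diverges), which is precisely why a combination of both observations is needed for non-deterministic calculi.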
Journal of Functional Programming | 2008
Manfred Schmidt-Schauss; David Sabel; Marko Schütz
This paper proves the correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective method of strictness analysis for lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules in two respects: our correctness proof is based on a functional core language and a contextual semantics, thus proving a wider range of strictness-based optimizations correct, and our method fully accounts for the cycle detection rules, which contribute to the strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a functional core language LR. This is a higher-order call-by-need lambda calculus with case, constructors, letrec, and seq, which is extended during strictness analysis by set constants like Top or Inf, denoting sets of expressions that indicate different evaluation demands. It is also possible to define new set constants by recursive equations with a greatest fixpoint semantics. The operational semantics of LR is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Essentially, SAL is a nontermination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of evaluations, i.e., of normal-order reduction sequences to WHNF; the main measure is the number of "essential" reductions in evaluations. Our tools and results provide new insights into call-by-need lambda calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
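The general idea behind strictness analysis can be sketched with the classic bottom-propagation test (far simpler than Nöcker's abstract reduction): a function is strict in an argument if passing an undefined value there makes the result undefined. The toy abstract syntax and the two-point abstract domain below are assumptions of this sketch only:

```python
BOT, TOP = 'Bot', 'Top'   # abstract values: definitely undefined / unknown

def absev(expr, env):
    """Abstractly evaluate a toy expression, propagating Bot through
    strict positions."""
    tag = expr[0]
    if tag == 'var':
        return env[expr[1]]
    if tag == 'lit':
        return TOP
    if tag == 'add':               # primitive, strict in both arguments
        a, b = absev(expr[1], env), absev(expr[2], env)
        return BOT if BOT in (a, b) else TOP
    if tag == 'if':                # strict in the condition; join the branches
        if absev(expr[1], env) == BOT:
            return BOT
        t, e = absev(expr[2], env), absev(expr[3], env)
        return BOT if t == BOT and e == BOT else TOP

def strict_in(body, params, i):
    """f is strict in parameter i if f(..., Bot, ...) abstractly yields Bot."""
    env = {p: (BOT if j == i else TOP) for j, p in enumerate(params)}
    return absev(body, env) == BOT

# f x y = if x then y + 1 else 0   -- strict in x, not in y
body = ('if', ('var', 'x'), ('add', ('var', 'y'), ('lit',)), ('lit',))
print(strict_in(body, ['x', 'y'], 0), strict_in(body, ['x', 'y'], 1))  # True False
```

The paper's set constants (Top, Inf, and recursively defined sets) refine this crude two-point domain into demands on data structures, which is where cycle detection becomes essential.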
ACM Transactions on Computational Logic | 2011
Adrià Gascón; Guillem Godoy; Manfred Schmidt-Schauss
Term unification plays an important role in many areas of computer science, especially those related to logic. The universal mechanism of grammar-based compression for terms, in particular the so-called singleton tree grammars (STGs), has recently drawn considerable attention. Using STGs, terms of exponential size and height can be represented in linear space. Furthermore, the term representation by directed acyclic graphs (dags) can be efficiently simulated. The present article is the result of an investigation of term unification and matching when the input terms are represented using different compression mechanisms such as dags and singleton tree grammars. We describe a polynomial time algorithm for context matching with dags when the number of different context variables is fixed for the problem. For the same problem, NP-completeness is obtained when the terms are represented using the more general formalism of singleton tree grammars. For first-order unification and matching, polynomial time algorithms are presented, each of them improving on previous results for those problems.
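How STGs achieve exponential compression can be sketched as follows; the grammar encoding below is a minimal assumption of this sketch, not the article's data structure:

```python
from functools import lru_cache

# A singleton tree grammar (STG): each nonterminal has exactly one production.
# S_i -> f(S_{i-1}, S_{i-1}), S_0 -> a   represents a term with 2^(N+1)-1
# nodes in a grammar of linear size.  (Illustrative sketch only.)

N = 20
rules = {'S0': ('a',)}
for i in range(1, N + 1):
    rules[f'S{i}'] = ('f', f'S{i-1}', f'S{i-1}')

@lru_cache(maxsize=None)
def size(nt):
    """Number of nodes of the represented term, computed without
    ever expanding the grammar."""
    prod = rules[nt]
    return 1 + sum(size(child) for child in prod[1:])

print(size(f'S{N}'))   # 2097151 == 2**21 - 1
```

The memoised traversal runs in time linear in the grammar, while the represented term has over two million nodes; this is the kind of gap that makes working directly on the compressed representation worthwhile.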
Logic in Computer Science | 2008
Adrià Gascón; Guillem Godoy; Manfred Schmidt-Schauss
This paper is an investigation of the matching problem for term equations s = t where s contains context variables and first-order variables, and both terms s and t are given using some kind of compressed representation. The main result is a polynomial time algorithm for context matching with dags, when the number of different context variables is fixed for the problem. NP-completeness is obtained when the terms are represented using the more general formalism of singleton tree grammars. As an ingredient of this proof, we also show that the special case of first-order matching with singleton tree grammars is decidable in polynomial time.
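A minimal sketch of the special case of first-order matching over dag-represented terms, assuming hash-consed node ids (equal subterms share one id, so variable bindings can be compared in constant time); this is illustrative only, not the paper's algorithm:

```python
# Pattern and subject are nodes of one hash-consed dag:
# node -> ('f', child_ids...) for function symbols, ('var', name) for
# first-order pattern variables.

def match(pat, subj, nodes, subst=None):
    """Return a substitution (variable -> node id) or None on failure."""
    if subst is None:
        subst = {}
    p, t = nodes[pat], nodes[subj]
    if p[0] == 'var':
        bound = subst.setdefault(p[1], subj)
        # dag is hash-consed, so equality of bindings is id comparison
        return subst if bound == subj else None
    if p[0] != t[0] or len(p) != len(t):
        return None
    for pc, tc in zip(p[1:], t[1:]):
        if match(pc, tc, nodes, subst) is None:
            return None
    return subst

# nodes: 0 = a, 1 = f(a, a), 2 = f(x, x), 3 = var x
nodes = {0: ('a',), 1: ('f', 0, 0), 2: ('f', 3, 3), 3: ('var', 'x')}
print(match(2, 1, nodes))   # {'x': 0}
```

To stay polynomial on dags, a real implementation would additionally memoise visited (pattern node, subject node) pairs so that shared subterms are not re-explored; the sketch omits this for brevity.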
Logic in Computer Science | 2012
David Sabel; Manfred Schmidt-Schauss
The calculus CHF models Concurrent Haskell extended by concurrent, implicit futures. It is a lambda and process calculus with concurrent threads and monadic concurrent evaluation, and it includes a pure functional lambda calculus PF comprising data constructors, case-expressions, letrec-expressions, and Haskell's seq. Our main result is the conservativity of CHF as an extension of PF. This allows us to argue that compiler optimizations and transformations from pure Haskell remain valid in Concurrent Haskell even if it is extended by futures. We also show that conservativity no longer holds if the extension includes Concurrent Haskell and unsafeInterleaveIO.
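The flavour of futures can be conveyed with Python's explicit futures as a loose analogy only: CHF's futures are implicit, forced automatically at their use sites, whereas the Python future below must be forced by hand. The sketch shows nothing more than the dataflow idea:

```python
from concurrent.futures import ThreadPoolExecutor

# A thread's eventual result is a first-class value that other
# computations may depend on.  In CHF the call to .result() would be
# implicit; here it is an explicit synchronisation point.

with ThreadPoolExecutor() as pool:
    fut = pool.submit(lambda: sum(range(10)))   # spawn a concurrent computation
    doubled = fut.result() * 2                  # explicit force

print(doubled)   # 90
```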
Rewriting Techniques and Applications | 2010
Manfred Schmidt-Schauss; David Sabel; Elena Machkasova
This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in the deterministic call-by-need lambda calculus with letrec. Bisimilarity simplifies equivalence proofs in the calculus and opens the way to more convenient correctness proofs for program transformations. Although this property may be a natural one to expect, to the best of our knowledge this paper is the first to provide a proof. The proof technique is to transfer contextual approximation into Abramsky's lazy lambda calculus by a fully abstract and surjective translation. This also shows that the natural embedding of Abramsky's lazy lambda calculus into the call-by-need lambda calculus with letrec is an isomorphism between the respective term models. We show that the equivalence property proven in this paper transfers to a call-by-need letrec calculus developed by Ariola and Felleisen.
Journal of Functional Programming | 1997
Nigel W. O. Hutchison; Ute Neuhaus; Manfred Schmidt-Schauss; Cordy V. Hall
NATURAL EXPERT is a product that allows users to build knowledge-based systems. It uses a lazy functional language, NATURAL EXPERT LANGUAGE, to implement backward chaining and provide a reliable knowledge processing environment in which development can take place. Customers from all over the world buy the system and have used it to handle a variety of problems, including applications such as airplane servicing and bank loan assessment. Some of these are used 10,000 times or more per month.
Frontiers of Combining Systems | 2011
Manfred Schmidt-Schauss; David Sabel; Altug Anis
The word problem for a finite set of equational axioms between ground terms is the question whether, for terms s and t, the equation s = t is a consequence. We consider this problem under grammar-based compression of terms, in particular compression with singleton tree grammars (STGs) and with directed acyclic graphs (DAGs) as a special case. We show that, given a DAG-compressed ground and reduced term rewriting system T, the T-normal form of an STG-compressed term s can be computed in polynomial time, and hence the T-word problem can be solved in polynomial time. This implies that the word problem for STG-compressed terms w.r.t. a set of DAG-compressed ground equations can be decided in polynomial time. If the ground term rewriting system (gTRS) T is STG-compressed, we show NP-hardness of T-normal-form computation. For compressed, reduced gTRSs we show a PSPACE upper bound on the complexity of normal form computation for STG-compressed terms. Special cases are also considered, and a prototypical implementation is presented.
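Normal-form computation w.r.t. a ground, reduced TRS can be sketched with memoised innermost rewriting, the dag analogue of sharing. The rules and tuple encoding below are illustrative assumptions; the STG-compressed inputs handled in the paper are substantially harder:

```python
from functools import lru_cache

# A reduced ground TRS: right-hand sides and proper subterms of
# left-hand sides are already in normal form.
rules = {
    ('f', ('a',)): ('b',),          # f(a) -> b
    ('g', ('b',), ('b',)): ('a',),  # g(b, b) -> a
}

@lru_cache(maxsize=None)
def normalize(term):
    """Innermost normalisation with memoisation: shared subterms are
    rewritten once, as on a dag."""
    t = (term[0],) + tuple(normalize(arg) for arg in term[1:])
    return normalize(rules[t]) if t in rules else t

t = ('g', ('f', ('a',)), ('f', ('a',)))
print(normalize(t))   # ('a',)
```

Because equal subterms hit the memo table, the number of distinct rewriting steps is bounded by the number of distinct subterms, which is what makes the DAG-compressed case tractable.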
Theoretical Informatics and Applications | 2007
Manfred Schmidt-Schauss; David Sabel; Marko Schütz
Various static analyses of functional programming languages that permit infinite data structures make use of set constants like Top, Inf, and Bot, denoting all terms, all lists not eventually ending in Nil, and all non-terminating programs, respectively. We use a set language that permits union, constructors and recursive definition of set constants with a greatest fixpoint semantics in the set of all, also infinite, computable trees, where all term constructors are non-strict. This paper proves decidability, in particular DEXPTIME-completeness, of inclusion of co-inductively defined sets by using algorithms and results from tree automata and set constraints. The test for set inclusion is required by certain strictness analysis algorithms in lazy functional programming languages and could also be the basis for further set-based analyses.
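The greatest-fixpoint reading of such recursive set definitions can be sketched with a coinductive membership check on cyclic term graphs (finite representations of infinite trees); the encoding is our own simplification, not the paper's automata-based construction:

```python
# Set constants with a greatest-fixpoint semantics: a membership goal that
# recurs on a cycle is *accepted* (coinductive hypothesis), which is exactly
# what distinguishes the greatest from the least fixpoint.

defs = {
    'Top': None,                         # all trees
    'Inf': [('Cons', 'Top', 'Inf')],     # lists never ending in Nil
}

def member(node, const, graph, assumed=frozenset()):
    if defs[const] is None:
        return True
    if (node, const) in assumed:
        return True                       # goal recurs on a cycle: accept
    assumed = assumed | {(node, const)}
    term = graph[node]
    for alt in defs[const]:
        if term[0] == alt[0] and len(term) == len(alt) and all(
                member(c, s, graph, assumed) for c, s in zip(term[1:], alt[1:])):
            return True
    return False

# ones = Cons(1, ones): an infinite (rational) list; fin = Cons(1, Nil)
graph = {'ones': ('Cons', 'one', 'ones'), 'one': ('1',),
         'fin': ('Cons', 'one', 'nil'), 'nil': ('Nil',)}
print(member('ones', 'Inf', graph), member('fin', 'Inf', graph))  # True False
```

The infinite list is in Inf only because the recursive goal is accepted coinductively; under a least-fixpoint reading Inf would be empty. Deciding *inclusion* between such sets, the paper's actual problem, requires tree-automata machinery and is DEXPTIME-complete.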
Rewriting Techniques and Applications | 2008
Manfred Schmidt-Schauss; Elena Machkasova