Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Matthias Neubauer is active.

Publication


Featured research published by Matthias Neubauer.


Practical Aspects of Declarative Languages | 2004

An Implementation of Session Types

Matthias Neubauer; Peter Thiemann

A session type is an abstraction of a set of sequences of heterogeneous values sent and received over a communication channel. Session types can be used for specifying stream-based Internet protocols.
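
To make the abstraction concrete, here is a minimal Haskell sketch, my own illustration rather than the paper's encoding: a value of type Session a spells out the sequence of sends and receives a protocol performs. The fixed String/Int payloads and the run function are simplifying assumptions; in the paper, the payload type of each step is tracked in the session type itself.

```haskell
-- Hypothetical sketch: a Session value is a typed script of the sends and
-- receives a protocol performs, finally yielding a result of type a.
import Control.Concurrent.Chan (Chan, readChan, writeChan)

data Session a
  = Done a                        -- conversation over, return a result
  | Send String (Session a)       -- send a String, then continue
  | Recv (Int -> Session a)       -- receive an Int, then continue

-- Run a session over a pair of channels; the Session value fixes the order
-- and direction of every communication before any I/O happens.
run :: Chan String -> Chan Int -> Session a -> IO a
run _   _ (Done x)   = return x
run out i (Send s k) = writeChan out s >> run out i k
run out i (Recv k)   = readChan i >>= run out i . k

-- Example protocol: send one request, receive one numeric reply, stop.
request :: Session Int
request = Send "GET time" (Recv Done)
```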


Symposium on Principles of Programming Languages | 2005

From sequential programs to multi-tier applications by program transformation

Matthias Neubauer; Peter Thiemann

Modern applications are designed in multiple tiers to separate concerns. Since each tier may run at a separate location, middleware is required to mediate access between tiers. However, introducing this middleware is tiresome and error-prone. We propose a multi-tier calculus and a splitting transformation to address this problem. The multi-tier calculus serves as a sequential core programming language for constructing a multi-tier application. The application can be developed in the sequential setting. Splitting extracts one process per tier from the sequential program such that their concurrent execution behaves like the original program. The splitting transformation starts from an assignment of primitive operations to tiers. A program analysis determines communication requirements and inserts remote procedure calls. The next transformation step performs resource pooling: it optimizes the communication behavior by transforming sequences of remote procedure calls to a stream-based protocol. The final transformation step splits the resulting program into separate communicating processes. The multi-tier calculus is also applicable to the construction of interactive Web applications. It facilitates their development by providing a uniform programming framework for client-side and server-side programming.
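
A toy Haskell illustration of the splitting idea, not the paper's calculus or transformation: a two-step sequential program whose operations belong to different tiers is split by hand into a server process and a client process, with channel operations standing in for the inserted remote procedure calls. All names here (query, render, serverProc, clientProc) are hypothetical.

```haskell
-- Hypothetical sketch of splitting a sequential two-tier program into
-- one process per tier; the channels play the role of generated RPCs.
import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)

-- 'query' conceptually runs on the server tier, 'render' on the client tier.
query :: Int -> Int
query n = n * 2

render :: Int -> String
render m = "result: " ++ show m

-- Sequential version: plain function composition.
sequentialApp :: Int -> String
sequentialApp = render . query

-- Split version: the server process owns 'query', the client owns 'render'.
serverProc :: Chan Int -> Chan Int -> IO ()
serverProc requests replies = do
  n <- readChan requests
  writeChan replies (query n)

clientProc :: Chan Int -> Chan Int -> Int -> IO String
clientProc requests replies n = do
  writeChan requests n
  m <- readChan replies
  return (render m)

main :: IO ()
main = do
  requests <- newChan
  replies  <- newChan
  _ <- forkIO (serverProc requests replies)
  split <- clientProc requests replies 21
  putStrLn (sequentialApp 21)  -- "result: 42" from the sequential program
  putStrLn split               -- the same output from the split processes
```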


International Conference on Functional Programming | 2003

Discriminative sum types locate the source of type errors

Matthias Neubauer; Peter Thiemann

We propose a type system for locating the source of type errors in an applied lambda calculus with ML-style polymorphism. The system is based on discriminative sum types---known from work on soft typing---with annotation subtyping and recursive types. This way, type clashes can be registered in the type for later reporting. The annotations track the potential producers and consumers for each value so that clashes can be traced to their cause. Every term is typeable in our system and type inference is decidable. A type derivation in our system describes all type errors present in the program, so that a principal derivation yields a principal description of all type errors. Error messages are derived from completed type derivations. Thus, error messages are independent of the particular algorithm used for type inference, provided it constructs such a derivation.
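
A small sketch of the underlying idea, under strong simplifications of my own (only int and bool, no type variables or subtyping): unification never fails, it merges producer annotations, and when the two sides disagree the clash itself is recorded in the resulting type together with the program points that produced each side.

```haskell
-- Toy sketch, not the paper's system: types carry producer labels, and a
-- disagreement becomes a located clash instead of a unification failure.
type Label = String          -- a program point that produced a value

data Ty
  = TInt  [Label]            -- int, with the labels of its producers
  | TBool [Label]            -- bool, with the labels of its producers
  | TClash [Label] [Label]   -- int and bool producers meet here: a located error
  deriving Show

unify :: Ty -> Ty -> Ty
unify a b = mk (ints a ++ ints b) (bools a ++ bools b)
  where
    ints (TInt ls)     = ls
    ints (TClash is _) = is
    ints _             = []
    bools (TBool ls)     = ls
    bools (TClash _ bs) = bs
    bools _             = []
    mk is [] = TInt is
    mk [] bs = TBool bs
    mk is bs = TClash is bs

-- unify (TInt ["line 3"]) (TBool ["line 7"])
--   ==> TClash ["line 3"] ["line 7"]   -- the report can name both sources
```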


Symposium on Principles of Programming Languages | 2002

Functional logic overloading

Matthias Neubauer; Peter Thiemann; Martin Gasbichler; Michael Sperber

Functional logic overloading is a novel approach to user-defined overloading that extends Haskell's concept of type classes in significant ways. Whereas type classes are conceptually predicates on types in standard Haskell, they are type functions in our approach. Thus, we can base type inference on the evaluation of functional logic programs. Functional logic programming provides a solid theoretical foundation for type functions and, at the same time, allows for programmable overloading resolution strategies by choosing different evaluation strategies for functional logic programs. Type inference with type functions is an instance of type inference with constrained types, where the underlying constraint system is defined by a functional logic program. We have designed a variant of Haskell that supports our approach to overloading, and implemented a prototype front-end for the language.
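
For flavor only: GHC's type families, a later and different mechanism, also let a class-like construct compute on types rather than merely constrain them. The sketch below is not the paper's system; Add and SumTy are hypothetical names chosen to show a class whose result type is a function of its argument types.

```haskell
{-# LANGUAGE TypeFamilies, MultiParamTypeClasses, FlexibleInstances #-}
-- Rough analogue only: the class carries a type-level function (SumTy) that
-- *computes* the result type of 'add' from the argument types.
class Add a b where
  type SumTy a b
  add :: a -> b -> SumTy a b

instance Add Int Double where
  type SumTy Int Double = Double
  add x y = fromIntegral x + y

instance Add Double Int where
  type SumTy Double Int = Double
  add x y = x + fromIntegral y

-- add (3 :: Int) (0.5 :: Double) :: Double   -- the result type was computed
```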


International Conference on Functional Programming | 2002

Type classes with more higher-order polymorphism

Matthias Neubauer; Peter Thiemann

We propose an extension of Haskell's type class system with lambda abstractions in the type language. Type inference for our extension relies on a novel constrained unification procedure called guided higher-order unification. This unification procedure is more general than Haskell's kind-preserving unification but less powerful than full higher-order unification. The main technical result is the soundness and completeness of the unification rules for the fragment of lambda calculus that we admit on the type level.
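
A hedged illustration of what the extension buys: the commented-out instance uses a type-level lambda and is rejected by standard Haskell, which instead forces a newtype (FlipEither, a name chosen here) whose only purpose is to reorder type arguments so that first-order, kind-preserving unification can cope.

```haskell
-- The extension would allow an instance at a type-level lambda, e.g.
-- (hypothetical syntax, rejected by standard Haskell):
--
--   instance Functor (\a -> Either a Int) where
--     fmap f (Left x)  = Left (f x)
--     fmap f (Right n) = Right n
--
-- Standard Haskell instead requires a newtype that flips the arguments:
newtype FlipEither b a = FlipEither (Either a b)

instance Functor (FlipEither b) where
  fmap f (FlipEither (Left x))  = FlipEither (Left (f x))
  fmap _ (FlipEither (Right y)) = FlipEither (Right y)
```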


Symposium/Workshop on Haskell | 2004

Haskell type browser

Matthias Neubauer; Peter Thiemann

Despite 25 years of experience with ML-style typing and numerous implementations of type inference algorithms for this kind of type system, programmers are still struggling with error messages reported when the inference algorithm fails. Virtually every Haskell programmer can tell stories about type errors where it took hours to identify the actual problem with the program. While initiated functional programmers may accept this as a fact of life, it makes life especially hard for beginners, in fact, too hard for some.


International Conference on Functional Programming | 2001

Down with Emacs Lisp: dynamic scope analysis

Matthias Neubauer; Michael Sperber

It is possible to translate code written in Emacs Lisp or another Lisp dialect that uses dynamic scoping to a more modern programming language with lexical scoping while largely preserving the structure and readability of the code. The biggest obstacle to such an idiomatic translation from Emacs Lisp is the translation of dynamic binding into suitable instances of lexical binding: many binding constructs in real programs in fact exhibit identical behavior under both dynamic and lexical binding. An idiomatic translation needs to detect as many of these binding constructs as possible and convert them into lexical binding constructs in the target language to achieve readability and efficiency of the target code. The basic prerequisite for such an idiomatic translation is thus a dynamic scope analysis which associates variable occurrences with binding constructs. We present such an analysis. It is an application of the Nielson/Nielson framework for flow analysis to a semantics for dynamic binding akin to Moreau's. Its implementation handles a substantial portion of Emacs Lisp, has been applied to realistic Emacs Lisp code, and is highly accurate and reasonably efficient in practice.
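
A toy Haskell interpreter fragment, not the paper's analysis, showing why the translation needs care: the same function body gives different results depending on whether its free variable is resolved in the environment of its definition (lexical) or in the caller's environment (dynamic). The analysis identifies the binding constructs for which the two coincide, so they can be translated to lexical binding.

```haskell
-- Toy contrast of lexical vs dynamic variable lookup; all names are made up.
import qualified Data.Map as M

type Env = M.Map String Int

data Expr = Var String | Lit Int | Add Expr Expr

evalIn :: Env -> Expr -> Int
evalIn env (Var v)   = M.findWithDefault 0 v env
evalIn _   (Lit n)   = n
evalIn env (Add a b) = evalIn env a + evalIn env b

-- A function body mentioning the free variable "x".
body :: Expr
body = Add (Var "x") (Lit 1)

-- Lexical: the function closes over the environment at its definition.
lexicalF :: Env -> Int
lexicalF = let defEnv = M.fromList [("x", 10)] in \_callerEnv -> evalIn defEnv body

-- Dynamic: "x" is resolved in whatever environment is current at the call,
-- as a dynamically scoped 'let' in Emacs Lisp would arrange.
dynamicF :: Env -> Int
dynamicF callerEnv = evalIn callerEnv body

main :: IO ()
main = do
  let callerEnv = M.fromList [("x", 100)]
  print (lexicalF callerEnv)  -- 11: uses the definition-time binding of x
  print (dynamicF callerEnv)  -- 101: uses the caller's binding of x
```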


International Colloquium on Automata, Languages and Programming | 2008

Placement Inference for a Client-Server Calculus

Matthias Neubauer; Peter Thiemann

Placement inference assigns locations to operations in a distributed program under the constraints that some operations can only execute on particular locations and that values may not be transferred arbitrarily between locations. An optimal choice of locations additionally minimizes the run time of the program, given that operations take different amounts of time at different locations and that a cost is associated with transferring a value from one location to another. We define a language with a time- and location-aware semantics, formalize placement inference in terms of constraints, and show that solving these constraints is an NP-complete problem. We then show that optimal placements are computable via a reformulation of the semantics in terms of matrices and an application of max-plus spectral theory. A prototype implementation validates our results.
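
To give a flavor of the algebra involved, here is a minimal max-plus (tropical) toolkit in Haskell of my own; it only shows how costs compose under the max-plus matrix product and does not reproduce the paper's placement algorithm or its spectral-theoretic argument.

```haskell
-- Minimal max-plus toolkit: "addition" is max, "multiplication" is ordinary +,
-- and negative infinity is the additive identity. Iterating the matrix product
-- composes costs over longer and longer executions.
import Data.List (transpose)

type Weight = Double

negInf :: Weight
negInf = -1 / 0

oplus, otimes :: Weight -> Weight -> Weight
oplus  = max   -- max-plus addition
otimes = (+)   -- max-plus multiplication

type Matrix = [[Weight]]

-- Max-plus matrix product: entry (i,j) combines, over all intermediate k,
-- the cost of i->k with the cost of k->j.
mmul :: Matrix -> Matrix -> Matrix
mmul a b =
  [ [ foldr oplus negInf (zipWith otimes row col) | col <- transpose b ]
  | row <- a ]
```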


Electronic Notes in Theoretical Computer Science | 2004

Parameterized LR Parsing

Peter Thiemann; Matthias Neubauer

Common LR parser generators lack abstraction facilities for defining recurring patterns of productions. Although there are generators capable of supporting regular expressions on the right-hand side of productions, no generator supports user-defined patterns in grammars. Parameterized LR parsing extends standard LR parsing technology by admitting grammars with parameterized non-terminal symbols. A generator can implement such a grammar in two ways, either by expansion or directly. We develop the theory required for the direct implementation and show that it leads to significantly smaller parse tables and fewer parsing conflicts than the expanded grammar. Attribute evaluation for a parameterized non-terminal is possible in the same way as before, provided the semantic functions related to the non-terminal are polymorphic with respect to the parameter. We have implemented parameterized LR parsing in the context of Essence, a partial-evaluation-based LR parser generator for Scheme.
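
A toy Haskell encoding, my own rather than the paper's, of a parameterized production such as list(X) and its expansion into ordinary context-free rules by instantiation. The paper's contribution is the direct table construction that avoids exactly this expansion; the sketch only makes the expansion alternative concrete.

```haskell
-- Hypothetical representation of parameterized productions and their expansion.
import Data.List (intercalate)

-- A grammar symbol: a terminal, or a non-terminal applied to argument symbols.
data Sym = T String | N String [Sym]
  deriving Show

-- A parameterized production: non-terminal name, formal parameters, right-hand side.
type Rule = (String, [String], [Sym])

-- list(X) ::=  <empty>  |  X list(X)
listRules :: [Rule]
listRules =
  [ ("list", ["X"], [])
  , ("list", ["X"], [N "X" [], N "list" [N "X" []]])
  ]

-- Expand one rule at concrete arguments into an ordinary context-free
-- production, substituting the arguments and mangling instance names.
instantiate :: Rule -> [Sym] -> (String, [Sym])
instantiate (name, formals, rhs) actuals = (mangle name actuals, map subst rhs)
  where
    env = zip formals actuals
    subst s@(T _) = s
    subst (N n [])
      | Just actual <- lookup n env = actual
    subst (N n args) = N (mangle n (map subst args)) []
    mangle n []   = n
    mangle n args = n ++ "(" ++ intercalate "," (map render args) ++ ")"
    render (T t)    = t
    render (N n as) = mangle n as

-- instantiate (listRules !! 1) [T "id"]
--   ==> ("list(id)", [T "id", N "list(id)" []])
```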


Principles and Practice of Declarative Programming | 2008

Macros for context-free grammars

Peter Thiemann; Matthias Neubauer

Current parser generators are based on context-free grammars. Because such grammars lack abstraction facilities, the resulting specifications are often not easy to read. Fischer's macro grammars extend context-free grammars with macro-like productions, thus providing the equivalent of procedural abstraction. However, their use is hampered by the lack of an efficient, off-the-shelf parsing technology for macro grammars. We define specialization for macro grammars to enable reuse of parsing technology for context-free grammars while facilitating the specification of a language with a macro grammar. This specialization yields context-free rules, but it does not always terminate. We present a sound and complete static analysis that applies to any macro grammar and decides whether specialization terminates for it and thus yields a (finite) context-free grammar. The analysis is based on an intuitive notion of self-embedding non-terminals, which is easy to check by hand. We have implemented the analysis as part of a preprocessing tool that transforms a Yacc grammar extended with macro productions into a standard Yacc grammar.
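
A heavily simplified sketch of the termination idea, under my own reading of self-embedding: a macro non-terminal is flagged when one of its own productions calls it again with an argument in which a formal parameter sits inside a larger phrase, so specialization would keep manufacturing new, bigger instances. The paper's analysis is the precise, sound-and-complete version and also handles indirect recursion, which this toy ignores.

```haskell
-- Toy check for *direct* self-embedding; names and representation are my own.
data Sym = T String | V String | N String [[Sym]]
  -- terminal, formal-parameter occurrence, or call N(arg1, ..., argk)

type Def = (String, [String], [[Sym]])
  -- non-terminal, its formal parameters, and its productions

mentionsFormal :: [String] -> Sym -> Bool
mentionsFormal fs (V v)    = v `elem` fs
mentionsFormal _  (T _)    = False
mentionsFormal fs (N _ as) = any (any (mentionsFormal fs)) as

directlySelfEmbedding :: Def -> Bool
directlySelfEmbedding (name, formals, rhss) =
  or [ growing arg
     | rhs <- rhss, N n args <- universe rhs, n == name, arg <- args ]
  where
    -- an argument "grows" if a formal occurs inside it but the argument is
    -- more than just that bare formal; passing a parameter through is harmless
    growing [V _] = False
    growing arg   = any (mentionsFormal formals) arg
    -- all symbols of a right-hand side, including those nested in arguments
    universe = concatMap go
      where
        go s@(T _)    = [s]
        go s@(V _)    = [s]
        go s@(N _ as) = s : concatMap universe as

-- A(X) ::= X | A(a X)     -- specializing A(b) needs A(a b), A(a a b), ...
badDef :: Def
badDef = ("A", ["X"], [ [V "X"], [N "A" [[T "a", V "X"]]] ])

-- L(X) ::= <empty> | X L(X)  -- recursion passes X through unchanged: fine
goodDef :: Def
goodDef = ("L", ["X"], [ [], [V "X", N "L" [[V "X"]]] ])

-- directlySelfEmbedding badDef  == True
-- directlySelfEmbedding goodDef == False
```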

Collaboration


Dive into Matthias Neubauer's collaborations.
