Holger Pfeifer
University of Ulm
Publications
Featured research published by Holger Pfeifer.
international conference on conceptual modeling | 2010
David Knuplesch; Linh Thao Ly; Stefanie Rinderle-Ma; Holger Pfeifer; Peter Dadam
In light of the increasing demand for business process compliance, the verification of process models against compliance rules has become essential in enterprise computing. To be broadly applicable, compliance checking has to support data-aware compliance rules and has to consider data conditions within a process model. Independently of the actual technique applied to accomplish compliance checking, data-awareness means that, in addition to the control flow dimension, the data dimension has to be explored during compliance checking. However, naive exploration of the data dimension can lead to state explosion. We address this issue by introducing an abstraction approach in this paper. We show how state explosion can be avoided by conducting compliance checking for an abstract process model and abstract compliance rules. Our abstraction approach can serve as a preprocessing step to the actual compliance checking and provides the basis for more efficient application of existing compliance checking algorithms.
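A minimal sketch of the underlying idea, in plain Python rather than the paper's formalism: concrete data values are grouped into the equivalence classes induced by the data conditions that occur in the rules and the process model, so a checker only needs to explore one representative per class. The example domain, the conditions and all names below are illustrative, not taken from the paper.

# Minimal sketch (not the paper's algorithm): abstract a concrete data
# domain into the equivalence classes induced by the data conditions that
# occur in the process model and in the compliance rules.

def abstract_domain(values, conditions):
    """Group concrete values into classes that satisfy the same conditions."""
    classes = {}
    for v in values:
        signature = tuple(cond(v) for cond in conditions)
        classes.setdefault(signature, []).append(v)
    return classes

if __name__ == "__main__":
    # Concrete domain: claim amounts from 0 to 100,000.
    amounts = range(0, 100001)
    # Conditions referenced by the process model / compliance rules.
    conditions = [lambda a: a > 10000,      # "high-value claim"
                  lambda a: a == 0]         # "empty claim"
    classes = abstract_domain(amounts, conditions)
    # Instead of 100,001 states per data object, the checker only has to
    # explore one representative per class (here: 3 classes).
    for sig, members in classes.items():
        print(sig, "->", len(members), "concrete values, e.g.", members[0])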
conference on advanced information systems engineering | 2010
Linh Thao Ly; David Knuplesch; Stefanie Rinderle-Ma; Kevin Göser; Holger Pfeifer; Manfred Reichert; Peter Dadam
In light of the increasing demand for business process compliance, the verification of process models against compliance rules has become essential in enterprise computing. The SeaFlows Toolset featured in this paper extends process-aware information systems with compliance checking functionality. It provides a user-friendly environment for modeling compliance rules using a graph-based formalism and for enriching process models with these rules. To address a multitude of verification settings, we provide two complementary compliance checking approaches: the structural compliance checking approach derives structural criteria from compliance rules and applies them to detect non-compliance, while the data-aware behavioral compliance checking approach addresses the state explosion problem that can occur when the data dimension is explored during compliance checking. The latter performs context-sensitive automatic abstraction to derive an abstract process model that is more compact with regard to the data dimension, enabling more efficient compliance checking. Altogether, the SeaFlows Toolset constitutes a comprehensive and extensible framework for compliance checking of process models.
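By way of illustration only (this is not the SeaFlows implementation), a structural criterion for a precedence rule such as "every execution of 'approve' must be preceded by 'check'" can be checked directly on the process graph: no path from the start node may reach the target activity without passing the required predecessor. The graph encoding below is hypothetical.

# Illustrative sketch: a structural check for a precedence rule on a
# process graph given as an adjacency dictionary.

def violates_precedence(graph, start, guard, target):
    """Return True if some path start -> target avoids the guard activity."""
    stack, seen = [start], set()
    while stack:
        node = stack.pop()
        if node == guard or node in seen:
            continue                      # paths through the guard are fine
        if node == target:
            return True                   # reached target without the guard
        seen.add(node)
        stack.extend(graph.get(node, []))
    return False

if __name__ == "__main__":
    process = {                           # simple claim-handling process
        "start": ["receive"],
        "receive": ["check", "approve"],  # one branch skips the check
        "check": ["approve"],
        "approve": ["end"],
    }
    print(violates_precedence(process, "start", "check", "approve"))  # True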
New Journal of Physics | 2012
Thomas Häberle; Felix Haering; Holger Pfeifer; Luyang Han; Balati Kuerbanjiang; Ulf Wiedwald; U. Herr; B. Koslowski
We introduce a simple and effective model of a commercial magnetic thin-film sensor for magnetic force microscopy (MFM), and we test the model employing buried magnetic dipoles. The model can be solved analytically in the half-space in front of the sensor tip, leading to a simple 1/R dependence of the magnetic stray field projected to the symmetry axis. The model resolves the earlier issue as to why the magnetic sensors cannot be described reasonably by a restricted multipole expansion as in the point pole approximation: the point pole model must be extended to incorporate a 'lower-order' pole, which we term 'pseudo-pole'. The near-field dependence (∝ R^-1) turns into the well-known and frequently used dipole behavior (∝ R^-3) if the separation, R, exceeds the height of the sensor. Using magnetic nanoparticles (average diameter 18 nm) embedded in a SiO cover as dipolar point probes, we show that the force gradient-distance curves and magnetic images fit almost perfectly to the proposed model. The easy axis of magnetization of single nanoparticles is successfully deduced from these magnetic images. Our model paves the way for quantitative MFM, at least if the sensor and the sample are independent.
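Schematically, the crossover described above can be written as follows (standard notation, not necessarily the paper's symbols):

\[
  H_z(R) \;\propto\; \frac{1}{R} \quad (R \ll h),
  \qquad
  H_z(R) \;\propto\; \frac{1}{R^{3}} \quad (R \gg h),
\]

where \(R\) is the tip-sample separation and \(h\) is the height of the thin-film sensor; the near-field regime reflects the 'pseudo-pole', while the far field recovers ordinary dipole behavior.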
theorem proving in higher order logics | 1999
Holger Pfeifer; Harald Rueß
This paper deals with formalizations and verifications in type theory that are abstracted with respect to a class of datatypes, i.e., polytypic constructions. The main advantage of these developments is that they can not only be used to define functions in a generic way but also to formally state polytypic theorems and to synthesize polytypic proof objects. This opens the door to mechanically proving many useful facts about large classes of datatypes once and for all.
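A loose Python analogue of the "once and for all" flavour of polytypic definitions (the paper itself works in type theory, not in a programming language): a single recursion scheme defined once over a uniform term representation yields size and depth for any datatype encoded that way.

# Loose analogue only: one generic fold, many datatypes.

def fold(term, combine):
    """term = (constructor_name, [subterms]); combine folds one layer."""
    name, children = term
    return combine(name, [fold(c, combine) for c in children])

def size(t):
    return fold(t, lambda _name, kids: 1 + sum(kids))

def depth(t):
    return fold(t, lambda _name, kids: 1 + max(kids, default=0))

if __name__ == "__main__":
    # The same generic functions work for a list-like and a tree-like datatype.
    xs   = ("cons", [("one", []), ("cons", [("two", []), ("nil", [])])])
    tree = ("node", [("leaf", []), ("node", [("leaf", []), ("leaf", [])])])
    print(size(xs), depth(xs))      # 5 3
    print(size(tree), depth(tree))  # 5 3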
theorem proving in higher order logics | 1998
Friedrich W. von Henke; Stephan Pfab; Holger Pfeifer; Harald Rueß
We describe an extension of the PVS system that provides a reasonably efficient and practical notion of reflection and thus allows for soundly adding formalized and verified new proof procedures. These proof procedures work on representations of a part of the underlying logic and their correctness is expressed at the object level using a computational reflection function. The implementation of the PVS system has been extended with an efficient evaluation mechanism, since the practicality of the approach heavily depends on careful engineering of the core system, including efficient normalization of functional expressions. We exemplify the process of applying meta-level proof procedures with a detailed description of the encoding of cancellation in commutative monoids and of the kernel of a BDD package.
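A plain-Python analogue of the cancellation example (the PVS development proves the procedure correct once and for all; the snippet below merely spot-checks it on concrete values, and all names are illustrative): terms of a commutative monoid are represented syntactically, a meaning function interprets them, and the proof procedure works on the representations.

# Terms are multisets of variable names; `meaning` interprets them in a
# concrete commutative monoid; `cancel` removes occurrences common to both
# sides of an equation. If the reduced sides have equal meaning, so do the
# originals (add the cancelled part back on both sides).

from collections import Counter
from functools import reduce
import operator

def meaning(term, env, op=operator.add, unit=0):
    """Interpret a multiset of variables in a concrete commutative monoid."""
    return reduce(op, (env[v] for v in term.elements()), unit)

def cancel(lhs, rhs):
    """Remove occurrences common to both sides of the equation."""
    common = lhs & rhs
    return lhs - common, rhs - common

if __name__ == "__main__":
    lhs = Counter({"x": 2, "y": 1})          #  x + x + y
    rhs = Counter({"x": 1, "y": 1, "z": 1})  #  x + y + z
    l2, r2 = cancel(lhs, rhs)                #  x  vs.  z
    env = {"x": 3, "y": 7, "z": 3}
    print(meaning(lhs, env) == meaning(rhs, env),
          meaning(l2, env) == meaning(r2, env))   # True True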
international conference on computer safety reliability and security | 2008
Cinzia Bernardeschi; Paolo Masci; Holger Pfeifer
We describe an approach that uses the evaluation mechanism of the specification and verification system PVS to support formal design exploration of WSN algorithms at the early stages of their development. The specification of the algorithm is expressed with an extensible set of programming primitives, and properties of interest are evaluated with ad hoc network simulators automatically generated from the formal specification. In particular, we build on the PVSio package as the core base for the network simulator. Depending on the requirements, properties of interest can be simulated at different levels of abstraction. We illustrate our approach by specifying and simulating a standard routing algorithm for wireless sensor networks.
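To give a flavour of this kind of design exploration, here is a plain-Python, round-based simulation of simple message flooding; the paper instead generates its simulators from PVS specifications via the PVSio package, and the topology below is invented.

# Round-based flooding: each round, nodes that have received the message
# forward it to their neighbours; already-covered nodes are skipped.

def simulate_flooding(links, source, max_rounds=10):
    delivered, frontier = {source}, {source}
    for rnd in range(1, max_rounds + 1):
        frontier = {n for f in frontier for n in links.get(f, [])} - delivered
        if not frontier:
            break
        delivered |= frontier
        print(f"round {rnd}: delivered to {sorted(frontier)}")
    return delivered

if __name__ == "__main__":
    links = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": ["e"]}
    simulate_flooding(links, "a")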
international symposium on stabilization safety and security of distributed systems | 2009
Cinzia Bernardeschi; Paolo Masci; Holger Pfeifer
We describe an approach to the analysis of protocols for wireless sensor networks in scenarios with mobile nodes and dynamic link quality. The approach is based on the theorem proving system PVS and can be used for formal specification, automated simulation and verification of the behaviour of the protocol. In order to demonstrate the applicability of the approach, we analyse the reverse path forwarding algorithm, which is the basic technique used for diffusion protocols for wireless sensor networks.
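For orientation, the classic reverse path forwarding rule can be stated executably as follows (a textbook rendering, not the PVS model from the paper): a node forwards a broadcast packet only if it arrived over the link that lies on the node's own shortest path back to the source. Symmetric links and the small topology are assumptions of this sketch.

# Compute each node's next hop towards the source with a BFS, then apply
# the forwarding rule: forward only packets arriving via that next hop.

from collections import deque

def parents_towards(source, links):
    """BFS from the source: parent[n] is n's next hop back towards the source."""
    parent, queue = {source: None}, deque([source])
    while queue:
        u = queue.popleft()
        for v in links.get(u, []):
            if v not in parent:
                parent[v] = u
                queue.append(v)
    return parent

def should_forward(node, arrived_from, parent):
    return parent.get(node) == arrived_from

if __name__ == "__main__":
    links = {"s": ["a", "b"], "a": ["s", "b", "c"],
             "b": ["s", "a", "c"], "c": ["a", "b"]}
    parent = parents_towards("s", links)
    print(should_forward("c", "a", parent))  # True:  'a' is c's hop towards s
    print(should_forward("c", "b", parent))  # False: duplicate, dropped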
emerging technologies and factory automation | 2001
Holger Pfeifer; Friedrich W. von Henke
This paper describes the mechanized formal verification we have performed on some of the crucial algorithms used in the Time-Triggered Architecture (TTA) for safety-critical distributed control. We outline the approach taken to formally analyse the clock synchronization algorithm and the group membership service of TTA, summarize our experience and describe remaining challenges.
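As background, clock synchronization algorithms of this kind are typically built around a fault-tolerant convergence function. The sketch below shows a generic fault-tolerant midpoint, not the verified TTA algorithm itself, and the sample readings are invented.

# Generic fault-tolerant midpoint: discard the k largest and k smallest
# perceived clock readings, then take the midpoint of what remains.

def fault_tolerant_midpoint(readings, k=1):
    """Tolerates up to k arbitrarily faulty clock readings."""
    if len(readings) <= 2 * k:
        raise ValueError("need more than 2*k readings")
    trimmed = sorted(readings)[k:len(readings) - k]
    return (trimmed[0] + trimmed[-1]) / 2

if __name__ == "__main__":
    # Perceived clock differences (microseconds); one value is from a faulty node.
    readings = [-3.0, -1.0, 0.0, 2.0, 1000.0]
    print(fault_tolerant_midpoint(readings, k=1))  # 0.5: outlier has no effect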
formal methods | 1997
Axel Dold; Friedrich W. von Henke; Holger Pfeifer; Harald Rueß
In this paper we describe a formal verification of transformations for peephole optimization using the PVS system [12]. Our basic approach is to develop a generic scheme to mechanize these kinds of verifications for a large class of machine architectures. This generic scheme is instantiated with a formalization of a non-trivial stack machine [14] and a PDP-11 like two-address machine [2], and we prove the correctness of more than 100 published peephole optimization rules for these machines. In the course of verifying these transformations we found several errors in published peephole transformation steps [14]. From the information of failed proof attempts, however, we were able to discover strengthened preconditions for correcting the erroneous transformations.
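A toy rendering of the verification task (not the PVS formalization, and not one of the published rule sets cited above): a peephole rewrite is checked against the semantics of a tiny stack machine, with a handful of test states standing in for the machine-checked proof.

# Execute both code sequences from the same start states and compare results.

def run(code, stack):
    stack = list(stack)
    for op, *args in code:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "NEG":
            stack.append(-stack.pop())
    return stack

def equivalent(before, after, test_stacks):
    return all(run(before, s) == run(after, s) for s in test_stacks)

if __name__ == "__main__":
    tests = [[1, 2], [0], [5, -7, 3]]
    # A sound rule:  PUSH 0; ADD  ==>  (nothing)
    print(equivalent([("PUSH", 0), ("ADD",)], [], tests))            # True
    # An unsound "rule":  NEG; NEG  ==>  PUSH 0  -- caught by the check
    print(equivalent([("NEG",), ("NEG",)], [("PUSH", 0)], tests))    # False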
Archive | 2010
Falk Bartels; Axel Dold; Holger Pfeifer; Friedrich W. von Henke; Harald Rueß
We describe an encoding of major parts of domain theory in the PVS extension of the simply typed λ-calculus. These encodings consist of formalizations of basic structures like partial orders and complete partial orders (domains); various domain constructions; notions related to monotonic functions and continuous functions; the Knaster-Tarski fixed point theorems for monotonic and continuous functions, whose proof requires Zorn's lemma, which has been derived from Hilbert's choice operator; Scott's fixed point induction for admissible predicates; and various variations of fixed point induction like Park's lemma. Altogether, these encodings form a conservative extension of the underlying PVS logic, since all developments are purely definitional. Most of our proofs are straightforward transcriptions of textbook knowledge. The purpose of this work, however, was not to merely reproduce textbook knowledge. To the contrary, our main motivation derived from our work on fully mechanized compiler correctness proofs, which requires a full treatment of fixed point induction in PVS; these requirements guided our selection of which elements of domain theory were formalized. A major problem of embedding mathematical theories like domain theory lies in the fact that developing and working with those theories usually generates myriads of applicability and type correctness conditions. Our approach of exploiting the PVS device of judgements to establish many applicability conditions behind the scenes leads to a considerable reduction in the number of conditions that actually need to be proved. Finally, we exemplify the application of mechanized fixed point induction in PVS by a mechanized proof in the context of relating different semantics of imperative programming constructs. This paper appeared as the technical report UIB of the Universität Ulm, Fakultät für Informatik. This research has been funded in part by the Deutsche Forschungsgemeinschaft (DFG) under project Verifix.
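For reference, the two fixed point principles named above, stated in standard textbook notation (which need not match the PVS encoding): for a continuous function f on a pointed cpo the least fixed point equals the limit of the iterates, and admissible predicates can be propagated to it by Scott induction; for merely monotonic f, a least fixed point still exists, which is where Zorn's lemma enters.

\[
  \mu f \;=\; \bigsqcup_{n \in \mathbb{N}} f^{\,n}(\bot)
  \qquad \text{for continuous } f \text{ on a pointed cpo,}
\]
\[
  \frac{P(\bot) \qquad \forall x.\; P(x) \Rightarrow P(f(x))}{P(\mu f)}
  \qquad \text{(Scott fixed point induction, } P \text{ admissible).}
\]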