Publication


Featured research published by Bernd Heinrich.


Journal of Data and Information Quality | 2009

A Procedure to Develop Metrics for Currency and its Application in CRM

Bernd Heinrich; Mathias Klier; Marcus Kaiser

Due to the importance of using up-to-date data in information systems, this article analyzes how the data-quality dimension currency can be quantified. Based on several requirements (e.g., normalization and interpretability) and a literature review, we design a procedure to develop probability-based metrics for currency which can be adjusted to the specific characteristics of data attribute values. We evaluate the presented procedure with regard to the requirements and illustrate the applicability as well as its practical benefit. In cooperation with a major German mobile services provider, the procedure was applied in the field of campaign management in order to improve both success rates and profits.
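The probability-based idea can be illustrated with a minimal sketch (not the paper's exact metric): if an attribute value's shelf life is assumed to follow an exponential decline with an attribute-specific rate, the metric value is the probability that the stored value is still up to date. The function name and the decline rate below are illustrative assumptions.

```python
import math

def currency(age_in_years: float, decline_rate: float) -> float:
    """Illustrative probability-based currency metric.

    Assumes the shelf life of an attribute value is exponentially
    distributed with an attribute-specific decline rate (share of
    values becoming outdated per year). Returns a value in [0, 1]
    readable as the probability that the stored value is still valid.
    """
    return math.exp(-decline_rate * age_in_years)

# Example: an address acquired 2.5 years ago, assuming (hypothetically)
# that roughly 10% of addresses become outdated per year.
print(round(currency(2.5, 0.1), 3))  # ~0.779
```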


Wirtschaftsinformatik und Angewandte Informatik | 2006

Quantitatives IT-Portfoliomanagement

Alexander Wehrmann; Bernd Heinrich; Frank Seifert

Key points: A central component of IT-business alignment is IT portfolio management (ITPM), i.e., the selection of IT projects on the basis of economic criteria and their allocation to a portfolio. This paper presents an approach to quantitative ITPM and illustrates its implementation with a real-world case study. The approach focuses on the integrated optimization of return and time-related risks as well as on operationalizability; neither point is addressed satisfactorily by existing methodologically founded procedures. Steering individual business units via decentralized IT budgets is not economically sensible, which is why an overarching ITPM is necessary. Today, IT portfolios are often measured only by return figures; this is necessary but not sufficient. Rather, the assessment must equally take risk interdependencies into account, since roughly 70% of all IT projects today cannot be carried out as planned due to issues that are reflected in project risk.

Abstract: Based on previously released research, this paper focuses on the question of how IT projects should be allocated to a risk/return balanced IT portfolio. To this end, we develop an approach that exploits the structure of IS architectures and scenarios to identify project risks as well as dependencies between projects. As a result, different clusters of efficient portfolios with distinctive risk/return properties can be derived. The presented approach is designed to support management decisions in a pragmatic manner when selecting IT portfolios. Using real data of a major German financial services provider, we exemplify the implementation and the results of the presented approach.
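As a rough sketch of the risk/return idea (not the paper's model): enumerate feasible project portfolios under a budget, compute expected return and a risk figure that includes pairwise dependencies, and keep only portfolios that are not dominated on both dimensions. All project names, costs, and dependency terms below are hypothetical.

```python
from itertools import combinations

# Hypothetical projects: (name, cost, expected return, standalone risk)
projects = [("A", 40, 12.0, 4.0), ("B", 30, 9.0, 6.0), ("C", 50, 15.0, 9.0)]
# Hypothetical pairwise risk dependencies (covariance-like terms)
dependency = {("A", "B"): 1.5, ("A", "C"): -2.0, ("B", "C"): 0.5}
BUDGET = 90

def evaluate(portfolio):
    ret = sum(p[2] for p in portfolio)
    risk = sum(p[3] for p in portfolio)
    names = sorted(p[0] for p in portfolio)
    for a, b in combinations(names, 2):
        risk += 2 * dependency.get((a, b), 0.0)  # add interdependency effects
    return ret, risk

candidates = []
for r in range(1, len(projects) + 1):
    for combo in combinations(projects, r):
        if sum(p[1] for p in combo) <= BUDGET:
            ret, risk = evaluate(combo)
            candidates.append(([p[0] for p in combo], ret, risk))

# Keep efficient (non-dominated) portfolios: no other feasible portfolio
# offers a higher return at equal or lower risk.
efficient = [c for c in candidates
             if not any(o[1] > c[1] and o[2] <= c[2] for o in candidates)]
print(efficient)
```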


Information Systems and E-business Management | 2009

The process map as an instrument to standardize processes: design and application at a financial service provider

Bernd Heinrich; Matthias Henneberger; Susanne Leist; Gregor Zellner

The standardization of processes and the identification of shared business services in a service-oriented architecture (SOA) are currently widely discussed. In practice above all, however, there is still a lack of appropriate instruments to support these tasks. In this paper, an approach for a process map is introduced which allows for a systematic presentation, as complete as possible, of the processes in an enterprise (division). After the processes have been consistently refined by means of aggregation/disaggregation and generalization/specialization relations, it is possible to identify primarily functional similarities of the detailed sub-processes. The application of the process map at a financial service provider (FSP) highlights how these similarities can be taken as a basis to standardize processes and to identify shared services.
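A minimal illustration of the underlying idea (not the paper's method): once processes are refined into detailed sub-processes, functional similarity can be made visible by comparing which functions each sub-process uses; pairs with a large overlap are candidates for standardization or a shared service. The sub-process and function names below are hypothetical.

```python
# Hypothetical detailed sub-processes and the functions they use
subprocesses = {
    "open_current_account": {"check_identity", "score_credit", "create_contract"},
    "grant_consumer_loan": {"check_identity", "score_credit", "create_contract", "pay_out"},
    "order_credit_card": {"check_identity", "create_contract"},
}

def similarity(a: set, b: set) -> float:
    """Jaccard overlap of the function sets of two sub-processes."""
    return len(a & b) / len(a | b)

# Flag pairs with a high functional overlap as shared-service candidates
names = list(subprocesses)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        s = similarity(subprocesses[x], subprocesses[y])
        if s >= 0.5:
            print(f"{x} / {y}: overlap {s:.2f} -> shared-service candidate")
```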


Journal of Information Science | 2011

Assessing data currency - a probabilistic approach

Bernd Heinrich; Mathias Klier

The growing relevance of data quality has revealed the need for adequate measurement. As time aspects are extremely important in data quality management, we propose a novel approach to assess data currency. Our metric, which is founded on probability theory, enables an objective and widely automated assessment for data liable to temporal decline. Its values are easy to interpret by business users. Moreover, the metric makes it possible to analyse the economic impacts of data quality measures like data cleansing and can therefore build a basis for an economic management of data quality. The approach can be applied in various fields of application where the currency of data is important. To illustrate the practical benefit and the applicability of the novel metric, we provide an extensive real world example. In cooperation with a major German mobile services provider, the approach was successfully applied in campaign management and led to an improved decision support.
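Because the metric value is a probability, it can feed directly into a decision calculus. A minimal sketch with hypothetical figures (not data from the study): in campaign management, the expected profit of mailing a customer can be weighted by the probability that the stored address is still current, and only customers with a positive expected profit are contacted.

```python
import math

def currency(age_in_years: float, decline_rate: float) -> float:
    """Probability that a stored attribute value is still up to date
    (illustrative exponential-decline assumption)."""
    return math.exp(-decline_rate * age_in_years)

def expected_campaign_profit(profit_if_reached: float, mailing_cost: float,
                             address_age: float, decline_rate: float) -> float:
    """Expected profit of mailing a customer, discounted by the
    probability that the address on file is still valid."""
    p_current = currency(address_age, decline_rate)
    return p_current * profit_if_reached - mailing_cost

# Hypothetical figures: contact only if the expected profit is positive.
print(expected_campaign_profit(profit_if_reached=20.0, mailing_cost=2.0,
                               address_age=3.0, decline_rate=0.1))
```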


Wirtschaftsinformatik und Angewandte Informatik | 2008

IT-Service-Management – Ein Modell zur Bestimmung der Folgen von Interoperabilitätsstandards auf die Einbindung externer IT-Dienstleister

Kathrin Susanne Braunwarth; Bernd Heinrich

Summary: How do interoperability standards (IOS) such as web service standards or semantic annotations affect the outsourcing of processes or process activities and thus the integration of external IT service providers and their services? To answer this question, a decision model that takes payment and risk aspects into account is developed in order to form portfolios of services (produced in-house or sourced externally). The central results, based on the stated assumptions, are: If IOS reduce the integration and coordination costs for services and providers in the future, companies will include more providers in their portfolio of IT services in order to lower the risk of a service failure. If, in addition, the re-dispatching costs for replacing failed services can be reduced as well, service failures cause less economic damage; in this case, forming portfolios of several IT services loses importance from a risk perspective, and in the extreme case it suffices to request only the cheapest service in each instance. The complete outsourcing of a process to a single provider, as is common practice today, gives away optimization potential in the sense of the approach presented here: if individual activities, or even individual executions of an activity, can be assigned independently of one another, the failure risk can be diversified.

Abstract: This article examines the impact of interoperability standards such as Web Service standards or semantic annotation of services on the outsourcing of business processes or process activities. In particular, the integration of external IT service providers is considered. For this purpose, a decision model is developed for optimizing service portfolios with regard to risk and cost aspects. Using an extract of the application process for current accounts, we exemplify the implementation and the results of the presented model.
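The line of argument can be sketched with a toy calculation (hypothetical numbers, not the paper's decision model): integrating several providers for one activity lowers the expected damage from a service failure but adds per-provider integration and coordination costs, so cheaper integration, as enabled by interoperability standards, shifts the optimum toward more providers.

```python
def expected_total_cost(n_providers: int, integration_cost: float,
                        failure_prob: float, failure_damage: float,
                        service_price: float) -> float:
    """Expected cost of sourcing one activity from n redundant providers.

    Assumes providers fail independently and damage occurs only if all
    of them fail. Illustrative model, not the paper's formalization.
    """
    p_all_fail = failure_prob ** n_providers
    return (n_providers * integration_cost
            + service_price
            + p_all_fail * failure_damage)

def best_portfolio_size(integration_cost: float) -> int:
    sizes = range(1, 6)
    return min(sizes, key=lambda n: expected_total_cost(
        n, integration_cost, failure_prob=0.1, failure_damage=1000.0,
        service_price=50.0))

# With expensive integration a single provider wins; with cheap
# integration (e.g. via interoperability standards) redundancy pays off.
print(best_portfolio_size(integration_cost=120.0))  # -> 1
print(best_portfolio_size(integration_cost=10.0))   # -> 2
```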


Wirtschaftsinformatik und Angewandte Informatik | 2008

SEMPA – Ein Ansatz des Semantischen Prozessmanagements zur Planung von Prozessmodellen

Bernd Heinrich; Marc-Andre Bewernik; Matthias Henneberger; Alexander Krammer; Florian Lautenbacher

Summary: Companies must continuously adapt their processes to changing market developments. However, the modeling and improvement of processes this requires is still often associated with considerable manual effort. This paper presents an approach to semantic process management that enables the partly automated construction (in the sense of planning) of process models from individual actions: The starting point is a set of actions that are semantically described with the help of an ontology and stored in a process library. Semantic analyses and inferences are necessary to derive the dependencies between actions and thereby enable the planning of process models; control flow structures are also planned into the process models. In contrast to approaches for web service composition, technology-independent process models can be created in this way, which can subsequently be aligned, for example, with the business departments.

Abstract: Currently, process modeling is mostly done manually. Therefore, the initial design of process models as well as changes to process models, which are frequently necessary to react to new market developments or new regulations, are time-consuming tasks. In this paper we introduce SEMPA, an approach for the partly automatic planning of process models. Using ontologies to semantically describe actions, as envisioned in Semantic Business Process Management, a process model for a specified problem setting can be created automatically. In comparison to existing planning algorithms, our approach creates process models including control structures and is able to cope with complex and numerical input and output parameters of actions. The prototypical implementation as well as an example taken from the financial services domain illustrate the practical benefit of our approach.
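To give an intuition for the planning step (a minimal sketch, not the SEMPA algorithm itself): if each action in the library declares the parameters it requires and produces, a simple forward search can chain actions from the initial parameters to the desired goal parameters. The action and parameter names below are hypothetical, and control-flow planning is omitted.

```python
from collections import deque

# Hypothetical action library: name -> (required inputs, produced outputs)
actions = {
    "identify_customer": ({"customer_id"}, {"customer_record"}),
    "rate_creditworthiness": ({"customer_record"}, {"credit_score"}),
    "prepare_offer": ({"customer_record", "credit_score"}, {"loan_offer"}),
}

def plan(initial: set, goal: set):
    """Breadth-first forward search over the available parameters.
    Returns one sequence of actions that produces the goal parameters."""
    queue = deque([(frozenset(initial), [])])
    seen = {frozenset(initial)}
    while queue:
        available, sequence = queue.popleft()
        if goal <= available:
            return sequence
        for name, (inputs, outputs) in actions.items():
            if inputs <= available and name not in sequence:
                nxt = frozenset(available | outputs)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, sequence + [name]))
    return None

print(plan({"customer_id"}, {"loan_offer"}))
# -> ['identify_customer', 'rate_creditworthiness', 'prepare_offer']
```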


Decision Support Systems | 2015

Metric-based data quality assessment - Developing and evaluating a probability-based currency metric

Bernd Heinrich; Mathias Klier

Data quality assessment has been discussed intensively in the literature and is critical in business. The importance of using up-to-date data in business, innovation, and decision-making processes has revealed the need for adequate metrics to assess the currency of data in information systems. In this paper, we propose a data quality metric for currency that is based on probability theory. Our metric allows for a reproducible configuration and a high level of automation when assessing the currency of attribute values. The metric values represent probabilities and can be integrated into a decision calculus (e.g., based on decision theory) to support decision-making. The evaluation of our metric consists of two main steps: (1) we define an instantiation of the metric for a real-use situation of a German mobile services provider to demonstrate both the applicability and the practical benefit of the approach; (2) we use publicly available real-world data provided by the Federal Statistical Office of Germany and the German Institute of Economic Research to demonstrate its feasibility by defining an instantiation of the metric and to evaluate its strength (compared to existing approaches). Highlights: We propose a well-founded probability-based data quality metric for currency. The metric values can be used in a decision calculus to support decision-making. The metric has successfully been applied in several real-use situations. The metric has significant advantages and can yield substantial practical benefit.
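The instantiation step can be illustrated with a hedged example (the figures are placeholders, not those used in the paper's evaluation): a decline rate for an address attribute could be estimated from a published relocation share and then plugged into a probability-based currency metric.

```python
import math

# Hypothetical estimation of the decline parameter for the attribute
# "address" from a published relocation share (placeholder value).
annual_relocation_share = 0.09  # assumed: 9% of people move per year

# Under an exponential-decline assumption, the share of values still
# valid after one year equals exp(-decline_rate), so:
decline_rate = -math.log(1.0 - annual_relocation_share)

def currency(age_in_years: float) -> float:
    """Probability that an address stored age_in_years ago is still valid."""
    return math.exp(-decline_rate * age_in_years)

for age in (0.5, 1.0, 2.0, 5.0):
    print(f"age {age:>3} years -> currency {currency(age):.2f}")
```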


Web Intelligence | 2011

Granularity of Services

Alexander Krammer; Bernd Heinrich; Matthias Henneberger; Florian Lautenbacher

Service-oriented architectures are widely discussed as a design principle for application and enterprise architectures. Nevertheless, an adequate granularity of services has not yet been researched sufficiently from an economic perspective. The finer the granularity, the higher the number of services needed to realize the functions of a process, and the more effort has to be directed towards composing them. In contrast, very coarse-grained services bear the disadvantages of higher implementation costs and lower reuse potential (e.g., in different processes). The aim of the decision model proposed in this paper is to determine an adequate granularity of services from an economic perspective. Thus, degrees of freedom, which often exist for the choice of granularity after a domain analysis, can be leveraged to realize a cost-efficient solution. We illustrate the applicability and practical benefits of the decision model with an example from the context of a financial services provider.
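A toy version of the trade-off (hypothetical cost figures, not the paper's model): finer granularity increases the number of services and the composition effort, while coarser services raise implementation cost and reduce reuse, so the total cost has an interior minimum over the candidate granularities.

```python
# Hypothetical cost model for choosing a service granularity.
# n_services = number of services used to realize a fixed set of functions.

N_FUNCTIONS = 12
IMPL_COST_PER_FUNCTION = 3.0    # coarse services bundle more functions ...
COARSE_PENALTY = 1.5            # ... and are assumed costlier per function
COMPOSITION_COST_PER_LINK = 2.0
REUSE_CREDIT_PER_SERVICE = 1.0  # finer services are assumed more reusable

def total_cost(n_services: int) -> float:
    functions_per_service = N_FUNCTIONS / n_services
    implementation = n_services * functions_per_service * (
        IMPL_COST_PER_FUNCTION + COARSE_PENALTY * (functions_per_service - 1))
    composition = COMPOSITION_COST_PER_LINK * (n_services - 1) * n_services
    reuse = REUSE_CREDIT_PER_SERVICE * n_services
    return implementation + composition - reuse

best = min(range(1, N_FUNCTIONS + 1), key=total_cost)
print(best, round(total_cost(best), 1))  # interior minimum, neither 1 nor 12
```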


Archive | 2002

Die konzeptionelle Gestaltung des Multichannel-Vertriebs anhand von Kundenbedürfnissen

Bernd Heinrich

In view of the current upheavals in the banking sector, multichannel sales plays a key role in achieving a high degree of customer orientation. For the conceptual planning, in the sense of a systematic identification of added value, it is necessary to differentiate the interaction with the customer with respect to its elements. This also helps to ensure consistency with the subsequent process design. The paper takes up these requirements and presents a solution approach for the needs-oriented design of the market offering, which was developed and applied in the competence center together with the partner companies.


European Conference on Information Systems | 2015

Automated Planning of context-aware Process Models

Bernd Heinrich; Dominik Schön

Most real world processes are heavily influenced by environmental factors, which are referred to as the context of a process. Thus, the consideration of context is proposed within the research strand of Business Process Modeling. Most existing context-aware modeling approaches consider context only in terms of static information like, for instance, the location where a process is performed. However, context information like the weather could change during the conduction of a process, which we will denote as non-static context information. In order to increase the flexibility concerning environmental influences in general and especially context-related events of context-aware processes, we present an approach for the automated planning of context-aware process models that considers static and non-static context information. We therefore propose an extended state transition system in order to represent context information in terms of context variables and consider process exogenous changes of these context variables through context signals and receive context actions. Further, to ensure a correct, complete and time-efficient construction of context-aware process models, a planning approach is used to support modelers by means of an algorithm. To demonstrate the feasibility of our approach we mathematically evaluated the algorithm and applied it to real world processes.
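A minimal sketch of the extended state transition idea (names and values are hypothetical): a non-static context variable such as the weather is represented explicitly, a context signal received during execution updates it, and the next planned action depends on its current value.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessState:
    """State of a context-aware process: the current step plus
    static and non-static context variables (illustrative model)."""
    step: str
    context: dict = field(default_factory=dict)

def receive_context_signal(state: ProcessState, variable: str, value) -> ProcessState:
    """A context signal changes a non-static context variable during execution."""
    updated = dict(state.context, **{variable: value})
    return ProcessState(step=state.step, context=updated)

def next_action(state: ProcessState) -> str:
    """The chosen transition depends on the current context value."""
    if state.context.get("weather") == "storm":
        return "deliver_by_truck"
    return "deliver_by_drone"

state = ProcessState(step="dispatch",
                     context={"location": "Regensburg",  # static context
                              "weather": "clear"})       # non-static context
print(next_action(state))                                 # deliver_by_drone
state = receive_context_signal(state, "weather", "storm")  # changes mid-process
print(next_action(state))                                  # deliver_by_truck
```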

Collaboration


Dive into Bernd Heinrich's collaborations.

Top Co-Authors

Mathias Klier

University of Regensburg


Susanne Leist

University of Regensburg


Lars Lewerenz

University of Regensburg
