
Publication


Featured research published by Albert Maier.


Archive | 2008

BPELDT — Data-Aware Extension for Data-Intensive Service Applications

Dirk Habich; Sebastian Richly; Steffen Preissler; Mike Grasselt; Wolfgang Lehner; Albert Maier

Aside from business processes, the service-oriented approach, currently realized with Web services and BPEL, should also be usable for data-intensive applications. Fundamentally, data-intensive applications are characterized by (i) a sequence of functional operations processing large amounts of data and (ii) the delivery and transformation of huge data sets between those functional activities. However, efficient handling of massive data sets requires a significant amount of data infrastructure, and the predefined ‘by value’ data semantics of Web service invocations in BPEL are not well suited to this context. To tackle this problem at the BPEL level, we developed a seamless extension to BPEL, the ‘BPEL data transitions’.
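
The core idea, passing large data sets by reference between services instead of routing them by value through the orchestration engine, can be sketched as follows. This is a minimal illustration only, not the BPELDT implementation; all names (DataRef, the staging store, the two services) are hypothetical.

```python
# Minimal sketch (not the BPELDT implementation): contrasts the standard
# 'by value' data flow of a BPEL-style orchestrator with a 'by reference'
# data transition, where only a handle to the data set travels through
# the process engine and the services access the bulk data directly.
from dataclasses import dataclass

# --- by value: the orchestrator carries the whole data set ----------------
def orchestrate_by_value(rows: list[dict]) -> list[dict]:
    filtered = filter_service_by_value(rows)      # full payload in, full payload out
    return score_service_by_value(filtered)       # full payload again

def filter_service_by_value(rows): return [r for r in rows if r["value"] > 0]
def score_service_by_value(rows):  return [{**r, "score": r["value"] * 2} for r in rows]

# --- by reference: only a handle flows through the process ----------------
DATA_STORE: dict[str, list[dict]] = {}            # stands in for a shared database

@dataclass
class DataRef:
    key: str                                      # e.g. a table name or staging URI

def filter_service_by_ref(ref: DataRef) -> DataRef:
    rows = DATA_STORE[ref.key]                    # service pulls the data itself
    out = DataRef(ref.key + ":filtered")
    DATA_STORE[out.key] = [r for r in rows if r["value"] > 0]
    return out                                    # only the reference is returned

def score_service_by_ref(ref: DataRef) -> DataRef:
    rows = DATA_STORE[ref.key]
    out = DataRef(ref.key + ":scored")
    DATA_STORE[out.key] = [{**r, "score": r["value"] * 2} for r in rows]
    return out

if __name__ == "__main__":
    rows = [{"value": v} for v in (-1, 3, 5)]
    print(orchestrate_by_value(rows))             # full data flows through the orchestrator

    DATA_STORE["input"] = rows
    result_ref = score_service_by_ref(filter_service_by_ref(DataRef("input")))
    print(DATA_STORE[result_ref.key])             # same result, only references flowed
```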


IEEE Congress on Services | 2007

Data-aware SOA for Gene Expression Analysis Processes

Dirk Habich; Sebastian Richly; Wolfgang Lehner; Uwe Assmann; Mike Grasselt; Albert Maier; Christian Pilarsky

In the context of genome research, the method of gene expression analysis has been used for several years. Related microarray experiments are conducted all over the world, and consequently, a vast number of microarray data sets is produced. Having access to this variety of repositories, researchers would like to incorporate this data into their analysis processes to increase the statistical significance of their results. Such analysis processes are typical examples of data-intensive processes. In general, data-intensive processes are characterized by (i) a sequence of functional operations processing large amounts of data and (ii) the transportation and transformation of huge data sets between those functional operations. Supporting data-intensive processes requires an efficient and scalable environment, since performance is a key factor today. The service-oriented architecture (SOA) is beneficial in this area with regard to process orchestration and execution. However, the current realization of SOA with Web services and BPEL has some drawbacks regarding the performance of data propagation between Web services. In this paper, we therefore present our data-aware service-oriented approach to efficiently support such data-intensive processes.
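
As a rough illustration of the kind of data-intensive analysis process described above, the toy pipeline below merges microarray-style expression values from several repositories, normalizes them, and computes a simple per-gene statistic. The repository names, the normalization, and the statistic are hypothetical stand-ins and are not taken from the paper.

```python
# Toy sketch of a data-intensive gene expression analysis process:
# (i) a sequence of functional operations and (ii) large data sets handed
# from one operation to the next. Repositories, normalization, and the
# per-gene statistic are simplified stand-ins, not the paper's pipeline.
from statistics import mean

# Hypothetical repositories: gene -> expression values per sample.
REPOSITORIES = {
    "repo_a": {"BRCA1": [2.0, 2.4], "TP53": [0.9, 1.1]},
    "repo_b": {"BRCA1": [2.2], "TP53": [1.0], "EGFR": [3.1]},
}

def collect(repos: dict) -> dict:
    """Operation 1: merge expression values for each gene across repositories."""
    merged: dict[str, list[float]] = {}
    for data in repos.values():
        for gene, values in data.items():
            merged.setdefault(gene, []).extend(values)
    return merged

def normalize(expr: dict) -> dict:
    """Operation 2: scale each value by the global mean (toy normalization)."""
    global_mean = mean(v for values in expr.values() for v in values)
    return {g: [v / global_mean for v in values] for g, values in expr.items()}

def summarize(expr: dict) -> dict:
    """Operation 3: reduce each gene to a mean expression level."""
    return {g: round(mean(values), 3) for g, values in expr.items()}

if __name__ == "__main__":
    # More repositories mean more samples per gene and more robust statistics,
    # but also more data moved between the pipeline stages.
    print(summarize(normalize(collect(REPOSITORIES))))
```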


Information Technology | 2012

Industrializing Data Integration Projects using a Metadata Driven Assembly Line

Albert Maier; Martin Oberhofer; Thomas J. E. Schwarz

Data integration is essential for the success of many enterprise business initiatives, but it is also a very significant contributor to the costs and risks of the IT projects supporting these initiatives. Highly skilled consultants and data stewards re-design the usage of data in business processes, define the target landscape and its data models, and map the current information landscape onto the target landscape. Still, the largest part of a typical data integration effort is dedicated to implementing transformation, cleansing, and data validation logic in robust and highly performing commercial systems. This work is simple and does not demand skills beyond commercial product knowledge, but it is very labour-intensive and error-prone. In this paper we describe a new commercial approach to data integration that helps to “industrialize” data integration projects and significantly lowers the amount of simple but labour-intensive work. The key idea is that the target landscape of a data integration project has pre-defined data models and associated metadata which can be leveraged to build and automate the data integration process. This approach has been implemented in the context of SAP consolidation projects and is used in some of the largest data integration projects worldwide.
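
A minimal sketch of the metadata-driven idea follows, assuming a hypothetical target data model described as plain metadata: instead of hand-coding validation and transformation logic per project, the integration step is generated from the target schema. The table, column names, types, and rules below are invented for illustration and are not the products or models referenced in the paper.

```python
# Minimal sketch of metadata-driven generation of data integration logic.
# The target data model is described as metadata; the transformation and
# validation step is generated from it instead of being hand-coded.
# Schema, column names, and rules are hypothetical examples.

TARGET_CUSTOMER_MODEL = {
    "table": "CUSTOMER",
    "columns": {
        "customer_id":  {"type": int,   "required": True},
        "name":         {"type": str,   "required": True},
        "country":      {"type": str,   "required": False, "default": "DE"},
        "credit_limit": {"type": float, "required": False, "default": 0.0},
    },
}

def generate_loader(model: dict):
    """Generate a row-level transform/validate function from the target metadata."""
    columns = model["columns"]

    def load_row(source_row: dict) -> dict:
        target_row = {}
        for name, spec in columns.items():
            value = source_row.get(name, spec.get("default"))
            if value is None:
                if spec["required"]:
                    raise ValueError(f"{model['table']}.{name}: required value missing")
                target_row[name] = None
                continue
            target_row[name] = spec["type"](value)   # cleansing: cast to target type
        return target_row

    return load_row

if __name__ == "__main__":
    load_customer = generate_loader(TARGET_CUSTOMER_MODEL)
    # Source systems deliver strings; the generated loader casts and fills defaults.
    print(load_customer({"customer_id": "42", "name": "ACME", "credit_limit": "1000"}))
    # -> {'customer_id': 42, 'name': 'ACME', 'country': 'DE', 'credit_limit': 1000.0}
```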


Very Large Data Bases | 2007

An approach to optimize data processing in business processes

Marko Vrhovnik; Holger Schwarz; Oliver Suhre; Bernhard Mitschang; Volker Markl; Albert Maier; Tobias Kraft


Archive | 2005

A method and a computer system for synchronising backups of objects and of meta data about the objects

Albert Maier


Archive | 2008

Method and Apparatus for Optimization in Workflow Management Systems

Matthias Kloppmann; Frank Leymann; Albert Maier; Bernhard Mitschang; Charles Daniel Wolfson


Archive | 2005

Integration of data management operations into a workflow system

Mike Grasselt; Matthias Kloppmann; Albert Maier; Oliver Suhre; Matthias Tschaffler; Charles Daniel Wolfson


Archive | 2008

Interaction solutions for customer support

Martin Oberhofer; Albert Maier; Thomas Schwarz; Sebastian Krebs; Dirk Nowak


Archive | 2009

Generating extract, transform, and load (ETL) jobs for loading data incrementally

Thomas Joerg; Albert Maier; Oliver Suhre


Archive | 2011

Automatic generation of instantiation rules to determine quality of data migration

Anja Gruenheid; Albert Maier; Martin Oberhofer; Thomas Schwarz; Manfred Vodegel
