Maximilian Röglinger
University of Bayreuth
Publications
Featured research published by Maximilian Röglinger.
Business Process Management Journal | 2012
Maximilian Röglinger; Jens Pöppelbuß; Jörg Becker
Purpose – Maturity models are a prospering approach to improving a company's processes and business process management (BPM) capabilities. In fact, the number of corresponding maturity models is so high that practitioners and scholars run the risk of losing track. This paper therefore aims to provide a systematic in-depth review of BPM maturity models. Design/methodology/approach – The paper follows the accepted research process for literature reviews. It analyzes a sample of ten BPM maturity models according to a framework of general design principles. The framework particularly focuses on the applicability and usefulness of maturity models. Findings – The analyzed maturity models sufficiently address basic design principles as well as principles for a descriptive purpose of use. The design principles for a prescriptive use, however, are hardly met. Thus, BPM maturity models provide limited guidance for identifying desirable maturity levels and for implementing improvement measures. Research limitations/imp...
Business & Information Systems Engineering | 2013
Hans Ulrich Buhl; Maximilian Röglinger; Florian Moser; Julia Heidemann
When looking at the words of Hal Varian, Google's Chief Economist and professor emeritus at the University of California, Berkeley, thinking of Big Data seems natural. Big Data – a dictum which currently seems to be on everyone's lips – has recently developed into one of the most discussed topics in research and practice. Looking at academic publications, we find that more than 70% of all ranked papers which deal with Big Data were published within the last two years (Pospiech and Felden 2012), as well as nearly 12,000 hits for Big Data on Google Scholar across various fields of research. In 2011, more than 530 academic Big Data related publications could be counted (Chen et al. 2012). We find more hits for "Big Data" than for "Development aid" in Google, and almost daily an IT-related business magazine publishes a Big Data special issue next to a myriad of Big Data business conferences. In Gartner's current Hype Cycle for Emerging Technologies (Gartner 2012), Big Data is right at the peak of its hype phase, and according to this source broad adoption is to be expected within the next five years. Big Data provokes excitement across various fields such as science, government, and industries like media and telecommunications, health care, engineering, or finance, where organizations are facing a massive quantity of data and new technologies to store, process, and analyze those data. Despite the cherished expectations and hopes, the question is why we face such excitement around Big Data, which at first view seems to be a fashionable hype rather than a revolutionary concept. Is Big Data really something new, or is it just new wine in old bottles, seeing that, e.g., data analytics has been doing the same type of analysis for decades? Do more data and increased or faster analytics always imply better decisions, products, or services, or is Big Data just another buzzword to stimulate the IT providers' sales?
Taking the traditional financial service industry, which currently cherishes huge expectations in Big Data, as an example, the collection of massive amounts of data via multiple channels has long been part of the business model to customize prices and product offers or to calculate credit ratings. However, improving financial services by exploiting these huge amounts of data implied constant updating efforts, media disruptions, and expensive acquisition and processing of data. Hence, more data resulted in expensive data management, in higher prices for products or services, and in inconvenient data-entry processes for customers. As a result, instead of the traditional universal banks that focused on a data-intensive business model, direct banks with a higher degree of standardization and IT support as well as a focus on (very few) key customer data have often become more successful. Focusing solely on pure IT-based data acquisition, processing, and analysis to save costs, on the other hand, is virtually impossible in industries such as banking due to intense personal contact. Besides, neither in the financial service industry nor in other industries do more data automatically lead to better data, better business success, better services, better decisions, or (more) satisfied customers. Above all, Big Data brings a lot of still unresolved challenges regarding the volume, velocity, variety, and veracity of data, which should not be underestimated. Often enough, more data even lead to a certain amount of "data garbage", which usually is more easily and better recognized and managed by employees than by analytics software (veracity). Additionally, the management of various sources of data such as mobile applications, online social networks, or CRM systems is far from trivial (variety). The high data traffic brings along the challenge of archiving, retrieving, and analyzing huge amounts of data in real time (volume and velocity).
Unsurprisingly, nearly every second Big Data project is canceled before completion (Infochimps 2013). And as if these challenges were not enough, the myriad of differing legal privacy restrictions across countries is turning into one of Big Data's most serious challenges.
Web Intelligence | 2011
Hans Ulrich Buhl; Maximilian Röglinger; Stefan Stöckl; Kathrin Susanne Braunwarth
There is no doubt that, at least since the 1990s, process orientation has evolved into one of the central paradigms of organizational design. Since then, all process management subtasks have matured. Process management decisions, however, lack economic foundation. They are usually based on qualitative or technical criteria or on plausibility considerations that do not necessarily comply with typical objectives in a market economy. Consequently, design alternatives are hardly comparable and an integrated valuation of a company's assets is impossible. The status quo is astonishing for several reasons: First, process management decisions usually imply investment projects with different risk/return positions and capital tie-up. Second, the need for designing processes according to their contribution to corporate objectives has been explicated repeatedly. Third, the paradigm of value-based management is an accepted theoretical framework from economic research that makes it possible to consistently valuate the risk/return effects of decisions across functional areas, hierarchy levels, and asset classes. This suggests the hypothesis that process management in general, as well as the goal orientation of process management decisions in particular, has evolved almost independently of value-based management. In the paper at hand, this hypothesis is confirmed based on a sample of process management publications. We therefore explicate the research gap as regards value orientation in process management. In order to bridge the gap between value-based management and process-oriented organizational design, we transfer economically well-founded objective functions to process management decisions.
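The transfer of value-based objective functions to process decisions can be illustrated with a small sketch. The data, names, and the concrete preference function below are illustrative assumptions, not the paper's formulation; a common value-based objective is the certainty equivalent E[CF] - (alpha/2)·Var[CF] of a design's risky cash flows, which lets two process designs with different risk/return positions be compared on one scale.

```python
# Hypothetical sketch: comparing two process designs by a value-based
# objective function. The concrete preference function is an assumption;
# a common choice in value-based management is the certainty equivalent
# CE = E[CF] - (alpha / 2) * Var[CF] for a risk-averse decision maker.
from statistics import mean, pvariance

def certainty_equivalent(cash_flows, risk_aversion=0.01):
    """Preference value of a risky cash-flow sample (assumed form)."""
    return mean(cash_flows) - (risk_aversion / 2) * pvariance(cash_flows)

# Two illustrative design alternatives with simulated periodic cash flows
design_a = [100, 120, 80, 110, 90]   # lower mean, lower risk
design_b = [150, 40, 160, 30, 170]   # higher mean, higher risk

best = max(["A", "B"],
           key=lambda d: certainty_equivalent(design_a if d == "A" else design_b))
```

Under this (assumed) risk aversion, design A's lower variance outweighs design B's higher mean, so the value-based objective prefers A even though B has the larger expected cash flow.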
Journal of Strategic Information Systems | 2012
Hans Ulrich Buhl; Gilbert Fridgen; Wolfgang König; Maximilian Röglinger; Christian Wagner
During the last decades, strategic information systems (SIS) research has become an influential stream within the information systems discipline. The success story of the Journal of Strategic Information Systems provides strong evidence. Yet, we believe that there is still a lot of untapped potential in the interaction of SIS research and industry. Put bluntly, results of SIS research cannot be publicly available, reconstructable by subject matter experts, and valid beyond a single or very few cases while at the same time constituting the foundation of competitive advantage. We argue that SIS researchers need to become boundary spanners who actively engage in industry collaboration to help create competitive advantage and who later disseminate their insights to advance the scientific knowledge base. We outline challenges of boundary-spanning SIS research and provide some ideas and recommendations. Wherever sensible, we draw on our experiences from the traditionally strong industry collaboration of the business and information systems engineering community in the German-speaking countries.
Web Intelligence | 2014
Patrick Afflerbach; Gregor Kastner; Felix Krause; Maximilian Röglinger
Promising to cope with increasing demand variety and uncertainty, flexibility in general and process flexibility in particular are becoming ever more desired corporate capabilities. In recent years, the business process management and the production/operations management communities have proposed numerous approaches that investigate how to valuate and determine an appropriate level of process flexibility. Most of these approaches are very restrictive regarding their application domain, neglect characteristics of the involved processes and outputs other than demand and capacity, and do not conduct a thorough economic analysis of process flexibility. Against this backdrop, the authors propose an optimization model that determines an appropriate level of process flexibility in line with the principles of value-based business process management. The model includes demand uncertainty, variability, criticality, and similarity as process characteristics. The paper also reports on the insights gained from applying the optimization model to the coverage switching processes of an insurance broker pool company.
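The core trade-off of such a model can be sketched in a few lines. The toy formulation below is an assumption for illustration, not the authors' optimization model: each additional unit of flexibility costs a fixed amount but lets one more demand spike per period be served, and the appropriate flexibility level maximizes expected contribution under demand uncertainty.

```python
# Hypothetical toy model (not the authors' formulation): choose the number
# of flexible process variants k that maximizes expected value, trading the
# investment cost of flexibility against the margin from served demand spikes.
import random

random.seed(42)

# 1,000 demand scenarios: number of demand spikes per period, uniform 0..5
scenarios = [random.randint(0, 5) for _ in range(1000)]

def expected_value(k, scenarios, margin=100.0, cost_per_level=60.0):
    """Expected contribution of flexibility level k.

    Assumption: with k flexible variants, up to k spikes per period can be
    served; each served spike earns `margin`, each level costs `cost_per_level`.
    """
    served = sum(min(s, k) for s in scenarios) / len(scenarios)
    return served * margin - cost_per_level * k

best_k = max(range(6), key=lambda k: expected_value(k, scenarios))
```

With these numbers, a marginal flexibility level pays off only while the probability of needing it exceeds cost/margin = 0.6, so the sketch settles on an interior optimum rather than maximal flexibility, which is the qualitative point of an economic analysis of process flexibility.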
Journal of Decision Systems | 2014
Eva Forstner; Nora Kamprath; Maximilian Röglinger
Despite the need for sustaining competitive advantage, scholars and practitioners struggle when deciding which organizational capabilities they should develop to what extent. Today, inconsistent recommendations bear the risk of misallocating corporate funds. Despite recent advances, further research needs to be conducted with respect to how uncertainty can be considered in capability development decisions and whether the cutting of capabilities is a feasible option. Against this background, we propose a conceptual framework for structuring capability development decisions. Due to the close relationship between capability development and business process management, the framework builds on process maturity models and the principles of value-based business process management. We also conduct an economic analysis to disclose general relationships that govern capability development based on process maturity models.
Business Process Management Journal | 2017
Sabiölla Hosseini; Alexandra Kees; Jonas Manderscheid; Maximilian Röglinger; Michael Rosemann
Purpose In a world of ever-changing corporate environments and reduced product life cycles, most organizations can no longer afford to innovate on their own. Hence, they open their innovation processes to incorporate knowledge of external sources and to increase their innovation potential. As the shift toward open innovation (OI) is difficult and makes many initiatives fail, the question arises as to which capabilities organizations should develop to successfully implement OI. As the literature encompasses mature but isolated streams on OI capabilities, there is a need for an integrated capability framework. The paper aims to discuss these issues. Design/methodology/approach This paper proposes the open innovation capability framework (OICF) that compiles and structures capabilities relevant for implementing OI. The OICF covers the outside-in and coupled processes of OI. To integrate multiple streams of the OI literature, the OICF builds on a structured literature review. The OICF was also validated in a two-step review process with OI experts from academia and industry. Findings The OICF comprises 23 capability areas grouped along the factors strategic alignment, governance, methods, information technology, people, and culture. To analyze the existing body of knowledge on OI capabilities, the authors compare the OICF with other OI-related capability frameworks and compile a heatmap based on the results of the literature review. The authors also discuss the experts' feedback on individual factors of the OICF as well as on interdependencies among these factors. Practical implications The OICF provides practitioners with a structured overview of the capabilities to consider when implementing OI. Based on the OICF, practitioners can define the scope of their OI initiatives. They can use the OICF as a foundation for prioritizing, selecting, and operationalizing capability areas as well as for deriving implementation roadmaps.
Originality/value The OICF is the first framework to take a holistic perspective on OI capabilities. It integrates mature but isolated research streams of OI. It helps practitioners define the scope of OI initiatives and academics gain insights into the current state of the art on OI capabilities.
Business Process Management Journal | 2015
Manuel Bolsinger; Anna Elsäßer; Caroline Helm; Maximilian Röglinger
Purpose – Process improvement is a fundamental activity of the business process management (BPM) lifecycle. However, practitioners still lack concrete guidance and adequate objectives for process improvement. Moreover, improvement projects typically tie up considerable amounts of capital and are very risky. Thus, more guidance is needed on how to derive concrete recommendations for process improvement in a goal-oriented manner. The paper aims to discuss these issues. Design/methodology/approach – The authors propose a decision model that determines along which paths the instances of a process should be routed to maximize the value contribution of the process. To do so, the decision model requires a process model and a set of historical process instances as inputs. Findings – The decision model builds on the idea that only the parameters of the process, i.e., the values that determine which path an instance takes through the process, can be modified, without altering the structure of the p...
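The routing idea can be illustrated with a minimal sketch. The attribute name, path labels, and payoff figures below are invented for illustration and are not taken from the paper: a single routing parameter (a threshold on an instance attribute) is tuned so that the total value contribution over a set of historical instances is maximized, without changing the process structure.

```python
# Illustrative sketch (assumed names and data, not the authors' model):
# pick the routing threshold on one instance attribute that maximizes the
# value contribution over historical instances. Instances above the
# threshold take the "manual review" path, the rest go straight through.

# Historical instances: (claim_amount, payoff_if_straight_through, payoff_if_manual)
history = [
    (50,  40,   10),
    (80,  35,   15),
    (200, -60,  25),
    (500, -150, 30),
    (120, 20,   18),
]

def total_value(threshold, instances):
    """Value of routing: manual path for amounts above the threshold."""
    return sum(manual if amount > threshold else straight
               for amount, straight, manual in instances)

# Evaluate every observed amount as a candidate threshold parameter
candidates = sorted({amount for amount, _, _ in history})
best_threshold = max(candidates, key=lambda t: total_value(t, history))
```

The point of the sketch is that only the parameter (the threshold) is optimized; the two paths and the control flow between them stay exactly as modeled.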
Communication System Software and Middleware | 2007
Karsten Loesing; Maximilian Röglinger; Christian Wilms; Guido Wirtz
Instant messaging (IM) systems provide their users with information about which of their contacts are currently online. This presence information supplements the text communication of IM systems and adds value compared to other synchronous communication media. Unauthorized users could illegally generate online logs of users by exploiting their presence information. Public IM systems lack reliable means to protect user presence and force users to trust the central registry server. In this paper, we propose an IM system that is explicitly designed to protect user presence without the need for a trusted central registry. We present a Java implementation based on the anonymous communication network Tor [1], the cryptographic suite Bouncy Castle [2], and the distributed hash table OpenDHT [3].
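One way such presence protection can work is sketched below. This is a hypothetical illustration of the general technique, not the paper's Java implementation: presence is published in an open DHT under a key derived from a secret shared with each contact, so outsiders scanning the DHT cannot tell who is online, and no central registry must be trusted. The names (`dht`, `shared_secret`, the epoch scheme) are assumptions.

```python
# Hypothetical sketch: publish presence under a key that only authorized
# contacts can compute, so an open DHT reveals nothing to outsiders.
# Per-epoch keys keep successive announcements unlinkable without the secret.
import hashlib
import hmac

dht = {}  # in-memory stand-in for a distributed hash table such as OpenDHT

def presence_key(shared_secret: bytes, epoch: int) -> str:
    """Per-contact, per-epoch lookup key derived with HMAC-SHA256."""
    return hmac.new(shared_secret, str(epoch).encode(), hashlib.sha256).hexdigest()

def publish_presence(shared_secret: bytes, epoch: int, address: str) -> None:
    """Announce the current (e.g., Tor hidden service) contact address."""
    dht[presence_key(shared_secret, epoch)] = address

def lookup_presence(shared_secret: bytes, epoch: int):
    """Returns the contact address if the peer is online, else None."""
    return dht.get(presence_key(shared_secret, epoch))

secret = b"alice-bob-shared-secret"  # established out of band (assumption)
publish_presence(secret, 42, "onion:abc123")
```

Because the DHT key is an HMAC output, a party without the shared secret cannot compute it, and therefore cannot check (or log) whether a given user is online.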
Web Intelligence | 2012
Hans Ulrich Buhl; Gilbert Fridgen; Günter Müller; Maximilian Röglinger
This article constitutes an "editorialized", partly shortened, and partly extended version of the paper "Business and information systems engineering: a complementary approach to information systems – what we can learn from the past and may conclude from present reflection on the future" by Hans Ulrich Buhl, Günter Müller, Gilbert Fridgen, and Maximilian Röglinger that appeared in the Journal of the Association for Information Systems 13(4):236–253, April 2012. The editorial was presented as a keynote at the BIS conference in Vilnius, Lithuania, in May 2012 and at the BISE workshop in Hannover, Germany, in October 2012. An earlier version has been published in the BIS proceedings. Published online: 2012-11-03