Publications


Featured research published by Boris Otto.


Hawaii International Conference on System Sciences | 2016

Design Principles for Industrie 4.0 Scenarios

Mario Hermann; Tobias Pentek; Boris Otto

The increasing integration of the Internet of Everything into the industrial value chain has built the foundation for the next industrial revolution called Industrie 4.0. Although Industrie 4.0 is currently a top priority for many companies, research centers, and universities, a generally accepted understanding of the term does not exist. As a result, discussing the topic on an academic level is difficult, and so is implementing Industrie 4.0 scenarios. Based on a quantitative text analysis and a qualitative literature review, the paper identifies design principles of Industrie 4.0. Taking these principles into account, academics may be enabled to further investigate the topic, while practitioners may find assistance in identifying appropriate scenarios. A case study illustrates how the identified design principles support practitioners in identifying Industrie 4.0 scenarios.


Journal of Data and Information Quality | 2009

One Size Does Not Fit All—A Contingency Approach to Data Governance

Kristin Weber; Boris Otto; Hubert Österle

Enterprises need Data Quality Management (DQM) to respond to strategic and operational challenges demanding high-quality corporate data. Hitherto, companies have mostly assigned accountabilities for DQM to Information Technology (IT) departments. They have thereby neglected the organizational issues critical to successful DQM. With data governance, however, companies may implement corporate-wide accountabilities for DQM that encompass professionals from business and IT departments. This research aims at starting a scientific discussion on data governance by transferring concepts from IT governance and organizational theory to the previously largely ignored field of data governance. The article presents the first results of a community action research project on data governance comprising six international companies from various industries. It outlines a data governance model that consists of three components (data quality roles, decision areas, and responsibilities), which together form a responsibility assignment matrix. The data governance model documents data quality roles and their type of interaction with DQM activities. In addition, the article describes a data governance contingency model and demonstrates the influence of performance strategy, diversification breadth, organization structure, competitive strategy, degree of process harmonization, degree of market regulation, and decision-making style on data governance. Based on these findings, companies can structure their specific data governance model.
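The responsibility assignment matrix described in the abstract can be sketched in a few lines of code. This is a minimal illustration only: the role names, decision areas, and RACI assignments below are assumptions for demonstration, not the taxonomy defined in the article.

```python
# Illustrative sketch of a responsibility assignment (RACI-style) matrix
# for data governance. Roles, decision areas, and assignments are
# hypothetical examples, not the article's exact model.

ROLES = ["Executive Sponsor", "Data Quality Board",
         "Data Steward (Business)", "Data Steward (IT)"]
AREAS = ["DQ strategy", "DQ policies and standards",
         "Data architecture", "Daily DQ monitoring"]

# R = Responsible, A = Accountable
MATRIX = {
    ("Executive Sponsor", "DQ strategy"): "A",
    ("Data Quality Board", "DQ strategy"): "R",
    ("Data Quality Board", "DQ policies and standards"): "A",
    ("Data Steward (Business)", "DQ policies and standards"): "R",
    ("Data Steward (IT)", "Data architecture"): "R",
    ("Data Steward (Business)", "Daily DQ monitoring"): "R",
}

# Sanity check: every matrix entry refers to a known role and area.
assert all(r in ROLES and a in AREAS for r, a in MATRIX)

def responsibilities(role):
    """Return the decision areas a given role touches, with its assignment."""
    return {area: raci for (r, area), raci in MATRIX.items() if r == role}

print(responsibilities("Data Quality Board"))
```

The matrix view makes the article's core point concrete: accountability for data quality is distributed across business and IT roles rather than concentrated in the IT department.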


Business & Information Systems Engineering | 2010

Consortium Research: A Method for Researcher-Practitioner Collaboration in Design-Oriented IS Research

Hubert Österle; Boris Otto

Design-oriented research in the Information Systems (IS) domain aims at delivering results which are both of scientific rigor and of relevance for practitioners. Today, however, academic researchers are facing the challenge of gaining access to and capturing knowledge from the practitioner community. Against this background, the paper proposes a method for Consortium Research, which is supposed to facilitate multilateral collaboration of researchers and practitioners during the research process. The method’s design is based on a self-evaluating design process which was carried out over a period of 20 years. The paper’s contribution is twofold. First, it addresses the science of design, since it proposes guidance to researchers for practitioner collaboration during the process of artifact design. Second, the method is an artifact itself, hence, the result of a design-oriented research process.


ACM Symposium on Applied Computing | 2009

Towards a maturity model for corporate data quality management

Kai M. Hüner; Martin Ofner; Boris Otto

High-quality corporate data is a prerequisite for world-wide business process harmonization, global spend analysis, integrated service management, and compliance with regulatory and legal requirements. Corporate Data Quality Management (CDQM) describes the quality-oriented organization and control of a company's key data assets such as material, customer, and vendor data. With regard to the aforementioned business drivers, companies demand an instrument to assess the progress and performance of their CDQM initiative. This paper proposes a reference model for CDQM maturity assessment. The model is intended to be used for supporting the build process of CDQM. A case study shows how the model has been successfully implemented in a real-world scenario.
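A maturity assessment of this kind typically aggregates per-area ratings into an overall level. The sketch below shows the general idea under stated assumptions: the design-area names and the 1-5 rating scale are illustrative, not the reference model's actual structure.

```python
# Minimal sketch of a maturity assessment in the spirit of a CDQM
# reference model: each design area is rated on a 1-5 scale and the
# overall level is the rounded-down average. Area names and scale
# are assumptions for illustration.

def maturity_level(scores: dict) -> int:
    """Aggregate per-area scores (1-5) into an overall maturity level."""
    if not scores:
        raise ValueError("no assessment scores given")
    for area, s in scores.items():
        if not 1 <= s <= 5:
            raise ValueError(f"score for {area!r} out of range: {s}")
    return int(sum(scores.values()) / len(scores))

assessment = {
    "Strategy": 3,
    "Organization": 2,
    "Processes": 4,
    "Architecture": 3,
}
print(maturity_level(assessment))  # 12 / 4 = 3
```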


Hawaii International Conference on System Sciences | 2012

Data Quality Requirements of Collaborative Business Processes

Clarissa Falge; Boris Otto; Hubert Österle

High-quality data is not just a competitive factor for individual companies but also an enabler for collaboration in business networks. The paper takes a cross-company perspective on data quality and identifies requirements collaborative business processes pose to data quality. A qualitative content analysis on Business Networking case studies is conducted. The results show which combinations of data classes (e.g. order data, forecast data) and quality dimensions (e.g. business rule conformity) are crucial for the different collaborative business processes in business networks. The paper interprets the results and closes with a discussion of current data quality trends for collaborative processes.


Business Process Management Journal | 2012

Integrating a data quality perspective into business process management

Martin Ofner; Boris Otto; Hubert Österle

Purpose – The purpose of this paper is to conceptualize data quality (DQ) in the context of business process management and to propose a DQ-oriented approach for business process modeling. The approach is based on key concepts and metrics from the data quality management domain and supports decision-making in process re-design projects on the basis of process models.

Design/methodology/approach – The paper applies a design-oriented research approach, in the course of which a modeling method is developed as a design artifact. To do so, method engineering is used as a design technique. The artifact is theoretically founded and incorporates DQ considerations into process re-design. Furthermore, the paper uses a case study to evaluate the suggested approach.

Findings – The paper shows that the DQ-oriented process modeling approach facilitates and improves managerial decision-making in the context of process re-design. Data quality is considered as a success factor for business processes and is conceptualized using a rule-based approach.

Research limitations/implications – The paper presents design research and a case study. More research is needed to triangulate the findings and to allow generalizability of the results.

Practical implications – The paper supports decision-makers in enterprises in taking a DQ perspective in business process re-design initiatives.

Originality/value – The paper reports on integrating DQ considerations into business process management in general and into process modeling in particular, in order to provide more comprehensive decision-making support in process re-design projects. The paper represents one of the first contributions to the literature regarding a contemporary phenomenon of high practical and scientific relevance.
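The rule-based view of data quality mentioned in the abstract can be sketched briefly: quality is the share of records conforming to business rules. The sample records and rules below are invented for illustration, not taken from the paper.

```python
# Sketch of a rule-based data quality measure: the fraction of records
# that satisfy every business rule. Records and rules are illustrative
# assumptions, not the paper's case-study data.

def rule_conformity(records, rules):
    """Fraction of records that satisfy all business rules."""
    if not records:
        return 1.0
    ok = sum(1 for rec in records if all(rule(rec) for rule in rules))
    return ok / len(records)

orders = [
    {"order_id": "A-1", "quantity": 5, "unit": "pcs"},
    {"order_id": "A-2", "quantity": 0, "unit": "pcs"},   # violates quantity rule
    {"order_id": "A-3", "quantity": 12, "unit": ""},     # violates unit rule
]

rules = [
    lambda r: r["quantity"] > 0,   # quantities must be positive
    lambda r: bool(r["unit"]),     # a unit of measure must be given
]

print(rule_conformity(orders, rules))  # 1 of 3 records conform -> 0.333...
```

Attaching such a metric to the activities of a process model is what lets a re-design project compare alternatives by their data quality impact.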


International Journal of Information Management | 2011

Collaborative management of business metadata

Kai M. Hüner; Boris Otto; Hubert Österle

Legal provisions, cross-company data exchange and intra-company reporting or planning procedures require comprehensively, timely, unambiguously and understandably specified business objects (e.g. materials, customers, and suppliers). On the one hand, this business metadata has to cover miscellaneous regional peculiarities in order to enable business activities anywhere in the world. On the other hand, data structures need to be standardized throughout the entire company in order to be able to perform global spend analysis, for example. In addition, business objects should adapt to new market conditions or regulatory requirements as quickly and consistently as possible. Centrally organized corporate metadata managers (e.g. within a central IT department) are hardly able to meet all these demands. They should be supported by key users from several business divisions and regions, who contribute expert knowledge. However, despite the advantages regarding high metadata quality on a corporate level, a collaborative metadata management approach of this kind has to ensure low effort for knowledge contributors as in most cases these regional or divisional experts do not benefit from metadata quality themselves. Therefore, the paper at hand identifies requirements to be met by a business metadata repository, which is a tool that can effectively support collaborative management of business metadata. In addition, the paper presents the results of an evaluation of these requirements with business experts from various companies and of scenario tests with a wiki-based prototype at the company Bayer CropScience AG. The evaluation shows two things: First, collaboration is a success factor when it comes to establishing effective business metadata management and integrating metadata with enterprise systems, and second, semantic wikis are well suited to realizing business metadata repositories.


Electronic Markets | 2011

Product data quality in supply chains: the case of Beiersdorf

Kai M. Hüner; Andreas Schierning; Boris Otto; Hubert Österle

A number of business requirements (e.g. compliance with regulatory and legal provisions, diffusion of global standards, supply chain integration) are forcing consumer goods manufacturers to increase their efforts to provide product data (e.g. product identifiers, dimensions) at business-to-business interfaces in a timely and accurate manner. The quality of such data is a critical success factor for efficient and effective cross-company collaboration. If compliance-relevant data (e.g. dangerous goods indicators) is missing or false, consumer goods manufacturers risk being fined and seeing their company's image damaged. And if logistics data (e.g. product dimensions, gross weight) is inaccurate or not provided in time, business with key account trading partners is endangered. To be able to manage the risk of business-critical data defects, companies must be able to a) identify such data defects, and b) specify and use metrics that allow them to monitor the data's quality. As scientific research on both these issues has come up with only a few results so far, this case study explores the process of identifying business-critical product data defects at the German consumer goods manufacturing company Beiersdorf AG. Despite advanced data quality management structures, such defects still occur and can result in complaints, service level impairment, and avoidable costs. The case study analyzes product data use and maintenance in Beiersdorf's ecosystem, identifies typical product data defects, and proposes a set of data quality metrics for monitoring those defects.
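A simple example of the kind of monitoring metric such a case study proposes is attribute completeness across product records. The attribute names and sample data below are illustrative assumptions, not Beiersdorf's actual data model.

```python
# Sketch of a data quality metric for monitoring product data:
# completeness of mandatory attributes across a set of product records.
# Attribute names and sample values are hypothetical.

MANDATORY = ["gtin", "gross_weight", "height", "width", "depth"]

def completeness(products, mandatory=MANDATORY):
    """Share of mandatory attribute values that are actually filled."""
    total = len(products) * len(mandatory)
    if total == 0:
        return 1.0
    filled = sum(
        1 for p in products for attr in mandatory
        if p.get(attr) not in (None, "")
    )
    return filled / total

products = [
    {"gtin": "4005800000001", "gross_weight": 0.25,
     "height": 12.0, "width": 5.0, "depth": 5.0},
    {"gtin": "4005800000002", "gross_weight": None,   # weight missing
     "height": 10.0, "width": 4.0, "depth": ""},      # depth missing
]

print(completeness(products))  # 8 of 10 values filled -> 0.8
```

Tracked over time, a metric like this turns anecdotal complaints about missing logistics data into a monitorable indicator.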


Electronic Markets | 2011

Information and data quality in business networking: a key concept for enterprises in its early stages of development

Boris Otto; Yang W. Lee; Ismael Caballero

Information and data of high quality are critical for successful business performance in general and Business Networking in particular. As the trend toward sharing information between business partners and value networks is still increasing, the position paper aims at providing a comprehensive perspective on the state of research with regard to information and data quality in Business Networking. The paper shows that much has been achieved, but that fundamental aspects still remain unaddressed. Based on the results of a literature review, the paper identifies consequential areas of research and makes six propositions for future research. In doing so, the position paper aims at offering novel perspectives and at introducing new areas of research in a field of particularly high relevance in the networked business and electronic markets domain.


Hawaii International Conference on System Sciences | 2009

The Effect of Using a Semantic Wiki for Metadata Management: A Controlled Experiment

Kai M. Hüner; Boris Otto

A coherent and consistent understanding of corporate data is an important factor for effective management of diversified companies and implies a need for companywide unambiguous data definitions. Inspired by the success of Wikipedia, wiki software has become a broadly discussed alternative for corporate metadata management. However, in contrast to the performance and sustainability of wikis in general, benefits of using semantic wikis have not been investigated sufficiently. The paper at hand presents results of a controlled experiment that investigates effects of using a semantic wiki for metadata management in comparison to a classical wiki. Considering threats to validity, the analysis (i.e. 74 subjects using both a classical and a semantic wiki) shows that the semantic wiki is superior to the classical variant regarding information retrieval tasks. At the same time, the results indicate that more effort is needed to build up the semantically annotated wiki content in the semantic wiki.
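The difference the experiment measures can be illustrated with a toy model: a classical wiki only supports full-text search, while a semantic wiki can answer queries over typed annotations. The page contents and annotation keys below are made up for illustration and do not reflect the experiment's materials.

```python
# Toy contrast between retrieval in a classical wiki (full-text search)
# and a semantic wiki (queries over typed annotations). Pages and
# annotations are invented examples.

pages = {
    "Customer": {"text": "A customer buys products.",
                 "annotations": {"type": "business object", "owner": "Sales"}},
    "Material": {"text": "A material is a physical good.",
                 "annotations": {"type": "business object", "owner": "Procurement"}},
    "Sales":    {"text": "Sales sells to customers.",
                 "annotations": {"type": "department"}},
}

def fulltext_search(term):
    """Classical wiki: match pages whose text mentions the term."""
    return sorted(name for name, p in pages.items()
                  if term.lower() in p["text"].lower())

def semantic_query(**criteria):
    """Semantic wiki: match pages whose annotations satisfy all criteria."""
    return sorted(
        name for name, p in pages.items()
        if all(p["annotations"].get(k) == v for k, v in criteria.items())
    )

print(fulltext_search("customer"))             # text match: ['Customer', 'Sales']
print(semantic_query(type="business object"))  # typed query: ['Customer', 'Material']
```

The trade-off the study reports is visible even here: the semantic query is more precise, but someone first had to supply the annotations.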

Collaboration


Dive into Boris Otto's collaboration.

Top Co-Authors

Kai M. Hüner
University of St. Gallen

Verena Ebner
University of St. Gallen

Martin Ofner
University of St. Gallen

Nils Urbach
University of Bayreuth