Lavdim Halilaj
University of Bonn
Publications
Featured research published by Lavdim Halilaj.
IEEE International Conference on Semantic Computing | 2016
Irlán Grangel-González; Lavdim Halilaj; Gökhan Coskun; Sören Auer; Diego Collarana; Michael Hoffmeister
In the engineering and manufacturing domain, there is currently a sense of departure into a new era of digitized production. In different regions, initiatives in this direction are known under different names, such as Industrie du Futur in France, Industrial Internet in the US, or Industrie 4.0 in Germany. While the vision of digitizing production and manufacturing has gained much traction lately, it is still relatively unclear how this vision can actually be implemented with concrete standards and technologies. Within the German Industry 4.0 initiative, the concept of an Administrative Shell was devised to respond to these requirements. The Administrative Shell is planned to provide a digital representation of all information available about and from an object, which can be a hardware system or a software platform. In this paper, we present an approach to develop such a digital representation based on semantic knowledge representation formalisms such as RDF, RDF Schema and OWL. We present our concept of a Semantic I4.0 Component which addresses the communication and comprehension challenges in Industry 4.0 scenarios using semantic technologies. Our approach is illustrated with a concrete example showing its benefits in a real-world use case.
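As a rough illustration of the idea, a semantic component description boils down to a set of subject–predicate–object triples. The sketch below is schematic: the `ex:`/`i40:` namespaces and property names are invented for illustration, and a real implementation would use an RDF library such as rdflib rather than plain tuples.

```python
# Schematic sketch: describing an Industry 4.0 component as RDF-style
# (subject, predicate, object) triples. Namespaces and property names
# are hypothetical, not the paper's actual vocabulary.

RDF_TYPE = "rdf:type"

def describe_component(comp_id, comp_type, properties):
    """Return a list of triples describing one I4.0 component."""
    subject = f"ex:{comp_id}"
    triples = [(subject, RDF_TYPE, f"i40:{comp_type}")]
    for prop, value in sorted(properties.items()):
        triples.append((subject, f"i40:{prop}", value))
    return triples

# A motor described by two (invented) properties:
triples = describe_component(
    "motor1", "Component",
    {"hasManufacturer": "ex:ACME", "hasSerialNumber": '"SN-0042"'},
)
```

Because the description is just a graph of triples, other systems can merge it with further views of the same object without a fixed schema.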
Knowledge Acquisition, Modeling and Management | 2016
Lavdim Halilaj; Niklas Petersen; Irlán Grangel-González; Christoph Lange; Sören Auer; Gökhan Coskun; Steffen Lohmann
Vocabularies are increasingly being developed on platforms for hosting version-controlled repositories, such as GitHub. However, these platforms lack important features that have proven useful in vocabulary development. We present VoCol, an integrated environment that supports the development of vocabularies using Version Control Systems. VoCol is based on a fundamental model of vocabulary development, consisting of three core activities: modeling, population, and testing. We implemented VoCol using a loose coupling of validation, querying, analytics, visualization, and documentation generation components on top of a standard Git repository. All components, including the version-controlled repository, can be configured and replaced with little effort to cater for various use cases. We demonstrate the applicability of VoCol with a real-world example and report on a user study that confirms its usability and usefulness.
IEEE International Conference on Semantic Computing | 2016
Lavdim Halilaj; Irlán Grangel-González; Gökhan Coskun; Sören Auer
Collaborative vocabulary development in the context of data integration is the process of finding consensus between the experts of the different systems and domains. The complexity of this process increases with the number of people involved, the variety of the systems to be integrated and the dynamics of their domain. In this paper we advocate that a powerful version control system is at the heart of the problem. Driven by this idea and the success of Git in the context of software development, we investigate the applicability of Git for collaborative vocabulary development. Even though vocabulary development and software development have more similarities than differences, there are still important differences that need to be considered in the development of a successful versioning and collaboration system for vocabulary development. Therefore, this paper starts by presenting the challenges we faced during the collaborative creation of vocabularies and discusses how it differs from software development. Based on these insights we propose Git4Voc, which comprises guidelines on how Git can be adopted for vocabulary development. Finally, we demonstrate how Git hooks can be implemented to go beyond the plain functionality of Git by realizing vocabulary-specific features such as syntactic validation and semantic diffs.
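The syntactic-validation hook mentioned above can be pictured as a pre-commit script that rejects a commit when a staged vocabulary file fails to parse. This is a minimal sketch, not Git4Voc's actual implementation: the choice of the external `rapper` parser (from the Raptor toolkit) is an assumption, and any RDF syntax checker could be substituted.

```python
# Sketch of a Git pre-commit hook that validates staged vocabulary files.
# Save as .git/hooks/pre-commit and call main(); a non-zero exit aborts
# the commit. The 'rapper' validator is an assumption, not Git4Voc's
# documented tool chain.
import subprocess
import sys

def is_vocabulary_file(name):
    """Heuristic: treat common RDF file extensions as vocabulary files."""
    return name.endswith((".ttl", ".rdf", ".owl"))

def staged_vocabulary_files():
    """List staged vocabulary files (requires a Git working tree)."""
    out = subprocess.run(["git", "diff", "--cached", "--name-only"],
                         capture_output=True, text=True, check=True).stdout
    return [f for f in out.splitlines() if is_vocabulary_file(f)]

def validate(path):
    """Return True if the file parses as Turtle ('-c' only counts triples)."""
    return subprocess.run(["rapper", "-i", "turtle", "-c", path],
                          capture_output=True).returncode == 0

def main():
    failed = [f for f in staged_vocabulary_files() if not validate(f)]
    if failed:
        print("RDF syntax errors in:", ", ".join(failed), file=sys.stderr)
        sys.exit(1)  # abort the commit
```

A semantic-diff hook would follow the same pattern, comparing parsed graphs instead of text.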
Emerging Technologies and Factory Automation | 2016
Irlán Grangel-González; Lavdim Halilaj; Sören Auer; Steffen Lohmann; Christoph Lange; Diego Collarana
Industry 4.0 is a global endeavor of automation and data exchange to create smart factories maximizing production capabilities and allowing for new business models. The Reference Architecture Model for Industry 4.0 (RAMI 4.0) describes the core aspects of Industry 4.0 and defines Administration Shells as digital representations of Industry 4.0 components. In this paper, we present an approach to model and implement Industry 4.0 components with the Resource Description Framework (RDF). The approach addresses the challenges of interoperable communication and machine comprehension in Industry 4.0 settings using semantic technologies. We show how related standards and vocabularies, such as IEC 62264, eCl@ss, and the Ontology of Units of Measure (OM), can be utilized along with the RDF-based representation of the RAMI 4.0 concepts. Finally, we demonstrate the applicability and benefits of the approach using an example from a real-world use case.
Knowledge Acquisition, Modeling and Management | 2016
Irlán Grangel-González; Diego Collarana; Lavdim Halilaj; Steffen Lohmann; Christoph Lange; Maria-Esther Vidal; Sören Auer
Industry 4.0 standards, such as AutomationML, are used to specify properties of mechatronic elements in terms of views, such as electrical and mechanical views of a motor engine. These views have to be integrated in order to obtain a complete model of the artifact. Currently, the integration requires user knowledge to manually identify elements in the views that refer to the same element in the integrated model. Existing approaches are not able to scale up to large models where a potentially large number of conflicts may exist across the different views of an element. To overcome this limitation, we developed Alligator, a deductive rule-based system able to identify conflicts between AutomationML documents. We define a Datalog-based representation of the AutomationML input documents, and a set of rules for identifying conflicts. A deductive engine is used to resolve the conflicts, to merge the input documents and produce an integrated AutomationML document. Our empirical evaluation of the quality of Alligator against a benchmark of AutomationML documents suggests that Alligator accurately identifies various types of conflicts between AutomationML documents, and thus helps increase the scalability, efficiency, and coherence of models for Industry 4.0 manufacturing environments.
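The core rule idea — flag a conflict when two views describe the same element but assign different values to the same attribute — can be sketched in a few lines. This is an illustrative simplification in plain Python, not Alligator's actual Datalog rule set:

```python
# Illustrative conflict rule between two views of the same model.
# Views are modeled as {element: {attribute: value}} dictionaries;
# element and attribute names are invented for the example.

def find_conflicts(view_a, view_b):
    """Yield (element, attribute, value_a, value_b) for each clash."""
    conflicts = []
    for element, attrs_a in view_a.items():
        attrs_b = view_b.get(element, {})
        for attr, value_a in attrs_a.items():
            if attr in attrs_b and attrs_b[attr] != value_a:
                conflicts.append((element, attr, value_a, attrs_b[attr]))
    return conflicts

# Two views of the same motor disagree on its weight:
electrical = {"Motor1": {"voltage": "400V", "weight": "12kg"}}
mechanical = {"Motor1": {"weight": "12.5kg", "torque": "30Nm"}}
conflicts = find_conflicts(electrical, mechanical)
```

A merge step would then resolve the detected conflicts before emitting the integrated document; in the real system this resolution is handled by the deductive engine.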
International Journal of Semantic Computing | 2016
Lavdim Halilaj; Irlán Grangel-González; Gökhan Coskun; Steffen Lohmann; Sören Auer
Collaborative vocabulary development in the context of data integration is the process of finding consensus between experts with different backgrounds, system understanding and domain knowledge. The complexity of this process increases with the number of people involved, the variety of the systems to be integrated and the dynamics of their domain. In this paper, we advocate that the usage of a powerful version control system is one of the keys to addressing this problem. Driven by this idea and the success of the version control system Git in the context of software development, we investigate the applicability of Git for collaborative vocabulary development. Even though vocabulary development and software development have more similarities than differences, there are still important challenges that need to be considered in the development of a successful versioning and collaboration system for vocabulary development. Therefore, this paper starts by presenting the challenges we faced during the collaborative creation of vocabularies and discusses how it differs from software development. Drawing from these findings, we present Git4Voc, which comprises guidelines on how Git can be adopted for vocabulary development. Finally, we demonstrate how Git hooks can be implemented to go beyond the plain functionality of Git by realizing vocabulary-specific features such as syntactic validation and semantic diffs.
International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management | 2015
Irlán Grangel-González; Lavdim Halilaj; Gökhan Coskun; Sören Auer
A major bottleneck for a wider deployment and use of ontologies and knowledge engineering techniques is the lack of established conventions, along with cumbersome and inefficient support for vocabulary and ontology authoring. We argue that the pragmatic development-by-convention paradigm, well accepted within software engineering, can be successfully applied to ontology engineering, too. However, the definition of a valid set of conventions requires broadly accepted best practices. In this regard, we empirically analyzed a number of popular vocabularies and ontology development efforts with respect to their use of guidelines and common practices. Based on this analysis, we identified the following main aspects of common practices: documentation, internationalization, naming, structure, reuse, validation and authoring. In this paper, these aspects are presented and discussed in detail. We propose a set of practices for each aspect and evaluate their relevance in a study with vocabulary developers. The overall goal is to pave the way for a new paradigm of vocabulary development similar to Software Development by Convention, which we name Vocabulary Development by Convention.
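To give a flavor of how such conventions can be checked mechanically, here is a small sketch for the naming aspect, assuming the widespread convention of UpperCamelCase class names and lowerCamelCase property names. The regexes are a simplification for illustration, not the paper's proposed rule set:

```python
import re

# Simplified naming-convention check: classes in UpperCamelCase,
# properties in lowerCamelCase. Both patterns are illustrative.
CLASS_RE = re.compile(r"^[A-Z][A-Za-z0-9]*$")
PROPERTY_RE = re.compile(r"^[a-z][A-Za-z0-9]*$")

def check_naming(classes, properties):
    """Return a list of human-readable naming violations."""
    issues = []
    issues += [f"class '{c}' not UpperCamelCase"
               for c in classes if not CLASS_RE.match(c)]
    issues += [f"property '{p}' not lowerCamelCase"
               for p in properties if not PROPERTY_RE.match(p)]
    return issues
```

Checks of this kind could run automatically during authoring, turning a written guideline into an enforced convention.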
International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management | 2016
Lavdim Halilaj; Irlán Grangel-González; Maria-Esther Vidal; Steffen Lohmann; Sören Auer
A Version Control System (VCS) is usually required for successful ontology development in distributed settings. VCSs enable the tracking and propagation of ontology changes, as well as collecting metadata to describe changes, e.g., who made a change at which point in time. Modern VCSs implement an optimistic approach that allows for simultaneous changes of the same artifact and provides mechanisms for automatic as well as manual conflict resolution. However, different ontology development tools serialize the ontology artifacts in different ways. As a consequence, existing VCSs may identify a huge number of false-positive conflicts during the merging process, i.e., conflicts that do not result from ontology changes but from the fact that two ontology versions are serialized differently. Following the principle that prevention is better than cure, we designed SerVCS, an approach that enhances VCSs to cope with different serializations of the same ontology. SerVCS is based on a unique serialization of ontologies to reduce the number of false-positive conflicts produced whenever different serializations of the same ontology are compared. We implemented SerVCS on top of Git, utilizing tools such as Rapper and Rdf-toolkit for syntax validation and unique serialization, respectively. We have conducted an empirical evaluation to determine the conflict detection accuracy of SerVCS whenever simultaneous changes to an ontology are performed using different ontology editors. The evaluation results suggest that SerVCS empowers VCSs by preventing them from wrongly identifying serialization-related conflicts.
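The underlying idea — a canonical serialization makes textual diffs agree whenever the graphs agree — can be illustrated with plain tuples. The actual system canonicalizes real RDF serializations via tools like Rdf-toolkit; this sketch only shows the principle:

```python
# Principle behind unique serialization: render the same graph in one
# fixed order so editor-specific triple orderings produce no textual diff.
# Triples are modeled as plain (s, p, o) tuples for illustration only.

def canonical_serialization(triples):
    """Render triples one per line, in a fixed sort order."""
    return "\n".join(f"{s} {p} {o} ." for s, p, o in sorted(triples))

# Two editors emit the same graph in different orders...
editor_a = [("ex:A", "rdf:type", "owl:Class"), ("ex:A", "rdfs:label", '"A"')]
editor_b = [("ex:A", "rdfs:label", '"A"'), ("ex:A", "rdf:type", "owl:Class")]

# ...but their canonical forms are byte-identical, so a line-based VCS
# reports no spurious conflict.
assert canonical_serialization(editor_a) == canonical_serialization(editor_b)
```

Hooking such a canonicalization step into Git (e.g., before staging) is what keeps the VCS's line-based merge machinery usable for ontologies.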
International Conference on Web Engineering | 2018
Fathoni A. Musyaffa; Lavdim Halilaj; Yakun Li; Fabrizio Orlandi; Hajira Jabeen; Sören Auer; Maria-Esther Vidal
Budget and spending data are among the most published Open Data datasets on the Web, and they are continuously increasing in volume. These datasets tend to be published as large tabular files – without predefined standards – and require complex domain and technical expertise to be used in real-world scenarios. Therefore, the potential benefits of having these datasets open and publicly available are hindered by their complexity and heterogeneity. Linked Data principles can facilitate the integration, analysis and usage of these datasets. In this paper, we present OpenBudgets.eu (OBEU), a Linked Data-based platform supporting the entire open data life-cycle of budget and spending datasets: from data creation to publishing and exploration. The platform is based on a set of requirements specifically collected by experts in the budget and spending data domain. It follows a micro-services architecture that easily integrates many different software modules and tools for analysis, visualization and transformation of data. Data is represented according to a logical model for open fiscal data, which is translated into both RDF and tabular data formats. We demonstrate the validity of the implemented OBEU platform in real application scenarios and report on a user study conducted to confirm its usability.
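The translation from the tabular to the RDF representation can be pictured as a per-row mapping from columns to properties. The column names and `obeu:` properties below are illustrative inventions, not the platform's actual vocabulary:

```python
# Sketch: mapping one tabular budget row to RDF-style triples.
# Column names and "obeu:" properties are hypothetical examples.

def row_to_triples(row, row_id):
    """Map one spreadsheet row to (subject, predicate, object) triples."""
    subject = f"ex:observation/{row_id}"
    mapping = {  # tabular column -> illustrative RDF property
        "amount": "obeu:amount",
        "year": "obeu:fiscalYear",
        "category": "obeu:classification",
    }
    return [(subject, mapping[col], value)
            for col, value in row.items() if col in mapping]

row = {"amount": "1500.00", "year": "2016", "category": "education"}
triples = row_to_triples(row, 1)
```

Keeping the mapping declarative (one dictionary per dataset) is what lets a single logical model feed both the RDF and the tabular output formats.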
Database and Expert Systems Applications | 2018
Irlán Grangel-González; Lavdim Halilaj; Maria-Esther Vidal; Omar Rana; Steffen Lohmann; Sören Auer; Andreas W. Müller
Cyber-Physical Systems (CPSs) are engineered systems that result from the integration of both physical and computational components designed from different engineering perspectives (e.g., mechanical, electrical, and software). Standards related to Smart Manufacturing (e.g., AutomationML) are used to describe CPS components, as well as to facilitate their integration. Albeit expressive, smart manufacturing standards allow for the representation of the same features in various ways, thus hampering a fully integrated description of a CPS component. We tackle this integration problem of CPS components and propose an approach that captures the knowledge encoded in smart manufacturing standards to effectively describe CPSs. We devise SemCPS, a framework able to combine Probabilistic Soft Logic and Knowledge Graphs to semantically describe both a CPS and its components. We have empirically evaluated SemCPS on a benchmark of AutomationML documents describing CPS components from various perspectives. Results suggest that SemCPS enables not only the semantic integration of the descriptions of CPS components, but also allows for preserving the individual characterization of these components.