Xixi Lu
Eindhoven University of Technology
Publication
Featured research published by Xixi Lu.
business process management | 2014
Xixi Lu; Dirk Fahland; Wil M.P. van der Aalst
Conformance checking is becoming more important for the analysis of business processes. While the diagnosed results of conformance checking techniques are used in diverse contexts, such as enabling auditing and performance analysis, the quality and reliability of the conformance checking techniques themselves have not been analyzed rigorously. As the existing conformance checking techniques heavily rely on the total ordering of events, their diagnostics are unreliable and often even misleading when the timestamps of events are coarse or incorrect. This paper presents an approach to incorporate flexibility, uncertainty, concurrency, and explicit orderings between events in the input as well as in the output of conformance checking, using partially ordered traces and partially ordered alignments, respectively. The paper also illustrates various ways to acquire partially ordered traces from existing logs. In addition, a quantitative quality metric is introduced to objectively compare the results of conformance checking. The approach is implemented in ProM plugins and has been evaluated using artificial logs.
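The abstract proposes replacing totally ordered traces with partially ordered ones when timestamps are coarse. A minimal sketch (not the paper's implementation) of how such a partial order could be derived: events with strictly increasing timestamps are ordered, while events sharing the same coarse timestamp remain concurrent.

```python
from itertools import combinations

def build_partial_order(events):
    """Build ordering edges between events: only pairs with strictly
    increasing (coarse) timestamps are ordered; events sharing a
    timestamp stay concurrent. Events are (activity, timestamp) pairs."""
    edges = set()
    for (i, (_, t1)), (j, (_, t2)) in combinations(enumerate(events), 2):
        if t1 < t2:
            edges.add((i, j))
        elif t2 < t1:
            edges.add((j, i))
    return edges

# Two events recorded on the same day (timestamp 2) stay unordered:
trace = [("register", 1), ("blood test", 2), ("x-ray", 2), ("consult", 3)]
order = build_partial_order(trace)
# neither ("blood test" before "x-ray") nor the reverse appears in `order`
```

With this representation, an alignment algorithm can treat "blood test" and "x-ray" as concurrent instead of committing to an arbitrary, possibly wrong, sequential order.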
emerging technologies and factory automation | 2014
Xixi Lu; Ronny S. Mans; Dirk Fahland; Wil M.P. van der Aalst
There is continuous pressure to make healthcare processes more efficient and effective without sacrificing quality. Conformance checking can be used to improve processes by analyzing event data and directly relating observed behavior to modeled behavior. Conformance checking provides diagnostics that go far beyond measuring traditional key performance indicators. However, current conformance checking techniques focus on a rather simplistic setting where executions of process instances are sequential and homogeneous, whereas healthcare processes are known to be dynamic, complex, and ad hoc. In healthcare, process instances of patients often follow a unique path through the process with one-of-a-kind deviations. Moreover, timestamps are often rather coarse (the date is known, but not the time), resulting in an unreliable ordering of events. As current techniques are unable to handle concurrent events, and the obtained sequential alignments are unable to provide structural information about deviations, the diagnostics provided are often insufficient and misleading. This paper presents a novel approach using partially ordered traces and partially ordered alignments, which aims to incorporate unreliability and concurrency in the input while providing diagnostics about deviations that take causalities into account. The approach has been implemented in ProM and was evaluated using event data from a Dutch hospital.
conference on advanced information systems engineering | 2015
Maikel L. van Eck; Xixi Lu; Sander J.J. Leemans; Wil M.P. van der Aalst
Process mining aims to transform event data recorded in information systems into knowledge of an organisation's business processes. The results of process mining analysis can be used to improve process performance or compliance with rules and regulations. However, applying process mining in practice is not trivial. In this paper we introduce PM², a methodology to guide the execution of process mining projects. We successfully applied PM² during a case study within IBM, a multinational technology corporation, where we identified potential process improvements for one of their purchasing processes.
business process management | 2015
Xixi Lu; Dirk Fahland; Frank J.H.M. van den Biggelaar; Wil M.P. van der Aalst
Deviation detection comprises techniques that identify deviations from normative processes in real process executions. These diagnostics are used to derive recommendations for improving business processes. Existing detection techniques identify deviations either only on the process instance level or rely on a normative process model to locate deviating behavior on the event level. However, when normative models are not available, these techniques detect deviations against a less accurate model discovered from the actual behavior, resulting in incorrect diagnostics. In this paper, we propose a novel approach to detect deviations on the event level by identifying frequent common behavior and uncommon behavior among executed process instances, without discovering any normative model. The approach is implemented in ProM and was evaluated in a controlled setting with artificial logs and real-life logs. We compare our approach to existing approaches to investigate its possibilities and limitations. We show that in some cases it is possible to detect deviating events without a model as accurately as against a given precise normative model.
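The idea of flagging deviations without a normative model can be illustrated with a much simplified frequency heuristic (an assumption for illustration, not the paper's algorithm): an event is suspicious when its context, here just the directly preceding activity, is rare relative to how often the activity occurs overall.

```python
from collections import Counter

def flag_deviating_events(traces, threshold=0.2):
    """Flag events whose (predecessor, activity) context is rare across
    all process instances, relative to the activity's total frequency.
    A toy stand-in for model-less, event-level deviation detection."""
    context_counts = Counter()
    activity_counts = Counter()
    for trace in traces:
        for i, act in enumerate(trace):
            pred = trace[i - 1] if i > 0 else "START"
            context_counts[(pred, act)] += 1
            activity_counts[act] += 1
    deviations = []
    for t_idx, trace in enumerate(traces):
        for i, act in enumerate(trace):
            pred = trace[i - 1] if i > 0 else "START"
            if context_counts[(pred, act)] / activity_counts[act] < threshold:
                deviations.append((t_idx, i, act))
    return deviations

# Nine conforming instances and one with swapped activities:
log = [["a", "b", "c"]] * 9 + [["a", "c", "b"]]
flags = flag_deviating_events(log)
# the two swapped events in the tenth trace are flagged
```

Because the majority behavior itself serves as the reference, no discovered or hand-made normative model is needed, mirroring the premise of the abstract.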
business process management | 2016
Xixi Lu; Dirk Fahland; Frank J.H.M. van den Biggelaar; Wil M.P. van der Aalst
Processes may require executing the same activity in different stages of the process. A human modeler can express this by creating two different task nodes labeled with the same activity name (thus duplicating the task). However, as events in an event log are often labeled with the activity name only, discovery algorithms that derive tasks based on labels alone cannot discover models with duplicate labels, rendering the results imprecise. For example, for a log where "payment" events occur at the beginning and the end of a process, a modeler would create two different "payment" tasks, whereas a discovery algorithm introduces a loop around a single "payment" task. In this paper, we present a general approach for refining labels of events based on their context in the event log as a preprocessing step. The refined log can be input for any discovery algorithm. The approach is implemented in ProM and was evaluated in a controlled setting. We were able to improve the quality of up to 42% of the models compared to using a log with imprecise labeling using default parameters, and up to 87% using adaptive parameters. Moreover, using our refinement approach significantly increased the similarity of the discovered model to the original process with duplicate labels, allowing for better rediscoverability. We also report on a case study conducted for a Dutch hospital.
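The "payment" example above can be made concrete with a minimal sketch of context-based label refinement (a deliberately simplified stand-in for the paper's approach, using only the directly preceding activity as the context key):

```python
from collections import Counter

def refine_duplicate_labels(traces):
    """Refine labels of activities that occur more than once within a
    trace, suffixing them with the directly preceding activity. The
    refined log can then be fed to any discovery algorithm."""
    # activities that repeat inside at least one trace need refinement
    repeated = {a for t in traces for a, c in Counter(t).items() if c > 1}
    refined = []
    for trace in traces:
        new = []
        for i, act in enumerate(trace):
            if act in repeated:
                pred = trace[i - 1] if i > 0 else "START"
                new.append(f"{act}|after {pred}")
            else:
                new.append(act)
        refined.append(new)
    return refined

log = [["payment", "order", "ship", "payment"]]
# the initial and final payments receive distinct labels, so a
# discovery algorithm can create two "payment" tasks instead of a loop
```

After refinement, the two payment events carry different labels, which is exactly what allows a label-based discovery algorithm to produce duplicate tasks rather than an imprecise loop.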
Information Processing Letters | 2018
Niek Tax; Xixi Lu; Natalia Sidorova; Dirk Fahland; Wil M.P. van der Aalst
In process mining, precision measures are used to quantify how much a process model over-approximates the behavior seen in an event log. Although several measures have been proposed throughout the years, no research has been done to validate whether these measures achieve the intended aim of quantifying over-approximation in a consistent way for all models and logs. This paper fills this gap by postulating a number of axioms for quantifying precision consistently for any log and any model. Further, we show through counter-examples that none of the existing measures quantifies precision consistently.
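To make the notion of over-approximation tangible, here is a toy precision measure (an illustration loosely in the spirit of escaping-edges precision, not one of the measures the paper examines): the fraction of activity continuations the model allows, after prefixes seen in the log, that the log also exhibits.

```python
def simple_precision(log_traces, model_traces):
    """Naive precision: of the continuations the model permits after
    log prefixes, how many does the log actually contain? The model
    is given here, for simplicity, as its set of allowed traces."""
    def continuations(traces):
        cont = {}
        for t in traces:
            for i in range(len(t)):
                cont.setdefault(tuple(t[:i]), set()).add(t[i])
        return cont
    log_c = continuations(log_traces)
    model_c = continuations(model_traces)
    allowed = observed = 0
    for prefix, log_next in log_c.items():
        model_next = model_c.get(prefix, set())
        allowed += len(model_next)
        observed += len(model_next & log_next)
    return observed / allowed if allowed else 1.0

log = [["a", "b", "c"]]
# a "flower"-like model allowing every ordering over-approximates the log:
flower = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"],
          ["b", "c", "a"], ["c", "a", "b"], ["c", "b", "a"]]
```

One axiom in the paper's spirit is easy to check on this toy measure: a model allowing exactly the log's behavior should score 1, while the permissive flower-like model scores strictly lower.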
Science & Engineering Faculty | 2015
Maikel L. van Eck; Xixi Lu; Sander J.J. Leemans; Wil M.P. van der Aalst
Computers & Security | 2018
Mahdi Alizadeh; Xixi Lu; Dirk Fahland; Nicola Zannone; Wil M.P. van der Aalst
BPM reports | 2015
Xixi Lu; Marijn Q.L. Nagelkerke; D. van de Wiel; Dirk Fahland
ZEUS | 2017
Xixi Lu; Dirk Fahland