Propositional Dynamic Logic with Converse and Repeat for Message-Passing Systems
ROY MENNICKE
Ilmenau University of Technology, Germany
e-mail address: [email protected]
Abstract.
The model checking problem for propositional dynamic logic (PDL) over message sequence charts (MSCs) and communicating finite-state machines (CFMs) asks, given a channel bound B, a PDL formula ϕ, and a CFM C, whether every existentially B-bounded MSC M accepted by C satisfies ϕ. Recently, it was shown that this problem is PSPACE-complete. In the present work, we consider CRPDL over MSCs, which is PDL equipped with the operators converse and repeat. The former enables one to walk back and forth within an MSC using a single path expression, whereas the latter allows one to express that a path expression can be repeated infinitely often. To solve the model checking problem for this logic, we define message sequence chart automata (MSCAs), which are multi-way alternating parity automata walking on MSCs. By exploiting a new concept called concatenation states, we are able to inductively construct, for every CRPDL formula ϕ, an MSCA precisely accepting the set of models of ϕ. As a result, we obtain that the model checking problem for CRPDL and CFMs is still in PSPACE.

1. Introduction
Automatic verification is the process of translating a computer system into a mathematical model, formulating a requirements specification in a formal language, and automatically checking the obtained model against this specification. In the past, finite automata, Kripke structures, and Büchi automata turned out to be suitable formalisms to model the behavior of complex non-parallel systems. Two of the most common specification languages are the temporal logics LTL [21] and CTL [3]. After deciding on a modeling and a specification formalism, automatic verification boils down to the model checking problem: given a model A with behavior L(A) and a specification ϕ representing the expected behavior L(ϕ), does L(A) ⊆ L(ϕ) hold?

Distributed systems exchanging messages can be modeled by communicating finite-state machines (CFMs), which were introduced in [2]. A CFM consists of a finite number of finite automata communicating via FIFO channels. Each run of such a machine can be understood as a message sequence chart (MSC). The latter is an established ITU standard and comes with a formal definition as well as a convenient graphical notation. In a simplified

[Theory of computation]: Logic—Verification by model checking.
Key words and phrases: message sequence charts, alternating automata, communicating finite-state machines, propositional dynamic logic.
LOGICAL METHODS IN COMPUTER SCIENCE
DOI:10.2168/LMCS-9(2:12)2013
© Roy Mennicke, CC Creative Commons
model, an MSC can be considered as a structure consisting of send and receive events which are assigned to unique processes, where the events of each process are linearly ordered. For every send event, there exists a matching receive event and vice versa. Unfortunately, the model checking problem for CFMs is undecidable even for very simple temporal logics; this is a direct consequence of the undecidability of the emptiness problem for CFMs. One solution to this problem is to establish a bound B on the number of messages pending on a channel. The bounded model checking problem of CFMs then reads as follows: given a channel bound B, a specification ϕ, and a CFM C, does every existentially B-bounded MSC M accepted by C satisfy ϕ? An existentially B-bounded MSC is an MSC which admits an execution with B-bounded channels. Using this approach, several results for different temporal logics were obtained in [16, 10, 9, 1].

In [1], a bidirectional propositional dynamic logic (PDL) was proposed for the automatic verification of distributed systems modeled by CFMs. This logic was originally introduced by Fischer and Ladner [5] for Kripke structures and allows one to express fundamental properties in an easy and intuitive manner. PDL for MSCs is closed under negation, it is a proper fragment of existential monadic second-order logic (EMSO) in terms of expressiveness (but it is no syntactic fragment) [1], and the logic TLC⁻ considered by Peled [20] is a fragment of it. PDL distinguishes between local and global formulas. The former are evaluated at a specific event of an MSC, whereas the latter are Boolean combinations of local formulas quantifying existentially over all events of an MSC. Consider, for example, the local formula α = p!q ∧ ¬⟨proc*⟩p?q. An event satisfies α if it is a send event of a message from process p to q which is not followed by a reply message from q to p.
The global formula Eα expresses that there exists such an event v. By a rather involved translation of PDL formulas into CFMs, Bollig, Kuske, and Meinecke demonstrated in [1] that the bounded model checking problem for CFMs and PDL can be decided in polynomial space. However, by means of this approach, Bollig et al. were not able to support the popular converse operator. The latter, introduced in [22], is an extension of PDL which allows one to walk back and forth within an MSC using a single path expression of PDL. For example, one can specify a path expression (proc⁻; msg)* describing "zigzag-like" paths going back on a process and traversing a send event in an alternating manner. It is an open question whether PDL formulas enriched with the converse operator can be translated into CFMs. Bollig et al. only managed to provide an operator which enables path expressions to either walk backward or forward.

In the present work, we consider CRPDL over MSCs, which is PDL equipped with the operators converse (⁻) and repeat (ω) [23]. The latter allows one to express that a path expression can be repeated infinitely often. For example, an event v on process p satisfies ⟨proc⟩ω if there are infinitely many events on p succeeding v. We are able to demonstrate that the bounded model checking problem of CFMs and CRPDL is in PSPACE and thereby generalize the model checking result from [1]. In order to obtain this result, we define multi-way alternating parity automata over MSCs which we call local message sequence chart automata (or local MSCAs for short). Local MSCAs are started at specific events of an MSC and accept sets of pointed MSCs, which are pairs of an MSC M and an event v of M. Using a game-theoretic approach, it can be shown that local MSCAs are closed under complementation.
We demonstrate that every local formula α of CRPDL can be translated in polynomial space into an equivalent local MSCA whose size is linear in the size of α; this can be done independently of any channel bound. We also define global MSCAs consisting of a local MSCA M and a set of global initial states. A global initial state is a tuple of states (ι_1, ι_2, . . . , ι_n) where n is the number of processes. If, for every process p, there exists an accepting run of M starting in the minimal event of p and the initial state ι_p, then the global MSCA accepts the whole MSC. For every global formula ϕ, we can construct in polynomial space a global MSCA G such that G precisely accepts the set of models of ϕ. After fixing a channel bound B, the automaton G is then transformed into a two-way alternating word automaton and, after that, into a Büchi automaton recognizing the set of all B-bounded linearizations of the models of ϕ.

In the literature, one basically finds two types of approaches to turn a temporal formula into a Büchi automaton. On the one hand, Vardi and others [25, 8] transformed LTL formulas into alternating automata in one single step and, afterwards, these alternating automata were translated into Büchi automata. On the other hand, inductive constructions were performed which lead to a Büchi automaton without the need for an intermediate step [13, 6, 7, 1]. In the present work, we combine these two approaches to obtain a very modular and easy-to-understand proof. For a given CRPDL formula, we inductively construct an alternating automaton which is later translated into a Büchi automaton. In this process, we utilize a new concept called concatenation states. These special states allow the concatenation of local MSCAs. For example, if M is the local MSCA obtained for the formula ⟨proc⟩tt, then we can concatenate two copies of M to obtain an automaton for the formula ⟨proc; proc⟩tt.

Outline.
We proceed as follows. In Sect. 2, we define MSCs, CRPDL, and MSCAs, and give introductory examples. In Sect. 3, we show that local MSCAs are effectively closed under complementation. In Sect. 4, we construct, for every local CRPDL formula α, a local MSCA which precisely accepts the models of α. In Sect. 5, we show that, for every global CRPDL formula ϕ, the set of models of ϕ is effectively the language of a global MSCA. In Sections 6 and 7, we prove that the bounded satisfiability problem for CRPDL and the bounded model checking problem for CRPDL and CFMs are both PSPACE-complete. A conference version of this paper was published as [18].

Acknowledgements.
The author would like to express his sincere thanks to his doctoral adviser Dietrich Kuske for his guidance and valuable advice. Furthermore, he is grateful to Benedikt Bollig for comments leading to a considerable technical simplification. This paper also greatly benefited from the detailed reviews and helpful remarks of the anonymous referees.

2. Preliminaries
We let poly(n) denote the set of polynomial functions in one argument. For every natural number n ≥ 1, we set [n] = {1, 2, . . . , n}.

We fix a finite set P = {1, 2, . . . , |P|} of processes. Let Ch = {(p, q) ∈ P × P | p ≠ q} denote the set of communication channels. For all p ∈ P, we define a local alphabet Σ_p = {p!q, p?q | q ∈ P \ {p}} which we use in the following way. An event labelled by p!q marks the send event of a message from process p to process q, whereas p?q is the label of a receive event of a message sent from q to p. We set Σ = ⋃_{p∈P} Σ_p. Since P is finite, the local alphabets Σ_p and Σ are also finite.
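As a quick illustration, the alphabets Σ_p and Σ (and the channel set Ch) can be generated mechanically; the string encoding of labels below is our own choice, not notation from the paper:

```python
# Illustrative only: Sigma_p, Sigma, and Ch for a concrete process set P,
# with labels p!q / p?q encoded as plain strings (our own encoding).

def local_alphabet(processes, p):
    """Sigma_p: the send and receive labels of process p."""
    return {f"{p}!{q}" for q in processes if q != p} | \
           {f"{p}?{q}" for q in processes if q != p}

def alphabet(processes):
    """Sigma: the union of all local alphabets."""
    return set().union(*(local_alphabet(processes, p) for p in processes))

P = {1, 2, 3}
Ch = {(p, q) for p in P for q in P if p != q}   # communication channels
assert len(Ch) == 6                             # |P| * (|P| - 1)
assert len(alphabet(P)) == 12                   # 2 * |P| * (|P| - 1)
```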
Figure 1: An example of a finite MSC.

2.1. Message Sequence Charts.
Message sequence charts model the behavior of a finite set of parallel processes communicating via FIFO channels. The following example shows that they come with a convenient graphical representation.
Example 2.1.
Figure 1 shows a finite MSC M over the set of processes P = {1, 2}.

In the graphical representation of an MSC M over the set of processes P, there is a vertical axis for every process from P. On the axis for process p ∈ P, the events occurring on p are drawn as small black circles. Thus, a linear ordering ⪯_p^M on the set of events from process p is implicitly defined. In the formal definition of an MSC, which we give in the following, the direct successor relation induced by the linear ordering ⪯_p^M is given by proc_p^M. For technical convenience, we force processes to contain at least one event. Messages sent between two processes are depicted by arrows pointing from the send event to the matching receive event. Formally, messages are represented by the binary relation msg^M and, for every send event, there exists a matching receive event and vice versa.

Definition 2.2. A message sequence chart (MSC) is a structure M = (V^M, (proc_p^M)_{p∈P}, msg^M, λ^M) where
• V^M is a set of events,
• proc_p^M, msg^M ⊆ V^M × V^M for all p ∈ P,
• λ^M : V^M → Σ is a labeling function,
• for all p ∈ P, the relation proc_p^M is the direct successor relation of a linear order ⪯_p^M on V_p^M := {v ∈ V^M | λ^M(v) ∈ Σ_p},
• (V_p^M, ⪯_p^M) is non-empty and finite or isomorphic to (ℕ, ≤),
• for all v, w ∈ V^M, we have (v, w) ∈ msg^M if and only if there exists (p, q) ∈ Ch such that λ^M(v) = p!q, λ^M(w) = q?p, and |{u ∈ V^M | λ^M(u) = p!q, u ⪯_p^M v}| = |{u ∈ V^M | λ^M(u) = q?p, u ⪯_q^M w}|,
• for every v ∈ V^M, there exists w ∈ V^M such that (v, w) ∈ msg^M ∪ (msg^M)⁻¹.

If v ∈ V^M, then we denote by P^M(v) the process at which v is located, i.e., P^M(v) = p if and only if λ^M(v) ∈ Σ_p. Finally, if v ∈ V^M, then the pair (M, v) is called a pointed MSC.
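The counting condition in the definition above determines msg^M uniquely: on every channel, the n-th send p!q is matched with the n-th receive q?p. A minimal sketch of this FIFO matching, assuming our own list-of-events encoding (not the paper's):

```python
# Sketch of the FIFO matching condition of Definition 2.2: the n-th send
# p!q is paired with the n-th receive q?p on the same channel.
from collections import defaultdict

def message_relation(events):
    """events: list of (event_id, label) pairs, listed consistently with the
    per-process orders; returns the induced msg relation as a set of
    (send, receive) pairs."""
    sends = defaultdict(list)   # channel (p, q) -> send events in order
    recvs = defaultdict(list)   # channel (p, q) -> receive events in order
    for eid, label in events:
        if "!" in label:
            p, q = label.split("!")
            sends[(p, q)].append(eid)
        else:
            p, q = label.split("?")     # p?q: received on p, sent by q
            recvs[(q, p)].append(eid)
    # FIFO: pair sends and receives on the same channel positionally
    return {(s, r) for ch in sends for s, r in zip(sends[ch], recvs[ch])}

# Two messages from process 1 to process 2, matched in FIFO order:
evts = [(0, "1!2"), (1, "1!2"), (2, "2?1"), (3, "2?1")]
assert message_relation(evts) == {(0, 2), (1, 3)}
```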
Definition 2.3.
We fix the set M = {proc, proc⁻, msg, msg⁻, id} of directions. An MSC M induces a partial function η^M : V^M × V^M → M. For all v, v′ ∈ V^M, we define

η^M(v, v′) =
  proc   if (v, v′) ∈ proc^M,
  proc⁻  if (v′, v) ∈ proc^M,
  msg    if (v, v′) ∈ msg^M,
  msg⁻   if (v′, v) ∈ msg^M,
  id     if v = v′,
  undefined otherwise,

where proc^M = ⋃_{p∈P} proc_p^M.

2.2. Propositional Dynamic Logic with Converse and Repeat.
In this section, we introduce a new logic called propositional dynamic logic with converse and repeat (or CRPDL for short). In CRPDL, we distinguish between local and global formulas. The former are evaluated at specific events of an MSC. The latter are positive Boolean combinations of properties of the form "there exists an event satisfying a local formula" or "all events satisfy a local formula".
Definition 2.4.
Local formulas α and path expressions π of CRPDL are defined by the following grammar, where D ∈ M and σ ranges over the alphabet Σ:

α ::= σ | ¬α | ⟨π⟩α | ⟨π⟩ω
π ::= D | {α} | π; π | π + π | π*

Formulas of the form ⟨π⟩α are called path formulas. The size of a local formula α is the length of the string α.

Note that proc⁻ and msg⁻ form the converse operator [22] which allows one to walk back and forth within an MSC using a single path expression. The formula ⟨π⟩ω provides the functionality of the repeat operator [23]. It allows one to express that a path expression can be repeated infinitely often.

Intuitively, a path formula ⟨π⟩α expresses that one can move along a path described by π and then α holds. In the following formal definition of the semantics of local formulas, we write reach^M(v, π) to denote the set of events which can be reached from v using a path described by π. A formal definition of reach^M(v, π) is given at the end of Definition 2.5.
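To make the operational reading concrete, the grammar above, minus the repeat operator ⟨π⟩ω (which is only meaningful on infinite structures), can be prototyped over a finite MSC. Everything below, the nested-tuple encoding of formulas and the eta/label dictionaries, is our own illustrative choice, not notation from the paper:

```python
# Hypothetical prototype of CRPDL path expressions on a *finite* MSC.
# 'eta' maps a direction name to a dict event -> successor event, so that
# eta['proc'][v] is the proc-successor of v (if any); 'label' maps events
# to letters of Sigma.  The repeat operator is omitted.

def reach(eta, label, v, pi):
    kind = pi[0]
    if kind == "dir":                      # pi = D
        d = pi[1]
        return {eta[d][v]} if v in eta[d] else set()
    if kind == "test":                     # pi = {alpha}
        return {v} if sat(eta, label, v, pi[1]) else set()
    if kind == "seq":                      # pi = pi1 ; pi2
        return {w for u in reach(eta, label, v, pi[1])
                  for w in reach(eta, label, u, pi[2])}
    if kind == "alt":                      # pi = pi1 + pi2
        return reach(eta, label, v, pi[1]) | reach(eta, label, v, pi[2])
    if kind == "star":                     # pi = pi1*  (least fixpoint)
        seen, todo = {v}, [v]
        while todo:
            u = todo.pop()
            for w in reach(eta, label, u, pi[1]):
                if w not in seen:
                    seen.add(w)
                    todo.append(w)
        return seen

def sat(eta, label, v, alpha):
    kind = alpha[0]
    if kind == "atom":                     # alpha = sigma
        return label[v] == alpha[1]
    if kind == "not":
        return not sat(eta, label, v, alpha[1])
    if kind == "diam":                     # alpha = <pi> beta
        return any(sat(eta, label, w, alpha[2])
                   for w in reach(eta, label, v, alpha[1]))

# Two events on process 1: a send 1!2 followed by a receive 1?2.
eta = {"proc": {0: 1}, "proc-": {1: 0}, "msg": {}, "msg-": {}}
label = {0: "1!2", 1: "1?2"}
# <proc*> 1?2 holds at event 0:
assert sat(eta, label, 0, ("diam", ("star", ("dir", "proc")), ("atom", "1?2")))
```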
Definition 2.5.
Let (M, v) be a pointed MSC, σ ∈ Σ, D ∈ M, α, β be local formulas, and π, π_1, π_2 be path expressions. We define:

M, v ⊨ σ ⟺ λ^M(v) = σ
M, v ⊨ ¬α ⟺ M, v ⊭ α
M, v ⊨ ⟨D⟩α ⟺ there exists v′ with η^M(v, v′) = D and M, v′ ⊨ α
M, v ⊨ ⟨{α}⟩β ⟺ M, v ⊨ α and M, v ⊨ β
M, v ⊨ ⟨π_1 + π_2⟩α ⟺ M, v ⊨ ⟨π_1⟩α or M, v ⊨ ⟨π_2⟩α
M, v ⊨ ⟨π_1; π_2⟩α ⟺ M, v ⊨ ⟨π_1⟩⟨π_2⟩α
M, v ⊨ ⟨π*⟩α ⟺ there exists an n ≥ 0 with M, v ⊨ (⟨π⟩)^n α
M, v ⊨ ⟨π⟩ω ⟺ there exist infinitely many events v_1, v_2, . . . such that v_1 = v and v_{i+1} ∈ reach^M(v_i, π) for all i ≥ 1

The set reach^M(v, π) is inductively defined as follows:

reach^M(v, D) = {v′} if η^M(v, v′) = D, and ∅ otherwise
reach^M(v, {α}) = {v} if M, v ⊨ α, and ∅ otherwise
reach^M(v, π_1; π_2) = ⋃_{v′ ∈ reach^M(v, π_1)} reach^M(v′, π_2)
reach^M(v, π_1 + π_2) = reach^M(v, π_1) ∪ reach^M(v, π_2)
reach^M(v, π*) = {v} ∪ ⋃_{n≥1} reach^M(v, π^n)

By L(α) we denote the set of pointed MSCs which satisfy α.

We set tt = σ ∨ ¬σ for some σ ∈ Σ. If α = ⟨π⟩tt, then we define reach^M(v, α) = reach^M(v, π). Furthermore, we use α_1 ∧ α_2 as an abbreviation for ⟨{α_1}⟩α_2 and write α_1 ∨ α_2 for the formula ¬(¬α_1 ∧ ¬α_2). Finally, for all q ∈ P, we define P_q = ⋁_{p∈P, q≠p} (q!p ∨ q?p). For every pointed MSC (M, v), we have M, v ⊨ P_q if and only if P^M(v) = q.

Remark 2.6.
It can be easily seen that M, v ⊨ ⟨π⟩α if and only if M, v ⊨ ⟨π; {α}⟩tt. Because of this fact, whenever we deal with path formulas in the following, we will assume that α = tt.

Example 2.7.
The existential until construct α EU β [12] can be expressed by the local formula ⟨({α}; (proc + msg))*⟩β.

We now define global formulas, which are positive Boolean combinations of properties of the form "there exists an event satisfying a local formula α" or "all events satisfy a local formula α".

Definition 2.8.
The syntax of global formulas is given by the grammar

ϕ ::= Eα | Aα | ϕ ∨ ϕ | ϕ ∧ ϕ

where α ranges over the set of local formulas. Their semantics is as follows: if M is an MSC, α is a local formula, and ϕ_1, ϕ_2 are global formulas, then

M ⊨ Eα ⟺ there exists v ∈ V^M with M, v ⊨ α,
M ⊨ Aα ⟺ M, v ⊨ α for all v ∈ V^M,
M ⊨ ϕ_1 ∨ ϕ_2 ⟺ M ⊨ ϕ_1 or M ⊨ ϕ_2, and
M ⊨ ϕ_1 ∧ ϕ_2 ⟺ M ⊨ ϕ_1 and M ⊨ ϕ_2.

We define the size of a global formula ϕ to be the length of the string ϕ. By L(ϕ), we denote the set of MSCs M with M ⊨ ϕ.

Note that even though no negation operators are allowed in global formulas, the expressible properties are still closed under negation. This is because conjunction and disjunction operators as well as existential and universal quantification are available.

Example 2.9 ([1]). Let β_p = ⟨proc*; msg; proc*; msg⟩P_p. If (M, v) is a pointed MSC such that M, v ⊨ β_p, then process p can be reached from v with exactly two messages. If M is the MSC from Fig. 1, then M, v ⊨ β_1 if and only if v is one of the first three events on process 1. The global formula ϕ_p = Aβ_p states that β_p holds for every event of an MSC M (which in particular implies that M is infinite).

Example 2.10.
An MSC M satisfies E ⋀_{p∈P} ⟨(proc + msg + proc⁻ + msg⁻)*⟩P_p if and only if the graph (V^M, proc^M ∪ msg^M ∪ (proc^M)⁻¹ ∪ (msg^M)⁻¹) is connected.

Example 2.11.
Now, let π_p = ((proc + msg)*; {P_p}) for every p ∈ P. Imagine that M is an MSC which models the circulation of a single token granting access to a shared resource. Then M ⊨ E⟨π_1; π_2; . . . ; π_{|P|}⟩ω if and only if no process ever gets excluded from using the shared resource.

2.3. Message Sequence Chart Automata (MSCA).
In this section, we give the definition of MSCAs, which basically are multi-way alternating parity automata walking forth and back on the process and message edges of MSCs. We first define local MSCAs, which are started at individual events of an MSC. They also come with a so-called concatenation state. This type of state is used to concatenate local MSCAs in order to obtain more complex local MSCAs. Using this technique, we will show in a subsequent section that every local formula of CRPDL can be transformed into a local MSCA.
Definition 2.12. If X is a non-empty set, then B⁺(X) denotes the set of all positive Boolean expressions over X together with the expression ⊥. The latter expression is always evaluated to false. We say that Y ⊆ X is a model of E ∈ B⁺(X) and write Y ⊨ E if E is evaluated to true when assigning true to every element contained in Y and assigning false to all other elements from X \ Y. The set Y ⊆ X is a minimal model of E if Y ⊨ E and Z ⊭ E for all Z ⊊ Y. We denote the set of all models of E by mod(E), whereas we write ⟦E⟧ for the set of all minimal models of E.

For instance, {a, b, c}, {a, b}, {a, c}, {b, c}, and {a} are all models of the positive Boolean expression a ∨ (b ∧ c) ∈ B⁺({a, b, c}). However, only {a} and {b, c} are minimal models.

Definition 2.13. A local message sequence chart automaton (local MSCA) is a quintuple M = (S, δ, ι, c, κ) where
Figure 2: The local MSCA M from Example 2.14.

• S is a finite set of states,
• δ : S × Σ → B⁺(M × S) is a transition function,
• ι ∈ S is an initial state,
• c ∈ S is a concatenation state, and
• κ : S → ℕ is a ranking function.

The size of M is |S| + |δ|. If we do not pay attention to the concatenation state c, then we sometimes write (S, δ, ι, κ) instead of (S, δ, ι, c, κ). If s ∈ S, σ ∈ Σ, and τ ∈ ⟦δ(s, σ)⟧, then τ is called a transition.

For example, the transition τ = {(proc, s_1), (msg, s_2)}, which is a minimal model of the expression (proc, s_1) ∧ ((msg, s_2) ∨ (msg⁻, s_3)), can be interpreted in the following way: let us assume that M is in state s ∈ S at an event v. If it performs the transition τ, then it changes, in parallel, from the state s into the states s_1 and s_2, i.e., the run splits. In the case of state s_1, it moves to the event succeeding the event v on the current process. For s_2, the automaton walks along a message edge to the receive event of the message sent in v. Hence, the conjunctive connectives implement universal branching, whereas the disjunctive connectives realize existential branching and nondeterminism, respectively. As a consequence, local MSCAs are alternating automata and their runs may split. Therefore, in order to be able to define runs of local MSCAs, we first introduce labelled trees.

Later, in the construction of local MSCAs from local formulas, the concatenation state c of M will be used to concatenate local MSCAs for simple local formulas in order to obtain automata which are equivalent to more complex formulas.

Example 2.14.
Let p ∈ P be fixed. Consider the local MSCA M = (S, δ, s_1, κ) which is depicted in Fig. 2. Its set of states S consists of the three states s_1, s_2, and s_3, where s_1 is the initial state. Each state is depicted by a circle. The label of the circles also tells us the rank of each state. For example, the label s_1 | 1 says that κ(s_1) = 1. Furthermore, we have κ(s_2) = 1 and κ(s_3) = 0. Transitions are depicted by arrows. For instance, the arrow from s_1 to s_2 labelled by Σ and msg says that the automaton can make a transition from s_1 to s_2 by following a message edge and going to the matching receive event. We write Σ because this transition can be executed no matter what the label of the current event is. Alternatively, the automaton can stay in state s_1 by going to the successor of the current event; this is expressed by the loop at s_1. More formally, for all σ ∈ Σ, we have δ(s_1, σ) = (proc, s_1) ∨ (msg, s_2), δ(s_3, σ) = ⊥, and

δ(s_2, σ) = (id, s_3) if σ = q!p where q ∈ P \ {p}, and (proc, s_2) otherwise.

Note that the above example makes use of existential branching only, whereas the MSCA of the next example also implements universal branching.

Figure 3: The local MSCA M′ from Example 2.15.

Example 2.15.
Consider the local MSCA from Fig. 3. Note that universal branching is depicted by forked arrows. We have M′ = (S ∪ {t_1, t_2}, δ ∪ δ′, t_1, κ ∪ κ′) where M = (S, δ, s_1, κ) is the local MSCA from Example 2.14, κ′(t_1) = 1, κ′(t_2) = 0, and δ′(t_1, σ) = (id, t_2) ∧ (id, s_1) and δ′(t_2, σ) = (proc, t_1) for all σ ∈ Σ.

Definition 2.16. A tree is a directed, connected, cycle-free graph (C, E) with the set of nodes C and the set of edges E such that there exists exactly one node with no incoming edges (which is called the root) and all other nodes have exactly one incoming edge.

We now define so-called S-labelled trees over pointed MSCs, where S is an arbitrary set. Later, the set S will be the set of states of a local MSCA.

Definition 2.17.
Let S be an arbitrary set, M be an MSC, and v ∈ V^M. An S-labelled tree over (M, v) is a quintuple ρ = (C, E, r, µ, ν) where

(1) (C, E) is a tree with root r,
(2) µ : C → S is a labeling function,
(3) ν : C → V^M is a positioning function with ν(r) = v,
(4) µ(y_1) ≠ µ(y_2) or ν(y_1) ≠ ν(y_2) for all (x, y_1), (x, y_2) ∈ E with y_1 ≠ y_2, and
(5) η^M(ν(x), ν(y)) is defined for all (x, y) ∈ E.

The elements of C are called configurations. If x ∈ C, then E_ρ(x) = {y ∈ C | (x, y) ∈ E} denotes the set of the direct successor configurations of x in ρ. For convenience, we identify µ with its natural extension, i.e., µ(x_1 x_2 x_3 . . .) = µ(x_1)µ(x_2)µ(x_3) . . . ∈ S* ∪ S^ω.

We use S-labelled trees to define runs of local MSCAs. Condition (4) has no influence on the expressiveness of local MSCAs but simplifies the proofs in Section 4. Intuitively, it prevents a local MSCA from doing unnecessary work. By item (5), we ensure that an MSCA cannot jump within an MSC but must move along process or message edges.

Definition 2.18.
Let S be a set, (M, v) be a pointed MSC, and ρ = (C, E, r, µ, ν) be an S-labelled tree over (M, v). A path in ρ of length n ∈ ℕ ∪ {ω} is a sequence x_1 x_2 x_3 . . . ∈ C^n such that x_{i+1} ∈ E_ρ(x_i) for all 1 ≤ i < n. It is a branch of ρ if x_1 = r and E_ρ(x_n) = ∅ (provided that n ∈ ℕ).

That means every branch of ρ begins in the root of ρ and either leads to some leaf of ρ or is infinite.

Definition 2.19. If C′ ⊆ C is such that (C′, E ∩ (C′ × C′)) is a tree with root r′, we denote by ρ ↾ C′ the restriction of ρ to C′, i.e., the S-labelled tree (C′, E′, r′, µ′, ν′) where E′ = E ∩ (C′ × C′), µ′ = µ ↾ C′, and ν′ = ν ↾ C′.

We want the runs of local MSCAs to be maximal. That means that, during a run, a local MSCA is forced to execute a transition if it is able to do so. If the MSCA is unable to proceed, we say that it is stuck.
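The transitions available to a local MSCA at (s, σ) are exactly the minimal models ⟦δ(s, σ)⟧ from Definition 2.12. For small expressions they can be enumerated by brute force over subsets of atoms; the nested-tuple encoding below is our own, not the paper's:

```python
# Enumerate models and minimal models of a positive Boolean expression,
# encoded as nested tuples ("var", x) / ("and", e1, e2) / ("or", e1, e2)
# / ("bot",).  Illustrative brute force, exponential in the atom count.
from itertools import combinations

def evaluate(expr, assignment):
    op = expr[0]
    if op == "bot":
        return False
    if op == "var":
        return expr[1] in assignment
    sub = (evaluate(e, assignment) for e in expr[1:])
    return all(sub) if op == "and" else any(sub)

def atoms(expr):
    op = expr[0]
    if op == "var":
        return {expr[1]}
    if op == "bot":
        return set()
    return atoms(expr[1]) | atoms(expr[2])

def minimal_models(expr):
    xs = sorted(atoms(expr))
    models = [set(c) for r in range(len(xs) + 1)
                     for c in combinations(xs, r)
                     if evaluate(expr, set(c))]
    # keep only models with no strictly smaller model inside them
    return [m for m in models if not any(n < m for n in models)]

# a or (b and c): minimal models are {a} and {b, c}, as in Definition 2.12
e = ("or", ("var", "a"), ("and", ("var", "b"), ("var", "c")))
assert minimal_models(e) == [{"a"}, {"b", "c"}]
```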
Definition 2.20.
Let M be an MSC and M = (S, δ, ι, κ) be a local MSCA. The automaton M is stuck at v ∈ V^M in the state s ∈ S if for every transition τ ∈ ⟦δ(s, λ^M(v))⟧ there exists a movement (D, s′) ∈ τ such that there is no event v′ ∈ V^M with η^M(v, v′) = D.

We are now prepared to define runs of local MSCAs.

Definition 2.21.
Let M = (S, δ, ι, κ) be a local MSCA and ρ = (C, E, r, µ, ν) be an S-labelled tree over a pointed MSC (M, v). We define tr_ρ : C → 2^(M×S) to be the function which maps every x ∈ C to the set {(η^M(ν(x), ν(x′)), µ(x′)) | x′ ∈ E_ρ(x)}.

The tree ρ is a run of M on (M, v) if µ(r) = ι and, for all x ∈ C, the run condition is fulfilled, i.e.,
• if E_ρ(x) ≠ ∅, then tr_ρ(x) ∈ ⟦δ(µ(x), λ^M(ν(x)))⟧, and
• if E_ρ(x) = ∅, then M is stuck at the event ν(x) in state µ(x).

Definition 2.22.
Let (s_i)_{i≥1} ∈ S* ∪ S^ω be a sequence of states. By inf((s_i)_{i≥1}), we denote the set of states occurring infinitely often in (s_i)_{i≥1}. If (s_i)_{i≥1} is finite, then it is accepting if it ends in a state s whose rank κ(s) is even. If it is infinite, it is accepting if the minimum of the ranks of all states occurring infinitely often is even, i.e., min{κ(s) | s ∈ inf((s_i)_{i≥1})} is even.

If ρ is a run of M and b is a branch of ρ, then b is accepting if its label µ(b) is accepting. A run ρ of M is accepting if every branch of ρ is accepting. By L(M), we denote the set of all pointed MSCs (M, v) for which there exists an accepting run of M. Furthermore, for all p ∈ P, L_p(M) is the set of MSCs M with (M, v) ∈ L(M) where v is the minimal element of V_p^M with respect to ⪯_p^M.

Example 2.23.
Let M and M′ be the MSCAs from Examples 2.14 and 2.15, respectively. It can be easily checked that, for every pointed MSC (M, v), we have (M, v) ∈ L(M) if and only if M, v ⊨ β_p, where β_p = ⟨proc*; msg; proc*; msg⟩P_p is the formula from Example 2.9. In contrast, a pointed MSC (M, v) is accepted by M′ if and only if M, v′ ⊨ β_p for all v′ ∈ V^M with v ⪯_p^M v′.

We also introduce the notion of global MSCAs, which come as a local MSCA together with a set of global initial states.

Definition 2.24. A global message sequence chart automaton (global MSCA) is a tuple G = (M, I) where M = (S, δ, ι, κ) is a local MSCA and I ⊆ S^|P| is a set of global initial states. The language of G is defined by

L(G) = ⋃_{(s_1, . . . , s_|P|) ∈ I} ⋂_{p ∈ P} L_p(S, δ, s_p, κ).

The size of G is the size of M.

Intuitively, an MSC M is accepted by G if and only if there exists a global initial state (s_1, s_2, . . . , s_|P|) ∈ I such that, for every p ∈ P, the local MSCA M accepts (M, v_p) when started in the state s_p, where v_p is the minimal event on process p.
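The parity acceptance condition of Definition 2.22 involves infinite state sequences; it becomes directly checkable when a sequence is presented as a finite prefix plus an infinitely repeated cycle, since the states occurring infinitely often are then exactly the cycle states. A hedged sketch under that lasso assumption (function names and the presentation are our own):

```python
# Parity acceptance of Definition 2.22, for finite sequences and for
# ultimately periodic ("lasso") sequences prefix . cycle^omega.

def finite_accepting(seq, rank):
    """A finite state sequence is accepting iff its last state's rank is even."""
    return rank[seq[-1]] % 2 == 0

def lasso_accepting(prefix, cycle, rank):
    """For prefix . cycle^omega, the states occurring infinitely often are
    exactly the cycle states, so acceptance reduces to the minimum rank
    on the cycle being even."""
    return min(rank[s] for s in cycle) % 2 == 0

rank = {"s1": 1, "s2": 1, "s3": 0}
assert finite_accepting(["s1", "s3"], rank)             # ends in rank 0
assert not lasso_accepting(["s1"], ["s1", "s2"], rank)  # min rank 1 is odd
assert lasso_accepting([], ["s2", "s3"], rank)          # min rank 0 is even
```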
Example 2.25.
Let G = (M′, {(t_1, . . . , t_1)}) be the global MSCA where M′ is the local MSCA from Example 2.15. We have M ∈ L(G) if and only if M ⊨ ϕ_p where ϕ_p is the global formula from Example 2.9.

3. Closure under Complementation

If M is a local MSCA, then a local MSCA M̄ recognizing the complement of L(M) can be easily obtained. Basically, one just needs to exchange ∧ and ∨ in the image of the transition function of M and update the ranking function.

To make this more precise, let us first define the dual expression Ẽ of a positive Boolean expression E.
Let X be a set and E ∈ B⁺(X). Then the dual expression Ẽ of E denotes the positive Boolean expression obtained by exchanging ∧ and ∨ in E.

Let us state the following two easy lemmas on positive Boolean expressions and their dual counterparts.

Lemma 3.2.
Let X be a set and E ∈ B⁺(X). Then, for all Y ∈ mod(E) and Z ∈ mod(Ẽ), we have Y ∩ Z ≠ ∅.

Proof. If E = a for some a ∈ X, then the lemma easily follows. For the induction step, let us assume that E = E_1 ∧ E_2 such that, for all i ∈ [2], Y ∈ mod(E_i), and Z ∈ mod(Ẽ_i), we have Y ∩ Z ≠ ∅. If Y ∈ mod(E) and Z ∈ mod(Ẽ), then, without loss of generality, Y ⊨ E_1 and Z ⊨ Ẽ_1. It follows from the induction hypothesis that Y ∩ Z ≠ ∅. The case E = E_1 ∨ E_2 is shown analogously.

Lemma 3.3.
Let X be a set and E ∈ B⁺(X). If Z ⊆ X is such that Z ∩ Y ≠ ∅ for all Y ∈ mod(E), then Z ∈ mod(Ẽ).

Proof. If E = a for some a ∈ X, then the lemma easily follows. For the induction step, let E_1, E_2 ∈ B⁺(X) such that, for all i ∈ [2], the following holds: if Z ⊆ X and Z ∩ Y ≠ ∅ for all Y ∈ mod(E_i), then Z ∈ mod(Ẽ_i).

For the case E = E_1 ∨ E_2, let Z ⊆ X such that Z ∩ Y ≠ ∅ for all Y ∈ mod(E). If i ∈ [2] and Y ∈ mod(E_i), then Y ⊨ E. Hence, Z ∩ Y ≠ ∅ for all i ∈ [2] and Y ∈ mod(E_i). From our induction hypothesis it follows that Z ⊨ Ẽ_1 and Z ⊨ Ẽ_2 and, therefore, Z ⊨ Ẽ.

Now, let us consider the case E = E_1 ∧ E_2. Towards a contradiction, suppose that there exists a Z ⊆ X such that Z ∩ Y ≠ ∅ for all Y ∈ mod(E) and Z ⊭ Ẽ. Since Ẽ = Ẽ_1 ∨ Ẽ_2, we have Z ⊭ Ẽ_1 and Z ⊭ Ẽ_2. From our induction hypothesis it follows that there exist Y_1 ∈ mod(E_1) and Y_2 ∈ mod(E_2) with Z ∩ Y_1 = Z ∩ Y_2 = ∅. Since we also have Y_1 ∪ Y_2 ⊨ E, this is a contradiction to our choice of Z.

We are now prepared to dualize local MSCAs.
Definition 3.4. Let M = (S, δ, ι, c, κ) be a local MSCA. The dual MSCA M̄ is the local MSCA (S, δ̃, ι, c, κ̄) where
• κ̄(s) = κ(s) + 1 for all s ∈ S and
• δ̃(s, σ) is the dual expression of δ(s, σ) for all s ∈ S and σ ∈ Σ.

Remark 3.5.
Let M = (S, δ, ι, κ) be a local MSCA and (s_i)_{i≥1} ∈ S^∞ be a sequence of states. Because of our definition of κ̄, a state s ∈ S has an even rank in M if and only if it has an odd rank in M̄. It follows that (s_i)_{i≥1} is accepting in M if and only if it is not accepting in M̄.

If (M, v) is a pointed MSC, ρ is a run of M on (M, v), and ρ̄ is a run of M̄ on (M, v), then one can observe that ρ contains a branch x_1 x_2 x_3 . . . and ρ̄ contains a branch x′_1 x′_2 x′_3 . . . such that µ(x_1 x_2 x_3 . . .) = µ̄(x′_1 x′_2 x′_3 . . .), i.e., they are labelled by the same sequence of states. Because of the fact stated in Remark 3.5, x_1 x_2 x_3 . . . is accepting in M if and only if x′_1 x′_2 x′_3 . . . is not accepting in M̄. By means of this observation, a result on parity games, and the ideas presented in [19], we prove the following theorem:

Theorem 3.6. If M is a local MSCA and (M, v) is a pointed MSC, then

(M, v) ∈ L(M) ⟺ (M, v) ∉ L(M̄).

The rest of this section prepares the proof of the above theorem. The actual proof can be found on page 15.
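Definition 3.1 and Lemma 3.2 can be spot-checked mechanically: dualization swaps ∧ and ∨, and every model of E must intersect every model of Ẽ. The nested-tuple encoding below is our own illustration, not the paper's notation:

```python
# Dual of a positive Boolean expression (Definition 3.1): swap 'and'/'or'.
# Expressions are nested tuples ("var", x) / ("and", e1, e2) / ("or", e1, e2).
from itertools import combinations

def dual(expr):
    op = expr[0]
    if op == "var":
        return expr
    flipped = "or" if op == "and" else "and"
    return (flipped, dual(expr[1]), dual(expr[2]))

def evaluate(expr, assignment):
    op = expr[0]
    if op == "var":
        return expr[1] in assignment
    sub = (evaluate(e, assignment) for e in expr[1:])
    return all(sub) if op == "and" else any(sub)

# Spot-check Lemma 3.2 on E = a or (b and c): every model of E intersects
# every model of the dual expression a and (b or c).
E = ("or", ("var", "a"), ("and", ("var", "b"), ("var", "c")))
xs = ["a", "b", "c"]
subsets = [set(c) for r in range(4) for c in combinations(xs, r)]
for Y in (s for s in subsets if evaluate(E, s)):
    for Z in (s for s in subsets if evaluate(dual(E), s)):
        assert Y & Z, (Y, Z)
```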
Definition 3.7.
Let M = (S, δ, ι, c, κ) be a local MSCA and (M, v) be a pointed MSC. With M and the pointed MSC (M, v), we associate a game G(M, M, v) played by the two players Automaton and Pathfinder in the arena (C_A, C_P, E_A, E_P) where C_A = V^M × S, C_P = V^M × 2^(M×S), E_A ⊆ C_A × C_P, E_P ⊆ C_P × C_A,

((v, s), (v, τ)) ∈ E_A ⟺ τ ∈ ⟦δ(s, λ^M(v))⟧ and, for all (D, s′) ∈ τ, there exists an event v′ ∈ V^M such that η^M(v, v′) = D,

and

((v, τ), (v′, s)) ∈ E_P ⟺ there exists D ∈ M such that (D, s) ∈ τ and η^M(v, v′) = D.

C_A is the set of game positions of the player Automaton. Analogously, at a position from C_P it is Pathfinder's turn. The game position (v, ι) is called the initial position.

A play of G(M, M, v) starts at the initial position (v, ι) from the set C_A, i.e., the player Automaton has to move first. He chooses a transition τ from ⟦δ(ι, λ^M(v))⟧, resulting in a game position (v, τ) ∈ C_P. Now, it is Pathfinder's turn, who has to pick a movement (D, s) from τ. This leads to the game position (v′, s) ∈ C_A with η^M(v, v′) = D. After that, Automaton has to move next, and so on. More formally, we define:
Let M = (S, δ, ι, c, κ) be a local MSCA and (M, v) be a pointed MSC. A partial play ξ of G(M, M, v) is a sequence of one of the following two forms:

(1) ξ = ((v_i, s_i)(v_i, τ_i))_{1≤i≤n} ∈ (C_A C_P)^n where
• n ≥ 1,
• (v_1, s_1) = (v, ι),
• ((v_i, s_i), (v_i, τ_i)) ∈ E_A for all 1 ≤ i ≤ n, and
• ((v_i, τ_i), (v_{i+1}, s_{i+1})) ∈ E_P for all 1 ≤ i < n;

(2) ξ = ((v_i, s_i)(v_i, τ_i))_{1≤i} ∈ (C_A C_P)^ω satisfying the corresponding conditions for all i ≥ 1.

The sequence (s_i)_{1≤i≤n} ∈ S^n (respectively (s_i)_{1≤i} ∈ S^ω) is called the label of ξ. By ξ ↾ C_A we denote the sequence (v_1, s_1)(v_2, s_2) . . . which is obtained by restricting ξ to the positions from C_A. A sequence of the form ξ = (v_1, s_1)((v_i, τ_i)(v_{i+1}, s_{i+1}))_{1≤i≤n}, which ends in a position from C_A, is also considered a partial play. A play is a maximal partial play, i.e., a partial play that is infinite or cannot be extended by a further move. A play is won by the player Automaton if its label is an accepting sequence of states or if it ends in a position from C_P from which Pathfinder cannot move; otherwise, it is won by Pathfinder.

A strategy of player Automaton in the game G(M, M, v) is a total function f : ((C_A C_P)∗ C_A) → C_P. A (partial) play ξ is called an f-play if every move of Automaton in ξ conforms to f, i.e., whenever ξ′ is a proper prefix of ξ ending in a position from C_A, then the position following ξ′ in ξ is f(ξ′). The strategy f is a winning strategy if every f-play is won by Automaton. Strategies and winning strategies of Pathfinder are defined analogously. A strategy is memoryless if its value only depends on the last position of its argument.

For the rest of this section, let M = (S, δ, ι, c, κ) be a local MSCA and (M, v) be a pointed MSC. Furthermore, let (C_A, C_P, E_A, E_P) be the arena of G(M, M, v) and (C̃_A, C̃_P, Ẽ_A, Ẽ_P) be the arena of G(M̃, M, v). Firstly, let us state the fact that parity games enjoy memoryless determinacy.

Proposition 3.10 ([15]). From any game position in G(M, M, v), either Automaton or Pathfinder has a memoryless winning strategy.

We now establish a connection between accepting runs of M and winning strategies of the player Automaton.

Lemma 3.11. If (M, v) is accepted by M, then Automaton has a winning strategy in the game G(M, M, v).

Proof. Let ρ = (C, E, r, μ, ν) be an accepting run of M on (M, v). We construct a strategy f for Automaton which ensures that, for every f-play ξ of G(M, M, v), the label of ξ is also a label of a branch of ρ. Let x ∈ C be a configuration with E_ρ(x) ≠ ∅ and b = x_1 x_2 . . . x_n be the unique path from the root r to x in ρ. Consider the finite sequence ξ = (v_1, s_1)(v_1, τ_1)(v_2, s_2) . . .
(v_n, s_n) ∈ (C_A C_P)^{n−1} C_A where v_i = ν(x_i), s_i = μ(x_i) for all i ∈ [n] and τ_j = tr_ρ(x_j) for all j ∈ [n−1]. If ξ is a partial play of G(M, M, v), we define f(ξ) = (v_n, tr_ρ(x_n)). The partial function f becomes a total function and, therefore, a strategy for the player Automaton by mapping every value for which we did not define f to a fixed game position from C_P.

We show that every f-play develops along a branch of ρ. Every play of G(M, M, v) starts in the initial position (v, ι) = (ν(r), μ(r)). For the induction step, let ξ = (v_1, s_1)(v_1, τ_1)(v_2, s_2) . . . (v_n, s_n) ∈ (C_A C_P)^{n−1} C_A be a partial f-play and b = x_1 . . . x_n ∈ C^n be the prefix of a branch of ρ such that μ(x_i) = s_i and ν(x_i) = v_i for all i ∈ [n] and tr_ρ(x_j) = τ_j for all j ∈ [n−1]. If M is stuck in x_n, then we have E_ρ(x_n) = ∅ and the player Automaton cannot proceed in ξ. Otherwise, we have E_ρ(x_n) ≠ ∅ and f(ξ) is defined. After Automaton's f-conform move, we are at game position f(ξ) = (v_n, tr_ρ(x_n)) ∈ C_P. For every move (D, s) ∈ tr_ρ(x_n) of Pathfinder, there exists a configuration x ∈ E_ρ(x_n) with μ(x) = s and η^M(ν(x_n), ν(x)) = D. Hence, every f-play ξ develops along a branch b of ρ. Since b is accepting and b and ξ are labelled by the same sequence over S, ξ is a play won by Automaton. Therefore, f is a winning strategy of Automaton in the game G(M, M, v).

Lemma 3.12. If the player Automaton has a winning strategy in the game G(M, M, v), then the pointed MSC (M, v) is accepted by M.

Proof. Let us assume that there exists a winning strategy f for Automaton in G(M, M, v). By Prop. 3.10, we can assume that f is memoryless. We inductively construct an accepting run ρ of M on (M, v). Firstly, we set ρ_0 = (C_0, E_0, r, μ_0, ν_0) where C_0 = {r}, E_0 = ∅, μ_0(r) = ι, and ν_0(r) = v.
Now, let us assume that the S-labelled tree ρ_i = (C_i, E_i, r, μ_i, ν_i) is already defined. Let {x_1, x_2, . . . , x_n} be the set of all leaves of ρ_i in which M is not stuck, i.e., for all j ∈ [n], the local MSCA M is not stuck in state μ_i(x_j) at position ν_i(x_j). For every j ∈ [n], let τ_j be the transition such that f(ν_i(x_j), μ_i(x_j)) = (ν_i(x_j), τ_j). We set ρ_{i+1} = (C_{i+1}, E_{i+1}, r, μ_{i+1}, ν_{i+1}) to the smallest (with respect to the size of the set of configurations C_{i+1}) S-labelled tree such that ρ_{i+1} ↾ C_i = ρ_i and, for all j ∈ [n], tr_{ρ_{i+1}}(x_j) = τ_j.

Let ρ = (C, E, r, μ, ν) = ⋃_{i≥0} ρ_i. It can be easily checked that ρ is a run of M on (M, v). Now, let b = x_1 x_2 x_3 . . . ∈ C^∞ be a branch of ρ. Consider the play

ξ = (ν(x_1), μ(x_1))(ν(x_1), tr_ρ(x_1))(ν(x_2), μ(x_2))(ν(x_2), tr_ρ(x_2)) . . .

of G(M, M, v). It follows from the construction of ρ that ξ is an f-play. Since f is a winning strategy for player Automaton, ξ is won by Automaton. Since ξ and b share the same label, the branch b is accepting in M. Hence, ρ is an accepting run of the local MSCA M.

The next two lemmas state that a player has a winning strategy in the current game if and only if there exists a winning strategy for its opponent in the dual game.

Lemma 3.13. If Automaton has a winning strategy in G(M, M, v), then Pathfinder has a winning strategy in the game G(M̃, M, v).

Proof. Let f_A be a winning strategy of Automaton in the game G(M, M, v). We show that there exists a strategy f_P for Pathfinder such that, for every f_P-play ξ̃ in G(M̃, M, v), there exists an f_A-play ξ in G(M, M, v) such that ξ ↾ C_A = ξ̃ ↾ C̃_A. Note that the initial positions of the games G(M, M, v) and G(M̃, M, v) are the same.
For the induction step, let n ≥ 0, let ξ̃ ∈ (C̃_A C̃_P)^n {(w, s)(w, τ̃)} be a partial play of G(M̃, M, v), and let ξ ∈ (C_A C_P)^n {(w, s)(w, τ)} be a partial f_A-play of G(M, M, v) such that ξ ↾ C_A = ξ̃ ↾ C̃_A. From Lemma 3.2 it follows that there exists a movement (D, s′) ∈ τ ∩ τ̃. Pathfinder chooses (D, s′) as his next move, resulting in a game position (w′, s′) where η^M(w, w′) = D, i.e., f_P(ξ̃) = (w′, s′). Clearly, the sequences ξ(w′, s′) and ξ̃(w′, s′) are equal when restricting them to positions from C_A and C̃_A, respectively.

Thus, for every f_P-play ξ̃ in G(M̃, M, v), there exists an f_A-play ξ in G(M, M, v) such that μ(ξ) = μ(ξ̃). Since f_A is a winning strategy, ξ is a play won by Automaton in G(M, M, v). From Remark 3.5 it follows that the play ξ̃ in G(M̃, M, v) is won by Pathfinder. Hence, we showed that Pathfinder has a winning strategy in the game G(M̃, M, v).

Lemma 3.14. If Pathfinder has a winning strategy in G(M̃, M, v), then Automaton has a winning strategy in the game G(M, M, v).

Proof. Let f_P be a winning strategy of Pathfinder in the game G(M̃, M, v). By Prop. 3.10, we may assume that f_P is memoryless. We show that there exists a winning strategy f_A of Automaton ensuring that, for every f_A-play ξ in the game G(M, M, v), there exists an f_P-play ξ̃ in G(M̃, M, v) such that ξ ↾ C_A = ξ̃ ↾ C̃_A. The initial positions of the games G(M, M, v) and G(M̃, M, v) are the same. For the induction step, let ξ ∈ (C_A C_P)^n {(w, s)} be a partial play of G(M, M, v), and ξ̃ ∈ (C̃_A C̃_P)^n {(w, s)} be a partial f_P-play in G(M̃, M, v) such that ξ ↾ C_A = ξ̃ ↾ C̃_A. We define

X = {(D, s′) ∈ Dir × S | there exist τ̃ ∈ ⟦δ̃(s, λ^M(w))⟧ and w′ ∈ V^M such that f_P(w, τ̃) = (w′, s′) and η^M(w, w′) = D}

to be the set of the possible f_P-conform moves of Pathfinder after Automaton's next move in the play ξ̃.
We claim that there exists a transition τ ∈ ⟦δ(s, λ^M(w))⟧ with τ ⊆ X. Towards a contradiction, suppose there is no such τ. Then, for all τ′ ∈ ⟦δ(s, λ^M(w))⟧, there exists a movement (D_{τ′}, s_{τ′}) ∈ τ′ with (D_{τ′}, s_{τ′}) ∉ X. If Z = {(D_{τ′}, s_{τ′}) | τ′ ∈ ⟦δ(s, λ^M(w))⟧}, then Z ∩ Y ≠ ∅ for all Y ∈ mod(δ(s, λ^M(w))). From Lemma 3.3 it follows that Z ∈ mod(δ̃(s, λ^M(w))). Hence, there exists a transition τ̃′′ ∈ ⟦δ̃(s, λ^M(w))⟧ with τ̃′′ ⊆ Z. However, we then have τ̃′′ ∩ X = ∅, which is a contradiction to our definition of X.

Automaton chooses the above transition τ with τ ⊆ X as his next move, resulting in a game position (w, τ) in the game G(M, M, v), i.e., f_A(ξ) = (w, τ). For every move (D, s′) of Pathfinder in G(M, M, v), there exists a τ̃ ∈ ⟦δ̃(s, λ^M(w))⟧ with f_P(w, τ̃) = (w′, s′) and η^M(w, w′) = D. This follows from the fact that (D, s′) ∈ X. Clearly, the sequences ξ(w, τ)(w′, s′) and ξ̃(w, τ̃)(w′, s′) are equal when restricting them to positions from C_A and C̃_A, respectively.

Let ξ be an f_A-play in G(M, M, v). There exists an f_P-play ξ̃ in G(M̃, M, v) with ξ ↾ C_A = ξ̃ ↾ C̃_A. Since μ(ξ) = μ(ξ̃) and since ξ̃ is a play won by Pathfinder in G(M̃, M, v), the play ξ in G(M, M, v) must be won by Automaton (by Remark 3.5). Thus, f_A is a winning strategy for Automaton in G(M, M, v).

We are now able to prove our main theorem from this section.

Proof of Theorem 3.6. By Lemma 3.11 and Lemma 3.12, the pointed MSC (M, v) is accepted by M if and only if Automaton has a winning strategy in G(M, M, v). By Lemmas 3.13 and 3.14, the latter is the case if and only if Pathfinder has a winning strategy in the game G(M̃, M, v). From Prop. 3.10 it follows that this is the case if and only if Automaton has no winning strategy in G(M̃, M, v), i.e., M̃ does not accept (M, v) (again by Lemmas 3.11 and 3.12).
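The game-theoretic characterization above can be exercised on a deliberately tiny instance. The sketch below is a toy, not the paper's construction: the structure is a finite word with directions id and next (as for the word automata of Section 6), the transition function is given directly by its lists of minimal models, and the game of Definition 3.7 is solved by backward induction. All plays on this instance are finite, so the parity condition is vacuous and a player loses exactly when it is stuck.

```python
def automaton_wins(word, delta, pos, state):
    """Decide the game from position (pos, state) by backward induction.

    delta maps (state, letter) to the list of transitions tau in
    [[delta(state, letter)]]; each tau is a frozenset of (direction, state)
    moves, with directions 'id' (stay) and 'next' (step right)."""
    for tau in delta.get((state, word[pos]), []):
        # Automaton may move to (pos, tau) only if every move in tau has a
        # matching target event (cf. the definition of E_A).
        targets = []
        for d, s2 in tau:
            q = pos if d == 'id' else pos + 1
            if q >= len(word):
                break  # illegal transition: required target event missing
            targets.append((q, s2))
        else:
            # Pathfinder now picks any move from tau; if tau is empty, he is
            # stuck and Automaton wins immediately (all() over nothing).
            if all(automaton_wins(word, delta, q, s2) for q, s2 in targets):
                return True
    return False  # no winning transition: Automaton is stuck or loses

# A toy automaton claiming "the current letter is a and the next one is b":
delta = {
    ('s0', 'a'): [frozenset({('next', 's1')})],
    ('s1', 'b'): [frozenset()],  # 'tt': the empty transition
}
```

On this instance, Automaton wins on "ab" from (0, s0), but not on "aa" (he gets stuck in s1) and not on "a" (the required next-event does not exist). The recursion terminates because this particular instance is acyclic.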
Figure 4: Illustrations of the local MSCAs M_σ (left side) and M_⟨proc⟩tt (right side).

Figure 5: Illustration of M_⟨π1; π2⟩tt.

4. Translation of Local CRPDL Formulas

In this section, we show that, for every local CRPDL formula α, one can compute a local MSCA M_α in polynomial time which exactly accepts the set of models of α. More formally:

Theorem 4.1. From a local formula α, one can construct in time poly(|α|) a local MSCA M_α such that, for all pointed MSCs (M, v), we have M, v ⊨ α if and only if (M, v) ∈ L(M_α). The size of M_α is linear in the size of α.

The rest of this section prepares the proof of the above theorem. The actual proof can be found on page 23.

4.1. Construction. If α is a local formula, then we distinguish the following cases:

Case α = σ. We define M_σ = ({ι, c}, δ, ι, c, κ) where κ(ι) = 1, κ(c) = 0, and

δ(s, σ′) = (id, c)   if σ = σ′ and s = ι,
           ⊥        otherwise,

for all s ∈ {ι, c} and σ′ ∈ Σ. The local MSCA M_σ is depicted on the left side of Fig. 4.

Case α = ¬β. We define M_¬β to be the dual automaton of M_β (cf. Definition 3.4).

Case α = ⟨D⟩tt with D ∈ Dir (the set of directions). We define M_⟨D⟩tt = ({ι, c}, δ, ι, c, κ) where κ(ι) = 1, κ(c) = 0, and

δ(s, σ) = (D, c)   if s = ι,
          ⊥       otherwise,

for all s ∈ {ι, c} and σ ∈ Σ. On the right side of Fig. 4, there is an illustration of M_⟨proc⟩tt.

Figure 6: Illustration of the local MSCA M_⟨{β}⟩tt.

Figure 7: Illustration of M_⟨π1 + π2⟩tt.

Case α = ⟨π1; π2⟩tt.
If M_⟨πi⟩tt = (S_i, δ_i, ι_i, c_i, κ_i) for i ∈ [2] and δ_1(c_1, σ) = ⊥ for all σ ∈ Σ, then we define M_⟨π1; π2⟩tt to be the local MSCA (S, δ, ι_1, c_2, κ) where S = S_1 ⊎ S_2, κ = κ_1 ∪ κ_2, and

δ(s, σ) = (id, ι_2)    if s = c_1,
          δ_1(s, σ)   if s ∈ S_1 \ {c_1},
          δ_2(s, σ)   if s ∈ S_2,

for all s ∈ S and σ ∈ Σ. Figure 5 shows an illustration of M_⟨π1; π2⟩tt.

The automaton M_⟨π1; π2⟩tt is the concatenation of the local MSCAs M_⟨π1⟩tt and M_⟨π2⟩tt. Intuitively, M_⟨π1; π2⟩tt starts a copy of M_⟨π1⟩tt and, when this copy changes into its concatenation state c_1, the automaton M_⟨π1; π2⟩tt proceeds with starting a copy of the local MSCA M_⟨π2⟩tt. Note that M_⟨π1; π2⟩tt is forced to start the copy of M_⟨π2⟩tt since runs of local MSCAs are maximal by definition (see Definition 2.21) and we have {(id, ι_2)} ∈ ⟦δ(c_1, σ)⟧ for every σ ∈ Σ.

Case α = ⟨{β}⟩tt. If M_β = (S′, δ′, ι′, c′, κ′), then we define M_⟨{β}⟩tt = (S, δ, ι, c, κ) where S = S′ ⊎ {ι, c}, κ = κ′ ∪ {(ι, 1), (c, 0)}, and

δ(s, σ) = (id, ι′) ∧ (id, c)   if s = ι,
          ⊥                   if s = c,
          δ′(s, σ)            if s ∈ S′,

for all s ∈ S and σ ∈ Σ. The automaton M_⟨{β}⟩tt is depicted in Figure 6. Intuitively, the local MSCA M_⟨{β}⟩tt starts M_β to test whether M, v ⊨ β holds and, at the same time, changes into its concatenation state.

Figure 8: Illustration of the local MSCA M_⟨π∗⟩tt.

Case α = ⟨π1 + π2⟩tt. If M_⟨πi⟩tt = (S_i, δ_i, ι_i, c_i, κ_i) and δ_i(c_i, σ) = ⊥ for all i ∈ [2] and σ ∈ Σ, then we define M_⟨π1 + π2⟩tt to be the local MSCA (S, δ, ι, c, κ) where S = S_1 ⊎ S_2 ⊎ {ι, c}, κ = κ_1 ∪ κ_2 ∪ {(ι, 1), (c, 0)}, and

δ(s, σ) = (id, ι_1) ∨ (id, ι_2)   if s = ι,
          (id, c)                 if s = c_i and i ∈ [2],
          δ_i(s, σ)               if s ∈ S_i \ {c_i} and i ∈ [2],
          ⊥                      if s = c,

for all s ∈ S and σ ∈ Σ. The local MSCA M_⟨π1 + π2⟩tt is visualized in Fig. 7.

Case α = ⟨π∗⟩tt.
If M_⟨π⟩tt = (S′, δ′, ι′, c′, κ′) and δ′(c′, σ) = ⊥ for all σ ∈ Σ, then we set M_⟨π∗⟩tt = (S, δ, ι, c, κ) where S = S′ ⊎ {ι, c}, κ and κ′ coincide on S′ \ {c′}, κ(s) = 1 if s ∈ {ι, c′}, κ(c) = 0, and

δ(s, σ) = (id, ι)             if s = c′,
          (id, ι′) ∨ (id, c)  if s = ι,
          ⊥                  if s = c,
          δ′(s, σ)            if s ∈ S′ \ {c′},

for all s ∈ S and σ ∈ Σ. See Fig. 8 for a visualization of M_⟨π∗⟩tt.

Intuitively, the local MSCA M_⟨π∗⟩tt executes a copy of the automaton M_⟨π⟩tt and, every time this copy changes into its concatenation state c′, the local MSCA M_⟨π∗⟩tt nondeterministically decides whether it restarts this copy again or changes into the concatenation state c.

Case α = ⟨π⟩ω. If M_⟨π⟩tt = (S, δ′, ι, c, κ) and δ′(c, σ) = ⊥ for all σ ∈ Σ, then we set M_⟨π⟩ω = (S, δ, ι, c, κ) where

δ(s, σ) = (id, ι)    if s = c,
          δ′(s, σ)   if s ∈ S \ {c},

for all s ∈ S and σ ∈ Σ.

4.2. Concatenation States. In this section, we prove a technical proposition stating that, for all path formulas α, every accepting run of the local MSCA M_α exhibits exactly one configuration labelled by the concatenation state. It will be of use in Sect. 4.3 to show the correctness of our construction.

Firstly, we introduce the notion of main states. A state s is called a main state if the concatenation state can be reached from s. The intuition behind this type of states is the following: If π is a path expression and ρ is an accepting run of M_⟨π⟩tt, then ρ exhibits one main branch b by which the automaton "processes" the path expression π. The label of b solely consists of the not yet formally defined main states. In all the other branches of ρ, i.e., in the branches which fork from b, the automaton basically executes tests of the form {α}. All these branches are labelled by non-main states.

Definition 4.2. Let M = (S, δ, ι, c, κ) be a local MSCA and s ∈ S.
We inductively define the set of main states ms(M) of M: ms(M) is the least set such that, for all s ∈ S, we have s ∈ ms(M) if and only if
(1) s = c, or
(2) there exist s′ ∈ ms(M), σ ∈ Σ, D ∈ Dir, and τ ∈ ⟦δ(s, σ)⟧ such that (D, s′) ∈ τ.

By examining our construction, one can make the following two simple observations.

Remark 4.3. If α is a path formula and M_α = (S, δ, ι, c, κ), the following conditions hold:
(1) we have δ(c, σ) = ⊥ for every σ ∈ Σ;
(2) for all s ∈ ms(M_α), we have κ(s) = 0 if s = c and κ(s) = 1 otherwise.

If α is a path formula of the form ⟨π1; π2⟩tt, then, in our construction of M_α, we required δ_1(c_1, σ) = ⊥ for all σ ∈ Σ, where δ_1 is the transition function and c_1 is the concatenation state of M_⟨π1⟩tt. It follows from the above observation (1) that our construction can be applied to all formulas of the form ⟨π1; π2⟩tt. Similarly, this holds for our construction of M_α in the cases α = ⟨π1 + π2⟩tt, α = ⟨π∗⟩tt, and α = ⟨π⟩ω.

Lemma 4.4. If α is a path formula and M_α = (S, δ, ι, c, κ), then the following two conditions hold:
(a) ι ∈ ms(M_α);
(b) for all s ∈ ms(M_α), σ ∈ Σ, and τ ∈ ⟦δ(s, σ)⟧, we have

|τ ∩ (Dir × ms(M_α))| = 1. (4.1)

Intuitively, the above lemma states that every run of M_α exhibits exactly one branch labelled solely by main states and that all other configurations of this run which are not part of this branch are labelled by non-main states.

Proof. By simple inspection, our claim follows for the cases α = ⟨D⟩tt with D ∈ Dir and α = ⟨{β}⟩tt. As our induction hypothesis, let us assume that the above lemma holds for M_⟨πi⟩tt = (S_i, δ_i, ι_i, c_i, κ_i) where i ∈ [2]. If α = ⟨π1; π2⟩tt, then it can be easily checked that ms(M_α) = ms(M_⟨π1⟩tt) ∪ ms(M_⟨π2⟩tt). Hence, ι = ι_1 ∈ ms(M_α) and, therefore, property (a) is fulfilled. Now, let τ ∈ ⟦δ(s, σ)⟧ for some s ∈ S and σ ∈ Σ. Then τ = {(id, ι_2)} (if s = c_1), τ ∈ ⟦δ_1(s, σ)⟧, or τ ∈ ⟦δ_2(s, σ)⟧.
Together with our induction hypothesis, it follows that (4.1) holds for τ. Now, let us consider the case α = ⟨π1 + π2⟩tt. By easy inspection it follows that

ms(M_α) = {ι, c} ∪ ms(M_⟨π1⟩tt) ∪ ms(M_⟨π2⟩tt).

Hence, property (a) follows. If τ ∈ ⟦δ(s, σ)⟧ for some s ∈ S and σ ∈ Σ, then τ = {(id, ι_1)}, τ = {(id, ι_2)}, τ = {(id, c)}, τ ∈ ⟦δ_1(s, σ)⟧, or τ ∈ ⟦δ_2(s, σ)⟧. Property (b) follows from our induction hypothesis.

Finally, we need to deal with the case α = ⟨π∗⟩tt. For this, we assume that the above lemma holds for M_⟨π⟩tt = (S′, δ′, ι′, c′, κ′). Again, it can be easily verified that ms(M_α) = {ι, c} ∪ ms(M_⟨π⟩tt). Thus, property (a) holds. Now, let τ ∈ ⟦δ(s, σ)⟧ for some s ∈ S and σ ∈ Σ. We have τ = {(id, ι)}, τ = {(id, ι′)}, τ = {(id, c)}, or τ ∈ ⟦δ′(s, σ)⟧. Property (b) follows from our induction hypothesis.

Proposition 4.5. Let α be a path formula and ρ = (C, E, r, μ, ν) be an accepting run of M_α = (S, δ, ι, c, κ). Then there exists exactly one configuration from C, denoted by cs(ρ), with μ(cs(ρ)) = c.

Proof. It follows from Lemma 4.4 that all configurations x ∈ C with μ(x) ∈ ms(M_α) form a unique branch b = x_1 x_2 x_3 . . . ∈ C^∞ of ρ. Since ρ is accepting, b must be accepting. It follows from Remark 4.3 that μ(b) ∈ (ms(M_α) \ {c})∗ {c}. Therefore, every accepting run of M_α contains exactly one configuration labelled by c.

4.3. Correctness. Let α be a local formula. We show by induction over the construction of α that L(M_α) = L(α). The following claim is used as the induction hypothesis of our proof. Recall that reach_M(v, π) is the set of all events which can be reached from v by a path described by π in the MSC M.

Claim 4.6. Let α be a path formula. For all MSCs M and events v, v′ ∈ V^M, we have v′ ∈ reach_M(v, α) if and only if there exists an accepting run ρ of M_α on (M, v) with ν(cs(ρ)) = v′.
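Definition 4.2 is a least-fixpoint definition, so ms(M) can be computed by a simple iteration. The sketch below assumes a dictionary encoding mapping a pair (state, letter) to the list of transitions in ⟦δ(state, letter)⟧, each transition being a set of (direction, state) moves; all names are illustrative, not the paper's.

```python
def main_states(conc_state, delta_models):
    """ms(M): the least set of states from which the concatenation state is
    reachable via moves occurring in some transition (Definition 4.2)."""
    ms = {conc_state}
    changed = True
    while changed:
        changed = False
        for (s, _sigma), transitions in delta_models.items():
            # s is a main state if some transition of s contains a move
            # into a state that is already known to be main.
            if s not in ms and any(s2 in ms for tau in transitions
                                   for _d, s2 in tau):
                ms.add(s)
                changed = True
    return ms

# A shape loosely mirroring the concatenation construction: the main chain
# i1 -> c1 -> i2 -> c2, plus a test branch t that never reaches c2.
delta_models = {
    ('i1', 'a'): [{('id', 't'), ('next', 'c1')}],
    ('c1', 'a'): [{('id', 'i2')}],
    ('i2', 'a'): [{('next', 'c2')}],
    ('t', 'a'): [],
}
```

Here main_states('c2', delta_models) yields {'i1', 'c1', 'i2', 'c2'}; the test state t is not a main state, matching the intuition that test branches fork off the unique main branch.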
The following four technical lemmas deal with the correctness of the constructions of the local MSCAs M_⟨π1; π2⟩tt and M_⟨π∗⟩tt.

Lemma 4.7. Let M be an MSC, v_1, v′ ∈ V^M, and π1, π2 be path expressions. If Claim 4.6 holds for ⟨π1⟩tt and ⟨π2⟩tt and we have v′ ∈ reach_M(v_1, π1; π2), then there exists an accepting run ρ = (C, E, r, μ, ν) of M_⟨π1; π2⟩tt on (M, v_1) with ν(cs(ρ)) = v′.

Proof. Let M_⟨π1; π2⟩tt = (S, δ, ι, c, κ) and M_⟨πi⟩tt = (S_i, δ_i, ι_i, c_i, κ_i) for all i ∈ [2]. If we have v′ ∈ reach_M(v_1, π1; π2), then, by definition, there exists an event v_2 ∈ V^M such that v_2 ∈ reach_M(v_1, π1) and v′ ∈ reach_M(v_2, π2). It follows from our assumption that there exists an accepting run ρ_1 = (C_1, E_1, r_1, μ_1, ν_1) of the local MSCA M_⟨π1⟩tt on (M, v_1) with ν_1(cs(ρ_1)) = v_2 and that there exists an accepting run ρ_2 = (C_2, E_2, r_2, μ_2, ν_2) of M_⟨π2⟩tt on (M, v_2) with ν_2(cs(ρ_2)) = v′. Consider the S-labelled tree ρ = (C_1 ⊎ C_2, E, r_1, μ, ν) where E = E_1 ∪ E_2 ∪ {(cs(ρ_1), r_2)}, μ = μ_1 ∪ μ_2, and ν = ν_1 ∪ ν_2. It can be easily checked that ρ is a run of the local MSCA M_⟨π1; π2⟩tt on (M, v_1) with cs(ρ) = cs(ρ_2). In Fig. 9, the run ρ is depicted, where cs(ρ_i) is denoted by x_i for i ∈ [2].

It remains to show that ρ is accepting. Let b be a branch of ρ. We distinguish two cases: If b ∈ C_1^∞, then b is also a branch of ρ_1. Since ρ_1 is accepting, the branch b is accepting in M_⟨π1⟩tt. Since κ_1 ⊆ κ and μ_1 ⊆ μ, b is accepting in M_⟨π1; π2⟩tt, too. Otherwise (i.e., if b ∈ C_1^+ {r_2} C_2^∞), there exists a suffix of b which is an accepting branch in ρ_2. Because of this fact, μ_2 ⊆ μ, and κ_2 ⊆ κ, the branch b is also accepting in M_⟨π1; π2⟩tt. Hence, ρ is an accepting run of the automaton M_⟨π1; π2⟩tt on (M, v_1) with ν(cs(ρ)) = ν_2(cs(ρ_2)) = v′.

Figure 9: The run ρ of M_⟨π1; π2⟩tt.

Lemma 4.8.
Let M be an MSC, v, v′ ∈ V^M, and π1, π2 be path expressions. If Claim 4.6 holds for ⟨π1⟩tt and ⟨π2⟩tt and there exists an accepting run ρ = (C, E, r_1, μ, ν) of M_⟨π1; π2⟩tt on (M, v) with ν(cs(ρ)) = v′, then v′ ∈ reach_M(v, π1; π2).

Proof. Let M_⟨π1; π2⟩tt = (S, δ, ι, c, κ) and M_⟨πi⟩tt = (S_i, δ_i, ι_i, c_i, κ_i) for all i ∈ [2]. Since μ(cs(ρ)) = c and c = c_2 ∈ S_2, there has to exist a configuration r_2 ∈ C with μ(r_2) = ι_2. Towards a contradiction, let us assume that there exists another configuration r′_2 ∈ C with μ(r′_2) = ι_2. Because ι_2 is a main state in M_⟨π1; π2⟩tt (see the proof of Lemma 4.4) and due to Lemma 4.4, r_2 and r′_2 must occur in a single branch of ρ. This is a contradiction to the fact that ι_2 is not reachable from ι_2 in M_⟨π1; π2⟩tt; the latter follows by simple inspection of the transition function of M_⟨π1; π2⟩tt. Let C_2 = {y ∈ C | (r_2, y) ∈ E∗} and C_1 = C \ C_2. It can be easily checked that the S_i-labelled tree ρ_i = ρ ↾ C_i is a run of M_⟨πi⟩tt for all i ∈ [2]. From the definition of the transition function δ, it follows that there exists a configuration x_1 ∈ C_1 with μ(x_1) = c_1, E_ρ(x_1) = {r_2}, and ν(x_1) = ν(r_2). Figure 9 shows a depiction of the run ρ consisting of ρ_1 and ρ_2, where cs(ρ_2) = x_2.

If b is a branch of ρ_2 and b′ is the unique path in ρ from r_1 to x_1, then b′b is a branch of ρ. Since ρ is accepting, μ_2 ⊆ μ, and κ_2 ⊆ κ, b is accepting in M_⟨π2⟩tt. Hence, ρ_2 is an accepting run of M_⟨π2⟩tt on (M, ν(x_1)) with cs(ρ_2) = cs(ρ). Now, let b be a branch of ρ_1. If b ∈ (C_1 \ {x_1})^∞, then b is a branch of ρ with μ(b) ∈ S_1^∞. Since ρ is accepting, μ_1 ⊆ μ, and κ_1 ⊆ κ, b is accepting in M_⟨π1⟩tt. Otherwise (i.e., if b ∈ C_1∗ {x_1}), b ends in a configuration labelled by c_1. By Remark 4.3, κ_1(c_1) is even and, therefore, b is accepting in ρ_1. Hence, ρ_1 is an accepting run of M_⟨π1⟩tt on (M, v) with cs(ρ_1) = x_1.
By our assumption, it follows that ν(x_1) ∈ reach_M(v, π1) and v′ ∈ reach_M(ν(x_1), π2). Therefore, we have v′ ∈ reach_M(v, π1; π2).

Lemma 4.9. Let M be an MSC, v, v′ ∈ V^M, and π be a path expression. If Claim 4.6 holds for ⟨π⟩tt and we have v′ ∈ reach_M(v, π∗), then there exists an accepting run ρ = (C, E, r, μ, ν) of M_⟨π∗⟩tt on (M, v) with ν(cs(ρ)) = v′.

Proof. Let M_⟨π∗⟩tt = (S, δ, ι, c, κ) and M_⟨π⟩tt = (S′, δ′, ι′, c′, κ′). Since v′ ∈ reach_M(v, π∗), there exist an n ≥ 0 and v_1, v_2, . . . , v_{n+1} ∈ V^M such that v_1 = v, v_{n+1} = v′, and v_{i+1} ∈ reach_M(v_i, π) for all i ∈ [n]. If n = 0, then the lemma follows by easy inspection of the construction of M_⟨π∗⟩tt. Now, let us assume that n ≥ 1. Since Claim 4.6 holds for ⟨π⟩tt, we can assume that there exist, for all i ∈ [n], accepting runs ρ_i = (C_i, E_i, r_i, μ_i, ν_i) of the automaton M_⟨π⟩tt on the pointed MSC (M, v_i) with ν_i(cs(ρ_i)) = v_{i+1}. Without loss of generality, we may assume that C_i ∩ C_j = ∅ for all i ≠ j (note that we can enforce this by renaming the nodes of the C_i's). Let x_i = cs(ρ_i) for all i ∈ [n]. The S-labelled tree ρ = (C, E, y_1, μ, ν) where

C = ⋃_{i∈[n]} C_i ⊎ {y_1, y_2, . . . , y_{n+1}, z},
E = ⋃_{i∈[n]} E_i ∪ {(y_i, r_i) | i ∈ [n]} ∪ {(x_i, y_{i+1}) | i ∈ [n]} ∪ {(y_{n+1}, z)},
μ = ⋃_{i∈[n]} μ_i ∪ {(y_i, ι) | i ∈ [n+1]} ∪ {(z, c)},
ν = ⋃_{i∈[n]} ν_i ∪ {(y_i, v_i) | i ∈ [n+1]} ∪ {(z, v′)}

is a run of M_⟨π∗⟩tt on (M, v) with ν(cs(ρ)) = v′; this follows by an easy inspection of the construction of M_⟨π∗⟩tt. Figure 10 shows a depiction of ρ for the case n = 3.

Figure 10: The run ρ of M_⟨π∗⟩tt consisting of three runs of M_⟨π⟩tt (n = 3).

It remains to show that ρ is accepting. Let b be a branch of ρ.
If b ∈ (C \ {z})^∞, then there exist a suffix b′ of b and an index i ∈ [n] such that b′ is a branch of ρ_i. Note that we have μ(b′) ∈ (S′ \ {c′})^∞. Since ρ_i is accepting, b′ is accepting in M_⟨π⟩tt. Since μ_i ⊆ μ and κ′ ↾ (S′ \ {c′}) ⊆ κ, it follows that b is accepting in M_⟨π∗⟩tt. Otherwise (i.e., if b ∈ C∗ {z}), we have μ(b) ∈ S∗ {c}. Since κ(c) is even, b is accepting in M_⟨π∗⟩tt. Hence, ρ is an accepting run of M_⟨π∗⟩tt on the pointed MSC (M, v) with ν(cs(ρ)) = v′.

Lemma 4.10. Let M be an MSC, v, v′ ∈ V^M, and π be a path expression. If Claim 4.6 holds for ⟨π⟩tt and there exists an accepting run ρ = (C, E, y_1, μ, ν) of M_⟨π∗⟩tt on (M, v) with ν(cs(ρ)) = v′, then v′ ∈ reach_M(v, π∗).

Proof. Let M_⟨π∗⟩tt = (S, δ, ι, c, κ) and M_⟨π⟩tt = (S′, δ′, ι′, c′, κ′). Let R be the set of all configurations from C labelled by ι′. If R = ∅, then ρ consists of exactly one branch b = y_1 z with μ(b) = ιc and ν(y_1) = ν(z). This can be easily verified by inspecting the transition function δ. From ν(y_1) = v and ν(z) = ν(cs(ρ)) = v′, it follows that v = v′. Hence, v′ ∈ reach_M(v, π∗).

Now, let us assume that R ≠ ∅. It follows from ι′ ∈ ms(M_⟨π∗⟩tt) (see the last paragraph of the proof of Lemma 4.4) and Lemma 4.4 that all configurations from R occur in a unique finite branch b = z_1 z_2 . . . z_ℓ of ρ. Without loss of generality, we can assume that R = {r_1, r_2, . . . , r_n} such that there exist i_1 < i_2 < . . . < i_n with z_{i_k} = r_k for all k ∈ [n]. By examining the transition function δ, one can see that:
• For every i ∈ [n], there exists a y_i ∈ C with E_ρ(y_i) = {r_i} and μ(y_i) = ι.
• There exists a configuration y_{n+1} ∈ C with E_ρ(y_{n+1}) = {cs(ρ)} and μ(y_{n+1}) = ι.
• For every i ∈ [n], there exists a configuration x_i with E_ρ(x_i) = {y_{i+1}} and μ(x_i) = c′.

Let C_i = {x ∈ C | (r_i, x) ∈ E∗} \ {x ∈ C | (y_{i+1}, x) ∈ E∗} for all i ∈ [n]. It can be easily verified that the S′-labelled tree ρ_i = (C_i, E_i, r_i, μ_i, ν_i) = ρ ↾ C_i is a run of M_⟨π⟩tt on (M, ν(r_i)) with cs(ρ_i) = x_i. Figure 10 shows a depiction of the runs ρ_1, ρ_2, . . . , ρ_n forming the run ρ for the case n = 3.

We now show that ρ_i is an accepting run for every i ∈ [n]. Let i ∈ [n] and b be a branch of ρ_i. If b ∈ (C_i \ {x_i})^∞, then there exists a path b′ from y_1 to y_i in ρ such that b′b is a branch of ρ. Since ρ is accepting, b′b is accepting in M_⟨π∗⟩tt. Because of μ_i ⊆ μ, μ(b) ∈ (S′ \ {c′})^∞, and κ′ ↾ (S′ \ {c′}) ⊆ κ, b is accepting in M_⟨π⟩tt. If b ∈ C_i∗ {x_i}, then μ(b) ∈ S′∗ {c′}. By Remark 4.3, κ′(c′) is even and, therefore, b is accepting in M_⟨π⟩tt. Hence, ρ_i is an accepting run of M_⟨π⟩tt on (M, ν(r_i)) with cs(ρ_i) = x_i.

Since Claim 4.6 holds for ⟨π⟩tt, we can assume that ν(x_i) ∈ reach_M(ν(r_i), π) for every i ∈ [n]. By checking the definition of the transition function δ, one can easily verify that ν(x_i) = ν(y_{i+1}) and ν(y_i) = ν(r_i) hold for every i ∈ [n]. Hence, we have ν(y_{i+1}) ∈ reach_M(ν(y_i), π) for every i ∈ [n]. From ν(y_1) = v and ν(y_{n+1}) = ν(cs(ρ)) = v′, it follows that v′ ∈ reach_M(v, π∗).

The following lemma, dealing with the correctness of the construction of M_⟨π⟩ω, finishes the preparatory work needed in order to prove Theorem 4.1.

Lemma 4.11. Let M be an MSC, v ∈ V^M, and π be a path expression. If Claim 4.6 holds for ⟨π⟩tt, then

M, v ⊨ ⟨π⟩ω ⇐⇒ (M, v) ∈ L(M_⟨π⟩ω).

Proof. Let us assume that M, v ⊨ ⟨π⟩ω. Then there exist v_1, v_2, v_3, . . . ∈ V^M such that v_1 = v and v_{i+1} ∈ reach_M(v_i, π) for all i ≥ 1.
Since Claim 4.6 holds for ⟨π⟩tt, there exists an accepting run ρ_i = (C_i, E_i, r_i, μ_i, ν_i) of M_⟨π⟩tt on (M, v_i) with ν_i(cs(ρ_i)) = v_{i+1} for every i ≥ 1. The S-labelled tree ρ = (C, E, r_1, μ, ν) with C = ⨄_{i≥1} C_i, E = ⋃_{i≥1} E_i ∪ {(cs(ρ_i), r_{i+1}) | i ≥ 1}, μ = ⋃_{i≥1} μ_i, and ν = ⋃_{i≥1} ν_i is a run of M_⟨π⟩ω on (M, v). Let b be a branch of ρ. If there exists an i ≥ 1 such that b ∈ (C \ {r_i})^∞, then there exists a suffix b′ of b such that b′ is an accepting branch of ρ_j for some j with 1 ≤ j < i. Hence, b is accepting in ρ. Otherwise (i.e., b is a branch going through r_i for every i ≥ 1), min{κ(s) | s ∈ inf(b)} = κ(c) = 0. Hence, b is accepting and, therefore, ρ is accepting.

The converse can be shown analogously.

We are now able to prove our main theorem of this section.

Proof of Theorem 4.1. By an easy analysis of our construction, one can see that, for all local formulas α, the automaton M_α can be constructed in polynomial time and that its size is linear in the size of α.

Now, we inductively show that L(α) = L(M_α) for every local formula α. Let us first consider the base cases. If α = σ with σ ∈ Σ, then it is easily checked that L(α) = L(M_α). By simple inspection, it also follows that Claim 4.6 holds for α = ⟨D⟩tt with D ∈ Dir. Regarding the induction step, we need to distinguish the following cases: If α = ¬β, the claim follows from Theorem 3.6. By Lemma 4.11, we have L(α) = L(M_α) for α = ⟨π⟩ω. Claim 4.6 holds for α = ⟨π1; π2⟩tt and α = ⟨π∗⟩tt because of Lemmas 4.7, 4.8, 4.9, and 4.10. Analogously, it can be shown that Claim 4.6 is also true for the cases α = ⟨{β}⟩tt and α = ⟨π1 + π2⟩tt. Note that we have M, v ⊨ α if and only if reach_M(v, α) ≠ ∅ for all pointed MSCs (M, v). Hence, L(α) = L(M_α) holds for the above path formulas.

Figure 11: Illustration of the local MSCA M_Eα.
Figure 12: Illustration of the local MSCA M_Aα.

5. Translation of Global CRPDL Formulas

In this section, we demonstrate that, for every global CRPDL formula ϕ of the form Eα or Aα, one can compute a global MSCA G_ϕ in polynomial time which exactly accepts the set of models of ϕ. Let ϕ be a global formula of the above form.

Case ϕ = Eα. If M_α = (S′, δ′, ι′, c, κ′), then we set M_Eα = (S, δ, ι, c, κ) where S = S′ ⊎ {ι, f}, κ(s) = κ′(s) for all s ∈ S′, κ(ι) = 1, κ(f) = 0, and, for all s ∈ S and σ ∈ Σ,

δ(s, σ) = (proc, ι) ∨ (id, ι′)   if s = ι,
          ⊥                     if s = f,
          δ′(s, σ)              otherwise.

Intuitively, the automaton M_Eα (depicted in Fig. 11) moves forward on a process finitely many times. At some event v, it nondeterministically decides to start the automaton M_α to check whether (M, v) ⊨ α holds.

Now, G_Eα = (M_Eα, I) is meant to work as follows: it nondeterministically chooses a process on which it executes a copy of M_Eα in state ι. On all the other processes it accepts immediately by starting M_Eα in the sink state f with rank 0. More formally, we let G_Eα = (M_Eα, I) where

I = {(s_1, s_2, . . . , s_{|P|}) | there exists p ∈ P such that s_p = ι and s_q = f for all q ≠ p}.

Case ϕ = Aα. If M_α = (S′, δ′, ι′, c, κ′), we set M_Aα = (S, δ, ι_1, c, κ) where S = S′ ⊎ {ι_1, ι_2}, κ(s) = κ′(s) for all s ∈ S′, κ(ι_1) = 1, κ(ι_2) = 0, and

δ(s, σ) = (id, ι_2) ∧ (id, ι′)   if s = ι_1,
          (proc, ι_1)            if s = ι_2,
          δ′(s, σ)              otherwise.

Informally speaking, the automaton M_Aα (depicted in Fig. 12) moves forward on a certain process p and checks, for every event v ∈ V^M_p of this process, if (M, v) ⊨ α holds. Note that, if M_Aα is in state ι_2 at an event v such that there exists a successor v′ of v on the same process, then M_Aα is forced to move to v′ and to change into the state ι_1.
That is due to the fact that runs of local MSCAs are maximal by definition (see Definition 2.21) and because we have {(proc, ι_1)} ∈ ⟦δ(ι_2, σ)⟧ for every σ ∈ Σ. We define G_{Aα} = (M_{Aα}, I) where I = {(ι_1, ι_1, ..., ι_1)}. That means G_{Aα} ensures (M, v) ⊨ α for every v ∈ M by starting M_{Aα} in the state ι_1 on every process.

Using Theorem 4.1 and by simple inspection of the above construction, the following theorem can be shown.

Theorem 5.1. From a global formula ϕ of the form ϕ = Eα or ϕ = Aα, one can construct in time poly(|ϕ|) a global MSCA G_ϕ such that, for all MSCs M, we have M ⊨ ϕ if and only if M ∈ L(G_ϕ). The size of G_ϕ is linear in the size of ϕ.

If ϕ is an arbitrary global formula, then we can also construct an equivalent global MSCA G_ϕ = (M, I). However, this time the space needed for our construction is exponential in the number of "global" conjunctions occurring in ϕ. In fact, the size of M is still linear in ϕ but |I| is exponential in the number of conjunctive connectives occurring outside of subformulas of the form Eα and Aα, respectively. When constructing a global MSCA from an arbitrary global formula, we need to distinguish the following two additional cases:

Case ϕ = ϕ_1 ∨ ϕ_2. Let G_{ϕ_i} = (M_i, I_i) and M_i = (S_i, δ_i, ι_i, c_i, κ_i) for all i ∈ [2]. Then we define G_{ϕ_1 ∨ ϕ_2} = (M, I) where M = (S, δ, ι_1, c_1, κ), S = S_1 ⊎ S_2, δ = δ_1 ∪ δ_2, κ = κ_1 ∪ κ_2, and I = I_1 ∪ I_2.

Case ϕ = ϕ_1 ∧ ϕ_2. Let G_{ϕ_i} = (M_i, I_i) and M_i = (S_i, δ_i, ι_i, c_i, κ_i) for all i ∈ [2]. We define G_{ϕ_1 ∧ ϕ_2} = (M, I) where I = {((s_1, s′_1), (s_2, s′_2), ..., (s_{|P|}, s′_{|P|})) | (s_1, s_2, ..., s_{|P|}) ∈ I_1 and (s′_1, s′_2, ..., s′_{|P|}) ∈ I_2}, M = (S_1 ⊎ S_2 ⊎ S, δ, ι_1, c_1, κ), S = {(s_1, s_2) | s_1 ∈ S_1, s_2 ∈ S_2}, κ = κ_1 ∪ κ_2 ∪ {(s, 0) | s ∈ S}, and, for all s ∈ S_1 ∪ S_2 ∪ S, s_1 ∈ S_1, s_2 ∈ S_2, and σ ∈ Σ:

δ(s, σ) = (id, s_1) ∧ (id, s_2) if s = (s_1, s_2) ∈ S, δ_1(s, σ) if s ∈ S_1, and δ_2(s, σ) if s ∈ S_2.

Together with Theorem 5.1, we obtain:

Corollary 5.2. From a global formula ϕ, one can construct in time exp(|ϕ|) a global MSCA G_ϕ such that, for all MSCs M, we have M ⊨ ϕ if and only if M ∈ L(G_ϕ). The size of G_ϕ is exponential in the size of ϕ.

6. The Satisfiability Problem

We strive for an algorithm that decides, given a global formula ϕ, whether L(ϕ) = ∅ holds. Unfortunately, the satisfiability problem of CRPDL is undecidable. This follows from results concerning Lamport diagrams, which can be easily transferred to MSCs [17]. However, if one only considers existentially B-bounded MSCs [20, 16, 10, 9], then the problem becomes decidable. Intuitively, an MSC M is existentially B-bounded if its events can be scheduled in such a way that, at every moment, no communication channel contains more than B pending messages (see definition below). The rest of this section prepares the proof of our main theorem, which is stated in the following. The proof itself can be found on page 32.

Theorem 6.1. The following problem is PSPACE-complete:
Input: B ∈ N (given in unary) and a global CRPDL formula ϕ
Question: Is there an existentially B-bounded MSC satisfying ϕ?

6.1. From MSCAs to Word Automata. In order to be able to give uniform definitions of automata over MSCs and words, respectively, we also consider words over an alphabet Γ as labelled relational structures. For this, we fix the set W = {prev, next, id} of directions.

Definition 6.2. Let Γ be an arbitrary alphabet.
A word-like structure over Γ is a structure W = (V^W, next^W, λ^W) where
• V^W is a set of positions,
• next^W ⊆ (V^W × V^W),
• λ^W : V^W → Γ is a labeling function,
• next^W is the direct successor relation of a linear order ⪯^W on V^W, and
• (V^W, ⪯^W) is finite or isomorphic to (N, ≤).
The word-like structure W induces a partial function η^W : (V^W × V^W) → W. For all v, v′ ∈ V^W, we define η^W(v, v′) = next if (v, v′) ∈ next^W, prev if (v′, v) ∈ next^W, id if v = v′, and undefined otherwise.

Every finite word W = γ_1 γ_2 ... γ_n ∈ Γ∗ gives rise to a unique (up to isomorphism) word-like structure where V^W = [n], next^W = {(i, i + 1) | 1 ≤ i < n}, and λ^W(i) = γ_i for all i ∈ [n]. Analogously, every infinite word W ∈ Γ^ω induces a word-like structure. In the following, we identify every word W ∈ Γ^∞ with its induced word-like structure. We now formalize the notion of existentially B-bounded MSCs.

Definition 6.3. If M is an MSC and W is a word, then W is a linearization of M if V^M = V^W, λ^M = λ^W, and (msg^M ∪ ⋃_{p∈P} proc^M_p)∗ ⊆ ⪯^W. The word W is B-bounded if we have

|{v′ | v′ ⪯^W v, λ^W(v′) = p!q}| − |{v′ | v′ ⪯^W v, λ^W(v′) = q?p}| ≤ B

for every v ∈ V^W and (p, q) ∈ Ch. An MSC M is existentially B-bounded if there exists a B-bounded linearization of M, i.e., if it allows for an execution with B-bounded channels.

Example 6.4. Let M be the MSC from Fig. 1. The word W = (1!2)(1!2)(1!2)(2?1)(2!1)(2?1)(2!1)(2?1)(2!1)(1?2)(1?2)(1?2) ∈ Σ∗ is a 3-bounded linearization of M. Note that parentheses have been introduced for readability. There is even a 1-bounded linearization of M: W′ = (1!2)(2?1)(1!2)(2!1)(2?1)(1!2)(1?2)(2!1)(2?1)(1?2)(2!1)(1?2) ∈ Σ∗. Hence, W′ witnesses the fact that M is existentially 1-bounded.

We define two-way alternating automata over words in the style of local MSCAs.

Definition 6.5.
A two-way alternating parity automaton (or 2APA for short) is a quadruple P = (S, δ, ι, κ) where
• S is a finite set of states,
• δ : (S × Σ) → B⁺(W × S) is a transition function,
• ι ∈ S is an initial state, and
• κ : S → {0, 1, ..., m − 1} is a ranking function with m ∈ N.
The size of P is |S| + |δ|. If |τ| = 1 for all τ ∈ ⟦δ(s, σ)⟧, s ∈ S, and σ ∈ Σ (i.e., P does not make use of universal branching), then P is called a two-way parity automaton (or 2PA). If W is a word, then the definition of an S-labelled tree over W is analogous to the definition of an S-labelled tree over a pointed MSC (cf. Definition 2.17). Furthermore, an (accepting) run of a 2APA is defined in a similar way as it is defined for a local MSCA (cf. Definitions 2.20, 2.21, and 2.22). By L(P), we denote the set of words W for which there exists an accepting run of P on (W, v) where v is the minimal element of V^W with respect to ⪯^W.

Now, let us fix a channel bound B ∈ N and the alphabet Γ = Σ × {0, 1, ..., B − 1}.

Definition 6.6. If W is a B-bounded word over Σ, then we associate with W the unique B-bounded word W_B over Γ where V^W = V^{W_B}, next^W = next^{W_B}, and, for every v ∈ V^W, we have λ^{W_B}(v) = (λ^W(v), i) with i = |{v′ ∈ V^W | v′ ≺^W v, λ^W(v) = λ^W(v′)}| mod B. That means that, in the second component of the labels in W_B, we count events labelled by the same action modulo B.

Example 6.7. Let W and W′ be the words from Example 6.4. For instance, W_3 is the word (1!2, 0)(1!2, 1)(1!2, 2)(2?1, 0)(2!1, 0)(2?1, 1)(2!1, 1)(2?1, 2)(2!1, 2)(1?2, 0)(1?2, 1)(1?2, 2) and W′_2 is given by (1!2, 0)(2?1, 0)(1!2, 1)(2!1, 0)(2?1, 1)(1!2, 0)(1?2, 0)(2!1, 1)(2?1, 0)(1?2, 1)(2!1, 0)(1?2, 0).

In W_B, we are able to quickly locate matching send and receive events. For example, if v is a send event of W_B labelled by (p!q, i), we just need to move to the smallest event v′ ∈ V^{W_B} (with respect to ⪯^{W_B}) with v ⪯^{W_B} v′ and λ^{W_B}(v′) = (q?p, i).

Let G = (M, I) be a global MSCA. We can construct a 2APA P_G = (S, δ, ι, κ) that accepts exactly the set of words W_B where W is a B-bounded linearization of an MSC from L(G). In order to construct P_G, there is one issue which needs to be addressed. Let M be an MSC and W be a B-bounded linearization of M. If v, v′ ∈ V^M with η^M(v, v′) = proc, then a local MSCA is capable of directly moving to v′. In general, this cannot be accomplished by a 2APA running on W_B since there may exist events v′′ ∈ V^M with v ≺^{W_B} v′′ ≺^{W_B} v′. To circumvent this limitation, the idea is to introduce transitions which allow the 2APA to move forward on W_B and skip non-relevant events until it reaches the event v′. Of course, we have to analogously deal with proc⁻¹, msg, and msg⁻¹ transitions of local MSCAs. More precisely, regarding the 2APA P_G, we use states of the form (s, p, next) to remember that we are searching for the next event on process p in the next-direction. In contrast, a state of the form (s, p!q, i, prev) means that we are looking for the nearest send event p!q indexed by i in the prev-direction. The first component is always used to remember the state from which we need to continue the simulation of the local MSCA M after finding the correct event. If M = (S′, δ′, ι′, κ′), then the set of states of P_G is the following:

S = {ι, t} ∪ S′ ∪ {(s, p, prev), (s, p, next) | s ∈ S′, p ∈ P} ∪ {(s, σ, i, prev), (s, σ, i, next) | s ∈ S′, σ ∈ Σ, 0 ≤ i < B}

The intuition for the states from I is as follows: From the initial state ι, the 2APA P_G nondeterministically changes into a global initial state (ι_1, ι_2, ..., ι_{|P|}) from I.
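Definitions 6.3 and 6.6 and the matching-event lookup can be illustrated executably. Assuming a toy encoding in which actions are strings like "1!2" and "2?1" (all helper names here are ours), a sketch is:

```python
from collections import defaultdict

def is_B_bounded(word, B):
    """Definition 6.3: in every prefix, the number of sends p!q minus
    the number of receives q?p stays at most B, for every channel (p, q)."""
    pending = defaultdict(int)           # channel (p, q) -> open sends
    for act in word:
        if "!" in act:
            p, q = act.split("!")        # p sends to q
            pending[(p, q)] += 1
        else:
            q, p = act.split("?")        # q receives from p
            pending[(p, q)] -= 1
        if any(n > B for n in pending.values()):
            return False
    return True

def label_mod_B(word, B):
    """Definition 6.6: attach to each event the number of earlier events
    carrying the same action, counted modulo B."""
    seen = defaultdict(int)
    out = []
    for act in word:
        out.append((act, seen[act] % B))
        seen[act] += 1
    return out

def matching_receive(labelled, v):
    """For a send event v labelled (p!q, i), return the position of the
    smallest later-or-equal event labelled (q?p, i)."""
    act, i = labelled[v]
    p, q = act.split("!")
    target = (q + "?" + p, i)
    return next(j for j in range(v, len(labelled)) if labelled[j] == target)
```

On the words of Example 6.4, `is_B_bounded` confirms that W is 3-bounded (but not 2-bounded) and W′ is 1-bounded, `label_mod_B(W, 3)` reproduces the indices of Example 6.7, and the send at position 0 of W is matched by the receive at position 3.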
That way, it simulates |P| many copies of M where the p-th copy of M is started in the state ι_p at the minimal event of process p (with respect to ⪯^M_p). More formally, for all γ ∈ Γ, we define

δ(ι, γ) = ⋁_{(ι_1, ..., ι_{|P|}) ∈ I} (id, (ι_1, 1, next)) ∧ (id, (ι_2, 2, next)) ∧ ... ∧ (id, (ι_{|P|}, |P|, next)).

Assume that the automaton P_G is in a state of the form (s, p, D) resp. (s, σ, i, D) at an event v. If λ^M(v) ∉ Σ_p × {0, ..., B − 1} resp. λ^M(v) ≠ (σ, i), i.e., if v is not the event at which the simulation of M needs to be continued, then we stay in the current state and move into direction D. Otherwise, we simulate a transition τ ∈ ⟦δ′(s, λ^M(v))⟧ of the local MSCA M in the following manner: If (proc, s) ∈ τ, then we change into the state (s, p, next) and move along the next-direction. If (proc⁻¹, s) ∈ τ, then we act analogously in the prev-direction. Now, let us assume that (msg, s) ∈ τ. If λ^M(v) is of the form (p!q, i), then we change into (s, q?p, i, next) and move along the next-direction. In contrast, if v is a receive event, then the local MSCA M is unable to execute the movement (msg, s). To simulate this behavior, we change into the sink state t and stay at v. From state t, the 2APA P_G is unable to accept. If (msg⁻¹, s) ∈ τ, then we proceed similarly. Formally, for all s ∈ S′, p ∈ P, σ ∈ Σ, i ∈ {0, ..., B − 1}, D ∈ W, and γ ∈ Γ, we have

δ((s, p, D), γ) = (D, (s, p, D)) if γ ∉ Σ_p × {0, ..., B − 1}, and (id, s) if γ ∈ Σ_p × {0, ..., B − 1};
δ((s, σ, i, D), γ) = (D, (s, σ, i, D)) if γ ≠ (σ, i), and (id, s) if γ = (σ, i);
δ(s, γ) = g(s, γ);
δ(t, γ) = ⊥

where, for all s ∈ S′ and γ = (pθq, i) ∈ Γ, g(s, γ) is the positive Boolean expression which is obtained from δ′(s, pθq) by applying the following substitutions: for all s′ ∈ S′, we exchange
• (proc, s′) by (next, (s′, p, next)),
• (proc⁻¹, s′) by (prev, (s′, p, prev)),
• (msg, s′) by (id, t) if θ = ?,
• (msg, s′) by (next, (s′, q?p, i, next)) if θ = !,
• (msg⁻¹, s′) by (id, t) if θ = !, and
• (msg⁻¹, s′) by (prev, (s′, q!p, i, prev)) if θ = ?.
It remains to define the ranking function κ of P_G. For all s ∈ S′, we define κ(s) = κ′(s). If s ∈ S \ S′, then we set κ(s) = m where m is the smallest odd natural number larger than max_{s ∈ S′} κ′(s).

Theorem 6.8. Let M be an MSC and W some B-bounded linearization of M. We have M ∈ L(G) if and only if W_B ∈ L(P_G). The size of P_G is polynomial in B and the size of G.

Proof sketch. If ρ is a successful run of P_G on W_B, then ρ immediately splits into |P| many subtrees ρ_q. By easy inspection of the transition function of P_G, it follows that there exists a global initial state (ι_1, ..., ι_{|P|}) ∈ I such that, for every q ∈ P, there is exactly one subtree ρ_q = (C_q, E_q, r_q, µ_q, ν_q) with µ_q(r_q) = (ι_q, q, next). Each of these subtrees ρ_q can be pruned in such a way that one obtains an accepting run ρ′_q of M starting in state ι_q from the minimal event of V^M_q (with respect to ⪯^M_q). Thus, M is accepted by G. Note that we obtain ρ′_q from ρ_q by essentially removing all configurations x with µ_q(x) ∉ S′; of course, we need to update E_q accordingly. The converse can be shown analogously.
Basically, one only needs to pad and combine the accepting runs of the local MSCA M on the different processes in order to obtain a successful run of P_G.

6.2. Checking the Emptiness of 2APAs. In order to solve the emptiness problem for a 2APA P, we transform P into a Büchi automaton.

Definition 6.9. Formally, a Büchi automaton (or BA) over the alphabet Σ is a tuple B = (S, ∆, ι, F) where S is a finite set of states, ι is the initial state, F ⊆ S is the set of final states, and ∆ ⊆ S × Σ × S is the transition relation. The size of B is |S| + |∆|. Let W = σ_0 σ_1 ... ∈ Σ^∞ be a word of length n ∈ N ∪ {∞}. The mapping r : N → S is a run of B on W if r(0) = ι and (r(i), σ_i, r(i + 1)) ∈ ∆ for all i < n. A word W is accepted by B if there exists a run r such that r(0) r(1) r(2) ... ∈ S^∞ is Büchi accepting, i.e., if one of the following conditions is fulfilled:
(1) n ∈ N and r(n) ∈ F
(2) inf(r(0) r(1) r(2) ...) ∩ F ≠ ∅
By L(B), we denote the set of words which are accepted by B.

In contrast to common definitions of Büchi automata, item (1) allows B to accept finite words as well. In the following, we also need to deal with two-way (alternating) Büchi automata (2ABA and 2BA for short) which are defined analogously to 2APAs and 2PAs but implement the Büchi acceptance condition instead of the parity acceptance condition.

Definition 6.10. More precisely, a two-way alternating Büchi automaton (2ABA for short) is a tuple B = (S, δ, ι, F) where S, δ, and ι are defined as for 2APAs and F ⊆ S is the set of final states. An (accepting) run of B is defined in a similar way as it is defined for a 2APA with the following modification: A sequence of states (s_i)_{i≥0} ∈ S^∞ is accepting if and only if it is Büchi accepting. A two-way Büchi automaton (2BA) is defined analogously to a 2PA.

Remark 6.11.
Note that, using the ideas from [14], a 2APA P can be transformed in polynomial space into a 2ABA B such that the size of B is polynomial in the size of P and L(P) = L(B).

In [4], Dax and Klaedtke showed the following:

Theorem 6.12 ([4]). From a 2APA P, one can construct a BA B whose size is exponential in the size of P such that L(P) = L(B).

Note that Dax and Klaedtke actually stated that one can construct a BA of size 2^{O((nk)²)} where n is the size of P and 2k is the maximal rank of a state from P. Since we can assume that the maximal rank of a state of P is linear in the number of states of P, it follows that the size of B is exponential in the size of P. Furthermore, in [4], only infinite words are considered. Nevertheless, it can be easily seen that the result also applies to automata recognizing infinite and finite words at the same time.

In the following, we recall parts of the proof of Theorem 6.12 and adapt it to our setting in order to be able to prove Prop. 6.13. Let B = (S, δ, ι, F) be a 2ABA over the alphabet Σ and let Γ be an abbreviation of the function space S → 2^{W×S}. If W is a word and ρ = (C, E, r, µ, ν) is an accepting run of B on W, then the authors of [4] argue that we can assume without loss of generality that all nodes x and y of ρ with µ(x) = µ(y) and ν(x) = ν(y) exhibit isomorphic subtrees. Hence, ρ can be thought of as a directed acyclic graph (DAG) which can be represented as a (possibly infinite) word of functions f = f_0 f_1 ... ∈ Γ^∞ where f_j(q) = tr_ρ(x) (cf. Def. 2.21), µ(x) = q, and ν(x) is the j-th position of W with respect to next^W.
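The transition function δ′ of the intermediate automaton constructed next can be sketched as follows: if the function component f of the input letter is locally consistent with δ, the automaton follows one branch of the run described by f; otherwise it moves right forever in the same state. A minimal sketch, assuming the DNF encoding of ⟦δ⟧ as a set of frozensets of moves (all identifiers are ours):

```python
def delta_prime(delta_dnf, s, sigma, f):
    """Transition of the intermediate 2BA B': if f(s) is one of the
    satisfying move-sets in [[delta(s, sigma)]], nondeterministically
    follow one of the moves it prescribes (returned here as the set of
    alternatives of a disjunction); otherwise loop rightwards in s.
    delta_dnf(s, sigma) yields [[delta(s, sigma)]]; f maps each state
    to a move-set (a set of (direction, state) pairs)."""
    if frozenset(f[s]) in delta_dnf(s, sigma):
        return set(f[s])
    return {("next", s)}
```

Together with co-final states S \ F, this lets B′ reject exactly the letter sequences whose function components spell out an accepting run of B, which is the complementation trick of [4].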
From B, an intermediate 2BA B′ = (S, δ′, ι, S \ F) over the alphabet Σ × Γ is constructed where, for all s ∈ S, σ ∈ Σ, and f ∈ Γ, we have

δ′(s, (σ, f)) = ⋁_{(D,s′) ∈ f(s)} (D, s′) if f(s) ∈ ⟦δ(s, σ)⟧, and (next, s) otherwise.

Note that the automaton B′ is of exponential size since the size of the alphabet Σ × Γ is exponential in the size of B. However, the set of states of B′ equals the set of states of B. It is shown that B′ rejects exactly those words (σ_0, f_0)(σ_1, f_1) ... ∈ (Σ × Γ)^∞ where the function word (f_i)_{i≥0} represents an accepting run of B on (σ_i)_{i≥0}. In the course of the proof of Theorem 6.12, using [24, Theorem 4.3], a Büchi automaton B′′ whose size is exponential in B with L(B′′) = (Σ × Γ)^∞ \ L(B′) is constructed. It is shown that the projection of L(B′′) to the alphabet Σ equals L(B).

We also recall the essential parts of the proof of Theorem 4.3 of [24], apply a minor correction, and adapt it to our setting. Let B = (S, δ, ι, F) be a 2BA. By bwl(B), we denote the set S × S × {0, 1}. Intuitively, a triple (s, t, b) ∈ bwl(B) expresses that at the current position there is a backward loop starting in state s and ending in state t. We have b = 1 if and only if this loop visits a final state. A word (σ_0 σ_1 ..., m_0 m_1 ..., n_0 n_1 ...) ∈ (Σ × 2^{bwl(B)} × 2^S)^∞ of length h ∈ N ∪ {∞} is B-legal if and only if there exists a sequence ℓ_0 ℓ_1 ... ∈ (2^{bwl(B)})^∞ of length h such that the following conditions are fulfilled:
• (s, t, 0) ∈ ℓ_i if and only if either {(id, t)} ∈ ⟦δ(s, σ_i)⟧ or i ≥ 1 and there are s′, t′ ∈ S and b ∈ {0, 1} such that (s′, t′, b) ∈ m_{i−1}, {(prev, s′)} ∈ ⟦δ(s, σ_i)⟧, and {(next, t)} ∈ ⟦δ(t′, σ_{i−1})⟧
• (s, t, 1) ∈ ℓ_i if and only if either {(id, t)} ∈ ⟦δ(s, σ_i)⟧ and t ∈ F, or i ≥ 1 and there are s′, t′ ∈ S and b ∈ {0, 1} such that (s′, t′, b) ∈ m_{i−1}, {(prev, s′)} ∈ ⟦δ(s, σ_i)⟧, {(next, t)} ∈ ⟦δ(t′, σ_{i−1})⟧, and in addition either b = 1 or {s′, t′, t} ∩ F ≠ ∅
• (s, t, 0) ∈ m_i if and only if there are s_0, s_1, ..., s_k ∈ S and b_0, b_1, ..., b_{k−1} ∈ {0, 1} with k > 0, s_0 = s, s_k = t, and (s_j, s_{j+1}, b_j) ∈ ℓ_i for all 0 ≤ j < k
• (s, t, 1) ∈ m_i if and only if there are s_0, s_1, ..., s_k ∈ S and b_0, b_1, ..., b_{k−1} ∈ {0, 1} with k > 0, s_0 = s, s_k = t, (s_j, s_{j+1}, b_j) ∈ ℓ_i for all 0 ≤ j < k, and {b_0, b_1, ..., b_{k−1}} ∩ {1} ≠ ∅
• s ∈ n_i if and only if there exists a state s′ ∈ S and b ∈ {0, 1} such that (s, s′, b) ∈ m_i and one of the following conditions holds:
 − (s′, s′, 1) ∈ m_i
 − s′ ∈ F and B cannot make a transition at position i in state s′
 − i ≥ 1 and there exists s′′ ∈ S such that {(prev, s′′)} ∈ ⟦δ(s′, σ_i)⟧ and s′′ ∈ n_{i−1}
Note that the ℓ_i's are only used to simplify the definition of the m_i's. The introduction of the n_i's is a minor correction of the proof of Theorem 4.3. Intuitively, we have s ∈ n_i if there exist a position j ≤ i and a state s′ ∈ S such that there is a backward run starting in s which allows B to visit the j-th position of the input word in state s′, where in addition the following holds: either s′ ∈ F and B cannot make a transition at position j in state s′, or, at position j in state s′, the automaton B can enter a loop containing a final state and repeat it infinitely often.
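The step from the ℓ_i's to the m_i's is an ordinary transitive closure of length at least one that additionally tracks, via the bit b, whether some composed step carries flag 1; note that a chain witnessing (s, t, 1) also witnesses (s, t, 0). A sketch of this closure, assuming ℓ is encoded as a set of triples (s, t, b) (names ours):

```python
def close(ell):
    """Compute m from ell: (s, t, b) is in m iff there is a nonempty
    chain of ell-steps from s to t; b = 1 additionally requires that
    some step of the chain is flagged. Flags are combined by 'or',
    and every chain also witnesses the unflagged triple."""
    # seed: each step, plus its unflagged variant
    m = set(ell) | {(s, t, 0) for (s, t, b) in ell}
    changed = True
    while changed:                       # fixpoint over compositions
        changed = False
        for (s, u, b1) in list(m):
            for (u2, t, b2) in list(m):
                if u == u2:
                    triple = (s, t, max(b1, b2))
                    if triple not in m:
                        m.add(triple)
                        changed = True
    return m
```

For instance, composing (s, u, 0) with (u, t, 1) yields both (s, t, 1) and, via the unflagged variants, (s, t, 0), just as the definition of the m_i's demands.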
Note that, without the information contained in the n_i's, we would not capture accepting runs of B which do not visit all positions of the input word but, at some position i, go backward and then accept without returning to i again.

From the 2BA B = (S, δ, ι, F), we can construct a BA B_1 recognizing the set of all B-legal words. Let B_1 = (S_1, ∆_1, ι_1, F_1) be the BA where S_1 = 2^{S×S} × 2^{bwl(B)} × 2^S, ι_1 = (∅, ∅, ∅), F_1 = S_1 and, for all (p′, m′, n′), (p̄, m̄, n̄) ∈ S_1 and (σ, m, n) ∈ Σ × 2^{bwl(B)} × 2^S, we have ((p′, m′, n′), (σ, m, n), (p̄, m̄, n̄)) ∈ ∆_1 if and only if there exists ℓ ⊆ bwl(B) such that the following conditions hold:
• m̄ = m and n̄ = n
• (s, t) ∈ p̄ if and only if {(next, t)} ∈ ⟦δ(s, σ)⟧
• (s, t, 0) ∈ ℓ if and only if {(id, t)} ∈ ⟦δ(s, σ)⟧ or there are states s′, t′ ∈ S and b ∈ {0, 1} such that (s′, t′, b) ∈ m′, {(prev, s′)} ∈ ⟦δ(s, σ)⟧, and (t′, t) ∈ p′
• (s, t, 1) ∈ ℓ if and only if {(id, t)} ∈ ⟦δ(s, σ)⟧ and t ∈ F, or there are states s′, t′ ∈ S and b ∈ {0, 1} such that (s′, t′, b) ∈ m′, {(prev, s′)} ∈ ⟦δ(s, σ)⟧, (t′, t) ∈ p′, and in addition either b = 1 or {s′, t′, t} ∩ F ≠ ∅
• (s, t, 0) ∈ m̄ if and only if there are s_0, s_1, ..., s_k ∈ S and b_0, b_1, ..., b_{k−1} ∈ {0, 1} with k > 0, s_0 = s, s_k = t, and (s_j, s_{j+1}, b_j) ∈ ℓ for all 0 ≤ j < k
• (s, t, 1) ∈ m̄ if and only if there are s_0, s_1, ..., s_k ∈ S and b_0, b_1, ..., b_{k−1} ∈ {0, 1} with k > 0, s_0 = s, s_k = t, (s_j, s_{j+1}, b_j) ∈ ℓ for all 0 ≤ j < k, and {b_0, b_1, ..., b_{k−1}} ∩ {1} ≠ ∅
• s ∈ n̄ if and only if there exists a state s′ ∈ S and b ∈ {0, 1} such that (s, s′, b) ∈ m̄ and one of the following holds:
 − (s′, s′, 1) ∈ m̄
 − s′ ∈ F and B cannot make a transition at the current position in state s′
 − there exists a state s′′ ∈ S such that {(prev, s′′)} ∈ ⟦δ(s′, σ)⟧ and s′′ ∈ n′

It remains to specify a BA B_2 such that a B-legal word (σ_0 σ_1 ..., m_0 m_1 ..., n_0 n_1 ...) ∈ (Σ × 2^{bwl(B)} × 2^S)^∞ is accepted by B_2 if and only if (σ_i)_{i≥0} is accepted by the 2BA B. Let B_2 = (S_2, ∆_2, ι_2, F_2) where
• S_2 = (S ∪ {⊥}) × {0, 1},
• ι_2 = (ι, b) with b = 1 if and only if ι ∈ F,
• F_2 = (S ∪ {⊥}) × {1}, and
• we have ((s, b), (σ, m, n), (s′, b′)) ∈ ∆_2 if and only if one of the following conditions is fulfilled:
 − there exist s′′ ∈ S and b′′ ∈ {0, 1} such that (s, s′′, b′′) ∈ m, {(next, s′)} ∈ ⟦δ(s′′, σ)⟧, and (b′ = 1 if and only if b′′ = 1 or s′ ∈ F)
 − s ∈ n and s′ = ⊥
 − s = s′ = ⊥
Intuitively, the states of B_2 come with a flag. The flag is set to 1 if and only if the simulated automaton B just visited a final state. It can be shown that the projection of L(B_1) ∩ L(B_2) to the alphabet Σ equals the language of B.

Proposition 6.13. If P is a 2APA, then one can check the emptiness of L(P) in polynomial space.

Proof sketch. By Remark 6.11, we can transform P in polynomial space into a 2ABA recognizing the same language. By Theorem 6.12, we can construct a BA B′ = (S, ∆, ι, F) whose size is exponential in the size of P such that L(B′) = L(P). Clearly, remembering a state of B′ requires only polynomial space. By inspecting the construction of B′, one can see that B′ can be obtained in space polynomial in the size of P. This means in particular: given two states s, t ∈ S and σ ∈ Σ, one can check in polynomial space whether (s, σ, t) ∈ ∆ holds. Since L(B′) is non-empty if and only if there exists a final state s ∈ F which is reachable from ι (recall that our Büchi automata also accept finite words), the emptiness problem of P can be solved in polynomial space.

6.3. The Decision Procedure. We are now able to prove our main theorem:

Proof of Theorem 6.1. The global formula ϕ is a positive Boolean combination of global formulas ϕ_1, ..., ϕ_n where, for every i ∈ [n], ϕ_i is of the form Aα_i or Eα_i for some local formula α_i.
It follows from Theorem 5.1 that we can construct in polynomial space a global MSCA G_i such that L(ϕ_i) = L(G_i) and the size of G_i is linear in the size of ϕ_i for every i ∈ [n]. By Theorem 6.8, we can construct, for every i ∈ [n], a 2APA P_i such that, for all MSCs M and B-bounded linearizations W of M, we have M ∈ L(ϕ_i) if and only if W_B ∈ L(P_i). By simple inspection of the construction of Sect. 6.1, one can see that P_i can be obtained in polynomial space. The number of states of P_i is also polynomial. Using standard automata constructions for alternating automata, we can combine the automata P_1, ..., P_n according to the Boolean structure of ϕ to obtain a 2APA P_ϕ such that, for all MSCs M and B-bounded linearizations W of M, we have W_B ∈ L(P_ϕ) if and only if M ∈ L(ϕ). This can be accomplished in polynomial space, and the number of states of P_ϕ is also polynomial in B and the size of ϕ. Clearly, ϕ is satisfiable by an existentially B-bounded MSC if and only if L(P_ϕ) is non-empty. Hence, by Prop. 6.13, the satisfiability problem of ϕ can be decided in polynomial space. The hardness result follows from the PSPACE-hardness of the satisfiability problem of LTL.

7. The Model Checking Problem

A communicating finite-state machine (also known as message-passing automaton) is well suited to model the behavior of a distributed system. It consists of a finite number of finite automata communicating using order-preserving channels. To be more precise, we recapitulate the definition from [1].

Definition 7.1.
A communicating finite-state machine (or CFM for short) is a structure C = (H, (T_p)_{p∈P}, F) where
• H is a finite set of message contents,
• for every p ∈ P, T_p = (S_p, →_p, ι_p) is a finite labelled transition system over the alphabet Σ_p × H (i.e., →_p ⊆ S_p × Σ_p × H × S_p) with initial state ι_p ∈ S_p, and
• F ⊆ ∏_{p∈P} S_p is a set of global final states.
Let C be a CFM and M be an MSC. A run of C on M is a pair (ζ, χ) of mappings ζ : V^M → ⋃_{p∈P} S_p and χ : V^M → H such that, for all v ∈ V^M,
• χ(v) = χ(v′) if there exists v′ ∈ V^M with η^M(v, v′) = msg, and
• (ζ(v′), λ(v), χ(v), ζ(v)) ∈ →_{P^M(v)} if there exists v′ ∈ V^M with η^M(v′, v) = proc, and (ι_{P^M(v)}, λ(v), χ(v), ζ(v)) ∈ →_{P^M(v)} otherwise.
Let cofin_ζ(p) = {s ∈ S_p | ∀v ∈ V^M_p ∃v′ ∈ V^M_p : v ≺^M_p v′ ∧ ζ(v′) = s}. The run (ζ, χ) is accepting if there is some (s_p)_{p∈P} ∈ F such that s_p ∈ cofin_ζ(p) for all p ∈ P. The language of C is the set L(C) of all MSCs M for which there exists an accepting run.

We now demonstrate that the bounded model checking problem for CFMs and CRPDL is PSPACE-complete.

Theorem 7.2. The following problem is PSPACE-complete:
Input: B ∈ N (given in unary), a CFM C, and a global CRPDL formula ϕ.
Question: Is there an existentially B-bounded MSC M ∈ L(C) with M ⊨ ϕ?

Proof. In [1], it was shown that one can construct in polynomial space a Büchi automaton B_C from C which recognizes exactly the set of all B-bounded linearizations of the MSCs from L(C). Its number of states is polynomial in the maximal number of local states a transition system of C has and exponential in B. In the proof of Theorem 6.1, we already constructed in polynomial space a Büchi automaton B_ϕ of exponential size accepting the set of all B-bounded linearizations of the MSCs satisfying ϕ. Hence, the model checking problem can be decided in polynomial space.
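The run condition of Definition 7.1 can be made concrete by executing a linearization with one FIFO queue per channel: sends enqueue their message content, and each receive must dequeue exactly the content of the matching send. The following is a toy sketch of this idea, not the paper's exact formalism; the string encoding of actions and all names are ours:

```python
from collections import deque

def run_on_linearization(word, transitions, initial):
    """Execute a linearization (actions like '1!2' and '2?1') on a toy
    CFM. 'transitions' maps (local state, action) -> (message, new state);
    'initial' maps each process to its initial local state. Returns the
    final local states, or None if the word admits no run."""
    state = dict(initial)                # process -> current local state
    chan = {}                            # channel (p, q) -> FIFO of contents
    for act in word:
        if "!" in act:
            p, q = act.split("!")        # p sends to q
        else:
            p, q = act.split("?")        # p receives from q
        key = (state[p], act)
        if key not in transitions:
            return None                  # no local transition available
        msg, nxt = transitions[key]
        if "!" in act:
            chan.setdefault((p, q), deque()).append(msg)
        else:
            fifo = chan.get((q, p))
            if not fifo or fifo.popleft() != msg:
                return None              # empty channel or wrong content
        state[p] = nxt
    return state
```

For a two-process machine in which process 1 sends content m to process 2, `run_on_linearization(["1!2", "2?1"], {("s0","1!2"): ("m","s1"), ("t0","2?1"): ("m","t1")}, {"1": "s0", "2": "t0"})` yields the final states `{"1": "s1", "2": "t1"}`, while a receive before the send, or a mismatched content, yields `None`.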
The PSPACE-hardness follows from the PSPACE-hardness of the satisfiability problem.

Remark 7.3. The model checking problem for CRPDL and high-level message sequence charts (HMSCs) asks, given an HMSC H and a global CRPDL formula ϕ, whether there is an MSC M ∈ L(H) with M ⊨ ϕ. Using techniques from [1] and the ideas from the proof of Theorem 7.2, it can be shown that this problem is also PSPACE-complete.

8. Open Questions

It is an interesting open question whether the bounded model checking problem of CFMs and CRPDL enriched with the intersection operator [11, 1] is still in PSPACE. It also needs to be investigated whether PDL is a proper fragment of CRPDL and whether CRPDL and global MSCAs are expressively equivalent. Furthermore, we would like to know more about the expressive power of CRPDL and global MSCAs in general, especially in comparison with the existential fragment of monadic second-order logic (EMSO).

References

[1] B. Bollig, D. Kuske, and I. Meinecke. Propositional dynamic logic for message-passing systems. Logical Methods in Computer Science, 6(3), 2010.
[2] D. Brand and P. Zafiropulo. On communicating finite-state machines. J. ACM, 30(2):323–342, 1983.
[3] E. M. Clarke and E. A. Emerson. Design and synthesis of synchronization skeletons using branching-time temporal logic. In Logic of Programs, volume 131 of LNCS, pages 52–71. Springer, 1981.
[4] C. Dax and F. Klaedtke. Alternation elimination by complementation (extended abstract). In LPAR, volume 5330 of LNCS, pages 214–229. Springer, 2008.
[5] M. J. Fischer and R. E. Ladner. Propositional dynamic logic of regular programs. J. Comput. Syst. Sci., 18(2):194–211, 1979.
[6] P. Gastin and D. Kuske. Uniform satisfiability in PSPACE for local temporal logics over Mazurkiewicz traces. Fundam. Inform., 80(1-3):169–197, 2007.
[7] P. Gastin and D. Kuske. Uniform satisfiability problem for local temporal logics over Mazurkiewicz traces. Inf. Comput., 208(7):797–816, 2010.
[8] P. Gastin and D. Oddoux.
LTL with past and two-way very-weak alternating automata. In MFCS, volume 2747 of LNCS, pages 439–448. Springer, 2003.
[9] B. Genest, D. Kuske, and A. Muscholl. A Kleene theorem and model checking algorithms for existentially bounded communicating automata. Inf. Comput., 204(6):920–956, 2006.
[10] B. Genest, A. Muscholl, H. Seidl, and M. Zeitoun. Infinite-state high-level MSCs: Model-checking and realizability. J. Comput. Syst. Sci., 72(4):617–647, 2006.
[11] D. Harel, D. Kozen, and J. Tiuryn. Dynamic Logic. MIT Press, 2000.
[12] S. Katz and D. Peled. Interleaving set temporal logic. Theor. Comput. Sci., 75(3):263–287, 1990.
[13] Y. Kesten, A. Pnueli, and L. Raviv. Algorithmic verification of linear temporal logic specifications. In ICALP, volume 1443 of LNCS, pages 1–16. Springer, 1998.
[14] V. King, O. Kupferman, and M. Y. Vardi. On the complexity of parity word automata. In F. Honsell and M. Miculan, editors, FoSSaCS, volume 2030 of LNCS, pages 276–286. Springer, 2001.
[15] R. Küsters. Memoryless determinacy of parity games. In Automata, Logics, and Infinite Games, volume 2500 of LNCS, pages 95–106. Springer, 2001.
[16] P. Madhusudan and B. Meenakshi. Beyond message sequence graphs. In FSTTCS, volume 2245 of LNCS, pages 256–267. Springer, 2001.
[17] B. Meenakshi and R. Ramanujam. Reasoning about layered message passing systems. Computer Lang., Systems & Structures, 30(3-4):171–206, 2004.
[18] R. Mennicke. Propositional dynamic logic with converse and repeat for message-passing systems. In M. Koutny and I. Ulidowski, editors, CONCUR, volume 7454 of LNCS, pages 531–546. Springer, 2012.
[19] D. E. Muller and P. E. Schupp. Alternating automata on infinite trees. Theor. Comput. Sci., 54:267–276, 1987.
[20] D. Peled. Specification and verification of message sequence charts. In FORTE, volume 183 of IFIP Conference Proceedings, pages 139–154. Kluwer, 2000.
[21] A. Pnueli.
The temporal logic of programs. In FOCS, pages 46–57. IEEE, 1977.
[22] V. R. Pratt. Semantical considerations on Floyd-Hoare logic. In FOCS, pages 109–121. IEEE, 1976.
[23] R. S. Streett. Propositional dynamic logic of looping and converse. In STOC, pages 375–383. ACM, 1981.
[24] M. Y. Vardi. A temporal fixpoint calculus. In J. Ferrante and P. Mager, editors, POPL, pages 250–259. ACM Press, 1988.
[25] M. Y. Vardi. Alternating automata and program verification. In Computer Science Today, volume 1000 of LNCS, pages 471–485. Springer, 1995.

This work is licensed under the Creative Commons Attribution-NoDerivs License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/2.0/