Specifiable robustness in reactive synthesis

When synthesizing a system from a given specification, there is room for automatically adding various requirements, thereby improving the resulting system. One such requirement, covered extensively in past literature, is robustness. In particular, the system can fail to read the inputs correctly from the environment, and the environment can fail to satisfy our assumptions about its behavior. Nevertheless, we want the system to still satisfy the specification under these failures, in some limited way. It has to be limited because it is typically too strong a requirement to realize the property regardless of the inputs and the environment assumptions. In this work, we propose a simple and flexible framework for synthesizing robust systems, in which the user defines the required robustness via a temporal robustness specification. For example, the user may specify that the environment is eventually reliable, or that input misreadings cannot occur for more than k consecutive steps, and synthesize a system under this assumption. Furthermore, our framework enables us to specify a temporal recovery specification, which describes how the designer expects the system to recover after a failure of the environment assumptions. We show examples of robust systems that we synthesized with this method using our synthesis tool Party.


Introduction
The problem of reactive synthesis is to generate a system M such that M ⊧ Φ, where Φ is often defined as

Φ = e(I, O) → s(I, O).

Here s(I, O) is a temporal specification over the inputs I and outputs O of M, and e(I, O) is a temporal assumption about the environment's behavior. When solving the synthesis problem is computationally feasible, it may save labor compared to the more traditional route of manually constructing M and then verifying that it satisfies Φ via model checking.
In practice, it is difficult to fully specify a system, i.e., to specify the exact behavior in reaction to every input sequence from the environment. As a result, there can be many systems M that satisfy s, and hence it is desirable to define default criteria for the quality of the synthesized system, e.g., [1,10,15], which can guide us in the synthesis process. In this sense, it is like an optimization problem without an explicit objective function. In [10], for example, the implicit objective is to minimize the synthesized system's circuit size. In [7], it is to synthesize a system that satisfies the input property in a non-vacuous way. Another implicit objective, directly related to the current article, is robustness. This term has been used with multiple meanings in the synthesis literature (e.g., [2,3,5,9,26,27]), in system testing [23], and in the software verification literature [29]. Informally, robustness measures the degree to which a system can function correctly in the presence of erroneous input or stressful environmental conditions [14]. Indeed, it is common to expect M to operate under an environment whose behavior is not fully specified, cannot be modeled in a precise way (e.g., if it is a physical environment), or cannot be fully trusted. Hence, when possible, M should satisfy s even when e does not hold, or when some inputs are misread by the system or misreported by the environment. These additional requirements may depend on the type of system and the expected environment in which it operates and, accordingly, on the selected definition of robustness.
We are aware of four prior works that considered robustness as an implicit objective for synthesis. In [3], the main idea is to build a special game graph that ensures that a winning strategy satisfies the liveness guarantees even if e is violated infinitely often. In [9], the main idea is a new synthesis algorithm based on a generalized Rabin automaton, ensuring that even if the environment violates e a finite number of times, the system eventually returns to satisfying its guarantees s (here e and s are restricted to invariants; namely, they are of the form Gφ where φ is propositional). In [17], a synthesis algorithm is presented for the case that not all inputs are available (in their terminology: there is 'incomplete information'). Although the term 'robustness' is not mentioned there, implicitly the goal is to make the synthesized system satisfy s even when some of the inputs are not available. Finally, a recent article [2] proposed a new temporal logic called rLTL (for robust LTL), in which the robustness level is measured by the levels of violations of the temporal operators, forming a partial order of robustness. Specifically, the paper suggests evaluating LTL properties in systems over a 5-valued logic, according to the property's degree of violation, leading to a finer-grained model-checking procedure that outputs the robustness degree of a given system w.r.t. an rLTL specification. This approach results in three-valued GR(1) formulae for synthesis, falling into one of the robustness categories we introduce in Sect. 3. Note that all four prior works suggested a new synthesis algorithm tailored to a particular robustness definition.
This leads us to this article's main contribution: a unified method of reducing the problem of robust synthesis, for many definitions of robustness, to the problem of 'normal' synthesis. Hence, with our method there is no need for a specialized algorithm to cope with this problem. The reduction is simple and amounts to synthesis from a conjunction of Φ and a new LTL formula that we denote by Θ. This formula combines e, s, and a user-specified robustness specification χ in LTL. The conjunction with Φ guarantees that the synthesized formula implies Φ itself.
We further offer another level of flexibility in the framework by allowing the user to write a recovery specification in LTL, defining how the system recovers from failures. Our method accommodates any off-the-shelf synthesizer that supports synthesis with incomplete information [17], merely by modifying the specification. We implemented an even easier route in Party [16] (via bounded synthesis), where in addition to e and s, we only need to add the robustness and recovery specifications.
This article extends an earlier proceedings article [8] with several examples, clarifications, and extended explanations. It also uses quantified propositional temporal logic (QPTL) as the underlying formalism.

Preliminaries
Notation Let A, B be two finite sets of signals. By S(A) we denote (2^A)^ω, the set of all infinite sequences (also called 'traces') over assignments to A. For a sequence σ = σ0, σ1, …, we will write σk… = σk, σk+1, … for the suffix starting at position k. Given two sequences σ ∈ S(A) and π ∈ S(B), we define their composition as (σ0 ∪ π0, σ1 ∪ π1, …) ∈ S(A ∪ B) and denote it by (σ ‖ π). For two sequences σ, σ′ over A and a subset C ⊆ A, we write σ =_C σ′ if σi ∩ C = σ′i ∩ C for every i ≥ 0.

Mealy machines We model systems by Mealy machines [19]. Mealy machines are formally defined as tuples M = ⟨Q, q0, Σ, Λ, T, G⟩, where
- Q is a finite set of states,
- q0 ∈ Q is the initial state,
- Σ is the input alphabet,
- Λ is the output alphabet,
- T : Q × Σ → Q is the transition function, and
- G : Q × Σ → Λ is the output function.
If Σ = 2^I and Λ = 2^O, which will be our usual case, we will sometimes write M(I, O) to make the inputs and outputs explicit.
We will focus on the infinitary behavior of Mealy machines. An input sequence σ ∈ Σ^ω induces a run ρ ∈ Q^ω in M, defined by ρ0 = q0 and ρi+1 = T(ρi, σi) for i ≥ 0. The corresponding output word π ∈ Λ^ω is defined by πi = G(ρi, σi) and is denoted by M(σ).
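The run and output word defined above can be simulated directly on a finite input prefix. The following sketch is our own illustration (the machine, `run_mealy`, and all names are ours, not part of the article):

```python
# Illustrative sketch of a Mealy machine run: the output at step i depends on
# the current state AND the current input, as in pi_i = G(rho_i, sigma_i).

def run_mealy(q0, T, G, inputs):
    """Return the output word produced on the given finite input prefix."""
    q, outputs = q0, []
    for a in inputs:
        outputs.append(G[(q, a)])  # Mealy: output from state and input
        q = T[(q, a)]              # then take the transition
    return outputs

# A two-state machine over Sigma = Lambda = {0, 1} that echoes the
# previous input (and outputs 0 in the first step).
T = {("s0", 0): "s0", ("s0", 1): "s1", ("s1", 0): "s0", ("s1", 1): "s1"}
G = {("s0", 0): 0, ("s0", 1): 0, ("s1", 0): 1, ("s1", 1): 1}

print(run_mealy("s0", T, G, [1, 1, 0, 1]))  # → [0, 1, 1, 0]
```

The state here simply remembers the last input, which is all the machine needs to reproduce it one step later.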
Temporal logic We will use Linear Temporal Logic (LTL) as a specification formalism [21]. We will also occasionally use Quantified Propositional Temporal Logic (QPTL) [25]. We will first define QPTL since it is an extension of LTL with temporal quantifiers. Definition 1 Given a set AP of atomic propositions and a set X of variables, any atomic proposition p ∈ AP and any variable x ∈ X are QPTL formulas. If φ and ψ are QPTL formulas and x is a variable, then ¬φ, φ ∧ ψ, Xφ, φ U ψ, and ∃x.φ are also QPTL formulas. An LTL formula is a QPTL formula that does not contain any occurrence of variables or quantifiers.
The semantics of QPTL is given over pairs (π, ξ) of infinite traces, π over AP and ξ over X. We have that
- (π, ξ) ⊧ p iff p ∈ π0, and (π, ξ) ⊧ x iff x ∈ ξ0;
- (π, ξ) ⊧ ¬φ iff (π, ξ) ⊭ φ, and (π, ξ) ⊧ φ ∧ ψ iff (π, ξ) ⊧ φ and (π, ξ) ⊧ ψ;
- (π, ξ) ⊧ Xφ iff (π1…, ξ1…) ⊧ φ;
- (π, ξ) ⊧ φ U ψ iff there is a k ≥ 0 such that (πk…, ξk…) ⊧ ψ and (πj…, ξj…) ⊧ φ for all 0 ≤ j < k;
- (π, ξ) ⊧ ∃x.φ iff there is a trace ξ′ that agrees with ξ on X∖{x} such that (π, ξ′) ⊧ φ.
As usual, other propositional and temporal connectives can be defined as abbreviations, e.g., φ ∨ ψ = ¬(¬φ ∧ ¬ψ), Fφ = true U φ, and Gφ = ¬F¬φ. We say that a variable x is free in a QPTL formula φ if it has an occurrence that is not in the scope of a quantifier ∃x. For clarity, we will sometimes write φ(X′), where X′ is the set of free variables in φ. If the formula does not contain free variables, it is closed and the assignments to X do not matter. So, instead of (π, ξ) ⊧ φ, we can just write π ⊧ φ. Note that the atomic propositions are never bound by quantifiers and that LTL formulas are closed by definition.
For a Mealy machine M with input alphabet 2^I and output alphabet 2^O and a QPTL formula φ over atomic propositions I ∪ O, we say that M satisfies φ (denoted by M ⊧ φ) if for all σ ∈ S(I), we have that (σ ‖ M(σ)) ⊧ φ.
As an example, take the QPTL formula

∃x. (x ∧ G(x ↔ X¬x) ∧ G(x → p)).

This formula is satisfied by any trace over {p} in which p is true in every even step. This property is omega-regular, but not expressible in LTL.

Synthesis The realizability problem is to decide whether there exists a system (a Mealy machine) that satisfies an LTL property; the synthesis problem is to find one such system. For this question to make sense, we partition our atomic propositions into inputs I and outputs O. Thus, given an LTL specification φ(I, O), synthesis tries to find a Mealy machine M(I, O) that satisfies φ. We can also assume that some inputs are hidden from the system, asking for a system M(I′, O) that satisfies φ, where I′ ⊂ I. This is the problem of synthesis with incomplete information. Both problems are 2EXPTIME-complete [17,22].
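A quick way to sanity-check such a property on an ultimately periodic trace u·v^ω is to evaluate it on a sufficiently long prefix. The sketch below is our own illustration (all names are ours, and the bound-based check is a demonstration, not a decision procedure):

```python
# Check "p holds at every even step" on the ultimately periodic trace
# prefix . period^omega, evaluated on positions up to a bound.

def p_at_even_steps(prefix, period, bound=100):
    """True iff p holds at all even positions i < bound of prefix.period^omega."""
    def trace(i):  # value of p at position i
        if i < len(prefix):
            return prefix[i]
        return period[(i - len(prefix)) % len(period)]
    return all(trace(i) for i in range(0, bound, 2))

print(p_at_even_steps([True], [False, True]))  # p, then (no-p, p) repeating
print(p_at_even_steps([True], [True, False]))  # fails at position 2
```

For genuinely periodic traces, a bound covering the prefix plus twice the period already determines the answer; a real LTL/QPTL checker would work on the automaton level instead.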

Definitions of robustness
We will distinguish between two classes of robustness definitions. The first is robustness against 'corrupted' inputs, meaning that the system fails to read some of the environment's signals or they are unavailable, as in [17]. The second is robustness against cases in which the environment does not satisfy the assumption e.
Let us now survey various definitions of robustness in these two classes. Those that are described below without a reference are novel to this work.
Definitions of robustness against corrupted inputs: (R1) A robust system satisfies s even if some of the inputs can be misread ('corrupted') in a finite number of steps. (R2) k-robustness (inputs): A robust system satisfies s even if it misreads some of the inputs up to k times consecutively. (R3) A system is robust if it satisfies s even if some or all the inputs are occasionally corrupted.
Definitions of robustness against violations of the assumptions: (R4) A system is robust if, despite a finite number of violations of the assumptions, it still satisfies the guarantees s. (R5) According to Ehlers [9], a robust system eventually stabilizes and returns to satisfying s after a finite number of violations of e's safety assumptions. A similar definition also appeared in [26]: 'bounded disturbances lead to bounded deviations from nominal behavior,' and 'the effect of a sporadic disturbance disappears in finitely many steps'. (R6) According to Bloem [3], a robust system satisfies its liveness guarantees even if e is violated infinitely often. (R7) k-robustness (e): a robust system satisfies s even if the environment violates e up to k consecutive times.
There are also definitions of metric robustness in the literature [11,13,24], but since LTL cannot accommodate such specifications, it is outside the scope of our suggested method. This is also true about [5]'s definition of robustness, which is based on a quantitative measure.
The above definitions of robustness can be further generalized by adding the dimension of a recovery specification, which is an LTL formula that defines how the system recovers from failures. For example, (R5) with the dimension of recovery can specify that the system should recover within 5 steps after each failure of the environment to satisfy e. We denote the extensions of the above robustness definitions with recovery by replacing R with C, e.g., (R5) becomes (C5).
We note that the informal definitions in (R5) and (R7) generally cannot be applied to environment assumptions in LTL because such formulas are interpreted over infinite traces w.r.t. the initial state. For example, if e = GFr , then we can say whether a given path satisfies it or not, but there is no meaning to saying that it violates e k times or even a finite number of times. The same problem exists if the guarantees are in general LTL: it is meaningless to say that they are violated a finite number of times.
We, therefore, interpret these informal definitions as relevant only for assumptions and guarantees of the form Gφ, where φ is a propositional formula, and the failure count refers to the number of positions along the path that satisfy ¬φ.
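For invariants, the failure count and the 'k consecutive violations' measure behind (R5) and (R7) are easy to compute on a finite trace prefix. A minimal sketch (hypothetical helpers, not from the article):

```python
# For an invariant G(phi) with propositional phi: count the positions violating
# phi (the "failure count"), and the longest run of consecutive violations
# (the quantity bounded by k-robustness).

def failure_count(trace, phi):
    """Number of positions in the prefix where phi is false."""
    return sum(1 for state in trace if not phi(state))

def max_consecutive_failures(trace, phi):
    """Length of the longest block of consecutive positions violating phi."""
    worst = run = 0
    for state in trace:
        run = 0 if phi(state) else run + 1
        worst = max(worst, run)
    return worst

# Example: assumption G(r1 or r2); two consecutive steps violate it.
phi = lambda s: s["r1"] or s["r2"]
trace = [{"r1": 1, "r2": 0}, {"r1": 0, "r2": 0},
         {"r1": 0, "r2": 0}, {"r1": 0, "r2": 1}]
print(failure_count(trace, phi), max_consecutive_failures(trace, phi))  # → 2 2
```

A 2-robust system in the sense of (R7) would have to satisfy its guarantee on this prefix, while a 1-robust system would not be obligated to.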
The following two sections present our suggested method for synthesizing robust systems against corrupted inputs and assumption violations.

Robustness against corrupted signals
This section describes how to synthesize systems that react appropriately when the input is not quite trustworthy. Initially, we will insist that the system behaves correctly regardless, and in Sect. 4.3 we will look at ways to relax this requirement.

Closeness of traces
For maximal flexibility in the types of robustness we consider, we use a "synchronization bit" that indicates when signals may be corrupted. Using such a synchronization bit, the user can specify with a temporal formula χ when to trust a signal's value. We call χ the robustness specification. It can be viewed as a temporal fault model for signals. We will later distinguish between the corruption of inputs and outputs, and correspondingly refer to two separate such bits: sb_in and sb_out. However, since the definitions that follow are common to both cases, we will use a single symbol sb. We will assume that the robustness specification refers to the whole set of signals (all inputs, or all outputs) at every given step, i.e., either all or none of them can be trusted. It is not hard to extend the technique to work with finer granularity, specifying, e.g., that certain signals are always correct whereas others can be corrupted.
We now define when two sequences over A and A′, where A′ is the set of primed copies of the signals in A, are close. Definition 2 (χ-Close) For sequences σ ∈ S(A), σ′ ∈ S(A′), and an LTL formula χ(sb) (i.e., χ over the single variable sb), we say that σ and σ′ are χ-close (with witness τ) if there exists a trace τ ∈ S({sb}) such that τ ⊧ χ(sb) and, for every i with sb ∈ τi, σi and σ′i agree (identifying each signal a ∈ A with its primed copy a′ ∈ A′). Less formally, σ and σ′ are χ-close if there exists a trace over sb that satisfies χ, such that σ and σ′ are equal in time steps in which sb is true in τ. We will use the notion of χ-closeness in the subsections that follow.
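On finite prefixes, closeness under a given witness trace is a simple pointwise check. The sketch below uses our own encoding of traces as lists of signal sets (priming is left implicit), and assumes the witness τ for χ is already fixed; whether τ itself satisfies χ would be checked separately by an LTL checker:

```python
# Check the agreement half of chi-closeness: sigma and sigma' must be equal
# at every step where the synchronization bit sb is true in the witness tau.

def close_under_witness(sigma, sigma_prime, tau_sb):
    """True iff sigma_i == sigma'_i at every step i with sb true in tau."""
    return all(a == b for a, b, sb in zip(sigma, sigma_prime, tau_sb) if sb)

sigma       = [{"r"}, set(), {"r"}, {"r"}]
sigma_prime = [{"r"}, {"r"}, {"r"}, {"r"}]   # step 1 is corrupted
tau_sb      = [True, False, True, True]      # sb low exactly at the corrupted step
print(close_under_witness(sigma, sigma_prime, tau_sb))  # → True
```

With the all-true witness Gsb, the same pair of traces would not be close, since the corrupted step is then required to agree.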

Input robustness
In this subsection, we will consider the case of corrupted inputs. Here, sb in denotes the synchronization bit for the inputs. This signal can either be visible to the synthesized system or not, depending on whether the environment can produce such information.
For a set of input signals I and output signals O, let Φ be the specification that we want the system to satisfy robustly, that is, even if it receives inputs I ′ that are not identical to the original inputs I. In what follows, we denote a system over the original inputs I by M and a system over the possibly corrupted inputs I ′ by M ′ .
Intuitively, even if a χ_in-robust system M′ is fed with a corrupted input sequence σ′, its output sequence still fulfills the specification when taken together with the original inputs σ. Note that robust satisfaction is a stronger notion than satisfaction. In the extreme case, M′(I′, O) is only G¬sb_in-robust for Φ if Φ allows the outputs to be chosen independently of the inputs, which implies that M′ satisfies Φ.
We define

(2) χ′_in(sb_in, I, I′) = χ_in(sb_in) ∧ G(sb_in → ⋀_{i∈I} (i ↔ i′)),

which states that the traces over I and I′ are χ_in-close.
The following theorem forms the basis of our reduction of synthesizing a χ in -robust system to synthesis with incomplete information. The intuition is that the system does not see the original inputs I but rather the corrupted inputs I ′ , which are constrained to be close to I by the specification Θ . A graphical representation can be seen in Fig. 1.
Note that if we make sb_in visible to the system, the theorem still holds, as the proof does not depend on the fact that M′ cannot observe sb_in. This setting would be appropriate if the environment knows whether the inputs are reliable. According to this theorem, if (3) is realizable, it gives us a χ_in-robust system.
We will now give an example of an input robust system. Example 1 (χ in -Robust arbiter) The example is inspired by the arbiter of Bloem et al. [4]. The system has a single request input r and a single grant output g. The requirement is that g stays low as long as r is low. When r goes high, g eventually goes high. g should then remain high until r goes down again, after which g eventually goes down. The assumption on the environment is that the request r will not be withdrawn until it is granted with g, and not raised again until g is low.
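These informal requirements can be written in LTL. The following is one plausible formalization (our sketch, using W for weak until; the article's exact formulas for this example are not reproduced here):

```latex
% One possible reading of Example 1 (a sketch, not the article's exact spec):
%   assumption: r is not withdrawn before g, and not re-raised while g is high
%   guarantee:  g stays low until a request; requests are eventually granted;
%               grants persist while r is held and are eventually lowered
e(r,g) \;=\; G\bigl((r \land \lnot g) \to X r\bigr) \;\land\;
             G\bigl((\lnot r \land g) \to X \lnot r\bigr)
\\[4pt]
s(r,g) \;=\; (\lnot g \mathbin{W} r) \;\land\; G(r \to F g) \;\land\;
             G\bigl((g \land r) \to X g\bigr) \;\land\;
             G\bigl((g \land \lnot r) \to F \lnot g\bigr)
```

The specification to synthesize is then Φ(r, g) = e(r, g) → s(r, g), in line with the implication form from the introduction.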
A system that fulfills Φ can be seen in Fig. 2a.
Now consider the following robustness specification, in which the inputs can be erroneously flipped at most once in every three time steps.
Note that this is a primitive error-correcting code. By considering three consecutive steps, we can reconstruct whether r was high in all three or low in all three: If r is high (low) thrice consecutively, then r ′ is high (low) at least twice. Figure 3 demonstrates a trace that distinguishes the robust behavior of the system shown in Fig. 2b from the arbiter in Fig. 2a.
An example of a χ_in-robust arbiter for this robustness specification, synthesized from the specification by our tool Party, can be found in Fig. 2b. The robust arbiter does not react to changes in r′ immediately. Rather, it waits until the input remains the same for two consecutive time steps before taking action. This means that the system may take two steps to react to a change in r: if r goes high, the robustness specification allows r′ to remain low for one step, and when r′ goes high, the system waits for one more step before reacting.
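The waiting behavior just described can be mimicked by a simple debouncing filter. The following sketch is our own illustration (not Party's actual output); it shows how acting only on inputs that have held the same value for two consecutive steps masks single-step corruptions:

```python
# Debounce the possibly corrupted input r': trust a value only once it has
# been observed in two consecutive steps. Under the "at most one flip in any
# three steps" fault model, two equal consecutive readings are genuine.

def debounced(r_prime):
    """Yield the arbiter's view of r: the last value seen twice in a row."""
    view, prev, out = 0, None, []
    for cur in r_prime:
        if cur == prev:      # stable for two steps: safe to trust
            view = cur
        out.append(view)
        prev = cur
    return out

# r goes high at step 2 and low at step 5; single-step glitches are ignored,
# and the debounced view lags one step behind each genuine change.
print(debounced([0, 0, 1, 1, 1, 0, 0]))  # → [0, 0, 0, 1, 1, 1, 0]
```

A spurious single-step flip (e.g., [0, 1, 0, 1]) never changes the view, which matches the intuition that the robust arbiter cannot be tricked by one corrupted reading.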

Recovery specifications
If the inputs to a system are not reliable, it may not be possible to produce outputs satisfying the specification. It thus makes sense to relax the requirements on the outputs a little, stating, in effect, that if the inputs are close to what they should be, then so are the outputs. For instance, if the specification is G(r ↔ g) and the input can be misread once every ten ticks, then the output cannot always conform to the original input, but may also be incorrect once in every ten time steps. A more interesting example, showing that a specification may not always be χ_in-robust-realizable, follows.

Example 2 (χ in -Robust arbiter)
To continue Example 1, suppose that we have the additional requirement that the grant signal g must be true if the request signal r was true in the previous step. Hence, if Φ(r, g) denotes the specification from Example 1, then the new specification is Φ′ in (6) below. This specification is realizable by a system that replies to a request in the step after the request is raised. The arbiter in Fig. 2a satisfies the specification in (4), but it is not robust w.r.t. (5). The arbiter in Fig. 2b is robust w.r.t. (5), but is not robust w.r.t. the new specification Φ′ in (6).
In fact, given the robustness specification (5) from Example 1, there is no system that is χ_in(sb_in)-robust for Φ′. The reason is that two scenarios are indistinguishable to the system: (1) r remains low, but r′ goes up incorrectly, and the system must leave g low; and (2) r goes up, r′ follows one tick later, and the system must raise g immediately. The question is whether a system exists that almost fulfills Φ′.
We let the user define a recovery specification χ_out over sb_out describing when the outputs may deviate from what the specification prescribes. We define χ′_out(sb_out, O, O′) in analogy to χ′_in(sb_in) [see (2)]. In the following, we present two alternative definitions. We will start with a trace-based view of what correct behavior is and then present a system-based view.

χ in -χ out trace robustness
Intuitively, if the input is close to the intended input, then the output must also be close to a correct output. The definition is illustrated in Fig. 4a, where the ≈ boxes denote the closeness of traces. Note that in general, the notion of χ_in-χ_out trace robustness is neither stronger nor weaker than the regular notion of satisfaction. For instance, in general, (G¬sb_in)-(Gsb_out)-robustness for Φ is harder to satisfy than the specification Φ without any robustness requirements, as argued in the paragraph following Definition 3. On the other hand, any system is (Gsb_in)-(G¬sb_out)-robust for Φ, as long as Φ allows for some correct output for any input. (See also Example 4.)

(6) Φ′(r, g) = Φ(r, g) ∧ G(r → Xg).

Before explaining how synthesis can be performed for χ_in-χ_out trace-robust systems, let us first give an example.

Example 3 (χ_in-Robust arbiter) Continuing from Example 2, we can introduce the recovery specification
which states that the output can be disregarded for two ticks starting at a rising edge of r.
We can express χ_in-χ_out trace robustness as a formula in quantified propositional temporal logic. We will use existential quantification to express that there exists a correct output that is close to the actual output π′.
Fig. 4 χ_in-χ_out trace robustness vs. χ_in-χ_out tree robustness. In both subfigures, we have that the trace defined by I and O satisfies Φ. To generate a χ_in-χ_out tree-robust system (M′ in the right diagram), we synthesize M′ and W according to (10), which guarantees that M′ satisfies the recovery specification.

Proof Let σ ∈ S(I), σ′ ∈ S(I′), and σ ≈_{χ_in} σ′ with witness τ_in ∈ S({sb_in}). Then σ ‖ σ′ ‖ τ_in ⊧ χ′_in(sb_in, I, I′), and therefore there exist π ∈ S(O) and τ_out ∈ S({sb_out}) such that σ ‖ σ′ ‖ τ_in ‖ π ‖ M′(σ′) ‖ τ_out ⊧ χ′_out(sb_out, O, O′) ∧ Φ(I, O). This implies that π ≈_{χ_out} M′(σ′) with witness τ_out and that σ ‖ π ⊧ Φ(I, O), and therefore M′(I′, O′) is χ_in-χ_out trace robust for Φ by Definition 4. ◻

Complexity The synthesis procedure for QPTL formulas with an ∀∃ quantifier prefix is doubly exponential. We can first build a nondeterministic Büchi word automaton with exponentially many states from the formula χ′_in(sb_in) → (χ′_out(sb_out) ∧ Φ) [28]. From this automaton, we can existentially quantify O and sb_out without an additional blowup [25]. Viewing the word automaton as a tree automaton, we can then apply the construction for partial observability from [17], which yields an alternating tree automaton of the same size. This alternating tree automaton can be turned into a Mealy machine using the construction of Kupferman and Vardi [17, Theorem 4.6]. The last step incurs another exponential blowup. The double-exponential complexity of this algorithm is optimal because LTL synthesis (which is doubly exponential) can be reduced to the synthesis of χ_in-χ_out trace-robust systems, with χ_in and χ_out defined as identities.

χ in -χ out tree robustness
We will now present a stronger definition of χ_in-χ_out robustness. Definition 4 requires that every actual output trace is close to a correct output trace. It does not require that there exists a correct system that produces these traces. We can take exactly that criterion as a definition and require that such a "witness" system exists.
Definition 5 (χ_in-χ_out-Tree-robust system) A system M′(I′, O′) is χ_in-χ_out-tree-robust for Φ if there exists a system W(I ∪ I′, O ∪ {sb_out}) such that the required closeness conditions hold for all sequences of sb_in or, developing according to (2) and (7), the corresponding QPTL formula is satisfied. We refer to this definition as "tree" robustness since it refers to the computation tree of a system rather than to a single trace. Thus, instead of requiring the output to be close to a correct trace, as we did in the last section, we now ask for a system that is "close" to a correct system W in the sense that its output traces are always close to the outputs of W. This definition is illustrated in Fig. 4b.
In some ways, trace robustness and tree robustness are similar. For instance, if we set χ_out = Gsb_out, then χ_in-χ_out tree robustness for Φ requires the system to produce correct outputs even if the input is adulterated according to χ_in, which is a stronger notion than plain adherence to Φ. Similarly, the notion can be weakened by choosing a χ_out that allows sb_out to be false.
However, the notions of Trace and Tree Robustness do not coincide. For any output trace of M ′ , system W witnesses the existence of a trace that is close to it. Thus, tree robustness implies trace robustness, but not vice versa, as the following example illustrates.

Example 4 (Trace robustness vs. tree robustness)
To see that the definitions of tree and trace robustness are not the same, suppose that r is an input and g is an output. Let Φ = G(Xr ↔ g), χ_in = Gsb_in, and χ_out = G(Xr → sb_out). It should be clear that Φ is not realizable, and thus Φ is not χ_in-χ_out tree-realizable. However, Φ is χ_in-χ_out trace-realizable (by a system that always outputs g). Thus, one might see tree-realizability as the more interesting notion, as it precludes the possibility that a specification that is not realizable in the first place is realized "robustly".
Synthesis of tree-robust systems is a distributed synthesis problem of constructing M′ and W. The proof that this distributed synthesis problem is decidable is given in the next theorem. The theorem depends on the notion of an information fork [12]. Intuitively, a component in a distributed synthesis setting receives information when it has an input. An information fork occurs when the information available to the components is not totally ordered: there are components P and P′ such that P receives information that P′ does not receive, and P′ receives information that P does not receive.
In our case, W receives all information that M′ receives: W has inputs I′ and I, whereas M′ only receives I′. Thus, the information is fully ordered ({I′, I} ⊇ {I′}) and an information fork does not exist. Since there is no information fork, distributed synthesis is decidable following the construction in [12]. Proof The proof goes along the lines of the synthesis procedure shown in [12]. Our architecture is ordered in an information sense: W has all the information that M′ has. Whereas [12] assumes that the specification is given as a μ-calculus formula, we have a specification in LTL. While μ-calculus formulas can be translated into alternating tree automata in polynomial time [12, Theorem 4.4], [18], for LTL the translation into a tree automaton (a game) results in an exponential blowup [6]. The rest of the construction outlined in [12] can be followed as is, resulting in a triply exponential bound. ◻

Examples of robust specifications
We begin by showing several robustness specifications, corresponding to the definitions (R1)-(R3).
Recall that (R1) defines a system to be robust if it satisfies s even if the inputs are corrupted for a finite number of steps. We can translate this definition into the LTL robustness specification

(12) χ_in(sb_in) = FGsb_in.

Similarly, (R2) is captured by the robustness specification

(13) χ_in(sb_in) = G(sb_in ∨ Xsb_in ∨ ⋯ ∨ X^k sb_in),

i.e., the inputs are corrupted for at most k consecutive steps. (R3) is the most general definition and therefore admits (12) or (13) as special cases, but also other specifications such as GF¬sb_in (the inputs will be corrupted infinitely often), or a specification stating that we trust the inputs only if they are equal. A plausible scenario is when the relationship between the inputs is an indication of whether they can be trusted. For example, when using an error-correcting code, the relation between the bits in a packet and the parity bit at its end determines whether to trust the packet.

Robustness against unreliable environments
In the previous section, we focused on robustness against corrupted signals; let us now focus on robustness against environment assumptions that cannot always be trusted.

Assumption robustness
As explained in Sect. 3, the number of times that a specification is violated, as prescribed by (R7), is generally not well-defined, because temporal formulas are either satisfied by a path or not (but see [5]). We can, however, restrict ourselves to invariant properties, i.e., formulas of the form Gφ where φ is propositional. We then interpret 'failure' as a temporary failure of the invariant (this is the same interpretation of robustness taken by Ehlers [9]). Hence the failure count refers to the number of time frames in which φ is false. We denote by χ_e a robustness specification that relates to an assumption failure. It is defined over a synchronization bit sb_e.
As explained before, our approach is more general than the one in [9] because we have the freedom to specify the robustness specification, whereas [9] suggested an algorithm tailored for environments that are eventually reliable. In the case of Ehlers [9], χ_e can be thought of as being fixed to FGsb_e. In addition, our method works with any existing synthesizer, whereas [9] requires a tailored algorithm based on a generalized Rabin(1) automaton.
Let our specification Φ = e → s be defined by e ≡ Gφ_e and s ≡ Gφ_s, where φ_e, φ_s are propositional.
To synthesize a system that is robust against environment failures, we repeat the same pattern that we used in the previous sections. We first define

χ′_e(sb_e) = χ_e(sb_e) ∧ G(sb_e → φ_e).

That is, χ′_e is true for paths that satisfy χ_e and in which, each time sb_e is true, the assumption φ_e holds.
We define robust systems as follows.
Definition 6 (χ_e-Assumption-robust system) Intuitively, every path in M′ must satisfy Gφ_s if in every time step either sb_e is false or φ_e is true. This is of course a stronger requirement than satisfying Φ itself, because the latter is trivially satisfied once there is a time step in which the environment assumption is violated. Thus, every χ_e-assumption-robust system is correct. As an example of using (17), synthesizing it with χ_e(sb_e) = FGsb_e [see (18)] yields a system that is robust according to (R4). We continue with an example of a formula that is synthesized according to a robustness specification relating to its assumptions. Fig. 5a shows a system realizing Φ. Once the assumption fails, the system transits to a 'sink' state in which it does not satisfy s.
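Under the definitions above, the assumption-robust synthesis pattern can be sketched as follows (our reading of the shape of (17), using χ′_e from the previous paragraph; the exact typeset formula may differ):

```latex
% Sketch of the assumption-robust synthesis formula (our reconstruction of
% the pattern behind (17)); phi_e, phi_s are the propositional bodies of the
% assumption and guarantee invariants:
\Phi \;\land\; \bigl(\chi'_e(sb_e) \to G\,\varphi_s\bigr),
\qquad\text{where}\qquad
\chi'_e(sb_e) \;=\; \chi_e(sb_e) \,\land\, G\bigl(sb_e \to \varphi_e\bigr).
```

The conjunction with Φ keeps the synthesized system correct in the ordinary sense, while the second conjunct forces Gφ_s on every path on which the assumption holds as often as χ_e demands.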

Example 5 Consider the specification

(19) Φ = G(r1 ∨ r2) → (G(r1 → g1) ∧ G(r2 → g2)).

Suppose that the system specified by (19) works in an adversarial environment, and we want it to start trusting the inputs only after it receives some password confirming the identity of the environment. This pattern can be specified with (20), where pwd is the password. For this example, assume that pwd = r1 ∧ r2. The systems depicted in Fig. 5b are the result of synthesizing (17). Note that as long as sb_e is down (at state 0, before the password was given), violating the assumption does not lead us to a sink state, and the guarantee still holds.

Similarly, for (R7), which defines robustness as the requirement that the guarantee s holds even if the environment violates e up to k consecutive steps, we can once again use the pattern of (17) while defining χ_e as follows:

χ_e(sb_e) = G(sb_e ∨ Xsb_e ∨ ⋯ ∨ X^{k−1} sb_e).

Indeed, this definition guarantees that every path that satisfies φ_e at least once in every window of size k satisfies the guarantee Gφ_s as well.

Recovery specifications for environment failures
Similar to Sect. 4.3, we now consider the problem of synthesis with a recovery specification.

Example 6
Suppose that we allow the system to provide an arbitrary output for k time steps after a failure of the environment to satisfy e. We will denote by χ_s the recovery specification; it operates over a synchronization bit sb_s. Hence, for our example, we specify (22). It remains to specify the behavior of sb_s in the first k steps. We may require that in step i ≤ k, the synchronization bit sb_s is high if in all previous steps the assumption φ_e is true. In this case, we conjoin χ_s(sb_s) with (23) below. We can now define χ_s-robustness and, again, we have the choice between defining a trace variant (cf. Definition 4) and a tree variant (cf. Definition 5). For simplicity, we will only define the tree variant here. Definition 7 (χ_s-Tree-robustness) A system M′(I′, O′) is χ_s-tree-robust for Φ if there exists a system W(I, O) such that the corresponding closeness condition holds. Interestingly, this definition does not refer to χ_e. χ_s in itself can refer to e, as can be seen in (22). In other words, here we synthesize a system that satisfies s based on the satisfaction of e. In contrast, in Definition 6, we required that Gφ_s holds [see (17)] if the environment assumption holds according to χ_e. It should be possible to combine the two definitions, i.e., the specification will be vacuously satisfied if sb_e ∧ ¬φ_e. A different direction is to replace the recovery specification e with e ∨ ¬sb_e, and hence strengthen the recovery requirement. We leave such directions for future research.
The following example demonstrates the result of adding the recovery specification (22) with k = 1.

(23) (φ_e → sb_s) ∧ (φ_e ∧ Xφ_e → Xsb_s) ∧ ⋯ ∧ (φ_e ∧ Xφ_e ∧ ⋯ ∧ X^k φ_e → X^k sb_s).
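For a concrete k, the constraint (23) on the first k steps can be unrolled mechanically. The helper below is a hypothetical illustration (emitting plain-text LTL with X^i written as repeated X's) that builds the conjunction as a string:

```python
# Unroll constraint (23): for each i <= k, if phi_e held at steps 0..i,
# then the synchronization bit sb_s must be high at step i.

def first_steps_constraint(k, phi="phi_e", sb="sb_s"):
    """Return (23) for the given k as a plain-text LTL string."""
    def X(i, f):  # X^i f, written as i repeated X operators
        return "X" * i + f
    conjuncts = []
    for i in range(k + 1):
        premise = " & ".join(X(j, phi) for j in range(i + 1))
        conjuncts.append(f"(({premise}) -> {X(i, sb)})")
    return " & ".join(conjuncts)

print(first_steps_constraint(1))
# → ((phi_e) -> sb_s) & ((phi_e & Xphi_e) -> Xsb_s)
```

Such generated conjunctions can be pasted into the specification handed to the synthesizer, which is in line with the article's approach of encoding robustness and recovery purely at the specification level.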

Fig. 6 An assumption-robust system with a recovery specification

Figure 6 shows a system satisfying the resulting specification. The system satisfies Φ from (19), with the robustness specification χ_e(sb_e) from (20) (where again pwd = r1 ∧ r2) and the recovery specification (25). Note that after a violation of the assumption with ¬r1 ∧ ¬r2 at state 0, the system transitions to state 0′ with an arbitrary output and then takes one additional transition with an arbitrary output (in this case, Party chose ¬g1 ∧ ¬g2 as the arbitrary output).

Conclusion
The incompleteness of most specifications gives a certain freedom to the synthesis algorithm. It is left to the designer of the synthesis algorithm to set criteria for which system is preferable among the large set of options. Several previous publications suggested using this freedom to synthesize a robust system. Each publication had its own definition of robustness, and each suggested a synthesis method to achieve a robust system accordingly. This article showed a general framework that enables specifying the expected robustness with LTL formulas χ_in and χ_e. From there on, it reduces the robust synthesis problem to that of normal synthesis. Furthermore, with LTL formulas, we enable the user to specify a recovery specification. Our framework is flexible and easy to use with existing reactive synthesizers. As we showed, it covers most of the previously published definitions of robustness, as well as several new ones that we introduced here. Our tool Party, which implements our method, is available for download [20].