
1 Introduction

Prescriptive process monitoring methods recommend runtime interventions that optimize the performance of a process with respect to one or more performance measures, such as the success rate – the percentage of cases of a process that end in a positive outcome [15]. For example, prescriptive process monitoring methods may recommend the next task to execute or the resource to assign a task to.

Prior work on prescriptive process monitoring focuses on developing algorithms to learn intervention policies from execution data based on process mining [12, 28], machine learning [8, 20, 26], or causal inference [3, 24] methods. In contrast, little attention has been given to ensuring that the outputs of these techniques are understandable and useful, although this has been highlighted as one of the challenges of applying process mining in organizations [17]. Only a handful of studies discuss the understandability or usefulness of prescriptive process monitoring outputs [6, 19] or include an interface for end users [12, 28]. However, in all cases, the information needs of end users, such as process workers, were neither explicitly analyzed nor evaluated. Previous research highlights the need to provide end users with suitable information to facilitate technology acceptance, with examples in expert systems [29] and, more recently, recommender systems [1]. Cases in manufacturing [5] and construction [16] show that information provided to end users should be comprehensible and supportive of the task at hand in order to be useful. In the context of prescriptive process monitoring, it has been shown that users sometimes do not follow recommendations produced by such solutions even if they understand them [6].

In light of this, our research objective (RO) is to develop an interface that provides end users with relevant information items generated by prescriptive process monitoring methods. To pursue this objective, we follow a design science methodology [11]. We first analyze existing prescriptive process monitoring methods and tools to elicit common information items. We then create a user interface concept, realized as a wireframe, and evaluate it with experts. Based on the feedback, we refine the wireframe. The evaluation shows that different end-user groups (operational users, tactical managers, and process analysts) could benefit from the information items included in the interface.

The contributions of the paper are a wireframe for a prescriptive process monitoring interface and an evaluation of information items that may help users to decide where and when to trigger interventions in a process. These contributions are relevant for developers of process mining tools and researchers. Developers benefit from a better understanding of end users’ informational needs, and researchers gain insights into possible avenues for future work.

2 Background and Related Work

Prescriptive process monitoring methods recommend interventions to optimize performance measures, such as success rate (percentage of cases that end in a positive outcome) [7, 24], on-time completion rate [26], cycle time [3], or processing time [20]. In the past five years, the variety of methods has grown and new methods are being proposed [15]. Existing methods differ w.r.t. the interventions they prescribe, such as the next task in a case [26] or the resource to assign a task to [27], and w.r.t. the basis of these prescriptions. Along the latter dimension, methods can be correlation-based or causality-based. For example, the authors of [13] and [7] propose methods that prescribe interventions based on correlation-based predictions of case outcomes. Causality-based methods estimate the effect of an intervention, in addition to predicting the case outcome. For example, several methods [3, 24] estimate the CATE (Conditional Average Treatment Effect) of an intervention at each point during the execution of a case to recommend interventions that maximize a performance measure.
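To illustrate how a causality-based method can turn a CATE estimate into a recommendation, the following sketch uses a simple two-model (T-learner) estimator over synthetic data; the feature names, the estimator choice, and the 0.05 uplift threshold are our own illustrative assumptions rather than the policies of the cited methods.

```python
# Minimal sketch of a CATE-based recommendation policy (T-learner style).
# Feature names, estimator, and threshold are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Toy prefix features of completed cases: [elapsed_time, amount, num_activities]
X = rng.normal(size=(1000, 3))
treated = rng.integers(0, 2, size=1000)            # 1 = intervention was applied
outcome = ((X[:, 0] + 0.8 * treated * (X[:, 1] > 0)
            + rng.normal(size=1000)) > 0).astype(int)

# T-learner: fit one outcome model per group (treated vs. not treated).
m_treat = GradientBoostingClassifier().fit(X[treated == 1], outcome[treated == 1])
m_ctrl = GradientBoostingClassifier().fit(X[treated == 0], outcome[treated == 0])

def recommend(prefix_features, threshold=0.05):
    """Estimate the CATE for an ongoing case prefix and recommend the
    intervention when the estimated uplift in the positive-outcome
    probability exceeds the (illustrative) threshold."""
    x = np.asarray(prefix_features, dtype=float).reshape(1, -1)
    cate = m_treat.predict_proba(x)[0, 1] - m_ctrl.predict_proba(x)[0, 1]
    return cate, bool(cate > threshold)

print(recommend([0.2, 1.1, -0.3]))
```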

While there is a substantial body of techniques for generating recommendations for prescriptive process monitoring, only a handful of studies consider the design of user interfaces to communicate these recommendations. In this respect, one study proposes an interface for a tool that allows for discovering and visualizing treatment rules that increase the probability of positive case outcomes based on causal machine learning [2]. In [28], the authors propose a UI to recommend a process trace (representing a treatment) in the medical domain. In [12], the authors provide a UI to review case goals, predictions, and recommendations. However, these interfaces center on the technological capabilities of the introduced methods rather than on end users. For instance, visualizations such as confusion matrices, scatter plots for clusters, and bar charts for attributes are used in [2, 28]. However, even end users with knowledge of BPM and ML struggle to comprehend plots used for explainability [22]. In this paper, we analyze both the technical capabilities of existing methods and the needs of end users to ensure that the information items presented to them serve their needs.

Commercial process mining tools also consider the design of interfaces for predictive and prescriptive functionality. For instance, Apromore and Appian (formerly Lana) include predictive process monitoring interfaces that highlight cases with a high probability of leading to a negative outcome. Other tools, such as ABBYY Timeline, provide functionality to prevent deviating process flows. Celonis Action Engine generates suggestions for continuous improvement based on data. In summary, these tools generate alert-based recommendations [7] based on predictions or correlations. In contrast, in this paper, we consider information items that are applicable to a broader set of prescriptive methods, including causality-based methods and user guidelines [15].

3 Research Method

The aim of this study was to develop an interface that provides users with relevant information items generated by prescriptive process monitoring methods. We followed the design science methodology (DSM) [11] to achieve this aim. The design science methodology prescribes beginning with exploring the relevance of the problem and defining the objectives. In our case, the problem identified from prior work is related to defining relevant information items for users of prescriptive process monitoring outputs (Sect. 2). In the next phase of DSM, we identified our objectives. To do so, we analyzed tools and conducted a domain analysis to elicit information items for our interface. This corresponds to step 1 in Fig. 1 (Sect. 3.1). The next phase of DSM is Design & Development of an artifact. We developed an initial wireframe for prescriptive process monitoring (step 2.1, Sect. 3.2). In the next phase, Evaluation & Refinement, we evaluated the wireframe with experts and refined it based on the feedback we received (steps 2.2, 2.3, Sect. 3.2). This paper communicates our findings. In the future, we aim to conduct one more iteration of the Design & Development and Evaluation & Refinement phases by implementing an interactive prototype and conducting a usability evaluation (step 3). This step is outside the scope of this paper.

Fig. 1. Research process (steps in the scope of this paper are in the blue dotted box).

3.1 Step 1: Information Items Elicitation

We analyzed existing prescriptive monitoring tools to identify information items to include in a prescriptive process monitoring interface. By tools we refer to academic and commercial solutions based on prescriptive process monitoring algorithms. We also drew insights from literature using the prescriptive process monitoring framework presented in [15]. The aim of these analyses was to identify the capabilities of existing prescriptive process monitoring methods.

Step 1.1: Tools Analysis. First, we analyzed commercial process mining solutions based on a survey of 17 process mining tools [25]. We reviewed the list to identify tools that provide “recommendations (prescriptive analytics)”. According to the list, only Celonis has prescriptive functionality. We also manually examined each of the listed solutions, since new features could have been introduced after the survey was conducted. Thus, we added SAP Signavio and ABBYY Timeline to the analysis. Next, we included academic solutions that propose an interface for a prescriptive method ([12, 28]). The selection of these tools is based on the review of prescriptive process monitoring methods in [15]. In total, we identified five solutions that provide interfaces.

We analyzed the selected tools using a visual analytics framework [18]. More specifically, we extracted how each tool answers the questions “What?” (items and attributes), “Why?” (performed task, usually expressed as a verb and a noun), and “How?” (visualization elements), as prescribed by the framework [18, 21]. For example, one tool [28] presented a statistical analysis (what) to explain the calculation (why) using a heatmap (how).
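To make the framework application concrete, the sketch below records such a coding as what/why/how entries; the field values are illustrative examples of the structure of the analysis, not the full coding table.

```python
# Illustrative what/why/how coding of analyzed tools, following [18, 21];
# the entries only exemplify the structure of the analysis.
tool_coding = [
    {"tool": "[28]", "what": "statistical analysis",
     "why": "explain the calculation", "how": "heatmap"},
    {"tool": "[12]", "what": "case goals, predictions, recommendations",
     "why": "describe the recommendation", "how": "tables and charts"},
]

for entry in tool_coding:
    print(entry["tool"], "->", entry["why"], "|", entry["what"], "|", entry["how"])
```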

We clustered the results and elicited four main tasks (“whys”) as per the visual analytics framework: (i) Describe Case, (ii) Describe Recommendation, (iii) Explain Recommendation, and (iv) Assign Resource. These tasks served as the basis for the groups of information items in the interface. For example, we refined the first task into the group of information items “Case Description”, which includes information items related to describing the ongoing case.

Step 1.2: Domain Analysis. In this step, we analyzed the prescriptive process monitoring framework presented in [15]. In this framework, existing methods are categorized according to their objective, intervention types, modeling technique, and policy. We created a UML domain model [23] for prescriptive process monitoring based on the framework. We then drew insights related to additional information items to include. Such items include, for instance, the type of an intervention and its frequency (see Sect. 4.1 for the elicited information items).
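For illustration, the sketch below renders a few of the domain model's concepts as Python dataclasses; the class and attribute names reflect our reading of the framework in [15] and the model in Fig. 2, not a normative schema.

```python
# Illustrative rendering of parts of the domain model as dataclasses.
# Class and attribute names follow our reading of [15]; they are not a schema.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class Frequency(Enum):
    DISCRETE = "discrete"      # intervention can be triggered a fixed number of times
    CONTINUOUS = "continuous"  # intervention can be triggered at any point in the case

@dataclass
class Intervention:
    name: str                  # e.g., "send reminder to applicant"
    intervention_type: str     # e.g., next task, resource assignment
    frequency: Frequency

@dataclass
class Recommendation:
    intervention: Intervention
    predicted_effect: float    # estimated change in the performance measure
    explanation: Optional[str] = None

@dataclass
class Case:
    case_id: str
    attributes: dict = field(default_factory=dict)  # e.g., amount, purpose, applicant
    recommendations: List[Recommendation] = field(default_factory=list)
```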

3.2 Step 2: Wireframing & Evaluation

Step 2.1: Wireframe Design. Using Figma, we created a simple wireframe [9] that the experts would use during the evaluation. We used a loan application process to exemplify the included information items, since it is one of the best-known and most widely used event logs in the community.

Step 2.2: Information Relevancy & Usefulness Evaluation. Finally, we conducted an evaluation of the initial wireframe to assess the included information at an early stage, before investing resources into developing a working prototype. We particularly focused on assessing the relevancy and usefulness of the information items. For this, we recruited 13 process mining experts from different consultancies and companies that conduct process improvement projects (Table 1). The aim was to recruit individuals that have an overview of the needs of different end users. Our participants had, on average, seven years of experience with process mining. Ten participants represented the consulting domain, and three were from product-oriented companies.

Table 1. Evaluation interview participants.

We conducted semi-structured interviews with the participants. This approach is suitable because we wanted the participants to be able to discuss their own perspectives [10]. During the interviews, we showed the wireframe and described the visualized information items using the example of the aforementioned loan application process. More specifically, we explained that the interface would allow the user to gain an overview of open loan applications and to optimize ongoing cases. To add focus, we asked the participants to think about a recent situation where a similar interface could be used. This allowed us to discuss a specific situation instead of collecting scattered opinions from different contexts. After introducing the information, we asked each participant three questions based on our research objective. The first question aimed at evaluating the relevancy of each group of information items to the task of optimizing ongoing cases: “Which information do you find most/least relevant and why?” With the second question, the interviewee was asked to evaluate the usefulness of the information w.r.t. recommendations in the ongoing case: “Which information do you find most/least useful and why?” Finally, with the third question, the participant provided suggestions on crucial information items not yet included: “What information do you think is missing?”

The interviews lasted between 14 and 25 min. We recorded the interviews, transcribed the audio files with Otter.ai, and manually corrected the transcripts. Then, we used thematic analysis [4] to analyze the interviews, combining deductive and inductive coding. One researcher first familiarized themselves with the interviews and created an initial set of codes based on our research objective. This set included codes related to the information items elicited in Sect. 3.1. For instance, we applied the code “Effect of recommendation” to parts of the interviews that referred to information included in the group Recommendation Explanation, and “Process model is relevant” to comments about the process model included in Case Description. While coding, we noticed the need for additional codes, such as “End user”, which related to comments the interviewees made about information being relevant for different user groups. Thus, we added this code to the list. We discussed the codes within the research team, refined them by marking which group of information items each code related to, and iterated the coding procedure. For example, “Effect of recommendation” was refined into “Recommendation explanation: Effect of recommendation”. We then clustered the codes into themes. We identified seven themes in total: one for each group of information items (e.g., “Case Description”) and the additional themes “End User”, “Cases Prioritization”, and “Overview (Multiple Cases)”. For example, the theme “End User” captured comments about the information needs of different user groups, and the theme “Cases Prioritization” captured the importance of prioritizing ongoing cases according to different criteria.

Step 2.3: Wireframe Refinement. Finally, we summarized the findings from the evaluation. Utilizing the findings about relevant information items, we adapted the wireframe (see Sect. 4.2).

4 Results

This section is organized along the methodological steps described in Fig. 1. We first elaborate on information items to be included in the interface (step 1, Sect. 4.1). Then, we describe the wireframe design, the findings from the wireframe evaluation, and the refinement based on the evaluation results (step 2, Sect. 4.2).

4.1 Step 1: Information Items Elicitation

Step 1.1: Tools Analysis & Step 1.2: Domain Analysis. As to the information items required in a prescriptive process monitoring interface (RO), we elicited four groups. Case Description (i) includes items describing the general attributes of an ongoing case. The second group, Recommendation Description (ii), covers basic information about the prescribed recommendations. The third, Recommendation Explanation (iii), elaborates on how a recommendation is calculated. Finally, Resource Assignment (iv) provides an overview of resources, facilitating the assignment of a suitable resource to the recommendation.

Fig. 2. Domain model for prescriptive process monitoring (based on [15]).

Table 2. List of elicited information items. Screen numbers and information item numbers correspond to the wireframe in Fig. 3.

In addition, we categorized specific information items under each group. For example, the information item “process visualization” (process model) is included in the group Case Description because it provides general information about an ongoing case. Similarly, remaining time, impact, and statistical analysis are part of the group Recommendation Explanation (see Table 2). Furthermore, we created a domain model (Fig. 2) derived from the prescriptive process monitoring framework [15] to provide additional input to groups (i), (ii), and (iii). For instance, we included intervention frequency, which can be discrete or continuous, in the Recommendation Description (ii) group. The elicited information items are summarized in Table 2.

4.2 Step 2: Wireframing and Evaluation

Step 2.1: Wireframe Design. We designed a wireframe based on the information items elicited from the tools review and the domain model in the first step of our study (Table 2). The wireframe represented a loan application process. A case in this process is a loan application, and case attributes are the requested amount, purpose, and applicant.

For the wireframe, we followed the common interface layout of process mining tools, e.g., including tabs for different categories, such as Cases and Resources. The wireframe consists of four screens (Fig. 3). The groups of information items from Table 2 refer to one individual ongoing case. However, in a process, multiple concurrent cases are being executed. Therefore, screen #1 refers to a group of ongoing cases (loan applications) and shows an overview of case attributes. The subsequent screens refer to a single ongoing case. Screen #2 covers information items categorized under Case Description (i in Table 2), whereas screen #3 incorporates information items of the Recommendation Description (ii) and Recommendation Explanation (iii) groups. Lastly, screen #4 covers information items of the Resource Assignment group (iv).

Fig. 3. Initial wireframe for the prescriptive process monitoring interface. Screen numbers and information item numbers correspond to Table 2.

Step 2.2: Information Relevancy & Usefulness Evaluation. In this section, we present the results of the expert evaluation (see summary in Table 3). Our findings indicate that the majority of interview participants (8) pointed towards different information needs for different end-user groups. This is evident from a comment by one interviewee: “But what I would really define upfront is, who is going to be your consumer of the information. Because for me, these are different levels of abstraction.” (I-04). As another interviewee put it, “I was thinking, am I looking at this strategically or operationally? What I’m seeing might change on that basis.” (I-10). More specifically, the participants found different information items in the current wireframe to be relevant and useful for different end-user groups. As such, our evaluation indicates three distinct groups of end users: operational users, tactical managers, and process analysts. The operational user is concerned with processing the ongoing case. The tactical manager is interested in optimizing the resource allocation. Finally, the process analyst seeks to identify improvement opportunities for the business process. In the following paragraphs, we detail the information items for the different user groups.

Operational User. Interview participants expressed that operational users should focus on their main task, i.e., case processing. Therefore, they should not be overloaded with information items. As one participant expressed it: “They [operational users] always need to stay focused, otherwise, they are lost.” (I-06). Similarly, another interviewee said that “the user who’s only doing [processing work] should be fast, and s/he should concentrate on doing task after task; something like this [looking at all screens] would be getting out of this flow.” (I-01).

Seven participants found the recommendation explanation (Fig. 3, screen #3) too complex for operational users. One interviewee exemplified this by stating that “... usually talking about the prediction accuracy, these people [operational workers] don’t understand. Or have any clue about that.” (I-13). At the same time, the majority of the participants (9) considered it important for the operational user to understand the predicted effect a recommendation could have on an ongoing case. As one interviewee expressed it, “I think it would be much more about knowing the outcome of the action rather than suggesting the action. [...] Really, rather to evaluate the impact of the action.” (I-08).

Half of the participants (6) proposed introducing a history of followed recommendations. This would help in evaluating whether a recommendation brought the desired effect in the past: “The next question would be also, if this is completed, did it help? If I follow the recommendation, did I achieve what was predicted, what was in the recommendation as the effect?” (I-06). As another interviewee described, such a history “gives me reassurance.” (I-11).

Furthermore, half of the participants (6) suggested making explanations available on demand. In this way, the user could learn about the calculation details when needed, but these details would not be part of the default set of information items. As described by one interviewee, “And if s/he [user] wants to, s/he can drill down to the information about why is this recommendation. So drill down should be optional. Not mandatory for the whole process.” (I-02).

As to the group of cases (Fig. 3, screen #1), the participants found it relevant for operational users to know which cases are assigned to them. As one interviewee put it, “So it is important for me to have an overview, how many cases do I need to action on? What are the cases that are most important? Because that gives me an idea of my workload.” (I-11). Moreover, the majority of participants (8) expressed that cases should be prioritized based on the process objective, and the user should be presented with attainable cases. In this way, “everything which is one day late can maybe still be on time if we pick it up now.” (I-07).

In summary (Table 3), according to the experts we interviewed, the most relevant information about the recommendation for operational users is the predicted effect of the recommendation and examples of similar past instances where the recommendation was followed. The explanations should be presented to the user on demand. Furthermore, the operational user should have an overview of the cases they are responsible for and know which are prioritized based on the process objectives and the possibility of influencing the ongoing case.

Tactical Manager. According to our interviews, information about resource assignment (Fig. 3, screen #4) can be considered most relevant for tactical managers, i.e., those responsible for managing a group of resources. Several participants (4) considered information items on resource performance and workload necessary for tactical managers to ensure efficient resource assignment: “And that will be interesting, if I have several loan specialists, how quickly are they typically processing a car loan versus a home improvement? So, can I get a KPI for that specific case that tells me: this person is typically better in handling these type of cases.” (I-04). In addition, according to the participants, the tactical manager might be interested in an overview of cases allocated to different teams to “see how many open cases per team member or per department, or is there a significant difference between my teams.” (I-08). In conclusion, the participants of our study perceived recommendations from the resource perspective to be the main interest of tactical managers (Table 3).

Process Analyst. Our participants perceived process analysts to be interested in analyzing the recommendations prescribed for cases in order to elicit policies that improve the overall performance of the business process. One interviewee expressed that “if the user is improving the processes, and s/he needs some kind of recommendation to improve it, it’s the way to go.” (I-01). According to four participants, a process map (Fig. 3, screen #2) is relevant for process analysts but would not be valuable to the other two user groups: “I think that this part [process map], although very nice to have for analytics, is not necessary for guys who are you know, relocate resources and do day to day jobs.” (I-02). In addition, detailed explanations of how recommendations are calculated could help process analysts understand their background and thus decide on new policies: “And a huge level of detail of why such a recommendation is done or what is the base, it is interesting for someone that’s doing process mining, but for someone that’s doing operational work, it can be overwhelming.” (I-11). Therefore, in the context of our study, the most relevant information for a process analyst is a process map and a detailed explanation of the recommendation calculation.

In conclusion, the evaluation provided an indication that, although the information included in the original wireframe was mostly relevant, different items were relevant for different end-user groups. An operational user responsible for ongoing cases might require information on the cases assigned to them, a specific recommendation on how to improve each case, and the estimated effect of the recommendation. A tactical manager, in addition to an overview of ongoing cases and recommendations, might require information on the resources that could be assigned to the recommendations. The information relevant for a process analyst involved in improving the process should include a process model and a detailed explanation of how recommendations are calculated, as a basis for reasoning about improvement policies. Thus, with regard to the RO, an interface for prescriptive process monitoring outputs could be used by different end-user groups and should, therefore, be adjusted depending on the target user.

Table 3. Summary of end users’ information needs.

Step 2.3: Wireframe Refinement. We used the evaluation results to refine the information items included in the wireframe (Fig. 4). According to the evaluation, no new information items had to be added; rather, existing information items had to be adjusted. The evaluation indicated that information needs differ between user groups. We refined the wireframe according to the needs of the operational worker, as operational users can be expected to benefit the most from using the interface: they are responsible for optimizing ongoing cases. To achieve this, we used the summary of the information needs of operational workers from Table 3. As such, we added prioritization to the current duration column on screen #1, as it refers to ongoing cases. On screen #2, we removed the process map, as the evaluation showed it to be less relevant for operational users. In its place, we added information items from the Recommendation Explanation group (iii), making them available on demand. For the explanations, we highlighted the effect of the proposed recommendation and added similar cases. Screen #3, which initially contained the Recommendation Explanation group (iii), was thus removed. We also removed screen #4 with the Resource Assignment group (iv), since its items were found to be more relevant for tactical managers.

Fig. 4. Refined wireframe for the prescriptive process monitoring interface. Screen numbers and information item numbers correspond to Table 2.

5 Discussion

In this section, we discuss information items for a prescriptive process monitoring interface (RO). We draw implications for research (Sect. 5.1) and practice (Sect. 5.2) and highlight the limitations of our study (Sect. 5.3).

5.1 Implications for Research

Our findings indicate that end users of prescriptive process monitoring methods are not homogeneous. We identified three distinct user groups (operational users, tactical managers, and process analysts) that could benefit from prescriptive process monitoring outputs. We also found that each user group might consider different information items as relevant and useful. For instance, our evaluation indicated a potential need for explanations of how recommendations are calculated, but – depending on the user group – at different levels of detail.

Process analysts will likely seek detailed explanations of recommendations to understand why and how to improve the business process. Thus, detailed explanations and process models might be more useful for process analysts. This finding is aligned with [14], which reports that process analysts use process maps and advanced views of the data as a starting point for process improvement. This might also be one of the reasons for the emergence of research on explainable prescriptive process monitoring ([19]). Such works focus on information items that, according to our evaluation, would be best suited for process analysts. However, operational users might view explainability differently. According to our findings, operational users consider information items about the effect of following a recommendation and historical evidence of following the same recommendation as most relevant and useful. This is confirmed by [22], who found information about past actions for similar cases valuable for decision making. Explainability is thus relevant for both operational users and process analysts, but in different ways.

In light of this, operational users seem to require an estimate of the effect of the recommendation on the outcome (e.g., [3, 24]). However, most existing methods rely on correlations between case characteristics and the probability of a given case outcome; in other words, they are correlation-based [15]. Thus, a possible avenue for future research is developing causality-based methods that can estimate the effect of a proposed intervention using causal models.

Existing studies focus on what can be communicated based on the prescriptive method rather than on which user group to target or on their information needs. For instance, in [2], the authors present a tool that allows for discovering and visualizing treatment rules that increase the probability of positive case outcomes, but the intended user is not discussed. Likewise, methods that prescribe next actions, such as [12, 28], also include information items that could be more suitable for process analysts. Thus, the effectiveness of prescriptive methods might be limited if the needs of the intended user group are not considered. Therefore, another direction for future work is the customization of interfaces based on the information needs of the end users.

5.2 Implications for Practice

Our findings indicate that it is helpful to consider the information items that the intended end users consider relevant. However, existing tools providing prescriptive process monitoring functionality (e.g., Celonis) focus predominantly on process analysts, although some parts are better suited for tactical managers (e.g., resource assignment). When commercial tools are enhanced to support operational users, it will be necessary to consider their information needs. Our findings indicate that operational users might require less information than tactical managers or process analysts. Rather, operational users focus on the information needed to execute a recommendation, such as the cases assigned to them, the recommendation itself, and its predicted effect.

We also found that prioritizing ongoing cases can be relevant and useful. When starting to optimize ongoing cases, it is not feasible to address all cases at once. Therefore, ongoing cases could be assigned a priority to help operational users determine which case to work on next. Prioritization can be based on process objectives or on specific organizational criteria.
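As a concrete illustration of such a prioritization, the sketch below scores open cases by a weighted combination of two hypothetical criteria (deadline urgency and the predicted effect of the recommended intervention); the weights, attribute names, and criteria are assumptions for illustration, not items evaluated in our study.

```python
# Hypothetical priority score for ongoing cases. Weights, attribute names,
# and the two criteria (deadline urgency, predicted recommendation effect)
# are illustrative assumptions, not part of the evaluated wireframe.
def priority_score(case, w_deadline=0.6, w_effect=0.4):
    # Less remaining time before the deadline -> higher urgency (clamped to [0, 1]).
    urgency = max(0.0, 1.0 - case["hours_to_deadline"] / 72.0)
    # Larger predicted effect of the recommended intervention -> higher priority.
    effect = min(1.0, max(0.0, case["predicted_effect"]))
    return w_deadline * urgency + w_effect * effect

open_cases = [
    {"id": "A-17", "hours_to_deadline": 10, "predicted_effect": 0.30},
    {"id": "A-23", "hours_to_deadline": 60, "predicted_effect": 0.55},
    {"id": "A-31", "hours_to_deadline": 5,  "predicted_effect": 0.05},
]

# Highest-priority cases first.
for c in sorted(open_cases, key=priority_score, reverse=True):
    print(c["id"], round(priority_score(c), 2))
```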

5.3 Limitations

For our study, we applied the design science research methodology [11]. There are several limitations associated with the different stages of this approach. First, when eliciting information items to include in the interface, we began by analyzing existing tools and reviewing the outputs of existing methods. This could lead to missing information items required by end users. We mitigated this threat by conducting an evaluation with 13 experts from different domains and backgrounds. Second, conducting the evaluation with a different set of experts from other domains might have yielded different results. Using examples of other processes in the wireframe might also have yielded different results; however, we evaluated the information items and not the particular process attributes. Third, we opted to conduct the evaluation with experts rather than end users, which poses an additional limitation. This limitation is acceptable because our aim was to get a broad perspective on the interface. Another limitation is associated with evaluating only the wireframe concept. We acknowledge this limitation for this study; however, we plan to conduct a second evaluation on an interactive prototype using a real-life scenario. Next, when analyzing qualitative data, there is a threat of misinterpreting the data due to bias or subjectivity. To reduce this threat, we discussed the collected data and the analysis within the research team. Finally, we abstain from making causal claims and from prioritizing specific findings. Instead, we describe the observations made and discuss differences in the information items required by different end-user groups.

6 Conclusion

In this paper, we aimed to develop an interface that provides end users with relevant information items from prescriptive process monitoring methods. To achieve this objective, we analyzed existing tools and research to elicit information items to include in an interface. We elicited four main groups of information items: Case Description, Recommendation Description, Recommendation Explanation, and Resource Assignment. We then developed a first version of the interface and evaluated it with experts. The results indicate that the included information items are relevant. However, we also observed that prescriptive monitoring outputs are of interest to three distinct user groups (operational users, tactical managers, and process analysts). Thus, certain information items are more relevant for one user group than for another. The contribution of this paper is an initial version of an interface and a summary of the information items relevant to each of the three user groups when working with prescriptive process monitoring outputs. We also formulate implications for practice, in particular for developers of process mining tools, and provide insight into directions for further academic research. For future work, we aim to implement the wireframe for operational users and evaluate its usability and usefulness with users in a real-life scenario.