Introduction

In Industry 4.0 (Schwab, 2016), also known as the Digital Transformation (Bounfour, 2016), organizations are forced to continuously improve their business processes. This improvement is powered by the constantly expanding possibilities offered by the extremely fast development and practical absorption of information technology (IT) (Novak et al., 2023), and it is a prerequisite for maintaining a competitive position and sustainable development (Van Looy, 2021). Business process management (BPM), which provides a holistic view of the organization, focused on strategic alignment and, at the same time, ongoing efficiency, is one of the most frequently used methods enabling management in accordance with the requirements of Industry 4.0. The implementation of BPM enables organizations not only to ensure ongoing effective operations, but also to continuously redesign and reengineer them (Bispo et al., 2019; Hammer, 2015; Davenport, 2015). However, this requires the organization to constantly monitor ongoing processes in order to simultaneously and sustainably use opportunities to increase their current efficiency and ensure constant readiness to implement radical innovation (Helbin & Van Looy, 2019). For this to be possible, ongoing access to data on current processes, the context of their implementation, and the ability to use data on previously implemented processes are necessary. As a result, BPM in Industry 4.0 has in practice become strictly tied to the use of diverse information systems (IS) (Pena et al., 2022). The possibility of modeling and managing process flow is available not only in workflow software or business process management systems (BPMS), but also in enterprise resource planning (ERP) and customer relationship management (CRM) systems, whose logs contain key, up-to-date data on the operations of the organization (Szelągowski et al., 2022).
The ability to use a wide range of detailed data is increasingly important because of the use of intelligent IS incorporating elements of artificial intelligence (AI) (Zirar et al., 2023). The efficiency of using these systems, and thus the efficiency of the business processes they support, depends on the data on which they were trained and on which they work (Bartlett et al., 2023; Farrow, 2019; Thesmar et al., 2019). Therefore, methodologies and tools for obtaining reliable, diverse, detailed data are of great importance for the ongoing efficiency and improvement of operations management, including production processes. Unfortunately, the methodologies of work observation and measurement used so far, such as work sampling, snapshot analysis, or timing, as well as process mining techniques using data from IS event logs, have numerous limitations (IEEE Task Force on Process Mining, 2012). On the one hand, their labor intensity and costs, and on the other, the subjectivity and incompleteness of the collected data, make their use ineffective or even impossible in many areas of an organization's operation. The availability of accurate, current, high-quality data is of particular importance for Manufacturing Execution Systems (MES), which operate in real time. The effectiveness of AI solutions, including machine learning (ML), which are increasingly planned and implemented to improve the efficiency of business processes, is likewise burdened by a crucial limitation in the scope, detail, and quality of data on the implemented business processes. The lack of required data or the low quality of accessible input data significantly affects the value of the output generated by process mining techniques or AI algorithms. This issue can be perceived as the main barrier to the common usage of AI technologies in business process improvement initiatives.
While the literature contains numerous works discussing the above-mentioned limitations (van der Aalst, 2019a, 2019b), there is a lack of scholarship proposing both a methodology and practical solutions. In answer to this problem, the authors of this article have designed, created, and implemented in the conducted research a proprietary (copyrighted) method called the Workspace Performance Measurement (WPM). The WPM as a solution consists of a methodology of operation and an intelligent IS enabling the collection of digitalized data based on the visual observation of work performance. The analysis of the image of the work performed by the WPM enables the collection of digital data describing the work, its automatic initial analysis, and the entry of the processed information into various types of IS and AI. Due to the wide range of registered parameters, the WPM not only reflects the flow of activities in processes, but goes deeper into a cognitive mechanism focused on finding answers to a simple question: Why? Why does a given activity take longer than planned or expected? Why does the performance of a stakeholder involved in a process change dynamically and affect the process? This is why the WPM as a method fits well into all kinds of process improvement frameworks, solutions, and tools: from Lean Management and Process Mining to systems based on ML and AI algorithms. At the current stage of development, the WPM is dedicated in a special way to the continuous monitoring of operations management, but the method of its work allows for the data collection and analysis of both traditional and dynamically managed business processes.

The article has the following structure: The next section presents a historical overview of the hitherto used methods of work observation and measurement and their limitations. The requirements and opportunities created by observing work in Industry 4.0 are also presented. The second section presents previous attempts to digitalize work observations. The third section introduces the working methods and architecture of the WPM. This section also discusses how the WPM eliminates or significantly mitigates the limitations of existing methods of work observation and measurement. The fourth section presents a case study on the use of the WPM. Finally, the summary section outlines the key implications of using the WPM. Future directions of research and current limitations of the WPM are also discussed.

State of the art

Considered to be the forerunner of BPM, Scientific Management (Taylor, 1911) referred only to structured, repetitive mass production processes. Since then, in over 100 years of its development, BPM has covered all organizational processes, regardless of their nature or place in the value chain, and even crossed the boundaries of the organization (Mendling et al., 2020; Szelągowski, 2019). The development of BPM was accompanied by the development of methods for observing and measuring the implementation of business processes.

Historical outline of the methods of observing the implementation of business processes

The history of business process measurement methods begins during the industrial revolution. Those methods are characterized by the application of a leading way of information acquisition: observation. Observation as a method of building knowledge about a flow of activities in a production process has been used for centuries. One of the most important books, which played a crucial role in the stimulation of theoretical discussion about the division of labor, published by Adam Smith (1776), presents a list of activities "seen in a small manufactory".

Observation methods performed before and during the eighteenth century were immature, spontaneous, and missing a quantitative perspective. Starting from the beginning of the twentieth century, the observation method focused on discovering the specific rules which make the production workspace more effective. As such, the method became more structured and scientific. In 1911, Taylor published his work The Principles of Scientific Management, which began a new era of production management. He added an important factor to the analysis of work: time. The method of observation shifted from examining only how production is organized into sets of different activities to also measuring how long each activity takes. He also noted that the measurement of work and the effort of collecting and analyzing data on process flows must be minimal, and access to information for controllers must be in real or near-real time.

Further development of the work observation method in management science came when a young construction worker, Frank Gilbreth, was observing the bricklaying process. Thanks to the results of this observation, he modified the existing process and more than tripled its effectiveness. That experience became a trigger to design and develop a new method of workspace observation called the "motion study." Gilbreth went much deeper in his analysis than Taylor (Gilbreth & Kent, 1911). At the center of motion study analysis is not an action or a step taken in a particular process, but a move that the observed workers perform. He classified all moves into what he called 'therbligs' and created a list of 17 unique ones. In a more detailed version of this analysis, it is suggested to analyze each hand separately.

Indisputably, the motion study method described above is the most detailed way to measure activities conducted by workers in a workplace. However, the effort and resources required to execute the method regularly or on a large scale are so great that it will always be applied only to a limited degree. To break this barrier and make it possible for work measurement to be executed more frequently and on a wider scale, Tippett (1935) designed and announced a simpler version of the motion and time study analysis known as "work sampling." The main assumption behind the new method was that instead of measuring the duration of each activity that a particular production process consists of, the observer responsible for the research should only register information about its occurrence. That method significantly popularized the application of statistical techniques in scientific management as well as business process management. It made it possible to reduce labor intensity, but did not remove other limitations of work observation and measurement methods, such as:

  a. The impact of the very fact of observation on the implementation of processes,

  b. The subjectivity of interpretation by the observer in the case of non-numerical data,

  c. The need to allocate additional resources for observation,

  d. The need to train observers in the methodology of observation and in the scope of the observed processes in order to avoid misinterpretation of the activities carried out,

  e. The usually low resolution of recorded events, resulting from limiting the labor intensity and costs of observation,

  f. Not taking into account actions between individual samples,

  g. The fragmentation of observations, collecting data from one place of observation of a process that is usually carried out in many places and even different organizational units.
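The statistical core of work sampling can be illustrated with a short sketch (a minimal Python illustration, not part of any method cited above): the share of time spent on an activity is estimated from the proportion of random instantaneous observations in which it was seen, together with the usual normal-approximation confidence interval. The observation data and activity names below are invented for illustration.

```python
import math
from collections import Counter

def work_sampling_estimate(observations, activity, z=1.96):
    """Estimate the share of time spent on `activity` from a list of
    instantaneous observations, with a normal-approximation confidence
    interval (the standard work-sampling formula)."""
    n = len(observations)
    p = Counter(observations)[activity] / n
    margin = z * math.sqrt(p * (1 - p) / n)  # half-width of the 95% CI
    return p, margin

# 400 random snapshots of one workstation (illustrative data)
obs = ["machining"] * 240 + ["setup"] * 80 + ["idle"] * 80
p, m = work_sampling_estimate(obs, "machining")
print(f"machining share: {p:.2f} ± {m:.3f}")
```

The margin shrinks only with the square root of the number of snapshots, which is why narrowing the estimate quickly becomes labor-intensive in a manual setting.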

The vast majority of work efficiency measurement methods developed in the first half of the twentieth century are developments of or improvements on earlier ideas and methods. Method Time Measurement (MTM), published in 1948 by Maynard et al., was created as a development of the Gilbreths' method. Still, it shares the weaknesses mentioned above (e.g. lack of sequential order). Balaila and Gilad (2012) published the Process of Determining Manpower Standards (PDMS), a method that incorporates values from work measurement and metrics from other production outputs.

New technology: process mining

In Industry 4.0, ISs are used to analyze the course of business processes and to support their implementation and improvement. Using process mining techniques based on the data collected in their event logs, it is possible to identify the course of the processes being carried out and, on this basis, compare them with the applicable standard and specify proposals for improvements allowing for the preparation of an improved standard (Badakhshan et al., 2022; Choi et al., 2022; IEEE Task Force on Process Mining, 2012; Lamghari et al., 2021; Lorenz et al., 2021; Mamudu et al., 2023). This scope of identification and analysis fully covers that of earlier observation methods such as work sampling, snapshot observations, or work timing. However, process mining techniques, similarly to the previously used observation techniques, have significant limitations: they are restricted on the one hand to data foreseen by the creators of the system supporting business process execution, and on the other to data consciously input by the user. According to van der Aalst (2019a, 2019b), Vogelgesang et al. (2016), or Jans (2012), these are:

  1. Analysis limited to selected data sources

The process mining analysis covers only the data collected in the logs of selected ISs, e.g. ERP, MRPII, BPMS, or workflow. Data from other sources are completely omitted, e.g. data from e-mail, social media, SMS, or telephone calls, and often also service systems, "logistics applications," etc.

  2. Incomplete data

The analysis does not take into account activities carried out between successive entries in the IS event logs, for which no data has been saved, e.g. dynamic interpersonal interactions, downtime at work etc. Especially in the case of data entered by process contractors, the analysis includes only data entered at previously identified points in the process. Typically, these are data on starting or finishing tasks, known already in the design phase and included in the design of the business process and the IS that supports it. Even in the case of processes in which the project provides for many variants of execution, we can talk about tracking the implementation of the process rather than discovering the actual paths of the process implementation.

  3. Data unreliability

The input of data in systems supporting business process execution is decided upon in a subjective manner by the user executing the process and operating the system. Striving to limit the labor intensity of data input, the user will limit such data to the bare minimum, which further strengthens the limitation described in point 2, "Incomplete data." In most cases, the subjective selection of the input data, including the conscious or unconscious desire to present one's actions in a positive light, may lead to the unreliability of such data and disrupt the complete picture of the executed process, the reasons behind troubles in its execution, or the failure to achieve the set goals.

  4. Data discontinuity ("quantization"/"density" in time)

For process contractors, manually entering data into the IS is an additional activity, often perceived as redundant and distracting from the performance of work. Therefore, the data entered by process performers is usually related to completed tasks, and sometimes refers to when tasks start and end. Even if many activities are performed as part of the task (e.g. measurements of important process parameters), only result data or, at best, quantized data including values at defined points in time are entered (e.g. if patient blood pressure measurements are carried out every hour, then there is usually one entry per day in the documentation). Only in the case of data collected from automatic sensors is it possible to collect and analyze basically any "dense" time data on how the process is carried out.

  5. Unpredictability and uniqueness

Processes may require individual adaptation to a specific execution context, customer requirements, or quick changes in the way of execution (Löhr et al., 2022). This may make it difficult to analyze the collected data (e.g. a large and rapidly growing number of process implementation variants) or require significant work, time, and costs to collect sufficiently detailed data.

  6. Technological limitations

For obvious reasons, the results of process mining depend on the quality of the data, including its completeness, consistency, and reliability. The result is also significantly affected by the limitations of process mining techniques, such as:

  • Lack of flexibility in process analysis – process analysis methods that work for one process may not work for another,

  • Methodological problems with the analysis and modeling of complex processes or their fragments carried out concurrently,

  • Limited possibilities of identification and analysis of different process variants,

  • Focus of process mining analyses on discovering the course of processes, with much less diagnostic or predictive possibilities.

  7. Organizational restrictions

The implementation of business processes is often divided between various organizations, and within an organization between many organizational units, often with different ISs (e.g. production processes are usually dispersed across various departments, production centers, or teams; diagnostic and therapeutic processes between various healthcare units, and within individual units between different departments and support units such as laboratories, pathology departments, or genetics departments). This can lead to difficulties in identifying and analyzing the overall business process throughout the implementation period.

  8. Difficulties related to the protection of sensitive data

This applies, for example, to personal data or confidential information that needs to be identified and protected so that it does not fall into the wrong hands and is not used for unintended purposes.

Process mining, therefore, provides limited insight into process execution. The incompleteness and discontinuity of data in many cases do not allow for the reliable reconstruction of the actual process flow (Löhr et al., 2022). At the same time, it does not explain why certain events occurred, what caused them, or what happened between registered events. It is obvious that if the data is incomplete and of low quality, then the results of the analysis and the decisions made on its basis will also be of low quality and subject to the risk of error. Incomplete or discontinuous data will not allow us to answer the question: Why does a given action take longer than anticipated? And the subjective selection of data, in many cases bordering on being unreliable, will not allow for the reliable analysis of why the actions of the process executor deviated from the anticipated workflow.
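To make the event-log perspective concrete, the sketch below (illustrative Python, not any specific process mining tool or the authors' method) computes a directly-follows relation, the basic building block of most discovery algorithms, from a toy event log. The activity names are invented; the point is that anything not written to the log, such as breaks, rework, or informal hand-offs, simply does not exist in this view.

```python
from collections import defaultdict

def directly_follows(log):
    """Count how often activity a is directly followed by activity b
    across all traces. Anything not recorded in the log is invisible
    to this view, which is exactly the incompleteness problem
    discussed above."""
    dfg = defaultdict(int)
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)

# Toy event log: each trace is one recorded process execution
log = [
    ["register", "check", "approve", "ship"],
    ["register", "check", "reject"],
    ["register", "check", "approve", "ship"],
]
print(directly_follows(log))
```

Any downtime or ad hoc activity that occurred between "check" and "approve" leaves no trace in the resulting graph, so the discovered model is only as complete as the log itself.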

Industry 4.0 requirements and opportunities for business process observation

In the age of Industry 4.0 (Schwab, 2016; Hermann et al., 2015), organizations are forced to constantly raise their operational efficiency and flexibility on the basis of technologies such as cyber-physical systems, cloud computing, process mining, robotics, and artificial intelligence (AI). This paradigm shift has led to an unprecedented degree of automation, connectivity, and exchange of data in industrial environments, providing support to what is often called the concept of the "smart factory" (Lasi et al., 2014; Lu, 2017; Oztemel & Gursev, 2020; Zonnenshain & Kenett, 2020). Going beyond Industry 3.0, whose goal was to automate work, Industry 4.0 sets its goal as autonomous operation, in which decisions and actions are undertaken autonomously in real time on the basis of data available directly on the production floor itself, thanks to various Internet of Things (IoT) or Industrial Internet of Things (IIoT) sensors, without the need for human input. This vision is possible largely thanks to artificial intelligence, whose predictive algorithms allow not only for the optimization of business processes, but also for avoiding potential defects and problems prior to their appearance (Hervas-Oliver et al., 2019).

BPM in Industry 4.0 is in practice closely related to the use of various ISs (Bazan & Estevez, 2022; Beerepoot et al., 2023; Mahendrawathi et al., 2017). They accelerate the implementation, reduce the labor intensity and the risk of error in the implementation of business processes, and provide data for their analysis and learning by the organization during and after the implementation of business processes (van der Aalst et al., 2016). The effectiveness of IS support for the implementation and improvement of business processes depends on the scope, detail, and reliability of data available to them (Zirar et al., 2023). In particular, this applies to the increasingly used and important intelligent IS containing AI elements. The lack of data, their fragmentation, or inconsistency with the actual course of business processes immediately increases the risk of incorrect learning for the AI, and thus of errors in the support of managers by IS (Pery et al., 2022). The consequence is that managers make and implement wrong decisions that have negative economic consequences (Mariani et al., 2023). Therefore, a prerequisite for the effectiveness of IS support for BPM is the possession of reliable data on the course and context of the implementation of processes, such as the execution time of individual tasks, the time interval between the execution of individual tasks, the involvement of working and waiting resources, waiting times for resources to be made available etc. It should be emphasized that this condition has become particularly important with the start of the use of intelligent IS.

ISs supporting the implementation of business processes normally use data collected in event logs in the Implementation and Execution BPM Lifecycle phases, based on information entered by employees servicing them and information automatically provided by sensors, e.g. Internet of Things (IoT) / Industrial Internet of Things (IIoT) (Dumas et al., 2018; Ghosh et al., 2022; Sestino et al., 2020; Szelągowski, 2019). In the case of intelligent IS, the effectiveness of support for the implementation of business processes is additionally influenced by the availability and reliability of data contained in the BigData sets used, describing the current execution context, the course and results of previous executions of the process, or other data having at least an indirect impact on the implementation of the process. ISs are not able to create this data on their own, or even refine the data they already have (Zirar et al., 2023).

Earlier attempts to digitalize the observation of work

In past years, researchers and practitioners have taken several initiatives to digitalize work sampling analysis. Snapshot analysis is a method in which data samples are collected at fixed time intervals (snapshots). This technique has wide applications in many fields, such as finance, industrial production, healthcare, and sociology. In Industry 4.0, thanks to various sensors, including those of the Internet of Things (IoT) or the Industrial Internet of Things (IIoT), data may be collected automatically and autonomously in real time at virtually arbitrarily small time intervals. For example, data from sensors in CNC machining centers, or electroencephalographic (EEG) or electrocardiographic (ECG) signals, can be collected automatically (and autonomously) in real time, virtually without any noticeable delay.
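The snapshot idea can be sketched as follows (a minimal Python illustration with invented sensor readings, not a description of any system cited here): a continuous stream of timestamped readings is reduced to the latest value at each fixed-interval snapshot instant.

```python
def snapshot(readings, interval):
    """Keep, for each fixed-interval snapshot instant, the latest
    reading at or before that instant. `readings` is a list of
    (timestamp, value) pairs sorted by timestamp; timestamps and
    `interval` share one unit (e.g. seconds)."""
    if not readings:
        return []
    end = readings[-1][0]
    out, i = [], 0
    t = readings[0][0]
    while t <= end:
        # advance to the latest reading not after the snapshot instant
        while i + 1 < len(readings) and readings[i + 1][0] <= t:
            i += 1
        out.append((t, readings[i][1]))
        t += interval
    return out

# Hypothetical temperature stream sampled every 5 time units
stream = [(0, 20.1), (3, 20.4), (7, 21.0), (12, 21.3), (14, 21.2)]
print(snapshot(stream, 5))
```

Everything that happens between snapshot instants is discarded, which is the "quantization" limitation discussed earlier; automated sensors simply allow the interval to shrink toward zero.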

One of the earliest solutions applied in research focused on the digitalization of the work sampling method used a Personal Digital Assistant (PDA), a piece of equipment that allows one to input information in digital form. The PDA was adopted in work sampling to replace the paper form. That solution no doubt has two main benefits. It collects data in fully digital form and significantly reduces the time required to prepare the gathered information for the following phase, namely analysis. A second benefit is the possibility of automatically registering a timestamp for each entered activity. The case discussed by Robinson (2009) presents an implementation of a PDA in work sampling. The practical aspects of this implementation clearly showed some key weaknesses of this new method. The most important one, which could be the subject of a separate discussion, is the problem of who is responsible for entering information into the PDA form. If this role is assigned to a worker whose performance is subject to measurement, then two additional questions arise. Can those data be trusted when the observed worker also performs the observation? How can the steps in a given process be measured when the measurement method itself extends the list of activities conducted by the observed persons? A more general overview of the PDA work sampling method is presented by Groover (2014), where the division of roles in the observation method is more traditional: the PDA and the electronic form installed on it replace paper and still remain in the researcher's or analyst's hands. Another, separate type of solution was described by Knoch et al. (2018). The proposed approach to work sampling digitalization uses much more advanced technologies than those mentioned above in the case of the PDA. The new method uses hardware such as microsensors and micro cameras.
This approach can certainly deliver very detailed information about each and every activity conducted in a measured process; however, it has its own list of limitations. The sensors proposed in this method are designed as wearable equipment, which makes the measurement process fully automated, but also strongly interferes with a given workplace. Another important barrier pertains to the cost and effort involved in the implementation of this particular method.

Luo et al. (2018) proposed a video analysis solution to digitalize a work sampling analysis implemented in a skyscraper's construction project. The recommended solution is based on video registration hardware placed above the construction area. The captured video file is divided into separate frames, and then a predefined image analysis algorithm is executed to detect the location of workers taking part in a given process. The proposed method of digitalization of work sampling is designed not only to locate workers but also to assign them to the execution of particular activities. This part of the analysis is possible thanks to small changes in the workplace: employees are assigned to different roles and activities in a given construction process, and this assignment is communicated by a separate color. This small change, which interferes only minimally with the natural flow of the process, makes the method a very promising direction for further development.
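A drastically simplified sketch of the color-based idea is given below (illustrative Python on a tiny synthetic frame; the role names and marker colors are hypothetical, and real systems of the kind described above use trained image analysis algorithms rather than exact color matching):

```python
# Hypothetical mapping of roles to marker colors (e.g. vest colors)
ROLE_COLORS = {"bricklayer": (255, 120, 0), "supervisor": (0, 80, 255)}

def roles_in_frame(frame, min_pixels=4):
    """Detect which role colors appear in a frame, modeled here as a
    2-D grid of RGB tuples. Exact color matching on a tiny synthetic
    frame only illustrates the principle of assigning workers to
    activities by color."""
    counts = {role: 0 for role in ROLE_COLORS}
    for row in frame:
        for px in row:
            for role, color in ROLE_COLORS.items():
                if px == color:
                    counts[role] += 1
    # a role counts as present only above a small pixel threshold
    return [r for r, c in counts.items() if c >= min_pixels]

# Synthetic 2x8 frame: a patch of each marker color
frame = [[(255, 120, 0)] * 3 + [(0, 80, 255)] * 5] * 2
print(roles_in_frame(frame))
```

Applied frame by frame, such per-role presence data yields exactly the kind of occurrence record that classical work sampling collects by hand.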

In short, of the three methods of work sampling digitalization presented above, the last one interferes the least in an observed process and is the most efficient.

Workplace performance measurement

Workplace Performance Measurement (WPM) is a proprietary managerial tool, the purpose of which is to enable the collection of data on work performance and their automatic analysis without interfering with the work itself. It was developed in response to the challenges faced primarily by managers in production companies. The WPM consists of a methodology of operating in the tested environment and an intelligent IS offering the possibility of workplace performance analysis. This enables researchers and managers using the WPM to flexibly respond to emerging limitations and barriers. The WPM can be applied in organizations of all sizes, regardless of industry, to collect data which can help to solve problems appearing in "scheduling in discrete manufacturing systems" (Grabot & Letouzey, 2000). These systems require access to good-quality data defining standard times of activities in production processes, and the WPM can be implemented to provide that data.

The WPM method for improving business processes

The WPM allows you to collect and analyze digital traces of actually implemented processes. The precision and level of detail of the data it collects is much greater than those of data recorded in IS event logs, because the WPM allows for the collection of data and subsequent analysis of activities that are not usually recorded in IS event logs. These may be, by their nature, unpredictable activities related to the work being performed or activities not related to the ongoing process at all. By way of an example:

  • Conversations during ad hoc arranged informal breaks at work,

  • Smaller or larger breaks caused by failures or technical breaks at work,

  • Additional actions caused by defects or material shortages,

  • Additional ad hoc activities related to the configuration or calibration of tools,

  • Short breaks related to waiting for interactions with other employees or processes, e.g. waiting for semi-finished products delivered by other teams or processes.

The WPM is not based on predicting the course of processes; rather, based on the analysis of recorded images and data from IS event logs, it allows you to build a precise model of the implemented process. The results of the analysis of such a model are fully consistent with the actual situation and definitely more reliable than the results of analyses or predictions based on IS event logs alone.

The WPM method for improving business processes follows the sequence of tasks below (Fig. 1):

  1. Selection of a process to be analyzed,

  2. Setup of parameters which should be registered in a workspace and setup of cameras, sensors, and registrators,

  3. Video registration of activities conducted during the production process,

  4. Digitalization and analysis of acquired video content by the WPM tool and matching acquired data with process parameters,

  5. Discussion and assessment of proposed changes and development of a recommendation,

  6. Implementation of recommended changes into the analyzed process.

Fig. 1 The WPM method for improving business processes

The above-mentioned improvement process can be repeated many times for a selected process or for a group of processes.

The first element of the method results in the selection of a process and the definition of a list of business process parameters to be analyzed. In the second step, sensors are selected, such as cameras, thermometers, light intensity sensors, ECG, or eye trackers (Nesterak & Nesterak, 2021). The selection of proper sensors is determined by the scope of the analysis and the type of process. For manufacturing processes, the first-choice sensor is a camera. Other sensors can be added to the list when researchers are looking for more data to help answer questions related to potential factors or variables affecting some of the anomalies appearing in an analyzed process in a particular workspace.

The next element of the WPM method is the preparation of a definition of the process parameters in the WPM tool. Of most importance is a list of tasks or activities defining the process flow. Other parameters which should be taken into consideration are the types and categories of tasks or the conditions of the workspace, such as temperature and tools. The last element of the WPM method is a process conducted in the WPM software, where data collected via sensors and registration are merged with the definitions of process parameters. As shown in Fig. 2, sensor data (e.g., parameters of product components at the workplace) are dynamically detected and captured during the survey. The presented screenshot from the dashboard generated by the WPM tool shows a set of charts presenting changes over time in the parameter values recorded by the sensors selected for a specific study. The idea behind the WPM tool is to provide researchers with the flexibility to determine which of the sensors supported by the WPM are best suited to their planned research project. Therefore, some of the sensors not selected for the presented sample study do not collect data and do not display any values on the dashboard in Fig. 2.

Fig. 2 Dynamic detection and recording of product element parameters

As an outcome of this element, the WPM delivers analyses and reports presenting a detailed picture of workspace effectiveness. The novelty of this method is that it was applied to processes without a digital footprint to collect digital data that can be used to conduct multidimensional process mining analysis. Another equally important benefit is that the method makes it possible to extend the measurement period without increasing the costs of the project.
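To make the merging step concrete, the following minimal sketch shows how timestamped sensor readings could be joined with a process-parameter definition so that every reading carries its task context. All names and data (`SensorReading`, `task_for`, the sample tasks and readings) are illustrative assumptions, not the WPM tool's actual internals:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SensorReading:
    sensor: str          # e.g. "camera", "thermometer"
    timestamp: datetime
    value: float

# Process definition: ordered tasks with their observed time windows.
tasks = [
    {"task": "weld_frame", "start": datetime(2023, 5, 1, 8, 0),  "end": datetime(2023, 5, 1, 8, 12)},
    {"task": "transport",  "start": datetime(2023, 5, 1, 8, 12), "end": datetime(2023, 5, 1, 8, 20)},
]

def task_for(ts: datetime) -> str:
    """Return the task active at a given timestamp, or 'idle'."""
    for t in tasks:
        if t["start"] <= ts < t["end"]:
            return t["task"]
    return "idle"

readings = [
    SensorReading("thermometer", datetime(2023, 5, 1, 8, 5), 21.5),
    SensorReading("thermometer", datetime(2023, 5, 1, 8, 15), 23.0),
]

# The merge: each sensor reading is annotated with the task it falls into.
merged = [(r.sensor, r.timestamp.isoformat(), r.value, task_for(r.timestamp)) for r in readings]
```

With such a structure, every chart on the dashboard can be sliced by task as well as by sensor.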

The WPM method allows us to measure the usage time of detected resources, convert the acquired values into other metrics such as cost or energy efficiency, and add further measurements to the same timeline. The WPM is an open method that can use other sensors, such as ECG, temperature, or eye trackers, which provide more detailed information, allowing researchers and managers to assess more precisely the effectiveness of a process conducted in a particular workspace. The WPM method consists of procedures dedicated to:

  • Configuring video registration, particularly in a manufacturing workplace,

  • Configuring the work sampling analysis form,

  • Conducting the work sampling analysis,

  • Configuring the machine learning component.

Because each manufacturing workspace can differ from others, even within the same company, the procedure above is more a list of best practices than a precise manual containing a list of steps to follow.
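As an illustration, the four procedures above could be captured as a declarative observation setup that is validated before a study starts. The keys, values, and the `validate` helper are hypothetical and do not reflect the WPM tool's actual configuration schema:

```python
# Illustrative configuration mirroring the four WPM procedures:
# video registration, the work sampling form, analysis, and the ML component.
observation_config = {
    "video": {
        "workspace": "welding_station_1",
        "cameras": [{"id": "cam_front", "fps": 25}],
    },
    "work_sampling_form": {
        "activities": ["weld_frame", "grind_edges", "transport", "break"],
        "categories": {"weld_frame": "value_adding", "transport": "non_value_adding"},
    },
    "analysis": {"playback_speed": 2.0},            # analyst-controlled pace
    "ml": {"algorithm": "edge_count_delta", "train_hours": 1},
}

def validate(config: dict) -> list[str]:
    """Return a list of problems; an empty list means the setup is usable."""
    problems = []
    if not config.get("video", {}).get("cameras"):
        problems.append("at least one camera is required")
    acts = config.get("work_sampling_form", {}).get("activities", [])
    if not acts:
        problems.append("activity list must not be empty")
    for a in config.get("work_sampling_form", {}).get("categories", {}):
        if a not in acts:
            problems.append(f"category refers to unknown activity: {a}")
    return problems
```

Treating the setup as data rather than as a fixed manual fits the best-practices character of the procedure: each workspace gets its own configuration.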

Phases of the WPM implementation project

The primary purpose of using the WPM is to analyze the implementation and streamline the business process. The WPM method consists of two phases of process improvement initiatives: Phase I Analysis and Phase II Automatic controlling (Fig. 3).

Fig. 3 Schema of an implementation of the WPM method. Source: results of own research

Phase I Analysis, presented in Fig. 3, is focused on research, discovery, and analysis of the workspace activities of the analyzed process. The result of this phase is a report. This phase consists of three elements:

  1. Research setup and information collection – conducted mostly in the manufacturing workspace,

  2. Information processing and data extraction – processed by the WPM tool,

  3. Report with insights and recommendations delivered with the support of the Data Visualization tool.

The purpose of this phase is to analyze the current flow and performance of the as-is process to discover inefficiencies, better understand their causes, and finally create a list of recommended changes in the process and workspace organization, as well as in tool and resource configuration.

Phase II Automatic controlling of the WPM implementation is dedicated to supporting modern controlling mechanisms, which monitor in a fully automated manner whether process performance meets planned values or KPIs. Human engagement in this phase is reduced to the setup of the data registration tools (including video) and the configuration of the ML module. ML setup is the step in which the person responsible for implementing the controlling mechanism selects the algorithm that best detects value-adding activities. Figure 4 presents a visual example of one of these algorithms. Each video frame captured by the camera is automatically analyzed to detect specific parameters of the image. The idea behind this ML solution is to apply, for each workspace, the algorithm and image analysis option that best fit the particular process. The large number of green lines in the image is simply a graphical representation of the number of elements detected by the algorithm in an analyzed frame of the captured video file. The algorithm focuses on the dynamics of changing values: it calculates the number of occurrences of key parameters in each frame and compares the values between captured frames.

Fig. 4 ML detection of video parameters controlled by the selected algorithm
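The frame-comparison idea described above can be reduced to a simple sketch: each frame yields a count of detected key elements (the numbers below are simulated stand-ins for, e.g., edge or contour counts), and a sharp change in the count between consecutive frames marks an activity boundary. The threshold and the sample counts are illustrative assumptions, not the algorithm actually deployed:

```python
def detect_boundaries(counts: list[int], threshold: int = 10) -> list[int]:
    """Return frame indices where the detected-element count jumps sharply."""
    boundaries = []
    for i in range(1, len(counts)):
        if abs(counts[i] - counts[i - 1]) >= threshold:
            boundaries.append(i)
    return boundaries

# Simulated per-frame counts: stable, a jump when welding starts,
# stable again, then a drop when the finished part is removed.
frame_counts = [5, 6, 5, 40, 42, 41, 43, 8, 7]

print(detect_boundaries(frame_counts))  # frames 3 and 7 mark status changes
```

In a real deployment the counts would come from an image analysis step tuned per workspace; the comparison logic itself stays this simple.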

Functionality of the WPM tool

The functionality of the WPM tool is offered as a set of modules, each responsible for serving a different element of the WPM method (Fig. 5).

Fig. 5 WPM modules. Source: results of own research

The WPM tool consists of the following modules:

  • Module VIDEO allows us to define a new workspace and to upload registered video files. This module offers a mechanism for configuring the list of sensors and registrations with which the analysis should be equipped. It is a form with a list of hardware and software that can be added to an observation.

  • Module PROCESS is dedicated to the configuration of process parameters. It allows us to create a list of the activities of which the process consists. In the same module, it is possible to add a set of activity categories. The most important feature of this module is its flexibility: each initially created list of activities can be modified at any moment when the analyst finds, during an observation, that an additional activity has to be added to cover the real flow of the process.

  • Module ANALYSIS transforms work observations into digital data structured in event log form. Using a form containing the list of process steps, an analyst can assign a start and end timestamp to each repeated activity. The pace of observation can be controlled by the analyst by setting the video player speed parameter. The analytical module also allows us to merge data from WPM observations with data extracted from other systems.

  • Module AUTOMATION is a machine learning functionality designed to let the WPM solution detect the beginning and end of each repeated value-added activity in the process. Detected values are automatically stored in a database. The outcome of this module can be displayed either in the third module or in any tool dedicated to data processing.

  • Module REPORTING is a set of analytical mechanisms delivering datasets for reports enriched with valuable insights.

Data from all modules are saved in an SQL database and remain available for further analysis. The WPM limits the size of the required database by collecting only changes in sensor states. All analyses are designed and performed on views rather than on separate, dedicated analytical and reporting tables. These mechanisms let us explore the effectiveness of workspace resources by applying several metrics linked to the same unified timeline. Standard video registration can be combined with thermometers, ECG, or eye trackers to analyze in greater depth the impact of other factors on work efficiency.
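The state-change storage rule mentioned above can be sketched in a few lines: instead of persisting every sensor sample, only rows whose value differs from the previously stored one are kept, which bounds database growth. The function and the sample data are illustrative, not the WPM's actual persistence layer:

```python
def changes_only(samples: list[tuple[str, float]]) -> list[tuple[str, float]]:
    """Keep a (timestamp, value) sample only when the value changed."""
    stored = []
    last = None
    for ts, value in samples:
        if value != last:
            stored.append((ts, value))
            last = value
    return stored

# Five thermometer samples collapse to three stored state changes.
samples = [("08:00", 21.0), ("08:01", 21.0), ("08:02", 22.5), ("08:03", 22.5), ("08:04", 21.0)]
print(changes_only(samples))  # three rows instead of five
```

Queries on views can then reconstruct the full timeline from the stored change points when needed.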

One of the most important assumptions made by the team of researchers concerns the data analysis component: it was decided not to develop any new, sophisticated data analysis and visualization components. The data-related output of the module is a high-value dataset, which can be integrated with:

  • Most Business Intelligence (BI) tools currently available on the market,

  • Process Mining solutions,

  • Machine Learning solutions.

The dataset is equipped not only with acquired metrics describing the observed processes, but also with the full range of dimensions provided by the WPM in the configuration phase.

The digitalization of work monitoring by the WPM allows for the elimination of most of the limitations of the hitherto used methods of work monitoring and measurement. The observation of work is not carried out by dedicated observers, so it does not affect the way the work is carried out, and at the same time it does not require additional training of the individuals conducting the observation. Conducting the observation does not burden the people carrying out the work with additional activities, such as filling out paper questionnaires or entering data into the IS. Because the image is registered and then analyzed, the resolution of the analyzed events depends on the configuration of the analytical module. The method of analysis used in the WPM allows for the collection of data with a precision of up to 0.25 of a second, and raising the density of the collected data in no way hinders the work itself, as it does not require breaks in work caused by the need to enter data. Data on all elements of work performance observed by the WPM can be analyzed, including those that were not previously recorded in observation questionnaires or IS logs, e.g. dynamic interpersonal interactions, overcoming additional obstacles, performing activities not provided for in the standard work plan, work breaks, etc. Full data on process execution are registered; they do not undergo subjective selection by process executors or other stakeholders. The registered data are automatically provided with timestamps and workplace identifiers, which allows the analysis to automatically track mutual relations between processes carried out in different locations in the company or between the company and its contractors. Of course, the analysis of the data provided by the WPM may also include data from the event logs of selected ISs that affect the course of the processes being carried out.
The WPM provides data whose scope and detail allow for the elimination, or at least minimization, of most of the limitations of process mining. In this way, it allows for the establishment of the actual process workflow, including reliable analysis of why a given action lasts longer than planned or expected. Thanks to the possibility of analyzing data beyond the IS logs themselves, it also enables the analysis not only of traditional, predictable business processes, but also of unpredictable and often unique business processes that require dynamic management. By offering access to a broad spectrum of reliable data that is not the result of subjective selection, the WPM may be a source of data for any AI methodologies using ML to improve business processes (Pery et al., 2022).
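The event-log form that makes WPM observations consumable by standard process mining tools can be illustrated with a minimal sketch. The column names and sample rows below are assumptions following common process mining conventions (case identifier, activity, start/end timestamps), not the WPM's actual export schema:

```python
import csv
import io

# One row per activity instance, ready for import into a process mining tool.
events = [
    {"case_id": "frame_001", "activity": "weld_frame", "start": "08:00:00.00", "end": "08:12:30.25"},
    {"case_id": "frame_001", "activity": "transport",  "start": "08:12:30.25", "end": "08:20:00.00"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["case_id", "activity", "start", "end"])
writer.writeheader()
writer.writerows(events)
event_log_csv = buf.getvalue()
```

Because the timestamps come from continuous video analysis rather than manual entry, the log can reach the quarter-second precision mentioned above without burdening the workers.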

Workplace performance measurement implementation case study

The aim of the WPM implementation project presented below was to provide the managers of a production company with a tool to track and digitalize data on the implementation of production processes requiring manual work, as well as to develop and implement their improvements. The described research project was conducted in close cooperation with a private manufacturing company located in southern Poland, specializing in processing steel and delivering products to the construction industry. One of the key production processes requiring manual labor in this company is the welding of elements. Despite noticing its low efficiency, the management staff was unable to plan its improvement without reliable, accurate, and up-to-date data describing its actual course. The issue also had a significant negative impact on several other production and management processes, including ongoing production planning and price calculations. The lack of visibility of the actual course of activities in the process and of abnormal times did not allow for the implementation of effective planning activities and caused turbulence in the calculation of costs and margins.

The situation faced

The management of the company was definitely aware of unsatisfactory efficiency and fluctuations in execution time, and the resulting need to streamline and stabilize the process. To achieve this, it had to overcome problems such as:

  • Lack of data on business process flows that could be used to support improvement initiatives,

  • Lack of information continuously flowing to the controlling mechanisms that support the management of production planning processes,

  • Lack of access to online data presenting production process performance.

To provide a full picture of the challenge, it is necessary to describe in more detail the production workplaces that the company managed at the time the team of researchers conducted the project. The majority of tasks conducted in those workplaces required manual effort. Workers were equipped with specialized tools such as welders, but there was no option to install dedicated equipment to register timestamps for the beginning and end of relevant tasks.

In the mentioned production environments, some ways of controlling activities had been implemented and were occasionally executed. During interviews conducted with process owners, it was identified that the most common measurement method used to collect data from workplaces was work sampling. The organization was aware of the weaknesses and limitations of this method; the main weaknesses highlighted by managers were the quality of data, the form of data, and the cost and frequency of the method. The company clearly stated that it would not invest in any specialized, expensive, dedicated measurement solutions. The expected output of the project was defined as two objectives:

  • The design and implementation of a digital version of work sampling, which could eliminate the weaknesses of the standard method and deliver valuable insight for process improvement teams.

  • The design and implementation of a solution which could constantly and fully automatically deliver digital information to a controlling mechanism supporting production planning management.

Action taken

The implementation of the task described above involved the use of the Process Redesign method proposed by Davenport and Short (1990) (Dumas et al., 2018; Hanafizadeh et al., 2009). This method is dedicated to the ongoing improvement of business processes on the basis of experience derived from their execution (Fehrer et al., 2022; Wastell et al., 1994). It has been described as a set of steps conducted by companies that successfully implemented Business Process Redesign with the support of information technologies (Kubrak et al., 2023; Palma-Mendoza et al., 2014). The method consists of five steps:

  1. Develop Business Vision and Process Objectives,

  2. Identify Processes to be Redesigned,

  3. Understand and Measure Existing Processes,

  4. Identify IT Levers, and

  5. Design and Prototype Process.

The mentioned method was applied to solve both identified problems related to the lack of valuable, trustworthy information required by the controlling processes. Following the steps of the Business Process Redesign method, the team created a simple table containing the relevant answers (Table 1).

Table 1 Result of the conducted analysis based on the Business Process Redesign method for the project of design and implementation of the WPM tool

Following the steps presented above, the team decided to build a prototype of a digital version of a work sampling form and conduct research on video files registered in the regular manufacturing environment. The first set of materials analyzed with the use of the prototype confirmed the high value of the designed solution and delivered information that helped to create a list of valuable improvements.

Further analysis became a source of data that fully met the requirements defined by the management team of the manufacturing company. Standard time calculation and the detection of inefficiencies in process flows became the first two gains that the researched solution delivered to the company. The use of video files in the semi-automatic solution showed that the traditional work sampling measurement process, transformed with the support of information technology and applied to manufacturing processes without a digital footprint, can significantly improve controlling mechanisms.

The second challenge in the project required us to find a solution that could work in fully automatic mode. In order to design and implement an automated controlling solution using the WPM tool, the five-step Process Redesign method was used in the same way as before. It led the team to design and implement a solution that incorporates an effective ML algorithm to conduct image analysis. As a result of the conducted analyses, it was agreed that the constantly executed mechanism of acquiring data for controlling purposes should be focused on a limited number of process steps or phases or, from another perspective, on the statuses of a product flowing through a particular process. As a consequence of these assumptions, a prototype was developed and tested on video files containing over 20 h of recordings of manufacturing process execution. To validate and assess the accuracy of the applied algorithms, the team could reuse data from the previously conducted work sampling study. The purpose of this assessment was to verify whether all targeted statuses were detected by the ML algorithms. In general, one hour of video content, covering up to 20 repetitions of the observed process, was enough to train the system.
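The validation step described above can be sketched as a simple comparison: statuses detected by the ML component are matched against timestamps from the earlier work sampling study, and a detection counts as correct when it falls within a tolerance window. The 0.5-second tolerance and the sample timestamps are illustrative assumptions:

```python
def detection_recall(reference: list[float], detected: list[float], tol: float = 0.5) -> float:
    """Fraction of reference status changes matched by at least one detection."""
    matched = sum(1 for r in reference if any(abs(r - d) <= tol for d in detected))
    return matched / len(reference)

reference_times = [10.0, 55.0, 120.0, 180.0]   # seconds, from work sampling
detected_times = [10.25, 54.8, 121.5, 180.1]   # seconds, from ML detection

print(detection_recall(reference_times, detected_times))
```

A score below 1.0 points at statuses the algorithm missed, which is exactly the check the team needed before trusting the automated pipeline.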

In contrast to phase one of the project, where the WPM was used to measure and analyze all types of activities in the production process, phase two followed a different approach. To achieve effective results with a solution designed to support the controlling mechanism in the manufacturing process, the authors decided, in accordance with the LEAN methodology, to measure only those activities or steps classified as value-adding (i.e., operations related to the creation or logistics of the product). It was a conscious decision not to measure all other types of activities: the calculated distance between two consecutive value-adding steps can be treated as time dedicated to non-value-adding activities.
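The gap calculation just described is straightforward once the value-adding steps are measured: the time between the end of one step and the start of the next is treated as non-value-adding. The intervals below, in seconds, are illustrative:

```python
def non_value_adding_time(va_intervals: list[tuple[float, float]]) -> float:
    """Sum the gaps between consecutive value-adding (start, end) intervals."""
    ordered = sorted(va_intervals)
    return sum(max(0.0, ordered[i + 1][0] - ordered[i][1]) for i in range(len(ordered) - 1))

# Three value-adding steps separated by two idle gaps.
va = [(0.0, 120.0), (150.0, 300.0), (310.0, 400.0)]
print(non_value_adding_time(va))  # 30 s + 10 s of gaps
```

This is why measuring only the value-adding steps is sufficient: the complement falls out of the timeline for free.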

Results achieved

There are at least two perspectives that should be applied to summarize and assess the results of the completed project. Those perspectives are:

  1. The business value of the designed and implemented solution,

  2. The scientific value of the proposed solution.

Regarding the first area of assessment, the results achieved can be declared positive. The digital version of the work sampling measurement method has proven its business value by delivering the required information to support managerial processes in the manufacturing environment. The data collected from the observed workplace confirmed their significant value not only because of their high quality (resolution, limited number of errors) but also due to the structure and shape of the dataset. The reported results of the analysis helped to detect weaknesses in the process and define a set of new standard times for key activities. The short-term benefits of the first phase of WPM implementation focused on:

  • Finding inefficiencies in the process, mainly by detecting and measuring activities that do not provide added value,

  • Recommendations for changes in the course of the process, such as the division of tasks between skilled and unskilled employees,

  • Data collection to determine new standard times, which improved the quality of the planning process.

A measurable benefit of the first phase of the project is the significant improvement in KPI values. The Process Effectiveness KPI, calculated as the duration of time spent by welders on value-adding activities versus total contracted time, was selected as the most basic one. Another KPI was the Number of Frames Assembled per Hour. After implementing the recommended improvements, the organization achieved an increase in the Process Effectiveness KPI, which rose from 60% before the implementation of the WPM to over 90%.
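The Process Effectiveness KPI defined above is a simple ratio; the sketch below reproduces the reported magnitude of the improvement (from roughly 60% toward 90%) with illustrative numbers for one contracted shift:

```python
def process_effectiveness(value_adding_seconds: float, contracted_seconds: float) -> float:
    """Time spent on value-adding activities divided by total contracted time."""
    return value_adding_seconds / contracted_seconds

shift = 8 * 3600                      # one contracted 8-hour shift, in seconds
before = process_effectiveness(0.60 * shift, shift)
after = process_effectiveness(0.90 * shift, shift)
print(f"{before:.0%} -> {after:.0%}")
```

Because the WPM timeline already separates value-adding intervals, the numerator comes directly from the measured data rather than from estimates.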

Thanks to the WPM, it was possible to identify the reasons behind workers leaving the workspace. This was caused by the design of the process, in which welders were responsible for transporting finished products to the warehouse. Calculating the total duration of this type of activity highlighted how inefficient this design was: 30% of productive time was lost on non-value-adding activity.

The long-term benefit of the project is the implementation of a solution based on ML which continuously, in a fully automated way, provides data allowing for constant monitoring of key parameters of the production process without interfering with its course. The solution provides data to the controlling tool as well as to the production planning tool. The quality, accuracy, and structure of the acquired data mean that they are often used to support process improvement initiatives and to assess the effects of changes. Thanks to the designed and implemented algorithm, the accuracy of the automated detection of product states (begin, end) in the analyzed workspace reached 0.25 of a second.

Lessons learned

Considering the second perspective on assessing the results of the project, it has to be stated that the designed solution can be used in the future by other teams to conduct research in organizations characterized by a lack of a digital footprint in manufacturing processes. In comparison with the traditional method of work sampling analysis, its improved digital alternative will allow other researchers to benefit from the high quality of data that can be used in the inference process. Similarly, from a scientific perspective, the solution that automatically measures the duration of value-adding steps by analyzing video file content will bring an indisputable benefit for researchers. The flexibility of the proposed algorithms and the quality of data delivered from large samples can be treated as convincing arguments.

The first phase of the project showed how important close cooperation with the management team of a production site is for properly understanding the analyzed process. The process owner is the right source of knowledge for correctly identifying and interpreting the observed activities and steps. This was the most important support for the implementation of the machine learning solution in phase two.

The team learned that, to apply automatic detection of motions in phase one, the production workspace should be equipped with more cameras registering video from different perspectives.

Conclusions

The WPM of production processes, without interfering with the processes themselves and without increasing their labor consumption or exceeding the costs acceptable to users, enables:

  • The collection and analysis of data on the course of processes, to an extent that goes beyond the scope available so far (e.g. IS event log data).

  • The identification of errors and irregularities in the implementation of work based on the analysis of data from cameras and sensors or IS. The sub-second accuracy of the WPM allows for detailed monitoring of production processes to detect any anomalies, whether repetitive (e.g. incorrect placement of components) or unique (e.g. mistakes in assembly).

  • Improvement of efficiency by identifying areas where manufacturing processes can be optimized, which can lead to better resource utilization, reduced machine downtime, and improved overall production performance.

The WPM method is devoid of most of these limitations and provides a more holistic picture of the analyzed process carried out in the selected work area than the existing work measurement methods, such as work sampling, snapshot analysis, timing, and process mining. In the WPM, the entire context of the process is observed. The data, collected with sub-second accuracy and covering the activities of employees and the objects of the manufacturing process, also include data usually not available in observation questionnaires or IS event logs. After all, WPM cameras also record what is happening in the environment, as well as between individual tasks. The scope of the analyzed data may include not only the image from the surveillance cameras, but also data from various sensors or the IS supporting the work. Thanks to this, the results of the analysis conducted by the WPM include full data about the context and course of the process. This allows us not only to identify the activities performed, but also to identify their causes, the relationships between them, the involvement of individual participants, the impact of the availability or unavailability of resources, etc. The WPM enables a detailed analysis of the effectiveness of the activities carried out. It also provides the opportunity to optimize them by showing the observed anomalies, inefficiencies, or wasted time, and the reasons for their occurrence. Thanks to continuous observation and analysis, the WPM allows for a quick response to identified errors and anomalies, and thus reduces the costs associated with re-execution or repairs. The digital form of the data and analyses provided by the WPM enables their further use as the basis for AI-driven analysis and optimization of work, including in real time.

The WPM method reduces the gap identified in the available scholarship, particularly with respect to practical solutions enabling the collection of data on the actual flow of business processes without having to disrupt them or raise their labor-intensity. The breakthrough nature of the WPM lies in the fact that it eliminates the limitations of the existing work measurement methods resulting from their limited scope of data. What is more, while earlier methods focus on detecting processes and anomalies, the WPM simultaneously makes it possible to understand the factors and causes of their formation. These can be various factors that are not visible in traditional work sampling or IS event logs, and therefore are not visible in snapshot analysis, timing, or process mining reports. The quality and ergonomics of tools, environmental parameters, the condition of employees and their level of stress, seniority and skill level, the method of communication, or the rules for coordinating the tasks performed are factors that play an important role in the course of the process, and researchers should therefore take them into account. The key benefit of the WPM is the wide range of measured factors represented in the workplace and the ability to integrate them as part of a holistic view of job performance.

Further development of the WPM method will, on the one hand, be aimed at expanding the set of process efficiency factors, defining the method of their identification, and determining their potential impact on the efficiency of business processes. These are factors that have an indirect, often hidden, impact on the outcome of the work; they are usually omitted in analyses, although their influence does not have to be negligible. The WPM provides researchers with great opportunities to discover how various combinations of factors, including those usually beyond the scope of factor analysis, can determine the effectiveness of the process carried out in the workspace.

The second direction of future work will be to expand the use of AI. Work is planned to support the WPM's decision-making during work execution, the optimization and simulation of selected variants of process implementation based on the analysis of collected data, and predictive analyses enabling the prediction of future trends and proactive preparation for upcoming challenges. Thus, the WPM could also be the answer, or part of the answer, to new challenges posed by emerging requirements such as “an irregular demand using annualized hours.”

A limitation of the presented solution is the lack of verification on a larger number of various business processes. This limits the available set of standard WPM tool configurations, including the elements using AI. The article also does not address the ethical issues, which will need to be resolved, related to image capture and the continuous assessment of human performance.