Abstract
Manual repair tasks in the industry of maintenance, repair, and overhaul require experience and object-specific information. Today, many of these repair tasks are still performed and documented with inefficient paper documents. Cognitive assistance systems have the potential to reduce costs, errors, and mental workload by providing all required information digitally. In this case study, we present an assistance system for object-specific repair tasks for turbine blades. The assistance system provides digital work instructions and uses augmented reality to display spatial information. In a user study with ten experienced metalworkers performing a familiar repair task, we compare time to task completion, subjective workload, and system usability of the new assistance system to their established paper-based workflow. All participants stated that they preferred the assistance system over the paper documents. The results of the study show that the manual repair task can be completed 21% faster and with a 26% lower perceived workload using the assistance system.
1 Introduction
In contrast to the paradigm of workerless factories from the 1980s, humans remain the most flexible entities in the era of Industry 4.0 and Smart Manufacturing [1]. However, complexity is ever-increasing due to new manufacturing paradigms such as shorter product life cycles, increasing product variety, and mass customization [2, 3]. To overcome these challenges, industrial assistance systems can be utilized in order to enhance a person’s physical, sensorial or cognitive capabilities, leading to a so-called Operator 4.0 [4]. Cognitive assistance systems (CAS) enhance cognitive capabilities by providing digital work instructions, helping users make decisions, and assisting them in learning new tasks [5]. The digitization of paper-based documents can lead to significant increases in efficiency, as providing digital work instructions removes the need for printing, searching, filing, and retrieving paper-based documents [6].
With traditional paper-based work instructions, e.g., for assembly or maintenance tasks, the user must first search for the relevant documents. Then, they have to switch between the two-dimensional work instructions, which describe how to perform the task, and the execution of the task on the three-dimensional physical object [7]. According to Cognitive Load Theory, splitting the user’s attention between spatially or temporally separated information sources leads to additional cognitive load [8]. This so-called Split-Attention Effect can be reduced with Augmented Reality (AR) [9], which superimposes virtual objects onto the real world.
Research on industrial CAS with AR has seen a steady increase in the number of publications over the last decade and focuses primarily on manual assembly and maintenance tasks [10,11,12]. For those applications, digital work instructions can be semi-automatically generated, given that each task follows a standardized meta-model of a workflow [13,14,15].
Manual repair tasks in the industry of maintenance, repair, and overhaul (MRO) are less standardized and instead require intuition- and experience-based decision-making from experienced workers [16, 17], e.g., for the repair of turbine blades or fiber-reinforced composite structures in an aircraft. In our previous work [18], we described the system architecture of a CAS based on the digital twin for manual, object-specific repair tasks. In the present work, we describe the evaluation of our CAS in a case study with experienced workers on a familiar turbine blade repair task. The main contributions of this work are the following:
-
the presentation of a CAS for manual, object-specific repair tasks which are frequently performed in the MRO industry, and
-
experimental results from a user study with experienced shop floor metalworkers comparing the presented CAS to their current paper-based workflow.
2 Related works
Research on CAS with AR has been focusing primarily on manual assembly and maintenance tasks [10,11,12]. Multiple studies have shown the benefits of CAS with AR, such as reducing the time to complete manual tasks [19, 20], human errors [21, 22], mental workload [7], or improving the learning curve for new tasks [23, 24]. The three most commonly used metrics for evaluating industrial assistance systems are task completion time (TCT), the number of human errors in the process, and the subjectively perceived workload with the NASA-TLX survey [7, 11].
Funk et al. [25] developed an assistance system for manual assembly equipped with a Microsoft Kinect 3D camera and a projector for AR work instructions. Their assistance system recognizes the current assembly step by evaluating pick locations, the current assembly workpiece, and a predefined tool zone. In an 11-day user study at an assembly line of a car manufacturing company, they found that the in-situ projections were useful during the learning phase of untrained workers. However, for expert workers, the assistance system had a negative effect, i.e., perceived workload and TCT increased [26].
In [27], Lai et al. developed a CAS for manual assembly tasks. They equipped a workbench with two webcams and a display for AR. A neural network was trained on synthetic images to detect relevant tools with a webcam and highlight their position in digital work instructions. In a user study with 20 university students with no prior experience, participants were asked to carry out a spindle motor assembly task. Compared to paper-based documents, the CAS reduced the TCT by 33% and the number of errors during the task by 32%.
Uva et al. [28] developed a projective AR workbench for assembly and maintenance tasks. A projector was used to visualize text and 2D symbols on the workbench and on the maintenance object. The maintenance object was mounted on a movable tracking board with markers. A user study was conducted in [29] with 16 untrained engineering students to compare projective AR with paper-based documents. For a combination of assembly and disassembly tasks on a motorbike engine, they found a 20% reduction in TCT using the assistance system, as well as an improvement in error rate and subjectively better ease of use, satisfaction level, and intuitiveness. In conclusion, they point out that the assistance system is particularly beneficial for tasks with high complexity.
Kästner et al. [24] presented an assistance system with monitor-based AR for complex manual tasks. They used deep learning models for object detection and action recognition to automatically detect the current work step and display corresponding work instructions. In a user study with 30 inexperienced participants, they found that the group with AR assistance had a lower TCT than the group without AR assistance in the first 13 iterations. However, for more than 13 iterations, the group without AR assistance performed better than the AR group. Therefore, it was concluded that AR assistance is beneficial for untrained workers in the first iterations when learning a new task.
Havard et al. [30] conducted a user study with 20 engineering students to compare PDF maintenance instructions with AR maintenance instructions on a mobile tablet. The maintenance process consisted of 27 actions to replace two springs in a machine. Several markers were placed on the machine for spatial registration of the AR work instructions. In the evaluation, no statistically significant difference was found for either group in terms of TCT or mental workload using NASA-TLX. In summary, they recommend AR work instructions for training inexperienced workers in sufficiently complex tasks and tasks that are frequently performed.
In [31], AR work instructions using a projector were compared to paper and oral work instructions for three different assembly tasks. 44 participants, some of whom were cognitively or physically impaired, participated in the user study. The evaluation found that with AR work instructions, participants had a lower perceived task complexity and made fewer errors. However, there was no significant difference in TCT. Vanneste et al. point out that the advantages of AR diminish over time with repetition, therefore AR should be used for less experienced workers.
In summary, current research on CAS in manufacturing has focused primarily on manual assembly and maintenance. In the application area of maintenance, research on assistance systems has concentrated on standardized replacement repair according to the scheme of step-by-step disassembly, component replacement, and step-by-step assembly [29, 30, 32,33,34]. Evaluations of industrial assistance systems mostly rely on inexperienced participants, e.g., computer science students who are not familiar with the evaluated task [35]. Evaluation results often show that the positive effects of CAS usage diminish over time; thus, most industrial assistance systems are found to be useful for training new employees but not necessarily for daily use by experienced workers.
3 Research questions
In order to be used in daily productive operations by experienced workers, a new assistance system must provide a noticeable improvement over the established workflow. Based on the literature from Section 2, there is a gap in research regarding the evaluation of CAS with experienced workers on familiar real-world tasks.
In [18] we described the development of a CAS for the manual repair process of turbine blades in MRO. The system was developed with a human-centered design approach specifically for experienced workers. To evaluate the developed assistance system, we formulate the following two research questions (RQ) for this case study:
-
RQ 1: What effect does the developed CAS have on the TCT of experienced workers when performing familiar repair tasks on turbine blades?
-
RQ 2: What effect does the developed CAS have on the perceived workload of experienced workers when performing familiar repair tasks on turbine blades?
4 Materials and methods
The system architecture of the developed CAS has been described in our previous work in [18]. Thus, only a brief overview of the system and its assistance functions will be given in Section 4.1. Afterward, the conducted user study will be described in detail in Section 4.2.
4.1 The cognitive assistance system
As shown in Fig. 1, we equipped a manual workstation for grinding work on turbine blades with the following additional hardware:
-
two Microsoft Azure Kinect 3D cameras for context awareness,
-
a Cognex DataMan DM 8600 scanner to read 1D and 2D bar codes,
-
a 27-inch touchscreen monitor for user interaction and
-
a 43-inch monitor to show screen-based AR content.
We chose the Robot Operating System (ROS) as a framework for asynchronous many-to-many communication between our software modules through ROS topics. The system’s components, depicted in Fig. 2, can be grouped into ROS nodes, a dynamic web application as a human–machine interface, and a digital twin for object-specific data.
We use the Asset Administration Shell (AAS) [36] to represent digital industrial assets. The AAS metamodel is documented in [37] and the HTTP/REST API is documented in [38]. Digital twins according to the AAS specifications can be created using available open-source AASX tools from the Industrial Digital Twin Association, i.e., the AASX Package Explorer and the AASX Server.Footnote 1 Given an object’s serial number, object-specific information can be accessed via the AASX server’s REST API using the HTTP GET method and updated using the HTTP PUT method. Based on the data from the AASX Server, we implemented the following assistance functionalities for this case study:
4.1.1 Automatic part identification
Currently, the shop floor workers must manually read the serial number on each turbine blade and then look it up in a paper file, which is slow and error-prone. Therefore, we added a barcode scanner for automatic part identification. Specifically, we use 2D Data Matrix codes [39] to encode the serial number, which is applied to each turbine blade via direct part marking. Each time a Data Matrix code is scanned, its serial number is published to the assistance system on a ROS topic. The serial number is then used to access data from the AASX Server.
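The scan-to-lookup chain described above can be sketched as follows. The serial-number format, the endpoint path, and the payload shape are illustrative assumptions, not the deployed system's interfaces; in the real system, the serial is published on a ROS topic and the defect data is fetched via the AASX Server's REST API.

```python
import json
import re

# Hypothetical serial-number format and server address, for illustration only.
SERIAL_PATTERN = re.compile(r"^[A-Z0-9]{6,12}$")
AASX_SERVER = "http://aasx-server.local:5001"

def handle_scan(raw: str, publish) -> bool:
    """Validate a decoded Data Matrix payload and hand it to a publisher.

    In the deployed system, `publish` would be a ROS topic publisher."""
    serial = raw.strip()
    if SERIAL_PATTERN.fullmatch(serial):
        publish(serial)
        return True
    return False

def defect_url(serial: str) -> str:
    """Build a (hypothetical) URL for the blade's defect submodel."""
    return f"{AASX_SERVER}/aas/{serial}/submodels/defects"

def open_defects(payload: str) -> list:
    """Filter the defects marked 'open' from a JSON submodel payload."""
    return [d for d in json.loads(payload)["defects"] if d["status"] == "open"]
```

A scanner callback would chain these: validate the scanned serial, publish it, then request `defect_url(serial)` with HTTP GET and filter the response with `open_defects`.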
4.1.2 Digital work instructions
We developed a web application to display digital work instructions on the touchscreen. In general, the web application first provides an overview of the open tasks for the given serial number, then detailed information about the selected task, and the ability to provide digital documentation at the end of the task. The developed web application is depicted in Fig. 11 in the Appendix.
After scanning a serial number, defects for the current object are shown in a list. The task of each metalworker in this case study is to repair all defects classified as open. All instructions are linked to the serial number of the current object and are dynamically requested using the REST API of the AASX server. The digital work instructions include the following core features:
-
an interactive 3D model of the current object with spatial information about defects, zones, and residual wall thickness measurements,
-
detailed text information about the selected defect, such as its type, length, and additional comments,
-
opening PDF documents linked to the serial number,
-
a glossary for common abbreviations, and
-
digital documentation of the performed repair task.
4.1.3 Screen-based augmented reality
According to the pinhole camera model, the relationship between a 3D point \(\tilde{\textbf{M}} = [X, Y, Z, 1]^{T}\) and its 2D image projection \(\tilde{\textbf{m}} = [u, v, 1]^{T}\) is given by Eq. 1,

$$s\,\tilde{\textbf{m}} = \textbf{A} \left[\textbf{R}~~\textbf{t}\right] \tilde{\textbf{M}} \qquad \text{(1)}$$

where s is an arbitrary scale factor, \(\textbf{A}\) is the camera intrinsic matrix and \((\textbf{R},~\textbf{t})\) is the extrinsic rotation and translation between the world coordinate system and the camera coordinate system [40].
Based on Eq. 1, we use the top-mounted RGB camera for screen-based AR by projecting three-dimensional AR objects into the live camera image. In general, we use AR to help the worker locate spatial information about the object being repaired. For the manual repair of turbine blades, we superimpose defects, zones, and residual wall thickness measurements on the live image of the work area, see Fig. 3. The user can toggle which information is displayed.
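The projection of Eq. 1 can be sketched numerically as follows; the intrinsic and extrinsic values below are illustrative placeholders, not the calibration of the actual Azure Kinect cameras.

```python
import numpy as np

# Illustrative intrinsics (focal lengths and principal point in pixels).
A = np.array([[900.0,   0.0, 640.0],
              [  0.0, 900.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                        # extrinsic rotation (world -> camera)
t = np.array([[0.0], [0.0], [1.0]])  # extrinsic translation (camera 1 m away)

def project(M_world: np.ndarray) -> np.ndarray:
    """Pixel coordinates (u, v) of a 3D world point via s*m = A [R|t] M."""
    M_cam = R @ M_world.reshape(3, 1) + t   # world -> camera coordinates
    m = A @ M_cam                           # s * [u, v, 1]^T
    return m[:2, 0] / m[2, 0]               # divide by scale factor s
```

With these placeholder values, the world origin projects exactly onto the principal point (640, 360), which is a quick sanity check for any calibration.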
Each time the object being repaired is moved, we have to recompute the extrinsic parameters \((\textbf{R},~\textbf{t})\). This problem is known as object pose estimation. Using homogeneous transforms \(\textbf{T}\), we can rewrite Eq. 1 to Eq. 2,

$$s\,\tilde{\textbf{m}} = \textbf{A} \left[\textbf{I}~~\textbf{0}\right] \textbf{T}_{W}^{C_{1}}\, \textbf{T}_{O}^{W}\, \tilde{\textbf{M}} \qquad \text{(2)}$$

with a dynamic transformation \(\textbf{T}_{O}^{W}\) from object coordinates to world coordinates and a static transformation \(\textbf{T}_{W}^{C_{1}}\) from world coordinates to the top-mounted camera coordinates. The camera intrinsic matrix \(\textbf{A}\) and the transformation \(\textbf{T}_{W}^{C_{1}}\) can be obtained using standard camera calibration techniques [40].
To obtain the transformation \(\textbf{T}_{O}^{W}\), we use the markerless pose estimation process depicted in Fig. 4. A new pose estimation is triggered by the user’s hands leaving the work area. Then, the object is first localized using 2D object detection on RGB images [41]. The obtained bounding boxes are used to crop the point cloud data from both 3D cameras. Based on the cropped point cloud, the 6D object pose \(\textbf{T}_{O}^{W}\) is estimated using point pair feature matching [42] and then refined using the point-to-plane iterative closest point algorithm [43].
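The point-cloud cropping step of this pipeline can be sketched as follows: each 3D point from the depth camera is projected into the RGB image (Eq. 1) and kept only if it falls inside the detector's 2D bounding box. The intrinsics are illustrative, and the sketch assumes points already expressed in the camera frame; the subsequent point pair feature matching and ICP refinement are omitted.

```python
import numpy as np

# Illustrative camera intrinsics, not the actual Azure Kinect calibration.
A = np.array([[900.0,   0.0, 640.0],
              [  0.0, 900.0, 360.0],
              [  0.0,   0.0,   1.0]])

def crop_by_bbox(points_cam: np.ndarray, bbox: tuple) -> np.ndarray:
    """Keep 3D points (N x 3, camera frame) whose projection lies in bbox.

    bbox = (u_min, v_min, u_max, v_max) in pixel coordinates."""
    pts = np.asarray(points_cam, dtype=float)
    m = (A @ pts.T).T                 # each row: s * [u, v, 1]
    uv = m[:, :2] / m[:, 2:3]         # pixel coordinates of every point
    u_min, v_min, u_max, v_max = bbox
    mask = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
            (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max))
    return pts[mask]
```

The cropped cloud is then the input for the coarse 6D pose estimate, which restricts the expensive feature matching to the detected object region.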
4.2 User study
4.2.1 Participants
In order to perform the evaluated manual repair task, study participants had to be professionals with a certain level of work experience. Therefore, a total of ten male metalworkers with a median age of 46 years, ranging from 33 to 57 years, were recruited to participate in the user study. All participants were experienced in the grinding repair process of turbine blades, with a median work experience of 14 years, ranging from a minimum of 3 years to a maximum of 28 years.
4.2.2 Study design
In a between-subjects experimental study design, each participant is assigned to a single group, such as a control group or the CAS group. In a within-subjects study design, each participant is exposed to all conditions. Because within-subjects study designs usually have greater statistical power, they require fewer participants [44].
Because of the limited number of study participants, a within-subjects study design was used, i.e., each participant worked with both systems (paper file and assistance system). To reduce practice effects, the sequence of the two systems was counterbalanced: the first participant started with the CAS and then performed the same task with the paper file, the second participant started with the paper file and then used the CAS, and so on in alternation.
In the first phase, all participants watched a 4-min-long introduction video explaining all the relevant features of the CAS. Afterward, each participant had up to 10 min to get used to working with the assistance system. Next, all participants were given up to 10 min to check the paper-based work instructions contained in a file. Because the file was designed to replicate their current work instructions, most participants did not require much time. During this phase, the CAS and the paper-based file did not include the relevant defects of the evaluation.
In the second phase, each participant was asked to complete the given task according to Section 4.2.3 consecutively with both systems. Because the given repair task requires mostly cognitive reasoning, and it was not feasible to physically repair real defects, participants were asked to realistically indicate the physical grinding process with unplugged tools. During the task, we measured the TCT for each turbine blade as a dependent variable. This phase was recorded on video with sound for further evaluation.
In the final phase, each participant was asked to fill out standardized questionnaires for each system and provide feedback on the CAS. We used the NASA-TLX questionnaire [45] to measure cognitive load and the UMUX-LITE questionnaire [46] to measure system usability.
4.2.3 Task
The aim of the given task was to simulate a familiar real-world repair job on multiple turbine blades. To this end, we visited the shop floor and reviewed the metalworkers’ work practices and required documents. For this study, participants were given three different serial numbers with one defect each. The participants were asked to repair the assigned defects on the surface of the three corresponding turbine blades. The amount of material that has to be removed depends on the defect type, its assigned zone, and nearby residual wall thickness measurements. Additionally, participants were asked to document their work and the required amount of time. All required information was either contained in a paper file or provided digitally by the assistance system. The study setup for the paper file is depicted in Fig. 5a and the CAS is shown in Fig. 5b. For a more detailed workflow for both systems, see Figs. 10 and 11 in the Appendix.
To be consistent with the participants’ usual work routine, we did not impose step-by-step work instructions for the task. To complete the given task, all workers performed the actions depicted in Fig. 6 based on their work experience.
5 Results
5.1 Task completion time
Overall, 7 out of 10 participants completed the task faster with the CAS compared to the paper file. The time of the paper-based workflow was strongly dependent on whether action 3 and action 4 were carried out or skipped. Additionally, we observed that finding the defect information page in the paper file (action 2) often took a long time.
For the CAS, time was often lost due to the unfamiliar navigation of the digital work instructions. Furthermore, after moving the turbine blade, participants had to wait roughly one second each time for the AR pose estimation.
The mean results for TCT are shown in Fig. 7 with 95% confidence intervals (CI). With the CAS, participants were on average 21.2% faster than with the paper file.Footnote 2
We performed a paired t-test to compare the overall TCT for both systems (significance level \(\alpha = 0.05\)). The test’s normality assumption was checked with the Shapiro-Wilk test. Results of the two-tailed paired t-test indicate that there is a non-significant medium difference between the paper file (\(M = 466.1\), \(SD = 171.3\)), and the CAS (\(M = 367.3\), \(SD = 95.5\)), \(t(9) = 2.2\), \(p = 0.054\). According to Cohen’s \(d_{z}\), which measures the standardized mean difference [48], the effect size of the CAS on the TCT was medium with \(d_{z} = 0.7\).
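The paired analysis can be sketched as follows: the per-participant difference between the two conditions yields both the paired t statistic and Cohen's \(d_{z}\), which are linked by \(t = d_{z}\sqrt{n}\). The times below are illustrative placeholders, not the study's raw data.

```python
import numpy as np

def paired_stats(x_paper, x_cas):
    """Two-tailed paired t statistic and Cohen's d_z from paired samples."""
    d = np.asarray(x_paper, dtype=float) - np.asarray(x_cas, dtype=float)
    n = d.size
    mean_d = d.mean()
    sd_d = d.std(ddof=1)               # sample standard deviation of differences
    t = mean_d / (sd_d / np.sqrt(n))   # paired t statistic, df = n - 1
    dz = mean_d / sd_d                 # standardized mean difference
    return t, dz
```

The p-value then follows from the t distribution with \(n-1\) degrees of freedom (e.g., via `scipy.stats.t.sf`), after checking normality of the differences with the Shapiro-Wilk test.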
5.2 Perceived workload
The perceived workload was measured with the NASA Task Load Index questionnaire [45]. A meta-analysis of 556 studies [49] found that the average score for this questionnaire is at RTLX = 42. Figure 8 shows the RTLX values for both systems. Using the CAS, participants reported on average a 26.8% reduction in perceived workload compared to the paper-based workflow. We statistically compared RTLX values for both systems with a paired t-test (\(\alpha = 0.05\)). The test’s normality assumption was checked with the Shapiro-Wilk test. Results of the two-tailed paired t-test indicate that there is a significant large difference between the paper file (\(M = 44.4\), \(SD = 16.2\)) and the CAS (\(M = 32.5\), \(SD = 12.6\)), \(t(9) = 3.1\), \(p = 0.012\). The effect size of the CAS on the perceived workload was large with Cohen’s \(d_{z} = 0.9\).
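The RTLX score referenced above is the unweighted variant of the NASA-TLX: instead of the original pairwise-weighted combination, it simply averages the six subscale ratings. A minimal sketch with illustrative ratings:

```python
def rtlx(mental, physical, temporal, performance, effort, frustration):
    """Raw TLX: unweighted mean of the six NASA-TLX subscales (each 0-100)."""
    scales = (mental, physical, temporal, performance, effort, frustration)
    return sum(scales) / len(scales)
```

A participant rating the six subscales as 60, 20, 50, 30, 55, and 45 would thus score an RTLX of about 43.3, close to the cross-study average of 42 reported in [49].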
5.3 System usability
To rate the usability of both systems, we used the UMUX-LITE [46] questionnaire. UMUX-LITE is a short questionnaire that rates the perceived usefulness (PU) and the perceived ease of use (PEU) of a system on a seven-point Likert scale and then combines them to a score ranging from 0 to 100. Table 1 shows the UMUX-LITE scores and the corresponding school grades on the Sauro/Lewis curved grading scale (CGS) [50].
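The UMUX-LITE combination can be sketched as follows: the two seven-point items are shifted to a 0–6 range, summed, and rescaled to 0–100. The optional regression adjustment (0.65 · raw + 22.9, from the original UMUX-LITE publication) aligns scores with the SUS scale; whether the adjusted or raw form is used here is an assumption of this sketch.

```python
def umux_lite(pu: int, peu: int, sus_adjusted: bool = False) -> float:
    """UMUX-LITE score from perceived usefulness and ease-of-use items (1-7)."""
    raw = (pu - 1 + peu - 1) / 12 * 100   # rescale both items to 0-100
    return 0.65 * raw + 22.9 if sus_adjusted else raw
```

For example, a participant answering both items with 7 yields a raw score of 100, while answering both with 1 yields 0.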
The UMUX-LITE shows poor ratings for the paper-based workflow, both for PU and PEU. The CAS was perceived as both very useful and easy to use. During the feedback, all ten participants stated that they would prefer to use the CAS over the paper file in production.
5.4 Ranking of individual assistance functions
We asked all participants to rate each implemented assistance function according to how important it was to them. The average results from the five-point Likert scale are presented in Fig. 9. All assistance functions were rated as at least important on average. The most important function was seeing the defects on the touchscreen and on the AR display. Although no participant used the glossary during the task, they rated the feature as nice to have and important for new employees.
6 Discussion
The results from Section 5 showed better efficiency, perceived workload, and usability with the CAS, compared to the paper file. However, results from the user study are limited to experienced male metalworkers with a median age of 46. We did not test the CAS on workers with less than three years of work experience.
Regarding RQ 1, the TCT was reduced by an average of 21.2% for a familiar turbine blade repair task by experienced workers using the CAS. The paired t-test showed a non-significant medium effect for the time difference between the CAS and the paper file. We observed that some participants skipped action 3 and action 4 for the paper-based workflow, i.e., they did not check zoning information and residual wall thickness measurements in the paper file (see Fig. 10b, c, and d). Action 3 can be skipped with enough work experience, but neglecting the measurements from action 4 can potentially lead to errors. In comparison, the CAS always displayed all required information by default. Therefore, we presume that the CAS will reduce manual errors in the long term.
The group of participants had a median work experience of 14 years repairing turbine blades with the paper file, whereas the time to practice the usage of the CAS was limited to a maximum of 10 min during the user study. We observed that some participants lost time during the task due to the unfamiliar navigation of the digital work instructions. To this effect, multiple participants stated that if they had more time to get used to working with the assistance system, the CAS would be much faster than their current paper-based workflow.
Because all participants had to have a certain level of work experience in the evaluated manual repair task, we were unable to recruit more than ten study participants. The small sample size of ten resulted in a low statistical power of the conducted t-tests. Statistical power is the probability of rejecting the null hypothesis when it is false, i.e., the probability of detecting a true difference between the observed conditions [48]. Statistically significant results for the TCT might be found by repeating the study with a bigger sample size, or by giving participants more time to get used to working with the CAS.
As for RQ 2, according to NASA-TLX, the perceived workload was reduced by an average of 26.8% with the CAS. The paired t-test showed a significant large effect for the RTLX difference between the CAS and the paper-based workflow. Feedback from participants indicated that searching for documents in the paper file was a major problem that could be eliminated through the use of the CAS. Furthermore, participants stated that evaluating the wall thickness measurement table (Fig. 10c and d) is very challenging, whereas the CAS was easy to use in this regard. This might be the reason action 4 was often skipped in the paper-based workflow.
All three AR functions were rated by the participants on average between important and very important. However, we observed that some participants mostly ignored the screen-based AR during the task and focused on the touchscreen with the 3D model instead. During the feedback, they stated that the AR technology was very new to them. Looking at the turbine blade’s 3D model on the touchscreen monitor was more familiar to them. The AR assistance functions might be more useful during a real grinding process. In the present user study, participants could only indicate the physical repair process.
7 Conclusion
All ten participants preferred the CAS over their current paper-based workflow. The UMUX-LITE ratings showed better perceived usefulness and perceived ease of use for the CAS. Additionally, all assistance functions were rated on average between important and very important.
In conclusion, the user study showed that the developed CAS can reduce the time experienced metalworkers need for the costly manual repair of turbine blades. Additionally, the digital work instructions with AR reduced the perceived workload. Furthermore, we observed that the participants skipped inconvenient work steps with the paper file. With the assistance system, on the other hand, all the required information was conveniently displayed by default.
In future work, we would like to measure long-term effects by repeating the study after the workers had more time to get used to working with the assistance system. Additionally, we would like to test if the assistance system has a positive effect on the learning curve and the number of errors of new employees. We hypothesize that new employees who are not yet familiar with the paper file might learn the turbine blade repair process faster with the assistance system than without it. Finally, we would like to take a closer look at the advantages of the AR display compared to the digital work instructions shown on the touchscreen.
Notes
https://github.com/admin-shell-io [accessed: 15-03-2023]
Within-subject confidence intervals were calculated according to [47]
References
Gorecky D, Schmitt M, Loskyll M, Zuhlke D (2014) Human-machine-interaction in the industry 4.0 era. In: 2014 12th IEEE International Conference on Industrial Informatics (INDIN). IEEE pp 289–294
Brinzer B, Schneider K (2020) Complexity assessment in production: linking complexity drivers and effects. Procedia CIRP 93:694–699. https://doi.org/10.1016/j.procir.2020.04.014
Alkan B, Vera DA, Ahmad M, Ahmad B, Harrison R (2018) Complexity in manufacturing systems and its measures: a literature review. Eur J Ind Eng 12(1):116. https://doi.org/10.1504/ejie.2018.089883
Romero D, Bernus P, Noran O, Stahre J, Fast-Berglund Å (2016) The Operator 4.0: human cyber-physical systems & adaptive automation towards human-automation symbiosis work systems. In: Nääs I, Vendrametto O, Mendes Reis J, Gonçalves RF, Silva MT, von Cieminski G, et al., editors. Advances in production management systems. Initiatives for a Sustainable World. Cham: Springer International Publishing pp 677–686
Mark BG, Rauch E, Matt DT (2021) Industrial assistance systems to enhance human–machine interaction and operator’s capabilities in assembly. In: Implementing industry 4.0 in SMEs. Springer International Publishing pp 129–161
Heng BJM, Ng AK, Tay RKH (2019) Digitization of work instructions and checklists for improved data management and work productivity. In: 2019 4th international conference on intelligent transportation engineering (ICITE). IEEE pp 79–83
Jeffri NFS, Rambli DRA (2021) A review of augmented reality systems and their effects on mental workload and task performance. Heliyon 7(3). https://doi.org/10.1016/j.heliyon.2021.e06277
Sweller J, Ayres P, Kalyuga S (2011) The Split-Attention Effect. In: Cognitive load theory. Springer New York pp 11–128
Dixon D, Terton U, Greenaway R (2018) Reducing the Split-Attention Effect in Assembly based Instruction by Merging Physical Parts with Holograms in Mixed Reality. In: CSEDU (1) pp 235–244
Bottani E, Vignali G (2019) Augmented reality technology in the manufacturing industry: a review of the last decade. IISE Trans 51(3):284–310. https://doi.org/10.1080/24725854.2018.1493244
Egger J, Masood T (2020) Augmented reality in support of intelligent manufacturing – a systematic literature review. Comput Ind Eng 140:106195. https://doi.org/10.1016/j.cie.2019.106195
de Souza Cardoso LF, Mariano FCMQ, Zorzal ER (2020) A survey of industrial augmented reality. Comput Ind Eng 139:106159. https://doi.org/10.1016/j.cie.2019.106159
Lindorfer R, Froschauer R, Schwarz G (2018) ADAPT - A decision-model-based approach for modeling collaborative assembly and manufacturing tasks. In: 2018 IEEE 16th international conference on industrial informatics (INDIN). IEEE pp 559–564
Quint F, Loch F, Orfgen M, Zuehlke D (2016) A system architecture for assistance in manual tasks. Ambient Intell Smart Environ 21:43–52. https://doi.org/10.3233/978-1-61499-690-3-43
Geng J, Song X, Pan Y, Tang J, Liu Y, Zhao D et al (2020) A systematic design method of adaptive augmented reality work instruction for complex industrial operations. Comput Ind 119(103229). https://doi.org/10.1016/j.compind.2020.103229
Esposito M, Lazoi M, Margarito A, Quarta L (2019) Innovating the maintenance repair and overhaul phase through digitalization. Aerospace 6(5):53. https://doi.org/10.3390/aerospace6050053
Bertram P, Kränzler C, Rübel P, Ruskowski M (2020) Development of a context-aware assistive system for manual repair processes - a combination of probabilistic and deterministic approaches. Procedia Manufacturing 51:598–604. https://doi.org/10.1016/j.promfg.2020.10.084
Eversberg L, Ebrahimi P, Pape M, Lambrecht J (2022) A cognitive assistance system with augmented reality for manual repair tasks with high variability based on the digital twin. Manufacturing Letters 34:49–52. https://doi.org/10.1016/j.mfglet.2022.09.003
Illing J, Klinke P, Grünefeld U, Pfingsthorn M, Heuten W (2020) Time is money! evaluating augmented reality instructions for time-critical assembly tasks. In: 19th international conference on mobile and ubiquitous multimedia. ACM pp 277–287
Ababsa F (2020) Augmented reality application in manufacturing industry: maintenance and non-destructive testing (NDT) use cases. In: De Paolis LT, Bourdot P (eds) Augmented reality, virtual reality, and computer graphics. Springer International Publishing, Cham, pp 333–344
Funk M, Kosch T, Schmidt A (2016) Interactive worker assistance: comparing the effects of in-situ projection, head-mounted displays, tablet, and paper instructions. In: Proceedings of the 2016 ACM international joint conference on pervasive and ubiquitous computing. ACM pp 934–939
Chu CH, Liao CJ, Lin SC (2020) Comparing augmented reality-assisted assembly functions—a case study on dougong structure. Appl Sci 10(10):3383. https://doi.org/10.3390/app10103383
Hou L, Wang X, Bernold L, Love PED (2013) Using animated augmented reality to cognitively guide assembly. J Comput Civil Eng 27(5):439–451. https://doi.org/10.1061/(asce)cp.1943-5487.0000184
Kästner L, Eversberg L, Mursa M, Lambrecht J (2021) Integrative object and pose to task detection for an augmented-reality-based human assistance system using neural networks. In: 2020 IEEE eighth international conference on communications and electronics (ICCE) pp 332–337
Funk M, Lischke L, Mayer S, Shirazi AS, Schmidt A (2017) Teach me how! interactive assembly instructions using demonstration and in-situ projection. In: Assistive augmentation. Springer Singapore pp 49–73
Funk M, Bächler A, Bächler L, Kosch T, Heidenreich T, Schmidt A (2017) Working with augmented reality? In: Proceedings of the 10th international conference on pervasive technologies related to assistive environments. ACM pp 222–229
Lai ZH, Tao W, Leu MC, Yin Z (2020) Smart augmented reality instructional system for mechanical assembly towards worker-centered intelligent manufacturing. J Manuf Syst 55:69–81. https://doi.org/10.1016/j.jmsy.2020.02.010
Uva AE, Fiorentino M, Gattullo M, Colaprico M, de Ruvo MF, Marino F et al (2016) Design of a projective AR workbench for manual working stations. In: Lecture notes in computer science. Springer International Publishing pp 358–367
Uva AE, Gattullo M, Manghisi VM, Spagnulo D, Cascella GL, Fiorentino M (2017) Evaluating the effectiveness of spatial augmented reality in smart manufacturing: a solution for manual working stations. Int J Adv Manuf Technol 94(1–4):509–521. https://doi.org/10.1007/s00170-017-0846-4
Havard V, Baudry D, Jeanne B, Louis A, Savatier X (2021) A use case study comparing augmented reality (AR) and electronic document-based maintenance instructions considering tasks complexity and operator competency level. Virtual Reality. https://doi.org/10.1007/s10055-020-00493-z
Vanneste P, Huang Y, Park JY, Cornillie F, Decloedt B, den Noortgate WV (2020) Cognitive support for assembly operations by means of augmented reality: an exploratory study. Int J Hum Comput Stud 143:102480. https://doi.org/10.1016/j.ijhcs.2020.102480
Fiorentino M, Uva AE, Gattullo M, Debernardis S, Monno G (2014) Augmented reality on large screen for interactive maintenance instructions. Comput Ind 65(2):270–278. https://doi.org/10.1016/j.compind.2013.11.004
Henderson S, Feiner S (2011) Exploring the benefits of augmented reality documentation for maintenance and repair. IEEE Trans Vis Comput Graph 17(10):1355–1368. https://doi.org/10.1109/tvcg.2010.245
Obermair F, Althaler J, Seiler U, Zeilinger P, Lechner A, Pfaffeneder L et al (2020) Maintenance with augmented reality remote support in comparison to paper-based instructions: experiment and analysis. In: 2020 IEEE 7th international conference on industrial engineering and applications (ICIEA). IEEE pp 942–947
Merino L, Schwarzl M, Kraus M, Sedlmair M, Schmalstieg D, Weiskopf D (2020) Evaluating mixed and augmented reality: a systematic literature review (2009-2019). In: 2020 IEEE international symposium on mixed and augmented reality (ISMAR). IEEE pp 438–451
Neidig J, Orzelski A, Pollmeier S. Asset administration shell reading guide. Plattform Industrie 4.0. Available from: https://www.plattform-i40.de/IP/Redaktion/DE/Downloads/Publikation/AAS-ReadingGuide_202201.html
Plattform Industrie 4.0 (2022) Details of the Asset Administration Shell - Part 1: The exchange of information between partners in the value chain of Industrie 4.0 (Version 3.0RC02). Available from: https://industrialdigitaltwin.org/wp-content/uploads/2022/06/DetailsOfTheAssetAdministrationShell_Part1_V3.0RC02_Final1.pdf
Plattform Industrie 4.0 (2021) Details of the Asset Administration Shell - Part 2: Interoperability at Runtime – Exchanging Information via Application Programming Interfaces (Version 1.0RC02). Available from: https://industrialdigitaltwin.org/wp-content/uploads/2021/11/Details_of_the_Asset_Administration_Shell_Part_2_V1.pdf
ISO/IEC 16022:2006-09: Information technology - Automatic identification and data capture techniques - Data Matrix bar code symbology specification. International Organization for Standardization
Zhang Z (2000) A flexible new technique for camera calibration. IEEE Trans Pattern Anal Mach Intell 22(11):1330–1334. https://doi.org/10.1109/34.888718
Eversberg L, Lambrecht J (2021) Generating images with physics-based rendering for an industrial object detection task: realism versus domain randomization. Sensors 21(23). https://doi.org/10.3390/s21237901
Drost B, Ulrich M, Navab N, Ilic S (2010) Model globally, match locally: efficient and robust 3D object recognition. In: 2010 IEEE computer society conference on computer vision and pattern recognition. IEEE pp 998–1005
Chen Y, Medioni G (1992) Object modelling by registration of multiple range images. Image Vision Comput 10(3):145–155. https://doi.org/10.1016/0262-8856(92)90066-c
Keren G (1993) Between- or within-subjects design: a methodological dilemma. In: Keren G, Lewis C (eds) A handbook for data analysis in the behavioral sciences. Psychology Press pp 257–272
Hart SG, Staveland LE (1988) Development of NASA-TLX (Task Load Index): results of empirical and theoretical research. In: Advances in Psychology. Elsevier pp 139–183
Lewis JR, Utesch BS, Maher DE (2013) UMUX-LITE: when there's no time for the SUS. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM pp 2099–2102
Cousineau D, O’Brien F (2014) Error bars in within-subject designs: a comment on Baguley (2012). Behav Res Methods 46(4):1149–1151. https://doi.org/10.3758/s13428-013-0441-z
Cohen J (1988) Statistical power analysis for the behavioral sciences. 2nd ed. L. Erlbaum Associates
Hertzum M (2021) Reference values and subscale patterns for the task load index (TLX): a meta-analytic review. Ergonomics 64(7):869–878. https://doi.org/10.1080/00140139.2021.1876927
Lewis JR (2018) Measuring perceived usability: SUS, UMUX, and CSUQ ratings for four everyday products. Int J Human-Comput Interact 35(15):1404–1419. https://doi.org/10.1080/10447318.2018.1533152
Acknowledgements
We would like to thank our project partners from Siemens Energy, Gestalt Robotics, YOUSE, and the Fraunhofer Institute for Production Systems and Design Technology for their contributions to the development of the assistance system.
Funding
Open Access funding enabled and organized by Projekt DEAL. This work was part of the project MRO 2.0 - Maintenance, Repair, and Overhaul (ProFIT-10167454) and was supported in part by the European Regional Development Fund (ERDF).
Author information
Contributions
Leon Eversberg: methodology, software, formal analysis, investigation, writing — original draft, writing — review and editing, visualization. Jens Lambrecht: conceptualization, writing — review and editing, supervision, project administration, funding acquisition
Ethics declarations
Competing interests
The authors declare no competing interests.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Eversberg, L., Lambrecht, J. Evaluating digital work instructions with augmented reality versus paper-based documents for manual, object-specific repair tasks in a case study with experienced workers. Int J Adv Manuf Technol 127, 1859–1871 (2023). https://doi.org/10.1007/s00170-023-11313-4