1 Introduction

Industry 4.0, often referred to as the fourth industrial revolution, is a term used to describe the ongoing transformation of traditional industries through the integration of digital technologies and the Internet of Things (IoT) [1,2,3,4]. It emphasizes the extensive interconnectivity between machines and resources, automation, machine learning, and the utilization of real-time data [5,6,7,8]. In practice, the adoption of Industry 4.0 brings about organizational, technical, and human-related transformations across the various layers of industrial companies [9,10,11]. In this regard, human-machine interactions are evolving in response to new digital technologies, with a direct impact on operational-level workers and their work. As a result, physical work will increasingly be replaced by cognitive tasks in future production systems. These involve activities such as coordinating and managing materials and production resources, overseeing control and monitoring functions, and decision-making, particularly when dealing with production uncertainties [12, 13].

Operator 4.0 represents a skilled and highly adaptable worker with enhanced physical, sensory, and cognitive capabilities, who collaboratively interacts with new systems, machines, and advanced digital technologies. These interactions create working environments characterized not only by the integration of humans and machines but also by the convergence of the digital and physical worlds [14]. In this way, operators will collaborate with digitalized and automated production systems, utilizing and enhancing their creative, innovative, and improvisational skills to address unforeseen challenges [15]. However, to fully adopt the Operator 4.0 paradigm, operators must acquire new skills and knowledge to interact effectively with emerging digital technologies and machines, thus enhancing their efficiency [16]. Acquiring such competencies and knowledge is an imperative shift, since technological innovation is the driving force behind the transition to Industry 4.0.

Among the new digital solutions [17], Augmented Reality (AR) stands out as a pivotal technology that supports the development of the Operator 4.0 paradigm within the Industry 4.0 framework. It effectively aids factory workers and engineers by providing virtual computer-generated information, enhancing a smart operator’s perception of the factory environment. This is achieved using different visualization technologies, including projectors and wearable or handheld devices [18, 19]. This technology transforms operators into Augmented Operators [20], providing real-time support that minimizes human errors and reduces reliance on printed work instructions, computer screens, and operator memory [14, 21,22,23]. AR holds significant promise for modern workplaces, offering a wide array of potential applications across various facets of factory operations. Industrial workers view this transformative technology as an opportunity to enhance their performance, alleviate physical and cognitive burdens, and foster a safer and healthier work environment. Additionally, AR empowers workers to exercise greater autonomy in task management and decision-making, thereby enhancing productivity while preserving their pivotal role in the industrial landscape [24]. Yet, manufacturing enterprises must accompany this physical and mental transformation of operators as they are incorporated into the modern workforce [25]; otherwise, operators risk being poorly included in smart factories and may perceive more stress, frustration, and an increased cognitive load [15]. To tackle these challenges and facilitate the effective adoption of new enabling technologies, it is of crucial importance to embrace human-centric design and engineering methodologies. These approaches should encompass a comprehensive understanding of workers’ physical, cognitive, and sensory capabilities, alongside the Operator 4.0 concept. This holistic approach is essential for ensuring a seamless and sustainable transition while addressing the related socio-technical challenges [14, 26,27,28]. To achieve this, numerous industrial and manufacturing enterprises are progressively acknowledging the need to implement human-centric production systems, which view advanced digital and manufacturing technologies as pivotal elements for further enhancing human sensory and cognitive capabilities.

Based on these considerations, this paper introduces an innovative AR-based tool designed to assist smart operators in real-time inspection tasks within their work environments. Specifically, the AR tool has been developed using a user-centered design (UCD) approach to align with end-users’ needs and expectations, thereby achieving a high level of acceptability. A user study was carried out with industrial subjects, focusing on a real-world case study, to assess the usability and user acceptance of the proposed AR tool.

2 Related works

As mentioned in the previous section, traditional companies have been witnessing an industrial transformation over the past few years, driven by the development of new Industry 4.0-enabling technologies. In practice, however, because of increasingly complex products, short product development cycles, and fast changes in production processes, many of these new digital technologies are not sufficiently tested to be introduced and used adequately and efficiently in an industrial context [29,30,31]. These issues arise from the fact that priority is often given to implementation aspects, since the human operator, having the highest capacity to adapt, is considered the most flexible element in the production system [32]. A primary aspect is therefore often mistakenly set aside: the user is still the main driver of the digital transformation process, and modelling user behaviour is a must.

Therefore, it is crucial to consider the needs and expectations of workers to foster and attain a more seamless collaboration between individuals and machines, leading to a more advanced level of intelligent interactions with manufacturing systems. In this context, UCD approaches play a significant role in helping industrial enterprises seamlessly integrate the user perspective into the design process, ultimately yielding user-friendly systems [26, 27, 33, 34]. These approaches ensure that users’ needs and expectations are factored in, contributing to the development of the factories of the future, where a harmonious human-machine symbiosis is realized. In accordance with the ISO 9241-210 [35] standard, the UCD approach encompasses four fundamental activities that are iterated until user satisfaction is achieved. The design process commences with the identification of potential users and an understanding of their context of use. The second step involves the definition of users’ requirements. Subsequently, in the third step, design solutions are generated to fulfill these requirements. The final step involves evaluating the product against the requirements, assessing how effectively the design aligns with users’ feedback. All these steps involve the user perspective from the beginning of the design process.

In this context, numerous research studies have engaged end-users in both the development and testing phases of industrial AR-based systems across various application fields, such as assembly [36,37,38,39], maintenance [40,41,42,43], training [44,45,46,47,48], and inspection [49,50,51,52,53,54,55,56]. Likewise, this paper introduces a novel AR tool that has been designed and developed using a UCD approach, with the involvement of end-users throughout all stages of the development process. The primary aim is to assist operators at the workplace in real-time as they carry out inspection activities on assembled products. In recent years, different AR-based systems and tools have been proposed to assist industrial operators in inspection tasks [49,50,51,52,53,54,55,56]. However, these systems are limited by their need for specialized and costly hardware, complex calibration processes, confined operational environments, high user skill demands, and reliance on markerless approaches for registering virtual models in augmented scenarios. These limitations create difficulties in accurately identifying components that deviate from the intended design. Such challenges are circumvented by the proposed solution, which operates on off-the-shelf handheld devices, ensuring better compatibility with commonly used devices such as smartphones and tablets and providing users with increased freedom of movement. Importantly, the adoption of the proposed AR tool eliminates the necessity for external dedicated hardware components, such as depth cameras or optical tracking systems, as all computations are performed directly on the device. Moreover, a marker-based approach is employed for aligning virtual content with physical objects, without requiring complex calibration processes.

3 UCD approach

A UCD approach has been adopted to develop the proposed AR tool, involving users at every stage of the design process. Their active participation ensured that users played a pivotal role from the definition of the initial concept to prototype evaluation and testing. Figure 1 illustrates the adopted UCD approach, consisting of four iterative stages.

Fig. 1 UCD approach adopted for the development of the proposed AR tool

In the first step (1), users’ needs, comments, and suggestions were collected, and an analysis of the context and of the work was carried out to identify limitations and the typology of users. This information was then used to define user and functional requirements and to frame the to-be design (2). Based on these requirements, the third step involved the design and development of low-fidelity prototypes (i.e., paper sketches) (3), which were subsequently tested with end-users in the final stage of the process (4).

Specifically, the first stage (1) encompassed structured interviews conducted with manufacturing and production line managers, engineers, and line operators to collect user needs and preferences. Additionally, scheduled site visits played a crucial role in conducting contextual inquiries, allowing us to observe user activities, ask questions, and gain a deeper understanding of the tools and procedures utilized for inspection activities. Recording the entire usage context in a single round of interviews is not advisable: in the early stages of a project, questioning tends to be broad and exploratory, and as the project progresses and analysts become more familiar with the context, questions become more specific and tightly focused. This iterative approach allows for a deeper and more nuanced understanding of the context, which can lead to more effective design and decision-making. Additionally, combining observations and inquiries allows actual events to be recorded in detail, together with the reasons for and context of these events. This additional information is useful for determining the required information content of a system, identifying navigation patterns, and scoping system functionality [57]. In this case, two main limitations emerged from the analysis of the current inspection processes, in line with existing literature [18, 31, 53, 58,59,60,61]. First, inspection tasks are conducted by referring to technical documents such as paper-based drawings and printed manuals, as well as photographic materials. These resources may not always provide a comprehensive view and can potentially lead to misconceptions and inaccuracies in the inspection process. Second, there are no structured and reliable methods and tools to assist workers in documenting and formalizing information related to detected design changes and errors. Currently, during inspection activities, workers write down a few notes on paper or on technical drawings. This paper-based approach can result in at least three different issues. First, the results of the inspection depend on the operator’s capabilities, and thus only experts with experience in interpreting technical documents can perform this activity accurately. Second, the user’s attention is divided between design data (manuals, technical drawings) and the physical prototype, demanding high mental concentration from users. Third, paper notes about detected errors may be lost or may not be exhaustive enough for proper interpretation by the office in charge. Indeed, these annotations, sometimes accompanied by photographic material, are sent to the technical office, which has to infer from these 2D sources a precise determination of the 3D position of the detected design discrepancies. Therefore, there is a need to empower workers in their workplace to efficiently identify and record design variations. Additionally, there is a need to formalize communication among all the actors involved in the process, ensuring efficient exchanges while minimizing misunderstandings and information loss.

All data collected in the initial stage were then used to define both user and functional requirements (2). These requirements served as a blueprint for developing a tool dedicated to facilitating workplace inspection activities and streamlining the identification of design discrepancies. The decision to prioritize AR technologies over other solutions was driven by their tangible efficiency advantages in this specific context of use, surpassing traditional tools employed for detecting design discrepancies [62]. User requirements were specifically defined to enhance the overall user experience and ensure the seamless integration of the AR technology into the inspection process. Noteworthy aspects included the ability to detect assembly errors without relying on technical documents, enabling a direct on-site comparison of the design with the built product, adopting a portable tool that does not impede movement at the workplace, and facilitating the sharing of detected errors and associated data with the technical department for prompt intervention. Subsequently, these user requirements were translated into functional requirements pertaining to the features and capabilities offered by the AR tool. Throughout the development process, particular emphasis was placed on prioritizing features that facilitate intuitive navigation within the augmented environment. This encompassed the precise visualization of 3D design models superimposed in real-time onto physical objects, customization of the 3D models’ visual appearance, and efficient identification, annotation, and sharing of potential discrepancies. Simultaneously, the definition of the functional requirements was guided by adherence to industry standards, particularly in terms of privacy and data security. Finally, the main functionalities and user scenarios were defined to visualize the aspects of the proposed AR tool that users might appreciate most in their contexts of use, thereby orienting the next design stages.

During the subsequent prototyping stage (3), an analysis of hardware and software technologies was conducted to determine the most suitable technical solutions that satisfy the defined requirements. In terms of hardware, mobile devices were prioritized over projectors and static screens due to their inherent advantage of enabling users to move freely within the real environment without any limitations. Furthermore, mobile computing systems are becoming ubiquitous and pervasive in every sector, as they allow users anytime, anywhere access to information and computational resources. Finally, the decision to adopt a tablet was driven by a specific user requirement: users expressed a preference for displays larger than smartphone screens. Regarding software development, the AR tool was developed using Unity®, a renowned platform for AR development. Unity® offers a wide range of features and stands out for its hierarchical development environment, intuitive visual editing capabilities, and extensive external resources (a library of assets and plugins) that empower developers to design, prototype, and deploy AR applications efficiently [63]. Additionally, ARCore™ [64] was preferred over other alternatives for several reasons. ARCore™ is an open-source software development kit (SDK) that does not require specific and dedicated hardware components; it can run on any Android™ device with version 7.0 and above, and it is compatible with the Unity platform. This choice offered distinct advantages. Firstly, it ensured a very high level of compatibility with commonly used devices for which workers are already trained, making the transition to the new system smoother. Secondly, ARCore™ leverages hybrid-tracking techniques that combine both vision- and sensor-based methods. This approach enhances the reliability of the AR visualization, further supporting its suitability for our specific needs.
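To illustrate how marker-anchored augmentation of this kind is typically wired up in Unity, the following minimal C# sketch uses the AR Foundation package, a common way of accessing ARCore image tracking from Unity scripts (the authors’ exact integration may differ); all names in the sketch, such as MarkerModelAnchor and designModelPrefab, are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: anchors a 3D design model to a detected fiducial marker.
public class MarkerModelAnchor : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager; // assumed scene component
    [SerializeField] GameObject designModelPrefab;              // 3D model from the models' database

    void OnEnable()  { trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged; }
    void OnDisable() { trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged; }

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            // Parenting the model to the tracked image keeps it registered with the
            // marker while ARCore's hybrid (visual-inertial) tracking updates the pose.
            Instantiate(designModelPrefab, image.transform);
            Debug.Log($"Detected marker ID: {image.referenceImage.name}");
        }
    }
}
```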

Based on the selection of specific hardware and software solutions, several functionalities were implemented to enable the user to:

  • perform real-time augmentation of 3D models, aligning them with the physical assembled product as designed by the technical office;

  • simplify the detection of discrepancies by customizing the color and transparency settings of 3D models;

  • report detected discrepancies and errors directly onto virtual models by adding 3D virtual annotations;

  • share the collected data with the technical office.

Low-fidelity prototypes were implemented and tested by end-users (4), also adopting usability inspection methods such as the Cognitive Walkthrough [65,66,67], particularly during the initial and intermediate design stages. This served as an effective alternative to traditional usability testing, generating results quickly and at low cost. Moreover, this iterative approach allowed us to refine and adapt the design, meeting the requirements and aligning with users’ needs and expectations. Notably, the proposed tool underwent continuous adjustments and optimizations, incorporating new features influenced by user requests or suggestions that surfaced as users familiarized themselves with intermediate prototypes. Significant additions made during the development process included the capability to create diverse note types based on error categories and to customize the size of 3D notes in the augmented scenario. Furthermore, user feedback obtained through one-to-one interviews played a pivotal role in introducing additional enhancements, such as an indicator that displays the ID of the last recognized marker. Adjustments, including changes in button sizes and the adoption of a vertical layout, were implemented to enhance user accessibility during interactions with the user interface, particularly on large tablet screens.

4 AR inspection tool

Based on the iterative approach shown in Fig. 1, the design of the proposed AR tool was informed by a series of experiments and prototypes that were previously developed and tested. This allowed us to gain a deeper understanding of end-users’ needs, the specific conditions of use, and the functionalities that needed to be implemented in the tool. In particular, four main functionalities (Fig. 2) were implemented: real-time AR visualization, color customization, virtual annotation, and data sharing.

Fig. 2 AR tool’s main functionalities, data and users

As mentioned in the previous sections, these features were designed to support end-users, namely operators and technicians, within the workplace. They were implemented with the intention of aiding users in real-time, particularly while engaged in inspection activities on assembled products. Specifically, the AR visualization function consists of the augmentation of 3D models, which are first imported from the 3D models’ database and then integrated into the real environment. In this way, the user can observe the real scenario, enriched with contextualized virtual information, through the screen of the handheld device. This functionality was implemented by means of the ARCore SDK, which provides one of the most important AR capabilities, i.e., motion tracking for estimating the user’s pose in real-world space. The second main functionality, color customization, was implemented to support the user in detecting design discrepancies. In particular, this function allows the user to customize the visual style of the 3D models by acting on their material properties, including color and level of transparency. It operates on the mesh renderer component attached to each 3D model displayed on the screen. The third functionality, virtual annotation, is dedicated to the creation and editing of 3D annotations that contain a predefined set of properties related to the detected design discrepancy, including its spatial position, its typology, and the 3D model it refers to. These annotations are automatically saved and stored in the 3D notes database, which is accessible at any time during the AR visualization. This allows the user to accomplish the inspection tasks in different work sessions, and to load the 3D notes added by other operators in order to validate them or carry out additional inspections.
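As a concrete illustration of the color-customization function described above, the following sketch edits the material of each mesh renderer attached to a displayed model. It is a minimal example assuming a transparency-capable standard material, not the authors’ actual implementation.

```csharp
using UnityEngine;

// Illustrative sketch: adjusts color and transparency of an augmented 3D model
// via the MeshRenderer materials attached to it.
public static class ModelAppearance
{
    public static void SetColorAndTransparency(GameObject model, Color color, float alpha)
    {
        color.a = Mathf.Clamp01(alpha); // 0 = fully transparent, 1 = opaque
        foreach (MeshRenderer renderer in model.GetComponentsInChildren<MeshRenderer>())
        {
            // Assumes the material's shader exposes a standard color property with
            // an alpha channel and a transparent rendering mode.
            renderer.material.color = color;
        }
    }
}
```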

The data sharing feature allows users to send and share digital data with the technical office. The data consist mainly of 3D annotations but also include textual information and photographic material. The engineers and technicians belonging to the technical office are thus indirect users of the proposed AR tool: they benefit from accessing the annotations, based on which they can plan interventions to fix the detected errors and update or modify the 3D models, which in turn are used for the augmented visualization.
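A hedged sketch of what such a 3D-note record and its serialization might look like is given below; all field names are assumptions for illustration, not the authors’ schema, and the transport to the technical office (e.g., an HTTPS upload) is omitted.

```csharp
using System;
using UnityEngine;

// Hypothetical record of a 3D annotation attached to a detected discrepancy.
[Serializable]
public class InspectionNote
{
    public string modelId;        // 3D model the note refers to
    public Vector3 position;      // spatial position of the detected discrepancy
    public string category;       // error typology (e.g., wrong component, misplacement)
    public string text;           // free-text description entered by the operator
    public string screenshotFile; // optional photographic evidence
}

public static class NoteExport
{
    // Notes are serialized to JSON before being shared; the actual transport
    // would depend on the company's infrastructure.
    public static string ToJson(InspectionNote note) => JsonUtility.ToJson(note, prettyPrint: true);
}
```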

As for the user interface of the AR tool, depicted in Fig. 3a, most of the screen is allocated to displaying the live video stream for the AR visualization, and a small number of command buttons are positioned on both sides of the screen. On the right side, a command button enables the user to capture a screenshot of the current screen, and two indicators display the battery level and the ID of the most recently detected marker. On the left side, the menu button opens a list of the four main functionalities. Figure 3b depicts the pop-up menu that appears after the user selects an augmented 3D model on which to add a 3D annotation. Specifically, this menu allows the user to add textual information, categorize the note by type, and scale the virtual pin indicating the position of the note on the 3D model.

Fig. 3 The user interface of the AR tool (a), and the human operator editing a 3D annotation added in correspondence to a detected discrepancy (b) © 2023 Baker Hughes - All rights reserved

Regarding hardware compatibility, the AR application was specifically designed for installation and operation on Android smartphones and tablets to ensure widespread accessibility on commonly used consumer devices. This choice enhances mobility within the workspace and accommodates a variety of screen dimensions based on user preferences, thereby improving the usability of the AR tool. In this study, the AR application was deployed on a Samsung Galaxy Tab S8, a high-end device featuring the powerful Qualcomm Snapdragon 8 Gen 1 (SM8450) processor with 64-bit architecture. Additionally, the device is equipped with a range of sensors, including dual 12 MP and 6 MP cameras, an accelerometer, a compass, and a gyroscope. This setup demonstrated high tracking capabilities across various settings, as documented in [68], making it highly recommended for AR applications according to [19, 69]. The selection of this tablet is justified by the advantage it offers in allowing users to view models on a larger display, thereby improving the ease of understanding and performing assembly inspection tasks. Furthermore, since all computations are performed on the device itself, there is no need for external hardware components.

5 User study

A user study was conducted, involving representative users, in a real case study at the Baker Hughes Nuovo Pignone plant located in Vibo Valentia (Italy). Nuovo Pignone Srl (IET, GTE) serves as the global center of excellence for research, development, and manufacturing of gas turbines, compressors, pumps, valves, and associated services in Italy. The case study focused on a large auxiliary baseplate system, which serves as a support system for a gas turbine; its role involves managing the temperature and pressure of the lubrication oil. Due to the large size of the baseplate system, the test was conducted on one of its sides, as shown in Fig. 4. The resulting experimentation area covered approximately 6 m in length and 2 m in width. Five fiducial markers were placed on the physical components at intervals of about 1.5 m. This arrangement enabled users to move freely along the entire length of the case study while interacting with the AR system.

Fig. 4 Case study: assembly inspection of a baseplate system © 2023 Baker Hughes - All rights reserved

The experimental procedure, as illustrated in Fig. 5, involved a user study designed to assess usability and evaluate cognitive aspects related to the users’ acceptance of the proposed AR tool. The user study encompassed three stages and included the participation of two representative user groups, namely engineers and factory workers.

Fig. 5 Experimental procedure

During the initial stage, participants were introduced to the experiment and asked to complete a demographic information questionnaire. Following the completion of the demographic survey, each user received from a tutor a brief demonstration, lasting about 1 to 2 min, outlining the primary functionalities of the AR tool. In the second stage, the participants were tasked with conducting an assembly inspection of the case study by completing a total of six predefined tasks in the specified sequence. First, users framed a marker positioned on the actual prototype to initiate marker detection and the AR visualization of 3D models within the real environment, as illustrated in Fig. 6a. Subsequently, users started the inspection task, aiming to identify potential design discrepancies between the designed and assembled components. Once at least one difference was identified, the participants added a 3D annotation at the corresponding location, simultaneously capturing a picture of the augmented scenario. After completing the third task, users were asked to edit the 3D annotation in terms of content and size. Following that, participants customized the color and transparency levels of the augmented 3D model associated with the detected discrepancy, as shown in Fig. 6b. The final task involved sharing both the picture and the 3D annotation with the technical office. The tasks assigned to participants are summarized in Table 1.

Table 1 Assigned tasks for participants

After completing all tasks, participants were invited to take part in a one-to-one personal interview to collect their opinions, comments, and suggestions regarding the proposed AR tool. Standard metrics, i.e., the SEQ (Single Ease Question) and TAM (Technology Acceptance Model) questionnaires, were adopted as an alternative to the think-aloud protocol because of their capability to capture cognitive aspects related to usability and user acceptance, respectively. Specifically, the SEQ usability metric [70, 71] was utilized for measuring user satisfaction with task performance. This metric involves a standardized question: “Overall, how difficult or easy did you find this task?”. Participants were asked to rate the ease of use on a Likert scale ranging from 1 (very difficult) to 7 (very easy). The SEQ is considered a reliable, sensitive, and valid metric, making it the preferred choice for this preliminary study. It is user-friendly and requires minimal time, making it a practical option for gathering usability feedback [19]. The TAM is widely adopted in many research fields, where it has been demonstrated to be a valid and reliable metric for predicting users’ attitudes towards new technologies [72,73,74]. In particular, the TAM allows for evaluating perceived usefulness, a construct defined as the degree to which a user expects that using a specific application system will improve his/her performance [75]. Thus, a further questionnaire was administered to participants at the end of the inspection activity to assess the overall user experience in terms of technology acceptance, including 10 items scored on a 7-point Likert scale from 1 (strongly disagree) to 7 (strongly agree). Table 2 presents the items of the TAM questionnaire.

Table 2 TAM questionnaire
Fig. 6 The user visualizing augmented 3D models in the real scenario (a) and customizing their colors and transparency level (b) © 2023 Baker Hughes Company - All rights reserved

5.1 Participants

Sixteen participants were recruited among representative users and divided into two homogeneous groups. The first group, G1, consisted of 8 engineers (4 males and 4 females, 24–46 years old, mean = 30.5, SD = 6.6), whereas the second group, G2, involved 8 factory workers (all males, 23–53 years old, mean = 40.8, SD = 10.5). Including both user groups enabled us to conduct a comprehensive assessment of the usability and level of technology acceptance of the proposed AR tool. No rewards were offered to participants. Among the participants in the G2 group, only one individual did not report frequent usage of smartphones and tablets, with the rest indicating that they use these devices several times per day. As depicted in Fig. 7, 62.5% of factory workers and 25% of engineers had little or no prior experience with AR technologies. In contrast, 37.5% of factory workers and 75% of engineers had previously used AR applications, though all of them were still newcomers to the integration of such technologies into their work environments.

Fig. 7 Frequency of use of AR applications among groups

6 Results and discussions

Descriptive statistics and t-test analyses were employed to analyze the data collected in the user study. All data analyses were performed using the statistical software packages IBM® SPSS® and Microsoft® Excel®. The significance level for statistical tests was set at p < 0.05. The dataset meets the assumptions of normality and homoscedasticity, as confirmed by the Shapiro-Wilk and Levene tests, respectively. The results of the SEQ questionnaire are presented in Fig. 8, with error bars representing the 95% confidence interval.

Fig. 8 SEQ results for both groups

The findings indicate that, on average, engineers rated the tasks as slightly easier (M = 6.67, SD = 0.29) than factory workers did (M = 6.19, SD = 0.26), except for task 5, where both groups perceived similar usability levels. However, these differences between the two groups were not statistically significant, as confirmed by an independent t-test analysis, whose results are reported in Table 3.

Table 3 Results of the independent samples t-test analysis
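For reference, the t statistic reported in Table 3 follows the standard pooled-variance form for two independent samples (the equal-variance assumption being supported by the Levene test):

$$ t = \frac{\bar{x}_1 - \bar{x}_2}{s_p\sqrt{\frac{1}{n_1} + \frac{1}{n_2}}}, \qquad s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}} $$

where $\bar{x}_i$, $s_i^2$, and $n_i$ denote the mean, variance, and size of each group’s sample.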

Hence, even though all participants were newcomers to the use of Industry 4.0-enabled AR technologies, both groups demonstrated commendable performance and reported a notable level of usability. The results of the TAM questionnaire, assessing the perceived user acceptance of the proposed AR tool, are presented in Fig. 9, with error bars denoting a 95% confidence interval.

Fig. 9 TAM results by groups

Cronbach’s alpha was calculated to assess the reliability of the construct; an alpha of 0.88 was obtained, which is acceptable since it is greater than 0.7 [76]. The results show that both groups perceived high usefulness while using the proposed AR tool. In particular, engineers perceived the proposed tool as more useful than factory workers did, with mean values of M = 5.88 (SD = 1.17) and M = 5.73 (SD = 1.23), respectively. Nevertheless, there is no statistically significant difference between the two groups, as confirmed by an independent t-test analysis (t(158) = -0.789, p = 0.432). What emerges from these results is that the AR tool was clearly assessed as highly useful by all participants, which is also supported by the comments gathered at the end of the experiments, such as: “Very interesting tool for support our work” and “I would like to use the tool to detect discrepancies as soon as possible”.
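For completeness, the reliability coefficient reported above is the standard Cronbach’s alpha computed over the k = 10 TAM items:

$$ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_{Y_i}^{2}}{\sigma_{X}^{2}}\right) $$

where $\sigma_{Y_i}^{2}$ is the variance of the scores on item $i$ and $\sigma_{X}^{2}$ is the variance of the participants’ total scores.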

Additionally, the one-to-one personal interviews provided very positive feedback regarding the AR tool’s functionalities, ease of use, and intuitiveness. In particular, the users were impressed by the capability to detect design discrepancies and assembly errors using virtual 3D models superimposed on the built product according to the design defined by the technical office. Examples of such discrepancies can be observed in Figs. 3a and 6, which clearly depict errors involving different components as well as assembly errors related to the varying positioning and orientation of real components. Furthermore, the positive outcomes resulting from the implementation of the proposed AR tool were reinforced by additional comments obtained during the concluding interviews. These comments primarily addressed two key aspects: the potential adoption of the AR tool as a supporting instrument for inspection activities, and the level of users’ satisfaction with the screen size of the device. In both cases, very positive comments were obtained. In particular, both engineers and factory workers were enthusiastic about adopting the proposed AR tool to improve and simplify the detection of discrepancies in built products. Likewise, most of the participants were satisfied with the screen size, while a few of them appeared uncertain. In summary, all of the end-users involved in the study reported a high level of usability for the proposed tool, and they unequivocally expressed their satisfaction and eagerness to incorporate this tool into their work environments for inspection purposes. While our study provides valuable insights into the effectiveness of the proposed AR tool for supporting assembly inspection activities from a user-centered perspective, it is crucial to acknowledge the main limitations requiring attention for practical adoption within the company’s inspection procedures. First, utilizing a marker-based approach to register virtual models in the augmented scenario may present challenges, despite ensuring high tracking accuracy: the quantity of markers needed depends on the prototype dimensions, and the markers must be affixed in specific locations without disrupting the inspection process. Second, although users generally expressed positive feedback about handheld devices (HHDs), it may be necessary to explore the adoption of alternative visualization technologies to assess their effectiveness compared to HHDs. This exploration could significantly contribute to enhancing the tool’s adaptability and effectiveness across various inspection scenarios. Third, during experimentation, environmental light and surface reflections emerged as factors that can impair tracking accuracy, consequently affecting the AR tool’s performance. Addressing these issues will be crucial for ensuring the tool’s reliability in real-world inspection scenarios.

7 Conclusions

This paper introduces an AR tool designed to assist smart operators in real-time workplace inspection activities. The AR tool was developed according to a UCD approach, with the involvement of end-users at every stage of the design process. A preliminary user study was conducted within an industrial company undergoing a digital transformation. This study focused on a real case study involving representative users to assess their acceptance of the proposed AR tool. The results indicated a high level of usability and user acceptance, with all participants expressing positive opinions and a strong interest in the effective implementation of this AR tool in their daily work activities. In future work, the focus will be on conducting extensive field experiments with end-users to assess the usability and effectiveness of the proposed AR tool using both objective and subjective standard metrics. Furthermore, the AR tool will be integrated with the PLM system of the factory to easily access product data and share detected design discrepancies with the technical office.