1 Introduction

Smart manufacturing (SM) is a vanguard approach in which all assets are oriented towards generating added value [1]. The assets have to be digitized in order to be integrated into the production process, thus creating a collaborative environment. For this purpose, various technologies in the fields of industrial instrumentation (e.g., sensors), communication networks, and computing are employed in order to make optimized decisions [2].

According to this new perspective of industry based on digitization, connectivity, and intelligence, the concept of the cyber-physical system (CPS) is being developed [3]. CPSs are embodied in assets (e.g., autonomous mobile robots (AMRs), cobots, or wearables) [4] which, owing to their embedded intelligence and connectivity features, act as integrators of the entire value chain. Moreover, CPSs are equipped with sensors and actuators that intervene in the production process. Therefore, CPSs link the virtual, or digital, world with the physical, or real, world, thus bridging the gap between information technology (IT) and operational technology (OT) [5].

However, to develop a collaborative environment, besides CPSs, an industrial communication network is required, as well as specific computing equipment, in order to make decisions in an optimized and coordinated manner [6]. In this context, numerous industrial internet platforms (IIPs) have been developed in the literature [7,8,9,10], where topologies, technologies, and protocols have been analyzed to interconnect a large number of CPSs. These IIPs can incorporate algorithms based on artificial intelligence (AI) for intelligent decision-making [11,12,13]. This requires an Internet of Things (IoT) network that includes wireless communications for mobile or battery-powered devices in industrial environments [14].

One example of mobile CPSs is AMRs [15, 16], which perform intralogistics functions within production processes. AMRs are autonomous vehicles that navigate by perceiving the environment with advanced sensors (e.g., cameras, light detection and ranging (LiDAR), ultrasound) and by processing these data on board. Despite their numerous sensors and onboard intelligence, the positioning accuracy of AMRs is limited due to the variability of the environment.

Several studies [17, 18] in the literature address this positioning issue by using sensor networks external to the AMRs. Nevertheless, this methodology is not suited to high-precision applications, such as pick-and-place or assembly operations. In contrast, collaborative robots, known as cobots [19], are equipped with high-accuracy and high-precision sensors, which enable them to perform such operations.

Furthermore, unlike traditional robots, cobots do not require safety zones because they have intrinsic safety systems that detect possible collisions with objects and humans. In case of a collision, the cobot activates the emergency stop, preventing any movement by shutting off all motors. These robotic arms therefore have embedded intelligence and connectivity, and are considered CPSs characteristic of SM; in addition, the absence of safety zones reduces the space required for production processes [4].

According to the definition of CPSs [20], AMRs and cobots can communicate through the IIP and collaborate with each other to perform combined tasks. However, the heterogeneity of the physical media and protocols used in communications prevents direct connectivity, which constitutes a technological barrier to their implementation [21].

Another difficulty presented by the cooperation of these devices is the high relative error between the position of the AMR and the effector of the cobot, which prevents them from performing tasks automatically without corrections in the programming of the robotic arm [22]. This makes it difficult to perform precision industrial operations (e.g., assembly, pick-and-place, smart warehousing), mainly due to the uncertainty in the localization of AMRs in the industrial plant.

In this sense, numerous authors in the literature have addressed this problem by using various technologies (e.g., computer vision (CV) [23], radio frequency identification (RFID) [24], LiDAR [9]). However, these systems require external hardware and communication systems for their proper functioning, which may compromise their actual implementation. Furthermore, these systems depend on additional power supplies and are susceptible to environmental conditions.

For this reason, in this paper, we propose a novel contact system method (CSM) for determining the spatial pose deviation (i.e., location and orientation) of the AMR using only the collaborative robot, without the need for external systems. A contact-based methodology is employed to determine the pose of the AMR, assessing deviations in all degrees of freedom (DoF) of the autonomous vehicle; its accuracy has been analyzed in a real application case. The proposed methodology reduces costs as well as implementation time in production processes, since no external systems are required. This proposal is of special interest for Industry 5.0, which features dynamic layouts [25] where indoor positioning [26] and digitization involving humans [27] are key.

Furthermore, the authors have implemented an IIP in order to coordinate the actions of the AMR with those of the cobot. This IIP includes a gateway that enables the interoperability of the communication protocols, as well as the monitoring of the process from the Internet. Therefore, with this novel technological implementation, a collaborative environment is achieved, thus advancing the Smart Manufacturing paradigm.

Hence, the main contributions of this paper are listed hereafter:

  • The proposal of a contact-based method for attaining the precise integration between AMRs and cobots.

  • The actual implementation of the contact-based methodology through an industrial internet platform for achieving interoperability among systems.

  • The analysis of the uncertainty of a high-precision mechanical assembly procedure performed after the contact-based method, proving the feasibility of the methodology.

The remainder of the paper is organized as follows: Sect. 2 discusses the current literature on existing methods to address this problem. Section 3 defines the integration problem and characterises the pose uncertainty of the employed AMR. Section 4 details the geometric analysis that the Contact Integration Procedure undergoes to determine the spatial pose and its implementation in a real scenario. Section 5 discusses the achieved results in the implementation of the technique and Sect. 6 states the conclusions of this paper.

2 Related Works

Multiple authors [23, 28, 29] have proposed distinct methodologies to determine the spatial pose of autonomous mobile robots for their integration with cobots in the production processes. Given the diversity of methodologies, an exhaustive analysis has been carried out to review the state of the art, evaluate the available resolution methods, and compare them with the methodology proposed in this paper. In this context, Table 1 presents a comparative analysis of the distinctive features of different methodologies for the relative positioning of industrial robots.

Table 1 Comparison of methodologies for determining the relative location and orientation of industrial robots

Multiple authors have proposed the implementation of RFID systems for attaining the pose of an element in industrial applications. In this sense, in [24], a Monte Carlo analysis based on an RFID dual-antenna system is performed to position a mobile platform in indoor scenarios.

However, the accuracy and precision achieved through RFID localization systems may prove insufficient for certain applications, and these systems are also susceptible to interference and prone to erratic detections [31]. In this context, the actual distribution of the RFID antennas plays a significant role in the achieved accuracy, a problem that the authors of [30] address through an optimization algorithm.

On the other hand, LiDAR systems stand out for their accuracy and speed of data acquisition, as shown in [38]. These sensors measure the time-of-flight (ToF) of emitted light pulses and are frequently used in industrial applications such as AMR localization [39].

However, while authors such as [33] have proven the capability of LiDAR systems to operate in dark environments while maintaining a high level of accuracy, their performance may be compromised in adverse environments with the presence of particles or smoke [40]. Furthermore, the presence of obstacles and the opacity of the object to be located can interfere with the reliability of the attained measurements [17, 41, 42].

Another approach to the problem involves vision-based localization systems [34], whose suitability for collaborative robots has been demonstrated in the literature [23], even when mounted on AMRs. The implementation of vision-based systems has numerous and varied industrial applications [29, 35, 43]. This is due to both their speed of response and their accuracy in localization and positioning problems [37, 44].

However, the performance of CV is highly susceptible to different ambient and light conditions, while its scalability may be compromised by the required investment in hardware.

Nevertheless, several authors have proposed the combination of vision-based localization systems with tactile sensors. In [36], the authors devised a methodology that performs low-tolerance joining operations through a combination of visual localization and tactile sensor measurements. Moreover, in [37], the authors developed an accurate localization system based on the combination of these technologies, which was implemented on a UR5e cobot. While these systems may achieve higher accuracies through the inclusion of tactile sensors, the performance of the resulting localization system remains compromised by the CV counterpart.

As a consequence of these limitations, we propose in this paper a new methodology for achieving accurate and precise positioning in industrial robots through a CSM. This methodology can attain all six DoFs that characterise the 3D spatial pose of an object through a relative positioning system based purely on force sensors, which are already included in the cobot hardware.

Some of the main contributions of this method to the existing literature are:

  • No additional hardware (e.g., cameras, sensors) is required to determine positioning, reducing costs and implementation time.

  • Measurement results are not dependent on environmental conditions (e.g., illumination, dirt) and are more stable than those based on vision.

  • The information processing required to determine the position is lower than that of other technologies. In addition, the programming can be done on the cobot itself, without the need for external systems for information processing.

However, this method requires the definition and implementation of the geometric interaction between the moving cobot and a stationary element, from which the cobot takes contact measurements for determining its relative location.

Nevertheless, this analysis relies on the determination of the initial pose of both the effector and the object to locate, their navigation uncertainties, and their resulting integration, which we will address in the following section.

3 Integration Between the Cobot and the AMR

The integration between cobots and AMRs entails different problems that must be addressed in order to achieve a complete and efficient collaborative environment [45]. The difference in both accuracy and precision between these systems severely restricts their collaboration, as they can differ by up to three orders of magnitude in accuracy. This discrepancy takes a heavy toll when a stationary cobot performs pick-and-place, assembly, or other precise operations on objects placed on the AMR [46, 47].

In this context, available methods rely on additional sensors (e.g., cameras, LiDAR) to obtain a more accurate localization of moving objects external to the cobot. However, the inclusion of supplementary sensors further escalates the complexity of another already existing problem: communication among heterogeneous systems.

However, regardless of the localization system employed, a kinematic model of the behaviour of the cobot is required to navigate the effector of the arm to the located position, which is the foundation of our contact-based method.

Therefore, in this section, we provide an analysis of the kinematic models of both the cobot and the AMR, which are required to determine the characteristics of the geometric interaction between both devices. Furthermore, we perform an error characterisation of the AMR used in this paper, which will be used to evaluate the performance of the proposed CSM. In addition, a final subsection is included regarding the interoperability requirement between these systems.

3.1 Determination of the Relative Location of the Cobot Effector

The cobot used in this paper is the UR5 e-Series model. Similarly to other cobots, it is equipped with 6 motors that provide 6 DoF of movement [48]. Furthermore, the UR5e is also equipped with high-accuracy force sensors for safety purposes when operating near humans [49]. When a certain force is detected by these sensors, they command the cobot to stop any current movement [50], a system that our proposed CSM localization method takes advantage of.

The movement of the effector of the cobot can be described following the Denavit-Hartenberg (DH) method [51]. Through this methodology, it is possible to establish the pose of a secondary coordinate system i from a reference coordinate system i − 1 following a series of matrix transformations, depicted in Eq. 1 [52]:

$$\begin{aligned} \small {}^{i-1}T_i = R(z_{i-1}, \theta _i)\, P(z_{i-1}, d_i)\, P(x_i, a_i)\, R(x_i, \alpha _i) \end{aligned}$$
(1)

where \({}^{i-1}T_i\) is the homogeneous transformation from frame \(i-1\) to frame \(i\); \(R\) and \(P\) are rotation and translation matrices; \(x_i\) and \(z_{i-1}\) are the x and z axes of the respective frames; and \(\theta _i, d_i, a_i, \alpha _i\) are the DH parameters of frame \(i\) of the cobot.

For our UR5e model, a DH transformation is proposed for attaining the relative pose of the effector from its stationary base. Figure 1 depicts the performed transformation, and the actual DH parameters of the UR5e are detailed in Table 2. \(\{S_{i}\}\) refers to the coordinate system of link \(e_{i}\), where \(\{S_{0}\}\) is the base of the cobot, and \(q_{i}\) are its revolute joints. The direction axes \(x_{i}\), \(y_{i}\), \(z_{i}\) of each \(\{S_{i}\}\) are represented in green, red and blue respectively.

Fig. 1
figure 1

Representation of the DH parameters for the UR5e for attaining the relative pose of the effector from the cobot base

Table 2 UR5 e-Series DH parameters

The resulting transformation from the original frame to the effector pose can be derived from Eq. 1, resulting in Eq. 2:

$$\begin{aligned} {}^{1}T_6 = \prod _{i=2}^{6} {}^{i-1}T_i \end{aligned}$$
(2)

The motors and sensors of the UR5e achieve a target accuracy of 0.1 mm when moving the effector [53], which enables these systems to perform highly demanding applications and accurate measurements. In contrast, AMRs may present significantly higher pose uncertainties, which we particularise and evaluate in the following subsection.
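As a numerical illustration of Eqs. 1 and 2, the following sketch chains the DH transformations to obtain the effector pose from the base. The DH values listed are the nominal UR5e parameters published by the manufacturer and are included here only as an assumption for the example; Table 2 remains the reference for the parameters used in this work.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transformation {i-1}T_i of Eq. 1:
    R(z, theta) * P(z, d) * P(x, a) * R(x, alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Nominal UR5e DH parameters (assumed values; Table 2 is the reference).
# One row per joint: (d [m], a [m], alpha [rad]).
UR5E_DH = [
    (0.1625,  0.0,     np.pi / 2),
    (0.0,    -0.425,   0.0),
    (0.0,    -0.3922,  0.0),
    (0.1333,  0.0,     np.pi / 2),
    (0.0997,  0.0,    -np.pi / 2),
    (0.0996,  0.0,     0.0),
]

def effector_pose(joint_angles):
    """Chain the per-joint transformations (composition rule of Eq. 2) to obtain
    the pose of the effector with respect to the cobot base."""
    T = np.eye(4)
    for q, (d, a, alpha) in zip(joint_angles, UR5E_DH):
        T = T @ dh_transform(q, d, a, alpha)
    return T  # 4x4 homogeneous matrix: rotation block and translation column

if __name__ == "__main__":
    q = np.deg2rad([0.0, -90.0, 90.0, 0.0, 90.0, 0.0])  # example joint configuration
    print(np.round(effector_pose(q), 4))
```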

3.2 AMR Uncertainty Analysis

As for the AMR, we have used the MIR100 model for the proposed analysis. Similarly to other AMRs, the MIR100 is equipped with two differential drive wheels, which allow both translational and rotational movements [54], generating two translational DoF, contained in the displacement plane, and one rotational DoF, whose axis is perpendicular to the displacement plane [9, 55], resulting in a total of 3 DoF.

Nevertheless, while the AMR movement is contained within the floor plane, terrain irregularities or an uneven floor may cause additional rotations in the AMR pose that an external stationary observer (i.e., the cobot) perceives.

Therefore, despite being characterised by 3 DoF, the movement of the AMR may exhibit deviations in all 6 DoF. Consequently, the proposed CSM is discussed for the general geometric scenario, where all 6 DoF are mutable. The method is then particularised for a completely horizontal floor, where we have tested the achieved accuracy.

Furthermore, in order to measure the performance of the proposed methodology, the uncertainty of the AMR pose relative to the cobot must be quantified. An analysis has been performed for evaluating both the location and orientation uncertainty of the MIR100 when navigating to a given target point.

Following Euler’s Theorem [56], when the AMR is considered a rigid solid, any movement it performs can be described as the composition of a translation and a rotation. When navigating to a predefined position and orientation, uncertainties within the AMR localization or control system may result in deviations in each of these two components. Therefore, when analysing only the AMR navigation, we can differentiate two main pose errors: a translational error and a rotational error (i.e., the difference in rotation angle with respect to the desired orientation).

These two errors are independent of the point of application, as they describe the overall AMR pose error. However, AMR operations usually require interaction with different points distributed along the AMR working surface, where the actual placement error differs depending on the location of the point with respect to the AMR centre of rotation.

Therefore, besides the translational and rotational errors of the AMR, a third error is meaningful for a point of interest when evaluating the performance of the AMR in SM applications: the placement error. This error indicates the feasibility of performing a given task at the analysed location, an evaluation we perform later in the article.

For the proposed uncertainty analysis, we have evaluated these three errors for our equipment. For the placement error, we have considered the point of interest \(B_1\), which is situated near the AMR corners, where the rotational error is significant. In order to evaluate these three errors, we have devised a methodology that takes advantage of the high-accuracy cobot force sensors for measuring all uncertainties [57].

In the presented methodology, the AMR was commanded to navigate to a recorded set point close to the cobot workplace, where it should align itself with respect to the cobot station. Once the AMR had finished its navigation, the cobot was commanded to place the effector at a location fixed for all tests, \(B_1\). Once the effector was in place, the cobot performed a sequence of movements in the effector base \(\{S_6\}\) to determine the relative distances between \(B_1\) and the AMR frame.

The measurement procedure, represented in Fig. 2, commences at point \(B_1\), from which the effector moves in a given direction until it collides with the AMR frame at \(A_1\). The cobot then records the location and returns to point \(B_1\), repeating the same process in a perpendicular direction, thus reaching point \(A_2\). Once returned to point \(B_1\), the effector moves a known distance in the direction opposite to the first movement, reaching point \(B_2\), where the same procedure yields the last contact point \(A_3\).

Fig. 2
figure 2

Representation of the measurement procedure for quantifying the pose uncertainty
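For clarity, the following sketch outlines one possible reading of this measurement sequence. The cobot motion interface (`move_to`, `move_until_contact`) is hypothetical and merely abstracts the force-triggered stop described in Sect. 3.1; it is not the controller API actually used.

```python
import numpy as np

def measure_pose_distances(cobot, b1, offset, probe_dir, perp_dir):
    """Illustrative sketch of the Fig. 2 sequence with a hypothetical cobot
    interface: cobot.move_to(p) moves the effector tip to p, and
    cobot.move_until_contact(d) advances along direction d until the force
    sensors detect contact, returning the contact point."""
    cobot.move_to(b1)
    a1 = cobot.move_until_contact(probe_dir)   # first contact with the AMR frame

    cobot.move_to(b1)
    a2 = cobot.move_until_contact(perp_dir)    # contact in the perpendicular direction

    b2 = np.asarray(b1) + np.asarray(offset)   # known displacement opposite to probe_dir
    cobot.move_to(b2)
    a3 = cobot.move_until_contact(perp_dir)    # assumed to repeat the perpendicular
                                               # probe, consistent with Eq. 3

    # Distances d1..d4 feeding Eqs. 3 and 4
    d1 = np.linalg.norm(np.asarray(a1) - np.asarray(b1))
    d2 = np.linalg.norm(np.asarray(a2) - np.asarray(b1))
    d3 = np.linalg.norm(b2 - np.asarray(b1))
    d4 = np.linalg.norm(np.asarray(a3) - b2)
    return d1, d2, d3, d4
```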

From these coordinates, relative to the cobot coordinate system, distances \((d_1, \dots , d_4)\) are obtained, from which it is possible to deduce the AMR rotational error \(\alpha\):

$$\begin{aligned} \alpha = \tan ^{-1} \left( \frac{d_2 - d_4}{d_3} \right) \end{aligned}$$
(3)

Once the rotational angle has been determined, correcting these distances for the measured rotation yields the actual location of point \(B_1\) with respect to the AMR centre of rotation \(O_{R/ AMR }\):

$$\begin{aligned} \overrightarrow{R_{B_{1/ AMR }}} = \cos (\alpha ) \cdot (-d_1, d_2) - \overrightarrow{R_{O_{R/ AMR }}} \end{aligned}$$
(4)

Given that the translational error should be measured in an oriented coordinate system, these corrections are necessary to avoid taking distances in a non-oriented system (i.e., \(\{S_{6}\}\)). By applying the rotational correction, these distances follow the actual directions of the AMR axes instead of the \(\{S_{6}\}\) axes.

Knowing the coordinates of point \(B_1\) and the expected point of arrival of the effector, where no errors would be measured, we can determine both the translational error of the AMR and the placement error at point \(B_1\) (i.e., the combined effect of the translational and rotational errors).

Fig. 3
figure 3

Graph depicting the vector relations between: (1) the ideal coordinate system of the AMR, where no errors are perceived; (2) an AMR coordinate system rotated \(\alpha\) degrees with respect to system (1) but with no translation; (3) an AMR coordinate system both rotated \(\alpha\) degrees and translated a certain vector quantity from system (1)

Following the diagram depicted in Fig. 3, where coordinate system (3) is the practical case under consideration, we can define the placement error as the difference between the expected location of point \(B_1\) within the AMR coordinate system (i.e., \(B_{1 expt / 3}\)) and the actually measured location where the cobot arrived (i.e., \(B_{1 msrd / 3}\)):

$$\begin{aligned} \text { {Placement Error}}(B_1) = ||\overrightarrow{R_{B_{1 expt / 3}}} - \overrightarrow{R_{B_{1 msrd / 3}}}|| \end{aligned}$$
(5)

This magnitude contains both the translational and rotational errors applied at point \(B_{1}\) and thus reflects the actual uncertainty when interacting with that point on the AMR.

Furthermore, within the same analysis, we can attain the translational error as the difference between the expected location of point \(B_1\) in the AMR system and the measured location corrected to an unrotated system (i.e., \(B^*_{1 msrd / 3}\)):

$$\begin{aligned} \text { {Translational Error}} = ||\overrightarrow{R_{B_{1 expt / 3}}} - \overrightarrow{R^*_{B_{1 msrd / 3}}}|| \end{aligned}$$
(6)

where \(\overrightarrow{R^*_{B_{1 msrd / 3}}}\) can be obtained by rotating \(\overrightarrow{R_{B_{1 msrd / 3}}}\) by \(\alpha\) degrees about \(O_{R/ AMR }\).
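As a worked illustration of Eqs. 3, 5, and 6, the sketch below computes the rotational, placement, and translational errors from the measured distances and points. All numerical values in the example are made up, and the sign convention of the rotational correction is an assumption of the illustration.

```python
import numpy as np

def rotational_error(d2, d3, d4):
    """Eq. 3: rotation angle of the AMR deduced from the measured distances."""
    return np.arctan2(d2 - d4, d3)

def rotate_about(point, centre, angle):
    """Rotate a 2D point by `angle` radians about `centre` (in-plane rotation)."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    return centre + R @ (point - centre)

def placement_error(b1_expected, b1_measured):
    """Eq. 5: combined (placement) error at point B1."""
    return np.linalg.norm(b1_expected - b1_measured)

def translational_error(b1_expected, b1_measured, rot_centre, alpha):
    """Eq. 6: translational error, comparing the expected location with the
    measured one corrected for the rotation about the AMR centre O_R/AMR
    (sign of alpha assumed to follow the convention of Eq. 3)."""
    b1_corrected = rotate_about(b1_measured, rot_centre, alpha)
    return np.linalg.norm(b1_expected - b1_corrected)

if __name__ == "__main__":
    # Illustrative (made-up) values in millimetres / radians
    alpha = rotational_error(d2=151.2, d3=300.0, d4=148.1)
    b1_exp = np.array([500.0, 250.0])
    b1_msr = np.array([503.1, 247.6])
    o_amr = np.array([350.0, 0.0])
    print("alpha [deg]:", np.degrees(alpha))
    print("placement error [mm]:", placement_error(b1_exp, b1_msr))
    print("translational error [mm]:",
          translational_error(b1_exp, b1_msr, o_amr, alpha))
```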

From this methodology, we can attain the translational and rotational errors of the AMR as well as their combined effect on the point of interest \(B_1\). Repeating the described procedure over 40 evaluations allows us to quantify these uncertainties, represented in Fig. 4; their mean and standard deviation values are included in Table 3.

Fig. 4
figure 4

Histograms depicting the rotational and translational errors of the AMR pose and the placement error within point \(B_1\), measured through the devised analysis

Table 3 Representative values of the AMR pose uncertainty

The measurement procedure described requires integration and communication between the AMR and the cobot, which we have attained through an Industrial Internet Platform, presented in the following subsection.

3.3 Interoperability Among Systems

Communication between these pieces of industrial equipment is key to the coordination of operations in industry. In this case, the AMR MIR100 and the cobot UR5 e-Series are equipped with the connectivity and embedded intelligence characteristic of CPSs. Although both devices are based on transmission control protocol (TCP) over Ethernet, their application-layer communication protocols are different, preventing direct data exchange.

The integration of the CPSs is a requisite for enabling the automatic execution of the predefined operations in a coordinated and controlled manner. In addition, both sampling and subsequent operations are automated. For this purpose, an industrial computer is used as a gateway, which guarantees the interoperability of the communication protocols by establishing an IIP. Nevertheless, the AMR navigation actions and the control of the cobot’s position remain governed by the embedded intelligence of each device.

4 CSM Localisation

In this section we present our proposed cobot contact-based system for localising the AMR, detailing the geometric analysis behind the CSM methodology. Furthermore, a discussion of the proposed implementation for testing the achieved accuracy of the CSM is included, along with the devised IIP for interoperability among devices.

4.1 Geometric Mathematical Characterization

The CSM methodology relies on the definition of the characteristic geometric planes (CGP) that embody the AMR in order to determine its actual spatial pose.

The defined CGPs, depicted in Fig. 5, contain the tangible external faces of the AMR structure and form an orthonormal basis that defines the coordinate system of the AMR, whose origin is determined by the intersection of the three CGPs. Since this origin acts as the absolute reference of the object, any point of the AMR can be determined. The direction axes \(x_i\), \(y_i\), \(z_i\) of \(\{S_{AMR}\}\) are colored in green, red and blue respectively.

Fig. 5
figure 5

Representation of the CGPs of the AMR and the coordinate system originated from them

The definition of these CGPs depends on the relative orientation between the cobot and the AMR, with \(\pi _\textit{lateral}\) being the plane closest to the cobot, \(\pi _\textit{upper}\) the top plane of the structure, and \(\pi _\textit{frontal}\) the plane perpendicular to the other two that completes the positively oriented coordinate system of the AMR.

From this geometric representation, the CSM aims to attain the described coordinate system through the determination of the pose of each CGP. The orientation and location of each plane are attained by measuring the location of different points within each plane through a contact-measurement system established at the tip of the cobot effector. In this context, the determination of two different points within each plane defines a line that is contained within that CGP.

Since each plane is perpendicular to the other two, there is only one possible combination of planes that satisfies the perpendicularity constraint while containing each of the measured lines. Therefore, from the contact measurement of different points, it is possible to determine the pose of each CGP, and thus the pose of the coordinate system that they define, whose origin is located at the intersection of these three planes.

The location of each point is measured from the cobot station coordinate system, applying the homogeneous transformation described in Eq. 1 to refer the entire analysis to the stationary system.

Figure 6 illustrates an example of the contact measurement procedure for the general 6-DoF case, where six measurements (i.e., \(C_1, \dots , C_6\)) are required to determine the coordinate system pose. In this procedure, the effector tip moves from an initial position Start in a given direction of \(\{S_6\}\) until it collides with \(\pi _\textit{lateral}\) at point \(C_1\).

Fig. 6
figure 6

Representation of the geometric analysis performed under the CSM for 6 DoF

The effector then moves a known distance in another direction of the cobot base, thus reaching the auxiliary point \(P_1\), from which it moves in the initial direction of collision until point \(C_2\) is reached. The combination of the \(C_1\) and \(C_2\) location measurements defines a line that is contained within the \(\pi _\textit{lateral}\) plane.

This procedure is repeated for the other two CGPs, where the effector relocates to another known auxiliary point \(P_2\), which has been drawn within the AMR for ease of visualization. Repeating the process applied to the first plane, the two remaining lines for the sought analysis are determined, from which the relative orientation of the AMR coordinate system can be computed from the cobot stationary base.

Subsequently, \(\{S_{6}\}\) must be adapted to the calculated orientation so that the effector moves along the \(\{S_{AMR}\}\) direction axes. Finally, by colliding with one CGP and then with the two remaining CGPs, always moving along trajectories contained in the CGPs already contacted, the relative placement of the AMR coordinate system can be reached, successfully locating the AMR for future operations.
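To make the perpendicularity argument concrete, the following sketch estimates the orientation of \(\{S_{AMR}\}\) from the three measured contact lines by enforcing that each CGP normal is perpendicular to the line measured on that plane. The axis convention and the numerical solver are assumptions of this illustration, not the authors' implementation, and the approach is only meaningful for the small deviations of interest (solutions near the initial guess).

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def line_direction(c_a, c_b):
    """Unit direction of the line defined by two contact points on a CGP."""
    d = np.asarray(c_b, float) - np.asarray(c_a, float)
    return d / np.linalg.norm(d)

def amr_orientation(u_lateral, u_upper, u_frontal):
    """Find the rotation of {S_AMR} w.r.t. the cobot base such that each CGP
    normal (a column of the rotation matrix) is perpendicular to the line
    measured on that plane. Axis convention (an assumption of this sketch):
    x ~ normal of pi_frontal, y ~ normal of pi_lateral, z ~ normal of pi_upper."""
    def residuals(rotvec):
        R = Rotation.from_rotvec(rotvec).as_matrix()
        return [R[:, 1] @ u_lateral,   # lateral line lies in pi_lateral
                R[:, 2] @ u_upper,     # upper line lies in pi_upper
                R[:, 0] @ u_frontal]   # frontal line lies in pi_frontal
    sol = least_squares(residuals, x0=np.zeros(3))
    return Rotation.from_rotvec(sol.x).as_matrix()

if __name__ == "__main__":
    # Illustrative (made-up) contact points expressed in the cobot base frame [m]
    u_lat = line_direction([0.40, 0.10, 0.30], [0.60, 0.102, 0.30])
    u_up  = line_direction([0.42, 0.12, 0.45], [0.42, 0.35, 0.451])
    u_fr  = line_direction([0.60, 0.10, 0.30], [0.601, 0.10, 0.50])
    print(np.round(amr_orientation(u_lat, u_up, u_fr), 3))
```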

This analysis performs a 6-DoF determination; however, actual applications may present only 3 DoF, as the floor can be assumed to be perfectly horizontal and the height of the AMR with respect to the cobot station is known. In the next subsection, we discuss our proposed particularisation of this method for this case, and its implementation with the actual AMR and cobot for testing the accuracy of our proposed methodology.

4.2 Application Case

To demonstrate the geometrical-mathematical model, it has been implemented in a real application case in order to evaluate its performance in industrial operations with collaborative robots.

In this application case, only the three DoF derived from the AMR have been contemplated, as the ground is considered to be completely horizontal and thus parallel to \(\pi _{upper}\).

As a consequence, the general geometric analysis detailed in the previous subsection can be particularised for this case. In this context, there is only a single relative rotation, around the z-axis, between the cobot and the AMR coordinate systems. Therefore, the CSM only requires attaining points \(C_1\) and \(C_2\) of Fig. 6 to determine the AMR coordinate system orientation.

Fig. 7
figure 7

Scheme of movements and measured points to be recorded for the a orientation analysis and b position analysis in the application case. During the orientation analysis, the cobot performs movements in the directions of the effector base \(\{S_6\}\), while in the position analysis the base of movement is adapted to the AMR orientation \(\{S_{AMR}\}\)

However, the line defined by points \(C_1\) and \(C_2\) only determines the relative orientation, while the origin of the coordinate system requires determining the intersection of all three CGPs. In this context, with the height of the AMR known, only the distance between a known reference point and the \(\pi _\textit{frontal}\) location is required to determine the intersection among planes. Consequently, following the fundamentals of the CSM, we have implemented a touch probe, designed in the form of a triangular prism, for determining the relative distance between \(\pi _\textit{frontal}\) and the effector location, which successfully locates the frontal plane in the proposed application case.

In the proposed methodology, depicted in Fig. 7, the cobot only performs measurements on the transverse and planar top bar of the AMR structure. First, the cobot takes the two-point measurements illustrated in Fig. 7a. An additional auxiliary point, \(P_2\), is included with respect to the general analysis to avoid collision with the prismatic element. This analysis returns the relative rotation deviation from the cobot axis, which can be deduced from the following expression:

$$\begin{aligned} \theta _{z_{AMR}} = \tan ^{-1} \left( \frac{ ||\overrightarrow{C_1 P_1}|| - ||\overrightarrow{C_2 P_2}||}{|| \overrightarrow{P_2 P_1}||} \right) \end{aligned}$$
(7)

The position analysis is performed immediately after the orientation analysis, following the scheme depicted in Fig. 7b. An important note is that the orientation analysis is performed along the directions of movement of the effector base, \(\{S_6\}\). Once the orientation of the AMR has been determined, the cobot changes its directions of movement to those of the AMR base system, \(\{S_{AMR}\}\), which are parallel and perpendicular to the AMR structure.

In the position analysis, the effector retraces a fixed distance from the last measured point, \(C_2\), thus reaching the auxiliary point \(P_3\). From this point, the effector moves in a direction of collision with the triangular prism, contacting the prismatic element at \(C_3\). Since the location of the prism is known, the origin of the coordinate system above the AMR is set based on the relative coordinates of point \(C_3\), thus determining the position of the AMR.
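A minimal sketch of this 3-DoF particularisation is given below, implementing Eq. 7 and the subsequent placement of the origin from the contact with the prism. The prism offset and the example coordinates are hypothetical parameters of the illustration.

```python
import numpy as np

def amr_yaw(c1, p1, c2, p2):
    """Eq. 7: relative rotation of the AMR about the z-axis, deduced from the
    two contact measurements taken along the effector base directions."""
    num = np.linalg.norm(np.asarray(p1) - np.asarray(c1)) - \
          np.linalg.norm(np.asarray(p2) - np.asarray(c2))
    den = np.linalg.norm(np.asarray(p1) - np.asarray(p2))
    return np.arctan2(num, den)

def amr_origin(c3, theta_z, prism_offset):
    """Place the origin of {S_AMR} from the contact point C3 on the prism.
    `prism_offset` is a hypothetical known vector from the contacted face of
    the triangular prism to the AMR coordinate origin, expressed in {S_AMR}."""
    c, s = np.cos(theta_z), np.sin(theta_z)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])  # rotation about z
    return np.asarray(c3) + R @ np.asarray(prism_offset)

if __name__ == "__main__":
    # Illustrative (made-up) points in the cobot base frame [m]
    theta = amr_yaw(c1=[0.50, 0.20, 0.40], p1=[0.50, 0.05, 0.40],
                    c2=[0.80, 0.21, 0.40], p2=[0.80, 0.05, 0.40])
    origin = amr_origin(c3=[0.65, 0.25, 0.42], theta_z=theta,
                        prism_offset=[0.10, 0.05, -0.02])
    print("theta_z [deg]:", np.degrees(theta), " origin:", np.round(origin, 3))
```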

Nevertheless, performing this methodology efficiently and without human intervention requires automated communication between the cobot and the AMR. We have achieved interoperability among these two systems through the implementation of an IIP, which we describe in the following subsection.

4.3 Industrial Internet Platform

The IIP supports communications among heterogeneous assets, both internal and external. In this real application case, collaboration between the cobot and the AMR is required to perform the experiments without human intervention. The integration is based on assigning navigation missions to the AMR and monitoring its status in order to initiate the CSM method or the subsequent procedures on the cobot. Furthermore, the IIP has also been used to obtain, classify, and store the results automatically.

To achieve a collaborative environment characteristic of the SM paradigm, a platform that interconnects all assets is required. For this purpose, Ethernet networks have been implemented based on the IEEE 802.3 standard for wired equipment (e.g., the cobot) and IEEE 802.11 for wireless equipment (e.g., the AMR). Thus, effective communication is achieved at the technical level. However, the communication protocols used by the AMR and the cobot are different, preventing direct communication.

To solve this challenge, a gateway with Ethernet interfaces has been implemented, which receives data from all CPSs regardless of the communication protocol. In this manner, it is possible to structure the information in order to guarantee the syntactic and semantic interoperability necessary for efficient communication. Moreover, by centralizing all the data in the gateway, it can interpret them to determine the appropriate actions in an optimized way. The CPSs (i.e., the AMR and the cobot) communicate their status, and the gateway sends the actions (e.g., setting the AMR navigation target, starting a program on the cobot) using the communication protocol compatible with each device.

The cobot communications are based on a REST API in which variables are exchanged over Hypertext Transfer Protocol (HTTP) using the IEEE 802.3 standard. This allows data such as joint positions to be obtained, as well as predefined programs to be executed.

The AMR, on the other hand, communicates via the OPC-UA protocol, where the gateway is the client and the vehicle is the server. This protocol, based on the client–server philosophy, allows the gateway to access and modify both variables and files. Therefore, missions can be established and the status of the AMR (e.g., battery level, location, speed) can be retrieved. In this case, the equipment is connected wirelessly via the IEEE 802.11 standard.

The gateway is an industrial computer with several communication interfaces. We have programmed it with Node.js, a cross-platform, open-source JavaScript runtime environment. In addition to the interoperability service among protocols, a web server has been implemented where the status of the equipment can be monitored in real time and the equipment can be controlled by means of human-machine interfaces (HMIs).
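The interaction pattern of the gateway can be sketched as follows. The actual gateway was implemented in Node.js; this Python sketch only mirrors the pattern, and every address, endpoint, and OPC-UA node identifier shown is hypothetical.

```python
"""Sketch of the gateway interaction pattern. The authors' gateway runs on
Node.js; this Python version only mirrors the pattern, and every address,
endpoint and OPC-UA node identifier below is hypothetical."""
import time

import requests              # HTTP client for the cobot REST API
from opcua import Client     # python-opcua client for the AMR OPC-UA server

COBOT_API = "http://192.168.1.10/api"          # hypothetical cobot REST endpoint
AMR_OPCUA = "opc.tcp://192.168.1.20:4840"      # hypothetical AMR OPC-UA address

def set_amr_mission(amr, mission_id):
    """Write the mission identifier into a (hypothetical) OPC-UA variable."""
    amr.get_node("ns=2;s=MissionID").set_value(mission_id)

def amr_has_arrived(amr):
    """Read a (hypothetical) status variable exposed by the AMR."""
    return amr.get_node("ns=2;s=MissionDone").get_value()

def start_cobot_program(name):
    """Trigger a predefined cobot program through its (hypothetical) REST API."""
    response = requests.post(f"{COBOT_API}/programs/{name}/start", timeout=5)
    response.raise_for_status()

if __name__ == "__main__":
    amr = Client(AMR_OPCUA)
    amr.connect()
    try:
        set_amr_mission(amr, "goto_cobot_station")   # assign the navigation mission
        while not amr_has_arrived(amr):              # poll the AMR status
            time.sleep(1.0)
        start_cobot_program("csm_localisation")      # start the CSM on the cobot
    finally:
        amr.disconnect()
```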

As can be seen in the architecture of Fig. 8, the web server is accessible from outside the network. To prevent unwanted third-party access, cybersecurity policies have been implemented (e.g., virtual private network (VPN) access) using a firewall. Therefore, the wired and wireless networks connecting the CPSs of the production process and the gateway form the IIP in our application case.

Fig. 8
figure 8

Industrial Internet platform implemented for the creation of a collaborative industrial environment

Through this IIP it is possible to coordinate not only the AMR and the cobot but also any other CPS of the manufacturing plant or external services (e.g., digital twins (DT), optimization algorithms, or human interfaces), which can be incorporated in future work.

Therefore, in this real application case, the IIP enables effective asset communication in order to coordinate AMR missions with cobot actions. In this way, it has been possible to automate the repetitive tasks of both the CSM experimentation and the contact error measurement. Furthermore, the IIP has provided the cobot contact positions to the gateway in order to obtain the results, which are discussed in the following section.

5 Results

An initial analysis of the proposed methodology to integrate AMRs and cobots requires considering that, although the CSM is an analytical method that mathematically leads to a unique and exact solution, the measurements obtained by the cobot carry a series of uncertainties that are present in the final evaluation of the method. These uncertainties add to the environmental conditions of operation (e.g., variations in the straightness of the contact surfaces, vibrations of the constituent elements, deformation of the materials), which prevent the determination of an exact position even when an exact analytical method such as the CSM is used.

Thus, we follow in this paper a methodology to quantify the accuracy of the CSM in a real application case that uses the UR5e cobot and the MIR100 AMR. For this purpose, two experiments have been carried out to evaluate both the numerical accuracy and the applicability of the CSM in industrial precision processes.

The first experiment focuses on the quantification of the accuracy and precision of the CSM, while the second analyses the feasibility of the methodology for performing a precision mechanical assembly operation; both are described hereafter.

5.1 Quantification of Deviations

The first experiment aims to quantify the deviations of the method introduced in this paper in a real case. In order to evaluate the accuracy of the CSM, we have implemented the same methodology proposed in Sect. 3 for characterising the AMR pose uncertainty, where the cobot performs a series of predetermined movements based on the estimated coordinate system of the AMR, as shown in Fig. 2.

Through this technique, the distances traveled by the effector from the auxiliary points (i.e., \(B_1\), \(B_2\)) to the AMR structure (i.e., \(A_1\), \(A_2\), \(A_3\)) are quantified. Following Eqs. 3–6, we can derive from these distances the orientation and translational errors of the AMR as well as the placement error at the point of interest \(B_1\).

For the evaluation of the method, 40 tests have been performed. Within each test, the AMR first navigates to a pre-recorded set point. Once its arrival has been communicated via the devised IIP, the cobot initiates the CSM to determine the AMR pose. The estimated pose is then evaluated through the previously specified methodology, where the locations of the points of interest were recorded using the IIP for posterior accuracy processing.

The errors resulting from the CSM application in this experimentation have been compared to the pose uncertainty of the AMR previously quantified in Sect. 3. Since both analyses were performed under the same conditions and with the same set location of the AMR, a comparison between them indicates the error correction that the CSM is capable of achieving.

The result of this comparison is depicted in Fig. 9, where the rotational, translational and placement errors are compared. Furthermore, Table 4 includes the mean and standard deviation for the quantified errors after the CSM, while the respective values without CSM can be consulted in Table 3.

Table 4 Representative values of the three errors measured after the CSM application
Fig. 9
figure 9

Boxplot representation of the a rotational and b translational error of the AMR along the c placement error within point \(B_1\) before and after the implementation of the CSM

The achieved results indicate a significant reduction of all errors, up to an order of magnitude in both rotation and translation. Notably, at the measured point of interest, the perceived error was reduced by up to two orders of magnitude, more than the other two error sources.

This reduction was achieved through the compensation between these two errors, as they display a systematic behaviour. The mean values and directions of both errors at point \(B_1\) largely compensate each other, a characteristic only achieved after the CSM application. This error level was consistent, which recommends the use of point \(B_1\) for the high-precision analysis performed in the following subsection.

The proposed method is analytical and, by its mathematical derivation, converges to a unique solution. Nevertheless, the results in the application case include errors derived from the position and force sensors of the cobot, vibrations in the cobot and its supporting structure, and the inherent accuracy of the cobot. These errors compound, generating a total error that can reduce the repeatability and accuracy of the methodology.

In this sense, other real-world applications of the integration between these two industrial robots in the literature [37, 44] have proposed different positioning methods for precise cobot operations. Despite their capabilities in a stationary scenario, our case study considers a dynamic scenario with the AMR, which is a novelty in the literature and requires addressing higher placement and orientation errors.

Therefore, our proposal requires actual experimentation to determine the suitability of the proposed methodology. For this reason, an experiment requiring high-accuracy operations in an actual industrial scenario is presented hereafter.

5.2 Precision in an Assembly Operation

The subsequent experiment aims to test the feasibility of the CSM when implemented within a real industrial process that requires high-accuracy cooperation between the cobot and the AMR.

For this purpose, we have performed different tests over a high-precision assembly process, where the cobot must insert a 608Z bearing inside shafts of varying diameters, as illustrated in Fig. 10.

Fig. 10
figure 10

Assembly operation of a bearing on an 8 mm shaft with the cobot using CSM

In the proposed experiment, the AMR navigates from a remote position to a defined point where the cobot performs the CSM. Using the determined AMR coordinate system, the assembly operation is conducted, in which the cobot inserts a bearing into multiple shafts distributed over the AMR surface, near the previously evaluated point \(B_1\), where the placement error (i.e., the relevant error for this application) was minimal.

This procedure is automated through the IIP, and the relative location of each shaft in the AMR coordinate system is known. Furthermore, the bearing is fixed to the cobot effector, and the location of the bearing centre is known for performing the assembly operation. In this context, once the AMR has been located and oriented, the cobot is commanded to place the centre of the bearing above each shaft centre, the relative locations of the bearing centre and the shaft centres being known with respect to the effector tip and the AMR coordinate systems, respectively.
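As an illustration of how the determined AMR frame is used in this step, the short sketch below expresses a shaft centre, known in \(\{S_{AMR}\}\), in the cobot base frame and offsets it vertically so the bearing starts above the shaft. All numerical values and the approach-height parameter are hypothetical.

```python
import numpy as np

def target_above_shaft(R_amr, o_amr, shaft_in_amr, approach_height):
    """Express a shaft centre (known in {S_AMR}) in the cobot base frame and
    offset it vertically so the bearing centre starts above the shaft.
    R_amr, o_amr    : orientation and origin of {S_AMR} obtained from the CSM
    shaft_in_amr    : shaft centre coordinates in {S_AMR}
    approach_height : vertical clearance before the insertion descent."""
    shaft_in_base = o_amr + R_amr @ np.asarray(shaft_in_amr)
    return shaft_in_base + np.array([0.0, 0.0, approach_height])

if __name__ == "__main__":
    # Illustrative (made-up) CSM output and shaft location, in metres
    theta = np.deg2rad(1.2)
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    origin = np.array([0.62, 0.28, 0.41])
    print(np.round(target_above_shaft(R, origin, [0.15, 0.08, 0.05], 0.03), 4))
```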

Once the previous movement has been performed, the cobot effector descends slowly to perform the assembly operation. If the bearing inner ring collides with the shaft, the generated resistance is measured by the cobot force sensors, which command the cobot to stop the assembly process, labelling the resulting operation as a failure.

Evaluating the success rate of the multiple assemblies performed between the bearing and the different shafts, we can assess the effectiveness of the proposed methodology for performing high-accuracy applications. Table 5 shows the measured successful attempts for each tested shaft after 40 iterations, which prove the feasibility of the CSM for performing highly demanding industrial applications.

Table 5 Success rate of an application case using CSM

This experiment also validates the CSM as an effective method to allow the precise integration of two characteristic CPSs of SM (i.e., cobots and AMRs) despite their different coordinate systems and despite the localization errors of AMRs, which are a consequence of their imperfect positioning and navigation in the industrial plant.

Thus, this completely new low-cost methodology, which takes advantage of the force sensors of the analyzed cobot, enables the employment of cobots for precision industrial tasks in collaboration with AMRs, creating the collaborative environment sought by SM.

In addition, the described method is not susceptible to the environmental or lighting conditions of the industrial plant, as other methods in the literature are, which shows the robustness and repeatability of the CSM, rendering it a reliable procedure to integrate AMRs and cobots in any potential application context.

6 Conclusions

The generation of added value by creating collaborative environments with CPSs is a challenge of Smart Manufacturing. In this sense, the integration between industrial mobile CPSs such as AMRs and collaborative robots is affected by the high uncertainty in their relative locations and the different communication protocols of the devices.

This uncertainty is produced by deviations in both the AMR location and orientation when navigating throughout the industrial plant. The combined effect of these two errors may induce significant uncertainties along the AMR working surface, which prevents other CPSs, such as cobots, from addressing precision industrial tasks on it. Moreover, heterogeneous communication protocols are also a barrier to their integration into production processes.

This method has several applications in the future Industry 5.0, where the dynamic distribution of CPSs will be a technological challenge for their integration into the value chain. In addition, the CSM can be used when an AMR carries the cobot and this pair has to be positioned with respect to a fixed object (e.g., a rack or a table). However, the implementation of this analytical method presents certain technological challenges, such as positioning time or accuracy, which depend on the cobot.

For these reasons, the objective of this paper has been to develop a novel contact-based positioning method to determine the relative location of an AMR with regard to its associated cobot for performing collaborative industrial tasks, together with a communications architecture that allows effective communication among devices in order to fully automate the integration of these Smart Manufacturing assets.

For the calculation of the relative position between the AMR and the cobot base, a mathematical model based on a contact measurement method, the CSM, has been developed, in which the pose of a free object is characterized by its six DoFs. Furthermore, to solve the problem of interoperability between the CPSs, an architecture based on a gateway with external connectivity has been proposed.

This integration procedure has been proven in real industrial applications. The results obtained indicate an improvement of 96.2% in positional accuracy and 85.4% in orientation compared to the base AMR accuracy.

Furthermore, we achieved up to a 99.2% error reduction when performing operations at a point of interest on the AMR surface. This error reduction was achieved through the compensation of the translational and rotational errors of the AMR, which followed a systematic error distribution; this recommends the characterisation and minimisation of the achieved error at the points of interest within the AMR in future work.

In addition, we proved the feasibility of the methodology by performing an assembly operation between a bearing and a shaft of the same diameter, where 92.5% of the assemblies were performed successfully.

Therefore, the CSM, together with the proposed architecture, allows the AMR and the cobot to collaborate in high-accuracy industrial operations through a novel low-cost methodology based on contact measurements, representing a breakthrough in the scientific literature.