Remote laser welding with in-line adaptive 3D seam tracking



Remote laser welding systems can typically focus a laser beam to a diameter of 0.1 mm. High-quality clamping and precise teaching are therefore needed to achieve the process tolerance required for a sound weld. This is a limitation for small-series and user-customized manufacturing, where product individualization calls for flexible and adaptive systems, since workpiece geometry deviates due to manufacturing tolerances and thermal deformations during welding. Preparation for welding is therefore often time-consuming. To address this, we have developed an innovative system that enables in-line adaptive 3D seam tracking. The system consists of an industrial robot (Yaskawa MC2000), a scanning head (HighYag RLSK; working area 200 mm × 300 mm × 200 mm) with optical triangulation feedback, and a fiber laser (IPG, YRL-400-AC; 400 W). A feed-forward loop was used to achieve a positioning accuracy better than 0.05 mm during on-the-fly welding. Experimental results show that at welding speeds between 25 and 150 cm/min, the average tracking deviations are 0.043 mm and 0.276 mm in the y and z directions, respectively. Moreover, teaching times for a given seam can be shortened by more than a factor of 10, because only rough seam teaching is required. The proposed system configuration could also be adapted to other classical welding processes.


A typical remote laser welding application consists of guiding the robot and directing the laser beam through the scanning-head optics. The laser beam is usually guided by the combined movement of the robot and the scanning-head optics, which significantly increases system flexibility and decreases processing times. Remote laser welding applications are on the rise in many fields, such as the automotive, electronics, and appliance industries, where new and thinner materials are being used [1,2,3]. There is also a demand for monitoring and controlling the process in order to achieve better weld quality [4,5,6].

Even though a weld path is known in advance, there is often a deviation from the taught or preprogrammed path due to tolerances of workpieces, clamping devices, and imprecise teaching [7,8,9]. The tolerance of laser beam positioning is of the order of the beam radius and its Rayleigh length [10]. As the quality of modern laser sources increases [11, 12], the focused beam diameter shrinks and can reach values below 0.1 mm. Hand teaching, or even just applying small corrections to a complex seam trajectory in 3D space, is therefore often time-consuming and can be inaccurate, as the resolution of the human eye is approximately 0.2 mm [13].

Welding simulations help with teaching a welding path [14, 15], but they do not account for the thermal deformations that occur during welding, the non-repeatability of clamping, vibration of the robot manipulator during movement, or the synchronization between the robot controller and the scanning-head controller. All of this leads to lower-quality welds and products.

Many researchers have investigated the basic principles of monitoring and controlling laser welding [16,17,18]. Optical methods usually prevail, as they are robust and can capture more information. One example is seen in [19,20,21], where a laser triangulation principle was used to track a seam with conventional optics. De Graaf et al. [22] achieved accuracies below 0.1 mm for laser welding with this kind of configuration, but its greatest drawback is the tracking itself: the system welds with a delay, because the seam is detected a few centimeters ahead of the welding spot, which can cause weld misalignment due to thermal deformations, especially for curved workpieces. Another example of look-ahead vision tracking is presented in [23, 24], where a camera was used to passively control robotic gas metal arc welding (GMAW). Both cases look for image features that correspond to the specific seam shape.

Coaxial sensor configurations, in which the sensor path coincides with the laser path, occupy a special place in monitoring and control. Such a configuration is seen in [25, 26], where the interaction zone is directly observed with a camera, enabling laser power control and/or correction of the laser focus position. An upgrade to these systems is presented in [27], where two laser stripes are added to monitor not only the interaction zone but also the pre- and post-seam positions. The application is limited to classical welding heads, where the movement of the head is parallel to the welding path.

In order to achieve greater laser welding flexibility, we have developed a system that enables adaptive three-dimensional laser beam positioning with a scanning head according to the seam position in space. The method is based on an optical triangulation principle in which a camera monitors a wide working area of the scanning head. The scanning head is mounted on an industrial robot. An additional illumination laser, coaxial with the welding laser, is used to equalize the light intensity of the interaction zone with that of the surrounding area. With such a configuration, we achieve greater positioning precision over a range of welding speeds during on-the-fly welding compared with a manual teaching method. Simplified clamping devices can also be used, and faster teaching times are possible, as only approximate teaching of the start and end weld points is needed. Moreover, the system can distinguish the laser beam interaction (keyhole, melt pool, etc.), which could be used to implement process control [18].

Experimental setup and algorithms

The presented configuration for 3D seam tracking comprises three main steps: (i) approximate teaching of the first welding point (to within less than 1 mm); (ii) rough teaching of the second point (this determines the direction of the weld); and (iii) the welding itself, during which the system continuously detects the seam position in three dimensions and adapts the laser beam position to ensure precise alignment.


The system for remote laser welding consists of an industrial robot (Yaskawa MC2000), a scanning head (HighYag, RLSK; working area of 200 mm × 300 mm × 200 mm in the x, y, and z directions, respectively) with a console-attached camera (PointGrey, Flea3, model FL3-U3-13Y3M-C), and a fiber laser (IPG, YRL-400-AC; 400 W). The camera covers an area of 75 mm × 90 mm. To distinguish the seam from the surrounding area, an illumination laser (Fotona XD-2, 810 nm ± 10 nm) was used to equalize the brightness of the keyhole and the surrounding area. The optics of the illumination laser was set such that a circular illumination area with a radius of 9 mm was achieved. The camera axis forms an angle of 15° with the axis of the scanning head. The magnification factor at the center of the image was 1 pixel = 0.1 mm. The laser focus is positioned 536 mm from the end of the scanning head, with a Rayleigh length of 0.8 mm. A band-pass filter (Thorlabs, FBH810-10; central wavelength 810 nm, FWHM = 10 nm) was attached to the camera lens to further improve the visibility of the seam.

The whole on-the-fly welding process with the adaptive 3D beam positioning principle is shown in Fig. 1. A personal computer (Dell, Intel Core i7 @3.4 GHz, 8 GB RAM) with an integrated graphics card (Intel HD Graphics 4000) was used to capture and process images from the camera. A CAN bus between the PC and the scanning-head controller was used to direct the laser beam according to the processed image. The robot is used for rough positioning of the scanning head. LabVIEW software is used for the user interface, communication, and data analysis.

Fig. 1

Welding on-the-fly system with in-line adaptive 3D beam positioning (left) and the programming principle (right)

Algorithm of in-line 3D seam tracking

The necessary steps for correcting the laser beam position are: detection of the keyhole position in the acquired image, detection of the seam position, extrapolation of the seam curve to the point nearest the keyhole, and 3D transformation of the detected position into the coordinate system of the scanning head.

Figure 2 shows a typical image captured with the camera during on-the-fly welding, with a welding beam power of 400 W, an illumination beam power of 5 W, and a camera exposure time of 64 μs. Because of the band-pass filter, we can differentiate between the keyhole and the seam position. The camera detects the keyhole at position (uk, vk). These coordinates are then normalized to:

$$ {u}_n=\frac{\left({u}_k-{c}_u\right)\cdotp \Delta u}{f} $$
$$ {v}_n=\frac{\left({v}_k-{c}_v\right)\cdotp \Delta v}{f} $$

where cu and cv denote the location of the optical center of the image, Δu and Δv the pixel size in the camera u and v coordinates, respectively, and f the focal length of the optics. The height coordinate zc.s of the keyhole in 3D, relative to the camera position in space, is determined by:

$$ {z}_{c.s}=\frac{P_y+{P}_z\cdotp \tan \alpha }{v_n+\tan \alpha } $$

where Py and Pz refer to the position of the laser source origin relative to the camera. The angle α is the triangulation angle between the camera axis and the laser beam. From this, the xc.s and yc.s coordinates of the keyhole in the camera coordinate system are determined by:

$$ {x}_{c.s}={z}_{c.s}\cdotp {v}_n $$
$$ {y}_{c.s}={z}_{c.s}\cdotp {u}_n $$
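The pixel-to-camera-frame mapping of Eqs. 1–5 can be sketched as follows. This is a minimal illustration; the function name and the sample parameter values are ours, not taken from the original system.

```python
import numpy as np

def keyhole_camera_coords(u_k, v_k, c_u, c_v, du, dv, f, P_y, P_z, alpha):
    """Map a detected keyhole pixel (u_k, v_k) to 3D camera-frame
    coordinates via the triangulation relations of Eqs. 1-5."""
    # Normalized image coordinates (Eqs. 1-2)
    u_n = (u_k - c_u) * du / f
    v_n = (v_k - c_v) * dv / f
    # Height from the triangulation geometry (Eq. 3)
    z_cs = (P_y + P_z * np.tan(alpha)) / (v_n + np.tan(alpha))
    # Lateral coordinates scale with height (Eqs. 4-5)
    x_cs = z_cs * v_n
    y_cs = z_cs * u_n
    return x_cs, y_cs, z_cs
```

For a keyhole detected exactly at the optical center, the lateral coordinates vanish and only the triangulated height remains, which is a quick sanity check of the geometry.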
Fig. 2

Example of a typical acquired image during welding with adaptive seam tracking of a butt weld. The image size is 1280 × 1024 pixels

In order to get the 3D position of the keyhole (xk, yk, zk) in the coordinate system of the scanning head, a 3D transformation is calculated:

$$ \left[\begin{array}{c}{x}_k\\ {}{y}_k\\ {}{z}_k\end{array}\right]={\mathbf{R}}^{-1}\cdotp \left(\left[\begin{array}{c}{x}_{c.s}\\ {}{y}_{c.s}\\ {}{z}_{c.s}\end{array}\right]-\mathbf{T}\right) $$

where **R** and **T** denote the rotation matrix and translation vector, respectively, which describe the relative rotation and position between the camera and the scanning-head coordinate system (tool frame).
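The frame change above can be written in a few lines of NumPy. We assume here that the translation is subtracted before the inverse rotation is applied; the helper name is illustrative.

```python
import numpy as np

def camera_to_head(p_cam, R, T):
    """Transform a point from the camera frame into the scanning-head
    (tool) frame using the calibrated rotation R and translation T."""
    p_cam = np.asarray(p_cam, dtype=float)
    return np.linalg.inv(R) @ (p_cam - np.asarray(T, dtype=float))
```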

Calibration of the camera position and orientation relative to the scanning head, the lens distortion, and the focal length is determined with multiple measurements of a reference groove-shaped surface of known dimensions. The mathematical formulation follows [28]. The reference measurement is made with the welding laser beam at low, modulated power [29] at known distances between the scanning head and the reference surface, to ensure that the entire measuring range is calibrated. The measured profiles are then transformed into 3D space using estimated values of the above-mentioned parameters. These parameters are then numerically optimized until the deviation (the sum of squared errors) between the measured profiles and the reference surface is minimized.
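The parameter optimization can be sketched with a generic least-squares solver. This is an assumed structure, not the authors' implementation: `project` stands in for the (hypothetical) model that maps pixels to 3D given the calibration parameters.

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate(params0, pixel_profiles, reference_profiles, project):
    """Optimize calibration parameters so that measured profiles,
    projected to 3D via `project(params, pixels)`, match the reference
    surface in a least-squares sense."""
    def residuals(params):
        pred = np.concatenate([project(params, px) for px in pixel_profiles])
        ref = np.concatenate(reference_profiles)
        return pred - ref  # per-point deviation from the reference
    return least_squares(residuals, params0).x
```

On synthetic data with a known model, the solver recovers the generating parameters, which is a useful self-test before running on real profiles.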

The detection of the keyhole begins with a robust detection of the illumination beam, shown in Fig. 3. The illumination laser is coaxial with the welding laser and more than 30 times larger in diameter. For faster processing, the image is downscaled by a factor p (typical values are 2 or 4). By averaging along columns and rows, the rough position of the illumination beam is calculated. The beginning of the illumination area, uill and vill, is determined with a threshold equal to the median of all the averaged rows and columns.
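The row/column averaging step can be sketched as below, assuming a simple block-mean downscale; the function name and the exact downscale method are our choices for illustration.

```python
import numpy as np

def rough_illumination_position(image, p=2):
    """Estimate where the illumination spot begins by averaging the
    downscaled image along rows and columns and thresholding at the
    median of those averages."""
    h, w = image.shape
    # Downscale by integer factor p using a block mean (for speed)
    small = image[:h - h % p, :w - w % p].reshape(h // p, p, w // p, p).mean(axis=(1, 3))
    rows = small.mean(axis=1)           # average intensity per row
    cols = small.mean(axis=0)           # average intensity per column
    thr = np.median(np.concatenate([rows, cols]))
    v_ill = int(np.argmax(rows > thr))  # first row above the threshold
    u_ill = int(np.argmax(cols > thr))  # first column above the threshold
    return u_ill, v_ill
```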

Fig. 3

Robust detection of the laser beam on the whole image (top). The source image was scaled by a factor of 1/2. The beginnings of vill (row) and uill (column) are determined with a threshold

The exact location of the keyhole is then determined within the original image. The search region starts at U0 = uill∙p and V0 = vill∙p and is as large as the illuminated area (Will × Hill). In this region, the welding beam interaction appears as an overexposed area on the camera sensor, as shown in Fig. 4a. Because spatter can cause similar features in acquired images, we incorporated an algorithm that takes into account prior knowledge of the high laser beam quality and the history of previously detected shapes and sizes of the laser-workpiece interaction. First, each image is filtered with a Gaussian kernel to emphasize the location of the keyhole and to filter out high-frequency image noise (Fig. 4b). Then, a global threshold at 95% of the maximum detected intensity is applied. This results in a binary image with at least one white region corresponding to the keyhole; black represents zero values and white unit values. To eliminate the possibility of detecting secondary regions corresponding to spatter, the current binary image is multiplied by the previous one. Since the spatter position changes rapidly from one image to the next, only the keyhole remains in the resulting image (Fig. 4c). The exact keyhole location (uk, vk) is finally calculated as the centroid of the detected area.
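A compact sketch of this keyhole detection step, using SciPy's image tools; the smoothing width `sigma` is an assumed value, and the mask intersection implements the multiplication with the previous binary image described above.

```python
import numpy as np
from scipy import ndimage

def detect_keyhole(image, prev_mask, sigma=3.0):
    """Locate the keyhole as the centroid of the brightest region,
    rejecting spatter by intersecting with the previous frame's mask."""
    smooth = ndimage.gaussian_filter(image.astype(float), sigma)
    mask = smooth >= 0.95 * smooth.max()   # threshold at 95% of maximum
    if prev_mask is not None:
        stable = mask & prev_mask          # spatter moves frame to frame
        if stable.any():
            mask = stable
    v, u = ndimage.center_of_mass(mask)    # centroid in (row, col) order
    return (u, v), mask
```

The returned mask is kept and passed in as `prev_mask` on the next frame, so a spatter blob that appears in only one image cannot survive the intersection.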

Fig. 4

Detection steps of the keyhole position. The input image (a) is filtered with a Gaussian kernel (b), from which we search for the maximum-intensity pixel. The region that is largest and has a position and size similar to those in previous images is used as the new location of the laser beam (c)

The seam edge is detected within the search area Ox × Oy (see Fig. 5a), which is determined according to the position of the detected keyhole, the direction of robot movement, and the seam path. The presence of speckle from the illumination laser requires Gaussian smoothing of the search area (Fig. 5b), which is already done prior to keyhole detection. Candidate seam locations are then detected in all rows of the search area using a peak detection algorithm based on the change in intensity (Fig. 5c). These seam locations are then used to approximate the seam curve with a least-squares fit.
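The per-row edge search and least-squares curve fit might look as follows. We use the strongest intensity step in each row as the edge candidate and a low-order polynomial as the seam model; both are assumptions for illustration, as the paper does not specify the curve model.

```python
import numpy as np

def fit_seam(search_area, row_offset=0, col_offset=0, deg=2):
    """Find a candidate seam pixel in each row of the search area as the
    strongest intensity step, then least-squares fit a polynomial u(v)."""
    vs, us = [], []
    for r, row in enumerate(search_area):
        grad = np.abs(np.diff(row.astype(float)))  # intensity change
        if grad.max() > 0:
            us.append(int(np.argmax(grad)) + col_offset)
            vs.append(r + row_offset)
    coeffs = np.polyfit(vs, us, deg)  # least-squares curve approximation
    return np.poly1d(coeffs)          # callable seam curve u = f(v)
```

Evaluating the returned polynomial at `v_k + dv` gives the look-ahead seam position used in the next step.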

Fig. 5

Steps for the seam locations detection. (a) The source image with detected keyhole and determined edge search area; (b) filtering of the search area using a Gaussian kernel to remove speckles; and (c) an example of pixel values along selected rows

The approximated seam curve (dashed line in Fig. 5a) is used to calculate the location of the seam edge closest to the keyhole. We also apply a small offset dv (look-ahead distance) from the detected keyhole location in order to reduce the response time. This location (us, vk + dv) is transformed to 3D coordinates using Eqs. 1–6.

We must point out that using classical PID logic to control the laser beam position gives larger errors and slower response times, as it only corrects the misplacement error after it occurs. With foreknowledge of the approximate seam position in 3D, the beam positioning can be sped up by up to 10 times. A comparison of a PID-controlled and a feed-forward-controlled laser beam is shown in Fig. 6.

Fig. 6

Comparison of responsiveness between a PID seam tracking algorithm (a) and a feed-forward seam tracking algorithm (b)

The extrapolated location is the basis of the feed-forward loop. Because of imprecise calibration, robot movement and vibration, system lag, thermal deformations, and the relative change of seam position between two successive images, a small error is always present during the welding process. This error is corrected with an I-controller acting on the difference between the detected positions of the keyhole (uk, vk) and the seam (us, vs). The new position is therefore calculated using the following equation:

$$ \left[\begin{array}{c}{x}_{np}\\ {}{y}_{np}\\ {}{z}_{np}\end{array}\right]=\left[\begin{array}{c}{x}_s\\ {}{y}_s\\ {}{z}_s\end{array}\right]+{K}_i\cdotp \sum \limits_{i=1}^n\left[\begin{array}{c}{x}_{k_i}-{x}_{s_i}\\ {}{y}_{k_i}-{y}_{s_i}\\ {}0\end{array}\right] $$

where xnp, ynp, and znp denote the new position of the laser beam in the x, y, and z coordinates, respectively, and xs, ys, and zs the calculated coordinates of the extrapolated seam edge in 3D space. The last term implements the integral control, where only the x and y coordinates are taken into account and multiplied by an integral coefficient Ki. Here, xki and yki denote the coordinates of the keyhole and xsi and ysi the coordinates of the seam in 3D space. As stated previously, this step is necessary to eliminate the system offset caused by the above-mentioned phenomena.
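The feed-forward target plus integral correction can be sketched as a small stateful class; the class name and the gain value are illustrative.

```python
import numpy as np

class BeamPositioner:
    """Feed-forward beam target from the extrapolated seam position,
    plus an integral term on the keyhole-to-seam error (x and y only)."""

    def __init__(self, k_i=0.1):
        self.k_i = k_i
        self.err_sum = np.zeros(3)  # running sum of the tracking error

    def new_position(self, seam_xyz, keyhole_xyz):
        err = np.asarray(keyhole_xyz, float) - np.asarray(seam_xyz, float)
        err[2] = 0.0                # z is not integrated, per the equation
        self.err_sum += err
        return np.asarray(seam_xyz, float) + self.k_i * self.err_sum
```

A persistent offset between keyhole and seam accumulates in `err_sum` and is gradually subtracted out, which is exactly the role of the integral term described above.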

As stated previously, the algorithm saves the previously detected keyhole positions in the image. To further improve stability and tracking capability, the new location is checked against excessive beam movement before it is sent to the scanning head. If a beam movement limit is exceeded, a new beam position is calculated from the trend line of previous corrections. Such errors can arise from increased reflection of the illumination laser, stronger plasma blowout, or an error in keyhole or seam location detection.
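This safeguard can be sketched as a simple gate; the limit value and the linear trend extrapolation are our assumptions, since the paper does not give the exact trend-line model.

```python
import numpy as np

def safe_position(new_pos, prev_positions, limit=0.5):
    """Reject commanded jumps larger than `limit` (mm, illustrative) and
    fall back to a linear extrapolation of the previous correction trend."""
    new_pos = np.asarray(new_pos, dtype=float)
    prev = np.asarray(prev_positions, dtype=float)
    if np.linalg.norm(new_pos - prev[-1]) > limit:
        # Continue the trend of the last two sent positions instead
        return prev[-1] + (prev[-1] - prev[-2])
    return new_pos
```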

Measurements of response to initial condition and position accuracy

The response to an initial condition was determined by programmatically inducing a position error at fixed time intervals during on-the-fly welding with in-line adaptive 3D seam positioning on a 3-mm-thick butt seam. Experiments were repeated 20 times. The welds were then scanned with a 2D optical scanner, and a program was written to determine the deviation of the performed trajectory from the actual seam, the settling time, and the overshoot. The scanning resolution was 0.021 mm.

The position accuracy was determined on curved flat edge seams. The thickness of each plate was 1 mm (total thickness 2 mm). Vices were used to clamp the workpieces together, and these were freely placed on the work table. During welding, the robot made a linear move with changing x, y, and z coordinates. The start point was taught near the seam and the end point away from the seam in all three directions. The weld was scanned with a 2D optical scanner at the same resolution.

We tested six welding speeds, from 25 to 150 cm/min in steps of 25 cm/min. A laser power of 400 W and a laser beam diameter of 0.2 mm were used. The camera frame rate was set to 80 frames per second with an acquired image size of 1280 × 1024 pixels. The system with the robot, the scanning head, the illumination laser, and the work table is shown in Fig. 7a. In-line adaptive 3D seam tracking of the flat edge seam was done with simplified vices and a linear robot movement, as shown in Fig. 7b.

Fig. 7

a Experimental system; b curved flat edge seam with vices used for the seam tracking

Experimental validation and discussion

The error was induced every 850 ms (every 14 mm) over a 130-mm weld length. A closer view of an example weld is shown in Fig. 6b; the laser beam repositioning is almost instantaneous. The lengths of seam repositioning were measured under a microscope, and the repositioning times were calculated from the known welding velocity. The averaged results for system responsiveness over all experiments are shown in Table 1. The average time for the system to reposition to the weld location was 12 ms, which corresponds to a move of 0.2 mm; in the presented configuration, this is also the diameter of the laser beam.

Table 1 System responsiveness characteristics made with welding speed of 100 cm/min with an image acquisition of 80 fps and an image size of 1280 × 1024 pixels

Figure 8a shows the result of adaptive 3D seam tracking of a flat edge seam at a welding speed of 100 cm/min. The robot made a linear movement with changing x, y, and z coordinates. In this case, the average beam misplacement error in the welding direction was less than 0.03 mm, with a standard deviation of less than 0.05 mm (Fig. 8b). The most significant result is presented in Fig. 8c, which compares the measured weld position with the sent beam position. A deviation of almost 1 mm was present, yet when analyzing the weld (Fig. 8a, left), no deviation from the seam position is visible. This proves that thermal deformation occurred during welding, which the system successfully compensated in real time.

Fig. 8

Result of in-line adaptive 3D seam tracking of the flat edge seam. (a) Shape of the workpiece with a detailed view; (b) deviation analysis between the detected center of weld and the seam position; (c) comparison of measured weld position and the sent laser beam position during welding

Figure 9 shows a time-lapse side-view comparison of welding without (a) and with (b) in-line adaptive 3D seam tracking of the flat edge seam. The difference in plume height and shape and the presence of spatter are good indicators of the stability and accuracy of the welding path relative to the seam position. A stable plume is present when using the developed algorithm. The measured z coordinate along the welding direction is shown in Fig. 9c. Taking into account that the workpiece height change is linear, the standard deviation of the sent z location (shown in Fig. 9d) is in the range of 0.23 mm. This deviation corresponds to less than 30% of the Rayleigh length.

Fig. 9

Time-lapse of welding without (a) and with (b) in-line adaptive 3D seam tracking. Welding proceeds from right to left. c The adaptive change in the z coordinate during welding. d The deviation between the measured z coordinate and a linear change of workpiece height

Beam positioning stability for different welding speeds is shown in Fig. 10. The average standard deviation over the entire welding speed range was 0.043 mm and 0.276 mm in the y and z directions, respectively. No significant influence of welding velocity on the deviation in the y direction is seen, whereas in the z direction the deviation more than doubles from the lowest to the highest tested speed. We assume that keyhole instability and motion, caused by the high laser beam quality at higher welding velocities, contribute to the higher tracking uncertainties [30, 31]. Furthermore, the inclination angle and dynamics of the plasma change [32], which influences the detected keyhole position in the acquired image, especially in the vertical (z) direction. Even if the keyhole and plasma collapse [7], the melt pool also emits some light in the 810-nm spectrum, and since it surrounds the keyhole, the y direction does not change significantly.

Fig. 10

Standard deviation of sent beam location in y and z direction for different welding speeds which determines the precision of 3D seam tracking

Temporal filtering of the detected positions over a broader interval, for example with a Kalman filter [33], could be applied to minimize the deviation. In that case, however, faster image acquisition and a faster processing pipeline would be needed to maintain the same response time.
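A minimal constant-position Kalman filter, as one possible form of such temporal smoothing; the noise parameters `q` and `r` are illustrative and would have to be tuned to the real detection noise.

```python
import numpy as np

def kalman_smooth(measurements, q=1e-4, r=1e-2):
    """Smooth a 1D sequence of detected seam coordinates with a
    constant-position Kalman filter (process noise q, measurement noise r)."""
    x, p = float(measurements[0]), 1.0  # initial state and variance
    out = []
    for z in measurements:
        p += q                 # predict: state unchanged, variance grows
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # update with the new measurement
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)
```

On a noise-free constant input the filter simply reproduces the input, while on noisy detections it trades a small lag for a lower output variance, which is exactly the response-time cost noted above.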


Conclusions

An innovative system for in-line adaptive 3D seam tracking during remote laser welding was developed. The process is based on calculating the position of the laser beam and the seam in 3D in real time during welding, using a camera and an additional illumination laser.

The algorithm was tested on a freely placed, inclined curved-edge flat seam during on-the-fly welding with changing x, y, and z coordinates. The average repositioning time to the seam was 12 ms, which implies that for a non-step-like welding path, welding speeds below 100 cm/min are required. This limit can be greatly increased with faster image acquisition and faster processing times.

The stability of the algorithm was also tested for welding speeds from 25 to 150 cm/min. Average standard deviations of 0.043 mm and 0.276 mm were measured in the sent y and z directions, respectively. This is at least two times better than the resolution of the human eye. The deviation in the z direction increases with speed, while the deviation in the y direction is nearly insensitive to speed within the tested interval.

This kind of adaptive welding requires approximate teaching of only the first welding point, thus enabling faster teaching times. We estimate that the teaching process for welds where a seam feature is visible to the camera will be at least 10 times quicker, because only approximate start and end welding points must be taught to the system. This is a key system feature for industries with simplified clamping systems, imperfect workpiece parts, and loose manufacturing tolerances.

This kind of system configuration and tracking principle could also be adapted to other classical welding processes. It facilitates direct monitoring of the interaction zone, which enables us to investigate the possibility of simultaneously controlling the laser power during adaptive 3D beam positioning to achieve a stable welding process (less plasma and spatter blowout, even weld depth, etc.) and consequently improve weld quality and 3D seam tracking.


References

  1. Zaeh MF, Moesl J, Musiol J, Oefele F (2010) Material processing with remote technology - revolution or evolution? Phys Procedia 5:19–33

  2. Lu J, Kujanpää V (2013) Review study on remote laser welding with fiber lasers. J Laser Appl 25:052008

  3. Ceglarek D, Colledani M, Váncza J, Kim DY, Marine C, Kogel-Hollacher M, Mistry A, Bolognese L (2015) Rapid deployment of remote laser welding processes in automotive assembly systems. CIRP Ann 64:389–394

  4. Jezeršek M, Kos M, Kosler H, Možina J (2017) Automatic teaching of a robotic remote laser 3D processing system based on an integrated laser-triangulation profilometry. Teh Vjesn 24:89–95

  5. Huang W, Kovacevic R (2012) Development of a real-time laser-based machine vision system to monitor and control welding processes. Int J Adv Manuf Technol 63:235–248

  6. Emmelmann C, Schenk K, Wollnack J, Kirchhoff M (2011) High-precision calibration of a weld-on-the-fly-system. Phys Procedia 12:739–743

  7. Katayama S (2013) Handbook of laser welding technologies. Woodhead Publishing Limited, Philadelphia

  8. Kannatey-Asibu E (2009) Principles of laser materials processing. John Wiley & Sons, Inc, New Jersey

  9. Regaard B, Kaierle S, Poprawe R (2009) Seam-tracking for high precision laser welding applications—methods, restrictions and enhanced concepts. J Laser Appl 21:183–195

  10. Ready JF (2001) LIA handbook of laser materials processing. Laser Institute of America Magnolia Publishing, Inc, Orlando

  11. Kratky A, Schuöcker D, Liedl G (2009) Processing with kW fibre lasers: advantages and limits. Proc SPIE 7131, XVII International Symposium on Gas Flow, Chemical Lasers, and High-Power Lasers, 71311X

  12. Kawahito Y, Mizutani M, Katayama S (2009) High quality welding of stainless steel with 10 kW high power fibre laser. Sci Technol Weld Join 14:288–294

  13. Gåsvik KJ (2002) Optical metrology. John Wiley & Sons, Ltd, Chichester

  14. Craig J (2005) Introduction to robotics: mechanics and control, 3rd edn. Pearson Education International, New Jersey

  15. Hatwig J, Reinhart G, Zaeh MF (2010) Automated task planning for industrial robots and laser scanners for remote laser beam welding and cutting. Prod Eng 4:327–332

  16. Everton SK, Hirsch M, Stravroulakis P, Leach RK, Clare AT (2016) Review of in-situ process monitoring and in-situ metrology for metal additive manufacturing. Mater Des 95:431–445

  17. You DY, Gao XD, Katayama S (2014) Review of laser welding monitoring. Sci Technol Weld Join 19:181–201

  18. Purtonen T, Kalliosaari A, Salminen A (2014) Monitoring and adaptive control of laser processes. Phys Procedia 56:1218–1231

  19. Muhammad J, Altun H, Abo-Serie E (2016) Welding seam profiling techniques based on active vision sensing for intelligent robotic welding. Int J Adv Manuf Technol 88:127–145

  20. Gu WP, Xiong ZY, Wan W (2013) Autonomous seam acquisition and tracking system for multi-pass welding based on vision sensor. Int J Adv Manuf Technol 69:451–460

  21. Fang Z, Xu D, Tan M (2010) Visual seam tracking system for butt weld of thin plate. Int J Adv Manuf Technol 49:519–526

  22. de Graaf M, Aarts R, Jonker B, Meijer J (2010) Real-time seam tracking for robotic laser welding using trajectory-based control. Control Eng Pract 18:944–953

  23. Xu Y, Fang G, Chen S, Zou JJ, Ye Z (2014) Real-time image processing for vision-based weld seam tracking in robotic GMAW. Int J Adv Manuf Technol 73:1413–1425

  24. Nele L, Sarno E, Keshari A (2013) An image acquisition system for real-time seam tracking. Int J Adv Manuf Technol 69:2099–2110

  25. Kim C-H, Ahn D-C (2012) Coaxial monitoring of keyhole during Yb:YAG laser welding. Opt Laser Technol 44:1874–1880

  26. Eriksson I, Powell J, Kaplan AFH (2010) Signal overlap in the monitoring of laser welding. Meas Sci Technol 21:105705

  27. Dorsch F, Braun H, Keßler S, Magg W, Pfitzner D, Plaßwich S (2012) Process sensor systems for laser beam welding. Laser Tech J 9:24–28

  28. Jezeršek M (2009) High-speed measurement of foot shape based on multiple-laser-plane triangulation. Opt Eng 48:113604

  29. Diaci J, Bračun D, Gorkič A, Možina J (2011) Rapid and flexible laser marking and engraving of tilted and curved surfaces. Opt Lasers Eng 49:195–199

  30. Matsunawa A, Kim J-D, Seto N, Mizutani M, Katayama S (1998) Dynamics of keyhole and molten pool in laser welding. J Laser Appl 10:247–254

  31. Patschger A, Seiler M, Bliedtner J (2018) Influencing factors on humping effect in laser welding with small aspect ratios. J Laser Appl 30:032409

  32. Tenner F, Brock C, Klämpfl F, Schmidt M (2015) Analysis of the correlation between plasma plume and keyhole behavior in laser metal welding for the modeling of the keyhole geometry. Opt Lasers Eng 64:32–41

  33. Gao X, You D, Katayama S (2012) Seam tracking monitoring based on adaptive Kalman filter embedded Elman neural network during high-power fiber laser welding. IEEE Trans Ind Electron 59:4315–4325



Acknowledgements

We would like to thank Yaskawa Slovenia for supplying their commercially available robotic system for remote laser welding applications.

Funding information

The authors received financial support from the GOSTOP program, contract number C3330-16-529000, co-financed from Slovenia and EU under ERDF and from the Slovenian Research Agency (research core funding No. P2-0392, P2-0270; research project funding No. L2-8183).

Author information

Correspondence to Matjaž Kos.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

(WMV 18549 kb)

(MOV 37107 kb)

(MOV 34748 kb)

(MTS 31968 kb)

(MTS 46848 kb)


(JPG 780 kb)


(MP4 4563 kb)


(MP4 3648 kb)



Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Kos, M., Arko, E., Kosler, H. et al. Remote laser welding with in-line adaptive 3D seam tracking. Int J Adv Manuf Technol 103, 4577–4586 (2019) doi:10.1007/s00170-019-03875-z



Keywords

  • Remote laser welding
  • In-line 3D adaptive laser beam positioning
  • Easy teaching
  • Triangulation feedback