
In Situ labeling and monitoring technology based on projector-camera synchronization for human–machine collaboration

  • ORIGINAL ARTICLE
  • Published in The International Journal of Advanced Manufacturing Technology

Abstract

Automation technology has increased productivity and quality on production lines, while human–machine collaboration prioritizes safety and flexibility in industrial environments. Many manufacturers still rely on manual sorting when handling multiple product types, so considerable time and capacity are required to train operators to sort efficiently, and operators spend much of their time identifying and sorting objects. This study presents an in situ monitoring and labeling system based on projector-camera synchronization, providing a cost-effective solution for different feeding systems, supporting quality checking and security screening, and reducing labor-training costs. Non-contact labeling technologies provide worker-assistance systems on production lines and allow on-site operators to identify moving targets safely and correctly. To support object labeling, this study developed a projector-camera synchronization system (PASS) that achieves a minimum mean square error of 12.36 pixels. Engineering validation tests use moving objects, a digital camera, an optical projector, and target labeling to construct a polynomial model for image calibration. The mean square error (MSE) of image distortion is analyzed and minimized via ChArUco and ArUco calibration and random sample consensus (RANSAC), which quantifies the systematic error of the digital camera and optical projector. An embedded system uses the data-driven polynomial model to perform real-time image correction and mark multiple objects on a feeding conveyor. PASS is a cost-effective and easy-to-use solution for production-line automation, suitable for semi-automatic factories employing human–machine collaboration. On-site workers see the correct instructions projected as color and text directly onto each object, which eases the burden of object identification during long working hours.
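The core calibration step described in the abstract, fitting a data-driven polynomial model between camera and projector coordinates while rejecting outliers with RANSAC, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the correspondences are synthetic, and the noise levels, polynomial order, and inlier threshold are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def poly_features(x, y):
    # Second-order polynomial basis: 1, x, y, x^2, xy, y^2
    return np.stack([np.ones_like(x), x, y, x * x, x * y, y * y], axis=1)

# Synthetic camera->projector correspondences (hypothetical ground truth)
cam = rng.uniform(0, 640, size=(200, 2))
proj = np.stack([
    1.05 * cam[:, 0] + 0.02 * cam[:, 1] + 1e-4 * cam[:, 0] ** 2 + 5.0,
    0.98 * cam[:, 1] - 0.01 * cam[:, 0] + 1e-4 * cam[:, 1] ** 2 - 3.0,
], axis=1)
proj += rng.normal(0, 0.5, proj.shape)       # sensor noise
proj[:10] += rng.uniform(30, 60, (10, 2))    # gross outliers (mismatched markers)

A = poly_features(cam[:, 0], cam[:, 1])

def ransac_fit(A, b, iters=200, thresh=3.0):
    """Fit b ~ A @ coef by least squares, keeping the largest inlier set."""
    best_inliers, n = None, len(b)
    for _ in range(iters):
        idx = rng.choice(n, size=8, replace=False)
        coef, *_ = np.linalg.lstsq(A[idx], b[idx], rcond=None)
        inliers = np.linalg.norm(A @ coef - b, axis=1) < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on the consensus set and reclassify inliers with the refined model
    coef, *_ = np.linalg.lstsq(A[best_inliers], b[best_inliers], rcond=None)
    inliers = np.linalg.norm(A @ coef - b, axis=1) < thresh
    return coef, inliers

coef, inliers = ransac_fit(A, proj)
pred = A @ coef
mse = np.mean(np.sum((pred[inliers] - proj[inliers]) ** 2, axis=1))
print(f"inliers: {inliers.sum()}/{len(proj)}, MSE: {mse:.2f} px^2")
```

Once fitted, `A @ coef` maps any detected camera coordinate to projector coordinates, which is how an embedded system could project labels onto moving objects in real time.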


Availability of data and material

The raw data are available at https://drive.google.com/drive/folders/1dtS0KS2Ep3Dqs3S12ZOk2dIoyXZERB_R.


Funding

The authors received financial support for this research from the Ministry of Science and Technology (Republic of China) under Grant MOST 110–2218-E-002–040.

Author information

Authors and Affiliations

Authors

Contributions

Ching-Yuan Chang contributed to the mathematical model and numerical analysis. Don-Rong Chen contributed to the experimental setup and data collection. En-Tze Chen contributed to the data analysis and figure plotting.

Corresponding author

Correspondence to Ching-Yuan Chang.

Ethics declarations

Consent for publication

The grant’s policy encourages researchers to broaden industrial automation applications and to publish the latest research in reputable journals. This work is a novel contribution to the scientific literature and has not been published elsewhere, in whole or in part, previously or simultaneously.

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Chang, CY., Chen, DR. & Chen, ET. In Situ labeling and monitoring technology based on projector-camera synchronization for human–machine collaboration. Int J Adv Manuf Technol 120, 4723–4736 (2022). https://doi.org/10.1007/s00170-022-08951-5

