
Sensor-based Human–Process Interaction in Discrete Manufacturing



The rise of Industry 4.0 and its convergence with business process management open new potential for the automatic gathering of process-related sensor information. In manufacturing, information about human behavior in manual assembly tasks is scarce when no interaction with machines is involved. We suggest technologies to automatically detect material picking and placement in the assembly workflow in order to gather accurate data about human behavior and to flexibly support human–process interaction. Material picking is detected using background subtraction in combination with weighing scales. For placement detection, two approaches are tested: image classification using convolutional neural networks and object detection using Haar-like features. The detected fine-grained worker activities are then correlated with a hybrid model of the assembly workflow expressed in the Business Process Model and Notation (BPMN) and the Case Management Model and Notation (CMMN), enabling the measurement of production time (time per state) and quality (error frequency) on the shop floor as an entry point for conformance checking and process optimization. The approach was evaluated in a quantitative case study in which the assembly process was recorded 30 times in a laboratory setup within 4 h. Under these conditions, the classification of assembly states using a neural network achieved a test accuracy of 99.25% across 38 possible assembly states. Material picking based on background subtraction was evaluated in an informal user study with six participants performing 16 picks each, yielding an accuracy of 99.48%. The suggested method offers a promising approach to easily assess fine-grained timings and error rates of assembly steps, which can be used to optimize the corresponding process.
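The picking detector described in the abstract can be illustrated with a minimal sketch: a pixel-wise running-average background model flags foreground activity (e.g., a hand reaching into a parts bin) when enough pixels deviate from the learned background. This is a simplified stand-in for adaptive background subtraction; the function names, thresholds, and region-of-interest handling below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Exponential running average: slowly fold new frames into the
    background model so gradual lighting changes are absorbed."""
    return (1.0 - alpha) * bg + alpha * frame.astype(float)

def foreground_mask(bg, frame, thresh=30):
    """Mark pixels whose intensity deviates from the background model
    by more than `thresh` (an assumed, scene-dependent value)."""
    return np.abs(frame.astype(float) - bg) > thresh

def pick_detected(bg, frame, min_pixels=50):
    """Signal a pick event when the number of foreground pixels inside
    the bin's region of interest exceeds `min_pixels`."""
    return int(foreground_mask(bg, frame).sum()) >= min_pixels

# Illustrative usage on synthetic grayscale frames: an empty bin,
# then a frame in which a bright "hand" patch enters the bin.
bg = np.zeros((100, 100))          # learned background: empty bin
hand = np.zeros((100, 100))
hand[10:30, 10:30] = 255           # 400 foreground pixels
print(pick_detected(bg, hand))     # True: a pick event is flagged
```

In a real deployment, a pick event flagged this way would be cross-checked against the scale reading (a weight drop in the bin), which is what makes the combination in the paper robust to spurious motion.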






This research was funded in part by the German Federal Ministry of Education and Research under grant number 01IS16022E (project BaSys4.0). The responsibility for this publication lies with the authors. The authors thank Mettler Toledo for providing the hardware setup used for inventory control in this research.

Author information

Correspondence to Sönke Knoch.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Knoch, S., Herbig, N., Ponpathirkoottam, S. et al. Sensor-based Human–Process Interaction in Discrete Manufacturing. J Data Semant (2019). https://doi.org/10.1007/s13740-019-00109-z



Keywords

  • Manual assembly
  • Activity detection
  • Computer vision
  • Process enhancement
  • Industry 4.0