
Self-contained optical-inertial motion capturing for assembly planning in digital factory

  • Wei Fang
  • Lianyu Zheng
  • Jiaxing Xu
ORIGINAL ARTICLE

Abstract

In assembly activities, assembly planning is a crucial issue for human-centered manufacturing. The challenge lies in retrieving and utilizing real-time data about human work activities on the shop floor. Unlike simulation-based assembly planning, marker-based motion capture systems can acquire realistic motion data of workers at assembly sites, but they are prone to failure under occlusion and are troublesome to install on the shop floor. Therefore, exploiting the complementary characteristics of optical and inertial sensors, this paper presents a self-contained motion capture method for assembly planning on a real shop floor. It provides real-time, portable motion capture for workers, avoiding the failures of traditional outside-in motion capture systems caused by occlusion or incorrect installation. Moreover, the portable method runs on consumer mobile devices, offering a convenient and low-cost way to perceive workers’ motion on the shop floor, which is significant for wide application in assembly verification and planning in digital factories. Finally, experiments are carried out to demonstrate the accuracy and feasibility of the proposed motion capture method for assembly activities.
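The key idea is that the two sensing modalities compensate for each other’s weaknesses: inertial tracking is self-contained and smooth but drifts over time, while optical tracking is drift-free but can drop out under occlusion. The sketch below is not the authors’ algorithm; it is a minimal complementary-filter illustration of this fusion idea for a single joint angle, assuming a gyroscope with a constant bias and a noisy but drift-free optical measurement (all signals are simulated):

```python
import numpy as np

def complementary_fuse(gyro_rate, optical_angle, prev_angle, dt, alpha=0.98):
    # High-pass the integrated gyroscope rate (accurate short-term, drifts)
    # and low-pass the optical angle (noisy, but drift-free).
    predicted = prev_angle + gyro_rate * dt  # inertial prediction
    return alpha * predicted + (1.0 - alpha) * optical_angle

# Simulated joint angle (hypothetical data, not from the paper):
# gyroscope with a constant bias, optical measurement with white noise.
rng = np.random.default_rng(0)
dt, n = 0.01, 1000
t = np.arange(n) * dt
true_angle = 0.5 * np.sin(t)
gyro = np.gradient(true_angle, dt) + 0.05 + rng.normal(0, 0.02, n)
optical = true_angle + rng.normal(0, 0.05, n)

est = np.zeros(n)
for k in range(1, n):
    est[k] = complementary_fuse(gyro[k], optical[k], est[k - 1], dt)

rmse = np.sqrt(np.mean((est - true_angle) ** 2))
print(f"RMS fusion error: {rmse:.4f} rad")
```

A larger alpha trusts the inertial prediction more between optical updates; a full-body system would instead fuse 6-DoF camera poses and IMU readings, for example with an extended Kalman filter, but the complementary principle is the same.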

Keywords

Motion capture · Optical-inertial fusion · Assembly planning · Human-centered assembly · Digital factory



Copyright information

© Springer-Verlag London 2017

Authors and Affiliations

1. School of Mechanical Engineering and Automation, Beihang University, Beijing, People’s Republic of China
