Sensor fusion based head pose tracking for lightweight flight cockpit systems

Multimedia Tools and Applications

Abstract

Flight cockpit head tracking systems (HTS) are among the strongest drivers of head pose tracking research in augmented reality. To enable natural interaction between the pilot and the complete internal environment of our lightweight flight cockpit system, we design a head tracking system that combines inside-out tracking (IOT) and outside-in tracking (OIT), and propose a novel sensor fusion approach for dynamically tracking the pilot’s head pose. The approach employs a sensor fusion framework, composed of extended Kalman filters and a fusion filter, to fuse the poses delivered by the complementary IOT and OIT. An experimental setup is established to simulate the cockpit HTS and verify the proposed approach. Experimental results show that, compared with IOT or OIT alone, the proposed fusion-based tracking scheme achieves more accurate and stable pose outputs, an extended tracking range, and better robustness.
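To make the fusion step concrete, below is a minimal Python sketch of the covariance-weighted track-to-track fusion that a fusion filter of this kind performs. It is an illustration, not the paper’s exact algorithm: it assumes each tracker’s extended Kalman filter emits a 6-DOF head pose estimate (position plus roll, pitch, yaw) with an error covariance, and it treats the IOT and OIT estimation errors as independent. The function name, state layout, and numeric values are hypothetical.

```python
import numpy as np

def fuse_tracks(x1, P1, x2, P2):
    """Information-weighted fusion of two pose estimates.

    Neglects the cross-covariance between the two tracks (the common
    process noise term), i.e. the independent-errors special case.
    Linear averaging of the orientation components assumes the two
    attitude estimates differ only by small angles.
    """
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P_fused = np.linalg.inv(P1_inv + P2_inv)            # fused covariance
    x_fused = P_fused @ (P1_inv @ x1 + P2_inv @ x2)     # fused pose
    return x_fused, P_fused

# Hypothetical 6-DOF poses (x, y, z, roll, pitch, yaw) from the two EKFs.
x_iot = np.array([0.10, 0.02, 0.55, 0.01, 0.00, 0.03])  # inside-out estimate
P_iot = np.diag([1e-4, 1e-4, 4e-4, 1e-5, 1e-5, 1e-5])
x_oit = np.array([0.11, 0.01, 0.54, 0.02, 0.01, 0.02])  # outside-in estimate
P_oit = np.diag([4e-4, 4e-4, 1e-4, 4e-5, 4e-5, 4e-5])

x_f, P_f = fuse_tracks(x_iot, P_iot, x_oit, P_oit)
print("fused pose:", x_f)
```

When the two filters share common process noise their errors are correlated, and the Bar-Shalom and Campo track-to-track formulation adds the corresponding cross-covariance terms; the simple form above is the special case in which that correlation is ignored.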

Acknowledgement

This work was supported by the National High Technology Research and Development Program of China (863 Program) under Grants 2006AA02Z4E5, 2008AA01Z303 and 2007AA01Z325, by the National Natural Science Foundation of China under Grant 60827003, and by the Innovation Team Development Program of the Chinese Ministry of Education (IRT0606). We also thank Meng Ding, Ningning Shi and Beibei Li for their collaboration and assistance during the experimental tests of the cockpit head tracking system.

Author information

Corresponding author

Correspondence to Bin Luo.

About this article

Cite this article

Luo, B., Wang, Y. & Liu, Y. Sensor fusion based head pose tracking for lightweight flight cockpit systems. Multimed Tools Appl 52, 235–255 (2011). https://doi.org/10.1007/s11042-010-0468-4
