Fuzzy Logic Based Sensor Fusion for Accurate Tracking

  • Ujwal Koneru
  • Sangram Redkar
  • Anshuman Razdan
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6939)

Abstract

Accuracy and tracking update rates play a vital role in determining the quality of Augmented Reality (AR) and Virtual Reality (VR) applications. Applications such as soldier training, gaming, simulations, and virtual conferencing need high-accuracy tracking with an update frequency above 20 Hz for an immersive experience. Current research techniques combine multiple sensors, such as cameras, infrared trackers, magnetometers, and Inertial Measurement Units (IMUs), to achieve this goal. In this paper, we develop and validate a novel algorithm for accurate positioning and tracking using inertial and vision-based sensing techniques. The inertial sensing uses accelerometers and gyroscopes to measure rates and accelerations in the body-fixed frame and computes orientations and positions via integration. The vision-based sensing uses a camera and image processing techniques to compute the position and orientation. The sensor fusion algorithm proposed in this work exploits the complementary characteristics of these two independent systems to compute an accurate tracking solution and minimizes the error due to sensor noise, drift, and the different update rates of the camera and IMU. The algorithm is computationally efficient, is implemented on low-cost hardware, and is capable of update rates up to 100 Hz. The position and orientation accuracy of the sensor fusion is within 6 mm and 1.5°. By using fuzzy rule sets and adaptive filtering of the data, we reduce the computational requirements below those of conventional methods (such as Kalman filtering). We have compared the accuracy of this sensor fusion algorithm with a commercial infrared tracking system; the accuracy of this COTS IMU and camera sensor fusion approach is comparable to that of the commercial tracking system at a fraction of the cost.
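The fusion approach described above blends fast inertial dead reckoning with slower, drift-free camera fixes, using fuzzy rules to decide how strongly each camera measurement corrects the inertial estimate. The following is a minimal, hypothetical 1-D sketch of that idea; the membership functions, breakpoints, gains, and sample rates are illustrative assumptions, not the authors' implementation or the paper's actual rule base.

```python
# Minimal 1-D sketch of fuzzy-weighted IMU/camera fusion (illustrative only).
# Assumptions: a 100 Hz accelerometer stream, 10 Hz camera position fixes,
# and made-up membership breakpoints/gains -- none of these are from the paper.
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership on [a, c] with its peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_gain(err):
    """Map |camera - IMU| disagreement to a correction gain in [0, 1].

    Assumed rule base: SMALL error -> trust the IMU (gain 0.1),
    MEDIUM -> blend (0.5), LARGE -> pull hard toward the camera fix (0.9).
    """
    e = abs(err)
    mu_small = tri(e, -0.01, 0.0, 0.02)
    mu_medium = tri(e, 0.01, 0.03, 0.05)
    mu_large = float(np.clip((e - 0.04) / 0.02, 0.0, 1.0))
    num = 0.1 * mu_small + 0.5 * mu_medium + 0.9 * mu_large
    den = mu_small + mu_medium + mu_large
    return num / den if den > 0.0 else 0.1      # Sugeno-style weighted average

rng = np.random.default_rng(0)
dt, n = 0.01, 500                               # 100 Hz IMU samples, 5 s run
t = np.arange(n) * dt
true_pos = np.sin(0.5 * t)                      # synthetic 1-D trajectory
true_acc = -0.25 * np.sin(0.5 * t)              # its second derivative

pos, vel = 0.0, 0.5                             # initial state (vel of sin(0.5 t) at t = 0)
for k in range(n):
    acc = true_acc[k] + rng.normal(0.0, 0.05)   # noisy accelerometer reading
    vel += acc * dt                             # integrate rate -> velocity
    pos += vel * dt                             # integrate velocity -> position (drifts)
    if k % 10 == 0:                             # a camera fix arrives at 10 Hz
        cam = true_pos[k] + rng.normal(0.0, 0.005)
        pos += fuzzy_gain(cam - pos) * (cam - pos)   # adaptive correction

print(f"final position error: {abs(pos - true_pos[-1]):.4f}")
```

The appeal of such a rule-based adaptive gain is that it avoids propagating a full Kalman covariance at every step, which is consistent with the abstract's claim of reduced computational requirements relative to conventional Kalman filtering.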

Keywords

Fuzzy Logic · Augmented Reality · Inertial Measurement Unit · Sensor Fusion · Fuzzy Class

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Ujwal Koneru (1)
  • Sangram Redkar (1)
  • Anshuman Razdan (1)

  1. Arizona State University, USA