
An Epipolar Geometry-Based Approach for Vision-Based Indoor Localization

  • Yinan Liu
  • Lin Ma
  • Xuedong Wang
  • Weixiao Meng
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 463)

Abstract

Indoor positioning is attracting increasing attention and research. We propose an epipolar geometry-based method for vision-based indoor localization using images. The method requires a query image captured at the position to be localized. SURF is used to extract feature points, which are then filtered so that good points are retained and bad ones are discarded. The good feature points are matched against the feature points in the database, which are extracted from images whose positions are already known. The matched feature points are used to compute the essential matrix, which encodes the translation and rotation between the two views. Localization is then completed from the geometric relationship between the query image and the database images. Furthermore, the database stores feature points instead of the images themselves, which reduces storage space and speeds up localization.
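The pipeline summarized above maps onto standard multi-view geometry tooling. The following is a minimal sketch, not the authors' implementation: it assumes OpenCV with the contrib SURF module (cv2.xfeatures2d) and a calibrated camera matrix K; the image file names, Hessian threshold, and ratio-test value are illustrative assumptions.

```python
# Sketch: SURF matching between a query image and one database image,
# followed by essential-matrix estimation and decomposition into a
# rotation R and a translation direction t (defined only up to scale).
# Assumes opencv-contrib-python (for SURF) and known camera intrinsics K.
import cv2
import numpy as np

# Illustrative intrinsics; replace with the calibrated camera matrix.
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

query  = cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE)     # image at the unknown position
db_img = cv2.imread("database.jpg", cv2.IMREAD_GRAYSCALE)  # image at a known position

# 1. Extract SURF feature points and descriptors.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp_q, des_q = surf.detectAndCompute(query, None)
kp_d, des_d = surf.detectAndCompute(db_img, None)

# 2. Match descriptors and keep only "good" matches via Lowe's ratio test,
#    which serves as the filtering step that discards bad feature points.
matcher = cv2.BFMatcher(cv2.NORM_L2)
knn = matcher.knnMatch(des_q, des_d, k=2)
good = [m for m, n in knn if m.distance < 0.7 * n.distance]

pts_q = np.float32([kp_q[m.queryIdx].pt for m in good])
pts_d = np.float32([kp_d[m.trainIdx].pt for m in good])

# 3. Estimate the essential matrix from the matched points (RANSAC rejects outliers).
E, mask = cv2.findEssentialMat(pts_q, pts_d, K, method=cv2.RANSAC,
                               prob=0.999, threshold=1.0)

# 4. Recover the relative rotation and translation direction between the views.
_, R, t, _ = cv2.recoverPose(E, pts_q, pts_d, K, mask=mask)
print("Relative rotation:\n", R)
print("Relative translation direction:\n", t.ravel())
```

Because the translation recovered from the essential matrix is only defined up to scale, the absolute query position follows from combining this relative pose with the known poses of the database images.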

Keywords

Indoor localization · Epipolar geometry · SURF · Essential matrix


Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. Communications Research Center, Harbin Institute of Technology, Harbin, China
