UKF-Based Image Filtering and 3D Reconstruction

  • Abdulkader Joukhadar
  • Dalia Kass Hanna
  • Etezaz Abo Al-Izam


Abstract

In today's world of robotics, robots are expected to accomplish their missions while interacting with human environments and learning online. In practice, a high-performance robotic system should need only minimal certain information in advance if a robot is to complete its task with near-zero error. For instance, a medical assistive robot in an operating room has to learn about tissues as precisely as possible. In recent years, visual simultaneous localization and mapping (VSLAM) has become a rich, open area of mobile robotics research aimed at developing truly autonomous robots. VSLAM simultaneously estimates the robot pose and the 3D structure of the scene from a set of matched correspondences and features extracted across multiple images. To increase efficiency, the majority of online VSLAM algorithms use the Kalman filter (KF), a Gaussian Bayesian filter, to merge the uncertainties of the Cartesian motion and observation models. This chapter concentrates on the stereo vision noise sources that affect 3D reconstruction of the scene, on strategies for image filtering, and on feature extraction. Modern techniques such as the KF, the extended Kalman filter (EKF), and the unscented Kalman filter (UKF) are presented in detail with practical examples from robotics vision research.
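The abstract's description of the KF merging motion-model and observation-model uncertainties can be sketched as a minimal predict/update cycle. This is an illustrative 1D example only; the state, matrices, and noise values are assumptions, not taken from the chapter:

```python
import numpy as np

# Minimal linear Kalman filter sketch. The 1D model, matrices, and
# noise values below are illustrative assumptions, not the chapter's.
def kf_predict(x, P, F, Q):
    """Propagate state mean x and covariance P through motion model F."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def kf_update(x, P, z, H, R):
    """Fuse measurement z (observation model H, noise covariance R)."""
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Example: a static 1D state observed directly under noise.
F = np.array([[1.0]]); H = np.array([[1.0]])
Q = np.array([[0.01]]); R = np.array([[0.25]])
x = np.array([0.0]); P = np.array([[1.0]])
for z in [1.1, 0.9, 1.05]:
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, np.array([z]), H, R)
```

The predict step injects the motion-model uncertainty Q, and the update step weighs it against the observation noise R through the Kalman gain, which is exactly the uncertainty merging the abstract refers to.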


Keywords: Mobile robots · Image processing · Kalman filter


Abbreviations

2D, 3D: Two- and three-dimensional space
4WDDMR: Four-wheeled differential drive mobile robot
AO: Absolute orientation
BA: Bundle adjustment
bel: Degree of belief
EKF: Extended Kalman filter
EO: Exterior orientation
GRV: Gaussian random variable
IO: Interior orientation
KF: Standard Kalman filter
MAP: Maximize the posterior estimation
MLE: Maximum likelihood estimator
PDF: Probability density function
RANSAC: Random sample consensus
RO: Relative orientation
SfM: Structure from motion problem
SIFT: Scale invariant feature transform
SURF: Speeded up robust features
UKF: Unscented Kalman filter
VSLAM: Visual simultaneous localization and mapping
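Since the UKF is central to the chapter, the sigma-point construction it relies on may be worth sketching. This follows the common Van der Merwe scaled formulation; the alpha/beta/kappa defaults and the example state are illustrative assumptions, not values from the chapter:

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Scaled sigma points (Van der Merwe formulation). The default
    alpha/beta/kappa values are common choices, assumed here for
    illustration."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)       # matrix square root of scaled P
    pts = np.vstack([x, x + S.T, x - S.T])      # 2n + 1 sigma points
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))  # mean weights
    Wc = Wm.copy()                              # covariance weights
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, Wm, Wc

# Propagating the points through a nonlinear observation function and
# re-averaging approximates the transformed mean without the Jacobian
# linearization the EKF requires.
x = np.array([1.0, 0.5])
P = 0.1 * np.eye(2)
pts, Wm, Wc = sigma_points(x, P)
y = np.array([[p[0] ** 2, np.sin(p[1])] for p in pts])
y_mean = Wm @ y
```

By design, the weighted mean of the sigma points recovers the original state mean, and the same weights applied after a nonlinear mapping give the unscented estimate of the transformed distribution.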



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Abdulkader Joukhadar (1)
  • Dalia Kass Hanna (1)
  • Etezaz Abo Al-Izam (2)
  1. Department of Mechatronics Engineering, Faculty of Electrical and Electronic Engineering, University of Aleppo, Aleppo, Syria
  2. Department of Computer Engineering, Faculty of Electrical and Electronic Engineering, University of Aleppo, Aleppo, Syria
