Experimental Analysis of Measurements Fusion for Pose Estimation Using PMD Sensor

  • Ksenia Klionovska
  • Heike Benninghoff
  • Eicke-Alexander Risse
  • Felix Huber
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11401)


This article presents an experimental investigation of a concept for fusing two relative position and orientation (pose) estimates of a rotating target, obtained with a single Photonic Mixer Device (PMD) sensor, for frame-to-frame tracking. For each frame, the PMD depth sensor provides co-registered depth and amplitude images of the scene. We propose to apply a different pose estimation technique to each data channel and then fuse the two measured state vectors. The fusion architecture is based on a low-complexity weighted-average algorithm. The weights for the fusion operator are determined experimentally using real data from a PMD sensor mounted on DLR's European Proximity Operations Simulator. In our experiments, the fused state vector is more accurate than either of the two incoming pose measurements, which ensures robust pose estimation of the rotating target throughout the tracking sequence.


Keywords: PMD sensor · Data fusion · Pose estimation · Optical navigation
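The weighted-average fusion described in the abstract can be illustrated with a minimal sketch. The function name, the 6-DoF state-vector layout (position plus Euler angles), and the per-component weight vectors are assumptions for illustration; the paper determines its weights experimentally, whereas any concrete weight values below are purely hypothetical.

```python
import numpy as np

def fuse_poses(x_depth, x_amp, w_depth, w_amp):
    """Fuse two 6-DoF pose state vectors by element-wise weighted averaging.

    x_depth, x_amp : length-6 sequences -- pose estimates from the depth
                     and amplitude channels, respectively
                     (here assumed to be position + Euler angles).
    w_depth, w_amp : length-6 sequences of non-negative weights, e.g.
                     determined experimentally per state component.

    Note: naive averaging of Euler angles is only meaningful when the two
    orientation estimates are close, as in frame-to-frame tracking.
    """
    x_depth = np.asarray(x_depth, dtype=float)
    x_amp = np.asarray(x_amp, dtype=float)
    w_depth = np.asarray(w_depth, dtype=float)
    w_amp = np.asarray(w_amp, dtype=float)
    # Component-wise convex combination of the two state vectors.
    return (w_depth * x_depth + w_amp * x_amp) / (w_depth + w_amp)

# Hypothetical usage: equal weights reduce to the midpoint of the estimates.
fused = fuse_poses([0.0] * 6, [2.0] * 6, [1.0] * 6, [1.0] * 6)
```

With equal weights the operator returns the arithmetic mean; skewing a component's weight toward the channel that measured it more reliably pulls the fused estimate toward that channel.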



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Ksenia Klionovska (1)
  • Heike Benninghoff (1)
  • Eicke-Alexander Risse (1)
  • Felix Huber (1)

  1. German Space Operations Center, Wessling, Germany
