Multisensor Data Fusion

Abstract

Multisensor data fusion is the process of combining observations from a number of different sensors to provide a robust and complete description of an environment or process of interest. Data fusion finds wide application in many areas of robotics such as object recognition, environment mapping, and localization.

This chapter has three parts: methods, architectures, and applications. Most current data fusion methods employ probabilistic descriptions of observations and processes and use Bayes' rule to combine this information. This chapter surveys the main probabilistic modeling and fusion techniques, including grid-based models, Kalman filtering, and sequential Monte Carlo methods, and also briefly reviews a number of nonprobabilistic data fusion methods. Data fusion systems are often complex combinations of sensor devices, processing, and fusion algorithms. This chapter provides an overview of key principles in data fusion architectures from both hardware and algorithmic viewpoints. The applications of data fusion are pervasive in robotics and underlie the core problems of sensing, estimation, and perception. We highlight two example applications that bring out these features. The first describes a navigation or self-tracking application for an autonomous vehicle. The second describes an application in mapping and environment modeling.
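
As a minimal illustration of the Bayesian fusion idea surveyed in this chapter, the sketch below fuses two independent Gaussian observations of the same scalar quantity using Bayes' rule; with Gaussian models this reduces to the inverse-variance-weighted average that also underlies the Kalman filter update. This is an illustrative example only, and the sensor values and variances are hypothetical.

```python
def fuse_gaussian(z1, var1, z2, var2):
    """Fuse two independent Gaussian observations N(z1, var1) and N(z2, var2)
    of the same quantity via Bayes' rule (precision-weighted averaging)."""
    w1, w2 = 1.0 / var1, 1.0 / var2        # information (inverse variance) of each sensor
    var_fused = 1.0 / (w1 + w2)            # fused variance is smaller than either input
    z_fused = var_fused * (w1 * z1 + w2 * z2)  # precision-weighted mean
    return z_fused, var_fused

# Hypothetical example: an accurate laser and a noisy sonar observe the same range.
z, var = fuse_gaussian(z1=4.95, var1=0.01, z2=5.20, var2=0.25)
print(f"fused range = {z:.3f} m, variance = {var:.4f}")
```

The fused estimate lies closer to the more accurate sensor, and its variance is lower than that of either observation alone, which is the essential benefit of combining multiple sensors.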

The essential algorithmic tools of data fusion are reasonably well established. However, the development and use of these tools in realistic robotics applications are still maturing.

Abbreviations

2-D     two-dimensional
3-D     three-dimensional
4-D     four-dimensional
ASN     active sensor network
CAN     controller area network
CCI     control command interpreter
COTS    commercial off-the-shelf
COV     characteristic output vector
DDF     decentralized data fusion
DFRA    distributed field robot architecture
EKF     extended Kalman filter
GPS     global positioning system
GUI     graphical user interface
GV      ground vehicle
HO      human operator
ILLS    instrumented logical sensor system
IR      infrared
JDL     joint directors of laboratories
LSS     logical sensor system
MC      Monte Carlo
MESSIE  multi expert system for scene interpretation and evaluation
RCS     real-time control system
SAR     synthetic aperture radar
SFX     sensor fusion effect
SIR     sampling importance resampling
SMC     sequential Monte Carlo
UAV     unmanned aerial vehicle


Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  1. 1.Australian Centre for Field Robotics (ACFR)University of SydneySydneyAustralia
  2. 2.School of ComputingUniversity of UtahSalt Lake CityUSA
