Abstract
The issue of data association arises frequently in sensor networks: whenever multiple sensors and sources are present, it may be necessary to determine which observations from different sensors correspond to the same target. In highly uncertain environments, this correspondence may need to be determined without the benefit of an a priori known joint signal/sensor model. This paper examines the data association problem as a more general hypothesis test between factorizations of a single, learned distribution. The optimal test between known distributions decomposes into a model-dependent term and a statistical dependence term, which together quantify the cost incurred by estimating models from measurements rather than testing between known models. We demonstrate how a two-signal association test can be evaluated efficiently using kernel density estimation to model a wide class of possible distributions, and show, through a series of synthetic examples, the resulting algorithm's ability to determine correspondence under uncertain conditions. We then describe an extension of this technique to multi-signal association that determines correspondence while avoiding the computationally prohibitive task of evaluating all hypotheses, and present empirical results for the approximate approach.
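As a rough illustration of the idea (not the authors' exact algorithm), the test between the joint factorization p(x, y) and the independent factorization p(x)p(y) can be approximated by an average log-likelihood ratio under kernel density estimates, which estimates the mutual information between the two signals. The sketch below uses `scipy.stats.gaussian_kde` and synthetic data; the function name and the 0/1-dimensional setup are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde


def association_statistic(x, y):
    """Estimate D( p(x,y) || p(x)p(y) ), i.e. the mutual information,
    via kernel density estimates. A large value supports the hypothesis
    that the two signals share a common source; a value near zero
    supports independence. (Illustrative sketch, not the paper's
    exact estimator.)"""
    xy = np.vstack([x, y])
    kde_joint = gaussian_kde(xy)   # joint density estimate p(x, y)
    kde_x = gaussian_kde(x)        # marginal estimate p(x)
    kde_y = gaussian_kde(y)        # marginal estimate p(y)
    # Sample average of log p(x,y) - log p(x) - log p(y) over the data
    # approximates the KL divergence between the two factorizations.
    return np.mean(np.log(kde_joint(xy)) - np.log(kde_x(x)) - np.log(kde_y(y)))


rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y_assoc = x + 0.3 * rng.normal(size=n)  # associated: noisy copy of x
y_indep = rng.normal(size=n)            # unassociated: independent signal

print(association_statistic(x, y_assoc))  # well above zero
print(association_statistic(x, y_indep))  # near zero
```

Comparing the statistic against a threshold (or across competing pairings) then plays the role of the hypothesis test; with more than two signals, one would face the combinatorial explosion of factorizations that motivates the paper's approximate multi-signal extension.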
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Ihler, A.T., Fisher, J.W., Willsky, A.S. (2003). Hypothesis Testing over Factorizations for Data Association. In: Zhao, F., Guibas, L. (eds) Information Processing in Sensor Networks. IPSN 2003. Lecture Notes in Computer Science, vol 2634. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36978-3_16
Print ISBN: 978-3-540-02111-7
Online ISBN: 978-3-540-36978-3