Joint European Conference on Machine Learning and Knowledge Discovery in Databases

ECML PKDD 2015: Machine Learning and Knowledge Discovery in Databases, pp. 20–36

Discriminative Interpolation for Classification of Functional Data

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9284)

Abstract

The modus operandi for machine learning is to represent data as feature vectors and then proceed with training algorithms that seek to optimally partition the feature space \(S \subset \mathbb{R}^{n}\) into labeled regions. This holds true even when the original data are functional in nature, i.e., curves or surfaces that inherently vary over a continuum such as time or space. Functional data are often reduced to summary statistics, locally-sensitive characteristics, and global signatures with the objective of building comprehensive feature vectors that uniquely characterize each function. The present work directly addresses representational issues of functional data for supervised learning. We propose a novel classification by discriminative interpolation (CDI) framework wherein functional data in the same class are adaptively reconstructed to be more similar to each other, while simultaneously repelling nearest-neighbor functional data in other classes. Akin to other recent nearest-neighbor metric learning paradigms such as stochastic k-neighborhood selection and large margin nearest neighbors, our technique uses class-specific representations which gerrymander similar functional data in an appropriate parameter space. Experimental validation on several time series datasets establishes the proposed discriminative interpolation framework as competitive with, or better than, recent state-of-the-art techniques that continue to rely on the standard feature vector representation.
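To make the attract-and-repel mechanism described above concrete, the following is a minimal illustrative sketch, not the authors' implementation. It assumes curves sampled on a common grid, uses PyWavelets (pywt) to obtain wavelet-coefficient representations, and applies a simple nearest-neighbor pull/push update in coefficient space. The helper names (wavelet_coeffs, discriminative_step), the fixed step sizes, and the neighborhood size k are assumptions introduced purely for illustration.

```python
# Illustrative sketch only: a toy attract/repel update in wavelet-coefficient
# space, loosely inspired by the CDI idea summarized in the abstract.
# Assumptions: curves share a common sampling grid; PyWavelets is available;
# the pull/push step sizes and neighborhood size k are arbitrary choices.

import numpy as np
import pywt


def wavelet_coeffs(x, wavelet="db4", level=3):
    """Flatten a multilevel DWT of a 1-D signal into one coefficient vector."""
    return np.concatenate(pywt.wavedec(x, wavelet, level=level))


def discriminative_step(C, y, k=3, pull=0.1, push=0.1):
    """One attract/repel pass over coefficient vectors C (n_samples x n_coeffs).

    Each row is pulled toward the mean of its k nearest same-class neighbors
    and pushed away from the mean of its k nearest other-class neighbors.
    """
    C_new = C.copy()
    for i in range(len(C)):
        d = np.linalg.norm(C - C[i], axis=1)
        same = np.flatnonzero(y == y[i])
        same = same[same != i]            # a curve is not its own neighbor
        diff = np.flatnonzero(y != y[i])
        same_nn = same[np.argsort(d[same])[:k]]
        diff_nn = diff[np.argsort(d[diff])[:k]]
        if same_nn.size:                  # attract: pull toward same-class neighbors
            C_new[i] += pull * (C[same_nn].mean(axis=0) - C[i])
        if diff_nn.size:                  # repel: push away from other-class neighbors
            C_new[i] -= push * (C[diff_nn].mean(axis=0) - C[i])
    return C_new


if __name__ == "__main__":
    # Toy example: noisy sine vs. noisy cosine curves on a shared grid.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 2 * np.pi, 128)
    curves = [np.sin(t) + 0.1 * rng.standard_normal(t.size) for _ in range(10)]
    curves += [np.cos(t) + 0.1 * rng.standard_normal(t.size) for _ in range(10)]
    labels = np.array([0] * 10 + [1] * 10)
    C = np.stack([wavelet_coeffs(c) for c in curves])
    C = discriminative_step(C, labels)    # one "gerrymandering" step in coefficient space
```

Iterating such updates nudges same-class curves toward each other in coefficient space while separating them from other classes; the actual CDI objective, its optimization, and the resulting classification rule are detailed in the full paper.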

Keywords

Functional data classification · Wavelets · Discriminative interpolation



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Rana Haber (1)
  • Anand Rangarajan (2)
  • Adrian M. Peter (3)

  1. Mathematical Sciences Department, Florida Institute of Technology, Melbourne, USA
  2. Department of Computer and Information Science and Engineering, University of Florida, Gainesville, USA
  3. Systems Engineering Department, Florida Institute of Technology, Melbourne, USA
