Abstract
A classification framework that uses only a set of distance matrices is proposed. The proposed algorithm learns a classifier solely from a set of distance (or similarity) matrices, and is therefore applicable to structured data, such as time series and graphs, that lack a natural vector representation. A random forest is used to explore a suitable feature representation based on the distances between points defined by the given set of distance matrices. The effectiveness of the proposed method is evaluated through experiments with point process data and graph-structured data.
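The core idea of the abstract — representing each point by its distances to all other points under several metrics and feeding that representation to a random forest — can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the synthetic data, the Minkowski distances standing in for domain-specific metrics, and the plain row-concatenation of the matrices are all assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 60  # number of points (assumed toy problem size)

# Two Gaussian clusters stand in for structured data; in the paper's
# setting only the distance matrices would be observed, not coordinates.
X_latent = rng.normal(size=(n, 2)) + np.repeat([[0.0, 0.0], [3.0, 3.0]], n // 2, axis=0)
y = np.repeat([0, 1], n // 2)

def pairwise(X, p):
    """Minkowski distance matrix of order p (proxy for a domain metric)."""
    diff = X[:, None, :] - X[None, :, :]
    return (np.abs(diff) ** p).sum(-1) ** (1.0 / p)

# A set of K = 3 distance matrices, as if computed by different metrics.
D = [pairwise(X_latent, p) for p in (1, 2, 3)]

# Distance-based feature representation: each point is described by the
# concatenation of its rows across all matrices, shape (n, K * n).
F = np.hstack(D)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(F, y)
print(clf.score(F, y))
```

Note that classifying a genuinely new point in this scheme requires its distances to all training points under each metric, which is why the footnote below stresses access to the distance functions themselves.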
Notes
- 1. That is, we need access to the distance functions themselves.
Acknowledgements
Part of this work was supported by KAKENHI Grant Numbers 26120504, 25870811, and 25120009.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Takano, K., Hino, H., Yoshikawa, Y., Murata, N. (2015). Patchworking Multiple Pairwise Distances for Learning with Distance Matrices. In: Vincent, E., Yeredor, A., Koldovský, Z., Tichavský, P. (eds) Latent Variable Analysis and Signal Separation. LVA/ICA 2015. Lecture Notes in Computer Science(), vol 9237. Springer, Cham. https://doi.org/10.1007/978-3-319-22482-4_33
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-22481-7
Online ISBN: 978-3-319-22482-4
eBook Packages: Computer Science (R0)