Learning Figures with the Hausdorff Metric by Fractals

  • Mahito Sugiyama
  • Eiju Hirowatari
  • Hideki Tsuiki
  • Akihiro Yamamoto
Conference paper

DOI: 10.1007/978-3-642-16108-7_26

Part of the Lecture Notes in Computer Science book series (LNCS, volume 6331)
Cite this paper as:
Sugiyama, M., Hirowatari, E., Tsuiki, H., Yamamoto, A. (2010). Learning Figures with the Hausdorff Metric by Fractals. In: Hutter, M., Stephan, F., Vovk, V., Zeugmann, T. (eds.) Algorithmic Learning Theory. ALT 2010. Lecture Notes in Computer Science, vol. 6331. Springer, Berlin, Heidelberg.

Abstract

Discretization is a fundamental process in machine learning from analog data such as continuous signals. For example, discrete Fourier analysis is one of the most essential signal processing methods for learning and recognition from continuous signals. However, only the time axis is discretized in that method, so each datum is not fully discretized. To provide a completely computational theoretical basis for machine learning from analog data, we construct a learning framework based on the Gold-style learning model. Using modern computability theory from the field of Computable Analysis, we show that scalable sampling of analog data can be formulated as effective Gold-style learning. On the other hand, recursive algorithms are a key means of expressing models or rules that explain analog data; for example, the FFT (Fast Fourier Transform) is a fundamental recursive algorithm for discrete Fourier analysis. In this paper we adopt fractals, since they are a general geometric counterpart of recursive algorithms, and take as learning objects nonempty compact sets in Euclidean space, called figures, in order to introduce fractals into the Gold-style learning model, where the Hausdorff metric can be used to measure generalization errors. We analyze learnable classes of figures from informants (positive and negative examples) and from texts (positive examples only), and reveal the hierarchy of learnabilities under various learning criteria. Furthermore, we measure the number of positive examples, one measure of the complexity of learning, using the Hausdorff dimension, the central concept of Fractal Geometry, and the VC dimension, which is used to measure the complexity of hypothesis classes in the Valiant-style learning model. This work provides theoretical support for machine learning from analog data.
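
The two central notions in the abstract can be made concrete with a small sketch that is not taken from the paper: the Hausdorff metric between nonempty compact sets X and Y is the standard d_H(X, Y) = max{ sup_{x in X} inf_{y in Y} |x - y|, sup_{y in Y} inf_{x in X} |x - y| }, and a fractal figure such as the Sierpinski triangle is generated by a recursive algorithm (an iterated function system). The Python below approximates both on finite point sets; the function names and recursion depths are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def sierpinski(points, depth):
    """Recursively refine a finite approximation of the Sierpinski triangle,
    a fractal generated by the three contractions f_i(x) = (x + v_i) / 2."""
    if depth == 0:
        return points
    vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
    return np.vstack([sierpinski((points + v) / 2, depth - 1) for v in vertices])

def hausdorff(X, Y):
    """Hausdorff distance between two finite point sets: the larger of the
    two directed distances max_x min_y |x - y| and max_y min_x |x - y|."""
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Deeper recursion yields a finer approximation of the same figure; the
# Hausdorff distance between successive levels shrinks geometrically,
# mirroring how generalization error is measured in the paper's setting.
seed = np.array([[0.0, 0.0]])
coarse, fine = sierpinski(seed, 3), sierpinski(seed, 6)
print(hausdorff(coarse, fine))
```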

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Mahito Sugiyama (1)
  • Eiju Hirowatari (2)
  • Hideki Tsuiki (3)
  • Akihiro Yamamoto (1)

  1. Graduate School of Informatics, Kyoto University, Kyoto, Japan
  2. Center for Fundamental Education, The University of Kitakyushu, Kitakyushu, Japan
  3. Graduate School of Human and Environmental Studies, Kyoto University, Kyoto, Japan