Machine Learning, Volume 44, Issue 1, pp 67–91

Stochastic Finite Learning of the Pattern Languages

  • Peter Rossmanith
  • Thomas Zeugmann

DOI: 10.1023/A:1010875913047

Cite this article as:
Rossmanith, P. & Zeugmann, T. Machine Learning (2001) 44: 67. doi:10.1023/A:1010875913047


Abstract

The present paper proposes a new learning model—called stochastic finite learning—and shows the whole class of pattern languages to be learnable within this model.

This main result is achieved by providing a new and improved average-case analysis of the Lange–Wiehagen (New Generation Computing, 8, 361–370) algorithm, which learns the class of all pattern languages in the limit from positive data. The complexity measure chosen is the total learning time, i.e., the overall time taken by the algorithm until convergence. The expectation of the total learning time is carefully analyzed, and exponentially shrinking tail bounds for it are established for a large class of probability distributions. For every pattern π containing k different variables, it is shown that Lange and Wiehagen's algorithm possesses an expected total learning time of \(O(\hat{\alpha}^{k}\, E[\Lambda] \log_{1/\beta}(k))\), where \(\hat{\alpha}\) and \(\beta\) are two easily computable parameters arising naturally from the underlying probability distributions, and \(E[\Lambda]\) is the expected example string length.
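The Lange–Wiehagen update rule can be sketched as follows. This is a minimal reconstruction, assuming the usual formulation: a strictly shorter example replaces the hypothesis outright, an equal-length example is merged position-wise, and two positions receive the same variable exactly when they exhibit the same pair of conflicting symbols. The function names and the string encoding of variables are my own, not the paper's.

```python
def union(h, w):
    """Merge a pattern h with an equal-length example string w.

    Positions where h and w agree keep their symbol; disagreeing
    positions get a variable, and two positions share a variable
    exactly when they show the same pair of conflicting symbols.
    Variables are encoded as the strings 'x1', 'x2', ... for simplicity.
    """
    var_of = {}
    merged = []
    for hi, wi in zip(h, w):
        if hi == wi:
            merged.append(hi)
        else:
            key = (hi, wi)
            if key not in var_of:
                var_of[key] = "x%d" % (len(var_of) + 1)
            merged.append(var_of[key])
    return merged

def lange_wiehagen(examples):
    """Yield the current hypothesis after each positive example."""
    h = None
    for w in examples:
        if h is None or len(w) < len(h):
            h = list(w)        # a strictly shorter example restarts the hypothesis
        elif len(w) == len(h):
            h = union(h, w)    # equal length: generalize position-wise
        # examples longer than the hypothesis are ignored
        yield h
```

For instance, for the target pattern x1 a x1 over the alphabet {a, b}, the two examples "aaa" and "bab" already drive the hypothesis to ['x1', 'a', 'x1']: the first and third positions disagree in the same way (a vs. b) and therefore share one variable.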

Finally, assuming a bit of domain knowledge concerning the underlying class of probability distributions, it is shown how to convert learning in the limit into stochastic finite learning.
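The conversion admits a short sketch under stated assumptions: the tail-bound shape Pr[convergence stage > t] ≤ β^(t/E) used below is a simplified illustrative stand-in for the paper's bounds, and all names are hypothetical. The idea is to use the domain knowledge to precompute, from an exponentially shrinking tail bound, a sample size n for which the limit learner has converged with probability at least 1 − δ, feed it n randomly drawn examples, and then commit to the final hypothesis.

```python
import math

def examples_needed(expected_stage, beta, delta):
    # Illustrative tail bound Pr[convergence stage > t] <= beta**(t / expected_stage):
    # choose n with beta**(n / expected_stage) <= delta,
    # i.e. n >= expected_stage * ln(delta) / ln(beta)  (both logs negative).
    return math.ceil(expected_stage * math.log(delta) / math.log(beta))

def stochastic_finite_learn(update, draw_example, n):
    # Feed n randomly drawn examples to a limit learner's update rule,
    # then stop and output the last hypothesis -- no further mind changes.
    h = None
    for _ in range(n):
        h = update(h, draw_example())
    return h
```

With β = 1/2 and confidence δ = 0.01, a learner whose convergence stage has expectation 10 would need 67 examples under this illustrative bound; the point is that the required n depends only on the known distribution parameters and δ, never on the unknown target pattern.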

Keywords: inductive learning, pattern languages, average-case analysis, learning in the limit, stochastic finite learning

Copyright information

© Kluwer Academic Publishers 2001

Authors and Affiliations

  • Peter Rossmanith (1)
  • Thomas Zeugmann (2)

  1. Institut für Informatik, Technische Universität München, München, Germany
  2. Institut für Theoretische Informatik, Medizinische Universität Lübeck, Lübeck, Germany
