Ordinal mind change complexity of language identification

  • Conference paper
  • Appears in: Computational Learning Theory (EuroCOLT 1997)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1208)

Abstract

The approach of ordinal mind change complexity, introduced by Freivalds and Smith, uses constructive ordinals to bound the number of mind changes made by a learning machine. This approach measures the extent to which a learning machine has to keep revising its estimate of the number of mind changes it will make before converging to a correct hypothesis for languages in the class being learned. The measure also suggests the difficulty of learning a class of languages, and it has recently been used to analyze the learnability of rich classes of languages. Jain and Sharma have shown that the ordinal mind change complexity for identification from positive data of languages formed by unions of up to n pattern languages is ω^n, and that this bound is essential. Similar results were established for classes definable by length-bounded elementary formal systems with up to n clauses. These latter results translate to the learnability of certain classes of logic programs.
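
To make the counting idea concrete, the following minimal sketch (our illustration, not taken from the paper, and restricted to ordinals below ω·ω) shows a Freivalds-Smith style counter: it must strictly decrease at every mind change, and decreasing past a limit ordinal such as ω forces the learner to commit, only at that moment, to a finite budget of remaining revisions. The name OrdinalCounter and its interface are hypothetical.

    # Ordinals below omega*omega are written omega*a + b and stored as the pair (a, b).
    class OrdinalCounter:
        """Mind change counter that must strictly decrease at every revision."""

        def __init__(self, a, b):
            self.a, self.b = a, b              # current bound: omega*a + b

        def mind_change(self, new_finite_part=None):
            """Record one mind change by moving to a strictly smaller ordinal."""
            if self.b > 0:
                self.b -= 1                    # successor step: omega*a + b -> omega*a + (b - 1)
            elif self.a > 0:
                if new_finite_part is None:    # crossing a limit ordinal requires a choice
                    raise ValueError("must pick a finite part when stepping below a limit")
                self.a -= 1                    # limit step: omega*a -> omega*(a - 1) + new_finite_part
                self.b = new_finite_part
            else:
                raise RuntimeError("mind change bound exhausted")
            return (self.a, self.b)

    # A bound of omega (a=1, b=0) lets the learner postpone, until its first
    # mind change, the decision of how many further revisions it will allow.
    counter = OrdinalCounter(1, 0)
    print(counter.mind_change(new_finite_part=3))   # (0, 3): at most 3 more changes
    print(counter.mind_change())                    # (0, 2)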

The present paper further investigates the utility of ordinal mind change complexity. It is shown that if identification is to take place from both positive and negative data, then the ordinal mind change complexity of the class of languages formed by unions of up to n+1 pattern languages is only ω ×_o n (where ×_o denotes ordinal multiplication). This result nicely extends an observation of Lange and Zeugmann that pattern languages can be identified from both positive and negative data with 0 mind changes.
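
Spelling out the first few values of the bound (a routine ordinal calculation, not stated in the abstract) makes the extension explicit:

    ω ×_o 0 = 0       (n = 0: a single pattern language, identified with no mind changes)
    ω ×_o 1 = ω       (n = 1: unions of up to two pattern languages)
    ω ×_o 2 = ω + ω   (n = 2: unions of up to three pattern languages)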

The existence of an ordinal mind change bound for a class of learnable languages can be seen as an indication of its learning “tractability.” Conditions are investigated under which a class has an ordinal mind change bound for identification from positive data. It is shown that an indexed family of computable languages has an ordinal mind change bound if it has finite elasticity and can be identified by a conservative machine. It is also shown that the requirement of conservative identification can be sacrificed for the purely topological requirement of M-finite thickness. The interaction between identification by monotonic strategies and the existence of an ordinal mind change bound is also investigated.
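
As a toy illustration of how these conditions interact (our own hypothetical example, not one from the paper), consider the indexed family L_i = {n ∈ ℕ : n ≥ i}. It has finite elasticity (a violating chain of examples would have to descend forever in the natural numbers), the learner sketched below is conservative, and an ordinal bound of ω suffices, because the first example fixes a finite budget for all later revisions.

    def conservative_min_learner(text):
        """Learn L_i = {n : n >= i} from positive data; text is an iterable of examples."""
        conjecture = None      # index i of the current conjecture L_i
        budget = None          # finite bound on remaining mind changes, fixed at the
                               # first conjecture (the single "omega" step of the counter)
        for x in text:
            if conjecture is None:
                conjecture, budget = x, x           # counter drops from omega to x
            elif x < conjecture:                    # x lies outside L_conjecture: revise
                conjecture, budget = x, budget - 1  # each revision strictly lowers the index
            # otherwise the conjecture already contains x, so keep it (conservativeness)
            yield conjecture, budget

    # On a text for L_2 the learner converges after finitely many mind changes.
    print(list(conservative_min_learner([7, 9, 4, 2, 5, 2])))
    # [(7, 7), (7, 7), (4, 6), (2, 5), (2, 5), (2, 5)]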

References

  1. S. Arikawa, S. Miyano, A. Shinohara, T. Shinohara, and A. Yamamoto. Algorithmic learning theory with elementary formal systems. IEICE Trans. Inf. and Syst., E75-D No. 4:405–414, 1992.

  2. D. Angluin. Finding patterns common to a set of strings. Journal of Computer and System Sciences, 21:46–62, 1980.

  3. J. Case, S. Jain, and M. Suraj. Not-so-nearly-minimal-size program inference. In Klaus P. Jantke and Steffen Lange, editors, Algorithmic Learning for Knowledge-Based Systems, volume 961 of Lecture Notes in Artificial Intelligence, pages 77–96. Springer-Verlag, 1995.

  4. R. Freivalds and C. Smith. On the role of procrastination in machine learning. Information and Computation, pages 237–271, 1993.

  5. E. M. Gold. Language identification in the limit. Information and Control, 10:447–474, 1967.

  6. K. P. Jantke. Monotonic and non-monotonic inductive inference. New Generation Computing, 8:349–360, 1991.

  7. S. Jain and A. Sharma. On the intrinsic complexity of language identification. In Proceedings of the Seventh Annual Conference on Computational Learning Theory, New Brunswick, New Jersey, pages 278–286. ACM Press, July 1994.

  8. S. Jain and A. Sharma. Elementary formal systems, intrinsic complexity, and procrastination. In Proceedings of the Ninth Annual Conference on Computational Learning Theory, pages 181–192. ACM Press, June 1996.

  9. S. C. Kleene. Notations for ordinal numbers. Journal of Symbolic Logic, 3:150–155, 1938.

  10. S. Lange and T. Zeugmann. Monotonic versus non-monotonic language learning. In Proceedings of the Second International Workshop on Nonmonotonic and Inductive Logic, pages 254–269. Springer-Verlag, 1993. Lecture Notes in Artificial Intelligence 659.

  11. T. Motoki, T. Shinohara, and K. Wright. The correct definition of finite elasticity: Corrigendum to identification of unions. In L. Valiant and M. Warmuth, editors, Proceedings of the Fourth Annual Workshop on Computational Learning Theory, Santa Cruz, California, page 375. Morgan Kaufmann, 1991.

  12. Y. Mukouchi. Inductive inference of an approximate concept from positive data. In S. Arikawa and K. P. Jantke, editors, Algorithmic Learning Theory: 4th International Workshop on Analogical and Inductive Inference, AII '94, and 5th International Workshop on Algorithmic Learning Theory, ALT '94, volume 872 of Lecture Notes in Artificial Intelligence, pages 484–499. Springer-Verlag, 1994.

  13. D. Osherson, M. Stob, and S. Weinstein. Systems that Learn: An Introduction to Learning Theory for Cognitive and Computer Scientists. MIT Press, Cambridge, Mass., 1986.

  14. H. Rogers. Theory of Recursive Functions and Effective Computability. McGraw-Hill, New York, 1967. Reprinted, MIT Press 1987.

  15. G. E. Sacks. Higher Recursion Theory. Springer-Verlag, 1990.

  16. T. Shinohara. Studies on Inductive Inference from Positive Data. PhD thesis, Kyushu University, Kyushu, Japan, 1986.

  17. T. Shinohara. Rich classes inferable from positive data: Length-bounded elementary formal systems. Information and Computation, 108:175–186, 1994.

  18. M. Sato and T. Moriyama. Inductive inference of length bounded EFS's from positive data. Technical Report DMSIS-RR-94-2, Department of Mathematical Sciences and Information Sciences, University of Osaka Prefecture, Japan, 1994.

  19. K. Wright. Identification of unions of languages drawn from an identifiable class. In R. Rivest, D. Haussler, and M. K. Warmuth, editors, Proceedings of the Second Annual Workshop on Computational Learning Theory, Santa Cruz, California, pages 328–333. Morgan Kaufmann Publishers, Inc., 1989.

  20. T. Zeugmann and S. Lange. A guided tour across the boundaries of learning recursive languages. In K. P. Jantke and S. Lange, editors, Algorithmic Learning for Knowledge-Based Systems, volume 961 of Lecture Notes in Artificial Intelligence, pages 190–258. Springer-Verlag, 1995.

Editor information

Shai Ben-David

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ambainis, A., Jain, S., Sharma, A. (1997). Ordinal mind change complexity of language identification. In: Ben-David, S. (eds) Computational Learning Theory. EuroCOLT 1997. Lecture Notes in Computer Science, vol 1208. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-62685-9_25

  • DOI: https://doi.org/10.1007/3-540-62685-9_25

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-62685-5

  • Online ISBN: 978-3-540-68431-2

  • eBook Packages: Springer Book Archive
