
In Search of Machine Learning Theory

  • Conference paper
Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1 (FTC 2021)

Part of the book series: Lecture Notes in Networks and Systems ((LNNS,volume 358))


Abstract

Machine Learning (ML) and Artificial Intelligence (AI) are, in general, based on search algorithms. In this paper, we paradoxically use the same search techniques to look for a general theory of machine learning itself; in other words, we search for a unifying machine learning theory. For this purpose, we turn for help to a general theory of computation, the $-calculus, which is based on meta-search and super-Turing (hypercomputational) models. We hope that in this way we can unify machine learning, which should be useful for the development of new methods, algorithms, embedded devices, and computer programs in the future. First, we overview the main machine learning areas as our training examples and, applying our background knowledge, hand-pick a reasonable hypothetical theory of ML. Next, we justify that it is a good generalization of ML. The open research question remains whether, by applying various ML techniques, we can automatically induce the optimal ML theory from the hypothesis space of possible theories. Thus, the follow-up paper would hopefully be titled “In Search of the Optimal Machine Learning Theory”.
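
As a rough illustration of the idea sketched in the abstract, the Python snippet below shows how a search over a hypothesis space of candidate ML "theories" might be organized: each candidate is scored against training examples (here, ML areas) and the best-scoring generalization is selected. This is only a toy analogue of the meta-search idea, not the authors' $-calculus formalism; all names (CandidateTheory, score, search_for_theory) and the scoring scheme are hypothetical.

```python
# Illustrative sketch only: a generic search over a hypothesis space of
# candidate ML "theories", scored against training examples. This is NOT
# the $-calculus meta-search from the paper, merely a toy analogue.
from dataclasses import dataclass
from typing import Callable, Iterable, List


@dataclass
class CandidateTheory:
    """A hypothetical candidate 'theory of ML' in the hypothesis space."""
    name: str
    # Predicate: does this theory explain/cover a given ML area (training example)?
    explains: Callable[[str], bool]


def score(theory: CandidateTheory, training_examples: Iterable[str]) -> float:
    """Fraction of training examples (ML areas) covered by the candidate theory."""
    examples = list(training_examples)
    covered = sum(1 for ex in examples if theory.explains(ex))
    return covered / len(examples) if examples else 0.0


def search_for_theory(hypothesis_space: List[CandidateTheory],
                      training_examples: List[str]) -> CandidateTheory:
    """Return the best-scoring candidate: a crude stand-in for 'inducing'
    a unifying ML theory from the hypothesis space."""
    return max(hypothesis_space, key=lambda t: score(t, training_examples))


if __name__ == "__main__":
    ml_areas = ["supervised", "unsupervised", "reinforcement", "deep", "evolutionary"]
    space = [
        CandidateTheory("statistical-learning",
                        lambda a: a in {"supervised", "unsupervised", "deep"}),
        CandidateTheory("learning-as-search", lambda a: True),  # trivially covers all areas
    ]
    best = search_for_theory(space, ml_areas)
    print(f"Best candidate: {best.name} (score={score(best, ml_areas):.2f})")
```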

The 1st author is a retired Professor of Practice at RPI Hartford, CT. This paper was written in memory of my friend, Professor Houman Younessi of RPI Hartford. The work of the 2nd author was financed by the Polish Ministry of Science and Higher Education under the program “Regional Initiative of Excellence” in 2019–2022, project number 027/RID/2018/19.



Author information

Correspondence to Eugene Eberbach.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Eberbach, E., Strzalka, D. (2022). In Search of Machine Learning Theory. In: Arai, K. (eds) Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1. FTC 2021. Lecture Notes in Networks and Systems, vol 358. Springer, Cham. https://doi.org/10.1007/978-3-030-89906-6_40
