
Minimization of Empirical Risk as a Means of Choosing the Number of Hypotheses in Algebraic Machine Learning

  • SELECTED CONFERENCE PAPERS
  • Published in: Pattern Recognition and Image Analysis

Abstract

The paper examines a new approach to assessing the number of hypotheses required about the causes of a target property. It follows the classical method of V. N. Vapnik and A. Ya. Chervonenkis: minimization of the number of classification errors on the training sample. At the same time, the procedure bears a close analogy to V. K. Finn's abductive explanation of the training sample.
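The selection rule described above — pick the number of hypotheses that minimizes the count of classification errors on the training sample — can be sketched in a few lines. The sketch below is a toy illustration, not the paper's JSM/algebraic-learning construction: the training sample, the random "hypotheses," and the majority-vote classifier are all invented stand-ins used only to show the argmin-over-k selection step.

```python
import random

random.seed(0)

# Toy training sample: binary feature vectors; the target property holds
# exactly when features 0 and 1 are both present.
xs = [[random.randint(0, 1) for _ in range(5)] for _ in range(200)]
train = [(x, int(x[0] == 1 and x[1] == 1)) for x in xs]

def make_hypothesis():
    """A toy 'hypothesis': fires when a random feature subset is fully present."""
    subset = random.sample(range(5), random.randint(1, 3))
    return lambda x: all(x[i] == 1 for i in subset)

hypotheses = [make_hypothesis() for _ in range(50)]

def empirical_risk(k):
    """Training errors when the first k hypotheses classify by majority vote."""
    errors = 0
    for x, y in train:
        votes = sum(h(x) for h in hypotheses[:k])
        pred = int(votes > k / 2)
        errors += int(pred != y)
    return errors

# Empirical-risk minimization over the number of hypotheses k.
best_k = min(range(1, len(hypotheses) + 1), key=empirical_risk)
```

By construction, `empirical_risk(best_k)` is no larger than the risk at any other candidate size, which is the whole content of the selection rule; everything specific to the algebraic (JSM-style) setting — how hypotheses are generated from object similarities and how counterexamples are handled — is deliberately omitted here.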


REFERENCES

  1. JSM-method of Automatic Generation of Hypotheses. Logical and Epistemological Foundations, Ed. by V. K. Finn and O. M. Anshakov (URSS, Moscow, 2009).

  2. L. A. Iakimova, “The investigation of overfitting in algebraic machine learning,” Pattern Recognit. Image Anal. (2023).

  3. S. O. Kuznetsov, “A fast algorithm for computing all intersections of objects in a finite semi-lattice,” Autom. Doc. Math. Linguist. 27 (5), 11–21 (1993).

  4. V. N. Vapnik and A. Ya. Chervonenkis, Pattern Recognition Theory (Nauka, Moscow, 1974) [in Russian].

  5. D. V. Vinogradov, “The rate of convergence to the limit of the probability of encountering an accidental similarity in the presence of counter examples,” Autom. Doc. Math. Linguist. 52, 35–37 (2018). https://doi.org/10.3103/s0005105518010090

  6. D. V. Vinogradov, “Machine learning based on similarity operation,” in Artificial Intelligence, Ed. by S. Kuznetsov, G. Osipov, and V. Stefanuk, Communications in Computer and Information Science, Vol. 934 (Springer, Cham, 2018), pp. 46–59. https://doi.org/10.1007/978-3-030-00617-4_5

  7. D. Vinogradov, “A probabilistic combinatorial formal method of machine learning based on lattice theory,” Doctoral Dissertation in Computer Science (Fed. Res. Center Computer Science and Control, Moscow, 2018).

  8. D. V. Vinogradov, “Algebraic machine learning: Emphasis on efficiency,” Autom. Remote Control 83, 831–846 (2022). https://doi.org/10.1134/s0005117922060029

ACKNOWLEDGMENTS

The author thanks his colleagues at the Dorodnicyn Computing Center of the Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences for support and useful discussions. Special appreciation is expressed to L.A. Iakimova for the joint work, discussion, and support.

Author information

Corresponding author

Correspondence to D. V. Vinogradov.

Ethics declarations

The author declares that he has no conflicts of interest.

Additional information

Dmitry V. Vinogradov is a Leading Research Fellow at the Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences.

In 1986, he graduated from the Faculty of Mechanics and Mathematics of the Lomonosov Moscow State University.

In 2019, he earned a Doctor of Science degree in Theoretical Computer Science.

His research areas include probabilistic algorithms, Markov chains, machine learning, and lattice theory.

About this article

Cite this article

Vinogradov, D.V. Minimization of Empirical Risk as a Means of Choosing the Number of Hypotheses in Algebraic Machine Learning. Pattern Recognit. Image Anal. 33, 525–528 (2023). https://doi.org/10.1134/S1054661823030458
