Abstract—The paper examines a new approach to estimating the number of hypotheses about the causes of a target property that are required to explain the training sample. The approach follows the classical method of V. N. Vapnik and A. Ya. Chervonenkis: minimization of the number of classification errors on the training sample. At the same time, it bears a very close analogy to V. K. Finn's procedure of abductive explanation of the training sample.
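To make the selection criterion concrete, the sketch below varies the number K of candidate hypotheses and keeps the smallest K that minimizes the number of classification errors on the training sample. It is a minimal sketch assuming binary attribute descriptions and a containment-based classifier; every name in it (Example, predict, empirical_risk, choose_num_hypotheses) is an illustrative assumption, not the paper's actual algorithm.

```python
# A minimal sketch of choosing the number of hypotheses by minimizing
# empirical risk on the training sample. All names and data layouts here
# are illustrative assumptions, not the paper's implementation.
from typing import FrozenSet, List, Tuple

Example = Tuple[FrozenSet[str], bool]  # (attribute set, has target property)

def predict(hypotheses: List[FrozenSet[str]], attrs: FrozenSet[str]) -> bool:
    """Classify an object as positive if it contains some hypothesis."""
    return any(h <= attrs for h in hypotheses)

def empirical_risk(hypotheses: List[FrozenSet[str]], sample: List[Example]) -> float:
    """Fraction of training examples misclassified by the given hypotheses."""
    errors = sum(predict(hypotheses, attrs) != label for attrs, label in sample)
    return errors / len(sample)

def choose_num_hypotheses(candidates: List[FrozenSet[str]], sample: List[Example]) -> int:
    """Return the prefix length K of candidates with minimal empirical risk."""
    best_k, best_risk = 0, empirical_risk([], sample)
    for k in range(1, len(candidates) + 1):
        risk = empirical_risk(candidates[:k], sample)
        if risk < best_risk:
            best_k, best_risk = k, risk
    return best_k

# Toy usage: one positive and one negative example, two candidate hypotheses.
sample = [(frozenset({"a", "b"}), True), (frozenset({"c"}), False)]
candidates = [frozenset({"a"}), frozenset({"c"})]
print(choose_num_hypotheses(candidates, sample))  # -> 1
```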
REFERENCES
V. K. Finn and O. M. Anshakov (Eds.), JSM-Method of Automatic Generation of Hypotheses: Logical and Epistemological Foundations (URSS, Moscow, 2009).
L. A. Iakimova, “The investigation of overfitting in algebraic machine learning,” Pattern Recognit. Image Anal. (2023).
S. O. Kuznetsov, “A fast algorithm for computing all intersections of objects in a finite semi-lattice,” Autom. Doc. Math. Linguist. 27 (5), 11–21 (1993).
V. N. Vapnik and A. Ya. Chervonenkis, Pattern Recognition Theory (Nauka, Moscow, 1974) [in Russian].
D. V. Vinogradov, “The rate of convergence to the limit of the probability of encountering an accidental similarity in the presence of counterexamples,” Autom. Doc. Math. Linguist. 52, 35–37 (2018). https://doi.org/10.3103/s0005105518010090
D. V. Vinogradov, “Machine learning based on similarity operation,” in Artificial Intelligence, Ed. by S. Kuznetsov, G. Osipov, and V. Stefanuk, Communications in Computer and Information Science, Vol. 934 (Springer, Cham, 2018), pp. 46–59. https://doi.org/10.1007/978-3-030-00617-4_5
D. V. Vinogradov, “A probabilistic combinatorial formal method of machine learning based on lattice theory,” Doctoral Dissertation in Computer Science (Fed. Res. Center Computer Science and Control, Moscow, 2018).
D. V. Vinogradov, “Algebraic machine learning: Emphasis on efficiency,” Autom. Remote Control 83, 831–846 (2022). https://doi.org/10.1134/s0005117922060029
ACKNOWLEDGMENTS
The author thanks his colleagues at the Dorodnicyn Computing Center of the Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences for their support and useful discussions. Special appreciation goes to L. A. Iakimova for joint work, discussions, and support.
ETHICS DECLARATIONS
The author declares that he has no conflicts of interest.
ADDITIONAL INFORMATION
Dmitry V. Vinogradov is a Leading Research Fellow at the Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences.
In 1986, he graduated from the Faculty of Mechanics and Mathematics of Lomonosov Moscow State University.
In 2019, he earned a Doctor of Science degree in Theoretical Computer Science.
His research interests include probabilistic algorithms, Markov chains, machine learning, and lattice theory.
CITE THIS ARTICLE
Vinogradov, D.V. Minimization of Empirical Risk as a Means of Choosing the Number of Hypotheses in Algebraic Machine Learning. Pattern Recognit. Image Anal. 33, 525–528 (2023). https://doi.org/10.1134/S1054661823030458