
On metric spaces arising during formalization of recognition and classification problems. Part 1: Properties of compactness


In the context of the algebraic approach to recognition developed by Yu. I. Zhuravlev’s scientific school, metric analysis of feature descriptions is necessary to obtain adequate formulations of poorly formalized recognition/classification problems. The formalization of recognition problems is a cross-disciplinary issue lying between supervised and unsupervised machine learning. This work presents the results of an analysis of compact metric spaces arising during the formalization of recognition problems. Necessary and sufficient conditions for the compactness of metric spaces over lattices of sets of feature descriptions are analyzed, and approaches to the completion of discrete metric spaces (completion by lattice expansion or completion by variation of estimate) are formulated. It is shown that the analysis of the compactness of metric spaces may lead to some heuristic cluster criteria commonly used in cluster analysis. During the analysis of the properties of compactness, the key concept of a ρ-network arises: a subset of points that allows one to estimate an arbitrary distance in an arbitrary metric configuration. The analysis of compactness properties and the conceptual apparatus introduced (ρ-networks, their quality functionals, the metric range condition, i- and ρ-spectra, the ε-neighborhood in a metric cone, ε-isomorphism of complete weighted graphs, etc.) allow one to apply the methods of functional analysis, probability theory, metric geometry, and graph theory to the analysis of poorly formalized problems of recognition and classification.
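The ρ-network described above is conceptually close to the classical ε-net of a totally bounded metric space: a finite subset whose points approximate every point of the space within tolerance ρ. A minimal sketch of a greedy ρ-net construction over a finite point configuration (not the authors' construction; the greedy strategy, function names, and Euclidean metric are illustrative assumptions) in Python:

```python
import math


def greedy_rho_net(points, rho, dist):
    """Greedily select a subset `net` of `points` such that every point
    lies within distance `rho` of some element of `net` (a rho-net).

    The existence of a finite rho-net for every rho > 0 is total
    boundedness -- one half of the classical criterion for compactness
    of a metric space (total boundedness + completeness)."""
    net = []
    for p in points:
        # If p is farther than rho from every chosen center, it is not
        # yet covered, so it becomes a new center itself.
        if all(dist(p, q) > rho for q in net):
            net.append(p)
    return net


def euclid(a, b):
    return math.dist(a, b)


pts = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (1.05, 0.95), (3.0, 0.0)]
net = greedy_rho_net(pts, rho=0.5, dist=euclid)
# By construction, every point is within rho of some net point.
assert all(min(euclid(p, q) for q in net) <= 0.5 for p in pts)
```

The greedy pass gives a 2-approximation of the smallest ρ-net; for the infinite spaces treated in the paper, the question is instead whether such finite nets exist for every ρ, which is what the compactness criteria address.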




Author information



Corresponding author

Correspondence to I. Yu. Torshin.

Additional information

Ivan Yur’evich Torshin. Born 1972. Graduated from the Department of Chemistry, Moscow State University, in 1995. Received candidate’s degrees in chemistry in 1997 and in physics and mathematics in 2011. He is currently an associate professor at the Moscow Institute of Physics and Technology, a lecturer at the Faculty of Computational Mathematics and Cybernetics, Moscow State University, a leading scientist at the Russian Branch of the Trace Elements Institute for UNESCO, and a member of the Center of Forecasting and Recognition. Author of 205 publications in peer-reviewed journals in biology, chemistry, medicine, and informatics and of 3 monographs in the series “Bioinformatics in the Post-Genomic Era” (Nova Biomedical Publishers, New York, 2006–2009).

Konstantin Vladimirovich Rudakov. Born 1954. Russian mathematician, corresponding member of the Russian Academy of Sciences, Head of the Department of Computational Methods of Forecasting at the Dorodnicyn Computing Centre, Russian Academy of Sciences, and Head of the Chair “Intelligent Systems” at the Moscow Institute of Physics and Technology.


Cite this article

Torshin, I.Y., Rudakov, K.V. On metric spaces arising during formalization of recognition and classification problems. Part 1: Properties of compactness. Pattern Recognit. Image Anal. 26, 274–284 (2016).



Keywords

  • algebraic approach
  • metric analysis of data
  • theory of classification of feature values
  • compact metric spaces
  • clustering
  • combinatorial theory of solvability