
Enhancing the Cognition and Efficacy of Machine Learning Through Similarity


Similarity is a key element of machine learning and can make human learning much more effective as well. One goal of this paper is to expound on this aspect: we identify real-world concepts similar to hard-to-understand theories in order to enhance the learning experience and comprehension of machine learning students. The second goal is to extend the current literature that uses similarity for transcoding. We attempt transcoding from Python to R and vice versa, which has not been tried before, by identifying similarities in a latent embedding space. We list several real-world analogies that exhibit these similarities and simplify the machine learning narrative. We then use Cross-Lingual Model Pretraining, Denoising Auto-encoding, and Back-translation to automatically identify similarities between the programming languages Python and R, and to convert code from one to the other. While teaching machine learning to undergraduate, graduate, and general pools of students, the first author found that relating the concepts to the real-world examples listed in this paper greatly enhanced student comprehension and made the topics much more approachable despite the math and methods involved. As for transcoding, although Python and R are substantially different, we obtained reasonable success, measured using the various evaluation metrics and methods described in the paper. Machines and human beings predominantly learn by way of similarity, a finding that can be explored further in both the machine and human learning domains.
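One ingredient of the transcoding pipeline named above, denoising auto-encoding, trains a model to reconstruct a clean token sequence from a corrupted one. As a minimal sketch of the kind of noising function such a setup requires (in the spirit of unsupervised-translation recipes, not taken from the paper itself; the function name and parameters are illustrative), the snippet below corrupts a tokenized line of R code by randomly dropping tokens and locally shuffling the rest:

```python
import random

def add_noise(tokens, drop_prob=0.1, shuffle_k=3, seed=0):
    """Corrupt a token sequence for denoising auto-encoding:
    randomly drop tokens, then shuffle each survivor within
    a local window of about `shuffle_k` positions."""
    rng = random.Random(seed)
    # Randomly drop tokens, but never drop the entire sequence.
    kept = [t for t in tokens if rng.random() > drop_prob] or tokens[:1]
    # Local shuffle: jitter each position by up to shuffle_k and re-sort,
    # so no token strays far from where it started.
    keys = [i + rng.uniform(0, shuffle_k) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept), key=lambda p: p[0])]

# Example: a tokenized R statement; the auto-encoder would be trained
# to map the noisy version back to the clean one.
code = "y <- lm ( mpg ~ wt , data = mtcars )".split()
noisy = add_noise(code)
```

With zero drop probability and a zero shuffle window the function is the identity, which makes the corruption strength easy to tune during training.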







The authors wish to thank Harika Andugula, Gayathri Ganesh, and Nihanjali Mallavarapu for their contributions to the transcoding part of this research.

Author information

Authors and Affiliations


Corresponding author

Correspondence to Vishnu Pendyala.

Ethics declarations

Conflict of Interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the topical collection “Soft Computing for Real Time Engineering Applications” guest edited by Kanubhai K. Patel and Pritpal Singh.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Pendyala, V., Amireddy, R. Enhancing the Cognition and Efficacy of Machine Learning Through Similarity. SN COMPUT. SCI. 3, 442 (2022).




Keywords

  • Machine learning
  • Similarity
  • Transcoding
  • Deep learning