
Text2PyCode: Machine Translation of Natural Language Intent to Python Source Code

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12844)

Abstract

Natural Language Processing has improved tremendously with the success of Deep Learning, and Neural Machine Translation (NMT) has emerged as one of its most powerful applications. The same idea has recently been applied to source code: Code Generation (CG) is the task of generating source code from natural language input. This paper introduces a Python parallel corpus of natural-language intent and source-code pairs. It also proposes a Code Generation model based on the Transformer architecture used for NMT, applying code tokenization and code embeddings to the custom parallel corpus. The proposed architecture achieved a BLEU score of 32.4 and a Rouge-L score of 82.1, which is on par with natural language translation.
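The two metrics reported above can be illustrated in plain Python. The functions below are a minimal sketch of modified n-gram precision (the core of BLEU) and LCS-based ROUGE-L F1, written for this summary; they are not the paper's evaluation code, and the example token pair is invented.

```python
from collections import Counter

def ngram_precision(candidate, reference, n):
    """Modified n-gram precision over token lists, as used inside BLEU
    (single reference): each candidate n-gram is credited at most as many
    times as it appears in the reference."""
    cand_ngrams = list(zip(*[candidate[i:] for i in range(n)]))
    ref_counts = Counter(zip(*[reference[i:] for i in range(n)]))
    if not cand_ngrams:
        return 0.0
    matched = sum(min(c, ref_counts[g])
                  for g, c in Counter(cand_ngrams).items())
    return matched / len(cand_ngrams)

def rouge_l_f1(candidate, reference):
    """ROUGE-L F1 from the longest common subsequence of two token lists."""
    m, n = len(candidate), len(reference)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if candidate[i] == reference[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    lcs = dp[m][n]
    if lcs == 0:
        return 0.0
    precision, recall = lcs / m, lcs / n
    return 2 * precision * recall / (precision + recall)

# Hypothetical generated code vs. reference code, pre-tokenized.
hyp = "for i in range ( 10 ) : print ( i )".split()
ref = "for i in range ( 10 ) : print ( i )".split()
print(ngram_precision(hyp, ref, 1))  # 1.0 for an exact match
print(rouge_l_f1(hyp, ref))          # 1.0
```

In practice, corpus-level BLEU also combines precisions for n = 1..4 with a brevity penalty; the sketch keeps only the per-sentence building blocks.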


Notes

  1. https://github.com/trending/python
  2. https://stackoverflow.com/questions/405374/python-source-code-collection
  3. https://projecteuler.net/
  4. https://github.com/sridevibonthu/Text2PyCode
  5. https://spacy.io
  6. https://conala-corpus.github.io/
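The abstract's code-tokenization step can be illustrated with Python's standard-library `tokenize` module, which splits source code into language-aware tokens rather than whitespace chunks. This is only a plausible sketch of such a preprocessing step, not the authors' actual pipeline.

```python
import io
import tokenize

def tokenize_python(src):
    """Split Python source into a list of token strings using the
    stdlib tokenizer, dropping layout-only tokens."""
    skip = (tokenize.NEWLINE, tokenize.NL, tokenize.INDENT,
            tokenize.DEDENT, tokenize.ENDMARKER)
    tokens = []
    for tok in tokenize.generate_tokens(io.StringIO(src).readline):
        if tok.type in skip:
            continue
        tokens.append(tok.string)
    return tokens

print(tokenize_python("x = [i * i for i in range(5)]\n"))
# → ['x', '=', '[', 'i', '*', 'i', 'for', 'i', 'in', 'range', '(', '5', ')', ']']
```

Tokenizing at this level keeps operators, identifiers, and literals as separate vocabulary items, which is what makes learned code embeddings over the token sequence feasible.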



Copyright information

© 2021 IFIP International Federation for Information Processing

About this paper


Cite this paper

Bonthu, S., Sree, S.R., Krishna Prasad, M.H.M. (2021). Text2PyCode: Machine Translation of Natural Language Intent to Python Source Code. In: Holzinger, A., Kieseberg, P., Tjoa, A.M., Weippl, E. (eds) Machine Learning and Knowledge Extraction. CD-MAKE 2021. Lecture Notes in Computer Science, vol 12844. Springer, Cham. https://doi.org/10.1007/978-3-030-84060-0_4


  • DOI: https://doi.org/10.1007/978-3-030-84060-0_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-84059-4

  • Online ISBN: 978-3-030-84060-0

  • eBook Packages: Computer Science, Computer Science (R0)
