Resume Shortlisting and Ranking with Transformers

  • Conference paper
  • In: Intelligent Systems and Machine Learning (ICISML 2022)

Abstract

The study presented in this paper helps the human resources domain eliminate a time-consuming part of the recruitment process. Resume screening is one of the most critical and challenging tasks for human resources personnel. Natural Language Processing (NLP) techniques give computers the ability to understand spoken and written language. Nowadays, online recruitment platforms are increasingly active alongside consultancies, and a single job opening can attract hundreds of applications, so Human Resources (HR) employees devote substantial time to candidate selection: shortlisting the best fit for a job is slow, and finding an apt person is arduous. The proposed study shortlists the candidates who best match a job based on the skills stated in their resumes. Because the process is automated, personal preferences and impressions of soft skills do not bias the hiring process. Sentence-BERT (SBERT) is a Siamese and triplet network-based variant of the Bidirectional Encoder Representations from Transformers (BERT) architecture that can generate semantically meaningful sentence embeddings. The paper presents an end-to-end tool for the HR domain that takes hundreds of resumes, along with the skills required for the job, as input and outputs a ranking of the candidates who best fit the job. SBERT is compared with BERT and shown to be superior for this task.
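The chapter does not include source code, but the pipeline the abstract describes (encode the job's required skills and every resume with SBERT, then rank resumes by semantic similarity) can be sketched in a few lines. The snippet below is a minimal illustration using the open-source sentence-transformers library; the model checkpoint and sample data are assumptions for demonstration, not the authors' actual configuration.

    # Minimal sketch of SBERT-based resume ranking; not the authors' code.
    # Assumes: pip install sentence-transformers
    from sentence_transformers import SentenceTransformer, util

    # Illustrative checkpoint; the paper does not name a specific SBERT model.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    job_skills = "Python, machine learning, NLP, transformer models, SQL"
    resumes = {
        "candidate_a": "5 years building NLP pipelines in Python; fine-tuned BERT models.",
        "candidate_b": "Front-end developer experienced in React and TypeScript.",
        "candidate_c": "Data scientist: SQL, scikit-learn, transformer text classification.",
    }

    # Encode the job requirements and all resumes into sentence embeddings.
    job_emb = model.encode(job_skills, convert_to_tensor=True)
    resume_embs = model.encode(list(resumes.values()), convert_to_tensor=True)

    # Cosine similarity between the job posting and each resume, then rank.
    scores = util.cos_sim(job_emb, resume_embs)[0]
    ranking = sorted(zip(resumes, scores.tolist()), key=lambda kv: -kv[1])
    for name, score in ranking:
        print(f"{name}: {score:.3f}")

Ranking by cosine similarity of precomputed sentence embeddings is what lets such a tool scale: each resume is encoded once, so comparing a posting against hundreds of resumes requires only cheap vector operations rather than a full BERT cross-encoder pass per resume-job pair.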



Author information

Corresponding author: Vinaya James.



Copyright information

© 2023 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper

Cite this paper

James, V., Kulkarni, A., Agarwal, R. (2023). Resume Shortlisting and Ranking with Transformers. In: Nandan Mohanty, S., Garcia Diaz, V., Satish Kumar, G.A.E. (eds) Intelligent Systems and Machine Learning. ICISML 2022. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 471. Springer, Cham. https://doi.org/10.1007/978-3-031-35081-8_8

  • DOI: https://doi.org/10.1007/978-3-031-35081-8_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35080-1

  • Online ISBN: 978-3-031-35081-8

  • eBook Packages: Computer Science, Computer Science (R0)
