Abstract
In recent practice, BERT, which combines contextual word embeddings with transfer learning, has emerged as a popular state-of-the-art deep learning model, improving the performance of several Natural Language Processing (NLP) applications [1]. In this paper, motivated by the effectiveness these models have demonstrated, we exploit the advantages of training Arabic Transformer-based representational language models to build three Arabic NLP applications covering two Arabic varieties: Modern Standard Arabic (MSA) and Arabic dialects. We build an Arabic representational language model using BERT as the Transformer-based training model [2], then compare the resulting model to pre-trained multilingual models by building multiple Arabic NLP applications and evaluating their performance. Our system achieved an accuracy of 0.91 on the named entity recognition (NER) task, 0.89 on the document classification application, and 0.87 on the sentiment analysis application. This work shows that a language-specific model outperforms pre-trained multilingual models on several NLP applications.
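As a minimal sketch (not the authors' released code) of the workflow the abstract describes — pretrain an Arabic BERT, then fine-tune it per task and compare against a multilingual baseline — the Hugging Face transformers fine-tuning API can stand in for the original TensorFlow BERT setup (note 2). The checkpoint name, the toy examples, and the ToyDataset helper below are illustrative assumptions; the authors' own Arabic checkpoint is not named in this abstract.

```python
# Illustrative sketch: fine-tune a BERT checkpoint for one of the three
# downstream tasks (here, binary sentiment analysis). Swap MODEL_NAME
# between a language-specific Arabic checkpoint and the multilingual
# baseline to reproduce the kind of comparison the paper reports.
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

MODEL_NAME = "bert-base-multilingual-cased"  # or a pretrained Arabic BERT directory

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Toy examples; a real run would load a corpus such as ArSentD-LEV (Baly et al.).
texts = ["خدمة ممتازة وسريعة",   # "excellent and fast service" -> positive
         "تجربة سيئة جدا"]        # "a very bad experience"      -> negative
labels = [1, 0]

enc = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")

class ToyDataset(torch.utils.data.Dataset):
    """Wraps tokenized inputs and labels for the Trainer."""
    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: v[idx] for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=ToyDataset(enc, labels),
)
trainer.train()
```

Under the same assumptions, replacing AutoModelForSequenceClassification with AutoModelForTokenClassification yields the NER setup, and document classification reuses the sequence-classification head with the number of document categories as num_labels.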
Notes
- 1.
- 2. BERT pre-trained models for TensorFlow: https://github.com/google-research/bert
References
Polignano, M., Basile, P., de Gemmis, M., et al.: AlBERTo: Italian BERT language understanding model for NLP challenging tasks based on tweets. In: CEUR Workshop Proceedings, vol. 2481 (2019)
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
Mikolov, T., Chen, K., Corrado, G., Dean, J.: Efficient estimation of word representations in vector space. In: 1st International Conference on Learning Representations (ICLR 2013), Workshop Track (2013). https://doi.org/10.48550/arXiv.1301.3781
Pennington, J., Socher, R., Manning, C.: GloVe: global vectors for word representation. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1532–1543 (2014)
Peters, M.E., Neumann, M., Iyyer, M., et al.: Deep contextualized word representations. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 2227–2237 (2018)
Li, X., Zhang, H., Zhou, X.H.: Chinese clinical named entity recognition with variant neural structures based on BERT methods. J. Biomed. Inform. 107, 103422 (2020). https://doi.org/10.1016/j.jbi.2020.103422
Moradshahi, M., Palangi, H., Lam, M.S., et al.: HUBERT untangles BERT to improve transfer across NLP tasks, pp. 1–13. arXiv preprint arXiv:1910.12647 (2019)
Benajiba, Y., Rosso, P.: ANERsys 2.0: conquering the NER task for the Arabic language by combining the maximum entropy with POS-tag information. In: 3rd Indian International Conference on Artificial Intelligence, pp. 1814–1823 (2007)
Tang, D., Wei, F., Qin, B., et al.: Sentiment embeddings with applications to sentiment analysis. IEEE Trans. Knowl. Data Eng. 28, 496–509 (2016). https://doi.org/10.1109/TKDE.2015.2489653
Li, J., Li, J., Fu, X., et al.: Learning distributed word representation with multi-contextual mixed embedding. Knowl.-Based Syst. 106, 220–230 (2016). https://doi.org/10.1016/j.knosys.2016.05.045
Rajasekharan, A.: Examining BERT’s raw embeddings - towards data science (2019). https://www.quora.com/q/handsonnlpmodelreview/Examining-BERT-s-raw-embeddings-1?ch=10&share=4afe89e5. Accessed 26 Jun 2020
Alsentzer, E., Murphy, J.R., Boag, W., et al.: Publicly available clinical BERT embeddings. In: Proceedings of the 2nd Clinical Natural Language Processing Workshop, pp. 72–78 (2019)
Vaswani, A., Shazeer, N., Parmar, N., et al.: Attention is all you need. In: Advances in Neural Information Processing Systems, pp. 6000–6010 (2017)
Alammar, J.: The Illustrated Transformer (2018). https://jalammar.github.io/illustrated-transformer/
Chaimae, A., Yacine, E.Y., Rybinski, M., Montes, J.F.A.: BERT for Arabic named entity recognition. In: 2020 International Symposium on Advanced Electrical and Communication Technologies (ISAECT), pp. 1–6. IEEE (2020)
Chaimae, A., El Younoussi, Y., Moussaoui, O., Zahidi, Y.: An Arabic dialects dictionary using word embeddings. Int. J. Rough Sets Data Anal. 6, 18–31 (2019). https://doi.org/10.4018/IJRSDA.2019070102
Chaimae, A., Rybinski, M., Yacine, E.Y., Montes, J.F.A.: Comparative study of Arabic word embeddings: evaluation and application. Int. J. Comput. Inf. Syst. Ind. Manag. Appl. 12, 349–362 (2020)
Mohit, B., Schneider, N., Bhowmick, R., et al.: Recall-oriented learning of named entities in Arabic Wikipedia. In: Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2012), pp. 162–173 (2012)
Baly, R., Khaddaj, A., Hajj, H., El-Hajj, W., Shaban, K.B.: ArSentD-LEV: a multi-topic corpus for target-based sentiment analysis in Arabic Levantine tweets. In: OSACT3, pp. 37–43 (2018)
Abdulla, N.A., Ahmed, N.A., Shehab, M.A., Al-Ayyoub, M.: Arabic sentiment analysis: lexicon-based and corpus-based. In: 2013 IEEE Jordan Conference on Applied Electrical Engineering and Computing Technologies (AEECT), pp. 1–6 (2013). https://doi.org/10.1109/AEECT.2013.6716448
El-Haj, M., Koulali, R.: KALIMAT a multipurpose Arabic corpus. In: Second Workshop on Arabic Corpus Linguistics (WACL-2), pp. 22–25 (2013)
Polignano, M., De Gemmis, M., Basile, P., Semeraro, G.: A comparison of word-embeddings in emotion detection from text using BiLSTM, CNN and self-attention. In: Adjunct Publication of the 27th Conference on User Modeling, Adaptation and Personalization (ACM UMAP 2019 Adjunct), pp. 63–68 (2019). https://doi.org/10.1145/3314183.3324983
Shaalan, K., Oudah, M.: A hybrid approach to Arabic named entity recognition. J. Inf. Sci. 40, 67–87 (2013). https://doi.org/10.1177/0165551513502417
Al-Zoghby, A., Eldin, A.S., Ismail, N.A., Hamza, T.: Mining Arabic text using soft-matching association rules. In: ICCES’07 - 2007 International Conference on Computer Engineering and Systems (2007)
Tartir, S., Abdul-Nabi, I.: Semantic sentiment analysis in Arabic social media. J. King Saud Univ. – Comput. Inf. Sci. 29, 229–233 (2017). https://doi.org/10.1016/j.jksuci.2016.11.011
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Azroumahli, C., Elyounoussi, Y., Badir, H. (2024). BERT for Arabic NLP Applications: Pretraining and Finetuning MSA and Arabic Dialects. In: Tabaa, M., Badir, H., Bellatreche, L., Boulmakoul, A., Lbath, A., Monteiro, F. (eds) New Technologies, Artificial Intelligence and Smart Data. INTIS 2022, 2023. Communications in Computer and Information Science, vol. 1728. Springer, Cham. https://doi.org/10.1007/978-3-031-47366-1_5
DOI: https://doi.org/10.1007/978-3-031-47366-1_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-47365-4
Online ISBN: 978-3-031-47366-1