Abstract
We propose Aug-BERT, a BERT-based data augmentation method for labeled sentences. New sentences are generated by stochastically selecting words and replacing them with words predicted by Aug-BERT. After a two-stage training procedure consisting of CLP and L-MLM, BERT can be fine-tuned into Aug-BERT, which predicts the stochastically selected words conditioned on both the context and the label. Drawing on BERT's deep bidirectional language model and on the label information incorporated via a label-segment embedding, Aug-BERT generates sentences of high quality. Experiments on six text classification tasks show that our method outperforms most alternatives and noticeably improves classifier performance.
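The augmentation loop described above (stochastically select token positions, then fill each with a label-conditioned prediction) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `predict_fn` is a hypothetical stand-in for the fine-tuned Aug-BERT predictor, and the synonym table is a toy substitute for its masked-word predictions.

```python
import random

def augment_sentence(tokens, label, predict_fn, replace_prob=0.15, seed=0):
    """Generate one augmented copy of a labeled sentence.

    Each token is independently selected with probability `replace_prob`;
    a selected token is replaced by whatever the (hypothetical)
    label-conditioned predictor returns for that position, mimicking
    Aug-BERT's prediction of stochastically selected words.
    """
    rng = random.Random(seed)
    out = list(tokens)
    for i in range(len(out)):
        if rng.random() < replace_prob:
            # The predictor sees the full context, the position to fill,
            # and the sentence label (the role Aug-BERT plays in the paper).
            out[i] = predict_fn(out, i, label)
    return out

# Toy predictor standing in for Aug-BERT: a synonym lookup that falls
# back to the original word when no replacement is known.
SYNONYMS = {"good": "great", "bad": "awful"}

def stub_predict(context, pos, label):
    return SYNONYMS.get(context[pos], context[pos])
```

With `replace_prob=1.0` every token is routed through the predictor, so `augment_sentence(["good", "movie"], "pos", stub_predict, replace_prob=1.0)` yields `["great", "movie"]`; in practice a small probability keeps most of the original sentence intact while the label conditioning keeps replacements label-compatible.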
Acknowledgements
This research work was funded by the National Natural Science Foundation of China (Grant Nos. 61772337, U1736207) and the National Key Research and Development Program of China (No. 2016QY03D0604).
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
Cite this paper
Shi, L., Liu, D., Liu, G., Meng, K. (2020). AUG-BERT: An Efficient Data Augmentation Algorithm for Text Classification. In: Liang, Q., Wang, W., Liu, X., Na, Z., Jia, M., Zhang, B. (eds) Communications, Signal Processing, and Systems. CSPS 2019. Lecture Notes in Electrical Engineering, vol 571. Springer, Singapore. https://doi.org/10.1007/978-981-13-9409-6_266
Print ISBN: 978-981-13-9408-9
Online ISBN: 978-981-13-9409-6