AUG-BERT: An Efficient Data Augmentation Algorithm for Text Classification

  • Conference paper
Communications, Signal Processing, and Systems (CSPS 2019)

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 571)

Abstract

We propose a BERT-based data augmentation method for labeled sentences called Aug-BERT. New sentences are generated by stochastically selecting words and replacing them with words predicted by Aug-BERT. After a two-stage training procedure, consisting of CLP and L-MLM, BERT can be fine-tuned into Aug-BERT, which predicts the stochastically selected words conditioned on both the context and the label. Drawing on BERT's deep bidirectional language model and the label information incorporated via a label-segment embedding, Aug-BERT can generate sentences of high quality. Experiments on six different text classification tasks show that our method outperforms most alternatives and noticeably improves classifier performance.
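The augmentation loop the abstract describes — stochastically pick word positions, then replace each picked word with a prediction conditioned on context and label — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `toy_predict` is a hypothetical stand-in for Aug-BERT's label-conditioned masked-word prediction, and the replacement probability `p` is an assumed hyperparameter.

```python
import random

def augment(tokens, label, predict_fn, p=0.15, seed=0):
    """Return an augmented copy of `tokens`: each token is replaced,
    with probability p, by the word predict_fn proposes for that
    position given the surrounding context and the sentence label."""
    rng = random.Random(seed)
    out = list(tokens)
    for i in range(len(out)):
        if rng.random() < p:
            out[i] = predict_fn(out, i, label)
    return out

# Hypothetical stand-in for Aug-BERT: a real predict_fn would run the
# fine-tuned label-conditioned model over the masked position.
def toy_predict(tokens, i, label):
    synonyms = {"good": {"positive": "great", "negative": "decent"}}
    return synonyms.get(tokens[i], {}).get(label, tokens[i])

sent = "the movie was good".split()
aug = augment(sent, "positive", toy_predict, p=1.0)
print(" ".join(aug))  # prints "the movie was great"
```

Because the prediction is conditioned on the label, a sentiment word like "good" in a positive review is replaced with a label-compatible word rather than an arbitrary synonym, which is the property that keeps augmented sentences consistent with their labels.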


Notes

  1. https://github.com/huggingface/pytorch-pretrained-BERT.

  2. https://github.com/pfnet-research/contextual_augmentation.


Acknowledgements

This research was funded by the National Natural Science Foundation of China (Grant Nos. 61772337 and U1736207) and the National Key Research and Development Program of China (No. 2016QY03D0604).

Author information

Corresponding author

Correspondence to Linqing Shi.


Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Shi, L., Liu, D., Liu, G., Meng, K. (2020). AUG-BERT: An Efficient Data Augmentation Algorithm for Text Classification. In: Liang, Q., Wang, W., Liu, X., Na, Z., Jia, M., Zhang, B. (eds) Communications, Signal Processing, and Systems. CSPS 2019. Lecture Notes in Electrical Engineering, vol 571. Springer, Singapore. https://doi.org/10.1007/978-981-13-9409-6_266

Download citation

  • DOI: https://doi.org/10.1007/978-981-13-9409-6_266

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-9408-9

  • Online ISBN: 978-981-13-9409-6

  • eBook Packages: Engineering, Engineering (R0)
