Abstract
Product summarization aims to automatically generate product descriptions and has great commercial potential. Because customers care about different product aspects, summarization benefits from generating aspect-oriented, customized summaries. Conventional systems, however, typically produce general product summaries and may therefore miss the opportunity to match products with customer interests. To address this problem, we propose CUSTOM, aspect-oriented product summarization for e-commerce, which generates diverse and controllable summaries for different product aspects. To support the study of CUSTOM and further this line of research, we construct two Chinese datasets, SMARTPHONE and COMPUTER, containing 76,279 and 49,280 short summaries for 12,118 and 11,497 real-world commercial products, respectively. Furthermore, we introduce EXT, an extraction-enhanced generation framework for CUSTOM, which we instantiate with two well-known sequence-to-sequence models. Extensive experiments on the two proposed datasets compare EXT against the two baseline models and show that EXT generates diverse, high-quality, and consistent summaries (https://github.com/JD-AI-Research-NLP/CUSTOM).
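The extract-then-generate idea behind EXT can be illustrated with a toy pipeline: first select the input sentences relevant to the requested aspect, then hand only those to a generator. The keyword scoring and the placeholder "generator" below are hypothetical simplifications for illustration only, not the EXT architecture or the authors' method.

```python
# Toy sketch of aspect-oriented extract-then-generate summarization.
# The keyword table and the join-based "generator" are hypothetical
# stand-ins; EXT itself uses learned extraction and seq2seq generation.

ASPECT_KEYWORDS = {
    "camera": {"camera", "lens", "photo", "megapixel"},
    "battery": {"battery", "charge", "mah", "endurance"},
}

def extract(sentences, aspect):
    """Extraction step: keep sentences mentioning the requested aspect."""
    keywords = ASPECT_KEYWORDS[aspect]
    return [s for s in sentences if any(k in s.lower() for k in keywords)]

def summarize(sentences, aspect):
    """Generation step (placeholder): a real system would feed the
    extracted sentences to a sequence-to-sequence model."""
    return " ".join(extract(sentences, aspect))

product_info = [
    "The 108-megapixel camera captures sharp photos in low light.",
    "A 5000 mAh battery lasts a full day on one charge.",
    "The aluminum frame feels sturdy.",
]

print(summarize(product_info, "battery"))
```

Conditioning the generator on an aspect-specific extraction is what makes the output controllable: changing the `aspect` argument yields a different summary from the same product input.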
Acknowledgments
We are grateful to all the anonymous reviewers. This work is supported by the National Key Research and Development Program of China under Grant No. 2018YFB2100802.
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Liang, J., Bao, J., Wang, Y., Wu, Y., He, X., Zhou, B. (2021). CUSTOM: Aspect-Oriented Product Summarization for E-Commerce. In: Wang, L., Feng, Y., Hong, Y., He, R. (eds) Natural Language Processing and Chinese Computing. NLPCC 2021. Lecture Notes in Computer Science(), vol 13029. Springer, Cham. https://doi.org/10.1007/978-3-030-88483-3_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-88482-6
Online ISBN: 978-3-030-88483-3