Abstract
Text generation has become one of the core application areas of Natural Language Processing (NLP) and a major step toward the automated creation of creative texts such as literature, with research interest in the field growing every year. With the advent of machine learning and deep neural networks, it has become possible to process huge amounts of information (Big Data) and uncover hidden patterns within it. Our research focuses on using a neural network model to build an AI assistant for writers, primarily poets. We use a Bidirectional Long Short-Term Memory (Bi-LSTM) network, a variant of the Recurrent Neural Network (RNN), to generate English-language poetry, training it on a large corpus of poems by renowned poets compiled specifically for this research. The dataset was deliberately restricted to two themes, love and nature, with most poems drawn from the Renaissance and Modern periods. The system operates in two modes: first, generating full-length poems and sonnets; second, serving as a provocative tool for the writer, yielding a hybrid system in which a creative work is produced jointly by the writer and the AI. It improves on issues found in earlier models, such as lack of coherence and rhyme, and introduces a new use case, the AI assistant, described in our report. The model can produce original poetry and offer meaningful suggestions, including rhyme suggestions with appropriate precision in some outputs.
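To illustrate the bidirectional recurrence the abstract refers to, the following is a minimal NumPy sketch of a Bi-LSTM forward pass: one LSTM reads the sequence left-to-right, a second reads it right-to-left, and their per-step hidden states are concatenated. All names, weight shapes, and dimensions here are illustrative assumptions, not the paper's actual architecture or hyperparameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. W: (4H, D), U: (4H, H), b: (4H,).
    Gate order in the stacked weights: input, forget, output, candidate."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    o = sigmoid(z[2 * H:3 * H])  # output gate
    g = np.tanh(z[3 * H:4 * H])  # candidate cell state
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

def bilstm(seq, params_f, params_b):
    """Run a forward and a backward LSTM over seq (a list of D-dim
    vectors) and concatenate the hidden states at each time step."""
    H = params_f[1].shape[1]

    def run(xs, params):
        h, c, outs = np.zeros(H), np.zeros(H), []
        for x in xs:
            h, c = lstm_step(x, h, c, *params)
            outs.append(h)
        return outs

    fwd = run(seq, params_f)
    bwd = run(seq[::-1], params_b)[::-1]  # backward pass, re-aligned
    return [np.concatenate([hf, hb]) for hf, hb in zip(fwd, bwd)]

# Toy dimensions: 8-dim word embeddings, 16 hidden units, 5 time steps.
rng = np.random.default_rng(0)
D, H, T = 8, 16, 5

def init_params():
    return (rng.standard_normal((4 * H, D)) * 0.1,
            rng.standard_normal((4 * H, H)) * 0.1,
            np.zeros(4 * H))

seq = [rng.standard_normal(D) for _ in range(T)]
out = bilstm(seq, init_params(), init_params())
print(len(out), out[0].shape)  # 5 (32,)
```

In a generation system like the one described, each concatenated state would feed a softmax layer over the vocabulary to predict the next word; here only the recurrent forward pass is shown.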
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Ratawal, Y., Makhloga, V.S., Raheja, K., Chadha, P., Bhatt, N. (2022). PoemAI: Text Generator Assistant for Writers. In: Raj, J.S., Palanisamy, R., Perikos, I., Shi, Y. (eds) Intelligent Sustainable Systems. Lecture Notes in Networks and Systems, vol 213. Springer, Singapore. https://doi.org/10.1007/978-981-16-2422-3_45
Print ISBN: 978-981-16-2421-6
Online ISBN: 978-981-16-2422-3
eBook Packages: Intelligent Technologies and Robotics (R0)