PoemAI: Text Generator Assistant for Writers

  • Conference paper
  • First Online:
Intelligent Sustainable Systems

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 213)


Abstract

Text generation has become one of the functional areas of Natural Language Processing (NLP) and is considered a major advancement in the generation of creative texts such as literature, with the field attracting increasing research attention every year. With the advent of machine learning and deep neural networks, it has become possible to process huge amounts of information (Big Data) and uncover the hidden patterns within it. Our research focuses on using a neural network model to construct an AI assistant for writers, primarily poets: a Bi-Directional Long Short-Term Memory (LSTM) network, a variant of the Recurrent Neural Network (RNN), is trained to generate poetry in English on a large corpus of poems by renowned poets compiled specifically for this research. The dataset was chosen around two themes, love and nature, and most of the poems come from the Renaissance and Modern periods. The method works in two ways: first, it creates full-length poems and sonnets; second, it serves as a provocative tool that prompts the writer, yielding a hybrid system in which the creative piece is produced jointly by the writer and the AI. Shortcomings of earlier versions and models, such as lack of coherence and rhyme, have been improved, and a new use case, the AI assistant, is introduced in this report. The model can produce original poetry and offer meaningful, rhyme-aware suggestions with appropriate precision in some outputs.
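The paper itself does not publish code, so the following is only a minimal sketch, assuming a TensorFlow/Keras stack and an n-gram next-word formulation, of how a Bi-Directional LSTM generator and the "assistant" suggestion mode described above might be assembled. The corpus file poems.txt, the preprocessing, and all hyperparameters (embedding size, LSTM units, dropout, epochs) are illustrative assumptions, not values reported by the authors.

```python
# Sketch (not the authors' code) of a Bi-LSTM next-word poetry generator.
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Assumed corpus: one poem line per text line.
poems = open("poems.txt", encoding="utf-8").read().lower().splitlines()

tokenizer = Tokenizer()
tokenizer.fit_on_texts(poems)
vocab_size = len(tokenizer.word_index) + 1

# Build n-gram prefixes: every prefix of a line predicts its next word.
sequences = []
for line in poems:
    ids = tokenizer.texts_to_sequences([line])[0]
    for i in range(2, len(ids) + 1):
        sequences.append(ids[:i])
max_len = max(len(s) for s in sequences)
sequences = pad_sequences(sequences, maxlen=max_len, padding="pre")
X, y = sequences[:, :-1], sequences[:, -1]

model = models.Sequential([
    layers.Embedding(vocab_size, 100),                # word embeddings (size assumed)
    layers.Bidirectional(layers.LSTM(150)),           # the Bi-LSTM layer the paper describes
    layers.Dropout(0.2),                              # regularisation (assumed)
    layers.Dense(vocab_size, activation="softmax"),   # next-word distribution
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=50, verbose=1)

def suggest(seed, n_words=10):
    """Assistant mode: extend the writer's seed line word by word."""
    for _ in range(n_words):
        ids = tokenizer.texts_to_sequences([seed])[0]
        ids = pad_sequences([ids], maxlen=max_len - 1, padding="pre")
        next_id = int(np.argmax(model.predict(ids, verbose=0), axis=-1)[0])
        seed += " " + tokenizer.index_word.get(next_id, "")
    return seed

print(suggest("the moon above the silent"))
```

In full-generation mode the same loop would simply be run for many more words from a short seed; rhyme-aware suggestion would additionally filter or re-rank candidate words, a step omitted here.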




Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Ratawal, Y., Makhloga, V.S., Raheja, K., Chadha, P., Bhatt, N. (2022). PoemAI: Text Generator Assistant for Writers. In: Raj, J.S., Palanisamy, R., Perikos, I., Shi, Y. (eds) Intelligent Sustainable Systems. Lecture Notes in Networks and Systems, vol 213. Springer, Singapore. https://doi.org/10.1007/978-981-16-2422-3_45
