The Book of Endless History: Authorial Use of GPT2 for Interactive Storytelling
We present The Book of Endless History, an infinite Wikipedia of fantasy stories written in the style of Borges and Calvino, exploring the use of structural conditioning on GPT2 to generate text with explicit subjects and embedded web-links. Users are presented with a Wikipedia-like interface: each page contains a short fantasy description of a topic along with embedded web-links to related subjects. Users may click these links to learn more, following an endless trail of generated pages. The GPT2 architecture is a text-completion model: it has no explicit understanding of structure, which makes it challenging to integrate with authorial intent. Nevertheless, through this work we show that it can be conditioned both to write about a given subject and to generate the topology of an encyclopedia. We refer to this technique as subject conditioning or, more generally, structural conditioning.
Keywords: Structural conditioning · GPT2 · Interactive storytelling
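The subject-conditioning idea described in the abstract can be illustrated with a minimal data-formatting sketch: each training example pairs an explicit subject header with a body containing wiki-style links, and at generation time the embedded links become the subjects of further pages. The delimiter tokens, function names, and link syntax below are assumptions for illustration, not the authors' actual implementation.

```python
import re

# Assumed special tokens marking the structural parts of an entry; a
# GPT-2-style model fine-tuned on strings in this format can learn to
# stay on the named subject and to emit [[links]] to related subjects.
SUBJECT_TOKEN = "<|subject|>"
BODY_TOKEN = "<|body|>"
END_TOKEN = "<|end|>"

def make_training_example(subject: str, body: str) -> str:
    """Format one encyclopedia entry as a subject-conditioned string."""
    return f"{SUBJECT_TOKEN} {subject} {BODY_TOKEN} {body} {END_TOKEN}"

def extract_links(generated: str) -> list:
    """Pull embedded [[web-links]] out of a generated entry; each link
    can then seed the subject prompt for the next generated page,
    producing the encyclopedia's topology."""
    return re.findall(r"\[\[([^\]]+)\]\]", generated)

example = make_training_example(
    "The Glass Archive",
    "A library whose shelves hold the [[Mirror Kings]] and the [[City of Salt]].",
)
print(extract_links(example))  # → ['Mirror Kings', 'City of Salt']
```

Following each extracted link with a fresh conditioned generation yields the "endless trail" of pages: the model never needs a global map of the encyclopedia, since the topology emerges one link at a time.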