The Book of Endless History: Authorial Use of GPT2 for Interactive Storytelling

  • John Austin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11869)


We present The Book of Endless History, an infinite Wikipedia of fantasy stories written in the style of Borges and Calvino, exploring the use of structural conditioning on GPT2 to generate text with explicit subjects and embedded web-links. Users are presented with a Wikipedia-like interface showing a short fantasy description of a topic, with embedded web-links to related subjects. Clicking a link leads to another generated page, so users can follow an endless trail of topics. GPT2 is a text-completion model: it has no explicit understanding of structure, and integrating it with authorial intent can be a challenge. Nevertheless, through this work we show that it can be conditioned both to write about a given subject and to generate the topology of an encyclopedia. We refer to this technique as subject conditioning or, more generally, structural conditioning.


Keywords: Structural conditioning · GPT2 · Interactive storytelling
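The structural conditioning described above can be sketched as a data-formatting convention: each training example is prefixed with its subject, and embedded links are marked with an explicit token scheme so that links extracted from generated text become the outgoing edges of the encyclopedia. The `<|subject|>` markers and `[[double-bracket]]` link syntax below are illustrative assumptions, not the paper's actual format:

```python
import re

def make_conditioned_prompt(subject: str) -> str:
    """Build a subject-conditioned prompt for the language model.
    The special-token scheme here is a hypothetical example."""
    return f"<|subject|> {subject}\n<|article|>\n"

# Embedded web-links marked with wiki-style double brackets (an assumption).
LINK_RE = re.compile(r"\[\[([^\]]+)\]\]")

def extract_links(generated_text: str) -> list[str]:
    """Pull embedded link targets out of a generated article; these
    become the next pages to generate, defining the wiki's topology."""
    return LINK_RE.findall(generated_text)

sample = ("The City of Glass Bells stands at the mouth of the "
          "[[River of Hours]], ruled by the [[Cartographer-Kings]].")
print(extract_links(sample))  # → ['River of Hours', 'Cartographer-Kings']
```

In use, each extracted link target would be fed back through `make_conditioned_prompt` to generate its own page on demand, yielding the endless trail of pages the interface presents.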



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. A Stranger Gravity, San Francisco, USA
