Abstract
The proliferation of powerful new forms of automated assistive writing technologies, natural language generation technologies in particular, raises crucial questions about the future of literacy, living, and learning. This article situates such technologies within the historical trajectory of literacy studies, arguing that the acceleration of natural language generation platforms like GPT-3 may reflect the emergence of a new autonomous model of literacy. Guided by recent theoretical work on automation and global computation, the article offers a series of speculative propositions for digital writing under the new autonomous model of literacy, focusing on questions of agency and subjectivity within a regime of computational racial capitalism. The article concludes with a gesture towards a resistive digital writing pedagogy through which literacy scholars, educators, and students might resist the dominating potential of technologies ostensibly designed to assist them.
Change history
04 December 2022
The original version of this article was updated to correct a typo: "fart" to "far".
Cite this article
Robinson, B. Speculative Propositions for Digital Writing Under the New Autonomous Model of Literacy. Postdigit Sci Educ 5, 117–135 (2023). https://doi.org/10.1007/s42438-022-00358-5